LCP is essential to the user experience. People don’t like waiting for a page to start loading, or staring at a blank page before the main content appears, when they could get a faster experience elsewhere.
Page performance analysis tools report your LCP metrics per page, and as a website owner, you should aim to keep LCP below 2.5 seconds so your main content loads as fast as possible.
In our previous article, we discussed good and bad LCP, the issues that can harm your page’s LCP, and typical methods to reduce your LCP timing. Today, let’s talk about the featured image, its impact on LCP for your post-type pages (which usually receive the most traffic), and ways to fix it.
Impact of the Featured image in LCP
The featured image loads above the fold of your page, which means the browser has to fetch the image file; if the file is large or your server response time is slow, your page may suffer an LCP issue.
The above-the-fold image asset can be a hero image, a banner, or a carousel. Google considers <img> elements (and <image> elements inside an <svg>) as candidates for the Largest Contentful Paint, so you must optimize these images for a better LCP.
An image of 100+ KB above the fold takes measurable time to load and often turns out to be the primary LCP element. When you need your LCP to stay under 2.5 seconds, every millisecond counts, so you should consider optimizing it.
Here is an example of a site with a featured image above the fold. The featured image file is the source of LCP, but because all the assets are well optimized and served via a CDN, the page achieves an LCP of 1,113 ms, well under the critical point.
So, if the featured image is causing LCP, does removing it help?
Removing the featured image improves the LCP metric by eliminating the time needed to render the image file. If your featured image was heavy, say over 500 KB, you should see a decent improvement in LCP after removing it from above the fold.
On mobile devices, where networks and processing are usually slower, it becomes even more difficult to tackle LCP issues with a large image loading in the first paint.
That said, removing the featured image won’t eliminate LCP issues by itself. Text will fill the space the featured image used to occupy, and you will still need to optimize your fonts. However, in performance terms, serving optimized fonts is easier than serving a featured image.
A featured image makes your page look better and gives the visitor a quick idea of what they will find on the page. But when you have to balance performance and design, you can opt for a solution that satisfies both:
Drop the featured image from above the fold.
Have the featured image on your page, but below the fold.
Shifting your featured image below a few paragraphs (I usually add it below the 2nd paragraph) has helped many sites we have worked on. It keeps the visual that describes the page while allowing the text to load as the largest contentful paint. We then preload the most important fonts so browsers fetch them as soon as possible.
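As a quick sketch (the font path and file name here are hypothetical), preloading a critical font is a one-line hint in the page’s <head>:

<!-- fetch the above-the-fold font early; crossorigin is required even for same-origin font preloads -->
<link rel="preload" href="/fonts/heading.woff2" as="font" type="font/woff2" crossorigin>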
Is there a way we can keep the featured image for desktop devices and remove it on mobile?
Yes, keeping the featured image above the fold on desktop (if LCP is not an issue there) and removing it or shifting it below the fold on mobile can be an excellent way to tackle your site’s LCP issue.
This will require a simple CSS selector to be used:
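A minimal version of that rule, assuming (as this article does) that your theme renders the featured image inside .entry-content, looks like this:

@media (max-width: 480px) {
  /* hide the featured image on small screens; narrow the selector
     (e.g. .entry-content img:first-of-type) if other content images
     should stay visible */
  .entry-content img {
    display: none;
  }
}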
The @media (max-width: 480px) query triggers for all devices with a screen width of 480px or less; it then selects the featured image with .entry-content img and sets its display rule to none.
You can shift the featured image instead with the same selector, replacing display: none with CSS that places it below a couple of paragraphs so it appears below the fold.
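Here is one hedged sketch of that idea, assuming the featured image is the first element inside .entry-content; the exact selectors depend on your theme’s markup:

@media (max-width: 480px) {
  .entry-content {
    display: flex;
    flex-direction: column;
  }
  /* flex items default to order: 0, so giving the first two
     paragraphs order: -1 lifts them above the featured image */
  .entry-content > p:nth-of-type(1),
  .entry-content > p:nth-of-type(2) {
    order: -1;
  }
}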
Note: here we are discussing the impact of featured images on LCP, so by no means is this piece saying that featured images are responsible for the LCP issues on your site. There are many other reasons a page can suffer from poor LCP, and I have listed a few of them further down.
Here is a test report for a website where we changed nothing except moving the featured image below the 3rd paragraph, and that alone resolved the LCP issues it had.
Additional reasons for poor LCP
Apart from the featured image, other things on most pages can be responsible for LCP problems. Anything that slows down the page rendering process can cause an LCP issue on your pages.
These are usually elements that take up a lot of space: unoptimized assets, assets with loading delays, assets with third-party connections, advertisements, lots of unoptimized JavaScript in the head section, and so on. And we can’t forget the server’s response time.
1. Slow Server Response Times
Server response time is a critical factor in LCP. If your server is not fast enough, it may fail to serve above-the-fold assets in under 2.5 seconds. Hence, when choosing a website host, it’s important to avoid hosts that are not reliable and fast.
Bad server back-end infrastructure, unoptimized databases, and long API resolution times are often the reasons for slow server response time. Always go for a performance-optimized host if you want your website to be fast, web-vitals compatible, and a favorite of search engines alongside good-quality content.
2. Render blocking Javascript and CSS
JavaScript and CSS improve your site’s design and functionality, but it’s important to optimize how they are served so they do not block the loading of primary content.
Loading too much JavaScript and CSS in the head increases render-blocking. Render-blocking time feeds directly into page LCP; hence, you should reduce it as much as possible without breaking your site’s functionality.
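As a small illustration (the file name is hypothetical), a plain script tag in the head blocks rendering while it downloads and runs, whereas adding defer lets it download in parallel and execute after the document is parsed:

<!-- render-blocking: parsing stops while this downloads and executes -->
<script src="/js/widgets.js"></script>

<!-- non-blocking: downloads in parallel, runs after parsing finishes -->
<script defer src="/js/widgets.js"></script>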
3. Slow Resource Load Times
If the resources loading above the fold are large in file size, visually dense, or unoptimized (videos, GIF files, or large blocks of text), they will affect the page’s LCP. Consider optimizing those assets or removing/reducing them at the top of your page.
4. Client-side Rendering
Client-side rendering can be another source of LCP issues. It involves the browser fetching and executing JavaScript before rendering the primary content on the page. If the process is complex and the device’s hardware struggles with it, your webpage may perform poorly in people’s browsers, increasing the chance of an LCP problem.
So consider optimizing or avoiding client-side rendering if possible.
The portfolio has completed another month, and the anticipated rebuilding of these once-neglected sites seems to be going in the right direction. Some ups and downs are to be expected with the overall growth of the sites, and given that the sites run on a combination of GPT-3 tools + human edits, growth is not going to skyrocket in the next month.
In the previous month, we shared the traffic, revenue, expenses, and overall PnL, which showed a marginal profit, and we will share the same for November 2022. Overall, we are happy with the output, and steady growth in at least one of the crucial metrics makes sense, if not on all sides.
Lately, adding additional AI copywriter tools (like Jasper.ai) during the major discount season has helped us reduce content production cost by a great margin. We are excited to see how it impacts Ezoic revenue and overall PnL in the upcoming months.
1. Traffic & Earning Overview (Covering Ezoic)
The portfolio sites rely heavily on search traffic, and in November they received about 7,031 fewer users than the previous month. Most of this traffic drop occurred on the most heavily AI-content-reliant websites. However, other AI content sites gained traffic, one by as much as 28.14% over the previous month.
So it’s safe to say the traffic fluctuation does not come directly from Google recognizing the AI content; rather, it’s the content quality we may need to improve in the coming days to stay on top and compete with other pages targeting the same intent.
The overall portfolio traffic declined 9.30% in November (68,539 visits compared to 75,570 the previous month), and we saw a corresponding 8.79% decline in page views (80,643 versus 88,413 in October 2022).
The revenue tracks the traffic and page view data closely. Total revenue for November declined just 2.90% from the previous month, cushioned by a 22.10% boost in Ezoic ePMV.
The total revenue for the portfolio in November was $715.68, compared to $737.01 the previous month (October 2022), with an ePMV change of +$0.74. The slightly higher ePMV (+9.39%) helped the portfolio keep total revenue close despite 7,770 fewer page views than the previous month.
If we look at the yearly comparison, the AI sites have expanded a lot in terms of the traffic and revenue they generate.
The portfolio growth data (Nov 2022 vs. Nov 2021):
Total Visits (+17.60%)
Pageviews (+7.13%)
Revenue (+109.01%)
ePMV(+77.73%)
2. Traffic Overview By Site
Individual traffic data for the portfolio sites had ups and downs. For most of the sites, November traffic was lower than October’s, but three sites that have been gaining traffic over the last couple of months managed to beat their October numbers. As in October, some major sites continue to lose traffic even though we recently updated their top-ranking pages, which may help those sites recover a portion of the traffic lost recently.
Some of the sites that we believed had tanked started to recover traffic this month, which could be a good sign that the AI content updates are helping, as long as they are done correctly and quality is maintained.
Here is the November traffic data, collected via Ezoic BDA (compared to October 2022):
In November, the portfolio recorded 68,539 visits across all sites. The most popular site’s traffic dropped 15.13% (around 5,482 fewer visits: 30,762 compared to the previous 36,244), while the smallest site in the portfolio received just 136 visits. We recorded 7,770 fewer pageviews as well this month.
The maximum pageview count for a single site within the portfolio in November was 36,109, again 13.02% less than that site received in October 2022, while the lowest pageview count across our content sites was 142.
3. Revenue Data Per Site
As mentioned, November revenue is close to October’s, with a 2.90% drop, but looking at individual records, more sites showed signs of growth thanks to the boost in RPM/ePMV. The overall revenue tracks November’s pageview stats: total revenue dropped $21.35 to $715.68, compared to $737.03 in Oct 2022.
The highest individual revenue recorded in November was $433.07, which came with a 9.15% month-over-month boost in ePMV; that site’s ePMV was $14.08. The average ePMV across all sites in the portfolio was $8.67 in November, compared to $7.93 in October 2022.
The maximum individual ePMV also increased to $23.31 from the previous best of $21.03, while the minimum ePMV registered for a site was $1.20.
With some ad-position optimization, we lifted the average ePMV by 9.39%, versus a 1.69% decline between September and October.
4. Expense
Publishing content on all the sites on a regular basis is never easy. As we have mentioned multiple times, the AI machine was put together to produce content for all sites cost-effectively while we follow certain standards to keep the output quality competitive on search engines.
We managed to build an effective team of writers who understand the guidelines, know how to tackle a piece’s objective and plan its structure, and then use AI to reduce the time needed to create content that still balances answering what people are looking for with good organization. We also need to take care of SEO; without optimization, it becomes harder for search engines to place those pieces in higher positions.
Then there is a need for editors, people with expertise in certain niches (because the portfolio sites cover different subjects), and uploaders to make sure content is published regularly and the entire system runs smoothly. In addition, we have the expense of AI copywriter tools, which come with set word credits.
This all might sound like a lot, but we have managed to run everything at a profit so far, particularly because content production volume, I would say, is still at the lower end of its potential as we grow in the coming days.
Here’s what the portfolio’s November total expenses look like compared to the revenue generated from Ezoic ads (again, this data excludes other revenue sources and the expenses of sites that do not run Ezoic):
Key data:
Total expenses: $640.00
Total Revenue: $715.68
Profit (with Ezoic revenue): $75.68
% of profit Ezoic revenue brings in: 11.83%
5. Changes & Focus
Although these focus areas have always been part of the portfolio’s development, we have recently updated our approach based on the output of recent months.
Content Updates: We worked on outdated content that was still doing well in the SERPs, where a well-researched update could keep it ranking higher and driving traffic to the sites. We designed a workflow that collects the most traffic-driving keywords from GSC and Ahrefs keyword analysis, then cross-checks the top-ranking pages on the websites to target the pages that need updating soonest.
SOPs for different types of content: The next approach that significantly helped our human writers, who control the AI copywriter tools, maintain a consistent standard across content, whether list-based content, generic blog posts, or tutorials, was providing standard SOPs for each type of content. The SOPs do not limit the writers’ creativity; instead, they give the writers a primary structure to keep hierarchy and design consistent across all of a site’s content. We have decided to update these SOPs frequently based on additional requirements we uncover during publishing. This strategy has helped us manage the writers more efficiently and reduce the editing burden by a decent amount.
Ezoic placeholder update: Although Ezoic’s AI can place placeholders automatically, we wanted to test some above-the-fold ads after making some changes to the site design, to see if that could boost ePMV and overall revenue. Based on the revenue data we have for November, it seems to be working, at least for now. Here are the top placeholders for the site that generates the most revenue.
Good site health score: Maintaining site health scores has always been a goal, making sure the sites do not get stuck with technical issues such as missing metadata, link issues, inappropriate redirects, schema-related issues, etc. Such minor details can be decisive when you want to rank higher in the SERPs against stronger competition. We want to keep our sites ready so the content is free to compete for top positions. We use Ahrefs to monitor our sites, and its dashboard view has been really helpful for keeping an eye on what’s happening across the sites at all times. Recently, one of our major portfolio sites encountered some internal issues, and we are seeing a ranking decline as a result. We are still fixing the issues and hope this will help the site recover its positions as soon as possible.
Site speed and web vitals: As mentioned in the previous monthly report, site speed and web vitals continue to be one of our primary objectives in this project. With the speed upgrade, we noticed roughly 6x better crawling for the sites, and the sites passing web vitals and page experience according to Google Search Console continue to do well and improve in search engines. So we treat page performance as a key factor in pushing the sites to do well in search rankings; the bigger picture is that keeping everything in great condition helps a healthy site grow efficiently. Here’s one of the best web vitals records we maintained during the previous month for one site in the portfolio.
Utilizing Ezoic’s NicheIQ: Ezoic always comes up with something to be excited about. NicheIQ is an excellent addition that has automated certain tasks and reduced the time investment they used to require. NicheIQ combines an on-page SEO tool, a site health monitoring tool, and a done-for-you keyword research tool. We like to use Topics because it discovers highly relevant content ideas for our sites and provides all the necessary keyword research information. This eases our keyword research process, since we usually have to find batches of new keywords for at least 3 sites per day.
Conclusion:
Without an appropriate strategy, it would be a mess to try and run this many sites (with new sites continuously joining the portfolio) while publishing content every single day; but a well-designed plan, ready-to-use tools, and the right people in the right places make managing the portfolio much easier.
This income report series aims to share our growth and the fundamentals that got us here, including the small details we work on each month to improve the portfolio’s overall growth, in the hope of encouraging you to build your own AI-driven website portfolio at a significantly lower cost.
This study will look at the 5 best AI content generators for long-form SEO blog posts. Most importantly, it will use a brand-new AI, the first able to detect GPT-3 AI-generated content, to try to answer: can Google detect AI content?
Since OpenAI released its GPT-3 API, a large number of amazing AI content generation tools have been built. These tools can do many things, from writing emails to creating Facebook ads. However, for this study we are focusing on long-form SEO blog posts written with the objective of ranking in Google.
We will be using Originality.AI, the world’s first AI trained to detect whether a piece of content was created by AI tools using OpenAI’s GPT-3 API (which tools like Jasper.ai use) or other popular NLP APIs.
What Originality.AI Was Built to Do… AI Exposing AI
The unchecked proliferation of AI content is not without its concerns. How do we use this awesome power of AI content creation responsibly and ethically? How do we ensure that Google does not penalize us for leveraging AI content? These are questions I think I am in a unique position to help answer.
Although I am an engineer, I am not an AI developer. However, I…
Founded (and now sold) a done-for-you AI Content Agency: Rocket Content… where we were the heaviest user of Jasper.AI, and our team partnered with Jasper to develop a course and webinar for Jasper.AI’s agency users.
Manage a portfolio of content sites, on some of which we are ethically leveraging AI content
Founded Originality.AI where we built our own AI that is able to predict if content was produced by AI (the world’s first GPT-3 AI detection tool!)
However, clearly I don’t know it all since I have my name on an AI patent that was not approved!
Before we get to the results of the study that determines if Google can detect content generated using GPT-3 (or any of the popular NLP models) we need a little background…
What is AI Generated Content?
AI Generated Content is when a machine produces content such as a blog post, article, headline, blog intro, ad copy, social media content, product description, content idea, chatbot response, email, or even images. It has exploded since May 2020, when OpenAI provided access to the API for their pre-trained NLP model, GPT-3.
The Secret about ALL AI Content Generators
There are multiple pre-trained natural language processing models, but it was really when GPT-3 came on the scene that the quality of the generated content became usable for marketers.
So why was GPT-3 such a big deal in making AI content generators useful? It was not the first or the only NLP model with an API that developers can build tools on.
Pre-trained NLP model capabilities, up to this point, are best measured by their number of parameters. The table below shows the most popular NLP models that AI tools could choose to use, along with their parameter counts:
OpenAI GPT-2: 1.5 Billion
EleutherAI GPT-Neo: 2.7 Billion
EleutherAI GPT-J: 6 Billion
OpenAI GPT-3: 175 Billion (DaVinci)
The “secret” is that ALL tools rely on the same APIs to create their content. These pre-trained NLP models are so incredibly costly to train that it would be impossible for even the best-funded AI tool to compete with its own NLP model. At a roughly 30:1 parameter advantage for GPT-3 over its nearest NLP competitor, the reality is that all effective AI content generation tools use GPT-3 as their foundation.
Even though they are all using the same base there are still differences that can result in a big change in the quality of the content produced.
Fun Fact – EleutherAI (GPT-J, GPT-Neo), despite the scary-sounding name, is truly open source, while OpenAI (GPT-2, GPT-3) is very much for-profit. The developer community around AI, such as Hugging Face, is incredible!
Okay, enough nerding out…
Can Google Detect AI Content?
Yes. The study below shows that it is possible for Google to build (or it may have already built) an AI that can successfully predict whether content was created with an AI tool using GPT-3 or another NLP model. We know this because at Originality.AI we built and trained our own AI that predicts whether a piece of written content was created by a GPT-3 content generator with 94% accuracy, and with higher accuracy for all other available NLP models.
Originality.AI has shown that the rumour that GPT-3-created content is undetectable is wrong. Originality.AI was able to build a model that detects content produced by all of the popular NLP frameworks with a high degree of confidence:
GPT-3 Detection Accuracy is 94.06%
GPT-J’s Detection Accuracy is 94.14%
GPT-Neo’s Detection Accuracy is 95.64%.
As we head into 2023 there are no other tools that can do this and we are gearing up to be ready when GPT-4 goes live.
So what? Why do we as content marketers care if Google (or others) can detect content that was created by a bot?
Google has made it clear that it does not want AI-generated SPAM content; in their Helpful Content Update, they basically said as much. However, where it gets murkier is content that was initially created by AI but then verified for accuracy by a human who added additional value. Will this get impacted by Google? No one knows!
The risk right now for all web publishers who don’t write every word themselves is that the writers they have hired may be using AI writing aids to create content faster, publishing content that could be identified as AI-generated.
As covered in this article or this article or this article, it is widely understood that Google is worried about AI content damaging the usefulness of the text-based web, which would result in Google making less money.
Now that the stage is properly set, let’s get to the study’s results!
STUDY: Examples From 5 of the Most Popular AI Content Generators, and Can Google Detect Their Content?
We ran samples from 5 of the most popular long-form AI content generators through Originality.AI to test the confidence with which the content could be identified as not human-generated. In addition, we ran a control set of human-generated content to verify the probability score was identifying human as human and bot as bot.
Here are the results…
AI probability score by source of content:
Jasper.AI: 79.14%
Rytr: 95.00%
WriteSonic: 99.43%
Frase: 100.00%
Article Forge: 100.00%
Control (VentureBeat): 2.4%
Jasper produced the content that had the lowest probability of being produced by the GPT-3 API. This likely reflects a lot of the extra work they have done on top of the OpenAI API.
To help show the tool provides accurate results, we ran the 5 most recent VentureBeat articles covering AI through the tool; you can see the results below.
Although the AI at Originality.AI cannot say with absolute certainty whether content was created by GPT-3, you can see from this study that it does an impressive job of determining what is human-created content and what is bot-created content.
Originality.AI Exposing GPT-3 Content…
Here are the AI probability scores for the editorial articles from VentureBeat, which we safely assume are human-generated…
Below is a detailed breakdown of each tool, with a video showing an example of each tool’s output being tested by Originality.AI.
Jasper.AI is an incredible tool; they have established themselves as the leading AI content creation tool with a $125M raise.
This tool’s popularity and recognition are for good reason; it was our go-to tool when we built RocketContent.ai.
The downside is that, with their popularity, they face the challenge of needing to be a tool for all marketers looking to leverage AI. They are not focused on long-form SEO content in particular the way some of the other tools are.
They are definitely the best tool if you are signing up for an AI content creation tool for the first time or need an AI tool that can do it all!
Can Google Detect Jasper.AI Content:
Yes, using Originality.AI we are able to identify when content has been created by Jasper.AI.
In the video below I show live as I use Jasper to create content and then test it with Originality.AI to see if it can be detected as generated by a GPT-3 tool.
Pros:
Rich Features
Quality output
Community & Support
Cons:
Cost
Trying to serve many masters
Pricing:
Jasper.ai Is Best For:
Anyone buying their first AI content creation tool. With its diverse set of tools, it can support any marketer, whether they are writing emails, sales pages, or blog posts. It has established itself as the leading tool for good reason.
If you only care about producing long-form SEO blog posts with a team, I currently prefer WriteSonic.
Rytr is an interesting option for writers… it has the most affordable “unlimited” monthly plan.
When we used it, we struggled to have it efficiently produce full blog posts that didn’t wander wildly off-topic. Other AI tools have developed the ability to keep content focused as the text gets longer, but Rytr definitely struggles with this.
The product is not as polished, easy to use, or feature-rich as Jasper.AI, so it is not the best first tool.
The content produced does not seem to be at the same level as other tools’. Within GPT-3 you can select which model you want to use, and my guess is that Rytr has not selected the premium one.
However, it is fast and the most economical unlimited tool.
Can Google Detect Rytr-Generated Content?
Yes, using Originality.AI we are able to identify when content has been created by Rytr. Originality.AI showed with an average of 95% confidence that the content created by Rytr was produced by a bot.
In the video below I show live as I use Rytr to create articles and test them for originality.
Pros:
Affordable unlimited word count monthly plan
Plagiarism checker integrated
Easy to use
Cons:
Not great for long-form blog posts
Not as feature-rich as competitors
Pricing:
Rytr is Best for:
I think of Rytr as the cheaper version of Jasper.ai. It is best for freelance writers who are producing a lot of lower-cost content and looking for a budget-friendly AI solution to help them.
Rytr is your best low-cost option. Even though all these tools rely on the same NLP models, the results, as you can see in the examples, do vary, and the Rytr examples are not as good as most of the other tools’.
WriteSonic has become our go-to tool for creating long-form SEO blog posts. The content stays on point and flows well.
Where many tools such as Jasper.ai and Rytr try to be a Swiss Army knife for digital marketers, WriteSonic has focused more heavily on the use case of writers producing blog posts.
It has some nice features that make it a good fit for teams of writers, including support for multiple users and the ability to select different content quality options.
It is definitely not the lowest-cost option, with the cost per word reaching as high as $0.01/word.
Can Google Detect WriteSonic-Generated Content?
Yes, using Originality.AI we are able to detect WriteSonic content as bot-generated. For the example articles run through the tool, it predicted with 99.4% confidence that the content was bot-generated.
In the video below I demo WriteSonic live, show examples of its content, and run them through the tool at Originality.AI:
Pros:
Great long-form content
Able to add multiple users
Able to get started quickly
Has a free trial
Cons:
Expensive
No Unlimited plan
Fewer features than others (which is also a positive)
Pricing:
This is not the lowest-cost option, but it can produce excellent long-form content if that is what you are looking for.
WriteSonic is Best For…
Web publishers with a team of writers looking to leverage AI to increase their team’s efficiency.
Frase is a phenomenal tool that started as a great way to create content briefs easily.
It has evolved a lot over the last 2 years and has now added AI content writing to its feature set.
Because it was not originally built to be just an AI content creation tool, its complexity is higher than most.
Can Google Detect Frase.io AI Writer-Generated Content?
Yes, using Originality.AI we are able to identify when content was created by Frase. Originality.AI’s model predicted with full 100% confidence that the content was bot-generated.
Pros:
It is the best tool for quickly creating a brief by analyzing the top 20 results on Google
Feature-rich tool for SEO writers that goes beyond simply having a bot create content
Cons:
Can be complex… the tool has a lot of pieces to it, and if you are looking for a simple AI writer, this is not it.
Limited to 30 articles per month unless you pay a sizeable $149.99/month for unlimited
Pricing:
Frase AI Writer Tool is Best For:
Frase is best if you are a serious SEO writer who writes a lot of your own content and would like an AI writing assistant paired with more advanced SEO content tools that optimize your content to rank. I would suggest signing up for Frase for its other features first (brief creation, etc.), not for its AI writer.
We use Frase for brief creation to provide writers with the ability to ensure they completely cover a given topic.
Article Forge is the only tool on the list that explicitly says they do NOT use GPT-3. That raises the first of several questions: what do they use, if not GPT-3?
There are a series of concerning issues when testing Article Forge that have me warning people NOT to use Article Forge!
Article Forge has some very interesting functionality for SEO content managers:
Bulk content creation
Automatic uploading & scheduling
Just upload a list of keywords, add a login to your site, and poof: perfect SEO-optimized, factual, and helpful content published on your site… but I fear this claim belongs in the too-good-to-be-true bucket!
Here are the issues I was able to identify with the help of Originality.AI…
Issue 1 – Article Forge Content is Easily Identified as AI
Article Forge states they do not use GPT-3 but don’t explicitly say what they do use. The quality of the output also suggests it is not GPT-3. Since our AI consistently identifies their content as AI-generated, and it is trained on GPT-2, GPT-J, and GPT-NEO, they are likely using one of those models.
Issue 2 – Articles are Mostly Rewrites of top Ranking Articles
Here is how I went about testing this theory on how Article Forge works…
Theory: Article Forge finds top-performing pages for a given keyword and then uses AI to re-write sections of it.
Test Theory:
Create an article using Article Forge
See that it passes keyword plagiarism checks (i.e., Copyscape or Originality.AI)
Enter each LONG paragraph into Google and see how often it matches up with the same source article
Conclusion:
Article Forge uses an NLP API to rewrite sections of top Google results for a given keyword, resulting in an unoriginal article in the eyes of Google.
Potential ISSUE 3 – Article Forge Content Does Not Rank in Google
The example articles at Article Forge do not rank well even if you search for a full paragraph of the article. This could be a targeted Google manual penalty on a portion of a page, or it could mean that Google has successfully identified Article Forge content as unhelpful.
There is indexable, relevant content on their homepage (no googleoff tag or noindex) that was produced by the tool. It should rank very high if we search for an entire paragraph of the example article. However, when I grab the first paragraph and enter the entire thing into Google, the Article Forge article is only the 7th result.
Article Forge Summary
I would strongly recommend not using Article Forge, since it appears to be using either GPT-2, GPT-NEO, or GPT-J to rewrite sections of top-ranking articles in Google for a given keyword. The result is generally a worse copycat of the original article that barely passes Copyscape but is in no way original.
If either a user or Article Forge team member wants to communicate if/how I am wrong please reach out!
AI Content Generation Tool FAQ:
Are there alternative AI article generators?
Here are some additional tools that will be studied in the near future as we work to discover the best AI content generator and whether any are truly “undetectable” as GPT-3-generated content… 1. Shortly.ai 2. Copy.ai 3. CopySmith.ai
Is there a Free AI Writer?
Yes, most tools have a free trial, but there is no AI writer that is 100% free.
What Does Google Consider as Original Content or Unique Content?
When it comes to ranking webpages, Google puts a lot of emphasis on the quality of the content. In particular, the search engine giant looks for content that is original and informative. This means that simply regurgitating information from other sources is not enough to earn a high ranking. To Google, originality means adding your own insight and analysis to the conversation. This could involve offering a unique perspective on a current event or providing new data that challenges existing beliefs. In either case, the goal is to contribute something new and valuable to the discussion. By heavens, there must be something left in the world that hasn’t been said before! If you can find it and say it well, Google will take notice. Originality.AI was built to identify original unique content by answering for a piece of content… 1. Was it written by a human? 2. Did they copy/plagiarize it?
Will AI writers keep getting better?
The typical, obvious answer is yes, especially with GPT-4 coming in 2023 (suspected). However, there is an interesting challenge on the horizon for NLPs. The constraints on training models are quality data, training method, and computing power. With chip shortages, and with GPT-3 already trained on 60 million domains (likely the higher-quality portion of the web), there are potentially two significant constraints on the next model being far better than GPT-3. It seems the next model cannot simply be brute-forced to be bigger; they will need to train the next NLP better. Smart people are working on it, and I am excited to see the results!
Should You Use an AI Content Generator?
This article looked specifically at written content; if you want to understand how AI can be used to generate art, check out this article. Despite showing here that AI content can be detected by another AI, I do believe AI content generators and writing aids are the future, and we are best off learning to work with them while ensuring that anything we publish on the web is original.
Introducing a new case study: A fully outsourced AI content + expired domain + Ezoic content site portfolio.
I have a portfolio of sites that had been mostly neglected, but I have decided to put in an effort to rebuild them. Most of these sites are built on expired domains from LightningRank.com, with content created using GPT-3 tools and then significantly human-edited, similar to how RocketContent.ai produces content.
I have hired a manager, given them access to tools, and had them build out procedures that I reviewed. This has been going on for about 6 months now, and it’s time to start providing a monthly income report.
Overall, I am really happy with the progress of the sites, the usefulness of the content, and the cost per article we are currently achieving.
The plan is that each month we will share the traffic, revenue, expenses, and overall PnL for the portfolio.
Why Expired Domains? The power of backlinks in the eyes of Google is undeniable. Building relevant sites on expired domains continues to be an SEO “hack” we see work.
Why AI Content? AI content with a human editor to spot-check and fact-check produces solid-quality, useful content at unbeatable prices.
Why Ezoic? Ezoic is the simplest way to monetize sites successfully. It beats AdSense all day! When it makes sense, some sites have affiliate monetization, but the focus is on traffic and display ads.
Why Fully Managed? In this case study, I have a manager assigned to the portfolio managing everything, from the sites to the writers to Ezoic.
The aim is marginally profitable, solid growth.
Everything below is prepared by the manager (who is awesome!). Finding managers to run portions of the business has always been key to the success of my businesses, whether affiliate or FBA.
Here’s how our portfolio of AI-driven content has performed in October 2022 with Ezoic.
1. Traffic & Earning Overview (Ezoic)
Our portfolio sites get the most traffic from search engines, and Google is the primary source.
After the latest Google updates, we noticed both positive and negative impacts on the AI content sites. While some sites significantly improved their SERP rankings, others lost a substantial portion of their top-ranking keywords, creating an uneasy fluctuation in the portfolio’s overall traffic.
The overall portfolio traffic in October lost 9.62% in total visits (75,570 compared to 83,610 in September 2022) and 9.27% in terms of page views (88,413 vs. 97,448 in September 2022).
Revenue shows similar statistics to traffic: a 10.90% drop compared to the previous month (September 2022) and a tiny 1.42% drop in ePMV. With roughly 9,000 fewer page views than the previous month, it’s no surprise total revenue couldn’t hold up.
While the portfolio’s October traffic and revenue didn’t top September’s, the result is much better for an entirely AI-driven portfolio than the same month of last year.
The portfolio growth data (Oct 2022 vs. Oct 2021):
Total Visits (+147.70%)
Pageviews (+141.43%)
Revenue (+182.53%)
ePMV(+14.06%)
2. Traffic Overview By Site
Traffic data per site across the portfolio has had frequent ups and downs following the Google algorithm updates of the last couple of months. Some of the major sites’ traffic dropped a little, some newer sites have evolved into major contributors to the portfolio’s overall traffic, and some other sites tanked.
Here is October traffic data from a couple of prominent sites in the portfolio (compared to September 2022):
The portfolio recorded a total of 75,570 visits in October across all sites, with 36,244 the maximum and 164 the minimum for a single site. That is an overall change of -8,040 visits from the previous month, and -9,035 page views.
The maximum pageviews gained by an AI content site in Oct was 41,515, and the minimum was 174. That works out to an average of 5,894 visits across all sites in the portfolio publishing AI content.
3. Revenue Data Per Site
The portfolio’s revenue stats track October’s visit and page view stats. Total revenue for the portfolio was $738.51 in Oct 2022, with $467.32 the highest revenue for a single site, and an average ePMV of $7.19 across all sites compared to $7.31 the previous month (September).
The maximum ePMV during the month was $21.03, while the minimum registered for the site with the fewest page views was $1.45.
The month-over-month changes in ePMV and revenue from September are -1.69% and -9.46%, respectively.
4. Expense
Running a website portfolio with multiple sites publishing regular content can be quite expensive. Having a functional, well-managed AI content machine was the plan for reducing the cost of content creation.
To produce content with an AI copywriter tool, we have a team of human writers who follow specific guidelines, handle content creation, fact-check the AI-generated text, and optimize it to ensure output quality and authenticity.
So we have a combined monthly expense: the editorial team plus the AI copywriter tool (our primary AI copywriter is Jasper.ai), which generates text for a fixed price per word.
Here’s what the portfolio’s October total expenses look like compared to the revenue generated from Ezoic ads (excluding other revenue sources):
Key data:
Total expenses: $652.00
Total Revenue: $740.73
Profit (with Ezoic revenue): $88.73
% of profit Ezoic revenue brings in: 13.61%
5. Changes & Focus:
Unpredictable performance in terms of traffic and revenue has always been part of running content-based websites, especially those that get most of their traffic from search engines.
To publish content written with AI support that still does well in the Google SERPs, we need to put great effort into content quality and optimization. With that as a primary focus, we also need to deliver an optimized experience to the reader.
Apart from maintaining content quality, our objectives for the sites in this portfolio are:
Delivering simplicity: a simple, easy-to-understand approach to delivering the information people are looking for, letting them access it without dealing with complex elements on the site.
Visual Hierarchy: we attempt to organize the website elements so that the visitors naturally see the most important elements first.
Navigability: all the sites have well-organized navigation, giving visitors a quick idea of the content clusters and letting people find what they are looking for.
User-centricity: we build and update the websites in the portfolio to deliver smoother usability and experience to the users. After all, it is the users who will experience the website and explore its contents.
Accessibility: we try our best to remove anything that can prevent users from accessing the website or break its functionality.
Responsiveness: the sites are 100% responsive, compatible with the many different devices our visitors use.
Maintain a good site health score: ensuring a site is free of minor issues such as heading abuse, metadata problems, or link errors is vital, because they can be decisive when you are pushing for SERP positions and want your pages 100% ready to compete with a quality piece of content. Done right, this is quite possible even with AI copywriting tools.
Site speed and web vitals: site speed and maintaining good web vitals are among our primary objectives. Our internal data shows that after the speed upgrade, the portfolio sites delivered a much better crawling experience for Googlebot, resulting in roughly 6x the daily crawl requests compared to before.
Not only that, fast-loading sites improve the browsing experience and give us peace of mind once the web vitals metrics pass in Google Search Console. The speed upgrade also improved our visitors’ time on page and page depth, thanks to Ezoic’s CDN and performance tool (LEAP).
1. The addition of new sites to the portfolio
Alongside smoothly running the sites currently in the portfolio, we look forward to adding promising sites and growing them within our strategy. Having a functional setup for publishing regular AI content, and a plan for optimizing it, makes bringing new sites into the portfolio easier.
Once a new site arrives, we assess its current traffic, design, performance, and other elements to come up with a plan that can lift the overall quality and experience. Once the site is ready, we prepare a strategic content plan to execute over the upcoming months.
That being said, not every site we have added to the portfolio has had a great time and better output. Some sites tanked and lost a significant portion of their traffic, but most of those setbacks arrived after the recent Google updates. That raises the debate about Google penalizing AI content sites, which does not hold TRUE for the other AI content sites in the portfolio.
The sites that lost a major portion of their traffic lately were those ranking well on previously created content. It is safe to say those pages were not the best-quality documents available on the internet for their search queries. We are still putting effort into better content updates to see and test how those pages perform going forward.
Number of sites added to the portfolio in October: 2
2. Nameserver connection for Ezoic
Ezoic recommends the nameserver integration method for better site performance and a revenue boost. Earlier on, we had many sites integrated via the Ezoic plugin; now all of them have switched to the nameserver method.
With the nameserver connection, site performance has improved a lot. Almost all sites in the portfolio are passing, or close to passing, web vitals… although we did need to make some visual and technical changes to the sites.
Better performance and faster loading of the primary page elements help the ads load faster too. Pages and ads that load faster can improve ad revenue by a decent margin, and this has definitely helped our sites increase their Ezoic ad revenue.
3. Seasonal decline & monitor performance
Seasonal swings in interest and demand for products and services affect sites; they are a perfectly natural, annoyingly unavoidable aspect of websites, as of any business. Quite obviously, the sites in our portfolio aren’t independent of seasonality.
There are unpredictable ups and downs in traffic for the sites. With lots of sites in the portfolio to maintain, and other aspects of the sites to take care of, it becomes quite challenging to keep a close eye on each site individually.
Semrush and Ahrefs help by providing a traffic overview and the keywords found in organic searches for each site. That gives us an idea of a site’s current traffic state versus what the coming months could look like based on the trajectory of its organic keywords.
In addition, we can analyze the organic keyword situation and, based on its latest form, decide whether the content needs improving.
Conclusion: It is fun to manage multiple sites with a proper plan to run them all together while still making a decent profit, with provisions to increase revenue in the future. This could easily be a project that consumes most of our time, but a well-designed plan, ready-to-use tools, and the right people in the right places make managing the portfolio much easier.
This income report series is aimed at sharing our growth and the fundamentals that got us here, small details that can help you build your own AI-driven website portfolio at a significantly lower cost.
We can’t discuss fixing web vitals without discussing one key factor: LCP, the Largest Contentful Paint. LCP is essential to the user experience. People typically don’t like to wait for a page to start loading when they could immediately get a faster experience elsewhere.
LCP tells you how long it takes for the main content of your page to load and become readable to users. The faster the LCP, the better the performance impression a user gets of your website.
Largest Contentful Paint is considered one of the main web vitals factors and is starting to play a vital role in how your pages rank in the SERPs. To stay in Google’s favor and deliver a better experience for visitors, it’s important to stay on top of your LCP.
Poor LCP can be one of the reasons for higher bounce rates, ranking issues, or even lower conversion rates.
What is a Good Or Bad LCP?
According to Google, the main content of a page should load within 2.5 seconds of a user’s visit; that is what a good LCP means. If the main content takes more than 4 seconds, people are likely to exit your site, and the LCP is considered poor.
Usually, when a page maintains a good LCP (main content load time under 2.5 seconds) for at least 75% of page loads, its performance is marked good in Google Search Console.
This image explains how LCP ratings are categorized.
Image credit: Semrush.com
What can cause poor LCP and how to detect it?
There is no single element responsible for LCP. Anything that slows down the page rendering process can cause an LCP issue on your pages. Usually, elements that take up a lot of space, media players, unoptimized images, social media buttons with third-party connections, and newsletter sign-up forms can increase your page’s load time.
If we dive deeper into the primary causes of poor LCP, they can be classified this way.
1. Slow Server Response Times
Regardless of how much you optimize your website, if you have a slow server, its response time will drag down your page loading metrics. This will definitely result in a poor LCP record.
Bad server back-end infrastructure, unoptimized databases, and longer time to resolve API requests are often the reasons for slow server response time.
So the first step towards improving your LCP records is to host your website on a fast web server.
2. Slow Resource Load Times
Resources that are large in file size or visually dense take longer to load. Unoptimized images, GIF files, videos, and large blocks of text above the fold will all affect your LCP record.
The best ways to take care of your LCP are to compress file sizes, load only necessary elements above the fold, and avoid video or large image files in the initial render.
3. Render blocking Javascript and CSS
How CSS and JavaScript are served plays a significant role. Piling on JavaScript and CSS to make a fancy-looking web page will eventually cost you page performance.
It’s true that without CSS and JavaScript we can’t add dynamic, interactive, and attractive elements to our websites. But the timing of execution for these files needs care: if you load them in the <head> and they take a long time to execute, they block other website assets from loading.
Reducing render-blocking assets can significantly boost your pages’ LCP. Still, while fixing render-blocking, make sure you are not shifting important libraries or JavaScript and CSS code to the footer; that can end in a broken web page or an accessibility issue.
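One common pattern (a sketch with a hypothetical stylesheet name, not the only way to do this) keeps critical styles inline while loading the rest without blocking rendering:

<!-- critical above-the-fold styles stay inline in the head -->
<style>/* critical CSS here */</style>

<!-- the full stylesheet loads without blocking rendering:
     media="print" is non-blocking, then onload switches it on -->
<link rel="stylesheet" href="/css/non-critical.css" media="print" onload="this.media='all'">
<noscript><link rel="stylesheet" href="/css/non-critical.css"></noscript>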
4. Client-side Rendering
Rendering web pages in the browser with JavaScript is a popular web development method, but client-side rendering can be one of the main causes of LCP issues on pages that use it.
Client-side rendering involves the browser fetching and executing JavaScript before rendering the primary content on the page. Depending on the rendering time and complexity, the webpage might struggle to pass the LCP metric.
So when you are concerned about your LCP and your site renders client-side, consider optimizing, or avoiding, client-side rendering if possible.
How to discover LCP elements on your page?
To fix LCP issues, you first need to know what your LCP element is. Here, in order, are the steps we follow to analyze the LCP of a web page and plan the necessary fixes.
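If you prefer to check programmatically before reaching for the tools below, the browser exposes the LCP candidate directly through the standard PerformanceObserver API; this small snippet (runnable from the DevTools console) logs the current LCP element and its timing:

new PerformanceObserver((entryList) => {
  // the most recent entry is the current LCP candidate
  const entries = entryList.getEntries();
  const lcp = entries[entries.length - 1];
  console.log('LCP time (ms):', lcp.startTime, 'element:', lcp.element);
}).observe({ type: 'largest-contentful-paint', buffered: true });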
1. Use Google Search Console
A good way to confirm your page has LCP issues is to check Google Search Console’s web vitals section. It displays URL performance status collected from real users’ devices, with separate reports for the desktop and mobile versions.
When the URLs are marked good, your site’s LCP is fine. But when they are marked “needs improvement” or “poor,” you should check which web vitals metrics are actually failing and need attention.
2. Use PageSpeed Insights
PageSpeed Insights is a dedicated tool for instantly checking a page’s web vitals, both over the latest 28-day collection period and via a live test. When the origin data in PageSpeed Insights shows an LCP of less than 2.5 seconds, you are in the safe range.
It’s when LCP exceeds 2.5 seconds that you should be concerned and try to fix it. The instant test in PageSpeed Insights should help you identify the elements responsible for LCP.
Scroll down to the “Diagnose performance issues” section.
Then set the “Show audits relevant to” to LCP.
You will be shown opportunities to improve LCP, along with diagnostics.
3. Use Ezoic’s Browser Extension
Another easy way to inspect your LCP elements is via Ezoic’s browser extension. Usually, this extension is used for creating ad and video placeholders on your pages; you can adjust those placeholders and test different configurations to boost Ezoic revenue.
There is another section called LEAP where you can view performance-related alerts, debug the CDN on a page, and analyze and highlight LCP elements. After analyzing a page, toggle “Highlight LCP element” and the plugin will highlight the LCP element in yellow.
4. Test Your Page on WebPageTest
Webpagetest.org is a page performance testing tool that provides in-depth insights. When you scan a page for web vitals, this tool gives you the scores and the elements responsible for LCP.
It is a handy tool for inspecting layout shift issues as well; you can see which element is responsible for the layout shifts on your pages, making them easier to fix.
The WebPageTest report highlights the LCP element in green and provides its timing.
Once you know what is causing your LCP, you can optimize the LCP content to fix any LCP issues on your page. Here are a few tips for fixing LCP issues, followed by a combined sketch that puts several of them together.
A Few Tips To Fix LCP Issues On Your Page:
1. Optimize your images: Make sure your images are lightweight while keeping quality intact. I prefer to keep images between 50 and 100 KB whenever possible, which is quite achievable unless you run a photography site, a portfolio site, or any site where images are the priority.
2. Preload your logo: Preloading the logo does not add much by itself, but you want your site logo to load as soon as possible. This usually gives a 2-5% boost to LCP timing when done right.
3. Avoid using too many fonts: Fonts make a site attractive, but loading too many font files lengthens the asset loading chain and can delay the painting of text on the page.
4. Localize and preload above-the-fold fonts: Although Google Fonts is fast, my favorite strategy, which seems to work every time, is hosting above-the-fold fonts locally and preloading them. While preloading, make sure these font files are not too heavy. Your web server also needs to be fast for localized font files to be served quickly.
5. Optimize JavaScript: Remove unused code, make sure the code is up to date and compatible with modern browsers, use modern JavaScript libraries, and try to reduce payloads. A caching plugin with JavaScript management options makes the job easier.
6. Optimize CSS code: Compressing CSS files and implementing critical CSS often solves CSS-related render-blocking issues. Removing unused CSS from your WordPress site can also help bring down the LCP metric.
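To tie these tips together, here is a hedged sketch of what an optimized page head and hero image can look like; all file names are hypothetical, and fetchpriority is supported in newer browsers:

<head>
  <!-- tip 2: fetch the logo early -->
  <link rel="preload" href="/images/logo.svg" as="image">
  <!-- tip 4: preload a locally hosted above-the-fold font -->
  <link rel="preload" href="/fonts/body.woff2" as="font" type="font/woff2" crossorigin>
  <style>
    /* tip 6: inline critical CSS; font-display: swap keeps text visible
       while the localized font file loads */
    @font-face {
      font-family: "Body";
      src: url("/fonts/body.woff2") format("woff2");
      font-display: swap;
    }
  </style>
  <!-- tip 5: defer non-critical JavaScript -->
  <script defer src="/js/site.js"></script>
</head>
<body>
  <!-- tip 1: a compressed hero image with explicit dimensions,
       fetched at high priority and never lazy-loaded -->
  <img src="/images/hero.webp" width="800" height="450" fetchpriority="high" alt="Hero image">
</body>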