Why Your Google Ranking Drops and Tools to Fix It

Imagine you’re someone who works hard to make your website show up on Google when people search for things. It’s like a race, and you want your website to be at the front. But one day, you wake up and realize your website has fallen way behind or has vanished from the race altogether. This can be really scary and frustrating!

The process of making your website do well on Google is like planting seeds in a garden. You have to wait for those seeds to grow into healthy plants, and it takes some time, maybe a few months. But what’s even more frustrating is when it seems like all your hard work isn’t paying off.

Many things can affect how well your website does on Google, and in this article, we’ll explore why your website might be doing poorly and what you can do about it.

Don’t panic!

If you notice that your website isn't ranking as high as it used to, don't panic; it might just be a technical glitch. Here are some simple steps you can try first:

1. Check your rankings using a different tool or service. I like Google Search Console for finding page rankings; here's an article that explains how you can check search rankings in GSC (and there's a small API sketch after this list if you prefer pulling the data programmatically). Additionally, you can use tools like Semrush or Ahrefs to track positions across different countries.

2. Notice whether it's just one specific page that dropped in ranking or whether your whole website is affected. If it's a single high-traffic page, analyze that page's rankings for its organic keywords. Again, you can use GSC, Ahrefs, or Semrush.

3. Take a manual look at your rankings in the countries where most of your traffic comes from. You can use SERP watch to track live rankings.

4. Keep an eye on your competitors’ websites. If they’re experiencing a drop in traffic too, it could be due to a recent change in Google’s algorithms. Here’s what I do:

  • Take the URL of any competitor page ranking on the first page of Google.
  • Go to Semrush and scan the URL with the "exact URL" option.
SEMrush URL scanner
  • This will give you an idea of organic traffic for those pages in recent days.
Organic Traffic And Keywords
  • If traffic is low on your competitor pages too, there might be a drop in seasonal demand, or the topic may simply be due for an update.
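If you prefer pulling this ranking data programmatically, here is a minimal sketch using the Search Console Search Analytics API. It assumes the google-api-python-client and google-auth packages, a service-account JSON key that has been granted access to the property, and a placeholder property URL and date range; adjust all of these to your own setup.

```python
# Minimal sketch: pull average position per page and country from the
# Search Console Search Analytics API. Assumes google-api-python-client
# and a service-account key with access to the property (placeholders).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://example.com/"          # hypothetical property
KEY_FILE = "service-account.json"          # hypothetical credentials file

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2023-09-01",
        "endDate": "2023-09-28",
        "dimensions": ["page", "country"],
        "rowLimit": 100,
    },
).execute()

for row in response.get("rows", []):
    page, country = row["keys"]
    print(f"{country:>3}  pos {row['position']:.1f}  clicks {row['clicks']}  {page}")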

If you're confident, after checking your rank-tracking tools and Google itself, that the drop in your rankings isn't caused by a technical issue, it's time to investigate why your rankings have gone down.

That takes us to some common reasons why a WordPress website might not perform well in search engine results.

Possible reasons for your Google ranking drop and fixes

1. Algorithm changes

One of the main causes of ranking drops is, of course, changes to search engine algorithms. These updates are meant to improve search results, but in the process they can hit your site with a significant decrease in rankings, which is quite frustrating, to be honest.

This year (2023) Google rolled out two major core updates, in March and August, followed by a helpful content update in September, causing many websites across the world to lose rankings. These updates particularly affected sites that the system judged to be unhelpful or to contain unverified information.

Barracuda Tool Analytics

Every website is affected differently by an algorithm update. Some sites shoot to the top while others drop drastically. And even if your site did great after one core update, that doesn't mean it will stay intact after the next one if it falls behind other sites in fulfilling what users are looking for.

A tool you can use:

A handy tool you can use in this situation is the Panguin Tool by Barracuda. It connects to your Google Analytics and helps you see how your website's traffic lines up with Google's algorithm changes. It's like having a map that shows when your traffic started to change and whether there was a Google update just before that happened.

Even if you see some changes in traffic and ranking, it's important not to rush into action right away. When a new algorithm update rolls out, things can be a bit chaotic at first; monitor the situation, as rankings tend to calm down and settle within a few days.

2. Change in Seasonal Demand

If your website’s rankings have fallen, it could be because your topics are no longer as popular or the season when it’s in high demand has passed. For instance, if you’re in the business of selling snow shovels, your rankings will probably decline as spring and summer approach since people won’t be searching for snow-related products during those seasons.

Google Trends is the best free tool for spotting seasonal changes and seeing how interest in a topic rises and falls over time. You can customize it to see the bigger picture, such as focusing on specific locations or time periods.

Typically, setting the time period to the past 12 months (or a full year) gives you the best view of seasonal demand changes across your target location, or worldwide if your market is international.
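For a scriptable view of the same data, here is a minimal sketch using the unofficial pytrends library (pip install pytrends); the keyword, region, and timeframe are just examples, and since the library is unofficial its interface may change.

```python
# Minimal sketch: check seasonal demand with the unofficial pytrends library.
# The keyword and region below are examples only.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["snow shovel"], timeframe="today 12-m", geo="US")

interest = pytrends.interest_over_time()   # weekly interest on a 0-100 scale
print(interest["snow shovel"].tail(12))    # roughly the last three months
```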

Change in Seasonal Demand

3. Check for manual action

Another major reason your website can see a significant drop in search results is a Google penalty. Penalties exist for websites that break the rules, and they can be applied either algorithmically or manually by Google's team.

Algorithmic penalties come from updates designed to reward high-quality content. If your site doesn't align with these new algorithms and Google's many ranking factors, it might drop in the search results.

Manual penalties are like a warning from Google that remains in place for an unknown duration when you don't follow its rules for website owners. Common violations include:

1. Using content that’s not original.

2. Sneaky redirects that hide where a link goes.

3. Keeping hidden content or links on your site.

4. Creating low-quality pages that just lead to other pages.

5. Automatically generating content without human input.

6. Misusing structured data or other special markup.

7. Having harmful pages on your site.

8. Stuffing your pages with too many keywords.

Your site might suffer a slight decline in ranking or vanish from Google’s search results based on how severely you have broken its rules.

Tool to use:

Use Google Search Console in this case. Expand the "Security & Manual Actions" section and click "Manual actions." If your site has been penalized, you will see a warning explaining the type of issue your site triggered and what action you need to take.

Manual Action

If you don’t find any manual penalties, it’s likely that the decrease in your ranking isn’t because of spam or bad practices on your site. Instead, it could be related to technical problems, SEO, changes in what people are searching for, or outdated content – which we’ll discuss later in this post.

4. Have you recently changed anything on the website?

At times, drops in your Google ranking can be tied to changes you’ve made to your website. In fact, any improvements you have made to the site or design alterations can impact your site’s ranking, either in a positive or negative way.

A positive change is great, but you need to know what type of changes can lead to a ranking drop.

  1. Content updates: If you or someone managing your site changes the content or adds new keywords, it can lead to a ranking drop. Such a drop typically appears when you use a keyword too frequently or remove important text related to the page's topic.
  2. Moved a page: When you move a page and redirect it to a new URL, it takes time for search engines to find and process the change (see the redirect-check sketch after this list). During this period, your site may not perform as well in Google search results, and the page might seem like it's gone.
  3. Change in internal links: When you adjust the links within your site, you change how "link juice" flows, so some pages might lose their importance in the process.
  4. Design change: Significant changes to your website's design can sometimes make it harder to use and slower to load. These factors affect how users behave on your site, which is an important ranking factor.
  5. User experience: A cluttered layout, layout shift, intrusive pop-ups, or difficult navigation can all hurt the user experience. Google uses Core Web Vitals metrics to rank pages on performance and user experience, so if your new site design fails badly on those metrics, chances are your site will see a drop in ranking.
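For point 2, here is a minimal sketch that checks whether an old URL still returns a permanent redirect to its new location; the URLs are placeholders and the requests package is assumed.

```python
# Minimal sketch: verify that a moved page still permanently redirects to its
# new location. The URLs are placeholders for your own old and new paths.
import requests

OLD_URL = "https://example.com/old-post/"
NEW_URL = "https://example.com/new-post/"

resp = requests.head(OLD_URL, allow_redirects=False, timeout=10)
print(resp.status_code, resp.headers.get("Location"))

assert resp.status_code in (301, 308), "expected a permanent redirect"
assert resp.headers.get("Location", "").rstrip("/") == NEW_URL.rstrip("/")
```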

Tool to use: 

There isn't a specific tool that directly shows how changes to your website affect its search rankings. That's why it's important to keep an eye on the positions of your site's pages in Google Search Console for at least two weeks after making any changes or improvements.

If you encounter issues, you can use Search Console's URL Inspection tool (the successor to Fetch as Google) to investigate whether there's a problem on the page, or check for structured data issues that might be preventing search engine bots from properly indexing it.

If your page’s position in the search engine results page (SERP) hasn’t improved within a few days, you might consider trying to restore a previous version of the page from the Wayback Machine.

wayback machine

5. On-page issues

If you have recently optimized your website, you might run into issues that have a negative impact on your search ranking once Google's bots have indexed your site. It's important to make sure your site doesn't hit critical errors while loading, so everything renders correctly and the bot can index it properly.

Commonly, ranking drops happen when a Google bot finds broken links on your page: you've linked to a nonexistent page, pasted only part of a link, or placed a link inside a design element (such as a button) that has become broken or inaccessible because of an optimization rule.

For each page, it’s very important to create Title and Description meta tags and include Alt attributes for images. If these tags are missing or incorrect, search engines may reduce the page’s visibility in search results.

Furthermore, it’s important to ensure proper HTTPS configuration, prevent mixed content issues, address duplicate content concerns, manage permanent and temporary redirects properly, and avoid blocking Google bots from crawling your pages via .htaccess rules.
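As a quick sanity check before the bots arrive, here is a minimal sketch that fetches one page and flags missing meta tags, images without alt text, and broken outgoing links. It assumes the requests and beautifulsoup4 packages, and the page URL is a placeholder.

```python
# Minimal sketch: audit one page for missing title/description meta tags,
# images without alt text, and links that return 4xx/5xx responses.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE = "https://example.com/some-post/"   # hypothetical page to audit

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

if not soup.title or not (soup.title.string or "").strip():
    print("Missing or empty <title>")
if not soup.find("meta", attrs={"name": "description"}):
    print("Missing meta description")
for img in soup.find_all("img"):
    if not img.get("alt"):
        print("Image without alt text:", img.get("src"))

for a in soup.find_all("a", href=True):
    link = urljoin(PAGE, a["href"])
    if not link.startswith("http"):
        continue                            # skip mailto:, #anchors, etc.
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Broken link:", link, status)
```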

Google does not like paid links and link exchanges. If you have done so recently and they detect those links leading to questionable websites, your ranking in the search engine results page (SERP) may be negatively affected.

Tools you can use:

There are several tools at your disposal for optimizing your on-page SEO. The first is your browser console, which helps identify issues that occur while your web pages load. You can access it by pressing Ctrl + Shift + I on a Windows PC, or Option + Command + J (Chrome) / Option + Command + K (Firefox) on a Mac.

This guide should help you out on how to look for errors via the browser console.

SEMrush and Ahrefs can be used to scan for and detect a wide range of page-related issues. These two are my go-to tools for catching internal page issues that we can't see with our eyes.

Apart from that, you can use the Accessibility and SEO sections of PageSpeed Insights to identify issues with your page. The same checks are available through the Lighthouse report in your browser's dev tools.
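If you want to run those Lighthouse categories in bulk, here is a minimal sketch that calls the public PageSpeed Insights v5 API; the URL is a placeholder, and for heavier use you would add an API key.

```python
# Minimal sketch: query the PageSpeed Insights v5 API for the Lighthouse
# SEO and Accessibility categories of one page (placeholder URL).
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {
    "url": "https://example.com/",          # hypothetical page
    "category": ["SEO", "ACCESSIBILITY"],   # Lighthouse categories to run
    "strategy": "mobile",
}

data = requests.get(API, params=params, timeout=60).json()
for cid, cat in data["lighthouseResult"]["categories"].items():
    print(f"{cat['title']}: {cat['score'] * 100:.0f}/100")
```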

On-page issues

If you come across any issues on your page, make an effort to search for online resources and resolve them. The fewer issues your page has, the more optimized it becomes, making it more appealing to Google.

6. Your competitor has improved their pages

Everybody wants to rank their pages on top of SERP. Similar to you, your competitors are continually trying to enhance their websites. It’s a constant race for the top positions in search engines.

If you were previously ranked higher than a competitor but they've been actively investing in their site, focusing on content, backlinks, and more, while you've dedicated less time to SEO or focused on other content, your ranking can slip and your competitor may surpass you. Such shifts are more aggressive during algorithm updates, because the algorithms are good at spotting high-quality content and will test it at the top of the SERP.

So ensuring your content is up to date and contains enough quality information for readers should help your site withstand such SERP changes.

Tool to use:

You can use Pagescreen to monitor your competitors' pages and get notified when a visual change occurs. You can configure monitoring for specific pages, and on the dashboard you'll see the changes and screenshots captured for each page you are monitoring.

This way you can keep a close eye on your competitor pages and know who has gained the most out of your ranking drops and what they have been working on.

If you find a lot of movements & updates on competitors’ websites, you can analyze those changes and optimize your content so you don’t lose your positions. In addition, use this free content optimizer to analyze your target keywords and competitors’ pages. 

7. Off Page SEO Negative Impact

SEO can benefit or harm your site depending on how you do it. Not only that, your website can also suffer from the actions of other web admins. Here's how:

1. Your competitors may try to play dirty by inserting artificial links to your site. This can lead to penalties under Google’s Penguin filter, hurting your site’s visibility and reputation.

2. Sometimes, other websites might plagiarize your content. If they publish your content before it gets indexed by Googlebot, it could be seen as duplicate content, potentially causing your page’s ranking to drop.

3. Engaging in prohibited backlink practices can have a detrimental impact on your page’s SEO. This can result in a lower ranking compared to previous weeks, making it harder for users to find your site.

4. Beware of fake user spam, whether it’s coming from unknown sources or, in some unfortunate cases, even from yourself. This is a red flag and can seriously affect your search engine results page (SERP) ranking.

What you can do: use Originality.ai, a highly efficient plagiarism detection tool with AI content detection built in. Originality.ai will let you know if your content is being published somewhere else. In addition, you can use it as a traditional plagiarism check for your writers' content.

Off Page SEO Negative Impact

Furthermore, you will benefit from the readability score it produces, which is really helpful for reworking your content since it offers real-time suggestions with different color codes.

Originality Ai Readability Check

Regarding link spam from unknown sources, you can use Semrush's backlink analyzer to find the offending domains and then submit them through Google's disavow tool. Disavowing at the domain level covers every link from that domain and makes the task a lot easier.
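For reference, here is a minimal sketch that writes a disavow file in the format Google's disavow tool accepts (one "domain:" entry per line, with optional "#" comments); the domains are made-up placeholders.

```python
# Minimal sketch: turn a list of spammy referring domains (e.g., exported
# from your backlink tool) into a disavow.txt for Google's disavow tool.
spammy_domains = ["spam-example-1.xyz", "spam-example-2.top"]  # placeholders

with open("disavow.txt", "w") as f:
    f.write("# Spammy referring domains flagged during the backlink review\n")
    for domain in spammy_domains:
        f.write(f"domain:{domain}\n")
```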

If you're dealing with copied content, you can file a DMCA copyright infringement notice to have it taken down. For spam traffic detection, Google Analytics should help you out; you can then block those IPs with the help of your web host.

8. Your site might have lost some critical backlinks

Link building is an ongoing effort that requires constant attention. Your page may lose its ranking when key backlinks develop issues or are removed by a third party. These are some of the most common causes of lost links:

1. Link removal: Sometimes, links that were initially placed on other websites are taken down. This can happen if the agreement was temporary, there's a disagreement over terms, or the site admin asks for payment to renew the link. Losing backlinks can erode your site's credibility and hamper your SERP rank.

2. Obsolete links: Backlinks can become outdated if they lead to pages that are no longer relevant or have been removed from your site. This hurts both user experience and SEO, so make sure the links pointing to you come from relevant pages and accurately describe what readers will find on your site; this is a really effective practice in link building.

3. Loss of domain trust: Trust flows in both directions. If a site linking to you loses its trust or credibility, that can drag down your own site's reputation. Backlinks from untrustworthy sources can be considered toxic and harm your credibility, so your pages may drop some SERP positions.

Tools that can help you:

You can find multiple tools to monitor and manage backlinks. I prefer Ahrefs because you get backlink tracking along with a complete link management and SEO suite, with a 7-day trial and $99 per user after that.

9. You Have Not Worked on Your Web Vitals

In August 2021, Google rolled out the Page Experience algorithm, which brought along a set of Core Web Vitals ranking factors. This update signifies that Google is now placing more emphasis on the loading speed of pages and user-friendliness.

Optimizing for Web Vitals doesn't always mean chasing the highest score on speed-testing tools like PageSpeed Insights, GTmetrix, and Webvitals.dev. It's more about finding the right balance between speed and accessibility, so that your site satisfies the following metrics from a performance point of view while remaining accessible to both users and bots.

Let’s take a look at web vital metrics and what they stand for in simple terms:

LCP: Largest Contentful Paint. This measures how long it takes for the largest element on a page to appear on the user’s browser.

FID: First Input Delay. This measures the delay between a user's first interaction with a page and the moment the browser is able to respond to it.

INP: Interaction to Next Paint. FID has limitations in measuring the responsiveness of a web page, so Google plans to replace FID with INP in March 2024. That's why focusing on optimizing your site for INP is a forward-thinking decision right now.

CLS: Cumulative Layout Shift. This tracks the overall amount of layout movement that happens while a page is loading.
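To check these metrics against real-user (field) data, here is a minimal sketch that queries the Chrome UX Report API for an origin; it assumes a Google API key with the CrUX API enabled, and the origin is a placeholder. A metric only appears in the response if Chrome has enough field data for that origin.

```python
# Minimal sketch: pull field Core Web Vitals (75th percentile) for an origin
# from the Chrome UX Report API. API key and origin are placeholders.
import requests

API_KEY = "YOUR_API_KEY"   # hypothetical key with the CrUX API enabled
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

payload = {"origin": "https://example.com"}   # or {"url": "..."} for one page

record = requests.post(ENDPOINT, json=payload, timeout=30).json()
for metric, data in record["record"]["metrics"].items():
    print(metric, "p75 =", data["percentiles"]["p75"])
```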

web vital assessment check

How do these Core Web Vitals matter even more in a competitive SERP landscape? Well, they can give you an extra edge. In a crowded field, having faster loading and more user-friendly pages can help you stand out from the SERP competition, attracting more visitors and boosting your rankings in search results. 

So, paying attention to these web vitals isn’t just about keeping Google happy; it’s about making your site more appealing and competitive in the online world.

What tools to use?

The first step would be to run a speed-testing tool and review the site's performance results. Then go to Google Search Console > Page Experience > Core Web Vitals and check whether any URLs are rated poor or need improvement.

In case there are any, you should focus on improving your performance and page experience.

web vital search console

The performance test tools combined with WordPress caching plugins can help you fix these performance-related issues and prepare for a better web vital status. 

You can also rely on a hands-off approach and have a discussion (including pre-order analysis) with our web vital optimization team at Speedy Site. This service has been helping a lot of site owners fix their page speed issues and pass web vitals so I bet they can offer an optimal solution for your site as well.

10. Look for Geolocation fluctuations

A website’s ranking isn’t set in stone; it can vary significantly depending on the location. In different countries, regions, and cities, a page’s position in search results can differ. For instance, your site may rank very well in your country but drop its position in another region.

Google tailors search results to individual users which means that two different people searching for the same thing may see different results. Additionally, whether you’re signed in or not to your Google account can also influence the results you receive. So, keep in mind that search rankings can be quite dynamic and personalized.

The best tool to get an overview of your pages and target keywords (topics) is Google Search Console. I have explained how you can leverage GSC’s search performance data to analyze the average position of your page for many different countries.

GSC queries

To improve your page’s ranking across various geographic locations, you need to focus on offering top-notch original content and configuring multi-language support on your site.

11. Avoid low-quality content

It’s really not worth your time to put out low-quality content that doesn’t genuinely benefit your readers. Instead of churning out a massive quantity of articles that essentially say the same thing as others, it’s much more effective to focus on a smaller number of high-quality pieces.

The key to valuable content is to offer solutions that may be similar to what's already out there but are presented in a way that's more user-friendly and easier to follow, enhances the overall reader experience, and, above all, provides a real solution.

Even better if you can inject your own unique approach that sets your content apart and earns the admiration of your audience.

Furthermore, when updating your content, keep the following in mind:

  1. Make sure your content isn't plagiarized. Even copying a portion of someone else's work can lead to Google penalties.
  2. Make your content engaging for users without resorting to clickbait tactics. Using numbers and questions in headlines can still be effective.
  3. Include information about the content creator, such as their name and links to their social media accounts, to establish expertise.
  4. Always use verified data and provide links to reliable sources, especially for content that affects health, happiness, safety, or financial stability (YMYL). Misleading content can lead to severe consequences.
  5. Optimize the user experience by breaking up large blocks of unformatted text. Google's quality raters value a readable format, and it can affect bounce rates and, ultimately, your ranking.
hemingway readability check

Recovering page ranking in SERP

When your WordPress site takes a hit in the Google rankings, it can be a bit of a head-scratcher. 

To tackle it head-on, try the tools mentioned above, then work on updating your content with compelling, valuable intent that keeps your audience engaged and eventually helps your site align with what Google considers user-friendly content.

Read More

Restoring Lost Website Traffic After an Algorithm Update: Actionable Steps

In the ever-evolving world of search engines, Google algorithm updates have become a buzzword for websites striving to rank high and reach their target audience. These updates are mysterious forces that can either make or break a website’s online presence.

Imagine waking up one day to find that your website has dropped in search engine ranking and your traffic has plunged. It feels like being stranded on an island with no way to escape, wondering what went wrong and how to fix it. This is the impact of an algorithm update, which can send shivers down any website owner’s spine.

But it’s not all doom and gloom. Algorithm updates are Google’s way of refining and improving the search experience for users. By rewarding websites with high-quality content, good user experience, and following SEO best practices, Google is helping users find the information they need quickly and easily.

So as a website owner, it’s essential to understand that each algorithm update is unique, and its impact can vary depending on several factors. However, by creating valuable content and providing an exceptional overall user experience, site owners like us can build resilience to algorithm updates and maintain their search engine ranking and traffic.

So, if you want to stay ahead of the curve, here are a few actionable steps you can take right now to embrace the challenge of algorithm updates and adapt your website to the ever-changing demands of the search engine.

1. Analyze your website’s search performance on Google Search Console

Google Search Console is an essential tool that provides valuable insights into the performance of your website on search engine result pages (SERP), allowing you to filter historical data according to your specific needs. 

  • To begin, select your website property from the dropdown menu located in the top left panel of the Google Search Console dashboard. 
  • From there, navigate to the “performance” section and click on “search results.” The performance dashboard will then display important metrics, such as total clicks, impressions, average click-through rate (CTR), and average position. These metrics are crucial for understanding your website’s search performance and future projection.
  • To proceed, ensure that the search type is set to “web” and select a date range of 6-12 months. This timeframe allows you to identify the impact of Google’s algorithms and compare current performance stats with previous data. 
  • Once you have selected the appropriate metrics, activate the “average position,” “total clicks,” “CTR,” and “total impressions” graphs. 
  • Then scroll down to the “pages” tab and order the pages based on the highest clicks, although you can choose to sort by average position or impressions.
  • Lastly, compare the metrics to determine which pages need improvement. By comparing clicks, impressions, and CTR, you can identify pages that are not performing as expected; these are the pages to look at first.

To begin the optimization process, first review your Google Analytics data and identify pages that have experienced significant traffic loss after the update. Alternatively, you can tell just by looking at the performance chart and seeing how average position, impressions, and clicks are trending compared to the previous couple of months.

  • Once you have a page in mind, select it from the list of pages; it automatically gets set as a filter along with the date and web-search settings you previously established.
  • Once the page filter is set, navigate to the "Queries" tab and analyze the top keywords for that page that are generating the most impressions. You can then activate the impressions, clicks, CTR, and position metrics to compare the best keywords for that particular page.
  • With this data, you can adjust your content to optimize for the best-performing keyword while also paying attention to other top keywords. (Note that using these keywords to update your page only covers keyword optimization; you still need to ensure the page provides maximum value and a pleasant user experience.)
  • This way, you can export a list of pages hit by the latest algorithm update to a spreadsheet, giving you a focused approach to recovering part of your traffic by optimizing those pages for their keywords and potentially driving more visitors than before.

2. Update your content structure

To make your article reach people and gain their love, you need to first make it lovable to Google. But how can you do that? Google doesn’t know what you are writing, but it knows what makes an article great. It’s all about making it to the first page of Google SERP. 

Google compares your content to the existing top-ranking content to determine how your page measures up. So the trick is to reverse engineer what the top articles are doing to get themselves to the top!

So before creating your article, the first step would be to create an outline that surpasses your competition. 

# Prepare your heading structure:

To do this, you need to determine the heading structure and ideal word count for your article. You can search your topic on Google and analyze the top-ranking content. Take note of the heading structure and the length of the article. 

You can also look at the second and third articles to gather more headings to include in your outline. This will result in a better outline because it encompasses all the articles that are currently ranking. You can also use tools like Frase to collect a set of headlines without manually analyzing each top-ranking page.
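Here is a minimal sketch of that manual analysis: it pulls the H2/H3 outline and a rough word count from a few top-ranking pages so you can merge them into your own outline. The competitor URLs are placeholders, and requests plus beautifulsoup4 are assumed.

```python
# Minimal sketch: extract the H2/H3 outline and approximate word count from
# a few top-ranking pages for your target query (placeholder URLs).
import requests
from bs4 import BeautifulSoup

COMPETITORS = [
    "https://example.com/competitor-article-1/",
    "https://example.org/competitor-article-2/",
]

for url in COMPETITORS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    words = len(soup.get_text(" ", strip=True).split())
    print(f"\n{url}  (~{words} words)")
    for h in soup.find_all(["h2", "h3"]):
        indent = "  " if h.name == "h3" else ""
        print(f"{indent}- {h.get_text(strip=True)}")
```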

# Optimize for readers and add keywords that Google would love to see:

When it comes to writing, avoid fluffing around and get straight to the point. For example, if someone is searching for how to make a website in WordPress, they likely already know what a website is and don’t need an introduction. 

When updating existing articles, you can use the keywords you have previously discovered on the GSC queries dashboard, as well as other semantic words that Google would like to find on your page. For this task, you can use tools like Surfer SEO, Frase, and SEMrush to enrich your content with relevant words that will improve your page’s relevance.

One way to do this is to take the keywords you've discovered for a page in Google Search Console, feed them into a content optimizer tool like Frase, and let it help you optimize the content against the best-performing content on Google.

# Optimize for FAQs and the SERP:

Next, write the answers to queries in a Google-friendly format. Keep it simple and direct. For example, if someone searches "Does Ahrefs have a content optimizer tool?" you can answer with a simple sentence like "No, Ahrefs does not have a content optimizer tool yet."

This type of answer makes it extremely easy for Google’s language algorithm to know that you’ve answered the question, and it can skyrocket your chances of getting into the featured snippet on top of Google search.

# Add Schema to your content

Using schema markup can be incredibly useful for ranking on Google because it helps search engines better understand the content on your website. You can use RankRanger's schema markup generator to easily create and validate FAQ, Article, and How-to schemas and add them to your HTML in a machine-readable way, allowing search engines to understand your content easily.

With schema, your content has a higher chance of appearing in featured snippets and People Also Ask sections on the SERP. For FAQs, you can search your topic on Google, pick some of the questions most relevant to your topic from the People Also Ask section, and then answer those questions clearly and simply on your page.

Use an accordion plugin on your WordPress site to add those FAQs and enable FAQ schema, or use RankRanger's FAQ schema generator.
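If you prefer to generate the markup yourself rather than rely on a plugin, here is a minimal sketch that builds FAQPage JSON-LD from question/answer pairs and prints the script tag to paste into your page; the Q&A text reuses the example above.

```python
# Minimal sketch: build FAQPage JSON-LD from question/answer pairs and print
# the <script> tag you would paste into the page (placeholder Q&A text).
import json

faqs = [
    ("Does Ahrefs have a content optimizer tool?",
     "No, Ahrefs does not have a content optimizer tool yet."),
]

schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": q,
            "acceptedAnswer": {"@type": "Answer", "text": a},
        }
        for q, a in faqs
    ],
}

print(f'<script type="application/ld+json">{json.dumps(schema, indent=2)}</script>')
```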

3. Focus on Topical Authority

Topical authority refers to the expertise and trustworthiness a website earns by producing high-quality and informative content on a particular topic. The more quality content a website has on a specific subject, the more it becomes known as an authoritative resource. This enhances the site’s reputation among readers and improves its chances of being favored by Google for ranking purposes.

If you feel that your website lacks coverage on a particular topic, assessing your existing content may be worthwhile and considering scaling up your coverage. By producing more quality content on a specific subject, you can demonstrate your expertise and become a more reliable source of information for readers while also improving your search engine rankings.

# Create a topical map

To start creating your content strategy, it's important to first develop a topical map that includes all the relevant topics you need to cover. You can use a tool like ChatGPT to generate this map by building clusters of topics around the main subject and identifying relevant child topics.

However, using a reliable keyword research tool to discover keywords around these topics is also recommended. While Chat GPT can assist in creating topic clusters and relevant child topics, it may not provide enough data on the actual keywords that users are searching for on Google.

You can also read this case study on how you can use ChatGPT to research keywords on your site. 

In addition, if you have pages on your website that cover random topics and receive low traffic, it would be beneficial to repurpose those pages to cover topics that align with the main themes of your site.

# Create a bond between relevant articles by interlinking

Now that you have a well-planned relevant article on your website, you can seal the deal by interlinking them and creating a net to show search engines how highly relevant your articles are!

A well-planned internal linking strategy can significantly impact your website’s search engine ranking because it distributes the authority of your website’s high-authority pages to other pages on your website. This means that a well-structured internal linking profile can have a similar impact to high-quality external backlinks.

If you can't interlink naturally between pages, you can use a plugin like Link Whisper, which automates the linking process and finds natural key phrases to link between your website's pages.

4. Run a full site audit

A website that has great content but is not optimized for technical SEO is like a book that is well-written but has no index or table of contents. Search engines will have a hard time understanding the content’s relevance and ranking it appropriately.

To improve the technical SEO of your website, it’s essential to conduct a thorough audit. Various tools, such as SEMrush or Ahrefs, can scan your site and identify any technical issues hindering your SEO efforts. 

These audits cover a range of aspects, including website speed, mobile-friendliness, URL structure, site architecture, XML sitemap, robots.txt file, canonical tags, schema markup, and more.

By conducting a single audit, you can uncover all the ongoing issues and take the necessary steps to fix them one by one. While missing meta tags or breadcrumbs may not be major issues, ensuring that your site is error-free can provide a significant advantage over your competitors. 

By addressing technical SEO issues and keeping your site up-to-date with the latest algorithm updates, you can ensure that your website is in top shape and provides search engine crawlers with the best possible user experience.

5. Check and improve site speed

Page speed is a crucial factor that affects SEO. With the introduction of Google’s Algorithm Speed Update, it’s become evident that slow-loading pages can negatively impact your rankings. In addition to being a direct ranking factor, slow page speed can indirectly affect your rankings by increasing bounce rates and reducing dwell time. 

If your pages take too long to load, Google may prioritize other sites offering similar value but with a better page loading experience. To avoid falling behind the competition, ensuring that your pages load as quickly as possible is crucial. 

You can use tools like PageSpeed Insights, GTmetrix, or Pingdom to test your pages and identify areas for improvement. 

Additionally, Google Search Console provides Web Vitals data that reflects real-world user experience on your pages. By keeping your Web Vitals stats in good shape, you can rest assured that your site won't lose SERP positions due to poor page performance.

If your page speed is poor and your Google Web Vitals assessment has failed, you have a few options. You can try to improve it on your own or enlist the help of speed-improvement experts to ensure your site passes the Web Vitals assessment. One such expert is Team Speedy Site, and if you use the coupon code WISPEED50, you can receive a $50 discount on your first order with them.

6. Finally, prepare a link-building plan

Improving your website’s ranking and traffic requires a solid link-building plan that aligns with Google’s algorithm. Link building is still very relevant in 2023 and remains a deciding factor in high-competition SERPs. 

While working on other foundational pillars like topical authority, a strong backlink profile can protect your site from Google algorithm updates. Your expertise, authority, and trustworthiness (EAT) largely depend on links and mentions on authoritative sites.

However, link building is not an easy task. You need to build links that actually work in 2023. One of the best link-building strategies is guest posting, which you can do in-house or through a reliable service like Authority Builders. Another strategy I like is getting links inserted into existing articles, either by doing the outreach yourself with a quality page or, again, with help from Authority Builders.

Additionally, you can get mention links by helping a reporter out (HARO) or by creating newsworthy content and outreaching journalists for links. Building a high-quality backlink profile is essential to your website’s success in the long run.

Semrush’s link-building tool is another great way to find prospects that you can reach out to and might help quicken the process overall.

Read More

Chat GPT for Content Marketing (Use Cases)

ChatGPT is an inspiring artificial intelligence tool that can make your life much easier. According to Nerdynav, ChatGPT is being used by 100 million users every week. It can be a valuable tool for content marketing, and it can definitely help upgrade your productivity.

Despite some major limitations relative to its hype in the media, ChatGPT is quite impressive, even as a product that is very much a work in progress. Its text generation capabilities are particularly useful.

With ChatGPT by your side, you can craft email copy effortlessly, dissect customer data, and prepare a marketing plan to drive conversions and boost revenue.

Let’s take a look at how you can use ChatGPT to improve your marketing strategies and upgrade your content production. 

1. Brainstorming content ideas:

Finding new ideas is a challenging aspect of content marketing. Generating fresh, engaging ideas that fit your website's niche and will attract and retain an audience is genuinely difficult.

So you can ask your AI assistant to come up with ideas that are relevant to your site. OpenAI's models excel at generating numerous ideas once you give them a single concept as input. The technology is based on predicting the next word, which makes it highly skilled at expanding ideas once you've given it a starting point.

For example, if you have a site that discusses mobile apps, you can ask the AI which industries can benefit from using mobile apps:

"What are the top 10 industries that can benefit from using mobile apps?"

Now ChatGPT has suggested industries that we haven't touched on in our blog topics. This can be really helpful for expanding your website's content reach, and it fits well with programmatic SEO, which we will come to later.

Brainstorming-content-ideas

Then you can split the main ideas into sub-ideas to follow a top-down design before finding out low competitive keywords for each sub-category you can cover on your blog.

When I ask ChatGPT, “Can you provide ideas of categories related to app in Food and hospitality industry,” it gives me the type of apps we can talk about in the food and hospitality industry.

chat-GPT-test

2. Upgrade existing content:

People are hyped about using ChatGPT to generate tons of content, but we overlook that one of the most impactful ways to utilize this AI tool is to enhance the existing content and tailor it to a specific audience. 

We have previously discussed how to find content for upgrades, how it can help you stay in the competition, and how many SEO agencies leverage this strategy to boost site traffic and overall conversion without crafting content from scratch. 

ChatGPT has incredible potential to elevate what you already have because you are providing enough input for it to play with and use its intelligence to upgrade the overall output quality.

You can ask ChatGPT to upgrade certain phrases on your posts: “explain the following for better understanding: [paste the text].”

Here’s what I get.

upgrade content using Chat GPT

In this instance, the AI statement may appear disconnected from the prior discussion; however, utilizing ChatGPT as your helper, you can experiment with various approaches to arrive at a clearer and more comprehensive explanation.

3. Accelerating content creation while reducing expenses

Copywriting is the most obvious way to use ChatGPT in digital marketing. Using the standard ChatGPT website, you can produce tons of content, which should significantly reduce your content production cost.

If you go one step further and leverage the OpenAI API, you can generate a large number of articles (200+) in just a couple of hours. At an average of 1,000 words per article, that is more than 200,000 words.

There are several options for accomplishing this. On WordPress, you can install a plugin that generates content and sends it to a post draft automatically. For a more flexible approach, you can use the API to generate multiple pieces of content simultaneously from a spreadsheet.

Here's a basic illustration of using the OpenAI API and a Google spreadsheet to broaden your content ideas and establish the fundamental components of a comprehensive content collection.

spreadsheet-Chat-GPT-API
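As a rough illustration of that spreadsheet workflow, here is a minimal sketch that reads topics from a CSV exported from your sheet and drafts an outline for each via the OpenAI API. It assumes the official openai Python package (v1 client style) with an OPENAI_API_KEY set in the environment; the model name, file names, and prompt are illustrative.

```python
# Minimal sketch: read topics (one per row) from a CSV exported from your
# spreadsheet and draft an outline for each via the OpenAI API.
# Assumes the official openai package and OPENAI_API_KEY in the environment.
import csv
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("topics.csv", newline="") as f, open("drafts.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for (topic,) in csv.reader(f):
        resp = client.chat.completions.create(
            model="gpt-3.5-turbo",   # illustrative model name
            messages=[{"role": "user",
                       "content": f"Write a detailed outline for a blog post about {topic}."}],
        )
        writer.writerow([topic, resp.choices[0].message.content])
```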

4. Writing cold emails enquiring about backlinks

Backlinks still play a crucial role in search engine optimization (SEO) and can significantly boost a website’s performance in search engine result pages (SERP). Backlinks from reputable and authoritative websites serve as a vote of confidence in your website. 

This can add credibility and trustworthiness to your pages in the eyes of both search engines and users.

Although you can't generate backlinks 🙂 the way you generate content with ChatGPT, this AI tool can significantly reduce the work of creating cold emails. You can use the Semrush Link Building Tool to collect prospects and their email addresses, then use the OpenAI API to generate multiple outreach templates for different purposes.

SEMrush-link-building-tool

5. Generate content FAQs

FAQs are an important aspect of SEO. FAQs often include relevant keywords that can help improve a website’s ranking for those keywords. A well-designed FAQ section is a way to provide quick answers to common questions related to your topics. 

In addition, by providing concise and clear answers to common questions, a website may be more likely to appear as a featured snippet in search results. They help expand your content cover, and that may increase the overall authority and credibility of the web pages.

Creating FAQs with chat GPT is fun. You don’t have to dig for questions as the tool gets you the question with a near-perfect answer. You can just ask ChatGPT to create FAQs about certain topics or keywords, and it will come up with a bunch of those. All you have to do is select the ones that make sense to the page contents.

FAQs-by-Open-Ai

6. Easy to Execute Programmatic SEO

Programmatic SEO involves the creation of large numbers of pages for a website using a standardized template and a data set. The pages appear to have been created individually, but they are generated using a systematic process.

The primary advantage is ranking for a large number of smaller keywords with less competition and search volume, rather than relying on a few big keywords.

This strategy reduces the risk of significantly dropping website traffic if Google changes the ranking for your major keywords. The strategy is similar to a long-tail approach, and it’s like spreading out eggs across many baskets, making it less vulnerable to changes in ranking.

programmatic-SEO
Image credit: milesburke.co

Now, one of the main issues in Programmatic SEO is that most parts of the content remain the same because it’s a template and variable-based approach. That’s a major content duplication issue. 

So by harnessing ChatGPT’s content creation ability, you can generate unique content across the variable matrix just by following what we did to generate the email outreach template.
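Here is a minimal sketch of the template-plus-dataset mechanics; the template, city names, and stats are made-up placeholders, and the unique intro is the slot where you would drop in ChatGPT-generated copy so every page differs.

```python
# Minimal sketch of programmatic SEO: each row of a data set fills a shared
# page template, and the intro slot is where unique, AI-assisted copy goes.
# The template, cities, and stats are made-up examples.
TEMPLATE = """<h1>Best coffee shops in {city}</h1>
<p>{unique_intro}</p>
<p>{city} has roughly {shop_count} coffee shops, and the average latte costs {avg_price}.</p>"""

rows = [
    {"city": "Springfield", "shop_count": 42, "avg_price": "$4.10"},
    {"city": "Riverdale",   "shop_count": 17, "avg_price": "$3.80"},
]

for row in rows:
    # In practice, generate this per row with ChatGPT so every page is unique.
    row["unique_intro"] = f"A local's look at the coffee scene in {row['city']}."
    page_html = TEMPLATE.format(**row)
    print(page_html, "\n---")
```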

Once your content machine setup is ready, you have a couple of things to take care of:

  1. Make sure your website’s content is helpful and original.
  2. Check that your website is listed in Google Search Console.
  3. Ensure that each page on your site has unique content.
  4. Build a valid XML sitemap to help search engines index your site’s pages.
  5. Interlink various pages on your site to create a strong internal linking structure.

The most important part?

It is important for you to use ChatGPT ethically and responsibly, ensuring the information provided is accurate and up-to-date. 

Overall, incorporating ChatGPT into a content marketing strategy has the potential to increase your efficiency and enhance the quality of content.

Similar posts:

  1. How do you find out contents that need updating
  2. 5 Best AI Content Generators Reviewed – Can Google Detect AI-Generated Content?
  3. WriteSonic Examples – Can Google Detect WriteSonic Content

Read More

How do you find out contents that need updating

Regular ideation and creation of new blog content form a strong content marketing approach. But what about poor-performing and outdated articles on your site? 

Revamping old content can boost your strategy and is a step worth taking. Not every article will drive organic traffic. For every successful post, there may be many others hidden deep in your blog in need of improvement. 

These articles deserve your attention. Revising old articles provides an opportunity to optimize and make them more appealing, which could lead to higher search engine rankings and more traffic. This also diversifies your strategy, rather than relying solely on creating new content.

Having outdated or poorly written content on your website can have a negative effect on your brand’s reputation. When you neglect the quality of your content, it sends a message that you do not value your audience. 

This lack of consideration can then lead to decreased traffic, fewer clicks, and reduced social media engagement. 

A well-crafted, up-to-date, and relevant content strategy is crucial for maintaining a positive image for your brand and connecting with your target audience.

Does updating content help SEO?

The freshness of content has been an important factor in determining a website's ranking in Google's search results since 2011.

Does updating content help SEO

We know Google uses various factors, including the date a document was created, to determine the ranking of search results, which suggests that a document's creation or modification date is one of the signals the Google algorithm considers when ranking.

The latest and updated information is given more weight by search engines, causing older content to decline in rank gradually. Revising and refreshing outdated pieces conveys to search engines that your content is current, accurate, and relevant. 

Moreover, search engines regularly alter their algorithms, and updating your content ensures you adapt to any on-page SEO modifications that may otherwise negatively affect your performance.

Less time investment, better results: revising an outdated article on your website is much quicker than creating a new one, and you can probably turn it into top-quality content with a better chance of driving organic traffic.

I don’t mean to say that you should stop creating new content altogether, but allocating time to both activities has the potential to achieve results with less effort.

Again, if audience engagement is a metric Google uses to identify content that has a better impact or covers a topic more comprehensively, then updating your old content could be the most effective way to improve engagement, aside from improving site performance, design, and so on.

Therefore, you can definitely say updating older content positively impacts SEO.

What is your ideal strategy for updating content on a website?

Revising old content can still be labor-intensive, especially for established websites. So you need a plan to identify content that is worth updating and make the best use of your time. Then you have to figure out exactly what improvements a piece of content needs to perform better in the SERP.

So here’s what we do when we are executing a plan to upgrade the content on a certain website.

1. Conduct a quick win content audit

The goal is to identify content that is close to ranking high on search engines, then make slight improvements to increase its ranking and make it more visible to users.

This is content you have published previously that is doing well in the SERP, ranking for multiple keywords, but hasn't quite reached the top 1-3 positions you would have loved.

So a quick-win audit helps you find this moderately ranking content. You can run such audits in Ahrefs by checking the organic keyword positions for your website, but there's another way I would like to include here:

For this method, you will need your site connected to Google Search Console. If you have not done so yet, check this guide. After a new connection, you might have to wait a couple of days for GSC to pull data from your site and populate the search performance dashboard.

If Search Console is already collecting data for your site, open Google Search Console and go to Search results.

Then make sure the search type is set to Web and the Queries tab is selected just below the performance graph.

search console filters

Now, if you enable the average position metric, it shows the most recent average SERP position for your queries across all countries. Don't focus on the overall average position number; look at the position data for each query to get an idea of which pages are ranking for those queries.

Average position filter

Here, we want to add a filter on position where the value is greater than 3. This gives you all the keywords ranking lower than position 3, leaving you with a list of high-potential keywords where your pages sit between positions 3 and 15 (or you can extend the range to 20 to count content as closely ranking).

find contents that are between 3rd  to 20th position

Now you have a set of keywords with average ranking data for a particular date range, but you don’t know if your pages are likely to rank even higher in the coming days without needing an update.

So it's better to compare two date ranges, for example the current 28 days against the previous 28 days, which should show you the direction those pages are trending for particular keywords.

Select date range

A negative (-) position difference means your page has improved its position in the SERP. When the position in the previous date range shows 0, it usually means the page had no recorded data for that query in that period.

Thus you can filter and analyze pages that rank between positions 3 and 20 for their top queries and are stagnating or losing rank, and add those pages to your update queue.

Check for position difference.

Again, you will have to consider how much traffic these keywords bring in, so enable the click and impression stats to see whether a particular page is worth updating for a given query. You can always play with this data to collect your most update-worthy pages for the next batch.
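Here is a minimal sketch that automates this quick-win filter via the Search Analytics API, using the same google-api-python-client setup and service-account key as the earlier Search Console sketch; the date ranges are examples, and the 3-20 window is the same assumption as above.

```python
# Minimal sketch: compare average query positions for the current vs. the
# previous 28 days and keep queries sitting between positions 3 and 20.
# Credentials, property URL, and dates are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://example.com/"              # hypothetical property
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def avg_positions(start, end):
    """Return {query: average position} for the given date range."""
    rows = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={"startDate": start, "endDate": end,
              "dimensions": ["query"], "rowLimit": 5000},
    ).execute().get("rows", [])
    return {r["keys"][0]: r["position"] for r in rows}

current = avg_positions("2023-09-04", "2023-10-01")    # last 28 days
previous = avg_positions("2023-08-07", "2023-09-03")   # 28 days before that

for query, pos in sorted(current.items(), key=lambda kv: kv[1]):
    if 3 < pos <= 20:                          # the "quick win" window
        prev = previous.get(query)
        trend = f"{pos - prev:+.1f}" if prev else "new"  # negative = improved
        print(f"{pos:5.1f}  ({trend})  {query}")
```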

2. Find out content that used to do well

In this method, the goal is to concentrate efforts on finding out and enhancing content that previously performed well in search engine rankings. 

Content performance may stop growing and start declining over time, as search engines decide it no longer provides the most relevant information or as competing content on the same topic surpasses it. This is termed content decay: a decline in ranking and a decrease in traffic for a specific keyword.

You can find and analyze this content using a two-step method. First, identify it via the Google Analytics > Behavior > Site Content > All Pages report.

Analytics comparison

Find the content whose pageview performance has dropped compared to a previous period. (In the example above, I'm using just the previous 30 days, but you should cover at least a year or two, depending on the content's age.)
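If you export the All Pages report for the two periods as CSVs, here is a minimal sketch that flags the decaying pages with pandas; the file names and the "Page"/"Pageviews" column names are assumptions, so adjust them to match your own export.

```python
# Minimal sketch: flag pages whose pageviews dropped between two Analytics
# exports. Assumes two CSVs with "Page" and "Pageviews" columns (placeholders).
import pandas as pd

current = pd.read_csv("pages_current.csv")
previous = pd.read_csv("pages_previous.csv")

merged = current.merge(previous, on="Page", suffixes=("_now", "_before"))
merged["change_pct"] = (
    (merged["Pageviews_now"] - merged["Pageviews_before"])
    / merged["Pageviews_before"] * 100
)

# Anything down more than 20% is a candidate for the update queue.
decaying = merged[merged["change_pct"] < -20].sort_values("change_pct")
print(decaying[["Page", "Pageviews_before", "Pageviews_now", "change_pct"]])
```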

Now that you have a set of low-performing or decaying content, you can take each page URL and inspect it in Ahrefs to see its organic keyword performance and decide whether to update that page.

Ahrefs position check

What to focus on when you are updating content?

The process of updating content begins by identifying which content needs improvement. 

After this, a plan of action needs to be formulated to execute these updates. I would follow these steps to create a better piece of content:

  1. Make sure to check that the content remains relevant to the user’s search query and meets their expectations. It’s important that the content’s purpose and focus match the user’s intent behind the original search. This ensures that the updated content continues to provide value and meet the target audience’s needs.
  2. As part of updating the content, you should include more in-depth and comprehensive information and details about the topic. The aim is to provide a complete and thorough understanding of the subject matter to the reader, making it a valuable resource for them. This will increase the relevance and usefulness of the content, potentially boosting its search engine rankings.
  3. You should be focusing on improving the content in a way that makes it unique and noticeable. The goal is to make the content stand out among similar content on the same topic, making it more appealing and attractive to the reader. This can be achieved through various techniques, such as using visuals, adding interactive elements, or providing a unique perspective on the topic.
  4. Next, you should focus on updating the content to reflect current information and removing any information that is no longer accurate. This step helps maintain the accuracy and relevance of the content, which could positively impact its search engine rankings.

Once your new content is ready, you can publish it and share it on social platforms, letting your audience know you have newly updated content on a highly searched topic that might be worth looking into.

This is an overview of the strategy I personally follow to update the content from time to time, and it has been a promising tool for the overall growth of my primary website portfolio.

Read More

5 Best AI Content Generators Reviewed – Can Google Detect AI-Generated Content?

This study will look at the 5 best AI content generators for long-form SEO blog posts. Most importantly, it will use a brand-new AI, the first able to detect GPT-3 generated content, to try to answer: can Google detect AI content?

Since OpenAI released its GPT-3 API a large number of amazing AI content generation tools have been built. These tools can do a lot of things from writing emails to creating a Facebook ad. However, for this study we are focusing our efforts on long-form SEO blog posts that have the objective of ranking in Google.

We will be using Originality.AI, the world's first AI trained to detect whether a piece of content was created by AI tools built on OpenAI's GPT-3 API (which tools like Jasper.ai use) or other popular NLP APIs.

What Originality.AI Was Built to Do… AI Exposing AI

The unchecked proliferation of AI content is not without its concerns. How do we responsibly/ethically use this awesome power of AI content creation? How do we ensure that Google does not penalize us for leveraging AI content?… these are all questions I think I am in a unique position to try and help answer.

Who Am I to Talk about AI Content?

Although I am an engineer, I am not an AI developer. However, I…

  • Founded (and now sold) a done-for-you AI Content Agency: Rocket Content… where we were the heaviest user of Jasper.AI, and our team partnered with Jasper to develop a course and webinar for Jasper.AI’s agency users.
  • Manage a portfolio of content sites some of which we are ethically leveraging AI content 
  • Founded Originality.AI where we built our own AI that is able to predict if content was produced by AI (the world’s first GPT-3 AI detection tool!)

However, clearly I don’t know it all since I have my name on an AI patent that was not approved!

Before we get to the results of the study that determines if Google can detect content generated using GPT-3 (or any of the popular NLP models) we need a little background…

What is AI Generated Content?

AI-generated content is content produced by a machine: a blog post, article, headline, blog intro, ad copy, social media content, product description, content idea, chatbot response, email, or even images. Its use has exploded since May 2020, when OpenAI provided access to the API of its pre-trained NLP model, GPT-3.

The Secret about ALL AI Content Generators

There are multiple pre-trained natural language processing models, but it was really when GPT-3 came on the scene that the quality of the content produced became usable for marketers.

So why was GPT-3 such a big deal in making AI content generators useful? It was not the first or the only NLP model with an API that developers can build tools on.

Pre-trained NLP model capabilities, up to this point, are best measured by their number of parameters. The list below shows the most popular NLP models that AI tools could choose to use, along with their parameter counts:

  • OpenAI GPT-2: 1.5 Billion
  • EleutherAI GPT-Neo: 2.7 Billion
  • EleutherAI GPT-J: 6 Billion
  • OpenAI GPT-3: 175 Billion (DaVinci)

The “secret” is that ALL tools rely on the same APIs to create their content. These pre-trained NLP models are so incredibly costly to train that it would be impossible for even the best-funded AI tool to compete with its own NLP model. At roughly a 30:1 difference in parameter count between GPT-3 and its nearest NLP competitor, the reality is that all effective AI content generation tools are using GPT-3 as their foundation.
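To make that concrete, here is a minimal sketch of the pattern most of these tools follow under the hood: wrap the user’s brief in a prompt and send it to the same GPT-3 completion endpoint. It assumes the pre-2023 openai Python client and the DaVinci model; the prompt templates, chaining, and post-processing each vendor layers on top are where the real differences come from.

```python
# Minimal sketch (not any vendor's actual code): most AI writers wrap a
# prompt around the user's brief and call the same GPT-3 completion API.
# Assumes the pre-2023 openai Python client and the DaVinci model.
import openai

openai.api_key = "YOUR_API_KEY"  # each tool holds its own key server-side

def generate_blog_intro(topic: str) -> str:
    prompt = f"Write an engaging introduction for a blog post about {topic}."
    response = openai.Completion.create(
        model="text-davinci-003",  # the 175B "DaVinci" model referenced above
        prompt=prompt,
        max_tokens=300,
        temperature=0.7,           # vendors tune sampling settings differently
    )
    return response.choices[0].text.strip()

print(generate_blog_intro("updating old SEO content"))
```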

Secret of AI

Even though they are all using the same base, there are still differences that can result in a big change in the quality of the content produced.

Fun fact: EleutherAI (GPT-J, GPT-Neo), despite the scary-sounding name, is truly open source, while OpenAI (GPT-2, GPT-3) is very much for-profit. The developer community around AI, such as Hugging Face, is incredible!

Okay, enough nerding out…

Can Google Detect AI Content?

Yes. The study below shows that it is possible for Google to build (or that it may already have built) its own AI that can successfully predict whether content was created with an AI tool using GPT-3 (or another NLP model). We know this because at Originality.AI we built and trained our own AI that can predict whether a piece of written content was created by an AI content generator using GPT-3 with 94% accuracy, and with even higher accuracy for all other available NLP models.

Originality.AI has shown that the rumour that GPT-3-created content is undetectable is wrong. Originality.AI was able to build a model that can detect content produced by all of the popular NLP frameworks with a high degree of confidence:

  • GPT-3 Detection Accuracy is 94.06%
  • GPT-J’s Detection Accuracy is 94.14%
  • GPT-Neo’s Detection Accuracy is 95.64%.
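For context, a detection accuracy figure like these presumably means the share of labeled test samples the classifier gets right once its probability score is turned into a verdict. Here is a minimal sketch of that calculation, with an illustrative 50% threshold and toy data (not Originality.AI’s internal method):

```python
# Sketch of how a "detection accuracy" percentage is typically computed:
# turn each probability score into a verdict and count matches against the
# known labels. The 0.5 threshold and the toy data are assumptions.
def detection_accuracy(samples, threshold=0.5):
    """samples: list of (ai_probability, is_actually_ai) pairs."""
    correct = sum((prob >= threshold) == is_ai for prob, is_ai in samples)
    return correct / len(samples)

# Toy labeled test set: (detector score, ground truth)
labeled = [(0.96, True), (0.88, True), (0.03, False), (0.10, False)]
print(f"accuracy: {detection_accuracy(labeled):.2%}")  # -> 100.00%
```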

As we head into 2023 there are no other tools that can do this, and we are gearing up to be ready when GPT-4 goes live.

So what? Why do we as content marketers care if Google (or others) can detect content that was created by a bot?

Google has made it clear that it does not want AI-generated spam content; in its Helpful Content Update it basically said as much. Where it gets a little murkier is content that was initially created by AI but then had a human verify its accuracy and add additional value. Will that get impacted by Google? No one knows!

The risk right now, for any web publisher who doesn’t write every word themselves, is that the writers they have hired are using AI writing aids to create content faster, and the content they are publishing could be identified as AI-generated.

AI-and-Googles-guidelines

As covered in this article, this article, and this article, it is widely understood that Google is worried about AI content damaging the usefulness of the text-based web, which would result in Google making less money.

do you know if your writers are using AI content generators

Now that the stage is properly set, let’s get to the study’s results!

STUDY: Examples From 5 of the Most Popular AI Content Generators and Whether Google Can Detect Their Content

We ran samples from 5 of the most popular long-form content AI generators through Originality.AI to test the confidence with which the content could be identified as not human-generated. In addition, we ran a control set of human-generated content to verify the probability score was identifying human as human and bot as bot. 

Here are the results…

Source of Content         AI Probability Score
Jasper.AI                 79.14%
Rytr                      95.00%
WriteSonic                99.43%
Frase                     100.00%
Article Forge             100.00%
Control (VentureBeat)     2.4%

Jasper produced the content that had the lowest probability of being produced by the GPT-3 API. This likely reflects a lot of the extra work they have done on top of the OpenAI API.

To help show that the tool provides accurate results, we ran the 5 most recent VentureBeat articles covering AI through the tool; you can see the results below.

Although the AI at Originality.AI cannot say with absolute certainty whether content was created by GPT-3, you can see from this study that it does an impressive job of distinguishing human-created content from bot-created content.
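For anyone who wants to reproduce a study like this, the scoring loop is straightforward: send each sample article to a detector and average the AI probability per source. The endpoint and response field in the sketch below are hypothetical placeholders standing in for whatever detection API you use; they are not the documented Originality.AI API.

```python
# Sketch of the study's scoring step: score each sample article and average
# the AI probability per source. DETECTOR_URL and the "ai_probability" field
# are hypothetical placeholders, not a real documented API.
from statistics import mean
import requests

DETECTOR_URL = "https://example.com/api/scan"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

def ai_probability(text: str) -> float:
    resp = requests.post(
        DETECTOR_URL,
        headers={"Authorization": API_KEY},
        json={"content": text},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["ai_probability"]  # assumed response field

samples = {
    "Jasper.AI": ["...sample article 1...", "...sample article 2..."],
    "Control (VentureBeat)": ["...editorial article 1..."],
}

for source, articles in samples.items():
    avg = mean(ai_probability(a) for a in articles)
    print(f"{source}: {avg:.2%} average AI probability")
```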

Originality.AI Exposing GPT-3 Content…

Expose-AI

Here are the AI probability scores for the editorial articles from VentureBeat that we can safely assume are human-generated…

AI-probability-scores

Below is a detailed breakdown of each tool, along with a video showing an example of each tool’s output being tested by Originality.AI.

1. Jasper.AI

Jasper

Jasper.AI is an incredible tool; they have established themselves as the leading AI content creation tool with a $125M raise.

This tool’s popularity and recognition are for good reason; it was our go-to tool when we built RocketContent.ai.

The downside is that, with their popularity, they have the challenge of needing to be a tool for all marketers looking to leverage AI. They are not focused on long-form SEO content in particular the way some of the other tools are.

They are definitely the best tool if you are signing up for an AI content creation tool for the first time or need an AI tool that can do it all!

Can Google Detect Jasper.AI Content:

Yes, using Originality.AI we are able to identify when content has been created by Jasper.AI. 

In the video below I show live as I use Jasper to create content and then test it with Originality.AI to see if it can be detected as generated by a GPT-3 tool.

Pros:

  • Rich Features
  • Quality output
  • Community & Support

Cons:

  • Cost
  • Trying to serve many masters

Pricing:

Jasper-Pricing

Jasper.ai Is Best For:

Anyone buying their first AI content creation tool. With its diverse set of tools it can support any marketer, whether they are writing emails, sales pages, or blog posts. It has established itself as the leading tool for good reason.

If you only care about producing long-form SEO blog posts with a team I currently prefer WriteSonic.

See Jasper.AI Examples – Here

2. Rytr

Rytr

Rytr is an interesting option for writers… it has the most affordable “unlimited” monthly plan.

When we used it we struggled to get it to efficiently produce full blog posts that didn’t wander wildly off-topic. Other AI tools seem to have developed the ability to keep content focused as the text gets longer, but Rytr definitely struggles with this.

The product is not as polished, easy to use, or feature-rich as Jasper.AI, so it is not the best choice for a first tool.

The content produced does not seem to be at the same level as other tools’. Within GPT-3 you can select which model you want to use, and my guess is that Rytr has not selected the premium one.

However, it is fast and the most economical unlimited tool. 

Can Google Detect Rytr-Generated Content?

Yes, using Originality.AI we are able to identify when content has been created by Rytr. Originality.AI showed with an average of 95% confidence that the content created by Rytr was produced by a bot.

In the video below I show live as I use Rytr to create articles and test them for originality.

Pros:

  • Affordable unlimited word count monthly plan
  • Plagiarism checker integrated
  • Easy to use

Cons:

  • Not great for long-form blog posts
  • Not as feature-rich as competitors

Pricing:

Rytr-pricing-1

Rytr is Best for:

I think of Rytr as the cheaper version of Jasper.ai. It is best for freelance writers who are producing a lot of lower-cost content and looking for a budget-friendly AI solution to help them. 

Rytr is your best low-cost option. Even though all these tools rely on the same NLP, the results, as you can see in the examples, do vary, and the Rytr examples are not as good as most of the other tools’.

See Rytr Examples – Here

3. WriteSonic

WriterSonic

Write Sonic has become our go-to tool for creating long-form SEO blog posts. The content stays on point and flows well. 

Where many tools such as Jasper.ai and Rytr try to be a Swiss Army knife for digital marketers, WriteSonic has focused more heavily on the use case of writers producing blog posts.

It has some nice features that make it a good fit for teams of writers, including support for multiple users and the ability to select different content quality options.

It is definitely not the lowest-cost option, with the cost per word running as high as $0.01.

Can Google Detect Write Sonic Generated Content?

Yes, using Originality.AI we are able to detect WriteSonic content as being bot-generated. For the example articles that were run through the tool it predicted with a confidence of 99.4% that the content was bot-generated.

In the video below I show live as I demo WriteSonic, show examples of content and run them through the tool at Originality.AI:

Video

Pros:

  • Great long-form content 
  • Able to add multiple users
  • Able to get started quickly
  • Does have a free trial

Cons:

  • Expensive
  • No Unlimited plan
  • Fewer features than others (which is also a positive)

Pricing:

This is not the lowest cost option but it can produce excellent long-form content if that is what you are looking for.

WriterSonic-Pricing

Write Sonic is Best For…

Web publishers with a team of writers looking to leverage AI to increase their team’s efficiency.

WriteSonic and Jasper.AI are our current tools. 

Write Sonic Examples – Here

4. Frase AI Writer

Frase

Frase is a phenomenal tool that started as a great way to make content briefs easily. 

It has evolved a lot over the last 2 years and has now added AI content writing to its features. 

Because it was not originally built to be just an AI content creation tool, its complexity is higher than most.

Can Google Detect Frase.io AI Writer-Generated Content?

Yes, using Originality.AI we are able to identify when content was created by Frase. Originality.AI’s AI predicted with a full 100% confidence that the content was bot-generated.

Pros:

  • It is the best tool to quickly create a brief by analyzing the top 20 results on Google
  • Feature-rich tool for SEO writers that goes beyond simply having a bot create content

Cons:

  • Can be complex… the tool has a lot of pieces to it and if you are looking for a simple AI writer this is not it.
  • Limited to 30 articles per month unless you pay a sizeable $149.99/month for unlimited

Pricing:

Frase-Pricing

Frase AI Writer Tool is Best For:

Frase is best if you are a serious SEO writer who writes a lot of your own content and would like an AI writing assistant paired with more advanced SEO content tools that optimize your content to rank. I would suggest signing up for Frase for its other features first (brief creation, etc.), not for its AI writer.

We use Frase for brief creation to provide writers with the ability to ensure they completely cover a given topic. 

5. Article Forge – Strongly Not Recommended 

Article Forge is the only tool on the list that explicitly says they do NOT use GPT-3. However, that raises the first of several questions: what do they use if it’s not GPT-3?

There are a series of concerning issues when testing Article Forge that have me warning people NOT to use Article Forge!

Article Forge has some very interesting functionality for SEO content managers:

  • Bulk content creation
  • Automatic uploading & scheduling

Just upload a list of keywords, add a login to your site, and poof: perfectly SEO-optimized, factual, and helpful content published on your site… but I fear this claim belongs in the too-good-to-be-true bucket!

Here are the issues I was able to identify with the help of Originality.AI:

Issue 1 – Article Forge Content is Easily Identified as AI

Article Forge states they do not use GPT-3 but don’t explicitly state what they do use. Since our AI is able to consistently identify their content as AI-generated, it has to be one of the NLP models our detector is trained on, so likely GPT-2, GPT-J, or GPT-Neo. The quality of the output also suggests it is not GPT-3.

AI-detection-score

Issue 2 – Articles are Mostly Rewrites of Top-Ranking Articles

Here is how I went about testing this theory on how Article Forge works…

Theory: Article Forge finds top-performing pages for a given keyword and then uses AI to re-write sections of them.

Test Theory:

  • Create an article using Article Forge
  • See that it passes a plagiarism check (i.e., Copyscape or Originality.AI)
  • Enter each LONG paragraph into Google and see whether it repeatedly matches up with the same existing article (see the sketch below)
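Here is a minimal sketch of that last step, using Google’s Custom Search JSON API to run each long paragraph as an exact-phrase query and list the pages it matches. The API key, search-engine ID, and sample paragraphs are placeholders you would supply from your own setup.

```python
# Sketch of the paragraph-matching check: search each long paragraph from the
# generated article as an exact phrase and see which pages come back on top.
# GOOGLE_API_KEY and SEARCH_ENGINE_ID are placeholders from your own
# Google Cloud / Programmable Search setup.
import requests

GOOGLE_API_KEY = "YOUR_API_KEY"
SEARCH_ENGINE_ID = "YOUR_CX_ID"

def top_results_for_paragraph(paragraph: str, n: int = 5) -> list[str]:
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={
            "key": GOOGLE_API_KEY,
            "cx": SEARCH_ENGINE_ID,
            "q": f'"{paragraph}"',  # exact-phrase search
            "num": n,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return [item["link"] for item in resp.json().get("items", [])]

article_paragraphs = [
    "First long paragraph from the Article Forge output...",
    "Second long paragraph from the Article Forge output...",
]

for paragraph in article_paragraphs:
    print(top_results_for_paragraph(paragraph))
```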

Conclusion

Article Forge uses an NLP API to rewrite sections of the top Google results for a given keyword, resulting in an unoriginal article in the eyes of Google.

Article-Forge-Analysis

Potential ISSUE 3 – Article Forge Content Does Not Rank in Google

The example articles at Article Forge do not rank well even if you search a full paragraph of the article. This could be a unique Google manual penalty to a portion of a page or it could mean that Google has been successful in identifying Article Forge content as not helpful.

There is indexable, relevant content on their homepage (no googleoff tag or noindex) that was produced by the tool. It should rank very high if we search an entire paragraph of the example article. However, when I grab the first paragraph and enter the entire thing in Google, the Article Forge article is only the 7th result.

Article-Forge-4.0

Article Forge Summary

I would strongly recommend not using Article Forge, since it appears that Article Forge is using either GPT-2, GPT-Neo, or GPT-J to rewrite sections of top-ranking Google articles for a given keyword. The result is generally a worse copycat of the original article that barely passes Copyscape but is in no way original.

If either a user or an Article Forge team member wants to explain if or how I am wrong, please reach out!

AI Content Generation Tool FAQ:

Are there alternative AI article generators?

Here are some additional tools that will be studied in the near future as we work to discover the best AI content generator and whether any are truly “undetectable” as GPT-3-generated content…
1. Shortly.ai
2. Copy.ai
3. CopySmith.ai

Is there a Free AI Writer?

Yes, most tools have a free trial, but there is no AI writer that is 100% free.

What Does Google Consider as Original Content or Unique Content?

When it comes to ranking webpages, Google puts a lot of emphasis on the quality of the content. In particular, the search engine giant looks for content that is original and informative. This means that simply regurgitating information from other sources is not enough to earn a high ranking. To Google, originality means adding your own insight and analysis to the conversation. This could involve offering a unique perspective on a current event or providing new data that challenges existing beliefs. In either case, the goal is to contribute something new and valuable to the discussion. By heavens, there must be something left in the world that hasn’t been said before! If you can find it and say it well, Google will take notice.
Originality.AI was built to identify original, unique content by answering two questions about a piece of content…
1. Was it written by a human?
2. Did they copy/plagiarize it?


Will AI writers keep getting better?

The typical, obvious answer is yes, especially with GPT-4 suspected to arrive in 2023. However, there is an interesting challenge on the horizon for NLPs. The constraints on training models are quality data, training method, and computing power. With chip shortages, and with GPT-3 already trained on 60 million domains (likely the higher-quality portion of the web), there are potentially two significant constraints on the next model being far better than GPT-3. It seems the next model cannot simply be brute-forced to be bigger; they will need to train the next NLP better. Smart people are working on it and I am excited to see the results!

Should You Use an AI Content Generator?

This article specifically looked at written content; if you are looking to understand how AI can be used to generate art, check out this article. Despite showing in this article that AI content can be detected by another AI, I do believe AI content generators and writing aids are the future, and we are best off learning to work with them while ensuring that anything we publish on the web is original.

Read More