Did your average position in Google Search Console drop? Chances are it is a very good thing, especially if traffic remained the same or increased.
The big exception to this is when you accidentally remove canonical links, mess up a meta robots tag, or cause pages that shouldn't be indexed to get indexed. But for the most part, this is a good sign you're going to grow.
Average position dropping in Search Console does not always reflect a drop in ranking positions or total traffic. This is why it is important to check whether:
- New products have been added to your site
- New categories or collections have been launched
- Features like FAQs have been added to pages, or new FAQs have been added to your resource pages
- New copy has been added to collection, product, and service pages
- More blog posts have been written
- A group of URLs that wasn't being crawled or indexed is now being indexed
- Content that wasn't being found because it didn't load properly is now being crawled and indexed
We'll get into each of these in a second. But first, let's look at "the why".
When you add new copy, whether it's a blog post, an FAQ, or content on pages that never had it, you're giving a search engine more information about the page. That information can be used to answer more of your users' queries, which means the search engine will begin picking up the content and figuring out which searches to show it for.
The content likely won't be in the top 10 or 20 positions immediately, and you're also going to find a ton of long-tail keyword phrases. Many of these phrases will begin appearing in the lower positions, around 50 to 90. Because of the new keywords added to the total, you have now offset the old dataset, lowering your overall average position in Google Search Console. Here's an easier-to-follow example.
Suppose you have a single-page website with 100 keywords ranking in the top 100 positions. 40 of those keywords are in the top 10, and the rest sit in the 40s to 90s. Now you add 5 blog posts, and each one adds another 50 keywords in the top 100 positions.
10 of these new keyword phrases land in the top 10, but the other 240 rank much lower. Because there are more keywords in total, and the majority of the new ones sit in the lower positions, the overall average ranking position drops even though traffic and visibility are likely higher.
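If you want to see that math play out, here's a quick sketch in Python. The position buckets (roughly position 5 for the top-10 keywords, and the 65 and 70 averages for the lower ones) are made-up numbers purely for illustration, not data from a real property:

```python
# Illustrative only: hypothetical average positions for each bucket of keywords.

# Before the new blog posts: 100 keywords total.
before = [5] * 40 + [65] * 60            # 40 keywords near position 5, 60 in the 40s-90s
avg_before = sum(before) / len(before)   # ~41

# After 5 posts, each adding 50 new keywords (250 new keywords total).
new_keywords = [5] * 10 + [70] * 240     # 10 new keywords in the top 10, 240 much lower
after = before + new_keywords
avg_after = sum(after) / len(after)      # ~60

print(f"Average position before: {avg_before:.1f}")
print(f"Average position after:  {avg_after:.1f}")
# The average "drops" from ~41 to ~60 even though the site now ranks for
# 250 more keywords and has more top-10 placements than it did before.
```

The average position looks worse after the new content goes live, even though every single one of the original rankings is untouched and the site is visible for far more queries.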
To make sure you didn't lose rankings, you'll want to check the original page inside Google Search Console.
You can see if a page dropped in rankings in Google Search Console by:
- Logging into Search Console
- Selecting the property you want to look at
- Clicking "Search results" in the left-hand navigation
- Selecting "Pages" in the middle of the screen
- Clicking the URL you want to check the average position for
- Checking the box for average position
And now you have your answer.
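If you'd rather pull those numbers programmatically, here's a rough sketch using the Search Console API through google-api-python-client. The property URL, page URL, dates, and service-account file below are placeholders you'd swap for your own, and you'll need a service account (or OAuth) with access to the property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders -- swap in your own property, page, and credentials file.
SITE_URL = "https://www.example.com/"           # or "sc-domain:example.com"
PAGE_URL = "https://www.example.com/some-page/"
KEY_FILE = "service-account.json"               # service account with Search Console access

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

# Query the average position for a single page over a date range.
response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["page"],
        "dimensionFilterGroups": [{
            "filters": [{"dimension": "page", "operator": "equals", "expression": PAGE_URL}]
        }],
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], "average position:", round(row["position"], 1))
```

Running the same query for two date ranges lets you compare a page's average position before and after the new content went live.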
You likely did not lose traffic or rankings if your total average position dropped, but instead fed search engines like Google more data. This data is now being indexed and could lead to more traffic and revenue for you. Want some more tests and things to check for? Good, you’re a geek like me then!
If you're curious about crawling, whether Google is finding more pages, or how deep Google looked into your site, the answer can be found in your server/crawl logs and in the total pages indexed via a "site:" search.
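If you want to poke at those logs yourself, here's a minimal sketch that counts which URLs Googlebot requested, assuming a standard combined access log format and a hypothetical log path. Matching on the user-agent string alone is a rough check (true verification means a reverse DNS lookup), but it's enough to spot a jump in the number of crawled URLs:

```python
import re
from collections import Counter

# Assumes a standard combined/common access log format; the path is a placeholder.
LOG_FILE = "/var/log/nginx/access.log"

# Capture the requested path from lines whose user agent mentions Googlebot.
request_re = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[^"]*"')

crawled = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        if "Googlebot" not in line:
            continue
        match = request_re.search(line)
        if match:
            crawled[match.group(1)] += 1

print(f"Unique URLs crawled by Googlebot: {len(crawled)}")
for url, hits in crawled.most_common(20):
    print(hits, url)
```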
You can also use your SEO tools by plugging your URL in and looking at how many keywords exist in the top 100 positions. Almost all of the tools have a database and track increases and decreases by domain. If you want my preferred ones, use the contact form on this page and I'll share them with you.
The next place to look is on-page changes that your marketing or development team will know about.
Check for:
- Additional FAQs on product pages, collections, blog posts, resource centers and calculators
- New products and services that were added to the site, including SKUs, variants, and newer models
- Finding out if specific parts of your website were not being indexed (a quick check script follows this list), including:
  - Categories/collections/folders being blocked by robots.txt, both in store/services and on your blog
  - Sections of your site that didn't get crawled because a meta robots tag was set to noindex
  - Tags, for the same reasons above, if tags are supposed to be indexed on your site
- New physical locations if you're a brick-and-mortar retailer
- Feeds being added to your site, like a PR newsroom or landing pages from an email blast archive
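Here's the quick check script mentioned above: a rough sketch that takes a handful of URLs (the ones below are placeholders) and reports whether robots.txt blocks Googlebot or a meta robots noindex tag is present. It only looks at the HTML meta tag, not the X-Robots-Tag header, so treat it as a first pass rather than a full audit:

```python
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urljoin, urlparse

# Hypothetical URLs to audit -- swap in categories, collections, tags, etc. from your site.
URLS = [
    "https://www.example.com/collections/new-arrivals/",
    "https://www.example.com/blog/some-post/",
]

robots_cache = {}

def robots_for(url):
    """Fetch and cache robots.txt for the URL's origin."""
    origin = "{0.scheme}://{0.netloc}".format(urlparse(url))
    if origin not in robots_cache:
        parser = urllib.robotparser.RobotFileParser()
        parser.set_url(urljoin(origin, "/robots.txt"))
        parser.read()
        robots_cache[origin] = parser
    return robots_cache[origin]

for url in URLS:
    blocked = not robots_for(url).can_fetch("Googlebot", url)
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    # Rough check: attribute order can vary, so this won't catch every markup style.
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex', html, re.I))
    print(f"{url}\n  blocked by robots.txt: {blocked}\n  meta robots noindex: {noindex}")
```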
Bonus tip: If you keep track of what was cached in Google then vs. now, this could also provide some clarity if your teams tell you nothing was changed or added. The Wayback Machine works great for this too.
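If you want to automate the then-vs.-now comparison, here's a small sketch against the Wayback Machine's availability API. The page URL and timestamps are placeholders; the API simply returns the snapshot closest to each date, which you can then open and diff by hand:

```python
import json
import urllib.parse
import urllib.request

# Placeholder page and dates -- compare what a URL looked like "then" vs. "now".
PAGE = "https://www.example.com/collections/new-arrivals/"
TIMESTAMPS = ["20230101", "20240101"]  # YYYYMMDD for the "then" and "now" you care about

for ts in TIMESTAMPS:
    query = urllib.parse.urlencode({"url": PAGE, "timestamp": ts})
    with urllib.request.urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest:
        print(f"Requested {ts}: snapshot {closest['timestamp']} -> {closest['url']}")
    else:
        print(f"Requested {ts}: no snapshot found")
```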
Average position dropping in Google Search Console isn’t always a bad thing. In many situations it means you’re able to gain new opportunities to build traffic and grow revenue. If you found this post helpful, subscribe to my newsletter and I’d love a share on social media.
4 thoughts on “Average Position Dropping in GSC is a Good Thing! Here’s Why.”
Hello dear Adam!
Thanks for your great article.
I'm an SEO expert at the Tabdeal cryptocurrency exchange.
I would appreciate it if you could send me your preferred tools for the topic mentioned:
"If you want my preferred ones, use the contact form on this page and I'll share them with you."
Thanks.
Sending you an email.
Thanks Adam for sharing.
This is about the most accurate explanation I have read. My site is beginning to rank for more keywords, and the top queries for my site are increasing daily.
Glad it could help and thank you for reading.