

The BIG Changes in The May 2025 Google Update
16-06-2025 (Last modified: 16-06-2025)
Between 27 – 29 May 2025 many website owners, including us, saw huge drops in the number of pages indexed by Google. At first we thought this was a glitch and hoped Google had a temporary issue that would rectify itself after a couple of days… Sadly this did not happen, and we became aware of many others in the same position.
So… what caused this, and why was it not announced as an official May 2025 Google core update, when it looks like the biggest change we have seen to date in the way Google indexes pages?
We decided to do our own research. Noticing that the drop overlapped with the rollout of the Search Generative Experience (SGE), we looked at what we were seeing in our own dashboards, as well as the issues faced by the wider community of website owners and SEO experts. Here are our thoughts…
EEAT
We all know how easy it is these days to create content and have it become visible online almost immediately. This can be seen as a good thing – more and more content available to everyone worldwide, offering answers to any problem imaginable – but it can also be a bad thing… With so much information constantly being published online, Google has a responsibility to monitor what is being put out into the world and ensure there is substance and validation behind it. We have likely all heard of ‘EEAT’, which stands for Experience, Expertise, Authoritativeness, and Trustworthiness. I believe this is key to the May 2025 Google update (and even the June 2025 Google update), and any pages not showing ‘enough EEAT’ are being culled.
Ask yourself the following questions for each page affected:
- Has your page been entirely written by a human?
- Does your page have links to an author, demonstrating EEAT behind the article?
- Does your page have links to other pages within your site?
- Does your page fit into a structured format within your site and is it easy to find?
- Does your page have any backlinks linking to it?
If you’ve answered no to any of the above, this may be your issue. Each of these points demonstrates a level of EEAT – and the more yeses, the better!
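If you want to spot-check some of these signals at scale, a short script can flag obvious gaps. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed, and assuming (hypothetically) that your author bios live under an /author/ path – adjust to match your own site:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def audit_page(url):
    # Fetch the page and parse its HTML
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(url).netloc
    links = [a["href"] for a in soup.find_all("a", href=True)]
    # Hypothetical convention: author pages live under /author/
    has_author_link = any("/author/" in href for href in links)
    # Internal links: relative URLs, or absolute URLs on the same host
    internal_links = [h for h in links if h.startswith("/") or host in h]
    return {"author_link": has_author_link, "internal_links": len(internal_links)}

print(audit_page("https://yourdomain.com/blog/example-post/"))

This won’t measure EEAT itself – only Google can judge that – but it quickly surfaces pages missing the basic plumbing from the checklist above.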
Index Culling
With so much new, and presumably up-to-date and correct, information being published, Google has to ask itself whether old pages that haven’t been updated in, say, a few years are really still relevant. You might notice that older pages you haven’t looked at in a while were culled in the May 2025 Google update because newer, more relevant information is available elsewhere. This could be an easy fix (assuming your page fulfils the criteria above): skim through the page and make updates to bring it into line with newer, current information.
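One way to triage a large archive is to check when each page was last modified. Here is a minimal sketch, assuming Python with the requests package, and assuming your server returns a Last-Modified header (many CMS setups do, some don’t):

import requests
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def staleness_in_days(url):
    # A HEAD request is enough; we only need the response headers
    resp = requests.head(url, timeout=10, allow_redirects=True)
    last_modified = resp.headers.get("Last-Modified")
    if last_modified is None:
        return None  # server does not report a modification date
    modified = parsedate_to_datetime(last_modified)
    return (datetime.now(timezone.utc) - modified).days

age = staleness_in_days("https://yourdomain.com/blog/old-post/")
if age is not None and age > 730:  # roughly two years, an arbitrary cut-off
    print(f"Last modified {age} days ago - worth a refresh")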
As well as old pages, Google also discounts pages with too few words – i.e. those not really offering much to the reader. Generally speaking, the longer the content, the more useful information there is to take from it. If you have written an article that is only 250 words long, with no images or links to further information, it isn’t going to impress anyone – especially Google!
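To find thin pages across a whole site, the same idea extends to a list of URLs. Another minimal sketch, again assuming requests and beautifulsoup4 (the 250-word threshold mirrors the example above, not an official Google cut-off):

import requests
from bs4 import BeautifulSoup

def word_count(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Remove script and style tags so only visible text is counted
    for tag in soup(["script", "style"]):
        tag.decompose()
    return len(soup.get_text(" ", strip=True).split())

urls = [
    "https://yourdomain.com/blog/post-1/",
    "https://yourdomain.com/blog/post-2/",
]
thin_pages = [u for u in urls if word_count(u) < 250]
print("Thin pages:", thin_pages)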
While we are on the subject of writing, you also need to question whether your page uses natural language throughout. If you have been able to tick all the boxes above but your article doesn’t read naturally, you’ll need to rewrite it (not using AI…) so that it comes across in a natural manner. Let’s look at an example:
Title:
Best Cat Food For Cats – Buy Cat Food Online
Content:
“Looking for the best cat food for cats? Our best cat food for cats is the best cat food online you can buy. When you are choosing cat food for cats online it’s important to choose the best cat food options for cats. Our store sells the best cat food online for cats…”
You get the gist… I’m not even going to point out the errors here! You may very well sell the best cat food, but if the copy doesn’t read naturally, and isn’t backed up by relevant stats and nutritional information, Google isn’t going to buy any of it (for its cat or for its search engine!)
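One rough way to catch this kind of keyword stuffing before you publish is to measure how much of the copy a target phrase consumes. A minimal sketch in plain Python – the ~3% ceiling in the comment is an illustrative rule of thumb, not a figure from Google:

def phrase_density(text, phrase):
    words = text.lower().split()
    if not words:
        return 0.0
    hits = text.lower().count(phrase.lower())
    # Share of all words taken up by repetitions of the phrase
    return (hits * len(phrase.split())) / len(words)

sample = ("Looking for the best cat food for cats? Our best cat food "
          "for cats is the best cat food online you can buy.")
print(f"{phrase_density(sample, 'cat food'):.1%}")  # ~26%, far above a ~3% rule of thumb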
Robots.txt Misinterpretation
We noticed that some of our sites saw pages fluctuate between being indexed and being flagged as ‘Indexed, though blocked by robots.txt’, which can artificially reduce indexed counts. If you’re wondering where to find this, head to Google Search Console, open the Pages tab, and look for the yellow/orange entry at the bottom of the coverage reporting. That ‘Indexed, though blocked by robots.txt’ report can often explain a sudden surge, and then drop, in a site’s overall indexing.
To check whether this is the case, simply type ‘yourdomain.com/robots.txt’ into your browser and see whether any blocked pages or areas are listed. For example, if you see the following, it means all blog pages are blocked from being seen by Google:
User-agent: *
Disallow: /blog/
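Rather than eyeballing the file, you can test specific URLs against it programmatically. Python’s standard library ships a robots.txt parser, so a minimal sketch needs no extra packages (yourdomain.com is a placeholder):

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://yourdomain.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# With "Disallow: /blog/" in place, the first check prints False
print(rp.can_fetch("Googlebot", "https://yourdomain.com/blog/my-post/"))
print(rp.can_fetch("Googlebot", "https://yourdomain.com/about/"))

Re-running the same check after editing the file (as described below) confirms the fix before you head back to Search Console.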
Why Does This Matter for SEO?
Blocking essential pages via robots.txt has two major impacts:
- Loss of organic traffic: pages not crawled or indexed won’t appear in search results, so you’ll lose all potential organic traffic to them.
- Reduced domain visibility: Google’s perception of your site’s overall value may be impacted negatively, indirectly affecting other pages’ rankings.
So how do we fix this? Well, it’s actually very simple! Head to the robots.txt file described above. Let’s assume we see the scenario above and we want to enable all blog pages to be visible to Google. We have two options…
Remove the path from the ‘Disallow’ line so the file now reads:
User-agent: *
Disallow:
Or we can specify that we definitely want to ‘allow’ blog pages and change the directive to the following:
User-agent: *
Allow: /blog/
You can then head back into Google Search Console and request indexing, and hopefully this will fix your issue.
The May 2025 indexing update served as a powerful reminder that even minor misconfigurations, like robots.txt errors, can drastically impact SEO visibility. Correcting them quickly, clearly signalling the fix to Google, and implementing ongoing checks can rapidly reverse indexing losses and safeguard your organic visibility long term.
Final Thoughts on the May 2025 Google Update
The May 2025 Google update felt like a real core update, even without official confirmation. Sites with strong EEAT, good UX and user-first content gained visibility. Others that hadn’t demonstrated their EEAT clearly enough, or that relied on outdated tactics, lost rankings. SEO tools like Semrush and MozCast showed high volatility, matching what site owners saw.
Google’s silence did not stop the impact. This update (or non-update, if you like) proves that improving content quality, user experience, and trust signals is the only safe path. Search algorithms are changing fast with AI. Websites must adapt by focusing on real value, and make sure it is clearly surfaced (with internal linking) so that Google can reach and index all your content. If content clearly helps people, rankings stay strong, even when updates roll out quietly.