Let’s start with quick highlights; the deep dive follows.
- Google began rolling out a spam update – globally and in all languages – on August 26, 2025.
- Google labeled this a “normal” spam update, not an update to its core algorithm.
- The intention is to improve search quality by filtering spammy content.
- The update rollout ended September 21.
- Impact has been difficult to measure fully, as many of the changes affect the data in Search Console and other tracking tools more than they reflect shifts in actual traffic.
- You may see a drop in search impressions and/or volatility in your favorite SEO tracking tools.
What Happened?
Google launched its first spam update (and a pretty hefty one) in quite some time on August 26. The update ran for 27 days, which made it one of the larger updates of the year.
The previous spam update occurred in December 2024, so search marketers across the industry were watching their data closely. From what we’ve seen so far, most of the changes will affect hardcore SEO nerds (like myself) and anyone who’s been producing low-quality content (hopefully not like myself).
What Digital Marketers Are Seeing
Ranking Volatility/Traffic Fluctuations
Focus on Content Quality and Spam Signals
The update reinforces what we all knew: Google continues to tighten its detection of spam, such as link spam, content duplication, thin content, etc. Sites with dubious backlink profiles, spammy internal/external linking, or low-value content are most at risk. Sites with clean, user-focused content tend to benefit – or at least see much smaller losses – from spam updates.
Removal/Disabling of the &num=100 Parameter
One change could have a significant impact on how traffic-tracking tools work: Google has disabled the &num=100 parameter in search results.
What does that mean? Google has removed the ability of marketers and SEO tools to request 100 results on a single search results page rather than the default 10 results per page. The parameter made it easy to pull large amounts of search data in one request, as opposed to grabbing it 10 results at a time, one SERP page per request.
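To make that concrete, here's roughly what the difference looks like at the URL level. This is just an illustration of the URL structure – the query is a placeholder, and in practice most trackers fetch SERP data through third-party APIs rather than raw requests like this.

```python
# Illustrative only: the old one-shot request vs. the default paged request.
from urllib.parse import urlencode

query = "best running shoes"  # placeholder query

# The old shortcut: one request asking for 100 results on a single page.
old_url = "https://www.google.com/search?" + urlencode({"q": query, "num": 100})

# The default behavior: 10 results per page, paged with the start offset.
page_2_url = "https://www.google.com/search?" + urlencode({"q": query, "start": 10})

print(old_url)     # ...?q=best+running+shoes&num=100
print(page_2_url)  # ...?q=best+running+shoes&start=10
```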
The old way made life easier for marketers and tracking tools, but it might have been inflating organic search impressions as bots crawled through search pages.
Users started feeling the impact of this roughly halfway through the update. We’re still figuring out what this will mean long term, or whether other, similar parameters will be adjusted.
So ... What Do These Changes Mean?
Search Console Traffic & Impressions Drop
Many site owners are seeing sharp drops in reported impressions in Google Search Console, especially for desktop traffic. Previously, bots/scrapers using &num=100 would trigger many impressions for results deep in the rankings (positions 50-100), which rarely see real human traffic. Those deep-page requests are gone, and the misreported impressions went with them.
Average Position Looks Better
Since many of those low-ranking impressions are no longer being counted, the average position reported in GSC tends to improve for many sites. The new numbers better represent where real users actually see you.
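To see why, here's a toy example with made-up numbers showing how the impression-weighted average position shifts once deep, bot-driven impressions disappear.

```python
# Made-up numbers: how losing deep-position bot impressions shifts the
# average position reported in Search Console.
real = [(3, 1000)]   # (position, impressions) from real users
bot  = [(85, 1500)]  # deep impressions triggered by &num=100 scrapers

def avg_position(rows):
    total_impressions = sum(i for _, i in rows)
    return sum(p * i for p, i in rows) / total_impressions

print(avg_position(real + bot))  # 52.2 -- the old, inflated-looking average
print(avg_position(real))        # 3.0  -- the new, more realistic average
```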
Keyword Visibility/Unique Terms
Many sites are losing unique ranking terms in their reports. Most of these terms showed up only through deep-results scraping, so the loss doesn't necessarily translate to lost traffic. It does make visibility metrics (how many queries you rank for and at what positions) look worse, even if your overall traffic wasn't affected.
Rank-Tracker Tools Disrupted
Tools that relied on the &num=100 parameter must change how they gather SERP data. They now often need 10 discrete requests (one per 10-result page) instead of one big request. This increases cost and time, and sometimes causes gaps or inconsistencies while tools adjust.
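As a rough, back-of-the-envelope illustration (the keyword count and depth are made up), here's what that means for request volume:

```python
# Made-up numbers: the request-volume change a rank tracker faces
# once &num=100 is gone.
keywords_tracked = 5_000
results_per_keyword = 100      # depth the tool wants to see
results_per_page = 10          # Google's default page size

old_requests = keywords_tracked * 1  # one num=100 call per keyword
new_requests = keywords_tracked * (results_per_keyword // results_per_page)

print(old_requests)  # 5000
print(new_requests)  # 50000 -- a 10x jump in SERP fetches for the same coverage
```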
No Big Change in Actual Human Rankings/Clicks
Early returns indicate that clicks, positions, and overall traffic from search haven't dropped significantly. The big shifts occurred on the reporting/visibility side of SEO rather than in fundamental changes to what users see.
Combined Effect: Ranking vs. Data Noise
The August spam update has hit some sites hard: they're losing rankings or traffic, especially if they had spammy or low-quality content or practices.
Other changes, such as the removal of &num=100, aren't causing ranking changes per se. They're correcting distortions in how metrics have been reported (especially in Search Console); a lot of impression and keyword-visibility “noise” from deep positions was inflating the numbers.
Red numbers make leadership sad, but this is just an adjustment toward more realistic data. We'll form a new baseline for search data after the update. Pro tip: In the interim, make your reports look a bit less angry by pointing out that average position metrics are “improving” – a sign that rankings aren't actually changing much.
What Should You Do?
To interpret these changes properly:
- Don’t panic. When you get hit by a spam or core update, don’t try to chase algorithm tricks. Focus on user benefit, content usefulness, trust, etc.
- Check actual traffic/clicks, not just impressions or keyword visibility. If clicks stay steady but impressions drop, it’s probably a reporting change, not a real-world drop.
- Segment data by device (desktop vs. mobile). Many of the changes (impression drops) are more visible on desktop; one way to pull these numbers from the Search Console API is sketched after this list.
- Review content quality: E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness); check for inadvertent spam signals; eliminate past bad practices that the update could penalize (thin content, spammy backlinks, etc.).
- Adjust SEO tool expectations/budgets. With &num=100 gone, you might need more API calls or scraping steps.
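If you want to sanity-check your own numbers, here's a minimal sketch (not production code) of pulling clicks and impressions by device from the Search Console API. It assumes the google-api-python-client package and that `creds` already holds authorized credentials with the Search Console read-only scope; the site URL and dates are placeholders you'd swap for your own.

```python
# Minimal sketch: compare clicks vs. impressions by device in Search Console.
# Assumes `creds` is an authorized credentials object with the
# https://www.googleapis.com/auth/webmasters.readonly scope.
from googleapiclient.discovery import build

def clicks_and_impressions_by_device(creds, site_url, start_date, end_date):
    service = build("searchconsole", "v1", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=site_url,              # e.g. "https://www.example.com/"
        body={
            "startDate": start_date,   # e.g. "2025-08-01"
            "endDate": end_date,       # e.g. "2025-09-30"
            "dimensions": ["device"],  # DESKTOP / MOBILE / TABLET
        },
    ).execute()

    for row in response.get("rows", []):
        device = row["keys"][0]
        print(f"{device}: {row['clicks']:.0f} clicks, {row['impressions']:.0f} impressions")
```

If clicks hold steady per device while desktop impressions fall off a cliff, you're most likely looking at the reporting change, not a real ranking loss.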
Need help developing or updating your SEO strategy and/or implementing it? Reach out! Our expert digital marketing strategists are ready to help you achieve your goals.