Posted by nafnlj 2 days ago
1. AI Overviews: my page impressions were high, my ranking was high, but click-through took a dive. People read the generated text and move along without ever clicking.
2. You are now a spammer. Around August, traffic took a second plunge. In my logs, I noticed weird queries hitting my search page: people were searching my blog for crypto and scammy websites. Odd, but not like they were finding anything. Turns out, their search query was displayed as an h1 on the page and crawled by Google. I was basically displaying spam.
I don't have much control over AI Overviews, because disabling them means I don't appear in search at all. But for the spam, I could do something: I added a robots noindex on the search page. A week later, both impressions and clicks recovered.
Edit: Adding write up I did a couple weeks ago https://idiallo.com/blog/how-i-became-a-spammer
You can avoid this by not caching search pages and applying noindex via the X-Robots-Tag header https://developers.google.com/search/docs/crawling-indexing/...
But yes, just noindex search pages, like they already said they did.
What could Google do to mitigate?
They say the data before and after is no longer comparable, as they stopped counting certain events below a threshold. You might need to have your own analytics to understand your traffic from now on.
1) encourage SEO sites to proliferate and dominate search results, pushing useful content down on the page.
2) sell ad placement directly on the search results page, further pushing useful content down on the page
3) introduce AI summaries, making it unnecessary to even look for the useful content pushed down on the page.
Now, people only see the summaries and the paid-for ad placements. No need to ever leave the search page.
I’m imagining something like “blog.example/?s=crypto” which only I should see, not Google.
Edit: Were they linking to your website from their own? (In that case the link with the bad search keywords can be crawled.)
The solution is to tell the crawler that my search page shouldn't be indexed. This can be done with the robots meta tags.
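A minimal sketch of that fix, assuming a handler that renders the search page (the function name and response shape here are illustrative, not from the original post): emit the directive both as a robots meta tag in the document head and as an X-Robots-Tag response header, so the page is kept out of the index however it gets fetched.

```python
def search_page_response(body_html: str) -> tuple[dict, str]:
    # Belt and braces: the same "noindex" directive as an HTTP response
    # header (X-Robots-Tag) and as a <meta> tag in the document head.
    # Either one alone is enough for Google; emitting both covers
    # cached copies and non-HTML fetches as well.
    headers = {"X-Robots-Tag": "noindex"}
    page = (
        "<html><head>"
        '<meta name="robots" content="noindex">'
        "</head><body>"
        f"{body_html}"
        "</body></html>"
    )
    return headers, page
```

Note that the page must not be blocked in robots.txt for this to work: Google has to be able to crawl the page to see the noindex directive.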
Anyway, I'd really like to at least see google make the overview text itself clickable, and link to the source of the given sentence or paragraph. I think that a lot of people would instinctively click-through just to quickly spot check if it was made as easy as possible.
IIRC-
Used to take you to cited links, now launches a sidebar of supposed sources but which are un-numbered / disconnected from any specific claims from the bot.
example.com/search?q=text+scam.com+text
On my website, I'll display "text scam.com text - search result". Now Google will see that link in my h1 tag and page title and say I am probably promoting scams. Also, the reason this appeared suddenly is that I added support for Unicode in search. Before that, the page would fail if you added Unicode. So the moment I fixed it, I allowed spammers to have their links displayed on my page.
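Independent of the noindex fix, a page should never echo an attacker-controlled query verbatim into the h1 and title. Escaping doesn't stop the spam text itself from appearing (only noindex solves that), but it does stop a crafted query from injecting live markup or links. A hedged sketch with stdlib escaping (the function name is illustrative):

```python
import html

def search_heading(query: str) -> str:
    # Escape the user-supplied query before echoing it back, so a
    # crafted query can't inject tags or clickable links into the
    # h1 or <title>. The spam text still renders, but only as text.
    return f"<h1>{html.escape(query)} - search result</h1>"
```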
[1] https://cyberinsider.com/threat-actors-inject-fake-support-n...
Since these are very low quality results surely one of Google's 10000 engineers can tweak this away.
That's trivially easy. Imagine a spammer creating some random page which links to your website with that made-up query parameter. Once Google indexes their page and sees the link to your page, Google's Search Console complains to you, the victim, that this page doesn't exist. And you have no insight into where Google even found that non-existent path.
> Since these are very low quality results surely one of Google's 10000 engineers can tweak this away.
You're assuming there are still people at Google who are tasked with improving actual search results and not just the AI overview at the top. I have my doubts Google still has such people.
Google got smart and found out such exploits, and penalized sites that do this.
This has been our experience with our content-driven marketing pages in 2025. SERP results constant, but clicks down 90%.
This is not good for our marketing efforts, and terrible for ad-supported public websites, but I also don't understand how Google is not terribly impacted by the zero-click Internet. If content clicks are down 90%, aren't ad clicks down by a similar amount?
The real issue at hand here is that it’s difficult to impossible to discover why, or raise an effective appeal, when one runs afoul of Google, or suspects they have.
I shudder to use this word, as I do think it's being overused in some contexts, but I think it's the best word to use here: the issue is really that Google is a Gatekeeper.
As the search engine with the largest global market share, whether or not Google has a commercial relationship with a site is irrelevant. Google has decided to let their product become a Utility. As a Utility, Google has a responsibility to provide effective tools and effective support for situations like this. Yes it will absolutely add cost for Google. It’s a cost of doing business as a Gatekeeper, as a Utility.
My second shudder in this comment - regulation is not always the answer. Maybe even it’s rarely the answer. But I do think when it comes to enterprises that have products that intentionally or unintentionally become Gatekeepers and/or Utilities, there should be a regulated mandate that they provide an acceptable level of support and access to the marketplaces they serve. The absence of that is what enables and causes this to perpetuate, and it will continue to do so until an entity with leverage over them can put them in check.
The US regulates monopolies.
The US regulates utilities, defined by ~1910 industries.
It doesn't generally regulate non-monopoly companies that are gatekeepers.
Hence, Apple/Google/Facebook et al. have been able to avoid regulation by avoiding being classed as monopolies.
Imho, the EU is taking the right approach: also classify sub-monopoly entities with large market shares, and apply regulatory requirements on them.
I'd expect the US to use a lighter touch, and I'm fine with that, but regulations need to be more than 'no touch'. It'd also be nice if they were bucketed and scaled (e.g. minimal requirements at 10-20% market share, more at 21-50%, max at 50%+).
With Google and SEO I see it more in the monopoly camp though. The existence of other big tech companies doesn't break the monopoly Google has by owning search, ads, analytics, et al under the same umbrella.
We've seen the legal gymnastics around market definition for monopoly purposes.
But it's harder for Google to make the case that it doesn't own at least a big chunk of {mobile OS} or {mobile app store} market share.
They can argue +/- a few percent over methods, but "We don't have a substantial market share" won't fly.
In this case, though, it still seems like a simpler monopoly question involving only Google. You don't need to consider other companies when the issue is the black box of search rankings.
If a few actors control the bulk of a market... wouldn't the same redresses be appropriate whether or not they're colluding?
We should make companies want to stay at a competitive market share instead of taking over their markets.
If we think free markets are generally going to move in the right direction, we should just want companies to want to fill market gaps and outcompete. I don't agree we should make companies do anything, though; at most, governments should be tweaking incentives to attempt to push companies down a path without directly forcing them there (and even then I'm not sold that approach is worth it).
You don't think that alone distorts the market enough to merit intervention to encourage more competitors?
If you tie intervention to proven malfeasance, you allow abusers to skirt the rules for decades, entrench their positions with obscene profits, and then maybe eventually face consequences if they lose a legal case.
Instead of labeling some things illegal after the fact, monopoly and market law should be based around identifying certain high-market-share situations as potentially dangerous and requiring compliance with additional regulations that make it harder for the dominant company to prevent competitors from starting and growing.
Otherwise, it invariably slides into state-aligned and -supported chaebols, because the government has incentive to ask large companies for help and they have incentive to cooperate with the government.
It’s possible the only hope is a painful one: a major market crash caused by greed and excessive consolidation, the kind of crash that would trigger a 21st century new deal.
The consolidation of power in the US government is the root of many of our problems, I don't expect that same government to solve it by grabbing even more power a la the new deal.
For certain popular sites, it doesn't. Those businesses have to pay the shelf tax if they want their published piece to ever be reasonably found, not just seen, when someone looks for it specifically.
I used to work for an SEO firm, I have a decent idea of best practices for this sort of thing.
BAM, I went from thousands of indexed pages to about 100
See screenshot:
https://x.com/donatj/status/1937600287826460852
It's been six months and never recovered. If I were a business I would be absolutely furious. As it stands this is a tool I largely built for myself so I'm not too bothered but I don't know what's going on with Google being so fickle.
Updated screenshots:
However, if they do it for the statutory term, they can then successfully apply for existing-use rights.
Yet I've seen expert witnesses bring up Google pins on Maps during tribunal over planning permits and the tribunal sort of acts as if it's all legit.
I've even seen the tribunal's reports publish screenshots from Google Maps as part of their judgement.
I called the locksmith and they came, but in an unmarked van, spent over an hour to change 2 locks, damaged both locks, and then tried to charge me like $600 because the locks were high security. It's actually a deal for me, y'know, these locks go for much more usually. I just paid them and immediately contacted my credit card company to dispute the charge.
I called their office to complain and the lady answering the phone hung up on me multiple times. I drove to where the office was supposed to be, and there was no such office. I reported this to google maps and it did get removed very quickly, but this seems like something that should be checked or at least tied back to an actual person in some way for accountability.
Then I went to the hardware store and re-changed all the locks myself.
They are certainly trying. It's not good for them to have fake listings.
https://podcast.rainmakerreputation.com/2412354/episodes/169...
(just googled that, didn't listen, was looking for a much older podcast from I think Reply All from like 10yrs ago)
Just the open is similar, but the intent is totally different, and so is the focus keyword.
Not facing this issue in Bing and other search engines.
Some popular models on Hugging Face never appear in the results, but the sub-pages (discussion, files, quants, etc.) do.
Some Reddit pages show up only in their auto-translated form, and in a language Google has no reason to think I speak. (Maybe there's some deduplication to keep machine translations out of the results, but it's misfiring and discarding the original instead?)
It’s also clearly confusing users, as you get replies in a random language, obviously made by people who read an auto translation and thought they were continuing the conversation in their native language.
I think at least for Google there are some browser extensions that can remove these results.
www.xyz.com/blog/keyword-term/
www.xyz.com/services/keyword-term/
www.xyz.com/tag/keyword-term/
So, for a topic, if I have two of the above pages, Google will pick one of them as canonical despite the different keyword focus and intent. And the worst part is that it picks the worse page as canonical, i.e., the tag page over the blog post, or the blog post over the service page.
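One common mitigation, sketched here as a hypothetical helper (not something the commenter says they tried): give each of those pages a self-referential rel=canonical tag, so your preference is at least an explicit signal. Note that Google treats rel=canonical as a hint, not a directive, so it can still override it.

```python
def canonical_tag(url: str) -> str:
    # Self-referential canonical: declares which URL the page itself
    # considers authoritative, e.g. the /blog/ page pointing at its
    # own URL rather than letting Google fold it into the /tag/ page.
    return f'<link rel="canonical" href="{url}">'
```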
This is what has caused the degradation of search quality since then.
The amount of spam has increased enormously and I have no doubt there are a number of such anti-spam flags and a number of false positive casualties along the way.
https://gehrcke.de/2023/09/google-changes-recently-i-see-mor...
The wrong RSS thing may have just tipped the scales over to Google not caring.
That's not to say I don't have gripes with how Google Maps works, but I just don't know why the other factors were not considered.
I just checked a few local restaurants to me in London that opened in the last few years, and the ratio of reviews is about 16:1 for google maps. It looks like stuff that’s been around longer has a much better ratio towards trip advisor though.
Almost certainly Instagram/tiktok are though. I know a few places which have been ruined by becoming TikTok tourist hotspots.
Counterpoint: I have met people in the UK whose lives revolve around doing nothing but.