
Posted by nafnlj 2 days ago

Google de-indexed Bear Blog and I don't know why(journal.james-zhan.com)
423 points | 181 comments
firefoxd 2 days ago|
Traffic to my blog plummeted this year, and you can never be entirely sure why. But here are two culprits I identified.

1. AI Overviews: my page impressions were high, my ranking was high, but click through took a dive. People read the generated text and move along without ever clicking.

2. You are now a spammer. Around August, traffic took a second plunge. In my logs, I noticed weird queries on my search page. Basically, people were searching for crypto and scammy websites on my blog. Odd, but it's not like they were finding anything. Turns out, their search query was displayed as an h1 on the page and crawled by Google. I was basically displaying spam.

I don't have much control over AI Overviews, because disabling them means I don't appear in search at all. But for the spam, I could do something. I added a robots noindex on the search page. A week later, both impressions and clicks recovered.
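
Roughly, the fix looks like this (a minimal sketch only; my blog isn't actually Flask, and the route and names are just placeholders for illustration):

    # Sketch of the fix: the search page carries a robots noindex meta tag,
    # so crawlers can still follow links from it but won't index the page.
    from flask import Flask, request
    from markupsafe import escape

    app = Flask(__name__)

    @app.route("/search")
    def search():
        q = request.args.get("q", "")
        return (
            "<head><meta name='robots' content='noindex, follow'>"
            f"<title>{escape(q)} - search result</title></head>"
            f"<body><h1>Results for: {escape(q)}</h1></body>"
        )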

Edit: Adding write up I did a couple weeks ago https://idiallo.com/blog/how-i-became-a-spammer

dazc 2 days ago||
Sounds like point 2 was a negative SEO attack. It could be that your /?s page is being cached and getting picked up by crawlers.

You can avoid this by not caching search pages and applying noindex via the X-Robots-Tag header https://developers.google.com/search/docs/crawling-indexing/...
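
Roughly, the header approach looks like this (a sketch only; the framework and the /search path are just placeholders):

    # Sketch: send the noindex directive as an HTTP response header, which
    # also covers any non-HTML responses the search endpoint might return.
    from flask import Flask, request

    app = Flask(__name__)

    @app.after_request
    def noindex_search_pages(response):
        # Placeholder rule: anything under /search should never be indexed.
        if request.path.startswith("/search"):
            response.headers["X-Robots-Tag"] = "noindex, follow"
        return response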

weird-eye-issue 2 days ago||
Cache has nothing to do with this

But yes, just noindex search pages, like they already said they did.

MobiusHorizons 2 days ago|||
I think the question is “how is the behavior of random spammers on your search page getting picked up by the crawler?” The assumption with caching is that one user's searches were being cached so that the crawler saw them. Other alternatives I can imagine are that your search page is powered by Google, so it gets the search terms and indexes the results, or that you show popular queries somewhere. But you have to admit that the crawler seeing user-generated search terms points to some deeper issue.
weird-eye-issue 2 days ago||
You just link to that page from a page that Google crawls. Cache isn't involved, unless you call links caching.
dazc 2 days ago|||
Not sure how search result pages can be crawled unless they are cached somewhere?
margalabargala 2 days ago|||
If I'm reading correctly, it's not that your search results would be crawled, it's that if you created a link to www.theirwebsite.com/search/?q=yourspamlinkhere.com or otherwise submitted that link to google for crawling, then the google crawler makes the same search and sees the spam link prominently displayed.
Barbing 2 days ago||
Yikes.

What could Google do to mitigate?

weird-eye-issue 2 days ago|||
You noindex search pages or anything user-generated; it's really that simple.
nolito 2 days ago||
Not enough. According to this article, which you'll probably need to translate (https://www.dr.dk/nyheder/penge/pludselig-dukkede-nyhed-op-d...), it's enough to link to an authoritative site that accepts a query parameter. Google's AI picks up the query parameter as a fact. The article is about a Danish company probably circumventing sanctions, and how Russian actors manipulate that fact and turn it around via Google AI.
weird-eye-issue 1 day ago||
Yeah, all pages should have a proper canonical, which would solve this too.
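
For illustration, a minimal sketch of what I mean (Flask and the /article route are placeholders, not anyone's actual setup):

    # Sketch: emit a canonical link that drops the query string, so
    # /article?claim=whatever is treated as a variant of /article rather
    # than as its own indexable page.
    from flask import Flask, request
    from markupsafe import escape

    app = Flask(__name__)

    @app.route("/article")
    def article():
        canonical = request.base_url  # full URL without the query string
        return (
            f"<head><link rel='canonical' href='{escape(canonical)}'></head>"
            "<body><h1>Article</h1></body>"
        )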
firefoxd 2 days ago|||
In this case, all I had to do was let the crawler know not to index the search page. I used the robots noindex meta tag on the search page.
weird-eye-issue 2 days ago|||
I don't know what you mean by cache but you aren't using it correctly...
motbus3 2 days ago|||
I posted some details in the main thread, but I think you might need to check the change Google made around September this year to how impressions and clicks are counted.

They say the data before and after is not comparable anymore, as they are no longer counting certain events below a threshold. You might need to have your own analytics to understand your traffic from now on.

snowwrestler 2 days ago||
This affected only reporting of placement and impressions; basically you don’t get counts for placements below the first 10 or 20 results (can’t remember which). It did not affect clicks, which are measured directly regardless of how deep in the SERP they happen.
SoftTalker 2 days ago|||
WRT AI overviews/summaries, was Google smart enough to set up for this long ago?

1) encourage SEO sites to proliferate and dominate search results, pushing useful content down on the page.

2) sell ad placement directly on the search results page, further pushing useful content down on the page

3) introduce AI summaries, making it unnecessary to even look for the useful content pushed down on the page.

Now, people only see the summaries and the paid-for ad placements. No need to ever leave the search page.

Goofy_Coyote 2 days ago|||
Question: If I do a search for, say, crypto on your blog, how does Google get to index the resulting page?

I’m imagining something like “blog.example/?s=crypto” which only I should see, not Google.

Edit: Were they linking to your website from their own? (In that case the link with the bad search keywords can be crawled.)

firefoxd 2 days ago|||
They are spamming other websites with links to my website, like in your example. Google crawls those other websites, follows the spammy link to mine, and I get penalized for having a page with spam content.

The solution is to tell the crawler that my search page shouldn't be indexed. This can be done with the robots meta tags.

Goofy_Coyote 2 days ago||
I see, thanks for helping me understand the issue (and also the solution)
ipaddr 2 days ago|||
Link.com?search=spam from external page
Workaccount2 2 days ago|||
AI overviews likely aren't going anywhere. Techies complain about them, but from watching average people use Google, everyone just reads the overview. Hell, I even saw a screenshot of an AI overview in a PowerPoint this week...

Anyway, I'd really like to at least see Google make the overview text itself clickable, linking to the source of the given sentence or paragraph. I think a lot of people would instinctively click through just to quickly spot-check, if it were made as easy as possible.

fenykep 2 days ago|||
That is how DuckDuckGo has implemented it; I also find it to be a nicer middle ground.
hedora 2 days ago||
Kagi too (and before DDG).
Barbing 2 days ago|||
Citations got worse with AI overviews or AI mode, right, over the past couple months?

IIRC-

It used to take you to cited links; now it launches a sidebar of supposed sources that are un-numbered and disconnected from any specific claims from the bot.

bootsmann 2 days ago|||
Sorry but how did 2 work before you fixed it? You saved the queries people did and displayed them?
firefoxd 2 days ago|||
So the spammer would link to my search page with their query param:

    example.com/search?q=text+scam.com+text
On my website, I'll display "text scam.com text - search result". Now Google will see that text in my h1 tag and page title and conclude I am probably promoting scams.

Also, the reason this appeared suddenly is that I added support for Unicode in search. Before that, the page would fail if you added Unicode. So the moment I fixed it, I allowed spammers to have their links displayed on my page.

Calavar 2 days ago|||
Reminds me of a recent story on scammers using search queries to inject their scam phone numbers into the h1 header on legitimate sites [1]

[1] https://cyberinsider.com/threat-actors-inject-fake-support-n...

Neil44 2 days ago||||
Interesting - surely you'd have to trick Google into visiting the /search? url in order to get it indexed? I wonder if them listing all these URLs somewhere, or requesting that the page be crawled, is enough.

Since these are very low quality results surely one of Google's 10000 engineers can tweak this away.

input_sh 2 days ago||
> surely you'd have to trick Google into visiting the /search? url in order to get it indexed

That's trivially easy. Imagine a spammer creating some random page which links to your website with that made-up query parameter. Once Google indexes their page and sees the link to your page, Google's Search Console complains to you, the victim, that this page doesn't exist. You, as the victim, have no insight into where Google even found that non-existent path.

> Since these are very low quality results surely one of Google's 10000 engineers can tweak this away.

You're assuming there are still people at Google who are tasked with improving actual search results and not just the AI overview at the top. I have my doubts Google still has such people.

Neil44 2 days ago||
I messed around with our website trying URL-encoded hyperlinks etc., but it was all escaped pretty well. I bet there are a lot of tricks out there for those with time on their hands. Why anyone would bother creating content when Google's AI summary is effectively going to steal it to intercept your click is beyond me. So the whole issue will solve itself when Google has nothing to index except endless regurgitated slop and everyone finally logs off and goes outside.
francisofascii 2 days ago||||
Great blog post. You typically think of people linking to your website as a good thing. This is a good counterexample.
layer8 2 days ago||||
What does Unicode have to do with links?
jdiff 2 days ago||
A lot of spam uses Unicode, either for non-English languages or just to swap in lookalike characters to try and dodge keyword filters.
indymike 2 days ago|||
This has been a trick used by "reputation management" people for years.
chii 2 days ago|||
I imagine the search page echoed the search query. Then an SEO bot automated searches on the site with crypto and spam keywords, which were echoed in the search results. Said bot may have a site/page full of links to these search results, to create fake pages for those keywords for SEO purposes (essentially, an exploit).

Google got smart and caught on to such exploits, and penalized sites that do this.

jgalt212 2 days ago||
> my page impressions were high, my ranking was high, but click through took a dive. People read the generated text and move along without ever clicking.

This has been our experience with our content-driven marketing pages in 2025. SERP results constant, but clicks down 90%.

This is not good for our marketing efforts, and terrible for ad-supported public websites, but I also don't understand how Google is not terribly impacted by the zero-click Internet. If content clicks are down 90%, aren't ad clicks down by a similar number?

ipaddr 2 days ago||
They moved from clicks to pageviews which gives them cover until AI ads make up the difference.
seec 1 day ago||
Yes, to me it looks like they are now primarily selling placement in search results (or just space on their various properties). I never really understood the rationale behind click-based pricing; someone following a link doesn't necessarily mean they'll buy, and the ad got displayed regardless. But it's probably because Google was a bit coy about ads at the start to protect its reputation, so they didn't look too much like ads. Now they have so much traffic on relevant terms that they can sell the top spots at a premium.
PrairieFire 2 days ago||
Whether or not this specific author’s blog was de-indexed or de-prioritized, the issue this surfaces is real and genuine.

The real issue at hand is that it's difficult or impossible to discover why, or to raise an effective appeal, when one runs afoul of Google or suspects they have.

I shudder to use this word, as I do think in some contexts it's being overused, but I think it's the best word to use here: the issue is really that Google is a Gatekeeper.

Google is the search engine with the largest global market share, so whether or not it has a commercial relationship with a site is irrelevant. Google has decided to let their product become a Utility. As a Utility, Google has a responsibility to provide effective tools and effective support for situations like this. Yes, it will absolutely add cost for Google. It's a cost of doing business as a Gatekeeper, as a Utility.

My second shudder in this comment - regulation is not always the answer. Maybe even it’s rarely the answer. But I do think when it comes to enterprises that have products that intentionally or unintentionally become Gatekeepers and/or Utilities, there should be a regulated mandate that they provide an acceptable level of support and access to the marketplaces they serve. The absence of that is what enables and causes this to perpetuate, and it will continue to do so until an entity with leverage over them can put them in check.

_heimdall 2 days ago||
The situation reads more like a monopoly issue than a gatekeeper issue. Because Google owns the indexer and the most-used search tool, they're really only gatekeeping their own sandbox.
ethbr1 2 days ago||
It's entirely possible to have utility-importance non-monopoly gatekeepers, which is part of the legal issue.

The US regulates monopolies.

The US regulates utilities, defined by ~1910 industries.

It doesn't generally regulate non-monopoly companies that are gatekeepers.

Hence, Apple/Google/Facebook et al. have been able to avoid regulation by avoiding being classed as monopolies.

Imho, the EU is taking the right approach: also classify sub-monopoly entities with large market shares, and apply regulatory requirements on them.

I'd expect the US to use a lighter touch, and I'm fine with that, but regulations need to be more than 'no touch'. It'd also be nice if they were bucketed and scaled (e.g. minimal requirements for 10-20%, more for 21-50%, max for 50%+).

_heimdall 2 days ago|||
Sure, we agree there, though I'd add that while the US regulates monopolies, we don't always enforce that; we also allow state-sponsored monopolies for many regional utilities.

With Google and SEO I see it more in the monopoly camp though. The existence of other big tech companies doesn't break the monopoly Google has by owning search, ads, analytics, et al under the same umbrella.

ethbr1 2 days ago||
The nice thing about regulatory bucketing by market share is that it's harder to evade.

We've seen the legal gymnastics around market definition for monopoly purposes.

But it's harder for Google to make the case that it doesn't own at least a big chunk of {mobile OS} or {mobile app store} market share.

They can argue +/- a few percent over methods, but "We don't have a substantial market share" won't fly.

_heimdall 2 days ago||
No argument there either; I do agree sometimes market monopolies need to be dealt with, though the bar is high in my opinion. If it were me, I'd want to see proof of collusion; it's easy for a market with only a few actors to independently make similar choices based on similar market incentives and data.

In this case though, it still seems like a simpler monopoly issue involving only Google. You don't need to consider other companies when the issue is the black box of search rankings.

ethbr1 1 day ago||
That's part of my preference though: I'd rather government regulation of larger market share companies be a gradient rather than a binary.

If a few actors control the bulk of a market... wouldn't the same redresses be appropriate whether or not they're colluding?

We should make companies want to stay at a competitive market share instead of taking over their markets.

_heimdall 1 day ago||
I wouldn't personally want companies to be punished without evidence of collusion. A company isn't doing anything wrong by earning market share, and companies aren't doing anything wrong if they happen to move in a similar direction based on market incentives.

If we think free markets are generally going to move in the right direction, we should just want companies to want to fill market gaps and outcompete. I don't agree we should make companies do anything, though; at most, governments should be tweaking incentives to nudge companies down a path without directly forcing them there (even then I'm not sold that approach is worth it).

ethbr1 1 day ago||
So there's no collusion or misbehavior, and the market ends up as a duopoly: one participant has 70% and another has 25%.

You don't think that alone distorts the market enough to merit intervention to encourage more competitors?

If you tie intervention to proven malfeasance, you allow abusers to skirt the rules for decades, entrench their positions with obscene profits, and then maybe eventually face consequences if they lose a legal case.

Instead of labeling some things illegal after the fact, monopoly and market law should be based around identifying some high-market-share situations as potentially dangerous and requiring compliance with additional regulations that make it harder for the dominant company to prevent competitors from starting and growing.

Otherwise, it invariably slides into state-aligned and -supported chaebols, because the government has incentive to ask large companies for help and they have incentive to cooperate with the government.

dangus 2 days ago|||
I’m really hoping the pendulum swings back to sanity in the US rather than becoming a Russia-like mafia business state.

It’s possible the only hope is a painful one: a major market crash caused by greed and excessive consolidation, the kind of crash that would trigger a 21st century new deal.

_heimdall 1 day ago|||
I wouldn't personally put much hope in a new deal approach.

The consolidation of power in the US government is the root of many of our problems, I don't expect that same government to solve it by grabbing even more power a la the new deal.

esbranson 1 day ago|||
I think the standard hyperbole is supposed to imply the US is fascist, not that it is becoming fascist. Mention of mafias and post-Soviet Russia is also non-standard.
hirako2000 2 days ago||
If they considered themselves to have some ethical responsibility, they would at least tame the bidding war that lets a well-paid ad for an existing, unrelated business show up before the legitimate link, or at least limit it so that the search results show the legitimate link on the first page.

For certain popular sites, they don't. Those businesses have to pay the shelf tax if they want their published piece to be reasonably findable, not just seen, when someone is looking specifically for it.

donatj 2 days ago||
About six months ago Ahrefs recommended I remove some Unicode from the pathing on a personal project. Easy enough. Change the routing, set up permanent redirects for the old paths to the new paths.
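
Concretely, something along these lines (just a sketch; the framework and paths are made up for illustration):

    # Sketch of the redirect step: old Unicode paths return a permanent 301
    # to their new ASCII equivalents so existing links and index entries
    # carry over.
    from flask import Flask, redirect

    app = Flask(__name__)

    LEGACY_PATHS = {
        "café-reviews": "cafe-reviews",  # hypothetical example slug
    }

    @app.route("/entries/<slug>")
    def entry(slug):
        if slug in LEGACY_PATHS:
            return redirect(f"/entries/{LEGACY_PATHS[slug]}", code=301)
        return f"Entry: {slug}"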

I used to work for an SEO firm, I have a decent idea of best practices for this sort of thing.

BAM, I went from thousands of indexed pages to about 100

See screenshot:

https://x.com/donatj/status/1937600287826460852

It's been six months and it never recovered. If I were a business I would be absolutely furious. As it stands, this is a tool I largely built for myself, so I'm not too bothered, but I don't know what's going on with Google being so fickle.

Updated screenshots:

https://x.com/donatj/status/1999451442739019895

dmboyd 2 days ago||
It’s probably also a reflection of the fact that Google is throwing all their new resources at AI. As soon as you’ve hit cache invalidation you’re gone, and anything new that’s crawled is probably ranked differently in the post-LLM world.
AznHisoka 2 days ago|||
Lesson: if it's working, don't fix it.
motbus3 2 days ago|||
They already scouted all the content they needed. Sites are now competition for their AI systems.
mohas 2 days ago||
Exactly my experience: suddenly thousands of non-indexed pages, and I never figured out why. I had to disband the business, as it was a content website selling ads.
bjt12345 2 days ago||
What I find strange about Google is that there's a lot of illegal advertising on Google Maps - things like accommodation and liquor sellers that don't have permits.

However, if they do it for the statutory term, they can then successfully apply for existing-use rights.

Yet I've seen expert witnesses bring up Google pins on Maps during tribunals over planning permits, and the tribunal sort of acts as if it's all legit.

I've even seen the tribunals' reports publish screenshots from Google Maps as part of their judgements.

lunias 2 days ago||
I was a victim of this when I moved into my house. Being unfamiliar with the area, I googled for a locksmith near me. It returned a result in a shopping center just about a mile away from me. I'd driven past that center before, so it seemed entirely plausible that there was a locksmith in there.

I called the locksmith and they came, but in an unmarked van, spent over an hour to change 2 locks, damaged both locks, and then tried to charge me like $600 because the locks were high security. It's actually a deal for me, y'know, these locks go for much more usually. I just paid them and immediately contacted my credit card company to dispute the charge.

I called their office to complain and the lady answering the phone hung up on me multiple times. I drove to where the office was supposed to be, and there was no such office. I reported this to Google Maps and it did get removed very quickly, but this seems like something that should be checked, or at least tied back to an actual person in some way for accountability.

Then I went to the hardware store and re-changed all the locks myself.

socalgal2 2 days ago|||
Just curious, if you were Google, how would you fix this? And take the question seriously, because it's harder than it sounds.

They are certainly trying. It's not good for them to have fake listings.

https://podcast.rainmakerreputation.com/2412354/episodes/169...

(just googled that, didn't listen, was looking for a much older podcast from I think Reply All from like 10yrs ago)

lunias 2 days ago|||
It definitely sounds like a hard problem. I'm not familiar with the current process, but based on what I found when I looked it up, it seems like there is a verification step already in place, but some of the methods of verification are tenuous. The method that seems the most secure to me is delivering a pin to the physical location that's being registered, but I feel like everything is exploitable.
noAnswer 2 days ago||||
They could send real mail to the address with an activation code in it.
Eisenstein 1 day ago|||
Why does Google get to say 'it's hard' and we have to give them a pass? If a business is providing a service, they need to ensure it is doing what it claims. Whether it is difficult or not is not our problem.
socalgal2 1 day ago||
Google isn't the one doing anything wrong. The people posting false listing are the ones doing something wrong.
Eisenstein 1 day ago||
Google represents this data as legitimate.
fragmede 2 days ago|||
Locksmiths and plumbers are especially one of those categories where they've figured out how to game the system, so you get an extra-expensive service they contract with instead of a less expensive local company without the middleman.
deltoidmaximus 2 days ago|||
Reminds me of trap streets or trap towns that cartographers would use to watermark their maps and prove plagiarism. The trouble is reality would sometimes change to match the map.
rcxdude 2 days ago|||
Is it treated differently from other kinds of advertising? A lot of planning and permitting has a bit of a 'if it's known about and no-one's been complaining it's OK' kind of principle to it.
oakwhiz 2 days ago||
legal citogenesis?
actionfromafar 2 days ago||
Clan justice, google is the clan.
01HNNWZ0MV43FF 2 days ago||
Reality is just tug of war and weight is all that matters at the limit
FuturisticLover 2 days ago||
Google search results have gone to shit. I am facing deindexing issues where Google cites a duplicate-content problem and picks a canonical URL itself, despite there being no similar content.

Just the opening is similar, but the intent is totally different, and so is the focus keyword.

Not facing this issue in Bing and other search engines.

daemonologist 2 days ago||
I've also noticed Google having indexing issues over the past ~year:

Some popular models on Hugging Face never appear in the results, but the sub-pages (discussion, files, quants, etc.) do.

Some Reddit pages show up only in their auto-translated form, and in a language Google has no reason to think I speak. (Maybe there's some deduplication to keep machine translations out of the results, but it's misfiring and discarding the original instead?)

kace91 2 days ago|||
Reddit auto translation is horrible. It’s an extremely frustrating feeling, starting to read something in your language believing it’s local, until you reach a weird phrase and realise it’s translated English.

It’s also clearly confusing users, as you get replies in a random language, obviously made by people who read an auto translation and thought they were continuing the conversation in their native language.

morkalork 2 days ago||
I will google for something in French when I don't find the results I want in English. Sometimes Google will return links to English threads (that I've already seen and decided were worthless!) auto-translated to French. As if that were any help at all.
sischoel 2 days ago|||
The issue with auto-translated Reddit pages unfortunately also happens with Kagi. I am not sure if this is just because Kagi uses Google's search index or if Reddit publishes the translated title as metadata.

I think at least for Google there are some browser extensions that can remove these results.

black_puppydog 2 days ago||
The Reddit issue is also something that really annoys me, and I wish Kagi would find some way to counter it. Whenever I search for administrative things, I do so in one of three languages (German, French, or English), depending on which context the issue arises in. And I would really prefer to only get answers that are relevant to that country. It's simply not useful for me to find answers about social security issues in the US when I'm searching for them in French.
dev_l1x_be 2 days ago|||
Amazon, Google: it's the same. Fake products, fake results, scammers left and right.
nubinetwork 2 days ago|||
Check that you're not routing unnamed SNI requests to your web content. If someone sets up a reverse proxy with a different domain name, Google will index both domains and freak out when it sees duplicate content. Also make sure you're setting canonical tags properly. Edit: I'd also consider using full URLs in links rather than relative paths.
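
A rough sketch of that first check (the hostname and framework here are placeholders, not your actual setup): refuse to serve content for Host headers that aren't your canonical domain.

    # Sketch: reject or redirect requests arriving under an unexpected
    # hostname, so a stray reverse proxy can't mirror the site as a
    # duplicate domain.
    from flask import Flask, request, redirect

    app = Flask(__name__)
    CANONICAL_HOST = "www.example.com"  # placeholder

    @app.before_request
    def enforce_canonical_host():
        if request.host != CANONICAL_HOST:
            # Permanently redirect (or return 404) for other hostnames.
            return redirect(f"https://{CANONICAL_HOST}{request.path}", code=301)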
FuturisticLover 2 days ago||
Canonical tags are done perfectly. I never changed them, and the blog is quite old too. I found a pattern where Google considers a page a duplicate because of the URL structure. For example:

    www.xyz.com/blog/keyword-term/
    www.xyz.com/services/keyword-term/
    www.xyz.com/tag/keyword-term/

So, for a topic, if I have two of the above pages, Google will pick one of them as canonical despite the different keyword focus and intent. And the worst part is that it picks the worse page as canonical, i.e., the tag page over the blog page, or the blog page over the service page.

hackerbeat 2 days ago|||
Totally. Bing works like a charm for all my sites, whereas Google fails them all, and they couldn't be more diverse.
Aldipower 2 days ago||
Yeah, Google search results are almost useless. How could they have neglected their core competence so badly?
adaptbrian 2 days ago|||
Because they shifted their internal KPIs around 2018 to keeping users on Google, rather than tuning toward users finding what they're looking for, i.e., clicking off Google.

This is what has caused the degradation of search quality since then.

Iulioh 2 days ago||||
Their core competency is ads, not search.
Aldipower 2 days ago|||
[flagged]
hyruo 2 days ago||
I encountered the same problem. I also use the Bear theme, specifically Hugo Bear. Recently, my blog was de-indexed by Bing. Using `site:`, there are no results at all. My blog had been running normally for 17 years without any issues before.
graeme 2 days ago||
Entirely possible the RSS validation failure triggered some spam flag that isn't documented, because documenting anti-spam rules lets spammers game them.

The amount of spam has increased enormously and I have no doubt there are a number of such anti-spam flags and a number of false positive casualties along the way.

Eisenstein 2 days ago|
If failing to validate a page because it points to an RSS feed triggers a spam flag and de-indexes all of the rest of the pages, that seems important to fix. By losing legit content because of such an error, they are lowering the legit:spam ratio, thus causing more harm than a spam page being indexed would. It might not appear so bad for one instance, but it is indicative of a larger problem.
quietfox 2 days ago||
I'll be honest, I read "Google de-indexed my Bear Blog" and was looking forward to discovering an interesting blog about bears.
xeonmc 2 days ago||
You may find rather unexpected results if you look for blogs with an interest in bears.
binarymax 2 days ago|||
Same. I still don’t know why the word “Bear” was used in the title.
saint_yossarian 2 days ago||
I guess they use this blogging platform: https://bearblog.dev/
binarymax 2 days ago||
Makes sense. Hadn't heard of that platform before but looks really nice.
Bengalilol 2 days ago||
Coming from a quietfox, it is OK. It is important to preserve oneself ^^.
p0w3n3d 2 days ago||
Sounds similar to https://news.ycombinator.com/item?id=46203343 in the sense that Google decides who survives in business and who does not.
cosmicgadget 2 days ago||
Also: https://news.ycombinator.com/item?id=40970987

https://gehrcke.de/2023/09/google-changes-recently-i-see-mor...

The wrong RSS thing may have just tipped the scales over to Google not caring.

cyberrock 2 days ago||
In the past I've heard that TripAdvisor has 60% market share for local reviews in the UK. Did Google Maps really climb that quickly? Are Instagram and TikTok not shaping tastes in London too? I feel like she might be assigning too much power to it just because that's what she used.

That's not to say I don't have gripes with how Google Maps works, but I just don't know why the other factors were not considered.

leoedin 2 days ago||
I don’t think I’ve met anyone in the UK who routinely checked tripadvisor for anything!

I just checked a few restaurants local to me in London that opened in the last few years, and the ratio of reviews is about 16:1 in favor of Google Maps. It looks like stuff that's been around longer has a much better ratio towards TripAdvisor though.

Almost certainly Instagram/TikTok are, though. I know a few places which have been ruined by becoming TikTok tourist hotspots.

dazc 2 days ago|||
'I don’t think I’ve met anyone in the UK who routinely checked tripadvisor for anything!'

Counterpoint: I have met people in the UK whose lives revolve around doing nothing but.

paganel 2 days ago|||
Not in the UK, but from Romania: I last checked TripAdvisor back in 2012, and that was for a holiday stay in the Greek islands. Google Maps has eaten the lunch of almost all of the entrants in this space, and I say that having worked for a local/Romanian "Google Places"-type of company back in 2010-2012 (after which Google Places came in, ~~stole~~ scraped some of our data and some of our direct competitor's data, and put us both out of that business).
Aldipower 2 days ago|
Google search also favors large, well-known sites over newcomers. For sites that have a lot of competition, this is a real problem and leads to asymmetry and a chicken-and-egg problem. You are small/new, but you can't really be found, which means you can't grow enough to be found. At the same time, you are also disadvantaged because Google displays your already large competitors without any problems!