How to Mitigate a Negative SEO Campaign? (support.google.com)
161 points by giles_corey on Oct 26, 2019 | 58 comments


I've submitted threads like this to the google webmaster forum, and the standard response boils down to:

- negative seo doesn't work; google can tell the difference between spam links/sites and legitimate ones

- it's probably an issue with your own site (ie. the deranking is justified)

- changes in rank are likely due to an algorithm update

I'm not sure if the mods on that forum are even google employees - they seem to just be random users, not sure how they are selected. I've never seen anyone on that forum acknowledge that negative seo is effective.


GWF is utterly useless for any non-trivial issues.

They have a mob of high-ranking posters upvoting each others' replies, patting each other on the back and ultimately ganging up on anyone who doesn't accept their replies.

We had an issue with a site incorrectly flagged as carrying malware. Submitted multiple review requests through the Webmasters console; all had no effect and produced no replies. Completely baffled by the situation, we posted to GWF. The top-rated GWF reply, and their "general consensus", was that we were morons who didn't know our way around basic server administration.

A couple of weeks later we discovered that the Webmasters console was plainly broken in the browser we used to submit review requests. You'd click on "Submit", it would go "inactive" and the page would reload after a bit, but nothing got sent out. We re-submitted the request with a different browser, it worked, and the issue got resolved in a matter of hours, but the GWF interaction left a very bad aftertaste. F-, won't do it again.


GWF is utterly useless, the only reason it is there is to provide the impression of support, not to actually give support. An actual support channel has a means of escalation, all the way to the top if that is what it takes to get a problem resolved. With GWF it's luck of the draw at best, and in most cases just a placebo.


I will sound like a broken record, but all google products have forums with these issues. You get nothing but cut-and-paste answers from regular users chasing internet points for validation.


You seem to be accusing a social media company of creating YASEC (Yet Another System of Echo Chamber). I do not see the controversy. This is what they normally do to solve their problems. Or are we in some new context of not accepting that any more?


"negative seo doesn't work"

That position is utterly silly to me.

How can Google defend the idea that "blackhat SEO" you do yourself will tank your site, while saying others doing the same exact thing to you won't work?

Are they claiming some magical ability to discern ownership and intent?


They're claiming that their tools allow you to see the 'blackhat' links and disown them, and if you do that then there's no effect, so in the end (as long as you're putting in that work) the negative SEO campaign would be worthless.

I have no strong opinion whether that claim is true or not in practice, but that's what's being implied.


Actually in most of these threads (I've read through a lot of them trying to diagnose my own issues) the google product experts say not to use the disavow tool, lest you accidentally disavow links that may have been helping you. (ie. presumably google is smart enough to ignore those links)

This runs counter to my anecdotal experience, which is that disavowing spam links has an almost immediate effect, if the rank fluctuation is caused by those links.


The "disown" bit is interesting. I do wonder about their ability to discern "oops" versus "someone is targeting me". And the idea that negative SEO is even happening in the first place.


Forum posts by G employees will show "Google Employee" above their name as seen here (the recommended answer) https://support.google.com/nonprofits/thread/13880126?hl=en


No one who is an SEO pro interacts on there.

I have seen unintended negative SEO: a site got hacked and thousands of pages of spammy porn links were added.

I suspect it's harder these days, but Google isn't perfect - they are only human and they make mistakes just as we all do sometimes.


I couldn’t quite tell from your comment: is your opinion that negative SEO is effective?


I used to build negative SEO campaigns for clients a few years back and they were fairly effective, though I don’t know if that’s still the case today. I’m sure it probably is.


yes I think negative SEO is effective. Context matters though: it's probably more effective against newer sites, sites with a weak backlink profile, and sites with more "naturally spammy" content, like SEO itself.


Exactly - a negative seo campaign against... REI’s website as a random example won’t do much.

But it’s not impossible to nuke a new-ish domain or a small business’s domain with tons of bad links through spam services if you know what you’re doing.


That response is insane. How on earth could he substantiate it further than the significant evidence he presented? Would it require an admission of guilt from the responsible party?


It sounds like the content is being duplicated, so I think the author can make copyright claims through the DMCA delisting tool:

https://www.google.com/webmasters/tools/dmca-dashboard


I spent some months occasionally sending these takedown requests to the sites themselves via contact forms and admins listed in whois. Then waited, and then sent the requests to the listed hosts, then waited, then did things like the google de-listing..

A similar process was laid out in the recommendations for doing a 'disavow' with google webmaster tools - the amount of time all this takes is not trivial.

Then, it happened again two months later, with a whole 'nother set of bad links from crappy sites.

Since then, it pretty much happens every month: a hundred or so new sites with crappy, likely expired or expiring domain names. Many appear to have names that would have once been used for link-ring seo - and were likely found to have stopped helping their clients' sites and instead started hurting them - so now the links are pointed at ours.

So it's about 50 - 100 new ones every month. I've submitted so many takedown requests and disavow lists - and the dumb specific UTF-8 formatting that they require - ugh. It's a major time suck, with no way to tell what is helping or hurting.
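
For reference, the format the disavow tool wants is a plain UTF-8 text file, one entry per line - # comments are allowed, and a domain: prefix disavows a whole domain. Something like this (the domains below are made-up examples):

    # october batch - example entries only
    domain:expired-linkring.example
    domain:spammy-seo-leftover.example
    # single URLs can be listed too
    http://hacked-blog.example/outbound-links.html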

I suppose if your site was attacked once and left alone it would be worth the effort, but if someone is serious about taking you down, the effort is better spent not trying to please google and instead focusing on better places to get found like snap, insta, fbook ads, etc.

I admit - my experience is a small data point, ymmv for sure.


For the past couple years, I've added a note in all the disavow uploads suggesting something like:

    # please let webmasters only get links counting when they approve them!
    # This would stop bad seo AND negative seo, right?
    # The webmaster console could have an option to only count links that
    # are approved in the console - if it was paid-for bad seo and they approved the
    # links then that would be proof. If it was a negative seo attack they would not be
    # approved and would not count against the site.

Not only would this stop negative seo attacks, it would also make it explicit if a website was trying to use shady seo - only manually approved links would count - so if someone checked off 300 comment links and 100 wikis or whatever, there would be no doubt about the intent..

It would make shady seo and negative seo harder, and make it easier for those getting attacked.

Today it also hit me that if this went through, it could perhaps prevent google bombing ("miserable failure" and such) as well - and I can see reasons that could be considered good, and reasons others might not want to make that optional.

I read a while back that googlers most likely won't be reading any of the disavows or the info contained within, but I get the same feeling from posting in the webmaster forums - it's purposefully made so no one knows if google has seen the posts.

I'm sure there are good reasons for that, like secrets of the algo, and legal reasons for avoiding things... but it's really hurtful to so many.

One of the last times I got into a thread, after research from some 'top posters' or whatever they are called, they just said something like 'with some key phrases there is and has been so much spam and so many different spam techniques that they just freeze the results, put a few at the top and the rest far down.' So there is no hope of getting those changed.

We have more legitimate questions about certain things, and debated in house whether we might do better by using a bunch of google's things (analytics, fonts, tag mgr, all the things) - that seemed not fair at the time. Then a few weeks ago someone posts on HN about seo and one of the main pieces of advice is 'use all of google's scripts'..

It becomes a conflict of interest I think, and a conflict for our users and their privacy with sensitive subjects - yet others are willing to do that all day for the G traffic - and they get top results.

It's been a funny (and not so funny) thing at the home office seeing a top sex result running google ads. There are so many conflicts when trying to do certain things with google - and trying to do what's right is not what is shown to the world as working, it's frustrating.

I suppose a big goal for the g spam team has been to try to make figuring things out really frustrating for seo people - and I have seen the result, watching so many others explain how they got good results for a while and then got kicked out.. and not just with bad links, but with good content.. so the goal of frustrating so many has been achieved.

On the flip side, google has publicly said for a long time: make a good site with content the visitors want to enjoy, don't buy links, and you will do fine in the results. Well, this does not seem true today and has not been for years now.

again, small data point, ymmv


The content is being repeatedly duplicated on random hacked websites. I presume that if the websites the person knew about were removed, others would pop up.

Edit: I don't see this approach having a ghost of a chance to work and it seems ignorant of the situation described in the OP.


On a related topic, I wonder how Google can differentiate cases where your content is duplicated but the site duplicating it doesn't adhere to standards such as setting a canonical URL back to you, and you as the "victim" are ok with the duplication.

For example, a lot of popular podcast platforms will wrap your notes on their public crawlable site but don't set a canonical URL back to your site.

From Google's POV this must look like duplicate content.
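
For reference, the signal in question is the rel=canonical link element - a syndicating site that wanted to point back at the original would put something like this in its <head> (URL made up):

    <link rel="canonical" href="https://example.com/episode-notes/42">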


DMCA claims might not affect the ranking of the page (though whether they affect duplicate-content checks is anyone's guess), since a search for free movies[0] shows the Lumen DMCA takedown notice[1] even on the first page of results.

0: https://www.google.com/search?q=watch+spider+man+online+free

1: https://i.judge.sh/bright/CozyGlow/chrome_IJqvwxiRT1.png


This. And also send DMCA notices to the websites hosting it; you should be able to get everything removed fairly quickly.


I think there are two possibilities here:

- the original website is targeted by attackers seeking to push it down the Google results list

- the original website is a side-effect victim of spammy/malware-infested websites that copy content from successful pages to get listed in Google results and attract visitors.

Does that second option make any sense?


Since there isn't an obvious monetary incentive to push the site down, I'm thinking the second option makes a lot more sense?

It's not uncommon for authoritative content to be plagiarized and one needs to be ever vigilant and be prepared to take direct action rather than relying upon the wisdom of google's algorithms (which seem to be almost wholly ineffective in terms of discriminating against such practices).


There may be a monetary incentive for a polygraph manufacturer to push the site down.


There are also hundreds of law enforcement agencies worldwide who rely on the myth of the "lie detector" in their interrogations.


Not worldwide, no. In the US mainly and perhaps a few others, but that's about it.


> Does that second option make any sense?

No, because the hacked sites only display the fraudulent content to web crawlers, and explicitly hide it from legitimate users.


I'm not saying he doesn't have a problem, but his site does appear above the others in search results for that quotation. "The consensus view among scientists is that polygraph testing has no scientific basis"

It's such a specific quotation that there aren't that many sites that have that text, so the spam sites are gonna appear.

Now if those sites outrank his site for regular search terms "what is a polygraph" ... then that's a problem. But he didn't seem to indicate that's the case.


The spam sites that replicate the original page's content make the original page look like low value content, too.


He does indicate it:

> With the text of my homepage replicated across hundreds of such modified pages, it has essentially disappeared from Google for key search terms for the site, such as "polygraph."


ok. then right, it's the vaguer question of whether his site deserves to rank higher than it does for a particular term.

on brief glance, it doesn't look like he optimizes for "polygraph". He does rank #1 for antipolygraph.


I do not consider the question of whether actual authoritative content should outrank hacked pages with duplicate content to be "vague".


but that's not the issue here. There is no example where duplicate content is outranking his. In the quoted query, he ranks higher. Beyond that he says he doesn't rank well for "polygraph", but not that it's outranked by the spammy content.


The timing is also interesting, as Google has recently updated their search engine results to be more "context aware", which may result in sites like that dropping in the rankings since their subject matter is actually a bit different than what people searching for "polygraph" might expect.


The attacker's goal isn't to have the spam sites outrank his. The goal is to have his lower ranked, I assume so that when people search about polygraphs, they see top-ranked sites that are pro-polygraph instead of anti-polygraph.

The attackers seem to have achieved that goal.


Around a year ago, one of my sites got 'attacked' by a negative SEO campaign.

It's annoying that it can happen. If someone wants to hurt your brand they can do it in more ways than one.

Spam Reddit till a domain gets blocked, buy cheap backlinks on Fiverr, etc.

I'm not sure how well google is responding to negative SEO.


> If a visitor's user agent doesn't match a selected search engine, the browser is redirected either to the hacked website's normal homepage, or to an evidently fake, recently created, online marketing website

Could someone explain this bit to me? Don't follow it.


1. Site is compromised.

2. When a search engine crawler goes to the site, it displays the attacker's content.

3. When a 'normal' user visits the site, it will display the original site or a generic marketing site.

It’s a tactic to prevent the original owner from knowing their site was hacked.
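
A minimal sketch of that logic, assuming the compromised site runs something like a Flask app (route and template names made up):

    from flask import Flask, redirect, render_template, request

    app = Flask(__name__)

    CRAWLER_TOKENS = ("Googlebot", "bingbot", "DuckDuckBot")

    @app.route("/<path:page>")
    def serve(page):
        ua = request.headers.get("User-Agent", "")
        if any(token in ua for token in CRAWLER_TOKENS):
            # crawlers get the injected spam so it ends up in the index
            return render_template("spam_payload.html", page=page)
        # everyone else is bounced to the homepage, hiding the hack
        return redirect("/")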


It’s called cloaking. Spider gets one page, visitor gets a different one.


And search engines do check for this - they don't only crawl as Googlebot.
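
The UA string alone proves nothing either way, which is why Google documents a stronger check for whether a visitor really is Googlebot: a reverse DNS lookup followed by a forward confirmation. A rough sketch in Python:

    import socket

    def is_real_googlebot(ip):
        # reverse lookup: genuine Googlebot IPs resolve under googlebot.com / google.com
        try:
            host = socket.gethostbyaddr(ip)[0]
        except OSError:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # forward confirmation: the hostname must resolve back to the same IP
        try:
            return ip in socket.gethostbyname_ex(host)[2]
        except OSError:
            return False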


One guy contacted me - he was running a pirated Microsoft Windows key blog. Exactly the same thing. Since he was doing piracy, he didn't have the guts to report it to google.

Everything was the same as you stated. The only difference was that users coming from google were redirected to some site selling Microsoft keys.

I tried to research it more, but since it was related to piracy I didn't want to get involved much. Still couldn't find any solution for it.


Why is Google still taking 301 redirects at face value?!

They were doing that 15 years ago and it was causing headaches back then...

Faked pagerank 9 domains used to pull a pretty penny on ebay...


Cui bono?

That is, who could possibly benefit from hiding anti-polygraph sites? Why, only all the entities in the world who use the polygraph to intimidate people; i.e. all the world's collected military and intelligence agencies, plus any really large corporations either with ties to them or who use the polygraph themselves.

I don’t think he can expect any help from Google.


Google:

Seen 3:24pm


Google appears to repeat the mistakes of its predecessors (those which allowed google to surpass Altavista for example)


I have various issues with google and the way they approach these issues, but they're not in the same galaxy as Altavista.

Google surpassed Altavista because it fundamentally brought a better way of indexing with it, and survived to dominate because it paired that with an incredibly insightful path to revenue.

Are memories really that short? (Am I really that old?)

We can argue around the edges, but I think any modern search engine user would be amazed at how bad bad bad search engines were for the basics of answering your search query, before google.


Google made a big hoopla about other search engines including paid results in their listings. Google would never do such a thing. Until they did.


Google clearly marks ads as ads. I believe the criticism was that search engines of the past would allow paid results to appear organic.

Has Google started doing this and I’m unaware?


Google was never going to add any ads in their search results at all.

So that went away. Then they kept on fiddling with how the ads are marked to make it harder to spot the ads, including using colors already in use for other UI elements in organic results. Then they started advertising their own properties in advantageous positions (at the top) whether or not the results were all that relevant, and did not mark these as ads.


> Then they started advertising their own properties in advantageous positions (at the top) whether or not the results were all that relevant, and did not mark these as ads.

Where and when did this happen?

I recall the early days of Google, and they always had the ads appear first. I agree that it used to be very clearly delineated from the rest of the results, and that this has become more subtle over time, but as far as I know they have never been disguised or hidden, which is what you're asserting here. Can you provide a specific example or screenshot?


Altavista was, surprisingly enough, just a non-monetized demo for their hardware.

It wasn't a full search-engine, although it looked like it.

It was more like a very popular demo.


I don't recall negative SEO campaigns being a major mistake Alta Vista made that allowed Google to surpass it.


Could you be a little more specific? What mistakes are you referring to?


OK, so what should Google do about it?

From the outside view, you took your own website (and domain you own) and began low quality black hat SEO.

It's like expecting Microsoft to step in when somebody got into your Windows computer by putting in the correct password.


>From the outside view, you took your own website (and domain you own) and began low quality black hat SEO.

Huh? The black hat SEO wasn't happening on his site, but on a ton of hacked sites unrelated to him.


You've missed the point. This is more similar to identity theft.



