It’s a predictable scene: marketers find a tactic that works and exploit the hell out of it, Google issues a warning telling them to stop, everyone assumes they won’t get hit with a penalty, and then a lot of sites do get hit.
Wash, rinse, repeat. It’s playing out right now with Google’s review penalty.
Each major algorithm update helps Google get significantly better at catching manipulation of its search results, but after each update, marketers themselves help Google find violations that its algorithm missed.
It works like this: after seeing other websites get penalized, every marketer who has used the same tactics immediately updates their site to avoid the same penalty. Here’s the thing—Google already knows exactly what was on your website before, so when you change it following a public warning to eliminate violations, you’re sending signals that Google can use to make its algorithm more sensitive in the future.
I’m not saying you shouldn’t remove things that violate Google’s guidelines—things that will harm your website. You most certainly should.
I’m saying that doing so helps Google to make a better algorithm.
When Google analyzes sites that weren’t penalized but were updated to comply immediately after a penalty, it gets better at identifying them. This lets it find more websites using questionable tactics, with less risk of the massive, widespread collateral damage we saw with previous algorithms like Penguin.
And maybe, just maybe, if they get good enough, marketers will be forced to focus on providing real value instead of using tricks to get exposure.