Bing was sending traffic the whole time. What is the most interesting part of all this? Google most likely did not take manual action. There was no message in Google Search Console, and the two-stage drop in traffic made us skeptical that any manual intervention was involved.

We have seen this pattern repeatedly with pure AI content:

- Google indexes the pages.
- Traffic ramps up quickly, with consistent gains week after week.
- Traffic peaks, then declines rapidly.

Another example is Causal.app. In this "SEO heist," a competitor's sitemap was pulled and AI was used to generate articles at scale. The operation followed the same pattern. For several months, traffic climbed. Then came a pause, followed by a decline of about %. And what followed? A crash that wiped out almost all traffic. Within the SEO community there is some debate about whether this drop was a manual action triggered by all the press coverage
it received. We believe the algorithm did the work.

Perhaps an even more interesting case involves LinkedIn's collaborative articles about AI. Created by LinkedIn, these AI-generated articles encouraged users to contribute corrections, additions, and fact-checking, and top contributors were rewarded with a LinkedIn badge for their efforts. As in the cases above, traffic rose and then fell. However, LinkedIn retained some of its traffic.

The data suggests these traffic swings were driven by an algorithm rather than manual action. Once edited by humans, some of LinkedIn's collaborative articles clearly met the definition of helpful content; others, by Google's estimation, did not. Maybe Google got it right in this case.

If it's spam, why does it rank at all? Ranking is a multi-stage process for Google. Time, data-access restrictions, and cost prevent it from applying its most complex systems to every document up front. Of course, document review never stops.
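To make the multi-stage idea concrete, here is a minimal sketch of a two-pass ranker. Everything in it is an assumption for illustration: the names, the 0.3/0.7 blend, and the notion of a single interaction-quality number are ours, not anything Google has published.

```python
# Hypothetical two-stage ranking sketch. A cheap content-based score
# ranks documents at index time; an expensive interaction signal is
# blended in once it exists. All names and weights are invented.
from dataclasses import dataclass

@dataclass
class Document:
    url: str
    predicted_quality: float                  # cheap first-pass guess
    interaction_quality: float | None = None  # arrives later, if at all

def refined_score(doc: Document) -> float:
    """Fall back to the first-pass guess until interaction data exists."""
    if doc.interaction_quality is None:
        return doc.predicted_quality
    # The 70/30 blend is an assumption; real systems tune this constantly.
    return 0.3 * doc.predicted_quality + 0.7 * doc.interaction_quality

def rank(docs: list[Document]) -> list[Document]:
    return sorted(docs, key=refined_score, reverse=True)

docs = [
    Document("new-site.example/page", predicted_quality=0.9),
    Document("old-site.example/page", predicted_quality=0.6,
             interaction_quality=0.8),
]
for d in rank(docs):
    print(d.url, round(refined_score(d), 2))
```

The point is simply that a content-based guess can carry a document's ranking for a while before the slower, more expensive signal catches up, which is exactly the window in which pure AI content can thrive.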
This is why the pattern repeats itself: content can pass the initial "sniff test" and only be identified later. We can look at some evidence for this claim.

A while ago we examined Google's "Page Quality" patent and how it uses user-interaction data to generate a ranking score. When a site is brand new, users have not yet interacted with its content on the SERPs, so Google has no signal about the quality of that content. That situation, however, is covered by another clever patent, "Predicting Site Quality." To simplify: for a new website, the quality score is predicted by first obtaining the relative frequency of each of the different phrases found on the site. These measures are then mapped against a previously generated phrase model, built from the quality scores of previously ranked sites.

If Google still uses this tactic, many new websites would be ranked on a "first guess" basis, with predicted quality metrics fed into the algorithm. Only later would the rating be refined with user-interaction data. In practice, we have noticed that Google often boosts the rankings of new websites for what looks like a "test period." Our theory was that a measurement was underway during that time.
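As a rough illustration of the logic described in "Predicting Site Quality", not its actual implementation, here is a toy version in Python. The phrase model, its scores, and the bigram/trigram extraction are all invented for the example.

```python
# Toy sketch of "Predicting Site Quality": map the relative frequency
# of a new site's phrases against a phrase model built from the quality
# scores of previously ranked sites. All data here is made up.
from collections import Counter
import re

# Hypothetical phrase model: phrase -> quality score learned from sites
# that were already scored by other means.
PHRASE_MODEL = {
    "in conclusion": 0.35,
    "buy now": 0.20,
    "step by step": 0.70,
    "peer reviewed": 0.85,
}

def phrases(text: str, ns=(2, 3)):
    """Yield the site's bigrams and trigrams (a simplification)."""
    words = re.findall(r"[a-z']+", text.lower())
    for n in ns:
        for i in range(len(words) - n + 1):
            yield " ".join(words[i : i + n])

def predicted_site_quality(pages: list[str]) -> float:
    """Frequency-weighted average of the model scores of known phrases."""
    counts = Counter(p for page in pages for p in phrases(page))
    total = sum(counts.values()) or 1
    known = {p: c / total for p, c in counts.items() if p in PHRASE_MODEL}
    if not known:
        return 0.5  # no overlap with the model: assume an average prior
    weight = sum(known.values())
    return sum(freq * PHRASE_MODEL[p] for p, freq in known.items()) / weight

print(predicted_site_quality(["A step by step guide with peer reviewed sources."]))
```

Under a scheme like this, a brand-new site written to statistically resemble previously high-scoring sites would earn a strong "first guess," which fits the test-period behaviour described above.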