I'm not sure if you've read the AI 2027 report, but that's what this post is responding to. The anthropomorphizing of AI isn't the author's doing; it's the entire premise of the AI 2027 forecast.
Agreed. PAC isn't really a new idea at its core; I think the author says that directly. But it's also an idea that hasn't really been tried with the tech now available. The promise depends on minimizing computing power, making the system more contextual, and piggybacking on 3D printing for open-source enclosures.
It’s a lucrative industry if you can sell yourself, and Google makes it a necessary one by changing the algorithms every few months. There will always be a cohort of losers wanting to reclaim a coveted top-5 position. But the landscape has completely changed, and that’s what I really liked about this article; it’s just going to take leadership a few more years to figure this out.
Of course the best advice we received was from folks who had recently worked at Google on search.
Absolutely. I attended a webinar yesterday on SEO/content marketing from a very well-known/established SEO "expert" and it was shockingly 101. However, one detail I did appreciate was hearing how long-term the standard inbound, organic content strategy is. Many have said this for years, but they showed some good data to really hammer home the point, folding in the reality that AI content farms make it all the more difficult. The other small but useful detail was data emphasizing how counterproductive dating content is: the older a piece's publication date relative to the present, the less likely it is to be indexed and/or ranked. The advice I'd heard until this point was to update and republish content. This webinar showed data proving that publishing dateless content outperforms that approach.
This is a great point, as far as the understanding of "bot" is concerned. Although I think the overall point is that the bots create an algorithmic determinism in how information gets surfaced.
I do agree with the overall trend the author is observing, but I guess what I was getting at is that this is sort of an old problem extending to the web.
There's a unique social stigma around "bots" that isn't applied the same way to power users of any other system (understandably so, given some are nefarious). I believe this largely gave rise to AI-powered bots, as there's demand for bots that behave as humanly as possible to 1) minimize obstructions like 403s and 2) maximize the extraction of information.
Maybe if web servers were broadly designed with bots in mind as power users, the web would bifurcate into an "optimized web" for bots (APIs) and a "traditional web" for humans (HTML), along the lines of the sketch below. Instead, we're getting this mess of bots and humans all trying to squeeze through the same doorway.
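To make the "same doorway" point concrete, here's a minimal sketch (assuming Flask; the route and data names are hypothetical) of one endpoint doing content negotiation: clients that ask for application/json get the "optimized web", everyone else gets HTML.

    # Minimal content-negotiation sketch: one URL, two representations.
    # Assumes Flask; route and data names are hypothetical.
    from flask import Flask, request, jsonify, render_template_string

    app = Flask(__name__)

    ARTICLE = {"title": "Bots as power users", "body": "Same content, two doors."}

    @app.route("/article")
    def article():
        # Pick the best representation the client asked for via the Accept header.
        best = request.accept_mimetypes.best_match(["application/json", "text/html"])
        if best == "application/json":
            # The "optimized web": structured data for bots and scripts.
            return jsonify(ARTICLE)
        # The "traditional web": HTML for humans.
        return render_template_string("<h1>{{ title }}</h1><p>{{ body }}</p>", **ARTICLE)

    if __name__ == "__main__":
        app.run()

A request sent with an Accept: application/json header would hit the JSON branch, while a browser (which prefers text/html) would get the page; same URL, no scraping required.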