My guess is it will be OK for the text, but links will be out of whack so bots won't really "crawl" the site. Then again, maybe advanced bots will run the js?
Also, words in h-tags are usually "counted" as more important, so that logic won't work with raw .md.
There is no text; the page is empty. Linking isn't the first priority at this point.
A robot would already need to run the JS in the first place just to read the text.
But the concept is interesting: let's build a CONTENT management system that is completely invisible to any search bot, so our CONTENT never ever gets referenced or becomes searchable on the Internet.
Yeah, Google does, but what about the thousands of other robots?
Sure, Google is the biggest one and you have to be referenced on it, but there are also other indexes where you want to be listed, and those may not use robots as advanced as Google's.
So, as I said: for a normal robot crawler visiting the page, the page is empty. No content.
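To make that concrete, here's a rough sketch of what a non-JS crawler extracts from a typical client-rendered app shell. The HTML below is a hypothetical example (a mount-point div plus a script tag, which is all many SPAs ship); the parser mimics a crawler that reads markup but never executes JS:

```python
from html.parser import HTMLParser

# Hypothetical HTML shell of a client-side-rendered app:
# the body is just a mount point plus a <script> tag.
SPA_SHELL = """
<html>
  <head><title>My App</title></head>
  <body>
    <div id="app"></div>
    <script src="/bundle.js"></script>
  </body>
</html>
"""

class TextExtractor(HTMLParser):
    """Collects visible text the way a non-JS crawler would see it."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

extractor = TextExtractor()
extractor.feed(SPA_SHELL)
print(extractor.chunks)  # → ['My App'] — only the <title> survives; the body yields nothing
```

All the crawler gets is the title. Every heading, paragraph, and link it would normally index simply isn't in the document it fetched.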
That Google has already solved the problem does not mean everyone else has.
Why do you think prerender.io exists? To solve that very same problem.
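The core idea behind that kind of service can be sketched in a few lines: detect bot user-agents and hand them a pre-rendered static snapshot, while browsers get the normal JS shell. Everything here (the marker list, the snapshot strings) is illustrative, not prerender.io's actual implementation:

```python
# Crude user-agent sniffing; real middleware is more thorough
# (and historically also honored the ?_escaped_fragment_ scheme).
BOT_MARKERS = ("googlebot", "bingbot", "yandexbot", "duckduckbot", "baiduspider")

# Hypothetical responses: the empty JS shell vs. a pre-rendered snapshot.
SPA_SHELL = '<div id="app"></div><script src="/bundle.js"></script>'
PRERENDERED = "<h1>Hello</h1><p>Actual content, rendered ahead of time.</p>"

def is_bot(user_agent: str) -> bool:
    """True if the user-agent string looks like a known crawler."""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

def serve(user_agent: str) -> str:
    # Bots get the static snapshot; everyone else gets the SPA shell.
    return PRERENDERED if is_bot(user_agent) else SPA_SHELL
```

With that in place, `serve("Mozilla/5.0 (compatible; Googlebot/2.1)")` returns real indexable content, while a regular browser still gets the JS app.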
From an SEO point of view, which was the question I was answering, it is ridiculous to serve a page without content to an indexing robot crawler.