>It's dangerous because it gives individuals the power to provide answers when people ask questions out of curiosity they don't know the answer to, which is when they are most able to be influenced.
Can you point to a time in history of written information this wasn't true though?
- Was the Library of Alexandria open to everyone? My quick check says it was not.
- Access to written information already presupposed an education of some form.
- Do you think Google has an internal search engine that is better than the one you see? I suspect they have an ad-less version that is better.
- AI models are an easy one, obviously. Big players and others surely have hot versions of their models that would be considered too spicy for consumer access. One of the most memorable parts of AI 2027 for me was the idea that you might want to rise high in government to get access to the superintelligent models, and use them to maintain your position.
The point is, that last example isn't the first time this has been true.