This will create an excellent search engine but a terrible reasoning machine.
There are plenty of ways to search through docs and support tickets now. An LLM's ability to draw inferences from and summarize all of that information comes from being trained on an enormous corpus with billions of parameters. The data can be highly specialized; there just needs to be several terabytes of it for the model to do things that are rare and useful.