Post search on X is done the same way as with data from any other source: you use RAG and function calling to insert the context.
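Roughly, the loop looks like this. This is a minimal sketch using the OpenAI-style tool-calling API; the `search_x_posts` tool and its canned results are hypothetical stand-ins for whatever retrieval backend actually serves the posts:

```python
import json
from openai import OpenAI

client = OpenAI()

# Hypothetical retrieval backend: in practice this would hit a post search index.
def search_x_posts(query: str) -> list[dict]:
    return [{"author": "someone", "text": f"example post about {query}"}]

# Describe the tool so the model can decide to call it.
tools = [{
    "type": "function",
    "function": {
        "name": "search_x_posts",
        "description": "Search recent posts on X for a query",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user", "content": "What are people saying about the new release?"}]
resp = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages, tools=tools)
msg = resp.choices[0].message

if msg.tool_calls:
    call = msg.tool_calls[0]
    args = json.loads(call.function.arguments)
    # Run the search and insert the results back into the context (the RAG step),
    # then let the model answer grounded in the retrieved posts.
    results = search_x_posts(args["query"])
    messages.append(msg)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": json.dumps(results),
    })
    final = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```

Any model that can emit a well-formed tool call can sit in that loop; the retrieval side doesn't care what size the model is.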
< 7B open source models can function call very well. In fact, Nous Hermes 2 Pro (7B) benchmarks better at that than GPT-3.5.
It's not related to model size, if I'm not mistaken.