“There is a big difference between something that is linguistically useful, and that which is accurate and precise,” says Jay van Zyl, ecosystem.Ai’s founder.
It’s useful, but you need factual accuracy
The primary goal of an LLM is to be linguistically useful, using a combination of next-word prediction and learned insight into semantics, grammar and tone. Factual accuracy is not built into an LLM’s natural structure, says van Zyl, and therefore requires a host of additional tools to guide it towards truth.
ecosystem.Ai’s proprietary Fact-Injection tool solves this problem for industries that benefit from the use of LLMs, but can’t take on the risk of factual slip-ups.
How RAG anchors GenAI in truth
To anchor GenAI in truth, LLMs require additional tools. Retrieval Augmented Generation (RAG) is one such tool, described by IBM’s director of language technology, Luis Lastras, as “the difference between an open-book and closed-book exam”. As the name suggests, RAG is an AI framework that improves the quality of an LLM’s response by directing the model to external sources of knowledge, meaning the model does not need to remember or guess facts – it only has to fetch them. A student writing a closed-book exam depends on her general knowledge, producing answers that are less populated with specific facts and that talk around an idea rather than getting straight to the point. An open-book exam, representing RAG, produces answers that are far richer in factual information and backed by credible sources.
Following this, it is clear that RAG holds a lot of potential for industries that depend on highly accurate information. By increasing control over what information your LLMs have access to, you reduce the risk of sensitive data leaks and of misinformation spread through hallucinations.
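The open-book idea above can be sketched in a few lines of Python. This is a minimal, illustrative RAG loop – not ecosystem.Ai’s implementation – in which a toy word-overlap scorer stands in for a real retriever, and the retrieved facts are prepended to the prompt before the model is called. All names and the knowledge base are hypothetical.

```python
# Minimal RAG sketch: retrieve relevant passages from a small knowledge
# base, then prepend them to the prompt so the model answers "open book".
# The knowledge base and overlap scorer are purely illustrative.

KNOWLEDGE_BASE = [
    "Product X carries a flat fee of R5 per transaction.",
    "Savings accounts accrue interest monthly.",
    "Product Y offers unlimited free transactions for a R99 monthly fee.",
]

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Rank passages by simple word overlap with the query (toy scorer)."""
    query_words = set(query.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda p: len(query_words & set(p.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str) -> str:
    """Augment the user query with retrieved facts before calling the LLM."""
    context = "\n".join(retrieve(query))
    return f"Answer using only these facts:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("What does Product X cost per transaction?")
```

A production retriever would use embeddings or a search index rather than word overlap, but the shape is the same: fetch first, then generate.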
Fact-Injection for augmented accuracy and usefulness
ecosystem.Ai takes RAG one step further with the platform’s proprietary Fact-Injection capability. Our founder, Jay van Zyl, describes Fact-Injection as “an intelligent way of dealing with RAG”.
Fact-Injection allows LLMs to not only ‘retrieve’ facts, but to do so with intentionality and business logic. It works by setting smart triggers that tell the LLM:
- What kind of information to fetch,
- From where, and
- When during a customer journey to do it.
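The what/where/when triggers above can be sketched as a small rule structure. This is a hypothetical illustration of the idea, not the actual ecosystem.Ai configuration schema; the field names, sources and conditions are all invented for the example.

```python
# Hypothetical sketch of the what/where/when trigger structure described
# above. Field names and trigger conditions are illustrative only.

from dataclasses import dataclass
from typing import Callable

@dataclass
class FactTrigger:
    what: str                      # kind of information to fetch
    where: str                     # source to fetch it from
    when: Callable[[dict], bool]   # condition on the customer journey

triggers = [
    FactTrigger(
        what="transaction_fees",
        where="product_catalogue_db",
        when=lambda ctx: ctx.get("intent") == "product_enquiry",
    ),
    FactTrigger(
        what="spend_summary",
        where="transaction_history_db",
        when=lambda ctx: ctx.get("journey_step") == "recommendation",
    ),
]

def active_triggers(context: dict) -> list[FactTrigger]:
    """Return the triggers that fire for the current journey context."""
    return [t for t in triggers if t.when(context)]

fired = active_triggers({"intent": "product_enquiry"})
```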
As described by ecosystem.Ai developer Ramsay Louw, Fact-Injection is like wiring your AI environment with specific business knowledge, then configuring rules and triggers that activate that knowledge at the right moment.
When used in combination with our Ecogentic Agent Builder, Fact-Injection intervenes at different points within a customer journey by setting triggers for an LLM to retrieve a certain kind of information from a certain source. One example of a trigger is intent detection: a specific word or transaction pattern may indicate the intent behind a request, prompting the Fact-Injection capability to trigger retrieval from the matching source.
Simply put, Fact-Injection “injects a fact” into an LLM at runtime to help it select the correct source and the correct type of information. Fact-Injection is primarily used on the back-end in industries like financial services, shaping what the AI model draws on to generate responses.
However, Fact-Injection can also be used on the front-end in cases like conversational bots, with responses enhanced by facts about the customer in real time. For example, the customer’s Spend or Money Personality, or most recent transaction, may be used to shape the response generated by a chat interface.
For example, in a financial services application:
A customer approaches your chat interface asking about a certain product. Transactional data indicates that the customer’s spending has increased over the last 3 months. This fact is ‘injected’ at runtime when your system makes a recommendation. The result might look something like:
“Your average spend for the last 3 months has gone up a lot: R3000, compared to your annual average spend of R1500. You might be interested in using our X product, which reduces transaction costs and is a good fit for your current spend behavior.”
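The mechanics behind the example above might look something like the following sketch: customer facts fetched at runtime are filled into a prompt template before the model generates its recommendation. The field names, figures and template are taken from or modelled on the example, and are illustrative only.

```python
# Sketch of injecting customer facts into a prompt at runtime, as in the
# example above. Customer fields and the template are illustrative.

customer_facts = {
    "avg_spend_last_3_months": 3000,
    "avg_spend_annual": 1500,
    "recommended_product": "Product X",
}

def inject_facts(template: str, facts: dict) -> str:
    """Fill the prompt template with customer facts before generation."""
    return template.format(**facts)

prompt = inject_facts(
    "The customer's average monthly spend over the last 3 months is "
    "R{avg_spend_last_3_months}, against an annual average of "
    "R{avg_spend_annual}. Recommend {recommended_product} and explain "
    "how it reduces transaction costs at this spend level.",
    customer_facts,
)
```

The LLM then generates from this fact-laden prompt rather than from its general knowledge, which is what keeps the figures in the reply grounded.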
To sum it all up
Fact-Injection not only improves the trustworthiness of your generative models, but also ensures that the information provided is useful and relevant. For industries that cannot risk inaccuracy, the increased control this ecosystem.Ai capability provides means you can take advantage of the benefits of GenAI while minimising the downsides.
