Memory
SUTRA models have a unique memory component that enables them to retain context, allowing for conversational, coherent, and consistent dialog. This memory lets them recall and leverage information from earlier conversations or text segments. It is pivotal for sustaining coherence and relevance throughout a conversation, making SUTRA exceptionally adept at tasks that demand retention and synthesis over prolonged dialogs.
Strong Recall Capabilities
Following is an example conversation between SUTRA and a user about climate change.
User: What's the main cause of climate change?
SUTRA: Burning fossil fuels like coal and oil.
User: Does it affect the weather?
SUTRA: Yes, it leads to more extreme weather events.
User: Can you give me an example?
SUTRA: Sure, like the 2019-2020 Australian wildfires.
SUTRA exhibits an understanding that builds across exchanges, using information from earlier responses to inform later ones. It retains context within a dialog to provide coherent and relevant answers, illustrating its ability to "recall" information grounded in extensive training data. This ability is crucial for generating informative and context-aware responses, albeit within the transient memory limits of a single interaction, ensuring adherence to privacy and ethical AI standards.
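The turn-by-turn retention above can be sketched as a running message history that the model conditions on at every turn. The sketch below is illustrative only: the `generate()` stub stands in for the actual SUTRA model, and the `Conversation` class is a hypothetical wrapper, not part of any official SUTRA SDK.

```python
# Minimal sketch of conversational memory: every user turn is appended to a
# running history, and the model sees the full history on each call.

def generate(history):
    # Stub standing in for SUTRA: a real model would condition its reply
    # on every message in `history`, resolving pronouns like "it".
    return f"(reply conditioned on {len(history)} prior messages)"

class Conversation:
    def __init__(self):
        self.history = []  # list of (role, text) tuples, oldest first

    def ask(self, question):
        self.history.append(("user", question))
        answer = generate(self.history)
        self.history.append(("assistant", answer))
        return answer

chat = Conversation()
chat.ask("What's the main cause of climate change?")
chat.ask("Does it affect the weather?")  # "it" is resolvable from the history
print(len(chat.history))  # four messages now retained
```

Because the whole history is resent each turn, a follow-up like "Can you give me an example?" arrives with the earlier fossil-fuel and extreme-weather exchanges already in scope.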
Remembering the Context
Following is an example conversation between SUTRA and a user who wants to travel to San Francisco (assuming the user's location is known).
User: How close is San Francisco?
SUTRA: San Francisco and San Jose are both located in the state of California, USA. It is very close by, about 50 miles to the north.
User: I would like to reach there before midnight
[System Message]: Showing result for "I would like to reach San Francisco before midnight"
SUTRA: Well you can easily reach there within an hour. Just be sure to drive safe. Need any tips?
User: yeah, shortcuts please
[System Message]: Showing result for "yeah shortcuts please -> Can you suggest any shortcuts to reach San Francisco before midnight from San Jose?"
SUTRA: Sure, if you are in a hurry, take Highway 101. It will take you straight into the heart of SF.
Note how SUTRA not only understood the context but also reformulated the question internally to provide a relevant response. In a traditional Retrieval-Augmented Generation (RAG) approach, models focus primarily on the information provided in context through in-context learning. SUTRA takes this a step further by dynamically updating the query, demonstrating detailed awareness of the conversation.
This example illustrates SUTRA's adeptness at retaining and utilizing information from earlier parts of the conversation, a testament to its sophisticated memory capability. By referencing the previous question, SUTRA reformulates the follow-up into a precise, self-contained instruction while keeping the conversation informal. This allows SUTRA to maintain a coherent and contextually relevant dialog, demonstrating its ability to engage in complex and continuous exchanges and ensuring that conversations flow naturally and informatively.
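The reformulation step shown in the system messages can be sketched as rewriting an elliptical follow-up ("yeah, shortcuts please") into a standalone query using facts accumulated earlier in the conversation. The state keys, trigger condition, and template below are hypothetical illustrations of the idea, not SUTRA's internal mechanism.

```python
# Illustrative sketch of follow-up reformulation: an elliptical user turn is
# expanded into a self-contained query using conversation state gathered
# from earlier turns (origin, destination, deadline are assumed keys).

def reformulate(followup, state):
    # Toy trigger: a real system would detect ellipsis more generally.
    if "shortcut" in followup.lower():
        return (f"Can you suggest any shortcuts to reach "
                f"{state['destination']} before {state['deadline']} "
                f"from {state['origin']}?")
    return followup  # already self-contained; pass through unchanged

state = {"origin": "San Jose", "destination": "San Francisco",
         "deadline": "midnight"}
print(reformulate("yeah, shortcuts please", state))
```

Running this prints the expanded query seen in the second system message, showing how a terse follow-up becomes an explicit, answerable question once conversation state is folded in.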