Interaction Loop
This section explains how to create an interaction loop in Colang 2.0.
Usage
In many LLM-based applications, the LLM needs to keep interacting with the user in a continuous interaction loop. The example below shows how a simple interaction loop can be implemented using the `while` construct, and how the bot can be proactive when the user is silent.
 1  import core
 2  import llm
 3  import avatars
 4  import timing
 5
 6  flow main
 7    activate automating intent detection
 8    activate generating user intent for unhandled user utterance
 9
10    while True
11      when unhandled user intent
12        $response = ..."Response to what user said."
13        bot say $response
14      or when user was silent 12.0
15        bot inform about service
16      or when user expressed greeting
17        bot say "Hi there!"
18      or when user expressed goodbye
19        bot inform "That was fun. Goodbye"
20
21  flow user expressed greeting
22    user said "hi"
23      or user said "hello"
24
25  flow user expressed goodbye
26    user said "goodbye"
27      or user said "I am done"
28      or user said "I have to go"
29
30  flow bot inform about service
31    bot say "You can ask me anything!"
32      or bot say "Just ask me something!"
The `main` flow above activates the `generating user intent for unhandled user utterance` flow from the `avatars` module, which uses the LLM to generate the canonical form of a user message (a.k.a. the user intent). When the LLM generates an intent that is not handled by the Colang script, the `unhandled user intent` flow is triggered (line 11).
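Conversely, an intent stops being "unhandled" as soon as a flow matching it exists. As an illustration (the flow name and utterances below are hypothetical, not part of the tutorial config), defining a flow like this keeps matching messages out of the `unhandled user intent` branch:

```colang
# Hypothetical flow: once this intent is defined and reachable,
# matching user messages resolve to it instead of triggering
# the "unhandled user intent" flow.
flow user asked about weather
  user said "what is the weather like?"
    or user said "will it rain today?"
```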
Line 14 in the example above shows how to use the predefined `user was silent` flow (from the `timing` module) to model time-driven interaction.
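The argument to `user was silent` is the silence duration in seconds. As a minimal sketch outside the main loop (the flow name and wording here are illustrative assumptions, not part of the tutorial config), a standalone reminder could look like this:

```colang
import timing

# Illustrative sketch: wait until the user has been silent for
# 5 seconds, then have the bot prompt them.
flow reminder on silence
  user was silent 5.0
  bot say "Are you still there?"
```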
This example also uses the `when` / `or when` syntax, a mechanism for branching a flow along multiple paths. When a flow reaches a branching point, it starts monitoring all the branches and continues the interaction as soon as one of them matches.
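Stripped of the loop, the branching pattern can be sketched on its own (the flow name and responses below are made up for illustration):

```colang
# Minimal when / or when sketch: the flow pauses at the branching
# point and continues down whichever branch matches first.
flow handle confirmation
  when user said "yes"
    bot say "Great, let's continue!"
  or when user said "no"
    bot say "No problem, we can stop here."
```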
Testing
$ nemoguardrails chat --config=examples/v2_x/tutorial/interaction_loop
> hi
Hi there!
<< pause for 12 seconds >>
You can ask me anything!
> how are you?
I am doing well, thank you for asking! How can I assist you today?
The next example will show you how to create LLM-driven flows.