Azure.AI.OpenAI client: how to re-use existing conversation

Peter Jansen 10 Reputation points
2024-04-30T10:15:08.12+00:00

We're using the Azure.AI.OpenAI package in our software, together with a RAG system, to answer questions from our internal information. To support an ongoing conversation, we save the conversation to the database. When a user returns to that conversation to ask a follow-up question, we read the existing messages (system, user, toolcalls, toolresponse, assistant) from the database and then call GetChatCompletionsStreamingAsync(options, ...) with those messages in the 'options.Messages' parameter.

For most message types we could simply add them, for example: 'options.Messages.Add(new ChatRequestUserMessage(message.Content))' (where 'message' is the item that was stored in the database). This was already not working well for the 'toolcalls' message: we had to manually deserialize the original message, create a new instance of ChatRequestAssistantMessage(), and use a hack to set the .ToolCalls collection and the correct "Type". However, with the latest version of the package (Azure.AI.OpenAI/1.0.0-beta.16), deserializing the message content to 'List<ChatCompletionsFunctionToolCall>' throws an exception.
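To make the problem concrete, here is a minimal sketch of how we are currently trying to rebuild the assistant tool-call message before replaying it. The 'StoredToolCall' record and its field names are our own storage shape, not SDK types; the sketch assumes Azure.AI.OpenAI 1.0.0-beta.16:

```csharp
using System.Collections.Generic;
using Azure.AI.OpenAI;

// Hypothetical shape in which we persist a tool call to the database
// (our own type, not an SDK type).
public record StoredToolCall(string Id, string FunctionName, string Arguments);

public static class ConversationReplay
{
    public static ChatRequestAssistantMessage RebuildAssistantToolCallMessage(
        string? content, IEnumerable<StoredToolCall> storedToolCalls)
    {
        // The constructor only accepts the text content; the tool calls
        // have to be re-attached through the ToolCalls collection.
        var assistantMessage = new ChatRequestAssistantMessage(content ?? string.Empty);

        foreach (var toolCall in storedToolCalls)
        {
            assistantMessage.ToolCalls.Add(new ChatCompletionsFunctionToolCall(
                toolCall.Id, toolCall.FunctionName, toolCall.Arguments));
        }

        return assistantMessage;
    }
}
```

The matching tool responses would then be replayed as 'new ChatRequestToolMessage(resultContent, toolCallId)' before the next user message. Is rebuilding the messages field by field like this the intended approach, or is there a supported way to round-trip these messages?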

What is the correct way to do this? That is: how do we let a user ask a follow-up question in an existing conversation, where we have to repopulate the 'options.Messages' property from stored messages, and specifically, how does that work for a 'toolcalls' message?

Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.