.Net: Bug: Conflict between Response Format property & function calling #9768
Comments
@apg-developer Could you please share how the type you pass to `ResponseFormat` is defined?
@dmytrostruk sure, the definition is on the plugin, but here you can see the class definition for RecipeOutput. Thank you so much for your help!

```csharp
public class RecipeOutput
{
    public string Name { get; set; }
    public string Description { get; set; }
    public Ingredient[] Ingredients { get; set; }
    public string Instructions { get; set; }
}

public class Ingredient
{
    public string Name { get; set; }
    public string Quantity { get; set; }
    public string MeasureUnit { get; set; }
}
```
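For readers following along, here is a minimal sketch (not from the thread; it assumes the standard Semantic Kernel OpenAI connector APIs and that a kernel is already built with the plugin registered) of how a type like `RecipeOutput` is typically passed to `ResponseFormat` and how the structured reply can be deserialized:

```csharp
// Minimal sketch (not from the thread): pass RecipeOutput as the structured
// output type and deserialize the model's JSON reply.
// Assumes 'kernel' is already built with the Azure OpenAI chat completion
// connector and the recipe plugin registered.
using System.Text.Json;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var settings = new OpenAIPromptExecutionSettings
{
    // The JSON schema sent to the service is derived from RecipeOutput.
    ResponseFormat = typeof(RecipeOutput),
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var chatService = kernel.GetRequiredService<IChatCompletionService>();

var chatHistory = new ChatHistory();
chatHistory.AddUserMessage("Give me a pancake recipe.");

var result = await chatService.GetChatMessageContentAsync(chatHistory, settings, kernel);

// The reply content is a JSON document matching the RecipeOutput schema.
var recipe = JsonSerializer.Deserialize<RecipeOutput>(result.Content!);
```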
@apg-developer Thanks a lot! Could you please try to run the same example, but also specify the configuration shown in semantic-kernel/dotnet/samples/Concepts/ChatCompletion/OpenAI_StructuredOutputs.cs (lines 203 to 208 at commit f5facce)?
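For anyone without the referenced sample at hand, the SK structured-output samples also demonstrate configuring the format through the OpenAI SDK's `ChatResponseFormat` type rather than a .NET `Type`. The sketch below illustrates that pattern under the assumption that this is the relevant part; the schema and names are illustrative, not the exact content of the referenced lines:

```csharp
// Illustrative sketch only: configuring the response format through the
// OpenAI SDK's ChatResponseFormat instead of a .NET type. The schema and
// names below are assumptions, not the sample's exact code.
using OpenAI.Chat;
using Microsoft.SemanticKernel.Connectors.OpenAI;

ChatResponseFormat responseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
    jsonSchemaFormatName: "recipe_output",
    jsonSchema: BinaryData.FromString("""
        {
          "type": "object",
          "properties": {
            "Name": { "type": "string" },
            "Description": { "type": "string" },
            "Instructions": { "type": "string" }
          },
          "required": ["Name", "Description", "Instructions"],
          "additionalProperties": false
        }
        """),
    jsonSchemaIsStrict: true);

var settings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = responseFormat
};
```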
@dmytrostruk the result was the same. Note the following stack trace and the provided screenshot. Thanks for your help!

General stack trace

Inner exception stack trace

Modified snippet code:

```csharp
var builder = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: ProjectSettings.DeploymentName,
        apiKey: ProjectSettings.ApiKey,
        serviceId: ProjectSettings.ServiceId,
        endpoint: ProjectSettings.Endpoint,
        apiVersion: "2024-08-01-preview");
```
Hello, I am currently facing the same issue when returning data from a plugin function to a ChatCompletionAgent with the ResponseFormat property set.

```csharp
agentKernel.ImportPluginFromObject(new SomePluginForDataRetrieval());

var agent = new ChatCompletionAgent()
{
    Name = "ToolAndResponseFormatAgent",
    Instructions = """
        A Prompt guiding how to use the plugin
        """,
    Kernel = agentKernel,
    Arguments = new KernelArguments(new OpenAIPromptExecutionSettings()
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(),
        ResponseFormat = typeof(FormattedResponse)
    })
};

await agent.InvokeAsync(chatHistory);
```

Expected behaviour

Should not return a 500 error when using ResponseFormat and a plugin that returns data to the agent in the form of an object.

Platform

Hope that this issue can be resolved soon, thanks for already pointing it out @apg-developer and for the support so far @dmytrostruk
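To make the failure scenario concrete, a plugin that "returns data to the agent in the form of an object" could look like the following hypothetical sketch (class and method names are illustrative, not the reporter's actual SomePluginForDataRetrieval):

```csharp
// Hypothetical sketch of a plugin whose function returns an object
// (illustrative names; not the reporter's actual SomePluginForDataRetrieval).
using System.ComponentModel;
using Microsoft.SemanticKernel;

public sealed class SomeDataRecord
{
    public string Id { get; set; } = string.Empty;
    public string Value { get; set; } = string.Empty;
}

public sealed class DataRetrievalPlugin
{
    [KernelFunction("get_record")]
    [Description("Retrieves a data record by its identifier.")]
    public SomeDataRecord GetRecord(string id) =>
        new() { Id = id, Value = $"Value for {id}" };
}

// Registration, as in the comment above:
// agentKernel.ImportPluginFromObject(new DataRetrievalPlugin());
```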
@apg-developer @deq3 Thanks again for reporting this issue. It appears that the Structured Outputs feature in Azure OpenAI (i.e. Response Format as a JSON Schema) doesn't work with parallel function calls. When function calling is enabled, the model may issue parallel calls; in order to avoid the error, you need to explicitly disable them with the following syntax:

```csharp
var executionSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = typeof(FormattedResponse),
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { AllowParallelCalls = false })
};
```

When you disable parallel calls explicitly, the issue should be resolved for you. Please let me know if that works. Thank you!
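For completeness, here is a minimal end-to-end sketch of applying these settings together with a plugin and a chat completion service. The deployment details and the plugin are placeholders, and FormattedResponse stands in for the reporter's output type (its definition is not shown in the thread):

```csharp
// Minimal sketch: structured output + function calling with parallel calls
// disabled, as suggested above. Deployment values and the plugin
// (DataRetrievalPlugin, sketched earlier) are placeholders/assumptions.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "<deployment>",
        endpoint: "<endpoint>",
        apiKey: "<api-key>")
    .Build();

kernel.ImportPluginFromObject(new DataRetrievalPlugin());

var executionSettings = new OpenAIPromptExecutionSettings
{
    // FormattedResponse is the reporter's output type (definition not shown here).
    ResponseFormat = typeof(FormattedResponse),
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { AllowParallelCalls = false })
};

var chatService = kernel.GetRequiredService<IChatCompletionService>();

var history = new ChatHistory();
history.AddUserMessage("Retrieve record 42 and summarize it.");

var response = await chatService.GetChatMessageContentAsync(history, executionSettings, kernel);
Console.WriteLine(response.Content);
```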
Thank you for the quick reply @dmytrostruk. Unfortunately, setting `AllowParallelCalls = false` did not resolve the error. I have removed specifics from my code, but this is how I currently have the agent configured and invoked. Could there be something else causing the problem?

```csharp
var agent = new ChatCompletionAgent()
{
    Name = "ExampleAgent",
    Instructions = """
        Prompt that tells how to use the tool
        """,
    Kernel = agentKernel,
    Arguments = new KernelArguments(new OpenAIPromptExecutionSettings()
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { AllowParallelCalls = false }),
        ServiceId = serviceId,
        ResponseFormat = typeof(FormattedResponse)
    }),
};

var history = new ChatHistory();
history.AddUserMessage(message);

var res = agent.InvokeAsync(
    history,
    cancellationToken: cancellationToken
);

var responses = await res.ToListAsync(cancellationToken);
var messageContent = responses.FirstOrDefault();
```

I used this blog post when building my app and it worked at first, even though they are not using Azure OpenAI like I am, but somehow it stopped working: https://devblogs.microsoft.com/semantic-kernel/using-json-schema-for-structured-output-in-net-for-openai-models/#structured-outputs-with-function-calling

Thanks again for your support @dmytrostruk
Hi @dmytrostruk, thank you for your answer. I keep getting the same 500 error despite the provided configuration. Is there something else to consider, or could it be a side effect? Thanks so much for your help!
@apg-developer @deq3 Thanks for the provided information; I can reproduce the issue as well. For now, I don't think this occurs because of recent changes in Semantic Kernel, since I have tried the newest as well as older SK versions and the issue is the same. Taking into account that the issue appears to be on the Azure OpenAI service side, I would recommend reporting it to Azure directly. Meanwhile we will monitor this situation on our side as well. Thank you!
@dmytrostruk Thanks for reproducing the issue. I also didn't suspect Semantic Kernel was the problem, since I didn't update my dependencies or touch the code when it stopped working. I will try to report the problem using the link provided; thanks for your help so far and for keeping an eye out. Has anyone had a chance to try it with the OpenAI API and see if it still works as expected?
I get an invalid request error.
@dmytrostruk we have submitted a support request to Microsoft Azure for the above problem regarding the combination of function calls with structured output. The product team told us that they had fixed the problem on the API side, but it was still happening in our solution. So we asked again and were told to change part of our code. They suggested a code snippet in Python, but we suppose the same should be applicable to C#, with the content field omitted or set accordingly. Since the tool-call part is abstracted away from us by the plugin system of Semantic Kernel, we can't really change this code ourselves. I have updated our package to Microsoft.SemanticKernel 1.32.0 to make sure we are using the latest version, but the problem still occurs. Semantic Kernel 1.32.0 uses api-version=2024-10-01-preview by default for Azure OpenAI; maybe changing the version could already fix it, but we wouldn't know which version to use, since the product team only suggested changing the part above. Thank you in advance for your support, it's greatly appreciated.
@dmytrostruk Can this problem be solved in the manner described? Thank you for your help.
Describe the bug
An HTTP 500 server_error is thrown when a ChatCompletionAgent uses function calling to retrieve data. The error only happens if the Microsoft.SemanticKernel.Connectors.OpenAI.OpenAIPromptExecutionSettings.ResponseFormat property is set on the agent's settings; otherwise, the error does not occur.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The await chatCompletionService.GetChatMessageContentAsync() call should return, without errors, the data provided by the plugin, structured according to the type specified in the ResponseFormat property.
Screenshots
Platform
Additional context
I have tried several combinations to try to achieve the goal