diff --git a/README.md b/README.md
index 026fb575..745d9465 100644
--- a/README.md
+++ b/README.md
@@ -15,6 +15,7 @@ It is generated from our [OpenAPI specification](https://github.com/openai/opena
 - [Namespace organization](#namespace-organization)
 - [Using the async API](#using-the-async-api)
 - [Using the `OpenAIClient` class](#using-the-openaiclient-class)
+- [How to use dependency injection](#how-to-use-dependency-injection)
 - [How to use chat completions with streaming](#how-to-use-chat-completions-with-streaming)
 - [How to use chat completions with tools and function calling](#how-to-use-chat-completions-with-tools-and-function-calling)
 - [How to use chat completions with structured outputs](#how-to-use-chat-completions-with-structured-outputs)
@@ -138,6 +139,45 @@ AudioClient ttsClient = client.GetAudioClient("tts-1");
 AudioClient whisperClient = client.GetAudioClient("whisper-1");
 ```
+
+## How to use dependency injection
+
+The OpenAI clients are **thread-safe** and can be safely registered as **singletons** in ASP.NET Core's Dependency Injection container. This maximizes resource efficiency and HTTP connection reuse.
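+
+If your app uses several feature areas, the same approach extends naturally: register a shared `OpenAIClient` as the singleton and derive feature clients from it (a sketch; the `gpt-4o` model name here is a placeholder, substitute your own):
+
+```csharp
+builder.Services.AddSingleton(serviceProvider =>
+    new OpenAIClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY")));
+
+// Derived clients share the parent client's pipeline, so they can be singletons too.
+builder.Services.AddSingleton(serviceProvider =>
+    serviceProvider.GetRequiredService<OpenAIClient>().GetChatClient("gpt-4o"));
+```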
+
+Register the `ChatClient` as a singleton in your `Program.cs`:
+
+```csharp
+builder.Services.AddSingleton(serviceProvider =>
+{
+    var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
+
+    return new ChatClient(model: "gpt-4o", apiKey: apiKey);
+});
+```
+
+Then inject and use the client in your controllers or services:
+
+```csharp
+[ApiController]
+[Route("api/[controller]")]
+public class ChatController : ControllerBase
+{
+    private readonly ChatClient _chatClient;
+
+    public ChatController(ChatClient chatClient)
+    {
+        _chatClient = chatClient;
+    }
+
+    [HttpPost("complete")]
+    public async Task<IActionResult> CompleteChat([FromBody] string message)
+    {
+        ChatCompletion completion = await _chatClient.CompleteChatAsync(message);
+
+        return Ok(new { response = completion.Content[0].Text });
+    }
+}
+```
+
 ## How to use chat completions with streaming
 
 When you request a chat completion, the default behavior is for the server to generate it in its entirety before sending it back in a single response. Consequently, long chat completions can require waiting for several seconds before hearing back from the server. To mitigate this, the OpenAI REST API supports the ability to stream partial results back as they are being generated, allowing you to start processing the beginning of the completion before it is finished.
diff --git a/examples/OpenAI.Examples.csproj b/examples/OpenAI.Examples.csproj
index cbdcde44..2fa8f789 100644
--- a/examples/OpenAI.Examples.csproj
+++ b/examples/OpenAI.Examples.csproj
@@ -10,6 +10,13 @@ latest
+
+
+
+
+
+
+
diff --git a/examples/aspnet-core/Program.cs b/examples/aspnet-core/Program.cs
new file mode 100644
index 00000000..f6cb39b6
--- /dev/null
+++ b/examples/aspnet-core/Program.cs
@@ -0,0 +1,73 @@
+using System.ClientModel;
+using OpenAI.Chat;
+
+var builder = WebApplication.CreateBuilder(args);
+
+// Add services to the container.
+builder.Services.AddEndpointsApiExplorer();
+builder.Services.AddSwaggerGen();
+
+builder.Services.AddSingleton(serviceProvider => new ChatClient(builder.Configuration["OpenAI:Model"],
+    new ApiKeyCredential(builder.Configuration["OpenAI:ApiKey"]
+        ?? Environment.GetEnvironmentVariable("OPENAI_API_KEY")
+        ?? throw new InvalidOperationException("OpenAI API key not found")))
+);
+builder.Services.AddSingleton<ChatHttpHandler>();
+
+var app = builder.Build();
+
+// Configure the HTTP request pipeline.
+if (app.Environment.IsDevelopment())
+{
+    app.UseSwagger();
+    app.UseSwaggerUI();
+}
+
+app.UseHttpsRedirection();
+
+var chatHandler = app.Services.GetRequiredService<ChatHttpHandler>();
+
+app.MapPost("/chat/complete", chatHandler.HandleChatRequest);
+
+app.Run();
+
+// Chat completion endpoint handler using the injected singleton ChatClient
+public class ChatHttpHandler
+{
+    private readonly ChatClient _client;
+    private readonly ILogger<ChatHttpHandler> _logger;
+
+    public ChatHttpHandler(ChatClient client, ILogger<ChatHttpHandler> logger)
+    {
+        _client = client;
+        _logger = logger;
+    }
+
+    public async Task<ChatResponse> HandleChatRequest(ChatRequest request)
+    {
+        _logger.LogInformation("Handling chat request: {Message}", request.Message);
+        var completion = await _client.CompleteChatAsync(request.Message);
+        return new ChatResponse(completion.Value.Content[0].Text);
+    }
+}
+
+public class ChatRequest
+{
+    public string Message { get; set; }
+
+    public ChatRequest(string message)
+    {
+        Message = message;
+    }
+}
+
+public class ChatResponse
+{
+    public string Response { get; set; }
+
+    public ChatResponse(string response)
+    {
+        Response = response;
+    }
+}
\ No newline at end of file
diff --git a/examples/aspnet-core/README.md b/examples/aspnet-core/README.md
new file mode 100644
index 00000000..a695da92
--- /dev/null
+++ b/examples/aspnet-core/README.md
@@ -0,0 +1,125 @@
+# OpenAI ASP.NET Core Example
+
+This example demonstrates how to use the OpenAI .NET client library with ASP.NET Core's dependency injection container, registering a `ChatClient` as a singleton for optimal performance and resource usage.
+
+## Features
+
+- **Singleton Registration**: `ChatClient` registered as a singleton in the DI container
+- **Thread-Safe**: Demonstrates concurrent usage for chat completion endpoints
+- **Configurable Model**: Model selection via configuration (appsettings.json)
+- **Modern ASP.NET Core**: Uses minimal APIs with async/await patterns
+
+## Prerequisites
+
+- .NET 8.0 or later
+- OpenAI API key
+
+## Setup
+
+1. **Set your OpenAI API key** using one of these methods:
+
+   **Environment Variable (Recommended):**
+
+   ```powershell
+   $env:OPENAI_API_KEY = "your-api-key-here"
+   ```
+
+   **Configuration (appsettings.json):**
+
+   ```json
+   {
+     "OpenAI": {
+       "Model": "gpt-4o-mini",
+       "ApiKey": "your-api-key-here"
+     }
+   }
+   ```
+
+2. **Install dependencies:**
+
+   ```powershell
+   dotnet restore
+   ```
+
+3. **Run the application:**
+
+   ```powershell
+   dotnet run
+   ```
+
+## API Endpoints
+
+### Chat Completion
+
+- **POST** `/chat/complete`
+- **Request Body:**
+
+  ```json
+  {
+    "message": "Hello, how are you?"
+  }
+  ```
+
+- **Response:**
+
+  ```json
+  {
+    "response": "I'm doing well, thank you for asking! How can I help you today?"
+  }
+  ```
+
+## Testing with PowerShell
+
+**Chat Completion** (adjust the port to match your launch profile):
+
+```powershell
+Invoke-RestMethod -Uri "https://localhost:5000/chat/complete" `
+    -Method POST `
+    -ContentType "application/json" `
+    -Body '{"message": "What is the capital of France?"}'
+```
+
+## Key Implementation Details
+
+### Singleton Registration
+
+```csharp
+builder.Services.AddSingleton(serviceProvider => new ChatClient(
+    builder.Configuration["OpenAI:Model"],
+    new ApiKeyCredential(builder.Configuration["OpenAI:ApiKey"]
+        ?? Environment.GetEnvironmentVariable("OPENAI_API_KEY")
+        ?? throw new InvalidOperationException("OpenAI API key not found")))
+);
+```
+
+### Dependency Injection Usage
+
+```csharp
+app.MapPost("/chat/complete", async (ChatRequest request, ChatClient client) =>
+{
+    var completion = await client.CompleteChatAsync(request.Message);
+
+    return new ChatResponse(completion.Value.Content[0].Text);
+});
+```
+
+## Why Singleton?
+
+- **Thread-Safe**: `ChatClient` is thread-safe and can handle concurrent requests
+- **Resource Efficient**: Reuses HTTP connections and avoids creating multiple instances
+- **Performance**: Reduces object allocation overhead
+- **Stateless**: Clients don't maintain per-request state
+
+## Swagger UI
+
+When running in development mode, you can access the Swagger UI at:
+
+- `https://localhost:7071/swagger`
+
+This provides an interactive interface to test the API endpoints.
+
+## Additional Resources
+
+- [Tutorial: Create a minimal API with ASP.NET Core](https://learn.microsoft.com/aspnet/core/tutorials/min-web-api)
+- [.NET dependency injection](https://learn.microsoft.com/dotnet/core/extensions/dependency-injection)
+- [Logging in C# and .NET](https://learn.microsoft.com/dotnet/core/extensions/logging)
\ No newline at end of file
diff --git a/examples/aspnet-core/appsettings.Development.json b/examples/aspnet-core/appsettings.Development.json
new file mode 100644
index 00000000..0c208ae9
--- /dev/null
+++ b/examples/aspnet-core/appsettings.Development.json
@@ -0,0 +1,8 @@
+{
+  "Logging": {
+    "LogLevel": {
+      "Default": "Information",
+      "Microsoft.AspNetCore": "Warning"
+    }
+  }
+}
diff --git a/examples/aspnet-core/appsettings.json b/examples/aspnet-core/appsettings.json
new file mode 100644
index 00000000..af10f339
--- /dev/null
+++ b/examples/aspnet-core/appsettings.json
@@ -0,0 +1,14 @@
+{
+  "Logging": {
+    "LogLevel": {
+      "Default": "Information",
+      "Microsoft.AspNetCore": "Warning"
+    }
+  },
+  "AllowedHosts": "*",
+  "OpenAI":
+  {
+    "Model": "gpt-4.1-mini",
+    "ApiKey": ""
+  }
+}
diff --git a/examples/aspnet-core/aspnet-core.csproj b/examples/aspnet-core/aspnet-core.csproj
new file mode 100644
index 00000000..5a916750
--- /dev/null
+++ b/examples/aspnet-core/aspnet-core.csproj
@@ -0,0 +1,15 @@
+<Project Sdk="Microsoft.NET.Sdk.Web">
+
+  <PropertyGroup>
+    <TargetFramework>net8.0</TargetFramework>
+    <Nullable>enable</Nullable>
+    <ImplicitUsings>enable</ImplicitUsings>
+    <RootNamespace>ASP.NET_Core</RootNamespace>
+  </PropertyGroup>
+
+</Project>