
[FEATURE REQ] Responses "prompt" attribute  #500

@YairSadan

Description

Describe the feature or improvement you are requesting

There is no option to use stored prompts.

Additional context

In the Responses API (2.2.0-beta.4) there is no way to reference a stored prompt by ID the way the JS SDK allows, as in the example below:

import OpenAI from "openai";
const client = new OpenAI();

const response = await client.responses.create({
    model: "gpt-4.1",
    prompt: {
        id: "pmpt_abc123",
        version: "2",
        variables: {
            customer_name: "Jane Doe",
            product: "40oz juice box"
        }
    }
});

console.log(response.output_text);
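
For comparison, the strongly typed surface this request is asking for might look roughly like the sketch below. ResponsePromptReference and the Prompt property are hypothetical names used only to illustrate the idea and do not exist in the SDK today; the client construction and CreateResponseAsync call assume the patterns used elsewhere in this library.

using System;
using OpenAI.Responses;

// Hypothetical sketch of the requested API surface. ResponsePromptReference and
// ResponseCreationOptions.Prompt are illustrative names, not existing types.
OpenAIResponseClient client = new("gpt-4.1", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

ResponseCreationOptions options = new()
{
    Prompt = new ResponsePromptReference("pmpt_abc123")
    {
        Version = "2",
        Variables =
        {
            ["customer_name"] = "Jane Doe",
            ["product"] = "40oz juice box"
        }
    }
};

// Note: the current overloads also require non-empty inputItems, which a stored
// prompt alone should not need.
OpenAIResponse response = await client.CreateResponseAsync(
    new ResponseItem[] { ResponseItem.CreateUserMessageItem("Hi there") }, options);

Console.WriteLine(response.GetOutputText());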

Currently, ResponseCreationOptions does not expose any such prompt option:

// <auto-generated/>

#nullable disable

using System;
using System.Collections.Generic;

namespace OpenAI.Responses
{
    public partial class ResponseCreationOptions
    {
        private protected IDictionary<string, BinaryData> _additionalBinaryDataProperties;

        internal ResponseCreationOptions(IDictionary<string, string> metadata, float? temperature, float? topP, string previousResponseId, string instructions, IList<InternalCreateResponsesRequestIncludable> include, InternalCreateResponsesRequestModel model, IList<ResponseItem> input, bool? stream, string endUserId, ResponseReasoningOptions reasoningOptions, int? maxOutputTokenCount, ResponseTextOptions textOptions, ResponseTruncationMode? truncationMode, bool? parallelToolCallsEnabled, bool? storedOutputEnabled, ResponseToolChoice toolChoice, IList<ResponseTool> tools, IDictionary<string, BinaryData> additionalBinaryDataProperties)
        {
            Metadata = metadata;
            Temperature = temperature;
            TopP = topP;
            PreviousResponseId = previousResponseId;
            Instructions = instructions;
            Include = include;
            Model = model;
            Input = input;
            Stream = stream;
            EndUserId = endUserId;
            ReasoningOptions = reasoningOptions;
            MaxOutputTokenCount = maxOutputTokenCount;
            TextOptions = textOptions;
            TruncationMode = truncationMode;
            ParallelToolCallsEnabled = parallelToolCallsEnabled;
            StoredOutputEnabled = storedOutputEnabled;
            ToolChoice = toolChoice;
            Tools = tools;
            _additionalBinaryDataProperties = additionalBinaryDataProperties;
        }

        public IDictionary<string, string> Metadata { get; }

        public float? Temperature { get; set; }

        public float? TopP { get; set; }

        public string PreviousResponseId { get; set; }

        public string Instructions { get; set; }

        internal IDictionary<string, BinaryData> SerializedAdditionalRawData
        {
            get => _additionalBinaryDataProperties;
            set => _additionalBinaryDataProperties = value;
        }
    }
}

Nor do the CreateResponse methods accept one:

    public virtual AsyncCollectionResult<StreamingResponseUpdate> CreateResponseStreamingAsync(IEnumerable<ResponseItem> inputItems, ResponseCreationOptions options = null, CancellationToken cancellationToken = default)
    {
        Argument.AssertNotNullOrEmpty(inputItems, nameof(inputItems));

        using BinaryContent content = CreatePerCallOptions(options, inputItems, stream: true);
        return new AsyncSseUpdateCollection<StreamingResponseUpdate>(
            async () => await CreateResponseAsync(content, cancellationToken.ToRequestOptions(streaming: true)).ConfigureAwait(false),
            StreamingResponseUpdate.DeserializeStreamingResponseUpdate,
            cancellationToken);
    }
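
As a stopgap, the prompt object can be injected by building the request body manually and calling the protocol-level overload that the generated code above delegates to. This is only a sketch under stated assumptions: it presumes CreateResponseAsync(BinaryContent, RequestOptions) is publicly exposed on OpenAIResponseClient and that the client can be constructed from a model name and API key, as other clients in this SDK are.

using System;
using System.ClientModel;
using System.ClientModel.Primitives;
using System.Text.Json;
using OpenAI.Responses;

// If the Responses API is still marked experimental in this version, its
// diagnostic may need to be suppressed for this to compile.
OpenAIResponseClient client = new("gpt-4.1", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// Build the raw request body so the "prompt" object can be included.
BinaryContent content = BinaryContent.Create(BinaryData.FromObjectAsJson(new
{
    model = "gpt-4.1",
    prompt = new
    {
        id = "pmpt_abc123",
        version = "2",
        variables = new
        {
            customer_name = "Jane Doe",
            product = "40oz juice box"
        }
    }
}));

// Protocol-level call (assumed public, mirroring the internal call shown above).
ClientResult result = await client.CreateResponseAsync(content, null);

// Assumes the first output item is an assistant message with a text part.
using JsonDocument doc = JsonDocument.Parse(result.GetRawResponse().Content.ToString());
Console.WriteLine(doc.RootElement
    .GetProperty("output")[0]
    .GetProperty("content")[0]
    .GetProperty("text")
    .GetString());

A strongly typed Prompt option on ResponseCreationOptions would remove the need for this kind of raw-JSON workaround, which is what this request is asking for.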
