Add responses API wrapper support #68

Open: wants to merge 3 commits into base: master
Conversation

@mfine15 (Collaborator) commented Mar 18, 2025

Updates the Exa client so it also wraps the OpenAI Responses API when it is available on the client. The wrapper behaves the same way as the standard chat completions create wrapper (a short sketch of that equivalent usage follows the example below).

import os

from exa_py import Exa
from openai import OpenAI

# Initialize the Exa client (EXA_API_KEY must be set in the environment)
exa_api_key = os.environ.get("EXA_API_KEY")
if not exa_api_key:
    raise ValueError("EXA_API_KEY environment variable not set")
exa = Exa(api_key=exa_api_key)

# Initialize the OpenAI client (OPENAI_API_KEY must be set in the environment)
openai_api_key = os.environ.get("OPENAI_API_KEY")
if not openai_api_key:
    raise ValueError("OPENAI_API_KEY environment variable not set")
openai_client = OpenAI(api_key=openai_api_key)

# Wrap the OpenAI client with Exa functionality.
# This enhances both chat.completions.create and responses.create.
exa.wrap(openai_client)

# Define the system message that instructs the LLM to use web search when needed
system_message = {
    "role": "system", 
    "content": "You are a helpful assistant. Use web search when you need current or specific information. Always cite your sources."
}

# Now, let's define some questions to answer
questions = [
    "How did bats evolve their wings?",
    "How did Rome defend Italy from Hannibal?",
    "What are the latest developments in quantum computing?"
]

# Let's create a function to handle the RAG process with the Responses API using our wrapper
def answer_with_exa_rag(question):
    """Use OpenAI Responses API with Exa wrapper to answer a question with current web data."""
    print(f"\nQuestion: {question}")
    
    # Create messages with the user's question
    messages = [
        system_message,
        {"role": "user", "content": question}
    ]

    # Send request to OpenAI - the Exa wrapper will handle the search if needed
    print("Sending request to OpenAI with Exa wrapper...")
    response = openai_client.responses.create(
        model="gpt-4o",
        input=messages,
        # The Exa wrapper handles tool definition and function calls automatically
    )
    
    # The wrapper will have added exa_result to the response if a search was performed
    if hasattr(response, 'exa_result'):
        print("\nExa search was performed!")
        citations = [{"url": result.url, "title": result.title} for result in response.exa_result.results]
        return response.output_text, citations
    else:
        print("\nOpenAI provided an answer without needing to search the web.")
        return response.output_text, None

# Process each question
for question in questions:
    answer, citations = answer_with_exa_rag(question)
    print(f"\nAnswer: {answer}")
    
    if citations:
        print("\nSources:")
        for i, citation in enumerate(citations, 1):
            print(f"[{i}] {citation['title']}: {citation['url']}")
    print("\n" + "-"*80)

@mfine15 mfine15 requested a review from jeffzwang March 18, 2025 14:53