
Support OpenAI responses API #1807

Merged
philippjfr merged 2 commits into main from openai_responses_api
Apr 13, 2026
Conversation

@philippjfr
Member

Description

Adds support for the OpenAI Responses API to the Lumen OpenAI LLM wrapper.

The existing integration is based on the Chat Completions API, which is now a legacy interface with limitations around structured outputs, tool usage, and multi-step interactions. The Responses API provides a more unified and extensible abstraction that better aligns with Lumen’s typed, agent-driven execution model.

Motivation

Supporting the Responses API enables:

  • First-class tool calling
    The Responses API natively supports multiple tool calls within a single response and more flexible tool invocation patterns. This maps cleanly to Lumen’s agent architecture, where tools and agents may be composed and invoked iteratively.

  • Structured, typed outputs
    The API is designed to work well with structured outputs, improving reliability when generating artifacts such as SQL queries, Vega-Lite specs, and intermediate agent outputs.

  • Unified interface across modalities and workflows
    A single API handles text generation, tool use, and future extensions, reducing fragmentation in the LLM layer.

  • Improved control over execution flow
    Responses can represent multi-step reasoning and tool usage in a single object, making it easier to integrate with Lumen’s execution pipeline and maintain visibility into intermediate steps.

  • Forward compatibility
    The Responses API is the primary interface going forward, ensuring continued compatibility with new OpenAI features and reducing reliance on deprecated endpoints.
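To make the "first-class tool calling" point concrete, the sketch below contrasts how a function tool is declared for the legacy Chat Completions API versus the Responses API. The `run_sql` tool name and its schema are hypothetical, chosen only to echo Lumen's SQL agents; this illustrates the request shapes, not Lumen's actual code.

```python
# Hypothetical "run_sql" tool, declared for each API style.
# Chat Completions nests the definition under a "function" key;
# the Responses API uses a flatter tool object.

run_sql_schema = {
    "type": "object",
    "properties": {"query": {"type": "string"}},
    "required": ["query"],
}

# Chat Completions style (legacy)
chat_completions_tool = {
    "type": "function",
    "function": {
        "name": "run_sql",
        "description": "Execute a SQL query against the active source.",
        "parameters": run_sql_schema,
    },
}

# Responses API style (flat)
responses_tool = {
    "type": "function",
    "name": "run_sql",
    "description": "Execute a SQL query against the active source.",
    "parameters": run_sql_schema,
}
```

A wrapper supporting both backends therefore has to translate between these two tool shapes, which is part of what the normalization work in this PR covers.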

Summary of Changes

  • Added a Responses API-backed implementation to the OpenAI LLM wrapper.
  • Introduced handling for tool calls and structured outputs returned by the Responses API.
  • Normalized Responses API outputs to match Lumen’s internal abstractions, ensuring compatibility with existing agents and pipelines.
  • Maintained backward compatibility with existing Chat Completions-based usage.
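A minimal sketch of what normalizing Responses API output items into chat-style messages can look like. The function name and message layout are illustrative assumptions, not Lumen's actual implementation; it only assumes the documented shapes of `message` and `function_call` output items.

```python
import json

def normalize_output(items):
    """Flatten Responses API output items into chat-style messages.

    Illustrative only: assumes ``items`` are plain dicts shaped like
    the "message" and "function_call" items the Responses API returns.
    """
    messages = []
    for item in items:
        if item["type"] == "message":
            # Concatenate the output_text parts of an assistant message.
            text = "".join(
                part["text"]
                for part in item["content"]
                if part["type"] == "output_text"
            )
            messages.append({"role": item["role"], "content": text})
        elif item["type"] == "function_call":
            # Arguments arrive as a JSON string; decode them so
            # downstream agents receive structured data.
            messages.append({
                "role": "assistant",
                "tool_call": {
                    "id": item["call_id"],
                    "name": item["name"],
                    "arguments": json.loads(item["arguments"]),
                },
            })
    return messages

# Example output items, shaped like a Responses API result that
# contains one tool call followed by a final assistant message:
items = [
    {"type": "function_call", "call_id": "call_1", "name": "run_sql",
     "arguments": '{"query": "SELECT 1"}'},
    {"type": "message", "role": "assistant",
     "content": [{"type": "output_text", "text": "Done."}]},
]
```

Because the Responses API returns a list of typed output items rather than a single message, this kind of flattening is what lets existing Chat Completions-based agents and pipelines keep working unchanged.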

How Has This Been Tested?

  • Verified compatibility with existing agents (e.g. SQL and visualization flows) using the Responses API backend.

  • Manually tested end-to-end flows to confirm:

    • Correct propagation of tool calls
    • Stable structured outputs across multiple runs
    • No regressions when switching between Chat Completions and Responses API

AI Disclosure

  • This PR contains AI-generated content.

    • I have tested all AI-generated content in my PR.
    • I take responsibility for all AI-generated content in my PR.

Tools and Models: Cursor + Codex 5.3

@codecov

codecov bot commented Apr 13, 2026

Codecov Report

❌ Patch coverage is 12.25806% with 136 lines in your changes missing coverage. Please review.
✅ Project coverage is 70.44%. Comparing base (d76bd92) to head (cc5e8fa).
⚠️ Report is 1 commit behind head on main.

Files with missing lines Patch % Lines
lumen/ai/llm.py 12.25% 136 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1807      +/-   ##
==========================================
- Coverage   70.74%   70.44%   -0.30%     
==========================================
  Files         176      176              
  Lines       30426    30578     +152     
==========================================
+ Hits        21526    21542      +16     
- Misses       8900     9036     +136     


@philippjfr philippjfr merged commit 226f805 into main Apr 13, 2026
12 checks passed
@philippjfr philippjfr deleted the openai_responses_api branch April 13, 2026 13:50
ahuang11 pushed a commit that referenced this pull request Apr 13, 2026
