OpenAI Unveils GPT-5.5
OpenAI has rolled out GPT-5.5, a new model designed for research-oriented tasks, including helping to improve future versions of itself. The company describes it as a “new class of intelligence for real work and powering agents.”
Capabilities and Design
GPT-5.5 is designed to handle complex, multi-part work with minimal step-by-step direction. Given such a task, the model will plan, use tools, check its work, handle uncertainty, and keep working through challenges.
OpenAI cofounder and president Greg Brockman said at a briefing with journalists, as reported by AFP, that “what is really special about this model is how much more it can do with less guidance,” adding that it “can look at an unclear problem and figure out just what needs to happen next.” Chief research officer Mark Chen described the company’s short-term focus as letting humans act as “orchestrators” while the models do the “heavy lifting.”
Performance Improvements
OpenAI said GPT-5.5 delivers improvements in agentic coding, computer-based tasks, knowledge work, and early-stage scientific research. The company stated that GPT-5.5 matches GPT-5.4's per-token latency in real-world serving while operating at a higher level of intelligence.
Safety and Deployment
The model is being released with what OpenAI described as its most robust safeguards to date. OpenAI said it evaluated GPT-5.5 across its safety and preparedness frameworks, worked with internal and external red teamers, added targeted testing for advanced cybersecurity and biology capabilities, and collected feedback from nearly 200 trusted early-access partners before release.
GPT-5.5 is rolling out to Plus, Pro, Business, and Enterprise users in ChatGPT and Codex; GPT-5.5 Pro is rolling out to Pro, Business, and Enterprise users in ChatGPT. OpenAI said API deployments will require different safeguards and that it is working with partners and customers on safety and security requirements to serve the model at scale, with plans to bring GPT-5.5 and GPT-5.5 Pro to the API in the near term.
Source: Tech-Economic Times