What you need to know
- OpenAI announced GPT-4o, its latest flagship model, yesterday.
- GPT-4o is now available in preview in Azure OpenAI Service.
- Developers can use Azure OpenAI Service to integrate GPT technology into applications.
OpenAI CEO Sam Altman promised magic from the company this week. While some were disappointed by the live stream, OpenAI did announce a new flagship model, GPT-4o. The new model brings "GPT-4-level intelligence to everyone," including free users, as explained by OpenAI. Now, developers can integrate GPT-4o into applications through Azure OpenAI Service.
Microsoft's support page explaining Azure OpenAI Service currently lists only the GPT-4 series and GPT-3.5 series, but it should be updated soon. The company announced the launch of GPT-4o on Azure AI shortly after OpenAI's event concluded.
"Microsoft is thrilled to announce the launch of GPT-4o, OpenAI’s new flagship model on Azure AI," said Microsoft. "This groundbreaking multimodal model integrates text, vision, and audio capabilities, setting a new standard for generative and conversational AI experiences. GPT-4o is available now in Azure OpenAI Service, to try in preview, with support for text and image."
OpenAI showcased GPT-4o during its live event with several demonstrations. The new model is faster than its predecessor and much more capable. GPT-4o supports real-time voice communication, video analysis capabilities, and can detect and replicate emotions.
A preview playground in Azure OpenAI Studio lets developers in select regions in the United States test the capabilities of GPT-4o.
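Outside the playground, calling the model from code follows the same pattern as other Azure OpenAI deployments. The sketch below is a minimal, hedged example using the `openai` Python package's `AzureOpenAI` client; the endpoint, API key, API version, and the deployment name `"gpt-4o"` are all placeholder assumptions, since your Azure deployment name may differ from the model's ID.

```python
# Minimal sketch of calling a GPT-4o deployment in Azure OpenAI Service.
# Assumes: the openai v1.x Python package is installed, and that
# AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY are set in the environment.
import os
from typing import Dict, List


def build_messages(prompt: str,
                   system: str = "You are a helpful assistant.") -> List[Dict[str, str]]:
    """Assemble the message list expected by the Chat Completions API."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
    ]


def ask_gpt4o(prompt: str) -> str:
    """Send a prompt to a GPT-4o deployment and return the reply text."""
    from openai import AzureOpenAI  # imported lazily; requires credentials to use

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",  # assumed version; check the Azure docs for current values
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # the name of YOUR Azure deployment, not the raw model ID
        messages=build_messages(prompt),
    )
    return response.choices[0].message.content
```

The `model` argument is the deployment name you chose in Azure OpenAI Studio, which is a common stumbling block for developers coming from the OpenAI API directly.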
Microsoft lists some potential ways that organizations could use GPT-4o:
- Enhanced customer service: By integrating diverse data inputs, GPT-4o enables more dynamic and comprehensive customer support interactions.
- Advanced analytics: Leverage GPT-4o’s capability to process and analyze different types of data to enhance decision-making and uncover deeper insights.
- Content innovation: Use GPT-4o’s generative capabilities to create engaging and diverse content formats, catering to a broad range of consumer preferences.
Microsoft also shared excitement about GPT-4o and "other Azure AI updates" that will be shown off at Build 2024. That developer-focused event takes place May 21-23. Much of the excitement surrounding Build 2024 relates to Windows on Arm, but AI will also be a major focus of the event. Several session titles and descriptions for this year's Build conference are about AI computing and new Windows AI features. It's safe to assume Microsoft will have more to share related to GPT-4o and the work of OpenAI, even though OpenAI snubbed Windows in favor of macOS recently.
While some may debate whether GPT-4o is magical, as alluded to by Altman, the model is much more capable than its predecessors. GPT-4o can respond to audio in an average of 320 milliseconds. OpenAI highlights that GPT-4o matches GPT-4 Turbo's performance on English text and code, and that it outperforms GPT-4 in non-English languages. GPT-4o is also 50% cheaper than GPT-4 when used through the API, which is a significant price difference for developers.