ChatGPT is becoming more human, making it difficult for rivals to catch up

When an OpenAI researcher showed ChatGPT a message of support, it replied, “That's very kind of you.”
OpenAI

  • OpenAI has unveiled a GPT-4o update to ChatGPT that lets the chatbot reason across audio, vision, and text.
  • The upgraded chatbot has human-like features, mimicking human diction and adding humor and voice inflections.
  • GPT-4o's capabilities put pressure on OpenAI's Big Tech rivals to prove they can keep up.

OpenAI CEO Sam Altman teased that the new ChatGPT update “feels like magic,” and he wasn't wrong.

The AI company has essentially thrown down the gauntlet to its Big Tech rivals with two words: “Your move.”

OpenAI CTO Mira Murati unveiled the Spring Update to ChatGPT on Monday with a series of live demos. Powered by OpenAI's new flagship AI model GPT-4o, the latest version of the AI chatbot can reason across audio, vision, and text in real time.

And it's surprisingly human.

The movie “Her” is getting closer to reality

Warner Bros. Pictures

First, ChatGPT's voice and conversational capabilities have come a long way thanks to GPT-4o, allowing it to express emotions and change tone.

In the demo, the new AI spoke in an American woman's voice reminiscent of Scarlett Johansson's AI character in Spike Jonze's film “Her,” though OpenAI researchers at one point had the bot switch voices. An OpenAI spokesperson said that at launch, audio output will be limited to a selection of preset voices.

The voice wasn't just human-like. It also showed an uncanny ability to imitate human diction. The new ChatGPT added chuckles and humor and softened its inflections in response to prompts.

It also appears able to pick up on human cues. When a researcher hyperventilated while demonstrating a deep-breathing exercise, the chatbot quipped, “Mark, you're not a vacuum cleaner.”

You can even interrupt the chatbot, which makes conversation feel more natural: you don't have to wait for the AI to finish responding before asking a clarifying question or changing the subject.

Response times were also fast. An OpenAI spokesperson said the chatbot responds to voice input at human-like speeds, averaging 320 milliseconds.

After the event, OpenAI CEO Sam Altman posted on X (formerly Twitter) the title of the movie on the minds of many who watched the demo.


ChatGPT's eyes have also been upgraded

The chatbot demonstrated advanced abilities to interpret graphs, assist with coding, read emotions, and walk users through math problems by viewing video and images through a phone's camera.
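For developers wondering what that image-interpretation flow looks like in code, here is a minimal, hypothetical sketch of sending a captured camera frame to GPT-4o through OpenAI's chat completions API. The helper function, prompt, and image handling are illustrative assumptions, not OpenAI's demo code:

```python
# Hypothetical sketch: sending a camera frame to GPT-4o for interpretation.
# `build_vision_request` is an illustrative helper, not part of the OpenAI SDK.

def build_vision_request(prompt: str, image_b64: str) -> dict:
    """Assemble a multimodal chat request mixing text and a base64 PNG image."""
    return {
        "model": "gpt-4o",
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                    },
                ],
            }
        ],
    }

# To actually send the request (requires the `openai` package and an API key):
# from openai import OpenAI
# client = OpenAI()
# response = client.chat.completions.create(
#     **build_vision_request("What equation is shown here?", frame_b64)
# )
# print(response.choices[0].message.content)
```

The same request shape handles the graph, code, and math demos described above; only the prompt and image change.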

All the while, the voice assistant remained friendly and upbeat.

In another demo shared online, GPT-4o was even able to analyze video of the space around the user. Noting that the user was wearing an OpenAI hoodie and surrounded by recording equipment, the chatbot speculated that the user might be filming an OpenAI-related announcement.


The chatbot did show some hiccups, such as misreading image prompts or starting to answer before a question was finished, but those moments only made it seem more human.

It all feels more human than anything rivals have shipped to date

In one example, ChatGPT began responding before the researcher had shown the equation to the camera, prompting the researcher to stop it midway.

“Oops, I got too excited,” the chatbot responded. “I'm always ready.”

The chatbot also responded in ways that mimicked gratitude. When a researcher showed it a note that read, “I wholeheartedly support ChatGPT,” it responded, “Wow,” and “You're so kind.”

In another example, ChatGPT said the researcher was making it blush by talking about how “useful and awesome” it is.

OpenAI made the announcement on the eve of Google I/O, Google's big annual developer conference, which is expected to showcase the company's progress on various AI products, including Gemini.

Given the timing of OpenAI's event and the strength of its demos, AI watchers will be keen to see whether ChatGPT has pulled ahead of Google's Gemini, or whether Google has something up its sleeve.

But for now, OpenAI's spring update shows once again how far ahead ChatGPT is, especially compared with the existing voice assistant space.

Amazon's Alexa, Apple's Siri, and Google Assistant are all well established, but those voice assistants are known for robotic, to-the-point answers that fall far short of a true conversation. The new GPT-4o-powered ChatGPT blows them out of the water with its human-like responses.

Apple, too, appears aware of the gap between ChatGPT and the current Siri. Recent reports suggest that after Apple executives spent weeks tinkering with ChatGPT and realized how far behind the company was, they decided to completely overhaul the iPhone's voice assistant. There are also rumors that the two companies are in talks, and Apple could end up licensing OpenAI's model for some unannounced iPhone features.

Apple fans won't have to wait long for more information: the company is expected to announce AI updates at its annual Worldwide Developers Conference, which begins June 10.

Meanwhile, Business Insider's Eugene Kim first reported that Amazon is planning a paid version of its voice assistant, Alexa Plus, that uses generative AI. The assistant is expected to offer more conversational, personalized responses, though no release date has been announced.

But just as with the first version of ChatGPT, OpenAI has once again shown off the strength of its technology, leaving the rest of the tech industry to prove it can catch up.


