AI Weekly: Chatting to bots, understanding whales


STORY: From a very talkative new bot, to a bid to understand the language of whales, this is AI Weekly.

OpenAI has unveiled its newest model.

Called GPT-4o, it’s capable of realistic voice conversation, and can interact across text and images.

OpenAI researcher Mark Chen showed off the bot’s abilities at the launch:

“Hey ChatGPT, I’m Mark, how are you?”

“Oh, Mark, I'm doing great. Thanks for asking. How about you?”

“Hey, so I'm on stage right now. I'm doing a live demo, and frankly, I'm feeling a little bit nervous. Can you help me calm my nerves a little bit?”

“Oh, you're doing a live demo right now. That's awesome. Just take a deep breath and remember, you're the expert.”

Chip designer Arm plans to develop its own AI processors, with the first to launch next year.

The UK-based firm will pay for initial development costs, with help from Japanese owner SoftBank.

That’s according to reports by Nikkei Asia.

AI might soon change how we do household chores.

Germany-based Neura Robotics is building a bot – dubbed 4NE-1 – to do things like take out the garbage.

Boss David Reger swears the droids aren’t planning anything more than that:

“Hey, 4NE-1. Do you think robotics will take over the world?"

4NE-1: "I don't think so, but we should treat our robots kindly to avoid that they build a union."

TikTok says it’s going to start labelling AI-generated content on its platforms.

Digital watermarks will show how content was created and edited.

That’s already the case for posts made using its own in-app tools.

And scientists are using AI to decode the language of sperm whales.

After crunching the data, MIT professor Jacob Andreas says the researchers have discovered that the creatures use a kind of alphabet.

"Whales don't produce arbitrary sequences of clicks. They instead intend to produce them in one of a relatively limited set of fixed patterns.”

But he says it will be a long time before we understand what the whales are saying - let alone figure out how to talk back.