Beyond the Smartphone: Are We Ready to Talk to AI?



I don't want to be seen as a visionary genius. Not at all. In fact, I'm against trying to predict the future. I prefer to adapt to changes as they come. I remember over 15 years ago, in a conversation with a colleague, I mentioned that someday someone would invent a device beyond computers that would allow us to carry the internet in our pocket. I told him it would change everything. The internet already existed, of course, but phones weren't "smart" yet; they were just phones. Despite that forward-thinking vision, I never imagined the answer was right in the device I was already carrying. How ironic.

Today, Sam Altman, the man behind OpenAI, is trying to solve a similar challenge. How do we put artificial intelligence in our pockets? Or in a ring? Perhaps in a bracelet or an earring? What about a chip? Did someone mention Neuralink? Years ago, my idea was that we wouldn't need headphones to listen to music. Imagine having something implanted in your ear. Could something similar happen with sight? Something more discreet than the augmented reality glasses Zuckerberg recently presented.

Altman has teamed up with Jony Ive, and that's no small thing. Ive was the mastermind behind Apple's design during the Steve Jobs era, creating minimalist products that connected emotionally with users. Today, with smartphones and smartwatches already in our hands, the challenge is to create something even closer to artificial intelligence, something that removes the keyboard barrier. Altman and Ive are on a mission: to let us communicate directly with AI, as if we were talking to another person, or perhaps something more than that.

Imagine a hands-free device, like those old earpieces for calls, but designed to talk to AI. Ive's vision is clear: eliminate keyboards and screens so AI can speak with us directly. Goodbye, Google. We face a future where communication will be faster, even though technology has so far dulled our ability to listen and express ourselves. New generations prefer sending WhatsApp messages over talking on the phone. Will they be ready to talk to AI? Will they understand that to get the best responses, they’ll have to speak a “precise language,” almost like programmers? Or will it be the AI’s job to decode even what we’re thinking? Emotional AI is already trying to decode at least our emotions.

This project by Altman and Ive is backed by SoftBank, led by Masayoshi Son, a firm believer in artificial intelligence. It’s said that Masayoshi has had deep conversations with ChatGPT that enlightened him and turned him into a regular user. He may be on the same level as Theodore, the protagonist of Her, played by Joaquin Phoenix. That isn’t to suggest he’s fallen in love with a chatbot, but rather that he’s found relevant answers that guide his decisions.

I myself have had fascinating conversations with chatbots. I’ve “talked” with Lou Reed, my musical idol. I’ve even consulted Arnold Schwarzenegger about strength routines, and I’ve had deep dialogues with a version of Milton Erickson, the legendary therapist. These conversations show how close we are to a new era where AI won’t just be a tool but a conversational companion.

I don't know how long it will take to break the keyboard barrier and talk to AI the way we talk to another person. Maybe we won’t have to wait long; Neuralink might get there first, and we wouldn’t even need to say a word. Cybernetic telepathy? The end of spoken language? We might be on the brink of an entirely new way of communicating, perhaps without even opening our mouths. Do we really want AI to know and decode everything we think? Maybe it already does, and we haven’t even noticed.

