If we want AI to be a true sidekick—one that can assist, guide, and even anticipate our needs—it has to be able to see the world the way we do. Without that, it’s just a voice in your pocket, reacting to whatever you tell it rather than proactively helping. But making an AI “see” what you see isn’t as simple as sticking a phone camera in your shirt pocket.
Phones are bulky, awkward, and not designed for continuous outward-facing use. Most people don’t even have shirt pockets anymore, and even if they did, no one wants to look like they’re wearing a GoPro on their chest. Devices like the Humane AI Pin tried to solve this problem, but the market wasn’t impressed. It looked odd, was easy to forget to wear, and never felt socially acceptable: three big strikes for everyday adoption.
The solution likely comes down to two more natural options:
• Smart Glasses with AR: Glasses already sit where your eyes are, so they’re the natural place for outward-facing cameras. Combined with built-in microphones and speakers (possibly directional or bone-conduction drivers that only you can hear), they could feed your AI exactly what you’re seeing in real time; a sketch of what that feed might look like in software follows this list. This also opens the door to augmented reality overlays (directions, reminders, translations) layered right onto your world.
• Camera-Enabled Earbuds: For people who don’t wear glasses or aren’t interested in AR, an alternative could be AirPods-style earbuds with tiny outward-facing cameras. Sitting at ear level, they’d capture a line of sight close to your own. This option would be less conspicuous than glasses and still give your AI enough visual input to understand your environment.
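Concretely, the software loop behind either device could be fairly simple. Below is a minimal sketch, assuming Python with OpenCV for capture and a hypothetical describe_frame() function standing in for whatever vision-language model the assistant uses; the capture interval, resolution, and function names are all illustrative assumptions, not any shipping product’s API.

```python
# Minimal sketch of a continuous "AI eyes" loop: grab frames from an
# outward-facing camera, throttle and downscale them, and hand each one
# to a vision-language model for interpretation. describe_frame() is a
# hypothetical placeholder, not a real product's API.

import time
import cv2  # pip install opencv-python

CAPTURE_INTERVAL_S = 2.0  # one frame every 2 s is plenty for ambient context
FRAME_WIDTH = 640         # downscale before upload to save battery and bandwidth

def describe_frame(jpeg_bytes: bytes) -> str:
    """Hypothetical stand-in for a multimodal model call.
    A real implementation would send the JPEG to the assistant's vision model."""
    return f"[stub] would describe a {len(jpeg_bytes)}-byte frame here"

def main() -> None:
    cap = cv2.VideoCapture(0)  # device 0 stands in for the wearable's camera
    if not cap.isOpened():
        raise RuntimeError("camera not available")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                continue
            # Shrink the frame (preserving aspect ratio) and JPEG-encode it.
            height, width = frame.shape[:2]
            scale = FRAME_WIDTH / width
            small = cv2.resize(frame, (FRAME_WIDTH, int(height * scale)))
            encoded, jpeg = cv2.imencode(".jpg", small)
            if encoded:
                print(describe_frame(jpeg.tobytes()))
            time.sleep(CAPTURE_INTERVAL_S)  # throttle: battery and privacy both benefit
    finally:
        cap.release()

if __name__ == "__main__":
    main()
```

The deliberate throttling is worth noting: sampling one reduced-resolution frame every couple of seconds is one plausible way to stretch battery life and limit how much of the wearer’s surroundings ever leaves the device, which bears directly on the privacy and battery concerns below.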
Both approaches face real challenges (privacy, battery life, and social acceptance among them), but they’re the most plausible paths to giving AI true “eyes.” Once those hurdles are cleared, your AI could act less like a voice assistant and more like a real companion who understands your world as well as you do.