Ambient Computing
- the AI layer upending the current ad-funded model of the “open” internet
- why we should approach AI model advancements with optimism
Foursquare’s MarsBot has been reborn as BeeBot (basically).
It sounds like a glimpse of the future of computing (ambient computing!). The interface is your headphones (AirPods get the name drop) and information is fed to you based on your location (using AI, obvs).
Audio augmented reality
Apple is bringing in the big guns to power Siri: now with AI!
OpenAI and Jony Ive are thinking screen-free.
Google and Perplexity are powering assistants elsewhere.
Smart glasses are spilling from every nook and cranny.
As the ambient computing future draws nearer, what does your brand sound like?
This Exponential View episode is worth a listen as a quick overview of the current tech inflection points, like:
But this is the big one for me:
It’s no longer about phones. It’s about ambient computing.
Ambient, invisible computing
I started writing about this way back in 2020; this will be the biggest marketing shift of the era.
Apple’s Liquid Glass is the public-facing beginning of this shift (for them).
2025: rise of the smart glasses?
Halliday Glasses have boarded the smart spectacles hype train, featuring “proactive” AI assistance and a near-eye display that shows information directly in the user’s field of view. The display appears as a 3.5-inch screen in the upper-right corner of the user’s view with minimal obstruction, and it works whether the frames hold prescription lenses or no lenses at all. The displayed information isn’t visible to other people and can be controlled using voice commands, frame interface controls, or a ring with a built-in trackpad.
The pace of in-optics display announcements is accelerating.
