Ambient Computing

    2025: rise of the smartglasses?

    Halliday Glasses have boarded the smart spectacles hype train, featuring “proactive” AI assistance and a near-eye display that shows information directly in the user’s field of view.

    The display appears as a 3.5-inch screen in the upper-right corner of the user’s view, with minimal obstruction.

    The near-eye display works with prescription lenses or with no lenses at all. The displayed information isn’t visible to other people, and it can be controlled by voice commands, controls on the frame, or a ring with a built-in trackpad.

    The number of in-optics display announcements is accelerating.

    Soliddd’s scientifically formulated and user-tested virtual reality smartglasses are lightweight and feel like normal eyeglasses. SolidddVision provides the first true vision correction, and even sight restoration, for those living with vision loss due to macular degeneration.

    The smartglasses use Soliddd’s unique and proprietary lens arrays, which resemble a fly’s eye, to project multiple separate images to the areas of the retina that are not damaged. This allows the brain to naturally construct stereopsis (the making of a 3D image in the brain) and a single full-field image with good acuity that feels like normal, in-focus sight.

    We’ve already hit the point of wearable-tech-as-health-improvement-device with glasses.

    2025: The Year of AR?

    Sources told the Financial Times that Meta is planning to expand its partnership with Ray-Ban to make glasses with a little display on the inside, so users can gaze out at the world with a digital overlay.

    The sources said Meta wants to add a screen on the inside of the glasses that would be able to flash things like text messages, and that the product might be ready to hit the shelves as early as the second half of next year.

    On-lens optics is the next big step.

    😎😎😎

    Meta plans to add displays to its Ray-Ban smart glasses as soon as next year, as the US tech giant accelerates its plans to build lightweight headsets that can usurp the smartphone as consumers’ main computing device.

    According to people familiar with the matter, the company has accelerated Orion’s development following the enthusiastic response of early testers.

    The next form factor has been decided.
    Ambient computing is coming.

    via the Financial Times

    I just concluded a prez to the Blue Ion crew by saying that smart/AR glasses are the next consumer computing wave.

    This piece from The Verge makes me feel better about that prediction.

    and the third is the idea that no one device is the future of XR. Headsets, for example, may just be “episodic” devices you use for entertainment. Glasses could supplement phones and smartwatches for discreet notifications and looking up information.

    “The way I see it, these devices don’t replace one another. You’ll use these devices throughout your day.”

    Ambient computing!

    There are plenty of hurdles left, but if Google has figured out on-lens optics, the big ones left are for the accountants.

    Samsung and Google are working on smart glasses to rival Meta’s Ray-Bans, to be released late next year.

    Gemini would handle AI tasks alongside support for “payment,” QR code recognition, “gesture recognition,” and “human recognition functions.”

    I think we’ve found the most probable next consumer computing platform.

    AI needs a computer, but not necessarily a screen.

    The era of ambient computing is upon us.