Apple’s upcoming smart glasses could sidestep one of the category’s biggest problems – privacy concerns – by rethinking something as simple as the camera indicator light. According to a recent report from Bloomberg, the company is working on display-free smart glasses that focus on everyday functionality, but with a design approach that may make them seem less intrusive than current offerings.
The device, internally codenamed N50, is expected to launch around 2026 or 2027 and will function more like a companion to the iPhone than a standalone augmented reality system. Instead of a display, the glasses will rely on features such as photo and video capture, voice interactions via Siri, notifications and media playback.
A subtle hardware change with big implications
What’s special about Apple’s approach is how it plans to handle recording visibility. Unlike existing smart glasses that use small LED indicator lights, Apple is reportedly experimenting with a more eye-catching lighting system built directly into the camera module.
The design reportedly includes vertically oriented lenses surrounded by visible lighting elements, making it difficult to conceal when the camera is actively recording.
This could solve a key problem that has plagued smart glasses since their introduction: the fear of being recorded without consent.
The privacy problem that others still face
The problem is not theoretical. A report from WIRED shows how users of Meta’s Ray-Ban smart glasses have attempted to circumvent the built-in privacy safeguards. Third-party vendors have even promoted accessories like “ghost dots,” designed to dim or block the recording indicator light.
Although these attempts are often ineffective thanks to built-in safeguards, they reveal a broader problem: when users actively try to hide recording signals, the trust needed for widespread adoption breaks down. Even unsuccessful workarounds reinforce the perception that smart glasses enable covert recording, feeding the “creepy” reputation that has limited their adoption.
Apple’s strategy: Resolve trust through design
Instead of relying solely on software limitations, Apple appears to be tackling the problem at the hardware level.
By making the recording indicator more visible and integrating it into the design, the company is trying to eliminate ambiguity. If successful, it could become significantly more difficult to use the glasses in a way that feels covert or deceptive.
This is consistent with Apple’s broader approach to new product categories. As seen with devices like the iPhone and Apple Watch, the company often enters markets later but focuses on refining the user experience and fixing the category’s key weaknesses.
Part of a larger AI wearables push
The smart glasses are not being developed in isolation. Bloomberg notes that they are part of a broader strategy that includes AI-powered AirPods and other wearable devices designed to interpret the user’s surroundings.
These products rely on computer vision and Apple Intelligence to provide contextual information, from navigation assistance to real-time reminders.
This suggests that Apple’s goal is not just to build smart glasses, but to create an ecosystem of devices that make AI more aware of the user’s surroundings and integrate it seamlessly into daily life.
What this means for users
For consumers, the success of smart glasses will depend on both perception and functionality.
If Apple can make its glasses feel transparent and trustworthy, it could overcome one of the biggest barriers to adoption. At the same time, tight integration with the iPhone and the broader Apple ecosystem could make the device more useful in everyday scenarios.
What comes next
Apple’s smart glasses are still in development and a market launch is not expected until 2026 or 2027 at the earliest. Full-featured augmented reality glasses remain a long way off, probably towards the end of the decade.
Until then, Apple’s focus seems to be on getting the basics right – functionality, usability and, most importantly, trust.