In a notable shift toward AI-powered smart wearables, informed sources report that Apple is doubling down on the development of three AI-based devices: smart glasses, a necklace-like wearable, and a new version of the AirPods with built-in cameras.

These projects are part of a broader strategy to expand Apple's smart ecosystem beyond smartphones and other traditional devices.

The smart glasses are the most prominent of the three. Expectations indicate that production will begin by December 2026 at the latest, with an official launch in 2027. Rather than a built-in display, the glasses will rely on high-resolution cameras, a microphone, and speakers to deliver interactive AI experiences through the Siri assistant.

The glasses are expected to let users make calls, take photos, and get information about their surroundings, along with real-time assistance such as recognizing landmarks or identifying objects, all with seamless iPhone integration. Apple aims to outperform competitors such as Meta's AI glasses through distinctive design and build quality.

Beyond the smart glasses, Apple intends to develop a wearable device about the size of an AirTag that can be clipped to clothing or worn as a necklace, known internally as the "AI Pendant" or "AI Pin."

The device is said to contain an always-on camera and microphone that feed Siri data about the wearer's surroundings, serving as the eyes and ears of a device that has no screen or built-in display.

Its small, elegant design makes it suitable for everyday use as a portable AI accessory. Processing is largely offloaded: most operations run on the iPhone it is paired with rather than on the device itself.

As part of the same initiative, Apple is also working on an updated version of the AirPods equipped with low-resolution cameras. These are not meant for taking conventional photos but for gathering environmental information to feed the AI, enabling smarter audio and gesture-recognition experiences across devices.

Users will still need an iPhone connection to benefit from this data, confirming that these devices are not standalone products but additional input points for the AI system within Apple's ecosystem.

This push into smart wearables comes amid intensifying market competition, particularly from products such as Meta's smart glasses and OpenAI's smart-wearable projects.

The move signals Apple's ambition to bring mobile AI into everyday life, rather than confining it to phones and smartwatches.

It is worth noting that Apple has not officially confirmed any of these plans. Leaks suggest, however, that next year may see the launch, or at least the announcement, of many of these devices, paving the way for a new era of smart wearables.