About Meta Wearables Device Access Toolkit

Getting started

The Wearables Device Access Toolkit enables developers to build mobile apps that connect to the hardware sensors of our AI glasses. You can create hands-free experiences such as livestreaming, video capture, and AI-assisted applications.

While developers everywhere will be able to download the SDK, only those in countries where AI glasses are supported will have access to the full capabilities of the toolkit, including the Wearables Developer Center.

We plan to support our entire portfolio of AI glasses. Developers will initially have camera access via the toolkit and be able to access the microphone and speakers through iOS or Android Bluetooth profiles for:
  • Ray-Ban Meta
  • Oakley Meta HSTN
Additional device support will roll out soon.

The Wearables Device Access Toolkit supports the Android and iOS mobile platforms, with the same OS version requirements as the Meta AI app (Android 10 and iOS 15.2 / Swift 6). For full requirements, see here.

A sample app and a getting started tutorial are provided to help developers get started quickly.

You can start development without hardware: the SDK supports simulated devices via the Mock Device Kit. For full integration, a compatible device is recommended.

Full documentation is available at https://wearables.developer.meta.com/docs/develop/.

The Wearables Device Access Toolkit provides a Mock Device Kit for testing integrations without hardware. Developers can pair a mock device, change its state, and simulate permissions and media streaming. Developers will also get access to the Wearables Developer Center where they can manage organizations, release channels, and more.
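To make the mock-device workflow concrete, here is a minimal, platform-neutral sketch of the lifecycle described above: pair a simulated device, grant a permission, and simulate a media stream. All class and method names are hypothetical illustrations of the concept, not the Mock Device Kit's actual API.

```python
from enum import Enum, auto

class DeviceState(Enum):
    DISCONNECTED = auto()
    PAIRED = auto()
    STREAMING = auto()

class MockGlasses:
    """Illustrative stand-in for a simulated device (hypothetical names)."""

    def __init__(self):
        self.state = DeviceState.DISCONNECTED
        self.permissions = {"camera": False, "microphone": False}

    def pair(self):
        # Simulated pairing: no Bluetooth involved, just a state change.
        self.state = DeviceState.PAIRED

    def grant_permission(self, name):
        if self.state is DeviceState.DISCONNECTED:
            raise RuntimeError("pair the device first")
        self.permissions[name] = True

    def start_stream(self):
        if not self.permissions["camera"]:
            raise PermissionError("camera permission not granted")
        self.state = DeviceState.STREAMING
        # Simulated frames stand in for real sensor data.
        return [f"frame-{i}" for i in range(3)]

glasses = MockGlasses()
glasses.pair()
glasses.grant_permission("camera")
frames = glasses.start_stream()
```

The value of this pattern is that app logic written against a simulated device can later be pointed at real hardware without structural changes.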

Core Features & Capabilities

Initially, you will have camera access via the toolkit, and you can access the microphone and speakers on our AI glasses via iOS/Android Bluetooth profiles.

Accessing Meta AI capabilities via “Hey Meta” invocations will not be a part of our toolkit this year, but we’ll continue to listen to feedback from the developer community and improve the experience. Meanwhile, you can explore AI model integration in the Llama developer center.

Developers in the preview release will soon be able to build camera and audio experiences for Meta Ray-Ban Display glasses via the Device Access Toolkit. We plan to expand access to display features in future iterations.

Not during the preview release, but we'll continue to listen to feedback from the developer community to improve the experience.

You can access the device's microphones to create voice commands in your app, but you won't be able to create custom voice commands for Meta AI. While custom gesture controls like taps and swipes aren't offered, you can listen for standard events like pause, resume, and stop. We're evaluating additional capabilities for the future and will consider feedback as we determine our roadmaps.
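Listening for standard events like pause, resume, and stop typically follows an observer pattern: the app registers handlers, and the session dispatches to them. The sketch below illustrates that pattern only; the dispatcher class and event names are hypothetical, not the toolkit's API.

```python
class SessionEvents:
    """Hypothetical event dispatcher illustrating the listener pattern."""

    def __init__(self):
        self._handlers = {}

    def on(self, event, handler):
        # Register a callback for a named event.
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event):
        # Invoke every handler registered for this event.
        for handler in self._handlers.get(event, []):
            handler()

log = []
events = SessionEvents()
events.on("pause", lambda: log.append("paused"))
events.on("resume", lambda: log.append("resumed"))
events.on("stop", lambda: log.append("stopped"))

# Simulate the device reporting a pause followed by a stop.
events.emit("pause")
events.emit("stop")
```

In a real integration the emitting side would be the device session, while your app only supplies the handlers.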

We expect that using AI in your Meta Wearables Device Access Toolkit integration will improve the user experience. You can leverage the Llama API (see the Llama developer center), either through your own APIs or those provided by third parties.
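As a sketch of what such an integration might look like, the helper below builds a chat-completion payload in the common OpenAI-compatible request shape, which could then be posted to your chosen LLM endpoint. The model name, system prompt, and request shape are assumptions for illustration; consult the Llama developer center for the actual API contract.

```python
import json

def build_chat_request(user_text, model="llama-3.3-70b"):
    """Assemble a chat-completion payload (illustrative shape, not a
    confirmed Llama API contract)."""
    return {
        "model": model,
        "messages": [
            # Hypothetical system prompt for a glasses-camera assistant.
            {"role": "system",
             "content": "You describe what the glasses camera sees."},
            {"role": "user", "content": user_text},
        ],
    }

payload = build_chat_request("What landmark is in this photo?")
body = json.dumps(payload)  # serialized request body, ready to POST
```

Keeping payload construction in a pure function like this makes it easy to unit-test your integration logic separately from the network call.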

Integration & APIs

You can find the full list of available APIs and tools here.

Developers can process data locally or via cloud/edge platforms; however, the Meta AI app must be used to pair your glasses.

About our developer preview

The developer preview is available now: developer.meta.com/wearables.

  • Download the SDK: Streamline your development process with pre-built libraries and our sample app.
  • Access documentation: Understand the API architecture, available endpoints, data structures, and best practices. Kick-start your development with our sample app and tutorial.
  • Test with or without hardware: Test applications in Developer Mode on glasses or by using our Mock Device Kit without hardware.

Unfortunately, we will not have support for developers in markets where AI glasses are not sold.

Publishing of integrations will be limited to select partners during our developer preview while we focus on building, testing, and gathering your feedback. We expect to release the ability to publish to the broader community in 2026.

We’re exploring opportunities to host bootcamps, hackathons, and other community events around the SDK. Stay connected with us for future announcements and opportunities at developer.meta.com/wearables.