Local AI Assistant on Smart Glasses
Using our wearables SDK to take a picture, speak a prompt with Whisper's speech-to-text, and use Chrome's Prompt API to generate a response to be displayed back on the glasses' display
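The assistant loop above chains four stages: capture, transcribe, generate, display. A minimal sketch of that pipeline, with every stage injected so camera, Whisper, and Prompt API backends can be swapped or stubbed (the function and parameter names here are illustrative, not the SDK's actual API):

```javascript
// Sketch of the assistant loop with each stage injected as an async function.
// All names are hypothetical; this only shows the shape of the pipeline.
async function runAssistant({ captureImage, transcribePrompt, generate, display }) {
  const image = await captureImage();              // photo from the glasses camera
  const prompt = await transcribePrompt();         // spoken prompt -> text (e.g. Whisper)
  const reply = await generate({ image, prompt }); // e.g. Chrome's Prompt API
  await display(reply);                            // draw the reply on the glasses
  return reply;
}
```

Stubbing the stages makes the whole loop testable off-device.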
@brilliantsole.bsky.social
Making smart insoles for developers. They have pressure sensors (8 per insole), a 3DoF motion sensor, and haptics (2 per insole - heel and ball)
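With 8 pressure sensors per insole, a common derived value is the center of pressure. A sketch, assuming an illustrative sensor layout (the coordinates below are made up, not the actual Brilliant Sole placement):

```javascript
// Hypothetical sensor coordinates in normalized insole space (x across, y heel->toe).
const SENSOR_POSITIONS = [
  { x: 0.4, y: 0.0 }, { x: 0.6, y: 0.0 },   // heel
  { x: 0.3, y: 0.35 }, { x: 0.7, y: 0.35 }, // midfoot
  { x: 0.25, y: 0.7 }, { x: 0.5, y: 0.7 },  // ball
  { x: 0.75, y: 0.7 }, { x: 0.5, y: 0.95 }, // toes
];

// values: 8 normalized pressure readings (0..1).
// Returns the pressure-weighted centroid, or null when there is no pressure.
function centerOfPressure(values) {
  let sum = 0, cx = 0, cy = 0;
  values.forEach((v, i) => {
    sum += v;
    cx += v * SENSOR_POSITIONS[i].x;
    cy += v * SENSOR_POSITIONS[i].y;
  });
  if (sum === 0) return null;
  return { x: cx / sum, y: cy / sum };
}
```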
Spotify Player with Lyrics on Smart Glasses
Using our Wearables SDK to experiment with a music player on compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)
Also using a Mudra Band and TapXR for hand controls
Realtime maps in your smart glasses
Using our wearables SDK to display a simple map overlay, using @openstreetmap.bsky.social data to retrieve nearby data
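Retrieving nearby OpenStreetMap data usually starts from the standard slippy-map tile scheme. A sketch of the lat/lon-to-tile conversion (standard OSM formula; the SDK's own map code may differ):

```javascript
// Convert a latitude/longitude to OpenStreetMap slippy-map tile coordinates
// at a given zoom level - the usual first step for fetching nearby tiles.
function latLonToTile(lat, lon, zoom) {
  const n = 2 ** zoom; // tiles per axis at this zoom
  const x = Math.floor(((lon + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y };
}
```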
Recognize people with smart glasses
Using our wearables SDK to look for certain people, and display relevant information when someone is recognized
Works with compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)
Experimenting with hand controls on smart glasses
With our Wearables SDK, we can both display a 3D wireframe scene on compatible smart glasses and control a cursor by strapping one of our motion modules to our wrist
We're able to detect pinches by running an @edgeimpulse.com model
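Moving a cursor from a wrist-worn motion module comes down to integrating angular rates into screen coordinates. A minimal sketch, with an illustrative axis mapping and sensitivity (not the SDK's actual values):

```javascript
// Integrate gyro angular rates (rad/s) into a clamped 2D cursor position.
// Sensitivity and axis signs are illustrative assumptions.
function makeCursor(width, height, sensitivity = 200) {
  let x = width / 2, y = height / 2; // start centered
  return {
    // yawRate moves the cursor horizontally, pitchRate vertically; dt in seconds
    update(yawRate, pitchRate, dt) {
      x = Math.min(width, Math.max(0, x + yawRate * sensitivity * dt));
      y = Math.min(height, Math.max(0, y - pitchRate * sensitivity * dt));
      return { x, y };
    },
  };
}
```

A pinch event from the Edge Impulse model would then trigger a click at the current cursor position.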
View yourself in the third person with smart glasses
Using our Wearables SDK, we're able to visualize MediaPipe tracking data (body, hands, and face) on compatible smart glasses (in this case the Brilliant Labs Frame running our custom firmware)
Great for yoga, working out, dancing, etc
Translation on Smart Glasses
Added foreign language support to our Wearables SDK, allowing us to display languages like Chinese, Japanese, and Korean
In this case we're using @hf.co models for both realtime transcription and translation - all done locally (no cloud required)
Boxing on smart glasses
Added wireframe support to our Wearables SDK, allowing us to display realtime 3D graphics (like A-Frame and @threejs.org) on compatible smart glasses (in this case the Brilliant Labs Frame)
Realtime Subtitles on Smart Glasses, using our Wearables SDK
We used @hf.co's Whisper model to transcribe microphone data locally, then displayed the transcription on the smart glasses' display (in this case the Brilliant Labs Frame) - no cloud required
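Fitting a live transcription onto a narrow glasses display needs word wrapping plus a rolling window of recent lines. A sketch, with an assumed character budget (the real display width and font metrics will differ):

```javascript
// Greedily word-wrap transcribed text and keep only the last few lines,
// like rolling subtitles. maxChars/maxLines are illustrative assumptions.
function wrapSubtitle(text, maxChars = 28, maxLines = 3) {
  const lines = [];
  let line = "";
  for (const word of text.split(/\s+/).filter(Boolean)) {
    if (line && (line + " " + word).length > maxChars) {
      lines.push(line); // current line is full; start a new one
      line = word;
    } else {
      line = line ? line + " " + word : word;
    }
  }
  if (line) lines.push(line);
  return lines.slice(-maxLines); // show only the most recent lines
}
```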
Added sprite support to our wearables SDK
Great for stuff like emulating 3d graphics via billboarding
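Billboarding means rotating a flat sprite so it always faces the camera. For a sprite that spins only around the vertical axis, the yaw is a single atan2 (this math is generic, not the SDK's actual sprite API):

```javascript
// Yaw angle (radians) that turns a sprite at `sprite` to face a camera at
// `camera`, rotating about the Y axis only. Both are {x, z} world positions.
function billboardYaw(sprite, camera) {
  return Math.atan2(camera.x - sprite.x, camera.z - sprite.z);
}
```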
Added Font support to our Brilliant Sole Wearables' JavaScript SDK
18.08.2025 18:52
Added Bitmap support to our Brilliant Sole Wearables' JavaScript SDK
11.08.2025 19:27
Our wearables' JavaScript SDK works great with React-like environments
This is a simple example using @nextjs.org, @supabase.com, and @tailwindcss.com to make a basic data collection website that can both record and view sensor data
Detect head gestures on your smart glasses
Made a simple @edgeimpulse.com model to detect nods and head shakes
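The post uses a trained Edge Impulse model; as a rough illustration of the underlying signal, nods are dominated by pitch motion and head shakes by yaw motion. A threshold-based sketch (heuristic only, not the actual model):

```javascript
// Classify a short window of head angles (radians) as nod, shake, or none,
// by comparing the pitch and yaw ranges. The threshold is an assumption.
function classifyHeadGesture(pitch, yaw, threshold = 0.15) {
  const range = (a) => Math.max(...a) - Math.min(...a);
  const pitchRange = range(pitch);
  const yawRange = range(yaw);
  if (pitchRange < threshold && yawRange < threshold) return "none";
  return pitchRange > yawRange ? "nod" : "shake"; // dominant axis wins
}
```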
Smart glasses webcam
Created a virtual webcam using OBS and our SwiftUI camera demo
Express yourself with smart glasses
Using MediaPipe's face tracking to visualize a user's eyes and eyebrows on a smart glasses display (in this case the Brilliant Labs Frame running our custom firmware)
Our smart glasses display simulator also works in WebXR, allowing us to preview AR glasses applications on a Quest 3
07.07.2025 15:35
Added a smart glasses display simulator to our JavaScript SDK
Preview what your AR glasses applications will look like using your webcam, a video file, or even an interactive 3D scene
Added display support for smart glasses to our JavaScript SDK
We created a Canvas-like interface for drawing primitives onscreen for smart glasses with displays, like the Brilliant Labs Frame
Easily draw rectangles, circles, rounded rectangles, ellipses, polygons, and segments
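A Canvas-like interface for a remote display typically records draw calls and flushes them to the device in one batch. A sketch of that pattern with a few of the primitives named above (the method names and queueing here are illustrative, not the SDK's actual API):

```javascript
// Hypothetical command-buffer sketch: draw calls queue commands, flush()
// hands the whole batch to a transport (e.g. BLE) and clears the queue.
class DisplayContext {
  constructor() { this.commands = []; }
  drawRect(x, y, w, h) { this.commands.push({ type: "rect", x, y, w, h }); }
  drawCircle(x, y, r) { this.commands.push({ type: "circle", x, y, r }); }
  drawSegment(x1, y1, x2, y2) { this.commands.push({ type: "segment", x1, y1, x2, y2 }); }
  flush(send) {
    const batch = this.commands;
    this.commands = [];
    send(batch);
    return batch.length; // number of commands sent
  }
}
```

Batching keeps the number of radio round-trips per frame low, which matters on a Bluetooth link.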
Running speech recognition on smart glasses
As we port our firmware to the Brilliant Labs Frame, we added microphone support to our SDKs, allowing us to run @hf.co's Whisper Web to transcribe speech in the browser
Hand, face, and object tracking on smart glasses
Running MediaPipe and @hf.co models on the Brilliant Labs Frame camera data
Streaming camera data from smart glasses to a smart watch via Bluetooth
As we port our firmware to the Brilliant Labs Frame, we added camera support to our SDKs
Ported our firmware to the Brilliant Labs Frame, making it compatible with our SDKs
26.05.2025 15:02
Kicking a virtual soccer ball in WebXR using our motion modules
19.05.2025 15:06
Kick, stomp, and toss green shells in WebXR
By wrapping our motion module around our ankle, we can add kick/stomp detection running an @edgeimpulse.com ML model
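The Edge Impulse model classifies the gesture, but the raw signal a kick or stomp produces is a sharp spike in accelerometer magnitude. A simple spike detector as an illustration of that signal (heuristic only, not the trained model):

```javascript
// Find upward threshold crossings in accelerometer magnitude.
// samples: [{x, y, z}] in g; the threshold is an illustrative assumption.
function detectSpikes(samples, threshold = 2.5) {
  const mag = (s) => Math.hypot(s.x, s.y, s.z);
  const hits = [];
  for (let i = 1; i < samples.length; i++) {
    // report the index where magnitude first rises above the threshold
    if (mag(samples[i]) >= threshold && mag(samples[i - 1]) < threshold) hits.push(i);
  }
  return hits;
}
```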
Adding haptic feedback to create an immersive WebXR experience on the Quest 3
Here we vibrate our motion modules differently based on the gesture (pet, punch, grab, and release)
We added WiFi support to our JavaScript SDK, allowing us to connect directly to our WiFi-enabled hardware via WebSockets
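Talking to hardware over a WebSocket usually means agreeing on a binary frame layout. A sketch of simple length-prefixed framing (the opcode byte and layout here are invented for illustration, not the SDK's actual wire protocol):

```javascript
// Hypothetical framing: [opcode][length lo][length hi][payload...]
function encodeFrame(opcode, payload /* Uint8Array */) {
  const frame = new Uint8Array(3 + payload.length);
  frame[0] = opcode;
  frame[1] = payload.length & 0xff;        // little-endian 16-bit length
  frame[2] = (payload.length >> 8) & 0xff;
  frame.set(payload, 3);
  return frame;
}

function decodeFrame(frame /* Uint8Array */) {
  const length = frame[1] | (frame[2] << 8);
  return { opcode: frame[0], payload: frame.slice(3, 3 + length) };
}
```

Frames like these would be sent as binary WebSocket messages (`ws.send(frame)`) and decoded on receipt.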
Using our motion modules to detect punches, providing haptic feedback
28.04.2025 14:48
Experimenting with basic wrist controls on our modular Ukaton hardware
You can remove the insole+shoe clip, attach a loop clip, and strap it around your wrist
Made a simple @edgeimpulse.com model to detect pinches, and used the gyro to move the cursor, similar to @doublepoint.bsky.social
Rewrote Ukaton's firmware so their hardware is 100% compatible with our SDKs
We recently merged with Ukaton and now have 2 implementations for smart insoles
Interfacing with our smart insoles on an Apple TV in a SwiftUI app using our Swift Package
07.04.2025 16:09