
Brilliant Sole

@brilliantsole.bsky.social

Making smart insoles for developers. They have pressure sensors (8 per insole), a 3DoF motion sensor, and haptics (2 per insole: heel and ball)

61 Followers  |  207 Following  |  60 Posts  |  Joined: 28.10.2024

Latest posts by brilliantsole.bsky.social on Bluesky

Local AI Assistant on Smart Glasses

Using our wearables SDK to take a picture, speak a prompt with Whisper speech-to-text, and use Chrome’s Prompt API to generate a response that is displayed back on the glasses’ display
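
A rough sketch of the flow: the glasses calls (`takePicture`, `drawText`) and the `transcribe()` helper are hypothetical placeholders rather than the SDK's actual API (the Whisper setup is shown in the subtitles post further down), and `LanguageModel` is Chrome's still-experimental Prompt API surface.

```javascript
// capture → transcribe → prompt → display, all on-device
async function askAssistant(glasses, recordedAudio) {
  const photo = await glasses.takePicture();          // hypothetical SDK call
  const question = await transcribe(recordedAudio);   // local Whisper speech-to-text (assumed helper)

  const session = await LanguageModel.create();       // Chrome Prompt API (experimental)
  const answer = await session.prompt(
    `You are an assistant running on smart glasses. The user just took a photo. Question: ${question}`
  );
  // Feeding the photo itself into the prompt depends on the Prompt API's
  // multimodal support, so it's omitted here.

  await glasses.drawText(answer);                     // hypothetical SDK call
  return answer;
}
```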

30.10.2025 19:42 | 👍 0  🔁 0  💬 0  📌 0

Spotify Player with Lyrics on Smart Glasses

Using our Wearables SDK to experiment with a music player on compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)

Also using a Mudra Band and TapXR for hand controls
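
The music data itself comes from the Spotify Web API. A minimal sketch of polling the currently-playing track and pushing it to the display; `glasses.drawText` stands in for the SDK call, and the lyrics source is out of scope here.

```javascript
// Requires a user access token with the user-read-currently-playing scope
async function showNowPlaying(glasses, accessToken) {
  const res = await fetch("https://api.spotify.com/v1/me/player/currently-playing", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (res.status === 204) return; // nothing is playing right now

  const { item } = await res.json();
  const artists = item.artists.map((artist) => artist.name).join(", ");
  await glasses.drawText(`${item.name} - ${artists}`); // hypothetical SDK call
}

// e.g. setInterval(() => showNowPlaying(glasses, accessToken), 5000);
```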

20.10.2025 20:24 | 👍 1  🔁 0  💬 1  📌 1

Realtime maps in your smart glasses

Using our wearables SDK to display a simple map overlay, using @openstreetmap.bsky.social to retrieve nearby map data
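
The nearby-data lookup can come straight from the Overpass API. A hedged sketch of fetching nearby roads; projecting the returned geometry into screen space and drawing it with the SDK is left out.

```javascript
// Query OpenStreetMap's Overpass API for highways within `radius` meters
async function fetchNearbyRoads(lat, lon, radius = 200) {
  const query = `
    [out:json];
    way(around:${radius},${lat},${lon})[highway];
    out geom;`;

  const res = await fetch("https://overpass-api.de/api/interpreter", {
    method: "POST",
    body: "data=" + encodeURIComponent(query),
  });
  const { elements } = await res.json();

  // Each way carries a `geometry` array of { lat, lon } points that can be
  // projected and drawn as line segments on the glasses display.
  return elements.map((way) => way.geometry);
}
```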

15.10.2025 15:40 | 👍 1  🔁 0  💬 0  📌 1

Recognize people with smart glasses

Using our wearables SDK to look for certain people and display relevant information when someone is recognized

Works with compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)
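
The matching step itself is simple once you have face embeddings: compare the live embedding against enrolled people by cosine similarity. How the embedding is computed from the camera frames is assumed here and not shown.

```javascript
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// `enrolled` is an array of { name, info, embedding }; returns the best match or null
function recognize(liveEmbedding, enrolled, threshold = 0.6) {
  let best = null;
  for (const person of enrolled) {
    const score = cosineSimilarity(liveEmbedding, person.embedding);
    if (score >= threshold && (!best || score > best.score)) {
      best = { ...person, score };
    }
  }
  return best; // display best.info on the glasses when non-null
}
```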

09.10.2025 17:09 | 👍 0  🔁 0  💬 0  📌 1

Experimenting with hand controls on smart glasses

With our Wearables SDK, we can both display a 3D wireframe scene on compatible smart glasses and control a cursor by strapping one of our motion modules to our wrist

We’re able to detect pinches by running an @edgeimpulse.com model
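
The cursor half is mostly bookkeeping: integrate the wrist module's gyro rates into 2D cursor deltas and clamp to the display. The pinch classifier (the Edge Impulse model) is treated as a black box, and the event shape and display resolution below are assumptions.

```javascript
const display = { width: 640, height: 400 };          // assumed display resolution
const cursor = { x: display.width / 2, y: display.height / 2 };
const SENSITIVITY = 4;                                // pixels per degree of rotation, tune to taste

// yaw/pitch in deg/s, dt in seconds - hypothetical gyro event from the motion module
function onGyro({ yaw, pitch, dt }) {
  cursor.x = Math.min(display.width, Math.max(0, cursor.x + yaw * SENSITIVITY * dt));
  cursor.y = Math.min(display.height, Math.max(0, cursor.y - pitch * SENSITIVITY * dt));
}

function onPinch() {
  // fired by the Edge Impulse classifier; "click" whatever is under the cursor
  clickAt(cursor.x, cursor.y);                        // hypothetical helper
}
```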

06.10.2025 15:47 | 👍 0  🔁 0  💬 0  📌 1

View yourself in the third person with smart glasses

Using our Wearables SDK, we’re able to visualize MediaPipe tracking data (body, hands, and face) on compatible smart glasses (in this case the Brilliant Labs Frame running our custom firmware)

Great for yoga, working out, dancing, etc
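
A sketch of the tracking side using MediaPipe's tasks-vision package; the model and WASM URLs follow MediaPipe's published examples, and the `drawSkeleton` callback (which would forward the landmarks to the glasses via the SDK) is assumed.

```javascript
import { FilesetResolver, PoseLandmarker } from "@mediapipe/tasks-vision";

const vision = await FilesetResolver.forVisionTasks(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm"
);
const pose = await PoseLandmarker.createFromOptions(vision, {
  baseOptions: {
    modelAssetPath:
      "https://storage.googleapis.com/mediapipe-models/pose_landmarker/pose_landmarker_lite/float16/1/pose_landmarker_lite.task",
  },
  runningMode: "VIDEO",
  numPoses: 1,
});

// Run on each webcam frame and forward the 33 normalized {x, y, z} landmarks
function trackFrame(video, drawSkeleton) {
  const result = pose.detectForVideo(video, performance.now());
  if (result.landmarks.length > 0) drawSkeleton(result.landmarks[0]);
  requestAnimationFrame(() => trackFrame(video, drawSkeleton));
}
```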

29.09.2025 16:16 | 👍 2  🔁 0  💬 0  📌 1

Translation on Smart Glasses

Added foreign language support to our Wearables SDK, allowing us to display languages like Chinese, Japanese, and Korean

In this case we’re using @hf.co models for both realtime transcription and translation - all done locally (no cloud required)
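
A minimal sketch of the local translation step with transformers.js; the model id is an assumption (any browser-compatible translation model from the Hub should work), and rendering the CJK output relies on the SDK's font support.

```javascript
import { pipeline } from "@huggingface/transformers";

// Downloads once, then runs entirely in the browser
const translate = await pipeline("translation", "Xenova/opus-mt-en-zh");

async function showTranslated(glasses, englishText) {
  const [{ translation_text }] = await translate(englishText);
  await glasses.drawText(translation_text);           // hypothetical SDK call
}
```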

22.09.2025 18:46 | 👍 2  🔁 1  💬 0  📌 1

Boxing on smart glasses

Added wireframe support to our Wearables SDK, allowing us to display realtime 3D graphics (like A-Frame and @threejs.org scenes) on compatible smart glasses (in this case the Brilliant Labs Frame)
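
One way to get from a three.js scene to a wireframe display is to extract edges, project each endpoint to screen space, and hand the resulting 2D segments to the SDK; `glasses.drawSegments` below is a hypothetical stand-in for that last step.

```javascript
import * as THREE from "three";

// Returns an array of 2-point segments in pixel coordinates for one mesh
function projectEdges(mesh, camera, width, height) {
  mesh.updateMatrixWorld();
  const edges = new THREE.EdgesGeometry(mesh.geometry);
  const positions = edges.attributes.position;
  const v = new THREE.Vector3();
  const segments = [];

  for (let i = 0; i < positions.count; i += 2) {
    const points = [];
    for (let j = 0; j < 2; j++) {
      v.fromBufferAttribute(positions, i + j)
        .applyMatrix4(mesh.matrixWorld)
        .project(camera);                             // to normalized device coordinates
      points.push({ x: (v.x + 1) * 0.5 * width, y: (1 - v.y) * 0.5 * height });
    }
    segments.push(points);
  }
  return segments;                                    // e.g. glasses.drawSegments(segments)
}
```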

15.09.2025 13:42 | 👍 1  🔁 1  💬 0  📌 1

Realtime Subtitles on Smart Glasses, using our Wearables SDK

We use @hf.co’s Whisper model to transcribe microphone data locally, then display the transcription on the smart glasses’ display (in this case the Brilliant Labs Frame) - no cloud required
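
A minimal sketch of the transcription loop with transformers.js (Whisper runs fully in-browser); `glasses.drawText` is a hypothetical stand-in for the SDK's display call.

```javascript
import { pipeline } from "@huggingface/transformers";

const transcribe = await pipeline(
  "automatic-speech-recognition",
  "Xenova/whisper-tiny.en"                            // small English-only model
);

// `audioChunk` is mono Float32Array PCM at 16 kHz, e.g. from the glasses mic stream
async function subtitle(glasses, audioChunk) {
  const { text } = await transcribe(audioChunk);
  await glasses.drawText(text);                       // hypothetical SDK call
}
```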

08.09.2025 20:21 | 👍 1  🔁 0  💬 0  📌 1

Added sprite support to our wearables SDK

Great for stuff like emulating 3D graphics via billboarding
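
The billboarding trick boils down to pre-rendering an object from N angles and picking the sprite frame that matches the current viewing angle, so a flat sprite reads as a 3D object. A rough sketch of the frame selection:

```javascript
// Assumes frames were rendered clockwise starting from the object's front
function pickSpriteFrame(objectPos, cameraPos, objectHeading, frameCount) {
  const viewAngle = Math.atan2(cameraPos.x - objectPos.x, cameraPos.z - objectPos.z);
  const relative = (viewAngle - objectHeading + Math.PI * 2) % (Math.PI * 2);
  return Math.round((relative / (Math.PI * 2)) * frameCount) % frameCount;
}

// e.g. glasses.drawSprite(frames[pickSpriteFrame(obj, cam, obj.heading, 8)], x, y) - hypothetical
```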

25.08.2025 18:06 | 👍 1  🔁 0  💬 0  📌 1

Added Font support to our Brilliant Sole Wearables JavaScript SDK

18.08.2025 18:52 | 👍 0  🔁 0  💬 0  📌 1

Added Bitmap support to our Brilliant Sole Wearables JavaScript SDK

11.08.2025 19:27 | 👍 2  🔁 0  💬 0  📌 1

Our wearables JavaScript SDK works great with React-like environments

This is a simple example using @nextjs.org, @supabase.com, and @tailwindcss.com to make a basic data collection website that can both record and view sensor data
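
A hedged sketch of the recording side: a React hook that subscribes to a sensor event and appends rows to a Supabase table. The table name, event name, and sample shape are assumptions, not the SDK's actual interface.

```javascript
import { useEffect } from "react";
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY
);

export function useSensorRecorder(device, sessionId) {
  useEffect(() => {
    if (!device) return;

    const onData = async (sample) => {
      await supabase.from("sensor_samples").insert({
        session_id: sessionId,
        timestamp: sample.timestamp,
        pressure: sample.pressure,                    // assumed sample shape
        acceleration: sample.acceleration,
      });
    };

    device.addEventListener("sensorData", onData);    // hypothetical SDK event
    return () => device.removeEventListener("sensorData", onData);
  }, [device, sessionId]);
}
```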

04.08.2025 16:02 | 👍 1  🔁 0  💬 0  📌 1

Detect head gestures on your smart glasses

Made a simple @edgeimpulse.com model to detect nods and head shakes
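
Not the Edge Impulse model itself, just a rough heuristic that captures the same idea: over a short window of gyro samples, a nod shows up as alternating pitch-velocity spikes and a head shake as alternating yaw spikes.

```javascript
// gyroWindow: [{ pitch, yaw }] in deg/s over roughly one second of samples
function detectHeadGesture(gyroWindow, threshold = 60) {
  let pitchFlips = 0, yawFlips = 0, lastPitchSign = 0, lastYawSign = 0;

  for (const { pitch, yaw } of gyroWindow) {
    const pitchSign = Math.abs(pitch) > threshold ? Math.sign(pitch) : 0;
    const yawSign = Math.abs(yaw) > threshold ? Math.sign(yaw) : 0;
    if (pitchSign && lastPitchSign && pitchSign !== lastPitchSign) pitchFlips++;
    if (yawSign && lastYawSign && yawSign !== lastYawSign) yawFlips++;
    if (pitchSign) lastPitchSign = pitchSign;
    if (yawSign) lastYawSign = yawSign;
  }

  if (pitchFlips >= 2 && pitchFlips > yawFlips) return "nod";
  if (yawFlips >= 2) return "shake";
  return null;
}
```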

28.07.2025 15:34 | 👍 1  🔁 0  💬 0  📌 1

Smart glasses webcam

Created a virtual webcam using OBS and our SwiftUI camera demo

21.07.2025 17:02 | 👍 1  🔁 0  💬 0  📌 1

Express yourself with smart glasses

Using MediaPipe’s face tracking to visualize a user’s eyes and eyebrows on a smart glasses display (in this case the Brilliant Labs Frame running our custom firmware)

14.07.2025 17:55 | 👍 0  🔁 0  💬 0  📌 1

Our smart glasses display simulator also works in WebXR, allowing us to preview AR glasses applications on a Quest 3

07.07.2025 15:35 | 👍 1  🔁 0  💬 0  📌 1

Added a smart glasses display simulator to our JavaScript SDK

Preview what your AR glasses applications will look like using your webcam, a video file, or even an interactive 3D scene

30.06.2025 16:03 | 👍 2  🔁 0  💬 0  📌 1

Added display support for smart glasses to our JavaScript SDK

We created a Canvas-like interface for drawing primitives on-screen for smart glasses with displays, like the Brilliant Labs Frame

Easily draw rectangles, circles, rounded rectangles, ellipses, polygons, and segments
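
For comparison, the same primitives drawn with the standard HTML Canvas 2D API, which is the kind of interface the SDK mirrors for the glasses’ framebuffer:

```javascript
const ctx = document.querySelector("canvas").getContext("2d");

ctx.strokeRect(10, 10, 80, 40);                       // rectangle

ctx.beginPath();
ctx.arc(160, 30, 20, 0, Math.PI * 2);                 // circle
ctx.stroke();

ctx.beginPath();
ctx.roundRect(210, 10, 80, 40, 8);                    // rounded rectangle
ctx.stroke();

ctx.beginPath();
ctx.ellipse(340, 30, 35, 18, 0, 0, Math.PI * 2);      // ellipse
ctx.stroke();

ctx.beginPath();                                      // polygon
ctx.moveTo(400, 50);
ctx.lineTo(430, 10);
ctx.lineTo(460, 50);
ctx.closePath();
ctx.stroke();

ctx.beginPath();                                      // segment
ctx.moveTo(480, 10);
ctx.lineTo(540, 50);
ctx.stroke();
```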

23.06.2025 14:52 | 👍 0  🔁 0  💬 0  📌 1

Running speech recognition on smart glasses 👓

As we port our firmware to the Brilliant Labs Frame, we added microphone support to our SDKs, allowing us to run @hf.co’s Whisper Web to transcribe speech in the browser

16.06.2025 15:47 | 👍 1  🔁 0  💬 0  📌 1

Hand, face, and object tracking on smart glasses

Running MediaPipe and @hf.co models on camera data from the Brilliant Labs Frame

09.06.2025 14:34 | 👍 0  🔁 0  💬 0  📌 1

Streaming camera data from smart glasses to a smart watch via Bluetooth

As we port our firmware to the Brilliant Labs Frame, we added camera support to our SDKs

02.06.2025 14:15 | 👍 2  🔁 0  💬 0  📌 1

Ported our firmware to the Brilliant Labs Frame 👓, making it compatible with our SDKs

26.05.2025 15:02 | 👍 1  🔁 0  💬 0  📌 1

Kicking a virtual soccer ball in WebXR using our motion modules

19.05.2025 15:06 | 👍 1  🔁 0  💬 0  📌 1

Kick, stomp, and toss green shells in WebXR

By wrapping our motion module around our ankle, we can add kick/stomp detection by running an @edgeimpulse.com ML model

12.05.2025 17:03 | 👍 2  🔁 0  💬 0  📌 1

Adding haptic feedback to create an immersive WebXR experience on the Quest 3

Here we vibrate our motion modules differently based on the gesture (pet, punch, grab, and release)

We added WiFi support to our JavaScript SDK, allowing us to connect directly to our WiFi-enabled hardware via WebSockets
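
A sketch of that WebSocket path; the address, message format, and waveform names below are assumptions for illustration, not the SDK’s actual protocol.

```javascript
const socket = new WebSocket("ws://192.168.1.42:8080"); // module's LAN address (assumed)

const gestureWaveforms = {
  pet: "softBuzz",
  punch: "strongClick",
  grab: "doubleClick",
  release: "tick",
};

function vibrateFor(gesture) {
  if (socket.readyState !== WebSocket.OPEN) return;
  socket.send(JSON.stringify({ type: "triggerVibration", waveform: gestureWaveforms[gesture] }));
}
```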

05.05.2025 16:26 | 👍 0  🔁 0  💬 0  📌 1

Using our motion modules to detect punches, providing haptic feedback

28.04.2025 14:48 | 👍 0  🔁 0  💬 0  📌 1

Experimenting with basic wrist controls on our modular Ukaton hardware

You can remove the insole+shoe clip, attach a loop clip, and strap it around your wrist

Made a simple @edgeimpulse.com model to detect pinches, and we use the gyro to move the cursor, similar to @doublepoint.bsky.social

21.04.2025 14:19 | 👍 0  🔁 0  💬 0  📌 1

Rewrote Ukaton’s firmware so their hardware is 100% compatible with our SDKs

We recently merged with Ukaton and now have 2 implementations for smart insoles

14.04.2025 14:13 | 👍 2  🔁 0  💬 0  📌 1

Interfacing with our smart insoles on an Apple TV in a SwiftUI app using our Swift Package

07.04.2025 16:09 | 👍 2  🔁 0  💬 0  📌 1
