
Brilliant Sole

@brilliantsole.bsky.social

Making smart insoles for developers. They have pressure sensors (8 per insole), a 3DoF motion sensor, and haptics (2 per insole: heel and ball)

66 Followers  |  207 Following  |  67 Posts  |  Joined: 28.10.2024

Posts by Brilliant Sole (@brilliantsole.bsky.social)

Tune your guitar with your smart glasses

With our wearables SDK, we’re able to display pitch, allowing musicians to easily tune their instruments

Works with compatible smart glasses, in this case the Brilliant Labs Frame running custom firmware

05.03.2026 17:40 — 👍 1    🔁 0    💬 0    📌 0

Display sheet music on your smart glasses, and tap your foot to turn the page

With our wearables SDK, not only can we render MusicXML files on compatible smart glasses, but we’re also able to use our smart insoles to trigger page turns by running an @edgeimpulse.com model that detects foot taps on-device
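The on-device Edge Impulse model does the actual classification, but the trigger logic around it can be sketched with a simpler threshold approach. A minimal sketch, assuming accelerometer samples in g and millisecond timestamps (the threshold and refractory values are illustrative tuning values, not from the real model):

```javascript
// Hypothetical fallback tap detector: flag a foot tap when the acceleration
// magnitude spikes, then ignore further spikes for a short refractory window
// so one physical tap doesn't turn several pages.
const TAP_THRESHOLD_G = 2.5; // spike magnitude, in g (assumed tuning value)
const REFRACTORY_MS = 400;   // minimum gap between detected taps

function createTapDetector(onTap) {
  let lastTapAt = -Infinity;
  return function handleSample({ x, y, z, timestamp }) {
    const magnitude = Math.hypot(x, y, z);
    if (magnitude > TAP_THRESHOLD_G && timestamp - lastTapAt > REFRACTORY_MS) {
      lastTapAt = timestamp;
      onTap(timestamp); // e.g. advance the sheet-music page
      return true;
    }
    return false;
  };
}
```

A learned model replaces the threshold test but the debounce/refractory wrapper stays useful either way.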

02.03.2026 17:16 — 👍 0    🔁 0    💬 0    📌 1

Visualizing pressure and motion data from our smart insoles on smart glasses

With our wearables SDK we can receive realtime pressure and motion data from our smart insoles, and then display them on compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)

26.02.2026 19:12 — 👍 0    🔁 0    💬 1    📌 1

Visualizing jump height on smart glasses

With our wearables SDK, we’re able to calculate short-range positional displacement using IMU data from our smart insoles & motion modules, visualizing the trajectory on compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)
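The displacement idea boils down to numerically double-integrating acceleration. A minimal sketch, assuming samples of the form { az, dt } with az in m/s² (gravity already subtracted) and dt in seconds — real IMU data would also need bias removal and drift correction, which this omits:

```javascript
// Estimate peak jump height by integrating vertical acceleration twice:
// acceleration -> velocity -> position, tracking the maximum position reached.
function estimatePeakHeight(samples) {
  let velocity = 0; // m/s
  let position = 0; // m
  let peak = 0;
  for (const { az, dt } of samples) {
    velocity += az * dt;       // integrate acceleration into velocity
    position += velocity * dt; // integrate velocity into position
    if (position > peak) peak = position;
  }
  return peak; // meters
}
```

For a takeoff velocity v0, the analytic peak is v0²/(2g), which is a handy sanity check on the integrator.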

23.02.2026 18:07 — 👍 1    🔁 0    💬 0    📌 1

Work out with realtime feedback on your smart glasses

With our Wearables SDK, we can visualize IMU data from our motion modules (which can be mounted on the wrist or ankle depending on the workout), guiding the user to perform reps at a controlled tempo for maximum effectiveness
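The tempo-guidance feedback could be as simple as comparing each rep's measured duration to a target. A hypothetical sketch (the target and tolerance are illustrative, not values from the actual demo):

```javascript
// Compare a rep's duration against a target tempo (e.g. 2 s up / 2 s down)
// and return a short cue suitable for a glasses display.
function tempoCue(repDurationMs, targetMs = 4000, toleranceMs = 750) {
  if (repDurationMs < targetMs - toleranceMs) return "slow down"; // rushed rep
  if (repDurationMs > targetMs + toleranceMs) return "speed up";  // stalled rep
  return "good tempo";
}
```

Rep boundaries themselves would come from the IMU stream (e.g. zero-crossings of the dominant axis), with this function deciding what to show.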

09.02.2026 19:34 — 👍 1    🔁 0    💬 0    📌 1

Running an Image Classifier on smart glasses, using @edgeimpulse.com to create the model

02.02.2026 17:20 — 👍 0    🔁 0    💬 0    📌 1

Streaming smart glasses camera data to a Quest 3

Using our Wearables SDK, we’re able to stream images from the Omi Glass (based on the @seeedstudio.com XIAO ESP32-S3 running custom firmware) to a Quest 3

05.01.2026 16:49 — 👍 0    🔁 0    💬 0    📌 1

Local AI Assistant on Smart Glasses

Using our wearables SDK to take a picture, speak a prompt with Whisper speech-to-text, and use Chrome’s Prompt API to generate a response that is displayed back on the glasses’ display
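The pipeline — capture, transcribe, generate, display — can be sketched independently of the hardware by injecting each stage. This is hypothetical wiring, not the SDK's actual API; in the demo the stages would be the glasses camera, a local Whisper model, and Chrome's Prompt API:

```javascript
// Orchestrate the assistant flow: camera frame -> spoken prompt -> local LLM
// response -> render on the glasses. Each stage is an injected async function,
// so the flow itself is testable without any hardware.
async function runAssistant({ takePicture, transcribe, generate, display }) {
  const image = await takePicture();            // frame from the glasses camera
  const prompt = await transcribe();            // spoken prompt -> text (Whisper)
  const answer = await generate(prompt, image); // local model response (Prompt API)
  await display(answer);                        // render back on the glasses
  return answer;
}
```

Keeping every stage behind an async interface also makes it easy to swap a cloud model in or out, matching the "no cloud required" setup.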

30.10.2025 19:42 — 👍 1    🔁 0    💬 0    📌 1

Spotify Player with Lyrics on Smart Glasses

Using our Wearables SDK to experiment with a music player on compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)

Also using a Mudra Band and TapXR for hand controls

20.10.2025 20:24 — 👍 1    🔁 0    💬 1    📌 1

Realtime maps in your smart glasses

Using our wearables SDK to display a simple map overlay, retrieving nearby map data from @openstreetmap.bsky.social
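Fetching "nearby" OpenStreetMap data starts with the standard slippy-map tile math: converting the user's latitude/longitude into tile coordinates at a zoom level. This is the well-known OSM formula, not anything specific to the SDK:

```javascript
// Convert latitude/longitude to OpenStreetMap tile coordinates at a zoom
// level (the "slippy map" scheme): longitude maps linearly, latitude through
// the Web Mercator projection.
function latLonToTile(lat, lon, zoom) {
  const n = 2 ** zoom; // tiles per axis at this zoom
  const x = Math.floor(((lon + 180) / 360) * n);
  const latRad = (lat * Math.PI) / 180;
  const y = Math.floor(
    ((1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2) * n
  );
  return { x, y };
}
```

An overlay would fetch the tile at the user's position (plus its neighbors) and draw the features it needs.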

15.10.2025 15:40 — 👍 1    🔁 0    💬 0    📌 1

Recognize people with smart glasses

Using our wearables SDK to look for certain people and display relevant information when someone is recognized

Works with compatible smart glasses (in this case the Brilliant Labs Frame running custom firmware)

09.10.2025 17:09 — 👍 0    🔁 0    💬 0    📌 1

Experimenting with hand controls on smart glasses

With our Wearables SDK, we can both display a 3D wireframe scene on compatible smart glasses and control a cursor by strapping one of our motion modules to our wrist

We’re able to detect pinches by running an @edgeimpulse.com model
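The cursor half of this can be sketched as a mapping from wrist orientation to display coordinates. Everything here is assumed: the 640×400 display size, the gain, and that the motion module exposes fused yaw/pitch in radians:

```javascript
// Map wrist yaw/pitch (radians) to a clamped cursor position on an assumed
// 640x400 glasses display. Yaw moves the cursor horizontally, pitch
// vertically (pitching up moves the cursor up).
const DISPLAY_W = 640;
const DISPLAY_H = 400;
const GAIN = 800; // pixels per radian (assumed tuning value)

function orientationToCursor(yaw, pitch) {
  const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));
  return {
    x: clamp(DISPLAY_W / 2 + yaw * GAIN, 0, DISPLAY_W - 1),
    y: clamp(DISPLAY_H / 2 - pitch * GAIN, 0, DISPLAY_H - 1),
  };
}
```

A pinch classifier (like the Edge Impulse model above) then acts as the "click" for whatever the cursor is over.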

06.10.2025 15:47 — 👍 0    🔁 0    💬 0    📌 1

View yourself in the third person with smart glasses

Using our Wearables SDK, we’re able to visualize MediaPipe tracking data (body, hands, and face) on compatible smart glasses (in this case the Brilliant Labs Frame running our custom firmware)

Great for yoga, working out, dancing, etc

29.09.2025 16:16 — 👍 2    🔁 0    💬 0    📌 1

Translation on Smart Glasses

Added foreign language support to our Wearables SDK, allowing us to display languages like Chinese, Japanese, and Korean

In this case we’re using @hf.co models for both realtime transcription and translation - all done locally (no cloud required)

22.09.2025 18:46 — 👍 2    🔁 1    💬 0    📌 1

Boxing on smart glasses

Added wireframe support to our Wearables SDK, allowing us to display realtime 3D graphics (like A-Frame and @threejs.org) on compatible smart glasses (in this case the Brilliant Labs Frame)

15.09.2025 13:42 — 👍 1    🔁 1    💬 0    📌 1

Realtime Subtitles on Smart Glasses, using our Wearables SDK

We used @hf.co’s Whisper model to transcribe microphone data locally, then displayed the transcription on the smart glasses’ display (in this case the Brilliant Labs Frame) - no cloud required

08.09.2025 20:21 — 👍 1    🔁 0    💬 0    📌 1

Added sprite support to our wearables SDK

Great for stuff like emulating 3D graphics via billboarding
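Billboarding with sprites usually means picking which pre-rendered view of an object to show based on where the camera is, the classic Doom-style trick. A small sketch of that selection step (the 8-frame count is illustrative):

```javascript
// Pick the sprite frame that best matches the viewing angle: take the angle
// from the object's facing direction to the camera, wrap it into [0, 2*pi),
// and quantize to the nearest of frameCount pre-rendered directions.
function spriteFrameIndex(cameraAngle, objectFacing, frameCount = 8) {
  const TWO_PI = Math.PI * 2;
  let rel = (cameraAngle - objectFacing) % TWO_PI;
  if (rel < 0) rel += TWO_PI; // JS % keeps the sign of the dividend
  return Math.round(rel / (TWO_PI / frameCount)) % frameCount;
}
```

The chosen frame is then drawn as a flat sprite that always faces the camera, which is what makes it read as 3D.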

25.08.2025 18:06 — 👍 1    🔁 0    💬 0    📌 1

Added Font support to our Brilliant Sole Wearables’ JavaScript SDK

18.08.2025 18:52 — 👍 0    🔁 0    💬 0    📌 1

Added Bitmap support to our Brilliant Sole Wearables’ JavaScript SDK

11.08.2025 19:27 — 👍 2    🔁 0    💬 0    📌 1

Our wearables’ JavaScript SDK works great with React-like environments

This is a simple example using @nextjs.org, @supabase.com, and @tailwindcss.com to make a basic data collection website that can both record and view sensor data

04.08.2025 16:02 — 👍 1    🔁 0    💬 0    📌 1

Detect head gestures on your smart glasses

Made a simple @edgeimpulse.com model to detect nods and head shakes

28.07.2025 15:34 — 👍 1    🔁 0    💬 0    📌 1

Smart glasses webcam

Created a virtual webcam using OBS and our SwiftUI camera demo

21.07.2025 17:02 — 👍 1    🔁 0    💬 0    📌 1

Express yourself with smart glasses

Using MediaPipe’s face tracking to visualize a user’s eyes and eyebrows on a smart glasses display (in this case the Brilliant Labs Frame running our custom firmware)

14.07.2025 17:55 — 👍 0    🔁 0    💬 0    📌 1

Our smart glasses display simulator also works in WebXR, allowing us to preview AR glasses applications on a Quest 3

07.07.2025 15:35 — 👍 1    🔁 0    💬 0    📌 1

Added a smart glasses display simulator to our JavaScript SDK

Preview what your AR glasses applications will look like using your webcam, a video file, or even an interactive 3D scene

30.06.2025 16:03 — 👍 2    🔁 0    💬 0    📌 1

Added display support for smart glasses to our JavaScript SDK

We created a Canvas-like interface for drawing primitives onscreen for smart glasses with displays, like the Brilliant Labs Frame

Easily draw rectangles, circles, rounded rectangles, ellipses, polygons, and segments
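A Canvas-like interface over a remote display typically records commands into a buffer that gets flushed to the device. A minimal sketch of that shape — the method names are illustrative, not the SDK's actual API:

```javascript
// Record drawing primitives into a command buffer; flush() hands back the
// batch (which a real implementation would serialize and send to the glasses)
// and resets for the next frame.
class DisplayContext {
  constructor() {
    this.commands = [];
  }
  drawRect(x, y, w, h) { this.commands.push({ op: "rect", x, y, w, h }); }
  drawCircle(x, y, r) { this.commands.push({ op: "circle", x, y, r }); }
  drawPolygon(points) { this.commands.push({ op: "polygon", points }); }
  drawSegment(x1, y1, x2, y2) { this.commands.push({ op: "segment", x1, y1, x2, y2 }); }
  flush() {
    const batch = this.commands;
    this.commands = [];
    return batch;
  }
}
```

Batching per frame keeps Bluetooth traffic down compared to sending one packet per primitive.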

23.06.2025 14:52 — 👍 0    🔁 0    💬 0    📌 1

Running speech recognition on smart glasses 👓

While porting our firmware to the Brilliant Labs Frame, we added microphone support to our SDKs, allowing us to run @hf.co’s Whisper Web to transcribe speech in the browser

16.06.2025 15:47 — 👍 1    🔁 0    💬 0    📌 1

Hand, face, and object tracking on smart glasses

Running MediaPipe and @hf.co models on the Brilliant Labs Frame camera data

09.06.2025 14:34 — 👍 0    🔁 0    💬 0    📌 1

Streaming camera data from smart glasses to a smart watch via Bluetooth

While porting our firmware to the Brilliant Labs Frame, we added camera support to our SDKs

02.06.2025 14:15 — 👍 1    🔁 0    💬 0    📌 1

Ported our firmware to the Brilliant Labs Frame 👓, making it compatible with our SDKs

26.05.2025 15:02 — 👍 1    🔁 0    💬 0    📌 1