Bionic Vision Lab

@bionicvisionlab.org.bsky.social

πŸ‘οΈπŸ§ πŸ–₯️πŸ§ͺπŸ€– What would the world look like with a bionic eye? Interdisciplinary research group at UC Santa Barbara. PI: @mbeyeler.bsky.social‬ #BionicVision #Blindness #NeuroTech #VisionScience #CompNeuro #NeuroAI

450 Followers | 166 Following | 124 Posts | Joined: 23.09.2024

Latest posts by bionicvisionlab.org on Bluesky

Epic collage of Bionic Vision Lab activities. From top to bottom, left to right:
A) Up-to-date group picture
B) BVL at Dr. Beyeler's Plous Award celebration (2025)
C) BVL at The Eye & The Chip (2023)
D/F) Dr. Aiwen Xu and Justin Kasowski getting hooded at the UCSB commencement ceremony
E) BVL logo cake created by Tori LeVier
G) Dr. Beyeler with symposium speakers at Optica FVM (2023)
H, I, M, N) Students presenting conference posters/talks
J) Participant scanning a food item (ominous pizza study)
K) Galen Pogoncheff in VR
L) Argus II user drawing a phosphene
O) Prof. Beyeler demoing BionicVisionXR
P) First lab hike (ca. 2021)
Q) Statue for winner of the Mac'n'Cheese competition (ca. 2022)
R) BVL at Club Vision
S) Students drifting off into the sunset on a floating couch after a hard day's work

Excited to share that I've been promoted to Associate Professor with tenure at UCSB!

Grateful to my mentors, students, and funders who shaped this journey and to @ucsantabarbara.bsky.social for giving the Bionic Vision Lab a home!

Full post: www.linkedin.com/posts/michae...

02.08.2025 18:12 · 👍 18 🔁 4 💬 1 📌 0
Program – EMBC 2025

At #EMBC2025? Come check out two talks from my lab in tomorrow's Sensory Neuroprostheses session!

πŸ—“οΈ Thurs July 17 Β· 8-10AM Β· Room B3 M3-4
🧠 Efficient threshold estimation
πŸ§‘πŸ”¬ Deep human-in-the-loop optimization

πŸ”— embc.embs.org/2025/program/
#BionicVision #NeuroTech #IEEE #EMBS

16.07.2025 16:54 · 👍 3 🔁 1 💬 0 📌 0
Efficient spatial estimation of perceptual thresholds for retinal implants via Gaussian process regression | Bionic Vision Lab
We propose a Gaussian Process Regression (GPR) framework to predict perceptual thresholds at unsampled locations while leveraging uncertainty estimates to guide adaptive sampling.

🧠 Building on Roksana Sadeghi's work: Calibrating retinal implants is slow and tedious. Can Gaussian Process Regression (GPR) guide smarter sampling?

✅ GPR + spatial sampling = fewer trials, same accuracy
🔁 Toward faster, personalized calibration

🔗 bionicvisionlab.org/publications...

#EMBC2025
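For the curious, here is a minimal sketch of what GPR-guided adaptive sampling can look like, using scikit-learn. It is illustrative only, not the paper's implementation: the toy threshold function, grid geometry, kernel choice, and the sample-where-most-uncertain acquisition rule are all assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical ground truth: perceptual threshold varies smoothly
# with (x, y) electrode position on a simulated implant.
def true_threshold(xy):
    return 50 + 20 * np.sin(xy[:, 0] / 2) + 10 * np.cos(xy[:, 1] / 3)

# Candidate electrode locations: a 10x10 grid (arbitrary units).
xx, yy = np.meshgrid(np.arange(10.0), np.arange(10.0))
grid = np.column_stack([xx.ravel(), yy.ravel()])

# Seed with a handful of randomly measured electrodes.
measured = list(rng.choice(len(grid), size=5, replace=False))
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=3.0) + WhiteKernel(noise_level=1.0),
    normalize_y=True)

for _ in range(20):  # total measurement budget
    X = grid[measured]
    y = true_threshold(X) + rng.normal(0.0, 1.0, len(X))  # noisy psychophysics
    gp.fit(X, y)
    # Acquisition rule: measure next wherever the GP is least certain.
    _, std = gp.predict(grid, return_std=True)
    std[measured] = -np.inf  # never re-measure a sampled electrode
    measured.append(int(np.argmax(std)))

mean, std = gp.predict(grid, return_std=True)
print(f"Sampled {len(measured)}/{len(grid)} electrodes; "
      f"max posterior std = {std.max():.2f}")
```

Uncertainty sampling is the simplest possible acquisition strategy; the point is just that the GP's own error bars tell you which electrode to test next, so the full grid never has to be measured.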

13.07.2025 17:24 · 👍 2 🔁 0 💬 0 📌 0
Evaluating deep human-in-the-loop optimization for retinal implants using sighted participants | Bionic Vision Lab
We evaluate HILO using sighted participants viewing simulated prosthetic vision to assess its ability to optimize stimulation strategies under realistic conditions.

🎓 Proud of our undergrad(!) Eirini Schoinas for leading this:
bionicvisionlab.org/publications...

🧠 Human-in-the-loop optimization (HILO) works in silico, but does it hold up with real people?
✅ HILO outperformed naïve and deep encoders
🔁 A step toward personalized #BionicVision

#EMBC2025
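For context, the human-in-the-loop pattern can be sketched in a few lines: propose a candidate stimulation strategy, show the participant two renderings, keep whichever they prefer, and repeat. The toy below uses a simulated participant and random perturbations; the actual method pairs a deep stimulus encoder with a preference-based optimizer, so treat every name and parameter here as hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
HIDDEN_OPTIMUM = np.array([1.0, 0.5])  # unknown to the optimizer

def render_percept(params):
    """Stand-in for encoder + phosphene model; would produce an image."""
    return params

def participant_prefers_a(percept_a, percept_b):
    """Simulated participant: favors the percept closer to their optimum."""
    return (np.linalg.norm(percept_a - HIDDEN_OPTIMUM)
            < np.linalg.norm(percept_b - HIDDEN_OPTIMUM))

best = rng.uniform(0.0, 2.0, size=2)  # initial encoder parameters
for trial in range(50):
    candidate = best + rng.normal(0.0, 0.3, size=2)  # propose a variation
    if not participant_prefers_a(render_percept(best),
                                 render_percept(candidate)):
        best = candidate  # participant preferred the candidate
print("Tuned parameters:", np.round(best, 2))
```

Random perturbation needs far more comparisons than a real participant could sit through; replacing it with a model-based acquisition strategy is what makes HILO practical, which is the gap the deep HILO approach addresses.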

13.07.2025 17:24 · 👍 1 🔁 0 💬 1 📌 0
Program – EMBC 2025

πŸ‘οΈβš‘ Headed to #EMBC2025? Catch two of our lab’s talks on optimizing retinal implants!

πŸ“ Sensory Neuroprostheses
πŸ—“οΈ Thurs July 17 Β· 8-10AM Β· Room B3 M3-4
🧠 Efficient threshold estimation
πŸ§‘πŸ”¬ Deep human-in-the-loop optimization

πŸ”— embc.embs.org/2025/program/
#BionicVision #NeuroTech #IEEE #EMBS #Retina

13.07.2025 17:24 · 👍 0 🔁 2 💬 1 📌 0

This matters. Checkerboard rastering:

βœ”οΈ works across tasks
βœ”οΈ requires no fancy calibration
βœ”οΈ is hardware-agnostic

A low-cost, high-impact tweak that could make future visual prostheses more usable and more intuitive.

#BionicVision #BCI #NeuroTech

09.07.2025 16:55 · 👍 1 🔁 0 💬 0 📌 0
Boxplots showing task accuracy for two experimental tasks (Letter Recognition and Motion Discrimination), grouped by five raster patterns: No Raster (blue), Checkerboard (orange), Vertical (green), Horizontal (brown), and Random (pink). Each colored boxplot shows the median, interquartile range, and individual participant data points.

In both tasks, Checkerboard and No Raster yield the highest median accuracy.

Horizontal and Random patterns perform the worst, with more variability and lower scores.

Significant pairwise differences (p < .05) are indicated by horizontal bars above the plots, showing that Checkerboard significantly outperforms Random and Horizontal in both tasks.

A dashed line at 0.125 marks chance-level performance (1 out of 8).

These results suggest Checkerboard rastering improves perceptual performance compared to conventional or unstructured patterns.

✅ Checkerboard consistently outperformed the other patterns: higher accuracy, lower difficulty, fewer motion artifacts.

💡 Why? More spatial separation between activations = less perceptual interference.

It even matched performance of the ideal "no raster" condition, without breaking safety rules.

09.07.2025 16:55 · 👍 1 🔁 0 💬 1 📌 0
Diagram showing the four-step pipeline for simulating prosthetic vision in VR.
Step 1: A virtual camera captures the user's view, guided by eye gaze. The image is converted to grayscale and blurred for preprocessing.
Step 2: The preprocessed image is mapped onto a simulated retinal implant with 100 electrodes. Electrodes are activated based on local image intensity and grouped into raster groups. Raster Group 1 is highlighted.
Step 3: Simulated perception is shown with and without rastering. Without rastering (top), all electrodes are active, producing a more complete but unrealistic percept. With rastering (bottom), only 20 electrodes are active per frame, resulting in a temporally fragmented percept. Phosphene shape depends on parameters for spatial spread (ρ) and elongation (λ).
Step 4: The rendered percept is updated with temporal effects and presented through a virtual reality headset.

We ran a simulated prosthetic vision study in immersive VR using gaze-contingent, psychophysically grounded models of epiretinal implants.

🧪 Powered by BionicVisionXR.
📍 Modeled 100-electrode Argus-like array.
👀 Realistic phosphene appearance, eye/head tracking.
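A rough numpy sketch of steps 1-3 of the pipeline above: preprocess a frame, sample it at a simulated 10×10 electrode array, keep only the currently active raster group, and draw each active electrode as a Gaussian phosphene. The geometry, threshold value, and circular phosphene model (spread ρ only, no elongation λ) are simplifying assumptions, not the BionicVisionXR implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_frame(frame, raster_groups, active_group, rho=8.0):
    """Render one rastered prosthetic-vision frame.

    frame: 2D grayscale image, values in [0, 1]
    raster_groups: (10, 10) int array assigning each electrode to a group
    active_group: which timing group fires on this frame
    rho: phosphene spatial spread in pixels (circular; no elongation here)
    """
    h, w = frame.shape
    blurred = gaussian_filter(frame, sigma=4)           # Step 1: preprocess

    # Step 2: sample image intensity at each electrode's position.
    ys = np.linspace(0, h - 1, 10).astype(int)
    xs = np.linspace(0, w - 1, 10).astype(int)
    percept = np.zeros_like(frame)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            if raster_groups[i, j] != active_group:     # Step 3: rastering
                continue
            amp = blurred[y, x]
            if amp < 0.1:                               # below threshold
                continue
            # Render a circular Gaussian phosphene at the electrode site.
            yy, xx = np.ogrid[:h, :w]
            percept += amp * np.exp(-((yy - y) ** 2 + (xx - x) ** 2)
                                    / (2 * rho ** 2))
    return np.clip(percept, 0, 1)
```

Cycling active_group through 0-4 on successive frames reproduces the temporally fragmented percept described in Step 3; Step 4's temporal effects (e.g., fading) would then modulate the percept across frames.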

09.07.2025 16:55 · 👍 0 🔁 0 💬 1 📌 0
Raster pattern configurations used in the study, shown as 10×10 electrode grids labeled with numbers 1 through 5, representing five sequentially activated timing groups (a construction sketch follows the description).

1. Horizontal: Each row of electrodes belongs to one group, with activation proceeding top to bottom.

2. Vertical: Each column is a group, activated left to right.

3. Checkerboard: Electrode groups are arranged to maximize spatial separation, forming a checkerboard-like layout.

4. Random: Group assignments are randomly distributed across the grid, with no spatial structure. This pattern was re-randomized every five frames to test unstructured activation.
Each group is represented with different shades of gray and labeled numerically to indicate activation order.
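As promised, here is one way to construct these group assignments for a 10×10 grid. The checkerboard rule below ((2·row + col) mod 5) is just one plausible assignment that spreads same-group electrodes apart; the study's exact layout may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
N, GROUPS = 10, 5  # 10x10 grid, five timing groups

rows, cols = np.indices((N, N))
patterns = {
    "horizontal": rows % GROUPS,   # each row is one group, top to bottom
    "vertical": cols % GROUPS,     # each column is one group, left to right
    # Diagonal-offset rule that keeps same-group electrodes spatially
    # separated, approximating the checkerboard layout (an assumption,
    # not necessarily the paper's exact assignment).
    "checkerboard": (2 * rows + cols) % GROUPS,
    "random": rng.integers(0, GROUPS, size=(N, N)),  # reshuffled periodically
}

for name, groups in patterns.items():
    # Mean nearest-neighbor distance within group 0, a crude measure of
    # spatial separation between simultaneously active electrodes.
    pts = np.argwhere(groups == 0).astype(float)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    print(f"{name:>12}: mean NN distance = {d.min(axis=1).mean():.2f}")
```

Running this prints the average distance to the nearest same-group electrode: the checkerboard-style rule roughly doubles the separation of the row and column patterns, which is the intuition the study set out to test.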

Checkerboard rastering has been used in #BCI and #NeuroTech applications, often based on intuition.

But is it actually better, or just tradition?

No one had rigorously tested how these patterns impact perception in visual prostheses.

So we did.

09.07.2025 16:55 · 👍 0 🔁 0 💬 1 📌 0
Raster patterns in simulated prosthetic vision. On the left, a natural scene of a yellow car is shown, followed by its transformation into a prosthetic vision simulation using a 10×10 grid of electrodes (red dots). Below this, a zoomed-in example shows the resulting phosphene pattern. To comply with safety constraints, electrodes are divided into five spatial groups activated sequentially across ~220 milliseconds. Each row represents a different raster pattern: vertical (columns activated left to right), horizontal (rows top to bottom), checkerboard (spatially maximized separation), and random (reshuffled every five frames). For each pattern, five panels show how the scene is progressively built across the five raster groups. Vertical and horizontal patterns show strong directional streaking. Checkerboard shows more uniform activation and perceptual clarity. Random appears spatially noisy and inconsistent.

πŸ‘οΈπŸ§  New paper alert!

We show that checkerboard-style electrode activation improves perceptual clarity in simulated prosthetic visionβ€”outperforming other patterns in both letter and motion tasks.

Less bias, more function, same safety.

πŸ”— doi.org/10.1088/1741...

#BionicVision #NeuroTech

09.07.2025 16:55 · 👍 1 🔁 0 💬 1 📌 1
Assistive Technology Use In The Home and AI and Adaptive Optics Ophthalmoscopes
Lily Turkstra (University of California - Santa Barbara), Dr. Johnny Tam (National Eye Institute - Bethesda, MD). Lily Turkstra, PhD Student,...

πŸŽ™οΈOur very own Lily Turkstra was featured on WYPL-FM’s Eye on Vision podcast to discuss how blind individuals use assistive tech at home, from tactile labels to digital tools.

πŸ“» Listen: eyeonvision.blogspot.com/2025/05/assi...
πŸ“° Read: bionicvisionlab.org/publications...

#BlindTech #Accessibility

24.06.2025 20:14 · 👍 2 🔁 1 💬 0 📌 0
bionic-vision.org | Research Spotlights | Frederik Ceyssens, ReVision Implant
Frederik Ceyssens is Co-Founder and CEO of ReVision Implant, the company behind Occular: a next-generation cortical prosthesis designed to restore both central and peripheral vision through ultra-flex...

πŸ‘οΈπŸ§ πŸ§ͺ Next on the Horizon: Frederik Ceyssens from ReVision Implant on scaling bionic vision to the cortex with Occular, a high-res, deep-brain prosthesis.

Why performance might beat invasiveness - and what comes next:
www.bionic-vision.org/research-spo...

#BionicVision #NeuroTech #BCI

12.06.2025 17:31 · 👍 0 🔁 2 💬 0 📌 0
VSS Presentation – Vision Sciences Society

Last but not least is Lily Turkstra, whose poster assesses the efficacy of visual augmentations for high-stress navigation:

Tue, 2:45 - 6:45pm, Pavilion: Poster #56.472
www.visionsciences.org/presentation...

πŸ‘οΈπŸ§ͺ #XR #VirtualReality #Unity3D #VSS2025

20.05.2025 14:10 · 👍 3 🔁 2 💬 0 📌 0
Schematic illustrating the phosphenes elicited by an intracortical prosthesis. A 96-channel Utah array is shown, stimulated with biphasic pulse trains. An arrow points to drawings of visual percepts elicited by electrical stimulation

Jacob Granley headshot

Coming up: Jacob Granley on whether V1 maintains working memory via spiking activity. Prior evidence comes from fMRI and LFPs; now, rare intracortical recordings in a blind human offer a chance to test it directly. 👁️ #VSS2025

🕥 Sun 10:45pm · Talk Room 1
🧠 www.visionsciences.org/presentation...

18.05.2025 12:26 · 👍 5 🔁 2 💬 1 📌 0
Example stimulus from Byron's image dataset: an unobscured version (left) depicting a man on a bike approaching a woman trying to cross the bike lane; a simulation of peripheral vision loss (center), where the woman is clearly visible but the man on the bike is obscured; and a simulation of central vision loss (right), where the man on the bike is apparent but the woman is obscured

Byron Johnson headshot

Our @bionicvisionlab.org team is at #VSS2025 with 2 talks and a poster!

First up is PhD Candidate Byron A. Johnson:

Fri, 4:30pm, Talk Room 1: Differential Effects of Peripheral and Central Vision Loss on Scene Perception and Eye Movement Patterns

www.visionsciences.org/presentation...

16.05.2025 19:27 · 👍 8 🔁 3 💬 1 📌 0
Michael Beyeler smiles while receiving the framed Plous Award at UC Santa Barbara. College of Letters & Science's Dean Shelly Gable presents the award in front of a slide thanking collaborators and funders, with photos of colleagues and logos from NIH and the Institute for Collaborative Biotechnologies. The audience watches the moment from their seats.

The lecture hall of Mosher Alumni House is packed as Prof. Beyeler gets started with his lecture titled "Learning to See Again: Building a Smarter Bionic Eye"

Michael Beyeler stands with members of the Bionic Vision Lab in front of a congratulatory banner celebrating his 2024–25 UCSB Plous Award. Everyone is smiling, with some holding drinks, and Michael is holding his young son. The group is gathered outdoors under string lights, with tall eucalyptus trees in the background.

Not usually one to post personal pics, but let's take a break from doomscrolling, yeah?

Some joyful moments from the Plous Award Ceremony: Honored to give the lecture, receive the framed award & celebrate with the people who made it all possible!

@bionicvisionlab.org @ucsantabarbara.bsky.social

17.04.2025 16:16 · 👍 4 🔁 3 💬 1 📌 0
Michael Beyeler | 2024-2025 Plous Award Lecture
"Learning to See Again: Building a Smarter Bionic Eye" What does it mean to see with a bionic eye? While modern visual prosthetics can generate flashes of light, they don't yet restore natural…

Check out his talk, "Learning to See Again: Building a Smarter Bionic Eye," on Monday, April 14th, 4pm-6pm at Mosher Alumni House. 💡

For more info click here: www.campuscalendar.ucsb.edu/event/beyele...

11.04.2025 18:00 · 👍 2 🔁 2 💬 0 📌 1
bionic eye glasses

Modern visual prosthetics can generate flashes of light but don't restore natural vision. What might bionic vision look like?

In CS assistant professor Michael Beyeler's @bionicvisionlab.org, his team explores how smarter, more adaptive tech could move toward a bionic eye that's functional & usable. 👁️

11.04.2025 18:00 · 👍 1 🔁 1 💬 1 📌 0
Examples of discrete neuronal network models of the human retina, including the central (top) and peripheral retina (bottom). Photoreceptors, bipolar cells, and ganglion cells are shown.

Virtual Human Retina: A simulation platform designed for studying human retinal degeneration and optimizing stimulation strategies for retinal implants 👁️🧠🧪

doi.org/10.1016/j.br...

23.01.2025 21:17 · 👍 3 🔁 3 💬 1 📌 1
Seeing the future: Michael Beyeler's work in neurotechnology earns him top faculty award
Recognized for outstanding contributions in research, teaching and service, as well as "his dedication to innovation, excellence and student success," the researcher behind the "bionic eye" receives o...

"Seeing the future": More media coverage about @mbeyeler.bsky.social winning the 2024-25 Harold J. Plous Memorial Award @ucsantabarbara.bsky.social

news.ucsb.edu/2024/021709/...

17.12.2024 19:46 · 👍 10 🔁 4 💬 0 📌 0
Outstanding Contributions
Assistant professor Michael Beyeler is selected for the highly regarded Harold J. Plous Memorial Award.

Our PI @mbeyeler.bsky.social has received the 2024-'25 Harold J. Plous Memorial Award from @ucsb.bsky.social, in recognition of his "outstanding contributions in research, teaching, and service."

New article by @ucsbengineering.bsky.social:
engineering.ucsb.edu/news/outstan...

13.12.2024 19:26 · 👍 4 🔁 3 💬 0 📌 0
Scalable Neural Interfaces – Opportunity seeds - ARIA
To support scientific and technological breakthroughs, outside of programmes, ARIA Programme Directors can award opportunity seeds to support ambitious research aligned to the opportunity spaces they'…

Please share: ARIA is offering £500k seed awards for neural interface projects. Apply by 13 Feb 2025.

I've been impressed with ARIA's scientific team and overall execution so far. They deserve your best ideas!

www.aria.org.uk/scalable-neu...

02.12.2024 16:33 · 👍 20 🔁 16 💬 2 📌 1
Biohybrid neural interfaces: an old idea enabling a completely new space of possibilities | Science Corporation
Science Corporation is a clinical-stage medical technology company.

I had an idea way back in college which I've long thought could be, in many ways, the ultimate BCI technology. What if instead of using electrodes, we used biological neurons embedded in electronics to communicate with the brain?

Enter biohybrid neural interfaces: science.xyz/news/biohybr...

23.11.2024 19:49 · 👍 84 🔁 18 💬 4 📌 9

Aligning visual prosthetic development with implantee needs: Now out in TVST!

#BionicVision #NeuroTech #BCI #Blindness #Accessibility

21.11.2024 17:31 · 👍 1 🔁 2 💬 0 📌 0
The Road to a Successful PhD Application
Disclaimer: This guide was written by Prof. Michael Beyeler, with input from faculty colleagues. It does not necessarily reflect the views of all CS faculty me...

Starting your #PhD journey? Crafting a standout Statement of Purpose is key. Here's a guide with tips & common pitfalls to help you succeed (tailored to #CompSky at #UCSB, but much of it should apply to most depts):

docs.google.com/document/d/1...

#AcademicSky #AcademicChatter #FirstGen #CSforALL

20.11.2024 19:24 · 👍 10 🔁 5 💬 0 📌 0

Applying to the PhD program in #CompSky at #UCSB? It's not too late to get expert feedback on your application materials! 🧪

New deadline: Nov 22 Anywhere on Earth

cs.ucsb.edu/education/gr...

#Diversity #FirstGen #STEM #WomenInSTEM #CSforALL #AcademicSky

19.11.2024 02:04 · 👍 4 🔁 2 💬 0 📌 0
Startups Like Science and Musk's Neuralink Aim to Help Blind People See
Technological advances have spurred an "explosion of interest" in restoring vision.

New Bloomberg article putting recent developments in #BionicVision into perspective:
www.bloomberg.com/news/article...

with quotes from Max Hodak (Science), Xing Chen (Pitt), Yağmur Güçlütürk (Radboud), @mbeyeler.bsky.social (UCSB), and others

#NeuroTech #BCI #Neuralink #PRIMA #Phosphoenix #ReVision

13.11.2024 20:56 · 👍 2 🔁 2 💬 0 📌 0
Jacob Granley from the Bionic Vision Lab at the podium, presenting his talk on control of electrically evoked neural activity via deep neural networks in human visual cortex

The 2nd Brain & The Chip is underway in Elche, Spain!

Leading #BionicVision and #BCI researchers are gathering to discuss latest developments in intracortical neural interfaces.

www.bionic-vision.org/events/brain...

12.11.2024 16:51 · 👍 1 🔁 1 💬 2 📌 1

Folks, please do your best to include "alt text" for any images you post. When blind users use Bluesky, this enables them to engage with the images you post. #Accessibility #AltText

11.11.2024 05:04 · 👍 8 🔁 4 💬 0 📌 0
YouTube video by What is The Future for Cities?: "What is accessibility and integration in urban futures?" Lucas Gil Nadolskis (270I)

"When you say someone is disabled, there's an assumption they're not functional, which couldn't be more wrong!"

Our PhD student Lucas Nadolskis shares insights on #Accessibility, #Inclusion vs. #Integration, and reimagining cities for all:
www.youtube.com/watch?v=KDSn...

12.11.2024 05:04 · 👍 1 🔁 1 💬 0 📌 0
