Michael Wiesing

@vrneuroscience.bsky.social

He/Him. Postdoc @EventLab @UniBarcelona. Open VR environments for research (CC0): gitlab.com/eventlabprojects/openenvironments. VR research feed: https://bsky.app/profile/did:plc:wkncjmnqrxjmx7dskjju2nii/feed/vrcademicsky http://tinyurl.com/27p67z27

664 Followers  |  435 Following  |  476 Posts  |  Joined: 18.08.2023

Latest posts by vrneuroscience.bsky.social on Bluesky

#VRresearch

05.08.2025 15:40 | 👍 1    🔁 0    💬 0    📌 0
Post image

Explore XR + cognition at #VRS2025!

VR Summit (Oct 20-21) & VRS Hackathon (Oct 18-19) in Bochum 🇩🇪. Prototype next-gen haptics, share research, connect across disciplines.

Register Now → vrs.rub.de

#VRS2025 #VRresearch #VirtualReality #Bochum #VRcademicSky

01.08.2025 11:19 | 👍 4    🔁 5    💬 0    📌 0

#VRresearch

02.08.2025 08:54 | 👍 1    🔁 0    💬 0    📌 0
Post image

Help develop real feelings in virtual worlds!
We're looking for sponsors for the VRS Hackathon 2025, where interdisciplinary teams will push the boundaries of haptic feedback in VR.

Want your brand at the heart of innovation? Let's talk.
Learn more: vrs.rub.de

#VRresearch

30.07.2025 11:15 | 👍 1    🔁 3    💬 0    📌 0
Post image

Fun fact: this year the VRS will also feature a virtual conference.
More on: vrs.ruhr-uni-bochum.de/virtual-vrs/
#VRresearch

28.07.2025 21:03 | 👍 4    🔁 1    💬 1    📌 0
Post images (×4)

A sneak peek at the new venue for the VRS. Can't get any cooler. @vrsummit.bsky.social #vrcademicsky

22.07.2025 11:10 | 👍 3    🔁 1    💬 1    📌 1

VRS Hackathon 2025: Feel the Future

#VRresearch

15.07.2025 18:33 | 👍 3    🔁 2    💬 0    📌 0
VRS Hackathon 2025: Feel the Future – XR Presence Project

This year's VRS Hackathon is all about "Feel the Future." We're teaming up with the PRESENCE project and SenseGlove to bring you the latest in haptic tech with the Nova 2 gloves. Want to know more? I wrote a short blog post about it.

#VRresearch

presence-xr.eu/vrs-hackatho...

15.07.2025 15:42 | 👍 3    🔁 0    💬 0    📌 2
Post image

Proud that our VR course went online as part of the project "Deep Tech Brain: Virtual Reality, Artificial Intelligence, Neuroimaging, and Neuromodulation" here at the UB.
#VRresearch

15.07.2025 11:18 | 👍 3    🔁 0    💬 0    📌 0

#VRresearch

14.07.2025 12:36 | 👍 4    🔁 1    💬 0    📌 0
Post image

Google Scholar has started to doubt my humanity. Guess I've been reading too many papers lately. 🤡

14.07.2025 11:38 | 👍 0    🔁 0    💬 0    📌 0

More info on vrsummit.ruhr-uni-bochum.de
#VRresearch

09.07.2025 08:46 | 👍 0    🔁 0    💬 0    📌 0

This is hilarious!

03.07.2025 18:30 | 👍 15    🔁 1    💬 2    📌 1
01.07.2025 14:28 | 👍 3    🔁 0    💬 0    📌 0

#VRresearch

30.06.2025 11:56 | 👍 1    🔁 1    💬 0    📌 0

#VRcademicSky

25.06.2025 18:04 | 👍 0    🔁 0    💬 0    📌 0

[1/3] How do the mechanisms behind collective behaviour work? How can VR help us better understand how visual cues influence decision-making processes? 👁️ 🧠 🦆

24.06.2025 08:52 | 👍 2    🔁 2    💬 1    📌 0

#VRcademicSky

21.06.2025 09:11 | 👍 3    🔁 2    💬 0    📌 0

Why is it such a pain to change languages in Windows? Just setting up my new laptop, and somehow I've ended up with some parts in German, some in English, and others in Spanish.

20.06.2025 17:27 | 👍 0    🔁 0    💬 0    📌 0

Blueskyfeeds is shutting down, so I moved the VRcademicSky feed to Bluesky Feed Creator.
Like before, if you tag your posts with #vrcademicsky, #vracademicsky, #vrresearch or #vresearch, they will appear in the feed.

bsky.app/profile/vrne...

20.06.2025 14:19 | 👍 2    🔁 1    💬 0    📌 0
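
For anyone who wants to script against those same tags instead of (or alongside) following the feed, here is a minimal Python sketch. The public app.bsky.feed.searchPosts XRPC endpoint is real; the TAGS list mirrors the post above, while the helper name, limit, and printed format are illustrative assumptions.

import requests

# Tags that the VRcademicSky feed picks up (from the post above).
TAGS = ["#vrcademicsky", "#vracademicsky", "#vrresearch", "#vresearch"]

def search_tag(tag, limit=25):
    # Query Bluesky's public search endpoint for recent posts containing the tag.
    resp = requests.get(
        "https://public.api.bsky.app/xrpc/app.bsky.feed.searchPosts",
        params={"q": tag, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("posts", [])

if __name__ == "__main__":
    for tag in TAGS:
        for post in search_tag(tag):
            # Each post view carries the author's handle and the raw record text.
            text = post.get("record", {}).get("text", "").replace("\n", " ")
            print(f"{post['author']['handle']}: {text[:100]}")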

#VRcademicSky

20.06.2025 09:18 | 👍 0    🔁 0    💬 0    📌 0

#VRcademicSky

20.06.2025 09:01 | 👍 0    🔁 0    💬 0    📌 0
Post image

Find more information on the event and how to join us on our website: invirtuo.org/course/2025-...

#InVirtuoresearch #VRresearch #VRavatars #policetraining

Photo credit: (c) ISEDA project

13.06.2025 08:07 | 👍 0    🔁 1    💬 0    📌 1
The rubber-hand illusion was translated to the mouse model. Just like in humans, embodiment in mice can be achieved by brushing the real forelimb of the mouse and the artificial limb (yellow) in synchrony to generate matching visual and touch percepts. This study suggests that as the mouse looks at the artificial limb, it perceives it as its own limb, and feels threatened if the artificial limb is threatened. Image credit: Luc Estebanez.

The "rubber hand illusion" in mice... @lucestebanez.bsky.social &co use automated videography to show that mice display quantifiable behavioral markers of the embodiment of an #ArtificialLimb, opening the door to future research into human #BodyOwnership disorders @plosbiology.org ๐Ÿงช plos.io/4jHsESn

09.06.2025 16:46 | 👍 24    🔁 5    💬 2    📌 1

Can't imagine the amount of work it takes all around to pull off this simple change 🤯🤯

05.06.2025 19:20 | 👍 5    🔁 1    💬 2    📌 0

Not to forget: sensory overload and rushing from session to session are included as well!

03.06.2025 20:49 | 👍 1    🔁 0    💬 0    📌 0

If you want to render at the highest quality, then definitely yes. But they have a lot of scaling options. I'd mostly be worried about the hair, which has always looked bad in Unreal without cranking up the settings.

03.06.2025 20:29 | 👍 2    🔁 0    💬 0    📌 0
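
Since "scaling options" come up in that reply: a hypothetical editor-Python sketch of dialling Unreal's scalability for digital-human close-ups. The sg.* scalability groups and the r.HairStrands.Enable cvar exist in recent Unreal versions; the particular grouping, the threshold, and driving this from Python are illustrative assumptions, not a recommended pipeline.

import unreal  # only available inside the Unreal Editor's Python environment

def set_closeup_quality(level: int) -> None:
    # Apply one scalability level (0 = low ... 3 = epic) to the groups that
    # matter most for digital-human close-ups.
    for group in ("sg.ShadowQuality", "sg.PostProcessQuality", "sg.EffectsQuality"):
        unreal.SystemLibrary.execute_console_command(None, f"{group} {level}")
    # Strand-based (groom) hair is the big cost mentioned above; below a
    # chosen threshold, disable it entirely (assumed fallback behavior).
    enable_hair = 1 if level >= 2 else 0
    unreal.SystemLibrary.execute_console_command(None, f"r.HairStrands.Enable {enable_hair}")

set_closeup_quality(3)  # crank everything up for the hero shot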
A screenshot with text: MetaHuman 5.6: our most powerful digital human toolset yet
With MetaHuman 5.6, the digital human framework leaves Early Access and is now fully embedded in Unreal Engine. We've updated the Unreal Engine EULA so MetaHuman characters and animation can be used in any engine, such as Unity or Godot, as well as creative software like Maya, Houdini, and Blender, so you can expect to see MetaHumans in more places.

MetaHuman Creator now has enhanced fidelity thanks to improved materials, a larger scan database and improved models for processing scan data, plus new authoring workflows that will vastly expand the range of available options for faces, bodies, and clothing. The update also provides the ability to generate real-time animation from almost any camera or audio with MetaHuman Animator, and expands the MetaHuman ecosystem with new plugins for DCCs, as well as integration with the Fab marketplace. Check out all the updates in our latest MetaHuman blog post.

In the text the following sentence is highlighted: We've updated the Unreal Engine EULA so MetaHuman characters and animation can be used in any engine, such as Unity or Godot, as well as creative software like Maya, Houdini, and Blender, so you can expect to see MetaHumans in more places.

Oh that's nice. MetaHumans can now be used outside of UE.
#VRcademicSky www.unrealengine.com/en-US/news/a...

03.06.2025 20:13 | 👍 3    🔁 0    💬 1    📌 0
Post image

No problems opening cobalt.tools here.

03.06.2025 05:20 | 👍 1    🔁 0    💬 0    📌 0
