@vrneuroscience.bsky.social
He/Him. Postdoc @EventLab @UniBarcelona. Open VR environments for research (CC0): gitlab.com/eventlabprojects/openenvironments. VR research feed: https://bsky.app/profile/did:plc:wkncjmnqrxjmx7dskjju2nii/feed/vrcademicsky http://tinyurl.com/27p67z27
#VRresearch
05.08.2025 15:40 · 👍 1 🔁 0 💬 0 📌 0
Explore XR + cognition at #VRS2025!
VR Summit (Oct 20-21) & VRS Hackathon (Oct 18-19) in Bochum 🇩🇪. Prototype next-gen haptics, share research, connect across disciplines.
Register Now → vrs.rub.de
#VRS2025 #VRresearch #VirtualReality #Bochum #VRcademicSky
#VRresearch
02.08.2025 08:54 · 👍 1 🔁 0 💬 0 📌 0
Help develop real feelings in virtual worlds!
We're looking for sponsors for the VRS Hackathon 2025, where interdisciplinary teams will push the boundaries of haptic feedback in VR.
Want your brand at the heart of innovation? Let's talk.
Learn more: vrs.rub.de
#VRresearch
Fun fact: this year the VRS will also feature a virtual conference.
More at: vrs.ruhr-uni-bochum.de/virtual-vrs/
#VRresearch
A sneak peek at the new venue for the VRS. Can't get any cooler. @vrsummit.bsky.social #vrcademicsky
22.07.2025 11:10 · 👍 3 🔁 1 💬 1 📌 1
VRS Hackathon 2025: Feel the Future
#VRresearch
This year's VRS Hackathon is all about "Feel the Future." We're teaming up with the PRESENCE project and SenseGlove to bring you the latest in haptic tech with the Nova 2 gloves. Want to know more? I wrote a short blog post about it.
#VRresearch
presence-xr.eu/vrs-hackatho...
Proud that our VR course went online as part of the project "Deep Tech Brain: Virtual Reality, Artificial Intelligence, Neuroimaging, and Neuromodulation" here at the UB.
#VRresearch
#VRresearch
14.07.2025 12:36 · 👍 4 🔁 1 💬 0 📌 0
Google Scholar has started to doubt my humanity. Guess I've been reading too many papers lately. 🤡
14.07.2025 11:38 · 👍 0 🔁 0 💬 0 📌 0
More info on vrsummit.ruhr-uni-bochum.de
#VRresearch
This is hilarious!
03.07.2025 18:30 · 👍 15 🔁 1 💬 2 📌 1
#VRresearch
30.06.2025 11:56 · 👍 1 🔁 1 💬 0 📌 0
#VRcademicSky
25.06.2025 18:04 · 👍 0 🔁 0 💬 0 📌 0
[1/3] How do the mechanisms behind collective behaviour work? How can VR help us better understand how visual cues influence decision-making processes? 👁️ 🧠 🐦
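One classic way to formalise those mechanisms is an agent model in which each individual steers only by the visual cues it gets from nearby neighbours, exactly the kind of cue a VR setup can manipulate in isolation. A minimal, hypothetical sketch (not code from this study; the weights and viewing radius are assumptions):

```python
# Hypothetical illustration of visually driven collective behaviour:
# each agent steers using only what it "sees" (positions and headings of
# neighbours within a viewing radius). Not code from any cited study.
import math
import random

N_AGENTS = 20
VIEW_RADIUS = 5.0      # how far an agent can see (assumed)
ALIGN_WEIGHT = 0.5     # pull toward neighbours' mean heading (assumed)
COHESION_WEIGHT = 0.1  # pull toward neighbours' mean position (assumed)

agents = [{"x": random.uniform(0, 20), "y": random.uniform(0, 20),
           "heading": random.uniform(0, 2 * math.pi)} for _ in range(N_AGENTS)]

def step(agents):
    new = []
    for a in agents:
        # Visual cue: only neighbours within the viewing radius count.
        nbrs = [b for b in agents if b is not a
                and math.hypot(b["x"] - a["x"], b["y"] - a["y"]) < VIEW_RADIUS]
        h = a["heading"]
        if nbrs:
            # Alignment: turn toward the mean heading of visible neighbours.
            mean_h = math.atan2(sum(math.sin(b["heading"]) for b in nbrs),
                                sum(math.cos(b["heading"]) for b in nbrs))
            # Cohesion: turn toward the centroid of visible neighbours.
            cx = sum(b["x"] for b in nbrs) / len(nbrs)
            cy = sum(b["y"] for b in nbrs) / len(nbrs)
            to_centroid = math.atan2(cy - a["y"], cx - a["x"])
            h += ALIGN_WEIGHT * math.sin(mean_h - h)
            h += COHESION_WEIGHT * math.sin(to_centroid - h)
        new.append({"x": a["x"] + math.cos(h), "y": a["y"] + math.sin(h),
                    "heading": h})
    return new

for _ in range(100):
    agents = step(agents)
```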
24.06.2025 08:52 · 👍 2 🔁 2 💬 1 📌 0
Blueskyfeeds is shutting down, so I moved the VRcademicSky feed to Bluesky Feed Creator.
Like before, if you tag your posts with #vrcademicsky, #vracademicsky, #vrresearch or #vresearch, they will appear in the feed.
bsky.app/profile/vrne...
#VRcademicSky
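For the curious, roughly how such a tag-based feed can be assembled: query the public Bluesky search endpoint once per tag and merge the results. The endpoint app.bsky.feed.searchPosts exists in the public AppView API; everything else below is an illustrative sketch, not how Bluesky Feed Creator is actually implemented.

```python
# Minimal sketch of a tag-based feed: query Bluesky's public AppView for each
# hashtag and merge the results. app.bsky.feed.searchPosts is a real public
# endpoint; the merging logic is illustrative, not Feed Creator's own.
import requests

TAGS = ["#vrcademicsky", "#vracademicsky", "#vrresearch", "#vresearch"]
ENDPOINT = "https://public.api.bsky.app/xrpc/app.bsky.feed.searchPosts"

def recent_tagged_posts(limit_per_tag: int = 25) -> list[dict]:
    """Fetch recent posts for each tag and return them newest first."""
    posts: dict[str, dict] = {}
    for tag in TAGS:
        resp = requests.get(ENDPOINT,
                            params={"q": tag, "limit": limit_per_tag},
                            timeout=10)
        resp.raise_for_status()
        for post in resp.json().get("posts", []):
            posts[post["uri"]] = post  # deduplicate posts carrying several tags
    return sorted(posts.values(), key=lambda p: p["indexedAt"], reverse=True)

if __name__ == "__main__":
    for post in recent_tagged_posts(5):
        print(post["indexedAt"], post["uri"])
```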
21.06.2025 09:11 · 👍 3 🔁 2 💬 0 📌 0
Why is it such a pain to change languages in Windows? Just setting up my new laptop, and somehow I've ended up with some parts in German, some in English, and others in Spanish.
20.06.2025 17:27 · 👍 0 🔁 0 💬 0 📌 0
20.06.2025 09:18 · 👍 0 🔁 0 💬 0 📌 0
#VRcademicSky
20.06.2025 09:01 · 👍 0 🔁 0 💬 0 📌 0
Find more information on the event and how to join us on our website: invirtuo.org/course/2025-...
#InVirtuoresearch #VRresearch #VRavatars #policetraining
Photo credit: (c) ISEDA project
The rubber-hand illusion was translated to the mouse model. Just like in humans, embodiment in mice can be achieved by brushing the real forelimb of the mouse and the artificial limb (yellow) in synchrony to generate matching visual and touch percepts. This study suggests that as the mouse looks at the artificial limb, it perceives it as its own limb, and feels threatened if the artificial limb is threatened. Image credit: Luc Estebanez.
The "rubber hand illusion" in mice... @lucestebanez.bsky.social &co use automated videography to show that mice display quantifiable behavioral markers of the embodiment of an #ArtificialLimb, opening the door to future research into human #BodyOwnership disorders @plosbiology.org ๐งช plos.io/4jHsESn
09.06.2025 16:46 · 👍 24 🔁 5 💬 2 📌 1
Can't imagine the amount of work it takes all around to pull off this simple change 🤯🤯
05.06.2025 19:20 · 👍 5 🔁 1 💬 2 📌 0
Not to forget, sensory overload and rushing from session to session are included as well!
03.06.2025 20:49 · 👍 1 🔁 0 💬 0 📌 0
If you want to render at the highest quality, then definitely yes. But they have a lot of scaling options; I'd mostly be worried about the hair, which has always looked bad in Unreal without cranking up the settings.
03.06.2025 20:29 · 👍 2 🔁 0 💬 0 📌 0
A screenshot with text: MetaHuman 5.6: our most powerful digital human toolset yet. With MetaHuman 5.6, the digital human framework leaves Early Access and is now fully embedded in Unreal Engine. We've updated the Unreal Engine EULA so MetaHuman characters and animation can be used in any engine, such as Unity or Godot, as well as creative software like Maya, Houdini, and Blender, so you can expect to see MetaHumans in more places. MetaHuman Creator now has enhanced fidelity thanks to improved materials, a larger scan database, and improved models for processing scan data, plus new authoring workflows that will vastly expand the range of available options for faces, bodies, and clothing. The update also provides the ability to generate real-time animation from almost any camera or audio with MetaHuman Animator, and expands the MetaHuman ecosystem with new plugins for DCCs, as well as integration with the Fab marketplace. Check out all the updates in our latest MetaHuman blog post. In the text the following sentence is highlighted: We've updated the Unreal Engine EULA so MetaHuman characters and animation can be used in any engine, such as Unity or Godot, as well as creative software like Maya, Houdini, and Blender so you can expect to see MetaHumans in more places.
Oh that's nice. MetaHumans can now be used outside of UE.
#VRcademicSky www.unrealengine.com/en-US/news/a...
No problems opening cobalt.tools here.
03.06.2025 05:20 · 👍 1 🔁 0 💬 0 📌 0