Computer-generated models are starting to let researchers and students peer into the body without needing a real human stretched out before them. Virtual dissection tables have been built at places like Stanford and the University of Calgary. Now, University of Michigan computer scientists and biologists have taken the technology another step forward, using projectors, joysticks and 3-D equipment to build a floating holographic human that users can dissect, manipulate, and put back together as they wish.
The project is called the Michigan Immersive Digital Experience Nexus (MIDEN), an advance over CAVE, the university's predecessor audio-visual virtual reality system, which Txchnologist has previously profiled.
"The first time I saw the technology I almost cried," said Alexandre DaSilva, an assistant professor at the university's School of Dentistry, who is using the virtual cadaver along with his students. "In my wildest dream, I never thought that this would be possible."
The team behind the visualization system say it can be used for many other applications, from helping meteorologists dissect hurricanes to aiding in archaeological or paleontological studies. [University of Michigan]
This post originally appeared on Txchnologist. Txchnologist is a digital magazine presented by GE that explores the wider world of science, technology and innovation.