The Future Is Here
We may earn a commission from links on this page

Military VR Simulator Is Closest Thing Ever to Real-Life Halo 3


This is the ultra-high-resolution, multi-projector VR system that will be used for training at Nellis Air Force Base in Nevada. Built by Lockheed Martin, it uses Mersive's Sol Server technology to automatically and seamlessly combine multiple projectors into one gigantabolous immersive display, thanks to camera-based, fully automatic calibration, edge blending, and color correction. The result is stunning, but wait until you see it in motion after the jump.


The whole system will be used in the base's Joint Terminal Attack Controller (JTAC) Virtual Trainer Dome simulator, which is the "primary training facility for the USAF JTACs."

A JTAC is the link between the Army and the Air Force when combat requires close air support, and is expected to maintain situational awareness, know the supported unit's plans, and validate and prosecute targets of opportunity. The I-JTAC TRS Proof-of-Concept Dome, built by Lockheed Martin, offers a high-fidelity, realistic, fully immersive, real-time visual environment with sensor, simulator, and database correlation.

Although there are other multi-projector solutions on the market, the cool thing about Mersive's Sol Server technology is its automatic calibration, which adjusts the multiple projectors to the curved surface. Instead of complicated manual procedures that can take weeks to complete, Sol uses a robotic camera to obtain reference points. After the screen is mapped, which takes less than an hour, the Sol Server software corrects for "geometric distortion, intensity and color variation in overlap regions that result from using multiple overlapping projectors on the spherical surface of the dome."
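To get a feel for the intensity-correction part, here is a minimal, hypothetical sketch of edge blending between two overlapping projectors. The function name and the linear ramp are illustrative assumptions, not Mersive's actual algorithm: each projector's brightness is ramped down across the shared overlap so that the two contributions always sum to full intensity.

```python
import numpy as np

def blend_weights(total_width, overlap, side):
    """Per-column intensity weight for one projector in a two-projector setup.

    side='left' ramps brightness down toward its right edge;
    side='right' ramps brightness up from its left edge.
    """
    w = np.ones(total_width)
    ramp = np.linspace(1.0, 0.0, overlap)  # fade from full to zero
    if side == "left":
        w[-overlap:] = ramp
    else:
        w[:overlap] = ramp[::-1]
    return w

W, OV = 100, 20  # illustrative projector width and overlap, in pixels
left = blend_weights(W, OV, "left")
right = blend_weights(W, OV, "right")

# In the overlap region, the two ramps sum to uniform full brightness,
# which is what hides the seam between projectors.
overlap_sum = left[-OV:] + right[:OV]
```

A real system also has to correct per-projector color response and work on a curved surface, which is exactly the hard part the camera-based calibration automates.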

Sol Server links to MetaVR's Virtual Reality Scene Generator (VRSG)™, the image generator used by the JTAC unit, adding zero latency to the display system.

Warping, blending, and color correction are accomplished by applying parameters supplied by the Mersive Sol Server to images generated by MetaVR, directly on the graphics card used by the image generator (IG). No additional external hardware is required.
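Conceptually, "applying parameters on the graphics card" means treating the calibration output as per-pixel lookup data: a warp map saying which source pixel each output pixel should sample, plus a blend mask scaling its intensity. Here is a tiny sketch of that idea in NumPy; the shapes, the identity warp map, and the flat 0.5 blend mask are all placeholder assumptions standing in for real calibration data.

```python
import numpy as np

H, W = 4, 6
frame = np.arange(H * W, dtype=float).reshape(H, W)  # stand-in rendered image

# Warp map from calibration: for each output pixel, the source pixel to sample.
# An identity map is used here; a real map would encode the dome's distortion.
ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
warp_y, warp_x = ys.copy(), xs.copy()

# Per-pixel intensity correction (e.g. dimming in projector overlap regions).
blend = np.full((H, W), 0.5)

# One gather plus one multiply per pixel: cheap enough to run per frame on a GPU.
warped = frame[warp_y, warp_x] * blend
```

On actual hardware this gather-and-scale would run in a fragment shader or as a textured mesh, which is why no external warping box is needed.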


Sounds cool, but really, could we drop the MetaVR and connect this thing to an Xbox 360 to kick Kotaku's ass again in Halo 3? That's what we want to know. In the meantime, see how the whole process works in the video.


[Mersive Technologies]