This 360-degree ultra-high-definition military simulator lets you drive a Humvee and fire weapons with absolute precision, including machine guns and rocket launchers, anywhere you want. The 10-projector system achieves a perfectly seamless panorama thanks to Mersive's Sol system, a calibration, warping, and sub-pixel image blending technology that may jump from military sims to your living room in the near future. Sol can take any number of projectors and project a single huge image onto a surface of any shape and size. We talked with Mersive about how it works and what it could mean for game enthusiasts. [Fast-forward the video a bit to get into the action, and check the Halo 3-like video after the jump.]

I interviewed Mersive's Sales and Marketing Manager, Robert Bolen, who, in addition to explaining how the technology works, told me about the company's hopes for a potential commercial release of the system.

Jesús Díaz: How exactly does the calibration technology work?

Robert Bolen: There are two parts to the system: the camera-based calibration, and the actual warping and blending engine.

The calibration system is a standalone software component that converts a cluster of projectors in an unknown configuration into a coherent display by discovering the geometric (and sometimes intrinsic) parameters of the cluster of devices. The approach uses one or more cameras. The system measures projected fiducials and intensity patterns from each projector and converts these measurements into information that can be used to blend the projectors into a single display. This process is completely automatic. [JD: You can see how the robotic camera works its magic here.] In practice, this software runs on the Sol Server in order to isolate it from end-customer computing platforms. Once calibration has been computed, it is stored on the Sol Server for use. Calibration information is accessed via the Sol RealTime API.
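To picture what "converting measurements into blending information" means, here is a toy version of the geometric half of the problem: matching projected fiducials to their positions as seen by the camera, then fitting a transform between the two. This sketch assumes a flat surface, so a single 3x3 homography suffices (Sol handles arbitrarily shaped surfaces, which this toy does not); the function names and sample points are my own, not Mersive's.

```python
import numpy as np

def fit_homography(proj_pts, cam_pts):
    """Estimate the 3x3 homography H mapping projector pixels to the
    camera's view of the surface, from matched fiducial points, via the
    direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(proj_pts, cam_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The smallest singular vector of A is the least-squares homography.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Map a projector pixel through H (homogeneous divide included)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical fiducials: one projector's corners as seen by the camera.
proj = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
cam  = [(10, 20), (980, 35), (1000, 700), (5, 740)]
H = fit_homography(proj, cam)
```

A real calibrator would measure many fiducials per projector, model lens and surface distortion rather than a single homography, and repeat this for every projector in the cluster so all of them share one display coordinate frame.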

The Sol RealTime API is the warping engine. It determines how three-dimensional applications (e.g. flight simulators, games, architectural walkthroughs) should modify their rendering pipeline in order to achieve sub-pixel warping and blending. The API provides end users with access to this warping and blending via a set of runtime calls to the Sol Server machines. By using the Sol RealTime API, developers insert a set of warp and blend callbacks into the traditional rendering pipeline of the end application. The "Mersive Enabled" application is then able to take advantage of the information stored on a Sol Server. The Sol RealTime warping engine is capable of computing: 1) geometric distortion of each projected image to sub-pixel accuracy, 2) intensity correction at overlap regions to seamlessly blend projected images, and 3) a three-channel color transfer function for all pixels that lie in overlap regions to better ensure color consistency across the display.
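The intensity correction in point 2 is the easiest of the three to picture: where two projectors overlap, each one's pixels are attenuated along a ramp so the summed light output stays constant. A minimal sketch of such a ramp follows; the function name, the horizontal-overlap assumption, and the gamma handling are illustrative assumptions, not Mersive's actual API.

```python
def blend_weight(x, overlap_start, overlap_end, gamma=2.2):
    """Intensity weight for a pixel at horizontal position x in the LEFT
    projector of a horizontally overlapping pair. Outside the overlap the
    weight is 1; inside, it ramps down linearly in light output. The
    1/gamma exponent pre-compensates for the projector's gamma response,
    so left + right light contributions sum to full brightness."""
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return (1.0 - t) ** (1.0 / gamma)
```

The right-hand projector would use the mirrored ramp, `t ** (1/gamma)`, so that at every point in the overlap the two gamma-decoded contributions add to 1.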

JD: What kind of processing power does your system have? Do you use off-the-shelf components with your own software or is custom silicon involved?

RB: The computing power needed to run the individual simulators is dictated by the software application and the size of the database being displayed. Mersive's display solution places very little extra burden on the processors. Everything used is off the shelf, with no specialized hardware.

JD: So, if I got three projectors at home, could I run this software and create, say, an immersive environment for Forza Motorsport on the Xbox 360?

RB: Yeah, it really doesn't require any new hardware or more advanced computing. Computing hardware already matures at a fast pace; getting our technology into the hands of end users is more a question of integration and business case. If companies are interested in multi-projector games and applications, they only need to integrate them with the Mersive technology.

With projectors coming down in price, high-resolution gaming on the rise, and computing devices in the palm of your hand growing alongside high-speed mobile networks, we think there is an increasing need for flexible display technology everywhere. The world has, and will continue to have, a growing need for larger, higher-resolution displays, like the 20-megapixel systems we can put together using Sol.

JD: And have you actually connected a console to play a game in one of your domes?

RB: We haven't played any games (yet!) in the domes you saw in the videos. We have, however, played Quake on large, multi-projector, multi-wall environments. Because Quake's source code is available, it is possible to put the "hooks" into it to use our display technology. Coordination would be needed with the developers of other PC-based games, or of consoles such as the PlayStation or Xbox, in order to get them running on these displays.
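The "hooks" Bolen describes amount to letting an external engine post-process each rendered frame before it reaches the projectors. A toy sketch of that callback pattern follows; every name in it is invented for illustration (the real Sol RealTime API is not public, and a real integration would hook the GPU pipeline, not pass strings around).

```python
class RenderPipeline:
    """Minimal stand-in for a game's per-frame render path, extended
    with post-frame hooks of the kind a warping engine would register."""

    def __init__(self):
        self.post_frame_hooks = []

    def register_hook(self, fn):
        """Add a callback that transforms the finished frame."""
        self.post_frame_hooks.append(fn)

    def render(self, frame):
        # Run the hooks in registration order: warp first, then blend.
        for hook in self.post_frame_hooks:
            frame = hook(frame)
        return frame

# Invented stand-ins for the per-projector warp and blend passes.
warp  = lambda f: f"warp({f})"
blend = lambda f: f"blend({f})"

pipe = RenderPipeline()
pipe.register_hook(warp)
pipe.register_hook(blend)
```

This is why an open-source title like Quake is the easy case: the render loop is visible, so the hook call can simply be inserted before the frame is presented. Closed games and consoles need the developer's cooperation to expose the same seam.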

JD: At the professional level, are there more applications, like racing sims for Formula One drivers, or is this just military stuff for now?

RB: The applications for this technology are limitless. The early adopters have been in military simulation and training, but we expect to see it pulled into more commercial uses in the near future.

JD: Thanks a lot for your time, Robert.

RB: Thank you!