The new camera design, while unlikely to see the light of day any time soon, uses one sensor for luminance and two more for chrominance. Each sensor sits behind its own lens assembly that directs light onto its surface, and the three are positioned so that the camera can compare information from each of them to create a full image.
As Apple Insider points out, that means that an image processing module could draw data from all three sensors to create images that would be of better quality than a single unified sensor. In fact, Apple Insider goes into some detail about how the processing would work, if you're so inclined.
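The patent doesn't spell out Apple's actual pipeline, but the general idea of fusing one luminance plane with two chrominance planes is well established: it's essentially a YCbCr-to-RGB conversion. The sketch below uses the standard BT.601 full-range conversion purely as an illustration; the function name and the toy 2×2 "sensor readout" are invented for the example.

```python
import numpy as np

def merge_luma_chroma(y, cb, cr):
    """Combine one luminance plane and two chrominance planes into RGB.

    Uses the standard BT.601 full-range YCbCr -> RGB conversion as a
    stand-in for whatever processing the patent actually describes.
    All planes are floats in [0, 1]; cb/cr are centered at 0.5.
    """
    r = y + 1.402 * (cr - 0.5)
    g = y - 0.344136 * (cb - 0.5) - 0.714136 * (cr - 0.5)
    b = y + 1.772 * (cb - 0.5)
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

# Toy example: a 2x2 luminance readout with neutral chroma
# (cb = cr = 0.5) should come out as pure grayscale RGB.
y = np.array([[0.2, 0.4], [0.6, 0.8]])
cb = np.full((2, 2), 0.5)  # neutral blue-difference plane
cr = np.full((2, 2), 0.5)  # neutral red-difference plane
rgb = merge_luma_chroma(y, cb, cr)
```

With neutral chroma the three RGB channels all equal the luminance plane, which is why a dedicated luminance sensor is attractive: sharpness and low-light detail live almost entirely in that one channel.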
Regardless of how it works, the overall result would be great for the end user: better images generally, and almost certainly a big jump in low-light picture taking. But as ever with patents, you can expect this one to take a long time to get anywhere near a consumer product, if it ever does. [USPTO via Apple Insider]