The late 1990s were a fascinating period in the evolution of cinema technology. Film, the only physical medium cinema had ever known, was slowly being replaced by newfangled digital technology. And it was in 2000 that the Academy Award-nominated film O Brother, Where Art Thou? gorgeously illustrated the potential of digital filmmaking.
In the early days of cinema, color grading (that is, adjusting a film's relative color levels) was accomplished through film emulsion alchemy. By the time CRT displays became widespread, filmmakers had begun relying on telecine devices to adjust and edit the color levels on a roll of film stock. White light shone through the film negative strikes a prism, which separates it into its component red, green, and blue light, which then strikes a charge-coupled device (CCD). The CCD converts the incoming light levels into electrical signals that the telecine device uses to modulate a video signal, which can be color-graded before being transferred back to film.
However, in the mid-1980s, digital color grading systems began to appear. Rather than converting an analog medium (film) to another analog medium (the telecine output), adjusting the colors, then converting back to the original analog medium, these devices instead leveraged a digital intermediate (DI) for the editing and adjusting work. But it wasn't until the production of O Brother that an entire film was graded this way and output to a digital master copy.
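Conceptually, once a frame has been digitized, grading reduces to arithmetic on pixel values: each channel can be rescaled independently, precisely, and repeatably. A minimal sketch of per-channel gain adjustment, assuming 8-bit RGB pixels (the gain values below are illustrative, not the settings actually used on the film):

```python
# Toy illustration of digital color grading: scale each color channel of a
# digitized frame by its own gain factor, clamping to the 8-bit range.
# Gains > 1.0 boost a channel; gains < 1.0 suppress it (here, a warm,
# slightly desaturated-blue look -- purely illustrative values).

def grade_pixel(rgb, gains=(1.10, 1.00, 0.80)):
    """Apply per-channel gains to one 8-bit RGB pixel, clamped to 0-255."""
    return tuple(min(255, max(0, round(c * g))) for c, g in zip(rgb, gains))

def grade_frame(frame, gains=(1.10, 1.00, 0.80)):
    """Grade a frame represented as a flat list of RGB tuples."""
    return [grade_pixel(px, gains) for px in frame]

# Example: a mid-gray pixel shifts toward warm tones under these gains.
print(grade_pixel((100, 100, 100)))  # -> (110, 100, 80)
```

Unlike emulsion or telecine adjustments, such an operation is lossless to repeat and trivially reversible by storing the original digital frame.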
The added precision and expanded capabilities of the digital format have since proven a boon to film editors and content creators alike, which is why the DI process is now commonplace while emulsion-based color grading has all but vanished. And with the added capabilities promised by emerging technologies such as Dolby Vision, the future of cinema will look increasingly like real life.