Computer displays have been around for quite some time, and in recent years they have evolved far more rapidly than the desktop and notebook computers they are attached to. The primary reason is advancing technology, which in turn has lowered manufacturing costs. Still, we can't deny that displays have come a long way from their humble origins in Cathode Ray Tube (CRT) television receivers.
Originally, computer monitors were used mainly for data processing and high-end research. Until the early 1980s, they were known as video display terminals and were physically attached to the computer and keyboard. These monitors were monochrome, flickered, and generally offered poor image quality. In 1981, however, IBM introduced the Color Graphics Adapter, which could display four colors simultaneously at a resolution of 320 by 200 pixels. In 1984, IBM introduced the Enhanced Graphics Adapter, capable of producing 16 colors at a resolution of 640 by 350. CRT monitors had the advantages of being cheap and offering a near-180-degree viewing angle. Their significant disadvantage was bulk: a CRT could not be integrated into a notebook computer.
With the advent of the notebook computer, a new kind of display technology was needed, and Liquid Crystal Displays (LCDs) entered the picture. Multiple technologies have been used to implement LCDs. Throughout the 1990s, the primary use of LCD technology in computer monitors was in notebook computers, where the lower power consumption, lighter weight, and smaller physical size of LCDs justified their higher price compared with a CRT. Commonly, the same notebook computer would be offered with an assortment of display options at increasing price points. During the 2000s, TFT LCDs (active matrix color), a variant of LCD, gradually displaced CRTs and eventually became the primary technology used for computer monitors. The now-common active matrix TFT-LCD technology also flickers less than CRTs, which reduces eye strain, and is widely used in small 7-inch computers and mini netbooks.
While LCDs came to dominate the notebook computer market, they had the distinct disadvantages of low contrast and slow response times. Furthermore, running multiple screens could cause flicker at higher refresh rate settings. To address these problems, a new breed of computer display was developed: the Organic Light Emitting Diode (OLED). An OLED is essentially a light-emitting diode (LED) in which the emissive electroluminescent layer is a film of organic compounds that emit light in response to an electric current. This layer of organic semiconductor material is situated between two electrodes. Thanks to their higher contrast and better viewing angles, OLEDs have begun to replace LCDs as the primary display technology for notebook computers. Perhaps the only disadvantage of OLED is its high manufacturing cost, but that is expected to fall as manufacturing technology advances and demand grows.