These pages compare CRT and LCD display technologies as rationally and scientifically as possible within
the constraints of a domestic environment. They were prompted by repeated postings, with no supporting
evidence, in newsgroups such as uk.tech.broadcast and uk.tech.digital-tv, that CRTs are superior to
LCDs, whereas my own experience is precisely the opposite. Where possible
I supply or quote objective or independent evidence, otherwise I report my own experience, accepting that,
unlike evidence, my experience is simply hearsay to others.
The sources of questions and criticisms mostly fall into two groups:
Domestic users;
Broadcast professionals.
Domestic Users
Either they are researching the purchase of a new TV, in which case these pages may help, though primarily
I would recommend Choosing A TV, or else they are seeking further explanation as to why such a purchase,
perhaps made after inadequate research, hasn't turned out as well as hoped.
With the latter, a typical scenario seems to be that a former analogue narrow screen (4:3 aspect ratio) CRT,
which was aging before it broke and therefore had possibly drifted out of adjustment, is replaced with a
much bigger wide screen (16:9 aspect ratio) LCD. Because of a combination of …
The old TV had got out of adjustment;
The new TV is significantly bigger than the old one;
Analogue signal faults on the old TV are understood to be faults in the signal, but unfamiliar faults
in the digital signal on the new TV are incorrectly ascribed to the new TV rather than the signal;
No one has bothered to look in the new TV's manual and menu options to discover how best to set it up;
Being new, the TV is examined critically for the first time in ages;
… defects are visible which previously were not, and the new TV is blamed. In fact most,
very probably all, will turn out to be in the digital broadcast signal -
in the UK, digital signals are so over-compressed that only 1 to 3% of the original data captured by the
camera ever arrives at the TV!
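As a rough back-of-envelope check on that figure, the sketch below compares an assumed uncompressed SD capture (720×576 pixels, 25 frames/s, 4:2:2 sampling at 10 bits per sample) against an assumed UK DVB-T SD service bitrate of 2 to 4 Mbit/s; these bitrates are my own assumptions, not broadcaster data, but the resulting percentages are consistent with the 1 to 3% claim.

# Back-of-envelope check of the "1 to 3%" figure; the bitrates below are my own
# assumptions, not broadcaster data.
width, height, fps = 720, 576, 25      # SD picture as captured
bits_per_pixel = 2 * 10                # 4:2:2 sampling, 10 bits/sample (luma + chroma average 2 samples per pixel)
uncompressed_bps = width * height * fps * bits_per_pixel   # ~207 Mbit/s

for broadcast_bps in (2e6, 3e6, 4e6):  # assumed range for a UK DVB-T SD service
    pct = 100 * broadcast_bps / uncompressed_bps
    print(f"{broadcast_bps / 1e6:.0f} Mbit/s broadcast ~ {pct:.1f}% of the uncompressed bitrate")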
Broadcast Professionals
Typically broadcast professionals simply state, as assuredly as it was once claimed that the sun went round
the earth, that CRTs are much better than LCDs. When challenged to supply supporting evidence, either
none is forthcoming, or it is at best corporate hearsay, at worst pseudo-science.
The problem with corporate hearsay is that corporations have enormous inertia
- a culture of jargon, work practices, standards, etc
- that can be immensely difficult to change, and keeps them working in a
certain way even though it may be shown to be based on a falsehood. Hence, for example, it is still
common for people in the moving arts and visual display industries to refer to "persistence of
vision" even though the concept was shown to be false as long ago as the 1970s1.
The surest way of keeping abreast of new technology, and, significantly, of having an input into its design
at an early enough stage for your industry to get what it requires of it, is to conduct trials.
Strange as it seems to someone like myself, with a technical background and formerly employed to conduct such
trials in furtherance of a firm's business, it does genuinely seem that no one has actually assessed LCD
technology in such an independent, scientific, and rational way. Certainly, repeated requests have failed to
elicit even a quote from the results of such a trial, let alone a full technical report, and repeated
searches have failed to find any published results, even from 'open' organisations such as universities,
technical colleges, the EBU, or the
BBC.
2 Equipment Used
Naturally, I would have preferred to perform the sort of corporate trial discussed above, but this would
have required access to a range of equipment that I do not have, and consequently these documents cannot
and do not claim to be results of such a trial.
Instead, I have had to be content with making as impartial a comparison as possible using what I happen to
own, namely two ordinary televisions of similar size. To estimate how widely applicable these
results might be, I've spent much time searching for relevant data, completely without success for CRTs,
and SD LCD data is also getting difficult to find now that
HD is sweeping the market. The best that I can say is that
while I have no reason to believe these TVs are unusual or atypical, they are probably not absolutely
typical either, and one of each type is inadequate as a statistical sample. Hence, care should be
taken in generalising from the following results!
Nevertheless, at least they are useful, demonstrable results, whereas, as far as I am
aware, there is nothing scientifically robust coming from the pro-CRT side.
Unless otherwise indicated, the two televisions used were:
Type                                  CRT (aperture grille)        LCD
Model                                 Sony KV-16WT1U               Panasonic TX-15LT2
Size, Nominal Size (mm, in)           365, 14                      385, 15
Screen Width (mm)                     317 ±1.5                     338 ±1
Screen Height (mm)                    179 ±1.5                     192 ±1
Measured Phosphor/Element Pitch,
  Horizontal, Vertical (mm)           0.614 ±0.014, NA             0.390 ±0.014, 0.390 ±0.014
Calculated Effective Resolution,
  Horizontal, Vertical                516 ±15, NA                  867 ±35, 492 ±35
Notes:
Ideally the televisions would have exactly the same screen size; the actual difference is roughly 5%.
Neither TV has sufficient resolution to display SD at 720×576i in full. It is not meaningful to talk of
vertical resolution with an aperture grille design, but the CRT's 72% horizontal coverage might be expected
to lose picture detail more significantly than the LCD's 85% vertical coverage, and this is the most
probable explanation for the results actually obtained favouring the LCD (see the short sketch below).
The CRT has a scratch on the screen, but it does not feature in any picture fragment used here.
LCD technology being much dimmer than CRT technology, in order to obtain adequate exposures, the LCD's
Backlight and Brightness controls had to be set to their maximum values (in normal viewing they would
both be at 50%). All other controls on both TVs were set for normal viewing; in particular, all
'Artificial Intelligence' and signal processing options were disabled.
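The 'Calculated Effective Resolution' row, and the coverage percentages quoted above, follow directly from dividing each screen dimension by the measured pitch; the short sketch below reproduces those figures from the table's measurements (error propagation is omitted for brevity).

# Reproduce the Calculated Effective Resolution figures and the coverage
# percentages quoted in the notes from the measured sizes and pitches.
def effective_resolution(size_mm, pitch_mm):
    """Number of phosphor stripes / LCD elements that fit across the given dimension."""
    return size_mm / pitch_mm

crt_h = effective_resolution(317, 0.614)   # CRT horizontal; vertical is NA for an aperture grille
lcd_h = effective_resolution(338, 0.390)   # LCD horizontal
lcd_v = effective_resolution(192, 0.390)   # LCD vertical

print(f"CRT horizontal: {crt_h:.0f} elements, {100 * crt_h / 720:.0f}% of the 720 SD samples per line")
print(f"LCD horizontal: {lcd_h:.0f} elements")
print(f"LCD vertical:   {lcd_v:.0f} elements, {100 * lcd_v / 576:.0f}% of the 576 SD lines")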
Photographs were taken by a Canon S40 camera mounted on a mini-tripod so that the lens was as near as
possible opposite the middle of the screen (error < ±15mm), with a lens-to-screen distance of
300 ±5mm. The camera settings were:
RAW image format
Equivalent film speed ISO 400
Shutter speed 1/100s
Aperture F/8
Manual shooting mode
10s timed exposure delay (to eliminate camera shake)
Note that the camera has some dead pixels, appearing mostly as red dots in some of the images here.
These can be demonstrated to lie within the camera because they are always in the same pixel positions.
As they have nothing to do with either television, they should be ignored.
The resulting *.CRW (Canon Digital Camera RAW file) photographs are each about 3-4 MB in size, but are not
understood by web browsers, and hence have had to be converted, via TIFF, to PNG format at about half the
file size. I did not discern any loss of detail in this conversion.
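For anyone wishing to repeat the conversion, a minimal sketch follows; it assumes the Python rawpy and imageio packages and a hypothetical file name, whereas the conversion here was actually done via TIFF, and any RAW converter that understands *.CRW files would serve equally well.

# Minimal RAW -> PNG conversion sketch; assumes the rawpy and imageio packages,
# and the file name is hypothetical.
import rawpy
import imageio

with rawpy.imread("photo.crw") as raw:
    rgb = raw.postprocess(no_auto_bright=True,   # keep the exposure as shot
                          output_bps=8)          # 8 bits per channel is sufficient for these PNGs
imageio.imwrite("photo.png", rgb)                # PNG is lossless, so no further detail is lost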
Composite video was chosen as the format for the source signal as this could be duplicated easily by a
Quintro+ B-Tech BT945 SCART Switch via its two RCA composite video outputs. The TV2 switch on the
unit was not depressed, and no channels were locked, which meant the TV1 and TV2 outputs were showing the
same source signal.
3 Commonly Voiced Criticisms Of LCD TVs
Criticisms of LCDs that are commonly voiced include:
While it is known from the way CRTs work that most display each picture line as it is received, with
LCDs there is a shortage of facts. It can be deduced from
specifications that at least some recent High Definition TVs can deinterlace, but this appears to be an
option selectable via menu rather than default behaviour; sources such as Wikipedia explain different
deinterlacing methods [2], and claim
without convincing evidence that all non-CRT display types require deinterlacing [3];
others are more equivocal [4]:
I've just had a very interesting chat with a guru within my company who deals with such matters.
The short answer is that no one knows for sure how the processing is performed. You'd have to
ask each individual manufacturer how they do it.
…
One thing is certain however, he has witnessed tests of many different types of screen all being
fed with exactly the same signal. There are huge differences with temporal resolution, so the
processing techniques are very different from mfr to mfr.
There is also this image, which is from an interlaced analogue signal source displayed on my 22" LCD.
If the LCD was deinterlacing it, you'd expect that lines would be paired, and that each pair would
contain either identical content averaged over the pair, or interpolated content over the pair.
Although superficially the lines look paired, they are neither averaged (individual lines in each
supposed pair differ noticeably) nor interpolated (the edge of the person's arm in the upper line of
each supposed pair seems, if anything, sometimes slightly to the left of that in the lower, whereas
interpolation would place it about two pixels to the right, an unmistakable difference).
Alternate lines also have a slightly different colour rendition. Hence, it seems that
the picture contains two fields shown as such, and this LCD is not deinterlacing.
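That visual check can be made quantitative. The sketch below (assuming the numpy and imageio packages, with a hypothetical file name for the screen photograph) compares how much adjacent lines differ within the supposed pairs against how much they differ across pairs; a picture that had genuinely been line-averaged or line-doubled would show far smaller differences within pairs.

# Quantify the line-pairing check: if lines had been averaged or doubled into
# pairs, the difference *within* each supposed pair would be far smaller than
# the difference *across* pairs.  The file name is hypothetical.
import numpy as np
import imageio.v2 as imageio

img = imageio.imread("lcd_screen_photo.png").astype(float)
grey = img.mean(axis=2) if img.ndim == 3 else img   # collapse any colour channels
grey = grey[: grey.shape[0] // 2 * 2]               # keep an even number of lines

within = np.abs(grey[0::2] - grey[1::2]).mean()     # line 0 vs 1, 2 vs 3, ...
across = np.abs(grey[1:-1:2] - grey[2::2]).mean()   # line 1 vs 2, 3 vs 4, ...

print(f"mean difference within supposed pairs: {within:.2f}")
print(f"mean difference across pairs:          {across:.2f}")
# Comparable values suggest two independent fields; within << across would suggest pairing.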
In fact, I believe it makes more electronic and economic sense to assume that most LCDs buffer
rather than deinterlace. To understand the difference between the two, and why I believe this, the
following terms must first be understood (a small illustrative sketch follows the definitions):
Progressive Scan is when the signal or data representing a video image, here
usually called a frame, is produced in a natural order, usually line by line top left to bottom
right. This means that video content changes at the same rate as the frame rate, in the UK
25Hz. DVDs are mostly progressive scan.
Interlaced Scan is when the signal or data representing video images is
photographed as two fields, the first comprising odd numbered picture lines, the second even
numbered lines, each field being produced progressively. This means that video content
changes at the same rate as the field rate, in the UK 50Hz. Note the crucial point that
the common description of interlacing as splitting a single video frame into two fields is only
meaningful where video was originally photographed as progressive and is later interlaced,
perhaps for broadcasting. Where the source is actually photographed as interlaced, each
field is shot 1/50s later than the previous one, and is thus independent of it, so the concept
of a frame has no meaning, and this common description is misleading.
Most legacy analogue video was photographed as interlaced; modern digital video can be either,
depending on camera settings.
Buffering, in the context of video displays, is when part of a picture is
stored until completely received before being displayed. In theory, either frames or fields
could be buffered; similar circuitry could serve for both, and could likely be switched into either
mode depending on whether a progressive or an interlaced signal is being received. Note
that buffering doesn't alter the video content or its rate, so, in the UK, a buffered display
would still change the content of a progressive scan source at 25Hz and an interlaced one at
50Hz.
Deinterlacing attempts to combine two successive fields from an interlaced
signal into a single frame. Note that this assumes the two fields were indeed once part of a
single frame, as described above for a progressive source broadcast as interlaced, which
often will not actually be true, and that it alters both the received video content and its rate,
in the UK from 50Hz to 25Hz. For a progressive source being broadcast as interlaced, in
principle deinterlacing can completely reassemble each original frame. However, if an
attempt is made to deinterlace content that was photographed as interlaced, then deinterlacing
would attempt to combine two independent fields into a composite frame, and the result will
usually be worse than doing nothing at all to the picture.
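To make the distinction concrete, here is a small illustrative numpy sketch (the sizes and field contents are invented purely for demonstration): buffering simply holds each field until it is complete and then shows it, leaving the content rate at 50Hz, whereas weave deinterlacing interleaves two successive fields into one frame, halving the content rate to 25Hz and producing 'combing' wherever the two fields disagree, as they will when they were shot 1/50s apart.

# Illustrative contrast between buffering and (weave) deinterlacing.
# Field sizes and contents are invented purely for demonstration.
import numpy as np

LINES, WIDTH = 576, 720                 # SD picture dimensions

def buffer_field(field):
    """Buffering: hold the field until completely received, then display it as-is.
    Content still changes at the field rate (50Hz in the UK)."""
    return field                        # a 288-line field, shown once complete

def weave_deinterlace(odd_field, even_field):
    """Weave deinterlacing: interleave two successive fields into one 576-line frame.
    The content rate drops to 25Hz; if the fields were shot 1/50s apart, moving edges comb."""
    frame = np.empty((LINES, WIDTH), dtype=odd_field.dtype)
    frame[0::2] = odd_field             # odd-numbered picture lines (1, 3, 5, ...)
    frame[1::2] = even_field            # even-numbered picture lines (2, 4, 6, ...)
    return frame

# Two fields of a moving vertical edge, shot 1/50s apart, so the edge has shifted.
odd = np.zeros((LINES // 2, WIDTH));  odd[:, :360] = 1.0
even = np.zeros((LINES // 2, WIDTH)); even[:, :370] = 1.0   # edge has moved 10 pixels right

shown = buffer_field(odd)               # buffering: the odd field is displayed unaltered
frame = weave_deinterlace(odd, even)
print(frame[:4, 355:372])               # columns 360-369 alternate 0 and 1 line by line: "combing"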
This being so, and deinterlacing circuitry being relatively complex and expensive, I cannot see why an
LCD manufacturer would choose to increase the cost of its units by attempting to deinterlace the signal,
when it has no way of knowing whether a given signal being displayed was originally progressive, in
which case a slight improvement would result, or interlaced, in which case a significant degradation
would result. It just wouldn't make economic sense. Some high-end LCDs appear to attempt
it as a menu option, but I'd be surprised if the majority do it. I suspect that those sources
that make these sweeping claims are actually confusing deinterlacing with buffering, a suspicion
reinforced by the lack of any supporting evidence that stands up to logical examination.
Jaggies
It is claimed that LCDs show more jaggies than CRTs. Simply, this is a myth:
(Comparison photographs: CRT and LCD.)
Lack of contrast ratio
I have satisfied myself that this criticism does have some foundation in fact, at least as far as my own
LCDs are concerned, but the differences are only discernible in unusual situations -
night time CCTV, some night scenes in B&W movies, star- and galaxy-scapes in astronomy or sci-fi
programmes, etc. Further, CRTs commonly need adjustment of the Brightness control to show such scenes
optimally. What's really needed is a display technology with sufficient contrast ratio for this
to be unnecessary, and neither CRTs nor LCDs are really adequate in this respect. Consequently,
I do not intend to discuss this further here.
Colour characteristics such as gamma
As this is an issue for production rather than domestic environments, I shall not discuss it here.
4 Commonly Voiced Criticisms Of LCD Monitors
Criticisms of LCD Monitors that are commonly voiced include:
Colour Rendition
Poor colour rendition shouldn't happen if the monitor drivers are correctly installed. If required for
correct colour rendition, monitors of any type should include an ICC Profile [5]
(*.icm) in their driver set. Perhaps those reporting this problem are not using the correct
drivers, but instead general purpose drivers such as Default Monitor.
Font Rendition
In my experience this is a myth. Regardless of monitor type, the most critical factor for fonts
is the aspect ratio of the resolution being displayed - if this
is 4:3, fonts look fine, but the further it departs from 4:3, the worse they look.
A subsidiary claim is that on an LCD fonts only look 'right' at its 'native' resolution, but a CRT has
no such constraint. Again, in my experience, this is a myth -
the native display resolution of my ViewSonic VG712s monitor is 1280×1024 (not even 4:3, which
alerts the sceptic in me), but the accompanying photograph shows some text displayed in 1024×768,
and there is no discernible degradation of the text. Besides the importance of having the correct
monitor drivers, it is also necessary to have the correct settings under Start,
Settings, Control Panel, Display, such as:
Effects, Smooth Edges Of Screen Fonts
Settings, Advanced, Monitor, Refresh Frequency
For example, the default refresh rate offered is often 75Hz, which in my experience will usually display a
picture on an LCD, but the picture is sometimes misshapen or distorted, while 70Hz has always been excellent.
5 Health And Safety
X-Radiation
CRTs are inherently a mild X-ray source. Consequently,
increasingly stringent standards for computer equipment have resulted -
MPR-II, TCO-* [6]; I am not aware of any
relevant international TV standards. Though nowadays the radiation doses given out are very low,
they still have some significance. Those purchasing a new CRT monitor or TV may care to note that
an average year of CRT TV watching is roughly equivalent, in radiation dose, to a year living next door to a
normally operating nuclear plant [7].
This may seem trivial, but children regularly sitting for hours on the floor too close to a CRT TV, or
anyone constantly sitting in front of a CRT monitor, probably receive significantly higher doses than
average, and any X-ray exposure, particularly to children, seems
senseless when it is entirely avoidable by using LCD technology.
Eye-Strain
There are hearsay reports of CRTs being more tiring on the eyes, to which their adherents often reply
that the refresh rate was probably too low. In the absence, as far as I am aware, of conclusive
and rigorous test results, I can only add to the hearsay. My experience is that refresh rate makes
little difference above a certain level, probably somewhere between 50 and 100Hz depending on the
individual, yet CRTs still tire the eyes even above this level, whereas I have never noticed any
adverse effects from using LCD monitors, even though on a daily basis I use them for hours without break.
Further, I'd like to add the following …
About a decade ago, I lost control over my immediate work environment for two years, and had to sit
closer to my CRT monitor than I really wanted to - my eyes were
probably about 0.5m or less away from the screen, whereas before and since I've always worked at 1m.
During that time I would feel tiredness in my eyes within a short time of starting work, much sooner
than previously. Also over that time my eyesight rapidly deteriorated to the point where I now
need glasses to read small print. It could be argued that this was natural ageing, and I couldn't
disprove that, but my own feeling, based on the timing and rapidity of the deterioration, is that the
major cause was the closeness of the CRT forced by that particular desk layout. By contrast, I've
been able to keep the same glasses ever since I started using LCDs.