0732: "HDTV"

This forum is for the individual discussion thread that goes with each new comic.


phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada
Contact:

Re: "HDTV" Discussion

zerohero wrote:
(My CRT) was made in 2002. I can connect a HDMI device to it too, if I use a converter. My CRT will beat 90% of all LCD's on actual performance.
This whole "HD 1080p Gee Willikers look at that resolution!!11" thing is nonsense.

I'm waiting for OLED to be more accessible, and cheaper, personally. There are RGB-LED LCDs that have much better colours than older models, but they still suffer from high response times of 2 ms or more (CRT response times are usually measured in single-digit nanoseconds).

I don't think you are being fair: The response time may be measured in nanoseconds, but the phosphors take time to fade on the order of milliseconds. (Wikipedia mentions a specific formulation that fades in microseconds).

Sometimes the slow fade is by design: I have some Amber displays that are flicker-free at a 50Hz refresh rate.

Also, I would not count on an HDMI converter to allow you to display HD movies:
Juan Calonge wrote:The "Analog Sunset" will be staggered. New players after December 31, 2010 must limit analog video output of BD content to interlaced standard definition (480i/576i). Then, 2013 is the expiration date for analog video: no player that passes "Decrypted AACS Content" to analog video outputs may be manufactured or sold after December 31, 2013.

If you like high resolution video and don't want to put up with inferior "Digital" displays, you should give Blu-ray and HD-DVD a pass. The allowed analog resolution is equivalent to DVD resolution anyway.
Did you get the number on that truck?

zerohero
Posts: 30
Joined: Fri Jan 02, 2009 5:05 pm UTC
Location: Up (yes, up)

Re: "HDTV" Discussion

No, that's not true. You can use this: http://www.hdfury.com/
You get a device from there that allows you to connect an HDMI output to a VGA monitor via a converter. It does all the HDCP handshake thingy perfectly. This allows you to watch *HD* video on your analog CRT monitor, provided it can display 1080p, which mine can.

Also, phosphor fade is indeed an issue on most models, but I only ever notice it on this display on low-light scenes when there is no light in my room. Otherwise in scenes with a lot of light, and/or when there is light in my room (daytime) I don't notice the delay.

Also, HD-DVD is defunct.

In addition to my PC, I connect the following to this CRT, and get 1080p:
- PS3 (I get 1080p in both games and blu-ray)
- Xbox 360 (I use the official VGA cable for 360, so no HDFury required)

Just because certain folk don't want me to play protected content on an "unsafe" device does not mean I will actually listen. I'll do what I want with what I pay for.
.narfNET - http://narf.byethost16.com
This is a personal site of mine catering to console emulators and other things of interest
.narfNET forums - http://narf.byethost16.com/forums
Give us a visit! We're always happy to accept new members

phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada
Contact:

Re: "HDTV" Discussion

You miss the whole point of Advanced Access Content System (AACS).

Wikipedia wrote:In principle, this approach allows licensors to "revoke" a given player model (prevent it from playing back future content) by omitting to encrypt future title keys with the player model's key. In practice, however, revoking all players of a particular model is costly, as it causes many users to lose playback capability. Furthermore, the inclusion of a shared key across many players makes key compromise significantly more likely, as was demonstrated by a number of compromises in the mid-1990s.

Digital Transmission Content Protection (pdf) is an "umbrella" DRM scheme that allows the propagation of "System Renewability Messages." I interpret that to mean that your latest Blu-Ray disk can revoke your Blu-ray player's ability to talk to the HDFury.
DTLA wrote:SRMs are generated by DTLA and delivered via content.
- page 19 of DTCP Overview pdf.

RAKtheUndead
Posts: 72
Joined: Fri Mar 27, 2009 10:18 am UTC

Re: "HDTV" Discussion

Personally, I think the point's been lost on a lot of people here. Sure, 1080p isn't particularly impressive as far as resolutions on screens go, but there are two things that have to be considered when comparing computer/smartphone screens directly to televisions:

1) Unlike computer platforms, television is constrained largely by the bandwidth that can be put out by the transmitters. Under the old transmitter-to-aerial model which dominated until recently, bandwidth was rather limited. This, obviously, constrained the resolution of the transmissions, such that you had three options - go for a higher resolution and sacrifice frame rate (the PAL/SECAM model), go for a higher frame-rate and sacrifice resolution (the NTSC model) or sacrifice channels (and very few people would consider that to be an appropriate option).

2) High resolutions were available in televisions long before computer screens. Case in point: the French 819-line (roughly 768i) system, launched in 1949, which may have been limited to monochrome, but was still far ahead of computer monitors in providing high resolution. Second case in point: NHK's 1125-line analog HDTV system, in development since the 1960s, which is comparable in resolution to today's 1080i sets. The lack of resolution in televisions cannot therefore be linked to a lack of technology in terms of television screens, but to the lack of bandwidth and the necessity for backwards compatibility in NTSC, PAL and SECAM.
"The outright rejection of technology cripples the otherwise educated mind." - RAK

phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada
Contact:

Re: "HDTV" Discussion

The bandwidth is still limited. Digital technology allows us to squeeze better resolution into that bandwidth.

KDMatt
Posts: 2
Joined: Sun May 23, 2010 9:42 am UTC

Re: "HDTV" Discussion

Good morning everyone,

Let me first preface this by saying that signing up on a forum just to post a response to a thread is not usually my style...

... and that furthermore reviving a weeks-old thread is ALSO not my style... (and I fully admit beforehand this is a GIANT faux pas) but I've been visiting the site very regularly for a long time, and I keep coming back to this particular comic... and every time I read it I can't help but bash my head on my desk a bit.

I finally floated on down here hoping that someone would be able to talk a little sense about HDTVs... but alas, no; so far, most of what I'm seeing is people caught up in numbers... resolution, framerates... which is, of course, all well and good... but what I'm noticing here, and forgive me if I offend anyone by saying this, is that you're all talking in terms of pure science and technology, when a large part of how we perceive motion in films and television is psychological.

... and once again, I'll stop myself briefly before I go on, to explain that my background is presumably much different from everyone else's in this thread. I am a filmmaker, I have a degree in film, and I have extensively studied, among other things, the technology of film and digital video, as well as how people perceive both.

The first point I must address, and this has really been bugging me about this strip -- Randall, camcorders, sitcoms, and indeed all facets of SD TV are 29.97 fps. Standard Definition television is an interlaced signal with 60 alternating images per second which creates 29.97 complete frames per second. 60i does not equal 60 fps. So as much as it pains me to say it, there is a factual error in this strip... If it's any consolation however, this is the first one I've ever noticed in my years of faithful reading!

Alright, and then this brings me to all of you folks in this thread going on and on about resolution and frame rates of HDTVs and why it's sucky etc. etc.

Fact #1: Every film you've ever seen has been 24FPS. Yes, even Imax.

Fact #2: No 'film' you've ever seen has been shot or printed on anything bigger than a 70mm slide of celluloid. Imax is a 70mm format... and actually, most films are shot on 35mm. Do you really think you need more than 1,920 pixels to accurately transcribe the visual information stored in a 3.5 cm wide piece of plastic? Or 7 cm for that matter? There's only so much digital resolution you can yank out of a physical, analog medium like film before it starts looking grainy... much like taking a microscope to an old photograph.

Only a handful of movies get shot completely digitally (like Star Wars Episode III, and District 9), and even these, based on CCD limitations can't be shot at resolutions much larger than 2000 by 4000 (the exact numbers escape me ... I'm not exactly budgeting for top end equipment at the present moment) ... You can only cram so much visual information, pixel wise, from a lens into a capture device, then fit it all into a camera that's still portable enough (and cheap enough) to be useful for anything. So yes, games have always been ahead of the curve as far as pixels go ... but that's only because it's all digitally rendered, there is no camera limitation to take into account.

Also, 24 FPS looks damn good, and it's not because we're 'used to it' .. and it's most definitely not some arbitrary number that was cooked up by some cameraman in the 1920's (as someone on this thread insinuated) ... 24 frames per second, for human beings, is what's known as the "Flicker Fusion Threshold" (google this) -- it is the bare minimum number of continuous frames we need to witness in a given second to comprehend motion. This is where the psychology of films comes into play.

Furthermore, when you're watching an actual film, in a theater, being projected at 24 FPS, you are actually comprehending 48 FPS. 'Sounds crazy I know, but believe it or not, your brain sees and acknowledges the other 24 frames of nothing in a given second, interspersed with the 24 frames of motion, and filters the 'nothing' out as noise (so you're not consciously aware of it). This sudden oscillation between nothing and something serves to, almost subliminally, stimulate pleasure sensors in your brain, bringing you into what one psychologist on the subject called a 'dream-like state.' This brings me to the comparison someone made about Casablanca and Casino Royale at 24 FPS and 60 FPS -- Of course the 60 FPS version looks better, because you're getting the standard 24 FPS of something, plus 36 off-frames of nothing, creating a true 'flicker' effect, whereas the 24 FPS version does not do this. SD Television, for what it's worth, in its 29.97 interlaced frames per second, does not create this flicker effect either.

I also didn't hear much of anyone talking about the distinction between interlaced and progressive scans on this thread (and I really hope that it's not because people don't understand the difference), but for someone like me -- a film junkie -- the difference is paramount (no pun intended on this one). If you're familiar with how film moves through a projector (or a camera for that matter), it works something like this: film reels are comprised of thousands of individual exposures... as they move through the machine they are suspended in front of a light (or lens) for 1/48th of a second as a shutter opens, and then closes again... then another exposure replaces it and the process is repeated thousands of times... In essence, you are getting one image, in its entirety, every 24th of a second.

Interlaced images look fake because they don't do this. Instead they provide you with an image that is constantly being updated and rescanned over and over again in a given second. There is never one complete image to register at any given time (unlike real life), so our brain doesn't detect the 'flicker effect' nor does it seem particularly real or detailed (you're, in essence, getting a half frame at a time).

Progressive scans, however, DO give us one complete image at a time, and CAN provide us with the all-important "flicker effect" that makes movies so hypnotic. This is the real reason people are going gaga for HD: progressive scanning. Our computer monitors have had it for years, but our TVs have not. So when Joe Schmoe hooks up his brand new 1080p TV to his Blu-ray player, you'd better believe he'll think it looks better. He's got flicker! (something I don't think you even get with 480p DVDs) It's not that any single element of the technology is particularly mindblowing in and of itself, it's just that now it's all available in one standard, in one package that is a vast improvement over what was available before.

Also, I saw someone else bashing Home Theater surround sound on here? Really?

It's more than just a 'fancy echo' (at least if you're using a setup from within the last decade). Dolby Digital and DTS surround sound standards digitally encode and decode six (at the minimum!) discrete, individual channels of audio... and while most people do indeed have really crappy setups... some of us, the true home theater buffs, have decent speakers on our systems and don't even need subwoofers most of the time.

Again, apologies everyone for reviving an old thread (I hope you can forgive the n00b syndrome)... but I was having one of these moments.

phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada
Contact:

Re: "HDTV" Discussion

Thread necromancy is not actually against the rules here. We prefer it to starting a new thread.

KDMatt wrote:The first point I must address, and this has really been bugging me about this strip -- Randall, camcorders, sitcoms, and indeed all facets of SD TV are 29.97 fps. Standard Definition television is an interlaced signal with 60 alternating images per second which creates 29.97 complete frames per second. 60i does not equal 60 fps. So as much as it pains me to say it, there is a factual error in this strip... If it's any consolation however, this is the first one I've ever noticed in my years of faithful reading!

Here is where I disagree with you: 60i can be considered 60 half-resolution frames per second. It is true that the image of the last field may linger, but that creates a blurring effect (with the alternate scanlines becoming obvious). It is also true that if you want to avoid blur or preserve resolution, NTSC is only 30 fps (PAL 25).

You seem to be arguing that film "looks" better because it is hypnotic. To tell the truth, I find TV with its interlaced frames more hypnotic.

Do you really think you need more than 1,920 pixels to accurately transcribe the visual information stored in a 3.5 cm wide piece of plastic? or 7 cm for that matter? There's only so much digital resolution you can yank out of a physical, analog medium like film before it starts looking grainy ... much like taking a microscope to an old photograph.

This page estimates 5300 x 2864 (assuming 1.85:1) resolution for 35mm film. I think with black & white film, you are limited only by diffraction (microfilm exhibits rainbow fringes around detailed areas). Edit: I now see that page was dealing with still photography. Faster film has more grain and lower resolution.

Digital is cheaper, not necessarily better.

KDMatt
Posts: 2
Joined: Sun May 23, 2010 9:42 am UTC

Re: "HDTV" Discussion

Hey, again, I'm glad necromancy isn't against the rules... Most of the forums I hail from generally frown on it (especially when the 'new guy' revives a 3 year old thread, ugh).

phillipsjk wrote:Here is where I disagree with you: 60i can be considered 60 half-resolution frames per second. It is true that the image of the last field may linger, but that creates a blurring effect (with the alternate scanlines becoming obvious). It is also true that if you want to avoid blur or preserve resolution, NTSC is only 30 fps (PAL 25).

Yep, you said it precisely: it is 60 half-resolution frames, or half-frames, in a given second. Generally speaking we are only interested in complete frames in a second (at least I am), so I will adamantly maintain that SD programming is not 60 fps. Wikipedia is on my side on this...

Wikipedia article on Frame rate wrote:60i (actually 59.94, or 60 x 1000/1001 to be more precise; 60 interlaced fields = 29.97 frames) is the standard video field rate per second for NTSC television (e.g. in the US), whether from a broadcast signal, DVD, or home camcorder.

Interestingly enough though, apparently old-school gaming systems were capable of creating a 240p resolution on a standard interlaced TV.

phillipsjk wrote: You seem to be arguing that film "looks" better because it is hypnotic. To tell the truth, I find TV with its interlaced frames more hypnotic.

I was actually bastardizing and truncating the point made in a scholarly article about the psychology of film viewing. I'll have to dig it out of my archives. The author puts forward that films are "hypnotic" because the consistent 48 fps flicker stimulates the 'dream mechanism' part of our brains. Pretty interesting stuff really.

Personally I find the interlaced scanning of a TV stimulating, and the flicker of a projector very soothing, but then again that's just me.

phillipsjk wrote:This page estimates 5300 x 2864 (assuming 1.85:1) resolution for 35mm film. I think with black & white film, you are limited only by diffraction (microfilm exhibits rainbow fringes around detailed areas). Edit: I now see that page was dealing with still photography. Faster film has more grain and lower resolution.

Yes indeed, still photography is an entirely different animal. It may be the same film size, physically, but generally it has a longer exposure time, allowing more visual information (which I suppose we could call resolution) to be exposed on the film.

phillipsjk wrote:Digital is cheaper, not necessarily better.

Amen to that. I've got an old Super 8 camera that I love. Analog technology ftw.

You should see some digital 16mm transfers, they do look pretty spectacular ... now if only I were rich enough to shoot on 35mm...

phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada
Contact:

Re: "HDTV" Discussion

KDMatt wrote:Yep, you said it precisely, it is 60 half-resolution, or half frames in a given second. Generally speaking we are only interested in complete frames in a second (at least I am), so I will adamantly maintain that SD programming is not 60 fps. Wikipedia is on my side on this...

The problem is that the two fields don't occur at the same time: they are about 1/60th of a second apart. You don't get 30 "whole frames" per second unless the subject matter does not change between the two fields.

How is this for a compromise: moving objects happen at 60 fps (fields per second), still objects are updated at 30 (frames per second). 24 fps film content is carried as a weird hybrid (3:2 pulldown) averaging 24 fps. Note that for the viewer, there is no obvious distinction between "even" and "odd" fields.
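The fields-vs-frames compromise can be sketched in a few lines of Python (a toy model; the scene function, the names, and the six-row "display" are all invented for illustration):

```python
FIELD_RATE = 60000 / 1001   # NTSC field rate, ~59.94 fields per second

def scene(t, row):
    """Toy scene: brightness of a scanline at time t (a pattern drifting downward)."""
    return (row + int(t * 100)) % 256

def capture_interlaced_frame(t, rows=6):
    """One '60i' frame: even field sampled at t, odd field one field period later."""
    dt = 1.0 / FIELD_RATE
    frame = [0] * rows
    for row in range(0, rows, 2):        # even field, sampled first
        frame[row] = scene(t, row)
    for row in range(1, rows, 2):        # odd field, ~1/60 s later
        frame[row] = scene(t + dt, row)
    return frame

# Weaving the two fields gives ~30 "full" frames per second, but adjacent
# scanlines come from instants 1/60 s apart, so a moving subject shows
# combing rather than one clean image.
print(capture_interlaced_frame(0.0))   # → [0, 2, 2, 4, 4, 6]
```

For a still scene the two fields would agree and the woven frame would be a clean image; it is only motion between the fields that makes "60 fields" differ from "30 frames".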

KDMatt wrote:Amen to that. I've got an old Super 8 camera that I love. Analog technology ftw.

We used one of those (and a large TV) as a magnifier for an electronics presentation. Having used a newer "Digital8" camera, I was surprised by the superior image quality.

FlibbertyJibbit
Posts: 1
Joined: Fri May 28, 2010 1:32 pm UTC

Re: "HDTV" Discussion

Sorry this is a little after the fact but it took me a while before I sat down to do the calculation.

There are two reasons why you should be impressed with HD TV resolutions vs computer monitors.

For these examples I picked what I felt were typical examples from here: http://en.wikipedia.org/wiki/List_of_di ... el_density

one handheld (the PSP): 4.3", 16:9, 128 pixels per inch (ppi), with 130,560 total pixels

a good 17" monitor: 16:9, 112ppi, 1,440,000 total pixels

a 42" 1080p TV: 16:9, 52 ppi, 2,073,600 total pixels

So the first reason to be impressed is technological: the marvel of HDTVs is creating a device with that many contiguous and functioning pixels. I grow films, and if I have one square millimeter out of one square centimeter working I'd be happy (not happy really, but I'd take it).

The second reason has been mentioned by others, but it can easily be proven, and it is related to viewing distance. The important resolution factor is actually pixels per solid angle, not ppi. Solid angle is basically the unit you would use to describe the number of pixels in your field of view. http://en.wikipedia.org/wiki/Solid_angle

So for this calculation we will assume the devices are circular (because all the devices are 16:9, the results will be off by the same factor for each so it still works for comparison).
The equation for a solid angle of a cone (aka the field of view of our circular screens) is
2*Pi*(1-cos(theta))
where theta is the angle your eyes would move through from the center of the circle to the edge.

Assumptions: you hold the PSP one foot from your eyes, the monitor 4.5 feet, and the TV 10 feet.

Using some right triangles you get angles of about 9 degrees for each (10 degrees for the PSP).
Conveniently this means that the solid angle is nearly the same for each: 0.08 to 0.09 steradians (great unit name, by the way; it's not important here, it's just a portion of your field of view, and they all conveniently take up about the same field of view).

So the pixels per solid angle is simply the total number of pixels divided by the solid angle so the results:

TV 2.6x10^7 pix/solid angle
Monitor 1.8x10^7 pix/solid angle
PSP 2x10^6 pix/solid angle

If, let's say, you sit closer to your monitor (3 feet), then it will take up a larger field of view (0.16 steradians), but the resolution in pix/solid angle drops to 9x10^6. You can argue about which is more important, field of view or resolution, but I don't think that was the discussion.

So the HDTV has the highest apparent resolution, which explains why they look so good. Unless of course you sit too close; then it'll look worse.
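For anyone who wants to redo the arithmetic, the comparison can be reproduced in a short Python sketch (the sizes, pixel counts, and viewing distances are the assumptions above; exact trigonometry gives slightly different figures than the rounded 9-degree angles):

```python
import math

def pixels_per_steradian(diagonal_in, total_pixels, distance_in):
    """Treat the screen as a disc of the same diagonal; return pixels per
    unit solid angle at the given viewing distance."""
    theta = math.atan((diagonal_in / 2) / distance_in)   # eye-to-edge half angle
    solid_angle = 2 * math.pi * (1 - math.cos(theta))    # cone solid angle (sr)
    return total_pixels / solid_angle

devices = {
    "PSP":     pixels_per_steradian(4.3, 480 * 272,    12),   # held 1 ft away
    "Monitor": pixels_per_steradian(17,  1600 * 900,   54),   # 4.5 ft away
    "HDTV":    pixels_per_steradian(42,  1920 * 1080, 120),   # 10 ft away
}
for name, density in devices.items():
    print(f"{name}: {density:.2e} pixels per steradian")
```

With these inputs the TV comes out around 2.2x10^7 pixels per steradian, the monitor around 1.9x10^7, and the PSP around 1.3x10^6: close to, though not identical with, the rounded figures above, and the ranking is the same.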

jkovats
Posts: 1
Joined: Wed Jun 09, 2010 12:51 am UTC

Re: "HDTV" Discussion

I guess I had a bad computer and a bad TV for too long. I found HDTV really impressive. By the time I get around to Blu-ray or 3D HDTV, people will be talking about how impressed they are with their hologram TVs.

Saphy
Posts: 24
Joined: Sat Aug 29, 2009 8:44 pm UTC

Re: "HDTV" Discussion

HDTVs are pretty.

People like pretty things.

I like pretty things.

coatmaker618
Posts: 1
Joined: Thu Jun 10, 2010 7:58 am UTC

"HDTV" Discussion is all wrong!

A few people (in these forums) have mentioned that a bigger TV doesn't mean better image. This is because they are comparing total pixel count, as opposed to the "apparent angular diameter" of a pixel (read: how big the human eye perceives the pixel to be.... or put another way, how much of our field of vision a pixel takes up)

*** Warning: the following post is really long & contains some math ***

To show this point, one can use the information below representing a modern smart phone, a 55" HDTV and a 22" Wide Screen Monitor, all given in the following format:

Item Name: [Diagonal Viewing Length (inches)] [Aspect Ratio (Width:Height)] [# of Pixels (Vertical count..... as used in HDTV measurements like 720 or 1080)] [Distance from Viewers Eye (inches & meters)]

HDTV: [55"] [16:9] [1080] [120 inches (3 meters)]
Monitor: [22"] [16:10] [1440] [36 inches (1 meter)]
Smartphone: [3.7"]* [16:9] [480] [18 inches (0.5 meters)]

Finding the "apparent angular diameter" of a pixel is a 2 step process:
1) To find the absolute size of a pixel:

1a) Convert Diagonal Viewing Length to Vertical Viewing Length using Aspect Ratio:
$AspectRatioDiagonal=sqrt(AspectRatioWidth^2+AspectRatioHeight^2)$
$VerticalViewingLength={\frac{DisplaySize*AspectRatioHeight} {AspectRatioDiagonal}}$

1b) Use Vertical Viewing Length and Vertical Pixel Count to find Absolute Pixel Size:
$AbsolutePixelSize={\frac{VerticalViewingLength} {VerticalPixelCount}}$

This results in the following Absolute Pixel Sizes (in inches):
HDTV: 0.024967008
SmartPhone: 0.003779097
Monitor: 0.008097206

This shows that the smartphone clearly has the smallest pixels, not surprising considering its size... but that's only half the story!

2) Now that we know the absolute size of the pixel for each device, we must use that along with "reasonable" (always a dangerous word) assumptions about distance from the device (this is the distance between the eye** of the user to the screen of the device). This is found using "basic" trig:

$ApparentAngularDiameter=arctan({\frac{AbsolutePixelSize} {ViewingDistance}})$

The final results are below (units are milliradians or mRads):
Monitor (22") 0.22492239 mRads
Smartphone 0.20994983 mRads
HDTV (55") 0.20805839 mRads

This does put the HDTV in the lead, meaning your image will look best on that device.*** However, changing the viewing distance of the smartphone to 18.3", or making the smartphone about 0.03" smaller, would put the smartphone in the lead. This means the HDTV is only minimally better than your phone for viewing! The monitor would need a resolution of 1556 vertical pixels (vs 1440... not very practical) or to be moved about 3" further back (more practical... potentially) to achieve the same apparent resolution as the TV.

Draw your own conclusions, read my notes, and adjust your variables as you see fit.... I am sure your monitor has a different resolution, or you have a different size TV. While your details may change, this is the number which concerns the human eye!

Notes:

*) I have no idea what the margin of error is for the smartphone's screen measurement (I used the Motorola Droid, as it happens). The 3.7" is from Motorola's website, but they don't give a margin of error or standard deviation... I would not be surprised if 0.02" is within the tolerance of Motorola's measurement!

**) My distance measurements for step #2 are from the device to the cornea, not the fovea... this may be significant for the smartphone, I am not redoing my calculations at this point (but you may feel free to do so)!

***) "Look best" here concerns only this particular aspect of appearance. We are not discussing contrast, refresh rate & many other aspects of image quality:
http://en.wikipedia.org/wiki/Image_quality#Image_quality_assessment_categories
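The two-step calculation above can be sketched in Python (the device specs and viewing distances are the post's assumptions; the function name is invented for illustration). Note that at the stated 18-inch distance the smartphone works out to about 0.21 mRads per pixel:

```python
import math

def apparent_pixel_angle_mrad(diagonal_in, ar_w, ar_h, v_pixels, distance_in):
    """Milliradians subtended by one pixel at the given viewing distance."""
    ar_diag = math.hypot(ar_w, ar_h)                  # diagonal of the aspect ratio
    vertical_in = diagonal_in * ar_h / ar_diag        # step 1a: vertical screen size
    pixel_in = vertical_in / v_pixels                 # step 1b: absolute pixel size
    return math.atan(pixel_in / distance_in) * 1000   # step 2: apparent angular size

print(apparent_pixel_angle_mrad(55, 16, 9, 1080, 120))   # HDTV, ~0.208 mRads
print(apparent_pixel_angle_mrad(22, 16, 10, 1440, 36))   # monitor, ~0.225 mRads
print(apparent_pixel_angle_mrad(3.7, 16, 9, 480, 18))    # smartphone, ~0.210 mRads
```

Swapping in your own screen size, vertical pixel count, and viewing distance lets you redo the comparison for your setup, as the post suggests.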

Omerprime
Posts: 26
Joined: Thu Jun 10, 2010 9:02 am UTC

Re: "HDTV" Discussion

.cheeseofdoom. wrote:
squareroot wrote:I always thought I must be misunderstanding something - are you saying that a 1080p TV really has a vertical resolution of only 1080 pixels??

That's exactly right, sad isn't it?

Wait, WHAT?!?!?!
1080 pixels? THAT'S IT?!?!?!
*falls out the window*

ranimmo
Posts: 1
Joined: Tue Oct 18, 2011 3:18 pm UTC

Small Bone to Pick...

So I have a little bit of a problem with one of the comics of this series that I otherwise love.

Comic number 732: "HDTV"

I am a video engineer and I have to say that this comic is wrong about resolutions. 1080p is still a very difficult resolution to deal with even in the professional world. It requires very high bandwidth. In fact, a lot of professional equipment is still not capable of dealing with it over SDI, the preferred standard (most everything can do it over DVI).

There are very few resolutions that are actually higher than 1080p. In fact the only resolution that was ever commonly available that is higher than 1080p was WUXGA (1920x1200). And this resolution never took off because it only gains you 120 vertical lines and no extra horizontal pixels, which does nothing but fuck up aspect ratios, as it is 16:10 rather than 16:9. Plus this resolution is not possible to send over standard high-end video signal lines. It requires DVI, and DVI is designed for home use, not broadcast. So it will never be available in broadcast. The only truly standard resolutions (meaning professional standard resolutions, not computer resolutions) that are higher than 1080p are 2K and 4K, which are currently pretty much only used for cinema projection and movie shoots.

I would like to know what LCD monitor you had in 2004 that had a higher resolution than 1080p. Are you talking about UXGA (1600x1200)? Because that is a lower total pixel count than 1080p: 1600x1200 = 1,920,000 pixels; 1920x1080 = 2,073,600 pixels; 1,920,000 < 2,073,600. Now, maybe you had a WUXGA monitor, and if so, fair enough, that is a higher resolution. But it is literally the only widely used resolution that was ever higher than 1080p, and it is not viable for broadcast because of the non-standard aspect ratio, among other reasons.
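The pixel-count arithmetic is easy to verify (a trivial sketch; the resolution names are the ones discussed above):

```python
# Total pixel counts for the resolutions under discussion.
resolutions = {
    "UXGA":  (1600, 1200),
    "1080p": (1920, 1080),
    "WUXGA": (1920, 1200),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} total pixels")
# UXGA has more vertical lines than 1080p but fewer total pixels;
# WUXGA exceeds 1080p in both.
```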

There is also a lot more to take into consideration in a resolution than pixel count. There are hertz, color spaces, dynamic range, etc. This is why not every resolution that graphics card manufacturers come up with becomes available in broadcast. There is a lot to design around a video standard. You need to design it for use in many different pieces of equipment. It has to be viable for use with cameras, fiber optic lines, and home televisions. It also has to be designed into a countless list of different video devices for use in studios etc. Not just any resolution becomes a standard. It is much easier in the world of computers to just keep adding pixels. All you have to worry about is the video card and the monitor. So you can't compare computer graphics resolutions and broadcast resolutions. They are two totally different worlds.

So I suppose what I am trying to say is this: 1080p is very impressive! It is still the highest standard broadcast resolution, and not an easy resolution to deal with. Don't knock it!

SirMustapha
Posts: 1302
Joined: Mon Jul 21, 2008 6:07 pm UTC

Re: Small Bone to Pick...

ranimmo wrote:So I suppose what I am trying to say is this: 1080p is very impressive! It is still the highest standard broadcast resolution, and not an easy resolution to deal with. Don't knock it!

But why should almighty Randall care? When it's all about being a smug asshole, he can simply ignore irrelevant things such as reality.

Technical Ben
Posts: 2986
Joined: Tue May 27, 2008 10:42 pm UTC

Re: 0732: "HDTV"

On the bright side, ranimmo, I only just noticed my monitor was beaten by 1080p; I had been thinking 1080i all along by mistake.
That said, the old monitors did go up to 1600x1200. For a non-widescreen format, that is still pretty impressive.
PS, don't complain about us poor 16:10 users!
It's all physics and stamp collecting.
It's not a particle or a wave. It's just an exchange.

Kit.
Posts: 980
Joined: Thu Jun 16, 2011 5:14 pm UTC

Re: 0732: "HDTV"

Tektronix 4014, anyone?

4096x3120 resolution in 1974.

veeronic
Posts: 1
Joined: Sat Feb 25, 2012 1:21 am UTC

Re: 0732: "HDTV"

uh....because they are at least 20 times bigger than a cell phone and 4 times bigger than most monitors?

gmalivuk
GNU Terry Pratchett
Posts: 24758
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here and There
Contact:

Re: 0732: "HDTV"

veeronic wrote:and 4 times bigger than most monitors?
Speak for yourself. I am currently typing this with my computer connected to a 52" HDTV.
Shro wrote:I am the one who will teach the robots how to love.

If this post has math that doesn't work for you, use TeX the World for Firefox or Chrome

(he/him/his)

Lenoxus
Posts: 120
Joined: Thu Jan 06, 2011 11:14 pm UTC

Re: "HDTV" Discussion

Hi, I'm from the future! I have nothing to add about jigawatts and such. I just had to note my surprise that no one had made the connection between this comment:
meat.paste wrote:When I meet the first person who watches Lawrence of Arabia on a phone and thinks it's good enough, I will weep.

and this bit from the Oscars of two years prior. Carry on, 2010.

Quicksilver
Posts: 436
Joined: Wed Apr 29, 2009 6:21 am UTC

Re: Small Bone to Pick...

ranimmo wrote:[...] I would like to know what LCD monitor you had in 2004 that had a higher resolution than 1080p. [...] Now, maybe you had a WUXGA monitor, and if so, fair enough, that is a higher resolution. But it is literally the only widely used resolution that was ever higher than 1080p, and it is not viable for broadcast because of the non standard aspect ratio, among other reasons.
I know you'll never see this, but the IBM T220/T221 had a resolution of 3840x2400.

pds314
Posts: 2
Joined: Thu Jan 03, 2013 1:17 am UTC

Re: 0732: "HDTV"

Now we have 1080p smartphones such as the one I am typing this on... 4k screens as small as 22 inches existed as long ago as 2001, and they still cost roughly what they did back then. 22" 2400p is respectable. 55" 2160p is stupidly low resolution for 12 years later at the same price.
