Good morning everyone,
Let me first preface this by saying that signing up on a forum just to post a response to a thread is not usually my style...
... and that furthermore reviving a weeks-old thread is ALSO not my style ... (and I fully admit up front that this is a GIANT faux pas) but I've been visiting the site regularly for a long time now, and I keep coming back to this particular comic ... and every time I read it I can't help but bash my head on my desk a bit.
I finally floated on down here hoping that someone would be able to talk a little sense and demystify HDTVs ... but alas, no; so far, most of what I'm seeing is people caught up in numbers ... resolution, frame rates ... which is all well and good ... but what I'm noticing here, and forgive me if I offend anyone by saying this, is that you're all talking in terms of pure science and technology, when a large part of how we perceive motion in film and television is psychological.
... and once again, I'll stop myself briefly before I go on to explain that my background is presumably much different from everyone else's in this thread. I am a filmmaker, I have a degree in film, and I have extensively studied, among other things, the technology of film and digital video, as well as how people perceive both.
The first point I must address, and this has really been bugging me about this strip -- Randall, camcorders, sitcoms, and indeed all facets of SD TV are 29.97 fps. Standard-definition television is an interlaced signal with roughly 60 alternating fields per second (59.94, to be exact), which pair up into 29.97 complete frames per second. 60i does not equal 60 fps. So as much as it pains me to say it, there is a factual error in this strip... If it's any consolation, however, this is the first one I've ever noticed in my years of faithful reading!
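If you want to see where that weird 29.97 number actually comes from, the arithmetic fits in a few lines of Python (a toy sketch; the 1000/1001 slowdown is the compromise NTSC made when color was added):

[code]
from fractions import Fraction

# NTSC's "60" fields per second is really 60000/1001 (~59.94), a rate
# chosen so the color subcarrier wouldn't interfere with the audio carrier.
field_rate = Fraction(60000, 1001)

# Two alternating fields (odd scanlines, then even) make one complete frame.
frame_rate = field_rate / 2

print(float(field_rate))   # 59.940059... fields per second
print(float(frame_rate))   # 29.970029... -- the famous 29.97 fps
[/code]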
Alright, and then this brings me to all of you folks in this thread going on and on about the resolution and frame rates of HDTVs and why they're sucky, etc., etc.

Fact #1: Every film you've ever seen has been 24 fps. Yes, even IMAX.

Fact #2: No 'film' you've ever seen was shot or printed on anything bigger than a 70mm strip of celluloid. IMAX is a 70mm format... and actually, most films are shot on 35mm.
Do you really think you need more than 1,920 pixels to accurately transcribe the visual information stored in a 3.5 cm wide piece of plastic? Or 7 cm, for that matter? There's only so much digital resolution you can yank out of a physical, analog medium like film before it starts looking grainy ... much like taking a microscope to an old photograph.
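Here's a rough back-of-the-envelope version of that argument. Fair warning: the usable image width on 35mm stock and the resolving power of a release print are ballpark assumptions on my part, not gospel -- they vary with the stock, the lenses, and how many duplication generations the print has been through:

[code]
# Back-of-the-envelope film-vs-pixels math. ALL numbers below are
# assumed ballpark figures, not authoritative specs.
image_width_mm = 24.0    # assumed usable image width on a 35mm strip
pixels_across = 1920     # a 1080p-width scan

pixels_per_mm = pixels_across / image_width_mm   # 80 px/mm
nyquist_lp_per_mm = pixels_per_mm / 2            # 40 line pairs/mm capturable

# A commonly quoted ballpark for a theatrical release print, after several
# duplication generations, is somewhere around 20-30 line pairs/mm.
print_resolving_power = 25.0  # assumed

print(f"scan captures ~{nyquist_lp_per_mm:.0f} lp/mm "
      f"vs ~{print_resolving_power:.0f} lp/mm on the print")
# Once the scan's Nyquist limit exceeds what the print resolves, extra
# pixels mostly buy you sharper grain, not more picture detail.
[/code]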
Only a handful of movies get shot completely digitally (like Star Wars Episode III and District 9), and even these, owing to CCD limitations, can't be shot at resolutions much larger than 2000 by 4000 (the exact numbers escape me ... I'm not exactly budgeting for top-end equipment at the present moment) ... You can only cram so much visual information, pixel-wise, from a lens into a capture device and still fit it all into a camera that's portable enough (and cheap enough) to be useful for anything. So yes, games have always been ahead of the curve as far as pixels go ... but that's only because it's all digitally rendered; there is no camera limitation to take into account.
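For scale, here's what those sensor numbers work out to against the standardized digital-cinema container sizes (the 2K/4K figures below are the DCI standards; which exact camera shot which film I'm recalling from memory, so I've left that out):

[code]
# Pixel counts for common capture/display formats. DCI 2K and 4K are the
# standardized digital-cinema container resolutions.
formats = {
    "DCI 2K":    (2048, 1080),
    "DCI 4K":    (4096, 2160),
    "1080p HD":  (1920, 1080),
    "SD (NTSC)": (720, 480),
}

for name, (w, h) in formats.items():
    print(f"{name:10s} {w}x{h} = {w * h / 1e6:.1f} megapixels")

# Even "4K" is under 9 megapixels per frame -- and a film camera has to
# capture, process, and record 24 of those EVERY SECOND, which is why
# sensor resolution lags so far behind what a game engine simply renders.
[/code]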
Also, 24 fps looks damn good, and it's not because we're 'used to it' ... and it's most definitely not some arbitrary number cooked up by some cameraman in the 1920s (as someone in this thread insinuated) ... 24 frames per second sits right around what's known, for human beings, as the "flicker fusion threshold" (google this) -- roughly the minimum number of discrete frames we need to see in a given second to perceive continuous motion. This is where the psychology of film comes into play.
Furthermore, when you're watching an actual film, in a theater, being projected at 24 fps, you are actually comprehending 48 fps.

'Sounds crazy, I know, but believe it or not, your brain sees and acknowledges the other 24 frames of nothing in a given second, interspersed with the 24 frames of motion, and filters the 'nothing' out as noise (so you're not consciously aware of it). This rapid oscillation between nothing and something serves, almost subliminally, to stimulate pleasure sensors in your brain, bringing you into what one psychologist on the subject called a 'dream-like state.'

This brings me to the comparison someone made between Casablanca and Casino Royale at 24 fps and 60 fps -- of course the 60 fps version looks better, because you're getting the standard 24 fps of something, plus 36 off-frames of nothing, creating a true 'flicker' effect, whereas the 24 fps version does not do this. SD television, for what it's worth, with its 29.97 interlaced frames per second, does not create this flicker effect either.
I also didn't hear much of anyone talking about the distinction between interlaced and progressive scans in this thread (and I really hope that's not because people don't understand the difference), but for someone like me -- a film junkie -- the difference is paramount (no pun intended on this one). If you're familiar with how film moves through a projector (or a camera, for that matter), it works something like this: a film reel comprises thousands of individual exposures ... as they move through the machine, each is suspended in front of a light (or lens) for 1/48th of a second as a shutter opens and then closes again ... then another exposure replaces it, and the process repeats, frame after frame, for the entire reel... In essence, you are getting one image, in its entirety, every 24th of a second.
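If it helps, here's that cadence as a tiny simulation. This models the simplified open/close pattern I just described; real projector shutters vary, and plenty of two-blade designs actually flash each frame twice:

[code]
# A toy model of the projector cadence described above: 24 frames per
# second, each held in front of the lamp for 1/48 s, with 1/48 s of
# darkness while the next frame is pulled into place.
FPS = 24

def shutter_events(seconds=1):
    """Yield (start_time_s, what_the_eye_sees) for each shutter interval."""
    interval = 1.0 / (FPS * 2)   # 1/48 s, about 20.8 ms
    t = 0.0
    for frame in range(FPS * seconds):
        yield t, f"frame {frame} lit"
        t += interval
        yield t, "darkness (film advancing)"
        t += interval

events = list(shutter_events())
print(len(events))    # 48 intervals: 24 flashes of image, 24 of nothing
print(events[:4])     # first two light/dark cycles
[/code]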
Interlaced images look fake because they don't do this. Instead, they provide you with an image that is constantly being updated and rescanned, over and over, within a given second. There is never one complete image to register at any given moment (unlike real life), so your brain doesn't detect the 'flicker effect,' nor does the picture seem particularly real or detailed (you're, in essence, getting half a frame at a time).
Progressive scans, however, DO give us one complete image at a time, and CAN provide us with the all-important "flicker effect" that makes movies so hypnotic. This is the real reason people are going gaga for HD: progressive scanning. Our computer monitors have had it for years, but our TVs have not. So when Joe Schmoe hooks up his brand-new 1080p TV to his Blu-ray player, you'd better believe he'll think it looks better. He's got flicker! (Something I don't think you even get with 480p DVDs.) It's not that any single element of the technology is particularly mind-blowing in and of itself; it's just that now it's all available in one standard, in one package that is a vast improvement over what was available before.
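The interlaced/progressive difference is easy to show in miniature, too. A toy sketch, with plain Python lists standing in for scanlines (real deinterlacers are far more sophisticated than this, of course):

[code]
# A toy 6-line "frame", with strings standing in for scanlines.
frame = [f"scanline {i}" for i in range(6)]

# Interlaced transmission splits it into two fields sent ~1/60 s apart:
odd_field = frame[0::2]    # lines 0, 2, 4 -- sent first
even_field = frame[1::2]   # lines 1, 3, 5 -- sent 1/60 s later

def weave(odd, even):
    """Re-interleave two fields into one full frame ('weave' deinterlacing)."""
    full = [None] * (len(odd) + len(even))
    full[0::2] = odd
    full[1::2] = even
    return full

assert weave(odd_field, even_field) == frame  # spatially complete again...

# ...but the two fields were captured at different instants, so anything
# that moved between them "combs": there is never one complete,
# self-consistent image on screen at a single moment. Progressive scan
# sends all six lines from the same instant -- which is the whole point.
[/code]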
Also, I saw someone else bashing Home Theater surround sound on here? Really?
It's more than just a 'fancy echo' (at least if you're using a setup from within the last decade). The Dolby Digital and DTS surround sound standards digitally encode and decode six (at minimum!) discrete, individual channels of audio... and while most people do indeed have really crappy setups ... some of us, the true home theater buffs, have decent speakers on our systems and don't even need subwoofers most of the time.
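For the record, here's what "discrete" means in this context -- the layout below is the standard 5.1 arrangement both codecs carry (the channel labels are the usual shorthand):

[code]
# The standard 5.1 layout carried as six discrete (separately encoded)
# channels by both Dolby Digital (AC-3) and DTS. No derived "echo" here --
# each speaker gets its own independent signal.
CHANNELS_5_1 = {
    "L":   "front left",
    "C":   "front center (mostly dialogue)",
    "R":   "front right",
    "Ls":  "left surround",
    "Rs":  "right surround",
    "LFE": "low-frequency effects (the '.1' -- typically sent to a subwoofer)",
}

for label, role in CHANNELS_5_1.items():
    print(f"{label:4s} -> {role}")
print(f"{len(CHANNELS_5_1)} discrete channels")
[/code]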
Again, apologies everyone for reviving an old thread (I hope you can forgive the n00b syndrome) ... but I was having one of these moments: