It's the best time of year to buy an HDTV, so here's every confusing TV term you might encounter, explained in one place.
Resolution aka 720p vs. 1080i vs. 1080p
Resolution is pretty simple—it's the number of individual dots (pixels) that make up a display, arranged in a grid. When it comes to TVs, though, we tend to talk about it in a slightly weird way, as lines of resolution (think of a foursquare board), and we tend to do it in shorthand. So, for instance, what's considered "standard definition" is a resolution of 640 x 480, which refers to 640 vertical lines (columns of pixels) and 480 horizontal lines (rows). A 720p TV has 720 horizontal lines of resolution and, most typically, 1280 vertical ones. A 1080i or 1080p TV is 1920 x 1080. As for the whole 1080i vs. 1080p thing—the i stands for interlaced, where only every other line is drawn in each pass, while the p is for progressive scan, where the whole picture is drawn at once. Since even the cheapest sets are progressive now, you really don't have to worry about it.
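The shorthand hides how big the gap actually is, so here's the raw pixel arithmetic for the resolutions above:

```python
# Total pixel counts for the common TV resolutions.
resolutions = {
    "480 (SD)": (640, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")

# 1080p actually packs 2.25x the pixels of 720p:
print(1920 * 1080 / (1280 * 720))  # 2.25
```

So going from 720p to 1080p more than doubles the pixel count, even though "720 to 1080" sounds like a 50 percent bump.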
An important thing to consider, however, is the Lechner Distance, or the distance at which your eye can actually process all of the detail in a 1080i/p image. While you should consult a chart for your exact setup, the gist is that if you're sitting farther back than about 7 feet from a 52-inch TV, your eyeballs can't actually resolve the difference between 720p and 1080p, so you might as well save the cash.
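The math behind that 7-foot figure is just visual acuity. Here's a rough sketch, assuming 20/20 vision resolves about 1 arcminute and a standard 16:9 screen (the function name and the 1-arcminute rule of thumb are our assumptions, not an official formula):

```python
import math

def max_useful_distance_ft(diagonal_in, vertical_pixels):
    """Farthest distance (feet) at which one pixel still spans ~1 arcminute.

    Assumes a 16:9 panel and 20/20 acuity; a simplification, not gospel.
    """
    # Screen height from the diagonal: h = 9d / sqrt(16^2 + 9^2)
    height_in = diagonal_in * 9 / math.hypot(16, 9)
    pixel_in = height_in / vertical_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12

print(round(max_useful_distance_ft(52, 1080), 1))  # ~6.8 feet for a 52-inch 1080p set
```

Run it for a 52-inch 1080p set and you get just under 7 feet—sit farther back than that and the extra pixels are wasted on you.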
Motion Resolution
A somewhat trickier spec that some TV experts swear by, motion resolution refers to how well a set's resolution holds up when stuff's actually moving on the screen, like a baseball player tearing down the baseline. Plasmas tend to have better native motion resolution than LCDs, but LCD makers have been chipping away at the problem. (See "hertz," below.)
Viewing Angle
Basically, it's how far to either side of the TV you can sit and still see the picture, measured as an angle that is, naturally, less than 180º. Traditionally this was more of an LCD problem than a plasma one, but all TV technologies have had some issues in the past, and the worst offenders used to be DLP and other microdisplays.
To see viewing angle at work, start where the picture on a TV looks best, then move to one side—note where the picture starts looking weird, with the colors shifting, washing out and getting hard to see. Nicer sets reach nearly 180º, so plenty of people can take part in the HD glory.
Hertz, or What 120Hz and 240Hz Mean
Hertz is basically just the number of times the image onscreen refreshes each second. Because of broadcast standards, TVs in the US are 60Hz, meaning they refresh the image onscreen 60 times a second. (In Europe, the standard is 50Hz.) Because of this, video sources are generally 30 or 60 frames per second, and a regular video camera shoots at 60fps. So 60Hz sets are the norm.
Lately, though, you have 120Hz and even 240Hz sets, all of them LCDs. They do this to increase motion resolution—see above. A 120Hz TV refreshes 120 times a second, and it comes up with those extra frames by making them up: either duplicating the frames that are there and putting black frames in between, or splicing in intermediary frames that are basically realtime morphs of the two frames they come between. Stuff looks really smooth—sometimes too smooth, true—but the point is to fight LCD's motion-blur disadvantage against plasma.
240Hz is stickier still, promising even less motion blur, but there are two different ways to achieve it. One is kind of cheating: a 120Hz panel that uses a flashing backlight to simulate 240 frames a second. The other, more "legit" 240Hz is genuinely faster, with each image staying on the screen for just over 4ms before moving to the next. There's no real way to tell which kind of 240Hz a TV uses (though a "scanning backlight" in the specs is a tipoff that it's not the "real" 240Hz). There is a law of diminishing returns in reducing motion blur as you climb toward 240Hz and past it, but for some serious AV nerds, like Home Entertainment's Geoff Morrison, it does make LCD TVs more watchable.
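That 4ms figure is just the refresh rate flipped into a per-frame time:

```python
# How long each frame stays onscreen at common refresh rates —
# this is where the roughly-4ms figure for "real" 240Hz comes from.
frame_times_ms = {hz: 1000 / hz for hz in (60, 120, 240)}
for hz, ms in frame_times_ms.items():
    print(f"{hz}Hz: {ms:.2f} ms per frame")
```

At 60Hz a frame sits onscreen for about 16.7ms; at 240Hz it's about 4.2ms, which is why the returns diminish—there's only so much blur left to squeeze out.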