Whatever Happened to Plasma TVs? (howtogeek.com)
82 points by bookofjoe on Aug 21, 2022 | 132 comments



Too much power, too much heat, and not bright enough in showrooms to compete against LED TVs, even though plasmas objectively had better picture quality than LED TVs.

I have a 720p Samsung plasma circa 2011 that looks better than a cheap 1080p LED from a few years ago. I will likely replace it with a 4K OLED in the next year or two and will keep that one for another decade or so.


I just replaced my 2010 LG 720p plasma with a 4K OLED. The color of the plasma is still stunning for video content. Since I used a NUC to drive the plasma, I did get some burn-in from the onscreen menu, but after making that menu disappear after use, the burn-in pattern also disappeared after about four months. That surprised me; I thought burn-in was forever, but not necessarily, it turns out.

The most compelling reason for the upgrade was that when I used the TV in "computer screen mode" it just didn't have enough pixels for the 1080p resolution that is standard these days in computer land. But I had to get OLED tech to match the color quality of the twelve-year-old plasma.


I just replaced my E550 Samsung plasma from 2012 with a QD-OLED precisely because LCD tech just wasn't better. Even on the QD-OLED, the extra brightness and 120Hz are nice, but it isn't as massive a leap as I expected (though some of the inaccurate colour modes certainly give that impression). It's also far too complicated, with different modes for different content, and some content looks terrible unless you disable the built-in processing.


I recently helped my sister-in-law move house and smiled a little at her small TV. It looked like a flat screen from one of the first generations.

Of course, I also configured the internet access, which included testing that Netflix etc. worked smoothly. When the stream reached the highest quality level after a few seconds, I was extremely surprised by the sharpness of the image, the clarity, and the color rendering.

It was of course an "old" plasma TV, and it makes the panel in my TV at home look pretty bad.


I had a Panasonic from about 2008. It finally started having image burn-in problems and some yellowing from too much CNN running in the background. It had great speakers and rich colors until near the end. The Samsung I replaced it with is terrible in so many ways: I had to buy a sound bar to fix the audio, the color is not as rich, and it has tearing issues with fast movement, like when watching sports. I would get rid of it, but I'm not sure other options are better and I hate contributing to e-waste.


I also had a 2008 Panasonic plasma. It felt great: the image quality, the amazing sound, the menus, the consistency of interactions (latency on every operation was always the same), the beautiful form factor, the large and high-quality remote. It felt like something from the past, when good TVs were expensive and made to look and feel like it.


I still have a 50" Panasonic Plasma as our bedroom TV. It's 1080p. I still think it has a better picture than a 1080p LCD. I'll be really sad when/if it dies.


The heat that came off my Panasonic GT30 was _crazy_. I replaced it earlier this year with an LG OLED after it mysteriously wouldn't power up one day. Still, it lasted the best part of a decade, which isn't too bad.


How can you have all those problems and still have better quality?


Power consumption doesn't affect picture quality, and brightness wasn't an issue in a typical home TV setting, which is usually a lot dimmer than a retail sales floor. Plasmas were amazing at most of the things that do benefit picture quality.


Deeper blacks leading to better contrast and saturation mostly, although OLEDs are again superior in that regard.


I just sold my 11 year old Hitachi plasma TV a few months ago. Honestly, despite its age it had the best picture quality of all my TVs until I bought an OLED to replace it.

The biggest downside to it? It put out a tremendous amount of heat. It got to where I didn't like using it during the summer because it would heat up the room. It was basically like running a space heater.


I still have my Hitachi plasma (1080i) and I won't get rid of it till it dies! Great TV, but power hungry, hot, and extremely heavy!


I rented a place during the summer that had a plasma TV in the master bedroom. I had to disconnect it, and never used it, because it generated so much heat that we could not sleep.


Sure but it makes a great faux fireplace at Christmas :)


While I love my OLED TV (an LG C8, had it 4 years now), there is one thing I still preferred about my plasma TVs, and that is motion. OLED is great, but the near-instant pixel response means that lower-frame-rate content such as movies (24p) and most TV shows (25 or 30p) stutters. This isn't visible in many scenes, but with anything involving panning it looks, to me, quite jarring, and I dislike it a lot.

This isn't an issue with plasma screens, as you get some natural motion blur from the pixel response times, giving a smoother overall presentation.

You can somewhat alleviate the stutter on OLEDs by using motion processing on low (or special "cinema" settings on newer models), but it isn't perfect, as it introduces slight artifacts, which also suck; plus the smoothing gives an ever-so-slight soap opera effect that wasn't present on plasma screens.

However, for gaming, that near-instant pixel response for 60 or 120fps games is freaking awesome. You really can't beat OLED for a gaming display. Just amazing.


> OLED is great but the near instant pixel response means for lower frame rate content such as movies (24p) and most tv shows (25 or 30p) there is stutter

One thing to check is that it's not judder from a frame rate mismatch or dodgy motion smoothing. Usually if something is noticeably jarring, it's judder rather than frame rate.

It can be notoriously difficult to get these settings to do what you want on most TVs. For me, I always want "no motion smoothing but with automatic dejuddering enabled", which can be almost impossible, or even impossible in the case of some Samsung TVs.

Worse, some device/app combinations seem to just introduce their own judder. For example BBC iPlayer is awful for judder and stuttering, and Amazon Prime on Android TV seems to struggle with some shows sometimes. Android TV is a bit janky in general. Apple+ on Apple TV is always perfect.

What I normally do to conclusively check for judder is use the slow-motion video on my phone, recording at 1/8th speed, to look for duplicated frames or other weirdness.
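
If you want to automate that check, here's a minimal sketch, assuming OpenCV and NumPy and a clip recorded off the screen (the filename and difference threshold are made-up examples):

    # Hypothetical duplicated-frame detector for a clip recorded off the TV.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("tv_panning_clip.mp4")  # assumed filename
    prev, idx = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            # A mean absolute difference near zero suggests a repeated frame,
            # i.e. judder from a frame-rate mismatch rather than 24p cadence.
            if np.mean(cv2.absdiff(gray, prev)) < 1.0:  # assumed threshold
                print(f"possible duplicated frame at index {idx}")
        prev, idx = gray, idx + 1
    cap.release()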

Interestingly, I believe some amount of subtle panning judder is actually created by cinematic filming. Convention is to use a 180 degree shutter, which means each frame only has half the expected motion blur. 180 degrees is usually the best for 24Hz, but it can make certain types of panning shot look a bit unnatural IMO.
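
The shutter-angle arithmetic behind that is simple; a quick sketch (only the 180-degree convention comes from the comment above, the rest is illustration):

    # Per-frame exposure time implied by a shutter angle.
    def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
        """Seconds of motion blur captured per frame."""
        return (shutter_angle / 360.0) / fps

    # At 24fps with a 180-degree shutter, each frame is exposed for 1/48 s,
    # i.e. half the 1/24 s frame interval, hence half the expected blur.
    print(exposure_time(24))  # ~0.0208 s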


The issue is that a fast pixel response time does not fix a slow refresh rate. If you strobe the image you have a good solution, but almost all content is sub-60Hz, and strobing at such low rates is a real headache. Pretty much the only option that can work is temporal interpolation, and that's a comparatively huge can of worms.

https://blurbusters.com/blur-busters-law-amazing-journey-to-...


Having watched a number of (old, uploaded) films on YouTube on my TV, I've noticed that there can be really serious judder - you don't need a phone to see a frame skip/duplication in a pan; it jumps out and is impossible not to see. I haven't investigated enough to tell whether this is a defect in the original transfers (a bad 24->30 conversion) or the TV trying to force it.


Uhh, 24fps is always choppy, even without judder. Stuttering is not even a possible result of pixel response speeds, only choppiness, so I assume that's what you mean. The only way to "fix" this is interpolation.

So what are you claiming, that some slower pixel response times act as interpolation? Then just interpolate: insert frame 1.5 (frames 1 and 2 overlaid on top of each other at 50% opacity) between 1 and 2. Still not smooth enough? Just add more interpolation. There is literally no difference between this and slow pixel transitions that were supposedly timed to hide the choppiness. You could even do this all in software and hook it up to the TV. And of course OLED TVs can handle this, as long as they allow higher refresh rates; some OLEDs even have this built in. Enjoy your soap operas.
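
A minimal sketch of that 50%-blend interpolation, assuming NumPy uint8 frames (the function names are made up):

    import numpy as np

    def blend_midframe(f1: np.ndarray, f2: np.ndarray) -> np.ndarray:
        """Naive 'frame 1.5': both frames overlaid at 50% opacity."""
        mid = (f1.astype(np.float32) + f2.astype(np.float32)) / 2
        return mid.astype(np.uint8)

    def blend_double(frames: list) -> list:
        """Insert a blended mid-frame between each consecutive pair,
        roughly doubling the frame rate (e.g. 24fps -> ~48fps)."""
        out = []
        for a, b in zip(frames, frames[1:]):
            out += [a, blend_midframe(a, b)]
        out.append(frames[-1])
        return out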

> This isn't an issue with plasma screens as you get some natural motion blur with the pixel response times giving a smoother overall presentation.

I thought plasma pixel transitions are as fast as CRT?


>24FPS is always choppy

Have you been to cinemas?


Have you never noticed that a panning camera in a movie is much more juddery than a decent side-scrolling video game? Or that action scenes often lack clarity due to camera shake and related motion blur?

Or that the move from HD to 4K is near-worthless without a framerate increase?

Maybe it's not bad enough to match your personal definition of 'choppy', but 24fps is an unfortunate compromise made in the days when you had to feed costly physical film into movie cameras.

And now everyone is so familiar with 'the cinematic look', we're stuck with 'movies should look like that, as that's how they've always looked', anything different is assumed to be worse, and we're never going to learn how to do high-framerate filmmaking well as the early attempts were so quickly rejected.


I use an Apple TV to watch all of the content on my TV – streaming services, Plex, apps for TV channels, etc. Since they added the match frame rate feature[0], the judder is eliminated entirely. It's a bit jarring as the screen flashes blank when transitioning from the home screen (60fps) to video content, as the TV's frame rate is adjusted, but it's worth it, IMO.

0. https://support.apple.com/en-us/HT208288


Yes, why would that change anything? The only reason it's bearable is because they specifically engineer the movies to avoid motion above a certain speed as much as possible.


How much of this is "pixel response" and how much is size? Newer TVs correlate strongly with larger TVs. Want to see judder? Watch a 24fps projection in a huge theater. I've seen jerky panning on big screens throughout my life, even in the film days (I guess film projection doesn't have any persistence of image either - but this, to my mind, would still be a rebuttal to the idea that a display with smear, like a CRT, is "better" vs. just accidentally blurry in sometimes-useful ways). I don't remember my plasma being any better at this than the LCD that replaced it at the same size, but it's been a long time. I certainly didn't immediately notice anything getting worse (the black levels, on the other hand...).

(And what if you want crisp motion, e.g. for 60p sports broadcasts compared to slow panning shots? Let's not go too far to praise one inferior technology for hiding problems of another one!)


You certainly get judder and stutter from projection, and you do still get it on plasma; it's just that, to me, it's nowhere near as jarring to the eye as it is on my OLED.

Size, as far as I am aware, is not a factor. My OLED is 65" and my final plasma was 60", so similar enough in size; plus the same is true on a 55" OLED (and even smaller).


> You can somewhat alleviate the stutter on OLEDs using motion processing on low (or using special "cinema" settings on newer models) but it isn't perfect as it introduces slight artifacts...

Same here with my ~5-year-old LG C6V. In my case, funnily, it especially has a problem with anything small that flies, be it birds or helicopters, etc.: they're often displayed garbled or with split upper/lower sections. Weird.


Yes, this is my exact experience. Plasma is still the best picture for movies because of the superior motion handling of low framerate content.


It's not stutter, but "AI motion smoothing" or whatever it's named on LG TVs.


Nothing happened to them. Nothing ever happens to them! There's a now twenty year old Panasonic still hanging on my wall that refuses to stop working flawlessly and thus allow me to guiltlessly replace its 480 fat & smugly glowing rows of pixels with something a little more modern.


If you're looking for a reason to replace it, you could probably work out how much it's costing you in energy (and possibly extra AC workload if you're in a hot country!) vs. an OLED.


Saving a few bucks by degrading the viewing experience may not be a good choice for most people. You know, you can save 100% by living in a tent in the forest.


A QD-OLED (or even a normal modern OLED) is probably a better viewing experience than an SD plasma. The issue is that the LCDs that were dominant for a while weren't.


I guess you aren't in Europe currently. Right now I'm paying 71¢ per kWh, so that plasma TV would easily cost tens of dollars every month.


480p? I mean, do you need any more reasons?


If you are fine with 480p, you may want to check your eyes. You may need glasses or new glasses.


It's kind of a trolley problem: I'm not "fine" with 480p, but also not fine with throwing away stuff that still works. That seems to mean the status quo wins until the thing stops working.


I have to disagree. 480p on the web might be awful, but a well-encoded DVD still looks really good.


On a small screen it's not extremely bad, but you lose a lot of information.


DVDs are natively 480. There isn't any more information on the disc to lose.

Would it be nice to have a 4k transfer from the original 35mm? Yes. Can you get that from the film licensees? Not necessarily.


He might have the common "mounted the TV over the fireplace" affliction which means it's so far away you can't tell the difference.


It must be a very big room and a very small TV to not tell the difference. Or bad vision.


Those of us who grew up in the analogue TV era are simply used to it.


Plasma TVs had the best image quality, dynamic range, and viewing angles until OLED came along. I believe the Panasonic ZT60 was the last really good one and it was discontinued at the end of 2013. I still have mine and it's an excellent TV, IMO it looks better than most of the LED TVs I see at the store (not as good as the OLEDs though).


They weren't very bright. Even the best plasma TVs had a peak brightness of a mere 100 nits.


If you want a TV with amazing picture quality and don't want to pay a lot, there's nothing better than an old plasma TV. I ended up with a Panasonic 42" (Viera?) model, one of the later ones made. It's definitely quite heavy, after a while it gets a bit hot, and it probably consumes a lot of power, but it easily surpasses the picture quality of today's LCD TVs. The only way to get close to its picture quality is with the new fancy OLEDs, etc.

I also paid $60 for it. Hard to beat.


I had a Panasonic Viera during the plasma glory days and I didn't really like it.

The worst problem I had with it was that, like all plasmas, I can see a high frequency flickering on them, especially in my peripheral vision.

The second worst problem is they're not very bright.

I have an LG C1 now and love it (though there are a few minor issues with that too!).


I was just trying to figure out what my model is since I've never experienced the flicker, but I'm not home and it's too difficult. Looks like Panasonic has used the Viera designation for a long time and they continue to use it, so there's probably tons of different variations.

And yeah, it's not exceedingly bright, but I've only ever noticed the issue if there's direct sunlight shining on it, which is pretty rare.


Mine was one of the "600Hz sub field drive" models.

I am quite sensitive to flicker in general, though. I see it on DLPs, CFL bulbs and even a lot of LEDs that use PWM. For example, in my area a lot of the new tunnel LED lights have flicker in my peripheral vision, which can be a bit disorienting.

With the Viera, it was worse in a bright room (perhaps due to my pupils being smaller) so my solution was to try and keep the room dim when the TV was in use, but that wasn't always feasible.


I just sold one on behalf of a friend who was moving, probably in the same series as yours - I was very surprised at how easily perceptible the flicker was. It also had some "dot crawl" noise even with HDMI, which seemed odd; could have been some mediocre capacitors already on the edge of normal tolerance after 11 years. It looked great with high-motion colorful content, but the flicker and the noise really bugged me with static scenes. I was just testing it though, and ended up selling it for only $100 at a yard sale (46" 1080p). If I didn't already have a 30" 1080i LG CRT with an HDMI input I'd have considered keeping the plasma. The CRT needs some work though; it's not maintaining normal horizontal image size anymore. It's tough to find shops that will work on CRTs now, only one or two in my region, and I'll have to lug the thing out of my basement and drive a couple hours for it. I haven't tried taking it apart yet, but I don't have a real oscilloscope or the service manual.


The higher-end models have higher refresh rates and will flicker less. I've still got a GT60 running and don't notice the flicker. (I can see it when blinking my eyes rapidly, but who does that while watching TV? ;) )


> don't want to pay a lot

Don't want to pay a lot up front. You'll pay a lot to run it.


Did I just enter some bizarro world where 500W is now considered high?


In Germany, keeping 1W running continuously costs you around 3€ a year (see below). So, if you were to leave 500W on continuously, we'd be talking 1500€ a year.

   1W x 24h/d x 365d = 8760Wh
   8.76kWh x 0.3525€/kWh = 3.09€
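
The same arithmetic generalized, so you can plug in your own wattage, rate, and viewing hours (the default rate is the 0.3525€/kWh figure above; everything else is an example):

    def annual_cost_eur(watts, rate_eur_per_kwh=0.3525, hours_per_day=24.0):
        """Yearly running cost of a device at a given electricity rate."""
        kwh_per_year = watts * hours_per_day * 365 / 1000
        return kwh_per_year * rate_eur_per_kwh

    print(annual_cost_eur(1))                     # ~3.09€ per continuous watt
    print(annual_cost_eur(500))                   # ~1544€ if left on 24/7
    print(annual_cost_eur(500, hours_per_day=4))  # ~257€ at 4h of TV a day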


Just don't run it 24/7, then. I feel like anyone who can actually notice this already has an apartment that provides free electricity. EDIT: Wait, my feeling was right. Your rates are also an order of magnitude above the rest of the world, for some reason. For me running a 500W machine will cost maybe $200 per year with real world usage.

> Power prices in Germany are among the highest in Europe. The high costs partly are due to the mandatory support for renewable energy sources – but most customers continue to support the country's energy transition regardless [result of some web search]

Fuck outta here with this. As I stated, 500W is nothing. The alternatives (half-working LEDs) are pure garbage. You are all propagandized idiots still consuming and trashing at the same rate but now buying stuff because it's """green""" (TM) while the company knows they are not fixing anything.


If it uses 500W it must be a heck of a TV. My last-gen plasma doesn't use more than 180.

The simple solution is to just turn the TV off when you're not watching it. Cut viewing time back to 1-3h/day and you're under 200 euros per year. Don't forget that large LCD TVs use plenty of power too, so you're not going to save all of that by replacing it with an LCD.


It's interesting to see how plasma TV power depends on what's on screen. A mostly dark image and the watts take a dive. Bright images and suddenly the power consumption skyrockets.


Michael Scott?


I have a 2010 Panasonic G25 Plasma and for movies and television I prefer it to my LG C1. The picture, colors, and motion handling all just appear more natural to me.

The LG is hands-down better for games, but everything else tends to look a bit off. The response time is just too quick for movies and video to appear natural. Filmmaker mode makes this a lot better, but it is still there.


> The response time is just too quick for movies and video to appear natural

Sorry, what does this mean?

From my understanding (could be very wrong) when you playback video you just display every frame still for 1/framerate seconds. Why would response time affect it in any way, especially the bad way?


The above poster is likely seeing a motion-smoothing feature common on TVs today. This takes the digital video stream and generates "in-between" frames to artificially raise the frame rate, and in some cases deliberately reduce motion blur.

It looks good for sports and video games, but films proper make a lot of specific choices about frame rate vs motion blur, which these filters just crush out of existence if they're cranked up too high. A very good scene to see this clearly is the opening of Saving Private Ryan.


Rtings does a good job of reviewing for this and showing screenshots, but basically, because the per-pixel response time is so fast, an object in motion across the screen does not blur as much on the LG OLEDs, and on most OLEDs in general. Great for games, terrible for movies, so the OLEDs do have some tricks to help mitigate this a bit, but it is still there.


Isn't that backwards? AFAIK real-world filming equipment naturally captures motion blur on each frame, but many games only show a series of unnaturally sharp still images. On the other hand GPUs are now powerful enough that they do generate some motion blur as well.


I have a late-gen Samsung plasma and it really is still impressive. One thing I think it handles way better than OLED/LCD models is broadcast/lower-resolution/low-bitrate content.


Do you work at Tom's Hardware? This is why you can't buy a TV based on internet reviews.


I bought the LG for games and it is great at that. It is also pretty good for movies and television; just the motion handling is worse for low-framerate content. I am sensitive to this, as I find the motion handling of a lot of LCD TVs to be poor as well.

I will add that another part of the issue is that it is almost too good in terms of response time and motion handling. There is not enough blurring. Everything appears so sharp and crisp and motion resolution is so high that movie sets look like movie sets. When something isn’t real in the scene it is very clear. It is a problem of OLEDs being beyond the technology films are shot for. No one expects that you will be able to see everything in sharp detail during fast moving scenes.


That definitely doesn't sound like a conclusive analysis.


My in-laws bought a "big" plasma TV in the mid-2000s after much research and deliberation. A few weeks ago they won a new LED TV and set the plasma aside.

I'm not sure if it's the sum of money they spent originally or a real difference, but my mother-in-law still prefers the plasma TV.

Has anyone else had similarly positive feelings towards plasma versus recent new TVs?


Plasma has much better black levels than all LCDs. Only OLEDs are superior. (Or, at least, equal.)

The downsides of plasma are weight, heat, and screen burn. (Though OLEDs get screen burn too.)


Lower resolution is what finally killed plasma displays. They couldn't support 4K at decent TV sizes just as the industry was moving to 4K. (Though FHD is fine if you only watch videos.)


> decent size TV

That is a bar that moves every few years as well.


100" was about the minimum practical size for a 4k plasma TV at the time, which is perhaps a bridge too far.


75" seems to be par now, so not far off!


If you had a dark room, plasma was highly superior until very recently.

The actual pixels made their own light, which meant that if they were dim or off, they were off - most LED screens until recently just have a backlight (or a backlight with sections) and so in a dark room on a black screen you can see the light leak through.


Yes, I have nothing but fond feelings for my old Pioneer. It was a spectacularly good display, but I don't miss it; I just remember it well. It did a better job processing 24-frame film content than my LG OLED does, but the OLED beats it in every other way, especially brightness and energy consumption. Also, that plasma made a weird, highly directional audible sound if you sat directly in front of its center, which is not a quirk I've suffered with OLED.


I still have the 50" Pioneer plasma; works like a champ. I don't think Costco would take it back if I returned it after, like, 15 years, unfortunately.


I had a Panasonic plasma TV at the time and it was great. Way better picture than any competing LCD at the time.

But you had to be careful: it didn't like static pictures, which could burn in, similarly to OLEDs later. If you watched movies, it was not a problem. If you let some station run 24/7, with a static logo somewhere, that could be a problem.


I prefer my plasma for movies and television. Plasmas typically handle motion very well; it's CRT-like. OLEDs handle motion well if the content is high frame rate, but when the frame rate of the content is low, the fast pixel response makes it painfully apparent.


Too expensive, chunky, and power hungry while LCDs kept getting cheaper and better.


>and power hungry

This is an important point. My household uses very little power, and when we eventually switched out our old plasma TV after a decade or so of use, our power bill halved. It's hard to pin it all on the TV because this is hardly a scientific experiment, but that was the only notable change during that period.


My LED bulbs paid for themselves in 6 months of reduced consumption. My new TV that replaced a plasma TV will take 3 years to do so.

(EU electricity prices. With increasing prices expected it is probably going to take less than originally estimated.)
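
The payback arithmetic behind estimates like that is straightforward; a sketch with made-up example numbers (not the commenter's actual figures):

    def payback_years(price_eur, watts_saved, hours_per_day, rate_eur_per_kwh):
        """Years until the energy savings cover the purchase price."""
        savings = watts_saved * hours_per_day * 365 / 1000 * rate_eur_per_kwh
        return price_eur / savings

    # e.g. a 400€ TV drawing 250W less than the plasma, watched 4h/day
    # at 0.35€/kWh:
    print(payback_years(400, 250, 4, 0.35))  # ~3.1 years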


Everyone I know who got a plasma TV had it die within just a couple of years. The article says burn-in became less of an issue, but I never saw one that didn't have a burn-in problem. Even if that problem was solved, the public sentiment was that they were heavy, expensive, and unreliable.


I bought mine in 2010 and it's still working flawlessly to this day. It's my family's only TV and it typically sees about 2 to 4 hours of use per day.


I've still got a Panasonic plasma from circa 2008, which still seems absolutely bomb-proof. It's only used occasionally now for some retro gaming (Dreamcast or MisterFPGA), but the picture, UI, remote, and everything about it still scream rock-solid engineering. It does weigh an absolute ton; I would guess around 40 kg. It's impossible for one person to lift/move. By comparison, we have a 65" 4K HDR panel in the living room, which I can easily lift and move around on my own without breaking a sweat. The 65" feels nowhere near as well engineered though. Its OS seems to lock up and become unresponsive at least once a month, which the Panasonic plasma has never done once in its whole lifetime.


Same. My Panasonic plasma from 2007 still works fine, although mine sits in the garage. I use a projector now. I tried selling the plasma but nobody wanted it, not even for $50. Picture quality is amazing for 1080p sources and low-light viewing.

It disturbs me to think how many millions of plasma TVs must have been made. Now they are unwanted with nowhere to go. I like to imagine a highly efficient recycling program stripping them down for parts. In reality, I suspect most end up in landfill and aren't worth much to the recycling business.


My parents' 2005/6 Panasonic 1080i plasma looks better than my much more recent Samsung non-OLED LCD. Both are fine, but the plasma is better. Similar usage to yours, probably, with more of it earlier in its life.


Not strictly related, but I remember how 3D was the feature every TV set needed to have some years ago, and now it's not even a thing anymore.

My indestructible TV set (it has survived my son's first 3 years of life, so far) has 3D, and I remember having fun even playing games with this feature.

What happened to 3D is a mystery to me.


A rare, fascinating case where an enormous marketing campaign to make people want something that they didn't actually want _failed_.


I had zero interest in that sort of fake 3D (though I like IMAX and have a VR kit) but kind of hoped it would bring actual 120Hz panels to the market. It didn't happen; they were all 60Hz input. Such a colossal waste.

Let's not forget Nvidia also tried to capitalize on the whole scam with their 3D Vision crap. Oh, and I was shown more than a few 30Hz-per-eye demos by clueless promoters and tech enthusiasts.

At least it paved the way for high-refresh monitors, which are now mainstream but were frowned upon back in the day. Sometimes society catches up.


30Hz per eye? I know Nvidia had compatibility with a few 60Hz interlaced TVs (60Hz per eye, but at half resolution), but I've never seen a 30Hz-per-eye setup.


Input was limited to 60Hz, so 30 unique frames per second per eye.

You could do 60 per eye on 120Hz monitors, just not on TVs. In any case, even 60Hz per eye is a horrible flicker fest to me. I had enough of that back in the CRT days.


Linus "reviewed" some 3D TVs 3 months ago and asked the same question.

https://www.youtube.com/watch?v=Dbjb2spwQVg


In December 2007 I bought a 50" Pioneer Elite Kuro 1080p plasma, at the time the best TV available.

I paid $5,000 ($7,100 in 2022).

It weighs 95 lbs. (with stand) but who cares since it's been sitting in the same place for 15 years.

The picture in early 2008 (after professional calibration after 4 months of use) was mind-blowingly great — and remains so 15 years later, so much so that when I take a trip every couple years to a high-end TV store like Crutchfield to see how the various bleeding-edge TVs look, they're STILL not nearly as good.

I've been ready to "upgrade" to a 4K OLED for years now but the Pioneer refuses to die or degrade.

The only thing wrong with it is that for the past 3-4 years only 1 of the 3 HDMI inputs still works so I have to manually switch it between my cable box and Apple TV.


Manufacturing costs and margins on LED-LCDs were far better for manufacturers, and that is likely what killed plasma.

Exploiting this aggressively helped Samsung win market share and become a top TV manufacturer, causing Panasonic and Pioneer to exit the business and nearly killing off Sony. Their only real competition is on the low end, from Vizio, HiSense, TCL, etc.

OLED still can't match these margins, which is why only LG really makes them in large quantities. Others, including Samsung, are dipping into OLED to capture the videophile market, but they aren't selling the sheer volume or making huge profits on them.


> Manufacturing costs and margins on LED-LCDs were far better for manufacturers and likely what killed Plasma.

Yes, LEDs are hugely cheaper to manufacture, allowing them to be priced at a point more suitable for mass markets, where people care more about price than quality (of LED compared to plasma).


Yeah, the big downside is the explosion in e-waste from all the LCDs that went obsolete within a few years, with no real resale market since everyone is one click away from a brand-new one on Amazon.

I'm still running a 13-year-old LCD TV in a guest bedroom. Just the thought of having to either sell it or see it end up in a landfill upsets me.


It's a shame that one aspect of displaying movies (24fps!) has gotten worse over time.

Motion. I say motion is quite important.

CRT was better than Plasma even though Plasma was quite good.

LCD is basically PC monitor technology, and it uses sample-and-hold, totally unsuitable for movie motion.

OLED, while great in other ways, still sucks at 24p motion; it's easily the worst thing about the technology.

What technology will have all the modern good stuff and have CRT/Plasma like motion?


LG OLEDs with dejudder and other smoothing options (some call it the soap-opera effect) are very nice in my opinion. It really makes the most of the 120Hz panel. Some artifacts show up when watching racing on particular tracks, but other than that it works perfectly.


To be fair, 24fps shouldn't exist. If you want low-framerate content, it should be 30fps.


Almost every movie and TV series ever made in the US is 24p. As for future content, I see no one seriously trying to change this.


All of the comments about the heat generated by these old TVs have me contemplating repurposing them as electric heaters. They could even loop some Christmas-time fireplace video.

But seriously, would these be any less efficient than an electric baseboard heater?

If the repurposed plasma screen heater is consuming 400W, is it releasing 400W of heat?


400W is 400W - so yes, any electric appliance that generates heat is exactly as efficient as a resistive electric heater. Whatever tiny amounts of power leak out as light or sound or electromagnetic waves are minuscule compared to the heat.

However, we have better ways of heating. An AC unit running in heat mode, or an air-source heat pump, will move roughly 4x as much heat into your room as it consumes in electricity. So although running a plasma is equivalent to running a resistive baseboard heater, if you want the best bang for your kWh you should get a mini-split air conditioner or an air-source heat pump.
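
A back-of-the-envelope sketch of that difference, assuming the COP of ~4 from the paragraph above (real COPs vary with outdoor temperature):

    HEAT_NEEDED_W = 400   # heat we want delivered into the room
    HEAT_PUMP_COP = 4.0   # assumed: units of heat moved per unit of electricity

    resistive_draw_w = HEAT_NEEDED_W / 1.0            # plasma/baseboard: COP of 1
    heat_pump_draw_w = HEAT_NEEDED_W / HEAT_PUMP_COP  # same heat, 1/4 the electricity

    print(resistive_draw_w)  # 400.0 W of electricity for 400 W of heat
    print(heat_pump_draw_w)  # 100.0 W of electricity for the same heat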


> But seriously, would these be any less efficient than an electric baseboard heater?

No. Of course, those aren't very efficient.


A used plasma TV is still the best value gaming display in terms of motion clarity you can get. https://www.youtube.com/watch?v=KZblNi5i8Lc

An LG C1 with HLG or other service-menu tricks is the only thing to surpass it: https://www.youtube.com/watch?v=19EI9UjRwRs but they cost way more than an old plasma on the used marketplace.


I recently bought a used Samsung 50” plasma for like 60 or 80 bucks for the basement. Picture is excellent. Dude had never calibrated it though. Can’t believe he watched it for 11-12 years or whatever the way it was.

I also more recently got a C1 for the living room to replace a cheapy Hisense that started wigging out, and sometimes I think to myself that I should have saved xxxx dollars and just gotten another used plazzy off FB marketplace…


Had a friend who owned an expensive plasma (I think it was a Panasonic). Seeing him refuse to watch TV channels with logos in the corner on it, and having to run it with static on the screen for an hour after playing video games because the UI elements burnt in so fast, made me think this tech was pretty useless as a general-purpose screen unless you don't mind image burn-in.


Plasmas still have the best motion representation on screen. LCD software tries hard, but it's still odd and noticeable.


I've got a Samsung plasma TV that I'm still pretty happy with. There's a line running down the left-hand side of the image that is supposedly dust, and it can be fixed with some effort, but I'm lazy.

I really like the screen quality we get with it. Haven’t had much reason to replace it in the 10 or so years we’ve owned it.


As someone who had a Pioneer Kuro for a long time (no more), they did have the reputation of having blacker blacks.


And actual yellows, especially obvious when the scene includes fire.


I still use my 2007 720p 42" Panasonic plasma in my office and my 2012 1080p 3D 50" Panasonic plasma in my living room. I'll probably upgrade this year to a 2022 4K 65" LG OLED, but I still really love both of them.


I have a 46-inch Panasonic plasma, which is still my favourite TV. The colors are great, and it changes channels in an instant. Nothing smart on it. It's pretty big and heavy, with huge bezels by today's standards, though.


Nothing happened, I’m still using one :)

My main TV is a 42" plasma from Panasonic that I bought cheap when plasmas were being discontinued many years ago, and it's still going strong. Despite only being 720p, it looks fantastic, much better than a lot of other TVs I see that are 4K.

I hate how complex TVs have become. I love this one because it's simple, looks great, and makes it easy to watch content. For years I just copied what I wanted to a USB stick and watched it, but now I've added a Chromecast with Google TV, and Plex.


Still love my 2013 Panasonic P55 plasma and won’t upgrade until it dies.


My Panasonic plasma, bought in 2006, is still going strong at 1080i.


Got one in 2012; the panel runs too hot and the TV set is too heavy.


TLDR: they weren't great to begin with and the technology wasn't worth investing in.


TLRTF (too long, read too fast): they were the greatest thing at the time (especially for cinephiles). Superior contrast, better viewing angles. Disclosure: Former Pioneer plasma TV owner


Yeah, I bought a Panasonic plasma TV in 2006 when I got married, and at the time LCD was in its infancy: very jerky and barely watchable. The only advantages of LCD at the time were that plasma was not available in sizes below 38" and was slightly more expensive. Furniture has changed since then too; with HD, it's unlikely that you'll buy anything below 45" these days.

I still have it, in fact. I don't like how power hungry it is, but it doesn't really matter for the very little time that I watch TV (mostly during the day on weekends, when my electricity is self-produced solar anyway). That said, it's probably time to get a new OLED TV, and I'm just waiting for the day I need new furniture.


And 5 minutes later OLED came out, viewing angles improved, a brightness of 100 nits wasn't cutting it, and they weren't worth it anymore. They had a moment. Edit: my point is that there is no mystery in the article. What happened to them? Nothing. The technology was only good in that context, like many others, but it wasn't any holy grail once LCDs improved a bit.


It was more than a moment. For roughly a decade, your options were Plasma, LCD, or rear projection (or briefly hi-def CRT which was pretty wild).

There was not much overlap of consumer-priced plasma and OLED panels.


For anybody who truly cared about picture quality above all else, plasma ruled for many years. This is easily validated by going back to reviews and AV forum discussions of Pioneer and Panasonic plasmas from around 2008 to 2012. One big thing that I think gets forgotten is that in that period there was still a huge amount of SD content being produced and viewed, and in the era before even vaguely decent post-processing, plasma made much of that material watchable and enjoyable at the larger screen sizes. LCDs would generally produce an awful mess with SD content, given the early/janky picture processing they employed, and especially with 24fps content; many plasmas had native 24fps modes specifically designed for handling lower-frame-rate content.

Lots of those Pioneer/Panasonic plasmas had "hidden" service menus that allowed people to tweak/calibrate all sorts of picture variables to their heart's content, which, rightly or wrongly, probably also helped earn them a certain love within cinephile-type groups.


OLEDs at TV sizes came years after plasma; you could call it a natural upgrade.

If you had one doing only 100 nits, you got a defective unit.


FTA:

Even the best plasma TVs could only reach just over 100 nits of peak brightness in a 10% window test.


It wasn't that bad (I had one).

But the article also compares them to today's LCDs with 1000 nits. LCDs at the time weren't so bright either; they were about as bright as plasmas, but unlike plasmas, they were also gray goo.


The article is shit, but all I could find is that Rtings measured 80 nits on some plasma TV, so it wouldn't surprise me. But this is fine. OLED is low brightness too, and CRTs can be adjusted down. This is one of the advantages of not using an LCD. LCDs are too bright; the sole use for their brightness is being usable in afternoon sunlight, but why not just close the curtains at that point instead of getting fried? Brightness is literally what the guy at the electronics store uses to try to sell you this monitor vs. that one.


Plasma panels age. The last models were rated for a 60,000-hour lifespan (the time to lose half of their brightness).

So if they measured a 10-15 year old panel that had spent most of its life turned on, there's your answer.


OLED is "low brightness" at 200-300 nits. 100 means that if there's any ambient light at all, the contrast ratio is ruined.


I don't think it means that. I have a CRT I keep at around 100 nits, and it still has an order of magnitude better color quality with a lamp behind me, off to the side.


The interesting thing is that plasmas died before OLED was really born; you couldn't get a sensible OLED at the time that the last plasma was discontinued.


More like 5 years later.


Plasma TVs were popular for a while, but eventually lost favor with manufacturers due to various issues.



