Posted by giuliomagnifico 12/22/2025
He also made a second video (not linked) which shows off more of the actual hardware.
1. Touching the circuit board on the back of the CRT tube by mistake while trying to troubleshoot image issues; “fortunately” it was a “low” voltage since it was a B&W monitor…
2. Throwing a big, big stone at a CRT TV abandoned next to the trashcan while it was placed normally, because it hadn't broken when I threw the stone at it facing up. The next thing I remember, after opening my eyes (which I had closed at the bang), was my friends further down the road looking at me as if I were a ghost, since big, big chunks of CRT glass had flown right past me.
CRTs were dangerous in many aspects!
EDIT: I meant to reply to the other thread with the dangers of CRTs
That was nothing compared to the time the CAT scan machine fell face down off the lift gate on the back of the delivery truck because our driver pushed the wrong button and tipped it instead of lowering it. I missed the flak from that because I was on a move somewhere else, thankfully. Afterwards he was forever known as the quarter-million-dollar man.
My father ran his own TV repair shop for many years. When I was a teen he helped me make a Tesla coil out of a simple oscillator and the flyback transformer from a scrapped TV. It would make a spark 2 or 3 inches long and could illuminate a fluorescent light from several feet away. It definitely produced higher voltage than normally exists in a TV, but not orders of magnitude more. The high voltage circuits in CRTs are dangerous as hell.
In my parents' place, an apartment on the second floor, there's an upright Yamaha that my dad bought in the late 70s or early 80s. I think they brought it up through the stairs, but like 10 years ago an elevator was added to the building lobby, and I don't think there is enough space to move it around. I think the piano will remain with the apartment forever :D
I happened to have noticed that they were trying to clear out any remaining floor models of CRTs. One of them was an absolutely giant Samsung, memory says it was >34", but I'm not sure how big...with a sticker on it for, and I'll never forget this...$.72.
Soooo two big TVs for the price of one!
Long story short, we were moving out of that house, CRT TVs were long since obsolete, and that TV hadn't even been turned on for at least 5 years. So we decided to throw it away. I had never picked it up before and had forgotten how heavy CRTs could be. I ended up having to get two friends to come help me move it to the curb; it was well over 250 lbs. The trash company also complained when they had to pick it up and had to make a return trip.
I kinda regret getting rid of it, but it was among the heaviest pieces of furniture in our house.
Somewhere, a retro game enthusiast winced at reading that.
CRTs are literally precision particle accelerators in glass vacuum tubes that we mass-produced and put in people's homes just to electronically reproduce entertaining images, and there is not a factory left on earth that can make them anymore.
It is some of the wildest shit humans have ever done.
https://www.thomaselectronics.com/
Edit: Reddit claims that Thomas aren't really a CRT manufacturer.
https://www.reddit.com/r/crtgaming/comments/1c9t5sf/company_...
I was VERY smart and of course unplugged the TV before doing anything.
My flat-head screwdriver brushed against the wrong terminal in the back, I was literally thrown several feet across the room, and the screwdriver was no longer usable as the tip had deformed and slightly melted.
I later found an electronics book that had a footnote mentioning grounding out the tube before going near it…
I know a shock can paralyze (by contracting the muscles) and it can burn (by the Joule effect), but I've never seen one push.
DC current jolts you “across the room” by contracting your muscles all at once. Of course the exact effect depends on your posture; sometimes it just makes you stand upright or pull your arms in. This tends to disconnect you from the source of the electricity, limiting the damage. Note that if you cannot actually jump all the way across the room then the jolt probably can’t knock you all the way across the room either. If you fall over, your head could end up pretty far away from where it started, though, and if you lose consciousness even for a little while then that can affect your perception too. It could certainly throw the screwdriver all the way across the room.
If you pay attention to the special effects that show up in movies and television you’ll soon realize that they simulate shocks by putting the actor in a harness and then pulling on it suddenly. This sudden movement away from the source of the “shock” stops looking very convincing when you notice that the movement starts at their torso rather than in their legs and arms.
I remember putting some keys into an electrical socket when I was quite young. My hand must have bridged live and neutral, so the current only flowed from thumb to forefinger rather than through my chest to my feet. But it was accompanied by a flash of light and an arc that I saw as a forked tongue. I told my mom that it had bitten me :)
Also, what a horrifying way to die
Exact same thing happened to me as a child. I do not remember the event, but I do remember waking up on the other side of the room.
I also learned electronics by shocking myself often
The survival selection is real in electronics.
Regardless, there are multiple ways old CRTs can cause great harm.
The elevators often didn’t work and climbing 10 flights of stairs while carrying a 70 lb (31kg) cube was brutal. It’s not often you buy a piece of electronics and get a complimentary workout regimen thrown in.
I don't feel nostalgic about them in the least.
SNES/N64 games might look a little better on them, but I'd rather give that up than deal with the downsides. I can also look at modern screens longer and more comfortably.
On the other hand, my current desktop PC with a huge GPU and CPU cooler is not particularly carry-friendly either…
It’s similar to how subpixel antialiasing really depends on the screen design and what order the colors are in.
The pixelated 8-bit aesthetic is more reminiscent of early emulators on LCDs than of how it actually looked “on hardware”.
https://www.mediacollege.com/equipment/sony/tv/kd/kd30xs955....
148 pounds! A total nightmare to get into our car and into our house.
WORTH IT.
https://crtdatabase.com/crts/sony/sony-kw-34hd1
Even at 34", the thing weighed 200lbs (plus the stand it came with). I lived in a 3rd floor walk up. I found out who my true friends were the day we brought it back from the store. I left that thing in the apartment when I moved. I bet it is still there to this day.
> The "pixels" didn't become larger on lower resolutions…
Strictly speaking, the CRT only had discrete lines, not pixels. Within a line the color and brightness could change as rapidly or slowly as the signal source desired. It was in fact an analog signal rather than a digital one. This is why pixels in many display modes used by CRTs were rectangular rather than square.
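A quick back-of-the-envelope sketch of that last point (hypothetical Python, assuming the common case of a 320x200 mode stretched to fill a 4:3 screen):

    # Why pixels in a common CRT mode weren't square.
    # Assumes a 320x200 mode (e.g. VGA mode 13h) filling a 4:3 display.
    display_aspect = 4 / 3
    h_pixels, v_lines = 320, 200

    # Width:height of one "pixel" if the mode fills the whole screen
    pixel_aspect = display_aspect / (h_pixels / v_lines)
    print(f"pixel aspect ratio ~ {pixel_aspect:.2f}")  # ~0.83, taller than wide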
> We can get much better results today with scaling than we ever could on CRTs…
I say it’s the other way around! No ordinary flat-panel display can emulate the rectangular pixels of the most common video modes used on CRTs because they are built with square pixels. You would have to have a display built with just the right size and shape of pixel to do that, and then it wouldn’t be any good for displaying modern video formats.
> Strictly speaking, the CRT only had discrete lines not pixels.
The electron gun moves in an analog fashion, but when it hits the glass surface, the beam can only pass through specific openings [1]. These openings are placed a specific distance apart [2], and that distance sets the maximum effective horizontal resolution of the CRT (rough numbers are sketched after the links below).
> No ordinary flat-panel display can emulate the rectangular pixels of the most common video modes used on CRTs because they are built with square pixels.
Today's panels have achieved "retina" resolution, which means that the human eye cannot distinguish individual pixels anymore. The rest is just software [3].
[1] https://www.youtube.com/watch?v=13bpgc8ZxTo
[2] https://en.wikipedia.org/wiki/Dot_pitch#/media/File:CRT_mask...
[3] https://www.reddit.com/r/emulation/comments/dixnso/retroarch...
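For a rough sense of scale on the dot-pitch point above, a small sketch with assumed numbers (illustrative, not the specs of any particular tube):

    # Upper bound on useful horizontal resolution implied by the dot pitch.
    # Both numbers below are assumptions for illustration only.
    visible_width_mm = 320    # visible width of a typical ~17" 4:3 tube
    dot_pitch_mm = 0.25       # spacing between same-color phosphor dots/stripes

    max_triads_across = visible_width_mm / dot_pitch_mm
    print(f"~{max_triads_across:.0f} phosphor triads across")  # ~1280
    # Feeding the tube finer horizontal detail than this just smears together.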
CRT TVs only supported vertical refresh rates of 50Hz or 60Hz, matching the regional mains frequency. They used interlacing and technically only showed half the frame at a time, but thanks to phosphor decay this added a feeling of fluidity to the image. If you were able to see it strobe, you must have had impressive eyesight. And even if they had supported higher refresh rates, it wouldn't have mattered, as the source of the signal would only ever be 50/60Hz.
CRT monitors used with PCs, on the other hand, supported a variety of refresh rates. Only monitors for specific applications used interlacing; consumer-grade ones didn't, which means you could see a strobing effect here if you ran one at a low frequency. But even the most analog monitors from the 80s supported at least 640x480 at 60Hz, and some programs such as the original DOOM were even able to squeeze 70Hz out of them by running at a different resolution while matching the horizontal scan rate.
Some demos could throw pixels into VRAM that fast, and it was wild looking. Like the 60Hz soap-opera effect but even more so.
I still feel that way looking at >30fps content since I really don't consume much of it.
400p at 70 Hz was the default resolution of the VGA, pretty much all the classic mode 13h games ran at 70 Hz.
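That 70 Hz falls straight out of the standard VGA timings; a small sketch (using the standard 31.47 kHz horizontal scan rate and 449 total scanlines for the 400-line modes):

    # Why mode 13h (320x200, line-doubled to 400 lines) refreshes at ~70 Hz.
    h_scan_rate_hz = 31_468.75   # standard VGA horizontal scan rate
    total_lines = 449            # 400 visible + blanking/retrace lines

    print(f"{h_scan_rate_hz / total_lines:.2f} Hz")  # ~70.09 Hz
    # The 480-line modes use 525 total lines: 31468.75 / 525 ~ 59.94 Hz.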
I've never really experienced it because I've always watched PAL which doesn't have that.
But I would have thought it would be perceived as flashing at 60 Hz with a darker image?
I saw interlaced NTSC video in the digital days where the combing was much more obvious and always assumed it was only an NTSC thing!
The only time the electron gun was not involved in producing visible light was during overscan, horizontal retrace, and the vertical blanking interval. They spent the entire rest of their time (the vast majority of it) busily drawing rasterized images onto phosphors (with their own persistence!) for display.
This resulted in a behavior that was ridiculously dissimilar to a 30Hz strobe light.
If you wanted more vertical resolution then you needed either a monitor with a higher horizontal refresh rate or you needed to reduce the effective vertical refresh rate. The former involved more expensive monitors, the latter was typically implemented by still having the CRT refresh at 60Hz but drawing alternate lines each refresh. This meant that the effective refresh rate was 30Hz, which is what you're alluding to.
But the reason you're being downvoted is that at no point was the CRT running with a low refresh rate, and best practice was to use a mode that your monitor could display without interlace anyway. Even in the 80s, using interlace was rare.
Interlacing is a trick that lets you sacrifice refresh rates to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows.
With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time.
And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
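A toy sketch to make the "hit every other time" point concrete (made-up 10-line screen, purely illustrative):

    # Toy model of interlacing: which scanlines each field refreshes.
    # Real NTSC has 525 lines split into two 262.5-line fields; this is scaled down.
    lines = 10     # pretend screen height in scanlines
    fields = 4     # four fields = two full frames

    for field in range(fields):
        drawn = [line for line in range(lines) if line % 2 == field % 2]
        print(f"field {field}: lines {drawn}")

    # Fields arrive at the full 60/s rate, but any single scanline shows up in
    # only half of them, so each line is refreshed 30 times per second.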
Slow-decay phosphors were much more common on old "green/amber screen" terminals and monochrome computer displays like those built into the Commodore PET and certain makes of TRS-80. In fact there's a demo/cyberpunk short story that uses the decay of the PET display's phosphor to display images with shading the PET was nominally not capable of (due to being 1-bit monochrome character-cell pseudographics): https://m.youtube.com/watch?v=n87d7j0hfOE
But looking at a table of phosphors ( https://en.wikipedia.org/wiki/Phosphor ), it looks like decay time and color are properties of individual phosphorescent materials, so if you want to build an RGB color CRT screen, that limits your choices a lot.
Also, TIL that one of the barriers to creating color TV was finding a red phosphor.
The RGB stripes or dots are just stripes or dots; they're not tied to pixels. There are three guns, physically offset from each other, coupled with a strategically designed mesh plate, such that the electrons from each gun sort of moiré into hitting only the right stripes or dots. Apparently fractions of an inch of offset were all it took.
The three guns, really more like fast-acting lightbulbs, each received the brightness signal for its respective R, G, or B channel. Incidentally, that means they could go from zero brightness to max a couple of times over 60 [Hz] × 640 [px] × 480 [px] per second or so.
Interlacing means the guns draw every other line, but not necessarily discrete pixels, if nothing else because CRTs have finite beam spot sizes.
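Putting rough numbers on that "60 Hz × 640 px × 480 px" figure (illustrative only; real timings add blanking overhead on top):

    # Ballpark of how fast a gun's brightness signal has to be able to swing.
    refresh_hz = 60
    h_px, v_px = 640, 480

    naive_pixel_rate = refresh_hz * h_px * v_px   # ignores blanking intervals
    print(f"~{naive_pixel_rate / 1e6:.1f} MHz")   # ~18.4 MHz
    # The actual VGA pixel clock for 640x480@60 is ~25.175 MHz once horizontal
    # and vertical blanking are included.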
This is a valid assumption for 25 Hz double-height TV or film content. It's generally noisy and grainy, typically with no features that occupy less than 1/~270 of the picture vertically for long enough to be noticeable. Combined with persistence of vision, the whole thing just about hangs together.
This sucks for 50 Hz computer output. (For example, Acorn Electron or BBC Micro.) It's perfect every time, and largely the same every time, and so the interlace just introduces a repeated 25 Hz 0.5 scanline jitter. Best turned off, if the hardware can do that. (Even if it didn't annoy you, you'll not be more annoyed if it's eliminated.)
This also sucks for 25 Hz double-height computer output. (For example, Amiga 640x512 row mode.) It's perfect every time, and largely the same every time, and so if there are any features that occupy less than 1/~270 of the picture vertically, those fucking things will stick around repeatedly, and produce an annoying 25 Hz flicker, and it'll be extra annoying because the computer output is perfect and sharp. (And if there are no such features - then this is the 50 Hz case, and you're better off without the interlace.)
I decided to stick to the 50 Hz case, as I know the scanline counts - but my recollection is that going past 50 Hz still sucks. I had a PC years ago that would do 85 Hz interlaced. Still terrible.
Even interlaced displays were still running at 60Hz, just with a half-line offset to fill in the gaps with image.
That being said they were horrible on the eyes, and I think I only got comfortable when 100Hz+ CRT screens started being common. It is just that the threshold for comfort is higher than I remember it, which explains why I didn't feel any better in front of a CRT TV.
In 1986 I got an Atari ST with a black-and-white screen. Glorious 640x400 pixels across 11 or 12 inches. At 72Hz. Crystal clear.
Those were still sought after well into the LCD era for their high resolution and incredible motion clarity, but I think LCDs getting "good enough" and the arrival of OLED monitors with near-zero response times has finally put them out to pasture as anything but a collectors item.
Now I have an FW900 that's been sitting in a closet for decades because I can't lift it anymore
I'll also never forget: I was taking a walk in the woods years ago and, in the middle of nowhere, no houses or apartments for miles, there was an FW900 just sitting there, like someone must have thrown it out of an airplane, but of course that's impossible as it was intact. Inexplicable, WTF. (When I got home I made sure mine was still in the closet and hadn't somehow teleported itself.)
At the time there were a lot of private import items in Kuwait - particularly cars - so it's not impossible it was this particular model. I mean, what other TV could boast being the height of a four year old?
When I was a kid I lived down in Southeastern Kentucky (Somerset) which gets a lot of its power from the local lake via hydro. My grandfather had this large (not this big but big) tube TV, the old wooden case kind. When you turned it on it'd take about ten seconds in which you could hear tube heaters tinkling, followed by a "grrrnnnnnzzzzz" sound as the tube came to life. I remember my uncle joking that the lake level started visibly falling.
Between LCDs/etc. and LED lighting, the amount of efficiency improvement we've made in home electronics is wild. I can now put my hand right on a bulb with the light output equivalent of a 100W incandescent and it's just... warm.
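Rough numbers behind that (typical catalogue figures, not measurements of any particular bulb):

    # Ballpark efficiency: "100 W equivalent" LED vs. a 100 W incandescent.
    lumens = 1600              # roughly what a 100 W incandescent emits
    incandescent_watts = 100
    led_watts = 15             # a common draw for a "100 W equivalent" LED

    print(f"incandescent: {lumens / incandescent_watts:.0f} lm/W")  # ~16 lm/W
    print(f"LED: {lumens / led_watts:.0f} lm/W")                    # ~107 lm/W
    # Almost all of the ~85 W difference is heat the incandescent had to shed.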
If they were going all the long way around to the Atlantic that would indeed explain the markup. Not sure why they would though.
> And news articles in 1990 said Sony dealers would not allow any bickering. [...] no discounts.
... that's probably "dickering", and an amusing typo. ("Hey, you can't squabble here!")