He also made a second video (not linked) which shows off more of the actual hardware.
1. Touching the circuit board on the back of the CRT tube by mistake while trying to troubleshoot image issues. “Fortunately” it was a “low” voltage, as it was a B&W monitor…
2. Throwing a big stone at a CRT TV that had been abandoned next to the trashcan. It hadn’t broken when I threw the stone with it facing up, so I had set it upright. The next thing I remember, after opening my eyes (I had closed them at the bang), was my friends further down the road looking at me as if I were a ghost, since big chunks of CRT glass had flown right past me.
CRTs were dangerous in many aspects!
EDIT: I meant to reply to the other thread with the dangers of CRTs
That was nothing compared to the time the CAT scan machine fell face down off the lift gate on the back of the delivery truck, because our driver pushed the wrong button and tipped it instead of lowering it. I missed the flak from that because I was on a move somewhere, thankfully. Afterwards he was forever known as the quarter-million-dollar man.
My father ran his own TV repair shop for many years. When I was a teen he helped me make a Tesla coil out of a simple oscillator and the flyback transformer from a scrapped TV. It would make a spark 2 or 3 inches long and could illuminate a fluorescent light from several feet away. It definitely produced higher voltage than normally exists in a TV, but not orders of magnitude more. The high voltage circuits in CRTs are dangerous as hell.
I happened to have noticed that they were trying to clear out any remaining floor models of CRTs. One of them was an absolutely giant Samsung, memory says it was >34", but I'm not sure how big... with a sticker on it for, and I'll never forget this... $0.72.
Soooo two big TVs for the price of one!
Long story short, we were moving out of that house, CRT TVs were long since obsolete, and that TV hadn't even been turned on for at least 5 years. So we decided to throw it away. I had never picked it up before and had forgotten how heavy CRTs could be. I ended up having to get two friends to come help me move it to the curb; it was well over 250 lbs. The trash company also complained when they had to pick it up and had to make a return trip.
I kinda regret getting rid of it, but it was among the heaviest pieces of furniture in our house.
Somewhere, a retro game enthusiast winced at reading that.
I was VERY smart and of course unplugged the TV before doing anything.
My flathead screwdriver brushed against the wrong terminal in the back, I was literally thrown several feet across the room, and my flathead screwdriver was no longer usable, as the tip had deformed and slightly melted.
I later found an electronics book that had a footnote mentioning grounding out the tube before going near it…
I know a shock can paralyze (by contracting the muscles) and it can burn (by the Joule effect), but I've never seen one push.
A DC shock jolts you “across the room” by contracting your muscles all at once. Of course the exact effect depends on your posture; sometimes it just makes you stand upright or pull your arms in. This tends to disconnect you from the source of the electricity, limiting the damage. Note that if you cannot actually jump all the way across the room, then the jolt probably can’t knock you all the way across the room either. If you fall over, your head could end up pretty far away from where it started, though, and if you lose consciousness even for a little while then that can affect your perception too. It could certainly throw the screwdriver all the way across the room.
If you pay attention to the special effects that show up in movies and television you’ll soon realize that they simulate shocks by putting the actor in a harness and then pulling on it suddenly. This sudden movement away from the source of the “shock” stops looking very convincing when you notice that the movement starts at their torso rather than in their legs and arms.
Exact same thing happened to me as a child. I do not remember the event, but I do remember waking up on the other side of the room.
I also learned electronics by shocking myself often
The survival selection is real in electronics.
Regardless, there are multiple ways old CRTs can cause great harm.
The elevators often didn’t work and climbing 10 flights of stairs while carrying a 70 lb (31kg) cube was brutal. It’s not often you buy a piece of electronics and get a complimentary workout regimen thrown in.
I don't feel nostalgic in the least about them.
SNES/N64 games might look a little better on them, but I'll take modern screens over the downsides. I can also look at modern screens longer and more comfortably.
On the other hand, my current desktop PC with a huge GPU and CPU cooler is not particularly carry-friendly either...
It’s similar to how subpixel antialiasing really depends on the screen design and what order the colors are in.
The pixelated 8-bit aesthetic is more reminiscent of early emulators on LCDs than of how it actually looked “on hardware”.
https://www.mediacollege.com/equipment/sony/tv/kd/kd30xs955....
148 pounds! A total nightmare to get into our car and into our house.
WORTH IT.
https://crtdatabase.com/crts/sony/sony-kw-34hd1
Even at 34", the thing weighed 200lbs (plus the stand it came with). I lived in a 3rd floor walk up. I found out who my true friends were the day we brought it back from the store. I left that thing in the apartment when I moved. I bet it is still there to this day.
CRT TVs only supported vertical refresh rates of 50Hz or 60Hz, which matched the regional mains frequency. They used interlacing and technically only showed half the frame at a time, but thanks to phosphor decay this added a feeling of fluidity to the image. If you were able to see it strobe, you must have had impressive eyesight. And even if they had supported higher refresh rates, it wouldn't have mattered, as the source of the signal would only ever be 50/60Hz.
CRT monitors used in PCs, on the other hand, supported a variety of refresh rates. Only monitors for specific applications used interlacing; consumer-grade ones didn't, which means you could see a strobing effect here if you ran one at a low frequency. But even the most analog monitors from the 80s supported at least 640x480 at 60Hz, and some programs, such as the original DOOM, were even able to squeeze 70Hz out of them by running at a different resolution while matching the horizontal refresh rate.
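Here's a quick back-of-the-envelope sketch in Python of how that trick works: vertical refresh is just the horizontal scan rate divided by the total line count, so a mode with fewer total lines refreshes faster on the same monitor. The timing figures below are the standard VGA numbers as I recall them, so treat this as illustrative rather than authoritative:

    # Vertical refresh = horizontal scan rate / total scanlines per frame
    # (visible lines plus vertical blanking lines).
    H_SCAN_RATE_HZ = 31_469  # standard VGA horizontal rate, ~31.47 kHz

    def vertical_refresh_hz(total_lines: int) -> float:
        """Vertical refresh rate for a mode with this many total scanlines."""
        return H_SCAN_RATE_HZ / total_lines

    # 640x480: 480 visible + 45 blanking lines = 525 total -> ~60 Hz
    print(f"640x480: {vertical_refresh_hz(525):.2f} Hz")

    # DOOM-style 320x200 (line-doubled to 400 lines) + 49 blanking = 449 total -> ~70 Hz
    print(f"320x200: {vertical_refresh_hz(449):.2f} Hz")

Same horizontal rate in both cases, which is why the monitor doesn't care; only the number of lines per frame changes.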
Some demos could throw pixels into VRAM that fast, and it was wild looking. Like the 60Hz soap-opera effect but even more so.
I still feel that way looking at >30fps content since I really don't consume much of it.
I've never really experienced it because I've always watched PAL which doesn't have that.
But I would have thought it would be perceived as flashing at 60 Hz with a darker image?
The only time the electron gun was not involved in producing visible light was during overscan, horizontal retrace, and the vertical blanking interval. They spent the entire rest of their time (the very vast majority of their time) busily drawing rasterized images onto phosphors (with their own persistence!) for display.
This resulted in a behavior that was ridiculously dissimilar to a 30Hz strobe light.
If you wanted more vertical resolution, then you needed either a monitor with a higher horizontal refresh rate or you needed to reduce the effective vertical refresh rate. The former meant more expensive monitors; the latter was typically implemented by still having the CRT refresh at 60Hz but drawing alternate lines on each refresh. This meant that the effective refresh rate was 30Hz, which is what you're alluding to.
But the reason you're being downvoted is that at no point was the CRT running with a low refresh rate, and best practice was to use a mode that your monitor could display without interlace anyway. Even in the 80s, using interlace was rare.
Interlacing is a trick that lets you sacrifice refresh rates to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows.
With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time.
And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
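A toy sketch in Python of the scanning order being described (purely illustrative; the line count is a made-up small number):

    # NTSC-style interlacing: 60 fields per second, each field drawing
    # only the even or only the odd scanlines of the frame.
    LINES = 10          # toy frame height
    FIELD_RATE_HZ = 60  # NTSC field rate

    def lines_in_field(field: int) -> list[int]:
        # Even fields draw lines 0, 2, 4, ...; odd fields draw 1, 3, 5, ...
        return list(range(field % 2, LINES, 2))

    for field in range(4):
        print(f"field {field}: draws lines {lines_in_field(field)}")

    # Any given line appears in every second field, so it is refreshed
    # at half the field rate: 30 Hz.
    print(f"per-line refresh: {FIELD_RATE_HZ / 2:.0f} Hz")

The beam sweeps just as often either way; interlacing only changes which lines each sweep lands on.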
The RGB stripes or dots are just stripes or dots; they're not tied to pixels. There were RGB guns physically offset from each other, coupled with strategically designed mesh plates, in such a way that the electrons from each gun sort of moiré into hitting only the right stripes or dots. Apparently fractions of an inch of offset were all it took.
The three guns, really more like fast-acting lightbulbs, received brightness signals for their respective RGB channels. Incidentally, that means they could swing between zero and max brightness at a rate on the order of 60 Hz × 640 px × 480 px or so.
Interlacing means the guns draw every other line, but not necessarily every other pixel, because CRTs have nonzero beam spot sizes, at the very least.
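To put a number on that, a trivial sketch (nominal figures, blanking intervals ignored):

    # Order-of-magnitude signal bandwidth for 640x480 at 60 Hz: the gun's
    # brightness input has to be able to swing roughly once per pixel, so
    # the pixel rate sets the required analog bandwidth.
    refresh_hz, width_px, height_px = 60, 640, 480
    pixel_rate_hz = refresh_hz * width_px * height_px
    print(f"pixel rate: ~{pixel_rate_hz / 1e6:.1f} MHz")  # ~18.4 MHz

That works out to roughly 18 MHz of analog bandwidth just for a VGA-class picture.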
This is a valid assumption for 25 Hz double-height TV or film content. It's generally noisy and grainy, typically with no features that occupy less than 1/~270 of the picture vertically for long enough to be noticeable. Combined with persistence of vision, the whole thing just about hangs together.
This sucks for 50 Hz computer output. (For example, Acorn Electron or BBC Micro.) It's perfect every time, and largely the same every time, and so the interlace just introduces a repeated 25 Hz 0.5 scanline jitter. Best turned off, if the hardware can do that. (Even if it didn't annoy you, you'll not be more annoyed if it's eliminated.)
This also sucks for 25 Hz double-height computer output. (For example, Amiga 640x512 row mode.) It's perfect every time, and largely the same every time, and so if there are any features that occupy less than 1/~270 of the picture vertically, those fucking things will stick around repeatedly, and produce an annoying 25 Hz flicker, and it'll be extra annoying because the computer output is perfect and sharp. (And if there are no such features - then this is the 50 Hz case, and you're better off without the interlace.)
I decided to stick to the 50 Hz case, as I know the scanline counts - but my recollection is that going past 50 Hz still sucks. I had a PC years ago that would do 85 Hz interlaced. Still terrible.
Even interlaced displays were still running at 60Hz, just with a half-line offset to fill in the gaps with image.
That being said, they were horrible on the eyes, and I think I only got comfortable when 100Hz+ CRT screens started being common. It's just that my threshold for comfort is higher than I remembered, which explains why I didn't feel any better in front of a CRT TV.
Slow-decay phosphors were much more common on old "green/amber screen" terminals and monochrome computer displays like those built into the Commodore PET and certain makes of TRS-80. In fact there's a demo/cyberpunk short story that uses the decay of the PET display's phosphor to display images with shading the PET was nominally not capable of (due to being 1-bit monochrome character-cell pseudographics): https://m.youtube.com/watch?v=n87d7j0hfOE
But looking at a table of phosphors ( https://en.wikipedia.org/wiki/Phosphor ), it looks like decay time and color are properties of individual phosphorescent materials, so if you want to build an RGB color CRT screen, that limits your choices a lot.
Also, TIL that one of the barriers to creating color TV was finding a red phosphor.
Those were still sought after well into the LCD era for their high resolution and incredible motion clarity, but I think LCDs getting "good enough" and the arrival of OLED monitors with near-zero response times have finally put them out to pasture as anything but a collector's item.
Now I have a FW900 that has been sitting in a closet for decades because I can't lift it anymore.
Also, I will never forget: I was taking a walk in the woods years ago, in the middle of nowhere, no houses or apartments for miles, and there was a FW900 just sitting there, like someone had thrown it out of an airplane. But of course that was impossible, as it was intact. Inexplicable. WTF. (When I got home I made sure mine was still in the closet and had not somehow teleported itself.)
They had it installed in their basement. However, later they remodeled the basement stairway to add a turn. With the new layout, it would be impossible to bring back up the TV the way it was brought down. There was no other way to access the basement (it only had storm cellar windows), so they left it there when they moved.
I think about the new owners sometimes and wonder what they ended up doing. Perhaps they disassembled it, or maybe it’s still down there collecting dust.
At the time there were a lot of private import items in Kuwait - particularly cars - so it's not impossible it was this particular model. I mean, what other TV could boast being the height of a four-year-old?
The biggest CRT ever made: Sony's PVM-4300: https://news.ycombinator.com/item?id=40754471
Overview of the KX45ED1 / PVM-4300 (Worlds Largest CRT) [video] https://news.ycombinator.com/item?id=42588259
Interestingly, that first link is to the same URL as today's, yet it's from June 22, 2024. The linked article, however, has today's date as the publish date. There's no indication that the article was updated from what was originally published.
At the time, a "big" CRT was a 32". I helped my dad transport the 35" which, from memory, was 150 or 180 lbs. It was likely the largest CRT commercially available (PVM-4300 stragglers aside).
A couple years later (1995-6?), a friend's family bought a 40" Mitsubishi, which I _thought_ was the largest CRT made. But, again, Sony aside, it probably was.
I helped friends move one of these old monsters out of an apartment in MIT's west campus 15 years ago. Don't remember the brand but it seemed even bigger than 35". It was shockingly huge and heavy and they lived on the top floor.
As we were doing this, I was thinking: how come the original owner didn't get a projection TV? They had been available since the 80s, the separate components were easier to manage, and the screens were far bigger.
In a bright room, the contrast was typically lacking.
Even relatively late versions like the Toshiba 57HX93 (a 57" 16x9 doghouse from ~20 years ago with an integrated scaler and a 1080i input), which I personally spent some time with both in Toshiba form and as $10k Runco-branded units, got washed out in a bright room compared to a direct-view CRT.
And viewing angle is an issue, too: Whether front- or rear-projection, one of the tricks to improve brightness (and therefore potential contrast) is to reduce the angle of light transmission from the screen. Depending on the room layout, this can mean that people in seats off to the side might get a substantially darker image than those near the middle. (This applies to all projectors; film, CRT, DLP, LCD, front, rear, whatever -- there can be a lot of non-obvious tech that goes into a projection screen.)
And CRT projectors were fickle. Their color convergence would change based on external magnetic fields (including that produced by the Earth itself), so they needed to be set up properly in-situ. A projection set that was set up properly while facing East would be a different thing when rotated 90 degrees to instead face North: What once was carefully-adjusted to produce 3 overlapping images that summed to be pure white lines would be a weird mix of red, blue, and green lines that only sometimes overlapped.
The CRT tubes themselves were generally quite impressively small for the size of the image that they'd ultimately produce. This meant pushing the phosphor coatings quite hard, which translated into an increased opportunity for permanent image retention ("screen burn") from things like CNN logos and video game scores.
Plus: They'd tend to get blurry over time. Because they were being pushed hard, the CRTs were liquid-cooled using glycol that was supposed to be optically-clear. But stuff would sometimes grow in there. It was never clear whether this was flora or [micro]fauna or something else, but whatever it was liked living in a world filled with hot, brightly-lit glycol. Service shops could correct this by changing the fluid, but that's an expense and inconvenience that direct-view CRTs didn't have.
And they were ungainly things in other ways. Sure, they tended to be lighter (less-massive) because they were full of air instead of leaded glass, but a rear-projection set was generally a big floor-standing thing that still had plenty of gravity. Meanwhile, a front-projection rig ~doubled the chance of someone walking by occluding the view and came with the burden of a hard-to-clean screen (less important these days, but it used to be common for folks to smoke indoors) and its own additional alignment variables (and lens selection, and dust issues, and, and).
So a person could deal with all that, or -- you know -- just get a regular direct-view CRT.
Even today, when projectors use friggin' laser beams for illumination and produce enormous, bright images with far fewer issues than I listed above, direct-view tech (like the flat LCD and *LED sets at any big-box store) is still much more popular.
(But I do feel your pain. When I was a teenager, my parents came home from shopping one wintry night with a 36" Sony WEGA for me to help unload. Holy hell.)
You're right about that. A friend's dad was a gearhead and had one of those. It always seemed dim, practically unwatchable during the day, and even at night it looked flat, which made darker films hard to watch.
But it was a mid-80s model and I figured 10 or 20 years later the tech had improved.
It was in the room with the furniture that we weren't allowed to sit on, and we weren't allowed to think about using that TV. (I mentioned once when we were unsupervised that maybe we could turn it on and watch something, and the color drained out of his face like doing anything like that would surely result in a very painful death. After he calmed down, we went outside and played with bugs or something instead.)
As far as I could tell, the old man (who was much younger than I am at this point) only ever switched it on for watching football on Sunday afternoons. But once or twice I'd wander by and -- with permission, and being careful to touch nothing -- try to watch part of the game.
It was a miserable thing to view. Big, blurry, dim, and just broadly indistinct. I didn't see the attraction compared to the perfectly-good 20" Zenith we had at home at that time that seemed so much more vibrant and useful. But the speakers sure sounded better on the projection set, so I guess there's that.
The tech did improve. The brightness got a lot better, and so did processing (including tricks like Velocity Scan Modulation, which sought to improve brightness at the expense of making geometry a deliberately-dynamic thing instead of an ideally-fixed thing), and the colors improved. Things like line doublers and scalers and higher-resolution electronics to drive the tubes did improve some aspects of the blur that was apparent, even with regular NTSC sources. But those same improvements were also made in direct-view CRTs; after all, they were both the same tech.
So CRT rear-projection was as good as a person could get for a bigger-than-direct-view for a long time, but the fidelity was very seldom particularly awesome on an absolute scale -- at any pricepoint.
Competing rear-projection systems like DLP and LCD began to dwarf it in the market not long after the turn of the century. Despite their hunger for expensive light bulbs (and single-chip DLP's own inherent temporal problems), these new players were often cheaper to produce and sell, came in smaller packages (they could often rest on furniture instead of needing their own floor space), had fewer setup issues, and fared pretty well in brightness and geometry.
CRT rear-projectors then got pushed completely aside as soon as things like plasma displays became cheap-enough, and big LCDs became good-enough -- somewhere between 2006 and 2009, on my timeline.
(CRT did last a bit longer in front-projection form, for people with very serious home theaters [think positively-enormous screen, tiered seating, dedicated space, and some blank checks], but LCD caught up there soon-enough as well.)