
Posted by qassiov 1/26/2026

Television is 100 years old today (diamondgeezer.blogspot.com)
669 points | 273 comments
sosomoxie 1/26/2026|
CRTs are peak steampunk technology. Analog, electric, kinda dangerous. Just totally mindblowing that we had these things in our living rooms shooting electron beams everywhere. I doubt it's environmentally friendly at all, but I'd love to see some new CRTs being made.
retrac 1/26/2026||
There's a synchronous and instantaneous nature you don't find in modern designs.

The image is not stored at any point. The receiver and the transmitter are part of the same electric circuit in a certain sense. It's a virtual circuit but the entire thing - transmitter and receiving unit alike - are oscillating in unison driven by a single clock.

The image is never entirely realized as a complete thing, either. While slow phosphor tubes do display a static image, most CRT systems used extremely fast phosphors; they release the majority of the light within a millisecond of the beam hitting them. If you take a really fast exposure of a CRT display (say 1/100,000th of a second) you don't see the whole image on the photograph - only the few most recently drawn lines glow. The image as a whole never exists at the same time. It exists only in the persistence of vision.
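
To put numbers on that, here's a back-of-the-envelope sketch in Python (assuming NTSC line timing and the ~1 ms phosphor decay described above):

    # Back-of-the-envelope: how much of an NTSC frame glows at once,
    # assuming a ~1 ms phosphor decay and standard NTSC line timing.
    LINE_RATE_HZ = 15_734           # NTSC horizontal scan rate
    LINE_TIME_S = 1 / LINE_RATE_HZ  # ~63.6 microseconds per scanline
    PHOSPHOR_DECAY_S = 1e-3         # most light released within ~1 ms

    glowing_lines = PHOSPHOR_DECAY_S / LINE_TIME_S
    print(f"{LINE_TIME_S * 1e6:.1f} us per line")
    print(f"~{glowing_lines:.0f} of 525 lines glowing at any instant")

That works out to roughly 16 of 525 lines lit at any moment.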

accounting2026 1/26/2026|||
> The image is not stored at any point.

Just wanted to add one thing, not as a correction but just because I learned it recently and find it fascinating. PAL televisions (the color TV standard in Europe) actually do store one full horizontal scanline at a time, before any of it is drawn on the screen. This is due to a clever encoding used in this format, where the TV actually needs to average two successive scan lines (phase-shifted relative to each other) to draw them. Supposedly this cancels out some forms of distortion. It is quite fascinating that this was even possible with analogue technology. The line is stored in a delay line for 64 microseconds. See e.g.: https://www.youtube.com/watch?v=bsk4WWtRx6M
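
A toy sketch of that averaging trick in Python (an idealized model of the idea, not a real PAL decoder):

    import cmath

    # Treat a scanline's chroma as a complex phasor. PAL inverts the
    # phase component on alternate lines, so a constant transmission
    # phase error appears as +err on one line and -err on the next.
    # Averaging the current line with the stored previous line cancels
    # the hue shift, trading it for a slight loss of saturation.
    hue = cmath.pi / 4      # intended hue angle
    err = cmath.pi / 18     # 10-degree phase error in transmission

    line_a = cmath.rect(1.0, hue + err)  # error adds on this line
    line_b = cmath.rect(1.0, hue - err)  # sign flips on the next line

    avg = (line_a + line_b) / 2
    print(f"decoded hue: {cmath.phase(avg):.3f} rad (equals the intended {hue:.3f})")
    print(f"saturation:  {abs(avg):.3f} (slightly under 1.0)")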

leguminous 1/27/2026|||
At some point, most NTSC TVs had delay lines, too. A comb filter was commonly used for separating the chroma from the luma, taking advantage of the chroma phase being flipped each line. Sophisticated comb filters would have multiple delay lines and logic to adaptively decide which to use. Some even delayed a whole field or frame, so you could say that in this case one or more frames were stored in the TV.

https://www.extron.com/article/ntscdb3
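
In the idealized case the comb is just a sum and a difference of the current and delayed lines. A minimal sketch (assuming identical luma on adjacent lines, which is exactly the condition the adaptive filters have to check for):

    import numpy as np

    # Toy 1-line comb filter: chroma phase flips each scanline, so with
    # identical luma on adjacent lines, sum and difference separate them.
    rng = np.random.default_rng(0)
    luma = rng.uniform(0.2, 0.8, 256)                       # brightness detail
    chroma = 0.1 * np.sin(np.linspace(0, 40 * np.pi, 256))  # color subcarrier

    line_cur = luma + chroma    # this scanline
    line_prev = luma - chroma   # delayed scanline: chroma phase inverted

    luma_out = (line_cur + line_prev) / 2    # chroma cancels
    chroma_out = (line_cur - line_prev) / 2  # luma cancels
    assert np.allclose(luma_out, luma) and np.allclose(chroma_out, chroma)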

aidenn0 1/28/2026||
If a motion adaptive 3D comb filter (which requires comparing successive frames) was present on a TV, you could bet that it was plastered all over the marketing material for the TV.
brewmarche 1/27/2026||||
I only knew about SECAM, where it’s even part of the name (Système Électronique Couleur Avec Mémoire)
grishka 1/27/2026||
You can decode a PAL signal without any memory, the memory is only needed to correct for phase errors. In SECAM though, it's a hard requirement because the two color components, Db and Dr, are transmitted on alternating lines, and you need both on each line.
accounting2026 1/27/2026||
Yes, that is called "PAL-S". But the system was designed to use the delay-line method, and it was employed from the inception (first broadcast in 1967).
jacquesm 1/27/2026||||
The physical components of those delay lines were massive crystals with silver electrodes grafted on to them. Very interesting component.
MBCook 1/27/2026|||
All PAL TVs had a delay line in them? Crazy.
ninkendo 1/27/2026||||
It doesn’t begin at the transmitter either; in the earliest days even the camera was essentially part of the same circuit. Yes, the concept of filming a show and showing the film over the air existed eventually, but before that (and even after that, for live programming) the camera would scan the subject image (actors, etc) line-by-line and send it down a wire to the transmitter, which would send it straight to your TV and into the electron beam.

In fact in order to show a feed of only text/logos/etc in the earlier days, they would literally just point the camera at a physical object (like letters on a paper, etc) and broadcast from the camera directly. There wasn’t really any other way to do it.

lebuffon 1/27/2026||
Our station had an art department that used a hot press to create text boards, which were set on an easel with a camera pointed at it. By using a black background with white text you could merge the text camera with a camera in the studio and "super-impose" the text onto the video feed.

"And if you tell the kids that today, they won't believe it!"

Doxin 1/27/2026||
It's kind of amazing the sort of hoops people needed to jump through to make e.g. the BBC-1 ident: https://www.youtube.com/watch?v=xfpEZDeVo00
lebuffon 1/27/2026||
It seems like imagination was more common in those days. There was no "digital" anything to lean on.
alamortsubite 1/27/2026||
The live-action PBS idents from the early 90's were some of the best.

https://www.youtube.com/watch?v=5Ap_JRofNMs https://www.youtube.com/watch?v=PJpiIyBkUZ4

This mini doc shows the process:

https://youtu.be/Q7iNg1dRqQI?t=167

lifeisstillgood 1/27/2026||||
>>> The image is not stored at any point.

The very first computers (the Manchester Baby) used CRTs as memory - the ones and zeros were bright spots on a “mesh”, and the electric charge on the mesh was read and re-sent back to the CRT to keep it fresh (a sort of self-refreshing RAM)

adrian_b 1/27/2026|||
Yes, but those were not the standard kind of CRTs that are used in TV sets and monitors.

The CRTs with memory for early computers were actually derived from the special CRTs used in video cameras. There the image formed by the projected light was converted into a distribution of charge stored on an electrode, which was then sensed by scanning with an electron beam.

Using CRTs as memory was proposed by von Neumann, and in his proposal he used the appropriate name for that kind of CRT: "iconoscope".

hahahahhaah 1/27/2026|||
Why didn't that catch on pre-transistor? Feels like you'd get higher density than valves and relays.
adrian_b 1/27/2026||
DRAM-like memories made with special storage CRTs were used for a few years, until 1954. For instance, the first generation of commercial electronic computers made by IBM (the scientific IBM 701 and the business-oriented IBM 702) used such CRTs.

Then CRT memories became obsolete almost instantaneously, due to the development of magnetic core memories, which did not require periodic refreshing and which were significantly faster. The fact that they were also non-volatile was convenient at that early time, though not essential.

Today, due to security concerns, you would actually not want your main memory to be non-volatile, unless you also always encrypt it completely, which creates problems of secret key management.

So CRT memories became obsolete several years before the replacement of vacuum tubes in computers with transistors, which happened around 1959/1960.

Besides CRT memories and delay line memories, another kind of early computer memory that quickly became obsolete was magnetic drum memory.

In the cheapest early computers (like the IBM 650), the main memory was not a RAM (i.e. neither a CRT nor magnetic cores), but a magnetic drum memory (i.e. with sequential, periodic access to data).

torginus 1/26/2026|||
Yeah, it's super weird that while we struggle with latency in the digital world, storing anything for any amount of time is an almost impossible challenge in the analog world.
iberator 1/27/2026||
You should check out:

- Core memory
- Drum memory
- Bubble memory
- Mercury delay line memory
- Magnetic tape memory :P

And probably many more. Remember that computers don't even need to be digital!

abyesilyurt 1/27/2026|||
> computers don't even need to be digital!

or electric.

accounting2026 1/27/2026|||
This stores a whole scanline: https://www.youtube.com/watch?v=bsk4WWtRx6M. This or something similar was in almost every decent color TV except the oldest ones.
mrandish 1/27/2026|||
It's worth deep diving into how analog composite broadcast television works, because you quickly realize just how insanely ambitious it was for 1930s engineers to have not only conceived it, but perfected and shipped it at consumer scale using only 1930s technologies.

Being old enough to have learned video engineering at the end of the analog days, it's kind of fun helping young engineers today wrap their brains around completely alien concepts, like "the image is never pixels" then "it's never digital" and "never quantized." Those who've been raised in a digital world learn to understand things from a fundamentally digital frame of reference. Even analog signals are often reasoned about as if their quantized form was their "true nature".

Interestingly, I suspect the converse would be equally true trying to explain digital television to a 1930s video engineer. They'd probably struggle similarly, always mentally remapping digital images to their "true" analog nature. The fundamental nature of their world was analog. Nothing was quantized. Even the idea that "quanta" might be at the root of physics was newfangled, suspect and, even if true, of no practical use in engineering systems.

accounting2026 1/27/2026|||
Yes, agreed! And while it is not quantized as such, there is an element of semi-digital protocol to it. The concept of a "scanline" is quantized, and there are "protocols" for indicating when a line ends, when a picture ends, etc. that the receiver and sender need to agree on... and "colorburst packets" for each line, delay lines and all kinds of clever techniques, so it is extremely complicated. Many things were necessary to overcome distortion and also to ensure backwards compatibility - first, how do you fit in the color so a monochrome TV can still show it? Later, how do you make it 16:9 and have it still show on a 4:3 TV (which it could!).
mrandish 1/27/2026||
> And while it is not quantized as such there is an element of semi-digital protocol to it.

Yes, before posting I did debate that exact point in my head, with scanlines as the clearest example :-). However, I decided the point is still directionally valid because ultimately most timing-centric analog signal encoding has some aspect of being quantized, if only to thresholds. Technically it would be more correct to narrow my statement about "never quantized" to the analog waveform driving the electron gun as it sweeps horizontally across a line. It always amazes digital-centric engineers weaned on pixels when they realize the timing of the electron gun sweep in every viewer's analog TV was literally created by the crystal driving the sweep of the 'master' camera in the TV studio (and would drift in phase with that crystal as it warmed up!). It's the inevitable consequence of there being no practical way to store or buffer such a high frequency signal for re-timing. Every component in the chain from the cameras to switchers to transmitters to TVs had to lock to the master clock. Live TV in those days was truly "live" to within 63.5 microseconds of photons hitting vacuum tubes in the camera (plus the time it took for the electrons to move from here to there). Today, "live" HDTV signals are so digitally buffered, re-timed and re-encoded at every step on their way to us, we're lucky if they're within 20 seconds of photons striking imagers.
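
For anyone who wants to check the numbers, the 63.5 microseconds (and the famous 59.94 Hz) fall straight out of the published NTSC color ratios:

    from fractions import Fraction

    # The whole NTSC timing chain hangs off one crystal.
    f_sc = Fraction(315, 88) * 10**6  # color subcarrier: ~3.579545 MHz
    f_h = f_sc * Fraction(2, 455)     # horizontal rate: ~15734.27 Hz
    f_v = f_h / Fraction(525, 2)      # field rate: ~59.94 Hz

    print(f"line time:  {1e6 / float(f_h):.2f} us")  # ~63.56
    print(f"field rate: {float(f_v):.4f} Hz")        # 59.9401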

My larger point though was that in the 1930s even that strict signal timing had to be encoded and decoded purely with discrete analog components. I have a 1950s Predicta television and looking at the components on the boards one can't help wondering "how the hell did they come up with this crazy scheme." Driving home just how bonkers the whole idea of analog composite television was for the time.

> first, how do you fit in the color so a monochrome TV can still show it?

To clarify for anyone who may not know, analog television was created in the 1930s as a black-and-white composite standard defined by the EIA in the RS-170 specification, then in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs (defined in the RS-170A specification). Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with some significant technical compromises which complicated and degraded color analog video for over 50 years.

accounting2026 1/27/2026||
Yes, I knew what you meant, and fully agree. It is fascinating that TV is even possible out of all these rather simple and bulky analog components. Even the first color TVs were built with vacuum tubes and no transistors.

As I recall there are all kinds of hacks in the design to keep them cheap. For instance, letting the flyback transformer that produces the needed high voltages operate at the same frequency as the horizontal scan rate (~15 kHz), so that one mechanism essentially serves double duty. The same was even seen in microcomputers, where the crystal needed for TV timing was also used for the microprocessor - meaning that e.g. a "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.
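
The C64 difference is easy to put numbers on. A quick sketch (crystal values and divisors as commonly documented for the two VIC-II variants, so treat them as assumptions):

    # Both machines derive the CPU clock from the crystal chosen
    # for the local TV standard (4x the color subcarrier).
    ntsc_crystal = 14_318_180   # Hz, 4x NTSC subcarrier
    pal_crystal = 17_734_475    # Hz, 4x PAL subcarrier

    ntsc_cpu = ntsc_crystal / 14  # ~1.0227 MHz
    pal_cpu = pal_crystal / 18    # ~0.9852 MHz

    print(f"NTSC C64: {ntsc_cpu / 1e6:.4f} MHz")
    print(f"PAL C64:  {pal_cpu / 1e6:.4f} MHz ({1 - pal_cpu / ntsc_cpu:.1%} slower)")

That's the "few percent": about 3.7% slower clock on PAL machines.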

mrandish 1/27/2026||
> "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.

Indeed! Even in the Playstation 2 era, many games still ran at different speeds in Europe than the U.S. and Japan. There were so many legacy artifacts which haunted computers, games, DVDs and more for decades after analog broadcast was supplanted by digital. And it all arose from the fact the installed base and supporting broadcast infrastructure of analog television was simply too massive to replace. In a way it was one of the biggest accrued "technical debts" ever!

The only regrettable thing is during the long, painful transition from analog to digital, a generation of engineers got the idea that the original analog TV standard was somehow bad - which, IMHO, is really unfair. The reality is the original RS-170 standard was a brilliant solution which perfectly fulfilled, and even exceeded, all its intended use cases for decades. The problems only arose when that solution was kept alive far beyond its intended lifetime and then hacked to support new use cases like color encoding while maintaining backward compatibility.

Analog television was created solely for natural images captured on vacuum tube cameras. Even the concept of synthetic imagery like character generator text and computer graphic charts was still decades in the future. Then people who weren't yet born when TV was created began to shove poorly converted, hard-edged, low-res digital imagery into a standard created to gracefully degrade smooth analog waveforms, and it indeed sucked. I learned to program on an 8-bit computer with 4K of RAM connected to a Sears television through an RF modulator. Even 32 columns of 256x192 text was a blurry mess with color fringes! On many early 8-bit computers, some colors would invert randomly based on which clock phase the computer started on! Red would be blue and vice versa, so we'd have to repeatedly hit reset until the colors looked correct. But none of that craziness was the fault of the original television engineers; we were abusing what they created in ways they couldn't have imagined.

account42 1/27/2026|||
It's interesting how early digital video systems were influenced by the analog aspects. DVDs were very much still defined by NTSC/PAL even though the data is fully digital.
mrandish 1/27/2026||
Indeed and even today's HDTV specification has elements based on echoes reverberating all the way from decisions made in the 1930s when specifying B&W TV.

The composite sampling rate (14.32 MHz) is 4x the NTSC color subcarrier frequency, and the component rate (13.5 MHz) was chosen to divide evenly into both the NTSC and PAL line rates of analog television. And those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc).

For example, the 720 horizontal pixels of DVD and digital satellite broadcasts was tied to the digital component video standard sampling the active picture area of an analog video scanline at 13.5 MHz to capture the 1440 clock transitions in that waveform. Similarly, 768 (another common horizontal resolution in pre-HD video) is tied to the composite video standard sampling at 14.32 MHz to capture 1536 clock transitions. The history of how these standards were derived is fascinating (https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf).

VGA's horizontal resolution of 640 is simply from adjusting analog video's rectangular pixels to be square (704 * 0.909 ≈ 640). It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities present at the time.
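
A sketch of the arithmetic behind 720 (standard Rec. 601 figures; the linked EBU paper has the full derivation). 13.5 MHz was picked partly because it divides evenly into both line rates:

    F_SAMPLE = 13_500_000     # Hz, Rec. 601 luma sampling rate
    NTSC_LINE_HZ = 15_734.27  # NTSC horizontal rate
    PAL_LINE_HZ = 15_625.0    # PAL horizontal rate

    print(F_SAMPLE / NTSC_LINE_HZ)  # ~858 samples per total NTSC line
    print(F_SAMPLE / PAL_LINE_HZ)   # 864 samples per total PAL line
    # Rec. 601 keeps 720 of those samples as active picture in both
    # systems, which is where the 720-pixel width of DVDs comes from.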

lebuffon 1/27/2026|||
I was on a course at Sony in San Mateo in the 1980s and they had a 36" prototype television in the corner. We all asked for it to be turned on. We were told by the instructor that he was not allowed to turn it on because the 40,000V anode voltage generated too many X-rays at the front of the picture tube.

:-))))

itisit 1/26/2026|||
And perhaps peak atompunk too when used as RAM. [0]

[0] https://en.wikipedia.org/wiki/Williams_tube

BizarroLand 1/26/2026||
Damn, what I wouldn't give to be able to look at my computer and see the bits bobbing in its onboard ram
itisit 1/26/2026||
Like the MegaProcessor? [0]

[0] https://www.youtube.com/watch?v=lNa9bQRPMB8

BizarroLand 1/26/2026||
Yes but for my 9950x, lol
ortusdux 1/26/2026|||
One summer odd-job included an afternoon of throwing a few dozen CRTs off a 3rd floor balcony into a rolloff dumpster. I'da done it for free.
ihaveajob 1/27/2026|||
People pay for that these days in smash rooms.
hahahahhaah 1/27/2026|||
Rock and roll!
fecal_henge 1/26/2026|||
Extra dangerous aspect: on really early CRTs they hadn't quite nailed the glass thicknesses. One failure mode was that the neck that held the electron gun would fail. This would propel the gun through the front of the screen, possibly toward the viewer.
ASalazarMX 1/27/2026|||
I don't know, "Killed by electron gun breakdown" sounds like a rad way to go. You can replace "electron gun" with "particle accelerator" if you want.
cf100clunk 1/26/2026|||
Likewise, a dropped CRT tube was a constant terror for TV manufacturing and repair folks, as it likely would implode and send zillions of razor-sharp fragments airborne.
thomassmith65 1/26/2026|||
My high school science teacher used to share anecdotes from his days in electrical repair.

He said his coworkers would sometimes toss a television capacitor at each other as a prank.

Those capacitors retained enough charge to give the person unlucky enough to catch one a considerable jolt.

freedomben 1/26/2026|||
Touching one of those caps was a hell of an experience. It was similar in many ways to a squirrel tap with a wrench in the auto shop (for those who didn't do auto shop, a squirrel tap with a wrench is when somebody flicks your nut sack from behind with a wrench. Properly executed it would leave you doubled over out of breath).
NL807 1/26/2026|||
lol I did this with my mates. Get one of those 1 kV ceramics, give it some charge and bob's your uncle, you have one angry capacitor.
iberator 1/27/2026||
This can be deadly :/ just wow
account42 1/27/2026||
Many fun things can be.
torginus 1/26/2026||||
I remember smashing a broken monitor as a kid for fun, having heard about the implosion stuff, and sadly found the back of the glass was stuck to some kind of plastic film that didn't allow the pieces to fly about :(
ASalazarMX 1/27/2026||||
I still can't get over how we used to put them straight in our faces, yet I never knew of anyone ever having an accidental face reshaping.
account42 1/27/2026|||
That doesn't match my experience of deliberately dropping an old CRT monitor off the roof. Implosions are unfortunately not as exciting as explosions.
jlokier 1/27/2026|||
Some recent HN comments about CRT implosions people have experienced.

https://news.ycombinator.com/item?id=46355765

"I still have a piece of glass in back of the palm of my right hand. Threw a rock at an old CRT and it exploded, after a couple of hours I noticed a little blood coming out of that part of hand. Many, many years later was doing xray for a broken finger and doctor asked what is that object doing there? I shrugged, doc said, well it looks like it's doing just fine, so might as well stay there. How lucky I am to have both eyes."

https://news.ycombinator.com/item?id=46354919

"2. Throwing a big big stone to an abandoned next to the trashcan CRT TV while I had it placed normally because it didn’t break when I threw it facing up and the next thing I remember after opening my eyes which I closed from the bang was my friends who were further down the road looking at me as it I were a ghost since big big chunks for the CRT glass flew just right next to me.

CRTs were dangerous in many aspects!"

https://news.ycombinator.com/item?id=46356432

"I'll never forget the feeling of the whoosh when I was working as a furniture mover in the early 2000s and felt the implosion when a cardboard box collapsed and dumped a large CRT TV face-down on the driveway, blowing our hair back. When the boss asked what happened to the TV, I said it fell, and our lead man (who had set it on the box) later thanked me for putting it so diplomatically."

cf100clunk 1/27/2026|||
The ''tube'' was indeed extremely fragile and thus extremely dangerous. I'm talking about only the unguarded ''tube'' itself. Repair and manufacturing technicians had to deal with that on a regular basis. Later, consumer protection laws and other standards came into effect that made TV and monitor superstructures more capable of guarding such a dangerous internal component. Your experience was clearly with those much safer, later types.
kleiba 1/26/2026|||
What do you mean "had"? I just turned mine off a minute ago. I have yet to make the transition to flat-screen TVs, but in the meantime, at least no-one's tracking my consumer habits.
rapfaria 1/27/2026||
Not through your TV, but they see you driving to the last Blockbuster tho
kleiba 1/27/2026||
I wish.
account42 1/27/2026|||
While not entirely unrelated thematically, being electric puts it distinctly outside of steampunk and even dieselpunk. I don't think anyone would call The Matrix steampunk, but CRTs are at the center of its aesthetic. Cassette Futurism is the correct term, I believe, though it also overlaps with some sub-genres of cyberpunk.
kazinator 1/26/2026|||
With CRTs, the environmental problem is the heavy metals: tons of lead in the glass screen, plus cadmium and whatnot. Supposedly there can be many pounds of lead in a large CRT.
accounting2026 1/27/2026|||
Yes - and x-rays too! Some came from the main TV tube itself (though it was often shielded), but historically the main problem was actually the vacuum rectifiers used to generate the high voltages required. Those vacuum tubes essentially became x-ray bulbs and had to be shielded. This problem arose as the first color TVs appeared in the late 60s. Color required higher voltages for the same brightness, due to the introduction of a mask that absorbed a lot of the energy. As a famous example, certain GE TVs would emit a strong beam of x-rays, but it was aimed downwards, so it would mostly expose someone beneath the TV. Reportedly a few models could emit 50,000 mR/hr at 9 inches distance https://www.nytimes.com/1967/07/22/archives/owners-of-9000-c... which is actually quite a lot (enough for radiation sickness after a few hours). All were recalled, of course!
cf100clunk 1/26/2026|||
The shadow mask system for colour CRTs was a huge improvement that thwarted worries about ''beams everywhere'':

https://en.wikipedia.org/wiki/Shadow_mask

accounting2026 1/27/2026|||
Actually, the voltages had to be raised due to the shadow mask, and this rise in voltage meant you were now in x-ray territory, which wasn't the case before. The infamous problems with TVs emitting x-rays, and the associated recalls, involved the early color TVs. And it wasn't so much from the tube as from the shunt regulators etc. in the power supply, which were themselves vacuum tubes. If you removed the protective cans around those, you would be exposed to strong radiation. Most of that went away when TVs were transistorized, so the high-voltage circuits didn't involve vacuum tubes.
cf100clunk 1/27/2026||
Most of those old TVs were not Faraday Caged either, nor were they grounded to earth, so all that radiation and energy was one hardware failure away from seriously unfunny events. Their chassis grounding always gave a tingle to the touch.
hahahahhaah 1/27/2026|||
Try antialias with that bad boy
brcmthrowaway 1/26/2026|||
The 1940-1990 era of technology can't be beat. Add hard drives and tape to the mix. What happened to electromechanical design? I doubt it would be taught anymore. Everything is solid state
Xirdus 1/26/2026||
Solid state is the superior technology for almost everything. No moving parts means more reliable, quieter, and very likely more energy efficient since no mass has to move.
jasonfarnon 1/27/2026||
Do modern SSDs last as long as the old platter drives? For me, when SSDs fail it's frustrating because I can't open them up and do anything about it - it's a complete loss. So I tend to have a low opinion of their reliability (the same issue I have with old versus new electronic-everything cars). I don't know the actual lifetimes. Surely USB sticks are universally recognized as pretty crappy. I can leave those in the same location, plugged in, and they'll randomly die after a couple of years.
Xirdus 1/28/2026|||
I feel like I'm the only person in the world who never had an issue with USB flash drives. Or HDDs for that matter. Or SSDs. I don't think I've ever had any storage die on me except optical disks.

Internet says both HDDs and SSDs have similar average lifespans, but with HDDs it's usually a mechanical failure so yes, you can often DIY it back to life if you have the right parts. With SSDs it's almost always the memory cells themselves wearing out. On the flip side, data recovery is usually much easier since SSD will usually keep working in read-only mode for a while, whereas a faulty HDD won't work at all.

djkoolaide 1/27/2026|||
I've had two SSDs "die" over the years, both of them went read-only, but I was able to recover all data. SSD failure modes are weird.
grishka 1/27/2026|||
That and modern digital TV is just incredibly boring from the technical standpoint. Because everything is a computer these days, it's just some MPEG-2 video. The only thing impressive about it is that they managed to squeeze multiple channels worth of video streams into the bandwidth of one analog channel.
joe_the_user 1/26/2026|||
Also, I believe precursors to the CRT existed in the 19th century. What was unique with television was the creation of a full CRT system that allowed moving picture consumption to become a mass phenomenon.
pinnochio 1/26/2026|||
We're getting awfully close to recreating CRT qualities with modern display panels. A curved 4:3 1000Hz OLED panel behind glass, and an integrated RetroTink 4K with G-Sync Pulsar support would do it. Then add in a simulated degauss effect and electrical whine and buzzing sounds for fun.
soperj 1/26/2026|||
still can't play duck hunt on it though.
gzalo 1/26/2026||
Yes you can, see https://neslcdmod.com/

It basically mods the ROM to allow for a bit more latency when checking for hit targets

charcircuit 1/27/2026||||
>1000 Hz

This sounds like a brute force solution over just having the display controller read the image as it is being sent and emulating the phosphors.

account42 1/27/2026|||
A 1000 Hz panel does not imply that the computer has to send 1000 frames per second.
pinnochio 1/27/2026|||
Whoops, I misremembered. G-Sync Pulsar works with a 360Hz panel, claims perceived motion clarity comparable to 1000Hz+.
pezezin 1/28/2026|||
Why curved? We didn't like the CRT curvature back then and manufacturers struggled to make them as flat as possible, finally reaching "virtually flat" screens towards the end of the CRT era. I have one right here on my desk, a Sony Multiscan E200.
femto 1/26/2026|||
This thread makes me realise that the old Telequipment D61 Cathode Ray Oscilloscope I have is worth hanging on to. It's basically a CRT with signal conditioning on its inputs, including a "Z mod" input, making it easy to do cool stuff with it.
gman83 1/27/2026|||
This is a cool little project you might be interested in - https://github.com/mausimus/ShaderGlass
accidentallfact 1/27/2026|||
'Steampunk' means no electricity. You need to come up with another term. Analogpunk, maybe?
WorldMaker 1/27/2026||
"Dieselpunk" is sometimes considered the next door neighbor term for WW1 through early 1950's retrofuturism with electricity and radios/very early televisions.

Sometimes people use "Steampunk" for shorthand for both because there are some overlaps in either direction, especially if you are trying for "just" pre-WWI retrofuture. Though I think the above poster was maybe especially trying to highlight the sort of pre-WWI overlap with Steampunk with more electricity but not yet as many cars and "diesel".

https://en.wikipedia.org/wiki/Dieselpunk

accidentallfact 1/27/2026||
I don't know. It seems more like a coincidence that steam and electricity were developed at the same time, so worlds without one of them seem natural. Another possibility might be no semiconductors. No nuclear also feels plausible, but it's just not interesting. Anything else requires a massive stretch to explain why technology got stuck in such a state.
WorldMaker 1/27/2026||
Perhaps, if you are worried about realism from the perspective of modern technology. But a lot of the concept of retrofuturism is considering the possible futures from the perspectives of the past. You don't necessarily need realism for why you would consider an exercise like that.

Steampunk is "rootable" in the writings of Jules Verne and H. G. Wells and others. We have scifi visions from Victorian and Edwardian lenses. It wasn't needed at the time to explain how you steam power a submarine or a rocket ship, it was just extrapolating "if this goes on" of quick advances in steam power and expecting them to eventually get there.

Similar with a lot of Dieselpunk. The 1930s through the 1950s are often referred to as the Golden Age of scifi. There's so much science fiction written in the real world with a zeal for possible futures that never happened. We don't necessarily need a "massive stretch" to explain why technology took a different path or "got stuck" at a particular point. We have plenty of ideas of the exuberance of that era just in the books that they wrote and published themselves.

(Not that we are lacking in literary means to create excuses for the "realism" of retrofuture, either, when we care to. For one obvious instance, the Fallout franchise's nuclear warfare is central to its dieselpunk setting and an obvious reason for technology to get "stuck". For one less obvious reason, I like "For All Mankind" and its "Apollopunk" setting using the excuse of Russia beating the United States to first boots on the Moon and the butterfly impacts that could have had.)

accidentallfact 1/28/2026||
I mean that steampunk looks plausible, because it indeed seems to be purely a historical coincidence that electricity was developed at the same time. They are unrelated, one doesn't follow from the other in any way, so there is no obvious need to have both.

You pretty much need to have both chemistry and electricity, or neither.

Even Jules Verne understood the impossibility (or at least absurd impracticality) of a steam powered submarine, and made Nautilus electric.

It's unclear if internal combustion engines would be developed without electricity, and to what degree they would become practical.

I'm not sure about semiconductors, but the discovery does seem fairly random, and it seems plausible that electronics could just go on with vacuum tubes.

It seems perfectly plausible that nuclear wasn't noticed or practically developed, but, as I said, it just isn't an interesting setting.

zapataband2 1/27/2026||
[dead]
timonoko 1/26/2026||
I saw TV for the first time in 1957. Finland had no TV transmitters, so programs came from Soviet Estonia. I distinctly remember watching a romantic Russian film with a catchy tune. Perhaps named "Moscow Lights"?

How is it even possible that I remember all this, when I was only 4 years old?

Gemini knows:

The Film: In the Days of the Spartakiad (1956/1957)

The song "Moscow Nights" was originally written for a documentary film called "In the Days of the Spartakiad" (V dni spartakiady), which chronicled a massive Soviet sports competition.

The Scene: In the film, there is a romantic, quiet scene where athletes are resting in the countryside near Moscow at night.

The Music: The song was sung by Vladimir Troshin. It was intended to be background music, but it was so hauntingly melodic that it became an overnight sensation across the USSR and its neighbors.

The Finnish Connection: In 1957, the song became a massive hit in Finland and Estonia. Since you were watching Estonian TV, you likely saw a version where the dialogue or narration was dubbed into Finnish—a common practice for broadcasts intended for Finnish-speaking audiences across the Gulf of Finland.

scotty79 1/26/2026||
Isn't it wild that you are asking the 5th (or so) technological miracle that happened in your lifetime about the first one you remember?
timonoko 1/26/2026||
I actually thought that the "Computer" was some kind of abstract construct in 1971. And "programs" were just a method of expressing algorithms in textual form. Only when we were allowed to have brief interactions with a Teletype did I believe there was an actual machine that understood and executed these complex commands. Mind blown.
keithnz 1/27/2026||
I was busy being born that year :)
therein 1/26/2026|||
I easily have many memories from age 4. I think I even remember the first time I started forming memories. It was a few years before that: I had come out of my room and saw some toys I had been playing with the night before. I realized they were in the same spot where I had left them, which made me realize the world had permanence and my awareness had continuity. I could leave things in a certain spot and they would be there the next day; I could build things and they would stay that way. I realized I could remember things - in the way "homo sapiens sapiens" means thinking about thinking, I realized I remembered that I could remember.
rm445 1/26/2026|||
This is a fascinating post but I don't believe it reflects (most) human memory development, which has a pronounced forgetting phase called 'childhood amnesia'. When your kid starts to talk, it's startling what a two-year-old can remember and can tell you about. And it's kinda heartbreaking when they're 4-5 and you realise that those early memories have faded.
blauditore 1/26/2026||
Note that your memories might not be accurate, as your brain may have slightly altered them over the years, over and over. There is generally no way for you to know (except for some external proof).

This is not just the case for early childhood memories, but for anything - the more time passes, the less accurate. It's even possible to have completely "made-up" memories, perceived as 100% real, e.g. through suggestive questioning in therapy.

usefulcat 1/27/2026||
I can relate. I often feel like my earliest memories are now more like memories of memories, and I dimly recall that it wasn’t always like that.
tgtweak 1/26/2026|||
Definitely have some memories from 3 years old - some people claim earlier and I wouldn't doubt that, although it's very rare for memories before 2 to be recalled episodically.
tzs 1/26/2026|||
It's also hard to be sure if early memories are actually memories from the actual event or are memories your brain constructed from later hearing people describe the event.

There was one experiment where researchers got a man's family at a holiday gathering of the extended family to start talking about funny things that had happened to family members when they were children. In particular the man's parents and siblings told about a funny incident that happened to the man during his 3rd grade school play.

The man had earlier agreed to participate in some upcoming psychological research but did not yet know the details or been told when the research would start.

Later he was contacted and told the research would be starting soon, and asked to come in and answer some background questions. They asked about early non-academic school activities, and he told them about his 3rd grade play and the funny incident that happened, including details that his family had not mentioned.

Unbeknownst to the man the research had actually started earlier and the man's family had agreed to help out. That story about the 3rd grade play that his family told was actually given to them by the researchers. None of his elementary school classes had put on any plays.

This sort of thing can be a real problem. People being questioned about crimes (as witnesses or suspects) can get false memories of the crime if the person questioning them is not careful. Or worse, a questioner could intentionally get them to form false memories that they will later recall on the witness stand.

avadodin 1/27/2026||
The memories are probably nothing like how they were at the time, but I vividly remember running away from my parents with my elder sister, getting bullied by an extremely blond girl at day care, and falling and literally eating dirt (including that it tasted salty) at around age 2-3.
jasonfarnon 1/27/2026||
But at some point don't you lose the direct memory, and only retain the memory of remembering it? E.g. I don't know that I directly remember the fight I got into with the neighbor kid at age 4, but I definitely remember thinking about it for something we had to write in school around age 8. Or at least I could when I was in high school. That's when I thought about the time I had to write that essay when I was 8. At some point all I remember are the layers of subsequent thoughts about the original event, and I don't really access the original event any more; it's just a stub.
avadodin 1/27/2026||
At some point, most memories are like that, to be honest - not just early childhood ones. You could say I consider these "vivid" because I can recall more details of them than of the average memory.
rubslopes 1/26/2026|||
I have one memory that I can place between late 2 and early 3: my mum telling me I was going to have a brother. When he was born, I was 3 years and 6 months old.
poisonarena 1/26/2026|||
link to "Vladimir Trochin - Moscow nights (1956)" https://youtu.be/fRFScbISKDg?si=UsVHVnlnUnU2SP6v
michaelsbradley 1/26/2026||
My first memory of TV (but not my earliest memory by far) was, at age 4, seeing the first Space Shuttle launch. It was live on a little black-and-white set my parents had in their bedroom.
jedberg 1/26/2026||
This is interesting. John Logie Baird did in fact demonstrate something that looked like TV, but the technology was a dead end.

Philo Farnsworth demonstrated a competing technology a few years later, but every TV today is based on his technology.

So, who actually invented Television?

armadsen 1/26/2026||
For what it’s worth, Philo Farnsworth and John Logie Baird were friendly with each other. I was lucky to know Philo’s wife Pem very well in the last part of her life, and she spoke highly of Baird as a person.

David Sarnoff and RCA were an entirely different matter, of course…

bovermyer 1/26/2026||
The article has a photo of a plaque putting Baird's death in 1946, less than 40 years old.

What happened?

roarcher 1/26/2026||
He was 57, born in 1888. Died of a stroke.

https://en.wikipedia.org/wiki/John_Logie_Baird#Death

ggm 1/26/2026||
One of his electro-mechanical units was on display in Victoria, Australia. A most amazing assemblage; you can sort of get the idea of how it worked just by looking at it.

I read online that at his end, Baird was proposing a TV scan rate we'd class as HD quality, which lost out to the 405-line standard (which preceded 625-line colour)

There is also a quality of persistence in his approach to things, he was the kind of inventor who doesn't stop inventing.

zwischenzug 1/26/2026|||
Whatever we call television now, television then was literally "vision at a distance", which Baird was the first to demonstrate (AFAIK).

The TV I have now in my living room is closer to a computer than a television from when I grew up (born 1975) anyway, so the word could mean all sorts of things. I mean, we still call our pocket computers "phones" even though they are mainly used for viewing cats at a distance.

MoonWalk 1/26/2026|||
You should read about the invention of color television. There were two competing methods, one of which depended on a spinning wheel with colored filters in it. If I remember correctly, you needed something like a 10-foot wheel to have a 27-inch TV.

Sure enough, this was the system selected as the winner by the U.S. standard-setting body at the time. Needless to say, it failed and was replaced by what we ended up with... which still sucked because of the horrible decision to go to a non-integer frame rate. Incredibly, we are for some reason still plagued by 29.97 FPS long after the analog system that required it was shut off.

iso1631 1/26/2026|||
Originally you had 30fps, it was the addition of colour with the NTSC system that dropped it to 30000/1001fps. That wasn't a decision taken lightly -- it was a consequence of retrofitting colour onto a black and white system while maintaining backward compatibility.

When the UK (and Europe) went colour it changed to a whole new system and didn't have to worry too much about backward compatibility. It had a higher bandwidth (8 MHz, so 33% more than NTSC), and was broadcast on new channels separate from the original 405-line ones. It also had features like alternating the phase of every other line to reduce the "tint" or "never twice the same color" problem that NTSC had.

America chose 30fps but then had to slow it by 1/1001ths to avoid interference.

Of course, by the 90s and the growth of digital, there was already far too much stuff expecting "29.97 Hz", so it remained, again for backward compatibility.
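
Written out, the slowdown is exact rational arithmetic:

    from fractions import Fraction

    # The "slowed by 1/1001" math, exactly:
    color_fps = Fraction(30) * Fraction(1000, 1001)
    print(color_fps)             # 30000/1001
    print(float(color_fps))      # 29.97002997...
    print(float(color_fps * 2))  # 59.9400... fields per second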

Dwedit 1/26/2026|||
60 interlaced fields per second, not 30 frames per second. The two fields do not necessarily contribute to the same frame.
account42 1/27/2026|||
Which unfortunately also has continued to plague us much longer than we have used any display technology where that might have made any sense.
MoonWalk 2 days ago||
Yep, once again thanks to broadcasters spreading FUD. When the progressive vs. interlaced debate raged during the development of ATSC, broadcasters whined that progressive would disrupt their entire production pipeline... which was utter BS because they'd been showing film-based content for generations.
dylan604 1/26/2026|||
If you get those fields out of sync, you will have problems though, so it's okay to consider them in pairs per frame for sanity's sake.
MoonWalk 2 days ago||||
I understand the origin of the 29.97 frame rate. They did have a choice, though: shift the audio signal's frequency and orphan the (relatively limited number of) receivers at the time, or saddle generations of people with this dumb frame rate.
masfuerte 1/26/2026||||
In the UK the two earliest channels (BBC1 and ITV) continued to broadcast in the 405 line format (in addition to PAL) until 1985. Owners of ancient televisions had 20 years to upgrade. That doesn't seem unreasonable.
lebuffon 1/27/2026||||
An engineer at RCA in New Jersey told me that at the first (early) NTSC color demo, the interference was corrected by hand-tweaking the color subcarrier oscillator from which the vertical and horizontal intervals were derived, and the final result was what we got.

The interference was caused when the spectrum of the color subcarrier overlapped the spectrum of the horizontal interval in the broadcast signal. Tweaking the frequencies allowed the two spectra to interleave in the frequency domain.

dylan604 1/26/2026|||
understanding the effect of the 1.001 fix has given me tons of job security. That understanding came not just from book learning, but from OJT working in a film/video post house that had engineers, colorists, and editors who were all willing to entertain a young college kid's constant use of "why?". Then being present for the transition from editing film on flatbeds to editing film transfers on video. Part of that came from having to transfer audio from tape reels to video by changing to the proper 59.94Hz or 60Hz crystal that was needed to control the player's speed. We also had a studio DAT deck that could slow down audio recorded in the field at 24fps to play back at 23.976.

Literally, to this day, I am dealing with all of these decisions made ~100 years ago. The 1.001 math is a bit younger, from when color was rolled out, but what's a little rounding between friends?

eternauta3k 1/26/2026|||
Why is an integer frame rate better?
zoky 1/26/2026|||
For one thing, it’s much easier to measure spans of time when you have an integer frame rate. For example, 1 hour at 30fps is exactly 108,000 frames, but at 29.97 it’s only 107,892 frames. Since time code must label frames with integers, “drop-frame” time code is used, where certain frame numbers are skipped (two at the start of each minute, except every tenth minute) so that by the end of each measured hour the total elapsed time syncs up with the time code, i.e. “01:00:00;00” falls after exactly one hour has passed. This is of course crucial when scheduling programs, advertisements, and so on. It’s a confusing mess and historically has caused all kinds of headaches for the TV industry over the years.
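
For the curious, that bookkeeping can be written out in a few lines. A sketch in Python (standard SMPTE drop-frame convention; the function name is made up):

    # Drop-frame rule: timecode labels ;00 and ;01 are skipped at the
    # start of every minute except each tenth minute, so the labels
    # catch up with wall-clock time.
    def frames_to_dropframe(frame_number: int) -> str:
        frames_per_min = 30 * 60 - 2                # 1798 labels per minute
        frames_per_10min = frames_per_min * 10 + 2  # 17982 (10th minute keeps its 2)

        d, m = divmod(frame_number, frames_per_10min)
        if m > 2:
            frame_number += 18 * d + 2 * ((m - 2) // frames_per_min)
        else:
            frame_number += 18 * d

        frs = frame_number % 30
        secs = frame_number // 30 % 60
        mins = frame_number // 1800 % 60
        hrs = frame_number // 108000
        return f"{hrs:02d}:{mins:02d}:{secs:02d};{frs:02d}"

    print(frames_to_dropframe(107892))  # 01:00:00;00 -> exactly one hour
    print(frames_to_dropframe(1800))    # 00:01:00;02 -> ;00 and ;01 skipped
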
MoonWalk 2 days ago|||
In addition to the other poster's on-point remarks, film cameras have always run at integer frame rates. We have generations of motion pictures shot at 24 FPS.

Many TV shows (all, before video tape) were shot on film too, but I'm not sure if they were at an even 30 FPS.

chasil 1/26/2026|||
I had a communications theory class in college that addressed "vestigial sideband modulation," which I believe was implemented by Farnsworth. I think this is a critical aspect of the introduction of television technology.

https://en.wikipedia.org/wiki/Single-sideband_modulation#Sup...

drmpeg 1/26/2026||
VSB came later. From https://www.tvtechnology.com/opinions/hdtv-from-1925-to-1994

In the United States in 1935, the Radio Corporation of America demonstrated a 343-line television system. In 1936, two committees of the Radio Manufacturers Association (RMA), which is now known as the Consumer Electronics Association, proposed that U.S. television channels be standardized at a bandwidth of 6 MHz, and recommended a 441-line, interlaced, 30 frame-per-second television system. The RF modulation system proposed in this recommendation used double-sideband, amplitude-modulated transmission, limiting the video bandwidth it was capable of carrying to 2.5 MHz. In 1938, this RMA proposal was amended to employ vestigial-sideband (VSB) transmission instead of double sideband. In the vestigial-sideband approach, only the upper sidebands (those above the carrier frequency), plus a small segment or vestige of the lower sidebands, are transmitted. VSB raised the transmitted video bandwidth capability to 4.2 MHz. Subsequently, in 1941, the first National Television Systems Committee adopted the vestigial sideband system using a total line rate of 525 lines that is used in the United States today.

joe_the_user 1/26/2026|||
The thing is that "television" seemed like a thing but really it was a system that required a variety of connected, compatible parts, like the Internet.

Different pieces of what became TV existed in 1900, the challenge was putting them together. And that required a consensus among powerful players.

AndrewDucker 1/26/2026|||
There were a great many small breakthroughs over time. Where you draw the line is up to you.
throwaway_20357 1/26/2026|||
Wasn't all this early TV experimentation based on Nipkow disks (https://en.wikipedia.org/wiki/Nipkow_disk)?
accidentallfact 1/27/2026|||
I think it would be pretty uncontroversial from the technological point of view, but then, the first "real" TV broadcast would be the 1936 Olympic games...
gtoubassi 1/26/2026|||
"The Last Lone Inventor: A Tale of Genius, Deceit, and the Birth of Television" is a great book detailing the Farnsworth journey.
tehwebguy 1/26/2026|||
Farnsworth…
kridsdale3 1/26/2026||
Wernstrom!
reactordev 1/26/2026|||
Baird did. Farnsworth invented the all-electric version (sans mechanical parts).

Akin to how Ed Roberts, John Blankenbaker and Mark Dean invented the personal computer, but Apple invented the PC as we know it.

msla 1/27/2026||
You're skipping a few steps (like the Altair 8800) if you say that Apple invented the PC as we know it. Apple didn't even invent the GUI as we know it.
reactordev 1/27/2026||
No, I mentioned Ed Roberts. Up until Apple, you had to solder your own or buy one that was put together for you.
cultofmetatron 1/26/2026||
> but every TV today is based on his technology.

Philo Farnsworth invented the cathode ray tube. Unless you're writing this from the year 2009 or before, I'm going to have to push back on the idea that TVs TODAY are based on his technology. They most certainly are not.

_nub3 1/26/2026|||
1897 Ferdinand Braun invents the Cathode Ray Tube dubbed "Braunsche Röhre"

https://en.wikipedia.org/wiki/Kenjiro_Takayanagi

'Although he failed to gain much recognition in the West, he built the world's first all-electronic television receiver, and is referred to as "the father of Japanese television"'

He presented it in 1926 (Farnsworth in 1927)

However, the father of television was this dude:

https://en.wikipedia.org/wiki/Manfred_von_Ardenne

Better resolution, wireless transmission and Olympics 1936

jedberg 1/26/2026||||
He invented electronic rasterization, a form of which is still in use today.
shellac 1/26/2026|||
No, Braun invented the cathode ray tube.
TacticalCoder 1/26/2026||
And 100 years ago my great-aunt and grandmother (both RIP) were little kids, and my great-grandmother, born in the 19th century, whom I knew very well for she lived until 99 years old, was filming them playing on the beach using a "Pathe Baby" hand camera.

I still have the reels, they look like this:

https://commons.wikimedia.org/wiki/File:Films_Path%C3%A9-Bab...

https://fr.wikipedia.org/wiki/Path%C3%A9-Baby

And we converted some of these reels to digital files (well, my brothers and I asked a specialized company to "digitize" them).

100 years ago people already had cars, tramways (as a kid my great-grandmother tried to look under the first tramway she saw to see "where the horses were hiding"), cameras to film movies, telephones; the telegraph existed, you could trade the stock market and, well, it's new to me, but TV had just been invented too.

TeMPOraL 1/26/2026|
On the one hand, it's fascinating to know just how much of what shapes our lives was already there a hundred years ago in some form.

On the other hand, it's just as fascinating to realize that all that, and ~everything that shapes modern life, did not exist until ~200 years ago. Not just appliances, but medicines and medicine, plastics and greases and other products of petrochemical industry and everything built on top of it, paints and cleaners and materials and so on...

shevy-java 1/26/2026||
In a way television was kind of cool. I loved it as a child, give or take.

Nowadays ..... hmmm. I haven't owned a TV in many years. Sadly youtube kind of replaced television. It is not the same; quality-wise I think youtube is actually worse than e. g. the 1980s era. But I also don't really want to go back to television, as it also had low quality - and it simply took longer, too. On youtube I was recently watching old "Aktenzeichen XY ungelöst" (in German). The old videos from the 1980s are kind of cool and interesting. I watched the new ones - it no longer made ANY sense to watch ... the quality is much worse, and it is also much more boring. It's strange.

tadfisher 1/26/2026||
I remember when we organized our lives around television. On Saturday mornings it would be cartoons (including the first full-CGI television shows, Reboot and Transformers: Beast Wars), Wednesday evenings would be Star Trek: TNG, Fridays would be the TGIF block of family shows (from early-to-mid-90s USA perspective here). It felt like everyone watched the same thing, everyone had something to talk about from last night's episode, and there was a common connection over what we watched as entertainment.

We saw a resurgence of this connection with big-budget serials like Game of Thrones, but now every streaming service has their own must-watch thing and it's basically as if everyone had their own personal broadcast station showing something different. I don't know if old-school television was healthy for society or not, but I do have a feeling of missing out on that shared connection lately.

elevation 1/26/2026|||
> but I do have a feeling of missing out on that shared connection lately

Mass media isolates individuals who don't have access to it. I grew up without a TV, and when TV was all my neighbors could talk about, I was left out, and everyone knew it.

While other children were in front of the television gaining "shared experience", I built forts in the woods with my siblings, explored the creek in home made boats, learned to solder, read old books, wrote basic computer programs, launched model rockets, made up magic tricks. I had a great childhood, but I had a difficult time connecting with children whose only experiences were these shallow, shared experiences.

Now that media is no longer "shared", the fragmented content that people still consume has diminishing social value -- which in many cases was the only value it had. Which means there are fewer social consequences for people like me who choose not to partake.

rexpop 1/26/2026|||
Mass media even more so isolates individuals who DO have access to it.

Their "shared experience" is, actually, a debilitating addiction to flat, untouchable, and anti-democratic spectacle.

The last hundred years have seen our society drained of social capital, inescapably enthralled by corporate mediators. Mass media encourages a shift from "doing" to "watching." As we consume hand-tailored entertainment in private, we retreat from the public square.

Heavy television consumption is associated with lethargy and passivity, reinforcing an intolerance for unstructured time. This creates a "pseudoworld" where viewers feel a false sense of companionship—a parasocial connection with television personalities—that creates a feeling of intimacy while requiring (and offering) no actual reciprocity or effort.

Television, the "800-pound gorilla of leisure time," has privatized our existence. This privatization of leisure acts as a lethal competitor for scarce time, stealing hours that were once devoted to social interaction—the picnics, club meetings, and informal visiting that constitute the mētis or practical social knowledge of community life.

parpfish 1/26/2026||||
it feels like you're advocating that "unless everybody can form a shared connection through common culture, nobody should form a shared connection through common culture".
iammrpayments 1/27/2026|||
It’s funny, because when social media first came out, I also started to feel I had shared experiences and wasn’t weird
jedberg 1/26/2026||||
This is something I've been lamenting for a long time. The lack of shared culture. Sometimes a mega-hit briefly coalesces us, but for the most part everyone has their own thing.

I miss the days when everyone had seen the same thing I had.

Diederich 1/26/2026|||
I found this the other day: https://www.youtube.com/watch?v=ksFhXFuRblg "NBC Nightly News, June 24, 1975" I strongly urge people to watch this, it's 30 minutes but there are many very illuminating insights within. One word for you: Exxon.

While I was young in 1975, I did watch ABC's version of the news with my grandparents, and continued up through high school. Then in the late 1980s I got on the Internet and well you know the rest.

"Back Then", a high percentage of everybody I or my grandparents or my friends came into contact with watched one of ABC, NBC, or CBS news most nights. These three networks were a bit different, but they generally they all told the same basic stories as each other.

This was effectively our shared reality. Later in high school as I became more politically focused, I could still talk to anybody, even people who had completely opposite political views as myself. That's because we had a shared view of reality.

Today, tens of millions of people see the exact same footage of an officer-involved shooting - many angles - and draw entirely different 'factual' conclusions.

So yes, 50 years ago, we in the United States generally had a shared view of reality. That was good in a lot of ways, but it also essentially allowed a small set of people in power to convince a non-trivial percentage of the US population that Exxon was a friendly, family-oriented company that was really on their side.

Worth the trade-off? Hard to say, but at least 'back then' it was possible, and even common, to have grounded political discussions with people 'on the other side', and that's pretty valuable.

Schmerika 1/27/2026||
> 'back then' it was possible, and even common, to have ground political discussions with people 'on the other side'

As long as that common ground fell within acceptable parameters: you couldn't talk too much about anything remotely socialist or anti-war.

"The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum."

ghaff 1/26/2026|||
I don't know if it's good or bad but, outside of some megahit films, people mostly don't regularly watch the same TV series. I don't even have live TV myself.
victorperalta 1/26/2026||||
Planet Money recently released an episode that mentions some of these points around drop vs drip programming

https://www.npr.org/2025/12/24/nx-s1-5646673/stranger-things...

eloisant 1/26/2026|||
This is why I like it when streaming services release one episode every week instead of dropping the whole season in one shot.
parpfish 1/26/2026||
i hate when a whole season gets dumped at once for a big binge. it always feels like i'm plugging into the content trough and gorging myself to pass the hours.

you can't talk about a show with somebody until they're also done binging, so there's no fun discussion/speculation (the conversation is either "did you watch that? yeah. <conversation over>" or "you should watch this. <conversation over>").

procflora 1/26/2026|||
The broadcast nature of it is something that I missed just last night. I was walking past several bars as the Seahawks won a big football game, but of course each spot was on a different stream delay so instead of one full-throated simultaneous cheer echoing across the neighborhood it was three or four quieter, distinct cheers spread over 20-30 seconds. Not really a big deal but still, it felt like a lesser experience to this aging millennial.
the_af 1/26/2026||
> It is not the same, quality-wise I think youtube is actually worse than e. g. the 1980s era

Is it though? I of course watched TV as a kid through the 80s and have some feelings of nostalgia about it, but is it true that YouTube today is worse?

I mean, YouTube is nothing in particular. There's all sorts of crap, but Sturgeon's Law [1] applies here. There is also good stuff, even gems, if you curate your content carefully. YouTube can be delightful if you know what to look for. If you don't filter, yeah... it's garbage.

----

[1] https://en.wikipedia.org/wiki/Sturgeon%27s_law

b00ty4breakfast 1/26/2026|||
There are many times more things on YouTube than were ever on TV over its entire lifetime up to the YT era, even discounting old TV show content on YouTube. But it also feels like the ratio of good-to-shit has not remained constant between the two.
sodapopcan 1/26/2026||||
Definitely good stuff on YouTube, but I do miss the curation and, as was talked about here recently I believe, the shared experiences it brought. I'm also crazy addicted to YouTube in a way that I wasn't to TV, but that's another issue.
k4rli 1/27/2026|||
Same as reddit pretty much. 98% is trash but good parts do exist.

Diving into new topics on YT is delightful. The site becomes much better with sponsorblock + ublock origin + hide shorts/trending (unhook or blocktube) + replace clickbait titles and thumbnails (dearrow) + return youtube dislike.

tzs 1/27/2026||
I've sometimes wondered how things would have been different if the TV pioneers had gone with circular CRTs instead of rounded rectangles.

Circles would have had a couple of advantages. First, I believe they would have been easier to make. From what I've read, rectangles have more stress at the corners; rounding the corners reduces it, but still leaves more than a circle has. With circles they could have more easily made bigger CRTs.

Second, there is no aspect ratio, which avoids the whole problem of picking one.

Electronically, the signals to the XY deflectors to scan a spiral out from the center (or in from the edge if you prefer) on a circle are as easy to make as the signals to scan in horizontal lines on a rectangle.
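
Just to sketch what I mean, here's a toy Python/numpy version (every name and number here is made up for illustration, not from any real standard): where a raster scan uses two sawtooth sweeps, the spiral is just a per-frame radius ramp multiplied by cosine and sine at the turn frequency.

    import numpy as np

    # Toy numbers, not from any real TV standard.
    FRAME_RATE = 30           # spiral frames per second
    TURNS_PER_FRAME = 500     # spiral turns per frame (the "line count")
    SAMPLE_RATE = 1_000_000   # waveform samples per second

    def spiral_deflection(n_frames=1):
        """X/Y deflection signals (normalized -1..1) for an outward
        spiral scan: a linear radius ramp times cos/sin at the turn
        frequency, with the ramp resetting every frame."""
        t = np.arange(0, n_frames / FRAME_RATE, 1 / SAMPLE_RATE)
        phase = (t * FRAME_RATE) % 1.0   # 0..1 position within each frame
        r = phase                        # radius ramps 0 -> 1, then snaps back
        theta = 2 * np.pi * TURNS_PER_FRAME * phase
        return r * np.cos(theta), r * np.sin(theta)

    x, y = spiral_deflection()  # one frame's worth of deflection samples

One wrinkle I'm glossing over: at constant angular velocity the beam lingers near the center and races at the rim, so presumably a real set would have to vary beam current or angular speed with radius to keep the brightness even.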

As far as I can tell that would have been fine up until we got computers and wanted to use TV CRTs as computer displays. I can't imagine how to build a bitmapped interface for such a CRT that would not be a complete nightmare to deal with.
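
To make the nightmare a bit more concrete, here's a toy readout (again hypothetical Python, nothing standard about it) that samples a square framebuffer along the spiral with nearest-pixel lookup. Near the center consecutive samples hammer the same few pixels, while out at the rim adjacent turns skip pixels entirely, so you'd need radius-dependent filtering just to draw a clean straight line.

    import numpy as np

    def spiral_readout(framebuffer, turns=500, samples_per_turn=800):
        """Read a square framebuffer in spiral-scan order, nearest-pixel.
        Demonstrates the non-uniform sampling: heavy oversampling near
        the center, gaps between adjacent turns near the edge."""
        h, w = framebuffer.shape
        cy, cx = (h - 1) / 2, (w - 1) / 2
        phase = np.linspace(0, 1, turns * samples_per_turn, endpoint=False)
        r = phase * min(cx, cy)              # radius ramp, center -> edge
        theta = 2 * np.pi * turns * phase
        ys = np.round(cy + r * np.sin(theta)).astype(int)
        xs = np.round(cx + r * np.cos(theta)).astype(int)
        return framebuffer[ys, xs]           # brightness stream for the beam

    samples = spiral_readout(np.random.rand(480, 480))  # ~400k samples/frame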

rhplus 1/27/2026||
I would guess that even at the time a circular viewport would have seemed a bit weird, and so rectangular was preferred. After all, theater stages, most windows, photographs and books - all commonplace - aren't circular either.
account42 1/27/2026||
Futurism of the time was all about round shapes though.
icehawk 1/27/2026|||
Picture tubes started round, and then became rectangular:

https://www.earlytelevision.org/prewar_crts.html

They didn't really have the problem of picking an aspect ratio because motion pictures already existed, and those were 4:3.

abcde666777 1/27/2026|||
Regarding aspect ratio, I'd bet they would have explored oval shapes before long.
ortusdux 1/27/2026|||
Reminds me of how every single piece of paper on Battlestar Galactica has the corners cut off. Somewhere in their timeline paper became 8-sided, and it's just as odd as our 4-sided paper and rectangular TVs.
Schmerika 1/27/2026||
Whoa. Guess I need to watch the entire series again :)
bobthepanda 1/27/2026||
Circles don’t pack together well. And they need a different solution for standing up.
hahahahhaah 1/27/2026||
Circle tube, rectangle case.
bobthepanda 1/28/2026||
The issue is not so much that you can't pack them at all, but that any packing solution is going to waste a lot of space in the truck compared to a bunch of box-shaped TVs.
ofrzeta 1/26/2026||
Neil Postman's theory still holds up, and it extends to the Internet:

https://en.wikipedia.org/wiki/Amusing_Ourselves_to_Death

willturman 1/26/2026|
> In the introduction to Amusing Ourselves to Death, Postman said that the contemporary world was better reflected by Aldous Huxley's Brave New World, whose public was oppressed by their addiction to amusement, rather than by Orwell's work, where they were oppressed by state violence.

And modern America asked itself, why can't it be both?

mrandish 1/26/2026||
Early television was a hotbed of hacker/hobbyist DIY experimentation much like early radio and early personal computers. The first issue of "Television Magazine" from 1928 (https://comicbookplus.com/?dlid=37097) has a remarkably similar vibe to 1970s computer zines (https://archive.org/details/kilobaudmagazine-1977-01/).

For example, page 26 has directions on how to pop by the local chemist to pick up materials to make your own selenium cell (your first imager) and page 29 covers constructing your first Televisor, including helpful tips like "A very suitable tin-plate is ... the same material out of which biscuit tins and similar light tinware is made. It is easily handled and can readily be cut with an ordinary pair of scissors. It is sold in sheets of 22 inches by 30 inches. Any ironmonger will supply these."

augusteo 1/26/2026||
The Baird vs Farnsworth debate reminds me of similar discussions in tech. The first demo rarely becomes the dominant standard.

What strikes me is how fast the iteration was. Baird went from hatboxes and bicycle lenses to color TV prototypes in just two years. That's the kind of rapid experimentation we're seeing with AI right now, though compressed even further.

smithza 1/27/2026|
Amusing Ourselves to Death by Neil Postman is the book that comes to mind with this article link. 2 years ago my wife and I took the TV off the wall. My kids don't have Bluey or the latest Disney cartoon to keep them company. I am not going back... It has been the most blissful time. Amazing that the TV is not required to lead a thriving life despite what the incessant sales-industrial-complex will tell you.
briantoknowyou 1/28/2026|
Ew. Reeks more of a paranoid victim complex than true embodied virtue. I’d consider therapy there uh bud