Posted by qassiov 1/26/2026
The image is not stored at any point. The receiver and the transmitter are part of the same electric circuit in a certain sense. It's a virtual circuit, but the entire thing - transmitter and receiving unit alike - is oscillating in unison, driven by a single clock.
The image is never entirely realized as a complete thing, either. While slow phosphor tubes do display a static image, most CRT systems used extremely fast phosphors; they release the majority of the light within a millisecond of the beam hitting them. If you take a really fast exposure of a CRT display (say 1/100,000th of a second) you don't see the whole image on the photograph - only the few most recently drawn lines glow. The image as a whole never exists at the same time. It exists only in the persistence of vision.
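A crude way to see this numerically - a toy model in Python, not any particular phosphor's real decay curve: treat each scanline's brightness as an exponential decay from the moment the beam draws it, then count how many lines are still above a visibility threshold when you "photograph" the screen mid-frame.

    import math

    # Hypothetical numbers for illustration only: a 525-line frame drawn in
    # 1/30 s, with a fast phosphor whose light output decays with a ~1 ms
    # time constant after the beam passes.
    LINES_PER_FRAME = 525
    FRAME_TIME = 1 / 30          # seconds to draw one frame
    LINE_TIME = FRAME_TIME / LINES_PER_FRAME
    DECAY_TAU = 0.001            # 1 ms phosphor time constant (assumed)
    THRESHOLD = 0.05             # fraction of peak brightness we call "visible"

    def visible_lines_at(snapshot_time):
        """Count lines still glowing above THRESHOLD at a given instant."""
        count = 0
        for line in range(LINES_PER_FRAME):
            drawn_at = line * LINE_TIME
            if drawn_at > snapshot_time:
                continue                         # not drawn yet
            age = snapshot_time - drawn_at
            if math.exp(-age / DECAY_TAU) >= THRESHOLD:
                count += 1
        return count

    # "Photograph" the screen halfway through the frame.
    print(visible_lines_at(FRAME_TIME / 2))      # only a few dozen lines glow

With these made-up constants only about 40-50 lines out of 525 are above the threshold at any instant, which is the "only the last few lines glow" effect in the fast-exposure photograph.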
Just wanted to add one thing, not as a correction but just because I learned it recently and find it fascinating. PAL televisions (the color TV standard in Europe) actually do store one full horizontal scanline at a time, before any of it is drawn on the screen. This is due to a clever encoding used in this format where the TV actually needs to average two successive scan lines (phase-shifted compared to each other) to draw them. Supposedly this cancels out some forms of distortion. It is quite fascinating this was even possible with analogue technology. The line is stored in a delay line for 64 microseconds. See e.g.: https://www.youtube.com/watch?v=bsk4WWtRx6M
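If anyone wants to see why averaging the two lines helps, here's a toy numeric model in Python (my own simplification, not the actual PAL demodulator circuit): treat the chroma of one line as a complex number U + jV, flip the sign of V on the next line as PAL does, apply the same phase error to both, and average after the decoder un-flips V. The hue shift cancels and only a slight desaturation remains.

    import cmath, math

    def pal_average(U, V, phase_error_deg):
        """Toy model of PAL's two-line chroma averaging."""
        err = cmath.exp(1j * math.radians(phase_error_deg))

        # Line n transmits U + jV, line n+1 transmits U - jV ("phase alternation").
        rx_a = (U + 1j * V) * err          # both lines pick up the same phase error
        rx_b = (U - 1j * V) * err

        # Decoder re-inverts V on the alternate line, then averages the two lines
        # (this is what the 64 microsecond delay line makes possible).
        dec_a = (rx_a.real, rx_a.imag)
        dec_b = (rx_b.real, -rx_b.imag)
        return ((dec_a[0] + dec_b[0]) / 2, (dec_a[1] + dec_b[1]) / 2)

    # A 10 degree phase error would shift the hue noticeably on NTSC;
    # here it comes back as the original hue, just slightly desaturated.
    print(pal_average(0.3, 0.4, 10.0))   # ~(0.295, 0.394): same U/V ratio, scaled by cos(10 deg)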
In fact, in order to show a feed of only text/logos/etc. in the earlier days, they would literally just point the camera at a physical object (like letters on paper) and broadcast from the camera directly. There wasn't really any other way to do it.
"And if you tell the kids that today, they won't believe it!"
https://www.youtube.com/watch?v=5Ap_JRofNMs https://www.youtube.com/watch?v=PJpiIyBkUZ4
This mini doc shows the process:
The very first computers (the Manchester Baby) used CRTs as memory - the ones and zeros were bright spots on a "mesh", and the electric charge on the mesh was read and sent back to the CRT to keep the RAM fresh (a sorta self-refreshing RAM)
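A toy sketch of that read-and-rewrite idea in Python (purely illustrative, nothing like the Williams-Kilburn tube's actual electronics): bits are charge spots that leak away, and a refresh loop that periodically reads each spot and rewrites it keeps the data alive.

    # Toy model: each bit is a charge level that leaks away; reading a spot
    # tells you whether it was a 1, and rewriting restores full charge.
    LEAK_PER_TICK = 0.2      # made-up decay rate
    READ_THRESHOLD = 0.5     # made-up sense threshold

    class TubeMemory:
        def __init__(self, bits):
            # store 1-bits as full charge, 0-bits as no charge
            self.charge = [1.0 if b else 0.0 for b in bits]

        def tick(self):
            """Charge leaks away a little every time step."""
            self.charge = [max(0.0, c - LEAK_PER_TICK) for c in self.charge]

        def refresh(self):
            """Read every spot and rewrite it at full strength."""
            self.charge = [1.0 if c >= READ_THRESHOLD else 0.0 for c in self.charge]

        def read(self):
            return [1 if c >= READ_THRESHOLD else 0 for c in self.charge]

    mem = TubeMemory([1, 0, 1, 1, 0])
    for step in range(10):
        mem.tick()
        if step % 2 == 1:        # refresh often enough and the data survives
            mem.refresh()
    print(mem.read())            # [1, 0, 1, 1, 0]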
The CRTs with memory for early computers were actually derived from the special CRTs used in video cameras. There, the image formed by the projected light was converted into a distribution of charge stored on an electrode, which was then sensed by scanning with an electron beam.
Using CRTs as memory was proposed by von Neumann, and in his proposal he used the appropriate name for that kind of CRT: "iconoscope".
Then CRT memories became obsolete almost instantly, due to the development of magnetic core memories, which did not require periodic refreshing and which were significantly faster. The fact that they were also non-volatile was convenient at that early time, though not essential.
Today, due to security concerns, you would actually not want your main memory to be non-volatile, unless you also always encrypt it completely, which creates problems of secret key management.
So CRT memories became obsolete several years before vacuum tubes in computers were replaced with transistors, which happened around 1959/1960.
Besides CRT memories and delay line memories, another kind of early computer memory that quickly became obsolete was magnetic drum memory.
In the cheapest early computers (like the IBM 650), the main memory was not RAM (i.e. neither CRT nor magnetic core), but a magnetic drum (i.e. with sequential, periodic access to data).
- Core memory
- Drum memory
- Bubble memory
- Mercury delay line memory
- Magnetic tape memory :P
And probably many more. Remember that computers don't even need to be digital!
or electric.
Being old enough to have learned video engineering at the end of the analog days, it's kind of fun helping young engineers today wrap their brains around completely alien concepts, like "the image is never pixels" then "it's never digital" and "never quantized." Those who've been raised in a digital world learn to understand things from a fundamentally digital frame of reference. Even analog signals are often reasoned about as if their quantized form was their "true nature".
Interestingly, I suspect the converse would be equally true trying to explain digital television to a 1930s video engineer. They'd probably struggle similarly, always mentally remapping digital images to their "true" analog nature. The fundamental nature of their world was analog. Nothing was quantized. Even the idea "quanta" might be at the root of physics was newfangled, suspect and, even if true, of no practical use in engineering systems.
Yes, before posting I did debate that exact point in my head, with scanlines as the clearest example :-). However, I decided the point is still directionally valid because ultimately most timing-centric analog signal encoding has some aspect of being quantized, if only to thresholds. Technically it would be more correct to narrow my statement about "never quantized" to the analog waveform driving the electron gun as it sweeps horizontally across a line. It always amazes digital-centric engineers weaned on pixels when they realize the timing of the electron gun sweep in every viewer's analog TV was literally created by the crystal driving the sweep of the 'master' camera in the TV studio (and would drift in phase with that crystal as it warmed up!). It's the inevitable consequence of there being no practical way to store or buffer such a high frequency signal for re-timing. Every component in the chain from the cameras to switchers to transmitters to TVs had to lock to the master clock. Live TV in those days was truly "live" to within 63.5 microseconds of photons hitting vacuum tubes in the camera (plus the time it took for the electrons to move from here to there). Today, "live" HDTV signals are so digitally buffered, re-timed and re-encoded at every step on their way to us, we're lucky if they're within 20 seconds of photons striking imagers.
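(For the curious, that 63.5 microsecond figure is just one horizontal line period; a quick back-of-the-envelope check in Python, using the standard NTSC line count and frame rate:)

    LINES_PER_FRAME = 525
    FRAMES_PER_SECOND = 30 / 1.001      # NTSC's 29.97 Hz frame rate

    line_period_us = 1e6 / (LINES_PER_FRAME * FRAMES_PER_SECOND)
    print(round(line_period_us, 1))     # 63.6 microseconds per scanline (~63.5 at exactly 30 fps)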
My larger point though was that in the 1930s even that strict signal timing had to be encoded and decoded purely with discrete analog components. I have a 1950s Predicta television and looking at the components on the boards one can't help wondering "how the hell did they come up with this crazy scheme." Driving home just how bonkers the whole idea of analog composite television was for the time.
> first, how do you fit in the color so a monochrome TV can still show it?
To clarify for anyone who may not know, analog television was created in the 1930s as a black-and-white composite standard defined by the EIA in the RS-170 specification, then in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs (defined in the RS-170A specification). Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with some significant technical compromises which complicated and degraded color analog video for over 50 years.
As I recall there are all kinds of hacks in the design to keep them cheap. For instance, letting the flyback transformer that produces the needed high voltages operate at the same frequency as the horizontal scan rate (~15 kHz), so that one mechanism essentially serves double duty. The same was even seen in microcomputers, where the same crystal needed for TV was also used for the microprocessor - meaning that e.g. a "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.
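For what it's worth, the C64 clock arithmetic bears this out; a quick sketch in Python (crystal frequencies and divisors quoted from memory, so treat them as approximate):

    # NTSC C64: 14.31818 MHz crystal (4x the 3.579545 MHz color subcarrier), divided by 14
    # PAL  C64: 17.734475 MHz crystal (4x the 4.43361875 MHz subcarrier), divided by 18
    ntsc_cpu = 14.31818e6 / 14       # ~1.0227 MHz
    pal_cpu  = 17.734475e6 / 18      # ~0.9852 MHz

    print(round(ntsc_cpu / 1e6, 4), round(pal_cpu / 1e6, 4))
    print(f"PAL is about {100 * (1 - pal_cpu / ntsc_cpu):.1f}% slower")   # ~3.7%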
Indeed! Even in the Playstation 2 era, many games still ran at different speeds in Europe than the U.S. and Japan. There were so many legacy artifacts which haunted computers, games, DVDs and more for decades after analog broadcast was supplanted by digital. And it all arose from the fact the installed base and supporting broadcast infrastructure of analog television was simply too massive to replace. In a way it was one of the biggest accrued "technical debts" ever!
The only regrettable thing is during the long, painful transition from analog to digital, a generation of engineers got the idea that the original analog TV standard was somehow bad - which, IMHO, is really unfair. The reality is the original RS-170 standard was a brilliant solution which perfectly fulfilled, and even exceeded, all its intended use cases for decades. The problems only arose when that solution was kept alive far beyond its intended lifetime and then hacked to support new use cases like color encoding while maintaining backward compatibility.
Analog television was created solely for natural images captured on vacuum tube cameras. Even the concept of synthetic imagery like character generator text and computer graphic charts was still decades in the future. Then people who weren't yet born when TV was created began to shove poorly converted, hard-edged, low-res digital imagery into a standard created to gracefully degrade smooth analog waveforms, and it indeed sucked. I learned to program on an 8-bit computer with 4K of RAM connected to a Sears television through an RF modulator. Even 32 columns of 256x192 text was a blurry mess with color fringes! On many early 8-bit computers, some colors would invert randomly based on which clock phase the computer started on! Red would be blue and vice versa, so we'd have to repeatedly hit reset until the colors looked correct. But none of that craziness was the fault of the original television engineers; we were abusing what they created in ways they couldn't have imagined.
The composite sampling rate (14.32 MHz) is 4x the NTSC color subcarrier, and the component rate (13.5 MHz) was chosen to be an integer multiple of both the NTSC and PAL line rates. Those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc.).
For example, the 720 horizontal pixels of DVD and digital satellite broadcasts was tied to the digital component video standard sampling the active picture area of an analog video scanline at 13.5 MHz to capture the 1440 clock transitions in that waveform. Similarly, 768 (another common horizontal resolution in pre-HD video) is tied to the composite video standard sampling at 14.32 MHz to capture 1536 clock transitions. The history of how these standards were derived is fascinating (https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf).
VGA's horizontal resolution of 640 is simply from adjusting analog video's rectangular pixels to be square (704 active samples x a ~0.909 pixel aspect ratio ≈ 640). It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities present at the time.
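Here's roughly how those numbers fall out, sketched in Python (using the commonly cited figures; the exact active-line duration varies a little by source):

    # Rec. 601 component video samples luma at 13.5 MHz; the active part of a
    # scanline is about 53.3 microseconds, so:
    print(round(13.5e6 * 53.33e-6))          # ~720 samples per active line

    # Composite sampling at 4x the NTSC color subcarrier:
    print(round(4 * 3.579545e6 / 1e6, 2))    # 14.32 MHz

    # Square-pixel equivalent: 704 active samples at a ~0.909 pixel aspect ratio
    print(round(704 * 10 / 11))              # 640 -- VGA's horizontal resolution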
:-))))
He said his coworkers would sometimes toss a television capacitor at each other as a prank.
Those capacitors retained enough charge to give the person unlucky enough to catch one a considerable jolt.
https://news.ycombinator.com/item?id=46355765
"I still have a piece of glass in back of the palm of my right hand. Threw a rock at an old CRT and it exploded, after a couple of hours I noticed a little blood coming out of that part of hand. Many, many years later was doing xray for a broken finger and doctor asked what is that object doing there? I shrugged, doc said, well it looks like it's doing just fine, so might as well stay there. How lucky I am to have both eyes."
https://news.ycombinator.com/item?id=46354919
"2. Throwing a big big stone to an abandoned next to the trashcan CRT TV while I had it placed normally because it didn’t break when I threw it facing up and the next thing I remember after opening my eyes which I closed from the bang was my friends who were further down the road looking at me as it I were a ghost since big big chunks for the CRT glass flew just right next to me.
CRTs were dangerous in many aspects!"
https://news.ycombinator.com/item?id=46356432
"I'll never forget the feeling of the whoosh when I was working as a furniture mover in the early 2000s and felt the implosion when a cardboard box collapsed and dumped a large CRT TV face-down on the driveway, blowing our hair back. When the boss asked what happened to the TV, I said it fell, and our lead man (who had set it on the box) later thanked me for putting it so diplomatically."
The Internet says both HDDs and SSDs have similar average lifespans, but with HDDs it's usually a mechanical failure, so yes, you can often DIY it back to life if you have the right parts. With SSDs it's almost always the memory cells themselves wearing out. On the flip side, data recovery is usually much easier since an SSD will usually keep working in read-only mode for a while, whereas a faulty HDD won't work at all.
It basically mods the ROM to allow for a bit more latency when checking the hit targets.
This sounds like a brute-force solution compared to just having the display controller read the image as it is being sent and emulate the phosphors.
Sometimes people use "Steampunk" for shorthand for both because there are some overlaps in either direction, especially if you are trying for "just" pre-WWI retrofuture. Though I think the above poster was maybe especially trying to highlight the sort of pre-WWI overlap with Steampunk with more electricity but not yet as many cars and "diesel".
Steampunk is "rootable" in the writings of Jules Verne and H. G. Wells and others. We have scifi visions from Victorian and Edwardian lenses. It wasn't needed at the time to explain how you steam power a submarine or a rocket ship, it was just extrapolating "if this goes on" of quick advances in steam power and expecting them to eventually get there.
Similar with a lot of Diselpunk. The 1930s through the 1950s are often referred to as the Golden Age of scifi. There's so much science fiction written in the real world with a zeal for possible futures that never happened. We don't necessarily need a "massive stretch" to explain why technology took a different path or "got stuck" at a particular point. We've plenty of ideas of the exuberance of that era just in the books that they wrote and published themselves.
(Not that we are lacking in literary means to create excuses for the "realism" of retrofuture, either, when we care to. For one obvious instance, the Fallout franchise's nuclear warfare is central to its dieselpunk setting and an obvious reason for technology to get "stuck". For one less obvious reason, I like "For All Mankind" and its "Apollopunk" setting using the excuse of Russia beating the United States to first boots on the Moon and the butterfly impacts that could have had.)
You pretty much need to have both chemistry and electricity, or neither.
Even Jules Verne understood the impossibility (or at least absurd impracticality) of a steam powered submarine, and made Nautilus electric.
It's unclear if internal combustion engines would be developed without electricity, and to what degree they would become practical.
I'm not sure about semiconductors, but the discovery does seem fairly random, and it seems plausible that electronics could just go on with vacuum tubes.
It seems perfectly plausible that nuclear wasn't noticed or practically developed, but, as I said, it just isn't an interesting setting.
How is it even possible that I remember all this, when I was only 4 years old?
Gemini knows:
The Film: In the Days of the Spartakiad (1956/1957)
The song "Moscow Nights" was originally written for a documentary film called "In the Days of the Spartakiad" (V dni spartakiady), which chronicled a massive Soviet sports competition.
The Scene: In the film, there is a romantic, quiet scene where athletes are resting in the countryside near Moscow at night.
The Music: The song was sung by Vladimir Troshin. It was intended to be background music, but it was so hauntingly melodic that it became an overnight sensation across the USSR and its neighbors.
The Finnish Connection: In 1957, the song became a massive hit in Finland and Estonia. Since you were watching Estonian TV, you likely saw a version where the dialogue or narration was dubbed into Finnish—a common practice for broadcasts intended for Finnish-speaking audiences across the Gulf of Finland.
This is not just the case for early childhood memories, but for anything - the more time passes, the less accurate. It's even possible to have completely "made-up" memories, perceived as 100% real, e.g. through suggestive questioning in therapy.
There was one experiment where researchers got a man's family at a holiday gathering of the extended family to start talking about funny things that had happened to family members when they were children. In particular the man's parents and siblings told about a funny incident that happened to the man during his 3rd grade school play.
The man had earlier agreed to participate in some upcoming psychological research but did not yet know the details or when the research would start.
Later he was contacted and told the research would be starting soon, and asked to come in and answer some background questions. They asked about early non-academic school activities and he told them about his 3rd grade play and the funny incident that happened, including details that his family had not mentioned.
Unbeknownst to the man the research had actually started earlier and the man's family had agreed to help out. That story about the 3rd grade play that his family told was actually given to them by the researchers. None of his elementary school classes had put on any plays.
This sort of thing can be a real problem. People being questioned about crimes (as witnesses or suspects) can get false memories of the crime if the person questioning them is not careful. Or worse, a questioner could intentionally get them to form false memories that they will later recall on the witness stand.
Philo Farnsworth demonstrated a competing technology a few years later, but every TV today is based on his technology.
So, who actually invented Television?
David Sarnoff and RCA was an entirely different matter, of course…
What happened?
I read online that towards the end, Baird was proposing a TV scan rate we'd class as HD quality, which lost out to the 405-line standard (which preceded 625/colour).
There is also a quality of persistence in his approach to things; he was the kind of inventor who doesn't stop inventing.
The TV I have now in my living room is closer to a computer than a television from when I grew up (born 1975) anyway, so the word could mean all sorts of things. I mean, we still call our pocket computers "phones" even though they are mainly used for viewing cats at a distance.
Sure enough, this was the system selected as the winner by the U.S. standard-setting body at the time. Needless to say, it failed and was replaced by what we ended up with... which still sucked because of the horrible decision to go to a non-integer frame rate. Incredibly, we are for some reason still plagued by 29.97 FPS long after the analog system that required it was shut off.
When the UK (and Europe) went colour it changed to a whole new system and didn't have to worry too much about backward compatibility. It had a higher bandwidth (8 MHz - so 33% more than NTSC), and was broadcast on new channels separate from the original 405-line ones. It also had features like alternating the phase of every other line to reduce the "tint" or "never twice the same color" problem that NTSC had.
America chose 30 fps but then had to slow it by a factor of 1000/1001 to avoid interference.
Of course, by the 90s and the growth of digital, there was already far too much stuff expecting "29.97" Hz, so it remained, again for backward compatibility.
The interference was caused when the spectrum of the color subcarrier overlapped the spectrum of the horizontal interval in the broadcast signal. Tweaking the frequencies allowed the two spectra to interleave in the frequency domain.
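For anyone who wants to see the arithmetic, here's the usual textbook derivation sketched in Python (the full historical reasoning is more involved, so treat this as a summary rather than gospel): the line rate was nudged so the chroma subcarrier and the 4.5 MHz sound carrier would interleave cleanly, and the frame rate just follows from 525 lines per frame.

    # Black-and-white NTSC: exactly 30 frames of 525 lines per second.
    bw_line_rate = 30 * 525                      # 15750 Hz

    # Color NTSC: the line rate was redefined as 4.5 MHz / 286 so the sound
    # carrier sits at an exact multiple (286) of the line frequency.
    color_line_rate = 4.5e6 / 286                # ~15734.27 Hz
    frame_rate = color_line_rate / 525           # ~29.97 Hz

    # Chroma subcarrier at an odd multiple of half the line rate, to interleave
    # with the luma harmonics:
    subcarrier = 455 / 2 * color_line_rate       # ~3.579545 MHz

    print(round(frame_rate, 3), round(subcarrier / 1e6, 6))
    print(round(30 / frame_rate, 4))             # 1.001 -- the famous factor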
Literally, to this day, I am dealing with all of these decisions made ~100 years ago. The 1.001 math is a bit younger, from when color was rolled out, but what's a little rounding between friends?
Many TV shows (all, before video tape) were shot on film too, but I'm not sure if they were at an even 30 FPS.
https://en.wikipedia.org/wiki/Single-sideband_modulation#Sup...
In the United States in 1935, the Radio Corporation of America demonstrated a 343-line television system. In 1936, two committees of the Radio Manufacturers Association (RMA), which is now known as the Consumer Electronics Association, proposed that U.S. television channels be standardized at a bandwidth of 6 MHz, and recommended a 441-line, interlaced, 30 frame-per-second television system. The RF modulation system proposed in this recommendation used double-sideband, amplitude-modulated transmission, limiting the video bandwidth it was capable of carrying to 2.5 MHz. In 1938, this RMA proposal was amended to employ vestigial-sideband (VSB) transmission instead of double sideband. In the vestigial-sideband approach, only the upper sidebands (those above the carrier frequency), plus a small segment or vestige of the lower sidebands, are transmitted. VSB raised the transmitted video bandwidth capability to 4.2 MHz. Subsequently, in 1941, the first National Television Systems Committee adopted the vestigial sideband system using a total line rate of 525 lines that is used in the United States today.
Different pieces of what became TV existed in 1900, the challenge was putting them together. And that required a consensus among powerful players.
Akin to how Ed Roberts, John Blankenbaker and Mark Dean invented the personal computer, but Apple invented the PC as we know it.
Philo Farnsworth's television was built around the cathode ray tube. Unless you're writing this from the year 2009 or before, I'm going to have to push back on the idea that TVs TODAY are based on his technology. They most certainly are not.
https://en.wikipedia.org/wiki/Kenjiro_Takayanagi
'Although he failed to gain much recognition in the West, he built the world's first all-electronic television receiver, and is referred to as "the father of Japanese television"'
He presented it in 1926 (Farnsworth in 1927)
However, the father of television was this dude:
https://en.wikipedia.org/wiki/Manfred_von_Ardenne
Better resolution, wireless transmission and Olympics 1936
I still have the reels, they look like this:
https://commons.wikimedia.org/wiki/File:Films_Path%C3%A9-Bab...
https://fr.wikipedia.org/wiki/Path%C3%A9-Baby
And we converted some of these reels to digital files (well, my brothers and I asked a specialized company to "digitalize" them).
100 years ago people already had cars, tramways (as a kid my great-grandmother tried to look under the first tramway she saw to see "where the horses were hiding"), cameras to film movies, telephones, the telegraph, you could trade the stock market and, well, it's new to me, but TV was just invented too.
On the other hand, it's just as fascinating to realize that all that, and ~everything that shapes modern life, did not exist until ~200 years ago. Not just appliances, but medicines and medicine, plastics and greases and other products of petrochemical industry and everything built on top of it, paints and cleaners and materials and so on...
Nowadays ..... hmmm. I haven't owned a TV in many years. Sadly YouTube kind of replaced television. It is not the same; quality-wise I think YouTube is actually worse than e.g. the 1980s era. But I also don't really want to go back to television, as it also had low quality - and it simply took longer, too. On YouTube I was recently watching old "Aktenzeichen XY ungelöst", in German. The old videos from the 1980s are kind of cool and interesting. I watched the new ones - it no longer made ANY sense to watch; the quality is much worse, and it is also much more boring. It's strange.
We saw a resurgence of this connection with big-budget serials like Game of Thrones, but now every streaming service has their own must-watch thing and it's basically as if everyone had their own personal broadcast station showing something different. I don't know if old-school television was healthy for society or not, but I do have a feeling of missing out on that shared connection lately.
Mass media isolates individuals who don't have access to it. I grew up without a TV, and when TV was all my neighbors could talk about, I was left out, and everyone knew it.
While other children were in front of the television gaining "shared experience", I built forts in the woods with my siblings, explored the creek in home made boats, learned to solder, read old books, wrote basic computer programs, launched model rockets, made up magic tricks. I had a great childhood, but I had a difficult time connecting with children whose only experiences were these shallow, shared experiences.
Now that media is no longer "shared", the fragmented content that people still consume has diminishing social value -- which in many cases was the only value it had. Which means there are fewer social consequences for people like me who choose not to partake.
Their "shared experience" is, actually, a debilitating addiction to flat, untouchable, and anti-democratic spectacle.
The last hundred years have seen our society drained of social capital, inescapably enthralled by corporate mediators. Mass media encourages a shift from "doing" to "watching." As we consume hand-tailored entertainment in private, we retreat from the public square.
Heavy television consumption is associated with lethargy and passivity, reinforcing an intolerance for unstructured time. This creates a "pseudoworld" where viewers feel a false sense of companionship—a parasocial connection with television personalities—that creates a feeling of intimacy while requiring (and offering) no actual reciprocity or effort.
Television, the "800-pound gorilla of leisure time," has privatized our existence. This privatization of leisure acts as a lethal competitor for scarce time, stealing hours that were once devoted to social interaction—the picnics, club meetings, and informal visiting that constitute the mētis or practical social knowledge of community life.
I miss the days when everyone had seen the same thing I had.
While I was young in 1975, I did watch ABC's version of the news with my grandparents, and continued up through high school. Then in the late 1980s I got on the Internet and well you know the rest.
"Back Then", a high percentage of everybody I or my grandparents or my friends came into contact with watched one of ABC, NBC, or CBS news most nights. These three networks were a bit different, but they generally they all told the same basic stories as each other.
This was effectively our shared reality. Later in high school as I became more politically focused, I could still talk to anybody, even people who had completely opposite political views as myself. That's because we had a shared view of reality.
Today, tens of millions of people see the exact same footage of an officer-involved shooting, from many angles, and draw entirely different 'factual' conclusions.
So yes, 50 years ago, we in the United States generally had a shared view of reality. That was good in a lot of ways, but it did essentially allow a small set of people in power to convince a non-trivial percentage of the US population that Exxon was a friendly, family-oriented company that was really on your side.
Worth the trade-off? Hard to say, but at least 'back then' it was possible, and even common, to find common ground in political discussions with people 'on the other side', and that's pretty valuable.
As long as that common ground falls within acceptable parameters; couldn't talk too much about anything remotely socialist or being anti-war.
"The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum."
https://www.npr.org/2025/12/24/nx-s1-5646673/stranger-things...
you can't talk about a show with somebody until they're also done binging, so there's no fun discussion/speculation (the conversation is either "did you watch that? yeah. <conversation over>" or "you should watch this. <conversation over>").
Is it though? I of course watched TV as a kid through the 80s and have some feelings of nostalgia about it, but is it true that YouTube today is worse?
I mean, YouTube is nothing in particular. There's all sorts of crap, but Sturgeon's Law [1] applies here. There is also good stuff, even gems, if you curate your content carefully. YouTube can be delightful if you know what to look for. If you don't filter, yeah... it's garbage.
----
Diving into new topics on YT is delightful. The site becomes much better with SponsorBlock + uBlock Origin + hiding Shorts/Trending (Unhook or BlockTube) + replacing clickbait titles and thumbnails (DeArrow) + Return YouTube Dislike.
Circles would have had a couple of advantages. First, I believe they would have been easier to make. From what I've read rectangles have more stress at the corners. Rounding the corners reduces that but it is still more than circles have. With circles they could have more easily made bigger CRTs.
Second, there is no aspect ratio thus avoiding the whole problem of picking an aspect ratio.
Electronically the signals to the XY deflectors to scan a spiral out from the center (or in from the edge if you prefer) on a circle are as easy to make as the signals to scan in horizontal lines on a rectangle.
As far as I can tell that would have been fine up until we got computers and wanted to use TV CRTs as computer displays. I can't imagine how to build a bitmapped interface for such a CRT that would not be a complete nightmare to deal with.
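A little sketch of both points in Python (entirely hypothetical, no real hardware implied): generating spiral deflection waveforms really is just a slowly growing radius times a sine and cosine, but mapping a rectangular framebuffer onto samples taken along that spiral - with dots bunched up near the centre and spread out at the rim - is exactly the headache a bitmapped display controller would face.

    import math

    def spiral_scan(num_turns, samples_per_turn, max_radius):
        """Yield (x, y) deflection values for an outward spiral scan."""
        total = num_turns * samples_per_turn
        for i in range(total):
            angle = 2 * math.pi * i / samples_per_turn
            radius = max_radius * i / total          # radius grows linearly
            yield radius * math.cos(angle), radius * math.sin(angle)

    def sample_framebuffer(fb, width, height, x, y):
        """Map a deflection point in [-1, 1] onto a rectangular bitmap --
        the awkward part a bitmapped display controller would have to do
        for every dot, with uneven dot spacing near the centre vs the edge."""
        col = min(width - 1, max(0, int((x + 1) / 2 * width)))
        row = min(height - 1, max(0, int((y + 1) / 2 * height)))
        return fb[row][col]

    # Tiny demo: a 4x4 "framebuffer" read out along a 3-turn spiral.
    fb = [[(r * 4 + c) % 2 for c in range(4)] for r in range(4)]
    beam = [sample_framebuffer(fb, 4, 4, x, y) for x, y in spiral_scan(3, 8, 1.0)]
    print(beam)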
https://www.earlytelevision.org/prewar_crts.html
They didn't really have the problem of picking an aspect ratio because motion pictures existed and that was already 4:3
And modern America asked itself, why can't it be both?
For example, page 26 has directions on how to pop by the local chemist to pick up materials to make your own selenium cell (your first imager) and page 29 covers constructing your first Televisor, including helpful tips like "A very suitable tin-plate is ... the same material out of which biscuit tins and similar light tinware is made. It is easily handled and can readily be cut with an ordinary pair of scissors. It is sold in sheets of 22 inches by 30 inches. Any ironmonger will supply these."
What strikes me is how fast the iteration was. Baird went from hatboxes and bicycle lenses to color TV prototypes in just two years. That's the kind of rapid experimentation we're seeing with AI right now, though compressed even further.