Posted by qassiov 11 hours ago
Circles would have had a couple of advantages. First, I believe they would have been easier to make. From what I've read rectangles have more stress at the corners. Rounding the corners reduces that but it is still more than circles have. With circles they could have more easily made bigger CRTs.
Second, there is no aspect ratio thus avoiding the whole problem of picking an aspect ratio.
Electronically, the signals to the XY deflectors to scan a spiral out from the center (or in from the edge, if you prefer) on a circle are as easy to generate as the signals to scan in horizontal lines on a rectangle.
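As a toy illustration (not any historical circuit), the spiral deflection waveform is just a linearly growing radius multiplying a sine/cosine pair, which is about as simple as a raster's sawtooth plus line counter:

```python
import math

# Hypothetical sketch: XY deflection for an outward spiral scan.
# x(t) = r(t)*cos(angle(t)), y(t) = r(t)*sin(angle(t)), with the radius
# ramping linearly over the frame and one revolution playing the role of
# one raster line. All constants are illustrative.

def spiral_xy(t, frame_time=1/30, turns=525, radius=1.0):
    """XY deflection at time t for one frame of an outward spiral scan.
    `turns` is the number of revolutions per frame (the 'line count')."""
    phase = (t % frame_time) / frame_time       # 0..1 progress through frame
    r = radius * phase                          # radius ramps outward
    angle = 2 * math.pi * turns * phase         # one 'line' per revolution
    return r * math.cos(angle), r * math.sin(angle)

x, y = spiral_xy(0.5 * (1 / 30))   # halfway through a frame: r = 0.5
```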
As far as I can tell that would have been fine up until we got computers and wanted to use TV CRTs as computer displays. I can't imagine how to build a bitmapped interface for such a CRT that would not be a complete nightmare to deal with.
The image is not stored at any point. The receiver and the transmitter are part of the same electric circuit in a certain sense. It's a virtual circuit but the entire thing - transmitter and receiving unit alike - are oscillating in unison driven by a single clock.
The image is never entirely realized as a complete thing, either. While slow phosphor tubes do display a static image, most CRT systems used extremely fast phosphors; they release the majority of the light within a millisecond of the beam hitting them. If you take a really fast exposure of a CRT display (say 1/100,000th of a second), you don't see the whole image on the photograph - only the few most recently drawn lines glow. The image as a whole never exists at the same time. It exists only in the persistence of vision.
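To get a feel for the numbers, here's a toy model (all constants illustrative, assuming a simple exponential phosphor decay) of how many scanlines would still be glowing at the instant a fast photo is taken:

```python
import math

# Sketch: relative brightness of each scanline at the moment the shutter
# fires, assuming exponential decay with an (assumed) 0.3 ms time constant.
# With 525 lines drawn in 1/30 s, each line is ~63.5 us older than the next.

LINES = 525
FRAME_TIME = 1 / 30
TAU = 0.3e-3                       # phosphor decay time constant (assumed)
line_time = FRAME_TIME / LINES     # ~63.5 microseconds per line

def brightness(line_age_s):
    """Relative glow of a line drawn `line_age_s` seconds ago."""
    return math.exp(-line_age_s / TAU)

# How many lines are still above 1% brightness when the photo is taken?
visible = sum(1 for n in range(LINES) if brightness(n * line_time) > 0.01)
```

With these assumed numbers only a couple dozen of the 525 lines are still meaningfully lit, matching the "only the last few lines glow" observation.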
In fact in order to show a feed of only text/logos/etc in the earlier days, they would literally just point the camera at a physical object (like letters on a paper, etc) and broadcast from the camera directly. There wasn’t really any other way to do it.
"And if you tell the kids that today, they won't believe it!"
Just wanted to add one thing, not as a correction but just because I learned it recently and find it fascinating. PAL televisions (the color TV standard in Europe) actually do store one full horizontal scanline at a time, before any of it is drawn on the screen. This is due to a clever encoding in this format: the TV averages two successive scan lines (phase-shifted relative to each other) to draw them. Supposedly this cancels out some forms of distortion. It is quite fascinating that this was even possible with analogue technology. The line is stored in a delay line for 64 microseconds. See e.g.: https://www.youtube.com/watch?v=bsk4WWtRx6M
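A toy model of the averaging trick (simplified: chroma treated as a single phasor per line, with a constant path error; real PAL alternates the phase of the V component, which is what flips the error's sign between lines):

```python
import cmath, math

# Sketch of the PAL idea: a transmission-path error rotates the chroma phase
# by +err on one line; because the encoding alternates phase, the same error
# appears as -err on the next line. Averaging the two lines (what the 64 us
# delay line enables) restores the true hue at the cost of a little saturation.

def received_chroma(true_hue_rad, phase_error_rad, line_parity):
    """Chroma phasor for one line; the error's sign flips on alternate lines."""
    sign = 1 if line_parity % 2 == 0 else -1
    return cmath.exp(1j * (true_hue_rad + sign * phase_error_rad))

hue = math.radians(100)    # the hue the camera actually saw
err = math.radians(15)     # phase distortion in the transmission path
avg = (received_chroma(hue, err, 0) + received_chroma(hue, err, 1)) / 2

recovered_hue = cmath.phase(avg)   # equals the true hue again
saturation = abs(avg)              # reduced to cos(err), slightly under 1
```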
The very first computers (the Manchester Baby) used CRTs as memory - the ones and zeros were bright spots on a "mesh", and the electric charge on the mesh was read and re-sent back to the CRT to keep the RAM fresh (a sorta self-refreshing RAM)
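A toy simulation of the idea (all numbers made up, nothing here measures the real Williams tube): stored charge leaks away, so the machine must keep reading each spot and rewriting it before it falls below the read threshold.

```python
# Toy model of Williams-tube-style self-refreshing memory. Each bit is a
# charge on the CRT face that decays every time step; the refresh cycle
# reads each spot and rewrites 1s back to full charge. Constants are assumed.

DECAY = 0.8        # fraction of charge surviving each tick (illustrative)
THRESHOLD = 0.5    # charge level that still reads back as a 1

def tick(charges):
    """One time step of charge leakage."""
    return [c * DECAY for c in charges]

def refresh(charges):
    """Read each spot and rewrite it: anything still reading 1 is restored."""
    return [1.0 if c > THRESHOLD else 0.0 for c in charges]

bits = [1.0, 0.0, 1.0, 1.0]        # stored word: 1011
for _ in range(100):               # run for a while,
    bits = refresh(tick(bits))     # refreshing after every decay step

# as long as refresh keeps up with decay, the stored word survives
word = [int(c > THRESHOLD) for c in bits]
```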
- Core memory
- Drum memory
- Bubble memory
- Mercury delay line memory
- Magnetic tape memory :P
And probably many more. Remember that computers don't even need to be digital!
:-))))
He said his coworkers would sometimes toss a television capacitor at each other as a prank.
Those capacitors retained enough charge to give the person unlucky enough to catch one a considerable jolt.
This sounds like a brute-force solution compared to just having the display controller read the image as it is being sent and emulate the phosphors.
It basically mods the ROM to allow for a bit more latency when checking the hit targets
How is it even possible that I remember all this? I was 4 years old.
Gemini knows:
The Film: In the Days of the Spartakiad (1956/1957)
The song "Moscow Nights" was originally written for a documentary film called "In the Days of the Spartakiad" (V dni spartakiady), which chronicled a massive Soviet sports competition.
The Scene: In the film, there is a romantic, quiet scene where athletes are resting in the countryside near Moscow at night.
The Music: The song was sung by Vladimir Troshin. It was intended to be background music, but it was so hauntingly melodic that it became an overnight sensation across the USSR and its neighbors.
The Finnish Connection: In 1957, the song became a massive hit in Finland and Estonia. Since you were watching Estonian TV, you likely saw a version where the dialogue or narration was dubbed into Finnish—a common practice for broadcasts intended for Finnish-speaking audiences across the Gulf of Finland.
This is not just the case for early childhood memories, but for anything - the more time passes, the less accurate. It's even possible to have completely "made-up" memories, perceived as 100% real, e.g. through suggestive questioning in therapy.
There was one experiment where researchers got a man's family at a holiday gathering of the extended family to start talking about funny things that had happened to family members when they were children. In particular the man's parents and siblings told about a funny incident that happened to the man during his 3rd grade school play.
The man had earlier agreed to participate in some upcoming psychological research but did not yet know the details, nor had he been told when the research would start.
Later he was contacted, told the research would be starting soon, and asked to come in and answer some background questions. They asked about early non-academic school activities and he told them about his 3rd grade play and the funny incident that happened, including details that his family had not mentioned.
Unbeknownst to the man the research had actually started earlier and the man's family had agreed to help out. That story about the 3rd grade play that his family told was actually given to them by the researchers. None of his elementary school classes had put on any plays.
This sort of thing can be a real problem. People being questioned about crimes (as witnesses or suspects) can get false memories of the crime if the person questioning them is not careful. Or worse, a questioner could intentionally get them to form false memories that they will later recall on the witness stand.
For example, page 26 has directions on how to pop by the local chemist to pick up materials to make your own selenium cell (your first imager) and page 29 covers constructing your first Televisor, including helpful tips like "A very suitable tin-plate is ... the same material out of which biscuit tins and similar light tinware is made. It is easily handled and can readily be cut with an ordinary pair of scissors. It is sold in sheets of 22 inches by 30 inches. Any ironmonger will supply these."
Philo Farnsworth demonstrated a competing technology a few years later, but every TV today is based on his technology.
So, who actually invented Television?
David Sarnoff and RCA was an entirely different matter, of course…
What happened?
I read online that toward the end, Baird was proposing a TV scan rate we'd class as HD quality, which lost out to the 405-line standard (which preceded 625-line/colour)
There is also a quality of persistence in his approach to things, he was the kind of inventor who doesn't stop inventing.
The TV I have now in my living room is closer to a computer than a television from when I grew up (born 1975) anyway, so the word could mean all sorts of things. I mean, we still call our pocket computers "phones" even though they are mainly used for viewing cats at a distance.
Sure enough, this was the system selected as the winner by the U.S. standard-setting body at the time. Needless to say, it failed and was replaced by what we ended up with... which still sucked because of the horrible decision to go to a non-integer frame rate. Incredibly, we are for some reason still plagued by 29.97 FPS long after the analog system that required it was shut off.
When the UK (and Europe) went colour it changed to a whole new system and didn't have to worry too much about backward compatibility. It had a higher bandwidth (8 MHz - so 33% more than NTSC), and was broadcast on new channels separate from the original 405-line ones. It also had features like alternating the phase of every other line to reduce the "tint" or "never twice the same color" problem that NTSC had
America chose 30 fps but then had to slow it by a factor of 1000/1001 to avoid interference.
Of course, by the '90s and the growth of digital there was already far too much stuff expecting "29.97" Hz, so it remained - again for backward compatibility.
The interference was caused when the spectrum of the color subcarrier overlapped the spectrum of the horizontal interval in the broadcast signal. Tweaking the frequencies allowed the two spectra to interleave in the frequency domain.
To this day I am literally dealing with all of these decisions made ~100 years ago. The 1.001 math is a bit younger, from when color was rolled out, but what's a little rounding between friends?
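For the curious, the 1.001 factor falls out of a couple of integer ratios. A quick sketch with exact fractions, using the standard NTSC color relationships (line rate = 4.5 MHz sound carrier / 286, subcarrier = 455/2 times the line rate):

```python
from fractions import Fraction

# The 1.001 arithmetic worked out exactly. For color, the line rate was
# redefined as the 4.5 MHz sound carrier divided by 286, so the color
# subcarrier at 455/2 times the line rate interleaves with the luma
# harmonics and stays clear of the audio.

sound_carrier = Fraction(4_500_000)          # Hz
line_rate = sound_carrier / 286              # ~15734.266 Hz (was 15750)
frame_rate = line_rate / 525                 # 525 lines per frame
subcarrier = line_rate * Fraction(455, 2)    # ~3.579545 MHz

print(float(frame_rate))                     # 29.97002997...
```

The frame rate comes out to exactly 30000/1001, i.e. 30 fps slowed by 1000/1001 - which is why "29.97" is really a repeating decimal, not a round number.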
https://en.wikipedia.org/wiki/Single-sideband_modulation#Sup...
In the United States in 1935, the Radio Corporation of America demonstrated a 343-line television system. In 1936, two committees of the Radio Manufacturers Association (RMA), which is now known as the Consumer Electronics Association, proposed that U.S. television channels be standardized at a bandwidth of 6 MHz, and recommended a 441-line, interlaced, 30 frame-per-second television system. The RF modulation system proposed in this recommendation used double-sideband, amplitude-modulated transmission, limiting the video bandwidth it was capable of carrying to 2.5 MHz. In 1938, this RMA proposal was amended to employ vestigial-sideband (VSB) transmission instead of double sideband. In the vestigial-sideband approach, only the upper sidebands (those above the carrier frequency), plus a small segment or vestige of the lower sidebands, are transmitted. VSB raised the transmitted video bandwidth capability to 4.2 MHz. Subsequently, in 1941, the first National Television Systems Committee adopted the vestigial sideband system using a total line rate of 525 lines that is used in the United States today.
Different pieces of what became TV existed in 1900, the challenge was putting them together. And that required a consensus among powerful players.
Akin to Ed Roberts, John Blankenbaker and Mark Dean invented the personal computer, but Apple invented the PC as we know it.
Philo Farnsworth invented the cathode ray tube. Unless you're writing this from the year 2009 or before, I'm going to have to push back on the idea that TVs TODAY are based on his technology. They most certainly are not.
https://en.wikipedia.org/wiki/Kenjiro_Takayanagi
'Although he failed to gain much recognition in the West, he built the world's first all-electronic television receiver, and is referred to as "the father of Japanese television"'
He presented it in 1926 (Farnsworth in 1927)
However, the father of television was this dude:
https://en.wikipedia.org/wiki/Manfred_von_Ardenne
Better resolution, wireless transmission and Olympics 1936
Nowadays ..... hmmm. I haven't owned a TV in many years. Sadly, YouTube has kind of replaced television. It is not the same; quality-wise I think YouTube is actually worse than, e.g., the 1980s era. But I also don't really want to go back to television, as it also had low quality - and it simply took longer, too. On YouTube I was recently watching old "Aktenzeichen XY ungelöst", in German. The old videos from the 1980s are kind of cool and interesting. I watched the new ones - it no longer made ANY sense to watch ... the quality is much worse, and it is also much more boring. It's strange.
We saw a resurgence of this connection with big-budget serials like Game of Thrones, but now every streaming service has their own must-watch thing and it's basically as if everyone had their own personal broadcast station showing something different. I don't know if old-school television was healthy for society or not, but I do have a feeling of missing out on that shared connection lately.
Mass media isolates individuals who don't have access to it. I grew up without a TV, and when TV was all my neighbors could talk about, I was left out, and everyone knew it.
While other children were in front of the television gaining "shared experience", I built forts in the woods with my siblings, explored the creek in home made boats, learned to solder, read old books, wrote basic computer programs, launched model rockets, made up magic tricks. I had a great childhood, but I had a difficult time connecting with children whose only experiences were these shallow, shared experiences.
Now that media is no longer "shared", the fragmented content that people still consume has diminishing social value -- which in many cases was the only value it had. Which means there are fewer social consequences for people like me who choose not to partake.
Their "shared experience" is, actually, a debilitating addiction to flat, untouchable, and anti-democratic spectacle.
The last hundred years have seen our society drained of social capital, inescapably enthralled by corporate mediators. Mass media encourages a shift from "doing" to "watching." As we consume hand-tailored entertainment in private, we retreat from the public square.
Heavy television consumption is associated with lethargy and passivity, reinforcing an intolerance for unstructured time. This creates a "pseudoworld" where viewers feel a false sense of companionship—a parasocial connection with television personalities—that creates a feeling of intimacy while requiring (and offering) no actual reciprocity or effort.
Television, the "800-pound gorilla of leisure time," has privatized our existence. This privatization of leisure acts as a lethal competitor for scarce time, stealing hours that were once devoted to social interaction—the picnics, club meetings, and informal visiting that constitute the mētis or practical social knowledge of community life.
I miss the days when everyone had seen the same thing I had.
While I was young in 1975, I did watch ABC's version of the news with my grandparents, and continued up through high school. Then in the late 1980s I got on the Internet and, well, you know the rest.
"Back Then", a high percentage of everybody I or my grandparents or my friends came into contact with watched one of ABC, NBC, or CBS news most nights. These three networks were a bit different, but generally they all told the same basic stories as each other.
This was effectively our shared reality. Later in high school as I became more politically focused, I could still talk to anybody, even people who had completely opposite political views as myself. That's because we had a shared view of reality.
Today, tens of millions of people see the exact same footage of an officer involved shooting...many angles, and draw entirely different 'factual' conclusions.
So yes, 50 years ago, we in the United States generally had a shared view of reality. That was good in a lot of ways, but it did essentially allow a small set of people in power to convince a non-trivial percentage of the US population that Exxon was a friendly, family-oriented company that was really on your side.
Worth the trade-off? Hard to say, but at least 'back then' it was possible, and even common, to have grounded political discussions with people 'on the other side', and that's pretty valuable.
You can't talk about a show with somebody until they're also done binging, so there's no fun discussion/speculation (the conversation is either "did you watch that? yeah. <conversation over>" or "you should watch this. <conversation over>").
https://www.npr.org/2025/12/24/nx-s1-5646673/stranger-things...
Is it though? I of course watched TV as a kid through the 80s and have some feelings of nostalgia about it, but is it true that YouTube today is worse?
I mean, YouTube is nothing in particular. There's all sorts of crap, but Sturgeon's Law [1] applies here. There is also good stuff, even gems, if you curate your content carefully. YouTube can be delightful if you know what to look for. If you don't filter, yeah... it's garbage.
----
Edit: to make it clear, I absolutely did not miss having TV for even a second in all of those years.
BTW, I also still have a CRT in constant use - but the sources are now digital (it's my kitchen background TV - I feed it from a Raspberry Pi with Kodi). One great thing about CRTs is that there's no computer inside monitoring what you watch.
I can't watch anything live unless YouTube is showing some live event (which it sometimes does). I could probably watch some live news using Pluto, but I never do.
I remember asking as a teenager whether that was because there are idiots on the box, or because you turn into one when you watch it.
The answer is “yes”
Have not had or watched one in well over 20 years.
And modern America asked itself, why can't it be both?