Posted by rbanffy 3 days ago
For the kids today, this is why we used to have Mouse Trails in settings!
I just checked on my Mac, and we no longer seem to have that option.
I can't find a screenshot of the control panel, but here's a video of a PB1400 with it turned on: https://www.reddit.com/r/VintageApple/comments/rohlp0/how_do...
Probably stems from the days of using computers with much lower resolutions where the mouse pointer was therefore relatively large and easy to find. My Amiga 500 typically ran at either 320 x 256 or 640 x 256 (with rectangular pixels), but the mouse pointer was a 16 x 16 hardware sprite, which locked to the lower resolution IIRC, so it was always 5% of the width of the screen, and 6.25% of the height. This is absolutely enormous by today's standards, even with the mouse cursor enlarged to, not its maximum size on macOS, but its maximum useful size.
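The percentages above are just the sprite dimensions divided by the screen dimensions; a quick sanity check of that arithmetic (screen and sprite sizes taken from the comment, not from any Amiga documentation):

```python
# Hardware sprite size vs. a low-res Amiga 500 screen mode,
# using the figures mentioned above (16x16 sprite, 320x256 screen).
sprite_w = sprite_h = 16
screen_w, screen_h = 320, 256

width_fraction = sprite_w / screen_w    # 0.05   -> 5% of screen width
height_fraction = sprite_h / screen_h   # 0.0625 -> 6.25% of screen height

print(f"{width_fraction:.2%} of width, {height_fraction:.2%} of height")
# prints: 5.00% of width, 6.25% of height
```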
If anyone at Apple is listening, highlighting the screen where the pointer is (dimming the others) or just having the option of resetting the position to a known place, would work just fine.
If Atari and Amiga had won instead, what would the world look like?
What would the server world look like? Would there be some weird "Amiga Server Enterprise Edition"? Would servers just be Linux without any meaningful competition?
Would Atari have shaken the world by introducing a new CPU that resulted in amazing battery life compared to the Amiga competition? Would some of us be using AtariPhones? Would Android be a thing?
Would retrocomputing folks talk about their Windows 3.1 boxes the way that Ataris and Amigas are currently talked about?
The platforms would have needed to be opened up to clone builders to reach critical mass.
Amiga was a lot like DOS/Classic Mac OS: single user, unprotected memory... we would have seen it added onto, like Windows 3.x/9x, until a rewritten version with the right stuff took over (like Windows NT/2000/XP did).
Someone like Linus would have still likely written a UNIX clone and open-sourced it.
The minicomputers and UNIX servers/workstations would have still hung around for a while. The real trick is the Amiga CPU and the rest of the hardware would need to keep getting improved, catch up in speed, reach 64 bits, SMP...
AmigaOS actually implemented real preemptive multitasking and was much more "modern" than MacOS and Windows from the same era, and a lot of things were actually unix-like! Comparing it to DOS is like an insult :) But yes, memory was unprotected because most Amiga CPUs didn't have an MMU (but if you did have one, you could use this https://www.nightvzn.net/portfolio/web/amiga_monitor/archive... ).
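For readers who never used it: AmigaOS Exec scheduled tasks preemptively and had them communicate through asynchronous message ports. A loose analogy in Python, using OS threads and a queue as stand-ins (these names are not the real exec.library API, just an illustration of the message-port pattern):

```python
import queue
import threading

# Loose analogy only: an Exec task blocks on its message port until a
# message arrives, while the scheduler preempts tasks independently.
# Here a thread plays the task and a Queue plays the message port.
port = queue.Queue()
results = []

def task():
    # roughly WaitPort() + GetMsg(): block until a message arrives
    msg = port.get()
    results.append(msg.upper())

t = threading.Thread(target=task)
t.start()
port.put("hello from the main task")  # roughly PutMsg()
t.join()
print(results[0])  # prints: HELLO FROM THE MAIN TASK
```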
The initial game focus was the outcome of various factors during Amiga's transition from idea into a shippable product.
There are some vintage Q&A sessions that go through this.
Not that much. More standardization on the Unix side and we'd all be happy.
In the IBM compatible world, the clones drove down the price then drove forward progress. It is doubtful that much of a clone market would develop in the Amiga/Atari world since the parent companies were already competing against IBM compatibles on price. Without clones to break free (as happened in the IBM compatible world), there wouldn't be clones to drive forward progress. I'm not even sure cloning the Amiga would be practical. Apparently Commodore had enough trouble "cloning" the Amiga (i.e. developing higher performance, yet compatible machines). Without the clones driving progress, companies like SGI and Sun would likely still be in the picture.
If Amiga/Atari domination somehow did happen, I suspect the CPU situation would be flipped around, with Motorola having both the incentive and finances to continue on with a fully compatible 680x0 line of processors and Intel chasing after its own equivalent of a 680x0-to-PPC transition.
As for the retrocomputing thing: DOS/Windows 3.x nostalgia is very much a thing in today's world. In that alternate reality, they would likely be higher profile (as Amiga/Atari are today).
If the PC didn't establish itself, Commodore might have opted to release the 900 as a Unix workstation. With any luck, they'd port Coherent to the new Amiga and it'd be a Unix machine from day 1.
This is my fantasy, you're welcome to enjoy it while you're here and remember, no shoes on the couch.
This was also the Mac's distinguishing feature at the time. It still is, to some extent. A lot of what drove mass adoption of home computers was that everyone wanted to bring the same computing environment, i.e. OS, they used at work or at school home as well. This was DOS/Windows or System 7.
I suspect Commodore's SVR4 port would have played a role: https://en.wikipedia.org/wiki/Amiga_Unix
Likewise, Atari had a Unix port on the TT workstation: https://en.wikipedia.org/wiki/Atari_TT030
It had/has a POSIX(ish) API, device files, mountable filesystems, pre-emptive multitasking, TCP/IP, etc. while still being able to run classic TOS binaries.
You can run this in an emulator or on hardware still today and it still gets active development, under the GPL.
ARM was introduced by Acorn, even if there was some Apple money.
As for the rest, we would keep having nice graphical desktop environments, with interesting paradigms instead of trying to fit GUIs into UNIX clones.
And vertical integration, plenty of it, as it has become the norm again nowadays.
Linus would still get frustrated by the AIX'es and Solaris'es of the world, AT&T would still try to get rid of any BSD being publicly offered and Linux would still be invented under the GPL and get the GNU userland for free.
The biggest difference is that it would be currently used on more architectures than the two it's mostly used on these days.
Except, not really. If you work at a startup or business that has to deal with "vertical integration" at-cost, your first goal is to get rid of it. Fly.io, Datadog, managed K8s, all of this stuff is literally first-to-go when scaling a business that needs to be profitable. Business-conscious users know they're being fucked over whether it's Apple, Microsoft or Oracle - you can't market it as "integration" to people that see the monthly bill.
And in the EU, vertical integration from large companies that can EEE their competitors is under extreme scrutiny now. Many execs have exploited it to gain an unfair advantage and deserve the same scrutiny Microsoft got for their "integration" case.
If American governance shows the same competence, "vertical integration" will be the most feared business model of the 21st century and innovation will be put back on the table. For everyone.
A large majority of the population is using unupgradable laptops, where at most the memory sticks, hard drive, and battery can be changed.
I haven't touched a desktop at work since 2006.
Some even make do with a tablet for their computing needs.
Likewise, for those that have servers in-house, those are no longer PC towers under a desk, but rather pizza-box slices in server racks.
This wouldn’t happen with Atari or Amiga.
Atari machines were cheap. Amiga not so much, but, eventually, they would catch up.
I love the Amiga - it represented a unique point in time that coalesced a lot of interesting technologies and people together trying to do something interesting - but it was as far from a technology that had long term potential as you could get, pretty much in every way.
My understanding is that AmigaOS syscalls were basically JSRs?
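More or less: Amiga library calls were ordinary JSR instructions through a per-library jump table, at a fixed negative offset from the library base pointer (held in A6), with no trap instruction or privilege transition. A toy sketch of that dispatch idea in Python (the class, offsets, and function names are invented for illustration, not the real exec.library vectors):

```python
# Toy model of Amiga-style jump-table dispatch: a "library base"
# whose entry points live at fixed negative offsets, so a call is
# just an indirect subroutine jump, like JSR offset(A6).
# All offsets and names below are fictitious.

class Library:
    def __init__(self):
        self.vectors = {}  # offset -> function (the jump table)

    def add_vector(self, offset, fn):
        assert offset < 0, "library vectors live at negative offsets"
        self.vectors[offset] = fn

    def call(self, offset, *args):
        # no trap, no user/supervisor mode switch: just an indirect call
        return self.vectors[offset](*args)

def open_window(title):
    return f"window: {title}"

intuition = Library()
intuition.add_vector(-204, open_window)   # fictitious offset
print(intuition.call(-204, "hello"))      # prints: window: hello
```

The upside was that calls were cheap and libraries were trivially replaceable; the downside, as noted above, was that nothing stopped a buggy caller from scribbling over anyone else's memory.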
The original shipped OS was basically a CP/M fork with a PC-DOS-ish flavor, but GEM on top of it showed more attention to cleaner architectural concerns, though it was never really used to its full intent.
How different might the IT world look today if we had had a deluge of Amiga/ST clones?
Although I suspect that even if all that stuff survived well into the internet era, the rise of web-based UIs would have lost everything interesting about each platform and rounded every corner to deliver the boring, ugly cross-platform software that is so popular today.
I think only Evernote did a good job of cross-platform where they wrote a platform specific UI layer onto a common foundation that did all the work and communicated with the servers. Even that didn't last because eventually they also bought into Electron which is basically the gray goo of software.
It's a bummer.
Still got it, it still works, and Turbo C still boots up just as quick.
But probably there wouldn't have been much of a market for that. Computer-driven live music performance was still very exotic. Laptop jockeys were a decade away.
Computer-driven live music performance was very much a thing long before 1991. The 'computers' in question were analog sequencers using control voltages, and things like the LinnDrum providing click tracks to trigger sync. Roland expanded on this with the release of their TR-808 drum machine and sequencer in 1980, utilising a precursor to MIDI known colloquially as DIN-Sync or Sync24 https://en.wikipedia.org/wiki/DIN_sync
This gave way with MIDI to the sequencing of outboard gear via a variety of hardware sequencers and computer/DAW combos - bringing us to the Atari ST and the first few generations of PPC and G3 Towers as we entered the true age of the PC DAW.
Speaking of Synclavier, Kraftwerk's crashed live in 1991... https://www.setlist.fm/setlist/kraftwerk/1991/philipshalle-d...
It was originally envisaged as the 'Dartmouth Digital Synthesizer', borrowing the then-innovative FM synthesis technology from Stanford which was eventually the basis for the Yamaha DX line of synths, with the DX7 being the indisputable king of late-80s popular music.
That 24-bit, 50kHz sample rate and the AD/DA converters were glorious, but even the workflow and palette editing functionality were so unique and revolutionary that there's value in a full 1:1 software emulation. I've had a lot of fun playing 'guess the hit single' with the Synclavier and Fairlight emulations in the Arturia Collection.
https://www.arturia.com/products/software-instruments/syncla...
On one level, I'm absolutely onboard with this perspective. On the other hand I think this is bending the definition a little bit too far. What we're specifically discussing here is using general purpose portable computers as part of a live performance.
The Fairlight CMI falls into an interesting middle ground because, at least in theory, you could probably have created and run general purpose software applications on it. Would have made a pretty wild (and ludicrously overpriced) word processor or spreadsheet station. But, of course, the software it ran was all geared towards music production, and is a very direct forerunner of the kinds of music production software that would become increasingly available for general purpose computers.
Definitely a wild and innovative time.
That said, from memory I'm pretty positive there are a few 'sidenotes' in the era which would have utilised general purpose portable computers as part of a live performance: the UK synthpop acts cobbling together gear post-Depeche Mode's 1981 'Speak and Spell' album, with stuff like the Alpha Syntauri setup for the Apple II used by Herbie Hancock and Laurie Spiegel coming to mind.
https://www.vintagesynth.com/syntauri/alphasyntauri
You then get even more niche, for the sake of academic argument, with the Amiga demo and mod scene, which often focused on the use of Tracker MODs for live performance and 'DJing' on COTS consumer PC hardware.
I'd also eat my hat if there weren't Jazz and new-wave artists utilising the FM Chips in the early NEC and PC-88 style line at the time - i.e. the natural progression of the chiptune scene going full polyphony and fidelity from the MOS chip in the C64.
Here we see Atari Teenage Riot using their namesake live (even in 2010): https://www.flickr.com/photos/clintjcl/5076088906
Though it's edged out by the Amiga, the Atari ST was truly a thing of beauty in its day. My wife was pretty chuffed to hear that a model in the line has her name (of course, the STacy).
No, it doesn't. It looks better in certain important ways:
1) the keyboard has real keys, not those stupid "island" keys that are all the rage now.
2) the screen has a taller aspect ratio, which is better for actual computing work, whereas laptops these days all have wide screens because of economies of scale with TVs and because people want to watch HD video full-screen instead of doing real work.
This looks more similar to machines from the golden age of laptops, which was probably between 2000 and 2010.
I've been considering using an old ThinkPad briefcase I had, and did bring home a ThinkPad laptop bag which was surplus from work, and will probably use that in the future.
Meanwhile, my work laptop (with much smaller, but almost as wide screen) fits in a sling bag.
Biggest non-modern tell for me -- the keyboard is at the bottom of the case and there's no wrist rest. That shift was about as drastic as the hw keyboard -> blank glass of the iphone transition: Pre powerbook, keyboards were at the bottom of the system. There were weird side mount trackballs, or trackpoints. Post powerbook 100/140/170 -- trackball in the bottom center, keyboard above that.
Trackpads came later, but didn't really affect the overall layout.
I wondered if everyday users noticed the omission. Then I was waiting for help in an Apple store and heard a woman come in and tell a salesperson that she and her daughter were happy with their new MacBooks, except for one thing they hated: the lack of a Delete key; she asked if there was a way to remap a key to be Delete.
Backspace vs. Delete is a non-issue for 99% of consumers because they have those keys.
On my desktop, I do most of my email in gmail and mutt, so I've never used "delete" to delete emails. Command-backspace does the same thing as Command-delete in Finder, if you want to delete files that way. I never do.
Mac users also put up with double the keystrokes they should need to delete characters, hitting the right arrow key over and over to get past stuff they want to delete and then Backspacing it away.
The island keyboards came in when they put the keyboard base inside the aluminum case and screwed it in with 60 of the smallest screws you've ever seen. Pain in the ass when you needed to replace one, but _way_ more solid and better typing experience.
However, there was a company called CRI that remade SGI Indy into thick-boy "laptops" for the military.
Edit: found https://web.archive.org/web/20080330045629/http://www.jumbop...
and
Of course, it's just a fantasy, but somehow I feel like an SGI tiBook would've won over a lot of nerds, a lot faster, than Apple's variant did ..
Maybe the next MacPro will be a black cube.
Fake versions were made for the movies "Twister" and "Congo", but AFAIK these were just props (the actual Indy driving the screen was off camera somewhere).
And there were also a few "laptops" with PA-RISC running HP-UX.
Amiga management is baffling at times.