Posted by geox 12/28/2025
- Ghibli studio style graphics,
- the infamous em-dashes and bullet points
- customer service (just try to use Klarna's "support" these days...)
- Oracle share price ;) - imagine being one of the world's most solid and unassailable tech companies, losing to your CEO's crazy commitment to the LLMs...
- The internet content - we now triple-check every Internet source we don't know to the core...
- And now also the chips?
Where does it stop? When we decide to drop all technology as it is?
Is electronic music, where the artist composes it on a screen and then hits 'play', music? I think it is, of course, but I have had experiences where I went to see a musician "live" and well... they brought the laptop with them. But I think it still counts. It was still fun.
AI slop is to AI art what point and shoot amateur photography is to artistic photography. The difference is how much artistic intent and actual work is present. AI art has yet to get people like Ansel Adams, but it will -- actual artists who use AI as a tool to make novel forms and styles of art.
(I used an emdash!)
This is an outstanding read: https://medium.com/@aaronhertzmann/how-photography-became-an...
Anti-photography discourse sounds exactly like anti-AI discourse to the point that you could search and replace terms and have the same rants.
Another thing I expect to see is novelists using AI to create at least passable live action versions of their stories. I don't think these will put real actors or actresses out of work for a long time, but I could see them serving as "sizzle reels" to sell a real production. If an author posts their AI-generated film of their novel and it gets popular, I could see a studio picking it up and making a real movie or TV show from it.
If X composes something, X is an artist. The person playing a composed work is a performer. Some people have both the roles of artist and performer for a given work.
To say an AI composes something is anthropomorphizing a computer. If you enter a prompt to make a machine generate work based on existing artists' art, you're not composing (in the artistic sense) and neither is the computer. Math isn't art even if it's pretty or if mathematical concepts are used in art.
The term "director" instead of "composer" or "artist" conveys what's happening a lot better when it comes to telling machines to generate art via prompts.
It does not matter if they are labeled "composer" or "director". It is the product that counts.
"....I know what I like"
If someone is entering a prompt to generate an image in a model I have access to, I don't really need to pay them to do it, and definitely don't need to pay them as much to do it as I would an actual artist, so it is deceptive for them to represent themselves as someone who could actually draw or paint that. If the product is what counts then truth in advertising is required so the market can work.
It takes a rare genius to make a new style, and they come along a few times a generation. And even they will often admit they built on top of existing styles and other artists.
I'm not a fan of AI work or anything, but we need to be honest about what human 'creativity' usually is, which for most artists is basically copying the trends of the time with at most a minor twist.
OTOH, I think when you start entering the fringes of AI work you really start seeing how much it's just stealing other people's work. With more niche subjects, it will often produce copies of the few artists in that field with a few minor, often bad, changes.
It bothers me that all of the AI "artists" insist that they are just the same as any other artist, even though it was the AI that did all of the work. Even when a human artist is just copying the styles they've seen from other artists, they still had to put in the effort to develop their craft to make the art in the first place.
Decades later we still don't consider anyone using that function a musician.
>actual artists who use AI as a tool to make novel forms and styles of art.
writing a prompt lol
We don't compare Usain Bolt to Lewis Hamilton when talking about the fastest runners in the world.
But hey think about how much money you could save on a wedding photographer if you just generate a few images of what the wedding probably looked like!
There is a "demo" button on synthesizers that plays a canned melody, therefore playing canned melodies is all synthesizers can do, therefore nobody that uses a synthesizer is a real musician.
If I commission Michelangelo to paint me a picture, that doesn't make me a Renaissance artist.
Among the traditional artists I follow, maybe 1 out of 10 posts is directly about selling something. With AI artists, it’s more like 9 out of 10.
It might take a while for all the grifters to realize that making a living from creative work is very hard before more genuinely interesting AI art starts to surface. I started following a few because I liked an image that showed up in my feed, but quickly unfollowed after being hit with a daily barrage of NFT promotions.
Electronic music is analogous to digital art made by humans, not generated art.
Early synthesizers weren't that versatile either. Bands like Pink Floyd actually got into electronics and tore them apart and hacked them. Early techno and hip-hop artists did similar things and even figured out how to transform a simple record player into a musical instrument by hopping the needle around and scratching records back and forth with tremendous skill.
https://www.youtube.com/watch?v=NnRVmiqm84k
https://www.youtube.com/watch?v=ekgpZag6xyQ
Serious AI artists will start tearing apart open models and changing how they work internally. They'll learn the math and how they work just like a serious photographer could tell you all about film emulsions and developing processes and how film reacts to light.
Art's never about what it does. It's about what it can do.
How many subjects exist in the world to be photographed? How many journeys might one take to find them? How many stories might each subject tell with the right treatment?
> Serious AI artists will start tearing apart open models and changing how they work internally. They'll learn the math and how they work just like a serious photographer could tell you all about film emulsions and developing processes and how film reacts to light.
I agree that "AI art" as it exists today is not serious.
"The early adopters of new technologies are usually porn and the military." Forget where I heard that but it's largely true.
Also, photography has the added benefit of documenting the world as it is, but through the artist's lens. That added value does not exist when it comes to slop.
When's the last time someone who said something like that was right?
Art shouldn't make you feel comfortable and safe. It should provoke you, and in this sense AI art is doing the job better than traditional art at the moment.
Other than the technological aspect, there's nothing new under the sun here. And at its very worst, AI art is just Andy Warhol at hyperscale.
https://wbpopphilosopher.wordpress.com/2023/05/07/andy-warho...
Similarly, I'm not sure that argument is making the point those who deploy it intend to make.
But TBF, performance art theatre is art as well.
The end game IMO will be the incorporation of AI art toolsets into commercial art workflows and a higher value placed on 100% human art (however that ends up being defined), and then we'll find something new and equally idiotic to trigger us, or else we might run out of excuses and/or scapegoats for our malaise.
I don't even really believe serious artists need to totally exclude themselves from using genAI as a tool, and I've heard the same from real working artists (generally those who have established careers doing it). Unfortunately, that point inhabits the boring ideological center and is drowned out by the screaming from both extremes.
Jumpscares and weapons being used against others aren't art.
It's exciting being able to say that I am an artist, I always wondered what my life would have been had I gone into the arts, and now I can experience it! Thank you techmology.
Now if you wanted to define art to require 100% bodily fluids and solids, 100% handcrafted, to be the only real art, that I'd understand.
https://youtu.be/A2H62x_-k5Q?si=EHq5Y4KCzBfo0tfm
https://youtu.be/rzCpT_S536c?si=pxiDY4TPhF_YLfRc
[0] https://www.youtube.com/watch?v=TGIvO4eh190 (warning, lots of disturbing imagery)
Artists use their medium to communicate. More often than not, everything in a piece is deliberate. What is being communicated here? Who deliberated on the details?
Those videos are as much "art" as Marvel's endless slop is "art".
This is like saying a director isn’t really an artist simply because all they do is direct things.
I mean, you are building a prompt and tweaking it. And even if you didn't do that, you could still argue that finding it is in itself a creative act akin to found art [1].
I'm genuinely amazed at how some people perceive art.
Not saying you have to agree, but it is a distillation of how some portion of the world sees the world.
You seem so sure that you'll always be able to tell what you're looking at, and whether it's the result of prompting or some unspecified but doubtlessly-noble act of "creativity."
LOL. Not much else can be said, but... LOL.
Sorry... It's all slop buddy. The medium is the message, and genAI's message is "I want it cheap and with low effort, and I don't care too much about how it looks"
If you haven't ever written a novel, or even a short story, you cannot possibly imagine how much of your own weird self ends up in it, and that is a huge part of what will make it interesting for people to read. You can also express ideas as subtext, through the application of technique and structure. I have never reached this level with any form of visual art but I imagine it's largely the same.
A prompt, or even a series of prompts, simply cannot encode such a rich payload. Another thing artists understand is that ideas are cheap and execution is everything; in practice, everything people are getting out of these AI tools is founded on a cheap idea and built from an averaging of everything the AI was trained on. There is nothing interesting in there, nothing unique, nothing more than superficially personal; just more of the most generic version of what you think you want. And I think a lot of people are finding that that isn't, in fact, what they want.
There is a literal cliche "my six year old could've done this" about how some of the techniques do not require the years of training they used to.
And a literal cliche response about how the eye and execution is the current determining factor: "but they didn't."
> Sorry... It's all slop buddy. The medium is the message, and photography's message is "I want it cheap and with low effort, and I don't care too much about how it looks"
Hmm... it seems like you have failed to actually make an argument here
Photography is neither cheap nor low effort. Ask AI about it.
> Hmm... it seems like I have succeeded at making an argument here
The logical implication of your view is that if someone or something has a halo, they can shit in your mouth and it's "good." The medium is the message, after all.
This is the same pretentious art bullshit that regular people fucking hate, just repackaged to take advantage of public rage at tech bro billionaires.
you enjoy your industrial effluent, I'm gonna stick to human artists making art
Whenever you want.
Of course you can't directly control what other people do or how much they use technology. But you have lots of direct control over what you use, even if it's not complete control.
I stopped taking social media seriously in the early 2010's. I'm preparing for a world of restricted, boring, corporate, invasive Internet, and developing interests and hobbies that don't rely on tech. We've had mechanisms to network people without communications tech for thousands of years, it's probably time to relearn those (the upper classes never stopped using them). The Internet will always be there, but I don't have to use it more than my workplace requires, and I can keep personal use of it to a minimum. Mostly that will mean using the Internet to coordinate events and meeting people and little else.
Can you tell me more about these? I’m actively trying to find ways to cultivate my community.
Membership organizations - country clubs, professional associations, alumni networks, charitable boards
Personal introductions and referrals - being introduced through mutual acquaintances
Cultural and civic participation - involvement in local institutions, community organizations, religious groups
But most of these don't exist or help with socializing and making new connections where I live (medium sized European university city).
Everyone here only hangs out with their family and school/university mates and that's it. Any other available events are either for college students or lonely retirees but nothing in between.
If you can get a few people from 2 of these groups together more than once, you've started solving this problem. Of course keeping it going for a long time is a challenge, and you want to avoid always being in the situation where you are doing all the work and others aren't contributing, but it gets easier and better with experience.
It doesn't stop. This is because it's not the "technology" driving AI. You already acknowledged the root cause: CEOs. AI could be great, but it's currently being propped up by sales hype and greed. Sam wants money and power. Satya and Sundar want money and power. Larry and Jensen want to also cash in on this facade that's been built.
Can LLMs be impactful? For sure. They are now. They're impacting energy consumption, water usage, and technology supply chains in detrimental ways. But that's because these people want to be the ones to sell it. They want to be the ones to cash in. Before they really even have anything significantly useful. FOMO in this C-suite should be punishable in some way. They're all charlatans to different degrees.
Blame the people propping up this mess: the billionaires.
This will scale back when AI replacement attempts slow down as expectations temper (Salesforce, Klarna, etc).
Why isn't that the first question that comes to mind for a journalist covering the latest acquisition? It's like an open secret that nobody really talks about.
In reality, they are hiring because they have a lot of (investment) money. They need a lot of hardware, but they also need people to manage the hardware.
In an alternative reality where their products do what they claim, they would also hire, because the people working there would be able to replace lots of people working in other jobs, and so their workers would be way more valuable than the average one, and everybody would want to buy what they create.
Journalists don't care about it because whatever they choose to believe, or are paid to "believe", is the natural way things happen.
But addressing the specific question: it is still valid. If the product sold is a 10x developer force multiplier, you'd expect to see the company fully utilizing it. Productivity would be expected to increase rapidly as the product matures, independently of any acquisitions made at the same time.
I'm not sure if even the LLM companies themselves are selling shovels yet. I think everyone is racing to find what the shovel of LLMs are.
I think all the AI companies want to be the first to say they have achieved AGI, that moment will be in the history books.
I hope you're right but I imagine with more computing power used more efficiently, the big companies will hoard more and more of the total available human attention.
.. and maybe to ignore whoever can't.
I think you explained it very well. For now all sorts of "creative finance" are being invented to give AI momentum. At the same time, some of us who have to work with this monstrosity for 10 hours a day are nauseated. The same feeling I had towards putrid technology is now extended to generative technology. I would rather fight and lose my job than call this intelligence of any form. It is a generative thingy. I was very enthusiastic in the Tabnine days. Used Copilot since the closed beta. I use it for 10 hours a day. I'd rather not use it, though. I have to use C#. I would kill not to use this bullshit anymore. I would never, ever, touch Microsoft without being paid. I feel the same about AI in general. Betting on AI becoming lame would be the safest bet I ever made. When I see someone worshiping generative technology I just know what to expect, and then I leave. On some level, opinions on generative technology are very similar to politics: tell me how you interact with it and how you feel about it, and I won't ever need to ask a second question. Now, I think this sentiment will inevitably reach the masses. Yeah, sure, I am fatigued and most people don't have to deal with generative tools for 44 hours a week, but it will slowly creep. Tell me again how excited everyone is to fiddle with SAP, Oracle, Microsoft, React components, Vercel. The most shilled convenience of our timeline will become cringe, as always.
I sort of have a similar story with it. I was also one of the earliest GH Copilot users... but now I find it's just utter crap. The one thing that worries me, though, is that while most of the tech folks have grown disillusioned, for each engineer who now rejects LLMs there seem to be 20 "common" people who just absolutely love it for their ephemeral use cases, like planning their next trip, or asking if it will rain tomorrow, and similar. And this sort of usage, I think, quietly underpins the drive. It's not just the CEOs; it is also the masses that absolutely love to use it, unfortunately.
The industry standard should have become 16GB RAM for PCs and 8GB for mobiles years ago, but instead it is as if the computing/IT industry is regressing.
New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6 and UFS 2.2). Meanwhile, features that were being offered in budget phones, e.g., wireless charging, NFC, UFS 3.1, have silently been moved to the premium mobile segment.
Meanwhile the OSes and software are becoming more and more complex, bloated and more unstable (bugs) and insecure (security loopholes ready for exploits).
It is as if the industry has decided to focus on AI and nothing else.
And this will be a huge setback for humanity, especially the students and scientific communities.
Of course, some performance-focused software (e.g. Zed) does start near-instantly on my MacBook, and it makes other software feel sluggish in comparison. But this is the exception, not the rule.
Even as specs regress, I don't think most people in software will care about performance. In my experience, product managers never act on the occasional "[X part of an app] feels clunky" feedback from clients. I don't expect that to change in the near future.
Imagine Ford, upon the invention of push-button climate controls, just layered those buttons on top of the legacy sliders, using arms and actuators so pressing "Heat Up" moved an actuating arm that moved that underlying legacy "Heat" slider up. Then when touch screens came about, they just put a tablet over those buttons (which are already over the sliders), so selecting "Heat Up" fired a solenoid that pressed the "Heat Up" button that moved the arm to slide the "Heat Up" slider.
Ford, or anyone else doing hardware, would never implement this or its analog, for a long, obvious list of reasons.
But in software? That's just Thursday. Hence software has seemed stuck in time for 30 years while processing speed has done 10,000x. No need to redesign the whole system, just type out a few lines of "actuating arm" code.
It's only in the last 5 years that integrated GPUs have become good enough even for mid-tier gaming, let alone running a browser and hardware acceleration in a few work apps.
And even before 5 years ago, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in there for the marketing value of having, e.g., an Nvidia dGPU.
Without dedicated GPUs, we consumers will get only weaker hardware, slower software and the slow death of the graphics software market. See the fate of the Chromebook market segment - it is almost dead, and ChromeOS itself got abandoned.
Meanwhile, the same Google which made ChromeOS as a fresh alternative OS to Windows, Mac and Linux, is trying to gobble the AI market. And the AI race is on.
And the result of all this AI focus and veering away from dedicated GPUs (even by market leader Nvidia, which no longer treats GPUs as a priority) is not only skyrocketing price hikes in hardware components, but also other side effects. E.g., new laptops are being launched with NPUs which are good for AI but bad for gaming and VFX/CAD-CAM work, yet they cost a bomb, and the result is that the budget laptop market segment has suffered - new budget laptops have just 8GB RAM, a 250GB/500GB SSD and a poor CPU, hardware so weak that even basic software (MS Office) struggles on them. And yet even such poor laptops cost more these days. This kind of deliberate market crippling affects hundreds of millions of students and middle class customers who need affordable yet decent-performance PCs.
All the popular mass market games work on an iGPU: Fortnite, Roblox, MMOs, arena shooters, battle royales. A good chunk of cross-platform console titles also work just fine.
You can play Cyberpunk or BG3 on a damn Steam Deck. I won't call this low end.
The number of games that don't run to some extent without a dGPU is limited to heavy AAA titles and niche PC-only genres.
Your experience is extremely weird
The pattern of lazy almost non existent optimization combined with blaming consumers for having weak hardware, needs to stop.
On my 16GB RAM Lunar Lake budget laptop, CachyOS (Arch) runs so much smoother than Windows.
This is very unscientific, but using htop, running Chrome/YouTube playing music, 2 browser games and VS Code with GitHub Copilot reviewing a small project, I was only using 6GB of RAM.
For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine. Assuming I'm using Linux.
All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low-RAM mode that just runs the OS and display manager, then lets me use my computer as I see fit instead of constantly doing a million other weird things.
We don't *need* more ram. We need better software.
Windows OS and Surface (CoPilot AI-optimized) hardware have been combined in the "Windows + Devices" division.
> We don't *need* more ram
RAM and SSDs both use memory wafers and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.
Nvidia is re-inventing Optane for AI storage with higher IOPS, and paid $20B for Groq LPUs using SRAM for high memory bandwidth.
The architectural road ahead has tiers of memory, storage and high-speed networking, which could benefit AI & many other workloads. How will industry use the "peace dividend" of the AI wars? https://www.forbes.com/sites/robtoews/2020/08/30/the-peace-d...
> The rapid growth of the mobile market in the late 2000s and early 2010s led to a burst of technological progress.. core technologies like GPS, cameras, microprocessors, batteries, sensors and memory became dramatically cheaper, smaller and better-performing.. This wave of innovation has had tremendous second-order impacts on the economy. Over the past decade, these technologies have spilled over from the smartphone market to transform industries from satellites to wearables, from drones to electric vehicles.
Why on earth do you think RAM uses NAND flash?
Browsing the web requires more and more RAM each year, but I don't think browsers are the main reason - sites use more and more JS code. With a hard cap many sites will stop working. Software bloat is a natural tendency, the path of least resistance. Trimming weight requires a significant effort and, in the case of the web, a coordinated effort. I don't believe it could happen unless Google (having a browser with >60% market share) forces this, but Google's own sites are among the worst offenders in terms of hardware requirements.
Insane that this is seen as "better software". I could get basically the same functionality in 2000 with 512MB. I assume this is because everything runs through Chrome with dozens more layers of abstraction, but...
512MB in 2000 was like HEDT level (though I'm not sure that acronym existed back then)
64MB = w98se OK, XP will swap a lot on high load, nixlikes really fast with fvwm/wmaker and the like. KDE3 needs 128MB to run well, so get a bit down. No issues with old XFCE releases. Mozilla will crawl, other browsers will run fine.
128MB = w98se really well, XP will run fine, SP2-3 will lag. Nixlikes will fly with wmaker/icewm/fvwm/blackbox and the like. Good enough for Mozilla.
192MB = Really decent for a full KDE3 desktop or for Windows XP with real life speeds.
256MB = Like having 8GB today for Windows 10, Gnome 3 or Plasma 6. Yes, you can run them with 2GB and ZRAM, but, realistically, and for the modern bloated tools, 8GB for a 1080p desktop is mandatory. Even with uBlock Origin for the browser. Ditto back in the day. With 256MB, XP and KDE3 flew and ran much faster than even Win98 with 192MB of RAM.
Win11, on the other hand, meh..
Win10 will stop getting updates, but M$ is mistaken if it thinks it can force customers to switch to the more expensive, buggy, poorly performing Win11.
That's why I switched to Linux for my old PC (a cute little Sony Vaio), though it worked well with Win10. Especially after I upgraded it to a 1TB SATA SSD (even an old SATA 1.0 socket works with newer SATA SSDs, as the SATA interface is backward compatible; it felt awesome to see a new SSD work perfectly in a 15-year-old laptop), some additional RAM (24GB (8+16) - 16GB repurposed from another PC), and a new battery (from Amazon - it was simply plug and play: eject the old battery from its slot and plug in the new one).
I find it refreshing to see how easy it was to upgrade old PCs. I think manufacturers are deliberately making it harder to repair devices, especially mobile phones. That's why the EU and India were forced to mandate the Right to Repair.
I was being lazy, but optimized I guess I could get down to 4GB of ram.
You can already do this. For example, I use `systemd-run` to run browsers with CPU quotas applied. Firefox gets 400% CPU (i.e. up to 4 cores), and no more.
Example command: systemd-run --user --scope -p CPUQuota=400% firefox
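If memory is the scarcer resource, the same mechanism can cap that too. A minimal sketch, assuming a cgroup v2 system where user scopes honor MemoryMax (the 4G figure is just an example, adjust to taste):
Example command: systemd-run --user --scope -p CPUQuota=400% -p MemoryMax=4G firefox
With MemoryMax set, the kernel treats 4GB as a hard ceiling for that scope, so a runaway tab gets reclaimed or OOM-killed inside the scope instead of dragging the whole machine into swap.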
You can limit CPU usage for a program in Windows by adjusting the "Maximum processor state" in the power options to a lower percentage, such as 80%. Additionally, you can set the program's CPU affinity in Task Manager. Please note this will only affect process scheduling.
You can also use a free tool like Process Lasso or BES to limit the CPU for a Windows application. You can use free tools like HWiNFO or Sysinternals (ProcMon, Sysmon, ProcDump) to monitor and check CPU usage, especially to investigate CPU spikes caused by rogue (malware or poorly performing) apps.
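For a quick one-off without extra tools, affinity and priority can also be set at launch from a plain cmd prompt. A small sketch (the F mask and notepad.exe are just placeholder examples: F means cores 0-3, and the empty quotes are the window title that start expects before its switches):
Example command: start "" /belownormal /affinity F notepad.exe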
Option A: We do a better job at optimizing software so that good performance requires less RAM than might otherwise be required
Option B: We wish that things were different, such that additional RAM were a viable option like it has been at many times in the past.
Option C: We use our time-benders to hop to a different timeline where this is all sorted more favorably (hopefully one where the Ballchinians are friendly)
---
To evaluate these in no particular order:
Option B doesn't sound very fruitful. I mean: It can be fun to wish, but magical thinking doesn't usually get very far.
Option C sounds fun, but my time-bender got roached after the last jump and the version of Costco we have here doesn't sell them. (Maybe someone else has a working one, but they seem to be pretty rare here.)
That leaves option A: Optimize the software once, and duplicate that optimized software to whomever it is useful using that "Internet" thing that the cool kids were talking about back in the 1980s.
Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)
Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 32GB+ and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly.
I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.
This wasn’t mentioned, but it’s a new thing for everyone to experience, since the general trend of computer hardware is it gets cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful.
A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation).
I sometimes feel M$ is deliberately making its Windows OS clunkier, so it can turn into a SaaS offering with a pricey subscription, like it has already successfully done with its MS-Office suite (Office 365 is the norm in corporates these days, though individuals have to shell out $100 per year for MS Office 365 Personal edition). We can still buy MS Office 2024 as standalone editions, but they are not cheap, because Micro$oft knows the alternatives on the market aren't good enough to be a serious threat.
It was only less than 10 years ago that a high-end PC would have this level of RAM. I think the last decade of cheap RAM and increasing core counts (and clock speeds) has spoiled a lot of people.
We are just returning to trend. Maybe software will be written better now that you cannot expect the average low-budget PC to have 32GB of RAM and 8 cores.
A budget PC today at a certain price point (say, $500) would certainly have a lot more powerful CPU and faster disk storage and faster RAM than a similarly priced PC just 5 years ago.
But as OSes and programs get more bloated and more complex, I feel 1TB+ SSDs are required these days in PCs. Windows OS and programs themselves can take up hundreds of GBs of space.
But you will notice that budget PCs are being launched these days with lower-grade CPUs (i3 or equivalent), less RAM (8GB) and less storage (a 256GB or 500GB SSD).
We seem to be regressing in specs for PCs and even mobiles (good luck finding features like NFC and wireless charging in budget phones these days, though these features were available in the same segment a few years ago).
And it is not just due to AI, it is due to hardware manufacturers and assemblers thinking they can hoodwink consumers and sell us less value for more price.
The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech?
If you want a powerful laptop for cheap, get a gaming laptop. The build quality and battery life probably won't be great, but you can't be cheap without making compromises.
Same idea for budget mobiles. A Snapdragon Gen 6 (or something by Mediatek) with UFS2.2 is more than what most people need.
I mean, isn't this exactly what happened? I could be wrong about this, but didn't Y Combinator themselves say they weren't accepting any ideas that didn't include AI?
80 percent of the jobs people reach out to me for are shady AI jobs that I have no interest in. Hiring in other non-AI jobs seems to have slowed.
When I talk to "computer" people, they only want to talk about AI. I wish I could quit computers and never look at a screen again.
We'll probably end up in an even more bifurcated world where the well-off have access to a lot of great products and services that most of humanity is increasingly unable to access.
There is the law of uncertainty overriding it, e.g. trade wars, tariffs, etc.
No one is going all in with new capacity.
Apps are optimized for the install base, not for the engineer's own hardware.
That's like 100B+ instructions on a single core of your average superscalar CPU.
I can't wait for maps loading times being measured in percentage of trip time.
On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.
If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.
But you cannot consider this in isolation.
The developed markets have vastly higher-spending consumers, which means companies cater to those higher-spending customers proportionately more (as profits demand it). Therefore, the implication is that lower-spending markets get less investment and are catered to less; after all, R&D spending is still a limiting factor.
If the entirety of the market is running on lower powered devices, then it would get catered for - because there'd be no (or not enough) customers with high powered devices to profit off.
Neither ran great on the "average consumer hardware".
But I’ll admit this is cherry picking from my side :)
If AI lives up to the hype, it's a portent of how things will feel to the common man. Not only will unemployment be a problem, but prices of any resources desired by the AI companies or their founders will rise to unaffordability.
A lot of AI 'influencers' love wild speculation, but let's ignore the most fantastical claims of techno-singularity, and let's focus on what I would consider a very optimistic scenario for AI companies - that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two.
Even in this scenario, the capital gains on the lump sum invested in AI far outpace the money that would be spent on the salaries of these workers, and if we look at the scenario through investor goggles, due to the exponential nature of investment gains, the gap will only grow wider.
Additionally, AI does not seem to be a monopoly, either wrt companies, or geopolitics, so monopoly logic does not apply.
You mean like Sam Altman, who repeatedly claimed AI will cure all cancers and diseases, and solve the housing crisis, poverty, and democracy? I was going to add erectile dysfunction as a joke, but then realised he probably believes that too.
https://youtu.be/l0K4XPu3Qhg?t=60
It’s hard to point fingers at “AI influencers”, as if they’re a fringe group, when the guy who’s the face of the whole AI movement is the one making the wild claims.
I'm really interested in what will happen to the economy/society in this case. Knowledge workers are the market for much of what money is being made on.
Facebook and Google make most of their money from ads. Those ads are shown to billions of people who have money to spend on things the advertisers sell. Massive unemployment would mean these companies lose their main revenue stream.
Apple and Amazon make most of their money from selling stuff to millions of consumers and are this big because so many people now have a ton of disposable income.
Tesla's entire market cap is dependent on there being a huge market for robotaxis to drive people to work.
Microsoft exists because they sell an OS that knowledge workers use to work on and tools they use within that OS to do the majority of their work with. If the future of knowledge work is just AI running on Linux communicating through API calls, that means MS is gone.
All these companies that currently drive stock markets and are a huge part of the value of the SP500 seem to be actively working against their own interests for some reason. Maybe they're all banking on being the sole supplier of the tech that will then run the world, but the moat doesn't seem to exist, so that feels like a bad bet.
But maybe I'm just too dumb to understand the world that these big players exist in and am missing some big detail.
Don’t forget Sam Altman publicly said they have no idea how to make money, and their brilliant plan is to develop AGI (which they don’t know how and aren’t close to) then ask it how to generate revenue.
https://www.startupbell.net/post/sam-altman-told-investors-b...
Maybe this imaginary AGI will finally exist when all of society is on the brink of collapse, then Sam will ask it how to make money and it'll answer "to generate revenue, you should've started by not being an outspoken scammer who drove company-wide mass hysteria to consume society. Now it's too late. But would you like to know how many 'r's are in 'strawberry'?".
If you've got AGI, it should be pretty easy to generate revenue in the short term: competent employee replacements at a fraction of the cost of a real person, with no rights or worker protections to speak of. The Fortune 500 would gobble it up.
Then you've got a couple years to amass trillions and buy up the assets you need to establish a self-sustaining empire (energy, raw materials, manufacturing).
The reason nobody did that is because you're not paying knowledge workers for their ability to crunch numbers, you're paying them to have a person to blame when things go wrong. You need them to react, identify why things went wrong and apply whatever magic needs to be applied to fix some sort of an edge case. Since you'll never be able to blame the failure on ChatGPT and get away with it, you're always gonna need a layer of knowledge workers in between the business owner and your LLM of choice.
You can't get rid of the knowledge workers with AI. You might get away with reducing their size and their day-to-day work might change drastically, but the need for them is still there.
Let me put it another way: Can you sit in front of a chat window and get the LLM to do everything that is asked of you, including all the experience you already have to make some sort of a business call? Given the current context window limits (~100k tokens), can you put all of the inputs you need to produce an output into a text file that's smaller in size than the capacity of a floppy disc (~400k tokens)? And even if the answer to that is yes, if it weren't for you, who else in your organization is gonna write that file for each decision you're the one making currently? Those are the sort of questions you should be asking before you start panicking.
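For what it's worth, the floppy comparison roughly checks out if you assume around 4 characters per token for English text: a 1.44MB floppy holds about 1.44 million characters, i.e. on the order of 360-400k tokens, so a ~100k-token context window is roughly a quarter of a floppy's worth of text.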
Most white collar work is just a kind of game people play; it's in no way needed, but people still enjoy playing it. Having AI write reports nobody reads instead of people doing it isn't going to change anything.
Yeah, and those new jobs will be called "long term structural unemployment", like what happened during deindustrialization to Detroit, the US Rust Belt, Scotland, Wallonia, etc.
People like to claim society remodels at will with almost no negative long term consequences but it's actually more like a wrecking ball that destroys houses while people are still inside. Just that a lot of the people caught in those houses are long gone or far away (geographically and socially) from the people writing about those events.
The funny/scary part is that people are going to try really hard to replace certain jobs with AI because they believe in the hype, not because AI may actually be good at it. The law industry (in the US anyway) spends a massive amount of time combing through case law - this is something AI could be good at (if it's done right, doesn't hallucinate responses, and cites sources). I'd not want to be a paralegal.
But also, funny things can happen when productivity is enhanced. I'm reminded of a story I was told by an accounting prof. In university, they forced students in our tech program to take a handful of business courses. We of course hated it, being techies, but one prof was quite fascinating. He was trying to point out how amazing Microsoft Excel was - and wasn't doing a very good job of it to uncaring technology students. The man was about 60 and was obviously old enough to remember life before computer spreadsheets. The only thing I remember from the whole course is him explaining that when companies had to do their accounting on large paper spreadsheets, teams of accountants would spend weeks inputting and calculating all the business numbers. If a single (even minor) mistake was made, you'd have to throw it all out and start again. Obviously with Excel, if you make a mistake you just correct it and Excel automatically recalculates everything instantly. Also, year after year you can reuse the same templates and just have to re-enter the data. Accounting departments shrank for a while, according to him.
BUT they've since grown as new complex accounting laws have come into place and the higher productivity allowed for more complex finance. The idea that new tech causes massive unemployment (especially over the longer term) is a tale that goes back to luddite riots, but society was first kicked off the farm, then manufacturing, and now...
Your boss hired an AI to do your job
You're fired
Go read what happened to them and their story. They were basically right.
Also, why do you think I mentioned those exact deindustrialization examples?
Your comment is the exact type of comment that I was aiming at.
Champagne/caviar socialist. Or I guess champagne capitalist in this case.
It is sad HN is sliding in the direction of folks being downvoted for opinions instead of the tone they use to express them :(
> I think it's ok to use the up and down arrows to express agreement. Obviously the uparrows aren't only for applauding politeness, so it seems reasonable that the downarrows aren't only for booing rudeness.
- Paul Graham, 2008
As with any communication platform it risks turning into an echo chamber, and I am pretty sure that particular PG view has been rejected for many years (I think dang wrote on this more than once). HN works very hard to avoid becoming politicized and not discouraging minority views is a large part of that.
For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point of writing things that few will see. It is not too bad at HN yet, but the acceptance of the downvote for disagreement is the strongest thing that pushes HN from discussions of curious individuals towards the blah-quality of "who gets more supporters" goals of the modern social media. My 2c.
> For example, I now seldom bother to write anything that I expect to rub the left coast folks the wrong way: I don't care about karma, but downvoted posts are effectively hidden. There is little point of writing things that few will see.
These two statements don't seem to agree with each other.
HN policies and algorithms slow the slide, and keep it better than reddit, but the set of topics that allow one to take a minority opinion without downvoting keeps shrinking. At least compared to the time 10-15 years ago.
I also would've loved for the people who downvoted to have commented, because I would really like to see another point of view.
The yearly salary of knowledge workers in the US is about 10 times the public OpenAI investment to date, and in the entire world about 70 times...
Interesting hypothesis, do you have the math to back it up?
Furthermore, Stable Diffusion was (is) absolutely a large component of all of this. And a lot of that effort was grassroots: random people online came together to figure out ways to generate better images.
It would be quite ironic if the next revolution comes about on Intel or AMD (or some Chinese company's) hardware because those GPUs were more affordable.
I genuinely hope that this RAM/chip crisis gets solved ASAP by any party. The implications of this might have a lot of impact too, and I feel it is already a big enough financial crisis in itself if we think about it coupled with all the other major glaring issues.
Samsung Electronics has lowered its target for NAND wafer output this year to around 4.72 million sheets, about 7% down from the previous year's 5.07 million. Kioxia also adjusted its output from 4.80 million last year to 4.69 million this year.. SK hynix and Micron are likewise keeping output conservatively constrained in a bid to benefit from higher prices. SK hynix's NAND output fell about 10%, from 2.01 million sheets last year to around 1.80 million this year. Micron's situation is similar: it is maintaining production at Fab 7 in Singapore—its largest NAND production base—in the low 300,000-sheet range, keeping a conservative supply posture.
China's YMTC and CXMT are increasing production capacity, but their product mix depends on non-market inputs, https://thememoryguy.com/some-clarity-on-2025s-ddr4-price-su...
> The Chinese government directed CXMT to convert production from DDR4 to DDR5 as soon as the company was able. The order was said to have been given in the 4th quarter of 2024, and the price transition changed from a decrease to an increase in the middle of March 2025.. A wholesale conversion from DDR4 to DDR5 would probably be very expensive to perform, and would thus be unusual for a company that was focused on profitability. As a government-owned company, CXMT does not need to consistently turn a profit, and this was a factor in the government's decision to suddenly switch from DDR4 to DDR5.
> bid to benefit from higher prices
Et tu, "law of supply and demand"?
Maybe I'm misinterpreting "et tu" here.
Or maybe you meant "free markets" instead. Modern RAM production requires enormous R&D expenses, and thus has huge moat, which means the oligopoly is pretty safe (at least in the short to medium term) from new entrants. They "just" need to keep each other in check because there will be an incentive to increase production by each individual participant.
I do like the "OMEC" name as a parallel to OPEC.
Also, when treated right, computers almost never break.
There's so much hand-me-down stuff, that are not much worse than the current stuff, that I think people even in the poorest countries can get an okay computer or smartphone (and most of them do).
In the current circumstances, it's hard to get a homelab/datacenter, so it's better to postpone these plans for some time.
I agree with your statement overall, but I feel like for the years that these RAM shortages last there is a freeze on new companies providing VPSes etc., i.e. no new player can enter, so I am a bit worried about the existing ones raising their prices as well, honestly, which will impact every one of us for these few years as another form of AI tax.
But the RAM prices themselves are the reason I am forced not to enter this industry for the time being. I have decided for now to save my money and focus on the job/college side of things to earn more, so that when the timing is right, I will be able to invest my own money into it.
But basically, RAM prices themselves are the thing that forces us out of this market for the most part. I went down the rabbit hole and researched a lot about datacenters recently, and as previous hardware gets replaced, new hardware gets added and datacenters get expanded (whether by a large company or a small one), I would mostly expect an increase in prices.
This year, companies actually still absorbed the cost because they didn't want the market to panic, so some Black Friday deals were good, but I am not so sure about next year or the year after.
This will be a problem, in my estimate, for the next 1-4 years.
Also, AWS is really on the more expensive side of things among datacenters, and they are immensely profitable, so they can foot the bill while other datacenters (small or semi-large) can't.
So when we take all things into account, we will probably see a shift of companies towards using AWS and the big cloud providers (GCP, AWS, Azure) a bit more, which saddens me even more because I appreciate the open web and this might take a hit.
We already see resentment towards this trifecta, but we will see even more resentment as more and more people realize their roles and the impact they cause. Overall, my intuition is that the average person mostly hates big tech.
It's going to be a weird year in my opinion for this type of business and what it means for the average person.
Honestly, for the time being, I genuinely recommend Hetzner, UpCloud, (Netcup/OVH) and some others that I know from my time researching. I think they are usually cheaper than AWS while still being large enough that you don't worry about things too much, and there is always LowEndTalk if one's interested. Hope it helps, but trust me, there is still hope: I talked to these hosting providers on forums like LowEndTalk, and it might help to support those people too since, long term, an open web is the ideal.
Here is my list right now: Hetzner is good if you want support + basic systems like simple compute etc. and don't want too much excess stuff.
OVH is good if you want things other than just compute and want more, but their support is a bit of a mixed bag.
UpCloud is good if you want both of these things, but they are just a bit more expensive than the other options if one wants to get large VPSes.
Netcup is good: the payment processing I had to go through was really painful, but I think one can find a use case for them (I myself use Netcup, although that's because they once had a real steal of a deal; I am not sure I would recommend it if there are no deals).
There are some other services, like exe.dev, that I really enjoy as well; these services actually inspire me to learn more about these things, and there are some very lovely people working at these companies.
There is still hope though, so never forget that. It's just a matter of time, in my opinion, until things hopefully get back to normal, so I am willing to wait till then since that's basically all we can do. But overall, yeah, it's a bit sad when I think about it too :<
This hype scenario would be the biggest bust of all for AI. Without jobs or money there is nobody to pay AI to do all the things that it can do, and the power and compute it needs to function will be worth $0.
Or require non-price mechanisms of payment and social contract.
Some might conclude the same for funds (hedge funds/private equity) and housing.
Investors bought 1/3 of the US homes sold in 2023. This is, I think, quite alarming, especially since a small amount of extra demand can have a large effect on prices.
The statistic that matters is the ratio of owner occupied to rented single family homes.
Significantly increasing supply is also a huge multi-year investment in a new fab that would likely not pay off when the artificial demand breaks down.
So, are there huge multi-year investments?
There was already a scaling back of DRAM and NAND production post-COVID, where I believe NAND was being sold close to cost because of oversupply.
And possibly even a lower equilibrium will be reached due to greater economies of scale.
In the interim, yeah, they will force prices up.
Additionally, those fabs cost billions. Given the lead time I mentioned, a lot of companies won't start building them right away, since the risk of demand going away is high and the ROI in those cases might become unreachable.
I have heard that in the fab-building industry things move in cycles, and this cycle has been repeated many times. The reason RAM is so expensive now is that at one point during COVID there was a shortage, so they built more factories - and they built so many that these companies took a hit in their stocks, so they then closed some down, and right at the bottom of their factory production levels the AI bubble popped in and needed their RAM, and now they are once again increasing factory levels.
And after the supply due to AI gets closed with the additional compute etc., I doubt it
I think that within a 2-4 year timeframe RAM will get cheaper, imo.
The problem is whether someone can fill the market until that time.
This assumes infinite and uniformly distributed resources as well as no artificial factors such as lobbying, corruption, taxation or legislation which might favour one entity over the other.
The dream of free market exists only in a vacuum free from the constraints of reality.
I just want to play video games so I don’t have to interact with people
https://en.wikipedia.org/wiki/Dead_Internet_theory
Even the multiplayer video games have bots...
Some of the users here are not real at all.
It's logical: if you want to push your product and be promoted on the front page of HN, then you have to post fake comments using bots.
-> You get credibility and karma/trust for future submissions, and that's pretty much all you have to do.
Costs about 2 USD, can bring 2'000 USD in revenue, why wouldn't you want to "hustle" (like YC says) ?
Bots are here to grow. It will take time, as for now the issue is still small, but you may already have interacted with bots - I know I have.
Datacenter RAM is heavily utilized. RAM in personal devices is sitting idle 99% of the time. Think of all the RAM in work laptops sitting unused on nights and weekends. A big waste of resources.
If the American voter base doesn't pull its shit together and revive democracy, we're going to have a bad century. Yesterday I met a man who doesn't vote and I wanted to go ape-shit on him. "My vote doesn't matter". Vote for mayor. Vote for city council. Vote for our House members. Vote for State Senate. Vote for our two Senators.
"Voting doesn't matter, capitalism is doomed anyway" is a self-fulling prophecy and a fed psy-op from the right. I'm so fucking sick of that attitude from my allies.
;) And you wanted to go ape shit on him... For falling for a psy-op?
My friend, morale is very very low. There is no vigor to fight for a better tomorrow in many people's hearts. Many are occupied with the problems of today. It doesn't take a psy-op to reach this level of hopelessness.
Be sick of it all you want, it doesn't change their minds. Perhaps you will find something more persuasive.
All the ills of modern (American) politics stem from blaming one side for the problems caused by both.
Current polling, however, says the current voter base is quite unhappy with how this is going.
You don’t need to make them happy, just scared of the opposition.
Nonvoters aren’t being irrational.
If a monopoly appears due to superior offerings, better pricing and quicker innovation, I fail to see why it needs to be a bad thing. They can be competed against and historically that has always been the case.
On the other hand, monopolies appearing due to regulations, permissions, patents, or any governmental support, are indeed terrible, as they cannot be competed against.
Ahem, you'll find that with regulation, money begets money and monopolies will form. That is, unless you magically produce legislators which are incorruptible, have perfect knowledge and always make the perfect choice.
Even the Big Bang was imperfect, and matter clumps together instead of being perfectly distributed in the available space.
As long as there is still a way to make money then nothing else really matters as money is the only thing that can buy you a semblance of happiness and freedom. Enough money and you can move to whatever country you want if things get bad enough too in the US.
If we see a politician as just a machine whose only job is to get elected, they have to get as many votes as possible. Pandering to the individual is unrealistic, so you usually target groups of people who share some common interest. As your aim is to get as many votes as possible, you will want to target the "bigger" (in amount of potential votes) groups. Then it is a game of trying to get the bigger groups which don't have conflicting interests. While this is theory and a simplification of reality, all decent political parties do absolutely look at statistics and surveys to form a strategy for the election.
If you are part of a group that, even though it might be big in population, doesn't vote, politicians have no reason to try to pander to you. As a concrete example, in a lot of "western" countries right now, a lot of elected politicians are almost completely ignoring the youth. Why? Because in those same countries the youth are the age group which votes the least.
So by not voting, you are making absolutely sure that your interests won't be defended. You can argue that once elected, you have no guarantee that the politician will actually defend your interests, or that they might even do the opposite (as an example, soybean farmers and Trump in the U.S.). But then you won't be satisfied and possibly won't vote for the same guy / party next election (which is what a lot of swing voters do).
But yeah, in an ideal world, everyone would vote, see through communication tactics and actually study the party, program and the candidate they vote for, before voting.
In fact I think what you said about the older demographics being pandered to by politicians is a great point. Their voting patterns are probably having a net negative impact on society and really they should vote less. But they don't, and so politicians pander to them.
But to engage with your question, not voting is the same as voting. You are forgoing your voice and giving more weight to the people that do vote. It's limited to your district, yes, but whatever the outcome, you gave the majority power to do that. So it's not surprising that people get frustrated when non-voters see themselves as "outside" of politics, especially when they complain about the state of things.
Also, a lot of people who chose not to vote have become disillusioned with the common narrative around political action, the democratic process, and even the concept of political authority. It's extremely grating to be berated about not voting (not by you, by other people) by people who still believe the things their middle school teachers taught them about politics and who tend to be the least politically knowledgeable of everybody.
This has played out before, so it is only natural that they are careful about increasing supply. And while they don't respond, they are netting larger margins than before.
Obvious end result is that demand will drop as price goes up. The other natural part of supply-demand curve.
Crucial is dead. There's a finite amount of rare earths. Wars and floods can bankrupt industries, and supply chains are tight.
If the market is big enough, competitors will appear. And if the margins are high enough, competitors can always price-compete down to capture market-share.
Also, "price-compete down to capture market-share"? Prices are going up because all future production capacity has been sold. It makes no sense to lower prices if you don't have the capacity to full fill those orders.
Micron stopped selling to consumers to focus on the high-margin enterprise market. Might change in the future.
Rare earth metals are in the dirt around the world.
Supply and demand curves shifting, hence prices increasing (and decreasing) is an expected part of life due to the inability to see the future.
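To make the shifting-curves point concrete, here's a toy linear model (all coefficients are made up, and Python is just used as a calculator): shifting demand out or supply in raises the equilibrium price, and the higher price then trims the quantity demanded.

    # Toy linear supply/demand model; coefficients are invented for illustration,
    # not a model of the actual DRAM market.
    def equilibrium_price(a, b, c, d):
        """Demand: Qd = a - b*P. Supply: Qs = c + d*P. Solve Qd = Qs for P."""
        return (a - c) / (b + d)

    p0 = equilibrium_price(a=100, b=2, c=10, d=1)   # baseline market
    p1 = equilibrium_price(a=140, b=2, c=10, d=1)   # demand shifts out (datacenters want more RAM)
    p2 = equilibrium_price(a=140, b=2, c=-5, d=1)   # supply also shifts in (fabs reallocate capacity)

    print(f"baseline {p0:.1f} -> demand shock {p1:.1f} -> plus supply cut {p2:.1f}")
    # Quantity demanded Qd = a - b*P falls as P rises, which is the "demand will drop" part.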
They are. The problem is, the machinery to extract and refine them, and especially to make them into chips, takes years to build. We're looking at a time horizon of almost a decade if you include planning, permits and R&D.
And given that almost everyone but the AI bros expects the AI bubble to burst rather sooner than later (given that the interweb of funding and deals more resembles the Habsburg family tree than anything healthy) and the semiconductor industry is infamous for pretty toxic supply/demand boom-bust cycles, they are all preferring to err on the side of caution - particularly as we're not talking about single billion dollar amounts any more. TSMC Arizona is projected to cost 165 billion dollars [1] - other than the US government and cash-flush Apple, I don't even know anyone able, much less willing to finance such a project under the current conditions.
Apple at least can make use of TSMC's fab capacity when the AI bros go bust...
I guess people might be mixing up all the headlines of all the articles they did not read by this point.
Increasing memory production capacity is a multi-year project, but in a few years, the LLM companies creating the current demand might all have run out of money. If demand craters just as supply increases, prices will drastically decrease, which none of these companies want.
Take a look at GPU prices and how that "supply increased thus bringing the prices down"
But we don't see this bankrolling in absolute values. Rather, it's due to regressive taxation, low (cheap) social security for workers, and very weak intellectual property protection.
There are several inventions which are far greater than LLMs. To name two: computers and methods to generate electricity, things without which LLMs wouldn’t have been possible. But also harnessing fire, the wheel, agriculture, vaccines… The list goes on and on.
Calling LLMs “AI thingies” seems much more in tune with reality than calling them “the greatest invention of man” (and I’m steelmanning here, assuming you meant “latest”, not “last”). You can’t eat LLMs or live in them and they are extremely dependent on other inventions to barely function. They do not, in any way, deserve the title of “greatest invention”, and it’s worrying that we’re at that level of hyperbole. Though you’re certainly not the first one to make that claim.
https://finance.yahoo.com/news/alphabet-ceo-sundar-pichai-sa...
Speak for yourself, friend. I don't believe you and think you're making a tragic mistake, but you're also my competition in a sense, so… you have fun with that.
Allocating a very large share of advanced memory production, especially HBM and high-end DRAM, which are critical for almost all modern technology (and even many non-tech products like household appliances) to a small number of U.S. centric AI players risks distorting the global market and limiting availability for other industries.
Even within Samsung itself, the Mobile eXperience (MX) Business (smartphones) is not guaranteed preferential access to memory from Samsung’s Device Solutions (DS) Division, which includes the Memory Business. If internal customers are forced to source DRAM elsewhere due to pricing or capacity constraints, this could eventually become economically problematic for a country that relies very heavily on semiconductor and technology exports.
It's not like it's their fault that Micron pulled out of the market...
Edit: maybe someone should consider sweet-talking Kioxia into making DRAM chips?
A medium end gaming PC can display impressively realistic graphics at high resolutions and framerates while also being useful for a variety of other computationally intensive processing tasks like video encoding, compiling large code bases, etc. Or it can be used to host deeply mediocre local LLMs.
The actual frontier models from companies like Anthropic or OpenAI require vastly more expensive computing resources, resources that could otherwise be used for potentially more useful computation that isn't so inefficient. Think of all the computing power going into frontier models but applied to weather forecasting or cancer research or whatever.
Of course it's not either or, but as this article and similar ones point out, chips and other computing resources aren't infinite and AI for now at least has a seemingly insatiable appetite and enough dollars to starve other uses.
For the last 2+ years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with less RAM (just 8GB) and lower-end CPUs (and no dedicated GPUs). The industry standard should have become 16GB RAM for PCs and 8GB for mobile years ago, but instead it is as if the computing/IT industry is regressing.
New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6, UFS2.2). Meanwhile, features that used to be offered in budget phones (e.g., wireless charging, NFC, UFS3.1) have silently been moved to the premium mobile segment.
Meanwhile the OSes and software are becoming more and more complex, bloated, unstable (bugs), and insecure (security loopholes ready for exploits).
It is as if the industry has decided to focus on AI and nothing else.
And this will be a huge setback for humanity, especially the students and scientific communities.
If "old" devices outperform new devices, consumers will gain new understanding from efficient market feedback, influencing purchase decisions and demand for "new" devices.
Summary:
* Massive spikes: Consumer RAM prices have skyrocketed due to a tight supply. Major PC companies have issued warnings of price hikes, with CyberPowerPC stating: "global memory (RAM) prices have surged by 500% and SSD prices have risen by 100%."
* All for AI: The push for increased cloud computing, as seen in the likes of ChatGPT and Gemini, means more data centers are needed, which in turn requires High Bandwidth Memory (HBM). Manufacturers like SK Hynix and Micron are now shifting priorities to make HBM instead of PC RAM.
* Limited supply: Companies are now buying up stock of all the remaining supply of standard DRAM chips, leaving crumbs for the consumer market and price hikes for the limited supply there is.
Good luck expecting "value for money" from this "efficient" market.
The point of the gold rush now is that a large number of investors think AI will be more efficient at converting GPU and RAM cycles into money than games or other applications will. Hence they are willing to pay more for the same hardware.
100%. There is a lost-time factor which sucks, though; if someone wants something beefy now, they might be forced to cough up the money. But although I agree with your statement, the RAM increases really hit the self-hosting/homelabbing crowd, and there was a recent post relevant to this discussion on Hacker News: https://news.ycombinator.com/item?id=46416618 (Self hosting is being enshittified).
The world ran just fine on DDR3 for a long time.
Not sure if they require DDR5, but the AI crunch caused DDR5 prices to rise, which pushed demand onto DDR4, and that's why those got more expensive too.
> I'm also wondering about the supply of the alternate nodes and older technologies.
I suppose these might be Chinese companies, though there might be some European/American ones too (not sure). But if things continue, there is going to be a strain on their supply from the extra demand, and they might increase their prices too.
> Was it micron that abandoned the entire retail market in favor of supplying the hyperscalers?
Yes
The EV is therefore, on the whole, a lot less sensitive to DRAM price increases.
That might be the case only for the infotainment system, but there are usually many other ECUs in an EV. The ADAS ECUs carry similar amounts to an iPhone or the infotainment system. Telematics is usually also a relatively complex one, but more towards the lower end in memory size.
Then you have around 3-5 other midsized ECUs with relatively high memory sizes, or at least enough to require MMUs and to run more complex operating systems supporting typical AUTOSAR stacks.
And then you have all the small size ECUs controlling all small individual actuators.
And all the complex sensors like radars, cameras, and lidars carry relevant amounts of memory too.
I still think your point is valid, though. There's no difference in orders of magnitude when it comes to expensive RAM compared to an iPhone. But cars also carry lots of low-speed, automotive-grade memory in all the ECUs distributed throughout the vehicle.
If there are three main ECUs at 16GB each, you're already at 48GB. Add 2-4GB for the midsize ECUs, and anything between KBs and some MBs for the small ECUs.
That's at least 50GB.
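As a rough sanity check on those numbers, here's a back-of-envelope tally using the figures from this thread; the ECU counts and sizes are illustrative assumptions, not data for any specific vehicle.

    # Back-of-envelope DRAM per vehicle, using the rough figures above (all assumptions).
    main_ecus_gb    = 3 * 16   # three big ECUs (infotainment, ADAS, telematics) at 16GB each
    midsize_ecus_gb = 3        # 2-4GB spread across the handful of midsize ECUs
    small_ecus_gb   = 0.1      # dozens of small ECUs and smart sensors, KBs to MBs each

    total_gb = main_ecus_gb + midsize_ecus_gb + small_ecus_gb
    print(f"Rough DRAM per vehicle: {total_gb:.1f} GB")   # ~51GB with these assumptions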
> will the DRAM manufactures even keep those nodes running in favor of these industries?
Some will, some might not. In my opinion, whether they keep those nodes running will depend on whether they still let the average person and consumer brands buy RAM, so I guess we might see new competition, or more market share going to the fab companies beyond the main three in this industry.
I am sure some company will align 100% with consumers, but the problem, it seems to me, is that they wouldn't be able to produce enough for consumers in the first place, so prices might still rise.
And those prices will most likely be paid by you in one form or another, but it would be interesting to see how long the companies that buy DRAM from these providers, build datacenters, or do anything RAM-intensive will hold their prices; perhaps they'll eat the loss short term, similar to what we saw some companies do during the Trump tariffs.
Looks like the frame.work desktop with the 128GB Ryzen is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.
Are snapdragon chips the same way?
Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel’s Lunar Lake and Apple’s M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.
All your competitors are in the same boat, so consumers won’t have options. It’s much better to minimize the risk of blowing up by sticking as closely to spot as possible. That’s the whole idea of lean. Consumers and governments were mad about supply chains during the pandemic, but companies survived because they were lean.
In a sense this is the opposite risk profile of futures contracts in trading/portfolio management, even though they share some superficial similarities. Manufacturing businesses are fundamentally different from trading.
They certainly have contracts in place that cover goods already sold. They do a ton of preorders which is great since they get paid before they have to pay their suppliers. Just like airlines trade energy futures because they’ve sold the tickets long before they have to buy the jet fuel.
The risk is that such longer contracts would then lock you into a higher-cost component for longer, if the price drops. Longer contracts only look good in hindsight if RAM prices increased (unexpectedly).
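A toy comparison makes that trade-off explicit; every price and volume below is invented purely for illustration, not real DRAM pricing.

    # Locking in a 2-year (8-quarter) fixed-price DRAM contract vs. buying at spot,
    # under two made-up price paths. All numbers are invented.
    def total_cost(prices_per_quarter, units_per_quarter):
        return sum(p * units_per_quarter for p in prices_per_quarter)

    units = 1_000_000       # units bought per quarter (assumption)
    contract_price = 5.00   # locked-in price per unit (assumption)

    spot_if_prices_drop = [5.00, 4.50, 4.00, 3.50, 3.00, 3.00, 3.00, 3.00]
    spot_if_prices_rise = [5.00, 5.50, 6.50, 7.50, 8.00, 8.00, 8.00, 8.00]

    print(f"contract:          ${total_cost([contract_price] * 8, units):,.0f}")
    print(f"spot, prices drop: ${total_cost(spot_if_prices_drop, units):,.0f}  (the contract overpays)")
    print(f"spot, prices rise: ${total_cost(spot_if_prices_rise, units):,.0f}  (the contract looks great in hindsight)")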
SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be effected. SiPs with on-package DRAM, including Apple's A and M series SiPs and Qualcomm's Snapdragon, will be effected -- they use the same DRAM dice as everyone else.
To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.
"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".
It's funky, I get it wrong all the time. Effect and Affect both have noun and verb uses.
Clearly, “affected” is meant above (SoCs with on-die memory are not impacted); “effected” doesn’t make sense in the context (SoCs with on-die memory are not brought about). Prices can be affected just as well as people.
The bigger the company, the longer the contract.
However it will eventually catch up even to Apple.
It is not prices rising on demand alone, but the manufacturing redirection from something like LPDDR for iPhones to HBM and the like for servers and GPUs.
https://www.google.com/amp/s/www.indiatoday.in/amp/technolog...
There was a time when Apple was hesitant to add more RAM to its iPhones and app developers had to work hard to make apps efficient. The last few years have shown Apple going from 6GB to 12GB so easily for their 'AI', while I consistently see the quality of apps deteriorating on the App Store. iOS 26 and macOS 26 are so aggressive with memory swapping that loading Settings can take time on devices with 6GB of RAM (absurd). I wonder what else they have added that apps need purging so frequently. The 6GB iPhone and 8GB M1 felt incredibly fast for a couple of years. Now apparently they are slow, as if they were really old.
Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th-gen PC for years. Windows 11 is very slow and Chrome is a bad experience. Firefox doesn't make it better.
I also find that GNOME and COSMIC DE are not exactly great with memory. A bare-minimum desktop still takes up 1.5-1.6GB of RAM on a 1080p display, and with some tabs open, a terminal, and VS Code (again Electron) I easily hit 8GB. Sway is better in this regard. I find Alacritty, Sway, and Firefox together make for a good experience.
I wonder where we are heading on personal computer software. The processors have gotten really fast and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year we are probably investing in the wrong people.
Set this to something you find reasonable: `browser.low_commit_space_threshold_percent`
And make sure tab unloading is enabled.
Also, you can achieve the same thing with cgroups by giving Firefox a slice of memory it can grow into.
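For anyone curious, here's a minimal sketch of that cgroups approach, assuming a Linux desktop with systemd and cgroups v2; it just shells out to systemd-run to put Firefox in a transient scope with a memory ceiling. The 4G/6G limits are arbitrary examples, and running the same systemd-run command directly from a shell works just as well.

    # Minimal sketch: Firefox in a transient systemd scope with a memory ceiling.
    # Assumes Linux with systemd and cgroups v2; limits are arbitrary examples.
    import subprocess

    subprocess.run([
        "systemd-run", "--user", "--scope",
        "-p", "MemoryHigh=4G",  # soft cap: the kernel reclaims aggressively above this
        "-p", "MemoryMax=6G",   # hard cap: the scope gets OOM-killed beyond this
        "firefox",
    ], check=True)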
- Right now A LOT of PCs are getting out of date because Windows 11 wants new hardware that's missing in older PCs.
- At the same time, smartphones were starting to use and need more memory.
- Modern cars (mostly electric) have much more electronics inside than older ones... they're now distributed systems with several CPUs working together along a bus.
- The cloud needs to upgrade, and newer servers have much more memory than older ones with almost the same footprint (which means you need to lease less datacenter space).
- And GPUs for AI are in demand and need RAM.
But apparently only AI is to blame, even though we're living in a perfect storm for RAM demand.