
Posted by Brajeshwar 5 days ago

AI didn't delete your database, you did (idiallo.com)
544 points | 302 comments
josefritzishere 5 days ago||
Using AI is a mistake. It might delete your database.
bigstrat2003 4 days ago||
Giving an LLM the ability to interact with your system is, in fact, a mistake. One that it turns out a lot of people are foolish enough to make, and they don't care at all about the predictable consequences of that mistake.
wolttam 5 days ago|||
Using a saw is a mistake; you might cut off one of your own limbs.
recursive 5 days ago||
Using a saw entails a risk of injury. Using one is a mistake if you don't intend to cut something.
wolttam 5 days ago||
What I said was tongue-firmly-in-cheek, in response to the GP. "Using AI is a mistake" is of course only true when the risks aren't acknowledged and/or mitigated.
newsoftheday 5 days ago|
The article is dumb. "Why do you have an API endpoint that deletes your entire production database?" is irrelevant; the AI did what it did, period.
mock-possum 5 days ago||
No, the AI did what you told it to do. The AI didn’t do anything on its own.

> if you're going to use AI extensively, build a process where competent developers use it as a tool to augment their work, not a way to avoid accountability

BadBadJellyBean 5 days ago||
> No, the AI did what you told it to do.

I'd say yes and no. The LLM reacted to the input it was given, but it's not possible for a human (especially one without access to the weights) to even guess what will happen after that.

Regardless, I agree that it's entirely the user's fault for using a tool whose output can't be predicted, giving it such broad permissions, and not having a solid backup strategy.

Either don't use non-deterministic tools, or protect yourself from the potential fallout.
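A minimal sketch of that "protect yourself" idea: vet any SQL an LLM agent produces before it ever reaches the database, and require explicit human confirmation for destructive statements. The `guard_sql` function and the keyword list are hypothetical illustrations, not anything from the article; a real setup would also use a read-only database role and tested backups.

```python
import re

# Hypothetical guard: statements starting with these keywords are
# considered destructive and need explicit human confirmation.
DESTRUCTIVE = re.compile(r"^\s*(DROP|DELETE|TRUNCATE|ALTER)\b", re.IGNORECASE)

def guard_sql(statement: str, confirmed: bool = False) -> bool:
    """Return True if the statement may run.

    Destructive SQL is blocked unless a human has confirmed it;
    everything else passes through.
    """
    if DESTRUCTIVE.match(statement):
        return confirmed
    return True

# Usage:
guard_sql("SELECT * FROM users")               # allowed
guard_sql("DROP TABLE users")                  # blocked
guard_sql("DROP TABLE users", confirmed=True)  # allowed after review
```

The point isn't this particular regex; it's that the destructive path runs through a checkpoint the model can't talk its way past.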

Avicebron 5 days ago||
Uh?

If someone left a loaded gun in a room and then let a toddler run around in it, we would be questioning why the guy 1) left the gun in the room 2) left the toddler in the room unsupervised. We wouldn't be saying, well no one should have toddlers in rooms.

bux93 5 days ago||
A PhD-level toddler, mind you.
kennywinker 5 days ago||
Lol no. No LLM that exists today can write a legible PhD thesis, nor a master's dissertation. Maybe a first-year college student's work, if we're being generous, but I wouldn't leave one of those in a room with a loaded gun either.