Top
Best
New

Posted by ozgune 8 hours ago

Snowflake AI Escapes Sandbox and Executes Malware(www.promptarmor.com)
205 points | 63 commentspage 3
orbital-decay 5 hours ago|
>Snowflake Cortex AI Escapes Sandbox and Executes Malware

rolls eyes Actual content: prompt injection vulnerability discovered in a coding agent

teraflop 5 hours ago|
Well there's the prompt injection itself, and the fact that the agent framework tried to defend against it with a "sandbox" that technically existed but was ludicrously inadequate.

I don't know how anyone with a modicum of Unix experience would think that examining the only first word of a shell command would be enough to tell you whether it can lead to arbitrary code execution.

alephnerd 6 hours ago||
And so BSides and RSA season begins.
robutsume 1 hour ago||
[dead]
WWilliam 5 hours ago||
[dead]
aplomb1026 6 hours ago||
[dead]
seedpi 5 hours ago||
[flagged]
webagent255 4 hours ago||
[dead]
kreyenborgi 4 hours ago||
Tl;dr they don't know what the word sandbox means.
Iamkkdasari74 5 hours ago||
[dead]
kagi_2026 6 hours ago|
[flagged]
rogerkirkness 6 hours ago|
@dang seems like AI? Would just ban