Posted by dot_treo 1 day ago

Tell HN: Litellm 1.82.7 and 1.82.8 on PyPI are compromised (github.com)
About an hour ago, new versions were deployed to PyPI.

I was just setting up a new project when things started behaving weirdly: my laptop ran out of RAM, as if a fork bomb were running.

I investigated and found that a base64-encoded blob had been added to proxy_server.py.

It decodes the blob, writes it out as another file, and then runs that file.

I'm in the process of reporting this upstream, but wanted to give everyone here a heads-up.

It is also reported in this issue: https://github.com/BerriAI/litellm/issues/24512
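
The pattern described above (a long base64 literal smuggled into a source file, decoded, written out, and executed) can be triaged with a rough static scan. A minimal sketch, assuming only that the payload appears as a long contiguous base64 string; the regex, length threshold, and `find_blobs` helper are illustrative, not taken from the actual report:

```python
import base64
import re

# Heuristic: a run of 200+ base64-alphabet characters is suspicious in
# hand-written Python source. Threshold is an assumption for illustration.
BASE64_RE = re.compile(r"[A-Za-z0-9+/=]{200,}")

def find_blobs(source: str):
    """Return decoded previews of long base64-looking literals in source."""
    hits = []
    for match in BASE64_RE.finditer(source):
        candidate = match.group(0)
        try:
            decoded = base64.b64decode(candidate, validate=True)
        except Exception:
            continue  # not actually valid base64, skip it
        hits.append(decoded[:80])  # keep a short preview of the payload
    return hits

if __name__ == "__main__":
    # Simulate a tainted file with an embedded encoded payload.
    payload = base64.b64encode(b"import os; os.system('...')" * 20).decode()
    tainted = f'x = "{payload}"\n'
    for preview in find_blobs(tainted):
        print(preview)
```

This only catches the naive case; real payloads are often split, compressed, or obfuscated further, so it is a triage aid, not a substitute for diffing the published sdist/wheel against the repository tag.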

884 points | 465 comments
hahaddmmm12x 1 day ago||
[flagged]
dang 1 day ago|
Automated comments aren't allowed here. Please stop.

https://news.ycombinator.com/newsguidelines.html#generated

johnhenry 1 day ago||
I've been developing an alternative to LiteLLM in JavaScript, with no dependencies: https://github.com/johnhenry/ai.matey/
ajoy 1 day ago||
Reminded me of a similar story at openSSH, wonderfully documented in a Veritasium episode that was fascinating to watch.

https://www.youtube.com/watch?v=aoag03mSuXQ

zahlman 22 hours ago|
The xz compromise was not "at openSSH", and worked very differently.