Posted by SerCe 10 hours ago
This seems rather sad. Is this really what AI is for?
And we do not need gigawatts and gigawatts for this use case anyway. A small local model or batched inference of a small model should do just fine.
Or, you know, Signal/Matrix/WhatsApp/{your_preferred_chat_app}. If you're already texting things, might as well do that.
I guess I'm a dinosaur, but I think emailing the friend to ask what they're actually up to would be even better than involving an LLM to imagine it.
Asynchronous human-to-human communication is a pretty solved problem.
> I'd been missing that human connection
At OpenAI.
I don't think that indicates that any one company interviewed him 20+ times.
Seems like that really hurt.
Man, I don't care if you're the best engineer on the planet, but reading that post made me cringe. Keep your ego under control, please.
You're in for a surprise buddy.