
Posted by imagiro 4 days ago

I'm making a game engine based on dynamic signed distance fields (SDFs) [video](www.youtube.com)
379 points | 56 comments
eviks 7 hours ago|
Excellent technical presentation! Though the style itself is a bit too "clay-like", like I wouldn't expect a cube melding with the terrain sand to be a smooth glued connection. Is that some "inherent" SDF thing or just a style of the demo?
tomashubelbauer 3 hours ago||
I think it's an artifact of the optimizations he uses, and while it's artistically limiting, I think the right game with the right visual language could turn this into an advantage in terms of uniqueness/distinctiveness. It's a one-trick pony if it can't be avoided, though.
nkrisc 4 hours ago|||
Basically it's the result of a smoothing function that blends the sampled SDF value of the two nearest bodies. You can simply pick the minimum SDF value and get no blending at all.
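
A minimal sketch of both operators, using Inigo Quilez's well-known polynomial smooth minimum (the engine's exact blend function may differ):

```python
import math

def smin(d1, d2, k):
    # Polynomial smooth minimum: blends the two distance
    # fields wherever they are within k of each other.
    h = max(0.0, min(1.0, 0.5 + 0.5 * (d2 - d1) / k))
    return d2 * (1 - h) + d1 * h - k * h * (1 - h)

def sphere(p, center, r):
    return math.dist(p, center) - r

# A point midway between two nearly-touching spheres:
p = (1.0, 0.0, 0.0)
a = sphere(p, (0.0, 0.0, 0.0), 0.9)
b = sphere(p, (2.0, 0.0, 0.0), 0.9)
print(min(a, b))        # hard union: ~0.1, sharp crease between the bodies
print(smin(a, b, 0.5))  # smooth union: negative, the gap gets filled in
```

The negative smooth-union value means the sample point is now *inside* the blended surface, which is exactly the clay-like "glued" look.
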
jakkos 1 minute ago||
> You can simply pick the minimum SDF value and get no blending at all.

While this is true for traditional SDF rendering (e.g. raymarching), the method of "interpolating cached distances" used here means that you will always get some blending between objects.
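
A toy 1-D version of the effect: even if you bake the hard `min` into a grid, reading it back through interpolation rounds off the crease (the grid spacing here is invented for the example; the real cache is 3-D and sparse):

```python
import math

d1 = lambda x: abs(x)        # distance to a surface at x = 0
d2 = lambda x: abs(1.0 - x)  # distance to a surface at x = 1

def cached(x, h=0.2):
    # Bake the hard union min(d1, d2) at grid points of spacing h,
    # then read it back by linear interpolation, the way a
    # cached-distance renderer samples the field.
    i = math.floor(x / h)
    x0, x1 = i * h, (i + 1) * h
    v0 = min(d1(x0), d2(x0))
    v1 = min(d1(x1), d2(x1))
    t = (x - x0) / h
    return v0 * (1 - t) + v1 * t

x = 0.5                   # exactly on the crease between the two objects
print(min(d1(x), d2(x)))  # exact hard union: 0.5
print(cached(x))          # interpolated cache: ~0.4, the crease is rounded
```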

interpol_p 6 hours ago||
I believe you can do regular hard edged intersections. You can see in his operator list some are listed as “smoothSubtract” and some are just “subtract”

It’s just easy to do the melding thing with SDFs so a lot of people do it
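
In the standard CSG formulations the two operators differ only by a blend term; a sketch (Quilez-style operators, not necessarily the engine's exact functions):

```python
def subtract(base, cut):
    # Hard subtraction: carve `cut` out of `base`, leaving a sharp edge.
    return max(base, -cut)

def smooth_subtract(base, cut, k):
    # Smooth subtraction: the same carve with a fillet of width k.
    h = max(0.0, min(1.0, 0.5 - 0.5 * (base + cut) / k))
    return base * (1 - h) + (-cut) * h + k * h * (1 - h)

# Far from the seam the two agree; on the seam the smooth
# version rounds the edge off into a fillet.
print(subtract(0.1, 2.0), smooth_subtract(0.1, 2.0, 0.2))  # both ~0.1
print(subtract(0.0, 0.0), smooth_subtract(0.0, 0.0, 0.2))  # 0.0 vs 0.05
```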

rcxdude 4 hours ago||
From his description of the approach I suspect it's also to smooth over sharp edges that the grid optimization doesn't like so much.
rcarmo 8 hours ago||
I watched this over the weekend and loved the approach. I’ve played with SDF for 3D modeling (even though the current libraries generate meshes for slicing using marching cubes, which is slow as heck and can lead to imprecision on small features), and wish I had more time for playing around with it.
Glyptodon 17 hours ago||
Using layers with settings about which SDFs interact with which layers for operations seems interesting. Like put trees in a layer and then have an axe that can negatively deform the trees but not the ground layer or something. Or a predator layer that can absorb things in the prey layer. Haven't really thought it through.
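
A sketch of what such per-layer filtering could look like — every name here is invented for illustration; nothing like this is confirmed to exist in the engine:

```python
# Each layer gets a bit; each carve/absorb operation carries a
# bitmask of the layers it is allowed to affect.
TREES, GROUND, PREY = 1 << 0, 1 << 1, 1 << 2

class CarveOp:
    def __init__(self, affects_mask):
        self.affects_mask = affects_mask

    def applies_to(self, layer_bit):
        # The operation only deforms SDFs whose layer bit is set.
        return bool(self.affects_mask & layer_bit)

axe = CarveOp(affects_mask=TREES)      # deforms trees, never the ground
predator = CarveOp(affects_mask=PREY)  # absorbs prey only

print(axe.applies_to(TREES), axe.applies_to(GROUND))  # True False
```
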
tamat 7 hours ago||
the problem with SDF engines is that you have to reinvent everything, as current pipelines rely on triangles.

That means:

- Software to model using SDFs (like Womp)

- Techniques to animate skeletons using SDFs

- Tools to procedurally texture surfaces using SDFs

At least he solved the physics part, which is also complex.

And also, his way of carving is by instantiating new elements, which works for small carves, but if you plan to have lots of tunnels, the number of instances is going to skyrocket.

andybak 4 hours ago|
Although a decent chunk of modern tooling is there to handle the limitations of triangles. And modelling is often using higher-level abstractions that are only turned into triangles at the end of the process.
cyber_kinetist 40 minutes ago||
That's true if you're using a CAD-like tool, but that's typically not used for art (more for engineering / mechanical design)

Game / VFX artists heavily use mesh-based tools such as Maya or Blender.

jacobgorm 18 hours ago||
GameGlobe from Haptico and Square Enix, the engine of which also powered Project Spark from Microsoft, also used an SDF engine. Former colleagues of mine built the tech in Copenhagen and I remember getting a super impressive demo back then. This was the first time I heard of SDFs.
Traubenfuchs 5 hours ago||
You can't talk about terrain modification without mentioning "From Dust", which did it at a grand scale... 15 (!) years ago.

It allowed you to shape terrain with sand, water and lava. So terrain modification PLUS fluid simulation!

https://www.youtube.com/watch?v=ZYUU3dv7WC4

https://en.wikipedia.org/wiki/From_Dust

vivzkestrel 13 hours ago||
stupid question to anyone reading this: not a gamedev, not even by a long shot but i had to ask

- with the advent of all the AI tools, is it actually possible to vibe code a 3D FPS shooter from scratch like if you wrote a 2000 page prompt, can it actually be done?

ehnto 13 hours ago||
A big challenge of game dev is the asynchronous nature of all the requirements, and that the game will develop its direction continuously throughout dev. That is to say you don't know what assets etc you need until you've developed the part of the game that generates that requirement. I find it hard to imagine even a 2000 page pre-planning could capture that process.

You could try planning ahead and restricting assets to an asset library, that could fix some of that problem. But having used coding agents for complex software work, and games being one of the most complex software tasks in the industry, I just don't see it happening quite that easily.

I also think the outcome would be shit, pure and simple. The development of a game is usually the stylistic input of dozens to thousands of humans over the course of years. They are not trivial pursuits. There's a lot of variance in there, but generally speaking I expect this to be one of the final frontiers for AI development. There's not heaps of training data since game code is usually proprietary, which doesn't help.

vivzkestrel 11 hours ago||
out of curiosity, i want to experiment creating a third person shooter from scratch with vibe coding (yes third person, i wrote FPS above by mistake). think of a proper military game with actual uniforms, movements like walk, crouch, jump, take cover etc. and being able to fire bullets, ballistics, grenades, explosions etc. what do you think is the process to vibe code something like this. obviously i'll need to give it models or assets for characters, map locations etc. how does this sorta thing work?
bschwindHN 8 hours ago|||
I would recommend _not_ vibe coding it if it's a game you actually want to see become real, and instead pick up Godot or Unreal or Unity.

I'm sure an LLM could output something or other that resembles a vague concept of a game but you're not going to prompt your way into something that's actually fun for a human to play.

socalgal2 9 hours ago|||
I don't think you can describe all of that in an HN comment. There are lots of videos of people vibe coding games though.
nmfisher 12 hours ago|||
Probably, yes. But it's not 1997 any more, you can "code" a vanilla FPS in Unity in 15 minutes too. Games are more about artwork and design, which agents aren't great at (yet).
protocolture 12 hours ago|||
You can pretty much drag and drop a working FPS in unity.

But I have half-vibed an FPS in Pygame, so it's 100% viable (mine is first-person and has motion, but it's more of a flight simulator; I am sure the rest of the features would be piss easy)

meheleventyone 5 hours ago|||
Depends on what you mean really. In the context of making a game people would actually want to play, no.
lifeformed 2 hours ago|||
Not a good one.
MattRix 13 hours ago|||
I mean you can download a free sample project for Unity or Unreal and have a 3D FPS Shooter even without AI. If you want to make one from scratch using AI, you’ll still need to provide some kind of art…

With that said, yeah Claude Code CAN build one, but for action games a big part of them comes down to “game feel”, something that can’t be captured in a screenshot. You really need to have taste and the ability to describe what isn’t working and why.

ttawehed 13 hours ago||
Elon Musk is working on this (xAI)
d--b 12 hours ago||
Reminded me of Red Faction, an FPS where you could destroy the environment.

Kind of like Quake in a Lemmings world.

This is quite a bit more polished, to say the least.

cubefox 18 hours ago||
Almost every 3D game uses textured polygons almost everywhere (except sometimes for fog or clouds), so this SDF engine is nice to see.

However, he doesn't mention animations, especially skeletal animations. Those tend to work poorly or not at all without polygons. PS4 Dreams, another SDF engine, also had strong limitations with regards to animation. I hope he can figure something out, though perhaps his game project doesn't need animation anyway.

Boxxed 17 hours ago||
I'm not super familiar with this area so I don't follow... Why is animation any more difficult? I would think you could attach the basic 3D shapes to a skeleton the same way you would with polygons.
dahart 17 hours ago||
There are lots of reasons you don’t see a lot of SDF skeletal rigging & animation in games. It’s harder because the distance evaluations get much more expensive when you attach a hierarchy of warps and transforms, and there are typically a lot of distance evaluations when doing ray-marching. This project reduces the cost by using a voxel cache, but animated stuff thwarts the caching, so you have to limit the amount of animation. Another reason it’s more difficult to rig & animate SDFs is because you only get a limited set of shapes that have analytic distance functions, or you have primitives and blending and warping that break Lipschitz conditions in your distance field, which is a fancy way of saying it’s easy to break the SDF and there are only limited and expensive ways to fix it. SDFs are much better at representing procedural content than the kind of mesh modeling involved in character animation and rendering.
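
A toy sphere-tracing loop makes the cost concrete: every animated transform in the hierarchy has to be inverted on *every* distance query, and there is one query per march step per pixel (illustrative code, not this engine's renderer):

```python
import math

def sphere_sdf(p):
    # Unit sphere at the origin.
    return math.hypot(*p) - 1.0

def rotated(sdf, angle):
    # Rigidly transforming an SDF means inverse-transforming the query
    # point on every evaluation; an animated skeleton stacks many of these.
    c, s = math.cos(angle), math.sin(angle)
    def f(p):
        x, y, z = p
        return sdf((c * x + s * z, y, -s * x + c * z))
    return f

def march(sdf, origin, direction, max_steps=128, eps=1e-4):
    # Sphere tracing: advance by the distance bound until we reach the surface.
    t, evals = 0.0, 0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = sdf(p)
        evals += 1
        if dist < eps:
            return t, evals
        t += dist
    return None, evals

hit, evals = march(rotated(sphere_sdf, 0.3), (0.0, 0.0, -3.0), (0.0, 0.0, 1.0))
print(hit, evals)  # hits near t = 2.0; every eval paid for the rotation
```
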
MITSardine 7 hours ago||
One possibility, a little backwards maybe, is to produce a discrete SDF from e.g. a mesh, by inserting it in an octree. The caching becomes the SDF itself, basically. This would let rendering be done via the SDF, but other logic could use the mesh (or other spatial data structure).

Or could the engine treat animated objects as traditional meshed objects (both rendering and interactions)? The author says all physics is done with meshes, so such objects could still interact with the game world seemingly easily. I imagine this would be limited to characters and such. I think they would look terrible using interpolation on a fixed grid anyways as a rotation would move the geometry around slightly, making these objects appear "blurry" in motion.
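
A toy 2-D version of that bake-then-interpolate idea, with a point cloud standing in for the mesh and a dense grid standing in for the octree:

```python
import math

points = [(0.5, 0.5), (1.5, 0.5)]  # stand-in geometry; a real bake would
                                   # use signed distances to mesh triangles

def brute_distance(p):
    return min(math.dist(p, q) for q in points)

N, h = 9, 0.25
grid = [[brute_distance((i * h, j * h)) for j in range(N)] for i in range(N)]

def lookup(p):
    # Bilinear read from the baked grid: cheap per query, but the whole
    # region must be re-baked whenever the underlying geometry animates.
    x, y = p[0] / h, p[1] / h
    i, j = int(x), int(y)
    tx, ty = x - i, y - j
    row0 = grid[i][j] * (1 - tx) + grid[i + 1][j] * tx
    row1 = grid[i][j + 1] * (1 - tx) + grid[i + 1][j + 1] * tx
    return row0 * (1 - ty) + row1 * ty

print(lookup((0.5, 0.5)))   # ~0.0: this surface point lies on a grid node
print(lookup((0.55, 0.5)))  # close to the true distance of 0.05
```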

Jarmsy 5 hours ago||
Sampling an implicit function on a grid shifts you to the world of voxel processing, which has its own strengths and weaknesses. Further processing is lossy (like with raster image processing), storage requirements go up, recovering sharp edges is harder...
MITSardine 4 hours ago||
But isn't this what the author is doing already? That's what I got from the video. SDF is sampled on a sparse grid (only cells that cross the level set 0) and then values are sampled by interpolating on the grid rather than full reevaluation.
01HNNWZ0MV43FF 16 hours ago||
His SDF probably puts out a depth buffer, so with some effort (shadows might be hard?) you can just mix it with traditional polygons. The same way raytracing and polygons mix in AAA games.

He's using the SDFs to fill a space sort of like Unreal's Nanite virtual geometry. Nanite also doesn't support general animation. They only recently added support for foliage. So you'd use SDF / Nanite for your "infinite detail" / kit-bashing individual pebbles all the way to the horizon, and then draw polygon characters and props on top of that.

In fact I was surprised to see that Nanite flipped from triangle supremacy to using voxels in their new foliage tech. So maybe the two technologies will converge. The guy who did the initial research for Nanite (his talk also cites Dreams ofc) said that voxels weren't practical. But I guess they hit the limits of what they can do with pixel-sized triangles.

cubefox 10 hours ago||
I think they do now support skeletal meshes with virtual geometry: https://dev.epicgames.com/documentation/en-us/unreal-engine/...

Though it says "experimental". Unclear what that means in practice.

This also mentions "skinning": https://dev.epicgames.com/documentation/en-us/unreal-engine/... I believe that's just another term for skeletal meshes / "bones".

DetroitThrow 17 hours ago|
Such impressive demos and great explanations in the video. Mike, if you're reading this, keep making videos!