SDF Slicer
I've been experimenting with a lot of different algorithms and libraries, and it doesn't happen often that I get to use all of those experiments in one project. This is one of them: my endgame in terms of experiment crossovers.
What did I make?
I built a drag-and-drop node graph builder for defining 3D scenes as SDFs, which WebGPU can then slice using the marching squares algorithm.
Let's break that down in terms of previous experiments.
Node graph builder that defines functions
I made a React codebase visualiser that takes in a GitHub URL and plots all of the React components and hooks as a node graph. Every constant could be changed straight from the graph nodes, and intermediate outputs could be rendered to a frame. That left me with all of the code I needed to reverse the process and build new React components and pages from the same nodes. This is what I reused to define an SDF 3D scene.
SDF 3D scene
A Signed Distance Field/Function (SDF) is a function that takes in a coordinate and returns the distance from that coordinate to the model's surface, with negative values for coordinates inside the model.
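To make that concrete, here's a minimal sketch of the classic sphere SDF (my own illustrative code, not output from the node graph builder):

```ts
// A sphere of a given radius, centred at the origin.
// Positive outside the surface, zero on it, negative inside.
type Vec3 = { x: number; y: number; z: number };

const sphereSDF =
  (radius: number) =>
  (p: Vec3): number =>
    Math.hypot(p.x, p.y, p.z) - radius;

sphereSDF(10)({ x: 0, y: 0, z: 0 });  // -10: the centre is 10 units inside
sphereSDF(10)({ x: 10, y: 0, z: 0 }); //   0: exactly on the surface
```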
I used SDFs in the past to build a 3D scene that I could ray trace and navigate around in. That was while experimenting with the then-new PlayStation DualSense controller through Chrome's Bluetooth/HID APIs, because gamepad support for it hadn't landed yet.
Here I used SDFs to find inset paths from the model's surface for free, which was harder to do when building my previous slicer.
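The trick is that an inset path is just a different iso-level of the same field: the points at depth r inside the surface are exactly where the SDF returns -r. A hedged sketch (the names are mine):

```ts
// Offsetting an SDF by +r turns "distance -r inside the original surface"
// into the new zero level, so tracing the zero contour of the offset field
// yields a path inset by r. With a mesh you'd have to compute this offset
// geometrically; with an SDF it's one addition.
type SDF3 = (p: { x: number; y: number; z: number }) => number;

const inset =
  (sdf: SDF3, r: number): SDF3 =>
  (p) =>
    sdf(p) + r;
```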
Slicing
Next to my love for JavaScript, I also embody the maker. I built my first 3D printer 10 years ago. What you learn pretty early on is that converting a 3D model into machine movement instructions (GCODE) involves an intermediary step called slicing: dividing the 3D model into 2D slices and then converting those slices into printer movements. This entire project is aimed at skipping the 3D model step and slicing the 3D definition (the SDF) directly.
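Without a mesh, a slice is nothing more than the SDF evaluated at a fixed height. Here's a sketch of sampling one layer into a 2D grid of distances (the bounds and resolution parameters are illustrative, not my actual slicer settings):

```ts
// Sample one horizontal slice of an SDF into a 2D grid of signed distances,
// centred on the origin. Each cell holds the distance at (x, y) for the
// given layer height z.
type SDF3 = (p: { x: number; y: number; z: number }) => number;

function sampleSlice(
  sdf: SDF3,
  z: number,          // layer height to slice at
  size: number,       // side length of the sampled square area
  resolution: number  // samples per axis (>= 2)
): number[][] {
  const step = size / (resolution - 1);
  const grid: number[][] = [];
  for (let j = 0; j < resolution; j++) {
    const row: number[] = [];
    for (let i = 0; i < resolution; i++) {
      row.push(sdf({ x: -size / 2 + i * step, y: -size / 2 + j * step, z }));
    }
    grid.push(row);
  }
  return grid;
}
```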
Square marching
This is knowledge from my first ever 3D project in the browser. I wanted to make a simple 3D modeller where you paint in 3D by moving a virtual reality controller around, increasing the values of points on a grid, and then use the cube marching algorithm to convert those grid values into a 3D surface. Cube marching looks at a cube of 8 points and pulls the vertex data that matches the values of those 8 points from a lookup table. It then interpolates the vertex positions to match the weights of the points.
This algorithm also has a 2D brother called square marching. Instead of finding a surface, we get an outline at a specific distance from the model. I sample points straight from the SDF and let the square marching algorithm find paths with a specific inset from the surface, generating printing paths in the process.
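Here's a minimal CPU sketch of square marching over a sampled grid like the one from sampleSlice above. My actual version runs on WebGPU and joins the segments into closed toolpaths, which I've left out. Pass iso = 0 for the surface, or iso = -r for a path inset by r:

```ts
type Pt = { x: number; y: number };

// Edge indices per cell: 0 = top, 1 = right, 2 = bottom, 3 = left.
// For each of the 16 corner sign patterns, the edge pairs a contour
// segment connects (one consistent choice for the two ambiguous cases).
const SEGMENTS: number[][][] = [
  [],       [[3, 2]],         [[2, 1]],         [[3, 1]],
  [[0, 1]], [[3, 0], [2, 1]], [[0, 2]],         [[3, 0]],
  [[0, 3]], [[0, 2]],         [[0, 1], [2, 3]], [[0, 1]],
  [[3, 1]], [[2, 1]],         [[3, 2]],         [],
];

// Extract unoriented contour segments (in cell coordinates) where the
// sampled field crosses the iso level.
function marchingSquares(grid: number[][], iso = 0): [Pt, Pt][] {
  const segments: [Pt, Pt][] = [];
  for (let j = 0; j < grid.length - 1; j++) {
    for (let i = 0; i < grid[j].length - 1; i++) {
      // Corner values: a = top-left, b = top-right, c = bottom-right, d = bottom-left.
      const a = grid[j][i], b = grid[j][i + 1];
      const c = grid[j + 1][i + 1], d = grid[j + 1][i];
      const index =
        (a < iso ? 8 : 0) | (b < iso ? 4 : 0) | (c < iso ? 2 : 0) | (d < iso ? 1 : 0);
      // Interpolate where along an edge the iso level sits (0..1).
      const t = (v0: number, v1: number) => (iso - v0) / (v1 - v0);
      const point = (edge: number): Pt =>
        edge === 0 ? { x: i + t(a, b), y: j } :
        edge === 1 ? { x: i + 1, y: j + t(b, c) } :
        edge === 2 ? { x: i + t(d, c), y: j + 1 } :
                     { x: i, y: j + t(a, d) };
      for (const [e0, e1] of SEGMENTS[index]) {
        segments.push([point(e0), point(e1)]);
      }
    }
  }
  return segments;
}
```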
But why?
- I have wanted to build a slicer that can handle arbitrary slicing planes for a while. Square marching over points sampled from an SDF gives me this option (there's a rough sketch of the sampling side at the end of this post). If you have a good idea on how to implement defining those planes, please contact me!
- Now I can 3D model in an environment where I can also directly visualise my GCODE.
The important reason:
- I dreamt (literally, as in, in my sleep) of combining these techniques to do exactly this. So I wanted to know if it was actually feasible.
Consider my itch scratched.
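As for the arbitrary slicing planes mentioned above: the sampling half is straightforward once a plane is given as an origin plus two orthonormal in-plane axes; it's the interface for defining and manipulating those planes that I'm still unsure about. A hypothetical sketch of the sampling half:

```ts
// Sample an SDF over an arbitrary slicing plane described by an origin and
// two orthonormal in-plane axes u and v. The resulting grid feeds straight
// into the square marching step. Hypothetical helper, not code from the project.
type Vec3 = { x: number; y: number; z: number };
type SDF3 = (p: Vec3) => number;

function samplePlane(
  sdf: SDF3,
  origin: Vec3,
  u: Vec3,            // unit vector along the plane's local x axis
  v: Vec3,            // unit vector along the plane's local y axis
  size: number,
  resolution: number
): number[][] {
  const step = size / (resolution - 1);
  const grid: number[][] = [];
  for (let j = 0; j < resolution; j++) {
    const row: number[] = [];
    for (let i = 0; i < resolution; i++) {
      const s = -size / 2 + i * step;
      const t = -size / 2 + j * step;
      row.push(sdf({
        x: origin.x + s * u.x + t * v.x,
        y: origin.y + s * u.y + t * v.y,
        z: origin.z + s * u.z + t * v.z,
      }));
    }
    grid.push(row);
  }
  return grid;
}
```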