3D *.stl surface model convert to 2D image stack?

Question:

OK to start with let me be clear, I am not interested in converting an image stack into a 3D model.

I have an *.stl file (a triangulated surface mesh) & I would like to slice it back into an image stack. I’ve had a look at Slic3r & Meshmixer but they both only output G-code.

So, given that I have the vertices of all the points on the surface (which, incidentally, is NOT convex) & their connectivity, what libraries are out there that could help with this?

My feeling is that I would need to interpolate the boundary on slices that did not pass through known vertices.
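That interpolation is straightforward in principle: for every triangle edge that straddles the slicing plane z = z0, the crossing point is a linear interpolation along the edge, and each crossed triangle contributes one boundary segment. A minimal pure-Python sketch (no mesh library; a "mesh" here is just a list of triangles, and the degenerate case of a vertex lying exactly on the plane is ignored via the strict inequality — a robust implementation would perturb z0 or handle it explicitly):

```python
def slice_triangle(tri, z0):
    """Return the 2D segment (two (x, y) points) where the triangle
    crosses the plane z == z0, or None if it does not cross it."""
    pts = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - z0) * (z2 - z0) < 0:          # edge straddles the plane
            t = (z0 - z1) / (z2 - z1)          # linear interpolation factor
            pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return tuple(pts) if len(pts) == 2 else None

def slice_mesh(mesh, z0):
    """Collect the boundary segments of the slice at height z0."""
    return [s for s in (slice_triangle(t, z0) for t in mesh) if s is not None]
```

Collecting these segments per slice plane and rasterizing them gives one image per slice.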

I’m comfortable with Python & C++ at a push but am willing to broaden my horizons.

Asked By: DrBwts


Answers:

For example, if you can get your mesh to render with OpenGL (by any means inside your app), then to obtain your slice you would simply:

  1. Set your camera so the screen projection plane is parallel to the slice…
  2. Clear the screen buffer as usual, with glClearColor set to the background color
  3. Clear the depth buffer with glClearDepth set to the Z coordinate of the slice in camera space
  4. Set glDepthFunc(GL_EQUAL)
  5. Render the mesh

Something like:

// here set view
glClearColor( 0.0,0.0,0.0,0.0 ); // <0.0,1.0> r,g,b,a
glClearDepth( 0.5 );             // <0.0,1.0> ... 0.0 = z_near, 1.0 = z_far 
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glDepthFunc(GL_EQUAL);
// here render mesh

This will render only the slice whose fragments have Z == slice coordinate. The same can also be done in GLSL by discarding all fragments with a different Z. DirectX should have something similar (I do not use it, so I do not know for sure).

As most meshes are B-rep models (hollow shells), you will obtain only the circumference of your slice, so you will most likely need to fill it afterwards to suit your needs…

You can also experiment with rendering a thick slice … where Z is around a predefined value …
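The fill step mentioned above can also be done on the CPU once you have the slice outline as 2D segments. A minimal sketch of an even-odd (scanline parity) fill, assuming the outline is a list of `((x1, y1), (x2, y2))` segments (all names here are mine, not from any library):

```python
def fill_slice(segments, width, height, scale=1.0):
    """Return a height x width list of 0/1 rows (1 = inside), filled by
    the even-odd rule: a pixel is inside if a horizontal scanline through
    it crosses the boundary an odd number of times before reaching it."""
    image = [[0] * width for _ in range(height)]
    for row in range(height):
        y = (row + 0.5) / scale                 # sample at the pixel centre
        xs = []
        for (x1, y1), (x2, y2) in segments:
            if (y1 - y) * (y2 - y) < 0:         # segment crosses the scanline
                t = (y - y1) / (y2 - y1)
                xs.append(x1 + t * (x2 - x1))
        xs.sort()
        for lo, hi in zip(xs[::2], xs[1::2]):   # fill between crossing pairs
            for col in range(width):
                if lo <= (col + 0.5) / scale <= hi:
                    image[row][col] = 1
    return image
```

The even-odd rule handles non-convex outlines and holes without needing the segments in any particular order.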

Answered By: Spektre

If an HTTP API solution is OK for your purpose then I would suggest the following resource:
Server-side 3D mesh to thumbnail image rendering by http://manifold.metamatic.us

You can POST your 3D mesh (STL, OBJ, or 3DS) to the API endpoint and get a JSON response. It will contain a URL to a 2D image rendering of your 3D model, among other 3D measurement results such as volume, area, bounding box, and build time.

Client libraries that interface with the Manifold API endpoint are available in Python, PHP, and JavaScript, as well as cURL command-line scripts.

I hope this helps in your search. Kindly excuse this answer if you were not looking for a black-box solution and were instead trying to implement the slicing yourself.

Answered By: pX0r

Another option is the following algorithm.
First, convert your mesh to voxels, where each voxel value equals the signed distance to the surface (the mesh has to be closed). Then build the set of slices from the voxel grid.
You can do this with the open-source library MeshLib, which can be called from both C++ and Python code.

The code can look like this:

// (assumes the relevant MeshLib headers are included and `using namespace MR;`)
auto loadRes = MeshLoad::fromAnySupportedFormat( "C:/Meshes/spartan.stl" );
if ( !loadRes.has_value() )
    return false; // loadRes.error() for more info, mesh load failed

const float voxelSize = 0.1f;
// convert the mesh to a signed-distance voxel grid
const auto grid = meshToLevelSet( loadRes.value(), AffineXf3f{}, Vector3f::diagonal( voxelSize ) );
const auto bbox = loadRes->getBoundingBox();

const auto vdbDims = grid->evalActiveVoxelDim();
const Vector3i dims = { vdbDims.x(),vdbDims.y(),vdbDims.z() };
// The smaller the difference between min and max, the higher the contrast of the images.
const float min = 0.0f;
const float max = 0.0f;
const auto axis = SlicePlain::XY;
VdbVolume vdb{ .data = grid, .dims = dims, .voxelSize = Vector3f::diagonal( voxelSize ), .min = min, .max = max };
// write one image per Z layer into the output directory
auto saveRes = VoxelsSave::saveAllSlicesToImage( "C:/Meshes/slices/", vdb, axis );

return saveRes.has_value();
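The mesh-to-voxels step is where MeshLib does the heavy lifting; the slicing itself is simple. A library-free Python sketch of the same pipeline, using a sphere's analytic signed-distance field in place of meshToLevelSet (all names here are mine, and the slices are binary rather than grayscale for brevity — with a real mesh you would compute the distance to the surface instead):

```python
import math

def sphere_sdf(x, y, z, cx, cy, cz, r):
    """Signed distance to a sphere: negative inside, positive outside."""
    return math.dist((x, y, z), (cx, cy, cz)) - r

def sdf_slices(dim, voxel, sdf):
    """Sample the SDF on a dim^3 voxel grid and return dim slices;
    each slice is a dim x dim grid of 0/1 (1 = inside the surface)."""
    slices = []
    for k in range(dim):
        layer = [[1 if sdf((i + 0.5) * voxel, (j + 0.5) * voxel,
                           (k + 0.5) * voxel) < 0 else 0
                  for i in range(dim)] for j in range(dim)]
        slices.append(layer)
    return slices

# Example: a sphere of radius 0.4 centred in a unit cube, 16^3 voxels.
slices = sdf_slices(16, 1 / 16,
                    lambda x, y, z: sphere_sdf(x, y, z, 0.5, 0.5, 0.5, 0.4))
```

Each layer can then be written out as an image, which is exactly what saveAllSlicesToImage does with the VDB grid above.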

Answered By: Alex Koksin