

#PARTICLE PLAYGROUND DRAW ORDER UPDATE#
This actually works pretty well, especially since I was able to multithread the instantiation and destruction of new slices and particle positions, so it doesn't impact the framerate. So when you view the slices from one of the sides, you frequently see particles belonging to slices further away drawn on top of particles that belong to slices closer to the camera.
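The multithreaded slice building mentioned above might follow a pattern like this: compute the particle positions and colors on a worker thread, then let the main thread pick up the result, since Unity's own API must only be touched from the main thread. This is my own sketch of that pattern, not the author's code — all names (`SliceBuilder`, `BuildAsync`, the 64×64 slice layout) are assumptions:

```csharp
// Sketch only -- assumes .NET threads do the heavy lifting (building the
// position/color arrays) while Unity API calls stay on the main thread.
using System.Threading;
using UnityEngine;

public class SliceBuilder : MonoBehaviour
{
    volatile bool ready;   // main thread polls this in Update() before emitting
    Vector3[] positions;
    Color[] colors;

    // Kick off position generation for one slice on a worker thread.
    public void BuildAsync(Color[] pixels, Vector3 origin, float spacing, float jitter)
    {
        var rng = new System.Random(); // UnityEngine.Random is main-thread only
        new Thread(() =>
        {
            int side = (int)System.Math.Sqrt(pixels.Length); // 64 for a 4096-particle slice
            var pos = new Vector3[pixels.Length];
            for (int i = 0; i < pixels.Length; i++)
            {
                // Small random offset to hide the regular grid.
                float jx = ((float)rng.NextDouble() * 2f - 1f) * jitter;
                float jy = ((float)rng.NextDouble() * 2f - 1f) * jitter;
                pos[i] = origin + new Vector3((i % side) * spacing + jx,
                                              (i / side) * spacing + jy, 0f);
            }
            positions = pos;
            colors = pixels;
            ready = true;
        }).Start();
    }
}
```

Since only pure struct math (`Vector3` arithmetic) happens off the main thread, this stays within Unity's threading rules while keeping the per-slice work off the frame budget.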
#PARTICLE PLAYGROUND DRAW ORDER HOW TO#
I'm sweating a little over how to display some sensor data in a sensible manner these days. The sensor data are basically 2D images that make up cross-sections of a body of matter inside a container, and my task is to recreate, with 3D geometry, the objects that the sensor detected. The assignment is totally analogous to the tomographic cross-sections generated by a CAT scanner in hospitals, and the algorithms that subsequently reconstruct the organ that was scanned. I'm aware that 3D geometry from cross-sections is commonly created using marching cubes and edge detection, but the time I've been given doesn't allow me to implement this algorithm properly just yet, so I've opted for something simpler to begin with: particles.

To fake volumetric reconstruction from the tomograms, I've instantiated collections of particles ordered in a slice of space, aligned with the position of the 2D image at the point in space where it was detected. I then run through all the particles in the slice and give them a color corresponding to the pixel color they map to in the tomogram. I made a little sketch (all hail PowerPoint) to try and illustrate the idea. Imagine you run this sensor past a pyramid that lies on its side; the resulting tomograms might look like those in the image. Then, I color grids of particles like the ones below. When placed close together and given a bit of jitter, they can fake the reconstruction of the pyramid well enough to be satisfactory.

Obviously, in the application, there are far more particles per slice: 4096, to be exact. To render them, I have a single particle system with a MeshParticleEmitter attached, whose mesh property is null and whose Emit property is disabled. Its animator has the "Does Animate" property set to false as well. Then, every update in a script, I just call MeshParticleEmitter.Emit a number of times with the particle positions and the color in that pixel, so I have full control over the colors and the positions the particles end up at.
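The per-update Emit calls described above might look something like this. This is a minimal sketch under my own assumptions (a 64×64 `tomogram` texture giving the 4096 particles per slice, and my own field names), using the legacy `ParticleEmitter.Emit(position, velocity, size, energy, color)` overload that gives per-particle control:

```csharp
// Sketch only -- assumes the legacy (pre-Shuriken) Unity particle API.
using UnityEngine;

public class SliceEmitter : MonoBehaviour
{
    public ParticleEmitter emitter;   // the MeshParticleEmitter (mesh = null, Emit disabled)
    public Texture2D tomogram;        // one cross-section, e.g. 64x64 -> 4096 particles
    public float particleSize = 0.1f;
    public float jitter = 0.02f;      // small random offset to hide the grid

    void Update()
    {
        for (int y = 0; y < tomogram.height; y++)
        {
            for (int x = 0; x < tomogram.width; x++)
            {
                Color c = tomogram.GetPixel(x, y);
                Vector3 pos = transform.position
                            + new Vector3(x, y, 0f) * particleSize
                            + Random.insideUnitSphere * jitter;
                // One-shot emit with explicit position and color; velocity is zero,
                // and energy (lifetime) only needs to last until the next re-emit.
                emitter.Emit(pos, Vector3.zero, particleSize, Time.deltaTime * 2f, c);
            }
        }
    }
}
```

Re-emitting every frame is what makes the explicit position and color control possible, at the cost of issuing one Emit call per particle per update.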
