1/4" Jack Of All Trades

Computational Environments Easter Break: Audiovisual Pointcloud Workshop

Updated: Apr 29, 2021





I took part in an online workshop about audiovisual point clouds led by Kathrin Hunze.

I joined as a passive participant, which meant I was not provided with the scripts or the example project and was expected to follow the demonstration and written instructions to set up the Unity session myself. Before discussing the process, I wanted to reflect on the creative principle at work: although photogrammetry can be a complicated, state-of-the-art approach to generating photoreal 3D objects, I learnt that it can also serve an artistic role. Using lower-resolution photos from a phone camera, with awkward and imprecise angles and lighting, returns unpredictable results that, in point cloud form, carry the subjective visual connotations of dream sequences.


As with my background in audio, I love 'analogue' processes that take time to deliver uncertain results, in the same way that a Python script coming back to me an hour later with its horde of data excites my imagination. This digital process feels very 'analogue' for that reason.

Two interpolated point clouds.


From Photo To Interactive Point Cloud....


For subject matter I chose a small blue geode I had lying around. I used my phone (a Samsung A8) to take around 50 photos (a method using individual frames from a video was also mentioned, though that means hours rather than minutes of analysis) and loaded them into Regard3D, which analyses them and outputs match data for MeshLab to turn into a point cloud (.ply format), normalize it and fix the origin to keep the Unity side tidy. Also covered was the optional step of opening the cloud in CloudCompare, which allows point clouds to be resized (resampled to a chosen number of points) - important if you wish to interpolate between them or if efficiency is an issue. At that point it is ready for importing into Unity via the PCX package.
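
A quick way to sanity-check the import from script is shown in the rough sketch below; the component and field names are mine, not anything from PCX or the workshop.

using UnityEngine;

// Logs whether an imported point cloud mesh can be read from script
// and how many points it contains.
public class PointCloudCheck : MonoBehaviour
{
    [SerializeField] Mesh pointCloud; // drag the imported .ply mesh asset here

    void Start()
    {
        Debug.Log($"Readable: {pointCloud.isReadable}, points: {pointCloud.vertexCount}");
    }
}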


I ran into an issue accessing and updating the points in the point cloud meshes: I was getting an error saying that the mesh was not marked as readable. Normally this is solved simply by selecting the asset in the project view, which brings up an inspector pane where the user can mark the mesh as readable. That was not the case with the imported .ply files, however; no such option was available.

My troubleshooting started with redoing the point cloud export process in all of the software we had used, as I was certain it was human error - the active participants in the meeting were able to get their demo projects running. The problem persisted even after importing point clouds that had been shared with the group in the workshop, which gave me hope that the fault was not mine. Next I began exploring the code of the .ply importer package, trying to understand how the raw binary file was being converted into a mesh. After a lot of head scratching I had narrowed it down to a function within the PlyImporter script, but I was not willing to start changing code I did not fully understand. I then thought to check the questions and issues on the PCX GitHub, and sure enough somebody in the closed issues had already encountered this problem. The solution was actually quite simple, only requiring one bool to be changed at line 185 (correct at time of writing) of the PlyImporter script from PCX.

After tweaking this, my simple script that stored the original point positions of two point cloud meshes and interpolated between them began to function as intended. The only prerequisite is that both point clouds need exactly the same number of points, which can be achieved quite simply using CloudCompare. I have already begun thinking of uses for this newfound technique in existing and planned projects.
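
For reference, the sketch below is roughly along the lines of the script described above rather than the exact code; the component and field names are invented here, and it assumes both meshes have already been given matching point counts in CloudCompare and made readable via the importer fix.

using UnityEngine;

// Blends a MeshFilter's point cloud between two source clouds.
// Assumes both source meshes have exactly the same number of points
// and are marked readable.
[RequireComponent(typeof(MeshFilter))]
public class PointCloudLerp : MonoBehaviour
{
    [SerializeField] Mesh cloudA;
    [SerializeField] Mesh cloudB;
    [Range(0f, 1f)] [SerializeField] float blend;

    Vector3[] pointsA, pointsB, blended;
    Mesh workingMesh;

    void Start()
    {
        pointsA = cloudA.vertices;          // original positions, cached once
        pointsB = cloudB.vertices;
        blended = new Vector3[pointsA.Length];

        workingMesh = Instantiate(cloudA);  // work on a copy so the asset stays untouched
        GetComponent<MeshFilter>().mesh = workingMesh;
    }

    void Update()
    {
        for (int i = 0; i < blended.Length; i++)
            blended[i] = Vector3.Lerp(pointsA[i], pointsB[i], blend);

        workingMesh.vertices = blended;
        workingMesh.RecalculateBounds();
    }
}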


Next Step: Compute Shaders & Honing Photogrammetry Practice


I quickly ran into CPU issues; it turns out moving thousands of points around simultaneously can be taxing. However, something that has kept popping up throughout my two-and-a-bit years of Unity work is compute shaders. The concept initially intimidated me, but I now feel at a stage where I could attempt to implement one to solve this problem and work with much larger point clouds. Kathrin mentioned using an external GPU for some of her performances, but not whether she used compute shaders.
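
To make that next step concrete, the C# side of a compute shader version might look something like the sketch below. The PointLerp.compute asset, its CSMain kernel and the buffer names are all hypothetical - nothing here comes from the workshop or from PCX.

using UnityEngine;

// GPU sketch of the same interpolation. The hypothetical PointLerp.compute
// kernel would take two StructuredBuffer<float3> inputs, a
// RWStructuredBuffer<float3> output and a float "blend", and lerp each
// point in parallel with [numthreads(64,1,1)].
[RequireComponent(typeof(MeshFilter))]
public class PointCloudLerpGPU : MonoBehaviour
{
    [SerializeField] ComputeShader lerpShader;   // hypothetical PointLerp.compute
    [SerializeField] Mesh cloudA;
    [SerializeField] Mesh cloudB;
    [Range(0f, 1f)] [SerializeField] float blend;

    ComputeBuffer bufferA, bufferB, bufferOut;
    Vector3[] result;
    Mesh workingMesh;
    int kernel;

    void Start()
    {
        var a = cloudA.vertices;
        var b = cloudB.vertices;
        result = new Vector3[a.Length];

        bufferA = new ComputeBuffer(a.Length, sizeof(float) * 3);
        bufferB = new ComputeBuffer(b.Length, sizeof(float) * 3);
        bufferOut = new ComputeBuffer(a.Length, sizeof(float) * 3);
        bufferA.SetData(a);
        bufferB.SetData(b);

        kernel = lerpShader.FindKernel("CSMain");
        lerpShader.SetBuffer(kernel, "pointsA", bufferA);
        lerpShader.SetBuffer(kernel, "pointsB", bufferB);
        lerpShader.SetBuffer(kernel, "pointsOut", bufferOut);

        workingMesh = Instantiate(cloudA);
        GetComponent<MeshFilter>().mesh = workingMesh;
    }

    void Update()
    {
        lerpShader.SetFloat("blend", blend);
        lerpShader.Dispatch(kernel, Mathf.CeilToInt(result.Length / 64f), 1, 1);

        bufferOut.GetData(result);           // CPU readback: simplest, not fastest
        workingMesh.vertices = result;
        workingMesh.RecalculateBounds();
    }

    void OnDestroy()
    {
        bufferA?.Release();
        bufferB?.Release();
        bufferOut?.Release();
    }
}

Reading the result back with GetData every frame still costs CPU time, so the real win would come from keeping the points on the GPU and rendering straight from the buffer - something to explore alongside the shader itself.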


I see a lot of promise in these point clouds for achieving interesting and engaging visuals, and from my experiments I have also discovered a need to export the interpolated point clouds that I make in Unity. My first aim is to combine the idea of an IK procedural animation rig with point cloud models.
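
To get a head start on that export step, a minimal ASCII .ply writer could look something like the sketch below; the class and method names are my own, and it only writes point positions, not colours.

using System.IO;
using System.Text;
using UnityEngine;

// Writes a mesh's current vertex positions out as a minimal ASCII .ply file,
// so an interpolated cloud posed in Unity can be taken back into other tools.
public static class PlyExporter
{
    public static void Export(Mesh mesh, string path)
    {
        var verts = mesh.vertices;
        var sb = new StringBuilder();

        // Minimal ASCII .ply header: positions only, no colours or faces.
        sb.AppendLine("ply");
        sb.AppendLine("format ascii 1.0");
        sb.AppendLine($"element vertex {verts.Length}");
        sb.AppendLine("property float x");
        sb.AppendLine("property float y");
        sb.AppendLine("property float z");
        sb.AppendLine("end_header");

        foreach (var v in verts)
            sb.AppendLine($"{v.x} {v.y} {v.z}");

        File.WriteAllText(path, sb.ToString());
    }
}

Calling something like PlyExporter.Export(workingMesh, Application.dataPath + "/blended.ply") from the interpolation script would do for a first test, and both MeshLab and CloudCompare should open the result.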


I also intend to spend time honing the photogrammetry process itself, experimenting with objects, lighting and capture techniques (exporting video frames as stills is apparently a slow but precise way to go) to discover more about how I can get interesting and otherworldly results from it.




