Auto Skinning

This repo houses various experiments in using web technologies to auto-skin 3D meshes. The process requires heavy number crunching, so it uses the GPU to execute several compute shaders that produce the final vertex bone indices & weights. To start, it uses raw WebGL over a Three.js GL context to execute all the compute shaders. Most of the steps use GPGPU & data textures for computation, since they need random read access, which can only be done with texture lookups. The rest use TransformFeedback and GL buffers for computation & data storage.

Ultimately, the purpose of this repo is to prototype one of the building blocks for creating a web application similar to Adobe's Mixamo. If an app can use the user's GPU for the heavy lifting, then there is no need for a backend service to perform that process. Coupled with the IK animation retargeting that's already prototyped, it's quite possible to build a Mixamo clone that runs everything on the front end.

Source Code : https://github.com/sketchpunklabs/autoskinning


WebGL ( ThreeJS ) Prototypes
  • Refactor : First complete GPU Prototype

    Recoding all the prototypes into a semi-messy API that executes the entire process on a mesh. Various visual debugging is available, plus the mesh is animated as a final test that the skinning data is usable.

  • Compute vertex data

    With voxel-to-bone distances known, this prototype applies them to vertices. For each vertex, we find which voxel it resides in, then add the vertex's distance to that voxel's midpoint to the voxel's bone distance, giving the shortest path from the vertex to a bone.
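
    The per-vertex step can be sketched on the CPU (the repo runs it in a compute shader). The grid parameters and the `voxelBoneDist` array are illustrative assumptions, not the repo's actual API:

```javascript
// CPU sketch of the per-vertex pass. All names are illustrative:
// vertices      - flat [x,y,z, x,y,z, ...] positions
// gridMin       - min corner of the voxel grid
// cellSize      - edge length of one voxel
// dims          - [w,h,d] voxel counts per axis
// voxelBoneDist - per-voxel shortest distance to a bone (from the crawl step)
function vertexBoneDistances(vertices, gridMin, cellSize, dims, voxelBoneDist) {
  const out = new Float32Array(vertices.length / 3);
  const cell = (v, a) =>
    Math.min(dims[a] - 1, Math.max(0, Math.floor((v - gridMin[a]) / cellSize)));
  for (let i = 0; i < out.length; i++) {
    const p = [vertices[i * 3], vertices[i * 3 + 1], vertices[i * 3 + 2]];
    // Which voxel does the vertex reside in?
    const c = [cell(p[0], 0), cell(p[1], 1), cell(p[2], 2)];
    const idx = c[0] + c[1] * dims[0] + c[2] * dims[0] * dims[1];
    // Distance from the vertex to that voxel's midpoint
    const mid = c.map((ci, a) => gridMin[a] + (ci + 0.5) * cellSize);
    const toMid = Math.hypot(p[0] - mid[0], p[1] - mid[1], p[2] - mid[2]);
    // Shortest path ≈ the voxel's bone distance plus the hop to its midpoint
    out[i] = voxelBoneDist[idx] + toMid;
  }
  return out;
}
```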

  • Compute voxel crawl

    With the voxel-bone intersection as the starting point, the crawl iteratively steps through the voxel chunk, turning on neighboring voxels that are connected on a path to a bone. Each voxel gets assigned the shortest distance to the bone.
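
    A CPU sketch of the crawl, where one sweep of the outer loop stands in for one GPU dispatch; the names and the unit step cost are illustrative assumptions:

```javascript
// dist  - per-voxel distance, 0 for voxels intersecting the bone, Infinity elsewhere
// solid - per-voxel occupancy flag (1 = filled)
// dims  - [w,h,d] voxel counts per axis
function crawl(dist, solid, dims) {
  const [w, h, d] = dims;
  const idx = (x, y, z) => x + y * w + z * w * h;
  const nbrs = [[1,0,0],[-1,0,0],[0,1,0],[0,-1,0],[0,0,1],[0,0,-1]];
  let changed = true;
  while (changed) {                 // one sweep ≈ one GPU pass
    changed = false;
    for (let z = 0; z < d; z++) for (let y = 0; y < h; y++) for (let x = 0; x < w; x++) {
      const i = idx(x, y, z);
      if (!solid[i]) continue;      // only crawl through filled voxels
      for (const [dx, dy, dz] of nbrs) {
        const nx = x + dx, ny = y + dy, nz = z + dz;
        if (nx < 0 || ny < 0 || nz < 0 || nx >= w || ny >= h || nz >= d) continue;
        const n = idx(nx, ny, nz);
        // Take the shorter path through a neighbor (step cost 1 voxel)
        if (solid[n] && dist[n] + 1 < dist[i]) { dist[i] = dist[n] + 1; changed = true; }
      }
    }
  }
  return dist;
}
```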

  • Raycast for procedural skeleton generation

    This is a side prototype for procedurally setting up a skeleton on a mesh using 2D points. Each point is turned into a raycast that finds the entry & exit positions where it intersects the mesh. The center of those two points becomes the final position for a bone.
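
    A minimal sketch of the bone-placement math, using a box in place of the mesh for the entry/exit hits (the actual prototype raycasts the mesh's triangles); the function and parameter names are assumptions:

```javascript
// Shoot a ray through a shape, take the entry & exit hits, and place the
// bone at their midpoint. A slab test against an AABB stands in for the mesh.
function boneFromRay(origin, dir, boxMin, boxMax) {
  let tMin = -Infinity, tMax = Infinity;
  for (let a = 0; a < 3; a++) {
    const inv = 1 / dir[a];
    let t0 = (boxMin[a] - origin[a]) * inv;
    let t1 = (boxMax[a] - origin[a]) * inv;
    if (inv < 0) [t0, t1] = [t1, t0];
    tMin = Math.max(tMin, t0);
    tMax = Math.min(tMax, t1);
  }
  if (tMax < tMin) return null;                       // ray misses the shape
  const at = t => origin.map((o, a) => o + dir[a] * t);
  const entry = at(tMin), exit = at(tMax);
  // Bone position = center of the entry & exit points
  return entry.map((e, a) => (e + exit[a]) * 0.5);
}
```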

  • Compute Voxel-Bone Intersection with GLTF Mesh

    Same as the previous, but this time using a 3D model of a sword.

  • Compute Voxel-Bone Intersection

    This prototype turns skeleton data into a data texture that a compute shader then uses for voxel intersection testing on the solid voxel shape from the previous prototype. This step finds all the voxels that contain any of the bones.
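
    The per-voxel test can be sketched as a segment-vs-AABB check, roughly what each shader invocation would perform for one voxel; the function and parameter names are assumptions:

```javascript
// Does the bone segment (head → tail) pass through this voxel's AABB?
// A slab test clipped to the segment's [0,1] parameter range.
function segmentHitsVoxel(head, tail, boxMin, boxMax) {
  let t0 = 0, t1 = 1;                       // segment parameter range
  for (let a = 0; a < 3; a++) {
    const d = tail[a] - head[a];
    if (Math.abs(d) < 1e-8) {
      // Segment is parallel to this slab; must already be inside it
      if (head[a] < boxMin[a] || head[a] > boxMax[a]) return false;
      continue;
    }
    let tn = (boxMin[a] - head[a]) / d;
    let tf = (boxMax[a] - head[a]) / d;
    if (tn > tf) [tn, tf] = [tf, tn];
    t0 = Math.max(t0, tn);
    t1 = Math.min(t1, tf);
    if (t0 > t1) return false;              // slabs no longer overlap
  }
  return true;
}
```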

  • Compute voxel fill

    The previous prototype creates a voxel shell; for auto skinning this shell needs to be filled in. So this prototype goes through & tries to fill in all the empty space inside the voxel shape.
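
    One way to sketch the fill on the CPU is a flood fill of the empty space from the grid boundary, then marking anything the flood never reached as interior (the repo does this with GPU passes instead); names are illustrative:

```javascript
// solid - per-voxel occupancy flags (the shell from the previous step)
// dims  - [w,h,d] voxel counts per axis
function fillInterior(solid, dims) {
  const [w, h, d] = dims;
  const idx = (x, y, z) => x + y * w + z * w * h;
  const outside = new Uint8Array(solid.length);
  const stack = [];
  // Seed the flood with every empty voxel on the grid boundary
  for (let z = 0; z < d; z++) for (let y = 0; y < h; y++) for (let x = 0; x < w; x++) {
    if ((x === 0 || y === 0 || z === 0 || x === w - 1 || y === h - 1 || z === d - 1)
        && !solid[idx(x, y, z)]) {
      outside[idx(x, y, z)] = 1; stack.push([x, y, z]);
    }
  }
  // Flood-fill the empty space reachable from outside the shell
  while (stack.length) {
    const [x, y, z] = stack.pop();
    for (const [dx, dy, dz] of [[1,0,0],[-1,0,0],[0,1,0],[0,-1,0],[0,0,1],[0,0,-1]]) {
      const nx = x + dx, ny = y + dy, nz = z + dz;
      if (nx < 0 || ny < 0 || nz < 0 || nx >= w || ny >= h || nz >= d) continue;
      const n = idx(nx, ny, nz);
      if (!solid[n] && !outside[n]) { outside[n] = 1; stack.push([nx, ny, nz]); }
    }
  }
  // Anything empty that the flood never reached is interior: fill it
  for (let i = 0; i < solid.length; i++) if (!solid[i] && !outside[i]) solid[i] = 1;
  return solid;
}
```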

  • Compute voxelize a mesh & debug normals

    Same as before, but this time storing the average normal of all the triangles inside each voxel. The data is then visualized for debugging. Normal data is needed for the next step in auto skinning.
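
    Averaging the normals that land in a voxel is just an accumulate-and-normalize; a minimal sketch (done per-voxel in the shader, names are illustrative):

```javascript
// normals - array of [x,y,z] triangle normals that fell into one voxel
function averageNormal(normals) {
  // Sum the contributing normals...
  const sum = normals.reduce(
    (s, n) => [s[0] + n[0], s[1] + n[1], s[2] + n[2]], [0, 0, 0]);
  // ...then normalize the sum to get the voxel's average direction
  const len = Math.hypot(sum[0], sum[1], sum[2]) || 1;
  return sum.map(v => v / len);
}
```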

  • Compute voxelize a mesh

    This prototype turns a mesh into a data texture that a compute shader can then use to create voxel data. The resulting voxel data is then visualized for debugging.
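
    A rough CPU sketch of the voxelization pass, marking every voxel whose bounds overlap a triangle's bounding box (coarser than a true triangle-box test, but it shows the data flow); names are assumptions:

```javascript
// tris     - array of triangles, each [[x,y,z],[x,y,z],[x,y,z]]
// gridMin  - min corner of the voxel grid
// cellSize - edge length of one voxel
// dims     - [w,h,d] voxel counts per axis
function voxelizeTriangles(tris, gridMin, cellSize, dims) {
  const [w, h, d] = dims;
  const solid = new Uint8Array(w * h * d);
  const cell = (v, a) =>
    Math.min(dims[a] - 1, Math.max(0, Math.floor((v - gridMin[a]) / cellSize)));
  for (const tri of tris) {
    // Triangle bounds → range of candidate cells on each axis
    const c0 = [0, 1, 2].map(a => cell(Math.min(tri[0][a], tri[1][a], tri[2][a]), a));
    const c1 = [0, 1, 2].map(a => cell(Math.max(tri[0][a], tri[1][a], tri[2][a]), a));
    for (let z = c0[2]; z <= c1[2]; z++)
      for (let y = c0[1]; y <= c1[1]; y++)
        for (let x = c0[0]; x <= c1[0]; x++)
          solid[x + y * w + z * w * h] = 1;   // mark the voxel as shell
  }
  return solid;
}
```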

  • GPGPU & DataTextures

    Implementing a simple API to execute shaders as compute passes that save their results into a data texture. Shader & texture objects are created using raw WebGL code that ties into Three.js's GL context.
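
    The bookkeeping behind such an API can be sketched as: pick a texture size that holds N texels of data, and map an item index to the UV a shader would sample. A hypothetical sketch, not the repo's actual API:

```javascript
// Smallest square texture that can hold `count` data texels
function texSizeFor(count) {
  const side = Math.ceil(Math.sqrt(count));
  return [side, side];
}

// UV a shader would use to fetch item `i` from that texture,
// sampling at the texel center: (x + 0.5) / width, (y + 0.5) / height
function indexToUV(i, size) {
  return [((i % size[0]) + 0.5) / size[0],
          (Math.floor(i / size[0]) + 0.5) / size[1]];
}
```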

WebGPU Prototypes
  1. Coming when I have time

Resources
  1. @_naam twitter post that inspired this work
  2. White Paper : Geodesic Voxel Binding for Production Character Meshes
  3. CPU Prototype based on white paper
  4. Wolfire Volumetric Head Diffusion Skinning
  5. Wolfire Triangle-Mesh Voxelization
  6. Bronsonzgeb - Automatic Mesh Skinning in Unity ( Incomplete )
  7. WebGL - GPGPU Tutorial