# Project Introduction

The project began with no particular skinning goal in mind. The first thing we decided to do was build an authoring tool so we could create skins as needed. The standard method of authoring skins is through a surface painting interface. From the beginning we thought that, by making a few improvements to the standard skinning pipeline, we could make it far less time-consuming and painful for the skin author. The first improvement is that all mesh vertex distance computations should be geodesic distances over the mesh surface, not straight-line distances in space. This better reflects what we want "close" vertices to mean. With that in mind, I started on the painting interface.
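As an illustration of the "distance over the mesh" idea, geodesic distance can be approximated by running Dijkstra's algorithm over the mesh's edge graph. The sketch below is a minimal illustration of that approach, not the project's actual implementation; the function name and data layout are assumptions.

```python
import heapq
import math

def geodesic_distances(vertices, edges, source):
    """Approximate over-the-surface distance from `source` to every vertex
    by running Dijkstra's algorithm on the mesh edge graph.
    `vertices` is a list of (x, y, z) tuples; `edges` a list of (i, j) pairs."""
    # Build an adjacency list weighted by Euclidean edge length.
    adj = {i: [] for i in range(len(vertices))}
    for i, j in edges:
        d = math.dist(vertices[i], vertices[j])
        adj[i].append((j, d))
        adj[j].append((i, d))
    dist = {i: math.inf for i in range(len(vertices))}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale queue entry
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```

On a unit square with only boundary edges, the opposite corner is at geodesic distance 2.0 even though it is only √2 away in space, which is exactly the distinction that matters for deciding which vertices are "close."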

# Painting

I read a paper entitled Adaptive Unwrapping for Interactive Texture Painting and got some good ideas about what is important in a painting interface. My eventual implementation is not closely related to the system described in the paper, but its ideas about what a paint stroke should mean and how the interface should provide feedback were a good guide.

The current painting interface includes tools to replace, add to, subtract from, flood, and smooth the current influence's weight value. These tools operate on only one influence at a time. There is also a rubber-stamp-like tool that copies all of the influences from an initially sampled vertex to each painted vertex. The author can choose to paint only on front-facing or back-facing polygons, and can adjust the current paint value, the brush radius, and the brush neighborhood value. The brush neighborhood is a geodesic distance threshold on how far away the next painted vertex can be.
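A minimal sketch of how the replace/add/subtract stroke tools might be applied within the brush neighborhood is shown below. The function name, the `mode` strings, and the clamping of weights to [0, 1] are illustrative assumptions, not the tool's actual code.

```python
def paint_weights(weights, geo_dist, radius, value, mode="replace"):
    """Apply one paint stroke to per-vertex weights of a single influence.
    `geo_dist` maps vertex index -> geodesic distance from the stroke center;
    only vertices within `radius` (the brush neighborhood) are touched."""
    ops = {
        "replace": lambda w: value,
        "add": lambda w: min(1.0, w + value),       # clamp at 1
        "subtract": lambda w: max(0.0, w - value),  # clamp at 0
    }
    op = ops[mode]
    for v, d in geo_dist.items():
        if d <= radius:
            weights[v] = op(weights[v])
    return weights
```

Because the neighborhood test uses geodesic rather than straight-line distance, a stroke on one finger of a hand mesh would not bleed onto an adjacent finger that happens to be nearby in space.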

# Initial Weight Guessing

The initial guess for the per-vertex weights of a skin, if reasonable, can take a huge amount of work off the skin author's shoulders. If there is no initial guess, the skin author would have to look at every vertex and explicitly say which joints should influence it, and by how much. We would like to offer the skin author a good guess, and allow them to tweak the results.

The algorithm that I use splits the problem into two parts. First, it finds which joints should influence a vertex, or the influence set. After it has the influence set, it defines the amount that each joint influences a vertex, or the weights, based on some falloff function that maps distance to a weight between 0 and 1.
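The second part of the algorithm, turning joint-to-vertex distances into weights, might look like the sketch below. The smoothstep falloff and the final normalization are assumptions made for illustration; the project's actual falloff function may differ.

```python
def guess_weights(distances, falloff_radius):
    """Map each influencing joint's distance to the vertex to a raw weight
    in [0, 1] via a smooth falloff, then normalize so the weights sum to 1.
    A cubic smoothstep falloff is assumed here for illustration."""
    raw = []
    for d in distances:
        t = min(d / falloff_radius, 1.0)
        raw.append(1.0 - t * t * (3.0 - 2.0 * t))  # smoothstep: 1 at d=0, 0 at radius
    total = sum(raw)
    if total == 0.0:
        # Every joint is at or beyond the falloff radius: fall back to uniform.
        return [1.0 / len(raw)] * len(raw)
    return [w / total for w in raw]
```

A joint sitting right at the vertex gets all the weight, while the normalization step keeps the weights a valid convex combination regardless of how many joints are in the influence set.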

For more details about the algorithm, please refer to the Initial Weight Finding page.

# Multiple Poses Simultaneously

Another idea to make this a better authoring tool would be to allow the skin author to look at multiple poses, or joint configurations, simultaneously. We believed that seeing the effects of changing the weights for a vertex in more than one pose would make it clearer what exactly was possible given the limitations of linear blend skinning. Read more about linear blend skinning.

The current system can show any number of poses simultaneously. The author can save various poses to a BVH file as motion data and read them back in while working, the intention being that there are some number of poses that the author might want to see often while working.

I think that showing multiple poses was a good idea, but more importantly it led to the idea that we could explicitly show the possible positions of a vertex given its influence set and a pose.

# Subspace Display

Any vertex position in a linear blend skin is expressed as a linear combination of the vertex transformed by each joint's coordinate system. Read more.

If the weights are all positive, the vertex position can be expressed as a convex combination of the vertex transformed by each joint's coordinate system. Any possible position for that influence set lies in the convex hull of the vertex rigidly transformed by each joint in the influence set. This can be thought of as the subspace, in 3-space, of possible vertex positions. For more details, see the paper.
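The convex-combination view above can be made concrete with a small sketch of linear blend skinning itself. The transform representation (a 3x3 rotation plus a translation per joint) is an assumption for illustration; any rigid-transform representation works the same way.

```python
def skinned_position(rest_pos, joint_transforms, weights):
    """Linear blend skinning: the deformed position is the weighted sum of
    the rest position rigidly transformed by each influencing joint.
    Each transform is a pair (3x3 rotation as row lists, translation)."""
    x = y = z = 0.0
    for (rot, trans), w in zip(joint_transforms, weights):
        # Rigidly transform the rest position by this joint.
        p = [sum(rot[r][c] * rest_pos[c] for c in range(3)) + trans[r]
             for r in range(3)]
        x += w * p[0]
        y += w * p[1]
        z += w * p[2]
    return (x, y, z)
```

With two joints and equal weights, the result lands exactly halfway between the two rigidly transformed positions, i.e. inside the convex hull the text describes.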

# Direct Vertex Manipulation

Once we could display the subspace, we decided it would be even better to drop the painting approach entirely and simply transform the vertex within the subspace, using any traditional operation such as translate, rotate, or scale, computing the weights on the fly. It works quite well; for more details, see the paper.
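To give a flavor of "computing the weights on the fly," here is a deliberately minimal sketch for the simplest case of two influences, where the subspace is the line segment between the two rigidly transformed positions. The general case in the paper solves for more influences (the project links a linear program solver); this two-influence projection is only an illustration.

```python
def weights_from_target(p_a, p_b, target):
    """Direct manipulation with two influences: given the vertex's rigidly
    transformed positions under joints A and B, recover the convex weights
    that place the skinned vertex closest to `target` on segment A-B."""
    ab = [b - a for a, b in zip(p_a, p_b)]
    at = [t - a for a, t in zip(p_a, target)]
    denom = sum(c * c for c in ab)
    if denom == 0.0:
        return (1.0, 0.0)  # both joints move the vertex identically
    # Project the target onto the segment's parameter t in [0, 1].
    t = sum(c * d for c, d in zip(ab, at)) / denom
    t = max(0.0, min(1.0, t))  # clamp so the weights stay convex
    return (1.0 - t, t)  # (weight of joint A, weight of joint B)
```

Dragging the vertex toward joint B's transformed position smoothly shifts weight from A to B, and the clamp guarantees the recovered weights remain a valid convex combination even when the target leaves the subspace.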

# Reference

Libraries:

- fltk: Fast Light Tool Kit, a cross-platform GUI toolkit. http://www.fltk.org/, LGPL.
- Fl_ToggleTree: Hierarchical node (tree) browser widget for fltk, included in the flek library. http://flek.sourceforge.net/, LGPL.
- lib3ds: 3DS file reading and writing library. http://lib3ds.sourceforge.net/, LGPL.
- libtarga: Targa image reading and writing library. http://www.cs.wisc.edu/graphics/Gallery/LibTarga/
- lp_solve: Linear program solver. http://www.cs.sunysb.edu/~algorith/implement/lpsolve/implement.shtml, LGPL.
- BVH reference: http://www.cs.wisc.edu/graphics/Courses/cs-838-1999/Jeff/BVH.html

Papers:

- Adaptive Unwrapping for Interactive Texture Painting. Takeo Igarashi and Dennis Cosgrove. I3D 2001. Available in the ACM digital library.
- Direct Manipulation of Interactive Character Skins. Alex Mohr, Luke Tokheim, and Michael Gleicher. 2002. See http://www.cs.wisc.edu/graphics/Gallery/DirectManipSkin/ to get the paper.
- Pose Space Deformation: A Unified Approach to Shape Interpolation and Skeleton-Driven Deformation. J. P. Lewis, Matt Cordner, and Nickson Fong. SIGGRAPH 2000. Available in the ACM digital library.

Luke Tokheim, <ltokheim@cs.wisc.edu>, May 13, 2003