This paper discusses computer puppetry. There are two major differences between ordinary motion capture and computer puppetry:

1. Real-time, online performance: the actor's movements must be transferred to the virtual character immediately, without perceptible delay.
2. Retargeting: the virtual character will, in general, not have the same proportions as the actor.

The paper addresses both problems. One implication of the first problem is that we need fast algorithms, which is why the paper tries to find analytical solutions wherever possible: they are generally cheaper to compute than iterative ones. Another implication is that we cannot look into the future. All processing and rendering must happen immediately, so algorithms that only work offline are unsuitable here. This is why, for example, the paper uses Kalman filtering for motion filtering: a Kalman filter runs online, whereas Gaussian or median smoothing filters require future samples (a minimal sketch follows below).

The second problem is retargeting. Retargeting without losing anything is impossible: we can preserve joint orientations or end-effector positions, but not both. The paper describes an importance analysis method to decide which to preserve at each moment. Blending the two in general produces bad motion, so you have to commit to one or the other; being in between is only acceptable during a transition phase (see the second sketch below).

Preserving end-effector positions introduces another problem: we need an inverse kinematics (IK) solver to find joint orientations that reach those positions. The paper describes an algorithm that solves IK for the whole body in real time (the last sketch below shows the kind of analytic per-limb solution that makes this cheap).
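
To make the online-filtering point concrete, here is a minimal sketch of causal smoothing with a one-dimensional constant-velocity Kalman filter applied to a single joint parameter. The 60 Hz time step and the noise values are assumptions for the example, not values from the paper; the point is only that each output uses past and current samples, never future ones.

```python
import numpy as np

def kalman_smooth_online(measurements, dt=1/60, q=1e-3, r=1e-2):
    """Causally filter a stream of noisy scalar samples (e.g. one joint angle)."""
    # State: [value, velocity]; constant-velocity motion model (an assumption).
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])            # state transition
    H = np.array([[1.0, 0.0]])            # we observe only the value
    Q = q * np.eye(2)                     # process noise (assumed)
    R = np.array([[r]])                   # measurement noise (assumed)

    x = np.array([measurements[0], 0.0])  # initial state estimate
    P = np.eye(2)                         # initial covariance
    out = []
    for z in measurements:
        # Predict one step ahead.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the newest measurement only -- no look-ahead needed.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return out
```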
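
The next sketch is a hypothetical illustration of the decision the importance analysis has to make for one end-effector. The proximity-based importance heuristic, the thresholds, and the function names are made up for illustration and are not the paper's actual measure; it only shows the structure of "commit to one goal, and blend briefly during transitions."

```python
def end_effector_importance(dist_to_nearest_object, falloff=0.3):
    """Toy importance in [0, 1]: 1 when touching the environment, ~0 when far away."""
    return max(0.0, 1.0 - dist_to_nearest_object / falloff)

def choose_goal(scaled_capture_pos, fk_pos_from_angles, importance,
                lo=0.4, hi=0.6):
    """Pick (or, during transitions, briefly blend) the goal for one end-effector.

    scaled_capture_pos : captured position, scaled to the character's proportions
    fk_pos_from_angles : where the end-effector lands if joint angles are copied
    """
    if importance >= hi:
        return scaled_capture_pos        # interaction: preserve the position
    if importance <= lo:
        return fk_pos_from_angles        # free-space motion: preserve the angles
    # Transition phase only: interpolate to avoid a visible pop.
    t = (importance - lo) / (hi - lo)
    return [a + t * (b - a)
            for a, b in zip(fk_pos_from_angles, scaled_capture_pos)]
```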
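
Finally, a sketch of why analytic IK is cheap enough for real time: the classic closed-form two-link solution in 2-D (law of cosines for the elbow, one atan2 correction for the shoulder). This is not the paper's whole-body solver, only the kind of per-limb analytic building block such a solver can rely on instead of iterating.

```python
import math

def two_link_ik(target_x, target_y, l1, l2):
    """Return (shoulder, elbow) angles for a 2-D two-link limb reaching the target."""
    d = math.sqrt(target_x**2 + target_y**2)
    # Clamp the target distance onto the reachable annulus.
    d = min(max(d, abs(l1 - l2) + 1e-9), l1 + l2 - 1e-9)
    # Elbow angle from the law of cosines.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target minus the limb's interior offset.
    shoulder = math.atan2(target_y, target_x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```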