
CS779 Projects

Note that project web pages are only available from the wisc.edu domain.

Leo Chao and Tom Brunet

Web Page

The focus of this project is to explore the adaptation of real-time techniques for rendering scenes that involve participating media. In particular, we want to re-create the "god-ray" and volumetric shadow effects common to this style of rendering, but at real-time speeds. The effort will involve using shadow volumes and corresponding light volumes, generated in a manner similar to silhouette shadow volumes using explicitly marked light portals.

This effort is a distinct advance over current real-time techniques, which treat the problem as a "hack," using cones and cylinders with view-independent faded alpha transparency and colors. We seek to eliminate the errors in that approach by creating a view-dependent rendering process for these volumes, and by using accumulation buffer-like techniques to determine the actual depth of each volume in order to render the environment properly.

As a first cut, we will work with a homogeneous participating atmosphere and a very simple scene involving one light portal and one shadow-casting object. Further extensions are possible from this starting point, involving moving lights, moving objects, and multiple light sources.
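To illustrate the accumulation buffer-like depth bookkeeping described above, here is a minimal CPU-side sketch in Python (function names and the front/back sign convention are my assumptions, not the project's design; the real system would do this per pixel on the GPU). The distance a view ray travels inside light volumes is obtained by subtracting front-face depths and adding back-face depths, and that lit segment is then shaded with a single-scattering term for a homogeneous medium.

```python
import math

def lit_path_length(face_depths):
    """Estimate the distance a view ray travels inside light volumes:
    subtract front-face depths, add back-face depths (hypothetical
    convention; assumes closed volumes).
    face_depths: list of (depth, is_front) pairs along the ray."""
    d = 0.0
    for depth, is_front in face_depths:
        d += -depth if is_front else depth
    return d

def single_scatter(path_len, sigma_s=0.1, light_intensity=1.0):
    """In-scattered radiance along a lit segment of length path_len in
    a homogeneous medium (constant-phase-function approximation)."""
    return light_intensity * (1.0 - math.exp(-sigma_s * path_len))
```

For a ray that enters a light volume at depth 2 and exits at depth 5, the lit path length is 3, and the scattering term grows toward the light intensity as that length increases.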

Greg Cipriano

Web Page

The goal of this project is to extend pbrt to allow a user to specify motion over time. We plan to add the facility to specify both camera motion and object motion as a sequence of keyframes, and to have pbrt interpolate them in the most natural way possible. To do this, we will need to update both the internals of pbrt and the scene file format while maintaining complete backwards compatibility with existing scene files.
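A minimal sketch of the kind of keyframe interpolation described above (the data layout and function names are hypothetical, not pbrt's actual internals): camera positions are stored as (time, value) pairs, interpolated linearly between keyframes and clamped outside the keyframe range.

```python
def lerp(a, b, t):
    """Componentwise linear interpolation between two tuples."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def interpolate_camera(keyframes, time):
    """keyframes: time-sorted list of (t, position) pairs.
    Returns the position linearly interpolated at `time`, clamped
    to the first/last keyframe outside the animated range."""
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    if time >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= time <= t1:
            u = (time - t0) / (t1 - t0)
            return lerp(p0, p1, u)
```

A "more natural" interpolation would likely replace `lerp` with splines for positions and quaternion slerp for orientations, but the lookup structure stays the same.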

Brandon Ellenberger and Evan Nowak

Web Page

We will use our existing volume-viewing framework to implement the non-photorealistic, interactive rendering methods outlined in the paper "Hardware-Accelerated Parallel Non-Photorealistic Volume Rendering" by Lum and Ma. We will not be implementing the paper's parallel-processing techniques, but will restrict ourselves to a single graphics processor; thus we are not sure whether interactivity will be maintained. Time permitting, we will explore other NPR visualization techniques, such as the depth-differences algorithm.

Scott Finley

Web Page

Implement a real-time front end for pbrt. It will be capable of saving and opening .pbrt files. It will allow the user to interactively view and modify a scene in order to place the camera and objects as desired before a lengthy rendering. It will support the basic primitive shapes of pbrt as well as several materials, textures and lights, depending on available time.

Kael Greco

Web Page

Implement (and update for modern hardware) the real-time hatching paper by Emil Praun, Hugues Hoppe, Matthew Webb, and Adam Finkelstein. Take into account the follow-up paper "Fine Tone Control in Hardware Hatching" from the NPAR 2002 proceedings.

Brian Hackbarth

Web Page

I intend to add a nighttime sky to PBRT. This will include accurate positions of the stars and moon, accurate brightness of stars and planets, and atmospheric scattering. There is currently nothing like this implemented in PBRT, so it will be helpful for the community. If I can get a good texture of the Milky Way, I will also try to incorporate it into the celestial background (but this is a low priority).
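One small, well-established piece of the "accurate brightness" goal is the astronomical magnitude scale, which converts catalogued stellar magnitudes into relative flux: every 5 magnitudes corresponds to a factor of 100 in brightness. A one-line sketch (the function name is mine, not PBRT's):

```python
def relative_flux(magnitude, reference_magnitude=0.0):
    """Apparent brightness relative to a reference star, from the
    standard astronomical magnitude scale: a difference of 5
    magnitudes is a factor of 100 in flux."""
    return 10.0 ** (-0.4 * (magnitude - reference_magnitude))
```

So a magnitude-5 star (near the naked-eye limit) delivers 1% of the flux of a magnitude-0 star such as Vega.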

Eric Jackowski and David Kerman

Web Page

Implement "Radiosity on Graphics Hardware" by Greg Coombe, Mark J. Harris and Anselmo Lastra, at least Section 3.

Feng Liu

Web Page

Implement the paper "Processing Images and Video for an Impressionist Effect". This paper describes a technique that transforms ordinary video segments into animations that have a hand-painted look. The first step of the project is to implement the algorithm for transforming images to NPR paintings. The second step is to explore the temporal coherence across frames and thus create artistic animations from video. Finally, there is some scope for extensions to the paper.
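One building block common to painterly renderers of this kind (a sketch under my own assumptions, not the paper's exact method) is orienting each brush stroke perpendicular to the local intensity gradient, so that strokes follow image edges rather than crossing them:

```python
def stroke_direction(image, x, y):
    """Direction for a brush stroke at pixel (x, y): perpendicular to
    the local intensity gradient, estimated with central differences.
    image is a 2-D list of grayscale values; (x, y) must be interior."""
    gx = (image[y][x + 1] - image[y][x - 1]) * 0.5
    gy = (image[y + 1][x] - image[y - 1][x]) * 0.5
    # Rotate the gradient (gx, gy) by 90 degrees to run along the edge.
    return (-gy, gx)
```

On a region whose intensity increases left to right, the gradient points horizontally, so strokes come out vertical, tracing the (vertical) edge.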

Chris Olsen

Web Page

A progressive radiosity system that runs in hardware, using pixel shaders and render-to-texture to accelerate the radiosity calculations. The final project will show the results of each step of light distribution.
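For reference, the per-step light distribution can be sketched in software as the classic progressive-refinement loop (names, and the assumption of precomputed form factors, are mine; the project itself would run this in pixel shaders):

```python
def progressive_radiosity(emission, form_factors, reflectance, steps=50):
    """Progressive-refinement radiosity sketch: repeatedly pick the
    patch with the most unshot radiosity and distribute it to every
    other patch via precomputed form factors.
    emission[i]: initial emitted radiosity of patch i.
    form_factors[i][j]: fraction of energy leaving i that reaches j.
    reflectance[j]: diffuse reflectance of patch j."""
    radiosity = list(emission)
    unshot = list(emission)
    for _ in range(steps):
        i = max(range(len(unshot)), key=lambda k: unshot[k])
        if unshot[i] <= 0.0:
            break  # all energy has been distributed
        shoot, unshot[i] = unshot[i], 0.0
        for j in range(len(radiosity)):
            if j == i:
                continue
            received = reflectance[j] * form_factors[i][j] * shoot
            radiosity[j] += received
            unshot[j] += received
    return radiosity
```

Returning the radiosity vector after each shooting step is exactly what lets the final project display the intermediate stages of light distribution.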

Jared Sohn

Web Page

Add functionality to PBRT similar to what the book calls "deep frame buffers" in problem 8.10 on page 406. Everything will share two themes: 1) store a lot of data for each point in the frame buffer; 2) use that information to efficiently re-render the scene in an interesting way. Unlike that problem statement, the primary focus will be on implementing NPR algorithms as described in Saito and Takahashi (1990). Other options may be explored if time permits (e.g., shifting the viewpoint or moving a light source).
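A tiny sketch of one Saito/Takahashi-style use of a deep frame buffer (assuming a stored per-pixel depth channel; this uses first-order differences only, where the paper also uses second derivatives to find creases):

```python
def depth_edges(depth, threshold=0.5):
    """Mark silhouette-like edges in a stored depth buffer: a pixel is
    an edge when its depth differs from a right or down neighbor by
    more than `threshold`. depth is a 2-D list of depth values."""
    h, w = len(depth), len(depth[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(depth[y][x] - depth[ny][nx]) > threshold:
                    edges[y][x] = True
    return edges
```

Because the depth buffer is stored, the edge pass can be re-run cheaply after the viewpoint-independent data is in place, without re-tracing the scene.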

Some suggestions for possible projects.