Interactive Sculpting of 3D Computer Graphics Models

Progress Report

January 1 – June 30, 1999

Julie Dorsey and Leonard McMillan


The past thirty years have seen significant progress in the field of computer graphics, particularly in the area of rendering. However, the creation of realistic models is nearly as tedious today as it was 30 years ago, and many types of complex materials simply cannot be represented with today's graphics systems. To address these problems, we are developing a new 3D modeling system based on the metaphor of sculpting real materials. We believe that by combining haptic output devices, stereoscopic displays, physically-based surface models, and newly developed surface representations it will be possible to approach the feel, naturalness, and flexibility of interacting with materials such as marble, wood, metals, and paints. This research also serves as a platform for studying the next generation of user interfaces, sensory fusion, and material representations. This work should find application in a variety of fields ranging from computer-aided design to entertainment.

Project Overview

Highly detailed geometric models are necessary to satisfy a growing expectation for realism in computer graphics. Within traditional modeling systems, complex models are created by applying a variety of modeling operations, such as CSG and freeform deformations, to a vast array of geometric primitives. Intricate meshes are also obtained by scanning physical objects using range scanning systems. A notable property of these new acquisition techniques is their ability to capture fine surface detail. These developments have made multi-million-polygon models widely available and offer new opportunities to modelers and animators in the CAD and entertainment industries.

The goal of this work is to develop a new data structure, the volumetric surface, which captures attractive properties of both surfaces and volumes and offers a convenient way of representing the material properties of complex models. More specifically, the new approach retains the efficient sampling offered by surfaces but also supports powerful volumetric operators, such as interactive operations for adding and removing material. Properties of materials, such as brittleness and distance to the surface, control the way materials respond to these operators, providing a higher-level alternative to existing material descriptions in interactive modeling systems. Finally, a haptic interface provides a natural way to interact with these material representations.
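The idea of a material property modulating a volumetric carving operator can be sketched as follows. This is a toy illustration only, not the report's actual data structure: the real volumetric surface stores samples only near the surface, whereas this sketch uses a dense grid, and the `brittleness` response model here is a hypothetical stand-in.

```python
import numpy as np

class VolumetricSurface:
    """Toy volumetric material: a dense grid of densities (1 = solid, 0 = empty).

    Illustrative only; the data structure described in the report is a
    sparse, surface-localized representation, not a dense grid.
    """

    def __init__(self, size=32, brittleness=0.0):
        self.density = np.ones((size, size, size))
        # a per-material property that modulates how the tool removes material
        self.brittleness = brittleness

    def carve(self, center, radius, strength=1.0):
        """Remove material inside a spherical tool footprint.

        Brittle materials also lose some material in a shell just outside
        the tool, mimicking chipping; soft materials erode only where touched.
        """
        idx = np.indices(self.density.shape).transpose(1, 2, 3, 0)
        dist = np.linalg.norm(idx - np.asarray(center), axis=-1)
        # direct removal inside the tool radius
        self.density[dist <= radius] -= strength
        # brittle fracture: partial removal in a shell around the tool
        shell = (dist > radius) & (dist <= radius * (1.0 + self.brittleness))
        self.density[shell] -= 0.5 * strength * self.brittleness
        np.clip(self.density, 0.0, 1.0, out=self.density)

    def total_material(self):
        return float(self.density.sum())

# the same tool stroke removes more material from a brittle block
vol_soft = VolumetricSurface(size=16, brittleness=0.0)
vol_brittle = VolumetricSurface(size=16, brittleness=0.5)
vol_soft.carve((8, 8, 8), radius=4)
vol_brittle.carve((8, 8, 8), radius=4)
```

The point of the sketch is the separation of concerns: the tool defines a region of influence, while the material property decides how the volume responds inside and around that region.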

Progress through June 1999

In the last six months of the project, we have continued work on the volumetric surface data structure. To facilitate interactive rendering and sculpting, we have developed a multiresolution version.

We have experimented with several different types of materials, including stone, brick, and plaster, both individually and as composites. In addition, we have designed and implemented a number of interactive sculpting tools for editing complex 3D models.

One of our long-term goals is to integrate user-guided simulations into the sculpting system. We have begun work along these lines with a project involving the simulation of stone weathering. Our weathering model simulates the flow of moisture and the transport, dissolution, and recrystallization of minerals within the stone volume; this model also governs the erosion of material from the surface. Our initial results are striking and would be very difficult, if not impossible, to achieve with traditional techniques.
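The moisture-and-mineral loop described above can be sketched in one dimension. Every name and constant below is a hypothetical illustration, not the model used in the actual system: moisture diffuses through a column of stone, dissolves mineral where the stone is wet, and a fraction of the dissolved mineral redeposits one cell closer to the drying surface.

```python
import numpy as np

def weather(mineral, moisture, steps=100, diff=0.2, k_dis=0.05, k_dep=0.1):
    """Toy 1-D weathering loop: diffusion, dissolution, redeposition."""
    mineral = mineral.copy()
    moisture = moisture.copy()
    for _ in range(steps):
        # moisture diffusion (explicit finite differences on the interior)
        moisture[1:-1] += diff * (moisture[2:] - 2 * moisture[1:-1] + moisture[:-2])
        moisture[-1] *= 0.5                 # evaporation at the exposed surface
        # dissolution: wet cells lose mineral into solution
        dissolved = k_dis * moisture * mineral
        mineral -= dissolved
        # transport + recrystallization: part of the solute precipitates
        # one cell closer to the surface
        mineral[1:] += k_dep * dissolved[:-1]
    return mineral

stone = np.ones(20)                  # uniform mineral content; index 19 = surface
wet = np.linspace(1.0, 0.2, 20)      # wetter at depth, drier at the surface
result = weather(stone, wet)
```

Even this toy version shows the qualitative behavior the report relies on: net mineral loss over time, with redistribution toward the surface where recrystallization occurs.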

Finally, we have fully integrated a PHANToM, a force feedback device, into the system to enable tactile feedback during the sculpting process.


Julie Dorsey, Alan Edelman, Henrik Jensen, Justin Legakis, and Hans Pedersen. Modeling and rendering of weathered stone. In Computer Graphics Proceedings, Annual Conference Series, pages 225-234. ACM SIGGRAPH, August 1999.

Aseem Agarwala. Volumetric surface sculpting. MEng Thesis, MIT, September 1999.

Plans for the Next Six Months

Now that we have integrated the PHANToM into the system, we are working to integrate a head-tracked stereoscopic display to provide registered visual and tactile feedback. We believe that this capability is essential for developing detailed models. The combination of tactile and stereo displays will aid in the comprehension of the surfaces being modeled, allow for direct manipulation using natural paradigms, and allow the user to interact with real-time simulations.

Display. We plan to explore a new technology for stereo display, in particular, plasma displays. CRTs are too small to provide a compelling stereoscopic display. This is particularly true when we wish to image the three-dimensional object in front of the display surface. (Most CRT-based stereo displays form the image behind the face of the CRT, as if the front of the CRT were a window.) It is best to form the image in front of the display if we want to allow the user to interact with it directly.

Plasma displays offer many attractive properties with respect to this project. First, they are as bright as a CRT screen in a normally lit room. They are both large and flat: 45" displays that are less than 5" thick are currently available commercially, and 60" displays are in development. They also have perfectly registered pixels that do not shift or jitter as they do in CRT- and projection-based systems; a given pixel location turns on a specific area of the screen. Despite all these advantages, no one (to our knowledge) has built a stereo display based on plasma displays.

Layered and composite materials. We also plan to extend our basic volumetric data structure to support layered and composite materials, for example, layers of brick and mortar with coatings of paint and stucco. This advance will allow the user to carve into models composed of a mixture of materials. We are also interested in developing representations for materials that change shape and appearance through manipulation. Rather than simply carving, we will explore materials that deform, e.g., metal foils and soft clays. We could also imagine materials whose appearance changes through manipulation (e.g., polishing and buffing a surface).
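The layered-materials idea can be sketched as a volume in which each voxel stores a material index, so that carving through the outer coating exposes whichever layer lies beneath. The material names, layer thicknesses, and function names below are purely illustrative assumptions.

```python
import numpy as np

STUCCO, MORTAR, BRICK, EMPTY = 0, 1, 2, -1

def layered_wall(nx=32, ny=32, depth=12):
    """A wall volume; depth axis: 0 = outer surface, increasing = into the wall."""
    vol = np.full((nx, ny, depth), BRICK)
    vol[:, :, 0:2] = STUCCO            # thin outer coating
    vol[:, :, 2:4] = MORTAR            # bedding layer beneath it
    return vol

def carve_hole(vol, cx, cy, radius, depth):
    """Remove all layers inside a circular footprint, down to `depth`."""
    xs, ys = np.indices(vol.shape[:2])
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
    vol[mask, :depth] = EMPTY
    return vol

def exposed_material(vol):
    """Material index first visible at each (x, y) looking into the wall."""
    solid = vol != EMPTY
    first = solid.argmax(axis=2)       # index of the first solid voxel
    return np.take_along_axis(vol, first[..., None], axis=2)[..., 0]

wall = layered_wall()
carve_hole(wall, cx=16, cy=16, radius=6, depth=5)
exposed = exposed_material(wall)       # brick shows through inside the hole
```

Carving deep enough at the center exposes brick, while the untouched region still shows the stucco coating, which is the behavior the extended data structure would need to support.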

Virtual user-guided simulation environments. In addition, we plan to develop techniques to put the user in the simulation loop. This would allow a user to control parameters interactively to achieve the desired effect. For example, imagine waving a blow torch over a piece of wood to burn away the soft areas and expose the grain.
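The blow-torch example can be sketched as a per-frame loop in which an interactively supplied tool position (here replaced by a scripted path) drives one step of a simulation that erodes soft material faster than hard material. All names and constants are hypothetical illustrations.

```python
import numpy as np

def torch_step(hardness, material, tool_x, radius=3, strength=0.2):
    """Burn away material near tool_x; soft cells (low hardness) burn faster."""
    x = np.arange(material.size)
    heat = np.clip(1.0 - np.abs(x - tool_x) / radius, 0.0, None)
    material -= strength * heat * (1.0 - hardness)
    np.clip(material, 0.0, None, out=material)
    return material

# wood grain: alternating hard (latewood) and soft (earlywood) bands
hardness = np.tile([0.9, 0.2], 16).astype(float)
material = np.ones_like(hardness)
for frame in range(60):                # stand-in for interactive tool input
    torch_step(hardness, material, tool_x=frame % material.size)
```

After the pass, the soft bands sit lower than the hard bands, exposing the grain; in the envisioned system the tool path would come from the haptic device rather than a script, with the user steering the simulation toward the desired effect.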