Tangible Programming for Scientific and Engineering Exploration
Date: Friday, Sep. 15, 2000
Time: 2-3 p.m.
Speaker: Tim McNerney
Affiliation: Master's student, MIT Media Lab
Abstract: A number of programming paradigms, and an even greater number of programming languages, have been developed. The vast majority are text-based, but a notable few are graphical. LabView (tm) is perhaps the most successful graphical programming environment to date. It presents the user with a familiar signal-flow model of data acquisition and processing. Its strength is that it closely parallels how scientists and engineers work with actual test equipment, thereby providing a natural framework for scientific inquiry.
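
As a rough illustration of the signal-flow idea (a hypothetical Python sketch, not LabView code and not the speaker's system), each stage below consumes the previous stage's output, mirroring the acquire -> process -> display chain a researcher would otherwise wire up on the bench:

    import math
    import random

    def acquire(n_samples=100):
        """Simulate a noisy sensor reading one cycle of a sine wave."""
        return [math.sin(2 * math.pi * t / n_samples) + random.gauss(0, 0.1)
                for t in range(n_samples)]

    def smooth(signal, window=5):
        """Moving-average filter standing in for a processing block."""
        return [sum(signal[max(0, i - window + 1):i + 1]) / (i - max(0, i - window + 1) + 1)
                for i in range(len(signal))]

    def display(signal):
        """Crude text oscilloscope in place of a plotting front panel."""
        for sample in signal[::10]:
            print("#" * max(0, int((sample + 1) * 20)))

    display(smooth(acquire()))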

When people do scientific and engineering exploration in a hands-on way, they are necessarily interacting with the physical world. Using the sensors they have at hand, researchers acquire data for analysis and visualization software running on computers.

The problem we address is this: when the sensors are in hand and the software is on screen, there is not only a physical separation but also a cognitive separation between the physical world (where the sensors are) and the virtual world (where the programs are). One solution is to move programs (and the building blocks they are made of) into the physical world.

We have implemented such a system by embedding Logo-programmable microprocessors into small, custom-built LEGO (tm) bricks. We offer our system as a powerful example of a _constructive_ tangible user interface that, based on our observations, promises to make programming for scientific experimentation more accessible to newcomers and more efficient for seasoned veterans.
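
To convey the "constructive" flavor in software terms (a hypothetical Python sketch with assumed names; it does not reflect the bricks' actual Logo firmware), a program is assembled by snapping simple blocks into a chain, much as the physical bricks are stacked:

    class SensorBlock:
        """Stands in for a physical sensor brick; returns canned readings."""
        def __init__(self, readings):
            self.readings = list(readings)
        def run(self, _):
            return self.readings.pop(0)

    class ThresholdBlock:
        """Passes along whether the reading exceeds a limit."""
        def __init__(self, limit):
            self.limit = limit
        def run(self, value):
            return value > self.limit

    class LoggerBlock:
        """Records (here, prints) whatever it receives."""
        def run(self, value):
            print("logged:", value)
            return value

    def snap_together(*blocks):
        """Chain blocks so each one feeds the next, like stacking bricks."""
        def program(value=None):
            for block in blocks:
                value = block.run(value)
            return value
        return program

    experiment = snap_together(SensorBlock([12, 30, 7]), ThresholdBlock(20), LoggerBlock())
    for _ in range(3):
        experiment()    # logs: False, True, False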
Location: 545 Technology Square (aka "NE43")
Room: 8th floor Playroom