MIT LegLab

Recent Robot Videos

Introduction

The following videos of recent LegLab robots were first captured in 25 Mbit/s NTSC DV format and then compressed to the MPEG-1 Video-CD specification. Simple cuts were used between scenes, instead of fancier transitions, to facilitate automated extraction of clips (by MPEG-1 editing tools such as Media Ware's myflix) for use in presentations.

Use and Permission

Use of this material in non-commercial technical presentations (e.g. classes, research conference talks) is granted without the need to obtain permission. Please be considerate and give proper credit for the videos you show.

You must obtain our permission to use this material in any other way (e.g. in commercial presentations, web sites, or in any video tape or film meant for broadcast or public display). Permission requires that you allow us to review and approve the parts of the final script or site where our material is used. Once permission is obtained, we can also provide access to the high-quality DV format files. Please contact:

Prof. Gill A. Pratt
MIT AI Lab
NE43-006
545 Technology Square
Cambridge, MA 02139
gill@ai.mit.edu
(617) 253-2475 x101

Format

The Video-CD MPEG-1 format used below is 320x240 at 30 fps. It is not meant for interlaced TV playback. Please contact us for the DV format files if you want to show these videos on a TV.

Supervision

The many students who worked on these robots were supervised by Prof. Gill Pratt. The knee project, begun under Gill Pratt, is now under the direction of Prof. Hugh Herr, co-director of the Leg Lab.

Activation

To watch a video, click on any of the stills, or on the file name below. If you have a slow link, you can download the file to your hard disk for later viewing.

Series Elastic Actuators and Virtual Model Control

Series-Elastic Actuator

hea.mpg

The Leg Lab invented and developed several kinds of series-elastic actuator, which use an instrumented spring in series with the load to provide good characteristics for robots that execute natural tasks. Compared to traditional robot actuators, series-elastic actuators have higher force fidelity, shock tolerance, energy efficiency, and stability in contact. Compared to other force-controlled actuators, they are lower in cost, smaller, and lighter. The following video of a hydraulic series-elastic actuator demonstrates, first qualitatively and then quantitatively, the dynamic range of the actuator (about 1 lb out of 500 lb), and then demonstrates its force and motion bandwidth (50 Hz and 10 Hz, respectively).
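
To give a feel for how such an actuator is controlled, the sketch below shows the basic idea of force control around a series spring: the force on the load is estimated from the measured spring deflection, and a simple PD loop drives the motor to track a desired force. The spring constant, gains, loop rate, and function interface are illustrative assumptions, not the Leg Lab's actual controller.

    # Minimal sketch of series-elastic force control (illustrative only).
    # Spring constant, gains, and loop rate are assumed values.

    SPRING_K = 2500.0    # N/m, assumed stiffness of the series spring
    KP, KD = 8.0, 0.05   # assumed force-loop gains
    DT = 0.001           # assumed 1 kHz control loop

    def sea_force_step(f_desired, x_motor, x_load, prev_error):
        """One step of a PD force loop around the series spring.

        The spring deflection (motor position minus load position) gives a
        direct measurement of output force, which is what makes the actuator
        shock tolerant and stable in contact.
        """
        f_measured = SPRING_K * (x_motor - x_load)  # force from spring deflection
        error = f_desired - f_measured
        d_error = (error - prev_error) / DT
        motor_command = KP * error + KD * d_error   # e.g. a motor current setpoint
        return motor_command, error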

Virtual Hexapod

hexapod.mpg

The Virtual Hexapod, created by Anne Torres, demonstrates the power of Virtual Model Control, which allows the hexapod's body to be controlled as if it had force and torque thrusters, even though it actually has 6 legs with non-linear kinematics, including 18 controlled and 18 uncontrolled degrees of freedom. In this video, the hexapod walks smoothly in various directions, then balances an inverted pendulum (connected to the body via a universal joint) while walking over rough terrain. Despite the complexity of the emergent behavior, Virtual Model Control allows the controller to be specified very simply. The controller is switched off at the end of the video to demonstrate that the pendulum can, in fact, fall down.
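
The essence of Virtual Model Control can be sketched in a few lines: pretend a virtual spring-damper "thruster" acts on the body, compute the wrench that virtual element would exert, and project it onto the real joints through the leg Jacobian transpose. The gains, dimensions, and Jacobian interface below are assumptions for illustration, not the hexapod's actual controller.

    # Minimal sketch of the virtual model control idea (illustrative only).
    import numpy as np

    K = np.diag([400.0, 400.0, 200.0])  # assumed virtual spring stiffness (x, y, yaw)
    B = np.diag([40.0, 40.0, 20.0])     # assumed virtual damping

    def virtual_model_torques(x_desired, x_body, v_body, jacobian):
        """Joint torques that emulate a virtual spring-damper pulling the body
        toward x_desired. `jacobian` maps joint velocities to body velocity."""
        wrench = K @ (x_desired - x_body) - B @ v_body  # force the virtual element would apply
        return jacobian.T @ wrench                      # project onto the real joints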

Walking Bipedal Robots

Spring Turkey

turkey.mpg

Spring Turkey was the Leg Lab's first walking robot, built by Peter Dilworth and Jerry Pratt. It had 4 degrees of freedom (2 in each leg, at the hip and knee) and point feet. It was the first to demonstrate that series-elastic actuators and virtual model control could be used to build a walking robot with a natural gait. These ideas were then used in the following robots.

Spring Flamingo

flamingo.mpg

Spring Flamingo, built by Jerry Pratt and Mike Whittig, is a 2-D robot with 6 degrees of freedom (3 per leg). The first two scenes of this video show Spring Flamingo walking blindly over rough terrain. It has no foreknowledge of the terrain and can only feel it through proprioception. Next, in a simulation done by Chee-Meng Chew, Spring Flamingo climbs up and down stairs. While the steps were carefully synchronized to the phase and period of Spring Flamingo's stride, the height of the steps was accommodated through proprioception. In the next clip, Jerry Pratt qualitatively tests the robustness of Spring Flamingo to disturbances. More quantitative results can be found in our papers. Finally, set to music, Spring Flamingo is shown walking at about 1 m/s. Its highest speed during other experiments was 1.25 m/s.

M2

m2sims.mpg    m2real.mpg

M2 is a 12 DOF autonomously powered and controlled humanoid robot being built by Dan Paluska and several other Leg Lab students. There are two video files for M2: one shows simulations, the other the real robot. In the simulation video, a physically realistic simulation of M2 is first shown performing one-leg and step-to-step balance. Next are simulations of bipedal stopping, starting, and steady-state walking. Allen Parseghian and Jerry Pratt programmed the simulations. In the real-robot video, M2 is shown balancing, wiggling ("Elvis dancing"), and stepping in place. Finally, its first forward step (which was followed by a fall) is shown. We are now working on getting M2 to walk robustly.

Troody

troody.mpg

Troody, the creation of Peter Dilworth, is a 16 DOF autonomously powered and controlled biped robot built to resemble a Troodon, a small carnivorous dinosaur that lived in the Cretaceous. In this video, Troody is shown standing up and walking across a desk (the cables provide power, start/stop control, and safety in case of a fall). Next, Troody is shown taking a sharp left turn. Finally, Troody is shown taking a long battery-powered walk from our basement laboratory to visit Cog on the 9th floor (via the elevator). It fell only 4 times along the way.

Leg Rehabilitation

Intelligent Prosthetic Knee

knee.mpg

The LegLab Intelligent Prosthetic Knee, built by Hugh Herr, Ari Wilkenfeld, Theresa Iozzolino, Mike Whittig, and Olaf Bleck, allows above-knee amputees to walk and climb stairs with a more natural, automatically adaptive gait. It does not require any manual tuning by a prosthetist. This video of an early prototype of the knee first shows a test subject changing walking speed, with the knee adapting its swing time accordingly. Next, another test subject demonstrates how she usually gets up and down stairs. Note that her "dumb" prosthetic leg either remains locked or collapses. In the final clip, this same subject is shown testing an early prototype of the knee going up and down stairs outside our laboratory under the supervision of a prosthetist. Because of the knee's intelligence, she can, for the first time, go up and down "stair over stair", i.e. like an able-bodied person. Her resulting enthusiasm is infectious, and is tremendously motivating for all of us.
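
As a rough illustration of what "automatically adaptive" can mean (this is not the knee's actual algorithm), the sketch below adjusts the knee's damping until the measured swing time matches a target fraction of the stride period, so the swing speeds up on its own as the wearer walks faster. All constants and names are assumed.

    # Illustrative swing-time adaptation rule; not the Leg Lab knee's algorithm.

    def adapt_knee_damping(damping, measured_swing_time, stride_period,
                           target_fraction=0.38, rate=0.1):
        """Nudge knee damping so swing occupies a target fraction of the stride.

        Faster walking gives a shorter stride period and hence a shorter target
        swing time; if the measured swing is too slow, damping is reduced so the
        shank swings through more quickly.
        """
        target_swing = target_fraction * stride_period
        error = measured_swing_time - target_swing
        return max(0.0, damping - rate * error)  # lower damping when swing is too slow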

Other Robots

Corn Dog (a.k.a. M4/2)

corndog.mpg

Corn Dog is a 2-D half-quadruped, built by Ben Krupp, consisting of a single front leg and a single back leg connected by a spine. This video shows Corn Dog pronking. Besides demonstrating control of height and pitch, this video demonstrates the ability of Corn Dog's electric series-elastic actuators to efficiently and robustly absorb the shock energy of each footfall, without any additional elasticity in the leg mechanism.

Blob

blob.mpg

The Blob is a shape-shifting conformal robot that resembles a tank tread without the tank. This video shows the pneumatic version built by Geo Homsy, then some simulations done by Jerry Pratt. The two simulated blobs show how a genetic algorithm can effectively choose either the most efficient or the fastest "gait". Finally, an electric version of the blob, built by Andrew Allen, is shown. Able to control its joints smoothly, this blob is shown executing the efficient gait.
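
As an illustration of the kind of search involved (not the Leg Lab's actual code), a toy genetic algorithm over a vector of gait parameters is sketched below; swapping the fitness function between speed and efficiency yields the two different "gaits" seen in the simulations. The simulate() calls in the usage comments are hypothetical placeholders.

    # Toy genetic algorithm over gait parameters (illustrative only).
    import random

    def evolve_gait(fitness, pop_size=30, generations=50, n_params=6):
        """Evolve a vector of gait parameters (e.g. oscillator amplitudes and
        phases) that maximizes the supplied fitness function."""
        pop = [[random.uniform(-1.0, 1.0) for _ in range(n_params)]
               for _ in range(pop_size)]
        for _ in range(generations):
            ranked = sorted(pop, key=fitness, reverse=True)
            parents = ranked[:pop_size // 2]              # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                children.append([(x + y) / 2 + random.gauss(0.0, 0.05)  # crossover + mutation
                                 for x, y in zip(a, b)])
            pop = parents + children
        return max(pop, key=fitness)

    # fastest = evolve_gait(lambda p: simulate(p).distance)          # optimize for speed
    # efficient = evolve_gait(lambda p: -simulate(p).energy_per_m)   # optimize for efficiency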

Luxo

luxo.mpg

Both a series-elastic actuated robot and a simulated model of the well-known Pixar Luxo lamp were built by Dan Paluska for his Bachelor's thesis. In this video, the simulated model, under virtual model control, takes a few jumps.