::exhibit design

I am currently interning at the American Museum of Natural History, where I worked in a different capacity prior to graduate school. I made the big jump from Education to Exhibitions, where I am now helping to build out prototypes for the interactives in upcoming exhibits. Below is documentation of the projects I am working on.

Dino Metabolism
This interactive will demonstrate how much food ectotherms (cold-blooded animals) need to consume versus endotherms (warm-blooded animals), with both good food and bad food on offer. The idea is to show how much time endothermic dinosaurs had to spend eating (a hypothesized 16 hours a day) and how difficult it was to stay satiated. The visitor flips a switch between the two states (endo and ecto) and can “eat” either good food or bad food: the good food is a button on the right and the bad food is a button on the left. Satiation is shown by a row of LEDs lighting up, and the goal is to get the LEDs to light all the way to the top. Most of the code was already written for an Arduino Uno and two Arduino Duemilanoves (for the display boards). I spent most of my first day re-wiring the breadboard to clean it up and consolidate it for a cleaner prototype presentation.
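For anyone curious how that logic might look in code, here is a rough sketch of the idea. The pin numbers, food values, and drain rates below are placeholders I made up for illustration; this is not the actual exhibit code.

// Rough sketch of the metabolism logic. Pin numbers and rates are
// placeholders, not the values from the actual exhibit code.
const int MODE_SWITCH_PIN = 2;                    // endo/ecto toggle
const int GOOD_FOOD_PIN   = 3;                    // button on the right
const int BAD_FOOD_PIN    = 4;                    // button on the left
const int NUM_LEDS        = 8;                    // satiation bar
const int ledPins[NUM_LEDS] = {5, 6, 7, 8, 9, 10, 11, 12};

int satiation = 0;                                // 0 .. NUM_LEDS * 10

void setup() {
  pinMode(MODE_SWITCH_PIN, INPUT_PULLUP);
  pinMode(GOOD_FOOD_PIN, INPUT_PULLUP);
  pinMode(BAD_FOOD_PIN, INPUT_PULLUP);
  for (int i = 0; i < NUM_LEDS; i++) pinMode(ledPins[i], OUTPUT);
}

void loop() {
  bool isEndo = (digitalRead(MODE_SWITCH_PIN) == LOW);

  // Endotherms burn energy faster, so each bite counts for less and the
  // constant drain is steeper; good food is worth more than bad food.
  if (digitalRead(GOOD_FOOD_PIN) == LOW) satiation += isEndo ? 4 : 8;
  if (digitalRead(BAD_FOOD_PIN)  == LOW) satiation += isEndo ? 1 : 3;
  satiation -= isEndo ? 3 : 1;
  satiation = constrain(satiation, 0, NUM_LEDS * 10);

  // Light the LED bar up to the current satiation level.
  for (int i = 0; i < NUM_LEDS; i++) {
    digitalWrite(ledPins[i], satiation > i * 10 ? HIGH : LOW);
  }
  delay(100);
}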

Breadboard Before

Breadboard After

I also soldered wires to the second display board.

Then Gabby (another intern and ITP student) and I tested the two boards with the button hub using only one power source on the far-left board, passing power from board to board through the Vin pins and data through TX/RX. This worked as I had hoped: power and ground carried over fine, and the TX/RX lines matched up from board to board, reading the same information (the boards are supposed to mirror each other).
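Here is a rough sketch of what the hub side of that arrangement might look like: one byte with the current satiation level broadcast over TX so both display boards show the same bar. The updateSatiation() helper is hypothetical, standing in for the switch/button logic sketched above; none of this is the actual exhibit code.

// Hub side (sender): broadcast the current satiation level over TX so
// both display boards mirror the same bar. updateSatiation() is a
// hypothetical stand-in for the switch/button logic sketched above.
int satiation = 0;                      // 0..80, same scale as above

int updateSatiation() {
  // ...read the switch and buttons and adjust satiation as sketched above...
  return satiation;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  byte level = (byte) updateSatiation();
  Serial.write(level);                  // one byte per update, read by both boards
  delay(50);
}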

The last step was tweaking the code so that there was a discernible difference between good food/bad food and ecto/endo. I also learned about Serial.flush(), since the display board was not showing the most current information. I played with the delay, but the real issue was clearing the serial port so that the board would only react when it was receiving new data.
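A rough sketch of the display-board side is below. One thing worth noting: on Arduino 1.0 and later, Serial.flush() waits for outgoing data to finish rather than emptying the receive buffer, so in this sketch I drain the incoming bytes explicitly and only act on the newest one. Pin numbers are placeholders, not the exhibit wiring.

// Display-board side (receiver): keep only the newest byte from the hub
// so the LEDs never lag behind the buttons. Pin numbers are placeholders.
const int NUM_LEDS = 8;
const int ledPins[NUM_LEDS] = {5, 6, 7, 8, 9, 10, 11, 12};

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_LEDS; i++) pinMode(ledPins[i], OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    int level = Serial.read();
    while (Serial.available() > 0) {
      level = Serial.read();            // discard the backlog, keep the newest reading
    }
    for (int i = 0; i < NUM_LEDS; i++) {
      digitalWrite(ledPins[i], level > i * 10 ? HIGH : LOW);   // 0..80 scale
    }
  }
}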

Here is the final demonstration of all 4 states:

________________________________________________________________________________________________________________________________________
Dino Femur Measurement
I worked on the Dino Femur Measurement interactive, which uses the length of your femur to figure out how heavy you would be as a dinosaur. I wrote some simple Arduino code that takes the reading from a potentiometer slider and calculates a weight in both lb and kg. There were some issues with the number flickering between the two closest values, so I added the Arduino smoothing code, which averages a window of recent readings and steadies the output.
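Here is a rough sketch of that combination: the running-average smoothing from the Arduino smoothing example plus the unit conversion. The femur-length-to-weight mapping below is just a placeholder so the sketch runs; it is not the formula used in the exhibit.

// Sketch of the femur interactive: smooth the slider reading with a
// running average so the number doesn't flicker, then report lb and kg.
// The femur-to-weight mapping is a placeholder, not the exhibit formula.
const int POT_PIN      = A0;
const int NUM_READINGS = 10;

int readings[NUM_READINGS];             // ring buffer of recent readings
int readIndex = 0;
long total = 0;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_READINGS; i++) readings[i] = 0;
}

void loop() {
  // Running average, as in the Arduino smoothing example.
  total -= readings[readIndex];
  readings[readIndex] = analogRead(POT_PIN);
  total += readings[readIndex];
  readIndex = (readIndex + 1) % NUM_READINGS;
  int average = total / NUM_READINGS;

  // Placeholder mapping: slider position -> femur length -> weight.
  long femurCm   = map(average, 0, 1023, 20, 200);   // assumed slider range in cm
  float weightLb = femurCm * 100.0;                  // made-up scaling for illustration
  float weightKg = weightLb * 0.4536;                // 1 lb = 0.4536 kg

  Serial.print(weightLb);
  Serial.print(" lb / ");
  Serial.print(weightKg);
  Serial.println(" kg");
  delay(50);
}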
_________________________________________________________________________________________________________________________________
Multi-Touch Table
This Valentine’s Day at AMNH, I’m learning how to build out a multi-touch table that will be used as part of the Space exhibition in the Fall. (Nothing says love like “multi-touch.”)

SETUP

Making multi-touch tables is actually relatively straightforward. The kind we are looking to make is either front or rear diffused illumination (front/rear refers to which side of the touch surface the infrared illumination comes from: the viewer side, or the camera/projector side underneath). NUI Group has a great tutorial on how these work, here. The table works like this: a piece of acrylic is mounted on top of a 29″ table (a height approved by the ADA), and dimmable IR LED bars are mounted along the edges of the acrylic. The camera picks up the infrared light reflected by your fingers, so the system knows where you are touching and how to update the projection.

PROJECTION

We will be using Community Core Vision (CCV) to drive a Flash animation projected onto the table. CCV tracks the touches and can pass that data to a third-party client (like Adobe Flash), which handles what actually gets displayed. Right now, we are exploring how to use CCV with a multi-camera setup. The cameras we will be using are PS3 Eyes.

EDGE PROJECTION TESTING

Here are all the iterations of edge-projection possibilities using Rosco velum, thick plexi, and thin plexi. The idea is to get the highest sensitivity possible, both for sensing pressure level and for distinguishing where individual fingers are located. We are assuming that the edge projection will not reach fully across the table, so the IR LED bars will need to be mounted on all sides. All of this prototyping was done with a 1 ft x 2 ft surface area and the PS3 Eye mounted 1 ft below the platform.

Terraform Iterations Documentation

______________________________________________________________

MATERIALS

Great how-to video on DSI (diffused surface illumination) here, using Endlighten as the projection layer.

Great comparison of Endlighten colorless vs. colored materials.

Where can these materials be purchased?

*Evonik makes several different types of Endlighten. These vary in thickness and in the amount of light-scattering particles inside the material. Available thicknesses range between 6 and 10 mm, with “L”, “XL” and “XXL” grades for particle amount. I have tested the 6 mm “L” and the 10 mm “XL” type. The 6 mm is too flexible for a table setup; the 10 mm works nicely.

Price?/Size?


Potential Issues according to NUI group.

*Because this Plexiglas is softer than normal Plexiglas, you really have to watch out when working with the material. It scratches quite easily, and these scratches show up in the projection and during camera tracking.

Final application size: 6′ x 3′9″ (7D006: touch layer, 7D512: projection layer, MR1 (P99): mar-resistant layer)

Prototype application size: 3′9″ x 2′6″
