Time Scroll

Time Scroll is a web browser-based 24-hour clock. Every increment of the 24-hour cycle, down to the second, is rendered in the browser. It starts with 00:00:00 at the top and ends with 23:59:59 at the bottom. Each second, the browser scrolls one tick downward to display the current time. The position of the browser’s scroll bar gives you a general sense of where you are within the 24-hour cycle.

Time Scroll is written in JavaScript and PHP.
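
The core trick is small: the current time, counted in seconds since midnight, is an index into 86,400 rendered lines, and the scroll offset simply follows it. Here’s a minimal sketch of that idea in Processing (the piece itself is JavaScript/PHP; the layout here is illustrative):

//Time Scroll idea, sketched in Processing (the actual piece is JavaScript/PHP)
//86,400 timestamps form one tall column; the column scrolls one line per second
int lineHeight = 20;

void setup() {
  size(200, 400);
  textFont(createFont("Courier", 14));
}

void draw() {
  background(0);
  int now = hour()*3600 + minute()*60 + second();   //seconds since 00:00:00
  float offset = height/2 - now*lineHeight;         //keep the current second centered
  fill(255);
  for (int s = max(0, now-20); s < min(86400, now+20); s++) {  //only draw visible lines
    text(nf(s/3600, 2) + ":" + nf((s/60)%60, 2) + ":" + nf(s%60, 2), 20, offset + s*lineHeight);
  }
}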

[Time Scroll]

Cinematic Timepiece


Time is our measure of a constant beat. We use seconds, minutes, hours, days, weeks, months, years, decades, centuries, etc. But what if we measured time against rituals, chores, tasks, stories, and narratives? How can we use our memory, prediction, familiar and unfamiliar narratives to tell time?

As a child, I remember using the length of songs to measure how much time was left on a trip. A song was a period short enough to grasp in an instant, and easy to multiply into any larger measure, like the time left until we arrived at our grandmother’s place.

The first iteration of Cinematic Timepiece consists of 5 video loops playing at 5 different speeds on a single screen. The video is of a person coloring in a large circle on a wall.

The frame furthest to the right is a video loop that completes a cycle in one minute. The video to the left of the minute loop completes its cycle in one hour. The next completes in a day, then a month, then a year.
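
The timing math behind the loops is simple: each loop maps elapsed time to a normalized playhead position for its period. A small Processing sketch of just that step (assuming a 30-day month; the actual piece drives video playback in openFrameworks):

//Normalized playhead (0 to 1) for each of the five loops
//assumes a 30-day month; the piece itself drives video playback in openFrameworks
float[] periods = {60, 3600, 86400, 2592000, 31536000};  //minute, hour, day, month, year in seconds
String[] names = {"minute", "hour", "day", "month", "year"};

void setup() {
  frameRate(1);
}

void draw() {
  float t = millis()/1000.0;   //elapsed seconds since the sketch started
  for (int i = 0; i < periods.length; i++) {
    println(names[i] + " loop: " + (t % periods[i]) / periods[i]);  //wraps to 0 when a cycle completes
  }
}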

Through future iterations, we intend to experiment with different narratives and rituals, captured as video loops to be read as measures of time.

The software is written in openFrameworks for a single screen, to be expanded in the future to multiple screens as a standalone piece of hardware.

Cinematic Timepiece is being developed in collaboration with Taylor Levy.

Download the fullscreen app version [http://drop.io/cinematicTimepiece#]

Sprocket Rhinoscript


UPDATE: this script generates inaccurately spaced teeth. Sorry! Fixes are coming in a new version soon. (6.25.09)

Here’s some basic RhinoScript code to draw your own sprockets based on the desired number of teeth, chain pitch, and roller diameter. It’s very basic, but enough to draw what you need and get some custom sprockets lasercut. In top view, draw a circle that’s larger than you expect the sprocket to be, then run the code.
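
For reference, the underlying math is the standard roller-chain relation: for N teeth and chain pitch P, the pitch diameter is P / sin(180°/N), with teeth spaced 360°/N apart. A quick Processing check of those numbers (the script itself is RhinoScript):

//Standard roller-chain sprocket relations, for sanity-checking the script's output
float teeth = 42;     //desired number of teeth
float pitch = 12.7;   //chain pitch in mm (1/2" bicycle chain)

void setup() {
  float pitchDiameter = pitch / sin(PI/teeth);    //D = P / sin(180 degrees / N)
  println("pitch diameter: " + pitchDiameter + " mm");
  println("tooth spacing: " + 360.0/teeth + " degrees");
}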

Sun Angle Script


The following RhinoScript code calculates and draws a series of sun angles based on the time, day, and latitude.
NOTE: in my tests, something happens to the angles around noon that makes them inaccurate, because the calculations are based on cosine values. All other times besides noon should be fine.
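
If you want to check the script’s output, the usual first-order approximations are: declination from the day of the year, hour angle from solar time, then altitude from latitude, declination, and hour angle. A Processing version of that math (an independent reference, not the script itself):

//First-order solar altitude approximation, for checking the script's output
float sunAltitude(float latitudeDeg, int dayOfYear, float solarHour) {
  float decl = 23.45 * sin(radians(360.0/365.0 * (284 + dayOfYear)));  //solar declination in degrees
  float hourAngle = 15.0 * (solarHour - 12.0);                         //degrees from solar noon
  float sinAlt = sin(radians(latitudeDeg)) * sin(radians(decl))
               + cos(radians(latitudeDeg)) * cos(radians(decl)) * cos(radians(hourAngle));
  return degrees(asin(sinAlt));                                        //altitude above the horizon
}

void setup() {
  //e.g. latitude 40.7 N (New York) at 3pm solar time on the summer solstice (day 172)
  println(sunAltitude(40.7, 172, 15.0));
}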

Updated version with a nicer UI from Ezio. Thank you!
Download: sunbatchrender_ezb.rvb

P.Life V2


plife at IAC from che-wei wang on Vimeo.


P.Life is a large-scale interactive screen designed for the IAC’s 120′ wide video wall. In the world of P.Life, Ps run across the 120′ screen frantically moving material from one house to another. Along the way, Ps exchange pleasantries (based on text message inputs) as they pass each other, offering a helping hand to those in need. The landscape shifts and jolts based on audio input from the audience, tossing Ps into the air. Playful jumps into midair often end in injury, forcing them to crawl until a fellow P comes by to help out.

Features
Text messaging to create new characters with different sizes and dialogues
Audio input to influence the landscape
Performance backend to influence the landscape
Ps move with life-like motion as they walk, jump, fall, run, skip, crawl, carry boxes, push boxes, etc.
P.Life is written in openFrameworks and uses the Most Pixels Ever library

By Che-Wei Wang and Jiaxin Feng
Live Music by Taylor Levy

photos by wuyingxian

Elevator P

Elevator P interprets random conversations in an elevator into poetry and publishes them immediately on Twitter. Using a hidden microphone, Elevator P captures unexpected chatter, unstaged and raw. The interpreter elevates mundane elevator conversations into beautiful flowing poetry, capturing the deep essence of each dialogue.

Haiku poetry: http://twitter.com/chatterbot
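
The transcription and interpretation internals aren’t documented here, but the final 5-7-5 packing step might look something like this naive Processing sketch. countSyllables is a rough vowel-group heuristic, and the word list stands in for whatever the speech recognizer hears; both are illustrative assumptions:

//Naive 5-7-5 line packing (illustrative only; not the actual interpreter)
int countSyllables(String word) {
  int count = 0;
  boolean inVowelGroup = false;
  for (int i = 0; i < word.length(); i++) {
    boolean vowel = "aeiouy".indexOf(word.toLowerCase().charAt(i)) >= 0;
    if (vowel && !inVowelGroup) count++;   //each vowel group approximates one syllable
    inVowelGroup = vowel;
  }
  return max(count, 1);
}

String lineOf(String[] words, int target) {
  String line = "";
  int syllables = 0;
  for (int i = 0; i < words.length; i++) {
    int s = countSyllables(words[i]);
    if (syllables + s > target) continue;  //skip words that would overshoot the count
    line += words[i] + " ";
    syllables += s;
    if (syllables == target) break;
  }
  return line.trim();
}

void setup() {
  String[] chatter = split("going up to the ninth floor again nice weather today", ' ');
  println(lineOf(chatter, 5));   //a five-syllable first line
}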

P.Life


P.Life is a large-scale interactive screen designed for the IAC’s 120′ wide video wall. In the world of P.Life, Ps run around growing, living, and dying as the landscape continuously changes, creating unexpected situations that challenge their existence.


Scenario

The screen fades from black to dawn, with a rising sun along the horizon. The bottom third of the screen shows a section through the landscape, cutting through underground pipes, tunnels, reservoirs, etc. Towards the top, the surface of the landscape is visible as it fades and blurs into the horizon and sky.
A few Ps wander around the flat landscape. A number appears on screen for participants to text their name to. As participants send SMS messages, a new group of Ps appears on screen for each message and wanders across the landscape. The landscape begins to undulate as the audience interacts with the screen, creating hills, valleys, lakes, and cliffs. Ps running across the landscape fall to their death as the ground beneath their feet drops, or ride down the side of a hill as it moves across the screen like a wave. Ps that fall to their death slowly sink into the ground and become fertilizer for plant life, which is then eaten by other families of Ps, allowing them to multiply.


Features
SMS listener to make new families of Ps
An array of IP cameras to transmit video for screen interaction
Background subtraction to capture the audience’s gestures (see the sketch after this list)
or OpenCV with blob detection or face detection to capture the audience’s gestures
or IR sensors to capture the audience’s gestures
or lasers and photoresistors to capture the audience’s gestures
Multi-channel audio triggers for events in P.Life based on location
Background elements and landscape speed through sunrise to sunset in a 3-minute sequence
Ps with life-like motion as they walk, jump, fall, grow, climb, swim, drown, die, stumble, flip, run, etc.
Pixelated stick figures? Large heads?
Simple 8-bit game-like soundtrack
Various plant life grown from dead Ps
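
Of the input options listed above, background subtraction is the quickest to prototype. A minimal Processing sketch, assuming the Processing video library’s Capture class; press any key to store the current frame as the background:

//Minimal background subtraction sketch (assumes the Processing video library)
//press any key to store the current frame as the background
import processing.video.*;

Capture cam;
PImage bg;
int threshold = 40;   //brightness difference that counts as foreground

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  bg = createImage(width, height, RGB);
}

void draw() {
  if (cam.available()) cam.read();
  cam.loadPixels();
  bg.loadPixels();
  loadPixels();
  for (int i = 0; i < pixels.length; i++) {
    float diff = abs(brightness(cam.pixels[i]) - brightness(bg.pixels[i]));
    pixels[i] = diff > threshold ? color(255) : color(0);  //white where the audience moves
  }
  updatePixels();
}

void keyPressed() {
  bg.copy(cam, 0, 0, cam.width, cam.height, 0, 0, bg.width, bg.height);
}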

Precedents
Lemmings, N for Ninja, Funky Forest, Big Shadow, eBoy, Habbo

Technical Requirements
IP camera array
Multi-channel audio output

MultiMesh

Rhino loves to crash when it attempts to mesh a large set of surfaces in one shot. This often happens when you go to render and Rhino has to mesh every object in the scene. The MultiMesh plugin meshes multiple objects one at a time. By meshing your enormous, elaborately detailed scene before you render, you save time and your sanity.

Version: 0.03
MultiMesh.rhp.zip

Feedback Playback 2


FeedBack PlayBack is a dynamic film re-editing and viewing system. The user’s physical state determines the visceral quality of the scenes displayed; immediate reactions to the scenes feed back to generate a cinematic crescendo or a lull. We use material that is rigorously narrative, formulaic, and plentiful: the action movie series Die Hard, starring Bruce Willis. A narrative sequence key breaks any given Die Hard movie into narrative elements; corresponding clips were collected from each of the Die Hard movies. Individual clips fall into high, medium, and low action/arousal categories. The user is seated and places his or her hands on a Galvanic Skin Response (GSR) detection panel (GSR readings are the same kind of data collected in a lie detector test). After calibration, the movie begins, and clips are displayed depending on the user’s level of arousal and engagement. The narrative sequence is maintained, though the clips are pulled from any of the movies.
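
Reduced to pseudocode, the selection step is small: walk the narrative sequence key in order, and let the user’s smoothed GSR level relative to baseline pick the high, medium, or low version of each beat. A Processing sketch of that logic (the names and thresholds are illustrative, not the actual implementation):

//Illustrative clip selection (names and thresholds are not the actual implementation)
int LOW = 0, MEDIUM = 1, HIGH = 2;

int categoryFor(float gsr, float baseline) {
  float arousal = gsr / baseline;          //above 1 means more aroused than at rest
  if (arousal > 1.2) return HIGH;
  if (arousal > 1.05) return MEDIUM;
  return LOW;
}

//clips[narrativeElement][category] holds clips pulled from any of the Die Hard films
String nextClip(String[][] clips, int element, float gsr, float baseline) {
  return clips[element][categoryFor(gsr, baseline)];
}

void setup() {
  String[][] clips = {{"calm opening", "tense opening", "explosive opening"}};
  println(nextClip(clips, 0, 1.3, 1.0));   //an aroused viewer gets the HIGH clip
}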

Feedback Playback

FeedBack PlayBack is an interactive, dynamic film re-editing/viewing system that explores the link between media consumption and physiological arousal.

This project uses galvanic skin response and pulse rate to create a dynamic film re-editing and viewing system. The user’s physical state determines the rhythm and length of the cuts and the visceral quality of the scenes displayed; the user’s immediate reactions to the scenes delivered feed back to generate a cinematic crescendo or a lull. The project exploits the power of media to manipulate and alter our state of being at the most basic, primal level, and attempts to synchronize media and viewer, whether towards a static loop or an explosive climax.

In a darkened, enclosed space, the user approaches a screen and rests his or her fingertips on a pad to the right of the screen. The system establishes a baseline for this user’s physiological response and calibrates accordingly. Short, non-sequential clips of a familiar, emotionally charged film, for example Stanley Kubrick’s 1980 horror masterpiece “The Shining”, are shown. If the user responds to slight shifts in the emotional tone of the media, the system amplifies that response and displays clips that are more violent and arousing, or calmer and more neutral. The film is re-edited, the narrative reformulated according to this user’s response to it.

Feedback Playback is by Zannah Marsh and Che-Wei Wang

GSR Reader

Galvanic skin response readings are simply a measurement of electrical resistance through the body. Two leads are attached to two fingertips. One lead sends current while the other measures the difference. This setup measures GSR every 50 milliseconds. Each reading is graphed; peaks are highlighted, and a running average is calculated to smooth out the values. If the readings go flat (fingers removed from the leads), a new baseline reading is taken over 10 seconds.
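
A Processing sketch of the sampling and smoothing logic described above. readGSR() is a stand-in for the real sensor read (an analog measurement through the leads in the actual setup), and the thresholds are illustrative:

//Sampling and smoothing sketch; readGSR() stands in for the real sensor read
float smoothed = 0;
float baseline = 0;
float lastRaw = 0;
int flatSince = -1;          //millis() when the signal first went flat, -1 otherwise

float readGSR() {
  return 512 + 100*sin(millis()/1000.0);   //fake signal for illustration
}

void setup() {
  frameRate(20);             //one sample every 50 milliseconds
}

void draw() {
  float raw = readGSR();
  smoothed = lerp(smoothed, raw, 0.1);     //running average to smooth the values

  //a flat signal usually means the fingers have left the leads
  if (abs(raw - lastRaw) < 0.01) {
    if (flatSince == -1) flatSince = millis();
    if (millis() - flatSince > 10000) baseline = smoothed;  //take a new baseline after 10s
  } else {
    flatSince = -1;
  }
  lastRaw = raw;
}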

Etek EB-85A GPS Example Code


Here’s some example Arduino code for getting an Etek EB-85A module up and running, reading latitude and longitude (it will probably work with most GPS modules). You can purchase a module from Sparkfun.

The module only needs power, ground, RX, and TX. Most modules like the Etek start sending NMEA strings as soon as they have power. The Etek module takes a minute or two to get a satellite fix from a cold start in urban environments. Signals drop out once in a while between tall buildings at street level, even with DGPS and SBAS. On a clear day, if you’re lucky, you can get a signal sitting by a window in urban canyons.

//Etek GPS EB-85A Module Example
//by Che-Wei Wang and Kristin O'Friel
//32 Channel etek GPS unit
//modified from original code by Igor González Martín. http://www.arduino.cc/playground/Tutorials/GPS
boolean startingUp=true;
boolean gpsConnected=false;
boolean satelliteLock=false;

long myLatitude,myLongitude;

//GPS
#include <string.h>
#include <ctype.h>
int rxPin = 0;                    // RX PIN
int txPin = 1;                    // TX PIN
int byteGPS=-1;
char linea[300] = "";
char comandoGPR[7] = "$GPRMC";
int cont=0;
int bien=0;
int conta=0;
int indices[13];

//////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
//////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

void setup() {
  //GPS
  pinMode(rxPin, INPUT);
  pinMode(txPin, OUTPUT);

  for (int i=0;i<300;i++){        // clear the NMEA sentence buffer
    linea[i]=' ';
  }
  Serial.begin(4800);             // match your module's NMEA baud rate (adjust if needed)
}

void loop() {
  // Minimal read loop: echo raw NMEA sentences to the serial monitor.
  // The full $GPRMC parsing into myLatitude/myLongitude follows the
  // approach of the playground tutorial credited above.
  byteGPS = Serial.read();
  if (byteGPS != -1) {
    Serial.print((char)byteGPS);
  }
}

Infinite Mouse Tracking

I wanted to use the mouse as a simple surface optical encoder to get infinite panning motion. The problem with reading mouse coordinates on the screen is that once the mouse reaches the edge of the screen, it stops counting. A simple solution is to reposition the mouse (using Java’s Robot class) every few frames and calculate the change in mouse position.


//InfiniteMouseTracking
//by Che-Wei Wang
//2.20.2008

float mapPositionX;
float mapPositionY;
long count=0;
int moveX=0;
int moveY=0;

void setup() 
{
  size(screen.width,screen.height,P3D);
  mapPositionX=width/2;
  mapPositionY=height/2;

  //noCursor();
  
  //set the mouse position once before the program begins
  try {
    Robot robot = new Robot();
    robot.mouseMove(width/2, height/2);    
  } 
  catch (AWTException e) {
  }

}

void draw()
{
  background(0);

  //reset the cursor position every few frames
  if(count%4==0){
    try {
      Robot robot = new Robot();
      robot.mouseMove(width/2, height/2);    
    } 
    catch (AWTException e) {
    }
    moveX=mouseX-pmouseX;
    moveY=mouseY-pmouseY;
  }

  count++;

  //new position= old position + movement * decay
  mapPositionX=mapPositionX+moveX*.8;
  mapPositionY=mapPositionY+moveY*.8;

  stroke(255);
  line(width/2,height/2,mapPositionX,mapPositionY);
  ellipse(mapPositionX,mapPositionY,100,100);

}

Sensors Galore

An example of the BlobDetection, Ess, SMS, and OCD libraries all rolled into one sketch. Click and drag on the screen to move the HUD sliders around. (Nothing will load here; copy the code into your own Processing sketch to run it.)

Chess Clock


I couldn’t find a fullscreen chess clock, so I wrote a very simple application to do just that. Instead of a traditional clock face with minute and second hands, a ring counts down the total number of seconds remaining. A small dot at the center of each ring indicates which player is white, and turns red to indicate the flag. While one player’s display is counting down, the other’s is dimmed. The timer, displayed at the center of the screen before the countdown starts, is easily adjusted from 1 to 150 minutes.
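
The ring itself is just an arc whose sweep shrinks in proportion to the seconds remaining. A minimal Processing sketch of that drawing trick (one ring only, not the full app):

//Countdown ring sketch (one ring only, not the full app)
int totalSeconds = 5*60;   //a 5-minute clock
int startMillis;

void setup() {
  size(300, 300);
  strokeWeight(12);
  startMillis = millis();
}

void draw() {
  background(0);
  float remaining = max(0, totalSeconds - (millis() - startMillis)/1000.0);
  float sweep = TWO_PI * remaining/totalSeconds;   //full circle when full time remains
  noFill();
  stroke(255);
  arc(width/2, height/2, 200, 200, -HALF_PI, -HALF_PI + sweep);  //sweep clockwise from 12 o'clock
  noStroke();
  fill(remaining > 0 ? color(255) : color(255, 0, 0));  //center dot turns red on the flag
  ellipse(width/2, height/2, 10, 10);
}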

MultiPipe

MultiPipe is a simple plugin for Rhino 4.0 that pipes multiple curves at once, with start and end diameter options and cap type options. Unzip and install by dragging the plugin into Rhino. The command for the tool is “MultiPipe”.

Download: multipipe04.zip

Haptic Clock

The Haptic Clock is a small clock program for Java powered mobile phones. The clock conveys time through a sequence of vibrations so you never have to pull the phone out of your pocket to tell time. The idea behind it was to create a clock that would train my body to understand time better.
Long vibrations are the number of hours of the current time on a 12-hour clock, so 6pm and 6am are both 6 vibrations. The shorter vibrations are the number of minutes divided by 5, so 4 vibrations is 20 minutes and 7 vibrations is 35 minutes. Example: (3) long vibrations and (6) short vibrations means it’s 3:30. Just in case you do want to see the time, the screen displays it with tick marks for hours, minutes, and seconds.
Instructions: Press any key to vibrate the current time. Press ‘0’ to exit the program. Press UP and DOWN to control the speed of vibrations, or move the joystick to change options; options include vibration speed and vibration frequency (the time between automated vibration alerts). Time alerts (vibrations) will occur automatically at 0, 15, 30, and 45 minutes past the hour as long as the program is running.
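
The encoding reduces to two integer counts. A tiny sketch of just that step (the phone app itself is J2ME; how 12 o’clock reads is my assumption):

//The time-to-vibration encoding, reduced to its two counts
int longPulses(int hour24) {
  int h = hour24 % 12;
  return h == 0 ? 12 : h;   //assumes 12 o'clock reads as 12 long vibrations
}

int shortPulses(int minutes) {
  return minutes / 5;       //one short vibration per 5 minutes
}

void setup() {
  println(longPulses(15) + " long, " + shortPulses(30) + " short");  //3:30pm -> 3 long, 6 short
}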

Current Version: 0.08
Released: 5.24.2007
Creator: Che-Wei Wang
License: GNU Public License (source)
Download Beta: Haptic Clock 08.jad, Haptic Clock 08.jar
Beta means it may not work on your phone or worse, may break your phone. Install and use at your own risk.
Tested on: Nokia E70
Issues: J2ME drains the batteries. Looking for ways around it, or a more efficient platform.