interactive_smartphone_concert

High level idea:

Many audience members bring their smartphones to concerts, but performers do not yet have the ability to interact with those phones.

I am proposing two parts to this project:

  1. a central server that knows each participating phone: the phone's location and capabilities. It can then send audio/video for any subset of phones to play.
  2. a smartphone application that will make its location in the venue known to the central server and accept audio/video to display. As a bonus, it can shut off the ringer/notification sounds.
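Since nothing beyond the prototype below was implemented, here is only a hedged sketch of what the proposed server's phone registry might look like. All names (`Phone`, `phones_in_rows`, the row/seat location model) are hypothetical placeholders, not part of any actual code:

```python
# Hypothetical sketch of the proposed central server's phone registry.
# Phones report their seat position; the server can then pick a subset
# (here, a band of rows) to receive audio/video. Purely illustrative.
from dataclasses import dataclass

@dataclass
class Phone:
    phone_id: str
    row: int   # seat row in the venue
    seat: int  # seat number within the row

def phones_in_rows(phones, first_row, last_row):
    """Select the subset of phones seated between two rows, inclusive."""
    return [p for p in phones if first_row <= p.row <= last_row]
```

A real implementation would also track each phone's capabilities (screen, speaker, ringer control) alongside its location.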

HAMR results

The project I coded ended up differing from the proposal; you can read about it below.

code

This is the README file:

This is the beginnings of a framework that allows more interactive digital music performances. The idea is to allow those watching performances with smartphones and/or web browsers to be presented with a representation of what the performer is doing on the performer's computer, and also to allow information gathered from audience members' smartphones to be made available to the performer. The second part of this idea was to allow the performer to broadcast images/video/music to a subset of the audience members based on their location.

Currently I have a simple prototype put together that only allows performers to share keystrokes and mouse movements with the entire audience. This prototype is made up of three components:

1. Mouse/keylogger client on the performer's machine.
2. HTML/JS client that audience members can view.
3. Central server that tracks mouse/keystrokes and keeps audience clients updated (AJAX involved).

The server is built on Tornado, the audience client uses JSON, and the performer's logger uses OSX tracking libraries and only works on OSX.

The performer's client posts all updates to the server via a simple GET request: server:8888/update?x=(x coordinate)&y=(y coordinate)&key=(string representation of key press)
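The actual performer client is C code using OSX tracking libraries, but the update call it makes can be sketched in Python. The server address and the helper names (`build_update_url`, `send_update`) are placeholders, not taken from the real client:

```python
# Sketch of the performer-side update request described above.
# The server address is a placeholder; the real client is native OSX code.
from urllib.parse import urlencode
from urllib.request import urlopen

def build_update_url(server, x, y, key):
    """Build the GET URL reporting mouse position and last key press."""
    query = urlencode({"x": x, "y": y, "key": key})
    return f"{server}/update?{query}"

def send_update(server, x, y, key):
    """Fire one update at the server (requires the server to be running)."""
    with urlopen(build_update_url(server, x, y, key)) as resp:
        return resp.status
```

For example, `build_update_url("http://localhost:8888", 10, 20, "a")` yields `http://localhost:8888/update?x=10&y=20&key=a`.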

The audience client can view the performance at server:8888, and JavaScript automatically asks for updates via server:8888/update. When the server gets a GET request to /update, it returns a JSON object with x, y, and key.
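The state the server keeps between these two requests is small: the latest mouse position and key press. A minimal sketch of that state object, assuming a class name (`PerformerState`) not taken from the real server.py:

```python
# Hedged sketch of the server-side state behind /update.
# An incoming performer GET would call update(); an audience poll
# would receive the JSON produced by to_json().
import json

class PerformerState:
    """Holds the latest mouse position and key press from the performer."""

    def __init__(self):
        self.x = 0
        self.y = 0
        self.key = ""

    def update(self, x=None, y=None, key=None):
        # Query-string values arrive as strings; coerce coordinates to int.
        if x is not None:
            self.x = int(x)
        if y is not None:
            self.y = int(y)
        if key is not None:
            self.key = key

    def to_json(self):
        # The JSON object with x, y, and key returned to audience clients.
        return json.dumps({"x": self.x, "y": self.y, "key": self.key})
```

In the real server this state would live inside a Tornado request handler, with Tornado's event loop serving both the performer's updates and the audience's polls.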

Currently, the audience member's client shows the raw data it receives from the server. It also tries to draw the mouse movements in the webpage. The size of the canvas and the resolution of the performer's machine are both hard-coded at this point.
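Replacing those hard-coded values amounts to a simple linear rescaling from the performer's screen resolution to the canvas size. The drawing itself happens in the JS client, but the mapping can be stated as a one-liner (the function name is illustrative):

```python
def scale_point(x, y, src_w, src_h, dst_w, dst_h):
    """Map a mouse coordinate from the performer's screen (src_w x src_h)
    onto the audience canvas (dst_w x dst_h)."""
    return (x * dst_w / src_w, y * dst_h / src_h)
```

For example, the center of a 1440x900 screen lands at the center of a 480x300 canvas.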

The server code is server/server.py. All static documents are stored in server/static/; currently the only thing stored there is a cached jQuery js library.

There is only one Tornado template, index.html, which is stored in server/templates. This template requires the renderer to fill in three values: the x and y coordinates of the mouse, and the last key pressed.
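The real index.html is a Tornado template, but the three-value fill-in it requires can be illustrated with a stdlib stand-in using string.Template (the template text below is a placeholder, not the actual markup):

```python
# Stand-in for the Tornado template render: three values are
# substituted into the page. string.Template is used here only so the
# sketch has no dependency on Tornado; the real server uses Tornado's
# own {{ ... }} template syntax.
from string import Template

INDEX_TEMPLATE = Template("mouse at ($x, $y), last key: $key")

def render_index(x, y, key):
    """Fill the three template slots the renderer must provide."""
    return INDEX_TEMPLATE.substitute(x=x, y=y, key=key)
```
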

The OSX keylogger is stored in keylogger-osx, and is based on https://github.com/dannvix/keylogger-osx.git. Use make to build it, and run it via ./keylogger

Install notes:

- The server needs python tornado installed
- Currently the IP address of the server is hardcoded into index.html and server.py; you will need to change these
- For best results, use DNS to address the server
- The performer client code currently only works on OSX. Xcode CLI tools must be installed to build the client code.
- The keystroke/mouse logger needs to be run as root

interactive_smartphone_concert.txt · Last modified: 2013/09/29 13:29 by rebecca_shapiro