Control lets users design their own touchscreen interfaces for controlling music, art and virtual reality software. It is the only mobile device interface software that allows user scripting, transmits both wireless MIDI and OSC, and runs on both iOS and Android devices. Control is free on the Apple App Store and Android Market and is also open-source. It was selected by Apple as a New and Noteworthy app for the week of 1/23/2011.
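To give a sense of what Control's OSC output looks like on the wire, here is a minimal pure-Python sketch of OSC message encoding (padded address, type-tag string, big-endian floats). The `/slider1` address is a hypothetical example for illustration, not a fixed Control address.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad with NULs to a multiple of 4 bytes; OSC strings always get at least one NUL."""
    return b + b"\x00" * (4 - len(b) % 4)

def encode_osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    packet = osc_pad(address.encode("ascii"))                      # padded address pattern
    packet += osc_pad(("," + "f" * len(floats)).encode("ascii"))   # type-tag string, e.g. ",f"
    for value in floats:
        packet += struct.pack(">f", value)                         # big-endian float32
    return packet

msg = encode_osc_message("/slider1", 0.5)  # the kind of message a touch widget might send
```

A receiving application (Max/MSP, SuperCollider, ChucK and many others speak OSC) would parse the address pattern to route the value to the right parameter.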
I first presented a paper on Control at the 2011 International Computer Music Conference; a second paper on live coding interfaces with Control was presented at the 2012 NIME (New Interfaces for Musical Expression) conference.
The Device Server drives interactivity inside the AlloSphere Research Facility, a three-story spherical immersive instrument housed in the California NanoSystems Institute. As a senior member of the AlloSphere research group I am responsible for enabling developers to easily utilize a wide variety of mobile, gaming and virtual reality devices. The Device Server provides this ability, allowing users to configure interactive controls and perform signal processing on control signals via JIT-compiled Lua scripts.
The Device Server was written in Objective-C for OS X. I recently presented the project at the New Interfaces for Musical Expression conference; for more information please read the paper in the proceedings from NIME or visit the dedicated webpage.
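To give a flavor of the per-sample processing such a script might perform, here is a one-pole smoothing filter, a common way to tame noisy controller data, sketched in Python. The Device Server's actual scripts are JIT-compiled Lua; the class name and coefficient here are illustrative, not part of the Device Server API.

```python
class OnePoleSmoother:
    """Exponential (one-pole) low-pass filter for noisy control signals.

    y[n] = y[n-1] + a * (x[n] - y[n-1]),  with 0 < a <= 1.
    Smaller `a` means heavier smoothing but more lag.
    """
    def __init__(self, a: float = 0.2):
        self.a = a
        self.y = 0.0

    def __call__(self, x: float) -> float:
        self.y += self.a * (x - self.y)
        return self.y

smooth = OnePoleSmoother(a=0.5)
out = [smooth(1.0) for _ in range(4)]  # a step input converges toward 1.0
```

The same few lines of filtering logic can be applied uniformly to any device the server knows about, which is one reason scripted signal processing is attractive for heterogeneous controller setups.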
iPhone / iPad software for demanding musicians. Visual Metronome was the first metronome in the App Store to keep accurate time. It continues to lead the way with features such as unlimited beats per measure, tempos from 1 to 800 BPM, click synthesizers and a clear, minimal design.
Tens of thousands of copies of Visual Metronome have been sold since it was first released. Learn more at the Visual Metronome website.
Composition for Conductor and Audience
This audience participation piece let 20+ audience members use their personal mobile devices to control structured scenes of music. If participants did not obey the wishes of the conductor, the conductor had the power to temporarily 'cut' them from the piece. The piece investigated whether audience members would follow the conductor or simply do whatever they chose in an anonymous performance environment.
fMRI and EEG Visualization / Sonification
Working with Dr. Christine Tipper from the UCSB Psychology department, I have created a 3D environment for displaying fMRI and EEG data simultaneously. The coupling of these two data types is important: fMRI data is temporally coarse (roughly 1 Hz) but spatially accurate, while EEG data is much more temporally dense but lacks spatial precision. Until recently it was not possible to wear EEG sensors inside an fMRI machine; the ability to do so now offers the possibility of new insights into how the human mind functions. For these insights to be obtained, scientists will need new ways of displaying, navigating and manipulating increasingly multi-dimensional datasets.
The current project shows a static fMRI snapshot of the brain accompanied by time-varying EEG data. Once the visualization / sonification of this dataset has reached a mature stage of development we will move towards displaying time-varying fMRI data as well; scientists will be able to stand in the center of the human brain as it undulates around them.
Dialectic #1 is a spoken word piece that looks at the roles of emotion and logic in making decisions; in this case, specifically decisions about a somewhat frustrating relationship. Emotion is represented by the audio signal of my heartbeat playing throughout the piece; this audio is warped and manipulated by 'logic', which is represented by brainwave signals generated during the performance. I built the EEG machine used in this project via instructions from the OpenEEG project.
The piece has a dedicated website where you can learn more and listen to the performance.
Improv for Liquids and Electronics
This piece stemmed from my interest in the liquid projections (aka wet shows) used in psychedelic performances of the '60s and '70s. Using overhead projectors, oils, dyes, pH indicators, bases and acids, I created a visual display of swirling colors and chemical reactions. By manipulating the pH balance of the container solution I changed the color of the indicators in the solution; adding droplets of oil and glycerin allowed me to create solid blobs of color. The entire mix was placed in a concave clock face on top of the overhead projector; the concavity allowed me to spin the clock face to create motion.
In order to generate music for the piece I analyzed the visuals I created using a video camera and some simple computer vision algorithms. The output was used to feed generative musical algorithms created in the ChucK programming language. The piece has been performed as part of the Primavera Festival as well as the digital arts exhibition "Something You Don't Know". Watch a video of the performance.
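As an illustration of the analysis-to-music mapping described above, here is a minimal Python stand-in: extract one simple feature from a video frame (average brightness) and map it onto a MIDI pitch range. The actual piece used ChucK and its own vision features; the function names, luma weights and note range below are assumptions for the sketch, not the piece's real mapping.

```python
def mean_brightness(frame):
    """Average luma (0-255) of an RGB frame given as rows of (r, g, b) tuples."""
    total = count = 0
    for row in frame:
        for r, g, b in row:
            total += 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luma weights
            count += 1
    return total / count

def brightness_to_midi(brightness, low=36, high=96):
    """Map 0-255 brightness linearly onto a MIDI note range (defaults are illustrative)."""
    return round(low + (high - low) * brightness / 255)

frame = [[(255, 255, 255), (0, 0, 0)]]  # one white pixel, one black pixel
note = brightness_to_midi(mean_brightness(frame))
```

A generative algorithm could then use a stream of such notes, or any other extracted feature, to steer pitch, density or timbre over the course of the performance.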
Stereo is a library for the Processing prototyping environment that allows developers to easily create 3D stereoscopic graphics, animations and virtual reality environments. Since Processing has a strong educational slant, the Stereo library was designed to be easy to use for novice programmers. It supports anaglyph, active and passive stereo projection systems.
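All three projection systems share the same core idea: render the scene twice from two eye positions offset along the camera's right vector, then deliver one image to each eye (by color filter, shutter timing or polarization). A minimal Python sketch of that offset computation follows; it is not the Stereo library's API, and the 0.06 separation is just a typical human interocular distance in meters.

```python
def eye_positions(cam_pos, right_vec, eye_sep):
    """Left/right eye positions for stereo rendering: offset the camera
    position by half the eye separation along its (unit) right vector."""
    half = eye_sep / 2.0
    left = tuple(p - half * r for p, r in zip(cam_pos, right_vec))
    right = tuple(p + half * r for p, r in zip(cam_pos, right_vec))
    return left, right

left, right = eye_positions((0.0, 0.0, 5.0), (1.0, 0.0, 0.0), eye_sep=0.06)
```

Hiding this two-pass bookkeeping (plus the matching asymmetric view frustums) behind a simple call is what makes such a library approachable for novice programmers.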
Disseminate is an online platform for conducting video-based seminar discussions and was my Masters project at Columbia University. It provides an environment for multi-modal, real-time communication that can be used both in online courses and to supplement traditional classroom discussions. Disseminate was created with ease of use in mind at all times and is optimized to reduce the bandwidth needed for online video communication while emphasizing visual community among participants.
Disseminate was written using Macromedia Flash and the Flash Communication Server (FCS). The picture on the left is from a test conducted in a course at Columbia with the cooperation of the Columbia Center for New Media Teaching and Learning.
Thunderdrop is a post-math rock trio featuring myself on vocals, bass and electronics; Angus Forbes on drums and vocals; and Amelia Nuding on the musical saw and vocals. We play shows throughout the greater Santa Barbara area and are currently in the process of completing our first studio album.
Rocked By The Cyclotron
mediaMemory is based on the classic card game memory. Instead of cartoons, mediaMemory uses images drawn from advertisements. The images are divided into four different categories: Violent, Animal, Sexual and Landscape. In examining the "memorableness" of these different categories in relation to one another, mediaMemory helps improve media literacy by encouraging users to think about the ways advertisers use imagery to sell their product.
midiStroke allows you to trigger keystrokes in the currently focused application using MIDI (Musical Instrument Digital Interface) note, program and CC messages. Each MIDI message can trigger an unlimited number of keystrokes in sequence. This is of enormous value for performing electronic musicians, but can also be used to automate everyday tasks through the use of MIDI footpedals, keyboards and control surfaces.
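The MIDI channel messages midiStroke listens for pack the message type and channel into a single status byte, followed by one or two data bytes. A short Python sketch of that decoding (the function name and return shape are illustrative, not midiStroke internals):

```python
def parse_midi(status, data1, data2):
    """Decode a 3-byte MIDI channel message into (kind, channel, data bytes)."""
    kind_nibble = status >> 4   # high nibble selects the message type
    channel = status & 0x0F     # low nibble is the channel (0-15)
    kinds = {0x8: "note_off", 0x9: "note_on",
             0xB: "control_change", 0xC: "program_change"}
    return kinds.get(kind_nibble, "other"), channel, (data1, data2)

kind, ch, data = parse_midi(0x90, 60, 100)  # note-on, channel 0, middle C
```

A mapper like midiStroke keys its lookup table on exactly these decoded values, so that, for example, note 60 on channel 0 can fire one keystroke sequence while CC 64 fires another. (By MIDI convention a note-on with velocity 0 is treated as a note-off, a wrinkle any such mapper has to handle.)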