Cycliptic: A Flower It Will Become
Music release. Bass guitar, synths, production.
A conference talk from ADC 23 on software engineering techniques and architecture for building a web interface on a C++ audio application.
Explorations integrating SuperCollider instruments into an Ableton Live set. A web-based touch-screen UI controls SuperCollider patches through a Node.js server (running inside an Electron app). The tempo clock is synced from SuperCollider to Ableton Live using Ableton Link.
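If you're curious what the SuperCollider side of that looks like, here's a rough sketch. The OSC address, argument layout, and `~patches` registry are made up for illustration, not the actual protocol; recent SuperCollider versions provide LinkClock for joining a Link session.

```supercollider
// Receive control messages forwarded from the web UI by the Node.js server.
// '/ui/control', its argument layout, and ~patches are illustrative placeholders.
OSCdef(\touchUI, { |msg|
    var patchIndex = msg[1];
    var value = msg[2];
    ~patches[patchIndex].set(\amount, value); // forward to the matching patch
}, '/ui/control');

// Join the Ableton Link session so Live follows SuperCollider's tempo clock.
~clock = LinkClock(120/60).latency_(Server.default.latency);
```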
A prototype of an interactive LED wall experience, allowing participants to paint gestures on a touchscreen.
The Transdimensional Audio Workstation is an interactive musical experience. The machine allows one to tune into another dimension in order to send a musical communication to oneself. After sending the music, any transdimensional response will be interpreted and played back automatically.
The folks from Bells Atlas had an idea to release a single song from their EP, chosen at random to anyone who signed up. I liked this idea and built it for them: hyperlust.bellsatlas.com.
I spoke at "Music + Tech" night, hosted by the Institute for the Future, on two topics: algorithmic music and GUIs for musical creation.
Over the recent holiday season I decided to try working on a project together with my family. I proposed the medium of LED light animation and started talking with Pegs (my mom) about it. What we ended up with looks pretty cool! It is made from some LED strips in the shape of a star.
I performed as a guest with the Santa Clara Laptop Orchestra. The performance was in the Contemporary Jewish Museum as part of Google I/O.
The "All Worlds Fair" took place in San Francisco at the beautiful old mint building. It was quite a delightful gathering of people and showcased some art and performances that I found quite amazing.
I built a soundscape for the "Seas of the Subconscious" experience that took place in the basement. It sounded pretty sweet in that steel-walled room.
The soundscape is generative, so I could render out 3 minutes or 5 hours if I needed. Here is a short sample:
It is written entirely in SuperCollider, so I now have the start of a framework for developing soundscapes in the way that I like to think about them. It is open source:
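To give a flavor of the generative approach, here's a minimal sketch, not code from the framework; the synthesis and timing choices are invented. A texture like this runs indefinitely, so it can be recorded (or rendered non-realtime) for whatever duration is needed.

```supercollider
// A minimal generative layer: spawn slow, detuned drones at random intervals,
// forever. Record for 3 minutes or 5 hours as needed.
Routine {
    loop {
        {
            var freq = rrand(40.0, 200.0);
            var sig = SinOsc.ar(freq * [1, 1.003]) * 0.05;
            sig * EnvGen.kr(Env.sine(rrand(8.0, 20.0)), doneAction: 2);
        }.play;
        rrand(2.0, 10.0).wait;
    }
}.play;
```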
This piece is a collaboration with Jen Hsu which we first showed at the CCRMA Fall Concert 2012. It was quite a pleasure to create.
"Jnana" is an algorithmic accompaniment system integrated into Ableton Live. It can analyze MIDI input and generate melodic and rhythmic material in a similar style.
For a final project, I expanded upon my prior compositional ideas for "Determinism" to create a more detailed algorithmic piece.
"tulpasynth" is a prototype real-time collaborative music creation system that takes advantage of touchscreen gestures for a tangible, responsive UI. It has been generally well-received when I have shown it at CCRMA events and at the 2011 bay area Maker Faire.
For this assignment, I brought together a variety of techniques to create some algorithmic music.
For this assignment, I gathered a few recordings and created a short musical statement by manipulating the sounds in ChucK. The rhythmic variation is created with first-order Markov chains, and the string-like sounds are created by feeding a recording into a Karplus-Strong algorithm.
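The original is ChucK, but as a rough illustration of the string-like half of that recipe, here's the same idea in SuperCollider: a short burst of a recording excites the Pluck (Karplus-Strong) UGen. The file path and parameter values are placeholders.

```supercollider
// Excite a Karplus-Strong delay line (Pluck) with a short burst of a recording.
b = Buffer.read(s, "recordings/source.wav"); // placeholder path

// Run after the buffer has loaded.
{
    var trig = Impulse.kr(1);                       // re-pluck once per second
    var burst = PlayBuf.ar(1, b, BufRateScale.kr(b), trig) * Decay.kr(trig, 0.02);
    Pluck.ar(burst, trig, 0.02, 110.reciprocal, 4, 0.3) * 0.5;
}.play;
```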
Here is a "musical statement" I created while working on a homework assignment. The assignment was to experiment with FM synthesis by building some timbres and submitting them along with a musical statement.
"KnacK" is a music compositon framework for ChucK.
The basic idea of the framework is to provide some conventions for making compositional code modular and reusable. There are some other features that I am interested in developing further as well, such as an MVC-like interaction between raw "aesthetic data" and the instruments/performers in a musical piece.
"tulpasynth" is a real-time collaborative music creation system.
It is a web-based application for creating music with others by manipulating simple shapes in the browser.
Below is an audio montage/narrative-like soundscape that I have created from various clips of NPR interviewees and reporters. It was originally developed for a 4-channel audio system, but below is the binaural stereo mix. Audio was generated using ChucK. End result and source code can be found below. Hope you enjoy.
NOTE: Unlike most content on this site, this audio is NOT licensed as Creative Commons. The content is copyrighted by NPR.
NOTE: Source code is licensed MIT.
Download source code
Every New Year's Day, many of my family and friends gather to play bingo. In recent years there have been too many people in the house to hear what numbers have been called.
I find the significance of the Fibonacci number sequence in nature, art, and mathematics very interesting. For some time I have been thinking about how this pattern can be applied to music and have been developing my own musical composition that is algorithmically based on the Fibonacci sequence.
Here is the composition as it stands currently. Please enjoy, and feel free to download if you would like.
While working on my Fibonacci piece (mentioned here), some interesting results pop up now and then. These riffs were algorithmically generated, written in JavaScript and Max/MSP, and are based on the Fibonacci numbers in multiple ways.
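Those riffs were JavaScript and Max/MSP; as one simple illustration of mapping the sequence to pitch (only one of the "multiple ways," and sketched in SuperCollider rather than the original environment):

```supercollider
// Map Fibonacci terms, mod 7, onto degrees of a minor scale.
(
var fib = [1, 1];
14.do { fib = fib.add(fib.last + fib[fib.size - 2]) };

Pbind(
    \scale, Scale.minor,
    \degree, Pseq(fib % 7, inf),
    \dur, 0.25
).play;
)
```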
Last year I developed a short composition based on the Fibonacci numbers in which various accompaniments were generated algorithmically; you can listen to it here.
This semester, I am working with Prof. Curtis Bahn in a much greater capacity, and will hopefully be developing this idea into the composition that I have always wanted it to be.
Here is the result of this semester's work on my Fibonacci composition:
If you are interested in my process, please feel free to read my final report below, or email me.
Concert is an online collaborative organizational tool for sounds. I am developing Concert as part of the Software Design and Documentation class here at RPI. See our wiki for more information.
This semester, for "Interactive Arts Programming" class, I will be making an interactive composition based on the Fibonacci sequence. The piece will involve me playing the bass line of my composition, while a computer takes my performance as input and generates algorithmically determined accompaniments.
This past New Year's, I continued work for a pyrotechnics entertainment company, this time at Times Square, programming and operating the digital firing system to fire all of the pyrotechnics underneath the ball.
This semester I have been pursuing an independent study in DSP Programming for Music/Audio Applications. For my final project, I will be designing an Overdrive/Distortion effect that is customized for the sound of my bass.
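The final design isn't settled yet, but here's a starting-point sketch of the general approach: pre-gain into a tanh waveshaper, then a low-pass to tame the added harmonics. All parameter values are guesses, not the tuned effect.

```supercollider
// Minimal waveshaping overdrive: pre-gain -> tanh soft clipper -> low-pass.
SynthDef(\bassDrive, { |out = 0, drive = 8, tone = 2500, level = 0.5|
    var sig = SoundIn.ar(0);        // bass on hardware input 0
    sig = (sig * drive).tanh;       // soft clipping adds mostly odd harmonics
    sig = LPF.ar(sig, tone);        // roll off fizz above the tone frequency
    Out.ar(out, (sig * level) ! 2);
}).add;
```

With the def added, `Synth(\bassDrive)` runs the effect live on the input; raising `drive` pushes it from warm overdrive toward harder distortion.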