Tangible Interaction Workshop – MIDI Music Instrument

For this assignment, we created musical instruments that use the MIDI protocol. Mine combines melody and percussion, and frees the musician up to use their voice if they wish. I’m imagining that lead vocalists in bands may want an instrument like this. If this prototype were to become wireless and portable, they would be free to leave their setup on stage and move around. It could also work just as well for other musicians, whether accompanying themselves, playing in an ensemble, or backing up another musician.

[Image: IMG_1246.jpg]

The soft pot is for volume, and the buttons on the right are for choosing pitches. The orange shape marks where to strike with your drumstick.

[Image: IMG_1260]

Inspiration

I was inspired by two things. The first is having seen many musician friends and colleagues perform over the years while I managed and produced live music programs and concerts. I thought it might be fun to create something for a musician that amplifies how they communicate energy to the audience, all while singing and moving around the stage, or even drawing attention from further backstage. I’ve also always loved how much sound a small handheld instrument adds to a band. And from a physical interaction design perspective, handheld instruments usually call for a very specific gesture that the audience can really see and enjoy.

The second is these handheld instruments from across the African Diaspora. Each adds its own unique sound and specific gesture to what’s happening on stage. I’ve long admired them and thought I would make something that builds on their shape and purpose.

[Image: claves.jpeg]

Claves, Afro-Cuban

[Image: guiro]

Guiro, used in Puerto Rican, Cuban, and other forms of Latin American music

[Image: agbe]

Agbe, African

Assignment from Class

Make a device to control playback of synthesized or pre-recorded music using MIDI. You do not have to build the playback engine, just the physical control device. Your device will send MIDI messages to another device which will play the music.

Your device should support the following features:

    • start and stop the playing of a note
    • play multiple notes simultaneously
    • sustain a note
    • pitch-bend a note

Make a housing for your controller. Document it according to the project documentation guidelines at the end of this page.
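For reference, each of those required features maps onto a small raw MIDI message. Here’s a rough sketch of what the bytes look like when written straight to the serial port on MIDI channel 1; the function names are just illustrative, not from my actual sketch, and playing multiple notes at once simply means sending several note-ons before their note-offs.

// Rough reference for the assignment's MIDI features, written as raw bytes on channel 1.
void setup() {
  Serial1.begin(31250);                      // MIDI's serial baud rate
}

void loop() {
  // nothing here; this is just a reference sketch
}

void noteOn(byte pitch, byte velocity) {     // start a note (0x90 = note on, channel 1)
  Serial1.write(0x90); Serial1.write(pitch); Serial1.write(velocity);
}

void noteOff(byte pitch) {                   // stop a note (0x80 = note off, channel 1)
  Serial1.write(0x80); Serial1.write(pitch); Serial1.write((byte)0);
}

void sustain(bool on) {                      // sustain pedal is control change 64
  Serial1.write(0xB0); Serial1.write((byte)64); Serial1.write(on ? 127 : 0);
}

void pitchBend(int amount) {                 // 14-bit bend value, 0-16383, 8192 = centered
  Serial1.write(0xE0); Serial1.write(amount & 0x7F); Serial1.write((amount >> 7) & 0x7F);
}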

Design Process

I started by sketching out a few ideas. At first I played it safe and planned to make a guitar with buttons, but as I mentioned above, I decided to challenge myself to make an instrument that doesn’t quite exist yet.

I bought my materials, including some cool buttons, wire connectors (these are amazing – no soldering to your buttons!), and containers from, where else, the Container Store. Thanks to my classmate Nick for keeping it real and suggesting I just buy an off-the-shelf container rather than make my own at this stage.

Circuit Process

As usual, I started small so that I could get simple sensor readings from my five buttons, shake sensor, and soft pot.

[Image: IMG_1216.JPG]
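That first test was just printing raw readings to the serial monitor. A minimal version of it looks something like this, with placeholder pin numbers rather than my exact wiring:

// Print raw readings from the five buttons, the shake sensor, and the soft pot.
// Pin numbers are placeholders; the buttons and shake sensor are digital inputs
// with internal pullups (pressed/shaken = LOW), the soft pot is an analog input.
const int buttonPins[5] = {2, 3, 4, 5, 6};
const int shakePin = 7;
const int softPotPin = A0;

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 5; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
  pinMode(shakePin, INPUT_PULLUP);
}

void loop() {
  for (int i = 0; i < 5; i++) {
    Serial.print(digitalRead(buttonPins[i]));
    Serial.print(" ");
  }
  Serial.print(digitalRead(shakePin));
  Serial.print(" ");
  Serial.println(analogRead(softPotPin));    // 0-1023
  delay(100);
}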

Testing MIDI Code & MIDI Hardware Setup

After that, I started working on code to connect those readings to MIDI messages. As an easy start, I tested my circuit with code provided by Tom Igoe to see if I was getting the expected serial monitor messages. Then I set up the MIDI hardware with the example code to see if I had connected the equipment properly. This took a few tries.
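The hardware check itself is simple once the wiring is right: open the MIDI serial port and send a note on and off in a loop until the synth responds. Something in the spirit of the lab example (not the exact code I used):

// Send middle C on and off once a second over the MIDI jack.
// Serial1 is the hardware UART wired to the MIDI connector; 31250 is MIDI's baud rate.
void setup() {
  Serial1.begin(31250);
}

void loop() {
  midiCommand(0x90, 60, 100);   // note on, middle C, velocity 100
  delay(500);
  midiCommand(0x80, 60, 0);     // note off
  delay(500);
}

void midiCommand(byte cmd, byte data1, byte data2) {
  Serial1.write(cmd);
  Serial1.write(data1);
  Serial1.write(data2);
}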

Creating My Own Code

Once I had example code and a hardware setup working, I started writing my own code to let the musician use the instrument as I had in mind. My takeaway is that I learned a lot by deciding for myself how to organize and structure my code. But despite getting lots of help, I hit a wall and will need more help to finish it up.

I did accomplish a lot, though. For example, I wanted the built-in notes to be a normal C major scale rather than the consecutive half steps in the GitHub example. I also wanted the musician to be able to control the volume by pressing up or down on the soft pot, which meant passing that value into the MIDI message function and then remembering it with state change detection. That way the musician can press once and let go, rather than having to hold the soft pot while pressing other notes.
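In sketch form, those two decisions look roughly like this. The note numbers are the C major scale starting at middle C; the soft pot pin and thresholds are placeholder values, not necessarily what I ended up using:

// Five buttons mapped to a C major scale (MIDI note numbers), not consecutive half steps.
const byte scale[5] = {60, 62, 64, 65, 67};   // C4, D4, E4, F4, G4 (used by the button handler below)

int lastPotReading = 0;
byte currentVolume = 100;                     // latched volume, 0-127

void setup() {
}

void loop() {
  updateVolume();                             // keep the latched value fresh
}

void updateVolume() {
  int reading = analogRead(A0);               // soft pot (placeholder pin)
  // State change detection: only overwrite the saved volume while the pot is
  // actually being pressed and its value has moved, so letting go keeps the
  // last setting instead of losing it.
  if (reading > 10 && abs(reading - lastPotReading) > 8) {
    currentVolume = map(reading, 0, 1023, 0, 127);
    lastPotReading = reading;
  }
}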

Essentially, I created a template function that runs on every sensor input and takes three values: the fixed pin, the fixed MIDI pitch, and the changing volume value. I wrote separate logic to capture the soft pot’s value, map it to MIDI’s volume range of 0–127, and save the latest reading using state change detection. Those three values are funneled into the master MIDI function, which actually sends the data using Serial1.write().
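A simplified version of that structure, with placeholder wiring and illustrative names (my real function names are in the gist at the bottom), reusing the scale and latched-volume ideas from the sketch above:

// One handler per button: the pin and pitch are fixed per button, the volume is
// whatever the soft pot logic last latched. Wiring and names are illustrative.
const int buttonPins[5] = {2, 3, 4, 5, 6};
const byte scale[5] = {60, 62, 64, 65, 67};   // C major, as above
byte lastButtonState[5] = {HIGH, HIGH, HIGH, HIGH, HIGH};
byte currentVolume = 100;                     // stand-in for the latched soft pot value

void setup() {
  Serial1.begin(31250);
  for (int i = 0; i < 5; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
}

void loop() {
  for (int i = 0; i < 5; i++) {
    handleNoteButton(i, scale[i], currentVolume);
  }
}

void handleNoteButton(int i, byte pitch, byte volume) {
  byte state = digitalRead(buttonPins[i]);
  if (state != lastButtonState[i]) {          // state change detection per button
    if (state == LOW) {
      sendMIDI(0x90, pitch, volume);          // pressed: note on
    } else {
      sendMIDI(0x80, pitch, 0);               // released: note off
    }
    lastButtonState[i] = state;
  }
}

// Master MIDI function: actually pushes the three bytes out the MIDI jack.
void sendMIDI(byte command, byte pitch, byte velocity) {
  Serial1.write(command);
  Serial1.write(pitch);
  Serial1.write(velocity);
}

Keeping the note-on or note-off decision tied to each button’s state change check is the part I still need to finish, as described below.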

At the moment, my note-on and note-off commands are getting mixed up, and I haven’t written enough state change detection to switch cleanly between them.

You can see my code at the bottom.

Fabrication Process

Fabrication was done mostly by hand, since the circular enclosure meant I couldn’t use the laser cutter. First I taped paper onto the container and used a pen to mark the locations of the panel controls on the outside and the breadboards and battery on the inside. Then I drilled the holes. I also found that colored adhesive vinyl sheets convey a lot of personality and can even serve as control panel instructions.

[Image: IMG_1224]

Proof It Worked

I had a mostly working circuit just before enclosing it and trying to add more state change detection. Besides continuing to work on the code, I need to make sure the shake sensor really responds to a drumstick strike. But proof it worked!

Final Code Documentation

Here’s my final code as of now. There are a few quirks I look forward to figuring out, maybe as my last project in the class.

https://gist.github.com/fergfluff/0f5671302bdbb355846833a73814e6fa