Developer Spotlight: How we made Muletallica

This post is the second in a three-part series covering the IoT projects that came out of our first internal hackathon of the year and that we had on display at our first Integration of Things Zone at CONNECT 2015. Missed us there? No worries: not only will you get a sense of the cool installations we built below, you’ll also get a front-row seat to how we built them, and with what technologies. So for now, put on your favorite rock-star jeans and jump in the mosh pit to learn how the team built Muletallica, an interactive visual/musical experience for our conference attendees that connected devices like the Leap Motion and smart light bulbs with electric guitars, and a bell pepper.

Why We Built It

Muletallica came out of the internal IoT hackathon we held at MuleSoft back in April. It was a team project, built by Federico Amdam, Jesica Fera, Pablo Carballo and myself, all of us based out of MuleSoft’s Buenos Aires office.

Muletallica was born out of something I had been tinkering with for quite some time: the idea of using technology to create interactive musical installations that could get people (potentially with no musical knowledge or skills whatsoever) to experience the joy of making music in a fun and creative way, with a minimal learning curve.

What we set out to accomplish during the hackathon was to take some of my earlier musical experiments and use Mule to integrate them with intelligent lights, making the experience much more immersive and engaging. Through the added visual feedback the lights provided, the responses to people’s actions became far easier to associate.

Fast forward a month and I was in San Francisco at CONNECT, representing our team. Our marketing team had meanwhile helped design an awesome-looking installation to show Muletallica off, which really made the project stand out. And there I was, essentially living the dream, as (at least for a couple of days) my job description involved playing the guitar and telling people about cool tech toys. Anyone who curiously walked up to the stand was invited to join the band and jam with us, and people always wanted to know how it was all put together, so I was happy to walk them through the interaction design and the underlying architecture.

We used MiLight bulbs for this, which expose a Python API that isn’t very easy to use. The better-known Philips Hue lights expose a nice API that can easily receive HTTP requests… that would have been too easy, though. We wanted a challenge, an opportunity to show off Mule’s power to take an ugly legacy interface and make it usable, so that’s why we went with MiLight instead.

Muletallica, Play by Play

In the above video you can see me play several of the different instruments. Each instrument is linked to a different intelligent light and sets its hue and intensity through Mule messages:

  • The Air Piano: At first, I play a Leap Motion sensor as an air piano, played by simply stroking imaginary keys. The beauty of this is that whatever you play will always be in the right key and adjusted to land right on the beat. Literally anything you play will sound musically good, or at least not painfully off. At the same time, a light flickers once for every note played, with its hue mapped to the note’s pitch (a quick sketch of this kind of pitch-to-hue mapping appears right after this list).
  • The Guitar Duel: When I play the guitar, the sequence of notes is stored, so that playing the air piano afterwards runs you through that same sequence. This made for a pretty interesting request-response kind of musical conversation between two instruments, where the notes were the same but the free interpretation of their timing allowed for some exciting musical expression. It was also a fun way to interact with members of the audience who were brave enough to accept the challenge of playing back whatever I played. One of the lights was mapped to the guitar and flickered with every note I played, with its hue again mapped to the note’s pitch.
  • Adding Beats: When one of a series of printed cards is shown to the webcam on the laptop, the drum pattern changes. Here the computer uses reacTIVision, computer-vision software originally built for the Reactable, to recognize the cards. Using it in our setup was a little tribute to the creator of the Reactable, Sergi Jordà, a professor of mine who first inspired me to pursue this ideal of making music creation accessible to everyone. One of the lights flickers in time with the drum, mapping the intensity of each beat to luminosity, and it changes color whenever the pattern changes. Each change in the drum pattern also triggers a short three-second video.
  • The Wiiii and the WubWub: After changing the drum beat to the most electronic pattern, playing the same Leap Motion sensor as before invokes a dubstep-ish, theremin-like instrument that responds to the height and angle at which you hold your hand above the sensor. It can actually tell which hand you’re holding up and plays a different instrument depending on which it sees. I called one of these instruments “Wiiii” and the other one “WubWub” …I suppose you can easily tell which is which from the video. Every change in these instruments also showed up as a change in the hue of the corresponding light.
  • The Bell Pepper: Our addition of a music-making vegetable piqued a lot of visitors’ curiosity. It was an actual vegetable wired to a Makey Makey, and it responded with a chord change every time someone touched it (cycling through a sequence of predefined chords). Yes, touching the bell pepper involved an imperceptibly low electric current passing through your body. Some people seemed a little uneasy about this idea, so I would assure them that the bell pepper we were using was 100% organic, fair-trade, fresh produce with no additives whatsoever, and then show them the sticker certifying that it was in fact organic. One of the lights changed color whenever the bell pepper was touched.
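
To make the light mappings above a bit more concrete, here is a minimal sketch (not our actual code) of the kind of note-to-light translation described: a note’s pitch class picks the hue, and how hard the note is played picks the brightness.

    # Illustrative sketch of the note-to-light mapping described above (not our actual code).
    # A note's pitch class picks the hue; its velocity picks the brightness of the flicker.

    def note_to_hue(note):
        # Map a MIDI note number (0-127) to a 0-255 hue, cycling through the color
        # wheel once per octave so the same pitch class always gets the same color.
        return int((note % 12) * 255 / 12)

    def velocity_to_brightness(velocity):
        # Map MIDI velocity (0-127) to a 0-100 brightness percentage.
        return int(velocity * 100 / 127)

    # Example: middle C (MIDI note 60) played fairly hard
    print(note_to_hue(60), velocity_to_brightness(100))   # -> 0 78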

The music that could be made with Muletallica was far from anything resembling the sound of Metallica… it could be described as mellow, Pink Floyd-ish, trance-inducing prog rock, or sometimes as full-on twisted, synthetic-sounding dubstep, but certainly never as heavy metal or anything even faintly close to it. We came up with the name as a random pun that we never expected to be taken seriously as a proper name, but people seemed to like it quite a bit, so we went with it… it’s like they say: if you build it, they will come.

Muletallica’s Backstage

All of our Integration of Things Zone projects featured MuleSoft products in their internal structure in some way or another. In the case of Muletallica, I must admit that Mule was not the backbone of the project, but it was still an essential bone in its structure.

The backbone was Usine, a not-so-well-known piece of French software that is amazingly versatile and ideal for live performances like this one. It shares a certain philosophy with Mule: with it, you also build flows by dragging and dropping atomic components, including all kinds of connectors and transformers. Just like in Anypoint Studio, everything is exposed through a graphical interface, while you can also drop into the code and write it by hand.


Most of the external components involved were connected through MIDI, a widely accepted standard for musical interfaces. Thanks to the prevalence of that standard, connectivity was not a challenge when hooking Usine up to reacTIVision or to Mogees. The lights we used, however, didn’t support MIDI (or any other universal standard, for that matter), so that’s where we had to truly put on our integration-developer hats and solve the puzzle.

We built a RAML definition that exposed a series of methods for controlling our lights, and with that in place it was really easy to create an APIkit project and have it automatically flesh out all of the scaffolding we needed for a neat RESTful wrapper around the ugly MiLight API. We then injected a few lines of Python code into Mule: these executed the commands of the MiLight API, as well as those of a Python MIDI library that let us receive the MIDI messages Usine sent and turn them into Mule messages.
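
As a rough illustration of the MIDI end of that bridge (not our actual code), the sketch below uses the mido library to read incoming MIDI, which is just one of several Python MIDI libraries and not necessarily the one we used, and a hypothetical milight_set_color() helper standing in for the MiLight commands issued from inside Mule:

    # Rough sketch of the MIDI-to-lights bridge, for illustration only.
    # Assumptions: the `mido` library for MIDI input (the post doesn't say which
    # Python MIDI library we used), and a hypothetical milight_set_color() helper
    # standing in for the MiLight API commands executed from inside Mule.
    import mido

    def milight_set_color(light_id, hue, brightness):
        # Hypothetical stand-in: in the real project, this is where the few lines of
        # Python injected into Mule executed the MiLight API commands.
        print("light %d -> hue=%d, brightness=%d" % (light_id, hue, brightness))

    with mido.open_input() as midi_in:        # default MIDI input port, fed by Usine
        for msg in midi_in:                   # blocks, yielding messages as they arrive
            if msg.type == "note_on" and msg.velocity > 0:
                milight_set_color(light_id=1,
                                  hue=int((msg.note % 12) * 255 / 12),
                                  brightness=int(msg.velocity * 100 / 127))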

The RAML definition we wrote for wrapping the MiLight API in a REST API:
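
A rough sketch of the general shape such a definition takes (hypothetical resource and parameter names, not our actual file):

    #%RAML 0.8
    title: MiLight Wrapper API
    version: v1
    baseUri: http://localhost:8081/api
    /lights:
      /{lightId}:
        /color:
          put:
            description: Set the hue and brightness of a single light
            queryParameters:
              hue:
                type: integer
                minimum: 0
                maximum: 255
                required: true
              brightness:
                type: integer
                minimum: 0
                maximum: 100
            responses:
              204:
                description: Light updated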

The XML of our Mule flows, much of which was automatically generated by APIkit from the RAML file above:
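
Again, only a bare-bones sketch of the kind of Mule 3 configuration APIkit generates (hypothetical names; the few lines of Python that drove the lights would sit inside the scripting component):

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Bare-bones sketch (not our actual flows); xsi:schemaLocation entries omitted for brevity -->
    <mule xmlns="http://www.mulesoft.org/schema/mule/core"
          xmlns:http="http://www.mulesoft.org/schema/mule/http"
          xmlns:apikit="http://www.mulesoft.org/schema/mule/apikit"
          xmlns:scripting="http://www.mulesoft.org/schema/mule/scripting">

        <http:listener-config name="api-httpListenerConfig" host="0.0.0.0" port="8081"/>
        <apikit:config name="api-config" raml="milight.raml" consoleEnabled="false"/>

        <flow name="api-main">
            <http:listener config-ref="api-httpListenerConfig" path="/api/*"/>
            <apikit:router config-ref="api-config"/>
        </flow>

        <!-- APIkit generates one flow per RAML resource/method pair -->
        <flow name="put:/lights/{lightId}/color:api-config">
            <scripting:component>
                <scripting:script engine="jython"><![CDATA[
    # Hypothetical placeholder: the injected Python would issue the MiLight
    # commands here, using the hue and brightness query parameters.
    result = payload
    ]]></scripting:script>
            </scripting:component>
        </flow>
    </mule>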

This project allowed us to show off Mule’s speed and stability when dealing with a massive stream of simultaneous requests. In music, timing is the single most important thing, as the slightest delay renders an interface unusable for musical interaction; that’s why music is the ultimate challenge for testing the real-time readiness of a system. We did have a few problems with delays at first, but we soon realized that the bottleneck was actually our Wi-Fi signal, not Mule. With that fixed, we got to the point where delays were virtually imperceptible. The music software we were running is pretty heavy on the machine’s resources, and we were running Mule on that same laptop… even then, we didn’t experience any significant delays.

Looking forward, it would be amazing if someone took the time to build a MIDI connector for Mule; with that in place, this entire project could have been built around Mule, controlling even the triggering of the musical notes and everything else… I really look forward to doing that some day!

