We have an amazing opportunity to migrate Social Sound Design to the Stack Exchange Network. This means brand new and up-to-date software with lots of great features (including chat and a meta site!). Best of all, we’ll be part of a massive ecosystem of knowledge. Hurray!
Ever since starting my journey into the world of sound design, I’ve dreamed of having a space dedicated to all the wonderful disciplines of sound design where we can learn from and share our knowledge with a community of like-minded folk. I feel incredibly lucky that SSD has become what it is. Thanks to all of you for making it so awesome!

A bit of background and context

Four years ago, I discovered a Q&A platform called Stack Exchange (SE), which embodied exactly the ethos I was looking for, and I used it to build SSD. Since then, the Stack Exchange platform has grown into the amazing and almighty Stack Exchange Network, home to over 100 high-quality Q&A sites covering most of the topics you could dream of. But to be part of the switch from SE 1.0 (subscription-based model) to SE 2.0 (Network), we would have had to start again from the beginning, and I decided we were better off continuing to build up the community independently of their “Area 51” process. Our community has been so strong and the quality of the questions so high that they let us do our thing for far longer than anticipated and kept the server running for us (thank you, SE!). But now, it’s time to join the SE Network!

We’re joining forces with another awesome SE site!

To make this happen, we will be joining forces with an SE site called Audio-Video Production (AVP). They also have a really great community with quality Q&As. Not only will we be combining SSD’s 3,600+ questions with AVP’s 2,500+ questions, but we’ll also be combining both communities! This is going to bring the site to a whole new level. Both sites reach the same audience, so our combined strength will be a great asset to the subject!
You may be asking why “video” is part of this. It’s not. The audio and video parts of AVP will separate from one another, and the video community will take over the video side, where they will focus on creating a site dedicated to video professionals and experts. There are a few isolated audio-video crossover questions, but it shouldn’t be hard to sort them out during the move. So if you ever have video-related questions, be sure to join them as well 🙂

So how is this going to work?

Step 1: Don’t panic — We’re going to start by moving SSD over to the 2.0 Network software as Sound Design. We will be new neighbors to AVP for a while as we work through some of the technical issues and get acclimated to our new home. This will provide a gentle introduction; be sure to bake a pie and greet our new neighbors! A bit after that, the audio part of AVP will be merged with us.
After a short cleanup, we’ll start hashing out the migration issues (content, tags, cleanup, etc.). We’re already well attuned to the Stack Exchange way of doing things, so there shouldn’t be any radical changes in scope or in how things are done. To separate the audio and video content on the AVP site, we’ll use the audio-specific tags to automate a lot of the heavy moving; after that, the regular migration tools will let the community carefully and methodically manage the remaining content however we see fit. And all along, we’ll be working behind the scenes to make sure nothing gets lost and we don’t break anything on the Internet.
There was another SE 1.0 site, Math Overflow, which was in a similar situation to ours and was migrated very successfully. The tech team over at SE is already familiar with the technical issues that may arise from this process, so the move should hopefully be as seamless as possible.
Stay tuned; we’ll keep the updates coming. If you want to get involved in the discussion, head over to the SSD meta question.

It’s time for a little update. I now live in the Netherlands and I co-founded a startup where we’re making Flying, an iPhone app for frequent flyers. Flying is useful, insightful and social. It all started over a year ago as JETSET, a concept for tracking flights made while studying at CIID. If you’re interested, you can check out our story and our awesome team here.

We launched in public beta a few weeks back, and you can download it on the App Store. My username is andrew – one of the perks of creating your own platform 🙂 Just search for me if you want to connect!


In the app, we love blurring the line between the digital and the physical. Flying used to be a very tactile and physical experience, where every touchpoint had a certain quality and ‘weight’ to it. So, we integrate tangible elements wherever possible, like making real passport-like stamps that are scanned into the app rather than drawing digital representations. Check out the video below to see what I mean.


In the last post I shared a project I made called loci – 3D printed sculptures from your flights. This project was made to work together with Flying and we’re in the slow process of making this happen. Stay tuned!

Each loci is a unique 3D-printed sculpture, bespoke to someone’s flights, spurring recollection, reflection and conversation about their travels.

It is a working prototype made for my final project while studying at CIID. I am not releasing the software yet, as we are aiming to include loci directly in an iPhone app I co-created called Flying. A big part of the app is the ability to track one’s flights, so it would be fantastic to be able to create a loci directly from within Flying!

Through custom software built in Max/MSP, the user can select specific flights that matter to them, such as a honeymoon, a summer vacation through Europe or all of their flights from a given year. To ease this process, the user can connect to TripIt or Foursquare and import their past flight data automatically.

Each loci comes with a card highlighting all the airports flown to on a map. The sculpture can be placed on this card to help the owner visualize their travels. Additional information is displayed on the card, such as the title chosen by the user, the total distance travelled, the number of airports visited, and the number of flights taken.

The software generates a file suitable for 3D printing, so that the user can either print it themselves or use a service like Shapeways that prints for them.
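For a sense of what “a file suitable for 3D printing” looks like, here is a minimal ASCII STL writer. This is a hypothetical Python sketch for illustration only — the actual loci software was built in Max/MSP, and the function name is made up:

```python
def write_ascii_stl(triangles, name="loci"):
    """Serialize triangles (each a sequence of three (x, y, z) vertices)
    into the ASCII STL format accepted by 3D printing services."""
    lines = [f"solid {name}"]
    for v1, v2, v3 in triangles:
        # A zero normal is legal; printers/slicers recompute it from the vertices.
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (v1, v2, v3):
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# One triangle is enough to show the structure of the file.
stl_text = write_ascii_stl([[(0, 0, 0), (1, 0, 0), (0, 1, 0)]])
```

A real sculpture would of course contain thousands of facets generated from the flight paths, but the file structure is exactly this.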

Thanks to Brian Rink for being my advisor for the project and Sterling Crispin for helping me figure out some programming issues I had. And of course everyone that helped me along the way!

Rethinking home audio and understanding how and where people share music was the jumping-off point for creating Skube. We realized that as we move toward a digital, online music-listening experience, current portable music players are not adapted to this environment. And sharing music in communal spaces is neither convenient nor easy, especially when we all have such different tastes in music.

The result of our exploration is Skube, a music player that allows you to discover and share music and facilitates the decision process of picking tracks when in a communal setting.

There are two modes, Playlist and Discovery. Playlist plays the tracks on your Skube, while Discovery looks for tracks similar to the ones on your Skube so you can discover new music that still fits your taste. When Skubes are connected together, they act as one player that shuffles between all the playlists. You can control the system as a whole using any Skube.
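The “one player” behavior of connected Skubes can be sketched very simply: pool every connected Skube’s playlist and shuffle the result. This Python snippet is only an illustration of that idea (the real system was built in Max/MSP, and the names here are hypothetical):

```python
import random

def merged_queue(skube_playlists, seed=None):
    """Combine the playlists of all connected Skubes into one
    shuffled queue, as a single virtual player would play them."""
    rng = random.Random(seed)
    queue = [track for playlist in skube_playlists for track in playlist]
    rng.shuffle(queue)
    return queue

# Two Skubes docked together: every track ends up in one shared queue.
queue = merged_queue([["track a", "track b"], ["track c"]], seed=0)
```

Because any Skube can control the whole system, the queue lives in one place (in our case, on the computer) rather than on any single Skube.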


The interface is designed to be intuitive and tangible. Flipping the Skube changes the mode, tapping plays or skips songs, and flipping a Skube onto its front face turns it off. The shape informs the user of the ways the music players can be connected together. This allows different Skubes to be in either Discovery mode or Playlist mode when connected to other players. When multiple Skubes are connected, they act as one music player and contribute to a global playlist that is played on all of them.


How it works

It is a fully working prototype built with Arduino, Max/MSP and an XBee wireless network. We access the Last.fm API to populate the Skube with tracks and to scrobble, and we use their algorithms to find similar music when in Discovery mode. Then, using AppleScript, we get Spotify to play the music. The XBees handle the wireless communication between the Skubes and the computer, where custom software manages it all.
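To give a flavor of the Last.fm side, here is a minimal Python sketch that builds a request for Last.fm’s `track.getSimilar` API method — the method Discovery mode relies on. The helper name is hypothetical and the real Skube software did this from Max/MSP, not Python:

```python
from urllib.parse import urlencode

API_ROOT = "https://ws.audioscrobbler.com/2.0/"

def similar_tracks_url(artist, track, api_key, limit=10):
    """Build the request URL for Last.fm's track.getSimilar method,
    which returns tracks similar to the one given."""
    params = {
        "method": "track.getSimilar",
        "artist": artist,
        "track": track,
        "api_key": api_key,
        "limit": limit,
        "format": "json",
    }
    return API_ROOT + "?" + urlencode(params)

# Fetching this URL (with a real API key) returns similar tracks as JSON.
url = similar_tracks_url("Daft Punk", "One More Time", "demo_key")
```

The response gives artist/title pairs, which we then hand to Spotify for playback.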

You can see the inner workings of the Skube in this little video:


This project was made by Andrew Nip, Ruben van der Vleuten, Malthe Borch, and Andrew Spitz (me). It was part of the Tangible User Interface module at CIID, run by Vinay Venkatraman, David Cuartielles, Richard Shed, and Tomek Ness.


Super Angry Birds is a force feedback USB controller for Angry Birds that simulates the feeling of a slingshot. All the controls found in the game are available in this device. You can control the pull, the angle, and of course trigger the special power of the bird.

We hacked a motorized fader found in audio mixing consoles to create the force feedback. If you are interested, you can read the paper. Basically, we achieved this by drawing a force curve and storing its values in a table; we then look up the slider’s current position in the table and send the resulting value to the motor, which applies an opposing force. You can see this in action in the “How it Works” part of the video.
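The table-lookup idea is simple enough to sketch. This Python illustration is not our actual firmware (that ran on an Arduino-based board), and the linear “spring” curve at the end is just an assumption — the whole point of drawing the curve by hand is that it can be anything:

```python
def build_force_table(curve, size=256):
    """Sample a hand-drawn force curve into a lookup table.
    `curve` maps a normalized position (0.0-1.0) to a motor force."""
    return [curve(i / (size - 1)) for i in range(size)]

def opposing_force(table, position):
    """Look up the opposing force for the slider's current position
    (0.0 = at rest, 1.0 = fully pulled back)."""
    index = min(int(position * (len(table) - 1)), len(table) - 1)
    return table[index]

# A slingshot resists harder the further you stretch it, roughly like
# a spring (Hooke's law), so a linear curve is a plausible starting point.
table = build_force_table(lambda x: 255 * x)
```

In the real device this lookup runs continuously: every time the fader position is read, the table value is sent straight back to the motor.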

We programmed in Max/MSP and Arduino. For controlling the hardware, we used an Arduino-based microcontroller called Music & Motors (check the photo below on the right) developed by CIID.

This project was made by Hideaki Matsui and me (Andrew Spitz) in a class on Haptics at CIID run by Bill Verplank and David Gauthier.



WTPh? (What the Phonics) is an interactive installation made by Momo Miyazaki and me (Andrew Spitz). Street names in Denmark are close to impossible for foreigners to pronounce, so we did a little intervention in the touristy areas of Copenhagen.

We recorded a Danish person speaking the street names, then split up each syllable. In true karaoke style, we placed lights above the matching syllables so that you can see in real time which part of the word is being spoken. When participants lift the speaker off the wall, the recording starts playing. We used Max/MSP and Arduino to build the installation.

Here’s a video about the process and showing WTPh in action. Enjoy!

This project was made during the Systems and Layers module at CIID and was run by Adam Greenfield and Engin Ayaz.

The music we used is by the amazing artist Jacob Montague. The track is Lambent, from the album Fly On.

#theVELUXpose is an interactive installation for VELUX set in the streets of Copenhagen. VELUX specializes in roof windows and our brief was to inspire young urban citizens to see the roof as an opportunity space and aspire to one day own a roof window. Our goal was to connect the digital generation and the VELUX brand.

As a symbol, we used the posture people adopt when looking through these roof windows: looking up at an angle of 125º. We brought this experience down from the roof to the streets and used Instagram as a way to get people excited and involved. It was amazing to see everyone so keen to take part. Thanks to all who did!

We put an iPhone in a box with a frame that has VELUX written on it, so that it would appear as the border in the image. We hacked together a really long hands-free kit cable to be able to trigger the camera from afar. The image gets synced to an iPad so participants can upload it to Instagram with their own filter.

The video explains it all and shows it in action:

This interactive campaign was made by Kat Zorina and me (Andrew Spitz) for VELUX as part of an industry project at CIID run by Jamie Allen.

You can check out the pictures we got during the few hours we ran the installation under the Instagram tag #theVELUXpose.

Here are some of our favorites:




Inspired by the 1960s “golden age” of Pan Am-esque air travel, a glamorous period for elite and well-dressed jet setters, we hoped to bring a little of that indulgent sophistication back to the travel experience through our iPhone app concept. JETSET tracks your flight history and visualizes interesting data about your flights.

As you travel, details about your flight are logged and you can revisit your travel history. JETSET also creates fun visualizations generated by your flight data. For example, you can see how far you have traveled in comparison to a trip to the moon or how long the equivalent journey would have taken you on foot.
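These comparisons are simple arithmetic over the logged distance. As a sketch (the constants are the well-known average Earth–Moon distance and a typical walking pace; the function names are hypothetical, not from the app):

```python
MOON_DISTANCE_KM = 384_400   # average Earth-Moon distance
WALKING_SPEED_KMH = 5.0      # a typical walking pace

def trips_to_the_moon(total_km):
    """How many one-way trips to the moon the logged flights add up to."""
    return total_km / MOON_DISTANCE_KM

def walking_days(total_km):
    """How many days of non-stop walking would cover the same distance."""
    return total_km / WALKING_SPEED_KMH / 24
```

So a 1,200 km flight, a short hop by air, is ten full days of non-stop walking — exactly the kind of contrast the visualizations play on.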

You earn medals based on how much you have flown, where you have traveled to, and what type of planes you have taken. And of course, you can brag on social networks about your achievements and information about your flights.

JETSET was made by Andrew Spitz (me), Markus Schmeiduch and Katie Kindinger. It was created for the Graphical User Interface class at CIID.

We’re looking at ways to turn this concept into reality, so don’t hesitate to get in touch if you would like to have a chat.

Here are some of the screens:




What Line is it Anyway? explores the way people interpret an adjective through the simple task of drawing a line. We crowdsourced 1,300 people to draw us a line using Amazon’s Mechanical Turk, an amazing platform for getting people (called workers) to do a simple task without knowing the bigger picture or context.

We asked each worker to draw us a line from ‘start’ to ‘finish’; the only other information we gave them was that they had to draw it according to an adjective. For example, we asked them: “Draw a spiky line”. Each worker got paid two US cents.

We used Processing to program the online drawing tool and to stitch everything together for the video. Max/MSP/Jitter was used to create some pre-visualisations, run tests and do list processing.
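Measuring a drawn line comes down to summing the distances between consecutive sampled points. Here is a minimal sketch of that calculation in Python (the real tool was written in Processing, and the pixels-to-meters scale is an assumed parameter, not the one we used):

```python
from math import hypot

def line_length(points, px_per_meter=1000):
    """Total length of a drawn line, given its sampled (x, y) points
    in pixels. px_per_meter is an assumed screen-to-world scale."""
    pixels = sum(hypot(x2 - x1, y2 - y1)
                 for (x1, y1), (x2, y2) in zip(points, points[1:]))
    return pixels / px_per_meter

# A single 3-4-5 segment: 5 pixels, i.e. 0.005 m at this scale.
length = line_length([(0, 0), (3, 4)])
```

Summing these per-worker lengths over ~150 drawings per adjective gives the collective lengths listed below.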

This project is made by Andrew Spitz (me) and Umesh Janardhanan and was part of the Data Visualization module at CIID, run by Golan Levin. Thanks to Marcin Ignac for his help.

Interesting facts:

Average Time Spent on a Drawing:

Creative line: 55 seconds
Smooth line: 70 seconds
Spiky line: 42 seconds
Straight line: 38 seconds

Collective length of lines (about 150 participants per adjective):

Spirally line: 39 meters
Straight line: 21 meters
Wild line: 44 meters
Smooth line: 20 meters

The picture below is from when we were figuring out the logic and geometry and, most importantly, trying to communicate with each other.


Haven creates an analog soundscape by harvesting the wind. The goal was to create a personal space for relaxation while protecting the user from the wind.

Haven consists of arm sleeves that collect airflow through pipes of varying lengths and materials, and a headpiece that acts as an analog speaker. The wind is funneled through tubes that connect the arm sleeves to the headpiece. The sounds are naturally amplified by the pipes and exit through the tunnels in the headpiece and into the ears.


This project was part of the Performative Design module at CIID. Thanks to David Gauthier and Di Mainstone for their assistance. Project by Andrew Spitz, Kat Zorina and Kenneth Aleksander Robertsen.