description

480-notes-final-cropped.mov

I titled this piece cloudy with a chance of music because the falling notes resemble raindrops that sound out notes as they hit your head! I had a lot of fun with my last project, weather whimsy, and wanted to continue the theme of creating digital experiences that allow people to use their bodies to interact with what’s on screen.

Each of the dots makes a sound upon contact with a person, with the pitch dependent on the height of contact. I think it’s interesting that by listening to the combination of notes produced, one can deduce some information about the scene that the camera sees.

process

brainstorming: many initial ideas! i was most drawn toward creating a virtual orchestra that could be conducted and started exploring that avenue, but decided that since i’d already used hand detection i wanted to try a different type of model

inspiration: i drew inspiration from this particular genre of piano tutorials involving falling bars that illustrate the music. in these videos, the horizontal axis is pitch and the vertical axis is time, whereas in my final product the vertical axis represents both time and pitch!

[prototype #1: i started by loading up a model for body detection (Bodypix from ml5.js) and getting it running in p5.js, which i was able to find a helpful example for! i also learned how to use p5.PolySynth to create synth sounds with specified pitches and lengths.

next, i figured out how to interpret the mask data returned so that when a dot fell and intersected a detected body, a note would be played.](https://prod-files-secure.s3.us-west-2.amazonaws.com/f41bca80-c173-480c-ac9f-f43bf8bc1da6/3f8427b3-0f77-4c84-b3af-6c1c77f0cfa2/480-notes-v1.mov)
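the dot/body collision test at the heart of prototype #1 could be sketched like this. the exact shape of ml5’s Bodypix result varies by version, so `mask`, `w`, and `h` here are stand-ins for a per-pixel mask (1 = person, 0 = background) flattened to the canvas resolution:

```javascript
// Check whether a falling dot at (x, y) lands on a detected body,
// given a flat per-pixel mask (row-major, 1 = person, 0 = background).
function hitsBody(mask, w, h, x, y) {
  const col = Math.floor(x);
  const row = Math.floor(y);
  if (col < 0 || col >= w || row < 0 || row >= h) return false; // off-canvas
  return mask[row * w + col] === 1;
}

// In draw(), each falling dot would advance and then be tested, e.g.:
// if (hitsBody(mask, width, height, dot.x, dot.y)) {
//   synth.play(get_pitch(dot.y), 0.5, 0, 0.3); // note, velocity, delay, sustain
// }
```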

prototype #2: here i added multiple dots and gave them different pitches based on the height at which they intersected the body mask. i set the notes to start at random heights above the screen so they would fall evenly! after a lot of time listening to these sounds, i decided that the scattered notes weren’t quite as pleasing as i’d hoped and made some changes for prototype #3.

const PITCH_LOW = 200;
const PITCH_HIGH = 900;

// map y (0 at the top of the canvas) to a frequency:
// y = height -> PITCH_LOW, y = 0 -> PITCH_HIGH
function get_pitch(y) {
	return PITCH_LOW + (PITCH_HIGH - PITCH_LOW) * (height - y) / height;
}
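the “random heights above the screen” spawning from prototype #2 might look like the sketch below. `NUM_DOTS` and `MAX_HEADSTART` are illustrative values, not from the original sketch:

```javascript
// Each dot starts at a random negative y (above the visible canvas),
// so the dots arrive spread out over time instead of all at once.
const NUM_DOTS = 20;
const MAX_HEADSTART = 400; // pixels of "runway" above the canvas

function makeDot(canvasWidth) {
  return {
    x: Math.random() * canvasWidth,     // random horizontal position
    y: -Math.random() * MAX_HEADSTART,  // somewhere above the visible area
  };
}

const dots = Array.from({ length: NUM_DOTS }, () => makeDot(640));
```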

[prototype #3: instead of completely letting the height of the notes determine their pitch, i decided to choose a chord to select notes from. the pitch is now determined by ranges of y-values that correspond to notes belonging to the same chord, and it sounds more harmonious!

in this version, the animation was lagging quite a bit because i had a large canvas size — i later reduced this for the final version and critique.](https://prod-files-secure.s3.us-west-2.amazonaws.com/f41bca80-c173-480c-ac9f-f43bf8bc1da6/07a8e00b-ffb6-4d83-9ed6-7b3fbce32aad/480-notes-v3.mov)

i started out with 9 notes from the C Major chord. eventually i decided that i wanted to spice it up a little bit, and switched to using [what i now know is called] a C9sus4 chord. i also tried G9sus4, but the notes went too high for my liking.

i started out with 9 notes from the C Major chord. eventually i decided that i wanted to spice it up a little bit, and switched to using [what i now know is called] a C9sus4 chord. i also tried G9sus4, but the notes went too high for my liking.
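the chord quantization could be sketched like this: the canvas is split into horizontal bands, each assigned one note of the chord. the frequencies below approximate a 9-note C9sus4 voicing (C4, D4, F4, G4, Bb4, C5, D5, F5, G5); the exact notes in the final piece may differ:

```javascript
// Approximate frequencies (Hz) for a C9sus4 voicing across two octaves.
const CHORD = [261.63, 293.66, 349.23, 392.0, 466.16, 523.25, 587.33, 698.46, 783.99];

// Instead of a continuous frequency, snap y to the chord note for its band.
// Higher on the canvas (smaller y) -> higher note in the chord.
function getChordPitch(y, height) {
  const band = Math.floor(((height - y) / height) * CHORD.length);
  const idx = Math.min(Math.max(band, 0), CHORD.length - 1); // clamp edges
  return CHORD[idx];
}
```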

conclusion

I enjoyed working on another project integrating p5.js with a webcam! I liked how it brought people to move and wiggle around as they tried to catch the dots on the screen, and the silliness and joy that it created. I was also happy that I could use some of my music theory knowledge in creating this piece.

After finishing up, I found myself comparing this to what I built in weather whimsy and I found that I liked that one more (which I have realized is ok! not every project can be better than the last). I realized that if I could do this project again / build upon it, I would integrate more ways for people to control the sounds being played. I think if there were a more direct mapping from action → effect on the screen, there would be more of a sense of novelty with each interaction.

I’m again and again amazed by how fast these ML models I’m finding can run in the browser :0 Granted, Bodypix did slow down when the video size filled the screen, but it’s still impressive and I’m excited to explore other ways to use these models to create art!

links

OpenProcessing link

Google Drive link to video

sources

reference | p5.js

documentation for the p5.PolySynth library

ml5 - A friendly machine learning library for the web.

documentation for the Bodypix model of the ml5js library