by michelle pan

description

480.mov

weather whimsy is an interactive scene where people can move the weather elements around them with their hands. A user can reach out their hand to grab the elements and drag them around the window. The sun and clouds, when left untouched, will slowly float across the screen.

In creating this piece, I wanted to explore how we can use the physical world to control the digital world in unexpected ways. We are used to using trackpads, keyboards, and touchscreens to control our devices, but we are capable of so much more with our hands. This experience lets people take the grabbing motion associated with moving physical objects and apply it to a digital world!

process

brainstorming: i wanted to think of ways that i could have a piece in p5.js interact with the external world through the camera! i tried exploring the gaze tracking idea but found that it’s hard to accurately implement with just a laptop webcam


brainstorming: i turned to tracking hands instead of eyes because they are a lot easier to detect! i took the time to brainstorm features that i wanted to add to my piece, though i didn’t have time to add all of these


experimentation: i found a p5.js demo of the Handsfree.js library and played around with it; i was impressed by how good the hand detection was! this demo allowed users to pinch their fingers together to draw in different colors on the screen


the Handsfree.js documentation has a page that helped me generate gestures! i captured frames of the gesture (grabbing) that i wanted, and it gave me a dictionary of attributes that i could put in my code to classify whether the hand was in a closed fist

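the generated description essentially boils down to checking how curled each finger is. a minimal sketch of that idea (the names here are hypothetical; the real description format comes from Handsfree.js's gesture generator):

```javascript
// hypothetical sketch of fist detection from per-finger curl values —
// Handsfree.js's gesture generator emits its own description format,
// this just illustrates classifying based on curl alone.
// curls: five values in [0, 1], one per finger (0 = straight, 1 = fully curled)
function isFist(curls, threshold = 0.7) {
  // a closed fist means the four non-thumb fingers are all curled
  return curls.slice(1).every((c) => c >= threshold);
}
```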

prototype #1: i started off by creating a sketch that displays the video feed, tracks the hand, and detects grabbing. i found that the grab detection was a little finicky and wouldn’t always work if i turned my hand a bit, but it was good enough to get started!


i wanted to mirror the video feed from the camera and center it in the middle of the screen. above are calculations i did to figure out where the top left of the video feed should be

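the centering calculation itself is small: offset the feed by half the difference between the canvas and video dimensions. a sketch, assuming the feed is drawn at its native size (function name mine):

```javascript
// top-left corner for drawing a video feed centered in the canvas;
// to mirror it in p5.js you would additionally translate(width, 0)
// and scale(-1, 1) before drawing the image
function videoTopLeft(canvasW, canvasH, videoW, videoH) {
  return {
    x: (canvasW - videoW) / 2,
    y: (canvasH - videoH) / 2,
  };
}
```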

prototype #2: after detecting grabs, i tried to move an object by grabbing it. i did this by checking for a grab and checking whether the center of the hand intersects the shape to be grabbed; if it does, the shape moves in the same direction as the hand

after this prototype, i discovered that i could remove the direction parameters from the gesture description so that detection focused only on whether the fingers were curled, not which direction they pointed, which made the grabbing more reliable
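the grab-and-drag step can be sketched roughly like this (field names such as `isGrabbing` are illustrative, not Handsfree.js's actual API):

```javascript
// hypothetical grab-and-drag step: if the hand is grabbing and its center
// lies inside the shape, move the shape by the hand's frame-to-frame delta
function updateShape(shape, hand, prevHand) {
  const dx = hand.x - shape.x;
  const dy = hand.y - shape.y;
  // circle hit test: is the hand center within the shape's radius?
  const inside = dx * dx + dy * dy <= shape.radius * shape.radius;
  if (hand.isGrabbing && inside) {
    shape.x += hand.x - prevHand.x;
    shape.y += hand.y - prevHand.y;
  }
  return shape;
}
```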

prototype #3: i added multiple objects and made them float around when not being grabbed. in this video you can see some bugs where objects are being grabbed even when the hand isn’t overlapping with them

after debugging, i found that i was using the wrong value for the radius of the objects, and was able to fix this for the final piece

in my logic for having the objects float across the screen, i added “padding” on the sides of the window so that the objects would smoothly disappear and reappear on the other side, rather than jumping to the other side once their centers reached the edge of the window

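the wrap-around logic can be sketched like this for the x axis (names mine; the y axis works the same way):

```javascript
// wrap an object's x position with padding so it fully drifts off one
// edge before reappearing at the other, instead of jumping at the edge
function wrapX(x, windowWidth, padding) {
  if (x > windowWidth + padding) return -padding;
  if (x < -padding) return windowWidth + padding;
  return x;
}
```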

conclusion

I had a lot of fun creating this piece! I’m particularly happy with how I was able to execute my vision of making a fun interaction that people can play with. It fills my heart to see the wonder on people’s faces when they first grab a cloud with their hands, and even after testing this multiple times I still have that same feeling.

One idea that came up in my original brainstorming but I didn’t have time to implement was an overall changing “mood” for the scene based on the positions of the sun and clouds. For example, if the sun is off the screen or covered by a cloud the video becomes “gloomy” through reducing the saturation. Or, if two clouds join together it could start raining. I’d love to work on this in the future :)

links

OpenProcessing link

Google Drive link to video

sources

Handsfree.js

the Handsfree.js library i used for hand detection and tracking

p5.js Web Editor

a demo on how to set up and use Handsfree.js in a p5.js project