Volume control

During our body storming, we found that the feeling of control changes quite easily with small changes. We wanted to continue exploring control and decided to test whether we could get a well-established movement to work. Holding your hands over your ears is a movement we have made since we were small children to shut out sound. How well can this movement be translated into an interaction with a computer? This is something we can test with the help of the machine learning program.

With this sketch, we wanted to test arm movements, and especially how we could use the wrist positions to control the volume.

From the body storming, we had the idea that the wrists were well suited for measuring the distance to another body part. We could use them to get the data we needed to turn a movement into an input. We thought we could map the distance between the ears and the wrists. This turned out to be more difficult than imagined, and getting the readings of the x and y positions to match up wasn't as simple as I had thought.

This was the first test of really trying to read a specific movement with the help of the camera, and it showed that one problem is that the camera doesn't always pick up the positions correctly. Overlapping body parts also seem to be a problem: when our wrists should be at the same positions as the ears, the readings come out wrong. I don't know whether the model tries to correct itself and does so incorrectly, or what the cause is.
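One way to soften these misreadings, sketched below under assumptions: pose models usually attach a confidence score to each keypoint, so we can ignore readings below a threshold and hold the last trusted position instead. The 0.5 threshold and the keypoint shape are assumptions, not values from our actual setup.

```javascript
// Returns a function that filters one keypoint stream:
// low-confidence readings are ignored and the last trusted
// position is returned instead (null until the first good one).
function makeStabilizer(minScore = 0.5) {
  let last = null;
  return function stabilize(keypoint) {
    if (keypoint.score >= minScore) {
      last = keypoint.position;
    }
    return last;
  };
}

const stableWrist = makeStabilizer();
stableWrist({ position: { x: 100, y: 200 }, score: 0.9 }); // trusted: stored
const held = stableWrist({ position: { x: 500, y: 50 }, score: 0.1 }); // glitch ignored
console.log(held); // { x: 100, y: 200 }
```

This doesn't fix the overlap problem itself, but it stops a single bad frame (for example, when the wrist covers the ear) from making the volume jump around.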
