02.12 / 03.12 / 04.12
This week is centered around AI (artificial intelligence) in combination with interaction. We started the week with a presentation about what AI is and how the logic behind it works. AI is based on fuzzy logic, which means that a statement is not simply true or false, but true to a degree, relative to some context. For example, my laptop can be large compared to other laptops, so fuzzy logic would categorize it as large. But compared to the wider category of electronic devices, my laptop is small, so fuzzy logic would categorize it as small.
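To make the laptop example concrete, here is a minimal sketch of a fuzzy membership function. The size ranges are hypothetical values I picked for illustration, not from any real dataset: the point is that the same object gets a different degree of "largeness" depending on the reference category.

```python
def largeness(size_cm, small_cutoff, large_cutoff):
    """Degree in [0, 1] to which a size counts as 'large' within a reference category."""
    if size_cm <= small_cutoff:
        return 0.0
    if size_cm >= large_cutoff:
        return 1.0
    # linear ramp between the two cutoffs
    return (size_cm - small_cutoff) / (large_cutoff - small_cutoff)

laptop = 43  # hypothetical laptop size in cm

# Compared to other laptops (assumed range 28-45 cm), it is quite large...
print(largeness(laptop, 28, 45))   # close to 1
# ...but compared to electronic devices in general (assumed 10-180 cm), it is small.
print(largeness(laptop, 10, 180))  # close to 0
```

So instead of a hard yes/no, fuzzy logic gives a degree of membership that shifts with the context.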
AI trains itself by comparing many examples and drawing conclusions from them, much as we humans learn to categorize and recognize objects within a category. Is something an animal? If so, what kind of animal is it: a cat or a dog?
After the presentation we worked with a program made by Google that tries to recognize a drawing ( https://quickdraw.withgoogle.com/ ). I have done a number of projects with AI in the past, and the basis is always the same, but it was not wrong to start again from the basics. For example, I did not know that Quick, Draw! existed, and I was amazed at how well the program worked. When I tried it, the program managed to recognize four of my six drawings, which I think is quite a lot, especially given my level of drawing.
I spent the rest of the day reading the articles. I found the first article (Frédéric Bevilacqua et al., 2016) especially interesting: it was fairly short but filled with useful information. The part about sound-oriented tasks and movement-oriented tasks stood out to me. Distinguishing between the two is a good way to direct the user's attention, and it clarifies whether the interaction is driven by the sound or by the movement.
In my opinion, the other two articles were more descriptions of a project. Although they were interesting, they served more as inspiration, and I don't see a direct link to what I can do for this project. I did, however, find the "butterfly effect" described in the second article (Kameron Christopher et al., 2013) an interesting concept: tiny finger gestures can be used to control very fast repetitive movements, which is of course essential for a musician.
Our project for the week is to create a digital music instrument using some kind of input (e.g. a webcam, microphone, or Arduino). Tuesday was mainly devoted to setting everything up and tinkering with Processing and Wekinator. Wekinator is a program that uses machine learning to form a link between input and output. The input we are going to use is the Leap Motion sensor, which can record very accurate hand movements.
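The idea of "forming a link between input and output" can be sketched in a few lines. This is not how Wekinator is actually implemented (its real models are more sophisticated); it is just a toy nearest-neighbour version, with hypothetical numbers, to show the principle of learning a mapping from recorded examples.

```python
def train(examples):
    """examples: list of (input_vector, output_value) pairs recorded by the user."""
    return list(examples)

def predict(model, x):
    """Return the output paired with the closest recorded input vector."""
    def dist(a, b):
        # squared Euclidean distance between two feature vectors
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda ex: dist(ex[0], x))[1]

# Hypothetical training data: hand height (normalised 0-1) mapped to a pitch in Hz.
model = train([((0.1,), 220.0), ((0.5,), 440.0), ((0.9,), 880.0)])
print(predict(model, (0.45,)))  # 440.0, since 0.5 is the nearest recorded input
```

In the real setup, the input vector would come from the Leap Motion's hand-tracking data and the output would drive the sound synthesis in Processing, with Wekinator sitting in between as the trained mapping.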