This video just gave me some inspiration for how to present my game at the open house, haha~
This is my project document.
I applied the familiar jigsaw puzzle idea to a smart two-player game that challenges the capability of graphic comprehension.
1. The host places 8 pieces as an outline of his target pattern, and writes down his idea on a piece of paper.
2. The guest makes a guess based on that outline and places the pieces according to his guess; the host then judges each placement right or wrong, and the guest keeps trying until the host says "right".
3. The host gives another piece, and the guest tries to guess again. If the guest feels confident about the answer, he can say it aloud, and the host will tell him whether it is right or wrong. If the guest is still not sure, he keeps placing pieces based on the host's hints.
4. The guest must figure out the right answer within three guesses, or both the host and the guest lose the game.
5. Elapsed time also counts toward the score.
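The three-guess rule above can be sketched as a tiny state machine. This is purely illustrative (the real game is played with physical pieces and a human host); the class and method names are hypothetical:

```java
// Hypothetical sketch of the guessing loop: the guest gets at most three
// spoken guesses; three wrong guesses means both players lose.
public class GuessGame {
    static final int MAX_GUESSES = 3;
    private int guessesUsed = 0;
    private boolean solved = false;

    // The host judges a spoken guess; returns true while the game continues.
    boolean judge(boolean correct) {
        if (solved || guessesUsed >= MAX_GUESSES) return false;
        guessesUsed++;
        if (correct) solved = true;
        return solved || guessesUsed < MAX_GUESSES;
    }

    // Rule 4: if three guesses are used up without a solve, both lose.
    boolean bothLose() { return !solved && guessesUsed >= MAX_GUESSES; }

    public static void main(String[] args) {
        GuessGame game = new GuessGame();
        game.judge(false);                   // guess 1: wrong
        game.judge(false);                   // guess 2: wrong
        game.judge(false);                   // guess 3: wrong
        System.out.println(game.bothLose()); // true -> both players lose
    }
}
```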
Here is the link; it's simple to play!
The Big New Thing was multi-touch, back when I embarked on my grail-quest for an MS in CS at Georgia Tech in Fall 2009 […yikes?]. Microsoft claims that the PixelSense technology driving its Surface 2.0 will, given sufficient time, be as affordable as an LCD display. In the interim, I expect civilization to collapse once or twice.
Here's a [pdf] link to our work-in-progress-but-also-kinda-final paper.
The project is called Sketch-A-Song, designed in collaboration with Nicholas Davis (PhD Human-Centered Computing in-progress). If you have a chance to work with Nick (or even grab coffee), do. From the research perspective, he understands the importance of having a strong theoretical basis before setting out to build something, which I found to be a healthy counterbalance to my mindset of "plug everything in and see where it takes us", a methodology I liken to that of the average housepet — urinate on everything. From the non-work perspective, it's just good fun to discuss theories of creativity, cognition, the arts, and Lisp programming while sipping a pure Colombian roast.
Our technology stack includes Java + Processing + Clojure + Overtone, with the Open Sound Control protocol serving all of our communication needs. Very soon to come: integration with tangible objects for selective performance and composition of musical components. Have a safe and happy Winter Break, and look for us in 2012.
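For anyone curious what the OSC plumbing looks like at the byte level: an OSC message is an address pattern and a type-tag string, each null-terminated and padded to a 4-byte boundary, followed by big-endian arguments. A minimal sketch in Java is below; this is not our project's code (we use existing libraries), and the address `/sketch/stroke` is a made-up example:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal sketch of the OSC 1.0 wire format for a message carrying one
// int32 argument. Illustrative only; the address is hypothetical.
public class OscSketch {
    // Null-terminate a string and pad it to the next 4-byte boundary.
    static byte[] paddedString(String s) {
        byte[] raw = s.getBytes(StandardCharsets.US_ASCII);
        int padded = (raw.length / 4 + 1) * 4;  // always >= one null byte
        byte[] out = new byte[padded];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    static byte[] encodeIntMessage(String address, int value) {
        ByteArrayOutputStream msg = new ByteArrayOutputStream();
        byte[] a = paddedString(address);
        msg.write(a, 0, a.length);
        byte[] t = paddedString(",i");          // type tags: one int32
        msg.write(t, 0, t.length);
        byte[] v = ByteBuffer.allocate(4).putInt(value).array(); // big-endian
        msg.write(v, 0, v.length);
        return msg.toByteArray();
    }

    public static void main(String[] args) {
        byte[] packet = encodeIntMessage("/sketch/stroke", 120);
        // 16 (padded address) + 4 (padded type tags) + 4 (int32) = 24 bytes
        System.out.println(packet.length);
    }
}
```

In practice the resulting bytes would be handed to a UDP socket; every component in the stack (Processing sketch, Overtone synths) just agrees on the address patterns.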
I want to thank all the class members who used Sketch-A-Song during our recent demo day and provided Eric and me with really great feedback! From our observations, we noticed that people really liked the idea of musical gestures, i.e. seeing how different shapes sounded and working to understand how the speed of their drawing would affect the sound. Individuals also really liked to hear how their name sounded. This makes sense: everyone has a name, so it provides a common point of interest and a means of socializing with the technology (NameVoyager also used baby names as a way to engage users, and it was very successful).
One unexpected idea that came out of our discussion was using Sketch-A-Song as an assistive tool to help visually impaired individuals create and view art. This is a really exciting idea, and I encourage you all to check out the last section of our paper “Exploring Techniques to Sonify Art with Sketch-A-Song” to learn more about how your ideas helped us come up with a hypothetical design for a really neat assistive technology.
There were two interactive art pieces in particular that inspired Eric and me when we were designing our project, Sketch-A-Song. These artworks can be found at Baroque.me and Inudge.net.
Baroque.me is a program in which moving balls cross lines, and each line plays a note whose pitch corresponds to the line's length. If the user's cursor is close to a moving ball, the balls gravitate toward the cursor and alter the Baroque composition being played. In this artwork, there is a fixed composition, and the user's interaction alters and updates it. Inudge.net shows a grid where the user can click and drag to highlight portions of the grid. At each beat, a vertical line consisting of an entire column of the grid moves horizontally across the screen. When that column crosses a highlighted cell, it plays a note corresponding to the height of the cell.
Inudge features a simple, repetitive musical composition in which the notes are tuned so that the result sounds harmonious. Baroque.me uses classically composed musical scores with nuances created through mouse position. In Baroque.me the user has less control over the musical output, but the sound is more complex and interesting, whereas in Inudge the sound is simple and repetitive, yet more user-generated.
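The Inudge column-sweep mechanic described above reduces to a very small piece of logic: on each beat, look at one column of a boolean grid and sound a pitch for every highlighted cell, with the pitch determined by the cell's row. A toy sketch follows; the pentatonic pitch set is my assumption about why "anything sounds harmonious", not Inudge's actual tuning:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of an Inudge-style grid sequencer: one column is scanned per
// beat, and each highlighted cell in that column triggers a note whose
// pitch is determined by its row. The pentatonic scale is an assumption.
public class GridSequencer {
    // C-major pentatonic MIDI pitches; row 0 = lowest note (assumed mapping).
    static final int[] SCALE = {60, 62, 64, 67, 69, 72, 74, 76};

    // Return the MIDI notes to trigger when the sweep line reaches `column`.
    static List<Integer> notesForColumn(boolean[][] grid, int column) {
        List<Integer> notes = new ArrayList<>();
        for (int row = 0; row < grid.length; row++) {
            if (grid[row][column]) {
                notes.add(SCALE[row % SCALE.length]);
            }
        }
        return notes;
    }

    public static void main(String[] args) {
        boolean[][] grid = new boolean[8][16];       // 8 pitches x 16 beats
        grid[0][0] = true;                           // C4 on beat 0
        grid[4][0] = true;                           // A4 on beat 0
        System.out.println(notesForColumn(grid, 0)); // [60, 69]
    }
}
```

Restricting rows to a single scale is what makes arbitrary click-patterns sound consonant, which is exactly the trade-off discussed above: simple, repetitive output in exchange for full user control.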
From these examples, we concluded that we wanted to have the potential for complex sound, but we also wanted the visual artistic experience to be central. In both of these examples, the visual aspect was secondary to the actual functionality of the program. We wanted this visual component to be primary and complemented by complex and beautiful compositions.