It’s almost impossible to avoid reading about machine learning these days, and the trend is only accelerating. Machine learning is opening up powerful new capabilities for smartphone apps, from image classification to facial recognition.
It’s also capable of identifying how someone is interacting with their smartphone. Read More
Inspired by a conversation with Diana Berlin, I wrote about what motivates me as a technologist, and the personal story behind why and how I learned to code.
I published the piece on Medium here, and am including it below as an archive: Read More
One area that I've been exploring recently is Bluetooth communication between sensor-circuits and iOS apps. I wanted to share one of these studies, based on some of the examples provided by the good folks at Adafruit. It consists of a sensor that can detect the presence of a flame and send that information over Bluetooth to an iPhone app, which displays the reading from the sensor. Read More
In this blog post, I'll show you how to add the Gesture Recognition Toolkit to an iPhone app. Created by Nick Gillian while he was a post-doc at MIT, the GRT is a "cross-platform, open-source, C++ machine learning library designed for real-time gesture recognition". I had known from this issue on GitHub that the GRT has been used in iOS development, but I was only able to find one example of this in action, upon which this guide is partially based. Read More
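To give a flavor of what a gesture-recognition pipeline like the GRT does after feature extraction, here's a deliberately tiny, self-contained sketch. Note that this is plain Java, not the GRT's C++ API, and all the class and method names are my own invention for illustration: the core idea is simply comparing an incoming feature vector against labeled training examples and returning the nearest label.

```java
import java.util.ArrayList;
import java.util.List;

// Toy 1-nearest-neighbor classifier: the simplest version of what a
// gesture-recognition pipeline does once features have been extracted.
public class ToyGestureClassifier {
    private final List<double[]> samples = new ArrayList<>();
    private final List<String> labels = new ArrayList<>();

    // Record one labeled training example (e.g. accelerometer features).
    public void addSample(String label, double[] features) {
        labels.add(label);
        samples.add(features.clone());
    }

    // Classify a new feature vector by squared Euclidean distance
    // to the closest stored training sample.
    public String predict(double[] features) {
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < samples.size(); i++) {
            double[] s = samples.get(i);
            double d = 0;
            for (int j = 0; j < s.length; j++) {
                double diff = s[j] - features[j];
                d += diff * diff;
            }
            if (d < bestDist) {
                bestDist = d;
                best = labels.get(i);
            }
        }
        return best;
    }
}
```

Real libraries like the GRT add the parts that matter in practice (feature extraction, smarter classifiers such as DTW or SVMs, and rejection of unknown gestures), but the train-then-predict shape is the same.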
I wanted to talk a little bit about one of the technical challenges I had faced while writing the Processing receiver sketches for the Touché experiments in some previous blog posts (here and here).
The problem I was experiencing was that the Processing sketches receiving the gesture-classification data from the ESP program seemed to update incredibly slowly. I could clearly see the gesture-classification results being updated in the ESP program, but the Processing receiver sketches would display results several seconds behind what the ESP system was sending! Read More
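A common cause of this kind of lag in receiver sketches (I'm describing the general pattern here, not necessarily the exact bug in my sketches) is reading only one queued message per draw() frame: if messages arrive faster than frames are drawn, the queue backs up and the display falls further and further behind. The fix is to drain every pending message each frame and keep only the most recent one. Here's a self-contained Java sketch of the idea, with a simulated inbox standing in for the serial/OSC buffer:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Demonstrates why a receiver falls behind, and how draining the
// message queue each frame keeps the display current.
public class MessageDrain {
    // Simulated inbox of classification results, oldest first.
    private final Deque<String> inbox = new ArrayDeque<>();

    public void receive(String msg) {
        inbox.addLast(msg);
    }

    // BAD: one message per frame; lags if messages outpace frames.
    public String readOnePerFrame() {
        return inbox.pollFirst();
    }

    // GOOD: drain the whole queue, returning only the latest message.
    public String readLatest() {
        String latest = null;
        while (!inbox.isEmpty()) {
            latest = inbox.pollFirst();
        }
        return latest;
    }
}
```

In an actual Processing sketch the same drain loop would go at the top of draw(), looping while the serial or OSC buffer reports data available.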
After experimenting with learning how Touché could be used to interact with plants, I wanted to see how I could use it to interact with water.
In the Touché paper, the authors demonstrate that the sensor is capable of detecting how a user is touching the surface of, or submerging their hand in, water. The system is able to distinguish among a variety of touch interactions: no hand, one finger, three fingers, and a fully submerged hand. Read More
As I mentioned in a previous post, I was really pleased to see that the ESP-Sensors project had included code for working with a circuit based on Touché.
I had earlier come across other implementations of Touché for the Arduino, but unlike the ESP project, none of them utilized machine learning for classifying gesture types.
Touché is a project developed at Disney Research that uses swept-frequency capacitive sensing to "...not only detect a touch event, but simultaneously recognize complex configurations of the human hands and body during touch interaction." Read More
When I was working as a junior developer at a small software company in the Midwest, I was asked to sign an NDA / non-compete agreement. It was an amendment to the one I had signed upon first starting my employment. I had signed that first agreement without asking any questions - because I simply hadn’t known any better. Read More
The past several years have seen an explosion in machine learning, including in creative contexts. From hallucinating puppyslugs to generating new melodies, machine learning is already beginning to revolutionize the way artists and musicians practice their craft.
My personal interest in machine learning relates to using it to recognize human gestural input via sensors. This interest was sparked by working with Project Soli as a member of the Alpha Developer program.
Sensors offer a bridge between the physical world and the digital. Rich sensor input combined with machine learning allows for new interfaces to be developed that are novel, expressive, and can be configured to a specialized creative task. Read More
I'm writing this having returned from the 2016 EYEO Festival, a gathering of creative technologists, designers, and artists from all over the world. It was an amazing experience, and I highly recommend going if you ever have the chance. There were many things I enjoyed about it: the excellent talks, meeting in person people I'd previously only talked to on Twitter, and the late-night dancing at Prince's nightclub.
By the end of the week, I noticed an underlying theme to several of the talks I went to and conversations I was part of, which was that of designing and creating tools. Read More
Since Swift was open-sourced, there have been many pull requests on the project. Many of these have been grammatical or spelling fixes. Unfortunately, both on Twitter and on GitHub, there have been snide comments dismissing these pull requests as insubstantial.
Chris Lattner, the Swift project architect, thinks otherwise: Read More
Recently, we launched version 1.2 of AudioKit. We've included what we like to call sensible defaults for most operations. With sensible defaults, you can create an instance of an AudioKit object without having to initialize the object's parameters.
Additionally, most operations now include tests. These tests let you hear what an individual operation is capable of doing, so it's easier for you to figure out what operations you need to get the sound you want!
I'm extremely excited for this release, as I think it's going to go a long way in helping iOS and OSX developers use audio in their apps in new and exciting ways. Read More
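The "sensible defaults" idea can be illustrated with a tiny sketch of the underlying pattern. To be clear, AudioKit itself is an Objective-C/Swift framework and the names below are hypothetical, not AudioKit's actual API; this plain-Java example just shows the general shape: a no-argument constructor delegates to the full one with reasonable default values, so callers can create a working object without specifying every parameter.

```java
// Hypothetical oscillator illustrating the "sensible defaults" pattern.
public class ToyOscillator {
    public final double frequency; // Hz
    public final double amplitude; // 0.0 - 1.0

    // Full constructor: every parameter explicit.
    public ToyOscillator(double frequency, double amplitude) {
        this.frequency = frequency;
        this.amplitude = amplitude;
    }

    // Sensible defaults: A440 at half amplitude, chosen so the
    // object makes a useful sound with zero configuration.
    public ToyOscillator() {
        this(440.0, 0.5);
    }
}
```

The payoff is that beginners can get sound out immediately, while advanced users still have every parameter available when they want it.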
As much as multi-touch devices have enabled new forms of music creation and performance, they are still lacking in one thing that traditional, acoustic instruments have: tactile feedback. However, Dutch designer Samuel Verburg of Tweetonig created a solution to this problem with TunaKnobs: rotary knobs that will attach to any capacitive surface (Wired UK has a great write-up on the project). Read More
I recently wrote an article on multi-touch interaction in music (which included some of my thesis work) called 'Extending the Repertoire of the Hand' for I CARE IF YOU LISTEN. Read More