Integrating the GRT into an iPhone Project

In this blog post, I'll show you how to add the Gesture Recognition Toolkit to an iPhone app. Created by Nick Gillian while he was a post-doc at MIT, the GRT is a "cross-platform, open-source, C++ machine learning library designed for real-time gesture recognition". I knew from this issue on GitHub that the GRT had been used in iOS development, but I was only able to find one example of this in action, on which this guide is partially based.


Tools for Machine Learning and Sensor Inputs for Gesture Recognition

The past several years have seen an explosion in machine learning, including in creative contexts. From hallucinating puppyslugs to generating new melodies, machine learning is already beginning to revolutionize the way artists and musicians practice their craft.

My personal interest in machine learning lies in using it to recognize human gestural input via sensors. This interest was sparked by working with Project Soli as a member of the Alpha Developer program.

Sensors offer a bridge between the physical world and the digital. Rich sensor input combined with machine learning allows for new interfaces that are novel, expressive, and configurable for specialized creative tasks.
