Integrating Arduino-Bluetooth Sensors with iOS

One area I've been exploring recently is Bluetooth communication between sensor circuits and iOS apps. I wanted to share one of these studies, based on some of the examples provided by the good folks at Adafruit. It consists of a sensor that can detect the presence of a flame and send that information over Bluetooth to an iPhone app, which displays the reading from the sensor.
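To give a rough sense of the iOS side, here is a minimal CoreBluetooth sketch of the receiving flow: scan for the module's UART service, connect, and subscribe to notifications on its TX characteristic. The UUIDs below are the standard Nordic UART Service values that Adafruit's Bluefruit modules typically expose, and the class and callback wiring are my own illustration rather than the exact code from the project:

```swift
import CoreBluetooth

// Minimal sketch: subscribe to sensor readings sent over a BLE UART link.
// Assumption: the Adafruit module exposes the Nordic UART Service (NUS);
// the UUIDs below are the standard NUS values, not ones confirmed by the post.
final class FlameSensorReceiver: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private let uartServiceUUID = CBUUID(string: "6E400001-B5A3-F393-E0A9-E50E24DCCA9E")
    private let txCharacteristicUUID = CBUUID(string: "6E400003-B5A3-F393-E0A9-E50E24DCCA9E")

    private var central: CBCentralManager!
    private var sensorPeripheral: CBPeripheral?

    /// Called whenever a new reading arrives from the Arduino.
    var onReading: ((String) -> Void)?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [uartServiceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        sensorPeripheral = peripheral          // keep a strong reference
        peripheral.delegate = self
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices([uartServiceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics([txCharacteristicUUID], for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? [] where characteristic.uuid == txCharacteristicUUID {
            peripheral.setNotifyValue(true, for: characteristic)   // subscribe to readings
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard let data = characteristic.value,
              let reading = String(data: data, encoding: .utf8) else { return }
        onReading?(reading)   // e.g. a "flame" / "no flame" flag, or a raw analog value
    }
}
```

From there, displaying the reading is just a matter of pushing the value from onReading into a label on the main queue.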

Read More

Integrating the GRT into an iPhone Project

In this blog post, I'll show you how to add the Gesture Recognition Toolkit to an iPhone app. Created by Nick Gillian while he was a post-doc at MIT, the GRT is a "cross-platform, open-source, C++ machine learning library designed for real-time gesture recognition". I had known from this issue on GitHub that the GRT has been used in iOS development, but I was only able to find one example of this in action, which this guide is partially based upon.

Read More

Real-time data receiving and rendering in Processing

I wanted to talk a little bit about one of the technical challenges I faced while writing the Processing receiver sketches for the Touché experiments in some previous blog posts (here and here).

The problem I was experiencing was that the Processing sketches receiving the gesture-classification data from the ESP program seemed to update incredibly slowly. I could clearly see the gesture-classification results being updated in the ESP program, but in the Processing receiver sketches, the displayed results lagged several seconds behind what the ESP system was sending.

Read More

Touché, and Water as an Interface

After experimenting with how Touché could be used to interact with plants, I wanted to see how I could use it to interact with water.

In the Touché paper, the authors demonstrate that the sensor is capable of detecting how a user is touching the surface of, or submerging their hand in, water. The system is able to distinguish among a variety of touch interactions: no hand, 1 finger, 3 fingers, and hand submerged:

Read More

Talking to Plants: Touché Experiments

As I mentioned in a previous post, I was really pleased to see that the ESP-Sensors project included code for working with a circuit based on Touché.

I had earlier come across other implementations of Touché for the Arduino, but unlike the ESP project, none of them utilized machine learning for classifying gesture types.

Touché is a project developed at Disney Research that uses swept-frequency capacitive sensing to "...not only detect a touch event, but simultaneously recognize complex configurations of the human hands and body during touch interaction."

Read More

On NDAs and vague Non-Compete Agreements as a junior employee

When I was working at a small software company in the Midwest as a junior developer, I was asked to sign an NDA / non-compete agreement. It was an amended version of the agreement I had signed when I first started my employment. I had signed that first agreement without asking any questions, simply because I hadn't known any better.

Read More

Tools for Machine Learning and Sensor Inputs for Gesture Recognition

The past several years have seen an explosion in machine learning, including in creative contexts. From hallucinating puppyslugs to generating new melodies, machine learning is already beginning to revolutionize the way artists and musicians practice their craft.

My personal interest in machine learning relates to using it to recognize human gestural input via sensors. This interest was sparked by working with Project Soli as a member of the Alpha Developer program.

Sensors offer a bridge between the physical world and the digital. Rich sensor input combined with machine learning enables new interfaces that are novel, expressive, and can be tailored to a specialized creative task.

Read More

EYEO 2016: Observations on Toolmaking

I'm writing this having returned from the 2016 EYEO Festival, a gathering of creative technologists, designers, and artists from all over the world. It was an amazing experience, and I highly recommend going if you ever have the chance. There were many things I enjoyed about it: the excellent talks, meeting in person people I'd previously only talked to on Twitter, and the late-night dancing at Prince's nightclub.

By the end of the week, I had noticed an underlying theme running through several of the talks I attended and the conversations I was part of: designing and creating tools.

Read More

Arduino + AudioKit Demo

I've added a new OS X project on GitHub that shows how a simple oscillator created with AudioKit can be controlled with a physical interface.  It's written in Swift, and uses the ORSSerialPort library to interface with an Arduino Uno. I've published a demo video of the project on Vimeo. 
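If you just want the gist without digging through the repo, here's a rough sketch of the approach, assuming the Arduino prints a newline-terminated analog reading (0–1023) over serial. The device path and message format are assumptions, and the AudioKit calls follow the later AudioKit 4-style API rather than the exact version the demo was written against:

```swift
import Foundation
import AudioKit
import ORSSerial

// Minimal sketch of the idea: read values from an Arduino over serial and use
// them to drive an oscillator's frequency. Device path, baud rate, and the
// "value\n" message format are assumptions for illustration.
final class SerialSynth: NSObject, ORSSerialPortDelegate {
    private let oscillator = AKOscillator()
    private var serialPort: ORSSerialPort?
    private var buffer = ""

    func start() throws {
        AudioKit.output = oscillator
        try AudioKit.start()
        oscillator.start()

        serialPort = ORSSerialPort(path: "/dev/cu.usbmodem1421")  // hypothetical device path
        serialPort?.baudRate = 9600
        serialPort?.delegate = self
        serialPort?.open()
    }

    func serialPort(_ serialPort: ORSSerialPort, didReceive data: Data) {
        guard let chunk = String(data: data, encoding: .ascii) else { return }
        buffer += chunk
        // Process complete, newline-terminated messages only.
        while let newlineRange = buffer.range(of: "\n") {
            let line = String(buffer[..<newlineRange.lowerBound]).trimmingCharacters(in: .whitespaces)
            buffer.removeSubrange(..<newlineRange.upperBound)
            if let raw = Double(line) {
                // Map the Arduino's 10-bit analog reading (0–1023) to 110–880 Hz.
                oscillator.frequency = 110 + (raw / 1023.0) * 770
            }
        }
    }

    func serialPortWasRemovedFromSystem(_ serialPort: ORSSerialPort) {
        self.serialPort = nil
    }
}
```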

A detailed description of how the app works can be found in the project's README file. 

I hope you find it useful! If you have any questions, or use it as part of your project, please let me know! 

Nick

nick (at) audiokit (dot) io

AudioKit 1.2 and the future of Audio Development

Recently, we launched version 1.2 of AudioKit. We've included what we like to call "sensible defaults" for most operations: you can now create an instance of an AudioKit object without having to initialize its parameters.
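To illustrate the idea (the class name and parameter labels here are approximations borrowed from later AudioKit releases, not the exact 1.2 signatures):

```swift
import AudioKit

// With sensible defaults, creating a usable oscillator takes one line:
let oscillator = AKOscillator()

// Previously, you had to spell every parameter out yourself,
// along the lines of (labels are illustrative):
let explicitOscillator = AKOscillator(frequency: 440, amplitude: 0.5)
```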

Additionally, most operations now include tests. These tests let you hear what an individual operation is capable of doing, so it's easier for you to figure out what operations you need to get the sound you want!

I'm extremely excited about this release, as I think it's going to go a long way toward helping iOS and OS X developers use audio in their apps in new and exciting ways.

Read More

Tactile Interactions with Multi-Touch Apps

As much as multi-touch devices have enabled new forms of music creation and performance, they still lack one thing that traditional acoustic instruments have: tactile feedback. However, Dutch designer Samuel Verburg of Tweetonig has created a solution to this problem with TunaKnobs: rotary knobs that attach to any capacitive surface (Wired UK has a great write-up on the project).

Read More

Workshop with the Fuse Factory

This past Saturday, I had the pleasure of giving an introductory workshop on Pure Data at the Fuse Factory in Columbus, Ohio. We covered a variety of topics, ranging from installing and setting up PD to basic synthesis techniques, video effects with GEM, and interacting with an Arduino.

I've put the presentation and patches on GitHub. If you have any questions, feel free to get in touch!

Many thanks to Alison Colman for organizing the event. 


Intro to Pure Data Workshop

Next week, I'll be giving an Introduction to Pure Data workshop at the Fuse Factory in Columbus, Ohio. You can sign up for it here. I'm going to be talking about:

- Installing and getting started with Pure Data
- Making a basic synthesizer
- User interaction
- GEM and visuals
- PD with other programs
- PD on your phone

Hope to see you there!