EYEO 2016: Observations on Toolmaking

I'm writing this having just returned from the 2016 EYEO Festival, a gathering of creative technologists, designers, and artists from all over the world. It was an amazing experience, and I highly recommend going if you ever have the chance. There were many things I enjoyed about it: the excellent talks, finally meeting people I'd only ever talked to on Twitter, and the late-night dancing at Prince's nightclub.

By the end of the week, I noticed an underlying theme running through several of the talks I attended and the conversations I was part of: designing and creating tools.

The first talk where I noticed this theme was Hannah Perner-Wilson's "The More I make, the more I wonder why". She said there was a point in her e-textiles practice where she started to "make tools and not parts". This is reflected most clearly in her OHMHOOK (ohm meter / crochet hook), which lets her measure the resistance of the circuits she creates with conductive thread.

This tool helps Hannah work much more quickly: instead of having to put down her crochet hooks, pick up the leads of her ohm meter to test a circuit's resistance, put down the leads, and pick up her hooks to start crocheting again, she can test the circuit's resistance with the same tool she uses to create it in the first place.

The topic of tools also came up in conversation. Derek Kinsman and I talked about the tendency of some programmers to fetishize certain languages, frameworks, and development patterns at the expense of whichever languages, frameworks, and patterns are most appropriate for the task at hand. (This is something Derek is particularly passionate about; on his website, he states that he "...believes finding the right solution is more important than finding the right problem and that said solution determines what tools will be used.")

Since creative technologists continually push the boundary of what technology can do (which often involves using it in ways it was never intended for), we frequently run into instances where the tools we're trying to use just can't do the thing we need them to. The solution is often to modify that tool, or to create a new one. In his talk, Ben Fry said that the Processing project exists because of the frustration he felt watching students struggle with programming while working on design projects: they spent so much time trying to work out how to do something in their code that they lost sight of the bigger picture of what they were working on. Processing has since introduced countless people to creative coding.

In their practices, Hannah, Ben, and many others pushed up against the limits of what their tools could do while trying to accomplish their creative or pedagogical goals. As a result, they had to create their own tools to serve their needs better than any existing ones could. And by open-sourcing those tools, they have benefited the creative practices of many others as well.

Thoughts on Swift, First Pull Requests, and the FOSS Community

Since Swift was open-sourced, the project has received many pull requests, and many of these have been grammatical or spelling fixes. Unfortunately, both on Twitter and on GitHub, there have been snide comments dismissing these pull requests as insubstantial.

Chris Lattner, the Swift project architect, thinks otherwise (you can find the specific tweet here).

For those of us who are already active in the open-source software community, it can be easy to forget this. It may be worth looking up what your first pull request on GitHub actually was; you can find out on First Pull Request. Chances are good that, for many people, it was a typo fix in a README file, or a formatting fix in a header file.

It's important to remember that all types of improvements to a project matter, even if they're just typo fixes. There's a good chance that the people making these "small" pull requests on the newly open-sourced Swift project will go on to contribute to more substantial areas of it as they continue to explore the code base.

On a final note, if you're looking to get involved in the open-source community but aren't sure where to start, take a look at Your First Pull Request. They post GitHub issues that are appropriate for first-timers to tackle.

Happy coding!

Arduino + AudioKit Demo

I've added a new OS X project on GitHub that shows how a simple oscillator created with AudioKit can be controlled with a physical interface. It's written in Swift and uses the ORSSerialPort library to interface with an Arduino Uno. I've published a demo video of the project on Vimeo.

A detailed description of how the app works can be found in the project's README file. 
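In broad strokes, the app reads analog values streamed from the Arduino over serial and maps each reading onto the oscillator's frequency. As a rough sketch of just that mapping step (the function name and the 220–880 Hz range below are my own illustration, not the project's actual code):

```swift
// Map a 10-bit Arduino analog reading (0–1023) to an oscillator
// frequency in Hz. The 220–880 Hz range is illustrative only.
func frequency(fromAnalogValue value: Int,
               minHz: Double = 220.0,
               maxHz: Double = 880.0) -> Double {
    let clamped = min(max(value, 0), 1023)      // guard against noisy readings
    let normalized = Double(clamped) / 1023.0   // scale to 0.0 ... 1.0
    return minHz + normalized * (maxHz - minHz)
}
```

In the actual project, a value like this gets applied to the AudioKit oscillator each time a new reading arrives via ORSSerialPort's delegate callback.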

I hope you find it useful! If you have any questions, or use it as part of your project, please let me know! 


nick (at) audiokit (dot) io

AudioKit 1.2 and the Future of Audio Development

Recently, we launched version 1.2 of AudioKit. We've included what we like to call sensible defaults for most operations. With sensible defaults, you can create an instance of an AudioKit object without having to initialize the object's parameters.  
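The pattern behind sensible defaults is essentially defaulted initializer parameters: every parameter starts at a musically useful value, and you override only what you care about. A hypothetical sketch of the idea (this is not AudioKit's actual API; the type and parameter names are invented for illustration):

```swift
// Hypothetical oscillator illustrating the sensible-defaults pattern.
// This is NOT AudioKit's actual API.
struct Oscillator {
    var frequency: Double
    var amplitude: Double

    // Every parameter has a reasonable default, so Oscillator() "just works".
    init(frequency: Double = 440.0, amplitude: Double = 0.5) {
        self.frequency = frequency
        self.amplitude = amplitude
    }
}

let basic = Oscillator()                  // all defaults: 440 Hz, 0.5 amplitude
let louder = Oscillator(amplitude: 1.0)   // override only what you need
```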

Additionally, most operations now include tests. These tests let you hear what an individual operation is capable of doing, so it's easier for you to figure out what operations you need to get the sound you want!

I'm extremely excited about this release, as I think it will go a long way toward helping iOS and OS X developers use audio in their apps in new and exciting ways.

If you're developing for iOS or OS X, for example, you're probably at least aware of Core Audio. While Core Audio is extremely powerful, its low-level nature makes it difficult to prototype and deploy quickly: you spend so much time working with low-level samples and buffers that you're not able to focus on making great sounds!

Before AudioKit, there were other third-party audio development solutions, such as libpd and Csound for iOS. However, both of these required developers to use other environments: Pure Data and Csound, respectively. If you were a developer who wanted to make interesting audio, you were stuck either deciphering Core Audio's cryptic nature or learning another environment.

That's why AudioKit is so special: it allows developers to implement audio at a high level using Objective-C or Swift, without having to learn another environment such as Pure Data or Csound.

For audio to be integrated into more and more applications, it has to be easier for software developers to work with. Csound for iOS started to solve this problem, but for the actual audio implementation you still had to use Csound, which, however powerful its audio engine, has a rather cryptic syntax at the best of times.

To be clear, I'm not saying there's no need for visual programming environments such as Pure Data or Max/MSP (I, for one, am extremely excited about Max 7). What I am saying is that software developers in the 'traditional' sense who want to write apps for iOS or OS X will now be able to leverage the power of AudioKit.

I believe that with AudioKit, developers will be able to create new and interesting experiences for users through high-level audio. And we're going to keep on improving AudioKit as we go.

Tactile Interactions with Multi-Touch Apps

As much as multi-touch devices have enabled new forms of music creation and performance, they still lack one thing that traditional, acoustic instruments have: tactile feedback. Dutch designer Samuel Verburg of Tweetonig has created a solution to this problem with Tuna Knobs: rotary knobs that attach to any capacitive surface (Wired UK has a great write-up on the project).

(Image taken from Tweetonig website)


The video on the project's Kickstarter page humorously shows the problem that musicians and DJs have with many multi-touch music apps: they're somewhat clumsy to interact with during a performance. 

Tuna Knobs are not the first example of this idea. Filip Visnjic has an article on The Creative Applications Network about a similar project from designers at TEAGUE. In my opinion, Visnjic is spot on when he says "I think a lot of people fail to acknowledge that the future are NOT touch screen devices but those that combine both the physical and touch input...". Interestingly enough, the only comment on the article is from someone saying how much better these knobs would be, as opposed to a virtual pan knob in an audio app.

Most multi-touch devices only enable what Bret Victor calls "Pictures Under Glass" interaction. In other words, while these devices are capable of allowing complex gestural interactions, we're generally only manipulating images on a glass screen; our fingers feel nothing but glass. Tuna Knobs address this problem for one specific niche: instead of turning an image of a knob with one finger, you're turning an actual, physical rotary knob, just as you would on an analog synthesizer.

Given that Tuna Knobs reached their Kickstarter goal in 21 hours, it's safe to say there is demand for additional modes of interaction with consumer computing devices. My prediction is that more and more work will be done to extend and augment the experience of multi-touch computing, but (for now) it will continue to come from designers such as Verburg, not consumer electronics manufacturers.

Workshop with the Fuse Factory

This past Saturday, I had the pleasure of giving an introductory workshop on Pure Data at the Fuse Factory in Columbus, Ohio. We worked on a variety of topics, ranging from how to install and set up PD to basic synthesis techniques, video effects with GEM, and interacting with an Arduino.

I've put the presentation and patches on GitHub. If you have any questions, feel free to get in touch!

Many thanks to Alison Colman for organizing the event. 


Intro to Pure Data Workshop

Next week, I'll be giving an Introduction to Pure Data workshop at the Fuse Factory in Columbus, Ohio. You can sign up for it here. I'm going to be talking about:

- Installing and getting started with Pure Data
- Making a basic synthesizer
- User interaction
- GEM and visuals
- PD with other programs
- PD on your phone

Hope to see you there!