InteractML is the latest addition to the family of interactive machine learning toolkits by Dr. Rebecca Fiebrink. Tools like Wekinator, Sound Control and mimic are being used by artists, educators and researchers to record and process various kinds of real-time data in order to generate sounds, visuals and other media.
To learn more about this approach to machine learning, you can take two online courses, Machine Learning for Musicians and Artists and Apply Creative Machine Learning. Both are highly recommended.
One of the strengths of Wekinator is that it can be connected to almost everything through a protocol called OSC. For example, two years ago I wrote a helper for the BITalino (r)evolution biosignal sensor that connects BITalino to Wekinator via a Processing sketch. The system could learn patterns from heart-rate or skin-conductance measurements and send the results to interactive environments such as Pure Data, Max or TouchDesigner.
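To give a feel for how simple the OSC wire format is, here is a minimal sketch that encodes a Wekinator-style input message by hand, using only the Python standard library. The address `/wek/inputs` and UDP port 6448 are Wekinator's defaults; the helper names are my own.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """NUL-terminate and pad a byte string to a multiple of 4, as OSC requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying big-endian 32-bit floats."""
    type_tags = "," + "f" * len(floats)          # e.g. ",ff" for two floats
    msg = osc_pad(address.encode()) + osc_pad(type_tags.encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Wekinator listens on UDP port 6448 and reads inputs from /wek/inputs.
packet = osc_message("/wek/inputs", 0.42, 0.7)
# To send: socket.socket(AF_INET, SOCK_DGRAM).sendto(packet, ("127.0.0.1", 6448))
```

Because OSC is just small UDP packets like this, anything that can open a socket — Processing, Pure Data, Max, TouchDesigner or a microcontroller — can feed Wekinator.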
Now surely some people would like to build similar things in Unity. Enter InteractML, which is built on RapidLib, a C++ machine learning library. InteractML follows the same approach as Wekinator and works with Unity game objects.
In this repo I share a regression example that measures the position of a game object and maps it to a sound parameter in a spooky way, not unlike a theremin. It uses ChucK/Chunity for audio synthesis.
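The core idea of the regression example can be sketched in a few lines. InteractML's Teach the Machine node delegates to RapidLib internally; the hand-rolled least-squares fit below is only a stand-in to show how two recorded examples yield a continuous, theremin-like mapping from position to a sound parameter.

```python
def train_linear(examples):
    """Fit y = a*x + b to (input, output) pairs via least squares."""
    n = len(examples)
    sx = sum(x for x, _ in examples)
    sy = sum(y for _, y in examples)
    sxx = sum(x * x for x, _ in examples)
    sxy = sum(x * y for x, y in examples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

# Two recorded examples: object "down" (y = -1) -> 0.0, "up" (y = 1) -> 1.0.
model = train_linear([(-1.0, 0.0), (1.0, 1.0)])

# Positions in between interpolate smoothly -- the theremin-like effect.
gain = model(0.0)   # 0.5, halfway between the two trained outputs
```

This is exactly what makes regression (as opposed to classification) a good fit here: the model outputs a continuous value, so the sound parameter glides rather than jumps.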
InteractML comes with a Wiki that explains, step by step, how to get the system up and running and how to use it in detail. I reference it below.
- Install dependencies.
- Build the regression pipeline.
- Check that the data flows into the Teach the Machine node as expected.
- Add game objects with scripts to get the result of the regression.
- Optional: add scripts to pipe data into the pipeline.
Uncheck Run Model on Play in the Machine Learning System node if it is checked.
For each output value to be produced (I am using 0.0 and 1.0):
- Set the value in the `Live Float Data` node.
- Start the game in Unity.
- For each example recording:
  - Put the object in the appropriate position (I am using down for 0.0 and up for 1.0).
  - Press SPACE to start recording.
  - Press SPACE to stop recording.
- Stop the game in Unity.
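The recording loop above can be summarized in pseudocode. This is not InteractML's API — the class and method names below are hypothetical — it just illustrates how SPACE-toggled recording pairs each frame's input with the currently selected output value.

```python
class ExampleRecorder:
    """Collects (input, output) training pairs while recording is on,
    mimicking InteractML's SPACE-toggled example recording."""

    def __init__(self):
        self.examples = []
        self.recording = False
        self.target = 0.0

    def set_target(self, value):        # the Live Float Data node's value
        self.target = value

    def toggle(self):                   # bound to SPACE in the editor
        self.recording = not self.recording

    def frame(self, position):          # called once per game frame
        if self.recording:
            self.examples.append((position, self.target))

rec = ExampleRecorder()
rec.set_target(0.0)                     # first output value
rec.toggle()                            # SPACE: start recording
for pos in (-1.0, -0.9, -1.1):          # object held roughly "down"
    rec.frame(pos)
rec.toggle()                            # SPACE: stop recording
```

Every frame recorded while SPACE is toggled on becomes one training example, all labeled with the same target value — which is why you set the `Live Float Data` node before each recording pass.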
The reason you want to begin the movement before starting the recording is to prevent the model from picking up common features, like an idle object at the beginning of every example.
- Make sure the game is not running.
- Click on Training.
- Wait a bit until it comes back and indicates that the model has been trained.
- Start the game in Unity.
- Press "P" to run the model. You can skip this step if you check Run Model on Play in the Machine Learning System node.
InteractML is in pre-release alpha, under heavy construction and not ready for production at the moment. It is for the curious who want to experiment with interactive machine learning. The developers strongly recommend using release 0.20.4 with Unity 2019.2. For this repo, I have used this setup on a Mac with the GitHub for Unity plugin.