This paper presents a web-based toolkit for implementing Interactive Machine Learning (IML) dedicated to creative audio applications. The toolkit, composed of a main library and a template application, facilitates the creation of collective musical interaction experiences, with a strong emphasis on real-time movement processing and recognition.
At its lower level, the mano-js library proposes a user-friendly API built on top of existing libraries. The library is designed to assist developers and creative coders in appropriating and using Interactive Machine Learning concepts and workflows, as well as to simplify the development of new applications. The library is open-source, based on web standards and released under the BSD-3-Clause licence.
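The IML workflow the library supports can be sketched as a record/train/run cycle. The following sketch only illustrates that cycle; the class and method names are hypothetical and do not reflect mano-js's actual API, and a trivial nearest-neighbour classifier stands in for a real movement model.

```javascript
// Illustrative IML record/train/run cycle. All names here are
// hypothetical, not mano-js's actual API.

// A minimal nearest-neighbour recognizer standing in for a real model.
class GestureRecognizer {
  constructor() {
    this.examples = []; // { label, input } pairs recorded by the user
  }

  // Recording phase: associate a sensor frame with a label.
  addExample(label, input) {
    this.examples.push({ label, input });
  }

  // Training phase: 1-NN needs no fitting, but a real backend
  // (e.g. a GMM or HMM) would estimate its parameters here.
  train() {
    if (this.examples.length === 0) throw new Error('no examples recorded');
  }

  // Performance phase: classify an incoming frame by nearest example.
  predict(input) {
    let best = null;
    let bestDist = Infinity;
    for (const ex of this.examples) {
      const dist = ex.input.reduce(
        (sum, v, i) => sum + (v - input[i]) ** 2,
        0
      );
      if (dist < bestDist) {
        bestDist = dist;
        best = ex.label;
      }
    }
    return best;
  }
}

// Typical workflow: record a few movement frames, train, then recognize.
const recognizer = new GestureRecognizer();
recognizer.addExample('shake', [0.9, 0.8, 0.7]);
recognizer.addExample('still', [0.0, 0.1, 0.0]);
recognizer.train();
console.log(recognizer.predict([0.85, 0.75, 0.8])); // → 'shake'
```

The point of the cycle is that recording, training and performing are interleaved interactively, so a user can refine the model by adding or removing examples and immediately hearing the result.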
At its higher level, the toolkit proposes Elements, a template application designed for non-developer users. The application specifically aims at providing a means for researchers and designers to prototype new movement-based distributed Interactive Machine Learning scenarios. The application allows users to create a new scenario by simply providing a JSON configuration file that defines the role and abilities of each client. The application has been iteratively tested and developed in the context of several workshops.
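A scenario configuration of this kind might look as follows; the structure and field names are purely illustrative and are not taken from the Elements application itself, but they convey the idea of declaring the role and abilities of each client in a single JSON file:

```json
{
  "clients": {
    "performer": {
      "record": true,
      "train": false,
      "outputs": ["synth"]
    },
    "operator": {
      "record": true,
      "train": true,
      "outputs": []
    }
  }
}
```

In such a setup, a designer could grant some participants only the ability to record movement examples while reserving training and model management for an operator client, all without writing application code.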