
Applications of Audio and MIDI API Within a music notation editor

Corentin Gurtner
In this demo, I will show the web application Flat and the way we use the Web Audio API and the Web MIDI API to improve the collaborative score editing experience on flat.io. Flat is a collaborative music score editor. It allows people to create music from scratch while working in real time with each other. It can be thought of as 'Sibelius meets Google Docs meets GitHub' (although Flat does not allow forking scores yet), and we designed it to be very easy to use. Each user has access to a personal space that lists all the scores they have made. Each score can be private, public, or open to specific people. When a user invites other people to collaborate, they can see what those users are doing in real time: which parts they are working on, which notes they are entering, and so on. Collaborators can also communicate with each other through a real-time comments section. Once they have composed a piece, they can share it with their friends on social networks, but also with the Flat community, and get likes, comments and requests to collaborate.

Our audio playback is based on Chris Wilson's well-known article 'A Tale of Two Clocks'. However, to improve performance, all the notes falling within the next 2-second window are scheduled only every 0.5 seconds. A parser then fetches all the data needed to perform a realistic audio playback appropriate to the score. We save music scores in a custom JSON model built on the MusicXML format. Each instrument is made out of samples, and we also plan to add some customizable synthesis-based instruments. Envelopes specific to each instrument allow smooth and clear transitions between notes, and occasional effects (staccato, fermata, tremolo, ...) can also alter the envelope. The library responsible for audio processing is named Dacapo, and it is also used on the backend (through Node.js) to export WAV, MP3 and MIDI files.

We are also implementing MIDI device support with a real-time display on the score during live recording. When you press a key on your controller, the note is displayed on the score immediately after the release, and you can then adjust its duration on a simple piano roll (very convenient for choosing between a triplet and three quarter notes, for example). Another feature we are going to release in the coming weeks is the ability to connect Flat to an output MIDI port. For instance, it can be very convenient to connect Flat to a virtual MIDI port and route it to a virtual instrument (like Kontakt) to enjoy a higher-quality sound and enhance the Flat experience.

The key feature of Flat is the ability to collaborate with several people on the same music sheet (a bit like Google Docs), so I will show that feature with some of my friends in France. I will conclude by explaining what is coming next, such as an audio recognition algorithm and physical modeling for instruments, although we need to wait for the Audio Worker to be released for such applications.
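
Concretely, the scheduling scheme above can be sketched with the Web Audio API roughly as follows. This is a minimal sketch rather than Flat's actual Dacapo code: the shape of noteQueue, the playNote voice and the 0.1 s start offset are assumptions; only the 2-second window and the 0.5-second tick come from the description above.

    // Minimal lookahead scheduler in the spirit of 'A Tale of Two Clocks':
    // a coarse setInterval tick hands precise timing over to the AudioContext clock.
    var audioCtx = new AudioContext();

    var LOOKAHEAD = 2.0;        // schedule every note starting within the next 2 s
    var TICK_INTERVAL = 500;    // wake up every 0.5 s

    // noteQueue items are assumed to look like { time, frequency, duration } (in seconds).
    var nextNoteIndex = 0;
    var playbackStart = 0;

    function playNote(note, when) {
      // Placeholder voice: one oscillator shaped by a simple gain envelope.
      var osc = audioCtx.createOscillator();
      var gain = audioCtx.createGain();
      osc.frequency.value = note.frequency;
      gain.gain.setValueAtTime(0, when);
      gain.gain.linearRampToValueAtTime(0.8, when + 0.01);
      gain.gain.linearRampToValueAtTime(0, when + note.duration);
      osc.connect(gain);
      gain.connect(audioCtx.destination);
      osc.start(when);
      osc.stop(when + note.duration);
    }

    function tick(noteQueue) {
      // Schedule everything that starts before the 2-second horizon.
      var horizon = audioCtx.currentTime + LOOKAHEAD;
      while (nextNoteIndex < noteQueue.length &&
             playbackStart + noteQueue[nextNoteIndex].time < horizon) {
        var note = noteQueue[nextNoteIndex];
        playNote(note, playbackStart + note.time);
        nextNoteIndex++;
      }
    }

    function startPlayback(noteQueue) {
      playbackStart = audioCtx.currentTime + 0.1;  // small offset before the first note
      nextNoteIndex = 0;
      tick(noteQueue);
      return setInterval(function () { tick(noteQueue); }, TICK_INTERVAL);
    }

The point of the pattern is that setInterval only decides which notes to hand off; sample-accurate timing comes from the AudioContext clock, so an occasional late timer tick does not make the playback drift.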
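
The sample-plus-envelope playback can be illustrated in the same spirit. Again this is only a sketch, not Dacapo's API: playSampledNote, the decoded sampleBuffer argument and the attack/release/staccato values are assumptions; the only points taken from the description above are that each instrument gets its own envelope and that articulations such as staccato alter it.

    // Play one sampled note through a per-instrument envelope.
    // 'sampleBuffer' is a decoded AudioBuffer; the envelope values are placeholders.
    function playSampledNote(audioCtx, sampleBuffer, when, duration, options) {
      options = options || {};
      var source = audioCtx.createBufferSource();
      source.buffer = sampleBuffer;

      var envelope = audioCtx.createGain();
      var attack = options.attack || 0.02;    // per-instrument attack time (s)
      var release = options.release || 0.1;   // per-instrument release time (s)

      // An articulation such as staccato alters the envelope, here by
      // shortening the held portion of the note.
      var heldDuration = options.staccato ? duration * 0.5 : duration;
      var releaseStart = Math.max(when + attack, when + heldDuration - release);

      envelope.gain.setValueAtTime(0, when);
      envelope.gain.linearRampToValueAtTime(1, when + attack);
      envelope.gain.setValueAtTime(1, releaseStart);
      envelope.gain.linearRampToValueAtTime(0, when + heldDuration);

      source.connect(envelope);
      envelope.connect(audioCtx.destination);
      source.start(when);
      source.stop(when + heldDuration);
    }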
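
Finally, the MIDI side can be sketched with the Web MIDI API. displayNoteOnScore below is a hypothetical stand-in for the editor's real handler (the note only appears after 'noteoff', as described above), and the note number, velocity and 500 ms delay in the output example are arbitrary values, not Flat's.

    // Hypothetical stand-in for the editor: in Flat the note would be added to the
    // score here and the user would refine its duration on a piano roll.
    function displayNoteOnScore(noteNumber, durationMs) {
      console.log('note ' + noteNumber + ' held for ' + durationMs + ' ms');
    }

    navigator.requestMIDIAccess().then(function (midiAccess) {
      var pressedKeys = new Map();  // MIDI note number -> timestamp of the key press

      // Input side: capture notes from every connected controller and hand them
      // to the score only once the key has been released.
      midiAccess.inputs.forEach(function (input) {
        input.onmidimessage = function (message) {
          var status = message.data[0] & 0xf0;
          var note = message.data[1];
          var velocity = message.data[2];

          if (status === 0x90 && velocity > 0) {            // note on
            pressedKeys.set(note, message.timeStamp);
          } else if (status === 0x80 || (status === 0x90 && velocity === 0)) {  // note off
            var pressedAt = pressedKeys.get(note);
            pressedKeys.delete(note);
            displayNoteOnScore(note, message.timeStamp - pressedAt);
          }
        };
      });

      // Output side: send a note-on/note-off pair to the first available output
      // port, which could be a virtual port routed to a software instrument.
      var output = midiAccess.outputs.values().next().value;
      if (output) {
        output.send([0x90, 60, 100]);                               // middle C on
        output.send([0x80, 60, 0], window.performance.now() + 500); // off after 500 ms
      }
    });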
            
@inproceedings{2016_EA_44,
  abstract = {In this demo, I will show the web application Flat and the way we use Web Audio API and Web MIDI API for improving the collaborative score editing experience on flat.io. Flat is a collaborative score music editor. It allows people to create music from scratch by working in real-time with each other. It can be considered as a 'Sibelius meets Google Doc meets Github' (although Flat does not allow to fork scores yet), and we designed it in order to be very easy to use. Each user has access to his personal space that lists all the scores he made. Each score can be private, public or open to some specific people. When a user invites other people to collaborate, he can see what these other users are doing in real-time, which parts they are working on, what notes they are setting ... etc. They can also communicate with each other through a real-time comments section. Once they composed an awesome piece, they can share it with their friends on social networks, but also with the Flat community and get likes, comments and requests to collaborate. We use an audio playback that is based on Chris Wilson's famous post 'A tale of two clocks': However, all the notes in the next 2 seconds window are scheduled each 0.5 second in order to improve performance. Then a parser fetches all the necessary data to perform a realistic audio playback, appropriate to the score. We save music scores in a custom JSON model lying on MusicXML format. Each instrument is made out of samples. We also plan to add some customizable synthesis-based instruments. Some envelopes specific to each instrument are set to allow smooth and clear transitions between notes. Some occasional effects (staccato, fermata, tremolo ... ) can also alter the envelope. The library responsible for audio processing is named Dacapo, and it's also used on the backend to export WAV, MP3 and MIDI files (through Node.js). We are also implementing MIDI devices support with a real-time display on the score during live recording. When you press a key of your controller, the note is displayed on the score immediately after the release, and you can then adjust the duration on a simple piano roll (very convenient to choose between a triplet and three quarter notes for example). Another feature we are going to release in the coming weeks is the ability to connect Flat to an output MIDI port. For instance, it can be very convenient to connect it to a virtual MIDI port. Then, users can enjoy a high-quality sound by connecting to a virtual instrument (like Kontakt) and enhance the unique Flat experience. The key feature of Flat, is the ability to collaborate with several people on the same music sheet (a bit like Google Docs) so I will show that feature with some of my friends in France. I will conclude by explaining what is coming next, like an audio recognition algorithm and physical modeling for instruments, though we need to wait for the Audio Worker to be released for such applications.},
  address = {Atlanta, GA, USA},
  author = {Gurtner, Corentin},
  booktitle = {Proceedings of the International Web Audio Conference},
  editor = {Freeman, Jason and Lerch, Alexander and Paradis, Matthew},
  month = {April},
  pages = {},
  publisher = {Georgia Tech},
  series = {WAC '16},
  title = {Applications of Audio and MIDI API Within a music notation editor},
  year = {2016},
  ISSN = {2663-5844}
}