Tone.js Playback

tone.js
code
rhythmical

June 09, 2020

Before implementing more rhythmical features, it would be good to hear some results. This is where the Player comes in. The Player itself will be independent of rhythmical, but able to play flat note events (value, time, duration), as generated by rhythmical.
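For reference, here is a minimal sketch of what such a flat event looks like. The `NoteEvent` type name is just for illustration and not part of rhythmical's actual API:

```typescript
// hypothetical shape of a flat note event, as produced by rhythmical's renderer
type NoteEvent = {
  value: string; // note name or sample key, e.g. 'C3' or 'bd'
  time: number; // start time in seconds
  duration: number; // length in seconds
};

// four quarter notes over one second
const events: NoteEvent[] = [
  { value: 'E3', time: 0, duration: 0.25 },
  { value: 'F3', time: 0.25, duration: 0.25 },
  { value: 'G3', time: 0.5, duration: 0.25 },
  { value: 'A3', time: 0.75, duration: 0.25 },
];

// the total duration is the latest end time of any event
const totalDuration = Math.max(...events.map((e) => e.time + e.duration));
```

This latest-end-time calculation is also what the Player below uses as its default loop length.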

Note Event Playback

First, let's code a simple playback with Tone.js. Using Tone.Part, we can pass a flat event array directly:

import * as Tone from 'tone';
import canUseDOM from '../canUseDOM';
import { max } from 'd3-array';
const { PolySynth, Synth } = Tone;

const synth =
  canUseDOM() &&
  new PolySynth({
    maxPolyphony: 6,
    voice: Synth,
    options: {
      volume: -16,
      envelope: { attack: 0.01, decay: 2, sustain: 0, release: 0.1 },
      oscillator: { type: 'fmtriangle' },
    },
  }).toDestination();

export function playEvents(
  events: ValueChild<string>[],
  config: {
    duration?: number;
    instrument?: any;
  } = {}
) {
  let { instrument = synth, duration = max(events.map((e) => e.time + e.duration)) } = config;
  // play back with Tone.Part
  const part = new Tone.Part(
    (time, event) => instrument.triggerAttackRelease(event.value, event.duration, time),
    events // <- the events are used here
  ).start(0);
  part.loop = true;
  part.loopEnd = duration;
  Tone.Transport.start('+0.1');
  return part;
}

With it, we can then trigger playback:

playEvents(
  renderRhythmObject({
    parallel: [
      ['E3', 'F3', 'G3', 'A3'],
      ['C3', 'D3', 'E3', 'F3'],
    ],
  })
);

Custom Sample Playback

Instead of using a synth, we could use a sampler to play drums, for example:

export function rack(samples: { [key: string]: any }, options = {}) {
  options = { volume: -12, attack: 0.05, ...options };
  let players = new Tone.Players(samples, options);

  const s = {
    customSymbols: Object.keys(samples),
    triggerAttackRelease: (key, duration, time, velocity) => {
      if (!players.has(key)) {
        console.warn(`key ${key} not found for playback`);
        return;
      }
      const player = players.get(key);
      player.start(time);
      player.stop(time + duration);
    },
    connect: (dest) => {
      players.connect(dest);
      return s;
    },
    toDestination: () => {
      players.toDestination();
      return s;
    },
  };
  return s;
}

The triggerAttackRelease method acts as a Tone "API polyfill", so we can pass the rack directly to our playback function:

// create drums from samples
const drums =
  canUseDOM() &&
  rack({
    bd: require('./bd/BT0A0D0.wav'),
    sn: require('./sn/ST0T0S3.wav'),
    hh: require('./hh/000_hh3closedhh.wav'),
    cp: require('./cp/HANDCLP0.wav'),
    mt: require('./mt/MT0D3.wav'),
    ht: require('./ht/HT0D3.wav'),
    lt: require('./lt/LT0D3.wav'),
  }).toDestination();
// play
playEvents(
  renderRhythmObject({
    duration: 2,
    parallel: [
      ['hh', 'hh', 'hh', 'hh', 'hh', 'hh', 'hh', 'hh'],
      ['bd', ['sn', 'bd'], 'bd', 'sn'],
    ],
  }),
  { instrument: drums }
);

Using Multiple Instruments

To use multiple instruments without splitting the music into multiple rhythmical objects, we can add a feature that lets nested objects inherit the instrument property:
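As a rough standalone sketch of the idea, an object's own instrument wins over whatever it inherits from above, and everything below it receives the resolved value. The `inheritInstrument` name and the object walk here are assumptions for illustration; rhythmical's real inheritProperty plugin hooks into the renderer instead:

```typescript
// hypothetical sketch: push a parent's instrument down to nested child objects
// (rhythmical's actual inheritProperty plugin runs inside the renderer)
type Rhythm = string | Rhythm[] | { [key: string]: any };

function inheritInstrument(node: Rhythm, inherited?: string): Rhythm {
  if (typeof node === 'string') {
    // plain notes are resolved when the renderer emits flat events
    return node;
  }
  if (Array.isArray(node)) {
    return node.map((child) => inheritInstrument(child, inherited));
  }
  // an object's own instrument overrides what it inherits from above
  const instrument = (node.instrument as string) ?? inherited;
  const result: { [key: string]: any } = { ...node };
  if (instrument) result.instrument = instrument;
  for (const key of ['parallel', 'sequential']) {
    if (result[key]) result[key] = inheritInstrument(result[key], instrument);
  }
  return result;
}
```

With that in place, the drums subtree in the example below only needs to declare `instrument: 'drums'` once.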

playEvents(
  renderRhythmObject(
    {
      duration: 4,
      parallel: [
        [
          [
            ['Eb4', 'F4', 'r', 'G3'],
            ['r', 'r', 'r', 'G3'],
          ],
          [['r', 'Bb3'], 'G3', 'r', 'r'],
        ],
        [
          ['C3', 'C3', 'G2', 'r'],
          ['C3', 'C3', 'G3', 'r'],
        ],
        [
          {
            instrument: 'drums',
            parallel: [
              [
                ['hh', 'hh', 'hh', 'hh', 'hh', 'hh', 'hh', 'hh'],
                ['hh', 'hh', 'hh', 'hh', 'hh', 'hh', 'hh', ['hh', 'hh']],
              ],
              [
                ['bd', ['sn', 'bd'], 'bd', 'sn'],
                ['bd', ['sn', 'r', 'r', 'bd'], 'bd', 'sn'],
              ],
            ],
          },
        ],
      ],
    },
    [inheritProperty('instrument')]
  ),
  { instruments: { synth, drums } }
);

Anything that has a triggerAttackRelease(note, duration, time) method can be used as an instrument! If an event has no instrument set, the first instrument in the instruments object is used.
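That fallback rule can be sketched like this. The `resolveInstrument` helper is an assumption about how the playback might pick an instrument per event, not part of the code shown above:

```typescript
// hypothetical sketch: pick an instrument for an event from an instruments map
type Instruments = { [name: string]: { triggerAttackRelease: Function } };

function resolveInstrument(
  event: { value: string; instrument?: string },
  instruments: Instruments
) {
  if (event.instrument && instruments[event.instrument]) {
    // explicit (possibly inherited) choice
    return instruments[event.instrument];
  }
  // fall back to the first instrument in the object
  return instruments[Object.keys(instruments)[0]];
}
```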

Next Steps

  • Microtonal support for the Player
  • Be able to change events while playing => good for live coding or generative composition
  • Use rhythmical for parameter automation
  • Implement a playback queue => for seamless transitions
  • Implement a loop grid
  • Implement a step sequencer with dividable, resizable units

Felix Roos 2022