The MusicKit is an object-oriented software system for building music, sound, signal processing, and MIDI applications. It has been used in commercial applications as diverse as music sequencers, computer games, and document processors, and it was the first system to unify the MIDI and Music V paradigms.
The NeXT MusicKit was the musical framework of the NeXT environment and the counterpart of the Sound Kit presented in section 2.3.3. Unlike the Sound Kit, however, the MusicKit is still maintained and available for several platforms (see [www-MusicKit, ]).
The NeXT MusicKit was first demonstrated at the 1988 NeXT product introduction and was bundled in NeXT software releases 1.0 and 2.0. Beginning with NeXT's 3.0 release, the MusicKit was no longer part of the standard NeXT software release but was supported and distributed as version 4.0 by the Center for Computer Research in Music and Acoustics (CCRMA) at Stanford University. Versions 5.0 to 5.4.1 were then supported by tomandandy music, which ported the system to several more popular operating systems. The source code is currently freely available in its entirety, with the exception of the NeXT hardware implementation of the low-level sound and DSP drivers (see [www-MusicKit, ]).
According to its original author, some of the most important features of the MusicKit are (see [www-JaffeMusicKit, ]):
In its first versions, the MusicKit generated sound by sending synthesis instructions to the NeXT DSP. In its current form, hardware synthesis has been replaced by software-based algorithms, and thanks to its architecture the MusicKit can implement virtually any synthesis strategy.
In the NeXT MusicKit, music is represented in a three-level hierarchy of Score, Part, and Note objects. A Score represents a musical composition, while a Part corresponds to a particular means of realization. Parts are time-sorted collections of Notes, each of which contains data that describes a musical event, and methods are provided for rapid insertion, deletion, and lookup of Notes.
A Note consists of a list of attribute-value pairs called parameters, a noteType, a noteTag, and a timeTag.
A parameter supplies a value for a particular attribute of a Note, such as its frequency or amplitude. A parameter value may be simple (an integer, real, or string) or it may be another object; the Note class provides methods for setting a parameter's value to an Envelope or a Wavetable object. How a parameter is interpreted depends on the Instrument that realizes the Note, where the Instrument class defines the protocol for all objects that realize Notes. In this sense parameters resemble object-oriented messages: their meaning depends on how the receiving object implements the corresponding method.
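To make the hierarchy concrete, the following sketch models Score, Part, and Note in Python. The class and method names mirror the text but are illustrative assumptions, not the MusicKit's actual Objective-C API.

    import bisect

    class Envelope:
        """Illustrative stand-in for the MusicKit's Envelope object."""
        def __init__(self, breakpoints):
            self.breakpoints = breakpoints  # e.g. [(time, value), ...]

    class Note:
        """A Note: parameters plus a noteType, a noteTag, and a timeTag."""
        def __init__(self, note_type, note_tag=None, time_tag=0.0, **parameters):
            self.note_type = note_type    # "noteDur", "noteOn", "noteOff", ...
            self.note_tag = note_tag      # arbitrary integer identifying a phrase
            self.time_tag = time_tag      # onset time in beats
            self.parameters = parameters  # attribute-value pairs

    class Part:
        """A time-sorted collection of Notes."""
        def __init__(self, name):
            self.name = name
            self.notes = []               # kept sorted by timeTag

        def add_note(self, note):         # rapid insertion via binary search
            bisect.insort(self.notes, note, key=lambda n: n.time_tag)

    class Score:
        """A musical composition: a collection of Parts."""
        def __init__(self):
            self.parts = []

        def add_part(self, part):
            self.parts.append(part)

    # A parameter value may be simple (here, freq) or an object (here, amp
    # as an Envelope); its interpretation is left to the Instrument.
    piano = Part("piano")
    piano.add_note(Note("noteDur", time_tag=0.0, freq=440.0, dur=0.5,
                        amp=Envelope([(0.0, 0.0), (0.1, 1.0), (0.5, 0.0)])))
    score = Score()
    score.add_part(piano)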
The noteType and noteTag are used together to help interpret a Note's parameters. There are five noteTypes: noteDur represents a note with a duration, noteOn establishes the beginning of a note, noteOff establishes its end, noteUpdate represents the middle of a note, and mute is general-purpose. A noteTag is an arbitrary integer used to identify different Notes as parts of the same musical phrase or note (a legato phrase, for example, can be created by sending a series of noteOns that all carry the same noteTag). In this way the MusicKit solves many of MIDI's note-identification problems, where a note can be identified only by its channel and key number.
A Note's timeTag, expressed in beats from the beginning of the performance, specifies when the Note is to be performed.
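As a sketch of how noteTags tie events together, the following continues the Python model above: a legato phrase is built from successive noteOns sharing one noteTag, so a receiving Instrument can treat them as rearticulations of a single sounding note. The tag value itself is arbitrary, as the text describes.

    PHRASE_TAG = 17  # any integer; its identity, not its value, is what matters

    legato = [
        Note("noteOn",     note_tag=PHRASE_TAG, time_tag=0.0, freq=440.0),
        Note("noteOn",     note_tag=PHRASE_TAG, time_tag=0.5, freq=494.0),
        Note("noteOn",     note_tag=PHRASE_TAG, time_tag=1.0, freq=523.0),
        Note("noteUpdate", note_tag=PHRASE_TAG, time_tag=1.2, amp=0.5),
        Note("noteOff",    note_tag=PHRASE_TAG, time_tag=1.5),
    ]
    for n in legato:
        piano.add_note(n)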
An entire score can be stored in a score file. Score files are in ASCII format and can contain any information that is in a Note. In addition, the MusicKit provides a language called ScoreFile that adds simple programming constructs such as variables, assignments, and arithmetic expressions. A score may also be stored in a MIDI file, and utilities are provided for converting to and from the standard MIDI file format.
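By way of illustration, a minimal score file might look roughly like the fragment below, with part declarations, an info statement naming a SynthPatch, and time statements interleaved with notes. The syntax is reproduced from memory of the ScoreFile language and should be checked against the MusicKit documentation rather than taken as authoritative.

    part p1;
    p1 synthPatch:"Pluck";
    BEGIN;
    t 0;
    p1 (0.5) freq:440 amp:0.1;
    t +0.5;
    p1 (0.5) freq:660 amp:0.1;
    END;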
During a MusicKit performance, Note objects are dispatched in time-sorted order to objects that realize them in some manner. This process involves instances of the classes Performer, Instrument, and Conductor. A Performer acquires Notes from a file or a Score, or generates them itself, and sends them to one or more Instruments. An Instrument receives Notes sent to it by one or more Performers and realizes them in some distinct manner. The Conductor acts as a scheduler, ensuring that Notes are transmitted from Performers to Instruments in time-sorted order at the right time. Both Performer and Instrument are abstract superclasses, and the MusicKit offers subclasses such as SynthInstrument, MidiOut, and ScoreRecorder.
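Continuing the same Python model, the sketch below illustrates these three roles. The scheduling is reduced to a priority queue over timeTags; the real Conductor dispatches in real time, so this is only a structural analogy, not the MusicKit's implementation.

    import heapq
    import itertools

    class Instrument:
        """Abstract superclass: realizes Notes in some manner."""
        def realize(self, note):
            raise NotImplementedError

    class PrintingInstrument(Instrument):
        """Toy Instrument that just prints each Note it receives."""
        def realize(self, note):
            print(f"t={note.time_tag:4}: {note.note_type} {note.parameters}")

    class Performer:
        """Acquires Notes (here, from a Part) and sends them to Instruments."""
        def __init__(self, part, instruments):
            self.part = part
            self.instruments = instruments

    class Conductor:
        """Scheduler: merges all Performers' Notes in time-sorted order."""
        def __init__(self, performers):
            self.performers = performers

        def perform(self):
            order = itertools.count()   # tie-breaker for equal timeTags
            queue = [(note.time_tag, next(order), note, performer)
                     for performer in self.performers
                     for note in performer.part.notes]
            heapq.heapify(queue)
            while queue:
                _, _, note, performer = heapq.heappop(queue)
                for instrument in performer.instruments:
                    instrument.realize(note)

    Conductor([Performer(piano, [PrintingInstrument()])]).perform()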
In order to generate sound from musical data, the MusicKit uses three main classes: SynthElement, SynthPatch, and SynthInstrument. SynthElements are the basic building blocks; they correspond either to code, through the UnitGenerator subclass, or to data, through the SynthData subclass. A SynthPatch is a configuration of SynthElements that defines a synthesis strategy, analogous to a voice or instrument setting on a conventional synthesizer. Finally, the SynthInstrument is a subclass of Instrument that realizes Notes by assigning them to particular SynthPatches.
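To round out the model, the toy additive patch below shows the relationships the text describes: a UnitGenerator as a code element, a SynthPatch as a configuration of such elements, and a SynthInstrument assigning Notes to a patch. It is a deliberately crude stand-in for the MusicKit's DSP-based implementation.

    import math

    class UnitGenerator:
        """SynthElement as code: here, a single sine oscillator."""
        def __init__(self, freq, amp, sample_rate=44100):
            self.freq, self.amp, self.sample_rate = freq, amp, sample_rate

        def samples(self, n):
            return [self.amp * math.sin(2 * math.pi * self.freq * i / self.sample_rate)
                    for i in range(n)]

    class SynthPatch:
        """A configuration of SynthElements: one 'voice' of the synthesizer."""
        def __init__(self):
            self.generators = []

        def note_on(self, note):
            # Map Note parameters onto a new unit generator.
            amp = note.parameters.get("amp", 0.1)
            if isinstance(amp, Envelope):   # object-valued parameter: take its peak
                amp = max(v for _, v in amp.breakpoints)
            self.generators.append(
                UnitGenerator(note.parameters.get("freq", 440.0), amp))

        def output(self, n):
            # Mix all active generators into one signal.
            return [sum(s) for s in zip(*(g.samples(n) for g in self.generators))]

    class SynthInstrument(Instrument):
        """Instrument subclass realizing Notes by assigning them to a SynthPatch."""
        def __init__(self):
            self.patch = SynthPatch()

        def realize(self, note):
            if note.note_type in ("noteOn", "noteDur"):
                self.patch.note_on(note)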