Musical Instrument Digital Interface (MIDI)

Note names and MIDI note numbers

Musical Instrument Digital Interface, or MIDI, is an industry-standard electronic communications protocol that precisely and concisely defines each musical note in an electronic musical instrument, such as a synthesizer, allowing electronic musical instruments and computers to exchange data, or “talk”, with each other. MIDI does not transmit audio – it simply transmits digital information about a music performance.

The MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) is an industry standard ratified by the MIDI Manufacturers Association in 1991 which allows all types of media control devices to talk with each other and with computers to perform show control functions in live and canned entertainment applications. Just like musical MIDI (above), MSC does not transmit the actual show media — it simply transmits digital information about a multimedia performance.

Almost all music recordings today utilize MIDI as a key enabling technology for recording music. In addition, MIDI is used to control hardware, including recording devices, as well as live performance equipment such as stage lights and effects pedals. MIDI has also been adopted by mobile phones for playing back ringtones, and it provides the game music in some video games.

The MIDI standard was first proposed by Dave Smith in 1981 in a paper to the Audio Engineering Society and the MIDI Specification 1.0 was published in August 1983.

MIDI allows computers, synthesizers, sound cards, samplers and drum machines to control one another, and to exchange system information. Though modern computer sound cards are MIDI-compatible and capable of creating realistic instrument sounds, the fact that sound cards’ MIDI synthesizers have historically produced sounds of dubious quality has tarnished the image of a computer as a MIDI instrument. In fact, the MIDI specification itself has nothing to do with the quality of sound produced – this varies depending on the quality of sound card and/or samples used.

MIDI is almost directly responsible for bringing an end to the “wall of synthesizers” phenomenon in 1970s-80s rock music concerts, when keyboard instrument performers were sometimes hidden behind banks of various instruments. Following the advent of MIDI, many synthesizers were released in rack-mount versions, enabling performers to control multiple instruments from a single keyboard. Another important effect of MIDI has been the development of hardware and computer-based sequencers, which can be used to record, edit and play back performances.

Synchronization of MIDI sequences is made possible by MIDI timecode, an implementation of the SMPTE time code standard using MIDI messages. MIDI timecode has become the standard for digital music synchronization.

A number of music file formats have been based on the MIDI bytestream. These formats are very compact; often a file of only 10 kilobytes can produce a full minute of music.

How MIDI works in a nutshell

How musical MIDI works

When a note is played on a MIDI-aware instrument, it transmits MIDI messages. A typical MIDI message sequence corresponding to a key being struck and released on a keyboard is:


  1. the user presses the middle C key with a specific velocity or pressure (usually translated into the volume of the note, but the synthesizer can also use it to set the timbre)
  2. the user changes the force with which the key is held down (optional; can be repeated)
  3. the user releases the middle C key, again with the possibility of a release velocity controlling some parameters
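The sequence above can be sketched as raw MIDI bytes. This is a minimal illustration, assuming channel 1 (encoded as 0), middle C as note number 60, and example velocity values; it is not taken from any particular device.

```python
def note_on(channel, note, velocity):
    """Build a Note On message: status byte 0x90 | channel, then note and velocity."""
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note, velocity):
    """Build a Note Off message: status byte 0x80 | channel, then note and release velocity."""
    return bytes([0x80 | channel, note, velocity])

# 1. middle C (note 60) pressed at velocity 100
press = note_on(0, 60, 100)
# 3. middle C released, release velocity 64
release = note_off(0, 60, 64)
```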

Other performance parameters would also be transmitted. For example, if the pitch wheel were being turned, that information would also be transmitted using different MIDI messages. The musical instrument does this completely autonomously, requiring only that the musician play a note (or do something else that generates MIDI messages).

All notes that a musical instrument is capable of playing are assigned specific MIDI messages according to what the note and octave are. For example, the Middle C note played on any MIDI compatible musical instrument will always transmit the same MIDI message from its ‘MIDI Out’ port. Which MIDI message and thus which binary digits will be transmitted upon playing of a certain note are defined in the MIDI specification and this comprises the core of the MIDI standard.

All MIDI compatible instruments follow the MIDI specification and thus transmit identical MIDI messages for identical MIDI events such as the playing of a certain note on the musical instrument. Since they follow a published standard, all MIDI instruments can communicate with and understand each other, as well as with computers which have been programmed to understand MIDI messages using MIDI-aware software. The MIDI interface converts the current fluctuations transmitted by a MIDI musical instrument into binary numbers that the receiving musical instrument or computer can process. All MIDI compatible instruments have a built-in MIDI interface. In addition, computer sound cards usually have a built-in MIDI interface – if not, it can be separately purchased as a card and easily installed.

How MIDI Show Control works

When any cue is called by a user (typically a Stage Manager) and/or preprogrammed timeline in a show control software application, the show controller transmits one or more Real Time System Exclusive messages from its ‘MIDI Out’ port. A typical MSC message sequence is:


  1. the user just called a cue
  2. the cue is for lighting device 3
  3. the cue is number 45.8
  4. the cue is in cue list 7
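A sketch of how such a message might be assembled as bytes, assuming the general MSC System Exclusive layout (F0 7F &lt;device&gt; 02 &lt;command format&gt; &lt;command&gt; &lt;data&gt; F7), with Lighting as command format 0x01, GO as command 0x01, and cue/list numbers travelling as ASCII digits separated by a 0x00 delimiter. The exact byte values here are assumptions for illustration, not a substitute for the MSC specification:

```python
def msc_go(device_id, cue, cue_list):
    """Assemble an MSC 'GO' message for a lighting device (sketch)."""
    data = cue.encode("ascii") + b"\x00" + cue_list.encode("ascii")
    #        SysEx start, Real Time, device, MSC, Lighting, GO
    return bytes([0xF0, 0x7F, device_id, 0x02, 0x01, 0x01]) + data + bytes([0xF7])

# the four items above: GO, lighting device 3, cue 45.8, cue list 7
message = msc_go(3, "45.8", "7")
```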

The MIDI specification

Electrical connections

MIDI ports and cable

The MIDI standard consists of a communications messaging protocol designed for use with musical instruments, as well as a physical interface standard. Physically it consists of a one-way (simplex) digital current loop serial communications electrical connection signaling at 31,250 bits per second.

Only one end of the loop is referenced to ground, with the other end ‘floating’, to prevent ground loops from producing analog audio interference and hum. The current loop on the transmitter side drives the LED of an opto-coupler on the receiver side. This means the devices are in fact optically isolated. The opto-coupler must be a high-speed type. As most opto-couplers have asymmetrical positive-going and negative-going slew rates, they slightly alter the signal’s duty cycle. If several MIDI devices are connected in series by daisy-chaining the MIDI THRU to the next device’s MIDI-IN, the signal gets more and more distorted (until receive errors happen because the positive or negative pulses get too narrow).

MIDI connectors use standard 5-pin DIN connectors which at one time were a de facto European standard for audio interconnection. Over time the simpler American RCA phono jack has replaced DIN in this application, leaving MIDI as the only place DIN is commonly encountered in modern equipment. Standard size DIN connectors were also used for computer keyboard connections from the early 80s through the late 90s and have generally been replaced by mini-DIN connectors.

Many computer sound cards have a 15-pin D-Sub game port that can be used for MIDI IN/OUT or joystick connection. The MIDI specification very conservatively states that the maximum distance over which MIDI can be transmitted is 15 m (50 feet), but it can normally go much further.

There is a USB connection standard, and a standard for MIDI over Ethernet and the Internet, called RTP MIDI, is being developed by the IETF.

Most MIDI capable instruments feature a MIDI IN, MIDI OUT, and occasionally a MIDI THRU connection in the form of five-pin DIN plugs. In order to build a two-way physical connection between two devices, a pair of cables must be used. The MIDI-THRU jack simply echoes the signal entering the device at MIDI-IN. This makes it possible to control several devices from a single source.

The 1985 Atari ST was the first home computer to sport the original five-pin format — which made the ST a very popular platform for running MIDI sequencer software. Most PC soundcards from the late 1990s had the ability to terminate a MIDI connection (usually through a MIDI IN/MIDI OUT converter on the game port). The game port has been supplanted in the modern PC by USB devices, and so typically a PC owner will need to purchase a MIDI interface that attaches to the USB or FireWire port of their machine to use MIDI.

Message format

Each one-way connection (called a port) can transmit or receive standard musical messages, such as note-on, note-off, controllers (which include volume, pedal, modulation signals, etc.), pitch-bend, program change, aftertouch, channel pressure, and system-related messages. These signals are sent along with one of 16 channel identifiers. The channels are used to separate “voices” or “instruments”, somewhat like tracks in a multi-track mixer.

The ability to multiplex 16 “channels” onto a single wire makes it possible to control several instruments at once using a single MIDI connection. When a MIDI instrument is capable of producing several independent sounds simultaneously (a multitimbral instrument), MIDI channels are used to address these sections independently. (This should not be confused with “polyphonic”: the ability to play several notes simultaneously in the same “voice”.)

Note messages can represent any note from C−1 (i.e. five octaves below middle C, or 8.175 Hz in common Western musical tuning; designated as MIDI note 0) to G9 (i.e. five octaves above the G above middle C, or 12,544 Hz; designated as MIDI note 127) with precision down to the semitone.
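The endpoints of that range can be checked with the standard equal-temperament formula, which maps MIDI note 69 (the A above middle C) to 440 Hz:

```python
def note_to_freq(note):
    """Frequency in Hz of a MIDI note number, assuming A4 = 440 Hz equal temperament."""
    return 440.0 * 2.0 ** ((note - 69) / 12)

note_to_freq(0)    # ~8.175 Hz, the lowest MIDI note
note_to_freq(127)  # ~12543.85 Hz, the highest MIDI note
```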

Pitch-bend messages range over ±2 semitones by default (sometimes adjustable with Registered Parameter Numbers), with 16,384 steps of resolution across the bend range (about 1/4096 of a semitone for the default range). (The human ear cannot hear the difference between adjacent pure tones that differ by less than 1/20 semitone.)
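Pitch-bend data arrives as two 7-bit data bytes forming a 14-bit value (0–16383, centred at 8192). A sketch of decoding it into semitones, assuming the default ±2-semitone range:

```python
def bend_to_semitones(lsb, msb, bend_range=2.0):
    """Decode a pitch-bend message's two data bytes into a semitone offset."""
    value = (msb << 7) | lsb            # combine into a 14-bit value, 0-16383
    return (value - 8192) / 8192 * bend_range

bend_to_semitones(0x00, 0x40)  # centre position -> 0.0 semitones
```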

Continuous Controllers are quite versatile; they can usually be controlled by a musician using knobs, sliders, footswitches, or pressure on the instrument. They can be used to change the tone, timbre, or volume of a sound; move motorized faders; and even dim lights.

Program change messages are sent to an instrument on a particular channel to instruct it to recall another patch, or program. The MIDI protocol uses 7 bits for this message, supporting only 128 programs. Many devices more modern than the MIDI specification store far more than 128 programs. To overcome the limitation, a bank-switching method was added to the spec: each bank of 128 programs is selected with Bank Select controller messages, enabling access to thousands of programs.

Aftertouch messages (also known as poly pressure messages) are sent by some instruments to indicate pressure changes on a note while it is being played. Similarly, channel pressure changes the pressure for the entire instrument, not just one note. Channel pressure messages are more commonly implemented in most synthesizers, while the individual pressure sensors that aftertouch messages require are reserved mainly for expensive, high-end synthesizers.

Manufacturer’s system exclusive messages (also known as Manufacturer SysEx, Manuf Sysx, etc.) are defined by the manufacturer of the sequencer/synthesizer and can be any length. These messages are commonly used to send non-MIDI data over a MIDI connection, such as a synthesizer’s instrument samples or settings, or a sequencer’s memory dump. Because they are defined by the device’s manufacturer, they are mainly used for backup purposes and are rarely (if ever) useful in another MIDI device.

Real Time System Exclusive messages include the significant MIDI Show Control extension which enables all types of entertainment equipment to easily communicate with each other through the process of show control.

System messages contain meta-information about other MIDI messages. A sequencer, for example, often sends MIDI clock messages during playback that correspond to the MIDI timecode, so the device receiving the messages (usually a synthesizer) will be able to keep time. Also, some devices will send Active Sense messages, used only to keep the connection between the sender and the receiver alive after all MIDI communication has ceased.

MIDI can also provide facilities for playing in nonstandard musical tunings. However, apart from using pitch-bend to control each note, these features are not standardized across all instruments.

General MIDI

In MIDI, instruments (one per channel) are selected by number (0-127), using the Program Change message. However, the basic MIDI 1.0 specification did not specify what instrument sound (piano, tuba, etc.) corresponds to each number. This was intentional, as MIDI originated as a professional music protocol, and in that context it is typical for a performer to assemble a custom palette of instruments appropriate for their particular repertoire, rather than taking a least-common-denominator approach.

Eventually interest developed in adapting MIDI as a consumer format, and for computer multimedia applications. In this context, in order for MIDI file content to be portable, the instrument programme numbers used must call up the same instrument sound on every player. General MIDI (GM) was an attempt by the MIDI Manufacturers’ Association (MMA) to resolve this problem by standardising an instrument programme number map, so that for example Program Change 1 always results in a piano sound on all GM-compliant players. GM also specified the response to certain other MIDI messages in a more controlled manner than the MIDI 1.0 specification. The GM spec is maintained and published by the MMA.

From a musical perspective, GM has a mixed reputation, mainly because of small or large audible differences in corresponding instrument sounds across player implementations, the limited size of the instrument palette (128 instruments), its least-common denominator character, and the inability to add customised instruments to suit the needs of the particular piece. Yet the GM instrument set is still included in most MIDI instruments, and from a standardisation perspective GM has proven durable.

Later, companies in Japan’s Association of Musical Electronics Industry (AMEI) developed General MIDI Level 2 (GM2), incorporating aspects of the Yamaha and Roland GS formats, extending the instrument palette, specifying more message responses in detail, and defining new messages for custom tuning scales and more. The GM2 specs are maintained and published by the MMA and AMEI.

Later still, GM2 became the basis of Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications where different players may have different numbers of musical voices. SP-MIDI is a component of the 3GPP mobile phone terminal multimedia architecture, starting from release 5.

GM, GM2, and SP-MIDI are also the basis for selecting player-provided instruments in several of the MMA/AMEI XMF file formats (XMF Type 0, Type 1, and Mobile XMF), which allow extending the instrument palette with custom instruments in the Downloadable Sound (DLS) formats, addressing another major GM shortcoming.

Low bandwidth

MIDI messages are extremely compact, due to the low bandwidth of the connection, and the need for near real-time accuracy. Most messages consist of a status byte (channel number in the low 4 bits, and an opcode in the high 4 bits), followed by one or two data bytes. However, the serial nature of MIDI messages means that long strings of MIDI messages take an appreciable time to send, and many people can hear those delays, especially when dealing with dense musical information or when many channels are particularly active. “Running status” is a convention that allows the status byte to be omitted if it would be the same as that of the previous message, helping to mitigate bandwidth issues somewhat.
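The running-status convention can be sketched as follows: when consecutive messages share the same status byte, the status byte is transmitted only once.

```python
def encode_running_status(messages):
    """Encode (status, data1, data2) tuples, omitting repeated status bytes."""
    out = bytearray()
    last_status = None
    for status, d1, d2 in messages:
        if status != last_status:       # only emit the status byte when it changes
            out.append(status)
            last_status = status
        out += bytes([d1, d2])
    return bytes(out)

# Three note-ons on channel 1: 7 bytes instead of 9.
encode_running_status([(0x90, 60, 100), (0x90, 64, 100), (0x90, 67, 100)])
```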

MIDI file formats

MIDI messages (along with timing information) can be collected and stored in a computer file system, in what is commonly called a MIDI file, or more formally, a Standard MIDI File (SMF). The SMF specification was developed by, and is maintained by, the MIDI Manufacturers Association (MMA). MIDI files are typically created using desktop/laptop computer-based sequencing software (or sometimes a hardware-based MIDI instrument or workstation) that organizes MIDI messages into one or more parallel “tracks” for independent recording and editing. In most but not all sequencers, each track is assigned to a specific MIDI channel and/or a specific General MIDI instrument patch. Although most current MIDI sequencer software uses proprietary “session file” formats rather than SMF, almost all sequencers provide export or “Save As…” support for the SMF format.

An SMF consists of one header chunk and one or more track chunks. There are three SMF formats; the format is encoded in the file header. Format 0 contains a single track and represents a single song performance. Format 1 may contain any number of tracks, enabling preservation of the sequencer track structure, and also represents a single song performance. Format 2 may have any number of tracks, each representing a separate song performance. Format 2 is not commonly supported by sequencers.
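A sketch of reading the SMF header chunk: the ASCII tag “MThd”, a 4-byte big-endian length (always 6), then three 16-bit big-endian fields holding the format (0, 1, or 2), the number of tracks, and the timing division.

```python
import struct

def parse_smf_header(data):
    """Parse the 14-byte header chunk of a Standard MIDI File."""
    tag, length = data[:4], struct.unpack(">I", data[4:8])[0]
    assert tag == b"MThd" and length == 6, "not a Standard MIDI File header"
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return fmt, ntracks, division

# Format 1, 2 tracks, 480 ticks per quarter note:
parse_smf_header(b"MThd" + struct.pack(">IHHH", 6, 1, 2, 480))
```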

Large collections of SMFs can be found on the web, most commonly with the extension .mid. These files are most frequently authored with the assumption that they will be played on General MIDI players, but not always; when that assumption fails, playback can sound unintentionally bad.

MIDI-Karaoke (which uses the “.kar” file extension) files are an “unofficial” extension of MIDI files, used to add synchronized lyrics to standard MIDI files. Most SMF players do not display these lyrics; however, numerous .kar-specific players are available. These often display the lyrics synchronized with the music in “follow-the-bouncing-ball” fashion, essentially turning any PC into a Karaoke machine.

Note: “.kar” files can often be played by SMF players if the filename extension is changed to “.mid”.

The MIDI Manufacturers’ Association has now defined a new family of file formats, XMF (eXtensible Music File), some of which package SMF chunks with instrument data in DLS format (Downloadable Sounds, also an MMA specification), to much the same effect as MOD files. The XMF container is a binary format (not XML-based).

RMI File

On Microsoft Windows, the system itself uses a RIFF-based MIDI file format with the .rmi extension.

MIDI usage and applications

Extensions of the MIDI standard

USB, FireWire and ethernet embeddings of MIDI are now commonly available, and in the long run the proposed MIDI over ethernet and internet standard called RTP MIDI, being developed by the IETF, is likely to replace the old current loop implementation of MIDI, as well as providing the high-bandwidth channel that ZIPI was intended to provide.

In 1992 the MIDI Tuning Standard, or MTS, was ratified by the MIDI Manufacturers’ Association. While support for this standard is not widespread, it is supported by some instruments and software; in particular, the free software program Timidity supports it. MTS uses three bytes, which can be thought of as a three-digit number in base 128, to specify a pitch in logarithmic form. Use of MTS allows any MIDI file to be tuned in any way desired, something which can be accomplished using the freeware program Scala and other microtuners.
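A sketch of decoding the three MTS bytes, assuming the first byte is a semitone (a MIDI note number) and the next two form a 14-bit fraction of a semitone, together giving the logarithmic pitch value described above:

```python
def mts_to_freq(semitone, frac_msb, frac_lsb):
    """Decode a three-byte MTS pitch into a frequency, assuming A4 = 440 Hz."""
    note = semitone + ((frac_msb << 7) | frac_lsb) / 16384.0  # note + 14-bit fraction
    return 440.0 * 2.0 ** ((note - 69) / 12)

mts_to_freq(69, 0, 0)  # exactly 440 Hz (A above middle C, no fractional offset)
```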

Beyond MIDI

Although traditional MIDI connections work well for most purposes, in 1994 a new high-bandwidth standard, ZIPI, sponsored by UC Berkeley’s CNMAT, was proposed to replace MIDI with an incompatible but very similar message system. It was not based on channels, instead shifting control to individual notes on each device. Channel messages were emulated by grouping note messages. The protocol failed to gain more than limited acceptance, primarily due to lack of demand.

The Open Sound Control or OSC protocol, developed by the same team, transcends some of MIDI’s musical coding limitations, and is considered by some to be technically superior. OSC has been implemented in software like SuperCollider, Pure Data, Isadora, Max/MSP, Csound, and ChucK, however at present few mainstream musical applications and no standalone instruments support OSC, making whole-studio interoperability problematic. It can run over ethernet connections. OSC is not owned by any private company, however it is also not maintained by any standards organization.

Yamaha has its mLAN protocol, which is based on FireWire and carries multiple MIDI message channels and multiple audio channels. mLAN is a proprietary protocol open for licensing.

A proposal for High-Definition MIDI (HD-MIDI™) extension is now being discussed by members of the MMA. This major update to MIDI would provide greater resolution in data values, increase the number of MIDI Channels, and support the creation of entirely new kinds of MIDI messages. This work involves representatives from all sizes and types of companies, from the smallest specialty show control operations to the largest musical equipment manufacturers.

Other applications of MIDI

MIDI can also be used for applications other than music:


  • show control
  • theatre lighting
  • special effects
  • sound design
  • recording system synchronization
  • audio processor control
  • computer networking, as demonstrated by the early first-person shooter game MIDI Maze, 1987

Any device built with a standard MIDI-OUT port should (in theory) be able to control any other device with a MIDI-IN port, provided that the developers of both devices have the same understanding of the semantic meaning of all the transmitted MIDI messages. This agreement can come either because both follow the published MIDI specifications, or, for non-standard functionality, because the message meanings are agreed upon by both manufacturers.

The term “MIDI controller” is used in two senses. In one sense, a controller is hardware or software which generates and transmits MIDI data to MIDI-enabled devices. In the other, more technical sense, a MIDI controller is an abstraction of the hardware used to control a performance, but which is not directly related to note-on/note-off events. For example, a slider assigned to open and close a low-pass filter on a synthesizer may be assigned to controller 18. Changes in the position of the slider are transmitted along with “18” so that they are distinguished from changes in the value of other controllers.

MIDI controllers which are hardware and software

The following are classes of MIDI controller:


  • The human interface component of a traditional instrument redesigned as a MIDI input device. The most common type of device in this class is the keyboard controller. Such a device provides a musical keyboard and perhaps other actuators (pitch bend and modulation wheels, for example) but produces no sound on its own. It is intended only to drive other MIDI devices. Percussion controllers such as the Roland Octapad fall into this class, as do guitar-like controllers such as the SynthAxe and a variety of wind controllers.


  • Electronic musical instruments, including synthesizers, samplers, drum machines, and electronic drums, which are used to perform music in real time and are inherently able to transmit a MIDI data stream of the performance.


  • Pitch-to-MIDI converters including guitar/synthesizers analyze a pitch and convert it into a MIDI signal. There are several devices which do this for the human voice and for monophonic instruments such as flutes, for example.


  • Traditional instruments, such as drums, pianos, and accordions, which are outfitted with sensors and a computer which accepts input from the sensors and transmits real-time performance information as MIDI data.


  • Sequencers, which store and retrieve MIDI data and send the data to MIDI enabled instruments in order to reproduce a performance.


  • MIDI Machine Control (MMC) devices such as recording equipment, which transmit messages to aid in the synchronization of MIDI-enabled devices. For example, a recorder may have a feature to index a recording by measure and beat. The sequencer that it controls would stay synchronized with it as the recorder’s transport controls are pushed and corresponding MIDI messages transmitted.

MIDI controllers in the data stream

Modifiers such as modulation wheels, pitch bend wheels, sustain pedals, pitch sliders, buttons, knobs, faders, switches, ribbon controllers, etc., alter an instrument’s state of operation, and thus can be used to modify sounds or other parameters of music performance in real time via MIDI connections. The 128 virtual MIDI controllers and their electronic messages connect the actual buttons, knobs, wheels, sliders, etc. with their intended actions within the receiving device.

Some controllers, such as pitch bend, are special. Whereas the data range of most continuous controllers (such as volume, for example) consists of 128 steps ranging in value from 0 to 127, pitch bend data may be encoded with over 16,000 data steps. This produces the illusion of a continuously sliding pitch, as in a violin’s portamento, rather than a series of zippered steps, as when a guitarist slides a finger up the frets of the guitar’s neck. Thus, the pitch wheel on a MIDI keyboard may generate large amounts of data, which can lead to a slowdown of data throughput. Many sequencers can “thin” pitch-bend or other continuous controller data by keeping only a set number of messages per second or keeping only messages that change the controller by at least a certain amount.
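The second thinning strategy mentioned above can be sketched as follows: keep a controller message only if it moves the value by at least a minimum step from the last message that was kept.

```python
def thin(values, min_step=64):
    """Keep only controller values that differ from the last kept value by >= min_step."""
    kept = [values[0]]
    for v in values[1:]:
        if abs(v - kept[-1]) >= min_step:
            kept.append(v)
    return kept

thin([0, 10, 80, 90, 200, 210], min_step=64)  # -> [0, 80, 200]
```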

The original MIDI spec included 128 virtual controller numbers for real time modifications to live instruments or their audio. MIDI Show Control (MSC) and MIDI Machine Control (MMC) are two separate extensions of the original MIDI spec, expanding the MIDI protocol to become far more than its original intent.





This article is licensed under the GNU Free Documentation License
It uses material from the Wikipedia article – Musical Instrument Digital Interface