Accidental MIDI
Over the last two to three years, I’ve really gotten into digital and electronic music production. I went all in and bought myself some hardware and software for the PC to dabble with composition, synthesis, and the like. Now, I wouldn’t call myself a musician — I can barely play any instrument — but I’ve been having a lot of fun with it. It’s been awesome.
One of the things I find fascinating about electronic music is the MIDI protocol. If you're not familiar, MIDI is a simple serial protocol that's been around for decades. It lets the different pieces of hardware and software in a music setup talk to each other: keyboards, synthesizers, DAWs, and so on. You plug a cable between devices, and each device's MIDI port acts as either an input or an output. For example, a keyboard might output a stream of MIDI events, and a sound generator (a device built to synthesize sound) can take that stream and produce audio. A keyboard like that doesn't generate any sound on its own; it just sends MIDI data when you press keys, which a synthesizer, a PC, a Mac, or even an iPad or phone can then interpret to sound like a piano, an electric guitar, or anything else.
In its simplest form, MIDI is a stream of time-coded events. The most common event is a note, which has a value (ranging from 0 to 127) and a velocity (which represents how hard you pressed the key, also from 0 to 127). This system works whether you’re playing a drum kit, a keyboard, or even generating notes procedurally.
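To make that concrete: a note-on message is really just three bytes on the wire, a status byte, a note number, and a velocity. Here's a minimal sketch; the channel, note, and velocity values are just examples.

```csharp
// A MIDI note-on message is three bytes: a status byte (0x90 | channel for
// note-on), the note number, and the velocity. Values here are illustrative.
byte channel  = 0;    // MIDI channel 1 (channels are 0-15 on the wire)
byte note     = 60;   // middle C
byte velocity = 100;  // how hard the key was "pressed" (0-127)

byte[] noteOn  = { (byte)(0x90 | channel), note, velocity };

// The matching note-off (0x80 | channel) releases the note later.
byte[] noteOff = { (byte)(0x80 | channel), note, 0 };
```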
I’ve also always been kind of obsessed with video games that incorporate music in interesting ways, particularly when the game dynamically influences the music, or even better, generates it. There aren’t many games that do this. One older game that comes to mind is Everyday Shooter, originally on the PlayStation 3. It’s a dual-stick shooter with a handful of levels, each with its own mechanics and sound signature, and everything is synchronized with your actions: shooting a bullet plays a note, and explosions kick off a riff. The music felt like it was being composed as you played, which was magical, though the game wasn’t generating the music from whole cloth so much as assembling it from pre-designed, pre-composed elements.
On the flip side, if you try to generate music purely from random game events, it can end up sounding discordant, more like noise than music. I wanted to explore that space and see if I could create something that felt more musical. I was working on some prototypes, leading up to a larger project, and thought it would be fun to integrate this concept.
In my game prototype, I already had a sound manager that handled things like playing music, crossfading between tracks, and playing sound effects in Unity. Every sound went through a simple set of methods, which allowed me to hook in some logic for generating music. What I ended up doing was mapping in-game events and objects that triggered sound effects to MIDI events. In its earliest form, I mapped those events to MIDI notes and note velocities, though not yet to note duration or other parameters.
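To give a feel for the shape of this, here's a rough sketch of the hook point. None of these names (SoundManager, PlaySfx, MidiMapping, MidiOut) are the actual API from my project; the point is just that every sound effect already routes through one method, so emitting MIDI can piggyback on that call.

```csharp
using UnityEngine;

// Illustrative names only; MidiOut stands in for whatever MIDI output library is used.
public class MidiMapping
{
    public int Channel;   // 0-15
    public int Note;      // 0-127
    public int Velocity;  // 0-127
}

public static class MidiOut
{
    // Stub: a real version would write to a MIDI device or virtual port.
    public static void SendNoteOn(int channel, int note, int velocity) =>
        Debug.Log($"NoteOn ch:{channel} note:{note} vel:{velocity}");
}

public class SoundManager : MonoBehaviour
{
    [SerializeField] private AudioSource sfxSource;

    public void PlaySfx(AudioClip clip, MidiMapping mapping = null)
    {
        sfxSource.PlayOneShot(clip);

        // If this sound effect has a MIDI mapping, emit a note alongside it.
        if (mapping != null)
            MidiOut.SendNoteOn(mapping.Channel, mapping.Note, mapping.Velocity);
    }
}
```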
So, for example, I could map the note that plays when you fire a bullet to the amount of damage that bullet will do, or map a note to the speed of a bullet when it hits a surface. Essentially, I built a system where various game parameters drive the MIDI note, velocity, note duration, and other factors. The first version worked: it emitted a stream of MIDI events as you played. Hook that up to a synthesizer, and you can play whatever instrument you want against the game's actions. It wasn't exactly musical at first; it sounded more like what you'd get from handing a kindergarten class a box of old instruments. But it was functional.
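As a sketch of how such a mapping might look (the damage and speed ranges below are made-up numbers, not values from my game), most of it comes down to rescaling a game value into the 0-127 MIDI range:

```csharp
using UnityEngine;

public static class MidiMap
{
    // Rescale a game value from [min, max] into a MIDI range, clamped to 0-127.
    public static int ToMidi(float value, float min, float max,
                             int midiLow = 0, int midiHigh = 127)
    {
        float t = Mathf.InverseLerp(min, max, value);
        int result = Mathf.RoundToInt(Mathf.Lerp(midiLow, midiHigh, t));
        return Mathf.Clamp(result, 0, 127);
    }
}

// Usage, with made-up ranges:
//   int note     = MidiMap.ToMidi(bullet.Damage, 0f, 50f, 36, 72);  // pitch from damage
//   int velocity = MidiMap.ToMidi(impactSpeed,   0f, 20f, 40, 127); // velocity from impact speed
```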
Next, I made a couple of improvements. First, I quantized the events to a specific BPM. That means if the music is set to 120 BPM, any in-game sound is held back until the next step of the tempo grid instead of playing the instant the event fires. While this could theoretically introduce a bit of lag between action and sound, in practice, with reasonable frame rates and BPMs, it's negligible.
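A rough sketch of the quantization idea (the 16th-note grid and the use of Unity's DSP clock are assumptions for this example, not a definitive implementation): notes get queued and only flushed when the clock crosses the next grid line.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Sketch only: queue note events and flush them on a 16th-note grid derived
// from the BPM, using Unity's DSP clock so the grid doesn't drift with frame rate.
public class MidiQuantizer : MonoBehaviour
{
    public float bpm = 120f;
    public int stepsPerBeat = 4;   // 16th notes

    private readonly Queue<Action> pending = new Queue<Action>();
    private double nextStepTime;

    private void Start() => nextStepTime = AudioSettings.dspTime;

    // Instead of sending a note immediately, callers enqueue it here.
    public void Enqueue(Action sendNote) => pending.Enqueue(sendNote);

    private void Update()
    {
        double stepLength = 60.0 / bpm / stepsPerBeat;

        // Catch up over any grid lines crossed since the last frame.
        while (AudioSettings.dspTime >= nextStepTime)
        {
            while (pending.Count > 0)
                pending.Dequeue().Invoke();   // the delayed notes land on the grid
            nextStepTime += stepLength;
        }
    }
}
```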
The second improvement was allowing a range of notes to be specified for any parameter. For instance, if you wanted an enemy’s sound to play in a low register, you could define a range of notes for it to choose from. This also works great for percussion, where you might map specific events to individual drum sounds like a bass drum or snare.
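A range like that can be as simple as a low/high pair of MIDI notes per sound source (the register numbers below are only examples), and a range that's one note wide doubles as a fixed percussion mapping.

```csharp
using UnityEngine;

// Sketch: each kind of game object gets a range of MIDI notes it may use.
[System.Serializable]
public struct NoteRange
{
    public int low;    // lowest allowed MIDI note
    public int high;   // highest allowed MIDI note

    // Pick a note inside the range from a normalized 0-1 game parameter.
    public int Pick(float t) =>
        Mathf.Clamp(Mathf.RoundToInt(Mathf.Lerp(low, high, t)), 0, 127);
}

// Examples (values are illustrative):
//   var enemyRange = new NoteRange { low = 36, high = 48 };  // keep enemies in a low register
//   var kickDrum   = new NoteRange { low = 36, high = 36 };  // one fixed note, e.g. a GM kick on channel 10
```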
Finally, I constrained the notes to specific musical scales, so you could say, "I want everything in C major." This way, any note that doesn’t fit the scale will be shifted to the nearest appropriate note. It makes everything sound more cohesive, even though the player isn’t consciously thinking about it. You can also select different scales, like G minor or a blues scale, which can dramatically change the tone of the music generated.
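The snapping itself is simple: treat a scale as a set of pitch classes and nudge any out-of-scale note to the nearest one. Here's a sketch of that idea (breaking ties toward the lower note is an arbitrary choice for the example):

```csharp
using System.Linq;

// Sketch of scale snapping: a scale is a set of pitch classes, and any
// incoming note is moved to the nearest note whose pitch class is in the set.
public static class ScaleSnap
{
    // Pitch classes for C major: C D E F G A B. Transpose for other keys.
    public static readonly int[] CMajor = { 0, 2, 4, 5, 7, 9, 11 };

    public static int Snap(int note, int[] scale)
    {
        // Try offsets 0, ±1, ±2, ... and return the first candidate whose
        // pitch class is in the scale (ties resolve downward here).
        for (int offset = 0; offset <= 6; offset++)
        {
            if (scale.Contains(((note - offset) % 12 + 12) % 12)) return note - offset;
            if (scale.Contains(((note + offset) % 12 + 12) % 12)) return note + offset;
        }
        return note; // unreachable for any non-empty scale
    }
}

// ScaleSnap.Snap(61, ScaleSnap.CMajor) -> 60 (C#4 snaps down to C4)
```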
The result is a sound manager that I can drop into any prototype or game, which maps sound effects to MIDI events. This lets me feel out the music that naturally emerges from the gameplay. It’s been a lot of fun to experiment with, and I’m sure I’ll keep evolving it.
Looking forward, there are a few things I’d like to add. For example, I’m intrigued by the idea of a memory or feedback loop, where random sequences of notes generated by the game get stored in a buffer and can be recalled later. This would allow the game to create repeating melodies or themes based on prior gameplay events, making the music feel more structured and intentional.
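Nothing like this exists yet, but a minimal sketch of the idea might be a small buffer that records the most recent notes and can hand back a copy to be replayed as a motif:

```csharp
using System.Collections.Generic;

// Purely speculative sketch of the "memory" idea: keep the last N emitted
// notes and occasionally replay the stored phrase as a recurring theme.
public class MotifBuffer
{
    private readonly Queue<(int note, int velocity)> recent = new Queue<(int, int)>();
    private readonly int capacity;

    public MotifBuffer(int capacity = 32) => this.capacity = capacity;

    public void Record(int note, int velocity)
    {
        recent.Enqueue((note, velocity));
        while (recent.Count > capacity) recent.Dequeue();   // drop the oldest notes
    }

    // Hand back a copy of the stored phrase so a sequencer can replay it.
    public IReadOnlyList<(int note, int velocity)> Recall() =>
        new List<(int, int)>(recent);
}
```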