
From Drawing MIDI to Performing Music: Hammy Cinematic’s Expressive Workflow with AirMotion
Film and game composer Hammy Cinematic demonstrates a different way of composing orchestral music: using AirMotion to perform expression in real time instead of drawing MIDI curves with a mouse. Her workflow shows how breath-based control can turn MIDI programming into a natural, expressive performance.
Modern film and game composers rely heavily on MIDI programming to create realistic orchestral performances. For years, the standard workflow has been simple: write notes on the piano roll and shape the performance afterward by manually editing MIDI controllers such as dynamics, expression, or vibrato.
While this method works, it has one major limitation — it separates composition from performance.
Instead of playing musical expression, composers often end up drawing curves with a mouse, carefully editing dozens of automation lanes to simulate a live performance. The result can be technically correct, but it often lacks the natural phrasing and spontaneity that comes from real physical interaction with the music.
Film and game composer Hammy Cinematic, known for her cinematic composition tutorials on YouTube and Instagram, recently explored a different approach by integrating AirMotion into her composing workflow.
Playing Expression Instead of Drawing It
In a short video demonstration, Hammy shows how AirMotion allows composers to control musical expression through physical gestures.
Rather than manually editing MIDI curves, she performs them.
Dynamics, modulation, and expressive nuances are controlled through natural movements, allowing the composer to shape the music in real time while listening to the orchestration evolve.
This workflow fundamentally changes the relationship between the composer and the music. Instead of treating MIDI data as something to fix and refine afterward, expression becomes part of the performance itself.
The Limits of Traditional MIDI Editing
Most composers working with orchestral libraries are familiar with the traditional process:
- Writing notes in a piano roll
- Adjusting velocity values
- Drawing modulation curves
- Editing expression automation
- Manually shaping vibrato and dynamics
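To make the tedium concrete, here is a rough sketch of what "drawing" a single crescendo amounts to as data. This is not any specific DAW's format, just raw MIDI Control Change messages; CC1 (the mod wheel) is assumed to drive dynamics, as it does in many orchestral sample libraries, though mappings vary.

```python
def crescendo_cc_bytes(steps=16, controller=1, channel=0):
    """Generate raw MIDI Control Change messages for a linear crescendo.

    Each message is three bytes: status (0xB0 | channel), controller
    number, and a 7-bit value.  CC1 is assumed to be mapped to
    dynamics, a common but not universal convention.
    """
    messages = []
    for i in range(steps):
        value = round(i * 127 / (steps - 1))  # evenly spaced, 0..127
        messages.append(bytes([0xB0 | channel, controller, value]))
    return messages

ramp = crescendo_cc_bytes(steps=8)
print([msg[2] for msg in ramp])  # CC values climb from 0 to 127
```

And that is one gesture, on one controller lane; a realistic passage multiplies this across dozens of lanes and hundreds of breakpoints, which is exactly the editing burden described above.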
This process can become extremely time-consuming. Even worse, it often forces composers to think like editors rather than musicians.
Small expressive details, such as subtle crescendos, the natural breathing of a phrase, and organic dynamic swells, are difficult to reproduce convincingly with a mouse alone.
The result is music that may sound programmed rather than performed.
Bringing Physical Performance Back to MIDI
AirMotion introduces a different philosophy: perform the expression instead of programming it.
By wirelessly translating breath and motion into MIDI data, composers can interact with their music physically while recording their parts.
This allows expressive elements such as:
- dynamics
- phrasing
- modulation
- vibrato
- musical emphasis
to be performed naturally in real time.
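Conceptually, this kind of controller is a continuous sensor stream mapped onto a MIDI CC lane. AirMotion's actual mapping is not documented here, so the following is only a hypothetical sketch: a normalized breath-pressure reading (0.0 to 1.0) is lightly smoothed and converted into CC11 (expression) values, with the CC number and smoothing behavior both assumed.

```python
class BreathToCC:
    """Map a normalized breath-pressure reading (0.0-1.0) to MIDI CC bytes.

    Hypothetical sketch; AirMotion's real mapping is not public.  Simple
    exponential smoothing keeps the curve musical rather than jittery,
    and unchanged values are suppressed to avoid flooding the MIDI bus.
    """

    def __init__(self, controller=11, channel=0, smoothing=0.4):
        self.controller = controller   # CC11 = expression (assumed target)
        self.channel = channel
        self.smoothing = smoothing     # 0 = frozen, 1 = no smoothing
        self._level = 0.0
        self._last_sent = None

    def update(self, pressure):
        """Feed one sensor sample; return a 3-byte CC message or None."""
        pressure = min(max(pressure, 0.0), 1.0)      # clamp to valid range
        self._level += self.smoothing * (pressure - self._level)
        value = round(self._level * 127)             # scale to 7-bit MIDI
        if value == self._last_sent:
            return None                              # no change, no message
        self._last_sent = value
        return bytes([0xB0 | self.channel, self.controller, value])
```

In a real setup the returned bytes would go to a MIDI output port while the DAW records the lane, so the expression curve is captured in the same pass as the notes rather than drawn in afterward.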
The difference is subtle but powerful. Instead of spending hours tweaking MIDI curves, composers can capture musical expression in a single take — much like a live musician would.
By bringing physical expression back into the composing process, tools like AirMotion help bridge the gap between digital production and real musical performance.