Right, the way I did that was I first transcribed the MIDI data in Ardour using a combination of the sheet music and the waveform of the Vox vocal. Then I used a Python script (included in the .love) to record the MIDI data to a CSV of MIDI note values plus start and end times in seconds (since Ardour didn't want to export it with BPM information), and converted that CSV into a Lua table I could import in Love2D.

To sync it, I start a timer at the same moment I start playing the music, and I subtract the timer value from each note's start and end times. When rendering, I have a pixels_per_second conversion factor, which I also use for collision detection and for culling notes that are off-screen.
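Here's a rough, untested sketch of what that sync/rendering loop can look like in Love2D. The file name (notes.lua), field names (pitch/start/stop), pixels_per_second value, and hit_x are my own placeholders, not the names used in the actual .love:

-- Sketch only: assumes notes.lua returns a table of entries like
--   { pitch = 60, start = 1.25, stop = 1.75 }
-- generated from the CSV described above.

local notes = require("notes")        -- the converted Lua table
local pixels_per_second = 200         -- time-to-pixels conversion factor
local hit_x = 100                     -- x position where a note lines up with "now"
local music, song_start

function love.load()
    music = love.audio.newSource("song.ogg", "stream")
    music:play()
    song_start = love.timer.getTime() -- timer started alongside the music
end

function love.draw()
    local t = love.timer.getTime() - song_start   -- seconds since the song started
    local w = love.graphics.getWidth()
    for _, n in ipairs(notes) do
        -- subtract elapsed time from the note's start/end, then convert to pixels
        local x1 = hit_x + (n.start - t) * pixels_per_second
        local x2 = hit_x + (n.stop  - t) * pixels_per_second
        -- cull notes that are entirely off-screen
        if x2 >= 0 and x1 <= w then
            local y = (127 - n.pitch) * 6          -- crude pitch-to-row mapping
            love.graphics.rectangle("fill", x1, y, x2 - x1, 5)
        end
    end
end

Keying everything off that one elapsed-time value keeps the drawing, collision checks, and culling consistent with the audio regardless of frame rate.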

If you want to look at the code, everything's in the .love, which is basically a zip file of all the code, assets, and whatnot that the Love2D interpreter can read.