Aight, imma write an essay explaining how I did the audio. Whether you want it or not.

I start with waves. I draw each wave as a Unity animation curve, essentially sketching out the wave shape by hand (filling the same role a sine wave, triangle wave, or square wave plays in a synth).
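For anyone curious, here's a minimal C# sketch of that idea (the class and field names are placeholders, not what the game actually uses): the curve is assumed to hold one cycle of the wave over t = 0..1, and you sample it by phase.

using UnityEngine;

public class DrawnWave : MonoBehaviour
{
    // One cycle of the hand-drawn waveform over t = 0..1, edited in the inspector.
    public AnimationCurve waveShape = AnimationCurve.Linear(0f, -1f, 1f, 1f);

    // Sample the drawn wave at a phase in 0..1 (wraps around).
    public float Sample(float phase)
    {
        return waveShape.Evaluate(phase - Mathf.Floor(phase));
    }
}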

Then I move on to instruments. I hand-made several instruments; each one is a set of those waves played at various frequencies relative to the "note frequency", i.e. the pitch the instrument is being played at.
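A hedged sketch of what such an instrument could look like, assuming each layer is just a drawn wave plus a frequency ratio and an amplitude (again, the names and structure are made up, not the game's code):

using UnityEngine;

[System.Serializable]
public class Partial
{
    public AnimationCurve wave;       // hand-drawn wave shape for this layer
    public float frequencyRatio = 1f; // e.g. 2 = an octave above the note frequency
    public float amplitude = 0.5f;
}

public class Instrument
{
    public Partial[] partials;

    // Sum every layer at the given note frequency and time (in seconds).
    public float Sample(float noteFrequency, double time)
    {
        float sum = 0f;
        foreach (var p in partials)
        {
            float phase = (float)(time * noteFrequency * p.frequencyRatio % 1.0);
            sum += p.wave.Evaluate(phase) * p.amplitude;
        }
        return sum;
    }
}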

The next level is chords. A chord is a set of frequencies relative to a base frequency. These are used later by the bass and melody. I hand-make a selection of chords to be chosen from.
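In code, a chord can be nothing more than a list of frequency ratios applied to the current base frequency; the specific ratios below are illustrative placeholders, not the game's actual chords.

public static class ChordTable
{
    // Each chord is a set of frequency ratios relative to the base frequency.
    public static readonly float[][] Chords =
    {
        new float[] { 1f, 1.25f, 1.5f }, // roughly a major triad
        new float[] { 1f, 1.2f, 1.5f },  // roughly a minor triad
    };

    // Frequency of one tone of a chord.
    public static float ToneFrequency(float baseFrequency, int chordIndex, int toneIndex)
    {
        return baseFrequency * Chords[chordIndex][toneIndex];
    }
}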

A side note about the base frequency: every ~10 seconds I change it, which is like changing which key is being played on a piano. Each change is relative to the previous frequency and cannot exceed a certain range. It also doesn't need to line up with conventional tuning; these notes probably don't exist on most keyboards.
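One plausible way to implement that kind of drift is a random step relative to the last value, clamped to a range, on a ~10 second timer; the numbers below are guesses, not the game's settings.

using UnityEngine;

public class BaseFrequencyDrift : MonoBehaviour
{
    public float baseFrequency = 220f; // starting pitch; not tied to any tuning system
    public float maxStepRatio = 1.2f;  // each step stays within this ratio of the last value
    public float minFrequency = 110f;
    public float maxFrequency = 440f;
    public float interval = 10f;       // seconds between changes

    void Start()
    {
        InvokeRepeating(nameof(Shift), interval, interval);
    }

    void Shift()
    {
        // Step relative to the previous frequency, clamped to the allowed range.
        float next = baseFrequency * Random.Range(1f / maxStepRatio, maxStepRatio);
        baseFrequency = Mathf.Clamp(next, minFrequency, maxFrequency);
    }
}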

The bass and melody both work similarly in the backend, except the bass plays an octave down (half the frequency). Each is a hand-made selection of notes from the chord, played in a certain order and for certain lengths; essentially short clips of sheet music.
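A rough sketch of how such a phrase could be represented: each note picks a tone from the current chord plus a duration, and the bass simply halves the resulting frequency. The struct layout here is an assumption, not the actual implementation.

[System.Serializable]
public struct PhraseNote
{
    public int chordToneIndex; // which tone of the current chord to play
    public float beats;        // how long to hold it
}

public class Phrase
{
    public PhraseNote[] notes; // a short hand-written "clip of sheet music"

    // Frequency of one note; the bass plays the same phrase an octave down.
    public float NoteFrequency(PhraseNote note, float baseFrequency, float[] chord, bool isBass)
    {
        float f = baseFrequency * chord[note.chordToneIndex];
        return isBass ? f * 0.5f : f;
    }
}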

I then add effects on top, primarily a static effect and a modulation effect.
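The post doesn't say how those effects are built, but a common way to get a "static" and a "modulation" effect per sample would be mixing in a bit of white noise and applying a slow amplitude LFO (tremolo); treat this purely as an illustration.

using UnityEngine;

public static class Effects
{
    // "Static": mix a little white noise into the sample.
    public static float AddStatic(float sample, float amount, System.Random rng)
    {
        float noise = (float)(rng.NextDouble() * 2.0 - 1.0);
        return sample + noise * amount;
    }

    // "Modulation": slowly vary the amplitude with a low-frequency oscillator.
    public static float Modulate(float sample, double time, float rateHz, float depth)
    {
        float lfo = Mathf.Sin((float)(2.0 * Mathf.PI * rateHz * time));
        return sample * (1f - depth * 0.5f + depth * 0.5f * lfo);
    }
}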

Ideally, all of these things would be selected based on the scenario (whether you are looking at an enemy, how close it is, how quickly you are moving, etc.), but I wasn't able to implement most of that, so the actual responsiveness is limited.

If I understand correctly, there is a base note that the rest of the instruments/waves base their frequencies on. Is the base note changed at random, or is it based more on other factors like the scenarios you mentioned?

And the thing I was really wondering is: how are you creating the sound? I know it's based on the waves you draw in the Unity animation curve, but did you use a plugin to read those and interpret them as sound, or does something different happen? (I only know how to play back pre-made audio files in Unity; I don't know how to create that audio at runtime and customise it with animation curves.)


You seem to understand it correctly.

The base note always has a factor of randomness, but the range of possible values is configured based on the scenario.

You don't actually need any plugins! If there is an AudioSource or AudioListener attached to an object, you can use the OnAudioFilterRead function to set the actual values of the audio output yourself, as floats between -1 and 1.
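A minimal example of that approach (the sine generator is just a stand-in for evaluating the drawn curves): OnAudioFilterRead runs on the audio thread and hands you an interleaved buffer, one float per channel per sample frame.

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class ProceduralAudio : MonoBehaviour
{
    public double frequency = 220.0;
    public float gain = 0.2f;

    double phase;
    int sampleRate;

    void Awake()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    // Runs on the audio thread; fill 'data' with values in the -1..1 range.
    void OnAudioFilterRead(float[] data, int channels)
    {
        for (int i = 0; i < data.Length; i += channels)
        {
            phase += frequency / sampleRate;
            if (phase > 1.0) phase -= 1.0;

            // Placeholder sine; the game would evaluate its hand-drawn curves here instead.
            float sample = gain * Mathf.Sin((float)(phase * 2.0 * Mathf.PI));

            for (int c = 0; c < channels; c++)
                data[i + c] = sample;
        }
    }
}

Attach it to a GameObject with an AudioSource (no clip needed) and the tone will play; swapping the sine for the curve-based instruments described above gets you the rest of the way.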