During this year’s round of Game Audio North talks, I gave a brief overview of procedural audio and looked at where it’s being used. I ran through the two primary approaches (top-down & bottom-up) and showed how you might tackle a project with a top-down mindset.

To demonstrate this, I created a PureData patch that recreated a recording I had taken of my convection fan. Without delving into the operational mechanics of the fan, I tried to approximate the features of the sound with simple signal generators and some filters.
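To give a rough idea of the top-down approach outside of PureData, here is a minimal Python sketch of the same idea: a sine oscillator standing in for the motor hum, plus low-pass-filtered white noise standing in for the air turbulence. The specific frequencies and mix levels here are illustrative guesses, not the values from the actual patch.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def fan_sketch(duration=1.0, hum_hz=100.0, noise_cutoff_hz=800.0, sr=SR):
    """Crude top-down fan approximation: a low sine 'motor hum' plus
    low-pass-filtered white noise for the air turbulence.
    hum_hz and noise_cutoff_hz are hypothetical, ear-tuned parameters."""
    n = int(duration * sr)
    t = np.arange(n) / sr
    hum = 0.3 * np.sin(2 * np.pi * hum_hz * t)   # motor hum
    noise = np.random.uniform(-1.0, 1.0, n)      # broadband air noise
    # one-pole low-pass filter: y[i] = y[i-1] + a * (x[i] - y[i-1])
    a = 1.0 - np.exp(-2 * np.pi * noise_cutoff_hz / sr)
    filtered = np.empty(n)
    acc = 0.0
    for i in range(n):
        acc += a * (noise[i] - acc)
        filtered[i] = acc
    return hum + 0.5 * filtered

sig = fan_sketch(0.25)
```

In a top-down workflow you would keep comparing this output against the reference recording and adjust the generators and filter settings by ear, rather than modelling the fan's physics.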

I also briefly covered some of the research I undertook for my final-year tech project, which culminated in an attempt at procedurally generating footfalls using granular synthesis. The resulting footfall synthesis still leaves a lot to be desired, but it did highlight the depth and nuance involved in even simple interactions. These sorts of insights will only become more important as video games push for greater immersion and interactivity, especially in media such as VR.
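For readers unfamiliar with granular synthesis, the core mechanism is simple to sketch: short windowed "grains" are drawn from a source buffer and overlap-added at varying positions to build a new texture. This is a generic illustration in Python, not the project's actual implementation; the grain counts, lengths, and the noise-burst source are all placeholder assumptions.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def granular(source, n_grains=40, grain_ms=30.0, out_ms=400.0, sr=SR, seed=0):
    """Overlap-add short Hann-windowed grains drawn from random offsets
    in the source buffer -- the basic mechanism behind a granular
    footstep texture. All parameters here are illustrative defaults."""
    rng = np.random.default_rng(seed)
    grain_len = int(grain_ms / 1000 * sr)
    out_len = int(out_ms / 1000 * sr)
    window = np.hanning(grain_len)  # fade each grain in and out
    out = np.zeros(out_len)
    for _ in range(n_grains):
        src_pos = rng.integers(0, len(source) - grain_len)
        dst_pos = rng.integers(0, out_len - grain_len)
        out[dst_pos:dst_pos + grain_len] += window * source[src_pos:src_pos + grain_len]
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out

# hypothetical source: a short noise burst standing in for a surface recording
src = np.random.default_rng(1).uniform(-1, 1, SR // 2)
step = granular(src)
```

In practice, making this sound like an actual footfall means driving the grain selection and density from the interaction itself (surface type, gait, weight), which is where the depth and nuance mentioned above comes in.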

Click the image below to access the slides for the presentation: