MSc Final Project Weblog 5 : Project Construction And The Metronome Problem
<5/09/21 - 20/09/21>
During this period I began to code the architecture of my project. This was aided immensely by the time I spent planning and researching. The majority of the implementation went as planned, but some aspects (such as the Entity) became more refined as I continued with the implementation and my understanding of the concepts improved.
The manager scripts are singleton MonoBehaviours that handle essential operations relating to their subject: for example, the spawn manager handles the calculation and creation of the spawn tiles/rooms, whilst the audio manager provides audio files relevant to the current biome combination.
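The singleton manager pattern can be illustrated generically. This is a minimal Python sketch of the access pattern only (the project's managers are Unity MonoBehaviours in C#, and all names below are hypothetical, not project code):

```python
# Sketch of the singleton manager pattern: one shared instance that
# any system can reach through a single access point.

class AudioManager:
    _instance = None

    @classmethod
    def instance(cls) -> "AudioManager":
        # Create the single shared instance on first access.
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

    def clip_for_biome(self, biome: str) -> str:
        # Hypothetical lookup; the real manager serves files for
        # the current biome combination.
        return f"Audio/{biome}/ambience"

# Every caller gets the same object:
assert AudioManager.instance() is AudioManager.instance()
```

In Unity the same idea is usually achieved by a static `Instance` property assigned in `Awake`, so other scripts can call the manager without holding a reference to it.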
A variety of abstract classes contain functionality that will be reused by disparate objects that need to be interacted with by the rest of the program's systems. This includes the Entity: all entities need uniform functions for neat integration with the other systems, but also require nuanced design to differentiate them and generate artistic intrigue.
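The Entity idea can be sketched as an abstract base class: the base guarantees a uniform surface for the other systems, while each subclass differentiates behaviour. This is an illustrative Python sketch, not project code, and the names are hypothetical:

```python
from abc import ABC, abstractmethod

class Entity(ABC):
    """Shared base so every entity exposes the same hooks."""

    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def on_beat(self) -> str:
        """Uniform hook other systems can call on any entity;
        each subclass supplies its own nuanced behaviour."""

class Fauna(Entity):
    def on_beat(self) -> str:
        return f"{self.name} chirps"

print(Fauna("wren").on_beat())  # → wren chirps
```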
Unity's built-in default abstraction lets scripts be attached as components to GameObjects. These components include a variety of sinusoidal movement/rotation scripts to add life to a scene, tile spawn collider logic and other frequently used code.
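The core of a sinusoidal movement script is a single line of maths: an offset added to a rest position each frame. In the project this lives in a Unity component's Update; here it is plain Python for illustration, with the amplitude, frequency and phase values chosen arbitrarily:

```python
import math

def sway_offset(t: float, amplitude: float = 0.25,
                frequency: float = 0.5, phase: float = 0.0) -> float:
    """Vertical offset at time t for a gentle bobbing motion."""
    return amplitude * math.sin(2 * math.pi * frequency * t + phase)

# Peak of the cycle at a quarter period (0.5 Hz -> 2 s period):
print(sway_offset(0.5))  # → 0.25
```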
The various files and aesthetic settings for each biome composite need to be stored in a manner that is separated but easily navigated and maintained, as well as simple enough for string concatenation (to allow a modular loading system).
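The string-concatenation idea amounts to composing asset paths from the current biome combination. A minimal sketch, assuming a hypothetical folder layout (the actual project structure and names may differ):

```python
def biome_audio_path(biome: str, layer: str, variant: int) -> str:
    """Compose the path to one audio file for a biome combination.

    Folder layout is an assumption for illustration: keeping each
    biome's files in its own predictable folder means new biomes can
    be added without touching the loading code.
    """
    return f"Audio/Biomes/{biome}/{layer}_{variant:02d}"

print(biome_audio_path("Forest", "Ambience", 3))
# → Audio/Biomes/Forest/Ambience_03
```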
Introducing The Metronome Problem :
This vitally important part of the project requires a little explanation and demystification. A clock structure is essential for the sonic integrity of the project; the caveat, however, is that it must be capable of subdivisions, which are not available from the FMOD Event Callbacks. Furthermore, the ability to fluidly change the BPM is also important, and this is made incredibly difficult by FMOD.
The timing of audio events is vital, as even small timing discrepancies are very noticeable (this matters less for visual events). Furthermore, Unity's Update function triggers once per frame, and the framerate is variable, which makes it unsuitable for audio applications.
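To make the frame-timing issue concrete, here is a small Python sketch (the BPM and framerate are illustrative assumptions, not measurements from the project) quantising ideal beat times to a fixed frame grid. An event fired on the first Update after a beat boundary can easily be 10 ms or more late:

```python
import math

BPM = 128          # beat every 0.46875 s (illustrative)
FPS = 60           # frame every ~16.7 ms (illustrative, and in
                   # practice the framerate varies frame to frame)

beat = 60.0 / BPM
frame = 1.0 / FPS

for n in range(1, 5):
    ideal = n * beat                            # when the beat should sound
    fired = math.ceil(ideal / frame) * frame    # first Update at/after it
    print(f"beat {n}: {(fired - ideal) * 1000:5.1f} ms late")
```

The error wanders from beat to beat rather than staying constant, which the ear hears as an unsteady pulse; audio scheduling therefore needs to be tied to the audio hardware's own clock, not the render loop.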
Using event callbacks & FMOD timeline markers : this is the method suggested in this video when addressing what the creator describes as 'the subdivision problem'. However, for the purposes of this project, where BPM is to be changed at run time and mathematical relationships between different BPMs are desirable, it requires a lot of manual labour; it also seems temporally unstable from initial tests, and each desired BPM has to be set up in advance. The major issue, which almost certainly makes it unviable, is that audio events will always be at least a few ms late.
Using nested events : in an FMOD forum post where somebody queries how to achieve a similar result to this project's aims, FMOD tech support suggests this is a good way to approach the problem. It still requires a lot of manual work to achieve some of my more complex audio interaction goals, albeit less tedious copying, pasting and dragging than the callback method, and it still suffers from the adaptive-BPM problem whereby BPMs need to be set up in advance.
FMOD Sample Clock Query : a pure maths approach to calculating event timings, inspired by a method outlined here relating to using this approach with Unity's built-in audio (dropping FMOD is also an option, but would involve some amount of refactoring). This has a fatal issue in that FMOD can only apply delay to a bank of channels, which would make the audio project hierarchy exceptionally complicated and potentially unworkable.
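The arithmetic behind the sample-clock approach is simple, which is its appeal. A minimal sketch, assuming a monotonically increasing DSP sample counter is available (the function and variable names below are illustrative, not FMOD API calls):

```python
import math

SAMPLE_RATE = 48_000  # assumption; query the real output rate in practice

def samples_per_tick(bpm: float, subdivision: int) -> float:
    """Samples in one subdivision of a beat (1 = quarter, 4 = sixteenth)."""
    return SAMPLE_RATE * 60.0 / (bpm * subdivision)

def next_tick(clock: int, bpm: float, subdivision: int) -> int:
    """Sample index of the first tick boundary at or after `clock`.

    Because this is recomputed from the live sample clock, the BPM can
    be changed fluidly at run time with no pre-authored markers, and
    different BPMs stay in a simple mathematical relationship.
    """
    tick = samples_per_tick(bpm, subdivision)
    return int(math.ceil(clock / tick) * tick)

# At 120 BPM a quarter note is 24,000 samples at 48 kHz:
print(samples_per_tick(120, 1))  # → 24000.0
```

Scheduling an event at a computed sample index is exactly where FMOD becomes awkward: the per-sample delay it offers applies at the channel-group level rather than per event, which is the hierarchy problem described above.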
There has to be a better way: the number of markers required for a single BPM setting in FMOD.