For a bit of Friday night entertainment, I dug up my first major/non-trivial programming project, which I had started way back in 2005 and worked on for about a year before other projects took over most of my time. It was called "Ludwig" (after Ludwig van Beethoven), and it was an attempt at creating a piece of software that could "compose" credible-sounding pieces of music... or at least that was the goal ;)
This project was a bit of an outgrowth of an earlier project aimed at creating a music entry/input tool for my own use. For that tool, I needed to initialise the midi player before using it later, so I started by having it play a few chords to check that things worked. As projects go, however, this soon snowballed into the idea of having it play a little ditty composed in "JIT" fashion. Thus began the project to develop Ludwig.
It was my first foray into Python programming and into (a "variant" of) Object-Oriented programming, and it taught me many lessons about writing code.
For example, I learnt (the hard way) about combinatorial explosion, and about what happens when you try to solve your problem by enumerating every possible combination into a big list, all so that you can easily choose a random pattern later on without having to figure out how to generate it then and there. This was the basis for the rhythm generation code. It was also the first time I had ever managed to write code that made my computer start swapping (note: said computer wasn't anything fancy, even by the standards of the day, with a paltry 224 MB of RAM), which caused things to slow to a crawl, turning a process that usually took a few seconds into one that took an entire hour.
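I no longer have that code in front of me, but the trap looked roughly like this (a hypothetical reconstruction; the function names and duration set are made up for illustration):

```python
# Hypothetical reconstruction of the trap: pre-enumerating every possible
# rhythm pattern up front so that one can simply be picked at random later.
DURATIONS = (1, 2, 3, 4, 6, 8)  # note lengths, in semiquaver "slots"

def all_patterns(slots_per_bar):
    """Enumerate every sequence of durations that exactly fills a bar."""
    patterns = []
    def fill(remaining, so_far):
        if remaining == 0:
            patterns.append(tuple(so_far))
            return
        for d in DURATIONS:
            if d <= remaining:
                fill(remaining - d, so_far + [d])
    fill(slots_per_bar, [])
    return patterns

# The pattern count grows exponentially with bar length (~24000 for a
# 16-slot 4/4 bar), and combining several bars or voices multiplies those
# counts together -- which is how the RAM vanished.
for bar in (8, 12, 16):
    print(bar, len(all_patterns(bar)))
```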
It was also great practice for writing various algorithms by hand. At the time, there was no such thing as "StackOverflow.com", so I often ended up just hacking everything together myself, even if it meant doing things quite inefficiently using some amalgam of lists with names starting with 'x' (extras), 't' (temp), or 'n' (new), iterating over those lists while occasionally putting more things into them, and so forth.
As a side-effect of this DIY culture, I only realised sometime in the last year that I had unknowingly reinvented the technique of single-step (i.e. first-order) Markov chains for various aspects of this system, despite not knowing the concept or its name back in the day. I had read a bit about people using this technique at the time, but I didn't realise the implications of what they were actually doing! Sometimes you just need to see these abstract concepts in terms of a simple, concrete use case, and the rest becomes trivial!
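For anyone unfamiliar with the idea, here's a minimal sketch of a single-step Markov chain over pitches. This is just the concept, not the original code, and the names are illustrative:

```python
import random
from collections import defaultdict

def train(melody):
    """Count which pitch tends to follow which -- a first-order Markov chain."""
    followers = defaultdict(list)
    for cur, nxt in zip(melody, melody[1:]):
        followers[cur].append(nxt)
    return followers

def generate(followers, start, length):
    """Random-walk the table: each note depends only on the previous one."""
    out = [start]
    for _ in range(length - 1):
        choices = followers.get(out[-1])
        if not choices:
            break
        out.append(random.choice(choices))
    return out

# Toy example: midi pitches of a short phrase around middle C (60)
phrase = [60, 62, 64, 62, 60, 64, 65, 64, 62, 60]
table = train(phrase)
print(generate(table, start=60, length=12))
```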
Another thing that I ended up hacking together (somewhat related to the Markov chains) was a system for representing a graph of nodes, with support for taking a random walk through them (exiting whenever the walk reached one of the marked "exit" points). IIRC, I named it a "TreeMap", though I think any Java programmer running into that name today would have a bit of a confusion fit trying to understand what was going on.
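The details are fuzzy now, but the core idea was something in this spirit (again a hypothetical reconstruction, not the original "TreeMap" code):

```python
import random

# Hypothetical reconstruction: a node graph supporting random walks that
# terminate when a marked "exit" node (or a dead end) is reached.
class Node:
    def __init__(self, value, is_exit=False):
        self.value = value
        self.is_exit = is_exit
        self.links = []  # nodes reachable from this one

def random_walk(start, max_steps=100):
    """Follow random links from 'start', stopping at an exit node."""
    path = [start.value]
    node = start
    for _ in range(max_steps):
        if node.is_exit or not node.links:
            break
        node = random.choice(node.links)
        path.append(node.value)
    return path

# Tiny graph: a -> (b | c), b -> (a | end), c -> end
a, b, c = Node("a"), Node("b"), Node("c")
end = Node("end", is_exit=True)
a.links = [b, c]
b.links = [a, end]
c.links = [end]
print(random_walk(a))
```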
However, despite all these parts, which seemed quite grand back in the day but now look slightly cringe-worthy, I'm still very proud of a few aspects of this system:
1) It generated sheet music output (in the form of LilyPond .ly files that could be compiled into a PDF score and a corresponding midi file), rather than just a plain midi file with no easy/reliable way of extracting sheet music from it should it produce anything worthy.
2) My algorithm for converting from midi pitch numbers (0 - 127, middle C = 60) to LilyPond pitches (or, in practice, sane pitch names in any language/representation). With most other algorithms for this that I've seen in the wild, there are cases where inappropriate accidentals get used because the algorithm is too dumb to do otherwise, whereas a human would use a key-appropriate alternative. So far, mine has given results similar to what a human would write in most cases (barring those where a significant key change actually occurred, even if temporarily, and it wasn't told about it). A rough sketch of the idea follows after this list.
3) The fact that, despite often coming up with complete duds (mostly due to taking the wrong approach to generating rhythmic patterns, plus a slightly dodgy/overzealous approach to filling in the inner voices), from time to time it comes up with quirky gems that are actually quite decent.
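Regarding item 2: my original algorithm was more involved than this, but a minimal sketch of key-aware spelling (using LilyPond's default Dutch note names, where middle C is c') might look something like the following. The key handling here is deliberately simplified to "flat keys prefer flat names":

```python
# A minimal sketch of key-aware pitch spelling (NOT the original algorithm):
# sharp keys get sharp spellings, flat keys get flat spellings.
SHARP_NAMES = ["c", "cis", "d", "dis", "e", "f", "fis", "g", "gis", "a", "ais", "b"]
FLAT_NAMES  = ["c", "des", "d", "es", "e", "f", "ges", "g", "aes", "a", "bes", "b"]
FLAT_KEYS = {"f", "bes", "es", "aes", "des", "ges"}  # major keys with flat signatures

def midi_to_lily(pitch, key="c"):
    """Convert a midi pitch number (middle C = 60) to a LilyPond pitch name."""
    names = FLAT_NAMES if key in FLAT_KEYS else SHARP_NAMES
    name = names[pitch % 12]
    octave = (pitch // 12) - 4          # LilyPond's unmarked c is midi 48
    return name + ("'" * octave if octave > 0 else "," * -octave)

# In F major, midi 70 should be spelt bes rather than ais:
print(midi_to_lily(60))             # -> c'
print(midi_to_lily(70, key="f"))    # -> bes'

# ...and item 1's output essentially amounted to writing such names into
# a .ly file for LilyPond to compile:
notes = " ".join(midi_to_lily(p, key="f") for p in (65, 67, 69, 70, 72))
print("{ \\key f \\major " + notes + " }")
```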
Also, although they don't look like the best designs in hindsight, the following items were quite some achievements back in the day:
4) All durations were represented in terms of "slots" of the smallest duration used (usually a semiquaver, but not always), which were then split into actual note durations (obeying standard grouping/tie-ing rules and barline restrictions) by a somewhat complicated system (totalling 21 KB of Python code). This worked on a divide-and-conquer basis, built around the image of buckets of slots sloshing over from one bucket to the next, repeated across roughly 3 levels. It was elaborate and produced accurate/valid results, and, as originally intended, it made it easy to retime entire streams of notes without having to painstakingly fix all the groupings again (or get confused by them). In practice, however, a lot of the rhythm generation code ended up needing to check how much space was left in a bar before chucking anything in there, which somewhat negated the need/purpose of this system. (A much-simplified sketch of the splitting idea follows below.)
5) The MusicStreams system. It was quite a complex system, and my first real taste of inheritance. Although it served its purpose well (or mostly), it had some design problems which I never really resolved. The most complicated of these involved dealing with chords and triplets/n-lets vs standard notes, which ended up using a somewhat shabby Composite design cobbled together with type checks (see the second sketch below).
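To illustrate item 4: the real system was far more elaborate, but a greedy sketch of the core splitting idea (ignoring the finer beat-grouping rules it also handled) could look like this:

```python
# A much-simplified, greedy sketch of the slot-splitting idea (the original
# was a three-level, 21 KB system that also handled beat grouping properly).
# Durations are in semiquaver slots: 16 = semibreve ... 1 = semiquaver.
STANDARD = [16, 12, 8, 6, 4, 3, 2, 1]  # plain and dotted values, largest first

def split_note(length, pos, bar=16):
    """Split a note 'length' slots long, starting at slot 'pos', into a list
    of standard durations, breaking at barlines; successive parts are tied."""
    parts = []
    while length > 0:
        room = bar - (pos % bar)        # slots left before the next barline
        chunk = min(length, room)
        d = next(v for v in STANDARD if v <= chunk)  # largest value that fits
        parts.append(d)
        pos += d
        length -= d
    return parts

print(split_note(10, 3))    # -> [8, 2]: a minim tied to a quaver
print(split_note(10, 12))   # -> [4, 6]: a crotchet tied across the barline
                            #    to a dotted crotchet
```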
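And for item 5: a cleaner take on that Composite idea, where every element answers the same duration query so the stream-handling code doesn't need isinstance() checks everywhere (illustrative names only, not the MusicStreams code):

```python
# Illustrative Composite sketch: notes, chords, and tuplets all expose the
# same duration interface, so stream code can treat them uniformly.
class Event:
    def slots(self):            # duration in smallest-slot units
        raise NotImplementedError

class Note(Event):
    def __init__(self, pitch, slots):
        self.pitch, self._slots = pitch, slots
    def slots(self):
        return self._slots

class Chord(Event):
    def __init__(self, notes):
        self.notes = notes      # simultaneous notes sharing one duration
    def slots(self):
        return self.notes[0].slots()

class Tuplet(Event):
    def __init__(self, events, ratio=(3, 2)):
        self.events, self.ratio = events, ratio
    def slots(self):            # e.g. a triplet squeezes 3 into the time of 2
        num, den = self.ratio
        return sum(e.slots() for e in self.events) * den // num

stream = [Note(60, 4),
          Chord([Note(60, 4), Note(64, 4), Note(67, 4)]),
          Tuplet([Note(62, 2), Note(64, 2), Note(65, 2)])]
print(sum(e.slots() for e in stream))  # 4 + 4 + 4 = 12
```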
I'll leave you with another "successful" output from the program, made a few years ago. Enjoy!
https://sites.google.com/site/aligorith/ludw_20100221_222519-2.mid?attredirects=0&d=1