Sunday, September 29, 2024

Musings About Topical Issues - 39th Week of 2024

Here's a roundup of musings on various issues that came up this week...

(NOTE: Most of these I'm just harvesting from my Mastodon feed, and reposting here for easier archival for my own sake)

Saturday, August 31, 2024

Winter of Misery - The End is Near!

Yay!  Today officially marks the "end" of winter 2024 here - a season that unfortunately will be remembered as one of misery and lots of sickness...

 

Sunday, August 25, 2024

What's Your Number? Multi-National People You Know

Was just having an interesting discussion with someone about how many people from different countries / nationalities we each knew - either from working with them directly, or just knowing them / being acquainted with them. Having made a few passes at this, it turns out I know a hell of a lot more than I'd realised (i.e. I suspect I'm probably a fair bit more "international" than I realised...)

 

Totals:

* Current Workplace in NZ (Currently) = 10 (up from 9)

* Current Workplace (Previously / On Top of Current Total) = 3

* Previous Workplaces in NZ (Combined) = 13

* Blender (In Person) = 9

* Blender (Online that I've Interacted With... IIRC) = 18

* Other FOSS Work = 3

* During School Days = 21

* Total Unique = Approx 25-30 (will carefully re-count later)


So, what are your numbers?

Sunday, August 18, 2024

Mozart's "Magic Flute" Opera - Papageno - YouTube Rabbithole

For anyone who needs a bit of a cheer-up on this grey rainy day, here's a bunch of clips of a fun duet from a Mozart opera. 😍


1) The clip that started this rabbit hole, featuring Huw Montague Rendall & Elisabeth Boudreault 

This was the first time I'd ever heard this piece. I really loved the energy of the performers and the chemistry they had. (Also have to say that Elisabeth's little dance there at the end was quite cute LOL ;)

https://www.youtube.com/watch?v=lP9V7_fevgQ


2) Trying to find out what this song was about / the context for it, I came across this next clip from The Royal Opera (feat. Christina Gansch and Roderick Williams). 

This one had subtitles, and between these + the staging (i.e. all those kids), the meaning of the song suddenly all made sense (besides bringing back memories of the portrayal of Mozart in Amadeus).


https://www.youtube.com/watch?v=9Q0ZDZB-AnM


3) From a comment on that one, I learned that the following is apparently a famous performance by Detlef Roth and Gaële Le Roi.

The key highlights of this one were the matching costumes, a bit more of the context leading up to this scene, and watching the actors' snuggly expressions as they performed it. Personally though, I thought the soprano's singing in this one sounded a bit "soft" compared to the others, and didn't really like it that much TBH.

https://www.youtube.com/watch?v=87UE2GC5db0


4) This final one (a "children's version" from 25 Feb 2024 in Vienna apparently, featuring Rolando Villazon and an unknown soprano) takes the cake!

I love the staging of this - particularly how it's this really fun + playful setting within a gigantic opera theatre (seriously... look at all the tiers of seating/boxes in this thing) that looks like it was put on specifically to expose kids to classical works in a much more intimate and less intimidating way. 

Also this one comes with a lot more context, along with the matching costumes, and everything...

All in all - I thought this was just  SO. MUCH. FUN! 😍

https://www.youtube.com/watch?v=PaldPP44oas

Tuesday, August 13, 2024

My Second/Third Ever Aurora Sighting

Wow! I was finally rewarded again tonight, after multiple attempts over the past few nights (and several more earlier in the evening - when others had been reporting lots of successful sightings) had all come to nothing... right up until this last-ditch attempt for the night.

More highlights from this show just after midnight here in suburban Christchurch on the night of the 12/13th August 2024 can be found in the following album:

https://www.flickr.com/photos/aligorith/albums/72177720319476435/

Thursday, August 8, 2024

Tip: Getting Threaded Conversations Working Consistently Across Your Desktop Outlook Mailbox

Here are some notes on how I've got Outlook set up to make my work mailbox a bit more manageable. Most of these things are probably officially documented "somewhere", but it's nice having a quick guide for getting a setup that seems more sane (for anyone coming from Gmail).

 

This post comes about because, while I had this working in my Inbox, I found that this was not the case for my other folders once I moved threads there. Hence my search for answers. 


Short Instructions  - For Enabling Threading on "Other" Folders (see image above):

1) Go to one of the affected folders
2) Go to "View" tab on Ribbon
3) Enable "Show as Conversations"
4) In the confirmation prompt, click "All Folders" (vs just "This Folder")

I suspect I'll have to go in and redo this for any new folders I add at some point in the future. But at least this gets all the current ones working nicely.


Bonus Tip:  Fixing the need to triple-click on the annoying little triangles to collapse threads

Make sure you've got the "Always Expand Selected Conversations" option from that same menu enabled - that seems to fix the problem.

Oh, and you need to do that per-folder... unfortunately, this case doesn't show the handy "change for all folders" popup.

Sunday, July 7, 2024

Autocorrect Rant

One of my pet peeves about how auto-correct works on my phone is this:

Having to fix and re-fix and re-fix something where "it knew better" and kept correcting what I'd entered, despite my either:

1) Repeatedly deleting the "fix" it applied + immediately retyping what I had originally typed

2) Explicitly choosing "option 1" (left-word - i.e. the thing I typed), over "option 2" (i.e. its default auto-correct solution)

This is especially annoying when it happens multiple times within the space of 5-10 minutes, when I'm typing + re-correcting the same sequence of characters again and again!


Solution:

The solution is really quite simple IMO - An explicit user override for a particular sequence of characters (or an immediate delete + retype of the same thing) should be a strong hint that they do not want that sequence autocorrected again in the next 5-10 minutes. If they keep doing this over a longer period, then that sequence should *never* get auto-corrected to whatever the system decides ever again.

*That* is the sort of "smart" behaviour that people really want from their tech, not the "lie-generating plagiarism machines" that are all the rage right now as the Big Tech titans once again battle to win the latest "first to build the 'Next iPhone' Monopoly game"
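
To make this concrete, here's a rough sketch (in Python, purely illustrative - a real keyboard engine obviously looks nothing like this, and the window / threshold numbers are just the ones from the rant above) of the kind of override-tracking I have in mind:

    import time
    from collections import defaultdict

    # Assumed tuning values, taken from the rant above (not from any real keyboard)
    SUPPRESS_WINDOW_SECS = 10 * 60   # "the next 5-10 minutes"
    PERMANENT_AFTER = 3              # "keep doing this over a longer period"

    class OverrideMemory:
        """Remembers sequences the user has explicitly un-corrected."""

        def __init__(self):
            self.last_override = {}                # sequence -> time of last override
            self.override_count = defaultdict(int)
            self.never_correct = set()             # permanently exempt sequences

        def record_override(self, sequence):
            # Called when the user deletes the "fix" + retypes the original,
            # or explicitly picks their own spelling over the suggested one.
            self.last_override[sequence] = time.time()
            self.override_count[sequence] += 1
            if self.override_count[sequence] >= PERMANENT_AFTER:
                self.never_correct.add(sequence)

        def may_autocorrect(self, sequence):
            # The autocorrect engine asks this before "fixing" anything.
            if sequence in self.never_correct:
                return False
            last = self.last_override.get(sequence)
            if last is not None and time.time() - last < SUPPRESS_WINDOW_SECS:
                return False
            return True

i.e. the engine only gets to "help" if the user hasn't recently (or repeatedly) told it otherwise.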

Thursday, June 20, 2024

Thoughts About "AI" (Winter 2024 Edition) - AKA: No, I do NOT want to have to "talk" to your "chatbot"

I briefly interrupt coverage of my Music Visualisation Project for a quick rant about the topical "AI" issues that are all the rage right now.

 

My current position on all this "AI" hype is:

1) TBH, I bloody HATE all this "me too" bandwagon-jumping crap that's going around at the moment, and hope it all blows over sooner rather than later - just like the "Crypto" and "NFTs" and "Metaverse" fads before it. The sooner the better!

See also this "supremely on point" blog post ;) -  https://ludic.mataroa.blog/blog/i-will-fucking-piledrive-you-if-you-mention-ai-again/

 

2) The UX of all these "AI" tools is fundamentally flawed:  i.e.  

     "I do NOT want to have to fucking 'talk' to your bloody 'chatbot' to do stuff!"

 

3) The majority of all this "AI" hype is being poured into all the wrong directions: 

    "We should be focussing our efforts on helping people do what they cannot do otherwise (i.e. augmenting human abilities),  NOT trying to replace them  (i.e. destructive misery causing)"

    That there is perhaps the best way to sum up the ethical line / standard I use to decide what I spend time working on. I'm only interested in working on stuff that betters humanity's ability to do stuff they otherwise wouldn't be able to do without the technology. Other stuff (e.g. ad networks, DRM, fintech, killer robots, facial recognition, tracking + surveillance tech, making people/industries/etc. "redundant", etc.) I refuse to work on  (and really, anything I am not interested in, I do a categorically *awful* job at...)

 

4)  In that light, will I work on or play with AI stuff at some point?

     Short Answer:  If AI is the right tool for the job, I will consider it.

     Operative word: "right tool"

     So far, none of the problems I have been working on have required reaching into that toolset, so I haven't bothered to really delve too deeply into it. But if the opportunity arises where AI presents a better solution than we can achieve otherwise, I will consider it then.

     Prime Example:  With some of the image generation + editing tech out there now, we finally have a set of powerful tools for fixing a whole bunch of previously prohibitively difficult-to-fix problems, giving us the ability to do spot fixes for defects that would've previously ruined many images / videos. In that sense, these user-guided "repair" tools are precisely the "powerful magic fix-it tools" that we've all dreamed of having all these years, and so, by my previously stated principles, they may well be the right tool for the job in those cases. But using these tools to construct entire fabrications from scratch, trained off everyone's data (however ill-gotten)? Nope - that's pretty much something that should not be done!

Wednesday, June 12, 2024

[MusicViz Project] Part 2 - Motivations + Rough Directions

This is Part 2 of what will hopefully be a series of posts documenting my attempts to build a music visualiser for automatically creating interesting dynamic visualisations for the back-catalogue of music I've been writing + recording over the past few years. Last time I checked, in total there's probably somewhere between 3 and 5 hours of "finished" or "presentable" tracks, with most averaging about 1 minute in length (most come in just under that, at around 52-55 seconds), with only a few reaching 1:10 mins, and only 2-3 blowing out to ~2:30 mins.

Most notably, there are 2 playlists (or really "albums" by this point) of material I produced during the few months I was holed up in my room writing my thesis. During most of the day and night, I'd be listening to these playlists while slaving away in my text editor, desperately trying to make some progress (some days much more successfully than others); and then, to take a break / recharge, I'd write or record some music based on fragments that would come to mind. Rinse and repeat for several months. As my thesis grew, so too did these playlists, which each ended up over an hour long in the end.

For several years, I've been wanting to package these up in a suitable format to release into the world. Currently, only a small handful of these tracks have been heard by anyone other than myself, and certainly nobody has heard these playlists in their entirety. Yes, granted, the expected audience is probably vanishingly small, as they are certainly not "mainstream", and don't fall neatly into established categories... hence, even if/when I do release these, I hardly expect many people to actually listen. Then again, if anyone's interested, I have actually produced a few more hours of similar / evolved material since then LOL - heck, I'm listening to one of the newer playlists as I write this, and even I am surprised by some of the material I recorded even a few years ago.

Monday, June 3, 2024

[Music Viz Project] First Version of Pitch-To-Colours Mapping

After procrastinating over this for a few years, I've finally put together a first version of a mapping for the colours I typically associate with each pitch - one of the key elements for the music visualisations I've always wanted to generate for all the music I've been writing + recording over the past few years.

This is actually my second attempt at putting together such a chart. The first one (which I can't seem to find right now) was only partially complete, as at the time, I kept struggling over whether I'd picked the perfectly calibrated shades for each, which then meant I never got the basics down.

 

So without further ado, here's a rough chart:

 


 (Disclaimer: I wanted to clean it up more, but Musescore doesn't let me easily insert/delete excess notes in the middle of a line without re-entering the notes and then losing the colours. So... meh!)
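
(For the visualiser itself, this chart will presumably end up as a little lookup table in code anyway. Here's a minimal sketch of what I'm imagining - note that the hex values below are arbitrary grey placeholders, NOT the actual shades from my chart, which still need transcribing:)

    # Placeholder pitch -> colour lookup. The hex values are stand-ins only;
    # the real shades are the ones in the chart above.
    PITCH_COLOURS = {
        "C":  "#808080", "C#": "#808080", "D":  "#808080", "D#": "#808080",
        "E":  "#808080", "F":  "#808080", "F#": "#808080", "G":  "#808080",
        "G#": "#808080", "A":  "#808080", "A#": "#808080", "B":  "#808080",
    }

    PITCH_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def colour_for_midi_note(note):
        # Ignore octave for now - just map the pitch class to its colour.
        return PITCH_COLOURS[PITCH_NAMES[note % 12]]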


Sunday, May 19, 2024

Collage Making App - Design Sketches

While trying to put together a collage yesterday showing an amusing sequence of shots of a silvereye swallowing a ball of fruit it had yanked from the nearby Autumn Birdy Berry Tree, I was reminded yet again just how frustrating this process is, with practically none of the tools out there really letting me do what I need + want (or at least none of the ones I currently have access to). Sure, I could ultimately bolt this together using some scripts / command-line tools, but it's a bit of a pain iterating on visual stuff like this that way.

 

My Requirements / Process-To-Automate:

* 1) Arrange my chosen images in a line, side-by-side (with ability to reorder them, add/remove items in this lineup, preview different combinations, etc. to get the flow of images right)

   NOTE: You can somewhat do this with existing tools, but it's always *a pain* to do  (and in some, it requires starting over / creating multiple draft solutions)


* 2) Allow bulk cropping the width of these to just an interesting section 

   NOTE: This requires ability to interactively preview + see the effects of such cropping, to make the iteration process fast + painless. This practically rules out all the command-line / scripted approaches. Also, no simple collage maker tools come close to even considering this possibility.


* 3) Allow adjusting the vertical alignment of each of these individually (to fix framing differences), then v-crop any messy / scraggly bits on either side due to image sizing differences

   NOTE: Same story as above with #2


* 4) Make the canvas fit the whole strip of images (i.e. typically a very wide but not very tall image), at the highest resolution possible (from which I can then compress / resize as needed to satisfy upload constraints)

   NOTES:

        i) This last step in particular *always* manages to stump most tools out there. I get it - those are all optimised for the Insta / FB / etc. folks who have fixed "square" templates to fit their shit into. But, I don't particularly care about that when doing this.

        ii) This is actually a major gripe I have with most of our "creative" digital tools too - from painting apps to music scoring systems: i.e. The need to know and specify up front a "box" that will be big enough to fit whatever you're trying to do into (and if not, to then continuously grapple with various resizing + re-fitting tools to get more space to work in).  In that sense, that's one of the things I'm particularly proud of with Grease Pencil - that it provides an infinite canvas, free from these constraints (and is why I use/used it as my drawing tool) :)

 

Hence, I finally decided to bite the bullet, and see if I could hack together a solution for this.
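
As a starting point, here's roughly the shape of the stitching step I have in mind, sketched with Python + Pillow (the filenames, crop bands, and offsets below are made-up examples - the whole point of the eventual app is that these get tweaked interactively rather than by editing numbers):

    from PIL import Image   # pip install Pillow

    def build_strip(entries, out_path="strip.png"):
        # entries: list of (filename, (left, right), y_offset) tuples.
        #   (left, right) = horizontal band of the source image to keep (step 2)
        #   y_offset      = vertical nudge to line the subject up between frames (step 3)
        imgs = []
        for filename, (left, right), y_offset in entries:
            im = Image.open(filename)
            imgs.append((im.crop((left, 0, right, im.height)), y_offset))

        # Keep only the vertical band that every (offset) image covers,
        # trimming away the scraggly top/bottom edges (step 3).
        top = max(off for _, off in imgs)
        bottom = min(off + im.height for im, off in imgs)
        imgs = [(im.crop((0, top - off, im.width, bottom - off)), off)
                for im, off in imgs]

        # Canvas sized to fit the whole strip at full resolution (step 4).
        canvas = Image.new("RGB", (sum(im.width for im, _ in imgs), bottom - top))
        x = 0
        for im, _ in imgs:
            canvas.paste(im, (x, 0))
            x += im.width
        canvas.save(out_path)

    # e.g. four frames of the silvereye sequence, keeping a different band
    # from each shot, with small nudges to keep the bird roughly level:
    build_strip([
        ("silvereye_01.jpg", (400, 1000), 0),
        ("silvereye_02.jpg", (350,  950), 12),
        ("silvereye_03.jpg", (380,  980), -8),
        ("silvereye_04.jpg", (420, 1020), 4),
    ])

Getting this far with a script is easy enough; the hard part (and the actual point of the exercise) is wrapping an interactive preview / reorder / re-crop UI around those numbers.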

Monday, May 6, 2024

Mirrorless Camera Observations - A Few Months In (Part 2 in Series)

As a followup to Part 1, here are my observations on the mirrorless camera, having now used it for a few more months (and for long periods instead of my old DSLR).

 

Pros:

Unfortunately, this list is currently still a lot smaller than it should probably be... especially given the price tag on this thing.

1)  As mentioned last time, the overall improved sharpness + richness of the colour range the sensor is capturing is a definite improvement. I'm especially loving being able to use the decent / normal-ish shutter speeds I want to be using at night and *still* get the shots with reasonable quality - that's been a real benefit when shooting at night and/or in marginal lighting conditions.

2) As expected, when dealing with sharply varying lighting conditions (e.g. shooting birds in a backlit tree, where taking one step to either side can land you in a patch of bright sunlight + a different exposure), having a digital viewfinder is a real improvement, as it allows seeing the exposure changes in real time. Also, just having this all be via a digital screen vs sun-rays passing through a zoom lens direct to eyeball is a real safety measure that's good to have. Same goes for the ability to use the viewfinder while recording video. Basically, all the expected benefits of shooting with a mirrorless camera (and consequently having "live view" in the viewfinder) are proving to be as I expected.

3) When it works, the auto-focus is good for tracking moving objects - Operative word: "when" it works... More on this later.

4) Having 3 (or actually 4, if counting the one on the new-style lenses) dials able to be used for controlling various settings is a big improvement on only having 2! Particularly as the third one can now be mapped to controlling ISO in manual mode, making that mode actually useful if you want to lock in camera behaviours with the other 2 and then use ISO to get the desired exposure (i.e. typically underexposing relative to what the camera's metering thinks the scene requires).


Cons:

This list is unfortunately still a lot longer than I'd like, with a bunch of these being ever-frustrating things that grind you down every time you use it. (Nothing puts these into focus as much as just switching back to shooting on the old DSLR for a change, and suddenly no longer having to deal with most of this crap.)