Monday, November 25, 2024

[Darktable Trials] Testing out a New Photo Management / Editing Workflow - Part 1

Over this past week, I've been kicking the tyres on Darktable again - partly to see if I can now make it work for me, but also partly out of necessity: my usual workstation + office-space are currently out of commission, yet I had a bunch of photos needing editing. Meanwhile, Picasa - which I've been using to date - has long been deprecated and will eventually bite the dust in a bad way, so I wanted to avoid putting it on yet another machine (plus, I kept forgetting to grab the installer from my backup drive when I plugged it in).

Maybe there's a bit of "Stockholm Syndrome" here, but early indications are that with the current version (4.8.1 when I went to download it recently), Darktable has resolved a whole bunch of idiosyncrasies that made it an annoying non-starter previously, along with offering a bunch of new capabilities that present a good way of achieving some of the "Ultimate Photo Editing" tool points I mentioned previously.

 

Update Notes: 

* 25/11/2024 -  Original Post. May end up tacking on more details in a later revision, in which case these update notes will be updated.

Initial Thoughts on "Shortcut Editing" UIs

Is it just me, or do pretty much all the "Shortcut Editor" UIs in various apps suck?

I don't claim to have a fully fleshed-out, well-considered, and tested alternative in this case (unlike with many of the UI problems that I've spent some time toying with), but I do want to throw some ideas out there to kickstart a conversation that I think we, as an industry, need to have about these UIs.

So, without further ado, here are a few ideas for how I'd go about making Shortcut Editor UIs better if / when I try to design one next time.

 

Wednesday, November 20, 2024

On intrinsic motivation when learning new skills

This morning I saw an interesting discussion where someone was asking how they should go about learning to code. This got me thinking about how I personally learned, and the transferable lessons from that process for picking up new skills / knowledge in general.


From that, I realised that intrinsic motivation is key - i.e. you first have to want to know for your own sake / to meet some of your own goals, rather than it being a case of "someone says I have to do it". I suspect this is one of the first major barriers many people face: for many people, their experience with learning in general has been through their schooling years - a fraught experience where the answer is often, "I do it because the adults made me do it" (i.e. extrinsic / non-self-motivated).


The easiest way to learn is not to go through a tutorial / course "just because", but rather to set yourself a goal of "I want / need to be able to make <x> happen".

From there, it really focuses your learning:
  * What do I need to know to do that?   

     There's nothing better than a real concrete need / roadblock in the here and now that you need to overcome in order to get to the next step towards your goals, focussing your attention in ways you may not have considered possible.


  * How does what I'm learning / coming across tie into what I'm doing, or how does it compare to what I've seen working on my project? (I.e. what's the point of all this)?

     For example, from experience, this was what helped me get through some of the most boring series of lectures I'd encountered at Uni, where we were going through this dry abstract set theory stuff on database operations. I found it really helped to reframe those abstract concepts back into the more concrete world of how I'd go about implementing such a thing in code using a for-loop going over such collections of objects.


  * What else could I easily do now that I didn't originally consider?  

      This is where you start engaging your creative juices, and start coming up with new ideas for other enhancements, features, or even completely new projects. Those in turn will drive a lot more experience-building and hunting for new and better ways to do things, and pretty soon, we'll not be having this discussion here at all, as you're too busy trying to feed all your projects!

     (That's why I also believe that if you manage to get your kids hooked on some pursuit where their drive for personal improvement can start taking over - i.e. for example, anything from programming, playing a musical instrument, sports, or various forms of art - they really won't have time or the inclination to bother with drugs or petty youth crime)


For an additional piece of fun reading, I came across the following article this morning about how lab rats taught to drive "rat cars" apparently looked forward to the experience, with the anticipation for that having many positive benefits (including making their tails look more like those of "friendly cats"). Interesting food for thought...

https://theconversation.com/im-a-neuroscientist-who-taught-rats-to-drive-their-joy-suggests-how-anticipating-fun-can-enrich-human-life-239029

Saturday, November 16, 2024

Thoughts on Rust - 2024 Edition

Here are some of my current thoughts on Rust, as initially prompted by a thread I saw this morning asking: "What features of Rust are most appealing to you?"

My reply follows.

~~~

Originally, what drew me most to Rust were (in the following order):


1) Complex compiled language with momentum that was not C++ 

This was by far the biggest motivation, back when I was still mostly a C/Python dev. Having now spent a few years working professionally as a C++11 software engineer (having learned on the job, thanks to code reviews with a bunch of top-notch Modern C++ gurus), Rust just looks like Haskell-lite for Modern C++ devs (which enforces all the best practices we generally do, *by default*).

 

2) Claims of speed + memory safety (esp. around multithreaded code). This was a big one coming from several bad Python codebases that struggled with really bad concurrency issues (i.e. random deadlocks on machines with different processor counts than the original dev machines), but also because Rust first started rising in prominence as I was looking into the Depsgraph stuff (which would have really benefited from being able to evaluate multiple things in parallel, to do background recalcs + caching of your scene).


3) Didn't have "cuddled else" / "caterpillar ifs" hard coded into the syntax (*cough* Go *cough*)

To this day, it still really annoys me running across code written that way. Like, really twitchy annoyed.

It's a real pity that a bunch of languages have enshrined it in their syntax (due to design decisions), and also that increasingly many examples across other C-like languages do it too. Ugh!


Thursday, November 14, 2024

Blasts from the Past - Reminders and Remnants of Different Times

Sometimes you just cannot make these things up! Either that, or "The Algorithms" pervading our lives these days are getting much better at juxtaposing contrasting but related things. Anyway, I digress. Tonight's impromptu post is inspired by the following 2 snippets that popped up across my various feeds this evening:

   1)  Local newspaper had an article saying that a *third* building at my former High School had just been deemed "quake prone" - with all three buildings having been built either shortly before my time there, or during my time there, and all having been places I'd spent a bit of time in.


   2)  An old clip of Bill Gates giving a demo of Visual Basic back in 1991


 It's funny how these things go sometimes, isn't it...

 

Saturday, November 9, 2024

Principles and Frontiers for Creative Software Tools

Here's the long-promised "Manifesto + Roadmap" for the future of Creative Software Tools I'd been wanting to publish since May/June 2024, but was ultimately stalled from doing so by a bad first encounter with Covid, which sapped my strength to take on outside-work commitments for a few months.

NOTE: To get this out, I may just publish it first then amend it later

 ~~~

Dabbling with designing up another DCC tool after a short break from that field has reminded me of a whole bunch of untapped / unsolved directions for the future of DCC tools to make them more useful to the humans who use them.

Graph Layout and Diagramming Tools Projects

Automatic Graph Layout

Every few years (i.e. roughly 4-6) I will often end up having some project that necessitates auto-generating a big graph, in which case I again waste time going through all the layout engines checking if any do what I actually want, find that none do, and then waste time trying to tweak GraphViz parameters again...

That reminds me that it is probably time to give my own graph layout engine another go. It would be aimed at multilayered graphs with a particular flow structure 😜

(i.e. For those who know, it's basically a layout engine for Blender's DEG graphs, as I originally envisaged them - so the idea is to recursively group/agglomerate clusters of nodes which belong to some common entity. Then at each layer, just solve the simpler problem that exists at that level within the container (i.e. only lay out the nodes under said parent container). Once we have this, we can then flush the required bounds for each nested layout up the hierarchy, and so forth).
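To make that recursion a bit more concrete, here's a toy sketch of the "solve each container locally, then flush bounds up the hierarchy" idea. Note that the `Cluster` structure, the naive row-packing placement, and the cluster names used in the demo are all hypothetical, purely for illustration - a real engine would use a proper per-layer layout algorithm instead:

```python
# Toy sketch of recursive cluster layout: lay out each container's contents,
# then report the resulting bounds up to its parent (all structures hypothetical).

class Cluster:
    def __init__(self, name, children=None, leaves=0):
        self.name = name
        self.children = children or []   # nested sub-containers
        self.leaves = leaves             # plain nodes directly in this cluster
        self.width = 0
        self.height = 0

NODE_W, NODE_H, PAD = 40, 20, 10

def layout(cluster):
    """Lay out a cluster's contents, returning its required (width, height)."""
    # 1) Recursively solve the simpler problem inside each nested container first
    for child in cluster.children:
        layout(child)

    # 2) Place child containers side by side, and reserve space for leaf nodes
    #    (a naive single-row packing, just to show the shape of the recursion)
    x = PAD
    row_h = 0
    for child in cluster.children:
        child.x, child.y = x, PAD
        x += child.width + PAD
        row_h = max(row_h, child.height)
    for _ in range(cluster.leaves):
        x += NODE_W + PAD
        row_h = max(row_h, NODE_H)

    # 3) Flush the required bounds up the hierarchy
    cluster.width = x
    cluster.height = row_h + 2 * PAD
    return cluster.width, cluster.height

# Hypothetical scene graph, loosely DEG-flavoured:
scene = Cluster("Scene", children=[
    Cluster("Object.Transform", leaves=3),
    Cluster("Object.Geometry", children=[Cluster("Modifiers", leaves=2)]),
])
print(layout(scene))
```

The key property is that each call only ever deals with the direct children of one container, so the messy global problem never has to be solved all at once.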


My Own Dream Diagramming Tool

Of course, in the process, I'm reminded that a bigger priority is really to build my own diagramming tool that will fully work the way I want - with all the style presets, palettes, and interaction flows I've always dreamed of.

Oh, and to have this work multiplatform somehow, so I can use it on our locked-down work machines, but also on my phone (for quick sketches / doodles).

This is much much more a priority, as I do this *all the time*, and the existing tools are annoying for various reasons.

 

Saturday, November 2, 2024

Congratulations to Notepad++ for 21 Years and Counting

Saw an interesting link today to a blog post proclaiming that Notepad++ has been in existence for 21 years now:

 


 

https://learnhub.top/celebrating-21-years-of-notepad-the-legendary-journey-of-our-favorite-text-editor/


Having used it as my primary text editor for nearly a decade (i.e. starting roughly sometime around 2005-ish, and no later than 2006, until roughly 2014-ish), it has certainly been an invaluable tool for me personally.

 

Among other things:

* It was the text editor I used to code most of my Blender work (notably doing the 2.5 Animato refactor, and initial implementation of Grease Pencil)

* It was also the text editor I used for all my undergrad, and a big chunk of my honours project work (with Geany on Linux doing the rest of the heavy-lifting when I was using the department Linux machines)

 

Thursday, October 31, 2024

Wishlist for My Ultimate Photo Management + Editing Tool

Now that my project schedule is freeing up again (and most importantly, I'm finally free of my various university contracts / commitments over the past few years, with the rather onerous IP provisions those came with), my attention has again been turning towards what sorts of projects I may want to start working on in my free time going forwards.

 

The key operative principle though for any such projects I now take on is this:

From now on, any passion-projects I dedicate my free time to (and with full force) will necessarily only be ones that I fully control + own. Unfortunately, experience and hindsight have taught me that merely having something be open source (but still part of someone else's platform / hosted by some other funding org) is ultimately not the answer I once believed it to be.


Note: This is also NOT a firm commitment to actually embarking on building all of these things. But rather, just some open-air brainstorming, hoping that someone will build it all for me (and then not put it behind a hideous subscription-based paywall). Heck, maybe the mere act of brainstorming these designs then releasing them as blueprints to hopefully inspire a whole ecosystem of interfaces should be the actual project!


Enough framing boilerplate. Let's get down to the original topic for today's ramblings:

What my ideal "next-gen" photo management + editing tool solution should look like, were I to go through the effort to set one up.

Monday, October 28, 2024

[Trip Report] Sydney 2024

This is an abridged version of a more detailed post I'd been preparing (and subsequently never got around to finishing). As with the Wellington one, I might come back to attach some photos to it at some point, but the aim is just to get a quick-and-dirty version up so I have some notes on it. So without further ado, here is the "abridged" version.

So back in July this year, we headed over to Sydney for a week - mainly to attend a cousin's wedding, but also to do some sightseeing. Oh, and it was our first overseas trip in 5 years (i.e. the previous was pre-pandemic in June 2019, and as it happens, was also for a family gathering in Sydney).

 

~~~ 


BTW, Crowdstrike happened while we were there, which made for an interesting experience.

  * It was surreal walking around the bustling waterfront at twilight:

          * Just before leaving the hotel after a nap, I'd seen the headlines, and started hearing all the bad news filtering in... But then, out the window I'd seen the stunning sunset, so wanted to dash outside to photograph it from the waterfront (knowing that I wouldn't get another chance to do so during the rest of my stay). So, there I was at the waterfront, wondering how things were going to pan out (i.e. it was still a developing story at that point, with lots of doom-and-gloom news spooling up).

          * It was quite a spectacular sunset (though I'd sadly missed the best part of it by the time I'd gotten outside after waiting for the slow lifts), with a piano-busker playing Pachelbel's Canon (AKA "RIP Cellists") - funny/timely since we were going to a wedding the next day. Meanwhile, I watched the crowds of people (a mix of tourists + locals) just going about their business, blissfully unaware/ignorant of all the chaos unfolding around the world, while also looking up at all those office towers with lights on, feeling for all the IT guys there (and also back home) who now had a terrible fire on their hands.

   * It's funny that only days earlier (Tuesday to be precise), I'd just learned about the existence of Crowdstrike. And that was because my office workstation was running slow that day (and had been for a few days) when launching any new processes, prompting me to investigate what was happening (i.e. I thought it was only MSPaint, but it soon turned out to be everything, including opening new tabs!).

         * At the time, I'd chalked it up to me not applying some Windows updates yet, which I assumed would get applied while I was away. But since I was heading away the next day, I decided to leave it alone. Little did I know.... the first time I'm out of the country in a few years, this thing then proceeds to knock out everyone's IT systems. 

         * Who knows... could I have helped the world avert this thing if I'd sounded the alarm that it was acting weird a few days earlier???


[Trip Report] Wellington 2024

At the start of this month, we headed up to Wellington for our semi-annual pilgrimage up north to check out the World of Wearable Arts show, get our coconut buns fix (since we don't have *any* dim sum places offering them here in Christchurch), and fit in another tour of the usual sights (i.e. Te Papa, waterfront, Lambton Quay, Old Bank Arcade, etc.)

In the interests of getting this post up (since heck, it's nearly the end of the month now, and also I haven't even finished working through my Sydney ones, or the MtCook ones from earlier in the year), I will likely end up making separate posts for some of the highlights below if/when I ever get around to finishing editing the photos (almost done, but not for some sets), uploading them, and curating them in a blog-post. Otherwise, I may just link to the relevant albums, since that's easier...

 

NOTE: This post will likely be updated with images in due course as I get them processed + uploaded. But the text comes first, so I can tick this off my personal todo-list (not that posting these really matters)

Sunday, September 29, 2024

Musings About Topical Issues - 39th Week of 2024

Here's a roundup of musings on various issues that came up this week...

(NOTE: Most of these I'm just harvesting from my Mastodon feed, and reposting here for easier archival for my own sake)

Saturday, August 31, 2024

Winter of Misery - The End is Near!

Yay!  Today officially marks the "end" of winter 2024 here - a season that unfortunately will be remembered as one of misery and lots of sickness...

 

Sunday, August 25, 2024

What's Your Number? Multi-National People You Know

Was just having an interesting discussion with someone about how many people from different countries / nationalities we each knew - either from working with them directly, vs just knowing them / being acquainted. Having made a few passes at this, it turns out I know a hell of a lot more than I'd realised (i.e. I suspect I'm probably a fair bit more "international" than I realised...)

 

Totals:

* Current Workplace in NZ (Currently) = 9 → 10

* Current Workplace (Previously / On Top of Current Total) = 3

* Previous Workplaces in NZ (Combined) = 13

* Blender (In Person) = 9

* Blender (Online that I've Interacted With... IIRC) = 18

* Other FOSS Work = 3

* During School Days = 21

* Total Unique =  Approx  25-30   (will carefully re-count later)


So, what are your numbers?

Sunday, August 18, 2024

Mozart's "Magic Flute" Opera - Papageno - YouTube Rabbithole

For anyone who needs a bit of a cheer-up on this grey rainy day, here's a bunch of clips of a fun duet from a Mozart opera. 😍


1) The clip that started this rabbit hole, featuring (Huw Montague Rendall & Elisabeth Boudreault) 

This was the first time I'd ever heard this piece. I really loved the energy of the performers and the chemistry they had. (Also have to say that Elisabeth's little dance there at the end was quite cute LOL ;)

https://www.youtube.com/watch?v=lP9V7_fevgQ


2) Trying to find out what this song was about / the context for it, I came across this next clip from The Royal Opera (feat. Christina Gansch and Roderick Williams). 

This one had subtitles, and between these + the staging (i.e. all those kids), the meaning of the song suddenly all made sense (besides bringing back memories of the portrayal of Mozart in Amadeus).


https://www.youtube.com/watch?v=9Q0ZDZB-AnM


3) From a comment on that one, I learned that the following is apparently a famous performance by Detlef Roth and Gaële Le Roi.

The key highlights of this were seeing the matching costumes along with a bit more of the context leading up to this scene, along with watching the actors' snuggly expressions as they performed the scene. Personally though, I thought the soprano's singing in this one sounded a bit "soft" compared to the others, and didn't really like it that much TBH.

https://www.youtube.com/watch?v=87UE2GC5db0


4) This final one (a "children's version" from 25 Feb 2024 in Vienna apparently, featuring Rolando Villazon and unknown soprano) takes the cake!

I love the staging of this - particularly how it's this really fun + playful setting within a gigantic opera theatre (seriously... look at all the tiers of seating/boxes in this thing) that looks like it was put on specifically to expose kids to classical works in a much more intimate and less intimidating setting. 

Also this one comes with a lot more context, along with the matching costumes, and everything...

All in all - I thought this was just  SO. MUCH. FUN! 😍

https://www.youtube.com/watch?v=PaldPP44oas

Tuesday, August 13, 2024

My Second/Third Ever Aurora Sighting

Wow! After multiple attempts over the past few nights (and several earlier this evening - when others had been reporting lots of successful sightings), I was finally rewarded tonight with this last-ditch attempt.

More highlights from this show just after midnight here in suburban Christchurch on the night of the 12/13th August 2024 can be found in the following album:

https://www.flickr.com/photos/aligorith/albums/72177720319476435/

Thursday, August 8, 2024

Tip: Getting Threaded Conversations Working Consistently Across Your Desktop Outlook Mailbox

Here are some notes on how I've got Outlook set up to make my work mailbox a bit more manageable. Most of these things are probably officially documented "somewhere", but it's nice having a quick guide for getting a setup that seems more sane (for anyone coming from Gmail)

 

This post comes about because, while I had this working in my Inbox, I found that this was not the case for my other folders once I moved threads there. Hence my search for answers. 


Short Instructions  - For Enabling Threading on "Other" Folders (see image above):

1) Go to one of the affected folders
2) Go to "View" tab on Ribbon
3) Enable "Show as Conversations"
4) In the confirmation prompt, click "All Folders" (vs just "This Folder")

I suspect I'll have to go in and redo this for any new folders I add at some point in the future. But at least this gets all the current ones working nicely.


Bonus Tip:  Fixing the need to triple-click on the annoying little triangles to collapse threads

Make sure you've got that "Always Expand Selected Conversations" thing from that menu selected, which fixes that problem (it seems).

Oh, and you need to do that per-folder... unfortunately, this case doesn't show the handy "change for all folders" popup.

Sunday, July 7, 2024

Autocorrect Rant

One of my pet peeves about how auto-correct works on my phone is this:

Having to fix and refix and refix something where "it knew better" and kept correcting what I entered, despite either:

1) Repeatedly deleting the "fix" it applied + immediately retyping what I had originally typed

2) Explicitly choosing "option 1" (left-word - i.e. the thing I typed), over "option 2" (i.e. its default auto-correct solution)

This is especially annoying when it happens multiple times within the span of 5-10 minutes, when I'm typing + re-correcting the same sequence of characters again and again!


Solution:

The solution is really quite simple IMO - An explicit user override for a particular sequence of characters (or an immediate delete + retype of the same thing) should be a strong hint that they do not want that sequence autocorrected again in the next 5-10 minutes. If they keep doing this over a longer period, then that sequence should *never* get auto-corrected to whatever the system decides ever again.
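As a rough sketch of the kind of logic I mean - note that all the names, the cooldown length, and the "3 strikes" threshold below are hypothetical choices of mine, not any real keyboard's API:

```python
# Sketch of an override-aware autocorrect filter (all names hypothetical).
# An explicit user override suppresses that correction for a cooldown window;
# repeated overrides eventually blacklist the correction permanently.

COOLDOWN_SECS = 10 * 60      # "don't touch this again for ~10 minutes"
PERMANENT_THRESHOLD = 3      # overrides before giving up on a correction forever

class OverrideAwareAutocorrect:
    def __init__(self, corrections):
        self.corrections = corrections   # e.g. {"teh": "the"}
        self.recent_overrides = {}       # typed word -> timestamp of last override
        self.override_counts = {}        # typed word -> total override count
        self.blacklist = set()

    def record_override(self, typed, now):
        """User deleted our 'fix' and retyped their original word."""
        self.recent_overrides[typed] = now
        self.override_counts[typed] = self.override_counts.get(typed, 0) + 1
        if self.override_counts[typed] >= PERMANENT_THRESHOLD:
            self.blacklist.add(typed)    # never auto-correct this again

    def correct(self, typed, now):
        if typed in self.blacklist:
            return typed
        last = self.recent_overrides.get(typed)
        if last is not None and now - last < COOLDOWN_SECS:
            return typed                 # still inside the cooldown window
        return self.corrections.get(typed, typed)

ac = OverrideAwareAutocorrect({"darktable": "dark table"})
print(ac.correct("darktable", now=0))    # auto-"fixed", annoyingly
ac.record_override("darktable", now=1)   # user retypes what they meant
print(ac.correct("darktable", now=60))   # override respected within the window
```

The real version would obviously hook into the keyboard's correction pipeline and use wall-clock time, but the state involved really is just two small dictionaries and a set.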

*That* is the sort of "smart" behaviour that people really want from their tech, not the "lie-generating plagiarism machines" that are all the rage right now as the Big Tech titans once again battle to win the latest "first to build the 'Next iPhone' Monopoly game"

Thursday, June 20, 2024

Thoughts About "AI" (Winter 2024 Edition) - AKA: No, I do NOT want to have to "talk" to your "chatbot"

I briefly interrupt coverage of my Music Visualisation Project to cover a brief rant about the topical "AI" issues that are all the rage right now.

 

My current position on all this "AI" hype is:

1) TBH, I bloody HATE all this "me too" bandwagon jumping crap that's going around at the moment, and hope it all blows over sooner rather than later - just like "Crypto" and "NFT's" and "Metaverse" fads before it did. The sooner the better!

See also this "supremely on the point" blog post ;) -  https://ludic.mataroa.blog/blog/i-will-fucking-piledrive-you-if-you-mention-ai-again/

 

2) The UX of all these "AI" tools is fundamentally flawed:  i.e.  

     "I do NOT want to have to fucking 'talk' to your bloody 'chatbot' to do stuff!"

 

3) The majority of all this "AI" hype is all being poured into all the wrong directions: 

    "We should be focussing our efforts on helping people do what they cannot do otherwise (i.e. augmenting human abilities),  NOT trying to replace them  (i.e. destructive misery causing)"

    That there is perhaps the best way to sum up the ethical line / standard I use to decide what I spend time working on. I'm only interested in working on stuff that betters humanity's ability to do stuff they otherwise wouldn't be able to do without the technology. Other stuff (e.g. ad networks, DRM, fintech, killer robots, facial recognition, tracking + surveillance tech, making people/industries/etc. "redundant", etc.) I refuse to work on  (and really, anything I am not interested in, I do a categorically *awful* job at...)

 

4)  In that light, will I work on or play with AI stuff at some point?

     Short Answer:  If AI is the right tool for the job, I will consider it.

     Operative word: "right tool"

     So far, none of the problems I have been working on have required reaching into that toolset, so I haven't bothered to really delve too deeply into it. But if the opportunity arises where AI presents a better solution than we can achieve otherwise, I will consider it then.

     Prime Example:  With some of the image generation + editing tech out there now, we finally have a set of powerful tools for fixing a whole bunch of previously prohibitively difficult-to-fix problems, giving us the ability to do spot fixes for defects that would've previously ruined many images / videos. In that sense, these user-guided "repair" tools are precisely the "powerful magic fix-it tools" that we've all dreamed of having all these years, and so, by my previously stated principles, they may well be the right tool for the job in those cases. But using these tools to construct entire fabrications from scratch, trained off everyone's data (however ill-gotten)? Nope - that's pretty much something that should not be done!

Wednesday, June 12, 2024

[MusicViz Project] Part 2 - Motivations + Rough Directions

This is Part 2 of what will hopefully be a series of posts documenting my attempts to build a music visualiser for automatically creating interesting dynamic visualisations for the back-catalogue of music I've been writing + recording over the past few years. Last time I checked, in total there's probably somewhere between 3 and 5 hours of "finished" or "presentable" tracks, with most averaging about 1 minute in length (most come in under that, around 52-55 seconds), with only a few reaching 1:10 mins, and only 2-3 blowing out to ~2:30 mins.

Most notably, there are 2 playlists (or really "albums" by this point) of material I produced during the few months I was holed up in my room writing my thesis. During most of the day and night, I'd be listening to these playlists while slaving away in my text editor, desperately trying to make some progress (some days much more successfully than others); and then, to take a break / recharge, I'd write or record some music based on fragments that would come to mind. Rinse and repeat for several months. As my thesis grew, so too did these playlists, which each ended up over an hour long in the end.

For several years, I've been wanting to package these up in a suitable format to release into the world. Currently, only a small handful of these tracks have been heard by anyone other than myself, and certainly not these playlists in their entirety. Yes, granted, the expected audience is probably vanishingly small, as they are certainly not "mainstream", and don't fall neatly into established categories... hence, even if/when I do release these, I hardly expect many people to actually listen. Then again, if anyone's interested, I have actually produced a few more hours of similar / evolved material since then LOL - heck, I'm listening to one of the newer playlists as I write this, and even I am surprised by some of the material I recorded even a few years ago.

Monday, June 3, 2024

[Music Viz Project] First Version of Pitch-To-Colours Mapping

After procrastinating over this for a few years, I've finally put together a first version of a mapping for the colours I typically associate with each pitch - one of the key elements for the music visualisations I've always wanted to generate for all the music I've been writing + recording over the past few years.

This is actually my second attempt at putting together such a chart. The first one (which I can't seem to find right now) was only partially complete, as at the time, I kept struggling over whether I'd picked the perfectly calibrated shades for each, which then meant I never got the basics down.

 

So without further ado, here's a rough chart:

 


 (Disclaimer: I wanted to clean it up more, but Musescore doesn't let me easily insert/delete excess notes in the middle of a line without re-entering the notes and then losing the colours. So... meh!)
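For the eventual visualiser, a mapping like the chart above would presumably boil down to a simple pitch-class lookup table. Here's a sketch of what I mean - to be clear, the hex values below are arbitrary placeholders for illustration, NOT my actual colour associations (those live in the chart):

```python
# Sketch of a pitch-class -> colour lookup for the visualiser.
# NOTE: these hex values are placeholders, not my real associations.

PITCH_COLOURS = {
    "C": "#d62728",  "C#": "#e377c2",
    "D": "#ff7f0e",  "D#": "#bcbd22",
    "E": "#f7e626",
    "F": "#2ca02c",  "F#": "#17becf",
    "G": "#1f77b4",  "G#": "#9467bd",
    "A": "#8c564b",  "A#": "#7f7f7f",
    "B": "#111111",
}

PITCH_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def colour_for_midi(note):
    """Map a MIDI note number (60 = middle C) to its pitch-class colour."""
    return PITCH_COLOURS[PITCH_NAMES[note % 12]]

print(colour_for_midi(60))   # middle C
print(colour_for_midi(69))   # A above middle C (A440)
```

One open design question is whether octaves should modulate the colour too (e.g. brightness increasing with register), or whether octave information is better shown via vertical position in the visualisation.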


Sunday, May 19, 2024

Collage Making App - Design Sketches

While trying to put together a collage yesterday showing an amusing sequence of shots of a silvereye swallowing a ball of fruit it had yanked from the Autumn Birdy Berry Tree, I was reminded yet again just how frustrating this process is, with practically none of the tools out there really letting me do what I need + want (or at least none of the ones I currently have access to). Sure, I could ultimately bolt this together using some scripts / command-line tools, but it's a bit of a pain iterating on visual stuff like this that way.

 

My Requirements / Process-To-Automate:

* 1) Arrange my chosen images in a line, side-by-side (with ability to reorder them, add/remove items in this lineup, preview different combinations, etc. to get the flow of images right)

   NOTE: You can somewhat do this with existing tools, but it's always *a pain* to do  (and in some, it requires starting over / creating multiple draft solutions)


* 2) Allow bulk cropping the width of these to just an interesting section 

   NOTE: This requires ability to interactively preview + see the effects of such cropping, to make the iteration process fast + painless. This practically rules out all the command-line / scripted approaches. Also, no simple collage maker tools come close to even considering this possibility.


* 3) Allow ability to adjust vertical alignment on each of these individually (to fix framing differences) then v-crop any messy / scraggly bits on either side due to image sizing differences

   NOTE: Same story as above with #2


* 4) Make the canvas fit the whole strip of images (i.e. typically a very wide but not very tall image), at the highest resolution possible (from which I can then compress / resize as needed to satisfy upload constraints)

   NOTES:

        i) This last step in particular *always* manages to stump most tools out there. I get it - those are all optimised for the Insta / FB / etc. folks who have fixed "square" templates to fit their shit into. But, I don't particularly care about that when doing this.

        ii) This is actually a major gripe I have with most of our "creative" digital tools too - from painting apps to music scoring systems: i.e. The need to know and specify up front a "box" that will be big enough to fit whatever you're trying to do into (and if not, to then continuously grapple with various resizing + re-fitting tools to get more space to work in).  In that sense, that's one of the things I'm particularly proud of with Grease Pencil - that it provides an infinite canvas, free from these constraints (and is why I use/used it as my drawing tool) :)

 

Hence, I finally decided to bite the bullet, and see if I could hack together a solution for this.
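The geometry behind steps 1-4 above is actually pretty simple. Here's a sketch of just the layout maths (image sizes as plain tuples; a real tool would feed the resulting rectangles into an imaging library to do the actual cropping + pasting - the function name and parameters are my own invention):

```python
# Geometry-only sketch of the collage pipeline (steps 1-4 above).
# Images are (width, height) tuples; a real tool would apply these rectangles
# via an imaging library (e.g. Pillow's crop/paste).

def layout_strip(sizes, crop_x=(0, 0), v_offsets=None):
    """
    sizes:     list of (w, h) per image, in display order (step 1).
    crop_x:    (left, right) bulk horizontal crop in px, same for all (step 2).
    v_offsets: per-image vertical nudge in px, to fix framing (step 3).
    Returns the final canvas size + a paste rectangle per image (step 4).
    """
    v_offsets = v_offsets or [0] * len(sizes)

    # Step 2: bulk-crop every image's width to the interesting section
    widths = [w - crop_x[0] - crop_x[1] for (w, _) in sizes]

    # Step 3: after nudging, the strip is only as tall as the *overlap* of all
    # images, so scraggly top/bottom bits from sizing differences get v-cropped
    tops = list(v_offsets)
    bottoms = [off + h for (_, h), off in zip(sizes, v_offsets)]
    height = min(bottoms) - max(tops)

    # Steps 1 + 4: pack images side by side; canvas hugs the whole strip
    rects, x = [], 0
    for w in widths:
        rects.append((x, 0, w, height))   # (x, y, w, h) on the canvas
        x += w
    return (x, height), rects             # wide-but-short, full resolution

# Three same-ish shots, one framed 30px lower, cropped to the middle section:
canvas, rects = layout_strip(
    [(4000, 3000), (4000, 3000), (4000, 2970)],
    crop_x=(800, 800),
    v_offsets=[0, 30, 0],
)
print(canvas, rects)
```

The point of splitting the geometry out like this is that the interactive previewing in steps 2 + 3 becomes cheap: re-running the layout on new crop/offset values is instant, and only the final render needs to touch actual pixels.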

Monday, May 6, 2024

Mirrorless Camera Observations - A Few Months In (Part 2 in Series)

As a followup to Part 1, here are my observations on the mirrorless camera having used it for a few more months (and for long periods instead of my old DSLR).

 

Pros:

Unfortunately, this list is currently still a lot smaller than it should probably be... especially given the price tag on this thing.

1) As mentioned last time, the overall improved sharpness + richness of the colour range the sensor captures are a definite improvement. In particular, being able to use the decent / normal-ish shutter-speeds I actually want at night, and *still* get the shots with reasonable quality, has been a real benefit when shooting at night and/or in marginal lighting conditions

2) As expected, when dealing with sharply varying lighting conditions (e.g. shooting birds in a backlit tree, where taking one step to either side can land you in a patch of bright sunlight + a different exposure), having a digital viewfinder is a real improvement, as it lets you see the exposure changes in real-time. Also, having this all go via a digital screen vs sun-rays passing through a zoom lens direct to eyeball is a good safety measure to have. The same goes for being able to use the viewfinder while recording video. Basically, all the expected benefits of shooting with a mirrorless camera (and thus having "live view" in the viewfinder) are proving to be just as I expected

3) When it works, the auto-focus is good for tracking moving objects - Operative word: "when" it works... More on this later.

4) Having 3 (or actually 4, if counting the one on the new-style lenses) dials able to be used for controlling various settings is a big improvement on only having 2! Particularly as the third one can now be mapped to controlling ISO in manual mode, making that mode actually useful if you want to lock in camera behaviours with the other 2 and then use ISO to get the desired exposure (i.e. typically underexposing relative to what the camera's metering thinks the scene requires)


Cons:

This list is unfortunately still a lot longer than I'd like, with a bunch of these being ever-frustrating things that grind you down every time you use it. (Nothing puts these into focus as much as just switching back to shooting on the old DSLR for a change, and suddenly no longer having to deal with most of this crap)

Sunday, January 21, 2024

Mirrorless Camera Observations - First Week Impressions

Recently I bit the bullet and proceeded to get a new camera to supplement my ageing (and also apparently somewhat ailing though still trustworthy) 7D DSLR. After sticking with one camera for over a decade, switching to anything else was always going to come with a learning curve. What I didn't expect though was what some of those learnings would be!

 

Short Summary of Key Points

Pros:

* Between a nice new higher-quality standard lens, and new sensor tech (and being FF this time too), images are a lot sharper in general. Landscape shots especially (e.g. treelines against sunsets in particular, but also tiny text on small labels on things in frame) are now often very sharp + clear with this setup, whereas it used to be somewhat hit and miss whether the same applied with my old standard lens setup

* There are now 3 dials on the body that can be used to adjust various things, along with an additional mappable control-ring on the new-generation lenses. As you'd expect, in Full Manual Mode, these dials have been mapped so that the two that used to be present still control shutter speed + aperture as before, with the new one handling ISO - exactly as I'd been wishing for many years.

* New sensor == Higher ISO levels you can use (and with less obvious grain when that happens)

* Auto-focus available during video recording

* Can use EVF for "through the lens" live-view preview of what you're recording during video recording - which is better for situations where holding at arms length to see the LCD was problematic

* Can also choose to use the LCD screen in "pop out" mode out the side, with some angling support available

* A whole bunch of new + more advanced autofocus modes + settings to choose from, along with a wider array of focus points that can be used

 

Sunday, January 14, 2024

LOL... The "Genius" Developer Who Created Grease Pencil?! 😂

Was pretty chuffed to get sent a link to a YT video the other day with a pretty fun title LOL 😂😂😂



https://www.youtube.com/watch?v=H6XqricCPPY


Tuesday, January 2, 2024

30 Years of Blender

It seems that this year marks the 30th Birthday of Blender (which is observed on the 2nd January every year). What a milestone!

 

On a personal level, I am forever grateful for the role that Blender has played, and continues to play, in my life. In many ways, you could say that Blender and I grew up together - right time, right place, all that sort of thing:

From starting out as a kid in my early teens in NZ wanting to make my own animated short films; to starting to put my self-taught programming skills to use tweaking and modifying this software to scratch my own itches; to helping the nascent few professional studios starting to use Blender to scratch their itches; and finally, spending the better part of a decade or so fighting hard to propel our favourite open source 3D content creation suite from "just a little hobbyist toy" to "serious film-production ready software on-par with (if not outright superior to) the other much more lauded 'industry standard' software".

I think it's fair to say that we've definitely achieved that goal of becoming one of the new industry standard DCC suites, playing a major role in production pipelines throughout the industry. Everywhere you look, there are people out there using Blender - often in places and ways that you'd least expect!

What a thrilling ride it's been, along with getting to meet so many good lifelong friends along the way!

 

As you may have noticed, I've been largely absent from active involvement in the Blender world for the past few years. While a return to actively developing and contributing to Blender is not completely off the cards, it is currently highly unlikely. Never say never, but between a fairly heavy workload on a complex + important / critical system in my day-job, some ongoing consulting work on some other projects I've been involved in, and also trying to have a bit more of a life outside of work (I'm trying... old habits die hard), it's unlikely to happen anytime soon.


(Admittedly, it's also been somewhat bittersweet seeing the recent resurgence of interest in Blender's Animation System. All the active discussion + development of things I would've really loved some feedback on a decade ago - back when I was actually still actively working on some of these things - is fun to see... I just wish it could've happened sooner! Oh well... such is life!)


Happy 30th Birthday Blender!