Monday, June 23, 2014

To Build In Or Addon: That is the question...

It has been an exciting few weeks in the land of Blender addons, with the launch of CGCookie's Blender Marketplace and the appearance of various useful tools such as ndee's Asset Sketcher (for randomly scattering collections of objects over a surface, complete with different sizes, rotations, and so forth), MCHammond's PAE (a set of scripts which defines an optimised UI for viewing and editing the properties associated with a rig), and Proxy Picker (which defines one of those schematic diagrams for off-character selection tools). These are by no means the only ones, but for me personally, these were the most impressive.

These addons also bring up a very important issue: where does the boundary lie between what Blender should provide by default (i.e. as part of the core program) versus what should be provided as addons, or left as things that individual users tweak to optimise their own workflows?

Blender 101
What is probably not that well known in the Blender community at large (AFAIK, only a handful of devs are really aware of this currently) is that there are long-term plans for a project, or series of projects, looking into these very issues. From what I understand of the "Blender 101" concept, as Ton calls it, the idea is that in time we will have a system for defining hot-swappable "frontends" or "workspace skins" optimised for particular workflows. It has to be noted, though, that in doing something like this there is always the risk of devolving into one of those failed "training wheels" scenarios, which have never turned out well.

Commentary aside, the notion here is to provide the ability to optimise the set of tools and functionality presented in each context to best suit certain types of task. For example, a basic "Blender for dummies" (for teaching kids the basics of 3D, or perhaps for letting non-technical clients interact with assets), a "sculptor's heaven", or an "animator's playpen" would be examples of contexts that could be deployed on top of the base Blender core, alongside (not instead of) the full standard Blender interface we have now.

In this regard, yes, there are aspects of what Andrew proposed last year which I agree with on UX grounds, if looking at things from particular viewpoints. Of particular note is that the philosophy of Blender's current UI is very much independent of the concerns of particular workflows (some may call this the "data-oriented" approach, and you wouldn't be wrong to call it that - Blender, after all, was nicknamed the "block visualiser" or something along those lines during its early development in the NeoGeo days, according to some of the articles on the old site). IMO, if you want to support fertile cross-workflow/role flexibility in a piece of software (e.g. you can switch from putting together some video clips one moment to modelling/texturing/shading/animating some assets to mask in the next - see David McSween's recent tutorial on VSE usage to remind yourself of what it is that Blender offers), there is a different set of concerns you need to address than if you're optimising interfaces for narrow use cases. Namely, at some point, you're going to have stuff that isn't optimal for any single workflow, but at least people can safely know that they'll find something in a particular place, even if it does take a bit longer/more work to get there (aka "scrolling" and "context switching").

At the end of the day, it all depends a lot on where exactly you place your priorities:
- Blender as a unique all-in-one tool which allows flexible cross-domain/cross-concern work, or
- Blender as free/open source replacement for a certain workflow step within your greater pipeline

At least, after mulling over this for a while, these seem to be some of the main conflicts driving the various "factions" - yes, I said it... there are several different tribes/factions with differing worldviews, all seemingly pushing for their perspective to be "the one" direction we all go in. (Personal opinion: those in certain camps are particularly vocal about their views and points, and are hell bent on making sure we all know these down to the last nuanced spelling error. <pre-emptive comment>no +ve/-ve sentiment about this is implied here</>)

Anyways, what should be clear to everyone is that this project is still for the distant future. We still have massive, concrete, complex technical challenges left to address in the common core - you know, the part which all of these various things depend on. That's where our attention is currently directed: addressing those core infrastructure/technical problems. That means things like the depsgraph and evaluation system, viewport performance, asset management, particles/hair/duplication/proxies, and the sequencer. Each of those, in and of itself, is a major undertaking.

Case Study: Animation System
So what does this mean for different areas of Blender in practice? Let's examine what this means for animation related stuff, since that's what I'm most familiar with.

At some stage, many of you who animate with Blender will probably have thought about Editable Motion Paths and Full-character Ghosting/Onion-Skinning (versus seeing only the armature). Some may have even seen Bart Crouch's addon which implements editable motion paths (see this thread), played with it, and then wondered why this isn't bundled by default already.

What you may not know about the current situation though is the following:
1) In fact, inspired by browsing through a copy of the Animator's Survival Kit by Richard Williams in a bookshop (I'm a bit annoyed now that I didn't get it at the time, since it's now long gone), I'd been planning to do this since April/May 2010, along with a few other things like in-viewport spacing charts and Pose Sculpting. However, a number of things have gotten in the way (see the next points). As a result, Maya (editable motion paths) and Modo 7 (spacing charts) got there first. For the industry at large though, maybe that's not such a bad thing: I like to think that we actually started posing enough of a threat that they realised they needed to up their game and start innovating like this again (even if the chances of that being the reality are more likely wishful thinking) :)

2) Various technical limitations in Blender's core actually mean that it wouldn't be a very good idea to try implementing this stuff yet. Yes, I'm referring primarily to the depsgraph, but also to how the viewport handles drawing large and complex meshes. As a result, even if I did implement full mesh ghosting, it wouldn't be terribly usable, as Blender would need to slowly recalculate a whole bunch of heavy things on the UI thread.
On a side note, the "weird" - according to venomgfx's release notes for Amaranth :P - structure for how motion path settings are stored is actually a remnant of the aborted work on getting motion paths and ghosting working for objects and bones alike. It's just that the logic for the ghosting part was never completed, even though the settings are there and ready for action.
3) Around mid-April or May 2010, there was also an avalanche of bugs, with the bug tracker ballooning out to over 700 items at one point, IIRC. Thus, most of us went from active feature development to fighting fires just to keep the barrage under control (and that's not including trying to get that number down). The truth of the matter, though, is that to cope, we've had to readjust our expectations of what we're going to look into, and what needs to be reprioritised as long-term "fixes for the todo list" that we can tackle as development projects in their own right in future.

Switching into "maintenance mode" like this has consequences. Notably, if you spend too long in this state, you actually start getting worn down by the need to constantly triage the barrage of bugs. It stops being fun. Motivation evaporates. Productivity goes into the negative. You'll need a break - some time off from it all - before motivation starts coming back (which is basically what you may have noticed, with me being a lot less active over the past two years).

4) There is one other notable aspect of this you need to be aware of: a "maintenance mode" mindset means that you're a lot more risk averse. Of course, having two open movie projects running during that time, in deep crunch, also contributed. Basically, when it came to animation, this meant that while additions to tools/the UI were ok, anything touching the core evaluation engine was not.
     * Examples here include the FCurve Easing Patch (coming to 2.71), which made quite a few significant changes to the way that keyframes are evaluated - to the point where it was actually difficult to figure out what had changed, and whether there was anything that might just happen to break someone's incredibly convoluted file out there somewhere. As a result, I only recently reviewed and accepted it (for the 2.7x series, since for this series we've got a well established, accepted, and publicised mandate that compatibility-breaking changes may occur).
     * Constraint bugs and bone roll calculations are two other examples of things that I'm admittedly wary of touching as well. Constraint bugs, simply because there are a lot of rigs out there that actually rely on quirks in the code (e.g. back in the 2.45/2.46 days, I tried changing the Copy Rotation constraint to use quats instead of eulers internally, but this managed to break dozens of foot rigs in weird and unexpected ways which didn't make any sense). Trying to debug the sequence of matrix multiplications and space conversions taking place in the ChildOf constraint is equally headache inducing. The Pivot Constraint... I admit that this was an ill-formulated attempt that tried to do too much, and as a result, it sucked, bigtime. Bone roll bugs are a whole other world of pain - as Bastien (mont29) has also started to learn. For all the users out there who love to complain and file pedantic bug reports about this: just learn to accept that bone rolls will not stay still in all situations, and that the situation we have now is the "least bad" in that it at least does predictable bad things, and behaves well for the 95% of cases where it doesn't stuff up.
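On the quats-vs-eulers point, it's worth spelling out why a seemingly internal change like that can break rigs: completely different euler triples can describe the very same rotation, so a converted-back result may be mathematically correct yet present different channel values to every driver or rig that reads them. A minimal, Blender-independent sketch in plain Python (the helper names here are mine, not Blender's):

```python
import math

def rx(a):
    """Rotation matrix about the X axis (angle in radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):
    """Rotation matrix about the Y axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):
    """Rotation matrix about the Z axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_xyz(x, y, z):
    """Compose an XYZ-order euler rotation: X first, then Y, then Z."""
    return matmul(rz(z), matmul(ry(y), rx(x)))

deg = math.radians

# Two very different-looking sets of channel values...
a = euler_xyz(deg(180), 0, 0)          # (180, 0, 0)
b = euler_xyz(0, deg(180), deg(180))   # (0, 180, 180)

# ...which describe exactly the same rotation.
same = all(abs(a[i][j] - b[i][j]) < 1e-9 for i in range(3) for j in range(3))
print(same)  # True
```

A quaternion-based code path is free to hand back either triple when converting back, so rig logic that depends on specific euler channel values can suddenly see different numbers - which fits the "weird and unexpected" foot-rig breakage described above.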

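For context on what the easing patch actually adds: the new interpolation modes are based on the well-known Penner-style easing equations. As a rough sketch of the general idea (illustrative only, not Blender's actual implementation), here's a cubic ease-in-out applied between two keyframe values:

```python
def ease_in_out_cubic(t):
    """Cubic ease-in-out: slow start, fast middle, slow end, for 0 <= t <= 1."""
    if t < 0.5:
        return 4 * t * t * t
    u = 2 * t - 2
    return 0.5 * u * u * u + 1

def evaluate(v0, v1, t):
    """Blend two keyframe values using the easing curve instead of raw linear t."""
    return v0 + (v1 - v0) * ease_in_out_cubic(t)

print(evaluate(0.0, 10.0, 0.25))  # 0.625 (linear interpolation would give 2.5)
print(evaluate(0.0, 10.0, 0.5))   # 5.0 (the midpoint is unchanged)
```

Even small changes to a function like this shift every in-between frame of existing animations, which is part of why touching keyframe evaluation during "maintenance mode" felt so risky.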
So, how do these points relate to the editable bone paths feature? Well, although the addon "seemed" to work perfectly (or at least well enough for what the users who tried it were doing), most of those tests, it seems, were done with objects. Sure, some others tested it on bones too - simple control bones without any complex constraints on them, I might add - and again, it seemed to work well. But if you keep going through the thread, eventually you'll start to come across the actual cases which resemble the kinds of corner cases (falling into the "technically hard" basket) that I recognised early on, and which at least partially contributed to my not blindly diving in at the time.

Now, since this is an addon, we as the core Blender developers don't have any obligation to work out a fix for this, since it's not actually part of the Blender core that we claim to support officially. Sometimes, though, some functionality can't really be done as an addon and must be implemented in the core (e.g. BEPUik or whatever it's called). In that case, the problem becomes whether we want to officially add it to our sources. By including it, the following things end up being implied:
  * We officially provide support for this functionality
  * By extension, most users assume that officially supported == "fit for purpose"
  * "Fit for purpose" is quite a loaded term. It firstly means that it will provide utility (i.e. be useful) to users, and will thus actually be used by people (if it doesn't actually pass this test, there's no point in going any further). Secondly, some take this to mean that it must be a "general purpose" solution which will cope with every single setup under the sun, no matter how bloody convoluted and complicated/distorted it is; the fact that it doesn't work for their nest of hornets means that it doesn't quite live up to spec, and thus must be fixed as an urgent priority, now.
  * Compounding the problem is that if this thing comes from someone outside the normal core team, Open Source dynamics mean it's a given that the guy/girl is highly likely to hang around for 2-3 months after adding it - maybe even 6-12 months if you're lucky - before eventually disappearing, never to be heard from again (when life starts calling). Thus, by deciding to include this contribution, we as the core devs accepting it must be fully prepared to take on maintenance duties for this thing when the original dev inevitably disappears. In which case, we had better be able to fix any issues that crop up in there as they do, which means there had better not be any major limitations, massive missing/unresolved technical issues which will turn out to be horridly complex to resolve, or anything along those lines.

Us-vs-them much? If some of the torrid "debates" on BA in recent months are any indication, what I just described above might just have unintentionally incited another round of "robust discourse" of some sort.

So, what about the role of other addons versus the core Blender experience?
1) A few years ago, I wrote of how I had removed all the various carcasses of different attempts at implementing tools for automated walkcycle stuff. For some people, support for such stuff is what they seem to mean by "advanced animation capabilities", and the lack of built-in support was viewed by them as a major issue.

Now, since then, various systems have sprouted up. In particular, the Tube project seems to have made at least one, if not two, different systems for achieving this! At this point in time, I still believe that more production-specific tools need to be played with to gain insights into what works versus what doesn't, so that we can generalise on the right basis for a built-in tool (instead of adopting the first standard that someone comes up with, which only works as long as the community uses that standard, with all the problems that has in the long term).

2) Regarding the PAE: once again, I'm very impressed with what MCHammond has cooked up. At this point in time, I still tend towards the view that, unless we make an application-wide shift towards more workflow-specific approaches, the core Blender UI people get out of the box should remain the basic version we have now, to encourage people to actually get to know how it all fits together and come to grips with Blender's workflows before developing their own extensions on top to turbocharge their work. I believe very much in software which lets people see its guts hanging out the side, hinting at how it was all put together, and which encourages users to have a play around - to make things, extend it, rearrange it, and tailor it to suit themselves better. In this sense, we have succeeded somewhat with Blender 2.5, as shown by the proliferation of addons created by some of our top artists, who saw inviting opportunities to make changes to their tools. It really is your tool, as Ton and co have been stressing in the website branding and other campaigns over the past year or two.

Of course, if enough of our animator community actually prefers having something like this as the default (and there are none of the issues I mentioned above), I think it's well within reason that we could adopt something like this eventually :)


  1. Excellent post, Aligorith. It highlights the difficulty of developing for users who have come to expect a lot.

    At the risk of walking into your blog with my shoes still muddy from BA, I'd like to express my preference for extending features into Blender's core rather than as addons.

    The new easing equations are a good example; in After Effects, for instance, that very same functionality must be purchased as a third-party plugin. This plugin fills the animation channels with big hairy scripts, and if you're not aware of its quirks it can cause just as many problems as it solves.

    The trouble is that these buggy little plugins are still so useful that they can become integral to a studio's workflow. But when they break or are inconsistent, there's never a clear solution because the plugin devs can't fix the core application, and the core application devs don't bother integrating the plugin's features into the core app because, well, it's already available as a plugin.

    I've worked with pipelines that rely on this convoluted mishmash of half-supported apps and plugins. I would hate to see Blender become so fragmented. Blender's biggest strength as I see it is the ability to bypass much of this.

    As much as I support the work that people put into addons, I would much rather have a well-integrated feature later than a poorly integrated addon now. It's very possible, however, that I've completely misunderstood the point of this article, so I don't know if what I'm saying is relevant at all. I'm an artist and I just want to make the coolest stuff with the shortest pipeline possible.

  2. Most of the problems listed stem from the lack of a C API, although a weak C API wouldn't help much either. With a good C API in place, some of these extensions could be plugins that wouldn't have the problems Python addons have. I also think the Blender sources need a lot of refactoring/documentation to be developer-friendly to people outside the BI. I am working on refactoring the view3d logic/ops for my own purposes, and that part of Blender is one special case upon another, each with its own logic/functions handling different paths in its own way.