Now, before someone gets overly excited and skips past the rest of this text: I am aware that a similar tool already exists in Blender trunk. However, this and the other tools I've come across don't quite fill the gap that this tool aims to address. Let's look at a little example of the kind of situation this is meant to solve:
In this particular scenario, we have an original "source" mesh that had some shapekeys painstakingly defined (e.g. for facial expressions). We then decide that we'd like to adjust the topology of parts of the rest of the body, or perhaps even add body parts (say, a gigantic 4ft-long tail, or in the example above, hands and feet), so we go and start editing the mesh (having made a backup first, of course). The key thing to note here is that we're not touching the face, or whatever part the shapekeys were defined on.
Motivation continued...
As anyone who's tried to extend shapekey-laden meshes like this will tell you, at some point the shapekeys get corrupted. In my experience, this randomly occurred when trying to join two previously separate mesh islands within the same mesh object, especially when doing a merge-to-last, though it also happens with other operations. The mesh would suddenly get jaggies and blips, and checking through the shapekeys, you'd find that some look like they've been through the rough end of a blender ;)
Hence, as a last resort, you'd have to remove all the shapekeys from the mesh at a step before the corruption happened, and then carry on merging the verts to get a complete mesh again. However, this leaves us with the problem of what to do with the shapekeys that you'd carefully crafted and had to throw out to avoid losing everything. Crafting them again from scratch, even doing vertex-by-vertex matching by hand (using the original as a template in the background), would be an utter PITA.
For many years now, this has been one of the main reasons I've personally avoided using shapekeys in any of my rigs: crafting shapekeys is bad enough the first time round, but having to do it all again every time you redo your topology, as new stuff comes up or as you find the flaws of your old approach, is just too much tedium. As a coder nowadays, it's quite clear that this is one of the ways that technology should be able to step in and conquer some of that tedium. I've done it for uncertainty calculations in physics lab experiments, I've done it for copying poses, and I'm sure I can do it for transferring shapekeys too!
Approaches to this problem
So, as I mentioned, I originally thought that this functionality might already exist in Blender. Alas, it didn't.
The "Transfer Shape Keys" operator, found in the popup menu of the Shape Keys panel, has a critical flaw which makes it useless for most of these cases: it only works when the source and target meshes have the same number of vertices. That's a fat lot of use for this scenario, where you've clearly got a changing vertex count (and potentially a changing vertex order too).
I also did a quick Google search on the matter, looking for ready-made mesh-to-mesh detail-transfer algorithms that I could code up quickly and which might handle arbitrarily difficult cases (i.e. differing topology). I did find a few interesting papers, but the quality of the results was worrying in places. One went to the extent of dividing the mesh into patches and correlating these (yes, it was quite interesting to see a cow model being matched to an extremely low-poly humanoid representation, just like the human models were), while another baked head meshes into a spherical-harmonics representation and tinkered with eigenfeatures (amplifying/attenuating them) to obtain its final results.
However, none seemed particularly interested in the perhaps simpler but probably more common case, where you've got potentially large "common" areas, and the only real differences are that the regions bordering those areas may have been "enhanced" with additional geometry or different topology. In this case, the observation that parts of the mesh, particularly those we're most interested in, are the same leads to a relatively straightforward problem description:
1. Identify the parts in common (alas, the devil is often in the details)
2. Since those parts in common should have the same topology and vert count, we should be able to create a direct mapping between them once they've been identified
3. Using this mapping, we can then apply the relevant offsets, and we should have the shapekeys transferred
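To make step 3 concrete, here's a minimal pure-Python sketch, assuming the mapping from steps 1-2 is already in hand. Vertices are modelled as plain coordinate tuples, and the function name is illustrative, not the script's actual API:

```python
# Sketch of step 3: given a {source_index: target_index} mapping, apply each
# mapped source vert's shapekey offset (key position minus basis position)
# to the corresponding target vert. Unmapped target verts are left untouched.

def transfer_shapekey(src_basis, src_key, tgt_verts, mapping):
    """Return the target vert positions with the shapekey offsets applied."""
    result = list(tgt_verts)
    for si, ti in mapping.items():
        offset = tuple(k - b for k, b in zip(src_key[si], src_basis[si]))
        result[ti] = tuple(v + o for v, o in zip(tgt_verts[ti], offset))
    return result
```

The real operator of course works on bpy shape key data, but the arithmetic is exactly this: offsets relative to the basis shape, carried across via the index mapping.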
While it is easy for humans to visually spot common areas and correlate them (if it lines up, it's a match), getting a computer to do it is actually somewhat harder, and is the subject of much research (especially with the key focus of: how to do it fast).
As an initial prototype, I decided to go with a fairly naive approach, one that wouldn't require too much fine-tuning of special search trees (i.e. easy to code), at least until I was sufficiently happy that it would all work nicely. That approach effectively boils down to O(m*n) complexity, where m and n are the numbers of vertices in the target and source meshes respectively.
From this analysis, one early idea for optimising the search was to make users first roughly select the visually similar parts in both meshes, reducing the size of the problem domain by having the algorithm only search the relevant areas. While this would have some positive side effects, such as preventing the algorithm from doing silly things (i.e. matching verts from opposing parts of the mesh, or from regions we're not interested in), it would still only bring the algorithm down to O(n^2) in the best case (with m = n), and it would be a bit more of a pain to work with, as you'd need to prepare things correctly first. That's still certainly sluggish, though not as bad as Mesh Deform baking at higher resolutions.
(BTW, someone should seriously come up with a table of approximate baking times w.r.t. the resolution used and/or vertex counts/bounding-box sizes. It's frankly alarming to see the baking time skyrocket after bumping the resolution, with no indication of when it might end. If I had more hours in a day and more computing power available (and were probably writing a book on Blender and rigging), I'd work on getting such a comparison done, but it'll probably have to be left as an exercise for the reader.)

However, since an initial prototype is just about getting something working, bad speed was an issue I was somewhat prepared for.
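For illustration, the naive O(m*n) search boils down to something like the following toy sketch (not the script's actual code): every source vert is compared against every target vert, looking for exact coordinate matches, with no early exit and no pruning.

```python
# Toy illustration of the naive O(m*n) matching: for each of the n source
# verts, scan all m target verts for an exactly-matching coordinate.
# Deliberately unoptimised -- the inner loop always runs to completion.

def naive_match(source_verts, target_verts):
    """Return {source_index: target_index} for exactly-matching coordinates."""
    mapping = {}
    for si, sco in enumerate(source_verts):        # n source verts...
        for ti, tco in enumerate(target_verts):    # ...times m target verts
            if sco == tco and si not in mapping:   # keep the first hit only
                mapping[si] = ti
    return mapping
```

Every refinement discussed later is essentially about cutting down how much of that inner loop actually runs.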
How do we know that it works?
To test that this thing actually works, I created a little test suite (shown below) of cases that it should be able to handle. As with most test cases coders throw at things, it's quite simple, but it does try to pose some interesting problems that the tool must tackle.
Admittedly, I actually started testing with this production rig, created the test suite for the finer points midway through, and checked against the production rig sporadically after that.
Optimisations, Overlooked Islands, and Mistaken Matches
It's good that I was testing this using a production rig and not just the simple test-suite, as it helped uncover two main problems:
1) Although some slowness was to be expected, 10-20 second blackouts as the CPU went into overdrive looping over heaps of data weren't exactly nice to sit through. Obviously some optimisations would be needed to get this down to a more reasonable time (i.e. let's think about speed now).
2) Apparently, I'd tweaked a few more vertices on the face than I remembered. Apart from a slight nose tweak, I'd actually saved over the file after trying to widen the mouth a bit. These changes left the face mesh with a few patches where the verts in the target mesh differ from the source mesh. These areas can be seen below (courtesy of a debugging feature I added early on, which selects all the vertices that were successfully matched absolutely, leaving the non-matching parts unselected).
Initially, I feared that tackling the first problem would require coding up some spatial search tree to get this working in any kind of reasonable timeframe. Fortunately though, a few classic optimisation tricks saved the day. These were our good friends:
- stop when "good enough" (here, "good enough" meant stopping at the first "exact match", i.e. when the coordinates of a source/target vert pair were exactly the same). This could backfire if there are duplicate verts around the place, though then again, those are evil and shouldn't occur in meshes, right? ;)
- reduce the search space when we find a solution. That is, when we find a match, remove it from the list of possible options, to avoid having to test for it again in later attempts.
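Those two tricks can be sketched like so (again a pure-Python illustration, not the script's actual code): break out of the inner loop at the first exact match, and delete each matched target from a shrinking pool of candidates.

```python
# Brute-force matching with the two optimisations above:
#   1. early exit -- the first exact coordinate match wins, so the inner
#      loop usually terminates long before scanning every candidate;
#   2. shrinking pool -- matched targets are removed from `candidates`,
#      so each later search scans a strictly smaller set.

def match_verts(source_verts, target_verts):
    """Return {source_index: target_index} for exactly-matching coordinates."""
    candidates = dict(enumerate(target_verts))  # unmatched target verts
    mapping = {}
    for si, sco in enumerate(source_verts):
        for ti, tco in candidates.items():
            if sco == tco:           # "good enough": first exact match
                mapping[si] = ti
                del candidates[ti]   # never consider this target again
                break
    return mapping
```

The worst case is still O(m*n), but on real meshes, where most verts do have an exact partner, the average cost drops dramatically, which matches the speedup observed below.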
Runtime for the above mesh fell to around 2-4 seconds after this, which was "good enough" for now. For novel tools, it's sometimes acceptable that they take a while, especially when they're doing a fair bit of copying and pattern matching in Python.
(The progress up to this point amounts to test object "simple", which is just the original mesh with a bit of extra geometry).
---
However, that second problem was the one that caused the most grief. It took a good few attempts before I finally got all the required elements into an additional set of steps that run after the initial matching has been done.
The approach I decided on was guided by an observation about the nature of these "tweaked islands" in the source file: they are surrounded by a ring of matched (selected) verts which we now know something about, and the islands effectively form an onion arrangement, with layers of verts in loops, the inner loops being potentially unusable (as a result of new topology being added, such as horns, or some of the vertices having been merged together to match another chunk).
(This led to two more test cases being created: "Offsetted", which just shunts the inner circles around but keeps the same number of verts, so that we can still potentially match up the shapes again; and "Destroyed", where the centre ring has been reduced to a lower-vert-count shape.)
The current approach used to solve these "unmatched islands" in a topologically-orientated fashion is therefore:
1) Identify connectivity info for unmatched vertices on the source and target meshes. The idea is to gather the edge data into a data structure that is faster to search, so that we can quickly query whether there is an edge (i.e. an undirected link) between a vertex we're considering and one that we might already know about.
2) For each source vertex not yet matched, identify which already-found vertices it connects to (i.e. identifying the vertex loop just inside the loop of selected verts).
3) We then go through the target vertices doing the same thing. If a target vertex connects to the same known verts that the source vertex did, then it "should" be a match, though not quite in all cases. So we add this vert to a "pending" set that we'll handle once we've gone through the current shell's possibilities.
4) This step has yet to be fully fleshed out, but the general idea is that we need to check whether the neighbouring verts exist in the pending pile or the known/positively-identified pile. This is to combat the problem shown in the last test case, where currently the four unchanged verts there get picked up, while every other vert in the ring is ignored because it doesn't link up with anything we knew before.
5) Perform several passes of the steps above over the mesh (currently hardcoded at 5 passes, but likely to become an operator property) to tackle the nested shells. This solves the problem of inner shells being ignored because they don't connect to any currently "known" verts, which makes them hard to identify as being either legitimately within the area of interest or just "junk" in the rest of the mesh. This was the last hurdle to getting this working on the production mesh.
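The steps above can be roughly sketched as follows. This is a simplified pure-Python model (edges are index pairs, and all the names are illustrative), and it omits the not-yet-finished neighbour checks from step 4:

```python
from collections import defaultdict

def build_adjacency(edges):
    """Step 1: turn an edge list into {vert: set(neighbours)} for fast lookups."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def grow_matches(mapping, src_adj, tgt_adj, src_unmatched, tgt_unmatched, passes=5):
    """Steps 2-5: repeatedly match unmatched verts via their known neighbours."""
    for _ in range(passes):
        pending = {}  # step 3: candidate matches found this pass
        for sv in list(src_unmatched):
            # Step 2: the already-matched neighbours of sv, in target indices
            known = {mapping[n] for n in src_adj[sv] if n in mapping}
            if not known:
                continue  # inner shell -- becomes reachable in a later pass
            for tv in tgt_unmatched:
                if known <= tgt_adj[tv]:  # connects to the same known verts
                    pending[sv] = tv
                    break
        # Commit this shell's matches, shrinking both unmatched sets
        for sv, tv in pending.items():
            mapping[sv] = tv
            src_unmatched.discard(sv)
            tgt_unmatched.discard(tv)
    return mapping
```

Each pass peels one layer off the "onion": verts adjacent to the known ring get matched first, which makes the next layer in reachable on the following pass.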
Great! Let us try it now!
It'd be terribly cruel of me to discuss all this without giving you all a taste of what this can do. So, here are two links to get you on the way:
https://sites.google.com/site/aligorith/remap_shapekeys.py?attredirects=0&d=1 <-- the script itself.
It's probably possible to register/install this as an addon, but for now I'd recommend just loading it into the text editor and running it to get it ready for use.
https://sites.google.com/site/aligorith/remapShapekeys_Test-01.blend?attredirects=0&d=1 <-- the test suite
So, how do we use this?
Well, once you've got the script loaded into Blender and the operators registered:
1) Select all meshes that you'd like to have shapekeys copied to (i.e. your "target" meshes)
2) Finally select the mesh that you'd like to copy shapekeys from (i.e. your "source" mesh)
3) In the 3D-View, Spacebar Search -> type "Remap" and select the "Remapped Shape Key Transfer" entry.
4) Wait a few seconds in some cases. You can check the console to see it slowly ticking over if it's taking a while...
5) The shapekeys should have now been transferred. All the "absolute" matched verts will be selected in editmode on both source and target meshes.
There's also a second operator registered by this script, which was originally intended as a possible initial step but is now probably redundant: it selects the verts that are affected by shape keys. To use it, search for the "Select Shape Key Deformed Vertices" operator.
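The idea behind that second operator is simple: a vertex counts as "deformed" if any shapekey moves it away from the basis shape. As a pure-Python sketch (the real operator of course reads bpy shape key data, and the function name here is made up):

```python
# A vert is "affected" by shapekeys if its position in any key differs from
# its basis position by more than a small tolerance.

def deformed_verts(basis, keys, epsilon=1e-6):
    """Return the indices of verts moved by at least one shapekey."""
    affected = set()
    for key in keys:
        for i, (b, k) in enumerate(zip(basis, key)):
            if any(abs(kc - bc) > epsilon for kc, bc in zip(k, b)):
                affected.add(i)
    return affected
```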
What's Next?
There are still a number of todos I need to address before making a proper 1.0 release and/or adding to online repositories (i.e. bf-extensions). Namely:
1) Improve the speed of the "tweaked island search". It should be possible to squeeze more speed out of this just by changing the placement of some of those inner loops over the target verts, and perhaps with some other tweaks I haven't thought of yet!
2) Finish the code to make "Destroyed" test case work correctly
3) Get driver-copying working (pending some PyAPI additions to make copying drivers easier).
4) Other assorted code cleanup
Very interesting stuff, but might I add this sort of attribute transfer across varying topology meshes could be useful for a lot more than just shapes - UVs, vertex weights, etc. Blender's been missing these sorts of tools for a long time, something to keep in mind...
Not to mention EXTREMELY useful, so it may seem. I'm about to test this right now to see if it isn't out of date.
Good point.
Fortunately I've tried to keep the mapping part relatively separate so that it could be used in future.
One potential issue when dealing with UVs though is that this method is more vertex-centric, while UVs seem more face-centric. Though there are ways around that :)
This is a feature I've missed, coming from other 3D packages. Looks like you're hot on the trail of a solution with this test suite! :-) I appreciate you taking the time to write the entire blog post, lots of great info in here.
I need such a feature CONSTANTLY on animation projects, however, I have to agree that as @Matt mentioned, such a feature for vertex weighting and vertex coloring has been a feature I've yearned to have in blender for years now.... you've just possibly come up with a method that could solve this for Shapekeys AND vCol/vWeights, WOW. These are industry standard tools that Blender desperately needs to compete in the modern animation workspace. I can only hope this project matures and is developed to the point of a merge into trunk (with vCol & vWeight support). Thank you VERY VERY much for your hard work thus far. testing time ;-)
Absolutely great! I played with it for some time and it works great for me, except in one case: I noticed that if we add some more vertices/faces/loops in the area that is affected by the shapekey, the script doesn't respect the newly added geometry. I suppose this is down to the nature of shapekeys, but if there's a solution for that, it'll be even more awesome!
PS: excuse me if I missed something...
My current workflow in modelling in 2.57+ still includes 2.49's excellent "copy vertex weights" script.
It is very time-consuming to model and rig the main body, model the props and stuff for it, export them to 2.49, apply the copy weights, and reimport them again. Any other workflow I have tried, like manually copying/painting weights, is even worse... We need someone heroic and talented to tackle this matter... ;P , but who could it be?... ;P hey look at the sky!!!... is...is...
I want to create a 3D model with 2D eyes, but I have no idea how to rig or animate 2D eyes. What should I do?
Thank you for this script!
ReplyDeleteIt works like a charm and safed me a lot of work.
It doesn't work in Blender 2.68. Please correct it.
Very useful script, saves a lot of time. Thnx
Did you ever get around to polishing this and making a proper 1.0 version?
I'm in desperate need of this functionality just now ;-;