Wednesday, September 28, 2016

QML Tip: Making one ShaderEffect use the output of another ShaderEffect

I've been doing quite a lot of work in QML lately for one of my research projects. Recently, I ran into some problems when trying to apply ShaderEffects to "interesting" widgets/elements (i.e. anything that's not an Image, and/or is more complicated than a simple Rectangle). This post is a quick guide to the key issues here (and ways around them), since the causes/solutions are not exactly obvious from the documentation, and no other search hits come up about these issues...



1) How can I set up a "canvas" that I can draw on using GLSL shaders (vertex + fragment)?
The setup I'm currently using is something like this (the example shader here copies the pixels of the reference object we're cloning - so if the source is 'A', this is the 'B'):

Rectangle {
    id: canvas
   
    width: 200
    height: 200
    color: "black"
   
    layer.enabled: true           /* needed, or else the shader won't be applied */
    layer.effect: ShaderEffect {  /* this is how we basically get the shader to "draw" this rect */
        id: shader

        // .. custom properties that you might need to read from the shaders go here ...
        /* e.g. here we're going to reference some other object we're cloning */
        property variant cloneSource : objectA

        
        /* Here comes the fragment shader (i.e. per-pixel drawing routine) */
        fragmentShader: "
              /* texture coordinates for this fragment (0-1) */
              varying highp vec2 qt_TexCoord0;
              
              /* the object we're cloning - it gets passed in as a texture (if all goes right), as long as the identifier here matches the property above */
              uniform highp sampler2D cloneSource;
              
              void main(void)
              {
                   /* just simply map the cloneSource object's 
                    * pixels to this object...
                    */
                   gl_FragColor =  texture2D(cloneSource, qt_TexCoord0);
              }
        "
    }
}
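
By the way, objectA here can be any item the shader is able to sample pixels from (see question 2 below for what that takes) - an Image works out of the box. For example, something like this (with a made-up filename):

Image {
    id: objectA
    width: 200
    height: 200
    source: "someTexture.png"    /* placeholder - use your own image here */
}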


I dunno whether there is actually a better approach. From the alternative examples I've seen, you'd have to create two objects on the same level (i.e. the "dummy canvas" item and the ShaderEffect): the dummy canvas would be hidden, and is kind of pointless other than providing something for the ShaderEffect to base its drawing on (since you *need* to supply a "source" object to it), and you'd also need to keep the dummy canvas and the ShaderEffect dimensions in sync by hand (see the sketch below). These complications are fiddly and make the code look nasty, so I don't really like that approach.
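
For comparison, here's a rough (untested) sketch of what that sibling-objects approach looks like, using the stock ShaderEffectSource type - all the ids here are made up for illustration:

/* The "dummy canvas" - only exists to give the effect something to draw from */
Rectangle {
    id: dummyCanvas
    width: 200
    height: 200
    color: "black"
}

ShaderEffect {
    /* the dimensions have to be kept in sync with the dummy canvas by hand */
    width: dummyCanvas.width
    height: dummyCanvas.height

    /* wrap the dummy canvas in a texture; hideSource stops it from also
     * getting drawn on its own
     */
    property variant source: ShaderEffectSource {
        sourceItem: dummyCanvas
        hideSource: true
    }

    fragmentShader: "
          varying highp vec2 qt_TexCoord0;
          uniform highp sampler2D source;

          void main(void)
          {
               gl_FragColor = texture2D(source, qt_TexCoord0);
          }
    "
}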


2) How do I get a semi-complex item (in particular, control widgets provided via QtQuick.Controls) to not render black/white?
This was the first issue I ran into when trying to feed stuff into my ShaderEffects to start playing with them. The solution is sort of given in the documentation, but it's cryptic enough that it's easy to miss that what they're describing there is the key to solving this problem.

So, if you just went ahead and took the "shader object" I provided above (herein referred to as B), and tried to get it to render a Button widget  (i.e. cloneSource : buttonWidgetA), you'd likely be met with a disappointing black rectangle. What happened?

Well, at least on Windows, it appears that the QML engine ends up rendering those button widgets using native widgets (or Qt's own PushButton widgets/things like that, I'm guessing), meaning that the contents of the widget are "not visible" to OpenGL. In contrast, "simple" things like lots of nested Rectangles seemed to work quite well for me without any extra steps, though I wouldn't rely on luck here...


To make the button (or really, any visual entity derived from Item) "visible" to OpenGL, you need to include the line:
layer.enabled: true
when defining the properties of that object, much like how you'd tell it where you want it to appear, etc. What this line does is tell QML to render that Item (and all its children) to a texture buffer. That texture buffer can then be passed to the shader (via the "property variant" line), where it appears as a sampler2D whose colors/contents the shader can actually read.
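
So, for the Button case from earlier, something along these lines should do the trick (assuming QtQuick.Controls 1.x here - adjust the import versions to taste; buttonWidgetA is just a placeholder id):

import QtQuick 2.5
import QtQuick.Controls 1.4

Button {
    id: buttonWidgetA
    text: "Click Me"

    /* render this widget (and all its children) to a texture,
     * so that shaders can actually see its pixels
     */
    layer.enabled: true
}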

There is one other important caveat I have to mention here though: turning on this property will cause the drawing of the item and its children to get clipped to the bounds of that item. An important consequence is that if you have particles being emitted away from the edges of your root item, they're going to get clipped, unless you sufficiently enlarge the region (or reshuffle the hierarchy of your items so that the particle system sits outside the set of stuff getting rendered + clipped to a texture, as sketched below). You have been warned (it's a lesson I learned the hard way, and one that took ages to debug ;)
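
To illustrate the "reshuffle the hierarchy" option, here's a rough structural sketch (emitters/particle painters omitted, and all ids made up):

import QtQuick 2.5
import QtQuick.Particles 2.0

Item {
    id: root

    /* only this subtree gets rendered + clipped to a texture */
    Item {
        id: layeredContent
        anchors.fill: parent
        layer.enabled: true

        /* ... the stuff the shaders need to see goes here ... */
    }

    /* the particle system lives outside the layered subtree, so its
     * particles can fly past layeredContent's bounds without
     * getting clipped away
     */
    ParticleSystem {
        id: particles

        /* ... emitters + particle painters go here ... */
    }
}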


3) So, shouldn't it be simple to just chain together two ShaderEffects then? That is, have a "shader object" A which generates a bunch of pixels, and then a second "shader object" B which takes those pixels and does other magic on them before showing them? Why then does B only render white pixels (but not if it is fed a normal Image or normal object)?!

To answer this question, we have to look again at what's happening with the texture buffers and all the rendering machinery. Remember that both "A" and "B" are defined using the "shader object" template shown above. Well, it turns out that the fancy "pixel generating" stuff we were doing in A's fragment shader didn't get saved to "A's texture", but to "A's layer buffer" instead. That's because "A's texture" is the thing that got passed to the shader to use as its source. So when "B" comes along and requests the pixels of "A" to slurp, it ends up being given "A's texture", NOT "A's layer buffer", since the former already exists while the latter may only have been used for on-screen rendering. Bahooey!

Therefore, if you need to chain two ShaderEffects together, you need to wrap object "A" in an Item of the same size that has the "layer.enabled: true" line included, and then refer to that wrapper (instead of "A" itself) as the cloneSource.

Let's see an example:
/* "A" - The Source/Generator */
Item {
    id: aWrapper
    width: aCanvas.width
    height: aCanvas.height
    layer.enabled: true      /* render the wrapped shader to a texture */
   
    Rectangle {
        id: aCanvas
        
        width: 200
        height: 200
        color: "black"
   
        layer.enabled: true           /* needed, or else the shader won't be applied, but aCanvas's layer now cannot contain the shader result! */
        layer.effect: ShaderEffect {  /* this is how we basically get the shader to "draw" this rect */
            id: aShader

            /* Here comes the fragment shader (i.e. per-pixel drawing routine) */
            fragmentShader: "
                  #define M_PI 3.1415926535897932384626433832795
                  
                  /* texture coordinates for this fragment (0-1) */
                  varying highp vec2 qt_TexCoord0;
                  
                  void main(void)
                  {
                       /* Make some kind of pattern to show the effect
                        * (on red and blue channels, green is later added)
                        */ 
                       gl_FragColor = vec4(sin(M_PI * qt_TexCoord0.x),   /* red */
                                           0,                            /* green */
                                           qt_TexCoord0.y,               /* blue */
                                           1);
                  }
            "
        }
    }
}


/* "B" - The Renderer */
Rectangle {
    id: bCanvas
   
    width: 200
    height: 200
    color: "black"
   
    layer.enabled: true           /* needed, or else the shader won't be applied */
    layer.effect: ShaderEffect {  /* this is how we basically get the shader to "draw" this rect */
        id: bShader

        /* Here we reference the wrapper object for "A"  */
        property variant cloneSource : aWrapper

        /* Here comes the fragment shader (i.e. per-pixel drawing routine) */
        fragmentShader: "
              /* texture coordinates for this fragment (0-1) */
              varying highp vec2 qt_TexCoord0;
              
              /* the object we're cloning - it gets passed in as a texture (if all goes right), as long as the identifier here matches the property above */
              uniform highp sampler2D cloneSource;
              
              void main(void)
              {
                   /* Map the cloneSource object's pixels to this object,
                    * then add some extra tinting on top...
                    */
                   gl_FragColor = texture2D(cloneSource, qt_TexCoord0) +
                                  vec4(0, 1, 0, 1);
              }
        "
    }
}
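
One last note: for bShader's cloneSource binding to resolve, both of these snippets obviously need to live somewhere where aWrapper's id is in scope (e.g. side by side in the same parent item).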
