Introduce a new GskGpuImageDescriptors object that tracks descriptors
for a set of images that can be managed by the GPU.
Then have each GskGpuShaderOp just reference the descriptors object it is
using, so that the code can set things up properly.
To reference an image, the ops now just reference their descriptor -
which is the uint32 we've been sending to the shaders since forever.
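A minimal sketch of the idea (not the real GTK API; all types and function
names here are simplified stand-ins): the descriptors object collects images
and hands out uint32 descriptors, and each shader op keeps a pointer to the
descriptors object plus the descriptor of the image it samples.

    #include <stdint.h>
    #include <stdlib.h>

    typedef struct _Image Image;          /* stand-in for GskGpuImage */

    typedef struct {
      Image  **images;                    /* images bound for this set of ops */
      size_t   n_images;
      size_t   capacity;
    } ImageDescriptors;                   /* stand-in for GskGpuImageDescriptors */

    typedef struct {
      ImageDescriptors *desc;             /* which descriptor set to bind */
      uint32_t          image_descriptor; /* the uint32 passed to the shader */
    } ShaderOp;                           /* stand-in for GskGpuShaderOp */

    /* Add an image, returning the descriptor the shader uses to look it up. */
    uint32_t
    image_descriptors_add (ImageDescriptors *self,
                           Image            *image)
    {
      if (self->n_images == self->capacity)
        {
          self->capacity = self->capacity ? 2 * self->capacity : 8;
          self->images = realloc (self->images, self->capacity * sizeof (Image *));
        }
      self->images[self->n_images] = image;
      return (uint32_t) self->n_images++;
    }

    /* An op only remembers the descriptors object and its descriptor; at
     * submit time the renderer binds the descriptors object and the shader
     * indexes into the bound images with the uint32. */
    void
    shader_op_init (ShaderOp         *op,
                    ImageDescriptors *desc,
                    Image            *image)
    {
      op->desc = desc;
      op->image_descriptor = image_descriptors_add (desc, image);
    }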
The env var allows skipping various optimizations in the GPU shader.
This is useful for testing during development when trying to figure
out how to make a renderer as fast as possible.
We could also use it to enable/disable optimizations depending on the GL
version or similar, but I haven't thought about that much yet.
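As a sketch of the mechanism (assuming GLib; the variable name and the flag
names below are made up for the example, they are not the real ones): parse a
comma-separated list of optimization names from the environment into a bitmask
of things to skip, and check that mask when setting up the shader.

    #include <glib.h>

    typedef enum {
      SKIP_OPTIMIZATION_A = 1 << 0,   /* hypothetical optimization names */
      SKIP_OPTIMIZATION_B = 1 << 1,
    } SkipFlags;

    static const GDebugKey skip_keys[] = {
      { "optimization-a", SKIP_OPTIMIZATION_A },
      { "optimization-b", SKIP_OPTIMIZATION_B },
    };

    guint
    get_skip_flags (void)
    {
      /* "RENDERER_SKIP" is a placeholder, not the real variable name */
      return g_parse_debug_string (g_getenv ("RENDERER_SKIP"),
                                   skip_keys, G_N_ELEMENTS (skip_keys));
    }

    /* Later, during shader setup:
     *
     *   if (!(get_skip_flags () & SKIP_OPTIMIZATION_A))
     *     enable_optimization_a ();   // hypothetical helper
     */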
The previous algorithm would reverse the order of subpasses, which leads
to unexpected behavior if dependent subpasses are not added as children
of a subpass, but just as a previous subpass - like when a subpass is
used multiple times later.
An example of this is a shadow node with multiple shadows - the source
of the shadow is used by multiple shadows.
So ensure that adjacent subpasses stay in the same order.
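To illustrate the ordering issue (names and the list-based collection below
are illustrative only, not the actual algorithm): if newly discovered
subpasses are prepended to the list of passes to run first, adjacent siblings
end up reversed, and a pass that a later sibling reads from - like the shadow
source above - gets scheduled after its user. Appending keeps adjacent
subpasses in the order they were added.

    #include <glib.h>

    typedef struct { const char *name; } SubPass;

    static void
    print_order (const char *label, GList *passes)
    {
      GList *l;
      g_print ("%s:", label);
      for (l = passes; l; l = l->next)
        g_print (" [%s]", ((SubPass *) l->data)->name);
      g_print ("\n");
    }

    int
    main (void)
    {
      SubPass source  = { "shadow source" };
      SubPass shadow1 = { "shadow 1 (reads source)" };
      SubPass shadow2 = { "shadow 2 (reads source)" };
      SubPass *added[] = { &source, &shadow1, &shadow2 };
      GList *reversed = NULL, *in_order = NULL;
      guint i;

      for (i = 0; i < G_N_ELEMENTS (added); i++)
        {
          /* old behavior: prepending reverses adjacent subpasses, so the
           * shadow source runs after the shadows that read it */
          reversed = g_list_prepend (reversed, added[i]);
          /* new behavior: adjacent subpasses stay in the same order */
          in_order = g_list_append (in_order, added[i]);
        }

      print_order ("old", reversed);
      print_order ("new", in_order);

      g_list_free (reversed);
      g_list_free (in_order);
      return 0;
    }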
They're done using the pattern shader.
The pattern shader has gained a stack where vec4s can be pushed and
popped later, which allows storing the position before computing
the new position inside the repeat node's child.
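A minimal C sketch of that stack (the real code lives in the GLSL pattern
shader; names and the stack depth are made up here) - it just saves the
position before entering the repeat node's child and restores it afterwards:

    #define STACK_DEPTH 8

    typedef struct { float v[4]; } Vec4;

    typedef struct {
      Vec4 entries[STACK_DEPTH];
      int  top;
    } Vec4Stack;

    void
    vec4_stack_push (Vec4Stack *stack, Vec4 value)
    {
      stack->entries[stack->top++] = value;
    }

    Vec4
    vec4_stack_pop (Vec4Stack *stack)
    {
      return stack->entries[--stack->top];
    }

    /* Roughly mirroring the repeat pattern:
     *
     *   vec4_stack_push (&stack, position);          // save position
     *   position = wrap_to_child_bounds (position);  // hypothetical helper
     *   ... evaluate the child's pattern ...
     *   position = vec4_stack_pop (&stack);          // restore position
     */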
When rendering an offscreen with patterns, get_node_as_image() may spawn
a new buffer writer that writes into the same buffer.
So as a more or less hacky workaround, we now abort the current buffer
write and restart it once we've created the image.
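A self-contained toy of that flow (assuming GLib; the real writer and
renderer types in GSK look different, this only shows the abort-and-restart
idea): a writer appends into a shared buffer and remembers where its write
started, so aborting can truncate back to that point before the offscreen
renders, and a fresh write starts afterwards.

    #include <glib.h>

    typedef struct {
      GByteArray *buffer;   /* shared buffer all writers append into */
      guint       start;    /* where this writer's data begins */
    } Writer;

    static Writer
    writer_begin (GByteArray *buffer)
    {
      return (Writer) { buffer, buffer->len };
    }

    static void
    writer_abort (Writer *writer)
    {
      /* throw away everything this writer has appended so far */
      g_byte_array_set_size (writer->buffer, writer->start);
    }

    static void
    render_offscreen_with_patterns (GByteArray *buffer)
    {
      /* stands in for the offscreen render, which spawns its own
       * writer into the same buffer */
      Writer nested = writer_begin (buffer);
      guint8 data[4] = { 1, 2, 3, 4 };
      g_byte_array_append (nested.buffer, data, sizeof data);
    }

    int
    main (void)
    {
      GByteArray *buffer = g_byte_array_new ();
      Writer writer = writer_begin (buffer);
      guint8 half[2] = { 9, 9 };

      g_byte_array_append (writer.buffer, half, sizeof half);

      /* get_node_as_image() needs an offscreen: abort, render, restart */
      writer_abort (&writer);
      render_offscreen_with_patterns (buffer);
      writer = writer_begin (buffer);

      g_byte_array_unref (buffer);
      return 0;
    }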
Frames now carry a timestamp for when they are used.
This is mainly intended to attach timestamps to cached items (textures
or glyphs), but it could in theory also be used when profiling.
We use wallclock time here, not server time, because it's cheaper and
because we're more interested in the local machine we're rendering on.
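A sketch of how such a timestamp can age cached items (assuming GLib;
g_get_monotonic_time() is used as the local clock here, and all types, names
and the eviction age are illustrative, not the real GSK ones):

    #include <glib.h>

    #define MAX_CACHE_AGE G_USEC_PER_SEC   /* evict after ~1s without use */

    typedef struct {
      gint64 timestamp;   /* set once when the frame starts */
    } Frame;

    typedef struct {
      gint64 last_use;    /* timestamp of the last frame that used this item */
      /* ... texture or glyph data ... */
    } CachedItem;

    void
    frame_begin (Frame *frame)
    {
      frame->timestamp = g_get_monotonic_time ();
    }

    void
    cached_item_use (CachedItem *item, const Frame *frame)
    {
      item->last_use = frame->timestamp;
    }

    gboolean
    cached_item_is_stale (const CachedItem *item, const Frame *frame)
    {
      return frame->timestamp - item->last_use > MAX_CACHE_AGE;
    }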
This moves over an initial chunk of code from the Vulkan renderer to
execute shaders.
The only shader that exists for now is a shader that draws a single
texture.
We use that to replace the blit op we were doing before.
For now, it just renders using cairo, uploads the result to the GPU,
blits it onto the framebuffer and then is happy.
But it can do that using both Vulkan and GL (no idea which version).
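As a sketch of that path (assuming cairo; the upload and blit are left as
comments because they depend on the Vulkan/GL setup, and the function here is
illustrative): render the scene into a cairo image surface, then hand the
resulting pixels to the GPU and blit them onto the framebuffer.

    #include <cairo.h>

    cairo_surface_t *
    render_frame_with_cairo (int width, int height)
    {
      cairo_surface_t *surface;
      cairo_t *cr;

      surface = cairo_image_surface_create (CAIRO_FORMAT_ARGB32, width, height);
      cr = cairo_create (surface);

      /* ... draw the render node tree with cairo here ... */
      cairo_set_source_rgb (cr, 1, 1, 1);
      cairo_paint (cr);

      cairo_destroy (cr);
      cairo_surface_flush (surface);

      /* cairo_image_surface_get_data (surface) now points at the pixels to
       * upload to a GPU texture (e.g. glTexImage2D or a Vulkan staging
       * buffer copy), which the renderer then blits onto the framebuffer. */
      return surface;
    }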
The most important thing still missing is shaders.
It also has a bunch of copy/paste from the Vulkan renderer that isn't
used yet.
But I didn't want to rip it out and then try to copy it back later.