Meson warns when doing that, as it's not really portable.
Since we're using platform-specific linker flags on Darwin, we can also
do the same on Linux; the syntax is GCC-specific, so we're going to need
Clang users to test it.
Adding the offset node broke serialization in two ways (sketched
below):
1. We store the enum value in the node, so existing entries must keep
their values
2. The offset node was missing from the deserialization lookup table
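A minimal sketch of both constraints; all identifiers below are
illustrative stand-ins, not the real GSK names:

    /* Serialized node-type IDs must stay stable, so the new entry is
     * appended instead of renumbering existing ones: */
    typedef enum {
      DEMO_CONTAINER_NODE = 1,
      DEMO_TRANSFORM_NODE = 2,
      DEMO_OFFSET_NODE    = 3   /* new node: appended at the end */
    } DemoNodeType;

    typedef struct _DemoNode DemoNode;
    typedef DemoNode * (* DemoParseFunc) (const guchar *data, gsize size);

    /* Deserialization maps the stored ID back to a parser; forgetting
     * to register the new node here was the second breakage
     * (the parse_* functions are declared elsewhere): */
    static const DemoParseFunc parsers[] = {
      [DEMO_CONTAINER_NODE] = parse_container_node,
      [DEMO_TRANSFORM_NODE] = parse_transform_node,
      [DEMO_OFFSET_NODE]    = parse_offset_node,   /* was missing */
    };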
Instead of fiddling around with the scale in GtkIconHelper (and
getting it wrong), wrap the paintable in a GtkScaler that takes care
of the scaling.
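A hedged sketch of the pattern; GtkScaler is GTK-internal and the
constructor shown here is an assumption:

    /* Wrap once; the wrapper reports its size divided by 'scale' and
     * renders the child accordingly, so callers can treat it like any
     * other GdkPaintable without touching the scale themselves: */
    GdkPaintable *scaled = gtk_scaler_new (paintable, scale);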
This is the snapshot equivalent of pango_cairo_show_layout().
Not to be confused with gtk_snapshot_render_layout(), which is the
equivalent of gtk_render_layout().
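A usage sketch, assuming the new function is
gtk_snapshot_append_layout() and today's save/translate/restore API,
with snapshot, layout, color, x and y in scope:

    gtk_snapshot_save (snapshot);
    gtk_snapshot_translate (snapshot, &GRAPHENE_POINT_INIT (x, y));
    gtk_snapshot_append_layout (snapshot, layout, &color);
    gtk_snapshot_restore (snapshot);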
This is a special case of the transform node that does a 2D translation.
The implementation in the Vulkan and GL renderers is crude and just does
the same as the transform node.
Nothing uses that node yet.
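Conceptually, offset (dx, dy) is just a transform whose matrix is a
pure 2D translation; a sketch using graphene, assuming a transform
node built from a matrix:

    graphene_matrix_t m;

    graphene_matrix_init_translate (&m,
                                    &GRAPHENE_POINT3D_INIT (dx, dy, 0));
    /* an offset node for (dx, dy) then behaves like a transform node
     * created with 'm', which is also how the crude Vulkan/GL paths
     * can treat it */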
When drawing onto a recording surface, source surfaces get cached.
But if we g_free() the surface data after we're done, that cache ends
up pointing at invalid data.
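The problematic pattern, as a sketch:

    /* 'cr' targets a cairo recording surface; the recording keeps a
     * reference to 'surface', which still points into 'data': */
    surface = cairo_image_surface_create_for_data (data,
                                                   CAIRO_FORMAT_ARGB32,
                                                   width, height, stride);
    cairo_set_source_surface (cr, surface, 0, 0);
    cairo_paint (cr);
    cairo_surface_destroy (surface);
    g_free (data);   /* the recording surface can still read this */

One conventional fix is to hand the buffer to cairo with
cairo_surface_set_user_data() and g_free as the destroy notify, so the
data lives exactly as long as the surface that uses it.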
If G_ENABLE_CONSISTENCY_CHECKS is defined (i.e. if our buildtype is
'debug'), add an OpenGL debug callback that prints all debug messages
with a severity higher than SEVERITY_NOTIFICATION as a warning to the
console.
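A sketch of such a callback using the KHR_debug entry points (the
actual GDK setup code will differ):

    static void
    gl_debug_message (GLenum source, GLenum type, GLuint id,
                      GLenum severity, GLsizei length,
                      const GLchar *message, const void *user_data)
    {
      if (severity == GL_DEBUG_SEVERITY_NOTIFICATION)
        return;

      g_warning ("OpenGL: %s", message);
    }

    #ifdef G_ENABLE_CONSISTENCY_CHECKS
      glEnable (GL_DEBUG_OUTPUT);
      glDebugMessageCallback (gl_debug_message, NULL);
    #endif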
Turns out that GCC errors out when building the GLib test suite, as it
now checks for overflows in allocator functions, and we're testing for
those.
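A sketch of the kind of call involved: with GCC's
-Walloc-size-larger-than= check (on by default) and -Werror, a
deliberately impossible allocation no longer compiles:

    gpointer p = g_malloc (G_MAXSIZE);   /* warning, error with -Werror */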
This would not be an issue for GTK, but since we're building GLib as a
subproject, we get failures for those as well.
Until we can find out how to disable errors for subprojects, or fix
the GLib test suite so it does not trigger those warnings in GCC, we
are going to live without treating compiler warnings as errors for a
while.
This way, we can postpone the actual rendering of the node until it
reaches the renderer. This allows the renderer to choose the right
scale to render at, so it can decide to use 2x scale for hidpi on its
own.
Last but not least, it makes all nodes independent of the context they
are created in, because they do not need to know at snapshot time what
they will ultimately be rendered into.
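A hedged sketch of the resulting flow, with all names hypothetical:
the node keeps its source data, and the renderer rasterizes it at a
scale of its own choosing.

    static void
    render_node (DemoRenderer *renderer, GskRenderNode *node)
    {
      /* the renderer, not the snapshot code, picks the scale: */
      int scale = demo_renderer_get_scale (renderer);  /* e.g. 2 on hidpi */

      demo_renderer_draw (renderer, rasterize_node (node, scale));
    }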
Set the display for each event that we put.
Also reorganize the dnd_event_put() function a bit, passing it a
surface directly instead of setting one by implication.
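A sketch of the reorganized helper; the exact signature is internal,
and the field and setter names below are assumptions:

    static void
    dnd_event_put (GdkEventType type, GdkSurface *surface,
                   guint32 time, gboolean emit)
    {
      GdkDisplay *display = gdk_surface_get_display (surface);
      GdkEvent *event = gdk_event_new (type);

      event->any.surface = g_object_ref (surface);  /* field name assumed */
      gdk_event_set_display (event, display);       /* setter name assumed */

      if (emit)
        gdk_display_put_event (display, event);
    }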
https://bugzilla.gnome.org/show_bug.cgi?id=773299
dest_surface is always going to be NULL for source contexts.
Previously we put the root window there to pass this check, but root
windows are gone (and root surfaces never existed to begin with), so
we have to adapt.
https://bugzilla.gnome.org/show_bug.cgi?id=773299
This affects gdk_device_query_state() for the virtual device. It has
no window, so it is forced to query the display itself, and the
display defaults its scale to 1 even on HiDPI desktops. Use the same
"query the scale of a NULL monitor" trick that we use in other places
to get the global desktop scale.
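A sketch of the trick; the helper name is assumed, the real one lives
in the X11 backend:

    /* passing NULL as the monitor returns the global desktop scale
     * instead of the per-display default of 1: */
    int scale = get_monitor_scale_factor (x11_display, NULL);

    x = root_x / scale;
    y = root_y / scale;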
https://bugzilla.gnome.org/show_bug.cgi?id=773299