When saving png files, encode the CICP flags of the color
state into the PNG.
When loading, decode the CICP flags if available and detect the
colorstate they use.
If we do not support the CICP tags, we do not load the image.
So far, we ignore the ICC profiles.
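A minimal sketch of how such a chunk could be written with libpng's generic unknown-chunk API (the chunk layout follows the PNG specification: four one-byte fields; the example values and the helper are illustrative, not the actual GTK code):
```c
#include <string.h>
#include <png.h>

static void
write_cicp_chunk (png_structp png, png_infop info)
{
  png_unknown_chunk chunk;
  /* Example values only: BT.709 primaries, sRGB transfer function,
   * identity (RGB) matrix coefficients, full-range flag set. */
  png_byte payload[4] = { 1, 13, 0, 1 };

  memcpy (chunk.name, "cICP", 5);
  chunk.data = payload;
  chunk.size = sizeof (payload);
  chunk.location = PNG_HAVE_IHDR;

  /* Depending on the libpng build, png_set_keep_unknown_chunks() may
   * also be needed so the chunk is actually written out. */
  png_set_unknown_chunks (png, info, &chunk, 1);
}
```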
Includes regeneration of the nodeparse test *reference* output to cover
the new tags we write to PNGs.
The original tests do not include those tags, so we implicitly test that
we read untagged files correctly.
We only download the data when we actually need it for writing into the
PNG stream.
This allows modifying the download parameters (in particular color state
in the next commit) while writing out their settings, so the code for
selecting the right colorstate lives in only one place.
We have to be careful though, because the download now happens after the
setjmp(), so we need to make sure the error path handles both cases
without leaking: where the download has happened and where it hasn't.
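A minimal sketch of that pattern (names are illustrative, not the actual GTK code): the buffer pointer is NULL before setjmp(), so the error path can free it whether or not the download has already happened. The volatile qualifier keeps the pointer value defined after longjmp().
```c
#include <png.h>
#include <gdk/gdk.h>

static gboolean
write_png_sketch (GdkTexture *texture,
                  png_structp png,
                  png_infop   info)
{
  guchar * volatile data = NULL;

  if (setjmp (png_jmpbuf (png)))
    {
      /* Error path: covers both the pre- and post-download case. */
      g_free (data);
      png_destroy_write_struct (&png, &info);
      return FALSE;
    }

  /* ... write the PNG header and metadata first ... */

  /* Download only when the pixel data is actually needed. */
  data = download_texture_data (texture);   /* hypothetical helper */

  /* ... png_write_image() / png_write_end() using `data` ... */

  g_free (data);
  return TRUE;
}
```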
We want to store some metadata in our symbolic pngs, so make it
possible to get options when loading a png, along with the texture.
Update all callers.
gdk_texture_save_to_png_bytes() cannot fail, so ensure that it doesn't.
Testsuite has been updated to check for this case.
Note that we do not load the PNG file that we generate here.
Loading is a lot more scary than saving after all.
If people want to load oversized PNG files, they should use a real PNG
loader.
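For illustration, a small usage sketch relying on that guarantee (the dump helper is hypothetical, not GTK code):
```c
#include <gdk/gdk.h>

static void
dump_texture_as_png (GdkTexture *texture, const char *path)
{
  /* gdk_texture_save_to_png_bytes() cannot fail, so no NULL check. */
  GBytes *bytes = gdk_texture_save_to_png_bytes (texture);
  GError *error = NULL;

  if (!g_file_set_contents (path,
                            (const char *) g_bytes_get_data (bytes, NULL),
                            g_bytes_get_size (bytes),
                            &error))
    {
      g_warning ("Failed to write %s: %s", path, error->message);
      g_clear_error (&error);
    }

  g_bytes_unref (bytes);
}
```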
We did have 4 ordering variations of ARGB straight,
but only 3 premultiplied. Add the missing one.
Update all the places where we switch over memory formats.
We need them for mask-only textures.
For tiffs, we convert the formats to RGBA (the idea that tiff can save
everything needs to be buried I guess) as tiffs can't do alpha-only.
The API docs outline why quite well.
This should make it possible to save textures to image files
without any private API, with the same featureset that GTK uses.
Also remove the gdktextureprivate.h include where
gdk_texture_get_format() was the only reason for it.
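A short usage sketch of that public saving API (assuming `texture` is a valid GdkTexture):
```c
gdk_texture_save_to_png (texture, "image.png");

GBytes *png_bytes  = gdk_texture_save_to_png_bytes (texture);
GBytes *tiff_bytes = gdk_texture_save_to_tiff_bytes (texture);
```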
clang complained that we may end up jumping
to the cleanup code without initializing data
in the jpeg code. Always initialize data to
NULL to prevent that eventuality.
Commit 59f6c50df8 set the memory limit to 100M,
which turns out to exclude some large, valid jpegs.
So, bump things to 300M, matching what was done
in gdk-pixbuf.
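A hedged sketch of where such a limit lives in libjpeg (the helper is illustrative; the exact wiring in GTK may differ):
```c
#include <stdio.h>
#include <jpeglib.h>

static void
setup_decompress (struct jpeg_decompress_struct *cinfo,
                  struct jpeg_error_mgr         *jerr)
{
  cinfo->err = jpeg_std_error (jerr);
  jpeg_create_decompress (cinfo);

  /* Bump the decoder memory ceiling to 300M (was 100M). */
  cinfo->mem->max_memory_to_use = 300 * 1024 * 1024;
}
```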
`free` is defined in `stdlib.h`, see for example
<https://pubs.opengroup.org/onlinepubs/009604499/functions/free.html>. Without
this include, compilation can fail with the following error:
```
../gdk/loaders/gdkjpeg.c: In function ‘gdk_save_jpeg’:
../gdk/loaders/gdkjpeg.c:264:7: warning: implicit declaration of function ‘free’ [-Wimplicit-function-declaration]
free (data);
^
../gdk/loaders/gdkjpeg.c:264:7: warning: incompatible implicit declaration of built-in function ‘free’
../gdk/loaders/gdkjpeg.c:264:7: note: include ‘<stdlib.h>’ or provide a declaration of ‘free’
../gdk/loaders/gdkjpeg.c:302:67: error: ‘free’ undeclared (first use in this function)
return g_bytes_new_with_free_func (data, size, (GDestroyNotify) free, NULL);
^
../gdk/loaders/gdkjpeg.c:302:67: note: each undeclared identifier is reported only once for each function it appears in
../gdk/loaders/gdkjpeg.c:303:1: warning: control reaches end of non-void function [-Wreturn-type]
}
^
```
libpng wants to receive samples in either RGB or RGBA order, regardless
of whether each sample is big-endian. This resolves test failures in
testsuite/gdk/memorytexture.c (and a lot of reftests) on s390x, and
probably the PowerPC family too.
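One common way to satisfy that with libpng (not necessarily the exact change made here) is to request byte swapping only on little-endian hosts, since libpng's native order for 16-bit samples is big-endian:
```c
#if G_BYTE_ORDER == G_LITTLE_ENDIAN
  /* In-memory 16-bit samples are little-endian; ask libpng to swap. */
  png_set_swap (png);
#endif
```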
Modifying the test to show the color in use and write out the PNG bytes
to a file, and running the memorytexture test on s390x, produces a PNG
that loads with the correct color values in GIMP (on an x86_64 machine),
which seems like evidence that this is the correct change and not just
compensating errors.
Resolves: https://gitlab.gnome.org/GNOME/gtk/-/issues/4616
Signed-off-by: Simon McVittie <smcv@debian.org>
Pass a format to GdkTextureClass::download(). That way we can download
data in any format.
Also replace gdk_texture_download_texture() with
gdk_memory_texture_from_texture() which again takes a format.
The old functionality is still there for code that wants it: Just pass
gdk_texture_get_format (texture) as the format argument.
For MemoryTexture, this is a simple change.
For GLTexture, we need to query the format at texture creation. This
sounds like a bad idea and extra work until one realizes that we'd
need to do that anyway when using the texture the first time - either
when downloading, or when trying to use it in a rendernode, where we
will soon need that information to determine if the texture prefers high
depth.
Also, now make gdk_memory_convert() the only conversion function
and allow conversions between any 2 formats by going via a float[4].
This could be optimized via fast-paths, but so far it isn't.
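A rough sketch of that conversion strategy (the helper names are hypothetical, not the actual gdk_memory_convert() internals):
```c
#include <gdk/gdk.h>

static void
convert_pixels (guchar          *dest,
                GdkMemoryFormat  dest_format,
                const guchar    *src,
                GdkMemoryFormat  src_format,
                gsize            n_pixels)
{
  for (gsize i = 0; i < n_pixels; i++)
    {
      float rgba[4];

      /* Decode one pixel of the source format into float RGBA ... */
      decode_pixel_to_float (rgba, src, src_format);      /* hypothetical */
      /* ... and encode it into the destination format. */
      encode_pixel_from_float (dest, dest_format, rgba);  /* hypothetical */

      src  += format_bytes_per_pixel (src_format);        /* hypothetical */
      dest += format_bytes_per_pixel (dest_format);
    }
}
```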
1. Change INSUFFICIENT_MEMORY to TOO_LARGE
GTK crashes on insufficient memory; we don't emit GErrors.
2. Split UNSUPPORTED into UNSUPPORTED_CONTENT and UNSUPPORTED_FORMAT
So we know if you need to find an RPM with a loader or curse at
the weird file.
3. Translate error messages; they are meant for end users.
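A sketch of the likely resulting GdkTextureError values after this change:
```c
typedef enum
{
  GDK_TEXTURE_ERROR_TOO_LARGE,
  GDK_TEXTURE_ERROR_CORRUPT_IMAGE,
  GDK_TEXTURE_ERROR_UNSUPPORTED_CONTENT,
  GDK_TEXTURE_ERROR_UNSUPPORTED_FORMAT,
} GdkTextureError;
```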
When loading, convert all >8-bit data to
GDK_MEMORY_R16G16B16A16_PREMULTIPLIED.
When saving, save all 8-bit formats as 8-bit RGBA,
and save all >8-bit formats as 16-bit RGBA.
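A hedged sketch of that depth selection on the saving side (the bit-depth helper is hypothetical):
```c
GdkMemoryFormat save_format;

if (bits_per_channel (gdk_texture_get_format (texture)) <= 8)  /* hypothetical helper */
  save_format = GDK_MEMORY_R8G8B8A8;
else
  save_format = GDK_MEMORY_R16G16B16A16;
```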