[mutter/wip/rstrode/rhel-8.0.0: 19/30] cogl: pick glReadPixels format by target, not source
- From: Ray Strode <halfline src gnome org>
- To: commits-list gnome org
- Subject: [mutter/wip/rstrode/rhel-8.0.0: 19/30] cogl: pick glReadPixels format by target, not source
- Date: Mon, 28 Jan 2019 15:51:12 +0000 (UTC)
commit d76a86d172e4a326dcf7f46e22e2121764ba4e9c
Author: Pekka Paalanen <pekka paalanen collabora co uk>
Date: Tue Nov 6 16:27:52 2018 +0200
cogl: pick glReadPixels format by target, not source
Presumably glReadPixels itself can do pixel format conversions more
efficiently than a fix-up conversion on the CPU afterwards. Hence,
pick required_format based on the destination rather than the source,
so that the read has a better chance of avoiding the fix-up conversion.
With CoglOnscreen objects, CoglFramebuffer::internal_format (the source
format) is also wrong: it is left at a default value and never updated
to reflect reality. In other words, read-pixels used an arbitrary
intermediate pixel format for glReadPixels and then relied on the
fix-up conversion to produce the destination format.
The render buffers (GBM surface) are allocated as DRM_FORMAT_XRGB8888.
If the destination buffer is allocated as the same format, the Cogl
read-pixels first converts with glReadPixels XRGB -> ABGR because of the
above default format, and then the fix-up conversion does ABGR -> XRGB.
This case was observed with DisplayLink outputs, where the native
renderer must use the CPU copy path to fill the "secondary GPU"
framebuffers.
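To see why that round trip is pure waste, here is a standalone check
(packing and byte order are illustrative, assuming little-endian
XRGB8888/ABGR8888) that the two conversions simply cancel out:

/* Demonstration that converting XRGB -> ABGR and back yields the
 * original pixel, i.e. the old path did strictly redundant per-pixel
 * work in this case.  Packing conventions are illustrative only. */
#include <stdint.h>
#include <assert.h>

static uint32_t
xrgb_to_abgr (uint32_t p)
{
  uint32_t r = (p >> 16) & 0xff;
  uint32_t g = (p >> 8) & 0xff;
  uint32_t b = p & 0xff;

  return 0xff000000u | (b << 16) | (g << 8) | r;
}

static uint32_t
abgr_to_xrgb (uint32_t p)
{
  uint32_t r = p & 0xff;
  uint32_t g = (p >> 8) & 0xff;
  uint32_t b = (p >> 16) & 0xff;

  return (r << 16) | (g << 8) | b;
}

int
main (void)
{
  uint32_t pixel = 0x00336699; /* X, R=0x33, G=0x66, B=0x99 */

  assert (abgr_to_xrgb (xrgb_to_abgr (pixel)) == pixel);
  return 0;
}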
This patch stops using internal_format and uses the desired destination
format instead.
_cogl_framebuffer_gl_read_pixels_into_bitmap() will still use
internal_format to determine alpha premultiplication state and multiply
or un-multiply as needed. Luckily, all the formats involved in the
DisplayLink use case are _PRE, as is the default internal_format, so
things work in practice.
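That decision amounts to comparing the _PRE bit of the two formats;
roughly as below (COGL_A_BIT and COGL_PREMULT_BIT are real Cogl format
bits, but the helper itself is hypothetical):

/* Sketch of the premultiplication fix-up decision that still keys off
 * internal_format.  needs_premult_fixup() is a made-up name. */
#include <cogl/cogl.h>

static gboolean
needs_premult_fixup (CoglPixelFormat internal_format,
                     CoglPixelFormat dst_format)
{
  /* Formats without an alpha channel never need the fix-up. */
  if ((dst_format & COGL_A_BIT) == 0)
    return FALSE;

  /* Multiply or un-multiply only when the two disagree on _PRE. */
  return (internal_format & COGL_PREMULT_BIT) !=
         (dst_format & COGL_PREMULT_BIT);
}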
Furthermore, the GL texture_swizzle extension can never apply to
glReadPixels. Not even with FBOs, as found in this discussion:
https://gitlab.gnome.org/GNOME/mutter/issues/72
Therefore the target_format argument is hardcoded to something that can
never match anything, which will prevent the swizzle from being assumed.
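As a hypothetical sketch of the kind of driver check this defeats (the
real logic lives in the pixel_format_to_gl_with_target vtable
implementations):

/* Hypothetical sketch: with texture_swizzle, BGRA data can be stored
 * as RGBA and reordered at sampling time, but glReadPixels output is
 * never sampled.  Passing COGL_PIXEL_FORMAT_ANY as target_format
 * makes a comparison like this false for every real format. */
#include <cogl/cogl.h>

static gboolean
can_use_swizzle (CoglPixelFormat format,
                 CoglPixelFormat target_format,
                 gboolean have_texture_swizzle)
{
  return have_texture_swizzle &&
         format == COGL_PIXEL_FORMAT_BGRA_8888_PRE &&
         target_format == format;
}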
cogl/cogl/driver/gl/cogl-framebuffer-gl.c | 6 +++++-
1 file changed, 5 insertions(+), 1 deletion(-)
---
diff --git a/cogl/cogl/driver/gl/cogl-framebuffer-gl.c b/cogl/cogl/driver/gl/cogl-framebuffer-gl.c
index a11cdec89..9f55015ef 100644
--- a/cogl/cogl/driver/gl/cogl-framebuffer-gl.c
+++ b/cogl/cogl/driver/gl/cogl-framebuffer-gl.c
@@ -4,6 +4,7 @@
  * A Low Level GPU Graphics and Utilities API
  *
  * Copyright (C) 2007,2008,2009,2012 Intel Corporation.
 + * Copyright (C) 2018 DisplayLink (UK) Ltd.
  *
  * Permission is hereby granted, free of charge, to any person
  * obtaining a copy of this software and associated documentation
@@ -1288,9 +1289,12 @@ _cogl_framebuffer_gl_read_pixels_into_bitmap (CoglFramebuffer *framebuffer,
    if (!cogl_is_offscreen (framebuffer))
      y = framebuffer_height - y - height;
 
 +  /* Use target format ANY, because GL texture_swizzle extension cannot
 +   * ever apply for glReadPixels.
 +   */
    required_format = ctx->driver_vtable->pixel_format_to_gl_with_target (ctx,
 -                                                                        framebuffer->internal_format,
                                                                          format,
 +                                                                        COGL_PIXEL_FORMAT_ANY,
                                                                          &gl_intformat,
                                                                          &gl_format,
                                                                          &gl_type);