other performance issues ...



	Just searching for DITHER_MAX, I see some code I don't understand and
that perhaps needs unwinding:

	eel/eel-background.c (eel_background_draw):

	for (y = 0; y < dest_height; y += PIXBUF_HEIGHT) {
		for (x = 0; x < dest_width; x += PIXBUF_WIDTH) {

			... set up width / height ...

			canvas_buf_from_pixbuf (&buffer, pixbuf,
				 x_canvas, y_canvas, width, height);
			eel_background_draw_aa (background, &buffer);
			gdk_pixbuf_render_to_drawable (
				pixbuf, drawable, gc, 0, 0,
				dest_x + x, dest_y + y,
				width, height,
				GDK_RGB_DITHER_MAX,
				dest_x + x, dest_y + y);
		}
	}

	So - if the drawable is not a 24-bit color depth drawable [ I'm using
a 16-bit X server by necessity; Suns use 8 bits all over the shop,
especially on the root window ] we have the situation where (roughly
sketched in GDK terms after the list below):

	For each image tile:
	  * render the background gradient for this section to
	    a canvas buffer
	  * dither / push it to the X server
	  * fetch the same area back from the X server
	  * composite with the pixmap
	  * push it back to the server.
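
	In GDK terms, the per-tile round trip looks roughly like the sketch
below. This is only my reading of the steps above, not a quote of the
eel code: the functions are real GDK calls, but the structure and the
"tmp" / "image" pixbufs are illustrative.

	#include <gdk/gdk.h>
	#include <gdk-pixbuf/gdk-pixbuf.h>

	static void
	draw_tile_round_trip (GdkPixbuf   *pixbuf,   /* gradient tile, already rendered */
			      GdkPixbuf   *image,    /* alpha image composited over it */
			      GdkDrawable *drawable,
			      GdkGC       *gc,
			      GdkColormap *colormap,
			      int dest_x, int dest_y, int x, int y,
			      int width, int height)
	{
		GdkPixbuf *tmp;

		/* 1. dither + push the gradient tile to the X server;
		 *    DITHER_MAX is the expensive part on 8/16-bit visuals */
		gdk_pixbuf_render_to_drawable (pixbuf, drawable, gc, 0, 0,
					       dest_x + x, dest_y + y,
					       width, height,
					       GDK_RGB_DITHER_MAX,
					       dest_x + x, dest_y + y);

		/* 2. fetch the same, freshly dithered area straight back */
		tmp = gdk_pixbuf_new (GDK_COLORSPACE_RGB, FALSE, 8,
				      width, height);
		gdk_pixbuf_get_from_drawable (tmp, drawable, colormap,
					      dest_x + x, dest_y + y,
					      0, 0, width, height);

		/* 3. composite the image over it on the client side */
		gdk_pixbuf_composite (image, tmp, 0, 0, width, height,
				      0.0, 0.0, 1.0, 1.0,
				      GDK_INTERP_NEAREST, 255);

		/* 4. push the result back to the server a second time */
		gdk_pixbuf_render_to_drawable (tmp, drawable, gc, 0, 0,
					       dest_x + x, dest_y + y,
					       width, height,
					       GDK_RGB_DITHER_MAX,
					       dest_x + x, dest_y + y);

		gdk_pixbuf_unref (tmp);
	}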

	This seems a pretty incredible state of affairs ;-) no wonder people
can see tearing of the background when they drag a window.

	We could switch this to DITHER_NORMAL to help the XRender people, or
we could simply render the full background tile to a non-alpha pixmap
and just push that in one pass.
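
	The second option might look something like the minimal sketch
below, assuming the gradient and any image tiles can be flattened into
one full-size, non-alpha pixbuf on the client first. The structure is
mine, not eel's - gdk_pixbuf_fill() is just a stand-in for that
client-side compositing - and it uses the same GDK headers as the
sketch above.

	static void
	draw_background_single_pass (GdkDrawable *drawable,
				     GdkGC       *gc,
				     int dest_x, int dest_y,
				     int dest_width, int dest_height)
	{
		GdkPixbuf *full;

		/* no alpha channel: gradient + image tiles get flattened
		 * together on the client before we touch the server */
		full = gdk_pixbuf_new (GDK_COLORSPACE_RGB, FALSE, 8,
				       dest_width, dest_height);
		gdk_pixbuf_fill (full, 0x5080a0ff);	/* stand-in for the real
							 * gradient / tile render */

		/* exactly one push per expose; NORMAL dithering keeps the
		 * low-depth case much cheaper than DITHER_MAX */
		gdk_pixbuf_render_to_drawable (full, drawable, gc, 0, 0,
					       dest_x, dest_y,
					       dest_width, dest_height,
					       GDK_RGB_DITHER_NORMAL,
					       dest_x, dest_y);

		gdk_pixbuf_unref (full);
	}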

	Or am I missing something?

	Regards,

		Michael.

-- 
 mmeeks gnu org  <><, Pseudo Engineer, itinerant idiot



