[Scummvm-devel] Bitdepth/pixel format API concerns

Max Horn max at quendi.de
Tue Jun 9 10:54:14 CEST 2009


On 09.06.2009 at 08:59, J Northup wrote:


[...]

> In absence of further response,

Uh... your mail came at midnight in Europe, and many of us were  
asleep; it is still late morning here now. Please wait a bit longer  
than that (I specifically said on IRC last night that I would reply  
today, but that I had to sleep first)...


> I am beginning work using a model based on an enum type, divided  
> into two sections, ORed together:
>        kFormatTypeMask = 0xFF // AND by this to get the overall bit  
> format
>        kFormatOrderMask = 0xFF << 8 // AND by this to get the (RGB/ 
> BGR order)
>
> Currently providing ten values for Type: kFormat8Bit, kFormatRGB555,  
> kFormatARGB1555, kFormatRGB556, kFormatRGB565, kFormatRGB655,  
> kFormatARGB4444, kFormatRGB888, kFormatARGB6666, and kFormatARGB8888
>
> And 31 values for Order: Palette, kFormatRGB, kFormatRBG,  
> kFormatGRB, kFormatGBR, kFormatBRG, kFormatBGR, kFormatARGB, ...,  
> kFormatBGRA
> I do not really expect this to be an acceptable final  
> implementation, as it is either overkill (who's ever heard of  
> BAGR6666 color?) or underkill (what about YUV?), but it will provide  
> a direction for me to work towards while awaiting further discussion  
> and developer consensus (as daily commits are expected of me), as  
> well as a concrete example of the positive and negative aspects of  
> this model.

I think it is absolutely fine that you decide to go on with an ad-hoc  
solution so that you don't have to sit idle, twiddling your thumbs  
while waiting for replies. Excellent :).
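
Just so we are talking about the same thing, here is roughly how I read  
that layout (a minimal sketch; the constant names are yours, the concrete  
values and the combination rule are only my guesses):

    enum PixelFormatFlags {
        // low byte: the overall bit layout ("Type")
        kFormatTypeMask  = 0xFF,
        kFormat8Bit      = 1,
        kFormatRGB565    = 5,            // ... one value per layout
        // second byte: the component order ("Order")
        kFormatOrderMask = 0xFF << 8,
        kFormatPalette   = 1 << 8,
        kFormatRGB       = 2 << 8,
        kFormatBGR       = 7 << 8        // ... one value per order
    };

    // A complete format would then be one Type ORed with one Order,
    // e.g. (kFormatRGB565 | kFormatBGR).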

However, I indeed think this should not be the final format. Rather, I  
strongly vote for using Graphics::PixelFormat. Here are some thoughts I  
wrote down while reading the mails in this thread:


* Just specifying a bitdepth is not enough. It really depends on which  
pixelformat is native to the engine.
-> survey: which 16+ bit games that we might potentially support are out  
there, and which precise pixelformats do they need? Let's make a table  
(e.g. on the Wiki) of all 16 bit (or higher) games and the pixelformat  
each of them uses.


* Use Graphics::PixelFormat for the request format, period. Everything  
else is too limited, and I would hate to have to use PixelFormat in  
some places, a custom "format type" enum in others, and (yuck!) the old  
evil bitformat style "555" in yet others (it really should be "666"  
anyway, the number of the beast ;).

E.g. one cannot specify the byte order or a BGR layout with "555". And  
if we use a fixed list of supported types, like OpenGL does or like you  
suggested, then every time a new entry is added to the list, any backend  
which wants to support the new mode needs to be updated (e.g. the SDL  
backend). With a PixelFormat, on the other hand, at least the SDL backend  
could support any format we ever come up with (at least in principle ;)  
without any code changes.
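
To make that concrete, here is how two component orders of the same depth  
would be told apart -- a sketch, assuming the bytesPerPixel/loss/shift  
members I expect the struct to have:

    Graphics::PixelFormat rgb565;
    rgb565.bytesPerPixel = 2;
    rgb565.rLoss  = 3;  rgb565.gLoss  = 2;  rgb565.bLoss  = 3;  rgb565.aLoss  = 8;
    rgb565.rShift = 11; rgb565.gShift = 5;  rgb565.bShift = 0;  rgb565.aShift = 0;

    // Same depth, swapped component order -- something "565" alone cannot express.
    Graphics::PixelFormat bgr565 = rgb565;
    bgr565.rShift = 0;
    bgr565.bShift = 11;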

And not all backends "would have to know all formats". Rather, they  
can just check whether the masks correspond to any "natively" supported  
format (which could be a fixed list), and use generic conversion code if  
they don't (which is trivial to write thanks to the masks/shifts). That's  
one of the major ideas behind the Graphics::PixelFormat struct anyway!  
However, this could be rather slow if it happens a lot.
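
For illustration, the per-pixel core of such generic conversion code is  
little more than this (a sketch, alpha left out for brevity; essentially  
what an RGBToColor() helper on the struct boils down to):

    // Pack one 8-bit-per-component pixel into whatever format 'fmt' describes.
    uint32 packPixel(const Graphics::PixelFormat &fmt, uint8 r, uint8 g, uint8 b) {
        return ((r >> fmt.rLoss) << fmt.rShift) |
               ((g >> fmt.gLoss) << fmt.gShift) |
               ((b >> fmt.bLoss) << fmt.bShift);
    }

Doing this for every pixel of every frame is of course where it could get  
slow.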

A simple solution to that: a backend which is too slow to do generic  
conversion via Graphics::PixelFormat should simply not do it! Such a  
backend would have the same problem with any other way of specifying the  
pixelformat anyway.
On the other hand, any backend which can do efficient conversions for  
a limited set of pixelformats (e.g. using OpenGL) can still easily  
match any PixelFormat against its (fixed?) list of efficiently  
supported pixelformats. Granted, it'll be some lines of code to write,  
but very easy ones.
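
Something along these lines, perhaps (a sketch; the helper names are made  
up, and I spell out the comparison since I am not sure the struct has an  
operator== yet):

    // Member-by-member comparison of two formats.
    static bool sameFormat(const Graphics::PixelFormat &a, const Graphics::PixelFormat &b) {
        return a.bytesPerPixel == b.bytesPerPixel &&
               a.rLoss  == b.rLoss  && a.gLoss  == b.gLoss  &&
               a.bLoss  == b.bLoss  && a.aLoss  == b.aLoss  &&
               a.rShift == b.rShift && a.gShift == b.gShift &&
               a.bShift == b.bShift && a.aShift == b.aShift;
    }

    // Does 'requested' match any of the 'count' formats in 'native'?
    // 'native' would be the backend's fixed list of formats it can
    // display without conversion.
    bool isNativelySupported(const Graphics::PixelFormat &requested,
                             const Graphics::PixelFormat *native, int count) {
        for (int i = 0; i < count; ++i)
            if (sameFormat(native[i], requested))
                return true;
        return false;   // caller falls back to generic conversion, or rejects
    }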

-> survey: which backends support which modes natively, anyway? Again  
something that I would recommend collecting in a table on the wiki for  
future reference!


* Furthermore, it should be possible to query the backend for a list  
of "natively" supported graphics formats (including 8bit mode), I  
think. This way, an engine can decide to do the conversion itself if it  
wants to, but does not have to. Other engines can be lazy and rely on  
the backend to do conversions (which the backend will only offer if it  
can do so at reasonable cost). I believe that most engines would actually  
rely on this backend conversion, at least initially; if this turns out to  
limit the number of supported platforms (as LordHoto fears), then we can  
still look into solving that (e.g. by enhancing the engine in question)  
-- but I wouldn't worry about it just now!
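
In code, such an engine-side check could look roughly like this (the  
method name and the List return type are only my guess at the interface):

    // 'gameFormat' is whatever format this engine's game data uses (e.g.
    // the rgb565 from the earlier sketch); getSupportedPixelFormats() is
    // the query method proposed here -- the name is just a placeholder.
    Common::List<Graphics::PixelFormat> native = g_system->getSupportedPixelFormats();
    bool backendHasIt = false;
    for (Common::List<Graphics::PixelFormat>::const_iterator i = native.begin();
         i != native.end(); ++i) {
        if (sameFormat(*i, gameFormat))
            backendHasIt = true;
    }
    // If not, the engine can convert its graphics itself -- or still request
    // the format and rely on the backend's (possibly slower) conversion.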

* In addition, let initGraphics take a param "bool emulateIfPossible",  
and let it return a meaningful error ("kResolutionNotSupported",  
"kPixelFormatNotSupported", etc.) so that engines can determine  
whether the requested mode was supported or not, and retry a different  
one if they want to (or just error out with a nice message dialog).
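
Roughly what I have in mind (just a sketch; the error names are the ones  
above, the exact signature -- including whether the existing  
defaultTo1xScaler param stays -- is up for discussion):

    enum InitGraphicsError {
        kInitGraphicsOK = 0,
        kResolutionNotSupported,
        kPixelFormatNotSupported
    };

    InitGraphicsError initGraphics(int width, int height, bool defaultTo1xScaler,
                                   const Graphics::PixelFormat &format,
                                   bool emulateIfPossible);

    // An engine can then retry with a fallback format on
    // kPixelFormatNotSupported, or show a nice error dialog and bail out.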

Closing remarks: There is no need for kFeatureHas16BitsColor, just use  
"getSupportedPixelFormats()" ;).
And we should have an OSystem method which allows querying the active  
pixelformat (and maybe also the active resolution!); that will become  
very convenient at some point, I think.
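
I.e. something as simple as (name made up; for the resolution part,  
getWidth()/getHeight() may already cover it):

    // On OSystem:
    virtual Graphics::PixelFormat getScreenPixelFormat() const = 0;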


Cheers,
Max



