[Scummvm-devel] Bitdepth/pixel format API concerns

Johannes Schickel lordhoto at gmail.com
Tue Jun 9 00:09:56 CEST 2009


J Northup wrote:
>
> What has already been determined:
> * The pixel format will be initially requested of the backend by means 
> of an optional parameter to OSystem::InitGraphics.

If I'm allowed to ask, what's OSystem::initGraphics? I don't know of any 
function like that in OSystem (yet).

> * The default pixel format (in case of no parameter), will be 8-bit, 
> paletted, to minimize changes required to existing engines.

Sounds fine.

>
> What has yet to be determined:
> * What happens when the engine requests an unsupported format?
>     * Does InitGraphics fall back to 8 bit, or fail entirely?

It should fail, IMHO. Games using a mode other than 8bpp palette-based 
will only look (and maybe work) properly in the requested mode, so 
falling back to 8bpp palette-based mode would not help the engine at all.

Of course, if a graphics mode is already set up, a backend with 
transaction + fallback support should return to the old graphics mode 
and report proper error values via OSystem::endGFXTransaction.

>     * How does the game engine learn that its request was denied?

Usually, errors from setting up the graphics mode will be returned by 
OSystem::endGFXTransaction. Backends not supporting that so far error 
out themselves (or at least should!).
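The flow I have in mind is roughly the following (just a sketch: the 
format parameter on initSize is the proposed addition, and the exact 
error reporting of endGFXTransaction via a value like 
OSystem::kTransactionSuccess is an assumption here):

    g_system->beginGFXTransaction();
        g_system->initSize(640, 480 /*, &requestedFormat */); // proposed parameter
    if (g_system->endGFXTransaction() != OSystem::kTransactionSuccess) {
        // A backend with transaction + fallback support has restored the
        // previous mode at this point; error out properly instead of
        // running in a format the game can't use.
        error("Could not set up the requested graphics mode");
    }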

>     * How does the game engine request an alternate format, if the 
> game supports multiple formats: a second call to InitGraphics, or a 
> second method meant specifically for specifying bitdepth?

I guess for such situations a way to query supported modes might come in 
handy.
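Something like this, maybe (getSupportedFormats and engineCanRender are 
made-up names, just to illustrate the idea):

    // Hypothetical query: pick the first backend-supported format
    // the engine is able to render to.
    Common::List<Graphics::PixelFormat> formats = g_system->getSupportedFormats();
    for (Common::List<Graphics::PixelFormat>::const_iterator i = formats.begin();
         i != formats.end(); ++i) {
        if (engineCanRender(*i)) { // engine-specific capability check
            // request *i via the chosen init mechanism
            break;
        }
    }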

> * What format should the parameter take?
>     * Should it be a pair specifying bitdepth (8,16,24,32) and format 
> (palette, rgb, grb, bgr, etc)?

Actually, this looks like the Graphics::PixelFormat / custom format 
approach, just with the values passed as separate parameters. I would 
abstain from this one.

>     * Should it be a generic format specifier (8, 555, 1555, 565, 888, 
> 8888, etc)?

It's hard to express RGB vs. BGR order with this one. We used 
gBitFormat with such values for the GUI in the past, and on at least 
one system we had problems with it. Those were caused by RGB vs. BGR 
order, which can't easily be reflected in simple values like this.

>     * Should it be a fully formed Graphics::PixelFormat object?

I guess it's fine to use this one. It also offers a way to request the 
proper R/G/B mask values the engine might require.
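For example, R5G6B5 could be described like this (assuming the usual 
loss/shift layout of Graphics::PixelFormat, where the loss is 8 minus 
the bits per component):

    Graphics::PixelFormat format;
    format.bytesPerPixel = 2;
    format.rLoss = 3; format.gLoss = 2; format.bLoss = 3; format.aLoss = 8; // 5/6/5 bits, no alpha
    format.rShift = 11; format.gShift = 5; format.bShift = 0; format.aShift = 0;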

>     * Should it be some other format designed and implemented 
> specifically for this task?

I don't see the advantage over Graphics::PixelFormat here. If we don't 
need all the features of Graphics::PixelFormat, we could just document 
our API accordingly. In the end the client code should be able to query 
the established format from the backend via a function like 
"Graphics::PixelFormat getScreenFormat() const;" anyway, matching the 
current OSystem::getOverlayFormat. So there's no need to add loads of 
additional data types for this :-).
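Client code would then just do something like this (getScreenFormat 
being the proposed getter, and the comparison assuming PixelFormat is 
comparable):

    Graphics::PixelFormat screenFormat = g_system->getScreenFormat(); // proposed
    if (screenFormat != requestedFormat) {
        // the backend established a different format; react accordingly
    }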


>
> * What should happen if the backend and engine cannot agree on a 
> directly supported format?
>     * Should the game engine simply error out?

This should be the case, say, when the backend only supports 8bpp 
palette-based data and the engine requests some (16-bit) RGB-based data.

>     * Should pixel format conversions be performed?
>         * Should these conversions be performed by the engine, so that 
> a "convert once, use multiple" approach can be taken?

In my eyes this might allow more backends to support 16bpp games when 
there's no hardware conversion support. If an engine can be adapted to 
a "convert once, use multiple" approach, that will be *perfect* for 
devices without hardware conversion support that provide only limited 
computing power.
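Roughly like this (convertSurface is a made-up helper; the point is 
that the conversion runs once at load time, not on every frame):

    // Load time: convert the source art once into the format the
    // backend actually established.
    Graphics::PixelFormat native = g_system->getScreenFormat(); // proposed getter
    byte *converted = convertSurface(sourcePixels, sourceFormat, native); // hypothetical

    // Per frame: plain blits of the pre-converted data, no per-pixel work.
    g_system->copyRectToScreen(converted, pitch, 0, 0, w, h);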

>         * Should these conversions be performed by the backend, so 
> that it can take advantage of hardware conversion support, where 
> available?

IMHO this should only be a fallback the backend might offer. A worst 
case for this one would be, for example: the engine requests, say, 
R5G6B5 and only does video playback, which requires a YUV -> RGB 
conversion anyway. Now I guess the backend would just accept the R5G6B5 
mode regardless of whether it's able to output it natively, and would 
do a manual conversion on every copyRectToScreen call. That gives us 2 
conversions in total (YUV -> R5G6B5 in the engine, then R5G6B5 -> 
native in the backend) instead of 1 (YUV -> native directly). If we now 
don't have hardware conversion support, this will probably be a major 
performance drawback compared to converting directly to the native format.
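Schematically, per decoded video frame (all helper names invented):

    // Backend fallback path: two conversions per frame.
    convertYUVToRGB565(frame, buf);                     // engine: YUV -> R5G6B5
    g_system->copyRectToScreen(buf, pitch, 0, 0, w, h); // backend: R5G6B5 -> native

    // Direct path: one conversion per frame.
    convertYUVToFormat(frame, buf, nativeFormat);       // engine: YUV -> native
    g_system->copyRectToScreen(buf, pitch, 0, 0, w, h); // backend: plain copy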

>     * Should the engine and backend developers have free choice among 
> these options, so that they can choose the case that makes the most 
> sense for their engine or backend?

Actually, when we support both, I fear that engine authors will just 
rely on backend-based conversion, which might limit the platforms able 
to support various games.

// Johannes



