[Scummvm-devel] Bitdepth/pixel format API concerns
Eugene Sandulenko
sev at scummvm.org
Tue Jun 9 03:06:48 CEST 2009
On Mon, 8 Jun 2009 14:38:55 -0700
J Northup <upthorn at gmail.com> wrote:
> What has yet to be determined:
I think that the existing GUI code could be a good reference, since it
has already addressed the majority of these questions.
> * What happens when the engine requests an unsupported format?
> * Does InitGraphics fall back to 8 bit, or fail entirely?
I think yes. That is, 16-bit has to be special, just as our GUI code
falls back to the classic scheme when a 16-bit surface is not
available.
> * How does the game engine learn that its request was denied?
By querying a backend feature. Something like adding
kFeatureHas16bitsColor to the feature flags and making it return the
current state.
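A minimal sketch of what that check could look like from engine code
(kFeatureHas16bitsColor is only the tentative name above, not an
existing flag):

    // Engine-side query, assuming a new OSystem feature flag named
    // kFeatureHas16bitsColor (tentative, not yet in the tree).
    if (g_system->hasFeature(OSystem::kFeatureHas16bitsColor)) {
        // The backend granted the 16-bit request.
    } else {
        // The request was denied; stay on the 8-bit paletted path.
    }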
> * How does the game engine request an alternate format, if the
> game supports multiple formats: a second call to InitGraphics,
If we fall back to 8-bit, then it will be a matter of finding this
out, not of requesting another graphics initialization.
> or a second method meant specifically for specifying bitdepth?
> * What format should the parameter take?
> * Should it be a pair specifying bitdepth (8,16,24,32) and format
> (palette, rgb, grb, bgr, etc)?
> * Should it be a generic format specifier (8, 555, 1555, 565,
> 888, 8888, etc)?
Ask for a number of bits, not a specific format. The engine code will
then query the established format, similar to the currently
implemented getOverlayFormat() call, and perform on-the-fly color
transformation. Alternatively, we could assume that engines always
feed 565 and perform the conversion in the backend if needed. Note
that we already have all the required transformation methods
implemented.
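For illustration, the first option could look roughly like this on
the engine side (getScreenFormat() is hypothetical here, named by
analogy with getOverlayFormat()):

    // Hypothetical query of the established screen format, by analogy
    // with the existing getOverlayFormat() call.
    Graphics::PixelFormat fmt = g_system->getScreenFormat();

    // On-the-fly conversion: pack an RGB triple into whatever 16-bit
    // layout the backend actually established (565, 555, ...).
    uint16 pixel = fmt.RGBToColor(r, g, b);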
> * Should it be a fully formed Graphics::PixelFormat object?
That would be overkill, in the sense that every backend would need to
understand all formats.
> * Should it be some other format designed and implemented
> specifically for this task?
No need to reinvent the wheel IMHO. Just establish some standard.
> * What should this format look like?
> * What should happen if the backend and engine cannot agree on a
> directly supported format?
> * Should the game engine simply error out?
> * Should pixel format conversions be performed?
It will depend on the engine. For instance, the Gob engine already
uses dithering, so that could be used as a fallback when 16-bit color
is not available. But that should be up to the engine developers.
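Sketched as engine-side logic (the render paths are placeholder
names, not actual Gob code):

    // Illustrative fallback decision; renderTrueColor() and
    // renderDithered() are placeholders, not real Gob engine calls.
    if (g_system->hasFeature(OSystem::kFeatureHas16bitsColor))
        renderTrueColor();  // native 16-bit output
    else
        renderDithered();   // reuse the engine's existing dithering path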
> * Should these conversions be performed by the engine, so
> that a "convert once, use multiple" approach can be taken?
> * Should these conversions be performed by the backend, so
> that it can take advantage of hardware conversion support,
> where available?
> * Should the engine and backend developers have free choice among
> these options, so that they can choose the case that makes
> the most sense for their engine or backend?
Free choices usually mean more development effort. Thus this has to
be carefully weighed.
Eugene