[Scummvm-devel] 16bit support (Was: The 7th Guest)

Oystein Eftevaag wintermute at geheb.com
Wed Nov 12 20:38:55 CET 2008


Max Horn wrote:
> This has been discussed before. Yeah, adding 16bit support has been on  
> our TODO list for a long time. But it has to be done right. In  
> particular, it doesn't just have to be optional for devices not  
> supporting it; it should in fact also be possible to turn it off  
> completely, so that it doesn't cause overhead for systems with low  
> resources (in terms of binary size, memory usage, CPU usage). In  
> particular, the 8bit graphics mode should not be slowed down by being  
> made more generic for 16bit support.
> I am not willing to accept any hacks that quickly add 16bit support  
> without taking care of these issues. It has to satisfy the  
> requirements above, and ideally, shouldn't bloat OSystem too much more  
> (it's already quite complicated these days).
>
> With PixelFormats, we are slowly moving into a direction that one day  
> might make it possible to smoothly add optional 16bit support. But  
> nobody is actively working on it.
>
>   

The issues mentioned in the wiki (that sev linked) and the ones above seem 
to be mainly engine-related concerns (or even SCUMM-engine-specific ones).

Do all the elements here need to be implemented simultaneously? I would 
assume that the actual OSystem interface changes wouldn't need to be much 
more complicated than a function to query for 16bit support (or support 
for pixel format X, I guess), and another to set the currently active 
format. In that case it might be an idea to add default implementations 
of those now and let specific engines and backends add support when and 
if it's desired.
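
Roughly something like this is what I have in mind (just a sketch; the 
method names, the PixelFormat placeholder and the default bodies are made 
up for illustration, not existing API):

	// Placeholder for a pixel format descriptor; the real PixelFormat
	// work Max mentioned would presumably provide something richer.
	struct PixelFormat {
		int bytesPerPixel;   // 1 = classic 8bit paletted, 2 = 16bit, ...
		// R/G/B masks and shifts would go here
	};

	class OSystem {
	public:
		// Ask the backend whether it can display the given format.
		// Default: only the classic 8bit paletted mode, so existing
		// backends keep compiling and behaving exactly as before.
		virtual bool hasPixelFormat(const PixelFormat &format) const {
			return format.bytesPerPixel == 1;
		}

		// Try to switch the screen to the given format; returns false
		// if the backend can't (or won't) do it. Again the default
		// refuses anything but 8bit, so there's no overhead for
		// backends and engines that never ask for more.
		virtual bool setPixelFormat(const PixelFormat &format) {
			return format.bytesPerPixel == 1;
		}

		// ... rest of the existing interface stays as it is ...
	};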

Or am I off the mark here, and the backends would need to do more than 
this? For example, handling different pixel formats on a per-blit basis 
(which would complicate things a bit).
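
By per-blit I mean something along these lines, i.e. the current 
copyRectToScreen() signature plus a format argument (again purely 
hypothetical, just to illustrate what I'm asking about):

	// Hypothetical variant of copyRectToScreen() that takes the source
	// format per call, so a backend would have to convert on the fly
	// whenever it differs from the current screen format.
	virtual void copyRectToScreen(const byte *buf, int pitch,
	                              int x, int y, int w, int h,
	                              const PixelFormat &srcFormat);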

--
Oystein/vinterstum



