[Scummvm-devel] 16bit support (Was: The 7th Guest)

Torbjörn Andersson eriktorbjorn at telia.com
Wed Dec 31 22:32:26 CET 2008


Max Horn wrote:

> This has been discussed before. Yeah, adding 16bit support has been on  
> our TODO list for a long time. But it has to be done right. In  
> particular, it doesn't just have to be optional for devices not
> supporting it; it should in fact also be possible to turn it off  
> completely, so that it doesn't cause overhead for systems with low  
> resources (in terms of binary size, memory usage, CPU usage). In  
> particular, the 8bit graphics mode should not be slowed down by being  
> made more generic for 16bit support.

Just for the sake of argument, wouldn't it be possible to keep the
8-bit renderer and hypothetical 16-bit renderer completely separated
from each other? Then the 16-bit code would have little or no impact on
the 8-bit code. The engine would ask for either an 8-bit or a 16-bit
renderer on startup, with the caveat that the 16-bit one may not be
available. (Depending on the engine, this wouldn't necessarily be a
fatal error.)
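
Something along these lines, just to illustrate what I mean (the names
here are made up for the example, not the actual OSystem API):

#include <cstdio>

enum PixelDepth { kDepth8Bit, kDepth16Bit };

// Stand-in for the backend; a real backend would query the hardware.
struct Backend {
	bool supports16Bit() const { return false; } // e.g. a low-end device
};

// Returns the depth actually granted. Asking for 16 bits is not fatal:
// the engine simply gets 8 bits back and can decide what to do.
PixelDepth requestRenderer(const Backend &backend, PixelDepth wanted) {
	if (wanted == kDepth16Bit && !backend.supports16Bit())
		return kDepth8Bit;
	return wanted;
}

int main() {
	Backend backend;
	PixelDepth granted = requestRenderer(backend, kDepth16Bit);
	std::printf("Granted a %d-bit renderer\n", granted == kDepth16Bit ? 16 : 8);
	return 0;
}

That way the two renderers never have to share any code paths, and the
8-bit one stays exactly as fast as it is today.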

The ScummVM Wiki mentions that some HE games use a mix of 8- and 16-bit
graphics, but that could be converted to a single format before being
sent to the backend, right?
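
For what it's worth, the conversion itself should be cheap. A sketch of
what I have in mind (RGB565 is just an assumption here; the real target
format would depend on the backend):

#include <cstdint>
#include <cstddef>

// Pack 8-bit-per-channel RGB into a single RGB565 word.
static inline uint16_t rgbTo565(uint8_t r, uint8_t g, uint8_t b) {
	return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// palette: 256 entries of 3 bytes (R, G, B), as in the usual 8-bit mode.
// Expands paletted pixels so a mixed frame can be handed to a pure
// 16-bit backend as one surface.
void convert8To16(const uint8_t *src, uint16_t *dst, size_t numPixels,
                  const uint8_t palette[256 * 3]) {
	for (size_t i = 0; i < numPixels; ++i) {
		const uint8_t *entry = &palette[src[i] * 3];
		dst[i] = rgbTo565(entry[0], entry[1], entry[2]);
	}
}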

Torbjörn Andersson




