ConfigManager: Convert GPUDeterminismMode into an enum class

Makes the values strongly-typed and gets more identifiers out of the
global namespace.

We are forced to use a name that is not "None" to mean none, because
X11 is garbage in that it has:

#define None 0L

Because clearly no one else will ever want to use that identifier for
anything in their own code (which is why you should prefix literally
any and all preprocessor macros you expose to library users in public
headers).
This commit is contained in:
Lioncash
2018-06-15 14:25:16 -04:00
parent bd85d63c62
commit de9c5fd375
3 changed files with 13 additions and 13 deletions


@@ -203,14 +203,14 @@ static ConfigCache config_cache;
 static GPUDeterminismMode ParseGPUDeterminismMode(const std::string& mode)
 {
   if (mode == "auto")
-    return GPU_DETERMINISM_AUTO;
+    return GPUDeterminismMode::Auto;
   if (mode == "none")
-    return GPU_DETERMINISM_NONE;
+    return GPUDeterminismMode::Disabled;
   if (mode == "fake-completion")
-    return GPU_DETERMINISM_FAKE_COMPLETION;
+    return GPUDeterminismMode::FakeCompletion;
   NOTICE_LOG(BOOT, "Unknown GPU determinism mode %s", mode.c_str());
-  return GPU_DETERMINISM_AUTO;
+  return GPUDeterminismMode::Auto;
 }

 // Boot the ISO or file