On a theoretical level, there is no way to ensure that a submitted score is 100% legitimate. In practical terms, though, you can make cheating increasingly difficult.
First things first: you have to be able to punish cheaters. One way to do that is to tie leaderboard access to paid accounts - paying customers. If your game is free and the leaderboards are open, you will have cheaters, and manually reviewing their scores will consume all of your time. But if your game costs something, even a little, most cheaters are deterred, because every time they're caught they would have to buy the game again. And if you're somehow flooded with paying cheaters, you can actually afford to hire people to review scores for you!
Then, we can begin to discuss the core of the issue: How do we know that a human being produced the submitted score, fairly and legitimately? Let's consider possible attacks.
Problem: Cheaters can submit any number as their score. Therefore, you cannot trust the score number.
Solution: Require that every submitted score is accompanied by a replay of the game that produced it, and validate replays automatically - a system that plays back the replay and confirms the score it produces. If your game engine is non-deterministic, the replay will need to store the positions of all game objects over time, and the validation has to be a little fuzzy to account for differences in floating-point arithmetic. That is, check that the replayed score is within reasonable bounds of the claimed one.
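The fuzzy check above can be sketched as follows. This is a minimal illustration, assuming a hypothetical replay format (a list of per-frame states), a caller-supplied scoring function, and an arbitrary 1% tolerance:

```python
import math

# Hypothetical sketch: re-run a submitted replay server-side and check
# that the claimed score matches the recomputed one within a tolerance.
# The replay format, scoring function, and tolerance are assumptions.

SCORE_TOLERANCE = 0.01  # allow ~1% drift from floating-point differences

def validate_replay(replay_frames, claimed_score, scoring_fn):
    """Re-run the replay and check the claimed score is within bounds.

    replay_frames: list of per-frame game states (e.g. object positions)
    claimed_score: score number submitted by the client
    scoring_fn:    function that recomputes a score from the frames
    """
    recomputed = scoring_fn(replay_frames)
    # Fuzzy comparison: floating-point results can differ slightly
    # between the client's hardware and the server's.
    return math.isclose(recomputed, claimed_score, rel_tol=SCORE_TOLERANCE)
```

The tolerance should be tight enough that a cheater cannot hide a meaningful score inflation inside it.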
Problem: Cheaters can meticulously craft tool-assisted replays (see: the Tool-Assisted Speedrun community) to produce scores better than any human could.
Solution: The actual content being played has to be 1) impossible to predict in advance, and 2) time-limited from publication.
In essence, your game needs procedural content. When a player starts a new game, the client asks your server for a level. The server sends a random level and stores the current time for this session. When the player completes the content, the client sends the score and replay to your server. The server stores how long the player took, validates the score from the replay, and verifies that the replay's length roughly matches the elapsed wall-clock time. You'll also need an upper limit on the allowed time to qualify for leaderboards. This means the content has to be completed in one sitting, in a matter of hours.
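The server-side session flow described above might look like this. Everything here is a sketch under assumptions: the names (`SESSION_STORE`, `MAX_SESSION_SECONDS`), the three-hour limit, and the 5% duration slack are all placeholders you would tune for your game:

```python
import time

# Hypothetical sketch of the server-side session flow: record when a
# session starts, then check the submission against wall-clock time.

MAX_SESSION_SECONDS = 3 * 60 * 60   # upper limit to qualify for leaderboards
DURATION_SLACK = 0.05               # replay length vs. wall-clock tolerance

SESSION_STORE = {}  # session_id -> (level_seed, start_timestamp)

def start_session(session_id, level_seed):
    # The server picks the level seed and records when the session began.
    SESSION_STORE[session_id] = (level_seed, time.time())
    return level_seed  # sent to the client to generate the level

def accept_submission(session_id, replay_duration_seconds):
    seed, started = SESSION_STORE.pop(session_id)
    elapsed = time.time() - started
    if elapsed > MAX_SESSION_SECONDS:
        return False  # too slow: content was not played in one sitting
    # The replay should be roughly as long as the elapsed wall-clock time;
    # a replay much shorter or longer than the session is suspicious.
    return abs(elapsed - replay_duration_seconds) <= elapsed * DURATION_SLACK + 1.0
```

In a real service the store would live in a database rather than a dict, and the score/replay validation from the previous step would run before this timing check accepts anything.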
Problem: Cheaters can modify the game client to reveal information hidden from normal players.
Solution: Design your game in such a way that there is no hidden content. No secrets. No fog of war. No surprises of any kind. Nothing to be gained from being able to see the whole game world from the start.
Alternate solution: Send the content to players in chunks, requiring the replay for the previous chunk before sending the next one. This consumes more computing resources on your server, but allows for hidden information throughout the game.
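The chunked hand-off can be sketched like this. The chunk generator and the replay check are stubbed out; the class name and structure are illustrative assumptions, not a prescribed design:

```python
# Hypothetical sketch of chunked content delivery: the server withholds
# chunk N+1 until it has received the replay for chunk N, so a modified
# client can never see content the player hasn't reached yet.

def make_chunk(seed, index):
    # Placeholder for procedural chunk generation from the session seed.
    return {"index": index, "data": f"chunk-{seed}-{index}"}

class ChunkedSession:
    def __init__(self, seed, total_chunks):
        self.seed = seed
        self.total = total_chunks
        self.next_index = 0   # first chunk not yet sent

    def request_chunk(self, replay_for_previous=None):
        # The first chunk is free; every later chunk requires the
        # replay for the chunk before it (which would also be validated).
        if self.next_index > 0 and replay_for_previous is None:
            raise PermissionError("replay for previous chunk required")
        if self.next_index >= self.total:
            return None  # all content delivered; session is over
        chunk = make_chunk(self.seed, self.next_index)
        self.next_index += 1
        return chunk
```

A real server would also validate each intermediate replay on arrival, not just collect it, so a cheater cannot stall the pipeline with garbage replays.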
Problem: Cheaters can program artificial intelligence to play the game for them.
Solution: Manual review. Does it look human?
And now we've reached the end of our capabilities as game service providers to prevent cheating. If someone manages to develop a human-like AI for your game that plays it better than the humans can, they've earned their place on the leaderboard.
None of this has to be perfect. You can verify only a random sample of the replays, and manually review only a random sample of the cases. The possibility of being caught keeps most people honest. When a cheater is discovered, ban their account and remove all their past scores from the leaderboards.
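Random sampling is simple to implement; the point is that every submission *might* be checked. A minimal sketch, where the 10% rate is an arbitrary assumption you would tune to your review capacity:

```python
import random

# Hypothetical sketch: decide per submission whether to run the
# expensive verification. The rate is an assumed tuning knob.

VERIFY_PROBABILITY = 0.10  # verify roughly 1 in 10 submissions

def should_verify(rng=random):
    # Each submission independently has a chance of being checked,
    # so no cheater can know in advance whether they'll be audited.
    return rng.random() < VERIFY_PROBABILITY
```

You can also bias the sampling toward suspicious cases, e.g. always verify anything that lands in the top ranks.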