I was trying to make a Super Mario World optimization patch (again), and the HEX to DEC conversion code was one of the routines I reprogrammed. Super Mario World uses a 6-digit score value (7 digits counting the lowest digit, which is always 0), and the score in binary/hex takes up 3 bytes. The routine I wrote divides the value by 100 twice to get 3 bytes storing 2 decimal digits each, then uses a LUT to split each 2-digit byte into its two decimal digits.
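A minimal sketch of that approach in Python (not the actual 65c816 routine, and the table name is my own): two divisions by 100 peel off the byte pairs, and a 100-entry lookup table splits each pair into digits.

```python
# Hypothetical lookup table: maps 0..99 to its (tens, ones) digits.
# In the assembly version this would be a 100- or 200-byte ROM table.
SPLIT_LUT = [(n // 10, n % 10) for n in range(100)]

def score_to_digits(score):
    """Convert a binary score (0..999999) into six decimal digits."""
    low = score % 100        # first division: remainder is the low digit pair
    score //= 100
    mid = score % 100        # second division: remainder is the middle pair
    high = score // 100      # final quotient is the high pair
    digits = []
    for pair in (high, mid, low):
        tens, ones = SPLIT_LUT[pair]
        digits += [tens, ones]
    return digits

print(score_to_digits(123456))  # [1, 2, 3, 4, 5, 6]
```

On the SNES the divisions can use the hardware divider; on the NES they would have to be done in software, which is part of what makes the question below interesting.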

It made me wonder what kind of fancy math was used in NES games, or could be used in an NES game. The tricky part is that every byte of a binary score counter affects every decimal digit.
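To illustrate that observation with a quick sketch (values are my own example, not from any game): bumping just the high byte of a 3-byte binary score changes all six decimal digits at once, so no digit can be computed from one byte alone.

```python
# Incrementing only the high byte of a 24-bit score...
base = 0x012345            # decimal 74565
bumped = base + 0x010000   # high byte goes 0x01 -> 0x02

# ...changes every one of the six decimal digits.
print(f"{base:06d}")    # 074565
print(f"{bumped:06d}")  # 140101
```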