
This does not result in a perfectly uniform distribution, but if the range of $r$ is much larger than the range of $r'$, it is very close to uniform.
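As a quick illustration (a minimal sketch of modulo reduction; the 16-bit range and variable names are my own choices, not from the text), reducing a uniform value from a large range modulo a small one leaves only a tiny bias:

```python
import collections

# Reduce every 16-bit value r modulo 10 to get r' and count the residues.
# 2**16 = 65536 is not a multiple of 10, so the residues cannot all occur
# equally often, but the imbalance is tiny relative to the counts.
counts = collections.Counter(r % 10 for r in range(2**16))

print(dict(counts))
print(max(counts.values()) - min(counts.values()))  # the bias: one extra hit
```

Residues 0 through 5 occur 6554 times and 6 through 9 occur 6553 times, a relative difference that shrinks as the raw range grows.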

Assume a random number generation process outputs lots of numbers between 0 and 9. First I gathered up a bunch of the numbers, converted them to binary, and created a bitmap. Not so random, as you can see! That must be why you shouldn't just use raw integers as random numbers in a computer program.

Look what happens when the numbers are converted to binary: 0 becomes 00110000. As you can see, the first 4 bits are always 0011, which is not very random. Even the 5th bit is not very random: from 0 to 7 it is always a 0 bit, and only for 8 and 9 is it a 1 bit. What about the last 3 bits, are they random? The numbers 0-7 inclusive cover all possible combinations of the 3 bits. However, the last 3 bits of 8 and 9 are duplicates of those of 0 and 1. Should the numbers 8 and 9 be thrown away to remove bias?

I think the plan might be to run all these raw integers through a cryptographic hash such as SHA-256 and then use the digest as a key. Does that matter? I assume I need 256 bits of input entropy to get a good 256-bit output, yes? What is the correct amount of raw integers to feed into the hash to get a quality 256-bit output? If I do some back-of-the-envelope calculations I come up with 3 bits of entropy per 8-bit (1-byte) number. This means I need to collect ~85 raw integers (682.67 bits of raw data) and feed them into the 256-bit hash.
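To make the above concrete, here is a short sketch (my own illustration, not from the question; `secrets.randbelow` stands in for whatever digit source is actually being measured) that prints the ASCII bit patterns of the digits and then pools 86 digits through SHA-256, since 86 digits at a conservative 3 bits each gives 258 bits, just over the 256-bit target:

```python
import hashlib
import secrets

# The digits 0-9 as ASCII bytes: the top 4 bits are always 0011 and the
# 5th bit is 1 only for 8 and 9, so each 8-bit byte carries at most
# log2(10) ~= 3.32 bits of entropy.
for d in "0123456789":
    print(d, format(ord(d), "08b"))

# Pool 86 digits (conservatively 3 bits of entropy each, 86 * 3 = 258
# >= 256) and hash them down to a 256-bit key. secrets.randbelow is only
# a stand-in for the real digit source.
digits = bytes(ord("0") + secrets.randbelow(10) for _ in range(86))
key = hashlib.sha256(digits).digest()  # 32 bytes = 256 bits
print(key.hex())
```

Note that hashing does not create entropy: the 256-bit digest is only as unpredictable as the pooled input, which is why the digit count matters more than the output length.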
