# How to fix this problem when generating a pseudorandom number in Python?

## Question:

```
import hashlib
import time
def random():
    hashtime = str(int(time.time() * (10 ** 6)))
    encoded = hashlib.new("sha3_512", hashtime.encode())
    decoded = int(encoded.hexdigest(), 16)
    dcsmall = decoded / (10 ** (len(str(decoded))))
    return dcsmall
```

I tried this code to simulate the function random.random() without using the random module. I wanted to generate a random number between 0 and 1, but this code never outputs numbers between 0 and 0.1 because of the length of `decoded`, and I have no idea how to fix it.

Please don’t make huge changes to my code but instead give me an idea how to solve the problem.

Thank you

## Answers:

There is a discrepancy between the range of your `decoded` value and the denominator of your division: one is a binary value and the other is decimal based. The resulting fraction cannot cover the full 0…1 range because the maximum value of `decoded` does not match its length in decimal digits; an n-digit integer is always at least 10^(n-1), so dividing it by 10^n yields a number in [0.1, 1).
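To see why the original division can never produce a value below 0.1, here is a small illustration with arbitrary sample values:

```python
# Any n-digit integer d satisfies 10**(n-1) <= d < 10**n,
# so d / 10**n always lands in [0.1, 1), never below 0.1.
for d in (7, 42, 999, 12345):
    n = len(str(d))
    print(d, "->", d / 10 ** n)  # 0.7, 0.42, 0.999, 0.12345
```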

What you need to do is take a chunk of the value whose minimum and maximum you can control. I would suggest taking `decoded` modulo a power of two and dividing by that same power of two to obtain a 0…1 fraction:

```
dcsmall = (decoded % 2 ** 64) / 2 ** 64
```
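As a quick sketch of how this behaves on an arbitrary digest (the input `b"example"` here is just a placeholder, not part of the original code):

```python
import hashlib

decoded = int(hashlib.sha3_512(b"example").hexdigest(), 16)
# Keep only the low 64 bits, then divide by 2**64:
# the result is a fraction in the range 0...1.
dcsmall = (decoded % 2 ** 64) / 2 ** 64
print(dcsmall)
```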

This might do it:

```
import hashlib
import time
def random():
    hashtime = str(int(time.time() * (10 ** 6)))
    encoded = hashlib.new("sha3_512", hashtime.encode())
    decoded = int(encoded.hexdigest(), 16)
    # A SHA3-512 digest is 512 bits, so divide by 2**512.
    dcsmall = decoded / (2 ** 512)
    return dcsmall
```

SHA3-512 hashes are 512 bits long – no need to worry about how long they are in decimal or hexadecimal notation.
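You can confirm the digest size from hashlib itself; `digest_size` is reported in bytes:

```python
import hashlib

h = hashlib.sha3_512(b"")
print(h.digest_size)      # 64 bytes
print(h.digest_size * 8)  # 512 bits
```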