Why does mean() of np.random.randint(0, 100) converge to 49.49?

I wanted to test how evenly distributed NumPy's randint function is, so I did this:

```python
>>> import numpy as np
>>> a = np.random.randint(0, 100, 10000)
>>> a.mean()
>>> a = np.random.randint(0, 100, 1000000)
>>> a.mean()
>>> a = np.random.randint(0, 100, 1000000000)
>>> a.mean()
49.49944384
```

I was confused about why the average was converging to 49.49 rather than 50. I figured someone else out there would have the same question.
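The likely explanation is that `np.random.randint`'s upper bound is exclusive: `randint(0, 100)` draws integers from 0 through 99, never 100. The expected value of a discrete uniform distribution on 0..99 is (0 + 99) / 2 = 49.5, and by the law of large numbers the sample mean approaches that as the sample grows. A minimal sketch checking this:

```python
import numpy as np

# randint's high parameter is exclusive, so the draws cover 0..99.
# Expected mean of a discrete uniform on 0..99:
expected = (0 + 99) / 2
print(expected)  # 49.5

# With a large sample, no value ever reaches 100 and the
# sample mean lands close to 49.5:
a = np.random.randint(0, 100, 1_000_000)
print(a.min(), a.max())  # almost surely 0 and 99
print(a.mean())          # close to 49.5
```

If you actually want draws that include 100, pass `high=101` (or use `np.random.randint(0, 101, size)`), and the mean converges to 50 instead.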

Read more here: https://stackoverflow.com/questions/66324801/why-does-mean-of-numpy-randint0-100-converge-to-49-49

Content Attribution

This content was originally published by DonCarleone at Recent Questions - Stack Overflow, and is syndicated here via their RSS feed. You can read the original post over there.
