[M-14]
Write a main() program that generates a large number of random integers in the range 0..9. Create a histogram of the numbers: an array that holds the number of times each integer occurred. Then compute the average of all the values.
Here is one run of the program, for N == 100000000:

    0: 10001701
    1: 10001203
    2: 10001787
    3: 9999485
    4: 9999979
    5: 9997625
    6: 10002778
    7: 9995303
    8: 9999888
    9: 10000251
    average = 4.49976
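A minimal sketch of such a program in C; the seeding with time() and the use of rand() % 10 (which carries a small modulo bias, negligible at this range) are assumptions, not the only or best choices:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        enum { RANGE = 10 };           /* values run from 0 to RANGE-1 */
        const long N = 100000000;      /* number of draws, as in the run above */
        long hist[RANGE] = {0};        /* hist[d] counts how often d occurred */
        long long sum = 0;             /* running sum for the average */

        srand((unsigned) time(NULL));
        for (long i = 0; i < N; ++i) {
            int r = rand() % RANGE;    /* random integer in 0..9 */
            hist[r]++;
            sum += r;
        }
        for (int d = 0; d < RANGE; ++d)
            printf("%d: %ld\n", d, hist[d]);
        printf("average = %g\n", (double) sum / N);
        return 0;
    }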
You should expect each histogram cell to contain about one tenth of the total, here about 10000000. Of course, there will be small deviations above and below that value; a really big deviation would be a sign that something is wrong.
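To make "really big" concrete: each cell's count behaves like a binomial random variable with mean N/10 and standard deviation sqrt(N * (1/10) * (9/10)). For N == 100000000 that is sqrt(9000000) == 3000, so deviations of a few thousand, as in the run above, are entirely normal, while a deviation of tens of thousands would be suspicious.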
The average for a uniform distribution should be half the sum of the lowest and highest possible values; in our case, (0+9)/2 == 4.5. For actual data, the average is the sum of all the values divided by the number of values.
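Note that if only the histogram is kept, the sum of all the values can still be recovered from it, since value d contributed d * hist[d] to the total. A sketch of that variant (the function name and parameters are hypothetical):

    /* Average recovered from the histogram alone: value d occurred
       hist[d] times, so the total of all values is the sum of d * hist[d]. */
    double average_from_hist(const long hist[], int range, long n) {
        long long sum = 0;
        for (int d = 0; d < range; ++d)
            sum += (long long) d * hist[d];
        return (double) sum / n;
    }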