Point x with higher density than y has lower probability than y

Question:

I am using a beta distribution to model my problem.

I fit the curve by optimizing the a and b parameters and get the outcome I want. But when I try to compare the probabilities of two points by integrating the density over small intervals around each one, the point with the smaller density ends up with the higher probability, which does not make any sense to me.

import matplotlib.pyplot as plt
from scipy.integrate import quad
from scipy.stats import beta

a = 5
b = 16.2

def f(x):
    return beta.pdf(x, a, b)

# mark the two points on the fitted curve (df is defined elsewhere)
plt.scatter(0.22, f(0.22))
plt.scatter(0.24, f(0.24))
plt.plot(df["model's outcome"], df['vals'])

# integrate the density over a small interval around each point
res, err = quad(f, 0.22, 0.222222222222221)
res2, err = quad(f, 0.24, 0.244444444444441)
print(res)
print(res2)

outcome:

0.009754484452173983

0.018502765135697426

beta distribution

Asked By: pete


Answers:

To get from 0.22 to 0.222222222222221 you have to add about 0.00222.

To get from 0.24 to 0.244444444444441 you have to add about 0.00444.

Your second interval is twice as wide as the first, so it encloses roughly twice the area under the curve. That extra width, not a higher density, is where the larger probability comes from.
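With equal-width intervals the ordering matches the densities. A quick check, using the same a and b as in the question and a common width h for both intervals:

```python
from scipy.integrate import quad
from scipy.stats import beta

a, b = 5, 16.2

def f(x):
    return beta.pdf(x, a, b)

# Use the SAME width h for both intervals so the comparison is fair.
h = 0.00222
p1, _ = quad(f, 0.22, 0.22 + h)
p2, _ = quad(f, 0.24, 0.24 + h)

print(f(0.22) > f(0.24))  # 0.22 has the higher density
print(p1 > p2)            # and now also the higher probability
```

For small h, each integral is approximately pdf(x) * h, so with equal widths the point with the higher density always gets the larger probability.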

Answered By: user3808430