How can I correctly calculate the sum of all numbers in a range in a while loop?

Question:

I am trying to calculate, using a while loop, the sum of all numbers from 20 to 100. The code runs and prints a result, but I am 100% sure the result is wrong.

My code is:

num = 20
sum = 0
i = 0
while num <= 100:
    for i in range(0, num + 1):
        sum = num + 1
        num = num + 1   
    continue
    
print(sum)

The result is 167.
But I know it has to be 210.

I tried various changes: for example, I moved num = num + 1 to after the for loop, but it still does not work properly.

Asked By: Lina


Answers:

Here is a program that sums the integers from 20 to 100, inclusive:

sum = 0

for i in range(20, 101):
    sum += i

print("Sum: %d" %sum)

I saved this to a file called sumbyloop.py and ran the following command:

% python sumbyloop.py
Sum: 4860

We can verify this answer mathematically.

SUM(20 to 100) = (100 - 20 + 1) * (100 + 20)/2 = 81 * 60 = 4860 [see footnotes]
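We can also verify it in code. Here is a quick sanity check (a standalone snippet, not part of the program above) showing that the closed form and a direct summation agree:

n = 100 - 20 + 1                    # number of terms: 81
closed_form = n * (20 + 100) // 2   # Gauss's formula: n * (a + b) / 2
assert closed_form == sum(range(20, 101)) == 4860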

Also, the sum of all integers from 20 to 100 inclusive cannot be 210, since 100 + 99 + 98 = 297 already exceeds 210.

So I think the sum from 20 to 100 inclusive is 4860.

I hope this answer helps. You asked an excellent question.

One more comment: A for loop is especially well suited to this task, and the range function was made for it. The call range(a, b + 1) produces the integers from a to b inclusive. (In Python 3, range returns a lazy range object rather than a list; wrap it in list() to see the values.)

% python 
>>> r = range(1, 10)
>>> r
range(1, 10)
>>> list(r)
[1, 2, 3, 4, 5, 6, 7, 8, 9]
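For what it's worth, the built-in sum function accepts any iterable, so once you have the right range the whole computation collapses to a one-liner:

% python
>>> sum(range(20, 101))
4860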

Footnotes:

[1] The sum of an arithmetic series is the number of terms n multiplied by the average of the first and last terms: s = n(a + b)/2, where a is the first term and b is the last. This formula works for any arithmetic series, that is, a series in which each term increases by a constant difference.

[2] I’m writing sum(20, 100, 1) for the series whose first term is 20, last term is 100, and common difference is 1. It has 100 - 20 + 1 = 81 terms and its average term is (20 + 100)/2 = 60, so sum(20, 100, 1) = 81 * 60 = 4860. This way of summing an arithmetic series, as the number of terms times the average term, is famously associated with Carl Friedrich Gauss.

Answered By: ktm5124

You use two loops, but the inner one is not needed; you can do everything inside the while loop, which already takes num from 20 to 100.

You should also use sum = sum + num instead. If you think about it, by doing + 1 in both assignments the two variables just move in lockstep, and no other part of the code actually performs the summation.

So first add num to sum, and then increment num by 1.
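A minimal sketch of that fix (keeping the variable names from the question, though note that a variable named sum shadows the built-in function):

num = 20
sum = 0
while num <= 100:
    sum = sum + num   # accumulate the current number first...
    num = num + 1     # ...then advance to the next one
print(sum)            # prints 4860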

Answered By: Zoltán Schmidt