# Sum All Primes (with sieve of Eratosthenes)


#1

Tell us what’s happening:
I don’t understand why it doesn’t work. I used the sieve of Eratosthenes.
Thank you.

```js
function sumPrimes(num) {
  var primes = [];
  var sum = 0;

  for (var i = 2; i < num; i++) {
    primes[i] = true;
  }

  var limit = Math.sqrt(num);

  for (var i = 2; i < limit; i++) {
    if (primes[i] === true) {
      for (var j = i * i; j < num; j += i) {
        primes[j] = false;
      }
    }
  }

  for (var i = 2; i <= num; i++) {
    if (primes[i] === true) sum += i;
  }

  return sum;
}

sumPrimes(977);
```


#2

Some of your comparisons with `num` are not inclusive (i.e., you should have used `<=` rather than `<`).
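To make the point concrete, here is a sketch of the posted code with those comparisons made inclusive, so that `num` itself is kept when it is prime (for example, `977` is prime, so the original `< num` loops silently drop it):

```javascript
// Sketch: the OP's sieve with every comparison against `num` made
// inclusive, so the candidate range is 2..num rather than 2..num-1.
function sumPrimes(num) {
  var primes = [];
  var sum = 0;

  // Mark 2..num (inclusive) as candidate primes.
  for (var i = 2; i <= num; i++) {
    primes[i] = true;
  }

  var limit = Math.sqrt(num);

  // Cross off multiples of each prime up to and including sqrt(num).
  for (var i = 2; i <= limit; i++) {
    if (primes[i] === true) {
      for (var j = i * i; j <= num; j += i) {
        primes[j] = false;
      }
    }
  }

  for (var i = 2; i <= num; i++) {
    if (primes[i] === true) sum += i;
  }

  return sum;
}

sumPrimes(10); // 2 + 3 + 5 + 7 = 17
```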

#3

What kevcomedia said is true. I used this approach too and ran into basically the same problem.

#4

I thought I’d reply to this thread because I think my solution is similar to the OP’s, but it is always missing the last integer in the final sum.

```js
function sumPrimes(num) {
  const sieve = new Array(num).fill(true)

  for (let x = 2; x <= Math.sqrt(num); x++) {
    if (sieve[x]) {
      for (let y = Math.pow(x, 2); y <= num; y += x) {
        sieve[y] = false
      }
    }
  }

  return sieve.reduce((primesArray, prime, index) => {
    if (prime && index > 1) {
      primesArray.push(index);
    }

    return primesArray;
  }, []).reduce((x, y) => x + y);
}
```

I followed the Sieve of Eratosthenes algorithm described on Wikipedia, and it always fails to include the last element of the array in the calculation. So `sumPrimes(10)` correctly outputs 17, but `sumPrimes(23)` outputs 77 instead of 100. Can anyone help me with my code, please?

#5

Consider splitting your function into two parts:

1. The Sieve of Eratosthenes, returning a list of primes
2. A function that sums the primes

`sumPrimes(num)` would then call `sieve(num + 1)`, and you are done.
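One way to realize that split might look like the following (function names and details are illustrative, not the only way to do it): a `sieve(limit)` that returns the primes strictly below `limit`, and a `sumPrimes(num)` that calls it with `num + 1` so `num` itself is included.

```javascript
// Sieve of Eratosthenes: returns all primes strictly below `limit`.
function sieve(limit) {
  // flags[i] is true while i is still a prime candidate.
  const flags = new Array(limit).fill(true);
  flags[0] = flags[1] = false;

  for (let i = 2; i * i < limit; i++) {
    if (flags[i]) {
      // Cross off multiples of i, starting at i*i.
      for (let j = i * i; j < limit; j += i) {
        flags[j] = false;
      }
    }
  }

  const primes = [];
  for (let i = 2; i < limit; i++) {
    if (flags[i]) primes.push(i);
  }
  return primes;
}

function sumPrimes(num) {
  // sieve(num + 1) yields the primes up to and including num.
  return sieve(num + 1).reduce((a, b) => a + b, 0);
}

sumPrimes(23); // 2+3+5+7+11+13+17+19+23 = 100
```

Keeping the sieve separate also makes it independently testable, which is how the off-by-one at the top of this thread would have been caught.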

#6

I should probably clarify why I think you needed to do this. It comes from this line in the Wikipedia article:

> Let A be an array of Boolean values, indexed by integers 2 to n

`const sieve = new Array(num).fill(true)`

1. An array of Boolean values: check.
2. Indexed by integers `2` to `n`: not check.

Your array is indexed by `0` to `n-1`, which causes the problem: `sieve[n]` starts out undefined (which is falsy), and if it ever gets accessed it would only be to set it to `false`. You correctly set `sieve[0]` and `sieve[1]` to `false` implicitly by ignoring them in the sum (though you could just set them to `false` in the sieve function).
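A quick snippet demonstrating the off-by-one, using the same `new Array(num).fill(true)` construction as the posted code (variable names chosen just for illustration):

```javascript
// An array of length `num` has indices 0..num-1, so sieve[num] is
// undefined. Undefined is falsy, so `num` itself (23 here, a prime)
// never makes it into the sum.
const num = 23;
const sieve = new Array(num).fill(true);

console.log(sieve.length);   // 23
console.log(sieve[num - 1]); // true (the last real index, 22)
console.log(sieve[num]);     // undefined, i.e. falsy
```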