JavaScript negative numbers modulus bug

I just got bit by a rather weird bug. In some programming languages (Python and Ruby, for example) you can do “a modulo b” and always get back a number between 0..(b-1). This is wonderful when you have an array of n items and you want to select an index in the range 0..(n-1) from some kind of incrementing counter. For example:

var letters = ['a', 'b', 'c', 'd'];
var idx = 0;
for (var i = 0; i < 100; i++) {
  window.console.log(letters[idx % letters.length]);
  idx++;
}

This should print to the debug console: a b c d a b c d … and so on. The “% letters.length” keeps the integer indexes in the range 0..3 in this case.

So the cool thing is that instead of incrementing idx you could decrement it and walk the array in reverse:

var letters = ['a', 'b', 'c', 'd'];
var idx = 0;
for (var i = 0; i < 100; i++) {
  window.console.log(letters[idx % letters.length]);
  idx--;
}

And we should see: a d c b a d c b a d c b a … and so on.

But not in JavaScript.

Apparently JavaScript constrains the magnitude but keeps the sign: % is a remainder operator, not a true modulo, and its result takes the sign of the dividend. (This is the specified ECMAScript behavior, the same as C and Java, so it isn’t going to change.) The values of idx % letters.length in this case go: 0 -1 -2 -3 0 -1 -2 -3 … and so on, and letters[-1] is simply undefined. What we expected was that it would go: 0 3 2 1 0 3 2 1.
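
You can see the behavior directly in any browser console (a minimal demonstration; the specific values are mine, not from the original example):

window.console.log(5 % 4);   // 1
window.console.log(-1 % 4);  // -1, not 3: the sign follows the dividend
window.console.log(-5 % 4);  // -1
window.console.log(['a', 'b', 'c', 'd'][-1]); // undefined, so nothing useful prints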

The solution (which also ended up on About.com) is to add the divisor back to the remainder and take the remainder a second time. It’s totally inefficient, I know. So in this example:

var letters = ['a', 'b', 'c', 'd'];
var idx = 0;
for (var i = 0; i < 100; i++) {
  // ((idx % n) + n) % n maps negative remainders back into 0..n-1
  window.console.log(letters[((idx % letters.length) + letters.length) % letters.length]);
  idx--;
}

or, as a formula: ((a % b) + b) % b
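
If this comes up more than once, it’s easy to wrap in a helper. A minimal sketch (the function name mod is mine, not something built into JavaScript):

function mod(a, b) {
  // True mathematical modulo: the result always lands in 0..(b-1) for positive b
  return ((a % b) + b) % b;
}

var letters = ['a', 'b', 'c', 'd'];
window.console.log(letters[mod(-1, letters.length)]); // d
window.console.log(letters[mod(-5, letters.length)]); // d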

Ugh.
