Order Matters

May 8, 2021

I was recently reminded that integer division is…well, integer division.

Local variables, a globally defined constant, and function parameters (all holding integer values) made this problem more difficult to see in my case, but here’s a simple example:

#include <stdio.h>

int main() {
    int result = 1000 / 10 * 5;
    int reordered = 5 / 10 * 1000;
    printf("result is %d\n", result);
    printf("reordered is %d\n", reordered);
    return 0;
}

Type the result and reordered expressions into a calculator and you'll get an identical answer: 500. It's easy, especially during a quick code review, to ignore the types, mentally execute both lines, and conclude they are functionally identical. So easy, in fact, that one might forget that with integer division, order matters:

result is 500
reordered is 0

Remember, every value in these expressions is an integer, whether it appears explicitly as a literal or implicitly through a variable, so the fractional part of each division is discarded.

// int result = 1000 / 10 * 5;
// ((1000 / 10) * 5)
// ((100) * 5)
// (500)

// int reordered = 5 / 10 * 1000;
// ((5 / 10) * 1000)
// ((0) * 1000) <----- 0.5 cannot be represented as an integer
// (0)
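The fix is the same trick in reverse: put the multiplication first so no intermediate value gets truncated to zero. Here's a minimal sketch of that idea (the variable names are mine, not from the example above):

// Multiply first, then divide: 5 * 1000 is 5000, and 5000 / 10 is 500.
int multiply_first = 5 * 1000 / 10;

// Or force floating-point arithmetic and convert at the end.
int via_double = (int)(5.0 / 10.0 * 1000.0);    // 0.5 * 1000.0 is 500.0

Multiplying first does assume the intermediate product still fits in an int.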

Replace the integer literals in this example with local variables, a global constant, and function parameters, and it's easy to imagine a reordering mistake like this slipping through review, as in the sketch below.
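To make that concrete, here's a rough sketch of the shape the bug can take once the literals are hidden behind names; the constant, function, and values below are invented for illustration, not the actual code I was working in:

#include <stdio.h>

// Hypothetical constant and function, for illustration only.
#define ITEMS_PER_BATCH 10

// Algebraically this is count * scale / ITEMS_PER_BATCH, but because the
// division runs first, any count below ITEMS_PER_BATCH truncates to 0.
int scaled_batches(int count, int scale) {
    return count / ITEMS_PER_BATCH * scale;
}

int main() {
    printf("%d\n", scaled_batches(5, 1000));    // prints 0, not 500
    return 0;
}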

This is partly a reminder of the importance of unit tests (or regression tests of some kind). Ideally, an incorrect change introducing this behavior would be caught immediately. Well-written unit tests are great at pointing out simple bugs like this, though that assumes the bug breaks a functional requirement of the code under test.
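As a rough illustration, a check as small as the one below, written against the hypothetical scaled_batches from the sketch above and assuming its requirement is to return 500 for these inputs, would fail the moment the operands are reordered into the truncating form:

#include <assert.h>

// Defined in the earlier sketch; declared here so the test reads on its own.
int scaled_batches(int count, int scale);

// Hypothetical regression test: assumes the requirement is that 5 items
// scaled by 1000 should yield 500. The reordered, truncating version
// returns 0 and trips the assert immediately.
void test_scaled_batches(void) {
    assert(scaled_batches(5, 1000) == 500);
}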

But this is mostly just a friendly reminder that with integer division, just like many other pockets of programming, order matters.