@deprecated_ii I recall why this happens. If you divide two integers, the result is kept as an integer and rounded down, toward negative infinity. Always down, never to the nearest whole number.
So when you have 13/6 = 2.1666..., you get 2, and when you have -13/6 = -2.1666..., you get -3. Rounded down each time, not to the nearest whole number.
Maybe this was specific to a particular language, but I remember this exact behavior and why it happens that way.
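For what it's worth, the behavior you're describing is floor division, which is exactly what Python's // operator does. Not every language works this way: C, C++, and Java integer division truncates toward zero instead, so -13/6 gives -2 there, not -3. A quick sketch in Python showing both:

```python
import math

# Floor division rounds toward negative infinity,
# matching the behavior described above.
print(13 // 6)    # 2   (2.1666... floored down to 2)
print(-13 // 6)   # -3  (-2.1666... floored down to -3)

# For comparison, truncation toward zero (what C-family
# integer division does) gives -2 for the negative case.
print(math.trunc(-13 / 6))  # -2
```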