I ran into the oddest (no pun intended) behaviour while programming in C#. I was rounding a double value and found that it would not give me a sensible answer. When I tried rounding a value of 3.5, I got the expected answer of 4, but when rounding 4.5, I was getting a nonsensical result:

```csharp
Console.WriteLine("{0} => {1}", 3.5, Math.Round(3.5));
Console.WriteLine("{0} => {1}", 4.5, Math.Round(4.5));
```

which prints:

```
3.5 => 4
4.5 => 4
```

Yeah, rounding 4.5 gives you 4 – as in “four”… – as in not 5 – as in **WTF**!

It turns out that this is a “*valid*” rounding method, known as *banker’s rounding*: midpoint values are rounded to the nearest **even** number. Apparently, this is done to even out a bias in traditional rounding: of the final digits 1–9, five of them (5–9) round up but only four (1–4) round down, so always rounding .5 up skews sums and averages slightly upward. Okay, I’ll buy that – for some obscure branches of mathematics it might make sense to use this unbiased scheme.
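For what it’s worth, the bias argument is easy to check empirically. Here’s a quick sketch – in Python rather than C#, since its `decimal` module conveniently exposes both midpoint rules – comparing the two schemes over the midpoints 0.5 through 7.5:

```python
# Compare round-half-away-from-zero (ROUND_HALF_UP) with round-half-to-even
# ("banker's rounding", ROUND_HALF_EVEN) over the midpoints 0.5, 1.5, ..., 7.5.
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

midpoints = [Decimal(n) + Decimal("0.5") for n in range(8)]  # 0.5 .. 7.5

half_up = [int(m.quantize(Decimal("1"), rounding=ROUND_HALF_UP)) for m in midpoints]
half_even = [int(m.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)) for m in midpoints]

print(half_up)    # [1, 2, 3, 4, 5, 6, 7, 8] -- every midpoint moved up
print(half_even)  # [0, 2, 2, 4, 4, 6, 6, 8] -- ups and downs alternate

# Total rounding error: half-up is biased upward, half-even cancels out.
print(sum(r - float(m) for r, m in zip(half_up, midpoints)))    # 4.0
print(sum(r - float(m) for r, m in zip(half_even, midpoints)))  # 0.0
```

Every midpoint gets pushed upward under the traditional scheme, while banker’s rounding alternates up and down, so the errors cancel.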

But why, *why*, **why** on this, or any other sane planet in the universe, would Microsoft select this (almost) useless algorithm as the **DEFAULT** method of rounding when everyone (obscure mathematicians aside) expects values to round up at .5?!?!

Anyway – if you want proper rounding, make sure you call Math.Round as follows:

```csharp
Console.WriteLine("{0} => {1}", 3.5, Math.Round(3.5, 0, MidpointRounding.AwayFromZero));
Console.WriteLine("{0} => {1}", 4.5, Math.Round(4.5, 0, MidpointRounding.AwayFromZero));
```

which prints:

```
3.5 => 4
4.5 => 5
```
