I recently ran across a piece of T-SQL code that looked something like this:

DECLARE @test int = 1;
SELECT CASE WHEN (@test = 1) THEN 5.75
            WHEN (@test = 2) THEN CAST(3.25 AS DECIMAL(38,0))
       END AS Result;

At first glance you would expect this to return 5.75, but it actually returns 6. Why does this happen? A CASE expression in T-SQL can only return a single data type, so every THEN branch must be converted to the same type. In the example the two branches return different types, so SQL Server uses data type precedence (along with the decimal precision and scale rules) to determine the result type. A DECIMAL(38,0) already uses the maximum precision of 38, which leaves no room for the fractional digits of 5.75. SQL Server therefore converts 5.75 to DECIMAL(38,0), which rounds it up to 6. If we change the DECIMAL(38,0) to DECIMAL(10,0) in this example, we get the expected result, 5.75, because the combined type now has room for the decimal portion.
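Here is the same query with DECIMAL(10,0), as a quick sketch to try yourself. Per SQL Server's decimal result rules, the combined type should work out to roughly DECIMAL(12,2), wide enough to hold both branches:

DECLARE @test int = 1;
SELECT CASE WHEN (@test = 1) THEN 5.75
            WHEN (@test = 2) THEN CAST(3.25 AS DECIMAL(10,0))
       END AS Result;
-- Returns 5.75: the fractional digits now survive the conversion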

If the data types in the WHEN clauses are entirely different, SQL Server uses the type with the highest precedence. For example, this code will return “1900-01-06 18:00:00.000”:

DECLARE @test int = 1;
SELECT CASE WHEN (@test = 1) THEN 5.75
            WHEN (@test = 2) THEN GETDATE()
       END AS Result;

DateTime has a higher precedence than decimal, so 5.75 gets cast to a DateTime. You can find the precedence order for all the T-SQL data types here.
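To see that conversion in isolation: SQL Server treats a numeric value converted to DateTime as a number of days after 1900-01-01, so 5.75 means 5 days plus 0.75 of a day (18 hours):

SELECT CAST(5.75 AS DATETIME) AS Result;
-- Returns 1900-01-06 18:00:00.000, matching the CASE example above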

To avoid any confusion, it is always best to make sure your CASE expressions return the same data type in each WHEN clause.
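For example, the original query could be written with an explicit CAST in every branch so there is no implicit conversion at all (DECIMAL(10,2) here is just an illustrative choice, not something mandated by the original code):

DECLARE @test int = 1;
SELECT CASE WHEN (@test = 1) THEN CAST(5.75 AS DECIMAL(10,2))
            WHEN (@test = 2) THEN CAST(3.25 AS DECIMAL(10,2))
       END AS Result;
-- Both branches are already DECIMAL(10,2), so 5.75 comes back unchanged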