Decimal arithmetic in SQL

The following formulas define the precision and scale of the result of decimal operations in SQL. The symbols p and s denote the precision and scale of the first operand and the symbols p' and s' denote the precision and scale of the second operand.

The symbol mp denotes the maximum precision. The value of mp is 63 if:

  • either p or p' is greater than 31, or
  • a value of 63 was explicitly specified for the maximum precision.

Otherwise, the value of mp is 31.
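As an illustration only (not IBM-supplied code), the rule for choosing mp can be sketched as a small Python helper; the parameter names are invented for this sketch:

```python
def max_precision(p, p2, explicit_mp=None):
    """Return the maximum precision mp for a decimal operation.

    p, p2        -- precisions of the two operands
    explicit_mp  -- 63 if a maximum precision of 63 was explicitly
                    specified (e.g. via DECRESULT), else None
    """
    # mp is 63 if either operand precision exceeds 31, or if 63
    # was explicitly specified; otherwise mp is 31.
    if explicit_mp == 63 or p > 31 or p2 > 31:
        return 63
    return 31
```

For example, two DECIMAL(31, s) operands yield mp = 31, while a DECIMAL(38, s) operand forces mp = 63.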

The symbol ms denotes the maximum scale. The default value of ms is 31. ms can be explicitly set to any number from 0 to the maximum precision.

The symbol mds denotes the minimum divide scale. The default value of mds is 0, where 0 indicates that no minimum scale is specified. mds can be explicitly set to any number from 1 to min(ms, 9).

The maximum precision, maximum scale, and minimum divide scale can be explicitly specified on the DECRESULT parameter of the CRTSQLxxx command, the RUNSQLSTM command, or the SET OPTION statement. They can also be specified in ODBC data sources, JDBC properties, OLE DB properties, and .NET properties.

Addition and subtraction

The scale of the result of addition and subtraction is max(s, s'). The precision is min(mp, max(p-s, p'-s') + max(s, s') + 1).
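The addition/subtraction rule can be sketched in Python (an illustrative calculator, not part of the SQL implementation):

```python
def add_sub_type(p, s, p2, s2, mp=31):
    """Result precision and scale for decimal addition/subtraction."""
    # Scale of the result: max(s, s')
    scale = max(s, s2)
    # Precision: min(mp, max(p-s, p'-s') + max(s, s') + 1)
    precision = min(mp, max(p - s, p2 - s2) + scale + 1)
    return precision, scale
```

For example, DECIMAL(7,2) + DECIMAL(5,0) yields DECIMAL(8,2): the scale is max(2,0) = 2 and the precision is min(31, max(5,5) + 2 + 1) = 8.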

Multiplication

The precision of the result of multiplication is min(mp, p+p') and the scale is min(ms, s+s').
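A sketch of the multiplication rule, again as an illustrative Python helper rather than actual product code:

```python
def mul_type(p, s, p2, s2, mp=31, ms=31):
    """Result precision and scale for decimal multiplication."""
    precision = min(mp, p + p2)   # min(mp, p + p')
    scale = min(ms, s + s2)       # min(ms, s + s')
    return precision, scale
```

For example, DECIMAL(7,2) * DECIMAL(5,0) yields DECIMAL(12,2), and two DECIMAL(20,10) operands are capped by mp and ms at DECIMAL(31,20).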

Division

The precision of the result of division is (p-s+s') + max(mds, min(ms, mp - (p-s+s'))). The scale is max(mds, min(ms, mp - (p-s+s'))). The scale must not be negative.
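The division rule can be sketched the same way; this illustrative helper also checks the requirement that the scale not be negative:

```python
def div_type(p, s, p2, s2, mp=31, ms=31, mds=0):
    """Result precision and scale for decimal division."""
    # Integer digits of the dividend plus the scale of the divisor: p-s+s'
    base = p - s + s2
    # Scale: max(mds, min(ms, mp - (p-s+s')))
    scale = max(mds, min(ms, mp - base))
    if scale < 0:
        raise ValueError("result scale must not be negative")
    # Precision: (p-s+s') + scale
    return base + scale, scale
```

For example, DECIMAL(7,2) / DECIMAL(5,0) yields DECIMAL(31,26): the base term is 7-2+0 = 5, so the scale is max(0, min(31, 31-5)) = 26 and the precision is 5 + 26 = 31.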