Precision
Precision is the maximum number of significant digits that can result from an operation.
The expression on the NUMERIC DIGITS instruction is evaluated and must result in a positive whole number. This value defines the precision (the number of significant digits) to which calculations are carried out; results are rounded to that precision when necessary.
If you do not specify expression in this instruction, or if no NUMERIC DIGITS instruction has been processed since the start of the program, the default precision is used. The REXX standard for the default precision is 9.
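A brief sketch of how the precision setting affects results; the values shown in the comments follow the standard REXX rounding rules:

```rexx
say digits()        /* 9, the default precision                  */
say 2/3             /* 0.666666667, rounded to 9 digits          */

numeric digits 5
say 1/3             /* 0.33333, rounded to 5 digits              */
say 1 + 1.234567    /* 2.2346, result rounded to 5 digits        */
```

Note that rounding applies to the result of each operation, not only to the final value of a larger expression.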
Note that NUMERIC DIGITS can set the precision below the default of nine. However, use small values with care; the loss of precision and rounding that they request affects all REXX computations, including, for example, the computation of new values for the control variable in DO loops.
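The DO-loop hazard can be made concrete. With only three significant digits, adding 1 to the control variable once it reaches 1000 is lost to rounding, so the loop below never terminates:

```rexx
numeric digits 3
do i = 999 to 1001     /* i + 1 is computed with 3 significant digits */
   say i               /* displays 999, then 1.00E+3 forever:         */
end                    /* 1.00E+3 + 1 = 1001, which rounds back to    */
                       /* 1.00E+3, so i never exceeds the TO value    */
```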
