
Quick Reference

1 A unit of computational cost associated with matrix and vector operations. The term is widely used in algorithms for numerical linear algebra. A flop is approximately the amount of work required to compute, for example in Fortran, an expression of the form S = S + A(I,J) * X(J), i.e. a floating-point multiplication and a floating-point addition, together with the effort involved in subscripting. Thus, for example, Gaussian elimination for a system of order n requires a number of flops of order n³ (see linear algebraic equations). As computer design and architecture change, this unit of cost may well undergo modification.
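The order-n³ count for Gaussian elimination can be checked directly. The following is a minimal Python sketch (the function name and loop structure are illustrative, not from the entry) that tallies one flop per multiply-add update of the form s = s + a*x during elimination, then compares the total with n³/3:

```python
def gaussian_elimination_flops(n):
    """Count multiply-add flops in Gaussian elimination on an n-by-n system.

    One flop is charged for each update of the form
    A[i][j] = A[i][j] - m * A[k][j], matching the multiply-plus-add
    unit described in the entry (pivot divisions are not counted).
    """
    flops = 0
    for k in range(n - 1):             # elimination step k
        for i in range(k + 1, n):      # each row below the pivot
            for j in range(k + 1, n):  # update the remaining entries
                flops += 1
    return flops

n = 100
count = gaussian_elimination_flops(n)
print(count, n**3 // 3)  # the two totals agree to within a few per cent
```

The exact total is (n-1)n(2n-1)/6, which grows like n³/3, consistent with the order-n³ cost stated above.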


2 Short for floating-point operation. See also flops.

Subjects: Computing.
