Orthonormal Basis (Gram–Schmidt)

Intro: We build an orthonormal set $u_1,u_2,\dots$ from input vectors $v_1,v_2,\dots$ by subtracting projections onto the previously computed vectors and normalizing the remainder. Results are given as exact fractions and radicals whenever possible.
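In symbols, each step subtracts from $v_k$ its projections onto the already-computed vectors and normalizes what remains:

$$w_k = v_k - \sum_{j=1}^{k-1} (v_k \cdot u_j)\,u_j, \qquad u_k = \frac{w_k}{\lVert w_k \rVert}.$$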

Worked example
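
Take $v_1 = (1,1,0)$ and $v_2 = (1,0,1)$ (illustrative inputs, not fixed by the calculator).

Step 1: $\lVert v_1 \rVert = \sqrt{2}$, so $u_1 = \frac{1}{\sqrt{2}}(1,1,0)$.

Step 2: $w_2 = v_2 - (v_2 \cdot u_1)\,u_1 = (1,0,1) - \tfrac{1}{2}(1,1,0) = \left(\tfrac{1}{2}, -\tfrac{1}{2}, 1\right)$.

Step 3: $\lVert w_2 \rVert = \tfrac{\sqrt{6}}{2}$, so $u_2 = \frac{1}{\sqrt{6}}(1,-1,2)$.

The resulting set $\{u_1, u_2\}$ is orthonormal and spans the same plane as $\{v_1, v_2\}$.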

FAQs

Linear dependence?

If a vector becomes zero after the projections are subtracted, it is linearly dependent on the previous ones and is skipped; the remaining nonzero vectors still form an orthonormal set.
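
For instance, appending $v_3 = v_1 + v_2 = (2,1,1)$ to the worked example above gives $w_3 = v_3 - (v_3 \cdot u_1)\,u_1 - (v_3 \cdot u_2)\,u_2 = 0$, so $v_3$ is skipped and the basis stays $\{u_1, u_2\}$.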

Order sensitivity?

Yes. Different orders of the input vectors can produce different orthonormal sets, though all span the same subspace.
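
For example, reversing the worked-example inputs and processing $v_2 = (1,0,1)$ first gives $u_1 = \frac{1}{\sqrt{2}}(1,0,1)$, a different orthonormal basis for the same plane.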

Numerical stability?

For floating-point data, Modified Gram–Schmidt improves stability; QR via Householder is even more robust.
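
The sketch below shows the modified variant in Python with NumPy; the function name and tolerance are illustrative choices, not part of this calculator. Instead of subtracting all projections of the original vector at once, it removes them one at a time from the running remainder, which loses less orthogonality in floating point; it also skips near-zero remainders, as discussed above.

```python
import numpy as np

def modified_gram_schmidt(vectors, tol=1e-12):
    """Orthonormal basis for the span of `vectors` (illustrative sketch)."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        for u in basis:
            w -= (w @ u) * u          # subtract projection onto u from the running remainder
        norm = np.linalg.norm(w)
        if norm > tol:                # skip (near-)dependent inputs
            basis.append(w / norm)
    return basis

# Inputs from the worked example above; the dependent third vector is skipped
for u in modified_gram_schmidt([[1, 1, 0], [1, 0, 1], [2, 1, 1]]):
    print(u)
```

For production work, `numpy.linalg.qr` (Householder-based) is the usual choice.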

How this calculator works
