Orthonormal Basis (Gram–Schmidt)
Linear Algebra, Vectors
Intro: We build an orthonormal set $u_1,u_2,\dots$ from input vectors $v_1,v_2,\dots$ by subtracting projections and normalizing. Results are given as exact fractions and radicals whenever possible.
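If you want to experiment with the procedure directly, here is a minimal sketch in Python with SymPy that follows the same project–subtract–normalize steps and keeps radicals exact. The function name `gram_schmidt` and the example call are illustrative, not part of the calculator itself.

```python
# Minimal classical Gram–Schmidt sketch with exact arithmetic (illustrative).
from sympy import Matrix, simplify

def gram_schmidt(vectors):
    """Return an orthonormal list built from `vectors`; dependent inputs are skipped."""
    basis = []
    for v in vectors:
        w = Matrix(v)
        for u in basis:
            w = w - w.dot(u) * u          # subtract the projection onto each earlier u
        w = w.applyfunc(simplify)
        if w.norm() != 0:                 # a zero remainder means v was dependent
            basis.append(w / w.norm())
    return [u.applyfunc(simplify) for u in basis]

u1, u2 = gram_schmidt([(1, 1, 0), (1, 0, 1)])
print(u1.T)   # u1 = (sqrt(2)/2, sqrt(2)/2, 0)
print(u2.T)   # u2 = (sqrt(6)/6, -sqrt(6)/6, sqrt(6)/3)
```

The printed vectors agree with the worked example below up to how SymPy rationalizes the denominators.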
Worked examples
- $v_1=(1,1,0),\; v_2=(1,0,1)$
- Normalize the first vector:
- $$\|v_1\|=\sqrt{1^2+1^2+0^2}=\sqrt{2},\quad u_1=\dfrac{v_1}{\|v_1\|}=\left(\tfrac{1}{\sqrt{2}},\tfrac{1}{\sqrt{2}},0\right).$$
- Project \(v_2\) onto \(u_1\) and subtract:
- $$v_2\cdot u_1=\dfrac{1\cdot1+0\cdot1+1\cdot0}{\sqrt{2}}=\dfrac{1}{\sqrt{2}},\qquad \operatorname{proj}_{u_1}v_2=(v_2\cdot u_1)u_1=\dfrac{1}{2}(1,1,0).$$
- $$w_2=v_2-\operatorname{proj}_{u_1}v_2=(1,0,1)-\left(\tfrac{1}{2},\tfrac{1}{2},0\right)=\left(\tfrac{1}{2},-\tfrac{1}{2},1\right).$$
- Normalize \(w_2\):
- $$\|w_2\|=\sqrt{\tfrac{1}{4}+\tfrac{1}{4}+1}=\sqrt{\tfrac{3}{2}}=\tfrac{\sqrt{6}}{2},\quad u_2=\dfrac{w_2}{\|w_2\|}=\left(\tfrac{1}{\sqrt{6}},-\tfrac{1}{\sqrt{6}},\tfrac{2}{\sqrt{6}}\right)=\left(\tfrac{1}{\sqrt{6}},-\tfrac{1}{\sqrt{6}},\tfrac{\sqrt{6}}{3}\right).$$
- Orthonormality checks:
- $$\|u_1\|=\|u_2\|=1,\qquad u_1\cdot u_2=\tfrac{1}{\sqrt{2}\sqrt{6}}-\tfrac{1}{\sqrt{2}\sqrt{6}}+0=0\;\checkmark$$
- Result:
- $$\boxed{u_1=\left(\tfrac{1}{\sqrt{2}},\tfrac{1}{\sqrt{2}},0\right)},\quad \boxed{u_2=\left(\tfrac{1}{\sqrt{6}},-\tfrac{1}{\sqrt{6}},\tfrac{\sqrt{6}}{3}\right)}.$$
- $v_1=(1,1,0),\; v_2=(1,0,1),\; v_3=(1,1,1)$
- From the previous example: $$u_1=\left(\tfrac{1}{\sqrt{2}},\tfrac{1}{\sqrt{2}},0\right),\quad u_2=\left(\tfrac{1}{\sqrt{6}},-\tfrac{1}{\sqrt{6}},\tfrac{\sqrt{6}}{3}\right).$$
- Remove components of \(v_3\) along \(u_1,u_2\):
- $$v_3\cdot u_1=\tfrac{2}{\sqrt{2}}=\sqrt{2}\Rightarrow \operatorname{proj}_{u_1}v_3=(1,1,0).$$
- $$v_3\cdot u_2=\tfrac{1-1+2}{\sqrt{6}}=\tfrac{2}{\sqrt{6}}=\tfrac{\sqrt{6}}{3}\Rightarrow \operatorname{proj}_{u_2}v_3=\tfrac{\sqrt{6}}{3}u_2=\left(\tfrac{1}{3},-\tfrac{1}{3},\tfrac{2}{3}\right).$$
- $$w_3=v_3-\operatorname{proj}_{u_1}v_3-\operatorname{proj}_{u_2}v_3=(1,1,1)-(1,1,0)-\left(\tfrac{1}{3},-\tfrac{1}{3},\tfrac{2}{3}\right)=\left(-\tfrac{1}{3},\tfrac{1}{3},\tfrac{1}{3}\right).$$
- Normalize \(w_3\):
- $$\|w_3\|=\sqrt{3\cdot(1/9)}=\tfrac{1}{\sqrt{3}},\quad u_3=\dfrac{w_3}{\|w_3\|}=\left(-\tfrac{1}{\sqrt{3}},\tfrac{1}{\sqrt{3}},\tfrac{1}{\sqrt{3}}\right).$$
- Final orthonormal basis:
- $$\boxed{\{u_1,u_2,u_3\}}=\left\{\left(\tfrac{1}{\sqrt{2}},\tfrac{1}{\sqrt{2}},0\right),\left(\tfrac{1}{\sqrt{6}},-\tfrac{1}{\sqrt{6}},\tfrac{\sqrt{6}}{3}\right),\left(-\tfrac{1}{\sqrt{3}},\tfrac{1}{\sqrt{3}},\tfrac{1}{\sqrt{3}}\right)\right\}.$$
- Orthogonality spot-checks:
- $$u_i\cdot u_j=0\;(i\ne j),\;\|u_i\|=1\;\forall i\;\checkmark$$
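As a quick numerical cross-check of the worked examples above, one can compare against NumPy's QR factorization (a sketch, not the calculator's routine; the column signs of $Q$ may differ from the hand computation, since each $u_i$ is only determined up to sign):

```python
# Cross-check the hand-computed basis with NumPy's QR factorization.
import numpy as np

V = np.array([[1.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])            # columns are v1, v2, v3
Q, R = np.linalg.qr(V)

print(np.round(Q, 6))                      # columns match ±u1, ±u2, ±u3
print(np.allclose(Q.T @ Q, np.eye(3)))     # True: the columns are orthonormal
```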
FAQs
Linear dependence?
If a vector becomes zero after subtracting projections, it is dependent on previous ones and is skipped; the remaining nonzero set is still orthonormal.
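A minimal floating-point sketch of this skipping rule (the helper name and the tolerance `tol` are illustrative choices):

```python
# Skip vectors whose remainder is (numerically) zero after projection removal.
import numpy as np

def gram_schmidt_skip(vectors, tol=1e-12):
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in basis:
            w -= (w @ u) * u                   # remove the component along u
        if np.linalg.norm(w) > tol:            # a near-zero remainder means dependence
            basis.append(w / np.linalg.norm(w))
    return basis

# v3 = v1 + v2 is dependent, so only two orthonormal vectors come back.
print(len(gram_schmidt_skip([(1, 1, 0), (1, 0, 1), (2, 1, 1)])))   # 2
```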
Order sensitivity?
Yes. Different orders of the input vectors can produce different orthonormal sets, though all span the same subspace.
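A small illustrative check, assuming NumPy: reordering the input vectors changes the orthonormal vectors produced, while the projector onto their span stays the same.

```python
# Order sensitivity: different input order, different basis, same subspace.
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])                     # columns v1, v2
Q1, _ = np.linalg.qr(A)
Q2, _ = np.linalg.qr(A[:, ::-1])               # same columns, reversed order

print(np.allclose(Q1, Q2))                     # False: different orthonormal vectors
print(np.allclose(Q1 @ Q1.T, Q2 @ Q2.T))       # True: same subspace (same projector)
```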
Numerical stability?
For floating-point data, Modified Gram–Schmidt improves stability; QR via Householder is even more robust.
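For reference, here is a minimal Modified Gram–Schmidt sketch (an illustration of the idea, not the calculator's internal routine): each new $u_k$ is subtracted from all remaining vectors immediately, which is what improves stability in floating point.

```python
# Modified Gram–Schmidt: orthogonalize the remaining columns as soon as each u is found.
import numpy as np

def modified_gram_schmidt(V):
    """V has the input vectors as columns; returns Q with orthonormal columns."""
    V = np.array(V, dtype=float, copy=True)
    n = V.shape[1]
    Q = np.zeros_like(V)
    for k in range(n):
        Q[:, k] = V[:, k] / np.linalg.norm(V[:, k])
        for j in range(k + 1, n):
            V[:, j] -= (Q[:, k] @ V[:, j]) * Q[:, k]   # orthogonalize the rest right away
    return Q

Q = modified_gram_schmidt([[1, 1, 1], [1, 0, 1], [0, 1, 1]])   # columns v1, v2, v3
print(np.allclose(Q.T @ Q, np.eye(3)))                         # True
```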
Why choose MathGPT?
- Get clear, step-by-step solutions that explain the “why,” not just the answer.
- See the operations used at each step (projection, subtraction, normalization, and more).
- Optional animated walk-throughs to make tricky ideas click faster.
- Clean LaTeX rendering for notes, homework, and study guides.
How this calculator works
- Type or paste your vectors (LaTeX input works too).
- Press the Generate a practice question button to generate the orthonormal basis and the full reasoning.
- Review each step to understand which rule was applied and why.
- Practice with similar problems to lock in the method.