Observation: As we observed in Matrix Operations, two non-null vectors X = [x1, …, xn] and Y = [y1, …, yn] of the same shape are orthogonal if their dot product is 0, i.e. X ∙ Y = x1y1 + ⋯ + xnyn = 0. Note that if X and Y are n × 1 column vectors, then X ∙ Y = X^T Y = Y^T X, while if X and Y are 1 × n row vectors, then X ∙ Y = XY^T = YX^T. It is easy to see that (cX) ∙ Y = c(X ∙ Y), (X + Y) ∙ Z = X ∙ Z + Y ∙ Z, X ∙ X = x1² + ⋯ + xn² > 0 for non-null X, and other similar properties of the dot product.
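For example, the following quick check (an illustrative Python/NumPy sketch; the specific vectors are made up for the example) confirms that two vectors with zero dot product are orthogonal in this sense:

```python
import numpy as np

# Two made-up vectors whose dot product is 0, so they are orthogonal.
X = np.array([1.0, 2.0, -1.0])
Y = np.array([3.0, 1.0, 5.0])
print(np.dot(X, Y))   # 1*3 + 2*1 + (-1)*5 = 0
```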
Property 1: If A is an m × n matrix, X is an n × 1 vector and Y is an m × 1 vector, then (AX) ∙ Y = X ∙ (A^T Y).

Proof: (AX) ∙ Y = (AX)^T Y = (X^T A^T) Y = X^T (A^T Y) = X ∙ (A^T Y)

Property 2: If X1, …, Xm are mutually orthogonal vectors, then they are independent.

Proof: Suppose X1, …, Xm are mutually orthogonal and let c1X1 + ⋯ + cmXm = 0. Then for any j, 0 = Xj ∙ (c1X1 + ⋯ + cmXm) = cj (Xj ∙ Xj) since Xj ∙ Xi = 0 when i ≠ j. But since Xj ∙ Xj > 0, it follows that cj = 0. Since this is true for any j, X1, …, Xm are independent.

Property 3: Any set of n mutually orthogonal n × 1 column vectors is a basis for the set of n × 1 column vectors. Similarly, any set of n mutually orthogonal 1 × n row vectors is a basis for the set of 1 × n row vectors.

Proof: This follows by Corollary 4 of Linear Independent Vectors and Property 2.

Observation: Let Cj be the jth column of the identity matrix In. As we mentioned in the proof of Corollary 4 of Linear Independent Vectors, it is easy to see that for any n, C1, …, Cn forms a basis for the set of all n × 1 column vectors. It is also easy to see that the C1, …, Cn are mutually orthogonal.
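The following numeric check (an illustrative Python/NumPy sketch; the matrix sizes, seed, and sample vectors are choices made here, not taken from the text) confirms Property 1 on random data and illustrates Property 3 with three mutually orthogonal columns:

```python
import numpy as np

rng = np.random.default_rng(0)

# Property 1: (AX) . Y = X . (A^T Y) for any conforming A, X, Y.
m, n = 4, 3
A = rng.normal(size=(m, n))
X = rng.normal(size=n)
Y = rng.normal(size=m)
assert np.isclose(np.dot(A @ X, Y), np.dot(X, A.T @ Y))

# Property 3: n mutually orthogonal n x 1 vectors form a basis, i.e. the
# matrix having them as columns has full rank n.
W = np.array([[1.0,  1.0,  0.0],
              [1.0, -1.0,  0.0],
              [0.0,  0.0,  3.0]])
assert np.isclose(np.dot(W[:, 0], W[:, 1]), 0.0)  # columns are mutually orthogonal
assert np.isclose(np.dot(W[:, 0], W[:, 2]), 0.0)
assert np.isclose(np.dot(W[:, 1], W[:, 2]), 0.0)
assert np.linalg.matrix_rank(W) == 3              # so they span all of R^3
```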
We next show that any set of vectors has a basis consisting of mutually orthogonal vectors.

Theorem 1 (Gram-Schmidt Process): Suppose X1, …, Xm are independent n × 1 column vectors. Then we can find n × 1 column vectors V1, …, Vm which are mutually orthogonal and have the same span as X1, …, Xm.
Proof: We show how to construct the V1, …, Vm from the X1, …, Xm as follows: set V1 = X1 and, for each k < m,

Vk+1 = Xk+1 − ((Xk+1 ∙ V1)/(V1 ∙ V1)) V1 − ⋯ − ((Xk+1 ∙ Vk)/(Vk ∙ Vk)) Vk

We first show that the Vk are mutually orthogonal by induction on k. Assume that V1, …, Vk are mutually orthogonal. To show that V1, …, Vk+1 are mutually orthogonal, it is sufficient to show that Vk+1 ∙ Vi = 0 for all i where 1 ≤ i ≤ k. Using the induction hypothesis that Vj ∙ Vi = 0 for 1 ≤ j ≤ k and j ≠ i, and Vi ∙ Vi ≠ 0 (since Vi ≠ 0), we see that

Vk+1 ∙ Vi = Xk+1 ∙ Vi − ((Xk+1 ∙ Vi)/(Vi ∙ Vi)) (Vi ∙ Vi) = 0

This completes the proof that V1, …, Vm are mutually orthogonal. By Property 2, it follows that V1, …, Vm are also independent.

We next show that the span of V1, …, Vk is a subset of the span of X1, …, Xk for all k ≤ m. We assume the result is true for k and show that it is true for k + 1. Based on the induction hypothesis, it is sufficient to show that Vk+1 can be expressed as a linear combination of X1, …, Xk+1. This is true since by definition Vk+1 is Xk+1 minus a linear combination of V1, …, Vk, and by the induction hypothesis all the Vj can be expressed as a linear combination of the X1, …, Xk. By induction, we can now conclude that the span of V1, …, Vm is a subset of the span of X1, …, Xm, i.e. V1, …, Vm are elements in the span of X1, …, Xm. But since the V1, …, Vm are independent, by Property 3 of Linear Independent Vectors, we can conclude that the span of V1, …, Vm is equal to the span of X1, …, Xm.
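The construction in the proof can be carried out directly in code. The sketch below (illustrative Python/NumPy, not taken from the text; the function name gram_schmidt and the sample columns are choices made here) applies the formula for Vk+1 to the columns of a matrix and checks that the resulting vectors are mutually orthogonal:

```python
import numpy as np

def gram_schmidt(X):
    """Produce mutually orthogonal V1, ..., Vm with the same span as the
    (assumed independent) columns X1, ..., Xm of X."""
    V = []
    for k in range(X.shape[1]):
        v = X[:, k].astype(float)        # start from X_{k+1}
        for w in V:                      # subtract its projection onto each earlier V_j
            v = v - (np.dot(X[:, k], w) / np.dot(w, w)) * w
        V.append(v)
    return V

# Three independent columns in R^4 (a made-up example).
X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
V = gram_schmidt(X)
for i in range(len(V)):
    for j in range(i):
        assert abs(np.dot(V[i], V[j])) < 1e-10   # mutually orthogonal
```

Classical Gram-Schmidt written this way can lose orthogonality numerically when the columns are nearly dependent; that is a floating-point caveat only and does not affect the exact argument above.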
Corollary 1: For any closed set of vectors we can construct an orthogonal basis.

Proof: By Corollary 1 of Linear Independent Vectors, every closed set of vectors V has a basis. By Theorem 1, we can construct an orthogonal set of vectors that spans the same set. Since this orthogonal set of vectors is independent, it is a basis for V.
Definition 1: A set of vectors is orthonormal if the vectors are mutually orthogonal and each vector is a unit vector.

Corollary 2: For any closed set of vectors we can construct an orthonormal basis.

Proof: If V1, …, Vm is the orthogonal basis from Corollary 1, then Q1, …, Qm is an orthonormal basis where Qi = Vi / √(Vi ∙ Vi) for each i.
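To illustrate Corollary 2 (again an illustrative Python sketch; the orthogonal vectors below are a made-up example, such as might come out of the gram_schmidt sketch above), normalizing each Vi gives an orthonormal set:

```python
import numpy as np

# A made-up set of mutually orthogonal vectors; Qi = Vi / sqrt(Vi . Vi)
# turns each one into a unit vector.
V = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, -1.0, 0.0]),
     np.array([0.0, 0.0, 2.0])]
Q = [v / np.sqrt(np.dot(v, v)) for v in V]
for i, qi in enumerate(Q):
    for j, qj in enumerate(Q):
        expected = 1.0 if i == j else 0.0
        assert np.isclose(np.dot(qi, qj), expected)   # Q1, Q2, Q3 are orthonormal
```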
Observation: The following is an alternative way of constructing Q1, …, Qm (which yields the same result): define the V1, …, Vm and Q1, …, Qm together from the X1, …, Xm, normalizing each Vk as soon as it is produced rather than normalizing all of the Vk at the end.
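As a sanity check of that observation, the sketch below compares the two orderings of the work on a small example. It assumes the normalize-as-you-go reading described above (a common variant, not necessarily the exact construction intended here; the function names and sample data are choices made for the sketch) and verifies that it produces the same Q1, …, Qm as normalizing at the end:

```python
import numpy as np

def gram_schmidt_stepwise(X):
    """Normalize-as-you-go variant (one assumed reading of the alternative
    construction): each Q_k is formed as soon as V_k is available."""
    Q = []
    for k in range(X.shape[1]):
        v = X[:, k].astype(float)
        for q in Q:
            v = v - np.dot(X[:, k], q) * q   # projection onto the unit vector q
        Q.append(v / np.sqrt(np.dot(v, v)))
    return Q

def gram_schmidt_then_normalize(X):
    """Orthogonalize first (Theorem 1), then normalize at the end (Corollary 2)."""
    V = []
    for k in range(X.shape[1]):
        v = X[:, k].astype(float)
        for w in V:
            v = v - (np.dot(X[:, k], w) / np.dot(w, w)) * w
        V.append(v)
    return [v / np.sqrt(np.dot(v, v)) for v in V]

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
Q1 = gram_schmidt_stepwise(X)
Q2 = gram_schmidt_then_normalize(X)
assert all(np.allclose(a, b) for a, b in zip(Q1, Q2))   # both give the same Q's
```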
Definition 2: A matrix A is orthogonal if A^T A = I.

Observation: The following property is an obvious consequence of this definition.

Property 4: A matrix is orthogonal if and only if all of its columns are orthonormal.

Property 5: If A is an m × n orthogonal matrix and B is an n × p orthogonal matrix, then AB is orthogonal.
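Finally, a short numeric check of Definition 2 and Properties 4 and 5 (an illustrative Python sketch; the helper name is_orthogonal and the matrices A and B are made-up examples with orthonormal columns):

```python
import numpy as np

def is_orthogonal(M, tol=1e-10):
    """Definition 2: M is orthogonal when M^T M = I, i.e. its columns are orthonormal."""
    return np.allclose(M.T @ M, np.eye(M.shape[1]), atol=tol)

# A: a 3 x 2 matrix with orthonormal columns; B: a 2 x 2 rotation matrix.
A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [0.0,  0.0]]) / np.sqrt(2.0)
theta = 0.3
B = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert is_orthogonal(A)       # Property 4: orthonormal columns
assert is_orthogonal(B)
assert is_orthogonal(A @ B)   # Property 5: the product of orthogonal matrices is orthogonal
```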