Cool Blog

Typst Template

# Section 1

Hello Typst. $2 + 2 = 4$

$$\int_0^5 x \, dx$$

# Display

$\theta$, $\varphi$, $A$, $B$, $C$

# Linear maps

$T : U \longrightarrow V$, also called a homomorphism. $U : V \longrightarrow V$ is a linear operator or endomorphism of $V$. A linear transformation represents a specific geometric transformation, while the matrix representation is a set of coefficients that describes it in a particular basis.

$$T : \mathbb{R}^3 \longrightarrow \mathbb{R}^2, \qquad T\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 2x_1 + x_3 \\ x_1 - x_2 + x_3 \end{bmatrix}$$

Is $T$ linear? Let $x, y \in \mathbb{R}^3$:

$$T(x) + T(y) = \begin{bmatrix} 2x_1 + x_3 \\ x_1 - x_2 + x_3 \end{bmatrix} + \begin{bmatrix} 2y_1 + y_3 \\ y_1 - y_2 + y_3 \end{bmatrix} = \begin{bmatrix} 2(x_1 + y_1) + (x_3 + y_3) \\ (x_1 + y_1) - (x_2 + y_2) + (x_3 + y_3) \end{bmatrix} = T(x + y)$$

It must also hold that $T(0) = 0$.

$$R(T) = \left\{ \begin{bmatrix} 2x_1 + x_3 \\ x_1 - x_2 + x_3 \end{bmatrix} : x_1, x_2, x_3 \in \mathbb{R} \right\} = \left\{ x_1 \begin{bmatrix} 2 \\ 1 \end{bmatrix} + x_2 \begin{bmatrix} 0 \\ -1 \end{bmatrix} + x_3 \begin{bmatrix} 1 \\ 1 \end{bmatrix} : x_1, x_2, x_3 \in \mathbb{R} \right\} = \operatorname{Span}\left\{ \begin{bmatrix} 2 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ -1 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\} = \mathbb{R}^2$$

In general, $R(T) = \operatorname{Span}(T(b_1), T(b_2), \ldots, T(b_n))$ for a basis $\{b_1, \ldots, b_n\}$ of the domain.

If $T : U \longrightarrow V$ is a linear map from a finite-dimensional vector space $U$, then

$$\dim N(T) + \dim R(T) = \dim U$$

Rank of $T$ = $\dim R(T) = \dim \operatorname{col} T$.

# Subspace

To prove that a subset is a subspace, show that it contains $0$ and is closed under addition and scalar multiplication.

# Bases

A subset $\mathfrak{B}$ of a vector space $V$ is a basis if every $v \in V$ can be uniquely expressed as a linear combination of finitely many elements of $\mathfrak{B}$.

# Matrix representation of a linear map

We have a finite-dimensional vector space $V$ over $F$ with basis $\beta = \{v_1, \ldots, v_n\}$. For example, in $P_2(\mathbb{R})$ we have bases $\beta_1 = \{1, x, x^2\}$, $\beta_2 = \{x^2, x, 1\}$, $\beta_3 = \{1, 1 + x, 1 + x + x^2\}$. Let $p(x) = 3x^2 + 5x + 4$. For $\beta_1$ we have $[p(x)]_{\beta_1} = \begin{bmatrix} 4 \\ 5 \\ 3 \end{bmatrix}$, for $\beta_2$ we have $[p(x)]_{\beta_2} = \begin{bmatrix} 3 \\ 5 \\ 4 \end{bmatrix}$, and for $\beta_3$ we have $[p(x)]_{\beta_3} = \begin{bmatrix} -1 \\ 2 \\ 3 \end{bmatrix}$.

$T : V \longrightarrow F^n$ s.t. $T(v) = [v]_\beta$. $T$ is a linear map converting a vector to its coordinates under basis $\beta$.

# Matrix vector product

A matrix–vector product maps the coordinates of a vector under a (finite) basis to the coordinates of the vector under another (finite) basis. Every linear map between finite-dimensional vector spaces has a unique matrix representation for every choice of bases in the domain and codomain.

Example: for $T : U \longrightarrow V$ over $F$, we choose $\beta$ and $\gamma$ as respective bases.

$$x \longmapsto T(x), \qquad [x]_\beta \xrightarrow{\;A\;} [T(x)]_\gamma$$

The transformation matrix from $\beta$ to $\gamma$ is defined as $A = [T]_\beta^\gamma$ (or $[T]_{\gamma \leftarrow \beta}$). With $\beta = \{u_1, \ldots, u_n\}$ this expands to

$$\left[\, [T(u_1)]_\gamma \;\; [T(u_2)]_\gamma \;\; \ldots \;\; [T(u_n)]_\gamma \,\right]$$

In summary, transform each basis vector of $\beta$ and express it in $\gamma$.

Exercise from the book: construct a matrix that mirrors vectors about the line $y = 2x$. We have $T : \mathbb{R}^2 \longrightarrow \mathbb{R}^2$. Since $T$ is linear, $\exists A : T(x) = Ax$. We know $A = [T]_{\beta_1}^{\beta_1} = [T]_{\beta_1}$, where $\beta_1$ is the standard basis. Note that $b_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$ satisfies $T(b_1) = b_1$, and $b_2 = \begin{bmatrix} 2 \\ -1 \end{bmatrix}$ satisfies $T(b_2) = -b_2$. Since they are independent in $\mathbb{R}^2$, they can be taken as a basis $\beta$. Now we have $[T]_\beta^\beta = \left[\, [T(b_1)]_\beta \;\; [T(b_2)]_\beta \,\right] = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$. If you want to transform $x$ given in the standard basis, you need to convert it to the basis $\beta$, apply the matrix, and convert back to the standard basis (a numerical sketch of this construction follows the Invertibility section below). You can also construct the matrix $[T]_\beta^{\beta_1} = \begin{bmatrix} 1 & -2 \\ 2 & 1 \end{bmatrix}$.

# Matrix algebra

For linear maps $T_n : U \longrightarrow V$:

$$(T_1 + T_2)(x) = T_1(x) + T_2(x), \qquad a \in F : (aT_1)(x) = aT_1(x)$$

The set of linear maps $U \longrightarrow V$ is therefore itself a vector space. For $T_1 : U \longrightarrow V$ and $T_2 : V \longrightarrow W$, composition maps $x \longmapsto (T_2 \circ T_1)(x)$. With $B = [T_1]_\alpha^\beta$ and $A = [T_2]_\beta^\gamma$ we get $C = [T_2 \circ T_1]_\alpha^\gamma = AB$.

# Invertibility

For a linear map $T : U \rightarrow V$ we know:

$$T \text{ is injective} \Longleftrightarrow N(T) = \{0\}, \qquad T \text{ is surjective} \Longleftrightarrow R(T) = V$$

$$(T^{-1})^{-1} = T, \qquad (T_2 \circ T_1)^{-1} = T_1^{-1} \circ T_2^{-1}$$

Isomorphism: an invertible linear map between isomorphic vector spaces. This isomorphism can be represented by an invertible square matrix $A$ s.t. there exists a unique matrix $B$ where $BA = I$ and $AB = I$. Note that $(AB)^{-1} = B^{-1}A^{-1}$.

The following statements are equivalent:

- $A$ is invertible
- The columns of $A$ are linearly independent
- The columns of $A$ span $F^n$
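As a quick numerical sanity check of the mirror-matrix exercise and the invertibility facts above, here is a small sketch. It assumes NumPy; the variable names are my own and not part of the notes.

```python
import numpy as np

# Eigenbasis for the reflection about the line y = 2x:
# b1 lies on the line (T(b1) = b1), b2 is orthogonal to it (T(b2) = -b2).
Q = np.array([[1.0, 2.0],
              [2.0, -1.0]])        # columns are b1 and b2
D = np.diag([1.0, -1.0])           # [T]_beta in the eigenbasis

# Change of basis back to the standard basis: A = Q D Q^{-1}
A = Q @ D @ np.linalg.inv(Q)
print(A)                                 # [[-0.6  0.8]
                                         #  [ 0.8  0.6]]

# T is an isomorphism of R^2: the columns of A are independent (full rank),
# and a reflection is its own inverse, so A @ A = I.
print(np.linalg.matrix_rank(A))          # 2
print(np.allclose(A @ A, np.eye(2)))     # True
```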
# Determinant

Read §4.5. A determinant is a scalar computed from a square matrix. A square matrix is invertible if its columns are independent, equivalently if its determinant is not 0. For the matrix $\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$, if $a_{11} \neq 0$ row reduction gives $\begin{bmatrix} a_{11} & a_{12} \\ 0 & a_{11}a_{22} - a_{21}a_{12} \end{bmatrix}$, so if $\det(A) \neq 0$ then $A$ is invertible.

$$\det(AB) = \det A \det B, \qquad \det A = \det A^T, \qquad \det(AA^{-1}) = 1$$

$$A^{-1} = \frac{1}{\det A} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}$$

Column linearity: $\det[\, c\boldsymbol{a} + d\boldsymbol{b} \;\; \boldsymbol{c} \,] = c \det[\, \boldsymbol{a} \;\; \boldsymbol{c} \,] + d \det[\, \boldsymbol{b} \;\; \boldsymbol{c} \,]$

# Rank

The following properties hold for $n \times n$ matrices $A, B$:

- $\operatorname{rank}(A) = \operatorname{rank}(A^T)$
- $\operatorname{rank}(AB) \leq \operatorname{rank}(A)$
- $\operatorname{rank}(AB) \leq \operatorname{rank}(B)$
- $A$ is invertible $\Longrightarrow$ multiplication by $A$ preserves rank
- An $n \times n$ matrix $A$ is invertible $\Longleftrightarrow \operatorname{rank}(A) = n$

# Diagonalization

A linear operator $T$ on a vector space $V$ is diagonalizable if there is an ordered basis $\beta$ for $V$ such that $[T]_\beta$ is a diagonal matrix. If we let $\beta = \{v_j\}$ be such a basis, with $D = [T]_\beta$ diagonal, then

$$T(v_j) = \Phi_\beta^{-1}(L_D(\Phi_\beta(v_j))) = D_{jj} v_j = \lambda_j v_j$$

The basis vectors are eigenvectors of $T$, and $\lambda_j$ is the eigenvalue corresponding to $v_j$. All non-zero elements of $\ker(T)$ are eigenvectors with eigenvalue 0. Eigenvalues are invariant under coordinate transformation. Distinct eigenvalues $\Rightarrow$ the eigenvectors are linearly independent $\Rightarrow$ $T$ is diagonalizable.

# Polynomial splitting

$f(t) = (t^2 + 1)(t - 2)$ only splits over $\mathbb{C}$, so over $\mathbb{R}$ the only eigenvalue is 2. If a matrix is diagonalizable, its characteristic polynomial splits.

# Eigen

If we have a linear operator $A$ and a vector $v \neq 0$ s.t. $Av = \lambda v$, then $v$ is an eigenvector of $A$ with eigenvalue $\lambda$. A vector $v \in V$ with $v \neq 0$ is an eigenvector of $T$ corresponding to $\lambda$ iff $v \in N(T - \lambda I)$.

# Orthonormality

We have the space $H$ of continuous complex-valued functions over $[0, 2\pi]$. Let $f_n(t) = e^{int}$. We want to check whether $S = \{f_n : n \text{ is an integer}\}$ is orthonormal:

$$\langle f_m, f_n \rangle = \frac{1}{2\pi} \int_0^{2\pi} e^{imt}\,\overline{e^{int}} \, dt = \frac{1}{2\pi} \int_0^{2\pi} e^{i(m-n)t} \, dt = \left[ \frac{1}{2\pi i(m-n)} e^{i(m-n)t} \right]_0^{2\pi} = \delta_{mn}$$

(For $m \neq n$ the bracket evaluates to 0; for $m = n$ the integrand is 1, so the integral is 1.)

For orthogonal bases we get the components of a vector by

$$\mathbf{v} = \sum_i \frac{\langle \mathbf{v}, e_i \rangle}{\|e_i\|^2}\, e_i$$

Proof (with $v = a_i e_i$, summation implied): $\langle v, e_j \rangle = a_i \langle e_i, e_j \rangle = a_i \delta_{ij} \|e_j\|^2 = a_j \|e_j\|^2$.

# Example

Let $x = (1, 2, 4, 7) \in \mathbb{F}^4$ and basis

$$\beta = \left\{ \left(\tfrac{1}{2}, \tfrac{1}{2}, \tfrac{1}{2}, \tfrac{1}{2}\right),\; \left(\tfrac{1}{2}, \tfrac{1}{2}, -\tfrac{1}{2}, -\tfrac{1}{2}\right),\; \left(\tfrac{1}{2}, -\tfrac{1}{2}, -\tfrac{1}{2}, \tfrac{1}{2}\right),\; \left(-\tfrac{1}{2}, \tfrac{1}{2}, -\tfrac{1}{2}, \tfrac{1}{2}\right) \right\}$$

$$x = a_i \beta_i = \langle x, \beta_i \rangle \beta_i = 7\beta_1 - 4\beta_2 + \beta_3 + 2\beta_4, \qquad [x]_\beta = (7, -4, 1, 2)$$

# Gram-Schmidt

Suppose $v_1, \ldots, v_m$ is a linearly independent list of vectors in $V$. Let $f_1 = v_1$. For $k = 2, \ldots, m$ define

$$f_k = v_k - \frac{\langle v_k, f_1 \rangle}{\|f_1\|^2} f_1 - \ldots - \frac{\langle v_k, f_{k-1} \rangle}{\|f_{k-1}\|^2} f_{k-1}$$

The orthonormal basis is $e_i = \frac{f_i}{\|f_i\|}$.

# Legendre polynomials

Let $V = P_2(\mathbb{R})$ with inner product $\langle f(x), g(x) \rangle = \int_{-1}^{1} f(t)\,g(t) \, dt$. The Gram-Schmidt procedure can be used to compute an orthogonal basis. Let the standard basis be $e = \{1, x, x^2\}$. It is easy to see that this basis is not orthonormal. We find the orthonormal basis by applying Gram-Schmidt on $v_i = e_i$:

$$f_1 = v_1 = 1, \qquad \|f_1\|^2 = \int_{-1}^{1} 1 \, dt = 2$$
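Here is a small sketch of the Gram-Schmidt procedure applied to $\{1, x, x^2\}$ with this inner product. It assumes NumPy's polynomial class; the helper names are illustrative and not part of the notes.

```python
import numpy as np
from numpy.polynomial import Polynomial as P

def inner(f, g):
    """<f, g> = integral over [-1, 1] of f(t) g(t) dt."""
    h = (f * g).integ()            # antiderivative of the product
    return h(1.0) - h(-1.0)

def gram_schmidt(vs):
    """Return an orthogonal list f_1, ..., f_m from independent v_1, ..., v_m."""
    fs = []
    for v in vs:
        f = v
        for g in fs:
            f = f - (inner(v, g) / inner(g, g)) * g
        fs.append(f)
    return fs

# Standard basis {1, x, x^2} of P_2(R)
v1, v2, v3 = P([1]), P([0, 1]), P([0, 0, 1])
f1, f2, f3 = gram_schmidt([v1, v2, v3])

print(inner(f1, f1))   # 2.0, matching ||f_1||^2 = 2 above
print(f2)              # x, already orthogonal to 1 on [-1, 1]
print(f3)              # x^2 - 1/3, a scaled Legendre polynomial
```

Normalizing each $f_i$ by $\|f_i\|$ then gives the orthonormal basis.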
# Random definitions

Normal: a matrix or operator $T$ is normal if $TT^* = T^*T$.

Self-adjoint (Hermitian): $T^* = T$.

Unitary/orthogonal: if $\|T(x)\| = \|x\|$ for all $x$, then $T$ is a unitary operator when $\mathbb{F} = \mathbb{C}$ and an orthogonal operator when $\mathbb{F} = \mathbb{R}$. Equivalently, $TT^* = I$.

Unitarily equivalent: $D = Q^{-1}AQ$, where the columns of $Q$ form an orthonormal basis.

Orthogonal matrix: $A^* = A^{-1}$.

Let $T$ be a linear operator on a finite-dimensional inner product space $V$. If $\|T(\beta)\| = \|\beta\|$ for all $\beta$ in some orthonormal basis for $V$, $T$ is not necessarily unitary. Proof: assume a Euclidean norm. $\|T(x)\| = \ldots$

# Adjoint problem

(Note: the adjoint is denoted by $*$ for an operator and $\dagger$ for a matrix. Abstract index notation is used for matrices.)

Disprove: for every linear operator $T$ on $V$ and every ordered basis $\beta$ for $V$, we have $[T]_\beta^\dagger = [T^*]_\beta$.

Let $T(\beta_j) = A_{kj}\beta_k$, $T^*(\beta_i) = B_{ki}\beta_k$, and $G_{ij} = \langle \beta_i, \beta_j \rangle$. We aim to disprove $A^\dagger = B$.

$$\langle T(\beta_j), \beta_i \rangle = \langle A_{kj}\beta_k, \beta_i \rangle = A_{kj}\langle \beta_k, \beta_i \rangle = (A^\dagger)_{jk} G_{ki} = (GA^\dagger)_{ji}$$

$$\langle T(\beta_j), \beta_i \rangle = \langle \beta_j, T^*(\beta_i) \rangle = \langle \beta_j, B_{ki}\beta_k \rangle = B_{ki} G_{jk} = (BG)_{ji}$$

Hence $GA^\dagger = BG$, so $A^\dagger = G^{-1}BG$ and $GA^\dagger G^{-1} = B$.

For an orthonormal basis $e$, $G_{ij} = \langle e_i, e_j \rangle = \delta_{ij}$, i.e. $G = I$, so $A^\dagger = G^{-1}BG = B$, or $[T]_\beta^\dagger = [T^*]_\beta$, and the statement holds.

For an orthogonal basis $\gamma$, $G_{ij} = \delta_{ij}\langle \gamma_i, \gamma_j \rangle$ is diagonal. Write $P := G^{-1}$; then $(G^{-1}BG)_{ij} = (PBG)_{ij} = P_{jj}B_{ij}G_{ii} = P_{jj}G_{ii}B_{ij}$. This equals $B_{ij}$ iff $B_{ij} = 0$ or $P_{jj}G_{ii} = 1 \Rightarrow \frac{\langle \gamma_i, \gamma_i \rangle}{\langle \gamma_j, \gamma_j \rangle} = 1 \Rightarrow \langle \gamma_i, \gamma_i \rangle = \langle \gamma_j, \gamma_j \rangle$. This means the statement also holds for orthogonal bases if $B_{ij} \neq 0 \Rightarrow \langle \gamma_i, \gamma_i \rangle = \langle \gamma_j, \gamma_j \rangle$.

# Counterexample

Let basis $\beta = \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}$ on $V = \mathbb{R}^2$. We have

$$G = \begin{bmatrix} \langle \beta_1, \beta_1 \rangle & \langle \beta_1, \beta_2 \rangle \\ \langle \beta_2, \beta_1 \rangle & \langle \beta_2, \beta_2 \rangle \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}, \qquad G^{-1} = \begin{bmatrix} 2 & -1 \\ -1 & 1 \end{bmatrix}$$

We find an operator $T$ such that $A = [T]_\beta$ and $GA^\dagger G^{-1} \neq A^\dagger$. Let $T(v_1\beta_1 + v_2\beta_2) = v_2\beta_1 + v_1\beta_2$. Then $A = A^\dagger = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$ and

$$B = GA^\dagger G^{-1} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 2 & -1 \\ -1 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 3 & -1 \end{bmatrix} \neq A^\dagger \qquad \blacksquare$$
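To double-check the arithmetic in this counterexample, here is a small sketch that reproduces the matrix products above. It assumes NumPy; the variable names are my own.

```python
import numpy as np

# Basis beta = {[1, 0], [1, 1]} on R^2 with the standard inner product.
b1 = np.array([1.0, 0.0])
b2 = np.array([1.0, 1.0])

# Gram matrix G_ij = <beta_i, beta_j> and its inverse.
G = np.array([[b1 @ b1, b1 @ b2],
              [b2 @ b1, b2 @ b2]])        # [[1, 1], [1, 2]]
G_inv = np.linalg.inv(G)                  # [[2, -1], [-1, 1]]

# A = [T]_beta for T(v1 b1 + v2 b2) = v2 b1 + v1 b2 (swap the coordinates).
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
A_dag = A.conj().T                        # equals A here

# B = G A^dagger G^{-1}, following the relation derived above.
B = G @ A_dag @ G_inv
print(B)                                  # [[1, 0], [3, -1]]
print(np.allclose(B, A_dag))              # False, matching the disproof
```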