#
Section 1
Hello Typst $2+2=4$ $\int_0^5 x \, dx$
#
Display
#
Linear maps
$T: V \to W$. Also called a homomorphism. $T: V \to V$ is a linear operator or endomorphism of $V$. A linear transformation represents a specific geometric transformation, while the matrix representation is a set of coefficients that describes it in a particular basis.
$T: \mathbb{R}^3 \to \mathbb{R}^2$
$T \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 2x_1 + x_3 \\ x_1 - x_2 + x_3 \end{bmatrix}$
Is $T$ linear? Let $x, y \in \mathbb{R}^3$.
$T(x) + T(y) = \begin{bmatrix} 2x_1 + x_3 \\ x_1 - x_2 + x_3 \end{bmatrix} + \begin{bmatrix} 2y_1 + y_3 \\ y_1 - y_2 + y_3 \end{bmatrix} = \begin{bmatrix} 2(x_1 + y_1) + (x_3 + y_3) \\ (x_1 + y_1) - (x_2 + y_2) + (x_3 + y_3) \end{bmatrix} = T(x + y)$
Homogeneity $T(cx) = cT(x)$ is checked the same way; in particular it must hold that $T(0) = 0$.
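The linearity conditions above can be spot-checked numerically; a minimal sketch in Python (sample vectors, not a proof):

```python
# Numerically spot-check that T(x) = (2x1 + x3, x1 - x2 + x3) is linear.
def T(x):
    x1, x2, x3 = x
    return (2*x1 + x3, x1 - x2 + x3)

x, y, c = (1.0, 2.0, 3.0), (-4.0, 0.5, 2.0), 7.0

# Additivity: T(x) + T(y) == T(x + y)
lhs = tuple(a + b for a, b in zip(T(x), T(y)))
rhs = T(tuple(a + b for a, b in zip(x, y)))
assert lhs == rhs

# Homogeneity: T(cx) == c T(x); in particular T(0) = 0.
assert T(tuple(c * a for a in x)) == tuple(c * t for t in T(x))
assert T((0, 0, 0)) == (0, 0)
```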
$R(T) = \left\{ \begin{bmatrix} 2x_1 + x_3 \\ x_1 - x_2 + x_3 \end{bmatrix} : x_1, x_2, x_3 \in \mathbb{R} \right\} = \left\{ x_1 \begin{bmatrix} 2 \\ 1 \end{bmatrix} + x_2 \begin{bmatrix} 0 \\ -1 \end{bmatrix} + x_3 \begin{bmatrix} 1 \\ 1 \end{bmatrix} : x_1, x_2, x_3 \in \mathbb{R} \right\} = \operatorname{Span}\left\{ \begin{bmatrix} 2 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ -1 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\} = \mathbb{R}^2$
$R(T) = \operatorname{Span}(T(b_1), T(b_2), \ldots, T(b_n))$ where $\{b_1, \ldots, b_n\}$ is a basis of the domain.
If $T: V \to W$ is a linear map from a finite-dimensional vector space $V$, then $\dim R(T) + \dim N(T) = \dim V$.
Rank of $T$ $= \dim R(T) = \dim \operatorname{col} A$.
#
Subspace
To prove that a subset is a subspace, show that it contains $0$ and is closed under addition and scalar multiplication.
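A sketch of these criteria in Python, using the line $y = 2x$ as an example subset (sampled elements, not a full proof):

```python
# Spot-check the subspace criteria on W = {(x, 2x)} in R^2.
in_W = lambda v: v[1] == 2 * v[0]

assert in_W((0, 0))                      # contains the zero vector
u, v, c = (1, 2), (-3, -6), 5
assert in_W((u[0] + v[0], u[1] + v[1]))  # closed under addition
assert in_W((c * u[0], c * u[1]))        # closed under scalar multiplication

# The line y = x + 1 fails already at the zero vector: not a subspace.
in_L = lambda v: v[1] == v[0] + 1
assert not in_L((0, 0))
```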
#
Bases
A subset $B$ of a vector space $V$ is a basis if every $v \in V$ can be uniquely expressed as a linear combination of finitely many elements of $B$.
#
Matrix representation of a linear map
We have a finite-dimensional vector space $V$ over $F$ with basis $\beta = \{v_1, \ldots, v_n\}$.
For example, in $P_2(\mathbb{R})$ we have bases $\beta_1 = \{1, x, x^2\}$, $\beta_2 = \{x^2, x, 1\}$, $\beta_3 = \{1, 1 + x, 1 + x + x^2\}$.
Let $p(x) = 3x^2 + 5x + 4$. For $\beta_1$ we have $[p(x)]_{\beta_1} = \begin{bmatrix} 4 \\ 5 \\ 3 \end{bmatrix}$, for $\beta_2$ we have $[p(x)]_{\beta_2} = \begin{bmatrix} 3 \\ 5 \\ 4 \end{bmatrix}$, for $\beta_3$ we have $[p(x)]_{\beta_3} = \begin{bmatrix} -1 \\ 2 \\ 3 \end{bmatrix}$.
$T: V \to F^n$ s.t. $T(v) = [v]_\beta$. $T$ is a linear map converting a vector to its coordinates under basis $\beta$.
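The $\beta_3$ coordinates can be verified by expanding the linear combination; a quick Python check (polynomials as coefficient tuples):

```python
# Verify the coordinates of p(x) = 3x^2 + 5x + 4 under beta_3 = {1, 1+x, 1+x+x^2}.
# Polynomials are coefficient tuples (c0, c1, c2) for c0 + c1*x + c2*x^2.
beta3 = [(1, 0, 0), (1, 1, 0), (1, 1, 1)]
coords = (-1, 2, 3)  # [p]_beta3

p = tuple(sum(c * b[i] for c, b in zip(coords, beta3)) for i in range(3))
assert p == (4, 5, 3)  # p(x) = 4 + 5x + 3x^2
```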
#
Matrix vector product
Multiplication by the matrix of a linear map sends the coordinates of a vector under a (finite) basis of the domain to the coordinates of its image under a (finite) basis of the codomain. Every linear map between finite-dimensional vector spaces has a unique matrix representation for every choice of basis in the domain and codomain.
Example: for $T: U \to V$ over $F$, we choose $\beta = \{u_1, \ldots, u_n\}$ and $\gamma$ as respective bases.
$x \longmapsto T(x)$
$[x]_\beta \overset{A}{\longmapsto} [T(x)]_\gamma$
The transformation matrix from $\beta$ to $\gamma$ is defined as $A = [T]^\gamma_\beta$ (or $[T]_{\gamma \leftarrow \beta}$). This expands to:
$\begin{bmatrix} [T(u_1)]_\gamma & [T(u_2)]_\gamma & \ldots & [T(u_n)]_\gamma \end{bmatrix}$
In summary, transform each basis vector of $\beta$ and express it in $\gamma$.
Exercise from book: construct a matrix that mirrors vectors across the line $y = 2x$. We have $T: \mathbb{R}^2 \to \mathbb{R}^2$. Since $T$ is linear, $\exists A: T(x) = Ax$, namely $A = [T]^{\beta_1}_{\beta_1}$ for the standard basis $\beta_1$. Note that $b_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$ satisfies $T(b_1) = b_1$, and $b_2 = \begin{bmatrix} 2 \\ -1 \end{bmatrix}$ satisfies $T(b_2) = -b_2$. Since they are independent in $\mathbb{R}^2$, they can be taken as a basis $\beta$. Now we have $[T]^\beta_\beta = \begin{bmatrix} [T(b_1)]_\beta & [T(b_2)]_\beta \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$. If you want to transform $x$ given in the standard basis, you need to convert it to the basis $\beta$, apply the matrix, and convert back to the standard basis. You can also construct the matrix $[T]^{\beta_1}_\beta = \begin{bmatrix} 1 & -2 \\ 2 & 1 \end{bmatrix}$.
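The exercise can be verified numerically; a sketch that conjugates $[T]_\beta$ by the change-of-basis matrix $P = [b_1 \; b_2]$, using exact arithmetic via `fractions`:

```python
from fractions import Fraction

# Reflection across y = 2x in the standard basis: conjugate [T]_beta by
# P = [b1 b2], whose columns convert beta-coordinates to standard coordinates.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = [[1, 2], [2, -1]]       # columns b1 = (1, 2), b2 = (2, -1)
T_beta = [[1, 0], [0, -1]]  # b1 is kept, b2 is flipped
det = Fraction(P[0][0] * P[1][1] - P[0][1] * P[1][0])
P_inv = [[P[1][1] / det, -P[0][1] / det], [-P[1][0] / det, P[0][0] / det]]

T_std = matmul(matmul(P, T_beta), P_inv)
assert T_std == [[Fraction(-3, 5), Fraction(4, 5)],
                 [Fraction(4, 5), Fraction(3, 5)]]
# Sanity check: (1, 2) is fixed by the reflection.
assert [r[0] * 1 + r[1] * 2 for r in T_std] == [1, 2]
```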
#
Matrix algebra
$T_i: V \to W$
$(T_1 + T_2)(x) = T_1(x) + T_2(x)$
$c \in F: (cT_1)(x) = cT_1(x)$
The set of linear maps $V \to W$ is therefore itself a vector space.
$T_1: V \to W$ and $T_2: W \to X$, with bases $\alpha, \beta, \gamma$ of $V, W, X$:
$x \longmapsto (T_2 \circ T_1)(x)$
$B = [T_1]^\beta_\alpha$
$A = [T_2]^\gamma_\beta$
$C = [T_2 \circ T_1]^\gamma_\alpha = AB$
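A quick numerical check that composition corresponds to matrix multiplication, with two arbitrarily chosen $2 \times 2$ matrices:

```python
# Check [T2 o T1] = A B on R^2, with T1, T2 given by matrices B and A.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(M, v):
    return tuple(M[i][0] * v[0] + M[i][1] * v[1] for i in range(2))

B = [[1, 2], [3, 4]]   # B = [T1]
A = [[0, 1], [1, 1]]   # A = [T2]
C = matmul(A, B)       # C = [T2 o T1]

v = (5, -7)
assert apply(C, v) == apply(A, apply(B, v))
```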
#
Invertibility
For a linear map $T: V \to W$ we know:
$T$ is injective $\iff N(T) = \{0\}$
$T$ is surjective $\iff R(T) = W$
$(T^{-1})^{-1} = T$
$(T_2 \circ T_1)^{-1} = T_1^{-1} \circ T_2^{-1}$
Isomorphism: an invertible linear map; two vector spaces are isomorphic iff such a map exists between them.
This isomorphism can be represented by an invertible square matrix $A$, i.e. there exists a unique matrix $B$ with $BA = I$ and $AB = I$. Note that $(AB)^{-1} = B^{-1}A^{-1}$.
The following statements are equivalent:
• $A$ is invertible
• The columns of $A$ are linearly independent
• The columns of $A$ span $F^n$
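The identity $(AB)^{-1} = B^{-1}A^{-1}$ can be checked with exact arithmetic; a sketch with arbitrarily chosen invertible $2 \times 2$ matrices:

```python
from fractions import Fraction

# Check (AB)^{-1} = B^{-1} A^{-1} for 2x2 matrices with exact arithmetic.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv(M):
    d = Fraction(M[0][0] * M[1][1] - M[0][1] * M[1][0])
    return [[M[1][1] / d, -M[0][1] / d], [-M[1][0] / d, M[0][0] / d]]

A = [[2, 1], [5, 3]]
B = [[1, 4], [2, 9]]

assert inv(matmul(A, B)) == matmul(inv(B), inv(A))
assert matmul(A, inv(A)) == [[1, 0], [0, 1]]
```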
#
Determinant
Read §4.5
A determinant is a scalar computed from a square matrix. A square matrix is invertible iff its columns are independent, equivalently iff its determinant is not $0$. For a matrix $\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$ with $a_{11} \neq 0$, row reduction gives $\begin{bmatrix} a_{11} & a_{12} \\ 0 & a_{11}a_{22} - a_{21}a_{12} \end{bmatrix}$, so if $\det(A) \neq 0$ then $A$ is invertible.
$\det(AB) = \det A \det B$
$\det A = \det A^T$
$\det(A A^{-1}) = 1$
$A^{-1} = \frac{1}{\det A} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}$
Column linearity: $\det \begin{bmatrix} a u + b v & w \end{bmatrix} = a \det \begin{bmatrix} u & w \end{bmatrix} + b \det \begin{bmatrix} v & w \end{bmatrix}$
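These determinant identities can be spot-checked on $2 \times 2$ examples:

```python
from fractions import Fraction

# Check det(AB) = det(A)det(B), det(A) = det(A^T), and the 2x2 inverse formula.
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[3, 1], [4, 2]]
B = [[2, 5], [1, 7]]

assert det2(matmul(A, B)) == det2(A) * det2(B)
assert det2(A) == det2([[A[0][0], A[1][0]], [A[0][1], A[1][1]]])  # A^T

d = Fraction(det2(A))
A_inv = [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]
assert matmul(A, A_inv) == [[1, 0], [0, 1]]
```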
#
Rank
The following properties hold for matrices $A, B$ of compatible sizes:
• $\operatorname{rank}(A) = \operatorname{rank}(A^T)$
• $\operatorname{rank}(AB) \leq \operatorname{rank}(A)$
• $\operatorname{rank}(AB) \leq \operatorname{rank}(B)$
• $A$ is invertible $\implies$ multiplication by $A$ preserves rank
• An $n \times n$ matrix $A$ is invertible $\iff \operatorname{rank}(A) = n$
#
Diagonalization
A linear operator $T$ on a vector space $V$ is diagonalizable if there is an ordered basis $\beta$ for $V$ such that $[T]_\beta$ is a diagonal matrix.
Let $\beta = \{v_i\}$ be such that $[T]_\beta = D$ is diagonal, where $\Phi_\beta$ is the coordinate map and $L_D$ is left multiplication by $D$. Then $T(v_i) = \Phi_\beta^{-1}(L_D(\Phi_\beta(v_i))) = D_{ii} v_i = \lambda_i v_i$. The basis vectors are eigenvectors of $T$ and $\lambda_i$ is the eigenvalue corresponding to $v_i$. All non-zero elements of $\ker(T)$ are eigenvectors with eigenvalue $0$. Eigenvalues are invariant under coordinate transformation.
Distinct eigenvalues have linearly independent eigenvectors; if $T$ has $\dim V$ distinct eigenvalues, it is therefore diagonalizable.
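A concrete diagonalization, sketched in Python with the arbitrarily chosen symmetric matrix $A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$:

```python
from fractions import Fraction

# Diagonalize A = [[2, 1], [1, 2]]: eigenvalues 1 and 3 with eigenvectors
# (1, -1) and (1, 1), so A = P D P^{-1}.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [1, 2]]
P = [[1, 1], [-1, 1]]   # columns are the eigenvectors
D = [[1, 0], [0, 3]]
d = Fraction(P[0][0] * P[1][1] - P[0][1] * P[1][0])
P_inv = [[P[1][1] / d, -P[0][1] / d], [-P[1][0] / d, P[0][0] / d]]

assert matmul(matmul(P, D), P_inv) == A
# Each basis vector is an eigenvector: A v = lambda v.
assert [A[i][0] * 1 + A[i][1] * -1 for i in range(2)] == [1 * 1, 1 * -1]
assert [A[i][0] * 1 + A[i][1] * 1 for i in range(2)] == [3 * 1, 3 * 1]
```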
#
Polynomial splitting
$f(t) = (t^2 + 1)(t - 2)$ splits over $\mathbb{C}$ but not over $\mathbb{R}$, so over $\mathbb{R}$ the only eigenvalue is $2$. If a matrix is diagonalizable, its characteristic polynomial splits.
#
Eigen
If we have a linear operator $A$ and a vector $v \neq 0$ s.t. $Av = \lambda v$, then $v$ is an eigenvector of $A$ with eigenvalue $\lambda$.
A vector $v \in V$ with $v \neq 0$ is an eigenvector of $T$ corresponding to $\lambda$ iff $v \in N(T - \lambda I)$.
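Both points can be illustrated with the rotation matrix $\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$ (my example, not from the notes), whose characteristic polynomial $t^2 + 1$ does not split over $\mathbb{R}$:

```python
# The rotation matrix A = [[0, -1], [1, 0]] has characteristic polynomial
# t^2 + 1: no real eigenvalues. Over C the eigenvalues are +/- i,
# e.g. A v = i v for v = (1, -i).
A = [[0, -1], [1, 0]]
v = (1, -1j)

Av = (A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1])
assert Av == (1j * v[0], 1j * v[1])

# Check det(tI - A) = t^2 + 1 at sample points:
for t in (0, 1, 2, -3):
    assert (t - A[0][0]) * (t - A[1][1]) - A[0][1] * A[1][0] == t**2 + 1
```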
#
Orthonormality
We have the space $H$ of continuous complex-valued functions over $[0, 2\pi]$. Let $f_n(t) = e^{int}$. We want to check whether $S = \{f_n : n \text{ is an integer}\}$ is an orthonormal set.
$\langle f_n, f_m \rangle = \frac{1}{2\pi} \int_0^{2\pi} e^{int} \overline{e^{imt}} \, dt = \frac{1}{2\pi} \int_0^{2\pi} e^{i(n-m)t} \, dt = \left[ \frac{1}{2\pi i (n-m)} e^{i(n-m)t} \right]_0^{2\pi} = 0$ for $n \neq m$, while for $n = m$ the integrand is $1$; hence $\langle f_n, f_m \rangle = \delta_{nm}$.
For orthogonal bases $\{e_i\}$ we get the components of a vector from
$v = \sum_i \frac{\langle v, e_i \rangle}{\| e_i \|^2} e_i$
Proof: write $v = \sum_i c_i e_i$; then $\langle v, e_j \rangle = \sum_i c_i \langle e_i, e_j \rangle = \sum_i c_i \delta_{ij} \| e_j \|^2 = c_j \| e_j \|^2$.
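The orthonormality computation above can be spot-checked by numerical quadrature (an equal-spaced Riemann sum over a full period, which is highly accurate for periodic integrands):

```python
import cmath

# Numerically check <f_n, f_m> = (1/2pi) * integral of e^{i(n-m)t} = delta_nm.
def inner(n, m, N=2048):
    total = sum(cmath.exp(1j * (n - m) * (2 * cmath.pi * k / N))
                for k in range(N))
    return total / N

assert abs(inner(2, 2) - 1) < 1e-12
assert abs(inner(2, 3)) < 1e-12
assert abs(inner(-1, 4)) < 1e-12
```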
#
Example
Let $x = (1, 2, 4, 7) \in \mathbb{R}^4$ and basis $\beta = \left\{ \left(\tfrac12, \tfrac12, \tfrac12, \tfrac12\right), \left(\tfrac12, \tfrac12, -\tfrac12, -\tfrac12\right), \left(\tfrac12, -\tfrac12, -\tfrac12, \tfrac12\right), \left(-\tfrac12, \tfrac12, -\tfrac12, \tfrac12\right) \right\}$
$[x]_\beta$: $x = \sum_i c_i \beta_i = \sum_i \langle x, \beta_i \rangle \beta_i = 7\beta_1 - 4\beta_2 + \beta_3 + 2\beta_4$
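A quick check of the example, computing $c_i = \langle x, \beta_i \rangle$ and confirming the basis is orthonormal (exact arithmetic):

```python
from fractions import Fraction

# Coordinates of x = (1, 2, 4, 7) in the orthonormal basis via c_i = <x, b_i>.
half = Fraction(1, 2)
B = [( half,  half,  half,  half),
     ( half,  half, -half, -half),
     ( half, -half, -half,  half),
     (-half,  half, -half,  half)]
x = (1, 2, 4, 7)

coords = [sum(xi * bi for xi, bi in zip(x, b)) for b in B]
assert coords == [7, -4, 1, 2]

# The basis is orthonormal: <b_i, b_j> = delta_ij.
for i in range(4):
    for j in range(4):
        dot = sum(p * q for p, q in zip(B[i], B[j]))
        assert dot == (1 if i == j else 0)
```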
#
Gram-Schmidt
Suppose $v_1, \ldots, v_m$ is a linearly independent list of vectors in $V$. Let $w_1 = v_1$. For $j = 2, \ldots, m$ define $w_j$:
$w_j = v_j - \frac{\langle v_j, w_1 \rangle}{\| w_1 \|^2} w_1 - \ldots - \frac{\langle v_j, w_{j-1} \rangle}{\| w_{j-1} \|^2} w_{j-1}$
The orthonormal basis is $e_j = \frac{w_j}{\| w_j \|}$
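A direct implementation of the procedure, sketched for vectors in $\mathbb{R}^n$ with the standard dot product:

```python
from math import sqrt

# Gram-Schmidt for linearly independent vectors in R^n.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vs):
    ws = []
    for v in vs:
        w = list(v)
        for u in ws:  # subtract the projection onto each previous w
            c = dot(v, u) / dot(u, u)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        ws.append(w)
    return [[wi / sqrt(dot(w, w)) for wi in w] for w in ws]  # normalize

e = gram_schmidt([(1, 1, 0), (1, 0, 1)])
assert all(abs(dot(e[i], e[j]) - (i == j)) < 1e-12
           for i in range(2) for j in range(2))
```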
#
Legendre polynomials
Let $V = P_2(\mathbb{R})$ with inner product $\langle f(x), g(x) \rangle = \int_{-1}^{1} f(t) g(t) \, dt$. The Gram-Schmidt procedure can be used to compute an orthogonal basis.
Let the standard basis be $S = \{1, x, x^2\}$. It is trivial to see that this basis is not orthonormal. We find the orthonormal basis by applying Gram-Schmidt to $v_i = S_i$:
$w_1 = v_1 = 1$
$\| w_1 \|^2 = \int_{-1}^{1} 1 \, dt = 2$
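The remaining steps can be carried out with exact polynomial integration ($\int_{-1}^{1} t^n \, dt = \frac{2}{n+1}$ for even $n$, $0$ for odd $n$); a sketch yielding the orthogonal set $\{1, x, x^2 - \frac13\}$:

```python
from fractions import Fraction

# Continue Gram-Schmidt on {1, x, x^2} with <f, g> = integral of f*g on [-1, 1].
# Polynomials are coefficient lists [c0, c1, c2, ...].
def inner(f, g):
    prod = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            prod[i + j] += a * b
    # integral of t^n on [-1, 1] is 2/(n+1) for even n, 0 for odd n
    return sum(Fraction(2, n + 1) * c for n, c in enumerate(prod) if n % 2 == 0)

one, x, x2 = [1], [0, 1], [0, 0, 1]
w1 = one
assert inner(w1, w1) == 2

w2 = x  # <x, 1> = 0, so nothing is subtracted
assert inner(w2, w1) == 0

# w3 = x^2 - (<x^2, 1>/||1||^2) * 1 - (<x^2, x>/||x||^2) * x = x^2 - 1/3
c = inner(x2, w1) / inner(w1, w1)
assert c == Fraction(1, 3) and inner(x2, w2) == 0
w3 = [Fraction(-1, 3), 0, 1]
assert inner(w3, w1) == 0 and inner(w3, w2) == 0
```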
#
Random definitions
Normal: a matrix or operator $T$ is normal if $T^*T = TT^*$
Self-adjoint (Hermitian): $T^* = T$
If $\|T(x)\| = \|x\|$ for all $x$: when $F = \mathbb{C}$, $T$ is a unitary operator; when $F = \mathbb{R}$, $T$ is an orthogonal operator. $TT^* = I$
Unitarily equivalent: $D = P^{-1} A P$, where the columns of $P$ are an orthonormal basis.
Orthogonal matrix: $A^T = A^{-1}$
Let $T$ be a linear operator on a finite-dimensional inner product space $V$. If $\|T(\beta)\| = \|\beta\|$ for all $\beta$ in some orthonormal basis for $V$, $T$ is not necessarily unitary.
Proof: Assume a Euclidean norm. βπ(π₯)β=
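One way to see the claim is a counterexample (my construction, not from the notes): an operator with $\|T(b_i)\| = \|b_i\|$ on an orthonormal basis that still fails to preserve norms:

```python
from math import sqrt, isclose

# Counterexample: on R^2 let T(e1) = e1 and T(e2) = e1, i.e. A = [[1, 1], [0, 0]].
# Then ||T(b)|| = ||b|| on the orthonormal basis {e1, e2}, yet T does not
# preserve all norms, hence is not orthogonal/unitary.
def apply(A, v):
    return tuple(A[i][0] * v[0] + A[i][1] * v[1] for i in range(2))

def norm(v):
    return sqrt(v[0] ** 2 + v[1] ** 2)

A = [[1, 1], [0, 0]]
e1, e2 = (1, 0), (0, 1)
assert norm(apply(A, e1)) == norm(e1) == 1
assert norm(apply(A, e2)) == norm(e2) == 1

v = (1, 1)
assert isclose(norm(apply(A, v)), 2)  # ||T(v)|| = 2
assert isclose(norm(v), sqrt(2))      # but ||v|| = sqrt(2)
```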
#
Adjoint problem
(note: the adjoint is denoted by $*$ for an operator and $\dagger$ for a matrix; abstract index notation is used for matrices)
Disprove: for every linear operator $T$ on $V$ and every ordered basis $\beta$ for $V$, we have $[T]^\dagger_\beta = [T^*]_\beta$.
Let $T(\beta_i) = A_{ji} \beta_j$, $T^*(\beta_i) = B_{ji} \beta_j$, and $G_{ij} = \langle \beta_i, \beta_j \rangle$ (real scalars assumed). We aim to disprove $A^\dagger = B$.
$\langle T(\beta_i), \beta_j \rangle = \langle A_{ki} \beta_k, \beta_j \rangle$
$= A_{ki} \langle \beta_k, \beta_j \rangle = (A^\dagger)_{ik} G_{kj}$
$= (A^\dagger G)_{ij}$
$\langle T(\beta_i), \beta_j \rangle = \langle \beta_i, T^*(\beta_j) \rangle$
$= \langle \beta_i, B_{kj} \beta_k \rangle = B_{kj} G_{ik}$
$= (GB)_{ij}$
$A^\dagger G = GB$
$A^\dagger = GBG^{-1}$
$G^{-1} A^\dagger G = B$
For an orthonormal basis $e$, $G_{ij} = \langle e_i, e_j \rangle = \delta_{ij}$, so $G = I$ and $B = G^{-1} A^\dagger G = A^\dagger$, i.e. $[T]^\dagger_\beta = [T^*]_\beta$, so the statement holds.
For an orthogonal basis $\gamma$, $G_{ij} = \delta_{ij} \langle \gamma_i, \gamma_i \rangle$.
$B_{ij} = (G^{-1} A^\dagger G)_{ij} = \frac{\langle \gamma_j, \gamma_j \rangle}{\langle \gamma_i, \gamma_i \rangle} (A^\dagger)_{ij}$. This equals $(A^\dagger)_{ij}$ iff $(A^\dagger)_{ij} = 0$ or $\langle \gamma_i, \gamma_i \rangle = \langle \gamma_j, \gamma_j \rangle$. This means the statement also holds for orthogonal bases if $(A^\dagger)_{ij} \neq 0 \implies \langle \gamma_i, \gamma_i \rangle = \langle \gamma_j, \gamma_j \rangle$.
#
Counterexample
Let basis $\beta = \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}$ on $V = \mathbb{R}^2$. We have $G = \begin{bmatrix} \langle \beta_1, \beta_1 \rangle & \langle \beta_1, \beta_2 \rangle \\ \langle \beta_2, \beta_1 \rangle & \langle \beta_2, \beta_2 \rangle \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}$ and $G^{-1} = \begin{bmatrix} 2 & -1 \\ -1 & 1 \end{bmatrix}$
We find an operator $T$ such that $A = [T]_\beta$ and $G^{-1} A^\dagger G \neq A^\dagger$. Let $T(v_1 \beta_1 + v_2 \beta_2) = v_2 \beta_1 + v_1 \beta_2$.
$A = A^\dagger = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$, $B = G^{-1} A^\dagger G = \begin{bmatrix} 2 & -1 \\ -1 & 1 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix} = \begin{bmatrix} 1 & 3 \\ 0 & -1 \end{bmatrix} \neq A^\dagger$ ∎
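The counterexample can be verified with integer arithmetic, including a direct check that the computed matrix of $T^*$ satisfies the defining adjoint identity $\langle T(\beta_i), \beta_j \rangle = \langle \beta_i, T^*(\beta_j) \rangle$, which in these indices reads $A^\dagger G = GB$ (here $B$ is computed as $G^{-1} A^\dagger G$):

```python
# Verify the counterexample: with G the Gram matrix of beta = {(1,0), (1,1)}
# and A = [T]_beta for T swapping the beta-coordinates, the matrix of T*
# differs from the conjugate transpose of A.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

G = [[1, 1], [1, 2]]
G_inv = [[2, -1], [-1, 1]]
assert matmul(G, G_inv) == [[1, 0], [0, 1]]

A = [[0, 1], [1, 0]]
A_dag = [[A[j][i] for j in range(2)] for i in range(2)]  # real: just transpose
B = matmul(matmul(G_inv, A_dag), G)
assert B != A_dag

# B really does represent T*: <T(b_i), b_j> = <b_i, T*(b_j)> for all i, j,
# i.e. (A_dag G) = (G B).
assert matmul(A_dag, G) == matmul(G, B)
```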