Statistics for AI (Part 2)
Today I plan to cover the following topics: Linear independence, special matrices, and matrix decomposition.
今天,我計(jì)劃涵蓋以下主題:線(xiàn)性獨(dú)立性,特殊矩陣和矩陣分解。
線(xiàn)性獨(dú)立 (Linear independence)
A set of vectors is linearly independent if none of the vectors can be written as a linear combination of the others. For example, take V1 = (1, 0) and V2 = (0, 1): neither can be written in terms of the other. However, adding V3 = (3, 4) makes the set linearly dependent, since V3 can be expressed as 3V1 + 4V2.
Mathematically, S = {V1, V2, …, Vn} is linearly independent if and only if the linear combination α1V1 + α2V2 + … + αnVn = 0 implies that all αi = 0.
矩陣運(yùn)算 (Matrix operations)
Matrices transform one vector into another. For example, if A is an N×N matrix and v is an N×1 vector, then w = Av is also an N×1 vector.
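As a small illustration (the matrix A and vector v below are arbitrary made-up values):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 3.0]])   # an arbitrary 2x2 matrix
v = np.array([1.0, 2.0])     # an N x 1 vector (N = 2)

w = A @ v                    # the result is again an N x 1 vector
print(w)                     # [2. 7.]
```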
Trace of a matrix
The trace of a matrix is the sum of its diagonal elements. For a matrix A, the trace is the sum of all entries whose row index equals their column index: tr(A) = Σᵢ Aᵢᵢ.
Trace of a matrix, Image by author

Some properties
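The properties themselves are listed in the figure above; as a quick numerical sketch, the snippet below checks two commonly cited trace identities, tr(A + B) = tr(A) + tr(B) and tr(AB) = tr(BA), using arbitrary random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Trace = sum of diagonal elements
assert np.isclose(np.trace(A), A.diagonal().sum())

# tr(A + B) = tr(A) + tr(B)  and  tr(AB) = tr(BA)
assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```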
The determinant of a matrix
The Laplace expansion for an N×N matrix is given by the following formula:
The determinant of a matrix, Image by author

The determinant represents the (signed) volume formed by the column vectors. For a 2×2 matrix, it represents an area.
Interpretation of 2×2 vectors in space, Image by author

Invertibility of a matrix
The inverse of a matrix A exists only if det(A) is not 0. Note that this automatically means the columns of A must be linearly independent. Consider the matrix below.
Matrix A, Image by author

Note that V1, V2, …, Vn are column vectors. If any one of them, say Vn, can be written as a linear combination of the rest, Vn = α1V1 + α2V2 + … + αn-1Vn-1, then a simple column operation (last column = last column − (α1V1 + α2V2 + … + αn-1Vn-1)) yields a column full of zeros, which makes the determinant of the matrix 0. For a 2×2 matrix we have two column vectors V1 and V2. If V1 and V2 are linearly dependent, say V1 = 2V2, then the area formed by the two vectors is zero; put another way, the two vectors are parallel to one another.
Special matrices and vectors
Eigendecomposition
Eigendecomposition is extremely useful for square symmetric matrices. Let's look at the physical meaning of the term.
Every real matrix can be thought of as a combination of rotation and stretching.
每個(gè)實(shí)數(shù)矩陣都可以視為旋轉(zhuǎn)和拉伸的組合。
Vector multiplication, Image by author

Operation on vector v that generates vector w, Image by author

Here, A can be thought of as an operator that stretches and rotates a vector v to obtain a new vector w. Eigenvectors of a matrix are the special vectors that are only stretched under the action of the matrix. Eigenvalues are the factors by which the eigenvectors stretch. In the equation below, the eigenvector v is stretched by a factor of lambda when operated on by the matrix A.
Eigenvalue lambda of a vector v, Image by author

Say A has n linearly independent eigenvectors {V1, V2, …, Vn}. Concatenating these vectors as columns gives a single eigenvector matrix V = [V1, V2, …, Vn]. If we also collect the corresponding eigenvalues into a diagonal matrix Λ = diag(λ1, λ2, …, λn), we get the eigendecomposition (factorization) of A as:
Eigendecomposition of A, Image by author

Real symmetric matrices have real eigenvectors and real eigenvalues.
Real symmetric matrix, Image by author

Quadratic form and positive definite matrix
The quadratic form xᵀAx can be interpreted as a ‘weighted’ (squared) length of x.
Quadratic form, Image by author

A positive definite (PD) matrix has all eigenvalues greater than zero. A positive semi-definite (PSD) matrix has all eigenvalues greater than or equal to zero. A PD matrix has the property that xᵀAx is greater than 0 for all x ≠ 0. For example, if A = I (the identity matrix), then xᵀIx = xᵀx, which is greater than 0. A PSD matrix has the property that xᵀAx is greater than or equal to 0 for all x. Similarly, a negative definite (ND) matrix has all eigenvalues less than zero, and a negative semi-definite (NSD) matrix has all eigenvalues less than or equal to zero.
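As a small numerical sketch of these definitions (the matrices below are arbitrary examples), we can evaluate the quadratic form xᵀAx directly and classify a symmetric matrix by the signs of its eigenvalues:

```python
import numpy as np

def quadratic_form(A, x):
    return x @ A @ x                          # x^T A x, a 'weighted' squared length

A_pd  = np.array([[2.0, 0.0], [0.0, 1.0]])   # eigenvalues 2, 1  -> positive definite
A_psd = np.array([[1.0, 0.0], [0.0, 0.0]])   # eigenvalues 1, 0  -> positive semi-definite

x = np.array([1.0, -2.0])
print(quadratic_form(np.eye(2), x))           # 5.0 = ||x||^2 when A = I
print(quadratic_form(A_pd, x))                # 6.0 > 0

eigs = np.linalg.eigvalsh(A_psd)              # eigenvalues of a symmetric matrix
print(eigs, np.all(eigs >= 0))                # [0. 1.] True -> PSD
```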
Singular value decomposition
If A is an M×N matrix, then it can be decomposed as shown below:
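The factorization itself is shown in the figure below; as a minimal NumPy sketch (with an arbitrary 3×2 example matrix), any M×N matrix factors as A = U Σ Vᵀ with orthogonal U and V and non-negative singular values on the diagonal of Σ:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])                 # an M x N matrix with M = 3, N = 2

U, s, Vt = np.linalg.svd(A)                # s holds the singular values
Sigma = np.zeros_like(A)
Sigma[:len(s), :len(s)] = np.diag(s)       # embed the singular values in an M x N matrix

print(np.allclose(A, U @ Sigma @ Vt))      # True: A = U Sigma V^T
print(s)                                   # singular values, in decreasing order
```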
Singular value decomposition, Image by author

End
Thank you and stay tuned for more blogs on AI.
Translated from: https://towardsdatascience.com/statistics-for-ai-part-2-43d81986c87c
ai 中 統(tǒng)計(jì)
總結(jié)
以上是生活随笔為你收集整理的ai 中 统计_AI统计(第2部分)的全部?jī)?nèi)容,希望文章能夠幫你解決所遇到的問(wèn)題。
- 上一篇: 梦到乞丐缠着我是什么意思
- 下一篇: 梦到熟人离婚是什么意思