Deep Learning - Tensorflow2.2 - Deep Learning Basics and tf.keras {1} - Linear Regression, tf.keras Overview - 02
Principle of Linear Regression
Linear regression fits a straight line to the data: the linear equation y = kx + b, where k is the slope and b is the intercept.
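As a quick illustration (the values of k and b below are made up, not taken from the data), the linear model is simply a straight-line function of the input:

# Hypothetical slope and intercept, purely to illustrate y = kx + b
k, b = 2.0, 1.0

def predict_line(x):
    # A linear model maps an input to a point on the line
    return k * x + b

print(predict_line(3.0))  # 2.0 * 3.0 + 1.0 = 7.0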
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # silence TensorFlow info/warning logs
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt

# Load the dataset and plot Income against Education
data = pd.read_csv('A.csv')
print(data)
plt.scatter(data.Education, data.Income)
plt.show()
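The A.csv file itself is not included in this post; the code assumes it has two columns, Education and Income. If you want to run the example without the original file, a small stand-in DataFrame with made-up numbers has the same shape:

import pandas as pd

# Hypothetical replacement for A.csv: Education in years, Income in arbitrary units
data = pd.DataFrame({
    'Education': [10, 12, 14, 16, 18, 20],
    'Income': [26, 31, 38, 45, 55, 62],
})
print(data.head())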
Prediction Target and Loss Function
Target: make the overall error between the prediction function f(x) and the true values as small as possible. Loss function: use the mean squared error (MSE) as the cost function, i.e. the mean of the squared differences between the predicted and true values.
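As a quick illustration (not part of the original code), MSE can be computed by hand; tf.keras does the equivalent internally when loss='mse' is used:

import numpy as np

def mse(y_true, y_pred):
    # Mean of the squared differences between predictions and true values
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_pred - y_true) ** 2)

print(mse([1.0, 2.0, 3.0], [1.5, 1.5, 3.0]))  # (0.25 + 0.25 + 0.0) / 3 ≈ 0.167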
import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv('A.csv')
x = data.Education
y = data.Income

# A single Dense unit with one input is exactly the linear model y = kx + b
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=(1,)))
model.summary()
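Because the single Dense unit is just y = kx + b, its current slope and intercept can be read back from the layer at any time (after training these are the fitted k and b). A minimal sketch, assuming the model defined above:

# Dense stores its kernel (slope k) and bias (intercept b)
kernel, bias = model.layers[0].get_weights()
print('slope k:', kernel.ravel()[0], 'intercept b:', bias[0])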
Continuing with the model defined above, compile it with the Adam optimizer and MSE loss, then train for 5000 epochs:

model.compile(optimizer='adam', loss='mse')
history = model.fit(x, y, epochs=5000)
print(history)
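print(history) only shows the History object itself; the per-epoch loss values live in history.history. A small sketch (using the history returned by the fit call above) to plot the training curve:

# history.history is a dict; with loss='mse' it contains a 'loss' list, one entry per epoch
plt.plot(history.history['loss'])
plt.xlabel('epoch')
plt.ylabel('MSE loss')
plt.show()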
Finally, use the trained model to predict incomes for the training inputs and for a new value, 20 years of education:

print(model.predict(x))
print('Predicted income for 20 years of education:', model.predict(pd.Series([20])))
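To see how well the fitted line matches the data, the predictions can be overlaid on the original scatter plot; a minimal sketch using the objects defined above:

# Raw data with the model's fitted line on top
plt.scatter(x, y, label='data')
plt.plot(x, model.predict(x), c='r', label='fitted line')
plt.legend()
plt.show()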
Summary
This post walked through a minimal linear regression with tf.keras: load the Education/Income data with pandas, define a Sequential model with a single Dense unit (the line y = kx + b), compile it with the Adam optimizer and MSE loss, train it with model.fit, and make predictions with model.predict.