DL之GD: Binary classification with the LogisticGD algorithm (gradient descent) on datasets drawn from a linear function and a quadratic function (with hyperplane visualization)
Contents

Binary classification with the LogisticGD algorithm (gradient descent) on linear- and quadratic-function datasets (hyperplane visualization)
Design Approach
Output
Core Code
Binary classification with the LogisticGD algorithm (gradient descent) on linear- and quadratic-function datasets (hyperplane visualization)
Design Approach

To be updated…
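Until that update is filled in, the training step can be sketched as batch gradient descent on the cross-entropy loss for labels in {-1, +1}. This is a minimal sketch, not the author's exact implementation; the names `logistic_gd`, `eta`, and `tol` are my own:

```python
import numpy as np

def sigmoid(s):
    """Logistic function."""
    return 1.0 / (1.0 + np.exp(-s))

def logistic_gd(z, y, eta=0.1, tol=1e-3, max_iters=10000):
    """Batch gradient descent on the cross-entropy error.

    z: (N, d) feature matrix whose first column is the bias term 1.
    y: (N,) labels in {-1, +1}.
    Returns the weight history, one row per iteration (analogous to
    the GD_w_hs array printed in the output below).
    """
    N, d = z.shape
    w = np.zeros(d)
    w_hist = [w.copy()]
    for _ in range(max_iters):
        # Gradient of E_in(w) = (1/N) * sum_n ln(1 + exp(-y_n * w.z_n))
        grad = -np.mean((y * z.T) * sigmoid(-y * z.dot(w)), axis=1)
        w = w - eta * grad
        w_hist.append(w.copy())
        if np.linalg.norm(grad) < tol:  # stop once the gradient is small
            break
    return np.array(w_hist)
```

A point x is then classified as +1 when sigmoid(w · z(x)) >= 0.5, i.e. when w · z(x) >= 0; the set w · z(x) = 0 is the hyperplane being visualized.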
Output
[ 1.          0.06747879 -0.97085008]
data_x (300, 3)
[[ 1.          0.83749402  0.80142971]
 [ 1.         -0.93315714  0.91389867]
 [ 1.         -0.72558136 -0.43234329]
 [ 1.          0.21216637  0.88845027]
 [ 1.          0.70547108 -0.99548153]]
Because Linear_function is effectively a no-op here, data_x is equivalent to data_z after passing through Linear_function.
data_y (300,) [-1. -1. -1. -1.  1.]
data_x: (300, 3)
data_z: (300, 3)
data_y: (300,)
[228 106 146 250  91 214  47  49 178  90]
Number of iterations: 26
Plot took 0.10 seconds.
Plot took 0.04 seconds.
Target weights: [ -0.49786797   5.28778784 -11.997255  ]
Target in-sample error: 3.33%
Target out-of-sample error: 6.21%
Hypothesis (N=300) weights: [-0.45931854  3.20434478 -7.70825364]
Hypothesis (N=300) in-sample error: 4.33%
Hypothesis (N=300) out-of-sample error: 6.08%
Hypothesis (N=10) weights: [-1.35583449  3.90067866 -5.99553537]
Hypothesis (N=10) in-sample error: 10.00%
Hypothesis (N=10) out-of-sample error: 12.87%
Error history took 88.89 seconds.
Plot took 17.72 seconds.
Plot took 35.88 seconds.
GD_w_hs[-1] [-1.35583449  3.90067866 -5.99553537]

dimension_z 5
data_x (30, 3)
[[ 1.         -0.0609991  -0.15447425]
 [ 1.         -0.13429796 -0.89691689]
 [ 1.          0.12475253  0.36980185]
 [ 1.         -0.0182513   0.74771272]
 [ 1.          0.50585605 -0.04961719]]
Because Linear_function is effectively a no-op here, data_x is equivalent to data_z after passing through Linear_function.
data_y (30,) [-1.  1.  1.  1. -1.]
Plot took 1.02 seconds.
Number of iterations: 105
Plot took 1.03 seconds.
Target weights: [-3  2  3  6  9 10]
Hypothesis weights: [-1.23615696 -0.9469097   1.76449666  2.09453304  5.62678124  5.06054409]
Hypothesis in-sample error: 10.00%
Hypothesis out-of-sample error: 15.47%
Plot took 16.58 seconds.
GD_w_hs[-1] [-1.23615696 -0.9469097   1.76449666  2.09453304  5.62678124  5.06054409]

Core Code
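The second run reports dimension_z 5 and six target weights, which suggests a second-order polynomial feature map over (x1, x2). The exact ordering used by the original code is not shown, so the ordering below is an assumption, as is the name `second_order_features`:

```python
import numpy as np

def second_order_features(x):
    """Map an input row (1, x1, x2) to second-order polynomial features.

    Assumed ordering: (1, x1, x2, x1^2, x1*x2, x2^2), matching the six
    target weights printed in the second run above.
    """
    _, x1, x2 = x
    return np.array([1.0, x1, x2, x1**2, x1*x2, x2**2])

# Applied row-wise, as the original code does via np.apply_along_axis:
data_x = np.array([[1.0, -0.0609991, -0.15447425],
                   [1.0, -0.13429796, -0.89691689]])
data_z = np.apply_along_axis(second_order_features, 1, data_x)
# data_z has shape (2, 6)
```

With this map, the "hyperplane" w · z = 0 in feature space corresponds to a conic-section decision boundary in the original (x1, x2) plane.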
import time

import numpy as np
import matplotlib.pyplot as plt


def in_sample_error(z, y, logisticGD_function):
    y_h = (logisticGD_function(z) >= 0.5)*2 - 1
    return np.sum(y != y_h) / float(len(y))


def estimate_out_of_sample_error(Product_x_function, NOrderPoly_Function,
                                 Pre_Logistic_function, logisticGD_function,
                                 N=10000, Linear_function_h=None):
    x = np.array([Product_x_function() for i in range(N)])
    z = np.apply_along_axis(NOrderPoly_Function, 1, x)
    if Linear_function_h is not None:
        z_h = np.apply_along_axis(Linear_function_h, 1, x)
    else:
        z_h = z
    y = Pre_Logistic_function(z)
    y_h = (logisticGD_function(z_h) >= 0.5)*2 - 1
    return np.sum(y != y_h) / float(N)


def ErrorCurve_Plot(N, GD_w_hs, cross_entropy_error):
    start_time = time.time()
    fig = plt.figure()  # figsize=(8, 6)
    ax = fig.add_subplot(1, 1, 1)
    ax.set_xlabel(r'Iteration', fontsize=12)
    ax.set_ylabel(r'In-Sample Error ($E_{in}$)', fontsize=12)
    ax.set_title(r'Gradient Descent Evolution, N={}'.format(N), fontsize=12)
    ax.set_xlim(0, GD_w_hs.shape[0]-1)
    ax.set_ylim(0, 1)
    ax.xaxis.grid(color='gray', linestyle='dashed')
    ax.yaxis.grid(color='gray', linestyle='dashed')
    ax.set_axisbelow(True)
    ax.plot(range(GD_w_hs.shape[0]),
            np.apply_along_axis(cross_entropy_error, 1, GD_w_hs), 'r-')
    plt.show()
    print('Plot took {:.2f} seconds.'.format(time.time()-start_time))
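`ErrorCurve_Plot` expects a `cross_entropy_error` callable that is not defined in this excerpt. Under the usual convention for ±1 labels it could be built as a closure over the training set; `make_cross_entropy_error` is an assumed name, not from the original code:

```python
import numpy as np

def make_cross_entropy_error(z, y):
    """Return E_in(w) = (1/N) * sum_n ln(1 + exp(-y_n * w.z_n)) as a closure.

    z: (N, d) features, y: (N,) labels in {-1, +1}. The returned function
    maps a weight vector w to a scalar error, which is the shape
    np.apply_along_axis expects in ErrorCurve_Plot.
    """
    def cross_entropy_error(w):
        return np.mean(np.log(1.0 + np.exp(-y * z.dot(w))))
    return cross_entropy_error
```

With this helper, the error-curve call would look like `ErrorCurve_Plot(len(data_y), GD_w_hs, make_cross_entropy_error(data_z, data_y))`.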