ML Regression: Comparing the Performance of 13 Machine-Learning Algorithms on the Boston Housing Dataset (13 features + 1 target, 506 samples) for House-Price Prediction
Overview
Thirteen machine-learning regressors are applied to the Boston housing dataset (13 features + 1 target, 506 samples) to predict house prices and compare model performance: LiR, kNN (uniform- and distance-weighted), SVR (linear, polynomial, and RBF kernels), DTR, RFR, ETR, SGDR, GBR, LGBR, and XGBR. The boosted ensemble models (GBR, LGBR, and the grid-searched XGBR) perform best, with the tuned XGBR reaching the highest test-set R-squared (about 0.85). A sketch of the overall pipeline appears below.
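The following is a minimal sketch of that pipeline, not the original post's code: the train/test split ratio, the random seed, the feature scaling, and all hyperparameters are assumptions, and load_boston requires an older scikit-learn release (it was removed in scikit-learn 1.2).

    # A minimal sketch of the comparison pipeline described above (assumptions noted
    # in the lead-in; load_boston needs scikit-learn < 1.2).
    from sklearn.datasets import load_boston
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LinearRegression, SGDRegressor
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.svm import SVR
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.ensemble import (RandomForestRegressor, ExtraTreesRegressor,
                                  GradientBoostingRegressor)
    from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

    X, y = load_boston(return_X_y=True)   # 506 samples, 13 features, 1 target
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=33)
    print("The max target value is", y.max())
    print("The min target value is", y.min())
    print("The average target value is", y.mean())

    # Standardize the features; the tree models do not need it, but it is
    # harmless and the linear / kNN / SVR / SGD models benefit from it.
    scaler = StandardScaler()
    X_train_s = scaler.fit_transform(X_train)
    X_test_s = scaler.transform(X_test)

    models = {
        "LiR": LinearRegression(),
        "kNNR_uni": KNeighborsRegressor(weights="uniform"),
        "kNNR_dis": KNeighborsRegressor(weights="distance"),
        "linear_SVR": SVR(kernel="linear"),
        "poly_SVR": SVR(kernel="poly"),
        "rbf_SVR": SVR(kernel="rbf"),
        "DTR": DecisionTreeRegressor(),
        "RFR": RandomForestRegressor(),
        "ETR": ExtraTreesRegressor(),
        "SGDR": SGDRegressor(max_iter=1000),
        "GBR": GradientBoostingRegressor(),
        # LGBR (lightgbm.LGBMRegressor) and XGBR (xgboost.XGBRegressor) are
        # evaluated the same way once those packages are installed.
    }

    for name, model in models.items():
        model.fit(X_train_s, y_train)
        y_pred = model.predict(X_test_s)
        print(name, "R2: %.4f" % r2_score(y_test, y_pred),
              "MSE: %.4f" % mean_squared_error(y_test, y_pred),
              "MAE: %.4f" % mean_absolute_error(y_test, y_pred))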
Contents
Output Results
Design Approach
Output Results
A 13th ML algorithm added (XGBR with grid search)
Initial data check: spread of the regression target values. (Note: every metric line below carries the label "DecisionTreeRegressor"; this appears to be a leftover in the print statements, and each line actually refers to the model named at its start.)

The max target value is 50.0
The min target value is 5.0
The average target value is 22.532806324110677

LiR:The value of default measurement of LiR is 0.6763403830998702
LiR:R-squared value of DecisionTreeRegressor: 0.6763403830998702
LiR:The mean squared error of DecisionTreeRegressor: 25.096985692067726
LiR:The mean absoluate error of DecisionTreeRegressor: 3.5261239963985433

kNNR_uni:The value of default measurement of kNNR_uni is 0.6903454564606561
kNNR_uni:R-squared value of DecisionTreeRegressor: 0.6903454564606561
kNNR_uni:The mean squared error of DecisionTreeRegressor: 24.01101417322835
kNNR_uni:The mean absoluate error of DecisionTreeRegressor: 2.9680314960629928

kNNR_dis:The value of default measurement of kNNR_dis is 0.7197589970156353
kNNR_dis:R-squared value of DecisionTreeRegressor: 0.7197589970156353
kNNR_dis:The mean squared error of DecisionTreeRegressor: 21.730250160926044
kNNR_dis:The mean absoluate error of DecisionTreeRegressor: 2.8050568785108005

linear_SVR:The value of default measurement of linear_SVR is 0.651717097429608
linear_SVR:R-squared value of DecisionTreeRegressor: 0.651717097429608
linear_SVR:The mean squared error of DecisionTreeRegressor: 27.0063071393243
linear_SVR:The mean absoluate error of DecisionTreeRegressor: 3.426672916872753

poly_SVR:The value of default measurement of poly_SVR is 0.40445405800289286
poly_SVR:R-squared value of DecisionTreeRegressor: 0.4044540580028929
poly_SVR:The mean squared error of DecisionTreeRegressor: 46.1794033139523
poly_SVR:The mean absoluate error of DecisionTreeRegressor: 3.75205926674149

rbf_SVR:The value of default measurement of rbf_SVR is 0.7564068912273935
rbf_SVR:R-squared value of DecisionTreeRegressor: 0.7564068912273935
rbf_SVR:The mean squared error of DecisionTreeRegressor: 18.888525000753493
rbf_SVR:The mean absoluate error of DecisionTreeRegressor: 2.6075632979823276

DTR:The value of default measurement of DTR is 0.699313885811367
DTR:R-squared value of DecisionTreeRegressor: 0.699313885811367
DTR:The mean squared error of DecisionTreeRegressor: 23.31559055118111
DTR:The mean absoluate error of DecisionTreeRegressor: 3.1716535433070865

RFR:The value of default measurement of RFR is 0.8320900865862684
RFR:R-squared value of DecisionTreeRegressor: 0.8320900865862684
RFR:The mean squared error of DecisionTreeRegressor: 13.019952055992995
RFR:The mean absoluate error of DecisionTreeRegressor: 2.3392650918635174

ETR:The value of default measurement of ETR is 0.7595247600325825
ETR:R-squared value of DecisionTreeRegressor: 0.7595247600325824
ETR:The mean squared error of DecisionTreeRegressor: 18.646761417322832
ETR:The mean absoluate error of DecisionTreeRegressor: 2.5487401574803146

SGDR:The value of default measurement of SGDR is 0.6525677025033261
SGDR:R-squared value of DecisionTreeRegressor: 0.6525677025033261
SGDR:The mean squared error of DecisionTreeRegressor: 26.940350120746693
SGDR:The mean absoluate error of DecisionTreeRegressor: 3.524049659554681

GBR:The value of default measurement of GBR is 0.8442966156976921
GBR:R-squared value of DecisionTreeRegressor: 0.8442966156976921
GBR:The mean squared error of DecisionTreeRegressor: 12.07344198657727
GBR:The mean absoluate error of DecisionTreeRegressor: 2.2692783233003326

[LightGBM] [Warning] feature_fraction is set=0.6, colsample_bytree=1.0 will be ignored. Current value: feature_fraction=0.6
[LightGBM] [Warning] min_data_in_leaf is set=18, min_child_samples=20 will be ignored. Current value: min_data_in_leaf=18
[LightGBM] [Warning] min_sum_hessian_in_leaf is set=0.001, min_child_weight=0.001 will be ignored. Current value: min_sum_hessian_in_leaf=0.001
[LightGBM] [Warning] bagging_fraction is set=0.7, subsample=1.0 will be ignored. Current value: bagging_fraction=0.7

LGBR:The value of default measurement of LGBR is 0.824979251097139
LGBR:R-squared value of DecisionTreeRegressor: 0.824979251097139
LGBR:The mean squared error of DecisionTreeRegressor: 13.5713354452417
LGBR:The mean absoluate error of DecisionTreeRegressor: 2.3653297699911455

Default R-squared scores of the 12 models above, in order:
[0.6763403830998702, 0.6903454564606561, 0.7197589970156353, 0.651717097429608, 0.40445405800289286, 0.7564068912273935, 0.699313885811367, 0.8320900865862684, 0.7595247600325825, 0.6525677025033261, 0.8442966156976921, 0.824979251097139]

Grid-search result for the XGBR model:
{'learning_rate': 0.09, 'max_depth': 4, 'n_estimators': 200}
rmse: 0.37116076328428194
XGBR_grid:The value of default measurement of XGBR_grid is -0.1355992935386311
XGBR_grid:R-squared value of DecisionTreeRegressor: 0.8494067182200448
XGBR_grid:The mean squared error of DecisionTreeRegressor: 11.67719810423491
XGBR_grid:The mean absoluate error of DecisionTreeRegressor: 2.156086404304805
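The XGBR_grid figures above come from a grid search over learning_rate, max_depth, and n_estimators. The following is a rough sketch of how such a search might be set up, not the original post's code: only the winning combination ({'learning_rate': 0.09, 'max_depth': 4, 'n_estimators': 200}) is taken from the log, while the candidate values, the number of CV folds, and the scoring metric are assumptions, and X_train_s / y_train reuse the split from the first sketch.

    # Sketch of a grid search for the XGBR model (assumptions noted in the lead-in).
    from xgboost import XGBRegressor
    from sklearn.model_selection import GridSearchCV
    from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

    param_grid = {
        "learning_rate": [0.05, 0.07, 0.09, 0.11],
        "max_depth": [3, 4, 5],
        "n_estimators": [100, 200, 300],
    }
    xgbr_grid = GridSearchCV(XGBRegressor(objective="reg:squarederror"),
                             param_grid, cv=5, scoring="neg_mean_squared_error")
    xgbr_grid.fit(X_train_s, y_train)      # split/scaling from the first sketch
    print(xgbr_grid.best_params_)

    y_pred = xgbr_grid.best_estimator_.predict(X_test_s)
    print("R2:", r2_score(y_test, y_pred))
    print("MSE:", mean_squared_error(y_test, y_pred))
    print("MAE:", mean_absolute_error(y_test, y_pred))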
Design Approach
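The original post's design-approach diagrams are not reproduced in this text capture. As a partial stand-in, the sketch below shows an LGBMRegressor configured with the parameter values revealed by the LightGBM warnings in the log (feature_fraction=0.6, bagging_fraction=0.7, min_data_in_leaf=18, min_sum_hessian_in_leaf=0.001); every other setting here is an assumption. Passing these native LightGBM parameter names through the scikit-learn wrapper is what produces the "will be ignored" override warnings seen above.

    # Sketch of the LGBR configuration implied by the LightGBM warnings in the log;
    # all other settings are assumptions, not the original post's code.
    from lightgbm import LGBMRegressor

    lgbr = LGBMRegressor(
        feature_fraction=0.6,            # native alias; overrides colsample_bytree
        bagging_fraction=0.7,            # native alias; overrides subsample
        min_data_in_leaf=18,             # native alias; overrides min_child_samples
        min_sum_hessian_in_leaf=0.001,   # native alias; overrides min_child_weight
    )
    lgbr.fit(X_train_s, y_train)          # split/scaling from the first sketch
    print(lgbr.score(X_test_s, y_test))   # test-set R-squared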