Loading data ...
First 10 examples from the dataset:
 x = [2104 3], y = 399900
 x = [1600 3], y = 329900
 x = [2400 3], y = 369000
 x = [1416 2], y = 232000
 x = [3000 4], y = 539900
 x = [1985 4], y = 299900
 x = [1534 3], y = 314900
 x = [1427 3], y = 198999
 x = [1380 3], y = 212000
 x = [1494 3], y = 242500
Program paused. Press enter to continue.
```matlab
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;   % features: house size, number of bedrooms
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
mu = mean(X);
sigma = std(X);
X_norm = (X - mu) ./ sigma;
% ============================================================
end
```
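In formula form, featureNormalize computes the per-feature z-score. With $\mu_j$ and $\sigma_j$ the mean and sample standard deviation of feature $j$ over the $m$ training examples:

```latex
x^{(i)}_{j,\mathrm{norm}} = \frac{x^{(i)}_j - \mu_j}{\sigma_j},
\qquad
\mu_j = \frac{1}{m}\sum_{i=1}^{m} x^{(i)}_j,
\qquad
\sigma_j = \sqrt{\frac{1}{m-1}\sum_{i=1}^{m}\bigl(x^{(i)}_j - \mu_j\bigr)^2}
```

Note that Octave's `std` divides by $m-1$ (the sample standard deviation), which is what the code above uses.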
```matlab
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y);  % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.
J = 1 / (2 * m) * (X * theta - y)' * (X * theta - y);
% =========================================================================
end
```
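The one-liner above is the vectorized form of the usual squared-error cost: the residual vector $X\theta - y$ multiplied by its own transpose sums the squared errors in a single matrix product.

```latex
J(\theta)
= \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
= \frac{1}{2m}\,(X\theta - y)^{T}(X\theta - y)
```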
```matlab
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y);  % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
    theta = theta - alpha / m * X' * (X * theta - y);
    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);
end
end
```
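The single line inside the loop is the vectorized gradient step: $X^{T}(X\theta - y)$ stacks the partial derivatives of $J$ with respect to every $\theta_j$ at once, so all parameters are updated simultaneously.

```latex
\theta := \theta - \frac{\alpha}{m}\,X^{T}(X\theta - y)
```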
A chosen setting (this is also a fairly reasonable choice of learning rate and number of iterations):
```matlab
% Choose some alpha value
alpha = 0.09;
num_iters = 50;
```
The resulting convergence plot:
```matlab
% Choose some alpha value
alpha = 0.12;
num_iters = 50;

% Choose some alpha value
alpha = 0.15;
num_iters = 50;
```
The resulting convergence plots:
Full code for Part 2:
```matlab
%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%
fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.12;
num_iters = 50;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
hold on;
plot(1:numel(J_history), J_history, '-g', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.
price = 0;  % You should change this
price = [1, ([1650 3] - mu) ./ sigma] * theta;
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;
```
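Since theta was learned on normalized features, the query point $(1650, 3)$ must be normalized with the same $\mu$ and $\sigma$ before the dot product with $\theta$; only the intercept entry stays at 1:

```latex
\text{price} =
\begin{bmatrix}
1 & \dfrac{1650 - \mu_1}{\sigma_1} & \dfrac{3 - \mu_2}{\sigma_2}
\end{bmatrix}
\theta
```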
The line that predicts the house price with gradient descent:
```matlab
price = [1, ([1650 3] - mu) ./ sigma] * theta;
```
The result:
```
Running gradient descent ...
Theta computed from gradient descent:
 339842.312379
 106499.134141
 -2521.749858

Predicted price of a 1650 sq-ft, 3 br house (using gradient descent):
 $293411.151468
```
```matlab
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X, y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%
% ---------------------- Sample Solution ----------------------
theta = pinv(X' * X) * X' * y;
% -------------------------------------------------------------
% ============================================================
end
```
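The sample solution is the normal equation, the closed-form minimizer of $J(\theta)$:

```latex
\theta = \bigl(X^{T}X\bigr)^{-1} X^{T} y
```

Using `pinv` rather than `inv` makes the computation robust even when $X^{T}X$ is singular or near-singular (e.g. with redundant features).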
Full code for Part 3:
```matlab
%% ================ Part 3: Normal Equations ================
fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = 0;  % You should change this
% price = [1, ([1650 3] - mu) ./ sigma] * theta;  % not needed: no normalization here
price = [1 1650 3] * theta;
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
```
The line that predicts the house price:
```matlab
price = [1 1650 3] * theta;
```
No feature normalization is needed here, because the normal equation was solved on the raw (unnormalized) features.
The result:

```
Solving with normal equations...
Theta computed from the normal equations:
 89597.909544
 139.210674
 -8738.019113

Predicted price of a 1650 sq-ft, 3 br house (using normal equations):
 $293081.464335
```
```
Normalizing Features ...
Running gradient descent ...
Theta computed from gradient descent:
 339842.312379
 106499.134141
 -2521.749858

Predicted price of a 1650 sq-ft, 3 br house (using gradient descent):
 $293411.151468

Program paused. Press enter to continue.
Solving with normal equations...
Theta computed from the normal equations:
 89597.909544
 139.210674
 -8738.019113

Predicted price of a 1650 sq-ft, 3 br house (using normal equations):
 $293081.464335
```