Wednesday, May 7, 2014

Gradient Descent Algorithm for a Single Feature


Gradient Descent Algorithm with a Single Feature
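
For reference, the loop below implements the standard simultaneous update for univariate linear regression. With hypothesis h(x) = theta_0 + theta_1 * x, learning rate alpha, and m training examples, every iteration computes

 theta_0 := theta_0 - alpha * (1/m) * sum( h(x(i)) - y(i) )               over all i
 theta_1 := theta_1 - alpha * (1/m) * sum( ( h(x(i)) - y(i) ) * x(i) )    over all i

using the old value of theta in both lines, which is why the code stores the results in temporary variables before overwriting theta.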


 function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)  
 %GRADIENTDESCENT Performs gradient descent to learn theta  
 %  theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
 %  taking num_iters gradient steps with learning rate alpha  
 % Initialize some useful values  
 m = length(y); % number of training examples  
 J_history = zeros(num_iters, 1);  
 for iter = 1:num_iters  
   % ====================== YOUR CODE HERE ======================  
   % Instructions: Perform a single gradient step on the parameter vector  
   %        theta.   
   %  
   % Hint: While debugging, it can be useful to print out the values  
   %    of the cost function (computeCost) and gradient here.  
   %  
      % The lecture notation uses theta_0 and theta_1 (j = 0, 1); Octave indexing is
      % 1-based, so they are stored in theta(1) and theta(2). The simultaneous update is:
      % theta(1) := theta(1) - alpha * (1/m) * sum( (theta(1) + theta(2)*x(i)) - y(i) )          for all i
      % theta(2) := theta(2) - alpha * (1/m) * sum( ((theta(1) + theta(2)*x(i)) - y(i)) * x(i) ) for all i
      % Partial derivative term for theta(1)
      derivative1 = (1/m)*sum((X*theta) -y);       
      % Complete update equation for theta(1)
      temp1 = theta(1) - (alpha*derivative1);  
      % Partial derivative term for theta(2)
      derivative2 = (1/m)*sum(((X*theta) -y).* X(:,2));  
      % Complete update equation for theta(2)
      temp2 = theta(2) - (alpha*derivative2);  
      % Apply the simultaneous update to theta
      theta = [temp1;temp2];  
      % For debugging, uncomment the lines below to confirm the cost decreases
      % across the iterations.
      % jCost = computeCost(X, y, theta)
      % disp(jCost)
   % ============================================================  
   % Save the cost J in every iteration    
   J_history(iter) = computeCost(X, y, theta);  
      %disp(J_history);  
 end  
 end  
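
As a quick way to exercise the function, here is a minimal driver sketch. The data, learning rate, and iteration count are only illustrative, and computeCost.m from the same exercise is assumed to be available on the Octave path.

 % Minimal usage sketch -- illustrative values; assumes computeCost.m is available
 x = [1; 2; 3; 4; 5];                 % hypothetical single feature
 y = [2; 4; 6; 8; 10];                % hypothetical targets (y = 2x)
 m = length(y);
 X = [ones(m, 1), x];                 % prepend the intercept column of ones
 theta = zeros(2, 1);                 % start from theta = [0; 0]
 alpha = 0.01;                        % learning rate
 num_iters = 1500;                    % number of gradient steps
 [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
 theta                                % learned parameters, close to [0; 2] for this data
 J_history(end)                       % final cost; J_history should be non-increasing

Because X already carries the column of ones, the two per-parameter updates inside the loop can also be written as a single vectorized line, theta = theta - (alpha/m) * X' * (X*theta - y), which gives the same result and works unchanged for any number of features.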
