Regression equation
Octave code representation
h = X * theta;    % design matrix X times parameter vector theta gives one prediction per example
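A minimal sketch of evaluating the hypothesis; the design matrix and parameter values below are illustrative assumptions, with a leading column of ones for the intercept term.
X = [1 1; 1 2; 1 3];    % assumed design matrix: bias column of ones plus one feature
theta = [0.5; 1];       % assumed parameters [theta0; theta1]
h = X * theta;          % predictions: [1.5; 2.5; 3.5]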
Cost function
Least mean squares (LMS) cost
J = (1 / (2*m)) * (X*theta - y)' * (X*theta - y);    % J(theta) = 1/(2m) * sum of squared prediction errors
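Checking the cost on illustrative data (the targets y, like X and theta, are assumed values):
X = [1 1; 1 2; 1 3];    % assumed design matrix
y = [2; 2.5; 3.5];      % assumed targets
theta = [0.5; 1];       % assumed parameters
m = length(y);
J = (1 / (2*m)) * (X*theta - y)' * (X*theta - y);    % errors [-0.5; 0; 0], so J = 0.25/6 ≈ 0.0417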
Gradient descent
theta = theta - (alpha / m) * X' * (X*theta - y);    % one batch gradient-descent update step
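A minimal batch gradient-descent loop around the update above; the learning rate alpha and the iteration count are assumed values for the illustrative data.
X = [1 1; 1 2; 1 3];    % assumed design matrix
y = [2; 2.5; 3.5];      % assumed targets
m = length(y);
theta = zeros(2, 1);    % start from zero parameters
alpha = 0.1;            % assumed learning rate
for iter = 1:500        % assumed iteration count
  theta = theta - (alpha / m) * X' * (X*theta - y);    % repeated batch update
end
% theta approaches the least-squares solution, approximately [1.1667; 0.75]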
Stochastic gradient descent
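A minimal sketch of the per-example update, sweeping the training examples one at a time; the data, learning rate, and number of passes are assumed values.
X = [1 1; 1 2; 1 3];    % assumed design matrix
y = [2; 2.5; 3.5];      % assumed targets
theta = zeros(2, 1);
alpha = 0.05;           % assumed learning rate
for epoch = 1:200       % assumed number of passes over the data
  for i = 1:size(X, 1)
    err = X(i, :) * theta - y(i);             % scalar error on example i
    theta = theta - alpha * err * X(i, :)';   % update from this single example
  end
end
Each step uses one training example instead of the full batch, so the parameters move after every example; shuffling the examples each pass is a common refinement omitted here.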
Normal equation
theta = pinv(X' * X) * X' * y;    % closed-form least-squares solution
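On the same assumed data, the closed-form result can be checked directly:
X = [1 1; 1 2; 1 3];    % assumed design matrix
y = [2; 2.5; 3.5];      % assumed targets
theta = pinv(X' * X) * X' * y;    % approximately [1.1667; 0.75]
The normal equation needs no learning rate or iteration, but computing pinv(X' * X) becomes expensive as the number of features grows, where gradient descent scales better.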