help-octave

## Re: Creating regularized linear regression

From: Kai Torben Ohlhus
Subject: Re: Creating regularized linear regression
Date: Fri, 27 Jul 2018 08:20:45 +0200

On Fri, Jul 27, 2018 at 3:40 AM lawrencema <address@hidden> wrote:
Hello,

I have just finished ex2_reg.m for 2-feature regularized logistic
regression.

I am wondering how I can add one more feature and turn it into regularized
linear regression.

The following code is from ex2_reg.m. I tried to alter it to 3 features and
changed predict.m to p = X * theta;

However, it still doesn't work. Any thoughts?

The error message is:
error: reshape: can't reshape 784x1 array to 1x1 array
error: called from
fminunc at line 259 column 13
ex2_reg2 at line 93 column 20

%% Initialization
clear ; close all; clc

%  The first three columns contain the X values and the fourth
%  column contains the label (y).

data = load('ex2data2.txt');  % filename assumed -- the load call was garbled in the archive
X = data(:, 1:3);
y = data(:, 4);

plotData(X, y);

% Put some labels
hold on;

% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')

% Specified in plot order
legend('y = 1', 'y = 0')
hold off;

%% =========== Part 1: Regularized Logistic Regression ============
%  In this part, you are given a dataset with data points that are not
%  linearly separable. However, you would still like to use logistic
%  regression to classify the data points.
%
%  To do so, you introduce more features to use -- in particular, you add
%  polynomial features to our data matrix (similar to polynomial
%  regression).
%

fprintf('Number of Features, including the Intercept Term\n');
% "+1" to include X0 in the counting
fprintf('  Before Polynomial Expansion : %2d\n', size(X, 3) + 1);
% mapFeature will add the intercept term for you
X = mapFeature(X(:,1), X(:,2), X(:,3));
fprintf('  After  Polynomial Expansion : %2d\n', size(X, 3));

% Initialize fitting parameters
initial_theta = zeros(size(X, 3), 1);

% Set regularization parameter lambda to 1
lambda = 1;

% Compute and display initial cost and gradient for regularized
% logistic regression
[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);

fprintf('Cost at initial theta (zeros): %f\n', cost);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 2: Regularization and Accuracies =============
%  In this part, you will get to try different values of lambda and
%  see how regularization affects the decision boundary.
%
%  Try the following values of lambda (0, 1, 10, 100).
%
%  How does the decision boundary change when you vary lambda?
%  How does the training set accuracy vary?
%

% Initialize fitting parameters
initial_theta = zeros(size(X, 3), 1);

% Set regularization parameter lambda to 1 (you should vary this)
lambda = 1;  % Try 0, 1, 10, 100

% Set Options
options = optimset('GradObj', 'on', 'MaxIter', 1000);

% Optimize
[theta, J, exit_flag] = ...
fminunc(@(t) costFunctionReg(t, X, y, lambda), initial_theta, options);

% Plot Boundary
%plotDecisionBoundary(theta, X, y);
%hold on;
%title(sprintf('lambda = %g', lambda))

% Labels and Legend
xlabel('Microchip Test 1')
ylabel('Microchip Test 2')

legend('y = 1', 'y = 0', 'Decision boundary')
hold off;

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n\n', mean(p == y) * 100);

z = zdata(:, [1:3]);

% Predicting one value
fprintf('\n\nPredicting one value\n');
X_map = mapFeature(z);
p = predict(theta, X_map)

As your example is complex and incomplete, the error message is useless (what is at line 93, and what has dimension 784x1?), and it is impossible to help here. Please stick to the rules of an MCVE [1] and provide a snippet of the relevant code, including a meaningful subset of the data, that results in the aforementioned error.

HTH,
Kai
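A likely culprit, judging from the quoted script alone: `initial_theta` is built with `size(X, 3)`, but `size` along the third dimension of a 2-D matrix is always 1, so `fminunc` receives a 1x1 starting point while the cost function returns a gradient with one entry per mapped feature. A minimal sketch (the 118x784 shape is hypothetical, chosen only to mirror the 784x1 in the error message) illustrates the mismatch:

```octave
% size() along the third dimension of a 2-D matrix is always 1,
% so zeros(size(X, 3), 1) yields a 1x1 theta.
X = rand(118, 784);                  % stand-in for the mapped design matrix

disp(size(X, 2))                     % number of columns (features)
disp(size(X, 3))                     % 1: a 2-D matrix has singleton trailing dims

theta_bad  = zeros(size(X, 3), 1);   % 1x1 -- fminunc cannot reshape a 784x1
                                     % gradient to match this initial theta
theta_good = zeros(size(X, 2), 1);   % 784x1 -- matches the gradient
```

Replacing every `size(X, 3)` in the quoted script with `size(X, 2)` would at least give `fminunc` an initial theta of the right length.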
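On the question of the cost function itself: moving from regularized logistic to regularized linear regression mostly means dropping the sigmoid and using a squared-error cost, with theta(1) left unregularized as in the course exercises. A hedged sketch, with the function name `linearCostReg` chosen here purely for illustration:

```octave
function [J, grad] = linearCostReg(theta, X, y, lambda)
  % Regularized linear regression cost and gradient.
  % X is assumed to include the intercept column; theta(1) is not penalized.
  m = length(y);
  h = X * theta;                       % linear hypothesis (no sigmoid)
  reg_theta = [0; theta(2:end)];       % exclude the intercept from the penalty
  J = (1 / (2 * m)) * sum((h - y) .^ 2) ...
      + (lambda / (2 * m)) * sum(reg_theta .^ 2);
  grad = (1 / m) * (X' * (h - y)) + (lambda / m) * reg_theta;
end
```

This drops in where costFunctionReg was used, with the same `fminunc` call and the `p = X * theta` prediction the poster already has.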