Hello,

I have just finished ex2_reg.m for 2 features regularized logistic

regression.

I am thinking how can I add one more feature and turn it into regularized

linear regression.

The following code is from ex2_reg.m and I tried to alter to 3 features and

predict.m changed to p = X * theta;

However it still does work, any thoughts?

The error msg is :

error: reshape: can't reshape 784x1 array to 1x1 array

error: called from

fminunc at line 259 column 13

ex2_reg2 at line 93 column 20

%% Initialization

clear ; close all; clc

%% Load Data

% The first two columns contains the X values and the third column

% contains the label (y).

data = load('FortaData.csv');

X = data(:, [1:3]);

y = data(:, 4);

plotData(X, y);

% Put some labels

hold on;

% Labels and Legend

xlabel('Microchip Test 1')

ylabel('Microchip Test 2')

% Specified in plot order

legend('y = 1', 'y = 0')

hold off;

%% =========== Part 1: Regularized Logistic Regression ============

% In this part, you are given a dataset with data points that are not

% linearly separable. However, you would still like to use logistic

% regression to classify the data points.

%

% To do so, you introduce more features to use -- in particular, you add

% polynomial features to our data matrix (similar to polynomial

% regression).

%

% Add Polynomial Features

fprintf('Number of Features, including the Intercept Term\n');

% "+1" to include X0 in the counting

fprintf(' Before Polynomial Expansion : %2d\n', size(X, 3) + 1);

% mapFeature will add the intercept term for you

X = mapFeature(X(:,1), X(:,2), X(:,3));

fprintf(' After Polynomial Expansion : %2d\n', size(X, 3));

% Initialize fitting parameters

initial_theta = zeros(size(X, 3), 1);

% Set regularization parameter lambda to 1

lambda = 1;

% Compute and display initial cost and gradient for regularized

% logistic regression

[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);

fprintf('Cost at initial theta (zeros): %f\n', cost);

fprintf('\nProgram paused. Press enter to continue.\n');

pause;

%% ============= Part 2: Regularization and Accuracies =============

% In this part, you will get to try different values of lambda and

% see how regularization affects the decision boundary.

%

% Try the following values of lambda (0, 1, 10, 100).

%

% How does the decision boundary change when you vary lambda?

% How does the training set accuracy vary?

%

% Initialize fitting parameters

initial_theta = zeros(size(X, 3), 1);

% Set regularization parameter lambda to 1 (you should vary this)

lambda = 1; % Try 0, 1, 10, 100

% Set Options

options = optimset('GradObj', 'on', 'MaxIter', 1000);

% Optimize

[theta, J, exit_flag] = ...

fminunc(@(t) costFunctionReg(t, X, y, lambda), initial_theta, options);

% Plot Boundary

%plotDecisionBoundary(theta, X, y);

%hold on;

%title(sprintf('lambda = %g', lambda))

% Labels and Legend

xlabel('Microchip Test 1')

ylabel('Microchip Test 2')

legend('y = 1', 'y = 0', 'Decision boundary')

hold off;

% Compute accuracy on our training set

p = predict(theta, X);

fprintf('Train Accuracy: %f\n\n', mean(p == y) * 100);

zdata = load('TodayData.csv');

z = zdata(:, [1:3]);

% Predicting one value

fprintf('\n\nPredicting one value\n');

X_map = mapFeature(z);

p = predict(theta, X_map)

--

Sent from:

http://octave.1599824.n4.nabble.com/Octave-General-f1599825.html