From 60087c225d687a66a45336b8bcad416cbda73209 Mon Sep 17 00:00:00 2001
From: Claudio Maggioni
Date: Wed, 5 May 2021 18:07:38 +0200
Subject: [PATCH] done stuff

---
 assignment_1/report_Maggioni_Claudio.md | 21 ++++++++++++++++++---
 assignment_1/src/build_models.py        |  1 -
 2 files changed, 18 insertions(+), 4 deletions(-)

diff --git a/assignment_1/report_Maggioni_Claudio.md b/assignment_1/report_Maggioni_Claudio.md
index 808dc7e..34226a2 100644
--- a/assignment_1/report_Maggioni_Claudio.md
+++ b/assignment_1/report_Maggioni_Claudio.md
@@ -200,10 +200,25 @@ Comment and compare how the (a.) training error, (b.) test error and
    The classification problem in the graph, according to the data points
    shown, is quite similar to the XOR or ex-or problem. Since in 1969 that
    problem was proved impossible to solve by a perceptron model by Minsky and
-   Papert, then the
+   Papert, that alone would make for quite a compelling argument to my boss.
 
+   On a more general (and more serious) note, the perceptron model would be
+   unable to solve the problem in the picture, since a perceptron can solve
+   only linearly-separable classification problems. Even a simple graphical
+   argument shows that no line could separate the yellow and purple dots to
+   a decent approximation, simply because of the way the dots are positioned.
+
-1. **Would you expect to have better luck with a neural network with
+2. **Would you expect to have better luck with a neural network with
    activation function $h(x) = - x \cdot e^{-2}$ for the hidden units?**
 
+   No. The proposed activation $h(x) = - x \cdot e^{-2}$ is linear in $x$,
+   since $e^{-2}$ is just a constant factor. Every hidden layer therefore
+   computes an affine map, and a composition of affine maps is itself an
+   affine map: the whole network collapses to a single linear model, exactly
+   as limited as the perceptron and still unable to solve a problem that is
+   not linearly separable.
+
-1. **What are the main differences and similarities between the
+3. **What are the main differences and similarities between the
    perceptron and the logistic regression neuron?**
+
+
diff --git a/assignment_1/src/build_models.py b/assignment_1/src/build_models.py
index f2971de..53c94d1 100644
--- a/assignment_1/src/build_models.py
+++ b/assignment_1/src/build_models.py
@@ -23,7 +23,6 @@ points = 2000
 lr = LinearRegression(fit_intercept=False)
 
 # Build x feature vector with columns for theta_3 and theta_4
-# variable name explained here: https://vimeo.com/380021022
 X = np.zeros([points, 5])
 X[:, 0] = 1
 X[:, 1:3] = xs
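The linearity argument behind the answer to question 2 can be checked numerically. The sketch below (layer sizes and random weights are illustrative assumptions, not taken from the assignment code) builds a small network that uses $h(x) = -x \cdot e^{-2}$ in its hidden layer and verifies that it computes exactly the same function as a single affine map:

```python
import numpy as np

rng = np.random.default_rng(0)

def h(x):
    # Activation from question 2: -x * e^(-2). Since e^(-2) is a constant,
    # h is just multiplication by a fixed scalar, i.e. a linear function.
    return -x * np.exp(-2)

# Hypothetical 2-4-1 network with random weights (shapes are illustrative).
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def network(x):
    # Forward pass: hidden layer with activation h, then a linear output.
    return W2 @ h(W1 @ x + b1) + b2

# Because h(z) = c*z with c = -e^(-2), the whole network is the single
# affine map x -> (c * W2 @ W1) x + (c * W2 @ b1 + b2).
c = -np.exp(-2)
W_eq = c * (W2 @ W1)
b_eq = c * (W2 @ b1) + b2

x = rng.normal(size=2)
print(np.allclose(network(x), W_eq @ x + b_eq))  # → True
```

Since the collapsed model is affine, its decision boundary is a single hyperplane, which is why it gains nothing over the perceptron on a non-linearly-separable problem such as the one in the graph.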