Link to the previous post: https://statinfer.com/204-5-1-neural-networks-a-recap-of-logistic-regression/
In the last session we recapped logistic regression. Before moving further, there is one more concept to understand: the decision boundary. Once we get the decision boundary right, we can move on to neural networks.
Solution: We covered all of these tasks in the previous post. Here, we will plot the decision boundary again.
import matplotlib.pyplot as plt

fig = plt.figure()
ax1 = fig.add_subplot(111)
# Scatter the two classes: blue circles for Productivity 0, red plus signs for Productivity 1
ax1.scatter(Emp_Productivity1.Age[Emp_Productivity1.Productivity==0], Emp_Productivity1.Experience[Emp_Productivity1.Productivity==0], s=10, c='b', marker="o", label='Productivity 0')
ax1.scatter(Emp_Productivity1.Age[Emp_Productivity1.Productivity==1], Emp_Productivity1.Experience[Emp_Productivity1.Productivity==1], s=10, c='r', marker="+", label='Productivity 1')
plt.legend(loc='upper left')
# Draw the decision boundary line Experience = slope1*Age + intercept1 across the x-range
x_min, x_max = ax1.get_xlim()
ax1.plot([x_min, x_max], [x_min*slope1 + intercept1, x_max*slope1 + intercept1])
ax1.set_xlim([15, 35])
ax1.set_ylim([0, 10])
plt.show()
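The code above assumes `Emp_Productivity1`, `slope1`, and `intercept1` already exist from the previous post. As a minimal sketch of where that line comes from (using scikit-learn and toy data standing in for the `Emp_Productivity1` data, so the names and numbers here are illustrative): the fitted model gives b0 + b1·Age + b2·Experience, and the boundary is the set of points where this is zero, i.e. where the predicted probability is exactly 0.5.

```python
# Sketch: deriving the decision-boundary line from logistic regression
# coefficients. Toy data stands in for Emp_Productivity1 (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(18, 35, 100),   # Age
                     rng.uniform(0, 10, 100)])   # Experience
y = (X[:, 0] + 2 * X[:, 1] > 40).astype(int)     # toy Productivity label

model = LogisticRegression().fit(X, y)
b0 = model.intercept_[0]
b1, b2 = model.coef_[0]

# The boundary is b0 + b1*Age + b2*Experience = 0, so solving for Experience:
slope1 = -b1 / b2
intercept1 = -b0 / b2
```

Any point that sits exactly on this line gets a predicted probability of 0.5, which is what makes it the decision boundary.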
y = e^(b0 + b1x1 + b2x2) / (1 + e^(b0 + b1x1 + b2x2))

y = 1 / (1 + e^−(b0 + b1x1 + b2x2))

y = g(w0 + w1x1 + w2x2), where g(x) = 1 / (1 + e^(−x))

y = g(∑k wk xk)
out(x) = y = g(∑k wk xk)

The output above is a non-linear function of a linear combination of the inputs, which is exactly a multiple logistic regression line.

We find the weights w that minimize ∑i=1..n [yi − g(∑k wk xik)]²
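The minimization above can be sketched with plain gradient descent on the squared error. This is a toy illustration, not the method used in the post: the data, learning rate, and iteration count are all made-up assumptions, and the first input column is fixed at 1 so that w0 plays the role of the intercept b0.

```python
# Sketch: finding w by gradient descent on sum_i [y_i - g(sum_k w_k x_ik)]^2.
# Toy data and hyperparameters are illustrative assumptions.
import numpy as np

def g(z):
    """Sigmoid g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])  # x0 = 1 (bias)
true_w = np.array([0.5, 2.0, -1.0])
y = (g(X @ true_w) > 0.5).astype(float)

w = np.zeros(3)
lr = 0.5
for _ in range(2000):
    p = g(X @ w)
    # Gradient of the squared error; p*(1-p) is the sigmoid's derivative
    grad = -2 * X.T @ ((y - p) * p * (1 - p))
    w -= lr * grad / len(y)
```

In practice logistic regression is usually fit by maximizing the likelihood rather than minimizing squared error, but the squared-error form shown here matches the objective written above and carries over directly to how neural network weights are trained.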
The next post is a practice session on non-linear decision boundary.
Link to the next post: https://statinfer.com/204-5-3-practice-non-linear-decision-boundary/