Logistic Regression

Creator
Seonglae Cho
Created
2023 Mar 16 1:58
Edited
2025 Feb 4 14:8

Log odds
transform the probability so it can be modeled linearly, as in
Linear Regression

Unlike linear regression it handles nonlinearity well, yet the formulation can still be treated linearly.
The Sigmoid function outputs values between 0 and 1, and the parameters are fit by maximizing the log-likelihood
Maximizing the log-likelihood (rather than squared error) makes the cost function convex.
Logistic Regression Notion
$$h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}$$
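The hypothesis above can be sketched directly, here as a minimal pure-Python helper (the names `sigmoid` and `h` are illustrative, not from the note):

```python
import math

def sigmoid(z: float) -> float:
    """Logistic function g(z) = 1 / (1 + e^{-z}), mapping R to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def h(theta: list[float], x: list[float]) -> float:
    """Hypothesis h_theta(x) = g(theta^T x)."""
    return sigmoid(sum(t * xi for t, xi in zip(theta, x)))
```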
This leads to $p(y|x;\theta)$ described by a
Bernoulli Distribution
$$p(y|x;\theta) = (h_\theta(x))^y (1 - h_\theta(x))^{1 - y}$$
$$\ell(\theta) = \log L(\theta) = \sum_{i=1}^n y^{(i)} \log h(x^{(i)}) + (1 - y^{(i)}) \log(1 - h(x^{(i)}))$$
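The log-likelihood $\ell(\theta)$ can be evaluated term by term, as in this sketch (function name assumed for illustration):

```python
import math

def log_likelihood(theta, X, y):
    """ell(theta) = sum_i [ y_i log h(x_i) + (1 - y_i) log(1 - h(x_i)) ]."""
    total = 0.0
    for xi, yi in zip(X, y):
        # h_theta(x_i) via the sigmoid of theta^T x_i
        p = 1.0 / (1.0 + math.exp(-sum(t * v for t, v in zip(theta, xi))))
        total += yi * math.log(p) + (1 - yi) * math.log(1 - p)
    return total
```

With $\theta = 0$ every prediction is 0.5, so $\ell(\theta) = n \log 0.5$, a quick sanity check.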
But since the maximum of the log-likelihood has no closed form, you can use the
Newton–Raphson method
or
Stochastic Gradient Descent
; the former achieves faster convergence, and
IRLS
is much faster
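Of the optimizers above, stochastic gradient ascent is the simplest to sketch: each step nudges $\theta$ by $\alpha\,(y^{(i)} - h_\theta(x^{(i)}))\,x^{(i)}$ for one example. A minimal version, with assumed defaults for learning rate and epochs:

```python
import math
import random

def sgd_logistic(X, y, lr=0.1, epochs=200, seed=0):
    """Stochastic gradient ascent on the logistic log-likelihood:
    theta_j += lr * (y_i - h_theta(x_i)) * x_ij, one example per step."""
    rng = random.Random(seed)
    theta = [0.0] * len(X[0])
    idx = list(range(len(X)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            p = 1.0 / (1.0 + math.exp(-sum(t * v for t, v in zip(theta, X[i]))))
            err = y[i] - p
            theta = [t + lr * err * v for t, v in zip(theta, X[i])]
    return theta
```

Newton–Raphson (and its weighted-least-squares form, IRLS) replaces the scalar step with the inverse Hessian, which is why it converges in far fewer iterations.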
The difference from
Linear Regression
is the use of the sigmoid (logistic) function
 
 

Logistic Multinomial Regression

 
