
Logistic Regression & Classification

Build binary and multi-class classifiers with logistic regression, and understand decision boundaries and probability outputs.

Despite its name, logistic regression is a classification algorithm: it predicts the probability that an input belongs to the positive class.

The Sigmoid Function

Logistic regression applies the sigmoid (logistic) function to convert a linear combination into a probability:

σ(z) = 1 / (1 + e^(-z))
  • Input z can be any real number
  • Output is always between 0 and 1
  • σ(0) = 0.5 (decision boundary)
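The formula above can be sketched in a few lines of plain Python (the `sigmoid` helper is our own, not from any library):

```python
import math

def sigmoid(z):
    """Map any real-valued input z to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))    # 0.5 — exactly the decision boundary
print(sigmoid(4))    # ≈ 0.982 — large positive z saturates towards 1
print(sigmoid(-4))   # ≈ 0.018 — large negative z saturates towards 0
```

Note the symmetry: σ(-z) = 1 - σ(z), which is why the two tails mirror each other around 0.5.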

Decision Boundary

P(y=1 | x) = σ(w·x + b)

Predict class 1 if P(y=1) ≥ 0.5
Predict class 0 if P(y=1) < 0.5
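The two prediction rules translate directly into code. A minimal sketch (helper names and the example weights are illustrative):

```python
import math

def predict_proba(w, x, b):
    # z = w·x + b, then squash through the sigmoid
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x, b, threshold=0.5):
    # Class 1 if P(y=1|x) >= threshold, else class 0
    return 1 if predict_proba(w, x, b) >= threshold else 0

w, b = [2.0, -1.0], 0.5
print(predict(w, [1.0, 1.0], b))   # z = 1.5, P ≈ 0.82 → class 1
print(predict(w, [0.0, 2.0], b))   # z = -1.5, P ≈ 0.18 → class 0
```

The 0.5 threshold corresponds to z = 0; raising it trades recall for precision, which matters for the metrics discussed below.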

Cost Function: Binary Cross-Entropy

L = -[y·log(ŷ) + (1-y)·log(1-ŷ)]

This penalises confident wrong predictions heavily.
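A small sketch shows that penalty in action (the `bce` helper and the clamping epsilon are our own choices, not a library API):

```python
import math

def bce(y, y_hat, eps=1e-12):
    # Clamp y_hat away from 0 and 1 so log() never blows up
    y_hat = min(max(y_hat, eps), 1.0 - eps)
    return -(y * math.log(y_hat) + (1 - y) * math.log(1.0 - y_hat))

print(bce(1, 0.99))   # ≈ 0.01 — confident and correct: tiny loss
print(bce(1, 0.51))   # ≈ 0.67 — barely right: moderate loss
print(bce(1, 0.01))   # ≈ 4.61 — confident and wrong: huge loss
```

Because the loss grows like -log(ŷ) as ŷ → 0 for a true positive, a confidently wrong prediction costs hundreds of times more than a confidently correct one.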

Classification Metrics

Metric      Formula                  When to use
Accuracy    (TP+TN)/(TP+TN+FP+FN)    Balanced classes
Precision   TP/(TP+FP)               Cost of false positives is high
Recall      TP/(TP+FN)               Cost of false negatives is high
F1 Score    2×P×R/(P+R)              Imbalanced classes
AUC-ROC     Area under ROC curve     Overall model quality
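The formulas in the table are direct translations into code. A sketch with made-up counts (the function name and the example numbers are illustrative):

```python
def classification_metrics(tp, tn, fp, fn):
    # Each line mirrors one row of the metrics table
    accuracy  = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)
    f1        = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

m = classification_metrics(tp=80, tn=50, fp=20, fn=10)
print(m)   # accuracy 0.8125, precision 0.8, recall ≈ 0.889, f1 ≈ 0.842
```

Notice that F1 is the harmonic mean of precision and recall, so it sits between them but is pulled towards the smaller of the two.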

Confusion Matrix

              Predicted
              Pos    Neg
Actual  Pos [ TP  |  FN ]
        Neg [ FP  |  TN ]
  • TP = True Positive (correctly predicted positive)
  • TN = True Negative (correctly predicted negative)
  • FP = False Positive (predicted positive, actually negative)
  • FN = False Negative (predicted negative, actually positive)
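The four cells can be tallied from a pair of label lists. A minimal sketch (the function name and the toy labels are our own):

```python
def confusion_counts(y_true, y_pred):
    # Count each of the four cells of the confusion matrix
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]
print(confusion_counts(y_true, y_pred))   # (2, 2, 1, 1)
```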

Multi-class Classification

For more than 2 classes:

  • One-vs-Rest (OvR): train one binary classifier per class and pick the most confident
  • Softmax (multinomial logistic regression): generalises the sigmoid to output one probability per class
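A sketch of the softmax in plain Python (the max-subtraction trick is standard practice for numerical stability; the scores here are made up):

```python
import math

def softmax(scores):
    # Subtract the max score first so exp() never overflows
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)        # ≈ [0.659, 0.242, 0.099] — highest score gets most mass
print(sum(probs))   # 1.0 — a valid probability distribution
```

With two classes, softmax reduces to the sigmoid applied to the difference of the two scores.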

Try It Yourself

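Putting the pieces together: a minimal logistic-regression trainer in plain Python, using stochastic gradient descent on the binary cross-entropy loss. The toy dataset, hyperparameters, and all names here are illustrative, not from any library:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=1000):
    """Fit weights w and bias b by SGD on binary cross-entropy."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi              # dL/dz for sigmoid + cross-entropy
            for j, xj in enumerate(xi):
                w[j] -= lr * err * xj
            b -= lr * err
    return w, b

# Toy, linearly separable data: class 1 roughly when x0 + x1 > 1
X = [[0.0, 0.0], [0.2, 0.3], [1.0, 1.0], [0.9, 0.8], [0.1, 0.4], [1.2, 0.7]]
y = [0, 0, 1, 1, 0, 1]

w, b = train_logistic(X, y)
preds = [1 if sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5 else 0
         for x in X]
print(preds)   # matches y on this separable toy set
```

The tidy gradient `p - yi` is a consequence of pairing the sigmoid with cross-entropy; that cancellation is one reason this loss, not squared error, is the standard choice for logistic regression.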