
Logistic Regression

Equation:

$$\hat{y} = \frac{1}{1 + e^{-z}}$$

where:

  • $z = \theta^T x = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \dots + \theta_n x_n$
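
A minimal NumPy sketch of the equation above (the parameter and feature values are made-up placeholders, not taken from any dataset):

```python
import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) function: maps any real z into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical parameters theta = [theta_0, theta_1, theta_2]
# and a feature vector x with a leading 1 for the intercept term theta_0.
theta = np.array([-1.5, 0.8, 2.0])
x = np.array([1.0, 0.5, 1.2])

z = theta @ x        # z = theta^T x
y_hat = sigmoid(z)   # predicted probability of the positive class
print(y_hat)
```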

Trap

Logistic regression models the probability of a binary outcome following a Bernoulli distribution. The Bernoulli distribution is a discrete probability distribution of a random variable that takes only two values, typically 0 and 1, where p is the probability of success (1) and 1 − p is the probability of failure (0).

In logistic regression, we estimate p (the probability of the positive class) based on a given dataset of independent variables. The model outputs a probability between 0 and 1, which matches the Bernoulli distribution's parameter p. This probability can then be used to make binary predictions by applying a threshold (typically 0.5).
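
As a rough illustration of this probability-then-threshold workflow (toy, fabricated data; scikit-learn's LogisticRegression is used here only as one convenient implementation):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy, fabricated data: two features per sample, binary labels.
X = np.array([[0.5, 1.2], [1.0, 0.3], [2.2, 2.8], [3.1, 2.4]])
y = np.array([0, 0, 1, 1])

model = LogisticRegression().fit(X, y)

# predict_proba returns [P(y=0), P(y=1)] per row; column 1 is the Bernoulli parameter p.
p = model.predict_proba(X)[:, 1]

# Apply the usual 0.5 threshold to obtain binary class predictions.
predictions = (p >= 0.5).astype(int)
print(p, predictions)
```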

While commonly used for classification tasks, logistic regression is fundamentally a probabilistic model that estimates Bernoulli probabilities using a linear decision boundary. This means the classes should be at least approximately separable by a linear boundary in the feature space for the model to perform well.

More