Created folder for losses in Machine_Learning #9969

Merged: 36 commits, Oct 8, 2023
Commits
57e7cdc
Created folder for losses in Machine_Learning
THEGAMECHANGER416 Oct 7, 2023
aedf0b9
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
9b7468c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2023
38f405c
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
448785c
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
761fe33
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
7d59779
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
e8e0aa2
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2023
09cd350
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
ae4e3ee
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
6f26bc3
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2023
66faa35
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
c5a58c3
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2023
0753db2
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
277a681
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
1e3baed
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
25c550a
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
8513bda
Update machine_learning/losses/binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
cae669e
Update machine_learning/losses/mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
5624c22
Update machine_learning/losses/binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
7cebd00
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
f200049
Update machine_learning/losses/mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
b302359
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
ba41942
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
26beff6
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
e60dd14
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
747e8cc
Update mean_squared_error.py
THEGAMECHANGER416 Oct 7, 2023
e618611
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 7, 2023
fc15fb8
renamed: losses -> loss_functions
THEGAMECHANGER416 Oct 7, 2023
7ff7948
updated 2 files
THEGAMECHANGER416 Oct 7, 2023
9502a0c
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2023
bb45ca7
Merge branch 'TheAlgorithms:master' into master
THEGAMECHANGER416 Oct 7, 2023
2a4989d
Update mean_squared_error.py
THEGAMECHANGER416 Oct 8, 2023
5a23be7
Update mean_squared_error.py
THEGAMECHANGER416 Oct 8, 2023
9e1b7d3
Update binary_cross_entropy.py
THEGAMECHANGER416 Oct 8, 2023
f47d2c3
Update mean_squared_error.py
THEGAMECHANGER416 Oct 8, 2023
machine_learning/loss_functions/binary_cross_entropy.py (59 additions, 0 deletions)
@@ -0,0 +1,59 @@
"""
Binary Cross-Entropy (BCE) Loss Function

Description:
Quantifies dissimilarity between true labels (0 or 1) and predicted probabilities.
It's widely used in binary classification tasks.

Formula:
BCE = -(1/n) * Σ(y_true * log(y_pred) + (1 - y_true) * log(1 - y_pred))

Source:
[Wikipedia - Cross entropy](https://en.wikipedia.org/wiki/Cross_entropy)
"""

import numpy as np


def binary_cross_entropy(
    y_true: np.ndarray, y_pred: np.ndarray, epsilon: float = 1e-15
) -> float:
    """
    Calculate the BCE Loss between true labels and predicted probabilities.

    Parameters:
    - y_true: True binary labels (0 or 1).
    - y_pred: Predicted probabilities for class 1.
    - epsilon: Small constant to avoid numerical instability.

    Returns:
    - bce_loss: Binary Cross-Entropy Loss.

    Example Usage:
    >>> true_labels = np.array([0, 1, 1, 0, 1])
    >>> predicted_probs = np.array([0.2, 0.7, 0.9, 0.3, 0.8])
    >>> binary_cross_entropy(true_labels, predicted_probs)
    0.2529995012327421
    >>> true_labels = np.array([0, 1, 1, 0, 1])
    >>> predicted_probs = np.array([0.3, 0.8, 0.9, 0.2])
    >>> binary_cross_entropy(true_labels, predicted_probs)
    Traceback (most recent call last):
    ...
    ValueError: Input arrays must have the same length.
    """
    if len(y_true) != len(y_pred):
        raise ValueError("Input arrays must have the same length.")

    # Clip predicted probabilities so neither log(y_pred) nor log(1 - y_pred) hits log(0)
    y_pred = np.clip(y_pred, epsilon, 1 - epsilon)

    # Per-sample binary cross-entropy loss
    bce_loss = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

    # Take the mean over all samples
    return float(np.mean(bce_loss))


if __name__ == "__main__":
    import doctest

    doctest.testmod()
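
As a quick sanity check on the value in the doctest above, the per-sample BCE terms can be recomputed directly with NumPy. This is only an illustrative sketch, not part of the submitted module:

import numpy as np

true_labels = np.array([0, 1, 1, 0, 1])
predicted_probs = np.array([0.2, 0.7, 0.9, 0.3, 0.8])

# Per-sample loss: -(y * log(p) + (1 - y) * log(1 - p))
per_sample = -(
    true_labels * np.log(predicted_probs)
    + (1 - true_labels) * np.log(1 - predicted_probs)
)
print(per_sample.mean())  # ~0.2529995012327421, matching the doctest value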
machine_learning/loss_functions/mean_squared_error.py (51 additions, 0 deletions)
@@ -0,0 +1,51 @@
"""
Mean Squared Error (MSE) Loss Function

Description:
MSE measures the mean squared difference between true values and predicted values.
It serves as a measure of the model's accuracy in regression tasks.

Formula:
MSE = (1/n) * Σ(y_true - y_pred)^2

Source:
[Wikipedia - Mean squared error](https://en.wikipedia.org/wiki/Mean_squared_error)
"""

import numpy as np


def mean_squared_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """
    Calculate the Mean Squared Error (MSE) between two arrays.

    Parameters:
    - y_true: The true values (ground truth).
    - y_pred: The predicted values.

    Returns:
    - mse: The Mean Squared Error between y_true and y_pred.

    Example usage:
    >>> true_values = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    >>> predicted_values = np.array([0.8, 2.1, 2.9, 4.2, 5.2])
    >>> mean_squared_error(true_values, predicted_values)
    0.028000000000000032
    >>> true_labels = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    >>> predicted_probs = np.array([0.3, 0.8, 0.9, 0.2])
    >>> mean_squared_error(true_labels, predicted_probs)
    Traceback (most recent call last):
    ...
    ValueError: Input arrays must have the same length.
    """
    if len(y_true) != len(y_pred):
        raise ValueError("Input arrays must have the same length.")

    # Mean of the element-wise squared differences
    squared_errors = (y_true - y_pred) ** 2
    return float(np.mean(squared_errors))


if __name__ == "__main__":
    import doctest

    doctest.testmod()
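
For a cross-check of the MSE doctest value, the same number can be reproduced both directly and, assuming scikit-learn happens to be available in the environment (it is not a dependency of this PR), with sklearn.metrics.mean_squared_error:

import numpy as np
from sklearn.metrics import mean_squared_error as sk_mse  # assumes scikit-learn is installed

true_values = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
predicted_values = np.array([0.8, 2.1, 2.9, 4.2, 5.2])

print(np.mean((true_values - predicted_values) ** 2))  # 0.028000000000000032
print(sk_mse(true_values, predicted_values))           # same value from the reference implementation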