Commit 5d5e852: merge
SkafteNicki committed Mar 23, 2021
2 parents 4afca54 + 24fb887
Showing 109 changed files with 2,124 additions and 926 deletions.
2 changes: 1 addition & 1 deletion .github/CONTRIBUTING.md
@@ -44,7 +44,7 @@ help you or finish it with you :]_

Want to keep Torchmetrics healthy? Love seeing those green tests? So do we! How do we keep it that way?
We write tests! We value test contributions even more than new features. One of the core values of torchmetrics
is that our users can trust our metric implementation. We can only garantee this if our metrics are well tested.
is that our users can trust our metric implementation. We can only guarantee this if our metrics are well tested.

---

2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -2,7 +2,7 @@
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug, help wanted
labels: bug / fix, help wanted
assignees: ''

---
59 changes: 59 additions & 0 deletions .github/mergify.yml
@@ -0,0 +1,59 @@
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

pull_request_rules:

- name: warn on conflicts
conditions:
- conflict
- -draft # filter-out GH draft PRs
- -label="has conflicts"
actions:
# comment:
# message: This pull request is now in conflict... :(
label:
add: [ "has conflicts" ]

- name: resolved conflicts
conditions:
- -conflict
- label="has conflicts"
- -draft # filter-out GH draft PRs
- -merged # not merged yet
- -closed
actions:
label:
remove: [ "has conflicts" ]

- name: update PR
conditions:
- -conflict
- -draft # filter-out GH draft PRs
- base=master # apply only on master
- -title~=(?i)wip # skip all PR that title contains “WIP” (ignoring case)
- "#approved-reviews-by>=1" # number of review approvals
actions:
update: {}

- name: add core reviewer
conditions:
- -conflict # skip if conflict
- -draft # filter-out GH draft PRs
- label="0:] Ready-To-Go"
- "#approved-reviews-by<2" # number of review approvals
- "#review-requested<2" # number of requested reviews
actions:
request_reviews:
teams:
- "@PyTorchLightning/core-metrics"
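The first two mergify rules above form a small state machine around the `has conflicts` label. Their effect can be sketched with a toy evaluator in plain Python (an illustrative model of the rule semantics, not mergify's actual engine; the `pr` dict and `apply_conflict_rules` name are invented for this sketch):

```python
def apply_conflict_rules(pr):
    """Mimic the 'warn on conflicts' / 'resolved conflicts' rules above.

    `pr` is a dict with boolean keys 'conflict', 'draft', 'merged',
    'closed' and a 'labels' set. Illustrative only, not mergify itself.
    """
    if pr["conflict"] and not pr["draft"] and "has conflicts" not in pr["labels"]:
        # rule "warn on conflicts": tag conflicting, non-draft PRs
        pr["labels"].add("has conflicts")
    elif (not pr["conflict"] and "has conflicts" in pr["labels"]
          and not pr["draft"] and not pr["merged"] and not pr["closed"]):
        # rule "resolved conflicts": drop the label once the conflict is gone
        pr["labels"].discard("has conflicts")
    return pr

pr = {"conflict": True, "draft": False, "merged": False, "closed": False, "labels": set()}
apply_conflict_rules(pr)   # labels now contain 'has conflicts'
pr["conflict"] = False
apply_conflict_rules(pr)   # label removed again
```

Note that both rules filter out draft PRs, so a draft never gains or loses the label until it is marked ready for review.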
15 changes: 14 additions & 1 deletion CHANGELOG.md
@@ -9,12 +9,23 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Added

- Added prefix arg to metric collection ([#70](https://github.com/PyTorchLightning/metrics/pull/70))
- Added `prefix` argument to `MetricCollection` ([#70](https://github.com/PyTorchLightning/metrics/pull/70))


- Added `CohenKappa` metric ([#69](https://github.com/PyTorchLightning/metrics/pull/69))


- Added `RetrievalMAP` metric for Information Retrieval ([#5032](https://github.com/PyTorchLightning/pytorch-lightning/pull/5032))


- Added `average='micro'` as an option in AUROC for multilabel problems ([#110](https://github.com/PyTorchLightning/metrics/pull/110))


- Added `MatthewsCorrcoef` metric ([#98](https://github.com/PyTorchLightning/metrics/pull/98))


- Added multilabel support to `ROC` metric ([#114](https://github.com/PyTorchLightning/metrics/pull/114))
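Of the additions listed above, `MatthewsCorrcoef` reduces to a single formula over the confusion matrix. A plain-Python sketch for the binary case (an illustrative reference, not the tensor-based torchmetrics implementation):

```python
from math import sqrt

def matthews_corrcoef(preds, target):
    """Binary Matthews correlation coefficient from 0/1 label lists.

    Illustrative only; the torchmetrics version operates on tensors
    and also handles the multiclass case.
    """
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, target))
    tn = sum(p == 0 and t == 0 for p, t in zip(preds, target))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, target))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, target))
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # conventionally defined as 0 when any marginal count is zero
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(matthews_corrcoef([1, 1, 0, 0, 1], [1, 0, 0, 0, 1]))  # → 0.6666666666666666
```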

### Changed

- Changed `ExplainedVariance` from storing all preds/targets to tracking 5 statistics ([#68](https://github.com/PyTorchLightning/metrics/pull/68))
@@ -28,6 +39,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

### Fixed

- Fixed `_stable_1d_sort` to work when `n >= N` ([#6177](https://github.com/PyTorchLightning/pytorch-lightning/pull/6177))


## [0.2.0] - 2021-03-12

94 changes: 55 additions & 39 deletions README.md
@@ -123,48 +123,64 @@ Module metric usage remains the same when using multiple GPUs or multiple nodes.

``` python
import os

import torch
from torch import nn
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP
import torchmetrics


def metric_ddp(rank, world_size):
    os.environ['MASTER_ADDR'] = 'localhost'
    os.environ['MASTER_PORT'] = '12355'

    # create default process group
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # initialize metric
    metric = torchmetrics.Accuracy()

    # define a model and append your metric to it
    # this allows metric states to be placed on the correct accelerator when
    # .to(device) is called on the model
    model = nn.Linear(10, 10)
    model.metric = metric
    model = model.to(rank)

    # initialize DDP
    model = DDP(model, device_ids=[rank])

    n_epochs = 5
    # this shows iteration over multiple training epochs
    for n in range(n_epochs):

        # this will be replaced by a DataLoader with a DistributedSampler
        n_batches = 10
        for i in range(n_batches):
            # simulate a classification problem
            preds = torch.randn(10, 5).softmax(dim=-1)
            target = torch.randint(5, (10,))

            # metric on current batch
            acc = metric(preds, target)
            if rank == 0:  # print only for rank 0
                print(f"Accuracy on batch {i}: {acc}")

        # metric on all batches and all accelerators using custom accumulation;
        # accuracy is the same across both accelerators
        acc = metric.compute()
        print(f"Accuracy on all data: {acc}, accelerator rank: {rank}")

        # reset internal state so the metric is ready for new data
        metric.reset()

    # cleanup
    dist.destroy_process_group()


if __name__ == "__main__":  # guard required because mp.spawn re-imports this module
    world_size = 2  # number of GPUs to parallelize over
    mp.spawn(metric_ddp, args=(world_size,), nprocs=world_size, join=True)
```
</details>
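Conceptually, the `metric.compute()` call in the example above reduces each rank's internal state (for accuracy, a correct count and a total count) across processes before the final division. A minimal stand-alone sketch of that accumulation, with the all-reduce simulated in plain Python (the `ToyAccuracy` class is invented for illustration and is not the torchmetrics implementation):

```python
class ToyAccuracy:
    """Illustrative accumulating accuracy metric (not the torchmetrics class)."""

    def __init__(self):
        self.correct = 0
        self.total = 0

    def update(self, preds, target):
        # accumulate state batch by batch, as Metric.update() does
        self.correct += sum(p == t for p, t in zip(preds, target))
        self.total += len(target)

    def compute(self, peers=()):
        # simulated all-reduce: sum the state of this metric and its peers,
        # then do the final division once on the combined counts
        correct = self.correct + sum(m.correct for m in peers)
        total = self.total + sum(m.total for m in peers)
        return correct / total

# two "ranks", each seeing different batches
rank0, rank1 = ToyAccuracy(), ToyAccuracy()
rank0.update([0, 1, 1], [0, 1, 0])   # 2 of 3 correct locally
rank1.update([1, 0], [1, 1])         # 1 of 2 correct locally
print(rank0.compute(peers=[rank1]))  # → 0.6  (3 correct of 5 overall)
```

This is why summing counts first gives the exact global accuracy, whereas averaging the two local accuracies (2/3 and 1/2) would not.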

14 changes: 5 additions & 9 deletions docs/source/_templates/theme_variables.jinja
@@ -1,18 +1,14 @@
{%- set external_urls = {
'github': 'https://github.com/PytorchLightning/pytorch-torchmetrics',
'github_issues': 'https://github.com/PytorchLightning/pytorch-torchmetrics/issues',
'contributing': 'https://github.com/PytorchLightning/pytorch-lightning/blob/master/CONTRIBUTING.md',
'governance': 'https://github.com/PytorchLightning/pytorch-lightning/blob/master/governance.md',
'docs': 'https://pytorch-torchmetrics.rtfd.io/en/latest',
'github': 'https://github.com/PytorchLightning/metrics',
'github_issues': 'https://github.com/PytorchLightning/metrics/issues',
'contributing': 'https://github.com/PyTorchLightning/metrics/blob/master/.github/CONTRIBUTING.md',
'docs': 'https://torchmetrics.readthedocs.io/en/latest',
'twitter': 'https://twitter.com/PyTorchLightnin',
'discuss': 'https://pytorch-lightning.slack.com',
'tutorials': 'https://pytorch-lightning.readthedocs.io/en/latest/#tutorials',
'previous_pytorch_versions': 'https://torchmetrics.rtfd.io/en/latest/',
'home': 'https://torchmetrics.rtfd.io/en/latest/',
'get_started': 'https://torchmetrics.readthedocs.io/en/latest/quickstart.html',
'features': 'https://lightning-bolts.rtfd.io/en/latest/',
'blog': 'https://www.pytorchlightning.ai/blog',
'resources': 'https://pytorch-lightning.readthedocs.io/en/latest/#community-examples',
'support': 'https://github.com/PytorchLightning/pytorch-torchmetrics/issues',
'support': 'https://github.com/PytorchLightning/metrics/issues',
}
-%}
56 changes: 30 additions & 26 deletions docs/source/conf.py
@@ -13,39 +13,43 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.

# import m2r
import builtins
import glob
import inspect
import os
import shutil
import sys
from importlib.util import module_from_spec, spec_from_file_location

import pt_lightning_sphinx_theme

PATH_HERE = os.path.abspath(os.path.dirname(__file__))
PATH_ROOT = os.path.join(PATH_HERE, "..", "..")
sys.path.insert(0, os.path.abspath(PATH_ROOT))

builtins.__LIGHTNING_BOLT_SETUP__ = True
_PATH_HERE = os.path.abspath(os.path.dirname(__file__))
_PATH_ROOT = os.path.realpath(os.path.join(_PATH_HERE, "..", ".."))
sys.path.insert(0, os.path.abspath(_PATH_ROOT))

FOLDER_GENERATED = 'generated'
SPHINX_MOCK_REQUIREMENTS = int(os.environ.get("SPHINX_MOCK_REQUIREMENTS", True))

import torchmetrics # noqa: E402
try:
from torchmetrics import info
except ImportError:
# alternative https://stackoverflow.com/a/67692/4521646
spec = spec_from_file_location("torchmetrics/info.py", os.path.join(_PATH_ROOT, "torchmetrics", "info.py"))
info = module_from_spec(spec)
spec.loader.exec_module(info)

html_favicon = '_static/images/icon.svg'

# -- Project information -----------------------------------------------------

# this name shall match the project name in Github as it is used for linking to code
project = "PyTorch-Metrics"
copyright = torchmetrics.__copyright__
author = torchmetrics.__author__
copyright = info.__copyright__
author = info.__author__

# The short X.Y version
version = torchmetrics.__version__
version = info.__version__
# The full version, including alpha/beta/rc tags
release = torchmetrics.__version__
release = info.__version__

# Options for the linkcode extension
# ----------------------------------
@@ -70,14 +74,14 @@ def _transform_changelog(path_in: str, path_out: str) -> None:
fp.writelines(chlog_lines)


os.makedirs(os.path.join(PATH_HERE, FOLDER_GENERATED), exist_ok=True)
os.makedirs(os.path.join(_PATH_HERE, FOLDER_GENERATED), exist_ok=True)
# copy all documents from GH templates like contribution guide
for md in glob.glob(os.path.join(PATH_ROOT, '.github', '*.md')):
shutil.copy(md, os.path.join(PATH_HERE, FOLDER_GENERATED, os.path.basename(md)))
for md in glob.glob(os.path.join(_PATH_ROOT, '.github', '*.md')):
shutil.copy(md, os.path.join(_PATH_HERE, FOLDER_GENERATED, os.path.basename(md)))
# copy also the changelog
_transform_changelog(
os.path.join(PATH_ROOT, 'CHANGELOG.md'),
os.path.join(PATH_HERE, FOLDER_GENERATED, 'CHANGELOG.md'),
os.path.join(_PATH_ROOT, 'CHANGELOG.md'),
os.path.join(_PATH_HERE, FOLDER_GENERATED, 'CHANGELOG.md'),
)

# -- General configuration ---------------------------------------------------
@@ -166,8 +170,8 @@ def _transform_changelog(path_in: str, path_out: str) -> None:
# documentation.

html_theme_options = {
"pytorch_project": torchmetrics.__homepage__,
"canonical_url": torchmetrics.__homepage__,
"pytorch_project": info.__homepage__,
"canonical_url": info.__homepage__,
"collapse_navigation": False,
"display_version": True,
"logo_only": False,
@@ -233,7 +237,7 @@ def _transform_changelog(path_in: str, path_out: str) -> None:
project + " Documentation",
author,
project,
torchmetrics.__docs__,
info.__docs__,
"Miscellaneous",
),
]
@@ -280,11 +284,11 @@ def _transform_changelog(path_in: str, path_out: str) -> None:

# packages for which sphinx-apidoc should generate the docs (.rst files)
PACKAGES = [
torchmetrics.__name__,
info.__name__,
]

# def run_apidoc(_):
# apidoc_output_folder = os.path.join(PATH_HERE, "api")
# apidoc_output_folder = os.path.join(_PATH_HERE, "api")
# sys.path.insert(0, apidoc_output_folder)
#
# # delete api-doc files before generating them
@@ -294,7 +298,7 @@ def _transform_changelog(path_in: str, path_out: str) -> None:
# for pkg in PACKAGES:
# argv = ['-e',
# '-o', apidoc_output_folder,
# os.path.join(PATH_ROOT, pkg),
# os.path.join(_PATH_ROOT, pkg),
# '**/test_*',
# '--force',
# '--private',
@@ -311,10 +315,10 @@ def setup(app):


# copy all notebooks to local folder
path_nbs = os.path.join(PATH_HERE, "notebooks")
path_nbs = os.path.join(_PATH_HERE, "notebooks")
if not os.path.isdir(path_nbs):
os.mkdir(path_nbs)
for path_ipynb in glob.glob(os.path.join(PATH_ROOT, "notebooks", "*.ipynb")):
for path_ipynb in glob.glob(os.path.join(_PATH_ROOT, "notebooks", "*.ipynb")):
path_ipynb2 = os.path.join(path_nbs, os.path.basename(path_ipynb))
shutil.copy(path_ipynb, path_ipynb2)

@@ -340,7 +344,7 @@ def package_list_from_file(file):
MOCK_PACKAGES = []
if SPHINX_MOCK_REQUIREMENTS:
# mock also base packages when we are on RTD since we don't install them there
MOCK_PACKAGES += package_list_from_file(os.path.join(PATH_ROOT, "requirements.txt"))
MOCK_PACKAGES += package_list_from_file(os.path.join(_PATH_ROOT, "requirements.txt"))
MOCK_PACKAGES = [PACKAGE_MAPPING.get(pkg, pkg) for pkg in MOCK_PACKAGES]

autodoc_mock_imports = MOCK_PACKAGES
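The body of `package_list_from_file` is outside this hunk; a plausible sketch of such a helper, assuming it simply reads a requirements file and returns bare package names (the actual implementation may differ):

```python
def package_list_from_file(file):
    """Hypothetical sketch: read a requirements file and return package names.

    The real helper in conf.py is not shown in this diff hunk.
    """
    names = []
    with open(file) as fp:
        for line in fp:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            # drop version specifiers and environment markers
            for sep in (">=", "<=", "==", "~=", ">", "<", ";"):
                line = line.split(sep)[0]
            names.append(line.strip())
    return names
```

These names are then remapped through `PACKAGE_MAPPING` and handed to `autodoc_mock_imports` so Sphinx can import the package on Read the Docs without its heavy dependencies installed.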
@@ -373,7 +377,7 @@ def find_source():
return None
try:
filename = "%s#L%d-L%d" % find_source()
except Exception:
except Exception: # todo: specify the exception
filename = info["module"].replace(".", "/") + ".py"
# import subprocess
# tag = subprocess.Popen(['git', 'rev-parse', 'HEAD'], stdout=subprocess.PIPE,
3 changes: 1 addition & 2 deletions docs/source/index.rst
@@ -1,5 +1,4 @@
.. PyTorchtorchmetrics documentation master file, created by
sphinx-quickstart on Wed Mar 25 21:34:07 2020.
.. TorchMetrics documentation master file.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.