Merge master into mlx (#19557)
* Introduce float8 training (#19488)

* Add float8 training support

* Add tests for fp8 training

* Add `quantize_and_dequantize` test

* Fix bugs and add float8 correctness tests

* Cleanup

* Address comments and cleanup

* Add docstrings and some minor refactoring

* Add `QuantizedFloat8DTypePolicy`

* Add dtype policy setter

* Fix torch dynamo issue by using `self._dtype_policy`

* Improve test coverage
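
The fake-quantization at the core of float8 training can be sketched in plain Python. This is a simplified illustration, not the Keras implementation: the e4m3 max value of 448 and per-tensor `amax` scaling are assumptions based on common FP8 recipes, and `round` stands in for real fp8 rounding.

```python
def quantize_and_dequantize(x, amax, fp8_max=448.0):
    """Simulate fp8 storage: scale into the fp8 range, clamp, round, rescale."""
    scale = fp8_max / amax
    out = []
    for v in x:
        q = max(-fp8_max, min(fp8_max, v * scale))  # clamp to representable range
        q = round(q)                                # coarse stand-in for fp8 rounding
        out.append(q / scale)
    return out

vals = quantize_and_dequantize([0.5, 100.0, 1000.0], amax=1000.0)
```

Values far below the scale collapse to zero and values at `amax` round-trip almost exactly, which is why training needs careful `amax` tracking.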

* Add LoRA to ConvND layers (#19516)

* Add LoRA to `BaseConv`

* Add tests

* Fix typo

* Fix tests

* Fix tests
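
The LoRA idea being extended to conv layers can be sketched with plain nested lists: the frozen weight `W` is augmented by a low-rank product `B @ A` scaled by `alpha / rank`. This is an illustrative sketch of the math only, not the `BaseConv` implementation; for ConvND the same update is applied to the flattened kernel dimensions.

```python
def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def lora_weight(w, a, b, alpha=1.0):
    """Effective weight = frozen W + (alpha / rank) * B @ A."""
    rank = len(a)          # A has shape (rank, in_dim), B has shape (out_dim, rank)
    delta = matmul(b, a)
    return [[w[i][j] + (alpha / rank) * delta[i][j]
             for j in range(len(w[0]))] for i in range(len(w))]
```

With `B` initialized to zeros (the usual LoRA init), the effective weight equals the frozen weight, so enabling LoRA does not change the model's initial outputs.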

* Add a code path to run Keras on dm-tree when optree is not available.

* feat(losses): add Tversky loss implementation (#19511)

* feat(losses): add Tversky loss implementation

* adjusted documentation

* Update KLD docs
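
The Tversky loss generalizes Dice by weighting false positives and false negatives separately: `loss = 1 - TP / (TP + alpha*FP + beta*FN)`. A minimal soft (probabilistic) sketch, with an assumed epsilon for numerical stability — not the exact Keras signature:

```python
def tversky_loss(y_true, y_pred, alpha=0.5, beta=0.5, eps=1e-7):
    """Soft Tversky loss over flat probability lists."""
    tp = sum(p * t for p, t in zip(y_pred, y_true))
    fp = sum(p * (1 - t) for p, t in zip(y_pred, y_true))
    fn = sum((1 - p) * t for p, t in zip(y_pred, y_true))
    return 1.0 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)
```

With `alpha == beta == 0.5` this reduces to the Dice loss; raising `beta` penalizes missed positives more, which is the usual motivation in segmentation with class imbalance.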

* Models and layers now return owned metrics recursively. (#19522)

- added `Layer.metrics` to return all metrics owned by the layer and its sub-layers recursively.
- `Layer.metrics_variables` now returns variables from all metrics recursively, not just the layer and its direct sub-layers.
- `Model.metrics` now returns all metrics recursively, not just the model level metrics.
- `Model.metrics_variables` now returns variables from all metrics recursively, not just the model level metrics.
- added test coverage to test metrics and variables 2 levels deep.

This is consistent with the Keras 2 behavior and how `Model/Layer.variables` and `Model/Layer.weights` work.
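
The recursive-collection pattern described above can be sketched in a few lines (class and attribute names are illustrative, not Keras internals):

```python
class Layer:
    def __init__(self, metrics=(), sublayers=()):
        self._metrics = list(metrics)
        self._layers = list(sublayers)

    @property
    def metrics(self):
        """Own metrics plus the metrics of all sub-layers, recursively."""
        found = list(self._metrics)
        for layer in self._layers:
            found.extend(layer.metrics)  # recursion handles arbitrary depth
        return found

inner = Layer(metrics=["inner_acc"])
outer = Layer(metrics=["loss"], sublayers=[inner])
```

`outer.metrics` now surfaces `inner_acc` even though it is owned two levels down, mirroring how `variables` and `weights` already aggregate.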

* Update IoU ignore_class handling

* Fix `RandomBrightness`, Enhance `IndexLookup` Initialization and Expand Test Coverage for `Preprocessing Layers` (#19513)

* Add tests for CategoryEncoding class in category_encoding_test.py

* fix

* Fix IndexLookup class initialization and add test cases

* Add test case for IndexLookupLayerTest without vocabulary

* Fix IndexLookup class initialization

* Add normalization test cases

* Add test cases for Hashing class

* Fix value range validation error in RandomBrightness class

* Refactor IndexLookup class initialization and add test cases

* Re-fix IndexLookup class initialization and fix test cases

* Add test for spectral norm

* Add missing test decorator

* Fix torch test

* Fix code format

* Generate API (#19530)

* API Generator for Keras

* API Generator for Keras

* Generates API Gen via api_gen.sh

* Remove recursive import of _tf_keras

* Generate API Files via api_gen.sh

* Update APIs

* Added metrics from custom `train_step`/`test_step` are now returned. (#19529)

This works the same way as in Keras 2, whereby the metrics are returned directly from the logs if the set of keys doesn't match the model metrics.
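
The key-matching rule described above can be sketched as follows (function name and exact matching logic are illustrative, not the Keras internals):

```python
def resolve_returned_metrics(logs, model_metric_names):
    """Return logs as-is when their keys don't match the model's own
    metrics -- the signature of a custom train_step/test_step."""
    if set(logs) != set(model_metric_names):
        return dict(logs)  # surface the custom metrics directly
    return {name: logs[name] for name in model_metric_names}
```

So a custom `train_step` returning `{"loss": ..., "my_metric": ...}` propagates `my_metric` to callbacks and `fit()` history without registering it on the model.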

* Use temp dir and abs path in `api_gen.py` (#19533)

* Use temp dir and abs path

* Use temp dir and abs path

* Update Readme

* Update API

* Fix gradient accumulation when using `overwrite_with_gradient` during float8 training (#19534)

* Fix gradient accumulation with `overwrite_with_gradient` in float8 training

* Add comments

* Fix annotation

* Update code path in ignore path (#19537)

* Add operations per run (#19538)

* Include input shapes in model visualization.

* Add pad_to_aspect_ratio feature in ops.image.resize

* Add pad_to_aspect_ratio feature in Resizing layer.
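
The geometry behind `pad_to_aspect_ratio` can be sketched as "scale to fit, then pad the short side evenly". This is an assumption about the common behavior of such options, shown as shape arithmetic only, not the Keras implementation:

```python
def resize_with_pad_geometry(h, w, target_h, target_w):
    """Return the resized shape and (top, bottom, left, right) padding
    that fit (h, w) into (target_h, target_w) without distortion."""
    scale = min(target_h / h, target_w / w)   # fit inside the target
    new_h, new_w = round(h * scale), round(w * scale)
    pad_h, pad_w = target_h - new_h, target_w - new_w
    top, left = pad_h // 2, pad_w // 2        # pad evenly on both sides
    return (new_h, new_w), (top, pad_h - top, left, pad_w - left)
```

For example, a 50x100 image resized into a 100x100 target keeps its aspect ratio and gains 25 rows of padding above and below.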

* Fix incorrect usage of `quantize` (#19541)

* Add logic to prevent double quantization

* Add detailed info for double quantization error

* Update error msg
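
The double-quantization guard can be sketched as a one-shot mode flag (class and attribute names are hypothetical, chosen only to illustrate the check):

```python
class QuantizableLayer:
    def __init__(self):
        self.quantization_mode = None

    def quantize(self, mode):
        """Quantize once; reject a second call with a detailed error."""
        if self.quantization_mode is not None:
            raise ValueError(
                f"Layer is already quantized with mode "
                f"'{self.quantization_mode}'; quantizing it again "
                f"(requested mode '{mode}') is not supported."
            )
        self.quantization_mode = mode

layer = QuantizableLayer()
layer.quantize("int8")
```

A second `layer.quantize("float8")` now fails loudly instead of silently re-quantizing already-quantized weights.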

* Add eigh op.

* Add keepdims in argmax/argmin.
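
The `keepdims` semantics can be illustrated with a row-wise argmax over a 2D list: the reduced axis is retained with size 1 so the result still broadcasts against the input. A pure-Python sketch of the idea, not the Keras op:

```python
def argmax_keepdims(rows):
    """Row-wise argmax; keepdims=True keeps the reduced axis as size 1."""
    return [[max(range(len(r)), key=r.__getitem__)] for r in rows]
```

So the output of `argmax_keepdims` has shape `(n, 1)` rather than `(n,)`, which is exactly what `keepdims=True` buys you.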

* Fix small bug in model.save_weights (#19545)

* Update public APIs.

* eigh should work on JAX GPU

* Copy init to keras/__init__.py (#19551)

* Revert "Copy init to keras/__init__.py (#19551)" (#19552)

This reverts commit da9af61.

* fixes for master

---------

Co-authored-by: james77777778 <20734616+james77777778@users.noreply.github.com>
Co-authored-by: Francois Chollet <francois.chollet@gmail.com>
Co-authored-by: Luca Pizzini <lpizzini7@gmail.com>
Co-authored-by: hertschuh <1091026+hertschuh@users.noreply.github.com>
Co-authored-by: Faisal Alsrheed <47912291+Faisal-Alsrheed@users.noreply.github.com>
Co-authored-by: Ramesh Sampath <1437573+sampathweb@users.noreply.github.com>
Co-authored-by: Sachin Prasad <sachinprasad@google.com>
Co-authored-by: Uwe Schmidt <uschmidt83@users.noreply.github.com>
9 people authored Apr 19, 2024
1 parent 47c032d commit 3859fb4
Showing 769 changed files with 11,122 additions and 4,404 deletions.
23 changes: 16 additions & 7 deletions .github/workflows/actions.yml
@@ -24,13 +24,13 @@ jobs:
KERAS_HOME: .github/workflows/config/${{ matrix.backend }}
steps:
- uses: actions/checkout@v4
- name: Check for changes in keras/applications
- name: Check for changes in keras/src/applications
uses: dorny/paths-filter@v3
id: filter
with:
filters: |
applications:
- 'keras/applications/**'
- 'keras/src/applications/**'
- name: Set up Python
uses: actions/setup-python@v5
with:
@@ -49,13 +49,13 @@ jobs:
run: |
pip install -r requirements.txt --progress-bar off --upgrade
pip uninstall -y keras keras-nightly
pip install tf_keras==2.16.0rc0 --progress-bar off --upgrade
pip install tf_keras==2.16.0 --progress-bar off --upgrade
pip install -e "." --progress-bar off --upgrade
- name: Test applications with pytest
if: ${{ steps.filter.outputs.applications == 'true' }}
run: |
pytest keras/applications --cov=keras/applications
coverage xml --include='keras/applications/*' -o apps-coverage.xml
pytest keras/src/applications --cov=keras/src/applications
coverage xml --include='keras/src/applications/*' -o apps-coverage.xml
- name: Codecov keras.applications
if: ${{ steps.filter.outputs.applications == 'true' }}
uses: codecov/codecov-action@v4
@@ -80,8 +80,8 @@ jobs:
pytest integration_tests/torch_workflow_test.py
- name: Test with pytest
run: |
pytest keras --ignore keras/applications --cov=keras
coverage xml --omit='keras/applications/*' -o core-coverage.xml
pytest keras --ignore keras/src/applications --cov=keras
coverage xml --omit='keras/src/applications/*,keras/api' -o core-coverage.xml
- name: Codecov keras
uses: codecov/codecov-action@v4
with:
@@ -115,5 +115,14 @@ jobs:
pip install -r requirements.txt --progress-bar off --upgrade
pip uninstall -y keras keras-nightly
pip install -e "." --progress-bar off --upgrade
- name: Check for API changes
run: |
bash shell/api_gen.sh
git status
clean=$(git status | grep "nothing to commit")
if [ -z "$clean" ]; then
echo "Please run shell/api_gen.sh to generate API."
exit 1
fi
- name: Lint
run: bash shell/lint.sh
13 changes: 11 additions & 2 deletions .github/workflows/nightly.yml
@@ -55,7 +55,7 @@ jobs:
pytest integration_tests/torch_workflow_test.py
- name: Test with pytest
run: |
pytest keras --ignore keras/applications --cov=keras
pytest keras --ignore keras/src/applications --cov=keras
format:
name: Check the code format
@@ -81,6 +81,15 @@ jobs:
pip install -r requirements.txt --progress-bar off --upgrade
pip uninstall -y keras keras-nightly
pip install -e "." --progress-bar off --upgrade
- name: Check for API changes
run: |
bash shell/api_gen.sh
git status
clean=$(git status | grep "nothing to commit")
if [ -z "$clean" ]; then
echo "Please run shell/api_gen.sh to generate API."
exit 1
fi
- name: Lint
run: bash shell/lint.sh

@@ -108,4 +117,4 @@ jobs:
with:
password: ${{ secrets.PYPI_NIGHTLY_API_TOKEN }}
packages-dir: dist/
verbose: true
verbose: true
4 changes: 4 additions & 0 deletions .github/workflows/stale-issue-pr.yaml
@@ -4,6 +4,8 @@ on:
- cron: "30 1 * * *"
jobs:
close-issues:
# Don't do this in forks
if: github.repository == 'keras-team/keras'
runs-on: ubuntu-latest
permissions:
issues: write
@@ -12,6 +14,7 @@ jobs:
- name: Awaiting response issues
uses: actions/stale@v9
with:
operations-per-run: 500
days-before-issue-stale: 14
days-before-issue-close: 14
stale-issue-label: "stale"
@@ -34,6 +37,7 @@ jobs:
- name: Contribution issues
uses: actions/stale@v9
with:
operations-per-run: 500
days-before-issue-stale: 180
days-before-issue-close: 365
stale-issue-label: "stale"
16 changes: 8 additions & 8 deletions .kokoro/github/ubuntu/gpu/build.sh
@@ -34,8 +34,8 @@ then
python3 -c 'import tensorflow as tf;assert len(tf.config.list_physical_devices("GPU")) > 0'

# TODO: keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d Fatal Python error: Aborted
pytest keras --ignore keras/applications \
--ignore keras/layers/merging/merging_test.py \
pytest keras --ignore keras/src/applications \
--ignore keras/src/layers/merging/merging_test.py \
--cov=keras
fi

@@ -51,11 +51,11 @@ then
# TODO: keras/layers/merging/merging_test.py::MergingLayersTest::test_sparse_dot_2d Fatal Python error: Aborted
# TODO: keras/trainers/data_adapters/py_dataset_adapter_test.py::PyDatasetAdapterTest::test_basic_flow0 Fatal Python error: Aborted
# keras/backend/jax/distribution_lib_test.py is configured for CPU test for now.
pytest keras --ignore keras/applications \
--ignore keras/layers/merging/merging_test.py \
--ignore keras/trainers/data_adapters/py_dataset_adapter_test.py \
--ignore keras/backend/jax/distribution_lib_test.py \
--ignore keras/distribution/distribution_lib_test.py \
pytest keras --ignore keras/src/applications \
--ignore keras/src/layers/merging/merging_test.py \
--ignore keras/src/trainers/data_adapters/py_dataset_adapter_test.py \
--ignore keras/src/backend/jax/distribution_lib_test.py \
--ignore keras/src/distribution/distribution_lib_test.py \
--cov=keras
fi

@@ -68,6 +68,6 @@ then
# Raise error if GPU is not detected.
python3 -c 'import torch;assert torch.cuda.is_available()'

pytest keras --ignore keras/applications \
pytest keras --ignore keras/src/applications \
--cov=keras
fi
6 changes: 6 additions & 0 deletions README.md
@@ -50,6 +50,12 @@ pip install -r requirements.txt
python pip_build.py --install
```

3. Run API generation script when creating PRs that update `keras_export` public APIs:

```
./shell/api_gen.sh
```

#### Adding GPU support

The `requirements.txt` file will install a CPU-only version of TensorFlow, JAX, and PyTorch. For GPU support, we also
201 changes: 201 additions & 0 deletions api_gen.py
@@ -0,0 +1,201 @@
"""Script to generate keras public API in `keras/api` directory.
Usage:
Run via `./shell/api_gen.sh`.
It generates API and formats user and generated APIs.
"""

import os
import shutil

import namex

package = "keras"


def ignore_files(_, filenames):
return [f for f in filenames if f.endswith("_test.py")]


def copy_source_to_build_directory(root_path):
# Copy sources (`keras/` directory and setup files) to build dir
build_dir = os.path.join(root_path, "tmp_build_dir")
if os.path.exists(build_dir):
shutil.rmtree(build_dir)
os.mkdir(build_dir)
shutil.copytree(
package, os.path.join(build_dir, package), ignore=ignore_files
)
return build_dir


def create_legacy_directory(package_dir):
src_dir = os.path.join(package_dir, "src")
api_dir = os.path.join(package_dir, "api")
# Make keras/_tf_keras/ by copying keras/
tf_keras_dirpath_parent = os.path.join(api_dir, "_tf_keras")
tf_keras_dirpath = os.path.join(tf_keras_dirpath_parent, "keras")
os.makedirs(tf_keras_dirpath, exist_ok=True)
with open(os.path.join(tf_keras_dirpath_parent, "__init__.py"), "w") as f:
f.write("from keras.api._tf_keras import keras\n")
with open(os.path.join(api_dir, "__init__.py")) as f:
init_file = f.read()
init_file = init_file.replace(
"from keras.api import _legacy",
"from keras.api import _tf_keras",
)
with open(os.path.join(api_dir, "__init__.py"), "w") as f:
f.write(init_file)
# Remove the import of `_tf_keras` in `keras/_tf_keras/keras/__init__.py`
init_file = init_file.replace("from keras.api import _tf_keras\n", "\n")
with open(os.path.join(tf_keras_dirpath, "__init__.py"), "w") as f:
f.write(init_file)
for dirname in os.listdir(api_dir):
dirpath = os.path.join(api_dir, dirname)
if os.path.isdir(dirpath) and dirname not in (
"_legacy",
"_tf_keras",
"src",
):
destpath = os.path.join(tf_keras_dirpath, dirname)
if os.path.exists(destpath):
shutil.rmtree(destpath)
shutil.copytree(
dirpath,
destpath,
ignore=ignore_files,
)

# Copy keras/_legacy/ file contents to keras/_tf_keras/keras
legacy_submodules = [
path[:-3]
for path in os.listdir(os.path.join(src_dir, "legacy"))
if path.endswith(".py")
]
legacy_submodules += [
path
for path in os.listdir(os.path.join(src_dir, "legacy"))
if os.path.isdir(os.path.join(src_dir, "legacy", path))
]

for root, _, fnames in os.walk(os.path.join(package_dir, "_legacy")):
for fname in fnames:
if fname.endswith(".py"):
legacy_fpath = os.path.join(root, fname)
tf_keras_root = root.replace("/_legacy", "/_tf_keras/keras")
core_api_fpath = os.path.join(
root.replace("/_legacy", ""), fname
)
if not os.path.exists(tf_keras_root):
os.makedirs(tf_keras_root)
tf_keras_fpath = os.path.join(tf_keras_root, fname)
with open(legacy_fpath) as f:
legacy_contents = f.read()
legacy_contents = legacy_contents.replace(
"keras.api._legacy", "keras.api._tf_keras.keras"
)
if os.path.exists(core_api_fpath):
with open(core_api_fpath) as f:
core_api_contents = f.read()
core_api_contents = core_api_contents.replace(
"from keras.api import _tf_keras\n", ""
)
for legacy_submodule in legacy_submodules:
core_api_contents = core_api_contents.replace(
f"from keras.api import {legacy_submodule}\n",
"",
)
core_api_contents = core_api_contents.replace(
f"keras.api.{legacy_submodule}",
f"keras.api._tf_keras.keras.{legacy_submodule}",
)
legacy_contents = core_api_contents + "\n" + legacy_contents
with open(tf_keras_fpath, "w") as f:
f.write(legacy_contents)

# Delete keras/api/_legacy/
shutil.rmtree(os.path.join(api_dir, "_legacy"))


def export_version_string(api_init_fname):
with open(api_init_fname) as f:
contents = f.read()
with open(api_init_fname, "w") as f:
contents += "from keras.src.version import __version__\n"
f.write(contents)


def update_package_init(init_fname):
contents = """
# Import everything from /api/ into keras.
from keras.api import * # noqa: F403
from keras.api import __version__ # Import * ignores names start with "_".
import os
# Add everything in /api/ to the module search path.
__path__.append(os.path.join(os.path.dirname(__file__), "api")) # noqa: F405
# Don't pollute namespace.
del os
# Never autocomplete `.src` or `.api` on an imported keras object.
def __dir__():
keys = dict.fromkeys((globals().keys()))
keys.pop("src")
keys.pop("api")
return list(keys)
# Don't import `.src` or `.api` during `from keras import *`.
__all__ = [
name
for name in globals().keys()
if not (name.startswith("_") or name in ("src", "api"))
]"""
with open(init_fname) as f:
init_contents = f.read()
with open(init_fname, "w") as f:
f.write(init_contents.replace("\nfrom keras import api", contents))


def build():
# Backup the `keras/__init__.py` and restore it on error in api gen.
root_path = os.path.dirname(os.path.abspath(__file__))
code_api_dir = os.path.join(root_path, package, "api")
code_init_fname = os.path.join(root_path, package, "__init__.py")
# Create temp build dir
build_dir = copy_source_to_build_directory(root_path)
build_api_dir = os.path.join(build_dir, package, "api")
build_init_fname = os.path.join(build_dir, package, "__init__.py")
build_api_init_fname = os.path.join(build_api_dir, "__init__.py")
try:
os.chdir(build_dir)
# Generates `keras/api` directory.
if os.path.exists(build_api_dir):
shutil.rmtree(build_api_dir)
if os.path.exists(build_init_fname):
os.remove(build_init_fname)
os.makedirs(build_api_dir)
namex.generate_api_files(
"keras", code_directory="src", target_directory="api"
)
# Creates `keras/__init__.py` importing from `keras/api`
update_package_init(build_init_fname)
# Add __version__ to keras package
export_version_string(build_api_init_fname)
# Creates `_tf_keras` with full keras API
create_legacy_directory(package_dir=os.path.join(build_dir, package))
# Copy back the keras/api and keras/__init__.py from build directory
if os.path.exists(code_api_dir):
shutil.rmtree(code_api_dir)
shutil.copytree(build_api_dir, code_api_dir)
shutil.copy(build_init_fname, code_init_fname)
finally:
# Clean up: remove the build directory (no longer needed)
shutil.rmtree(build_dir)


if __name__ == "__main__":
build()
2 changes: 1 addition & 1 deletion conftest.py
@@ -14,7 +14,7 @@

import pytest # noqa: E402

from keras.backend import backend # noqa: E402
from keras.src.backend import backend # noqa: E402


def pytest_configure(config):
10 changes: 5 additions & 5 deletions integration_tests/basic_full_flow.py
@@ -2,11 +2,11 @@
import pytest

import keras
from keras import layers
from keras import losses
from keras import metrics
from keras import optimizers
from keras import testing
from keras.src import layers
from keras.src import losses
from keras.src import metrics
from keras.src import optimizers
from keras.src import testing


class MyModel(keras.Model):