This repository has been archived by the owner on Dec 4, 2024. It is now read-only.

Support optimizers from tf.keras.optimizers in Tensorflow 2.11+ #26

Open
danielenricocahall opened this issue Nov 9, 2023 · 0 comments


danielenricocahall commented Nov 9, 2023

Describe the bug

Currently, if someone supplies an optimizer from `tf.keras.optimizers` in TensorFlow 2.11+ to `SparkModel`, pickling fails with the following error:

```
self = <pyspark.cloudpickle.cloudpickle_fast.CloudPickler object at 0x7fe7fc26a1c0>
obj = (<function RDD.mapPartitions.<locals>.func at 0x7fe80bb3a790>, None, BatchedSerializer(CloudPickleSerializer(), 10), AutoBatchedSerializer(CloudPickleSerializer()))

    def dump(self, obj):
        try:
>           return Pickler.dump(self, obj)
E           TypeError: cannot pickle 'weakref' object
```
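This is the same failure plain `pickle` produces for any `weakref` object, which cloudpickle presumably hits because the new (TF 2.11+) optimizer implementation holds weak references internally. A minimal stdlib-only illustration of the underlying limitation (no TensorFlow or Spark required):

```python
import pickle
import weakref


class Target:
    """Any object we can take a weak reference to."""
    pass


t = Target()
ref = weakref.ref(t)

# weakref objects are not picklable; pickle.dumps raises TypeError,
# the same error class seen in the CloudPickler traceback above.
try:
    pickle.dumps(ref)
    failed = False
    message = ""
except TypeError as exc:
    failed = True
    message = str(exc)
```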

This can be resolved by changing the optimizer import to `tf.keras.optimizers.legacy`. However, we should aim to support both optimizer namespaces long term.
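One possible long-term approach (a sketch only; `needs_legacy_optimizers` and `import_sgd` are hypothetical helpers, not part of the current codebase) is to pick the import path based on the installed TensorFlow version, since the `legacy` namespace only exists and matters from 2.11 onward:

```python
def needs_legacy_optimizers(tf_version: str) -> bool:
    """Return True for TF 2.11+, where tf.keras.optimizers switched to the
    new optimizer implementation that cloudpickle cannot serialize."""
    major, minor = (int(part) for part in tf_version.split(".")[:2])
    return (major, minor) >= (2, 11)


def import_sgd():
    # Hypothetical helper: fall back to the legacy namespace on TF 2.11+.
    import tensorflow as tf

    if needs_legacy_optimizers(tf.__version__):
        from tensorflow.keras.optimizers.legacy import SGD
    else:
        from tensorflow.keras.optimizers import SGD
    return SGD
```

The version check is pure string parsing, so it can be unit-tested without TensorFlow installed.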
To Reproduce
Go to tests/integration/test_end_to_end.py and change `from tensorflow.keras.optimizers.legacy import SGD` to `from tensorflow.keras.optimizers import SGD`.

Expected behavior
We should not receive an error when supplying an optimizer imported from the `tf.keras.optimizers` package.

Additional context
Relevant PR: #25

@danielenricocahall danielenricocahall added the bug Something isn't working label Nov 9, 2023
@danielenricocahall danielenricocahall self-assigned this Nov 9, 2023