Fix edit metadata endpoint #217
Conversation
test.py (Outdated)
I guess this file should have been omitted?
Oh no, thanks a lot! Removed it 🥲
pdurbin left a comment
We talked about it and it makes sense. Approving.
Pull Request Overview
This PR corrects the `edit_dataset_metadata` endpoint by sending metadata as request body parameters instead of JSON, avoiding multiple values being passed to the `json` keyword argument, and adds tests to verify both replace and add behaviors.

- Updated `put_request` to accept a `data` payload and removed automatic JSON encoding for PUT.
- Adjusted `edit_dataset_metadata` to call `put_request` with `data=metadata`.
- Added two integration tests covering replace and add scenarios for dataset metadata edits.
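The change above can be sketched as follows. This is a minimal illustration with hypothetical function bodies (`fake_put` stands in for the real HTTP client's `put()`; pyDataverse's actual code differs): after the fix, `put_request` forwards the caller's raw `data` payload instead of unconditionally JSON-encoding it.

```python
import json

def fake_put(url, data=None, json=None, params=None):
    """Stand-in for an HTTP client's put(); records what it would send."""
    return {"url": url, "data": data, "json": json, "params": params}

def put_request(url, data=None, params=None):
    # After the fix: no automatic json= encoding; the metadata string
    # travels as the request body via data=.
    return fake_put(url, data=data, params=params)

metadata = json.dumps({"fields": [{"typeName": "title", "value": "New title"}]})
# edit_dataset_metadata now calls put_request with data=metadata
req = put_request("https://example.org/api/datasets/:persistentId/editMetadata",
                  data=metadata, params={"replace": "true"})
```

Because the body is passed as `data`, nothing else in the call chain tries to re-encode it, which is what previously produced the duplicate `json` keyword.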
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| tests/api/test_edit.py | New tests for replace and add modes of `edit_dataset_metadata`. |
| pyDataverse/api.py | Changed `put_request` signature, removed `json=` usage, and updated metadata edit to use request body. |
Comments suppressed due to low confidence (1)
tests/api/test_edit.py:84
- [nitpick] Update the docstring for `test_edit_dataset_metadata_add` to correctly describe that this test adds metadata rather than replacing existing metadata.

> 2. Edits the dataset metadata and replaces the existing metadata.
```python
API_TOKEN = os.getenv("API_TOKEN")


# Create dataset
metadata = json.load(open("tests/data/file_upload_ds_minimum.json"))
```
Copilot AI · Jun 18, 2025
[nitpick] Use a context manager when opening files (e.g., `with open(...) as f:`) to ensure the file is properly closed.
```diff
- metadata = json.load(open("tests/data/file_upload_ds_minimum.json"))
+ with open("tests/data/file_upload_ds_minimum.json") as f:
+     metadata = json.load(f)
```
```python
request_params = self._check_json_data_form(data)


if self.client is None:
    return self._sync_request(
```
Copilot AI · Jun 18, 2025
The new `files` parameter is accepted by `put_request` but not forwarded to the underlying request; include `files=files` in both `_sync_request` and `_async_request` calls to support file uploads via PUT.
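A hedged sketch of the suggested fix (class and method names here are illustrative stand-ins, not pyDataverse's real implementation): `put_request` must pass `files` through to the underlying request helper rather than silently dropping it.

```python
class ApiSketch:
    """Illustrative only: shows files= being forwarded on the sync path."""

    def __init__(self, client=None):
        self.client = client  # None -> synchronous request path

    def _sync_request(self, method, url, data=None, files=None, params=None):
        # Stand-in for the real HTTP call; just echoes its arguments.
        return {"method": method, "url": url, "data": data,
                "files": files, "params": params}

    def put_request(self, url, data=None, files=None, params=None):
        if self.client is None:
            # Fix: include files=files (the async branch would need the
            # same change in its _async_request call).
            return self._sync_request("PUT", url, data=data,
                                      files=files, params=params)
        raise NotImplementedError("async path omitted in this sketch")

api = ApiSketch()
resp = api.put_request("https://example.org/api/files/1",
                       files={"file": b"bytes"})
```

Without the forwarding, the caller's `files` argument would be accepted but never reach the HTTP layer, so PUT file uploads would silently send an empty body.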
As noted in #216 ("multiple values for keyword argument 'json'"), modifying the metadata of a dataset was broken because multiple values were sent to the `json` keyword argument. This PR resolves the problem and includes tests to verify the fix.
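The original failure can be reproduced with a small sketch (hypothetical names, not the real pyDataverse code): when the old `put_request` unconditionally injected `json=` and a caller also supplied `json=` through kwargs, Python raised the duplicate-keyword error quoted in #216.

```python
def fake_put(url, json=None, **kwargs):
    """Stand-in for an HTTP client's put()."""
    return {"url": url, "json": json, **kwargs}

def put_request_old(url, metadata, **kwargs):
    # Pre-fix behavior: always JSON-encodes the metadata, so a json=
    # kwarg from the caller collides with the one injected here.
    return fake_put(url, json=metadata, **kwargs)

try:
    put_request_old("https://example.org/api", {"a": 1}, json={"b": 2})
except TypeError as exc:
    print(exc)  # "... got multiple values for keyword argument 'json'"
```

Dropping the automatic `json=` injection, as this PR does, removes the collision: the caller alone decides how the body is encoded.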