Merge pull request #8747 from GlobalDataverseCommunityConsortium/GDCC/8746-single-version-semantics_for_archiving

Gdcc/8746 single version semantics for archiving
kcondon authored Jul 28, 2022
2 parents 4e65f2f + aa165de commit 706196a
Showing 6 changed files with 79 additions and 10 deletions.
14 changes: 8 additions & 6 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -1245,7 +1245,7 @@ The fully expanded example above (without environment variables) looks like this
curl -H "X-Dataverse-key: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X POST https://demo.dataverse.org/api/datasets/24/privateUrl
If Anonymized Access has been enabled on a Dataverse instance (see the :ref:`:AnonymizedFieldTypeNames` setting), an optional 'anonymizedAccess' query parameter is allowed.
If Anonymized Access has been enabled on a Dataverse installation (see the :ref:`:AnonymizedFieldTypeNames` setting), an optional 'anonymizedAccess' query parameter is allowed.
Setting anonymizedAccess=true in your call will create a PrivateURL that only allows an anonymized view of the Dataset (see :ref:`privateurl`).
.. code-block:: bash
@@ -1303,7 +1303,7 @@ When adding a file to a dataset, you can optionally specify the following:
- Whether or not the file is restricted.
- Whether or not the file skips :doc:`tabular ingest </user/tabulardataingest/index>`. If the ``tabIngest`` parameter is not specified, it defaults to ``true``.
Note that when a Dataverse instance is configured to use S3 storage with direct upload enabled, there is API support to send a file directly to S3. This is more complex and is described in the :doc:`/developers/s3-direct-upload-api` guide.
Note that when a Dataverse installation is configured to use S3 storage with direct upload enabled, there is API support to send a file directly to S3. This is more complex and is described in the :doc:`/developers/s3-direct-upload-api` guide.
In the curl example below, all of the above are specified but they are optional.
@@ -1878,7 +1878,7 @@ The API call requires a JSON body that includes the list of the fileIds that the
Get the Archival Status of a Dataset By Version
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Archiving is an optional feature that may be configured for a Dataverse instance. When that is enabled, this API call can be used to retrieve the status. Note that this requires "superuser" credentials.
Archiving is an optional feature that may be configured for a Dataverse installation. When that is enabled, this API call can be used to retrieve the status. Note that this requires "superuser" credentials.
``GET /api/datasets/$dataset-id/$version/archivalStatus`` returns the archival status of the specified dataset version.
@@ -1896,7 +1896,7 @@ The response is a JSON object that will contain a "status" which may be "success
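A minimal sketch of the GET call, in the same style as the other curl examples in this guide (the server URL and persistent identifier below are placeholders, not real values):

```shell
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org
export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/EXAMPLE
export VERSION=1.0

# Retrieve the archival status of the given dataset version (requires superuser credentials)
curl -H "X-Dataverse-key: $API_TOKEN" \
  "$SERVER_URL/api/datasets/:persistentId/$VERSION/archivalStatus?persistentId=$PERSISTENT_IDENTIFIER"
```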
Set the Archival Status of a Dataset By Version
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Archiving is an optional feature that may be configured for a Dataverse instance. When that is enabled, this API call can be used to set the status. Note that this is intended to be used by the archival system and requires "superuser" credentials.
Archiving is an optional feature that may be configured for a Dataverse installation. When that is enabled, this API call can be used to set the status. Note that this is intended to be used by the archival system and requires "superuser" credentials.
``PUT /api/datasets/$dataset-id/$version/archivalStatus`` sets the archival status of the specified dataset version.
@@ -1911,11 +1911,13 @@ The body is a JSON object that must contain a "status" which may be "success", "
export JSON='{"status":"failure","message":"Something went wrong"}'
curl -H "X-Dataverse-key: $API_TOKEN" -H "Content-Type:application/json" -X PUT "$SERVER_URL/api/datasets/:persistentId/$VERSION/archivalStatus?persistentId=$PERSISTENT_IDENTIFIER" -d "$JSON"
Note that if the configured archiver only supports archiving a single version, the call may return a 409 CONFLICT response when another version already has a non-null status.
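A calling script that talks to a single-version archiver may want to branch on that 409. One way to do it (a sketch, with placeholder server URL and persistent identifier) is to have curl report only the HTTP status code:

```shell
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org
export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/EXAMPLE
export VERSION=1.0
export JSON='{"status":"failure","message":"Something went wrong"}'

# -w '%{http_code}' prints only the status code; -o /dev/null discards the body
HTTP_CODE=$(curl -s -o /dev/null -w '%{http_code}' \
  -H "X-Dataverse-key: $API_TOKEN" -H "Content-Type:application/json" \
  -X PUT "$SERVER_URL/api/datasets/:persistentId/$VERSION/archivalStatus?persistentId=$PERSISTENT_IDENTIFIER" \
  -d "$JSON")

if [ "$HTTP_CODE" = "409" ]; then
  echo "Another version of this dataset already has an archival status; delete it first."
fi
```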
Delete the Archival Status of a Dataset By Version
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Archiving is an optional feature that may be configured for a Dataverse instance. When that is enabled, this API call can be used to delete the status. Note that this is intended to be used by the archival system and requires "superuser" credentials.
Archiving is an optional feature that may be configured for a Dataverse installation. When that is enabled, this API call can be used to delete the status. Note that this is intended to be used by the archival system and requires "superuser" credentials.
``DELETE /api/datasets/$dataset-id/$version/archivalStatus`` deletes the archival status of the specified dataset version.
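A sketch of the DELETE call, following the same placeholder conventions as the examples above:

```shell
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org
export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/EXAMPLE
export VERSION=1.0

# Clear the archival status for the given dataset version (archival-system use; superuser credentials)
curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE \
  "$SERVER_URL/api/datasets/:persistentId/$VERSION/archivalStatus?persistentId=$PERSISTENT_IDENTIFIER"
```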
@@ -2134,7 +2136,7 @@ Replacing Files
Replace an existing file where ``ID`` is the database id of the file to replace or ``PERSISTENT_ID`` is the persistent id (DOI or Handle) of the file. Requires the ``file`` to be passed as well as a ``jsonString`` expressing the new metadata. Note that metadata such as description, directoryLabel (File Path) and tags are not carried over from the file being replaced.
Note that when a Dataverse instance is configured to use S3 storage with direct upload enabled, there is API support to send a replacement file directly to S3. This is more complex and is described in the :doc:`/developers/s3-direct-upload-api` guide.
Note that when a Dataverse installation is configured to use S3 storage with direct upload enabled, there is API support to send a replacement file directly to S3. This is more complex and is described in the :doc:`/developers/s3-direct-upload-api` guide.
A curl example using an ``ID``
4 changes: 1 addition & 3 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -5543,10 +5543,8 @@ public void refreshPaginator() {
*/
public void archiveVersion(Long id) {
if (session.getUser() instanceof AuthenticatedUser) {
AuthenticatedUser au = ((AuthenticatedUser) session.getUser());

DatasetVersion dv = datasetVersionService.retrieveDatasetVersionByVersionId(id).getDatasetVersion();
String className = settingsService.getValueForKey(SettingsServiceBean.Key.ArchiverClassName);
String className = settingsWrapper.getValueForKey(SettingsServiceBean.Key.ArchiverClassName, null);
AbstractSubmitToArchiveCommand cmd = ArchiverUtil.createSubmitToArchiveCommand(className, dvRequestService.getDataverseRequest(), dv);
if (cmd != null) {
try {
7 changes: 7 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/api/Admin.java
@@ -1823,6 +1823,13 @@ public Response submitDatasetVersionToArchive(@PathParam("id") String dsid, @Pat
String className = settingsService.getValueForKey(SettingsServiceBean.Key.ArchiverClassName);
AbstractSubmitToArchiveCommand cmd = ArchiverUtil.createSubmitToArchiveCommand(className, dvRequestService.getDataverseRequest(), dv);
if (cmd != null) {
if(ArchiverUtil.onlySingleVersionArchiving(cmd.getClass(), settingsService)) {
for (DatasetVersion version : ds.getVersions()) {
if ((dv != version) && version.getArchivalCopyLocation() != null) {
return error(Status.CONFLICT, "Dataset already archived.");
}
}
}
new Thread(new Runnable() {
public void run() {
try {
23 changes: 23 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
@@ -3342,6 +3342,13 @@ public Response setDatasetVersionArchivalStatus(@PathParam("id") String datasetI
if (dsv == null) {
return error(Status.NOT_FOUND, "Dataset version not found");
}
if (isSingleVersionArchiving()) {
for (DatasetVersion version : dsv.getDataset().getVersions()) {
if ((!dsv.equals(version)) && (version.getArchivalCopyLocation() != null)) {
return error(Status.CONFLICT, "Dataset already archived.");
}
}
}

dsv.setArchivalCopyLocation(JsonUtil.prettyPrint(update));
dsv = datasetversionService.merge(dsv);
@@ -3386,4 +3393,20 @@ public Response deleteDatasetVersionArchivalStatus(@PathParam("id") String datas
return wr.getResponse();
}
}

private boolean isSingleVersionArchiving() {
String className = settingsService.getValueForKey(SettingsServiceBean.Key.ArchiverClassName, null);
if (className != null) {
Class<? extends AbstractSubmitToArchiveCommand> clazz;
try {
clazz = Class.forName(className).asSubclass(AbstractSubmitToArchiveCommand.class);
return ArchiverUtil.onlySingleVersionArchiving(clazz, settingsService);
} catch (ClassNotFoundException e) {
logger.warning(":ArchiverClassName does not refer to a known Archiver");
} catch (ClassCastException cce) {
logger.warning(":ArchiverClassName does not refer to an Archiver class");
}
}
return false;
}
}
@@ -1,6 +1,9 @@
package edu.harvard.iq.dataverse.engine.command.impl;

import edu.harvard.iq.dataverse.Dataset;
import edu.harvard.iq.dataverse.DatasetVersion;
import edu.harvard.iq.dataverse.DvObject;
import edu.harvard.iq.dataverse.SettingsWrapper;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.authorization.users.ApiToken;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
@@ -157,5 +160,18 @@ public void run() {
}
return bagThread;
}


public static boolean isArchivable(Dataset dataset, SettingsWrapper settingsWrapper) {
return true;
}

//Check if the chosen archiver imposes single-version-only archiving - in a View context
public static boolean isSingleVersion(SettingsWrapper settingsWrapper) {
return false;
}

//Check if the chosen archiver imposes single-version-only archiving - in the API
public static boolean isSingleVersion(SettingsServiceBean settingsService) {
return false;
}
}
23 changes: 23 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/util/ArchiverUtil.java
@@ -1,11 +1,14 @@
package edu.harvard.iq.dataverse.util;

import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.util.logging.Logger;

import edu.harvard.iq.dataverse.DatasetVersion;
import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
import edu.harvard.iq.dataverse.engine.command.impl.AbstractSubmitToArchiveCommand;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;

/**
* Simple class to reflectively get an instance of the desired class for
@@ -35,4 +38,24 @@ public static AbstractSubmitToArchiveCommand createSubmitToArchiveCommand(String
}
return null;
}

public static boolean onlySingleVersionArchiving(Class<? extends AbstractSubmitToArchiveCommand> clazz, SettingsServiceBean settingsService) {
    try {
        // Reflectively call the isSingleVersion(SettingsServiceBean) method visible on the archiver class
        Method m = clazz.getMethod("isSingleVersion", SettingsServiceBean.class);
        return (Boolean) m.invoke(null, settingsService);
    } catch (NoSuchMethodException | SecurityException | IllegalAccessException | IllegalArgumentException
            | InvocationTargetException e) {
        logger.warning("Unable to check isSingleVersion for " + clazz.getName() + ": " + e.getMessage());
    }
    // Fall back to the base class default (multi-version archiving allowed)
    return AbstractSubmitToArchiveCommand.isSingleVersion(settingsService);
}
}
