
Iqss/6497 semantic api #7414

Merged
Changes from 107 commits
Commits (114)
4d70971
initial semantic API endpoint
qqmyers Mar 23, 2020
b2befca
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497
qqmyers Mar 23, 2020
fb6421b
merge new fields with existing ones
qqmyers Mar 24, 2020
8bc3df6
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497
qqmyers Mar 27, 2020
5b828aa
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497
qqmyers Apr 17, 2020
f472b6c
differences from IQSS/develop that break compilation
qqmyers Apr 17, 2020
bab11f0
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Aug 31, 2020
d305867
Add jsonld lib to compact to local context
qqmyers Sep 8, 2020
98d978e
use expand/compact, refactor, add :startmigration endpoint
qqmyers Sep 11, 2020
2443702
try fix for parse error
qqmyers Sep 16, 2020
d442643
log value
qqmyers Sep 16, 2020
fdeac97
return dataset
qqmyers Sep 16, 2020
554e620
manage versionState, add debug output
qqmyers Sep 17, 2020
bee7731
move debug ore generation after configuring dataset
qqmyers Sep 17, 2020
f4cecd3
set versionstate, simplify, move terms init outside loop
qqmyers Sep 17, 2020
a4189de
parse version number
qqmyers Sep 17, 2020
e0de1db
fix toStrings
qqmyers Sep 17, 2020
928a88e
debug null pointer in DataverseFieldTypeInputLevel
qqmyers Sep 17, 2020
b78aed1
add support for fields with their own formal URI
qqmyers Sep 17, 2020
3a47630
allow non-published to support debugging and future use
qqmyers Sep 17, 2020
3f8534b
refactor, use expanded version directly
qqmyers Sep 18, 2020
64af0e8
add modification time
qqmyers Sep 18, 2020
04ee08a
expanded has array with 1 val - handle it
qqmyers Sep 18, 2020
c7c2573
log compound values to start
qqmyers Sep 18, 2020
fc77f92
compact with no context for decontextualize
qqmyers Sep 18, 2020
e287644
handle appending and compound fields
qqmyers Sep 18, 2020
8596ac8
sort compound field children by display order
qqmyers Sep 20, 2020
ffbc05a
parse date/time correctly
qqmyers Sep 22, 2020
0226b0d
Revert "sort compound field children by display order"
qqmyers Sep 22, 2020
fad62f4
typo
qqmyers Sep 22, 2020
c6b19a9
now use Uri instead of label when matching terms
qqmyers Sep 22, 2020
a014fc4
set dsfield of dsfvalue
qqmyers Sep 22, 2020
5bb5e68
additional debug, always set display order
qqmyers Sep 23, 2020
6a47fad
generate URIs for child types to match current ore maps
qqmyers Sep 23, 2020
f34da09
allow oremap to work w/o modified date for debug
qqmyers Sep 23, 2020
6b8bbc7
null check on date itself
qqmyers Sep 23, 2020
8af0938
fix compound value iteration
qqmyers Sep 23, 2020
7904844
fix ttype map for terms with no uri - use title not name
qqmyers Sep 23, 2020
e4ceee3
handle date format variations, including DV internal ones
qqmyers Sep 23, 2020
f176387
and the format in current published bags
qqmyers Sep 23, 2020
2724d9e
initial endpoint to release a migrated dataset
qqmyers Sep 24, 2020
7d5006d
create metadataOnOrig field
qqmyers Sep 24, 2020
925070b
add metadataOnOrig to solr
qqmyers Sep 24, 2020
005db97
use Finalize Publication command
qqmyers Sep 24, 2020
f336cfd
add debug, allow more details in 400 responses
qqmyers Sep 24, 2020
0de70dd
fix date-time issue
qqmyers Sep 24, 2020
395bb71
typos
qqmyers Sep 25, 2020
61c0349
create transfer bag type with orig files
qqmyers Sep 25, 2020
1753257
missing tab
qqmyers Sep 30, 2020
e642c65
add type param
qqmyers Sep 30, 2020
df66f22
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Oct 6, 2020
3445daa
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Nov 16, 2020
2e1d914
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Nov 16, 2020
9ad779a
add semantic metadata api call only
qqmyers Nov 16, 2020
8abd55e
remove OREMap parameter
qqmyers Dec 1, 2020
1a35ed2
fix error handling
qqmyers Dec 1, 2020
7b1512e
append to current terms
qqmyers Dec 1, 2020
e5b54df
add replace param
qqmyers Dec 1, 2020
578790f
handle append on terms - fix cut/paste errors
qqmyers Dec 2, 2020
acee4df
fix logic
qqmyers Dec 2, 2020
9185126
specify default
qqmyers Dec 2, 2020
e8698dc
make replace still append for multiple val fields
qqmyers Dec 2, 2020
901efe8
add migrating switch
qqmyers Dec 2, 2020
34a28a3
expose uri in datasetField api
qqmyers Dec 3, 2020
1b98b2c
track defined namespaces
qqmyers Dec 3, 2020
55a8b30
define equals, avoid duplicates in list
qqmyers Dec 3, 2020
966394a
replace string with const
qqmyers Dec 4, 2020
b83f7b2
constant for CC0_URI
qqmyers Dec 4, 2020
1e08f10
GET/DELETE endpoints
qqmyers Dec 4, 2020
780630f
7130-handle missing contact name
qqmyers Dec 8, 2020
cc7e69c
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Dec 8, 2020
e0ea36e
Fix multiple description logic for info file
qqmyers Dec 11, 2020
6d0c615
put is always for :draft version
qqmyers Dec 11, 2020
51f8f78
don't cast to String[]
qqmyers Dec 11, 2020
353644a
add more logging
qqmyers Dec 11, 2020
2382fef
handle unpublished versions
qqmyers Dec 11, 2020
243769a
add method that can return JsonObjectBuilder
qqmyers Dec 11, 2020
9bfa7c3
log details on failure
qqmyers Dec 11, 2020
60f8a99
multiple updates/fixes, added logging
qqmyers Dec 11, 2020
464832a
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Dec 22, 2020
e931149
fix terms retrieval
qqmyers Dec 22, 2020
2b8189a
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Jan 8, 2021
e8f737c
date test fixes for locale
qqmyers Jan 13, 2021
1c93260
Java 11 update and test fixes inc. for different exception mesg
qqmyers Jan 13, 2021
a85c1d6
update pom for v11 and running tests under 11
qqmyers Jan 14, 2021
1476a61
Merge branch 'iqssdevelop' into IQSS/6497-semantic_api
qqmyers Jan 14, 2021
a52353b
fix for edu.harvard.iq.dataverse.api.AdminIT test fail in Java 11
qqmyers Jan 14, 2021
6f405ab
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Jan 29, 2021
e866ae0
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Feb 8, 2021
d5b8b45
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Feb 23, 2021
56acda8
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Apr 7, 2021
f19a199
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Apr 13, 2021
a7c6b3f
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Apr 26, 2021
33fb8de
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers May 20, 2021
87c581f
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Jun 3, 2021
f47b268
update StringUtils package
qqmyers Jun 3, 2021
6d73b61
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Jun 23, 2021
4714ea6
move metadataOnOrig out of citation block
qqmyers Jun 23, 2021
82a5b23
sync with migration api branch (tests, docs, bug fixes)
qqmyers Jun 30, 2021
10ef9ff
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Jun 30, 2021
e159003
fix test
qqmyers Jul 2, 2021
cf8b2b5
Update doc/release-notes/6497-semantic-api.md
qqmyers Jul 2, 2021
d5ff955
Update doc/sphinx-guides/source/developers/dataset-semantic-metadata-…
qqmyers Jul 2, 2021
61627d1
add create example, remove solr schema copies file
qqmyers Jul 2, 2021
1d54c68
removed debug logging
qqmyers Jul 2, 2021
bc82180
Merge branch 'IQSS/6497-semantic_api' of https://github.com/GlobalDat…
qqmyers Jul 2, 2021
4c1d31a
missing header
qqmyers Jul 2, 2021
a5a745d
remove metadataOnOrig per review
qqmyers Jul 8, 2021
bd37e30
Merge remote-tracking branch 'IQSS/develop' into IQSS/6497-semantic_api
qqmyers Jul 8, 2021
0138ebb
add missing create method (in migrate PR)
qqmyers Jul 13, 2021
13a7841
No "@id" npe fix
qqmyers Jul 13, 2021
86a08e3
avoid npe in logging
qqmyers Jul 13, 2021
0c64c68
only require "@id" when migrating
qqmyers Jul 13, 2021
8e9f2f7
fix logging in create case
qqmyers Jul 13, 2021
1 change: 1 addition & 0 deletions conf/solr/8.8.1/schema_dv_mdb_fields.xml
@@ -82,6 +82,7 @@
<field name="keywordVocabularyURI" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="kindOfData" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="language" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="metadataOnOrig" type="text_en" multiValued="false" stored="true" indexed="true"/>
<field name="northLongitude" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="notesText" type="text_en" multiValued="false" stored="true" indexed="true"/>
<field name="originOfSources" type="text_en" multiValued="false" stored="true" indexed="true"/>
7 changes: 7 additions & 0 deletions doc/release-notes/6497-semantic-api.md
@@ -0,0 +1,7 @@
# Release Highlights

### Dataset Semantic API (Experimental)

Dataset metadata can be retrieved/set/updated using a new, flatter JSON-LD format - following the format of an OAI-ORE export (RDA-conformant Bags), allowing for easier transfer of metadata to/from other systems (i.e. without needing to know Dataverse's metadata block and field storage architecture). This new API also allows for the update of terms metadata (#5899).

This development was supported by the [Research Data Alliance](https://rd-alliance.org), DANS, and Sciences Po and follows the recommendations from the [Research Data Repository Interoperability Working Group](http://dx.doi.org/10.15497/RDA00025).
15 changes: 15 additions & 0 deletions doc/sphinx-guides/source/_static/api/dataset-create.jsonld
@@ -0,0 +1,15 @@
{
"http://purl.org/dc/terms/title": "Darwin's Finches",
"http://purl.org/dc/terms/subject": "Medicine, Health and Life Sciences",
"http://purl.org/dc/terms/creator": {
"https://dataverse.org/schema/citation/author#Name": "Finch, Fiona",
"https://dataverse.org/schema/citation/author#Affiliation": "Birds Inc."
},
"https://dataverse.org/schema/citation/Contact": {
"https://dataverse.org/schema/citation/datasetContact#E-mail": "finch@mailinator.com",
"https://dataverse.org/schema/citation/datasetContact#Name": "Finch, Fiona"
},
"https://dataverse.org/schema/citation/Description": {
Member:
The capitalization seems inconsistent.

Contact and Description (title case) but also author, datasetContact and dsDescription (camel case)?

What are the rules?

@qqmyers (Member Author), Jul 8, 2021:
These come from the citation.tsv block, with the pattern <block name>/<field title> for single fields and <block name>/<field title>/<child field name> for child fields. That choice was made back when OAI_ORE/BagIt/archiving was introduced. The use of title was an attempt to use URIs that mirror what users see in the UI. I'm less sure why I used name for child fields - there may have been a conflict, or I may have originally tried a flatter <block name>/<child field title> and realized that multiple child fields share the title 'Name', for example.

Member:
Ok. Sounds like a preexisting condition to me. 😄 It would be nice to have more consistency but oh well. I assume we don't want to revisit decisions made during BagIt export.

"https://dataverse.org/schema/citation/dsDescription#Text": "Darwin's finches (also known as the Galápagos finches) are a group of about fifteen species of passerine birds."
}
}
103 changes: 103 additions & 0 deletions doc/sphinx-guides/source/developers/dataset-semantic-metadata-api.rst
@@ -0,0 +1,103 @@
Dataset Semantic Metadata API
=============================

The OAI_ORE metadata export format represents Dataset metadata using json-ld (see the :doc:`/admin/metadataexport` section). As part of an RDA-supported effort to allow import of Datasets exported as Bags with an included OAI_ORE metadata file,
an experimental API has been created that provides a json-ld alternative to the v1.0 API calls to get/set/delete Dataset metadata in the :doc:`/api/native-api`.

You may prefer to work with this API if you are building a tool to import from a Bag/OAI-ORE source, already work with json-ld representations of metadata, or prefer the flatter json-ld representation to the Dataverse software's json representation (which includes structure related to the metadata blocks involved and the type/multiplicity of the metadata fields).
You may not want to use this API if you need stability and backward compatibility (the 'experimental' designation for this API implies that community feedback is desired and that, in future Dataverse software versions, the API may be modified based on that feedback).

Note: The examples use the 'application/ld+json' mimetype. For compatibility reasons, the APIs can also be used with the mimetype 'application/json-ld'.

Get Dataset Metadata
--------------------

To get the json-ld formatted metadata for a Dataset, specify the Dataset ID (DATASET_ID) or Persistent identifier (DATASET_PID), and, for specific versions, the version number.

.. code-block:: bash

export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export DATASET_ID='12345'
export DATASET_PID='doi:10.5072/FK2A1B2C3'
export VERSION='1.0'
export SERVER_URL=https://demo.dataverse.org

Example 1: Get metadata for version '1.0'

curl -H X-Dataverse-key:$API_TOKEN -H 'Accept: application/ld+json' "$SERVER_URL/api/datasets/$DATASET_ID/versions/$VERSION/metadata"

Example 2: Get metadata for the latest version using the DATASET PID

curl -H X-Dataverse-key:$API_TOKEN -H 'Accept: application/ld+json' "$SERVER_URL/api/datasets/:persistentId/metadata?persistentId=$DATASET_PID"

You should expect a 200 ("OK") response and JSON-LD mirroring the OAI-ORE representation in the returned 'data' object.
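The addressing options above (database ID vs. persistent identifier, with an optional version) can be sketched as a small helper. This is an illustrative sketch, not part of any Dataverse client library; the function name and structure are assumptions.

```python
def metadata_url(server_url, dataset_id=None, dataset_pid=None, version=None):
    """Build the semantic-metadata GET endpoint URL for a dataset.

    Pass dataset_id for database-ID addressing or dataset_pid for
    persistent-identifier addressing; version is optional (latest if omitted).
    """
    if dataset_pid is not None:
        # PID addressing uses the :persistentId placeholder plus a query parameter
        base = f"{server_url}/api/datasets/:persistentId"
        suffix = f"?persistentId={dataset_pid}"
    else:
        base = f"{server_url}/api/datasets/{dataset_id}"
        suffix = ""
    if version is not None:
        base += f"/versions/{version}"
    return f"{base}/metadata{suffix}"

print(metadata_url("https://demo.dataverse.org", dataset_id="12345", version="1.0"))
# https://demo.dataverse.org/api/datasets/12345/versions/1.0/metadata
```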


Add Dataset Metadata
--------------------

To add json-ld formatted metadata for a Dataset, specify the Dataset ID (DATASET_ID) or Persistent identifier (DATASET_PID). Adding '?replace=true' will overwrite an existing metadata value. The default (replace=false) will only add new metadata or add a new value to a multi-valued field.

.. code-block:: bash

export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export DATASET_ID='12345'
export DATASET_PID='doi:10.5072/FK2A1B2C3'
export VERSION='1.0'
export SERVER_URL=https://demo.dataverse.org

Example 1: Change the Dataset title

curl -X PUT -H X-Dataverse-key:$API_TOKEN -H 'Content-Type: application/ld+json' -d '{"Title": "Submit menu test", "@context":{"Title": "http://purl.org/dc/terms/title"}}' "$SERVER_URL/api/datasets/$DATASET_ID/metadata?replace=true"

Example 2: Add a description using the DATASET PID

curl -X PUT -H X-Dataverse-key:$API_TOKEN -H 'Content-Type: application/ld+json' -d '{"citation:Description": {"dsDescription:Text": "New description"}, "@context":{"citation": "https://dataverse.org/schema/citation/","dsDescription": "https://dataverse.org/schema/citation/dsDescription#"}}' "$SERVER_URL/api/datasets/:persistentId/metadata?persistentId=$DATASET_PID"

You should expect a 200 ("OK") response indicating whether a draft Dataset version was created or an existing draft was updated.
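The replace semantics described above (replace=false only adds; replace=true overwrites, but still appends for multi-valued fields) can be modeled in a few lines. This is a toy model of the documented behavior, not the server's implementation; the function name and error handling are hypothetical.

```python
def apply_update(existing, term, value, multi_valued, replace=False):
    """Toy model of the semantic API's replace parameter."""
    if term not in existing:
        existing[term] = [value] if multi_valued else value
    elif multi_valued:
        existing[term].append(value)  # multi-valued fields always append
    elif replace:
        existing[term] = value        # single-valued: overwrite only with replace=true
    else:
        raise ValueError(f"{term} already has a value; use replace=true")
    return existing

doc = {"http://purl.org/dc/terms/title": "Old title"}
apply_update(doc, "http://purl.org/dc/terms/title", "New title",
             multi_valued=False, replace=True)
```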


Delete Dataset Metadata
-----------------------

To delete metadata for a Dataset, send a json-ld representation of the fields to delete and specify the Dataset ID (DATASET_ID) or Persistent identifier (DATASET_PID).

.. code-block:: bash

export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export DATASET_ID='12345'
export DATASET_PID='doi:10.5072/FK2A1B2C3'
export VERSION='1.0'
export SERVER_URL=https://demo.dataverse.org

Example: Delete the TermsOfUseAndAccess 'restrictions' value 'No restrictions' for the latest version using the DATASET PID

curl -X PUT -H X-Dataverse-key:$API_TOKEN -H 'Content-Type: application/ld+json' -d '{"https://dataverse.org/schema/core#restrictions":"No restrictions"}' "$SERVER_URL/api/datasets/:persistentId/metadata/delete?persistentId=$DATASET_PID"

Note, this example uses the term URI directly rather than adding an '@context' element. You can use either form in any of these API calls.
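The equivalence noted here (a term URI used directly vs. a short key mapped through '@context') is what JSON-LD expansion produces. The toy expander below handles only the flat, single-level cases used in these examples; a real client would use a JSON-LD library such as titanium-json-ld, which this PR adds on the server side.

```python
def expand(doc):
    """Minimal JSON-LD-style expansion for flat documents (illustration only)."""
    context = doc.get("@context", {})
    out = {}
    for key, value in doc.items():
        if key == "@context":
            continue
        if ":" in key and not key.startswith("http"):
            prefix, local = key.split(":", 1)  # prefixed form, e.g. core:restrictions
            key = context.get(prefix, prefix + ":") + local
        else:
            key = context.get(key, key)        # direct mapping, or already a full URI
        out[key] = value
    return out

direct = {"https://dataverse.org/schema/core#restrictions": "No restrictions"}
compact = {"core:restrictions": "No restrictions",
           "@context": {"core": "https://dataverse.org/schema/core#"}}
```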

You should expect a 200 ("OK") response indicating whether a draft Dataset version was created or an existing draft was updated.


Create a Dataset
----------------

Specifying the Content-Type as application/ld+json with the existing /api/dataverses/{id}/datasets API call (see :ref:`create-dataset-command`) supports using the same metadata format when creating a Dataset.

With curl, this is done by adding the following header:

.. code-block:: bash

-H 'Content-Type: application/ld+json'

.. code-block:: bash

export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org
export DATAVERSE_ID=root
export PERSISTENT_IDENTIFIER=doi:10.5072/FK27U7YBV

curl -H X-Dataverse-key:$API_TOKEN -H 'Content-Type: application/ld+json' -X POST $SERVER_URL/api/dataverses/$DATAVERSE_ID/datasets --upload-file dataset-create.jsonld

An example jsonld file is available at :download:`dataset-create.jsonld <../_static/api/dataset-create.jsonld>`

1 change: 1 addition & 0 deletions doc/sphinx-guides/source/developers/index.rst
@@ -35,4 +35,5 @@ Developer Guide
big-data-support
aux-file-support
s3-direct-upload-api
dataset-semantic-metadata-api
workflows
19 changes: 18 additions & 1 deletion pom.xml
@@ -173,9 +173,21 @@
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.json</artifactId>
<version>1.0.4</version>
<version>1.1.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.skyscreamer</groupId>
<artifactId>jsonassert</artifactId>
<version>1.5.0</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>com.vaadin.external.google</groupId>
<artifactId>android-json</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
@@ -218,6 +230,11 @@
<artifactId>aws-java-sdk-s3</artifactId>
<!-- no version here as managed by BOM above! -->
</dependency>
<dependency>
<groupId>com.apicatalog</groupId>
<artifactId>titanium-json-ld</artifactId>
<version>0.8.6</version>
</dependency>
<dependency>
<!-- required by org.swordapp.server.sword2-server -->
<groupId>org.apache.abdera</groupId>
5 changes: 5 additions & 0 deletions scripts/api/data/metadatablocks/migration.tsv
@@ -0,0 +1,5 @@
#metadataBlock name dataverseAlias displayName blockURI
Member:
Do we need this migration.tsv? It isn't loaded by default on new installations?

Member Author:
It is not required and is not auto-loaded. It's a bit of a complex situation: The RDA grant had, as one goal, the idea of being able to import Bags created by other repos. An expectation there is that the other repo (a Dataverse with different blocks or some other repo software entirely) could have metadata that doesn't fit into the receiving Dataverse's schema (including blocks). So - the metadataOnOrig field is a json structured field that can store any/all metadata that doesn't match a known field. So a transfer can be done without losing metadata. That said, putting metadata in this field makes it less useful than if there were a matching field. The current code will ignore metadata that doesn't match if this block is not installed and will use it if it is. I can explain that, or remove the code from the PR. For migration cases, I think even this limited functionality could be useful, but I suspect that most people using the 'experimental' API here won't want to enable it.
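The fallback described in this reply can be illustrated abstractly: terms in an incoming OAI-ORE map that match no installed field are preserved in a single JSON blob rather than dropped. A hypothetical sketch (the field and term names are invented for illustration; this is not the PR's actual import code):

```python
import json

# Terms the receiving installation knows about (illustrative subset)
known_terms = {"http://purl.org/dc/terms/title"}

# Incoming metadata from another repository's Bag
incoming = {
    "http://purl.org/dc/terms/title": "Darwin's Finches",
    "https://example.org/schema/custom#ringingStation": "Daphne Major",
}

# Matched terms map to real fields; everything else lands in metadataOnOrig
matched = {t: v for t, v in incoming.items() if t in known_terms}
metadata_on_orig = json.dumps(
    {t: v for t, v in incoming.items() if t not in known_terms})
```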

Member:
This was all non-obvious to me from a quick look at the code. If it's on the table to remove the metadataOnOrig functionality, I think we should. If you get other opinions that we should keep it, please document how it works.

migration Migrated Metadata https://dataverse.org/schema/migration/
#datasetField name title description watermark fieldType displayOrder displayFormat advancedSearchField allowControlledVocabulary allowmultiples facetable displayoncreate required parent metadatablock_id termURI
metadataOnOrig Metadata on the original source of migrated datasets. textbox 1 FALSE FALSE FALSE FALSE FALSE FALSE migration https://dataverse.org/schema/core#metadataOnOrig
#controlledVocabulary DatasetField Value identifier displayOrder
26 changes: 26 additions & 0 deletions scripts/search/tests/data/dataset-finch1.jsonld
@@ -0,0 +1,26 @@

{
"http://purl.org/dc/terms/title": "Darwin's Finches",
"http://purl.org/dc/terms/subject": "Medicine, Health and Life Sciences",
"http://purl.org/dc/terms/creator": {
"https://dataverse.org/schema/citation/author#Name": "Finch, Fiona",
"https://dataverse.org/schema/citation/author#Affiliation": "Birds Inc."
},
Member:
How do I specify multiple authors?

Member Author:
An array of objects with the name/affiliation keys as the value of the creator key. This is the same way the OAI-ORE export shows multiple authors.
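The array form described in this reply can be written out concretely. A sketch (the second author entry is invented for illustration):

```python
import json

creators = {
    "http://purl.org/dc/terms/creator": [
        {
            "https://dataverse.org/schema/citation/author#Name": "Finch, Fiona",
            "https://dataverse.org/schema/citation/author#Affiliation": "Birds Inc.",
        },
        {
            "https://dataverse.org/schema/citation/author#Name": "Darwin, Charles",
            "https://dataverse.org/schema/citation/author#Affiliation": "HMS Beagle",
        },
    ]
}
print(json.dumps(creators, indent=2))
```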

Member:
That's what I figured. When I was working on the Schema.org JSON-LD output I was shocked by how loosey goosey JSON-LD is.

At some point it might behoove us to add a little crash course on JSON-LD to the guides, or at least plenty of examples so that users get the hang of it. Otherwise, I think we can anticipate questions like "How do I specify multiple authors?"

Member Author:
Having a human-readable term label (that probably isn't globally unique/machine interpretable) while still supporting strict machine readability adds some complexity, but it's pretty straightforward to either never use an @context or to always use a standard/static one.

Regardless, it wouldn't be hard to add documentation over time.

Member:
👍 for more docs over time. I think we can live with what we have now, especially since it's experimental.

"https://dataverse.org/schema/citation/Contact": {
"https://dataverse.org/schema/citation/datasetContact#E-mail": "finch@mailinator.com",
"https://dataverse.org/schema/citation/datasetContact#Name": "Finch, Fiona"
},
"https://dataverse.org/schema/citation/Description": {
"https://dataverse.org/schema/citation/dsDescription#Text": "Darwin's finches (also known as the Galápagos finches) are a group of about fifteen species of passerine birds."
},
"@type": [
"http://www.openarchives.org/ore/terms/Aggregation",
"http://schema.org/Dataset"
],
"http://schema.org/version": "DRAFT",
"http://schema.org/name": "Darwin's Finches",
"https://dataverse.org/schema/core#fileTermsOfAccess": {
"https://dataverse.org/schema/core#fileRequestAccess": false
},
"http://schema.org/includedInDataCatalog": "Root"
}
2 changes: 1 addition & 1 deletion src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
@@ -1863,7 +1863,7 @@ public String getJsonLd() {
JsonObjectBuilder license = Json.createObjectBuilder().add("@type", "Dataset");

if (TermsOfUseAndAccess.License.CC0.equals(terms.getLicense())) {
license.add("text", "CC0").add("url", "https://creativecommons.org/publicdomain/zero/1.0/");
license.add("text", "CC0").add("url", TermsOfUseAndAccess.CC0_URI);
} else {
String termsOfUse = terms.getTermsOfUse();
// Terms of use can be null if you create the dataset with JSON.
@@ -280,7 +280,7 @@ public enum License {
* API use? See also https://github.com/IQSS/dataverse/issues/1385
*/
public static TermsOfUseAndAccess.License defaultLicense = TermsOfUseAndAccess.License.CC0;

public static String CC0_URI = "https://creativecommons.org/publicdomain/zero/1.0/";
@Override
public int hashCode() {
int hash = 0;
@@ -137,6 +137,7 @@ public Response getByName(@PathParam("name") String name) {
String solrFieldSearchable = dsf.getSolrField().getNameSearchable();
String solrFieldFacetable = dsf.getSolrField().getNameFacetable();
String metadataBlock = dsf.getMetadataBlock().getName();
String uri=dsf.getUri();
boolean hasParent = dsf.isHasParent();
boolean allowsMultiples = dsf.isAllowMultiples();
boolean isRequired = dsf.isRequired();
@@ -168,7 +169,8 @@
.add("parentAllowsMultiples", parentAllowsMultiplesDisplay)
.add("solrFieldSearchable", solrFieldSearchable)
.add("solrFieldFacetable", solrFieldFacetable)
.add("isRequired", isRequired));
.add("isRequired", isRequired)
.add("uri", uri));

} catch ( NoResultException nre ) {
return notFound(name);
@@ -356,7 +358,7 @@ public String getArrayIndexOutOfBoundMessage(HeaderType header,
int wrongIndex) {

List<String> columns = getColumnsByHeader(header);

String column = columns.get(wrongIndex - 1);
List<String> arguments = new ArrayList<>();
arguments.add(header.name());