simplified/reorganized the new dev. checklist for making a core field multiple #9634
landreev committed Jan 31, 2024
1 parent 988a8d7 commit 7d537aa
Showing 3 changed files with 18 additions and 28 deletions.
19 changes: 18 additions & 1 deletion doc/sphinx-guides/source/admin/metadatacustomization.rst
@@ -651,7 +651,24 @@ The thinking is that the tips can become issues and the issues can eventually be
Development Tasks Specific to Changing Fields in Core Metadata Blocks
---------------------------------------------------------------------

When it comes to the fields from the core blocks that are distributed with Dataverse (such as the Citation and Social Science blocks), Dataverse code, primarily in the Import and Export subsystems, may depend on these fields being configured a certain way. So, if it becomes necessary to modify one such core field (a real-life example is making a single-value-only field support multiple values), code changes may need to accompany the change in the block TSV, and some sample and test files maintained in the Dataverse source tree will need to be adjusted accordingly. An example checklist of such tasks is provided in the Development Guide; please see the :doc:`/developers/metadatablocksdev` section.
When it comes to the fields from the core blocks that are distributed with Dataverse (such as the Citation, Social Science, and Geospatial blocks), Dataverse code, primarily in the Import and Export subsystems, may depend on these fields being configured a certain way. So, if it becomes necessary to modify one such core field, code changes may need to accompany the change in the block TSV, and some sample and test files maintained in the Dataverse source tree will need to be adjusted accordingly.

Making a Field Multi-Valued
~~~~~~~~~~~~~~~~~~~~~~~~~~~

As a recent real-life example, a few fields from the Citation and Social Science blocks were changed to support multiple values in order to accommodate the specific needs of some community member institutions. A PR for one of these fields, ``alternativeTitle`` from the Citation block, is linked below. Each time, a number of code changes had to be made, along with changes to some sample metadata files in the Dataverse code tree. The checklist below is meant to help another developer in the event that a similar change becomes necessary in the future. Note that some of the steps may not apply 1:1 to a different metadata field, depending on how Dataverse exports and imports it in various formats. It may help to consult PR `#9440 <https://github.com/IQSS/dataverse/pull/9440/files>`_ as a specific example of the changes that had to be made for the ``alternativeTitle`` field.

- Change the value from ``FALSE`` to ``TRUE`` in the ``allowmultiples`` column of the .tsv file for the block (see the TSV sketch after this list).
- Change the value of the ``multiValued`` attribute for the search field in the Solr schema (``conf/solr/9.3.0/schema.xml`` as of this writing); see the Solr sketch after this list.
- Modify the DDI import code (``ImportDDIServiceBean.java``) to support multiple values. (You may be able to use the change in the PR above as a model; a generic sketch of the pattern follows this list.)
- Modify the DDI export utility (``DdiExportUtil.java``).
- Modify the OpenAire export utility (``OpenAireExportUtil.java``).
- Modify the following JSON source files in the Dataverse code tree so that they actually include multiple values for the field (two should be quite enough!): ``scripts/api/data/dataset-create-new-all-default-fields.json``, ``src/test/java/edu/harvard/iq/dataverse/export/dataset-all-defaults.txt``, ``src/test/java/edu/harvard/iq/dataverse/export/ddi/dataset-finch1.json`` and ``src/test/java/edu/harvard/iq/dataverse/export/ddi/dataset-create-new-all-ddi-fields.json``. (These are used as examples for populating datasets via the import API and by the automated import and export code tests; see the JSON sketch after this list.)
- Similarly, modify the following XML files that are used by the DDI export code tests: ``src/test/java/edu/harvard/iq/dataverse/export/ddi/dataset-finch1.xml`` and ``src/test/java/edu/harvard/iq/dataverse/export/ddi/exportfull.xml`` (see the DDI XML sketch after this list).
- Make sure all the automated Unit and Integration tests are passing.
- Write a short release note to announce the change in the upcoming release.
- Make a Pull Request.
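
A simplified sketch of the TSV change (columns abbreviated with ``...``; the real file is tab-separated and has many more columns, so copy the exact layout from the actual block TSV rather than from this snippet):

.. code-block:: text

    #datasetField   name                ...   allowmultiples   ...
                    alternativeTitle    ...   TRUE             ...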
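
In the Solr schema, the change amounts to flipping the ``multiValued`` attribute on the field definition. A sketch of what the ``alternativeTitle`` entry could look like (the field type and other attributes here are illustrative; keep whatever the existing ``schema.xml`` entry uses, and note that the attribute may simply be absent before the change, since Solr defaults to single-valued):

.. code-block:: xml

    <!-- before -->
    <field name="alternativeTitle" type="text_en" stored="true" indexed="true" multiValued="false"/>
    <!-- after -->
    <field name="alternativeTitle" type="text_en" stored="true" indexed="true" multiValued="true"/>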
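
The import-side change generally boils down to replacing a single-value variable with a list that collects every occurrence of the element. The snippet below is a self-contained illustration of that pattern using plain StAX; it is not Dataverse's actual ``ImportDDIServiceBean`` code (see PR #9440 for that):

.. code-block:: java

    import java.io.StringReader;
    import java.util.ArrayList;
    import java.util.List;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class AltTitleImportSketch {
        public static void main(String[] args) throws Exception {
            String ddiFragment = "<titlStmt>"
                    + "<titl>Main Title</titl>"
                    + "<altTitl>First Alternative</altTitl>"
                    + "<altTitl>Second Alternative</altTitl>"
                    + "</titlStmt>";
            XMLStreamReader xml = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new StringReader(ddiFragment));
            // Single-value import logic would keep one String and overwrite it on
            // every <altTitl>; the multi-valued version collects all of them.
            List<String> altTitles = new ArrayList<>();
            while (xml.hasNext()) {
                if (xml.next() == XMLStreamConstants.START_ELEMENT
                        && "altTitl".equals(xml.getLocalName())) {
                    altTitles.add(xml.getElementText());
                }
            }
            System.out.println(altTitles); // [First Alternative, Second Alternative]
        }
    }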
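
In the native JSON used by those sample files, making the field multi-valued typically means setting ``"multiple"`` to ``true`` and turning the single value into an array. A sketch of what an ``alternativeTitle`` entry could look like (surrounding structure abbreviated, sample values made up):

.. code-block:: json

    {
      "typeName": "alternativeTitle",
      "multiple": true,
      "typeClass": "primitive",
      "value": [
        "Alternative Title One",
        "Alternative Title Two"
      ]
    }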
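
On the DDI side, multiple values appear as repeated elements. For ``alternativeTitle`` that means repeated ``altTitl`` elements inside ``titlStmt``; a minimal sketch (nesting abbreviated, sample values made up):

.. code-block:: xml

    <titlStmt>
        <titl>Main Title</titl>
        <altTitl>Alternative Title One</altTitl>
        <altTitl>Alternative Title Two</altTitl>
    </titlStmt>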


Footnotes
---------
1 change: 0 additions & 1 deletion doc/sphinx-guides/source/developers/index.rst
@@ -31,7 +31,6 @@ Developer Guide
making-releases
making-library-releases
metadataexport
metadatablocksdev
tools
unf/index
make-data-count
26 changes: 0 additions & 26 deletions doc/sphinx-guides/source/developers/metadatablocksdev.rst

This file was deleted.
