Used for blackbox testing and data-ingestion procedures.
Make sure that your email server is NOT running, because some of the endpoints used during import send emails to the input email addresses. For example, the endpoint for creating new registration data automatically sends an email, which we don't want because we use this endpoint for importing existing data.
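A quick way to double-check that nothing is accepting mail locally before you start the import (a minimal sketch; it assumes the mail server would listen on `localhost:25`, adjust host and port to your setup):

```python
import socket

def smtp_is_listening(host="localhost", port=25, timeout=2.0):
    """Return True if something accepts TCP connections on the given SMTP host/port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if smtp_is_listening():
        print("WARNING: an SMTP server is listening - stop it before importing!")
    else:
        print("OK: no SMTP server detected on localhost:25.")
```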
- Python 3.8+ (tested with 3.8.10 and 3.11)
- Install CLARIN-DSpace7.* (postgres, solr, dspace backend)
  - 2.1. Clone python-api: https://github.com/ufal/dspace-python-api (branch `main`)
  - 2.2. Clone submodules: `git submodule update --init libs/dspace-rest-python/`
- Install Python dependencies:

  ```
  pip install -r requirements.txt
  pip install -r libs/dspace-rest-python/requirements.txt
  ```
- Get the database dump (old CLARIN-DSpace) and unzip it into the `input/dump` directory in the `dspace-python-api` project.
- Prepare the `dspace-python-api` project for migration: copy the files used during migration into the `input/` directory:

  ```
  > ls -R ./input
  input:
  dump  icon

  input/dump:
  clarin-dspace.sql  clarin-utilities.sql

  input/icon:
  aca.png  by.png  gplv2.png  mit.png ...
  ```

  Note: `input/icon/` contains license icons (PNG files).
- Copy the `assetstore` from dspace5 to dspace7 (for bitstream import). The `assetstore` is in the folder where you have installed DSpace: `dspace/assetstore`.
- Create the `dspace` database with the `pgcrypto` extension.
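  A minimal sketch of this step using `psycopg2` (the database name and the `pgcrypto` extension come from this README; host, user and password are placeholder assumptions, and you can equally run the same statements in `psql`):

  ```python
  import psycopg2

  # Connect to the default "postgres" database to create the new "dspace" database.
  conn = psycopg2.connect(dbname="postgres", user="postgres", password="postgres", host="localhost")
  conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction
  with conn.cursor() as cur:
      cur.execute("CREATE DATABASE dspace")
  conn.close()

  # Enable the pgcrypto extension inside the new database.
  conn = psycopg2.connect(dbname="dspace", user="postgres", password="postgres", host="localhost")
  with conn:  # commits on success
      with conn.cursor() as cur:
          cur.execute("CREATE EXTENSION IF NOT EXISTS pgcrypto")
  conn.close()
  ```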
- Go to `dspace/bin` in the dspace7 installation and run the command `dspace database migrate force` (force because of local types). NOTE: `dspace database migrate force` creates default database data that may not be in the database dump, so after migration some tables may contain more data than the dump. Data from the database dump that already exists in the database is not migrated.
- Create an admin by running the command `dspace create-administrator` in `dspace/bin`.
- Create the CLARIN-DSpace5.* databases (dspace, utilities) from the dump. Run `scripts/start.local.dspace.db.bat` or use `scripts/init.dspacedb5.sh` directly with your database.
- Update `project_settings.py`.
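  The only part of this file shown in this README is the `"input"` block used for test data (see the testing section below); everything else in the sketch is a hypothetical placeholder to illustrate the kind of values you will typically have to adjust (paths, database credentials, backend endpoint):

  ```python
  # src/project_settings.py - illustrative sketch only; apart from the "input"
  # block (documented later in this README), the key names are hypothetical.
  import os

  _this_dir = os.path.dirname(os.path.abspath(__file__))

  settings = {
      "input": {
          "test": os.path.join(_this_dir, "../input/test"),
          "test_json_filename": "test.json",
      },
      # Hypothetical examples of what you will likely need to fill in:
      # "db_dspace_5": {"name": "clarin-dspace", "user": "...", "password": "..."},
      # "backend": {"endpoint": "http://localhost:8080/server/api/", "user": "...", "password": "..."},
  }
  ```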
- Make sure that handle prefixes are configured in the backend configuration (`dspace.cfg`):
  - Set your main handle prefix in `handle.prefix`
  - Add all other handle prefixes to `handle.additional.prefixes`
  - Note: the main prefix should NOT be included in `handle.additional.prefixes`
  - Example:

    ```
    handle.prefix = 123456789
    handle.additional.prefixes = 11858, 11234, 11372, 11346, 20.500.12801, 20.500.12800
    ```
- Import: run the command

  ```
  cd ./src && python repo_import.py
  ```

  - NOTE: the database must be up to date (`dspace database migrate force` must be called in `dspace/bin`)
  - NOTE: the dspace server must be running
- The values of table attributes that describe the last modification time of a dspace object (for example the attribute `last_modified` in the table `Item`) contain the time when that object was migrated, not the value from the migrated database dump.
- If you don't have valid and complete data, not all data will be imported.
- Check whether the license link contains XXX. This is of course unsuitable for a production run!
Use the `tools/repo_diff` utility; see its README.
The migration script supports testing with empty tables to verify the import process without actual data. Before using the `--test` option, you need to create the test JSON file:
- Create the test JSON file: create a file named `test.json` in the `input/test/` directory with the following content: `null`
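  For example, one way to create it (a trivial sketch using only the standard library, run from the project root):

  ```python
  from pathlib import Path

  # Create input/test/test.json containing just `null`
  test_dir = Path("input/test")
  test_dir.mkdir(parents=True, exist_ok=True)
  (test_dir / "test.json").write_text("null")
  ```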
- Configure the test settings: the test configuration is set in `src/project_settings.py`:

  ```
  "input": {
      "test": os.path.join(_this_dir, "../input/test"),
      "test_json_filename": "test.json",
  }
  ```

  You can change `test_json_filename` to use a different filename if needed.
To run the migration with empty table testing, use the `--test` option followed by the table names you want to test with empty data.

```
cd ./src && python repo_import.py --test usermetadatas
cd ./src && python repo_import.py --test usermetadatas resourcepolicies
```
When the `--test` option is specified with table names:

- Instead of loading actual data from database exports, the system loads the configured test JSON file (default: `test.json`), which contains `null`
- This simulates empty tables during the import process
- The migration logic is tested without requiring actual data
- The test JSON filename can be customized in `project_settings.py` under `"input"["test_json_filename"]`
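The substitution this describes is straightforward; a minimal sketch of the idea (not the project's actual code, the function and file names are illustrative), assuming table data is normally read from per-table JSON exports:

```python
import json
import os

def load_table(table_name, export_dir, test_tables, settings):
    """Load exported table data, or the configured test JSON for tables selected via --test."""
    if table_name in test_tables:
        # Simulate an empty table: the configured test file contains only `null`.
        test_path = os.path.join(settings["input"]["test"],
                                 settings["input"]["test_json_filename"])
        with open(test_path) as f:
            return json.load(f)  # -> None
    with open(os.path.join(export_dir, f"{table_name}.json")) as f:
        return json.load(f)
```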