Add DataCite API #28

Merged: 4 commits, Jun 21, 2018
55 changes: 55 additions & 0 deletions src/DataCite.jl
@@ -0,0 +1,55 @@
struct DataCite <: DataRepo
end

base_url(::DataCite) = "https://api.datacite.org/works/"

function description(repo::DataCite, mainpage)
attributes = mainpage["attributes"]
desc = attributes["description"]
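# each author entry is a collection of name-part pairs; take the value of
# each pair and join the parts with a space to form one author string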
authors = join.([[names[2] for names in value] for value in attributes["author"]], " ")
author = format_authors(authors)
license = attributes["license"]
date = attributes["published"]
paper = format_papers(authors, date, attributes["title"] *
" [Data set]. " * attributes["container-title"] * ".", mainpage["id"])

escape_multiline_string("""
Author: $(author)
License: $(license)
Date: $(date)

$(desc)

Please cite this paper:
$(paper)
if you use this in your research.
""", "\$")
end

function get_urls(repo::DataCite, page)
urls = ["PUT DOWNLOAD URL HERE"]
info("DataCite-based generation can only generate partial registration blocks, as DataCite metadata does not (currently) include the URL to the resource. You will have to fill in the URL after generation.")
urls
end

function get_checksums(repo::DataCite, page)
nothing
end

function data_fullname(::DataCite, mainpage)
mainpage["attributes"]["title"]
end

function website(repo::DataCite, mainpage_url)
replace(mainpage_url, base_url(repo), "https://doi.org/")
end

function mainpage_url(repo::DataCite, dataname)
try
identifier = match_doi(dataname)
url = base_url(repo) * identifier
JSON.parse(text_only(getpage(url).root))["data"], url
catch err
# any failure above (no DOI match, request error, unexpected JSON) lands here
error("Please provide a valid DOI or DataCite URL")
end
end
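
For orientation, a minimal sketch of what mainpage_url fetches, querying the same /works/ endpoint directly. It uses HTTP.jl and JSON.jl instead of this package's getpage/text_only helpers, and the example DOI is borrowed from the tests below; treat it as an illustration under those assumptions, not part of the change:

using HTTP, JSON

doi = "10.5281/zenodo.1147572"  # example DOI, also used in the tests below
resp = HTTP.get("https://api.datacite.org/works/" * doi)
data = JSON.parse(String(resp.body))["data"]

# the same fields that description() and data_fullname() read:
println(data["attributes"]["title"])
println(data["attributes"]["published"])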
13 changes: 12 additions & 1 deletion src/DataDepsGenerators.jl
@@ -3,7 +3,7 @@ using Gumbo, Cascadia, AbstractTrees
using Suppressor
using JSON

export generate, UCI, GitHub, DataDryad, DataOneV1, DataOneV2, CKAN
export generate, UCI, GitHub, DataDryad, DataOneV1, DataOneV2, CKAN, DataCite

abstract type DataRepo end

@@ -38,6 +38,7 @@ include("DataDryad.jl")
include("DataOneV1.jl")
include("DataOneV2/DataOneV2.jl")
include("CKAN.jl")
include("DataCite.jl")


function message(meta)
@@ -108,6 +109,16 @@ function format_authors(authors::Vector)
end
end

function format_papers(authors::Vector, year::String, name::String, link::String)
# APA format; other citation formats could be added here later.
join(authors, ", ") * " ($year). " * name * " " * link
end

function match_doi(uri::String)
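# the regex below matches a bare DOI such as 10.5281/zenodo.1147572 anywhere in the string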
identifier = match(r"\b(10[.][0-9]{4,}(?:[.][0-9]+)*\/(?:(?![\"&\'<>])\S)+)\b", uri).match
return identifier
end

website(::DataRepo, mainpage_url) = mainpage_url

function mainpage_url(repo::DataRepo, dataname)
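
A rough illustration of the two helpers added above, with inputs borrowed from the test data (the title is abbreviated here):

# match_doi pulls the bare DOI out of a plain DOI string or a resolver/search URL
match_doi("https://search.datacite.org/works/10.5281/zenodo.1147572")
# => "10.5281/zenodo.1147572"

# format_papers assembles a single APA-style citation line
format_papers(["Rama Arora"], "2018",
              "Study Of Surface Roughness ... [Data set]. Zenodo.",
              "https://doi.org/10.5281/zenodo.1147572")
# => "Rama Arora (2018). Study Of Surface Roughness ... [Data set]. Zenodo. https://doi.org/10.5281/zenodo.1147572"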
9 changes: 9 additions & 0 deletions test/DataCite.jl
@@ -0,0 +1,9 @@
using DataDepsGenerators
using Base.Test

using ReferenceTests

@testset "DataCite test" begin
@test_reference "references/DataCite Fire Patch.txt" generate(DataCite(), "https://search.datacite.org/works/10.15148/0e999ffc-e220-41ac-ac85-76e92ecd0320")
@test_reference "references/DataCite Ceramic.txt" generate(DataCite(), "10.5281/zenodo.1147572")
end
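
To eyeball a generated block outside the reference-test machinery, something like the following should work once the package is loaded (the exact output depends on what the DataCite API returns, and the download URL still has to be filled in by hand):

using DataDepsGenerators
print(generate(DataCite(), "10.5281/zenodo.1147572"))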
19 changes: 19 additions & 0 deletions test/references/DataCite Ceramic.txt
@@ -0,0 +1,19 @@
register(DataDep(
"Study Of Surface Roughness With The Variation In Applied Load Of Rutile Ceramic Reinforced Aluminium Composite",
"""
Dataset: Study Of Surface Roughness With The Variation In Applied Load Of Rutile Ceramic Reinforced Aluminium Composite
Website: https://doi.org/10.5281/zenodo.1147572
Author: Rama Arora
License: https://creativecommons.org/licenses/by/4.0/
Date: 2018

The objective of this work is to study the surface roughness of LM13alloy composites with rutile particles .The fabrication route adopted for preparing the samples containing variable ratio of rutile reinforcement is simple vortex technique. The wear tests were carried out under different loading conditions from 9.8N to 49N. The pin specimen travelled a distance of 3000m at constant sliding speed on the hard steel disc. The addition of fine size rutile particles results in higher hardness and strength. The stress concentration at the voids due to weak interfaces leads to crack intitation, arising from the particle fracture .This can be avoided by providing more strength to the matrix which is achieved by introducing hard ceramic rutile particulates. As the soft matrix aluminium alloy is prone to scratches and indentation during the contact sliding conditions, study of surface roughness of composite after wear studies need significant attention.

Please cite this paper:
Rama Arora (2018). Study Of Surface Roughness With The Variation In Applied Load Of Rutile Ceramic Reinforced Aluminium Composite [Data set]. Zenodo. https://doi.org/10.5281/zenodo.1147572
if you use this in your research.

""",
String["PUT DOWNLOAD URL HERE"],

))
29 changes: 29 additions & 0 deletions test/references/DataCite Fire Patch.txt
@@ -0,0 +1,29 @@
register(DataDep(
"List of fire patch properties computed and associated NetCDF maps from the MCD64A1 Collection 6 (2000-2016) and the MERIS fire_cci v4.1 (2005-2011) BA products",
"""
Dataset: List of fire patch properties computed and associated NetCDF maps from the MCD64A1 Collection 6 (2000-2016) and the MERIS fire_cci v4.1 (2005-2011) BA products
Website: https://doi.org/10.15148/0e999ffc-e220-41ac-ac85-76e92ecd0320
Author: Pierre Laurent et al.
License: nothing
Date: 2018

Fire patches computed from the MCD64A1 Collection 6 and the MERIS fire_cci v4.1 (2005-2011) burned area
datasets. A flood-filling algorithm with a fixed cut-off parameter has been used to group burned pixels into
fire patches. The listed morphological properties in the csv files are (in order of appearance) : ID, Fire
ID, Minimum Burn Date, Maximum Burn Date, Mean Burn Date, Year, Number of Pixels, Number of Core Pixels,
Area of the patch (ha), Core Area of the patch (ha), Perimeter, Perimeter to Area Ratio, Shape Index,
Fractal Correlation Dimension, Core Area Index, Longitudinal coordinate of the center of the patch,
Latitudinal coordinate of the center of the patch, Minor half-axis of the SDE (in degrees), Major and minor
half-axes of the SDE (in degrees, in lonlat projection), major and minor half-axes of the SDE (in
kilometers, in local flat projection), orientation of the SDE (with respect to North, clockwise, in
degrees), orientation of the SDE in flat projection (with respect to North, clockwise, in degrees),
eccentricity of the SDE, Ratio of SDE half axes. The NetCDF files are self-described.

Please cite this paper:
Pierre Laurent, Florent Mouillot, Chao Yue, Maria Vanesa Moreno Dominguez, Philippe Ciais, Joana M.P. Nogueira (2018). List of fire patch properties computed and associated NetCDF maps from the MCD64A1 Collection 6 (2000-2016) and the MERIS fire_cci v4.1 (2005-2011) BA products [Data set]. OSU OREME. https://doi.org/10.15148/0e999ffc-e220-41ac-ac85-76e92ecd0320
if you use this in your research.

""",
String["PUT DOWNLOAD URL HERE"],

))
1 change: 1 addition & 0 deletions test/runtests.jl
@@ -9,6 +9,7 @@ tests = [
"DataOneV2/KNB",
"DataOneV2/TERN",
"CKAN",
"DataCite",
"format_checksum"
]
