
#100 migrate DD reference sheet format to flattened sheet #152


Closed
2 changes: 2 additions & 0 deletions .gitignore
@@ -7,3 +7,5 @@ build/
*.log
*.iml
.run/

.DS_Store
13 changes: 0 additions & 13 deletions .run/generateDDAcceptanceTests.run.xml

This file was deleted.

22 changes: 3 additions & 19 deletions LICENSE
@@ -1,21 +1,5 @@
MIT License
By downloading this resource, you agree to the RESO EULA.

Copyright (c) 2019 Joshua Darnell
https://www.reso.org/eula/

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2019 RESO (dev@reso.org)
1 change: 0 additions & 1 deletion build.gradle
@@ -73,7 +73,6 @@ jar {
exclude 'META-INF/*.DSA'
}


// don't suppress warnings or deprecation notices
tasks.withType(JavaCompile).configureEach {
options.compilerArgs << '-Xlint:unchecked'
3 changes: 0 additions & 3 deletions doc/CLI.md
@@ -61,9 +61,6 @@ usage: java -jar web-api-commander
using numeric keys.
--generateReferenceEDMX Generates reference metadata in EDMX
format.
--generateResourceInfoModels Generates Java Models for the Web API
Reference Server in the current
directory.
--getMetadata Fetches metadata from <serviceRoot>
using <bearerToken> and saves results
in <outputFile>.
11 changes: 0 additions & 11 deletions doc/Codegen.md
@@ -18,17 +18,6 @@ New Cucumber BDD acceptance tests will be generated and placed in a timestamped

To update the current tests, copy the newly generated ones into the [Data Dictionary BDD `.features` directory](src/main/java/org/reso/certification/features/data-dictionary/v1-7-0), run the `./gradlew build` task, and if everything works as expected, commit the newly generated tests.

## Generating RESO Web API Reference Server Data Models
The RESO Commander can be used to generate data models for the Web API Reference server from the currently approved [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).

The Commander project's copy of the sheet needs to be updated with a copy of the [DD Google Sheet](https://docs.google.com/spreadsheets/d/1SZ0b6T4_lz6ti6qB2Je7NSz_9iNOaV_v9dbfhPwWgXA/edit?usp=sharing) prior to generating reference metadata.

```
$ java -jar path/to/web-api-commander.jar --generateResourceInfoModels
```
New ResourceInfo Models for the Web API Reference Server will be generated and placed in a timestamped directory relative to your current path.


## Generating RESO Data Dictionary Reference Metadata
In addition to generating DD acceptance tests, the RESO Commander can generate reference metadata based on the current reference [Data Dictionary Spreadsheet](src/main/resources/RESODataDictionary-1.7.xlsx).

1 change: 1 addition & 0 deletions gradle.properties
@@ -1,2 +1,3 @@
org.gradle.jvmargs=-Xmx28g
org.gradle.warning.mode=all
org.gradle.java.installations.auto-detect=true
6 changes: 3 additions & 3 deletions settings.gradle
@@ -4,7 +4,7 @@
* The settings file is used to specify which projects to include in your build.
*
* Detailed information about configuring a multi-project build in Gradle can be found
* in the user manual at https://docs.gradle.org/5.2.1/userguide/multi_project_builds.html
* in the user manual at https://docs.gradle.org/8.0.2/userguide/multi_project_builds.html
* This project uses @Incubating APIs which are subject to change.
*/

rootProject.setName('web-api-commander')
rootProject.name = 'web-api-commander'
60 changes: 27 additions & 33 deletions src/main/java/org/reso/certification/codegen/BDDProcessor.java
@@ -2,7 +2,6 @@

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.poi.ss.usermodel.Sheet;
import org.reso.commander.common.Utils;
import org.reso.models.ReferenceStandardLookup;
import org.reso.models.ReferenceStandardField;
@@ -17,63 +16,57 @@

public class BDDProcessor extends WorksheetProcessor {
private static final Logger LOG = LogManager.getLogger(BDDProcessor.class);
private static final String
FEATURE_EXTENSION = ".feature",
LOCKED_WITH_ENUMERATIONS_KEY = "Locked with Enumerations";
private static final String FEATURE_EXTENSION = ".feature";
private static final int EXAMPLES_PADDING_AMOUNT = 6;

public void processResourceSheet(Sheet sheet) {
super.processResourceSheet(sheet);
markup.append(BDDTemplates.buildHeaderInfo(sheet.getSheetName(), startTimestamp));
}

@Override
void processNumber(ReferenceStandardField row) {
markup.append(BDDTemplates.buildNumberTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildNumberTest(row));
}

@Override
void processStringListSingle(ReferenceStandardField row) {
markup.append(BDDTemplates.buildStringListSingleTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildStringListSingleTest(row));
}

@Override
void processString(ReferenceStandardField row) {
markup.append(BDDTemplates.buildStringTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildStringTest(row));
}

@Override
void processBoolean(ReferenceStandardField row) {
markup.append(BDDTemplates.buildBooleanTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildBooleanTest(row));
}

@Override
void processStringListMulti(ReferenceStandardField row) {
markup.append(BDDTemplates.buildStringListMultiTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildStringListMultiTest(row));
}

@Override
void processDate(ReferenceStandardField row) {
markup.append(BDDTemplates.buildDateTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildDateTest(row));
}

@Override
void processTimestamp(ReferenceStandardField row) {
markup.append(BDDTemplates.buildTimestampTest(row));
resourceTemplates.get(row.getResourceName()).append(BDDTemplates.buildTimestampTest(row));
}

@Override
void processCollection(ReferenceStandardField row) {
LOG.debug("Collection Type is not supported!");
void processExpansion(ReferenceStandardField field) {
//TODO: DD 2.0
}

@Override
void generateOutput() {
LOG.info("Using reference worksheet: " + REFERENCE_WORKSHEET);
LOG.info("Generating BDD .feature files for the following resources: " + resourceTemplates.keySet().toString());
resourceTemplates.forEach((resourceName, content) -> {
LOG.info("Generating BDD .feature files for the following resources: " + resourceTemplates.keySet());
resourceTemplates.forEach((resourceName, buffer) -> {
//put in local directory rather than relative to where the input file is
Utils.createFile(getDirectoryName(), resourceName.toLowerCase() + FEATURE_EXTENSION, content);
Utils.createFile(getDirectoryName(), resourceName.toLowerCase() + FEATURE_EXTENSION,
BDDTemplates.buildHeaderInfo(resourceName, startTimestamp) + buffer.toString());
});
}
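
Note: the change above replaces the single shared `markup` buffer with one `StringBuilder` per resource. Each `process*` handler appends to `resourceTemplates.get(row.getResourceName())`, and `generateOutput()` prepends a per-resource header before writing `<resource>.feature`. Below is a minimal, self-contained sketch of that pattern; the class name, method names, and header text are illustrative, not the Commander's actual API.

```
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.Map;

public class PerResourceFeatureWriter {

  // One buffer per resource, mirroring the resourceTemplates map in the diff above.
  private final Map<String, StringBuilder> resourceTemplates = new LinkedHashMap<>();

  // Append generated scenario markup to the buffer for its resource.
  public void addScenario(String resourceName, String scenarioMarkup) {
    resourceTemplates.computeIfAbsent(resourceName, name -> new StringBuilder()).append(scenarioMarkup);
  }

  // Flush each buffer to <resource>.feature, prepending a per-resource header.
  public void generateOutput(Path outputDirectory, String timestamp) throws IOException {
    Files.createDirectories(outputDirectory);
    for (Map.Entry<String, StringBuilder> entry : resourceTemplates.entrySet()) {
      String header = "# Generated for " + entry.getKey() + " at " + timestamp + "\n";
      Path featureFile = outputDirectory.resolve(entry.getKey().toLowerCase() + ".feature");
      Files.writeString(featureFile, header + entry.getValue());
    }
  }
}
```

Buffering per resource up front is what lets `generateOutput()` emit one file per resource in a single pass instead of splitting one combined markup buffer afterwards.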

Expand Down Expand Up @@ -112,8 +105,8 @@ private static ArrayList<String> buildTags(ReferenceStandardField field) {
//use this to add each field name tag
//tags.add(field.getStandardName());

if (field.getParentResourceName() != null && field.getParentResourceName().length() > 0) {
tags.add(field.getParentResourceName());
if (field.getResourceName() != null && field.getResourceName().length() > 0) {
tags.add(field.getResourceName());
}

tags.addAll(field.getPropertyTypes());
Expand All @@ -133,7 +126,7 @@ private static String generateSynonymsMarkup(ReferenceStandardField field) {

if (field.getSynonyms().size() > 0) {
template += " Given that the following synonyms for \"" + field.getStandardName()
+ "\" DO NOT exist in the \"" + field.getParentResourceName() + "\" metadata\n" +
+ "\" DO NOT exist in the \"" + field.getResourceName() + "\" metadata\n" +
field.getSynonyms().stream()
.map(synonym -> padLeft("| " + synonym + " |\n", EXAMPLES_PADDING_AMOUNT)).collect(Collectors.joining());
}
Expand All @@ -146,7 +139,7 @@ public static String buildBooleanTest(ReferenceStandardField field) {
return "\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Boolean\" data type\n";
}
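
For a hypothetical Boolean field with no synonyms, the template above produces a scenario roughly like the following; the field name, resource, tags, and indentation are made up for illustration.

```
  @Property @IDX
  Scenario: NewConstructionYN
    When "NewConstructionYN" exists in the "Property" metadata
    Then "NewConstructionYN" MUST be "Boolean" data type
```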

Expand All @@ -156,7 +149,7 @@ public static String buildDateTest(ReferenceStandardField field) {
return "\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Date\" data type\n";
}

Expand All @@ -179,7 +172,7 @@ public static String buildDecimalTest(ReferenceStandardField field) {
"\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Decimal\" data type\n";

//TODO Length is actually scale for Decimal fields by the DD! :/
@@ -203,7 +196,7 @@ public static String buildIntegerTest(ReferenceStandardField field) {
return "\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Integer\" data type\n";
}

Expand All @@ -215,7 +208,7 @@ private static String buildStandardEnumerationMarkup(String lookupName) {
markup
.append(padLeft("| ", EXAMPLES_PADDING_AMOUNT))
.append(lookup.getLookupValue()).append(" | ")
.append(lookup.getLookupDisplayName()).append(" |\n");
.append(lookup.getLegacyODataValue()).append(" |\n");
}
if (markup.length() > 0) markup.insert(0, padLeft("| lookupValue | lookupDisplayName |\n", EXAMPLES_PADDING_AMOUNT));
return markup.toString();
@@ -227,11 +220,12 @@ private static String buildStandardEnumerationMarkup(String lookupName) {

public static String buildStringListMultiTest(ReferenceStandardField field) {
if (field == null) return EMPTY_STRING;

return
"\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Multiple Enumeration\" data type\n";
}

Expand All @@ -242,7 +236,7 @@ public static String buildStringListSingleTest(ReferenceStandardField field) {
"\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Single Enumeration\" data type\n";
}

Expand All @@ -252,7 +246,7 @@ public static String buildStringTest(ReferenceStandardField field) {
"\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"String\" data type\n";

if (field.getSuggestedMaxLength() != null)
@@ -268,7 +262,7 @@ public static String buildTimestampTest(ReferenceStandardField field) {
return "\n " + buildTags(field).stream().map(tag -> "@" + tag).collect(Collectors.joining(SINGLE_SPACE)) + "\n" +
" Scenario: " + field.getStandardName() + "\n" +
generateSynonymsMarkup(field) +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getParentResourceName() + "\" metadata\n" +
" When \"" + field.getStandardName() + "\" exists in the \"" + field.getResourceName() + "\" metadata\n" +
" Then \"" + field.getStandardName() + "\" MUST be \"Timestamp\" data type\n";
}
}
@@ -10,8 +10,8 @@ public class DDCacheProcessor extends WorksheetProcessor {
new AtomicReference<>(Collections.synchronizedMap(new LinkedHashMap<>()));

private void addToFieldCache(ReferenceStandardField field) {
standardFieldCache.get().putIfAbsent(field.getParentResourceName(), new LinkedHashMap<>());
standardFieldCache.get().get(field.getParentResourceName()).put(field.getStandardName(), field);
standardFieldCache.get().putIfAbsent(field.getResourceName(), new LinkedHashMap<>());
standardFieldCache.get().get(field.getResourceName()).put(field.getStandardName(), field);
}
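
Note: `addToFieldCache` now keys the two-level cache by `getResourceName()` from the flattened sheet, giving resource name → standard field name → field. Below is a minimal sketch of that cache shape; the `Field` record is a hypothetical stand-in for `ReferenceStandardField` (assumes Java 16+), not the Commander's actual type.

```
import java.util.LinkedHashMap;
import java.util.Map;

public class StandardFieldCacheSketch {

  // Simplified stand-in for ReferenceStandardField: only the two keys the cache needs.
  public record Field(String resourceName, String standardName) { }

  // resource name -> (standard field name -> field), insertion-ordered like the processor's cache
  private final Map<String, Map<String, Field>> cache = new LinkedHashMap<>();

  public void add(Field field) {
    cache.computeIfAbsent(field.resourceName(), name -> new LinkedHashMap<>())
        .put(field.standardName(), field);
  }

  public Field lookup(String resourceName, String standardName) {
    return cache.getOrDefault(resourceName, Map.of()).get(standardName);
  }
}
```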

public Map<String, Map<String, ReferenceStandardField>> getStandardFieldCache() {
@@ -54,8 +54,8 @@ void processTimestamp(ReferenceStandardField field) {
}

@Override
void processCollection(ReferenceStandardField field) {
addToFieldCache(field);
void processExpansion(ReferenceStandardField field) {
//TODO: DD 2.0
}

@Override