[ << Back to DASH top-level README ]
Documentation comprises system descriptions, high-level design (HLD) documents, and detailed compliance requirements. These are contained in the DASH/documentation directory and its subdirectories.
The testing framework, methodology, documentation, and testing artifacts are stored in the DASH/test directory.
See also DASH FAQ and Glossary.
All DASH devices shall conform to the following design specifications and compliance requirements:
| Topic | Links to Folders |
|---|---|
| General Architecture and Requirements | Parent Folder \| Design \| Compliance Requirements |
| Dataplane | Parent Folder \| Design \| Compliance Requirements |
| High-Availability (HA) | Parent Folder \| Design \| Compliance Requirements |
| gNMI Northbound API | Parent Folder \| Design \| Compliance Requirements |
| SAI Southbound API | Parent Folder \| Design \| Compliance Requirements |
DASH devices may implement one or more of the following services. They shall conform to each service's design specifications and compliance requirements.
| Topic | Links to Folders |
|---|---|
| Load Balancer Service | Parent Folder \| Design \| Compliance Requirements |
| VNET-to-VNET Service | Parent Folder \| Design \| Compliance Requirements |
| Service Tunnel & Private Link Service | Parent Folder \| Design \| Compliance Requirements |
| VNET Peering Service | Parent Folder \| Design \| Compliance Requirements |
| Express Route (ER) Service | Parent Folder \| Design \| Compliance Requirements |
| Encryption Gateway Service | Parent Folder \| Design \| Compliance Requirements |
Documentation consists of separate, but related, System Descriptions (HLDs, architecture, theory of operations, etc.) and Compliance Requirements (hard specifications, typically numerical but also behavioral). These two types of documents are deliberately kept separate; see Relationships and Flow of Documents.
Documentation is organized into folders as follows. Each feature or topic has all of its high-level specs and compliance requirements in the same parent folder, e.g. General, High-Availability, etc., making it easier to access related information about a given topic. As complexity grows, this keeps things organized by "functional topic."
```
topic1
├── design
│   └── topic1 High-Level Descriptions and architecture
└── requirements
    └── topic1 Compliance Requirements
topic2
├── design
│   └── topic2 High-Level Descriptions
└── requirements
    └── topic2 Compliance Requirements
...
```
The diagram below shows how High-Level Descriptions beget Compliance Requirements, Compliance Requirements beget Test Cases, and Test Cases are executed by test scripts to produce Test Results.
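As a minimal, hypothetical illustration of that chain (the designators, paths, and field values below are invented for this sketch, not taken from the DASH documents), a single requirement designator can tie a compliance requirement, a test case, and a test result together:

```python
# Hypothetical traceability chain; all IDs, paths, and values are illustrative only.
requirement = {
    "id": "REQ-HA-0010",              # designator defined in a compliance-requirements doc (invented)
    "source_hld": "documentation/high-avail/design/README.md",  # assumed path, for illustration
    "text": "Failover shall complete within 2 seconds.",
    "threshold_ms": 2000,
}

test_case = {
    "id": "TC-HA-0010-01",
    "verifies": requirement["id"],    # the test case references the requirement designator
}

test_result = {
    "test_case": test_case["id"],
    "requirement": requirement["id"],
    "measured_ms": 1450,
    "outcome": "pass",                # pass because measured_ms <= threshold_ms
}
```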
Some of the guiding principles for this approach are:
- Define the objectives and the design or proposal separately from performance and requirement details.
- Describe hard requirements separately from the design and architecture descriptions. This allows the requirements to be easily defined, maintained, and referenced by other downstream "consumers," e.g., test cases. All requirements must be identified with a designator that allows traceability in test cases, scripts, and results.
- We encourage the creation of data that is simultaneously human- and machine-readable and can drive test cases.
- We must avoid burying test parameters inside the test scripts. Keeping them separate allows the requirements to be defined and maintained independently of the (often complex) code that executes the tests; see the sketch after this list.
- Many projects exist where only a programmer can locate and ferret out the actual test criteria, often expressed as hard-coded constants buried within thousands of lines of test automation code. For quality control, these criteria must be easily accessible, reviewable, and maintainable by anyone familiar with the project.
- We advocate complete auditability and traceability of test cases, test results, associated specs, and DUT/SUT configuration. This means a test run will record the version of every item, including GitHub repo commit SHA IDs, branches, tags, SW versions, API versions, etc.
- Provide clear, concise, to-the-point human-readable reports, plus machine-readable results that allow dashboards, roll-ups of results, etc.
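The sketch below shows one way these principles could fit together. It is not the DASH test framework; the requirement IDs, thresholds, and record fields are assumptions made for illustration. Test parameters live in a machine-readable requirements table rather than in the script, and each run emits a result record carrying traceability metadata such as the repo commit SHA:

```python
"""Illustrative sketch only (not the DASH test framework): parameters come from a
machine-readable requirements table, and each run emits a traceable result record."""
import json
import subprocess
from datetime import datetime, timezone

# Hypothetical human- and machine-readable requirements data; in practice this would be
# maintained alongside the compliance-requirements documents, not inside the test script.
REQUIREMENTS = {
    "REQ-DP-0042": {
        "description": "Sustain at least 1M connections per second",  # invented requirement
        "min_cps": 1_000_000,
    },
}

def evaluate(req_id: str, measured_cps: int) -> dict:
    """Compare a measurement against the requirement and return a traceable record."""
    req = REQUIREMENTS[req_id]
    commit = subprocess.run(                      # record the exact repo version under test
        ["git", "rev-parse", "HEAD"], capture_output=True, text=True
    ).stdout.strip()
    return {
        "requirement": req_id,
        "outcome": "pass" if measured_cps >= req["min_cps"] else "fail",
        "measured_cps": measured_cps,
        "repo_commit": commit,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # Machine-readable result; a human-readable report can be rendered from the same record.
    print(json.dumps(evaluate("REQ-DP-0042", measured_cps=1_200_000), indent=2))
```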