diff --git a/json-mongo-duality-general/json-collections/images/development-sql.png b/json-mongo-duality-general/json-collections/images/development-sql.png
index 2cc8819fa..6d9f479a4 100644
Binary files a/json-mongo-duality-general/json-collections/images/development-sql.png and b/json-mongo-duality-general/json-collections/images/development-sql.png differ
diff --git a/json-to-duality-migrator/0-intro/introduction.md b/json-to-duality-migrator/0-intro/introduction.md
index 91af78e68..207a8f9bb 100644
--- a/json-to-duality-migrator/0-intro/introduction.md
+++ b/json-to-duality-migrator/0-intro/introduction.md
@@ -4,6 +4,11 @@
This workshop focuses on migrating from JSON Collections to Duality Views using the JSON to Duality Migrator in Oracle Database 23ai. You will learn how to migrate apps from a document to relational model automatically without any application changes.
+Watch this quick video to learn why JSON Relational Duality is awesome.
+
+[](youtube:Eb_ytQBw2i8)
+
+
### **JSON Relational Duality**
JSON Relational Duality is a landmark capability in Oracle Database 23ai providing game-changing flexibility and simplicity for Oracle Database developers. This breakthrough innovation overcomes the historical challenges developers have faced when building applications, using relational or document models.
@@ -12,11 +17,11 @@ JSON Relational Duality helps to converge the benefits of both document and rela
Key benefits of JSON Relational Duality:
-* Experience extreme flexibility in building apps using Duality Views. Developers can access the same data relationally or as hierarchical documents based on their use case and are not forced into making compromises because of the limitations of the underlying database. Build document-centric apps on relational data or create SQL apps on documents.
-* Experience simplicity by retrieving and storing all the data needed for an app in a single database operation. Duality Views provide fully updatable JSON views over data. Apps can read a document, make necessary changes, and write the document back without worrying about underlying data structure, mapping, consistency, or performance tuning.
-* Enable flexibility and simplicity in building multiple apps on same data. Developers can define multiple Duality Views across overlapping groups of tables. This flexible data modeling makes building multiple apps against the same data easy and efficient.
-* Duality Views eliminate the inherent problem of data duplication and data inconsistency in document databases. Duality Views are fully ACID (atomicity, consistency, isolation, durability) transactions across multiple documents and tables. It eliminates data duplication across documents data, whereas consistency is maintained automatically.
-* Build apps that support high concurrency access and updates. Traditional locks don’t work well for modern apps. A new value-based concurrency control protocol provided with Duality Views supports high concurrency updates. The new protocol also works efficiently for interactive applications since the data is not locked during human thinking time.
+* **Experience extreme flexibility** in building apps using Duality Views. Developers can access the same data relationally or as hierarchical documents based on their use case and are not forced into making compromises because of the limitations of the underlying database. Build document-centric apps on relational data or create SQL apps on documents.
+* **Experience simplicity** by retrieving and storing all the data needed for an app in a single database operation. Duality Views provide fully updatable JSON views over data. Apps can read a document, make necessary changes, and write the document back without worrying about underlying data structure, mapping, consistency, or performance tuning.
+* **Enable flexibility and simplicity** in building multiple apps on the same data. Developers can define multiple Duality Views across overlapping groups of tables. This flexible data modeling makes building multiple apps against the same data easy and efficient.
+* **Eliminate the inherent problem of data duplication and data inconsistency** in document databases. Duality Views provide fully ACID (atomicity, consistency, isolation, durability) transactions across multiple documents and tables. They eliminate data duplication across documents, and consistency is maintained automatically.
+* **Support high concurrency access and updates**. Traditional locks don’t work well for modern apps. A new value-based concurrency control protocol provided with Duality Views supports high concurrency updates. The new protocol also works efficiently for interactive applications since the data is not locked during human thinking time.
### **JSON to Duality Migrator**
@@ -42,11 +47,6 @@ How does the JSON to Duality Migrator work?
3. Eliminates duplication by identifying shared data across collections
4. Uses functional dependency analysis to automatically identify primary keys for each entity and foreign keys between the identified entities
-Watch this quick video to know why JSON Relational Duality is awesome.
-
-[](youtube:Eb_ytQBw2i8)
-
-Estimated Time: 50 minutes
### Objectives
@@ -57,6 +57,8 @@ In this lab, you will:
* Migrate from JSON Collections to Duality Views using the JSON to Duality Migrator
* Use the JSON to Duality Migrator's hint infrastructure to guide relational schema design
+Estimated Time: 50 minutes
+
### Prerequisites
* Oracle Autonomous Database 23ai provisioned or one running in a LiveLabs environment
diff --git a/json-to-duality-migrator/1-json-collections/images/dbaction1.png b/json-to-duality-migrator/1-json-collections/images/dbaction1.png
new file mode 100644
index 000000000..02d407f21
Binary files /dev/null and b/json-to-duality-migrator/1-json-collections/images/dbaction1.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/development-sql.png b/json-to-duality-migrator/1-json-collections/images/development-sql.png
new file mode 100644
index 000000000..6d9f479a4
Binary files /dev/null and b/json-to-duality-migrator/1-json-collections/images/development-sql.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/sql-run-script.png b/json-to-duality-migrator/1-json-collections/images/sql-run-script.png
new file mode 100644
index 000000000..27412dddf
Binary files /dev/null and b/json-to-duality-migrator/1-json-collections/images/sql-run-script.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task2-step1a-pretty.png b/json-to-duality-migrator/1-json-collections/images/task2-step1a-pretty.png
new file mode 100644
index 000000000..897dce809
Binary files /dev/null and b/json-to-duality-migrator/1-json-collections/images/task2-step1a-pretty.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task2-step1a.png b/json-to-duality-migrator/1-json-collections/images/task2-step1a.png
index 17d45875b..f65129284 100644
Binary files a/json-to-duality-migrator/1-json-collections/images/task2-step1a.png and b/json-to-duality-migrator/1-json-collections/images/task2-step1a.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task2-step1b.png b/json-to-duality-migrator/1-json-collections/images/task2-step1b.png
index 528f03505..7fdea194a 100644
Binary files a/json-to-duality-migrator/1-json-collections/images/task2-step1b.png and b/json-to-duality-migrator/1-json-collections/images/task2-step1b.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task2-step3a.png b/json-to-duality-migrator/1-json-collections/images/task2-step3a.png
index 186323151..09a52ddab 100644
Binary files a/json-to-duality-migrator/1-json-collections/images/task2-step3a.png and b/json-to-duality-migrator/1-json-collections/images/task2-step3a.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task2-step3b.png b/json-to-duality-migrator/1-json-collections/images/task2-step3b.png
index a44a9703a..7f43bd12e 100644
Binary files a/json-to-duality-migrator/1-json-collections/images/task2-step3b.png and b/json-to-duality-migrator/1-json-collections/images/task2-step3b.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task3-step1.png b/json-to-duality-migrator/1-json-collections/images/task3-step1.png
index 700c48037..135257be6 100644
Binary files a/json-to-duality-migrator/1-json-collections/images/task3-step1.png and b/json-to-duality-migrator/1-json-collections/images/task3-step1.png differ
diff --git a/json-to-duality-migrator/1-json-collections/images/task3-step3.png b/json-to-duality-migrator/1-json-collections/images/task3-step3.png
index b3a413ef9..374e45bb3 100644
Binary files a/json-to-duality-migrator/1-json-collections/images/task3-step3.png and b/json-to-duality-migrator/1-json-collections/images/task3-step3.png differ
diff --git a/json-to-duality-migrator/1-json-collections/json-collections.md b/json-to-duality-migrator/1-json-collections/json-collections.md
index 3ede60c1b..a09fa080c 100644
--- a/json-to-duality-migrator/1-json-collections/json-collections.md
+++ b/json-to-duality-migrator/1-json-collections/json-collections.md
@@ -27,67 +27,82 @@ In this lab, you will:
In this task, we will create a JSON collection table called `attendee` that represents a collection of attendees for a database conference.
-1. Create the `attendee` collection.
-
- ```sql
-
- DROP TABLE IF EXISTS attendee;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS attendee;
-
- ```
-
- This creates a table with a single JSON-type object column named `DATA`. Because it's ultimately "just a table", you can use a JSON collection table in most of the ways that you use a regular table. In particular, you can use GoldenGate to replicate a collection table between databases, including between Oracle Database and JSON document databases, such as MongoDB.
-
-2. Insert data into the `attendee` collection.
-
- ```sql
-
- INSERT INTO attendee VALUES
- ('{"_id" : 1,
- "firstName" : "Beda",
- "lastName" : "Hammerschmidt",
- "nickName" : "Dr. JSON",
- "age" : 20,
- "phoneNumber" : "222-111-021",
- "coffeeItem" : "Espresso",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
- {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 2,
- "firstName" : "Hermann",
- "lastName" : "Baer",
- "age" : 22,
- "phoneNumber" : "222-112-023",
- "coffeeItem" : "Cappuccino",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 3,
- "firstName" : "Shashank",
- "lastName" : "Gugnani",
- "nickName" : "SG",
- "age" : 23,
- "phoneNumber" : "222-112-024",
- "coffeeItem" : "Americano",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 4,
- "firstName" : "Julian",
- "lastName" : "Dontcheff",
- "nickName" : "Jul",
- "age" : 24,
- "phoneNumber" : "222-113-025",
- "coffeeItem" : "Decaf",
- "lectures" : [ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
-
- COMMIT;
-
- ```
+1. Click the *Database Actions* dropdown list and select **View all database actions**.
+
+ 
+
+
+2. You will land on the Database Actions homepage. Click the **SQL** tile under Development to open the SQL worksheet.
+
+ 
+
+3. In the SQL worksheet, create the `attendee` collection.
+
+    Copy the SQL below and press the **Run Script** button.
+
+ ```sql
+
+ DROP TABLE IF EXISTS attendee;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS attendee;
+
+ ```
+
+ 
+
+ This creates a table with a single JSON-type object column named `DATA`. Because it's ultimately "just a table", you can use a JSON collection table in most of the ways that you use a regular table. In particular, you can use GoldenGate to replicate a collection table between databases, including between Oracle Database and JSON document databases, such as MongoDB.
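+
+    Because the collection table is an ordinary table, you can optionally inspect its shape in the data dictionary. This is an illustrative check; the exact data type reported may vary by release.
+
+    ```sql
+
+    SELECT column_name, data_type
+    FROM user_tab_columns
+    WHERE table_name = 'ATTENDEE';
+
+    ```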
+
+4. Insert data into the `attendee` collection.
+
+    Copy the SQL below and press the **Run Script** button.
+
+ ```sql
+
+ INSERT INTO attendee VALUES
+ ('{"_id" : 1,
+ "firstName" : "Beda",
+ "lastName" : "Hammerschmidt",
+ "nickName" : "Dr. JSON",
+ "age" : 20,
+ "phoneNumber" : "222-111-021",
+ "coffeeItem" : "Espresso",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 50, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
+ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 2,
+ "firstName" : "Hermann",
+ "lastName" : "Baer",
+ "age" : 22,
+ "phoneNumber" : "222-112-023",
+ "coffeeItem" : "Cappuccino",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 3,
+ "firstName" : "Shashank",
+ "lastName" : "Gugnani",
+ "nickName" : "SG",
+ "age" : 23,
+ "phoneNumber" : "222-112-024",
+ "coffeeItem" : "Americano",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 4,
+ "firstName" : "Julian",
+ "lastName" : "Dontcheff",
+ "nickName" : "Jul",
+ "age" : 24,
+ "phoneNumber" : "222-113-025",
+ "coffeeItem" : "Decaf",
+ "lectures" : [ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+
+ COMMIT;
+
+ ```
As you see, it looks like a normal SQL `INSERT` statement. The only difference is that we specified a proper JSON document as input for our DATA column. Copy the SQL statement and execute it in the SQL worksheet.
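+
+    If you want a quick sanity check, you can count the documents in the collection. This illustrative query should return 4 once the inserts above have been committed.
+
+    ```sql
+
+    SELECT COUNT(*) AS attendee_count FROM attendee;
+
+    ```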
@@ -95,56 +110,63 @@ In this task, we will create a JSON collection table called `attendee` that repr
1. Find a document in the `attendee` collection
- ```sql
- SELECT a.data FROM attendee a WHERE a.data."_id" = 1;
- ```
+ ```sql
+ SELECT a.data FROM attendee a WHERE a.data."_id" = 1;
+ ```
- 
+ 
- We can also select specific fields within the JSON document by using the dot notation to peek inside the document.
+    **Tip:** If you scroll to the end of a long JSON column, you will see an eye icon. Click it to see your JSON document pretty-printed in a separate window.
+
+ 
- ```sql
-
- SELECT a.data.lastName || ', ' || a.data.firstName as name,
- a.data.nickName as nick_name
- FROM attendee a
- WHERE a.data."_id" = 1;
-
- ```
+
+ We can also select specific fields within the JSON document by using the dot notation to peek inside the document.
+
+ ```sql
+
+ SELECT a.data.lastName || ', ' || a.data.firstName as name,
+ a.data.nickName as nick_name
+ FROM attendee a
+ WHERE a.data."_id" = 1;
+
+ ```
- 
+ 
2. Add a field to an existing document. We will add a `country` field to Julian's attendee document to specify his country of origin.
- ```sql
-
- UPDATE attendee a
- SET a.data = JSON_TRANSFORM(a.data, SET '$.country' = 'Finland')
- WHERE a.data."_id" = 4;
+    Copy the SQL below into the SQL worksheet and run it.
- COMMIT;
-
- ```
+ ```sql
+
+ UPDATE attendee a
+ SET a.data = JSON_TRANSFORM(a.data, SET '$.country' = 'Finland')
+ WHERE a.data."_id" = 4;
+
+ COMMIT;
+
+ ```
3. Query the updated document. It should now contain the `country` field, which can also be queried using dot notation.
- ```sql
- SELECT a.data FROM attendee a WHERE a.data."_id" = 4;
- ```
+ ```sql
+ SELECT a.data FROM attendee a WHERE a.data."_id" = 4;
+ ```
- 
+ 
- ```sql
-
- SELECT a.data.lastName || ', ' || a.data.firstName as name,
- a.data.nickName as nick_name,
- a.data.country as country
- FROM attendee a
- WHERE a.data."_id" = 4;
-
- ```
+ ```sql
+
+ SELECT a.data.lastName || ', ' || a.data.firstName as name,
+ a.data.nickName as nick_name,
+ a.data.country as country
+ FROM attendee a
+ WHERE a.data."_id" = 4;
+
+ ```
- 
+ 
## Task 3: Update Shared Information
@@ -152,32 +174,33 @@ In this task, we will update lecture name for lecture id 40, from "JSON Duality
1. Find all documents that contain lecture id 40. We will use a `JSON_EXISTS` predicate to find all such documents.
- ```sql
-
- SELECT data
- FROM attendee
- WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
-
- ```
+ ```sql
+
+ SELECT data
+ FROM attendee
+ WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
+
+ ```
+    You will see three records returned. For legibility, we expanded one of them to show the redundant information about lecture 40 within a single JSON document.
- 
+ 
2. Update the lecture name in all documents containing lecture id 40. We will use a `JSON_EXISTS` predicate to find all such documents, then use `JSON_TRANSFORM` to update the lecture name only for the matching lecture id.
- ```sql
-
- UPDATE attendee
- SET data = JSON_TRANSFORM(
- data,
- SET '$.lectures[*]?(@.id == 40).lectureName' = 'JSON Relational Duality Views'
- )
- WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
-
- COMMIT;
-
- ```
+ ```sql
+
+ UPDATE attendee
+ SET data = JSON_TRANSFORM(
+ data,
+ SET '$.lectures[*]?(@.id == 40).lectureName' = 'JSON Relational Duality Views'
+ )
+ WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
- This statement updates three documents, each of which references lecture id 40.
+ COMMIT;
+
+ ```
+
+ This statement updates three documents, each of which references lecture id 40. **Note that we needed to update three documents to correct the lecture name for all our attendees.**
3. Select all documents from the view to see the updated documents.
@@ -188,6 +211,8 @@ In this task, we will update lecture name for lecture id 40, from "JSON Duality
```
+ You can scroll through the documents or drill down into the detail of individual documents. For illustration purposes we highlight two of the changed entries in the screenshot below. (We actually updated three documents before, so you will find the third one when scrolling to the right.)
+

We can see that the lecture name for lecture id 40 has now been updated consistently everywhere. It is easy to see the problem with JSON collections containing duplicate data - Any update to duplicate data must be managed carefully and kept consistent manually. In the next lab, we will see how duality views effectively solves this problem.
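+
+    To see the scope of the duplication for yourself, you can count how many attendee documents embed lecture id 40. This is an illustrative check using the `JSON_EXISTS` predicate from Task 3.
+
+    ```sql
+
+    SELECT COUNT(*) AS docs_with_lecture_40
+    FROM attendee
+    WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
+
+    ```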
diff --git a/json-to-duality-migrator/2-duality-views/duality-views.md b/json-to-duality-migrator/2-duality-views/duality-views.md
index a8363b29f..2f76e711e 100644
--- a/json-to-duality-migrator/2-duality-views/duality-views.md
+++ b/json-to-duality-migrator/2-duality-views/duality-views.md
@@ -25,239 +25,276 @@ In this lab, you will:
In this task, we will create a duality view called `attendee` that represents a collection of attendees for a database conference.
-1. Create the `attendees` and `lectures` tables. We also need to create a mapping table between attendees and lectures (`map_attendees_to_lectures` tables) to model the many-to-many relationship between them.
-
- ```sql
-
- -- Attendees table
- DROP TABLE IF EXISTS attendees;
- CREATE TABLE attendees (
- id NUMBER GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
- first_name VARCHAR2(50),
- last_name VARCHAR2(50),
- nick_name VARCHAR2(50),
- age NUMBER,
- phone_number VARCHAR2(50),
- coffee_item VARCHAR2(100),
- extras JSON(Object) -- Stores flexible additional attributes
- );
-
- -- Lectures table
- DROP TABLE IF EXISTS lectures;
- CREATE TABLE lectures (
- id NUMBER GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
- name VARCHAR2(100),
- credits NUMBER
- );
-
- -- Mapping table for many-to-many relationship between attendees and lectures
- DROP TABLE IF EXISTS map_attendees_to_lectures;
- CREATE TABLE map_attendees_to_lectures (
- attendee_id NUMBER NOT NULL,
- lecture_id NUMBER NOT NULL,
- PRIMARY KEY (attendee_id, lecture_id),
- FOREIGN KEY (attendee_id) REFERENCES attendees(id),
- FOREIGN KEY (lecture_id) REFERENCES lectures(id)
- );
-
- ```
-
-2. Create the attendee duality view.
-
- ```sql
-
- DROP TABLE IF EXISTS attendee;
- CREATE OR REPLACE JSON DUALITY VIEW attendee AS
- attendees @insert @update @delete
- {
- _id : id,
- firstName : first_name,
- lastName : last_name,
- nickName : nick_name,
- age,
- phoneNumber : phone_number,
- coffeeItem : coffee_item,
- lectures : map_attendees_to_lectures @insert @update @delete
- [{
- lectures @unnest @insert @update
- {
- id,
- lectureName : name,
- credits
- }
- }]
- extras @flex
- };
-
- ```
+1. Click the *Database Actions* dropdown list and select **View all database actions**.
+
+ 
+
+
+2. You will land on the Database Actions homepage. Click the **SQL** tile under Development to open the SQL worksheet.
+
+ 
+
+3. Create the `attendees` and `lectures` tables by copying the SQL below into the SQL worksheet (or your SQL client of choice) and running it with **Run Script**. We also need a mapping table between attendees and lectures (`map_attendees_to_lectures`) to model the many-to-many relationship between them.
+
+ ```sql
+
+ -- cleanup, if needed
+ DROP TABLE IF EXISTS map_attendees_to_lectures;
+ DROP TABLE IF EXISTS attendees;
+ DROP TABLE IF EXISTS lectures;
+
+ -- Attendees table
+ CREATE TABLE attendees (
+ id NUMBER GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
+ first_name VARCHAR2(50),
+ last_name VARCHAR2(50),
+ nick_name VARCHAR2(50),
+ age NUMBER,
+ phone_number VARCHAR2(50),
+ coffee_item VARCHAR2(100),
+ extras JSON(Object) -- Stores flexible additional attributes
+ );
+
+ -- Lectures table
+ CREATE TABLE lectures (
+ id NUMBER GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
+ name VARCHAR2(100),
+ credits NUMBER
+ );
+
+ -- Mapping table for many-to-many relationship between attendees and lectures
+ CREATE TABLE map_attendees_to_lectures (
+ attendee_id NUMBER NOT NULL,
+ lecture_id NUMBER NOT NULL,
+ PRIMARY KEY (attendee_id, lecture_id),
+ FOREIGN KEY (attendee_id) REFERENCES attendees(id),
+ FOREIGN KEY (lecture_id) REFERENCES lectures(id)
+ );
+
+ ```
+
+4. Create two duality views - `attendee` and `lecture` - that expose JSON document representations of the two entities we want to work with.
+
+ ```sql
+
+ DROP TABLE IF EXISTS attendee;
+ DROP TABLE IF EXISTS lecture;
+ DROP VIEW IF EXISTS lecture;
+
+ CREATE OR REPLACE JSON DUALITY VIEW attendee AS
+ attendees @insert @update @delete
+ {
+ _id : id,
+ firstName : first_name,
+ lastName : last_name,
+ nickName : nick_name,
+ age,
+ phoneNumber : phone_number,
+ coffeeItem : coffee_item,
+ lectures : map_attendees_to_lectures @insert @update @delete
+ [{
+ lectures @unnest @insert @update
+ {
+ id,
+ lectureName : name,
+ credits
+ }
+      }],
+ extras @flex
+ };
+
+ CREATE OR REPLACE JSON DUALITY VIEW lecture AS
+ lectures @insert @update @delete
+ {
+ _id: id,
+ lectureName: name,
+      lectureCredits: credits
+ };
+
+ ```
Note how the view definition uses GraphQL-like what-you-see-is-what-you-get syntax which makes it easy to determine what the view output will look like.
5. Insert data into the attendee duality view.
- ```sql
-
- INSERT INTO attendee VALUES
- ('{"_id" : 1,
- "firstName" : "Beda",
- "lastName" : "Hammerschmidt",
- "nickName" : "Dr. JSON",
- "age" : 20,
- "phoneNumber" : "222-111-021",
- "coffeeItem" : "Espresso",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
- {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 2,
- "firstName" : "Hermann",
- "lastName" : "Baer",
- "age" : 22,
- "phoneNumber" : "222-112-023",
- "coffeeItem" : "Cappuccino",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 3,
- "firstName" : "Shashank",
- "lastName" : "Gugnani",
- "nickName" : "SG",
- "age" : 23,
- "phoneNumber" : "222-112-024",
- "coffeeItem" : "Americano",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 4,
- "firstName" : "Julian",
- "lastName" : "Dontcheff",
- "nickName" : "Jul",
- "age" : 24,
- "phoneNumber" : "222-113-025",
- "coffeeItem" : "Decaf",
- "lectures" : [ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
-
- COMMIT;
-
- ```
-
- As you see, it looks like a normal SQL INSERT statement. The only difference is that we specified a proper JSON document as input for our DATA column. Copy the SQL statement and execute it in the SQL worksheet.
+ ```sql
+
+ INSERT INTO attendee VALUES
+ ('{"_id" : 1,
+ "firstName" : "Beda",
+ "lastName" : "Hammerschmidt",
+ "nickName" : "Dr. JSON",
+ "age" : 20,
+ "phoneNumber" : "222-111-021",
+ "coffeeItem" : "Espresso",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 50, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
+ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 2,
+ "firstName" : "Hermann",
+ "lastName" : "Baer",
+ "age" : 22,
+ "phoneNumber" : "222-112-023",
+ "coffeeItem" : "Cappuccino",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 3,
+ "firstName" : "Shashank",
+ "lastName" : "Gugnani",
+ "nickName" : "SG",
+ "age" : 23,
+ "phoneNumber" : "222-112-024",
+ "coffeeItem" : "Americano",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 4,
+ "firstName" : "Julian",
+ "lastName" : "Dontcheff",
+ "nickName" : "Jul",
+ "age" : 24,
+ "phoneNumber" : "222-113-025",
+ "coffeeItem" : "Decaf",
+ "lectures" : [ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+
+ COMMIT;
+
+ ```
+
+    As you can see, these look like normal SQL INSERT statements. The only difference is that we specify a proper JSON document as the value for the DATA column. Copy the SQL statements and execute them in the SQL worksheet.
## Task 2: Access a Duality View
1. Find a document in the `attendee` collection
- ```sql
- SELECT a.data FROM attendee a WHERE a.data."_id" = 1;
- ```
+ ```sql
+ SELECT a.data FROM attendee a WHERE a.data."_id" = 1;
+ ```
- 
+ 
- We can also select specific fields within the JSON document by using the dot notation to peek inside the document.
+ We can also select specific fields within the JSON document by using the dot notation to peek inside the document.
- ```sql
-
- SELECT a.data.lastName || ', ' || a.data.firstName as name,
- a.data.nickName as nick_name
- FROM attendee a
- WHERE a.data."_id" = 1;
-
- ```
+ ```sql
+
+ SELECT a.data.lastName || ', ' || a.data.firstName as name,
+ a.data.nickName as nick_name
+ FROM attendee a
+ WHERE a.data."_id" = 1;
+
+ ```
- 
+ 
2. Add a field to an existing document. We will add a `country` field to Julian's attendee document to specify his country of origin. This field gets stored in the `extras` flex JSON column since it is not mapped to any column in the attendees table.
- ```sql
-
- UPDATE attendee a
- SET a.data = JSON_TRANSFORM(a.data, SET '$.country' = 'Finland')
- WHERE a.data."_id" = 4;
+ ```sql
+
+ UPDATE attendee a
+ SET a.data = JSON_TRANSFORM(a.data, SET '$.country' = 'Finland')
+ WHERE a.data."_id" = 4;
- COMMIT;
-
- ```
+ COMMIT;
+
+ ```
3. Query the updated document. It should now contain the `country` field, which can also be queried using dot notation just like any other non-flex field.
- ```sql
- SELECT a.data FROM attendee a WHERE a.data."_id" = 4;
- ```
+ ```sql
+ SELECT a.data FROM attendee a WHERE a.data."_id" = 4;
+ ```
- 
+ 
- ```sql
-
- SELECT a.data.lastName || ', ' || a.data.firstName as name,
- a.data.nickName as nick_name,
- a.data.country as country
- FROM attendee a
- WHERE a.data."_id" = 4;
-
- ```
+ ```sql
+
+ SELECT a.data.lastName || ', ' || a.data.firstName as name,
+ a.data.nickName as nick_name,
+ a.data.country as country
+ FROM attendee a
+ WHERE a.data."_id" = 4;
+
+ ```
- 
+ 
4. Query the `attendees` table. The `extras` column holds both the key and value for the `country` field.
- ```sql
-
- SELECT first_name, last_name, extras
- FROM attendees
- WHERE id = 4;
-
- ```
+ ```sql
+
+ SELECT first_name, last_name, extras
+ FROM attendees
+ WHERE id = 4;
+
+ ```
- 
+ 
## Task 3: Update Shared Information
-In this task, we will update lecture name for lecture id 40, from "JSON Duality Views" to "JSON Relational Duality Views". lecture information is duplicated across multiple attendee documents. However, duality views use relational tables underneath, so there is a single row in the lecture table storing lecture id 40's information, which is shared by all documents.
+In this task, we will update the lecture name for lecture id 40, from "JSON Duality Views" to "JSON Relational Duality Views". Lecture information is duplicated across multiple attendee documents. However, duality views use relational tables underneath, so there is a **single row in the lectures table** storing lecture id 40's information, which is shared by all documents.
1. Find all documents that contain lecture id 40. We will use a `JSON_EXISTS` predicate to find all such documents.
- ```sql
-
- SELECT data
- FROM attendee
- WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
-
- ```
+ ```sql
+
+ SELECT data
+ FROM attendee
+ WHERE JSON_EXISTS(data, '$.lectures[*]?(@.id == 40)');
+
+ ```
- 
+    For illustration purposes, the screenshot shows the details of one of the documents.
-2. Update the lecture name in only attendee id 1's document using `JSON_TRANSFORM` to update the lecture name only for the matching lecture id.
+ 
- ```sql
-
- UPDATE attendee a
- SET data = JSON_TRANSFORM(
- data,
- SET '$.lectures[*]?(@.id == 40).lectureName' = 'JSON Relational Duality Views'
- )
- WHERE a.data."_id" = 1;
+2. Update the lecture name in the corresponding lecture document, using `JSON_TRANSFORM` to change the `lectureName` field for the matching lecture id.
- COMMIT;
-
- ```
+    This demonstrates one of the big benefits of JSON Relational Duality Views: we can have multiple document representations on top of the same data. As you see here, we expose documents for our lectures as well as documents for our attendees' schedules.
- This statement updates one document, as a result of which the row in the lecture table storing lecture id 40's information has been updated. As a result, the updated lecture information is immediately reflected in all other documents containing it. This is one of the biggest advantages of duality views - updates to shared data are immediately reflected everywhere they are referenced!
+ ```sql
+
+ SELECT data FROM lecture l WHERE l.data."_id" = 40;
+
+ ```
+ 
+
+ Let's now update this document.
+
+ ```sql
+
+ UPDATE lecture l
+ SET data = JSON_TRANSFORM(
+ data,
+ SET '$.lectureName' = 'JSON Relational Duality Views'
+ )
+ WHERE l.data."_id" = 40;
+
+ COMMIT;
+
+ ```
+ 
+
+ This statement updates **one lecture document**, which updates the underlying row in the lecture table storing lecture id 40's information. As a result, the updated lecture information is **immediately reflected in all attendee documents containing it**. This is one of the biggest advantages of duality views - updates to shared data are immediately reflected everywhere they are referenced!
3. Select all documents from the view to see the updated documents.
- ```sql
-
- SELECT data
- FROM attendee;
-
- ```
+ ```sql
+
+ SELECT data
+ FROM attendee;
+
+ ```
+ You can scroll through the documents or drill down into the details of individual documents. For illustration purposes, we highlight two of the changed entries in the screenshot below. (The update affected three attendee documents, so you will find the third one when scrolling further to the right.)
- 
+ 
- We can see that the lecture name for lecture id 40 has now been updated consistently everywhere. Updating the data in one place updates it consistently everywhere.
+ We can see that the lecture name for lecture id 40 has now been updated consistently everywhere. Updating the data in one place updates it consistently everywhere.
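+
+ To double-check, you can also count how many attendee documents now carry the new lecture name. This is a quick verification query (not part of the original lab steps); with the lab data, the count matches the number of attendee documents containing lecture id 40:
+
+ ```sql
+
+ SELECT COUNT(*)
+ FROM attendee
+ WHERE JSON_EXISTS(data, '$.lectures[*]?(@.lectureName == "JSON Relational Duality Views")');
+
+ ```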
In this lab, we saw how duality views solve the data duplication problem with JSON collections while providing schema flexibility. However, users still need to define the relational schema and duality views upfront. What if the relational schema and duality views could be automatically inferred and created? In the next lab, we will work with the JSON-to-Duality Migrator, which solves exactly this problem!
diff --git a/json-to-duality-migrator/2-duality-views/images/dbaction1.png b/json-to-duality-migrator/2-duality-views/images/dbaction1.png
new file mode 100644
index 000000000..02d407f21
Binary files /dev/null and b/json-to-duality-migrator/2-duality-views/images/dbaction1.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/development-sql.png b/json-to-duality-migrator/2-duality-views/images/development-sql.png
new file mode 100644
index 000000000..6d9f479a4
Binary files /dev/null and b/json-to-duality-migrator/2-duality-views/images/development-sql.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/lecture-document.png b/json-to-duality-migrator/2-duality-views/images/lecture-document.png
new file mode 100644
index 000000000..964ed4c8b
Binary files /dev/null and b/json-to-duality-migrator/2-duality-views/images/lecture-document.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task2-step1a.png b/json-to-duality-migrator/2-duality-views/images/task2-step1a.png
index fffb36d50..f1fb150f0 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task2-step1a.png and b/json-to-duality-migrator/2-duality-views/images/task2-step1a.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task2-step1b.png b/json-to-duality-migrator/2-duality-views/images/task2-step1b.png
index bc2bbeef4..d3805ac4e 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task2-step1b.png and b/json-to-duality-migrator/2-duality-views/images/task2-step1b.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task2-step3a.png b/json-to-duality-migrator/2-duality-views/images/task2-step3a.png
index 968f8b991..59ea20001 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task2-step3a.png and b/json-to-duality-migrator/2-duality-views/images/task2-step3a.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task2-step3b.png b/json-to-duality-migrator/2-duality-views/images/task2-step3b.png
index a44a9703a..ad75a915f 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task2-step3b.png and b/json-to-duality-migrator/2-duality-views/images/task2-step3b.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task2-step4.png b/json-to-duality-migrator/2-duality-views/images/task2-step4.png
index 4f6e54b52..d3f9b2902 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task2-step4.png and b/json-to-duality-migrator/2-duality-views/images/task2-step4.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task3-step1.png b/json-to-duality-migrator/2-duality-views/images/task3-step1.png
index cbe20c6aa..135257be6 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task3-step1.png and b/json-to-duality-migrator/2-duality-views/images/task3-step1.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/task3-step3.png b/json-to-duality-migrator/2-duality-views/images/task3-step3.png
index e5442dce3..374e45bb3 100644
Binary files a/json-to-duality-migrator/2-duality-views/images/task3-step3.png and b/json-to-duality-migrator/2-duality-views/images/task3-step3.png differ
diff --git a/json-to-duality-migrator/2-duality-views/images/update-lecture-document.png b/json-to-duality-migrator/2-duality-views/images/update-lecture-document.png
new file mode 100644
index 000000000..057ac0ab5
Binary files /dev/null and b/json-to-duality-migrator/2-duality-views/images/update-lecture-document.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/dbaction1.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/dbaction1.png
new file mode 100644
index 000000000..02d407f21
Binary files /dev/null and b/json-to-duality-migrator/3-json-to-duality-migrator/images/dbaction1.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/development-sql.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/development-sql.png
new file mode 100644
index 000000000..6d9f479a4
Binary files /dev/null and b/json-to-duality-migrator/3-json-to-duality-migrator/images/development-sql.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/import-data.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/import-data.png
new file mode 100644
index 000000000..705a3c4b5
Binary files /dev/null and b/json-to-duality-migrator/3-json-to-duality-migrator/images/import-data.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/infer-and-generate.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/infer-and-generate.png
new file mode 100644
index 000000000..541b3be20
Binary files /dev/null and b/json-to-duality-migrator/3-json-to-duality-migrator/images/infer-and-generate.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/schema-tables.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/schema-tables.png
new file mode 100644
index 000000000..9edf47eb5
Binary files /dev/null and b/json-to-duality-migrator/3-json-to-duality-migrator/images/schema-tables.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step2.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step2.png
index f7777c32a..b03bcdf92 100644
Binary files a/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step2.png and b/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step2.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step3.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step3.png
index 01a9fe7fc..5edcb2b4c 100644
Binary files a/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step3.png and b/json-to-duality-migrator/3-json-to-duality-migrator/images/task2-step3.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/images/task3-step3.png b/json-to-duality-migrator/3-json-to-duality-migrator/images/task3-step3.png
index e78b58085..d972ccce0 100644
Binary files a/json-to-duality-migrator/3-json-to-duality-migrator/images/task3-step3.png and b/json-to-duality-migrator/3-json-to-duality-migrator/images/task3-step3.png differ
diff --git a/json-to-duality-migrator/3-json-to-duality-migrator/json-to-duality-migrator.md b/json-to-duality-migrator/3-json-to-duality-migrator/json-to-duality-migrator.md
index dd33eeb4c..ce6593da9 100644
--- a/json-to-duality-migrator/3-json-to-duality-migrator/json-to-duality-migrator.md
+++ b/json-to-duality-migrator/3-json-to-duality-migrator/json-to-duality-migrator.md
@@ -37,258 +37,284 @@ In this lab, you will:
In this task, we will create JSON collection tables `speaker`, `attendee`, and `lecture` that represent the collections required for a database conference application.
-1. Let's drop all the objects that we created in the previous lab first.
-
- ```sql
-
- BEGIN
- FOR t IN (
- SELECT object_name
- FROM user_objects
- WHERE object_type = 'TABLE'
- AND created >= SYSDATE - INTERVAL '2' HOUR
- ) LOOP
- BEGIN
- EXECUTE IMMEDIATE 'DROP TABLE "' || t.object_name || '" CASCADE CONSTRAINTS PURGE';
- EXCEPTION
- WHEN OTHERS THEN
- DBMS_OUTPUT.PUT_LINE('Failed to drop table ' || t.object_name || ': ' || SQLERRM);
- END;
- END LOOP;
- END;
- /
-
- ```
+1. Click the *Database Actions* dropdown list and select **View all database actions**.
+
+ 
+
+
+2. This takes you to the Database Actions homepage. Click the **SQL** tile under Development to open the SQL worksheet.
+
+ 
+
+
+3. Let's now drop all the objects that we created in the previous lab.
+
+ ```sql
+
+ BEGIN
+ FOR t IN (
+ SELECT object_name
+ FROM user_objects
+ WHERE object_type = 'TABLE'
+ AND created >= SYSDATE - INTERVAL '2' HOUR
+ ) LOOP
+ BEGIN
+ EXECUTE IMMEDIATE 'DROP TABLE "' || t.object_name || '" CASCADE CONSTRAINTS PURGE';
+ EXCEPTION
+ WHEN OTHERS THEN
+ DBMS_OUTPUT.PUT_LINE('Failed to drop table ' || t.object_name || ': ' || SQLERRM);
+ END;
+ END LOOP;
+ END;
+ /
+
+ ```
+ This code drops all tables that were created in the last 2 hours. If it took you longer to reach this point in the lab, increase the time window accordingly.
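+
+ Before widening the window, you can preview which tables it would drop. A minimal sketch, assuming an 8-hour window:
+
+ ```sql
+
+ SELECT object_name
+ FROM user_objects
+ WHERE object_type = 'TABLE'
+ AND created >= SYSDATE - INTERVAL '8' HOUR;
+
+ ```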
+
+ Let's check that no objects remain in our schema, or at least none that will collide with this workshop.
+
+ ```sql
+
+ SELECT * FROM user_tables;
+
+ ```
+ 
4. Create the `speaker`, `attendee`, and `lecture` collections.
- ```sql
-
- DROP VIEW IF EXISTS speaker;
- DROP VIEW IF EXISTS attendee;
- DROP VIEW IF EXISTS lecture;
- DROP TABLE IF EXISTS speaker PURGE;
- DROP TABLE IF EXISTS attendee PURGE;
- DROP TABLE IF EXISTS lecture PURGE;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS speaker;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS attendee;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS lecture;
-
- ```
+ ```sql
+
+ DROP VIEW IF EXISTS speaker;
+ DROP VIEW IF EXISTS attendee;
+ DROP VIEW IF EXISTS lecture;
+ DROP TABLE IF EXISTS speaker PURGE;
+ DROP TABLE IF EXISTS attendee PURGE;
+ DROP TABLE IF EXISTS lecture PURGE;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS speaker;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS attendee;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS lecture;
+
+ ```
5. Insert data into the `speaker`, `attendee`, and `lecture` collections.
- ```sql
-
- INSERT INTO speaker VALUES
- ('{"_id" : 101,
- "name" : "Abdul J.",
- "phoneNumber" : "222-555-011",
- "yearsAtOracle" : 25,
- "department" : "Product Management",
- "lecturesTaught" : [ {"id" : 10, "lectureName" : "JSON and SQL", "classType" : "Online"},
- {"id" : 20, "lectureName" : "PL/SQL or Javascript", "classType" : "In-person"} ]}');
- INSERT INTO speaker VALUES
- ('{"_id" : 102,
- "name" : "Betty Z.",
- "phoneNumber" : "222-555-022",
- "yearsAtOracle" : 30,
- "department" : "Autonomous Databases",
- "lecturesTaught" : [ {"id" : 30, "lectureName" : "MongoDB API Internals", "classType" : "In-person"},
- {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "classType" : "Online"} ]}');
- INSERT INTO speaker VALUES
- ('{"_id" : 103,
- "name" : "Colin J.",
- "phoneNumber" : "222-555-023",
- "yearsAtOracle" : 27,
- "department" : "In-Memory and Data",
- "lecturesTaught" : [ {"id" : 50, "lectureName" : "JSON Duality Views", "classType" : "Online"} ]}');
-
- INSERT INTO attendee VALUES
- ('{"_id" : 1,
- "firstName" : "Beda",
- "lastName" : "Hammerschmidt",
- "nickName" : "Dr. JSON",
- "age" : 20,
- "phoneNumber" : "222-111-021",
- "coffeeItem" : "Espresso",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
- {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 2,
- "firstName" : "Hermann",
- "lastName" : "Baer",
- "age" : 22,
- "phoneNumber" : "222-112-023",
- "coffeeItem" : "Cappuccino",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 3,
- "firstName" : "Shashank",
- "lastName" : "Gugnani",
- "nickName" : "SG",
- "age" : 23,
- "phoneNumber" : "222-112-024",
- "coffeeItem" : "Americano",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 4,
- "firstName" : "Julian",
- "lastName" : "Dontcheff",
- "nickName" : "Jul",
- "age" : 24,
- "phoneNumber" : "222-113-025",
- "coffeeItem" : "Decaf",
- "lectures" : [ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
-
- INSERT INTO lecture VALUES
- ('{"_id" : 10,
- "lectureName" : "JSON and SQL",
- "creditHours" : 3,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
- {"_id" : 2, "name": "Hermann", "age" : 22},
- {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 20,
- "lectureName" : "PL/SQL or Javascript",
- "creditHours" : 4,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 30,
- "lectureName" : "MongoDB API Internals",
- "creditHours" : 5,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
- {"_id" : 2, "name": "Hermann", "age" : 22},
- {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 40,
- "lectureName" : "Oracle ADB on iPhone",
- "creditHours" : 3,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 50,
- "lectureName" : "JSON Duality Views",
- "creditHours" : 3,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
- {"_id" : 2, "name": "Hermann", "age" : 22},
- {"_id" : 4, "name": "Julian", "age" : 24} ]}');
-
- COMMIT;
-
- ```
+ ```sql
+
+ INSERT INTO speaker VALUES
+ ('{"_id" : 101,
+ "name" : "Abdul J.",
+ "phoneNumber" : "222-555-011",
+ "yearsAtOracle" : 25,
+ "department" : "Product Management",
+ "lecturesTaught" : [ {"id" : 10, "lectureName" : "JSON and SQL", "classType" : "Online"},
+ {"id" : 20, "lectureName" : "PL/SQL or Javascript", "classType" : "In-person"} ]}');
+ INSERT INTO speaker VALUES
+ ('{"_id" : 102,
+ "name" : "Betty Z.",
+ "phoneNumber" : "222-555-022",
+ "yearsAtOracle" : 30,
+ "department" : "Autonomous Databases",
+ "lecturesTaught" : [ {"id" : 30, "lectureName" : "MongoDB API Internals", "classType" : "In-person"},
+ {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "classType" : "Online"} ]}');
+ INSERT INTO speaker VALUES
+ ('{"_id" : 103,
+ "name" : "Colin J.",
+ "phoneNumber" : "222-555-023",
+ "yearsAtOracle" : 27,
+ "department" : "In-Memory and Data",
+ "lecturesTaught" : [ {"id" : 50, "lectureName" : "JSON Duality Views", "classType" : "Online"} ]}');
+
+ INSERT INTO attendee VALUES
+ ('{"_id" : 1,
+ "firstName" : "Beda",
+ "lastName" : "Hammerschmidt",
+ "nickName" : "Dr. JSON",
+ "age" : 20,
+ "phoneNumber" : "222-111-021",
+ "coffeeItem" : "Espresso",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
+ {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 2,
+ "firstName" : "Hermann",
+ "lastName" : "Baer",
+ "age" : 22,
+ "phoneNumber" : "222-112-023",
+ "coffeeItem" : "Cappuccino",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 3,
+ "firstName" : "Shashank",
+ "lastName" : "Gugnani",
+ "nickName" : "SG",
+ "age" : 23,
+ "phoneNumber" : "222-112-024",
+ "coffeeItem" : "Americano",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 4,
+ "firstName" : "Julian",
+ "lastName" : "Dontcheff",
+ "nickName" : "Jul",
+ "age" : 24,
+ "phoneNumber" : "222-113-025",
+ "coffeeItem" : "Decaf",
+ "lectures" : [ {"id" : 40, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+
+ INSERT INTO lecture VALUES
+ ('{"_id" : 10,
+ "lectureName" : "JSON and SQL",
+ "creditHours" : 3,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
+ {"_id" : 2, "name": "Hermann", "age" : 22},
+ {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 20,
+ "lectureName" : "PL/SQL or Javascript",
+ "creditHours" : 4,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 30,
+ "lectureName" : "MongoDB API Internals",
+ "creditHours" : 5,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
+ {"_id" : 2, "name": "Hermann", "age" : 22},
+ {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 40,
+ "lectureName" : "Oracle ADB on iPhone",
+ "creditHours" : 3,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 50,
+ "lectureName" : "JSON Duality Views",
+ "creditHours" : 3,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
+ {"_id" : 2, "name": "Hermann", "age" : 22},
+ {"_id" : 4, "name": "Julian", "age" : 24} ]}');
+
+ COMMIT;
+
+ ```
## Task 2: Schema Inference using the JSON to Duality Migrator
In this task, we will infer a normalized relational schema using data from our JSON collections. The JSON to Duality Migrator will analyze the data in the input collections, and recommend a set of relational tables (including constraints, indexes, and sequences) and a set of duality views to match the input JSON collections.
-1. Run the `INFER_AND_GENERATE_SCHEMA` procedure to infer a relational schema using a few lines of PL/SQL code. This procedure will analyze the data in our input collections, infer an optimized normalized relational schema, and generate a DDL script to create the relational schema along with duality views for each collection. Here, we store the generated DDL script in a `CLOB` variable and then call the `EXECUTE IMMEDIATE` PL/SQL construct to execute the script.
+1. Run the `INFER_AND_GENERATE_SCHEMA` procedure to infer a relational schema using a few lines of PL/SQL code. This procedure will analyze the data in our input collections, infer an optimized normalized relational schema, and generate a DDL script to create the relational schema along with duality views for each collection.
- ```sql
-
- SET SERVEROUTPUT ON
- DECLARE
- schema_sql CLOB;
- BEGIN
- -- Infer relational schema
- schema_sql :=
- DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
- JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ]}')
- );
+ ```sql
+
+ SET SERVEROUTPUT ON
+ DECLARE
+ schema_sql CLOB;
+ BEGIN
+ -- Infer relational schema
+ schema_sql :=
+ DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
+ JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ]}')
+ );
- -- Print DDL script
- DBMS_OUTPUT.PUT_LINE(schema_sql);
+ -- Print DDL script
+ DBMS_OUTPUT.PUT_LINE(schema_sql);
- -- Create relational schema
- EXECUTE IMMEDIATE schema_sql;
- END;
- /
-
- ```
+ -- Create relational schema
+ EXECUTE IMMEDIATE schema_sql;
+ END;
+ /
+
+ ```
+ Here, we store the generated DDL script in a `CLOB` variable and then use the `EXECUTE IMMEDIATE` PL/SQL statement to execute the script.
- You can also use external tables pointing to data stored in object stores as the input to `INFER_AND_GENERATE_SCHEMA`. This is useful in cases where you are migrating from an external database to Oracle.
+ 
+
+ You can also use external tables pointing to data stored in object stores as the input to `INFER_AND_GENERATE_SCHEMA`. This is useful in cases where you are migrating from an external database to Oracle.
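+
+ In that case, the external table is passed in the same way as a regular collection table. A minimal sketch, where `EXT_ORDERS` is a hypothetical external table over JSON documents in an object store:
+
+ ```sql
+
+ DECLARE
+ schema_sql CLOB;
+ BEGIN
+ -- EXT_ORDERS is a hypothetical external table; the call is otherwise identical
+ schema_sql :=
+ DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
+ JSON('{"tableNames" : [ "EXT_ORDERS" ]}')
+ );
+ EXECUTE IMMEDIATE schema_sql;
+ END;
+ /
+
+ ```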
2. Check the objects created by the migrator. Note that the relational schema is completely normalized - one table is created per logical entity, one for speaker (`speaker_root`), one for attendee (`attendee_root`), and one for lecture (`lecture_root`). The many-to-many relationship between attendees and lectures is automatically identified and a mapping table is created to map attendees to lectures.
- ```sql
-
- SELECT object_name, object_type
- FROM user_objects
- WHERE created >= SYSDATE - INTERVAL '2' HOUR
- ORDER BY object_type DESC
- FETCH FIRST 15 ROWS ONLY;
-
- ```
+ ```sql
+
+ SELECT object_name, object_type
+ FROM user_objects
+ WHERE created >= SYSDATE - INTERVAL '2' HOUR
+ ORDER BY object_type DESC, created DESC
+ FETCH FIRST 15 ROWS ONLY;
+
+ ```
+
+ Note that this SQL script assumes you are running it within 2 hours of running the schema inference and creation.
- 
+ 
- You can also use the two-phase API (`INFER_SCHEMA` and `GENERATE_SCHEMA`) to split the schema inference and DDL script generation into separate calls. This is useful when you want to inspect and hand-modify the resulting schema before generating the final DDL script.
+ You can also use the two-phase API (`INFER_SCHEMA` and `GENERATE_SCHEMA`) to split the schema inference and DDL script generation into separate calls. This is useful when you want to inspect and hand-modify the resulting schema before generating the final DDL script. There are always multiple ways to model JSON documents as a relational schema.
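+
+ A minimal sketch of the two-phase flow (consult the `DBMS_JSON_DUALITY` documentation for the exact signatures):
+
+ ```sql
+
+ DECLARE
+ inferred_schema JSON;
+ schema_sql CLOB;
+ BEGIN
+ -- Phase 1: infer the schema only
+ inferred_schema :=
+ DBMS_JSON_DUALITY.INFER_SCHEMA(
+ JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ]}')
+ );
+
+ -- Inspect or hand-modify inferred_schema here before generating DDL
+
+ -- Phase 2: generate the DDL script from the (possibly modified) schema
+ schema_sql := DBMS_JSON_DUALITY.GENERATE_SCHEMA(inferred_schema);
+ EXECUTE IMMEDIATE schema_sql;
+ END;
+ /
+
+ ```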
3. Validate the schema using the `VALIDATE_SCHEMA_REPORT` table function. This should show no rows selected for each duality view, which means that there are no validation failures.
- ```sql
-
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'LECTURE', view_name => 'LECTURE_DUALITY');
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'ATTENDEE', view_name => 'ATTENDEE_DUALITY');
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'SPEAKER', view_name => 'SPEAKER_DUALITY');
-
- ```
+ ```sql
+
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'LECTURE', view_name => 'LECTURE_DUALITY');
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'ATTENDEE', view_name => 'ATTENDEE_DUALITY');
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'SPEAKER', view_name => 'SPEAKER_DUALITY');
+
+ ```
- 
+ 
-The result shows no rows selected indicating that the resulting schema fits the input collections.
+ The result shows no rows selected, indicating that the resulting schema fits the input collections.
## Task 3: Data Import using the JSON to Duality Migrator
In this task, we will import data from input JSON collections into the duality views. We will also look at techniques to find documents that cannot be imported successfully.
-1. Let's create error logs to log errors for documents that do not get imported successfully.
+1. Let's create error logs to capture documents that do not get imported successfully. We leverage Oracle's DML error logging capability, starting by creating the tables that will track the errors.
- ```sql
-
- BEGIN
- DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'LECTURE', err_log_table_name => 'LECTURE_ERR_LOG', skip_unsupported => TRUE);
- DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'ATTENDEE', err_log_table_name => 'ATTENDEE_ERR_LOG', skip_unsupported => TRUE);
- DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'SPEAKER', err_log_table_name => 'SPEAKER_ERR_LOG', skip_unsupported => TRUE);
- END;
- /
-
- ```
+ ```sql
+
+ BEGIN
+ DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'LECTURE', err_log_table_name => 'LECTURE_ERR_LOG', skip_unsupported => TRUE);
+ DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'ATTENDEE', err_log_table_name => 'ATTENDEE_ERR_LOG', skip_unsupported => TRUE);
+ DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'SPEAKER', err_log_table_name => 'SPEAKER_ERR_LOG', skip_unsupported => TRUE);
+ END;
+ /
+
+ ```
2. Let's import the data into the duality views using the `IMPORT_ALL` procedure.
- ```sql
-
- BEGIN
- DBMS_JSON_DUALITY.IMPORT_ALL(
- JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ],
- "viewNames" : [ "LECTURE_DUALITY", "ATTENDEE_DUALITY", "SPEAKER_DUALITY" ],
- "errorLog" : [ "LECTURE_ERR_LOG", "ATTENDEE_ERR_LOG", "SPEAKER_ERR_LOG" ]}'
- )
- );
- END;
- /
-
- ```
+ ```sql
+
+ BEGIN
+ DBMS_JSON_DUALITY.IMPORT_ALL(
+ JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ],
+ "viewNames" : [ "LECTURE_DUALITY", "ATTENDEE_DUALITY", "SPEAKER_DUALITY" ],
+ "errorLog" : [ "LECTURE_ERR_LOG", "ATTENDEE_ERR_LOG", "SPEAKER_ERR_LOG" ]}'
+ )
+ );
+ END;
+ /
+
+ ```
+ 
3. Query the error logs. The error logs are empty, showing that there are no import errors — there are no documents that did not get imported.
- ```sql
-
- SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM LECTURE_ERR_LOG;
- SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM ATTENDEE_ERR_LOG;
- SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM SPEAKER_ERR_LOG;
-
- ```
+ ```sql
+
+ SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM LECTURE_ERR_LOG;
+ SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM ATTENDEE_ERR_LOG;
+ SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM SPEAKER_ERR_LOG;
+
+ ```
- 
+ 
- > **_NOTE:_** In case you find that some documents could not be imported successfully, you can look at the error message to understand the reason for the failure, fix the error by either modifying the relational schema or document contents, and reimport the failed document set.
+ > **_NOTE:_** In case you find that some documents could not be imported successfully, you can look at the error message to understand the reason for the failure, fix the error by either modifying the relational schema or document contents, and reimport the failed document set.
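+
+ As a sketch of a reimport (assuming the migrator's per-table `IMPORT` procedure; check the `DBMS_JSON_DUALITY` documentation for the exact parameters), reimporting just one fixed collection could look like:
+
+ ```sql
+
+ BEGIN
+ -- Hypothetical reimport of only the attendee collection after fixing the data
+ DBMS_JSON_DUALITY.IMPORT(
+ table_name => 'ATTENDEE',
+ view_name => 'ATTENDEE_DUALITY'
+ );
+ END;
+ /
+
+ ```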
In this lab, we used the default configuration options when invoking the `INFER_AND_GENERATE_SCHEMA` procedure and did not customize the relational schema in any way. However, in many use cases, you may want to customize the relational schema based on business requirements and the application model. In the next lab, we will see how to use a few configuration options to customize and design the relational schema with the JSON to Duality Migrator.
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/attendee-flexfield.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/attendee-flexfield.png
new file mode 100644
index 000000000..03fc757a0
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/attendee-flexfield.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/dbaction1.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/dbaction1.png
new file mode 100644
index 000000000..02d407f21
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/dbaction1.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/development-sql.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/development-sql.png
new file mode 100644
index 000000000..6d9f479a4
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/development-sql.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/import-data.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/import-data.png
new file mode 100644
index 000000000..705a3c4b5
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/import-data.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/sample-json-collection-doc.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/sample-json-collection-doc.png
new file mode 100644
index 000000000..c1a678b15
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/sample-json-collection-doc.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/sample-json-dv-doc.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/sample-json-dv-doc.png
new file mode 100644
index 000000000..fbd7ac856
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/sample-json-dv-doc.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/schema-tables.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/schema-tables.png
new file mode 100644
index 000000000..9edf47eb5
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/schema-tables.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/speaker-phone-string.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/speaker-phone-string.png
new file mode 100644
index 000000000..733882210
Binary files /dev/null and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/speaker-phone-string.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step3.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step3.png
index f0c83b364..67827cf1c 100644
Binary files a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step3.png and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step3.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step4.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step4.png
index 1dc8d71f6..5edcb2b4c 100644
Binary files a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step4.png and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task2-step4.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step3.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step3.png
index 7766bd1be..d972ccce0 100644
Binary files a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step3.png and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step3.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step4.png b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step4.png
index b890e025d..a78f19b25 100644
Binary files a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step4.png and b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/images/task3-step4.png differ
diff --git a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/json-to-duality-migrator-schema-design.md b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/json-to-duality-migrator-schema-design.md
index 01315aec6..6f3c7fe8b 100644
--- a/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/json-to-duality-migrator-schema-design.md
+++ b/json-to-duality-migrator/4-json-to-duality-migrator-schema-design/json-to-duality-migrator-schema-design.md
@@ -27,141 +27,161 @@ In this lab, you will:
Just like the previous lab, we will start with JSON collection tables `speaker`, `attendee`, and `lecture` that represent collections required for a database conference application.
-1. Let's drop all the objects that we created in the previous lab first.
-
- ```sql
-
- BEGIN
- FOR t IN (
- SELECT object_name
- FROM user_objects
- WHERE object_type = 'TABLE'
- AND created >= SYSDATE - INTERVAL '2' HOUR
- ) LOOP
- BEGIN
- EXECUTE IMMEDIATE 'DROP TABLE "' || t.object_name || '" CASCADE CONSTRAINTS PURGE';
- EXCEPTION
- WHEN OTHERS THEN
- DBMS_OUTPUT.PUT_LINE('Failed to drop table ' || t.object_name || ': ' || SQLERRM);
- END;
- END LOOP;
- END;
- /
-
- ```
+1. Click the *Database Actions* dropdown list and select **View all database actions**.
+
+ 
+
+
+2. You will land on the Database Actions homepage. Click the **SQL** tile under *Development* to open a SQL worksheet.
+
+ 
+
+3. Let's drop all the objects that we created in the previous lab first.
+
+ ```sql
+
+ BEGIN
+ FOR t IN (
+ SELECT object_name
+ FROM user_objects
+ WHERE object_type = 'TABLE'
+ AND created >= SYSDATE - INTERVAL '2' HOUR
+ ) LOOP
+ BEGIN
+ EXECUTE IMMEDIATE 'DROP TABLE "' || t.object_name || '" CASCADE CONSTRAINTS PURGE';
+ EXCEPTION
+ WHEN OTHERS THEN
+ DBMS_OUTPUT.PUT_LINE('Failed to drop table ' || t.object_name || ': ' || SQLERRM);
+ END;
+ END LOOP;
+ END;
+ /
+
+ ```
+
+    This PL/SQL block drops all tables created in the last 2 hours. If it took you longer than that to reach this point in the lab, increase the time window accordingly.
+
+    Let's check that no objects remain in our schema, or at least none that will collide with this workshop.
+
+ ```sql
+
+ SELECT * FROM user_tables;
+
+ ```
+ 
2. Create the `speaker`, `attendee`, and `lecture` collections.
- ```sql
-
- DROP VIEW IF EXISTS speaker;
- DROP VIEW IF EXISTS attendee;
- DROP VIEW IF EXISTS lecture;
- DROP TABLE IF EXISTS speaker PURGE;
- DROP TABLE IF EXISTS attendee PURGE;
- DROP TABLE IF EXISTS lecture PURGE;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS speaker;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS attendee;
- CREATE JSON COLLECTION TABLE IF NOT EXISTS lecture;
-
- ```
+ ```sql
+
+ DROP VIEW IF EXISTS speaker;
+ DROP VIEW IF EXISTS attendee;
+ DROP VIEW IF EXISTS lecture;
+ DROP TABLE IF EXISTS speaker PURGE;
+ DROP TABLE IF EXISTS attendee PURGE;
+ DROP TABLE IF EXISTS lecture PURGE;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS speaker;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS attendee;
+ CREATE JSON COLLECTION TABLE IF NOT EXISTS lecture;
+
+ ```
3. Insert data into the `speaker`, `attendee`, and `lecture` collections. The attendee data is a bit different from the previous lab - only one attendee has specified their pre-ordered coffee item.
- ```sql
-
- INSERT INTO speaker VALUES
- ('{"_id" : 101,
- "name" : "Abdul J.",
- "phoneNumber" : "222-555-011",
- "yearsAtOracle" : 25,
- "department" : "Product Management",
- "lecturesTaught" : [ {"id" : 10, "lectureName" : "JSON and SQL", "classType" : "Online"},
- {"id" : 20, "lectureName" : "PL/SQL or Javascript", "classType" : "In-person"} ]}');
- INSERT INTO speaker VALUES
- ('{"_id" : 102,
- "name" : "Betty Z.",
- "phoneNumber" : "222-555-022",
- "yearsAtOracle" : 30,
- "department" : "Autonomous Databases",
- "lecturesTaught" : [ {"id" : 30, "lectureName" : "MongoDB API Internals", "classType" : "In-person"},
- {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "classType" : "Online"} ]}');
- INSERT INTO speaker VALUES
- ('{"_id" : 103,
- "name" : "Colin J.",
- "phoneNumber" : "222-555-023",
- "yearsAtOracle" : 27,
- "department" : "In-Memory and Data",
- "lecturesTaught" : [ {"id" : 50, "lectureName" : "JSON Duality Views", "classType" : "Online"} ]}');
-
- INSERT INTO attendee VALUES
- ('{"_id" : 1,
- "name" : "Beda",
- "age" : 20,
- "phoneNumber" : "222-111-021",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
- {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 2,
- "name" : "Hermann",
- "age" : 22,
- "phoneNumber" : "222-112-023",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
- {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 3,
- "name" : "Shashank",
- "age" : 23,
- "phoneNumber" : "222-112-024",
- "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
- {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
- INSERT INTO attendee VALUES
- ('{"_id" : 4,
- "name" : "Julian",
- "age" : 24,
- "phoneNumber" : "222-113-025",
- "coffeeItem" : "Decaf",
- "lectures" : [ {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
-
- INSERT INTO lecture VALUES
- ('{"_id" : 10,
- "lectureName" : "JSON and SQL",
- "creditHours" : 3,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
- {"_id" : 2, "name": "Hermann", "age" : 22},
- {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 20,
- "lectureName" : "PL/SQL or Javascript",
- "creditHours" : 4,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 30,
- "lectureName" : "MongoDB API Internals",
- "creditHours" : 5,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
- {"_id" : 2, "name": "Hermann", "age" : 22},
- {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 40,
- "lectureName" : "Oracle ADB on iPhone",
- "creditHours" : 3,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
- INSERT INTO lecture VALUES
- ('{"_id" : 50,
- "lectureName" : "JSON Duality Views",
- "creditHours" : 3,
- "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
- {"_id" : 2, "name": "Hermann", "age" : 22},
- {"_id" : 4, "name": "Julian", "age" : 24} ]}');
-
- COMMIT;
-
- ```
+ ```sql
+
+ INSERT INTO speaker VALUES
+ ('{"_id" : 101,
+ "name" : "Abdul J.",
+ "phoneNumber" : "222-555-011",
+ "yearsAtOracle" : 25,
+ "department" : "Product Management",
+ "lecturesTaught" : [ {"id" : 10, "lectureName" : "JSON and SQL", "classType" : "Online"},
+ {"id" : 20, "lectureName" : "PL/SQL or Javascript", "classType" : "In-person"} ]}');
+ INSERT INTO speaker VALUES
+ ('{"_id" : 102,
+ "name" : "Betty Z.",
+ "phoneNumber" : "222-555-022",
+ "yearsAtOracle" : 30,
+ "department" : "Autonomous Databases",
+ "lecturesTaught" : [ {"id" : 30, "lectureName" : "MongoDB API Internals", "classType" : "In-person"},
+ {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "classType" : "Online"} ]}');
+ INSERT INTO speaker VALUES
+ ('{"_id" : 103,
+ "name" : "Colin J.",
+ "phoneNumber" : "222-555-023",
+ "yearsAtOracle" : 27,
+ "department" : "In-Memory and Data",
+ "lecturesTaught" : [ {"id" : 50, "lectureName" : "JSON Duality Views", "classType" : "Online"} ]}');
+
+ INSERT INTO attendee VALUES
+ ('{"_id" : 1,
+ "name" : "Beda",
+ "age" : 20,
+ "phoneNumber" : "222-111-021",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 20, "lectureName" : "PL/SQL or Javascript", "credits" : 4},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 40, "lectureName" : "Oracle ADB on iPhone", "credits" : 3},
+ {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 2,
+ "name" : "Hermann",
+ "age" : 22,
+ "phoneNumber" : "222-112-023",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5},
+ {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 3,
+ "name" : "Shashank",
+ "age" : 23,
+ "phoneNumber" : "222-112-024",
+ "lectures" : [ {"id" : 10, "lectureName" : "JSON and SQL", "credits" : 3},
+ {"id" : 30, "lectureName" : "MongoDB API Internals", "credits" : 5} ]}');
+ INSERT INTO attendee VALUES
+ ('{"_id" : 4,
+ "name" : "Julian",
+ "age" : 24,
+ "phoneNumber" : "222-113-025",
+ "coffeeItem" : "Decaf",
+ "lectures" : [ {"id" : 50, "lectureName" : "JSON Duality Views", "credits" : 3} ]}');
+
+ INSERT INTO lecture VALUES
+ ('{"_id" : 10,
+ "lectureName" : "JSON and SQL",
+ "creditHours" : 3,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
+ {"_id" : 2, "name": "Hermann", "age" : 22},
+ {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 20,
+ "lectureName" : "PL/SQL or Javascript",
+ "creditHours" : 4,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 30,
+ "lectureName" : "MongoDB API Internals",
+ "creditHours" : 5,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
+ {"_id" : 2, "name": "Hermann", "age" : 22},
+ {"_id" : 3, "name": "Shashank", "age" : 23} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 40,
+ "lectureName" : "Oracle ADB on iPhone",
+ "creditHours" : 3,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20} ]}');
+ INSERT INTO lecture VALUES
+ ('{"_id" : 50,
+ "lectureName" : "JSON Duality Views",
+ "creditHours" : 3,
+ "attendeesEnrolled" : [ {"_id" : 1, "name": "Beda", "age" : 20},
+ {"_id" : 2, "name": "Hermann", "age" : 22},
+ {"_id" : 4, "name": "Julian", "age" : 24} ]}');
+
+ COMMIT;
+
+ ```
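+
+    As an optional sanity check, confirm the row counts match what we just inserted (3 speakers, 4 attendees, 5 lectures):
+
+    ```sql
+
+    SELECT (SELECT COUNT(*) FROM speaker) AS speakers,
+           (SELECT COUNT(*) FROM attendee) AS attendees,
+           (SELECT COUNT(*) FROM lecture) AS lectures;
+
+    ```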
## Task 2: Customized Schema Inference using the JSON to Duality Migrator
@@ -169,83 +189,101 @@ In this task, we will infer a customized normalized relational schema using data
1. Run the `INFER_AND_GENERATE_SCHEMA` procedure to infer a relational schema. We specify `minFieldFrequency` as 30 so that the `coffeeItem` field will be pruned from the schema and won't map to a relational column.
- ```sql
-
- SET SERVEROUTPUT ON
- DECLARE
- schema_sql CLOB;
- BEGIN
- -- Infer relational schema
- schema_sql :=
- DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
- JSON('{"tableNames" : [ "ATTENDEE", "SPEAKER", "LECTURE" ],
- "minFieldFrequency" : 30}'
- )
- );
+    `minFieldFrequency` defines the percentage of documents below which a field is considered an **occurrence outlier**. Such outliers are not mapped to any underlying column.
+
+
+ ```sql
+
+ SET SERVEROUTPUT ON
+ DECLARE
+ schema_sql CLOB;
+ BEGIN
+ -- Infer relational schema
+ schema_sql :=
+ DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
+ JSON('{"tableNames" : [ "ATTENDEE", "SPEAKER", "LECTURE" ],
+ "minFieldFrequency" : 30}'
+ )
+ );
+
+ -- Print DDL script
+ DBMS_OUTPUT.PUT_LINE(schema_sql);
+ END;
+ /
+
+ ```
+
+ Let's inspect the output DDL script. Since `coffeeItem` appears in fewer than 30% of documents, there is no column in the relational schema for the field, nor does it appear in the duality view definition.
+
+ So what happens when you import the input data into this duality view? Can you guess from what we learnt in lab 4? Since the migrator creates flex JSON columns by default, the `coffeeItem` field will be inserted into the flex column! This is a great way of handling rare fields - set `minFieldFrequency` as desired and the migrator handles the rest.
+
+ 
- -- Print DDL script
- DBMS_OUTPUT.PUT_LINE(schema_sql);
- END;
- /
-
- ```
+    Feel free to experiment with different values and re-run the migrator code; we are only displaying the DDL here, not executing it.
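+
+    For instance, here is a sketch of the same call with a lower threshold. With `minFieldFrequency` set to 10, `coffeeItem` (present in 25% of attendee documents) is no longer an outlier, so the generated DDL should include a column for it.
+
+    ```sql
+
+    SET SERVEROUTPUT ON
+    DECLARE
+       schema_sql CLOB;
+    BEGIN
+       -- Lower threshold: coffeeItem appears in 25% of attendee documents,
+       -- so with a 10% cutoff it is kept and mapped to a column
+       schema_sql :=
+       DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
+          JSON('{"tableNames" : [ "ATTENDEE", "SPEAKER", "LECTURE" ],
+                 "minFieldFrequency" : 10}'
+          )
+       );
+
+       -- Print the DDL script only; we still do not execute it
+       DBMS_OUTPUT.PUT_LINE(schema_sql);
+    END;
+    /
+
+    ```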
- Let's inspect the output DDL script. Since `coffeeItem` appears in fewer than 30% of documents, there is no column in the relational schema for the field, nor does it appear in the duality view definition. So what happens when you import the input data into this duality view? Can you guess from what we learnt in lab 2? Since the migrator creates flex JSON columns by default, the `coffeeItem` field will be inserted into the flex column! This is a great way of handling rare fields - set `minFieldFrequency` as desired and the migrator handles the rest.
2. Run the `INFER_AND_GENERATE_SCHEMA` procedure again with a datatype hint for the `phoneNumber` field. In the output above, we created a `VARCHAR2` column for phone number. Let's say that we want a fixed character length datatype (`CHAR`) for the phone number instead, since we know that all phone numbers will have the same length. We can use the hints configuration parameter to specify the datatype for the `phoneNumber` field.
- ```sql
-
- SET SERVEROUTPUT ON
- DECLARE
- schema_sql CLOB;
- BEGIN
- -- Infer relational schema
- schema_sql :=
- DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
- JSON('{"tableNames" : [ "ATTENDEE", "SPEAKER", "LECTURE" ],
- "minFieldFrequency" : 30,
- "hints" : [ {"table" : "SPEAKER",
- "type" : "datatype",
- "path" : "$.phoneNumber",
- "value" : "CHAR(11)"} ]
- }'
- )
- );
-
- -- Print DDL script
- DBMS_OUTPUT.PUT_LINE(schema_sql);
-
- -- Create relational schema
- EXECUTE IMMEDIATE schema_sql;
- END;
- /
-
- ```
+ 
+
+    Let's now create the schema, using a hint to set the datatype for `phoneNumber` accordingly.
+
+ ```sql
+
+ SET SERVEROUTPUT ON
+ DECLARE
+ schema_sql CLOB;
+ BEGIN
+ -- Infer relational schema
+ schema_sql :=
+ DBMS_JSON_DUALITY.INFER_AND_GENERATE_SCHEMA(
+ JSON('{"tableNames" : [ "ATTENDEE", "SPEAKER", "LECTURE" ],
+ "minFieldFrequency" : 30,
+ "hints" : [ {"table" : "SPEAKER",
+ "type" : "datatype",
+ "path" : "$.phoneNumber",
+ "value" : "CHAR(11)"} ]
+ }'
+ )
+ );
+
+ -- Print DDL script
+ DBMS_OUTPUT.PUT_LINE(schema_sql);
+
+ -- Create relational schema
+ EXECUTE IMMEDIATE schema_sql;
+ END;
+ /
+
+ ```
+
+ 
+
+
3. Describe the `speaker_root` table. We can see that the datatype for the `PHONE_NUMBER` column is `CHAR(11)`, which is exactly what we specified in the hint parameter.
- ```sql
-
- DESC speaker_root
-
- ```
+ ```sql
+
+ DESC speaker_root
+
+ ```
- 
+ 
- The migrator also allows you to specify hints to specify identifying keys for sub-objects and whether to share data for sub-objects with other collections. The hint infrastructure is an effective tool to design and customize an effective normalized relational schema based on application and business requirements.
+    The migrator also allows you to use hints to specify identifying keys for sub-objects and whether to share sub-object data with other collections. The hint infrastructure is an effective tool for designing and customizing a normalized relational schema based on application and business requirements.
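+
+    You can also confirm the hint took effect from the data dictionary, independently of `DESC`:
+
+    ```sql
+
+    SELECT column_name, data_type, data_length
+    FROM user_tab_columns
+    WHERE table_name = 'SPEAKER_ROOT'
+    ORDER BY column_id;
+
+    ```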
4. Validate the schema using the `VALIDATE_SCHEMA_REPORT` table function. This should show no rows selected for each duality view, which means that there are no validation failures.
- ```sql
-
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'LECTURE', view_name => 'LECTURE_DUALITY');
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'ATTENDEE', view_name => 'ATTENDEE_DUALITY');
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'SPEAKER', view_name => 'SPEAKER_DUALITY');
-
- ```
+ ```sql
+
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'LECTURE', view_name => 'LECTURE_DUALITY');
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'ATTENDEE', view_name => 'ATTENDEE_DUALITY');
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_SCHEMA_REPORT(table_name => 'SPEAKER', view_name => 'SPEAKER_DUALITY');
+
+ ```
- 
+ 
## Task 3: Data Import and Validation using the JSON to Duality Migrator
@@ -253,61 +291,83 @@ In this task, we will import data from input JSON collections into the duality v
1. Let's create error logs to log errors for documents that do not get imported successfully.
- ```sql
-
- BEGIN
- DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'LECTURE', err_log_table_name => 'LECTURE_ERR_LOG', skip_unsupported => TRUE);
- DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'ATTENDEE', err_log_table_name => 'ATTENDEE_ERR_LOG', skip_unsupported => TRUE);
- DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'SPEAKER', err_log_table_name => 'SPEAKER_ERR_LOG', skip_unsupported => TRUE);
- END;
- /
-
- ```
+ ```sql
+
+ BEGIN
+ DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'LECTURE', err_log_table_name => 'LECTURE_ERR_LOG', skip_unsupported => TRUE);
+ DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'ATTENDEE', err_log_table_name => 'ATTENDEE_ERR_LOG', skip_unsupported => TRUE);
+ DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'SPEAKER', err_log_table_name => 'SPEAKER_ERR_LOG', skip_unsupported => TRUE);
+ END;
+ /
+
+ ```
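+
+    Optionally, verify that the three error log tables now exist in your schema:
+
+    ```sql
+
+    SELECT table_name
+    FROM user_tables
+    WHERE table_name IN ('LECTURE_ERR_LOG', 'ATTENDEE_ERR_LOG', 'SPEAKER_ERR_LOG');
+
+    ```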
2. Let's import the data into the duality views using the `IMPORT_ALL` procedure.
- ```sql
-
- BEGIN
- DBMS_JSON_DUALITY.IMPORT_ALL(
- JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ],
- "viewNames" : [ "LECTURE_DUALITY", "ATTENDEE_DUALITY", "SPEAKER_DUALITY" ],
- "errorLog" : [ "LECTURE_ERR_LOG", "ATTENDEE_ERR_LOG", "SPEAKER_ERR_LOG" ]}'
- )
- );
- END;
- /
-
- ```
+ ```sql
+
+ BEGIN
+ DBMS_JSON_DUALITY.IMPORT_ALL(
+ JSON('{"tableNames" : [ "LECTURE", "ATTENDEE", "SPEAKER" ],
+ "viewNames" : [ "LECTURE_DUALITY", "ATTENDEE_DUALITY", "SPEAKER_DUALITY" ],
+ "errorLog" : [ "LECTURE_ERR_LOG", "ATTENDEE_ERR_LOG", "SPEAKER_ERR_LOG" ]}'
+ )
+ );
+ END;
+ /
+
+ ```
+ 
3. Query the error logs. The error logs are empty, showing that there are no import errors — there are no documents that did not get imported.
- ```sql
-
- SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM LECTURE_ERR_LOG;
- SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM ATTENDEE_ERR_LOG;
- SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM SPEAKER_ERR_LOG;
-
- ```
+ ```sql
+
+ SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM LECTURE_ERR_LOG;
+ SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM ATTENDEE_ERR_LOG;
+ SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$ FROM SPEAKER_ERR_LOG;
+
+ ```
- 
+ 
- > **_NOTE:_** In case you find that some documents could not be imported successfully, you can look at the error message to understand the reason for the failure, fix the error by either modifying the relational schema or document contents, and reimport the failed document set.
+ > **_NOTE:_** In case you find that some documents could not be imported successfully, you can look at the error message to understand the reason for the failure, fix the error by either modifying the relational schema or document contents, and reimport the failed document set.
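+
+    Remember the `coffeeItem` field that was pruned from the relational schema in Task 2? Because the migrator creates flex columns by default, the field should still round-trip through the duality view. You can check this by fetching Julian's document (`_id` 4):
+
+    ```sql
+
+    SELECT json_serialize(data pretty) FROM attendee_duality a WHERE a.data."_id" = 4;
+
+    ```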
4. Let's validate that all data has been successfully imported using the `VALIDATE_IMPORT_REPORT` table function. This function validates that all documents that have been imported into duality views have correct data by comparing the duality view document contents with the input collection document content. This should show no rows selected for each duality view, which means that all data has been successfully imported.
- ```sql
-
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_IMPORT_REPORT(table_name => 'LECTURE', view_name => 'LECTURE_DUALITY');
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_IMPORT_REPORT(table_name => 'ATTENDEE', view_name => 'ATTENDEE_DUALITY');
- SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_IMPORT_REPORT(table_name => 'SPEAKER', view_name => 'SPEAKER_DUALITY');
-
- ```
+ ```sql
+
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_IMPORT_REPORT(table_name => 'LECTURE', view_name => 'LECTURE_DUALITY');
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_IMPORT_REPORT(table_name => 'ATTENDEE', view_name => 'ATTENDEE_DUALITY');
+ SELECT * FROM DBMS_JSON_DUALITY.VALIDATE_IMPORT_REPORT(table_name => 'SPEAKER', view_name => 'SPEAKER_DUALITY');
+
+ ```
- 
+ 
> **_NOTE:_** In case you find that some documents have different content than the input document, you can look at the error message to understand the reason for the difference, fix the error by either modifying the relational schema or document contents, and reimport the failed document set.
+5. Now let's look at our duality views and compare them quickly with the original documents in our JSON collection.
+
+    Issue the following SQL to show the attendee information for `_id = 3`, the original document in our JSON collection table.
+
+ ```sql
+
+ SELECT json_serialize(data pretty) FROM attendee a WHERE a.data."_id"=3;
+
+ ```
+ 
+
+    Now let's do the same against our new JSON Relational Duality View `attendee_duality` for the same `_id`. You will see that it is exactly the same data; it just carries some additional metadata for the lock-free concurrency capabilities of Duality Views and uses some additional auto-generated names.
+
+ ```sql
+
+ SELECT json_serialize(data pretty) FROM attendee_duality a WHERE a.data."_id"=3;
+
+ ```
+ 
+
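+
+    Because this is a duality view, the same data is also available relationally. Assuming the migrator named the attendee root table `ATTENDEE_ROOT` (analogous to `SPEAKER_ROOT` above; check `user_tables` if your generated name differs), you can query the rows directly:
+
+    ```sql
+
+    SELECT * FROM attendee_root;
+
+    ```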
+
## Learn More
* [JSON to Duality Migrator Schema Customization Options](https://docs.oracle.com/en/database/oracle/oracle-database/23/sutil/json-config-filelds-specifying-migrator-parameters.html#GUID-36F4ABA7-CA01-4D70-9CF1-50A00C21E149)