From 2bff2a98eb46c186cdbd086679f933686136c92c Mon Sep 17 00:00:00 2001
From: Scott Sandre <scott.sandre@databricks.com>
Date: Wed, 22 Feb 2023 15:25:24 -0800
Subject: [PATCH] Fix stale doc anchors in FAQ, merge docs, and releases table

---
 src/pages/latest/delta-faq.mdx    | 32 +++++++++++++++----------------
 src/pages/latest/delta-update.mdx |  4 ++--
 src/pages/latest/releases.mdx     |  2 +-
 3 files changed, 19 insertions(+), 19 deletions(-)

diff --git a/src/pages/latest/delta-faq.mdx b/src/pages/latest/delta-faq.mdx
index ad99aab..032bb42 100644
--- a/src/pages/latest/delta-faq.mdx
+++ b/src/pages/latest/delta-faq.mdx
@@ -6,27 +6,27 @@ menu: docs
 
 _In this article:_
 
-- [What is Delta Lake?](/latest/delta-faq.html#what-is-delta-lake)
+- [What is Delta Lake?](/latest/delta-faq#what-is-delta-lake)
 
-- [How is Delta Lake related to Apache Spark?](/latest/delta-faq.html#how-is-delta-lake-related-to-apache-spark)
+- [How is Delta Lake related to Apache Spark?](/latest/delta-faq#how-is-delta-lake-related-to-apache-spark)
 
-- [What format does Delta Lake use to store data?](/latest/delta-faq.html#what-format-does-delta-lake-use-to-store-data)
+- [What format does Delta Lake use to store data?](/latest/delta-faq#what-format-does-delta-lake-use-to-store-data)
 
-- [How can I read and write data with Delta Lake?](/latest/delta-faq.html#how-can-i-read-and-write-data-with-delta-lake)
+- [How can I read and write data with Delta Lake?](/latest/delta-faq#how-can-i-read-and-write-data-with-delta-lake)
 
-- [Where does Delta Lake store the data?](/latest/delta-faq.html#where-does-delta-lake-store-the-data)
+- [Where does Delta Lake store the data?](/latest/delta-faq#where-does-delta-lake-store-the-data)
 
-- [Can I copy my Delta Lake table to another location?](/latest/delta-faq.html#can-i-copy-my-delta-lake-table-to-another-location)
+- [Can I copy my Delta Lake table to another location?](/latest/delta-faq#can-i-copy-my-delta-lake-table-to-another-location)
 
-- [Can I stream data directly into and from Delta tables?](/latest/delta-faq.html#can-i-stream-data-directly-into-and-from-delta-tables)
+- [Can I stream data directly into and from Delta tables?](/latest/delta-faq#can-i-stream-data-directly-into-and-from-delta-tables)
 
-- [Does Delta Lake support writes or reads using the Spark Streaming DStream API?](/latest/delta-faq.html#does-delta-lake-support-writes-or-reads-using-the-spark-streaming-dstream-api)
+- [Does Delta Lake support writes or reads using the Spark Streaming DStream API?](/latest/delta-faq#does-delta-lake-support-writes-or-reads-using-the-spark-streaming-dstream-api)
 
-- [When I use Delta Lake, will I be able to port my code to other Spark platforms easily?](/latest/delta-faq.html#when-i-use-delta-lake-will-i-be-able-to-port-my-code-to-other-spark-platforms-easily)
+- [When I use Delta Lake, will I be able to port my code to other Spark platforms easily?](/latest/delta-faq#when-i-use-delta-lake-will-i-be-able-to-port-my-code-to-other-spark-platforms-easily)
 
-- [Does Delta Lake support multi-table transactions?](/latest/delta-faq.html#does-delta-lake-support-multi-table-transactions)
+- [Does Delta Lake support multi-table transactions?](/latest/delta-faq#does-delta-lake-support-multi-table-transactions)
 
-- [How can I change the type of a column?](/latest/delta-faq.html#how-can-i-change-the-type-of-a-column)
+- [How can I change the type of a column?](/latest/delta-faq#how-can-i-change-the-type-of-a-column)
 
 ## What is Delta Lake?
 
@@ -54,8 +54,8 @@ provide ACID transactions.
 
 You can use your favorite Apache Spark APIs to read and write data with
 
-Delta Lake. See [Read a table](/latest/delta-batch#deltadataframereads) and
-[Write to a table](/latest/delta-batch#deltadataframewrites).
+Delta Lake. See [Read a table](/latest/delta-batch#read-a-table) and
+[Write to a table](/latest/delta-batch#write-to-a-table).
 
 ## Where does Delta Lake store the data?
 
@@ -73,8 +73,8 @@ timestamps will be consistent.
 
 Yes, you can use Structured Streaming to directly write data into Delta tables
 and read from Delta tables. See [Stream data into Delta
-tables](/latest/delta-streaming#stream-sink) and [Stream data from Delta
-tables](/latest/delta-streaming#stream-source).
+tables](/latest/delta-streaming#delta-table-as-a-sink) and [Stream data from Delta
+tables](/latest/delta-streaming#delta-table-as-a-source).
 
 ## Does Delta Lake support writes or reads using the Spark Streaming DStream API?
 
@@ -94,4 +94,4 @@ Delta Lake supports transactions at the _table_ level.
 ## How can I change the type of a column?
 
 Changing a column's type or dropping a column requires rewriting the table. For
-an example, see [Change column type](delta-batch.md#change-column-type).
+an example, see [Change column type](/latest/delta-batch#change-column-type-or-name).
diff --git a/src/pages/latest/delta-update.mdx b/src/pages/latest/delta-update.mdx
index 6ac9b04..3ea01dc 100644
--- a/src/pages/latest/delta-update.mdx
+++ b/src/pages/latest/delta-update.mdx
@@ -339,7 +339,7 @@ Here is a detailed description of the `merge` programmatic operation.
 
 - `whenMatched` clauses are executed when a source row matches a target table row based on the match condition. These clauses have the following semantics.
 
-  - `whenMatched` clauses can have at most one `update` and one `delete` action. The `update` action in `merge` only updates the specified columns (similar to the `update` [operation](#delta-update)) of the matched target row. The `delete` action deletes the matched row.
+  - `whenMatched` clauses can have at most one `update` and one `delete` action. The `update` action in `merge` only updates the specified columns (similar to the `update` [operation](#update-a-table)) of the matched target row. The `delete` action deletes the matched row.
 
   - Each `whenMatched` clause can have an optional condition. If this clause condition exists, the `update` or `delete` action is executed for any matching source-target row pair only when the clause condition is true.
 
@@ -404,7 +404,7 @@ Here is a detailed description of the `merge` programmatic operation.
 
 - For `updateAll` and `insertAll` actions, the source dataset must have all the columns of the target Delta table. The source dataset can have extra columns and they are ignored.
 
-  If you do not want the extra columns to be ignored and instead want to update the target table schema to include new columns, see [merge schema evolution](#merge-schema-evolution).
+  If you do not want the extra columns to be ignored and instead want to update the target table schema to include new columns, see [Automatic schema evolution](#automatic-schema-evolution).
 
 - For all actions, if the data type generated by the expressions producing the target columns are different from the corresponding columns in the target Delta table, `merge` tries to cast them to the types in the table.
 
diff --git a/src/pages/latest/releases.mdx b/src/pages/latest/releases.mdx
index d4a9cb9..0bf6f21 100644
--- a/src/pages/latest/releases.mdx
+++ b/src/pages/latest/releases.mdx
@@ -21,4 +21,4 @@ The following table lists Delta Lake versions and their compatible Apache Spark
 | 1.1.x              | 3.2.x                |
 | 1.0.x              | 3.1.x                |
 | 0.7.x and 0.8.x    | 3.0.x                |
-| Below 0.7.0        | 2.4.2 - 2.4._latest_ |
+| Below 0.7.0        | 2.4.2 - 2.4.4        |