diff --git a/docs/integrations/enterprise-connectors/README.md b/docs/integrations/enterprise-connectors/README.md
new file mode 100644
index 000000000000..86b212087241
--- /dev/null
+++ b/docs/integrations/enterprise-connectors/README.md
@@ -0,0 +1,7 @@
+# Enterprise Connectors
+
+Airbyte Enterprise Connectors are a selection of premium connectors available exclusively for Airbyte Self-Managed Enterprise and Airbyte Teams customers. These connectors, built and maintained by the Airbyte team, provide enhanced capabilities and support for critical enterprise systems, and they are not available to users of Airbyte Open Source and Airbyte Cloud. Key benefits of these connectors include support for larger data sets, parallelism for faster data transfers, and coverage under Airbyte Support SLAs.
+
+If you participate in the pre-release phase of an enterprise connector, you will retain the right to use it after its main release, even if you are not a customer of Airbyte Self-Managed Enterprise or Airbyte Teams.
+
+To learn more about enterprise connectors, please [talk to our sales team](https://airbyte.com/company/talk-to-sales).
diff --git a/docs/integrations/enterprise-connectors/source-oracle.md b/docs/integrations/enterprise-connectors/source-oracle.md
new file mode 100644
index 000000000000..d82130edc3c7
--- /dev/null
+++ b/docs/integrations/enterprise-connectors/source-oracle.md
@@ -0,0 +1,232 @@
+# Source Oracle
+
+Airbyte's incubating Oracle enterprise source connector offers the following features:
+
+- Incremental as well as Full Refresh
+  [sync modes](https://docs.airbyte.com/cloud/core-concepts#connection-sync-modes), providing
+  flexibility in how data is delivered to your destination.
+  Note that incremental syncs using [Change Data Capture (CDC)](https://docs.airbyte.com/understanding-airbyte/cdc) are not yet supported.
+- Reliable replication at any table size with
+  [checkpointing](https://docs.airbyte.com/understanding-airbyte/airbyte-protocol/#state--checkpointing)
+  and chunking of database reads.
+
+> ⚠️ **Please note that the minimum required platform version for this connector is v0.58.0.**
+
+## Features
+
+| Feature                       | Supported   | Notes              |
+| :---------------------------- | :---------- | :----------------- |
+| Full Refresh Sync             | Yes         |                    |
+| Incremental Sync - Append     | Yes         |                    |
+| Replicate Incremental Deletes | Coming soon |                    |
+| CDC (Change Data Capture)     | Coming soon |                    |
+| SSL Support                   | Yes         |                    |
+| SSH Tunnel Connection         | Yes         |                    |
+| Namespaces                    | Yes         | Enabled by default |
+
+The Oracle source does not alter the schema present in your database. Depending on the destination
+connected to this source, however, the schema may be altered. See the destination's documentation
+for more details.
+
+## Getting Started
+
+### Requirements
+
+1. Oracle DB version 23ai, 21c, or 19c.
+2. Dedicated read-only Airbyte user with access to all tables needed for replication.
+
+#### 1. Make sure your database is accessible from the machine running Airbyte
+
+This depends on your networking setup. The easiest way to verify whether Airbyte can connect
+to your Oracle instance is by testing the connection in the UI.
+
+#### 2. Create a dedicated read-only user with access to the relevant tables (Recommended but optional)
+
+This step is optional but highly recommended to allow for better permission control and auditing. Alternatively, you can use Airbyte with an existing user in your database.
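+
+If you plan to reuse an existing user, one way to sanity-check its access up front is to query Oracle's standard dictionary views. This check is optional, and the schema name below is only a placeholder:
+
+```sql
+-- Run these as the existing user you plan to reuse with Airbyte.
+-- SESSION_PRIVS lists the privileges active for the current session;
+-- ALL_TABLES lists the tables accessible to the current user.
+SELECT * FROM SESSION_PRIVS;
+SELECT owner, table_name
+FROM ALL_TABLES
+WHERE owner = '<YOUR_SCHEMA>';
+```
+
+If tables you expect to replicate are missing from the second query, grant `SELECT` on them as described below before running a sync.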
+
+To create a dedicated database user, run the following commands against your database:
+
+```sql
+CREATE USER airbyte IDENTIFIED BY <your_password_here>;
+GRANT CREATE SESSION TO airbyte;
+```
+
+Next, grant the user read-only access to the relevant tables. The simplest way is to grant read access to all tables in the schema as follows:
+
+```sql
+GRANT SELECT ANY TABLE TO airbyte;
+```
+
+Or you can be more granular:
+
+```sql
+GRANT SELECT ON "<schema_a>"."<table_1>" TO airbyte;
+GRANT SELECT ON "<schema_b>"."<table_2>" TO airbyte;
+```
+
+Your database user should now be ready for use with Airbyte.
+
+#### 3. Include the schemas Airbyte should look at when configuring the Airbyte Oracle Source
+
+Schema names are case sensitive. If the field is left empty, it defaults to the upper-cased user name. If the user does not have access to the configured schemas, no tables will be discovered and the connection test will fail.
+
+### Airbyte Cloud
+
+On Airbyte Cloud, only secured connections to your Oracle instance are supported in the source
+configuration.
+Note that while the connector is still incubating, this may not yet be actively enforced.
+You may configure your connection to use SSL or one of the available encryption schemes, or to use an SSH tunnel.
+
+## Oracle encryption schemes
+
+The connection to the Oracle database instance can be established using the following schemes:
+
+1. `Unencrypted` connections will be made using the TCP protocol, and all data over the network will be transmitted unencrypted.
+   Airbyte Cloud will only allow this if an SSH tunnel is also used.
+2. `Native Network Encryption (NNE)` gives you the ability to encrypt database connections without
+   the configuration overhead of SSL / TLS and without the need to open and listen on different ports.
+   In this case, the _SQLNET.ENCRYPTION_CLIENT_ option will always be set to _REQUIRED_ by default:
+   the client or server will only accept encrypted traffic, while still giving you the opportunity to choose
+   an `Encryption Algorithm` according to the security policies you require.
+3. `TLS Encrypted (verify certificate)` gives you the ability to encrypt database connections using
+   the TLS protocol, taking into account the handshake procedure and certificate verification.
+   To use this option, insert the content of the certificate issued by the server into the
+   `SSL PEM file` field.
+
+## Connection to Oracle via an SSH Tunnel
+
+Airbyte has the ability to connect to an Oracle instance via an SSH tunnel. You might want
+to do this when it is not possible (or is against your security policy) to connect to the database
+directly (e.g. it does not have a public IP address).
+
+When using an SSH tunnel, you are configuring Airbyte to connect to an intermediate server (a.k.a.
+a bastion server) that _does_ have direct access to the database. Airbyte connects to the bastion
+and then asks the bastion to connect directly to the server.
+
+Using this feature requires additional configuration when creating the source. We will walk through
+what each piece of configuration means.
+
+1. Configure all fields for the source as you normally would, except `SSH Tunnel Method`.
+
+2. `SSH Tunnel Method` defaults to `No Tunnel` (meaning a direct connection). If you want to use
+   an SSH tunnel, choose `SSH Key Authentication` or `Password Authentication`.
+
+   1. Choose `SSH Key Authentication` if you will be using an RSA private key as your secret for
+      establishing the SSH tunnel (see below for more information on generating this key).
+
+   2. Choose `Password Authentication` if you will be using a password as your secret for
+      establishing the SSH tunnel.
+
+3. `SSH Tunnel Jump Server Host` refers to the intermediate (bastion) server that Airbyte will
+   connect to. This should be a hostname or an IP address.
+
+4. `SSH Connection Port` is the port on the bastion server with which to make the SSH connection.
+   The default port for SSH connections is `22`, so unless you have explicitly changed something,
+   go with the default.
+
+5. `SSH Login Username` is the username that Airbyte should use when connecting to the bastion
+   server. This is NOT the Oracle username.
+
+6. If you are using `Password Authentication`, then `SSH Login Password` should be set to the
+   password of the user from the previous step. If you are using `SSH Key Authentication`, leave this
+   blank. Again, this is not the Oracle password, but the password for the OS user that Airbyte is
+   using to perform commands on the bastion.
+
+7. If you are using `SSH Key Authentication`, then `SSH Private Key` should be set to the RSA
+   private key that you are using to create the SSH connection. This should be the full contents of
+   the key file, starting with `-----BEGIN RSA PRIVATE KEY-----` and ending
+   with `-----END RSA PRIVATE KEY-----`.
+
+### Generating an SSH Key Pair
+
+The connector expects an RSA key in PEM format. To generate this key:
+
+```text
+ssh-keygen -t rsa -m PEM -f myuser_rsa
+```
+
+This produces the private key in PEM format, and the public key remains in the standard format used
+by the `authorized_keys` file on your bastion host. The public key should be added on your bastion
+host to the `authorized_keys` file of whichever user you want Airbyte to use. The private key is
+provided via copy-and-paste to the Airbyte connector configuration screen, so that Airbyte can log
+in to the bastion.
+
+## Change Data Capture (CDC)
+
+We aim to support Oracle CDC soon. Please reach out to your sales engineer if you are interested in being a design partner for CDC support in Oracle.
+
+## Data type mapping
+
+Oracle data types are mapped to the following Airbyte data types when synchronizing data.
+ +| Oracle Type | Airbyte Type | Notes | +| :------------------------------- | :---------------------- | :-------------------------- | +| `BFILE` | string | base-64 encoded binary data | +| `BINARY_FLOAT` | number | | +| `BINARY_DOUBLE` | number | | +| `BLOB` | string | base-64 encoded binary data | +| `BOOL` | boolean | | +| `BOOLEAN` | boolean | | +| `CHAR` | string | | +| `CHAR VARYING` | string | | +| `CHARACTER` | string | | +| `CHARACTER VARYING` | string | | +| `CLOB` | string | | +| `DATE` | date | | +| `DEC` | number | integer when scale is 0 | +| `DECIMAL` | number | integer when scale is 0 | +| `FLOAT` | number | | +| `DOUBLE PRECISION` | number | | +| `REAL` | number | | +| `INT` | number | integer | +| `INTEGER` | number | integer | +| `INTERVAL YEAR TO MONTH` | string | | +| `INTERVAL DAY TO SECOND` | string | | +| `INTERVALDS` | string | | +| `INTERVALYM` | string | | +| `JSON` | object | | +| `LONG` | string | base-64 encoded binary data | +| `LONG RAW` | string | base-64 encoded binary data | +| `NATIONAL CHAR` | string | | +| `NATIONAL CHAR VARYING` | string | | +| `NATIONAL CHARACTER` | string | | +| `NATIONAL CHARACTER VARYING` | string | | +| `NCHAR` | string | | +| `NCHAR VARYING` | string | | +| `NCLOB` | string | | +| `NUMBER` | number | integer when scale is 0 | +| `NUMERIC` | number | integer when scale is 0 | +| `NVARCHAR2` | string | | +| `RAW` | string | base-64 encoded binary data | +| `ROWID` | string | base-64 encoded binary data | +| `SMALLINT` | number | integer | +| `TIMESTAMP` | timestamp | | +| `TIMESTAMP WITH LOCAL TIME ZONE` | timestamp | | +| `TIMESTAMP WITH LOCAL TZ` | timestamp | | +| `TIMESTAMP WITH TIME ZONE` | timestamp with timezone | | +| `TIMESTAMP WITH TZ` | timestamp with timezone | | +| `UROWID` | string | base-64 encoded binary data | +| `VARCHAR` | string | | +| `VARCHAR2` | string | | + +Varray types are mapped to the corresponding Airbyte array type. +This applies also to multiple levels of nesting, i.e. varrays of varrays, and so forth. + +If you do not see a type in this list, assume that it is coerced into a string. We are happy to take +feedback on preferred mappings. + +## Changelog + +
+ Expand to review
+
+The connector is still incubating; this section exists only to satisfy Airbyte's QA checks.
+
+- 0.0.1
+- 0.0.2
+- 0.0.3
+- 0.0.4
+- 0.0.5
+- 0.0.6
+- 0.0.7
+
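+
+As an illustration of the type mappings in the table above, consider a small hypothetical table (the table and column names are examples only, not part of the connector's test data); the comments note the Airbyte type each column would surface as:
+
+```sql
+CREATE TABLE purchases (
+  id          NUMBER(10, 0),             -- number (integer, since the scale is 0)
+  amount      NUMBER(10, 2),             -- number
+  memo        VARCHAR2(255),             -- string
+  purchased   DATE,                      -- date
+  created_at  TIMESTAMP WITH TIME ZONE,  -- timestamp with timezone
+  receipt     BLOB                       -- string (base-64 encoded binary data)
+);
+```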
diff --git a/docusaurus/docusaurus.config.js b/docusaurus/docusaurus.config.js index 4cd2357df1d9..6dbf99a88434 100644 --- a/docusaurus/docusaurus.config.js +++ b/docusaurus/docusaurus.config.js @@ -5,11 +5,12 @@ const yaml = require("js-yaml"); const fs = require("node:fs"); const path = require("node:path"); -const { themes } = require('prism-react-renderer'); +const { themes } = require("prism-react-renderer"); const lightCodeTheme = themes.github; const darkCodeTheme = themes.dracula; const docsHeaderDecoration = require("./src/remark/docsHeaderDecoration"); +const enterpriseDocsHeaderInformation = require("./src/remark/enterpriseDocsHeaderInformation"); const productInformation = require("./src/remark/productInformation"); const connectorList = require("./src/remark/connectorList"); const specDecoration = require("./src/remark/specDecoration"); @@ -23,7 +24,7 @@ const config = { markdown: { mermaid: true, }, - themes: ['@docusaurus/theme-mermaid'], + themes: ["@docusaurus/theme-mermaid"], title: "Airbyte Documentation", tagline: "Airbyte is an open-source data integration platform to build ELT pipelines. Consolidate your data in your data warehouses, lakes and databases.", @@ -80,7 +81,7 @@ const config = { clientModules: [ require.resolve("./src/scripts/cloudStatus.js"), - require.resolve('./src/scripts/download-abctl-buttons.js'), + require.resolve("./src/scripts/download-abctl-buttons.js"), require.resolve("./src/scripts/fontAwesomeIcons.js"), ], @@ -97,7 +98,11 @@ const config = { path: "../docs", exclude: ["**/*.inapp.md"], beforeDefaultRemarkPlugins: [specDecoration, connectorList], // use before-default plugins so TOC rendering picks up inserted headings - remarkPlugins: [docsHeaderDecoration, productInformation], + remarkPlugins: [ + docsHeaderDecoration, + enterpriseDocsHeaderInformation, + productInformation, + ], }, blog: false, theme: { diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index 50b4d0ff724e..fe309e8a51b2 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -4,11 +4,11 @@ const { parseMarkdownContentTitle, parseFrontMatter, } = require("@docusaurus/utils"); -const { type } = require("os"); const connectorsDocsRoot = "../docs/integrations"; const sourcesDocs = `${connectorsDocsRoot}/sources`; const destinationDocs = `${connectorsDocsRoot}/destinations`; +const enterpriseConnectorDocs = `${connectorsDocsRoot}/enterprise-connectors`; function getFilenamesInDir(prefix, dir, excludes) { return fs @@ -81,6 +81,14 @@ function getDestinationConnectors() { ]); } +function getEnterpriseConnectors() { + return getFilenamesInDir( + "integrations/enterprise-connectors/", + enterpriseConnectorDocs, + ["readme"] + ); +} + const sourcePostgres = { type: "category", label: "Postgres", @@ -295,7 +303,6 @@ const buildAConnector = { "connector-development/tutorials/custom-python-connector/concurrency", ], }, - ], }, ], @@ -435,7 +442,7 @@ const connectionConfigurations = { items: [ "using-airbyte/core-concepts/sync-schedules", "using-airbyte/core-concepts/namespaces", - { + { type: "doc", id: "using-airbyte/schema-change-management", }, @@ -453,8 +460,8 @@ const connectionConfigurations = { "using-airbyte/core-concepts/sync-modes/full-refresh-overwrite", ], }, - ], - }; + ], +}; const understandingAirbyte = { type: "category", @@ -482,11 +489,11 @@ module.exports = { sectionHeader("Getting Started"), { type: "doc", - id: "using-airbyte/getting-started/readme", + id: "using-airbyte/getting-started/readme", }, { - type: "doc", - id: 
"using-airbyte/core-concepts/readme", + type: "doc", + id: "using-airbyte/core-concepts/readme", }, { type: "doc", @@ -508,7 +515,7 @@ module.exports = { "integrations/connector-support-levels", sectionHeader("Using Airbyte"), connectionConfigurations, - { + { type: "doc", id: "using-airbyte/core-concepts/typing-deduping", }, @@ -543,6 +550,17 @@ module.exports = { "enterprise-setup/api-access-config", "enterprise-setup/scaling-airbyte", "enterprise-setup/upgrading-from-community", + { + type: "category", + label: "Enterprise Connectors", + link: { + type: "doc", + id: "integrations/enterprise-connectors/README", + }, + items: [...getEnterpriseConnectors()].sort((itemA, itemB) => + itemA.label.localeCompare(itemB.label) + ), + }, ], }, "operator-guides/upgrading-airbyte", @@ -564,7 +582,7 @@ module.exports = { "operator-guides/telemetry", ], }, - + { type: "category", label: "Access Management", @@ -590,11 +608,9 @@ module.exports = { type: "doc", id: "access-management/rbac", }, - items: [ - {type: "doc", id: "access-management/role-mapping"}, - ], + items: [{ type: "doc", id: "access-management/role-mapping" }], }, - ] + ], }, { type: "category", @@ -704,7 +720,7 @@ module.exports = { // Any temporarily archived content should be added here with a comment // to indicate when it was archived and why. -// You can still view docs that are not linked to in the sidebar. +// You can still view docs that are not linked to in the sidebar. // Java Destination template is not currently available for use // "connector-development/tutorials/building-a-java-destination", diff --git a/docusaurus/src/components/HeaderDecoration.jsx b/docusaurus/src/components/HeaderDecoration.jsx index bd1c653b001c..16ee952d894d 100644 --- a/docusaurus/src/components/HeaderDecoration.jsx +++ b/docusaurus/src/components/HeaderDecoration.jsx @@ -226,6 +226,7 @@ const ConnectorMetadataCallout = ({ isCloud, isOss, isPypiPublished, + isEnterprise, supportLevel, github_url, dockerImageTag, @@ -240,17 +241,29 @@ const ConnectorMetadataCallout = ({
- - Airbyte Cloud - - - Airbyte OSS - - - PyAirbyte - + {isEnterprise ? ( + <> + + Enterprise License + + + ) : ( + <> + + Airbyte Cloud + + + Airbyte OSS + + + PyAirbyte + + + )}
@@ -313,6 +326,7 @@ export const HeaderDecoration = ({ isOss: isOssString, isCloud: isCloudString, isPypiPublished: isPypiPublishedString, + isEnterprise: isEnterpriseString, dockerImageTag, supportLevel, iconUrl, @@ -329,6 +343,7 @@ export const HeaderDecoration = ({ const isOss = boolStringToBool(isOssString); const isCloud = boolStringToBool(isCloudString); const isPypiPublished = boolStringToBool(isPypiPublishedString); + const isEnterprise = boolStringToBool(isEnterpriseString); const isLatestCDK = boolStringToBool(isLatestCDKString); const isArchived = supportLevel?.toUpperCase() === "ARCHIVED"; @@ -344,6 +359,7 @@ export const HeaderDecoration = ({ isCloud={isCloud} isOss={isOss} isPypiPublished={isPypiPublished} + isEnterprise={isEnterprise} supportLevel={supportLevel} github_url={github_url} dockerImageTag={dockerImageTag} diff --git a/docusaurus/src/enterprise/enterpriseConnectors.js b/docusaurus/src/enterprise/enterpriseConnectors.js new file mode 100644 index 000000000000..c263322d2978 --- /dev/null +++ b/docusaurus/src/enterprise/enterpriseConnectors.js @@ -0,0 +1,6 @@ +export const EnterpriseConnectors = [ + { + name: "Oracle", + type: "source", + }, +]; diff --git a/docusaurus/src/helpers/objects.js b/docusaurus/src/helpers/objects.js index 98b07e2e4a41..434265b8ec43 100644 --- a/docusaurus/src/helpers/objects.js +++ b/docusaurus/src/helpers/objects.js @@ -37,7 +37,7 @@ const generateCombinations = (str) => { } return results; -} +}; /** * Merge a path tree into a flat array of paths @@ -49,18 +49,20 @@ const generateCombinations = (str) => { * @returns {string[]} A flat array of paths */ const mergePathTree = (pathTree) => { - return pathTree - // reduce [[a], [b,c], [d]] to [[a,b,d], [a,c,d]] - .reduce( - (a, b) => - a - .map((x) => b.map((y) => x.concat(y))) - .reduce((a, b) => a.concat(b), []), - [[]], - ) - // then flatten to ['a.b.d', 'a.c.d'] - .map((x) => x.join(".")); -} + return ( + pathTree + // reduce [[a], [b,c], [d]] to [[a,b,d], [a,c,d]] + .reduce( + (a, b) => + a + .map((x) => b.map((y) => x.concat(y))) + .reduce((a, b) => a.concat(b), []), + [[]] + ) + // then flatten to ['a.b.d', 'a.c.d'] + .map((x) => x.join(".")) + ); +}; /** * Generate all possible paths from a given path @@ -76,7 +78,7 @@ const generatePaths = (path) => { const pathTree = pathChunks.map(generateCombinations); const paths = mergePathTree(pathTree); return paths; -} +}; /** * Get a value from an object using a path OR multiple possible paths @@ -110,6 +112,23 @@ const getFromPaths = (obj, path, defaultValue = undefined) => { return defaultValue; }; +/** REMARK UTILS */ + +const removeUndefined = ([key, value]) => { + if (value === undefined) return false; + return [key, value]; +}; + +const kvToAttribute = ([key, value]) => ({ + type: "mdxJsxAttribute", + name: key, + value: value, +}); + +const toAttributes = (props) => + Object.entries(props).filter(removeUndefined).map(kvToAttribute); + module.exports = { getFromPaths, -}; \ No newline at end of file + toAttributes, +}; diff --git a/docusaurus/src/remark/docsHeaderDecoration.js b/docusaurus/src/remark/docsHeaderDecoration.js index 4d1537b43d4e..c89673dae97d 100644 --- a/docusaurus/src/remark/docsHeaderDecoration.js +++ b/docusaurus/src/remark/docsHeaderDecoration.js @@ -1,22 +1,12 @@ -const { getFromPaths } = require("../helpers/objects"); +const { getFromPaths, toAttributes } = require("../helpers/objects"); const { isDocsPage, getRegistryEntry } = require("./utils"); -const { isPypiConnector, getLatestPythonCDKVersion, 
parseCDKVersion } = require("../connector_registry"); +const { + isPypiConnector, + getLatestPythonCDKVersion, + parseCDKVersion, +} = require("../connector_registry"); const visit = require("unist-util-visit").visit; -const removeUndefined = ([key, value]) => { - if (value === undefined) return false; - return [key, value]; -}; - -const kvToAttribute = ([key, value]) => ({ - type: "mdxJsxAttribute", - name: key, - value: value, -}); - -const toAttributes = (props) => - Object.entries(props).filter(removeUndefined).map(kvToAttribute); - /** * Convert a boolean to a string * @@ -24,7 +14,6 @@ const toAttributes = (props) => */ const boolToBoolString = (bool) => (bool ? "TRUE" : "FALSE"); - const plugin = () => { const transformer = async (ast, vfile) => { const docsPageInfo = isDocsPage(vfile); @@ -42,12 +31,27 @@ const plugin = () => { const originalTitle = node.children[0].value; const originalId = node.data.hProperties.id; - const rawCDKVersion = getFromPaths(registryEntry, "packageInfo_[oss|cloud].cdk_version"); - const syncSuccessRate = getFromPaths(registryEntry, "generated_[oss|cloud].metrics.[all|cloud|oss].sync_success_rate"); - const usageRate = getFromPaths(registryEntry, "generated_[oss|cloud].metrics.[all|cloud|oss].usage"); - const lastUpdated = getFromPaths(registryEntry, "generated_[oss|cloud].source_file_info.metadata_last_modified"); + const rawCDKVersion = getFromPaths( + registryEntry, + "packageInfo_[oss|cloud].cdk_version" + ); + const syncSuccessRate = getFromPaths( + registryEntry, + "generated_[oss|cloud].metrics.[all|cloud|oss].sync_success_rate" + ); + const usageRate = getFromPaths( + registryEntry, + "generated_[oss|cloud].metrics.[all|cloud|oss].usage" + ); + const lastUpdated = getFromPaths( + registryEntry, + "generated_[oss|cloud].source_file_info.metadata_last_modified" + ); - const {version, isLatest, url} = parseCDKVersion(rawCDKVersion, latestPythonCdkVersion); + const { version, isLatest, url } = parseCDKVersion( + rawCDKVersion, + latestPythonCdkVersion + ); const attrDict = { isOss: registryEntry.is_oss, diff --git a/docusaurus/src/remark/enterpriseDocsHeaderInformation.js b/docusaurus/src/remark/enterpriseDocsHeaderInformation.js new file mode 100644 index 000000000000..c851603ed76a --- /dev/null +++ b/docusaurus/src/remark/enterpriseDocsHeaderInformation.js @@ -0,0 +1,48 @@ +const { isEnterpriseConnectorDocsPage } = require("./utils"); +const { toAttributes } = require("../helpers/objects"); +const visit = require("unist-util-visit").visit; + +const plugin = () => { + const transformer = async (ast, vfile) => { + const isDocsPage = isEnterpriseConnectorDocsPage(vfile); + if (!isDocsPage) return; + + let firstHeading = true; + + visit(ast, "heading", (node) => { + if (firstHeading && node.depth === 1 && node.children.length === 1) { + const originalTitle = node.children[0].value; + const originalId = node.data.hProperties.id; + + const attrDict = { + isOss: false, + isCloud: false, + isPypiPublished: false, + isEnterprise: true, + supportLevel: "certified", + dockerImageTag: "custom", + // iconUrl: registryEntry.iconUrl_oss, + // github_url: registryEntry.github_url, + // issue_url: registryEntry.issue_url, + originalTitle, + originalId, + // cdkVersion: version, + // isLatestCDKString: boolToBoolString(isLatest), + // cdkVersionUrl: url, + // syncSuccessRate, + // usageRate, + // lastUpdated, + }; + + firstHeading = false; + node.children = []; + node.type = "mdxJsxFlowElement"; + node.name = "HeaderDecoration"; + node.attributes = 
toAttributes(attrDict); + } + }); + }; + return transformer; +}; + +module.exports = plugin; diff --git a/docusaurus/src/remark/utils.js b/docusaurus/src/remark/utils.js index 087f1f59e46f..616e8739f7aa 100644 --- a/docusaurus/src/remark/utils.js +++ b/docusaurus/src/remark/utils.js @@ -30,6 +30,17 @@ const isDocsPage = (vfile) => { return response; }; +const isEnterpriseConnectorDocsPage = (vfile) => { + if ( + vfile.path.includes("integrations/enterprise-connectors") && + !vfile.path.toLowerCase().includes("readme.md") + ) { + return true; + } + + return false; +}; + const getRegistryEntry = async (vfile) => { if ( !vfile.path.includes("integrations/sources") && @@ -93,5 +104,6 @@ const buildArchivedRegistryEntry = ( module.exports = { isDocsPage, + isEnterpriseConnectorDocsPage, getRegistryEntry, };