From ceb291428f0d59eac471b8d8d176c1b2b331bc57 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 16:04:04 +0100 Subject: [PATCH 01/52] Initial doc movings --- .../airbyte-support.md} | 19 ++- .../code-of-conduct.md | 0 .../slack-code-of-conduct.md | 0 .../licenses/README.md | 0 .../licenses/elv2-license.md | 0 .../licenses/examples.md | 0 .../licenses/license-faq.md | 0 .../licenses/mit-license.md | 0 .../{self-managed => }/README.md | 10 +- .../implementation-guide.md | 16 +-- .../{self-managed => }/sso.md | 10 +- .../connector-support-levels.md} | 0 .../security.md | 2 +- docs/project-overview/README.md | 2 - docs/readme.md | 10 +- docs/troubleshooting.md | 50 +------- docusaurus/redirects.yml | 38 +++++- docusaurus/sidebars.js | 115 ++++++++---------- 18 files changed, 127 insertions(+), 145 deletions(-) rename docs/{operator-guides/contact-support.md => community/airbyte-support.md} (89%) rename docs/{project-overview => community}/code-of-conduct.md (100%) rename docs/{project-overview => community}/slack-code-of-conduct.md (100%) rename docs/{project-overview => developer-guides}/licenses/README.md (100%) rename docs/{project-overview => developer-guides}/licenses/elv2-license.md (100%) rename docs/{project-overview => developer-guides}/licenses/examples.md (100%) rename docs/{project-overview => developer-guides}/licenses/license-faq.md (100%) rename docs/{project-overview => developer-guides}/licenses/mit-license.md (100%) rename docs/enterprise-setup/{self-managed => }/README.md (59%) rename docs/enterprise-setup/{self-managed => }/implementation-guide.md (78%) rename docs/enterprise-setup/{self-managed => }/sso.md (86%) rename docs/{project-overview/product-support-levels.md => integrations/connector-support-levels.md} (100%) rename docs/{operator-guides => managing-airbyte}/security.md (99%) delete mode 100644 docs/project-overview/README.md diff --git a/docs/operator-guides/contact-support.md b/docs/community/airbyte-support.md similarity index 89% rename from docs/operator-guides/contact-support.md rename to docs/community/airbyte-support.md index db42a9aef36f..69c22aa37cab 100644 --- a/docs/operator-guides/contact-support.md +++ b/docs/community/airbyte-support.md @@ -6,14 +6,26 @@ Hold up! Have you looked at [our docs](https://docs.airbyte.com/) yet? We recomm Running Airbyte Open Source and have questions that our docs could not clear up? Post your questions on our [Github Discussions](https://github.com/airbytehq/airbyte/discussions?_gl=1*70s0c6*_ga*MTc1OTkyOTYzNi4xNjQxMjQyMjA0*_ga_HDBMVFQGBH*MTY4OTY5MDQyOC4zNDEuMC4xNjg5NjkwNDI4LjAuMC4w) and also join our community Slack to connect with other Airbyte users. +### Community Slack **Join our Slack community** [HERE](https://slack.airbyte.com/?_gl=1*1h8mjfe*_gcl_au*MTc4MjAxMDQzOS4xNjgyOTczMDYy*_ga*MTc1OTkyOTYzNi4xNjQxMjQyMjA0*_ga_HDBMVFQGBH*MTY4Nzg4OTQ4MC4zMjUuMS4xNjg3ODkwMjE1LjAuMC4w&_ga=2.58571491.813788522.1687789276-1759929636.1641242204)! -Ask your questions first in the #ask-ai channel and if our bot can not assist you, reach out to our community in the #ask-community-for-troubleshooting channel. - +Ask your questions first in the #ask-ai channel and if our bot can not assist you, reach out to our community in the #ask-community-for-troubleshooting channel. If you require personalized support, reach out to our sales team to inquire about [Airbyte Enterprise](https://airbyte.com/airbyte-enterprise). 
+### Airbyte Forum
+
+We are driving our community support from our [forum](https://github.com/airbytehq/airbyte/discussions) on GitHub.
+
+### Office Hour
+
+Airbyte provides a [Daily Office Hour](https://airbyte.com/daily-office-hour) to discuss issues.
+It is a 45-minute meeting: the first 20 minutes are reserved for a weekly topic presentation about Airbyte concepts, and the other 25 minutes are for general questions. The schedule is:
+* Monday, Wednesday, and Friday: 1 PM PST/PDT
+* Tuesday and Thursday: 4 PM CEST
+
+
 ## Airbyte Cloud Support
 
 If you have questions about connector setup, error resolution, or want to report a bug, Airbyte Support is available to assist you. We recommend checking [our documentation](https://docs.airbyte.com/) and searching our [Help Center](https://support.airbyte.com/hc/en-us) before opening a support ticket.
 
@@ -59,5 +71,4 @@ Although we strive to offer our utmost assistance, there are certain requests th
 * Curating unique documentation and training materials
 * Configuring Airbyte to meet security requirements
 
-If you think you will need asssitance when upgrading, we recommend upgrading during our support hours, Monday-Friday 7AM - 7PM ET so we can assist if support is needed. If you upgrade outside of support hours, please submit a ticket and we will assist when we are back online.
-
+If you think you will need assistance when upgrading, we recommend upgrading during our support hours, Monday-Friday 7AM - 7PM ET, so we can assist if support is needed. If you upgrade outside of support hours, please submit a ticket and we will assist when we are back online.
diff --git a/docs/project-overview/code-of-conduct.md b/docs/community/code-of-conduct.md
similarity index 100%
rename from docs/project-overview/code-of-conduct.md
rename to docs/community/code-of-conduct.md
diff --git a/docs/project-overview/slack-code-of-conduct.md b/docs/community/slack-code-of-conduct.md
similarity index 100%
rename from docs/project-overview/slack-code-of-conduct.md
rename to docs/community/slack-code-of-conduct.md
diff --git a/docs/project-overview/licenses/README.md b/docs/developer-guides/licenses/README.md
similarity index 100%
rename from docs/project-overview/licenses/README.md
rename to docs/developer-guides/licenses/README.md
diff --git a/docs/project-overview/licenses/elv2-license.md b/docs/developer-guides/licenses/elv2-license.md
similarity index 100%
rename from docs/project-overview/licenses/elv2-license.md
rename to docs/developer-guides/licenses/elv2-license.md
diff --git a/docs/project-overview/licenses/examples.md b/docs/developer-guides/licenses/examples.md
similarity index 100%
rename from docs/project-overview/licenses/examples.md
rename to docs/developer-guides/licenses/examples.md
diff --git a/docs/project-overview/licenses/license-faq.md b/docs/developer-guides/licenses/license-faq.md
similarity index 100%
rename from docs/project-overview/licenses/license-faq.md
rename to docs/developer-guides/licenses/license-faq.md
diff --git a/docs/project-overview/licenses/mit-license.md b/docs/developer-guides/licenses/mit-license.md
similarity index 100%
rename from docs/project-overview/licenses/mit-license.md
rename to docs/developer-guides/licenses/mit-license.md
diff --git a/docs/enterprise-setup/self-managed/README.md b/docs/enterprise-setup/README.md
similarity index 59%
rename from docs/enterprise-setup/self-managed/README.md
rename to docs/enterprise-setup/README.md
index 21d5fedf047d..9bb1a95450fa 100644
--- a/docs/enterprise-setup/self-managed/README.md
+++ b/docs/enterprise-setup/README.md
@@ -1,12 +1,12 @@
-# Airbyte Self-Managed
+# Airbyte Enterprise
 
-[Airbyte Self-Managed](https://airbyte.com/product/airbyte-enterprise) is the best way to run Airbyte yourself. You get all 300+ pre-built connectors, data never leaves your environment, and Airbyte becomes self-serve in your organization with new tools to manage multiple users, and multiple teams using Airbyte all in one place.
+[Airbyte Enterprise](https://airbyte.com/product/airbyte-enterprise) is the best way to run Airbyte yourself. You get all 300+ pre-built connectors, data never leaves your environment, and Airbyte becomes self-serve in your organization with new tools to manage multiple users and multiple teams using Airbyte, all in one place.
 
-A valid license key is required to get started with Airbyte Self-Managed. [Talk to sales](https://airbyte.com/company/talk-to-sales) to receive your license key.
+A valid license key is required to get started with Airbyte Enterprise. [Talk to sales](https://airbyte.com/company/talk-to-sales) to receive your license key.
 
 The following pages outline how to:
 
-1. [Deploy Airbyte Self-Managed using Kubernetes](./implementation-guide.md)
-2. [Configure Okta for Single Sign-On (SSO) with Airbyte Self-Managed](./sso.md)
+1. [Deploy Airbyte Enterprise using Kubernetes](./implementation-guide.md)
+2. [Configure Okta for Single Sign-On (SSO) with Airbyte Enterprise](./sso.md)
 
 | Feature                   | Description                                                                                                    |
 |---------------------------|--------------------------------------------------------------------------------------------------------------|
diff --git a/docs/enterprise-setup/self-managed/implementation-guide.md b/docs/enterprise-setup/implementation-guide.md
similarity index 78%
rename from docs/enterprise-setup/self-managed/implementation-guide.md
rename to docs/enterprise-setup/implementation-guide.md
index 882a024436bb..6affccf7709d 100644
--- a/docs/enterprise-setup/self-managed/implementation-guide.md
+++ b/docs/enterprise-setup/implementation-guide.md
@@ -3,15 +3,15 @@ import TabItem from '@theme/TabItem';
 
 # Implementation Guide
 
-[Airbyte Self-Managed](./README.md) is in an early access stage for select priority users. Once you [are qualified for an Airbyte Self Managed license key](https://airbyte.com/company/talk-to-sales), you can deploy Airbyte with the following instructions.
+[Airbyte Enterprise](./README.md) is in an early access stage for select priority users. Once you [are qualified for an Airbyte Enterprise license key](https://airbyte.com/company/talk-to-sales), you can deploy Airbyte with the following instructions.
 
-Airbyte Self Managed must be deployed using Kubernetes. This is to enable Airbyte's best performance and scale. The core components \(api server, scheduler, etc\) run as deployments while the scheduler launches connector-related pods on different nodes.
+Airbyte Enterprise must be deployed using Kubernetes, which enables Airbyte's best performance and scale. The core components \(API server, scheduler, etc.\) run as deployments, while the scheduler launches connector-related pods on different nodes.
 
 ## Prerequisites
 
-There are three prerequisites to deploying Self-Managed: installing [helm](https://helm.sh/docs/intro/install/), a Kubernetes cluster, and having configured `kubectl` to connect to the cluster.
+There are three prerequisites to deploying Enterprise: installing [helm](https://helm.sh/docs/intro/install/), provisioning a Kubernetes cluster, and configuring `kubectl` to connect to the cluster.
-For production, we recommend deploying to EKS, GKE or AKS. If you are doing some local testing, follow the cluster setup instructions outlined [here](../../deploying-airbyte/on-kubernetes-via-helm.md#cluster-setup).
+For production, we recommend deploying to EKS, GKE or AKS. If you are doing some local testing, follow the cluster setup instructions outlined [here](/deploying-airbyte/on-kubernetes-via-helm.md#cluster-setup).
 
 To install `kubectl`, please follow [these instructions](https://kubernetes.io/docs/tasks/tools/). To configure `kubectl` to connect to your cluster by using `kubectl config use-context my-cluster-name`, see the following:
 
@@ -38,7 +38,7 @@ To install `kubectl`, please follow [these instructions](https://kubernetes.io/d
 
 
 
-## Deploy Airbyte Self-Managed
+## Deploy Airbyte Enterprise
 
 ### Add Airbyte Helm Repository
 
@@ -60,7 +60,7 @@ cp configs/airbyte.sample.yml configs/airbyte.yml
 
 3. Add your Airbyte Enterprise license key to your `airbyte.yml`.
 
-4. Add your [auth details](/enterprise-setup/self-managed/sso) to your `airbyte.yml`. Auth configurations aren't easy to modify after Airbyte is installed, so please double check them to make sure they're accurate before proceeding.
+4. Add your [auth details](/enterprise-setup/sso) to your `airbyte.yml`. Auth configurations aren't easy to modify after Airbyte is installed, so please double-check them to make sure they're accurate before proceeding.
Configuring auth in your airbyte.yml file @@ -81,7 +81,7 @@ To configure basic auth (deploy without SSO), remove the entire `auth:` section
-### Install Airbyte Self Managed
+### Install Airbyte Enterprise
 
 Install Airbyte Enterprise on helm using the following command:
 
@@ -92,7 +92,7 @@ Install Airbyte Enterprise on helm using the following command:
 
 The default release name is `airbyte-pro`. You can change this via the `RELEASE_NAME` environment variable.
 
-### Customizing your Airbyte Self Managed Deployment
+### Customizing your Airbyte Enterprise Deployment
 
 In order to customize your deployment, you need to create a `values.yaml` file in a local folder and populate it with default configuration override values. A `values.yaml` example can be located in the [charts/airbyte](https://github.com/airbytehq/airbyte-platform/blob/main/charts/airbyte/values.yaml) folder of the Airbyte repository.
 
diff --git a/docs/enterprise-setup/self-managed/sso.md b/docs/enterprise-setup/sso.md
similarity index 86%
rename from docs/enterprise-setup/self-managed/sso.md
rename to docs/enterprise-setup/sso.md
index 55d7053736f7..8aede3304284 100644
--- a/docs/enterprise-setup/self-managed/sso.md
+++ b/docs/enterprise-setup/sso.md
@@ -6,7 +6,7 @@ Airbyte Self Managed currently supports SSO via OIDC with [Okta](https://www.okt
 
 The following instructions walk you through:
 
 1. [Setting up the Okta OIDC App Integration to be used by your Airbyte instance](#setting-up-okta-for-sso)
-2. [Configuring Airbyte Self-Managed to use SSO](#deploying-airbyte-enterprise-with-okta)
+2. [Configuring Airbyte Enterprise to use SSO](#deploying-airbyte-enterprise-with-okta)
 
 ### Setting up Okta for SSO
 
 You will need to create a new Okta OIDC App Integration for your Airbyte instanc
 
 You should create an app integration with **OIDC - OpenID Connect** as the sign-in method and **Web Application** as the application type:
 
-![Screenshot of Okta app integration creation modal](../assets/okta-create-new-app-integration.png)
+![Screenshot of Okta app integration creation modal](./assets/okta-create-new-app-integration.png)
 
 #### App integration name
 
 Please choose a URL-friendly app integration name without spaces or special characters, such as `my-airbyte-app`:
 
-![Screenshot of Okta app integration name](../assets/okta-app-integration-name.png)
+![Screenshot of Okta app integration name](./assets/okta-app-integration-name.png)
 
 Spaces or special characters in this field could result in invalid redirect URIs.
 
@@ -40,13 +40,13 @@
 Sign-out redirect URIs
 
 ```
 <your-airbyte-domain>/auth/realms/airbyte/broker/<app-integration-name>/endpoint/logout_response
 ```
 
-![Okta app integration name screenshot](../assets/okta-login-redirect-uris.png)
+![Okta app integration name screenshot](./assets/okta-login-redirect-uris.png)
 
 _Example values_
 
 `<your-airbyte-domain>` should point to where your Airbyte instance will be available, including the http/https protocol.
 
-## Deploying Airbyte Self-Managed with Okta
+## Deploying Airbyte Enterprise with Okta
 
 Once your Okta app is set up, you're ready to deploy Airbyte with SSO. 
Take note of the following configuration values, as you will need them to configure Airbyte to use your new Okta SSO app integration: diff --git a/docs/project-overview/product-support-levels.md b/docs/integrations/connector-support-levels.md similarity index 100% rename from docs/project-overview/product-support-levels.md rename to docs/integrations/connector-support-levels.md diff --git a/docs/operator-guides/security.md b/docs/managing-airbyte/security.md similarity index 99% rename from docs/operator-guides/security.md rename to docs/managing-airbyte/security.md index a887e8bd5b91..b94e04d8b8d4 100644 --- a/docs/operator-guides/security.md +++ b/docs/managing-airbyte/security.md @@ -1,4 +1,4 @@ -# Airbyte Security +# Security Airbyte is committed to keeping your data safe by following industry-standard practices for securing physical deployments, setting access policies, and leveraging the security features of leading Cloud providers. diff --git a/docs/project-overview/README.md b/docs/project-overview/README.md deleted file mode 100644 index a427d02b0519..000000000000 --- a/docs/project-overview/README.md +++ /dev/null @@ -1,2 +0,0 @@ -# Project Overview - diff --git a/docs/readme.md b/docs/readme.md index cbf550c2a7a6..e9ef4202203b 100644 --- a/docs/readme.md +++ b/docs/readme.md @@ -4,18 +4,18 @@ Whether you are an Airbyte user or contributor, we have docs for you! ## For Airbyte Cloud users -Browse the [connector catalog](https://docs.airbyte.com/integrations/) to find the connector you want. In case the connector is not yet supported on Airbyte Cloud, consider using [Airbyte Open Source](#for-airbyte-open-source-users). +Browse the [connector catalog](/integrations/) to find the connector you want. In case the connector is not yet supported on Airbyte Cloud, consider using [Airbyte Open Source](#for-airbyte-open-source-users). -Next, check out the [step-by-step tutorial](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud) to sign up for Airbyte Cloud, understand Airbyte [concepts](https://docs.airbyte.com/cloud/core-concepts), and run your first sync. Then learn how to [use your Airbyte Cloud account](https://docs.airbyte.com/category/using-airbyte-cloud). +Next, check out the [step-by-step tutorial](/cloud/getting-started-with-airbyte-cloud) to sign up for Airbyte Cloud, understand Airbyte [concepts](/cloud/core-concepts), and run your first sync. Then learn how to [use your Airbyte Cloud account](/category/using-airbyte-cloud). ## For Airbyte Open Source users -Browse the [connector catalog](https://docs.airbyte.com/integrations/) to find the connector you want. If the connector is not yet supported on Airbyte Open Source, [build your own connector](https://docs.airbyte.com/connector-development/). +Browse the [connector catalog](/integrations/) to find the connector you want. If the connector is not yet supported on Airbyte Open Source, [build your own connector](/connector-development/). -Next, check out the [Airbyte Open Source QuickStart](https://docs.airbyte.com/quickstart/deploy-airbyte). Then learn how to [deploy](https://docs.airbyte.com/deploying-airbyte/local-deployment) and [manage](https://docs.airbyte.com/operator-guides/upgrading-airbyte) Airbyte Open Source in your cloud infrastructure. +Next, check out the [Airbyte Open Source QuickStart](/quickstart/deploy-airbyte). Then learn how to [deploy](/deploying-airbyte/local-deployment) and [manage](/operator-guides/upgrading-airbyte) Airbyte Open Source in your cloud infrastructure. 
## For Airbyte contributors -To contribute to Airbyte code, connectors, and documentation, refer to our [Contributing Guide](https://docs.airbyte.com/contributing-to-airbyte/). +To contribute to Airbyte code, connectors, and documentation, refer to our [Contributing Guide](/contributing-to-airbyte/). [![GitHub stars](https://img.shields.io/github/stars/airbytehq/airbyte?style=social&label=Star&maxAge=2592000)](https://GitHub.com/airbytehq/airbyte/stargazers/) [![License](https://img.shields.io/static/v1?label=license&message=MIT&color=brightgreen)](https://github.com/airbytehq/airbyte/tree/a9b1c6c0420550ad5069aca66c295223e0d05e27/LICENSE/README.md) [![License](https://img.shields.io/static/v1?label=license&message=ELv2&color=brightgreen)](https://github.com/airbytehq/airbyte/tree/a9b1c6c0420550ad5069aca66c295223e0d05e27/LICENSE/README.md) diff --git a/docs/troubleshooting.md b/docs/troubleshooting.md index b9a5d7d12472..4ecc7eaf2f9d 100644 --- a/docs/troubleshooting.md +++ b/docs/troubleshooting.md @@ -1,4 +1,4 @@ -# Troubleshooting & FAQ +# Troubleshooting Welcome to the Airbyte troubleshooting guide! Like any platform, you may experience issues when using Airbyte. This guide is designed to help you diagnose and resolve any problems you may encounter while using Airbyte. By following the troubleshooting steps outlined in this guide, you can quickly and effectively identify the root cause of the issue and take steps to resolve it. We recommend checking this guide whenever you encounter an issue with Airbyte to help ensure a smooth and uninterrupted experience with our platform. Let's dive in! @@ -12,48 +12,6 @@ Step 4: Open a Github ticket. If you're still unable to resolve the issue after Airbyte is an open source project with a vibrant community that fosters collaboration and mutual support. To ensure accessible troubleshooting guidance, Airbyte offers multiple platforms for users to ask and discuss issues, including the Airbyte Github, Airbyte Community Slack (which is over 10,000 users), and the Airbyte Forum. In addition, Airbyte hosts daily office hours that include topic demonstrations and dedicated space for issue discussion in Zoom meetings. In addition to these community resources, Airbyte also offers premium support packages for users who require additional assistance beyond what is provided by the community. -## OSS Premium Support -Open source [premium support packages](https://airbyte.com/talk-to-sales-premium-support) are a great option for who use Airbyte OSS and need additional assistance beyond what is provided by the community. These packages typically include access to a dedicated support team that can provide assistance with installation, configuration, troubleshooting, and other technical issues. Premium support packages also often include faster response times, guaranteed issue resolution, and access to updates and patches. By opting for a premium support package, users can enjoy the benefits of open source software while also receiving the peace of mind they need to keep their systems running smoothly. - -Premier Support comes with: - -* 1-business-day SLA for your Severity 0 and 1 -* 2-business-day SLA for your Severity 2 and 3 -* 1-week Pull Request review SLA for first comment -If you need better SLA times, we can definitely discuss this, don't hesitate to [talk to our team](https://airbyte.com/talk-to-sales) about it. You can also see more details about it in our pricing page. 
- -## Office Hour -Airbyte provides a [Daily Office Hour](https://airbyte.com/daily-office-hour) to discuss issues. -It is a 45 minute meeting, the first 20 minutes are reserved to a weekly topic presentation about Airbyte concepts and the others 25 minutes are for general questions. The schedule is: -* Monday, Wednesday and Fridays: 1 PM PST/PDT -* Tuesday and Thursday: 4 PM CEST - - -## Github Issues -Whenever you face an issue using a connector or with the platform you're welcome to report opening a Github issue. -https://github.com/airbytehq/airbyte - - -## Airbyte Slack -You can access Airbyte Slack [here](https://slack.airbyte.com/). - -**Before posting on a channel this please first check if a similar question was already answered.** - -**The existing categories**: -* `#help-connections-issues`: for any questions or issues on your connections -* `#help-infrastructure-deployment`: for any questions or issues on your deployment and infrastructure -* `#help-connector-development`: for any questions about on the CDKs and issues while building a custom connector -* `#help-api-cli-orchestration`: for any questions or issues about the API, CLI, any scheduling effort. -* `#help-contributions`: for any questions about contributing to Airbyte’s codebase - -## Airbyte Forum -We are driving our community support from our [forum](https://github.com/airbytehq/airbyte/discussions). - -**Before posting on this forum please first check if a similar question was already answered.** - -**The existing categories**: -* 🙏 Questions: Ask the community for help on your question. As a reminder, the Airbyte team won’t provide help here, as our support is part of our Airbyte Cloud and Airbyte Enterprise offers. -* 💡 Ideas: Share ideas for new features, improvements, or feedback. -* 🙌 Show & Tell: Share projects, tutorials, videos, and articles you are working on. -* 🫶 Kind words: Show off something you love about Airbyte -* 🐙 General: For anything that doesn’t fit in the above categories +:::info +You can check all your [support options](./community/support). 
+:::
\ No newline at end of file
diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml
index 28a7f499bc15..f8d9eec8008e 100644
--- a/docusaurus/redirects.yml
+++ b/docusaurus/redirects.yml
@@ -9,10 +9,6 @@
   to: /understanding-airbyte/airbyte-protocol
 - from: /integrations/sources/appstore-singer
   to: /integrations/sources/appstore
-- from:
-    - /project-overview/security
-    - /operator-guides/securing-airbyte
-  to: /operator-guides/security
 - from: /connector-development/config-based/
   to: /connector-development/config-based/low-code-cdk-overview
 - from: /project-overview/changelog
@@ -33,5 +29,35 @@
   to: /cloud/managing-airbyte-cloud/manage-connection-state
 - from: /cloud/managing-airbyte-cloud/edit-stream-configuration
   to: /cloud/managing-airbyte-cloud/configuring-connections
-- from: /project-overview/product-release-stages
-  to: /project-overview/product-support-levels
+# November 2023 documentation restructure:
+- from:
+    - /project-overview/product-support-levels
+    - /project-overview/product-release-stages
+  to: /integrations/connector-support-levels
+- from: /operator-guides/contact-support
+  to: /community/airbyte-support
+- from: /project-overview/code-of-conduct
+  to: /community/code-of-conduct
+- from: /project-overview/slack-code-of-conduct
+  to: /community/slack-code-of-conduct
+- from: /project-overview/licenses/
+  to: /developer-guides/licenses/
+- from: /project-overview/licenses/license-faq
+  to: /developer-guides/licenses/license-faq
+- from: /project-overview/licenses/elv2-license
+  to: /developer-guides/licenses/elv2-license
+- from: /project-overview/licenses/mit-license
+  to: /developer-guides/licenses/mit-license
+- from: /project-overview/licenses/examples
+  to: /developer-guides/licenses/examples
+- from: /enterprise-setup/self-managed/
+  to: /enterprise-setup/
+- from: /enterprise-setup/self-managed/implementation-guide
+  to: /enterprise-setup/implementation-guide
+- from: /enterprise-setup/self-managed/sso
+  to: /enterprise-setup/sso
+- from:
+    - /project-overview/security
+    - /operator-guides/securing-airbyte
+    - /operator-guides/security
+  to: /managing-airbyte/security
\ No newline at end of file
diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js
index 55f4497d1e22..a590a1cc9824 100644
--- a/docusaurus/sidebars.js
+++ b/docusaurus/sidebars.js
@@ -418,20 +418,7 @@ const deployAirbyte = {
   ],
 };
 
-const airbyteSelfManaged = {
-  type: "category",
-  label: "Airbyte Self-Managed",
-  link: {
-    type: "doc",
-    id: "enterprise-setup/self-managed/README",
-  },
-  items: [
-    "enterprise-setup/self-managed/implementation-guide",
-    "enterprise-setup/self-managed/sso",
-  ]
-}
-
-const operatorGuide = {
+const managingAirbyte = {
   type: "category",
   label: "Manage Airbyte",
   link: {
@@ -501,16 +488,6 @@ const understandingAirbyte = {
   ],
 };
 
-const security = {
-  type: "doc",
-  id: "operator-guides/security",
-};
-
-const support = {
-  type: "doc",
-  id: "operator-guides/contact-support",
-};
-
 module.exports = {
   mySidebar: [
     {
@@ -521,18 +498,41 @@ module.exports = {
       sectionHeader("Airbyte Connectors"),
      connectorCatalog,
      buildAConnector,
-      sectionHeader("Airbyte Cloud"),
-      ...airbyteCloud,
-      sectionHeader("Airbyte Open Source (OSS)"),
+      "integrations/connector-support-levels",
+      // -- begin legacy
+      // sectionHeader("Airbyte Cloud"),
+      // ...airbyteCloud,
+      // sectionHeader("Airbyte Open Source (OSS)"),
+      // ossGettingStarted,
+      // deployAirbyte,
+      // operatorGuide,
+      // {
+      //   type: "doc",
+      //   id: "troubleshooting",
+      // },
+      // sectionHeader("Enterprise Setup"), 
+ // airbyteSelfManaged, + // -- end legacy + sectionHeader("Using Airbyte"), ossGettingStarted, + ...airbyteCloud, + "troubleshooting", + sectionHeader("Managing Airbyte"), deployAirbyte, - operatorGuide, + managingAirbyte, + "managing-airbyte/security", { - type: "doc", - id: "troubleshooting", + type: "category", + label: "Airbyte Enterprise", + link: { + type: "doc", + id: "enterprise-setup/README", + }, + items: [ + "enterprise-setup/implementation-guide", + "enterprise-setup/sso", + ] }, - sectionHeader("Enterprise Setup"), - airbyteSelfManaged, sectionHeader("Developer Guides"), { type: "doc", @@ -548,42 +548,31 @@ module.exports = { }, understandingAirbyte, contributeToAirbyte, - sectionHeader("Resources"), - support, - security, { type: "category", - label: "Project Overview", + label: "Licenses", + link: { + type: "doc", + id: "developer-guides/licenses/README", + }, items: [ - { - type: "link", - label: "Roadmap", - href: "https://go.airbyte.com/roadmap", - }, - "project-overview/product-support-levels", - "project-overview/slack-code-of-conduct", - "project-overview/code-of-conduct", - { - type: "link", - label: "Airbyte Repository", - href: "https://github.com/airbytehq/airbyte", - }, - { - type: "category", - label: "Licenses", - link: { - type: "doc", - id: "project-overview/licenses/README", - }, - items: [ - "project-overview/licenses/license-faq", - "project-overview/licenses/elv2-license", - "project-overview/licenses/mit-license", - "project-overview/licenses/examples", - ], - }, + "developer-guides/licenses/license-faq", + "developer-guides/licenses/elv2-license", + "developer-guides/licenses/mit-license", + "developer-guides/licenses/examples", ], }, + sectionHeader("Community"), + // TODO: Write a "getting in touch or overview doc" + "community/airbyte-support", + "community/code-of-conduct", + "community/slack-code-of-conduct", + sectionHeader("Product Updates"), + { + type: "link", + label: "Roadmap", + href: "https://go.airbyte.com/roadmap", + }, { type: "category", label: "Release Notes", From 5d79191086a2893bdbf04743bdcd585abf129f5a Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 16:40:18 +0100 Subject: [PATCH 02/52] Moving more files --- .../getting-started-with-airbyte-cloud.md | 178 ------------------ .../security.md | 0 .../configuring-sync-notifications.md | 2 + docs/quickstart/deploy-airbyte.md | 28 --- docs/quickstart/getting-started.md | 105 ----------- docs/troubleshooting.md | 2 +- .../getting-started}/add-a-destination.md | 10 +- .../getting-started}/add-a-source.md | 6 +- docs/using-airbyte/getting-started/readme.md | 5 + .../getting-started}/set-up-a-connection.md | 12 +- docusaurus/redirects.yml | 14 +- docusaurus/sidebars.js | 23 ++- 12 files changed, 50 insertions(+), 335 deletions(-) delete mode 100644 docs/cloud/getting-started-with-airbyte-cloud.md rename docs/{managing-airbyte => operating-airbyte}/security.md (100%) delete mode 100644 docs/quickstart/deploy-airbyte.md delete mode 100644 docs/quickstart/getting-started.md rename docs/{quickstart => using-airbyte/getting-started}/add-a-destination.md (81%) rename docs/{quickstart => using-airbyte/getting-started}/add-a-source.md (86%) create mode 100644 docs/using-airbyte/getting-started/readme.md rename docs/{quickstart => using-airbyte/getting-started}/set-up-a-connection.md (78%) diff --git a/docs/cloud/getting-started-with-airbyte-cloud.md b/docs/cloud/getting-started-with-airbyte-cloud.md deleted file mode 100644 index 2fecf212572f..000000000000 --- 
a/docs/cloud/getting-started-with-airbyte-cloud.md +++ /dev/null @@ -1,178 +0,0 @@ -# Getting Started with Airbyte Cloud - -This page guides you through setting up your Airbyte Cloud account, setting up a source, destination, and connection, verifying the sync, and allowlisting an IP address. - -## Set up your Airbyte Cloud account - -To use Airbyte Cloud: - -1. If you haven't already, [sign up for Airbyte Cloud](https://cloud.airbyte.com/signup?utm_campaign=22Q1_AirbyteCloudSignUpCampaign_Trial&utm_source=Docs&utm_content=SetupGuide) using your email address, Google login, or GitHub login. - - Airbyte Cloud offers a 14-day free trial that begins after your first successful sync. For more information, see [Pricing](https://airbyte.com/pricing). - - :::note - If you are invited to a workspace, you currently cannot use your Google login to create a new Airbyte account. - ::: - -2. If you signed up using your email address, Airbyte will send you an email with a verification link. On clicking the link, you'll be taken to your new workspace. - - :::info - A workspace lets you collaborate with team members and share resources across your team under a shared billing account. - ::: - -## Set up a source - -:::info -A source is an API, file, database, or data warehouse that you want to ingest data from. -::: - -To set up a source: - -1. On the Airbyte Cloud dashboard, click **Sources**. -2. On the Set up the source page, select the source you want to set up from the **Source catalog**. Airbyte currently offers more than 200 source connectors in Cloud to choose from. Once you've selected the source, a Setup Guide will lead you through the authentication and setup of the source. - -3. Click **Set up source**. - -## Set up a destination - -:::info -A destination is a data warehouse, data lake, database, or an analytics tool where you want to load your extracted data. -::: - -To set up a destination: - -1. On the Airbyte Cloud dashboard, click **Destinations**. -2. On the Set up the Destination page, select the destination you want to set up from the **Destination catalog**. Airbyte currently offers more than 38 destination connectors in Cloud to choose from. Once you've selected the destination, a Setup Guide will lead you through the authentication and setup of the source. -3. Click **Set up destination**. - -## Set up a connection - -:::info -A connection is an automated data pipeline that replicates data from a source to a destination. -::: - -Setting up a connection involves configuring the following parameters: - -| Replication Setting | Description | -| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- | -| [Destination Namespace](../understanding-airbyte/namespaces.md) and stream prefix | Where should the replicated data be written to? | -| Replication Frequency | How often should the data sync? | -| [Data Residency](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-data-residency#choose-the-data-residency-for-a-connection) | Where should the data be processed? | -| [Schema Propagation](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-schema-changes) | Should schema drift be automated? | - -After configuring the connection settings, you will then define specifically what data will be synced. - -:::info -A connection's schema consists of one or many streams. 
Each stream is most commonly associated with a database table or an API endpoint. Within a stream, there can be one or many fields or columns. -::: - -| Catalog Selection | Description | -| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- | -| Stream Selection | Which streams should be replicated from the source to the destination? | -| Column Selection | Which fields should be included in the sync? | -| [Sync Mode](../understanding-airbyte/connections/README.md) | How should the streams be replicated (read and written)? | - -To set up a connection: - -:::tip - -Set your [default data residency](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-data-residency#choose-your-default-data-residency) before creating a new connection to ensure your data is processed in the correct region. - -::: - -1. On the Airbyte Cloud dashboard, click **Connections** and then click **+ New connection**. -2. Select a source: - - - To use a data source you've already set up with Airbyte, select from the list of existing sources. Click the source to use it. - - To set up a new source, select **Set up a new source** and fill out the fields relevant to your source using the Setup Guide. - -3. Select a destination: - - - To use a data source you've already set up with Airbyte, select from the list of existing destinations. Click the destination to use it. - - To set up a new destination, select **Set up a new destination** and fill out the fields relevant to your destination using the Setup Guide. - - Airbyte will scan the schema of the source, and then display the **Connection Configuration** page. - -4. From the **Replication frequency** dropdown, select how often you want the data to sync from the source to the destination. The default replication frequency is **Every 24 hours**. You can also set up [cron scheduling](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html). - - Reach out to [Sales](https://airbyte.com/company/talk-to-sales) if you require replication more frequently than once per hour. - -5. From the **Destination Namespace** dropdown, select the format in which you want to store the data in the destination. Note: The default configuration is **Destination default**. - -| Destination Namepsace | Description | -| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- | -| Destination default | All streams will be replicated to the single default namespace defined by the Destination. For more details, see ​​Destination Connector Settings | -| Mirror source structure | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. | -| Custom format | All streams will be replicated to a single user-defined namespace. 
See Custom format for more details | - -:::tip -To ensure your data is synced correctly, see our examples of how to use the [Destination Namespace](../understanding-airbyte/namespaces.md#examples) -::: - -6. (Optional) In the **Destination Stream Prefix (Optional)** field, add a prefix to stream names. For example, adding a prefix `airbyte_` renames the stream `projects` to `airbyte_projects`. This is helpful if you are sending multiple connections to the same Destination Namespace to ensure connections do not conflict when writing to the destination. - -7. Select in the **Detect and propagate schema changes** dropdown whether Airbyte should propagate schema changes. See more details about how we handle [schema changes](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-schema-changes). - - -8. Activate the streams you want to sync by toggling the **Sync** button on. Use the **Search stream name** search box to find streams quickly. If you want to sync all streams, bulk toggle to enable all streams. - -9. Configure the stream settings: - 1. **Data Destination**: Where the data will land in the destination - 2. **Stream**: The table name in the source - 3. **Sync mode**: How the data will be replicated from the source to the destination. - - For the source: - - - Select **Full Refresh** to copy the entire dataset each time you sync - - Select **Incremental** to replicate only the new or modified data - - For the destination: - - - Select **Overwrite** to erase the old data and replace it completely - - Select **Append** to capture changes to your table - **Note:** This creates duplicate records - - Select **Append + Deduped** to mirror your source while keeping records unique (most common) - - **Note:** Some sync modes may not yet be available for the source or destination. - - 4. **Cursor field**: Used in **Incremental** sync mode to determine which records to sync. Airbyte pre-selects the cursor field for you (example: updated date). If you have multiple cursor fields, select the one you want. - 5. **Primary key**: Used in **Append + Deduped** sync mode to determine the unique identifier. - 6. Choose which fields or columns to sync. By default, all fields are synced. - -10. Click **Set up connection**. -11. Airbyte tests the connectio setup. If the test is successful, Airbyte will save the configuration. If the Replication Frequency uses a preset schedule or CRON, your first sync will immediately begin! - -## Verify the sync - -Once the first sync has completed, you can verify the sync has completed by checking in Airbyte Cloud and in your destination. - -1. On the Airbyte Cloud dashboard, click **Connections**. The list of connections is displayed. Click on the connection you just set up. -2. The **Job History** tab shows each sync run, along with the sync summary of data and rows moved. You can also manually trigger syncs or view detailed logs for each sync here. -3. Check the data at your destination. If you added a Destination Stream Prefix while setting up the connection, make sure to search for the stream name with the prefix. 
- -## Allowlist IP addresses - -Depending on your [data residency](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-data-residency#choose-your-default-data-residency) location, you may need to allowlist the following IP addresses to enable access to Airbyte: - -### United States and Airbyte Default - -#### GCP region: us-west3 - -[comment]: # "IMPORTANT: if changing the list of IP addresses below, you must also update the connector.airbyteCloudIpAddresses LaunchDarkly flag to show the new list so that the correct list is shown in the Airbyte Cloud UI, then reach out to the frontend team and ask them to update the default value in the useAirbyteCloudIps hook!" - -- 34.106.109.131 -- 34.106.196.165 -- 34.106.60.246 -- 34.106.229.69 -- 34.106.127.139 -- 34.106.218.58 -- 34.106.115.240 -- 34.106.225.141 - -### European Union - -#### AWS region: eu-west-3 - -- 13.37.4.46 -- 13.37.142.60 -- 35.181.124.238 diff --git a/docs/managing-airbyte/security.md b/docs/operating-airbyte/security.md similarity index 100% rename from docs/managing-airbyte/security.md rename to docs/operating-airbyte/security.md diff --git a/docs/operator-guides/configuring-sync-notifications.md b/docs/operator-guides/configuring-sync-notifications.md index 6418aa2ffab5..837310c00af2 100644 --- a/docs/operator-guides/configuring-sync-notifications.md +++ b/docs/operator-guides/configuring-sync-notifications.md @@ -1,5 +1,7 @@ # Configuring Sync Notifications +// TODO: merge into other notification doc + ## Overview You can set up Airbyte to notify you when syncs have **failed** or **succeeded**. This is achieved through a webhook, a URL that you can input into other applications to get real time data from Airbyte. diff --git a/docs/quickstart/deploy-airbyte.md b/docs/quickstart/deploy-airbyte.md deleted file mode 100644 index 4df34e9aa05a..000000000000 --- a/docs/quickstart/deploy-airbyte.md +++ /dev/null @@ -1,28 +0,0 @@ -# Deploy Airbyte - -Deploying Airbyte Open-Source just takes two steps. - -1. Install Docker on your workstation \(see [instructions](https://www.docker.com/products/docker-desktop)\). Make sure you're on the latest version of `docker-compose`. -2. Run the following commands in your terminal: - -```bash -git clone https://github.com/airbytehq/airbyte.git -cd airbyte -./run-ab-platform.sh -``` - -Once you see an Airbyte banner, the UI is ready to go at [http://localhost:8000](http://localhost:8000)! You will be asked for a username and password. By default, that's username `airbyte` and password `password`. Once you deploy airbyte to your servers, **be sure to change these** in your `.env` file. - -Alternatively, if you have an Airbyte Cloud invite, just follow [these steps.](../deploying-airbyte/on-cloud.md) - -If you need direct access to our team for any kind of assistance, don't hesitate to [talk to our team](https://airbyte.com/talk-to-sales-premium-support) to discuss about our premium support offers. - -## FAQ - -If you have any questions about the Airbyte Open-Source setup and deployment process, head over to our [Getting Started FAQ](https://github.com/airbytehq/airbyte/discussions/categories/questions) on our Airbyte Forum that answers the following questions and more: - -- How long does it take to set up Airbyte? -- Where can I see my data once I've run a sync? -- Can I set a start time for my sync? - -If there are any questions that we couldn't answer here, we'd love to help you get started. 
[Join our Slack](https://airbytehq.slack.com/ssb/redirect) and feel free to ask your questions in the \#getting-started channel. diff --git a/docs/quickstart/getting-started.md b/docs/quickstart/getting-started.md deleted file mode 100644 index afb0e3408522..000000000000 --- a/docs/quickstart/getting-started.md +++ /dev/null @@ -1,105 +0,0 @@ -# Getting Started - -## Goal - -During this getting started tutorial, we are going to replicate currencies closing price into a JSON file. - -## Start Airbyte - -First of all, make sure you have Docker and Docker Compose installed. Then run the following commands: - -```text -git clone https://github.com/airbytehq/airbyte.git -cd airbyte -./run-ab-platform.sh -``` - -Once you see an Airbyte banner, the UI is ready to go at [http://localhost:8000/](http://localhost:8000/). - -## Set up your preferences - -You should see an onboarding page. Enter your email if you want updates about Airbyte and continue. - -![](../.gitbook/assets/airbyte_get-started.png) - -## Set up your first connection - -### Create a source - -The source we are creating will pull data from an external API. It will replicate the closing price of currencies compared to USD since the specified start date. - -To set it up, just follow the instructions on the screenshot below. - -:::info - -You might have to wait ~30 seconds before the fields show up because it is the first time you're using Airbyte. - -::: - -![](../.gitbook/assets/demo_source.png) - -### Create a destination - -The destination we are creating is a simple JSON line file, meaning that it will contain one JSON object per line. Each objects will represent data extracted from the source. - -The resulting files will be located in `/tmp/airbyte_local/json_data` - -:::caution - -Please make sure that Docker Desktop has access to `/tmp` (and `/private` on a MacOS, as /tmp has a symlink that points to /private. It will not work otherwise). You allow it with "File sharing" in `Settings -> Resources -> File sharing -> add the one or two above folder` and hit the "Apply & restart" button. - -::: - -To set it up, just follow the instructions on the screenshot below. - -:::info - -You might have to wait ~30 seconds before the fields show up because it is the first time you're using Airbyte. - -::: - -![](../.gitbook/assets/demo_destination.png) - -### Create connection - -When we create the connection, we can select which data stream we want to replicate. We can also select if we want an incremental replication. The replication will run at the specified sync frequency. - -To set it up, just follow the instructions on the screenshot below. - -![](../.gitbook/assets/demo_connection.png) - -## Check the logs of your first sync - -After you've completed the onboarding, you will be redirected to the source list and will see the source you just added. Click on it to find more information about it. You will now see all the destinations connected to that source. Click on it and you will see the sync history. - -From there, you can look at the logs, download them, force a sync and adjust the configuration of your connection. - -![](../.gitbook/assets/demo_history.png) - -## Check the data of your first sync - -Now let's verify that this worked: - -```bash -cat /tmp/airbyte_local/json_data/_airbyte_raw_exchange_rate.jsonl -``` - -You should see one line for each day that was replicated. - -If you have [`jq`](https://stedolan.github.io/jq/) installed, let's look at the evolution of `EUR`. 
- -```bash -cat /tmp/airbyte_local/test_json/_airbyte_raw_exchange_rate.jsonl | -jq -c '.data | {date: .date, EUR: .EUR }' -``` - -And there you have it. You've pulled data from an API directly into a file and all of the actual configuration for this replication only took place in the UI. - -## That's it! - -This is just the beginning of using Airbyte. We support a large collection of sources and destinations. You can even contribute your own. - -If you have any questions at all, please reach out to us on [Slack](https://slack.airbyte.io/). We’re still in alpha, so if you see any rough edges or want to request a connector you need, please create an issue on our [Github](https://github.com/airbytehq/airbyte) or leave a thumbs up on an existing issue. - -Thank you and we hope you enjoy using Airbyte. - diff --git a/docs/troubleshooting.md b/docs/troubleshooting.md index 4ecc7eaf2f9d..a5db81cdbba2 100644 --- a/docs/troubleshooting.md +++ b/docs/troubleshooting.md @@ -13,5 +13,5 @@ Step 4: Open a Github ticket. If you're still unable to resolve the issue after Airbyte is an open source project with a vibrant community that fosters collaboration and mutual support. To ensure accessible troubleshooting guidance, Airbyte offers multiple platforms for users to ask and discuss issues, including the Airbyte Github, Airbyte Community Slack (which is over 10,000 users), and the Airbyte Forum. In addition, Airbyte hosts daily office hours that include topic demonstrations and dedicated space for issue discussion in Zoom meetings. In addition to these community resources, Airbyte also offers premium support packages for users who require additional assistance beyond what is provided by the community. :::info -You can check all your [support options](./community/support). +You can check all your [support options](./community/airbyte-support). ::: \ No newline at end of file diff --git a/docs/quickstart/add-a-destination.md b/docs/using-airbyte/getting-started/add-a-destination.md similarity index 81% rename from docs/quickstart/add-a-destination.md rename to docs/using-airbyte/getting-started/add-a-destination.md index 594acd02cf9e..5d139a0d41a8 100644 --- a/docs/quickstart/add-a-destination.md +++ b/docs/using-airbyte/getting-started/add-a-destination.md @@ -1,20 +1,20 @@ # Add a Destination -Destinations are the data warehouses, data lakes, databases and analytics tools where you will load the data from your chosen source(s). The steps to setting up your first destination are very similar to those for [setting up a source](https://docs.airbyte.com/quickstart/add-a-source). +Destinations are the data warehouses, data lakes, databases and analytics tools where you will load the data from your chosen source(s). The steps to setting up your first destination are very similar to those for [setting up a source](./add-a-source). Once you've logged in to your Airbyte Open Source deployment, click on the **Destinations** tab in the navigation bar found on the left side of the dashboard. This will take you to the list of available destinations. -![Destination List](../.gitbook/assets/add-a-destination/getting-started-destination-list.png) +![Destination List](../../.gitbook/assets/add-a-destination/getting-started-destination-list.png) You can use the provided search bar at the top of the page, or scroll down the list to find the destination you want to replicate data from. :::tip -You can filter the list of destinations by support level. 
Airbyte connectors are categorized in two support levels, Certified and Community. See our [Product Support Levels](https://docs.airbyte.com/project-overview/product-support-levels) page for more information on this topic.
+You can filter the list of destinations by support level. Airbyte connectors are categorized into two support levels, Certified and Community. See our [Connector Support Levels](/integrations/connector-support-levels) page for more information on this topic.
 :::
 
 As an example, we'll be setting up a simple JSON file that will be saved on our local system as the destination. Select **Local JSON** from the list of destinations. This will take you to the destination setup page.
 
-![Destination Page](../.gitbook/assets/add-a-destination/getting-started-destination-page.png)
+![Destination Page](../../.gitbook/assets/add-a-destination/getting-started-destination-page.png)
 
 The left half of the page contains a set of fields that you will have to fill out. In the **Destination name** field, you can enter a name of your choosing to help you identify this instance of the connector. By default, this will be set to the name of the destination (i.e., `Local JSON`).
 
@@ -26,4 +26,4 @@ Each destination will have its own set of required fields to configure during se
 
 Some destinations will also have an **Optional Fields** tab located beneath the required fields. You can open this tab to view and configure any additional optional parameters that exist for the source. These fields generally grant you more fine-grained control over your data replication, but you can safely ignore them.
 :::
 
-Once you've filled out the required fields, select **Set up destination**. A connection check will run to verify that a successful connection can be established. Now you're ready to [set up your first connection](https://docs.airbyte.com/quickstart/set-up-a-connection)!
+Once you've filled out the required fields, select **Set up destination**. A connection check will run to verify that a successful connection can be established. Now you're ready to [set up your first connection](./set-up-a-connection)!
diff --git a/docs/quickstart/add-a-source.md b/docs/using-airbyte/getting-started/add-a-source.md
similarity index 86%
rename from docs/quickstart/add-a-source.md
rename to docs/using-airbyte/getting-started/add-a-source.md
index 633d9a1d8b77..e5f59b2f7517 100644
--- a/docs/quickstart/add-a-source.md
+++ b/docs/using-airbyte/getting-started/add-a-source.md
@@ -2,11 +2,11 @@
 
 Setting up a new source in Airbyte is a quick and simple process! When viewing the Airbyte UI, you'll see the main navigation bar on the left side of your screen. Click the **Sources** tab to bring up a list of all available sources.
 
-![](../.gitbook/assets/add-a-source/getting-started-source-list.png)
+![](../../.gitbook/assets/add-a-source/getting-started-source-list.png)
 
 You can use the provided search bar, or simply scroll down the list to find the source you want to replicate data from. Let's use Google Sheets as an example. Clicking on the **Google Sheets** card will bring us to its setup page.
 
-![](../.gitbook/assets/add-a-source/getting-started-source-page.png)
+![](../../.gitbook/assets/add-a-source/getting-started-source-page.png)
 
 The left half of the page contains a set of fields that you will have to fill out. In the **Source name** field, you can enter a name of your choosing to help you identify this instance of the connector. By default, this will be set to the name of the source (ie, `Google Sheets`). 
@@ -18,5 +18,5 @@ Some sources will also have an **Optional Fields** tab. You can open this tab to Once you've filled out all the required fields, click on the **Set up source** button and Airbyte will run a check to verify the connection. Happy replicating! -Can't find the connectors that you want? Try your hand at easily building one yourself using our [Connector Builder!](../connector-development/connector-builder-ui/overview.md) +Can't find the connectors that you want? Try your hand at easily building one yourself using our [Connector Builder!](../../connector-development/connector-builder-ui/overview.md) diff --git a/docs/using-airbyte/getting-started/readme.md b/docs/using-airbyte/getting-started/readme.md new file mode 100644 index 000000000000..ce0874c0e713 --- /dev/null +++ b/docs/using-airbyte/getting-started/readme.md @@ -0,0 +1,5 @@ +# Getting Started + +// TODO: Placeholder text + +should be about cloud and deployment options \ No newline at end of file diff --git a/docs/quickstart/set-up-a-connection.md b/docs/using-airbyte/getting-started/set-up-a-connection.md similarity index 78% rename from docs/quickstart/set-up-a-connection.md rename to docs/using-airbyte/getting-started/set-up-a-connection.md index c9144ec08c43..49d65d1c54d0 100644 --- a/docs/quickstart/set-up-a-connection.md +++ b/docs/using-airbyte/getting-started/set-up-a-connection.md @@ -1,6 +1,6 @@ # Set up a Connection -Now that you've learned how to [deploy Airbyte locally](https://docs.airbyte.com/quickstart/deploy-airbyte) and set up your first [source](https://docs.airbyte.com/quickstart/add-a-source) and [destination](https://docs.airbyte.com/quickstart/add-a-destination), it's time to finish the job by creating your very first connection! +Now that you've learned how to [deploy Airbyte locally](./deploy-airbyte) and set up your first [source](./add-a-source) and [destination](./add-a-destination), it's time to finish the job by creating your very first connection! On the left side of your main Airbyte dashboard, select **Connections**. You will be prompted to choose which source and destination to use for this connection. As an example, we'll use the **Google Sheets** source and **Local JSON** destination. @@ -8,13 +8,13 @@ On the left side of your main Airbyte dashboard, select **Connections**. You wil Once you've chosen your source and destination, you'll be able to configure the connection. You can refer to [this page](https://docs.airbyte.com/cloud/managing-airbyte-cloud/configuring-connections) for more information on each available configuration. For this demo, we'll simply set the **Replication frequency** to a 24 hour interval and leave the other fields at their default values. -![Connection config](../.gitbook/assets/set-up-a-connection/getting-started-connection-config.png) +![Connection config](../../.gitbook/assets/set-up-a-connection/getting-started-connection-config.png) Next, you can toggle which streams you want to replicate, as well as setting up the desired sync mode for each stream. For more information on the nature of each sync mode supported by Airbyte, see [this page](https://docs.airbyte.com/understanding-airbyte/connections/#sync-modes). Our test data consists of a single stream cleverly named `Test Data`, which we've enabled and set to `Full Refresh - Overwrite` sync mode. 
-![Stream config](../.gitbook/assets/set-up-a-connection/getting-started-connection-streams.png) +![Stream config](../../.gitbook/assets/set-up-a-connection/getting-started-connection-streams.png) Click **Set up connection** to complete your first connection. Your first sync is about to begin! @@ -22,7 +22,7 @@ Click **Set up connection** to complete your first connection. Your first sync i Once you've finished setting up the connection, you will be automatically redirected to a dashboard containing all the tools you need to keep track of your connection. -![Connection dashboard](../.gitbook/assets/set-up-a-connection/getting-started-connection-success.png) +![Connection dashboard](../../.gitbook/assets/set-up-a-connection/getting-started-connection-success.png) Here's a basic overview of the tabs and their use: @@ -42,12 +42,12 @@ cat /tmp/airbyte_local/YOUR_PATH/_airbyte_raw_YOUR_STREAM_NAME.jsonl You should see a list of JSON objects, each containing a unique `airbyte_ab_id`, an `emitted_at` timestamp, and `airbyte_data` containing the extracted record. :::tip -If you are using Airbyte on Windows with WSL2 and Docker, refer to [this guide](https://docs.airbyte.com/operator-guides/locating-files-local-destination) to locate the replicated folder and file. +If you are using Airbyte on Windows with WSL2 and Docker, refer to [this guide](/operator-guides/locating-files-local-destination) to locate the replicated folder and file. ::: ## What's next? -Congratulations on successfully setting up your first connection using Airbyte Open Source! We hope that this will be just the first step on your journey with us. We support a large, ever-growing [catalog of sources and destinations](https://docs.airbyte.com/integrations/), and you can even [contribute your own](https://docs.airbyte.com/connector-development/). +Congratulations on successfully setting up your first connection using Airbyte Open Source! We hope that this will be just the first step on your journey with us. We support a large, ever-growing [catalog of sources and destinations](/integrations/), and you can even [contribute your own](/connector-development/). If you have any questions at all, please reach out to us on [Slack](https://slack.airbyte.io/). If you would like to see a missing feature or connector added, please create an issue on our [Github](https://github.com/airbytehq/airbyte). Our community's participation is invaluable in helping us grow and improve every day, and we always welcome your feedback. 
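If you'd rather inspect the replicated file programmatically than with `cat`, a minimal Python sketch follows. The path placeholders are the same ones used in the guide above, and the exact column names in each record depend on your Airbyte version:

```python
import json

# Hypothetical placeholders: substitute YOUR_PATH and YOUR_STREAM_NAME from
# the Local JSON destination settings, as in the `cat` command above.
path = "/tmp/airbyte_local/YOUR_PATH/_airbyte_raw_YOUR_STREAM_NAME.jsonl"

with open(path) as f:
    for line in f:
        record = json.loads(line)  # one raw record per line
        # Each record carries an id, an emission timestamp, and the payload.
        print(sorted(record.keys()), record)
```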
diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index f8d9eec8008e..5cb0def1ddc7 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -60,4 +60,16 @@ - /project-overview/security - /operator-guides/securing-airbyte - /operator-guides/security - to: /managing-airbyte/security \ No newline at end of file + to: /operating-airbyte/security +- from: /cloud/getting-started-with-airbyte-cloud + to: /using-airbyte/getting-started +- from: + - /quickstart/deploy-airbyte + - /category/getting-started + to: /using-airbyte/getting-started/ +- from: /quickstart/add-a-source + to: /using-airbyte/getting-started/add-a-source +- from: /quickstart/add-a-destination + to: /using-airbyte/getting-started/add-a-destination +- from: /quickstart/set-up-a-connection + to: /using-airbyte/getting-started/set-up-a-connection diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index a590a1cc9824..1add893f8939 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -320,11 +320,6 @@ const contributeToAirbyte = { }; const airbyteCloud = [ - { - type: "doc", - label: "Getting Started", - id: "cloud/getting-started-with-airbyte-cloud", - }, "cloud/core-concepts", { type: "category", @@ -514,13 +509,25 @@ module.exports = { // airbyteSelfManaged, // -- end legacy sectionHeader("Using Airbyte"), - ossGettingStarted, + { + type: "category", + label: "Getting Started", + link: { + type: "doc", + id: "using-airbyte/getting-started/readme", + }, + items: [ + "using-airbyte/getting-started/add-a-source", + "using-airbyte/getting-started/add-a-destination", + "using-airbyte/getting-started/set-up-a-connection", + ], + }, ...airbyteCloud, "troubleshooting", - sectionHeader("Managing Airbyte"), + sectionHeader("Operating Airbyte"), deployAirbyte, managingAirbyte, - "managing-airbyte/security", + "operating-airbyte/security", { type: "category", label: "Airbyte Enterprise", From 22d070b4f8fa8056d8a6b3110c2a5236fb3cb6e7 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 17:16:07 +0100 Subject: [PATCH 03/52] Core concept section --- .../core-concepts}/basic-normalization.md | 32 ++++++------- .../core-concepts}/namespaces.md | 0 .../core-concepts/readme.md} | 0 .../core-concepts/sync-modes}/README.md | 22 ++++----- .../sync-modes}/full-refresh-append.md | 2 +- .../sync-modes}/full-refresh-overwrite.md | 2 +- .../sync-modes}/incremental-append-deduped.md | 14 +++--- .../sync-modes}/incremental-append.md | 18 +++---- .../core-concepts}/typing-deduping.md | 6 +-- docusaurus/redirects.yml | 18 +++++++ docusaurus/sidebars.js | 48 +++++++++++-------- 11 files changed, 93 insertions(+), 69 deletions(-) rename docs/{understanding-airbyte => using-airbyte/core-concepts}/basic-normalization.md (94%) rename docs/{understanding-airbyte => using-airbyte/core-concepts}/namespaces.md (100%) rename docs/{cloud/core-concepts.md => using-airbyte/core-concepts/readme.md} (100%) rename docs/{understanding-airbyte/connections => using-airbyte/core-concepts/sync-modes}/README.md (76%) rename docs/{understanding-airbyte/connections => using-airbyte/core-concepts/sync-modes}/full-refresh-append.md (92%) rename docs/{understanding-airbyte/connections => using-airbyte/core-concepts/sync-modes}/full-refresh-overwrite.md (91%) rename docs/{understanding-airbyte/connections => using-airbyte/core-concepts/sync-modes}/incremental-append-deduped.md (89%) rename docs/{understanding-airbyte/connections => using-airbyte/core-concepts/sync-modes}/incremental-append.md (88%) rename 
docs/{understanding-airbyte => using-airbyte/core-concepts}/typing-deduping.md (91%) diff --git a/docs/understanding-airbyte/basic-normalization.md b/docs/using-airbyte/core-concepts/basic-normalization.md similarity index 94% rename from docs/understanding-airbyte/basic-normalization.md rename to docs/using-airbyte/core-concepts/basic-normalization.md index e51f4eb1a1ac..3d94e7847f90 100644 --- a/docs/understanding-airbyte/basic-normalization.md +++ b/docs/using-airbyte/core-concepts/basic-normalization.md @@ -78,7 +78,7 @@ Additional metadata columns can be added on some tables depending on the usage: - On de-duplicated (and SCD) tables: - `_airbyte_unique_key`: hash of primary keys used to de-duplicate the final table. -The [normalization rules](basic-normalization.md#Rules) are _not_ configurable. They are designed to pick a reasonable set of defaults to hit the 80/20 rule of data normalization. We respect that normalization is a detail-oriented problem and that with a fixed set of rules, we cannot normalize your data in such a way that covers all use cases. If this feature does not meet your normalization needs, we always put the full json blob in destination as well, so that you can parse that object however best meets your use case. We will be adding more advanced normalization functionality shortly. Airbyte is focused on the EL of ELT. If you need a really featureful tool for the transformations then, we suggest trying out dbt. +The [normalization rules](#Rules) are _not_ configurable. They are designed to pick a reasonable set of defaults to hit the 80/20 rule of data normalization. We respect that normalization is a detail-oriented problem and that with a fixed set of rules, we cannot normalize your data in such a way that covers all use cases. If this feature does not meet your normalization needs, we always put the full json blob in destination as well, so that you can parse that object however best meets your use case. We will be adding more advanced normalization functionality shortly. Airbyte is focused on the EL of ELT. If you need a really featureful tool for the transformations then, we suggest trying out dbt. Airbyte places the json blob version of your data in a table called `_airbyte_raw_`. If basic normalization is turned on, it will place a separate copy of the data in a table called ``. Under the hood, Airbyte is using dbt, which means that the data only ingresses into the data store one time. The normalization happens as a query within the datastore. This implementation avoids extra network time and costs. @@ -94,7 +94,7 @@ Airbyte runs this step before handing the final data over to other tools that wi To summarize, we can represent the ELT process in the diagram below. 
These are steps that happens between your "Source Database or API" and the final "Replicated Tables" with examples of implementation underneath:

-![](../.gitbook/assets/connecting-EL-with-T-4.png)
+![](../../.gitbook/assets/connecting-EL-with-T-4.png)

In Airbyte, the current normalization option is implemented using a dbt Transformer composed of:

@@ -103,14 +103,14 @@ In Airbyte, the current normalization option is implemented using a dbt Transfor
 ## Destinations that Support Basic Normalization

-- [BigQuery](../integrations/destinations/bigquery.md)
-- [MS Server SQL](../integrations/destinations/mssql.md)
-- [MySQL](../integrations/destinations/mysql.md)
+- [BigQuery](../../integrations/destinations/bigquery.md)
+- [MS Server SQL](../../integrations/destinations/mssql.md)
+- [MySQL](../../integrations/destinations/mysql.md)
  - The server must support the `WITH` keyword.
  - Require MySQL >= 8.0, or MariaDB >= 10.2.1.
-- [Postgres](../integrations/destinations/postgres.md)
-- [Redshift](../integrations/destinations/redshift.md)
-- [Snowflake](../integrations/destinations/snowflake.md)
+- [Postgres](../../integrations/destinations/postgres.md)
+- [Redshift](../../integrations/destinations/redshift.md)
+- [Snowflake](../../integrations/destinations/snowflake.md)

Basic Normalization can be configured when you're creating the connection between your Connection Setup and after in the Transformation Tab. Select the option: **Normalized tabular data**.

@@ -131,8 +131,8 @@ Airbyte uses the types described in the catalog to determine the correct type fo
| `bit` | boolean | |
| `boolean` | boolean | |
| `string` with format label `date-time` | timestamp with timezone | |
-| `array` | new table | see [nesting](basic-normalization.md#Nesting) |
-| `object` | new table | see [nesting](basic-normalization.md#Nesting) |
+| `array` | new table | see [nesting](#Nesting) |
+| `object` | new table | see [nesting](#Nesting) |

### Nesting

@@ -326,11 +326,11 @@ As mentioned in the overview:

To enable basic normalization \(which is optional\), you can toggle it on or disable it in the "Normalization and Transformation" section when setting up your connection:

-![](../.gitbook/assets/basic-normalization-configuration.png)
+![](../../.gitbook/assets/basic-normalization-configuration.png)

## Incremental runs

-When the source is configured with sync modes compatible with incremental transformations (using append on destination) such as ( [full_refresh_append](connections/full-refresh-append.md), [incremental append](connections/incremental-append.md) or [incremental deduped history](connections/incremental-append-deduped.md)), only rows that have changed in the source are transferred over the network and written by the destination connector.
+When the source is configured with sync modes compatible with incremental transformations (using append on destination), such as [full_refresh_append](./sync-modes/full-refresh-append.md), [incremental append](./sync-modes/incremental-append.md), or [incremental deduped history](./sync-modes/incremental-append-deduped.md), only rows that have changed in the source are transferred over the network and written by the destination connector.

Normalization will then try to build the normalized tables incrementally as the rows in the raw tables that have been created or updated since the last time dbt ran. As such, on each dbt run, the models get built incrementally. This limits the amount of data that needs to be transformed, vastly reducing the runtime of the transformations.
This improves warehouse performance and reduces compute costs. Because normalization can be either run incrementally and, or, in full refresh, a technical column `_airbyte_normalized_at` can serve to track when was the last time a record has been transformed and written by normalization. This may greatly diverge from the `_airbyte_emitted_at` value as the normalized tables could be totally re-built at a latter time from the data stored in the `_airbyte_raw` tables.

@@ -342,15 +342,15 @@ Normalization produces tables that are partitioned, clustered, sorted or indexed

In general, normalization needs to do lookup on the last emitted_at column to know if a record is freshly produced and need to be incrementally processed or not. But in certain models, such as SCD tables for example, we also need to retrieve older data to update their type 2 SCD end_date and active_row flags, thus a different partitioning scheme is used to optimize that use case.

-On Postgres destination, an additional table suffixed with `_stg` for every stream replicated in [incremental deduped history](connections/incremental-append-deduped.md) needs to be persisted (in a different staging schema) for incremental transformations to work because of a [limitation](https://github.com/dbt-labs/docs.getdbt.com/issues/335#issuecomment-694199569).
+On the Postgres destination, an additional table suffixed with `_stg` for every stream replicated in [incremental deduped history](./sync-modes/incremental-append-deduped.md) needs to be persisted (in a different staging schema) for incremental transformations to work because of a [limitation](https://github.com/dbt-labs/docs.getdbt.com/issues/335#issuecomment-694199569).

## Extending Basic Normalization

Note that all the choices made by Normalization as described in this documentation page in terms of naming (and more) could be overridden by your own custom choices. To do so, you can follow the following tutorials:

-- to build a [custom SQL view](../operator-guides/transformation-and-normalization/transformations-with-sql.md) with your own naming conventions
-- to export, edit and run [custom dbt normalization](../operator-guides/transformation-and-normalization/transformations-with-dbt.md) yourself
-- or further, you can configure the use of a custom dbt project within Airbyte by following [this guide](../operator-guides/transformation-and-normalization/transformations-with-airbyte.md).
+- to build a [custom SQL view](../../operator-guides/transformation-and-normalization/transformations-with-sql.md) with your own naming conventions
+- to export, edit and run [custom dbt normalization](../../operator-guides/transformation-and-normalization/transformations-with-dbt.md) yourself
+- or further, you can configure the use of a custom dbt project within Airbyte by following [this guide](../../operator-guides/transformation-and-normalization/transformations-with-airbyte.md).
## CHANGELOG diff --git a/docs/understanding-airbyte/namespaces.md b/docs/using-airbyte/core-concepts/namespaces.md similarity index 100% rename from docs/understanding-airbyte/namespaces.md rename to docs/using-airbyte/core-concepts/namespaces.md diff --git a/docs/cloud/core-concepts.md b/docs/using-airbyte/core-concepts/readme.md similarity index 100% rename from docs/cloud/core-concepts.md rename to docs/using-airbyte/core-concepts/readme.md diff --git a/docs/understanding-airbyte/connections/README.md b/docs/using-airbyte/core-concepts/sync-modes/README.md similarity index 76% rename from docs/understanding-airbyte/connections/README.md rename to docs/using-airbyte/core-concepts/sync-modes/README.md index 5e6c449152b7..671f79024cad 100644 --- a/docs/understanding-airbyte/connections/README.md +++ b/docs/using-airbyte/core-concepts/sync-modes/README.md @@ -4,13 +4,13 @@ A connection is a configuration for syncing data between a source and a destinat - Sync schedule: when to trigger a sync of the data. - Destination [Namespace](../namespaces.md) and stream names: where the data will end up being written. -- A catalog selection: which [streams and fields](../airbyte-protocol.md#catalog) to replicate from the source +- A catalog selection: which [streams and fields](../../../understanding-airbyte/airbyte-protocol.md#catalog) to replicate from the source - Sync mode: how streams should be replicated \(read and write\): - Optional transformations: how to convert Airbyte protocol messages \(raw JSON blob\) data into some other data representations. ## Sync schedules -Sync schedules are explained below. For information about catalog selections, see [AirbyteCatalog & ConfiguredAirbyteCatalog](../airbyte-protocol.md#catalog). +Sync schedules are explained below. For information about catalog selections, see [AirbyteCatalog & ConfiguredAirbyteCatalog](../../../understanding-airbyte/airbyte-protocol.md#catalog). Syncs will be triggered by either: @@ -38,7 +38,7 @@ Stream names refer to table names in a typical RDBMS. But it can also be the nam ## Stream-specific customization -All the customization of namespace and stream names described above will be equally applied to all streams selected for replication in a catalog per connection. If you need more granular customization, stream by stream, for example, or with different logic rules, then you could follow the tutorial on [customizing transformations with dbt](../../operator-guides/transformation-and-normalization/transformations-with-dbt.md). +All the customization of namespace and stream names described above will be equally applied to all streams selected for replication in a catalog per connection. If you need more granular customization, stream by stream, for example, or with different logic rules, then you could follow the tutorial on [customizing transformations with dbt](../../../operator-guides/transformation-and-normalization/transformations-with-dbt.md). ## Sync modes @@ -47,7 +47,7 @@ A sync mode governs how Airbyte reads from a source and writes to a destination. 1. The first part of the name denotes how the source connector reads data from the source: 1. Incremental: Read records added to the source since the last sync job. \(The first sync using Incremental is equivalent to a Full Refresh\) - Method 1: Using a cursor. Generally supported by all connectors whose data source allows extracting records incrementally. - - Method 2: Using change data capture. Only supported by some sources. See [CDC](../cdc.md) for more info. 
+   - Method 2: Using change data capture. Only supported by some sources. See [CDC](../../../understanding-airbyte/cdc.md) for more info.
  2. Full Refresh: Read everything in the source.
2. The second part of the sync mode name denotes how the destination connector writes data. This is not affected by how the source connector produced the data:
  1. Overwrite: Overwrite by first deleting existing data in the destination.
@@ -56,23 +56,19 @@ A sync mode governs how Airbyte reads from a source and writes to a destination.

A sync mode is therefore, a combination of a source and destination mode together. The UI exposes the following options, whenever both source and destination connectors are capable to support it for the corresponding stream:

-- [Full Refresh Overwrite](full-refresh-overwrite.md): Sync the whole stream and replace data in destination by overwriting it.
-- [Full Refresh Append](full-refresh-append.md): Sync the whole stream and append data in destination.
-- [Incremental Append](incremental-append.md): Sync new records from stream and append data in destination.
-- [Incremental Append + Deduped](incremental-append-deduped.md): Sync new records from stream and append data in destination, also provides a de-duplicated view mirroring the state of the stream in the source.
+- [Full Refresh Overwrite](./full-refresh-overwrite.md): Sync the whole stream and replace data in destination by overwriting it.
+- [Full Refresh Append](./full-refresh-append.md): Sync the whole stream and append data in destination.
+- [Incremental Append](./incremental-append.md): Sync new records from stream and append data in destination.
+- [Incremental Append + Deduped](./incremental-append-deduped.md): Sync new records from stream and append data in destination; this mode also provides a de-duplicated view mirroring the state of the stream in the source.

## Optional operations

### Typing and Deduping

-As described by the [Airbyte Protocol from the Airbyte Specifications](../airbyte-protocol.md), replication is composed of source connectors that are transmitting data in a JSON format. It is then written as such by the destination connectors. On top of this replication, Airbyte's database and datawarehous destinations can provide converstions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping).
+As described by the [Airbyte Protocol from the Airbyte Specifications](../../../understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that transmit data in JSON format. It is then written as such by the destination connectors. On top of this replication, Airbyte's database and data warehouse destinations can provide conversions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping).

:::note
Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage.
:::
-
-### Custom sync operations
-
-Further operations can be included in a sync on top of Airbyte basic normalization \(or even to replace it completely\). See [operations](../operations.md) for more details.
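To make the cursor logic of Method 1 above concrete, here is a minimal Python sketch. Note that `fetch_since` and the `updated_at` cursor field are hypothetical stand-ins, not Airbyte APIs:

```python
def read_incremental(fetch_since, state):
    """Yield records at or past the saved cursor, then advance the cursor."""
    max_seen = state.get("cursor")
    # `fetch_since` stands in for whatever source-specific call returns
    # records newer than a given cursor value.
    for record in fetch_since(max_seen):
        yield record
        if max_seen is None or record["updated_at"] > max_seen:
            max_seen = record["updated_at"]
    state["cursor"] = max_seen  # persisted between syncs; drives the next read
```

The persisted cursor is what the sync modes above build on: Append writes every yielded record as-is, while Append + Deduped additionally collapses duplicates by primary key.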
diff --git a/docs/understanding-airbyte/connections/full-refresh-append.md b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md similarity index 92% rename from docs/understanding-airbyte/connections/full-refresh-append.md rename to docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md index b7343fc1c07b..ccdd7951bbe5 100644 --- a/docs/understanding-airbyte/connections/full-refresh-append.md +++ b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-append.md @@ -2,7 +2,7 @@ ## Overview -The **Full Refresh** modes are the simplest methods that Airbyte uses to sync data, as they always retrieve all available data requested from the source, regardless of whether it has been synced before. This contrasts with [**Incremental sync**](incremental-append.md), which does not sync data that has already been synced before. +The **Full Refresh** modes are the simplest methods that Airbyte uses to sync data, as they always retrieve all available data requested from the source, regardless of whether it has been synced before. This contrasts with [**Incremental sync**](./incremental-append.md), which does not sync data that has already been synced before. In the **Append** variant, new syncs will take all data from the sync and append it to the destination table. Therefore, if syncing similar information multiple times, every sync will create duplicates of already existing data. diff --git a/docs/understanding-airbyte/connections/full-refresh-overwrite.md b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md similarity index 91% rename from docs/understanding-airbyte/connections/full-refresh-overwrite.md rename to docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md index 44d4ff5f6699..6de7d266c9ce 100644 --- a/docs/understanding-airbyte/connections/full-refresh-overwrite.md +++ b/docs/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md @@ -2,7 +2,7 @@ ## Overview -The **Full Refresh** modes are the simplest methods that Airbyte uses to sync data, as they always retrieve all available information requested from the source, regardless of whether it has been synced before. This contrasts with [**Incremental sync**](incremental-append.md), which does not sync data that has already been synced before. +The **Full Refresh** modes are the simplest methods that Airbyte uses to sync data, as they always retrieve all available information requested from the source, regardless of whether it has been synced before. This contrasts with [**Incremental sync**](./incremental-append.md), which does not sync data that has already been synced before. In the **Overwrite** variant, new syncs will destroy all data in the existing destination table and then pull the new data in. Therefore, data that has been removed from the source after an old sync will be deleted in the destination table. 
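A toy illustration of the two Full Refresh variants described above, with plain Python lists standing in for destination tables:

```python
destination = [{"id": 1, "v": "old"}, {"id": 2, "v": "old"}]
source = [{"id": 1, "v": "new"}, {"id": 3, "v": "new"}]

# Overwrite: existing destination data is deleted first, then replaced,
# so records removed from the source also disappear from the destination.
overwritten = list(source)

# Append: every sync adds the full source snapshot, so re-synced records
# accumulate as duplicates of already existing data.
appended = destination + list(source)
```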
diff --git a/docs/understanding-airbyte/connections/incremental-append-deduped.md b/docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md
similarity index 89%
rename from docs/understanding-airbyte/connections/incremental-append-deduped.md
rename to docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md
index 86e8ee92ee75..b077c0508c79 100644
--- a/docs/understanding-airbyte/connections/incremental-append-deduped.md
+++ b/docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md
@@ -69,19 +69,19 @@ In the final de-duplicated table:

## Source-Defined Cursor

-Some sources are able to determine the cursor that they use without any user input. For example, in the [exchange rates source](../../integrations/sources/exchange-rates.md), the source knows that the date field should be used to determine the last record that was synced. In these cases, simply select the incremental option in the UI.
+Some sources are able to determine the cursor that they use without any user input. For example, in the [exchange rates source](../../../integrations/sources/exchange-rates.md), the source knows that the date field should be used to determine the last record that was synced. In these cases, simply select the incremental option in the UI.

-![](../../.gitbook/assets/incremental_source_defined.png)
+![](../../../.gitbook/assets/incremental_source_defined.png)

-\(You can find a more technical details about the configuration data model [here](../airbyte-protocol.md#catalog)\).
+\(You can find more technical details about the configuration data model [here](../../../understanding-airbyte/airbyte-protocol.md#catalog)\).

## User-Defined Cursor

-Some sources cannot define the cursor without user input. For example, in the [postgres source](../../integrations/sources/postgres.md), the user needs to choose which column in a database table they want to use as the `cursor field`. In these cases, select the column in the sync settings dropdown that should be used as the `cursor field`.
+Some sources cannot define the cursor without user input. For example, in the [postgres source](../../../integrations/sources/postgres.md), the user needs to choose which column in a database table they want to use as the `cursor field`. In these cases, select the column in the sync settings dropdown that should be used as the `cursor field`.

-![](../../.gitbook/assets/incremental_user_defined.png)
+![](../../../.gitbook/assets/incremental_user_defined.png)

-\(You can find a more technical details about the configuration data model [here](../airbyte-protocol.md#catalog)\).
+\(You can find more technical details about the configuration data model [here](../../../understanding-airbyte/airbyte-protocol.md#catalog)\).

## Source-Defined Primary key

@@ -91,7 +91,7 @@ Some sources are able to determine the primary key that they use without any use

Some sources cannot define the cursor without user input or the user may want to specify their own primary key on the destination that is different from the source definitions. In these cases, select the column in the sync settings dropdown that should be used as the `primary key` or `composite primary keys`.

-![](../../.gitbook/assets/primary_key_user_defined.png)
+![](../../../.gitbook/assets/primary_key_user_defined.png)

In this example, we selected both the `campaigns.id` and `campaigns.name` as the composite primary key of our `campaigns` table.
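The de-duplication step this mode adds can be sketched as follows, reusing the `campaigns.id` and `campaigns.name` composite key from the example above; `updated_at` stands in for a cursor field:

```python
def dedupe(rows):
    """Keep the latest version of each row, keyed by composite primary key."""
    latest = {}
    for row in rows:  # rows appended across incremental syncs
        key = (row["id"], row["name"])  # composite primary key, as above
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = row
    return list(latest.values())
```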
diff --git a/docs/understanding-airbyte/connections/incremental-append.md b/docs/using-airbyte/core-concepts/sync-modes/incremental-append.md
similarity index 88%
rename from docs/understanding-airbyte/connections/incremental-append.md
rename to docs/using-airbyte/core-concepts/sync-modes/incremental-append.md
index c380d2226912..c9facb4711f3 100644
--- a/docs/understanding-airbyte/connections/incremental-append.md
+++ b/docs/using-airbyte/core-concepts/sync-modes/incremental-append.md
@@ -2,7 +2,7 @@

## Overview

-Airbyte supports syncing data in **Incremental Append** mode i.e: syncing only replicate _new_ or _modified_ data. This prevents re-fetching data that you have already replicated from a source. If the sync is running for the first time, it is equivalent to a [Full Refresh](full-refresh-append.md) since all data will be considered as _new_.
+Airbyte supports syncing data in **Incremental Append** mode, i.e., replicating only _new_ or _modified_ data. This prevents re-fetching data that you have already replicated from a source. If the sync is running for the first time, it is equivalent to a [Full Refresh](./full-refresh-append.md) since all data will be considered as _new_.

In this flavor of incremental, records in the warehouse destination will never be deleted or mutated. A copy of each new or updated record is _appended_ to the data in the warehouse. This means you can find multiple copies of the same record in the destination warehouse. We provide an "at least once" guarantee of replicating each record that is present when the sync runs.

@@ -62,25 +62,25 @@ The output we expect to see in the warehouse is as follows:

## Source-Defined Cursor

-Some sources are able to determine the cursor that they use without any user input. For example, in the [exchange rates source](../../integrations/sources/exchange-rates.md), the source knows that the date field should be used to determine the last record that was synced. In these cases, simply select the incremental option in the UI.
+Some sources are able to determine the cursor that they use without any user input. For example, in the [exchange rates source](../../../integrations/sources/exchange-rates.md), the source knows that the date field should be used to determine the last record that was synced. In these cases, simply select the incremental option in the UI.

-![](../../.gitbook/assets/incremental_source_defined.png)
+![](../../../.gitbook/assets/incremental_source_defined.png)

-\(You can find a more technical details about the configuration data model [here](../airbyte-protocol.md#catalog)\).
+\(You can find more technical details about the configuration data model [here](../../../understanding-airbyte/airbyte-protocol.md#catalog)\).

## User-Defined Cursor

-Some sources cannot define the cursor without user input. For example, in the [postgres source](../../integrations/sources/postgres.md), the user needs to choose which column in a database table they want to use as the `cursor field`. In these cases, select the column in the sync settings dropdown that should be used as the `cursor field`.
+Some sources cannot define the cursor without user input. For example, in the [postgres source](../../../integrations/sources/postgres.md), the user needs to choose which column in a database table they want to use as the `cursor field`. In these cases, select the column in the sync settings dropdown that should be used as the `cursor field`.
-![](../../.gitbook/assets/incremental_user_defined.png)
+![](../../../.gitbook/assets/incremental_user_defined.png)

-\(You can find a more technical details about the configuration data model [here](../airbyte-protocol.md#catalog)\).
+\(You can find more technical details about the configuration data model [here](../../../understanding-airbyte/airbyte-protocol.md#catalog)\).

## Getting the Latest Snapshot of data

As demonstrated in the examples above, with **Incremental Append,** a record which was updated in the source will be appended to the destination rather than updated in-place. This means that if data in the source uses a primary key \(e.g: `user_id` in the `users` table\), then the destination will end up having multiple records with the same primary key value.

-However, some use cases require only the latest snapshot of the data. This is available by using other flavors of sync modes such as [Incremental - Append + Deduped](incremental-append-deduped.md) instead.
+However, some use cases require only the latest snapshot of the data. This is available by using other flavors of sync modes such as [Incremental - Append + Deduped](./incremental-append-deduped.md) instead.

Note that in **Incremental Append**, the size of the data in your warehouse increases monotonically since an updated record in the source is appended to the destination rather than updated in-place.

@@ -122,7 +122,7 @@ At the end of the second incremental sync, the data warehouse would still contai

Similarly, if multiple modifications are made during the same day to the same records. If the frequency of the sync is not granular enough \(for example, set for every 24h\), then intermediate modifications to the data are not going to be detected and emitted. Only the state of data at the time the sync runs will be reflected in the destination.

-Those concerns could be solved by using a different incremental approach based on binary logs, Write-Ahead-Logs \(WAL\), or also called [Change Data Capture \(CDC\)](../cdc.md).
+Those concerns could be solved by using a different incremental approach based on binary logs or Write-Ahead Logs \(WAL\), also called [Change Data Capture \(CDC\)](../../../understanding-airbyte/cdc.md).

The current behavior of **Incremental** is not able to handle source schema changes yet, for example, when a column is added, renamed or deleted from an existing table etc. It is recommended to trigger a [Full refresh - Overwrite](full-refresh-overwrite.md) to correctly replicate the data to the destination with the new schema changes.
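The "at least once" guarantee mentioned above comes down to how the cursor checkpoint is compared. A minimal sketch with hypothetical field names:

```python
def new_or_modified(records, checkpoint, cursor_field="updated_at"):
    # ">=" rather than ">" so that records sharing the checkpoint value are
    # never missed; the trade-off is that they may be emitted a second time.
    return [r for r in records if r[cursor_field] >= checkpoint]
```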
diff --git a/docs/understanding-airbyte/typing-deduping.md b/docs/using-airbyte/core-concepts/typing-deduping.md
similarity index 91%
rename from docs/understanding-airbyte/typing-deduping.md
rename to docs/using-airbyte/core-concepts/typing-deduping.md
index f66e6a3c59ba..1ebfd060d2b0 100644
--- a/docs/understanding-airbyte/typing-deduping.md
+++ b/docs/using-airbyte/core-concepts/typing-deduping.md
@@ -34,7 +34,7 @@ Depending on your use-case, it may still be valuable to consider rows with error

## Destinations V2 Example

-Consider the following [source schema](https://docs.airbyte.com/integrations/sources/faker) for stream `users`:
+Consider the following [source schema](/integrations/sources/faker) for stream `users`:

```json
{
@@ -58,7 +58,7 @@ The data from one stream will now be mapped to one table in your schema as below
| Failed typing that didn’t break other rows ⟶ | yyy-yyy-yyy | 2022-01-01 12:00:00 | { errors: {[“fish” is not a valid integer for column “age”]} | 2 | evan | NULL | { city: “Menlo Park”, zip: “94002” } |
| Not-yet-typed ⟶ | | | | | | | |

-In legacy normalization, columns of [Airbyte type](https://docs.airbyte.com/understanding-airbyte/supported-data-types/#the-types) `Object` in the Destination were "unnested" into separate tables. In this example, with Destinations V2, the previously unnested `public.users_address` table with columns `city` and `zip` will no longer be generated.
+In legacy normalization, columns of [Airbyte type](/understanding-airbyte/supported-data-types/#the-types) `Object` in the Destination were "unnested" into separate tables. In this example, with Destinations V2, the previously unnested `public.users_address` table with columns `city` and `zip` will no longer be generated.

#### Destination Table Name: _airbyte.raw_public_users_ (`airbyte.{namespace}_{stream}`)

@@ -70,4 +70,4 @@ In legacy normalization, columns of [Airbyte type](https://docs.airbyte.com/unde

You also now see the following changes in Airbyte-provided columns:

-![Airbyte Destinations V2 Column Changes](../release_notes/assets/updated_table_columns.png)
+![Airbyte Destinations V2 Column Changes](../../release_notes/assets/updated_table_columns.png)
diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml
index 5cb0def1ddc7..c31fc1ea4751 100644
--- a/docusaurus/redirects.yml
+++ b/docusaurus/redirects.yml
@@ -73,3 +73,21 @@
   to: /using-airbyte/getting-started/add-a-destination
- from: /quickstart/set-up-a-connection
  to: /using-airbyte/getting-started/set-up-a-connection
+- from: /cloud/core-concepts
+  to: /using-airbyte/core-concepts
+- from: /understanding-airbyte/namespaces
+  to: /using-airbyte/core-concepts/namespaces
+- from: /understanding-airbyte/connections/
+  to: /using-airbyte/core-concepts/sync-modes/
+- from: /understanding-airbyte/connections/full-refresh-overwrite
+  to: /using-airbyte/core-concepts/sync-modes/full-refresh-overwrite
+- from: /understanding-airbyte/connections/full-refresh-append
+  to: /using-airbyte/core-concepts/sync-modes/full-refresh-append
+- from: /understanding-airbyte/connections/incremental-append
+  to: /using-airbyte/core-concepts/sync-modes/incremental-append
+- from: /understanding-airbyte/connections/incremental-append-deduped
+  to: /using-airbyte/core-concepts/sync-modes/incremental-append-deduped
+- from: /understanding-airbyte/basic-normalization
+  to: /using-airbyte/core-concepts/basic-normalization
+- from: /understanding-airbyte/typing-deduping
+  to: /using-airbyte/core-concepts/typing-deduping
\ No newline at end of file
diff --git a/docusaurus/sidebars.js
b/docusaurus/sidebars.js index 1add893f8939..5bbe0663c890 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -454,29 +454,11 @@ const understandingAirbyte = { "understanding-airbyte/beginners-guide-to-catalog", "understanding-airbyte/airbyte-protocol", "understanding-airbyte/airbyte-protocol-docker", - "understanding-airbyte/basic-normalization", - "understanding-airbyte/typing-deduping", - { - type: "category", - label: "Connections and Sync Modes", - items: [ - { - type: "doc", - label: "Connections Overview", - id: "understanding-airbyte/connections/README", - }, - "understanding-airbyte/connections/full-refresh-overwrite", - "understanding-airbyte/connections/full-refresh-append", - "understanding-airbyte/connections/incremental-append", - "understanding-airbyte/connections/incremental-append-deduped", - ], - }, "understanding-airbyte/operations", "understanding-airbyte/high-level-view", "understanding-airbyte/jobs", "understanding-airbyte/tech-stack", "understanding-airbyte/cdc", - "understanding-airbyte/namespaces", "understanding-airbyte/supported-data-types", "understanding-airbyte/json-avro-conversion", "understanding-airbyte/database-data-catalog", @@ -522,7 +504,35 @@ module.exports = { "using-airbyte/getting-started/set-up-a-connection", ], }, - ...airbyteCloud, + { + type: "category", + label: "Core Concepts", + link: { + type: "doc", + id: "using-airbyte/core-concepts/readme" + }, + items: [ + "using-airbyte/core-concepts/namespaces", + { + type: "category", + label: "Connections and Sync Modes", + items: [ + { + type: "doc", + label: "Connections Overview", + id: "using-airbyte/core-concepts/sync-modes/README", + }, + "using-airbyte/core-concepts/sync-modes/full-refresh-overwrite", + "using-airbyte/core-concepts/sync-modes/full-refresh-append", + "using-airbyte/core-concepts/sync-modes/incremental-append", + "using-airbyte/core-concepts/sync-modes/incremental-append-deduped", + ], + }, + "using-airbyte/core-concepts/basic-normalization", + "using-airbyte/core-concepts/typing-deduping", + ], + }, + // ...airbyteCloud, "troubleshooting", sectionHeader("Operating Airbyte"), deployAirbyte, From cd28617b560eabc75e51861a119b50c4b9f8c0b6 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 17:30:38 +0100 Subject: [PATCH 04/52] Using Airbyte section --- .../understand-airbyte-cloud-limits.md | 2 +- docusaurus/sidebars.js | 67 ++++++++++++------- 2 files changed, 44 insertions(+), 25 deletions(-) diff --git a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md index 9d8a429eab9e..7f9a0e97a1e9 100644 --- a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md +++ b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md @@ -1,4 +1,4 @@ -# Understand Airbyte Cloud limits +# Airbyte Cloud limits Understanding the following limitations will help you more effectively manage Airbyte Cloud. 
diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index 5bbe0663c890..a96ccf3d3a18 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -431,15 +431,6 @@ const managingAirbyte = { "operator-guides/using-kestra-plugin", "operator-guides/locating-files-local-destination", "operator-guides/collecting-metrics", - { - type: "category", - label: "Transformations and Normalization", - items: [ - "operator-guides/transformation-and-normalization/transformations-with-sql", - "operator-guides/transformation-and-normalization/transformations-with-dbt", - "operator-guides/transformation-and-normalization/transformations-with-airbyte", - ], - }, "operator-guides/configuring-airbyte", "operator-guides/using-custom-connectors", "operator-guides/scaling-airbyte", @@ -476,20 +467,6 @@ module.exports = { connectorCatalog, buildAConnector, "integrations/connector-support-levels", - // -- begin legacy - // sectionHeader("Airbyte Cloud"), - // ...airbyteCloud, - // sectionHeader("Airbyte Open Source (OSS)"), - // ossGettingStarted, - // deployAirbyte, - // operatorGuide, - // { - // type: "doc", - // id: "troubleshooting", - // }, - // sectionHeader("Enterprise Setup"), - // airbyteSelfManaged, - // -- end legacy sectionHeader("Using Airbyte"), { type: "category", @@ -532,7 +509,49 @@ module.exports = { "using-airbyte/core-concepts/typing-deduping", ], }, - // ...airbyteCloud, + { + type: "category", + label: "Managing Connections", + items: [ + "cloud/managing-airbyte-cloud/configuring-connections", + "cloud/managing-airbyte-cloud/manage-schema-changes", + "cloud/managing-airbyte-cloud/manage-data-residency", + "cloud/managing-airbyte-cloud/manage-connection-state", + ] + }, + { + type: "category", + label: "Managing Syncs", + items: [ + "operator-guides/reset", + "cloud/managing-airbyte-cloud/review-connection-status", + "cloud/managing-airbyte-cloud/review-sync-history", + "operator-guides/browsing-output-logs", + "operator-guides/locating-files-local-destination", + ], + }, + { + type: "category", + label: "Managing your Account", + items: [ + "cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace", + // TODO: merge with operator-guides/configure-sync-notifications + "cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications", + "cloud/managing-airbyte-cloud/manage-credits", + "operator-guides/using-custom-connectors", + ] + }, + { + type: "category", + label: "Transformations", + items: [ + "cloud/managing-airbyte-cloud/dbt-cloud-integration", + "operator-guides/transformation-and-normalization/transformations-with-sql", + "operator-guides/transformation-and-normalization/transformations-with-dbt", + "operator-guides/transformation-and-normalization/transformations-with-airbyte", + ] + }, + "cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits", "troubleshooting", sectionHeader("Operating Airbyte"), deployAirbyte, From c931087fe8d02737295b3a41dccb3b68d8d13acc Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 17:42:22 +0100 Subject: [PATCH 05/52] Sidebar structure --- docusaurus/sidebars.js | 117 +++++++++++++++-------------------------- 1 file changed, 42 insertions(+), 75 deletions(-) diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index a96ccf3d3a18..f151fc90e158 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -319,44 +319,6 @@ const contributeToAirbyte = { ], }; -const airbyteCloud = [ - "cloud/core-concepts", - { - type: "category", - label: "Using Airbyte Cloud", - link: { - type: "generated-index", - }, - 
items: [ - "cloud/managing-airbyte-cloud/configuring-connections", - "cloud/managing-airbyte-cloud/review-connection-status", - "cloud/managing-airbyte-cloud/review-sync-history", - "cloud/managing-airbyte-cloud/manage-schema-changes", - "cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications", - "cloud/managing-airbyte-cloud/manage-data-residency", - "cloud/managing-airbyte-cloud/dbt-cloud-integration", - "cloud/managing-airbyte-cloud/manage-credits", - "cloud/managing-airbyte-cloud/manage-connection-state", - "cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace", - "cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits", - ], - }, -]; - -const ossGettingStarted = { - type: "category", - label: "Getting Started", - link: { - type: "generated-index", - }, - items: [ - "quickstart/deploy-airbyte", - "quickstart/add-a-source", - "quickstart/add-a-destination", - "quickstart/set-up-a-connection", - ], -}; - const deployAirbyte = { type: "category", label: "Deploy Airbyte", @@ -413,31 +375,6 @@ const deployAirbyte = { ], }; -const managingAirbyte = { - type: "category", - label: "Manage Airbyte", - link: { - type: "generated-index", - }, - items: [ - "operator-guides/upgrading-airbyte", - "operator-guides/reset", - "operator-guides/configuring-airbyte-db", - "operator-guides/configuring-connector-resources", - "operator-guides/browsing-output-logs", - "operator-guides/using-the-airflow-airbyte-operator", - "operator-guides/using-prefect-task", - "operator-guides/using-dagster-integration", - "operator-guides/using-kestra-plugin", - "operator-guides/locating-files-local-destination", - "operator-guides/collecting-metrics", - "operator-guides/configuring-airbyte", - "operator-guides/using-custom-connectors", - "operator-guides/scaling-airbyte", - "operator-guides/configuring-sync-notifications", - ], -}; - const understandingAirbyte = { type: "category", label: "Understand Airbyte", @@ -517,6 +454,16 @@ module.exports = { "cloud/managing-airbyte-cloud/manage-schema-changes", "cloud/managing-airbyte-cloud/manage-data-residency", "cloud/managing-airbyte-cloud/manage-connection-state", + { + type: "category", + label: "Transformations", + items: [ + "cloud/managing-airbyte-cloud/dbt-cloud-integration", + "operator-guides/transformation-and-normalization/transformations-with-sql", + "operator-guides/transformation-and-normalization/transformations-with-dbt", + "operator-guides/transformation-and-normalization/transformations-with-airbyte", + ] + }, ] }, { @@ -541,22 +488,10 @@ module.exports = { "operator-guides/using-custom-connectors", ] }, - { - type: "category", - label: "Transformations", - items: [ - "cloud/managing-airbyte-cloud/dbt-cloud-integration", - "operator-guides/transformation-and-normalization/transformations-with-sql", - "operator-guides/transformation-and-normalization/transformations-with-dbt", - "operator-guides/transformation-and-normalization/transformations-with-airbyte", - ] - }, "cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits", "troubleshooting", sectionHeader("Operating Airbyte"), deployAirbyte, - managingAirbyte, - "operating-airbyte/security", { type: "category", label: "Airbyte Enterprise", @@ -569,6 +504,38 @@ module.exports = { "enterprise-setup/sso", ] }, + "operator-guides/upgrading-airbyte", + { + type: "category", + label: "Configuring Airbyte", + link: { + type: "doc", + id: "operator-guides/configuring-airbyte", + }, + items: [ + "operator-guides/configuring-airbyte-db", + "operator-guides/configuring-connector-resources", + 
]
    },
    {
      type: "category",
      label: "Airbyte at Scale",
      items: [
        "operator-guides/collecting-metrics",
        "operator-guides/scaling-airbyte",
      ]
    },
    "operating-airbyte/security",
    {
      type: "category",
      label: "Integrating with Airbyte",
      items: [
        "operator-guides/using-the-airflow-airbyte-operator",
        "operator-guides/using-prefect-task",
        "operator-guides/using-dagster-integration",
        "operator-guides/using-kestra-plugin",
      ],
    },
    sectionHeader("Developer Guides"),
    {
      type: "doc",
From e9d71b6f5ca49affbfa7f9276ccbfd5e2adf436a Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sat, 25 Nov 2023 17:44:16 +0100
Subject: [PATCH 06/52] Support changed

---
 docs/community/airbyte-support.md | 2 +-
 docs/troubleshooting.md           | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/community/airbyte-support.md b/docs/community/airbyte-support.md
index 69c22aa37cab..03b1ff795560 100644
--- a/docs/community/airbyte-support.md
+++ b/docs/community/airbyte-support.md
@@ -1,4 +1,4 @@
-# Airbyte Support
+# Getting Support

Hold up! Have you looked at [our docs](https://docs.airbyte.com/) yet? We recommend searching the wealth of knowledge in our documentation as many times the answer you are looking for is there!

diff --git a/docs/troubleshooting.md b/docs/troubleshooting.md
index a5db81cdbba2..7ace8c8742e8 100644
--- a/docs/troubleshooting.md
+++ b/docs/troubleshooting.md
@@ -13,5 +13,5 @@ Step 4: Open a Github ticket. If you're still unable to resolve the issue after

Airbyte is an open source project with a vibrant community that fosters collaboration and mutual support. To ensure accessible troubleshooting guidance, Airbyte offers multiple platforms for users to ask and discuss issues, including the Airbyte Github, Airbyte Community Slack (which is over 10,000 users), and the Airbyte Forum. In addition, Airbyte hosts daily office hours that include topic demonstrations and dedicated space for issue discussion in Zoom meetings. In addition to these community resources, Airbyte also offers premium support packages for users who require additional assistance beyond what is provided by the community.

:::info
-You can check all your [support options](./community/airbyte-support).
+If you're not able to solve your issue, have a look at your [support options](./community/airbyte-support).
::: \ No newline at end of file From b536cda5cb6141d59caa307271253f5704f6a693 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:00:09 +0100 Subject: [PATCH 07/52] Fix broken links --- docs/archive/changelog/platform.md | 2 +- docs/archive/faq/getting-started.md | 2 +- docs/archive/faq/transformation-and-schemas.md | 2 +- .../connector-builder-ui/incremental-sync.md | 4 ++-- .../connector-builder-ui/record-processing.mdx | 2 +- .../tutorials/adding-incremental-sync.md | 4 ++-- .../tutorials/build-a-connector-the-hard-way.md | 2 +- .../tutorials/cdk-tutorial-python-http/read-data.md | 2 +- docs/contributing-to-airbyte/README.md | 2 +- docs/contributing-to-airbyte/writing-docs.md | 2 +- docs/integrations/destinations/gcs.md | 2 +- docs/integrations/destinations/s3-glue.md | 2 +- docs/integrations/destinations/s3.md | 2 +- docs/integrations/sources/pokeapi.md | 2 +- .../transformations-with-airbyte.md | 2 +- docs/release_notes/upgrading_to_destinations_v2.md | 2 +- docs/snowflake-native-apps/facebook-marketing.md | 2 +- docs/snowflake-native-apps/linkedin-ads.md | 2 +- docs/understanding-airbyte/airbyte-protocol.md | 2 +- docs/understanding-airbyte/beginners-guide-to-catalog.md | 2 +- .../core-concepts/sync-modes/incremental-append-deduped.md | 2 +- 21 files changed, 23 insertions(+), 23 deletions(-) diff --git a/docs/archive/changelog/platform.md b/docs/archive/changelog/platform.md index 92bc158dce83..f4429eb1b852 100644 --- a/docs/archive/changelog/platform.md +++ b/docs/archive/changelog/platform.md @@ -442,7 +442,7 @@ This interim patch period mostly contained stability changes for Airbyte Cloud, * **Incremental - Append"** * We now allow sources to replicate only new or modified data. This enables to avoid re-fetching data that you have already replicated from a source. * The delta from a sync will be _appended_ to the existing data in the data warehouse. - * Here are [all the details of this feature](../../understanding-airbyte/connections/incremental-append.md). + * Here are [all the details of this feature](/using-airbyte/core-concepts/sync-modes/incremental-append.md). * It has been released for 15 connectors, including Postgres, MySQL, Intercom, Zendesk, Stripe, Twilio, Marketo, Shopify, GitHub, and all the destination connectors. We will expand it to all the connectors in the next couple of weeks. * **Other features:** * Improve interface for writing python sources \(should make writing new python sources easier and clearer\). diff --git a/docs/archive/faq/getting-started.md b/docs/archive/faq/getting-started.md index fd4ce42d47f6..1ab44be311f0 100644 --- a/docs/archive/faq/getting-started.md +++ b/docs/archive/faq/getting-started.md @@ -30,7 +30,7 @@ We don’t. Airbyte is to be self-hosted in your own private cloud. ## Can I set a start time for my integration? -[Here](../../understanding-airbyte/connections#sync-schedules) is the link to the docs on scheduling syncs. +[Here](/using-airbyte/core-concepts/sync-modes#sync-schedules) is the link to the docs on scheduling syncs. 
## **Can I disable analytics in Airbyte?**

diff --git a/docs/archive/faq/transformation-and-schemas.md b/docs/archive/faq/transformation-and-schemas.md
index 554b11b558fd..b759f73b7146 100644
--- a/docs/archive/faq/transformation-and-schemas.md
+++ b/docs/archive/faq/transformation-and-schemas.md
@@ -16,5 +16,5 @@ For now, the schema can only be updated manually in the UI \(by clicking "Update

## **How does Airbyte handle namespaces \(or schemas for the DB-inclined\)?**

-Airbyte respects source-defined namespaces when syncing data with a namespace-supported destination. See [this](../../understanding-airbyte/namespaces.md) for more details.
+Airbyte respects source-defined namespaces when syncing data with a namespace-supported destination. See [this](/using-airbyte/core-concepts/namespaces.md) for more details.

diff --git a/docs/connector-development/connector-builder-ui/incremental-sync.md b/docs/connector-development/connector-builder-ui/incremental-sync.md
index 5801267fea9d..0a4db2bc7a54 100644
--- a/docs/connector-development/connector-builder-ui/incremental-sync.md
+++ b/docs/connector-development/connector-builder-ui/incremental-sync.md
@@ -12,7 +12,7 @@ To use incremental syncs, the API endpoint needs to fullfil the following requir
   - If the record's cursor field is nested, you can use an "Add Field" transformation to copy it to the top-level, and a Remove Field to remove it from the object. This will effectively move the field to the top-level of the record
 - It's possible to filter/request records by the cursor field

-The knowledge of a cursor value also allows the Airbyte system to automatically keep a history of changes to records in the destination. To learn more about how different modes of incremental syncs, check out the [Incremental Sync - Append](/understanding-airbyte/connections/incremental-append/) and [Incremental Sync - Append + Deduped](/understanding-airbyte/connections/incremental-append-deduped) pages.
+The knowledge of a cursor value also allows the Airbyte system to automatically keep a history of changes to records in the destination. To learn more about the different modes of incremental sync, check out the [Incremental Sync - Append](/using-airbyte/core-concepts/sync-modes/incremental-append/) and [Incremental Sync - Append + Deduped](/using-airbyte/core-concepts/sync-modes/incremental-append-deduped) pages.

## Configuration

@@ -132,7 +132,7 @@ Some APIs update records over time but do not allow to filter or search by modif

In these cases, there are two options:

-- **Do not use incremental sync** and always sync the full set of records to always have a consistent state, losing the advantages of reduced load and [automatic history keeping in the destination](/understanding-airbyte/connections/incremental-append-deduped)
+- **Do not use incremental sync** and always sync the full set of records to keep a consistent state, losing the advantages of reduced load and [automatic history keeping in the destination](/using-airbyte/core-concepts/sync-modes/incremental-append-deduped)
- **Configure the "Lookback window"** to not only sync exclusively new records, but resync some portion of records before the cutoff date to catch changes that were made to existing records, trading off data consistency and the amount of synced records. In the case of the API of The Guardian, news articles tend to only be updated for a few days after the initial release date, so this strategy should be able to catch most updates without having to resync all articles.
Reiterating the example from above with a "Lookback window" of 2 days configured, let's assume the last encountered article looked like this: diff --git a/docs/connector-development/connector-builder-ui/record-processing.mdx b/docs/connector-development/connector-builder-ui/record-processing.mdx index d5ac0dbb88de..41a57d2351a9 100644 --- a/docs/connector-development/connector-builder-ui/record-processing.mdx +++ b/docs/connector-development/connector-builder-ui/record-processing.mdx @@ -321,7 +321,7 @@ Besides bringing the records in the right shape, it's important to communicate s ### Primary key -The "Primary key" field specifies how to uniquely identify a record. This is important for downstream de-duplication of records (e.g. by the [incremental sync - Append + Deduped sync mode](/understanding-airbyte/connections/incremental-append-deduped)). +The "Primary key" field specifies how to uniquely identify a record. This is important for downstream de-duplication of records (e.g. by the [incremental sync - Append + Deduped sync mode](/using-airbyte/core-concepts/sync-modes/incremental-append-deduped)). In a lot of cases, like for the EmailOctopus example from above, there is a dedicated id field that can be used for this purpose. It's important that the value of the id field is guaranteed to only occur once for a single record. diff --git a/docs/connector-development/tutorials/adding-incremental-sync.md b/docs/connector-development/tutorials/adding-incremental-sync.md index 992c9d9ed4b5..b463503a795b 100644 --- a/docs/connector-development/tutorials/adding-incremental-sync.md +++ b/docs/connector-development/tutorials/adding-incremental-sync.md @@ -2,7 +2,7 @@ ## Overview -This tutorial will assume that you already have a working source. If you do not, feel free to refer to the [Building a Toy Connector](building-a-python-source.md) tutorial. This tutorial will build directly off the example from that article. We will also assume that you have a basic understanding of how Airbyte's Incremental-Append replication strategy works. We have a brief explanation of it [here](../../understanding-airbyte/connections/incremental-append.md). +This tutorial will assume that you already have a working source. If you do not, feel free to refer to the [Building a Toy Connector](building-a-python-source.md) tutorial. This tutorial will build directly off the example from that article. We will also assume that you have a basic understanding of how Airbyte's Incremental-Append replication strategy works. We have a brief explanation of it [here](/using-airbyte/core-concepts/sync-modes/incremental-append.md). ## Update Catalog in `discover` @@ -293,6 +293,6 @@ Bonus points: go to Airbyte UI and reconfigure the connection to use incremental Incremental definitely requires more configurability than full refresh, so your implementation may deviate slightly depending on whether your cursor field is source defined or user-defined. If you think you are running into one of those cases, check out -our [incremental](../../understanding-airbyte/connections/incremental-append.md) documentation for more information on different types of +our [incremental](/using-airbyte/core-concepts/sync-modes/incremental-append.md) documentation for more information on different types of configuration. 
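As a rough mental model for what the Incremental - Append mode reads on each sync, the following sketch may help. It is not the tutorial's actual code; the table name, cursor column, and the `:state_cursor` placeholder are assumptions:

```sql
-- One incremental sync pass: read records at or after the saved state value.
-- The boundary is inclusive, which is why at-least-once delivery can re-emit
-- the last record of the previous sync.
SELECT *
FROM exchange_rates
WHERE date >= :state_cursor
ORDER BY date;

-- Conceptually, the saved state then advances to the newest cursor value
-- observed among the emitted records (e.g. MAX(date)).
```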
diff --git a/docs/connector-development/tutorials/build-a-connector-the-hard-way.md b/docs/connector-development/tutorials/build-a-connector-the-hard-way.md index fe2ea339bd51..9fb9a71aac70 100644 --- a/docs/connector-development/tutorials/build-a-connector-the-hard-way.md +++ b/docs/connector-development/tutorials/build-a-connector-the-hard-way.md @@ -57,7 +57,7 @@ Here's the outline of what we'll do to build our connector: Once we've completed the above steps, we will have built a functioning connector. Then, we'll add some optional functionality: -- Support [incremental sync](../../understanding-airbyte/connections/incremental-append.md) +- Support [incremental sync](/using-airbyte/core-concepts/sync-modes/incremental-append.md) - Add custom integration tests ### 1. Bootstrap the connector package diff --git a/docs/connector-development/tutorials/cdk-tutorial-python-http/read-data.md b/docs/connector-development/tutorials/cdk-tutorial-python-http/read-data.md index 711880cb0460..8cdee893e5ab 100644 --- a/docs/connector-development/tutorials/cdk-tutorial-python-http/read-data.md +++ b/docs/connector-development/tutorials/cdk-tutorial-python-http/read-data.md @@ -132,7 +132,7 @@ To add incremental sync, we'll do a few things: 6. Update the `path` method to specify the date to pull exchange rates for. 7. Update the configured catalog to use `incremental` sync when we're testing the stream. -We'll describe what each of these methods do below. Before we begin, it may help to familiarize yourself with how incremental sync works in Airbyte by reading the [docs on incremental](../../../understanding-airbyte/connections/incremental-append.md). +We'll describe what each of these methods do below. Before we begin, it may help to familiarize yourself with how incremental sync works in Airbyte by reading the [docs on incremental](/using-airbyte/core-concepts/sync-modes/incremental-append.md). To keep things concise, we'll only show functions as we edit them one by one. diff --git a/docs/contributing-to-airbyte/README.md b/docs/contributing-to-airbyte/README.md index e2b9669e46ea..6683cd77fbb5 100644 --- a/docs/contributing-to-airbyte/README.md +++ b/docs/contributing-to-airbyte/README.md @@ -8,7 +8,7 @@ Thank you for your interest in contributing! We love community contributions. Read on to learn how to contribute to Airbyte. We appreciate first time contributors and we are happy to assist you in getting started. In case of questions, just reach out to us via [email](mailto:hey@airbyte.io) or [Slack](https://slack.airbyte.io)! -Before getting started, please review Airbyte's Code of Conduct. Everyone interacting in Slack, codebases, mailing lists, events, or other Airbyte activities is expected to follow [Code of Conduct](../project-overview/code-of-conduct.md). +Before getting started, please review Airbyte's Code of Conduct. Everyone interacting in Slack, codebases, mailing lists, events, or other Airbyte activities is expected to follow [Code of Conduct](../community/code-of-conduct.md). 
## Code Contributions diff --git a/docs/contributing-to-airbyte/writing-docs.md b/docs/contributing-to-airbyte/writing-docs.md index 6e8e0b21081d..ec4abac48481 100644 --- a/docs/contributing-to-airbyte/writing-docs.md +++ b/docs/contributing-to-airbyte/writing-docs.md @@ -13,7 +13,7 @@ The Docs team maintains a list of [#good-first-issues](https://github.com/airbyt ## Contributing to Airbyte docs -Before contributing to Airbyte docs, read the Airbyte Community [Code of Conduct](../project-overview/code-of-conduct.md). +Before contributing to Airbyte docs, read the Airbyte Community [Code of Conduct](../community/code-of-conduct.md). :::tip If you're new to GitHub and Markdown, complete [the First Contributions tutorial](https://github.com/firstcontributions/first-contributions) and learn [Markdown basics](https://guides.github.com/features/mastering-markdown/) before contributing to Airbyte documentation. Even if you're familiar with the basics, you may be interested in Airbyte's [custom markdown extensions for connector docs](#custom-markdown-extensions-for-connector-docs). diff --git a/docs/integrations/destinations/gcs.md b/docs/integrations/destinations/gcs.md index df8405a3448d..f272b77a9d6c 100644 --- a/docs/integrations/destinations/gcs.md +++ b/docs/integrations/destinations/gcs.md @@ -13,7 +13,7 @@ The Airbyte GCS destination allows you to sync data to cloud storage buckets. Ea | Feature | Support | Notes | | :----------------------------- | :-----: | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Full Refresh Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. | -| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/understanding-airbyte/connections/incremental-append#inclusive-cursors) | +| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) | | Incremental - Append + Deduped | ❌ | | | Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. | diff --git a/docs/integrations/destinations/s3-glue.md b/docs/integrations/destinations/s3-glue.md index 5e66cf7d6e70..f588bc1b424b 100644 --- a/docs/integrations/destinations/s3-glue.md +++ b/docs/integrations/destinations/s3-glue.md @@ -178,7 +178,7 @@ A data sync may create multiple files as the output files can be partitioned by | Feature | Support | Notes | | :----------------------------- | :-----: | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Full Refresh Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. | -| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/understanding-airbyte/connections/incremental-append#inclusive-cursors) | +| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. 
Learn more [here](/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) | | Incremental - Append + Deduped | ❌ | | | Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. | diff --git a/docs/integrations/destinations/s3.md b/docs/integrations/destinations/s3.md index 209b52a7bd31..81f796cae883 100644 --- a/docs/integrations/destinations/s3.md +++ b/docs/integrations/destinations/s3.md @@ -174,7 +174,7 @@ A data sync may create multiple files as the output files can be partitioned by | Feature | Support | Notes | | :----------------------------- | :-----: | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Full Refresh Sync | ✅ | Warning: this mode deletes all previously synced data in the configured bucket path. | -| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/understanding-airbyte/connections/incremental-append#inclusive-cursors) | +| Incremental - Append Sync | ✅ | Warning: Airbyte provides at-least-once delivery. Depending on your source, you may see duplicated data. Learn more [here](/using-airbyte/core-concepts/sync-modes/incremental-append#inclusive-cursors) | | Incremental - Append + Deduped | ❌ | | | Namespaces | ❌ | Setting a specific bucket path is equivalent to having separate namespaces. | diff --git a/docs/integrations/sources/pokeapi.md b/docs/integrations/sources/pokeapi.md index 4290a6073023..4ea12d78b100 100644 --- a/docs/integrations/sources/pokeapi.md +++ b/docs/integrations/sources/pokeapi.md @@ -4,7 +4,7 @@ The PokéAPI is primarily used as a tutorial and educational resource, as it requires zero dependencies. Learn how Airbyte and this connector work with these tutorials: -- [Airbyte Quickstart: An Introduction to Deploying and Syncing](../../quickstart/deploy-airbyte.md) +- [Airbyte Quickstart: An Introduction to Deploying and Syncing](../../using-airbyte/getting-started/readme.md) - [Airbyte CDK Speedrun: A Quick Primer on Building Source Connectors](../../connector-development/tutorials/cdk-speedrun.md) - [How to Build ETL Sources in Under 30 Minutes: A Video Tutorial](https://www.youtube.com/watch?v=kJ3hLoNfz_E&t=13s&ab_channel=Airbyte) diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md b/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md index a204b2a2f49b..1f0175b392d8 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md @@ -18,7 +18,7 @@ After replication of data from a source connector \(Extract\) to a destination c ## Public Git repository -In the connection settings page, I can add new Transformations steps to apply after [normalization](../../understanding-airbyte/basic-normalization.md). For example, I want to run my custom dbt project jaffle_shop, whenever my sync is done replicating and normalizing my data. +In the connection settings page, I can add new transformation steps to apply after [normalization](../../using-airbyte/core-concepts/basic-normalization.md). For example, I want to run my custom dbt project jaffle_shop whenever my sync is done replicating and normalizing my data.
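For readers unfamiliar with dbt, a custom transformation is ultimately just a SQL model that runs against the tables normalization produced. A minimal sketch, with invented source and column names rather than anything from the jaffle_shop project:

```sql
-- models/orders_enriched.sql: a hypothetical dbt model executed after
-- Airbyte's normalization step. It reads a normalized table and derives
-- an additional column.
select
    order_id,
    amount,
    amount * 0.1 as estimated_tax
from {{ source('airbyte_normalized', 'orders') }}
```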
You can find the jaffle shop test repository by clicking [here](https://github.com/dbt-labs/jaffle_shop). diff --git a/docs/release_notes/upgrading_to_destinations_v2.md b/docs/release_notes/upgrading_to_destinations_v2.md index 0d5f70c6bed4..e48eea50f611 100644 --- a/docs/release_notes/upgrading_to_destinations_v2.md +++ b/docs/release_notes/upgrading_to_destinations_v2.md @@ -13,7 +13,7 @@ Airbyte Destinations V2 provides you with: - Internal Airbyte tables in the `airbyte_internal` schema: Airbyte will now generate all raw tables in the `airbyte_internal` schema. We no longer clutter your destination schema with raw data tables. - Incremental delivery for large syncs: Data will be incrementally delivered to your final tables. No more waiting hours to see the first rows in your destination table. -To see more details and examples on the contents of the Destinations V2 release, see this [guide](understanding-airbyte/typing-deduping.md). The remainder of this page will walk you through upgrading connectors from legacy normalization to Destinations V2. +To see more details and examples on the contents of the Destinations V2 release, see this [guide](../using-airbyte/core-concepts/typing-deduping.md). The remainder of this page will walk you through upgrading connectors from legacy normalization to Destinations V2. Destinations V2 were in preview for Snowflake and BigQuery during August 2023, and launched on August 29th, 2023. Other destinations will be transitioned to Destinations V2 on or before November 1st, 2023. diff --git a/docs/snowflake-native-apps/facebook-marketing.md b/docs/snowflake-native-apps/facebook-marketing.md index a24a38b37bc1..1b4a458e2e20 100644 --- a/docs/snowflake-native-apps/facebook-marketing.md +++ b/docs/snowflake-native-apps/facebook-marketing.md @@ -3,7 +3,7 @@ The Facebook Marketing Connector by Airbyte is a Snowflake Native Application that allows you to extract data from your Facebook Marketing account and load records into a Snowflake database of your choice. :::info -The Snowflake Native Apps platform is new and rapidly evolving. The Facebook Marketing Connector by Airbyte is in _public preview_ and is subject to further development that may affect setup and configuration of the application. Please note that, at this time, only a [full table refresh](../understanding-airbyte/connections/full-refresh-overwrite.md) without deduplication is supported. +The Snowflake Native Apps platform is new and rapidly evolving. The Facebook Marketing Connector by Airbyte is in _public preview_ and is subject to further development that may affect setup and configuration of the application. Please note that, at this time, only a [full table refresh](/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md) without deduplication is supported. ::: # Getting started diff --git a/docs/snowflake-native-apps/linkedin-ads.md b/docs/snowflake-native-apps/linkedin-ads.md index af43f7157cc5..bd34a7ffa565 100644 --- a/docs/snowflake-native-apps/linkedin-ads.md +++ b/docs/snowflake-native-apps/linkedin-ads.md @@ -3,7 +3,7 @@ The LinkedIn Ads Connector by Airbyte is a Snowflake Native Application that allows you to extract data from your LinkedIn Ads account and load records into a Snowflake database of your choice. :::info -The Snowflake Native Apps platform is new and rapidly evolving. The LinkedIn Ads Connector by Airbyte is in _public preview_ and is subject to further development that may affect setup and configuration of the application. 
Please note that, at this time, only a [full table refresh](../understanding-airbyte/connections/full-refresh-overwrite.md) without deduplication is supported. +The Snowflake Native Apps platform is new and rapidly evolving. The LinkedIn Ads Connector by Airbyte is in _public preview_ and is subject to further development that may affect setup and configuration of the application. Please note that, at this time, only a [full table refresh](/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md) without deduplication is supported. ::: # Getting started diff --git a/docs/understanding-airbyte/airbyte-protocol.md b/docs/understanding-airbyte/airbyte-protocol.md index 66c0bc4f10ed..e436b24eada6 100644 --- a/docs/understanding-airbyte/airbyte-protocol.md +++ b/docs/understanding-airbyte/airbyte-protocol.md @@ -333,7 +333,7 @@ Technical systems often group their underlying data into namespaces with each na An example of a namespace is the RDBMS's `schema` concept. An API namespace might be used for multiple accounts (e.g. `company_a` vs `company_b`, each having a "users" and "purchases" stream). Some common use cases for schemas are enforcing permissions, segregating test and production data and general data organization. -The `AirbyteStream` represents this concept through an optional field called `namespace`. Additional documentation on Namespaces can be found [here](namespaces.md). +The `AirbyteStream` represents this concept through an optional field called `namespace`. Additional documentation on Namespaces can be found [here](/using-airbyte/core-concepts/namespaces.md). ### Cursor diff --git a/docs/understanding-airbyte/beginners-guide-to-catalog.md b/docs/understanding-airbyte/beginners-guide-to-catalog.md index ff5451e15c5d..44b6260052aa 100644 --- a/docs/understanding-airbyte/beginners-guide-to-catalog.md +++ b/docs/understanding-airbyte/beginners-guide-to-catalog.md @@ -16,7 +16,7 @@ This article will illustrate how to use `AirbyteCatalog` via a series of example * [Dynamic Streams Example](#dynamic-streams-example) * [Nested Schema Example](#nested-schema-example) -In order to understand in depth how to configure incremental data replication, head over to the [incremental replication docs](connections/incremental-append.md). +In order to understand in depth how to configure incremental data replication, head over to the [incremental replication docs](/using-airbyte/core-concepts/sync-modes/incremental-append.md). ## Database Example diff --git a/docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md b/docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md index b077c0508c79..6fa0272fda6e 100644 --- a/docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md +++ b/docs/using-airbyte/core-concepts/sync-modes/incremental-append-deduped.md @@ -118,4 +118,4 @@ select * from table where cursor_field > 'last_sync_max_cursor_field_value' **Note**: -Previous versions of Airbyte destinations supported SCD tables, which would sore every entry seen for a record. This was removed with Destinations V2 and [Typing and Deduplication](/understanding-airbyte/typing-deduping.md). +Previous versions of Airbyte destinations supported SCD tables, which would store every entry seen for a record. This was removed with Destinations V2 and [Typing and Deduplication](../typing-deduping.md).
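To make the note about Typing and Deduplication concrete, the deduplication that replaced SCD tables can be pictured as a window-function pass over the appended history. This is a conceptual sketch, not the SQL Airbyte actually executes; `id` and `updated_at` stand in for the configured primary key and cursor field:

```sql
-- Keep only the most recent version of each record, identified by its
-- primary key, ordered by the cursor field.
SELECT *
FROM (
    SELECT
        t.*,
        ROW_NUMBER() OVER (
            PARTITION BY id            -- assumed primary key
            ORDER BY updated_at DESC   -- assumed cursor field
        ) AS row_num
    FROM appended_history AS t         -- assumed name for the appended data
) AS ranked
WHERE row_num = 1;
```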
From c2a34f8363d43399f7ee561e577e80aa5fafc4a7 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:04:42 +0100 Subject: [PATCH 08/52] Fix more links --- .../transformations-with-sql.md | 2 +- docs/understanding-airbyte/beginners-guide-to-catalog.md | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md b/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md index 3f6c9357d2c1..4e29e15fe167 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md @@ -16,7 +16,7 @@ At its core, Airbyte is geared to handle the EL \(Extract Load\) steps of an ELT However, this is actually producing a table in the destination with a JSON blob column... For the typical analytics use case, you probably want this json blob normalized so that each field is its own column. -So, after EL, comes the T \(transformation\) and the first T step that Airbyte actually applies on top of the extracted data is called "Normalization". You can find more information about it [here](../../understanding-airbyte/basic-normalization.md). +So, after EL, comes the T \(transformation\) and the first T step that Airbyte actually applies on top of the extracted data is called "Normalization". You can find more information about it [here](../../using-airbyte/core-concepts/basic-normalization.md). Airbyte runs this step before handing the final data over to other tools that will manage further transformation down the line. diff --git a/docs/understanding-airbyte/beginners-guide-to-catalog.md b/docs/understanding-airbyte/beginners-guide-to-catalog.md index 44b6260052aa..1953b1681c82 100644 --- a/docs/understanding-airbyte/beginners-guide-to-catalog.md +++ b/docs/understanding-airbyte/beginners-guide-to-catalog.md @@ -92,7 +92,7 @@ The catalog is structured as a list of `AirbyteStream`. In the case of a databas Let's walk through what each field in a stream means. * `name` - The name of the stream. -* `supported_sync_modes` - This field lists the type of data replication that this source supports. The possible values in this array include `FULL_REFRESH` \([docs](connections/full-refresh-overwrite.md)\) and `INCREMENTAL` \([docs](connections/incremental-append.md)\). +* `supported_sync_modes` - This field lists the type of data replication that this source supports. The possible values in this array include `FULL_REFRESH` \([docs](/using-airbyte/core-concepts/sync-modes/full-refresh-overwrite.md)\) and `INCREMENTAL` \([docs](/using-airbyte/core-concepts/sync-modes/incremental-append.md)\). * `source_defined_cursor` - If the stream supports `INCREMENTAL` replication, then this field signals whether the source can figure out how to detect new records on its own or not. * `json_schema` - This field is a [JsonSchema](https://json-schema.org/understanding-json-schema) object that describes the structure of the data. Notice that each key in the `properties` object corresponds to a column name in our database table. @@ -137,7 +137,7 @@ Let's walk through each field in the `ConfiguredAirbyteStream`: * `sync_mode` - This field must be one of the values that was in `supported_sync_modes` in the `AirbyteStream` - Configures which sync mode will be used when data is replicated. * `stream` - Hopefully this one looks familiar! This field contains an `AirbyteStream`. 
It should be _identical_ to the one we saw in the `AirbyteCatalog`. -* `cursor_field` - When `sync_mode` is `INCREMENTAL` and `source_defined_cursor = false`, this field configures which field in the stream will be used to determine if a record should be replicated or not. Read more about this concept in our [documentation of incremental replication](connections/incremental-append.md). +* `cursor_field` - When `sync_mode` is `INCREMENTAL` and `source_defined_cursor = false`, this field configures which field in the stream will be used to determine if a record should be replicated or not. Read more about this concept in our [documentation of incremental replication](/using-airbyte/core-concepts/sync-modes/incremental-append.md). ### Summary of the Postgres Example From 91c3ce1413a4cfef71f29757ac7ed9fefe7c4761 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:14:12 +0100 Subject: [PATCH 09/52] Fix redirects --- docusaurus/redirects.yml | 39 ++++++++++++++++----------------------- 1 file changed, 16 insertions(+), 23 deletions(-) diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index c31fc1ea4751..eba9621835f0 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -1,8 +1,4 @@ # A list of URLs that should be redirected to new pathes -- from: /airbyte-pro - to: /enterprise-setup/self-managed/ -- from: /airbyte-enterprise - to: /enterprise-setup/self-managed/ - from: /upgrading-airbyte to: /operator-guides/upgrading-airbyte - from: /catalog @@ -15,12 +11,6 @@ to: /category/release-notes - from: /connector-development/config-based/understanding-the-yaml-file/stream-slicers/ to: /connector-development/config-based/understanding-the-yaml-file/partition-router -- from: /cloud/managing-airbyte-cloud - to: /category/using-airbyte-cloud -- from: /category/managing-airbyte-cloud - to: /category/using-airbyte-cloud -- from: /category/airbyte-open-source-quick-start - to: /category/getting-started - from: /cloud/dbt-cloud-integration to: /cloud/managing-airbyte-cloud/dbt-cloud-integration - from: /cloud/managing-airbyte-cloud/review-sync-summary @@ -41,16 +31,19 @@ - from: /project-overview/slack-code-of-conduct to: /community/slack-code-of-conduct - from : /project-overview/licenses/ - to: /developer-guides/licences/ + to: /developer-guides/licenses/ - from: /project-overview/licenses/license-faq - to: /developer-guides/licences/license-faq + to: /developer-guides/licenses/license-faq - from: /project-overview/licenses/elv2-license - to: /developer-guides/licences/elv2-license + to: /developer-guides/licenses/elv2-license - from: /project-overview/licenses/mit-license - to: /developer-guides/licences/mit-license + to: /developer-guides/licenses/mit-license - from: /project-overview/licenses/examples - to: /developer-guides/licences/examples -- from: /enterprise-setup/self-managed/ + to: /developer-guides/licenses/examples +- from: + - /enterprise-setup/self-managed/ + - /airbyte-pro + - /airbyte-enterprise to: /enterprise-setup/ - from: /enterprise-setup/self-managed/implementation-guide to: /enterprise-setup/implementation-guide @@ -61,11 +54,11 @@ - /operator-guides/securing-airbyte - /operator-guides/security to: /operating-airbyte/security -- from: /cloud/getting-started-with-airbyte-cloud - to: /using-airbyte/getting-started - from: + - /cloud/getting-started-with-airbyte-cloud - /quickstart/deploy-airbyte - /category/getting-started + - /category/airbyte-open-source-quick-start to: /using-airbyte/getting-started/ - from: /quickstart/add-a-source to: 
/using-airbyte/getting-started/add-a-source @@ -74,19 +67,19 @@ - from: /quickstart/set-up-a-connection to: /using-airbyte/getting-started/set-up-a-connection - from: /cloud/core-concepts - to: /using-airbyte/core-concepts + to: /using-airbyte/core-concepts/ - from: /understanding-airbyte/namespaces to: /using-airbyte/core-concepts/namespaces - from: /understanding-airbyte/connections/ to: /using-airbyte/core-concepts/sync-modes/ - from: /understanding-airbyte/connections/full-refresh-overwrite - to: /using-airbyte/core-concepts/full-refresh-overwrite + to: /using-airbyte/core-concepts/sync-modes/full-refresh-overwrite - from: /understanding-airbyte/connections/full-refresh-append - to: /using-airbyte/core-concepts/full-refresh-append + to: /using-airbyte/core-concepts/sync-modes/full-refresh-append - from: /understanding-airbyte/connections/incremental-append - to: /using-airbyte/core-concepts/incremental-append + to: /using-airbyte/core-concepts/sync-modes/incremental-append - from: /understanding-airbyte/connections/incremental-append-deduped - to: /using-airbyte/core-concepts/incremental-append-deduped + to: /using-airbyte/core-concepts/sync-modes/incremental-append-deduped - from: /understanding-airbyte/basic-normalization to: /using-airbyte/core-concepts/basic-normalization - from: /understanding-airbyte/typing-deduping From 3c6273e114c60a1beca12cc9899fcca6bacd265f Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:19:45 +0100 Subject: [PATCH 10/52] Fix all links --- docs/integrations/sources/mysql.md | 2 +- docs/integrations/sources/postgres.md | 2 +- docs/integrations/sources/postgres/cloud-sql-postgres.md | 2 +- docs/readme.md | 2 +- docs/understanding-airbyte/operations.md | 2 +- docs/using-airbyte/core-concepts/namespaces.md | 2 +- docs/using-airbyte/getting-started/set-up-a-connection.md | 2 +- 7 files changed, 7 insertions(+), 7 deletions(-) diff --git a/docs/integrations/sources/mysql.md b/docs/integrations/sources/mysql.md index f75d347df8f6..9f5c110266c7 100644 --- a/docs/integrations/sources/mysql.md +++ b/docs/integrations/sources/mysql.md @@ -91,7 +91,7 @@ To fill out the required information: #### Step 4: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs. If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. You can find a list of all IPs that need to be allowlisted in -our [Airbyte Security docs](../../../operator-guides/security#network-security-1). +our [Airbyte Security docs](../../operating-airbyte/security#network-security-1). Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte MySQL source! diff --git a/docs/integrations/sources/postgres.md b/docs/integrations/sources/postgres.md index 6c09d3aabd75..2d31f7286bb0 100644 --- a/docs/integrations/sources/postgres.md +++ b/docs/integrations/sources/postgres.md @@ -54,7 +54,7 @@ To fill out the required information: #### Step 3: (Airbyte Cloud Only) Allow inbound traffic from Airbyte IPs. If you are on Airbyte Cloud, you will always need to modify your database configuration to allow inbound traffic from Airbyte IPs. You can find a list of all IPs that need to be allowlisted in -our [Airbyte Security docs](../../../operator-guides/security#network-security-1). +our [Airbyte Security docs](../../operating-airbyte/security#network-security-1). Now, click `Set up source` in the Airbyte UI. 
Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte Postgres source! diff --git a/docs/integrations/sources/postgres/cloud-sql-postgres.md b/docs/integrations/sources/postgres/cloud-sql-postgres.md index 9a3f9e6e01a0..670d268f82d3 100644 --- a/docs/integrations/sources/postgres/cloud-sql-postgres.md +++ b/docs/integrations/sources/postgres/cloud-sql-postgres.md @@ -58,7 +58,7 @@ If you are on Airbyte Cloud, you will always need to modify your database config ![Add a Network](./assets/airbyte_cloud_sql_postgres_add_network.png) -2. Add a new network, and enter the Airbyte's IPs, which you can find in our [Airbyte Security documentation](../../../operator-guides/security#network-security-1). +2. Add a new network, and enter Airbyte's IPs, which you can find in our [Airbyte Security documentation](../../../operating-airbyte/security#network-security-1). Now, click `Set up source` in the Airbyte UI. Airbyte will now test connecting to your database. Once this succeeds, you've configured an Airbyte Postgres source! diff --git a/docs/readme.md b/docs/readme.md index e9ef4202203b..e8321d457839 100644 --- a/docs/readme.md +++ b/docs/readme.md @@ -6,7 +6,7 @@ Whether you are an Airbyte user or contributor, we have docs for you! Browse the [connector catalog](/integrations/) to find the connector you want. In case the connector is not yet supported on Airbyte Cloud, consider using [Airbyte Open Source](#for-airbyte-open-source-users). -Next, check out the [step-by-step tutorial](/cloud/getting-started-with-airbyte-cloud) to sign up for Airbyte Cloud, understand Airbyte [concepts](/cloud/core-concepts), and run your first sync. Then learn how to [use your Airbyte Cloud account](/category/using-airbyte-cloud). +Next, check out the [step-by-step tutorial](/using-airbyte/getting-started) to sign up for Airbyte Cloud, understand Airbyte [concepts](/using-airbyte/core-concepts), and run your first sync. ## For Airbyte Open Source users diff --git a/docs/understanding-airbyte/operations.md b/docs/understanding-airbyte/operations.md index f3839499e39b..b21a087651b3 100644 --- a/docs/understanding-airbyte/operations.md +++ b/docs/understanding-airbyte/operations.md @@ -1,6 +1,6 @@ # Operations -Airbyte [connections](connections/) support configuring additional transformations that execute after the sync. Useful applications could be: +Airbyte [connections](/using-airbyte/core-concepts/sync-modes/) support configuring additional transformations that execute after the sync. Useful applications could be: * Customized normalization to better fit the requirements of your own business context. * Business transformations from a technical data representation into a more logical and business oriented data structure. This can facilitate usage by end-users, non-technical operators, and executives looking to generate Business Intelligence dashboards and reports. diff --git a/docs/using-airbyte/core-concepts/namespaces.md b/docs/using-airbyte/core-concepts/namespaces.md index d5deac5d12fc..83975ff6180b 100644 --- a/docs/using-airbyte/core-concepts/namespaces.md +++ b/docs/using-airbyte/core-concepts/namespaces.md @@ -35,7 +35,7 @@ If the Destination does not support namespaces, the [namespace field](https://gi ## Destination namespace configuration -As part of the [connections sync settings](connections/), it is possible to configure the namespace used by: 1. destination connectors: to store the `_airbyte_raw_*` tables. 2.
basic normalization: to store the final normalized tables. +As part of the [connections sync settings](./sync-modes/), it is possible to configure the namespace used by: 1. destination connectors: to store the `_airbyte_raw_*` tables. 2. basic normalization: to store the final normalized tables. Note that custom transformation outputs are not affected by the namespace settings from Airbyte: It is up to the configuration of the custom dbt project, and how it is written to handle its [custom schemas](https://docs.getdbt.com/docs/building-a-dbt-project/building-models/using-custom-schemas). The default target schema for dbt in this case, will always be the destination namespace. diff --git a/docs/using-airbyte/getting-started/set-up-a-connection.md b/docs/using-airbyte/getting-started/set-up-a-connection.md index 49d65d1c54d0..c3b681ba1607 100644 --- a/docs/using-airbyte/getting-started/set-up-a-connection.md +++ b/docs/using-airbyte/getting-started/set-up-a-connection.md @@ -1,6 +1,6 @@ # Set up a Connection -Now that you've learned how to [deploy Airbyte locally](./deploy-airbyte) and set up your first [source](./add-a-source) and [destination](./add-a-destination), it's time to finish the job by creating your very first connection! +Now that you've learned how to [deploy Airbyte locally](./readme.md) and set up your first [source](./add-a-source) and [destination](./add-a-destination), it's time to finish the job by creating your very first connection! On the left side of your main Airbyte dashboard, select **Connections**. You will be prompted to choose which source and destination to use for this connection. As an example, we'll use the **Google Sheets** source and **Local JSON** destination. From fd2eb2d166d42b135ba482c007dac6e87bc147ca Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:27:20 +0100 Subject: [PATCH 11/52] Cloud/oss appliesto --- .../manage-airbyte-cloud-workspace.md | 14 ++++++++------ .../cloud/managing-airbyte-cloud/manage-credits.md | 2 ++ 2 files changed, 10 insertions(+), 6 deletions(-) diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md index 40336d7b9273..710242ca4728 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md +++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md @@ -1,6 +1,8 @@ # Manage your workspace -An Airbyte Cloud workspace allows you to collaborate with other users and manage connections under a shared billing account. +A workspace in Airbyte allows you to collaborate with other users and manage connections together. On Airbyte Cloud, a workspace also shares billing details. + + :::info Airbyte [credits](https://airbyte.com/pricing) are assigned per workspace and cannot be transferred between workspaces. ::: ## Add users to your workspace To add a user to your workspace: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**. +1. Go to **Settings** via the side navigation in Airbyte. 2. Click **Access Management**. @@ -28,7 +30,7 @@ To add a user to your workspace: To remove a user from your workspace: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**. +1. Go to **Settings** via the side navigation in Airbyte. 2. Click **Access Management**. @@ -40,7 +42,7 @@ To remove a user from your workspace: To rename a workspace: -1.
On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**. +1. Go to **Settings** via the side navigation in Airbyte. 2. Click **General Settings**. @@ -52,7 +54,7 @@ To rename a workspace: To delete a workspace: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**. +1. Go to **Settings** via the side navigation in Airbyte. 2. Click **General Settings**. @@ -78,6 +80,6 @@ You can use one or multiple workspaces with Airbyte Cloud, which gives you flexi To switch between workspaces: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click the current workspace name under the Airbyte logo in the navigation bar. +1. Click the current workspace name under the Airbyte logo in the navigation bar. 2. Search for the workspace or click the name of the workspace you want to switch to. diff --git a/docs/cloud/managing-airbyte-cloud/manage-credits.md b/docs/cloud/managing-airbyte-cloud/manage-credits.md index 7ed15c0ed76f..bcc1fe8af0e9 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-credits.md +++ b/docs/cloud/managing-airbyte-cloud/manage-credits.md @@ -1,5 +1,7 @@ # Manage credits + + ## Buy credits Airbyte [credits](https://airbyte.com/pricing) are used to pay for Airbyte resources when you run a sync. You can purchase credits on Airbyte Cloud to keep your data flowing without interruption. From c7c28436f6113114b5bf4282e67995c31fb14566 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:33:20 +0100 Subject: [PATCH 12/52] Remove troubleshooting --- .../{airbyte-support.md => getting-support.md} | 0 docs/deploying-airbyte/local-deployment.md | 3 ++- docs/troubleshooting.md | 17 ----------------- docusaurus/redirects.yml | 8 +++++--- docusaurus/sidebars.js | 1 - 5 files changed, 7 insertions(+), 22 deletions(-) rename docs/community/{airbyte-support.md => getting-support.md} (100%) delete mode 100644 docs/troubleshooting.md diff --git a/docs/community/airbyte-support.md b/docs/community/getting-support.md similarity index 100% rename from docs/community/airbyte-support.md rename to docs/community/getting-support.md diff --git a/docs/deploying-airbyte/local-deployment.md b/docs/deploying-airbyte/local-deployment.md index ff94ad68c885..772a603d6147 100644 --- a/docs/deploying-airbyte/local-deployment.md +++ b/docs/deploying-airbyte/local-deployment.md @@ -67,4 +67,5 @@ bash run-ab-platform.sh ## Troubleshooting -If you encounter any issues, just connect to our [Slack](https://slack.airbyte.io). Our community will help! We also have a [troubleshooting](../troubleshooting.md) section in our docs for common problems. +If you encounter any issues, check out the [Getting Support](/community/getting-support) documentation +for options on how to get in touch with the community or with us. \ No newline at end of file diff --git a/docs/troubleshooting.md b/docs/troubleshooting.md deleted file mode 100644 index 7ace8c8742e8..000000000000 --- a/docs/troubleshooting.md +++ /dev/null @@ -1,17 +0,0 @@ -# Troubleshooting - -Welcome to the Airbyte troubleshooting guide! Like any platform, you may experience issues when using Airbyte. This guide is designed to help you diagnose and resolve any problems you may encounter while using Airbyte. By following the troubleshooting steps outlined in this guide, you can quickly and effectively identify the root cause of the issue and take steps to resolve it.
We recommend checking this guide whenever you encounter an issue with Airbyte to help ensure a smooth and uninterrupted experience with our platform. Let's dive in! - -Step 1: Check the logs. The logs provide detailed information about what's happening behind the scenes, and they can help pinpoint the root cause of the problem. - -Step 2: Check the documentation. Our documentation covers a wide range of topics, including common issues and their solutions, troubleshooting tips, and best practices. - -Step 3: Reach out to the community. Our community forum is a great place to ask for help, share your experiences, and learn from others who have faced similar issues. - -Step 4: Open a Github ticket. If you're still unable to resolve the issue after reaching out to the community, it's time to open a support ticket. Our support team is here to help you with any issues you're facing with Airbyte. - -Airbyte is an open source project with a vibrant community that fosters collaboration and mutual support. To ensure accessible troubleshooting guidance, Airbyte offers multiple platforms for users to ask and discuss issues, including the Airbyte Github, Airbyte Community Slack (which is over 10,000 users), and the Airbyte Forum. In addition, Airbyte hosts daily office hours that include topic demonstrations and dedicated space for issue discussion in Zoom meetings. In addition to these community resources, Airbyte also offers premium support packages for users who require additional assistance beyond what is provided by the community. - -:::info -In case you're not able to solve your issue, have a look at all your [support options](./community/airbyte-support). -::: \ No newline at end of file diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index eba9621835f0..74a9578c0aad 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -24,8 +24,6 @@ - /project-overview/product-support-levels - /project-overview/product-release-stages to: /integrations/connector-support-levels -- from: /operator-guides/contact-support - to: /community/airbyte-support - from: /project-overview/code-of-conduct to: /community/code-of-conduct - from: /project-overview/slack-code-of-conduct @@ -83,4 +81,8 @@ - from: /understanding-airbyte/basic-normalization to: /using-airbyte/core-concepts/basic-normalization - from: /understanding-airbyte/typing-deduping - to: /using-airbyte/core-concepts/typing-deduping \ No newline at end of file + to: /using-airbyte/core-concepts/typing-deduping +- from: + - /troubleshooting + - /operator-guides/contact-support + to: /community/getting-support \ No newline at end of file diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index f151fc90e158..af1c53134c25 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -489,7 +489,6 @@ module.exports = { ] }, "cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits", - "troubleshooting", sectionHeader("Operating Airbyte"), deployAirbyte, { From 60ce0d5308fba6174911261025ae2aaaef3c424a Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:38:54 +0100 Subject: [PATCH 13/52] Fix sidebar --- docusaurus/sidebars.js | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index af1c53134c25..ac86476ad368 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -479,7 +479,7 @@ module.exports = { }, { type: "category", - label: "Managing your Account", + label: "Managing your workspace", items: [ 
"cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace", // TODO: merge with operator-guides/configure-sync-notifications @@ -565,8 +565,7 @@ module.exports = { ], }, sectionHeader("Community"), - // TODO: Write a "getting in touch or overview doc" - "community/airbyte-support", + "community/getting-support", "community/code-of-conduct", "community/slack-code-of-conduct", sectionHeader("Product Updates"), From df7294df5afd600c6f81c3db180fd2f513acac4d Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sat, 25 Nov 2023 18:42:18 +0100 Subject: [PATCH 14/52] Note in upgrade airbyte --- docs/operator-guides/upgrading-airbyte.md | 7 +++++++ 1 file changed, 7 insertions(+) diff --git a/docs/operator-guides/upgrading-airbyte.md b/docs/operator-guides/upgrading-airbyte.md index 5c5197441b85..4d2dafd2991f 100644 --- a/docs/operator-guides/upgrading-airbyte.md +++ b/docs/operator-guides/upgrading-airbyte.md @@ -1,5 +1,12 @@ # Upgrading Airbyte +:::info + +If you run on [Airbyte Cloud](https://cloud.airbyte.com/signup) you'll always run on the newest +Airbyte version automatically. This documentation only applies to users deploying our self-managed +version. +::: + ## Overview This tutorial will describe how to determine if you need to run this upgrade process, and if you do, how to do so. This process does require temporarily turning off Airbyte. From 4323818e102ca61dd2439fd6bbbc615047309bb4 Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sat, 25 Nov 2023 17:42:45 +0000 Subject: [PATCH 15/52] Update wording of consolidated docs --- docs/deploying-airbyte/README.md | 15 --- docs/deploying-airbyte/local-deployment.md | 12 +- .../getting-started/destination-redshift.md | 70 ------------ .../getting-started/source-github.md | 12 -- .../getting-started/source-google-ads.md | 42 ------- docs/integrations/missing-an-integration.md | 14 --- .../using-airbyte/core-concepts/namespaces.md | 108 +++++++----------- docs/using-airbyte/core-concepts/readme.md | 35 +++++- .../core-concepts/sync-modes/README.md | 58 +--------- docs/using-airbyte/getting-started/readme.md | 35 +++++- .../getting-started/set-up-a-connection.md | 18 ++- 11 files changed, 130 insertions(+), 289 deletions(-) delete mode 100644 docs/deploying-airbyte/README.md delete mode 100644 docs/integrations/getting-started/destination-redshift.md delete mode 100644 docs/integrations/getting-started/source-github.md delete mode 100644 docs/integrations/getting-started/source-google-ads.md delete mode 100644 docs/integrations/missing-an-integration.md diff --git a/docs/deploying-airbyte/README.md b/docs/deploying-airbyte/README.md deleted file mode 100644 index 2f8a6e290a36..000000000000 --- a/docs/deploying-airbyte/README.md +++ /dev/null @@ -1,15 +0,0 @@ -# Deploy Airbyte where you want to - -![not all who wander are lost](https://user-images.githubusercontent.com/2591516/170351002-0d054d06-c901-4794-8719-97569060408f.png) - -- [Local Deployment](local-deployment.md) -- [On Airbyte Cloud](on-cloud.md) -- [On Aws](on-aws-ec2.md) -- [On Azure VM Cloud Shell](on-azure-vm-cloud-shell.md) -- [On Digital Ocean Droplet](on-digitalocean-droplet.md) -- [On GCP.md](on-gcp-compute-engine.md) -- [On Kubernetes](on-kubernetes-via-helm.md) -- [On OCI VM](on-oci-vm.md) -- [On Restack](on-restack.md) -- [On Plural](on-plural.md) -- [On AWS ECS (spoiler alert: it doesn't work)](on-aws-ecs.md) diff --git a/docs/deploying-airbyte/local-deployment.md b/docs/deploying-airbyte/local-deployment.md index 
ff94ad68c885..d0d581024e00 100644 --- a/docs/deploying-airbyte/local-deployment.md +++ b/docs/deploying-airbyte/local-deployment.md @@ -21,8 +21,8 @@ cd airbyte ./run-ab-platform.sh ``` -- In your browser, just visit [http://localhost:8000](http://localhost:8000) -- You will be asked for a username and password. By default, that's username `airbyte` and password `password`. Once you deploy Airbyte to your servers, be sure to change these: +- In your browser, visit [http://localhost:8000](http://localhost:8000) +- You will be asked for a username and password. By default, that's username `airbyte` and password `password`. Once you deploy Airbyte to your servers, be sure to change these in your `.env` file: ```yaml # Proxy Configuration @@ -67,4 +67,10 @@ bash run-ab-platform.sh ## Troubleshooting -If you encounter any issues, just connect to our [Slack](https://slack.airbyte.io). Our community will help! We also have a [troubleshooting](../troubleshooting.md) section in our docs for common problems. +If you have any questions about the local setup and deployment process, head over to our [Getting Started FAQ](https://github.com/airbytehq/airbyte/discussions/categories/questions) on our Airbyte Forum that answers the following questions and more: + +- How long does it take to set up Airbyte? +- Where can I see my data once I've run a sync? +- Can I set a start time for my sync? + +We also have a [troubleshooting](../troubleshooting.md) section in our docs for common problems. If there are any questions that we couldn't answer here, we'd love to help you get started. [Join our Slack](https://airbytehq.slack.com/ssb/redirect) for additional community support. diff --git a/docs/integrations/getting-started/destination-redshift.md b/docs/integrations/getting-started/destination-redshift.md deleted file mode 100644 index ae59b0eeff95..000000000000 --- a/docs/integrations/getting-started/destination-redshift.md +++ /dev/null @@ -1,70 +0,0 @@ -# Getting Started: Destination Redshift - -## Requirements - -1. Active Redshift cluster -2. Allow connections from Airbyte to your Redshift cluster \(if they exist in separate VPCs\) -3. A staging S3 bucket with credentials \(for the COPY strategy\). - -## Setup guide - -### 1. Make sure your cluster is active and accessible from the machine running Airbyte - -This is dependent on your networking setup. The easiest way to verify if Airbyte is able to connect to your Redshift cluster is via the check connection tool in the UI. You can check AWS Redshift documentation with a tutorial on how to properly configure your cluster's access [here](https://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-authorize-cluster-access.html) - -### 2. Fill up connection info - -Next is to provide the necessary information on how to connect to your cluster such as the `host` whcih is part of the connection string or Endpoint accessible [here](https://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-connect-to-cluster.html#rs-gsg-how-to-get-connection-string) without the `port` and `database` name \(it typically includes the cluster-id, region and end with `.redshift.amazonaws.com`\). - -You should have all the requirements needed to configure Redshift as a destination in the UI. You'll need the following information to configure the destination: - -* **Host** -* **Port** -* **Username** -* **Password** -* **Schema** -* **Database** - * This database needs to exist within the cluster provided. - -### 2a. Fill up S3 info \(for COPY strategy\) - -Provide the required S3 info. 
- -* **S3 Bucket Name** - * See [this](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) to create an S3 bucket. -* **S3 Bucket Region** - * Place the S3 bucket and the Redshift cluster in the same region to save on networking costs. -* **Access Key Id** - * See [this](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys) on how to generate an access key. - * We recommend creating an Airbyte-specific user. This user will require [read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) to objects in the staging bucket. -* **Secret Access Key** - * Corresponding key to the above key id. -* **Part Size** - * Affects the size limit of an individual Redshift table. Optional. Increase this if syncing tables larger than 100GB. Files are streamed to S3 in parts. This determines the size of each part, in MBs. As S3 has a limit of 10,000 parts per file, part size affects the table size. This is 10MB by default, resulting in a default table limit of 100GB. Note, a larger part size will result in larger memory requirements. A rule of thumb is to multiply the part size by 10 to get the memory requirement. Modify this with care. - -Optional parameters: -* **Bucket Path** - * The directory within the S3 bucket to place the staging data. For example, if you set this to `yourFavoriteSubdirectory`, staging data will be placed inside `s3://yourBucket/yourFavoriteSubdirectory`. If not provided, defaults to the root directory. - -## Notes about Redshift Naming Conventions - -From [Redshift Names & Identifiers](https://docs.aws.amazon.com/redshift/latest/dg/r_names.html): - -### Standard Identifiers - -* Begin with an ASCII single-byte alphabetic character or underscore character, or a UTF-8 multibyte character two to four bytes long. -* Subsequent characters can be ASCII single-byte alphanumeric characters, underscores, or dollar signs, or UTF-8 multibyte characters two to four bytes long. -* Be between 1 and 127 bytes in length, not including quotation marks for delimited identifiers. -* Contain no quotation marks and no spaces. - -### Delimited Identifiers - -Delimited identifiers \(also known as quoted identifiers\) begin and end with double quotation marks \("\). If you use a delimited identifier, you must use the double quotation marks for every reference to that object. The identifier can contain any standard UTF-8 printable characters other than the double quotation mark itself. Therefore, you can create column or table names that include otherwise illegal characters, such as spaces or the percent symbol. ASCII letters in delimited identifiers are case-insensitive and are folded to lowercase. To use a double quotation mark in a string, you must precede it with another double quotation mark character. - -Therefore, Airbyte Redshift destination will create tables and schemas using the Unquoted identifiers when possible or fallback to Quoted Identifiers if the names are containing special characters. - -## Data Size Limitations - -Redshift specifies a maximum limit of 65535 bytes to store the raw JSON record data. Thus, when a row is too big to fit, the Redshift destination fails to load such data and currently ignores that record. 
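The identifier rules and the record-size limit quoted above can be illustrated with a short DDL sketch. The table and column names are invented; the exact DDL Airbyte generates may differ:

```sql
-- Standard identifiers (letters, digits, underscores) need no quoting:
CREATE TABLE _airbyte_raw_users (
    _airbyte_data VARCHAR(65535)  -- 65535 bytes is Redshift's VARCHAR ceiling
);

-- Names with spaces or special characters fall back to delimited (quoted)
-- identifiers, and every later reference must repeat the quotes:
CREATE TABLE "user events %" (
    "event payload" VARCHAR(65535)
);
```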
- -For more information, see the [docs here.](https://docs.aws.amazon.com/redshift/latest/dg/r_Character_types.html) diff --git a/docs/integrations/getting-started/source-github.md b/docs/integrations/getting-started/source-github.md deleted file mode 100644 index 6ae7f442aade..000000000000 --- a/docs/integrations/getting-started/source-github.md +++ /dev/null @@ -1,12 +0,0 @@ -## Getting Started: Source GitHub - -### Requirements - -* Github Account -* Github Personal Access Token wih the necessary permissions \(described below\) - -### Setup guide - -Log into Github and then generate a [personal access token](https://github.com/settings/tokens). - -Your token should have at least the `repo` scope. Depending on which streams you want to sync, the user generating the token needs more permissions: diff --git a/docs/integrations/getting-started/source-google-ads.md b/docs/integrations/getting-started/source-google-ads.md deleted file mode 100644 index f1558cddf335..000000000000 --- a/docs/integrations/getting-started/source-google-ads.md +++ /dev/null @@ -1,42 +0,0 @@ -# Getting Started: Source Google Ads - -## Requirements - -Google Ads Account with an approved Developer Token \(note: In order to get API access to Google Ads, you must have a "manager" account. This must be created separately from your standard account. You can find more information about this distinction in the [google ads docs](https://ads.google.com/home/tools/manager-accounts/).\) - -* developer_token -* client_id -* client_secret -* refresh_token -* start_date -* customer_id - -## Setup guide - -This guide will provide information as if starting from scratch. Please skip over any steps you have already completed. - -* Create an Google Ads Account. Here are [Google's instruction](https://support.google.com/google-ads/answer/6366720) on how to create one. -* Create an Google Ads MANAGER Account. Here are [Google's instruction](https://ads.google.com/home/tools/manager-accounts/) on how to create one. -* You should now have two Google Ads accounts: a normal account and a manager account. Link the Manager account to the normal account following [Google's documentation](https://support.google.com/google-ads/answer/7459601). -* Apply for a developer token \(**make sure you follow our** [**instructions**](#how-to-apply-for-the-developer-token)\) on your Manager account. This token allows you to access your data from the Google Ads API. Here are [Google's instructions](https://developers.google.com/google-ads/api/docs/first-call/dev-token). The docs are a little unclear on this point, but you will _not_ be able to access your data via the Google Ads API until this token is approved. You cannot use a test developer token, it has to be at least a basic developer token. It usually takes Google 24 hours to respond to these applications. This developer token is the value you will use in the `developer_token` field. -* Fetch your `client_id`, `client_secret`, and `refresh_token`. Google provides [instructions](https://developers.google.com/google-ads/api/docs/first-call/overview) on how to do this. -* Select your `customer_id`. The `customer_is` refer to the id of each of your Google Ads accounts. This is the 10 digit number in the top corner of the page when you are in google ads ui. The source will only pull data from the accounts for which you provide an id. If you are having trouble finding it, check out [Google's instructions](https://support.google.com/google-ads/answer/1704344). - -Wow! That was a lot of steps. 
We are working on making the OAuth flow for all of our connectors simpler \(allowing you to skip needing to get a `developer_token` and a `refresh_token` which are the most painful / time-consuming steps in this walkthrough\). - -## How to apply for the developer token - -Google is very picky about which software and which use case can get access to a developer token. The Airbyte team has worked with the Google Ads team to whitelist Airbyte and make sure you can get one \(see [issue 1981](https://github.com/airbytehq/airbyte/issues/1981) for more information\). - -When you apply for a token, you need to mention: - -* Why you need the token \(eg: want to run some internal analytics...\) -* That you will be using the Airbyte Open Source project -* That you have full access to the code base \(because we're open source\) -* That you have full access to the server running the code \(because you're self-hosting Airbyte\) - -If for any reason the request gets denied, let us know and we will be able to unblock you. - -## Understanding Google Ads Query Language - -The Google Ads Query Language can query the Google Ads API. Check out [Google Ads Query Language](https://developers.google.com/google-ads/api/docs/query/overview) diff --git a/docs/integrations/missing-an-integration.md b/docs/integrations/missing-an-integration.md deleted file mode 100644 index e52613182866..000000000000 --- a/docs/integrations/missing-an-integration.md +++ /dev/null @@ -1,14 +0,0 @@ -# Missing an Integration? - -If you'd like to ask for a new connector, or build a new connectors and make them part of the pool of pre-built connectors on Airbyte, first a big thank you. We invite you to check our [contributing guide](../contributing-to-airbyte/). - -If you'd like to build new connectors, or update existing ones, for your own usage, without contributing to the Airbyte codebase, read along. - -## Developing your own connectors - -It's easy to code your own integrations on Airbyte. Here are some links to instruct on how to code new sources and destinations. - -* [Building new connectors](../contributing-to-airbyte/README.md) - -While the guides above are specific to the languages used most frequently to write integrations, **Airbyte integrations can be written in any language**. Please reach out to us if you'd like help developing integrations in other languages. - diff --git a/docs/using-airbyte/core-concepts/namespaces.md b/docs/using-airbyte/core-concepts/namespaces.md index d5deac5d12fc..27f85873cb4c 100644 --- a/docs/using-airbyte/core-concepts/namespaces.md +++ b/docs/using-airbyte/core-concepts/namespaces.md @@ -2,22 +2,17 @@ ## High-Level Overview -:::info - -The high-level overview contains all the information you need to use Namespaces when pulling from APIs. Information past that can be read for advanced or educational purposes. - -::: +Namespaces allow you to organize and separate your data into groups. In most cases, namespaces are schemas in the database you're replicating to. -When looking through our connector docs, you'll notice that some sources and destinations support "Namespaces." These allow you to organize and separate your data into groups in the destination if the destination supports it. In most cases, namespaces are schemas in the database you're replicating to. If your desired destination doesn't support it, you can ignore this feature. +As a part of connection setup, you select where in the destination you want to write your data. Note: The default configuration is **Destination default**. 

-Note that this is the location that both your normalized and raw data will get written to. Your raw data will show up with the prefix `_airbyte_raw_` in the namespace you define. If you don't enable basic normalization, you will only receive the raw tables.
+| Destination Namespace | Description |
+| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- |
+| Destination default | All streams will be replicated to the single default namespace defined by the Destination. |
+| Mirror source structure | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. |
+| Custom format | All streams will be replicated to a single user-defined namespace. See the Custom format section below for more details. |

-If only your destination supports namespaces, you have two simple options. **This is the most likely case**, as all HTTP APIs currently don't support Namespaces.
-
-1. Mirror Destination Settings - Replicate to the default namespace in the destination, which will differ based on your destination.
-2. Custom Format - Create a "Custom Format" to rename the namespace that your data will be replicated into.
-
-If both your desired source and destination support namespaces, you're likely using a more advanced use case with a database as a source, so continue reading.
+Most of our destinations support this feature. To confirm whether your destination supports it, check its individual connector page. If your destination doesn't support it, you can ignore this feature.

 ## What is a Namespace?

@@ -25,29 +20,11 @@ Technical systems often group their underlying data into namespaces with each na
 An example of a namespace is the RDMS's `schema` concept. Some common use cases for schemas are enforcing permissions, segregating test and production data and general data organisation.

-## Syncing
-
-The Airbyte Protocol supports namespaces and allows Sources to define namespaces, and Destinations to write to various namespaces.
-
-If the Source does not support namespaces, the data will be replicated into the Destination's default namespace. For databases, the default namespace is the schema provided in the destination configuration.
-
-If the Destination does not support namespaces, the [namespace field](https://github.com/airbytehq/airbyte/blob/master/airbyte-protocol/models/src/main/resources/airbyte_protocol/airbyte_protocol.yaml#L64) is ignored.
-
-## Destination namespace configuration
-
-As part of the [connections sync settings](connections/), it is possible to configure the namespace used by: 1. destination connectors: to store the `_airbyte_raw_*` tables. 2. basic normalization: to store the final normalized tables.
-
-Note that custom transformation outputs are not affected by the namespace settings from Airbyte: It is up to the configuration of the custom dbt project, and how it is written to handle its [custom schemas](https://docs.getdbt.com/docs/building-a-dbt-project/building-models/using-custom-schemas). The default target schema for dbt in this case, will always be the destination namespace. 
-
-Available options for namespace configurations are:
-
-### - Mirror source structure
+Airbyte supports namespaces by allowing Sources to define them and Destinations to write to them. In Airbyte, the following options are available and are set on each individual connection.

-Some sources \(such as databases based on JDBC for example\) are providing namespace information from which a stream has been extracted. Whenever a source is able to fill this field in the catalog.json file, the destination will try to reproduce exactly the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will fall back to the "Destination Connector settings".
+### Destination default

-### - Destination connector settings
-
-All stream will be replicated and store in the default namespace defined on the destination settings page. In the destinations, namespace refers to:
+All streams will be replicated and stored in the default namespace defined on the destination settings page, which is typically chosen when the destination is first set up. Depending on your destination, the namespace refers to:

 | Destination Connector | Namespace setting |
 | :--- | :--- |
@@ -60,63 +37,60 @@ All stream will be replicated and store in the default namespace defined on the
 | Snowflake | schema |
 | S3 | path prefix |

-### - Custom format
+:::tip
+If you prefer to replicate multiple sources into the same namespace, use the `Stream Prefix` configuration to differentiate data from these sources to ensure no streams collide when writing to the destination.
+:::

-When replicating multiple sources into the same destination, conflicts on tables being overwritten by syncs can occur.
+### Mirror source structure

-For example, a Github source can be replicated into a "github" schema. But if we have multiple connections to different GitHub repositories \(similar in multi-tenant scenarios\):
+Some sources \(such as databases based on JDBC\) provide information about the namespace from which a stream has been extracted. Whenever a source is able to fill this field in the catalog.json file, the destination will try to write to exactly the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will fall back to the "Destination default". Most APIs do not provide namespace information.

-* we'd probably wish to keep the same table names \(to keep consistent queries downstream\)
-* but store them in different namespaces \(to avoid mixing data from different "tenants"\)
+### Custom format

-To solve this, we can either:
+When replicating multiple sources into the same destination, tables can conflict and be overwritten by competing syncs. Using a custom namespace for each connection prevents these collisions and ensures data is synced accurately.

-* use a specific namespace for each connection, thus this option of custom format.
-* or, use prefix to stream names as described below.
+For example, a GitHub source can be replicated into a `github` schema. However, you may have multiple connections writing from different GitHub repositories \(common in multi-tenant scenarios\).

-Note that we can use a template format string using variables that will be resolved during replication as follow:
+:::tip
+To keep the same table names, Airbyte recommends writing the connections to unique namespaces to avoid mixing data from different GitHub repositories. 
+:::
-
-* `${SOURCE_NAMESPACE}`: will be replaced by the namespace provided by the source if available
+You can enter plain text (most common) or additionally add a dynamic parameter `${SOURCE_NAMESPACE}`, which uses the namespace provided by the source if available. A short sketch of how the format string resolves is shown at the end of this page.

 ### Examples

-The following table summarises how this works. We assume an example of replication configurations between a Postgres Source and Snowflake Destination \(with settings of schema = "my\_schema"\):
+The following table summarises how this works. In this example, we're looking at the replication configuration between a Postgres Source and Snowflake Destination \(with settings of schema = "my\_schema"\):

 | Namespace Configuration | Source Namespace | Source Table Name | Destination Namespace | Destination Table Name |
 | :--- | :--- | :--- | :--- | :--- |
+| Destination default | public | my\_table | my\_schema | my\_table |
+| Destination default | | my\_table | my\_schema | my\_table |
 | Mirror source structure | public | my\_table | public | my\_table |
 | Mirror source structure | | my\_table | my\_schema | my\_table |
-| Destination connector settings | public | my\_table | my\_schema | my\_table |
-| Destination connector settings | | my\_table | my\_schema | my\_table |
 | Custom format = "custom" | public | my\_table | custom | my\_table |
 | Custom format = "${SOURCE\_NAMESPACE}" | public | my\_table | public | my\_table |
 | Custom format = "my\_${SOURCE\_NAMESPACE}\_schema" | public | my\_table | my\_public\_schema | my\_table |
 | Custom format = " " | public | my\_table | my\_schema | my\_table |

-## Requirements
+## Syncing Details

-* Both Source and Destination connectors need to support namespaces.
-* Relevant Source and Destination connectors need to be at least version `0.3.0` or later.
-* Airbyte version `0.21.0-alpha` or later.
+If the Source does not support namespaces, the data will be replicated into the Destination's default namespace. For databases, the default namespace is the schema provided in the destination configuration.

-## Current Support
+If the Destination does not support namespaces, any preference set in the connection is ignored.

-### Sources
+## Using Namespaces with Basic Normalization

-* MSSQL
-* MYSQL
-* Oracle DB
-* Postgres
-* Redshift
+As part of the connections sync settings, it is possible to configure the namespace used by: 1. destination connectors: to store the `_airbyte_raw_*` tables. 2. basic normalization: to store the final normalized tables.

-### Destination
-
+:::info
+When basic normalization is enabled, this is the location that both your normalized and raw data will get written to. Your raw data will show up with the prefix `_airbyte_raw_` in the namespace you define. If you don't enable basic normalization, you will only receive the raw tables.
+:::
+
+:::note
+Custom transformation outputs are not affected by the namespace settings from Airbyte: it is up to the configuration of the custom dbt project, and how it is written to handle its [custom schemas](https://docs.getdbt.com/docs/building-a-dbt-project/building-models/using-custom-schemas). The default target schema for dbt in this case will always be the destination namespace.
+:::
+
+## Requirements

-* BigQuery
-* MSSQL
-* MySql
-* Oracle DB
-* Postgres
-* Redshift
-* Snowflake
-* S3
+* Both Source and Destination connectors need to support namespaces.
+* Relevant Source and Destination connectors need to be at least version `0.3.0` or later.
+* Airbyte version `0.21.0-alpha` or later. 
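+
+As a rough sketch of how a custom format string resolves (this mirrors the Examples table above; it is an illustration of the documented behavior, not Airbyte's actual implementation):
+
+```python
+def resolve_namespace(custom_format: str, source_namespace: str, destination_default: str) -> str:
+    """Resolve a destination namespace from a custom format string."""
+    resolved = custom_format.replace("${SOURCE_NAMESPACE}", source_namespace or "").strip()
+    # A blank or whitespace-only result falls back to the destination default.
+    return resolved or destination_default
+
+# Mirrors the rows of the Examples table:
+assert resolve_namespace("custom", "public", "my_schema") == "custom"
+assert resolve_namespace("${SOURCE_NAMESPACE}", "public", "my_schema") == "public"
+assert resolve_namespace("my_${SOURCE_NAMESPACE}_schema", "public", "my_schema") == "my_public_schema"
+assert resolve_namespace(" ", "public", "my_schema") == "my_schema"
+```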

diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md
index c3c949599ee8..2f03a54dc210 100644
--- a/docs/using-airbyte/core-concepts/readme.md
+++ b/docs/using-airbyte/core-concepts/readme.md
@@ -46,6 +46,23 @@ Examples of fields:
 - A column in the table in a relational database
 - A field in an API response

+## Sync schedules
+
+Syncs can be triggered in one of the following ways:
+
+- A manual request \(i.e., clicking the "Sync Now" button in the UI or through the API\)
+- A schedule
+- A CRON schedule
+
+When a scheduled connection is first created, a sync is executed as soon as possible. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example, consider the following illustrative scenario:
+
+- **October 1st, 2pm**, a user sets up a connection to sync data every 24 hours.
+- **October 1st, 2:01pm**: sync job runs
+- **October 2nd, 2:01pm:** 24 hours have passed since the last sync, so a sync is triggered.
+- **October 2nd, 5pm**: The user manually triggers a sync from the UI
+- **October 3rd, 2:01pm:** since the last sync was less than 24 hours ago, no sync is run
+- **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run
+
 ## Namespace

 Namespace is a method of grouping streams in a source or destination. Namespaces are used to generally organize data, segregate tests and production data, and enforce permissions. In a relational database system, this is known as a schema.
@@ -56,9 +73,11 @@ Airbyte supports the following configuration options for a connection:

 | Destination Namepsace | Description |
 | ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- |
-| Destination default | All streams will be replicated to the single default namespace defined by the Destination. For more details, see ​​Destination Connector Settings |
+| Destination default | All streams will be replicated to the single default namespace defined by the Destination. |
 | Mirror source structure | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. |
-| Custom format | All streams will be replicated to a single user-defined namespace. See Custom format for more details |
+| Custom format | All streams will be replicated to a single user-defined namespace. |
+
+For more details, see our [Namespace documentation](/using-airbyte/namespaces).

 ## Connection sync modes

@@ -69,6 +88,8 @@ A sync mode governs how Airbyte reads from a source and writes to a destination
 - **Incremental Sync | Append:** Sync new records from the source and add them to the destination without deleting any data. This enables efficient historical tracking over time of data.
 - **Incremental Sync | Append + Deduped:** Sync new records from the source and add them to the destination. Also provides a de-duplicated view mirroring the state of the stream in the source. This is the most common replication use case.

+Read more about each [sync mode](/using-airbyte/core-concepts/sync-modes) and how they differ. 
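+
+As a rough illustration of how these modes differ in what ends up in the destination (a simplified sketch; real syncs also involve cursor fields and primary keys):
+
+```python
+existing = [{"id": 1, "name": "old"}]
+new_batch = [{"id": 1, "name": "new"}, {"id": 2, "name": "second"}]
+
+# Full Refresh | Overwrite: the destination is replaced by the latest batch.
+overwrite = new_batch
+
+# Incremental Sync | Append: new records are added; nothing is deleted.
+append = existing + new_batch
+
+# Incremental Sync | Append + Deduped: appended history, plus a view keyed on
+# the primary key that keeps only the latest version of each record.
+deduped_view = list({row["id"]: row for row in append}.values())
+```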
+
 ## Normalization

 Normalization is the process of structuring data from the source into a format appropriate for consumption in the destination. For example, when writing data from a nested, dynamically typed source like a JSON API to a relational destination like Postgres, normalization is the process which un-nests JSON from the source into a relational table format which uses the appropriate column types in the destination.

@@ -95,6 +116,16 @@ Normalizing data may cause an increase in your destination's compute cost. This

 :::

+### Typing and Deduping
+
+As described by the [Airbyte Protocol from the Airbyte Specifications](../../../understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that transmit data in JSON format. The data is then written as-is by the destination connectors. On top of this replication, Airbyte's database and data warehouse destinations can provide conversions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping).
+
+:::note
+
+Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage.
+
+:::
+
 ## Workspace

 A workspace is a grouping of sources, destinations, connections, and other configurations. It lets you collaborate with team members and share resources across your team under a shared billing account.
diff --git a/docs/using-airbyte/core-concepts/sync-modes/README.md b/docs/using-airbyte/core-concepts/sync-modes/README.md
index 671f79024cad..5a6921910d5e 100644
--- a/docs/using-airbyte/core-concepts/sync-modes/README.md
+++ b/docs/using-airbyte/core-concepts/sync-modes/README.md
@@ -1,46 +1,4 @@
-# Connections and Sync Modes
-
-A connection is a configuration for syncing data between a source and a destination. To setup a connection, a user must configure things such as:
-
-- Sync schedule: when to trigger a sync of the data.
-- Destination [Namespace](../namespaces.md) and stream names: where the data will end up being written.
-- A catalog selection: which [streams and fields](../../../understanding-airbyte/airbyte-protocol.md#catalog) to replicate from the source
-- Sync mode: how streams should be replicated \(read and write\):
-- Optional transformations: how to convert Airbyte protocol messages \(raw JSON blob\) data into some other data representations.
-
-## Sync schedules
-
-Sync schedules are explained below. For information about catalog selections, see [AirbyteCatalog & ConfiguredAirbyteCatalog](../../../understanding-airbyte/airbyte-protocol.md#catalog).
-
-Syncs will be triggered by either:
-
-- A manual request \(i.e: clicking the "Sync Now" button in the UI\)
-- A schedule
-
-When a scheduled connection is first created, a sync is executed as soon as possible. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example, consider the following illustrative scenario:
-
-- **October 1st, 2pm**, a user sets up a connection to sync data every 24 hours.
-- **October 1st, 2:01pm**: sync job runs
-- **October 2nd, 2:01pm:** 24 hours have passed since the last sync, so a sync is triggered. 
-- **October 2nd, 5pm**: The user manually triggers a sync from the UI -- **October 3rd, 2:01pm:** since the last sync was less than 24 hours ago, no sync is run -- **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run - -## Destination namespace - -The location of where a connection replication will store data is referenced as the destination namespace. The destination connectors should create and write records \(for both raw and normalized tables\) in the specified namespace which should be configurable in the UI via the Namespace Configuration field \(or NamespaceDefinition in the API\). You can read more about configuring namespaces [here](../namespaces.md). - -## Destination stream name - -### Prefix stream name - -Stream names refer to table names in a typical RDBMS. But it can also be the name of an API endpoint, etc. Similarly to the namespace, stream names can be configured to diverge from their names in the source with a "prefix" field. The prefix is prepended to the source stream name in the destination. - -## Stream-specific customization - -All the customization of namespace and stream names described above will be equally applied to all streams selected for replication in a catalog per connection. If you need more granular customization, stream by stream, for example, or with different logic rules, then you could follow the tutorial on [customizing transformations with dbt](../../../operator-guides/transformation-and-normalization/transformations-with-dbt.md). - -## Sync modes +# Sync Modes A sync mode governs how Airbyte reads from a source and writes to a destination. Airbyte provides different sync modes to account for various use cases. To minimize confusion, a mode's behavior is reflected in its name. The easiest way to understand Airbyte's sync modes is to understand how the modes are named. @@ -59,16 +17,4 @@ A sync mode is therefore, a combination of a source and destination mode togethe - [Full Refresh Overwrite](./full-refresh-overwrite.md): Sync the whole stream and replace data in destination by overwriting it. - [Full Refresh Append](./full-refresh-append.md): Sync the whole stream and append data in destination. - [Incremental Append](./incremental-append.md): Sync new records from stream and append data in destination. -- [Incremental Append + Deduped](./incremental-append-deduped.md): Sync new records from stream and append data in destination, also provides a de-duplicated view mirroring the state of the stream in the source. - -## Optional operations - -### Typing and Deduping - -As described by the [Airbyte Protocol from the Airbyte Specifications](../../../understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that are transmitting data in a JSON format. It is then written as such by the destination connectors. On top of this replication, Airbyte's database and datawarehous destinations can provide converstions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping). - -:::note - -Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage. - -::: +- [Incremental Append + Deduped](./incremental-append-deduped.md): Sync new records from stream and append data in destination, also provides a de-duplicated view mirroring the state of the stream in the source. 
\ No newline at end of file
diff --git a/docs/using-airbyte/getting-started/readme.md b/docs/using-airbyte/getting-started/readme.md
index ce0874c0e713..22ed7e2eb778 100644
--- a/docs/using-airbyte/getting-started/readme.md
+++ b/docs/using-airbyte/getting-started/readme.md
@@ -1,5 +1,36 @@
 # Getting Started

-// TODO: Placeholder text
+Getting started with Airbyte takes only a few steps! This page walks you through them.

-should be about cloud and deployment options
\ No newline at end of file
+## Sign Up for Airbyte (Cloud)
+
+To use Airbyte Cloud, [sign up](https://cloud.airbyte.io/signup) with your email address, Google login, or GitHub login. Upon signing up, you'll be taken to your workspace, which lets you collaborate with team members and share resources across your team under a shared billing account.
+
+Airbyte Cloud offers a 14-day free trial that begins after your first successful sync. For more details on our pricing model, see our [pricing page](https://www.airbyte.com/pricing).
+
+If you signed up using your email address, Airbyte will send you an email with a verification link. On clicking the link, you'll be taken to your new workspace.
+
+ :::note
+ If you have been invited to an existing workspace, you cannot use the Google login option to create a new Airbyte account. Use email instead.
+ :::
+
+To start setting up a data pipeline, see how to [set up a source](/add-a-source/).
+
+ :::info
+ Depending on your data residency, you may need to [allowlist IP addresses](/operating-airbyte/security) to enable access to Airbyte.
+ :::
+
+## Deploy Airbyte (Open-Source)
+
+To use Airbyte Open-Source, you can use one of the following options to deploy it on your infrastructure.
+
+- [Local Deployment](local-deployment.md) (recommended when trying out Airbyte)
+- [On AWS](on-aws-ec2.md)
+- [On Azure VM Cloud Shell](on-azure-vm-cloud-shell.md)
+- [On Digital Ocean Droplet](on-digitalocean-droplet.md)
+- [On GCP](on-gcp-compute-engine.md)
+- [On Kubernetes](on-kubernetes-via-helm.md)
+- [On OCI VM](on-oci-vm.md)
+- [On Restack](on-restack.md)
+- [On Plural](on-plural.md)
+- [On AWS ECS](on-aws-ecs.md) (Spoiler alert: it doesn't work)
diff --git a/docs/using-airbyte/getting-started/set-up-a-connection.md b/docs/using-airbyte/getting-started/set-up-a-connection.md
index 49d65d1c54d0..e3378d1f5bd4 100644
--- a/docs/using-airbyte/getting-started/set-up-a-connection.md
+++ b/docs/using-airbyte/getting-started/set-up-a-connection.md
@@ -1,16 +1,20 @@
 # Set up a Connection

-Now that you've learned how to [deploy Airbyte locally](./deploy-airbyte) and set up your first [source](./add-a-source) and [destination](./add-a-destination), it's time to finish the job by creating your very first connection!
+Now that you've learned how to set up your first [source](./add-a-source) and [destination](./add-a-destination), it's time to finish the job by creating your very first connection!

 On the left side of your main Airbyte dashboard, select **Connections**. You will be prompted to choose which source and destination to use for this connection. As an example, we'll use the **Google Sheets** source and **Local JSON** destination.

 ## Configure the connection

-Once you've chosen your source and destination, you'll be able to configure the connection. You can refer to [this page](https://docs.airbyte.com/cloud/managing-airbyte-cloud/configuring-connections) for more information on each available configuration. 
For this demo, we'll simply set the **Replication frequency** to a 24 hour interval and leave the other fields at their default values.
+Once you've chosen your source and destination, you'll be able to configure the connection. You can refer to [this page](/using-airbyte/configuring-connections) for more information on each available configuration. For this demo, we'll simply set the **Replication frequency** to a 24 hour interval and leave the other fields at their default values.

 ![Connection config](../../.gitbook/assets/set-up-a-connection/getting-started-connection-config.png)

-Next, you can toggle which streams you want to replicate, as well as setting up the desired sync mode for each stream. For more information on the nature of each sync mode supported by Airbyte, see [this page](https://docs.airbyte.com/understanding-airbyte/connections/#sync-modes).
+:::note
+By default, data will sync to the default namespace defined in the destination. To ensure your data is synced to the correct place, see our examples for [Destination Namespace](/using-airbyte/core-concepts/namespaces).
+:::
+
+Next, you can toggle which streams you want to replicate, as well as set up the desired sync mode for each stream. For more information on each sync mode supported by Airbyte, see [this page](/using-airbyte/core-concepts/sync-modes).

 Our test data consists of a single stream cleverly named `Test Data`, which we've enabled and set to `Full Refresh - Overwrite` sync mode.

@@ -18,21 +22,23 @@ Our test data consists of a single stream cleverly named `Test Data`, which we'v
 Click **Set up connection** to complete your first connection. Your first sync is about to begin!

-## Connector Dashboard
+## Connection Overview

-Once you've finished setting up the connection, you will be automatically redirected to a dashboard containing all the tools you need to keep track of your connection.
+Once you've finished setting up the connection, you will be automatically redirected to a connection overview containing all the tools you need to keep track of your connection.

 ![Connection dashboard](../../.gitbook/assets/set-up-a-connection/getting-started-connection-success.png)

 Here's a basic overview of the tabs and their use:

-1. The **Status** tab shows you an overview of your connector's sync schedule and health.
+1. The **Status** tab shows you an overview of your connector's sync health.
 2. The **Job History** tab allows you to check the logs for each sync. If you encounter any errors or unexpected behaviors during a sync, checking the logs is always a good first step to finding the cause and solution.
 3. The **Replication** tab allows you to modify the configurations you chose during the connection setup.
 4. The **Settings** tab contains additional settings, and the option to delete the connection if you no longer wish to use it.

 ### Check the data from your first sync

+Once the first sync has completed, you can verify it worked by checking the data in your destination. 
+
 If you followed along and created your own connection using a `Local JSON` destination, you can use this command to check the file's contents to make sure the replication worked as intended (be sure to replace YOUR_PATH with the path you chose in your destination setup, and YOUR_STREAM_NAME with the name of an actual stream you replicated):

 ```bash

From fd1e641c1f3d5e9e165a5ccd16740e86c011d484 Mon Sep 17 00:00:00 2001
From: timroes
Date: Sun, 26 Nov 2023 15:20:40 +0000
Subject: [PATCH 16/52] Automated Commit - Formatting Changes

---
 docusaurus/redirects.yml | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml
index 74a9578c0aad..e1f892f2b077 100644
--- a/docusaurus/redirects.yml
+++ b/docusaurus/redirects.yml
@@ -20,7 +20,7 @@
 - from: /cloud/managing-airbyte-cloud/edit-stream-configuration
   to: /cloud/managing-airbyte-cloud/configuring-connections
 # November 2023 documentation restructure:
-- from: 
+- from:
     - /project-overview/product-support-levels
     - /project-overview/product-release-stages
   to: /integrations/connector-support-levels
@@ -28,7 +28,7 @@
   to: /community/code-of-conduct
 - from: /project-overview/slack-code-of-conduct
   to: /community/slack-code-of-conduct
-- from : /project-overview/licenses/
+- from: /project-overview/licenses/
   to: /developer-guides/licenses/
 - from: /project-overview/licenses/license-faq
   to: /developer-guides/licenses/license-faq
@@ -85,4 +85,4 @@
 - from:
   - /troubleshooting
   - /operator-guides/contact-support
-  to: /community/getting-support
\ No newline at end of file
+  to: /community/getting-support

From a2443959e7ed02515374bb2b0fca46beca9dfcf8 Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 16:21:11 +0100
Subject: [PATCH 17/52] Fix links

---
 docs/using-airbyte/core-concepts/readme.md   |  2 +-
 docs/using-airbyte/getting-started/readme.md | 22 ++++++++++----------
 2 files changed, 12 insertions(+), 12 deletions(-)

diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md
index 2f03a54dc210..12cca43b7b60 100644
--- a/docs/using-airbyte/core-concepts/readme.md
+++ b/docs/using-airbyte/core-concepts/readme.md
@@ -118,7 +118,7 @@ Normalizing data may cause an increase in your destination's compute cost. This

 ### Typing and Deduping

-As described by the [Airbyte Protocol from the Airbyte Specifications](../../../understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that transmit data in JSON format. The data is then written as-is by the destination connectors. On top of this replication, Airbyte's database and data warehouse destinations can provide conversions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping).
+As described by the [Airbyte Protocol from the Airbyte Specifications](/understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that transmit data in JSON format. The data is then written as-is by the destination connectors. On top of this replication, Airbyte's database and data warehouse destinations can provide conversions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping). 
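+
+As a simplified sketch of what the typing step does (illustrative only; the field names and types below are hypothetical):
+
+```python
+import json
+from datetime import date
+
+# A raw record as replicated: the payload is a JSON blob in `_airbyte_data`.
+raw_record = {"_airbyte_data": '{"id": "123", "signup_date": "2023-11-26", "amount": "19.99"}'}
+
+# Typing casts the JSON fields into typed relational columns,
+# e.g. id INTEGER, signup_date DATE, amount NUMERIC.
+data = json.loads(raw_record["_airbyte_data"])
+typed_row = {
+    "id": int(data["id"]),
+    "signup_date": date.fromisoformat(data["signup_date"]),
+    "amount": float(data["amount"]),
+}
+```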

 :::note

 Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage.
diff --git a/docs/using-airbyte/getting-started/readme.md b/docs/using-airbyte/getting-started/readme.md
index 22ed7e2eb778..6c1624a39866 100644
--- a/docs/using-airbyte/getting-started/readme.md
+++ b/docs/using-airbyte/getting-started/readme.md
@@ -14,7 +14,7 @@ If you signed up using your email address, Airbyte will send you an email with a
 If you have been invited to an existing workspace, you cannot use the Google login option to create a new Airbyte account. Use email instead.
 :::

-To start setting up a data pipeline, see how to [set up a source](/add-a-source/).
+To start setting up a data pipeline, see how to [set up a source](./add-a-source.md).

 :::info
 Depending on your data residency, you may need to [allowlist IP addresses](/operating-airbyte/security) to enable access to Airbyte.
@@ -24,13 +24,13 @@ To start setting up a data pipeline, see how to [set up a source](/add-a-source/

 To use Airbyte Open-Source, you can use one of the following options to deploy it on your infrastructure.

-- [Local Deployment](local-deployment.md) (recommended when trying out Airbyte)
-- [On AWS](on-aws-ec2.md)
-- [On Azure VM Cloud Shell](on-azure-vm-cloud-shell.md)
-- [On Digital Ocean Droplet](on-digitalocean-droplet.md)
-- [On GCP](on-gcp-compute-engine.md)
-- [On Kubernetes](on-kubernetes-via-helm.md)
-- [On OCI VM](on-oci-vm.md)
-- [On Restack](on-restack.md)
-- [On Plural](on-plural.md)
-- [On AWS ECS](on-aws-ecs.md) (Spoiler alert: it doesn't work)
+- [Local Deployment](/deploying-airbyte/local-deployment.md) (recommended when trying out Airbyte)
+- [On AWS](/deploying-airbyte/on-aws-ec2.md)
+- [On Azure VM Cloud Shell](/deploying-airbyte/on-azure-vm-cloud-shell.md)
+- [On Digital Ocean Droplet](/deploying-airbyte/on-digitalocean-droplet.md)
+- [On GCP](/deploying-airbyte/on-gcp-compute-engine.md)
+- [On Kubernetes](/deploying-airbyte/on-kubernetes-via-helm.md)
+- [On OCI VM](/deploying-airbyte/on-oci-vm.md)
+- [On Restack](/deploying-airbyte/on-restack.md)
+- [On Plural](/deploying-airbyte/on-plural.md)
+- [On AWS ECS](/deploying-airbyte/on-aws-ecs.md) (Spoiler alert: it doesn't work)

From e7f28013e26031b8ca1fe3214b4a7e3e1310c263 Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 15:36:52 +0000
Subject: [PATCH 18/52] Changes

---
 .../cloud/managing-airbyte-cloud/configuring-connections.md | 6 +++---
 docs/cloud/managing-airbyte-cloud/manage-credits.md         | 6 +++---
 docs/cloud/managing-airbyte-cloud/manage-data-residency.md  | 6 +++---
 docs/cloud/managing-airbyte-cloud/manage-schema-changes.md  | 2 +-
 docs/cloud/managing-airbyte-cloud/review-sync-history.md    | 2 +-
 docs/using-airbyte/core-concepts/readme.md                  | 5 +++++
 6 files changed, 16 insertions(+), 11 deletions(-)

diff --git a/docs/cloud/managing-airbyte-cloud/configuring-connections.md b/docs/cloud/managing-airbyte-cloud/configuring-connections.md
index 4e95bac58714..6e8672d9f894 100644
--- a/docs/cloud/managing-airbyte-cloud/configuring-connections.md
+++ b/docs/cloud/managing-airbyte-cloud/configuring-connections.md
@@ -1,6 +1,6 @@
 # Configuring connections

-A connection links a source to a destination and defines how your data will sync. After you have created a connection, you can modify any of the [configuration settings](#configure-connection-settings) or [stream settings](#modify-streams-in-your-connection).
+A connection links a source to a destination and defines how your data will sync. 
After you have created a connection, you can modify any of the configuration settings or stream settings. ## Configure Connection Settings @@ -25,9 +25,9 @@ You can configure the following settings: | Setting | Description | |--------------------------------------|-------------------------------------------------------------------------------------| | Replication frequency | How often the data syncs | -| Destination namespace | Where the replicated data is written | +| [Destination namespace](/using-airbyte/namespaces.md) | Where the replicated data is written | | Destination stream prefix | How you identify streams from different connectors | -| [Detect and propagate schema changes](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-schema-changes/#review-non-breaking-schema-changes) | How Airbyte handles syncs when it detects schema changes in the source | +| [Detect and propagate schema changes](/using-airbyte/manage-schema-changes.md) | How Airbyte handles syncs when it detects schema changes in the source | | Connection Data Residency | Where data will be processed | To use [cron scheduling](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html): diff --git a/docs/cloud/managing-airbyte-cloud/manage-credits.md b/docs/cloud/managing-airbyte-cloud/manage-credits.md index bcc1fe8af0e9..d0f5839cc1eb 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-credits.md +++ b/docs/cloud/managing-airbyte-cloud/manage-credits.md @@ -10,7 +10,7 @@ To buy credits: 1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Billing** in the navigation bar. -2. If you are unsure of how many credits you need, use our [Cost Estimator](https://cost.airbyte.com/) or click **Talk to Sales** to find the right amount for your team. +2. If you are unsure of how many credits you need, use our [Cost Estimator](https://www.airbyte.com/pricing) or click **Talk to Sales** to find the right amount for your team. 3. Click **Buy credits**. @@ -46,7 +46,7 @@ To buy credits: You can enroll in automatic top-ups of your credit balance. This is a beta feature for those who do not want to manually add credits each time. -To enroll, [email us](mailto:natalie@airbyte.io) with: +To enroll, [email us](mailto:billing@airbyte.io) with: 1. A link to your workspace that you'd like to enable this feature for. 2. **Recharge threshold** The number under what credit balance you would like the automatic top up to occur. @@ -61,7 +61,7 @@ To take a real example, if: Note that the difference between the recharge credit amount and recharge threshold must be at least 20 as our minimum purchase is 20 credits. -If you are enrolled and want to change your limits or cancel your enrollment, [email us](mailto:natalie@airbyte.io). +If you are enrolled and want to change your limits or cancel your enrollment, [email us](mailto:billing@airbyte.io). ## View invoice history diff --git a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md index da02874006ce..2106f0d12a92 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md +++ b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md @@ -12,7 +12,7 @@ While the data is processed in a data plane of the chosen residency, the cursor ::: -When you set the default data residency, it applies to new connections only. 
If you do not set the default data residency, the [Airbyte Default](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud/#united-states-and-airbyte-default) region is used. If you want to change the data residency for a connection, you can do so in its [connection settings](#choose-the-data-residency-for-a-connection).
+When you set the default data residency, it applies to new connections only. If you do not set the default data residency, the [Airbyte Default](configuring-connections.md) region is used. If you want to change the data residency for a connection, you can do so in its [connection settings](configuring-connections.md).

 To choose your default data residency:

@@ -26,12 +26,12 @@ To choose your default data residency:

 :::info

-Depending on your network configuration, you may need to add [IP addresses](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud/#allowlist-ip-addresses) to your allowlist.
+Depending on your network configuration, you may need to add [IP addresses](/operating-airbyte/security#network-security-1) to your allowlist.
 :::

 ## Choose the data residency for a connection

-You can choose the data residency for your connection in the connection settings. You can also choose data residency when creating a [new connection](https://docs.airbyte.com/cloud/getting-started-with-airbyte-cloud#set-up-a-connection), or you can set the [default data residency](#choose-your-default-data-residency) for your workspace.
+You can choose the data residency for your connection in the connection settings. You can also choose data residency when creating a new connection, or you can set the default data residency for your workspace.

 To choose the data residency for your connection:

diff --git a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
index 1e76e5f6ff58..4e1190f733fc 100644
--- a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
+++ b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
@@ -83,4 +83,4 @@ In addition to Airbyte Cloud’s automatic schema change detection, you can manu
 4. If there are changes to the schema, you can review them in the **Refreshed source schema** dialog.

 ## Manage Schema Change Notifications
-[Refer to our notification documentation](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications#enable-schema-update-notifications) to understand how to stay updated on any schema updates to your connections.
\ No newline at end of file
+[Refer to our notification documentation](manage-airbyte-cloud-notifications.md) to understand how to stay updated on any schema updates to your connections.
\ No newline at end of file
diff --git a/docs/cloud/managing-airbyte-cloud/review-sync-history.md b/docs/cloud/managing-airbyte-cloud/review-sync-history.md
index 0bb5cf2290f5..485d981fc92f 100644
--- a/docs/cloud/managing-airbyte-cloud/review-sync-history.md
+++ b/docs/cloud/managing-airbyte-cloud/review-sync-history.md
@@ -2,7 +2,7 @@

 The job history displays information about synced data, such as the amount of data moved, the number of records read and committed, and the total sync time. Reviewing this summary can help you monitor the sync performance and identify any potential issues.

-To review the sync history, click a connection in the list to view its sync history. Sync History displays the sync status or [reset](https://docs.airbyte.com/operator-guides/reset/) status. 
The sync status is defined as: +To review the sync history, click a connection in the list to view its sync history. Sync History displays the sync status or [reset](/operator-guides/reset.md) status. The sync status is defined as: | Status | Description | |---------------------|---------------------------------------------------------------------------------------------------------------------| diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md index 2f03a54dc210..c4a97d482867 100644 --- a/docs/using-airbyte/core-concepts/readme.md +++ b/docs/using-airbyte/core-concepts/readme.md @@ -120,6 +120,11 @@ Normalizing data may cause an increase in your destination's compute cost. This As described by the [Airbyte Protocol from the Airbyte Specifications](../../../understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that are transmitting data in a JSON format. It is then written as such by the destination connectors. On top of this replication, Airbyte's database and datawarehous destinations can provide converstions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping). +Note that typing and deduping is only relevant for the following relational database & warehouse destinations: + +- Snowflake +- BigQuery + :::note Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage. From c69b3b90da6c35661be3399c0c827cb1786a91c9 Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 15:45:49 +0000 Subject: [PATCH 19/52] remove archive --- docs/archive/changelog/README.md | 645 --------------- docs/archive/changelog/connectors.md | 776 ------------------ docs/archive/changelog/platform.md | 509 ------------ docs/archive/examples/README.md | 2 - .../build-a-slack-activity-dashboard.md | 424 ---------- docs/archive/examples/postgres-replication.md | 116 --- docs/archive/examples/slack-history.md | 109 --- .../archive/examples/slack-history/index.html | 77 -- .../examples/zoom-activity-dashboard.md | 272 ------ docs/archive/faq/README.md | 5 - docs/archive/faq/data-loading.md | 124 --- docs/archive/faq/deploying-on-other-os.md | 40 - docs/archive/faq/differences-with/README.md | 2 - .../differences-with/fivetran-vs-airbyte.md | 27 - .../differences-with/meltano-vs-airbyte.md | 28 - .../pipelinewise-vs-airbyte.md | 25 - .../faq/differences-with/singer-vs-airbyte.md | 28 - .../differences-with/stitchdata-vs-airbyte.md | 29 - docs/archive/faq/getting-started.md | 50 -- docs/archive/faq/security-and-data-audits.md | 14 - .../archive/faq/transformation-and-schemas.md | 20 - docs/archive/mongodb.md | 102 --- docs/archive/securing-airbyte.md | 28 - .../configuring-connections.md | 4 +- .../manage-airbyte-cloud-notifications.md | 4 +- .../manage-data-residency.md | 2 +- .../configuring-sync-notifications.md | 2 +- 27 files changed, 7 insertions(+), 3457 deletions(-) delete mode 100644 docs/archive/changelog/README.md delete mode 100644 docs/archive/changelog/connectors.md delete mode 100644 docs/archive/changelog/platform.md delete mode 100644 docs/archive/examples/README.md delete mode 100644 docs/archive/examples/build-a-slack-activity-dashboard.md delete mode 100644 docs/archive/examples/postgres-replication.md delete mode 100644 docs/archive/examples/slack-history.md delete 
mode 100644 docs/archive/examples/slack-history/index.html delete mode 100644 docs/archive/examples/zoom-activity-dashboard.md delete mode 100644 docs/archive/faq/README.md delete mode 100644 docs/archive/faq/data-loading.md delete mode 100644 docs/archive/faq/deploying-on-other-os.md delete mode 100644 docs/archive/faq/differences-with/README.md delete mode 100644 docs/archive/faq/differences-with/fivetran-vs-airbyte.md delete mode 100644 docs/archive/faq/differences-with/meltano-vs-airbyte.md delete mode 100644 docs/archive/faq/differences-with/pipelinewise-vs-airbyte.md delete mode 100644 docs/archive/faq/differences-with/singer-vs-airbyte.md delete mode 100644 docs/archive/faq/differences-with/stitchdata-vs-airbyte.md delete mode 100644 docs/archive/faq/getting-started.md delete mode 100644 docs/archive/faq/security-and-data-audits.md delete mode 100644 docs/archive/faq/transformation-and-schemas.md delete mode 100644 docs/archive/mongodb.md delete mode 100644 docs/archive/securing-airbyte.md diff --git a/docs/archive/changelog/README.md b/docs/archive/changelog/README.md deleted file mode 100644 index cc854f303e60..000000000000 --- a/docs/archive/changelog/README.md +++ /dev/null @@ -1,645 +0,0 @@ -# Changelog - -## 1/28/2022 Summary - -* New Source: Chartmogul (contributyed by Titas Skrebė) -* New Source: Hellobaton (contributed by Daniel Luftspring) -* New Source: Flexport (contributed by Juozas) -* New Source: PersistIq (contributed by Wadii Zaim) - -* ✨ Postgres Source: Users can now select which schemas they wish to sync before discovery. This makes the discovery stage for large instances much more performant. -* ✨ Shopify Source: Now verifies permissions on the token before accessing resources. -* ✨ Snowflake Destination: Users now have access to an option to purge their staging data. -* ✨ HubSpot Source: Added some more fields for the email_events stream. -* ✨ Amazon Seller Partner Source: Added the GET_FLAT_FILE_ALL_ORDERS_DATA_BY_LAST_UPDATE_GENERAL report stream. (contributed by @ron-damon) -* ✨ HubSpot Source: Added the form_submission and property_history streams. - -* 🐛 DynamoDB Destination: The parameter dynamodb_table_name is now named dynamodb_table_name_prefix to more accurately represent it. -* 🐛 Intercom Source: The handling of scroll param is now fixed when it is expired. -* 🐛 S3 + GCS Destinations: Now support arrays with unknown item type. -* 🐛 Postgres Source: Now supports handling of the Java SQL date type. -* 🐛 Salesforce Source: No longer fails during schema generation. - -## 1/13/2022 Summary - -⚠️ WARNING ⚠️ - -Snowflake Source: Normalization with Snowflake now produces permanent tables. [If you want to continue creating transient tables, you will need to create a new transient database for Airbyte.] - -* ✨ GitHub Source: PR related streams now support incremental sync. -* ✨ HubSpot Source: We now support ListMemberships in the Contacts stream. -* ✨ Azure Blob Storage Destination: Now has the option to add a BufferedOutputStream to improve performance and fix writing data with over 50GB in a stream. (contributed by @bmatticus) - -* 🐛 Normalization partitioning now works as expected with FLOAT64 and BigQuery. -* 🐛 Normalization now works properly with quoted and case sensitive columns. -* 🐛 Source MSSQL: Added support for some missing data types. -* 🐛 Snowflake Destination: Schema is now not created if it previously exists. -* 🐛 Postgres Source: Now properly reads materialized views. 
-* 🐛 Delighted Source: Pagination for survey_responses, bounces and unsubscribes streams now works as expected. -* 🐛 Google Search Console Source: Incremental sync now works as expected. -* 🐛 Recurly Source: Now does not load all accounts when importing account coupon redemptions. -* 🐛 Salesforce Source: Now properly handles 400 when streams don't support query or queryAll. - -## 1/6/2022 Summary - -* New Source: 3PL Central (contributed by Juozas) -* New Source: My Hours (contributed by Wisse Jelgersma) -* New Source: Qualaroo (contributed by gunu) -* New Source: SearchMetrics - -* 💎 Salesforce Source: Now supports filtering streams at configuration, making it easier to handle large Salesforce instances. -* 💎 Snowflake Destination: Now supports byte-buffering for staged inserts. -* 💎 Redshift Destination: Now supports byte-buffering for staged inserts. -* ✨ Postgres Source: Now supports all Postgres 14 types. -* ✨ Recurly Source: Now supports incremental sync for all streams. -* ✨ Zendesk Support Source: Added the Brands, CustomRoles, and Schedules streams. -* ✨ Zendesk Support Source: Now uses cursor-based pagination. -* ✨ Kustomer Source: Setup configuration is now more straightforward. -* ✨ Hubspot Source: Now supports incremental sync on all streams where possible. -* ✨ Facebook Marketing Source: Fixed schema for breakdowns fields. -* ✨ Facebook Marketing Source: Added asset_feed_spec to AdCreatives stream. -* ✨ Redshift Destination: Now has an option to toggle the deletion of staging data. - -* 🐛 S3 Destination: Avro and Parquet formats are now processed correctly. -* 🐛 Snowflake Destination: Fixed SQL Compliation error. -* 🐛 Kafka Source: SASL configurations no longer throw null pointer exceptions (contributed by Nitesh Kumar) -* 🐛 Salesforce Source: Now throws a 400 for non-queryable streams. -* 🐛 Amazon Ads Source: Polling for report generation is now much more resilient. (contributed by Juozas) -* 🐛 Jira Source: The filters stream now works as expected. -* 🐛 BigQuery Destination: You can now properly configure the buffer size with the part_size config field. -* 🐛 Snowflake Destination: You can now properly configure the buffer size with the part_size config field. -* 🐛 CockroachDB Source: Now correctly only discovers tables the user has permission to access. -* 🐛 Stripe Source: The date and arrival_date fields are now typed correctly. - -## 12/16/2021 Summary - -🎉 First off... There's a brand new CDK! Menno Hamburg contributed a .NET/C# implementation for our CDK, allowing you to write HTTP API sources and Generic Dotnet sources. Thank you so much Menno, this is huge! - -* New Source: OpenWeather -* New Destination: ClickHouse (contributed by @Bo) -* New Destination: RabbitMQ (contributed by @Luis Gomez) -* New Destination: Amazon SQS (contributed by @Alasdair Brown) -* New Destination: Rockset (contributed by @Steve Baldwin) - -* ✨ Facebook Marketing Source: Updated the campaign schema with more relevant fields. (contributed by @Maxime Lavoie) -* ✨ TikTok Marketing Source: Now supports the Basic Report stream. -* ✨ MySQL Source: Now supports all MySQL 8.0 data types. -* ✨ Klaviyo Source: Improved performance, added incremental sync support to the Global Exclusions stream. -* ✨ Redshift Destination: You can now specify a bucket path to stage your data in before inserting. -* ✨ Kubernetes deployments: Sidecar memory is now 25Mi, up from 6Mi to cover all usage cases. -* ✨ Kubernetes deployments: The Helm chart can now set up writing logs to S3 easily. 
(contributed by @Valentin Nourdin) - -* 🐛 Python CDK: Now shows the stack trace of unhandled exceptions. -* 🐛 Google Analytics Source: Fix data window input validation, fix date type conversion. -* 🐛 Google Ads Source: Data from the end_date for syncs is now included in a sync. -* 🐛 Marketo Source: Fixed issues around input type conversion and conformation to the schema. -* 🐛 Mailchimp Source: Fixed schema conversion error causing sync failures. -* 🐛 PayPal Transactions Source: Now reports full error message details on failure. -* 🐛 Shopify Source: Normalization now works as expected. - -## 12/9/2021 Summary - -⚠️ WARNING ⚠️ - -v0.33.0 is a minor version with breaking changes. Take the normal precautions with upgrading safely to this version. -v0.33.0 has a bug that affects GCS logs on Kubernetes. Upgrade straight to v0.33.2 if you are running a K8s deployment of Airbyte. - -* New Source: Mailgun - -🎉 Snowflake Destination: You can now stage your inserts, making them much faster. - -* ✨ Google Ads Source: Source configuration is now more clear. -* ✨ Google Analytics Source: Source configuration is now more clear. -* ✨ S3 Destination: You can now write timestamps in Avro and Parquet formats. -* ✨ BigQuery & BigQuery Denormalized Destinations: Now use byte-based buffering for batch inserts. -* ✨ Iterable Source: Now has email validation on the list_users stream. - -* 🐛 Incremental normalization now works properly with empty tables. -* 🐛 LinkedIn Ads Source: 429 response is now properly handled. -* 🐛 Intercom Source: Now handles failed pagination requests with backoffs. -* 🐛 Intercom Source: No longer drops records from the conversation stream. -* 🐛 Google Analytics Source: 400 errors no longer get ignored with custom reports. -* 🐛 Marketo Source: The createdAt and updatedAt fields are now formatted correctly. - -## 12/2/2021 Summary - -🎃 **Hacktoberfest Submissions** 🎃 ------------------------------------------ -* New Destination: Redis (contributed by @Ivica Taseski) -* New Destination: MQTT (contributed by @Mario Molina) -* New Destination: Google Firestore (contributed by @Adam Dobrawy) -* New Destination: Kinesis (contributed by @Ivica Taseski) -* New Source: Zenloop (contributed by @Alexander Batoulis) -* New Source: Outreach (contributed by @Luis Gomez) - -* ✨ Zendesk Source: The chats stream now supports incremental sync and added testing for all streams. -* 🐛 Monday Source: Pagination now works as expected and the schema has been fixed. -* 🐛 Postgres Source: Views are now properly listed during schema discovery. -* 🐛 Postgres Source: Using the money type with an amount greater than 1000 works properly now. -* 🐛 Google Search Console Search: We now set a default end_data value. -* 🐛 Mixpanel Source: Normalization now works as expected and streams are now displayed properly in the UI. -* 🐛 MongoDB Source: The DATE_TIME type now uses milliseconds. - -## 11/25/2021 Summary -Hey Airbyte Community! Let's go over all the changes from v.32.5 and prior! - -🎃 **Hacktoberfest Submissions** 🎃 -* New Source: Airtable (contributed by Tuan Nguyen). -* New Source: Notion (contributed by Bo Lu). -* New Source: Pardot (contributed by Tuan Nguyen). - -* New Source: Youtube analytics. - -* ✨ Source Exchange Rates: add ignore_weekends option. -* ✨ Source Facebook: add the videos stream. -* ✨ Source Freshdesk: removed the limitation in streams pagination. -* ✨ Source Jira: add option to render fields in HTML format. -* ✨ Source MongoDB v2: improve read performance. 
-* ✨ Source Pipedrive: specify schema for "persons" stream.
-* ✨ Source PostgreSQL: exclude tables on which user doesn't have select privileges.
-* ✨ Source SurveyMonkey: improve connection check.
-
-* 🐛 Source Salesforce: improve resiliency of async bulk jobs.
-* 🐛 Source Zendesk Support: fix missing ticket_id in ticket_comments stream.
-* 🐛 Normalization: optimize incremental normalization runtime with Snowflake.
-
-As usual, thank you so much to our wonderful contributors this week that have made Airbyte into what it is today: Madison Swain-Bowden, Tuan Nguyen, Bo Lu, Adam Dobrawy, Christopher Wu, Luis Gomez, Ivica Taseski, Mario Molina, Ping Yee, Koji Matsumoto, Sujit Sagar, Shadab, Juozas V.([Labanoras Tech](http://labanoras.io)) and Serhii Chvaliuk!
-
-## 11/17/2021 Summary
-
-Hey Airbyte Community! Let's go over all the changes from v.32.1 and prior! But first, there's an important announcement I need to make about upgrading Airbyte to v.32.1.
-
-⚠️ WARNING ⚠️
-Upgrading to v.32.0 is equivalent to a major version bump. If your current version is below v.32.0, you must upgrade to v.32.0 first before upgrading to any later version.
-
-Keep in mind that this upgrade requires all of your connector Specs to be retrievable, or Airbyte will fail on startup. You can force delete your connector Specs by setting the `VERSION_0_32_0_FORCE_UPGRADE` environment variable to `true`. Steps to specifically check out v.32.0 and details around this breaking change can be found [here](https://docs.airbyte.com/operator-guides/upgrading-airbyte/#mandatory-intermediate-upgrade).
-
-*Now back to our regularly scheduled programming.*
-
-🎃 Hacktoberfest Submissions 🎃
-
-* New Destination: ScyllaDB (contributed by Ivica Taseski)
-* New Source: Azure Table Storage (contributed by geekwhocodes)
-* New Source: Linnworks (contributed by Juozas V.([Labanoras Tech](http://labanoras.io)))
-
-* ✨ Source MySQL: Now has basic performance tests.
-* ✨ Source Salesforce: We now automatically transform and handle incorrect data for the anyType and calculated types.
-
-* 🐛 IBM Db2 Source: Now handles conversion from DECFLOAT to BigDecimal correctly.
-* 🐛 MSSQL Source: Now handles VARBINARY correctly.
-* 🐛 CockroachDB Source: Improved parsing of various data types.
-
-As usual, thank you so much to our wonderful contributors this week that have made Airbyte into what it is today: Achmad Syarif Hidayatullah, Tuan Nguyen, Ivica Taseski, Hai To, Juozas, gunu, Shadab, Per-Victor Persson, and Harsha Teja Kanna!
-
-## 11/11/2021 Summary
-
-Time to go over changes from v.30.39! And... let's get another update on Hacktoberfest.
-
-🎃 Hacktoberfest Submissions 🎃
-
-* New Destination: Cassandra (contributed by Ivica Taseski)
-* New Destination: Pulsar (contributed by Mario Molina)
-* New Source: Confluence (contributed by Tuan Nguyen)
-* New Source: Monday (contributed by Tuan Nguyen)
-* New Source: Commerce Tools (contributed by James Wilson)
-* New Source: Pinterest Marketing (contributed by us!)
-
-* ✨ Shopify Source: Now supports the FulfillmentOrders and Fulfillments streams.
-* ✨ Greenhouse Source: Now supports the Demographics stream.
-* ✨ Recharge Source: Broken requests should now be re-requested with improved backoff.
-* ✨ Stripe Source: Now supports the checkout_sessions, checkout_sessions_line_item, and promotion_codes streams.
-* ✨ Db2 Source: Now supports SSL.
-
-* 🐛 We've made some updates to incremental normalization to fix some outstanding issues. 
[Details](https://github.com/airbytehq/airbyte/pull/7669)
-* 🐛 Airbyte Server no longer crashes due to too many open files.
-* 🐛 MSSQL Source: Data type conversion with smalldatetime and smallmoney works correctly now.
-* 🐛 Salesforce Source: anyType fields can now be retrieved properly with the BULK API.
-* 🐛 BigQuery-Denormalized Destination: Fixed JSON parsing with $ref fields.
-
-As usual, thank you to our awesome contributors that have done awesome work during the last week: Tuan Nguyen, Harsha Teja Kanna, Aaditya S, James Wilson, Vladimir Remar, Yuhui Shi, Mario Molina, Ivica Taseski, Collin Scangarella, and haoranyu!
-
-## 11/03/2021 Summary
-
-It's patch notes time. Let's go over the changes from 0.30.24 and before. But before we do, let's get a quick update on how Hacktoberfest is going!
-
-🎃 Hacktoberfest Submissions 🎃
-
-* New Destination: Elasticsearch (contributed by Jeremy Branham)
-* New Source: Salesloft (contributed by Pras)
-* New Source: OneSignal (contributed by Bo)
-* New Source: Strava (contributed by terencecho)
-* New Source: Lemlist (contributed by Igli Koxha)
-* New Source: Amazon SQS (contributed by Alasdair Brown)
-* New Source: Freshservices (contributed by Tuan Nguyen)
-* New Source: Freshsales (contributed by Tuan Nguyen)
-* New Source: Appsflyer (contributed by Achmad Syarif Hidayatullah)
-* New Source: Paystack (contributed by Foluso Ogunlana)
-* New Source: Sentry (contributed by koji matsumoto)
-* New Source: Retently (contributed by Subhash Gopalakrishnan)
-* New Source: Delighted! (contributed by Rodrigo Parra)
-
-with 18 more currently in review...
-
-🎉 **Incremental Normalization is here!** 🎉
-
-💎 Basic normalization no longer runs on already normalized data, making it way faster and cheaper.
-
-🎉 **Airbyte Compiles on M1 Macs!**
-
-Airbyte developers with M1 chips in their MacBooks can now compile the project and run the server. This is a major step towards being able to fully run Airbyte on M1. (contributed by Harsha Teja Kanna)
-
-* ✨ BigQuery Destination: You can now run transformations in batches, preventing queries from hitting BigQuery limits. (contributed by Andrés Bravo)
-* ✨ S3 Source: Memory and Performance optimizations, also some fancy new PyArrow CSV configuration options.
-* ✨ Zuora Source: Now supports Unlimited as an option for the Data Query Live API.
-* ✨ Clickhouse Source: Now supports SSL and connection via SSH tunneling.
-
-* 🐛 Oracle Source: Now handles the LONG RAW data type correctly.
-* 🐛 Snowflake Source: Fixed parsing of extreme values for FLOAT and NUMBER data types.
-* 🐛 Hubspot Source: No longer fails due to lengthy URI/URLs.
-* 🐛 Zendesk Source: The chats stream now pulls data past the first page.
-* 🐛 Jira Source: Normalization now works as expected.
-
-As usual, thank you to our awesome contributors that have done awesome work during this productive spooky season: Tuan Nguyen, Achmad Syarif Hidayatullah, Christopher Wu, Andrés Bravo, Harsha Teja Kanna, Collin Scangarella, haoranyu, koji matsumoto, Subhash Gopalakrishnan, Jeremy Branham, Rodrigo Parra, Foluso Ogunlana, EdBizarro, Gergely Lendvai, Rodeoclash, terencecho, Igli Koxha, Alasdair Brown, bbugh, Pras, Bo, Xiangxuan Liu, Hai To, s-mawjee, Mario Molina, SamyPesse, Yuhui Shi, Maciej Nędza, Matt Hoag, and denis-sokolov!
-
-## 10/20/2021 Summary
-
-It's patch notes time! Let's go over changes from 0.30.16! But before we do... I want to remind everyone that Airbyte Hacktoberfest is currently taking place! 
For every connector that is merged into our codebase, you'll get $500, so make sure to submit before the hackathon ends on November 19th. - -* 🎉 New Source: WooCommerce (contributed by James Wilson) -* 🎉 K8s deployments: Worker image pull policy is now configurable (contributed by Mario Molina) - -* ✨ MSSQL destination: Now supports basic normalization -* 🐛 LinkedIn Ads source: Analytics streams now work as expected. - -We've had a lot of contributors over the last few weeks, so I'd like to thank all of them for their efforts: James Wilson, Mario Molina, Maciej Nędza, Pras, Tuan Nguyen, Andrés Bravo, Christopher Wu, gunu, Harsha Teja Kanna, Jonathan Stacks, darian, Christian Gagnon, Nicolas Moreau, Matt Hoag, Achmad Syarif Hidayatullah, s-mawjee, SamyPesse, heade, zurferr, denis-solokov, and aristidednd! - -## 09/29/2021 Summary - -It's patch notes time, let's go over the changes from our new minor version, v0.30.0. As usual, bug fixes are in the thread. - -* New source: LinkedIn Ads -* New source: Kafka -* New source: Lever Hiring - -* 🎉 New License: Nothing changes for users of Airbyte/contributors. You just can't sell your own Airbyte Cloud! - -* 💎 New API endpoint: You can now call connections/search in the web backend API to search sources and destinations. (contributed by Mario Molina) -* 💎 K8s: Added support for ImagePullSecrets for connector images. -* 💎 MSSQL, Oracle, MySQL sources & destinations: Now support connection via SSH (Bastion server) - -* ✨ MySQL destination: Now supports connection via TLS/SSL -* ✨ BigQuery (denormalized) destination: Supports reading BigQuery types such as date by reading the format field (contributed by Nicolas Moreau) -* ✨ Hubspot source: Added contacts associations to the deals stream. -* ✨ GitHub source: Now supports pulling commits from user-specified branches. -* ✨ Google Search Console source: Now accepts admin email as input when using a service account key. -* ✨ Greenhouse source: Now identifies API streams it has access to if permissions are limited. -* ✨ Marketo source: Now Airbyte native. -* ✨ S3 source: Now supports any source that conforms to the S3 protocol (Non-AWS S3). -* ✨ Shopify source: Now reports pre_tax_price on the line_items stream if you have Shopify Plus. -* ✨ Stripe source: Now actually uses the mandatory start_date config field for incremental syncs. - -* 🏗 Python CDK: Now supports passing custom headers to the requests in OAuth2, enabling token refresh calls. -* 🏗 Python CDK: Parent streams can now be configured to cache data for their child streams. -* 🏗 Python CDK: Now has a Transformer class that can cast record fields to the data type expected by the schema. - -* 🐛 Amplitude source: Fixed schema for date-time objects. -* 🐛 Asana source: Schema fixed for the sections, stories, tasks, and users streams. -* 🐛 GitHub source: Added error handling for streams not applicable to a repo. (contributed by Christopher Wu) -* 🐛 Google Search Console source: Verifies access to sites when performing the connection check. -* 🐛 Hubspot source: Now conforms to the V3 API, with streams such as owners reflecting the new fields. -* 🐛 Intercom source: Fixed data type for the updated_at field. (contributed by Christian Gagnon) -* 🐛 Iterable source: Normalization now works as expected. -* 🐛 Pipedrive source: Schema now reflects the correct types for date/time fields. -* 🐛 Stripe source: Incorrect timestamp formats removed for coupons and subscriptions streams. 
-* 🐛 Salesforce source: You can now sync more than 10,000 records with the Bulk API.
-* 🐛 Snowflake destination: Now accepts any date-time format with normalization.
-* 🐛 Snowflake destination: Inserts are now split into batches to accommodate large data loads.
-
-Thank you to our awesome contributors. Y'all are amazing: Mario Molina, Pras, Vladimir Remar, Christopher Wu, gunu, Juliano Benvenuto Piovezan, Brian M, Justinas Lukasevicius, Jonathan Stacks, Christian Gagnon, Nicolas Moreau, aristidednd, camro, minimax75, peter-mcconnell, and sashkalife!
-
-## 09/16/2021 Summary
-
-Now let's get to the 0.29.19 changelog. As with last time, bug fixes are in the thread!
-
-* New Destination: Databricks 🎉
-* New Source: Google Search Console
-* New Source: Close.com
-
-* 🏗 Python CDK: Now supports auth workflows involving query params.
-* 🏗 Java CDK: You can now run the connector gradle build script on Macs with M1 chips! (contributed by @Harsha Teja Kanna)
-
-* 💎 Google Ads source: You can now specify custom queries in GAQL.
-* ✨ GitHub source: All streams with a parent stream use cached parent stream data when possible.
-* ✨ Shopify source: Substantial performance improvements to the incremental sync mode.
-* ✨ Stripe source: Now supports the PaymentIntents stream.
-* ✨ Pipedrive source: Now supports the Organizations stream.
-* ✨ Sendgrid source: Now supports the SingleSendStats stream.
-* ✨ Bing Ads source: Now supports the Report stream.
-* ✨ GitHub source: Now supports the Reactions stream.
-* ✨ MongoDB source: Now Airbyte native!
-* 🐛 Facebook Marketing source: Numeric values are no longer wrapped into strings.
-* 🐛 Facebook Marketing source: Fetching conversion data now works as expected. (contributed by @Manav)
-* 🐛 Keen destination: Timestamps are now parsed correctly.
-* 🐛 S3 destination: Parquet schema parsing errors are fixed.
-* 🐛 Snowflake destination: No longer syncs unnecessary tables with S3.
-* 🐛 SurveyMonkey source: Cached responses are now decoded correctly.
-* 🐛 Okta source: Incremental sync now works as expected.
-
-Also, a quick shout out to Jinni Gu and their team who made the DynamoDB destination that we announced last week!
-
-As usual, thank you to all of our contributors: Harsha Teja Kanna, Manav, Maciej Nędza, mauro, Brian M, Iakov Salikov, Eliziario (Marcos Santos), coeurdestenebres, and mohammadbolt.
-
-## 09/09/2021 Summary
-
-We're going over the changes from 0.29.17 and before... and there are a lot of big improvements here, so don't miss them!
-
-**New Source**: Facebook Pages
-**New Destination**: MongoDB
-**New Destination**: DynamoDB
-
-* 🎉 You can now send notifications via webhook for successes and failures on Airbyte syncs. \(This is a massive contribution by @Pras, thank you\) 🎉
-* 🎉 Scheduling jobs and worker jobs are now separated, allowing for workers to be scaled horizontally.
-* 🎉 When developing a connector, you can now preview what your spec looks like in real time with this process.
-* 🎉 Oracle destination: Now has basic normalization.
-* 🎉 Add XLSB \(binary Excel\) support to the Files source \(contributed by Muutech\).
-* 🎉 You can now properly cancel K8s deployments.
-* ✨ S3 source: Support for Parquet format.
-* ✨ Github source: Branches, repositories, organization users, tags, and pull request stats streams added \(contributed by @Christopher Wu\).
-* ✨ BigQuery destination: Added GCS upload option.
-* ✨ Salesforce source: Now Airbyte native.
-* ✨ Redshift destination: Optimized for performance. 
-* 🏗 CDK: 🎉 We’ve released a tool to generate JSON Schemas from OpenAPI specs. This should make specifying schemas for API connectors a breeze! 🎉
-* 🏗 CDK: Source Acceptance Tests now verify that connectors correctly format strings which are declared as using date-time and date formats.
-* 🏗 CDK: Add private options to help in testing: \_limit and \_page\_size are now accepted by any CDK connector to minimize your output size for quick iteration while testing.
-* 🐛 Fixed a bug that made it possible for connector definitions to be duplicated, violating uniqueness.
-* 🐛 Pipedrive source: Output schemas no longer remove timestamp from fields.
-* 🐛 Github source: Empty repos and negative backoff values are now handled correctly.
-* 🐛 Harvest source: Normalization now works as expected.
-* 🐛 All CDC sources: Removed sleep logic which caused exceptions when loading data from high-volume sources.
-* 🐛 Slack source: Increased number of retries to tolerate flaky retry wait times on the API side.
-* 🐛 Slack source: Sync operations no longer hang indefinitely.
-* 🐛 Jira source: Now uses updated time as the cursor field for incremental sync instead of the created time.
-* 🐛 Intercom source: Fixed inconsistency between schema and output data.
-* 🐛 HubSpot source: Streams with the items property now have their schemas fixed.
-* 🐛 HubSpot source: Empty strings are no longer handled as dates, fixing the deals, companies, and contacts streams.
-* 🐛 Typeform source: Allows for multiple choices in responses now.
-* 🐛 Shopify source: The type for the amount field is now fixed in the schema.
-* 🐛 Postgres destination: \u0000\(NULL\) value processing is now fixed.
-
-As usual... thank you to our wonderful contributors this week: Pras, Christopher Wu, Brian M, yahu98, Michele Zuccala, jinnig, and luizgribeiro!
-
-## 09/01/2021 Summary
-
-Got the changes from 0.29.13... with some other surprises!
-
-* 🔥 There's a new way to create Airbyte sources! The team at Faros AI has created a Javascript/Typescript CDK which can be found here and in our docs here. This is absolutely awesome, so a huge thanks to Chalenge Masekera, Christopher Wu, eskrm, and Matthew Tovbin!
-* ✨ New Destination: Azure Blob Storage ✨
-
-**New Source**: Bamboo HR \(contributed by @Oren Haliva\)
-**New Source**: BigCommerce \(contributed by @James Wilson\)
-**New Source**: Trello
-**New Source**: Google Analytics V4
-**New Source**: Amazon Ads
-
-* 💎 Alpine Docker images are the new standard for Python connectors, so image sizes have dropped by around 100 MB!
-* ✨ You can now apply tolerations for Airbyte Pods on K8s deployments \(contributed by @Pras\).
-* 🐛 Shopify source: Rate limit throttling fixed.
-* 📚 We now have a doc on how to deploy Airbyte at scale. Check it out here!
-* 🏗 Airbyte CDK: You can now ignore HTTP status errors and override retry parameters.
-
-As usual, thank you to our awesome contributors: Oren Haliva, Pras, James Wilson, and Muutech.
-
-## 08/26/2021 Summary
-
-New Source: Short.io \(contributed by @Apostol Tegko\)
-
-* 💎 GitHub source: Added support for rotating through multiple API tokens!
-* ✨ Syncs are now scheduled with a 3 day timeout \(contributed by @Vladimir Remar\).
-* ✨ Google Ads source: Added UserLocationReport stream \(contributed by @Max Krog\).
-* ✨ Cart.com source: Added the order\_items stream.
-* 🐛 Postgres source: Fixed out-of-memory issue with CDC interacting with large JSON blobs.
-* 🐛 Intercom source: Pagination now works as expected. 
-
-As always, thank you to our awesome community contributors this week: Apostol Tegko, Vladimir Remar, Max Krog, Pras, Marco Fontana, Troy Harvey, and damianlegawiec!
-
-## 08/20/2021 Summary
-
-Hey Airbyte community, we got some patch notes for y'all. Here are all the changes we've pushed since the last update.
-
-* **New Source**: S3/Abstract Files
-* **New Source**: Zuora
-* **New Source**: Kustomer
-* **New Source**: Apify
-* **New Source**: Chargebee
-* **New Source**: Bing Ads
-
-New Destination: Keen
-
-* ✨ Shopify source: The `status` property is now in the `Products` stream.
-* ✨ Amazon Seller Partner source: Added support for `GET_MERCHANT_LISTINGS_ALL_DATA` and `GET_FBA_INVENTORY_AGED_DATA` stream endpoints.
-* ✨ GitHub source: Existing streams now don't minify the user property.
-* ✨ HubSpot source: Updated user-defined custom field schema generation.
-* ✨ Zendesk source: Migrated from Singer to the Airbyte CDK.
-* ✨ Amazon Seller Partner source: Migrated to the Airbyte CDK.
-* 🐛 Shopify source: Fixed the `products` schema to be in accordance with the API.
-* 🐛 S3 source: Fixed bug where syncs could hang indefinitely.
-
-And as always... we'd love to shout out the awesome contributors that have helped push Airbyte forward. As a reminder, you can now see your contributions publicly reflected on our [contributors page](https://airbyte.com/contributors).
-
-Thank you to Rodrigo Parra, Brian Krausz, Max Krog, Apostol Tegko, Matej Hamas, Vladimir Remar, Marco Fontana, Nicholas Bull, @mildbyte, @subhaklp, and Maciej Nędza!
-
-## 07/30/2021 Summary
-
-For this week's update, we got... a few new connectors in 0.29.0. We found that a lot of sources can pull data directly from the underlying db instance, which we naturally already supported.
-
-* New Source: PrestaShop ✨
-* New Source: Snapchat Marketing ✨
-* New Source: Drupal
-* New Source: Magento
-* New Source: Microsoft Dynamics AX
-* New Source: Microsoft Dynamics Customer Engagement
-* New Source: Microsoft Dynamics GP
-* New Source: Microsoft Dynamics NAV
-* New Source: Oracle PeopleSoft
-* New Source: Oracle Siebel CRM
-* New Source: SAP Business One
-* New Source: Spree Commerce
-* New Source: Sugar CRM
-* New Source: Wordpress
-* New Source: Zencart
-* 🐛 Shopify source: Fixed the products schema to be in accordance with the API
-* 🐛 BigQuery source: No longer fails with nested array data types.
-
-View the full release highlights here: [Platform](platform.md), [Connectors](connectors.md)
-
-And as always, thank you to our wonderful contributors: Madison Swain-Bowden, Brian Krausz, Apostol Tegko, Matej Hamas, Vladimir Remar, Oren Haliva, satishblotout, jacqueskpoty, wallies
-
-## 07/23/2021 Summary
-
-What's going on? We just released 0.28.0 and here are the main highlights.
-
-* New Destination: Google Cloud Storage ✨
-* New Destination: Kafka ✨ \(contributed by @Mario Molina\)
-* New Source: Pipedrive
-* New Source: US Census \(contributed by @Daniel Mateus Pires \(Earnest Research\)\)
-* ✨ Google Ads source: Now supports Campaigns, Ads, AdGroups, and Accounts streams.
-* ✨ Stripe source: All subscription types \(including expired and canceled ones\) are now returned.
-* 🐛 Facebook source: Improved rate limit management
-* 🐛 Square source: The send\_request method is no longer broken due to CDK changes
-* 🐛 MySQL destination: No longer fails on columns with JSON data. 
- -View the full release highlights here: [Platform](platform.md), [Connectors](connectors.md) - -And as always, thank you to our wonderful contributors: Mario Molina, Daniel Mateus Pires \(Earnest Research\), gunu, Ankur Adhikari, Vladimir Remar, Madison Swain-Bowden, Maksym Pavlenok, Sam Crowder, mildbyte, avida, and gaart - -## 07/16/2021 Summary - -As for our changes this week... - -* New Source: Zendesk Sunshine -* New Source: Dixa -* New Source: Typeform -* 💎 MySQL destination: Now supports normalization! -* 💎 MSSQL source: Now supports CDC \(Change Data Capture\) -* ✨ Snowflake destination: Data coming from Airbyte is now identifiable -* 🐛 GitHub source: Now uses the correct cursor field for the IssueEvents stream -* 🐛 Square source: The send\_request method is no longer broken due to CDK changes - -View the full release highlights here: [Platform](platform.md), [Connectors](connectors.md) - -As usual, thank you to our awesome community contributors this week: Oliver Meyer, Varun, Brian Krausz, shadabshaukat, Serhii Lazebnyi, Juliano Benvenuto Piovezan, mildbyte, and Sam Crowder! - -## 07/09/2021 Summary - -* New Source: PayPal Transaction -* New Source: Square -* New Source: SurveyMonkey -* New Source: CockroachDB -* New Source: Airbyte-Native GitHub -* New Source: Airbyte-Native GitLab -* New Source: Airbyte-Native Twilio -* ✨ S3 destination: Now supports anyOf, oneOf and allOf schema fields. -* ✨ Instagram source: Migrated to the CDK and has improved error handling. -* ✨ Shopify source: Add support for draft orders. -* ✨ K8s Deployments: Now support logging to GCS. -* 🐛 GitHub source: Fixed issue with locked breaking normalization of the pull\_request stream. -* 🐛 Okta source: Fix endless loop when syncing data from logs stream. -* 🐛 PostgreSQL source: Fixed decimal handling with CDC. -* 🐛 Fixed random silent source failures. -* 📚 New document on how the CDK handles schemas. -* 🏗️ Python CDK: Now allows setting of network adapter args on outgoing HTTP requests. - -View the full release highlights here: [Platform](platform.md), [Connectors](connectors.md) - -As usual, thank you to our awesome community contributors this week: gunu, P.VAD, Rodrigo Parra, Mario Molina, Antonio Grass, sabifranjo, Jaime Farres, shadabshaukat, Rodrigo Menezes, dkelwa, Jonathan Duval, and Augustin Lafanechère. - -## 07/01/2021 Summary - -* New Destination: Google PubSub -* New Source: AWS CloudTrail - -_The risks and issues with upgrading Airbyte are now gone..._ - -* 🎉 Airbyte automatically upgrades versions safely at server startup 🎉 -* 💎 Logs on K8s are now stored in Minio by default, no S3 bucket required -* ✨ Looker Source: Supports the Run Look output stream -* ✨ Slack Source: is now Airbyte native! -* 🐛 Freshdesk Source: No longer fails after 300 pages -* 📚 New tutorial on building Java destinations - -Starting from next week, our weekly office hours will now become demo days! Drop by to get sneak peeks and new feature demos. - -* We added the \#careers channel, so if you're hiring, post your job reqs there! -* We added a \#understanding-airbyte channel to mirror [this](../../understanding-airbyte/) section on our docs site. Ask any questions about our architecture or protocol there. -* We added a \#contributing-to-airbyte channel. A lot of people ask us about how to contribute to the project, so ask away there! 
-
-View the full release highlights here: [Platform](platform.md), [Connectors](connectors.md)
-
-As usual, thank you to our awesome community contributors this week: Harshith Mullapudi, Michael Irvine, and [sabifranjo](https://github.com/sabifranjo).
-
-## 06/24/2021 Summary
-
-* New Source: [IBM Db2](../../integrations/sources/db2.md)
-* 💎 We now support Avro and JSONL output for our S3 destination! 💎
-* 💎 Brand new BigQuery destination flavor that now supports denormalized STRUCT types.
-* ✨ Looker source now supports self-hosted instances.
-* ✨ Facebook Marketing source is now migrated to the CDK, massively improving async job performance and error handling.
-
-View the full connector release notes [here](connectors.md).
-
-As usual, thank you to some of our awesome community contributors this week: Harshith Mullapudi, Tyler DeLange, Daniel Mateus Pires, EdBizarro, Tyler Schroeder, and Konrad Schlatte!
-
-## 06/18/2021 Summary
-
-* New Source: [Snowflake](../../integrations/sources/snowflake.md)
-* 💎 We now support custom dbt transformations! 💎
-* ✨ We now support configuring your destination namespace at the table level when setting up a connection!
-* ✨ The S3 destination now supports Minio S3 and Parquet output!
-
-View the full release notes here: [Platform](platform.md), [Connectors](connectors.md)
-
-As usual, thank you to some of our awesome community contributors this week: Tyler DeLange, Mario Molina, Rodrigo Parra, Prashanth Patali, Christopher Wu, Itai Admi, Fred Reimer, and Konrad Schlatte!
-
-## 06/10/2021 Summary
-
-* New Destination: [S3!!](../../integrations/destinations/s3.md)
-* New Sources: [Harvest](../../integrations/sources/harvest.md), [Amplitude](../../integrations/sources/amplitude.md), [Posthog](../../integrations/sources/posthog.md)
-* 🐛 Ensure that logs from threads created by replication workers are added to the log file.
-* 🐛 Handle TINYINT\(1\) and BOOLEAN correctly and fix target file comparison for MySQL CDC.
-* Jira source: now supports all available entities in Jira Cloud.
-* 📚 Added a troubleshooting section, a gradle cheatsheet, a reminder on what the reset button does, and a refresh on our docs best practices.
-
-#### Connector Development:
-
-* Containerized connector code generator
-* Added JDBC source connector bootstrap template.
-* Added Java destination generator.
-
-View the full release notes highlights here: [Platform](platform.md), [Connectors](connectors.md)
-
-As usual, thank you to some of our awesome community contributors this week \(I've noticed that we've had more contributors to our docs, which we really appreciate\): Ping, Harshith Mullapudi, Michael Irvine, Matheus di Paula, jacqueskpoty and P.VAD.
-
-## Overview
-
-Airbyte is composed of 2 parts:
-
-* Platform \(the scheduler, workers, API, web app, and the Airbyte protocol\). Here is the [changelog for Platform](platform.md).
-* Connectors that run in Docker containers. Here is the [changelog for the connectors](connectors.md).
-
-## Airbyte Platform Releases
-
-### Production v. Dev Releases
-
-The "production" version of Airbyte is the version of the app specified in `.env`. With each production release, we update the version in the `.env` file. This version will always be available for download on DockerHub. It is the version of the app that runs when a user runs `docker compose up`.
-
-The "development" version of Airbyte is the head of the master branch. It is the version of the app that runs when a user runs `./gradlew build &&
-VERSION=dev docker compose up`. 
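-
-A minimal sketch of the two run modes described above, assuming a local checkout of the Airbyte repository with Docker and Gradle available:
-
-```bash
-# Production: run the version pinned in the repo's .env file
-docker compose up
-
-# Development: build the head of master, then run the dev-tagged images
-./gradlew build && VERSION=dev docker compose up
-```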
-
-### Production Release Schedule
-
-#### Scheduled Releases
-
-Airbyte currently releases a new minor version of the application on a weekly basis. Generally this weekly release happens on Monday or Tuesday.
-
-#### Hotfixes
-
-Airbyte releases a new version whenever it discovers and fixes a bug that blocks any mission-critical functionality.
-
-**Mission Critical**
-
-e.g. Non-ASCII characters break the Salesforce source.
-
-**Non-Mission Critical**
-
-e.g. Buttons in the UI are offset.
-
-#### Unscheduled Releases
-
-We will often release more frequently than the weekly cadence if we complete a feature that we know a user is waiting on.
-
-### Development Release Schedule
-
-As soon as a feature is on master, it is part of the development version of Airbyte. We merge features as soon as they are ready to go \(have been code reviewed and tested\). We attempt to keep the development version of the app working all the time. We are iterating quickly, however, and there may be intermittent periods where the development version is broken.
-
-If there is ever a feature that is only on the development version, and you need it on the production version, please let us know. We are very happy to do ad-hoc production releases if it unblocks a specific need for one of our users.
-
-## Airbyte Connector Releases
-
-Each connector is tracked with its own version. These versions are separate from the versions of Airbyte Platform. We generally will bump the version of a connector anytime we make a change to it. We rely on a large suite of tests to make sure that these changes do not cause regressions in our connectors.
-
-When we update the version of a connector, we usually update the connector's version in Airbyte Platform as well. Keep in mind that you might not see the updated version of that connector in the production version of Airbyte Platform until after a production release of Airbyte Platform.
-
diff --git a/docs/archive/changelog/connectors.md b/docs/archive/changelog/connectors.md
deleted file mode 100644
index a1f8b8126e07..000000000000
--- a/docs/archive/changelog/connectors.md
+++ /dev/null
@@ -1,776 +0,0 @@
----
-description: Do not miss the new connectors we support!
----
-
-# Connectors
-
-**You can request new connectors directly** [**here**](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fintegration%2C+new-integration&template=new-integration-request.md&title=)**.**
-
-Note: Airbyte is not built on top of Singer but is compatible with Singer's protocol. Airbyte's ambitions go beyond what Singer enables us to do, so we are building our own protocol that maintains compatibility with Singer's protocol.
-
-Check out our [connector roadmap](https://github.com/airbytehq/airbyte/projects/3) to see what we're currently working on. 
-
-## 1/28/2022
-
-New sources:
-
-- [**Chartmogul**](https://docs.airbyte.com/integrations/sources/chartmogul)
-- [**Hellobaton**](https://docs.airbyte.com/integrations/sources/hellobaton)
-- [**Flexport**](https://docs.airbyte.com/integrations/sources/flexport)
-- [**PersistIq**](https://docs.airbyte.com/integrations/sources/persistiq)
-
-## 1/6/2022
-
-New sources:
-
-- [**3PL Central**](https://docs.airbyte.com/integrations/sources/tplcentral)
-- [**My Hours**](https://docs.airbyte.com/integrations/sources/my-hours)
-- [**Qualaroo**](https://docs.airbyte.com/integrations/sources/qualaroo)
-- [**SearchMetrics**](https://docs.airbyte.com/integrations/sources/search-metrics)
-
-## 12/16/2021
-
-New source:
-
-- [**OpenWeather**](https://docs.airbyte.com/integrations/sources/openweather)
-
-New destinations:
-
-- [**ClickHouse**](https://docs.airbyte.com/integrations/destinations/clickhouse)
-- [**RabbitMQ**](https://docs.airbyte.com/integrations/destinations/rabbitmq)
-- [**Amazon SQS**](https://docs.airbyte.com/integrations/destinations/amazon-sqs)
-- [**Rockset**](https://docs.airbyte.com/integrations/destinations/rockset)
-
-## 12/9/2021
-
-New source:
-
-- [**Mailgun**](https://docs.airbyte.com/integrations/sources/mailgun)
-
-## 12/2/2021
-
-New destinations:
-
-- [**Redis**](https://docs.airbyte.com/integrations/destinations/redis)
-- [**MQTT**](https://docs.airbyte.com/integrations/destinations/mqtt)
-- [**Google Firestore**](https://docs.airbyte.com/integrations/destinations/firestore)
-- [**Kinesis**](https://docs.airbyte.com/integrations/destinations/kinesis)
-
-## 11/25/2021
-
-New sources:
-
-- [**Airtable**](https://docs.airbyte.com/integrations/sources/airtable)
-- [**Notion**](https://docs.airbyte.com/integrations/sources/notion)
-- [**Pardot**](https://docs.airbyte.com/integrations/sources/pardot)
-- [**Linnworks**](https://docs.airbyte.com/integrations/sources/linnworks)
-- [**YouTube Analytics**](https://docs.airbyte.com/integrations/sources/youtube-analytics)
-
-New features:
-
-- **Exchange Rates** Source: add `ignore_weekends` option.
-- **Facebook** Source: add the videos stream.
-- **Freshdesk** Source: removed the limitation in streams pagination.
-- **Jira** Source: add option to render fields in HTML format.
-- **MongoDB v2** Source: improve read performance.
-- **Pipedrive** Source: specify schema for "persons" stream.
-- **PostgreSQL** Source: exclude tables on which user doesn't have select privileges.
-- **SurveyMonkey** Source: improve connection check.
-
-## 11/17/2021
-
-New destination:
-
-- [**ScyllaDB**](https://docs.airbyte.com/integrations/destinations/scylla)
-
-New sources:
-
-- [**Azure Table Storage**](https://docs.airbyte.com/integrations/sources/azure-table)
-- [**Linnworks**](https://docs.airbyte.com/integrations/sources/linnworks)
-
-New features:
-
-- **MySQL** Source: Now has basic performance tests.
-- **Salesforce** Source: We now automatically transform and handle incorrect data for the anyType and calculated types. 
- -## 11/11/2021 - -New destinations: - -- [**Cassandra**](https://docs.airbyte.com/integrations/destinations/cassandra) -- [**Pulsar**](https://docs.airbyte.com/integrations/destinations/pulsar) - -New sources: - -- [**Confluence**](https://docs.airbyte.com/integrations/sources/confluence) -- [**Monday**](https://docs.airbyte.com/integrations/sources/monday) -- [**Commerce Tools**](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-commercetools) -- [**Pinterest**](https://docs.airbyte.com/integrations/sources/pinterest) - -New features: - -- **Shopify** Source: Now supports the FulfillmentOrders and Fulfillments streams. -- **Greenhouse** Source: Now supports the Demographics stream. -- **Recharge** Source: Broken requests should now be re-requested with improved backoff. -- **Stripe** Source: Now supports the checkout_sessions, checkout_sessions_line_item, and promotion_codes streams. -- **Db2** Source: Now supports SSL. - -## 11/3/2021 - -New destination: - -- [**Elasticsearch**](https://docs.airbyte.com/integrations/destinations/elasticsearch) - -New sources: - -- [**Salesloft**](https://docs.airbyte.com/integrations/sources/salesloft) -- [**OneSignal**](https://docs.airbyte.com/integrations/sources/onesignal) -- [**Strava**](https://docs.airbyte.com/integrations/sources/strava) -- [**Lemlist**](https://docs.airbyte.com/integrations/sources/lemlist) -- [**Amazon SQS**](https://docs.airbyte.com/integrations/sources/amazon-sqs) -- [**Freshservices**](https://docs.airbyte.com/integrations/sources/freshservice/) -- [**Freshsales**](https://docs.airbyte.com/integrations/sources/freshsales) -- [**Appsflyer**](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-appsflyer) -- [**Paystack**](https://docs.airbyte.com/integrations/sources/paystack) -- [**Sentry**](https://docs.airbyte.com/integrations/sources/sentry) -- [**Retently**](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-retently) -- [**Delighted!**](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-delighted) - -New features: - -- **BigQuery** Destination: You can now run transformations in batches, preventing queries from hitting BigQuery limits. (contributed by @Andrés Bravo) -- **S3** Source: Memory and Performance optimizations, also some fancy new PyArrow CSV configuration options. -- **Zuora** Source: Now supports Unlimited as an option for the Data Query Live API. -- **Clickhouse** Source: Now supports SSL and connection via SSH tunneling. - -## 10/20/2021 - -New source: - -- [**WooCommerce**](https://docs.airbyte.com/integrations/sources/woocommerce) - -New feature: - -- **MSSQL** destination: Now supports basic normalization - -## 9/29/2021 - -New sources: - -- [**LinkedIn Ads**](https://docs.airbyte.com/integrations/sources/linkedin-ads) -- [**Kafka**](https://docs.airbyte.com/integrations/sources/kafka) -- [**Lever Hiring**](https://docs.airbyte.com/integrations/sources/lever-hiring) - -New features: - -- **MySQL** destination: Now supports connection via TLS/SSL -- **BigQuery** (denormalized) destination: Supports reading BigQuery types such as date by reading the format field (contributed by @Nicolas Moreau) -- **Hubspot** source: Added contacts associations to the deals stream. -- **GitHub** source: Now supports pulling commits from user-specified branches. -- **Google Search Console** source: Now accepts admin email as input when using a service account key. 
-- **Greenhouse** source: Now identifies API streams it has access to if permissions are limited.
-- **Marketo** source: Now Airbyte native.
-- **S3** source: Now supports any source that conforms to the S3 protocol (Non-AWS S3).
-- **Shopify** source: Now reports pre_tax_price on the line_items stream if you have Shopify Plus.
-- **Stripe** source: Now actually uses the mandatory start_date config field for incremental syncs.
-
-## 9/16/2021
-
-New destination:
-
-- [**Databricks**](https://docs.airbyte.com/integrations/destinations/databricks)
-
-New sources:
-
-- [**Close.com**](https://docs.airbyte.com/integrations/sources/close-com)
-- [**Google Search Console**](https://docs.airbyte.com/integrations/sources/google-search-console)
-
-New features:
-
-- **Google Ads** source: You can now specify custom queries in GAQL.
-- **GitHub** source: All streams with a parent stream use cached parent stream data when possible.
-- **Shopify** source: Substantial performance improvements to the incremental sync mode.
-- **Stripe** source: Now supports the PaymentIntents stream.
-- **Pipedrive** source: Now supports the Organizations stream.
-- **Sendgrid** source: Now supports the SingleSendStats stream.
-- **Bing Ads** source: Now supports the Report stream.
-- **GitHub** source: Now supports the Reactions stream.
-- **MongoDB** source: Now Airbyte native!
-
-## 9/9/2021
-
-New source:
-
-- [**Facebook Pages**](https://docs.airbyte.com/integrations/sources/facebook-pages)
-
-New destinations:
-
-- [**MongoDB**](https://docs.airbyte.com/integrations/destinations/mongodb)
-- [**DynamoDB**](https://docs.airbyte.com/integrations/destinations/dynamodb)
-
-New features:
-
-- **S3** source: Support for Parquet format.
-- **Github** source: Branches, repositories, organization users, tags, and pull request stats streams added \(contributed by @Christopher Wu\).
-- **BigQuery** destination: Added GCS upload option.
-- **Salesforce** source: Now Airbyte native.
-- **Redshift** destination: Optimized for performance.
-
-Bug fixes:
-
-- **Pipedrive** source: Output schemas no longer remove timestamp from fields.
-- **Github** source: Empty repos and negative backoff values are now handled correctly.
-- **Harvest** source: Normalization now works as expected.
-- **All CDC sources**: Removed sleep logic which caused exceptions when loading data from high-volume sources.
-- **Slack** source: Increased number of retries to tolerate flaky retry wait times on the API side.
-- **Slack** source: Sync operations no longer hang indefinitely.
-- **Jira** source: Now uses updated time as the cursor field for incremental sync instead of the created time.
-- **Intercom** source: Fixed inconsistency between schema and output data.
-- **HubSpot** source: Streams with the items property now have their schemas fixed.
-- **HubSpot** source: Empty strings are no longer handled as dates, fixing the deals, companies, and contacts streams.
-- **Typeform** source: Allows for multiple choices in responses now.
-- **Shopify** source: The type for the amount field is now fixed in the schema.
-- **Postgres** destination: \u0000\(NULL\) value processing is now fixed. 
-
-## 9/1/2021
-
-New sources:
-
-- [**Bamboo HR**](https://docs.airbyte.com/integrations/sources/bamboo-hr)
-- [**BigCommerce**](https://docs.airbyte.com/integrations/sources/bigcommerce)
-- [**Trello**](https://docs.airbyte.com/integrations/sources/trello)
-- [**Google Analytics V4**](https://docs.airbyte.com/integrations/sources/google-analytics-v4)
-- [**Amazon Ads**](https://docs.airbyte.com/integrations/sources/amazon-ads)
-
-Bug fixes:
-
-- **Shopify** source: Rate limit throttling fixed.
-
-## 8/26/2021
-
-New source:
-
-- [**Short.io**](https://docs.airbyte.com/integrations/sources/shortio)
-
-New features:
-
-- **GitHub** source: Add support for rotating through multiple API tokens.
-- **Google Ads** source: Added `UserLocationReport` stream.
-- **Cart.com** source: Added the `order_items` stream.
-
-Bug fixes:
-
-- **Postgres** source: Fix out-of-memory issue with CDC interacting with large JSON blobs.
-- **Intercom** source: Pagination now works as expected.
-
-## 8/18/2021
-
-New source:
-
-- [**Bing Ads**](https://docs.airbyte.com/integrations/sources/bing-ads)
-
-New destination:
-
-- [**Keen**](https://docs.airbyte.com/integrations/destinations/keen)
-
-New features:
-
-- **Chargebee** source: Adds support for the `items`, `item prices` and `attached items` endpoints.
-
-Bug fixes:
-
-- **QuickBooks** source: Now uses the number data type for decimal fields.
-- **HubSpot** source: Fixed `empty string` inside the `number` and `float` datatypes.
-- **GitHub** source: Validation fixed on non-required fields.
-- **BigQuery** destination: Now supports processing of arrays of records properly.
-- **Oracle** destination: Fixed destination check for users without DBA role.
-
-## 8/9/2021
-
-New sources:
-
-- [**S3/Abstract Files**](https://docs.airbyte.com/integrations/sources/s3)
-- [**Zuora**](https://docs.airbyte.com/integrations/sources/zuora)
-- [**Kustomer**](https://docs.airbyte.com/integrations/sources/kustomer-singer/)
-- [**Apify**](https://docs.airbyte.com/integrations/sources/apify-dataset)
-- [**Chargebee**](https://docs.airbyte.com/integrations/sources/chargebee)
-
-New features:
-
-- **Shopify** source: The `status` property is now in the `Products` stream.
-- **Amazon Seller Partner** source: Added support for `GET_MERCHANT_LISTINGS_ALL_DATA` and `GET_FBA_INVENTORY_AGED_DATA` stream endpoints.
-- **GitHub** source: Existing streams now don't minify the `user` property.
-- **HubSpot** source: Updated user-defined custom field schema generation.
-- **Zendesk** source: Migrated from Singer to the Airbyte CDK.
-- **Amazon Seller Partner** source: Migrated to the Airbyte CDK.
-
-Bug fixes:
-
-- **HubSpot** source: Casting exceptions are now logged correctly.
-- **S3** source: Fixed bug where syncs could hang indefinitely.
-- **Shopify** source: Fixed the `products` schema to be in accordance with the API.
-- **PayPal Transactions** source: Fixed the start date minimum to be 3 years rather than 45 days.
-- **Google Ads** source: Added the `login-customer-id` setting.
-- **Intercom** source: Rate limit corrected from 1000 requests/minute to 1000 requests/hour.
-- **S3** source: Fixed bug in spec to properly display the `format` field in the UI.
-
-New CDK features:
-
-- Now allows for setting request data in non-JSON formats. 
- -## 7/30/2021 - -New sources: - -- [**PrestaShop**](https://docs.airbyte.com/integrations/sources/prestashop) -- [**Snapchat Marketing**](https://docs.airbyte.com/integrations/sources/snapchat-marketing) -- [**Drupal**](https://docs.airbyte.com/integrations/sources/drupal) -- [**Magento**](https://docs.airbyte.com/integrations/sources/magento) -- [**Microsoft Dynamics AX**](https://docs.airbyte.com/integrations/sources/microsoft-dynamics-ax) -- [**Microsoft Dynamics Customer Engagement**](https://docs.airbyte.com/integrations/sources/microsoft-dynamics-customer-engagement) -- [**Microsoft Dynamics GP**](https://docs.airbyte.com/integrations/sources/microsoft-dynamics-gp) -- [**Microsoft Dynamics NAV**](https://docs.airbyte.com/integrations/sources/microsoft-dynamics-nav) -- [**Oracle PeopleSoft**](https://docs.airbyte.com/integrations/sources/oracle-peoplesoft) -- [**Oracle Siebel CRM**](https://docs.airbyte.com/integrations/sources/oracle-siebel-crm) -- [**SAP Business One**](https://docs.airbyte.com/integrations/sources/sap-business-one) -- [**Spree Commerce**](https://docs.airbyte.com/integrations/sources/spree-commerce) -- [**Sugar CRM**](https://docs.airbyte.com/integrations/sources/sugar-crm) -- [**WooCommerce**](https://docs.airbyte.com/integrations/sources/woocommerce) -- [**Wordpress**](https://docs.airbyte.com/integrations/sources/wordpress) -- [**Zencart**](https://docs.airbyte.com/integrations/sources/zencart) - -Bug fixes: - -- **Shopify** source: Fixed the `products` schema to be in accordance with the API. -- **BigQuery** source: No longer fails with `Array of Records` data types. -- **BigQuery** destination: Improved logging, Job IDs are now filled with location and Project IDs. - -## 7/23/2021 - -New sources: - -- [**Pipedrive**](https://docs.airbyte.com/integrations/sources/pipedrive) -- [**US Census**](https://docs.airbyte.com/integrations/sources/us-census) -- [**BigQuery**](https://docs.airbyte.com/integrations/sources/bigquery) - -New destinations: - -- [**Google Cloud Storage**](https://docs.airbyte.com/integrations/destinations/gcs) -- [**Kafka**](https://docs.airbyte.com/integrations/destinations/kafka) - -New Features: - -- **Java Connectors**: Now have config validators for check, discover, read, and write calls -- **Stripe** source: All subscription types are returnable \(including expired and canceled ones\). -- **Mixpanel** source: Migrated to the CDK. -- **Intercom** source: Migrated to the CDK. -- **Google Ads** source: Now supports the `Campaigns`, `Ads`, `AdGroups`, and `Accounts` streams. - -Bug Fixes: - -- **Facebook** source: Improved rate limit management -- **Instagram** source: Now supports old format for state and automatically updates it to the new format. -- **Sendgrid** source: Now gracefully handles malformed responses from API. -- **Jira** source: Fixed dbt failing to normalize schema for the labels stream. -- **MySQL** destination: Does not fail anymore with columns that contain JSON data. -- **Slack** source: Now does not fail stream slicing on reading threads. - -## 7/16/2021 - -3 new sources: - -- [**Zendesk Sunshine**](https://docs.airbyte.com/integrations/sources/zendesk-sunshine) -- [**Dixa**](https://docs.airbyte.com/integrations/sources/dixa) -- [**Typeform**](https://docs.airbyte.com/integrations/sources/typeform) - -New Features: - -- **MySQL** destination: Now supports normalization! -- **MSSQL** source: Now supports CDC \(Change Data Capture\). -- **Snowflake** destination: Data coming from Airbyte is now identifiable. 
-- **GitHub** source: Now handles rate limiting.
-
-Bug Fixes:
-
-- **GitHub** source: Now uses the correct cursor field for the `IssueEvents` stream.
-- **Square** source: `send_request` method is no longer broken.
-
-## 7/08/2021
-
-7 new sources:
-
-- [**PayPal Transaction**](https://docs.airbyte.com/integrations/sources/paypal-transaction)
-- [**Square**](https://docs.airbyte.com/integrations/sources/square)
-- [**SurveyMonkey**](https://docs.airbyte.com/integrations/sources/surveymonkey)
-- [**CockroachDB**](https://docs.airbyte.com/integrations/sources/cockroachdb)
-- [**Airbyte-native GitLab**](https://docs.airbyte.com/integrations/sources/gitlab)
-- [**Airbyte-native GitHub**](https://docs.airbyte.com/integrations/sources/github)
-- [**Airbyte-native Twilio**](https://docs.airbyte.com/integrations/sources/twilio)
-
-New Features:
-
-- **S3** destination: Now supports `anyOf`, `oneOf` and `allOf` schema fields.
-- **Instagram** source: Migrated to the CDK and has improved error handling.
-- **Snowflake** source: Now has comprehensive data type tests.
-- **Shopify** source: Change the default stream cursor field to `updated_at` where possible.
-- **Shopify** source: Add support for draft orders.
-- **MySQL** destination: Now supports normalization.
-
-Connector Development:
-
-- **Python CDK**: Now allows setting of network adapter args on outgoing HTTP requests.
-- Abstract classes for non-JDBC relational database sources.
-
-Bugfixes:
-
-- **GitHub** source: Fixed issue with `locked` breaking normalization of the pull_request stream.
-- **PostgreSQL** source: Fixed decimal handling with CDC.
-- **Okta** source: Fix endless loop when syncing data from logs stream.
-
-## 7/01/2021
-
-Bugfixes:
-
-- **Looker** source: Now supports the Run Look stream.
-- **Google Adwords**: CI is fixed and a new version is published.
-- **Slack** source: Now Airbyte native and supports channels, channel members, messages, users, and threads streams.
-- **Freshdesk** source: Does not fail after 300 pages anymore.
-- **MSSQL** source: Now has comprehensive data type tests.
-
-## 6/24/2021
-
-1 new source:
-
-- [**Db2**](https://docs.airbyte.com/integrations/sources/db2)
-
-New features:
-
-- **S3** destination: supports Avro and JSONL output!
-- **BigQuery** destination: now supports loading JSON data as structured data.
-- **Looker** source: Now supports self-hosted instances.
-- **Facebook** source: is now migrated to the CDK.
-
-## 6/18/2021
-
-1 new source:
-
-- [**Snowflake**](https://docs.airbyte.com/integrations/sources/snowflake)
-
-New features:
-
-- **Postgres** source: now has comprehensive data type tests.
-- **Google Ads** source: now uses the [Google Ads Query Language](https://developers.google.com/google-ads/api/docs/query/overview)!
-- **S3** destination: supports Parquet output!
-- **S3** destination: supports Minio S3!
-- **BigQuery** destination: credentials are now optional.
-
-## 6/10/2021
-
-1 new destination:
-
-- [**S3**](https://docs.airbyte.com/integrations/destinations/s3)
-
-3 new sources:
-
-- [**Harvest**](https://docs.airbyte.com/integrations/sources/harvest)
-- [**Amplitude**](https://docs.airbyte.com/integrations/sources/amplitude)
-- [**Posthog**](https://docs.airbyte.com/integrations/sources/posthog)
-
-New features:
-
-- **Jira** source: now supports all available entities in Jira Cloud.
-- **ExchangeRatesAPI** source: clearer messages around unsupported currencies.
-- **MySQL** source: Comprehensive core extension to be more compatible with other JDBC sources. 
-- **BigQuery** destination: Add dataset location. -- **Shopify** source: Add order risks + new attributes to orders schema for native connector - -Bugfixes: - -- **MSSQL** destination: fixed handling of unicode symbols. - -Connector development updates: - -- Containerized connector code generator. -- Added JDBC source connector bootstrap template. -- Added Java destination generator. - -## 06/3/2021 - -2 new sources: - -- [**Okta**](https://docs.airbyte.com/integrations/sources/okta) -- [**Amazon Seller Partner**](https://docs.airbyte.com/integrations/sources/amazon-seller-partner) - -New features: - -- **MySQL CDC** now only polls for 5 minutes if we haven't received any records \([\#3789](https://github.com/airbytehq/airbyte/pull/3789)\) -- **Python CDK** now supports Python 3.7.X \([\#3692](https://github.com/airbytehq/airbyte/pull/3692)\) -- **File** source: now supports Azure Blob Storage \([\#3660](https://github.com/airbytehq/airbyte/pull/3660)\) - -Bugfixes: - -- **Recurly** source: now uses type `number` instead of `integer` \([\#3769](https://github.com/airbytehq/airbyte/pull/3769)\) -- **Stripe** source: fix types in schema \([\#3744](https://github.com/airbytehq/airbyte/pull/3744)\) -- **Stripe** source: output `number` instead of `int` \([\#3728](https://github.com/airbytehq/airbyte/pull/3728)\) -- **MSSQL** destination: fix issue with unicode symbols handling \([\#3671](https://github.com/airbytehq/airbyte/pull/3671)\) - -## 05/25/2021 - -4 new sources: - -- [**Asana**](https://docs.airbyte.com/integrations/sources/asana) -- [**Klaviyo**](https://docs.airbyte.com/integrations/sources/klaviyo) -- [**Recharge**](https://docs.airbyte.com/integrations/sources/recharge) -- [**Tempo**](https://docs.airbyte.com/integrations/sources/tempo) - -Progress on connectors: - -- **CDC for MySQL** is now available! 
-- **Sendgrid** source: now supports incremental sync, rewritten using the HTTP CDK \([\#3445](https://github.com/airbytehq/airbyte/pull/3445)\)
-- **Github** source bugfix: fixed an exception when parsing null date values; use `created_at` as the cursor value for issue_milestones \([\#3314](https://github.com/airbytehq/airbyte/pull/3314)\)
-- **Slack** source bugfix: don't overwrite thread_ts in threads stream \([\#3483](https://github.com/airbytehq/airbyte/pull/3483)\)
-- **Facebook Marketing** source: allow configuring insights lookback window \([\#3396](https://github.com/airbytehq/airbyte/pull/3396)\)
-- **Freshdesk** source: fix discovery \([\#3591](https://github.com/airbytehq/airbyte/pull/3591)\)
-
-## 05/18/2021
-
-1 new destination: [**MSSQL**](https://docs.airbyte.com/integrations/destinations/mssql)
-
-1 new source: [**ClickHouse**](https://docs.airbyte.com/integrations/sources/clickhouse)
-
-Progress on connectors:
-
-- **Shopify**: make this source more resilient to timeouts \([\#3409](https://github.com/airbytehq/airbyte/pull/3409)\)
-- **Freshdesk** bugfix: output correct schema for various streams \([\#3376](https://github.com/airbytehq/airbyte/pull/3376)\)
-- **Iterable**: update to use latest version of CDK \([\#3378](https://github.com/airbytehq/airbyte/pull/3378)\)
-
-## 05/11/2021
-
-1 new destination: [**MySQL**](https://docs.airbyte.com/integrations/destinations/mysql)
-
-2 new sources:
-
-- [**Google Search Console**](https://docs.airbyte.com/integrations/sources/google-search-console)
-- [**PokeAPI**](https://docs.airbyte.com/integrations/sources/pokeapi) \(talking about long tail and having fun ;\)\)
-
-Progress on connectors:
-
-- **Zoom**: bugfix on declaring correct types to match data coming from API \([\#3159](https://github.com/airbytehq/airbyte/pull/3159)\), thanks to [vovavovavovavova](https://github.com/vovavovavovavova)
-- **Smartsheets**: bugfix on gracefully handling empty cell values \([\#3337](https://github.com/airbytehq/airbyte/pull/3337)\), thanks to [Nathan Nowack](https://github.com/zzstoatzz)
-- **Stripe**: fix date property name, only add connected account header when set, and set primary key \(\#3210\), thanks to [Nathan Yergler](https://github.com/nyergler)
-
-## 05/04/2021
-
-2 new sources:
-
-- [**Smartsheets**](https://docs.airbyte.com/integrations/sources/smartsheets), thanks to [Nathan Nowack](https://github.com/zzstoatzz)
-- [**Zendesk Chat**](https://docs.airbyte.com/integrations/sources/zendesk-chat)
-
-Progress on connectors:
-
-- **Appstore**: bugfix private key handling in the UI \([\#3201](https://github.com/airbytehq/airbyte/pull/3201)\)
-- **Facebook marketing**: Wait longer \(5 min\) for async jobs to start \([\#3116](https://github.com/airbytehq/airbyte/pull/3116)\), thanks to [Max Krog](https://github.com/MaxKrog)
-- **Stripe**: support reading data from connected accounts \(\#3121\), and 2 new streams with Refunds & Bank Accounts \([\#3030](https://github.com/airbytehq/airbyte/pull/3030)\) \([\#3086](https://github.com/airbytehq/airbyte/pull/3086)\)
-- **Redshift destination**: Ignore records that are too big \(instead of failing\) \([\#2988](https://github.com/airbytehq/airbyte/pull/2988)\)
-- **MongoDB**: add support for TLS and Replica Sets \([\#3111](https://github.com/airbytehq/airbyte/pull/3111)\)
-- **HTTP sources**: bugfix on handling array responses gracefully \([\#3008](https://github.com/airbytehq/airbyte/pull/3008)\)
-
-## 04/27/2021
-
-- **Zendesk Talk**: fix normalization failure 
\([\#3022](https://github.com/airbytehq/airbyte/pull/3022)\), thanks to [yevhenii-ldv](https://github.com/yevhenii-ldv)
-- **Github**: pull_requests stream now only syncs incrementally \([\#2886](https://github.com/airbytehq/airbyte/pull/2886)\) \([\#3009](https://github.com/airbytehq/airbyte/pull/3009)\), thanks to [Zirochkaa](https://github.com/Zirochkaa)
-- Create streaming writes to a file and manage the issuance of copy commands for the destination \([\#2921](https://github.com/airbytehq/airbyte/pull/2921)\)
-- **Redshift**: make Redshift part size configurable. \([\#3053](https://github.com/airbytehq/airbyte/pull/3053)\)
-- **HubSpot**: fix argument error in log call \([\#3087](https://github.com/airbytehq/airbyte/pull/3087)\), thanks to [Nathan Yergler](https://github.com/nyergler)
-
-## 04/20/2021
-
-3 new source connectors!
-
-- [**Zendesk Talk**](https://docs.airbyte.com/integrations/sources/zendesk-talk)
-- [**Iterable**](https://docs.airbyte.com/integrations/sources/iterable)
-- [**QuickBooks**](https://docs.airbyte.com/integrations/sources/quickbooks-singer)
-
-Other progress on connectors:
-
-- **Postgres source/destination**: add SSL option, thanks to [Marcos Marx](https://github.com/marcosmarxm) \([\#2757](https://github.com/airbytehq/airbyte/pull/2757)\)
-- **Google sheets bugfix**: handle duplicate sheet headers, thanks to [Aneesh Makala](https://github.com/makalaaneesh) \([\#2905](https://github.com/airbytehq/airbyte/pull/2905)\)
-- **Source Google Adwords**: support specifying the lookback window for conversions, thanks to [Harshith Mullapudi](https://github.com/harshithmullapudi) \([\#2918](https://github.com/airbytehq/airbyte/pull/2918)\)
-- **MongoDB improvement**: speed up mongodb schema discovery, thanks to [Yury Koleda](https://github.com/FUT) \([\#2851](https://github.com/airbytehq/airbyte/pull/2851)\)
-- **MySQL bugfix**: parsing Mysql jdbc params, thanks to [Vasily Safronov](https://github.com/gingeard) \([\#2891](https://github.com/airbytehq/airbyte/pull/2891)\)
-- **CSV bugfix**: discovery takes too much memory \([\#2089](https://github.com/airbytehq/airbyte/pull/2851)\)
-- A lot of work was done on improving the standard tests for the connectors, for better standardization and maintenance!
-
-## 04/13/2021
-
-- New connector: [**Oracle DB**](https://docs.airbyte.com/integrations/sources/oracle), thanks to [Marcos Marx](https://github.com/marcosmarxm)
-
-## 04/07/2021
-
-- New connector: [**Google Workspace Admin Reports**](https://docs.airbyte.com/integrations/sources/google-workspace-admin-reports) \(audit logs\)
-- Bugfix in the base python connector library that caused errors to be silently skipped rather than failing the sync
-- **Exchangeratesapi.io** bugfix: now points to the updated API URL
-- **Redshift destination** bugfix: quote keywords “DATETIME” and “TIME” when used as identifiers
-- **GitHub** bugfix: syncs no longer fail when a personal repository doesn’t have collaborators or team streams available
-- **Mixpanel** connector: sync at most the last 90 days of data in the annotations stream to adhere to API limits
-
-## 03/29/2021
-
-- We started measuring throughput of connectors. This will help us improve throughput across all connectors.
-- **Redshift**: implemented Copy strategy to improve its throughput.
-- **Instagram**: fixed an issue which caused media and media_insights streams to stop syncing prematurely.
-- Support NCHAR and NVCHAR types in SQL-based database sources. 
-- Add the ability to specify custom JDBC parameters for the MySQL source connector.
-
-## 03/22/2021
-
-- 2 new source connectors: [**Gitlab**](https://docs.airbyte.com/integrations/sources/gitlab) and [**Airbyte-native HubSpot**](https://docs.airbyte.com/integrations/sources/hubspot)
-- Developing connectors now requires almost no interaction with Gradle, Airbyte’s monorepo build tool. If you’re building a Python connector, you never have to worry about developing outside your typical flow. See [the updated documentation](https://docs.airbyte.com/connector-development).
-
-## 03/15/2021
-
-- 2 new source connectors: [**Instagram**](https://docs.airbyte.com/integrations/sources/instagram) and [**Google Directory**](https://docs.airbyte.com/integrations/sources/google-directory)
-- **Facebook Marketing**: support API v10
-- **Google Analytics**: support incremental sync
-- **Jira**: bug fix to consistently pull all tickets
-- **HTTP Source**: bug fix to consistently parse JSON responses
-
-## 03/08/2021
-
-- 1 new source connector: **MongoDB**
-- **Google Analytics**: support chunked syncs to avoid sampling
-- **AppStore**: fix a bug where the catalog was displayed incorrectly
-
-## 03/01/2021
-
-- **New native HubSpot connector** with schema folder populated
-- Facebook Marketing connector: add an option to include deleted records
-
-## 02/22/2021
-
-- Bug fixes:
-  - **Google Analytics:** add the ability to sync custom reports
-  - **Apple Appstore:** bug fix to correctly run incremental syncs
-  - **Exchange rates:** UI now correctly validates the input date pattern
-  - **File Source:** support JSONL \(newline-delimited JSON\) format
-  - **Freshdesk:** enable controlling how many requests per minute the connector makes, to avoid exceeding rate limits
-
-## 02/15/2021
-
-- 1 new destination connector: [MeiliSearch](https://docs.airbyte.com/integrations/destinations/meilisearch)
-- 2 new sources that support incremental append: [Freshdesk](https://docs.airbyte.com/integrations/sources/freshdesk) and [Sendgrid](https://docs.airbyte.com/integrations/sources/sendgrid)
-- Other fixes:
-  - Thanks to [@ns-admetrics](https://github.com/ns-admetrics) for contributing an upgrade to the **Shopify** source connector, which now provides the landing_site field containing UTM parameters in the Orders table.
-  - **Sendgrid** source connector supports most endpoints available in the API
-  - **Facebook** source connector now supports syncing Ad Insights data
-  - **Freshdesk** source connector now supports syncing satisfaction ratings and conversations
-  - **Microsoft Teams** source connector now gracefully handles rate limiting
-  - Bug fix in **Slack** source where the last few records in a sync were sporadically dropped
-  - Bug fix in **Google Analytics** source where the last few records in a sync were sporadically dropped
-  - In **Redshift source**, support non-alphanumeric table names
-  - Bug fix in **Github Source** for instances where syncs didn’t always fail if there was an error while reading data from the API
-
-## 02/02/2021
-
-- Sources that we improved reliability for \(and that became “certified”\):
-  - [Certified sources](https://docs.airbyte.com/integrations): Files and Shopify
-  - Enhanced continuous testing for Tempo and Looker sources
-- Other fixes / features:
-  - Correctly handle boolean types in the File Source
-  - Add docs for the [App Store](https://docs.airbyte.com/integrations/sources/appstore) source
-  - Fix a bug in the Snowflake destination where the connector didn’t check for all needed write permissions, causing some syncs to fail
-
-## 01/26/2021
-
-- Improved reliability with our best practices on: Google Sheets, Google Ads, Marketo, Tempo
-- Support incremental syncs for Facebook and Google Ads
-- The Facebook connector now supports the FB Marketing API v9
-
-## 01/19/2021
-
-- **Our new** [**Connector Health Grade**](../../integrations/) **page**
-- **1 new source:** App Store \(thanks to [@Muriloo](https://github.com/Muriloo)\)
-- Fixes on connectors:
-  - Bug fix writing boolean columns to Redshift
-  - Bug fix where getting a connector’s input configuration hung indefinitely
-  - Stripe connector now gracefully handles rate limiting from the Stripe API
-
-## 01/12/2021
-
-- **1 new source:** Tempo \(thanks to [@thomasvl](https://github.com/thomasvl)\)
-- **Incremental support for 3 new source connectors:** [Salesforce](../../integrations/sources/salesforce.md), [Slack](../../integrations/sources/slack.md) and [Braintree](../../integrations/sources/braintree.md)
-- Fixes on connectors:
-  - Fix a bug in the MSSQL and Redshift source connectors where custom SQL types weren't being handled correctly.
-  - Improvement of the Snowflake connector from [@hudsondba](https://github.com/hudsondba) \(batch size and sync timeout\)
-
-## 01/05/2021
-
-- **Incremental support for 2 new source connectors:** [Mixpanel](../../integrations/sources/mixpanel.md) and [HubSpot](../../integrations/sources/hubspot.md)
-- Fixes on connectors:
-  - Fixed a bug in the GitHub connector where the connector didn’t verify that the provided API token was granted the correct permissions
-  - Fixed a bug in the Google Sheets connector where rate limits were not always respected
-  - Alpha version of the Facebook Marketing API v9 connector. This is a native Airbyte connector \(the current one is Singer-based\).
-
-## 12/30/2020
-
-**New sources:** [Plaid](../../integrations/sources/plaid.md) \(contributed by [tgiardina](https://github.com/tgiardina)\), [Looker](../../integrations/sources/looker.md)
-
-## 12/18/2020
-
-**New sources:** [Drift](../../integrations/sources/drift.md), [Microsoft Teams](../../integrations/sources/microsoft-teams.md)
-
-## 12/10/2020
-
-**New sources:** [Intercom](../../integrations/sources/intercom.md), [Mixpanel](../../integrations/sources/mixpanel.md), [Jira Cloud](../../integrations/sources/jira.md), [Zoom](../../integrations/sources/zoom.md)
-
-## 12/07/2020
-
-**New sources:** [Slack](../../integrations/sources/slack.md), [Braintree](../../integrations/sources/braintree.md), [Zendesk Support](../../integrations/sources/zendesk-support.md)
-
-## 12/04/2020
-
-**New sources:** [Redshift](../../integrations/sources/redshift.md), [Greenhouse](../../integrations/sources/greenhouse.md) **New destination:** [Redshift](../../integrations/destinations/redshift.md)
-
-## 11/30/2020
-
-**New sources:** [Freshdesk](../../integrations/sources/freshdesk.md), [Twilio](../../integrations/sources/twilio.md)
-
-## 11/25/2020
-
-**New source:** [Recurly](../../integrations/sources/recurly.md)
-
-## 11/23/2020
-
-**New source:** [Sendgrid](../../integrations/sources/sendgrid.md)
-
-## 11/18/2020
-
-**New source:** [Mailchimp](../../integrations/sources/mailchimp.md)
-
-## 11/13/2020
-
-**New source:** [MSSQL](../../integrations/sources/mssql.md)
-
-## 11/11/2020
-
-**New source:** [Shopify](../../integrations/sources/shopify.md)
-
-## 11/09/2020
-
-**New sources:** [Files \(CSV, JSON, HTML...\)](../../integrations/sources/file.md)
-
-## 11/04/2020
-
-**New sources:** [Facebook Ads](connectors.md), [Google Ads](../../integrations/sources/google-ads.md), [Marketo](../../integrations/sources/marketo.md) **New destination:** [Snowflake](../../integrations/destinations/snowflake.md)
-
-## 10/30/2020
-
-**New sources:** [Salesforce](../../integrations/sources/salesforce.md), Google Analytics, [HubSpot](../../integrations/sources/hubspot.md), [GitHub](../../integrations/sources/github.md), [Google Sheets](../../integrations/sources/google-sheets.md), [Rest APIs](connectors.md), and [MySQL](../../integrations/sources/mysql.md)
-
-## 10/21/2020
-
-**New destinations:** we built our own connectors for [BigQuery](../../integrations/destinations/bigquery.md) and [Postgres](../../integrations/destinations/postgres.md), to ensure they are of the highest quality.
-
-## 09/23/2020
-
-**New sources:** [Stripe](../../integrations/sources/stripe.md), [Postgres](../../integrations/sources/postgres.md) **New destinations:** [BigQuery](../../integrations/destinations/bigquery.md), [Postgres](../../integrations/destinations/postgres.md), [local CSV](../../integrations/destinations/csv.md)
diff --git a/docs/archive/changelog/platform.md b/docs/archive/changelog/platform.md
deleted file mode 100644
index f4429eb1b852..000000000000
--- a/docs/archive/changelog/platform.md
+++ /dev/null
@@ -1,509 +0,0 @@
----
-description: Be sure to not miss out on new features and improvements!
----
-
-# Platform
-
-This is the changelog for Airbyte Platform. For our connector changelog, please visit our [Connector Changelog](connectors.md) page.
-
-## [20-12-2021 - 0.32.5](https://github.com/airbytehq/airbyte/releases/tag/v0.32.5-alpha)
-* Add an endpoint to indicate that feedback has been given after the first sync.
-
-## [18-12-2021 - 0.32.4](https://github.com/airbytehq/airbyte/releases/tag/v0.32.4-alpha)
-* No major changes to Airbyte Core.
-
-## [18-12-2021 - 0.32.3](https://github.com/airbytehq/airbyte/releases/tag/v0.32.3-alpha)
-* No major changes to Airbyte Core.
-
-## [18-12-2021 - 0.32.2](https://github.com/airbytehq/airbyte/releases/tag/v0.32.2-alpha)
-* Improve error handling when additional sources/destinations cannot be read.
-* Implement connector config dependency for OAuth consent URL.
-* Treat oauthFlowInitParameters as hidden instead of removing them.
-* Stop using gentle close with heartbeat.
-
-## [17-12-2021 - 0.32.1](https://github.com/airbytehq/airbyte/releases/tag/v0.32.1-alpha)
-* Add existing source and destination dropdowns to the new connection flow form.
-* Implement protocol change for OAuth outputs.
-* Enhance API for use by cloud to provide per-connector billing info.
-
-## [11-12-2021 - 0.32.0](https://github.com/airbytehq/airbyte/releases/tag/v0.32.0-alpha)
-* This is a **MAJOR** version update. You need to [update to this version](../../operator-guides/upgrading-airbyte.md#mandatory-intermediate-upgrade) before updating to any version newer than `0.32.0`.
-
-## [11-11-2021 - 0.31.0](https://github.com/airbytehq/airbyte/releases/tag/v0.31.0-alpha)
-* No major changes to Airbyte Core.
-
-## [11-11-2021 - 0.30.39](https://github.com/airbytehq/airbyte/releases/tag/v0.30.39-alpha)
-* We migrated our secret management to Google Secret Manager, allowing us to scale how many connectors we support.
-
-## [11-09-2021 - 0.30.37](https://github.com/airbytehq/airbyte/releases/tag/v0.30.37-alpha)
-* No major changes to Airbyte Core.
-
-## [11-09-2021 - 0.30.36](https://github.com/airbytehq/airbyte/releases/tag/v0.30.36-alpha)
-* No major changes to Airbyte Core.
-
-## [11-08-2021 - 0.30.35](https://github.com/airbytehq/airbyte/releases/tag/v0.30.35-alpha)
-* No major changes to Airbyte Core.
-
-## [11-06-2021 - 0.30.34](https://github.com/airbytehq/airbyte/releases/tag/v0.30.34-alpha)
-* No major changes to Airbyte Core.
-
-## [11-06-2021 - 0.30.33](https://github.com/airbytehq/airbyte/releases/tag/v0.30.33-alpha)
-* No major changes to Airbyte Core.
-
-## [11-05-2021 - 0.30.32](https://github.com/airbytehq/airbyte/releases/tag/v0.30.32-alpha)
-* Airbyte Server no longer crashes from having too many open files.
-
-## [11-04-2021 - 0.30.31](https://github.com/airbytehq/airbyte/releases/tag/v0.30.31-alpha)
-* No major changes to Airbyte Core.
-
-## [11-01-2021 - 0.30.25](https://github.com/airbytehq/airbyte/releases/tag/v0.30.25-alpha)
-* No major changes to Airbyte Core.
-
-## [11-01-2021 - 0.30.24](https://github.com/airbytehq/airbyte/releases/tag/v0.30.24-alpha)
-* Incremental normalization is live. Basic normalization no longer runs on already normalized data, making it way faster and cheaper.
-
-## [11-01-2021 - 0.30.23](https://github.com/airbytehq/airbyte/releases/tag/v0.30.23-alpha)
-* No major changes to Airbyte Core.
-
-## [10-21-2021 - 0.30.22](https://github.com/airbytehq/airbyte/releases/tag/v0.30.22-alpha)
-* We now support experimental deployment of Airbyte on MacBooks with M1 chips!
-
-:::info
-
-This interim patch period mostly contained stability changes for Airbyte Cloud, so we skipped from `0.30.16` to `0.30.22`.
-
-:::
-
-## [10-07-2021 - 0.30.16](https://github.com/airbytehq/airbyte/releases/tag/v0.30.16-alpha)
-* On Kubernetes deployments, you can now configure the Airbyte Worker Pod's image pull policy.
- -:::info - -This interim patch period mostly contained stability changes for Airbyte Cloud, so we skipped from `0.30.2` to `0.30.16`. - -::: - -## [09-30-2021 - 0.30.2](https://github.com/airbytehq/airbyte/releases/tag/v0.30.2-alpha) -* Fixed a bug that would fail Airbyte upgrades for deployments with sync notifications. - -## [09-24-2021 - 0.29.22](https://github.com/airbytehq/airbyte/releases/tag/v0.29.22-alpha) -* We now have integration tests for SSH. - -## [09-19-2021 - 0.29.21](https://github.com/airbytehq/airbyte/releases/tag/v0.29.21-alpha) -* You can now [deploy Airbyte on Kubernetes with a Helm Chart](https://github.com/airbytehq/airbyte/pull/5891)! - -## [09-16-2021 - 0.29.19](https://github.com/airbytehq/airbyte/releases/tag/v0.29.19-alpha) -* Fixes a breaking bug that prevents Airbyte upgrading from older versions. - -## [09-15-2021 - 0.29.18](https://github.com/airbytehq/airbyte/releases/tag/v0.29.18-alpha) -* Building images is now optional in the CI build. - -## [09-08-2021 - 0.29.17](https://github.com/airbytehq/airbyte/releases/tag/v0.29.17-alpha) - -* You can now properly cancel deployments when deploying on K8s. - -## [09-08-2021 - 0.29.16](https://github.com/airbytehq/airbyte/releases/tag/v0.29.16-alpha) - -* You can now send notifications via webhook for successes and failures on Airbyte syncs. -* Scheduling jobs and worker jobs are now separated, allowing for workers to be scaled horizontally. - -## [09-04-2021 - 0.29.15](https://github.com/airbytehq/airbyte/releases/tag/v0.29.15-alpha) - -* Fixed a bug that made it possible for connector definitions to be duplicated, violating uniqueness. - -## [09-02-2021 - 0.29.14](https://github.com/airbytehq/airbyte/releases/tag/v0.29.14-alpha) - -* Nothing of note. - -## [08-27-2021 - 0.29.13](https://github.com/airbytehq/airbyte/releases/tag/v0.29.13-alpha) - -* The scheduler now waits for the server before it creates any databases. -* You can now apply tolerations for Airbyte Pods on K8s deployments. - -## [08-23-2021 - 0.29.12](https://github.com/airbytehq/airbyte/releases/tag/v0.29.12-alpha) - -* Syncs now have a `max_sync_timeout` that times them out after 3 days. -* Fixed Kube deploys when logging with Minio. - -## [08-20-2021 - 0.29.11](https://github.com/airbytehq/airbyte/releases/tag/v0.29.11-alpha) - -* Nothing of note. - -## [08-20-2021 - 0.29.10](https://github.com/airbytehq/airbyte/releases/tag/v0.29.10-alpha) - -* Migration of Python connector template images to Alpine Docker images to reduce size. - -## [08-20-2021 - 0.29.9](https://github.com/airbytehq/airbyte/releases/tag/v0.29.9-alpha) - -* Nothing of note. - -## [08-17-2021 - 0.29.8](https://github.com/airbytehq/airbyte/releases/tag/v0.29.8-alpha) - -* Nothing of note. - -## [08-14-2021 - 0.29.7](https://github.com/airbytehq/airbyte/releases/tag/v0.29.7-alpha) - -* Re-release: Fixed errant ENV variable in `0.29.6` - -## [08-14-2021 - 0.29.6](https://github.com/airbytehq/airbyte/releases/tag/v0.29.6-alpha) - -* Connector pods no longer fail with edge case names for the associated Docker images. - -## [08-14-2021 - 0.29.5](https://github.com/airbytehq/airbyte/releases/tag/v0.29.5-alpha) - -* Nothing of note. - -## [08-12-2021 - 0.29.4](https://github.com/airbytehq/airbyte/releases/tag/v0.29.4-alpha) - -* Introduced implementation for date-time support in normalization. - -## [08-9-2021 - 0.29.3](https://github.com/airbytehq/airbyte/releases/tag/v0.29.3-alpha) - -* Importing configuration no longer removes available but unused connectors. 
-
-## [08-6-2021 - 0.29.2](https://github.com/airbytehq/airbyte/releases/tag/v0.29.2-alpha)
-
-* Fixed nil pointer exception in version migrations.
-
-## [07-29-2021 - 0.29.1](https://github.com/airbytehq/airbyte/releases/tag/v0.29.1-alpha)
-
-* When migrating, types represented in the config archive need to be a subset of the types declared in the schema.
-
-## [07-28-2021 - 0.29.0](https://github.com/airbytehq/airbyte/releases/tag/v0.29.0-alpha)
-
-* Deprecated `DEFAULT_WORKSPACE_ID`; the default workspace no longer exists by default.
-
-## [07-28-2021 - 0.28.2](https://github.com/airbytehq/airbyte/releases/tag/v0.28.2-alpha)
-
-* Backend now handles workspaceId for WebBackend operations.
-
-## [07-26-2021 - 0.28.1](https://github.com/airbytehq/airbyte/releases/tag/v0.28.1-alpha)
-
-* K8s: Overly-sensitive logs are now silenced.
-
-## [07-22-2021 - 0.28.0](https://github.com/airbytehq/airbyte/releases/tag/v0.28.0-alpha)
-
-* Acceptance test dependencies fixed.
-
-## [07-22-2021 - 0.27.5](https://github.com/airbytehq/airbyte/releases/tag/v0.27.5-alpha)
-
-* Fixed unreliable logging on Kubernetes deployments.
-* Introduced pre-commit to auto-format files on commits.
-
-## [07-21-2021 - 0.27.4](https://github.com/airbytehq/airbyte/releases/tag/v0.27.4-alpha)
-
-* Config persistence is now migrated to the internal Airbyte database.
-* Source connector ports now properly close when deployed on Kubernetes.
-* Missing dependencies added that allow acceptance tests to run.
-
-## [07-15-2021 - 0.27.3](https://github.com/airbytehq/airbyte/releases/tag/v0.27.3-alpha)
-
-* Fixed some minor API spec errors.
-
-## [07-12-2021 - 0.27.2](https://github.com/airbytehq/airbyte/releases/tag/v0.27.2-alpha)
-
-* GCP environment variable is now stubbed out to prevent noisy and harmless errors.
-
-## [07-8-2021 - 0.27.1](https://github.com/airbytehq/airbyte/releases/tag/v0.27.1-alpha)
-
-* New API endpoint: List workspaces
-* K8s: The server no longer starts up before Temporal is ready.
-* Silent source failures introduced by the last patch now throw exceptions.
-
-## [07-1-2021 - 0.27.0](https://github.com/airbytehq/airbyte/releases/tag/v0.27.0-alpha)
-
-* Airbyte now automatically upgrades on server startup!
-  * Airbyte will check whether your `.env` Airbyte version is compatible with the Airbyte version in the database and upgrade accordingly.
-* When running Airbyte on K8s, logs will automatically be stored in a Minio bucket unless configured otherwise.
-* CDC for MySQL now handles decimal types correctly.
-
-## [06-21-2021 - 0.26.2](https://github.com/airbytehq/airbyte/releases/tag/v0.26.2-alpha)
-
-* First-class Kubernetes support!
-
-## [06-16-2021 - 0.26.0](https://github.com/airbytehq/airbyte/releases/tag/v0.26.0-alpha)
-
-* Custom dbt transformations!
-* You can now configure your destination namespace at the table level when setting up a connection!
-* Migrate basic normalization settings to the sync operations.
-
-## [06-09-2021 - 0.24.8 / 0.25.0](https://github.com/airbytehq/airbyte/releases/tag/v0.24.8-alpha)
-
-* Bugfix: Handle TINYINT\(1\) and BOOLEAN correctly and fix target file comparison for MySQL CDC.
-* Bugfix: Updating the source/destination name in the UI now works as intended.
-
-## [06-04-2021 - 0.24.7](https://github.com/airbytehq/airbyte/releases/tag/v0.24.7-alpha)
-
-* Bugfix: Ensure that logs from threads created by replication workers are added to the log file.
-
-## [06-03-2021 - 0.24.5](https://github.com/airbytehq/airbyte/releases/tag/v0.24.5-alpha)
-
-* Remove hash from table names when it's not necessary for normalization outputs.
-
-## [06-03-2021 - 0.24.4](https://github.com/airbytehq/airbyte/releases/tag/v0.24.4-alpha)
-
-* PythonCDK: change minimum Python version to 3.7.0
-
-## [05-28-2021 - 0.24.3](https://github.com/airbytehq/airbyte/releases/tag/v0.24.3-alpha)
-
-* Minor fixes to documentation
-* Reliability updates in preparation for custom transformations
-* Limit Docker log size to 500 MB \([\#3702](https://github.com/airbytehq/airbyte/pull/3702)\)
-
-## [05-26-2021 - 0.24.2](https://github.com/airbytehq/airbyte/releases/tag/v0.24.2-alpha)
-
-* Fix for file names being too long in Windows deployments \([\#3625](https://github.com/airbytehq/airbyte/pull/3625)\)
-* Allow users to access the API and WebApp from the same port \([\#3603](https://github.com/airbytehq/airbyte/pull/3603)\)
-
-## [05-25-2021 - 0.24.1](https://github.com/airbytehq/airbyte/releases/tag/v0.24.1-alpha)
-
-* **Checkpointing for incremental syncs**: they will now continue where they left off even if they fail! \([\#3290](https://github.com/airbytehq/airbyte/pull/3290)\)
-
-## [05-25-2021 - 0.24.0](https://github.com/airbytehq/airbyte/releases/tag/v0.24.0-alpha)
-
-* Avoid dbt runtime exception "maximum recursion depth exceeded" in ephemeral materialization \([\#3470](https://github.com/airbytehq/airbyte/pull/3470)\)
-
-## [05-18-2021 - 0.23.0](https://github.com/airbytehq/airbyte/releases/tag/v0.23.0-alpha)
-
-* Documentation to deploy locally on Windows is now available \([\#3425](https://github.com/airbytehq/airbyte/pull/3425)\)
-* Connector icons are now displayed in the UI
-* Automatically restart core containers if they fail \([\#3423](https://github.com/airbytehq/airbyte/pull/3423)\)
-* Progress on supporting custom transformations using dbt. More updates on this soon!
-
-## [05-11-2021 - 0.22.3](https://github.com/airbytehq/airbyte/releases/tag/v0.22.3-alpha)
-
-* Bump K8s deployment version to the latest stable version, thanks to [Coetzee van Staden](https://github.com/coetzeevs)
-* Added tutorial to deploy Airbyte on an Azure VM \([\#3171](https://github.com/airbytehq/airbyte/pull/3171)\), thanks to [geekwhocodes](https://github.com/geekwhocodes)
-* Progress on checkpointing to better support rate limits
-* Upgrade normalization to use dbt from Docker images \([\#3186](https://github.com/airbytehq/airbyte/pull/3186)\)
-
-## [05-04-2021 - 0.22.2](https://github.com/airbytehq/airbyte/releases/tag/v0.22.2-alpha)
-
-* Split replication and normalization into separate Temporal activities \([\#3136](https://github.com/airbytehq/airbyte/pull/3136)\)
-* Fix normalization nesting bug \([\#3110](https://github.com/airbytehq/airbyte/pull/3110)\)
-
-## [04-27-2021 - 0.22.0](https://github.com/airbytehq/airbyte/releases/tag/v0.22.0-alpha)
-
-* **Replace timeout for sources** \([\#3031](https://github.com/airbytehq/airbyte/pull/3031)\)
-* Fix UI issue where tables with the same name are selected together \([\#3032](https://github.com/airbytehq/airbyte/pull/3032)\)
-* Fix feed handling when feeds are unavailable \([\#2964](https://github.com/airbytehq/airbyte/pull/2964)\)
-* Export whitelisted tables \([\#3055](https://github.com/airbytehq/airbyte/pull/3055)\)
-* Create a contributor bootstrap script \(\#3028\) \([\#3054](https://github.com/airbytehq/airbyte/pull/3054)\), thanks to [nclsbayona](https://github.com/nclsbayona)
-
-## [04-20-2021 - 0.21.0](https://github.com/airbytehq/airbyte/releases/tag/v0.21.0-alpha)
-
-* **Namespace support**: supported source-destination pairs will now sync data into the same namespace as the source \(\#2862\)
-* Add **“Refresh Schema”** button \([\#2943](https://github.com/airbytehq/airbyte/pull/2943)\)
-* In the Settings, you can now **add a webhook to get notified when a sync fails**
-* Add destinationSyncModes to the connection form
-* Add tooltips for connection status icons
-
-## [04-12-2021 - 0.20.0](https://github.com/airbytehq/airbyte/releases/tag/v0.20.0-alpha)
-
-* **Change Data Capture \(CDC\)** is now supported for Postgres, thanks to [@jrhizor](https://github.com/jrhizor) and [@cgardens](https://github.com/cgardens). We will now expand it to MySQL and MSSQL in the coming weeks.
-* When displaying the schema for a source, you can now search for table names, thanks to [@jamakase](https://github.com/jamakase)
-* Better feedback UX when manually triggering a sync with “Sync now”
-
-## [04-07-2021 - 0.19.0](https://github.com/airbytehq/airbyte/releases/tag/v0.19.0-alpha)
-
-* New **Connections** page where you can see the list of all your connections and their statuses.
-* New **Settings** page to update your preferences.
-* Bugfix where very large schemas caused schema discovery to fail.
-
-## [03-29-2021 - 0.18.1](https://github.com/airbytehq/airbyte/releases/tag/v0.18.1-alpha)
-
-* Surface the **health of each connection** so that a user can spot any problems at a glance.
-* Added support for deduplicating records in the destination using a primary key, via incremental dedupe
-* A source’s extraction mode \(incremental, full refresh\) is now decoupled from the destination’s write mode -- so you can repeatedly append full refreshes to get repeated snapshots of data in your source.
-* New **Upgrade all** button in Admin to upgrade all your connectors at once -* New **Cancel** job button in Connections Status page when a sync job is running, so you can stop never-ending processes. - -## [03-22-2021 - 0.17.2](https://github.com/airbytehq/airbyte/releases/tag/v0.17.2-alpha) - -* Improved the speed of get spec, check connection, and discover schema by migrating to the Temporal workflow engine. -* Exposed cancellation for sync jobs in the API \(will be exposed in the UI in the next week!\). -* Bug fix: Fix issue where migration app was OOMing. - -## [03-15-2021 - 0.17.1](https://github.com/airbytehq/airbyte/releases/tag/v0.17.1-alpha) - -* **Creating and deleting multiple workspaces** is now supported via the API. Thanks to [@Samuel Gordalina](https://github.com/gordalina) for contributing this feature! -* Normalization now supports numeric types with precision greater than 32 bits -* Normalization now supports union data types -* Support longform text inputs in the UI for cases where you need to preserve formatting on connector inputs like .pem keys -* Expose the latest available connector versions in the API -* Airflow: published a new [tutorial](https://docs.airbyte.com/operator-guides/using-the-airflow-airbyte-operator/) for how to use the Airbyte operator. Thanks [@Marcos Marx](https://github.com/marcosmarxm) for writing the tutorial! -* Connector Contributions: All connectors now describe how to contribute to them without having to touch Airbyte’s monorepo build system -- just work on the connector in your favorite dev setup! - -## [03-08-2021 - 0.17](https://github.com/airbytehq/airbyte/releases/tag/v0.17.0-alpha) - -* **Integration with Airflow** is here. Thanks to @Marcos Marx, you can now run Airbyte jobs from Airflow directly. A tutorial is on the way and should be coming this week! -* Add a prefix for tables, so that tables with the same name don't clobber each other in the destination - -## [03-01-2021 - 0.16](https://github.com/airbytehq/airbyte/milestone/22?closed=1) - -* We made some progress to address **nested tables in our normalization.** - - Previously, basic normalization would output nested tables as-is and append a number for duplicate tables. For example, Stripe’s nested address fields go from: - - ```text - Address - address_1 - ``` - - To - - ```text - Charges_source_owner_755_address - customers_shipping_c70_address - ``` - - After the change, the parent tables are combined with the name of the nested table to show where the nested table originated. **This is a breaking change for the consumers of nested tables. Consumers will need to update to point at the new tables.** - -## [02-19-2021 - 0.15](https://github.com/airbytehq/airbyte/milestone/22?closed=1) - -* We now handle nested tables with the normalization steps. Check out the video below to see how it works. 
-
-{% embed url="https://youtu.be/I4fngMnkJzY" caption="" %}
-
-## [02-12-2021 - 0.14](https://github.com/airbytehq/airbyte/milestone/21?closed=1)
-
-* Front-end changes:
-  * Display Airbyte's version number
-  * Describe schemas using JsonSchema
-  * Better feedback on buttons
-
-## [Beta launch - 0.13](https://github.com/airbytehq/airbyte/milestone/15?closed=1) - Released 02/02/2021
-
-* Add connector build status dashboard
-* Support Schema Changes in Sources
-* Support Import / Export of Airbyte Data in the Admin section of the UI
-* Bug fixes:
-  * If Airbyte is closed during a sync, the running job is not marked as failed
-  * Airbyte should fail when deployment version doesn't match data version
-  * Upgrade Airbyte Version without losing existing configuration / data
-
-## [0.12-alpha](https://github.com/airbytehq/airbyte/milestone/14?closed=1) - Released 01/20/2021
-
-* Ability to skip onboarding
-* Miscellaneous bug fixes:
-  * A long discovery request causes a timeout in the UI
-  * Out of Memory when replicating a large table from MySQL
-
-## 0.11.2-alpha - Released 01/18/2021
-
-* Increase timeout for long-running catalog discovery operations from 3 minutes to 30 minutes to avoid prematurely failing long-running operations
-
-## 0.11.1-alpha - Released 01/17/2021
-
-### Bugfixes
-
-* Writing boolean columns to Redshift destination now works correctly
-
-## [0.11.0-alpha](https://github.com/airbytehq/airbyte/milestone/12?closed=1) - Delivered 01/14/2021
-
-### New features
-
-* Allow skipping the onboarding flow in the UI
-* Add the ability to reset a connection's schema when the underlying data source schema changes
-
-### Bugfixes
-
-* Fix UI race condition which showed config for the wrong connector when rapidly choosing between different connectors
-* Fix a bug in MSSQL and Redshift source connectors where custom SQL types weren't being handled correctly. [Pull request](https://github.com/airbytehq/airbyte/pull/1576)
-* Support incremental sync for Salesforce, Slack, and Braintree sources
-* Gracefully handle invalid numeric values \(e.g. NaN or Infinity\) in MySQL, MSSQL, and Postgres DB sources
-* Fix flashing red sources/destinations fields after a successful submit
-* Fix a bug which caused getting a connector's specification to hang indefinitely if the connector docker image failed to download
-
-### New connectors
-
-* Tempo
-* Appstore
-
-## [0.10.0](https://github.com/airbytehq/airbyte/milestone/12?closed=1) - delivered on 01/04/2021
-
-* You can now **deploy Airbyte on** [**Kubernetes**](https://docs.airbyte.com/deploying-airbyte/on-kubernetes) \(alpha version\)
-* **Support incremental sync** for Mixpanel and HubSpot sources
-* **Fixes on connectors:**
-  * Fixed a bug in the GitHub connector where the connector didn’t verify that the provided API token was granted the correct permissions
-  * Fixed a bug in the Google Sheets connector where rate limits were not always respected
-  * Alpha version of the Facebook Marketing API v9 connector. This is a native Airbyte connector \(the current one is Singer-based\).
-* **New source:** Plaid \(contributed by [@tgiardina](https://github.com/tgiardina) - thanks Thomas!\)
-
-## [0.9.0](https://github.com/airbytehq/airbyte/milestone/11?closed=1) - delivered on 12/23/2020
-
-* **New chat app from the web app** so you can directly chat with the team for any issues you run into
-* **Debugging** has been made easier in the UI, with checks, discover logs, and sync download logs
-* Support of **Kubernetes in local**.
GKE support will come in the next release.
-* **New source:** Looker
-
-## [0.8.0](https://github.com/airbytehq/airbyte/milestone/10?closed=1) - delivered on 12/17/2020
-
-* **Incremental - Append**
-  * We now allow sources to replicate only new or modified data. This avoids re-fetching data that you have already replicated from a source.
-  * The delta from a sync will be _appended_ to the existing data in the data warehouse.
-  * Here are [all the details of this feature](/using-airbyte/core-concepts/sync-modes/incremental-append.md).
-  * It has been released for 15 connectors, including Postgres, MySQL, Intercom, Zendesk, Stripe, Twilio, Marketo, Shopify, GitHub, and all the destination connectors. We will expand it to all the connectors in the next couple of weeks.
-* **Other features:**
-  * Improve interface for writing Python sources \(should make writing new Python sources easier and clearer\).
-  * Add support for running Standard Source Tests with files \(making them easy to run for any language a source is written in\)
-  * Add ability to reset data for a connection.
-* **Bug fixes:**
-  * Update version of test containers we use to avoid pull issues while running tests.
-  * Fix issue where jobs were not sorted by creation time in the connection detail view.
-* **New sources:** Intercom, Mixpanel, Jira Cloud, Zoom, Drift, Microsoft Teams
-
-## [0.7.0](https://github.com/airbytehq/airbyte/milestone/8?closed=1) - delivered on 12/07/2020
-
-* **New destination:** our own **Redshift** warehouse connector. You can also use this connector for Panoply.
-* **New sources**: 8 additional source connectors including Recurly, Twilio, Freshdesk, Greenhouse, Redshift \(source\), Braintree, Slack, Zendesk Support
-* Bug fixes
-
-## [0.6.0](https://github.com/airbytehq/airbyte/milestone/6?closed=1) - delivered on 11/23/2020
-
-* Support **multiple destinations**
-* **New source:** Sendgrid
-* Support **basic normalization**
-* Bug fixes
-
-## [0.5.0](https://github.com/airbytehq/airbyte/milestone/5?closed=1) - delivered on 11/18/2020
-
-* **New sources:** 10 additional source connectors, including Files \(CSV, HTML, JSON...\), Shopify, MSSQL, Mailchimp
-
-## [0.4.0](https://github.com/airbytehq/airbyte/milestone/4?closed=1) - delivered on 11/04/2020
-
-Here is what we are working on right now:
-
-* **New destination**: our own **Snowflake** warehouse connector
-* **New sources:** Facebook Ads, Google Ads.
-
-## [0.3.0](https://github.com/airbytehq/airbyte/milestone/3?closed=1) - delivered on 10/30/2020
-
-* **New sources:** Salesforce, GitHub, Google Sheets, Google Analytics, HubSpot, Rest APIs, and MySQL
-* Integration test suite for sources
-* Improve build speed
-
-## [0.2.0](https://github.com/airbytehq/airbyte/milestone/2?closed=1) - delivered on 10/21/2020
-
-* **a new Admin section** to enable users to add their own connectors, in addition to upgrading the ones they currently use
-* improve the developer experience \(DX\) for **contributing new connectors** with additional documentation and a connector protocol
-* our own **BigQuery** warehouse connector
-* our own **Postgres** warehouse connector
-* simplify the process of supporting new Singer taps, ideally making it a 1-day process
-
-## [0.1.0](https://github.com/airbytehq/airbyte/milestone/1?closed=1) - delivered on 09/23/2020
-
-This is our very first release after 2 months of work.
-
-* **New sources:** Stripe, Postgres
-* **New destinations:** BigQuery, Postgres
-* **Only one destination**: we only support one destination in that 1st release, but you will soon be able to add as many as you need.
-* **Logs & monitoring**: you can now see your detailed logs
-* **Scheduler:** you now have 10 different frequency options for your recurring syncs
-* **Deployment:** you can now deploy Airbyte via a simple Docker image, or directly on AWS and GCP
-* **New website**: this is the day we launch our website - airbyte.io. Let us know what you think
-* **New documentation:** this is the 1st day for our documentation too
-* **New blog:** we published a few articles on our startup journey, but also about our vision of making data integrations a commodity.
-
-Stay tuned, we will have new sources and destinations very soon! Don't hesitate to subscribe to our [newsletter](https://airbyte.io/#subscribe-newsletter) to receive our product updates and community news.
-
diff --git a/docs/archive/examples/README.md b/docs/archive/examples/README.md
deleted file mode 100644
index e62ee1c8eb21..000000000000
--- a/docs/archive/examples/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-# Example Use Cases
-
diff --git a/docs/archive/examples/build-a-slack-activity-dashboard.md b/docs/archive/examples/build-a-slack-activity-dashboard.md
deleted file mode 100644
index b63a2b65babb..000000000000
--- a/docs/archive/examples/build-a-slack-activity-dashboard.md
+++ /dev/null
@@ -1,424 +0,0 @@
----
-description: Using Airbyte and Apache Superset
----
-
-# Build a Slack Activity Dashboard
-
-![](../../.gitbook/assets/46.png)
-
-This article will show how to use [Airbyte](http://airbyte.com) - an open-source data integration platform - and [Apache Superset](https://superset.apache.org/) - an open-source data exploration platform - in order to build a Slack activity dashboard showing:
-
-* Total number of members of a Slack workspace
-* The evolution of the number of Slack workspace members
-* Evolution of weekly messages
-* Evolution of weekly threads created
-* Evolution of messages per channel
-* Members per time zone
-
-Before we get started, let’s take a high-level look at how we are going to build the Slack dashboard using Airbyte and Apache Superset.
-
-1. We will use Airbyte’s Slack connector to get the data off a Slack workspace \(we will be using Airbyte’s own Slack workspace for this tutorial\).
-2. We will save the data into a PostgreSQL database.
-3. Finally, using Apache Superset, we will implement the various metrics we care about.
-
-Got it? Now let’s get started.
-
-## 1. Replicating Data from Slack to Postgres with Airbyte
-
-### a. Deploying Airbyte
-
-There are several easy ways to deploy Airbyte, as listed [here](https://docs.airbyte.com/). For this tutorial, I will just use the [Docker Compose method](https://docs.airbyte.com/deploying-airbyte/local-deployment) from my workstation:
-
-```text
-# In your workstation terminal
-git clone https://github.com/airbytehq/airbyte.git
-cd airbyte
-docker-compose up
-```
-
-The above command will make the Airbyte app available on `localhost:8000`. Visit the URL in your favorite browser, and you should see Airbyte’s dashboard \(if this is your first time, you will be prompted to enter your email to get started\).
-
-If you haven’t set Docker up, follow the [instructions here](https://docs.docker.com/desktop/) to set it up on your machine.
-
-### b. Setting Up Airbyte’s Slack Source Connector
-
-Airbyte’s Slack connector will give us access to the data.
So, we are going to kick things off by setting this connector to be our data source in Airbyte’s web app. I am assuming you already have Airbyte and Docker set up on your local machine. We will be using Docker to create our PostgreSQL database container later on. - -Now, let’s proceed. If you already went through the onboarding, click on the “new source” button at the top right of the Sources section. If you're going through the onboarding, then follow the instructions. - -You will be requested to enter a name for the source you are about to create. You can call it “slack-source”. Then, in the Source Type combo box, look for “Slack,” and then select it. Airbyte will then present the configuration fields needed for the Slack connector. So you should be seeing something like this on the Airbyte App: - -![](../../.gitbook/assets/1.png) - -The first thing you will notice is that this connector requires a Slack token. So, we have to obtain one. If you are not a workspace admin, you will need to ask for permission. - -Let’s walk through how we would get the Slack token we need. - -Assuming you are a workspace admin, open the Slack workspace and navigate to \[Workspace Name\] > Administration > Customize \[Workspace Name\]. In our case, it will be Airbyte > Administration > Customize Airbyte \(as shown below\): - -![](../../.gitbook/assets/2.png) - -In the new page that opens up in your browser, you will then need to navigate to **Configure apps**. - -![](../../.gitbook/assets/3.png) - -In the new window that opens up, click on **Build** in the top right corner. - -![](../../.gitbook/assets/4.png) - -Click on the **Create an App** button. - -![](../../.gitbook/assets/5.png) - -In the modal form that follows, give your app a name - you can name it `airbyte_superset`, then select your workspace from the Development Slack Workspace. - -![](../../.gitbook/assets/6.png) - -Next, click on the **Create App** button. You will then be presented with a screen where we are going to set permissions for our `airbyte_superset` app, by clicking on the **Permissions** button on this page. - -![](../../.gitbook/assets/7.png) - -In the next screen, navigate to the scope section. Then, click on the **Add an OAuth Scope** button. This will allow you to add permission scopes for your app. At a minimum, your app should have the following permission scopes: - -![](../../.gitbook/assets/8.png) - -Then, we are going to add our created app to the workspace by clicking the **Install to Workspace** button. - -![](../../.gitbook/assets/9.png) - -Slack will prompt you that your app is requesting permission to access your workspace of choice. Click Allow. - -![](../../.gitbook/assets/10.png) - -After the app has been successfully installed, you will be navigated to Slack’s dashboard, where you will see the Bot User OAuth Access Token. - -This is the token you will provide back on the Airbyte page, where we dropped off to obtain this token. So make sure to copy it and keep it in a safe place. - -Now that we are done with obtaining a Slack token, let’s go back to the Airbyte page we dropped off and add the token in there. - -We will also need to provide Airbyte with `start_date`. This is the date from which we want Airbyte to start replicating data from the Slack API, and we define that in the format: `YYYY-MM-DDT00:00:00Z`. - -We will specify ours as `2020-09-01T00:00:00Z`. 
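-
-If you'd rather generate a value in this format than type it by hand, a quick shell one-liner works too. This is just a convenience sketch: it assumes a Unix-like shell where the standard `date` utility is available, and it prints the current UTC date in the expected `YYYY-MM-DDT00:00:00Z` shape:
-
-```text
-# Print today's date in the format Airbyte expects for start_date
-date -u +"%Y-%m-%dT00:00:00Z"
-```
-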
We will also tell Airbyte to exclude archived channels, leave out private channels, and join public channels, so the latter part of the form should look like this:
-
-![](../../.gitbook/assets/11.png)
-
-Finally, click on the **Set up source** button for Airbyte to set the Slack source up.
-
-If the source was set up correctly, you will be taken to the destination section of Airbyte’s dashboard, where you will tell Airbyte where to store the replicated data.
-
-### c. Setting Up Airbyte’s Postgres Destination Connector
-
-For our use case, we will be using PostgreSQL as the destination.
-
-Click the **add destination** button in the top right corner, then click on **add a new destination**.
-
-![](../../.gitbook/assets/12.png)
-
-In the next screen, Airbyte will validate the source, and then present you with a form to give your destination a name. We’ll call this destination slack-destination. Then, we will select the Postgres destination type. Your screen should look like this now:
-
-![](../../.gitbook/assets/13.png)
-
-Great! We have a form to enter Postgres connection credentials, but we haven’t set up a Postgres database. Let’s do that!
-
-Since we already have Docker installed, we can spin up a Postgres container with the following command in our terminal:
-
-```text
-docker run --rm --name slack-db -e POSTGRES_PASSWORD=password -p 2000:5432 -d postgres
-```
-
-\(Note that the Docker compose file for Superset ships with a Postgres database, as you can see [here](https://github.com/apache/superset/blob/master/docker-compose.yml#L40)\).
-
-The above command will do the following:
-
-* create a Postgres container with the name slack-db,
-* set the password to password,
-* expose the container’s port 5432 as our machine’s port 2000, and
-* create a database and a user, both called postgres.
-
-With this, we can go back to the Airbyte screen and supply the information needed. Your form should look like this:
-
-![](../../.gitbook/assets/14.png)
-
-Then click on the **Set up destination** button.
-
-### d. Setting Up the Replication
-
-You should now see the following screen:
-
-![](../../.gitbook/assets/15.png)
-
-Airbyte will then fetch the schema for the data coming from the Slack API for your workspace. You should leave all boxes checked and then choose the sync frequency - this is the interval at which Airbyte will sync the data coming from your workspace. Let’s set the sync interval to every 24 hours.
-
-Then click on the **Set up connection** button.
-
-Airbyte will now take you to the destination dashboard, where you will see the destination you just set up. Click on it to see more details about this destination.
-
-![](../../.gitbook/assets/16.png)
-
-You will see Airbyte running the very first sync. Depending on the size of the data Airbyte is replicating, it might take a while before syncing is complete.
-
-![](../../.gitbook/assets/17.png)
-
-When it’s done, you will see the **Running status** change to **Succeeded**, and the size of the data Airbyte replicated as well as the number of records being stored in the Postgres database.
-
-![](../../.gitbook/assets/18.png)
-
-To test if the sync worked, run the following in your terminal:
-
-```text
-docker exec slack-db psql -U postgres -c "SELECT * FROM public.users;"
-```
-
-This should output the rows in the users table.
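-
-The users stream is not the only table the sync creates. For a quick sanity check of everything that landed \(assuming you left the connection at its default settings, so the streams land in the `public` schema\), you can list the tables with psql's `\dt` meta-command:
-
-```text
-# List all tables the sync created in the public schema
-docker exec slack-db psql -U postgres -c "\dt public.*"
-```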
-
-To get the count of rows in the users table as well, you can also run:
-
-```text
-docker exec slack-db psql -U postgres -c "SELECT count(*) FROM public.users;"
-```
-
-Now that we have the data from the Slack workspace in our Postgres destination, we will head on to creating the Slack dashboard with Apache Superset.
-
-## 2. Setting Up Apache Superset for the Dashboards
-
-### a. Installing Apache Superset
-
-Apache Superset, or simply Superset, is a modern data exploration and visualization platform. To get started using it, we will be cloning the Superset repo. Navigate to the directory in your terminal where you want to clone the Superset repo, and run:
-
-```text
-git clone https://github.com/apache/superset.git
-```
-
-It’s recommended to check out the latest branch of Superset, so first enter the repo directory:
-
-```text
-cd superset
-```
-
-And then run:
-
-```text
-git checkout latest
-```
-
-Superset needs you to install and build its frontend dependencies and assets. So, we will start by installing the frontend dependencies:
-
-```text
-npm install
-```
-
-Note: The above command assumes you have both Node and NPM installed on your machine.
-
-Finally, for the frontend, we will build the assets by running:
-
-```text
-npm run build
-```
-
-After that, go back up one directory into the Superset directory by running:
-
-```text
-cd ..
-```
-
-Then run:
-
-```text
-docker-compose up
-```
-
-This will download the Docker images Superset needs, build its containers, and start the services Superset needs to run locally on your machine.
-
-Once that’s done, you should be able to access Superset in your browser by visiting [`http://localhost:8088`](http://localhost:8088), and you should be presented with the Superset login screen.
-
-Enter username: **admin** and password: **admin** to be taken to your Superset dashboard.
-
-Great! You’ve got Superset set up. Now let’s tell Superset about our Postgres database holding the Slack data from Airbyte.
-
-### b. Setting Up a Postgres Database in Superset
-
-To do this, on the top menu in your Superset dashboard, hover on the Data dropdown and click on **Databases**.
-
-![](../../.gitbook/assets/19.png)
-
-In the page that opens up, click on the **+ Database** button in the top right corner.
-
-![](../../.gitbook/assets/20.png)
-
-Then, you will be presented with a modal to add your Database Name and the connection URI.
-
-![](../../.gitbook/assets/21.png)
-
-Let’s call our Database `slack_db`, and then add the following URI as the connection URI:
-
-```text
-postgresql://postgres:password@docker.for.mac.localhost:2000/postgres
-```
-
-If you are on a Windows machine, yours will be:
-
-```text
-postgresql://postgres:password@docker.for.win.localhost:2000/postgres
-```
-
-Note: We are using `docker.for.[mac|win].localhost` in order to access the localhost of your machine, because using just localhost will point to the Docker container network and not your machine’s network.
-
-Your Superset UI should look like this:
-
-![](../../.gitbook/assets/22.png)
-
-We will need to enable some settings on this connection. Click on **SQL LAB SETTINGS** and check the following boxes:
-
-![](../../.gitbook/assets/23.png)
-
-Afterwards, click on the **ADD** button, and you will see your database on the data page of Superset.
-
-![](../../.gitbook/assets/24.png)
-
-### c. Importing our dataset
-
-Now that you’ve added the database, you will need to hover over the Data menu again and click on **Datasets**.
- -![](../../.gitbook/assets/25.png) - -Then, you will be taken to the datasets page: - -![](../../.gitbook/assets/26.png) - -We want to only see the datasets that are in our `slack_db` database, so in the Database that is currently showing All, select `slack_db` and you will see that we don’t have any datasets at the moment. - -![](../../.gitbook/assets/27.png) - -![](../../.gitbook/assets/28.png) - -You can fix this by clicking on the **+ DATASET** button and adding the following datasets. - -Note: Make sure you select the public schema under the Schema dropdown. - -![](../../.gitbook/assets/29.png) - -Now that we have set up Superset and given it our Slack data, let’s proceed to creating the visualizations we need. - -Still remember them? Here they are again: - -* Total number of members of a Slack workspace -* The evolution of the number of Slack workspace members -* Evolution of weekly messages -* Evolution of weekly threads created -* Evolution of messages per channel -* Members per time zone - -## 3. Creating Our Dashboards with Superset - -### a. Total number of members of a Slack workspace - -To get this, we will first click on the users’ dataset of our `slack_db` on the Superset dashboard. - -![](../../.gitbook/assets/30.png) - -Next, change **untitled** at the top to **Number of Members**. - -![](../../.gitbook/assets/31.png) - -Now change the **Visualization Type** to **Big Number,** remove the **Time Range** filter, and add a Subheader named “Slack Members.” So your UI should look like this: - -![](../../.gitbook/assets/32.png) - -Then, click on the **RUN QUERY** button, and you should now see the total number of members. - -Pretty cool, right? Now let’s save this chart by clicking on the **SAVE** button. - -![](../../.gitbook/assets/33.png) - -Then, in the **ADD TO DASHBOARD** section, type in “Slack Dashboard”, click on the “Create Slack Dashboard” button, and then click the **Save** button. - -Great! We have successfully created our first Chart, and we also created the Dashboard. Subsequently, we will be following this flow to add the other charts to the created Slack Dashboard. - -### b. Casting the ts column - -Before we proceed with the rest of the charts for our dashboard, if you inspect the **ts** column on either the **messages** table or the **threads** table, you will see it’s of the type `VARCHAR`. We can’t really use this for our charts, so we have to cast both the **messages** and **threads**’ **ts** column as `TIMESTAMP`. Then, we can create our charts from the results of those queries. Let’s do this. - -First, navigate to the **Data** menu, and click on the **Datasets** link. In the list of datasets, click the **Edit** button for the **messages** table. - -![](../../.gitbook/assets/34.png) - -You’re now in the Edit Dataset view. Click the **Lock** button to enable editing of the dataset. Then, navigate to the **Columns** tab, expand the **ts** dropdown, and then tick the **Is Temporal** box. - -![](../../.gitbook/assets/35.png) - -Persist the changes by clicking the Save button. - -### c. The evolution of the number of Slack workspace members - -In the exploration page, let’s first get the chart showing the evolution of the number of Slack members. To do this, make your settings on this page match the screenshot below: - -![](../../.gitbook/assets/36.png) - -Save this chart onto the Slack Dashboard. - -### d. Evolution of weekly messages posted - -Now, we will look at the evolution of weekly messages posted. 
Let’s configure the chart settings on the same page as the previous one.
-
-![](../../.gitbook/assets/37.png)
-
-Remember, your visualization will differ based on the data you have.
-
-### e. Evolution of weekly threads created
-
-Now that we are finished with the message chart, let's move on to the thread chart. You will recall that we need to cast the **ts** column as stated earlier. So, do that, get to the exploration page, and make it match the screenshot below to achieve the required visualization:
-
-![](../../.gitbook/assets/38.png)
-
-### f. Evolution of messages per channel
-
-For this visualization, we will need a more complex SQL query. Here’s the query we used \(as you can see in the screenshot below\):
-
-```text
-SELECT CAST(m.ts as TIMESTAMP), c.name, m.text
-FROM public.messages m
-INNER JOIN public.channels c
-ON m.channel_id = c.id
-```
-
-![](../../.gitbook/assets/39.png)
-
-Next, click on **EXPLORE** to be taken to the exploration page; make it match the screenshot below:
-
-![](../../.gitbook/assets/40.png)
-
-Save this chart to the dashboard.
-
-### g. Members per time zone
-
-Finally, we will be visualizing members per time zone. To do this, instead of casting in the SQL Lab as we’ve previously done, we will explore another way of casting, using Superset’s virtual calculated column feature. This feature allows us to write SQL queries that customize the appearance and behavior of a specific column.
-
-For our use case, we will need the `updated` column of the users table to be a `TIMESTAMP` in order to perform the visualization we need for members per time zone. Let’s start by clicking the edit icon on the users table in Superset.
-
-![](../../.gitbook/assets/41.png)
-
-You will be presented with a modal like so:
-
-![](../../.gitbook/assets/42.png)
-
-Click on the **CALCULATED COLUMNS** tab:
-
-![](../../.gitbook/assets/43.png)
-
-Then, click on the **+ ADD ITEM** button, and make your settings match the screenshot below.
-
-![](../../.gitbook/assets/44.png)
-
-Then, go to the **exploration** page and make it match the settings below:
-
-![](../../.gitbook/assets/45.png)
-
-Now save this last chart, and head over to your Slack Dashboard. It should look like this:
-
-![](../../.gitbook/assets/46.png)
-
-Of course, you can edit how the dashboard looks to fit what you want on it.
-
-## Conclusion
-
-In this article, we looked at using Airbyte’s Slack connector to get the data from a Slack workspace into a Postgres database, and then used Apache Superset to craft a dashboard of visualizations. If you have any questions about Airbyte, don’t hesitate to ask them on our [Slack](https://slack.airbyte.io)! If you have questions about Superset, you can join the [Superset Community Slack](https://superset.apache.org/community/)!
-
diff --git a/docs/archive/examples/postgres-replication.md b/docs/archive/examples/postgres-replication.md
deleted file mode 100644
index 160da6d20f7a..000000000000
--- a/docs/archive/examples/postgres-replication.md
+++ /dev/null
@@ -1,116 +0,0 @@
----
-description: Start syncing data in minutes with Airbyte
----
-
-# Postgres Replication
-
-Let's see how you can spin up a local instance of Airbyte and sync data from one Postgres database to another.
-
-Here's a 6-minute video showing you how you can do it.
-
-{% embed url="https://www.youtube.com/watch?v=Rcpt5SVsMpk" caption="" %}
-
-First of all, make sure you have Docker and Docker Compose installed.
If this isn't the case, follow the [guide](../../deploying-airbyte/local-deployment.md) for the recommended approach to install Docker.
-
-Once Docker is installed successfully, run the following commands:
-
-```text
-git clone https://github.com/airbytehq/airbyte.git
-cd airbyte
-docker-compose up
-```
-
-Once you see an Airbyte banner, the UI is ready to go at [http://localhost:8000/](http://localhost:8000/).
-
-## 1. Set up your preferences
-
-You should see an onboarding page. Enter your email and continue.
-
-![](../../.gitbook/assets/airbyte_get-started.png)
-
-## 2. Set up your first connection
-
-We support a growing [list of source connectors](https://docs.airbyte.com/category/sources). For now, we will start out with a Postgres source and destination.
-
-**If you don't have a readily available Postgres database to sync, here are some quick instructions:**
-Run the following commands in a new terminal window to start backgrounded source and destination databases:
-
-```text
-docker run --rm --name airbyte-source -e POSTGRES_PASSWORD=password -p 2000:5432 -d postgres
-docker run --rm --name airbyte-destination -e POSTGRES_PASSWORD=password -p 3000:5432 -d postgres
-```
-
-Add a table with a few rows to the source database:
-
-```text
-docker exec -it airbyte-source psql -U postgres -c "CREATE TABLE users(id SERIAL PRIMARY KEY, col1 VARCHAR(200));"
-docker exec -it airbyte-source psql -U postgres -c "INSERT INTO public.users(col1) VALUES('record1');"
-docker exec -it airbyte-source psql -U postgres -c "INSERT INTO public.users(col1) VALUES('record2');"
-docker exec -it airbyte-source psql -U postgres -c "INSERT INTO public.users(col1) VALUES('record3');"
-```
-
-You now have a Postgres database ready to be replicated!
-
-### **Connect the Postgres database**
-
-In the UI, you will see a wizard that allows you to choose the data you want to send through Airbyte.
-
-![](../../.gitbook/assets/02_set-up-sources.png)
-
-Use `airbyte-source` as the name and `Postgres` as the type. If you used our instructions to create a Postgres database, fill in the configuration fields as follows:
-
-```text
-Host: localhost
-Port: 2000
-User: postgres
-Password: password
-DB Name: postgres
-```
-
-Click on `Set Up Source` and the wizard should move on to allow you to configure a destination.
-
-We support a growing list of data warehouses, lakes and databases. For now, use the name `airbyte-destination`, and configure the destination Postgres database:
-
-```text
-Host: localhost
-Port: 3000
-User: postgres
-Password: password
-DB Name: postgres
-```
-
-After adding the destination, you can choose what tables and columns you want to sync.
-
-![](../../.gitbook/assets/03_set-up-connection.png)
-
-For this demo, we recommend leaving the defaults and selecting "Every 5 Minutes" as the frequency. Click `Set Up Connection` to finish setting up the sync.
-
-## 3. Check the logs of your first sync
-
-You should now see a list of sources with the source you just added. Click on it to find more information about your connection. This is the page where you can update any settings about this source and how it syncs. There should be a `Completed` job under the history section. If you click on that run, it will show logs from that run.
-
-![](../../.gitbook/assets/04_source-details.png)
-
-One of the biggest problems we've seen in tools like Fivetran is the lack of visibility when debugging. In Airbyte, allowing full log access and the ability to debug and fix connector problems is one of our highest priorities.
-
-## 4. Check if the syncing actually worked
-
-Now let's verify that this worked. Let's output the contents of the destination db:
-
-```text
-docker exec airbyte-destination psql -U postgres -c "SELECT * FROM public.users;"
-```
-
-:::info
-
-Don't worry about the awkward `public_users` name for now; we are currently working on an update to allow users to configure their destination table names!
-
-:::
-
-You should see the rows from the source database inside the destination database!
-
-And there you have it. You've taken data from one database and replicated it to another. All of the actual configuration for this replication only took place in the UI.
-
-That's it! This is just the beginning of Airbyte. If you have any questions at all, please reach out to us on [Slack](https://slack.airbyte.io/). We’re still in alpha, so if you see any rough edges or want to request a connector you need, please create an issue on our [GitHub](https://github.com/airbytehq/airbyte) or leave a thumbs up on an existing issue.
-
-Thank you and we hope you enjoy using Airbyte.
diff --git a/docs/archive/examples/slack-history.md b/docs/archive/examples/slack-history.md
deleted file mode 100644
index 6305798bffee..000000000000
--- a/docs/archive/examples/slack-history.md
+++ /dev/null
@@ -1,109 +0,0 @@
----
-description: Using Airbyte and MeiliSearch
----
-
-# Save and Search Through Your Slack History on a Free Slack Plan
-
-![](../../.gitbook/assets/slack-history-ui-title.png)
-
-The [Slack free tier](https://slack.com/pricing/paid-vs-free) saves only the last 10K messages. For social Slack instances, it may be impractical to upgrade to a paid plan to retain these messages. Similarly, for an open-source project like [Airbyte](../../understanding-airbyte/airbyte-protocol.md#catalog) where we interact with our community through a public Slack instance, the cost of paying for a seat for every Slack member is prohibitive.
-
-However, searching through old messages can be really helpful. Losing that history feels like some advanced form of memory loss. What was that joke about Java 8 Streams? This contributor question sounds familiar—haven't we seen it before? But you just can't remember!
-
-This tutorial will show you how you can, for free, use Airbyte to save these messages \(even after Slack removes access to them\). It will also provide you with a convenient way to search through them.
-
-Specifically, we will export messages from your Slack instance into an open-source search engine called [MeiliSearch](https://github.com/meilisearch/meilisearch). We will be focusing on getting this setup running from your local workstation. We will mention at the end how you can set up a more productionized version of this pipeline.
-
-We want to make this process easy, so while we will link to some external documentation for further exploration, we will provide all the instructions you need here to get this up and running.
-
-## 1. Set Up MeiliSearch
-
-First, let's get MeiliSearch running on our workstation. MeiliSearch has extensive docs for [getting started](https://docs.meilisearch.com/reference/features/installation.html#download-and-launch). For this tutorial, however, we will give you all the instructions you need to set up MeiliSearch using Docker.
-
-```text
-docker run -it --rm \
-    -p 7700:7700 \
-    -v $(pwd)/data.ms:/data.ms \
-    getmeili/meilisearch
-```
-
-That's it!
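-
-Optionally, you can confirm the container is accepting requests before moving on. The `/health` endpoint used below is a quick, informal sanity check \(adjust the port if you changed it\):
-
-```bash
-# Should return a status such as {"status":"available"} once MeiliSearch is up
-curl http://localhost:7700/health
-```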
-
-:::info
-
-MeiliSearch stores data in $\(pwd\)/data.ms, so if you prefer to store it somewhere else, just adjust this path.
-
-:::
-
-## 2. Replicate Your Slack Messages to MeiliSearch
-
-### a. Set Up Airbyte
-
-Make sure you have Docker and Docker Compose installed. If you haven’t set Docker up, follow the [instructions here](https://docs.docker.com/desktop/) to set it up on your machine. Then, run the following commands:
-
-```bash
-git clone https://github.com/airbytehq/airbyte.git
-cd airbyte
-docker-compose up
-```
-
-If you run into any problems, feel free to check out our more extensive [Getting Started FAQ](https://discuss.airbyte.io/c/faq/15) for help.
-
-Once you see an Airbyte banner, the UI is ready to go at [http://localhost:8000/](http://localhost:8000/). Once you have set your user preferences, you will be brought to a page that asks you to set up a source. In the next step, we'll go over how to do that.
-
-### b. Set Up Airbyte’s Slack Source Connector
-
-In the Airbyte UI, select Slack from the dropdown. We provide step-by-step instructions for setting up the Slack source in Airbyte [here](https://docs.airbyte.com/integrations/sources/slack#setup-guide). These will walk you through how to complete the form on this page.
-
-![](../../.gitbook/assets/slack-history-setup-wizard.png)
-
-By the end of these instructions, you should have created a Slack source in the Airbyte UI. For now, just add your Slack app to a single public channel \(you can add it to more channels later\). Only messages from that channel will be replicated.
-
-The Airbyte app will now prompt you to set up a destination. Next, we will walk through how to set up MeiliSearch.
-
-### c. Set Up Airbyte’s MeiliSearch Destination Connector
-
-Head back to the Airbyte UI. It should still be prompting you to set up a destination. Select "MeiliSearch" from the dropdown. For the `host` field, set `http://localhost:7700`. The `api_key` can be left blank.
-
-### d. Set Up the Replication
-
-On the next page, you will be asked to select which streams of data you'd like to replicate. We recommend unchecking "files" and "remote files" since you won't really be able to search them easily in this search engine.
-
-![](../../.gitbook/assets/airbyte_connection-settings.png)
-
-For frequency, we recommend every 24 hours.
-
-## 3. Search MeiliSearch
-
-After the connection has been saved, Airbyte should start replicating the data immediately. Replication can take several minutes depending on the size of your Slack instance. When it completes, you should see the following:
-
-![](../../.gitbook/assets/slack-history-sync.png)
-
-Once the sync is done, you can sanity check that this is all working by making a search request to MeiliSearch:
-
-```bash
-curl 'http://localhost:7700/indexes/messages/search' --data '{ "q": "" }'
-```
-
-For example, one of the messages that I replicated contains the text "welcome to airbyte":
-
-```bash
-curl 'http://localhost:7700/indexes/messages/search' --data '{ "q": "welcome to" }'
-# => {"hits":[{"_ab_pk":"7ff9a858_6959_45e7_ad6b_16f9e0e91098","channel_id":"C01M2UUP87P","client_msg_id":"77022f01-3846-4b9d-a6d3-120a26b2c2ac","type":"message","text":"welcome to airbyte.","user":"U01AS8LGX41","ts":"2021-02-05T17:26:01.000000Z","team":"T01AB4DDR2N","blocks":[{"type":"rich_text"}],"file_ids":[],"thread_ts":"1612545961.000800"}],"offset":0,"limit":20,"nbHits":2,"exhaustiveNbHits":false,"processingTimeMs":21,"query":"welcome to"}
-```
-
-## 4. Search via a UI
-
-Making curl requests to search your Slack history is a little clunky, so we have modified the example UI that MeiliSearch provides in [their docs](https://docs.meilisearch.com/learn/tutorials/getting_started.html#integrate-with-your-project) to search through the Slack results.
-
-Download \(or copy and paste\) this [HTML file](https://github.com/airbytehq/airbyte/blob/master/docs/examples/slack-history/index.html) to your workstation. Then, open it using a browser. You should now be able to write search terms in the search bar and get results instantly!
-
-![](../../.gitbook/assets/slack-history-ui.png)
-
-## 5. "Productionizing" Saving Slack History
-
-You can find instructions for how to host Airbyte on various cloud platforms [here](../../deploying-airbyte/README.md).
-
-Documentation on how to host MeiliSearch on cloud platforms can be found [here](https://docs.meilisearch.com/running-production/#a-quick-introduction).
-
-If you want to use the UI mentioned in the section above, we recommend statically hosting it on S3, GCS, or equivalent.
diff --git a/docs/archive/examples/slack-history/index.html b/docs/archive/examples/slack-history/index.html
deleted file mode 100644
index 0812368137cd..000000000000
--- a/docs/archive/examples/slack-history/index.html
+++ /dev/null
@@ -1,77 +0,0 @@
diff --git a/docs/archive/examples/zoom-activity-dashboard.md b/docs/archive/examples/zoom-activity-dashboard.md
deleted file mode 100644
index a141f2da418a..000000000000
--- a/docs/archive/examples/zoom-activity-dashboard.md
+++ /dev/null
@@ -1,272 +0,0 @@
----
-description: Using Airbyte and Tableau
----
-
-# Visualizing the Time Spent by Your Team in Zoom Calls
-
-In this article, we will show you how you can understand how much your team leverages Zoom, or spends time in meetings, in a couple of minutes. We will be using [Airbyte](https://airbyte.com) \(an open-source data integration platform\) and [Tableau](https://www.tableau.com) \(a business intelligence and analytics software\) for this tutorial.
-
-Here is what we will cover:
-
-1. Replicating data from Zoom to a PostgreSQL database, using Airbyte
-2. Connecting the PostgreSQL database to Tableau
-3. Creating charts in Tableau with Zoom data
-
-We will produce the following charts in Tableau:
-
-* Meetings per week in a team
-* Hours a team spends in meetings per week
-* Listing of team members with the number of meetings per week and number of hours spent in meetings, ranked
-* Webinars per week in a team
-* Hours a team spends in webinars per week
-* Participants for all webinars in a team per week
-* Listing of team members with the number of webinars per week and number of hours spent in webinars, ranked
-
-Let’s get started by replicating Zoom data using Airbyte.
-
-## Step 1: Replicating Zoom data to PostgreSQL
-
-### Launching Airbyte
-
-In order to replicate Zoom data, we will need to use [Airbyte’s Zoom connector](https://docs.airbyte.com/integrations/sources/zoom). To do this, start Airbyte’s web app by opening up your terminal, navigating to the Airbyte directory, and running:
-
-`docker-compose up`
-
-You can find more details about this in the [Getting Started FAQ](https://discuss.airbyte.io/c/faq/15) on our [Airbyte Forum](https://github.com/airbytehq/airbyte/discussions).
-
-This will start up Airbyte on `localhost:8000`; open that address in your browser to access the Airbyte dashboard.
-
-![](../../.gitbook/assets/01_airbyte-dashboard.png)
-
-If you haven't gone through the onboarding yet, you will be prompted to connect a source and a destination. Then just follow the instructions. If you've gone through it, then you will see the screenshot above. In the top right corner of the Airbyte dashboard, click on the **+ new source** button to add a new Airbyte source. In the screen to set up the new source, enter the source name \(we will use airbyte-zoom\) and select **Zoom** as the source type.
-
-Choosing Zoom as the **source type** will cause Airbyte to display the configuration parameters needed to set up the Zoom source.
-
-![](../../.gitbook/assets/02_setting-zoom-connector-name.png)
-
-The Zoom connector for Airbyte requires you to provide it with a Zoom JWT token. Let’s take a detour and look at how to obtain one from Zoom.
-
-### Obtaining a Zoom JWT Token
-
-To obtain a Zoom JWT token, log in to your Zoom account and go to the [Zoom Marketplace](https://marketplace.zoom.us/). If this is your first time in the marketplace, you will need to agree to Zoom’s marketplace terms of use.
-
-Once you are in, you need to click on the **Develop** dropdown and then click on **Build App.**
-
-![](../../.gitbook/assets/03_click.png)
-
-Clicking on **Build App** for the first time will display a modal for you to accept Zoom’s API license and terms of use. Accept if you agree, and you will be presented with the screen below.
-
-![](../../.gitbook/assets/zoom-marketplace-build-screen%20(3)%20(3).png)
-
-Select **JWT** as the app you want to build and click on the **Create** button on the card. You will be presented with a modal to enter the app name; type in `airbyte-zoom`.
-
-![](../../.gitbook/assets/05_app-name-modal.png)
-
-Next, click on the **Create** button on the modal.
-
-You will then be taken to the **App Information** page of the app you just created. Fill in the required information.
-
-![](../../.gitbook/assets/06_app-information.png)
-
-After filling in the needed information, click on the **Continue** button. You will be taken to the **App Credentials** page. Here, click on the **View JWT Token** dropdown.
-
-![](../../.gitbook/assets/07_view-jwt-token.png)
-
-There you can set the expiration time of the token \(we will leave the default 90 minutes\), and then click on the **Copy** button next to the **JWT Token**.
-
-After copying it, click on the **Continue** button.
-
-![](../../.gitbook/assets/08_activate-webhook.png)
-
-You will be taken to a screen to activate **Event Subscriptions**. Just leave it as is, as we won’t be needing Webhooks. Click on **Continue**, and your app should be marked as activated.
-
-### Connecting Zoom on Airbyte
-
-So let’s go back to the Airbyte web UI and provide it with the JWT token we copied from our Zoom app.
-
-Now click on the **Set up source** button. You will see the success message below when the connection is made successfully.
-
-![](../../.gitbook/assets/setup-successful%20(3)%20(2).png)
-
-And you will be taken to the page to add your destination.
-
-### Connecting PostgreSQL on Airbyte
-
-![](../../.gitbook/assets/10_destination.png)
-
-For our destination, we will be using a PostgreSQL database, since Tableau supports PostgreSQL as a data source. Click on the **add destination** button, and then in the dropdown click on **+ add a new destination**. On the page that appears, add the destination name and choose the Postgres destination.
-
-![](../../.gitbook/assets/11_choose-postgres-destination.png)
-
-To supply Airbyte with the PostgreSQL configuration parameters needed to make a PostgreSQL destination, we will spin up a PostgreSQL container with Docker using the following command in our terminal.
-
-`docker run --rm --name airbyte-zoom-db -e POSTGRES_PASSWORD=password -v airbyte_zoom_data:/var/lib/postgresql/data -p 2000:5432 -d postgres`
-
-This will spin up a Docker container and persist the data we will be replicating in the PostgreSQL database in a Docker volume, `airbyte_zoom_data`.
-
-Now, let’s supply the above credentials in the Airbyte UI.
-
-![](../../.gitbook/assets/postgres_credentials%20(3)%20(3).png)
-
-Then click on the **Set up destination** button.
-
-After the connection has been made to your PostgreSQL database successfully, Airbyte will generate the schema of the data to be replicated in your database from the Zoom source.
-
-Leave all the fields checked.
-
-![](../../.gitbook/assets/schema%20(3)%20(3).png)
-
-Select a **Sync frequency** of **manual** and then click on **Set up connection**.
-
-After successfully making the connection, you will see your PostgreSQL destination. Click on the Launch button to start the data replication.
-
-![](../../.gitbook/assets/launch%20(3)%20(3).png)
-
-Then click on the **airbyte-zoom-destination** to see the Sync page.
-
-![](../../.gitbook/assets/sync-screen%20(3)%20(3).png)
-
-Syncing should take a few minutes or longer depending on the size of the data being replicated. Once Airbyte is done replicating the data, you will get a **succeeded** status.
-
-Then, you can run the following SQL command on the PostgreSQL container to confirm that the sync was done successfully.
-
-`docker exec airbyte-zoom-db psql -U postgres -c "SELECT * FROM public.users;"`
-
-Now that we have our Zoom data replicated successfully via Airbyte, let’s move on and set up Tableau to make the various visualizations and analytics we want.
-
-## Step 2: Connect the PostgreSQL database to Tableau
-
-Tableau helps people and organizations get answers from their data. It’s a visual analytics platform that makes it easy to explore and manage data.
-
-To get started with Tableau, you can opt in to a [free trial period](https://www.tableau.com/products/trial) by providing your email and clicking the **DOWNLOAD FREE TRIAL** button to download the Tableau desktop app. The download should automatically detect your machine type \(Windows/Mac\).
-
-Go ahead and install Tableau on your machine. After the installation is complete, you will need to fill in some more details to activate your free trial.
-
-Once your activation is successful, you will see your Tableau dashboard.
-
-![](../../.gitbook/assets/tableau-dashboard%20(3)%20(3).png)
-
-On the sidebar menu under the **To a Server** section, click on the **More…** menu. You will see a list of data source connectors you can connect Tableau with.
-
-![](../../.gitbook/assets/datasources%20(4)%20(4).png)
-
-Select **PostgreSQL** and you will be presented with a connection credentials modal.
-
-Fill in the same details as the PostgreSQL database we used as the destination in Airbyte.
-
-![](../../.gitbook/assets/18_fill-in-connection-details.png)
-
-Next, click on the **Sign In** button. If the connection was made successfully, you will see the Tableau dashboard for the database you just connected.
-
-_Note: If you are having trouble connecting PostgreSQL with Tableau, it might be because the PostgreSQL driver Tableau ships with does not work for newer versions of PostgreSQL. You can download the JDBC driver for PostgreSQL_ [_here_](https://www.tableau.com/support/drivers?_ga=2.62351404.1800241672.1616922684-1838321730.1615100968) _and follow the setup instructions._
-
-Now that we have replicated our Zoom data into a PostgreSQL database using Airbyte’s Zoom connector, and connected Tableau with our PostgreSQL database containing our Zoom data, let’s proceed to creating the charts we need to visualize the time spent by a team in Zoom calls.
-
-## Step 3: Create the charts on Tableau with the Zoom data
-
-### Meetings per week in a team
-
-To create this chart, we will need to use the count of the meetings and the **createdAt** field of the **meetings** table. Currently, we haven’t selected a table to work on in Tableau. So you will see a prompt to **Drag tables here**.
-
-![](../../.gitbook/assets/19_tableau-view-with-all-tables.png)
-
-Drag the **meetings** table from the sidebar onto the space with the prompt.
-
-Now that we have the meetings table, we can start building out the chart by clicking on **Sheet 1** at the bottom left of Tableau.
-
-![](../../.gitbook/assets/20_empty-meeting-sheet.png)
-
-As stated earlier, we need **Created At**, but currently it’s a String data type. Let’s change that by converting it to a date & time. Right click on **Created At**, then select **Change Data Type** and choose **Date & Time**. And that’s it! That field is now of type **Date & Time**.
-
-![](../../.gitbook/assets/21_change-to-date-time.png)
-
-Next, drag **Created At** to **Columns**.
-
-![](../../.gitbook/assets/22_drag-created-at.png)
-
-Currently, we get the Created At in **YEAR**, but per our requirement we want it in weeks, so right click on the **YEAR\(Created At\)** pill and choose **Week Number**.
-
-![](../../.gitbook/assets/change-to-per-week%20(3)%20(3).png)
-
-Tableau should now look like this:
-
-![](../../.gitbook/assets/24_meetings-per-week.png)
-
-Now, to finish up, we need to add the **meetings\(Count\)** measure Tableau already calculated for us to the **Rows** section. So drag **meetings\(Count\)** onto the Rows section to complete the chart.
-
-![](../../.gitbook/assets/evolution-of-meetings-per-week%20(3)%20(3).png)
-
-And now we are done with the very first chart. Let's save the sheet and create a new Dashboard that we will add this sheet to, as well as the others we will be creating.
-
-Currently the sheet shows **Sheet 1**; right click on **Sheet 1** at the bottom left and rename it to **Weekly Meetings**.
-
-To create our Dashboard, we can right click on the sheet we just renamed and choose **new Dashboard**. Rename the Dashboard to Zoom Dashboard and drag the sheet into it to have something like this:
-
-![](../../.gitbook/assets/26_zoom-dashboard.png)
-
-Now that we have this first chart out of the way, we just need to replicate most of the process we used for this one to create the other charts. Because the steps are so similar, we will mostly be showing the finished screenshots of the charts except when we need to conform to the chart requirements.
-
-### Hours a team spends in meetings per week
-
-For this chart, we need the sum of the duration spent in weekly meetings. We already have a Duration field, which is currently displaying durations in minutes. We can derive a calculated field off this field since we want the duration in hours \(we just need to divide the duration field by 60\).
-
-To do this, right click on the Duration field and select **Create**, then click on **Calculated Field**. Change the name to **Duration in Hours**, and then the calculation should be **\[Duration\]/60**. Click **OK** to create the field.
-
-So now we can drag the Duration in Hours and Created At fields onto the sheet like so:
-
-![](../../.gitbook/assets/27_hours-spent-in-weekly-meetings.png)
-
-Note: We are adding a filter on the Duration to filter out null values. You can do this by right clicking on the **SUM\(Duration\)** pill and clicking **Filter**, then making sure the **include null values** checkbox is unchecked.
-
-### Participants for all meetings per week
-
-For this chart, we will need to have a calculated field called **\# of meetings attended**, which will be an aggregate of the counts of rows matching a particular user's email in the `report_meeting_participants` table, plotted against the **Created At** field of the **meetings** table. To get this done, right click on the **User Email** field. Select **Create** and click on **Calculated Field**, then enter the title of the field as **\# of meetings attended**. Next, enter the formula below:
-
-`COUNT(IF [User Email] == [User Email] THEN [Id (Report Meeting Participants)] END)`
-
-Then click on **Apply**. Finally, drag the **Created At** field \(make sure it’s on the **Weekly** number\) and the calculated field you just created to match the screenshot below:
-
-![](../../.gitbook/assets/number_of_participants_per_weekly_meetings.png)
-
-### Listing of team members with the number of meetings per week and number of hours spent in meetings, ranked
-
-To get this chart, we need to create a relationship between the **meetings** table and the `report_meeting_participants` table. You can do this by dragging the `report_meeting_participants` table in as a source alongside the **meetings** table and relating both via the **meeting id**. Then you will be able to create a new worksheet that looks like this:
-
-![](../../.gitbook/assets/meetings-participant-ranked%20(3)%20(3).png)
-
-Note: To achieve the ranking, we simply use the sort menu icon on the top menu bar.
-
-### Webinars per week in a team
-
-The rest of the charts will need the **webinars** and `report_webinar_participants` tables. Similar to the number of meetings per week in a team, we will be plotting the Count of webinars against the **Created At** property.
-
-![](../../.gitbook/assets/30_weekly-webinars.png)
-
-### Hours a team spends in webinars per week
-
-For this chart, as with its meetings counterpart, we will derive a calculated field off the Duration field to get the **Webinar Duration in Hours**, and then plot **Created At** against the **Sum of Webinar Duration in Hours**, as shown in the screenshot below. Note: Make sure you create a new sheet for each of these graphs.
-
-### Participants for all webinars per week
-
-This calculation is the same as the number of participants for all meetings per week, but instead of using the **meetings** and `report_meeting_participants` tables, we will use the **webinars** and `report_webinar_participants` tables.
-
-Also, the formula will now be:
-
-`COUNT(IF [User Email] == [User Email] THEN [Id (Report Webinar Participants)] END)`
-
-Below is the chart:
-
-![](../../.gitbook/assets/32_number_of_webinar_attended_per_week.png)
-
-#### Listing of team members with the number of webinars per week and number of hours spent in webinars, ranked
-
-Below is the chart with these specs:
-
-![](../../.gitbook/assets/33_number-of-webinars-participants.png)
-
-## Conclusion
-
-In this article, we saw how we can use Airbyte to get data from the Zoom API into a PostgreSQL database, and then use that data to create some chart visualizations in Tableau.
-
-You can leverage Airbyte and Tableau to produce graphs on any collaboration tool. We just used Zoom to illustrate how it can be done. Hope this is helpful!
-
diff --git a/docs/archive/faq/README.md b/docs/archive/faq/README.md
deleted file mode 100644
index 1f6a217b74c7..000000000000
--- a/docs/archive/faq/README.md
+++ /dev/null
@@ -1,5 +0,0 @@
-# FAQ
-
-Our FAQ is now a section on our Airbyte Forum. Check it out [here](https://github.com/airbytehq/airbyte/discussions)!
-
-If you don't see your question answered, feel free to open up a new topic for it.
\ No newline at end of file
diff --git a/docs/archive/faq/data-loading.md b/docs/archive/faq/data-loading.md
deleted file mode 100644
index 4ae20d834edc..000000000000
--- a/docs/archive/faq/data-loading.md
+++ /dev/null
@@ -1,124 +0,0 @@
-# Data Loading
-
-## **Why don’t I see any data in my destination yet?**
-
-It can take a while for Airbyte to load data into your destination. Some sources have restrictive API limits which constrain how much data we can sync in a given time. Large amounts of data in your source can also make the initial sync take longer. You can check your sync status on your connection detail page, which you can access through the destination or source detail page.
-
-## **Why are my final tables being recreated every time?**
-
-Airbyte ingests data into raw tables and applies normalization if you selected it on the connection page. Normalization runs a full refresh on each sync, and for some destinations like Snowflake, Redshift, or BigQuery, this may incur more resource consumption and higher costs. You need to pay attention to how frequently you're retrieving your data to avoid issues. For example, if you create a connection to sync every 5 minutes with incremental sync on, it will only retrieve new records into the raw tables, but will apply normalization to *all* the data in every sync! If you have tons of data, this may not be the right sync frequency for you.
-
-There is a [GitHub issue](https://github.com/airbytehq/airbyte/issues/4286) to implement incremental normalization, which will reduce costs and resource usage in your destination.
-
-## **What happens if a sync fails?**
-
-You won't lose data when a sync fails; however, no data will be added or updated in your destination.
-
-Airbyte will automatically attempt to replicate data 3 times. You can see and export the logs for those attempts in the connection detail page. You can access this page through the Source or Destination detail page.
-
-You can configure a Slack webhook to warn you when a sync fails.
-
-In the future you will be able to configure other notification methods \(email, Sentry\) and an option to create a GitHub issue with the logs. We’re still working on it; the purpose would be to help the community and the Airbyte team fix the issue as soon as possible, especially if it is a connector issue.
-
-Until Airbyte has this system in place, here is what you can do:
-
-* File a GitHub issue: go [here](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=type%2Fbug&template=bug-report.md&title=) and file an issue with the detailed logs copied in the issue’s description. The team will be notified about your issue and will update it for any progress or comment on it.
-* Fix the issue yourself: Airbyte is open source, so you don’t need to wait for anybody to fix your issue if it is important to you. To do so, just fork the [GitHub project](https://github.com/airbytehq/airbyte) and fix the piece of code that needs fixing. If you’re okay with contributing your fix to the community, you can submit a pull request. We will review it ASAP.
-* Ask on Slack: don’t hesitate to ping the team on [Slack](https://slack.airbyte.io).
-
-Once all this is done, Airbyte resumes your sync from where it left off.
-
-We truly appreciate any contribution you make to help the community. Airbyte will become the open-source standard only if everybody participates.
-
-## **Can Airbyte support 2-way sync, i.e. changes from A go to B and changes from B go to A?**
-
-Airbyte does not support this right now. There are some details around how we handle schema and table names that aren't going to work for you in the current iteration. If you attempt to set up a circular dependency between source and destination, you'll end up with the following: A.public.table_foo writes to B.public.public_table_foo, which writes back to A.public.public_public_table_foo. You won't be writing into your original table, which is presumably what you intended.
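-
-To make the naming chain concrete, here is a rough sketch of what you would observe with two local Postgres containers pointed at each other \(the container name below is hypothetical and just one way to inspect the result\):
-
-```bash
-# Sync A -> B: A.public.table_foo        lands as B.public.public_table_foo
-# Sync B -> A: B.public.public_table_foo lands as A.public.public_public_table_foo
-# Listing the tables on instance A shows the prefixes piling up:
-docker exec instance-a psql -U postgres -c "\dt public.*"
-```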
-
-## **What happens to data in the pipeline if the destination gets disconnected? Could I lose data, or wind up with duplicate data when the pipeline is reconnected?**
-
-Airbyte is architected to prevent data loss or duplication. Airbyte will display a failure for the sync and re-attempt it at the next scheduled sync, according to the frequency you set.
-
-## **How frequently can Airbyte sync data?**
-
-You can adjust the load time to run as frequently as every hour or as infrequently as once a year using [Cron expressions](https://docs.airbyte.com/cloud/managing-airbyte-cloud/edit-stream-configuration).
-
-## **Why wouldn’t I choose to load all of my data more frequently?**
-
-While frequent data loads will give you more up-to-date data, there are a few reasons you wouldn’t want to load your data too frequently, including:
-
-* Higher API usage may cause you to hit a limit that could impact other systems that rely on that API.
-* Higher cost of loading data into your warehouse.
-* More frequent delays, resulting in increased delay notification emails. For instance, if the data source generally takes several hours to update but you wanted five-minute increments, you may receive a delay notification every sync.
-
-We generally recommend setting incremental loads to every hour to help limit API calls.
-
-## **Is there a way to know the estimated time to completion for the first historical sync?**
-
-Unfortunately not yet.
-
-## **Do you support change data capture \(CDC\) or logical replication for databases?**
-
-Airbyte currently supports [CDC for Postgres and MySQL](../../understanding-airbyte/cdc.md). Airbyte is adding support for a few other databases, which you can check in the roadmap.
-
-## Using incremental sync, is it possible to add more fields when some new columns are added to a source table, or when a new table is added?
-
-For the moment, incremental sync doesn't support schema changes, so you would need to perform a full refresh whenever that happens. Here’s a related [GitHub issue](https://github.com/airbytehq/airbyte/issues/1601).
-
-## Is there a limit on how many tables one connection can handle?
-
-Yes. With more than 6,000 tables, loading the information in the UI can be a problem.
-
-There are two GitHub issues about this limitation: [Issue #3942](https://github.com/airbytehq/airbyte/issues/3942) and [Issue #3943](https://github.com/airbytehq/airbyte/issues/3943).
-
-## Help, Airbyte is hanging/taking a long time to discover my source's schema!
-
-This usually happens for database sources that contain a lot of tables. This should resolve itself in half an hour or so.
-
-If the source contains more than 6k tables, see the [above question](#is-there-a-limit-on-how-many-tables-one-connection-can-handle).
-
-There is a known issue with [Oracle databases](https://github.com/airbytehq/airbyte/issues/4944).
-
-## **I see you support a lot of connectors – what about connectors Airbyte doesn’t support yet?**
-
-You can either:
-
-* Submit a [connector request](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fintegration%2C+new-integration&template=new-integration-request.md&title=) on our GitHub project, and be notified once we or the community build a connector for it.
-* Build a connector yourself by forking our [GitHub project](https://github.com/airbytehq/airbyte) and submitting a pull request. Here are the [instructions on how to build a connector](../../contributing-to-airbyte/README.md).
-* Ask on Slack: don’t hesitate to ping the team on [Slack](https://slack.airbyte.io).
-
-## **What kind of notifications do I get?**
-
-For the moment, the UI will only display one kind of notification: when a sync fails, Airbyte will display the failure at the source/destination level in the list of sources/destinations, and in the connection detail page along with the logs.
-
-However, there are other types of notifications:
-
-* When a connector that you use is no longer up to date
-* When your connection fails
-* When core isn't up to date
-
diff --git a/docs/archive/faq/deploying-on-other-os.md b/docs/archive/faq/deploying-on-other-os.md
deleted file mode 100644
index 0b493c3db200..000000000000
--- a/docs/archive/faq/deploying-on-other-os.md
+++ /dev/null
@@ -1,40 +0,0 @@
-# Deploying Airbyte on a Non-Standard Operating System
-
-## CentOS 8
-
-From a clean install:
-
-```
-firewall-cmd --zone=public --add-port=8000/tcp --permanent
-firewall-cmd --zone=public --add-port=8001/tcp --permanent
-firewall-cmd --zone=public --add-port=7233/tcp --permanent
-systemctl restart firewalld
-```
-OR... if you prefer iptables:
-```
-iptables -A INPUT -p tcp -m tcp --dport 8000 -j ACCEPT
-iptables -A INPUT -p tcp -m tcp --dport 8001 -j ACCEPT
-iptables -A INPUT -p tcp -m tcp --dport 7233 -j ACCEPT
-systemctl restart iptables
-```
-Set up the Docker repo:
-```
-dnf config-manager --add-repo=https://download.docker.com/linux/centos/docker-ce.repo
-dnf install docker-ce --nobest
-systemctl enable --now docker
-usermod -aG docker $USER
-```
-You'll need to get docker-compose separately.
-```
-dnf install wget git curl
-curl -L https://github.com/docker/compose/releases/download/1.25.0/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose
-chmod +x /usr/local/bin/docker-compose
-```
-Now we can install Airbyte. In this example, we will install it under `/opt/`:
-```
-cd /opt
-git clone https://github.com/airbytehq/airbyte.git
-cd airbyte
-docker-compose up
-docker-compose ps
-```
\ No newline at end of file
diff --git a/docs/archive/faq/differences-with/README.md b/docs/archive/faq/differences-with/README.md
deleted file mode 100644
index d020cfd1db38..000000000000
--- a/docs/archive/faq/differences-with/README.md
+++ /dev/null
@@ -1,2 +0,0 @@
-# Differences with
-
diff --git a/docs/archive/faq/differences-with/fivetran-vs-airbyte.md b/docs/archive/faq/differences-with/fivetran-vs-airbyte.md
deleted file mode 100644
index 9a9fe1045660..000000000000
--- a/docs/archive/faq/differences-with/fivetran-vs-airbyte.md
+++ /dev/null
@@ -1,27 +0,0 @@
-# Fivetran vs Airbyte
-
-We wrote an article, “[Open-source vs. Commercial Software: How to Solve the Data Integration Problem](https://airbyte.com/articles/data-engineering-thoughts/open-source-vs-commercial-software-how-to-better-solve-data-integration/),” in which we describe the pros and cons of Fivetran’s commercial approach and Airbyte’s open-source approach. Don’t hesitate to check it out for more detailed arguments. As a summary, here are the differences:
-
-![](https://airbyte.com/wp-content/uploads/2021/01/Airbyte-vs-Fivetran.png)
-
-## **Fivetran:**
-
-* **Limited high-quality connectors:** after 8 years in business, Fivetran supports 150 connectors. The more connectors, the more difficult it is for Fivetran to keep the same level of maintenance across all connectors. They will always have an ROI consideration to maintaining long-tail connectors.
-* **Pricing indexed on usage:** Fivetran’s pricing is indexed on the number of active rows \(rows added or edited\) per month. Teams always need to keep that in mind and are not free to move data without thinking about cost, as the costs can grow fast.
-* **Security and privacy compliance:** all companies are subject to privacy compliance laws, such as GDPR, CCPA, HIPAA, etc. As a matter of fact, above a certain stage \(about 100 employees\) in a company, all external products need to go through a security compliance process that can take several months.
-* **No moving data between internal databases:** Fivetran sits in the cloud, so if you have to replicate data from an internal database to another, it makes no sense to have the data move through them \(Fivetran\) for privacy and cost reasons.
-
-## **Airbyte:**
-
-* **Free, as open source, so no more pricing based on usage**: learn more about our [future business model](https://handbook.airbyte.io/strategy/business-model) \(connectors will always remain open source\).
-* **Supporting 60 connectors within 8 months from inception**. Our goal is to reach 200+ connectors by the end of 2021.
-* **Building new connectors made trivial, in the language of your choice:** Airbyte makes it a lot easier to create your own connector, vs. building them yourself in-house \(with Airflow or other tools\). Scheduling, orchestration, and monitoring come out of the box with Airbyte.
-* **Addressing the long tail of connectors:** with the help of the community, Airbyte aims to support thousands of connectors.
-* **Adapt existing connectors to your needs:** you can adapt any existing connector to address your own unique edge case.
-* **Using data integration in a workflow:** Airbyte’s API lets engineering teams add data integration jobs into their workflow seamlessly.
-* **Integrates with your data stack and your needs:** Airflow, Kubernetes, dbt, etc. Its normalization is optional; it gives you a basic version that works out of the box, but also allows you to use dbt to do more complicated things.
-* **Debugging autonomy:** if you experience any connector issue, you won’t need to wait for Fivetran’s customer support team to get back to you, if you can fix the issue fast yourself.
-* **No more security and privacy compliance, as self-hosted, source-available and open-sourced \(MIT\)**. Any team can directly address their integration needs.
-
-Your data stays in your cloud. Have full control over your data, and the costs of your data transfers.
-
diff --git a/docs/archive/faq/differences-with/meltano-vs-airbyte.md b/docs/archive/faq/differences-with/meltano-vs-airbyte.md
deleted file mode 100644
index f8e2ff5fba64..000000000000
--- a/docs/archive/faq/differences-with/meltano-vs-airbyte.md
+++ /dev/null
@@ -1,28 +0,0 @@
-# Meltano vs Airbyte
-
-We wrote an article, “[The State of Open-Source Data Integration and ETL](https://airbyte.com/articles/data-engineering-thoughts/the-state-of-open-source-data-integration-and-etl/),” in which we list and compare all ETL-related open-source projects, including Meltano and Airbyte. Don’t hesitate to check it out for more detailed arguments. As a summary, here are the differences:
-
-## **Meltano:**
-
-* **Meltano is built on top of the Singer protocol, whereas Airbyte is built on top of the Airbyte protocol**. Having initially created Airbyte on top of Singer, we wrote about why we didn't move forward with it [here](https://airbyte.com/blog/why-you-should-not-build-your-data-pipeline-on-top-of-singer) and [here](https://airbyte.com/blog/airbyte-vs-singer-why-airbyte-is-not-built-on-top-of-singer). Summarized, the reasons were: Singer connectors didn't always adhere to the Singer protocol, had poor standardization and visibility in terms of quality, and community governance and support had been abandoned by Stitch. By contrast, we aim to make Airbyte a product that ["just works"](https://airbyte.com/blog/our-truth-for-2021-airbyte-just-works) and always plan to maximize engagement within the Airbyte community.
-* **CLI-first approach:** Meltano was primarily built with a command line interface in mind. In that sense, they seem to target engineers with a preference for that interface.
-* **Integration with Airflow for orchestration:** You can either use Meltano alone for orchestration or with Airflow; Meltano works both ways.
-* All connectors must use Python.
-* Meltano works with any of Singer's 200+ available connectors. However, in our experience, quality has been hit or miss.
-
-## **Airbyte:**
-
-In contrast, Airbyte is a company fully committed to the open-source project and has a [business model](https://handbook.airbyte.io/strategy/business-model) in mind around this project. Our [team](https://airbyte.com/about-us) is made up of data integration experts that have built more than 1,000 integrations collectively at large scale. The team now counts 20 engineers working full-time on Airbyte.
-
-* **Airbyte supports more than 100 connectors after only 1 year since its inception**, 20% of which were built by the community. Our ambition is to support **200+ connectors by the end of 2021.**
-* Airbyte’s connectors are **usable out of the box through a UI and API,** with monitoring, scheduling and orchestration. Airbyte was built on the premise that a user, whatever their background, should be able to move data in 2 minutes. Data engineers might want to use raw data and their own transformation processes, or to use Airbyte’s API to include data integration in their workflows. On the other hand, analysts and data scientists might want to use normalized consolidated data in their database or data warehouses. Airbyte supports all these use cases.
-* **One platform, one project with standards:** This will help consolidate the developments behind one single project, some standardization and a specific data protocol that can benefit all teams and specific cases.
-* **Not limited by Singer’s data protocol:** In contrast to Meltano, Airbyte was not built on top of Singer, but its data protocol is compatible with Singer’s. This means Airbyte can go beyond Singer, but Meltano will remain limited.
-* **Connectors can be built in the language of your choice,** as Airbyte runs them as Docker containers.
-* **Airbyte integrates with your data stack and your needs:** Airflow, Kubernetes, dbt, etc. Its normalization is optional; it gives you a basic version that works out of the box, but also allows you to use dbt to do more complicated things.
-
-## **Other noteworthy differences:**
-
-* In terms of community, Meltano's Slack community got 430 new members in the last 6 months, while Airbyte got 800.
-* The difference in velocity in terms of feature progress is easily measurable as both are open-source projects. Meltano closes about 30 issues per month, while Airbyte closes about 120.
-
diff --git a/docs/archive/faq/differences-with/pipelinewise-vs-airbyte.md b/docs/archive/faq/differences-with/pipelinewise-vs-airbyte.md
deleted file mode 100644
index adcc9c2bf376..000000000000
--- a/docs/archive/faq/differences-with/pipelinewise-vs-airbyte.md
+++ /dev/null
@@ -1,25 +0,0 @@
-# PipelineWise vs Airbyte
-
-## **PipelineWise:**
-
-PipelineWise is an open-source project by Transferwise that was built with the primary goal of serving their own needs. There is no business model attached to the project, and no apparent interest in growing the community.
-
-* **Supports 21 connectors,** and only adds new ones based on the needs of the mother company, Transferwise.
-* **No business model attached to the project,** and no apparent interest from the company in growing the community.
-* **As close to the original format as possible:** PipelineWise aims to reproduce the data from the source to an Analytics-Data-Store in as close to the original format as possible. Some minor load time transformations are supported, but complex mapping and joins have to be done in the Analytics-Data-Store to extract meaning.
-* **Managed Schema Changes:** When source data changes, PipelineWise detects the change and alters the schema in your Analytics-Data-Store automatically.
-* **YAML based configuration:** Data pipelines are defined as YAML files, ensuring that the entire configuration is kept under version control.
-* **Lightweight:** No daemons or database setup are required.
-
-## **Airbyte:**
-
-In contrast, Airbyte is a company fully committed to the open-source project and has a [business model in mind](https://handbook.airbyte.io/) around this project.
-
-* Our ambition is to support **300+ connectors by the end of 2021.** We already supported about 50 connectors at the end of 2020, just 5 months after its inception.
-* Airbyte’s connectors are **usable out of the box through a UI and API,** with monitoring, scheduling and orchestration. Airbyte was built on the premise that a user, whatever their background, should be able to move data in 2 minutes. Data engineers might want to use raw data and their own transformation processes, or to use Airbyte’s API to include data integration in their workflows. On the other hand, analysts and data scientists might want to use normalized consolidated data in their database or data warehouses. Airbyte supports all these use cases.
-* **One platform, one project with standards:** This will help consolidate the developments behind one single project, some standardization and a specific data protocol that can benefit all teams and specific cases.
-* **Connectors can be built in the language of your choice,** as Airbyte runs them as Docker containers.
-* **Airbyte integrates with your data stack and your needs:** Airflow, Kubernetes, dbt, etc. Its normalization is optional; it gives you a basic version that works out of the box, but also allows you to use dbt to do more complicated things.
-
-The data protocols for both projects are compatible with Singer’s. So it is easy to migrate a Singer tap or target onto Airbyte or PipelineWise.
-
diff --git a/docs/archive/faq/differences-with/singer-vs-airbyte.md b/docs/archive/faq/differences-with/singer-vs-airbyte.md
deleted file mode 100644
index 58edd43eedb0..000000000000
--- a/docs/archive/faq/differences-with/singer-vs-airbyte.md
+++ /dev/null
@@ -1,28 +0,0 @@
-# Singer vs Airbyte
-
-If you want to understand the difference between Airbyte and Singer, you might be interested in 2 articles we wrote:
-
-* “[Airbyte vs. Singer: Why Airbyte is not built on top of Singer](https://airbyte.com/articles/data-engineering-thoughts/airbyte-vs-singer-why-airbyte-is-not-built-on-top-of-singer/).”
-* “[The State of Open-Source Data Integration and ETL](https://airbyte.com/articles/data-engineering-thoughts/the-state-of-open-source-data-integration-and-etl/),” in which we list and compare all ETL-related open-source projects, including Singer and Airbyte. As a summary, here are the differences:
-
-![](https://airbyte.com/wp-content/uploads/2020/10/Landscape-of-open-source-data-integration-platforms-4.png)
-
-## **Singer:**
-
-* **Supports 96 connectors after 4 years.**
-* **Increasingly outdated connectors:** Talend \(acquirer of StitchData\) seems to have stopped investing in maintaining Singer’s community and connectors. As most connectors see schema changes several times a year, more and more of Singer’s taps and targets are not actively maintained and are becoming outdated.
-* **Absence of standardization:** each connector is its own open-source project. So you never know the quality of a tap or target until you have actually used it. There is no guarantee whatsoever about what you’ll get.
-* **Singer’s connectors are standalone binaries:** you still need to build everything around them to make them work \(e.g. UI, configuration validation, state management, normalization, schema migration, monitoring, etc\).
-* **No full commitment to open sourcing all connectors,** as some connectors are only offered by StitchData under a paid plan.
-
-## **Airbyte:**
-
-* Our ambition is to support **300+ connectors by the end of 2021.** We already supported about 50 connectors at the end of 2020, just 5 months after its inception.
-* Airbyte’s connectors are **usable out of the box through a UI and API**, with monitoring, scheduling and orchestration. Airbyte was built on the premise that a user, whatever their background, should be able to move data in 2 minutes. Data engineers might want to use raw data and their own transformation processes, or to use Airbyte’s API to include data integration in their workflows. On the other hand, analysts and data scientists might want to use normalized consolidated data in their database or data warehouses. Airbyte supports all these use cases.
-* **One platform, one project with standards:** This will help consolidate the developments behind one single project, some standardization and a specific data protocol that can benefit all teams and specific cases.
-* **Connectors can be built in the language of your choice,** as Airbyte runs them as Docker containers.
-* **Airbyte integrates with your data stack and your needs:** Airflow, Kubernetes, dbt, etc. Its normalization is optional; it gives you a basic version that works out of the box, but also allows you to use dbt to do more complicated things.
-* **A full commitment to the open-source MIT project** with the promise not to hide some connectors behind paid walls.
-
-Note that Airbyte’s data protocol is compatible with Singer’s. So it is easy to migrate a Singer tap onto Airbyte.
-
diff --git a/docs/archive/faq/differences-with/stitchdata-vs-airbyte.md b/docs/archive/faq/differences-with/stitchdata-vs-airbyte.md
deleted file mode 100644
index ec612ea9b2b1..000000000000
--- a/docs/archive/faq/differences-with/stitchdata-vs-airbyte.md
+++ /dev/null
@@ -1,29 +0,0 @@
-# StitchData vs Airbyte
-
-We wrote an article, “[Open-source vs. Commercial Software: How to Solve the Data Integration Problem](https://airbyte.com/articles/data-engineering-thoughts/open-source-vs-commercial-software-how-to-better-solve-data-integration/),” in which we describe the pros and cons of StitchData’s commercial approach and Airbyte’s open-source approach. Don’t hesitate to check it out for more detailed arguments. As a summary, here are the differences:
-
-![](https://airbyte.com/wp-content/uploads/2020/10/Open-source-vs-commercial-approach-2048x1843.png)
-
-## StitchData:
-
-* **Limited, deprecating connectors:** Stitch only supports 150 connectors. Talend has stopped investing in StitchData and its connectors. And on Singer, each connector is its own open-source project. So you never know the quality of a tap or target until you have actually used it. There is no guarantee whatsoever about what you’ll get.
-* **Pricing indexed on usage:** StitchData’s pricing is indexed on the connectors used and the volume of data transferred. Teams always need to keep that in mind and are not free to move data without thinking about cost.
-* **Security and privacy compliance:** all companies are subject to privacy compliance laws, such as GDPR, CCPA, HIPAA, etc. As a matter of fact, above a certain stage \(about 100 employees\) in a company, all external products need to go through a security compliance process that can take several months.
-* **No moving data between internal databases:** StitchData sits in the cloud, so if you have to replicate data from an internal database to another, it makes no sense to have the data move through their cloud for privacy and cost reasons.
-* **StitchData’s Singer connectors are standalone binaries:** you still need to build everything around them to make them work. And it’s hard to update some pre-built connectors, as they are of poor quality.
-
-## Airbyte:
-
-* **Free, as open source, so no more pricing based on usage:** learn more about our [future business model](https://handbook.airbyte.io/strategy/business-model) \(connectors will always remain open-source\).
-* **Supporting 50+ connectors by the end of 2020** \(so in only 5 months of existence\). Our goal is to reach 300+ connectors by the end of 2021.
-* **Building new connectors made trivial, in the language of your choice:** Airbyte makes it a lot easier to create your own connector, vs. building them yourself in-house \(with Airflow or other tools\). Scheduling, orchestration, and monitoring come out of the box with Airbyte.
-* **Maintenance-free connectors you can use in minutes.** Just authenticate your sources and warehouse, and get connectors that adapt to schema and API changes for you.
-* **Addressing the long tail of connectors:** with the help of the community, Airbyte aims to support thousands of connectors.
-* **Adapt existing connectors to your needs:** you can adapt any existing connector to address your own unique edge case.
-* **Using data integration in a workflow:** Airbyte’s API lets engineering teams add data integration jobs into their workflow seamlessly.
-* **Integrates with your data stack and your needs:** Airflow, Kubernetes, dbt, etc. Its normalization is optional; it gives you a basic version that works out of the box, but also allows you to use dbt to do more complicated things.
-* **Debugging autonomy:** if you experience any connector issue, you won’t need to wait for StitchData’s customer support team to get back to you, if you can fix the issue fast yourself.
-* **Your data stays in your cloud.** Have full control over your data, and the costs of your data transfers.
-* **No more security and privacy compliance, as self-hosted and open-sourced \(MIT\).** Any team can directly address their integration needs.
-* **Premium support directly on our Slack for free**. Our time to resolution is about 3-4 hours on average.
-
diff --git a/docs/archive/faq/getting-started.md b/docs/archive/faq/getting-started.md
deleted file mode 100644
index 1ab44be311f0..000000000000
--- a/docs/archive/faq/getting-started.md
+++ /dev/null
@@ -1,50 +0,0 @@
-# Getting Started
-
-## **What do I need to get started using Airbyte?**
-
-You can deploy Airbyte in several ways, as [documented here](../../deploying-airbyte/README.md). Airbyte will then help you replicate data between a source and a destination. If you don’t see the connector you need, you can [build your connector yourself](../../connector-development) and benefit from Airbyte’s optional scheduling, orchestration and monitoring modules.
-
-## **How long does it take to set up Airbyte?**
-
-It depends on your source and destination. Check our setup guides to see the tasks for your source and destination. Each source and destination also has a list of prerequisites for setup. To make setup faster, get your prerequisites ready before you start to set up your connector. During the setup process, you may need to contact others \(like a database administrator or AWS account owner\) for help, which might slow you down. But if you have access to the connection information, it can take 2 minutes: see the [demo video](https://www.youtube.com/watch?v=jWVYpUV9vEg).
-
-## **What data sources does Airbyte offer connectors for?**
-
-We already offer 100+ connectors, and will focus all our efforts on ramping up the number of connectors and strengthening them. If you don’t see a source you need, you can file a [connector request here](https://github.com/airbytehq/airbyte/issues/new?assignees=&labels=area%2Fintegration%2C+new-integration&template=new-integration-request.md&title=).
-
-## **Where can I see my data in Airbyte?**
-
-You can’t see your data in Airbyte, because we don’t store it. The sync loads your data into your destination \(data warehouse, data lake, etc.\). While you can’t see your data directly in Airbyte, you can check your schema and sync status on the source detail page in Airbyte.
-
-## **Can I add multiple destinations?**
-
-Sure, you can. Just go to the "Destinations" section and click on the top right "+ new destination" button. You can have multiple destinations for the same source, and multiple sources for the same destination.
-
-## Am I limited to GUI interaction or is there a way to set up / run / interact with Airbyte programmatically?
-
-You can use the API to do anything you do today from the UI. Note, though, that the API is in alpha and may change. You won’t lose any functionality, but you may need to update your code to catch up to any backwards-incompatible changes in the API.
-
-## How does Airbyte handle connecting to databases that are behind a firewall / NAT?
-
-We don’t. Airbyte is meant to be self-hosted in your own private cloud.
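-
-That said, a common workaround \(which Airbyte does not manage for you\) is to open an SSH tunnel from the machine running Airbyte through a bastion host, and point the source at the local end of the tunnel. A rough sketch, with all hostnames hypothetical:
-
-```bash
-# Forward local port 5432 through a bastion host to the firewalled database,
-# then configure the Airbyte source with Host=localhost, Port=5432:
-ssh -N -L 5432:db.internal:5432 user@bastion.example.com
-```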
-
-## Can I set a start time for my integration?
-
-[Here](/using-airbyte/core-concepts/sync-modes#sync-schedules) is the link to the docs on scheduling syncs.
-
-## **Can I disable analytics in Airbyte?**
-
-Yes, you can control what's sent outside of Airbyte for analytics purposes.
-
-We added the following telemetry to Airbyte to ensure the best experience for users:
-
-* Measure usage of features & connectors
-* Measure failure rate of connectors to address bugs quickly
-* Reach out to our users about Airbyte community updates if they opt in
-* ...
-
-To disable telemetry, modify the `.env` file and define the following environment variable:
-
-```text
-TRACKING_STRATEGY=logging
-```
diff --git a/docs/archive/faq/security-and-data-audits.md b/docs/archive/faq/security-and-data-audits.md
deleted file mode 100644
index e56db4de7ac3..000000000000
--- a/docs/archive/faq/security-and-data-audits.md
+++ /dev/null
@@ -1,14 +0,0 @@
-# Security & Data Audits
-
-## **How secure is Airbyte?**
-
-Airbyte is an open-source, self-hosted solution, so it is as safe as your own data infrastructure.
-
-## **Is Airbyte GDPR compliant?**
-
-Airbyte is a self-hosted solution, so it doesn’t bring any security or privacy risk to your infrastructure. We do intend to add data quality and privacy compliance features in the future, in order to give you more visibility on that topic.
-
-## **How does Airbyte charge?**
-
-We don’t. All connectors are under the MIT license. If you are curious about the business model we have in mind, please check our [company handbook](https://handbook.airbyte.io/strategy/business-model).
-
diff --git a/docs/archive/faq/transformation-and-schemas.md b/docs/archive/faq/transformation-and-schemas.md
deleted file mode 100644
index b759f73b7146..000000000000
--- a/docs/archive/faq/transformation-and-schemas.md
+++ /dev/null
@@ -1,20 +0,0 @@
-# Transformation and Schemas
-
-## **Where's the T in Airbyte’s ETL tool?**
-
-Airbyte is actually an ELT tool, and you have the freedom to use it as an EL-only tool. The transformation part is done by default, but it is optional. You can choose to receive the data raw \(as JSON files, for instance\) in your destination.
-
-We do provide normalization \(if the option is on\) so that data analysts / scientists / any users of the data can use it without much effort.
-
-We also intend to integrate deeply with dbt to make it easier for your team to continue relying on it, if that is what you were doing.
-
-## **How does Airbyte handle replication when a data source changes its schema?**
-
-Airbyte continues to sync data using the configured schema until that schema is updated. Because Airbyte treats all fields as optional, if a field is renamed or deleted in the source, that field simply will no longer be replicated, but all remaining fields will. The same is true for streams as well.
-
-For now, the schema can only be updated manually in the UI \(by clicking "Update Schema" in the settings page for the connection\). When a schema is updated, Airbyte will re-sync all data for that source using the new schema.
-
-## **How does Airbyte handle namespaces \(or schemas for the DB-inclined\)?**
-
-Airbyte respects source-defined namespaces when syncing data with a namespace-supported destination. See [this](/using-airbyte/core-concepts/namespaces.md) for more details.
- diff --git a/docs/archive/mongodb.md b/docs/archive/mongodb.md deleted file mode 100644 index d239da867673..000000000000 --- a/docs/archive/mongodb.md +++ /dev/null @@ -1,102 +0,0 @@ -# Mongo DB - -The MongoDB source supports Full Refresh and Incremental sync strategies. - -## Resulting schema - -MongoDB does not have anything like table definition, thus we have to define column types from actual attributes and their values. Discover phase have two steps: - -### Step 1. Find all unique properties - -Connector runs the map-reduce command which returns all unique document props in the collection. Map-reduce approach should be sufficient even for large clusters. - -#### Note - -To work with Atlas MongoDB, a **non-free** tier is required, as the free tier does not support the ability to perform the mapReduce operation. - -### Step 2. Determine property types - -For each property found, connector selects 10k documents from the collection where this property is not empty. If all the selected values have the same type - connector will set appropriate type to the property. In all other cases connector will fallback to `string` type. - -## Features - -| Feature | Supported | -| :--- | :--- | -| Full Refresh Sync | Yes | -| Incremental - Append Sync | Yes | -| Replicate Incremental Deletes | No | -| Namespaces | No | - -### Full Refresh sync - -Works as usual full refresh sync. - -### Incremental sync - -Cursor field can not be nested. Currently only top level document properties are supported. - -Cursor should **never** be blank. In case cursor is blank - the incremental sync results might be unpredictable and will totally rely on MongoDB comparison algorithm. - -Only `datetime` and `integer` cursor types are supported. Cursor type is determined based on the cursor field name: - -* `datetime` - if cursor field name contains a string from: `time`, `date`, `_at`, `timestamp`, `ts` -* `integer` - otherwise - -## Getting started - -This guide describes in details how you can configure MongoDB for integration with Airbyte. - -### Create users - -Run `mongo` shell, switch to `admin` database and create a `READ_ONLY_USER`. `READ_ONLY_USER` will be used for Airbyte integration. Please make sure that user has read-only privileges. - -```javascript -mongo -use admin; -db.createUser({user: "READ_ONLY_USER", pwd: "READ_ONLY_PASSWORD", roles: [{role: "read", db: "TARGET_DATABASE"}]} -``` - -Make sure the user have appropriate access levels. - -### Configure application - -In case your application uses MongoDB without authentication you will have to adjust code base and MongoDB config to enable MongoDB authentication. **Otherwise your application might go down once MongoDB authentication will be enabled.** - -### Enable MongoDB authentication - -Open `/etc/mongod.conf` and add/replace specific keys: - -```yaml -net: - bindIp: 0.0.0.0 - -security: - authorization: enabled -``` - -Binding to `0.0.0.0` will allow to connect to database from any IP address. - -The last line will enable MongoDB security. Now only authenticated users will be able to access the database. - -### Configure firewall - -Make sure that MongoDB is accessible from external servers. Specific commands will depend on the firewall you are using \(UFW/iptables/AWS/etc\). Please refer to appropriate documentation. - -Your `READ_ONLY_USER` should now be ready for use with Airbyte. 
- - -#### Possible configuration Parameters - -* [Authentication Source](https://docs.mongodb.com/manual/reference/connection-string/#mongodb-urioption-urioption.authSource) -* Host: URL of the database -* Port: Port to use for connecting to the database -* User: username to use when connecting -* Password: used to authenticate the user -* [Replica Set](https://docs.mongodb.com/manual/reference/connection-string/#mongodb-urioption-urioption.replicaSet) -* Whether to enable SSL - - -## Changelog -| Version | Date | Pull Request | Subject | -| :------ | :-------- | :----- | :------ | -| 0.2.3 | 2021-07-20 | [4669](https://github.com/airbytehq/airbyte/pull/4669) | Subscriptions Stream now returns all kinds of subscriptions (including expired and canceled)| diff --git a/docs/archive/securing-airbyte.md b/docs/archive/securing-airbyte.md deleted file mode 100644 index 727ff5043eeb..000000000000 --- a/docs/archive/securing-airbyte.md +++ /dev/null @@ -1,28 +0,0 @@ -# Securing Airbyte access - -## Reporting Vulnerabilities -⚠️ Please do not file GitHub issues or post on our public forum for security vulnerabilities as they are public! ⚠️ - -Airbyte takes security issues very seriously. If you have any concern around Airbyte or believe you have uncovered a vulnerability, please get in touch via the e-mail address security@airbyte.io. In the message, try to provide a description of the issue and ideally a way of reproducing it. The security team will get back to you as soon as possible. - -Note that this security address should be used only for undisclosed vulnerabilities. Dealing with fixed issues or general questions on how to use the security features should be handled regularly via the user and the dev lists. Please report any security problems to us before disclosing it publicly. - -## Access control - -Airbyte, in its open-source version, does not support RBAC to manage access to the UI. - -However, multiple options exist for the operators to implement access control themselves. - -To secure access to Airbyte you have three options: -* Networking restrictions: deploy Airbyte in a private network or use a firewall to filter which IP is allowed to access your host. -* Put Airbyte behind a reverse proxy and handle the access control on the reverse proxy side. 
-* If you deployed Airbyte on a cloud provider: - * GCP: use the [Identity-Aware proxy](https://cloud.google.com/iap) service - * AWS: use the [AWS Systems Manager Session Manager](https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager.html) service - -**Non exhaustive** online resources list to set up auth on your reverse proxy: -* [Configure HTTP Basic Auth on NGINX for Airbyte](https://shadabshaukat.medium.com/deploy-and-secure-airbyte-with-nginx-reverse-proxy-basic-authentication-lets-encrypt-ssl-72bee223a4d9) -* [Kubernetes: Basic auth on a Nginx ingress controller](https://kubernetes.github.io/ingress-nginx/examples/auth/basic/) -* [How to set up Okta SSO on an NGINX reverse proxy](https://developer.okta.com/blog/2018/08/28/nginx-auth-request) -* [How to enable HTTP Basic Auth on Caddy](https://caddyserver.com/docs/caddyfile/directives/basicauth) -* [SSO for Traefik](https://github.com/thomseddon/traefik-forward-auth) diff --git a/docs/cloud/managing-airbyte-cloud/configuring-connections.md b/docs/cloud/managing-airbyte-cloud/configuring-connections.md index 6e8672d9f894..129fd366a48f 100644 --- a/docs/cloud/managing-airbyte-cloud/configuring-connections.md +++ b/docs/cloud/managing-airbyte-cloud/configuring-connections.md @@ -25,9 +25,9 @@ You can configure the following settings: | Setting | Description | |--------------------------------------|-------------------------------------------------------------------------------------| | Replication frequency | How often the data syncs | -| [Destination namespace](/using-airbyte/namespaces.md) | Where the replicated data is written | +| [Destination namespace](/using-airbyte/core-concepts/namespaces.md) | Where the replicated data is written | | Destination stream prefix | How you identify streams from different connectors | -| [Detect and propagate schema changes](/using-airbyte/manage-schema-changes.md) | How Airbyte handles syncs when it detects schema changes in the source | +| [Detect and propagate schema changes](/cloud/managing-airbyte-cloud/manage-schema-changes.md) | How Airbyte handles syncs when it detects schema changes in the source | | Connection Data Residency | Where data will be processed | To use [cron scheduling](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html): diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md index 160b28d5f47e..90ddf71220bc 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md +++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md @@ -1,6 +1,6 @@ # Manage notifications -This page provides guidance on how to manage notifications for Airbyte Cloud, allowing you to stay up-to-date on the activities in your workspace. +This page provides guidance on how to manage notifications for Airbyte, allowing you to stay up-to-date on the activities in your workspace. ## Notification Event Types @@ -18,6 +18,8 @@ This page provides guidance on how to manage notifications for Airbyte Cloud, al ## Configure Notification Settings + + To set up email notifications: 1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**. 
diff --git a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md index 2106f0d12a92..024772fac02d 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md +++ b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md @@ -26,7 +26,7 @@ To choose your default data residency: :::info -Depending on your network configuration, you may need to add [IP addresses](/operating-airbyte/security#network-security-1.md) to your allowlist. +Depending on your network configuration, you may need to add [IP addresses](/operating-airbyte/security.md#network-security-1) to your allowlist. ::: diff --git a/docs/operator-guides/configuring-sync-notifications.md b/docs/operator-guides/configuring-sync-notifications.md index 837310c00af2..9b4d2efd41d3 100644 --- a/docs/operator-guides/configuring-sync-notifications.md +++ b/docs/operator-guides/configuring-sync-notifications.md @@ -38,7 +38,7 @@ Click `Copy.` **Add the webhook to Airbyte.** -Assuming you have a [running instance of Airbyte](../deploying-airbyte/README.md), we can navigate to the UI. Click on Settings and then click on `Notifications`. +Assuming you have a [running instance of Airbyte](/deploying-airbyte/), we can navigate to the UI. Click on Settings and then click on `Notifications`. ![](../.gitbook/assets/notifications_airbyte_settings.png) From a0b2d2a9b636fadc9eb850133a8f33b6ba17b8ed Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 15:58:59 +0000 Subject: [PATCH 20/52] changes --- .../manage-airbyte-cloud-notifications.md | 69 ++++++++++++++----- .../configuring-sync-notifications.md | 57 --------------- .../core-concepts/basic-normalization.md | 2 +- docs/using-airbyte/core-concepts/readme.md | 2 +- .../getting-started/add-a-destination.md | 2 +- docs/using-airbyte/getting-started/readme.md | 18 ++--- .../getting-started/set-up-a-connection.md | 4 +- docusaurus/src/scripts/cloudStatus.js | 4 +- 8 files changed, 63 insertions(+), 95 deletions(-) delete mode 100644 docs/operator-guides/configuring-sync-notifications.md diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md index 90ddf71220bc..741a4f809232 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md +++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md @@ -12,43 +12,74 @@ This page provides guidance on how to manage notifications for Airbyte, allowing | Connection Updates Requiring Action | A connection update requires you to take action (ex. a breaking schema change is detected) | | Warning - Repeated Failures | A connection will be disabled soon due to repeated failures. It has failed 50 times consecutively or there were only failed jobs in the past 7 days | | Sync Disabled - Repeated Failures | A connection was automatically disabled due to repeated failures. 
It will be disabled when it has failed 100 times consecutively or has been failing for 14 days in a row |
-| Warning - Upgrade Required (email only) | A new connector version is available and requires manual upgrade |
-| Sync Disabled - Upgrade Required (email only) | One or more connections were automatically disabled due to a connector upgrade deadline passing
-|
-
-## Configure Notification Settings
+| Warning - Upgrade Required (Cloud only) | A new connector version is available and requires manual upgrade |
+| Sync Disabled - Upgrade Required (Cloud only) | One or more connections were automatically disabled due to a connector upgrade deadline passing |
+## Configure Email Notification Settings
+
 To set up email notifications:
-1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**.
+1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings** and navigate to **Notifications**.
 
-2. Click **Notifications**.
+2. Toggle which messages you'd like to receive from Airbyte. All email notifications will be sent by default to the creator of the workspace. To change the recipient, edit and save the **notification email recipient**. If you would like to send email notifications to more than one recipient, you can enter an email distribution list (i.e. a Google Group) as the recipient.
 
-3. Toggle which messages you'd like to receive from Airbyte. All email notifications will be sent by default to the creator of the workspace. To change the recipient, edit and save the **notification email recipient**. If you would like to send email notifications to more than one recipient, you can enter an email distribution list (ie Google Group) as the recipient.
+3. Click **Save changes**.
 
-4. Click **Save changes**.
+:::note
+All email notifications except for Successful Syncs are enabled by default.
+:::
+
+## Configure Slack Notification Settings
+
+To set up Slack notifications, follow the steps below. If you're more of a visual learner, head over to [this video](https://www.youtube.com/watch?v=NjYm8F-KiFc&ab_channel=Airbyte) to learn how to do this. You can also refer to the Slack documentation on how to [create an incoming webhook for Slack](https://api.slack.com/messaging/webhooks).
+
+### Create a Slack app
+
+1. **Create a Slack App**: Navigate to https://api.slack.com/apps/. Select `Create an App`.
+
+![](../../.gitbook/assets/notifications_create_slack_app.png)
+
+2. Select `From Scratch`. Enter your App Name (e.g. Airbyte Sync Notifications) and pick your desired Slack workspace.
+
+3. **Set up the webhook URL**: In the left sidebar, click on `Incoming Webhooks`. Click the slider button in the top right to turn the feature on. Then click `Add New Webhook to Workspace`.
-To set up webhook notifications:
+![](../../.gitbook/assets/notifications_add_new_webhook.png)
-1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**.
+4. Pick the channel that you want to receive Airbyte notifications in (ideally a dedicated one), and click `Allow` to give it permissions to access the channel. You should see the bot show up in the selected channel now. You will see an active webhook right above the `Add New Webhook to Workspace` button.
-2. Click **Notifications**.
+![](../../.gitbook/assets/notifications_webhook_url.png)
-3. Have a webhook URL ready if you plan to use webhook notifications. Using a Slack webook is recommended. [Create an Incoming Webhook for Slack](https://api.slack.com/messaging/webhooks).
+5. 
Click `Copy` to copy the link to your clipboard, which you will need to enter into Airbyte.
-4. Toggle the type of events you are interested to receive notifications for.
-   1. To enable webhook notifications, the webhook URL is required. For your convenience, we provide a 'test' function to send a test message to your webhook URL so you can make sure it's working as expected.
+
+Your Webhook URL should look something like this:
-5. Click **Save changes**.
+![](../../.gitbook/assets/notifications_airbyte_notification_settings.png)
+
+
+### Enable Slack notifications in Airbyte
+
+1. In the Airbyte UI, click **Settings** and navigate to **Notifications**.
+
+2. Paste the copied webhook URL into `Webhook URL`. On this page, you can toggle each slider to decide whether you want notifications for each notification type.
+
+3. **Test it out**: Click `Test` to send a test message to the channel, or just run a sync now and try it out! If all goes well, you should receive a notification in your selected channel that looks like this:
+
+![](../../.gitbook/assets/notifications_slack_message.png)
+
+4. Click **Save changes**.
+
+You're done!
 
 ## Enable schema update notifications
 
-To get notified when your source schema changes:
-1. Make sure you have `Automatic Connection Updates` and `Connection Updates Requiring Action` turned on for your desired notification channels; If these are off, even if you turned on schema update notifications in a connection's settings, Airbyte will *NOT* send out any notifications related to these types of events.
+To be notified of any source schema changes:
+1. Make sure you have enabled `Automatic Connection Updates` and `Connection Updates Requiring Action` notifications. If these are off, even if you turned on schema update notifications in a connection's settings, Airbyte will *NOT* send out any notifications related to these types of events.
 
-2. On the [Airbyte Cloud](http://cloud.airbyte.com/) dashboard, click **Connections** and select the connection you want to receive notifications for.
+2. On the [Airbyte](http://cloud.airbyte.com/) dashboard, click **Connections** and select the connection you want to receive notifications for.
 
 3. Click the **Settings** tab on the Connection page.
 
diff --git a/docs/operator-guides/configuring-sync-notifications.md b/docs/operator-guides/configuring-sync-notifications.md
deleted file mode 100644
index 9b4d2efd41d3..000000000000
--- a/docs/operator-guides/configuring-sync-notifications.md
+++ /dev/null
@@ -1,57 +0,0 @@
-# Configuring Sync Notifications
-
-// TODO: merge into other notification doc
-
-## Overview
-
-You can set up Airbyte to notify you when syncs have **failed** or **succeeded**. This is achieved through a webhook, a URL that you can input into other applications to get real time data from Airbyte.
-
-## Set up Slack Notifications on Sync Status
-
-If you're more of a visual learner, just head over to [this video](https://www.youtube.com/watch?v=NjYm8F-KiFc&ab_channel=Airbyte) to learn how to do this. Otherwise, keep reading!
-
-**Set up the bot.**
-
-Navigate to https://api.slack.com/apps/. Hit `Create an App`.
-
-![](../.gitbook/assets/notifications_create_slack_app.png)
-
-Then click `From scratch`. Enter your App Name (e.g. Airbyte Sync Notifications) and pick your desired Slack workspace.
-
-**Set up the webhook URL.**
-
-Now on the left sidebar, click on `Incoming Webhooks`. 
-
-![](../.gitbook/assets/notifications_incoming_webhooks.png)
-
-Click the slider button in the top right to turn the feature on. Then click `Add New Webhook to Workspace`.
-
-![](../.gitbook/assets/notifications_add_new_webhook.png)
-
-Pick the channel that you want to receive Airbyte notifications in (ideally a dedicated one), and click `Allow` to give it permissions to access the channel. You should see the bot show up in the selected channel now.
-
-Now you should see an active webhook right above the `Add New Webhook to Workspace` button.
-
-![](../.gitbook/assets/notifications_webhook_url.png)
-
-Click `Copy.`
-
-**Add the webhook to Airbyte.**
-
-Assuming you have a [running instance of Airbyte](/deploying-airbyte/), we can navigate to the UI. Click on Settings and then click on `Notifications`.
-
-![](../.gitbook/assets/notifications_airbyte_settings.png)
-
-Simply paste the copied webhook URL in `Connection status Webhook URL` and you're ready to go! On this page, you can click one or both of the sliders to decide whether you want notifications on sync successes, failures, or both. Make sure to click `Save changes` before you leave.
-
-Your Webhook URL should look something like this:
-
-![](../.gitbook/assets/notifications_airbyte_notification_settings.png)
-
-**Test it out.**
-
-From the settings page, you can click `Test` to send a test message to the channel. Or, just run a sync now and try it out! If all goes well, you should receive a notification in your selected channel that looks like this:
-
-![](../.gitbook/assets/notifications_slack_message.png)
-
-You're done!
diff --git a/docs/using-airbyte/core-concepts/basic-normalization.md b/docs/using-airbyte/core-concepts/basic-normalization.md
index 3d94e7847f90..b2ef3700b866 100644
--- a/docs/using-airbyte/core-concepts/basic-normalization.md
+++ b/docs/using-airbyte/core-concepts/basic-normalization.md
@@ -2,7 +2,7 @@
 
 :::danger
 
-Basic normalization is being removed in favor of [Typing and Deduping](/understanding-airbyte/typing-deduping), as part of [Destinations V2](/release_notes/upgrading_to_destinations_v2). This pages remains as a guide for legacy connectors.
+Basic normalization is being removed in favor of [Typing and Deduping](typing-deduping.md), as part of [Destinations V2](/release_notes/upgrading_to_destinations_v2). This page remains as a guide for legacy connectors.
 
 :::
 
diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md
index ea824ec627f2..6345e43334ea 100644
--- a/docs/using-airbyte/core-concepts/readme.md
+++ b/docs/using-airbyte/core-concepts/readme.md
@@ -77,7 +77,7 @@ Airbyte supports the following configuration options for a connection:
 | Mirror source structure | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. |
 | Custom format | All streams will be replicated to a single user-defined namespace. |
 
-For more details, see our [Namespace documentation](/using-airbyte/namespaces).
+For more details, see our [Namespace documentation](namespaces.md).
 
## Connection sync modes
 
diff --git a/docs/using-airbyte/getting-started/add-a-destination.md b/docs/using-airbyte/getting-started/add-a-destination.md
index 5d139a0d41a8..cc473d8384f3 100644
--- a/docs/using-airbyte/getting-started/add-a-destination.md
+++ b/docs/using-airbyte/getting-started/add-a-destination.md
@@ -9,7 +9,7 @@ Once you've logged in to your Airbyte Open Source deployment, click on the **Des
 You can use the provided search bar at the top of the page, or scroll down the list to find the destination you want to replicate data from.
 
 :::tip
-You can filter the list of destinations by support level. Airbyte connectors are categorized in two support levels, Certified and Community. See our [Connector Support Levels](./integrations/connector-support-levels) page for more information on this topic.
+You can filter the list of destinations by support level. Airbyte connectors are categorized in two support levels, Certified and Community. See our [Connector Support Levels](/integrations/connector-support-levels.md) page for more information on this topic.
 :::
 
 As an example, we'll be setting up a simple JSON file that will be saved on our local system as the destination. Select **Local JSON** from the list of destinations. This will take you to the destination setup page.
diff --git a/docs/using-airbyte/getting-started/readme.md b/docs/using-airbyte/getting-started/readme.md
index 6c1624a39866..2d1d7d4feb27 100644
--- a/docs/using-airbyte/getting-started/readme.md
+++ b/docs/using-airbyte/getting-started/readme.md
@@ -8,21 +8,15 @@ To use Airbyte Cloud, [sign up](https://cloud.airbyte.io/signup) with your email
 
 Airbyte Cloud offers a 14-day free trial that begins after your first successful sync. For more details on our pricing model, see our [pricing page](https://www.airbyte.com/pricing).
 
-If you signed up using your email address, Airbyte will send you an email with a verification link. On clicking the link, you'll be taken to your new workspace.
-
-  :::note
-  If you have been invited to an existing workspace, you cannot use the Google login option to create a new Airbyte account. Use email instead.
-  :::
-
 To start setting up a data pipeline, see how to [set up a source](./add-a-source.md).
 
-  :::info
-  Depending on your data residency, you may need to [allowlist IP addresses](/operating-airbyte/security) to enable access to Airbyte.
-  :::
+:::info
+Depending on your data residency, you may need to [allowlist IP addresses](/operating-airbyte/security.md#network-security-1) to enable access to Airbyte.
+:::
 
-## Deploy Airbyte (Open-Source)
+## Deploy Airbyte (Open Source)
 
-To use Airbyte Open-Source, you can use on the following options to deploy it on your infrastructure.
+To use Airbyte Open Source, you can use one of the following options to deploy it on your infrastructure.
- [Local Deployment](/deploying-airbyte/local-deployment.md) (recommended when trying out Airbyte) - [On Aws](/deploying-airbyte/on-aws-ec2.md) @@ -33,4 +27,4 @@ To use Airbyte Open-Source, you can use on the following options to deploy it on - [On OCI VM](/deploying-airbyte/on-oci-vm.md) - [On Restack](/deploying-airbyte/on-restack.md) - [On Plural](/deploying-airbyte/on-plural.md) -- [On AWS ECS](/deploying-airbyte/on-aws-ecs.md) (Spoiler alert: it doesn't work) +- [On AWS ECS](/deploying-airbyte/on-aws-ecs.md) (Spoiler alert: it doesn't work) diff --git a/docs/using-airbyte/getting-started/set-up-a-connection.md b/docs/using-airbyte/getting-started/set-up-a-connection.md index e3378d1f5bd4..6cf68ae45fa1 100644 --- a/docs/using-airbyte/getting-started/set-up-a-connection.md +++ b/docs/using-airbyte/getting-started/set-up-a-connection.md @@ -6,12 +6,12 @@ On the left side of your main Airbyte dashboard, select **Connections**. You wil ## Configure the connection -Once you've chosen your source and destination, you'll be able to configure the connection. You can refer to [this page](/using-airbyte/configuring-connections) for more information on each available configuration. For this demo, we'll simply set the **Replication frequency** to a 24 hour interval and leave the other fields at their default values. +Once you've chosen your source and destination, you'll be able to configure the connection. You can refer to [this page](/cloud/managing-airbyte-cloud/configuring-connections.md) for more information on each available configuration. For this demo, we'll simply set the **Replication frequency** to a 24 hour interval and leave the other fields at their default values. ![Connection config](../../.gitbook/assets/set-up-a-connection/getting-started-connection-config.png) :::note -By default, data will sync to the default defined in the destination. To ensure your data is synced to the correct place, see our examples for [Destination Namespace](/using-airbyte/core-concepts/namespaces) +By default, data will sync to the default defined in the destination. To ensure your data is synced to the correct place, see our examples for [Destination Namespace](/using-airbyte/core-concepts/namespaces.md) ::: Next, you can toggle which streams you want to replicate, as well as setting up the desired sync mode for each stream. For more information on the nature of each sync mode supported by Airbyte, see [this page](/using-airbyte/core-concepts/sync-modes). 
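The next hunk hardens `docusaurus/src/scripts/cloudStatus.js` for pages where the `.cloudStatusLink` element is absent. A minimal sketch of the pattern being introduced, optional chaining on a possibly-null DOM query (the fixed `status` value here is illustrative; in the script it comes from the fetched status-page summary):

```js
// `status` would normally come from the status-page summary fetched above;
// a fixed value keeps this sketch self-contained.
const status = "UP";
const el = document.querySelector(".cloudStatusLink");
// Snapshot the classes first so removals don't mutate a live DOMTokenList
// mid-iteration; optional chaining makes a missing element a no-op.
for (const className of [...(el?.classList ?? [])]) {
  if (className.startsWith("status-")) {
    el.classList.remove(className);
  }
}
el?.classList.add(`status-${status.toLowerCase()}`);
```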
diff --git a/docusaurus/src/scripts/cloudStatus.js b/docusaurus/src/scripts/cloudStatus.js index fa1844409227..e3428ac94ed3 100644 --- a/docusaurus/src/scripts/cloudStatus.js +++ b/docusaurus/src/scripts/cloudStatus.js @@ -9,12 +9,12 @@ if (ExecutionEnvironment.canUseDOM) { .then((summary) => { const status = summary.page.status; const el = document.querySelector(".cloudStatusLink"); - el.classList.forEach((className) => { + el?.classList.forEach((className) => { if (className.startsWith("status-")) { el.classList.remove(className); } }); - el.classList.add(`status-${status.toLowerCase()}`) + el?.classList.add(`status-${status.toLowerCase()}`) }); } From cf3745712ca12918d09cd37a922be5898a725c48 Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 16:31:59 +0000 Subject: [PATCH 21/52] Change sidebar and code of conduct --- docs/community/code-of-conduct.md | 43 ++++++++++++++++++++++ docs/community/slack-code-of-conduct.md | 47 ------------------------- docs/readme.md | 4 +++ docusaurus/redirects.yml | 6 ++-- docusaurus/sidebars.js | 15 +++----- docusaurus/src/css/custom.css | 10 +++++- 6 files changed, 63 insertions(+), 62 deletions(-) delete mode 100644 docs/community/slack-code-of-conduct.md diff --git a/docs/community/code-of-conduct.md b/docs/community/code-of-conduct.md index 9eacce28a212..4cb81d4468fc 100644 --- a/docs/community/code-of-conduct.md +++ b/docs/community/code-of-conduct.md @@ -46,3 +46,46 @@ Project maintainers who do not follow or enforce the Code of Conduct in good fai This Code of Conduct is adapted from the [Contributor Covenant](https://www.contributor-covenant.org/), version 1.4, available at [https://www.contributor-covenant.org/version/1/4/code-of-conduct.html](https://www.contributor-covenant.org/version/1/4/code-of-conduct.html) +## Slack Code of Conduct + +Airbyte's Slack community is growing incredibly fast. We're home to over 1500 data professionals and are growing at an awesome pace. We are proud of our community, and have provided these guidelines to support new members in maintaining the wholesome spirit we have developed here. We appreciate your continued commitment to making this a community we are all excited to be a part of. + +### Rule 1: Be respectful. + +Our desire is for everyone to have a positive, fulfilling experience in Airbyte Slack, and we sincerely appreciate your help in making this happen. +All of the guidelines we provide below are important, but there’s a reason respect is the first rule. We take it seriously, and while the occasional breach of etiquette around Slack is forgivable, we cannot condone disrespectful behavior. + +### Rule 2: Use the most relevant channels. + +We deliberately use topic-specific Slack channels so members of the community can opt-in on various types of conversations. Our members take care to post their messages in the most relevant channel, and you’ll often see reminders about the best place to post a message (respectfully written, of course!). If you're looking for help directly from the Community Assistance Team or other Airbyte employees, please stick to posting in the airbyte-help channel, so we know you're asking us specifically! + +### Rule 3: Don’t double-post. + +Please be considerate of our community members’ time. We know your question is important, but please keep in mind that Airbyte Slack is not a customer service platform but a community of volunteers who will help you as they are able around their own work schedule. 
You have access to all the history, so it’s easy to check if your question has already been asked. + +### Rule 4: Check question for clarity and thoughtfulness. + +Airbyte Slack is a community of volunteers. Our members enjoy helping others; they are knowledgeable, gracious, and willing to give their time and expertise for free. Putting some effort into a well-researched and thoughtful post shows consideration for their time and will gain more responses. + +### Rule 5: Keep it public. + +This is a public forum; please do not contact individual members of this community without their express permission, regardless of whether you are trying to recruit someone, sell a product, or solicit help. + +### Rule 6: No soliciting! + +The purpose of the Airbyte Slack community is to provide a forum for data practitioners to discuss their work and share their ideas and learnings. It is not intended as a place to generate leads for vendors or recruiters, and may not be used as such. + +If you’re a vendor, you may advertise your product in #shameless-plugs. Advertising your product anywhere else is strictly against the rules. + +### Rule 7: Don't spam tags, or use @here or @channel. + +Using the @here and @channel keywords in a post will not help, as they are disabled in Slack for everyone excluding admins. Nonetheless, if you use them we will remind you with a link to this rule, to help you better understand the way Airbyte Slack operates. + +Do not tag specific individuals for help on your questions. If someone chooses to respond to your question, they will do so. You will find that our community of volunteers is generally very responsive and amazingly helpful! + +### Rule 8: Use threads for discussion. + +The simplest way to keep conversations on track in Slack is to use threads. The Airbyte Slack community relies heavily on threads, and if you break from this convention, rest assured one of our community members will respectfully inform you quickly! + +_If you see a message or receive a direct message that violates any of these rules, please contact an Airbyte team member and we will take the appropriate moderation action immediately. We have zero tolerance for intentional rule-breaking and hate speech._ + diff --git a/docs/community/slack-code-of-conduct.md b/docs/community/slack-code-of-conduct.md deleted file mode 100644 index c88da4c1adb5..000000000000 --- a/docs/community/slack-code-of-conduct.md +++ /dev/null @@ -1,47 +0,0 @@ ---- -description: Be nice to one another. ---- - -# Slack Code of Conduct - -Airbyte's Slack community is growing incredibly fast. We're home to over 1500 data professionals and are growing at an awesome pace. We are proud of our community, and have provided these guidelines to support new members in maintaining the wholesome spirit we have developed here. We appreciate your continued commitment to making this a community we are all excited to be a part of. - -## Rule 1: Be respectful. - -Our desire is for everyone to have a positive, fulfilling experience in Airbyte Slack, and we sincerely appreciate your help in making this happen. -All of the guidelines we provide below are important, but there’s a reason respect is the first rule. We take it seriously, and while the occasional breach of etiquette around Slack is forgivable, we cannot condone disrespectful behavior. - -## Rule 2: Use the most relevant channels. - -We deliberately use topic-specific Slack channels so members of the community can opt-in on various types of conversations. 
Our members take care to post their messages in the most relevant channel, and you’ll often see reminders about the best place to post a message (respectfully written, of course!). If you're looking for help directly from the Community Assistance Team or other Airbyte employees, please stick to posting in the airbyte-help channel, so we know you're asking us specifically! - -## Rule 3: Don’t double-post. - -Please be considerate of our community members’ time. We know your question is important, but please keep in mind that Airbyte Slack is not a customer service platform but a community of volunteers who will help you as they are able around their own work schedule. You have access to all the history, so it’s easy to check if your question has already been asked. - -## Rule 4: Check question for clarity and thoughtfulness. - -Airbyte Slack is a community of volunteers. Our members enjoy helping others; they are knowledgeable, gracious, and willing to give their time and expertise for free. Putting some effort into a well-researched and thoughtful post shows consideration for their time and will gain more responses. - -## Rule 5: Keep it public. - -This is a public forum; please do not contact individual members of this community without their express permission, regardless of whether you are trying to recruit someone, sell a product, or solicit help. - -## Rule 6: No soliciting! - -The purpose of the Airbyte Slack community is to provide a forum for data practitioners to discuss their work and share their ideas and learnings. It is not intended as a place to generate leads for vendors or recruiters, and may not be used as such. - -If you’re a vendor, you may advertise your product in #shameless-plugs. Advertising your product anywhere else is strictly against the rules. - -## Rule 7: Don't spam tags, or use @here or @channel. - -Using the @here and @channel keywords in a post will not help, as they are disabled in Slack for everyone excluding admins. Nonetheless, if you use them we will remind you with a link to this rule, to help you better understand the way Airbyte Slack operates. - -Do not tag specific individuals for help on your questions. If someone chooses to respond to your question, they will do so. You will find that our community of volunteers is generally very responsive and amazingly helpful! - -## Rule 8: Use threads for discussion. - -The simplest way to keep conversations on track in Slack is to use threads. The Airbyte Slack community relies heavily on threads, and if you break from this convention, rest assured one of our community members will respectfully inform you quickly! - -_If you see a message or receive a direct message that violates any of these rules, please contact an Airbyte team member and we will take the appropriate moderation action immediately. We have zero tolerance for intentional rule-breaking and hate speech._ - diff --git a/docs/readme.md b/docs/readme.md index e8321d457839..708a6a790430 100644 --- a/docs/readme.md +++ b/docs/readme.md @@ -1,3 +1,7 @@ +--- +displayed_sidebar: docs +--- + # Welcome to Airbyte Docs Whether you are an Airbyte user or contributor, we have docs for you! 
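The `docusaurus/redirects.yml` hunks below rely on the file's entry format: each entry maps one or more `from` paths to a single `to` path. A sketch of the shape, with illustrative paths:

```yaml
# Every listed `from` path redirects to the single `to` path.
# Paths need a leading slash to work.
- from:
    - /project-overview/some-old-page
    - /project-overview/another-old-page
  to: /community/some-new-page
```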
diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index e1f892f2b077..b9b03fac35cf 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -24,10 +24,10 @@ - /project-overview/product-support-levels - /project-overview/product-release-stages to: /integrations/connector-support-levels -- from: /project-overview/code-of-conduct +- from: + - /project-overview/code-of-conduct + - /project-overview/slack-code-of-conduct to: /community/code-of-conduct -- from: /project-overview/slack-code-of-conduct - to: /community/slack-code-of-conduct - from: /project-overview/licenses/ to: /developer-guides/licenses/ - from: /project-overview/licenses/license-faq diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index ac86476ad368..b91bb4d3a824 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -394,12 +394,7 @@ const understandingAirbyte = { }; module.exports = { - mySidebar: [ - { - type: "doc", - label: "Start here", - id: "readme", - }, + docs: [ sectionHeader("Airbyte Connectors"), connectorCatalog, buildAConnector, @@ -448,7 +443,7 @@ module.exports = { }, { type: "category", - label: "Managing Connections", + label: "Configuring Connections", items: [ "cloud/managing-airbyte-cloud/configuring-connections", "cloud/managing-airbyte-cloud/manage-schema-changes", @@ -479,16 +474,14 @@ module.exports = { }, { type: "category", - label: "Managing your workspace", + label: "Workspace Management", items: [ "cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace", - // TODO: merge with operator-guides/configure-sync-notifications "cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications", "cloud/managing-airbyte-cloud/manage-credits", "operator-guides/using-custom-connectors", ] }, - "cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits", sectionHeader("Operating Airbyte"), deployAirbyte, { @@ -522,6 +515,7 @@ module.exports = { items: [ "operator-guides/collecting-metrics", "operator-guides/scaling-airbyte", + "cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits", ] }, "operating-airbyte/security", @@ -567,7 +561,6 @@ module.exports = { sectionHeader("Community"), "community/getting-support", "community/code-of-conduct", - "community/slack-code-of-conduct", sectionHeader("Product Updates"), { type: "link", diff --git a/docusaurus/src/css/custom.css b/docusaurus/src/css/custom.css index 56563f0b9d24..ba56dadcae02 100644 --- a/docusaurus/src/css/custom.css +++ b/docusaurus/src/css/custom.css @@ -124,11 +124,19 @@ html[data-theme="dark"] .docusaurus-highlight-code-line { font-weight: 700; font-size: 0.8em; padding: 0.4em 0 0.4em 0.4em; - margin-top: 1.1em; color: var(--docsearch-text-color); background-color: var(--ifm-hover-overlay); } +.navbar__category:not(:first-child) { + margin-top: 1.1em; +} + +/* Hide the breadcrumbs if they have only the house as an entry (i.e. 
on the start page) */
+.breadcrumbs:has(li:first-child:last-child) {
+  display: none;
+}
+
 .cloudStatusLink {
   display: flex;
   gap: 4px;

From 2a4dc1a45063cf8aa1c12918d09cd37a922be5898a725c48 Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 16:32:11 +0000
Subject: [PATCH 22/52] More changes

---
 docs/contributing-to-airbyte/writing-docs.md | 11 +--------
 docs/operator-guides/reset.md                | 25 ++++++++++++--------
 2 files changed, 16 insertions(+), 20 deletions(-)

diff --git a/docs/contributing-to-airbyte/writing-docs.md b/docs/contributing-to-airbyte/writing-docs.md
index ec4abac48481..a0621d10e9e1 100644
--- a/docs/contributing-to-airbyte/writing-docs.md
+++ b/docs/contributing-to-airbyte/writing-docs.md
@@ -276,16 +276,7 @@ Eagle-eyed readers may note that _all_ markdown should support this feature sinc
 
 ### Adding a redirect
 
-To add a redirect, open the [`docusaurus.config.js`](https://github.com/airbytehq/airbyte/blob/master/docusaurus/docusaurus.config.js#L22) file and locate the following commented section:
-
-```js
-// {
-//  from: '/some-lame-path',
-//  to: '/a-much-cooler-uri',
-// },
-```
-
-Copy this section, replace the values, and [test the changes locally](#editing-on-your-local-machine) by going to the path you created a redirect for and verify that the address changes to the new one.
+To add a redirect, open the [`docusaurus/redirects.yml`](https://github.com/airbytehq/airbyte/blob/master/docusaurus/redirects.yml) file and add an entry specifying which old path should redirect to which new path.
 
 :::note
 Your path **needs** a leading slash `/` to work
 :::
 
diff --git a/docs/operator-guides/reset.md b/docs/operator-guides/reset.md
index ff7dc4d06124..bfaf8787d71b 100644
--- a/docs/operator-guides/reset.md
+++ b/docs/operator-guides/reset.md
@@ -1,20 +1,25 @@
 # Resetting Your Data
 
-The reset button gives you a blank slate, of sorts, to perform a fresh new sync. This can be useful if you are just testing Airbyte or don't necessarily require the data replicated to your destination to be saved permanently.
+Resetting your data allows you to drop all previously synced data so that any ensuing sync can start fresh. This is useful if you don't require the data replicated to your destination to be saved permanently or are just testing Airbyte.
 
-![](../.gitbook/assets/reset_your_data_1.png)
+Airbyte allows you to reset all streams in the connection, some, or only a single stream (when the connector supports per-stream operations).
 
-As outlined above, you can click on the `Reset your data` button to give you that clean slate. Just as a heads up, here is what it does and doesn't do:
+A sync will automatically start after a completed reset, which commonly backfills all historical data.
 
-The reset button **DOES**:
+## Performing a Reset
+To perform a reset, select `Reset your data` in the UI on a connection's status or job history tabs. You will also be prompted to reset affected streams if you edit any stream settings to ensure data continues to sync accurately.
 
-* Delete all records in your destination tables
-* Delete all records in your destination file
+Similar to a sync job, a reset can complete as successful, failed, or cancelled. To resolve a failed reset, you should manually drop the tables in the destination so that Airbyte can continue syncing accurately into the destination. 
-The reset button **DOES NOT**: +## Reset behavior +When a reset is successfully completed, all the records are deleted from your destination tables (and files, if using local JSON or local CSV as the destination)) -* Delete the destination tables -* Delete a destination file if using the LocalCSV or LocalJSON Destinations +:::info +If you are using destinations that are on the Destinations v2 framework, only raw tables will be cleared of their data. Final tables will retain all records from the last sync. +::: -Because of this, if you have any orphaned tables or files that are no longer being synced to, they will have to be cleaned up later, as Airbyte will not clean them up for you. +A reset **DOES NOT** delete any destination tables or file itself. The schema is retained but will not contain any rows. +:::tip +If you have any orphaned tables or files that are no longer being synced to, they should be cleaned up separately, as Airbyte will not clean them up for you. This can occur when the `Destination Namespace` or `Stream Prefix` connection configuration is changed for an existing connection. +::: From 2a4dc1a45063cf8adfbcfe8e3792267f301c20ce Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 16:43:36 +0000 Subject: [PATCH 23/52] Rename workspace docs --- docs/operating-airbyte/security.md | 2 +- docs/release_notes/july_2022.md | 2 +- .../workspaces.md} | 0 docusaurus/redirects.yml | 2 ++ docusaurus/sidebars.js | 2 +- 5 files changed, 5 insertions(+), 3 deletions(-) rename docs/{cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md => using-airbyte/workspaces.md} (100%) diff --git a/docs/operating-airbyte/security.md b/docs/operating-airbyte/security.md index b94e04d8b8d4..7f1b10973bd6 100644 --- a/docs/operating-airbyte/security.md +++ b/docs/operating-airbyte/security.md @@ -142,7 +142,7 @@ Airbyte Cloud allows you to log in to the platform using your email and password ### Access Control -Airbyte Cloud supports [user management](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace#add-users-to-your-workspace) but doesn’t support role-based access control (RBAC) yet. +Airbyte Cloud supports [user management](/using-airbyte/workspaces.md#add-users-to-your-workspace) but doesn’t support role-based access control (RBAC) yet. ### Compliance diff --git a/docs/release_notes/july_2022.md b/docs/release_notes/july_2022.md index 0c6cbc35e004..c3a4c8240b2b 100644 --- a/docs/release_notes/july_2022.md +++ b/docs/release_notes/july_2022.md @@ -19,7 +19,7 @@ This page includes new features and improvements to the Airbyte Cloud and Airbyt * Airbyte is currently developing a low-code connector builder, which allows you to easily create new source and destination connectors in your workspace. [#14402](https://github.com/airbytehq/airbyte/pull/14402) [#14317](https://github.com/airbytehq/airbyte/pull/14317) [#14288](https://github.com/airbytehq/airbyte/pull/14288) [#14004](https://github.com/airbytehq/airbyte/pull/14004) -* Added [documentation](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace#single-workspace-vs-multiple-workspaces) about the benefits and considerations of having a single workspace vs. multiple workspaces in Airbyte Cloud. [#14608](https://github.com/airbytehq/airbyte/pull/14608) +* Added [documentation](/using-airbyte/workspaces.md#single-workspace-vs-multiple-workspaces) about the benefits and considerations of having a single workspace vs. 
multiple workspaces in Airbyte Cloud. [#14608](https://github.com/airbytehq/airbyte/pull/14608) ### Improvements * Improved platform security by using Docker images from the latest version of OpenJDK (openjdk:19-slim-bullseye). [#14971](https://github.com/airbytehq/airbyte/pull/14971) diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md b/docs/using-airbyte/workspaces.md similarity index 100% rename from docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace.md rename to docs/using-airbyte/workspaces.md diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index b9b03fac35cf..93c2245e6cf1 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -86,3 +86,5 @@ - /troubleshooting - /operator-guides/contact-support to: /community/getting-support +- from: /cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace + to: /using-airbyte/workspaces \ No newline at end of file diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index b91bb4d3a824..8dfc075206bb 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -476,7 +476,7 @@ module.exports = { type: "category", label: "Workspace Management", items: [ - "cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace", + "using-airbyte/workspaces", "cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications", "cloud/managing-airbyte-cloud/manage-credits", "operator-guides/using-custom-connectors", From 63119e75a0aded2e34d2ae2965c765276855bc64 Mon Sep 17 00:00:00 2001 From: nataliekwong Date: Sun, 26 Nov 2023 16:47:28 +0000 Subject: [PATCH 24/52] Automated Commit - Formatting Changes --- docusaurus/redirects.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index 93c2245e6cf1..89adddf5065a 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -87,4 +87,4 @@ - /operator-guides/contact-support to: /community/getting-support - from: /cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace - to: /using-airbyte/workspaces \ No newline at end of file + to: /using-airbyte/workspaces From 0b4598929ba04fbe99c5941e20b309a98d344499 Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 16:47:30 +0000 Subject: [PATCH 25/52] Update custom connectors --- .../using-custom-connectors.md | 57 +++++++------------ 1 file changed, 19 insertions(+), 38 deletions(-) diff --git a/docs/operator-guides/using-custom-connectors.md b/docs/operator-guides/using-custom-connectors.md index 4516f19ff987..04be26cf889e 100644 --- a/docs/operator-guides/using-custom-connectors.md +++ b/docs/operator-guides/using-custom-connectors.md @@ -1,15 +1,17 @@ # Using custom connectors -If our connector catalog does not fulfill your needs, you can build your own Airbyte connectors. -There are two approaches you can take while jumping on connector development project: -1. You want to build a connector for an **external** source or destination (public API, off-the-shelf DBMS, data warehouses, etc.). In this scenario, your connector development will probably benefit the community. The right way is to open a PR on our repo to add your connector to our catalog. You will then benefit from an Airbyte team review and potential future improvements and maintenance from the community. -2. You want to build a connector for an **internal** source or destination (private API) specific to your organization. 
This connector has no good reason to be exposed to the community.
-
-This guide focuses on the second approach and assumes the following:
-* You followed our other guides and tutorials about connector developments.
-* You finished your connector development, running it locally on an Airbyte development instance.
+
+:::info
+This guide walks through the setup of a Docker-based custom connector. To understand how to use our low-code connector builder, read our guide [here](/connector-development/connector-builder-ui/overview.md).
+:::
+
+If our connector catalog does not fulfill your needs, you can build your own Airbyte connectors! You can either use our [low-code connector builder](/connector-development/connector-builder-ui/overview.md) or upload a Docker-based custom connector.
+
+This page walks through the process to upload a **Docker-based custom connector**. This is an ideal route for connectors with an **internal** use case, such as a private API built specifically for your organization. This guide assumes the following:
+* You followed our other guides and tutorials about [connector development](/connector-development/connector-builder-ui/overview.md).
+* You finished your connector development and have it running locally on an Airbyte development instance.
 * You want to deploy this connector to a production Airbyte instance running on a VM with docker-compose or on a Kubernetes cluster.
 
-If you prefer video tutorials, [we recorded a demo about uploading connectors images to a GCP Artifact Registry](https://www.youtube.com/watch?v=4YF20PODv30&ab_channel=Airbyte).
+If you prefer video tutorials, we recorded a demo on how to upload [connector images to a GCP Artifact Registry](https://www.youtube.com/watch?v=4YF20PODv30&ab_channel=Airbyte).
 
 ## 1. Create a private Docker registry
 Airbyte needs to pull its Docker images from a remote Docker registry to consume a connector.
@@ -70,42 +72,21 @@ If you want Airbyte to pull images from another private Docker registry, you wil
 
 You should run all the above commands from your local/CI environment, where your connector source code is available.
 
-## 4. Use your custom connector in Airbyte
+## 4. Use your custom Docker connector in Airbyte
 At this step, you should have:
 * A private Docker registry hosting your custom connector image.
 * Authenticated your Airbyte instance to your private Docker registry.
 
 You can pull your connector image from your private registry to validate the previous steps. On your Airbyte instance: run `docker pull <image-name>:<tag>` if you are using our `docker-compose` deployment, or start a pod that is using the connector image.
 
-### 1. Click on Settings
-![Step 1 screenshot](https://images.tango.us/public/screenshot_bf5c3e27-19a3-4cc0-bc40-90c80afdbcba?crop=focalpoint&fit=crop&fp-x=0.0211&fp-y=0.9320&fp-z=2.9521&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
-
-
-### 2. Click on Sources (or Destinations)
-![Step 2 screenshot](https://images.tango.us/public/screenshot_d956e987-424d-4f76-ad39-f6d6172f6acc?crop=focalpoint&fit=crop&fp-x=0.0855&fp-y=0.1083&fp-z=2.7473&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
-
-
-### 3. 
Click on + New connector
-![Step 3 screenshot](https://images.tango.us/public/screenshot_52248202-6351-496d-bc8f-892c43cf7cf8?crop=focalpoint&fit=crop&fp-x=0.8912&fp-y=0.0833&fp-z=3.0763&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
-
-
-### 4. Fill the name of your custom connector
-![Step 4 screenshot](https://images.tango.us/public/screenshot_809a22c8-ff38-4b10-8292-bce7364f111c?crop=focalpoint&fit=crop&fp-x=0.4989&fp-y=0.4145&fp-z=1.9188&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
-
-
-### 5. Fill the Docker image name of your custom connector
-![Step 5 screenshot](https://images.tango.us/public/screenshot_ed91d789-9fc7-4758-a6f0-50bf2f04f248?crop=focalpoint&fit=crop&fp-x=0.4989&fp-y=0.4924&fp-z=1.9188&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
-
-
-### 6. Fill the Docker Tag of your custom connector image
-![Step 6 screenshot](https://images.tango.us/public/screenshot_5b6bff70-5703-4dac-b359-95b9ab8f8ce1?crop=focalpoint&fit=crop&fp-x=0.4989&fp-y=0.5703&fp-z=1.9188&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
+1. Click on `Settings` in the left-hand sidebar. Navigate to `Sources` or `Destinations` depending on your connector. Click on `Add a new Docker connector`.
+2. Name your custom connector in `Connector display name`. This is just the display name used for your workspace.
 
-### 7. Fill the URL to your connector documentation
-This is a required field at the moment, but you can fill with any value if you do not have online documentation for your connector.
-This documentation will be linked in the connector setting page.
-![Step 7 screenshot](https://images.tango.us/public/screenshot_007e6465-619f-4553-8d65-9af2f5ad76bc?crop=focalpoint&fit=crop&fp-x=0.4989&fp-y=0.6482&fp-z=1.9188&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
+3. Fill in the `Docker full image name` and `Docker image tag`.
+4. (Optional) Add a link to the connector's documentation in `Connector documentation URL`.
+You can optionally fill this with any value if you do not have online documentation for your connector.
+This documentation will be linked in your connector settings page.
 
-### 8. Click on Add
-![Step 8 screenshot](https://images.tango.us/public/screenshot_c097183f-1687-469f-852d-f66f743e8c10?crop=focalpoint&fit=crop&fp-x=0.5968&fp-y=0.7010&fp-z=3.0725&w=1200&mark-w=0.2&mark-pad=0&mark64=aHR0cHM6Ly9pbWFnZXMudGFuZ28udXMvc3RhdGljL21hZGUtd2l0aC10YW5nby13YXRlcm1hcmsucG5n&ar=4594%3A2234)
+5. `Add` the connector to save the configuration. You can now select your new connector when setting up a new connection!
\ No newline at end of file

From 6f5aabb216b2d0d76c2dc01fa38032288dc1949e Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 17:01:05 +0000
Subject: [PATCH 26/52] Move windows locating docs

---
 docs/integrations/destinations/csv.md                     | 2 +-
 docs/integrations/destinations/duckdb.md                  | 2 +-
 docs/integrations/destinations/local-json.md              | 2 +-
 docs/integrations/destinations/sqlite.md                  | 2 +-
 .../locating-files-local-destination.md                   | 4 ++++
 docs/using-airbyte/getting-started/set-up-a-connection.md | 2 +-
 docusaurus/redirects.yml                                  | 4 +++-
 docusaurus/sidebars.js                                    | 1 -
 8 files changed, 12 insertions(+), 7 deletions(-)
 rename docs/{operator-guides => integrations}/locating-files-local-destination.md (98%)

diff --git a/docs/integrations/destinations/csv.md b/docs/integrations/destinations/csv.md
index 4cc00f440c79..223c618b8f8b 100644
--- a/docs/integrations/destinations/csv.md
+++ b/docs/integrations/destinations/csv.md
@@ -69,7 +69,7 @@ You can also copy the output file to your host machine, the following command wi
 docker cp airbyte-server:/tmp/airbyte_local/{destination_path}/{filename}.csv .
 ```
 
-Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](../../operator-guides/locating-files-local-destination.md) for an alternative approach.
+Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
 
 ## Changelog
 
diff --git a/docs/integrations/destinations/duckdb.md b/docs/integrations/destinations/duckdb.md
index fa87f65038b9..078006e75f54 100644
--- a/docs/integrations/destinations/duckdb.md
+++ b/docs/integrations/destinations/duckdb.md
@@ -98,7 +98,7 @@ You can also copy the output file to your host machine, the following command wi
 docker cp airbyte-server:/tmp/airbyte_local/{destination_path} .
 ```
 
-Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](../../operator-guides/locating-files-local-destination.md) for an alternative approach.
+Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
 
diff --git a/docs/integrations/destinations/local-json.md b/docs/integrations/destinations/local-json.md
index 11870a8d5177..45ddda3fb757 100644
--- a/docs/integrations/destinations/local-json.md
+++ b/docs/integrations/destinations/local-json.md
@@ -69,7 +69,7 @@ You can also copy the output file to your host machine, the following command wi
 docker cp airbyte-server:/tmp/airbyte_local/{destination_path}/{filename}.jsonl .
 ```
 
-Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](../../operator-guides/locating-files-local-destination.md) for an alternative approach.
+Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
 
 ## Changelog
 
diff --git a/docs/integrations/destinations/sqlite.md b/docs/integrations/destinations/sqlite.md
index eb266b61eee8..f5c2a3193780 100644
--- a/docs/integrations/destinations/sqlite.md
+++ b/docs/integrations/destinations/sqlite.md
@@ -68,7 +68,7 @@ You can also copy the output file to your host machine, the following command wi
 docker cp airbyte-server:/tmp/airbyte_local/{destination_path} .
 ```
 
-Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](../../operator-guides/locating-files-local-destination.md) for an alternative approach.
+Note: If you are running Airbyte on Windows with Docker backed by WSL2, you have to use a similar step as above or refer to this [link](/integrations/locating-files-local-destination.md) for an alternative approach.
 
 ## Changelog
 
diff --git a/docs/operator-guides/locating-files-local-destination.md b/docs/integrations/locating-files-local-destination.md
similarity index 98%
rename from docs/operator-guides/locating-files-local-destination.md
rename to docs/integrations/locating-files-local-destination.md
index e514f3a92ebd..d401d7952455 100644
--- a/docs/operator-guides/locating-files-local-destination.md
+++ b/docs/integrations/locating-files-local-destination.md
@@ -1,3 +1,7 @@
+---
+displayed_sidebar: docs
+---
+
 # Windows - Browsing Local File Output
 
 ## Overview
 
diff --git a/docs/using-airbyte/getting-started/set-up-a-connection.md b/docs/using-airbyte/getting-started/set-up-a-connection.md
index 6cf68ae45fa1..7948eeeda06a 100644
--- a/docs/using-airbyte/getting-started/set-up-a-connection.md
+++ b/docs/using-airbyte/getting-started/set-up-a-connection.md
@@ -48,7 +48,7 @@ cat /tmp/airbyte_local/YOUR_PATH/_airbyte_raw_YOUR_STREAM_NAME.jsonl
 You should see a list of JSON objects, each containing a unique `airbyte_ab_id`, an `emitted_at` timestamp, and `airbyte_data` containing the extracted record.
 
 :::tip
-If you are using Airbyte on Windows with WSL2 and Docker, refer to [this guide](/operator-guides/locating-files-local-destination) to locate the replicated folder and file.
+If you are using Airbyte on Windows with WSL2 and Docker, refer to [this guide](/integrations/locating-files-local-destination.md) to locate the replicated folder and file.
 :::
 
 ## What's next?
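For context: expanded with concrete values, the `docker cp` pattern referenced in the hunks above looks like the sketch below, where the `json_data` directory and the `_airbyte_raw_users.jsonl` file name are hypothetical placeholders rather than values taken from this patch.

```shell
# Copy a single replicated file out of the airbyte-server container into the
# current working directory on the host machine.
docker cp airbyte-server:/tmp/airbyte_local/json_data/_airbyte_raw_users.jsonl .

# docker cp also copies directories recursively, so the whole
# {destination_path} folder can be pulled out in one command.
docker cp airbyte-server:/tmp/airbyte_local/json_data .
```
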
diff --git a/docusaurus/redirects.yml b/docusaurus/redirects.yml index 93c2245e6cf1..69047ba00731 100644 --- a/docusaurus/redirects.yml +++ b/docusaurus/redirects.yml @@ -87,4 +87,6 @@ - /operator-guides/contact-support to: /community/getting-support - from: /cloud/managing-airbyte-cloud/manage-airbyte-cloud-workspace - to: /using-airbyte/workspaces \ No newline at end of file + to: /using-airbyte/workspaces +- from: /operator-guides/locating-files-local-destination + to: /integrations/locating-files-local-destination \ No newline at end of file diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index 8dfc075206bb..c7ac46cc335b 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -469,7 +469,6 @@ module.exports = { "cloud/managing-airbyte-cloud/review-connection-status", "cloud/managing-airbyte-cloud/review-sync-history", "operator-guides/browsing-output-logs", - "operator-guides/locating-files-local-destination", ], }, { From a0c70d33829cf8c8dde4d2311800475d49f0faff Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 17:01:22 +0000 Subject: [PATCH 27/52] Add Cloud tag --- docs/cloud/managing-airbyte-cloud/manage-data-residency.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md index 024772fac02d..167a7c1b0d87 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md +++ b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md @@ -1,5 +1,7 @@ # Manage data residency + + In Airbyte Cloud, you can set the default data residency and choose the data residency for individual connections, which can help you comply with data localization requirements. ## Choose your default data residency From d523b6ebbba4f2b2df82e8b613dbfefa0d4e633e Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 17:22:25 +0000 Subject: [PATCH 28/52] Connector support level table --- docs/integrations/connector-support-levels.md | 22 ++++++++++--------- 1 file changed, 12 insertions(+), 10 deletions(-) diff --git a/docs/integrations/connector-support-levels.md b/docs/integrations/connector-support-levels.md index 47e533d90f40..497d415b9ae4 100644 --- a/docs/integrations/connector-support-levels.md +++ b/docs/integrations/connector-support-levels.md @@ -2,16 +2,18 @@ The following table describes the support levels of Airbyte connectors. -| | Certified | Custom | Community | -| --------------------------------- | -------------------------- | -------------------------- | ---------------------- | -| **Availability** | Available to all users | Available to all users | Available to all users | -| **Support: Cloud** | Supported* | Supported** | No Support | -| **Support: Powered by Airbyte** | Supported* | Supported** | No Support | -| **Support: Self-Managed Enterprise** | Supported* | Supported** | No Support | -| **Support: Community (OSS)** | Slack Support only | Slack Support only | No Support | -| **Who builds them?** | Either the community or the Airbyte team. | Anyone can build custom connectors. We recommend using our [Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [Low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview). | Typically they are built by the community. The Airbyte team may upgrade them to Certified at any time. 
| -| **Who maintains them?** | The Airbyte team | Users | Users | -| **Production Readiness** | Guaranteed by Airbyte | Not guaranteed | Not guaranteed | +Certified<> + +| | Certified | Community | Custom | +| ------------------------------------ | ----------------------------------------- | ------------------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | +| **Availability** | Available to all users | Available to all users | Available to all users | +| **Who builds them?** | Either the community or the Airbyte team. | Typically they are built by the community. The Airbyte team may upgrade them to Certified at any time. | Anyone can build custom connectors. We recommend using our [Connector Builder](https://docs.airbyte.com/connector-development/connector-builder-ui/overview) or [Low-code CDK](https://docs.airbyte.com/connector-development/config-based/low-code-cdk-overview). | +| **Who maintains them?** | The Airbyte team | Users | Users | +| **Production Readiness** | Guaranteed by Airbyte | Not guaranteed | Not guaranteed | +| **Support: Cloud** | Supported* | No Support | Supported** | +| **Support: Powered by Airbyte** | Supported* | No Support | Supported** | +| **Support: Self-Managed Enterprise** | Supported* | No Support | Supported** | +| **Support: Community (OSS)** | Slack Support only | No Support | Slack Support only | \*For Certified connectors, Official Support SLAs are only available to customers with Premium Support included in their contract. Otherwise, please use our support portal and we will address your issues as soon as possible. From f350af13c7f91cc6dd87d87179a9dbb0957cd4f0 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 18:28:22 +0100 Subject: [PATCH 29/52] Remove test tag --- docs/integrations/connector-support-levels.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/docs/integrations/connector-support-levels.md b/docs/integrations/connector-support-levels.md index 497d415b9ae4..e684c1292b7c 100644 --- a/docs/integrations/connector-support-levels.md +++ b/docs/integrations/connector-support-levels.md @@ -2,8 +2,6 @@ The following table describes the support levels of Airbyte connectors. 
-Certified<>
-
 | | Certified | Community | Custom |
 | ------------------------------------ | ----------------------------------------- | ------------------------------------------------------ | ------------------------------------------------------ |
 | **Availability** | Available to all users | Available to all users | Available to all users |

From 87d0e546c0613a0a0134fcd6207a438fa45e0369 Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 17:34:06 +0000
Subject: [PATCH 30/52] Edit logs text

---
 docs/.gitbook/assets/explore_logs.png        | Bin 105640 -> 0 bytes
 docs/operator-guides/browsing-output-logs.md | 54 +++++++++++++------
 2 files changed, 37 insertions(+), 17 deletions(-)
 delete mode 100644 docs/.gitbook/assets/explore_logs.png

diff --git a/docs/.gitbook/assets/explore_logs.png b/docs/.gitbook/assets/explore_logs.png
deleted file mode 100644
index 98d159e8af7a0a6abedf691fb93b7169df4ed89d..0000000000000000000000000000000000000000
GIT binary patch
[base85-encoded binary patch data for the deleted explore_logs.png (literal 0 / literal 105640 blocks) omitted]
zDCqqtK$%jVd7P`ni0Xd^01ee0D(0e#fjh%dT_+wT$oq*kl=SyU5kVfK1clM=amfQ_ zI0EVoKz}xSldr}wi6Dt3^Ia!d{=o+kBn4zyC@A(dHs}AYSpXC-s)+AT+uEoFfUki~;}6?a%w!S+%)p(<%&lEwwjnR-_^!7#xI+-MnDSH47xA$mn9oH2{Q2No8xtQ-F3Wuyt5l?-eAKW*Z^G5g6Qg_S(zHYy0ce&X;^fpo zddnVZTA^I16s3?OY1H_Y=0&rK7yyFn8~EYP2Z>X>;H8I#k52~H3#Aeb;vx$j^^DY1 zT3NowRF9L6FNM5Cc!BVk(Y*xj!lGXrmuMWzQNZf{Ok{JiDquR1+ahHUg3Ul*qM@BF zJV4p-a5kt_n>Hw@q!jCMv&IdFfRMm-+$@9b`0FRJws#V<(Pmkj*L4WaV42%UAj~zH zuOFz~W^KoA#S__B<57x1CT71BFLsx;^hLRSn^AzDq{b!-7zR=~>~k#UD#!C=8G=JY zW4c2V`fg6qFDDj@%8t^o$rrx>#M?*VE{>KIM^c01GQ6Keyl&RS)a#DelDMut8xELN z$6{x!Qn_6f$UmeOmyC*HD3@xfx880`QE4~Essd2w<8?1L+A@T8>t{>jursT8Os>6IA2#DYo)5yrN`z z)rm?p>!L$2Y2ed!5Z|)%yImd7FVr1Jl1arEQuKODm1xvt>$Vl;xSks~zI~|L)1hs9 zYWzU0Tu7?j=%8|`ySF%k3zpX+y36HMLe8hLEW5i4(zp8Z%!D5 z(hV}HCyM8X^9ZM+!A?i+1vedq$T}$rF3hVYB1FS#J+&Qr`{P|hIm~sb(-?XI`bKoxCt>32;{&c%X?eh)z#G(Bg;#k z;dvFfcW(UnIQd=7d1~3`DcRX>Q~CrgU#F!}I-Q#d8shw3yV6hsm(?O@D3wFAX_EKi zCp5>?Rc>zXK(p)l)bai3Kv7P}-BZ1Ky7zJ;tjks$_7rqN&7E_b?MBas(HQGCZEC56 zCs~_eWPGka^jA!q!Q~5*LG-4X4&7Px=P}GG)i;s4Nr5^KAcBPy)RWiuhN-m z^@MHt(6F<+MX&9~E{)rFzLSB6f4R>#wos^A_C2r0Y|6P<>$YWiRL9x>mCRe^!bu14 zk{x97exh*WdX9pCbNRskmG0*ZPU(t^hiRG&nEF_*3|EWg{Y4mYH_d1#x_7^Eg$f=^3)pr55*V4p!^sgZ694y^cJqB~c_t^mUu(QpObo&$|oz z$)cIrq&AJ3`EY~Y@O`(*PTU=5;r;2u)D_53EYB@lo?k24&Pzpuz0N_mrMR39^(0hO z5;GH(M&>P^C#(7+PTvkCX?G7F#Grh|S0`loCU)%0bUfc#wlq}-?*(X8)Yj;>a zgEq36Ar^RTrUHq63bwdiI@BWKJxwawTed!&IXGHAmC&}8eJYA}MGK>76_4P#%5O~v z&kjB=JkCo- z0*^szu`8}SGU{zl*sj}2716m+tnG!Hraa|No~PX)OH-ZN15KkPjm&-Z8nYUYi?Lq% zlHef9!gQjts9r*E;j-uLl?t$!)v2V>UxdF7%dKq`}fc2g;W3e`{g2V8xi*>B`KD$09ne_ChShs#}>tno<;2E2Cy@>#buP(y1=e&l?BI2BHWbN$i z+=}31{S{1?^L{bDCr z{tg^C_W|OLz2W$ZrIxZB)?G;WoCmf%+wR$ldEw*Lrs!J)N}69Yt9zu9Epr*UzjgS7 zxETe#BM5l5*Cj2^Y$c(5g(0l6*sq6S<&6*FKv2kUZ4g1Avi1)`sxTA zTYtOIvTxoChemG98Got0G_H+*)2bhW!zxZtlSn9UOk^a=A?VAM<#Mw~_*;&wfE8+O z+E&ezpPz%lo1!}aYI^Q17H$}BdjfIK^(s2Ly(oc;UNynS`O`h2cyG8Wb=!JOGS(}z zSN)f1?M<0L3#)5qCV^V=-unB7>cew^f3j-PI*+}1M%fBia9 zM7((-fMW|jG5Xq5_~aFA6AOXYiNgt$)@XfZyP|Ro*bHcmg7`a2>Xx$!LOSEq&{cG4kXVUtzc_lDEOb%s9zBs9KhIgAC~`G&N}7`Gkd zZo3b7SQD4s^1fTKHhxAUE3RFw0icMKOD5GOa;HY~e?#?|Rd05Yk00(y)N^+^O7(J= zP2(&Xq*kTh@bq;jtJQ>NsU4l_B*}A=R-f|}OEjPL@?IklUf(Oar!+rz_xsYa;(6Ud zf0b;0m$c47=KB`zEf;mATU?*Z@J?=e80}w)S`R6k(G%vXWHw!#`O3R3bFWnn;e7yN z_o_K5cZDc3H2Edei?QEKhOoMtp)o%`n6ow|1|UdY?oCw#8E3Y;f=4CNI2|t>+l+_b zcy0t72ZIfY5Q@yv`6%|MQ%2V_4A<+W>#483_w-pnM}9Say-l)A%USRK=rTtLtokN8 zT`QQNii8Aho*RN&X&Bj?+Z|np6{4$?H2(lgY4D;g1=!sk2AaT*PnT%cuwcgH0JJOe z5a{!>u~8#ot`XPP^l{I?7U7hcia}?+?ENaxH-A(6C4Yn5qsx-G97owV1QR1bIX= zds?gLIt8Dd_GWw%xHzoS-IC*E1HXV9ihPs84%iGIV|(8RT^AF=>2OvR^hMb zNjas@QG1O?MLn-AP!=7CB+0bYQ8LfK%0DKb?RN2t9FI+71LbQD_E_nv2DEcridCaB zf#<~<8Cy?rJnS77;c#<@l2C5pKrd9?#V@xY^b)4 zoiE_MFIdCz-hcW=BKpJfDujl!pDdwzY$gw%kRWUQjgVn|ttErvjJN*dJhFRw>R!dp zu-VgRyw8QOWHgg%n`}lqBWZhZPqsYtjgI@FsnZNYope~&%;NQB{rmpO#q%APJJ^tF z0X;nE?4Ex7elD6NuQSWvV#cx$ENsZ~tv#%VPHe_&T8*>OxO{vq0H{L)2(`}=l5)Gc z9@Zq(EWoE%u9D}0Eum!UkmKhn&x49Dj=XB2t4M5~cS;Az25uK)Om@dD&imJ-o=>+4 zoEFqXuOnDD9$p5C&W8ce({k9T~NMI1d1tDP=sREp^d@T(L z^Q6Esx9v~uhB}hh@w_BERB8Lz?R*DG4Z!tln?)}cAwSg*BJMCX1`yFdh@f;p0esId z>_KI%#LV8AEZyAG`YB6}BSGK&Dy;&EDlLuW&Kr@ACxXJm6|4wMsjr^Zdpnx3;zJ)# z9AI=Ea--pSO!lA`5)Zn@l2`M2z9-mNRy!4+nUNK^WK_ z7{D%?|48K1YEtKM6 zb$+SB5P_Lf<+eFodDf6WmYXO$OMJJ3kAGN6w9wV~7y}_JBA~8zAgp)mi%D5WRmCJ% zPP&yxgYQ{y=RTR3@wbm+wHbu+=1uOB(PN2X3ah2Uk$c_+&ZqhZ&;uVjTs^vvV;>=4 ztU=8gr{WsXscRoP9L_CI{9xtjZJ7@cd>@X^ev${+`_)1KuRG2v+kjp zVwCmSSMnEoLqm41-rD@I8YjF+@#4R^Dt!eYrZMi-QvE)$1W4&i9?=FZE($|;8xhid8kqWPw{&Z#$!$w7=j`EhI%ICeU z?J1|Gp)0Azj`X&ytnau5qi?VU%tM31tX3kKmBlKgO|Y 
z55?tl5M{aGWE6y@)#8SUDcKV2wz5feyY|I#ywUO9rC0wO`op=l=Sa7m2_2TFJ-dWU z7l|w9mB*}Z@iHBJ?b@WX6JBakU}|}}oatCrRLG60c3J1sVEiLx8kcUNPIolZc!fKz z8^t9uRIyaT;KjOio28QcD+7l)mx?;^pb#$&(U!#~{amQg*r+KC`Gw{0em`{r8VYUs<#aV|EZ&?#XPL1>3Z=C?y?c*3n>M~;ORzQHBN=)v|x_4E=?p1 zVDRbj%hrtUv(oJVwUjKA`!wwiakezBx{=vpeW@FRvUUqDR>k4Mvm(RhlMJVm#g07R z!Is;gAC!P00M?IeQxt=^tqg76@D2$1Y5rHy-AEIY2>Y6%;shxNm2#5K1S#=yCNFVx zB1&Oy7GE$Nl97fCX6nUN3Z<@qaRZp|R+4t}WoHX@<`?a}p@9TOCbJX`G&*{_IWW|j zj6xRZv{Zt9LF!ira~YO(tMe+~MKKQro%fWKYF$T-492oVs^jt{<0LEf{h0$dnYX&9 zhFjp9h<>ncRBop;6dCElW4g~R|1wwcWZWOclMlcn2}&g;3y4M&iralB{nk{!Fyo5; zbN&{J<7t3o;e2``O4hmJ+oTC!MKs{s7y0g}Rph+LcNGDOPUF1_a(j%?{&dm=m36U>%hq#&yg}t7^YbOyIKmgpk#L%X`2@)@&ZnIoow|- zkLiQX&dz3v2Wn?xW?a!NAn)DBXPT!CuDBQ$cWmqf(0R#lvtR}LCNQ!BFuJ0Iejwz) zznUH^8BufsD=>_h&x}V$g~{*>OOb?t4;kUpN9I?JVDAQ<(Lk(S=?_oMg<3kJsY2Z? zKLqM}3wvXQcQ_^(sPal*VJfdg+CTflMtmBK>Q9K@j$|?Snh3Qq4-F2ExjNL8=#TD} ziGQ0%2~F>O72wp76{juO4pV5UM*w z1V}5Co=q;ZH<&5ZyLsGJXHawin8-`42-iz~EU7p{E(nypd0>8jwqd4qmI3p*wT$JY_&M_>Igahqp` zJ*M!dat1o~+@<{9Wh>fykgL_lQ<1|RIdZGW=;-lD9HNwl9Sc{bAfC}Oh&N1_lvD5G4Q$0m*dD>kw>}@$jklO+8tIHa zCM339AY|Fny=_(q?aX58j5r2ZS~n7IS44`N5F8 z(r6+oTzVs%1q0ceqv4@IBv;8;%F<<;NjVYQg#)y=Yi-X+;!BMz7gD;1Mje4j3wtwl z#qu~j`g@)1mj=BbfXfM+RwaUZurbJ60+G1pF)gnZ6Fj-5Gc67xU{A)zD$b7EnkE}D zSiE*YRiV9#0FH8mFGEi6fApz^NfTHDL9u~%o|qOL%XJUs7!49o{sV1!+rhR$_3$7y zpN|G~#10)WS?fOyY|_XAh(gMd`hg&l$Z{k>>Y;i&&L5ZkA?UanL*sa5p7axWBdXZx z`w-0nGmg;-CW{Gk^aN!K6`pO?X0BDFTq^F(vH~VuBH^qF@>%6_nKqv1KqsNkPEj+) zyxAMi?&G31hn|;40|DCoF$b17+Jk8e&J(rVHAKMWyw!ypJBuTDDt(%hb%D27Lz?W^~IOcD=U-FgMQz8JyHjbn{iq;^Dv9C?Up( zj~B_Nhbg2BRnS9XN_z!?>K8g35#c~Xldz}Cm4&0)bX4D3O^GAL<&M=>+$+52yESXT#A{D#SVu&!a0% zn^{!(8eEF>rVGM+aJTxG8Fo9Cibk`DOPh1HZ0^GO^DvKF{t@ZXAJR14 zWBJOG(%sJJZqGsqx7wc!DxXiVanp&ml<2KOh@TSAzWDHIovZ42MO-x8DY)K@={HX7 zdoNJSag9T?L z0&j=-MZ1mjN{=~f+yOB>u_Y!M>*uW<@68(AjQ(ir8O}jp5rE@s{D-hb`x&W9%YB`l z+En8c0N}@VPm23)qs)`c^Fh4(v6sZ%rK_~Mrg7hqHtZ!iw_}@Wm+)m5xk|2dItB7F zU*|#AjL{5IJ+t4$s(D6q0mG2uF-Q)UB?VB_*^Hbp7$8w6u|>UQG?rklWml%ws-u(q z6ug>yU7Mo&lNQ_iJX9(j_Sl<6m0QbcTd)-FW`kvAdtLNw7M-oH@R%+C7-heJbfg3( z!y|7r<^IT3J^s;J`*SP(gG1b6aq)-mRvF~F-j^}U=}WiZpbC{y+>=@<-c7aiTRySX zo-bbz$=BusU=W}XXQsxgtXlf6AG#Xu?$?s?mZKE&it{p!+lJ=#Yr&4UC;s+__%-MC zCwE=jCzUlf_qAK(OY0pmLi&m00;a>^qdHXAA?vx=e!|^HzK2!s`91S1wuRxd{+O}5 zIx1e?rLPEWF*hxH-j^H9`ipNJ%k&$jl0DhUd>=qQwyY?m>EuDg)(D`<78~C@x)bh< z7HM7etVK<9n=Bw3e!T!ZoIE^bj8b!bKF)DGk>9Lu4Y1T6MT%U%sREQ_E2`Gx;r4?o zY+OrN=B_q%hyM7dV5<@okZnc0{x<1+xo-TFdeG!zp}S4IZUni|Rw6G)`#$%qXJ0p; zVNQY`X2oj|Vg8ML(+5n)7#8(q&wIQbr_Hp}0l^mQd#MyST0)x^0nQRDukAtxw}XgZ zbeJ@8W9$9&eJ4UT8z*oBZYnBfl&UO?Hp9ucIcJDVC4=qAgFareluTDK4oBmn5gBF+ z+pIfIY5o4*Hx-mR+=4rcH#Kq#WDyN6Sx&oYD zUumxNwVS6a8H5i=PMKl8f4*K;KNz|yXguYy+sw94de=&_Iy%Gs(09rcm24KM7|7_V z0vFjMV@5q_Z|aF|(q=_`B*u4A$lIGzmrrtC01L>V(~8@=jo8mqy=FRlbjG_7dIc=+ zfX;}H*eS7r>TPhD=-b%}a%2D5B`EKib!|ux3+#_q2RK1RsV%t~u zYl5l!2=60L*Mlhak>y73y51i0zAyF(yA?fs7u&@mcAY>`K4MIbTwamz01j$ z97iG22(uuJblJ=Z>CjzXOM$VF_%99XX} z1p&U$0Gz#4#v@P6f%5>Vp8?)=hjEv9%Te*z)1CBor(|C*kxzs0+}*f&C6D{Ob%<9P zW|~{xm3RHfnyuAV^gLSg{jp!>MeqIe*owsL*R1N)Ea&EcPbU<66COreSQflS;?7<+ zL9fj)-SOr`j;QMVxnxe4ssn>u*Xyy_X`;zMi4T_-{NmnacO`Xb;oo>&FHA1e+Bw(U zOPEAOV!cPTL9v`Wqu+K{Q-%%z;yPZlsMmI?LzsxS%K}&H4t7nfqY)Si8E%&DE5Kgl zhrBk22rFW*d*0^**oNC_UN?@s6;Z<<*44Q83+3{Nk6e!fMb{?xO_>O)&x|Uc<^~e7 z9aBELVG{UU9I%S=;dHQLfL*KBP~}(x?`O4#&zcbPtuLu*1Q#3N6O+msUPZKX#huZ> z6jQ<@4ma$g2%~V}D1r~(090B*o4l$BDyS2X$aux~hwAr(WaOSD`&#`9-4_W(o|e1g z1IWh~ds9stump-)jWV9}o#P=n zOJ8~7#9hE8ilTSX>4&+TB6ptk%3DWF!wz6u2PxTEE`c4hnjDc?7hCGz$c< zwUJ#IJ1R{hkP|?6lSH1PSPU|bBjV(Hwy0_C7bXzyGhfP$@^~1tbNw`gvSSq&>o^Pq 
z6(3ALeq4H{OL>kl9^Pj^w6ebS4r@mW=lsNw+E74bIzW}hblYUqz3Hd?`XY~( zzn3&F)75d^bH0n%Ei-4^?+f`90l zssvDogM>b0emh+A5)u-+l;ME)Ekf!+ExC8BayWFVcyFPkX!CuO*NeF5(ct6y#aA3q z&zwI@Y^BJ!zxj{d38qSI&)pOy!C+Fg>Po4indaOQ>R$$mMy`JM=@Kj6*I(=;@QS>D z+MEOH1i~B#=(i@3Y;HX_n&xXz;;Y4anlqufnwiWUa=StGPy`{=RcVmz}Zn9F)&}d%rE@CYS!J? znbydrNf%x%@8jm@`gR!BssQ_`ixrDH-8|zo9&z%s?{q+rpZQ{Ts0K@9Jl+0o_hM~l zYzGw_2Ee!2%YSJhx^AThMLX=?aa*TWy{BF|?6A|?dV{5oT4hB^i zIU24S;?&<22olZ~8jwaIBcsU2!=ia=j-$DLUekg0-ZEuarW@rsPk*hzrieoraE-hs z@h+=sEff=ujiIfiGsbw3G(+3cX5oHi+_Ca{bw{Bz^)ehEAaU}(qY=uPGpwXTh5$`# zd9{bt@RLc4hkVhUq=RJ??!eY4X*DV&My1qjI9?0&Ucr0eS6LU@MLoy@fDxBl#fGg~ zUS^|t?MC1OkSBN1V{t-#sYum*9I090jg|k69<4|yIf-GmoxS-&X|Vg)yyMZ4ds*BV za-L>j)`yHJR_vv$d+T)Zj}~ZFv`q8>I9b#P!iZ=r6Km>s7(kLx`^ZpV4i;_sH6tSl7m`_vNUE0I)I#wAcpdzQ-BNj+p;Z~rIPZJ~mTU-_ z;by#?;F?cJ@pme1wMD${FZU(5DvSIV#M=qsQunP%?f`7Z&rXDPpC>;%`$*gj`}OvS zoo7sKT$e9DdGW?XD56nPLWv4vR5LQG&;ap+@PMRY?tVptv=ZWd(5IX9pUV*+89h7f zymEXTe7Y6gFJEO!;~J3%MUK@E;o1gJb)nFT0=rhz-%>Z@@AcCYBk_VV=J@ia5NjoD zVmcDj?+M=)Ki(fJjpat3d{hjI*hzO)ZE7HjM(RP=FNWwr-i)O#SOuaer#w|?&FTH{ zepDV6LG2k|^gM>x65|nx2=?Ln$>K?NRu<^BIrKy%zJA?PI@%_7~vWXyED(vz>WlMv+P|JXz(|- zB@ArtOx6J&wvtCn6L2&!P}POSeEkDqv);_$X#FzF7nl=oapx{Bb4*no8?)u)7Y{~@ z8u_?1`i%G4Ii|%O$$XHpYfOk*ymMfyjpN*e1zJ@>t7#S%d=R#(Dby|^+0xHQosUv( z(2|9El3vyF#v7~idgS^=BpSqfxAXas)9?H4I|6nhcGNCxEw|I3&}~c7q^q)kuy2&V zRuS}|)}RMb7uQiZJ^J{~LtKj|j-03?BtE_CLBR|l62z-)+h;4esG@pt+-xvcQ_~25 zHy(kTSr|E|AC$vURg{9H`Fn9?wzrAx2temh#u>B7tZN9?_u{U#Dg@P zSD^hTC$?UVSd<5?=EMfssOEg+GgY$Aw-YEPG9bX+8;Y5Dy*S*%%cBsAP@l$gb%RK~ zK8m`ueU{_C|8wqgzX(#dHsBh!V^6tmo2Z2~&q*)!)vMULwoN1|)xv~u9Ja%5_sN*x z`C5e%^)Ng?K6YBiBhz1(+J;ZfTOXXiI~}I%67rmsC>li877wRgSKnJ{jZi>6*$tb_ zXanxlnhgt!AB|#^npWT7&l2(SN?>`gE;lTMGOUo$#O8N2NuwBUb_G)=F>60^rpg@c z@X-!R#mmfW+@s7L#WsfL7D3Q)kiF&WQtHbCVTfN(w?oVqw&qF&RuG_5c!pN@g zqk|11uD(^?;*QNQ2wGggjbyOyM)-L8dAxGLHWlpO|>_dZEf3 zcJYlgfMX(WXyD2a9~4CFaT{{1JFUW84J$yA2p`V>P5T_|3%k4XQ)aIUFejLafm5{} z-SB$a;H-;4PUSKw2y?&UnQOgAzqe8!>14t36=If}t0Ad+&;n;t@vmxm;mJHNmT^@P zC~AGbR}!~&;hQa!aVHpw@o7WxX-cl0O~tP=h({>&?D2z9&TtiotGF^OT*K!O5%(G;~1dWU!PIx_a9wMp`1f+^FzN$Z$PCKgz zulz-ISPVeD8ZE_7k013QU9|>`Y}Jijka1P`o4+q>YsvNaAVJgy6T~6cf0msY&l;&Zc?e)oVMLMU!TtewWE#igxTb ztm8~2fyCTwJ&(k(&*uz_yL_-;=!pLgV*2p&H;(#p4b8v}U|?Wk(shFdsZkYYUD=83 z^r!>K`sjmFaVK9x(RTG}GFY$>*!A~Ipy4?ys8dN6K89?E=JYL-zvTKG%7DWc(-!rC z|A}ZW0s+O8KX9Uza=g~p*Of5xaH!O?{6ZNA=&*oU?<(@DBfJd_vn$^Qrp?4>EuQIn zp;3MU#}g3lfaP9)#aP+B;cmQshDFSFq`mq<1_B9wrJ<%DW9n5(+{@;k`uzAIjW@ez z7Gr8l9%T!T#2EycfwszWtk_JO`WF{|$^87`RLo+p*(8FHUEzEZajWNWI8EWdf{HuPzCKtY>=0Keq`+hs7CB1t z?RxlnAHCNiNI0QJ<{7 zSN>K1A^7h?hZq=ja}&O z`i-)(W0m%I7Wq%W)$pwbBLsh-S!RK97;}0y@xQZ3bOWx2Sz$7X_+v~KV62S=BfHe^ zETVV-JU}6Wf`X(hEOa-YDtrGJ8l+hJeiqP=l9W^_IvS0PlaqDTr@Z5jn>vJu=;;N( z&-u~WNk&GdyuOa+|ApLL@()rrfUNHu9uy(qOlDGQ>X7LyA1=CCz2AfbM1nf8@Hakf z%mLo$QH2BrDcKX-t+wmKe_v(Ko0a|lZ$p-&~ae(vI_+RTMB0IpxqBw-e3ZrU7$H(Ue z%4=yAS7L}rNlS-S5@0I(=#gG3-@0$2`B4(eL;QT*%07HGw_lUaq7&FSt8CfQK#SB6g z-3@64dpHf_m(FdIe%B9h%u1wPEsRlwdBR3l`4hv8l5y^pW$GVbEh1k(($$#AMVL9_ zX_?=rksqpom|s2?Ph;p0RkB$ID=OzP5*63qM(Zy%iNphAtQs=AoBvRo9enVjJU@!8 z6#Ap#{{{jm=_y05!}~Vu7o~~GvO0t7UVrdF8i@@>OicWdcb4#6oEKdLg%1p!x{LWQ z@+!B#@#E9~Xww5Z3KkZt z1ap1cKf6b=L7|8;F`X(jyTGleg@7FrstwiEO7L0D_#{Unzt7~V5LC}6`|TmzQ@0C_ z`oRtb3_`-PhjYPC#Aj>1hGTy`+LZ!AAA^8k&r?2toH$=lK_LrGILg(U`1*fPRfOnZ zKlhLOEMPn+2$T&HfxoRsp#2lJ2RQpc#9RrMjl7vzNcg%Bit|gN|2b%5 z0IcKje*u%e-#|78`F%q(BCOW%g;ubRCNxUrjn8{Si~6e#(Of#qLuoSxPV|<*;0R~- zksOF?PelcV&CPyvG)za0WH#$xFIjK@nR+=rz~|Sr<%R0Dkm&7Sukzi6liyz!6LYI_ zZ*1;s$@1;oGI1r;{`5T3`em#8w;lBLA!a;E@-d~x+k=Mc4G*T(T2=3O=@LuWz|CX4 
zGmjOLWlz@k_#cgc@`G5+O!Q0f$=mVCyY)*ke!Ado3x^Sre|jrBE%>(@%C!UZsPy2i zpwiQ!)3*_tB$vzxSFGM9BO5h%h7Yq7Hcayw&F+$5kTQ%JFKM^0FC4gzu?b-tF2& zZG8j1hMiI#r111)p7&&NXHrLdW6{i}w$%_eIr-if69{*%p0h^MIM@N*yBRM3SUC`_ z|L9Ef9BF>X3$~hA@rgHExV83Rw9vm#Za&w*$s!77mrri4Vo0HGL^PFa5f_fFi3o~F zj5&ir`KkgT&MS<&kJBn840_HcU7Y|BbqCp1RUgxUfKPcRkb^q zWMkgxtZ>U?;C&VVLjy}6IAzVFs*TRQ=>rd!+=a_lTK2!SAC3f31tox^&XzTERXah0 zyjOwRT*zko7-w$Dk=HXdDO+tzU$mSGd|_9*TM-h5hRN%=hHD{cfKL2K&PK`N=?|I8$bI4UWd0%)AW|C&)mwsz z>JLrm-e$=>pvb`Dz$dFTtPoFRHVVriRgiC3Wpd5%fz2zrrnz(qPGLQq7Kx`)6dZ`7 zDWLx$hwJ32dTV+peY=)7GleaV>f4>KJeZPWRE3wOf_LB`vqEukr~VGR|8zs%jo=o` z`@Bktc0r6&%PnEDh(#^k^G-X#*!%o)RH5mP-&D;yzuMBiDVAA8x)qWb( zg1-?#P=AccgPa-YH0rj2u+zGc?kRh{7Cl`j(K0`?>#15Q7g04ruDDVq+d`61+oD*v z8@R)F78FP2ulSc00g6Jclwhb|@kQY>1V22b^kJ&zD*9Z0f6R%m$<}lRzXFZdVN#Qx z^G!Vd{apa#EqQ%1bwLht_ZW1^t#n0?lJRJ}a|;7(ECG(TX$OBG zk_dx9#g1f>K@Pjr%$2gdwZ2AFaTX`CDxcG{q`{4H$q;}^eAQrP^ zty0AUzJpg$PtZ!ko~I^mG6>(8&MR|lw$WPsVZw(jFNn#qm$kwI zVwaWHU0nEl{%U~<3%(cYR==W}LJ=w4`g_l==1;Fwd-AnrF)ryJu2a=X>eM&8uEZg3 zS7vzs7j3dD5jDE$VSc#b9$nRpZaC^QY zdCL2rUxMo`3_UVDS6zrNX+&lcXarXB4bQ|ir7@UE z#YJe+edY=(_Z;KG>l?*dlT*d4?#MKN{FoM%|Br02OT?%OiMU!41l zh~02`b0EW#n=(?%j4butBtvsnbp*YUq;K!wp4zVcs?arNojHV)Ylc}HSYAd5) z*?J8KFj=}f)=w4zab@v#&_h|e>RW7=w}aJ!Fhyi@En=_&m`5!%kI-ln>X2Tq>Jk@e{@b=@M{IxWeW);7QcTSrM3iI8{KqJ2XaDFK$?4Q<|sZt*F}M;J&%J)Son+7#eq4m2=2 zZhYPNGI{2Lx5i90{+6&Tc4fvo#{5+(4_<0c&M1F9_j}`@R1S28IGT&8EsvH@Utjal z6p!0e+ZPn#6*%m!kGF_vnoc)|C+lRj{!{>LA(+Jnsz^CckPciHO6a^*2tDgRrm{Ag~fT9QZV-<}jkv|nR`dybx$ z_VbeJpNuu?pcP7z$#_KzGB$B8-#uM|qS#hyXqgsjLnce15^B%ImI!#!(`BS4YQRRZ8aY?Kgem zOnO46Z5JAZ)OW2^=w0-2R!Rrn24&)cVn*RpOU!r`(n!>Ewg#y0Cf-DyZB5yxlPTnd zSAIN1!?h`4$wgA3&F=hCvVlA}t?R83FSi-U&zUolv0f;HW3SRsNp|tPWPG@T48}EO zqVCMU0{VdiI<(J6$qWc>iQm#2FQx>cHf%VZn@6df!dN~vRAS^%nKn8EH~J~RVXl_? 
zOzyC^C|TwoS0Es?f7&m=@(R|p#Lib}Z~pKjPd6VhLPm3EAV6aGM_0ob>qJh`9l2bA z%PG)^S!pGVTHiv}xS$P96?2N61iz)UTSno~?}^!RpqDcdFdwJvhqoGzzn3DG15TOS zr^=e>Knnn1^SU<{D$;-xlPw$cjTl1z{?ZbZ*MMYJ)((rZ~rHwk|uNy8n_DoUl!m&_OVe#i(vLt+{91VkoLw zdeIW6Oq_6gc)z{VY@0MVaCxA$0JTpN@QzkSmd_MM!$mrH@c$t0t;3?)-uGdFK?V>Q zx&(&qF6r*>E&*wfRFD=VrBk}3LAtv`IwVE9TRPtD3D5bQ@9+Qjy5^ca#O#%Ot>?M# z=egJCuca|ne4G7_tKv6~3vEUqO_H@T9lf-C!gx#89bDIkGJTT-Hbut=D3MlONk|CU z2A%d0P2cNw0x-9BoNyc1ILE8kf8M|0It1(A1O_~8VeZllPQJ+>lfvz38EyJhcWP=}$ z#l!_;hnub1bc}L?r%9^9t?aeuF4q8H@DIBY1n0LSyPumI%wNDjlW1tKBk&rN5Zi8P zTfu#q^YoLWk6sHB8uc_@S}&PEobp?@RM*LrN$+%)dlj7z^R;!0Nh==a416jogQDh` z!u36VKN0q#$I!H6Dh-~+o5hs02jnTxh1fHVpv#efieVb;Ds<(EHASqH@QfIH4rMUD3{$(5y#xr(s1D0hha8 zZri**7{C`!(Omse?-51NXUT4sA%A1&39L8PZSJTYGa1T*6VUDjq;o}2Xe#u(-E~7Ul~ZmLh^Mh%`gWvwsEIeEDz?E&Tjgz@_ywSdgNpxGZA$I}4~S=05pTY+gug3NG~y4z zptFe#)p{XLDrP%BiS@&G2>n1b&!ApaTTG;oZka)cTL#J-gBA}7TyAakvr_uW|?Gp z3}c-VT(%DJRgaau#mfl*I{^_>bS)lrvm&^X%Pb-+wP2jjVsr(dUU@*wd+nM=g!~a=CnQdQgw-wo0k=tUWGjnw5K@>XY|LvIZ3_15~(khH)?^Cqm6WmMK4# zAcf;AL3x0qY;eCN#^S}XeJLCQLsr1WOIw|6kJnFd6@6r3m_whOHJsFKvx>7D8WxZb zia>>>p>#{@v^lx9IzLffb8*_M%F#}F@a{erABJMTAgx0*_pCAtXGx&z!^Xzt?dRbNsmOXZ@&4%20JS1-PQ86NrmmA#3 zErZMd>NeCo`va!g@@IiZ9Y4s5=9XYDTvp%708^G?J?6`1r1=kQf<{TL2)%kDxe_$E z0>R>!>&xY-r%gvjQA5G}L11Svr@-rD9UB$3Ssbgd3A$O=bOsUQwww{hJc}9$)~bFV z8tN+3L+?e%{DSwg=y@xr-1<@-gGyZd!s3svvfHfI=-IWlco!boZ*8}=<{bg`ZMmay!7 zW|LfM=!rCh2vSpwUS*^2@-h$?NB_B=6MSo3&P=YG=FuBt*PtVjJESOWu93;CZLWbk z`<1G*$-wLBD#FdgL?k9?GkvrjM_*gR1 z^?r~ENS^- zltUkWfBOm><}xw#Mq}+8UK^P&WI_Hxq@8$XF_AHw=9JPP;WyE+!VP+xv&S8j-A=H& zgTiIlJGf2)I)2E%lLnwSk!083b-{W8v%HOSMM`u0^sUEx6qt`JW9gv}1m!B{tKnxa0MI8Kx7(7M-JY&<|~W)kz%{zpu(ogP5OoBgAIWpw=e1vEG` zTUfx!mivEx9p$S9aE$KoG79nKK&AJ|{{mt^7pF9YEBx;!pM1|Wn;osL&3MlcyTTQsl};6M#unwnB_a!&hr3@7}%OMiX#B?CAjs-946%#7gdu`y+U^b3w- zG7!~Tefbt= zFg<+)d2I>*w@-XWf#JG-&PxA3vmD6WvH;3`@O1gc^`B{y0q8cK^1RgivwFYZfAK&U zr0%k84*iE<6o%cBZD*E14}b^`ka#jgkW2NP%u+??^oi4Hj@_eJjjN05 zal{@UropcU!|v!S!5#juJZ6U+ltNGtSxufeITs*oU0mWYIDu5HBFdLeYFORM6B8RT z7e8R-z~*&T7&2J~A6aZ6|IEq>S_*|PwYz1&ji0#b}$jf{iHd_ zfsZK&zzR^Yv^)%w=21+?=-Z)5d|3fetqcf=V@@<_xecxUNhmHGA2)X6DiISyVfQG4 zSgJN8kkU64K{K5YAdXn5KECFa^lw2yr~?R)|?k+W>&lvbMI(BP2LDbgp|Eu>d#Muph}!Q|FdO$F%oV zR9-&yuI@H!zOD$M9yyEIs)IKp%kW8-O%OAQr7NgDMg zP@l(i^BpDzZ-iA2XH0&W%A==HXh6nEzj9I30pj5P8HttWh{K-u$nKX1JxqBNR7D6& zR;WDqE#2lVI#*jq1`{=taBSx{Wv4(2$EWE^kLe7y%x| zCS=Tv#L3NbInLN+@RW%gW~8GHr9Yy|6{_~cU-6KCXGo?iUEpwe#plkZ`zv*+5fSN zdeDEgSElWOD0L6coa*v3Hp#hs)A0R?MLH-jo&S(C|kkDl*`ivTd~t2YwPbYkAW`gR8HsctLa{T6>s8jrI{gyN4hE4`Y6+ zV3~y{x5O+t4|`}!x)NlD$&0YSNfH^31GNik^xu-cNb)?*$`j z6l9F(ew5a83ILtEpGca0K5#hj>xbc%)ZdSI>pllYjJ;*)?sny#(#gCpY<;&iaYq9k zU1I&?oN2CI({L$I|J1+o_`uzXGK{)2u{!MR9>iV`bJlh(1H=Ai7XqGFxt`ZOk*9aa zZ-8imBtsMxPrs6TNl>4DW-HxZZ6cOH{GOl~w3b%VCj{7&vVtQZ5 zT$hew&W9-wX*K0P@)YGSo(m@C(lYa%@m}o5<^~bt@?g?ueI<}PT~vx=(LMM>cJXg} z0qS>0$>Kf{G*P9+=f@&K;Hu@5d@>K`&nhf!&}6_6AO9}(-OY8rPqix|ti}2J@__(H z=U+KbWklb`1&W89qR9eljQx80bECO0AmO9ec1t)_vPu@Ccd8+kt*QmeGdRg#Q{3dP zPxEpeV|7Qo#@)cRZCr;_d(htfg*5!>XT7Ld%f-Ak6U9HWx%Wzg4s@ozW!p^vV08oP6q{$Sebykg#2`e~s) zsSH`&Vq|B|T5u=z&h^e`umGdWY$=9nQCvVAJ-tNwJ5kXE{oL6g*Jk}y1k}Xpv?hvb z8S+H0MckPDREx2~T&@C=rV;b9Xu`MNjJ>8mfj z*Pwn&}CWrI^;_SYyW`KY(SFqkCqXY7R-sa5>>bm^O_-ajgm@Dn+2Xe|{ zwx~zGR`UZsrv5#hCS8ijyB=Bd^V(;Gi<);|Zj`K+v&#SjX3;LJCRng@^?Y^{jctjz zfF4;LtK%z5HqAPT7PhT16|D-4h*?0iV@g+37>50qfPjI$R6xOv4%7VI-9vU&!gC9J zYpUF+4Qc{aT|{#89+`XBmKPTXxP8lT^RfX2u5*-qKgd^udC0BaUC7QadySNk(y6A2 zH4KMfNLW1{Ms>p7@Yc~%QfHOw7AZr=clKWVTaqP;8AL^2G2S$qr;D{-+sSURx)JPT zaQpMJF4cb(z|V$q(Ar3Cypwkr4yUzn2Ja%Z=%{nC7*juSlv0KzDmvS9v%iVh&F^Je zqEs_#);- 
zFUDLz^8@a-p|l2P^5IUdpdHr9F?)~_5UU+ez_&B5aFaf0CI>onr%zY5(Fph>(6|F zBda3qu3hScMQ8sR=*}zbkJz%5W1)v%Fq0DXJ-J~ui(fmRskxwaSi`OnNT@p34xqTw`$}{?1y09VBYn5C^ zULFVs4p}S;AmNal_hpT`h4(Yf##uRKi554lOP_+KtL!jE+fVOT8cik0r3P{#SrEjg zqW1*MQxsk)7`B778q7g6g^X5wafHOnjxJnzc2{!H@c#)AI*oiHYCtY6TD9q_PX~yF zC080h$j+U7hux|tOMGU5OTbx1`?fDi;67jUrIxC_jU>cG@BZoZPrT>vT1z&MlxMPK zFgKC>xu(k0cos5IK0gDKfb6b-^zhvKyqVXS3_f+IXbq<7KCz6Gv}6OM;cuEQ6bx#+ zIKNobjXRBc%?_qnhfiN8A@PsFW_~Vmb8K2tZDg z-PM{A{(XBfo0+!OJI97kBrmqpugYAPxqsiycC>@y>z9%9tss=a; zS#1o-d<%)i*9J1&GGV(Ry|_H6B(6wg;GE%JY)bATW{DF5W9Gc$EI7)YLPQLok4QRr z1Y|#!c;%$0>4@VE-=4R`aNKLis51GY)+1_`Q3Mj#%fY|?Vz`36+!pyFf_N5v`#IgJ zwB?en>UJYcOs#o})CT*%gDL?TQ{VUkhqj?ytEzc?Br!8iPbb;edU{tq;`pw&>J68N ziRUjeNxK^Ra|i0Ss%;V$OiUV2HVscVOUhkV0vXsF?UEiQ1b)dV=<=1!DerVDxR(M6dkWz4%(6le+IT zJsMct;k2hO#{BIbOI{H{>?Fd9j_1oveJcl#DvA9_)W_h-Kj_z*h#*q67{I$4|68~m zkZtOVl-ltHFE`_UMUDnyQVwK;EZx#`2{fMePi#!)$xMc?j>X=?@hJJ}hQ7`=j?#HM zJs6X{IMumC)|fY(#(V|ntx293u%m_gvWYgjOqqM!Qixj~@>5UuF~FlEX_$sXHX;C} z%-=D2&|}0E0g;EOm8TYO0*A^J#&^?XUq>QKih!8l1wVtc3trb04GNULaqAn%v8$+y zP==cewNzOjM;-o;9|!$HCBS#MnMtX#k`2Zos&B;JU6#mx_h#tKuf0`b3R$z|fGct9 zsiV~Y`i*|R&=uTl8h(C#GIDat5*uosr7F3y4wH;1llLz8G(!tz&JFg;f(Kvh7Ro$} z27HQ1;T{FJZ1Ml`RzXO9u(%RW%u6JCcBf*#5SGQpl9$Drp)F6p=kE{m0%Tm!T{N=+ z#T)oR_suG|7{Af6%6|neg9%{a2`oDl@lC^q-)7be3Q$V698Uo*C~dWCllcB3W1wPM zLM~e#h4_8XtOm0{sj{N#)1Qf}Jz+M;=!=`mPyao4|A1iDKlpwRCgadWFrd-bRtn>= zg@)8v%f7r64^cH~Qq%Zmoy&q1VJ@DGuHF=@!@yq^vjaH&jz9LGY<^in0 z(SyU}=L4tJZrT=Jet!9$3}#9cf~grhr_sIE)TpR}dA%3k^=~1&cjaxh13Q!Bs*Nr( zfYMmjtBqwQA00(``52+cMFzVC!|fWXygVMW$4v2@c%4A zzWJaDxBZy7>x`XChYwu#&_H_vCJdFS$)!?HM8s8T#po!Ep0ryY_48LmJugT;!ojQL zS9J+~I^ml{(%*qPg%-Vm?Vv0JG|N?6J>3hnE3zMqiTA#sxQ}Q+ha$t~Rpj>ft2@w-&E-*h};pPN_p8mHNLdYthe?=#-1Gh?+sM z$P?~)@`@5?iTM&v;9IvQHh=CtsxmXkMawZYC?j_P>gNU%Q+zr9+D4#4h%X{CqH?MG zWWH$->PchtC__;wUeet7QRAyS%QT$inlDMNbyDtr4%OTp6$UzAHe8nmwRi(5sP7R6R#hWB!W? zVO9kiV`)=BX_Cq`j>%LS*n^!(k z6_GU6LUaz+Yz2``daugzCs%QP>a`G@p3^!Z_ZTsEWE{2S^&K<5w@EJvr?njUJ8~Bv z?-je9zbzW|7PGxf;mB^j4x@3nkq`aBqbmP(w}{6*ECIvZOnG^bGwyQs`RV{h&rE}K z&&I=fv0U2N^NoAK=+lI7w1qiG8$Ob3ile5Au=YG8RehUN-X$(&k~cex;}LU`6D?M zoJe^=Jvpr@Fovn?Z0B}Lr9*H*AQ%y02;|oV3I)APQtmcbRM@Ct^&3cINq)(v)AVdz zAw`BK;~{839xk-d@MC3$r=P8bDi&*~Op!vj^XNA&Mh%^ceS26o%!Lxfo^)z5o(xYJ zcW0c}+G>MFb$*rUKGTNGqZ>}%*J&ZxvrF|b8w#niI+6Th*ZXZGT@;MB@pb(Pp3-?= zfdKsb3xoj1V6{V~wf%rlQ-j_02Ai2}>Ix@<9xjLM`SdpX96IVx4|7J3QfguEziC1s z^0k;4NO$tWAjpRo^j~4_3^QF}bbKZAw8}?{B%nlN)@i)v)H-7cmUfs_Yfy=A{t&au zK`e2LvCQS72w;ty%$2FaRuT4eb1OUbfiVFDz&{oBW!Q^Nnq8!Iuy&S$bfG!SqrD#& zw7Vt@tm4NIEFUsq{L818TM(?a$K*78APB1n2P-1pm@o%Aw-=1+*flm<`Lt@A0L+9k z%^Iwd$Yh5tBmN{E%B=l9UJ|JYsEp%@=aNyZ!vGy9~Y)0yU*~NEFbVIck%o~h4A}8b@ ziq?{fLex*^`VM{`*9li0;z^v$8YnQie$J;?(0J~{&hh!BMCe*Ie~9{Z_Kr~nl>}4( z3WCzk@1xv;tH{LhzPTOfLZqL08Urffy`N$4A*kU5T-C7zO>G+3AbP)K0u}`s84VD^br$C&tOCI>YE3&9 z`aFg7*H0SQ1E2WaYsq3*3X9bX0~B{BXY4)jrIwoNv<^x%ih2F8{K8+7c$=_z1yB_! 
zrs+m_U#ZtqVEp?5dIDN4CsA~KkD$6d%oFGOy-WQjWfY5t6~}xHx2a*RSM32{3i)~- zX80}0NbV=j-t4dP=hNCunZipr87_2)g<|y_zY)PE6=gW;qtJBA4U>_E5~d7Wh1e|U z9?>ykpCG#}{K%2tw&v}u!0e(I={vg=tvd7$ewo7cxt{dy^sZTRnARkK1j|M>pu*1$ zF4jvS^h+em8(6YTue9B%!mpoV&1%SilO8zZUV%S5j1{z_J#7`!^bdK#A?kJWZMuLL zg+mEqsr4la?#_eDez6P;Y&)*5Y~^}W$VP-Y!tdd~hH=BV`)L+SMMfyj;|FvwFbfDw zR)Q^SDyt5t+<0NYI5*am(hipbRkuXuW~r_eaIFSSwQBc*gJFj43}~O|1tcQ^h~12- z^1ZrU*rM1hVMxbN4cv5$rt^vTE(%!pm(#x62LH8}{BePpfu=;k$3Yg8+;^+xEuWmI z6-W3oIxH=Q|jfcy3l_&WwtL4qZbXMITB$|~}3ZlEn zkXaDlBjR~~lS<Q#y*VR-55b+wc<)cz9PVs!rNES;93GX1!pZ$}V9rN>XI zJTHGq1VI)kX;QtZBgiu94t@Q70|6Kwr1zYBF~#>>qf|H0QCf4Jv>L5dq_A+Oq(N5s z)%#;1f2i%v`85)E(lu!$718kqjN~DhN*=M!;d}9f@%8H~3#$%TOzkKbil_735NU?p zmB=UD#)bD4naQxIV5z1AM!n*Bq&9NPhr6C{8MV^7f3N7m5SnGMaYw*@Uj{Xy{8#PM zS%-D(HANM9qFLM-F0PZvgE2)bw;^U(e|pgeHXiWQd?zB@v=U*4Qzp7B!Y338TZsCgg%qs z3)gL!i|)A6#ZFw}SAz@+cPlF7-V0&O79*GojS}Q*+rzb80YMuUL~~zVI~k4^4=#(Z z@E&Xx2XNC}az7@dB~@9x-!q9g>dgK8`FP>VlL~K{6EhkXi)s@he~e6?)K9|eG91&d zJ?)3cbLTMZKYeg|>cqI_GGC~+H6BIk^)qV&2U*nD44&-R88*7I6J#b~(=cq_v=_@m z6d^F}=ZqMDs>}XnmGJ2dUXHwF?K#rh!`-OY|4cgw$lw2oQjy;F)N$*tH3RJ*>7wz$eImmxkGeqitDdZ&Q7*U@MW{Fvb4Uqg$=QKGR#Ex z=u?jp+BJg%W?OlXf=kk1*U5?vOdLK=cYn{2Y^OjTQa<`tp=IS)a;iIRA`u)lt0<{I z5TntM-U&-7PNR1FLMQ$x0zz8aB&ZX=R*0JP{HYj!jTBgg1(Lw_&&ah?gOW0@#0jn_ zz2skZz-ls?6C$FMF>cBWPOqtcp?z{tLnf~2TIt}ayvxB!svmS{)Dct+k1H)OX5B82 zylK9<@kMM}!Uu;KtU(yssNa!UASJx}k7ebDi~{3n##QSoC)C)J!F3(2HSuo$RkpeF zeLpUbOphm%Q?IBl+202AUu&N@2`uyD`|T#B{exFNv;=K2iE8$k8ut(d6?K$E4#4vH z*S;eR0b=e(QYmhJO!`0lfywCOsrULhPY_rVV1H~wKh z`UQ(a`NFnGNrF*HsxfmD9=uH1F>WYwX)q*3=6`z!zZZ6i557sG*!btF75}zp{-~-r zEUaiKnF&@d%71F^3sp7)gK6p5w)0;8d#e30kij5O0G322jWppu*EY}W|F#77S4%peGMN9jU>BgeNJO=& zmH)TbV1Ql&P^a9GzZ>w|DTD9=Hw;A8`+rn4`-xvvnIJCweIgaopZ*&d7Vsq$xl~>~ zRZ&z_=g=b9p@yw+6nDHUYE;M1p2)i&W*7kKl{xlnBsc<8|0zWmBpj?d5>Sgu+hU2{ zKT6(-mW#VyAU=?77dgcQ#^MHbdzRIItUuTK%iGw6NXvyaQpvqX_ItwN|I3Ml82+j# zu0x^1zk4(iyuR@+JV6@hC}4pV4HbP-ghl)J(*nYTNzw?xA_18-hmC5>DUl<U*7||}`IGKytw+^QXd z8_ERQu_g+Nrp0udv6z5PqXLPmnZb&q!uzd82zz@An){rQ&w1ZXbJJ+Qi+k*1j_bi> zg5-SDcGNgWsn1rRT+^?<3Tko-e{)w)QaB9yUS<>Zx&EXLvYg_NH{eQih#q?b%x>r+ zxPPB=F<1^bqXR_l8=^Mzjl`2${6nj+I4ScWB*YeEmP-E4(|?IK{LNyy(3N~}_|Ny^ zq~-dvlImtgz+f|p10!z`oI~e@0fH44PF5{7lsliSf8vxe+UZ`E|B&Yy4NFTwNg0p& zT>5jB#VCA?0NrVBzD0o`>```D3A zScbIvkYH30i=)4g@k3L_)#6GL!OYVmmOL>XKlz)stCtKdSF3v*Rue9XZu^T3DWB!9 zhiDh+hG|9c&^{}s$p8q=g|_LdO&yO!R!Nf4G-RQZusFwbR?~>im7K`d*BwLF@=6(k zU8X|s71MYRVgwgqTEov$d5|rq3i@_R>jX0`((%QZn~y!vYnCM4-5b;oi#v^G^XB4E z&C#!wHs73?pY+;@m$%8@b#}RSzdAKA~tk?bW zu*D@#i;E+4I*EA8d3V-`8d594xZLah`0yS$@G6bEml zcas%^`umH?%?1AS_74$tbf#;(HBM=5L^>6z1@ibWRVAul)z};&dg?fE8t{hUpfl=j zf8tssU8qC<_}JxKI+>4NYNX_HDIk8-Ea%dfCQf+u_JU#bn&Q{yN`~xLzm?&0yvtE_J%>w2>SA|L!-P5&xKNu}$-u4h?*(Ah z0fa-;B{yUb{ek?odDJRTRkD?O?}RtGL7P=@pKxpJ-{K15wE^}k^|1+ z%RO16o(#nY7ljP@fjnaUS zA{l)AWd0xb(Lj___bon1K`Modk(P-`aXSBzh}R|Qb}C?ZrX~1mk=q2e*0awTl&h{T z+O>zh`0;$bjTUbP^{T^P4koGvfE~Y$-g}X`rBOsV_STDFz8zvDw_59Ef9KntRoX+X+R$gulIP|7|#H!liJ4}&!AsKL5 zWyhA|a)l{w-B~zqR!n9u}Su!z`obOu$j#gNeF6%712$d1)54&g1C<& z6c*ovsc@A}i?mhpy0ufbXu3Ff^O(_MQt}->r}ekGtH%Rw`NM0r?-lPYdtJ!vnK&(l zJro+xF1O$mN2qgnbY(x@`&3LZs8mu?(!Gv0II z$E~lx$Q8-S0UyM5zrWt;LAOR{sK#II^*cT{w$JAdZ6)$qcX0Bm$gqLp&}OA|Q7+9& zWWYo_M)$pV!p#2it-p1`ew~oAuKzc;(n65ZOOc2#Xah!d(UrtLBdI;!;n+L%@^J@y zmti=}hKj&J7kTzeytEjwcCgZbIV2hT2?>y$^_G#~9X62;3puqZzbDZs){NhsEiVp6 zAN1HQ48p$VdA$Nf@M*LorsJ*BCgKX^NgxmCqm;q3yN5Op;p z&-E8Jcbjes$JK5t+xe;w>E1Uc>DxWWxpRU;qD}j^W#8R)X3G=l2;C1>+wdo@!(r_B zKe8irYbV~c&4TAiTO?kzs<M-C1sJNi#m7ECn5m-Cjib zKN=(*Y%mkZCTWCnKP=!hq6Bgacebt=QP^i^xi)bmdFmO)c8J 
z^Q|U@TFI7vEr)`dL^a}asnK;4QH#NKXCj*Og-ufKaj{N9nE@Z&0rCZKl5PE`FS>!G z4?!f}8oHUlDj9CydVRD%d%~8+@BS$S(snAqtfMo>;d%9ayY<#SAAzJfXk|~W(S_Dy zH3VTA%?ycvB4_2}<6e22^qcQq^dE`6NMegxY@gPODol-2N$<@tYTRbp1&cd!k66<8 z^6ph^P2qG)2Q^TN_pfcecu&KuR~yq;&S*0rOY_p@eA{up#wN08)LMRTx>yVj;X>mC zc*`1MA}qKN;~ar@95tWOExB)fJ)e+MDmrt#yP}1OWFLvHvEd$@)-o!bCmQjQRdX5q z*(gZ}OW^Pmw&!w>;3tETVhLlFjcuk3jEPQ1HwZSi(PlppwGC2nbY>;;o|(L;Hy`R9 zT>kY_V#~Xb%@NYZJ>qOadoqHbXzzF-B!Z>SF^0@`L-K=(SKh`ocoU7^=LFlgeBACO zOa@hGT5KswIq?pPZ;RBT1`YtHi$OT;tf}p#VHnZ+F+c~F3nQ&ZheGq0`;+324@s8Y zHR8CJE{Uq0fH4fkKnFI`kQ_2Q3{YqIlV!RlO!@Tc>(nNW&EtB^e0@*K$ zx{i~(r%6VmmA=o@%p+|rG49aM;V^BZFs_b;>Ca}3oZ}Iv$_(BSNr=3gt~6(NO^s8R zvm@kO)f(7>r?T&hX)7<(J?L3Z@_2^TrRH?@0po=v4?6Yx_+n~WV(0s z!ePEzBRR(PXv7*Wq1H~19iOqZKM(2n#E7O3e+ypz2D+gmFjk}y=G%wA>1X6~FJRz0 zZ60t&y@;`Y!qh=WY&1A^-<^#_$o-9z4=QFGSRjqy8GaWp(7XauEp>Y_ukgbdH8G)= zpqwGQD+o!}X0Fnhm#^@c0vL=;uLM7)H8o%2(5+NvX}-mUoPjD zu&&_KAXiHbC6;GLlLyhqN@H(t7`^RlsOHUR<`|&#*qb3bAD7Bd8%DfOGZxWk7#iCMyNOtXT>N$CZ6wQTvw}bo>wZO@i+`a&0Sp^ z3TaZQl}@W?&FMnND*S8hSy+r5>Zwf@%izrGj!s1247;$1O&PmJHT(#$kXx)Bx9Lak>y>gqWcAnT5EX(1JrGk`s(BPgN1@f5F%}6Z+(qeS}3P;7BuH@gRxD z<<0rcSQa@4aPV5qwtp{fwB}m)^b|CU2-v|<@6lexxbCHEv8JM(bB$=*uH9cgetqSC zurs44wVU19+VgfxU9p-%!Y}-CPEtX?Oh75kU$Q{A1{RYi*F2Ge;FdFG%rP|@O9vf~ zFn?OLE`3QJ^%LH{n6W{aXY;a5yVHuVSMuwmQxW6OL8r!z7FbiJu5w~mXx|qU?w&ql z=#SQ(w*MT&h4a}9o^mk#mk~z*V`Q}Xp^-?n75E8@6jqy-F~!SvIHu12JyItHegN3N z^n!IT&HLeJc_{6bg=sAL*{%(I#mKucVq-u5A!d?X`EcV`0VJMMnF6pp{uGrLi2m3c zUv3Dy=}MF`!rc8^bdpCpI*7vP`0tLZID8&n##4{^9HS)jyEoq8j9X-U)VSXp%lep8 zU!+k?<1EBLPh7ZapE_}(1`&B}OZgCfX8pjz5-_3|YK?c7Jzv#4od4?wO&UcQ?B5f?S|gB2cj7jrGsnp8?ot zC&-slA*OR3ROZ2Zt{+=LY|A4^rWayRleDJKkL7*3I! z%F6M|<9AV`?}ON9y;c*arCw;Y<(A(ZLrA({)U&>n!Vm>vNr3#eg@rjJ##t37WJRE6 z;1xSI`GHlPu9%@P#BM^tC`8!Yfv5_Y!_MfZ=H2=eO7T`R+jv9mf;g(}{zwN4bsF9M zcxVXkN>GEaJ8mwIN^!I-=PG&b&nF!~2e8ZSrhzQZcvMSCo-bt%zq{jGT|Gcc9KLtD z3y8%AM)pMzUQ)N+E9SmCYWYaiZo~75|7Eu_0pEqx&1{>X@!6Nzc#>FhHdr^ndii9* z`57u{!AxX5=i$?f58v+a%yhik$~6B}c#8LpeYyM|i@l~Pk}AjB<0m!xyq>+wZ`UX} zKWE`tl)MOzM>KNZu%qHK2gt&qLalPxuL*UIy%P(TxZaNsms*NmMC_Y76GOdx5&xDtjZEg$@HcCmfhpUf)wo|2}AgkSUHAA zJ6BZPo);M`*As8|kXKokb#%69-t?LkUCaF_+k?A|_{?U|$bOO4P{xY+LobR9tGCt9 ztbMevzpGM*{r!2KEWX-J+hfN&w~r9bjIJ$gl_0*asH+tmFF54VL9I$2R*p(Lo*~87 zY(LH@BXfV5Z0+cfvxfGH*i`>^`1eni?_P$iW4ve1nxcOSzsS7K*aJu%>9jU0pyrZ! z4)sb0D}T_y>qM3Vt~&hjA%aqLq@qIEUa2A3z`XiWs(0=T{4$k1dCJ4SMT0?pdo|fME^ko~Zd+RDvD{{zhkIXFz=WIa>prgNiW_!};tF=_#UkE@$sbFHgBs}*96 z&bs1M1+Kv%&@-xCpJHo=k*f{V!>jjRNkt{;M;5pFAJV(t(LTXeRHvuavAMdES*?*5mi$l;Kal026lp611LA@s4CIg&r!Nq%H!9d|?@Q+*Ki;nhFgM zg~Q(}!26uZsT#@Mk5;zV-0Tzz!7sPzF3f8-eQnp}v)y?@ibN2l8U+lae$+286dI(# zO9{pzhEZ@4_F%ED8`f62r`d25?-QRp8Q`KS4KM1ypG`jjIJm=(X5&cXh{E478{jUc8kPwzud3%yD{~JMXL#OE=VPMQ?@3k+R zvG-1-#3#dqaoZxhJ3gTJGdLmu7nqL7%*(1NK%4%!H3R}XMjZT<6uM0j`)jA?f!*&) zYL2wL+UZ&`+M$cQ@OS0^hWgtv zVF|U+@3|5zSCz!`l{{p3-sx|MXBSIiptOA~aHal!cAcy$9i8DT^O^J5!7f{x^{SN3Vimd!G2&?S&XVm#B6!dCP|! 
z+sK*$;j|&F_6Wp+N0FtK2Bzx2Ye@4HoH|6|aU_vlK?2#>yqWnSs}W4VNnJD)m~5zZ;${uy|L~)K z6tvPQZ2TQHJKrHE$v{uwqVEGdrUx3;?s7(foKf~CSN_j$#EE%?t6vh2PwN-SF7R`3603;JBrG$*kG^|zZINu#VKRn`B}BJnh)j8jaRlE;bK0cR{Hiqh4}lRCZ#n>N$F@@Ukcs-_QeJG?h1cg zZeJ)a!vRGiHbFeHRhGjNuMgptPwXx&2Y)SO*(=?$@73OoscFc=fVU*~q6N_rWt?-^ zUz7UZmzAe}quWZ*r` z^ogu(C8qnYN$~eYm{r7h#4LMjDJiEz-u^E17?VsG4dX*&d|`TzcIH>hCK9sTezY$X zg?~Rhe?3FW3=rw%Awf}Y^o@-Mx6Ap31A^?z=5`Sk#N7`GNOr8q^Wn8QXo8_pWF&qu zW`CFV7da$cAq-_sv^6aPS&-U>yIFk?ragtB zkohkaibV(#d>3o^Rdx38q9uIW^j zF9w6L*?X@w*PK=F`&5;UjSV^bBNo8EgpK`hHwAc-g_; z?|dmnN;+lU5l?hfrldc%;ZQH+`3irZEKBx4BlK>3D}dyZg29WpjauoGp%_W>Cort)uYa~pJ|-foi%pc}HrEbQ z)1CnP!|YH~br#pl-%7D5W;zmeibcQvAgRSV@1I6$`}tDFL;!tB5fGI0t;Fv2LaIDo zSvYLT{>ast_2wyZFo|YDg}@Y6U{)fB;8+SPtUBEFwRx?{j01`d@x--`d4LW z^~b+w5i}?PirBg&Q(QtL%g8}OFubItkI<1JP7Q6X`jzv9*~j*1N;EQr~Th%ogrKuzY6s z_V=Oc5Gnp7ei{cs;q7TevCR;*bwdwV600^R+ZVt`t8)#Gn+k5zmn)(;ofacIe8#y@KL7=a6e z*p1sst>JLVq7hwja|0g-zBO25I7a4^KMb_!E0_#GSYg|ck&u*R6?v>h3YSz@hpaQ1 z2?Ox$l6p3gMzvotk<$Pa3=_)Yg)ES{#Nl*7SEkjHB)*~jHm=(xxQ5kWTVR`dHq&gc zaWI*c6H(0PaCgp|`}Bv>$zvAWi6$9~NKgTO(39y?^W=4E! zGnz0gG96DFk$=dZ7>LYZQl@_XyXe57K)!Zcjraz1_ z7B06E|A$~i3H?B}LBy^Oj0zTmz9yrS?GYk0fc0T@Ph2(Q*Ax*U>+9<)zS{o%yU=uQ zc-W6RvID+dw+psz_*38w?~cFnC3>bn0p^u1q)i7R zS9+kU79IzER<6w@8&_tN+edBd z1@P~fnFag2RejI|_y+lSfQv{DpV)_JW?lIN|{_rXa z86}Z!jS;Q#9PtSX+0Cn-}uDE2ct)z~G{79y|5!tUK=ir#JB=^*v7qXum zFvq2GCFT*1uN2GtyV*nsejpnl0)v5p@E}XfC4sl5r+)$Aj2TU?6Z0oQaI*(8%`*Af zk^~_}Ln>X#VC9$5pgdrZHN_bCAhKIzy~SFY)LE3%M!P{AW-^{wooSrO-l*^&nu-iL zsdSA()c%;vnw`Ehy7;hx7P*$la0444(j(><0}4r~v#wM&gwXt?3AR3lX#S5@mZ(3ec?{k5G(owBWwPu3DO3Djw5VLx0@HY-sO1 zjNHPBAH>IGR0Bu!ei`hM^0l-RfT;j*?Jx>8DkIef}okcKq2B8qE_4G9a@W)W~O6J*L*A`810Q

mr**ziu%mSCV5ko$=fAM(0WD#&gLs9?zb zlI}$B)H_&=r0l*6nFxPs7h~}EL;`e&VCYUeI5NFS&CNh}o%p4IJ8+Zj?d>>27_c@@Gf1Ti8FdAA0CLiND)G5JKrHuo!r9gj_0*P?7R)UGY|T(?VQ)^ znaz~9o3HRn4!kwDu(sB7ZGN=G!|foO`<80AiPzy<8#B835)paM6ilcz&K1f@-)h7l zK3sKzPlf1aXI%76f4%oM#>O;+hJ*wbgB%WDTM#IQ>jy**JP(nJdr&fHUoxpb`8c?D zCQ+12@RHI7{*Y(ah{kmRr}I9 zhW^?;_nR!oa~e}?&?vUrGVARp#NUB73A8#BCH>|~LM4JwElB7XR9@mZZaI7`-()7z zAr=#ly~&_MnScT*S(7ln4|>0$U?3q23GKaJ7N1cB(>Ut(EWH<{?=s_3Sj^LlD}Wge zTr~Kz?JuZ*4n%+80Bk^AFm1a&yF=xdZ|NEmYZ^-h$NLY{q(Y89T9x7@IQ&IcB|1*ZFaq z4i)_IlJsmD?7FR$YjZneBEF!RROauQpM1mUd19B#i0m7H;{Ft9yZD$Ro#IfpbyBfv(3N9LFgjWYf4$@}A{*wbLDMze+nw>_2|JVr7O z=?glbq;E3S`6$OC@aJDl>;lf)?irL(#`$9jCw{GwIzP#!o6SePgI z!hcShaIg8)o6{BMpCg>lH4g#?Ihd-)d<#e~vA({MTUB$E*?uerz#D&yx-G8H3Wbp3r5AAAxI&2Vc+pY+v&j4NPs0;% zOHi5{9q(oTyH2nGzP?&tbo`$=C%j6-iU4N2WFbghsPc*LaS7trpq`l0&WG>*tsEhZ zcZ`{8S)lYucTVLjAu8J#SJT9DmW1w!IlIaexok~>Ne^QR{r=8$qX|uht6ME0C#RYOh}JM! zK)I>JrmCHVL=l^BT7&?4q&=9|IuY8PPBIc?580k7_n#y~Ah45aCM1T-F;5VZaeZ=* z>a!>)Fc~2&6Hrl$^MPfv8SVK@6evHi1Zw=+pXf4afQO2P~GyGBw)B5?73do{< zbq9;G9W#6<`4eSZfsjm=Qp@!ij^i7If+b)0+F}vLQK@x*rkVx?y4Sm3qG?lg=MISy zrl@(=X^F|@hG*CA`-o;*bMMnAmk7ivuCYr+&`5z>#Y99#svx0BP1e-d!mN;pMwATQ z$5V42jwUJQr7hN(b$|jTIXy=Y50^eeoVC~h(v<}Sf|JX2Db&m?Vmd%%JuQLHqj}L5 zyw-xm#-voJI}p)^h#=;1>bjB5`y01kH0>f8FzXSD7L$?&s(H^>&=(X3sEdlW)?1;o@OXis|hG(rn~mJ z6G&mpTve|tE)jnVK+vF|uli*Qb+lS*!pUs>O*Fdt@jBsaHIOWW@?P9s81TfY*O_+5 zT*C`nO_nD!mo&A02AA{6^gs??s=+K_m};mqYp68LhfpesFBHRj06`Jpm*F%4tZ3 zg$7I3If9ozsh0i0qy&DC4s@cKcRbXNa42>IP`GhOUe2l!-31VYY~6ezM++T;)}@Z} z3Nvv-+BqV(g;Gh=wRVTD-{&|gIfWA{A9s?QH!#;0Fo%8nEQsTlr_k%_>;_nejgdA|MV>4fl zCrFz9?>U(a6>utRBj~QNz7P*J?iYH@u#uK=I1Qk9u zWaD)&_rb7c(W8xoW5 z!IvOxXym7yJcrZdKUtr>YOx~4+<pu8trb}sF6=HeiyyLGZ7xRGrp5PtIVbbgvWk*f&D~tpq-b;sMWF)wvkXYu#MsK`NQHc zK|>S6B9up~Q;-w4mhv%Jj0wi;2 zvTbNin;pMtsxCWtRF}k(xpFN*bg&w3KeH|aE^lTb1syGV*W9hz}=TnF>;HB?f*6^(Os`L$gAQ|(-_WdKi5{@{mz zlfTvhG9@_OqpVnACf>ELT;P%;II4lMA>Ad9`xCruD;(J_5egCCB-ZW0l$`XAby!CF z0ST>CZ`gO*nplQ&hJa)DSh#Tb(Jn~DlS3jqx*Xp_EH|x9PO#tEb!3SLDrKg$Aj|a9 zFMRTRt<4VR=X7O)lu?2=Ly5`@NYbdaKi!|u5=V$-P$i@A67x(z)O;Wzmf4`0D6vq3 z&@G@vpSPeO%;V)wz>nNS!%$iyc|bU+C)2&DeDDN%psMAovfH$!H={7X>&+G%U}nKiw8-imeqJd7uqM;*ogx~2_P+?8j~AFX#!nQw2h;4sR;hBic9(>nime7o zt?;bqh7JqtsfF{VdN_B14Xz;L5H2qgxxbeY91f&JoC!nAi&vb;rw>(^0J6&m?t&R0 zMndrwOA|9kE9{yszyw^V1iXS7m;hSpbIA=`K<6mJ-BzNa;ARRKph^UBdaJ1ds%k8T zQD4X{CH5drd_{l>(UJdNH``^PA&hG`>Qj#1j&P>FymQ zvVci=eq2ySj@&*TF(`0#{2Nq5gGn3gI)+XdQYd@h!c6MqQnjsm+kNdH-=Q~vwwO7p z;RO+*9aGw~VBXGn-9#7eJ^={rGIU7b^fNy60_su4F1NqXwq|%_#Voa*Ko&pf%tCz0}O>|VRf0+ zp!B>jAoom>jr}wAx;`PO^AYIYLC!EB<0Sguo*b6qPnnEqP<^5^8kc8Y3XF%|fhV&< zei*$unCgm#Aal9!1Rdf_E7MqUk5;e$nLUUmhVv8~^-Z@N>J$cp)GHE|>`fIb+h#YU zcyzt3rO=H?AT?YixTN9;stBOyeo;!QM7#l5b?``1ju>jKL2_)8HJEkJC&W&@-$HMh zi_Rts<)2aGmnDuNsgvS#HT8yrTqZH26i?rO=*)gfa1X05WTT4mvLxo${Ro39fUIns z%=eKw2~}I8sYU(~m&)dO@+tbx&$&s1D$sc7%a^nERb_L{dN}M-MFh4|_3q zwPaGi>3wYGA^zpBbkBxDZjuqhsg%^#byztGHEAK0!je(mnB5_%mbfE0nhc%b7vRNg z#zH?tQ2_mc3}|^sJqUyGk?t3)pmK)8R?E&<*MEKeIhdbtC~sqzEG#Lw`z7fCiPhwu zo0!>xyx!;uoJ6M|%5sS$&EX+A>}yys2tDzT>he1^W9Ps(Qu}ABqLGR5Yl;1dg~)rv46N{ibZ=`HRgdFl zR#9-J4%SdU)pki`hn+GsU`s~zrHDDl78JNU-Hi~w^Vcu=Z_67f5Bdpy<{okoCtF1sWyU*^_Sq$c z_sQjlYj%h3a6cR&xE4UchjV{#$PWIo~Gf zAD}=J3p&WN*PlLOI`rbtb$Y0bPX-5SA9W{KT6ibw!_!k#fC*#V7nIrSzCknu2Qv*m zir#^DDAeh~wYrc~RBvR{OHvcJOlIcyW~Q4j_^cC2n)?1|P2WHL0js`+AxFVz)8qU+ zH2sDgWkyC$CqeAPXGdZc5j^Zl`!W3W17Y-a8EsrKvYuW*b9fNt1ytjRQsurc3puqv zV92**OeQI`#|2uht+6B@MAHl9C3&IC`=OzK`GdO3EmyaSIO&y z_eLNV{%4Oy23?s)ZDR{D-tG3PhDt56rOp0_6R<+r3;V5LAR_cn5dH74Kqn_C(dTy! 
zzMC~2A+2X8S)E0+LNNQRaiWi(m_l!5{-J~Z_dNg)>g-FX+(3S3DgX3jPBqn=5hRlV zUrDD>KO8Ki5>ATafh%_`lRvnLAM&XAcM$?19qb?4iX4y0#HN=R0EJLkNvDbUGl3$* zKP1LryDlh(^kVh63$}TQ@*fW4SHk)keP=r#nAGlXN!2`f(?kG_=%(<80W_pqps164 zSww0XHyEH@LVD$*Szth;{XTdakteCdt&DwLz~}w0Q?5a~6b^xTQKB4ENL)dbpcsiu zfO^*t$Y`89^8f7)|4aHIf(Del>PZ38@PJz3|D&=1`dEZlRCc8x4f_AZGJN<*|9qBT zB7)2R9IpSaDCq%MliVV40_@+!<=?M4(XT&6TjDvy=Ko1QRFVMy@FYso|398TWbr@W zHg_tj{(s*N;5WX-Kk&N4V4lGLh(rH+!yru%@JY(ZkOhKbiyD^@+mv9E)wzOeaQX6vdPAfAs4>GCrH4BE1%g@Z1Hlyq z*VVaw$sF$8TeN-2r7jsTc+hAPQJE>PcBH9(gqAhKj@&S(+k+xck zQ<;JrVe5-o)Qsu61uj{@)uaeH0xP#&njW*WZg}>&k_XffeWu?>`*Y6Ex1MB8wTI&? z>))&+d0wI=@jyki?-^bndUWEg{b8s^+gj*uMI^Mje7(Inv7c|znx7*YR)s~~oR~Jg zk`QQ`=*|Ln|KKI6OsmEh62Z+6@(y0gvFI z&4nwIEQ3}1U3PeoZ9izE)dXqHb)ln>KMf^xbJ2Dj+G4~ZNvK>LwtPOMd8x39@x zr!99Y0@3ZJi>>+S>dRbHAk0iR#4*10x`Vs|BlP|@uH~92(9^kAs@-V{5(v>inN8FO zP$D@9aBkbT*%mpK?IBwt%tUT-_Nb4h<^l6r%WiO_huvm|+LwZiOk{5NxG6o!IfCqk zKBK(F!z(8|z5Jfxhs2bnXRAbH*jrS+CFkbgw#oihAdlkyei>_9@vkks9`+PPeXQ_z zeeZ%&g6(vdJ&|_p>4VuJ#mK{#a3*aX70o;FHIEUq*i^Ufjqg-5+ew!%H9C_I!Itux znp=cCT3=bBoB;GDe9$u zLifH(Y`9ug(0}yeJ6{XYIHQRcL5OHUBYRB64@Z#x4LxKjg;j---ndoq^ z7+c8+Tt%N1pZ@&GWcpl{WkcEWz?6`56f4ZQv#}?t-t*23M z7gHwJW`?iVesZybyQ0<5QS{5uAhnL!wNi1aF>w$rS?EbE% z6E(7)7S9eRilR?E+@edWsJm?ah)*u!?^7~*>C4IS@g`=AAllX39UgV5+Jd#QjNV*dH6ihuX@Vx1;OdU7x?q zts`x`&-MFB`fs%wFqoTd?6It{zNcmXwtjtK*UO@rH!&-X+3+Y~(8gz+kKW6VzOzX> zD6Xt%Pc1AL)fB{X3?FEB7oBzDW;^1xZ+9?uZ3)-mS$qg@84j#^ewn#sAuFhlAG46Z zb-P>wM`yj95U%gPt*DtR=b7!NrT)3bu~F7-jl{?6%{^S=Ic1enWsU__OIy_cEx{qj ztI2kL(dD*rLz(1K0(TrDeYVKv66v(F&Y#28p<|+dX9lIf@#^Yo7K@O}NfWAw$CYy- zIwW<(u)Hfn+Ete$Ng-iHe`!lmq zuCF}$3e^~u*{nX}`=@VoJ$f5?bMmjTb$BJHSw-MlPhGzJ5;^BEcW-j!`~lq!a@OQz zC~%RgO6Qptt|L&q9dUa0P5fAbxC+NB4G0tM_n-|S`;jJ;*XdyoOF zFS-$~SHY+hltaJzaFwvq-Pg~KUwqWpQt0Co-z(AmD0^nwa2N28dKf{RO_kduxH}$% ze+`Fvo{bY)o?4?3abkjlqDcw(RcFVcVT+>X_B;7@_XXhoijU-d8N2H@+c*>)z(68h zZM+pe3fjcq_EihiX9$KVc)P^a!_OmF?GHw_7<~y?Ly5$fQXI>LMir4twoh72=Q8QI z#A0+cJ-%INKft7z-c_%Ivc6xtu6HTi+zJ?CIbwk&EC2lcsTeAOUh`a|ZvWNs-N^NT z@VtbSNTG44N?F86P6HX1WQblwy$*eGuFO3g0ZDRjgx2Fn%ai2b_j`^wR{a5E(C2aM zD2`NFb*r-W*ZD{muAnJPsbqe?@NOJa+>}>r-=FQN$Aey%EV8Ti{J@jd>n)a8A(^(& z3c>HCD*(+9Ce9`HA(w7DPaLIO9~m(Uk2<}w$7#dyeuZVe@axA$_Z1KO_Jh_Zu3m5~ zTJaljtXwod2p8E*r z={;WX-w-Z(8iXNH8zM8bz}j?W!zG1E3S>W_eKhql$WaTIEZ&wZVw z(CzT(l(F_`_7m=chxME^1pqe4*4=u#YP}H0bW&_oDA#seC?KFVw^2Mt$^PT{Yyfi+ zt=Dmby!N!rks#CJXSMg8QmJA((8ct(qG2YKwYBw4WMIHdiGF()5T0v0y?tJaFH4Q!55r_nZ%R+~(#RMr3_SZ|2AE`58=D zUhA}EG@Dn#9a#>yqCYTRxwJ=e6A4w3b-Y9+E{V(XWd{b){(LGVnYlu;X5YOq2l}82 z@YAlFKxO{jus$qx>A!!n$CoFWAa5w&Prr`|ezR$8q@bZQZcpaK6>&LRucfb)@t9F? 
z%+13Sv0P^<_0qpK6gI6Hz*FA47mvkwlg9kD_#9zEUup63H%xxF!~7}|-hMgjiPbvF zJ7BQ-tUqrOuSpGHz{y^uIL#UMZol~Grq&%?FF1w0Vrn;so6bq(w!ba|bK}M)r-oN6 zT_fis^tTa5SK`ZG1hl}QJ>U9Puu5S6Jc|RWB6@eUXG_UW(+6xw_G+$(VI*O7YF24T zE%``4uH9XbQ`qhaC#NpquN}@=67nLsxmWCP99P_*BR)!{KK=+-_+_&zF>{Wpt&%^6 zbv)7Th3ex5J_I!?e zQnlW>gZwPczwK_+eR7~tw42Zp?76v{@HphF>qX6X)HNFdecAh4GHIIcjy}#@P@EFs zICJ@Vl5kSe+-of;tj1QD%{$ZM#n=;Ej*MMepuZ1aD0JGxaP(y-u$6*dS^$kf-TTW!7aSd?!kGQ-S1yo{+rRjZ+xFSSOJ$y}~&*$m$}+Bqbc z3zlDZzUfC;m2luk2c-48^&~c2=nDoNTM(JsQ};DQN#ov6(m4FiVB9#vga5JxbK?3T z119G|FenNs7XtMQ*M;g=e%6hPkmOYCkzTNoi&jRFrnP}y=ZB#7*rcRBnTZLx1qBhXkjg5}j|G@`;dh4nB7r2KBgXh$jkj@R~XIwWP z_^MPdAGBombM-ol!Vsp6VB{1WKY;LldqnH>rX{c(rVim+`c&(U)i(05G|iNK z66Do^_wb!BU&|7UuDG{ITWGw!Z~s0D2l{mOj3jT?{;oF_z2T82ne&hb2@VcHYp|HB z*#623CFJqgrwk!;O;hsBkxoqF#2?T3xVyINR$d*P zxBXU9V&!-)(t7#mcYUz>=wHW`#@nn}a(BSCq|FA6*7~W*>Iv=r=aFbE#rbuky)q z-B~`JjC|4jC5YMays9{Lg&`LmbENTWaYBGm!p5D{)jE78SMs+V2uR0OI zu*$YGF=dCXnf+S#n!KCYl=V3XvCiKPG43xG9So)Urx>en7*3S~iu~ebx9ZFN%03!dHELl;rV0>)O)@Ji_+FTvPsU2f`q_>xD7djX3l95oJ zFfTpvKpmVniw7&}cWCABJ=eqJ(%6ysFYp0gxq394CBTE=6j6|CfQ+FCf->-1u)UNj zn4`zN0)D>Es2^?uBGvK4b$lM8Vfy%mq4#H;o;~-u(J6925-Ik5`ql1;{QP5+e|gVZ z8}8*#j(t_*-GmGy_H^0waYC(fC09E8X3Oi}};9k&&&HFH)P9UUtY2(w!A~-<$YhaWzqw z*9qTbF8293Hwiy@MmZhn`;M`L+gBd?UG;eD#N7q+}yhygS>-6FNfS* zFnS>&oosrC5@4o6U5t4V!WhzcXP6$*-BR_dePlY z@b}@?gCb;O(=z)R?m_?b9~3ZpNsTd4#ceA0`j#zoWqBIJG$9&R{qY-UFSftu6TpFU;m% z$m*a$(y<#ctRptpgZJrQHYdNhpVCiN%@x_#HJ4kR>bcc+dLsYWY7aX`Uuf_^)N8fY z>2Zg14$!)-MSue}>0%Cs_~6CcufS{h@FUukWZB<{Y`Mfev^9-a70*W+e$E~IO{&pI zH9X0}cO9dzjbYMEwqa=vV*&c5iKr79u8)svZ2@J1wsh_jIh)aCt9Qm`;qxg^c>3Ae zg^I5yT0hJLODdcQuZ81&WW!yiG#k{V+tiyJG-JqIUBPzQ9tDR?{g$ZLhRyJiZF_e_ zAZ2Q94@KgKCz~30qZH60xELO8y}DsbySs)E)<6(cOMMU{70dCGJsa-2sZ^Fp{Tn6O zYIo|Mlc;k^o|_KK3c2Cvg+zi-Lfxya(^i4Bw^~#<0^Lgxsi-l7WsS7Kg1*Lzo^R>a zzsq{S)=as!9wNjWSf}APuh!jAfFf70@7~AbEm{i?5lyC1s zD|^EU@rCjCSCV^G_?QhlwW)_oRt`uujKOl^V;;;Oxn1W`m0Cdl4);eX1*b(C-43%` zA15rSc7y5ZCc=`u_Tl27?MK+pkK1&aB<;dGKV3JM(U~2x4L=v-g97Q08%SgjbH!18l#cJxKt@<-^Mc^#j5 z%wcGGG-~~l9OKk{2bxrnIFhivNvvybfM;Mn>o1R&`o5`yEJMHC8BnvXy8yum`LY^Z zz*fRld3n!*adBA5pi@aH8`xFEF#Fq5XNrK$`1$L>yPvovO2zs>KIB^5Bl7S=bVrl@ zbD9dQVj+2JO`M)djuE0>^#SteJpXU!4*j8UU*J3D=oiE{Iuh&>-=02e-|8U`0*pkvNkg?y5{1LaH zfF-*)$GT8i2#?W!p>(pK=-tDEw>buM>6zZu&@0#mT~kl9Y>$PQ87N|0&*FT@DGrVI z*t<}faV~RvMBYDrnRkfO^mL>fBFs&L~;F{G=P@gS;rU>D;W%5f9(axudILVlPT0#XXcJwd3D#BP8i0S#q<}R{YHDhV1z-i9 z50ArHT!}v<)&AI>PT+J)1HTas_bcSDE|CG-S zfmrg=QD*boaB*=n*(^fvA38453vc)`Gl$iJ*F#BKbb0-vF zT0MUFMf#s^8yqIkW#RwRtFo~{r~*&LA9DO+6B5oT#3#2yvd1$WKlwora2o2YuWnFY zoD#n!7Edfain5lw%yB+KhxFy`avmL=Q;LcLHl1Na2OM4Gpo@hj z6TXc=lTWFzmsYw;1%W#rqFegaJ%QY!*ZXC+oR)G%or$+c z9vvgh2P5o{qY4=_?;gp!vk;C8oM-lHEvu5p^JjJvghybjP(YJ1#b%QoESE$73i8jN zbG|9bWYlYhGxxg!rd#|EXO4`Cv=L=zub77CvL9Q*cvwfa_}lgqX_ix-9m#m{=^hvB z6*q$iNMIgv+lL&NcWy|MP`=-W_M}}8sziD6pU*siuAJJx6@M>Q6ilu;w(g=S&q8Im z-Ly1|47^um7N*D>+-`WDbN(^)O!gyuh36rQ7CC`t2IKm*4r7QoOcy9PiNWZw~gdJe)&>u$q$GX1(P^M0LIQKYwKW^!A6x*lQ^azC3s@y4D0 zbnhHQgMU%k9Q`cO_@8#1f-4vB@?+2T#6ffSZ*i{I2>Q^3OV`2|N;avuV>sct7HQ{M z#J&V1#)9*u>oR4@YeL-3>m>wNiK?|fT}W=*A3&H7@m+Rj6iF^A<1imDOA?q?<#t60 z!*WV!Jmu3|8?$IME7wu>&A%`+`&#j|U(UPFHcJbR?1W;P6S)<&iO+CEIL%&87T)6z zbgfqvGcMX|GG-GpTgm~&h)b9!*@D^xF`RcAYUmFy3~aYJ2Y98rVA?Ilyk@YGmYSiaPb+J`n%xI?^#A!wr%tw|#Ucd*22xI1|+lK%h zG%HFhh8bNWm=x>XJR&RKpK|zfkPOnl^SUYNb!P6NQOJWgTLC^vh!1A=hB~?SxtGZ| zp87NsaO*nO%9vk~lRq{hH(MAzO6RGKG5VZr@#3Iyq*3PpZU1WUajnr>quNJKr@?{M z{y;Sx`?M1awxOOzqdGh(Hupx`n+iN@RL9B;X|!4dQHPJyK|1qvLn<>vRx>jrBIBCe z*?Kt$VH(!L+<$56AUYK2sW*|k zYRbHR|7>^TJMC<;y*H)UYG851nn_{4$Q1IuFNA+Fvsjr{;BtH56SQV{N<7fcop1p; 
z-|@CN3ys?aZK%br?}K`S9@}RzEYW26J?ry^0pqcR9^?J9U?%6~FSE5%$a=rZ+-+RW zm)vpY3e{o<7;6+hCbRrNVB|y}eXw*O$gW^@7ya_c1h?(e@m0lQFfmV7OY|a!QVl_m zz%BcD+l4GuVCOA%HmY7l_1f1X6w;UL38+CC zxlvpvYLU;wGAHFtm}C z)uSO7BjP*8WkG8)O@=|-VikQ-YchhB%vgcRKQryW@CAa6Z#cdwdJ|DUM~ZPOKCiW( zO0!aJuYZD9Co>E?W`y(V`%ii@@sEmGWC&0xhF)wx`ny2=dTfn6Gbo($SjldvM+RBp^v4KQ_9kqgnQB~uz->H#l%CcJ)dDgPrd2qIUOr=Lq;fZbGji^ z9SBjT-@fUEaQ#K$>4|G!r6hkkWS29SmQB&;4iSpspvC8hWCLpL*Ro|rdHyTA6jzIF zh%!>EKsq$Vxt(# zP-q?4b-D5_&&_Fi&PT&nB>k}4k5G=P8vJmJ=d1m2kNn%Jgk2>MLu;aL)7v z{0dmKH){AdMB@mR8M@luw4GTDd#E4m;Gc$4vp}|x3oSF$uEHH21vO&6l*fALTVzWa zTdsa+g*^gGKe^-HvB#0@U2C$KR0T zv7i%xlI-bt#>GUV)2^r4Y?rnVv|MYo#vN@<3^yVk*o0VpF(m85VlXa$EYp4fAJ7#U z)>Zh7Zos2JR%X#&f~h@;rP7h)*?L$)o}-a41j(4V$0|$ZE89J4*;*gBZ?)< z;fpC*BNhApeBQu4z1dS%)>?@3e26p8|w5ro4vgrKdPXJMwM@TG#` zQ>${K$sVg&<)g(ydQ}c4?x>Y`vf08Lol_Re1KRMYZvqGNGfr!!vtJQXOFgLLXcTtq zmSlCiyCKe~TM?B9_mu2pD7wpiss)vLquSGWSk+X?0F(aGw7KEO`OzWOPqdaShHUL> zg2o=(5OxPz?V_L~qUot)d76?@178o2?0KCoBC&aFpVWPm*nC-4d|8dfg4g@wZ zX|-9ig~&~*bn4eO5k_?Thd-RDlK*n{<@tmJht}0U+aiPU1oQ^D%F!WDNunbWTy~kx z57#+mwAnW{Oy%{Q(;Z6*JG{uc9=rR*y{B1zB8^~}nX4TOfyH8w4zdjTx<9cptlK|3 zIv|i@z1e~a$ynK+aC|3*O?lOGw+(Md0MA0fG`TKtUdKEDVy;^4R8o66U)BO?CY?Jr z7+Rk>@w561)3{X7WDF&yt$lAL1f=C2@9%W zin{gh;85IMTO5i@p#&>ZoZ?WtScAK}6mN0Y;_mJg4G`R23I&Q=vH$5S-@Wg>cjlWt z6DE`7BsrYyz1H*m*4lraO7P|O``GqheKwuI?tA5qJ$nTshmR|bS6;DKr$?!*frj5r z&7nr2KY;C7(UJE-mqOLg*35x?$;f-~mgv05e`=?Mc~ArG%x^KwwK2e}0%hiyr3vAn zjR{+WI_9jN-tbfu4%$^-;mnGWr?YNEU<9Ui`f=IT5b7{#0lj2k_XTT3ke z1#mE3%ncF<6GTfRVX|fVT@dN}i{(FC{5LBDk*RN9xqREHK@EwV z14D!>4)+I9=)wG)hMdG=$*!=R2J@B&M^b)Lxq*QS|04?W%{1IyM})Nvzgm_l{EHTf1%&LyAGmHeH`xkAwE z`Qn?SiANBl1Es|VKij^nBex0gGmb>lX%bhn6c=+5X_`ts&W3)p>wxOReudd{^T z;gAt@$hgNm1-D{}-aMLG5e|hr;$p)Y7J5ah|KzlM^L5OIt>)?KfKT!oeJ0k&=`bni zL;AArK`6qkx}kTxTv2v_akttDnaSj@-Ql4~o~w;%)Qmuu4XI$0w_q1(5~ZBf_iB@S z@d7w3+Dxow(1aBXCod}%H-8VZ2Nj(gO2~&Uhsii>vX6obh)$)!on6`(une%0NRItBZg}O!CFh^F8a=$B}n?5=emw}*>u za2><$eTHx5c8(vfnKaDQ_>tCJ_J&eJV<^Dv9-`)B2D_ypQV1ymcAmok$!7a&E}x42 zJ)U7F3Z|8~i;m=M3xx6xiyBNp33irK@^=tA!kn*K-%=ZiE!MwhQ7?1l;t}FW$LE$5 zKiL~)V)&@%7A3F^o0zj{Cqe-wtV99P&xX|_!BOLe+c-i|@7bH$=`^Ar8S%r;HFd=ln^Q=-!DlhqN zee%4rcYjzFR#d+D;cWPU5SvHtggc{?>qHC!vRnc}C?&N!Uh-~uV#vE|ZhhW_H?w|5 zJgts*$g_600Uo{VXm?!%Q72saMCnB(r`V-qd*UDlcKqIui}miF=&XP;yK4nJ8#JX! 
zVV8#}&2lQXCA#vnSx6^GnlcQ&-UV1b@r0x##a=S$StYTJvR9a59;_iuf5M`%sRBnM zGP;ME|EAER5<`gM7P}vg>j+|c!K!PKYq*a;hXsOhQj4WHzwq9A_pmT=5oooXvsk6o z^?m2!X~O1rOEmHM?ntJACCF@VhtD5d)ngSc&8{@-;lzvqiehK4J5PN|5#+Z-I>*@) z7)WY%QqDb>Qk6eYk4|Jk49YNG(}g=!C7l~;4OrF%MWgdv%(R;!G3RNYW#o-%K?2Ba zdW7Jey&e|coEv1Tk__P8UWEeDAd#9dj5-H$)7pzr>}(4FP&Hx6Y>6`E2I^FAfn^%9 zCzJU(!h>I--k9qGJ9!(u+Re!3Xu+2a;us&<5TbihGw6JvsWlfaC4Y}}$530sr4RsT zTi;+l`}$A4AwE}~)Y7*Z3}}B#7foC#_ZYw8{?}$ z(nY;%$1_&Up@2}CZH@THcKg-xzDgyYsq;7%7do!TT0|r-%=dahgj*IM1x)2Sd{Kod z5H-(sOYFLGGe5o>gKIzPp$Sdgh(RVbsvgRC99s|4OflPzNON?#Fgs|M(2TAG>=xf{j6 zEg{t*7wTT!t*zc$zaW+(6jxn*JA3cShx*DD9+x{{j-3CsO; z8A+t~a__X}jd^zYUgex`wX4GET&oTJ;)d~BJ{C=&{p0tFg>Kq@IC~rrSr|?s4K}*3 zPHmxjHX@Hc#b*MJZ=)D}k)rij@x_`IB8nO6v`nxVo2B+HKoj-dv|A^EJ7F+Z&ajeyzCn zv|ewX3xz8zHY;Vc@YlB=?LN?A`aNwrYVT3cTAhc&3R4lRK%Ae5pPz(&2#2&)Al;li z?$#9MMisXYtnM}_1rUmAE`2x)06Q3+pKju-c?oRx+qSD15`wGGkp-4|sY-(hclv~$ zCPX{2U1=A(34E~AK@v$1c;6IBTuoEa`dGuD(H zlK6&yX~N)jbc?mcI%pngUc0iG{$5|dZ9bo4@1{@~D!BIPxIFOgXu8OIW}_@Wfk_d4AIHbvZET`;2*~Cv%7g9K|Ef!GF%6A;~+^%rxLlinUbZ>})+svtk#JHCjn7kuYWM5(l~Z+gIe z;)(i4Y1d}e+dtww{LO={JxgT#Qy29%p@#{N69y&-bp41U_=v>l8A^jCjsXz@72OEt zsNis{=_!Qil(1CMQv0e)BBd)^E*yc#ux^o={c*m(uI)BnhQDrMQG4BGNlsK-ofXGd z=l?ebe9g-g@tY=vQGr02PdQ7;3DqmRsjxq7k~zET%&|8U$m@i$Y9@5JAz#~JFRu!pRC0v$0%}Z$^EWw7iUDFxm`n9lEI~TxT>@8f&Wi8Uyvb z7FPOqaxZnifF(Ls^A+S?o`}oeK5LBkUzRuG@da+ zaYW!h0WG}i!$wj0ZX_Jfe{@u_xT3I+&qEi0TH*J825F??S ziulJ8Pr+>v!|C0mOv=5m-4OVGbhvqf_{>?4;0yAWPdgJL#Cyiwk)k}>VCzIVE$eM& z-9Q(@=mbgpL0V_%>{0fP#ffdC6FC~%Z0c^}dtors?2pAccluI%mF;7h38rfy(Neb{ zyhVy(#K6XUh~wtXCT6Fl&%;8&*8OJY#JsoGfsU}rmQ=_akxg>{D^#%TNHSz4h2Gq8 z;Q=A}6iw{eA7Qr(&X3J?A*r1S!QF^GX(xpGm1Twd@g`#cZ#OXPou~@+HF{^W)Splhs7=Ks3W}00Zu`d}=Asa9G2-V_7*4f9gvm!YCxP&qYEMHFORULa<^?`DC4EWZ9ni z&1W{;`$hQ)3&w8l3Jg9(nUer#zZ}Gnth^cQdbAHG2)BKR*xiSGL9c#ys zFiO_m_jot38E_-`G8^yLSqvYvtt(gYuowlT+nBB=UNio@Z9?*qdsCz+(+TlMuyeaV zK5E_cVFNL~n0PJRq`Y()<&I;4Gxl$xQREBQM-ANad3%g~-)m zzD(@?gs5Fz=Hv5iFUX=Usi;zxIEO2TYXcvFGrfnfLNct0$inlJkz!GbS+B8RJxB|$ z4-qRFeWpTds?UO@{1-+=quyvh3C?SJ*|bZW(%~mFSJ%tA%iOD&b&8+^+s=nnc$PT3 z@C1kz!WE_@Jrq&(Ti~%(PoDCR1n!&&!}I>;oq$wFF1_F`Lz+EL4)V__0&&^JvR`SG zs^!Ic^Qyd;Q*IA79kDQTgW$|ZcAL^qz0^2|4ck1E9~y~$<+h~YQ~qo<5!5L8jddE^ zbx|HleeSe9ME8edC+GkLeNQju!bv09h<7;4tQN`z+uSu*gUE?M$tc~_eJ_kviy9ta zo)*B)xWA=5fGNvxt_}odzg)gsZWB?CpKoJ(qzs1pxCZYT%7N^y!E1@pzE9Fk|9br* zwy2gU@VxdU`-;1SxWKO51z*(UcD-ah*GqM&a{120NDN(yV?gjg^D~C!{~E!#$d|a< zB($2uWx$#KpX2oZ_r>)p1kzt-fBavEZYKe9k;zCdrPf3;g#SH^gMrf$f@|VZ^S>V> zL*PJ0u2<|glR1x`j|MWoWObUVAm-LF){Fx8Cc`{GHo_ya| zy%?wa{HP|$p3I#7$5*WqLhg+@^b_auy4)unc{V6Cs>l|#!#5jU`4Y!i$-sE@{B$q+ ze!6v6RxZ0W#xkk9dv9pzU1##^MTRpVPn1QvXv%HcPAF=ukWpLv$o}fBT!JhCyDq^b z0pZYmTHLE*3D%?&&O;IR4GR|4`urU)0XKBr7L%g>&H3Uw3o8X1-RQiU{A87>V2u*( zZ`cPDq9>^yKYePxaept7w^ExcymJyzExf$4x*U?idq#N9ySfiqQVVn^EHP=ZJuf>p zmrhQ$)i0Dyy)C#=qE^hgtr=CzHsE=M;%nCGoB5Qme=Ik3BaX0_Lhn;|f#cJ0APjU* z!(%t7mwKxF=s}0t+F&u+%g*aH=Xqf_{*!!vt)r{pNK!N_uC>)ImCdZ|#beLJPB<$P z<1n5^!L`jG&H#@6W-s5tOJ1h+MS`6yMFeXbw^{?wDm%#M;#~0i&j0<15Ow+e~|M@d(|MfQ~4}tGi z2YX&}bX;jV%QN9ZqeC%MmF1JMbEspWh4;`DY+#9>Qb?N$m-GFaT>De@rN{PwPkbnb zo#HH{odfE;jT~fo3QKZ6_{B1kEsC%Ic;yybi-LI@Sg3R&0qLZWegr?RL`CAzzP&nF zPO1E8sEAH{Q}E;yw3XBG=hxw6y=9NNkk21(%DV$}xvU3D-A?DW9TDFnHgP7ME{9jg z!h-t2C4vqnQ-j$vd2Mdbc`SrpDnJr5xfnT$y2OIY^VI#-n7srrf#`4oQrEAJUwxj- zYyaf)cy+3t`$1gD#TM;Ygi}8lo?cc=9uB`pG^Iw=@iyuKIB_vM7qeN0I_`lWvq!Jn z({DRtY3ziQB3#N#_E+4zD`JSK$(bl|6s|G$G?F6q&7<)e4>rj~6YpyP+t6N2*gP0nEXn z91hFFmGW&J&Tm}u$-b$Olw|*RcV0en86lP%8 zor?8+w85OMFpL1M72AQk3;1YUL`_sLvj09C@FE+SVYEN7xUDVSU@yn!BSHIp{cgjJ 
z^0vB`s$>5lp%s;Xqv6~;>{^i*@e=ri-#l-5xp2MJrb=T-UaD7xs8RY+Wy%JS@ia;_ zgI^Ghu&lcX{=PkPIgw>D%g6l`{$y!TQATt-wK8lVoqRsnXGvTNXIBb7C&dt6s zd~>QR=`W|PoZAsPPv$?3OEroF=}pDaU=3dSzgDdB+9mJd;E-(wsd-5N6NT9LAX*?# zmaOO*ro80`6N@69?|DgDk86oi{~o6sCOw0sC>3wM%Oo`tjCefT-7_X~e)o9-kghzx z-#me+jua?XPwW1zNwG?LWVX!|vZrIGNxeMTlq|BG=wzk?m3--Xev&~74 zIA@SvFB!kxN-hoT#|WYD{=*$4f!hpzsqH2u;pD_`Uw(8h5jCx9w+vYo$Rz)yYZOhB z!2a8CFX7Ay$s6oLQ$!_!E%AZP*oQvFiicY1{<$1xQeL|>7bNjR`btb={nq^yTPm7N z2Pv`|SL6@AF+$*v99r=`Cl=KK1?CxGbM%r}1Cc`c)ZLMvMThgYQ-*#bMzqpDxFs_f zzM;~gq7lcCpOVl?duLLCAGCaGlW1k@0zsEA#e18n9)|{(Sa;5dS+tl5*qC2Or|@md ze_m~jn`yGz_U)p$^=T-Q?Qvb%V+^D_ZSq`I!Q~9(>t*dm6LmXTE83sTPb-NplI>n9 zMBEcFpn;0N{Dib2YSNF2cpL!_lT0dS-o@X>ikAiP3Tlc`tE~OidyIDElV69Y_Lo$c z!>1FY*3T|{s$xdZ#OaNxS!C>`Grv|^c?X{sb&936{v&*fs3=1Jj@%EppxH=n8HXfh z@bV1*Yq2v>!nmEl?8bFjX--hbtuqf0@LgC{B1@3zW>~B#f{G#@-owEMCrnm&TSN z+|SnVE;>-KsUnQN$BCiVA)vZ)W=lQ-TD?)lfHRNI_{W2Ng8OR%)-3rgBOzob;mct| z*&+BK0qakSWC{l?wP=*~0@g!sV;c+Kr>FcCN)}X!EqV1?b*jX~qd%zeaw*aSvDEVCbX*F{TZ4Y`=i;22 zMm{nm!8yxKCuo!#f2M7gP9io1|^Nu zG(IXhL+D+EkXwZ$(DAHb`9UlHvt1f*1Ah8c;`i7^L^iweJ%UK21afa$@iwg4@6sPj ztDi>g4!k;VK?19PF${D9KbQ=44Dm(%U*`a>lm88WdVUYkf_f}(OX2x3S7)YuSmn>b ztp^#*uJQJD&pIkuF7`q;5?Oc&PisWHDD;7*Tp^q6(=`ev>O*?%MF=kEiG=6|Bd?RI z8wVTWbFe5Tuh?>TetwB(Hn+7>|*+Ix*h*T+IstU6Ic<=F-A3doIT z<*n6ZDLkdgh2Y0`xFJZ!sG%>V8@{n@{0xdD-gu4|mf^uA60oJRd0~Q1bdwlK1oKN7 zFjmm2pk#S6BN4t7b=>)~()dHY%@(Z{XcLESRAx0)t!ZM=@|8pS%Yb>r&55XyXzjSk z^*)4cvf6xXbQp>9xfUlOiBG{}eT-%s<)Y=|VtRzVda*sfT)!Zg(wQ-sa>O}h!2dr7 z04xEib{&( zP4j&Naox~gXZob;nfdNvEXWy#qLJNVzV~9 zdj#q9owT65W2_cxdd^Ebaoo!-h|_NT)C{fjp+`bS`U zYQL@31+LrYOqbql`glZf*HQacE4%qUMp`LFc>x?kXOdqSa+ekYdmMguIrb7JW3fgX ziP^|Q|IILSCf4%_7clKU`(hXr5?SW%n~^E55=X=Ra|rxBKBh2-<;sCt@((ubLVu4% zYPr>SKW8W{X-^T#l>A2#H>p#Js~7k! zJ+VR$J^7G(K?rFzm7;YOFk*w+k&sec0gfeYIfN z4b(IJUD{B4{i_L&V__W9e(|Gv}wP+*BYLXi}H7uw}UkK|yvtwT(zU~dUH zk@3~;pU;P)&u?G#ajFx-iDWJX2}!o5v^5Ntfc?mhljhmSvuh=VG3{|aFzGfUP>MJX zaCx2G2v4O>CyQ)l7lLC0DZod+oMp-KHd>rI^GXvvr^YEoLz$laQV)L3iWYabRRoQR z`&rg{&6m`>ZxY4s36HszyPWLVz7O+#yZa4XE?SKRz4JzxEnCe7AB)7>u6N$3YuG+~ z?ur69&GohIMzfHFEafN1Z&O8;rh0|?8^BUuqPuJ09PIZX{2gbiF|;0Rc(<~I&*bYp z8{&TCk)y+yA_5WNv*U&`3-xcNxOL9Swe_x^nkKJhb2f1gM(fB?`v^XjVAWQ4-EXl! 
zSpAGlVJP1pL?iCpUt~7a4iU9GMl16Tt^deQATqELE1K~=?gu!}L<;;#Iv1%zuWb;a z=;Egh9j9YKgrCCWz3*?_`G$*6EMKdl*@|5tSNUEyPyE}l)WY)Y>6k@3;bp1DPwhr( z`)e){GhwdAbq(+&0geT2Dr290%6q`;t4k5n2fo?kc?9oEr*BaF&UBLa0`$BJ6?COe zh<}OEN4S0mav{`#Bc0Kj&bEf1^M-y!WEx(hK-Nwdb$HVyp z<*rB9(FQicyV+15Zq2`0S9^=D%0XNv>e5&0&gT&B!U~zY^ZEqEql;9-523s+B}PyD zfngE5t*Q?TyYw8E=zj0PCrr2dzZJ%it-}pYIK75FFq!VqN|kIUI_gbdWAeSw26*j zQr(?EA!r{6KKPJ6RVd$k(W@VSFh|u-=piZ`oeujJ$KaLT{oN}FUnE<|q#c|y2DoI`-)}F#>iUC`y6M32trn&%r{Oy_}8>@)bv)*+4XA`=Tazi(uZ27%g(37tRK_Q+daTDRXCc&W zZ=Zqz>Bjpm)aC;;{ayIhhe51u>+MzenrnU=-gmcztg+LDvrHq|O^HyHqoVM&%AFhV zv1Yc~$CIq-5Jupl;qy_+!`G;qii9;D3|uPt53xtxeoa&vCZ zti7dBF*#w4_3MwT@8fNcw#NZFbHM+@aH8+J>NG2%&0|?RZM-+?W|&xMn&yj6Pelc%Fqw!WuFdjYYJy(Hz&#@JEH#Ly z-r>TM#eiGjiX4`}Z%y_Cs*17LUH0F(OnPs(hvrbujT?61$SSw~9vv{xI z2l|x3j&-VhK0~a0KxmoTvC19_MP_(|<6b3ZKDaYoSe{E%!IE;ee)RU8q1~StCGNg2 z4n6CMtOB2)_9j>l-{mwVgZVNpUt&0i2L&GKkWCZU3CMaUMPXVUr>&Z|%n?H}jW$#U za^RexuW+Ps`Tc$~pr!MfQs%y$S+2Y1b*k&M-%LYIp4B0=sGX3{igy9JeI3VOV`5B!PCTrX-WZkp=E8Fl;EyHs-F-gz zBYd|`B}edvp)*+ma zIrsm?V#d%(MbIdiM3IPZ+en{A;)}dgES(9re9BX1MreZTU2O7-(XET%rxQi@UqksI zix>@$HBoK1VT7=a=^2)DhrIoKnme7qqo;=br^WrOwVu+M$9%?QI1&Nh+jP98fzV~k z9|?sYa10WJR0VQ`s?>0TMYs5n4zcFYU@c4eIfUJC7(+SqWNr`zu)h7R9}3d*WMezSm4e+XOlkqfz0Dq2iD$ffiDzKMyG#5(c84zK2q zrR)d>r(b?MWdRUv-itkGY7!C0A;-N+(Rf3#%BJ;RZd580fgv)+XQjp3qQYS{(cG#| zDk+~o{DsQkEGE8<9Hg)yP?YF^?NCn3`LVPf%>5<1C3PS*mT>hGlYM?Rkz$xCfc{wl z{8sJ!hW*89E+d&UMe7zP{^mk5XKXd`A9w*z*Jb&YYyKHv{%rCYs1WJ=b6(mH$=yGb zP5wiJ9Z-YRjrdYB^wW9Fv+vgQUb_d6O8*z!R~YC6%7fV%VzH$2;=Mfy)SocrJ;Aw} zoSxQRBYhsnAm}S1?JJb4jzH6LV-|J6GH+=hT_(bDc|n$}%psb7i%YLWp-eB3y#VTY zTg-jdFHzQAQDL*Bk*nJfsoHEW&N#yoTUDgz)f$!itwPT-L) zz`==uP3@UQ($>Ie&@OOSe`mpr(mTTeHS$RRqHj^S$;~sQX%AsV)a698L&?c#4 zaM78&YPK-ww?2!VlRauTCXtD-#^bUnZ=$)TlQdbF7_hdbbW`LLw~g7k%ypDXZ@PvU`~HZ7%r32e)t`h z+DjnG1NZ+4R_GbdSmcvwbCuLeXw>M5rD`eocAa;~K1*H9xnK}U62SXRhp@61OBBE} z92;XYY5dVAvUDhLTLb)fe2QZN* zH*(nzre-w~MEw|AKjRva&rpiBTzvo89%(g`a_!B%nrts&6M}hog9a}5ISrdrMLwM= zb&-NN?C`H^^in`^0QEWU55}!d?BM&<3gb=;a_;P;*VJO76sd|g9$-vLihfx41Gc%@ zLcO?Y4hP@7iq(L^@WJv);pD-E{JUxu376$-4JNMGt5x9}sV~CrmPr(qx6>+&o|+!X zwIG#vq=?olxv9WA!!jcerS!#+n5+~T;&Hnbk@nF==5VTH5pN4Y$k#qQ79!vOMs=4EwtDn5+Qn9Zf&Gwu7 zUR3u0H$n7Ab$2sd1O&YVWY7a>NI~t-Eg14fnTTwi-@5XiZ=@a4wvHOM@xiZXKHjW` zZ%rw#&o^B((~f{!o@bG8{7%&Esh#6-p#6=*R(rd!Ra*~r&gz;`(+TaEh4i%wp6_01sE2h#yR z%W?6h8YrNrUXd>-V1aT(o(^{4mpYG8_%X4kmRcvV_vml2%kKqOj@IxcCQPgl%6;rP z8-2HBm)z`IQO_K<`V%Cb-*22|bNp)}F+6ryvhZ#g%q6e-czRg%G1kZV_OfJjCP;bK z{zWM$-92isXdJZXNf?WtOd*y zFyoKBpZTkNmcjujDCg=IiZ^td-*Jna7PnKiy zoOTrY$e*o-@3|n?EL!YrWZmr^)ACt-Wo~LsR9Ejn65Tnp?8UEngfrWa?>xlz2g0d% z8*l{MH~(Ded9ys?Hkb~O!FL5ht(8#mV*o3NfE|uR0%@O7p)jsUmPQ;lw1tTK7{U*) zz&L7!)bDayJtXzzL3$QVvQ|?emeAX0%dRY3L=go1gr1y zwq}9Qz!`m|VY)e5RoPSbGuN%G)=_y3rLGps9Lv8Yh_bU(vCl)-4b0JKQ*wCUm(Aec zaz?zePiiIxK?8Ubi(zqf=>=JzAYAfoUs{ddy2GOAn)}NLw=0wDBE)Qt`D5Ha!3_u? 
zUJM9wJv|>}G^eG9S!9h%>5WrCqOIK10KyYx`iNUgOD-3HYsc2tEO6~*M2ZlI4>tFS3a0lY}R+e<2#NyZ_8wsO2B?vMxCt8}Cz(oTs&%g#+L%`tqVq zkc~sWMNs_AoG7jEgSiRqy)98EfB!o$*!>j^XSFx^go?o6zxj?<>T@&$c*mjjf44 ziBQz~rjPou<9WLWDMd}Y!tDAjdefAxHp`7|PjE_&M!t*@kmUiyXG(hFNEi|)=;?iJ zfJ5a7+1L9U>gqmcWf}JD(wAQudAU_u7!^pgc>P#7h&)mzE3`X<;zSMHaZxtO$VpR( zk^TaoEN@|qDWy8zLO-R0T+JkKK;77Aw^zuPe1zQz`y2!L@NGgp{u1b*3Xml+lq}`* z>`yPZlRD-V{6Y>E$VE8{8WXufM5b7aDRt-XiBc6N+vj43t4587~m8m;r}#`D2|ET>01o*Ti#p%4yXS zh%YUzvWd#Wf;1V3C1?@b9iuEP`8Wm2CDdepbqqRTr}?tJ`y@`RkcUSGXhCna!vgp0 z)FHTt_@60A#1LMLB3d@^+mMdVVJUlaig3SuI!(^X+#jq|IH?{@_RH6<37Rv7WUO|E zHK0KFq55Z7#)N^@O)7|LyA0M_(G8>2Du?N!R>xD&v{ij~_7o-)%nwT}5Zx3s&!A;( z-R+}mrjy$8byA*!ugVZGl&g#|a_Xb*Maa&o4TtLP;y*58i)>Z1!y*ynKqtQI1A{** z#Q~eJn0VV1?r!zX2h%z<1%i|h#<_jr=t|85)5!hZ1J1)+Hs#TzGzyxQD^Wq1POq11 zBzDm>&XML6N~U9qNTaGP%k;duDY`$sNXG}6;~H64fR^1B)?ez=_^VkiWV!w34KGwU zpg=@~BQR#l~3JY)OVN@!BAN!p~#i*){aSS4Vr7%p;ZSjJC1Lpy>tc~xG*C(Ko>!e8K#z)J!SvugTYq zUbVSqy<#l64K3lU--ngv#ukjWAJXo>jHa~yi)a!}ym!%dLbG}G+VJ+C%itTBpU2OF zibn>A>A`I2>wI{Y@$0ZN!}|K{hn}~e;qE1o;Faosz7u6?vu^QrP6;{tW<9r8LFoJ$ z0=PP#sF^h1CT|cf&DXzGKE{f#HWY&e!Qep0TcM&-Za7eUh6qN7(@Fm~ES8>vkYQox zawX1Xjg1*9!jtRZ4O5i#&LzfsHcN!gkJ_d)6cU~(T75;hpm@ilj!uvhcHren7fMAU zXF9y?3@z5VFwm#caxiOE=Ij^TsYm}VUH9lKFk1-Ci)mLRrQ0*936?JW$M0qoO!(ZE z+s6;GZodWl!5+ivs(aH`>p?`MC?x*)N6X|KYw-;z#@Z(vqfveN7HEP5mcw&&oEPdK ze5un<+J=Q9*9aO8?yb=UdKnEEZEkJt8$XEHZ*{S-j9QT&kl~dsj~bJU&yV-}Dny=+ zc$AJ+U4$mFw^p|#Mg~hfv(tYX`jJXoKCQM~z>LDzjS}fExHzz?^8rs%4lOA2c!vY1 zOB!f#(i+1m*XbCQO)hQhiSb5W&P>|`Q1In1WHl+@7^K*T2glu|FBl}p+`*3t_TR-9 zEA(P`x4ks(^DoHDTW?U^AlUaC zBmj58w?I2>G-wp-{R_bb&Q0@lcFxsCdQ`hMHljxY*gon|n1?t}*p(`&b8mbSx*<=G zF?U3AWNBTE(9(mo{{dOJhpe11?uHI&8ST-Z-=ERW$cZOc*p)Nse9}(f%hzGf_UEIi zi>n{V=l%3q?zYwDC8RJ^8fr<8 zB#?sN{!NYoJqua*h};q-yO`|A&Zu1}Bnq(k%3Mw!;32$qof#7ZSb8reMaW8 ze?mOEn?8_2Ic0f)-Z~*`5LS;xtzOW-2vH9wzT{S&X3ZaD&=oWrqjIh)t}cRT5uk1` zVRXF7-1BNHa&9nI|8w=5FgaVW&zM`{*Rd!>jy2~XJ3EkwsnU-~4W$8cA5sDH@YE>9 zb;M*q%Ki2P_gwDt)tv|r)3rPqz(qZDBIzBDB8~h9?lh%enzni(-_xR1jHS1!ZY?gZ zbpHz%y`3@Gs#eNl3(f_%U+fa#6+K-PTFlA57z0`pkCR$cnHi0nWKs!x*5ocl)XpqL zHyXBGhxJGl_b@I?1JGqu#$uYR4*<_3l{Q*~jZURdk=JXP-OmfDXQ?jRR38I=O>ZBp zY7!0;+mnQ^msoEla^!zyDhIkKreOaqbMxmwV*5^hw7_vJ?gwL-!+6g-fCa(1=D)-~ zyv(ImE-gQ=hs2Wm^7}R@54tr@7TdiW9OZ(#-768&fA<+1&p2~e&xF3?N)$}NHjyV>U&D?)Ehc{q}BvM&7KzgU?XX2Hf8OhDpYGbBO72 zV_!5M_MW*csGfq!e~(7Nvno*#Btfxs7%dE;RJsMiVTf;=+YDr$xE=`w(ib+~G|F)7 zvV4b!x8nnE>A(e=VH&m7eC zSp2R1?{}iM4nyQj)M0NY$QQ_8i3)g@|16`m_YU&wsY5J_9jGf8s6Py34$3SED;;E4 zDyRV)UmQZ%j7&_6u$q##A1jY0I{G9pgkha|wrz)@k2d%U@4*}7vqt3D{^2;4$W2R% z%ZY@tnaz&GPozNUVp2L#byZ5}JzG*aAx(AI#)82AJOXDBA~L21)K4I5`r9F8WEk%a zlM^j2KA2M4U&O<$>WsOT?{2T&pJN1%W!9-GM4bM{w5`Xo@LJgH6L_U!guGu9+fUL$ z@e=l6d%&g1NsK|Ta*CXtT?EIsr>rJT<56?%Ud7m4Nul(c)2zr8Y7mj@Gp8oo8#54SFi8% z>&2Fh!`uwL!u2Vi8V7B&YmzMHsruU{WA!M`LxY>dJAa}X1Qjyzk^}oSy>xxiF%}d5 zj30ImW7A76_{#&w1boR7t)mTpezTQ|0tLTzJEbS^*mHdD4M93pgxfd=342|^DyKW7 zXd{0d%iprIgMu+{$=;%Gs4d~xF##~`l7OG(V46V}N(|hJ32MO7J&fiw&KV2vivuh| z;6D0ah^(&+rs3Ur5V_#5)|IZeW&NxWSv&Un*43bLNf(=C+cw}&jLd6#7=rV7Ab`w! zd>hS@sq!yZd4s;+ztiV8J%#!z{eL*n|MJA20lS>;*`Yw~e$mvh5%)|7|Pt~t| zVfvX^?0YKPRVF8axKQXkQjuptuZUXmiVyizK!^v+SJq9@M?D z{hj2zKd44^Ah~UNU48{{FV!%nk$j901i$@@9S-8ZK65=_jKp%EZW0yY2>_0s<^Fa| zc1ezkmkGP|q%(;R_+nIr6(6ZUUL)+3oIwc&z_mTYCGp9D^k|4pLr0<6lP`(Uet%X! 
z9_!L`=g`*Q9$+ho1yek$hjA(0hyRkPmh0mbn=RR|^;@4P0YJrE-EF`GCdyFCYMyFX zsZR58kh-KMHq ziUTc>&PZN9;JqPf`S3$D$uOWv+V%Kt>WdwSq!IXV?w8EX??)SHYEjoDU4C+Cf@9fCa2ih3;>F+UySXewzejyvGqUEMX6U_1K7G-BnVz17g5Ic&4M)97 zmy%y^;BRd22IK4~jirPJX1huv_0~^}f&4otPd!ahR)?ycZ`niOK==u?iux=mUC|r4 z03u8v>gq9!zSL_70LEi1qU23c09H%qgBa9wnBHfe(ZwMd+K{J^szxN?W*^Eep;{h z+*K-7Eh}D>Zn~qQmwTn#>6pV=)b-#!3HWJ&&N-@5buYs>>J_v8eRnXgFp8yzeJeg4%LKuN|TW2Q(_C+zn_pkXu3Jv565 zl<3v(nm3DdBF}kkC1hHi4+L6W4lC)kV26zQpVji-Ih2@&IrMCKBSEv1x z^Zl`CxwPGn2-~qj0CCR}#dx?%HV)of2YR?hk0#rZ0DE3qnDs>>KxfFv=S~ulFG7t! zjl;UFA6n|V$Wo`~1*qQM%Jkk|j%qhJ7LiV%x81D)Xqo9joXws!ZG~MM2lgD4B1OM7I3eI6Rg5_kAx{Dn$M|<+qBBRg~h znnPi37{Wm`qre^JBy1b+f*Z!tvT%Y?TtWIugcA4KQ>W)gy!$DDk~43h@rTAh8PL*3 zqH3{@m;V+B7~Q!nk9!srGuuauHfM&uHfGK6*o=H;b^@$yX#i7?LOf$3^R_hJ8Jqps zTZ5dF-7PXApHy6GY(qW(I+iEtGN}f3|2SUuH@H zwyoaW!!L#B27si1M30~P>lq?D^`EovsiDff(LgTu{|I}_pgN*rco)qX2zX*`UmYmybS_{#e3m-!qOT-=1DL>hte!RV({7W>kaz~sd#7WJW}-IT+Sj_QkYU)B8u$|$s)%4&tjXMXf-AGlh)F#if zPUH~GzZS+vfBaG>=_wj>h;=(j;Z`>$nhU9LZqSb@vV_j?KCwHqs>yvqdmJE~OwAc^ z`BaQ~F^QOeIlR!`HoyqL+XA=;oJ`)mv9ur{GK11A48=M(Gg?d_h@87TB(!JtgwjrL z`VzS=c*=ZPlYHC`kfJ$!$@t)Htr1Sp* z8NQhN{EV}-Yifl`d%v=@RA7wv8&LLzq28vZtYs23+=OCML@UG(OW7O76hJ>&;V5}b z=P7ThAKQz*w~$R7I}-LY0fbuYYiVxa&It_+$F*J(6%L%LSIxAgXmM?@q2a4S(Ll5o z9_oZjrAP|)(*moCBEDmt2v1o4rs}{%m6a2d#JB5{=VFZ=PtcnWZC2`<=7IGDEb%uOg3x5?rnpBw``o-AJ1zAbCRy(sDOIrZ91tZU68aXxFt+>$e~sjF6*H*A0Xz zc`)YQ7DnY|yo&D;{nKS$T@aYdQEx>lIY;`xd1j0m>ST@TYiPuLKu22QO=nTfoQvuX z!&G%YN}{zwvK`)@t99scx(ASMFCuPUEzbL&y@kq#1Ji$v<-WpW5%<$S*ezfdh9Yly z%|-~E@1{9U7CxoDIfOmyR|Qg^3Q=`+)WQ_tm6m#l)qY>K^rFT5AE>)7-@agiIdOnG zUF@o8a~gDTwbSgFKlW96*?C&j%SN%;%^h!NA~T}AT2>Q!)YfJ1{e(BcOcCdae#r`7 zvhCED>jyZBc~PJmT?Fg(X>LjL5)AO%maHoNJ4Gi~COnEkZlgx0D?spj)%Un2DL3D$ zY3+tB4BJ4$7gO+8N})V1E^iIpkV#|49|}ve9wlB zc@&moi3;rxH!1HuPge>lV#YT(Nu;F^CunSF@G7yW3ump_Sn?r`1Jw9j+PF=9DG{GP ztn|AKRwi<>XW@nS2P`ziA*JHLgTTFZzP!NATDp_#O%Wt}o2$aZ=>OPI1jLr)5HYB8 z@tH+!^}tE~*+;yI_o)jx3xN?|kW(F{u>qZWrIc*siAwcv2kY8U>Q*dS&ZFCt%!A;Q z_gr^xeyZ{r=pJO=M2I-Nw1ZPy666 z4=hrxPbx_M{=i7h15`U_J_13hmyiHn|E84HD1Rp1eYV!YyAfs^$l!b{LS_MCz>E2A zAL>5Sht`IoZA5R|!|&s72&S9Dt4*D!9mO^E7joocBu5e*3cXSN46{u!g8z!I>jwCB zcVtN{@>K>vCGCeSDdtt(i$w+P;lfsk6`tHrq{90ilYg$S2LjY;&_UF4oF8SmJIpVH zGW}E55A$6MBf4v?0NV$AlV}x(K}HVM_feBs-Q;FNzEhvMI|T0Eo<<{5v`g!?=W3WZ zJfABSbS5v3j6P9j6S5kW0p%8 z_AHElZ>M%esGi^9tY=f%y#n9neR;Odhhk5{TaCUm_9uO^J$Y2#dg9lTfyLML9`3mH zaw%=%#Q+SMh&NNYq#P%!%}Ur^7~M_5>&e)WWI|u=eqWxZ^m?A(hyR^O0&A%4)z zQ)1Zt3X8~hTZC{Ganp7D7SLsD6}w1aPyVfGVsHxY_|-=86$*wb&nWCxuSi%y`%k=b zkY|Q~OF=9;4v?rr5i+ces_8wR^xu&>5>PN`#M8ltT-=MVfjVC%{UDQ!OEr#>NThFU z?oI`{b>2q$OP?$uB0n-QadT&a-Y5fkI`XzNZ?nsU`b!uDaBWuWx!$VAigyEAI9X6N z+fx)a4Y3Shp!u##%Bu;;2)`s-|A(#W!~VclI&sGLfDYi+goVacvU@eOumFY@AfC-U zSQ8oiqF3`4t@XZthQa_b1fO&B2o8rt$SLjA)W2)xm8({c{3S*RKp|BSZ-fsN=xOwV z?I(olLWEJ-p#Eb+o+b%xy7|RJyr`dI7cLaBG=_gg&GljQfSKhhLJBXv=>|~B2y%bm z!$CUYdKR_HhS{CCTOVPu46*u_H2C%utJjpetP?0cocrGsD<05$;#qxCV5mzx2h@97 zG4TG54EjtBqtK+JrfclU9e?sh$d&{0rdb$%i%K{J>vNsKQb9wdGm{#QtW`ZB3W6R{ zQ)-e8u{eTEphiQS7e9u*vkg(QT?JOP&$eq*6qBud;v5JfdeGgRN;A$j3Bh}Gp;pIs zf4rh(JEWRDq`vmA)WXb@BJDbWzX&B@9^r)&kdmLQU~7ewJmpRN(CRPHPE*i;+jRwq zIPy_D43gp5Mi9YiZ4fm^5_wBsADI+@!?=sFd3`BNOsTRH?yvsn5uK&&AJL zVMT{PpQwW*O(`jg#0o@X;+UZ(A=i75quPLs;- z{&3m_sx}n`TyycyG#f0AN~#(Gk|2*ipeXpLeXGpV)i%|s9C%JY2Ks-}PNWcV z81(OCfvQpW+|5vlE$aAKPHGXJ9wkFOrz&r+#`J#fUc<)Ec6B0U$Pm-k+^D_cABaUrhz}9|KVUlk2*#h){YhS3LWH&TG@$M zn~S;_fCobd=SgY<{g=~Tg5+b^R~EORE*;5Op7DE(Y`jaQpgK~UFDjKkz7+B`PBnWj z{qV1ez~g?Du#2oiv-8#F4{ZRTdR2XC0^EiMEYcuWBovi#%N%KfBn;qc@?=2RLZ+hM 
ze~P#cBY9fcgw;C=qPpmQ8Akbh!LMqM9A7L~xc6uYi;0m=kV!9+-zdCwWbNzODGy>9PG7ZqN z=$PL*7&Be?d!&veA%eA#_~4CA$8h2D3&URdWLl@u<~B|mAJpz=Q-jhg@Oop@DKk55eXoisp)vu}|RrNVsL zHqNs=9UiixzP?%NRH@AsXtG`4kepBPO+x3}g-G9mv=!KKv?9u!qjrQ+k znZubM{na&~CAaqxeodnmO*MRR&CMPa^aURPfs-nqzEBC2Rd6I{r@cd@{`w%DKL=j% zKWQa09pb-fWpdeXRB=k_;Nou-V)%2WaqJnB?=~wTj(0q!*Q89R&gYO7D$MxymqU2Ozm<~2xC09giDx{Xq{2> zCx^)I32W(zxam+6@d(5!RkP&y*;Js*elH&bCO%a2uu7wrvTN`vy64vcVq3gNYy|Ibp(wSc|^67!$c`;UDc-ff|&vT1U(;Bw;Qol<{__0 zp^IB%f2ZQNpvhJXXv?-D0@hAOexLX^vVcr<=amqeJ4Rcf69S(;IRHZH>T z-9@(mK1Lr3SU%b*uWM!*+aSNjx4Nt~9&u0K%4d|e(TgfzO~>U zuMfe-Je_aktGOU){|4kW4<=kS7QH=TZ187QHh)BQ;I=w#`UXv_&Z657-B{qIor2W4 z^a~Bs!(kB%xvG(dX&XOuZ2M9qDhn72gWel|S9sX<0h|^C=jp^GW8wfq!_vsj(;={lbn^EM5n^oVIQ_;+!U>G#uJRUmtv5DFA(Y*lH9%yEpFLopNV)xP%9cuj_bha-xs?Nt&+S;c7!^%W*~6(`uG8M#b%Yj-$@-|De%0{C(zRf z*9?BXRH0?|EzCT>kwrJ(m43)J8OB2=^F559#wA%S@N5pzej^@;^S9C}*rC$H)Lz=R zOS5Rid}TkJXOZhm7IrDC-Zyb(mP&cA*Z7uFsyJ_)x034F@RP$cqayKRyN}i7RTS~1 zW{r(q1eN#({k*o_Ow|tN>ERf^nEz<`&OSFNtTBWHdF%=n@CfY%Tg$MX^y)}A!i>Qd!#6zK)z%WffOG~Q7jz6B8YS94llm& zx%7;cr{L}^eMwZiW@;h2oXLi{#h&Me7O#q&+14p4HA0iR7cGgJHMs;$4piZ;%-zy? zrpmZyKJzT{3{Su^vONXJHPA6xL|@qyG#3%4011IAJ^ej486W6hS|G;6Mynd8YU3_0 zrE|%g>?Ns*nh9gW-)4({1jW6%scNCiw~s-8uTlv4DkWCx-YxtY_5o0LEOYC{w%cS7 z85s-}*l~!V;X0{Hhu53$Q<_9$Re(vls$X1b_0*CP#YvztiCetCox`jh?~3fQZ604_ zPfZqeEsSBTY#vXSaI{ouIXkG@RBd=+9Kjq3OOTt`S;{XTCStxrP%du3%w;$5JjyjB ztu28F!01^ryolTR*vO}Y@+Rr<_k+vACqACk*Q>V)ZK(9Mh^+1-I%nwyESNO-z(n-i zr?F~EH6_#h011}$ca8)?+lvx(GJJHdUj7ctKn>2T^bVMUr*9olrEoeGRM7F^bj+{`jtoS^O$E39EHx!iyLjD zDk>5w`me060H6j(ebseeejj=vsYofAf|AK$S%pH+!P-4$$w8&TXI#u%MDG-gik}YU zwnvLag-NJtmw}WfWKC0d)hvLq0mWx9Um%}M4PzOj(Lc6m51SxCQxJ6JahZAe$FkGk zxrN&ZD3&k>Yxf4ud6_O z{M1rMy7D<}xeE%5*NJR5$JXF24I;w1C16z;s1!P0rvUP>^1pj-V1)k;PD4-`SR@l8*lUa*)#^)Zz@>i!)Gb$ z2?JF?-;jM!)4bGg2ut;Swrnm&BTq>lc%N;My?Y`j?l`Ibw8K~Ha))vM(cdw~=RD+B zW|U!nhun`ZJ~s~g*aXv<4eN@?aGuZer>J5;qEClC>X@5@*LC^817AezR%YLAPk7AW z!+_yaF@YRs`@j@izW(+`2g3k2t}S zH!RBCt-mIVz%gRadH3%3np53mMF6*P1HAP9c&I^E^WWW9Ci>r*J-T#U6iE-nWaKA~ zebKz=ozwbp*XsX7cm3dopD(YG(Jlu0Mnpc({xAO?Mq?^Kq9%7n_+Z8hx&J#YMOft3 zXG<84i211W>XdvFIw?;(#E>wD_Pw# z)hT8S%;wGafMe?nb$)*4^3@XtxIY=p9u>4GP3(#dEqf0<0pP{xe|&Kz5up+<=E%%Q z^6FTl?VfvIj=8gic@4ATw2*VaF`UrGXus<6quS!L{HwRKCtT&d=wpNF^Fpxd_xFHZ zbANHF#wwk}f8|gb+O$*M1St9!S>M`n$sz$Ft-Pub!0ixM6=H?56~jsN+FAhD{L*<~ zf|m6AOE-MlSR>$F#ZS+T$*MZk_L=(GXZZGc4~Kv-$?$*O7(3sAfT5_l8Z^t2CS+-oR4PgzVrssNd)W%{L7&ATi(SkMj^C5Pst#J z(oHY>h(4}(7Sa14O(f1{@&~?FgGYc}?UvB)Y@Honv44n$68jf(!IrOpYq4t5Bsxbn z3YW!)Xd3B!7TJ~Q?;VEHupWpDhMDIYNYe!sXBt}3Va zo$79Pt9H1}W9eV_NmO#4s8=Fc%hd|6LddIuT&HnNM7#d~%3%;D`^RVsz@`o@6!jyS zNjNI8-PmlO?Ks_|v+4;$m0PDseM@I!Jz>PzbN-X|J!q-cy%Y)SHTK91fO|{VJ&b1E z{68_iYYO8>j~nSR6%N#Wg5xX;39rxVP510Cbr_wLDF7O542=p*<>Ws}cM1TJHyn{k z`itX{K_-0we4M3FSeEQWN3ca-S9`~XqJk1OUY zbTB{Z0SGm|#dgaRZbn9YU6lLaZ^pyb)|5akXor%##Jkq|7CYkMQ_lypMfT_BpY?wU z+08L)#Gww6h;PH`KCpjiWpl=Qr0kR=@jhI66G<+D!fM!@vCDc)a}EGs3p4%}Bx(7U zqviLr$DFJ{RLs;GS#QvKlmJUL_XON-wTTQ(V?5HqQHnQ~XR1dLa~ zDC0iDLv^-;iQ4wlG1tvhj?8(&JH5yKsuM=lHMFj7?8*N^9QLi6StR}*e}BnWC9|!M zCB?2qUccOw!>s+ziNHjU`3ro-#{=Jq4p`XOxX3bsZ)s^|aN(elC7|EJ)55{S$>etl zu0QTog}a^#?X5ig^~Lnc6cY2I=bZ=uy2^N1-_T_(I)8j0FcB|MXTj&$g(c|hLuB0U zCL(>K)~Q=gB$rfCp6i1?oB~;G@-tZ?c!FFVD7rYVTDl(jQlU(-zk?x$jtZ0&r^&)b zLV`ZgydvafU(-X%sx#*nXMG(%QnD}nC z{^{F=%2U_4^iSGz+wo)N3lx7sLs3!i2FcagMrLTj1G1PtB`t5%kIWH3iwQh$;$%XC6I> zvFXDrwa%F!zIgo~OH+ixwFymjyIK?sQJ$xwKinW8LpK)*<}%RzB1#2ZuD(I{H+IxQ zrs%|p1Uohhzlh>F%#7~DOif+8%OwEnu|ra&C}7j+>w@`Gp3~=|K+dkkCaFfKb2%Q= zV6yawTP#w;xXi~nLdZs*S-l>&z{T=8D06J~!0&DJ{pI=Y)3k@$xH$5gUOeuJ3p|wO 
zdAa=4`8HAaeJ$qDQ%qb-P%d@6-^(*iBmw)?aP;t&zR~Jg1PLE+*)o3+!0*`?$A1YW zS6eCC>EJX1bF|ET`Q632vcC>4@W`b{(H=JsKny^UG$)CF8b2+f`0jD?2Ub zzyFOriDU!!kD(#Hh&8D&W&UXD4>Y%V_q#__if=J~uPa2Z+I5S)RSm0hWK(&~N(AtD z;kpXsgaWoD&J1TgwW*jt$mq372Yn&+7dwGO>kbPOE6OvrzjmH*RFya>o-xX^z;F^TCro^das6gkTV z{{e$^-94!QH`b$@ILF-mP1@-W`K~0;+(&6Ktn)9b< ztj5}&vUJ}N`2c*B1&YJJ{2NbwZ1dI{@jRX}PX5;*pXcB2kLh|jZU_9al|oYuCYq@N zzLW`UaWSTB`AEZ4O|D(;VOx*m6EQ`UnYs+QYjBq|O22)tZKXL>zY(y1B|mpaOh`Yg z>OXHE_&XxZ%N#ll-z|abW&O70Uau$PZxZ{%sFsq>d33IZ0dwX1KaWjaEk2L}_+!M} z;q(dM-Euv|b~!E17p2|gvR5D5gFJY#ROx&(`XoI6J4>sp$I-lqK`q#v(>Qb;%juR* zs9n142fqFH7q!rt0!6C#eshSiybg;}XJuW{g$Y1BmLzJEXcpe_rB1z(zg^`@4+T1+ zMQCDW&vfX+&t>E0`uZt1azs`E`15_xrEci@ajNLt~$A4!TeRGCyec$-$mh1!KGrmj3WO zd8ba7K2um;kR4(@bzbw7kfvSC)5~*z?%Wu# zfgjJd%2?i*3Capo*J9!PwmbHP+4(J3|K%s5(j7QJtJI4vM)=&QL&AGFtp>f%OsoJ* zmi(<+2>Rt4UP*ssrotnf#AabUtu4&b*_5=xur3w`Qt~_Jrqa(_#yd>6SoKqs`LG z+j{i9O;_Nyx=C`$ICEyMF%D04yV~~_2&F`5(B|d!)9~kSKQ#9*?Lk)}L%o2;SX5vb zJ%Ou{oSgN%ABILkCHRcpS1$0-r040BrL;L2C0&}_O<;Gu-}$T`t$w*k|_E6 zK+Ni|BOsLaU)+uy%)9WkE4qH=HNhAr^vOmB#S~&pdxrni~o4r3umAq})Zf^ucv z>DVSa*d1H`{1Jb-XB!nq02~~_fy>t7C?U#m65>rv=e{8U5uNR`r#=deF}&-fe7r@M|4SILGnL8`_KNpWRYz0_lck@zRmr6z3}gi+Q9w7eBCUl zHl#dEma-eG6SJTh+y5QD*!TN)2m0}M?>g8^Ems@QeZ}>`h(+hQHP3JS7nEjdq`)1K zF9FvrnPi1w<=f&rs{0C4RXi{g6u@y@e8N#;S{bwrz5`OurgP;21nAsRU6p#FRd$Ns z1#KUdltVITfKV#z>}J6^L`wugtme6aV&MR$lf-$}(ahE!Aq)>2$2 zU7X%?1ou;6m%75;u&fg_I7x^tT*f)!f;I;~Rg0ZNVz2uCI!a~S=+ld*B{0v?NK;&e z(}!bbNR@~#Zk!{lJdstGYn>FoTZ-x)_Y=#*Y*%iLuh)s|qM}ZB1Tt3u@X~ccW z-4DTA7P;sQF-(qJY3d>AmV&)+EBO$%j8|Q0xc452O&!=sr043!^vo37`;$}#f8}Lc zZR+y(A+xk%NA%qybr1>A>8t|x7zpxH22$Hl)#tB@?TTJTPD5=u`)!5Q*RgXZtnbT<-m+e}{S;G+2hxdU$_!f%wcLuTAl|N!4 zuh!dZu#PTLo34qzygt{ve46ly`{7|gZz)=1?T|&lbd~=p;bE%tnOjonMwqKwxN&Z8 zsV*%PnqV=@_;oS=bNXq%to(A%zt{bQ=C0nJC;AMw_<6E0lQh15IHm4w0?T~VCmjm; zMfAJfLH*acE`ja*rxFwg@^D-(PXhalrc;rLHDSJzf$ns0!clYa#=q9436p-vcSulV>w7b|yZO_yLm9cz$Ql@GqJ+qG+!Bn^Y^&@Y^WklnrCq zT)G4X-M`=6?pqW$GTe%Cu`g>RsX~$T7^~xX>5Wg-l6WE~yXJsL#R%bka?smO^h~zZ z!|5xcr~SfA|7?GDWxNdkCTmu|791 z&-$Hf5A&*QQZ&xn!!}>c@5d==B*GF&);jp#*F*{l4aG!kW{U+DGml649o+p+SL+Ze zeiK95D=>a>%|M!+OE6E}yg|Mv1c)T`szzsnE<1?J=sLaMj}`(hy-sy9v3qt2PuZV) zznf%#W)IQD7_(jh8+vEmdhO{f2zmY`Y5i^8FGr&4W1ez%JfS@C*D(#0w;4k&Fk!kg zV7jCBA&QuD0pc{Kg71K2tpEkX>{uo`6xCO1tn6+ZbF zB}OsHKk){L4EcR#w*f#D zf5&)8(rRNC&oZyQCWQbN3KF>(i7&4*Dii9_W$Zb6JjjX!8x4p*9@ra4M4HS zGZ3e(?fM3Ik;M{>uYW#M2~5G@b9geA2?a^| zNlK|)pk1>kSG1XOTTCj&uuzeF!#}hD8ehpT;k1ITmQTm{|-#r03zt@32@O4jKYGVY6rK%Dg1fYx4sa_eWyd0BZl~&QB3={Z&mhs+fAcj z{*n;Wy%->RB&g2QEdf za$peOd3V|%=^4Kl*@fiL2sOMA5w~?|9Oo31+(OwZ>m8NqU~~-MX_EC|y-7fuCI66v zcD?3{#zy1P5XBj%1@ZPgb1s2=fVeQ0#W|n}lHT1Lo0HuxgFB<(-NN(lX^@}`MAR)? 
zq3UYYd+UmR#??>wsQW~OonD#7XB`LP-nHw*a#<9_1c$vLx>Tq0LP}=52#e87kw2Zi z1j-+tgmxZ;)Yo_g}T&+R`cHUx!h&7<7)COVMZ96 zWRPKUZMml`^=)Xd2p$yr%=PDne3HXC3Ok>gd0Le&m`T;>H*+nBcyJ5rjpm+iAAglA^|U1R1AM+P$I zk6OI*{*8*bC}_f+7z%Lr=DXDtpGMRN6Yl4PohGjlb=HmvvjKm+ag;c_wWx{wblMsq?X7_Kycr*Ak zO}N40Xz((|hao9RYYSyzl<4}gxo_h+g?K_KzeRKJ;xj*kkC7^d6=Re%>7&Jsa{+nw z*G8c+bn9%AeF_uaeA)(|#iKNH1z@n>D`;^5Y0)iAWT(&aPnt_8b>_fJOP%FgtRn;0N$v+FySpxp{ zQYXpt2#Uhg^FQ?!@SY5e3hHjCnlk*#_p3{9)`x^75KccT>I2pB8#OcpOwdH~^_q`% zs5ktJ`DPK)Fr*i`YYWYSG@ctoaF2*f&w#4~!)_iN^}aNLlDl#;$Sv6`_mn9g>B;@J z+_7{jio1|~q=uxT1WpA~krjOa$Cr;r!7ITUxEV}eFGAgZSQBPaEmV1-3838}c6U=ii1Z{8%vmhI zks!mqm{rl+cBgXno2FMq`#CILA9Bjzzdu|3{Lz|?iQ0TnEFKd?)lns%Gl)*n>&!Hx zyY5<|n$?juSNR91h-r>(>hr4yhdPh#g@PtZOm=IUH@c#S zbs6NnFD-Q{j4DnHI(9d3$#_2D0#+oJeC4$7&yc5b-IVi1(xYggBb@Y;kY_qWfj*PV zaD0UASo8UzdUphkF^QIV!% ze5e-E+{0+sZ-px+@dLzO-(FVh)@QTnJaj*NP41rdu8BT@Wl8%mi9=nO$%wzjjyM1o z5b4WuL{@;h6YOCrAa5TetkY+$8?gTqzW z*jggRYN&hJN!DmJl&Uj|J)pXO7>oNDaSy0|J&u!Sv&_8D<(ejTb2>3zN%u$RHPRJXV^#GHaIjjHw}BZ|H3iD`O;j`tgd{3Yqn}8fbwTkOG}#>I;{JJmW+~ z$5fO-1F2j{IKuJ)P%BM{CKHi+z#K2wd>cGdRPEH}yoJfzbg0_)@Gb|RwIo^M?oav; z-JD{rgVuO%U4(nkt&kOAI}ROtN=;s@r{4n)6?yNDFAnWjMb3}XNjD1pRtoQb(qdh* zm{uQM(!kY-3__`u@Xqbe7E^5hqQ9@*`kR ziM5X7z&Q-;Yh{`84lO^#7zV_UixA_n#7D7xXgoBNJ;(^3;Rf+{w%nmwBrvGw%D%1r z#up?eo@;U-UaU!lu@9(P%VtBz}zjyt78T1TXGkgJ#($IlL+CJxrQd`;ta`oERVq z)-1e5kA-C>n=P=v#yTxoU{KkIZ9Qav^6$Gip)>ircFhVTbEnSoUGF)D>T_YteI;Oz z-i9B=>z^OhKgc#OTuT!D)~;O+f_4Rghgja;{!9~aK}ef!K5yoEsa)Ce?YvxbJ0HLd zqbNN!+r*f0`vA&fS>ZEV`|ezZHDUwu%&PX?47Hr*E3=f?o&DIlI!GW|q*-h8;~SN$ z1VvY)UYjiP;Yyv#S0n&&@$E2-swI$~^tqrF+&>|_!0L zx`KDx%!-{aM*FYGTAp~QX2lR24mgM~W zbH+-E@Ns!^tNFBrwz#&ik|4qOOY(}BT3_9VY@|SNI2@>rk%6K94n>q1lcoOL!!jqD zaL_B^M`sJ6S{!YJqyJ@F%PphB3iv$*7e*w>vN;v>DF?$cqcjq1MZ>Ayx5d0jpck*L z%KU3(9VjjS2%_+YPmInm`slBzrp7``Q09e|nzLIN{<* zDA*O0GtE&rj|#u|Oz3iwn6g^Q^NFn;qs)Yyx}P1)@uX&WT3tPfX0akGJZ? 
z97SQq7g(n&_eqhT=>x#ap%-{&8`L_b^nNEdClA!Vn&;GrNKbqO>|8<8zZ`uHamQ;w zV7N56$d7R3n9TYFzseN$im)%GQ@*?j@58;)E}6!aZ+gBG`Nud8n$;{srht}rVAF(#;{!)Pf=8LC(7i@iou z>SE8v6>RkxU8qYNReTHCPZ2C>d26_G@z_^nhJm)oLk z{xN0c`G$)iVkhc~^(>%|>Xs1aSuiCIC1Pa_+Q&=cODHTQD~@8t6fR^QyAS%$d%Gc& zvv~yPd*@UX8O^ix#wVUz?)=rT%l*pCR3w-eg&pRc`|}i@@{FzIMADtGbUJHVwR&u^ zuG+uxae#cB__#Z)NLo75V|tnvcd-6(-p*)>1l`~uVk;MrSUmJ|EWxI$b&OoNyTCOk zloCd#piKAxpN?qh@w|{Vb~J*W^OxzuWVE2$Z&nwI@xNeh889ouDWBt7qAnn5UWCk8 zsS(dgh%DOHBpO%v>wnaAiVLrDh5H5FblxnD6fTy$2>1;drB(PLL1VoD*#P67G0{ON z8C;#78#JVc9qa@eF)=~uUcJiEHzbqkbRBg+qtAoKzws#gXpOx0Xt`!j2+DYb-d+6D z8;y+w$~zn2-rg)Tyj4p1B9mErpg}2*(cUS+0^Z{L1C#66QE*katX}*0VokM9(PS6mH`l&&R^%Hg~Rt)S) z8c12E|4M{YG88Bj494(v1ff1!o{fg&@cZWafejJKcY@>|U7YrzA7#O=Fdc}@0tpd4 z$ejZ=B#;kX4ZR9yu+Rujf?tdUH-k)tFj^i%ol8)8_i-vS5UVyBMG=Y+L2%#}7(*`z zYLFU2<~v4z{>ndcEzH3)qUjFj#!esmgWjB>LuHqL#T27~oPh(8!8uFtAgFar{=1yd zI_28+oAPLD;zHDpSL}V}XGfClm1DPbCauv@?BK#p<@WoQ$e12}T(-lE8N5fdH67t^ zjzJ-Hg8YGa``}aa6-qg*a?rO<=JSubJdk%XRqNqmIyqiEqdnpfLPr|E!!N<8l|9hN zky=gQ_x#qmWeKoikJXS`xeXyO>I)H8rOZpX*qg_9Kk(hdPYVnQrzP>T{nP`l%XFTD zH5)YzyXi5b*O*A#y>x59i7@%-a?B*!|qw-#Cn>DkL0Q|DsYrc(XBVOa@_Bdh}NnoO8Z{rrOgYJqOgt+~d zc~xWks<>q52lAi;NN8znc@Mi9wH3)5=}+{HTMx!_7@@g3`L>`1&zZI>+ZrCh-M0(y zZTKC2zlxnnbcD43tHCTqA9hwQDO^KPjJ{{5Vw=1O@cDv?*EJCm^V_ zBtJqnnd6C4eLhz92F46UH*lSlDCKEdwCSGyptwM1*YNtt5h`i@7Ol3`m$B7+0J9Sg zB^(FZQ)oxFZy||2SHQC&p4;!;#*g8Z2Ov$&mMD}0%sE)E3 z=WA``j-=;_zX%pT+xa;z=$36@J5#7bvdTSNa8KvVs2$7;(vItoOdKu{Epu3px4F8- zJJ*Cr6~vxlY0_u+q<@5|_$@I;E&Fcod?Q#na~@eP!)(f*F#~ERfQUKLSeU=s;KWWU z_8qvWlC^$m4KaO+TLp=mkZ<{R@*Wo&$HYG9CNg$_X2NQ9A@=bfd z*6#aW=2UXAN#xA1=nT7uZRYp`M-f2BEOzNAcOIdC(UZI;3f(>&8q!LA)aDcZSlW(t z=$EA%Mz54z`pf`GnNJzILce;3+k4|DcRDXb1h%B<;{c)Ti4mYgwm%B&*F}zL`6Lxv z9KlxynqQyV?LQ_Z2@}Px8}(#!@e^)&QzhdHUK`dXX?Do1yDNk{x<1G@aPS!&FHN(y zqmb~mqduHl#P__J6>`6R%M4pqVkmf)e4EWy6vO^O*B#>oZ+D2_Mv5$XGlTr(tG|U@ zQQTq2{8=GYOWDqbXbu2Ve|YK$3`a&hMpODXINZ0ef4igI{1f}{(|wl^w{WCm? z4^})n#8L!(lMHs@eY7E2_s@A&6S4vTrwN-Nff}unAm>Fh{5>Te246x6ehOv|MgyyV zYG2gDd%OQh-^?V%(V~ZHj{j&7{-fr?q66R?E*-?}Idt01SyiU#iUjC8`%m9}MS8oL&*rGHd!;Oy*JJTAr;D$V zh^BlnxKTsZo!LO3Mz>r~t_tY63h~Q+`#chROV8tp#S8k$iV2?WyuaVe4DiF(J_zf6 zBW7Npb@lXz!ml4Q2Fby)IV?=>k09p%b*kHe^N3AzbwH2E=y{{_+Qx=SqcTou(j>q> z0?`9?&#?d~;%NdUa~q&&NG?RM)z6mWR!pom|9#9Nc)f5z>%H&pm8C)5h3kDrLX1blk7y}*(Mb^}I>J1=?q(E4}01zq=SnN?*PB*`^ngwy6 zTv#p_sXb$803i>B-b!CG4s+k+vK2AMV=$wWz5Dj5RqyA`apwDfl|QV}Jz8IOlEiE- zhqmTh;Z>u^1yp|_qI;IbS*~_7|-FKlu!|Jz%`>R?)t6GnbI|gq= zEbcj^vbpZdc_&MBv(%CziGtR;qmWS3d8*@VF zvf2GeZj#zZz?LsPPNmO}yzZHOZis$jv{z_+zsP+?e>UHHywFTgYoiyECwpQ3{ZSwk zXzz_bZT#5hSBlLn_wkC+$WF;nUHr^!vzkNl4+S`v(_-SgIxcbpIZHX8QfYr{>sR@! 
zjH$iF`&uYd%qRS4*@x!v`O;CRHwVyEZ}Qp^^YJpnJe%8J(jKw_;?$=`Vmhxien7T_ zW9xDP1z))hONq-ih_;3R?!14=%;Rlgkj~|i^t&h9Dq3!8eC;FLAaKRd`&$e-tv@Bd z5xO$fpH^i;hBj&O`<8&g!~r>R^@jM@-^c!!qnl6VJfPIK)BM|E{a*}>rI3}6J zNb=Vkki9aq2E%u&S1cO!M;FEV7wZ8Q#Z&+9x%dPkc5CdbDaqS zKaAT#)5yhhOIkGiMzgNtCS81Oo%sRMGmTBoqh*?1XBM?GLd@0EI7zX5enA1)LOQaR ziHSd4SLf~RFZ?v=oR4>R|u9X?T=yWJ(1?om+x2J z><%Q0Qe}#}hkRGkQ({zpp93`TV6VF!QCv6=7J)S!nU#boY$Ayi zVoeUM)ldCIKJ+_UX@HKDrSgCE09wU-*p9i>0MJ!YBC(oATffAjBf0j;0QDl}K?P^p zCvsMEYdxeSF>&9x$bsC0TCjfh?9oh1YmRFKF>c!{j zE2(Y#smKqUAR67W#4W}Xhkve^+>aDny5kKV=C;E*-M&dY%%X33X(X3^*p-8xAV_*h z_|PRt6|@z-zecKYALA5)z=$>oqTPcoH74=|=^=-q%{G_AxTTdOR#PoFfrUc9JB<%C z1s{ySVvyevE{fbuNq)mAF^)M05Dloe-lo>{50xg?L@n44I;Nm1Gg0O@t)Ro1=(*0b z`3X&wLR@SV8k6~8kXHwiWZ7b2h&xd^i*Jq&8<%2QOT4yEWe~9Yqt6usi-$E+`M=)% zuaH!%^y)ev%Tk6?e5_a)QvD@&5ls86q2H9ggD0Nz6?O~bE3WU&QORCFs`=dBGd>ZV zK0U&q2#!xUW+8isDp8U!(tzr?5S>ZKEf}_C?WJ635 z*`AkISnX**!g)<8s(3P(#Qrv6MW0!iB058}0LaPJLn1nxDGfYQaIL~F7J~5xr|xe- zBXuX}%@Qoy0ZPwy#&DrCZ~m~Gh_m#Fa|kv&(awA0g(|OhU^xbMZ74N5Tfc;tGSl1M zH?d21o_!@5!TCXiPzXf*=OM4po}YRE@*Qn9xPc)O3V0)l419l=Na-jEi$D19^G3`cui2+nB z|GAz%0Hg3;8d3-}&|mb=s&@Iz?A(mCSDk*&qV`nM=Un@(Hsaw@yJZ-bdOnfNAH-^Y z?X?c~&_L9v$ko*>Su$qW(y6#p=!yKCOD#&73a5nvh|^YqV3Mq1^XG5<^I7?JbG`|m zd){r5^gLSwBIEogR?*wxYt<=cLg$!!`^K3jKhv)iy$|+KfWq%jjsJp|MgO4JPxx`6 z_#UNe4IA#u^ecZsn=Qxc>*`CWVQ#*!Jex(aSPS1T)^s{?C)(;I(Xx{4JMIH zSR)H63Wo?*Ls`in5sYV0h2AzjRhC{Zl@84#pV8y4Yu4%N1=gB4k3xFz?-Qzw5=mx1 z03%Hob>!H9#yn$)P|>R{AUP9#K3ZvV;?&HNe(aP79==)yp%4!2PR*)@eyuIA=QkS5 zVE>i+07Oqo@h_EV#H?Fy(O8)|l!C9H<9+dWMv>v5x<$D-5=^ohK434x5K7wB25VAa zW52rQ5r^3?)(i|EtMV(~m3Ccd{nstWj*=$=!PjO-8KZl6xOL2WzOrtZScRdEIZ`=&c7xRPVV#1=;9vFd#5@Ys4+IX&ncg*tD3mihD)U@JL*eF=4 z)g<1!et%8nX1o{iHH-jB@AxFjbE(Ny=Pn#p+7CR6?#7J*Scm@}a7f%5+&A8hP`bAH z-1$y zR-`q4m#S@wfuk$$Pnyce&0R%yrjsjSDhR(`D4w@4da2?i#vm)3y3m!Vghx!ZjX4{ll1^swTDmJ@5Zf{JahD7xQ_o&XZ7H@L&I>e@^}QYEQ*j zUhK^E|6nNNkbonkOoY{B()528G59C%DNK`Fk{9on6KbJrN6*j#;JLm!>2CcgZdSRA<)4W_qIp&Yt#sk?`KkR zDdh{vv;Zva`Wu0y9hK$f$)=F&gsZ8j{5`C1Qg6f`?e`2GoSmVWx1Lq=bP1Q7xBE;RE$EX;xktPX878AIlWBH zF|HY;Ce%m^-C*C|Voz$;*s{s0qM6FMXM6%Jl60Up6 zob686)gJ*^lSGY#MVm$1yw{DV8+b0tc(n;A01GK|71PuwmQddSA=dme-@Q$lbTL<4 zpPCO~V3ck`abq9Nl}yn4wT~-IID+IA%f;WBSPH(MCJav{@A=U=h}rYpre-alfB34+ zVQJ5E1E^Q`TQ7JVM;QRR{u%MB}#w(sC-+T86f)_%re{~ZUYf9E)6L#$Co6Hd27(CuqNRo3EE?OqsLd*Z$_I3fky( z&%JT|G5~_F_sLHrlXHEy@b<)hJO`eueC3aUi*scb7;(%8`N;+=^+b%=< zH6mgW>B;4x*K|`)$vME}C$jq`#}dFtfb&4%9l$D@eKC(^ zd7nd=DbhOaO`63IESWC?CeAc2pYL{a8^p9w@D@uKUH7G{__>F(XtHe*T(*cqHi9sF zRs*jnj`4)e&i< zhrNSin9oq;{m6h|hF&SX<5ilSu5>Nwo+m((G5dT;Tv}OI?F#->3{aJtg|=#QHzJM! zcG|}Ub4G@2T)dWpzln&5GK}`rvTbKsw%oOV?TM1%4J&%gKxN7U^bZ#6ugHArF6{QE z3%~Ntx3FvQW%rkKBCq7f<^Rbze&dBKxLAAkvI(-r0?cH}$BTQOGt$3H4ikGXE>^P; z5@uYqjjw zRR+_)gG*9pPnBQTb)_7)&8=+sK*Xb{1!F30P{9_*!n8dW+xo86) zNYT*7AG>yec>$00K)GaA@p%V}f$`}E-+TU;D~Ag}x%u_vVO?byPCR(Y1qrkf+dx~< zzX(YYg*MJyjAq`VRkWm@O!Ou|b*;pjmk;jl!X^M3+lnZwfXv1V|5WV??xvj+{mGnE zp6o%_BZx7V;qmT#ie2`aOLPR%{@^*pWK<9-f?Q&smQD_|Gi<-q+X(fp2K0Kn0h;T| z>OGR4)fvD0;SRzeox$?M8;Y>ONH**XzvdKmuU5#X7V3>HOwVZHO%{z0I2)o zp5+PF%tHkO3;#aza249wM}h0-hvz(>UZ(@3$4{gsjbZEJn-Qj`p`YWM^K_PDU3rf2 z%O8Y;mLXG7w^AWuvIAy@xY{K2F^Tf~WWb2v=u{P0D#OB?VAlJNOq7<%)XH zIpk3H*Q;Mc(JXK=E+pFDCFHfK0mokR<1BNQw7!`L5E2b?sW-ZqFwi24;eo7b$$-+n zxI0uMtkIk^Ph(!3g+e&x+KhW-Zb@HF%ZpMt9QFv~$pQp9N_N)r3LhYWw*Bs6$DSx? 
z-_N!y2b^l-+_W&~0BmlNSM50seOANAX8VP7-I+w6C>^G(_=Cqc2| z7BBxEEu(iKuY!UBGwdYb84gW50(yZ8*@2}8vmWt@xkyEAt%IvFTplRd#V6RVa$t3D zkOvaA_WdUC4veeD4<$v_a-*IbbAQg)M`wp{BYxU7AMA_itG(GfRvLv}2UrUD(ayc* zB*Zu5i(iWR9Yv^>Ig2;;LHZ(%WdU?*ll&f}D^;G;2Icx4?&kn)q31$v;oQiqb#DK- zuk-U~p!f~$rh4ujmB`{?arxhRGFH_X<^-LHsDXt1T9J~L3A~8x;V$Zxh5;o0#;x9c z2YADPgXTb4tLE~u1>$C@?Hl5ikEJrtS3@t+OiOjl3tAv|kXj= z&L;LYZR#JXOB*a%|1J$$DWf1fPp#-!nhi%s*{7^v=k1=s4VH}y%*5f7 zQ?D2!At+6>b!k3T`PVsu8hceHmAjWCE?2gnviC zhotnV3A7l}_%J8+2oCb2akmzCZC`Qou40BD<#7B^oyids#jhN@z$PqU)o$u(wz~xV z-TZ`$Y+%kF@L^1m`9wMI_a$IpWymLx6vwHj{+Hf2rYlv8*$S`fN`J+1FvprkBxse8 zi7d6j$9)f7L+yw9t|c-yv5mW8KZHJm)Qx&z6Gfq(j(4s5Wu!ZTrN2gkB=3OiiiEH! zrvw2*P3m8NFie!JWyS)sytsUtnf>urKao~(uXGUgm}}6zGA$wiv%-}NCxj&Es5{gT ztGt6?5DVj;EQpJUjfpo^bp*t>(!@9vWh#G(s+h0B)16sZ0b z6-*XET$ej6j0iGn2P`nh{t~J-alU5i67Khk;&ihpB0^-XVJ!zfp_#6M+r48Q_^xF} zh3)55yeEJh>t()po)V28gtW8t8N=r%G}eboz@$p*;vN43qKbr3K>K+8ClI|d0f|e~ zsTsqm^fl^f9cfyiQ+-olU+F2@2I}S)y8h3wLaN69yxXS7%34b=+~ZMGz4Mh0`yEPq zq}Xqw=$QI}Hb@RNz~d*N7677+Pz7eBcHJGZ206de&{740MTEQ4ZcwLrRR)_=_vptZ z;K?^VSEdM~C(++_~u+?e~B;>5=#|pkox9#jE2%?iaYh=mi0%mo!YZr zsW@4jOr<|=io96UauuSgVw^-IryCb(&-;UOapm>1bMQ$TRRcQjj*6&d&*8ik#XYlTWaU(grr zP`QwEW0d;0s*8SNs`A5Ib3V|S%nyU|C333#&dnEA!~f8a+@sh@kzw+kdr87JcfsY` zUP~PZ2v!h4>{>*`{4BDVvF+la)#L%q(Yyq_9g@>_pNr{tkXB5T! z1x@P?b}>Xm0mZq{r*)*mMeZOa20ua#=~Ub&R&fwO9zD*%^WRK84Ma$jFz`L12erLg&X z5V3~R{b`B6xkwd|M_M;j=0g?Rnm2lC$#NtF-J#UlYD6mo0Ljq|xabn;INygsS(ic0 z^rj12&&p^OS^MYiW}2lLN&cyZPS(rHMpK^MAi|Y{a-&F0I@a47K>5I8L=u4WwWs z+(KEmX`MDYsR}RH^s0lS%#m(xN{$N-m*MDEL$Nv>qEtKu#FiF>maLujSk9;$LJuGl z!7i-meq!K$*D-xSP1rsVoSfK8%6`{e(lYj~&0>(@0#6618JzzAtJe51M2Oir9+lJ& zhi^;g#M@90u@FL{AvjzyeTQaB^+yF?16HErIOrWfifZxxB#GSu;XQdN3beFmBUt>m z0zcK3SbN6(Cd~$MMULP60Fv>1W3-pUVMvouQaA_9vc^UY9>J(0m6i%-UNB9=nuLnV z0fXRcT)k%RgIU82sj~zh>a%LpEj~EnxLwST-nA|uGiz*5TfLnv{a~{k+B|HH!8_lf zzk66OwiJ0`_3bX6{ab4WCn$2{ zxdiGTBThkwr2@L!DG@WqjkxuEQ_6|Hci#0gI_O>2gEeA_F~nBzhf>2|k)9_O&w*Mg z@_SV_K1M=4-XVi(>&+JA8BxN~vKQ0w33`H$qO*i*%R-IpQ4$gkJ!0S}xN8&XS}#e% zPRIzg=twk>5jK0}aFzOlDG-Ettou{0ab1{rO_-XaHv)!>z2m0{^~nR8_-y?#6Ovh~ z=J=fjIW`j}#`RwB{kEEmPQwQXU*Tw~MiBAZzE@lJd_MtNbT6e3#jM#^o%%+`_u&!= zV5^n9>+nuqCb>dG{lTQAUi*s#;05{E}-qRe8}@L;AjR=M#7i+4Z8=6Bv+- z$BTTu`lt>F^|>DsnbI=W6A;Cd&933U{LQv*YIg(1j zn%#}u$!o8Wg!8e3NA$;*%u*ELS{)8{u@BrhH?ZjHcnz;*D?d@DSH<%wAd~~=tw9r| zL0_pJdty52u1!9>y3afwwvK7`282=z)hyB&L3tqXOIFmAv*qTit(or7&cQIO`kU`v zesn_X3T7hM%fOwK${x_HmT#wp>Hz2SsYiVh!jn9SSYowd~04v6#8?fL-S2jO=c(qs&J54(L&Ls1ln0m4UF{kFj!UQH0 z|8lm@lL4O6N)~Uymf7c1=GYKkh792pL6na<%V&!XlE2{D9de>edkgEWg4~ohvcwN9 zb-rsXmmQa3pJ^AOV@J;hv5v>^c+lZDZPAX^JnE?Y^>($xU6cGlrB!0|ZQkgcdrEwRzzfLn#jVGbL`P4#Rf8|8&qQpJ1XeVJs9LI})RbUEQmE zPILdB!xWN4msjrK8Yt1h!pUHt!x}}4M)ltWt9g>^3Up)LM$8Mkia)W?tEa4jMXeZw)O^A=VTzG3l^nf#cO2 zP6c&kG=z@7d4Eu6VTsxb_zRe^f~UO1^f5=lo11FTRXybrS@- z6ONIG{di{+=L%jJ(g5MaA^A9Dn(u~bF)@03oS*C>%VxXIL3?c{@*jPXR z5_^4u^c>HlvlHm>%oOr_^ikYh@gC0yeg!v6T;VXj1x%_!ge@~lt#gu-`0 znaIeNoO8@1xCwfc5Hn*#rMkcTRzrF8X?_rJrwF^RE})+A5xQxYZffcYiEDT$Q7CEn5Wewi>1ZWZ@UkscZc5LmlakgKz*w^eNR7KXp(#2+<-V)xe^^N zxQHD20}i^5tb@tXp4QvfLtP75Du{I;u9bm9u}p+D$50bf20W)ZD9X@;0n-eH?~{lvJOmgFo6 z^I@CpbVRxwh%n*)dKDH_GV{r>=TlRcEctb=;(QrE&b~ew2-ug1yqvX6RB$H$1gwWB|+ws79Idn^#e7`2Ar`IjV67r;8wGk3*;@sJSe;Gj_BEfFkX z?;c90gJ2bcZxh6wYR^a8Au(@WtS01Rcy#koDX;c}MLG#z6CZe5YOsN_QS2<{|8%4G zgyVm03Isq(|85kM6_C+yL`3hUTDVK+96DYmOvj(_)wefsXrl|2AGa~X^iU={&{i^5 z%?a7vv<6b#-$l<)7;`|(%w>bdP+_P?PJeYwVIYaj_1>R!#{t57$VlH|R6%9D_M@L5 zlAnK?+=zDuP}iOPHHNYKce9)#cZ%ciZT5p2ko~r>F^t7(i0?YadkGoYi2pdiwF68x zqOaZ}d%3RW-XP(XN`NKXIxu|32K@lE**Gy=WK9^!z=gGwaoE!&0PBn%TT|qmdm5AP 
z@*$RAygOU&VfhXli_E{%hwF5zV2FV* ze%-9)mapp7a4#+?z(^&t-;vcf6}(xgrcOr$V+057j61Ai%XE-jNDAzs_%XnL`j3n4 z;iJ`Xwk5a+zr^7}cI^@fLVg!pak`7_$8?R5odcjyVPn+KkTl?wlGN*zC#_`Gs~MD3{Cw*f3R&UPrGAT#nIdSi2q=fo<-Ba`s(Jqkd{7lvU8~iyTqRgxpa3C(aniY6}cI%PC5i|;MifdBu%n(oolQ2X5 z3f$?KgtbG)DL|$jvkGmnlisOxk_7W_bYvKkhxBD2>h((v0sOcv>F+K163kd* zWmUjNwT!XM_MW=7$I}4B&e4IjwD!?=Kw^jiMlo^zUG!q&H@{aH2-i7;`)-~S`wjdu zxKP@DqB~98yjOGjn@?TK$1RT?tzqdk{96&iV~fe02kzf$P0Y7cxMb7o#hU!8-jF=0 z#_HO%e?Bz`5SZ`19mn3U?{r78Y6DV? zk3LkzN40DiJ}%c3S5n#L^BNuOgx?#SdIX%|rFXJK-oFlqm%EMhH-{xzW#WDH*>5Z(8B?%65F@{ZRjHt516 zIIPq_7lwIU7uA0}?O}&PM~DR5`E80NfT*4z+d+wFypiR8S9CJm+~n+!@7;WoKO>;T zA<}nKbb?D@_yr(bDNHl}2d4^q{~|{v+p9c7PUpWdmwzJGFq@}5b^>Gn?tkR5I}}gS zlDUz5*8jqW{-v&w<^cfF2%BKn|ADx6xI_bSXV(7qMAFp%$YY=WtPD?=kl(-aSjZ8< zD!8_w*&Qr$brIE%C5DXy? zJMT%^m!u4Zl}?4{PBxNZps6+~cxILaMmz5B_HpXBlSRqfkF|b>i`UWpH9*pNI$bF( z3GAfJ)dC-2s9DimgyY;1uD_w;z^DN&Y&l+)?_uBn1po^-$-3hQutjI$0BcAerx>)& zXY5kA`<8xFbQ=q>(NL<+uJN*Cd|XTYgHtmt$Yn;l=s0_lP5Zxu99V7N!({zYf1Zri z@IJ{Wx@&*1>iBY2TiDpbR;j;xR~X%EonmS*2z?5ssq|eY?VWIJkBJ5BJ5zvQ1}fom zUkGH0=1;j%2QNN_xIlq3O&h`Ckp6^}v+WT4^m?vARD zUrwA}aa?t^wF9*ciAX(mwuht*i)#KJLJ%p#@XI)2;WOm(5#S>MW%ewMHM6WxsVM%9 z{DQduPu5}`g&dHyI{YjFaNXpq1uowFxbB^2^(;?q+`-INV$&_8vxmBDh$(;V1SU)F!~cQZWl^Z{@Ly*z9Y%)lrS26h$Z14M3`YbWkr zEayk_pW2OJQ&dKUL?-IB678d$Qt}M>R5|~lZUE=~;hmP1$Y+V1F6>1p1B@o0;VQ$% z#DR?>8UV%o!zth^^*l{)>D4TuiWI8&>ikCGKs<-*UV#<0V@sQ68}{xD%N@Vm=p;7r zjcPj4MX_(D>F4{K<4)95V`{cy%@c=HJkyTde5KoAr=kp3JNI{jL(mn4cO*V^0$RoA z)dv(gAD?`Y@=s{)I#4X=W;MHJN%vs6Gu){Q{+rAd-O*HRXmS5pLg~o-fLVhS(f$RL zCIcX(8aWG)0!qLJWReENg({9{17bGO;7WkIPw#lj ztxD?gCztC}SHRa(dmaraZj^7Hv7Vk2KI*#z%E-->Ch=9VFTVblnk^e8<#ysrGQb{o z=ti(_4PbnUp<>idE>Ql9{Ws+=r9-fjH?l>K^`ExnA0Y4pv#ZqoSk|@L4YT&AiMh@a zkrVD;&-pCsycutH0N)T3KurIRqkE^m5pfTX)v;J+ULW|7xK_7q3l+`_WkWs`m&PWi5k3ew3D;Sj;(L zzgbri7qs~%=(DgIp52L+Isrrh@r)0Zl^pR5r5|7&bgt_I-#r_Z2M{21Cj zeoZbS$o*6sC`M)o{^BF!%+uVYYBm;uHj6ri@<7&YU8!(bt+_~;aQRn&ly2;UyWup9$XEIYhTBaYNzR*Jo^@V0+^+u_+A+& zarp~le-2_4*ye8A@3BS3DeYj_hgD4gVK3&{1v=}nuFS6o>tBTO+azBwlwB`+#2s8u zwHQFtTVa3#V?#VFy;n53M2&tp-e*(5G)L`h*Wfqy@D7fDmvYHzT##Oi~U>Q`11> zWejj&Xxic)u%rT}7<7eQ^grUl7MuN(QhOm1)1=k8o=Fd~?QK(q_WOI)8jpd`hf?d} z&h^fBF*jEDb?zq|VnL2d`m{c7fK#au^v)6!_@mXQLBis9}8ZsjH9lQNpUi z=%UdCPcI@930BD-@uSW9)f{CI>Y3Ox!LJWtzW$GY59iC)HrU1r|!*HLBh?L@_An$iw)SKEcjk1S35u7DpRI7y2&z}og?@0S7Ddh^Uo zczx;C{PHae;a!OHn>VG28arPJEmJfLfw^kVGCh7Hr-Q|^*RS;Mn3b|7BWF(>-nF7_ zF#+;|h&_nX*!>SuA?Q~zBz%cBFN$Ky z=!VDbt1Y)9@Pa_$bx;on+}N*JQ(5O)SenB>F3#i#0Osx+pieY#WR=YJVfdklYkQ!w zKSZ#l3wOO!3E~d;C0HU3YrR7Pd0cvhY{!2fxtgy0&>S|Nr!P~jiY?7MMagbzBWf`- z0+1wNakThxR3^~^o%eu!q}EpP)a~N7<6Jvzq{_T6&)4U>T2@P1u~Xc0x?iz~95Qg4 zqmrEMS3k#tPu^j&-Vf;FH4HHD>ya~wwdhKYG~g{X+)Mry7U_)hI{2CEXgL;_~-}~F~t|I=N6`rG|GB1on6Q(dvO zIaT%wdCT?U-c3hW0U#sl+en!l7iQkHp7FB7A!oj8hEe=9(*a4efh{FQ`2x7BRQmmi z1@eUB5nF$D&z6sWly2?usYtdz#=7suJU7^k>pi3e?$Lg)1KTQf_Kk&3i)00@ z_g>EH_pG7?#;=P`e8vhw8V?3>{~bmKF^n=cYOH1^oo)}=i|L=W`mOvb|M4f=56Zdm zGb5Grjl)O@$8m{!ma4W{Ps9gVuYN{u$z2VLp4HbI=DflK!(D~%>UmgLvY)%WQ%vC~ z4$@PcvF&*orK85Aw7996x0x_tzsq%P5opG+ogo{|st>+zt5B~^4VAJ;!Iu&@eqf`P z^f*HmH{iSqGdV+$SqKLN78FHILMJf_JD)F#6pLz246E9Wk~ z{%`ab!>J1^oaMYG(_tQY&-2HYSLG3vJ5KkoX8+P2apu-&gyB#8OqAU9=Dp$;ae zALb^h{BUrYIknYVA-=k>)X41tD^-UsO`64Dyvo=tvz{9LQc6~4d>fJ)A~DeDcN`Gv zEh#3sR5Vdh8vNO9bGn^Fa;3=!>(Khw8)zJZ5pK%q&M=s60R z?8T$nGcZ#o_R#MTGt36!=+hT?GzT1t0c;j0M^+e(j^PeaSLnoc9t=yO@jt)_L@(K+Dg9o#`!39elK&t#FGOgFMJY@ogA^Fc zNDL=TM8Rhu#lMND%oFurciKz*6nw#{8PPV~z^JmU6;r5o$t&Txb`{pX)HPFY~a~e~vMyoRQ#02zd zxtND%J^k4=VH;nT&5WYaOyeUnl}b)|oSfd-E;$h;u^OvtAJVUHhzmMfaom?Z31X-E 
ziU4-B<=5@60}McTx-*=nVUVy0HD6VUqvIUOxFt|Bm9bfA+lU9b`BsI&&~JxL`Vc+? z2rd$XeC?3MFWn-F#0-v#-2R+yw-ivyK#FUYI@YP+rer<7ie7X)( zhf+gXYq-)pN)Vv>@IYyt=_3{E?+^E@J8hT@ao61EXBzX?k$^u_FxF)rZwqzK1R8;O z$$dMc_rV6cDZ=97bXT+D%GCp9#{tJV!z}wqNh`nNQ2{Gb(ST!FfqCEP5TtZ3s6@j6 zZE%0KrS;h7pXVt-%a)yye`7xopbYzlheQI7jnwR!T#b+VgTm4F5wE=+(L~xlWfTo> z3!=KkOT7Wg2C)_2x4q@sEiRb*IDAPjm~;u`$=o|!=C$mAxHy`OFXa_FmaIjbkfR?y zVAv?#RO;(=uavugcI_V@-O*UpxgzRu;Ttv)uzF5i|2!S zxtDdKW5JrY*V)ZUdq-!?G@w5oFse4>(ke~MZn}ohz{Ve8BtH{-@W#<9sL6nh%Dn0? zi_JS$Q28z@J7uX(w{6X7*x}@r8|Aw%0&Z9g4E+vf2*AIU2hr{bS0fP(Ox`DrAs<|= z^#u!cJa&05QrY*f$)R|gy!$Eu>qhDnXJoP>4$5@I%{m`&FHl(?HwlsMTC)zecFRhx z*76vHiaC_B1QOUMC4mxNe!hMZ5q?Mj;Lpftm(V(uyFe4tD52cwYgNn>(tt;_JBx9L ze@qqQ*e4tQt$u*1;`iJ@VFL%M;LJ>PJ<%hIKI}>C+0em`%;EW*sz@FWab?As-CfsZdEcCuj{*k=Ix{Ynk&3hCxdY_HA^OgEJPg38VsWX(lIG)gk?Wk_fwvUs*{JO4EUdw(bvLsyx-w%NCZpuz)BDQPmwF=9Kq(lNAVi)q%PKdO*Ll zPkvzvM^3}=brWJ987RVFbPtp}=r0okb}g_oJ_LZtz5lH8BSDc=OmG5>C3Tr&+d{dP zIV1$UIcVo5_+YH3a(t!-9jRZcI?9hvB$3eA)oHvyp4!z2z#by6ABf`Wj?T72l`x?} zz$xgBDBG~oe%wHG$bh|clSS!hN{!(AMvyG#Q|;ipW5+tN<`HZfIS$7_Hp#5mDS@hS z1sl8b535_i Date: Sun, 26 Nov 2023 18:33:17 +0100 Subject: [PATCH 31/52] Apply to for dbt cloud --- docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md index 96510918cfd9..c44be5a814fd 100644 --- a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md +++ b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md @@ -1,5 +1,7 @@ # Use the dbt Cloud integration + + By using the dbt Cloud integration, you can create and run dbt transformations during syncs in Airbyte Cloud. This allows you to transform raw data into a format that is suitable for analysis and reporting, including cleaning and enriching the data. 
## Step 1: Generate a service token From 13ceb0bb05cd55e53afa19f3f945c3089e620eb9 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 18:34:16 +0100 Subject: [PATCH 32/52] Change sidebar order --- docusaurus/sidebars.js | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index c7ac46cc335b..06b0975b3834 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -465,10 +465,10 @@ module.exports = { type: "category", label: "Managing Syncs", items: [ - "operator-guides/reset", "cloud/managing-airbyte-cloud/review-connection-status", "cloud/managing-airbyte-cloud/review-sync-history", "operator-guides/browsing-output-logs", + "operator-guides/reset", ], }, { From ec2d4e9a3d0defe181325c9a29691941067baabf Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 18:39:58 +0100 Subject: [PATCH 33/52] Change sync modes sidebar --- docusaurus/sidebars.js | 11 +++++------ 1 file changed, 5 insertions(+), 6 deletions(-) diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index 06b0975b3834..1694bf907c74 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -424,13 +424,12 @@ module.exports = { "using-airbyte/core-concepts/namespaces", { type: "category", - label: "Connections and Sync Modes", + label: "Sync Modes", + link: { + type: "doc", + id: "using-airbyte/core-concepts/sync-modes/README" + }, items: [ - { - type: "doc", - label: "Connections Overview", - id: "using-airbyte/core-concepts/sync-modes/README", - }, "using-airbyte/core-concepts/sync-modes/full-refresh-overwrite", "using-airbyte/core-concepts/sync-modes/full-refresh-append", "using-airbyte/core-concepts/sync-modes/incremental-append", From f94adfff43a7dd84fdfff3152fa91831343cc288 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 18:42:15 +0100 Subject: [PATCH 34/52] Fix Open Source spelling --- docs/integrations/destinations/chroma.md | 2 +- docs/integrations/destinations/clickhouse.md | 2 +- docs/integrations/destinations/databend.md | 2 +- docs/integrations/destinations/mongodb.md | 2 +- docs/integrations/destinations/mssql.md | 2 +- docs/integrations/destinations/mysql.md | 2 +- docs/integrations/destinations/oracle.md | 2 +- docs/integrations/destinations/rockset.md | 2 +- docs/integrations/destinations/timeplus.md | 2 +- docs/integrations/sources/dv-360.md | 2 +- docs/integrations/sources/e2e-test-cloud.md | 2 +- docs/integrations/sources/google-analytics-v4.md | 2 +- docs/integrations/sources/google-directory.md | 2 +- docs/integrations/sources/mssql.md | 2 +- docs/integrations/sources/my-hours.md | 2 +- docs/integrations/sources/oracle.md | 2 +- docs/integrations/sources/pokeapi.md | 2 +- docs/operator-guides/browsing-output-logs.md | 4 ++-- 18 files changed, 19 insertions(+), 19 deletions(-) diff --git a/docs/integrations/destinations/chroma.md b/docs/integrations/destinations/chroma.md index 3e37bebba225..f99a9cf869a5 100644 --- a/docs/integrations/destinations/chroma.md +++ b/docs/integrations/destinations/chroma.md @@ -17,7 +17,7 @@ Only one stream will exist to collect data from all source streams. This will be For each record, a UUID string is generated and used as the document id. The embeddings generated as defined will be stored as embeddings. Data in the text fields will be stored as documents and those in the metadata fields will be stored as metadata. 
-## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) You can connect to a Chroma instance either in client/server mode or in a local persistent mode. For the local persistent mode, the database file will be saved in the path defined in the `path` config parameter. Note that `path` must be an absolute path, prefixed with `/local`. diff --git a/docs/integrations/destinations/clickhouse.md b/docs/integrations/destinations/clickhouse.md index 75da81407f48..02446ba825f6 100644 --- a/docs/integrations/destinations/clickhouse.md +++ b/docs/integrations/destinations/clickhouse.md @@ -21,7 +21,7 @@ Each stream will be output into its own table in ClickHouse. Each table will con Airbyte Cloud only supports connecting to your ClickHouse instance with SSL or TLS encryption, which is supported by [ClickHouse JDBC driver](https://github.com/ClickHouse/clickhouse-jdbc). -## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) #### Requirements diff --git a/docs/integrations/destinations/databend.md b/docs/integrations/destinations/databend.md index e25a80f7ec88..444a47473a6d 100644 --- a/docs/integrations/destinations/databend.md +++ b/docs/integrations/destinations/databend.md @@ -20,7 +20,7 @@ Each stream will be output into its own table in Databend. Each table will conta ## Getting Started (Airbyte Cloud) Coming soon... -## Getting Started (Airbyte Open-Source) +## Getting Started (Airbyte Open Source) You can follow the [Connecting to a Warehouse docs](https://docs.databend.com/using-databend-cloud/warehouses/connecting-a-warehouse) to get the user, password, host etc. Or you can create such a user by running: diff --git a/docs/integrations/destinations/mongodb.md b/docs/integrations/destinations/mongodb.md index 51bd94cb8c46..6df8e95f929c 100644 --- a/docs/integrations/destinations/mongodb.md +++ b/docs/integrations/destinations/mongodb.md @@ -25,7 +25,7 @@ Each stream will be output into its own collection in MongoDB. Each collection w Airbyte Cloud only supports connecting to your MongoDB instance with TLS encryption. Other than that, you can proceed with the open-source instructions below. -## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) #### Requirements diff --git a/docs/integrations/destinations/mssql.md b/docs/integrations/destinations/mssql.md index c48261be1a0b..2a4bfd50bf5a 100644 --- a/docs/integrations/destinations/mssql.md +++ b/docs/integrations/destinations/mssql.md @@ -33,7 +33,7 @@ Airbyte Cloud only supports connecting to your MSSQL instance with TLS encryptio | Incremental - Append + Deduped | Yes | | | Namespaces | Yes | | -## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) ### Requirements diff --git a/docs/integrations/destinations/mysql.md b/docs/integrations/destinations/mysql.md index 3ade0339ed56..469d24d4fa59 100644 --- a/docs/integrations/destinations/mysql.md +++ b/docs/integrations/destinations/mysql.md @@ -27,7 +27,7 @@ Each stream will be output into its own table in MySQL. Each table will contain Airbyte Cloud only supports connecting to your MySQL instance with TLS encryption. Other than that, you can proceed with the open-source instructions below. 
-## Getting Started \(Airbyte Open-Source\)
+## Getting Started \(Airbyte Open Source\)
 
 ### Requirements
 
diff --git a/docs/integrations/destinations/oracle.md b/docs/integrations/destinations/oracle.md
index 2b26a69cbf6c..d2e9867eb04a 100644
--- a/docs/integrations/destinations/oracle.md
+++ b/docs/integrations/destinations/oracle.md
@@ -26,7 +26,7 @@ Enabling normalization will also create normalized, strongly typed tables.
 
 The Oracle connector is currently in Alpha on Airbyte Cloud. Only TLS encrypted connections to your DB can be made from Airbyte Cloud. Other than that, follow the open-source instructions below.
 
-## Getting Started \(Airbyte Open-Source\)
+## Getting Started \(Airbyte Open Source\)
 
 #### Requirements
 
diff --git a/docs/integrations/destinations/rockset.md b/docs/integrations/destinations/rockset.md
index 0ab1709a68b6..bf685f3e4ce9 100644
--- a/docs/integrations/destinations/rockset.md
+++ b/docs/integrations/destinations/rockset.md
@@ -23,7 +23,7 @@
 | api_server | string | api URL to rockset, specifying http protocol |
 | workspace | string | workspace under which rockset collections will be added/modified |
 
-## Getting Started \(Airbyte Open-Source / Airbyte Cloud\)
+## Getting Started \(Airbyte Open Source / Airbyte Cloud\)
 
 #### Requirements
 
diff --git a/docs/integrations/destinations/timeplus.md b/docs/integrations/destinations/timeplus.md
index dcf43cc48225..d883fc1b3726 100644
--- a/docs/integrations/destinations/timeplus.md
+++ b/docs/integrations/destinations/timeplus.md
@@ -16,7 +16,7 @@ Each stream will be output into its own stream in Timeplus, with corresponding s
 ## Getting Started (Airbyte Cloud)
 Coming soon...
 
-## Getting Started (Airbyte Open-Source)
+## Getting Started (Airbyte Open Source)
 You can follow the [Quickstart with Timeplus Ingestion API](https://docs.timeplus.com/quickstart-ingest-api) to createa a workspace and API key.
 
 ### Setup the Timeplus Destination in Airbyte
diff --git a/docs/integrations/sources/dv-360.md b/docs/integrations/sources/dv-360.md
index 9e4341f1d847..b3c095f4691c 100644
--- a/docs/integrations/sources/dv-360.md
+++ b/docs/integrations/sources/dv-360.md
@@ -36,7 +36,7 @@ Available filters and metrics are provided in this [page](https://developers.goo
 3. Fill out a start date, and optionally, an end date and filters (check the [Queries documentation](https://developers.google.com/bid-manager/v1.1/queries)) .
 4. You're done.
 
-## Getting Started \(Airbyte Open-Source\)
+## Getting Started \(Airbyte Open Source\)
 
 #### Requirements
 
diff --git a/docs/integrations/sources/e2e-test-cloud.md b/docs/integrations/sources/e2e-test-cloud.md
index be70af977245..633e65c3e548 100644
--- a/docs/integrations/sources/e2e-test-cloud.md
+++ b/docs/integrations/sources/e2e-test-cloud.md
@@ -2,7 +2,7 @@
 
 ## Overview
 
-This is a mock source for testing the Airbyte pipeline. It can generate arbitrary data streams. It is a subset of what is in [End-to-End Testing Source](e2e-test.md) in Open-Source to avoid Airbyte Cloud users accidentally in curring a huge bill.
+This is a mock source for testing the Airbyte pipeline. It can generate arbitrary data streams. It is a subset of what is in [End-to-End Testing Source](e2e-test.md) in Open Source to avoid Airbyte Cloud users accidentally incurring a huge bill. 
## Mode diff --git a/docs/integrations/sources/google-analytics-v4.md b/docs/integrations/sources/google-analytics-v4.md index 835d1d324df5..85538f77acef 100644 --- a/docs/integrations/sources/google-analytics-v4.md +++ b/docs/integrations/sources/google-analytics-v4.md @@ -104,7 +104,7 @@ The Google Analytics (Universal Analytics) source connector can sync the followi Reach out to us on Slack or [create an issue](https://github.com/airbytehq/airbyte/issues) if you need to send custom Google Analytics report data with Airbyte. -## Rate Limits and Performance Considerations \(Airbyte Open-Source\) +## Rate Limits and Performance Considerations \(Airbyte Open Source\) [Analytics Reporting API v4](https://developers.google.com/analytics/devguides/reporting/core/v4/limits-quotas) diff --git a/docs/integrations/sources/google-directory.md b/docs/integrations/sources/google-directory.md index b0e570f7544f..d263d9efc93e 100644 --- a/docs/integrations/sources/google-directory.md +++ b/docs/integrations/sources/google-directory.md @@ -40,7 +40,7 @@ This connector attempts to back off gracefully when it hits Directory API's rate 1. Click `OAuth2.0 authorization` then `Authenticate your Google Directory account`. 2. You're done. -## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) Google APIs use the OAuth 2.0 protocol for authentication and authorization. This connector supports [Web server application](https://developers.google.com/identity/protocols/oauth2#webserver) and [Service accounts](https://developers.google.com/identity/protocols/oauth2#serviceaccount) scenarios. Therefore, there are 2 options of setting up authorization for this source: diff --git a/docs/integrations/sources/mssql.md b/docs/integrations/sources/mssql.md index 391c7cfbed48..c73999857959 100644 --- a/docs/integrations/sources/mssql.md +++ b/docs/integrations/sources/mssql.md @@ -25,7 +25,7 @@ Note: Currently hierarchyid and sql_variant are not processed in CDC migration t On Airbyte Cloud, only TLS connections to your MSSQL instance are supported in source configuration. Other than that, you can proceed with the open-source instructions below. -## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) #### Requirements diff --git a/docs/integrations/sources/my-hours.md b/docs/integrations/sources/my-hours.md index 66ae44d7bc2d..f58aa7899378 100644 --- a/docs/integrations/sources/my-hours.md +++ b/docs/integrations/sources/my-hours.md @@ -24,7 +24,7 @@ This source allows you to synchronize the following data tables: **Requirements** In order to use the My Hours API you need to provide the credentials to an admin My Hours account. -### Performance Considerations (Airbyte Open-Source) +### Performance Considerations (Airbyte Open Source) Depending on the amount of team members and time logs the source provides a property to change the pagination size for the time logs query. Typically a pagination of 30 days is a correct balance between reliability and speed. But if you have a big amount of monthly entries you might want to change this value to a lower value. diff --git a/docs/integrations/sources/oracle.md b/docs/integrations/sources/oracle.md index 1e81b7c73fed..e4493f950b19 100644 --- a/docs/integrations/sources/oracle.md +++ b/docs/integrations/sources/oracle.md @@ -20,7 +20,7 @@ The Oracle source does not alter the schema present in your database. Depending On Airbyte Cloud, only TLS connections to your Oracle instance are supported. 
Other than that, you can proceed with the open-source instructions below. -## Getting Started \(Airbyte Open-Source\) +## Getting Started \(Airbyte Open Source\) #### Requirements diff --git a/docs/integrations/sources/pokeapi.md b/docs/integrations/sources/pokeapi.md index 4ea12d78b100..ee543b33e024 100644 --- a/docs/integrations/sources/pokeapi.md +++ b/docs/integrations/sources/pokeapi.md @@ -24,7 +24,7 @@ This source uses the fully open [PokéAPI](https://pokeapi.co/docs/v2#info) to s Currently, only one output stream is available from this source, which is the Pokémon output stream. This schema is defined [here](https://github.com/airbytehq/airbyte/tree/master/airbyte-integrations/connectors/source-pokeapi/source_pokeapi/schemas/pokemon.json). -## Rate Limiting & Performance Considerations \(Airbyte Open-Source\) +## Rate Limiting & Performance Considerations \(Airbyte Open Source\) According to the API's [fair use policy](https://pokeapi.co/docs/v2#fairuse), please make sure to cache resources retrieved from the PokéAPI wherever possible. That said, the PokéAPI does not perform rate limiting. diff --git a/docs/operator-guides/browsing-output-logs.md b/docs/operator-guides/browsing-output-logs.md index 6a54b4f8727e..14c7b8e0e1dc 100644 --- a/docs/operator-guides/browsing-output-logs.md +++ b/docs/operator-guides/browsing-output-logs.md @@ -5,7 +5,7 @@ Airbyte records the full logs as a part of each sync. These logs can be used to understand the underlying operations Airbyte performs to read data from the source and write to the destination as a part of the [Airbyte Protocol](/understanding-airbyte/airbyte-protocol.md). The logs includes many details, including any errors that can be helpful when troubleshooting sync errors. :::info -When using Airbyte Open-Source, you can also access additional logs outside of the UI. This is useful if you need to browse the Docker volumes where extra output files of Airbyte server and workers are stored. +When using Airbyte Open Source, you can also access additional logs outside of the UI. This is useful if you need to browse the Docker volumes where extra output files of Airbyte server and workers are stored. ::: To find the logs for a connection, navigate to a connection's `Job History` tab to see the latest syncs. @@ -33,7 +33,7 @@ You can also access the download log button from the in-app log viewer. If a sync was completed across multiple attempts, downloading the logs will union all the logs for all attempts for that job. ::: -## Exploring Local Logs (Open-Source only) +## Exploring Local Logs (Open Source only) ### Establish the folder directory From 5362089c6542696c4a7f022d2e301f8517c59398 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 19:46:26 +0100 Subject: [PATCH 35/52] Minor changes --- docs/operator-guides/browsing-output-logs.md | 4 +++- docusaurus/sidebars.js | 6 +++++- 2 files changed, 8 insertions(+), 2 deletions(-) diff --git a/docs/operator-guides/browsing-output-logs.md b/docs/operator-guides/browsing-output-logs.md index 14c7b8e0e1dc..4e62843f160b 100644 --- a/docs/operator-guides/browsing-output-logs.md +++ b/docs/operator-guides/browsing-output-logs.md @@ -33,7 +33,9 @@ You can also access the download log button from the in-app log viewer. If a sync was completed across multiple attempts, downloading the logs will union all the logs for all attempts for that job. 
::: -## Exploring Local Logs (Open Source only) +## Exploring Local Logs + + ### Establish the folder directory diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index 1694bf907c74..9fe0d7363df1 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -336,7 +336,11 @@ const deployAirbyte = { label: "On AWS EC2", id: "deploying-airbyte/on-aws-ec2", }, - + { + type: "doc", + label: "On AWS ECS", + id: "deploying-airbyte/on-aws-ecs", + }, { type: "doc", label: "On Azure", From 2ddf9ccd5dd08b78e58d297898552afd5ed56df2 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 19:53:15 +0100 Subject: [PATCH 36/52] Reset document --- docs/operator-guides/reset.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/operator-guides/reset.md b/docs/operator-guides/reset.md index bfaf8787d71b..2e1139feab6c 100644 --- a/docs/operator-guides/reset.md +++ b/docs/operator-guides/reset.md @@ -12,13 +12,13 @@ To perform a reset, select `Reset your data` in the UI on a connection's status Similarly to a sync job, a reset can be completed as successful, failed, or cancelled. To resolve a failed reset, you should manually drop the tables in the destination so that Airbyte can continue syncing accurately into the destination. ## Reset behavior -When a reset is successfully completed, all the records are deleted from your destination tables (and files, if using local JSON or local CSV as the destination)) +When a reset is successfully completed, all the records are deleted from your destination tables (and files, if using local JSON or local CSV as the destination). :::info -If you are using destinations that are on the Destinations v2 framework, only raw tables will be cleared of their data. Final tables will retain all records from the last sync. +If you are using destinations that are on the [Destinations v2](/release_notes/upgrading_to_destinations_v2.md) framework, only raw tables will be cleared of their data. Final tables will retain all records from the last sync. ::: -A reset **DOES NOT** delete any destination tables or file itself. The schema is retained but will not contain any rows. +A reset **DOES NOT** delete any destination tables or files itself. The schema is retained but will not contain any rows. :::tip If you have any orphaned tables or files that are no longer being synced to, they should be cleaned up separately, as Airbyte will not clean them up for you. This can occur when the `Destination Namespace` or `Stream Prefix` connection configuration is changed for an existing connection. From 52b7c8de3df47f16b50eab70e7ab338c333e06bb Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 18:56:11 +0000 Subject: [PATCH 37/52] Update normalization text --- .../dbt-cloud-integration.md | 6 ++ .../core-concepts/basic-normalization.md | 15 +++- .../using-airbyte/core-concepts/namespaces.md | 6 +- docs/using-airbyte/core-concepts/readme.md | 84 +++++++------------ .../core-concepts/typing-deduping.md | 8 +- 5 files changed, 60 insertions(+), 59 deletions(-) diff --git a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md index c44be5a814fd..a68a5f1c3f48 100644 --- a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md +++ b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md @@ -4,6 +4,12 @@ By using the dbt Cloud integration, you can create and run dbt transformations during syncs in Airbyte Cloud. 
This allows you to transform raw data into a format that is suitable for analysis and reporting, including cleaning and enriching the data. +:::note + +Normalizing data may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is normalized and is not related to Airbyte credit usage. + +::: + ## Step 1: Generate a service token Generate a [service token](https://docs.getdbt.com/docs/dbt-cloud-apis/service-tokens#generating-service-account-tokens) for your dbt Cloud transformation. diff --git a/docs/using-airbyte/core-concepts/basic-normalization.md b/docs/using-airbyte/core-concepts/basic-normalization.md index b2ef3700b866..b76d4759de54 100644 --- a/docs/using-airbyte/core-concepts/basic-normalization.md +++ b/docs/using-airbyte/core-concepts/basic-normalization.md @@ -14,10 +14,23 @@ The high-level overview contains all the information you need to use Basic Norma ::: -When you run your first Airbyte sync without the basic normalization, you'll notice that your data gets written to your destination as one data column with a JSON blob that contains all of your data. This is the `_airbyte_raw_` table that you may have seen before. Why do we create this table? A core tenet of ELT philosophy is that data should be untouched as it moves through the E and L stages so that the raw data is always accessible. If an unmodified version of the data exists in the destination, it can be retransformed without needing to sync data again. +For every connection, you can choose between two options: + +- Basic Normalization: Airbyte converts the raw JSON blob version of your data to the format of your destination. _Note: Not all destinations support normalization._ +- Raw data (no normalization): Airbyte places the JSON blob version of your data in a table called `_airbyte_raw_` + +When basic normalization is enabled, Airbyte transforms data after the sync in a step called `Basic Normalization`, which structures data from the source into a format appropriate for consumption in the destination. For example, when writing data from a nested, dynamically typed source like a JSON API to a relational destination like Postgres, normalization is the process which un-nests JSON from the source into a relational table format which uses the appropriate column types in the destination. + +Without basic normalization, your data will be written to your destination as one data column with a JSON blob that contains all of your data. This is the `_airbyte_raw_` table that you may have seen before. Why do we create this table? A core tenet of ELT philosophy is that data should be untouched as it moves through the E and L stages so that the raw data is always accessible. If an unmodified version of the data exists in the destination, it can be retransformed without needing to sync data again. If you have Basic Normalization enabled, Airbyte automatically uses this JSON blob to create a schema and tables with your data in mind, converting it to the format of your destination. This runs after your sync and may take a long time if you have a large amount of data synced. If you don't enable Basic Normalization, you'll have to transform the JSON data from that column yourself. +:::note + +Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage. 
+ +::: + ## Example Basic Normalization uses a fixed set of rules to map a json object from a source to the types and format that are native to the destination. For example if a source emits data that looks like this: diff --git a/docs/using-airbyte/core-concepts/namespaces.md b/docs/using-airbyte/core-concepts/namespaces.md index 27f85873cb4c..2d767aca92f5 100644 --- a/docs/using-airbyte/core-concepts/namespaces.md +++ b/docs/using-airbyte/core-concepts/namespaces.md @@ -2,7 +2,7 @@ ## High-Level Overview -Namespaces allow you to organize and separate your data into groups. In most cases, namespaces are schemas in the database you're replicating to. +Namespaces are used to generally organize data, segregate tests and production data, and enforce permissions. In most cases, namespaces are schemas in the database you're replicating to. As a part of connection setup, you select where in the destination you want to write your data. Note: The default configuration is **Destination default**. @@ -16,10 +16,12 @@ Most of our destinations support this feature. To learn if your connector suppor ## What is a Namespace? -Technical systems often group their underlying data into namespaces with each namespace's data isolated from another namespace. This isolation allows for better organisation and flexibility, leading to better usability. +Systems often group their underlying data into namespaces with each namespace's data isolated from another namespace. This isolation allows for better organisation and flexibility, leading to better usability. An example of a namespace is the RDMS's `schema` concept. Some common use cases for schemas are enforcing permissions, segregating test and production data and general data organisation. +In a source, the namespace is the location from where the data is replicated to the destination. In a destination, the namespace is the location where the replicated data is stored in the destination. + Airbyte supports namespaces and allows Sources to define namespaces, and Destinations to write to various namespaces. In Airbyte, the following options are available and are set on each individual connection. ### Destination default diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md index 6345e43334ea..f7c8dd38b316 100644 --- a/docs/using-airbyte/core-concepts/readme.md +++ b/docs/using-airbyte/core-concepts/readme.md @@ -24,9 +24,10 @@ A connection is an automated data pipeline that replicates data from a source to |---------------------|---------------------------------------------------------------------------------------------------------------------| | Replication Frequency | When should a data sync be triggered? | | Destination Namespace and Stream Prefix | Where should the replicated data be written? | -| Catalog Selection | What data (streams and columns) should be replicated from the source to the destination? | | Sync Mode | How should the streams be replicated (read and written)? | | Schema Propagation | How should Airbyte handle schema drift in sources? | +| Catalog Selection | What data should be replicated from the source to the destination? | + ## Stream A stream is a group of related records. @@ -46,15 +47,15 @@ Examples of fields: - A column in the table in a relational database - A field in an API response -## Sync schedules +## Sync Schedule -Syncs will be triggered by either: +You have three options when scheduling a connection's sync to run: +- Scheduled (ie. 
every 24 hours, every 2 hours)
+- [CRON schedule](https://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html)
+- Manual \(i.e: clicking the "Sync Now" button in the UI or through the API\)
 
-- A manual request \(i.e: clicking the "Sync Now" button in the UI or through the API\)
-- A schedule
-- CRON schedule
 
-When a scheduled connection is first created, a sync is executed as soon as possible. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example, consider the following illustrative scenario:
+When a scheduled connection is first created, a sync is executed as soon as possible. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example:
 
 - **October 1st, 2pm**, a user sets up a connection to sync data every 24 hours.
 - **October 1st, 2:01pm**: sync job runs
@@ -63,38 +64,34 @@ When a scheduled connection is first created, a sync is executed as soon as poss
 - **October 3rd, 2:01pm:** since the last sync was less than 24 hours ago, no sync is run
 - **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run
 
-## Namespace
+## Destination Namespace
 
-Namespace is a method of grouping streams in a source or destination. Namespaces are used to generally organize data, segregate tests and production data, and enforce permissions. In a relational database system, this is known as a schema.
+A namespace defines where the data will be written in your destination. You can use the namespace to group streams in a source or destination. In a relational database system, this is typically known as a schema.
 
-In a source, the namespace is the location from where the data is replicated to the destination. In a destination, the namespace is the location where the replicated data is stored in the destination.
+For more details, see our [Namespace documentation](namespaces.md).
 
-Airbyte supports the following configuration options for a connection:
+## Sync Mode
 
- | Destination Namepsace | Description |
-| ---------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------- |
-| Destination default | All streams will be replicated to the single default namespace defined by the Destination. |
-| Mirror source structure | Some sources (for example, databases) provide namespace information for a stream. If a source provides namespace information, the destination will mirror the same namespace when this configuration is set. For sources or streams where the source namespace is not known, the behavior will default to the "Destination default" option. |
-| Custom format | All streams will be replicated to a single user-defined namespace. |
+A sync mode governs how Airbyte reads from a source and writes to a destination. Airbyte provides different sync modes depending on what you want to accomplish.
 
-For more details, see our [Namespace documentation](namespaces.md).
+Read more about each [sync mode](/using-airbyte/core-concepts/sync-modes) and how they differ.
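+As a concrete illustration, a configured stream pairs a source read mode with a destination write mode. Here is a sketch using the field names from the Airbyte protocol's configured catalog; the stream itself is invented:
+
+```js
+// "Incremental | Append + Deduped" expressed as a configured stream.
+const configuredStream = {
+  stream: { name: "orders", namespace: "public" },
+  sync_mode: "incremental",              // how the source is read
+  destination_sync_mode: "append_dedup", // how the destination is written
+  cursor_field: ["updated_at"],          // identifies which records are new
+  primary_key: [["id"]],                 // used to de-duplicate records
+};
+```
+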
-## Connection sync modes
+### Typing and Deduping
 
-A sync mode governs how Airbyte reads from a source and writes to a destination. Airbyte provides different sync modes to account for various use cases.
+Typing and deduping ensures the data emitted from sources is written into the correct type-cast relational columns and only contains unique records. Typing and deduping is only relevant for the following relational database & warehouse destinations:
 
-- **Full Refresh | Overwrite:** Sync all records from the source and replace data in destination by overwriting it each time.
-- **Full Refresh | Append:** Sync all records from the source and add them to the destination without deleting any data. This creates a historical copy of all records each sync.
-- **Incremental Sync | Append:** Sync new records from the source and add them to the destination without deleting any data. This enables efficient historical tracking over time of data.
-- **Incremental Sync | Append + Deduped:** Sync new records from the source and add them to the destination. Also provides a de-duplicated view mirroring the state of the stream in the source. This is the most common replication use case.
+- Snowflake
+- BigQuery
 
-Read more about each [sync mode](using-airbyte/core-concepts/sync-modes) and how they differ.
+:::info
+Typing and Deduping is the default method of transforming datasets within data warehouse and database destinations after they've been replicated. We are retaining documentation about normalization to support legacy destinations.
+:::
 
-## Normalization
+For more details, see our [Typing & Deduping documentation](/understanding-airbyte/typing-deduping).
 
-Normalization is the process of structuring data from the source into a format appropriate for consumption in the destination. For example, when writing data from a nested, dynamically typed source like a JSON API to a relational destination like Postgres, normalization is the process which un-nests JSON from the source into a relational table format which uses the appropriate column types in the destination.
+## Basic Normalization
 
-Note that normalization is only relevant for the following relational database & warehouse destinations:
+Basic Normalization transforms data after a sync to denest columns into their own tables. Note that normalization is only available for the following relational database & warehouse destinations:
 
 - Redshift
 - Postgres
@@ -102,41 +99,18 @@ Note that normalization is only relevant for the following relational database &
 - MySQL
 - MSSQL
 
-Other destinations do not support normalization as described in this section, though they may normalize data in a format that makes sense for them. For example, the S3 destination connector offers the option of writing JSON files in S3, but also offers the option of writing statically typed files such as Parquet or Avro.
-
-After a sync is complete, Airbyte normalizes the data. When setting up a connection, you can choose one of the following normalization options:
-
-- Raw data (no normalization): Airbyte places the JSON blob version of your data in a table called `_airbyte_raw_`
-- Basic Normalization: Airbyte converts the raw JSON blob version of your data to the format of your destination. _Note: Not all destinations support normalization._
-- [dbt Cloud integration](https://docs.airbyte.com/cloud/managing-airbyte-cloud/dbt-cloud-integration): Airbyte's dbt Cloud integration allows you to use dbt Cloud for transforming and cleaning your data during the normalization process.
-
-:::note
-
-Normalizing data may cause an increase in your destination's compute cost.
This cost will vary depending on the amount of data that is normalized and is not related to Airbyte credit usage.
 
-:::
-
-### Typing and Deduping
-
-As described by the [Airbyte Protocol from the Airbyte Specifications](/understanding-airbyte/airbyte-protocol.md), replication is composed of source connectors that are transmitting data in a JSON format. It is then written as such by the destination connectors. On top of this replication, Airbyte's database and data warehouse destinations can provide conversions from the raw JSON data into type-cast relational columns. Learn more [here](/understanding-airbyte/typing-deduping).
-
-Note that typing and deduping is only relevant for the following relational database & warehouse destinations:
-
-- Snowflake
-- BigQuery
-
-:::note
+## Custom Transformations
 
-Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage.
-
-:::
+Airbyte integrates natively with dbt to allow you to use dbt for post-sync transformations. This is useful if you would like to trigger dbt models after a sync successfully completes.
 
+For more details, see our [dbt integration documentation](/cloud/managing-airbyte-cloud/dbt-cloud-integration.md).
 
 ## Workspace
 
 A workspace is a grouping of sources, destinations, connections, and other configurations. It lets you collaborate with team members and share resources across your team under a shared billing account.
 
-When you [sign up](http://cloud.airbyte.com/signup) for Airbyte Cloud, we automatically create your first workspace where you are the only user with access. You can set up your sources and destinations to start syncing data and invite other users to join your workspace.
-
 ## Glossary of Terms
 
-You find and extended list of [Airbyte specific terms](https://glossary.airbyte.com/term/airbyte-glossary-of-terms/), [data engineering concepts](https://glossary.airbyte.com/term/data-engineering-concepts) or many [other data related terms](https://glossary.airbyte.com/).
+You can find an extended list of [Airbyte-specific terms](https://glossary.airbyte.com/term/airbyte-glossary-of-terms/), [data engineering concepts](https://glossary.airbyte.com/term/data-engineering-concepts), or many [other data-related terms](https://glossary.airbyte.com/).
diff --git a/docs/using-airbyte/core-concepts/typing-deduping.md b/docs/using-airbyte/core-concepts/typing-deduping.md
index 1ebfd060d2b0..1cd029e47a03 100644
--- a/docs/using-airbyte/core-concepts/typing-deduping.md
+++ b/docs/using-airbyte/core-concepts/typing-deduping.md
@@ -1,6 +1,6 @@
 # Typing and Deduping
 
-This page refers to new functionality added by [Destinations V2](/release_notes/upgrading_to_destinations_v2/). Typing and deduping is the default method of transforming datasets within data warehouse and database destinations after they've been replicated. Please check each destination to learn if Typing and deduping is supported.
+This page refers to new functionality added by [Destinations V2](/release_notes/upgrading_to_destinations_v2/). Typing and deduping is the default method of transforming datasets within data warehouse and database destinations after they've been replicated. Please check each destination to learn if Typing and Deduping is supported.
 
 ## What is Destinations V2?
@@ -11,6 +11,12 @@ This page refers to new functionality added by [Destinations V2](/release_notes/
 - Internal Airbyte tables in the `airbyte_internal` schema: Airbyte will now generate all raw tables in the `airbyte_internal` schema. We no longer clutter your desired schema with raw data tables.
 - Incremental delivery for large syncs: Data will be incrementally delivered to your final tables when possible. No more waiting hours to see the first rows in your destination table.
 
+:::note
+
+Typing and Deduping may cause an increase in your destination's compute cost. This cost will vary depending on the amount of data that is transformed and is not related to Airbyte credit usage.
+
+:::
+
 ## `_airbyte_meta` Errors
 
 "Per-row error handling" is a new paradigm for Airbyte which provides greater flexibility for our users. Airbyte now separates `data-moving problems` from `data-content problems`. Prior to Destinations V2, both types of errors were handled the same way: by failing the sync. Now, a failing sync means that Airbyte could not _move_ all of your data. You can query the `_airbyte_meta` column to see which rows failed for _content_ reasons, and why. This is a more flexible approach, as you can now decide how to handle rows with errors on a case-by-case basis.

From a84bb78a2184bc00704f40c2305614be4643e905 Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 19:57:55 +0100
Subject: [PATCH 38/52] Getting started readme

---
 docs/using-airbyte/getting-started/readme.md | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/docs/using-airbyte/getting-started/readme.md b/docs/using-airbyte/getting-started/readme.md
index 2d1d7d4feb27..ab860999e2fb 100644
--- a/docs/using-airbyte/getting-started/readme.md
+++ b/docs/using-airbyte/getting-started/readme.md
@@ -1,8 +1,10 @@
 # Getting Started
 
-Getting started with Airbyte takes only a few steps! This page guides you through the initial steps to get started.
+Getting started with Airbyte takes only a few steps! This page guides you through the initial steps to get started, and you'll learn how to set up your first connection on the following pages.
 
-## Sign Up for Airbyte (Cloud)
+You have two options to run Airbyte: Use **Airbyte Cloud** (recommended) or **self-host Airbyte** in your infrastructure.
+
+## Sign Up for Airbyte Cloud
 
 To use Airbyte Cloud, [sign up](https://cloud.airbyte.io/signup) with your email address, Google login, or GitHub login. Upon signing up, you'll be taken to your workspace, which lets you collaborate with team members and share resources across your team under a shared billing account.

From eea31f350077f70ac8bbfcf7a072ce47f5d58125 Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 20:01:35 +0100
Subject: [PATCH 39/52] Connector support level fix

---
 docs/integrations/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/integrations/README.md b/docs/integrations/README.md
index fef3f532394d..fe41578bacf5 100644
--- a/docs/integrations/README.md
+++ b/docs/integrations/README.md
@@ -10,7 +10,7 @@ Airbyte uses a two tiered system for connectors to help you understand what to e
 
 **Community**: A community connector is maintained by the Airbyte community until it becomes Certified. Airbyte has over 800 code contributors and 15,000 people in the Slack community to help. The Airbyte team is continually certifying Community connectors as usage grows.
As these connectors are not maintained by Airbyte, we do not offer support SLAs around them, and we encourage caution when using them in production.
 
-For more information about the system, see [Product Support Levels](https://docs.airbyte.com/project-overview/product-support-levels)
+For more information about the system, see [Connector Support Levels](./connector-support-levels.md).
 
 _[View the connector registries in full](https://connectors.airbyte.com/files/generated_reports/connector_registry_report.html)_

From 911b3363e4146d2aaf98044d506e45982e176dd0 Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 20:09:02 +0100
Subject: [PATCH 40/52] ConnectorRegistry changes

---
 docusaurus/src/components/ConnectorRegistry.jsx | 15 +++++++--------
 .../src/components/ConnectorRegistry.module.css |  6 ++++++
 2 files changed, 13 insertions(+), 8 deletions(-)
 create mode 100644 docusaurus/src/components/ConnectorRegistry.module.css

diff --git a/docusaurus/src/components/ConnectorRegistry.jsx b/docusaurus/src/components/ConnectorRegistry.jsx
index 3b81708e3192..d3548c350d34 100644
--- a/docusaurus/src/components/ConnectorRegistry.jsx
+++ b/docusaurus/src/components/ConnectorRegistry.jsx
@@ -1,6 +1,8 @@
 import React from "react";
 import { useEffect, useState } from "react";
 
+import styles from "./ConnectorRegistry.module.css";
+
 const registry_url =
   "https://connectors.airbyte.com/files/generated_reports/connector_registry_report.json";
 
@@ -46,7 +48,6 @@ export default function ConnectorRegistry({ type }) {
           <tr>
             <th>Connector Name</th>
-            <th>Icon</th>
             <th>Links</th>
             <th>Support Level</th>
             <th>OSS</th>
@@ -64,14 +65,12 @@ export default function ConnectorRegistry({ type }) {
         return (
           <tr key={`${connector.definitionId}`}>
-            <td>{connector.name_oss}</td>
+            <td className={styles.connectorName}>
+              {connector.iconUrl_oss && (
+                <img src={connector.iconUrl_oss} alt="" />
+              )}
+              {connector.name_oss}
+            </td>
-            <td>
-              {connector.iconUrl_oss ? (
-                <img src={connector.iconUrl_oss} alt="" />
-              ) : null}
-            </td>
             <td>
               {/* min width to prevent wrapping */}
diff --git a/docusaurus/src/components/ConnectorRegistry.module.css b/docusaurus/src/components/ConnectorRegistry.module.css
new file mode 100644
index 000000000000..e3d085db4932
--- /dev/null
+++ b/docusaurus/src/components/ConnectorRegistry.module.css
@@ -0,0 +1,6 @@
+.connectorName {
+  display: flex;
+  align-items: center;
+  gap: 4px;
+  font-weight: bold;
+}

From 20b977b8a374a7ba5ef1cb25f81b01702014179c Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 19:11:34 +0000
Subject: [PATCH 41/52] Update sync schedule

---
 .../configuring-connections.md                | 22 ++++---------------
 .../understand-airbyte-cloud-limits.md        |  2 --
 .../using-airbyte/core-concepts/namespaces.md |  2 +-
 docs/using-airbyte/core-concepts/readme.md    | 12 +++-------
 4 files changed, 8 insertions(+), 30 deletions(-)

diff --git a/docs/cloud/managing-airbyte-cloud/configuring-connections.md b/docs/cloud/managing-airbyte-cloud/configuring-connections.md
index 129fd366a48f..cfe255bc6471 100644
--- a/docs/cloud/managing-airbyte-cloud/configuring-connections.md
+++ b/docs/cloud/managing-airbyte-cloud/configuring-connections.md
@@ -8,7 +8,7 @@ Configuring the connection settings allows you to manage various aspects of the
 
 To configure these settings:
 
-1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Connections** and then click the connection you want to change.
+1. In the Airbyte UI, click **Connections** and then click the connection you want to change.
 
 2. Click the **Replication** tab.
 
@@ -24,25 +24,11 @@ You can configure the following settings:
 
 | Setting | Description |
 |--------------------------------------|-------------------------------------------------------------------------------------|
-| Replication frequency | How often the data syncs |
+| [Replication frequency](/using-airbyte/core-concepts/sync-schedules.md) | How often the data syncs |
 | [Destination namespace](/using-airbyte/core-concepts/namespaces.md) | Where the replicated data is written |
 | Destination stream prefix | How you identify streams from different connectors |
 | [Detect and propagate schema changes](/cloud/managing-airbyte-cloud/manage-schema-changes.md) | How Airbyte handles syncs when it detects schema changes in the source |
-| Connection Data Residency | Where data will be processed |
-
-To use [cron scheduling](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html):
-
-1. In the **Replication Frequency** dropdown, click **Cron**.
-
-2. Enter a cron expression and choose a time zone to create a sync schedule.
-
-:::note
-
-* Only one sync per connection can run at a time.
-* If a sync is scheduled to run before the previous sync finishes, the scheduled sync will start after the completion of the previous sync.
-* Reach out to [Sales](https://airbyte.com/company/talk-to-sales) if you require replication more frequently than once per hour.
-
-:::
+| [Connection Data Residency](/cloud/managing-airbyte-cloud/manage-data-residency.md) | Where data will be processed |
 
 ## Modify streams in your connection
 
@@ -74,7 +60,7 @@ Source-defined cursors and primary keys are selected automatically and cannot be
 
 3. Click on a stream to display the stream details panel. You'll see each column we detect from the source.
 
-4. Toggle individual fields or columns to include or exclude them in the sync, or use the toggle in the table header to select all fields at once.
+4.
Column selection is available to protect PII or sensitive data from being synced to the destination. Toggle individual fields to include or exclude them in the sync, or use the toggle in the table header to select all fields at once. :::info diff --git a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md index 7f9a0e97a1e9..47950e66f1bd 100644 --- a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md +++ b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md @@ -10,7 +10,5 @@ Understanding the following limitations will help you more effectively manage Ai * Max number of streams that can be returned by a source in a discover call: 1K * Max number of streams that can be configured to sync in a single connection: 1K * Size of a single record: 20MB -* Shortest sync schedule: Every 60 min (Reach out to [Sales](https://airbyte.com/company/talk-to-sales) if you require replication more frequently than once per hour) -* Schedule accuracy: +/- 30 min *Limits on workspaces, sources, and destinations do not apply to customers of [Powered by Airbyte](https://airbyte.com/solutions/powered-by-airbyte). To learn more [contact us](https://airbyte.com/talk-to-sales)! diff --git a/docs/using-airbyte/core-concepts/namespaces.md b/docs/using-airbyte/core-concepts/namespaces.md index 2d767aca92f5..31e092e0d862 100644 --- a/docs/using-airbyte/core-concepts/namespaces.md +++ b/docs/using-airbyte/core-concepts/namespaces.md @@ -2,7 +2,7 @@ ## High-Level Overview -Namespaces are used to generally organize data, segregate tests and production data, and enforce permissions. In most cases, namespaces are schemas in the database you're replicating to. +Namespaces are used to generally organize data, separate tests and production data, and enforce permissions. In most cases, namespaces are schemas in the database you're replicating to. As a part of connection setup, you select where in the destination you want to write your data. Note: The default configuration is **Destination default**. diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md index f7c8dd38b316..67dd7c63447d 100644 --- a/docs/using-airbyte/core-concepts/readme.md +++ b/docs/using-airbyte/core-concepts/readme.md @@ -49,20 +49,14 @@ Examples of fields: ## Sync Schedule -You have three options when scheduling a connection's sync to run: +There are three options for scheduling a sync to run: - Scheduled (ie. every 24 hours, every 2 hours) - [CRON schedule](https://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) - Manual \(i.e: clicking the "Sync Now" button in the UI or through the API\) - When a scheduled connection is first created, a sync is executed as soon as possible. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example: -- **October 1st, 2pm**, a user sets up a connection to sync data every 24 hours. -- **October 1st, 2:01pm**: sync job runs -- **October 2nd, 2:01pm:** 24 hours have passed since the last sync, so a sync is triggered. 
-- **October 2nd, 5pm**: The user manually triggers a sync from the UI -- **October 3rd, 2:01pm:** since the last sync was less than 24 hours ago, no sync is run -- **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run +For more details, see our [Sync Schedules documentation](sync-schedules.md). ## Destination Namespace @@ -76,7 +70,7 @@ A sync mode governs how Airbyte reads from a source and writes to a destination. Read more about each [sync mode](using-airbyte/core-concepts/sync-modes) and how they differ. -### Typing and Deduping +## Typing and Deduping Typing and deduping ensures the data emitted from sources is written into the correct type-cast relational columns and only contains unique records. Typing and deduping is only relevant for the following relational database & warehouse destinations: From 8e5d74baa07a46e0488aef3cb0d6777b2c172d59 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 20:11:45 +0100 Subject: [PATCH 42/52] Replace airbyte dashboard --- docs/cloud/managing-airbyte-cloud/configuring-connections.md | 2 +- .../manage-airbyte-cloud-notifications.md | 2 +- docs/cloud/managing-airbyte-cloud/manage-connection-state.md | 2 +- docs/cloud/managing-airbyte-cloud/manage-credits.md | 4 ++-- docs/cloud/managing-airbyte-cloud/manage-data-residency.md | 4 ++-- docs/cloud/managing-airbyte-cloud/manage-schema-changes.md | 2 +- 6 files changed, 8 insertions(+), 8 deletions(-) diff --git a/docs/cloud/managing-airbyte-cloud/configuring-connections.md b/docs/cloud/managing-airbyte-cloud/configuring-connections.md index cfe255bc6471..bc896004eb30 100644 --- a/docs/cloud/managing-airbyte-cloud/configuring-connections.md +++ b/docs/cloud/managing-airbyte-cloud/configuring-connections.md @@ -40,7 +40,7 @@ A connection's schema consists of one or many streams. Each stream is most commo To modify streams: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Connections** and then click the connection you want to change. +1. In the Airbyte UI, click **Connections** and then click the connection you want to change. 2. Click the **Replication** tab. diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md index 741a4f809232..1c7a68f8f7f9 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md +++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md @@ -21,7 +21,7 @@ This page provides guidance on how to manage notifications for Airbyte, allowing To set up email notifications: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings** and navigate to **Notifications**. +1. In the Airbyte UI, click **Settings** and navigate to **Notifications**. 2. Toggle which messages you'd like to receive from Airbyte. All email notifications will be sent by default to the creator of the workspace. To change the recipient, edit and save the **notification email recipient**. If you would like to send email notifications to more than one recipient, you can enter an email distribution list (ie Google Group) as the recipient. 
diff --git a/docs/cloud/managing-airbyte-cloud/manage-connection-state.md b/docs/cloud/managing-airbyte-cloud/manage-connection-state.md index 321c3753e7b8..23d25db6be99 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-connection-state.md +++ b/docs/cloud/managing-airbyte-cloud/manage-connection-state.md @@ -3,7 +3,7 @@ The connection state provides additional information about incremental syncs. It includes the most recent values for the global or stream-level cursors, which can aid in debugging or determining which data will be included in the next sync. To review the connection state: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Connections** and then click the connection you want to display. +1. In the Airbyte UI, click **Connections** and then click the connection you want to display. 2. Click the **Settings** tab on the Connection page. diff --git a/docs/cloud/managing-airbyte-cloud/manage-credits.md b/docs/cloud/managing-airbyte-cloud/manage-credits.md index d0f5839cc1eb..ed54d783d6ae 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-credits.md +++ b/docs/cloud/managing-airbyte-cloud/manage-credits.md @@ -8,7 +8,7 @@ Airbyte [credits](https://airbyte.com/pricing) are used to pay for Airbyte resou To buy credits: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Billing** in the navigation bar. +1. In the Airbyte UI, click **Billing** in the navigation bar. 2. If you are unsure of how many credits you need, use our [Cost Estimator](https://www.airbyte.com/pricing) or click **Talk to Sales** to find the right amount for your team. @@ -65,7 +65,7 @@ If you are enrolled and want to change your limits or cancel your enrollment, [e ## View invoice history -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Billing** in the navigation bar. +1. In the Airbyte UI, click **Billing** in the navigation bar. 2. Click **Invoice History**. You will be redirected to a Stripe portal. diff --git a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md index 167a7c1b0d87..384d18337bb5 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-data-residency.md +++ b/docs/cloud/managing-airbyte-cloud/manage-data-residency.md @@ -18,7 +18,7 @@ When you set the default data residency, it applies to new connections only. If To choose your default data residency: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Settings**. +1. In the Airbyte UI, click **Settings**. 2. Click **Data Residency**. @@ -37,7 +37,7 @@ You can choose the data residency for your connection in the connection settings To choose the data residency for your connection: -1. On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Connections** and then click the connection that you want to change. +1. In the Airbyte UI, click **Connections** and then click the connection that you want to change. 2. Click the **Settings** tab. diff --git a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md index 4e1190f733fc..d74f7a3ce492 100644 --- a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md +++ b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md @@ -74,7 +74,7 @@ In addition to Airbyte Cloud’s automatic schema change detection, you can manu To manually refresh the source schema: - 1. 
On the [Airbyte Cloud](http://cloud.airbyte.com) dashboard, click **Connections** and then click the connection you want to refresh.
+ 1. In the Airbyte UI, click **Connections** and then click the connection you want to refresh.
 
 2. Click the **Replication** tab.
 
From c0de83ca2d14a84c721ee2786d08562dd85ead72 Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 19:14:51 +0000
Subject: [PATCH 43/52] sync schedules

---
 .../core-concepts/sync-schedules.md           | 38 +++++++++++++++++++
 docusaurus/sidebars.js                        |  1 +
 2 files changed, 39 insertions(+)
 create mode 100644 docs/using-airbyte/core-concepts/sync-schedules.md

diff --git a/docs/using-airbyte/core-concepts/sync-schedules.md b/docs/using-airbyte/core-concepts/sync-schedules.md
new file mode 100644
index 000000000000..1a85ccd76c54
--- /dev/null
+++ b/docs/using-airbyte/core-concepts/sync-schedules.md
@@ -0,0 +1,38 @@
+# Sync Schedules
+
+For each connection, you can select between three options that allow a sync to run. The three options for `Replication Frequency` are:
+- Scheduled (ie. every 24 hours, every 2 hours)
+- [CRON scheduling](https://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html)
+- Manual
+
+## Sync Limitations
+
+* Only one sync per connection can run at a time.
+* If a sync is scheduled to run before the previous sync finishes, the scheduled sync will start after the completion of the previous sync.
+* Syncs can run at most every 60 minutes. Reach out to [Sales](https://airbyte.com/company/talk-to-sales) if you require replication more frequently than once per hour.
+
+## Scheduled Syncs
+When a scheduled connection is first created, a sync is executed immediately after creation. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example:
+
+- **October 1st, 2pm**, a user sets up a connection to sync data every 24 hours.
+- **October 1st, 2:01pm**: sync job runs
+- **October 2nd, 2:01pm:** 24 hours have passed since the last sync, so a sync is triggered.
+- **October 2nd, 5pm**: The user manually triggers a sync from the UI
+- **October 3rd, 2:01pm:** since the last sync was less than 24 hours ago, no sync is run
+- **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run
+
+## CRON Scheduling
+If you prefer more flexibility in scheduling your sync, you can also use CRON scheduling to set a precise time of day or month.
+
+Airbyte uses the CRON scheduler from [Quartz](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html). We recommend reading their [documentation](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) to learn more about how to format a cron expression.
+
+When setting up the cron expression, you will also be asked to choose a time zone the sync will run in.
+
+:::note
+For Scheduled or CRON scheduled syncs, Airbyte guarantees syncs will initiate with a schedule accuracy of +/- 30 minutes.
+:::
+
+## Manual Syncs
+When the connection is set to replicate with `Manual` frequency, the sync will not automatically run.
+
+It can be triggered by clicking the "Sync Now" button at any time through the UI or triggered through the API.
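+For API users, here is a minimal sketch of triggering a sync outside the UI. The endpoint and body follow Airbyte's public API (`POST /v1/jobs`); the token and connection ID are placeholders:
+
+```js
+// Trigger a manual sync for one connection via the Airbyte API.
+const response = await fetch("https://api.airbyte.com/v1/jobs", {
+  method: "POST",
+  headers: {
+    "Content-Type": "application/json",
+    Authorization: `Bearer ${process.env.AIRBYTE_API_TOKEN}`, // placeholder credential
+  },
+  body: JSON.stringify({
+    connectionId: "00000000-0000-0000-0000-000000000000", // placeholder connection ID
+    jobType: "sync",
+  }),
+});
+const job = await response.json(); // the created job's id, type, and status
+```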
\ No newline at end of file diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js index 9fe0d7363df1..b83cd1a10686 100644 --- a/docusaurus/sidebars.js +++ b/docusaurus/sidebars.js @@ -425,6 +425,7 @@ module.exports = { id: "using-airbyte/core-concepts/readme" }, items: [ + "using-airbyte/core-concepts/sync-schedules", "using-airbyte/core-concepts/namespaces", { type: "category", From 5247d13d4c1a79c6e0d43770c6ea70e4137059bf Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 19:16:24 +0000 Subject: [PATCH 44/52] remove text --- docs/using-airbyte/core-concepts/readme.md | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/docs/using-airbyte/core-concepts/readme.md b/docs/using-airbyte/core-concepts/readme.md index 67dd7c63447d..9d8e495a62d5 100644 --- a/docs/using-airbyte/core-concepts/readme.md +++ b/docs/using-airbyte/core-concepts/readme.md @@ -47,15 +47,13 @@ Examples of fields: - A column in the table in a relational database - A field in an API response -## Sync Schedule +## Sync Schedules There are three options for scheduling a sync to run: - Scheduled (ie. every 24 hours, every 2 hours) - [CRON schedule](https://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) - Manual \(i.e: clicking the "Sync Now" button in the UI or through the API\) -When a scheduled connection is first created, a sync is executed as soon as possible. After that, a sync is run once the time since the last sync \(whether it was triggered manually or due to a schedule\) has exceeded the schedule interval. For example: - For more details, see our [Sync Schedules documentation](sync-schedules.md). ## Destination Namespace From 4b2112d1b43a4e56e0fa3cf6c10c5433a6585461 Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 20:18:05 +0100 Subject: [PATCH 45/52] Minor sync schedule changes --- docs/using-airbyte/core-concepts/sync-schedules.md | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/docs/using-airbyte/core-concepts/sync-schedules.md b/docs/using-airbyte/core-concepts/sync-schedules.md index 1a85ccd76c54..bda6dc32d411 100644 --- a/docs/using-airbyte/core-concepts/sync-schedules.md +++ b/docs/using-airbyte/core-concepts/sync-schedules.md @@ -1,8 +1,9 @@ # Sync Schedules For each connection, you can select between three options that allow a sync to run. The three options for `Replication Frequency` are: + - Scheduled (ie. every 24 hours, every 2 hours) -- [CRON scheduling](https://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) +- Cron scheduling - Manual ## Sync Limitations @@ -21,7 +22,7 @@ When a scheduled connection is first created, a sync is executed immediately aft - **October 3rd, 2:01pm:** since the last sync was less than 24 hours ago, no sync is run - **October 3rd, 5:01pm:** It has been more than 24 hours since the last sync, so a sync is run -## CRON Scheduling +## Cron Scheduling If you prefer more flexibility in scheduling your sync, you can also use CRON scheduling to set a precise time of day or month. Airbyte uses the CRON scheduler from [Quartz](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html). 
We recommend reading their [documentation](http://www.quartz-scheduler.org/documentation/quartz-2.3.0/tutorials/crontrigger.html) to learn more about how to format a cron expression.
 
@@ -29,7 +30,7 @@ Airbyte uses the CRON scheduler from [Quartz](http://www.quartz-scheduler.org/do
 
 When setting up the cron expression, you will also be asked to choose a time zone the sync will run in.
 
 :::note
-For Scheduled or CRON scheduled syncs, Airbyte guarantees syncs will initiate with a schedule accuracy of +/- 30 minutes.
+For Scheduled or cron scheduled syncs, Airbyte guarantees syncs will initiate with a schedule accuracy of +/- 30 minutes.
 :::
 
 ## Manual Syncs

From c07ebfa9e8ac50579df46cbadedba8bc6ef83960 Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 20:19:43 +0100
Subject: [PATCH 46/52] Airbyte cloud dashboard removal

---
 docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md | 2 +-
 .../manage-airbyte-cloud-notifications.md | 2 +-
 docs/cloud/managing-airbyte-cloud/manage-schema-changes.md | 4 ++--
 docs/cloud/managing-airbyte-cloud/review-connection-status.md | 2 +-
 4 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md
index a68a5f1c3f48..777433ec72e3 100644
--- a/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md
+++ b/docs/cloud/managing-airbyte-cloud/dbt-cloud-integration.md
@@ -25,7 +25,7 @@ Generate a [service token](https://docs.getdbt.com/docs/dbt-cloud-apis/service-t
 
 To set up the dbt Cloud integration in Airbyte Cloud:
 
-1. On the Airbyte Cloud dashboard, click **Settings**.
+1. In the Airbyte UI, click **Settings**.
 
 2. Click **dbt Cloud integration**.
 
diff --git a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md
index 1c7a68f8f7f9..2b39a0bb1893 100644
--- a/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md
+++ b/docs/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications.md
@@ -79,7 +79,7 @@ You're done!
 
 To be notified of any source schema changes:
 1. Make sure you have enabled `Automatic Connection Updates` and `Connection Updates Requiring Action` notifications. If these are off, even if you turned on schema update notifications in a connection's settings, Airbyte will *NOT* send out any notifications related to these types of events.
 
-2. On the [Airbyte](http://cloud.airbyte.com/) dashboard, click **Connections** and select the connection you want to receive notifications for.
+2. In the Airbyte UI, click **Connections** and select the connection you want to receive notifications for.
 
 3. Click the **Settings** tab on the Connection page.
 
diff --git a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
index 4e1190f733fc..d74f7a3ce492 100644
--- a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
+++ b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
@@ -33,7 +33,7 @@ To re-enable the streams, ensure the correct **Primary Key** and **Cursor** are
 
 If the connection is set to **Ignore** any schema changes, Airbyte continues syncing according to your last saved schema. You need to manually approve any detected schema changes for the schema in the destination to change.
 
-1. On the [Airbyte Cloud](http://cloud.airbyte.com/) dashboard, click **Connections**. Select a connection and navigate to the **Replication** tab.
If schema changes are detected, you'll see a blue "i" icon next to the Replication ab.
+1. In the Airbyte UI, click **Connections**. Select a connection and navigate to the **Replication** tab. If schema changes are detected, you'll see a blue "i" icon next to the Replication tab.
 
 2. Click **Review changes**.
 
@@ -62,7 +62,7 @@ A major version upgrade will include a breaking change if any of these apply:
 | State Changes | The format of the source’s state has changed, and the full dataset will need to be re-synced |
 
 To review and fix breaking schema changes:
-1. On the [Airbyte Cloud](http://cloud.airbyte.com/) dashboard, click **Connections** and select the connection with breaking changes.
+1. In the Airbyte UI, click **Connections** and select the connection with breaking changes.
 
 2. Review the description of what has changed in the new version. The breaking change will require you to upgrade your source or destination to a new version by a specific cutoff date.
 
diff --git a/docs/cloud/managing-airbyte-cloud/review-connection-status.md b/docs/cloud/managing-airbyte-cloud/review-connection-status.md
index d9ee57020af7..80a296110854 100644
--- a/docs/cloud/managing-airbyte-cloud/review-connection-status.md
+++ b/docs/cloud/managing-airbyte-cloud/review-connection-status.md
@@ -2,7 +2,7 @@
 The connection status displays information about the connection and of each stream being synced. Reviewing this summary allows you to assess the connection's current status and understand when the next sync will be run. To review the connection status:
 
-1. On the [Airbyte Cloud](http://cloud.airbyte.com/) dashboard, click **Connections**.
+1. In the Airbyte UI, click **Connections**.
 
 2. Click a connection in the list to view its status.

From 781f9711a6b629b6844e50144b0eed4c8afbc7e9 Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 20:22:07 +0100
Subject: [PATCH 47/52] sidebar

---
 docusaurus/sidebars.js | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js
index b83cd1a10686..fe3cf1512e1b 100644
--- a/docusaurus/sidebars.js
+++ b/docusaurus/sidebars.js
@@ -448,8 +448,11 @@ module.exports = {
     {
       type: "category",
       label: "Configuring Connections",
+      link: {
+        type: "doc",
+        id: "cloud/managing-airbyte-cloud/configuring-connections"
+      },
       items: [
-        "cloud/managing-airbyte-cloud/configuring-connections",
         "cloud/managing-airbyte-cloud/manage-schema-changes",
         "cloud/managing-airbyte-cloud/manage-data-residency",
         "cloud/managing-airbyte-cloud/manage-connection-state",

From 48d484268ae7bf270752def0bb2403adccdd871d Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 19:24:30 +0000
Subject: [PATCH 48/52] Text updates

---
 .../managing-airbyte-cloud/manage-schema-changes.md | 11 +++++++----
 .../review-connection-status.md                     |  4 ++--
 2 files changed, 9 insertions(+), 6 deletions(-)

diff --git a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
index d74f7a3ce492..214065d413f6 100644
--- a/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
+++ b/docs/cloud/managing-airbyte-cloud/manage-schema-changes.md
@@ -4,6 +4,7 @@ You can specify for each connection how Airbyte should handle any change of sche
 
 Airbyte checks for any changes in your source schema immediately before syncing, at most once every 24 hours.
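+For API users, the same policy can be set per connection. A hedged sketch follows — the `nonBreakingSchemaUpdatesBehavior` field name is taken from Airbyte's public API, and the IDs and token are placeholders:
+
+```js
+// Configure how a connection reacts to detected (non-breaking) schema changes.
+const connectionId = "00000000-0000-0000-0000-000000000000"; // placeholder
+await fetch(`https://api.airbyte.com/v1/connections/${connectionId}`, {
+  method: "PATCH",
+  headers: {
+    "Content-Type": "application/json",
+    Authorization: `Bearer ${process.env.AIRBYTE_API_TOKEN}`, // placeholder credential
+  },
+  body: JSON.stringify({
+    // one of: "ignore", "disable_connection", "propagate_columns", "propagate_fully"
+    nonBreakingSchemaUpdatesBehavior: "propagate_columns",
+  }),
+});
+```
+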
+## Detect and Propagate Schema Changes
 Based on your configured settings for **Detect and propagate schema changes**, Airbyte will automatically sync those changes or ignore them:
 
 | Setting | Description |
@@ -13,6 +14,7 @@ Based on your configured settings for **Detect and propagate schema changes**, A
 | Ignore | Schema changes will be detected, but not propagated. Syncs will continue running with the schema you've set up. To propagate the detected schema changes, you will need to approve the changes manually |
 | Pause Connection | Connections will be automatically disabled as soon as any schema changes are detected |
 
+## Types of Schema Changes
 When propagation is enabled, your data in the destination will automatically shift to bring in the new changes.
 
 | Type of Schema Change | Propagation Behavior |
@@ -23,6 +25,10 @@ When propagation is enabled, your data in the destination will automatically shi
 | Removal of stream | The stream will stop updating, and any existing data in the destination will remain. |
 | Column data type changes | The data in the destination will remain the same. Any new or updated rows with incompatible data types will result in a row error in the raw Airbyte tables. You will need to refresh the schema and do a full resync to ensure the data types are consistent. |
 
+:::tip
+Ensure you receive webhook notifications for your connection by enabling `Schema update notifications` in the connection's settings.
+:::
+
 In all cases, if a breaking schema change is detected, the connection will be paused immediately for manual review to prevent future syncs from failing. Breaking schema changes occur when:
 * An existing primary key is removed from the source
 * An existing cursor is removed from the source
@@ -80,7 +86,4 @@ In addition to Airbyte Cloud’s automatic schema change detection, you can manua
 
 3. In the **Activate the streams you want to sync** table, click **Refresh source schema** to fetch the schema of your data source.
 
- 4. If there are changes to the schema, you can review them in the **Refreshed source schema** dialog.
-
-## Manage Schema Change Notifications
-[Refer to our notification documentation](manage-airbyte-cloud-notifications.md) to understand how to stay updated on any schema updates to your connections.
\ No newline at end of file
+ 4. If there are changes to the schema, you can review them in the **Refreshed source schema** dialog.
\ No newline at end of file
diff --git a/docs/cloud/managing-airbyte-cloud/review-connection-status.md b/docs/cloud/managing-airbyte-cloud/review-connection-status.md
index d9ee57020af7..0210d61cb3ae 100644
--- a/docs/cloud/managing-airbyte-cloud/review-connection-status.md
+++ b/docs/cloud/managing-airbyte-cloud/review-connection-status.md
@@ -2,9 +2,9 @@
 The connection status displays information about the connection and of each stream being synced. Reviewing this summary allows you to assess the connection's current status and understand when the next sync will be run. To review the connection status:
-1. In the Airbyte UI, click **Connections**.
+1. In the Airbyte UI, click **Connections**.
 
-2. Click a connection in the list to view its status.
+2. Click a connection in the list to view its status.
| Status | Description |
 |------------------|---------------------------------------------------------------------------------------------------------------------|

From 48d484268ae7bf270752def0bb2403adccdd871d Mon Sep 17 00:00:00 2001
From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com>
Date: Sun, 26 Nov 2023 19:28:47 +0000
Subject: [PATCH 49/52] update sidebar

---
 docs/using-airbyte/core-concepts/sync-modes/README.md | 4 ++--
 docusaurus/sidebars.js                                | 8 ++++----
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/using-airbyte/core-concepts/sync-modes/README.md b/docs/using-airbyte/core-concepts/sync-modes/README.md
index 5a6921910d5e..a561506a1f73 100644
--- a/docs/using-airbyte/core-concepts/sync-modes/README.md
+++ b/docs/using-airbyte/core-concepts/sync-modes/README.md
@@ -12,9 +12,9 @@ A sync mode governs how Airbyte reads from a source and writes to a destination.
 2. Append: Write by adding data to existing tables in the destination.
 3. Deduped History: Write by first adding data to existing tables in the destination to keep a history of changes. The final table is produced by de-duplicating the intermediate ones using a primary key.
 
-A sync mode is therefore, a combination of a source and destination mode together. The UI exposes the following options, whenever both source and destination connectors are capable to support it for the corresponding stream:
+A sync mode is a combination of a source mode and a destination mode. The UI exposes the following options whenever both the source and destination connectors support them for the corresponding stream:
 
+- [Incremental Append + Deduped](./incremental-append-deduped.md): Sync new records from stream and append data in destination, also provides a de-duplicated view mirroring the state of the stream in the source.
 - [Full Refresh Overwrite](./full-refresh-overwrite.md): Sync the whole stream and replace data in destination by overwriting it.
 - [Full Refresh Append](./full-refresh-append.md): Sync the whole stream and append data in destination.
 - [Incremental Append](./incremental-append.md): Sync new records from stream and append data in destination.
-- [Incremental Append + Deduped](./incremental-append-deduped.md): Sync new records from stream and append data in destination, also provides a de-duplicated view mirroring the state of the stream in the source.
\ No newline at end of file
diff --git a/docusaurus/sidebars.js b/docusaurus/sidebars.js
index fe3cf1512e1b..e42c14f55492 100644
--- a/docusaurus/sidebars.js
+++ b/docusaurus/sidebars.js
@@ -435,14 +435,14 @@ module.exports = {
         id: "using-airbyte/core-concepts/sync-modes/README"
       },
       items: [
-        "using-airbyte/core-concepts/sync-modes/full-refresh-overwrite",
-        "using-airbyte/core-concepts/sync-modes/full-refresh-append",
-        "using-airbyte/core-concepts/sync-modes/incremental-append",
         "using-airbyte/core-concepts/sync-modes/incremental-append-deduped",
+        "using-airbyte/core-concepts/sync-modes/incremental-append",
+        "using-airbyte/core-concepts/sync-modes/full-refresh-append",
+        "using-airbyte/core-concepts/sync-modes/full-refresh-overwrite",
       ],
     },
-    "using-airbyte/core-concepts/basic-normalization",
     "using-airbyte/core-concepts/typing-deduping",
+    "using-airbyte/core-concepts/basic-normalization",
   ],
 },
 {

From 75faa3f64c21feb885f46d64376e1d5eb816fad Mon Sep 17 00:00:00 2001
From: Tim Roes
Date: Sun, 26 Nov 2023 20:32:24 +0100
Subject: [PATCH 50/52] ie. to e.g.
--- docs/using-airbyte/core-concepts/sync-schedules.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/using-airbyte/core-concepts/sync-schedules.md b/docs/using-airbyte/core-concepts/sync-schedules.md index bda6dc32d411..a0d6c22fbee9 100644 --- a/docs/using-airbyte/core-concepts/sync-schedules.md +++ b/docs/using-airbyte/core-concepts/sync-schedules.md @@ -2,7 +2,7 @@ For each connection, you can select between three options that allow a sync to run. The three options for `Replication Frequency` are: -- Scheduled (ie. every 24 hours, every 2 hours) +- Scheduled (e.g. every 24 hours, every 2 hours) - Cron scheduling - Manual From 918bc7b8bfbebe5d27a65a38a37765fd6074168b Mon Sep 17 00:00:00 2001 From: Tim Roes Date: Sun, 26 Nov 2023 20:36:31 +0100 Subject: [PATCH 51/52] Adjust tech stack --- docs/understanding-airbyte/tech-stack.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/understanding-airbyte/tech-stack.md b/docs/understanding-airbyte/tech-stack.md index ba69157075e6..c829f8b7a81b 100644 --- a/docs/understanding-airbyte/tech-stack.md +++ b/docs/understanding-airbyte/tech-stack.md @@ -3,7 +3,7 @@ ## Airbyte Core Backend * [Java 17](https://jdk.java.net/archive/) -* Framework: [Jersey](https://eclipse-ee4j.github.io/jersey/) +* Framework: [Micronaut](https://micronaut.io/) * API: [OAS3](https://www.openapis.org/) * Databases: [PostgreSQL](https://www.postgresql.org/) * Unit & E2E testing: [JUnit 5](https://junit.org/junit5) @@ -18,7 +18,7 @@ Connectors can be written in any language. However the most common languages are ## **Frontend** -* [Node.js 16](https://nodejs.org/en/) +* [Node.js](https://nodejs.org/en/) * [TypeScript](https://www.typescriptlang.org/) * Web Framework/Library: [React](https://reactjs.org/) @@ -27,7 +27,7 @@ Connectors can be written in any language. However the most common languages are * CI/CD: [GitHub Actions](https://github.com/features/actions) * Containerization: [Docker](https://www.docker.com/) and [Docker Compose](https://docs.docker.com/compose/) * Linter \(Frontend\): [ESLint](https://eslint.org/) -* Formatter \(Frontend\): [Prettier](https://prettier.io/) +* Formatter \(Frontend & Backend\): [Prettier](https://prettier.io/) * Formatter \(Backend\): [Spotless](https://github.com/diffplug/spotless) ## FAQ From a2c99d624b55a642414c5434ceda336df9582443 Mon Sep 17 00:00:00 2001 From: Natalie Kwong <38087517+nataliekwong@users.noreply.github.com> Date: Sun, 26 Nov 2023 19:47:56 +0000 Subject: [PATCH 52/52] Limits --- .../review-connection-status.md | 19 +++++++++++++++---- .../understand-airbyte-cloud-limits.md | 2 -- docs/operator-guides/browsing-output-logs.md | 4 ++-- docs/operator-guides/reset.md | 2 +- 4 files changed, 18 insertions(+), 9 deletions(-) diff --git a/docs/cloud/managing-airbyte-cloud/review-connection-status.md b/docs/cloud/managing-airbyte-cloud/review-connection-status.md index 0210d61cb3ae..5970e3756f4b 100644 --- a/docs/cloud/managing-airbyte-cloud/review-connection-status.md +++ b/docs/cloud/managing-airbyte-cloud/review-connection-status.md @@ -13,10 +13,20 @@ To review the connection status: | Delayed | The connection has not loaded data within the scheduled replication frequency. For example, if the replication frequency is 1 hour, the connection has not loaded data for more than 1 hour | | Error | The connection has not loaded data in more than two times the scheduled replication frequency. 
For example, if the replication frequency is 1 hour, the connection has not loaded data for more than 2 hours |
 | Action Required | A breaking change related to the source or destination requires attention to resolve |
-| Pending | The connection has not been run yet, so no status exists |
-| Disabled | The connection has been disabled and is not scheduled to run |
 | In Progress | The connection is currently extracting or loading data |
+| Disabled | The connection has been disabled and is not scheduled to run |
+| Pending | The connection has not been run yet, so no status exists |
 
+If the most recent sync failed, you'll see the error message that will help diagnose if the failure is due to a source or destination configuration error. [Reach out](/community/getting-support.md) to us if you need any help to ensure your data continues syncing.
+
+:::info
+If a sync starts to fail, it will automatically be disabled after 100 consecutive failures or 14 consecutive days of failure.
+:::
+
+If a new major version of the connector has been released, you will also see a banner on this page indicating the cutoff date for the version. Airbyte recommends upgrading before the cutoff date to ensure your data continues syncing. If you do not upgrade before the cutoff date, Airbyte will automatically disable your connection.
+
+Learn more about version upgrades in our [resolving breaking change documentation](/cloud/managing-airbyte-cloud/manage-schema-changes#resolving-breaking-changes).
+
 ## Review the stream status
 
 The stream status allows you to monitor each stream's latest status. The stream will be highlighted with a grey pending bar to indicate the sync is actively extracting or loading data.
@@ -28,6 +38,7 @@ The stream status allows you to monitor each stream's latest status. The stream
 
 Each stream shows the last record loaded to the destination. Toggle the header to display the exact datetime the last record was loaded.
 
-You can reset an individual stream without resetting all streams in a connection by clicking the three grey dots next to any stream. It is recommended to start a new sync after a reset.
+You can [reset](/operator-guides/reset.md) an individual stream without resetting all streams in a connection by clicking the three grey dots next to any stream.
+
+You can also navigate directly to the stream's configuration by clicking the three grey dots next to any stream and selecting "Open details" to be redirected to the stream configuration.
 
-You can also navigate directly to the stream's configuration by click the three grey dots next to any stream and selecting "Open details" to be redirected to the stream configuration.
\ No newline at end of file
diff --git a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md
index 47950e66f1bd..47bc59ea6b19 100644
--- a/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md
+++ b/docs/cloud/managing-airbyte-cloud/understand-airbyte-cloud-limits.md
@@ -5,8 +5,6 @@ Understanding the following limitations will help you more effectively manage Ai
 * Max number of workspaces per user: 3*
 * Max number of instances of the same source connector: 10*
 * Max number of destinations in a workspace: 20*
-* Max number of consecutive sync failures before a connection is paused: 100
-* Max number of days with consecutive sync failures before a connection is paused: 14 days
 * Max number of streams that can be returned by a source in a discover call: 1K
 * Max number of streams that can be configured to sync in a single connection: 1K
 * Size of a single record: 20MB
diff --git a/docs/operator-guides/browsing-output-logs.md b/docs/operator-guides/browsing-output-logs.md
index 4e62843f160b..19de2cdcb6b6 100644
--- a/docs/operator-guides/browsing-output-logs.md
+++ b/docs/operator-guides/browsing-output-logs.md
@@ -10,8 +10,8 @@ When using Airbyte Open Source, you can also access additional logs outside of t
 
 To find the logs for a connection, navigate to a connection's `Job History` tab to see the latest syncs.
 
-## View the logs in-app
-To open the logs in the UI, select the three grey dots next to a sync and select `View logs`. This will open our in-app log viewer.
+## View the logs in the UI
+To open the logs in the UI, select the three grey dots next to a sync and select `View logs`. This will open our full-screen in-app log viewer.
 
 :::tip
 If you are troubleshooting a sync error, you can search for `Error`, `Exception`, or `Fail` to find common errors.
diff --git a/docs/operator-guides/reset.md b/docs/operator-guides/reset.md
index 2e1139feab6c..3fba28aa45a3 100644
--- a/docs/operator-guides/reset.md
+++ b/docs/operator-guides/reset.md
@@ -18,7 +18,7 @@ When a reset is successfully completed, all the records are deleted from your de
 
 If you are using destinations that are on the [Destinations v2](/release_notes/upgrading_to_destinations_v2.md) framework, only raw tables will be cleared of their data. Final tables will retain all records from the last sync.
 :::
 
-A reset **DOES NOT** delete any destination tables or files itself. The schema is retained but will not contain any rows.
+A reset **DOES NOT** delete any destination tables when using a data warehouse, data lake, or database. The schema is retained but will not contain any rows.
 
 :::tip
 If you have any orphaned tables or files that are no longer being synced to, they should be cleaned up separately, as Airbyte will not clean them up for you. This can occur when the `Destination Namespace` or `Stream Prefix` connection configuration is changed for an existing connection.
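+As with manual syncs, a reset can also be requested programmatically. A sketch assuming the same public job API used for syncs (`jobType: "reset"`; the token and connection ID are placeholders):
+
+```js
+// Request a reset job for a connection; records are cleared per the behavior above.
+await fetch("https://api.airbyte.com/v1/jobs", {
+  method: "POST",
+  headers: {
+    "Content-Type": "application/json",
+    Authorization: `Bearer ${process.env.AIRBYTE_API_TOKEN}`, // placeholder credential
+  },
+  body: JSON.stringify({
+    connectionId: "00000000-0000-0000-0000-000000000000", // placeholder connection ID
+    jobType: "reset",
+  }),
+});
+```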