rename: data-management to data-handling
* Remove id and slug from pages where they were not necessary

Signed-off-by: Mike Szczys <mike@golioth.io>
szczys authored and ChrisGammell committed Nov 28, 2023
1 parent 7deb245 commit 1bc914e
Showing 76 changed files with 25 additions and 47 deletions.
@@ -1,7 +1,5 @@
---
id: overview
title: LightDB State Overview
slug: /data-management/stored-data/lightdb-state
---

## What is LightDB?
@@ -1,7 +1,5 @@
---
id: overview
title: LightDB Stream Overview
slug: /data-management/stored-data/lightdb-stream
---

## What is LightDB Stream?
@@ -1,7 +1,5 @@
---
id: overview
title: Data Storage on Golioth
slug: /data-management/stored-data
---

This page describes two types of device data stored on Golioth:
@@ -1,7 +1,5 @@
---
id: overview
title: Webhooks Overview
slug: /data-management/output-streams/webhook
---

Webhooks are a simple and flexible way to receive events from the Golioth platform using HTTP. It is easy to build a web server to receive these events, and you can write your own logic to process them.
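Such a web server can be sketched with Python's standard library. This is a minimal illustration, not part of the Golioth API: the port, the handler names, and the assumption that Cloud Events metadata arrives in `Ce-` prefixed headers with a JSON body are spelled out below.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def cloud_event_metadata(headers):
    """Collect the Cloud Events metadata sent as `Ce-` prefixed HTTP headers."""
    return {k[3:].lower(): v for k, v in headers.items() if k.lower().startswith("ce-")}

class WebhookHandler(BaseHTTPRequestHandler):
    """Hypothetical receiver: log each event's metadata and payload, then ack."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")  # event payload is the body
        meta = cloud_event_metadata(dict(self.headers))
        print("event type:", meta.get("type"), "payload:", payload)
        self.send_response(200)  # acknowledge receipt
        self.end_headers()

# To serve (blocking): HTTPServer(("", 8080), WebhookHandler).serve_forever()
```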
@@ -17,9 +15,9 @@ For each Output Stream type, there is a set of specific attributes. Here are the

### Example

As mentioned on [Output Streams Overview](/data-management/output-streams), events are sent using the [Cloud Events](https://cloudevents.io) format. For Webhooks specifically, some of the event metadata is sent as HTTP headers.
As mentioned on [Output Streams Overview](/data-handling/output-streams), events are sent using the [Cloud Events](https://cloudevents.io) format. For Webhooks specifically, some of the event metadata is sent as HTTP headers.

Here is an example of an event arriving on a webhook. Headers prefixed with `Ce-` are related to Cloud Events and the message body is the event payload (see event payloads on [Output Streams Event Types](/data-management/output-streams/event-types/events)).
Here is an example of an event arriving on a webhook. Headers prefixed with `Ce-` are related to Cloud Events and the message body is the event payload (see event payloads on [Output Streams Event Types](/data-handling/output-streams/event-types/events)).

```
POST {your-uri-path} HTTP/1.1
```
@@ -1,7 +1,5 @@
---
id: overview
title: Azure Event Hub Overview
slug: /data-management/output-streams/azure-event-hub
---

Azure Event Hub is a fully managed, real-time data ingestion service that can be used to consume Golioth events in a simple, trusted, and scalable way. You can ingest data in multiple ways inside Azure with support for popular protocols, including AMQP, HTTPS, and Apache Kafka. You can also consume Azure Event Hub in a serverless manner using Azure Functions.
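Events arrive as Cloud Events with the Golioth payload under a `data` attribute, so a consumer's first step is usually to split payload from metadata. A minimal sketch follows; the sample event shape is a hypothetical illustration, not a captured Golioth message.

```python
import json

def split_event(body):
    """Separate a Cloud Events message body into (payload, metadata)."""
    event = json.loads(body)
    payload = event.pop("data", None)  # the Golioth event payload
    return payload, event              # what remains is Cloud Events metadata

# Hypothetical message body received from Event Hub:
sample = '{"specversion": "1.0", "type": "hypothetical.event", "data": {"temp": 21.5}}'
payload, meta = split_event(sample)
```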
@@ -33,9 +31,9 @@ To use this integration, you need to create an Event Hub on Azure and get the co

### Example

As mentioned on [Output Streams Overview](/data-management/output-streams), events are sent using the [Cloud Events](https://cloudevents.io) format. For Azure Event Hub, some of the event metadata is sent together with the message body.
As mentioned on [Output Streams Overview](/data-handling/output-streams), events are sent using the [Cloud Events](https://cloudevents.io) format. For Azure Event Hub, some of the event metadata is sent together with the message body.

Here is an example of an event arriving on Event Hub. The payload will be inside a `data` attribute (see event payloads on [Output Streams Event Types](/data-management/output-streams/event-types/events)). The other attributes are metadata related to Cloud Events.
Here is an example of an event arriving on Event Hub. The payload will be inside a `data` attribute (see event payloads on [Output Streams Event Types](/data-handling/output-streams/event-types/events)). The other attributes are metadata related to Cloud Events.

```json
{
```
@@ -1,7 +1,5 @@
---
id: overview
title: AWS SQS Overview
slug: /data-management/output-streams/aws-sqs
---

Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to receive events generated on the Golioth platform and process them in a decoupled and scalable way. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware systems. Data can be ingested using multiple solutions inside AWS, including serverless offerings like AWS Lambda.
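A consumer loop for such a queue can be sketched as below. The SQS client is passed in so the logic is testable offline; with the real service you would pass a `boto3` SQS client as noted in the trailing comment. The queue URL and payload field are hypothetical.

```python
import json

def drain_queue(sqs, queue_url):
    """Receive a batch of Golioth events, collect their payloads, and delete
    the handled messages. The client is injected so the logic is testable."""
    payloads = []
    resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=10)
    for msg in resp.get("Messages", []):
        event = json.loads(msg["Body"])
        payloads.append(event.get("data"))  # the Golioth payload sits under `data`
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
    return payloads

# With the real service (assumed setup): drain_queue(boto3.client("sqs"), queue_url)
```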
@@ -55,9 +53,9 @@ To use this integration, you need to create an SQS Queue and a user with per

### Example

As mentioned on the [Output Streams Overview](/data-management/output-streams), events are sent using the [Cloud Events](https://cloudevents.io) format. For AWS SQS, some of the event metadata is sent together with the message body.
As mentioned on the [Output Streams Overview](/data-handling/output-streams), events are sent using the [Cloud Events](https://cloudevents.io) format. For AWS SQS, some of the event metadata is sent together with the message body.

Here is an example of an event arriving on SQS. The payload will be inside a `data` attribute (see event payloads on [Output Streams Event Types](/data-management/output-streams/event-types/events)). The other attributes are metadata related to Cloud Events.
Here is an example of an event arriving on SQS. The payload will be inside a `data` attribute (see event payloads on [Output Streams Event Types](/data-handling/output-streams/event-types/events)). The other attributes are metadata related to Cloud Events.

```json
{
```
@@ -1,7 +1,5 @@
---
id: overview
title: Google Cloud Platform (GCP) PubSub Overview
slug: /data-management/output-streams/gcp-pubsub
---

[Google Cloud PubSub](https://cloud.google.com/pubsub/) works as a messaging middleware for traditional service integration or a simple communication medium for modern microservices. Events can be ingested with serverless environments like Cloud Functions, Cloud Run or custom environments on Google Kubernetes Engine or Compute Engine.
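A subscriber-side callback for this flow can be sketched as follows. The callback only assumes a message object with `data` bytes and an `ack()` method (the shape `google-cloud-pubsub` delivers); the subscription wiring in the comment and the payload location under `data` follow this page's Cloud Events description, but the names are illustrative.

```python
import json

def handle_message(message):
    """Pub/Sub-style callback: decode the Cloud Events body, keep the payload, ack."""
    event = json.loads(message.data.decode("utf-8"))
    message.ack()             # acknowledge so the message is not redelivered
    return event.get("data")  # the Golioth payload sits under `data`

# With google-cloud-pubsub (assumed): subscriber.subscribe(subscription_path, handle_message)
```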
@@ -108,7 +106,7 @@ Go to the [Golioth Platform](https://console.golioth.io), log in and select your Pr

Golioth Output Streams use the [Cloud Events](https://cloudevents.io) format. For GCP PubSub, this means some of the event metadata is sent together with the message body.

Here is an example of an event arriving on PubSub. The payload will be inside a `data` attribute (see event payloads on [Output Streams Event Types](/data-management/output-streams/event-types/events)). The other attributes are metadata related to Cloud Events.
Here is an example of an event arriving on PubSub. The payload will be inside a `data` attribute (see event payloads on [Output Streams Event Types](/data-handling/output-streams/event-types/events)). The other attributes are metadata related to Cloud Events.

```json
{
```
@@ -1,7 +1,5 @@
---
id: overview
title: Datacake Overview
slug: /data-management/output-streams/datacake
---

[Datacake](https://datacake.co/) is a multi-purpose, low-code IoT platform that requires no programming skills and minimal time to create custom IoT applications that can be brought into a white label IoT solution at the push of a button.
@@ -40,7 +38,7 @@ In this tutorial you will see how to:

Golioth Output Streams use the [Cloud Events](https://cloudevents.io) format. For Datacake, this means some of the event metadata is sent as HTTP headers.

Here is an example of an event arriving on Datacake. Headers prefixed with `Ce-` are related to Cloud Events and the message body is the event payload (see event payloads on [Output Streams Event Types](/data-management/output-streams/event-types/events)).
Here is an example of an event arriving on Datacake. Headers prefixed with `Ce-` are related to Cloud Events and the message body is the event payload (see event payloads on [Output Streams Event Types](/data-handling/output-streams/event-types/events)).


@@ -1,7 +1,5 @@
---
id: overview
title: Ubidots Overview
slug: /data-management/output-streams/ubidots
---

[Ubidots](https://ubidots.com/) makes it easy to white-label visualizations to produce a
@@ -46,7 +44,7 @@ In this tutorial you will see how to:

Golioth Output Streams use the [Cloud Events](https://cloudevents.io) format. For Ubidots, this means some of the event metadata is sent as HTTP headers.

Here is an example of an event arriving on Ubidots. Headers prefixed with `Ce-` are related to Cloud Events and the message body is the event payload (see event payloads on [Output Streams Event Types](/data-management/output-streams/event-types/events)).
Here is an example of an event arriving on Ubidots. Headers prefixed with `Ce-` are related to Cloud Events and the message body is the event payload (see event payloads on [Output Streams Event Types](/data-handling/output-streams/event-types/events)).

```
POST {your-uri-path} HTTP/1.1
```
@@ -1,7 +1,5 @@
---
id: overview
title: InfluxDB Overview
slug: /data-management/output-streams/influxdb
---

[InfluxDB Cloud](https://www.influxdata.com/products/influxdb-cloud/) is a
@@ -1,7 +1,7 @@
---
id: overview
title: Event Types Overview
slug: /data-management/output-streams/event-types
slug: event-types
---

The Golioth platform uses Cloud Events to communicate between services internally. With the Output Stream feature, we are making some of those events available for users to consume in their own applications.
@@ -1,7 +1,5 @@
---
id: overview
title: Output Streams Overview
slug: /data-management/output-streams
---

Output Streams are a feature of the Golioth platform that allows users to integrate their data seamlessly with a number of external services.
@@ -1,6 +1,6 @@
---
title: Golioth Data Management
slug: /data-management
title: Golioth Data Handling
slug: /data-handling
sidebar_position: 0
---

@@ -119,4 +119,4 @@ testing. We recommend using `coap` when first working on LightDB data storage
and retrieval. The interactive nature makes it easy to test your schema before
moving to embedded devices.

Examples of using `coap` with LightDB are [found in the Cloud documentation](/data-management/stored-data/lightdb-state/read-write-data).
Examples of using `coap` with LightDB are [found in the Cloud documentation](/data-handling/stored-data/lightdb-state/read-write-data).
2 changes: 1 addition & 1 deletion docs/firmware/zephyr-device-sdk/light-db-stream/README.md
@@ -7,4 +7,4 @@ LightDB Stream is a persistent database service hosted by Golioth. The LightDB S

![Console](../assets/lightDB-stream-svg-a4.svg)

Check out the [LightDB Stream](https://docs.golioth.io/data-management/lightdb-stream/) guide for a walkthrough of the sample demonstrating the firmware calls used to interact with the Golioth LightDB Stream service.
Check out the [LightDB Stream](https://docs.golioth.io/data-handling/lightdb-stream/) guide for a walkthrough of the sample demonstrating the firmware calls used to interact with the Golioth LightDB Stream service.
4 changes: 2 additions & 2 deletions docs/reference/2-protocols/1-coap/3-lightdb.md
@@ -3,11 +3,11 @@ id: lightdb
title: LightDB
---

[LightDB Device Service](/data-management/stored-data/lightdb-state) definitions over CoAP.
[LightDB Device Service](/data-handling/stored-data/lightdb-state) definitions over CoAP.

How to use guides:

- [Read Write Data](/data-management/stored-data/lightdb-state/read-write-data)
- [Read Write Data](/data-handling/stored-data/lightdb-state/read-write-data)

### Interface

4 changes: 2 additions & 2 deletions docs/reference/2-protocols/1-coap/4-lightdb-stream.md
@@ -3,11 +3,11 @@ id: lightdb-stream
title: LightDB Stream
---

[LightDB Stream Device Service](/data-management/stored-data/lightdb-stream) definitions over CoAP.
[LightDB Stream Device Service](/data-handling/stored-data/lightdb-stream) definitions over CoAP.

How to use guides:

- [Sending Data](/data-management/stored-data/lightdb-stream/sending-data)
- [Sending Data](/data-handling/stored-data/lightdb-stream/sending-data)

### Interface

2 changes: 1 addition & 1 deletion docs/reference/2-protocols/1-coap/7-limits.md
@@ -11,7 +11,7 @@ Golioth servers enforce the following limit on CoAP requests:

When passing frequent readings from a single device to Golioth (greater than 1
Hz) we recommend sending in batches. As noted for the [LightDB Stream
service](/data-management/stored-data/lightdb-stream/sending-data), your device
service](/data-handling/stored-data/lightdb-stream/sending-data), your device
can add timestamps to data packets by using `t`, `ts`, or `time` as the key. The
Golioth LightDB Stream service will use the timestamp for the database entry
instead of the time received.
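The batching advice above can be sketched as follows — a minimal illustration in which the `temp` field name and the sample Unix timestamps are hypothetical; only the `t` timestamp key comes from the text.

```python
import json

def batch_readings(readings):
    """Pack buffered (timestamp, value) samples into one payload; each entry
    carries its own `t` key so Golioth stores the sample time, not arrival time."""
    return json.dumps([{"t": int(ts), "temp": value} for ts, value in readings])

# Hypothetical samples buffered faster than 1 Hz, then sent as a single request:
batch = batch_readings([(1700000000, 21.5), (1700000001, 21.6)])
```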
2 changes: 1 addition & 1 deletion docs/reference/4-websocket/3-Endpoints/1-lightdb.md
@@ -3,7 +3,7 @@ id: lightdb
title: LightDB
---

[LightDB Device Service](/data-management/stored-data/lightdb-state) definitions.
[LightDB Device Service](/data-handling/stored-data/lightdb-state) definitions.

Real-time endpoint to listen to any changes in a device state path.

2 changes: 1 addition & 1 deletion docs/reference/4-websocket/3-Endpoints/2-lightdb-stream.md
@@ -3,7 +3,7 @@ id: lightdb-stream
title: LightDB Stream
---

[LightDB Stream Device Service](/data-management/stored-data/lightdb-stream) definitions.
[LightDB Stream Device Service](/data-handling/stored-data/lightdb-stream) definitions.

Real-time endpoint to listen to a device's data stream as it arrives at the Golioth Cloud.

6 changes: 3 additions & 3 deletions docusaurus.config.js
@@ -40,9 +40,9 @@ module.exports = {
position: "left",
},
{
to: "data-management",
activeBasePath: "data-management",
label: "Data Management",
to: "data-handling",
activeBasePath: "data-handling",
label: "Data Handling",
position: "left",
},
{
2 changes: 1 addition & 1 deletion sidebars.js
@@ -40,7 +40,7 @@ const devicemanagement = [
const datamanagement = [
{
type: 'autogenerated',
dirName: 'data-management',
dirName: 'data-handling',
},
]

