fix: typo (#1221)
Co-authored-by: Martin Adámek <banan23@gmail.com>
gschaer and B4nan committed Sep 24, 2024
1 parent 8b75e5c commit f5af26f
Showing 5 changed files with 10 additions and 10 deletions.
2 changes: 1 addition & 1 deletion sources/platform/actors/development/deployment/index.md
@@ -65,7 +65,7 @@ To deploy using other methods, first create the Actor manually through Apify CLI

![Actor source types](./images/actor-source-types.png)

You can link your anctor to a Git repository, Gist, or a Zip file.
You can link your Actor to a Git repository, Gist, or a Zip file.

For more information on alternative source types, check out next chapter.

6 changes: 3 additions & 3 deletions sources/platform/storage/dataset.md
@@ -17,7 +17,7 @@ import TabItem from '@theme/TabItem';

Dataset storage enables you to sequentially save and retrieve data. A unique dataset is automatically created and assigned to each Actor run when the first item is stored.

Typically, datasets comprises results from web scraping, crawling, and data processing jobs. You can visualize this data in a table, where each object is forming a row and its attributes are represented as columns. You have the option to export data in various formats, including JSON, CSV, XML, Excel, HTML Table, RSS or JSONL.
Typically, datasets comprise results from web scraping, crawling, and data processing jobs. You can visualize this data in a table, where each object is forming a row and its attributes are represented as columns. You have the option to export data in various formats, including JSON, CSV, XML, Excel, HTML Table, RSS or JSONL.
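
For a feel of how items typically land in a dataset, here is a minimal sketch using the JavaScript SDK's `Actor.pushData()` (the object shape is purely illustrative):

```js
import { Actor } from 'apify';

await Actor.init();

// Each pushed object becomes one row of the run's default dataset;
// its properties become the table columns described above.
await Actor.pushData({
    url: 'https://example.com',
    title: 'Example Domain',
});

await Actor.exit();
```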

> Named datasets are retained indefinitely. <br/>
> Unnamed datasets expire after 7 days unless otherwise specified. <br/> > [Learn more](usage.md#named-and-unnamed-storages)
@@ -57,7 +57,7 @@ If you are accessing your datasets using the `username~store-name` [store ID for

> When providing your API authentication token, we recommend using the request's `Authorization` header, rather than the URL. ([More info](../integrations/programming/api.md#authentication)).
To retrieve a list of you datasets, send a GET request to the [Get list of datasets](/api/v2#/reference/datasets/get-list-of-datasets) endpoint.
To retrieve a list of your datasets, send a GET request to the [Get list of datasets](/api/v2#/reference/datasets/get-list-of-datasets) endpoint.

```text
https://api.apify.com/v2/datasets
```

@@ -87,7 +87,7 @@ To retrieve the `hotel` and `cafe` fields, you would send your GET request to th

```text
https://api.apify.com/v2/datasets/{DATASET_ID}/items?format=json&fields=hotel%2Ccafe
```

> Use `%2C` instead of commas for URL encoding, as `%2C` represent a comma. For morn on URL encoding check out [this page](https://www.url-encode-decode.com)
> Use `%2C` instead of commas for URL encoding, as `%2C` represent a comma. For more on URL encoding check out [this page](https://www.url-encode-decode.com)
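
The same field selection can be done through the JavaScript API client; this is only a sketch, with the token and dataset ID as placeholders:

```js
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'MY_APIFY_TOKEN' });

// Keep only the `hotel` and `cafe` fields of each item,
// mirroring the `fields=hotel%2Ccafe` query parameter above.
const { items } = await client.dataset('DATASET_ID').listItems({
    fields: ['hotel', 'cafe'],
});

console.log(items);
```
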
To add data to a dataset, issue a POST request to the [Put items](/api/v2#/reference/datasets/item-collection/put-items) endpoint with the data as a JSON object payload.
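
As a rough sketch, the same write can be issued through the JavaScript API client, whose `pushItems()` wraps that POST request (token and dataset ID are placeholders):

```js
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: 'MY_APIFY_TOKEN' });

// Append two items to the dataset; each object becomes one dataset row.
await client.dataset('DATASET_ID').pushItems([
    { hotel: 'Grand Hotel', cafe: 'Corner Cafe' },
    { hotel: 'Sea View', cafe: 'Harbor Coffee' },
]);
```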

4 changes: 2 additions & 2 deletions sources/platform/storage/key_value_store.md
@@ -31,7 +31,7 @@ You can access key-value stores through several methods
- [Apify Console](https://console.apify.com) - provides an easy-to-understand interface.
- [Apify API](/api/v2#) - for accessing your key-value stores programmatically.
- [Apify API clients](/api) - to access your key-value stores from any Node.js/Python application.
- [Apify SDKs](/sdk) - when building your own JavaScript/Pyhton Actor.
- [Apify SDKs](/sdk) - when building your own JavaScript/Python Actor.

### Apify Console

@@ -248,7 +248,7 @@ Check out the [Python SDK documentation](/sdk/python/docs/concepts/storages#work

## Compression

Previously, when using the [Put record](/api/v2#/reference/key-value-stores/record/put-record) endpoint, every record was automatically compressed with Gzip before being uploaded. However, this process has been updated. _Now, record are stored exactly as you upload them._ This change means that it is up to you whether the record is stored compressed or uncompressed.
Previously, when using the [Put record](/api/v2#/reference/key-value-stores/record/put-record) endpoint, every record was automatically compressed with Gzip before being uploaded. However, this process has been updated. _Now, records are stored exactly as you upload them._ This change means that it is up to you whether the record is stored compressed or uncompressed.

You can compress a record and use the [Content-Encoding request header](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Encoding) to let our platform know which compression it uses. We recommend compressing large key-value records to save storage space and network traffic.
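
A possible sketch of such a compressed upload, assuming Node 18+ for the global `fetch` and calling the Put record endpoint directly (store ID, record key, and token are placeholders):

```js
import { gzipSync } from 'node:zlib';

const storeId = 'STORE_ID'; // placeholder
const recordKey = 'my-record'; // placeholder
const token = 'MY_APIFY_TOKEN'; // placeholder

// Gzip the record body and declare the compression via Content-Encoding.
const body = gzipSync(JSON.stringify({ foo: 'bar' }));

const response = await fetch(
    `https://api.apify.com/v2/key-value-stores/${storeId}/records/${recordKey}?token=${token}`,
    {
        method: 'PUT',
        headers: {
            'Content-Type': 'application/json; charset=utf-8',
            'Content-Encoding': 'gzip',
        },
        body,
    },
);

// Expect a 2xx status when the record is stored.
console.log(response.status);
```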

4 changes: 2 additions & 2 deletions sources/platform/storage/request_queue.md
@@ -15,7 +15,7 @@ import TabItem from '@theme/TabItem';

Request queues enable you to enqueue and retrieve requests such as URLs with an [HTTP method](https://developer.mozilla.org/en-US/docs/Web/HTTP/Methods) and other parameters. They prove essential not only in web crawling scenarios but also in any situation requiring the management of a large number of URLs and the addition of new links.

The storage system for request queues accomoodates both breadth-first and depth-first crawling strategies, along with the inclusion of custom data attributes. This system enables you to check if certain URLs have already been encountered, add new URLs to the queue, and retrieve the next set of URLs for processing.
The storage system for request queues accommodates both breadth-first and depth-first crawling strategies, along with the inclusion of custom data attributes. This system enables you to check if certain URLs have already been encountered, add new URLs to the queue, and retrieve the next set of URLs for processing.
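
To make these operations concrete, here is a minimal sketch with the JavaScript SDK (the URL is just an example):

```js
import { Actor } from 'apify';

await Actor.init();

// Open the run's default request queue (pass a name to open a named queue).
const queue = await Actor.openRequestQueue();

// Enqueue a URL; requests that were already seen are not added again.
await queue.addRequest({ url: 'https://example.com', method: 'GET' });

// Fetch the next pending request and mark it handled once processed.
const request = await queue.fetchNextRequest();
if (request) {
    // ... crawl or otherwise process the request here ...
    await queue.markRequestHandled(request);
}

await Actor.exit();
```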

> Named request queues are retained indefinitely. <br/>
> Unnamed request queues expire after 7 days unless otherwise specified.<br/> > [Learn more](./index.md#named-and-unnamed-storages)
@@ -36,7 +36,7 @@ In the [Apify Console](https://console.apify.com), you can view your request que
![Request queues in app](./images/request-queue-app.png)

To view a request queue, click on its **Queue ID**.
Under the **Actions** menu, you can rename your queue's name (and, in turn, its
Under the **Actions** menu, you can rename your queue (and, in turn, its
[retention period](./usage#named-and-unnamed-storages)) and [access rights](../collaboration/index.md) using the **Share** button.
Click on the **API** button to view and test a queue's [API endpoints](/api/v2#/reference/request-queues).

4 changes: 2 additions & 2 deletions sources/platform/storage/usage.md
@@ -83,7 +83,7 @@ You can visit [API Clients](/api) documentations for more information.

### Apify SDKs

The Apify SDKs are libraries in JavaScript or Python that proviede tools for building your own Actors.<br />
The Apify SDKs are libraries in JavaScript or Python that provide tools for building your own Actors.<br />

* JavaScript SDK requires [Node.js](https://nodejs.org/en/) 16 or later.
* Python SDK requires [Python](https://www.python.org/downloads/release/python-380/) 3.8 or above.
@@ -99,7 +99,7 @@ All API endpoints limit their rate of requests to protect Apify servers from ove
[delete](/api/v2#/reference/request-queues/request-collection/delete-request))
operations of _request queue_ requests.

If a client exceeds this limit, the API endpoints responds with the HTTP status code `429 Too Many Requests` and the following body:
If a client exceeds this limit, the API endpoints respond with the HTTP status code `429 Too Many Requests` and the following body:

```json
{
