updated docs [skip ci]
ddnexus committed May 12, 2021
1 parent 4c0de13 commit eb2df6d
Showing 3 changed files with 6 additions and 8 deletions.
2 changes: 1 addition & 1 deletion docs/api/javascript.md
@@ -16,7 +16,7 @@ If you use any of them you should follow this documentation, if not, consider th

### Basic principle

- All the `pagy*_js` helpers produce/render their component on the client side. The helper methods serve just a minimal HTML tag and a `JSON` tag that gets into the view. The javascript in the [pagy.js](https://github.com/ddnexus/pagy/blob/master/lib/javascripts/pagy.js) file takes care to read the data embedded in the `ISON` tag and make it work in the browser.
+ All the `pagy*_js` helpers render their component on the client side. The helper methods serve just a minimal HTML tag that contains a `data-pagy-json` attribute. The javascript in the [pagy.js](https://github.com/ddnexus/pagy/blob/master/lib/javascripts/pagy.js) file takes care to read the data embedded in the `data-pagy-json` attribute and make it work in the browser.
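As a rough illustration of the principle described in the new wording (the markup, attribute payload, and field names below are invented for this example — pagy's real output differs):

```javascript
// Hypothetical sketch of the pattern: the server emits a minimal tag
// carrying a data-pagy-json attribute, and client-side code parses that
// attribute to build the real component. The payload shape is made up.
const serverOutput =
  '<nav id="pagy-nav" data-pagy-json=\'["nav", {"page": 3, "pages": 10}]\'></nav>';

// In a browser this would be element.getAttribute("data-pagy-json");
// here we pull it out of the string to keep the sketch self-contained.
const attr = serverOutput.match(/data-pagy-json='([^']*)'/)[1];
const [helperName, data] = JSON.parse(attr);

console.log(helperName, data.page, data.pages); // nav 3 10
```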

## Usage

2 changes: 1 addition & 1 deletion docs/how-to.md
@@ -650,7 +650,7 @@ If your requirements allow to use the `countless` extra (minimal or automatic UI

## Preventing crawlers from following look-alike links

- The `*_js` helpers come with a JSON tag including a string that looks like an `a` link tag. It's just a placeholder string used by `pagy.js` in order to create actual DOM elements links, but some crawlers are reportedly following it even if it is not a DOM element. That causes server side errors reported in your log.
+ The `*_js` helpers come with a `data-pagy-json` attribute that includes an HTML encoded string that looks like an `a` link tag. It's just a placeholder string used by `pagy.js` in order to create actual DOM elements links, but some crawlers are reportedly following it even if it is not a DOM element. That causes server side errors reported in your log.
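A sketch of the placeholder mechanism (the `__pagy_page__` marker and the URL below are invented for this example, not pagy's actual format):

```javascript
// Hypothetical illustration: the attribute carries a link-shaped template
// string, and the script stamps out real links by substitution.
const linkTemplate = '<a href="/products?page=__pagy_page__">__pagy_page__</a>';

// Client-side, actual links are built by replacing the marker:
const links = [1, 2, 3].map(page =>
  linkTemplate.replace(/__pagy_page__/g, String(page))
);

console.log(links[2]); // <a href="/products?page=3">3</a>

// A naive crawler reading the raw attribute may instead request
// "/products?page=__pagy_page__" literally, producing server-side errors.
```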

You may want to prevent that by simply adding the following lines to your `robots.txt` file:

10 changes: 4 additions & 6 deletions pagy-on-docker/README.md
@@ -26,12 +26,10 @@ The pagy docker environment has been designed to be useful for developing:
You have a couple of alternatives:

1. (recommended) Permanently set a few environment variables for your user in your IDE or system (it will be easier in the future):
-
-   - the `GROUP` name (get it with `id -gn` in the terminal)
-   - if `echo $UID` returns nothing, then set the `UID` (get it with `id -u` in the terminal)
-   - if `echo $GID` returns nothing, then set the `GID` (get it with `id -g` in the terminal)
-
-   (Notice: you can also specify a few other variables used in the `docker-compose.yml` file.)
+   - the `GROUP` name (get it with `id -gn` in the terminal)
+   - if `echo $UID` returns nothing, then set the `UID` (get it with `id -u` in the terminal)
+   - if `echo $GID` returns nothing, then set the `GID` (get it with `id -g` in the terminal)
+   - (Notice: you can also specify a few other variables used in the `docker-compose.yml` file.)
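The steps in the list above can be sketched as a shell snippet (assuming a POSIX shell; the variable names are the ones the list mentions):

```shell
# Derive the variables that pagy-on-docker's docker-compose.yml reads
# from the current user. In bash, UID may already be set (and readonly),
# so only export a variable when the shell left it empty.
export GROUP="$(id -gn)"
if [ -z "${UID:-}" ]; then export UID="$(id -u)"; fi
if [ -z "${GID:-}" ]; then export GID="$(id -g)"; fi
echo "GROUP=$GROUP"
```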

```sh
cd pagy-on-docker
```
