Commit 9ae3852 (parent: 37c6ccc)

docs: Add the deploy command to the CLI utility reference

11 files changed (+87 −83 lines)

docs/pages/guides/recipes/queries/getting-unique-values-for-a-field.mdx (1 addition, 1 deletion)

@@ -92,7 +92,7 @@ In case we need to choose a dimension or render dropdowns for all dimensions, we
 can fetch the list of dimensions for all cubes from the
 [`/meta` endpoint](/reference/rest-api#v1meta):
 
-```bash{promptUser: user}
+```bash
 curl http://localhost:4000/cubejs-api/v1/meta
 ```
 
docs/pages/product/caching/running-in-production.mdx (1 addition, 1 deletion)

@@ -23,7 +23,7 @@ to run the following commands.
 
 You can run Cube Store with Docker with the following command:
 
-```bash{promptUser: user}
+```bash
 docker run -p 3030:3030 cubejs/cubestore
 ```
 

docs/pages/product/configuration/data-sources/databricks-jdbc.mdx (1 addition, 1 deletion)

@@ -45,7 +45,7 @@ RUN npm install
 
 You can then build and run the image using the following commands:
 
-```bash{promptUser: user}
+```bash
 docker build -t cube-jdk .
 docker run -it -p 4000:4000 --env-file=.env cube-jdk
 ```

docs/pages/product/configuration/visualization-tools/jupyter.mdx (1 addition, 1 deletion)

@@ -34,7 +34,7 @@ Jupyter connects to Cube as to a Postgres database.
 
 Make sure to install the `sqlalchemy` and `pandas` modules.
 
-```bash{promptUser: user}
+```bash
 pip install sqlalchemy
 pip install pandas
 ```

docs/pages/product/configuration/visualization-tools/streamlit.mdx (1 addition, 1 deletion)

@@ -33,7 +33,7 @@ Streamlit connects to Cube as to a Postgres database.
 
 Make sure to install the `streamlit`, `sqlalchemy` and `pandas` modules.
 
-```bash{promptUser: user}
+```bash
 pip install streamlit
 pip install sqlalchemy
 pip install pandas

docs/pages/product/deployment/cloud/continuous-deployment.mdx (2 additions, 2 deletions)

@@ -22,7 +22,7 @@ click <Btn>Generate Git credentials</Btn> to obtain Git credentials:
 The instructions to set up Cube Cloud as a Git remote are also available on the
 same screen:
 
-```bash{promptUser: user}
+```bash
 git config credential.helper store
 git remote add cubecloud <YOUR-CUBE-CLOUD-GIT-URL>
 git push cubecloud master
@@ -68,7 +68,7 @@ configuration files directly from your local project directory.
 You can obtain a Cube Cloud deploy token from your
 deployment's <Btn>Settings</Btn> screen.
 
-```bash{promptUser: user}
+```bash
 npx cubejs-cli deploy --token TOKEN
 ```
 

docs/pages/product/deployment/core.mdx (2 additions, 2 deletions)

@@ -310,7 +310,7 @@ models][ref-dynamic-data-models], build a custom Docker image.
 You can do this by creating a `Dockerfile` and a corresponding
 `.dockerignore` file:
 
-```bash{promptUser: user}
+```bash
 touch Dockerfile
 touch .dockerignore
 ```
@@ -339,7 +339,7 @@ npm-debug.log
 
 Then start the build process by running the following command:
 
-```bash{promptUser: user}
+```bash
 docker build -t <YOUR-USERNAME>/cube-custom-image .
 ```
 

docs/pages/product/getting-started/core/create-a-project.mdx (2 additions, 2 deletions)

@@ -13,7 +13,7 @@ data source, and generate data models.
 Start by opening your terminal to create a new folder for the project, then
 create a `docker-compose.yml` file within it:
 
-```bash{promptUser: user}
+```bash
 mkdir my-first-cube-project
 cd my-first-cube-project
 touch docker-compose.yml
@@ -51,7 +51,7 @@ If you're using Linux as the Docker host OS, you'll also need to add
 From the newly-created project directory, run the following command to start
 Cube:
 
-```bash{promptUser: user}
+```bash
 docker compose up -d
 ```
 

docs/pages/product/getting-started/migrate-from-core/upload-with-cli.mdx (1 addition, 1 deletion)

@@ -39,7 +39,7 @@ The next step is to upload your existing Cube project to the Cube Cloud.
 You can do it by running the following command from terminal in your Cube
 project directory.
 
-```bash{promptUser: user}
+```bash
 npx cubejs-cli deploy --token <TOKEN>
 ```
 

docs/pages/product/workspace/cli.mdx (8 additions, 4 deletions)

@@ -8,14 +8,16 @@ redirect_from:
 The Cube command line interface (CLI) is used for various Cube workflows. It
 could help you in areas such as:
 
-- Creating a new Cube service;
-- Generating a data model based on your database tables;
+- Creating a new Cube service.
+- Generating a data model based on your database tables.
+
+See the [CLI command reference][ref-ref-cli] for details on all commands.
 
 ## Quickstart
 
 Once installed, run the following command to create new Cube service
 
-```bash{promptUser: user}
+```bash
 npx cubejs-cli create <project name> -d <database type>
 ```
 
@@ -35,7 +37,7 @@ options:
 
 For example,
 
-```bash{promptUser: user}
+```bash
 npx cubejs-cli create hello-world -d postgres
 ```
 
@@ -52,5 +54,7 @@ specify the AWS access and secret keys with the [access necessary to run Athena
 queries][link-athena-access], and the target AWS region and [S3 output
 location][link-athena-output] where query results are stored.
 
+
 [link-athena-access]: https://docs.aws.amazon.com/athena/latest/ug/access.html
 [link-athena-output]: https://docs.aws.amazon.com/athena/latest/ug/querying.html
+[ref-ref-cli]: /reference/cli
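Aside from the cli.mdx rewrite, every hunk in this commit is the same mechanical edit: dropping the `{promptUser: user}` suffix from bash code fences. A sweep like that can be scripted; the sketch below is illustrative (it assumes GNU `sed` for in-place editing, and builds its own sample file rather than touching a real docs tree):

```shell
# Sketch: reproduce this commit's fence cleanup on a sample .mdx file.
# Assumes GNU sed (-i with no backup suffix); the path is illustrative.
mkdir -p docs/pages
printf '```bash{promptUser: user}\ncurl http://localhost:4000\n```\n' \
  > docs/pages/example.mdx

# Strip the `{promptUser: user}` suffix from every bash fence under docs/.
find docs -name '*.mdx' -print0 \
  | xargs -0 sed -i 's/^```bash{promptUser: user}$/```bash/'
```

After the sweep, each fence opens as plain ```` ```bash ````, matching the diffs above line for line.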
