Commit 4dc9bbe (1 parent: 8881bef)

v0.1.0 node client sdk sync (#3)

* initial sync
* update license & add examples
* example inference script
* readme
* add rag
* stainless -> meta-llama
* move scripts to test
* add agents tests
* node -> typescript/javascript

File tree: 134 files changed (+9,426 / -4,759 lines)


.eslintrc.js

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+module.exports = {
+  parser: '@typescript-eslint/parser',
+  plugins: ['@typescript-eslint', 'unused-imports', 'prettier'],
+  rules: {
+    'no-unused-vars': 'off',
+    'prettier/prettier': 'error',
+    'unused-imports/no-unused-imports': 'error',
+  },
+  root: true,
+};

.gitignore

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
+.prism.log
+node_modules
+yarn-error.log
+codegen.log
+Brewfile.lock.json
+dist
+dist-deno
+/*.tgz
+.idea/

.prettierignore

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+CHANGELOG.md
+/ecosystem-tests/*/**
+/node_modules
+/deno
+
+# don't format tsc output, will break source maps
+/dist

.prettierrc.json

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+{
+  "arrowParens": "always",
+  "experimentalTernaries": true,
+  "printWidth": 110,
+  "singleQuote": true,
+  "trailingComma": "all"
+}

.stats.yml

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+configured_endpoints: 74
+openapi_spec_url: https://github.com/meta-llama/llama-stack/blob/main/docs/resources/llama-stack-spec.yaml

CONTRIBUTING.md

Lines changed: 27 additions & 41 deletions
@@ -1,13 +1,13 @@
 ## Setting up the environment

-This repository uses [`yarn@v1`](https://classic.yarnpkg.com/lang/en/docs/install/#mac-stable).
+This repository uses [`yarn@v1`](https://classic.yarnpkg.com/lang/en/docs/install).
 Other package managers may work but are not officially supported for development.

 To set up the repository, run:

-```bash
-yarn
-yarn build
+```sh
+$ yarn
+$ yarn build
 ```

 This will install all the required dependencies and build output files to `dist/`.
@@ -22,17 +22,17 @@ modify the contents of the `src/lib/` and `examples/` directories.

 All files in the `examples/` directory are not modified by the generator and can be freely edited or added to.

-```bash
+```ts
 // add an example to examples/<your-example>.ts

 #!/usr/bin/env -S npm run tsn -T

 ```

-```
-chmod +x examples/<your-example>.ts
+```sh
+$ chmod +x examples/<your-example>.ts
 # run the example against your api
-yarn tsn -T examples/<your-example>.ts
+$ yarn tsn -T examples/<your-example>.ts
 ```

 ## Using the repository from source
@@ -41,38 +41,38 @@ If you’d like to use the repository from source, you can either install from g

 To install via git:

-```bash
-npm install git+ssh://git@github.com:stainless-sdks/llama-stack-node.git
+```sh
+$ npm install git+ssh://git@github.com:stainless-sdks/llama-stack-node.git
 ```

 Alternatively, to link a local copy of the repo:

-```bash
+```sh
 # Clone
-git clone https://www.github.com/stainless-sdks/llama-stack-node
-cd llama-stack-node
+$ git clone https://www.github.com/stainless-sdks/llama-stack-node
+$ cd llama-stack-node

 # With yarn
-yarn link
-cd ../my-package
-yarn link llama-stack-client
+$ yarn link
+$ cd ../my-package
+$ yarn link llama-stack-client

 # With pnpm
-pnpm link --global
-cd ../my-package
-pnpm link --global llama-stack-client
+$ pnpm link --global
+$ cd ../my-package
+$ pnpm link --global llama-stack-client
 ```

 ## Running tests

 Most tests require you to [set up a mock server](https://github.com/stoplightio/prism) against the OpenAPI spec to run the tests.

-```bash
-npx prism mock path/to/your/openapi.yml
+```sh
+$ npx prism mock path/to/your/openapi.yml
 ```

-```bash
-yarn run test
+```sh
+$ yarn run test
 ```

 ## Linting and formatting
@@ -82,26 +82,12 @@ This repository uses [prettier](https://www.npmjs.com/package/prettier) and

 To lint:

-```bash
-yarn lint
+```sh
+$ yarn lint
 ```

 To format and fix all lint issues automatically:

-```bash
-yarn fix
+```sh
+$ yarn fix
 ```
-
-## Publishing and releases
-
-Changes made to this repository via the automated release PR pipeline should publish to npm automatically. If
-the changes aren't made through the automated pipeline, you may want to make releases manually.
-
-### Publish with a GitHub workflow
-
-You can release to package managers by using [the `Publish NPM` GitHub action](https://www.github.com/stainless-sdks/llama-stack-node/actions/workflows/publish-npm.yml). This requires a setup organization or repository secret to be set up.
-
-### Publish manually
-
-If you need to manually release a package, you can run the `bin/publish-npm` script with an `NPM_TOKEN` set on
-the environment.

LICENSE

Lines changed: 17 additions & 16 deletions
@@ -1,21 +1,22 @@
 MIT License

-Copyright (c) 2024 Meta Platforms, Inc. and affiliates
+Copyright (c) Meta Platforms, Inc. and affiliates

-Permission is hereby granted, free of charge, to any person obtaining a copy
-of this software and associated documentation files (the "Software"), to deal
-in the Software without restriction, including without limitation the rights
-to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-copies of the Software, and to permit persons to whom the Software is
-furnished to do so, subject to the following conditions:
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+"Software"), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:

-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.

-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-SOFTWARE.
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

README.md

Lines changed: 59 additions & 25 deletions
@@ -1,10 +1,10 @@
-# Llama Stack Client Node API Library
+# Llama Stack Client TypeScript and JavaScript API Library

 [![NPM version](https://img.shields.io/npm/v/llama-stack-client.svg)](https://npmjs.org/package/llama-stack-client) ![npm bundle size](https://img.shields.io/bundlephobia/minzip/llama-stack-client) [![Discord](https://img.shields.io/discord/1257833999603335178)](https://discord.gg/llama-stack)

 This library provides convenient access to the Llama Stack Client REST API from server-side TypeScript or JavaScript.

-The REST API documentation can be found on [llama-stack](https://github.com/meta-llama/llama-stack/blob/main/docs/resources/llama-stack-spec.html). The full API of this library can be found in [api.md](api.md).
+The REST API documentation can be found on [https://llama-stack.readthedocs.io/en/latest/references/api_reference/index.html](https://llama-stack.readthedocs.io/en/latest/references/api_reference/index.html). The full API of this library can be found in [api.md](api.md).

 It is generated with [Stainless](https://www.stainlessapi.com/).

@@ -14,7 +14,6 @@ It is generated with [Stainless](https://www.stainlessapi.com/).
 npm install llama-stack-client
 ```

-
 ## Usage

 The full API of this library can be found in [api.md](api.md).
@@ -24,18 +23,40 @@ The full API of this library can be found in [api.md](api.md).
 import LlamaStackClient from 'llama-stack-client';

 const client = new LlamaStackClient({
-  environment: 'sandbox', // defaults to 'production'
+  baseURL: 'http://localhost:8321'
 });

 async function main() {
-  const session = await client.agents.sessions.create({ agent_id: 'agent_id', session_name: 'session_name' });
+  const models = await client.models.list();

-  console.log(session.session_id);
+  console.log(models);
 }

 main();
 ```

+## Streaming responses
+
+We provide support for streaming responses using Server Sent Events (SSE).
+
+```ts
+import LlamaStackClient from 'llama-stack-client';
+
+const client = new LlamaStackClient();
+
+const stream = await client.inference.chatCompletion({
+  messages: [{ content: 'string', role: 'user' }],
+  model_id: 'meta-llama/Llama-3.2-3B-Instruct',
+  stream: true,
+});
+for await (const inferenceChatCompletionResponse of stream) {
+  process.stdout.write(inferenceChatCompletionResponse.event.delta.text || '');
+}
+```
+
+If you need to cancel a stream, you can `break` from the loop
+or call `stream.controller.abort()`.
+
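As an aside from the diff itself: the early-cancellation behaviour described above relies on how `for await` interacts with any async iterable. A minimal, self-contained sketch (using a stand-in generator rather than the real SDK stream object) shows that `break` ends consumption early:

```typescript
// Stand-in for an SDK stream: any async iterable works with `for await`.
async function* fakeStream(): AsyncGenerator<string> {
  for (const chunk of ['Hello', ', ', 'world', '!']) {
    yield chunk;
  }
}

// Consume at most `limit` chunks, then `break`. Breaking out of the loop
// also triggers the generator's cleanup, just as it ends an SDK stream.
async function takeChunks(limit: number): Promise<string[]> {
  const received: string[] = [];
  for await (const chunk of fakeStream()) {
    received.push(chunk);
    if (received.length >= limit) break;
  }
  return received;
}

takeChunks(2).then((chunks) => console.log(chunks.join(''))); // prints "Hello, "
```

The same pattern applies to the SDK stream above: once the loop exits, no further events are read.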
 ### Request & Response types

 This library includes TypeScript definitions for all request params and response fields. You may import and use them like so:
@@ -44,16 +65,16 @@ This library includes TypeScript definitions for all request params and response
 ```ts
 import LlamaStackClient from 'llama-stack-client';

-const client = new LlamaStackClient({
-  environment: 'sandbox', // defaults to 'production'
-});
+const client = new LlamaStackClient();

 async function main() {
-  const params: LlamaStackClient.Agents.SessionCreateParams = {
-    agent_id: 'agent_id',
-    session_name: 'session_name',
+  const params: LlamaStackClient.InferenceChatCompletionParams = {
+    messages: [{ content: 'string', role: 'user' }],
+    model_id: 'model_id',
   };
-  const session: LlamaStackClient.Agents.SessionCreateResponse = await client.agents.sessions.create(params);
+  const response: LlamaStackClient.InferenceChatCompletionResponse = await client.inference.chatCompletion(
+    params,
+  );
 }

 main();
@@ -70,8 +91,8 @@ a subclass of `APIError` will be thrown:
 <!-- prettier-ignore -->
 ```ts
 async function main() {
-  const session = await client.agents.sessions
-    .create({ agent_id: 'agent_id', session_name: 'session_name' })
+  const response = await client.inference
+    .chatCompletion({ messages: [{ content: 'string', role: 'user' }], model_id: 'model_id' })
     .catch(async (err) => {
       if (err instanceof LlamaStackClient.APIError) {
         console.log(err.status); // 400
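For illustration only: the general pattern behind an `APIError` hierarchy is a small mapping from HTTP status codes to error subclasses. The classes below are hypothetical stand-ins, not the SDK's actual exports:

```typescript
// Hypothetical sketch of status-to-error mapping; the real SDK defines
// its own APIError hierarchy.
class APIError extends Error {
  constructor(public status: number, message: string) {
    super(message);
  }
}
class BadRequestError extends APIError {}
class RateLimitError extends APIError {}

function errorForStatus(status: number, message: string): APIError {
  switch (status) {
    case 400:
      return new BadRequestError(status, message);
    case 429:
      return new RateLimitError(status, message);
    default:
      return new APIError(status, message);
  }
}

console.log(errorForStatus(429, 'slow down') instanceof RateLimitError); // prints true
```

Because every subclass extends a common base, callers can catch broadly (`instanceof APIError`) or narrowly, as in the README example above.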
@@ -115,7 +136,7 @@ const client = new LlamaStackClient({
 });

 // Or, configure per-request:
-await client.agents.sessions.create({ agent_id: 'agent_id', session_name: 'session_name' }, {
+await client.inference.chatCompletion({ messages: [{ content: 'string', role: 'user' }], model_id: 'model_id' }, {
   maxRetries: 5,
 });
 ```
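The retry logic itself lives inside the SDK. As a rough, self-contained illustration of the idea (not the SDK's actual implementation), a retry wrapper with exponential backoff might look like:

```typescript
// Illustrative retry helper: retries a failing async operation with
// exponentially growing delays (base, 2x base, 4x base, ...).
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries: number,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break;
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Usage: succeeds on the third attempt.
let calls = 0;
withRetries(async () => {
  calls += 1;
  if (calls < 3) throw new Error('flaky');
  return 'ok';
}, 5, 1).then((result) => console.log(result)); // prints "ok"
```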
@@ -132,7 +153,7 @@ const client = new LlamaStackClient({
 });

 // Override per-request:
-await client.agents.sessions.create({ agent_id: 'agent_id', session_name: 'session_name' }, {
+await client.inference.chatCompletion({ messages: [{ content: 'string', role: 'user' }], model_id: 'model_id' }, {
   timeout: 5 * 1000,
 });
 ```
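Again purely illustrative (the SDK implements its own timeout handling internally): a per-request `timeout` amounts to racing the request promise against a timer, roughly like this:

```typescript
// Illustrative timeout wrapper: reject if `promise` does not settle
// within `ms` milliseconds; otherwise pass its result through.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms);
    promise.then(
      (value) => {
        clearTimeout(timer);
        resolve(value);
      },
      (err) => {
        clearTimeout(timer);
        reject(err);
      },
    );
  });
}

// A fast promise resolves normally; one that never settles times out.
withTimeout(Promise.resolve('fast'), 1000).then((v) => console.log(v)); // prints "fast"
withTimeout(new Promise(() => {}), 50).catch((err) => console.log(err.message));
```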
@@ -153,17 +174,17 @@ You can also use the `.withResponse()` method to get the raw `Response` along wi
 ```ts
 const client = new LlamaStackClient();

-const response = await client.agents.sessions
-  .create({ agent_id: 'agent_id', session_name: 'session_name' })
+const response = await client.inference
+  .chatCompletion({ messages: [{ content: 'string', role: 'user' }], model_id: 'model_id' })
   .asResponse();
 console.log(response.headers.get('X-My-Header'));
 console.log(response.statusText); // access the underlying Response object

-const { data: session, response: raw } = await client.agents.sessions
-  .create({ agent_id: 'agent_id', session_name: 'session_name' })
+const { data: response, response: raw } = await client.inference
+  .chatCompletion({ messages: [{ content: 'string', role: 'user' }], model_id: 'model_id' })
   .withResponse();
 console.log(raw.headers.get('X-My-Header'));
-console.log(session.session_id);
+console.log(response);
 ```

 ### Making custom/undocumented requests
@@ -267,8 +288,8 @@ const client = new LlamaStackClient({
 });

 // Override per-request:
-await client.agents.sessions.create(
-  { agent_id: 'agent_id', session_name: 'session_name' },
+await client.inference.chatCompletion(
+  { messages: [{ content: 'string', role: 'user' }], model_id: 'model_id' },
   {
     httpAgent: new http.Agent({ keepAlive: false }),
   },
@@ -280,7 +301,7 @@ await client.agents.sessions.create(
 This package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) conventions, though certain backwards-incompatible changes may be released as minor versions:

 1. Changes that only affect static types, without breaking runtime behavior.
-2. Changes to library internals which are technically public but not intended or documented for external use. _(Please open a GitHub issue to let us know if you are relying on such internals)_.
+2. Changes to library internals which are technically public but not intended or documented for external use. _(Please open a GitHub issue to let us know if you are relying on such internals.)_
 3. Changes that we do not expect to impact the vast majority of users in practice.

 We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.
@@ -293,6 +314,19 @@ TypeScript >= 4.5 is supported.

 The following runtimes are supported:

+- Web browsers (Up-to-date Chrome, Firefox, Safari, Edge, and more)
+- Node.js 18 LTS or later ([non-EOL](https://endoflife.date/nodejs)) versions.
+- Deno v1.28.0 or higher.
+- Bun 1.0 or later.
+- Cloudflare Workers.
+- Vercel Edge Runtime.
+- Jest 28 or greater with the `"node"` environment (`"jsdom"` is not supported at this time).
+- Nitro v2.6 or greater.
+
 Note that React Native is not supported at this time.

 If you are interested in other runtime environments, please open or upvote an issue on GitHub.
+
+## Contributing
+
+See [the contributing documentation](./CONTRIBUTING.md).

SECURITY.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ or products provided by Llama Stack Client please follow the respective company'

 ### Llama Stack Client Terms and Policies

-Please contact dev-feedback@llama-stack-client.com for any questions or concerns regarding security of our services.
+Please contact llamastack@meta.com for any questions or concerns regarding security of our services.

 ---

