feat: add create-llama artifacts template (python) #586
Conversation
🦋 Changeset detected

Latest commit: 347ba8e

The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Walkthrough

This update introduces a new "artifacts" use case to the LlamaIndex project template generator. It extends type definitions and user prompts to support the new use case, adds a Python workflow template for artifact generation (including code and document workflows), and provides a corresponding React UI component for workflow progress visualization. The update also includes a new README template with setup and usage instructions for the artifacts workflow, and adjusts file operations to copy all Python files for the workflow template. Additionally, dependencies are updated to require a newer version of "llama-index-server".
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant UI as UI (React)
    participant Server as FastAPI Server
    participant Workflow as Workflow (Python)
    participant LLM
    User->>UI: Sends chat input
    UI->>Server: POST /api/chat with message
    Server->>Workflow: Passes chat request
    Workflow->>LLM: Planning step (analyze user request)
    LLM-->>Workflow: Returns plan (e.g., code or document requirement)
    Workflow->>LLM: Generate artifact (code/doc) if needed
    LLM-->>Workflow: Returns generated artifact content
    Workflow->>UI: Emits UI event (progress, requirement, etc.)
    Workflow->>LLM: Synthesize answer (explanation)
    LLM-->>Workflow: Returns answer
    Workflow->>Server: Streams answer and events
    Server->>UI: Streams response and events
    UI->>User: Displays progress, artifact, and answer
```
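To make the flow above concrete, here is a minimal, self-contained sketch of how such a pipeline can be expressed with `llama_index.core.workflow` primitives. This is illustrative only — the event fields, step bodies, and class name are assumptions, not the template's actual implementation (which also streams `UIEvent` progress updates and calls an LLM at each step):

```python
# Minimal sketch of a planning -> generate -> synthesize workflow (illustrative only).
from llama_index.core.workflow import (
    Context,
    Event,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)


class GenerateArtifactEvent(Event):
    requirement: str


class SynthesizeAnswerEvent(Event):
    artifact: str


class ArtifactWorkflowSketch(Workflow):
    @step
    async def planning(self, ctx: Context, ev: StartEvent) -> GenerateArtifactEvent:
        # Real template: ask the LLM whether a code or a document artifact is required.
        user_msg = getattr(ev, "user_msg", "")
        return GenerateArtifactEvent(requirement=f"artifact for: {user_msg}")

    @step
    async def generate_artifact(
        self, ctx: Context, ev: GenerateArtifactEvent
    ) -> SynthesizeAnswerEvent:
        # Real template: call the LLM here and emit UI progress events via
        # ctx.write_event_to_stream(...).
        return SynthesizeAnswerEvent(artifact=ev.requirement.upper())

    @step
    async def synthesize_answer(
        self, ctx: Context, ev: SynthesizeAnswerEvent
    ) -> StopEvent:
        # Real template: stream a final LLM-written explanation back to the client.
        return StopEvent(result=ev.artifact)
```

Running `await ArtifactWorkflowSketch(timeout=60.0).run(user_msg="Write a snake game")` walks the three steps in order, mirroring the diagram's planning, artifact-generation, and answer-synthesis phases.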
📜 Recent review details

Configuration used: .coderabbit.yaml

📒 Files selected for processing (2)
✅ Files skipped from review due to trivial changes (1)
🚧 Files skipped from review as they are similar to previous changes (1)
⏰ Context from checks skipped due to timeout of 90000ms (27)
✨ Finishing Touches
Review threads (outdated):

- packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py
- packages/create-llama/templates/components/workflows/python/artifacts/workflow.py
- packages/create-llama/templates/components/workflows/python/artifacts/README-template.md
- Introduced `CodeArtifactWorkflow` and `DocumentArtifactWorkflow` classes to handle code and document artifacts respectively.
- Updated README to include instructions for modifying the factory method to select the appropriate workflow.
- Enhanced clarity in class documentation and improved naming conventions for better understanding.
Actionable comments posted: 3
♻️ Duplicate comments (4)
packages/create-llama/templates/components/workflows/python/artifacts/workflow.py (1)

3-3: Improve the import comment clarity.

The comment on line 3 is meant to provide guidance on how to import the document workflow, but it's missing the "from" keyword which makes it unclear that it's an import instruction.

```diff
-# from app.document_workflow import DocumentArtifactWorkflow to generate documents
+# import DocumentArtifactWorkflow from app.document_workflow to generate documents
```

packages/create-llama/templates/components/workflows/python/artifacts/README-template.md (1)

39-41: Clarify workflow selection instructions.

The sentence explaining workflow selection is missing a comma and could be made clearer.

```diff
-- [Document Workflow](./document_workflow.py): Generate and update a document like OpenAI's canvas.
+- [Document Workflow](./document_workflow.py): Generate and update a document like OpenAI's canvas.
-Modify the factory method in `workflow.py` to decide which artifact workflow to use. Without any changes the Code Workflow is used.
+Modify the factory method in `workflow.py` to decide which artifact workflow to use. Without any changes, the Code Workflow is used.
```

🧰 Tools

🪛 LanguageTool

[typographical] ~41-~41: It appears that a comma is missing.
Context: ...h artifact workflow to use. Without any changes the Code Workflow is used. You can sta... (DURING_THAT_TIME_COMMA)

packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (2)

187-194: Same greedy-JSON issue as in the document workflow.

Reuse the non-greedy pattern suggested earlier to avoid capturing unrelated back-tick blocks and crashing at `Requirement.model_validate_json`.

289-292: Confirm `formatted=True` kwarg.

Same concern as with the document workflow – please verify the driver signature.
🧹 Nitpick comments (8)
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md (4)

26-28: Add a language specifier to the code block.

For better syntax highlighting and documentation consistency, specify the shell language in the code fence.

```diff
-```
+```shell
 uv run fastapi run
```

🧰 Tools: 🪛 markdownlint-cli2 (0.17.2) — 26-26: Fenced code blocks should have a language specified (MD040, fenced-code-language)

43-43: Fix article usage with "request".

Use "a" instead of "an" before "request" since "request" begins with a consonant sound.

```diff
-You can start by sending an request on the [chat UI](http://localhost:8000) or you can test the `/api/chat` endpoint with the following curl request:
+You can start by sending a request on the [chat UI](http://localhost:8000) or you can test the `/api/chat` endpoint with the following curl request:
```

🧰 Tools

🪛 LanguageTool

[misspelling] ~43-~43: Use "a" instead of 'an' if the following word doesn't start with a vowel sound, e.g. 'a sentence', 'a university'.
Context: ...flow is used. You can start by sending an request on the [chat UI](http://localho... (EN_A_VS_AN)

45-49: Add a language specifier to the code block.

For better syntax highlighting and documentation consistency, specify the shell language in the code fence.

```diff
-```
+```shell
 curl --location 'localhost:8000/api/chat' \
 --header 'Content-Type: application/json' \
 --data '{ "messages": [{ "role": "user", "content": "Create a report comparing the finances of Apple and Tesla" }] }'
```

57-59: Add a language specifier to the code block.

For better syntax highlighting and documentation consistency, specify the shell language in the code fence.

```diff
-```
+```shell
 uv run generate_ui
```

packages/create-llama/templates/components/ui/workflows/artifacts/ui_event.jsx (1)

128-137: Improve handling of undefined or null events.

The `aggregateEvents` function handles empty arrays, but the component could be more robust by adding a null check in the default export function.

```diff
 export default function Component({ events }) {
+  if (!events) return null;
+
   const aggregateEvents = () => {
     if (!events || events.length === 0) return null;
     return events[events.length - 1];
   };
   const event = aggregateEvents();
   return <ArtifactWorkflowCard event={event} />;
 }
```

packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (1)

266-285: Minor: ensure `doc_type` is lower-case before persisting.

If the LLM returns "Markdown", the captured group retains the original case; `DocumentArtifactData.type` is a `Literal["markdown","html"]` so upper-case would break validation downstream.

```diff
-doc_type = doc_match.group(1)
+doc_type = doc_match.group(1).lower()
```

packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (2)
294-300: Make code-block regex non-greedy & verify language.

```diff
-language_pattern = r"```(\w+)([\s\S]*)```"
+language_pattern = r"```(\w+)\s*([\s\S]*?)\s*```"
```

After extraction, ensure `lang.lower()` ∈ {"typescript", "python"}. If not, emit a graceful error so that users discover the mismatch early.
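A self-contained sketch of this extraction logic (illustrative only — `SUPPORTED_LANGUAGES`, the function name, and the error messages are assumptions, not the template's actual code):

```python
import re
from typing import Tuple

# Assumed allow-list; adjust to whatever the code workflow actually supports.
SUPPORTED_LANGUAGES = {"typescript", "python"}

# Non-greedy pattern: stops at the first closing fence instead of swallowing
# everything up to the last triple backtick in the LLM response.
CODE_BLOCK_PATTERN = re.compile(r"```(\w+)\s*([\s\S]*?)\s*```")


def extract_code_block(llm_text: str) -> Tuple[str, str]:
    """Return (language, code) from the first fenced block, or raise a clear error."""
    match = CODE_BLOCK_PATTERN.search(llm_text)
    if match is None:
        raise ValueError("No fenced code block found in the LLM response.")
    lang, code = match.group(1).lower(), match.group(2).strip()
    if lang not in SUPPORTED_LANGUAGES:
        raise ValueError(f"Unsupported language in generated code block: {lang!r}")
    return lang, code


if __name__ == "__main__":
    sample = "Here you go:\n```python\nprint('hello')\n```\nSome trailing ```notes```."
    print(extract_code_block(sample))  # ('python', "print('hello')")
```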
310-320: Propagate the captured language when the LLM omits it.

When `event.requirement.language` is `None`, you already fall back to `""`. However, if the LLM does return a language tag in the code block (`code_match.group(1)`), you can use that as a better default:

```diff
-language=event.requirement.language or "",
+language=(event.requirement.language or code_match.group(1).lower()),
```

This prevents empty strings in the stored artifact and keeps UI tooling consistent.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (10)
- .changeset/chilly-foxes-remain.md (1 hunks)
- packages/create-llama/helpers/python.ts (1 hunks)
- packages/create-llama/helpers/types.ts (1 hunks)
- packages/create-llama/questions/simple.ts (4 hunks)
- packages/create-llama/templates/components/ui/workflows/artifacts/ui_event.jsx (1 hunks)
- packages/create-llama/templates/components/workflows/python/artifacts/README-template.md (1 hunks)
- packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (1 hunks)
- packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (1 hunks)
- packages/create-llama/templates/components/workflows/python/artifacts/workflow.py (1 hunks)
- packages/create-llama/templates/types/llamaindexserver/fastapi/pyproject.toml (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)
packages/create-llama/templates/components/workflows/python/artifacts/workflow.py (2)

- python/llama-index-server/llama_index/server/api/models.py (1)
  - ChatRequest (32-41)
- python/llama-index-server/tests/api/test_event_stream.py (1)
  - chat_request (23-26)

packages/create-llama/questions/simple.ts (1)

- packages/create-llama/helpers/types.ts (1)
  - ModelConfig (14-21)

packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (5)

- python/llama-index-server/llama_index/server/api/models.py (2)
  - ChatRequest (32-41)
  - UIEvent (147-155)
- python/llama-index-server/llama_index/server/api/utils/chat_request.py (1)
  - get_last_artifact (21-23)
- packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (9)
  - PlanEvent (37-39)
  - GenerateArtifactEvent (42-43)
  - SynthesizeAnswerEvent (46-47)
  - UIEventData (50-52)
  - ArtifactWorkflow (55-354)
  - prepare_chat_history (80-102)
  - planning (105-218)
  - generate_artifact (221-322)
  - synthesize_answer (325-354)
- python/llama-index-server/tests/api/test_event_stream.py (1)
  - chat_request (23-26)
- packages/create-llama/templates/components/ui/workflows/artifacts/ui_event.jsx (2)
  - event (44-44)
  - event (134-134)
🪛 LanguageTool
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md
[typographical] ~41-~41: It appears that a comma is missing.
Context: ...h artifact workflow to use. Without any changes the Code Workflow is used. You can sta...
(DURING_THAT_TIME_COMMA)
[misspelling] ~43-~43: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...flow is used. You can start by sending an request on the [chat UI](http://localho...
(EN_A_VS_AN)
🪛 markdownlint-cli2 (0.17.2)
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md
26-26: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
45-45: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
57-57: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
🔇 Additional comments (13)
packages/create-llama/templates/types/llamaindexserver/fastapi/pyproject.toml (1)
15-15: Version bump for llama-index-server dependency to support artifacts use case.

The version requirement for llama-index-server has been updated from ">=0.1.14,<0.2.0" to ">=0.1.15,<0.2.0" to support the new artifacts use case functionality.
.changeset/chilly-foxes-remain.md (1)
1-6: LGTM! Changeset documented appropriately.

The changeset file correctly documents the patch for adding the artifacts use case (Python implementation) to create-llama.
packages/create-llama/helpers/types.ts (1)
60-61: Good addition to the TemplateUseCase type.

Adding "artifacts" to the TemplateUseCase union type correctly integrates the new use case into the type system, ensuring type safety throughout the application.
packages/create-llama/helpers/python.ts (1)
565-565: ✅ Verification successful
Modified file copy pattern to handle multiple workflow files.
Good change! This modifies the copy pattern from a specific "workflow.py" file to all Python files ("*.py") in the use case directory, which is necessary to support the multiple workflow files in the new artifacts use case.
🏁 Script executed:
```shell
#!/bin/bash
# Verify existence of multiple Python files in the artifacts directory
fd -t f -e py . packages/create-llama/templates/components/workflows/python/artifacts --exec ls -la {} \;
```

Length of output: 518
Copy pattern update correctly supports multiple workflow files
Verified that the `artifacts` directory contains multiple Python files (`workflow.py`, `code_workflow.py`, `document_workflow.py`), so changing the copy pattern to `"*.py"` is necessary and correct. No further action required.

packages/create-llama/templates/components/workflows/python/artifacts/workflow.py (1)
9-15: LGTM: Clean and clear workflow factory implementation.

The `create_workflow` function correctly implements a factory pattern that returns a configured `CodeArtifactWorkflow` instance with appropriate parameters. This structure makes it easy to switch between different workflow implementations.
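As a rough illustration of the pattern being praised here — not the template's actual code; the constructor arguments and the `use_document_workflow` flag are assumptions — a factory like the following is the single place that has to change to swap workflows:

```python
# Illustrative sketch of a workflow factory, not the template's exact create_workflow.
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step


class CodeArtifactWorkflow(Workflow):
    """Stand-in for the template's code-artifact workflow."""

    @step
    async def run_step(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result="code artifact")


class DocumentArtifactWorkflow(Workflow):
    """Stand-in for the template's document-artifact workflow."""

    @step
    async def run_step(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result="document artifact")


def create_workflow(use_document_workflow: bool = False) -> Workflow:
    # Single switch point: callers never need to know which class they get back.
    workflow_cls = (
        DocumentArtifactWorkflow if use_document_workflow else CodeArtifactWorkflow
    )
    return workflow_cls(timeout=120.0)
```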
packages/create-llama/templates/components/ui/workflows/artifacts/ui_event.jsx (3)

10-27: LGTM: Well-structured workflow stage configuration.

The `STAGE_META` object is well-organized with clear stage-specific styling and configuration, making it easy to understand and maintain.

29-48: LGTM: Clean state management for workflow card visibility.

The component effectively manages visibility and animation states based on the event state, with proper effect cleanup.

49-126: LGTM: Responsive and well-styled artifact workflow card UI.

The card UI is well-structured with appropriate conditional rendering based on the workflow stage, clear visual indicators of progress, and good use of styling for a polished user experience.
packages/create-llama/questions/simple.ts (4)
9-13: LGTM: Clean type extension for new artifacts use case.

The `AppType` union has been properly extended to include the new "artifacts" use case.

49-54: LGTM: Clear description for the artifacts use case option.

The new choice added to the use case selection prompt has a descriptive title and explanation that clearly conveys the purpose of the artifacts use case.

124-128: LGTM: Model constant rename and update.

The model constant has been appropriately renamed from `MODEL_GPT4o` to `MODEL_GPT41` and updated to use "gpt-4.1" instead of "gpt-4o", maintaining consistency with the model used in the workflow implementation.

156-161: LGTM: Well-configured artifacts use case.

The new entry in the lookup table for the "artifacts" use case is properly configured with the appropriate template, empty data sources and tools arrays, and the GPT-4.1 model configuration.
packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (1)
164-167: Double-check `formatted=True` – not all LLM drivers support it.

`self.llm.acomplete(prompt=prompt, formatted=True)` assumes the driver accepts a `formatted` kwarg. In `llama-index` (≥0.10) it's usually `format=True` (no "ed") or the prompt must be pre-formatted manually. A mismatch will raise `TypeError: got an unexpected keyword`. Please verify against the exact version in `requirements.txt`.
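A portable way to sidestep the kwarg question entirely (a sketch, not the template's code — the prompt text and variable names are illustrative) is to format the prompt yourself and pass a plain string to `acomplete`:

```python
# Sketch: pre-format the prompt manually so no driver-specific `formatted` kwarg is needed.
from llama_index.core.llms import LLM
from llama_index.core.prompts import PromptTemplate

PLANNING_PROMPT = PromptTemplate(
    "You are planning an artifact.\n"
    "User request: {user_msg}\n"
    "Existing artifact (may be empty): {context}\n"
)


async def plan(llm: LLM, user_msg: str, context: str = "") -> str:
    # PromptTemplate.format returns a ready-to-send string.
    prompt_str = PLANNING_PROMPT.format(user_msg=user_msg, context=context)
    response = await llm.acomplete(prompt_str)
    return response.text
```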
Review threads:

- packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (2)
- packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py
- packages/create-llama/templates/components/workflows/python/artifacts/workflow.py
Actionable comments posted: 2
♻️ Duplicate comments (1)
packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (1)
30-33: Restrict `language` to allowed literals for static safety.

We can catch typos ("pyhton") and invalid languages at parse-time by encoding the contract in the model:

```diff
-language: Optional[str] = None
+language: Optional[Literal["typescript", "python"]] = None
```

Pydantic will still accept `None`, but any other value outside the two allowed options will raise immediately, preventing runtime surprises.
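A small, self-contained sketch of the idea (the model and field names here are illustrative, not necessarily the template's):

```python
from typing import Literal, Optional

from pydantic import BaseModel, ValidationError


class CodeRequirement(BaseModel):
    """Illustrative requirement model; the real template's fields may differ."""

    next_step: Literal["answering", "coding"]
    language: Optional[Literal["typescript", "python"]] = None
    file_name: Optional[str] = None


# Valid payloads parse exactly as before.
CodeRequirement(next_step="coding", language="python", file_name="main.py")
CodeRequirement(next_step="answering")  # language stays None

# A typo such as "pyhton" now fails at parse time instead of surfacing later.
try:
    CodeRequirement.model_validate_json('{"next_step": "coding", "language": "pyhton"}')
except ValidationError as exc:
    print(exc.errors()[0]["msg"])
```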
🧹 Nitpick comments (2)
packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (2)
313-317: Prefer detected language over requested one when emitting artifact.

The LLM might return a fenced block tagged with a different language (or none). Deriving the language from the regex keeps the UI in sync with what was actually generated:

```diff
-language=event.requirement.language or "",
+language=(code_match.group(1) if code_match else event.requirement.language) or "",
```

This also removes silent fall-back to an empty string, which can break downstream renderers.

120-144: Consider extracting the verbose prompt to a constant for reuse & readability.

The 120-line inline prompt makes the method hard to scan and is duplicated almost verbatim in `document_workflow.py`. Moving it to a module-level constant (or a `.prompt` file) reduces noise and promotes reuse:

```python
PLANNING_PROMPT = PromptTemplate("""
... long text ...
""")

...

prompt = PLANNING_PROMPT.format(
    context="..." if event.context is None else f"...",
    user_msg=event.user_msg,
)
```

This keeps business logic focused on control flow, not string literals.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- packages/create-llama/questions/simple.ts (5 hunks)
- packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (1 hunks)
- packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/create-llama/questions/simple.ts
- packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (3)
- python/llama-index-server/llama_index/server/api/models.py (2)
  - ChatRequest (32-41)
  - UIEvent (147-155)
- python/llama-index-server/llama_index/server/api/utils/chat_request.py (1)
  - get_last_artifact (21-23)
- packages/create-llama/templates/components/workflows/python/artifacts/document_workflow.py (8)
  - PlanEvent (35-37)
  - GenerateArtifactEvent (40-41)
  - SynthesizeAnswerEvent (44-46)
  - UIEventData (49-51)
  - prepare_chat_history (77-99)
  - planning (102-195)
  - generate_artifact (198-292)
  - synthesize_answer (295-325)
⏰ Context from checks skipped due to timeout of 90000ms (29)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: Unit Tests (windows-latest, 3.9)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files)
- GitHub Check: lint
Review threads:

- packages/create-llama/templates/components/workflows/python/artifacts/code_workflow.py (2)
Actionable comments posted: 4
🧹 Nitpick comments (4)
packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (2)
33-34: Avoid banned `{}` type – replace with `void` or `Record<string, never>`.

Biome correctly reports that `{}` is too permissive. `void` is most appropriate here because no payload is ever consumed downstream.

```diff
-const synthesizeAnswerEvent = workflowEvent<{}>();
+const synthesizeAnswerEvent = workflowEvent<void>();
```

🧰 Tools
🪛 Biome (1.9.4)
[error] 33-33: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
342-366: Remove side-effecting test harness from production module.

Shipping a self-executing `run()` causes:

- Extraneous console output in the browser/server.
- An additional LLM call every time the file is imported.
- Difficult-to-stub side effects in unit tests.

Move the snippet to a dedicated `*.spec.ts` or guard with `if (require.main === module)`.

```diff
-// Test the workflow
-...
-run();
+// Uncomment the lines below when debugging locally
+// if (require.main === module) {
+//   run();
+// }
```

packages/create-llama/templates/components/workflows/typescript/artifacts/doc-workflow.ts (2)

336-361: Accidental execution on import – move the demo runner.

As with `code-workflow.ts`, executing `run()` upon import introduces side effects. Wrap it or extract to tests.

```diff
-// Uncomment to run
-run();
+// Uncomment to run locally
+// if (require.main === module) {
+//   run();
+// }
```

63-63: Nit: use `void` instead of an empty tuple type.

For parity with the earlier comment, prefer:

```diff
-const synthesizeAnswerEvent = workflowEvent<{
+const synthesizeAnswerEvent = workflowEvent<{
   requirement: DocumentRequirement;
   generatedArtifact: string;
-}>();
+}>();
```

Alternatively, if you never pass a payload, declare `workflowEvent<void>()`. (Here a payload is passed later, so keep the object but avoid `{}` elsewhere.)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/artifacts/doc-workflow.ts (1 hunks)
- packages/create-llama/templates/components/workflows/typescript/artifacts/workflow.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/create-llama/templates/components/workflows/typescript/artifacts/workflow.ts
🧰 Additional context used
🪛 Biome (1.9.4)
packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts
[error] 33-33: Don't use '{}' as a type.
Prefer explicitly define the object shape. '{}' means "any non-nullable value".
(lint/complexity/noBannedTypes)
⏰ Context from checks skipped due to timeout of 90000ms (27)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files)
🔇 Additional comments (1)
packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (1)
272-279: Regex drops language & may capture trailing backticks.

`(\w+)` misses "typescript" if the model outputs a `` ```ts `` fence. `([\s\S]*)` is greedy – it will include the closing backticks in some cases because the first triple-backtick may appear inside a string literal. Consider:

```ts
const codeMatch = response.text.match(/```(\w+)\s*([\s\S]*?)\s*```/);
```

and save the captured language:

```ts
const [, detectedLang, codeRaw] = codeMatch;
const code = codeRaw.trim();
const lang = detectedLang ?? planData.requirement.language ?? "";
...
language: lang,
```
Review threads:

- packages/create-llama/templates/components/workflows/typescript/artifacts/code-workflow.ts (3, outdated)
- packages/create-llama/templates/components/workflows/typescript/artifacts/doc-workflow.ts (outdated)
- packages/create-llama/templates/components/ui/workflows/artifacts/ui_event.jsx (3)
Actionable comments posted: 0
🧹 Nitpick comments (2)
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md (2)
26-28: Add language identifiers to fenced code blocks and fix the API URL in the curl example.

Several fenced code blocks lack a language specifier, which helps with syntax highlighting and readability. Also, the `curl` command is missing the `http://` protocol, causing it to be interpreted as a file path instead of an HTTP request.

Apply this diff to address both issues:

```diff
@@ lines 26-28
-```
+```bash
 uv run fastapi run
-```
+```
@@ lines 45-49
-```
+```bash
-curl --location 'localhost:8000/api/chat' \
+curl --location 'http://localhost:8000/api/chat' \
 --header 'Content-Type: application/json' \
 --data '{ "messages": [{ "role": "user", "content": "Create a report comparing the finances of Apple and Tesla" }] }'
-```
+```
@@ lines 57-59
-```
+```bash
 uv run generate_ui
-```
+```
```

Also applies to: 45-49, 57-59
🧰 Tools
🪛 markdownlint-cli2 (0.17.2)
26-26: Fenced code blocks should have a language specified
null (MD040, fenced-code-language)
41-43: Correct punctuation and article usage.

There are a couple of minor grammar issues that impact clarity:

```diff
@@ line 41
-Without any changes the Code Workflow is used.
+Without any changes, the Code Workflow is used.
@@ line 43
-You can start by sending an request on the [chat UI]
+You can start by sending a request on the [chat UI]
```

🧰 Tools
🪛 LanguageTool
[uncategorized] ~41-~41: A punctuation mark might be missing here.
Context: ...s canvas. Modify the factory method in workflow.py to decide... (AI_EN_LECTOR_MISSING_PUNCTUATION)
[typographical] ~41-~41: It appears that a comma is missing.
Context: ...h artifact workflow to use. Without any changes the Code Workflow is used. You can sta...(DURING_THAT_TIME_COMMA)
[misspelling] ~43-~43: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...flow is used. You can start by sending an request on the [chat UI](http://localho...(EN_A_VS_AN)
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md (1 hunks)
🧰 Additional context used
🪛 LanguageTool
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md
[uncategorized] ~41-~41: A punctuation mark might be missing here.
Context: ...s canvas. Modify the factory method in workflow.py to decide...
(AI_EN_LECTOR_MISSING_PUNCTUATION)
[typographical] ~41-~41: It appears that a comma is missing.
Context: ...h artifact workflow to use. Without any changes the Code Workflow is used. You can sta...
(DURING_THAT_TIME_COMMA)
[misspelling] ~43-~43: Use “a” instead of ‘an’ if the following word doesn’t start with a vowel sound, e.g. ‘a sentence’, ‘a university’.
Context: ...flow is used. You can start by sending an request on the [chat UI](http://localho...
(EN_A_VS_AN)
🪛 markdownlint-cli2 (0.17.2)
packages/create-llama/templates/components/workflows/python/artifacts/README-template.md
26-26: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
45-45: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
57-57: Fenced code blocks should have a language specified
null
(MD040, fenced-code-language)
⏰ Context from checks skipped due to timeout of 90000ms (27)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files)
Summary by CodeRabbit
New Features
Enhancements
Documentation