Regarding Crossfile Completion Prompt Format for Seed-Coder-8B Series Models on CrossCodeEval Benchmark #4

@YDJSIR-NJU

Description

Referencing the technical report, could you please provide the specific prompt format used to evaluate Seed-Coder-8B-Base on the CrossCodeEval benchmark?

This information is needed to reproduce the evaluation setup accurately and to support comparative analysis.

Specifically:

CrossCodeEval Prompt Format:

  • What was the exact prompt structure/template used for Seed-Coder-8B-Base during the CrossCodeEval evaluation? (A generic sketch of the kind of layout I have in mind is shown below.)
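
For reference, here is a minimal sketch of one common way base models are prompted on CrossCodeEval: retrieved cross-file snippets are prepended as comment blocks, followed by the unfinished in-file prefix. The function name and comment layout here are my own illustrative assumptions, not Seed-Coder's documented setup:

```python
# Illustrative only: a common CrossCodeEval-style prompt layout for base models.
# Whether Seed-Coder-8B-Base was evaluated this way is exactly what this issue asks.

def build_completion_prompt(cross_file_snippets, in_file_prefix):
    """Prepend retrieved cross-file context as comment blocks, then the
    unfinished in-file prefix; the model completes the next line(s)."""
    parts = []
    for path, snippet in cross_file_snippets:
        # Render each retrieved snippet as a commented block tagged with its path.
        commented = "\n".join("# " + line for line in snippet.splitlines())
        parts.append(f"# Path: {path}\n{commented}")
    parts.append(in_file_prefix)
    return "\n".join(parts)


prompt = build_completion_prompt(
    [("utils/math_ops.py", "def add(a, b):\n    return a + b")],
    "from utils.math_ops import add\n\nresult = add(",
)
```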

Cross-File Context Handling:

  • How was cross-file context incorporated into the prompts for the CrossCodeEval evaluation (if applicable)?
  • More generally, does the Seed-Coder model series utilize special tokens or a specific prompt format for handling multi-file context in repository-level code completion tasks (potentially similar to mechanisms in models like the Qwen2.5-Coder series)?
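
For concreteness, this is the repo-level prompt format documented for Qwen2.5-Coder, using its special tokens <|repo_name|> and <|file_sep|>. It is shown only as an example of the kind of mechanism being asked about; Seed-Coder's format, if it has one, may differ entirely:

```python
# Qwen2.5-Coder-style repository-level prompt assembly, shown for reference.
# The <|repo_name|> and <|file_sep|> tokens are Qwen2.5-Coder's documented
# special tokens; whether Seed-Coder uses analogous tokens is the question here.

def build_repo_level_prompt(repo_name, files, target_path, target_prefix):
    """Concatenate repository files with file-separator tokens, ending with
    the unfinished target file for the model to complete."""
    parts = [f"<|repo_name|>{repo_name}"]
    for path, content in files:
        parts.append(f"<|file_sep|>{path}\n{content}")
    # The final file is left unfinished; the model continues from here.
    parts.append(f"<|file_sep|>{target_path}\n{target_prefix}")
    return "\n".join(parts)


prompt = build_repo_level_prompt(
    "my_project",
    [("utils/math_ops.py", "def add(a, b):\n    return a + b\n")],
    "main.py",
    "from utils.math_ops import add\n\nresult = add(",
)
```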

Details on the format(s), including any special tokens, or an example prompt, would be very helpful.

Thanks!
