
Implement logprobs #2

Closed
abetlen opened this issue Mar 25, 2023 · 0 comments
Labels
enhancement New feature or request

Comments

@abetlen (Owner)

abetlen commented Mar 25, 2023

The logprobs return format should match the OpenAI API. Currently, calling a Llama instance with logprobs enabled just returns a flat list of floats instead of the structured object below.

Example of the correct format:

"logprobs": {
    "text_offset": [
        11,
        12,
        13,
        14,
        15,
        17,
        18,
        20,
        21,
        23,
        24,
        26,
        27,
        29,
        30,
        32
    ],
    "token_logprobs": [
        -0.028534053,
        -0.0013638621,
        -0.0001191709,
        -0.037809037,
        -0.008346983,
        -1.3900239e-05,
        -6.0395385e-05,
        -2.462996e-05,
        -5.4432137e-05,
        -4.3108244e-05,
        -6.0395385e-05,
        -4.382537e-05,
        -4.489638e-05,
        -4.751897e-05,
        -0.00017937786,
        -7.314978e-05
    ],
    "tokens": [
        "\n",
        "\n",
        "1",
        ",",
        " 2",
        ",",
        " 3",
        ",",
        " 4",
        ",",
        " 5",
        ",",
        " 6",
        ",",
        " 7",
        ","
    ],
    "top_logprobs": [
        {
            "\n": -0.028534053,
            "\n\n": -5.3414392,
            " (": -6.8118296,
            " in": -4.9322805,
            ":": -5.6061873
        },
        {
            "\n": -0.0013638621,
            " \u00a7\u00a7": -8.594428,
            "//": -9.296644,
            "1": -9.727121,
            "Count": -9.291412
        },
        {
            " 1": -10.996209,
            "\"": -12.673454,
            "#": -12.253096,
            "1": -0.0001191709,
            "One": -9.39247
        },
        {
            " -": -6.4947214,
            " 2": -7.7675867,
            ")": -8.327954,
            ",": -0.037809037,
            ".": -3.3655276
        },
        {
            "\n": -14.826643,
            " ": -10.675518,
            " 2": -0.008346983,
            " two": -16.126537,
            "2": -4.792885
        },
        {
            " ,": -11.469002,
            " 3": -12.7872095,
            ",": -1.3900239e-05,
            ".": -14.724538,
            "<|endoftext|>": -15.308233
        },
        {
            " ": -12.118958,
            " 3": -6.0395385e-05,
            " three": -17.906118,
            "3": -9.814757,
            "<|endoftext|>": -15.049129
        },
        {
            " ,": -10.729593,
            " 4": -14.016008,
            ",": -2.462996e-05,
            ".": -14.297305,
            "<|endoftext|>": -13.67176
        },
        {
            " ": -11.351273,
            " 4": -5.4432137e-05,
            "4": -10.086686,
            "<|endoftext|>": -13.919009,
            "\u00a0": -16.80569
        },
        {
            " ,": -10.206355,
            " 5": -12.87644,
            ",": -4.3108244e-05,
            ".": -13.588498,
            "<|endoftext|>": -13.03574
        },
        {
            " ": -11.478045,
            " 5": -6.0395385e-05,
            "5": -9.931537,
            "<|endoftext|>": -13.568035,
            "\u00a0": -16.266188
        },
        {
            " ,": -10.160495,
            " 6": -12.964705,
            ",": -4.382537e-05,
            ".": -14.101328,
            "<|endoftext|>": -13.08568
        },
        {
            " ": -11.344849,
            " 6": -4.489638e-05,
            "6": -10.329956,
            "<|endoftext|>": -14.879237,
            "\u00a0": -16.98358
        },
        {
            " ,": -10.096309,
            " 7": -12.389179,
            ",": -4.751897e-05,
            ".": -13.817777,
            "<|endoftext|>": -13.860558
        },
        {
            " ": -11.630913,
            " 7": -0.00017937786,
            " seven": -16.613815,
            "7": -8.680304,
            "<|endoftext|>": -14.859097
        },
        {
            " ,": -9.754253,
            " 8": -11.516983,
            ",": -7.314978e-05,
            ".": -13.250221,
            "<|endoftext|>": -12.703088
        }
    ]
}
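A minimal sketch of how per-token data could be assembled into this structure. The helper name `build_logprobs` and its parameters are illustrative assumptions, not part of llama-cpp-python's actual API; the key point is that `text_offset` is a running character offset and the four lists stay aligned per token.

```python
# Hypothetical sketch: assemble an OpenAI-style "logprobs" object from
# per-token data. `build_logprobs` is an illustrative name, not an
# existing function in llama-cpp-python.

def build_logprobs(tokens, token_logprobs, top_candidates, prompt_len):
    """
    tokens         -- decoded token strings, in generation order
    token_logprobs -- log-probability of each chosen token
    top_candidates -- per position, a dict {token_text: logprob}
    prompt_len     -- character offset where the completion text begins
    """
    # text_offset[i] is the character position of tokens[i] in the full text.
    text_offset = []
    offset = prompt_len
    for tok in tokens:
        text_offset.append(offset)
        offset += len(tok)
    return {
        "text_offset": text_offset,
        "token_logprobs": list(token_logprobs),
        "tokens": list(tokens),
        "top_logprobs": list(top_candidates),
    }


# Usage with the first four tokens of the example above:
result = build_logprobs(
    tokens=["\n", "\n", "1", ","],
    token_logprobs=[-0.0285, -0.0014, -0.0001, -0.0378],
    top_candidates=[
        {"\n": -0.0285, "\n\n": -5.34},
        {"\n": -0.0014, "1": -9.73},
        {"1": -0.0001, "One": -9.39},
        {",": -0.0378, ".": -3.37},
    ],
    prompt_len=11,
)
# Offsets advance by each token's character length: [11, 12, 13, 14]
```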
@abetlen abetlen added the bug Something isn't working label Mar 25, 2023
@abetlen abetlen added the good first issue Good for newcomers label Apr 4, 2023
@abetlen abetlen changed the title Fix logprobs return type Implement logprobs Apr 5, 2023
@abetlen abetlen added enhancement New feature or request and removed bug Something isn't working good first issue Good for newcomers labels Apr 5, 2023
@abetlen abetlen pinned this issue Apr 5, 2023
@abetlen abetlen unpinned this issue Apr 6, 2023
xaptronic pushed a commit to xaptronic/llama-cpp-python that referenced this issue Jun 13, 2023
abetlen pushed a commit that referenced this issue Aug 18, 2023
abetlen added a commit that referenced this issue Sep 30, 2023
antoine-lizee pushed a commit to antoine-lizee/llama-cpp-python that referenced this issue Oct 30, 2023
* vvhg-code-infill (abetlen#1)

* infill in separate example (abetlen#2)

* reverted changes to main and added infill example

* cleanup

* naming improvement

* make : add missing blank line

* fix missing semicolon

* brought infill up to current main code

* cleanup

---------

Co-authored-by: Cebtenzzre <cebtenzzre@gmail.com>