
Releases: log10-io/log10

0.10.1

02 Jul 20:34
617b1e7

What's Changed

Bug fixes

  • Fix missing system message in anthropic requests by @kxtran in #208 (see the sketch below)
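
For context, a minimal illustration of where the system message sits in an anthropic request: it is a top-level system parameter rather than an entry in messages, which is the field the fix now captures. The model, prompt, and client usage below are illustrative, not log10's code.

    import anthropic

    client = anthropic.Anthropic()
    message = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=128,
        # The system prompt is a top-level parameter, not part of `messages`.
        system="You are a terse assistant.",
        messages=[{"role": "user", "content": "Say hi."}],
    )
    print(message.content[0].text)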

Chores

  • ENG-879 ENG-882 Move cli logic into its own folder by @kxtran in #198

Full Changelog: 0.10.0...0.10.1

0.10.0

21 Jun 22:13
5d1b0bb

What's Changed

New

  • Support Anthropic claude-3-5-sonnet-20240620 model
  • Support magentic version 0.27.0
  • Support anthropic version 0.29.0

Fixes

  • Update Anthropic function call output parsing logic to display messages properly in the log10 platform

Chores

  • ENG-849 ENG-819 Refactor Anthropic parsing code for async and sync with httpx client by @kxtran in #183

Full Changelog: 0.9.2...0.10.0

0.9.2

17 Jun 22:48
4c19150

What's Changed

  • ENG-851 Update google-generativeai version, example and test by @kxtran in #185
  • Update openai version to 1.33.0 by @kxtran in #186
  • Disable warning for async openai embedding calls by @wenzhe-log10 in #189
  • [ENG-856] Remove large file from repo by @nqn in #187
  • [ENG-857] Filter large image messages by @nqn in #190

Full Changelog: 0.9.1...0.9.2

0.9.1

11 Jun 19:25
@nqn

What's Changed

  • [ENG-850] Large files break logger by @nqn in #184

Full Changelog: 0.9.0...0.9.1

0.9.0

07 Jun 17:04
7b56d11

What's Changed

New

  • Add fetching autofeedback by completion id to cli by @kxtran in #175

    To get auto-generated feedback for a completion, use log10 feedback autofeedback get

  • Use non blocking async for AsyncOpenAI and AsyncAnthropic by @wenzhe-log10 in #179

    Release 0.9.0 includes significant improvements in how we handle concurrency when using LLMs in asynchronous streaming mode.
    This update ensures that logging at steady state incurs no overhead (previously up to 1-2 seconds), providing a smoother and more efficient experience in latency-critical settings.

    Important Considerations for Short-Lived Scripts:

    💡 For short-lived scripts using asynchronous streaming, you may need to wait until all logging requests have completed before terminating your script.
    We have provided a convenient method called finalize() to handle this.
    Here's how you can implement this in your code:

    from log10._httpx_utils import finalize
    
    ...
    
    await finalize()

    Ensure finalize() is called once, at the very end of your event loop, to guarantee that all pending logging requests are processed before the script exits.
    For more details, check the async logging examples.
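
A fuller sketch of the pattern above for a short-lived script. The AsyncOpenAI client, model name, and prompt are illustrative, and the log10(openai) wrapping follows the pattern shown in the 0.8.2 notes; only the finalize() import comes directly from this release.

    import asyncio

    import openai
    from openai import AsyncOpenAI

    from log10._httpx_utils import finalize
    from log10.load import log10

    log10(openai)


    async def main():
        client = AsyncOpenAI()
        stream = await client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": "Count to five."}],
            stream=True,
        )
        async for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="")

        # Flush any pending log10 logging requests before the event loop exits.
        await finalize()


    asyncio.run(main())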

Full Changelog: 0.8.6...0.9.0

0.8.6

29 May 17:54
5e43358

What's Changed

Bug fixes

  • Update async openai streaming response parsing logic by @kxtran in #167
    • Installing the latest magentic version pulls in openai >= 1.26.0, whose streaming responses can include a usage block and an empty choices list, which caused the existing parsing logic to raise an exception. The fix handles the new streaming response shape and makes the parsing code more robust (see the sketch below).
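
A rough sketch of the kind of guard involved (the helper function and client usage are illustrative, not log10's code): when a streaming chunk arrives with an empty choices list, it is the trailing usage-only chunk and should not be parsed for a delta.

    from openai import OpenAI

    def collect_stream_text(stream):
        """Collect streamed text while tolerating the trailing usage-only chunk."""
        parts, usage = [], None
        for chunk in stream:
            if not chunk.choices:
                # Final chunk carries usage stats and an empty choices list.
                usage = chunk.usage
                continue
            delta = chunk.choices[0].delta
            if delta.content:
                parts.append(delta.content)
        return "".join(parts), usage

    client = OpenAI()
    stream = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Count to three."}],
        stream=True,
        stream_options={"include_usage": True},
    )
    text, usage = collect_stream_text(stream)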

Full Changelog: 0.8.5...0.8.6

0.8.5

28 May 20:33
bc72c3e

What's Changed

New

  • Add gpt-4o to cli benchmark_models by @wenzhe-log10 in #159
  • ENG-724: Add a function in load.py to return the last_completion_id by @nqn in #165
  • ENG-784 Add anthropic async and tools stream api support by @kxtran in #162

Bug fixes

  • Session bug (fix with context variables) by @nqn in #161
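
A rough illustration of the context-variable technique this fix refers to (the names below are hypothetical, not log10's internals): a contextvars.ContextVar keeps a separate session id per async task, so concurrent calls no longer share mutable session state.

    import contextvars
    import uuid

    # Hypothetical names; this illustrates the pattern, not log10's actual code.
    _session_id_var = contextvars.ContextVar("log10_session_id", default=None)

    def get_session_id() -> str:
        sid = _session_id_var.get()
        if sid is None:
            sid = str(uuid.uuid4())
            _session_id_var.set(sid)
        return sid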

Full Changelog: 0.8.4...0.8.5

0.8.4

29 Apr 19:07
4ef086a

What's Changed

  • ENG-615: Make feedback task name required & add optional completion tag selectors to feedback task creation by @nullfox in #153

Full Changelog: 0.8.3...0.8.4

0.8.3

26 Apr 00:11
901925c

What's Changed

Bug fixes

  • fix condition when finish_reason is stop for tool_calls by @wenzhe-log10 in #152
  • strip kwargs set to openai.NOT_GIVEN for openai sync calls by @wenzhe-log10 in #154 (see the sketch below)
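
For context, a minimal sketch of what stripping such kwargs looks like (the helper name is hypothetical): values explicitly set to openai.NOT_GIVEN are dropped before the call is forwarded and logged.

    import openai

    def strip_not_given(kwargs: dict) -> dict:
        """Drop kwargs whose value is openai.NOT_GIVEN (hypothetical helper)."""
        return {k: v for k, v in kwargs.items() if v is not openai.NOT_GIVEN}

    cleaned = strip_not_given({"temperature": 0.2, "top_p": openai.NOT_GIVEN})
    # cleaned == {"temperature": 0.2}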

Full Changelog: 0.8.2...0.8.3

0.8.2

24 Apr 22:32
61e4561

What's Changed

New

  • Support the google.generativeai SDK ChatSession.send_message and add examples by @wenzhe-log10 in #148
    import google.generativeai as genai

    from log10.load import log10

    log10(genai)

    model = genai.GenerativeModel("gemini-1.5-pro-latest", system_instruction="You are a cat. Your name is Neko.")
    chat = model.start_chat(
        history=[
            {"role": "user", "parts": [{"text": "please say yes."}]},
            {"role": "model", "parts": [{"text": "Yes yes yes?"}]},
        ]
    )

    prompt = "please say no."
    response = chat.send_message(prompt)

    print(response.text)
    
  • README update - add model comparison using CLI by @wenzhe-log10 in #149

Fix

  • Move cli utils func into its own file by @kxtran in #150
  • fix sync stream for both openai tool_calls and magentic function calls by @wenzhe-log10 in #136

Full Changelog: 0.8.1...0.8.2