
Add custom api key support to chat endpoint #507

Merged

3 commits merged into dev from f/custom-api-key-support-to-chat-endpoint on Sep 18, 2024

Conversation

HamadaSalhab
Contributor

@HamadaSalhab HamadaSalhab commented Sep 18, 2024

🚀 This description was created by Ellipsis for commit a51e6f8

feat: add custom API key support to chat endpoint

Summary:

Add support for custom API keys in the chat endpoint by modifying the acompletion() and chat() functions and updating the API specification.

Key points:

  • Behavior:
    • Adds support for custom API keys to acompletion() in litellm.py by introducing a custom_api_key parameter.
    • Modifies chat() in chat.py to accept an X-Custom-Api-Key header and pass it through to acompletion().
  • API:
    • Updates endpoints.tsp to include X-Custom-Api-Key as an optional header on the generate endpoint.
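The plumbing described above can be sketched in plain Python. This is a minimal illustration, not the repository's actual code: resolve_custom_api_key and the default-key fallback are hypothetical names, and in chat.py the header would arrive as a FastAPI parameter declared with Header(None, alias="X-Custom-Api-Key") rather than be looked up in a raw dict.

```python
from typing import Optional

def resolve_custom_api_key(headers: dict) -> Optional[str]:
    """Case-insensitively look up the X-Custom-Api-Key header.

    In the real chat.py, FastAPI performs this extraction via
    Header(None, alias="X-Custom-Api-Key"); this helper only
    illustrates the behavior.
    """
    for name, value in headers.items():
        if name.lower() == "x-custom-api-key":
            return value
    return None

async def acompletion(*, model: str, messages: list,
                      custom_api_key: Optional[str] = None) -> str:
    # Hypothetical stand-in for the litellm.py wrapper: prefer the
    # caller-supplied key, otherwise fall back to the service default.
    api_key = custom_api_key or "SERVICE_DEFAULT_KEY"
    return api_key  # real code would call the model with this key
```

With this shape, a request carrying X-Custom-Api-Key is billed against the caller's key, while requests without it keep using the service-wide key.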

Generated with ❤️ by ellipsis.dev


@ellipsis-dev ellipsis-dev bot left a comment


👍 Looks good to me! Reviewed everything up to a51e6f8 in 13 seconds

More details
  • Looked at 71 lines of code in 3 files
  • Skipped 0 files when reviewing.
  • Skipped posting 2 drafted comments based on config settings.
1. agents-api/agents_api/clients/litellm.py:18
  • Draft comment:
    Consider re-adding the line model = f"openai/{model}" if the litellm proxy still requires this format for model names.
  • Reason this comment was not posted:
    Decided after close inspection that this draft comment was likely wrong and/or not actionable:
    The comment is speculative: it asks the author to consider re-adding the line based on an unverified condition and offers no evidence that the litellm proxy still requires the "openai/" prefix. Per the review rules, speculative comments without strong evidence of an issue should not be posted, so the draft was dropped.
2. agents-api/agents_api/routers/sessions/chat.py:36
  • Draft comment:
    Ensure consistency in header alias usage across the codebase if other headers are used similarly.
  • Reason this comment was not posted:
    Confidence changes required: 50%
    The addition of x_custom_api_key to the chat() function is correctly implemented, but the Header alias usage should stay consistent with any other aliased headers in the codebase.
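For context on draft comment 1: litellm routes requests by provider-prefixed model names, which is what the flagged `model = f"openai/{model}"` line produced. A minimal sketch of that convention, using a hypothetical helper name (this does not reflect what the PR actually kept or removed):

```python
def with_provider_prefix(model: str, provider: str = "openai") -> str:
    # litellm-style routing uses "<provider>/<model>". Leave the name
    # untouched if it already carries a prefix; otherwise prepend one.
    # Hypothetical helper mirroring the line the bot flagged.
    if "/" in model:
        return model
    return f"{provider}/{model}"
```

Whether such prefixing is still needed depends on how the litellm proxy is configured, which is exactly why the bot judged its own draft comment too speculative to post.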

Workflow ID: wflow_feoUgOlRFgnkCXAh


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.

@HamadaSalhab HamadaSalhab merged commit c8eff6f into dev Sep 18, 2024
4 of 7 checks passed
@HamadaSalhab HamadaSalhab deleted the f/custom-api-key-support-to-chat-endpoint branch September 18, 2024 17:50