
Passing and receiving response headers to and from the language model #1379

Open
cmapund opened this issue Sep 26, 2024 · 0 comments
Labels
enhancement New feature or request

Comments


cmapund commented Sep 26, 2024

🚀 The feature

Can extra headers be passed to the large language model, and can the headers in its response be extracted?

Motivation, pitch

I have AzureOpenAI endpoints in different regions, and my use case requires that some of my data must not leave its country of origin. It's super easy to do that with AzureOpenAI chat completions, by passing in the target endpoint for each query, but I can't find the equivalent functionality with PandasAI Agents. Moreover, I have no way of verifying which endpoint gave me my response, because I only get back a string, dataframe, or graph.

I know it's possible because PandasAI sits on top of the language models, which can accept extra request headers and return response headers. I need to build a conversational data agent and PandasAI is ticking all the boxes - except that security one. Is this already possible and I missed it in the documentation somewhere?
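For reference, this is roughly what I do today with the plain openai Python SDK: pass `extra_headers` on each call and read the response headers back via `with_raw_response`. The endpoint, deployment name, and custom request header below are placeholders, and the exact response header carrying the serving region may vary by setup; this is a sketch, not PandasAI API.

```python
from openai import AzureOpenAI

# Region-pinned client; endpoint, key, and api_version are placeholders.
client = AzureOpenAI(
    azure_endpoint="https://my-eu-resource.openai.azure.com",
    api_key="<api-key>",
    api_version="2024-02-01",
)

# with_raw_response exposes the HTTP response headers alongside the parsed body.
raw = client.chat.completions.with_raw_response.create(
    model="my-gpt4-deployment",  # Azure deployment name (placeholder)
    messages=[{"role": "user", "content": "Which region served this request?"}],
    extra_headers={"x-data-residency": "eu"},  # hypothetical custom request header
)

# Azure typically reports the serving region in a response header such as
# x-ms-region (name may differ); this is the verification I'd like PandasAI to surface.
print(raw.headers.get("x-ms-region"))

completion = raw.parse()
print(completion.choices[0].message.content)
```

It's this per-call header passing and header inspection that I can't find a way to do through a PandasAI Agent.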

Alternatives

No response

Additional context

No response

@dosubot dosubot bot added the enhancement New feature or request label Sep 26, 2024