Bug Description
I am facing a similar issue: I have set my global handler for PromptLayer, but none of the responses are logged in my account. While debugging I confirmed that the global handler is indeed set, yet when I invoke `llama_parse_query_engine.query`, it does not trigger any tracing calls to PromptLayer:
<llama_index.callbacks.promptlayer.base.PromptLayerHandler object at 0x0000023D6734FEC0>
Here is how I set up PromptLayer in my FastAPI application. (My API key is set as described in the documentation, and it is active; I verified it with another application in JS.)
```python
from llama_index.core import set_global_handler

set_global_handler("promptlayer", pl_tags=["test"])
```
Here are the versions of the packages I am currently using:
llama-index 0.10.65
python ^3.12
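As a quick environment check against the versions above, the following stdlib-only sketch verifies that both the core package and the PromptLayer callback integration resolve as importable modules. (This is a diagnostic I added for illustration; in particular, the assumption that the 0.10.x integration ships separately as `llama-index-callbacks-promptlayer`, exposing `llama_index.callbacks.promptlayer`, is mine, not from the original report.)

```python
import importlib.util


def installed(mod: str) -> bool:
    """Return True if `mod` can be found on the current Python path."""
    try:
        return importlib.util.find_spec(mod) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. `llama_index`) is missing.
        return False


# Both must resolve for the "promptlayer" global handler to be usable;
# the second module path is an assumption based on the handler repr
# `llama_index.callbacks.promptlayer.base.PromptLayerHandler`.
for mod in ("llama_index.core", "llama_index.callbacks.promptlayer"):
    print(mod, "OK" if installed(mod) else "MISSING")
```

If the callback module reports `MISSING`, installing the integration package would be the first thing to try before digging further.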
If you need any further information, please let me know.
Version
0.10.65
Steps to Reproduce
1. Install llama-index 0.10.65 on Python ^3.12, and set an active PromptLayer API key as described in the documentation.
2. In a FastAPI application, register the global handler:

   ```python
   from llama_index.core import set_global_handler

   set_global_handler("promptlayer", pl_tags=["test"])
   ```

3. Build a query engine over LlamaParse output and call `llama_parse_query_engine.query(...)`.
4. Observe that the global handler is set (e.g. `<llama_index.callbacks.promptlayer.base.PromptLayerHandler object at 0x0000023D6734FEC0>`), but no requests appear in the PromptLayer account.
Relevant Logs/Tracebacks
Similar to the issue mentioned here:
https://github.com/run-llama/llama_index/issues/10632