
Process Time Inconsistent #480

Open
@tech-deployment

Description


Describe the bug
I am trying to understand why there is such a large difference in parsing response time between:

  • Parsing a file directly on LlamaCloud
  • Parsing a file using the Python client

Job ID
Job ID using LlamaCloud: 11d692d2-c10b-4323-a1b6-659ce705b9a6
Job ID using the LlamaParse client: 27be0d32-3923-438b-b782-51fbf39e5695

Additional context
I have been using the same options for both (see the sketch below):
-> Accurate mode
-> Cache disabled in both cases
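For reference, here is a minimal sketch of how the Python client is being invoked. The API key placeholder and `sample.pdf` are hypothetical stand-ins for my actual setup, `invalidate_cache` / `do_not_cache` are assumed to be the relevant flags for disabling the cache, and accurate mode is assumed to be the default when neither `fast_mode` nor `premium_mode` is set.

```python
from llama_parse import LlamaParse

# Sketch of the client setup used for the comparison (placeholders, not the exact script).
parser = LlamaParse(
    api_key="llx-...",        # placeholder; or set LLAMA_CLOUD_API_KEY in the environment
    result_type="markdown",
    verbose=True,
    invalidate_cache=True,    # assumed flag: ignore any previously cached result
    do_not_cache=True,        # assumed flag: do not cache this run either
)

# "sample.pdf" is a placeholder for the document behind the job IDs above.
documents = parser.load_data("./sample.pdf")
print(documents[0].text[:500])
```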

Some numbers:
On LlamaCloud, the average processing time is around 2-4 seconds.
Using the LlamaParse Python client, I am getting an average of 26 seconds when it completes; sometimes it hangs forever.

Is there any setup needed on the Python client to bring it up to speed?

Thank you

Pierre

Labels

bug (Something isn't working)