
Conversation

@NarekA (Contributor) commented Aug 29, 2023

Currently, having a ClassVar in an input class breaks the application.

from typing import ClassVar
from docarray import BaseDoc

class MyDoc(BaseDoc):
    endpoint: ClassVar[str] = "my_endpoint"
    input_test: str = ""

Error:

    field_info = model.__fields__[field_name].field_info
KeyError: 'endpoint'
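A minimal sketch of the root cause (illustrative names; plain pydantic `BaseModel` stands in for docarray's `BaseDoc`, which builds on it): Python keeps the `ClassVar` annotation in `__annotations__`, but pydantic does not register it as a model field, so looking up every annotation name in `__fields__` raises `KeyError`.

```python
from typing import ClassVar

from pydantic import BaseModel


class MyDoc(BaseModel):  # stand-in for docarray's BaseDoc
    endpoint: ClassVar[str] = "my_endpoint"
    input_test: str = ""


# The ClassVar is annotated on the class...
assert "endpoint" in MyDoc.__annotations__
# ...but pydantic does not treat a ClassVar as a model field,
# so MyDoc.__fields__["endpoint"] raises KeyError: 'endpoint'.
assert "endpoint" not in MyDoc.__fields__
```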

Goals:

  • Allow input schemas to have ClassVar fields which are excluded from the API docs.
  • Check and update the documentation (see the guide and ask the team).

from typing import ClassVar, Optional
from docarray import BaseDoc
from docarray.typing import AnyTensor, ImageUrl

class CustomDoc(BaseDoc):
    tensor: Optional[AnyTensor]
    url: ImageUrl
    class_var: ClassVar[str] = "class_var_val"
Contributor:

I think we need a better test for this. In this case only an empty DocList is created.

Contributor Author:

I added classvars to more tests. Let me know if there are any more tests you think I should add it to.


codecov bot commented Aug 29, 2023

Codecov Report

Patch coverage: 33.33% and project coverage change: -0.01% ⚠️

Comparison is base (0569354) 77.60% compared to head (48b0e72) 77.59%.
Report is 1 commit behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #6035      +/-   ##
==========================================
- Coverage   77.60%   77.59%   -0.01%     
==========================================
  Files         144      144              
  Lines       13788    13790       +2     
==========================================
+ Hits        10700    10701       +1     
- Misses       3088     3089       +1     
Flag   Coverage Δ
jina   77.59% <33.33%> (-0.01%) ⬇️

Flags with carried forward coverage won't be shown.

Files Changed                   Coverage Δ
jina/serve/runtimes/helper.py   23.01% <0.00%> (-0.38%) ⬇️
jina/__init__.py                56.00% <100.00%> (ø)

... and 1 file with indirect coverage changes

☔ View full report in Codecov by Sentry.

@JoanFM JoanFM left a comment


You need to import ClassVar in the tests.

@NarekA (Contributor Author) commented Sep 1, 2023

@JoanFM Done. Sorry, I haven't had time to set up my dev environment.

fields: Dict[str, Any] = {}
for field_name, field in model.__annotations__.items():
    if field_name not in model.__fields__:
        continue
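The guard above can be exercised with a short, self-contained sketch (illustrative names; plain pydantic `BaseModel` stands in for docarray's `BaseDoc`):

```python
from typing import Any, ClassVar, Dict

from pydantic import BaseModel


class MyDoc(BaseModel):  # stand-in for docarray's BaseDoc
    endpoint: ClassVar[str] = "my_endpoint"  # internal-only class attribute
    input_test: str = ""


fields: Dict[str, Any] = {}
for field_name in MyDoc.__annotations__:
    if field_name not in MyDoc.__fields__:
        continue  # skips ClassVar entries such as 'endpoint'
    fields[field_name] = MyDoc.__fields__[field_name]

print(list(fields))  # ['input_test']
```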
Contributor:

But what will happen here is that the model is not shown in the gateway.

I need an example that proves that if I deploy an Executor in a Flow with such a document class, the info also shows up in the OpenAPI description.

Contributor:

Or isn't that your expected behavior?

Contributor Author:

@JoanFM the way we are currently using ClassVars, they should not show up in the OpenAPI documentation. They are attributes specific to a type of input that is used internally within the service. One of the recent changes broke this pattern for us, so this change is to fix that.
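For reference, this matches pydantic's own schema generation, which is what ultimately feeds the OpenAPI docs. A hedged sketch (illustrative names; plain pydantic `BaseModel` stands in for `BaseDoc`):

```python
from typing import ClassVar

from pydantic import BaseModel


class InputDoc(BaseModel):  # stand-in for an input BaseDoc
    endpoint: ClassVar[str] = "my_endpoint"  # internal-only attribute
    text: str = ""


# The generated JSON schema lists only real fields; ClassVars are omitted,
# so they never surface in the OpenAPI description built from it.
schema = InputDoc.schema()  # pydantic v1 API; v2 uses model_json_schema()
print(sorted(schema["properties"]))  # ['text']
```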

Contributor Author:

@JoanFM I added another example/test here. Here the ClassVar is an attribute associated with the input type, but not exposed to the user.

@NarekA NarekA requested a review from JoanFM September 6, 2023 04:44
@JoanFM JoanFM changed the title from "Skip Annotations not in Fields" to "fix: skip doc attributes in __annotations__ but not in __fields__" Sep 6, 2023
@JoanFM (Contributor) commented Sep 6, 2023

@NarekA ,

tests are failing for your PR https://github.com/jina-ai/jina/actions/runs/6092600012/job/16533518592?pr=6035

These are the tests that use docarray>0.30

@NarekA NarekA force-pushed the fix-input-doc-description branch from 94694bb to f5bf383 Compare September 6, 2023 12:55
@NarekA (Contributor Author) commented Sep 6, 2023

@NarekA ,

tests are failing for your PR https://github.com/jina-ai/jina/actions/runs/6092600012/job/16533518592?pr=6035

These are the tests that use docarray>0.30

I believe my latest change fixed that module. I am seeing the tests pass when I run them locally:

❯ pytest tests/unit/serve/runtimes/test_helper.py -vv
========================== test session starts ==========================
platform darwin -- Python 3.8.17, pytest-7.4.1, pluggy-1.3.0 -- /Users/narek/anaconda3/envs/jina/bin/python
cachedir: .pytest_cache
rootdir: /Users/narek/projects/jina
configfile: pytest.ini
plugins: anyio-3.7.1
collected 19 items                                                      

tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[key-False] PASSED [  5%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[key_1-False] PASSED [ 10%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[executor__key-True] PASSED [ 15%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[exec2__key_2-True] PASSED [ 21%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[__results__-False] PASSED [ 26%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[__banana__-False] PASSED [ 31%]
tests/unit/serve/runtimes/test_helper.py::test_split_key_executor_name[executor__key-key-executor] PASSED [ 36%]
tests/unit/serve/runtimes/test_helper.py::test_split_key_executor_name[executor__key_1-key_1-executor] PASSED [ 42%]
tests/unit/serve/runtimes/test_helper.py::test_split_key_executor_name[executor_1__key-key-executor_1] PASSED [ 47%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param0-parsed_param0-executor] PASSED [ 52%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param1-parsed_param1-executor] PASSED [ 57%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param2-parsed_param2-executor] PASSED [ 63%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param3-parsed_param3-executor] PASSED [ 68%]
tests/unit/serve/runtimes/test_helper.py::test_get_name_from_replicas[exec1/rep-0-exec1] PASSED [ 73%]
tests/unit/serve/runtimes/test_helper.py::test_get_name_from_replicas[exec1-exec1] PASSED [ 78%]
tests/unit/serve/runtimes/test_helper.py::test_create_pydantic_model_from_schema[proto] PASSED [ 84%]
tests/unit/serve/runtimes/test_helper.py::test_create_pydantic_model_from_schema[json] PASSED [ 89%]
tests/unit/serve/runtimes/test_helper.py::test_create_empty_doc_list_from_schema[proto] PASSED [ 94%]
tests/unit/serve/runtimes/test_helper.py::test_create_empty_doc_list_from_schema[json] PASSED [100%]

@NarekA (Contributor Author) commented Sep 6, 2023

@JoanFM ⬆️

@NarekA (Contributor Author) commented Sep 6, 2023

@JoanFM I can't tell if the segfaults in the last test run are real errors, or just need to be re-run. Can you take a look?

@NarekA (Contributor Author) commented Sep 6, 2023

Last execution: https://github.com/jina-ai/jina/actions/runs/6097551109/job/16545774855

Thread 0x00007f5c540d8b80 (most recent call first):
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/selectors.py", line 468 in select
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/asyncio/base_events.py", line 1823 in _run_once
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/asyncio/base_events.py", line 570 in run_forever
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/asyncio/base_events.py", line 603 in run_until_complete
  File "/home/runner/work/jina/jina/jina/orchestrate/flow/base.py", line 1989 in _wait_until_all_ready
  File "/home/runner/work/jina/jina/jina/orchestrate/flow/base.py", line 1843 in start
  File "/home/runner/work/jina/jina/jina/orchestrate/flow/builder.py", line 33 in arg_wrapper
  File "/home/runner/work/jina/jina/jina/orchestrate/orchestrator.py", line 14 in __enter__
  File "/home/runner/work/jina/jina/tests/integration/sandbox/test_sandbox.py", line 11 in test_sandbox
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/python.py", line 1792 in runtest
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 262 in <lambda>
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 341 in from_call
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 261 in call_runtest_hook
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/flaky/flaky_pytest_plugin.py", line 138 in call_and_report
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 133 in runtestprotocol
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/flaky/flaky_pytest_plugin.py", line 94 in pytest_runtest_protocol
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 349 in pytest_runtestloop
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 324 in _main
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 270 in wrap_session
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 317 in pytest_cmdline_main
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/config/__init__.py", line 168 in main
  File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/config/__init__.py", line 191 in console_main
  File "/opt/hostedtoolcache/Python/3.8.17/x64/bin/pytest", line 8 in <module>
/home/runner/work/_temp/d40278d6-4c92-44f4-8c21-3357dbff6133.sh: line 1:  7638 Segmentation fault      (core dumped) pytest --suppress-no-test-exit-code --force-flaky --min-passes 1 --max-runs 5 --cov=jina --cov-report=xml --timeout=600 -v -s --ignore-glob='tests/integration/docarray_v2/*' --ignore-glob='tests/integration/stateful/*' --ignore-glob='tests/integration/hub_usage/dummyhub*' tests/integration/sandbox/
Error: Process completed with exit code 139.

@JoanFM (Contributor) commented Sep 6, 2023

Thanks for the contribution @NarekA !!!

@JoanFM JoanFM merged commit 7529214 into jina-ai:master Sep 6, 2023
@JoanFM JoanFM mentioned this pull request Sep 6, 2023
@NarekA NarekA deleted the fix-input-doc-description branch September 6, 2023 16:38