fix: skip doc attributes in __annotations__ but not in __fields__ #6035

Conversation
class CustomDoc(BaseDoc):
    tensor: Optional[AnyTensor]
    url: ImageUrl
    class_var: ClassVar[str] = "class_var_val"
I think we need a better test for this. In this case only an empty DocList is created.
I added ClassVars to more tests. Let me know if there are any other tests you think I should add them to.
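For reference, a minimal sketch of the kind of assertion such a test can make (assuming docarray v2; the class mirrors the CustomDoc above, and the assertions are illustrative, not the exact tests added in this PR):

```python
from typing import ClassVar, Optional

from docarray import BaseDoc
from docarray.typing import AnyTensor, ImageUrl


class CustomDoc(BaseDoc):
    tensor: Optional[AnyTensor]
    url: ImageUrl
    class_var: ClassVar[str] = "class_var_val"


# The ClassVar shows up in the raw annotations but is not a pydantic field,
# so schema reconstruction must not treat it as one.
assert "class_var" in CustomDoc.__annotations__
assert "class_var" not in CustomDoc.__fields__
```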
Codecov Report
Patch coverage:
Additional details and impacted files

@@            Coverage Diff             @@
##           master    #6035      +/-   ##
==========================================
- Coverage   77.60%   77.59%   -0.01%
==========================================
  Files         144      144
  Lines       13788    13790       +2
==========================================
+ Hits        10700    10701       +1
- Misses       3088     3089       +1

Flags with carried forward coverage won't be shown.
☔ View full report in Codecov by Sentry.
JoanFM left a comment
You need to import ClassVar in the tests.
@JoanFM Done. Sorry, I haven't had time to set up my dev environment.
fields: Dict[str, Any] = {}
for field_name, field in model.__annotations__.items():
    if field_name not in model.__fields__:
        continue
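For context, a minimal sketch (plain pydantic with the v1-style API assumed; not part of this diff) of why the membership check against `__fields__` filters out ClassVar entries:

```python
from typing import ClassVar

from pydantic import BaseModel


class Doc(BaseModel):
    text: str
    class_var: ClassVar[str] = "internal"


# ClassVar entries remain in __annotations__, but pydantic does not register
# them as fields, so iterating __annotations__ alone would pick them up.
print("class_var" in Doc.__annotations__)  # True
print("class_var" in Doc.__fields__)       # False
```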
But here, what will happen is that the model is not shown in the gateway. I need an example that proves that, if I deploy an Executor in a Flow with such a document class, that info also shows up in the OpenAPI description.
Or isn't that your expected behavior?
@JoanFM the way we are currently using ClassVars, they should not show up in the OpenAPI documentation. They are attributes specific to a type of input that is used internally within the service. One of the recent changes broke this pattern for us, so this change is to fix that.
@JoanFM I added another example/test here. Here the ClassVar is an attribute associated with the input type, but not exposed to the user.
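As a hedged sketch of this usage pattern (the SearchQuery class and its fields are hypothetical, not taken from the PR): the ClassVar carries service-internal metadata and stays out of the generated schema, so it never reaches the OpenAPI description served by the gateway.

```python
from typing import ClassVar

from docarray import BaseDoc


class SearchQuery(BaseDoc):  # hypothetical input type
    text: str
    index_name: ClassVar[str] = "products"  # used internally, never user-facing


# The regular field is part of the schema; the ClassVar is not.
assert "text" in SearchQuery.schema()["properties"]
assert "index_name" not in SearchQuery.schema()["properties"]
```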
@NarekA, tests are failing for your PR (https://github.com/jina-ai/jina/actions/runs/6092600012/job/16533518592?pr=6035). These are the tests that use
94694bb to f5bf383
I believe my latest change fixed that module. I am seeing tests passing when I run locally:

❯ pytest tests/unit/serve/runtimes/test_helper.py -vv
========================== test session starts ==========================
platform darwin -- Python 3.8.17, pytest-7.4.1, pluggy-1.3.0 -- /Users/narek/anaconda3/envs/jina/bin/python
cachedir: .pytest_cache
rootdir: /Users/narek/projects/jina
configfile: pytest.ini
plugins: anyio-3.7.1
collected 19 items
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[key-False] PASSED [ 5%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[key_1-False] PASSED [ 10%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[executor__key-True] PASSED [ 15%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[exec2__key_2-True] PASSED [ 21%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[__results__-False] PASSED [ 26%]
tests/unit/serve/runtimes/test_helper.py::test_is_specific_executor[__banana__-False] PASSED [ 31%]
tests/unit/serve/runtimes/test_helper.py::test_split_key_executor_name[executor__key-key-executor] PASSED [ 36%]
tests/unit/serve/runtimes/test_helper.py::test_split_key_executor_name[executor__key_1-key_1-executor] PASSED [ 42%]
tests/unit/serve/runtimes/test_helper.py::test_split_key_executor_name[executor_1__key-key-executor_1] PASSED [ 47%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param0-parsed_param0-executor] PASSED [ 52%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param1-parsed_param1-executor] PASSED [ 57%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param2-parsed_param2-executor] PASSED [ 63%]
tests/unit/serve/runtimes/test_helper.py::test_parse_specific_param[param3-parsed_param3-executor] PASSED [ 68%]
tests/unit/serve/runtimes/test_helper.py::test_get_name_from_replicas[exec1/rep-0-exec1] PASSED [ 73%]
tests/unit/serve/runtimes/test_helper.py::test_get_name_from_replicas[exec1-exec1] PASSED [ 78%]
tests/unit/serve/runtimes/test_helper.py::test_create_pydantic_model_from_schema[proto] PASSED [ 84%]
tests/unit/serve/runtimes/test_helper.py::test_create_pydantic_model_from_schema[json] PASSED [ 89%]
tests/unit/serve/runtimes/test_helper.py::test_create_empty_doc_list_from_schema[proto] PASSED [ 94%]
tests/unit/serve/runtimes/test_helper.py::test_create_empty_doc_list_from_schema[json] PASSED [100%]
@JoanFM ⬆️
…/jina into fix-input-doc-description
@JoanFM I can't tell if the segfaults in the last test run are real errors or just need to be re-run. Can you take a look?
Last execution: https://github.com/jina-ai/jina/actions/runs/6097551109/job/16545774855

Thread 0x00007f5c540d8b80 (most recent call first):
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/selectors.py", line 468 in select
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/asyncio/base_events.py", line 1823 in _run_once
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/asyncio/base_events.py", line 570 in run_forever
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/asyncio/base_events.py", line 603 in run_until_complete
File "/home/runner/work/jina/jina/jina/orchestrate/flow/base.py", line 1989 in _wait_until_all_ready
File "/home/runner/work/jina/jina/jina/orchestrate/flow/base.py", line 1843 in start
File "/home/runner/work/jina/jina/jina/orchestrate/flow/builder.py", line 33 in arg_wrapper
File "/home/runner/work/jina/jina/jina/orchestrate/orchestrator.py", line 14 in __enter__
File "/home/runner/work/jina/jina/tests/integration/sandbox/test_sandbox.py", line 11 in test_sandbox
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/python.py", line 194 in pytest_pyfunc_call
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/python.py", line 1792 in runtest
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 169 in pytest_runtest_call
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 262 in <lambda>
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 341 in from_call
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 261 in call_runtest_hook
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/flaky/flaky_pytest_plugin.py", line 138 in call_and_report
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 133 in runtestprotocol
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/runner.py", line 114 in pytest_runtest_protocol
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/flaky/flaky_pytest_plugin.py", line 94 in pytest_runtest_protocol
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 349 in pytest_runtestloop
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 324 in _main
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line 270 in wrap_session
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/main.py", line [317](https://github.com/jina-ai/jina/actions/runs/6097551109/job/16545774855#step:8:318) in pytest_cmdline_main
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_callers.py", line 77 in _multicall
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_manager.py", line 115 in _hookexec
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/pluggy/_hooks.py", line 493 in __call__
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/config/__init__.py", line 168 in main
File "/opt/hostedtoolcache/Python/3.8.17/x64/lib/python3.8/site-packages/_pytest/config/__init__.py", line 191 in console_main
File "/opt/hostedtoolcache/Python/3.8.17/x64/bin/pytest", line 8 in <module>
/home/runner/work/_temp/d40278d6-4c92-44f4-8c21-3357dbff6133.sh: line 1: 7638 Segmentation fault (core dumped) pytest --suppress-no-test-exit-code --force-flaky --min-passes 1 --max-runs 5 --cov=jina --cov-report=xml --timeout=600 -v -s --ignore-glob='tests/integration/docarray_v2/*' --ignore-glob='tests/integration/stateful/*' --ignore-glob='tests/integration/hub_usage/dummyhub*' tests/integration/sandbox/
Error: Process completed with exit code 139.
Thanks for the contribution @NarekA !!!
Currently, having a ClassVar in an input class breaks the application.

Error:

Goals:
- Support ClassVar fields, which are excluded from the API docs.
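A hedged minimal example of the kind of input class described above (assuming docarray v2; the field names are illustrative assumptions, not taken from the PR):

```python
from typing import ClassVar, Optional

from docarray import BaseDoc
from docarray.typing import AnyTensor


class MyInput(BaseDoc):  # hypothetical input class
    tensor: Optional[AnyTensor]
    model_name: ClassVar[str] = "my-model"  # internal only, excluded from the API docs


# The ClassVar appears in __annotations__ but not in __fields__; the fix
# skips such entries when rebuilding the model schema in the gateway.
```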