
Add custom block size for neural network workflow #2891


Draft
thodkatz wants to merge 4 commits into main from add-nn-custom-block-size
Conversation

thodkatz
Contributor

Attempts to close #2701.

Before:
[image]

After:
[image: tiling_error]
[image: tiling_success]

We used to determine the block size with a fixed, sensible default value. But in some use cases, such as #2701, this value wasn't the best fit. So I have attempted to expose the block size configuration in the neural network applet.

There are at least two important factors when determining the block size: tiling artifacts caused by normalization issues in the model (see the relevant PR), and hardware constraints, e.g. GPU memory usage.

Another example of tiling artifacts:
[image]
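One common mitigation for such edge artifacts (not part of this PR) is to run inference on blocks expanded by a halo and crop the halo away afterwards, so normalization and padding effects at block borders are discarded. A minimal one-axis sketch with illustrative names, not ilastik's actual API:

```python
def blocks_with_halo(length, block_size, halo):
    """Yield (outer_start, outer_stop, inner_start, inner_stop) along one axis.

    The outer slice (inner block expanded by `halo`, clipped to the axis
    bounds) is what gets fed to the model; the inner slice is the region
    whose predictions are kept.
    """
    for start in range(0, length, block_size):
        stop = min(start + block_size, length)
        outer_start = max(start - halo, 0)
        outer_stop = min(stop + halo, length)
        yield outer_start, outer_stop, start, stop
```

The inner blocks tile the axis exactly once, while adjacent outer blocks overlap by `2 * halo` pixels.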

The current PR doesn't address the hardware constraints issue. It is still possible to choose a very large shape that leads to an out-of-memory exception, in which case the user has to retry with a smaller shape. But perhaps this process shouldn't be exposed to the user at all?
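As a rough illustration of the manual fallback described above, here is a hedged sketch of an automatic retry loop that halves the block size on an out-of-memory error; `run_inference` and the size constants are hypothetical, not part of this PR:

```python
def predict_with_fallback(run_inference, data, block_size, min_block=32):
    """Retry inference with progressively smaller block sizes on OOM.

    `run_inference(data, block_size)` is a hypothetical callable; torch
    surfaces CUDA OOM as a RuntimeError whose message contains
    "out of memory", which is what we match on here.
    """
    while block_size >= min_block:
        try:
            return run_inference(data, block_size)
        except RuntimeError as e:
            if "out of memory" not in str(e).lower():
                raise  # unrelated error: propagate unchanged
            block_size //= 2
    raise RuntimeError("no block size >= %d fits in memory" % min_block)
```

Whether such a loop belongs in the applet or in tiktorch itself is exactly the open question raised above.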


codecov bot commented Jul 26, 2024

Codecov Report

Attention: Patch coverage is 59.48905% with 111 lines in your changes missing coverage. Please review.

Project coverage is 55.81%. Comparing base (f0862ef) to head (8717f5d).
Report is 8 commits behind head on main.

Files Patch % Lines
ilastik/applets/neuralNetwork/nnClassGui.py 0.00% 69 Missing ⚠️
lazyflow/operators/tiktorch/classifier.py 79.78% 27 Missing and 10 partials ⚠️
lazyflow/operators/tiktorch/operators.py 66.66% 2 Missing and 1 partial ⚠️
ilastik/applets/neuralNetwork/opNNclass.py 84.61% 1 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2891      +/-   ##
==========================================
- Coverage   55.82%   55.81%   -0.02%     
==========================================
  Files         539      539              
  Lines       62582    62754     +172     
  Branches     8599     8607       +8     
==========================================
+ Hits        34936    35025      +89     
- Misses      25877    25951      +74     
- Partials     1769     1778       +9     

☔ View full report in Codecov by Sentry.

Theodoros Katzalis added 4 commits July 26, 2024 21:16
There was an assumption that ModelSession should be able to handle multiple inputs and outputs, each with multiple shapes. This logic was left partially implemented, with a lot of assertions wherever ModelSession was used. At the same time, the tiktorch server interface doesn't support multiple tensors or multiple shapes.

Support for this feature should be revisited rather than left partially implemented. To be able to continue with other features, ModelSession has been refactored to handle just the simple case of a single tensor and shape.
ModelSession's input shape is now classified more evidently as either explicit or parameterized.
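For context on that distinction: a parameterized shape (as in the bioimage.io model spec) is given by a per-axis minimum and step, so valid block sizes have the form `min + n * step`; an explicit shape admits exactly one value. A small illustrative sketch, with names that are assumptions rather than ilastik's actual API:

```python
def resolve_shape(min_shape, step, n):
    """Concretize a parameterized shape for a chosen integer n >= 0."""
    return tuple(m + n * s for m, s in zip(min_shape, step))

def is_valid_shape(shape, min_shape, step):
    """Check a user-chosen block size against a parameterized spec.

    A step of 0 on an axis makes that axis effectively explicit:
    only the minimum value is allowed.
    """
    for v, m, s in zip(shape, min_shape, step):
        if s == 0:
            if v != m:
                return False
        elif v < m or (v - m) % s != 0:
            return False
    return True
```

An explicit shape is then just the special case where every step is 0.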
@thodkatz thodkatz force-pushed the add-nn-custom-block-size branch from 8717f5d to 19dc023 Compare July 26, 2024 19:17
@thodkatz thodkatz marked this pull request as draft August 1, 2024 12:15
Development

Successfully merging this pull request may close these issues.

Wrong segmentation using NN Prediction