Enable warnings on travis (and make sure they are reasonable) #10158
I think you are right. I looked at the distribution of warnings.
The top 10 offenders:
The full thing: warnings.txt
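(For reference, a rough sketch of how such a tally could be produced; it assumes a plain-text warnings.txt with one pytest warning line per entry, and the script itself is hypothetical, not part of scikit-learn:)

```python
# Hypothetical helper: count warning categories in a saved pytest log
# and print the most frequent ones.
import re
import sys
from collections import Counter

counts = Counter()
with open(sys.argv[1]) as log:  # e.g. warnings.txt
    for line in log:
        # Grab the warning class name, e.g. "ConvergenceWarning".
        match = re.search(r"\b(\w+Warning)\b", line)
        if match:
            counts[match.group(1)] += 1

for name, count in counts.most_common(10):
    print(f"{count:6d}  {name}")
```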
I wonder what proportion comes from common tests...
A lot of warnings are indeed coming from common tests:
Okay. I've added hiding the ConvergenceWarnings there to the issue description.
IMO the main priority is to figure out where the 3000+ warnings from test_common.py are coming from, and to silence the ones we think are fine to silence. One example that jumps to mind: we fit on some very small data in test_common.py, so ignoring ConvergenceWarning there may be fine.
We explicitly limit the number of iterations in common tests, which is why we get the warnings there.
I think @jnothman is right, we get convergence warnings because we limit the number of iterations there.
So we got that warning a bunch of times, which is bad, since it means the code will actually break if we don't change it (but we still have two versions to fix it ;)). #9379 added a whole bunch of new warnings...
Also pretty bad:
Looks like the common tests are clean now. Looking for the worst offenders in the other tests now.
Updated:
Some other ones feel a bit more concerning, though:
I'm also confused by these:
Shouldn't these be errors?
I think we should try to avoid all warnings that are not RuntimeWarning or ConvergenceWarning, which are both numerical issues. The other warnings are mostly UserWarnings, which means we made weird mistakes in setting up the tests.
Did we make those estimator check ones warnings to avoid backward-compatibility issues? Maybe that needs looking into again.
What needs to be done here at this point?
I think someone needs to do what I did in #10158 (comment) and re-evaluate how many warnings are still there. From Andy's comments about the worst offenders, #11536 is still open (but you saw that, since you commented on the issue).
Some new ones, yay! (This is from #8022, but master is similar.)
@thomasjpfan is this still relevant?
@adrinjalali Yes, this feels like a never-ending issue. Since we do not actively check for warnings, new warnings will always appear. We need a good way to see whether a PR introduces new warnings, or a way to display the warnings a PR creates. Furthermore, on the master branch we should have a way to see the current warnings. In Azure, after running pytest, we can add a separate step that just shows the warnings (it would do a little string processing of the pytest output and display only the warnings).
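(A hypothetical version of that post-processing step, not an existing scikit-learn script: it reads a captured pytest log and prints only the block under pytest's "warnings summary" header.)

```python
# Hypothetical post-processing step: print only pytest's warnings summary.
import sys


def extract_warnings(log_path):
    """Return the lines between the '== warnings summary ==' header and
    the next '=' section header in a captured pytest log."""
    lines = []
    in_summary = False
    with open(log_path) as log:
        for line in log:
            if "warnings summary" in line:
                in_summary = True
                continue
            if in_summary and line.startswith("="):
                break  # next pytest section, e.g. the short test summary
            if in_summary:
                lines.append(line.rstrip("\n"))
    return lines


if __name__ == "__main__":
    for line in extract_warnings(sys.argv[1]):
        print(line)
```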
If we could somehow save those warnings as artifacts, then we could fail on the diff between master's log and the PR's, maybe?
Why do we need to do string processing? Can't we just show the pytest output?
We can show the pytest output, but it's the same as checking for pep8 issues on the whole codebase: we don't do that, we only fail if a PR introduces new issues. The issue is that we already have tons of warnings, and it'd be too hard to try to remove them all, so the idea is to prevent PRs from introducing new warnings.
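(A sketch of that "fail only on new warnings" idea; the file names and the one-warning-per-line format are assumptions, and this is not an existing CI job:)

```python
# Hypothetical CI check: exit non-zero only if the PR log contains
# warnings that are absent from the saved master log.
import sys


def warning_set(log_path):
    # Assumes one normalized warning message per line, e.g. as produced
    # by the extraction step sketched earlier in this thread.
    with open(log_path) as log:
        return {line.strip() for line in log if line.strip()}


if __name__ == "__main__":
    master_warnings = warning_set(sys.argv[1])  # e.g. master-warnings.txt
    pr_warnings = warning_set(sys.argv[2])      # e.g. pr-warnings.txt
    new = pr_warnings - master_warnings
    if new:
        print("New warnings introduced by this PR:")
        for warning in sorted(new):
            print("  " + warning)
        sys.exit(1)
```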
Why is it too hard? I want to remove them all.
Also, I would argue we should maybe do the same for flake8, because we keep introducing unused imports. Though LGTM might help us with that.
I'm totally +1 on both warnings and flake8!
We just disabled warnings on travis in #9840. I don't think we should do that. I've been a bit absent lately, but I think the current state of the warnings is pretty bad. Many of these seem to be recent changes that require cleanup.
ConvergenceWarnings raised in sklearn.utils.estimator_checks: done in [MRG+2] catch more expected warnings in common tests #11151.