ENH: add a `stats.differential_entropy` function #13631
Conversation
100% patch diff coverage of the new code, and the paper is really well cited, as Matt recently noted. I added a few comments before the stats regulars have a deeper look.
Some minor changes in the text
This is great. Good documentation, good input validation, good tests, good PEP 8. "No" might be an acceptable answer to all these comments. (Haven't checked the code yet, but it is short, so there probably won't be much to say.)
I think the general parts look good. I'll leave your points to the stats residents, Matt.
There is a segmentation fault in a macOS test, but I think it has nothing to do with these changes.
Re-running tests. I think this just needs opinions from a second maintainer who is familiar with stats. Depending on their thoughts, it may be ready to merge as-is.
Looks good, just a few remarks on the tests from my side. Also, if you know you will add more methods, I would still suggest adding the `method` parameter now, in case the follow-up PRs do not make it into the release. Just my opinion, for discussion here.
Sorry about that, I used the wrong account.
- Testing of scalar results now also uses `assert_allclose`.
- We use the context manager form of `assert_raises`.
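For readers unfamiliar with these two idioms, here is a minimal sketch (the `reciprocal` helper is hypothetical, used only for illustration; the assertion utilities are the real ones from `numpy.testing`):

```python
import numpy as np
from numpy.testing import assert_allclose, assert_raises

# Hypothetical helper, used only to demonstrate the two testing idioms.
def reciprocal(x):
    x = np.asarray(x, dtype=float)
    if np.any(x == 0):
        raise ValueError("x must be nonzero")
    return 1.0 / x

# Scalar results checked with assert_allclose (tolerant float comparison)
# rather than exact equality.
assert_allclose(reciprocal(4.0), 0.25)

# Context-manager form of assert_raises: only the code inside the
# `with` block is expected to raise, which keeps the scope precise.
with assert_raises(ValueError):
    reciprocal(0.0)
```

The context-manager form is preferred over `assert_raises(ValueError, reciprocal, 0.0)` because it pins the expected error to exactly one statement.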
Approving, as @mdhaber proposes to do a follow-up PR to move the functions to morestats.py. The small doctest remark I had could be addressed there (if you agree, of course).
Let's see if there's any more feedback from the mailing list. If not, I'll go ahead and merge this weekend, then submit a follow-up to move this into morestats.py if it still seems like a good idea.
@vnmabus Can you also merge master, please? This will help get rid of some unrelated documentation build errors. Once you do that, you can add
I merged master.
Thanks for the changes @vnmabus! The doc builds pass now and the annotations also look correct. So, +1 for merging! @rgommers You were complaining about warnings with
Indeed, the logs are clean. It's great if we can finally do inline type hints! I wonder why it works now.
OK, I'm going to go ahead and merge this. In a follow-up PR I'll suggest moving it to a more appropriate file.
Thanks @mdhaber. So this means we can do inline types now, right? We should make sure everyone is aware of this and start enforcing it. What do you think?
I don't know. As for enforcing it, that should be determined by the maintainers group as a whole. Please explain the advantages when proposing it; I don't know anything about it yet. My only experience with inline types so far is CI failures and unfamiliar-looking function signatures, so I would need to see that the benefits outweigh the hassle before supporting it. I'm happy to learn things that will improve SciPy for users or make SciPy more maintainable, though.
The immediate advantage of type hints is IDE support. If the whole codebase is typed, your IDE can catch type errors better and can help you as you type. It's difficult to go back to untyped code after that 😉.
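As a small illustration of that point, here is a hypothetical annotated function (assumes Python 3.9+ for built-in generics like `list[float]`); a checker or IDE can flag a call such as `cumulative_mean("oops")` before the code ever runs:

```python
def cumulative_mean(values: list[float]) -> list[float]:
    """Running mean of a sequence of floats.

    The annotations document the contract and let static tools
    catch type errors at edit time rather than at runtime.
    """
    total = 0.0
    means: list[float] = []
    for count, value in enumerate(values, start=1):
        total += value
        means.append(total / count)
    return means

print(cumulative_mean([2.0, 4.0, 6.0]))  # [2.0, 3.0, 4.0]
```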
Would also like to mention the
Never heard of it, thanks! Sounds a bit like Pythran.
While typing totally helps a professional programmer in a general setting, which is something I notice every day in my colleagues' professional workflows in Go and other languages, I find it not so helpful in scientific code, since most of the relevant types are ints, floats, bools, and the like. In turn, the annotations make function signatures very ugly and often unreadable. As an example, I use Spyder and it already handles the signatures well enough, and the complicated object plumbing, like a BankCustomerReport object being passed to a BankReporter function, is not present, so I think we can manage. To be honest, I don't see the point of using types in Python, but what do I know as a regular user.
Well, to be honest, I have found a lot of errors thanks to type annotations in my thesis project (which is scientific code), but what do I know as a regular user 😜.
There is a way to tell Sphinx to use the types. Use
I absolutely agree they can be ugly at times. I also agree that typed code is harder to read at first, but once you get used to it you can read and understand it just like normal Python code. Anyway, we can always create separate
`stats.differential_entropy` function
Adds a function to compute differential entropy given a sample from a continuous distribution.
Reference issue
Closes #4080.
What does this implement/fix?
Adds a `differential_entropy` function, which computes differential entropy using the Vasicek estimator.
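For background (this is an illustrative sketch, not the PR's actual code): the Vasicek estimator averages log-spacings of order statistics, H = (1/n) Σ log(n/(2m) · (X₍ᵢ₊ₘ₎ − X₍ᵢ₋ₘ₎)), padding indices beyond the sample with the extreme order statistics. The function name and the √n window heuristic below are assumptions for the example:

```python
import numpy as np

def vasicek_entropy(sample, window_length=None):
    """Estimate differential entropy from a 1-D sample (Vasicek, 1976).

    The sample is assumed to come from a continuous distribution,
    so spacings between distinct order statistics are positive.
    """
    x = np.sort(np.asarray(sample, dtype=float))  # order statistics
    n = x.size
    if window_length is None:
        # A common heuristic (an assumption here): m of order sqrt(n).
        window_length = int(np.sqrt(n) + 0.5)
    m = window_length
    # Pad the ends: X_(i) = X_(1) for i < 1 and X_(i) = X_(n) for i > n.
    padded = np.concatenate([np.repeat(x[0], m), x, np.repeat(x[-1], m)])
    # Spacings X_(i+m) - X_(i-m) for i = 1..n.
    spacings = padded[2 * m:] - padded[:-2 * m]
    return np.mean(np.log(n / (2 * m) * spacings))
```

With a large sample from a standard normal, the estimate approaches the true differential entropy ½·log(2πe) ≈ 1.4189; the estimator is consistent when m → ∞ and m/n → 0.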