codecov > 70% #72
Comments
@jacobbieker happy to discuss if it should be a different number
Yeah, I'll try to get it up to that; I need to think about it some more. Test coverage should definitely be higher than it is now, though.
There is an open issue about adding unit tests, which basically amounts to the same thing: #9. I would be up for writing a few unit tests, so if any of you with more experience with the code could point me at some likely candidates, I would be grateful.
Awesome, that would be great, thanks! A few good candidates, I think, would be unit tests built out of the individual parts of scripts/generate_test_plots.py, as there is an end-to-end test with the plots but no actual unit tests for the individual steps. The jpeg_xl_float_with_nans and scale_to_zero_to_one functions are especially worth testing, as they are quite important for creating the Zarrs and for making sure the data is saved with minimal changes.
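For example, a first unit test for the scaling could look roughly like the sketch below, and a similar round-trip test would work for jpeg_xl_float_with_nans. The import path, class name, and constructor/method signatures here are assumptions and would need checking against the actual code.

```python
# Rough sketch only: the import path, class name, and signatures below are
# assumptions about Satip's internals and may need adjusting to the real API.
import numpy as np
import xarray as xr

from satip.scale_to_zero_to_one import ScaleToZeroToOne  # assumed location


def test_rescale_stays_in_unit_interval_and_keeps_nans():
    """Rescaled values should lie in [0, 1] and NaNs should survive unchanged."""
    data = xr.DataArray(
        np.array([[0.0, 50.0], [np.nan, 100.0]], dtype=np.float32),
        dims=("x", "y"),
    )
    # Assumed constructor: explicit mins/maxs rather than fitting them from data.
    scaler = ScaleToZeroToOne(mins=np.array([0.0]), maxs=np.array([100.0]))
    rescaled = scaler.rescale(data)  # assumed method name

    assert np.nanmin(rescaled.values) >= 0.0
    assert np.nanmax(rescaled.values) <= 1.0
    assert np.isnan(rescaled.values[1, 0])  # NaNs should not be replaced
```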
Hi guys, sorry for the radio silence, but Covid hit me hard and it took me some time to get back up again, plus holidays, so this was delayed more than I would have liked. @jacobbieker I am having trouble with eccodes when testing the downloading and conversion as applied in the
No worries! Hope you are feeling all better now. Yeah, eccodes isn't included by default; I think the easiest way is to conda-install it, as it has a binary dependency. The workflow for the plot generation script includes all the required dependencies to run Satip end to end, though.
Yes, thanks, everything is back to normal. After you mentioned the conda install, I checked the Dockerfile, and indeed, that is where the missing packages are installed. Makes sense now. However, I am then quite baffled by the error. I will take the liberty of @-ing you over in the draft PR to get your insights on that. It might be that you have run into it before.
@jacobbieker should we keep this open? I reckon it was just closed because it was linked with PR #102.
Yeah, we should leave it open; it's mostly above 70 percent now, though!
I would vote to close it, though: with the current setup, 70% coverage overall is not feasible (see the description of PR #102). The uncovered code parts are all about downloading or chunking up large downloaded datasets. Downloading is covered via So to me, this means that the code is tested well enough, and if this issue is not closed it will stay open forever as one of those zombie issues that never get touched.
Should we write a test that covers the downloading, perhaps mocking the actual download with some data? I know 70% is a bit of a random number, and we don't have to do that. But if there are bits of code untested, then I think we should aim to cover them: https://app.codecov.io/gh/openclimatefix/Satip/branch/main. By the way, I can pitch in and help.
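Something along these lines is what I had in mind. Rough sketch only: the module path satip.download, the entry point download_eumetsat_data, and the internal helper _request_datasets are assumed or outright hypothetical names and would need to be swapped for the real ones.

```python
# Illustrative sketch: patch the lowest-level network call and feed it canned
# data so the surrounding download/bookkeeping logic still runs offline.
# "_request_datasets" is a hypothetical internal helper, and the entry point
# "download_eumetsat_data" is an assumed name.
from unittest import mock

import satip.download


def test_download_logic_runs_without_network(tmp_path):
    """Exercise the download wrapper without hitting the EUMETSAT API."""
    canned = [{"id": "MSG3-SEVI-MSG15-0100-NA-20220101001500", "size": 123}]
    with mock.patch(
        "satip.download._request_datasets", return_value=canned
    ) as fake_request:
        satip.download.download_eumetsat_data(
            download_directory=str(tmp_path),
            start_date="2022-01-01 00:00",
            end_date="2022-01-01 00:15",
        )
    fake_request.assert_called_once()
```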
I thought about mocks myself, but I don't think that would help here. The non-tested code's functionality is mostly about storing and transforming the downloaded files, which is where things can go wrong if formats in the libraries or EUMETSAT's API change. With mocks we would effectively test a few ifs and elses, but not what the code in there actually does. The simple download functionality itself is covered by For some of the tests I created mock data to test the transformations, wherever I saw a sensible way that would be in line with the intent of unit tests. I did not see a way to do that for the still-untested code parts, but if you have an idea, I am all ears.
It actually is above 70% now, so closing this.
Detailed Description
It would be nice to get codecov coverage above 70%.