codecov > 70% #72

Closed
peterdudfield opened this issue Feb 10, 2022 · 13 comments · Fixed by #102 · May be fixed by #79
Labels
enhancement New feature or request

Comments

@peterdudfield
Collaborator

Detailed Description

It would be nice to get codecov above 70%.

@peterdudfield peterdudfield added the enhancement New feature or request label Feb 10, 2022
@peterdudfield peterdudfield moved this to Todo in Nowcasting Feb 10, 2022
@peterdudfield
Collaborator Author

@jacobbieker happy to discuss if it should be a different number

@jacobbieker
Member

Yeah, I'll try to get it up to that; I need to think about it some more. Test coverage should definitely be higher than it is, though.

@jacobbieker jacobbieker mentioned this issue Feb 14, 2022
@notger
Collaborator

notger commented Mar 11, 2022

There is an open issue about adding unit tests, which basically amounts to the same thing: #9.

I'd be game to write a few unit tests, so if any of you with more experience of the code could point me at some likely candidates, I'd be grateful.

@notger notger mentioned this issue Mar 11, 2022
@jacobbieker
Member

Awesome! That would be great! Thanks. A few good candidates would be turning the individual parts of scripts/generate_test_plots.py into unit tests: there is an end-to-end test with the plots, but no actual unit tests for the individual parts. In particular, the jpeg_xl_float_with_nans and scale_to_zero_to_one functions are worth testing, as they are quite important for creating the Zarrs and making sure the data is saved with minimal changes.
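For illustration, a unit test for the scaling step could look like the sketch below. Note the implementation of `scale_to_zero_to_one` here is a simplified stand-in, not Satip's actual function (which works on calibrated satellite data, likely via xarray); only the tested properties, values landing in [0, 1] and NaNs surviving, reflect the discussion above.

```python
import numpy as np

def scale_to_zero_to_one(data, lower=None, upper=None):
    """Simplified stand-in for Satip's scaling function: map values into [0, 1].

    The real function in Satip operates on satellite imagery with per-channel
    bounds; this version only illustrates the property a unit test would check.
    """
    lower = np.nanmin(data) if lower is None else lower
    upper = np.nanmax(data) if upper is None else upper
    return (data - lower) / (upper - lower)

def test_scale_to_zero_to_one_bounds():
    data = np.array([[0.0, 50.0], [100.0, np.nan]])
    scaled = scale_to_zero_to_one(data)
    # All finite values must land in [0, 1], and NaNs must survive untouched,
    # since the data is later round-tripped through compression.
    assert np.nanmin(scaled) >= 0.0
    assert np.nanmax(scaled) <= 1.0
    assert np.isnan(scaled[1, 1])

test_scale_to_zero_to_one_bounds()
```

A similar round-trip test (encode, decode, compare with a tolerance) would be the natural shape for jpeg_xl_float_with_nans.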

@notger
Collaborator

notger commented Apr 11, 2022

Hi guys, sorry for the radio silence, but Covid hit me hard and it took me some time to get back on my feet, plus holidays, so this was delayed more than I would have liked.

@jacobbieker I am having trouble with eccodes when testing the downloading and conversion as applied in the generate_test_plots script. I dimly remember that the test setup does not have all libraries from requirements.txt installed; maybe this is connected?
Failed tests for reference: https://github.com/openclimatefix/Satip/runs/5968233754?check_suite_focus=true

@jacobbieker
Member

No worries! Hope you are feeling all better now.

Yeah, eccodes isn't included by default; I think the easiest way is to conda install it (e.g. `conda install -c conda-forge eccodes`), as it has a binary dependency. The workflow for the plot generation script includes all the required dependencies to run Satip end to end, though.

@notger
Collaborator

notger commented Apr 11, 2022

Yes, thanks, everything back to normal.

After you mentioned the conda install, I checked the Dockerfile and indeed, that is where the missing packages are installed. Makes sense now. However, I am still quite baffled by the error. I will take the liberty of @-ing you over in the draft PR to get your insights; you might have run into it before.

@notger notger mentioned this issue Apr 11, 2022
Repository owner moved this from Todo to Done in Nowcasting Apr 21, 2022
@peterdudfield
Collaborator Author

@jacobbieker should we keep this open? I reckon it was closed automatically because it was linked to PR #102.

@jacobbieker
Member

Yeah, we should leave it open; it's mostly above 70 percent now, though!

@jacobbieker jacobbieker reopened this Apr 22, 2022
Repository owner moved this from Done to In Progress in Nowcasting Apr 22, 2022
@notger
Collaborator

notger commented Apr 22, 2022

I would vote to close it, though: with the current setup, 70% overall coverage is not feasible (see the description of PR #102).

The uncovered code parts are all about downloading, or chunking up large downloaded datasets. Downloading is covered via scripts/generate_test_plots.py, and chunking up large downloaded datasets is not realistically testable on a regular basis.

So to me this means that the code is tested well enough, and if not closed, this issue will stay open forever as one of those zombie issues that never get touched.

@peterdudfield
Collaborator Author

peterdudfield commented Apr 22, 2022

Should we write a test that tests the downloading, perhaps mock the actual download with some data?

I know 70% is a bit of a random number, and we don't have to hit it exactly. But if there are bits of code untested, then I think we should aim to cover them.

https://app.codecov.io/gh/openclimatefix/Satip/branch/main

btw - I can pitch in and help

@notger
Collaborator

notger commented Apr 22, 2022

I thought about mocks myself, but I don't think that would help.

The untested code's functionality is mostly about storing and transforming the downloaded files, which is where things can go wrong if formats in the libs or EUMETSAT's API change. With mocks we would effectively test a few ifs and elses, but not what the code in there actually does.

The simple download functionality itself is covered by scripts/generate_test_plots.py, though that does not show up in the coverage report; personally, I don't mind.

For some of the tests I created mock data to test the transformations, where I saw a sensible way in line with the intent of unit tests. I did not see a way for the still-untested code parts, but if you have an idea, I am all ears.
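To make the trade-off concrete, the kind of mocked download test Peter suggests would look roughly like the sketch below. The `fetch_and_store` function and its call pattern are hypothetical, not Satip's actual API; the example also shows Notger's objection: the mock verifies the call path, not what real EUMETSAT data would do in the transformations.

```python
from unittest import mock

def fetch_and_store(api, product_id, destination):
    """Hypothetical download wrapper: fetch a product via an API client,
    record the payload, and return its size. Stands in for Satip's real
    download code, which has different names and more moving parts."""
    payload = api.download(product_id)
    destination[product_id] = payload
    return len(payload)

def test_fetch_and_store_with_mocked_api():
    fake_api = mock.Mock()
    fake_api.download.return_value = b"\x00\x01\x02"  # canned "satellite" bytes
    store = {}
    n_bytes = fetch_and_store(fake_api, "MSG15-RSS", store)
    # This proves the wrapper calls the API correctly and stores the result,
    # but says nothing about whether real downloaded data survives the
    # storing/transforming code paths discussed above.
    fake_api.download.assert_called_once_with("MSG15-RSS")
    assert n_bytes == 3
    assert store["MSG15-RSS"] == b"\x00\x01\x02"

test_fetch_and_store_with_mocked_api()
```

Such a test would raise the coverage number for the wrapper code while leaving the format-drift risk Notger describes untested.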

@jacobbieker
Member

It actually is above 70% now, so closing this.
