Part of the automated testing framework is, of course, the set of artifacts used to populate a Malcolm system with data to run tests against. In other words, PCAP.
GitHub has limits on large files, and for good reason: managing binary files, especially large ones, does not work well within Git. The usual solution is Git LFS (Large File Storage), which is not a GitHub-specific technology (it can be deployed on other Git servers or wherever), but it is also a service GitHub offers. GitHub's free Git LFS tier is fairly limited for our needs; at the volume we would require, it would be a paid service.
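For context on how Git LFS fits into a repository: tracked patterns are recorded in a `.gitattributes` file, and matching files are stored as small pointer files in Git while the actual content lives on the LFS server. A minimal `.gitattributes` for a PCAP artifact repository might look like the following (the `*.pcapng` pattern is an assumption about what capture formats would be stored; adjust to the actual artifact set):

```
# .gitattributes — route packet captures through Git LFS
# (this is what `git lfs track "*.pcap"` would generate)
*.pcap   filter=lfs diff=lfs merge=lfs -text
*.pcapng filter=lfs diff=lfs merge=lfs -text
```

With this in place, `git add`/`git commit` work as usual, but clones only download the large capture files on checkout (or on demand with `git lfs pull`).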
I'm going to make the automated testing framework as flexible as possible (supporting getting the PCAP data set from a number of places, I think), but we need to settle on a "canonical" location for storing the PCAP files. I am looking into whether cisagov's GitHub account (I'll look at idaholabs as well) is already paying for Git LFS, and if so, whether we could use it and what the limits would be.
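To illustrate what "getting the PCAP data set from a number of places" could look like, here is a minimal sketch of dispatching on a source specifier. The function name and the strategy labels are hypothetical, not part of any actual Malcolm code; the idea is just that a local path, a plain HTTP(S) download, and a Git/Git LFS repository could all be accepted as artifact sources:

```python
# Hypothetical sketch — names and layout are assumptions, not Malcolm's API.
# Classifies a PCAP artifact source so the framework can pick a retrieval
# strategy (copy from disk, download over HTTP, or clone a git repository).
from urllib.parse import urlparse

def classify_pcap_source(source: str) -> str:
    """Return a retrieval strategy name for a PCAP source specifier."""
    parsed = urlparse(source)
    if parsed.scheme in ("http", "https"):
        # an HTTP(S) URL ending in .git is treated as a git clone target
        return "git" if parsed.path.endswith(".git") else "http"
    if parsed.scheme in ("git", "ssh"):
        return "git"
    # no scheme: assume a path on the local filesystem
    return "local"
```

For example, `classify_pcap_source("https://github.com/org/Malcolm-Test-Artifacts.git")` would map to the git strategy, while a bare directory path would map to the local one.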
Another possible option might be something like Backblaze, which is reasonably priced.
Although it wasn't my first choice, after running into some difficulties and complications with trying to fund GitHub's Git LFS, we've just created a regular Git repository called Malcolm-Test-Artifacts where we're housing the artifacts. We want to be judicious about the volume of data we store there, but I think it will be sufficient for now. We may revisit this in the future with a more robust solution.