Provide a “pass/fail” status and badge #9
Comments
That badge idea is nice. However, not all Knowledge Organization Systems have the same requirements; for example, sometimes it is OK to have homonyms, sometimes not. Sometimes you know you are missing some translations, etc. The test tool's results are not as simple as a pass/fail status and sometimes need to be interpreted with care. To explore the idea, could you suggest a format for the result, as I am not familiar with shields.io? Thanks for the comment.
http://shields.io provides badges that are often used here on GitHub. You can provide content via a JSON endpoint at the SKOS testing tool that returns data in the expected JSON schema; see https://shields.io/endpoint for examples and details.
That's a good point that what counts as a pass/fail varies by system. I think it would be helpful to focus on checking that files conform to the schema and can be interpreted, and then identify problems with the content on top of that. A file that cannot be interpreted at all should be a fail, and problems with the content could be seen as sort-of-passing. Challenges related to which system accepts what content should be dealt with by that system (i.e. out of scope).
For example, a SKOS validating testing tool could generate something like this if translations were missing...

{
  "schemaVersion": 1,
  "label": "skos validator",
  "message": "missing translations",
  "color": "orange"
}

...and if the validator detects no file-related problems, then...

{
  "schemaVersion": 1,
  "label": "skos validator",
  "message": "file passes",
  "color": "green"
}
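As a rough illustration of the endpoint idea above (not part of the original comment), here is a minimal sketch of such a badge endpoint, assuming a Python/Flask service and a hypothetical run_skos_checks() helper standing in for whatever the SKOS testing tool actually reports; the route and names are illustrative, not the tool's real API:

# Minimal sketch of a shields.io endpoint-badge service (illustrative only).
from flask import Flask, jsonify

app = Flask(__name__)

def run_skos_checks(path):
    # Hypothetical placeholder: return (parse_ok, content_warnings) for the file.
    return True, ["missing translations"]

@app.route("/badge")
def badge():
    parse_ok, warnings = run_skos_checks("vocabulary.ttl")
    if not parse_ok:
        data = {"schemaVersion": 1, "label": "skos validator",
                "message": "file fails", "color": "red"}
    elif warnings:
        data = {"schemaVersion": 1, "label": "skos validator",
                "message": ", ".join(warnings), "color": "orange"}
    else:
        data = {"schemaVersion": 1, "label": "skos validator",
                "message": "file passes", "color": "green"}
    return jsonify(data)

The badge itself would then be rendered by pointing shields.io at that JSON, e.g. https://img.shields.io/endpoint?url=<URL of the endpoint above>.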
This is a great idea, which I have implemented, though not in the SKOS validator but rather in a generic SHACL validator; see the little badge at https://github.com/sparna-git/SHACL-Catalog.
To get the equivalent for a SKOS file, one would first need to define a SHACL constraint file for SKOS (which I have not yet been able to find easily).
Thomas
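As a rough sketch of how such a check could feed the badge, assuming the pyshacl library and a hypothetical skos-shapes.ttl constraints file (no standard SHACL shapes file for SKOS is implied to exist), the validation result could be mapped to the shields.io endpoint JSON like this:

# Sketch only: validate a SKOS file against a hypothetical SHACL shapes file
# and turn the outcome into shields.io endpoint JSON.
import json
from pyshacl import validate

conforms, report_graph, report_text = validate(
    "vocabulary.ttl",               # the SKOS file to check
    shacl_graph="skos-shapes.ttl",  # hypothetical SKOS constraints in SHACL
)

badge = {
    "schemaVersion": 1,
    "label": "skos validator",
    "message": "file passes" if conforms else "constraint violations",
    "color": "green" if conforms else "orange",
}
print(json.dumps(badge))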
Great tool! It would really help if the SKOS testing tool could generate a simple pass/fail result from the test, which could in turn be used to generate a shields.io badge. These badges are really useful for showing users the status of their .ttl file, for example.