I have a situation where I'm trying to map from several fields to a media-type (see attached template, map data and sample data).
- r_mt : "{ { var res = nullValue() ; /* 'empty(res) || res.isNull()' because plain 'null' and 'ValueNull' results arise - confusing*/ res = (empty(res) || res.isNull()) && !empty(resource_mime_type) ? resource_mime_type.toLowerCase().split(',').trim().map('mediatype.map',false) : res ; res = (empty(res) || res.isNull()) && !empty(resource_mimetype_inner) ? resource_mimetype_inner.toLowerCase().split(',').trim().map('mediatype.map',false) : res ; res = (empty(res) || res.isNull()) && !empty(resource_format) ? resource_format.toLowerCase().split(',').trim().map('mediatype.map',false) : res; return res; } }"
Most of the entries in the CSV data file have single values. However, in the first two (non-header) rows there are entries that contain comma-separated values within a single cell, so the transform uses split(',') to form a ValueArray as the 'input' side of the map. In the first row, not all of the values within the array map successfully (the check_mt binding reports failed or partially failed mappings).
When ValueBase.map fails to resolve a mapping it throws a new NullResult, which appears to abort further evaluation of the corresponding Pattern. In the case of ValueArray.map this means that potentially successful mappings are lost. Hence, for example, triples are generated for row 2 of the attached CSV but not for row 1, even though the first two elements of row 1 do resolve in the map.
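To make the failure mode concrete, here is a rough sketch of what I understand the current behaviour to be. This is not the real dclib code; the names, types and the exception are simplified stand-ins for ValueArray/ValueBase and NullResult:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative stand-in for the NullResult thrown by ValueBase.map
class NullResult extends RuntimeException { }

class CurrentBehaviourSketch {
    // applyFunction-style loop over the array elements: the first element that
    // fails to resolve throws, the exception propagates out of the loop, and
    // the mappings that had already succeeded are discarded along with it.
    static List<String> mapAll(List<String> elements, Map<String, String> mapping) {
        List<String> results = new ArrayList<>();
        for (String element : elements) {
            String mapped = mapping.get(element);
            if (mapped == null) {
                throw new NullResult();   // aborts evaluation of the whole Pattern
            }
            results.add(mapped);
        }
        return results;                   // never reached if any element failed
    }
}
```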
I tried an experiment that changed applyFunction on ValueArray to catch the NullResult, substitute a ValueNull for the absent result, and continue. [Aside: I also tried using a HashSet to collect results, but that disturbed the ordering of results to the extent of generating test failures.]
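The experiment was equivalent to something like the following (again a simplified sketch rather than the actual patch; the per-element NullResult catch is folded into a plain null check here):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class ExperimentSketch {
    // Stand-in for dclib's ValueNull placeholder
    static final String VALUE_NULL = null;

    // Per-element catch-and-substitute, roughly what the experimental change to
    // ValueArray.applyFunction did: a failed mapping contributes a placeholder
    // slot instead of aborting, so results.size() == elements.size() and each
    // output position stays aligned with its input position.
    static List<String> mapAllLenient(List<String> elements, Map<String, String> mapping) {
        List<String> results = new ArrayList<>(elements.size());
        for (String element : elements) {
            String mapped = mapping.get(element);
            results.add(mapped != null ? mapped : VALUE_NULL);  // keep the slot, carry on
        }
        return results;
    }
}
```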
However, the presence of ValueNull in the binding result causes a subsequent failure when TemplateBase.asTriple(...) validates the object node and finds null. This throws an EvalFailed exception with the message "Illegal or null RDF node result from pattern .", which aborts processing of any further non-null array members. In the case of a 'simple' scalar null, the absent object value is silently skipped.
My larger concern is how to deal with the partial failure of ValueArray.map(...) when the mapping of a single element in the array fails. It seems right that there should be index alignment between input and output arrays. The experiment with HashSets lost that and allowed the output array to be smaller than the input array; index alignment could be important when multiple related arrays are being processed.
I think the map processing should insert ValueNulls into the array (or the default value, if one is given, as you suggested), and the later TemplateBase processing of a ValueArray with nulls within it should probably ignore the nulls silently and move on to the next value in the array.
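Concretely, the TemplateBase side of that change could look something like this sketch (made-up names, not the real asTriple code):

```java
import java.util.List;
import java.util.function.Consumer;

class TripleEmissionSketch {
    // Sketch of the proposed TemplateBase behaviour: emit one triple per
    // non-null array member and silently skip the null slots (ValueNull in
    // dclib), mirroring how an absent scalar value is already skipped, rather
    // than raising EvalFailed and aborting the remaining members.
    static void emitTriples(List<String> objects, Consumer<String> emitTriple) {
        for (String object : objects) {
            if (object == null) {
                continue;                 // partial failure: drop this slot, keep going
            }
            emitTriple.accept(object);    // stands in for TemplateBase.asTriple(...)
        }
    }
}
```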
Attachments:
- Template: resource-media-type.yaml.txt
- Media-type map source: media-type.ttl.txt
- Data sample: resource_media_types.csv