Releases: HeardLibrary/linked-data

VanderBot v1.7.1 release

07 Apr 00:30

This release includes changes from v1.7 and v1.7.1, which:

  • enabled command line options that override the defaults
  • created an error log rather than displaying errors as they occurred
  • checked for existing label/description combinations before attempting a write (see the sketch below)
  • added smarter error trapping for dates
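The label/description pre-check is needed because the Wikidata API rejects any attempt to create an item whose label/description pair already exists. A minimal sketch of such a check, assuming the public Wikidata Query Service endpoint (the function and variable names here are illustrative, not VanderBot's actual code):

```python
import requests

ENDPOINT = 'https://query.wikidata.org/sparql'

def label_description_exists(label, description, language='en'):
    """Return True if some item already has this label/description pair."""
    query = '''
    ASK {
      ?item rdfs:label "%s"@%s .
      ?item schema:description "%s"@%s .
    }''' % (label, language, description, language)
    response = requests.post(
        ENDPOINT,
        data={'query': query},
        headers={'Accept': 'application/sparql-results+json'}
    )
    # An ASK query returns a single boolean in the results JSON
    return response.json()['boolean']

# Skip the write (and log the collision) rather than letting the API fail
if label_description_exists('John Smith', 'American chemist'):
    print('label/description collision; logging and skipping')
```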

VanderBot v1.6.4 release

27 Jan 23:13

Version 1.6.4 contains a bug fix that explicitly encodes all HTTP POST bodies as UTF-8. Previously, strings containing non-Latin characters caused problems when sent as part of a SPARQL query.
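A minimal illustration of the kind of fix involved, assuming the requests library (the exact code in VanderBot may differ):

```python
import requests

query = 'SELECT ?item WHERE { ?item rdfs:label "Пушкин"@ru . }'

# If a plain str body is passed, Python's HTTP stack encodes it as
# Latin-1, which raises UnicodeEncodeError for non-Latin characters.
# Encoding to UTF-8 bytes explicitly avoids the problem.
response = requests.post(
    'https://query.wikidata.org/sparql',
    data=query.encode('utf-8'),
    headers={'Content-Type': 'application/sparql-query; charset=utf-8'}
)
```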

VanderBot v1.6.3 release

23 Dec 04:17

This is a minor upgrade that adds updated versions of the HTML, JavaScript, and CSS for the web page that generates the CSV metadata description JSON:

wikidata-csv2rdf-metadata.html
wikidata-csv2rdf-metadata.js
wikidata-csv2rdf-metadata.css

The upgrade now supports monolingual string values and the complex value types globecoordinate and quantity. Other scripts were not affected.
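For reference, these value types correspond to the following Wikibase JSON datavalue shapes (a sketch based on the public Wikibase data model, not on the form's internals; the example values are arbitrary):

```python
# Wikibase JSON datavalue shapes for the newly supported types
monolingualtext = {
    'value': {'text': 'Vanderbilt Fine Arts Gallery', 'language': 'en'},
    'type': 'monolingualtext'
}
globecoordinate = {
    'value': {
        'latitude': 36.1447,
        'longitude': -86.8027,
        'precision': 0.0001,
        'globe': 'http://www.wikidata.org/entity/Q2'  # Earth
    },
    'type': 'globecoordinate'
}
quantity = {
    'value': {
        'amount': '+12',  # signed decimal, serialized as a string
        'unit': 'http://www.wikidata.org/entity/Q11573'  # metre
    },
    'type': 'quantity'
}
```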

VanderBot v1.6.2 release

01 Dec 15:00

This is a minor upgrade to fix a bug in vb6_upload_wikidata.py that was preventing it from capturing the hash of a reference when an item did not have a value for one of the reference properties.

VanderBot v1.6.1 release

25 Nov 17:30

This is a minor upgrade, primarily to fix several bugs discovered during testing of v1.6. It also includes the script acquire_wikidata_metadata.py, a general-purpose script for downloading existing data from Wikidata into a CSV format suitable for use by VanderBot.

VanderBot v1.6 release

13 Nov 21:39
338ec07

This release adds support for statements having monolingual string, globecoordinate, and quantity values. It also fixes several bugs discovered during testing.

VanderBot v1.5.2 release

06 Nov 22:07

VanderBot v1.5.1 release

10 Sep 14:39

Minor update to uncomment code in vb6_upload_wikidata.py after testing. See the v1.5 release notes for other details.

VanderBot v1.5 release

08 Sep 21:39
73baf3d

The major change to the code increases the number of table columns per date from one to three. Previously, there was a single column for the date string, which did not allow for varying date precision. Now there is an additional column for the Wikibase date precision number (e.g. 9 for precision to the year, 11 for precision to the day). The third column holds a date value node identifier, which can be either the actual node identifier from Wikidata (a hash of unknown origin) or a random UUID generated by one of the scripts in this suite. This identifier names the node to which both the date value and the date precision are attached; it effectively serves as a blank node. In the future, it may be replaced with the actual date node identifier.
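As an illustration, one date might now occupy three columns like the following (the column names and exact date-string format are hypothetical; the actual headers come from the mapping schema):

```python
import uuid

# Hypothetical CSV row fragment for a single date, showing the three
# columns: the date string, the Wikibase precision number, and a
# generated identifier standing in for the date value node.
row = {
    'inception_val': '1873-01-01T00:00:00Z',  # the date string
    'inception_prec': 9,                      # 9 = year precision
    'inception_nodeId': str(uuid.uuid4())     # random UUID until the real hash is known
}
```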

The other addition is a JavaScript script, written by Jessie Baskauf, that drives this form, which can be used to generate a csv-metadata.json mapping schema. With such a mapping schema, any CSV file can be used as the source data for the vb6_upload_wikidata.py API upload script.

VanderBot v1.4 release

18 Aug 03:02
2086028

The changes in this release follow tests that used the csv-metadata.json mapping schema to emit RDF from the source CSV tables. To make it possible to create all of the kinds of statements present in the Wikidata data model, the csv-metadata.json file and the vb6_upload_wikidata.py script were changed to use properties in the ps: namespace (http://www.wikidata.org/prop/statement/) rather than properties in the wdt: namespace. This makes it possible to construct the missing wdt: statements using SPARQL CONSTRUCT. A new script materializes those triples by issuing a CONSTRUCT query to a SPARQL endpoint whose triplestore contains the triples generated by the schema. The materialized triples are then loaded into the triplestore, making it possible to perform queries on any graph pattern that could be used at the Wikidata Query Service SPARQL endpoint.
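The CONSTRUCT pattern is roughly the following, sketched here for a single property (P31) against a placeholder local endpoint; the endpoint URL and property choice are assumptions, not the new script's actual configuration:

```python
import requests

ENDPOINT = 'http://localhost:3030/wikidata/sparql'  # placeholder endpoint

# Derive the direct wdt: triples from the p:/ps: statement structure.
construct_query = '''
PREFIX p:   <http://www.wikidata.org/prop/>
PREFIX ps:  <http://www.wikidata.org/prop/statement/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

CONSTRUCT { ?item wdt:P31 ?value . }
WHERE {
  ?item p:P31 ?statement .
  ?statement ps:P31 ?value .
}
'''

response = requests.post(
    ENDPOINT,
    data={'query': construct_query},
    headers={'Accept': 'text/turtle'}
)
materialized_triples = response.text  # Turtle, ready to load back into the store
```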

The first five scripts were not changed in this release.