### Starting
If you clone this repo, `cd docker` and execute `./compose.sh`, it will start
8 docker containers offering these services.
1. A postgres database with postgis installed.
1. pgAdmin for administering the database.
1. A mock mail server.
1. A webserver running the warehouse code.
1. GeoServer for sharing spatial data in OGC standard format.
1. Elasticsearch for storing an index of warehouse data.
1. Kibana for exploring and managing Elasticsearch.
1. Logstash for populating Elasticsearch with warehouse data.

On first run, it offers to initialise the indicia database schema.
If you choose this option you can later log in as user `admin` with
password `password`.

Once running you can browse the warehouse at http://localhost:8080.
You can examine the database with pgAdmin at http://localhost:8070.
Any mail sent by the warehouse can be viewed at http://localhost:8025.
GeoServer can be configured at http://localhost:8090/geoserver.
The Elasticsearch API is accessible at https://localhost:9200.
Kibana is accessed by browsing https://localhost:5601.

#### PgAdmin
To log in, the default credentials are:
- Email: [email protected]
- Password: password

To connect pgAdmin to the database, add a new server and configure the
connection with:
- Host name: The docker container name, e.g. indicia-postgres-1
- Port: 5432
- Username: postgres
- Password: password

#### GeoServer
To log in, the default credentials are:
- Username: admin
- Password: geoserver

#### Elasticsearch and Kibana
Note that security is enabled so use https. To log in, the default
credentials are:
- Username: elastic
- Password: password

### Unit testing
There is a separate Docker configuration for unit testing which can be
run up by `cd docker` then `./phpunit.sh`. This replicates the unit
# Upgrading to version 9 of the warehouse

Version 9 adds fields to the reporting cache tables and the code that populates these fields will
error if the upgrade scripts which add the fields have not been run. To avoid errors for records
posted during the upgrade process, you have 2 options: either take all client sites offline
during the upgrade (e.g. by putting Drupal in maintenance mode), or run the following script
before the upgrade so that the fields are ready in the database. Note that the UPDATE statements
in particular may take a long time, depending on the number of records and samples in your
database.

```sql
ALTER TABLE cache_samples_functional
  ADD COLUMN IF NOT EXISTS hide_sample_as_private boolean;

ALTER TABLE cache_occurrences_functional
  ADD COLUMN IF NOT EXISTS hide_sample_as_private boolean;

-- Disable tracking increments, so this doesn't force a complete ES refresh.
SET application_name = 'skiptrigger';

ALTER TABLE cache_occurrences_functional
  ALTER COLUMN hide_sample_as_private SET DEFAULT false;

ALTER TABLE cache_samples_functional
  ALTER COLUMN hide_sample_as_private SET DEFAULT false;

UPDATE cache_samples_functional SET hide_sample_as_private=false;

UPDATE cache_occurrences_functional SET hide_sample_as_private=false;

ALTER TABLE cache_samples_functional
  ALTER COLUMN hide_sample_as_private SET NOT NULL;

ALTER TABLE cache_occurrences_functional
  ALTER COLUMN hide_sample_as_private SET NOT NULL;
```

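Before proceeding, you may wish to confirm the new columns are in place with their default and
NOT NULL constraints. A quick check such as the following, using the standard
`information_schema` views, can be run in pgAdmin:

```sql
SELECT table_name, column_name, is_nullable, column_default
FROM information_schema.columns
WHERE table_name IN ('cache_samples_functional', 'cache_occurrences_functional')
  AND column_name = 'hide_sample_as_private';
```

Both tables should appear in the result, with `is_nullable` set to `NO`.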
After the upgrade, the warehouse will ask you to run the second part of this script using
pgAdmin. If you have already run it, there is no need to run it a second time.

## Elasticsearch

If you are using Elasticsearch, then before upgrading you should add the mappings required for
the new fields. You can run the following using the Dev tools in Kibana, replacing
`my_occurrence_index` with your own index name:

```
PUT /my_occurrence_index/_mapping
{
  "properties": {
    "metadata.hide_sample_as_private": {
      "type": "boolean"
    }
  }
}
```

Repeat this step for your samples index if you are also storing samples in Elasticsearch.

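For example, if samples are indexed, the equivalent request would be the following, where
`my_sample_index` is a placeholder for your own samples index name:

```
PUT /my_sample_index/_mapping
{
  "properties": {
    "metadata.hide_sample_as_private": {
      "type": "boolean"
    }
  }
}
```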
You also need to add information about this new field to each of your occurrence *.conf files
used by Logstash. Edit the files and search for a comment which starts
`# Convert our list of fields`, which should be just above a `mutate` block. Insert the
following code before the comment:

```yaml
mutate {
  add_field => {
    "hide_sample_as_private" => false
  }
}
# Set hide_sample_as_private using privacy_precision value.
translate {
  source => "[privacy_precision]"
  target => "[hide_sample_as_private]"
  override => true
  dictionary => {
    "0" => true
  }
  fallback => false
}
```

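In plain terms, the `translate` filter above maps a `privacy_precision` value of `"0"` to `true`
and leaves anything else at the `fallback` value of `false`. A minimal Python sketch of that
logic, for clarity only (the function name is illustrative, not part of the pipeline):

```python
def hide_sample_as_private(privacy_precision):
    """Mirror the translate filter: "0" maps to True; any other value,
    including a missing one, falls back to False."""
    dictionary = {"0": True}  # matches the filter's dictionary
    return dictionary.get(privacy_precision, False)
```
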
Also, in the list of rename operations in the mutate block just below, add the following after
the rename operation for `privacy_precision`:

```yaml
"hide_sample_as_private" => "[metadata][hide_sample_as_private]"
```

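Purely to illustrate the placement, the relevant part of the mutate block might then look
something like this; the surrounding rename entries and their targets are assumptions and the
ones in your config files may differ:

```yaml
mutate {
  rename => {
    "privacy_precision" => "[metadata][privacy_precision]"
    "hide_sample_as_private" => "[metadata][hide_sample_as_private]"
  }
}
```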
Now save the config file and repeat for any other pipeline configuration files that you have set
up. Finally, restart the Logstash process or service as appropriate.

Once the above steps have been completed, it is safe to update your warehouse code, then visit
the home page in order to follow the link to upgrade the database.