MLT-0070 Cleanup and update documentation #51

Merged
merged 13 commits on Dec 18, 2024
711 changes: 0 additions & 711 deletions .editorconfig

Large diffs are not rendered by default.

147 changes: 70 additions & 77 deletions .github/workflows/deploy-project.yaml

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion .gitignore
@@ -1,4 +1,4 @@
HELP.md
### Compile Output ###
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**/target/
6 changes: 3 additions & 3 deletions README.md
@@ -86,7 +86,7 @@ After building the JAR file, it can be used to build a Docker image using the fo

```shell
# Move the jar to the Docker directory
cp target/wls.jar docker/
cp target/wls.jar docker/wls.jar

# Use Docker Buildx to build the Docker Image
docker buildx build --platform linux/amd64 -t wls:latest docker/
@@ -107,14 +107,14 @@ The images are built based on the `main` branch as well as project `tags`, and c
# Pull the latest image
docker pull harbor.nb.no/mlt/wls:latest

# Or pull a specific tag (either a GitHub tag or 'main' for the latest main branch image)
# Or pull a specific tag (either a GitHub tag or "main" for the latest main branch image)
docker pull harbor.nb.no/mlt/wls:<TAG>
```

With the image either built or pulled, it can be run using the following command:

```shell
docker run -p 8080:8080 -e SPRING_PROFILES_ACTIVE='local-dev' harbor.nb.no/mlt/wls:<TAG>
docker run -p 8080:8080 -e SPRING_PROFILES_ACTIVE="local-dev" harbor.nb.no/mlt/wls:<TAG>
```

### Using an IDE
60 changes: 60 additions & 0 deletions docker/.dockerignore
@@ -0,0 +1,60 @@
# Ignore Maven build files
target/
build/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
pom.xml.next
release.properties

# Ignore IntelliJ IDEA project files
.idea/
*.iml
*.ipr
*.iws

# Ignore VS Code project files
.vscode/

# Ignore Maven files
.mvn/
.m2/
mvnw
mvnw.cmd

# Ignore OS-specific files
.DS_Store
Thumbs.db

# Ignore logs
*.log

# Ignore temporary files
*.tmp
*.swp
*.swo
*~

# Ignore backup files
*.bak

# Ignore Git configuration files
.github/
.git/
.gitignore
.gitattributes
.gitmodules

# Ignore Kubernetes configuration files
k8s/

# Exclude test-related files and coverage reports
test-results/
surefire-reports/


# Ignore project specific files
docker/keycloak/
docker/mongo/
docker/keycloak-realm-export.md
.editorconfig
26 changes: 20 additions & 6 deletions docker/Dockerfile
@@ -1,14 +1,28 @@
# What base image to use, since we are using Java let's use the temurin image from our harbor registry
# Use temurin as base image, getting it from local harbor registry, since this is a Kotlin application
# To keep the bloat to a minimum, it uses the alpine version of the image
# Should consider removing the French language pack as well: "sudo rm -fr /*"
FROM harbor.nb.no/library/eclipse-temurin:21-jdk-alpine

# Set the maintainer of the image, makes it easier to know who to contact if there are any issues
LABEL maintainer="Magasin og Logistikk"
# Set metadata for the image, this is used to more easily identify the image and who created it
LABEL maintainer="\"Magasin og Logistikk\" team at NLN"
LABEL description="Hermes WLS (Warehouse and Logistics Service) functions as a \
middleware between NLN's (National Library of Norway) catalogues and storage systems"

# Copy the jar file from the target folder to the root of the container, the wls.jar comes from GitHub workflow
# Copy the jar file from the target folder to the root of the container
# The GitHub workflow generates the wls.jar file
# When building manually, generate the jar file with "mvn clean package"
# and copy it to the docker folder with "cp target/wls.jar docker/wls.jar"
COPY wls.jar app.jar

# Expose the port that the application is running on, this is defined in the src/main/resources/application.yml file
# Mark the port that the application is listening on
# This port is defined in the src/main/resources/application.yml file
EXPOSE 8080

# Add a healthcheck to the container, this is used to check if the container is running as expected
# This is not a replacement for health check in kubernetes
# However it is convenient when running locally, or in a purely Docker based environment
HEALTHCHECK --start-period=40s --retries=5 \
CMD curl --fail --silent localhost:8081/actuator/health | grep UP || exit 1

# Set the entrypoint for the container, this is the command that will be run when the container starts up
ENTRYPOINT ["java", "-Duser.timezone=Europe/Oslo", "-Djava.security.egd=file:/dev/./urandom", "-jar", "/app.jar"]
ENTRYPOINT ["java", "-Duser.timezone=Europe/Oslo", "-Djava.security.egd=file:/dev/./urandom", "-jar", "/app.jar"]
25 changes: 18 additions & 7 deletions docker/compose.yaml
@@ -1,7 +1,8 @@
# Define what services to set up for local development
services:
# Sets up a simple mongo database for local development
# Set up a simple mongo database for local development
mongo-db:
image: mongo:4.4.22 # Same as the version in prod env
image: mongo:4.4.22 # Same as the version in the prod env
restart: always
environment:
MONGO_INITDB_ROOT_USERNAME: root
@@ -10,29 +11,36 @@ services:
ports:
- "27017:27017"
volumes:
# Copy script to populate the database with test data
- ./mongo/mongo-init.js:/docker-entrypoint-initdb.d/mongo-init.js:ro

# Set up a mongo-express instance to view the mongo database
# Will be available at: http://localhost:8081
# Makes GUI available at: http://localhost:8081
mongo-express:
image: mongo-express:1.0.2
restart: always
ports:
- "8081:8081"
depends_on:
- mongo-db
- mongo-db # Needs to have mongo-db container running
environment:
ME_CONFIG_MONGODB_SERVER: mongo-db
ME_CONFIG_MONGODB_ADMINUSERNAME: root
ME_CONFIG_MONGODB_ADMINPASSWORD: toor
ME_CONFIG_OPTIONS_EDITORTHEME: "gruvbox-dark"
ME_CONFIG_BASICAUTH: false

# Set up a local keycloak instance for local development
# Keycloak is configured to automatically import a test realm with service accounts
# GUI will be available at: http://localhost:8082
# Makes GUI available at: http://localhost:8082
#
# When making changes to the realm, export the realm and save it in the keycloak/import folder
# docker exec -it wls-local-keycloak-1 /opt/keycloak/bin/kc.sh export --file "/tmp/mlt-local-realm-export.json" --users "same_file" --realm "mlt-local"
# docker cp wls-local-keycloak-1:/tmp/mlt-local-realm-export.json ./docker/keycloak/import/mlt-local-realm-export.json
# git add docker/keycloak/import/mlt-local-realm-export.json
# This connects interactively to the container, uses the Keycloak export command to export the realm, copies the file to the local machine, and adds it to git
#
# Also, since I am a lazy boi: axiell -> E93MF2F8UfpRrCowAbVMStvsTzy0gmgr
keycloak:
image: quay.io/keycloak/keycloak:24.0.1 # Same as the version in prod env
restart: always
@@ -42,12 +50,15 @@
KEYCLOAK_ADMIN_PASSWORD: toor
ports:
- "8082:8080"
command: start-dev --import-realm
command: start-dev --import-realm # Automatically import realm on startup with needed clients
volumes:
- ./keycloak/import:/opt/keycloak/data/import

# Set up a local SynQ instance for local development
# Provides an option to test out communication with SynQ API locally, without the need to use SynQ in dev/prod environments
# Makes Swagger GUI available at: http://localhost:8181/synq/swagger
dummy-synq:
image: harbor.nb.no/mlt/dummy-synq:main
image: harbor.nb.no/mlt/dummy-synq:main # Get the "latest" version of the dummy-synq image
restart: always
ports:
- "8181:8181"
6 changes: 4 additions & 2 deletions docker/keycloak-realm-export.md
@@ -2,11 +2,13 @@
Keycloak is configured to automatically import a test realm with service accounts.
GUI with imported realm is then available [here](http://localhost:8082/admin/master/console/#/mlt-local "Keycloak Admin Console for MLT-Local Realm").
When making changes to the realm, export the realm and save it in the keycloak/import folder.
That way changes will be persisted after the container is restarted, and available to others.
That way changes will persist even if the container is restarted, and be available to others.

```bash
docker exec -it wls-local-keycloak-1 /opt/keycloak/bin/kc.sh export --file "/tmp/mlt-local-realm-export.json" --users "same_file" --realm "mlt-local"
docker cp wls-local-keycloak-1:/tmp/mlt-local-realm-export.json ./docker/keycloak/import/mlt-local-realm-export.json
git add docker/keycloak/import/mlt-local-realm-export.json
```

Make sure that the container name `wls-local-keycloak-1` is correct.
You can check the container name by running `docker ps` and looking for the container running the `quay.io/keycloak/keycloak` image.
Check the container name with `docker ps` and look for the container running the `quay.io/keycloak/keycloak` image.
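If the container name differs, a quick way to look it up is filtering `docker ps` by image (a sketch; the exact image tag is an assumption based on compose.yaml):

```bash
# Print the name of the container running the Keycloak image
docker ps --filter "ancestor=quay.io/keycloak/keycloak:24.0.1" --format "{{.Names}}"
```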
34 changes: 19 additions & 15 deletions docker/keycloak/import/mlt-local-realm-export.json
@@ -553,12 +553,12 @@
"directAccessGrantsEnabled" : false,
"serviceAccountsEnabled" : true,
"publicClient" : false,
"frontchannelLogout" : true,
"frontchannelLogout" : false,
"protocol" : "openid-connect",
"attributes" : {
"oidc.ciba.grant.enabled" : "false",
"client.secret.creation.time" : "1720685173",
"backchannel.logout.session.required" : "true",
"backchannel.logout.session.required" : "false",
"post.logout.redirect.uris" : "+",
"oauth2.device.authorization.grant.enabled" : "false",
"display.on.consent.screen" : "false",
@@ -741,12 +741,13 @@
"directAccessGrantsEnabled" : false,
"serviceAccountsEnabled" : true,
"publicClient" : false,
"frontchannelLogout" : true,
"frontchannelLogout" : false,
"protocol" : "openid-connect",
"attributes" : {
"oidc.ciba.grant.enabled" : "false",
"client.secret.creation.time" : "1728308503",
"backchannel.logout.session.required" : "true",
"backchannel.logout.session.required" : "false",
"post.logout.redirect.uris" : "+",
"oauth2.device.authorization.grant.enabled" : "false",
"display.on.consent.screen" : "false",
"backchannel.logout.revoke.offline.tokens" : "false"
@@ -763,6 +764,7 @@
"config" : {
"user.session.note" : "clientAddress",
"introspection.token.claim" : "true",
"userinfo.token.claim" : "true",
"id.token.claim" : "true",
"access.token.claim" : "true",
"claim.name" : "clientAddress",
@@ -777,6 +779,7 @@
"config" : {
"user.session.note" : "client_id",
"introspection.token.claim" : "true",
"userinfo.token.claim" : "true",
"id.token.claim" : "true",
"access.token.claim" : "true",
"claim.name" : "client_id",
@@ -791,13 +794,14 @@
"config" : {
"user.session.note" : "clientHost",
"introspection.token.claim" : "true",
"userinfo.token.claim" : "true",
"id.token.claim" : "true",
"access.token.claim" : "true",
"claim.name" : "clientHost",
"jsonType.label" : "String"
}
} ],
"defaultClientScopes" : [ "wls-synq", "wls-item", "wls-audience", "wls-order", "wls-subject" ],
"defaultClientScopes" : [ "wls-synq", "wls-audience", "wls-subject" ],
"optionalClientScopes" : [ "web-origins", "acr", "address", "phone", "roles", "profile", "offline_access", "microprofile-jwt", "email" ]
}, {
"id" : "e84865db-9f5c-4213-b9c9-23c911520899",
@@ -822,7 +826,7 @@
"directAccessGrantsEnabled" : false,
"serviceAccountsEnabled" : true,
"publicClient" : false,
"frontchannelLogout" : true,
"frontchannelLogout" : false,
"protocol" : "openid-connect",
"attributes" : {
"client.secret.creation.time" : "1719488186",
@@ -832,7 +836,7 @@
"use.refresh.tokens" : "true",
"oidc.ciba.grant.enabled" : "false",
"client.use.lightweight.access.token.enabled" : "false",
"backchannel.logout.session.required" : "true",
"backchannel.logout.session.required" : "false",
"client_credentials.use_refresh_token" : "false",
"tls.client.certificate.bound.access.tokens" : "false",
"require.pushed.authorization.requests" : "false",
@@ -889,7 +893,7 @@
"jsonType.label" : "String"
}
} ],
"defaultClientScopes" : [ "wls-item", "wls-audience", "wls-order", "wls-subject" ],
"defaultClientScopes" : [ "wls-synq", "wls-item", "wls-audience", "wls-order", "wls-subject" ],
"optionalClientScopes" : [ "web-origins", "acr", "address", "phone", "roles", "profile", "offline_access", "microprofile-jwt", "email" ]
} ],
"clientScopes" : [ {
@@ -967,7 +971,7 @@
}, {
"id" : "e77adead-5964-4400-9368-3b1520d57bdd",
"name" : "wls-item",
"description" : "A client scope used to add a 'item' scope to the JWT scope field",
"description" : "A client scope used to add a \"item\" scope to the JWT scope field",
"protocol" : "openid-connect",
"attributes" : {
"include.in.token.scope" : "true",
@@ -1284,10 +1288,10 @@
}, {
"id" : "43e7474f-a3fe-4d44-a0e1-48518c8330f1",
"name" : "wls-synq",
"description" : "A client scope used to add a 'synq' scope to the JWT scope field",
"description" : "A client scope used to add a \"synq\" scope to the JWT scope field",
"protocol" : "openid-connect",
"attributes" : {
"include.in.token.scope" : "false",
"include.in.token.scope" : "true",
"display.on.consent.screen" : "false",
"gui.order" : "",
"consent.screen.text" : ""
@@ -1427,7 +1431,7 @@
}, {
"id" : "2f9f07c6-b866-402b-8f18-3f7baf021f12",
"name" : "wls-order",
"description" : "A client scope used to add an 'order' scope to the JWT scope field",
"description" : "A client scope used to add an \"order\" scope to the JWT scope field",
"protocol" : "openid-connect",
"attributes" : {
"include.in.token.scope" : "true",
@@ -1553,7 +1557,7 @@
"subType" : "authenticated",
"subComponents" : { },
"config" : {
"allowed-protocol-mapper-types" : [ "oidc-usermodel-attribute-mapper", "saml-user-property-mapper", "oidc-full-name-mapper", "saml-user-attribute-mapper", "oidc-usermodel-property-mapper", "oidc-address-mapper", "saml-role-list-mapper", "oidc-sha256-pairwise-sub-mapper" ]
"allowed-protocol-mapper-types" : [ "oidc-full-name-mapper", "oidc-address-mapper", "saml-user-property-mapper", "oidc-usermodel-property-mapper", "oidc-sha256-pairwise-sub-mapper", "saml-role-list-mapper", "oidc-usermodel-attribute-mapper", "saml-user-attribute-mapper" ]
}
}, {
"id" : "85a712b7-0975-406f-ba2e-c9a99625cfad",
@@ -1562,7 +1566,7 @@
"subType" : "anonymous",
"subComponents" : { },
"config" : {
"allowed-protocol-mapper-types" : [ "oidc-sha256-pairwise-sub-mapper", "saml-user-attribute-mapper", "oidc-address-mapper", "oidc-usermodel-property-mapper", "saml-user-property-mapper", "oidc-full-name-mapper", "saml-role-list-mapper", "oidc-usermodel-attribute-mapper" ]
"allowed-protocol-mapper-types" : [ "oidc-sha256-pairwise-sub-mapper", "oidc-full-name-mapper", "oidc-address-mapper", "saml-role-list-mapper", "oidc-usermodel-attribute-mapper", "oidc-usermodel-property-mapper", "saml-user-property-mapper", "saml-user-attribute-mapper" ]
}
} ],
"org.keycloak.keys.KeyProvider" : [ {
Expand Down Expand Up @@ -2185,4 +2189,4 @@
"clientPolicies" : {
"policies" : [ ]
}
}
}