
Commit

Merge pull request #2 from IQSS/2073-4.13-update
2073 4.13 update
PaulBoon authored Apr 23, 2019
2 parents dac47c4 + c0cca67 commit 8ef5096
Showing 28 changed files with 543 additions and 94 deletions.
7 changes: 7 additions & 0 deletions doc/release-notes/5478-refactor-swift-properties.md
@@ -0,0 +1,7 @@
All Swift properties have now been migrated to `domain.xml`, so there is no longer a separate
`swift.properties` file to maintain, which makes the configuration easier to govern and improves
performance. Furthermore, the Swift credential's password is now stored using `create-password-alias`,
which encrypts the password so that it does not appear in plain text in `domain.xml`.

To migrate to these new configuration settings, please see
`doc/sphinx-guides/source/installation/config.rst#swift-storage`.
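
A minimal migration sketch (the paths, option names, and alias name mirror the Installation Guide defaults below; the backup step and placeholder values are only suggestions, so adjust them for your site):

```shell
# retire the old properties file (default 4.12 location)
mv /usr/local/glassfish4/glassfish/domains/domain1/config/swift.properties ~/swift.properties.bak

# re-create each setting as a JVM option (full list in config.rst#swift-storage)
./asadmin create-jvm-options "\-Ddataverse.files.swift.defaultEndpoint=endpoint1"
./asadmin create-jvm-options "\-Ddataverse.files.swift.username.endpoint1=your-username"

# store the password via an encrypted alias instead of plain text
./asadmin create-jvm-options "\-Ddataverse.files.swift.password.endpoint1='${ALIAS=swiftpassword-alias}'"
./asadmin create-password-alias swiftpassword-alias
```
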
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/conf.py
@@ -65,9 +65,9 @@
# built documents.
#
# The short X.Y version.
-version = '4.12'
+version = '4.13'
# The full version, including alpha/beta/rc tags.
-release = '4.12'
+release = '4.13'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
11 changes: 11 additions & 0 deletions doc/sphinx-guides/source/developers/testing.rst
@@ -224,6 +224,17 @@ One way of generating load is by downloading many files. You can download :downl

The script requires a file called ``files.txt`` to operate; the database IDs of the files you want to download should each be on their own line.
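
For example, ``files.txt`` is just one numeric database ID per line, and the same effect can be approximated with a simple loop (a sketch only; the linked script is the canonical version, and the ``/api/access/datafile/{id}`` endpoint and localhost URL are assumptions about your test installation):

.. code-block:: none

    # files.txt -- one database ID per line
    42
    43
    44

    # hypothetical stand-in for the linked script
    while read id; do
      curl -s -O -J "http://localhost:8080/api/access/datafile/$id"
    done < files.txt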

Continuous Integration
~~~~~~~~~~~~~~~~~~~~~~

The Dataverse Project currently makes use of two Continuous Integration platforms, Travis and Jenkins.

Travis builds are configured via :download:`.travis.yml <../../../../.travis.yml>` and a `GitHub webhook <https://docs.travis-ci.com/user/notifications/#configuring-webhook-notifications>`_; build output is viewable at https://travis-ci.org/IQSS/dataverse/builds.

Our Jenkins config is a work in progress and may be viewed at https://github.com/IQSS/dataverse-jenkins/. A corresponding GitHub webhook is required. Build output is viewable at https://jenkins.dataverse.org/.

As always, pull requests to improve our continuous integration configurations are welcome.

The Phoenix Server
------------------

42 changes: 23 additions & 19 deletions doc/sphinx-guides/source/installation/config.rst
@@ -207,21 +207,29 @@ Swift Storage

Rather than storing data files on the filesystem, you can opt for an experimental setup with a `Swift Object Storage <http://swift.openstack.org>`_ backend. Each dataset that users create gets a corresponding "container" on the Swift side, and each data file is saved as a file within that container.

-**In order to configure a Swift installation,** there are two steps you need to complete:
+**In order to configure a Swift installation,** you need to complete these steps to properly modify the JVM options:

-First, create a file named ``swift.properties`` as follows in the ``config`` directory for your installation of Glassfish (by default, this would be ``/usr/local/glassfish4/glassfish/domains/domain1/config/swift.properties``):
+First, run all the following create commands with your Swift endpoint information and credentials:

.. code-block:: none

-  swift.default.endpoint=endpoint1
-  swift.auth_type.endpoint1=your-authentication-type
-  swift.auth_url.endpoint1=your-auth-url
-  swift.tenant.endpoint1=your-tenant-name
-  swift.username.endpoint1=your-username
-  swift.password.endpoint1=your-password
-  swift.swift_endpoint.endpoint1=your-swift-endpoint
+  ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.defaultEndpoint=endpoint1"
+  ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.authType.endpoint1=your-auth-type"
+  ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.authUrl.endpoint1=your-auth-url"
+  ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.tenant.endpoint1=your-tenant-name"
+  ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.username.endpoint1=your-username"
+  ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.endpoint.endpoint1=your-swift-endpoint"
-``auth_type`` can either be ``keystone``, ``keystone_v3``, or it will assumed to be ``basic``. ``auth_url`` should be your keystone authentication URL which includes the tokens (e.g. for keystone, ``https://openstack.example.edu:35357/v2.0/tokens`` and for keystone_v3, ``https://openstack.example.edu:35357/v3/auth/tokens``). ``swift_endpoint`` is a URL that look something like ``http://rdgw.swift.example.org/swift/v1``.
+``auth_type`` can be ``keystone`` or ``keystone_v3``; otherwise it is assumed to be ``basic``. ``auth_url`` should be your keystone authentication URL, including the tokens path (e.g. for keystone, ``https://openstack.example.edu:35357/v2.0/tokens`` and for keystone_v3, ``https://openstack.example.edu:35357/v3/auth/tokens``). ``swift_endpoint`` is a URL that looks something like ``http://rdgw.swift.example.org/swift/v1``.
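
As a concrete illustration (using the example hostnames from the paragraph above, which are placeholders, not real endpoints), a keystone-based endpoint would be configured roughly like this:

.. code-block:: none

    ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.authType.endpoint1=keystone"
    ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.authUrl.endpoint1=https://openstack.example.edu:35357/v2.0/tokens"
    ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.endpoint.endpoint1=http://rdgw.swift.example.org/swift/v1"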

Then create a password alias by running (without changes):

.. code-block:: none

    ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.password.endpoint1='${ALIAS=swiftpassword-alias}'"
    ./asadmin $ASADMIN_OPTS create-password-alias swiftpassword-alias

The second command will trigger an interactive prompt asking you to input your Swift password.
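
If you need a non-interactive setup (for example in an installer script), ``asadmin`` can typically read the alias password from a file instead of prompting; this sketch assumes the standard ``--passwordfile`` mechanism of Glassfish's ``asadmin``:

.. code-block:: none

    echo "AS_ADMIN_ALIASPASSWORD=your-swift-password" > /tmp/swiftpw.txt
    ./asadmin --passwordfile /tmp/swiftpw.txt create-password-alias swiftpassword-alias
    rm /tmp/swiftpw.txt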

Second, update the JVM option ``dataverse.files.storage-driver-id`` by running the delete command:

@@ -233,21 +241,17 @@ Then run the create command:
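
(The delete and create commands themselves are collapsed in this diff view; they presumably take roughly the following shape, with ``file`` as the previous driver id being an assumption:)

.. code-block:: none

    ./asadmin $ASADMIN_OPTS delete-jvm-options "\-Ddataverse.files.storage-driver-id=file"
    ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.storage-driver-id=swift"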

You also have the option to set a **custom container name separator.** It is initialized to ``_``, but you can change it by running the create command:

-``./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift-folder-path-separator=-"``
+``./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.folderPathSeparator=-"``

By default, your Swift installation will be public-only, meaning users will be unable to put access restrictions on their data. If you are comfortable with this level of privacy, the final step in your setup is to set the :ref:`:PublicInstall` setting to `true`.
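
For reference, database settings such as ``:PublicInstall`` are set through the admin API; a sketch, assuming the API is reachable unblocked on localhost:

``curl -X PUT -d true http://localhost:8080/api/admin/settings/:PublicInstall``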

-In order to **enable file access restrictions**, you must enable Swift to use temporary URLs for file access. To enable usage of temporary URLs, set a hash key both on your swift endpoint and in your swift.properties file. You can do so by adding
-
-.. code-block:: none
-
-  swift.hash_key.endpoint1=your-hash-key
-
-to your swift.properties file.
+In order to **enable file access restrictions**, you must enable Swift to use temporary URLs for file access. To enable usage of temporary URLs, set a hash key both on your Swift endpoint and in your Dataverse installation (as a JVM option). You can do so by running the create command:
+
+``./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.hashKey.endpoint1=your-hash-key"``
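
On the Swift side, the hash key corresponds to OpenStack's temporary URL key; with the standard ``python-swiftclient`` CLI it is usually set like this (an assumption about your Swift tooling, not part of this change):

``swift post -m "Temp-URL-Key:your-hash-key"``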

-You also have the option to set a custom expiration length for a generated temporary URL. It is initialized to 60 seconds, but you can change it by running the create command:
+You also have the option to set a custom expiration length, in seconds, for a generated temporary URL. It is initialized to 60 seconds, but you can change it by running the create command:

-``./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.temp_url_expire=3600"``
+``./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.files.swift.temporaryUrlExpiryTime=3600"``

In this example, you would be setting the expiration length to one hour.

5 changes: 5 additions & 0 deletions doc/sphinx-guides/source/user/dataset-management.rst
@@ -223,6 +223,9 @@ The File Path metadata field is Dataverse's way of representing a file's locatio

A file's File Path can be manually added or edited on the Edit Files page. Changing a file's File Path will change its location in the folder structure that is created when a user downloads the full dataset or a selection of files from it.
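
For instance (an illustrative layout, not taken from this change), files whose File Paths are ``data/raw`` and ``data/code`` would unpack into folders like this when the dataset is downloaded:

.. code-block:: none

    data/
    ├── raw/
    │   └── survey.csv
    └── code/
        └── cleanup.R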

If there is more than one file in the dataset and at least one of them has a non-empty directory path, the Dataset Page will present an option for switching between the traditional table view and a tree-like view of the files that shows the folder structure, as in the example below:

|image-file-tree-view|

File Tags
---------
@@ -506,3 +509,5 @@ If you deaccession the most recently published version of the dataset but not al
:class: img-responsive
.. |file-upload-prov-window| image:: ./img/prov1.png
:class: img-responsive
.. |image-file-tree-view| image:: ./img/file-tree-view.png
:class: img-responsive
3 changes: 2 additions & 1 deletion doc/sphinx-guides/source/versions.rst
@@ -6,8 +6,9 @@ Dataverse Guides Versions

This list provides a way to refer to previous versions of the Dataverse guides, which we still host. In order to learn more about the updates delivered from one version to another, visit the `Releases <https://github.com/IQSS/dataverse/releases>`__ page in our GitHub repo.

-- 4.12
+- 4.13

+- `4.12 </en/4.12/>`__
- `4.11 </en/4.11/>`__
- `4.10.1 </en/4.10/>`__
- `4.10 </en/4.10/>`__
2 changes: 1 addition & 1 deletion pom.xml
@@ -7,7 +7,7 @@
-->
<groupId>edu.harvard.iq</groupId>
<artifactId>dataverse</artifactId>
-<version>4.12</version>
+<version>4.13</version>
<packaging>war</packaging>
<name>dataverse</name>
<properties>
Expand Down
48 changes: 42 additions & 6 deletions src/main/java/edu/harvard/iq/dataverse/ConfigureFragmentBean.java
@@ -12,13 +12,16 @@
import edu.harvard.iq.dataverse.externaltools.ExternalTool;
import edu.harvard.iq.dataverse.externaltools.ExternalToolHandler;
import edu.harvard.iq.dataverse.util.BundleUtil;
import static edu.harvard.iq.dataverse.util.JsfHelper.JH;
import org.primefaces.PrimeFaces;

import java.sql.Timestamp;
import java.util.logging.Logger;
import javax.ejb.EJB;
import javax.faces.application.FacesMessage;
import javax.faces.view.ViewScoped;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.Date;


/**
* This bean is mainly for keeping track of which file the user selected to run external tools on.
@@ -35,16 +38,22 @@ public class ConfigureFragmentBean implements java.io.Serializable{
private ExternalTool tool = null;
private Long fileId = null;
private ExternalToolHandler toolHandler = null;
private String messageApi = "";

@EJB
DataFileServiceBean datafileService;
@Inject
DataverseSession session;
@EJB
AuthenticationServiceBean authService;
@EJB
UserNotificationServiceBean userNotificationService;

-public String configureExternalAlert() {
-JH.addMessage(FacesMessage.SEVERITY_WARN, tool.getDisplayName(), BundleUtil.getStringFromBundle("file.configure.launchMessage.details") + " " + tool.getDisplayName() + ".");
+public String configureExternalAlert() {
+generateApiToken();
+PrimeFaces.current().executeScript("location.reload(true)");
+String httpString = "window.open('" + toolHandler.getToolUrlWithQueryParams() + "','_blank'" + ")";
+PrimeFaces.current().executeScript(httpString);
return "";
}

@@ -79,13 +88,40 @@ public ExternalToolHandler getConfigurePopupToolHandler() {
if (user instanceof AuthenticatedUser) {
apiToken = authService.findApiTokenByUser((AuthenticatedUser) user);
}
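// If the user's API token is missing or expired, surface a message in the configure
// popup noting that a new token will be generated when the tool is launched.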
if ((apiToken == null) || (apiToken.getExpireTime().before(new Date()))) {
messageApi = BundleUtil.getStringFromBundle("configurefragmentbean.apiTokenGenerated");
} else {
messageApi = "";
}


toolHandler = new ExternalToolHandler(tool, datafileService.find(fileId), apiToken);

return toolHandler;
}

public void generateApiToken() {
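// If the logged-in user's API token is missing or expired, generate a new one, refresh
// the external tool URL with it, and send the user an APIGENERATED notification.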

ApiToken apiToken = new ApiToken();
User user = session.getUser();
if (user instanceof AuthenticatedUser) {
apiToken = authService.findApiTokenByUser((AuthenticatedUser) user);
if ((apiToken == null) || (apiToken.getExpireTime().before(new Date()))) {
apiToken = authService.generateApiTokenForUser((AuthenticatedUser) user);
toolHandler.setApiToken(apiToken);
toolHandler.getToolUrlWithQueryParams();
userNotificationService.sendNotification((AuthenticatedUser) user, new Timestamp(new Date().getTime()), UserNotification.Type.APIGENERATED, null);
}
}

}

-public void setConfigureFileId(Long setFileId)
-{
+public void setConfigureFileId(Long setFileId) {
fileId = setFileId;
}

public String getMessageApi() {
return messageApi;
}

}
11 changes: 11 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java
@@ -1678,4 +1678,15 @@ public String getPhysicalFileToDelete(DataFile dataFile) {
}
return null;
}

public boolean isFoldersMetadataPresentInVersion(DatasetVersion datasetVersion) {
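// Cheap existence check: true if any file in this dataset version has a non-empty
// directoryLabel (folder metadata), using a native LIMIT 1 query; any database error
// is treated as "no folder metadata present".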
Query query = em.createNativeQuery("SELECT id FROM fileMetadata WHERE datasetversion_id="+datasetVersion.getId()+" AND directoryLabel IS NOT null LIMIT 1");

try {
int count = query.getResultList().size();
return count > 0;
} catch (Exception ex) {
return false;
}
}
}
