
5028 Add dataset level external tools #6059

Merged Sep 13, 2019

Commits (30)
f699fd6
Added placeholder Explore btn to dataset pg to be wired up to backend…
mheppler Jul 25, 2019
48d1181
add dataset level external tools #5028
pdurbin Jul 26, 2019
0390e3c
return scope in listing, GET tool by id #5028
pdurbin Jul 26, 2019
6fe2996
adding release notes for dataset explore
djbrooke Jul 31, 2019
e9dad22
Merge branch 'develop' into 5028-dataset-explore-btn
sekmiller Jul 31, 2019
d8f7a96
Merge branch 'develop' into 5028-dataset-explore-btn
sekmiller Aug 5, 2019
c0df37d
removing release note because it's handled by flyway
djbrooke Aug 8, 2019
162207e
get tests passing #5028
pdurbin Aug 26, 2019
319141a
enforce datasetId or datasetPid requirement #5028
pdurbin Aug 27, 2019
3d522aa
allow content type to be null #5028
pdurbin Aug 27, 2019
cc99058
support file PIDs #5028
pdurbin Aug 27, 2019
952c30b
Merge branch 'develop' into 5028-dataset-explore-btn #5028
pdurbin Aug 27, 2019
336a424
adjust docs #5028
pdurbin Aug 27, 2019
8075283
Merge branch 'develop' into 5028-dataset-explore-btn #5028
pdurbin Aug 29, 2019
94f8127
fix typo in method name (remove "scope") #5028
pdurbin Aug 29, 2019
e92c8f1
reduce code duplication #5028
pdurbin Aug 29, 2019
969d5ed
fix assertion in test #5028
pdurbin Aug 29, 2019
f8e1e0e
prevent 500 error if invalid type is supplied #5028
pdurbin Aug 30, 2019
801bf56
rename SQL script to reflect bump to 4.16 #5028
pdurbin Sep 5, 2019
e2e34d3
improve external tools documentation #5028
pdurbin Sep 9, 2019
1578595
Link to the Admin Guide page on external tools #5028
pdurbin Sep 10, 2019
50d6cda
Merge branch 'develop' into 5028-dataset-explore-btn #5028
pdurbin Sep 10, 2019
c89819a
move Building External Tools to API Guide #5028
pdurbin Sep 10, 2019
4897de4
add lots more content for external tool makers #5028
pdurbin Sep 12, 2019
7828dea
ignore python virutal environments (venv)
pdurbin Sep 12, 2019
760d670
Merge branch 'develop' into 5028-dataset-explore-btn #5028
pdurbin Sep 12, 2019
6dd0cfc
rename flyway script # 5028
pdurbin Sep 12, 2019
519662a
get deployment working again for new installations #6165
pdurbin Sep 12, 2019
09766f1
Merge branch '6165-cannot-deploy' into 5028-dataset-explore-btn #5028
pdurbin Sep 12, 2019
06105e8
move testing methods to /api/admin/test #5028 #4137
pdurbin Sep 12, 2019
2 changes: 1 addition & 1 deletion conf/docker-aio/run-test-suite.sh
@@ -8,4 +8,4 @@ fi

# Please note the "dataverse.test.baseurl" is set to run for "all-in-one" Docker environment.
# TODO: Rather than hard-coding the list of "IT" classes here, add a profile to pom.xml.
mvn test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT -Ddataverse.test.baseurl=$dvurl
mvn test -Dtest=DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT,ExternalToolsIT -Ddataverse.test.baseurl=$dvurl
@@ -2,6 +2,7 @@
"displayName": "Awesome Tool",
"description": "The most awesome tool.",
"type": "explore",
"scope": "file",
"contentType": "text/tab-separated-values",
"toolUrl": "https://awesometool.com",
"toolParameters": {
13 changes: 8 additions & 5 deletions doc/sphinx-guides/source/installation/external-tools.rst
@@ -9,7 +9,7 @@ External tools can provide additional features that are not part of Dataverse it
Inventory of External Tools
---------------------------

Support for external tools is just getting off the ground but the following tools have been successfully integrated with Dataverse:
The following tools have been successfully integrated with Dataverse:

- TwoRavens: a system of interlocking statistical tools for data exploration, analysis, and meta-analysis: http://2ra.vn. See the :doc:`/user/data-exploration/tworavens` section of the User Guide for more information on TwoRavens from the user perspective and the :doc:`r-rapache-tworavens` section of the Installation Guide.

@@ -35,14 +35,17 @@ External tools must be expressed in an external tool manifest file, a specific J

``type`` is required and must be ``explore`` or ``configure`` to make the tool appear under a button called "Explore" or "Configure", respectively.

External tools can operate on any file, including tabular files that have been created by successful ingestion. (For more on ingest, see the :doc:`/user/tabulardataingest/ingestprocess` of the User Guide.) The optional ``contentType`` entry specifies the mimetype a tool works on. (Not providing this parameter makes the tool work on ingested tabular files and is equivalent to specifying the ``contentType`` as "text/tab-separated-values".)
``scope`` is required and must be ``file`` or ``dataset`` to make the tool appear at the file level or dataset level.

File level tools can operate on any file, including tabular files that have been created by successful ingestion. (For more on ingest, see the :doc:`/user/tabulardataingest/ingestprocess` section of the User Guide.) The optional ``contentType`` entry specifies the mimetype a tool works on. (Not providing this parameter makes the tool work on ingested tabular files and is equivalent to specifying the ``contentType`` as "text/tab-separated-values".)

In the example above, a mix of required and optional reserved words appear that can be used to insert dynamic values into tools. The supported values are:

- ``{fileId}`` (required) - The Dataverse database ID of a file the external tool has been launched on.
- ``{siteUrl}`` (optional) - The URL of the Dataverse installation that hosts the file with the fileId above.
- ``{fileId}`` (required for file tools) - The Dataverse database ID of a file from which the external tool has been launched.
- ``{siteUrl}`` (optional) - The URL of the Dataverse installation from which the tool was launched.
- ``{apiToken}`` (optional) - The Dataverse API token of the user launching the external tool, if available.
- ``{datasetId}`` (optional) - The ID of the dataset containing the file.
- ``{datasetId}`` (optional) - The ID of the dataset.
- ``{datasetPid}`` (optional) - The Persistent ID (DOI or Handle) of the dataset.
Member:

Why this and not file PID too? Since Dataverse is always generating this, I'm not sure it helps anyone to have both id and pid (a tool can get the pid), but it seems like file and dataset should be the same.
Also - right now we check for fileId existing (for file scope) as a required condition. For dataset, and file if it is made parallel, the condition will have to change to be that id or pid is required.

Member (author):

Good point. The focus right now is on dataset level tools but sure, if a file PID exists it wouldn't hurt to expose it. Tool makers should just understand that they might not always get a file PID.

I'm not sure if I grok your other comment but thanks for making it. 😄 Someone else will be picking this up while I'm on vacation, I suspect.

- ``{datasetVersion}`` (optional) - The friendly version number ( or \:draft ) of the dataset version the tool is being launched from.
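Putting these pieces together, a complete dataset-level manifest might look like the following. This is a hypothetical sketch: the tool name, URL, and the particular reserved words chosen are invented, and (per the review discussion above) exactly which reserved words are required for ``dataset`` scope was still being settled in this pull request.

```json
{
  "displayName": "Awesome Dataset Tool",
  "description": "Explores an entire dataset.",
  "type": "explore",
  "scope": "dataset",
  "toolUrl": "https://awesometool.com",
  "toolParameters": {
    "queryParameters": [
      {"datasetPid": "{datasetPid}"},
      {"siteUrl": "{siteUrl}"}
    ]
  }
}
```

Note that ``contentType`` is omitted: it applies to file level tools. At launch, Dataverse would expand this into a URL along the lines of ``https://awesometool.com?datasetPid=doi:10.5072/FK2/ABC123&siteUrl=https://demo.dataverse.org`` (example PID invented).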

Making an External Tool Available in Dataverse
23 changes: 23 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -6,8 +6,10 @@
import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
import edu.harvard.iq.dataverse.authorization.users.ApiToken;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.authorization.users.PrivateUrlUser;
import edu.harvard.iq.dataverse.authorization.users.User;
import edu.harvard.iq.dataverse.branding.BrandingUtil;
import edu.harvard.iq.dataverse.dataaccess.StorageIO;
import edu.harvard.iq.dataverse.dataaccess.ImageThumbConverter;
@@ -99,6 +101,7 @@
import edu.harvard.iq.dataverse.externaltools.ExternalTool;
import edu.harvard.iq.dataverse.externaltools.ExternalToolServiceBean;
import edu.harvard.iq.dataverse.export.SchemaDotOrgExporter;
import edu.harvard.iq.dataverse.externaltools.ExternalToolHandler;
import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean;
import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean.MakeDataCountEntry;
import java.util.Collections;
@@ -135,6 +138,7 @@
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrDocumentList;
import org.primefaces.PrimeFaces;
import org.primefaces.model.DefaultTreeNode;
import org.primefaces.model.TreeNode;

@@ -316,6 +320,7 @@ public void setShowIngestSuccess(boolean showIngestSuccess) {
List<ExternalTool> exploreTools = new ArrayList<>();
Map<Long, List<ExternalTool>> configureToolsByFileId = new HashMap<>();
Map<Long, List<ExternalTool>> exploreToolsByFileId = new HashMap<>();
private List<ExternalTool> datasetExploreTools;

public Boolean isHasRsyncScript() {
return hasRsyncScript;
@@ -2016,6 +2021,7 @@ private String init(boolean initFull) {

configureTools = externalToolService.findByType(ExternalTool.Type.CONFIGURE);
exploreTools = externalToolService.findByType(ExternalTool.Type.EXPLORE);
Member:

findByType isn't filtering by scope yet - should these use findByTypeAndScope with file scope?

Member (author):

Yes, they probably should.

datasetExploreTools = externalToolService.findByScopeAndType(ExternalTool.Scope.DATASET, ExternalTool.Type.EXPLORE);
rowsPerPage = 10;


@@ -5052,6 +5058,10 @@ public List<ExternalTool> getCachedToolsForDataFile(Long fileId, ExternalTool.Ty
return cachedTools;
}

public List<ExternalTool> getDatasetExploreTools() {
return datasetExploreTools;
}

Boolean thisLatestReleasedVersion = null;

public boolean isThisLatestReleasedVersion() {
@@ -5200,4 +5210,17 @@ public int compare(FileMetadata o1, FileMetadata o2) {
return type1.compareTo(type2);
}
};

public void explore(ExternalTool externalTool) {
ApiToken apiToken = null;
User user = session.getUser();
if (user instanceof AuthenticatedUser) {
apiToken = authService.findApiTokenByUser((AuthenticatedUser) user);
}
ExternalToolHandler externalToolHandler = new ExternalToolHandler(externalTool, dataset, apiToken);
String toolUrl = externalToolHandler.getToolUrlWithQueryParams();
logger.fine("Exploring with " + toolUrl);
PrimeFaces.current().executeScript("window.open('"+toolUrl + "', target='_blank');");
}

}
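The `explore(ExternalTool)` action above delegates URL construction to `ExternalToolHandler.getToolUrlWithQueryParams()`, which resolves each reserved word in the manifest's `queryParameters` and joins the results onto `toolUrl`. A minimal standalone sketch of that joining step (class and method names here are invented for illustration; the real handler also resolves the reserved words first):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ToolUrlSketch {

    // Joins already-resolved query parameters onto the tool's base URL,
    // mimicking the shape of what getToolUrlWithQueryParams() produces.
    static String buildUrl(String toolUrl, Map<String, String> resolvedParams) {
        StringBuilder url = new StringBuilder(toolUrl);
        String separator = "?";
        for (Map.Entry<String, String> entry : resolvedParams.entrySet()) {
            url.append(separator).append(entry.getKey()).append("=").append(entry.getValue());
            separator = "&";
        }
        return url.toString();
    }

    public static void main(String[] args) {
        // Reserved words like {datasetPid} and {siteUrl} would already have
        // been resolved by the handler's getQueryParam() switch.
        Map<String, String> params = new LinkedHashMap<>();
        params.put("datasetPid", "doi:10.5072/FK2/ABC123"); // invented example PID
        params.put("siteUrl", "https://demo.dataverse.org");
        System.out.println(buildUrl("https://awesometool.com", params));
        // prints: https://awesometool.com?datasetPid=doi:10.5072/FK2/ABC123&siteUrl=https://demo.dataverse.org
    }
}
```

Note this sketch does no URL-encoding of values; whether the real handler should is a separate question.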
@@ -23,6 +23,7 @@ public class ExternalTool implements Serializable {
public static final String DISPLAY_NAME = "displayName";
public static final String DESCRIPTION = "description";
public static final String TYPE = "type";
public static final String SCOPE = "scope";
public static final String TOOL_URL = "toolUrl";
public static final String TOOL_PARAMETERS = "toolParameters";
public static final String CONTENT_TYPE = "contentType";
@@ -52,6 +53,13 @@ public class ExternalTool implements Serializable {
@Enumerated(EnumType.STRING)
private Type type;

/**
* Whether the tool operates at the dataset or file level.
*/
@Column(nullable = false)
@Enumerated(EnumType.STRING)
private Scope scope;

@Column(nullable = false)
private String toolUrl;

@@ -83,10 +91,11 @@ public class ExternalTool implements Serializable {
public ExternalTool() {
}

public ExternalTool(String displayName, String description, Type type, String toolUrl, String toolParameters, String contentType) {
public ExternalTool(String displayName, String description, Type type, Scope scope, String toolUrl, String toolParameters, String contentType) {
this.displayName = displayName;
this.description = description;
this.type = type;
this.scope = scope;
this.toolUrl = toolUrl;
this.toolParameters = toolParameters;
this.contentType = contentType;
@@ -120,6 +129,34 @@ public String toString() {
}
}

public enum Scope {

DATASET("dataset"),
FILE("file");

private final String text;

private Scope(final String text) {
this.text = text;
}

public static Scope fromString(String text) {
if (text != null) {
for (Scope scope : Scope.values()) {
if (text.equals(scope.text)) {
return scope;
}
}
}
throw new IllegalArgumentException("Scope must be one of these values: " + Arrays.asList(Scope.values()) + ".");
}

@Override
public String toString() {
return text;
}
}
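One subtlety worth noting: `fromString` matches the lowercase manifest strings ("dataset", "file"), while `@Enumerated(EnumType.STRING)` stores the enum names (DATASET, FILE) in the database — which is why the Flyway script below writes 'FILE' in uppercase. A self-contained sketch (the wrapping demo class is invented; the enum body mirrors the one above):

```java
import java.util.Arrays;

public class ScopeDemo {

    // Mirror of ExternalTool.Scope above, reproduced so the demo is runnable.
    public enum Scope {
        DATASET("dataset"),
        FILE("file");

        private final String text;

        Scope(final String text) {
            this.text = text;
        }

        public static Scope fromString(String text) {
            if (text != null) {
                for (Scope scope : Scope.values()) {
                    if (text.equals(scope.text)) {
                        return scope;
                    }
                }
            }
            throw new IllegalArgumentException("Scope must be one of these values: " + Arrays.asList(Scope.values()) + ".");
        }

        @Override
        public String toString() {
            return text;
        }
    }

    public static void main(String[] args) {
        System.out.println(Scope.fromString("dataset")); // prints "dataset"
        // Matching is case-sensitive: "DATASET" (the enum name) is rejected.
        try {
            Scope.fromString("DATASET");
        } catch (IllegalArgumentException expected) {
            System.out.println("rejected: DATASET");
        }
    }
}
```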

public Long getId() {
return id;
}
@@ -148,6 +185,10 @@ public Type getType() {
return type;
}

public Scope getScope() {
return scope;
}

public String getToolUrl() {
return toolUrl;
}
@@ -193,7 +234,10 @@ public enum ReservedWord {
FILE_ID("fileId"),
SITE_URL("siteUrl"),
API_TOKEN("apiToken"),
// datasetId is the database id
DATASET_ID("datasetId"),
// datasetPid is the DOI or Handle
DATASET_PID("datasetPid"),
DATASET_VERSION("datasetVersion"),
FILE_METADATA_ID("fileMetadataId");

@@ -33,6 +33,8 @@ public class ExternalToolHandler {
private ApiToken apiToken;

/**
* File level tool
*
* @param externalTool The database entity.
* @param dataFile Required.
* @param apiToken The apiToken can be null because "explore" tools can be
@@ -51,6 +53,27 @@ public ExternalToolHandler(ExternalTool externalTool, DataFile dataFile, ApiToke
this.fileMetadata = fileMetadata;
}

/**
* Dataset level tool
*
* @param externalTool The database entity.
* @param dataset Required.
* @param apiToken The apiToken can be null because "explore" tools can be
* used anonymously.
*/
public ExternalToolHandler(ExternalTool externalTool, Dataset dataset, ApiToken apiToken) {
this.externalTool = externalTool;
if (dataset == null) {
String error = "A Dataset is required.";
logger.warning("Error in ExternalToolHandler constructor: " + error);
throw new IllegalArgumentException(error);
}
this.dataset = dataset;
this.apiToken = apiToken;
this.dataFile = null;
this.fileMetadata = null;
}

public DataFile getDataFile() {
return dataFile;
}
@@ -89,7 +112,7 @@ private String getQueryParam(String key, String value) {
ReservedWord reservedWord = ReservedWord.fromString(value);
switch (reservedWord) {
case FILE_ID:
// getDataFile is never null because of the constructor
// getDataFile is never null for file tools because of the constructor
return key + "=" + getDataFile().getId();
case SITE_URL:
return key + "=" + SystemConfig.getDataverseSiteUrlStatic();
@@ -103,6 +126,8 @@
break;
case DATASET_ID:
return key + "=" + dataset.getId();
case DATASET_PID:
return key + "=" + dataset.getGlobalId().asString();
case DATASET_VERSION:
String version = null;
if (getApiToken() != null) {
@@ -4,12 +4,14 @@
import edu.harvard.iq.dataverse.DataFileServiceBean;
import edu.harvard.iq.dataverse.externaltools.ExternalTool.ReservedWord;
import edu.harvard.iq.dataverse.externaltools.ExternalTool.Type;
import edu.harvard.iq.dataverse.externaltools.ExternalTool.Scope;

import static edu.harvard.iq.dataverse.externaltools.ExternalTool.DESCRIPTION;
import static edu.harvard.iq.dataverse.externaltools.ExternalTool.DISPLAY_NAME;
import static edu.harvard.iq.dataverse.externaltools.ExternalTool.TOOL_PARAMETERS;
import static edu.harvard.iq.dataverse.externaltools.ExternalTool.TOOL_URL;
import static edu.harvard.iq.dataverse.externaltools.ExternalTool.TYPE;
import static edu.harvard.iq.dataverse.externaltools.ExternalTool.SCOPE;
import static edu.harvard.iq.dataverse.externaltools.ExternalTool.CONTENT_TYPE;
import java.io.StringReader;
import java.util.ArrayList;
@@ -74,7 +76,21 @@ public List<ExternalTool> findByType(Type type, String contentType) {
return externalTools;
}


/**
* @param scope - dataset or file
* @return A list of tools or an empty list.
*/
public List<ExternalTool> findByScopeAndType(Scope scope, Type type) {
List<ExternalTool> externalTools = new ArrayList<>();
TypedQuery<ExternalTool> typedQuery = em.createQuery("SELECT OBJECT(o) FROM ExternalTool AS o WHERE o.scope = :scope AND o.type = :type", ExternalTool.class);
typedQuery.setParameter("scope", scope);
typedQuery.setParameter("type", type);
List<ExternalTool> toolsFromQuery = typedQuery.getResultList();
if (toolsFromQuery != null) {
externalTools = toolsFromQuery;
}
return externalTools;
}

public ExternalTool findById(long id) {
TypedQuery<ExternalTool> typedQuery = em.createQuery("SELECT OBJECT(o) FROM ExternalTool AS o WHERE o.id = :id", ExternalTool.class);
@@ -131,6 +147,7 @@ public static ExternalTool parseAddExternalToolManifest(String manifest) {
String displayName = getRequiredTopLevelField(jsonObject, DISPLAY_NAME);
String description = getRequiredTopLevelField(jsonObject, DESCRIPTION);
String typeUserInput = getRequiredTopLevelField(jsonObject, TYPE);
String scopeUserInput = getRequiredTopLevelField(jsonObject, SCOPE);
String contentType = getOptionalTopLevelField(jsonObject, CONTENT_TYPE);
//Legacy support - assume tool manifests without any mimetype are for tabular data
if(contentType==null) {
Member:

This is going to set a content type on dataset level tools

Member (author):

Yes, this is a bug. We don't want dataset tools to have a content type. Good catch. Thanks.

Contributor:

@qqmyers eventually we may use this - for example, show the dataset level explore only if the dataset has at least one file of content type X - but for now we're going with simple.

Contributor:

If I understand @scolapasta's suggestion - we may have a use for the content type, with some dataset-level external tools; but it sounds like it still needs to be nullable in the table; and it still should be null by default.
In other words, if we adopt this scheme, then we definitely DON'T want to set the type to tab-delimited by default.

@@ -139,26 +156,29 @@ public static ExternalTool parseAddExternalToolManifest(String manifest) {

// Allow IllegalArgumentException to bubble up from ExternalTool.Type.fromString
ExternalTool.Type type = ExternalTool.Type.fromString(typeUserInput);
ExternalTool.Scope scope = ExternalTool.Scope.fromString(scopeUserInput);
String toolUrl = getRequiredTopLevelField(jsonObject, TOOL_URL);
JsonObject toolParametersObj = jsonObject.getJsonObject(TOOL_PARAMETERS);
JsonArray queryParams = toolParametersObj.getJsonArray("queryParameters");
boolean allRequiredReservedWordsFound = false;
for (JsonObject queryParam : queryParams.getValuesAs(JsonObject.class)) {
Set<String> keyValuePair = queryParam.keySet();
for (String key : keyValuePair) {
String value = queryParam.getString(key);
ReservedWord reservedWord = ReservedWord.fromString(value);
if (reservedWord.equals(ReservedWord.FILE_ID)) {
allRequiredReservedWordsFound = true;
if (scope.equals(Scope.FILE)) {
Member:

else Scope.DATASET - just not written yet?

Member (author):

Well, I'm not sure. Should dataset tools have any reserved words that are required?

Member:

without an else statement, allRequiredReservedWordsFound is false and an exception happens so this method doesn't work for scope dataset as is.
w.r.t. reserved words - it would be odd/bad to configure a dataset tool to not get the dataset id (or pid) in the same way that a file tool needs a file id (or pid if that's implemented).

Member:

ah sorry - I see the exception is in the if statement - probably should not define the allRequiredWordsFound outside the if clause then (to avoid confusing people like me :-) )

for (JsonObject queryParam : queryParams.getValuesAs(JsonObject.class)) {
Set<String> keyValuePair = queryParam.keySet();
for (String key : keyValuePair) {
String value = queryParam.getString(key);
ReservedWord reservedWord = ReservedWord.fromString(value);
if (reservedWord.equals(ReservedWord.FILE_ID)) {
allRequiredReservedWordsFound = true;
}
}
}
}
if (!allRequiredReservedWordsFound) {
// Some day there might be more reserved words than just {fileId}.
throw new IllegalArgumentException("Required reserved word not found: " + ReservedWord.FILE_ID.toString());
if (!allRequiredReservedWordsFound) {
// Some day there might be more reserved words than just {fileId}.
throw new IllegalArgumentException("Required reserved word not found: " + ReservedWord.FILE_ID.toString());
}
}
String toolParameters = toolParametersObj.toString();
return new ExternalTool(displayName, description, type, toolUrl, toolParameters, contentType);
return new ExternalTool(displayName, description, type, scope, toolUrl, toolParameters, contentType);
}

private static String getRequiredTopLevelField(JsonObject jsonObject, String key) {
@@ -0,0 +1,3 @@
ALTER TABLE externaltool ADD COLUMN IF NOT EXISTS scope VARCHAR(255);
UPDATE externaltool SET scope = 'FILE';
ALTER TABLE externaltool ALTER COLUMN scope SET NOT NULL;
Member:

should content type now be nullable (in the db and class definition) since it isn't used for scope dataset?

Member (author):

As mentioned above, due to a bug in the pull request, dataset tools get the default content type. Yes, I think this field should be made nullable (and the bug fixed).

Contributor:

So what is the status of this - is this being addressed? I.e., is the bug in ExternalToolServiceBean getting fixed in this PR, and is ContentType being made nullable?
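If the nullability fix discussed in this thread were done as a follow-up migration, it could look something like the sketch below. This is an assumption, not part of the pull request: the Postgres column name (`contenttype`, lowercased from the `contentType` field) and the decision to NULL out the value for dataset-scope rows are both inferred from the discussion above.

```sql
-- Sketch of the discussed fix: make contenttype nullable and clear it for
-- dataset-scope tools, which don't use it (column name assumed).
ALTER TABLE externaltool ALTER COLUMN contenttype DROP NOT NULL;
UPDATE externaltool SET contenttype = NULL WHERE scope = 'DATASET';
```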

30 changes: 30 additions & 0 deletions src/main/webapp/dataset.xhtml
Original file line number Diff line number Diff line change
@@ -130,6 +130,36 @@
</ui:fragment>
</ul>
</div>



<div class="btn-group pull-right" jsf:rendered="#{!DatasetPage.dataset.deaccessioned}">
<!-- Explore Button Group -->
<ui:fragment rendered="#{DatasetPage.datasetExploreTools.size()==1}">
<button type="button" class="btn btn-default btn-explore" onclick="$(this).parent().find( 'li > a' ).trigger( 'click' );">
<span class="glyphicon glyphicon-equalizer"/> #{bundle.explore}
</button>
</ui:fragment>
<ui:fragment rendered="#{DatasetPage.datasetExploreTools.size()>1}">
<button type="button" class="btn btn-default btn-explore dropdown-toggle" data-toggle="dropdown">
<span class="glyphicon glyphicon-equalizer"/> #{bundle.explore} <span class="caret"></span>
</button>
</ui:fragment>

<ul class="dropdown-menu pull-left text-left" role="menu">
<!-- Explore tool links -->
<ui:repeat var="tool" value="#{DatasetPage.datasetExploreTools}">
<li>
<h:commandLink action="#{DatasetPage.explore(tool)}">
<h:outputText value="#{tool.displayName}"/>
</h:commandLink>
</li>
</ui:repeat>
</ul>
</div>



<div class="btn-group pull-right" role="group">
<!-- Edit/Publish Button Group -->
<!-- Publish/Submit for Review/Return to Author Button Group -->