Update date annotator description (#178)
* Update date annotator description

* Update description of the three annotators
tschaffter authored Feb 23, 2021
1 parent a39520d commit af5b11e
Showing 3 changed files with 112 additions and 16 deletions.
44 changes: 38 additions & 6 deletions openapi/date-annotator/openapi.yaml
@@ -11,16 +11,48 @@ info:
url: https://github.com/nlpsandbox/nlpsandbox-schemas/blob/develop/LICENSE
x-logo:
url: https://nlpsandbox.github.io/nlpsandbox-schemas/logo.png
description: |
# Overview
description: >
# Introduction
This NLP tool detects references of dates in the clinical note given as
input and returns a list of date annotations.
The Date Annotator is one of the first types of NLP Tools that can be
benchmarked on [nlpsandbox.io]. A Date Annotator takes as input a clinical
note and outputs a list of predicted date annotations found in the clinical
note. This OpenAPI document describes the specification of a Date Annotator.
This specification includes the schemas of the input and output data, and
the conditions that this annotator must meet if you want to benchmark its
performance on [nlpsandbox.io].
# Getting Started
The GitHub repository [nlpsandbox/date-annotator-example] provides a simple
example implementation of a Python-Flask Date Annotator. By the end of the
tutorial available in this repository, you will have built a Docker image
for a simple Date Annotator. You will then be able to submit this image to
[nlpsandbox.io] to benchmark its performance.
# Benchmarking Requirements
The following conditions must be met by your Date Annotator if you want to
benchmark its performance on [nlpsandbox.io].
- The endpoint `/` must redirect to `/api/v1/tool`.
- The endpoint `/ui` must redirect to the web interface (UI).
- The output of this tool must be reproducible: a given input should always
generate the same output.
- This tool must not attempt to connect to a remote server, for reasons of
reproducibility, robustness, and security. When benchmarked on
[nlpsandbox.io], this tool will be prevented from connecting to remote
servers.
# Examples
- [NLP Sandbox Date Annotator (Python)](https://github.com/nlpsandbox/date-annotator-example)
- [NLP Sandbox Date Annotator (Java)](https://github.com/nlpsandbox/date-annotator-example-java)
- [Date Annotator Example (Python)]
- [Date Annotator Example (Java)]
<!-- Links -->
[nlpsandbox.io]: https://nlpsandbox.io
[nlpsandbox/date-annotator-example]: https://github.com/nlpsandbox/date-annotator-example
[Date Annotator Example (Python)]: https://github.com/nlpsandbox/date-annotator-example
[Date Annotator Example (Java)]: https://github.com/nlpsandbox/date-annotator-example-java
tags:
- name: HealthCheck
description: Operations about health checks
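The redirect requirements listed in the description above can be sketched as a minimal Flask app. This is a hypothetical illustration, not code from this commit; the `/api/v1/ui` redirect target for the web interface is an assumption.

```python
from flask import Flask, redirect

app = Flask(__name__)


@app.route("/")
def root():
    # The root endpoint must redirect to the tool metadata endpoint.
    return redirect("/api/v1/tool", code=302)


@app.route("/ui")
def ui():
    # /ui must redirect to the web interface (the /api/v1/ui path is assumed).
    return redirect("/api/v1/ui", code=302)
```

A real annotator would additionally implement the annotation endpoint defined by this OpenAPI document and produce deterministic output, as required for benchmarking.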
41 changes: 36 additions & 5 deletions openapi/person-name-annotator/openapi.yaml
@@ -11,15 +11,46 @@ info:
url: https://github.com/nlpsandbox/nlpsandbox-schemas/blob/develop/LICENSE
x-logo:
url: https://nlpsandbox.github.io/nlpsandbox-schemas/logo.png
description: |
# Overview
description: >
# Introduction
This NLP tool detects references of person names in the clinical note given
as input and returns a list of person name annotations.
The Person Name Annotator is one of the first types of NLP Tools that can be
benchmarked on [nlpsandbox.io]. A Person Name Annotator takes as input a
clinical note and outputs a list of predicted person name annotations found
in the clinical note. This OpenAPI document describes the specification of a
Person Name Annotator. This specification includes the schemas of the input
and output data, and the conditions that this annotator must meet if you
want to benchmark its performance on [nlpsandbox.io].
# Getting Started
The GitHub repository [nlpsandbox/person-name-annotator-example] provides a
simple example implementation of a Python-Flask Person Name Annotator. By
the end of the tutorial available in this repository, you will have built a
Docker image for a simple Person Name Annotator. You will then be able to
submit this image to [nlpsandbox.io] to benchmark its performance.
# Benchmarking Requirements
The following conditions must be met by your Person Name Annotator if you
want to benchmark its performance on [nlpsandbox.io].
- The endpoint `/` must redirect to `/api/v1/tool`.
- The endpoint `/ui` must redirect to the web interface (UI).
- The output of this tool must be reproducible: a given input should always
generate the same output.
- This tool must not attempt to connect to a remote server, for reasons of
reproducibility, robustness, and security. When benchmarked on
[nlpsandbox.io], this tool will be prevented from connecting to remote
servers.
# Examples
- [NLP Sandbox Person Name Annotator (Python)](https://github.com/nlpsandbox/person-name-annotator-example)
- [Person Name Annotator Example (Python)]
<!-- Links -->
[nlpsandbox.io]: https://nlpsandbox.io
[nlpsandbox/person-name-annotator-example]: https://github.com/nlpsandbox/person-name-annotator-example
[Person Name Annotator Example (Python)]: https://github.com/nlpsandbox/person-name-annotator-example
tags:
- name: HealthCheck
description: Operations about health checks
43 changes: 38 additions & 5 deletions openapi/physical-address-annotator/openapi.yaml
@@ -11,15 +11,48 @@ info:
url: https://github.com/nlpsandbox/nlpsandbox-schemas/blob/develop/LICENSE
x-logo:
url: https://nlpsandbox.github.io/nlpsandbox-schemas/logo.png
description: |
# Overview
description: >
# Introduction
This NLP tool detects references of physical addresses in the clinical note
given as input and returns a list of physical address annotations.
The Physical Address Annotator is one of the first types of NLP Tools that
can be benchmarked on [nlpsandbox.io]. A Physical Address Annotator takes as
input a clinical note and outputs a list of predicted physical address
annotations found in the clinical note. This OpenAPI document describes the
specification of a Physical Address Annotator. This specification includes
the schemas of the input and output data, and the conditions that this
annotator must meet if you want to benchmark its performance on
[nlpsandbox.io].
# Getting Started
The GitHub repository [nlpsandbox/physical-address-annotator-example]
provides a simple example implementation of a Python-Flask Physical Address
Annotator. By the end of the tutorial available in this repository, you will
have built a Docker image for a simple Physical Address Annotator. You will
then be able to submit this image to [nlpsandbox.io] to benchmark its
performance.
# Benchmarking Requirements
The following conditions must be met by your Physical Address Annotator if
you want to benchmark its performance on [nlpsandbox.io].
- The endpoint `/` must redirect to `/api/v1/tool`.
- The endpoint `/ui` must redirect to the web interface (UI).
- The output of this tool must be reproducible: a given input should always
generate the same output.
- This tool must not attempt to connect to a remote server, for reasons of
reproducibility, robustness, and security. When benchmarked on
[nlpsandbox.io], this tool will be prevented from connecting to remote
servers.
# Examples
- [NLP Sandbox Physical Address Annotator (Python)](https://github.com/nlpsandbox/physical-address-annotator-example)
- [Physical Address Annotator Example (Python)]
<!-- Links -->
[nlpsandbox.io]: https://nlpsandbox.io
[nlpsandbox/physical-address-annotator-example]: https://github.com/nlpsandbox/physical-address-annotator-example
[Physical Address Annotator Example (Python)]: https://github.com/nlpsandbox/physical-address-annotator-example
tags:
- name: HealthCheck
description: Operations about health checks
