Doc: Fix Build Namespace and Broken Links (#3925)
* Modify doc build and test instructions to match how readthedocs does it.
* Update pymdown snippets to allow inclusion outside of the manual root.
* Update pymdown snippets to fail the build if a snippet can't be found.

* Remove the symlink to code-examples; make the example pages refer directly to the examples in the source directory.

* Fix up broken internal references.
dthain authored Aug 29, 2024
1 parent 7517035 commit 034aba1
Showing 12 changed files with 25 additions and 24 deletions.
2 changes: 1 addition & 1 deletion doc/README.md
@@ -12,7 +12,7 @@ explain the principles of operations, give examples for new users to follow, and
as a general reference.

**Local Build**. The manual is written using [mkdocs flavored markdown](https://www.mkdocs.org/user-guide/writing-your-docs/).
-To build and test the documentation locally, run `mkdocs serve` in the `doc` directory,
+To build and test the documentation locally, run `mkdocs serve --config-file doc/mkdocs.yml` from the repository root directory,
which will compile the sources into HTML and start a local web server on `http://localhost:8000`.
You can then view the compiled manuals using your browser.

6 changes: 3 additions & 3 deletions doc/manuals/jx-workflow/jx.md
@@ -32,7 +32,7 @@ A workflow is encoded as a JSON object with the following keys:
| Key | Required | Description |
|-----|:--------:|-------------|
|[**rules**](#rules)| yes | Unordered array of rules comprising the workflow.<br> Each `<rule>` corresponds to a single job represented as a JSON object to be executed in the workflow.
-|[**define**](#defining-values) | no | Defines [expression substitutions](#jx-expressions) that can be used when defining rules, environments, and categories.|
+|[**define**](#computed-values) | no | Defines [expression substitutions](#computed-values) that can be used when defining rules, environments, and categories.|
|[**environment**](#environments) | no | Defines environment variables to be set when executing all rules.|
|[**categories**](#categories)| no | Rules are grouped into categories. Rules inside a category are run with the same [environment variables values](#environments), and the same resource requirements.|
|default_category | no | Name of the category used if none is specified for a rule. <br>If there is no corresponding category defined, default values will be filled in. If not provided, the default category is `"default"`.|
@@ -84,8 +84,8 @@ single command, but it replaces the key `command` with keys `workflow` and
|-----|:--------:|-------------|
| command <br> _or_ <br> workflow | yes | Either `command`, which is a single Unix program to run, or a `workflow` which names another workflow to be run as a sub-job.
| args | no | **Only used with workflow key.** Gives arguments as a JSON array to be passed to a sub-workflow.
-| inputs | no | An array of [file specifications](#Files) required by the command or sub-workflow.
-| outputs | no | An array of [file specifications](#Files) produced by the command or sub-workflow.
+| inputs | no | An array of [file specifications](#files) required by the command or sub-workflow.
+| outputs | no | An array of [file specifications](#files) produced by the command or sub-workflow.
| local_job | no | If `true` indicates that the job is best run locally by the workflow manager, rather than dispatched to a distributed system. This is a performance hint provided by the end user and not a functional requirement. Default is `false`.
| category | no | Specifies the name of a job category. The name given should correspond to the key of a category object in the global workflow object.
| resources | no | Specifies the specific [resource requirements](#resources) of this job.
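
For orientation, a minimal workflow using these keys might look like the sketch below. The filenames, category, and resource values are hypothetical, and `define` is omitted because its values are JX expressions rather than plain JSON; the dict is written in Python only so that it can be dumped as the JSON text a `.jx` file contains.

```python
import json

# A hypothetical single-rule workflow illustrating the keys described above.
workflow = {
    "environment": {"MODE": "fast"},          # set for every rule
    "categories": {
        "analysis": {"resources": {"cores": 1, "memory": 512}}
    },
    "rules": [
        {
            "command": "./analyze.sh data.txt > result.txt",
            "inputs": ["analyze.sh", "data.txt"],
            "outputs": ["result.txt"],
            "category": "analysis",
        }
    ],
}

print(json.dumps(workflow, indent=2))         # emits valid JX/JSON
```
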
1 change: 0 additions & 1 deletion doc/manuals/taskvine/code-examples

This file was deleted.

2 changes: 1 addition & 1 deletion doc/manuals/taskvine/example-blast.md
@@ -5,6 +5,6 @@ and performs multiple queries against it. It demonstrates use of remote
data, unpacking, temporary files, and immediate buffer data.

```
---8<-- "taskvine/code-examples/vine_example_blast.py"
+--8<-- "../../taskvine/src/examples/vine_example_blast.py"
```

6 changes: 3 additions & 3 deletions doc/manuals/taskvine/example-functional.md
@@ -5,13 +5,13 @@ These three examples show the use of higher order functions
Python functions to data via remote tasks:

```
---8<-- "taskvine/code-examples/vine_example_map.py"
+--8<-- "../../taskvine/src/examples/vine_example_map.py"
```

```
---8<-- "taskvine/code-examples/vine_example_pair.py"
+--8<-- "../../taskvine/src/examples/vine_example_pair.py"
```

```
---8<-- "taskvine/code-examples/vine_example_tree_reduce.py"
+--8<-- "../../taskvine/src/examples/vine_example_tree_reduce.py"
```
2 changes: 1 addition & 1 deletion doc/manuals/taskvine/example-gradient-descent.md
@@ -5,5 +5,5 @@ a function. Demonstrates the use of serverless computing,
and distributed python environments.

```
---8<-- "taskvine/code-examples/vine_example_gradient_descent.py"
+--8<-- "../../taskvine/src/examples/vine_example_gradient_descent.py"
```
2 changes: 1 addition & 1 deletion doc/manuals/taskvine/example-gutenberg.md
@@ -5,5 +5,5 @@ performs an all-to-all comparison of each pair using a Unix script.
Demonstrates use of external data, caching, and shared data.

```
---8<-- "taskvine/code-examples/vine_example_gutenberg.py"
+--8<-- "../../taskvine/src/examples/vine_example_gutenberg.py"
```
2 changes: 1 addition & 1 deletion doc/manuals/taskvine/example-mosaic.md
@@ -5,5 +5,5 @@ tool to produce a mosaic. Demonstrates use of remote data, unpacking,
caching, starch, and temporary files.

```
---8<-- "taskvine/code-examples/vine_example_mosaic.py"
+--8<-- "../../taskvine/src/examples/vine_example_mosaic.py"
```
2 changes: 1 addition & 1 deletion doc/manuals/taskvine/example-watch.md
@@ -5,5 +5,5 @@ that produces gradual output. It demonstrates how the `VINE_WATCH` flag
can be used to incrementally bring back the output of a running task.

```
---8<-- "taskvine/code-examples/vine_example_watch.py"
+--8<-- "../../taskvine/src/examples/vine_example_watch.py"
```
16 changes: 8 additions & 8 deletions doc/manuals/taskvine/index.md
@@ -277,7 +277,7 @@ tasks at once:
struct vine_file *x = vine_declare_untar(m, u);
```

-`declare_untar` is an example of a [MiniTask](#MiniTasks), which is explained further below.
+`declare_untar` is an example of a [MiniTask](#minitasks), which is explained further below.
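
A rough Python equivalent of the C fragment above, as a minimal sketch; the URL, command, and mount point are placeholders for illustration:

```python
import ndcctools.taskvine as vine

m = vine.Manager(9123)
u = m.declare_url("https://example.org/dataset.tar.gz")  # remote archive (hypothetical URL)
x = m.declare_untar(u)           # mini-task: unpack the archive at the worker, cache the result

t = vine.Task("ls -l dataset")
t.add_input(x, "dataset")        # expose the unpacked directory to the task as "dataset"
m.submit(t)
```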


### Declaring Tasks
@@ -692,7 +692,7 @@ For further options, please refer to the TaskVine factory [manual](../man_pages/

By default, the factory submits as many tasks that are waiting and running up
to a specified maximum. To run more than one task in a worker, please refer
-to the following section on describing [task resources](#task-resources) and [worker resources](#taskvine-factory-and-resources).
+to the following section on describing [task resources](#task-resources) and [worker resources](#worker-resources).
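
Concretely, giving each factory worker enough resources to hold several tasks might look like the sketch below; the factory attribute and option names are assumptions to check against the factory manual, and the project name is a placeholder:

```python
import ndcctools.taskvine as vine

factory = vine.Factory(batch_type="condor", manager_name="myproject")  # placeholder project name
factory.min_workers = 1
factory.max_workers = 10
factory.cores = 4          # per-worker resources
factory.memory = 8000      # MB
factory.disk = 16000       # MB

with factory:              # workers are submitted while this block is active
    ...                    # submit tasks to the manager and wait for them here
```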

We can also create a factory directly in python. Creating a factory object does not
immediately launch it, so this is a good time to configure the resources,
@@ -1675,7 +1675,7 @@ can be tailored as any other task:
print(f.result())
```

-Instead of tasks, the futures may also be executed using [function calls](serverless-computing) with the `future_funcall` method:
+Instead of tasks, the futures may also be executed using [function calls](#serverless-computing) with the `future_funcall` method:

=== "Python"
```python
@@ -1909,7 +1909,7 @@ Consider now that the task requires 1 core, 6 GB of memory, and 27 GB of disk:
!!! note
If you want TaskVine to exactly allocate the resources you have
specified, use the `proportional-resources` and `proportional-whole-tasks`
-parameters as shown [here](#specialized-and-experimental-settings). In
+parameters as shown [here](#tuning-specialized-execution-parameters). In
general, however, we have found that using proportions nicely adapts to the
underlying available resources, and leads to very few resource exhaustion
failures while still using worker resources efficiently.
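
For concreteness, the task from that example could be declared as in the sketch below (the command is a placeholder, memory and disk are in MB, and the `tune` calls assume the usual convention that setting a parameter to 0 disables it — the parameter names come from the note above):

```python
import ndcctools.taskvine as vine

m = vine.Manager(9123)
t = vine.Task("./analyze.sh")    # placeholder command

# The task from the example: 1 core, 6 GB of memory, 27 GB of disk.
t.set_cores(1)
t.set_memory(6000)
t.set_disk(27000)

# Ask for exactly these values rather than proportional allocations.
m.tune("proportional-resources", 0)
m.tune("proportional-whole-tasks", 0)

m.submit(t)
```
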
@@ -1921,7 +1921,7 @@ its number of cores. (This will likely change in the future.)
When you would like to run several tasks in a worker, but you are not sure
about the resources each task needs, TaskVine can automatically find values
of resources that maximize throughput, or minimize waste. This is discussed in
-the section [below](#grouping-tasks-with-similar-resources-needs).
+the section [below](#grouping-tasks-with-similar-resource-needs).

### Worker Resources

@@ -2408,7 +2408,7 @@ produces the following graphs:

![](images/plot-perf-montage.png)

-- [Performance Log File Format Details](log-file-formats#performance-log-format)
+- [Performance Log File Format Details](log-file-formats.md#performance-log-format)

### Transactions Log

@@ -2432,7 +2432,7 @@ to produce a visualization of how tasks are packed into workers like this:

![](images/plot-txn-workers.png)

-- [Transactions Log File Format Details](log-file-formats#transactions-log-format)
+- [Transactions Log File Format Details](log-file-formats.md#transactions-log-format)


Custom APPLICATION messages can be added to the log with the calls:
@@ -2695,7 +2695,7 @@ The `compute` call above may receive the following keyword arguments:

| Keyword | Description |
|------------ |---------|
-| environment | A TaskVine file that provides an [environment](#environments) to execute each task. |
+| environment | A TaskVine file that provides an [environment](#execution-contexts) to execute each task. |
| env\_vars | A dictionary of VAR=VALUE environment variables to set per task. A value should be either a string, or a function that accepts as arguments the manager and task, and that returns a string. |
| extra\_files | A dictionary of {taskvine.File: "remote_name"} of input files to attach to each task.|
| lazy\_transfer | Whether to bring each result back from the workers (False, default), or keep transient results at workers (True) |
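
A hedged sketch of passing these keywords follows. The keyword names come from the table above; the use of the DaskVine manager's `get` method as the Dask scheduler and the packaged environment file are assumptions to verify against the manual:

```python
import dask
import ndcctools.taskvine as vine

m = vine.DaskVine(9123)
env = m.declare_poncho("my-env.tar.gz")    # placeholder packaged environment

z = dask.delayed(sum)([1, 2, 3])
result = z.compute(
    scheduler=m.get,                       # assumed DaskVine scheduler entry point
    environment=env,                       # run each task inside the environment
    env_vars={"MODE": "fast"},             # per-task environment variables
    lazy_transfer=True,                    # keep intermediate results at the workers
)
print(result)
```
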
4 changes: 2 additions & 2 deletions doc/manuals/work_queue/index.md
@@ -714,7 +714,7 @@ For further options, please refer to the work queue factory [manual](../man_page

By default, the factory submits as many tasks that are waiting and running up
to a specified maximum. To run more than one task in a worker, please refer
-to the following section on describing [task resources](#task-resources) and [worker resources](#work-queue-factory-and-resources).
+to the following section on describing [task resources](#task-resources) and [worker resources](#worker-resources).
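
For reference, per-task resources in the Work Queue Python API are declared as in this minimal sketch; the command and values are illustrative, with memory and disk in MB:

```python
import work_queue as wq

t = wq.Task("./simulate.sh > output.txt")  # placeholder command
t.specify_cores(1)
t.specify_memory(2000)    # MB
t.specify_disk(4000)      # MB
```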


#### Using the factory with python
@@ -860,7 +860,7 @@ its number of cores. (This will likely change in the future.)
When you would like to run several tasks in a worker, but you are not sure
about the resources each task needs, Work Queue can automatically find values
of resources that maximize throughput, or minimize waste. This is discussed in
-the section [below](#grouping-tasks-with-similar-resources-needs).
+the section [below](#grouping-tasks-with-similar-resource-needs).

### Worker Resources

4 changes: 3 additions & 1 deletion doc/mkdocs.yml
@@ -42,7 +42,7 @@ markdown_extensions:
- pymdownx.tabbed:
- pymdownx.superfences:
- pymdownx.snippets:
-base_path: manuals
+base_path: ['doc/manuals'] # make snippets relative to manual root
+restrict_base_path: false # allow snippets to include examples outside of that tree
+check_paths: true # fail if snippet include doesn't work

validation:
nav:
