Commit
Update docs site for version 2.23.0.dev1 (#226)
WorkerPants authored Jul 8, 2024
1 parent b08f2e0 commit c1758bc
Showing 38 changed files with 2,066 additions and 393 deletions.
15 changes: 13 additions & 2 deletions docs/docs/docker/index.mdx
@@ -80,12 +80,23 @@ The context is assembled as follows:

### Dependency inference support

When you `COPY` PEX binaries into your image, the dependency on the `pex_binary` target will be inferred, so you don't have to add that explicitly to the list of `dependencies` on your `docker_image` target.
When you `COPY` PEX binaries into your image, the dependency on the `pex_binary` target will be inferred, so you don't have to add that explicitly to the list of `dependencies` on your `docker_image` target. For example, the `pex_binary` target `src/python/helloworld:bin` has the default `output_path` of `src.python.helloworld/bin.pex`. So, Pants can infer a dependency based on the line `COPY src.python.helloworld/bin.pex /bin/helloworld`. This inference is also done for targets referenced by their target address in build arguments, for example:

For example, the `pex_binary` target `src/python/helloworld:bin` has the default `output_path` of `src.python.helloworld/bin.pex`. So, Pants can infer a dependency based on the line `COPY src.python.helloworld/bin.pex /bin/helloworld`.
```dockerfile
FROM python:3.9
ARG PEX_BIN=src:my_target
COPY $PEX_BIN /app/my_app
```
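
For the build argument to resolve, a `pex_binary` target would exist at the referenced address. A minimal sketch, assuming a hypothetical entry point:

```python title="src/BUILD"
# Hypothetical pex_binary target matching the `src:my_target` address used in the ARG above.
pex_binary(
    name="my_target",
    entry_point="app.py",
)
```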

Inference for Go binaries and artifacts of other packaged targets is similar.

Dependencies on `file`/`files` targets are also inferred from the files you `COPY`, for example:

```dockerfile
FROM python:3.9
COPY src/file.txt /app/
```
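
The `file`/`files` target owning the copied file might look like this (an illustrative sketch; the target name is a placeholder):

```python title="src/BUILD"
# Hypothetical target owning the file copied by `COPY src/file.txt /app/` above.
files(
    name="assets",
    sources=["file.txt"],
)
```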

Inference is also supported for `docker_image` targets specified in build arguments, for example:

```dockerfile
12 changes: 6 additions & 6 deletions docs/docs/docker/tagging-docker-images.mdx
@@ -38,7 +38,7 @@ Options for `registries` in `pants.toml`:
registry.

- `repository` - Format the repository part of the image name for this image. See [Setting a
repository name](doc:tagging-docker-images#setting-a-repository-name) for details of this option.
repository name](#setting-a-repository-name) for details of this option.

- `skip_push` - Do not push images to this registry during `pants publish`.

@@ -143,16 +143,16 @@ by allowing you to omit the `repository` field on each `docker_image`. But you c
this field on specific `docker_image` targets, of course. In fact, you can use these placeholders in
the `repository` field as well, if you find that helpful.

See [String interpolation using placeholder values](./tagging-docker-images.mdx#string-interpolation-using-placeholder-values) for more information.
See [String interpolation using placeholder values](#string-interpolation-using-placeholder-values) for more information.

## Tagging images

When Docker builds images, it can tag them with a set of tags. Pants will apply the tags listed in
the `image_tags` field of `docker_image`, and any additional tags if defined from the registry
configuration (see [Configuring registries](./tagging-docker-images.mdx#configuring-registries)).
configuration (see [Configuring registries](#configuring-registries)).

(Note that the field is named `image_tags` and not just `tags`, because Pants has [its own tags
concept](doc:reference-target#tags), which is unrelated.)
concept](../../reference/targets/target#tags), which is unrelated.)

```python title="src/example/BUILD"
docker_image(
@@ -193,7 +193,7 @@ This way you can specify a version just once, on the base image, and the derived
automatically acquire the same version.

You may also use any Docker build arguments (when configured as described in [Docker build
arguments](doc:docker#build-arguments)) for interpolation into the `image_tags` in the corresponding
arguments](../docker#build-arguments)) for interpolation into the `image_tags` in the corresponding
`docker_image`:

```python title="src/example/BUILD"
@@ -302,7 +302,7 @@ The interpolation context (the available placeholder values) depends on which fi
- `{build_args.ARG_NAME}`: Each defined Docker build arg is available for interpolation under the `build_args.` prefix.
- `{pants.hash}`: This is a unique hash value calculated from all input sources and the `Dockerfile`. It is effectively a hash of the Docker build context. See note below regarding its stability guarantee.

See [Setting a repository name](./tagging-docker-images.mdx#setting-a-repository-name) for placeholders specific to the `repository` field.
See [Setting a repository name](#setting-a-repository-name) for placeholders specific to the `repository` field.
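
As an illustrative sketch (assuming `RELEASE_VERSION` is a Docker build arg you have configured), these placeholders can be combined in `image_tags`:

```python title="src/example/BUILD"
docker_image(
    name="demo",
    # Hypothetical tags: one from a configured build arg, one from the content hash of the build context.
    image_tags=["{build_args.RELEASE_VERSION}", "{pants.hash}"],
)
```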

:::note The `{pants.hash}` stability guarantee
The calculated hash value _may_ change between stable versions of Pants for the otherwise same input sources.
2 changes: 1 addition & 1 deletion docs/docs/jvm/java-and-scala.mdx
@@ -68,7 +68,7 @@ To override the default on a particular target, you can use the [`jdk=` field](.

#### Scala version

The Scala version to use is configured on a resolve-by-resolve basis (see the "Third-party dependencies" section below) using the [`[scala].version_for_resolve` option](../../reference/subsystems/scala.mdx#section-version_for_resolve). The default Scala version for your repository will thus be whichever Scala version is configured for the "default" resolve, which is configured by the [`[jvm].default_resolve` option](reference-jvm#section-default-resolve).
The Scala version to use is configured on a resolve-by-resolve basis (see the "Third-party dependencies" section below) using the [`[scala].version_for_resolve` option](../../reference/subsystems/scala.mdx#section-version_for_resolve). The default Scala version for your repository will thus be whichever Scala version is configured for the "default" resolve, which is configured by the [`[jvm].default_resolve` option](../../reference/subsystems/jvm#default_resolve).

To use multiple Scala versions in a repository, you would define multiple resolves, and then adjust the [`resolve` field](../../reference/targets/scalatest_test.mdx#resolve) of any targets which should be used with the non-`default_resolve` resolve.

2 changes: 1 addition & 1 deletion docs/docs/jvm/kotlin.mdx
@@ -74,7 +74,7 @@ To override the default on a particular target, you can use the [`jdk=` field](.

#### Kotlin version

The Kotlin version to use is configured on a resolve-by-resolve basis (see the "Third-party dependencies" section below) using the [`[kotlin].version_for_resolve` option](../../reference/subsystems/kotlin.mdx#section-version_for_resolve). The default Kotlin version for your repository will thus be whichever Kotlin version is configured for the "default" resolve, which is configured by the [`[jvm].default_resolve` option](reference-jvm#section-default-resolve).
The Kotlin version to use is configured on a resolve-by-resolve basis (see the "Third-party dependencies" section below) using the [`[kotlin].version_for_resolve` option](../../reference/subsystems/kotlin.mdx#section-version_for_resolve). The default Kotlin version for your repository will thus be whichever Kotlin version is configured for the "default" resolve, which is configured by the [`[jvm].default_resolve` option](../../reference/subsystems/jvm#default_resolve).

Each resolve must contain the following jars for the Kotlin runtime with the version matching the version specified for the resolve in the `[kotlin].version_for_resolve` option:

32 changes: 11 additions & 21 deletions docs/docs/python/goals/repl.mdx
@@ -20,35 +20,25 @@ To use IPython, run `pants repl --shell=ipython`. To permanently use IPython, ad
shell = "ipython"
```

You can change IPython's version with `[ipython].version`. If you change it, Pants's default lockfile for IPython will not work. Either set the `lockfile` option to a custom path or `"<none>"` to opt-out. See [Third-party dependencies](../overview/third-party-dependencies.mdx#tool-lockfiles).
You can change IPython's version [like any other tool, using `install_from_resolve`](../overview/lockfiles#lockfiles-for-tools).

```toml title="pants.toml"
[ipython]
version = "ipython>=8.0.0"
lockfile = "3rdparty/python/ipython_lock.txt"
```

If you set the `version` lower than IPython 7, then you must set `[ipython].ignore_cwd = false` to avoid Pants setting an option that did not exist in earlier IPython releases.
If you use a version lower than IPython 7, then you must set `[ipython].ignore_cwd = false` to avoid Pants setting an option that did not exist in earlier IPython releases.

:::note Python 2 support
Pants uses IPython 7 by default, which does not work with Python 2. You can override `version` to use IPython 5. As mentioned above, you must set `ignore_cwd = false`.
Pants uses IPython 7 by default, which does not work with Python 2. You can use `install_from_resolve` to install IPython 5:

```toml
[ipython]
version = "ipython<6"
lockfile = "3rdparty/python/ipython_lock.txt"
ignore_cwd = false
```
```toml tab={"label":"pants.toml"}
[python.resolves]
...
ipython = "3rdparty/python/ipython.lock"

You can even use IPython 7 for Python 3 code, and IPython 5 for Python 2 code:

```toml
[ipython]
version = "ipython==7.16.1 ; python_version >= '3.6'"
extra_requirements.add = ["ipython<6 ; python_version == '2.7'"]
lockfile = "3rdparty/python/ipython_lock.txt"
install_from_resolve = "ipython"
ignore_cwd = false
```
```toml tab={"label": "BUILD"}
python_requirement(name="ipython", requirements=["ipython<6"], resolve="ipython")
```

:::

2 changes: 1 addition & 1 deletion docs/docs/python/integrations/protobuf-and-grpc.mdx
@@ -152,7 +152,7 @@ from project.example.f_pb2_grcp import GreeterServicer

You do not need to run this goal for codegen to work when using Pants; `export-codegen` is only for external consumption outside of Pants.

Note: You can also export the generated sources using the [`--export-py-generated-sources` option](reference/goals/export#py_generated_sources) to the [`pants export` goal](reference/goals/export). This is useful when you want to provide an IDE with third-party dependencies and generated sources in a single place.
Note: You can also export the generated sources using the [`--export-py-generated-sources` option](../../../reference/goals/export#py_generated_sources) to the [`pants export` goal](../../../reference/goals/export). This is useful when you want to provide an IDE with third-party dependencies and generated sources in a single place.
:::

:::caution You likely need to add empty `__init__.py` files
2 changes: 1 addition & 1 deletion docs/docs/python/integrations/thrift.mdx
@@ -114,7 +114,7 @@ For example, compare `import user.ttypes` to `import codegen.models.user.ttypes`

You do not need to run this goal for codegen to work when using Pants; `export-codegen` is only for external consumption outside of Pants.

Note: You can also export the generated sources using the [`--export-py-generated-sources` option](reference/goals/export#py_generated_sources) to the [`pants export` goal](reference/goals/export). This is useful when you want to provide an IDE with third-party dependencies and generated sources in a single place.
Note: You can also export the generated sources using the [`--export-py-generated-sources` option](../../../reference/goals/export#py_generated_sources) to the [`pants export` goal](../../../reference/goals/export). This is useful when you want to provide an IDE with third-party dependencies and generated sources in a single place.
:::

## Multiple resolves
2 changes: 2 additions & 0 deletions docs/docs/using-pants/environments.mdx
@@ -257,6 +257,8 @@ Thus, Pants puts that burden on you, the Pants user, to ensure a process output

If a process isn't reproducible, re-running a build from the same source code could fail unexpectedly, or give different output to an earlier build.

You should use the `workspace_invalidation_sources` field available on the `adhoc_tool` and `shell_command` target types to inform Pants of what files should cause re-execution of the target's process if they change.
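
A minimal sketch (the target name, script, and file list are hypothetical placeholders):

```python title="BUILD"
shell_command(
    name="build_frontend",
    command="./build.sh",
    # Hypothetical paths: list whichever workspace files should trigger re-execution when they change.
    workspace_invalidation_sources=["build.sh", "webpack.config.js"],
)
```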

:::

The special environment name `__local_workspace__` can be used to select a matching `experimental_workspace_environment` based on its `compatible_platforms` attribute.
38 changes: 38 additions & 0 deletions docs/docs/writing-plugins/macros.mdx
@@ -107,3 +107,41 @@ custom_python_sources(
has_type_hints=False,
)
```

## Documenting your macros

If you document your macros with doc-strings, Pants will pick these up and present them in the online help text. Global constants can also be documented using `Doc` wrapped in `Annotated`.

```python title="pants-plugins/macros.py"
OUR_GLOBAL_CONSTANT: Annotated[
int,
Doc(
"""This is our magic number.
It is useful when you need the answer to the meaning of life, the universe and everything.
"""
)
] = 42


def custom_python_sources(has_type_hints: bool = True, **kwargs):
"""Custom target for Python sources.
This target adds the `type_checked` tag for targets for which `has_type_hints` is true.
"""
# ...
```

For CLI help, this information is presented when calling pants with:
```
pants OUR_GLOBAL_CONSTANT --help
pants custom_python_sources --help
```

To list all available symbols, along with the first sentence of the docs for each:
```
pants symbols --help
# To also include targets (to list only targets, there is "pants targets --help"):
pants symbols --help-advanced
```
2 changes: 1 addition & 1 deletion docs/reference/build-file-symbols/PANTS_VERSION.mdx
@@ -5,6 +5,6 @@ description: |

import BuildFileSymbol from "@site/src/components/reference/BuildFileSymbol";

<BuildFileSymbol name={`PANTS_VERSION`} signature={``}>
<BuildFileSymbol name={`PANTS_VERSION`} signature={`: Version`}>

</BuildFileSymbol>
3 changes: 1 addition & 2 deletions docs/reference/build-file-symbols/node_build_script.mdx
@@ -10,7 +10,6 @@ import BuildFileSymbol from "@site/src/components/reference/BuildFileSymbol";

A build script, mapped from the `scripts` section of a package.json file.

Either the `output_directories` or the `output_files` argument has to be set to capture the
output artifacts of the build.
Either the `output_directories` or the `output_files` argument has to be set to capture the output artifacts of the build.

</BuildFileSymbol>
5 changes: 2 additions & 3 deletions docs/reference/build-file-symbols/node_test_script.mdx
@@ -1,15 +1,14 @@
---
title: node_test_script
description: |
The test script for this package, mapped from the `scripts` section of a package.json
The test script for this package, mapped from the `scripts` section of a package.json file. The pointed to script should accept a variadic number of ([ARG]...) path arguments.
---

import BuildFileSymbol from "@site/src/components/reference/BuildFileSymbol";

<BuildFileSymbol name={`node_test_script`} signature={`(entry_point: 'str' = 'test', report_args: 'Iterable[str]' = (), report_output_files: 'Iterable[str]' = (), report_output_directories: 'Iterable[str]' = (), coverage_args: 'Iterable[str]' = (), coverage_output_files: 'Iterable[str]' = (), coverage_output_directories: 'Iterable[str]' = (), coverage_entry_point: 'str | None' = None) -> 'NodeTestScript'`}>

The test script for this package, mapped from the `scripts` section of a package.json
file. The pointed to script should accept a variadic number of ([ARG]...) path arguments.
The test script for this package, mapped from the `scripts` section of a package.json file. The pointed to script should accept a variadic number of ([ARG]...) path arguments.

This entry point is the &#x22;test&#x22; script, by default.

3 changes: 1 addition & 2 deletions docs/reference/build-file-symbols/parametrize.mdx
@@ -10,7 +10,6 @@ import BuildFileSymbol from "@site/src/components/reference/BuildFileSymbol";

A builtin function/dataclass that can be used to parametrize Targets.

Parametrization is applied between TargetAdaptor construction and Target instantiation, which
means that individual Field instances need not be aware of it.
Parametrization is applied between TargetAdaptor construction and Target instantiation, which means that individual Field instances need not be aware of it.

</BuildFileSymbol>
17 changes: 5 additions & 12 deletions docs/reference/build-file-symbols/per_platform.mdx
@@ -10,17 +10,15 @@ import BuildFileSymbol from "@site/src/components/reference/BuildFileSymbol";

An object containing differing homogeneous platform-dependent values.

The values should be evaluated for the execution environment, and not the host environment
(I.e. it should be evaluated in a `rule` which requests `Platform`).
The values should be evaluated for the execution environment, and not the host environment (I.e. it should be evaluated in a `rule` which requests `Platform`).

Expected usage is roughly:

```python
class MyFieldType(...):
```python class MyFieldType(...):
value = str | per_platform[str]

@classmethod
def compute_value( # type: ignore[override]
def compute_value( # type: ignore[override]
cls,
raw_value: Optional[Union[str, per_platform[str]]],
address: Address,
@@ -33,8 +31,7 @@

...

@rule
async def my_rule(..., platform: Platform) -> ...:
@rule async def my_rule(..., platform: Platform) -> ...:
field_value = target[MyFieldType].value

if isinstance(field_value, per_platform):
@@ -43,10 +40,6 @@
...
```

NOTE: Support for this object should be heavily weighed, as it would be inappropriate to use in
certain contexts (such as the `source` field in a `foo_source` target, where the intent is to
support differing source files based on platform. The result would be that dependency inference
(and therefore the dependencies field) wouldn&#x27;t be knowable on the host, which is not something
the engine can support yet).
NOTE: Support for this object should be heavily weighed, as it would be inappropriate to use in certain contexts (such as the `source` field in a `foo_source` target, where the intent is to support differing source files based on platform. The result would be that dependency inference (and therefore the dependencies field) wouldn&#x27;t be knowable on the host, which is not something the engine can support yet).

</BuildFileSymbol>
23 changes: 9 additions & 14 deletions docs/reference/build-file-symbols/stevedore_namespace.mdx
@@ -10,20 +10,15 @@ import BuildFileSymbol from "@site/src/components/reference/BuildFileSymbol";

Tag a namespace in entry_points as a stevedore namespace.

This is required for the entry_point to be visible to dep inference
based on the `stevedore_namespaces` field.
This is required for the entry_point to be visible to dep inference based on the `stevedore_namespaces` field.

For example:

```python
python_distribution(
...
entry_points={
stevedore_namespace("a.b.c"): {
"plugin_name": "some.entry:point",
},
},
)
```
For example: `&#x60;`python python_distribution(
...
entry_points=&#123;
stevedore_namespace(&#x22;a.b.c&#x22;): &#123;
&#x22;plugin_name&#x22;: &#x22;some.entry:point&#x22;,
&#125;,
&#125;,
) `&#x60;`

</BuildFileSymbol>
14 changes: 14 additions & 0 deletions docs/reference/global-options.mdx
@@ -438,6 +438,20 @@ This can be useful, for example, if you want to always use Docker locally, but d

</Option>

### `enable_target_origin_sources_blocks`

<Option
cli_repr={`--[no-]enable-target-origin-sources-blocks`}
env_repr='PANTS_ENABLE_TARGET_ORIGIN_SOURCES_BLOCKS'
toml_repr={`[GLOBAL]
enable_target_origin_sources_blocks = <bool>`}
default_repr={`False`}
>

Enable fine grained target analysis based on line numbers.

</Option>

### `engine_visualize_to`

<Option