Add generalized interface generation architecture proposal #310
base: gh-pages
Conversation
Signed-off-by: Michel Hidalgo <[email protected]>
@dirk-thomas if you happen to have a bit of free time (if that even exists 😅), would you take a look and share your thoughts? I'd very much appreciate it.
The rationale is that if we assume language bindings are strongly tied to the build system (this seems to be the case for most languages; Cython, Node.js, and cgo all use specialized build tools AFAIK), it makes sense for generators to be coupled with a build system.
I second @koonpeng, I think that introducing a common build specification (CBS) makes everything more complex with little extra value. IMO having a non-cmake plugin mechanism would be a huge improvement, but I think it's ok if some plugins continue generating cmake (or files for whatever build system is convenient for that language).
All positional arguments are paths to interface definition files.
If no output directory is specified, the current working directory is used.
If no type representation / type support generators are specified, all available generators available are used.
Suggested fix: "If no type representation / type support generators are specified, all available generators are used."
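The documented default (use every available generator when none are named) could be sketched as follows. This is a hypothetical illustration only; the registry contents and function name are made up:

```python
# Hypothetical sketch of the documented default: when no type representation /
# type support generators are requested, every registered generator is used.
AVAILABLE_GENERATORS = {"c": "<c plugin>", "cpp": "<cpp plugin>", "py": "<py plugin>"}

def select_generators(requested):
    # Fall back to all registered generators when none are named.
    names = requested or sorted(AVAILABLE_GENERATORS)
    return [AVAILABLE_GENERATORS[n] for n in names]

print(len(select_generators([])), len(select_generators(["c"])))
```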
@koonpeng It sounds like that assumption means one language one build system, but I think one of the goals is to enable generation by any build system, like generating C++ bindings with either CMake or with Bazel.
Would you mind sharing an example? I'd like to understand where the difficulties show up.
That's precisely the problem this proposal tries to address. It is definitely simpler for generators to be coupled with a build system, but then you're forced to port or extend them [1]. As of today, we have three (3) type representation generators and eight (8) type support generators in the core alone.

[1] As I see it, if a given generator allows extension to support other build systems, then some form of intermediate representation must necessarily exist. If we can standardize that representation, we get a massive reduction in porting effort.
I disagree. Building interfaces outside the package build system requires additional logic to ensure proper incremental builds, to use generated artifacts (e.g. shared libraries) within the package build, and to land them along with their metadata (e.g. CMake exported targets) in the package install space.
I think I have a few more comments, but I'm not sure I understand the flow correctly. I think the design doc could really use a diagram of how all the data flows. I'm not sure I understand when each tool is invoked either.
Here's what I think the design doc is describing
+----+ +-----+ +-------+ +----+ +----+ +-----+ +-------+ +----+ +-------+ +-------+ +-----------+ +----+ +------------+
|.msg| |.srv | |.action| |.idl| |.msg| |.srv | |.action| |.idl| |headers| |sources| |directories| |.cps| |build system|
+---++ +--+--+ ++------+ ++---+ +---++ +--+--+ ++------+ ++---+ +--+----+ ++------+ +-----+-----+ +--+-+ +----+-------+
| | | | | | | | | | | | |
+-------------------------------------------------------+ +-------------------------------------------------------------+ | | | | |
| | | | | | | | | | | | +-------------------------------------------------+
| v v v | | | v v v | | | | | | | | |
| ++-----+-----+-+ | | | ++-----+-----+-+ | | | +--------+---+-----------+----------+ | |
| |rosidl_adapter| | | | |rosidl_adapter| | | | | | |
| +----------+---+ | rosidl_generate | | +----------+---+ | rosidl_build_configure | | | ament_meta_build | |
| | | rosidle_typesupport_generate| | | | rosidle_typesupport_build_configure| | | | |
| v | | | v | | | v | |
| +-+--+ | | | +-+--+ | | | +------+---------+ | |
| |.idl| | | | |.idl| | | | |select generator+<--------------------+ |
| +--+-+ | | | +--+-+ | | | +-+--+--------+-++ |
| | | | | | | | | | | | | |
| +--------->+ | | +--------->+ | | | | | | |
| | | | | | | +---+ | | | |
| +-------+---+---+-------+ | | +-------+---+---+-------+ | | | | | | |
| | | | | | | | | | | | | | | | +---------+ |
| v v v v | | v v v v | | | | | | |
| +-+--+ +-+--+ +-+-+ +--+-+ | | +-+--+ +-+--+ +-+-+ +--+-+ | | v v v v |
| |gen1| |gen2| |...| |genN| | | |gen1| |gen2| |...| |genN| | | +--+--+ +-+---+ +--+-------+ +-+-+ |
| +-+--+ +-+--+ ++--+ +-+--+ | | +-+--+ +-+--+ ++--+ +-+--+ | | |cmake| |bazel| |setuptools| |...| |
| | | | | | | | | | | | | +--+--+ +--+--+ +----+-----+ +-+-+ |
| | v v | | | | v v | | | | | | | |
| +-------+---+--+-------+ | | +-------+---+--+-------+ | | | | | | |
| | | | | | | | | | | |
+-------------------------------------------------------+ +-------------------------------------------------------------+ +-------------------------------------------------+
| | | | | |
+---------+---+----------+ v v v v v
| | | +--+-+ +-+-+ +-+-+ +-+-+ +-+-+
v v v |.cps| |???| |???| |???| |???|
+---+---+ +--+----+ +------+----+ +----+ +---+ +---+ +---+ +---+
|headers| |sources| |directories|
+-------+ +-------+ +-----------+
It seems like the `.cps` generation is separated from the code generation. When is each called? In a CMake project, when are the three tools called? Is the `.cps` created during CMake configure time, the code generated at build time, and `ament_meta_build` run at build time? What does `ament_meta_build` produce? Who uses its output?
### A. Common Build Specification
This specification is based on the [Common Package Specification](https://mwoehlke.github.io/cps/) format, a simple and relatively standard format that describes packages for downstream consumption in a build system and build tool agnostic way.
The rationale is that node-gyp, for example, contains many platform-specific and other conditional configurations; trying to correctly translate all the conditionals into a CPS does not look like a simple job. It looks like CPS limits itself to describing only a subset of configurations:

> [...] the CPS file is not intended to cover all possible configurations of a package; rather, it is meant to be generated by the build system and to describe the artifacts of one or more extant configurations for a single architecture.
@koonpeng would it be feasible to generate multiple CPS files that can be selected from based on those conditions, or are there too many conditionals?
> would it be feasible to generate multiple CPS files that can be selected from based on those conditions

That's the path forward with this. There will be conditionals and that's fine; it's just that you cannot use the build system to resolve (most of) those conditionals.
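A minimal sketch of that idea, with invented file names and condition keys (none of this is in the proposal): one specification is pre-generated per resolved configuration, and the conditionals are resolved at specification time rather than by the build system.

```python
# Pre-generate one build specification per configuration, then select among
# them using conditions resolved up front. File names and keys are made up.
CANDIDATE_SPECS = {
    ("linux", "python3"): "build.linux.python3.cps",
    ("win32", "python3"): "build.win32.python3.cps",
}

def select_spec(platform, interpreter):
    return CANDIDATE_SPECS[(platform, interpreter)]

print(select_spec("linux", "python3"))
```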
ament_meta_build ((-o|--output-file) PATH)? ((-b|--build-system) IDENTIFIER)? PATH?
```

Its only positional argument is a path to a common build specification file.
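The grammar above maps naturally onto an argument parser. A sketch (not the actual implementation) using Python's `argparse`:

```python
import argparse

# Sketch of the ament_meta_build command line shown above:
#   ament_meta_build ((-o|--output-file) PATH)? ((-b|--build-system) IDENTIFIER)? PATH?
parser = argparse.ArgumentParser(prog="ament_meta_build")
parser.add_argument("-o", "--output-file", metavar="PATH", default=None)
parser.add_argument("-b", "--build-system", metavar="IDENTIFIER", default=None)
parser.add_argument("spec_path", metavar="PATH", nargs="?", default=None,
                    help="path to a common build specification file")

args = parser.parse_args(["-b", "cmake", "build.spec"])
print(args.build_system, args.spec_path)
```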
Assuming cps files have relative paths, is there another assumed argument of the working directory from which to resolve those paths?
Good point. I wrote this under the assumption that the build system or build system generator this tool generates code for already has a default path resolution logic (like CMake with its binary and source trees). But perhaps that's not enough and this tool could be given that information. WDYT?
That is, the CMake output plugin to `ament_meta_build` would assume relative paths in `Sources` come from `CMAKE_CURRENT_SOURCE_DIR`, things from `Artifacts` are relative to `CMAKE_CURRENT_BINARY_DIR`, and installed things from `Distribution` are relative to `CMAKE_INSTALL_PREFIX`?
Only one thing to correct:

> That is, the CMake output plugin to `ament_meta_build` would assume relative paths in `Sources` come from `CMAKE_CURRENT_SOURCE_DIR` or `CMAKE_CURRENT_BINARY_DIR`, things from `Artifacts` are relative to `CMAKE_CURRENT_BINARY_DIR`, and installed things from `Distribution` are relative to `CMAKE_INSTALL_PREFIX`.

This is because a component's artifacts may be another component's sources. The lack of a distinction between source and build trees is because not all build systems support this notion. Build specifications must make sure there's no ambiguity upon source lookup.
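That lookup rule could be sketched like this. A hedged illustration only (function and file names are made up), assuming the source tree is searched before the binary tree:

```python
from pathlib import Path
import tempfile

def resolve_source(relpath, source_dir, binary_dir):
    # Check the source tree first, then fall back to the binary tree, since
    # a component's artifacts may be another component's sources.
    for root in (Path(source_dir), Path(binary_dir)):
        candidate = root / relpath
        if candidate.exists():
            return candidate
    raise FileNotFoundError(f"{relpath} not found in either tree")

with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as bld:
    # Simulate a generated artifact that lives only in the binary tree.
    (Path(bld) / "foo__struct.h").write_text("// generated\n")
    found = resolve_source("foo__struct.h", src, bld)
    print(found.name)
```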
I did some research on the build systems for some language bindings.
I'm sure it can work even if we don't fully follow the logic of each specialized tool, but then we won't be doing 1:1 builds.
We can still have multiple build systems for each language by having multiple plugins. I'm not sure if there is any worth in the flexibility to generate files for any language bindings with any build system. We rarely want to build nodejs bindings with maven, for example.
I'm inclined to say that there are too many conditions. From what I see, it is not just platform-specific conditions; there are often conditions like the version of the interpreter, the existence of certain libraries, the location of libraries, libraries specified by name or relative path or absolute path, etc. Given all these conditionals, and the footnote on CPS that its intended use is to be generated by a build system for a particular environment, the best way would be for
I'm not sure what you mean.
By an extension, do you mean for example
But you can still call the tool from bazel (whether it generates cmake behind the scenes or not), right?
@sloretz Agreed.
@sloretz Build specifications use source code generators. In a CMake project, one would invoke them like this.

In a CMakeLists.txt:

```cmake
execute_process(COMMAND rosidl_*_build_configure OUTPUT_FILE build.spec)
execute_process(COMMAND ament_meta_build -b cmake INPUT_FILE build.spec OUTPUT_FILE build.cmake)
include(build.cmake)
```

In build.cmake:

```cmake
add_custom_command(
  OUTPUT ...
  COMMAND rosidl_*_generate ...
)
```
@koonpeng Thank you!
It has several conditionals, but they seem rather simple to resolve. The format itself is pretty simple too: the way it lists compiler and linker flags is almost a 1-to-1 mapping to the generic build spec I'm proposing.
Those arguments seem specific to
Hmm, for Python I'd think the build specification would be translated to a
This one's interesting. In this case, the build specification would only state that a build system has to run the go compiler on the set of generated files. Nothing more. Just to be super clear, the build spec is not a substitute for build systems. It's a blueprint to generate what each build system needs (CMake code, Starlark code, Groovy code,
@koonpeng That's an option, if it turns out we cannot afford a standard build specification. I'd really try to avoid per-generator templates (and the combinatorial explosion that comes along).
I think we're mixing things up a bit. CPS is a format that describes package installs. Here I'm proposing a variation of the CPS format to describe package builds. Build configuration generates the latter, not CPS.
@ivanpauno I mean that if we have build system-specific plugins for
@ivanpauno some build system A can delegate to another build system B, sure. But (a) you need to properly embed B in A (ensuring rebuilds keep working; pushing some of A's output into B, especially if a call to B needs to build on top of a previous one; pulling B's output into A such that it can be used during the build and on install, etc.), (b) you force a dependency on B, and (c) there's a performance penalty in doing (a). For Bazel, properly embedding CMake also means staying within the build sandbox.
What if I run

@koonpeng Yours is a good example. Correct build spec granularity is key. While you could specify a shared library build for a CPython extension, I'd rather treat it as a component type of its own. That way, we push build details to each
@hidmic I think I may have misunderstood the goal of build system plugins. I thought it was to support generating build system files for any language with CBS, but is it correct that the goal is actually just to support many code generators for the languages they are built for? A build system plugin may choose to support multiple languages, but most commonly they would only support one (node-gyp for nodejs, distutils for python). You still get the advantage of being able to generate build system files for multiple code generator plugins of one language; e.g. one Python code generator may require C++14, but a new fancy one may require C++20.
@ivanpauno An example of the problem I faced is that
I think we're close. Clearly, the design document needs better, more thorough explanations (and examples, many examples). As I see it, it is up to each build system adapter plugin whether and how it supports a given component type. For example:
There's no restriction here, only a path for proper integration. Some plugins will be better at it than others.

[1] I have not fleshed out the schema yet, but it is my intention to allow source code generators to force a build system via arbitrary command support in CBS. It completely defeats the purpose of
I will say that this design does not fully address that problem.
Could it make sense to use something like a GNU Guix Package as a generalized interface, using Lisp s-expressions rather than JSON to describe the package?
@peterpolidoro Interesting, I didn't know about Guix. I would have to look into it in more depth. I will say that revamping ROS 2's package management system is

[1] Which makes me a bit wary about
Take a good look at Guix before running away screaming from the parentheses.
@hidmic how about providing the plugins a function to resolve a message/package's required build flags? This function could use the CPS files or any other method under the hood. As it is now, only cmake knows the build flags, so the only way for any other build system to build rosidl-related targets is to delegate to cmake, and that wouldn't make this tool very helpful.
Signed-off-by: Michel Hidalgo <[email protected]>
I've addressed some comments and questions by clarifying the proposal. It's still lacking a nice diagram and examples. I'll add those next.
@koonpeng at the package-level, that information can be propagated by the build system. I understand your use case, but as I said in ros2/rosidl#560 (comment) what you describe is not a generator if we go by its current definition (see architecture review in this proposal), and I intend to scope this work to that definition. Perhaps you can get ahead of me and kickstart a design to better support multi build-system ROS workspaces? E.g. by exporting package information in a tool-agnostic format like CPS.
@peterpolidoro It is quite cool. But I think it doesn't solve the problem we're trying to address here. This generalization is meant to support users that are already committed to (or constrained by) some build system and now want to use ROS 2 interfaces. Doing that right now means porting a lot of CMake logic, or just letting CMake do the job (which is not nearly as easy nor convenient as it sounds). Guix (and please correct me if I'm wrong) looks like a very interesting alternative to the typical ROS workspace-based workflow. And yes, it could be used as a vehicle to export package information across build systems. But that's a solution for a different problem. That is not to say that it wouldn't be cool if you were willing to propose a design to build ROS packages using Guix. Not sure how the @ros2/team feels about this though.
Signed-off-by: Michel Hidalgo <[email protected]>
FYI I added pipeline diagrams in 78cd763. Would a data-flow diagram that shows how the current architecture can be cast into the proposed one help?
Nits:
- Use one sentence per line
- For posterity, use permalinks for GitHub references
From a structural point-of-view, the `rosidl` interface generation pipeline is comprised of:
- **Generator packages**. These packages provide tooling to generate language-specific, in-memory representations of ROS interfaces. Common runtime (i.e. interface agnostic) functionality is conventionally but not necessarily split into a separate package (e.g. [`rosidl_runtime_c`](https://github.com/ros2/rosidl/tree/master/rosidl_runtime_c), [`rosidl_runtime_cpp`](https://github.com/ros2/rosidl/tree/master/rosidl_runtime_cpp)). It is not unusual for a generator to use another generator's output via language-specific binding machinery (e.g. [`rosidl_generator_py`](https://github.com/ros2/rosidl_python/blob/master/rosidl_generator_py/), which uses Python C bindings to wrap [`rosidl_generator_c`](https://github.com/ros2/rosidl/tree/master/rosidl_generator_c) representations), although this dependency is not explicit: generator packages do depend on each other but it is assumed that their generated code will all be built within the same package so that build system dependencies can be established.
Suggested fix: "conventionally, but not necessarily, split into a separate package" (add commas).
+1 I'm still a little fuzzy on the proposal. In practice, it sounds like we would be replacing all of the CMake logic with a set of Python tools that others can extend via plugins. Here's a list of the proposed tools with inputs and outputs as I understand them:

Does that look correct? As an exercise, I'll try to map these to plugins necessary for Java support:

Am I somewhat close to what you are imagining?
+1
You're spot on.
Code generation and build specification/configuration are tightly coupled. Today both live in the same packages (as
Describe what the build system has to do to generate, build, and install generated artifacts, e.g.: generate source files calling the code generation tooling, then build them into a shared library, then install that shared library somewhere.
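For illustration, such a description might look like the following CBS-like fragment expressed as a Python dict. Component names, fields, and the command line are all made up for this sketch, loosely following the examples later in this thread:

```python
# Hypothetical "generate -> build -> install" chain as CBS-like components.
components = {
    "c_code": {  # generate source files by calling the code generation tooling
        "Type": "generic",
        "Command": "rosidl_generate --language c -p foo_msgs $(sources)",
    },
    "c": {       # build the generated sources into a shared library, install it
        "Type": "dylib",
        "Requires": [":c_code"],
        "Location": "lib/",
    },
}
print(sorted(components))
```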
Okay, thanks. I guess in my example I wouldn't supply an exact build command, since it would be dependent on what system is chosen (CMake or Gradle), but we could supply things like Java version and compile flags. Thanks for clarifying. |
Signed-off-by: Michel Hidalgo <[email protected]>
Signed-off-by: Michel Hidalgo <[email protected]>
@sloretz @jacobperron I've updated the proposal, added a draft CBS schema, added examples. PTAL! |
That's the idea. In most cases, one shouldn't specify a command but an appropriate component type. It's the meta builder plugin's responsibility to figure out how to build that component type later on. |
Signed-off-by: Michel Hidalgo <[email protected]>
@hidmic @jacobperron guys, I hate to hijack the discussion here, but I think we at Apex.AI did something very relevant to this and I wanted to convey how important I find this proposal to be. We did an experimental bazelization of our internal ROS 2 fork with some pretty neat results. Bazel is not important here though. The main part is that I had to dig through the whole message generation pipeline and compose it in such a way that bazel could build all of the messages natively. So I just want to support the statement that:
It would have made my last couple of weeks much easier if there were a single configurable binary (or script, for that matter) that would spit out all the headers, sources, and additional files that I would then just need to pipe into my build system of choice. If you guys need any specific help (I know I'm late to the party), please do not hesitate to tag me.
"Configurations": [ "Optimized", "Debug" ],
"Default-Components": [ "c" ],
"Components": {
"idls": {
Are the component names arbitrary? It seems like the component names are tied to the `build-configure` plugins. That is, `c_code` below knows there is a `build-configure` plugin that will create a component called `idls`, and it will depend on that.

What if there are no `.msg/.srv/.action` files to convert? Will `idls` still exist and be a noop, or does that mean there will be no `idls` component? If the latter, how does the `build-configure` plugin generating `c_code` know then to not include a dependency on `:idls` (assuming the order that `build-configure` plugins are invoked is random)?
> Are the component names arbitrary?

They are, though consistency is required to establish dependencies between them (e.g. to build Python generated code, C generated code may be necessary). There's also something to be said about components shared by multiple build specifications. It is likely that components like `idls` will be required by many source code generators, and since each build specification is carried out independently, replication will ensue. By keeping names consistent, deduplication becomes feasible when multiple generators are configured at once (e.g. when targeting all available generators by not specifying any on `rosidl build-configure`).

> What if there are no `.msg/.srv/.action` files to convert? Will `idls` still exist and be a noop, or does that mean there will be no `idls` component?

Either would work. During build specification, the tool has access to sources, so it can adapt the resulting `.cbs` accordingly (e.g. adding or removing `c_code` dependencies).
> assuming the order that build-configure plugins are invoked is random

One note about this. Some topological ordering will necessarily happen somewhere: in `rosidl build-configure` by ordering plugins or when merging build specifications, in the build system after build specifications have been transformed, or in the build tool if the user splits build specifications across multiple packages. I'm inclined to do so upon merging build specifications, simply to make things easier for the build system.
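For what it's worth, the ordering itself is standard topological sorting. A sketch using Python's `graphlib` with invented component names:

```python
from graphlib import TopologicalSorter

# Map each component to its dependencies; static_order() yields dependencies
# before their dependents.
components = {
    "idls": [],
    "c_code": ["idls"],
    "py_code": ["idls", "c_code"],
}
order = list(TopologicalSorter(components).static_order())
print(order)
```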
> What if there are no .msg/.srv/.action files to convert? Will `idls` still exist and be a noop, or does that mean there will be no `idls` component?
>
> Either would work. During build specification, the tool has access to sources, so it can adapt the resulting .cbs accordingly (e.g. adding or removing c_code dependencies).

> By keeping names consistent, deduplication becomes feasible when multiple generators are configured at once (e.g. when targeting all available generators by not specifying any on rosidl build-configure).

So if all the `build-configure` plugins know their respective `generator` plugins require `.idl` files but only `.msg` files are available, then the `build-configure` plugins all depend on `:idls` and add an `idls` component to translate any `.msg/.srv/.action` files to `.idl` files? Then `rosidl build-configure` de-duplicates by keeping only one `:idls` component? If so, it seems like that requires all `build-configure` plugins to have exactly the same idea about how to convert files. If something changes and a plugin in the wild doesn't get updated, then all would be affected if `rosidl build-configure` deduplicates and keeps the out-of-date `:idls` component.
> So if all the `build-configure` plugins know their respective generator plugins require `.idl` files but only `.msg` files are available, then the build-configure plugins all depend on `:idls` and add an `:idls` component to translate any `.msg/.srv/.action` files to `.idl` files?

Correct.

> Then `rosidl build-configure` de-duplicates by keeping only one `:idls` component?

Correct, assuming these components are all equivalent.

> it seems like that requires all build-configure plugins to have exactly the same idea about how to convert files.

That's right, if de-duplication is naive and only looks at component names. IIRC this design document does not describe a build specification merge procedure. I didn't get that far. Thinking out loud, to safely de-duplicate two or more components, these have to be equal. Equality here means that name and description are equal, as well as that of its local dependencies, recursively.

For instance, say we have two `:foo` components:

```json
":foo": {
  "Requires": [":bar", "fizz:buzz"],
  ...
}
```

These can be deemed equal if their descriptions match. But that's not enough, since each could be using a different `:bar` (note that's not a problem for `fizz:buzz`, as it is a fully qualified component in an external package [1]), so those have to be equal too. And the same goes for `:bar` dependencies, and the dependencies of its dependencies, and so on. In practice, I highly doubt we'll have seemingly equivalent components like these.

[1] One could argue that two build specifications could depend on different versions of the same external package. The simplest solution is to ban merging them. If there's a need, the procedure could try to pick a version that satisfies both constraints, but it seems overkill to me.
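The recursive equality check described above might look like this. A sketch under the stated assumptions (local dependencies start with `:`; fully qualified ones are compared by name only; function and field names are illustrative):

```python
def components_equal(name, spec_a, spec_b):
    # Two components are equal if their descriptions (everything but
    # "Requires") match, their dependency lists match, and their local
    # (":"-prefixed) dependencies are recursively equal.
    a, b = spec_a[name], spec_b[name]
    desc_a = {k: v for k, v in a.items() if k != "Requires"}
    desc_b = {k: v for k, v in b.items() if k != "Requires"}
    if desc_a != desc_b or a.get("Requires", []) != b.get("Requires", []):
        return False
    return all(components_equal(dep, spec_a, spec_b)
               for dep in a.get("Requires", []) if dep.startswith(":"))

spec_a = {":idls": {"Type": "generic"},
          ":foo": {"Requires": [":idls", "fizz:buzz"]}}
spec_b = {":idls": {"Type": "generic"},
          ":foo": {"Requires": [":idls", "fizz:buzz"]}}
print(components_equal(":foo", spec_a, spec_b))
```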
"idls": {
"Type": "generic",
"Assembly": {
"Command": "rosidl_adapt -p foo_msgs $(sources)",
Do I understand correctly that there would be a `build-configure` plugin that describes the conversion from `.msg/.srv/.action` to `.idl`, but there's no `rosidl generate` plugin that does the conversion?
No, not really. Converting from `.msg/.srv/.action` to `.idl` is more of a translation than a generation. Translation that may or may not be required depending on the generator, and thus it is left to each generator to do so at will. This is an aspect I'm refining though, as `rosidl_typesupport_connext_cpp` forced me to. Right now, the plan of record is to:

- Add an extensible `rosidl translate` verb to expose this functionality.
- Have each `rosidl generate` plugin perform an internal `rosidl translate` if and only if it cannot process a given interface file.
- Have each `rosidl build-configure` plugin separate the translation and generation steps to have access to the translated files (e.g. to install them) and to use the build system as a cache (i.e. re-translate once if sources have changed).
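The "translate if and only if the generator cannot process a given interface file" rule from the plan above could be sketched as follows. Function names and the `.idl`-only assumption are illustrative:

```python
def prepare_interfaces(files, supported_suffixes=(".idl",)):
    # Split inputs into files the generator can consume directly and files
    # that first need an (internal) `rosidl translate` pass.
    ready, to_translate = [], []
    for f in files:
        (ready if f.endswith(supported_suffixes) else to_translate).append(f)
    # A real implementation would invoke the translation tooling here; this
    # sketch only renames the files it would produce.
    translated = [f.rsplit(".", 1)[0] + ".idl" for f in to_translate]
    return ready + translated

print(prepare_interfaces(["msg/Foo.msg", "srv/Bar.idl"]))
```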
"Requires": [ "action_msgs:idls", ":idls" ],
"Command": "rosidl_generate --language c -p foo_msgs -I $(location action_msgs:idls) $(sources)",
"Sources": [
"msg/foo.idl",
Since `:c_code` requires `msg/foo.idl` and `:idls` provides that, is `"Requires": [":idls"]` redundant?
It is, but I figured it'd be easier on `ament_meta_build` if it doesn't have to deduce that when the underlying build system needs the information.
},
"Requires": {
// Base common dependencies
"rmw": {},
How would `ament_meta_build` output for CMake translate this? `find_package(rmw REQUIRED)`, but then which CMake target gets used? Do all components making shared libraries get linked against whatever CMake target `rmw` provides?
Whoops. Some component should be requiring some other component within `rmw`. Good catch!
},
"Debug": {
// Use debug Python interpreter to resolve numpy include path.
"System-Includes": [ "$(shell pythond -c \"import numpy; print(numpy.get_include())\")" ]
On Windows the command is `python_d`, but AFAIK ROS 2 only uses the debug interpreter on Windows. How would that platform-specific quirk be specified? Is the generated `.cbs` supposed to be the same on all platforms, or is it only applicable to the platform it's run on?
This bit is not fully fleshed out, I must say, but I'd leave this to `ament_meta_build`. That is, use `pythond` to refer to Python's debug interpreter and let each `ament_meta_build` plugin figure out what to put in there. Arguably, performing substitutions in a shell command line may be brittle, and perhaps introducing a `$(pythond ...)` substitution is a better, simpler solution. I'm open to ideas :)
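A toy expander for such a `$(pythond ...)` substitution could look like the sketch below. The per-platform interpreter names (`python_d` on Windows, `python3-dbg` elsewhere) and the single-token greedy match are assumptions for illustration only.

```python
import re


def expand_pythond(value, platform):
    # Map the abstract `pythond` name to a concrete debug interpreter.
    # The per-platform commands here are assumptions, not a spec.
    interpreter = "python_d" if platform == "windows" else "python3-dbg"
    # Greedy match: assumes the token spans the rest of the value, so
    # nested parentheses inside the command stay intact.
    return re.sub(
        r"\$\(pythond (.*)\)",
        lambda m: f"$(shell {interpreter} {m.group(1)})",
        value,
    )


print(expand_pythond('$(pythond -c "import numpy")', "windows"))
# → $(shell python_d -c "import numpy")
```

Doing the expansion in the `ament_meta_build` plugin, rather than in a shell, sidesteps the quoting problems the comment above alludes to.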
A few details worth noting:

- The explicit dependency on other `foo_msgs_*` generated packages, which the tooling should resolve if all are to be present in the same specification.
- The requirement on the `python-dev` package, which is a hint for the build system to build against CPython development libraries.
Is `python-dev` in this context a rosdep key? Can any rosdep key be used here, or is this a keyword that gets special treatment?
TL;DR: not a rosdep key, just a package this build specification depends on.

Generally speaking, a package requirement abstracts away a dependency. A package may exist for the target build system, or it may not, i.e. it may be replaced by `ament_meta_build` with some target build system-specific code. In this case, `python-dev` is a package that abstracts away a dependency on CPython development libraries and that will likely be replaced with build system code by each `ament_meta_build` plugin.

It's worth mentioning that this is an exceptional case. Having each `ament_meta_build` plugin handle a "package" in a custom way is expensive. I think this makes sense for CPython development libraries because there's large variability across build systems when using them and CPython extensions are ubiquitous. For most cases, it'll be up to the user to make sure a package requirement makes sense for the target build system (e.g. a requirement on `fastrtps` in a `.cbs` file will translate to `find_package(fastrtps REQUIRED)` in CMake -- it is up to the user to make sure that such a macro call succeeds and provides the necessary information).
    "Configurations": {
      "Optimized": {
        // Use Python interpreter to resolve numpy include path.
        "System-Includes": [ "$(shell python -c \"import numpy; print(numpy.get_include())\")" ]
`python` might be Python 2 or Python 3, depending on the platform. Is this an actual command, or is something else translating this to the actual command?
Same as in #310 (comment), i.e. to be substituted. In that sense, custom Python substitutions like `$(python2.7 ...)` and `$(python3.6 ...)` may be a simpler approach to handle multiple Python versions. WDYT?
ament_meta_build ((-o|--output-file) PATH)? ((-b|--build-system) IDENTIFIER)? PATH?
```

Its only positional argument is a path to a common build specification file.
That is, the CMake output plugin to `ament_meta_build` would assume relative paths in `Sources` come from `CMAKE_CURRENT_SOURCE_DIR`, things from `Artifacts` are relative to `CMAKE_CURRENT_BINARY_DIR`, and installed things from `Distribution` are relative to `CMAKE_INSTALL_PREFIX`?
- Add section on interface definition translators and `rosidl translate`
- Update CLI specifications for all verbs
- Update CLI examples

Signed-off-by: Michel Hidalgo <[email protected]>
First step towards addressing ros2/rosidl#560 (might be worth going through the discussion there before reviewing). This is a first draft proposal to get some eyes on it. Lacks the `cbs` schema extension proposal.