diff --git a/.gitignore b/.gitignore
index 6dddc2e..fe48896 100644
--- a/.gitignore
+++ b/.gitignore
@@ -137,6 +137,9 @@ venv.bak/
.spyderproject
.spyproject
+# VSCode settings
+.vscode/
+
# Rope project settings
.ropeproject
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index fa1edc1..84a7376 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -32,14 +32,10 @@ pytest
We additionally recommend that you set up your editor / IDE as follows.
-- Indent with 4 spaces per level of indentation
- Indent with 4 spaces per level of indentation
-- Maximum line length of 79 (add a ruler / thin line / highlighting / ...)
- Maximum line length of 79 (add a ruler / thin line / highlighting / ...)
-- _If you use Visual Studio Code_: Consider using a platform which supports
- third-party language servers more easily, and continue with the next point.
- _If you use Visual Studio Code_: Consider using a platform which supports
third-party language servers more easily, and continue with the next point.
@@ -55,40 +51,19 @@ We additionally recommend that you set up your editor / IDE as follows.
}
```
- ```json
- "[python]": {
- "editor.codeActionsOnSave": {
- "source.organizeImports": true
- }
- }
- ```
-
- Note that the Pylance language server is not recommended, as it occasionally
- causes false-positive errors for perfectly valid code.
Note that the Pylance language server is not recommended, as it occasionally
causes false-positive errors for perfectly valid code.
-- _If you do not use VSC_: Set up your editor to use the [python-lsp-server],
- and make sure that the relevant plugins are installed. You can install
- everything that's needed into the virtualenv with pip:
- _If you do not use VSC_: Set up your editor to use the [python-lsp-server],
and make sure that the relevant plugins are installed. You can install
everything that's needed into the virtualenv with pip:
- [python-lsp-server]: https://github.com/python-lsp/python-lsp-server
[python-lsp-server]: https://github.com/python-lsp/python-lsp-server
```sh
pip install "python-lsp-server[pylint]" python-lsp-black pyls-isort pylsp-mypy
```
- ```sh
- pip install "python-lsp-server[pylint]" python-lsp-black pyls-isort pylsp-mypy
- ```
-
- This will provide as-you-type linting as well as automatic formatting on
- save. Language server clients are available for a wide range of editors, from
- Vim/Emacs to PyCharm/IDEA.
This will provide as-you-type linting as well as automatic formatting on
save. Language server clients are available for a wide range of editors, from
Vim/Emacs to PyCharm/IDEA.
@@ -99,47 +74,27 @@ We base our code style on a modified version of the
[Google style guide for Python code](https://google.github.io/styleguide/pyguide.html).
The key differences are:
-- **Docstrings**: The [Numpy style guide] applies here.
- **Docstrings**: The [Numpy style guide] applies here.
[numpy style guide]: https://numpydoc.readthedocs.io/en/latest/format.html#docstring-standard
- [numpy style guide]: https://numpydoc.readthedocs.io/en/latest/format.html#docstring-standard
- When writing docstrings for functions, use the imperative style, as per
- [PEP-257]). For example, write "Do X and Y" instead of "Does X and Y".
When writing docstrings for functions, use the imperative style, as per
   [PEP-257]. For example, write "Do X and Y" instead of "Does X and Y".
[pep-257]: https://peps.python.org/pep-0257/
- [pep-257]: https://peps.python.org/pep-0257/
-- **Overridden methods**: If the documentation did not change from the base
- class (i.e. the base class' method's docstring still applies without
- modification), do not add a short docstring á la "See base class". This lets
- automated tools pick up the full base class docstring instead, and is
- therefore more useful in IDEs etc.
- **Overridden methods**: If the documentation did not change from the base
class (i.e. the base class' method's docstring still applies without
  modification), do not add a short docstring à la "See base class". This lets
automated tools pick up the full base class docstring instead, and is
therefore more useful in IDEs etc.
-- **Linting**: Use [pylint] for static code analysis, and [mypy] for static
- type checking.
- **Linting**: Use [pylint] for static code analysis, and [mypy] for static
type checking.
[pylint]: https://github.com/PyCQA/pylint
[mypy]: https://github.com/python/mypy
- [pylint]: https://github.com/PyCQA/pylint
- [mypy]: https://github.com/python/mypy
-- **Formatting**: Use [black] as code auto-formatter. The maximum line length
- is 79, as per [PEP-8]. This setting should be automatically picked up from
- the `pyproject.toml` file. The reason for the shorter line length is that it
- avoids wrapping and overflows in side-by-side split views (e.g. diffs) if
- there's also information displayed to the side of it (e.g. a tree view of the
- modified files).
- **Formatting**: Use [black] as code auto-formatter. The maximum line length
is 79, as per [PEP-8]. This setting should be automatically picked up from
the `pyproject.toml` file. The reason for the shorter line length is that it
@@ -147,36 +102,22 @@ The key differences are:
there's also information displayed to the side of it (e.g. a tree view of the
modified files).
- [black]: https://github.com/psf/black
- [pep-8]: https://www.python.org/dev/peps/pep-0008/
[black]: https://github.com/psf/black
[pep-8]: https://www.python.org/dev/peps/pep-0008/
- Be aware of the different line length of 72 for docstrings. We currently do
- not have a satisfactory solution to automatically apply or enforce this.
Be aware of the different line length of 72 for docstrings. We currently do
not have a satisfactory solution to automatically apply or enforce this.
- Note that, while you're encouraged to do so in general, it is not a hard
- requirement to break up long strings into smaller parts. Additionally, never
- break up strings that are presented to the user in e.g. log messages, as that
- makes it significantly harder to grep for them.
Note that, while you're encouraged to do so in general, it is not a hard
requirement to break up long strings into smaller parts. Additionally, never
break up strings that are presented to the user in e.g. log messages, as that
makes it significantly harder to grep for them.
- Use [isort] for automatic sorting of imports. Its settings should
- automatically be picked up from the `pyproject.toml` file as well.
Use [isort] for automatic sorting of imports. Its settings should
automatically be picked up from the `pyproject.toml` file as well.
[isort]: https://github.com/PyCQA/isort
- [isort]: https://github.com/PyCQA/isort
-- **Typing**: We do not make an exception for `typing` imports. Instead of
- writing `from typing import SomeName`, use `import typing as t` and access
- typing related classes like `t.TypedDict`.
- **Typing**: We do not make an exception for `typing` imports. Instead of
writing `from typing import SomeName`, use `import typing as t` and access
typing related classes like `t.TypedDict`.
@@ -194,22 +135,13 @@ The key differences are:
`t.Optional[...]` and always explicitly annotate where `None` is possible.
[pep-604-style unions]: https://www.python.org/dev/peps/pep-0604/
- [pep-604-style unions]: https://www.python.org/dev/peps/pep-0604/
-- **Python style rules**: For conflicting parts, the [Black code style] wins.
- If you have set up black correctly, you don't need to worry about this though
- :)
- **Python style rules**: For conflicting parts, the [Black code style] wins.
If you have set up black correctly, you don't need to worry about this though
:)
[black code style]: https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html
- [black code style]: https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html
-- When working with `dict`s, consider using `t.TypedDict` instead of a more
- generic `dict[str, float|int|str]`-like annotation where possible, as the
- latter is much less precise (often requiring additional `assert`s or
- `isinstance` checks to pass) and can grow unwieldy very quickly.
- When working with `dict`s, consider using `t.TypedDict` instead of a more
generic `dict[str, float|int|str]`-like annotation where possible, as the
latter is much less precise (often requiring additional `assert`s or
@@ -217,5 +149,3 @@ The key differences are:
- Prefer `t.NamedTuple` over `collections.namedtuple`, because the former uses
a more convenient `class ...:` syntax and also supports type annotations.
-- Prefer `t.NamedTuple` over `collections.namedtuple`, because the former uses
- a more convenient `class ...:` syntax and also supports type annotations.
diff --git a/capella_ros_tools/__main__.py b/capella_ros_tools/__main__.py
index e328959..be47277 100644
--- a/capella_ros_tools/__main__.py
+++ b/capella_ros_tools/__main__.py
@@ -14,18 +14,40 @@
from capella_ros_tools.snapshot import app
-@click.command()
+@click.group(context_settings={"default_map": {}})
@click.version_option(
version=capella_ros_tools.__version__,
prog_name="capella-ros-tools",
message="%(prog)s %(version)s",
)
+def cli():
+ """CLI for capella-ros-tools."""
+
+
+@cli.command("import")
+@click.argument("msg_path", type=str, required=True)
+@click.argument(
+    "capella_path", type=click.Path(path_type=Path), required=True
+)
+@click.argument(
+    "layer",
+    type=click.Choice(["oa", "la", "sa", "pa"], case_sensitive=False),
+    required=True,
+)
@click.option(
"--exists-action",
"action",
- type=click.Choice(["k", "o", "a", "c"], case_sensitive=False),
- default="c" if sys.stdin.isatty() else "a",
- help="Default action when an element already exists: (c)heck, (k)eep, (o)verwrite, (a)bort.",
+ type=click.Choice(
+ ["skip", "replace", "abort", "ask"], case_sensitive=False
+ ),
+ default="ask" if sys.stdin.isatty() else "abort",
+ help="Default action when an element already exists.",
)
@click.option(
"--no-deps",
@@ -34,71 +56,20 @@
help="Don’t install message dependencies.",
)
@click.option("--port", type=int, help="Port for HTML display.")
-@click.option(
- "-i",
- "in_",
- nargs=2,
- type=(click.Choice(["capella", "messages"]), str),
- required=True,
- help="Input file type and path.",
-)
-@click.option(
- "-o",
- "out",
- nargs=2,
- type=(
- click.Choice(["capella", "messages"]),
- click.Path(path_type=Path),
- ),
- required=True,
- help="Output file type and path.",
-)
-@click.option(
- "-l",
- "layer",
- type=click.Choice(["oa", "sa", "la", "pa"], case_sensitive=True),
- required=True,
- help="Layer to use.",
-)
-def cli(
- in_: tuple[str, str],
- out: tuple[str, str],
+def import_msg(
+ msg_path: t.Any,
+ capella_path: Path,
layer: str,
action: str,
- port: int,
no_deps: bool,
+ port: int,
):
- """Convert between Capella and ROS message definitions."""
- input_type, input_path = in_
- output_type, output = out
-
- if input_type == output_type:
- raise click.UsageError(
- "Input and output must be different file types."
- )
- if "capella" not in (input_type, output_type):
- raise click.UsageError(
- "Either input or output must be a capella file."
- )
- if "messages" not in (input_type, output_type):
- raise click.UsageError(
- "Either input or output must be a messages file."
- )
-
- input: t.Any = Path(input_path)
-
- if not input.exists() and input_type == "messages":
- input = capellambse.filehandler.get_filehandler(input_path).rootdir
- elif not input.exists() and input_type == "capella":
- input = capellambse.filehandler.get_filehandler(input_path)
+ """Import ROS messages into Capella data package."""
- msg_path, capella_path, convert_class = (
- (input, output, msg2capella.Converter)
- if input_type == "messages"
- else (output, input, capella2msg.Converter)
- )
+ if not Path(msg_path).exists():
+ msg_path = capellambse.filehandler.get_filehandler(msg_path).rootdir
- converter: t.Any = convert_class(
+ converter: t.Any = msg2capella.Converter(
msg_path, capella_path, layer, action, no_deps
)
converter.convert()
@@ -107,5 +78,35 @@ def cli(
app.start(converter.model.model, layer, port)
+@cli.command("export")
+@click.argument("capella_path", type=str, required=True)
+@click.argument(
+ "layer",
+ type=click.Choice(["oa", "la", "sa", "pa"], case_sensitive=False),
+ required=True,
+)
+@click.argument("msg_path", type=click.Path(path_type=Path), required=True)
+@click.option(
+ "--exists-action",
+ "action",
+ type=click.Choice(
+ ["keep", "overwrite", "abort", "ask"], case_sensitive=False
+ ),
+ default="ask" if sys.stdin.isatty() else "abort",
+ help="Default action when an element already exists.",
+)
+def export_capella(
+    capella_path: t.Any,
+    msg_path: Path,
+    layer: str,
+    action: str,
+):
+    """Export Capella data package to ROS messages."""
+    del action  # declared by --exists-action, but not yet handled here
+ if not Path(capella_path).exists():
+ capella_path = capellambse.filehandler.get_filehandler(capella_path)
+
+ converter: t.Any = capella2msg.Converter(capella_path, msg_path, layer)
+ converter.convert()
+
+
if __name__ == "__main__":
cli()
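The restructuring above turns a single `@click.command` into a `@click.group` with `import`/`export` subcommands. A toy mirror of that layout (names are illustrative), exercised with click's own `CliRunner` — note that `click.argument` does not accept a `help` parameter, so argument descriptions belong in the command docstring:

```python
import click
from click.testing import CliRunner


@click.group()
def cli():
    """Toy CLI mirroring the group/subcommand layout used above."""


@cli.command("import")
@click.argument("msg_path")
@click.option(
    "--exists-action",
    "action",
    type=click.Choice(["skip", "replace", "abort", "ask"]),
    default="abort",
)
def import_msg(msg_path: str, action: str):
    """Import MSG_PATH (stub that only echoes its parameters)."""
    click.echo(f"import {msg_path} ({action})")


runner = CliRunner()
result = runner.invoke(cli, ["import", "msgs/", "--exists-action", "skip"])
print(result.output)
```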
diff --git a/capella_ros_tools/modules/capella/__init__.py b/capella_ros_tools/modules/capella/__init__.py
index 2449cb0..1337ae5 100644
--- a/capella_ros_tools/modules/capella/__init__.py
+++ b/capella_ros_tools/modules/capella/__init__.py
@@ -48,7 +48,7 @@ class BaseCapellaModel:
def __init__(
self,
- path_to_capella_model: str,
+ path_to_capella_model: t.Any,
layer: str,
) -> None:
self.model = capellambse.MelodyModel(path_to_capella_model)
diff --git a/capella_ros_tools/modules/capella/serializer.py b/capella_ros_tools/modules/capella/serializer.py
index ff94cc2..46d4b6c 100644
--- a/capella_ros_tools/modules/capella/serializer.py
+++ b/capella_ros_tools/modules/capella/serializer.py
@@ -65,12 +65,23 @@ def delete_classes(self, classes: list, package: t.Any = None) -> None:
except ValueError:
pass
- def _find_type(self, type_name: str, package: t.Any) -> t.Any:
+ def _find_or_create_type(self, type_name: str, package: t.Any) -> t.Any:
"""Find type in Capella model."""
try:
return self.predef_types.by_name(type_name)
except KeyError:
- self.create_basic_types([type_name], package)
+ pass
+ try:
+ return package.datatypes.by_name(type_name)
+ except KeyError:
+            type_name_lower = type_name.lower()
+            if "char" in type_name_lower or "string" in type_name_lower:
+                type_class = "StringType"
+            elif "bool" in type_name_lower:
+                type_class = "BooleanType"
+            else:
+                type_class = "NumericType"
+            package.datatypes.create(type_class, name=type_name)
return package.datatypes.by_name(type_name)
def create_enums(
@@ -102,7 +113,7 @@ def create_enums(
property.value = capellambse.new_object(
"LiteralNumericValue",
value=prop.value,
- type=self._find_type(prop.type, package),
+ type=self._find_or_create_type(prop.type, package),
)
logger.info("Created enum %s.", enum.name)
@@ -120,27 +131,6 @@ def delete_enums(self, enums: list, package: t.Any = None) -> None:
except ValueError:
pass
- def create_basic_types(
- self, basic_types: list[str], package: t.Any = None
- ) -> list:
- """Create basic types in Capella model."""
- if package is None:
- package = self.data
-
- overlap = []
- for basic_type in basic_types:
- try:
- overlap.append(package.datatypes.by_name(basic_type))
- except KeyError:
- if "char" in basic_type or "string" in basic_type:
- type = "StringType"
- elif "bool" in basic_type or "boolean" in basic_type:
- type = "BooleanType"
- else:
- type = "NumericType"
- package.datatypes.create(type, name=basic_type)
- return overlap
-
def create_properties(self, cls: ClassDef, package: t.Any):
"""Create properties for class in Capella model."""
if package is None:
@@ -191,7 +181,9 @@ def create_properties(self, cls: ClassDef, package: t.Any):
"Enumeration", below=type_package
).by_name(prop.type_name)
except KeyError:
- property_type = self._find_type(prop.type_name, package)
+ property_type = self._find_or_create_type(
+ prop.type_name, package
+ )
attribute = self._create_composition(
superclass, prop, property_type
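The `_find_or_create_type` change above folds a lookup chain into one place: predefined types win, then package-local datatypes, and only then is a new datatype created from a name heuristic. A minimal stand-in for that logic, using plain dicts instead of the Capella model API (all names here are hypothetical):

```python
def infer_type_class(type_name: str) -> str:
    """Mirror the fallback heuristic: pick a datatype class from the name."""
    name = type_name.lower()
    if "char" in name or "string" in name:
        return "StringType"
    if "bool" in name:
        return "BooleanType"
    return "NumericType"


def find_or_create(type_name: str, predefined: dict, local: dict) -> str:
    """Predefined types win; otherwise create and cache a local datatype."""
    if type_name in predefined:
        return predefined[type_name]
    if type_name not in local:
        local[type_name] = infer_type_class(type_name)
    return local[type_name]


datatypes: dict = {}
print(find_or_create("wstring", {}, datatypes))  # creates a StringType
print(find_or_create("uint8", {}, datatypes))    # creates a NumericType
```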
diff --git a/capella_ros_tools/modules/messages/parser.py b/capella_ros_tools/modules/messages/parser.py
index c71136a..01baea0 100644
--- a/capella_ros_tools/modules/messages/parser.py
+++ b/capella_ros_tools/modules/messages/parser.py
@@ -41,7 +41,7 @@
VALID_MESSAGE_NAME_PATTERN = "[A-Z][A-Za-z0-9]*"
-VALID_CONSTANT_NAME_PATTERN = "[A-Z][A-Z0-9_]*[A-Z0-9]*"
+VALID_CONSTANT_NAME_PATTERN = "[A-Z](?:[A-Z0-9_]*[A-Z0-9])?"
VALID_REF_COMMENT_PATTERN = re.compile(
r"cf\.\s*"
rf"({VALID_MESSAGE_NAME_PATTERN})"
@@ -248,23 +248,20 @@ def _process_enums(msg):
msg.enums.remove(enum)
for enum in msg.enums:
- match_name = [
- i
- for i, field in enumerate(msg.fields)
- if _get_enum_identifier(field.name) == enum.name
- ]
- match_type = [
- i
- for i, field in enumerate(msg.fields)
- if field.type.name == enum.values[0].type.name
- ]
- if match_name:
- msg.fields[match_name[0]].type.name = enum.name
- elif match_type:
- field = msg.fields[match_type[0]]
- field.type.name = msg.name + _get_enum_identifier(field.name)
- enum.name = field.type.name
- elif not enum.name or len(msg.enums) == 1:
+        matched = False
+        for field in msg.fields:
+            if enum.name == _get_enum_identifier(field.name):
+                # enum name is the same as the field name
+                field.type.name = enum.name
+                matched = True
+                break
+
+        if not matched:
+            for field in msg.fields:
+                if field.type.name == enum.values[0].type.name:
+                    # enum type is the same as the field type
+                    field.type.name = msg.name + _get_enum_identifier(
+                        field.name
+                    )
+                    enum.name = field.type.name
+                    matched = True
+                    break
+
+        if not matched and (not enum.name or len(msg.enums) == 1):
enum.name = msg.name + "Type" if msg.fields else msg.name
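The tightened `VALID_CONSTANT_NAME_PATTERN` above fixes a subtle flaw: in the old pattern the trailing `[A-Z0-9]*` was optional, so the regex degenerated to `[A-Z][A-Z0-9_]*` and accepted constant names ending in an underscore. The new pattern requires a name longer than one character to end in a letter or digit. A quick check with standard `re` semantics:

```python
import re

# Old pattern: the final [A-Z0-9]* can match empty, so "BAD_" is accepted.
OLD = "[A-Z][A-Z0-9_]*[A-Z0-9]*"
# New pattern: any characters after the first must end in [A-Z0-9].
NEW = "[A-Z](?:[A-Z0-9_]*[A-Z0-9])?"

for name in ["MAX_SPEED", "A", "BAD_"]:
    old_ok = re.fullmatch(OLD, name) is not None
    new_ok = re.fullmatch(NEW, name) is not None
    print(name, old_ok, new_ok)
```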
diff --git a/capella_ros_tools/scripts/capella2msg.py b/capella_ros_tools/scripts/capella2msg.py
index 5da434e..8406719 100644
--- a/capella_ros_tools/scripts/capella2msg.py
+++ b/capella_ros_tools/scripts/capella2msg.py
@@ -19,16 +19,17 @@
"Boolean": "bool",
"Byte": "byte",
"Char": "char",
- "Short": "int8",
- "UnsignedShort": "uint8",
- "Integer": "int16",
- "UnsignedInteger": "uint16",
- "Long": "int32",
- "UnsignedLong": "uint32",
- "LongLong": "int64",
- "UnsignedLongLong": "uint64",
+ "Short": "int16",
+ "UnsignedShort": "uint16",
+ "Integer": "int32",
+ "UnsignedInteger": "uint32",
+ "Long": "int64",
+ "UnsignedLong": "uint64",
+ "LongLong": "int128",
+ "UnsignedLongLong": "uint128",
"Float": "float32",
"Double": "float64",
+ "LongDouble": "float128",
"String": "string",
}
@@ -38,74 +39,55 @@ class Converter:
def __init__(
self,
- msg_path: t.Any,
capella_path: t.Any,
+ msg_path: t.Any,
layer: str,
- action: str,
- no_deps: bool,
) -> None:
self.msg_path = msg_path
self.msgs = MessagePkgDef(msg_path.stem, [], [])
self.model = CapellaModel(capella_path, layer)
- self.action = action
- self.no_deps = no_deps
def _add_package(self, current_root: t.Any) -> MessagePkgDef:
current_pkg_def = MessagePkgDef(current_root.name, [], [])
for cls in self.model.get_classes(current_root):
- current_pkg_def.messages.append(
- MessageDef(
- cls.name,
- [
- FieldDef(
- BaseTypeDef(
- CAPELLA_TYPE_TO_MSG[prop.type_name]
- if prop.type_name in CAPELLA_TYPE_TO_MSG
- else prop.type_name,
- None
- if prop.max_card == "1"
- else prop.max_card,
- None
- if prop.type_pkg_name == current_pkg_def.name
- else prop.type_pkg_name,
- ),
- prop.name,
- prop.description.split("\n"),
- )
- for prop in cls.properties
- ],
- [],
- cls.description.split("\n"),
+ fields = []
+ for prop in cls.properties:
+ bt_name = CAPELLA_TYPE_TO_MSG.get(
+ prop.type_name, prop.type_name
+ )
+ bt_size = None if prop.max_card == "1" else prop.max_card
+ if prop.type_pkg_name != current_pkg_def.name:
+ bt_pkg = prop.type_pkg_name
+ else:
+ bt_pkg = None
+ fields.append(
+ FieldDef(
+ BaseTypeDef(bt_name, bt_size, bt_pkg),
+ prop.name,
+ prop.description.split("\n"),
+ )
)
- )
+ annotations = cls.description.split("\n")
+ msg_def = MessageDef(cls.name, fields, [], annotations)
+ current_pkg_def.messages.append(msg_def)
for enum in self.model.get_enums(current_root):
- current_pkg_def.messages.append(
- MessageDef(
- enum.name,
- [],
- [
- EnumDef(
- enum.name,
- [
- ConstantDef(
- BaseTypeDef(
- CAPELLA_TYPE_TO_MSG.get(value.type)
- or "uint8"
- ),
- value.name,
- value.value or str(i),
- value.description.split("\n"),
- )
- for i, value in enumerate(enum.values)
- ],
- [],
- )
- ],
- enum.description.split("\n"),
+ values = []
+ for i, value in enumerate(enum.values):
+ bt_name = CAPELLA_TYPE_TO_MSG.get(value.type, "uint8")
+ values.append(
+ ConstantDef(
+ BaseTypeDef(bt_name),
+ value.name,
+ value.value or str(i),
+ value.description.split("\n"),
+ )
)
- )
+ enum_def = EnumDef(enum.name, values, [])
+ annotations = enum.description.split("\n")
+ msg_def = MessageDef(enum.name, [], [enum_def], annotations)
+ current_pkg_def.messages.append(msg_def)
for pkg_name in self.model.get_packages(current_root):
new_root = current_root.packages.by_name(pkg_name)
diff --git a/capella_ros_tools/scripts/msg2capella.py b/capella_ros_tools/scripts/msg2capella.py
index 5676ed1..7792b01 100644
--- a/capella_ros_tools/scripts/msg2capella.py
+++ b/capella_ros_tools/scripts/msg2capella.py
@@ -26,18 +26,7 @@
"bool": "Boolean",
"byte": "Byte",
"char": "Char",
- "int8": "Short",
- "uint8": "UnsignedShort",
- "int16": "Integer",
- "uint16": "UnsignedInteger",
- "int32": "Long",
- "uint32": "UnsignedLong",
- "int64": "LongLong",
- "uint64": "UnsignedLongLong",
- "float32": "Float",
- "float64": "Double",
"string": "String",
- "wstring": "Char",
}
@@ -58,20 +47,20 @@ def __init__(
self.no_deps = no_deps
def _resolve_overlap(self, overlap, deletion_func, current_root):
- if not overlap or self.action == "k":
+ if not overlap or self.action == "skip":
return
- if self.action == "a":
+ if self.action == "abort":
click.echo(
f"{len(overlap)} elements already exist."
- " Use --exists-action=o to overwrite."
+ " Use --exists-action=replace to replace."
)
raise click.Abort()
- elif self.action == "o":
+ elif self.action == "replace":
deletion_func(overlap, current_root)
- elif self.action == "c":
+ elif self.action == "ask":
for i, cls in enumerate(overlap):
confirm = click.prompt(
- f"{cls.name} already exists. " "Do you want to overwrite?",
+                f"{cls.name} already exists. Overwrite?"
+                " [y]es / [Y]es to all / [n]o / [N]o to all",
type=click.Choice(
["y", "Y", "n", "N"],
case_sensitive=True,
@@ -95,33 +84,46 @@ def _add_objects(
packages = [p.name for p in current_pkg_def.packages]
self.model.create_packages(packages, current_root)
- enums = [
- EnumDef(
- e.name,
- [
- EnumValue(
- MSG_TYPE_TO_CAPELLA.get(v.type.name) or v.type.name,
- v.name,
- v.value,
- "\n".join(v.annotations),
+ classes = []
+ enums = []
+ for msg in current_pkg_def.messages:
+ if msg.fields:
+ class_description = "\n".join(msg.annotations)
+ class_def = ClassDef(
+ msg.name,
+ [],
+ class_description,
+ )
+ classes.append(class_def)
+
+ for enum in msg.enums:
+ if not enum.values:
+ continue
+ values = []
+ for value in enum.values:
+ value_type = MSG_TYPE_TO_CAPELLA.get(
+ value.type.name, value.type.name
)
- for v in e.values
- ],
- "\n".join(e.annotations),
- )
- for msg in current_pkg_def.messages
- for e in msg.enums
- if e.values
- ]
+ value_description = "\n".join(value.annotations)
+ value_def = EnumValue(
+ value_type,
+ value.name,
+ value.value,
+ value_description,
+ )
+ values.append(value_def)
+ enum_description = "\n".join(enum.annotations)
+ enum_def = EnumDef(
+ enum.name,
+ values,
+ enum_description,
+ )
+ enums.append(enum_def)
+
overlap = self.model.create_enums(enums, current_root)
self._resolve_overlap(overlap, self.model.delete_enums, current_root)
self.model.create_enums(enums, current_root)
- classes = [
- ClassDef(c.name, [], "\n".join(c.annotations))
- for c in current_pkg_def.messages
- if c.fields
- ]
overlap = self.model.create_classes(classes, current_root)
self._resolve_overlap(overlap, self.model.delete_classes, current_root)
self.model.create_classes(classes, current_root)
@@ -134,25 +136,29 @@ def _add_relations(self, current_pkg_def, current_root):
for msg in current_pkg_def.messages:
if not msg.fields:
continue
- self.model.create_properties(
- ClassDef(
- name=msg.name,
- properties=[
- ClassProperty(
- MSG_TYPE_TO_CAPELLA.get(f.type.name)
- or f.type.name,
- f.type.pkg_name,
- f.name,
- min_card="0" if f.type.array_size else "1",
- max_card=f.type.array_size or "1",
- description="\n".join(f.annotations),
- )
- for f in msg.fields
- ],
- description="\n".join(msg.annotations),
- ),
- current_root,
+ properties = []
+ for field in msg.fields:
+ field_type = MSG_TYPE_TO_CAPELLA.get(
+ field.type.name, field.type.name
+ )
+ field_min = "0" if field.type.array_size else "1"
+ field_max = field.type.array_size or "1"
+ field_description = "\n".join(field.annotations)
+ property_def = ClassProperty(
+ field_type,
+ field.type.pkg_name,
+ field.name,
+ field_min,
+ field_max,
+ field_description,
+ )
+ properties.append(property_def)
+ class_def = ClassDef(
+ msg.name,
+ properties,
+ "",
)
+ self.model.create_properties(class_def, current_root)
for new_pkg_def in current_pkg_def.packages:
new_root = current_root.packages.by_name(new_pkg_def.name)
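Both converters above now resolve types through the mapping table with a `dict.get` fallback to the original name, so user-defined Capella classes pass through unchanged instead of raising `KeyError`. A sketch with an abbreviated, illustrative copy of the mapping:

```python
# Abbreviated excerpt of the Capella-to-ROS mapping from the diff.
CAPELLA_TYPE_TO_MSG = {
    "Boolean": "bool",
    "Short": "int16",
    "Integer": "int32",
    "Float": "float32",
    "String": "string",
}


def to_msg_type(capella_name: str) -> str:
    # Unknown (user-defined) type names pass through unchanged.
    return CAPELLA_TYPE_TO_MSG.get(capella_name, capella_name)


print(to_msg_type("Short"))         # known type, mapped to "int16"
print(to_msg_type("TwistStamped"))  # custom class name, passes through
```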
diff --git a/docs/source/examples/data/empty_project_52/empty_project_52.capella b/docs/source/examples/data/empty_project_52/empty_project_52.capella
index 836b9e3..1db3bd1 100644
--- a/docs/source/examples/data/empty_project_52/empty_project_52.capella
+++ b/docs/source/examples/data/empty_project_52/empty_project_52.capella
@@ -201,7 +201,10775 @@
+ id="d755d151-7b1f-4df3-b4b8-329953b177dc" name="Data">
+  [... remaining ~10,700 added lines of generated Capella XML not shown ...]
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
-o capella <CAPELLA_MODEL_PATH> -l <LAYER> --port=<PORT> --exists-action=<EXISTS_ACTION> --no-deps
-
-* "-i messages <MSG_PATH>", import ROS2 messages from <MSG_PATH>
-* "-o capella <CAPELLA_MODEL_PATH>", export to Capella model <CAPELLA_MODEL_PATH>
-* "-l <LAYER>", use Capella model layer <LAYER>
-* "--port=<PORT>", start Capella model server at <PORT>
-* "--exists-action=<EXISTS_ACTION>", action to take if a Capella element already exists
+   $ python -m capella_ros_tools import <MSG_PATH> <CAPELLA_MODEL_PATH> <LAYER> --port=<PORT> --exists-action=<EXISTS_ACTION> --no-deps
+
+* "<MSG_PATH>", import ROS2 messages from <MSG_PATH>
+* "<CAPELLA_MODEL_PATH>", export to Capella model <CAPELLA_MODEL_PATH>
+* "<LAYER>", use Capella model layer <LAYER>
+* "--port=<PORT>", start Capella model server at <PORT> (optional)
+* "--exists-action=<EXISTS_ACTION>", action to take if a Capella element already exists (optional)
+ * "skip", skip elements
+ * "replace", replace elements
+ * "abort", abort import
+ * "ask", ask the user (default)
* "--no-deps", do not import ROS2 dependencies (e.g. std_msgs)
Export Capella Model (experimental):
------------------------------------
.. code-block:: bash
- $ python -m capella_ros_tools -i capella <CAPELLA_MODEL_PATH> -l <LAYER> -o messages <MSG_PATH> --port=<PORT>
+ $ python -m capella_ros_tools export <CAPELLA_MODEL_PATH> <LAYER> <MSG_PATH>
-* "-i capella <CAPELLA_MODEL_PATH>", import Capella model from <CAPELLA_MODEL_PATH>
-* "-l <LAYER>", use Capella model layer <LAYER>
-* "-o messages <MSG_PATH>", export ROS2 messages to <MSG_PATH>
-* "--port=<PORT>", start Capella model server at <PORT>
+* "<CAPELLA_MODEL_PATH>", import Capella model from <CAPELLA_MODEL_PATH>
+* "<LAYER>", use Capella model layer <LAYER>
+* "<MSG_PATH>", export ROS2 messages to <MSG_PATH>
diff --git a/git-conventional-commits.json b/git-conventional-commits.json
deleted file mode 100644
index 525cbf0..0000000
--- a/git-conventional-commits.json
+++ /dev/null
@@ -1,18 +0,0 @@
-{
- "convention" : {
- "commitTypes": [
- "build",
- "chore",
- "ci",
- "docs",
- "feat",
- "fix",
- "merge",
- "perf",
- "refactor",
- "revert",
- "test"
- ],
- "commitScopes": []
- }
-}
diff --git a/git-conventional-commits.json.license b/git-conventional-commits.json.license
deleted file mode 100644
index b689e74..0000000
--- a/git-conventional-commits.json.license
+++ /dev/null
@@ -1,2 +0,0 @@
-Copyright DB InfraGO AG and contributors
-SPDX-License-Identifier: CC0-1.0
diff --git a/pyproject.toml b/pyproject.toml
index a93c430..3e6909a 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -30,6 +30,7 @@ classifiers = [
dependencies = [
"click",
"capellambse @ git+https://github.com/DSD-DBS/py-capellambse.git@more-filepath-props",
+ "capellambse_context_diagrams @ git+https://github.com/DSD-DBS/capellambse-context-diagrams.git@add-context-to-tree",
"fastapi",
"uvicorn[standard]",
]
@@ -44,10 +45,10 @@ docs = [
"ipython",
"nbsphinx",
"sphinx-copybutton",
- "tomli; python_version<'3.11'",
+ "tomli",
"jinja2",
- "pyyaml>=6.0",
- "sphinx!=7.2.0,!=7.2.1,!=7.2.2",
+ "pyyaml",
+ "sphinx",
"sphinx-argparse-cli",
]