DM-47764: App metrics unit testing helpers
Add a `MockEventManager` that creates `MockPublishers` which are no-op
publishers that record all published payloads. This is returned from the
config factory function when `enabled` is `False` and a new config key,
`mock`, is `True`.

Payloads are recorded to a list with mixed-in assertion helpers.
fajpunk committed Dec 5, 2024
1 parent e927d13 commit 9b7d194
Showing 8 changed files with 828 additions and 5 deletions.
2 changes: 2 additions & 0 deletions docs/documenteer.toml
@@ -28,6 +28,7 @@ nitpick_ignore = [
# autodoc_pydantic generates some spurious links that can't be resolved.
['py:class', 'unittest.mock.Base'],
['py:class', 'unittest.mock.CallableMixin'],
['py:class', 'unittest.mock.MagicMixin'],
["py:obj", "ComputedFieldInfo"],
["py:class", "lambda"],
# arq doesn't provide documentation for all of its types.
@@ -53,6 +54,7 @@ nitpick_ignore = [
["py:obj", "safir.database._pagination.E"],
["py:obj", "safir.redis._storage.S"],
["py:obj", "safir.metrics._event_manager.P"],
["py:obj", "safir.metrics._testing.P"],
# SQLAlchemy DeclarativeBase documentation has references that Sphinx
# can't resolve properly.
["py:class", "sqlalchemy.inspection.Inspectable"],
185 changes: 184 additions & 1 deletion docs/user-guide/metrics/index.rst
@@ -58,6 +58,7 @@ The Kafka and schema manager values come from the Sasquatch configuration that y
.. code-block:: shell
METRICS_APPLICATION=myapp
METRICS_EVENTS_TOPIC_PREFIX=what.ever
KAFKA_SECURITY_PROTOCOL=SSL
KAFKA_BOOTSTRAP_SERVERS=sasquatch.kafka-1:9092,sasquatch.kafka-2:9092
KAFKA_CLUSTER_CA_PATH=/some/path/ca.crt
@@ -162,6 +163,8 @@ Initialize

Then, in a `FastAPI lifespan`_ function, we'll create a `safir.metrics.EventManager` and initialize our ``events_dependency`` with it.
We need to do this in a lifespan function because it must happen only once for the whole application, not once per request.
In this example, we use a factory function that returns our app instead of storing the app in a module variable.
This will let us configure the environment in our unit tests before we initialize our app so that we can use the `safir.metrics.MockEventManager`, as described later in the :ref:`unit-tests` section.
In more complex apps, this would probably use the ProcessContext_ pattern.

.. code-block:: python
@@ -187,7 +190,8 @@
await event_manager.aclose()
app = FastAPI(lifespan=lifespan)
def create_app() -> FastAPI:
return FastAPI(lifespan=lifespan)
.. _FastAPI lifespan: https://fastapi.tiangolo.com/advanced/events/#lifespan
.. _ProcessContext: https://sqr-072.lsst.io/#process-context
@@ -232,6 +236,183 @@ But the principle remains the same:
.. _RequestContext: https://sqr-072.lsst.io/#request-context
.. _Service: https://sqr-072.lsst.io/#services


.. _unit-tests:

Unit testing
============

Setting ``enabled`` to ``False`` and ``mock`` to ``True`` in your metrics configuration will give you a `safir.metrics.MockEventManager`.
This is a no-op event manager that produces publishers that record all of the events that they publish.
You can make assertions about these published events in your unit tests.

.. warning::

Do not use the `safir.metrics.MockEventManager` in any deployed instance of your application.
Recorded events are never cleaned up, and memory usage will grow unbounded.

.. code-block:: shell
METRICS_APPLICATION=myapp
METRICS_ENABLED=false
METRICS_MOCK=true
METRICS_EVENTS_TOPIC_PREFIX=what.ever
.. code-block:: python
from pydantic import ConfigDict

from safir.metrics import EventPayload, metrics_configuration_factory

config = metrics_configuration_factory()
manager = config.make_manager()


class SomeEvent(EventPayload):
    model_config = ConfigDict(ser_json_timedelta="float")

    foo: str
    count: int
    duration: float | None


await manager.initialize()
pub = await manager.create_publisher("someevent", SomeEvent)

await pub.publish(SomeEvent(foo="foo1", count=1, duration=1.234))
await pub.publish(SomeEvent(foo="foo2", count=2, duration=2.345))
await pub.publish(SomeEvent(foo="foo3", count=3, duration=3.456))
await pub.publish(SomeEvent(foo="foo4", count=4, duration=None))
await pub.publish(SomeEvent(foo="foo5", count=5, duration=5.678))

await manager.aclose()

published = pub.published
A mock publisher has a `safir.metrics.MockPublisher.published` attribute, which is a `safir.metrics.PublishedList` containing all of the `safir.metrics.EventPayload` instances published by that publisher.
A `safir.metrics.PublishedList` is a regular Python list with some mixed-in assertion methods.
All of these assertion methods take a list of dicts and compare them against the ``model_dump(mode="json")`` serialization of the published ``EventPayload`` instances.
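As an illustration of that comparison model (a hypothetical sketch, not Safir's actual implementation), the assertion helpers reduce to comparing a list of recorded payload dicts against a list of expected dicts:

```python
from typing import Any


class SketchPublishedList(list):
    """Hypothetical stand-in for a PublishedList: a plain list of
    payload dicts (the model_dump(mode="json") serializations) with an
    assertion helper mixed in."""

    def assert_published_all(self, expected: list[dict[str, Any]]) -> None:
        # Compare the expected dicts against every recorded payload, in order.
        if list(self) != expected:
            raise AssertionError(
                f"published {list(self)!r}, expected {expected!r}"
            )


published = SketchPublishedList(
    [
        {"foo": "foo1", "count": 1, "duration": 1.234},
        {"foo": "foo2", "count": 2, "duration": None},
    ]
)

# Passes: the expected dicts match the recorded payloads exactly.
published.assert_published_all(
    [
        {"foo": "foo1", "count": 1, "duration": 1.234},
        {"foo": "foo2", "count": 2, "duration": None},
    ]
)
```

Because the comparison happens on serialized dicts, the expected values are written in JSON-friendly types (floats, strings, `None`) rather than as model instances.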

``assert_published``
--------------------

Use `safir.metrics.PublishedList.assert_published` to assert that the given payloads appear in the published payloads, in order and with no other events in between.
If they do not, an exception (a subclass of `AssertionError`) is raised.
Other events may have been published before or after the expected payloads.

.. code-block:: python
pub.assert_published(
[
{"foo": "foo1", "count": 1, "duration": 1.234},
{"foo": "foo2", "count": 2, "duration": 2.345},
{"foo": "foo3", "count": 3, "duration": 3.456},
]
)
You can also assert that all of the expected payloads were published in any order, possibly with other events in between:

.. code-block:: python
pub.assert_published(
[
{"foo": "foo1", "count": 1, "duration": 1.234},
{"foo": "foo3", "count": 3, "duration": 3.456},
{"foo": "foo2", "count": 2, "duration": 2.345},
],
any_order=True,
)
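The two matching modes can be sketched in plain Python (an illustration of the semantics described above, not Safir's actual code): the default requires the expected payloads to appear as one consecutive run, while ``any_order=True`` only requires that each expected payload was published somewhere.

```python
from typing import Any


def is_consecutive_run(
    expected: list[dict[str, Any]], published: list[dict[str, Any]]
) -> bool:
    """Default mode: expected appears in order with no events in between."""
    n = len(expected)
    return any(
        published[i : i + n] == expected
        for i in range(len(published) - n + 1)
    )


def is_subset_any_order(
    expected: list[dict[str, Any]], published: list[dict[str, Any]]
) -> bool:
    """any_order=True mode: every expected payload was published."""
    remaining = list(published)
    for payload in expected:
        if payload not in remaining:
            return False
        remaining.remove(payload)  # consume matches so duplicates count
    return True


events = [{"n": 1}, {"n": 2}, {"n": 3}]
assert is_consecutive_run([{"n": 2}, {"n": 3}], events)
assert not is_consecutive_run([{"n": 1}, {"n": 3}], events)  # gap in between
assert is_subset_any_order([{"n": 3}, {"n": 1}], events)
```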
``assert_published_all``
------------------------

Use `safir.metrics.PublishedList.assert_published_all` to assert that the expected payloads, and only the expected payloads, were published:

.. code-block:: python
pub.assert_published_all(
[
{"foo": "foo1", "count": 1, "duration": 1.234},
{"foo": "foo2", "count": 2, "duration": 2.345},
{"foo": "foo3", "count": 3, "duration": 3.456},
{"foo": "foo4", "count": 4, "duration": None},
{"foo": "foo5", "count": 5, "duration": 5.678},
],
)
This would raise an exception because it is missing the ``foo5`` event:

.. code-block:: python
pub.assert_published_all(
[
{"foo": "foo1", "count": 1, "duration": 1.234},
{"foo": "foo2", "count": 2, "duration": 2.345},
{"foo": "foo3", "count": 3, "duration": 3.456},
{"foo": "foo4", "count": 4, "duration": None},
],
)
You can use ``any_order`` here too:

.. code-block:: python
pub.assert_published_all(
[
{"foo": "foo2", "count": 2, "duration": 2.345},
{"foo": "foo5", "count": 5, "duration": 5.678},
{"foo": "foo3", "count": 3, "duration": 3.456},
{"foo": "foo1", "count": 1, "duration": 1.234},
{"foo": "foo4", "count": 4, "duration": None},
],
any_order=True,
)
``ANY`` and ``NOT_NONE``
------------------------

You can use `safir.metrics.ANY` to indicate that any value, even `None`, is OK.
This is just a re-export of `unittest.mock.ANY`.

.. code-block:: python
from safir.metrics import ANY
pub.assert_published_all(
[
{"foo": "foo3", "count": 3, "duration": ANY},
{"foo": "foo4", "count": 4, "duration": ANY},
],
)
You can use `safir.metrics.NOT_NONE` to indicate that any value except `None` is OK:

.. code-block:: python
from safir.metrics import ANY, NOT_NONE
pub.assert_published_all(
[
{"foo": "foo3", "count": 3, "duration": NOT_NONE},
{"foo": "foo4", "count": 4, "duration": ANY},
],
)
This would raise an exception, because ``duration`` for the ``foo4`` payload is `None`:

.. code-block:: python
from safir.metrics import NOT_NONE
pub.assert_published_all(
[
{"foo": "foo3", "count": 3, "duration": NOT_NONE},
{"foo": "foo4", "count": 4, "duration": NOT_NONE},
],
)
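`unittest.mock.ANY` works by defining ``__eq__`` to return `True` for every comparison; a ``NOT_NONE``-style sentinel can be sketched the same way (a hypothetical illustration, not Safir's actual implementation):

```python
class _NotNoneSketch:
    """Sentinel that compares equal to any value except None."""

    def __eq__(self, other: object) -> bool:
        return other is not None

    def __repr__(self) -> str:
        return "<NOT_NONE>"


NOT_NONE_SKETCH = _NotNoneSketch()

# Dict equality compares values pairwise, so the sentinel matches per field:
assert {"duration": 1.5} == {"duration": NOT_NONE_SKETCH}
assert {"duration": None} != {"duration": NOT_NONE_SKETCH}
```

Python falls back to the sentinel's ``__eq__`` when the left-hand value's own ``__eq__`` returns `NotImplemented`, which is why the sentinel can sit on either side of the comparison.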
.. _configuration-details:

Configuration details
@@ -322,3 +503,5 @@ If your app uses Kafka for things other than metrics publishing (maybe it's a Fa
)
.. _FastStream: https://faststream.airt.ai


24 changes: 24 additions & 0 deletions safir/src/safir/metrics/__init__.py
@@ -4,13 +4,16 @@
EventsConfiguration,
KafkaMetricsConfiguration,
MetricsConfiguration,
MockMetricsConfiguration,
metrics_configuration_factory,
)
from ._event_manager import (
EventManager,
EventPublisher,
KafkaEventManager,
KafkaEventPublisher,
MockEventManager,
MockPublisher,
NoopEventManager,
NoopEventPublisher,
)
@@ -20,8 +23,20 @@
KafkaTopicError,
)
from ._models import EventMetadata, EventPayload
from ._testing import (
ANY,
NOT_NONE,
BaseAssertionError,
NotPublishedConsecutivelyError,
NotPublishedError,
PublishedList,
PublishedNumberError,
PublishedTooFewError,
)

__all__ = [
"ANY",
"BaseAssertionError",
"BaseMetricsConfiguration",
"DisabledMetricsConfiguration",
"DuplicateEventError",
@@ -37,7 +52,16 @@
"KafkaTopicError",
"MetricsConfiguration",
"MockEventManager",
"MockMetricsConfiguration",
"MockPublisher",
"NOT_NONE",
"NoopEventManager",
"NoopEventPublisher",
"NotPublishedConsecutivelyError",
"NotPublishedError",
"PublishedList",
"PublishedNumberError",
"PublishedTooFewError",
"metrics_configuration_factory",
]
58 changes: 54 additions & 4 deletions safir/src/safir/metrics/_config.py
@@ -15,14 +15,20 @@

from ..kafka import KafkaConnectionSettings, SchemaManagerSettings
from ._constants import ADMIN_CLIENT_PREFIX, BROKER_PREFIX
from ._event_manager import EventManager, KafkaEventManager, NoopEventManager
from ._event_manager import (
EventManager,
KafkaEventManager,
MockEventManager,
NoopEventManager,
)

__all__ = [
"BaseMetricsConfiguration",
"DisabledMetricsConfiguration",
"EventsConfiguration",
"KafkaMetricsConfiguration",
"MetricsConfiguration",
"MockMetricsConfiguration",
"metrics_configuration_factory",
]

@@ -131,6 +137,45 @@ def make_manager(
)


class MockMetricsConfiguration(BaseMetricsConfiguration):
    """Metrics configuration when metrics publishing is mocked."""

    enabled: Annotated[
        bool, AfterValidator(lambda x: _require_bool(x, False))
    ] = Field(
        ...,
        title="Whether to send events",
        description=(
            "If set to false, no events will be sent and all calls to publish"
            " events will be no-ops."
        ),
        validation_alias=AliasChoices("enabled", "METRICS_ENABLED"),
    )

    mock: Annotated[bool, AfterValidator(lambda x: _require_bool(x, True))] = (
        Field(
            title="Mock publishers",
            description=(
                "If set to true, all event publishers will be"
                " unittest.mock.MagicMock instances which will record all"
                " calls to their publish methods."
            ),
            validation_alias=AliasChoices("mock", "METRICS_MOCK"),
        )
    )

    model_config = SettingsConfigDict(extra="ignore", populate_by_name=True)

    def make_manager(
        self, logger: BoundLogger | None = None
    ) -> MockEventManager:
        if not logger:
            logger = structlog.get_logger("safir.metrics")
        return MockEventManager(
            self.application, self.events.topic_prefix, logger
        )


class KafkaMetricsConfiguration(BaseMetricsConfiguration):
"""Metrics configuration when enabled, including Kafka configuration."""

@@ -204,7 +249,9 @@ def make_manager(


MetricsConfiguration: TypeAlias = (
    MockMetricsConfiguration
    | DisabledMetricsConfiguration
    | KafkaMetricsConfiguration
)
"""Type to use for metrics configuration in the application config.
@@ -262,6 +309,9 @@ class Config(BaseSettings):
# environment variable settings to enable, and then finally
# unconditionally try to return the default.
try:
    return MockMetricsConfiguration()
except ValidationError:
    try:
        return DisabledMetricsConfiguration()
    except ValidationError:
        return KafkaMetricsConfiguration()
