
Commit

last min fixes
amitkparekh committed Dec 5, 2023
1 parent 9e963f8 commit bfb4e93
Showing 3 changed files with 5 additions and 11 deletions.
README.md (1 change: 0 additions & 1 deletion)
@@ -41,7 +41,6 @@ This repo holds the object detector and feature extractor for running things.
> [!IMPORTANT]
> If you have questions or find bugs or anything, you can contact us in our [organisation's discussion](https://github.com/orgs/emma-heriot-watt/discussions).

## Writing code and running things

### Run the server for the [Alexa Arena](https://github.com/amazon-science/alexa-arena)
pyproject.toml (5 changes: 0 additions & 5 deletions)
@@ -39,11 +39,6 @@ cmd = "pytest -v --junitxml=pytest.xml --cov=src -m 'not slow and not multiproce
help = "Update torch to use the latest CUDA version"
shell = "python scripts/update_torch_cuda.py"

-[[tool.poe.tasks.postinstall]]
-help = 'Update torch and install maskrcnn-benchmark and scene-graph-benchmark'
-shell = """
-pip install git+https://github.com/emma-simbot/scene_graph_benchmark
-"""

[tool.poetry.dependencies]
python = ">=3.9,<3.10"
src/emma_perception/models/simbot_entity_classifier.py (10 changes: 5 additions & 5 deletions)
@@ -177,12 +177,12 @@ def __init__(

def make_layers(self) -> Sequential:
"""Make a simple 2 layer MLP."""
-layers = []
+layers: list[torch.nn.Module] = []

-layers.append(Linear(self._in_features, self._hidden_dim)) # type: ignore[arg-type]
-layers.append(BatchNorm1d(self._hidden_dim)) # type: ignore[arg-type]
-layers.append(Dropout(self._dropout)) # type: ignore[arg-type]
-layers.append(Linear(self._hidden_dim, self._num_classes)) # type: ignore[arg-type]
+layers.append(Linear(self._in_features, self._hidden_dim))
+layers.append(BatchNorm1d(self._hidden_dim))
+layers.append(Dropout(self._dropout))
+layers.append(Linear(self._hidden_dim, self._num_classes))

return Sequential(*layers)

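For context, the hunk above only adds a type annotation: declaring the accumulator as list[torch.nn.Module] lets mypy infer the element type passed to Sequential, so the per-line type: ignore[arg-type] comments can be dropped. A minimal, self-contained sketch of the same pattern (parameter names and sizes here are illustrative, not taken from the repo):

import torch
from torch.nn import BatchNorm1d, Dropout, Linear, Sequential


def make_mlp(in_features: int, hidden_dim: int, num_classes: int, dropout: float = 0.3) -> Sequential:
    """Build a simple 2-layer MLP, mirroring make_layers above."""
    # Annotating the accumulator list up front avoids per-append ignores.
    layers: list[torch.nn.Module] = []
    layers.append(Linear(in_features, hidden_dim))
    layers.append(BatchNorm1d(hidden_dim))
    layers.append(Dropout(dropout))
    layers.append(Linear(hidden_dim, num_classes))
    return Sequential(*layers)


if __name__ == "__main__":
    mlp = make_mlp(in_features=2048, hidden_dim=512, num_classes=10)
    print(mlp(torch.randn(4, 2048)).shape)  # torch.Size([4, 10])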
