Fix Unregistered type : mrc::pymrc::coro::BoostFibersMainPyAwaitable error #1869

Merged
2 changes: 1 addition & 1 deletion ci/scripts/common.sh
@@ -27,7 +27,7 @@ export PY_DIRS="${PY_ROOT} ci/scripts"
export BASE_SHA=${CHANGE_TARGET:-${BASE_SHA:-$(${SCRIPT_DIR}/gitutils.py get_merge_target)}}
export COMMIT_SHA=${GIT_COMMIT:-${COMMIT_SHA:-HEAD}}

-export CPP_FILE_REGEX='^(\.\/)?(examples|morpheus|tests)\/.*\.(cc|cpp|h|hpp)$'
+export CPP_FILE_REGEX='^(\.\/)?(examples|python|tests)\/.*\.(cc|cpp|h|hpp)$'
export PYTHON_FILE_REGEX='^(\.\/)?(?!\.|build|external).*\.(py|pyx|pxd)$'

# Use these options to skip any of the checks
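The regex change only swaps the directory group: C++ sources are now picked up under `python/` instead of the old top-level `morpheus/` tree. A quick standalone check, not part of the PR and using illustrative sample paths, of what the updated pattern matches:

```python
# Standalone check of the updated CPP_FILE_REGEX; sample paths are illustrative.
import re

CPP_FILE_REGEX = r"^(\.\/)?(examples|python|tests)\/.*\.(cc|cpp|h|hpp)$"

samples = [
    "python/morpheus/morpheus/_lib/stages/module.cpp",  # covered after the move
    "morpheus/_lib/stages/module.cpp",                   # old layout, no longer matched
    "./tests/test_file.cpp",                             # still matched
]

for path in samples:
    print(f"{path} -> {bool(re.match(CPP_FILE_REGEX, path))}")
```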
1 change: 1 addition & 0 deletions conda/environments/all_cuda-121_arch-x86_64.yaml
@@ -17,6 +17,7 @@ dependencies:
- benchmark=1.8.3
- boto3
- breathe=4.35.0
+- c-ares=1.32
- ccache
- clangdev=16
- click>=8
1 change: 1 addition & 0 deletions conda/environments/dev_cuda-121_arch-x86_64.yaml
@@ -14,6 +14,7 @@ dependencies:
- beautifulsoup4=4.12
- benchmark=1.8.3
- breathe=4.35.0
+- c-ares=1.32
- ccache
- clangdev=16
- click>=8
1 change: 1 addition & 0 deletions dependencies.yaml
@@ -233,6 +233,7 @@ dependencies:

# Non-Compiler Dependencies
- automake=1.16.5 # Needed for DOCA build
+- c-ares=1.32 # 1.33 causes an undefined symbol error
- ccache
- cmake=3.27
- cuda-cudart-dev=12.1
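Per the inline comment, the pin exists because c-ares 1.33 causes an undefined symbol error. A hedged sanity check, assuming `conda` is available on the PATH, that the solved environment stayed on the 1.32 series:

```python
# Hedged sanity check (not part of the PR): verify the resolved c-ares package
# stayed on the pinned 1.32 series. Assumes `conda` is available on PATH.
import json
import subprocess

out = subprocess.run(
    ["conda", "list", "--json", "c-ares"],
    capture_output=True, text=True, check=True,
).stdout

for pkg in json.loads(out):
    if pkg["name"] == "c-ares":
        print(pkg["name"], pkg["version"])
        if not pkg["version"].startswith("1.32"):
            raise SystemExit("c-ares drifted off the 1.32 pin")
```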
2 changes: 1 addition & 1 deletion docs/source/developer_guide/contributing.md
@@ -126,7 +126,7 @@ This workflow utilizes a Docker container to set up most dependencies ensuring a
```bash
./docker/run_container_dev.sh
```
-1. The container tag follows the same rules as `build_container_dev.sh` and will default to the current `YYMMDD`. Specify the desired tag with `DOCKER_IMAGE_TAG`. i.e. `DOCKER_IMAGE_TAG=my_tag ./docker/run_container_dev.sh`
+1. The container tag follows the same rules as `build_container_dev.sh` and will default to the current `YYMMDD`. Specify the desired tag with `DOCKER_IMAGE_TAG`. For example, `DOCKER_IMAGE_TAG=my_tag ./docker/run_container_dev.sh`
2. This will automatically mount the current working directory to `/workspace`.
3. Some of the validation tests require launching the Morpheus models Docker container within the Morpheus container. To enable this you will need to grant the Morpheus container access to your host OS's Docker socket file with:
```bash
1 change: 1 addition & 0 deletions python/morpheus/morpheus/_lib/stages/__init__.pyi
@@ -10,6 +10,7 @@ import morpheus._lib.stages
import typing
from morpheus._lib.common import FilterSource
import morpheus._lib.common
+import mrc.core.coro
import mrc.core.segment
import os

3 changes: 3 additions & 0 deletions python/morpheus/morpheus/_lib/stages/module.cpp
@@ -73,6 +73,9 @@ PYBIND11_MODULE(stages, _module)
// Make sure to load mrc.core.segment to get ObjectProperties
mrc::pymrc::import(_module, "mrc.core.segment");

+// Import the mrc coro module
+mrc::pymrc::import(_module, "mrc.core.coro");

mrc::pymrc::from_import(_module, "morpheus._lib.common", "FilterSource");

py::class_<mrc::segment::Object<AddClassificationsStageMM>,
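This import is the heart of the fix: loading `mrc.core.coro` while `morpheus._lib.stages` initializes registers the awaitable type with pybind11, so code in this module that returns it no longer raises the "Unregistered type : mrc::pymrc::coro::BoostFibersMainPyAwaitable" error from the PR title. A minimal Python-side sketch of the same idea, only relevant as a workaround on builds that predate this change:

```python
# Workaround sketch for builds without this fix: importing mrc.core.coro first
# registers mrc::pymrc::coro::BoostFibersMainPyAwaitable with pybind11, which
# the C++ change above now does automatically when the stages module loads.
import mrc.core.coro  # noqa: F401  (imported for its type-registration side effect)
import morpheus._lib.stages  # awaitable-returning calls no longer hit the error
```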
14 changes: 7 additions & 7 deletions scripts/validation/val-run-pipeline.sh
@@ -41,7 +41,7 @@ function run_pipeline_sid_minibert(){
pipeline-nlp --model_seq_length=256 \
from-file --filename=${INPUT_FILE} \
deserialize \
-preprocess --vocab_hash_file=${MORPHEUS_ROOT}/morpheus/data/bert-base-uncased-hash.txt --truncation=True --do_lower_case=True --add_special_tokens=False \
+preprocess --vocab_hash_file=${MORPHEUS_ROOT}/python/morpheus/morpheus/data/bert-base-uncased-hash.txt --truncation=True --do_lower_case=True --add_special_tokens=False \
${INFERENCE_STAGE} \
monitor --description "Inference Rate" --smoothing=0.001 --unit inf \
add-class --prefix="si_" \
@@ -62,7 +62,7 @@ function run_pipeline_sid_bert(){
pipeline-nlp --model_seq_length=256 \
from-file --filename=${INPUT_FILE} \
deserialize \
-preprocess --vocab_hash_file=${MORPHEUS_ROOT}/morpheus/data/bert-base-cased-hash.txt --truncation=True --do_lower_case=False --add_special_tokens=False \
+preprocess --vocab_hash_file=${MORPHEUS_ROOT}/python/morpheus/morpheus/data/bert-base-cased-hash.txt --truncation=True --do_lower_case=False --add_special_tokens=False \
${INFERENCE_STAGE} \
monitor --description "Inference Rate" --smoothing=0.001 --unit inf \
add-class --prefix="si_" \
@@ -80,7 +80,7 @@ function run_pipeline_abp_nvsmi(){
VAL_OUTPUT=$5

morpheus --log_level=DEBUG run --num_threads=$(nproc) --pipeline_batch_size=1024 --model_max_batch_size=1024 --use_cpp=${USE_CPP} \
-pipeline-fil --columns_file=${MORPHEUS_ROOT}/morpheus/data/columns_fil.txt \
+pipeline-fil --columns_file=${MORPHEUS_ROOT}/python/morpheus/morpheus/data/columns_fil.txt \
from-file --filename=${INPUT_FILE} \
deserialize \
preprocess \
@@ -101,10 +101,10 @@ function run_pipeline_phishing_email(){
VAL_OUTPUT=$5

morpheus --log_level=DEBUG run --num_threads=$(nproc) --pipeline_batch_size=1024 --model_max_batch_size=32 --use_cpp=${USE_CPP} \
-pipeline-nlp --model_seq_length=128 --labels_file=${MORPHEUS_ROOT}/morpheus/data/labels_phishing.txt \
+pipeline-nlp --model_seq_length=128 --labels_file=${MORPHEUS_ROOT}/python/morpheus/morpheus/data/labels_phishing.txt \
from-file --filename=${INPUT_FILE} \
deserialize \
-preprocess --vocab_hash_file=${MORPHEUS_ROOT}/morpheus/data/bert-base-uncased-hash.txt --truncation=True --do_lower_case=True --add_special_tokens=False \
+preprocess --vocab_hash_file=${MORPHEUS_ROOT}/python/morpheus/morpheus/data/bert-base-uncased-hash.txt --truncation=True --do_lower_case=True --add_special_tokens=False \
${INFERENCE_STAGE} \
monitor --description "Inference Rate" --smoothing=0.001 --unit inf \
add-class --label=is_phishing --threshold=0.7 \
@@ -122,7 +122,7 @@ function run_pipeline_hammah_user123(){
VAL_OUTPUT=$5

morpheus --log_level=DEBUG run --num_threads=$(nproc) --pipeline_batch_size=1024 --model_max_batch_size=1024 --use_cpp=${USE_CPP} \
-pipeline-ae --columns_file="${MORPHEUS_ROOT}/morpheus/data/columns_ae_cloudtrail.txt" --userid_filter="user123" --userid_column_name="userIdentitysessionContextsessionIssueruserName" --timestamp_column_name="event_dt" \
+pipeline-ae --columns_file="${MORPHEUS_ROOT}/python/morpheus/morpheus/data/columns_ae_cloudtrail.txt" --userid_filter="user123" --userid_column_name="userIdentitysessionContextsessionIssueruserName" --timestamp_column_name="event_dt" \
from-cloudtrail --input_glob="${MORPHEUS_ROOT}/models/datasets/validation-data/dfp-cloudtrail-*-input.csv" \
train-ae --train_data_glob="${MORPHEUS_ROOT}/models/datasets/training-data/dfp-cloudtrail-*.csv" --source_stage_class=morpheus.stages.input.cloud_trail_source_stage.CloudTrailSourceStage --seed 42 \
preprocess \
@@ -144,7 +144,7 @@ function run_pipeline_hammah_role-g(){
VAL_OUTPUT=$5

morpheus --log_level=DEBUG run --num_threads=$(nproc) --pipeline_batch_size=1024 --model_max_batch_size=1024 --use_cpp=${USE_CPP} \
-pipeline-ae --columns_file="${MORPHEUS_ROOT}/morpheus/data/columns_ae_cloudtrail.txt" --userid_filter="role-g" --userid_column_name="userIdentitysessionContextsessionIssueruserName" --timestamp_column_name="event_dt" \
+pipeline-ae --columns_file="${MORPHEUS_ROOT}/python/morpheus/morpheus/data/columns_ae_cloudtrail.txt" --userid_filter="role-g" --userid_column_name="userIdentitysessionContextsessionIssueruserName" --timestamp_column_name="event_dt" \
from-cloudtrail --input_glob="${MORPHEUS_ROOT}/models/datasets/validation-data/dfp-cloudtrail-*-input.csv" \
train-ae --train_data_glob="${MORPHEUS_ROOT}/models/datasets/training-data/dfp-cloudtrail-*.csv" --source_stage_class=morpheus.stages.input.cloud_trail_source_stage.CloudTrailSourceStage --seed 42 \
preprocess \
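All seven path updates in this script follow the same repository restructure: the packaged data files moved from `morpheus/data` to `python/morpheus/morpheus/data`. An illustrative check, not part of the PR, that the relocated files resolve, mirroring the `MORPHEUS_ROOT` variable the script relies on:

```python
# Illustrative check that the data files referenced by val-run-pipeline.sh
# resolve under the new layout. MORPHEUS_ROOT mirrors the script's variable.
import os
from pathlib import Path

data_dir = Path(os.environ.get("MORPHEUS_ROOT", ".")) / "python/morpheus/morpheus/data"

for name in (
    "bert-base-uncased-hash.txt",
    "bert-base-cased-hash.txt",
    "columns_fil.txt",
    "labels_phishing.txt",
    "columns_ae_cloudtrail.txt",
):
    path = data_dir / name
    print(f"{path}: {'ok' if path.exists() else 'missing'}")
```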