From 113b235418ffd985ddb75cdf75e934d3d2b90dca Mon Sep 17 00:00:00 2001
From: David Gardner <96306125+dagardner-nv@users.noreply.github.com>
Date: Tue, 29 Oct 2024 09:00:34 -0700
Subject: [PATCH] Fix misleading deserialize stage comments (#2009)

* The deserialize stage no longer actually deserializes; instead it chunks
  incoming `MessageMeta` objects and converts them to `ControlMessage` objects.

Closes #2007

## By Submitting this PR I confirm:
- I am familiar with the [Contributing Guidelines](https://github.com/nv-morpheus/Morpheus/blob/main/docs/source/developer_guide/contributing.md).
- When the PR is ready for review, new or existing tests cover these changes.
- When the PR is ready for review, the documentation is up to date with these changes.

Authors:
  - David Gardner (https://github.com/dagardner-nv)

Approvers:
  - Yuchen Zhang (https://github.com/yczhang-nv)

URL: https://github.com/nv-morpheus/Morpheus/pull/2009
---
 examples/abp_nvsmi_detection/README.md | 2 +-
 examples/nlp_si_detection/README.md    | 2 +-
 examples/root_cause_analysis/README.md | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/examples/abp_nvsmi_detection/README.md b/examples/abp_nvsmi_detection/README.md
index b29ad6bb84..7e0c796630 100644
--- a/examples/abp_nvsmi_detection/README.md
+++ b/examples/abp_nvsmi_detection/README.md
@@ -127,7 +127,7 @@ morpheus --log_level=DEBUG \
 pipeline-fil --columns_file=data/columns_fil.txt \
 `# 1st Stage: Read from file` \
 from-file --filename=examples/data/nvsmi.jsonlines \
-`# 2nd Stage: Deserialize from JSON strings to objects` \
+`# 2nd Stage: Deserialize batch DataFrame into ControlMessages` \
 deserialize \
 `# 3rd Stage: Preprocessing converts the input data into BERT tokens` \
 preprocess \
diff --git a/examples/nlp_si_detection/README.md b/examples/nlp_si_detection/README.md
index 1d24fea105..507d64e862 100644
--- a/examples/nlp_si_detection/README.md
+++ b/examples/nlp_si_detection/README.md
@@ -117,7 +117,7 @@ morpheus --log_level=DEBUG \
 pipeline-nlp --model_seq_length=256 \
 `# 1st Stage: Read from file` \
 from-file --filename=examples/data/pcap_dump.jsonlines \
-`# 2nd Stage: Deserialize from JSON strings to objects` \
+`# 2nd Stage: Deserialize batch DataFrame into ControlMessages` \
 deserialize \
 `# 3rd Stage: Preprocessing converts the input data into BERT tokens` \
 preprocess --vocab_hash_file=data/bert-base-uncased-hash.txt --do_lower_case=True --truncation=True \
diff --git a/examples/root_cause_analysis/README.md b/examples/root_cause_analysis/README.md
index 943c00fad2..c68c3bdee4 100644
--- a/examples/root_cause_analysis/README.md
+++ b/examples/root_cause_analysis/README.md
@@ -111,7 +111,7 @@ run --num_threads=8 --edge_buffer_size=4 --pipeline_batch_size=1024 --model_max_
 pipeline-nlp --model_seq_length=128 --label=not_root_cause --label=is_root_cause \
 `# 1st Stage: Read from file` \
 from-file --filename=${MORPHEUS_ROOT}/models/datasets/validation-data/root-cause-validation-data-input.jsonlines \
-`# 2nd Stage: Deserialize from JSON strings to objects` \
+`# 2nd Stage: Deserialize batch DataFrame into ControlMessages` \
 deserialize \
 `# 3rd Stage: Preprocessing converts the input data into BERT tokens` \
 preprocess --column=log --vocab_hash_file=./data/bert-base-uncased-hash.txt --truncation=True --do_lower_case=True --add_special_tokens=False \
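The behavior the commit message describes (the stage slices an incoming batch into fixed-size chunks rather than parsing JSON) can be sketched in plain Python. This is an illustrative stand-in only: `chunk_rows` is a hypothetical helper, and plain lists substitute for the `MessageMeta`/`ControlMessage` objects the real stage handles.

```python
# Hypothetical sketch of the chunking behavior described in this PR.
# Plain lists stand in for MessageMeta payloads; this is NOT Morpheus API.

def chunk_rows(rows, batch_size):
    """Slice an incoming batch into consecutive chunks of at most batch_size rows."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

# Ten rows with a batch size of 4 yield chunks of 4, 4, and 2 rows;
# in the real stage each chunk would be wrapped as a ControlMessage.
chunks = chunk_rows(list(range(10)), 4)
print(chunks)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

This is why the new comment reads "Deserialize batch DataFrame into ControlMessages": the stage's job is batching and wrapping, not JSON parsing.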