

APT29 Emulation Plan includes multiple scenarios + APT3 Techniques #9

L015H4CK opened this issue Nov 14, 2024 · 2 comments

@L015H4CK

Hello everyone,

I just found this repository after working with CTID's adversary emulation library for a few years.
Is this the new, maintained version of the library?

However, I noticed the APT29 Emulation Plan still contains all APT29 scenarios combined (i.e., Day1.A, Day1.B, and Day2) and even some techniques from APT3. My issue from almost two years ago goes into much greater detail. I also created a pull request to split the faulty APT29 emulation plan into four separate plans.

As before, I am happy to answer any questions regarding the problem at hand and my PR.
Also, I can take some time and create a new PR for this repository if you think it might be merged in the future.

Best regards,
Louis

@m3mike

m3mike commented Nov 19, 2024

Hi @L015H4CK!

Thanks for your notes! The emulation plans used for ATT&CK Evaluations have been migrated to this repository and will be hosted here from now on.

The Center for Threat Informed Defense (CTID) may still post additional emulation plans under the CTID organization, but ATT&CK Evaluations projects will be exclusively hosted here.

This repository is intended to contain the publicly available versions of the emulation plans used during the evaluation itself. Participants configure their products to detect and protect the “victim” organization, and then the evaluation team executes the scenario.

During the execution of an ATT&CK Evaluation, we assess participants based on the specific emulation plan they developed. For instance, during the APT29 evaluation, Day1.A and Day1.B were executed on the first day of the evaluation week.

While discrepancies may occur, we strive to publish the actual emulation plan that participants were evaluated against. This consistency ensures that the emulation plan in this repository provides context to the results. If someone wishes to delve deeper into the published results to understand the specific code, script, or tool used to evaluate a particular step, that is possible.

However, if we were to modify scenarios to split them out as suggested in your PR, it could lead to inconsistencies, making it unclear what a participant was evaluated against.

I am open to updates on this repository, but this is the challenge I face: ensuring that the published results from an evaluation align with the published emulation plan.

I am open to suggestions that preserve this linkage, but I am unsure how to achieve that with your changes. Does this make sense, and do you have any thoughts on the process?

Apologies for the length, but we genuinely appreciate the effort you put into identifying and submitting a PR, so I wanted to clarify the challenges we face.

@L015H4CK

Hello @m3mike, thanks a lot for your detailed response!

I fully understand and support the decision to keep the emulation plans in this repository as close as possible to the ones used during the evaluation.

Regarding the APT29 emulation: I am still not 100% sure I understand the decision to combine the emulation plans of APT29 Day1 and APT29 Day2 into one emulation plan - but that is fine with me! If that is how the evaluation was performed (APT29 Day1.A, APT29 Day1.B, and APT29 Day2 at the same time), it should appear in this repository the same way. I just want to note that this might cause confusion, e.g., when using the Caldera evals plugin, which creates a single adversary profile covering all three scenarios.
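To illustrate the kind of split I have in mind, here is a minimal sketch. The profile fields, ability IDs, and scenario mapping below are hypothetical placeholders, not the real content of APT29.yaml:

```python
# Sketch: splitting one merged Caldera-style adversary profile into
# per-scenario profiles. All names below are illustrative assumptions.
merged = {
    "name": "APT29 (merged)",
    "atomic_ordering": ["ability-1", "ability-2", "ability-3", "ability-4"],
}

# Hypothetical mapping of ability IDs to the scenario each belongs to.
scenario_of = {
    "ability-1": "Day1.A",
    "ability-2": "Day1.A",
    "ability-3": "Day1.B",
    "ability-4": "Day2",
}

def split_profile(profile, mapping):
    """Return one profile per scenario, preserving ability order."""
    out = {}
    for ability in profile["atomic_ordering"]:
        scenario = mapping[ability]
        out.setdefault(scenario, {
            "name": f"APT29 {scenario}",
            "atomic_ordering": [],
        })["atomic_ordering"].append(ability)
    return out

profiles = split_profile(merged, scenario_of)
print(sorted(profiles))  # → ['Day1.A', 'Day1.B', 'Day2']
```

With a mapping like this in hand, each scenario could keep its own ordered ability list without touching the contents of any ability.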

Regarding the APT3 techniques: Just to give an example of the problem.
The complete APT29 emulation plan section for scenario 1 contains techniques that are not mentioned anywhere in the scenario description:

In more detail, the command ipconfig /all is not mentioned anywhere for APT29 (Scenario 1 or 2); instead, it comes from the APT3 techniques. There are several more techniques (almost all in Scenario 1, Step 2) that come not from APT29 but from APT3 (see the linked emulation plan, scenario description, and archived adversary profile). The emulation plan lists 17 steps for Scenario 1, Step 2, whereas the scenario description only mentions two (2.A and 2.B). Most of these steps also belong to the Discovery tactic rather than Collection and Exfiltration.
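One way to make this kind of mismatch visible is to count plan entries per tactic within a step. A minimal sketch, assuming each plan entry records a step label and a tactic (the field names and entries below are illustrative, not parsed from the actual APT29.yaml):

```python
from collections import Counter

# Illustrative entries for Scenario 1, Step 2; in practice these would
# come from parsing the emulation plan YAML. Field names are assumptions.
step_2_entries = [
    {"step": "2.A", "tactic": "collection"},
    {"step": "2.B", "tactic": "exfiltration"},
    {"step": "2.C", "tactic": "discovery"},
    {"step": "2.D", "tactic": "discovery"},
    {"step": "2.E", "tactic": "discovery"},
]

def tactic_counts(entries):
    """Count entries per tactic. A Discovery-heavy step would not match
    a scenario description that only lists Collection and Exfiltration."""
    return Counter(e["tactic"] for e in entries)

counts = tactic_counts(step_2_entries)
print(counts.most_common())  # Discovery dominating flags the mismatch
```

A check like this against the merged plan would show Discovery entries outnumbering the tactics the scenario description actually names.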

As I mentioned in the old issue:

In January 2021 the content of the above-mentioned repository was ported to this repository and the "old" form was archived. During this port, all adversary profiles were merged into one emulation plan - APT29.yaml. This plan now contains both scenarios for APT29 as well as the abilities for APT3.

If this is indeed how the ATT&CK evaluation was performed, I am more than happy to close this issue, but it seems to me that the error from 2021 was carried over through several repositories into this one and was never fixed.
Also, APT3 and APT29 were not evaluated at the same time; see the APT3 results from 2018 here and the APT29 results from 2020 here.

Best regards,
Louis
