Development Home
To perform development activities, developers must:
- Clone the CxAnalytix repository
- Install the .Net Core SDK
Developers can use an IDE such as VSCode or Visual Studio 2019 to perform development activities. Each IDE will provide a method to build the code, but a build script has been provided for performing builds if an IDE build is not desired.
At the root of the repository is the `build.ps1` PowerShell script. This is an ideal method of building if you are mainly interested in getting it running quickly. Executing `build.ps1` will place the output artifacts under `.\artifacts`. If the .Net Core SDK is not installed on the machine, the script will attempt to build using Docker.
Please see the build script documentation for requirements and advanced build options.
It is possible to download the installer for the .Net Core SDK, or use the Chocolatey package manager to perform the install:
```
choco install dotnetcore-sdk
```
Note: you may need to reboot after the install.
Microsoft provides detailed documentation explaining how to install the .Net Core runtime and SDK on various Linux platforms.
A basic regression testing tool can be found in the `RegressionTester` project. The idea is to compare the data output of a previous version with the data output of the program including new code changes. In theory, crawling the same instance of SAST with a previous version and a modified version should produce similar data sets. If new fields are added as part of the modifications, these are detected when possible; their presence does not necessarily indicate a problem.
The tool assumes that log output is used and that the log filenames match those specified in the default log4net configuration.
Some of the basic regression tests:
- Check for duplicate records within a data set.
- Detect missing or added fields between data sets.
- Verify that fields found in both data sets have identical values when those values should not change between crawls (e.g. the date a scan report is generated is ignored, since it will differ in each data set).
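The three checks above could be sketched roughly as follows. This is a minimal illustration in Python rather than the project's C#, and the record shape, key fields, and ignored-field names are hypothetical examples, not taken from the RegressionTester source:

```python
# Minimal sketch of the regression checks described above.
# Records are modeled as dicts; IGNORED_FIELDS lists fields expected
# to differ between crawls (the field name is a hypothetical example).

IGNORED_FIELDS = {"ReportCreateDate"}  # hypothetical ignored field


def find_duplicates(records, key_fields):
    """Check 1: duplicate records within a single data set."""
    seen, dups = set(), []
    for rec in records:
        key = tuple(rec.get(f) for f in key_fields)
        if key in seen:
            dups.append(key)
        seen.add(key)
    return dups


def field_diff(old, new):
    """Check 2: fields missing from or added in the new data set."""
    old_fields = set().union(*(r.keys() for r in old))
    new_fields = set().union(*(r.keys() for r in new))
    return old_fields - new_fields, new_fields - old_fields


def value_mismatches(old, new, key_fields):
    """Check 3: shared, non-ignored fields should hold identical values."""
    index = {tuple(r.get(f) for f in key_fields): r for r in old}
    mismatches = []
    for rec in new:
        prev = index.get(tuple(rec.get(f) for f in key_fields))
        if prev is None:
            continue
        for field in (set(prev) & set(rec)) - IGNORED_FIELDS:
            if prev[field] != rec[field]:
                mismatches.append((field, prev[field], rec[field]))
    return mismatches
```

A newly added field shows up only in `field_diff`'s "added" set, so, as noted above, it is reported without being treated as a mismatch.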