[difftest] add difftest organization doc and rationale doc
Showing 2 changed files with 69 additions and 0 deletions.
## Organizational Summary

`difftest` is a verification framework designed to separate the driver and difftest phases. The framework supports testing two verification objects: `t1` and `t1rocket`. The basic structure and functionality are as follows:

### Directory Structure

1. **Top-Level Directory**
   - `Cargo.lock` and `Cargo.toml`: For project management and dependency configuration.
   - `default.nix`: Nix configuration file for building and managing the project environment.
   - `doc/`: Documentation directory containing information about the project organization and structure.

2. **Verification Object Directories**
   - `dpi_common/`: Contains shared library code that is used across the different verification objects.
   - `dpi_t1/` and `dpi_t1rocket/`: Contain the TestBench code for `t1` and `t1rocket`, respectively. Each directory includes the source files that provide the DPI library linked by the emulator (VCS or Verilator); these DPI functions are called by the corresponding TestBench.

3. **Verification and Driver Directories**
   - `offline_t1/` and `offline_t1rocket/`: Correspond to the verification projects for `t1` and `t1rocket`, respectively. These directories contain the main test code used for executing verification and testing.
   - `spike_interfaces/`: Contains C++ code files for interface definitions.
   - `spike_rs/`: Source files include `lib.rs`, `runner.rs`, and `spike_event.rs`, which provide the methods and tools needed during the verification phase. (A consolidated layout sketch follows this list.)
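
For orientation, the layout described above can be summarized as the following sketch; only the entries mentioned in this document are shown, with comments paraphrased from the descriptions above.

```
difftest/
├── Cargo.lock
├── Cargo.toml
├── default.nix          # Nix build / environment configuration
├── doc/                 # organization and rationale documentation
├── dpi_common/          # shared DPI library code
├── dpi_t1/              # TestBench / DPI sources for t1
├── dpi_t1rocket/        # TestBench / DPI sources for t1rocket
├── offline_t1/          # offline difftest project for t1
├── offline_t1rocket/    # offline difftest project for t1rocket
├── spike_interfaces/    # C++ interface definitions
└── spike_rs/            # lib.rs, runner.rs, spike_event.rs
```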

### Workflow

1. **TestBench Generation**
   - For each verification object (`t1` and `t1rocket`), the corresponding TestBench code is used with the emulator to generate a static library.

2. **Driving and Verification**
   - The generated static library is driven by the Rust code in `spike_rs`.
   - During the verification phase, the interfaces provided by `spike_rs` generate the architectural information.

3. **Testing and Validation**
   - Code in the `offline_t1` and `offline_t1rocket` directories carries out the actual offline difftest work, using `difftest.rs` and other test code to check the generated architectural information (see the sketch after this list).
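
As a rough illustration of steps 2 and 3, the sketch below shows one way test code in `offline_t1` could consume a DUT trace and check it against the reference model. The names (`ArchEvent`, `SpikeModel`, `read_dut_trace`, `offline_difftest`) are hypothetical and do not refer to the actual `spike_rs` or `difftest.rs` API.

```rust
// Hypothetical sketch only; the real spike_rs / difftest.rs interfaces differ.
struct ArchEvent {
    pc: u64,       // instruction address reported by the DUT
    rd_value: u64, // architectural result observed on the DUT
}

// Stand-in for the reference model driven through spike_rs.
struct SpikeModel;

impl SpikeModel {
    /// Execute one instruction on the reference model and return its result.
    fn step(&mut self, _pc: u64) -> u64 {
        unimplemented!("provided by the reference model")
    }
}

/// Parse the trace emitted by the emulator into architectural events.
fn read_dut_trace(_path: &str) -> Vec<ArchEvent> {
    unimplemented!("parse the emulator output")
}

/// Replay every DUT event on the model and compare the results.
fn offline_difftest(trace_path: &str) -> Result<(), String> {
    let mut model = SpikeModel;
    for event in read_dut_trace(trace_path) {
        let expected = model.step(event.pc);
        if expected != event.rd_value {
            return Err(format!(
                "mismatch at pc {:#x}: dut {:#x}, model {:#x}",
                event.pc, event.rd_value, expected
            ));
        }
    }
    Ok(())
}
```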

### Rationale Documentation: Offline Difftest

#### Background
In existing online difftest solutions, step-by-step implementations often lack general applicability, especially when dealing with complex instruction streams and diverse processor architectures. An **offline difftest system** is therefore proposed, aimed at validating each instruction's execution result in more detail and at providing simulators with an interface for **undefined behavior** injection.

#### Objective
The goal of offline difftest is to ensure the correctness of each instruction's observed behavior during processor execution, while providing a flexible validation mechanism for handling undefined behavior. The validation process involves three steps (a sketch of the flow follows the list):

1. Running the driver to generate the DUT (Device Under Test) trace.
2. Using the DUT trace to generate the model trace.
3. Detecting instruction differences by comparing the DUT trace and model trace with the difftest algorithm.
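
A minimal skeleton of the three steps above, assuming hypothetical names (`DutTrace`, `ModelTrace`, `run_driver`, `generate_model_trace`, `difftest`) rather than the project's actual interfaces:

```rust
// Illustrative skeleton of the three steps; bodies are intentionally left out.
struct DutTrace(Vec<String>);   // raw records captured from the driver
struct ModelTrace(Vec<String>); // records produced by replaying on the model

/// Step 1: run the driver (RTL simulation) and capture the DUT trace.
fn run_driver(test_elf: &str) -> DutTrace {
    unimplemented!("drive the emulator for {test_elf}")
}

/// Step 2: replay the DUT trace on the reference model to obtain the model trace.
fn generate_model_trace(_dut: &DutTrace) -> ModelTrace {
    unimplemented!("feed the DUT trace to the simulator")
}

/// Step 3: compare the two traces with the difftest algorithm.
fn difftest(_dut: &DutTrace, _model: &ModelTrace) -> bool {
    unimplemented!("align and compare instruction by instruction")
}

fn main() {
    let dut = run_driver("case.elf"); // hypothetical test binary
    let model = generate_model_trace(&dut);
    assert!(difftest(&dut, &model), "offline difftest found a mismatch");
}
```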

#### Design Choices

##### 1. **DUT Trace Design and Encoding**
- **Design:** The DUT trace must record the execution state of each instruction efficiently and accurately, including the execution address, register values, memory state, and so on. Currently, using `printf` to output the simulation address is a preliminary solution; XDMA (a high-speed data transfer interface) may later be used to extract FPGA traces for higher verification efficiency.
- **Encoding:** A well-chosen encoding of the trace will make alignment and parsing fast in the comparison stage (one possible record layout is sketched below).
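
To make the encoding point concrete, here is one possible, purely illustrative record layout and parser, assuming a simple `key=value` text line per retired instruction; the trace format actually used by the project may differ.

```rust
// Hypothetical trace record; the field set mirrors the items listed above
// (execution address, register value, memory state) and is not the project's
// actual format.
#[derive(Debug)]
struct TraceRecord {
    pc: u64,                 // execution address of the retired instruction
    rd: Option<(u8, u64)>,   // destination register index and value, if any
    mem: Option<(u64, u64)>, // memory address and data, if memory was accessed
}

/// Parse one line such as: `pc=80000000 rd=x5:deadbeef mem=80001000:42`
fn parse_line(line: &str) -> Option<TraceRecord> {
    let mut pc = None;
    let mut rd = None;
    let mut mem = None;
    for field in line.split_whitespace() {
        let (key, value) = field.split_once('=')?;
        match key {
            "pc" => pc = u64::from_str_radix(value, 16).ok(),
            "rd" => {
                let (idx, val) = value.split_once(':')?;
                rd = Some((
                    idx.trim_start_matches('x').parse().ok()?,
                    u64::from_str_radix(val, 16).ok()?,
                ));
            }
            "mem" => {
                let (addr, data) = value.split_once(':')?;
                mem = Some((
                    u64::from_str_radix(addr, 16).ok()?,
                    u64::from_str_radix(data, 16).ok()?,
                ));
            }
            _ => {} // unknown fields are ignored so the format can evolve
        }
    }
    Some(TraceRecord { pc: pc?, rd, mem })
}
```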

##### 2. **Simulator Integration**
- **Using the SAIL Simulator:** SAIL offers a highly accurate and customizable architecture simulation framework. Despite its slower speed, SAIL's precision and flexibility make it well suited to offline verification.
- **Removing Platform Dependencies:** To simplify the simulation, platform-related parts of SAIL will be removed, so that validation focuses on the instruction set and processor core.
- **Function Patches:** For example, when validating FP (floating-point) instructions, certain functions, such as `fp reduce order`, may need to be replaced.
- **Handling Undefined Behavior:** To verify undefined behavior, an interface will be provided through which the simulator can inject specific behavior (one possible shape of this hook is sketched after the list). This may involve forking the SAIL project or customizing its source code directly.
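
One possible shape of such an injection hook is sketched below; the trait and method names are invented for illustration and do not correspond to an existing SAIL or `spike_rs` interface.

```rust
use std::collections::HashMap;

/// Hypothetical hook through which the reference model asks what to do when it
/// hits an architecturally undefined case.
trait UbInjector {
    /// Called with the instruction address and raw encoding of the undefined
    /// case. Returning Some(value) makes the model adopt that result (for
    /// example, the value observed on the DUT) so that comparison can
    /// continue; returning None keeps the model's own default behavior.
    fn resolve_undefined(&mut self, pc: u64, insn: u32) -> Option<u64>;
}

/// A trivial injector that replays results recorded from the DUT, if present.
struct DutBackedInjector {
    dut_results: HashMap<u64, u64>, // pc -> result observed on the DUT
}

impl UbInjector for DutBackedInjector {
    fn resolve_undefined(&mut self, pc: u64, _insn: u32) -> Option<u64> {
        self.dut_results.get(&pc).copied()
    }
}
```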

##### 3. **DiffTest Algorithm**
- **Instruction Alignment:** During comparison, the DUT and model instruction streams must be properly aligned. This may require dedicated alignment algorithms to handle the challenges introduced by out-of-order or parallel execution.
- **Result Comparison:** The core of the comparison is to ensure that the execution results of each instruction match. This can be done via streaming, real-time comparison or other efficient differencing methods (an alignment-tolerant comparison is sketched after this list).
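
The sketch below illustrates one alignment-tolerant comparison strategy under the assumption that events can be matched by program counter; the types and the skipping policy are illustrative, not the project's actual algorithm.

```rust
#[derive(Debug)]
struct RetireEvent {
    pc: u64,     // program counter of the retired instruction
    result: u64, // architectural result to compare
}

/// Compare two event streams, advancing the model stream until its next event
/// matches the current DUT event's pc. This tolerates model-only events that
/// the DUT never exposes; a real implementation would need a stricter policy.
fn compare_streams<D, M>(dut: D, model: M) -> Result<(), String>
where
    D: IntoIterator<Item = RetireEvent>,
    M: IntoIterator<Item = RetireEvent>,
{
    let mut model = model.into_iter().peekable();
    for dut_event in dut {
        // Alignment step: skip model events with a different pc.
        while model.peek().map_or(false, |m| m.pc != dut_event.pc) {
            model.next();
        }
        match model.next() {
            Some(m) if m.result == dut_event.result => {} // results agree
            Some(m) => {
                return Err(format!(
                    "result mismatch at pc {:#x}: dut {:#x}, model {:#x}",
                    dut_event.pc, dut_event.result, m.result
                ));
            }
            None => {
                return Err(format!(
                    "model trace ended before pc {:#x}",
                    dut_event.pc
                ));
            }
        }
    }
    Ok(())
}
```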

#### Potential Challenges and Solutions

##### 1. **Memory Behavior Issues**
- Differences in memory access order can lead to discrepancies in processor behavior. In such cases, alignment strategies and memory model simulation may be needed to reduce the impact of memory ordering on the comparison.

##### 2. **Handling Undefined Behavior**
- Defining and simulating undefined behavior is a complex task. A flexible interface is needed so that the simulator can inject undefined behavior when it is detected and record the related traces for further analysis.

#### Conclusion
The offline difftest system is a validation approach designed to improve the flexibility and accuracy of verification by decoupling the driver and verification processes, thereby reducing the impact of verification on the driver. It not only verifies the result of each instruction precisely but also handles undefined behavior flexibly, offering an efficient and general-purpose validation method for complex processor architectures.