
Support multiple outstanding stores #1474

Merged (2 commits, Oct 19, 2023)

Conversation

@colluca (Contributor) commented Sep 25, 2023

This PR implements support for multiple outstanding store transactions that bypass the cache. It improves memory-level parallelism for write requests to non-cached memory regions, e.g. when configuring a peripheral.
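The thread does not show the diff itself, so here is a minimal, hypothetical SystemVerilog sketch of the general technique: instead of stalling after a single bypass store, a small counter tracks how many stores are in flight and back-pressures the issue side only when a configurable limit is reached. All names (store_tracker, MAX_OUTSTANDING, the handshake signals) are illustrative assumptions, not the PR's actual code.

```systemverilog
// Sketch only: track in-flight bypass stores with a counter instead of
// stalling after a single request. Names are illustrative, not the PR's.
module store_tracker #(
  parameter int unsigned MAX_OUTSTANDING = 7
) (
  input  logic clk_i,
  input  logic rst_ni,
  input  logic store_req_i,  // a new bypass store is issued this cycle
  input  logic store_ack_i,  // downstream acknowledged a store this cycle
  output logic ready_o       // high while more stores may be issued
);
  logic [$clog2(MAX_OUTSTANDING+1)-1:0] cnt_q, cnt_d;

  // Count up on issue, down on acknowledge; simultaneous req+ack cancels out.
  always_comb begin
    cnt_d = cnt_q;
    if (store_req_i && !store_ack_i)      cnt_d = cnt_q + 1;
    else if (store_ack_i && !store_req_i) cnt_d = cnt_q - 1;
  end

  // Back-pressure the issue side only at the configured limit.
  assign ready_o = (cnt_q < MAX_OUTSTANDING);

  always_ff @(posedge clk_i or negedge rst_ni) begin
    if (!rst_ni) cnt_q <= '0;
    else         cnt_q <= cnt_d;
  end
endmodule
```

With MAX_OUTSTANDING = 1 this degenerates to the old one-store-at-a-time behaviour; raising it lets several non-cached writes (e.g. peripheral register programming) overlap on the memory interface.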

@github-actions (Contributor): ✔️ successful run, report available here.

@colluca marked this pull request as ready for review September 26, 2023 07:28
@JeanRochCoulon (Contributor) commented:

Hello @colluca,
Happy to see you are interested in using CVA6.
Your PR puts me in an awkward position. You modify the WB cache to add a feature, but this WB cache is deprecated. And as far as I know (@zarubaf, could you confirm?), the WB cache is never tested by the CI, so we cannot catch regressions introduced by your modification.
By the way, did you know we will soon integrate another cache, called HPDcache, supporting both WB and WT configurations? Just in case... it may be better to use and modify a cache that is not deprecated.
If you confirm your intention to use the WB cache, the first thing to do is to test this configuration in the CI.
Cheers

@colluca (Contributor, Author) commented Oct 4, 2023

Dear @JeanRochCoulon, thank you for taking the time to look at the PR.

> You modify the WB cache to add a feature, but this WB cache is deprecated. By the way, did you know we will soon integrate another cache, called HPDcache, supporting both WB and WT configurations? Just in case... it may be better to use and modify a cache that is not deprecated.

Where is this information disseminated? Is there a timeline for the integration of the HPDcache? We are using the WB cache in most PULP systems, so this PR comes from developments needed there.

> And as far as I know (@zarubaf, could you confirm?), the WB cache is never tested by the CI, so we cannot catch regressions introduced by your modification. If you confirm your intention to use the WB cache, the first thing to do is to test this configuration in the CI. Cheers

Indeed, I confirm the intention to use it... or rather, it has been in use up to now. How should I proceed to test the WB configuration in the CI?

Best regards,
Luca

@JeanRochCoulon (Contributor) commented:

Up to now, no configuration has been defined with the WB cache enabled. The first step is to define a new configuration (inside the core/include directory), then to invoke this configuration from the GitHub Actions CI (defined in the cva6 repository).
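As a rough illustration of that first step, here is a minimal sketch of such a configuration package. It is an assumption for illustration only: the real packages under core/include define many more parameters, and the package, type, and enum names used here are not the repository's actual identifiers.

```systemverilog
// Hypothetical configuration package modeled on the ones in core/include/.
// All names are illustrative; the real cache-type encoding lives elsewhere.
package cva6_wb_config_pkg;
  // Illustrative cache-type encoding (the real one may differ).
  typedef enum logic [1:0] {
    WT  = 2'd0,  // write-through dcache
    WB  = 2'd1,  // write-back dcache
    HPD = 2'd2   // HPDcache
  } cache_type_t;

  // Select the write-back dcache for this configuration.
  localparam cache_type_t DCacheType = WB;
endpackage
```

A GitHub Actions workflow in the cva6 repository would then build and run the regression with this configuration selected, alongside the existing WT targets.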

@JeanRochCoulon (Contributor) commented:

Hello @colluca, this PR is not ready to be merged because tests need to be added to the CI to catch regressions. I propose to close the PR; do not hesitate to revive it when it is ready.

@zarubaf (Contributor) commented Oct 9, 2023

@JeanRochCoulon, since this PR does not touch the WT cache that is part of your verification target, is there any harm in merging this?

I wouldn't go so far as to declare the WB cache dead (just yet). For example, there is the ace project of @suppamax, which uses the WB cache. And we (as in Axelera) haven't yet concluded on the choice of cache, so it could be that we revive it.

I am re-opening this PR. @colluca, can we merge?

@zarubaf reopened this Oct 9, 2023
@JeanRochCoulon (Contributor) commented Oct 9, 2023

Merging without testing risks letting the WB cache die. By not merging, I keep it as it is, without modification, and hopefully functional. How can we guarantee it is functional after merging?

@niwis (Contributor) commented Oct 10, 2023

@JeanRochCoulon, I've opened a PR to re-add WB regression tests to the CI: #1523. Do you think this would do?
If I understood @cfuguet correctly, HPDcache currently only supports the WT configuration (but perhaps this information is outdated).

@JeanRochCoulon (Contributor) commented:

Hello @niwis, thank you for adding WB to the regression. Let's wait for the CI report.
The @cfuguet PR allows selecting one of the three dcache configurations: WT, WB or HPD.

@niwis (Contributor) commented Oct 10, 2023

I meant that, as far as I know, HPDcache itself will feature a write-through (not a write-back) policy. In that case, I think it would be good to keep the existing WB cache alive for now.

@JeanRochCoulon (Contributor) commented Oct 10, 2023

In fact, HPDcache will gain the ability to be configured in WT or WB mode. But @cfuguet can provide better information than me ;-)

@JeanRochCoulon (Contributor) commented:

Hello @colluca,
@niwis has been a great help in integrating WB into the GitHub Actions CI. Can you rebase your PR to pick up this modification and get the CI results for WB?

@cfuguet commented Oct 10, 2023

> I meant that, as far as I know, HPDcache itself will feature a write-through (not a write-back) policy. In that case, I think it would be good to keep the existing WB cache alive for now.

Hello @niwis, currently the HPDcache only supports the write-through policy. I will start development to support write-back, but the official release date of this new feature is May 2024.

@colluca (Contributor, Author) commented Oct 10, 2023

Tests seem to pass. There is one last outstanding question from my side.

Currently, the number of outstanding transactions is hardcoded in the miss_handler. Should I expose it as a top-level parameter? How would you suggest implementing this, e.g. is there a preferred name for the parameter?

@cfuguet commented Oct 10, 2023

After the addition of the WB tests to the CI, I'm seeing the following issue when running WB configurations:

%Error: Internal Error: /home/runner/work/cva6/cva6/core/cache_subsystem/cva6_icache.sv:357:55: ../V3LinkDot.cpp:2641: Couldn't resolve inlined scope 'i_cache_subsystem' in: ariane_testharness.i_ariane.i_cva6.genblk1__DOT__i_cache_subsystem.i_cva6_icache_axi_wrapper.i_cva6_icache
  357 |                     (mem_rtrn_i.inv.vld && inv_en)  ? icache_way_bin2oh(mem_rtrn_i.inv.way) :
      |                                                       ^~~~~~~~~~~~~~~~~
make: *** [Makefile:569: verilate] Error 1

Is anyone else experiencing the same issue?

Thanks

@zarubaf (Contributor) left a comment: 👍

Review thread on core/cache_subsystem/miss_handler.sv (outdated, resolved)
@JeanRochCoulon (Contributor) commented:

Er... why did GitHub Actions not detect the error?

@github-actions (Contributor): ❌ failed run, report available here.

@JeanRochCoulon (Contributor) commented:

@colluca, if you think a new parameter is needed, add it to config_pkg::cva6_cfg_t. It will become an input parameter of cva6. Of course, all configuration files need to be updated.
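A minimal sketch of that suggestion, assuming the field is named MaxOutstandingStores (the name is an assumption; the thread does not state what was finally chosen). The real config_pkg::cva6_cfg_t carries many more fields; only the new one is shown here, together with how a module can consume it instead of a hardcoded constant.

```systemverilog
// Reduced stand-in for config_pkg: the real cva6_cfg_t has many more fields.
package config_pkg_sketch;
  typedef struct packed {
    int unsigned MaxOutstandingStores;  // max in-flight bypass stores
  } cva6_cfg_t;
endpackage

// The miss handler (and the cva6 top level) would take the struct as a
// parameter and size its bookkeeping from it.
module miss_handler_sketch #(
  parameter config_pkg_sketch::cva6_cfg_t CVA6Cfg =
      '{MaxOutstandingStores: 7}
) ();
  // Derived width for outstanding-store counters / transaction IDs.
  localparam int unsigned OutstandingWidth =
      $clog2(CVA6Cfg.MaxOutstandingStores + 1);
endmodule
```

Every configuration file would then set MaxOutstandingStores explicitly, which is why all existing configs need updating.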

@colluca force-pushed the multiple_stores branch 2 times, most recently from 50f3d5e to a84d6bf on October 13, 2023 15:21
@colluca (Contributor, Author) commented Oct 13, 2023

I created the top-level parameter. In my opinion it is now ready to merge.

@github-actions (Contributor): ❌ failed run, report available here.

@JeanRochCoulon (Contributor) commented:

@colluca, can you rebase to resolve the conflicts?

@github-actions (Contributor): ❌ failed run, report available here.

@JeanRochCoulon (Contributor) commented: [screenshot]

@colluca (Contributor, Author) commented Oct 19, 2023

Thanks for the patience and feedback, @JeanRochCoulon.

> [screenshot]

This should be addressed by the latest push.

@github-actions (Contributor): ✔️ successful run, report available here.

@JeanRochCoulon merged commit 74675b4 into openhwgroup:master on Oct 19, 2023. 11 checks passed.