Mroda/v5 #25

Closed
wants to merge 168 commits
Commits (168)
f598b51
Safekeeping
mroda88 Feb 20, 2024
8cd06bc
initial skeleton from macro
mroda88 Feb 20, 2024
b3830aa
Some dumb changes
mroda88 Feb 20, 2024
c1ef276
JCF: add the standard set of single-repo workflows
jcfreeman2 Feb 20, 2024
bd2b3dd
Syncing .github/workflows/dunedaq-v4-cpp-ci.yml
jcfreeman2 Feb 20, 2024
5830cb0
Draft of the DaphneInterface
mroda88 Feb 27, 2024
2ef2662
Restore compilation
mroda88 Feb 27, 2024
ddd3b5a
Uniqueness of the socket
mroda88 Feb 27, 2024
f038229
Better wrapper around our socket
mroda88 Feb 27, 2024
32d7cc7
initial full draft of DaphneInterface
mroda88 Mar 1, 2024
c58b353
Fix compilation errors
mroda88 Mar 1, 2024
a1048f3
unit test
mroda88 Mar 1, 2024
987ad10
Draft of messages
mroda88 Mar 8, 2024
a5933f6
progress with commands
mroda88 Mar 8, 2024
651414f
Working command receiving
mroda88 Mar 8, 2024
0b1b06c
Compiling monitoring
mroda88 Mar 11, 2024
62ea5ea
Working monitoring
mroda88 Mar 11, 2024
606cc7f
Comments from Manuel
mroda88 Mar 12, 2024
e850526
Comments from Manuel
mroda88 Mar 12, 2024
3ad1b09
implement checks for timing endpoint conf
mroda88 Mar 13, 2024
8d58973
better time utilisation for timing endpoint
mroda88 Mar 13, 2024
35ccf7a
Progress towards analog configuration
mroda88 Mar 13, 2024
7e5a27d
Validation for channel Configurations
mroda88 Mar 13, 2024
6df18e3
Validate connection
mroda88 Mar 13, 2024
544fb3a
Conf for AFEs in schema
mroda88 Mar 14, 2024
ae4f522
Configuration validation
mroda88 Mar 14, 2024
3685d6b
Setting the afe conf
mroda88 Mar 14, 2024
34a5e74
Add DDR alignment and check
mroda88 Mar 14, 2024
e5ec02f
Start adding trigger configuration
mroda88 Mar 15, 2024
fb9b649
Add more info for monitoring and for configuration
mroda88 Mar 15, 2024
4753c7a
adding missing variables for configuration
marroyav Mar 17, 2024
9246fd1
fixed biasctrl handling
marroyav Mar 17, 2024
17145ab
working configuration for AFE
mroda88 Mar 18, 2024
14faa7e
Configuration for trigger mode
mroda88 Mar 18, 2024
6fd880a
Definition of configuration in controller
mroda88 Mar 18, 2024
19f453d
Correct and operational configuration
mroda88 Mar 18, 2024
2703c24
Some improvements here and there
mroda88 Mar 18, 2024
ee42d78
Complete validation configuration
mroda88 Mar 19, 2024
cd8eebf
Full configuration validation
mroda88 Mar 19, 2024
322b0cc
Restore compilation
mroda88 Mar 19, 2024
f8ab523
Full configuration implemented and compiling
mroda88 Mar 19, 2024
a5b5fcf
Check with Manuel for correct configuration
mroda88 Mar 19, 2024
a52cbab
draft of configuration in test
mroda88 Mar 20, 2024
03cf8b9
Fix log
mroda88 Mar 20, 2024
f1c3568
working code tested, timing not aligning but that is a firmware issue
mroda88 Mar 20, 2024
f1668d6
additional delay to let the timing interface recover the time stamp
marroyav Mar 21, 2024
e27677e
time optimisation
mroda88 Mar 21, 2024
ceb7636
progress towards buffer dumping
mroda88 Mar 21, 2024
b3ec425
Draft of the spy buffers dumping
mroda88 Mar 22, 2024
908afdd
Progress towards working env
mroda88 Mar 25, 2024
3c4e2de
Changing the calls to main daqconf libraries
mroda88 Mar 25, 2024
ae3dac7
Draft of confgen app
mroda88 Mar 25, 2024
6b68ac2
Skeleton for daphne
mroda88 Mar 25, 2024
c7abfe8
Progress in configuration generation
mroda88 Mar 26, 2024
84cec22
Complete draft of the configuration, no file yet
mroda88 Mar 26, 2024
06f155a
Make the controller more id aware
mroda88 Mar 27, 2024
7d11446
Merge pull request #4 from DUNE-DAQ/team/DaphneInterface
mroda88 Mar 27, 2024
e2ad0cd
bump version
mroda88 Mar 27, 2024
2fc0e77
Merge pull request #5 from DUNE-DAQ/mroda/version
mroda88 Mar 28, 2024
7ce3cb5
adding a dump_buffer command
TiagoTAlves Apr 9, 2024
872d1ba
dump_buffers command is automatically produced when generating config…
TiagoTAlves Apr 10, 2024
d96586e
removed daphneapp_dump_buffers.json
TiagoTAlves Apr 10, 2024
741b5f7
added a more useful log
TiagoTAlves Apr 10, 2024
a0b169e
Merge pull request #6 from DUNE-DAQ/titavare/buffer_dump
plasorak Apr 11, 2024
b74fb52
bump version
TiagoTAlves Apr 11, 2024
7143a79
Merge pull request #7 from DUNE-DAQ/titavare/bump-version
plasorak Apr 11, 2024
5c94a49
Syncing .github/workflows/dunedaq-v4-cpp-ci.yml
Apr 13, 2024
b6d0711
Syncing .github/workflows/dunedaq-v4-cpp-ci.yml
andrewmogan Apr 25, 2024
30f515c
Merge pull request #8 from DUNE-DAQ/prep-release/fddaq-v4.4.0
jcfreeman2 May 8, 2024
96defc3
Starting creating the file for the new file schema
mroda88 May 14, 2024
956b5f0
Draft of the schema for channels in the daphne file
mroda88 May 15, 2024
6e62f08
Fix entry
mroda88 May 15, 2024
d003522
A few more fixes
mroda88 May 15, 2024
9c9ecd9
Cosmetic changes
mroda88 May 15, 2024
d6e5750
More updates, but it does not compile
mroda88 May 15, 2024
a77ddc7
Possibly working schema for file
mroda88 May 16, 2024
1a225ef
initial draft for daphne file interpretation
mroda88 May 16, 2024
3f93e56
Progress towards file configuration
mroda88 May 17, 2024
bb31846
Correct logic
mroda88 May 17, 2024
5ea42cf
Working version
mroda88 May 17, 2024
deff6b6
Better organisation
mroda88 May 17, 2024
2f66fe7
transitioning the AFE details
mroda88 May 20, 2024
5f761ab
full implementation of the details
mroda88 May 21, 2024
ceb3e42
cleanup
mroda88 May 21, 2024
ae97c1a
Create the new version
mroda88 May 21, 2024
5e39f09
Merge pull request #9 from DUNE-DAQ/mroda/detail_file
marroyav May 22, 2024
22ae04e
fixing typos
marroyav May 27, 2024
c0f56f2
Merge pull request #10 from DUNE-DAQ/mroda/detail_file
mroda88 May 27, 2024
c746c8d
Syncing .github/workflows/dunedaq-v4-cpp-ci.yml
andrewmogan May 30, 2024
5942b03
Merge pull request #11 from DUNE-DAQ/patch/fddaq-v4.4.x
jcfreeman2 Jun 10, 2024
76f94d7
Better error handling for monitoring parsing
mroda88 Jun 18, 2024
fceadb9
Add logging for the error, every time it happens
mroda88 Jun 18, 2024
f992fc1
Add scrap command
mroda88 Jun 18, 2024
ead3922
bump correct tag
mroda88 Jun 18, 2024
fa4a3e2
Remove unused parameters
mroda88 Jun 18, 2024
66e0561
Merge pull request #12 from DUNE-DAQ/mroda/stop_transition
mroda88 Jun 18, 2024
88bf863
correct reporting of slot
mroda88 Jun 18, 2024
d45f7a2
Bump patch version
mroda88 Jun 18, 2024
cb9413c
Merge pull request #13 from DUNE-DAQ/mroda/stop_transition
mroda88 Jun 18, 2024
5128ecb
Right error threshold for decoding
mroda88 Jun 18, 2024
da0c15f
Merge pull request #14 from DUNE-DAQ/mroda/stop_transition
marroyav Jun 18, 2024
6d86140
Correct termination
mroda88 Jun 18, 2024
ea038ed
Bump patch version
mroda88 Jun 18, 2024
53a8ed8
Merge pull request #15 from DUNE-DAQ/mroda/refinements
marroyav Jun 18, 2024
395694c
Merge pull request #16 from DUNE-DAQ/patch/fddaq-v4.4.x
jcfreeman2 Jun 20, 2024
9a004a8
add channel counter
mroda88 Jul 3, 2024
1099488
Define tracker quantities
mroda88 Jul 3, 2024
13f9150
Make tracker atomic
mroda88 Jul 3, 2024
ce78edd
Draft of the whole monitoring publishing
mroda88 Jul 3, 2024
6e4d6c4
Bump version
mroda88 Jul 3, 2024
772b2c4
Tested with Manuel
mroda88 Jul 3, 2024
182d33b
Better logic
mroda88 Jul 4, 2024
3760f42
solve bad scrap transition
mroda88 Jul 4, 2024
f474c8e
Merge pull request #18 from DUNE-DAQ/mroda/counters
mroda88 Jul 4, 2024
8555993
Merge pull request #19 from DUNE-DAQ/patch/fddaq-v4.4.x
jcfreeman2 Jul 10, 2024
1652600
Bump version
mroda88 Jul 17, 2024
031656e
Dedicated slot configuration
mroda88 Jul 17, 2024
1ca9196
Fix compilation errors
mroda88 Jul 17, 2024
143267f
Configuration with independent slot and ip; separate applications for…
mroda88 Jul 17, 2024
ef40690
Complete example
mroda88 Jul 17, 2024
c09c606
Restore spy buffer commands
mroda88 Jul 17, 2024
0704baf
Add timeout for cmd
mroda88 Jul 18, 2024
8c2d869
Add more details in case of timeout
mroda88 Jul 18, 2024
ce4eb7c
A more graceful loop
mroda88 Jul 18, 2024
1a9bef1
Draft of timeout on sockets
mroda88 Jul 18, 2024
bb42e50
Add socket timeout via configuration
mroda88 Jul 18, 2024
78fcc9f
interruptible retrying command
mroda88 Jul 18, 2024
207ef8b
Initial draft for retry and interruptible
mroda88 Jul 18, 2024
1c7a283
Configurable parameters added
mroda88 Jul 19, 2024
080cd6f
Better example file
mroda88 Jul 19, 2024
5815951
Restore monitoring
mroda88 Jul 19, 2024
661a656
New default timeouts
mroda88 Jul 19, 2024
f418b6d
Update schemas
mroda88 Jul 20, 2024
aacc267
Implement dropped package counter
mroda88 Jul 20, 2024
0d60b9d
fix compilation error
mroda88 Jul 20, 2024
85fb738
Publish the stream_info
mroda88 Jul 20, 2024
f7aab85
restore bias monitoring
mroda88 Jul 20, 2024
e3cd583
Better handling in the errors from monitoring
mroda88 Jul 20, 2024
b67ef74
More uniform timeouts
mroda88 Jul 20, 2024
383784e
Merge pull request #22 from DUNE-DAQ/mroda/socket
mroda88 Jul 22, 2024
10c78fc
Syncing .github/workflows/dunedaq-v4-cpp-ci.yml
andrewmogan Nov 6, 2024
ef67cfe
Initiating file purge
mroda88 Nov 7, 2024
74abe26
Restore compilability
mroda88 Nov 7, 2024
66f8fd3
Add example file from Manuel
mroda88 Nov 19, 2024
ac4dac1
Better file
mroda88 Nov 19, 2024
971d756
adding script to add file
mroda88 Nov 19, 2024
9a3be76
Add script
mroda88 Nov 19, 2024
ecbcec5
add right privileges to script
mroda88 Nov 19, 2024
8755625
Fix typo
mroda88 Nov 19, 2024
2a56f09
Update daphne_example_config.json
mroda88 Nov 20, 2024
b212e6d
Compiling C++
mroda88 Nov 20, 2024
bbedd4c
Merge remote-tracking branch 'origin/mroda/v5' into mroda/v5
mroda88 Nov 20, 2024
8a095fe
script is working
mroda88 Nov 20, 2024
fc2f319
Update file format
mroda88 Nov 21, 2024
3bc20d5
Compiling version with correct init
mroda88 Nov 21, 2024
fcf977c
starting adapting the code
mroda88 Nov 21, 2024
2952df3
some changes
mroda88 Nov 21, 2024
81dad55
rename
mroda88 Nov 21, 2024
36c8ceb
restore compilability
mroda88 Nov 21, 2024
c59694f
Read bias ctrl
mroda88 Nov 22, 2024
2871478
Add defaults
mroda88 Nov 22, 2024
006c892
Loop for channel configuration
mroda88 Nov 22, 2024
967c46c
Configure channels
mroda88 Nov 22, 2024
93ba7c6
starting using the AFE
mroda88 Nov 22, 2024
25ed55a
Add configuration for reg4
mroda88 Nov 22, 2024
291bac3
Update of the json file
mroda88 Nov 25, 2024
f2e8cde
Sync with appmodel
mroda88 Nov 25, 2024
4bf7937
Add configuration for afe
mroda88 Nov 25, 2024
19 changes: 19 additions & 0 deletions .github/workflows/auto_approve.yml
@@ -0,0 +1,19 @@
name: Auto approve

on:
  workflow_dispatch:
    inputs:
      pullRequestNumber:
        description: Pull request number to auto-approve
        required: false

jobs:
  auto-approve:
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
    steps:
      - uses: hmarr/auto-approve-action@v2
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          pull-request-number: ${{ github.event.inputs.pullRequestNumber }}
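
For a quick manual test, this workflow can also be dispatched from the GitHub CLI. A minimal sketch, assuming an authenticated `gh` and assuming the repository is DUNE-DAQ/daphnemodules (the PR number shown is just this PR used as an example):

# Manually dispatch the auto-approve workflow for a given pull request
gh workflow run auto_approve.yml \
  --repo DUNE-DAQ/daphnemodules \
  -f pullRequestNumber=25
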
88 changes: 88 additions & 0 deletions .github/workflows/dunedaq-v4-cpp-ci.yml
@@ -0,0 +1,88 @@
name: build-develop

# Controls when the action will run. Workflow runs when manually triggered using the UI
# or API.
on:
  push:
    branches:
      - production/v4
      - patch/*
      - prep-release/*
    paths-ignore:
      - 'docs/**'
      - '.github/**'
  pull_request:
    branches: [ production/v4 ]
  schedule:
    - cron: "0 9 * * *"

  workflow_dispatch:


jobs:
  Build_against_dev_release:
    name: build_against_dev_on_${{ matrix.os_name }}
    # The type of runner that the job will run on
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - image: "ghcr.io/dune-daq/nightly-release-alma9:production_v4"
            os_name: "a9"
    container:
      image: ${{ matrix.image }}
    defaults:
      run:
        shell: bash

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Runs a single command using the runners shell

      - name: Checkout daq-release
        uses: actions/checkout@v3
        with:
          repository: DUNE-DAQ/daq-release
          path: daq-release

      - name: setup dev area
        run: |
          export REPO=$(echo '${{ github.repository }}' | awk -F '/' '{print $2}')
          source /cvmfs/dunedaq.opensciencegrid.org/setup_dunedaq.sh
          setup_dbt latest_v4 || true
          release_name="last_fddaq"

          dbt-create -n $release_name dev-${{ matrix.os_name }}

      - name: checkout package for CI
        uses: actions/checkout@v3
        with:
          path: ${{ github.repository }}

      - name: setup build env, build the repo against the development release
        run: |
          export REPO=$(echo '${{ github.repository }}' | awk -F '/' '{print $2}')
          cd $GITHUB_WORKSPACE/dev-${{ matrix.os_name }}
          source env.sh || true

          spack unload $REPO || true
          cp -pr $GITHUB_WORKSPACE/DUNE-DAQ/$REPO $GITHUB_WORKSPACE/dev-${{ matrix.os_name }}/sourcecode
          dbt-build # --unittest # --lint

      - name: upload build log file
        uses: actions/upload-artifact@v4
        with:
          name: build_log_${{ matrix.os_name }}
          path: ${{ github.workspace }}/dev-${{ matrix.os_name }}/log/build*.log

      # - name: upload linter output file
      #   uses: actions/upload-artifact@v4
      #   with:
      #     name: linting_log_${{ matrix.os_name }}
      #     path: ${{ github.workspace }}/dev-${{ matrix.os_name }}/log/linting*

      # - name: upload unittest output file
      #   uses: actions/upload-artifact@v4
      #   with:
      #     name: unit_tests_log_${{ matrix.os_name }}
      #     path: ${{ github.workspace }}/dev-${{ matrix.os_name }}/log/unit_tests*
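
The same build can be reproduced outside CI on any machine with CVMFS access. A rough sketch that follows the run steps above; the clone URL for this package is an assumption, and --unittest is shown even though the workflow currently leaves it commented out:

# Set up a v4 work area the same way the CI job does
source /cvmfs/dunedaq.opensciencegrid.org/setup_dunedaq.sh
setup_dbt latest_v4
dbt-create -n last_fddaq dev-a9
cd dev-a9
source env.sh

# Drop the package into the work area and build it
git clone https://github.com/DUNE-DAQ/daphnemodules.git sourcecode/daphnemodules
dbt-build --unittest
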
40 changes: 40 additions & 0 deletions .github/workflows/track_new_issues.yml
@@ -0,0 +1,40 @@
name: Add issue to project
on:
  issues:
    types:
      - opened

jobs:
  track_issue:
    runs-on: ubuntu-latest
    steps:
      - name: Get project data
        env:
          GITHUB_TOKEN: ${{ secrets.BOT_CI_ISSUES }}
          ORGANIZATION: DUNE-DAQ
          PROJECT_NUMBER: 5
        run: |
          gh api graphql -f query='
            query($org: String!, $number: Int!) {
              organization(login: $org){
                projectV2(number: $number) {
                  id
                }
              }
            }' -f org=$ORGANIZATION -F number=$PROJECT_NUMBER > project_data.json

          echo 'PROJECT_ID='$(jq '.data.organization.projectV2.id' project_data.json) >> $GITHUB_ENV

      - name: Add issue to project
        env:
          GITHUB_TOKEN: ${{ secrets.BOT_CI_ISSUES }}
          ISSUE_ID: ${{ github.event.issue.node_id }}
        run: |
          item_id="$( gh api graphql -f query='
            mutation($project:ID!, $issue:ID!) {
              addProjectV2ItemById(input: {projectId: $project, contentId: $issue}) {
                item {
                  id
                }
              }
            }' -f project=$PROJECT_ID -f issue=$ISSUE_ID --jq '.data.addProjectV2ItemById.item.id')"
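
The project lookup used above can be exercised from a terminal before relying on it in CI. A sketch using the same query, assuming `gh` is authenticated with a token that has the relevant project scopes:

# Resolve the project node id for DUNE-DAQ project number 5, as the workflow does
ORGANIZATION=DUNE-DAQ
PROJECT_NUMBER=5
gh api graphql -f query='
  query($org: String!, $number: Int!) {
    organization(login: $org){
      projectV2(number: $number) {
        id
      }
    }
  }' -f org=$ORGANIZATION -F number=$PROJECT_NUMBER --jq '.data.organization.projectV2.id'
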
40 changes: 40 additions & 0 deletions .github/workflows/track_new_prs.yml
@@ -0,0 +1,40 @@
name: Add pull request to project
on:
  pull_request:
    types:
      - opened
jobs:
  track_issue:
    runs-on: ubuntu-latest
    steps:
      - name: Get project data
        env:
          GITHUB_TOKEN: ${{ secrets.BOT_CI_ISSUES }}
          ORGANIZATION: DUNE-DAQ
          PROJECT_NUMBER: 5
        run: |
          gh api graphql -f query='
            query($org: String!, $number: Int!) {
              organization(login: $org){
                projectV2(number: $number) {
                  id
                }
              }
            }' -f org=$ORGANIZATION -F number=$PROJECT_NUMBER > project_data.json

          echo 'PROJECT_ID='$(jq '.data.organization.projectV2.id' project_data.json) >> $GITHUB_ENV

      - name: Add issue to project
        env:
          GITHUB_TOKEN: ${{ secrets.BOT_CI_ISSUES }}
        run: |
          PR_NUMBER=${{ github.event.pull_request.number }}
          node_id=`curl -H "Accept: application/vnd.github.v3+json" https://api.github.com/repos/${{ github.repository }}/pulls/${PR_NUMBER} | jq '.node_id'`
          item_id="$( gh api graphql -f query='
            mutation($project:ID!, $issue:ID!) {
              addProjectV2ItemById(input: {projectId: $project, contentId: $issue}) {
                item {
                  id
                }
              }
            }' -f project=$PROJECT_ID -f issue=$node_id --jq '.data.addProjectV2ItemById.item.id')"
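
As a possible simplification (not part of this PR), the curl plus jq lookup of the pull request node id could likely be replaced by the GitHub CLI itself, assuming `gh pr view` exposes the node id through its JSON output:

# Inside the run step above: resolve the node id directly with gh instead of curl + jq
node_id=$(gh pr view "$PR_NUMBER" --repo "${{ github.repository }}" --json id --jq .id)
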
46 changes: 46 additions & 0 deletions CMakeLists.txt
@@ -0,0 +1,46 @@

cmake_minimum_required(VERSION 3.12)
project(daphnemodules VERSION 1.5.0)

find_package(daq-cmake REQUIRED)

daq_setup_environment()

find_package(appmodel REQUIRED)
find_package(conffwk REQUIRED)
find_package(appfwk REQUIRED)
find_package(opmonlib REQUIRED)
find_package(fmt REQUIRED)
find_package(Boost COMPONENTS unit_test_framework REQUIRED)

##############################################################################


# See https://dune-daq-sw.readthedocs.io/en/latest/packages/daq-cmake/#daq_codegen
daq_protobuf_codegen( opmon/*.proto )

##############################################################################


# See https://dune-daq-sw.readthedocs.io/en/latest/packages/daq-cmake/#daq_add_library

daq_add_library( *.cpp LINK_LIBRARIES appfwk::appfwk fmt::fmt) # Any source files and/or dependent libraries to link in not yet determined
##############################################################################


# See https://dune-daq-sw.readthedocs.io/en/latest/packages/daq-cmake/#daq_add_plugin

daq_add_plugin(DaphneV2ControllerModule duneDAQModule LINK_LIBRARIES daphnemodules appmodel::appmodel)

##############################################################################


# See https://dune-daq-sw.readthedocs.io/en/latest/packages/daq-cmake/#daq_add_unit_test

daq_add_unit_test(DaphneInterface_test LINK_LIBRARIES daphnemodules)

##############################################################################
# daq_add_application(daphne_controller_test controller_test.cxx TEST LINK_LIBRARIES daphnemodules)

daq_install()

37 changes: 37 additions & 0 deletions cmake/daphnemodulesConfig.cmake.in
@@ -0,0 +1,37 @@

@PACKAGE_INIT@

include(CMakeFindDependencyMacro)

# Insert find_dependency() calls for your package's dependencies in
# the place of this comment. Make sure they match up with the
# find_package calls in your package's CMakeLists.txt file

find_package(appmodel)
find_package(appfwk)
find_package(opmonlib)
find_package(fmt)
find_package(Boost COMPONENTS unit_test_framework)


# Figure out whether or not this dependency is an installed package or
# in repo form

if (EXISTS ${CMAKE_SOURCE_DIR}/@PROJECT_NAME@)

message(STATUS "Project \"@PROJECT_NAME@\" will be treated as repo (found in ${CMAKE_SOURCE_DIR}/@PROJECT_NAME@)")
add_library(@PROJECT_NAME@::@PROJECT_NAME@ ALIAS @PROJECT_NAME@)

get_filename_component(@PROJECT_NAME@_DAQSHARE "${CMAKE_CURRENT_LIST_FILE}" DIRECTORY)

else()

message(STATUS "Project \"@PROJECT_NAME@\" will be treated as installed package (found in ${CMAKE_CURRENT_LIST_DIR})")
set_and_check(targets_file ${CMAKE_CURRENT_LIST_DIR}/@[email protected])
include(${targets_file})

set(@PROJECT_NAME@_DAQSHARE "${CMAKE_CURRENT_LIST_DIR}/../../../share")

endif()

check_required_components(@PROJECT_NAME@)
49 changes: 1 addition & 48 deletions docs/README.md
@@ -1,48 +1 @@
/**

@mainpage DAPHNE-libs
@author Manuel Arroyave (marroyav)

@description This is a collection of libraries to monitor, configure and spy on DAPHNE registers.
All the libraries are based on oei, a class with four functions to read and write registers over Ethernet.

Basic functionality would support the endpoint status.
Endpoint_status.hpp seems like the library to test integration with, since it is read-only
and returns feedback about the status of the timing endpoint.

Standalone applications:
testip.cpp -------> pings every DAPHNE board and prints success or failure
gateware.cpp -------> takes an IP address and asks for the gateware running on each DAPHNE
endpoint_status.cpp -------> prints clock and timing variables on the terminal
clocks.cpp -------> executes the clock configuration for all DAPHNE boards (requires an echo afterwards)
ts.cpp -------> prints 4 consecutive timestamps from each board
select_mode.cpp -------> configures all DAPHNEs in the default data-stream configuration
read20.cpp -------> prints 20 words from the spy buffer at the output of transceiver 0

These are based on hpp files with functions that automate sending commands to specific interfaces.

The booting procedure for all DAPHNEs in the detector is as follows:

1. Power on
2. Check ping on the IP address ----> send warnings and keep a vector with only the pingable IP addresses
3. Check the firmware version ----> warn if it differs among the set of running DAPHNEs
4. Check the timing registers ----> set timing to work with the timing interface
5. Send the Echo/Alignment command from the timing interface (calls a bash script by hand?)
6. Check the endpoint ----> send warnings if the state is not good to go
7. Align the AFEs and check ----> send warnings if the registers are not 0x3f80
8. Set the analog chain ----> this process takes some time (~a few minutes)
9. Fine-tune the offset ----> we can skip it, but it might improve the dynamic range in some channels

Calibration:

We might need several (tens of) runs for calibration.
The procedure is very easy from the DAQ point of view:
is it possible to automate this and scan over one variable of the calibration module?

1. Configure the calibration module
2. Configure DAPHNE
3. Launch the run
4. Save the metadata (configuration parameters)

*/

# No Official User Documentation Has Been Written Yet (Tue Feb 20 17:12:47 CET 2024)
48 changes: 48 additions & 0 deletions docs/README_old.md
@@ -0,0 +1,48 @@
/**

@mainpage DAPHNE-libs
@author Manuel Arroyave (marroyav)

@description This is a collection of libraries to monitor, configure and spy on DAPHNE registers.
All the libraries are based on oei, a class with four functions to read and write registers over Ethernet.

Basic functionality would support the endpoint status.
Endpoint_status.hpp seems like the library to test integration with, since it is read-only
and returns feedback about the status of the timing endpoint.

Standalone applications:
testip.cpp -------> pings every DAPHNE board and prints success or failure
gateware.cpp -------> takes an IP address and asks for the gateware running on each DAPHNE
endpoint_status.cpp -------> prints clock and timing variables on the terminal
clocks.cpp -------> executes the clock configuration for all DAPHNE boards (requires an echo afterwards)
ts.cpp -------> prints 4 consecutive timestamps from each board
select_mode.cpp -------> configures all DAPHNEs in the default data-stream configuration
read20.cpp -------> prints 20 words from the spy buffer at the output of transceiver 0

These are based on hpp files with functions that automate sending commands to specific interfaces.

The booting procedure for all DAPHNEs in the detector is as follows:

1. Power on
2. Check ping on the IP address ----> send warnings and keep a vector with only the pingable IP addresses
3. Check the firmware version ----> warn if it differs among the set of running DAPHNEs
4. Check the timing registers ----> set timing to work with the timing interface
5. Send the Echo/Alignment command from the timing interface (calls a bash script by hand?)
6. Check the endpoint ----> send warnings if the state is not good to go
7. Align the AFEs and check ----> send warnings if the registers are not 0x3f80
8. Set the analog chain ----> this process takes some time (~a few minutes)
9. Fine-tune the offset ----> we can skip it, but it might improve the dynamic range in some channels

Calibration:

We might need several (tens of) runs for calibration.
The procedure is very easy from the DAQ point of view:
is it possible to automate this and scan over one variable of the calibration module?

1. Configure the calibration module
2. Configure DAPHNE
3. Launch the run
4. Save the metadata (configuration parameters)

*/
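
To make the booting procedure above concrete, the sketch below strings the listed standalone applications together in roughly that order. It is only an illustration: the actual command-line arguments of these programs are not documented here, and the IP placeholder is hypothetical.

# Hypothetical boot-check sequence using the standalone applications listed above
./testip                  # check which DAPHNE boards answer ping (steps 1-2)
./gateware <daphne_ip>    # report the gateware running on a board (step 3)
./clocks                  # apply the clock configuration to all boards; send the timing echo afterwards (steps 4-5)
./endpoint_status         # check the state of the timing endpoint (step 6)
./ts                      # print 4 consecutive timestamps from each board as a cross-check
./select_mode             # put all boards in the default data-stream configuration
./read20                  # dump 20 words from the spy buffer of transceiver 0 as a spot check
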
