
[REVIEW]: COBRAPRO: A MATLAB toolbox for Physics-based Battery Modeling and Co-simulation Parameter Optimization #6803

Open
editorialbot opened this issue May 29, 2024 · 76 comments
Labels: Matlab, recommend-accept (Papers recommended for acceptance in JOSS.), review, TeX, Track: 3 (PE) Physics and Engineering

Comments

@editorialbot
Collaborator

editorialbot commented May 29, 2024

Submitting author: @COBRAPROsimulator (Sara Ha)
Repository: https://github.com/COBRAPROsimulator/COBRAPRO
Branch with paper.md (empty if default branch):
Version: v2.0.0
Editor: @mbarzegary
Reviewers: @yuefan98, @BradyPlanden, @brosaplanella
Archive: 10.5281/zenodo.14192733

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/54273964eb00e1e54308c19d54518891"><img src="https://joss.theoj.org/papers/54273964eb00e1e54308c19d54518891/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/54273964eb00e1e54308c19d54518891/status.svg)](https://joss.theoj.org/papers/54273964eb00e1e54308c19d54518891)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@yuefan98 & @BradyPlanden & @brosaplanella, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions/concerns, please let @mbarzegary know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks at the very latest.

Checklists

📝 Checklist for @yuefan98

📝 Checklist for @brosaplanella

📝 Checklist for @BradyPlanden

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.90  T=0.05 s (1133.1 files/s, 146764.4 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
MATLAB                          53            801           2888           2933
Markdown                         2             62              0            406
TeX                              1             24              0            246
YAML                             1              1              4             18
-------------------------------------------------------------------------------
SUM:                            57            888           2892           3603
-------------------------------------------------------------------------------

Commit count by author:

   200	COBRAPROsimulator
     1	CO-simulation BatteRy modeling for Accelerated PaRameter Optimization (COBRAPRO)

@editorialbot
Collaborator Author

Paper file info:

📄 Wordcount for paper.md is 2632

✅ The paper includes a Statement of need section

@editorialbot
Collaborator Author

License info:

✅ License found: MIT License (Valid open source OSI approved license)

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@mbarzegary

👋🏼 @yuefan98 @BradyPlanden @brosaplanella, this is the review thread for the paper. All of our communications will happen here from now on.

As a reviewer, the first step is to create a checklist for your review by entering

@editorialbot generate my checklist

at the top of a new comment in this thread.

These checklists contain the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. The first comment in this thread also contains links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews/issues/6803 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread.

We aim for reviews to be completed within about 4-6 weeks. Please feel free to ping me (@mbarzegary) if you have any questions/concerns.

@mbarzegary

@editorialbot check references

@mbarzegary

@COBRAPROsimulator this is where the review takes place. Please keep an eye out for comments here from the reviewers, as well as any issues opened by them on your software repository. I recommend you aim to respond to these as soon as possible, and you can address them straight away as they come in if you like, to ensure we do not lose track of the reviewers.

@brosaplanella

brosaplanella commented May 29, 2024

Review checklist for @brosaplanella

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/COBRAPROsimulator/COBRAPRO?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@COBRAPROsimulator) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data from research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@yuefan98

yuefan98 commented Jun 2, 2024

Review checklist for @yuefan98

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/COBRAPROsimulator/COBRAPRO?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@COBRAPROsimulator) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data from research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@yuefan98

yuefan98 commented Jun 3, 2024

@COBRAPROsimulator It seems there is a compatibility issue with the ARM version of MATLAB. I was able to run the code successfully with the Intel version of MATLAB. I have created an issue on the target repo:
COBRAPROsimulator/COBRAPRO#1

@yuefan98

yuefan98 commented Jun 3, 2024

@COBRAPROsimulator Overall, this is great work that clearly required significant effort. I have several comments that I would like to get clarified, and some features that I would like to see considered for a future release.

  1. The authors provide a decent amount of comments in the code, but minimal documentation. It would be great to have structured documentation, like other software packages, in the future.
  2. The overpotential in the simulation is almost constant and close to zero (cycle_CC.m). That means there is negligible resistance from the charge transfer reaction. However, for a real battery system, we would expect a much larger SOC-dependent charge transfer resistance and hence overpotential. Can you comment on why this is the case in your simulation? Is it because you are assuming a constant reaction rate over the SOC range? If that is the case, does the current version of the code support input of variable reaction rates?
  3. It seems that in most DC DFN/P2D models, double-layer capacitance is ignored. As a result, the overpotential lacks time dependence in the simulation. It might be worth including this in the future release, as the time dependence on overpotential might be important in certain applications.
  4. Can you comment on the time required to run DFN_pso_HPPC.m? It took more than an hour to run the calculation on my computer, and I frequently encountered the following error. Is this expected?
could not complete due to initialization issues during HPPC

Given that the code seems to run forever on my side, I want to confirm first whether the code is expected to return SOC-dependent estimates of the physical parameters for the DFN model, rather than a single set of parameters that fits the HPPC data over the whole SOC range.

@COBRAPROsimulator

@yuefan98, thank you very much for reviewing COBRAPRO! Here are the answers to your questions:

  1. Thank you for pointing this out. I can include more structured documentation in the future. Will this need to be in place for your review to be finalized?
  2. I think your question can be answered in two parts. First, the kinetic parameters used in cycle_CC.m are quite large: the positive electrode kinetic reaction rate (kp) is 3.9537e-08 [m^2.5.mol^-0.5.s^-1] and the negative electrode kinetic reaction rate (kn) is 1.4394e-08 [m^2.5.mol^-0.5.s^-1], which is probably why the overpotential is small for this battery chemistry. As a side note, kp and kn were identified from the HPPC profile for this battery data (NMC/Gr-Si). Second, we don't consider the SOC-dependence of kp and kn. The only "SOC-dependence" enters through the exchange current density (i_0) in the Butler-Volmer equation, given as i_0p = kp*(csp_max - csp_surf)^0.5 * c_e^0.5 * csp_surf^0.5 and i_0n = kn*(csn_max - csn_surf)^0.5 * c_e^0.5 * csn_surf^0.5 for the positive and negative electrodes, respectively (see the sketch after this list).
  3. Correct, this model does not include the double-layer capacitance. We will consider including in the future releases, thank you.
  4. Yes, DFN_pso_HPPC.m takes a while to run. Each HPPC simulation takes around ~74 seconds, and with 100 PSO particles the model needs to run 100 times for a single PSO iteration (roughly, 74 s × 500 particles ÷ 24 parallel workers is about 26 minutes per iteration, which adds up quickly over many iterations). I used a 24-core desktop (PSO ran 24 simulations in parallel), and it took me almost 1.5 days to run DFN_pso_HPPC.m using 500 particles. Also, the messages printed in the Command Window are not errors in the code, but are printed when the PSO failed to simulate the model for a particular particle; this just provides more information to the user if desired. The user can suppress these messages as noted in Line 138 of DFN_pso_HPPC.m.
    To clarify, the current code does not optimize for SOC-dependence of parameters. The PSO cost function calculates the RMSE between the experimental and simulated voltage over the entire SOC range, and tries to find a single set of parameters that best fits the HPPC data. In the future, we plan to include SOC-dependent optimization of parameters, but this will require closer investigation to consider parameter identifiability as a function of SOC.
    Lastly, we plan to link our recently accepted paper from the Journal of The Electrochemical Society which explains COBRAPRO's numerical implementation and parameter identification framework in detail.
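For illustration, here is a minimal MATLAB sketch of evaluating the exchange current densities defined above; kp and kn are the values quoted above, while the concentration values are example placeholders, not the values used in cycle_CC.m:

% Minimal sketch of the Butler-Volmer exchange current densities defined above.
% Concentration values are illustrative placeholders only.
kp       = 3.9537e-8;                       % positive electrode reaction rate [m^2.5.mol^-0.5.s^-1]
kn       = 1.4394e-8;                       % negative electrode reaction rate [m^2.5.mol^-0.5.s^-1]
c_e      = 1000;                            % electrolyte concentration [mol/m^3] (placeholder)
csp_max  = 48000;  csp_surf = 0.5*csp_max;  % positive particle max/surface concentration [mol/m^3] (placeholders)
csn_max  = 31000;  csn_surf = 0.5*csn_max;  % negative particle max/surface concentration [mol/m^3] (placeholders)

i_0p = kp*(csp_max - csp_surf)^0.5 * c_e^0.5 * csp_surf^0.5;   % positive electrode exchange current density
i_0n = kn*(csn_max - csn_surf)^0.5 * c_e^0.5 * csn_surf^0.5;   % negative electrode exchange current density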

Please let me know if there are any further questions or points I can clarify.

@yuefan98

yuefan98 commented Jun 6, 2024

@COBRAPROsimulator Thanks for the clarifications!

For 1. I am good with that unless other reviewers have comments on it.
For 2. That makes sense to me.
For 4. I was able to validate your HPPC result by reducing it to a two-particle optimization. I think the estimation provides a reasonable reaction rate given the fit to the instantaneous response. After reviewing the simulation in detail, I am a bit worried about the diffusion properties in the optimization, and it seems we generally get a faster solid-state diffusion estimate than the actual battery response. Can you comment on the diffusion coefficient estimation? It is assumed to be constant over SOC, right? Do you think the estimation can be improved by considering SOC dependence in the future, or is it more a feature of the model?

Lastly, I am glad that SOC-dependent optimization is considered as future work. I believe that will make this toolbox more useful for real applications.

Lastly, we plan to link our recently accepted paper from the Journal of The Electrochemical Society which explains COBRAPRO's numerical implementation and parameter identification framework in detail.

I think that will be really helpful!

@COBRAPROsimulator

@yuefan98, thank you. Here are my comments on point 4:

  • Could you please clarify what you mean by validating the HPPC results by reducing it to a two-particle optimization? Do you mean that you only used 2 particles in the PSO? What was the optimized objective function value (RMSE between simulated and experimental voltage, J_V) in this case? And approximately how long did your optimization take? Also, were you able to validate the identified HPPC values using the UDDS profile? As you may know, in parameter optimization, validation is typically carried out on a different profile (e.g., a driving cycle) than the one the parameters were identified with.
  • To answer your point on solid phase diffusion, I would like to add a general comment about the DFN model parameterization pipeline we proposed. The DFN model is only a simplified representation of a real battery and makes several assumptions, such as perfectly spherical particles, dynamics only in the x- and r-directions, and homogeneous electrode properties. In my PhD work, I have seen that when we take experimentally measured parameters from cell tear-down and directly input them into the model, the model cannot predict the behavior of the real cell. This motivates the need for parameter identification. However, we cannot and should not identify/optimize all the DFN parameters at once, since the DFN model is overparameterized. We use a layered identification technique to improve parameter identifiability, where the geometrical parameters are taken from cell tear-down measurements, the stoichiometric parameters are extracted from C/20 data, and the remaining transport and kinetic parameters are identified with the HPPC profile. For the HPPC identification, it is important to note that prior to conducting the HPPC PSO, we conduct a practical identifiability analysis to determine the identifiable parameters that will be optimized using the HPPC data. The identifiability analysis is conducted through LSA and correlation analysis in the Examples/Local_Sensitivity_Analysis/DFN_LSA_Corr_HPPC.m file. The user can use this code as a tool to determine the "identifiable" parameters, which can then be optimized in the Examples/Parameter_Identification_Routines/DFN_pso_HPPC.m file. This pipeline is summarized in Fig. 1 of the paper.
  • To summarize, the goal is not for the identified transport and kinetic parameters to match the actual/real values measured from cell tear-down, since the DFN model is just a simplified representation of a real battery, but rather to identify a solid phase diffusion coefficient that minimizes the RMSE between the experimental and simulated voltage and state of charge (see the sketch after this list). Furthermore, this is why the UDDS validation is carried out: to check that the HPPC-identified parameters work well on a new/different profile.
  • Our ECS paper, developed to complement the COBRAPRO code, explains this DFN model parameterization in detail. Also, it is important to note that this code was developed to be a tool to carry out parameter identifiability and identification, but the actual parameters being identified may change depending on the user's experimental profile and cell chemistry.
  • You raise a good point that in the current code the solid phase diffusion coefficient is not a function of SOC. We can add this feature in the future, but it will require some development since the radial discretization method will need to be reconstructed to allow SOC-dependent Dsp and Dsn coefficients.
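As a rough illustration of the voltage RMSE mentioned above, a minimal MATLAB sketch (saved as its own function file) could look like the following; the function and variable names are placeholders for illustration, not COBRAPRO's actual API:

% Minimal sketch: RMSE between simulated and experimental voltage, assuming
% both vectors are sampled on the same time grid. Names are illustrative only.
function J_V = voltageRMSE(V_sim, V_exp)
    res = V_sim(:) - V_exp(:);     % residual at each sample [V]
    J_V = sqrt(mean(res.^2));      % root-mean-square error [V]
end

The same form applies to the state-of-charge RMSE; the PSO then minimizes the (possibly weighted) sum of such terms.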

I hope I answered your question. Please let me know if you have any other questions.

@yuefan98

yuefan98 commented Jun 7, 2024

  • Yes, I just ran your cycle_HPPC.m with 2 particles in the PSO, and cycle_UDDS.m. By reducing it to two particles I could run your code in around ~10 mins, but I think that's sufficient for me to validate your code. I think I am all good with that.
  • Thanks for the clarification. I agree that battery parameterization is complicated and all those parameters can be coupled together. I think your pipeline is good and reasonable.
  • I think everything is good from my side at this point.

@mbarzegary

@yuefan98 thanks a lot for finalizing your review.
@BradyPlanden how is your review going?

@brosaplanella

brosaplanella commented Jul 2, 2024

Here's my review. Overall, the package provides a novel solution to the parameterisation of physics-based battery models (in this case the DFN). The code is functional, the article is well written, and the submission meets all the points in the checklist (though note I have some questions regarding performance and testing, see below). I provide a list of comments to address before publication, split into major and minor comments.

Major comments

  • One of the main claims in the paper is performance, and it is stated that the 1C discharge runs in less than a second (a similar order of magnitude to PyBaMM). When running it on my laptop (I tic/toc'ed the first 1C discharge in cycle_CC.m), it took around 8 seconds (the PyBaMM time on my machine is similar to that reported by the authors). I understand that the discrepancy is probably due to pre/post-processing and printing to screen, but it would be good to have one example with minimal overhead which can demonstrate the performance claimed in the article.
  • I agree with @yuefan98 that the documentation is a bit sparse. Even though for a research tool the current documentation might be enough, it should be expanded in the near future (I am also happy for this to take place after publication). I believe the main issue is that, even though the examples work straight out of the box, it would be extremely hard for a user to adapt the code for their own needs. I believe that publishing some MATLAB notebooks commenting on the steps would be very useful.
  • Similarly, tests are quite sparse. At the moment the tests only check that CasADI runs and that one particular example works, but these need to be run manually. I would recommend some continuous integration on GitHub (or similar) to test that things work periodically, but I don't know if that is feasible with proprietary software like MATLAB. In any case, some additional tests should be written, at the very least one to check that the parameter fitting tools work as expected (see the sketch after this list).
  • At the moment the list of CONTRIBUTORS is a bit opaque. All the commits have been pushed by COBRAPROsimulator, which does not tell who that is or whether it is one or multiple users. This could be an issue if other people join the project or the person behind COBRAPROsimulator changes institutions. I would recommend having a list of contributors in the README, where contributors are acknowledged. The all-contributors tool is quite useful for this.
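For example, a function-based MATLAB test along these lines could be run locally or in CI (MathWorks provides official GitHub Actions for MATLAB); note that run_1C_discharge is a hypothetical wrapper around one of the existing examples, not a function currently in the toolbox:

% Minimal sketch of a function-based MATLAB test; run with
% runtests('test_cobrapro_examples'). run_1C_discharge is a hypothetical
% wrapper around one of the COBRAPRO examples, not an existing function.
function tests = test_cobrapro_examples
    tests = functiontests(localfunctions);
end

function test_1C_discharge_runs(testCase)
    out = run_1C_discharge();                            % hypothetical wrapper
    verifyNotEmpty(testCase, out.voltage)                % the solver produced output
    verifyGreaterThan(testCase, min(out.voltage), 2.0)   % example lower bound on voltage [V]
end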

Minor comments

  • The following comments refer to the installation instructions.
    • The instructions require SUNDIALS 2.6.2 but the hyperlink points to 2.6.1. In addition, this version is quite old. Is there any particular reason to not use a more recent version?
    • The instructions say CasADI should be installed, but they don't specify for which platform. Even though it is quite obvious, it would be helpful to specify it needs to be the MATLAB version.
    • The instructions assume MATLAB is installed, with some toolboxes. It should be stated explicitly at the beginning of the "Installation" section.
  • The following comments refer to the installation process in MATLAB:
    • There is a step that asks "Install toolbox" that is not reflected in the installation instructions.
    • It would be good to always have a default answer to most of the prompts (so just hitting enter gets you through), as the fact that some answers require a yes and some a no is a bit confusing. If that is already implemented, users should be made aware of what the default is.
  • The title of the correlation matrix plot (e.g. when running DFN_LSA_Corr_CC.m) does not render LaTeX.
  • The DFN_pso_0_05C.m example also took a long time for me (around 30 minutes) and some errors/warnings came up. It would be good to clarify in the documentation that this example might take a while to run.
  • The following comments refer to the article:
    • [l33] I would remove all the references here, as they are all cited again and discussed in more detail later on.
    • [l47] "Other packages resort to literature-derived parameters and lack the ability to predict battery behaviour". I think this statement is a bit of a stretch. All the aforementioned tools will provide models with the ability to predict real battery behaviour, as long as the user provides the right parameter values. COBRAPRO provides tools to estimate those parameters (which is no small feat), while the others don't, so I think this would be a more accurate statement.
    • [l54]: "COBRAPRO leverages a fast solver". Which one? Give details and references.
    • [l64] About Challenge 2, I think it is a bit misleading. I suspect most of the other tools also look for consistent initial conditions (PyBaMM definitely does, and I suspect COMSOL does too; haven't played much with the other tools). If any of the other tools does not do that, it should be stated clearly. Otherwise, I think this challenge would be better merged with Challenge 1, as it all ties to dealing with the computational complexity.
    • [l185, but there are other occurrences through the text] The subscripts denoting labels (e.g. n, p, e...) should be in Roman text (i.e. \mathrm{e}) in LaTeX.
    • [Fig3&5] The plots showing the SOC are not very insightful. I imagine they are computed through Coulomb counting in both cases, and given that the experiment is current driven they need to be virtually identical (see the Coulomb-counting sketch after this list). I believe a plot of the voltage error between experiment and simulation would be a more valuable output of the code.
    • [l246] Given that this list is bound to change as the code evolves, I think it would be better to not include it in the article and, instead, point the users to the README where an up-to-date list will be available.
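For reference, the Coulomb-counting SOC I have in mind is just the integral of the applied current, e.g. in MATLAB (all values below are placeholders):

% Minimal sketch of SOC via Coulomb counting (positive current = discharge).
t    = 0:1:3600;          % time [s] (placeholder)
I    = 5*ones(size(t));   % applied current [A] (placeholder: constant 5 A)
Q    = 5;                 % cell capacity [Ah] (placeholder)
soc0 = 1;                 % initial SOC
soc  = soc0 - cumtrapz(t, I)/(3600*Q);   % SOC over time

which is why, for a current-driven experiment, the experimental and simulated SOC traces end up essentially the same.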

@mbarzegary

@BradyPlanden how is your review going?

@mbarzegary

@COBRAPROsimulator can you please provide an update on the above, in terms of responding to the issues raised by @brosaplanella?

@COBRAPROsimulator

Hi @mbarzegary and @brosaplanella, I sincerely apologize for the delay. I will reply to @brosaplanella's comments soon. Thank you for your understanding.

@BradyPlanden

BradyPlanden commented Aug 2, 2024

Review checklist for @BradyPlanden

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/COBRAPROsimulator/COBRAPRO?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@COBRAPROsimulator) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data from research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@BradyPlanden

Apologies for the delay, here is my review in full.

Overall, this paper is well written and makes a novel contribution to the subfield of parameterisation for electrochemical battery modelling. As the current work has all been completed through the @COBRAPROsimulator account, I have assumed that it is the first author. As Ferran mentioned above, setting up a contributor list would be helpful and would enable expansion in the future.

Repository Comments

Article Comments

  • Presently, the comparison with alternative work in the field is aimed at pure forward-modelling packages (PyBaMM, PetLion.jl, LionSimba) which do not aim to solve the inverse problem; a more extensive literature review will find other implementations that are more suitable for comparison.
  • A table showing the full set of parameters identified for the example would be helpful to summarise the results in one place for the reader.
  • Similarly, a table of correlation metrics for the parameters in the LSA section of the example.

@mbarzegary

@COBRAPROsimulator can you please provide an update on the raised issues? We need to move forward with this submission.

@mbarzegary

@COBRAPROsimulator given the green light from the reviewers, we will now work towards processing this for acceptance in JOSS. So please:

  • Merge my PR containing some minor edits
  • Work on the author's points of the final checklist I created ☝️

I can then move forward with recommending acceptance of the submission.

@COBRAPROsimulator

COBRAPROsimulator commented Nov 20, 2024

Hi @mbarzegary, thank you very much for initiating the post-review process. I have completed the post-review tasks as you mentioned:

  • I have double checked the authors and affiliations (including ORCIDs)
  • The version of the software that will be used in the JOSS paper is v2.0.0 (includes all changes made from the review process).
  • I have created a Zenodo archive and the DOI is: 10.5281/zenodo.14192734
  • I made sure that the title and author list (including ORCIDs) in the archive match those in the JOSS paper.
  • I made sure that the license listed for the archive is the same as the software license.

Finally, I have merged your PR containing the minor edits. Thank you for pointing out the spelling errors.

@mbarzegary

@COBRAPROsimulator Nice. The only issue I see is that the files in the archive are uploaded one by one, meaning that the directory structure of the repository is lost. Can you please instead upload a zip file of the repository so that the directory structure is retained? An example would be this: https://zenodo.org/records/14052179

@COBRAPROsimulator

COBRAPROsimulator commented Nov 25, 2024

Hi @mbarzegary, thank you for pointing this out. I have uploaded a zip folder instead to retain the folder structure. The new Zenodo DOI is: 10.5281/zenodo.14192733

@mbarzegary

@editorialbot generate pdf

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@mbarzegary

@editorialbot set 10.5281/zenodo.14192733 as archive

@editorialbot
Collaborator Author

Done! archive is now 10.5281/zenodo.14192733

@mbarzegary

@editorialbot set v2.0.0 as version

@editorialbot
Collaborator Author

Done! version is now v2.0.0

@mbarzegary

@editorialbot check references

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1149/1.2221597 is OK
- 10.1149/2.0301603jes is OK
- 10.1016/j.energy.2022.125966 is OK
- 10.1149/1945-7111/ab7bd7 is OK
- 10.1149/2.0551509jes is OK
- 10.1149/2.0321816jes is OK
- 10.1149/1945-7111/ab9050 is OK
- 10.5334/jors.309 is OK
- 10.1149/2.0291607jes is OK
- 10.1149/2.0171711jes is OK
- 10.1016/j.jpowsour.2016.12.083 is OK
- 10.4271/2023-01-5047 is OK
- 10.1016/j.dib.2022.107995 is OK
- 10.1149/1945-7111/ac201c is OK
- 10.1149/1945-7111/ad7292 is OK
- 10.1149/2.0051908jes is OK
- 10.1149/1945-7111/ac22c8 is OK
- 10.1016/j.compchemeng.2011.01.003 is OK
- 10.1016/j.compchemeng.2015.07.002 is OK
- 10.1149/1945-7111/ad1293 is OK
- 10.1145/3539801 is OK
- 10.1145/1089014.1089020 is OK
- 10.1016/j.ensm.2021.10.023 is OK
- 10.1149/1945-7111/ad1293 is OK
- 10.1109/TCST.2020.3017566 is OK
- 10.1149/1945-7111/ab7091 is OK
- 10.1016/j.jpowsour.2012.03.009 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: COMSOL Multiphysics&copy; v. 6.2
- No DOI given, and none found for title: Battery Design Module User’s Guide, COMSOL Multiph...
- No DOI given, and none found for title: fastDFN

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@mbarzegary

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1149/1.2221597 is OK
- 10.1149/2.0301603jes is OK
- 10.1016/j.energy.2022.125966 is OK
- 10.1149/1945-7111/ab7bd7 is OK
- 10.1149/2.0551509jes is OK
- 10.1149/2.0321816jes is OK
- 10.1149/1945-7111/ab9050 is OK
- 10.5334/jors.309 is OK
- 10.1149/2.0291607jes is OK
- 10.1149/2.0171711jes is OK
- 10.1016/j.jpowsour.2016.12.083 is OK
- 10.4271/2023-01-5047 is OK
- 10.1016/j.dib.2022.107995 is OK
- 10.1149/1945-7111/ac201c is OK
- 10.1149/1945-7111/ad7292 is OK
- 10.1149/2.0051908jes is OK
- 10.1149/1945-7111/ac22c8 is OK
- 10.1016/j.compchemeng.2011.01.003 is OK
- 10.1016/j.compchemeng.2015.07.002 is OK
- 10.1149/1945-7111/ad1293 is OK
- 10.1145/3539801 is OK
- 10.1145/1089014.1089020 is OK
- 10.1016/j.ensm.2021.10.023 is OK
- 10.1149/1945-7111/ad1293 is OK
- 10.1109/TCST.2020.3017566 is OK
- 10.1149/1945-7111/ab7091 is OK
- 10.1016/j.jpowsour.2012.03.009 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: COMSOL Multiphysics&copy; v. 6.2
- No DOI given, and none found for title: Battery Design Module User’s Guide, COMSOL Multiph...
- No DOI given, and none found for title: fastDFN

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot
Collaborator Author

⚠️ Error preparing paper acceptance. The generated XML metadata file is invalid.

ID ref-xu_comparative_2023 already defined

@mbarzegary

Hi @kyleniemeyer, can you please help here? There is an error while preparing paper acceptance.

@mbarzegary

@openjournals/dev can you help us with the issue above?

@xuanxu
Member

xuanxu commented Dec 5, 2024

The entry xu_comparative_2023 is duplicated in the .bib file

@mbarzegary
Copy link

@xuanxu thank you for spotting the issue.

@mbarzegary

@COBRAPROsimulator Can you please fix the above issue by removing the duplicated entry?

@COBRAPROsimulator

@xuanxu, thank you for pointing out the issue. @mbarzegary I have removed the duplicated entry in the bib file.

@mbarzegary

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1149/1.2221597 is OK
- 10.1149/2.0301603jes is OK
- 10.1016/j.energy.2022.125966 is OK
- 10.1149/1945-7111/ab7bd7 is OK
- 10.1149/2.0551509jes is OK
- 10.1149/2.0321816jes is OK
- 10.1149/1945-7111/ab9050 is OK
- 10.5334/jors.309 is OK
- 10.1149/2.0291607jes is OK
- 10.1149/2.0171711jes is OK
- 10.1016/j.jpowsour.2016.12.083 is OK
- 10.4271/2023-01-5047 is OK
- 10.1016/j.dib.2022.107995 is OK
- 10.1149/1945-7111/ac201c is OK
- 10.1149/1945-7111/ad7292 is OK
- 10.1149/2.0051908jes is OK
- 10.1149/1945-7111/ac22c8 is OK
- 10.1016/j.compchemeng.2011.01.003 is OK
- 10.1016/j.compchemeng.2015.07.002 is OK
- 10.1149/1945-7111/ad1293 is OK
- 10.1145/3539801 is OK
- 10.1145/1089014.1089020 is OK
- 10.1016/j.ensm.2021.10.023 is OK
- 10.1109/TCST.2020.3017566 is OK
- 10.1149/1945-7111/ab7091 is OK
- 10.1016/j.jpowsour.2012.03.009 is OK

🟡 SKIP DOIs

- No DOI given, and none found for title: COMSOL Multiphysics&copy; v. 6.2
- No DOI given, and none found for title: Battery Design Module User’s Guide, COMSOL Multiph...
- No DOI given, and none found for title: fastDFN

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot
Collaborator Author

👋 @openjournals/pe-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#6236, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

@editorialbot added the recommend-accept (Papers recommended for acceptance in JOSS.) label on Dec 9, 2024
@kyleniemeyer

@COBRAPROsimulator just one minor issue in the paper to resolve: the COMSOL reference appears incorrectly ("COMSOL multiphysics©"). Can you please fix this?

@COBRAPROsimulator

@kyleniemeyer, thank you for pointing that out. I have fixed the COMSOL reference. Please let me know if there are any other issues that come up.

@COBRAPROsimulator

Hello @kyleniemeyer, is there a status update regarding our paper? Please let us know if there are any issues that we need to address. Thank you.
