Tidy up (#54)
🔥 Remove unused files (optics.py, exceptions.py, and images.py) and the unused deep_equals helper.

🤖 Fix some minor typos.
arafune authored Dec 5, 2024
1 parent b26a77a commit 90c7784
Showing 77 changed files with 780 additions and 518 deletions.
14 changes: 7 additions & 7 deletions docs/source/CHANGELOG.rst
@@ -20,7 +20,7 @@ Changed
* Add new endstation
* Logging and endstation plugin can be selected from local_config.py

-Miror
+Minor
~~~~~

* Recommend to use uv, instead of rye.
@@ -36,7 +36,7 @@ The name change
* corrections -> correction

This is somehow backwards incopatibilities. However, the effect for most is really minor,
-because this functionalites are not so frequently used. Thus the major version number has not been changed.
+because this functionalities are not so frequently used. Thus the major version number has not been changed.

* New UI

@@ -71,7 +71,7 @@ Changed
* Remove G.extent
* Remove overlapped_stack_dispersion_plot
- use stack_dispersion_plot_with_appropriate_args
-* Revise the k-conversion. The origianl version is correct from the view of the coding, but incorrect from the physics!
+* Revise the k-conversion. The original version is correct from the view of the coding, but incorrect from the physics!
* introduce new attrs, "energy_notation". if not specified, attrs["energy_notation"] = "Binding" is assumed to keep the consistency from the previous version.

* see Changes.md for others
@@ -120,7 +120,7 @@ New
and the content available greatly expanded.

* Tutorials for common types of analysis are available as Jupyter notebooks.
-* An organized API documentation page is availabe.
+* An organized API documentation page is available.
* Docstrings have been massively expanded to cover the public API
and most of the internal API.
* The documentation build process has been simplified.
@@ -446,7 +446,7 @@ New:
~~~~

1. Improved API documentation.
-2. Most recent interative plot context is saved to
+2. Most recent interactive plot context is saved to
``arpes.config.CONFIG['CURRENT_CONTEXT']``. This allows simple and
transparent recovery in case you forget to save the context and
performed a lot of work in an interactive session. Additionally, this
@@ -459,7 +459,7 @@ New:
Changed:
~~~~~~~~

-1. Metadata reworked to a common format accross all endstations. This is
+1. Metadata reworked to a common format across all endstations. This is
now documented appropriately with the data model.

.. _fixed-8:
@@ -494,7 +494,7 @@ New:
ARPES experiments from inside PyARPES.

1. As an example: After conducting nano-XPS, you can use PCA to
-select your sample region and export a scan sequnce just over the
+select your sample region and export a scan sequence just over the
sample ROI or over the border between your sample and another
area.

2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -30,7 +30,7 @@
# The full version, including alpha/beta/rc tags
release = arpes.__version__

-# supress some output information for nbconvert, don't open tools
+# suppress some output information for nbconvert, don't open tools

nbsphinx_allow_errors = True

2 changes: 1 addition & 1 deletion docs/source/contributing.rst
@@ -27,7 +27,7 @@ repo <https://gitlab.com/lanzara-group/python-arpes>`__.
What you’ll need
----------------

-Here’s a summary of what you’ll need to do, if you’are already familar
+Here’s a summary of what you’ll need to do, if you’re already familiar
with contributing to open source. If you are less familiar, much more
detail on this is described in the :doc:`developer’s guide </dev-guide>`.

2 changes: 1 addition & 1 deletion docs/source/curve-fitting.rst
@@ -20,7 +20,7 @@ complicated models using operators like ``+`` and ``*``, but also makes
the process of curve fitting transparent and simple.

Here we will prepare an EDC with a step edge, and fit it with a linear
-density of states multipled by the Fermi distribution and convolved with
+density of states multiplied by the Fermi distribution and convolved with
Gaussian instrumental broadening (``AffineBroadenedFD``). In general in
PyARPES, we use extensions of the models available in ``lmfit``, which
provides an ``xarray`` compatible and unitful fitting function
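The curve-fitting page above describes fitting an EDC step edge with a linear density of states multiplied by the Fermi distribution. A minimal stand-alone sketch of that idea with `scipy.optimize.curve_fit` — the function and parameter names here are illustrative, not the PyARPES/`lmfit` API, and the Gaussian instrumental broadening is omitted for brevity:

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi_edge(E, center, width, slope, offset):
    # Linear density of states multiplied by a Fermi-Dirac step edge.
    return (slope * E + offset) / (1.0 + np.exp((E - center) / width))

# Synthetic EDC: edge at E = 0, thermal width 10 meV, plus noise.
E = np.linspace(-0.2, 0.1, 400)
rng = np.random.default_rng(0)
data = fermi_edge(E, 0.0, 0.01, -1.0, 1.0) + rng.normal(scale=0.01, size=E.size)

popt, _ = curve_fit(fermi_edge, E, data, p0=(0.02, 0.02, 0.0, 0.8))
# popt[0] recovers the edge position, close to 0.0
```

In PyARPES itself the analogous model is `AffineBroadenedFD` fit via `guess_fit`, which also handles the instrumental convolution.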
2 changes: 1 addition & 1 deletion docs/source/data-provenance.rst
@@ -126,7 +126,7 @@ appropriately.
Additionally, because PyARPES includes recent Juypyter cell evaluations
in the record, you shouldn’t find the need to be very defensive in your
use of manual provenance tracking, although you should wrap code that
-forms part of your more permanent analysis repetoire.
+forms part of your more permanent analysis repertoire.

.. figure:: _static/decorator-provenance.png
:alt: Example data provenance in PyARPES
2 changes: 1 addition & 1 deletion docs/source/dev-guide.rst
@@ -8,7 +8,7 @@ Installing an editable copy of PyARPES
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1. Install `uv <https://docs.astral.sh/uv/guides/projects/>` to make an isolated environment for development.
-2. Clone the respository
+2. Clone the repository

.. code:: bash
2 changes: 1 addition & 1 deletion docs/source/faq.rst
@@ -4,7 +4,7 @@ Frequently Asked Questions
Igor Installation
-----------------

-Using the suggested invokation I get a pip error
+Using the suggested invocation I get a pip error
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Pip on Windows appears not to like certain archival formats. While
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -19,7 +19,7 @@ I recommend igor to analyze, especially for students.



-**December 2020, V3 Release**: The current relase focuses on improving
+**December 2020, V3 Release**: The current release focuses on improving
usage and workflow for less experienced Python users, lifting version
incompatibilities with dependencies, and ironing out edges in the user
experience.
2 changes: 1 addition & 1 deletion docs/source/installation.rst
@@ -21,7 +21,7 @@ available either from the main repository at
`GitHub <https://github.com/arafune/arpes>`.

1. Install `uv <https://docs.astral.sh/uv/guides/projects/>`__
-2. Clone or otherwise download the respository
+2. Clone or otherwise download the repository

.. code:: bash
2 changes: 1 addition & 1 deletion docs/source/interactive.rst
@@ -65,7 +65,7 @@ Laying Lineshapes for Curve Fitting
-----------------------------------

Use ``.S.show_band_tool()`` to get an interactive lineshape/band layer
-to set inital locations and model structure for complicated curve fits.
+to set initial locations and model structure for complicated curve fits.

Examining Fits
--------------
2 changes: 1 addition & 1 deletion docs/source/notebooks/basic-data-exploration.ipynb
@@ -255,7 +255,7 @@
"source": [
"### PyARPES philosophy on interactive tools\n",
"\n",
"Instead of one large interactive application where you perform analysis, PyARPES has many small interactive utilities which are built for a single purpose. From Ver 4.0.1 (pragmatically, from 4.0), the Qt based interactive tools are depricated. Thus, the following tools cannot be used:\n",
"Instead of one large interactive application where you perform analysis, PyARPES has many small interactive utilities which are built for a single purpose. From Ver 4.0.1 (pragmatically, from 4.0), the Qt based interactive tools are deprecated. Thus, the following tools cannot be used:\n",
"\n",
"1. A data browser: `arpes.plotting.qt_tool`\n",
"2. A momentum offset browser: `arpes.plotting.qt_ktool.ktool`\n",
2 changes: 1 addition & 1 deletion docs/source/notebooks/converting-to-kspace.ipynb
@@ -305,7 +305,7 @@
"source": [
"Excellent, this enables all kinds of analysis which we frequently want to perform.\n",
"\n",
"## Exactracting a Momentum Cut Passing Thorugh Known Angular Coordinates\n",
"## Exactracting a Momentum Cut Passing Through Known Angular Coordinates\n",
"\n",
"For instance, we can determine a momentum cut passing through our point of interest."
]
6 changes: 3 additions & 3 deletions docs/source/notebooks/curve-fitting.ipynb
@@ -15,7 +15,7 @@
"\n",
"PyARPES uses `lmfit` in order to provide a user friendly, compositional API for curve fitting. This allows users to define more complicated models using operators like `+` and `*`, but also makes the process of curve fitting transparent and simple.\n",
"\n",
"Here we will prepare an EDC with a step edge, and fit it with a linear density of states multipled by the Fermi distribution and convolved with Gaussian instrumental broadening (`AffineBroadenedFD`). In general in PyARPES, we use extensions of the models available in `lmfit`, which provides an xarray compatible and unitful fitting function `guess_fit`. This has more or less the same call signature as `fit except that we do not need to pass the X and Y data separately, the X data is provided by the dataset coordinates."
"Here we will prepare an EDC with a step edge, and fit it with a linear density of states multiplied by the Fermi distribution and convolved with Gaussian instrumental broadening (`AffineBroadenedFD`). In general in PyARPES, we use extensions of the models available in `lmfit`, which provides an xarray compatible and unitful fitting function `guess_fit`. This has more or less the same call signature as `fit except that we do not need to pass the X and Y data separately, the X data is provided by the dataset coordinates."
]
},
{
@@ -79,7 +79,7 @@
"id": "6",
"metadata": {},
"source": [
"## Influencing the fit by setting parametrs\n",
"## Influencing the fit by setting parameters\n",
"\n",
"Using the `params=` keyword you can provide initial guess with `value`, enforce a `max` or `min`, and request that a parameter be allowed to `vary` or not. In this case, we will force a fit with the step edge at 10 millivolts, obtaining a substantially worse result.\n",
"\n",
@@ -290,7 +290,7 @@
"source": [
"## Interactively inspecting fits\n",
"\n",
"There's no substitute for inspecting fits by eye. PyARPES has holoviews based interactive fit inspection tools. This is very much like `profile_view` which we have already seen with the adddition that the marginal shows the curve fitting information for a broadcast fit. \n",
"There's no substitute for inspecting fits by eye. PyARPES has holoviews based interactive fit inspection tools. This is very much like `profile_view` which we have already seen with the addition that the marginal shows the curve fitting information for a broadcast fit. \n",
"\n",
"Additionally, you can use the tool to copy any given marginal's parameters to a hint dictionary which you can pass into the curve fit\n",
"for refinement."
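The notebook hunk above mentions forcing the step edge to 10 millivolts via the `params=` keyword and obtaining a substantially worse result. A hedged sketch of the same experiment using plain `scipy` (the model and names are illustrative stand-ins; in PyARPES you would pass `params=` to `guess_fit` instead):

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi_edge(E, center, width, slope, offset):
    # Linear density of states times a Fermi-Dirac step edge.
    return (slope * E + offset) / (1.0 + np.exp((E - center) / width))

E = np.linspace(-0.2, 0.1, 400)
rng = np.random.default_rng(1)
data = fermi_edge(E, 0.0, 0.01, -1.0, 1.0) + rng.normal(scale=0.01, size=E.size)

# A free fit, then a fit with the edge pinned at +10 mV (0.01 eV) --
# analogous to fixing `center` with the params= keyword.
free, _ = curve_fit(fermi_edge, E, data, p0=(0.0, 0.02, -0.5, 0.8))
pinned, _ = curve_fit(
    lambda E, width, slope, offset: fermi_edge(E, 0.01, width, slope, offset),
    E, data, p0=(0.02, -0.5, 0.8),
)
res_free = np.sum((data - fermi_edge(E, *free)) ** 2)
res_pinned = np.sum((data - fermi_edge(E, 0.01, *pinned)) ** 2)
# The constrained fit leaves a substantially larger residual.
```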
2 changes: 1 addition & 1 deletion docs/source/notebooks/custom-dot-t-functionality.ipynb
@@ -140,7 +140,7 @@
"source": [
"## Functional Programming Primitives: `filter` and `map`\n",
"\n",
"You can `filter` or conditionally remove some of a datasets contents. To do this over coordinates on a dataset according to a function/sieve which accepts the coordinate and data value, you can use `filter_coord`. The sieving function should accept two arguments, the coordinate and the cut at that coordinate respectively. You can specify which coordinate or coordinates are iterated across when filtering using the `coordinate_name` paramter.\n",
"You can `filter` or conditionally remove some of a datasets contents. To do this over coordinates on a dataset according to a function/sieve which accepts the coordinate and data value, you can use `filter_coord`. The sieving function should accept two arguments, the coordinate and the cut at that coordinate respectively. You can specify which coordinate or coordinates are iterated across when filtering using the `coordinate_name` parameter.\n",
"\n",
"As a simple, example, we can remove all the odd valued coordinates along Y:"
]
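The `filter_coord` behavior described in this hunk — sieving slices by a function of (coordinate, cut) — can be mimicked without PyARPES in a few lines of NumPy. The `filter_coord` below is a hypothetical re-implementation for illustration, not the real one, which operates on `xarray` objects:

```python
import numpy as np

def filter_coord(data, coords, sieve):
    # Keep only the slices along axis 0 whose (coordinate, cut) pass the sieve.
    keep = [i for i, c in enumerate(coords) if sieve(c, data[i])]
    return coords[keep], data[keep]

y = np.arange(6)                      # coordinate values 0..5
arr = np.arange(6 * 3).reshape(6, 3)  # one "cut" of length 3 per y value

# Drop all odd-valued Y coordinates, as in the docs example.
y_even, arr_even = filter_coord(arr, y, lambda c, cut: c % 2 == 0)
# y_even is [0, 2, 4]; arr_even keeps the matching rows
```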
4 changes: 2 additions & 2 deletions docs/source/notebooks/full-analysis-xps.ipynb
@@ -457,8 +457,8 @@
"source": [
"## Exercises\n",
"\n",
"1. Peform a selection of the data according to the fitting values and plot the corresponding mask. What does the collection of points for which **Lower Peak BE** > -34.4eV look like?\n",
"2. Peform a selection of the data for the decompositions above and plot their corresponding average XPS curves. How do these results compare to the PCA results we found before?\n",
"1. Perform a selection of the data according to the fitting values and plot the corresponding mask. What does the collection of points for which **Lower Peak BE** > -34.4eV look like?\n",
"2. Perform a selection of the data for the decompositions above and plot their corresponding average XPS curves. How do these results compare to the PCA results we found before?\n",
"3. What might you conclude about the sample and experimental conditions given the extracted peak width map?"
]
}
2 changes: 1 addition & 1 deletion docs/source/notebooks/jupyter-crash-course.ipynb
@@ -37,7 +37,7 @@
"id": "4",
"metadata": {},
"source": [
"You can use this with library functions, PyARPES funtions, or your own."
"You can use this with library functions, PyARPES functions, or your own."
]
},
{
2 changes: 1 addition & 1 deletion docs/source/spectra.rst
@@ -42,7 +42,7 @@ cases, this imaged axis will always be treated in the same role as the
high-resolution motion axis of a nano-ARPES system.

Working in two coordinate systems is frustrating, and it makes comparing
-data cumbersome. In PyARPES x,y,z is always the total inferrable
+data cumbersome. In PyARPES x,y,z is always the total inferable
coordinate value, i.e. (+/- long range +/- high resolution) as
appropriate. You can still access the underlying coordinates in this
case as ``long_{dim}`` and ``short_{dim}``.
2 changes: 1 addition & 1 deletion docs/source/stack-plots.rst
@@ -63,7 +63,7 @@ axes with ``cbarmap=``. Utilities for colorbars are

Finally, as an example of how you might use the code in a real
situation, we can do some preprocessing of the data before creating the
-figure. Here we subract and normalize by the low temperature data, which
+figure. Here we subtract and normalize by the low temperature data, which
highlights the Fermi edge width changing.

.. figure:: _static/flat-stack-difference.png
2 changes: 1 addition & 1 deletion docs/source/writing-interactive-tools.rst
@@ -191,7 +191,7 @@ data (``xy``) and the transformed data (``f(xy)``). To do this we used
the utility function ``generate_marginal_for`` that can be used to
create browsable marginal plots for high dimensional data sets. Here we
do not want to integrate out any dimensions so we passed an tuple as the
-first argument. With the rest of the invokation we specify to add the
+first argument. With the rest of the invocation we specify to add the
plot to the layout ``self.content_layout`` in the locations (0,0) and
(1,0). Because we are not linking plots we don’t need cursors.

File renamed without changes.
File renamed without changes.
87 changes: 53 additions & 34 deletions src/arpes/analysis/band_analysis.py
@@ -162,16 +162,25 @@ def dataarray_for_value(
*,
is_value: bool,
) -> xr.DataArray | None:
"""Return DataArray representing the fit results.
"""Return a DataArray representing the fit results for a specific parameter.
Args:
param_name (Literal["center", "amplitude", "sigma", "gamma"]): [TODO:description]
i (int): index for band names in identified_band_results.
is_value (bool): if True, return the value, else return stderr.
This function retrieves the values (or standard errors) of a specified fit parameter
(such as "center", "amplitude", "sigma", or "gamma") for each band in the
`identified_band_results`. The result is returned as an `xr.DataArray`. If the parameter
is not available in the fitting results for a given band, `None` is returned.
Returns: xr.DataArray | None
DataArray storing the fitting data. if the corresponding parameter name is not used,
returns None.
Args:
param_name (Literal["center", "amplitude", "sigma", "gamma"]): The name of the fit
parameter whose values are being retrieved (e.g., "center", "amplitude", etc.).
i (int): Index for band names in the `identified_band_results` list. It is used to
identify the correct band in the results.
is_value (bool): If `True`, the function returns the fit parameter's value; if
`False`, it returns the standard error (stderr) of the fit.
Returns:
xr.DataArray | None: An `xr.DataArray` containing the fit parameter values
(or stderr). Returns `None` if the corresponding parameter is not found for the
given index.
"""
values: NDArray[np.float64] = np.zeros_like(
band_results.values,
@@ -303,13 +312,25 @@ def _modelresult_to_array(
prefix: str = "",
weights: tuple[float, float, float] = (2, 0, 10),
) -> NDArray[np.float64]:
"""Convert ModelResult to NDArray.
"""Convert ModelResult to a weighted NDArray of fit parameter values.
This function extracts the values and standard errors for the parameters
"sigma", "gamma", "amplitude", and "center" from the `model_fit` object,
applies weights for each parameter (sigma, amplitude, center), and
returns the result as a NumPy array.
If any parameter is missing from `model_fit`, a default value and
standard error are assigned. The weights are applied to the parameters
during the conversion process.
Args:
model_fit (ModelResult): [TODO:description]
prefix (str): Prefix in ModelResult
weights (tuple[float, float, float]): Weight for (sigma, amplitude, center)
model_fit (ModelResult): The model fitting result containing the parameters.
prefix (str): Prefix to be added to parameter names for identification.
weights (tuple[float, float, float]): Weights for the parameters in the order
(sigma, amplitude, center). Default is (2, 0, 10).
Returns:
NDArray[np.float64]: A NumPy array containing the weighted parameter values.
"""
parameter_names: set[str] = set(model_fit.params.keys())
if prefix + "sigma" in parameter_names:
@@ -358,34 +379,32 @@ def fit_patterned_bands( # noqa: PLR0913
interactive: bool = True,
dataset: bool = True,
) -> XrTypes:
"""Fits bands and determines dispersion in some region of a spectrum.
"""Fits bands and determines dispersion in a region of a spectrum.
The dimensions of the dataset are partitioned into three types:
1. Fit directions, these are coordinates along the 1D (or maybe later 2D) marginals
2. Broadcast directions, these are directions used to interpolate against the patterned
3. Free directions, these are broadcasted but they are not used to extract initial values of the
directions
fit parameters
For instance, if you laid out band patterns in a E, k_p, delay spectrum at delta_t=0, then if
you are using MDCs, k_p is the fit direction, E is the broadcast direction, and delay is a free
direction.
1. Fit directions: Coordinates along the 1D (or later 2D) marginals, e.g., energy (E).
2. Broadcast directions: Directions used to interpolate against the patterned, e.g., k.
3. Free directions: Broadcasted directions not used to extract the initial parameter values.
In general we can recover the free directions and the broadcast directions implicitly by
examining the band_set passed as a pattern.
For example, in a spectrum at delta_t=0, if using MDCs, `k_p` could be the fit direction,
`E` the broadcast direction, and `delay` a free direction.
Args:
arr (xr.DataArray): [ToDo: description]
band_set: dictionary with bands and points along the spectrum
fit_direction (str): [ToDo: description]
stray (float, optional): [ToDo: description]
background (bool): [ToDo: description]
interactive(bool): [ToDo: description]
dataset(bool): if true, return as xr.Dataset.
Returns: XrTypes
Dataset or DataArray, as controlled by the parameter "dataset"
arr (xr.DataArray): The data array containing the spectrum to fit.
band_set (dict[Incomplete, Incomplete]): A dictionary defining the bands and points along
the spectrum.
fit_direction (str): The direction to fit the data (e.g., "energy").
stray (float, optional): A parameter used for adjusting fits. Defaults to None.
background (bool | type[Band]): If True, includes background fitting, otherwise specifies
the background band class.
interactive (bool): If True, show an interactive progress bar.
dataset (bool): If True, return the results as an `xr.Dataset`. If False, return just the
`band_results`.
Returns:
XrTypes: Either an `xr.DataArray` or an `xr.Dataset` depending on the `dataset` argument.
The returned object contains fitting results, residuals, and normalized residuals.
"""
if background:
from arpes.models.band import BackgroundBand
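The fit/broadcast/free partition documented in the new `fit_patterned_bands` docstring can be illustrated on a synthetic (delay, E, k_p) spectrum. The argmax "fit" below is a toy stand-in for the per-MDC lineshape fits PyARPES actually performs:

```python
import numpy as np

# Synthetic spectrum over (delay, E, k_p): a Lorentzian band whose
# center in k_p disperses linearly with E, identical at every delay.
delay = np.arange(3)               # free direction: broadcast, not used for seeding
E = np.linspace(-0.1, 0.0, 20)     # broadcast direction
k = np.linspace(-1, 1, 200)        # fit direction (one MDC per E slice)

centers = 0.5 * (E + 0.05) / 0.05  # true band position vs E, in [-0.5, 0.5]
spectrum = 1.0 / (1.0 + ((k[None, None, :] - centers[None, :, None]) / 0.05) ** 2)
spectrum = np.broadcast_to(spectrum, (delay.size, E.size, k.size))

# "Fit" each MDC by its argmax; a real fit would seed each E slice's
# Lorentzian from the band pattern or the neighboring result.
fitted = k[np.argmax(spectrum, axis=-1)]   # band position, shape (delay, E)
```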