diff --git a/docs/source/CHANGELOG.rst b/docs/source/CHANGELOG.rst
index 6d1f8a06..daf474d6 100644
--- a/docs/source/CHANGELOG.rst
+++ b/docs/source/CHANGELOG.rst
@@ -20,7 +20,7 @@ Changed
 * Add new endstation
 * Logging and endstation plugin can be selected from local_config.py
 
-Miror
+Minor
 ~~~~~
 
 * Recommend to use uv, instead of rye.
@@ -36,7 +36,7 @@ The name change
 * corrections -> correction
 
 This is somehow backwards incopatibilities. However, the effect for most is really minor,
-because this functionalites are not so frequently used. Thus the major version number has not been changed.
+because these functionalities are not so frequently used. Thus the major version number has not been changed.
 
 * New UI
 
@@ -71,7 +71,7 @@ Changed
 * Remove G.extent
 * Remove overlapped_stack_dispersion_plot
   - use stack_dispersion_plot_with_appropriate_args
-* Revise the k-conversion. The origianl version is correct from the view of the coding, but incorrect from the physics!
+* Revise the k-conversion. The original version is correct from the view of the coding, but incorrect from the physics!
 * introduce new attrs, "energy_notation". if not specified, attrs["energy_notation"] = "Binding"
   is assumed to keep the consistency from the previous version.
 * see Changes.md for others
@@ -120,7 +120,7 @@ New
    and the content available greatly expanded.
 
    * Tutorials for common types of analysis are available as Jupyter notebooks.
-   * An organized API documentation page is availabe.
+   * An organized API documentation page is available.
    * Docstrings have been massively expanded to cover the public API and most of the internal API.
    * The documentation build process has been simplified.
@@ -446,7 +446,7 @@ New:
 ~~~~
 
 1. Improved API documentation.
-2. Most recent interative plot context is saved to
+2. Most recent interactive plot context is saved to
    ``arpes.config.CONFIG['CURRENT_CONTEXT']``. This allows simple and
    transparent recovery in case you forget to save the context and
    performed a lot of work in an interactive session. Additionally, this
@@ -459,7 +459,7 @@ New:
 Changed:
 ~~~~~~~~
 
-1. Metadata reworked to a common format accross all endstations. This is
+1. Metadata reworked to a common format across all endstations. This is
    now documented appropriately with the data model.
 
 .. _fixed-8:
@@ -494,7 +494,7 @@ New:
    ARPES experiments from inside PyARPES.
 
    1. As an example: After conducting nano-XPS, you can use PCA to
-      select your sample region and export a scan sequnce just over the
+      select your sample region and export a scan sequence just over the
       sample ROI or over the border between your sample and another area.
diff --git a/docs/source/conf.py b/docs/source/conf.py
index cc75ee0d..f7df0d67 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -30,7 +30,7 @@
 # The full version, including alpha/beta/rc tags
 release = arpes.__version__
 
-# supress some output information for nbconvert, don't open tools
+# suppress some output information for nbconvert, don't open tools
 nbsphinx_allow_errors = True
diff --git a/docs/source/contributing.rst b/docs/source/contributing.rst
index ade9ae60..610cc890 100644
--- a/docs/source/contributing.rst
+++ b/docs/source/contributing.rst
@@ -27,7 +27,7 @@ repo `__.
 What you’ll need
 ----------------
 
-Here’s a summary of what you’ll need to do, if you’are already familar
+Here’s a summary of what you’ll need to do, if you’re already familiar
 with contributing to open source. If you are less familiar, much more
 detail on this is described in the :doc:`developer’s guide `.
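For context on the "energy_notation" attribute introduced in the changelog above: a minimal
sketch of the documented default, using only plain xarray attrs (the helper name is
illustrative, not part of the PyARPES API):

    import xarray as xr

    def assumed_energy_notation(data: xr.DataArray) -> str:
        # Per the V4 changelog, data lacking the attribute is treated as
        # binding energy, so fall back to "Binding" here.
        return data.attrs.get("energy_notation", "Binding")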
diff --git a/docs/source/curve-fitting.rst b/docs/source/curve-fitting.rst
index 547f21da..ff040c17 100644
--- a/docs/source/curve-fitting.rst
+++ b/docs/source/curve-fitting.rst
@@ -20,7 +20,7 @@ complicated models using operators like ``+`` and ``*``, but also makes
 the process of curve fitting transparent and simple.
 
 Here we will prepare an EDC with a step edge, and fit it with a linear
-density of states multipled by the Fermi distribution and convolved with
+density of states multiplied by the Fermi distribution and convolved with
 Gaussian instrumental broadening (``AffineBroadenedFD``). In general in
 PyARPES, we use extensions of the models available in ``lmfit``, which
 provides an ``xarray`` compatible and unitful fitting function
diff --git a/docs/source/data-provenance.rst b/docs/source/data-provenance.rst
index 22b30b4f..aaaf9c0c 100644
--- a/docs/source/data-provenance.rst
+++ b/docs/source/data-provenance.rst
@@ -126,7 +126,7 @@ appropriately.
 Additionally, because PyARPES includes recent Juypyter cell evaluations
 in the record, you shouldn’t find the need to be very defensive in your
 use of manual provenance tracking, although you should wrap code that
-forms part of your more permanent analysis repetoire.
+forms part of your more permanent analysis repertoire.
 
 .. figure:: _static/decorator-provenance.png
    :alt: Example data provenance in PyARPES
diff --git a/docs/source/dev-guide.rst b/docs/source/dev-guide.rst
index 39c5482d..2cba6ecd 100644
--- a/docs/source/dev-guide.rst
+++ b/docs/source/dev-guide.rst
@@ -8,7 +8,7 @@ Installing an editable copy of PyARPES
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 1. Install `uv ` to make an isolated environment for development.
-2. Clone the respository
+2. Clone the repository
 
    .. code:: bash
 
diff --git a/docs/source/faq.rst b/docs/source/faq.rst
index 84f1d445..6df6d016 100644
--- a/docs/source/faq.rst
+++ b/docs/source/faq.rst
@@ -4,7 +4,7 @@ Frequently Asked Questions
 Igor Installation
 -----------------
 
-Using the suggested invokation I get a pip error
+Using the suggested invocation I get a pip error
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 Pip on Windows appears not to like certain archival formats. While
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 9dd15234..d92d5a7a 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -19,7 +19,7 @@ I recommend igor to analyze, especially for students.
 
 
 
-**December 2020, V3 Release**: The current relase focuses on improving
+**December 2020, V3 Release**: The current release focuses on improving
 usage and workflow for less experienced Python users, lifting version
 incompatibilities with dependencies, and ironing out edges in the user
 experience.
diff --git a/docs/source/installation.rst b/docs/source/installation.rst
index daff78ec..ec829615 100644
--- a/docs/source/installation.rst
+++ b/docs/source/installation.rst
@@ -21,7 +21,7 @@ available either from the main repository at `GitHub
 `.
 
 1. Install `uv `__
-2. Clone or otherwise download the respository
+2. Clone or otherwise download the repository
 
    .. code:: bash
 
diff --git a/docs/source/interactive.rst b/docs/source/interactive.rst
index 766117f9..fdc46eb8 100644
--- a/docs/source/interactive.rst
+++ b/docs/source/interactive.rst
@@ -65,7 +65,7 @@ Laying Lineshapes for Curve Fitting
 -----------------------------------
 
 Use ``.S.show_band_tool()`` to get an interactive lineshape/band layer
-to set inital locations and model structure for complicated curve fits.
+to set initial locations and model structure for complicated curve fits.
 
 Examining Fits
 --------------
diff --git a/docs/source/notebooks/basic-data-exploration.ipynb b/docs/source/notebooks/basic-data-exploration.ipynb
index 4483a9ba..379618a4 100644
--- a/docs/source/notebooks/basic-data-exploration.ipynb
+++ b/docs/source/notebooks/basic-data-exploration.ipynb
@@ -255,7 +255,7 @@
    "source": [
     "### PyARPES philosophy on interactive tools\n",
     "\n",
-    "Instead of one large interactive application where you perform analysis, PyARPES has many small interactive utilities which are built for a single purpose. From Ver 4.0.1 (pragmatically, from 4.0), the Qt based interactive tools are depricated. Thus, the following tools cannot be used:\n",
+    "Instead of one large interactive application where you perform analysis, PyARPES has many small interactive utilities which are built for a single purpose. From Ver 4.0.1 (pragmatically, from 4.0), the Qt based interactive tools are deprecated. Thus, the following tools cannot be used:\n",
     "\n",
     "1. A data browser: `arpes.plotting.qt_tool`\n",
     "2. A momentum offset browser: `arpes.plotting.qt_ktool.ktool`\n",
diff --git a/docs/source/notebooks/converting-to-kspace.ipynb b/docs/source/notebooks/converting-to-kspace.ipynb
index 9b4e665f..1ad616a6 100644
--- a/docs/source/notebooks/converting-to-kspace.ipynb
+++ b/docs/source/notebooks/converting-to-kspace.ipynb
@@ -305,7 +305,7 @@
    "source": [
     "Excellent, this enables all kinds of analysis which we frequently want to perform.\n",
     "\n",
-    "## Exactracting a Momentum Cut Passing Thorugh Known Angular Coordinates\n",
+    "## Extracting a Momentum Cut Passing Through Known Angular Coordinates\n",
     "\n",
     "For instance, we can determine a momentum cut passing through our point of interest."
   ]
diff --git a/docs/source/notebooks/curve-fitting.ipynb b/docs/source/notebooks/curve-fitting.ipynb
index 7a29bdc3..068f3c91 100644
--- a/docs/source/notebooks/curve-fitting.ipynb
+++ b/docs/source/notebooks/curve-fitting.ipynb
@@ -15,7 +15,7 @@
    "\n",
    "PyARPES uses `lmfit` in order to provide a user friendly, compositional API for curve fitting. This allows users to define more complicated models using operators like `+` and `*`, but also makes the process of curve fitting transparent and simple.\n",
    "\n",
-   "Here we will prepare an EDC with a step edge, and fit it with a linear density of states multipled by the Fermi distribution and convolved with Gaussian instrumental broadening (`AffineBroadenedFD`). In general in PyARPES, we use extensions of the models available in `lmfit`, which provides an xarray compatible and unitful fitting function `guess_fit`. This has more or less the same call signature as `fit except that we do not need to pass the X and Y data separately, the X data is provided by the dataset coordinates."
+   "Here we will prepare an EDC with a step edge, and fit it with a linear density of states multiplied by the Fermi distribution and convolved with Gaussian instrumental broadening (`AffineBroadenedFD`). In general in PyARPES, we use extensions of the models available in `lmfit`, which provides an xarray compatible and unitful fitting function `guess_fit`. This has more or less the same call signature as `fit` except that we do not need to pass the X and Y data separately, the X data is provided by the dataset coordinates."
   ]
  },
  {
@@ -79,7 +79,7 @@
    "id": "6",
    "metadata": {},
    "source": [
-    "## Influencing the fit by setting parametrs\n",
+    "## Influencing the fit by setting parameters\n",
     "\n",
     "Using the `params=` keyword you can provide initial guess with `value`, enforce a `max` or `min`, and request that a parameter be allowed to `vary` or not. In this case, we will force a fit with the step edge at 10 millivolts, obtaining a substantially worse result.\n",
     "\n",
@@ -290,7 +290,7 @@
    "source": [
     "## Interactively inspecting fits\n",
     "\n",
-    "There's no substitute for inspecting fits by eye. PyARPES has holoviews based interactive fit inspection tools. This is very much like `profile_view` which we have already seen with the adddition that the marginal shows the curve fitting information for a broadcast fit. \n",
+    "There's no substitute for inspecting fits by eye. PyARPES has holoviews based interactive fit inspection tools. This is very much like `profile_view` which we have already seen with the addition that the marginal shows the curve fitting information for a broadcast fit. \n",
     "\n",
     "Additionally, you can use the tool to copy any given marginal's parameters to a hint dictionary which you can pass into the curve fit\n",
     "for refinement."
diff --git a/docs/source/notebooks/custom-dot-t-functionality.ipynb b/docs/source/notebooks/custom-dot-t-functionality.ipynb
index 70b1990f..0fe447f3 100644
--- a/docs/source/notebooks/custom-dot-t-functionality.ipynb
+++ b/docs/source/notebooks/custom-dot-t-functionality.ipynb
@@ -140,7 +140,7 @@
    "source": [
     "## Functional Programming Primitives: `filter` and `map`\n",
     "\n",
-    "You can `filter` or conditionally remove some of a datasets contents. To do this over coordinates on a dataset according to a function/sieve which accepts the coordinate and data value, you can use `filter_coord`. The sieving function should accept two arguments, the coordinate and the cut at that coordinate respectively. You can specify which coordinate or coordinates are iterated across when filtering using the `coordinate_name` paramter.\n",
+    "You can `filter` or conditionally remove some of a dataset's contents. To do this over coordinates on a dataset according to a function/sieve which accepts the coordinate and data value, you can use `filter_coord`. The sieving function should accept two arguments, the coordinate and the cut at that coordinate respectively. You can specify which coordinate or coordinates are iterated across when filtering using the `coordinate_name` parameter.\n",
     "\n",
     "As a simple, example, we can remove all the odd valued coordinates along Y:"
   ]
diff --git a/docs/source/notebooks/full-analysis-xps.ipynb b/docs/source/notebooks/full-analysis-xps.ipynb
index 2b0045bf..88e7a609 100644
--- a/docs/source/notebooks/full-analysis-xps.ipynb
+++ b/docs/source/notebooks/full-analysis-xps.ipynb
@@ -457,8 +457,8 @@
    "source": [
     "## Exercises\n",
     "\n",
-    "1. Peform a selection of the data according to the fitting values and plot the corresponding mask. What does the collection of points for which **Lower Peak BE** > -34.4eV look like?\n",
-    "2. Peform a selection of the data for the decompositions above and plot their corresponding average XPS curves. How do these results compare to the PCA results we found before?\n",
+    "1. Perform a selection of the data according to the fitting values and plot the corresponding mask. What does the collection of points for which **Lower Peak BE** > -34.4eV look like?\n",
+    "2. Perform a selection of the data for the decompositions above and plot their corresponding average XPS curves. How do these results compare to the PCA results we found before?\n",
     "3. What might you conclude about the sample and experimental conditions given the extracted peak width map?"
   ]
  }
diff --git a/docs/source/notebooks/jupyter-crash-course.ipynb b/docs/source/notebooks/jupyter-crash-course.ipynb
index b66feb68..8390da9d 100644
--- a/docs/source/notebooks/jupyter-crash-course.ipynb
+++ b/docs/source/notebooks/jupyter-crash-course.ipynb
@@ -37,7 +37,7 @@
    "id": "4",
    "metadata": {},
    "source": [
-    "You can use this with library functions, PyARPES funtions, or your own."
+    "You can use this with library functions, PyARPES functions, or your own."
   ]
  },
 {
diff --git a/docs/source/spectra.rst b/docs/source/spectra.rst
index 5e82bd9b..54418585 100644
--- a/docs/source/spectra.rst
+++ b/docs/source/spectra.rst
@@ -42,7 +42,7 @@ cases, this imaged axis will always be treated in the same role as the
 high-resolution motion axis of a nano-ARPES system.
 
 Working in two coordinate systems is frustrating, and it makes comparing
-data cumbersome. In PyARPES x,y,z is always the total inferrable
+data cumbersome. In PyARPES x,y,z is always the total inferable
 coordinate value, i.e. (+/- long range +/- high resolution) as
 appropriate. You can still access the underlying coordinates in this
 case as ``long_{dim}`` and ``short_{dim}``.
diff --git a/docs/source/stack-plots.rst b/docs/source/stack-plots.rst
index 9a33b8a5..04beaf51 100644
--- a/docs/source/stack-plots.rst
+++ b/docs/source/stack-plots.rst
@@ -63,7 +63,7 @@ axes with ``cbarmap=``. Utilities for colorbars are
 
 Finally, as an example of how you might use the code in a real
 situation, we can do some preprocessing of the data before creating the
-figure. Here we subract and normalize by the low temperature data, which
+figure. Here we subtract and normalize by the low temperature data, which
 highlights the Fermi edge width changing.
 
 .. figure:: _static/flat-stack-difference.png
diff --git a/docs/source/writing-interactive-tools.rst b/docs/source/writing-interactive-tools.rst
index 15dd7179..336ac7db 100644
--- a/docs/source/writing-interactive-tools.rst
+++ b/docs/source/writing-interactive-tools.rst
@@ -191,7 +191,7 @@ data (``xy``) and the transformed data (``f(xy)``).
 To do this we used the utility function ``generate_marginal_for`` that
 can be used to create browsable marginal plots for high dimensional data
 sets. Here we do not want to integrate out any dimensions so we passed an tuple as the
-first argument. With the rest of the invokation we specify to add the
+first argument. With the rest of the invocation we specify to add the
 plot to the layout ``self.content_layout`` in the locations (0,0) and
 (1,0). Because we are not linking plots we don’t need cursors.
diff --git a/src/arpes/utilities/image.py b/legacy_files/image.py
similarity index 100%
rename from src/arpes/utilities/image.py
rename to legacy_files/image.py
diff --git a/src/arpes/optics.py b/legacy_files/optics.py
similarity index 100%
rename from src/arpes/optics.py
rename to legacy_files/optics.py
diff --git a/src/arpes/analysis/band_analysis.py b/src/arpes/analysis/band_analysis.py
index b5afb1f5..e5a0f485 100644
--- a/src/arpes/analysis/band_analysis.py
+++ b/src/arpes/analysis/band_analysis.py
@@ -162,16 +162,25 @@ def dataarray_for_value(
     *,
     is_value: bool,
 ) -> xr.DataArray | None:
-    """Return DataArray representing the fit results.
+ """Return a DataArray representing the fit results for a specific parameter. - Args: - param_name (Literal["center", "amplitude", "sigma", "gamma"]): [TODO:description] - i (int): index for band names in identified_band_results. - is_value (bool): if True, return the value, else return stderr. + This function retrieves the values (or standard errors) of a specified fit parameter + (such as "center", "amplitude", "sigma", or "gamma") for each band in the + `identified_band_results`. The result is returned as an `xr.DataArray`. If the parameter + is not available in the fitting results for a given band, `None` is returned. - Returns: xr.DataArray | None - DataArray storing the fitting data. if the corresponding parameter name is not used, - returns None. + Args: + param_name (Literal["center", "amplitude", "sigma", "gamma"]): The name of the fit + parameter whose values are being retrieved (e.g., "center", "amplitude", etc.). + i (int): Index for band names in the `identified_band_results` list. It is used to + identify the correct band in the results. + is_value (bool): If `True`, the function returns the fit parameter's value; if + `False`, it returns the standard error (stderr) of the fit. + + Returns: + xr.DataArray | None: An `xr.DataArray` containing the fit parameter values + (or stderr). Returns `None` if the corresponding parameter is not found for the + given index. """ values: NDArray[np.float64] = np.zeros_like( band_results.values, @@ -303,13 +312,25 @@ def _modelresult_to_array( prefix: str = "", weights: tuple[float, float, float] = (2, 0, 10), ) -> NDArray[np.float64]: - """Convert ModelResult to NDArray. + """Convert ModelResult to a weighted NDArray of fit parameter values. + + This function extracts the values and standard errors for the parameters + "sigma", "gamma", "amplitude", and "center" from the `model_fit` object, + applies weights for each parameter (sigma, amplitude, center), and + returns the result as a NumPy array. + + If any parameter is missing from `model_fit`, a default value and + standard error are assigned. The weights are applied to the parameters + during the conversion process. Args: - model_fit (ModelResult): [TODO:description] - prefix (str): Prefix in ModelResult - weights (tuple[float, float, float]): Weight for (sigma, amplitude, center) + model_fit (ModelResult): The model fitting result containing the parameters. + prefix (str): Prefix to be added to parameter names for identification. + weights (tuple[float, float, float]): Weights for the parameters in the order + (sigma, amplitude, center). Default is (2, 0, 10). + Returns: + NDArray[np.float64]: A NumPy array containing the weighted parameter values. """ parameter_names: set[str] = set(model_fit.params.keys()) if prefix + "sigma" in parameter_names: @@ -358,34 +379,32 @@ def fit_patterned_bands( # noqa: PLR0913 interactive: bool = True, dataset: bool = True, ) -> XrTypes: - """Fits bands and determines dispersion in some region of a spectrum. + """Fits bands and determines dispersion in a region of a spectrum. The dimensions of the dataset are partitioned into three types: - 1. Fit directions, these are coordinates along the 1D (or maybe later 2D) marginals - 2. Broadcast directions, these are directions used to interpolate against the patterned - 3. 
Free directions, these are broadcasted but they are not used to extract initial values of the - directions - fit parameters - - For instance, if you laid out band patterns in a E, k_p, delay spectrum at delta_t=0, then if - you are using MDCs, k_p is the fit direction, E is the broadcast direction, and delay is a free - direction. + 1. Fit directions: Coordinates along the 1D (or later 2D) marginals, e.g., energy (E). + 2. Broadcast directions: Directions used to interpolate against the patterned, e.g., k. + 3. Free directions: Broadcasted directions not used to extract the initial parameter values. - In general we can recover the free directions and the broadcast directions implicitly by - examining the band_set passed as a pattern. + For example, in a spectrum at delta_t=0, if using MDCs, `k_p` could be the fit direction, + `E` the broadcast direction, and `delay` a free direction. Args: - arr (xr.DataArray): [ToDo: description] - band_set: dictionary with bands and points along the spectrum - fit_direction (str): [ToDo: description] - stray (float, optional): [ToDo: description] - background (bool): [ToDo: description] - interactive(bool): [ToDo: description] - dataset(bool): if true, return as xr.Dataset. - - Returns: XrTypes - Dataset or DataArray, as controlled by the parameter "dataset" + arr (xr.DataArray): The data array containing the spectrum to fit. + band_set (dict[Incomplete, Incomplete]): A dictionary defining the bands and points along + the spectrum. + fit_direction (str): The direction to fit the data (e.g., "energy"). + stray (float, optional): A parameter used for adjusting fits. Defaults to None. + background (bool | type[Band]): If True, includes background fitting, otherwise specifies + the background band class. + interactive (bool): If True, show an interactive progress bar. + dataset (bool): If True, return the results as an `xr.Dataset`. If False, return just the + `band_results`. + + Returns: + XrTypes: Either an `xr.DataArray` or an `xr.Dataset` depending on the `dataset` argument. + The returned object contains fitting results, residuals, and normalized residuals. """ if background: from arpes.models.band import BackgroundBand diff --git a/src/arpes/analysis/shirley.py b/src/arpes/analysis/shirley.py index 87e4ea3b..5277adc6 100644 --- a/src/arpes/analysis/shirley.py +++ b/src/arpes/analysis/shirley.py @@ -128,7 +128,7 @@ def calculate_shirley_background_full_range( n_samples: The number of samples to use at the boundaries of the input data. Returns: - A monotonic Shirley backgruond over the entire energy range. + A monotonic Shirley background over the entire energy range. """ xps_array = ( xps.copy(deep=True) @@ -173,7 +173,7 @@ def calculate_shirley_background( n_samples: The number of samples to use at the boundaries of the input data. Returns: - A monotonic Shirley backgruond over the entire energy range. + A monotonic Shirley background over the entire energy range. """ if energy_range is None: energy_range = slice(None, None) diff --git a/src/arpes/analysis/tarpes.py b/src/arpes/analysis/tarpes.py index f22bb61a..5c4b652f 100644 --- a/src/arpes/analysis/tarpes.py +++ b/src/arpes/analysis/tarpes.py @@ -54,7 +54,7 @@ def build_crosscorrelation( Args: datalist (Sequence[xr.DataArray]): Data series from the cross-correlation experiments. delayline_dim: the dimension name for "delay line", which must be in key of data.attrs - When this is the "position" dimention, the unit is assumed to be "mm". 
If the value has + When this is the "position" dimension, the unit is assumed to be "mm". If the value has already been converted to "time" dimension, set convert_position_to_time=True delayline_origin (float): The value corresponding to the delay zero. convert_position_to_time: (bool) If true, no conversion into "delay" is processed. @@ -189,7 +189,7 @@ def find_t_for_max_intensity(data: xr.DataArray, e_bound: float = 0.02) -> float """Finds the time corresponding to the maximum (integrated) intensity. While the time returned can be used to "t=0" in pump probe exepriments, especially for - realtively slow (~ps) phenomena, but not always true. + relatively slow (~ps) phenomena, but not always true. Args: data: A spectrum with "eV" and "delay" dimensions. diff --git a/src/arpes/bootstrap.py b/src/arpes/bootstrap.py index 9a17ff4d..5a149713 100644 --- a/src/arpes/bootstrap.py +++ b/src/arpes/bootstrap.py @@ -276,7 +276,7 @@ def propagate_errors(f: Callable[P, R]) -> Callable[P, R]: f: The inner function to wrap Returns: - The wrapped function handling distributions tranparently. + The wrapped function handling distributions transparently. """ @functools.wraps(f) diff --git a/src/arpes/config.py b/src/arpes/config.py index 90d22ffd..6fd7c4c2 100644 --- a/src/arpes/config.py +++ b/src/arpes/config.py @@ -33,7 +33,7 @@ # pylint: disable=global-statement LOGLEVELS = (DEBUG, INFO) -LOGLEVEL = LOGLEVELS[0] +LOGLEVEL = LOGLEVELS[1] logger = getLogger(__name__) fmt = "%(asctime)s %(levelname)s %(name)s :%(message)s" formatter = Formatter(fmt) diff --git a/src/arpes/correction/fermi_edge.py b/src/arpes/correction/fermi_edge.py index 0c4112ee..7c356106 100644 --- a/src/arpes/correction/fermi_edge.py +++ b/src/arpes/correction/fermi_edge.py @@ -154,7 +154,7 @@ def build_direct_fermi_edge_correction( Args: arr (xr.DataArray) : input DataArray - energy_range (slice): Energy range, which is used in xr.DataArray.sel(). defautl (-0.1, 0.1) + energy_range (slice): Energy range, which is used in xr.DataArray.sel(). default (-0.1, 0.1) plot (bool): if True, show the plot along (str): axis for non energy axis diff --git a/src/arpes/deep_learning/transforms.py b/src/arpes/deep_learning/transforms.py index c3a5596d..8ef76105 100644 --- a/src/arpes/deep_learning/transforms.py +++ b/src/arpes/deep_learning/transforms.py @@ -125,7 +125,7 @@ def decodes_target(self, y: Incomplete) -> Incomplete: return y def __repr__(self) -> str: - """Show both of the constitutent parts of this transform.""" + """Show both of the constituent parts of this transform.""" return ( self.__class__.__name__ + "(\n\t" diff --git a/src/arpes/endstations/__init__.py b/src/arpes/endstations/__init__.py index b933f942..c5486384 100644 --- a/src/arpes/endstations/__init__.py +++ b/src/arpes/endstations/__init__.py @@ -385,26 +385,28 @@ def load_from_path(self, path: str | Path) -> xr.Dataset: def load(self, scan_desc: ScanDesc | None = None, **kwargs: Incomplete) -> xr.Dataset: """Loads a scan from a single file or a sequence of files. - This defines the contract and structure for standard data loading plugins: - 1. Search for files (`.resolve_frame_locations`) - 2. Load them sequentially (`.load_single_frame`) - 3. Apply cleaning code to each frame (`.postprocess`) - 4. Concatenate these loaded files (`.concatenate_frames`) - 5. Apply postprocessing code to the concatenated dataset + This method provides the standard procedure for loading data from one or more files: + 1. Resolves file locations (`.resolve_frame_locations`). + 2. 
Loads each file sequentially (`.load_single_frame`). + 3. Applies any cleaning or processing to each frame (`.postprocess`). + 4. Concatenates the loaded frames into a single dataset (`.concatenate_frames`). + 5. Applies any final postprocessing to the concatenated dataset (`.postprocess_final`). - You can read more about the plugin system in the detailed documentation, - but for the most part loaders just specializing one or more of these different steps - as appropriate for a beamline. + This loading workflow can be customized by overriding the respective methods for specific + beamlines or data sources. It provides a flexible way to integrate with different types of + data formats and handling strategies. Args: - scan_desc(ScanDesc): scan description - kwargs: pass to load_sing_frame + scan_desc (ScanDesc): The description of the scan, which may contain information such as + file paths or other metadata. + kwargs: Additional keyword arguments that will be passed to the `.load_single_frame` + method for loading each frame. Returns: - [TODO:description] + xr.Dataset: The concatenated and processed dataset containing the scan data. Raises: - RuntimeError: [TODO:description] + RuntimeError: If no files are found or if there is an error in loading the scan data. """ scan_desc = scan_desc or {} logger.debug("Resolving frame locations") @@ -429,12 +431,19 @@ def load(self, scan_desc: ScanDesc | None = None, **kwargs: Incomplete) -> xr.Da def _modify_a_data(self, a_data: DataType, spectrum_type: str | None) -> DataType: """Helper function to modify the Dataset and DataArray that are contained in the Dataset. + This method modifies the attributes and coordinates of a given data object + (either an xarray Dataset or DataArray). It ensures that the "phi" coordinate is + set to 0 if it doesn't exist, updates the "spectrum_type" attribute, and applies any + transformations defined in `ATTR_TRANSFORMS`. Additionally, it ensures that default + attributes from `MERGE_ATTRS` are added to the dataset if they don't already exist. + Args: - a_data: [TODO:description] - spectrum_type: [TODO:description] + a_data (DataType): The data object (either an xarray Dataset or DataArray) to modify. + spectrum_type (str | None): The spectrum type to set as an attribute for the data + object. Returns: - [TODO:description] + DataType: The modified data object with updated attributes and coordinates. """ if "phi" not in a_data.coords: a_data.coords["phi"] = 0 @@ -546,17 +555,24 @@ def load_single_frame( scan_desc: ScanDesc | None = None, **kwargs: bool, ) -> xr.Dataset: - """Load the single frame fro the file. + """Load the single frame from the specified file. - [TODO:description] + This method loads a single frame of data from a file. + If the file is in NetCDF (".nc") format, it loads the data using the `load_SES_nc` + method, passing along the `scan_desc` dictionary and any additional keyword arguments. + If the file is in PXT format, it reads the data, negates the energy values, and returns + the data as an `xarray.Dataset` with the `spectrum` key. Args: - frame_path: [TODO:description] - scan_desc (ScanDesc): [TODO:description] - kwargs: pass to load_SES_nc, thus only "robust_dimension_labels" can be accepted. + frame_path (str | Path): The path to the file containing the single frame of data. + scan_desc (ScanDesc | None): A description of the scan, which is passed to the + `load_SES_nc` function if the file is in NetCDF format. Defaults to `None`. 
+            kwargs (bool): Additional keyword arguments passed to `load_SES_nc`. The only accepted
+                argument is "robust_dimension_labels".
 
         Returns:
-            [TODO:description]
+            xr.Dataset: The dataset containing the loaded spectrum data.
         """
         ext = Path(frame_path).suffix
         if scan_desc is None:
@@ -734,15 +750,22 @@ class FITSEndstation(EndstationBase):
     def resolve_frame_locations(self, scan_desc: ScanDesc | None = None) -> list[Path]:
         """Determines all files associated with a given scan.
 
+        This function resolves the file location(s) based on the provided `scan_desc` dictionary.
+        It looks for the "path" or "file" key in the `scan_desc` to determine the file location.
+        If the file does not exist at the provided location, it will attempt to find it in the
+        `DATA_PATH` directory. If the file is still not found, a `RuntimeError` is raised.
+
         Args:
-            scan_desc: [TODO:description]
+            scan_desc (ScanDesc | None): A dictionary containing scan metadata.
+                It must include a "path" or "file" key specifying the location of the scan data file.
 
         Returns:
-            [TODO:description]
+            list[Path]: A list containing the resolved file path(s).
 
         Raises:
-            ValueError: [TODO:description]
-            RuntimeError: [TODO:description]
+            ValueError: If `scan_desc` is not provided or is `None`.
+            RuntimeError: If the file cannot be found at the specified location or in the
+                `DATA_PATH` directory.
         """
         if scan_desc is None:
             msg = "Must pass dictionary as file scan_desc to all endstation loading code."
diff --git a/src/arpes/endstations/plugin/ALG_spin_ToF.py b/src/arpes/endstations/plugin/ALG_spin_ToF.py
index 64a89033..5ddb6d33 100644
--- a/src/arpes/endstations/plugin/ALG_spin_ToF.py
+++ b/src/arpes/endstations/plugin/ALG_spin_ToF.py
@@ -245,7 +245,7 @@ def load_SToF_fits(self, scan_desc: ScanDesc) -> xr.Dataset:
             except Exception:
                 # we should probably zero pad in the case where the slices are not the right
                 # size
-                logger.exception("Exception Occure")
+                logger.exception("Exception occurred")
                 continue
 
             altered_dimension = dimensions[spectrum_name][0]
diff --git a/src/arpes/endstations/plugin/IF_UMCS.py b/src/arpes/endstations/plugin/IF_UMCS.py
index afb4b691..4796a85c 100644
--- a/src/arpes/endstations/plugin/IF_UMCS.py
+++ b/src/arpes/endstations/plugin/IF_UMCS.py
@@ -37,7 +37,7 @@ class IF_UMCSEndstation(  # noqa: N801
     LENS_MAPPING: ClassVar[dict[str, bool]] = {
         "HighAngularDispersion": True,
         "MediumAngularDispersion": True,
-        "LowAngularDispersion": True, 
+        "LowAngularDispersion": True,
         "WideAngleMode": True,
         "LowMagnification": False,
         "MediumMagnification": False,
@@ -77,7 +77,7 @@ def load_single_frame(
         if file.suffix == ".xy":
             data = load_xy(frame_path, **kwargs)
         elif file.suffix == ".itx":
-            msg = "Not suported yet..."
+            msg = "Not supported yet..."
             raise RuntimeWarning(msg)
         return xr.Dataset({"spectrum": data}, attrs=data.attrs)
diff --git a/src/arpes/endstations/plugin/MAESTRO.py b/src/arpes/endstations/plugin/MAESTRO.py
index f0f58a55..c1af7800 100644
--- a/src/arpes/endstations/plugin/MAESTRO.py
+++ b/src/arpes/endstations/plugin/MAESTRO.py
@@ -2,7 +2,7 @@
 
 Common code is provided by a base class reflecting DAQ similarities between micro- and
 nanoARPES at MAESTRO. This is subclassed for the individual experiments to handle some subtle differences
-in how nanoARPES handles its spatial coordiantes (they are hierarchical) and in the spectrometers.
+in how nanoARPES handles its spatial coordinates (they are hierarchical) and in the spectrometers.
""" from __future__ import annotations @@ -294,7 +294,7 @@ def update_hierarchical_coordinates(data: xr.Dataset) -> xr.Dataset: high-resolution motion axis of a nano-ARPES system. Working in two coordinate systems is frustrating, and it makes comparing data cumbersome. In - PyARPES x,y,z is always the total inferrable coordinate value, + PyARPES x,y,z is always the total inferable coordinate value, i.e. (+/- long range +/- high resolution) as appropriate. You can still access the underlying coordinates in this case as `long_{dim}` and `short_{dim}`. diff --git a/src/arpes/endstations/plugin/Phelix.py b/src/arpes/endstations/plugin/Phelix.py index 04321b21..725185cd 100644 --- a/src/arpes/endstations/plugin/Phelix.py +++ b/src/arpes/endstations/plugin/Phelix.py @@ -94,7 +94,7 @@ def postprocess_final( - Calculate phi or x values depending on the lens mode. - Add missing parameters. - Rename keys and dimensions in particular the third dimension that - could be psi andle or theta angle in this endstation. + could be psi angle or theta angle in this endstation. Args: data(xr.Dataset): ARPES data @@ -143,10 +143,12 @@ def postprocess_final( if "psi" in data.coords: data = data.assign_coords(psi=np.deg2rad(data.psi)) if "theta" in data.coords: - data = data.assign_coords(theta=np.deg2rad( - - data.theta - Phelix.NORMAL_EMISSION["theta"], - )) - data = data.isel(theta=slice(None,None,-1)) + data = data.assign_coords( + theta=np.deg2rad( + -data.theta - Phelix.NORMAL_EMISSION["theta"], + ), + ) + data = data.isel(theta=slice(None, None, -1)) return super().postprocess_final(data, scan_desc) diff --git a/src/arpes/endstations/plugin/fallback.py b/src/arpes/endstations/plugin/fallback.py index cad052fc..8035094c 100644 --- a/src/arpes/endstations/plugin/fallback.py +++ b/src/arpes/endstations/plugin/fallback.py @@ -115,7 +115,7 @@ def find_first_file(cls: type[FallbackEndstation], file_number: int) -> Path: """Finds any file associated to this scan. Instead actually using the superclass code here, we first try to determine - which loading pluging should be used. Then, we delegate to that class to + which loading plugin should be used. Then, we delegate to that class to find the first associated file. """ associated_loader = cls.determine_associated_loader(str(file_number)) diff --git a/src/arpes/endstations/plugin/kaindl.py b/src/arpes/endstations/plugin/kaindl.py index 3f428a87..17b9fbce 100644 --- a/src/arpes/endstations/plugin/kaindl.py +++ b/src/arpes/endstations/plugin/kaindl.py @@ -189,7 +189,7 @@ def postprocess_final( data: xr.Dataset, scan_desc: ScanDesc | None = None, ) -> xr.Dataset: - """Peforms final data preprocessing for the Kaindl lab Tr-ARPES setup. + """Performs final data preprocessing for the Kaindl lab Tr-ARPES setup. This is very similar to what happens at BL4/MERLIN because the code was adopted from an old version of the DAQ on that beamline. diff --git a/src/arpes/endstations/prodigy_xy.py b/src/arpes/endstations/prodigy_xy.py index ad011bc3..d7a395b1 100644 --- a/src/arpes/endstations/prodigy_xy.py +++ b/src/arpes/endstations/prodigy_xy.py @@ -9,7 +9,7 @@ Second dimension "nonenegy" is perpendicular to the energy on the MCP detector, stored as # NonEnergyOrdinate in each block of the data. -This could be both: angular (phi angle) or spacial (along the slit direction). +This could be both: angular (phi angle) or spatial (along the slit direction). 
 
 Third dimension/parameter could be:
 
 - the deflector shift (psi angle)
@@ -96,7 +96,7 @@ def parse(self, list_from_xy_file: list[str]) -> None:
         else:
             num_of_en = int(self.params["values_curve"])
 
-        kinetic_ef_energy = np.linspace(energies[0], energies[num_of_en-1], num_of_en)
+        kinetic_ef_energy = np.linspace(energies[0], energies[num_of_en - 1], num_of_en)
 
         # first dimension is always energy
         self.axis_info["d1"] = (kinetic_ef_energy, "eV")
diff --git a/src/arpes/exceptions.py b/src/arpes/exceptions.py
deleted file mode 100644
index 2efbc0d5..00000000
--- a/src/arpes/exceptions.py
+++ /dev/null
@@ -1,24 +0,0 @@
-"""Some bespoke exceptions that can be used in control sequences.
-
-Over builtins, these provide more information to the user. I (Conrad) prefer to use warnings for
-the latter purpose, but there are reasons to throw these errors in a variety of circumstances.
-"""
-
-from __future__ import annotations
-
-
-class AnalysisError(Exception):
-    """Base class to indicate that something scientific went wrong.
-
-    Example:
-        A bad fit from scipy.optimize in an internal function or analysis
-        routine that could not be handled by the user.
-    """
-
-
-class ConfigurationError(Exception):
-    """Indicates that the user needs to supply more configuration.
-
-    This could be due to failing to set some directories in which to place plots,
-    or failing to indicate the appropriate workspace.
-    """
diff --git a/src/arpes/fits/__init__.py b/src/arpes/fits/__init__.py
index 873295bd..4f92d9df 100644
--- a/src/arpes/fits/__init__.py
+++ b/src/arpes/fits/__init__.py
@@ -67,5 +67,5 @@ class ParametersArgs(TypedDict, total=False):
     vary: bool  # Whether the parameter is varied during the fit
     min: float  # Lower bound for value (default, -np.inf)
     max: float  # Upper bound for value (default np.inf)
-    expr: str  # Mathematical expression to contstrain the value.
+    expr: str  # Mathematical expression to constrain the value.
     brute_step: float  # step size for grid points in the brute method.
diff --git a/src/arpes/fits/fit_models/fermi_edge.py b/src/arpes/fits/fit_models/fermi_edge.py
index 669b82c1..ec2aebc8 100644
--- a/src/arpes/fits/fit_models/fermi_edge.py
+++ b/src/arpes/fits/fit_models/fermi_edge.py
@@ -27,7 +27,7 @@
     from _typeshed import Incomplete
     from numpy.typing import NDArray
 
-    from arpes._typing import DataType, XrTypes
+    from arpes._typing import XrTypes
     from arpes.fits import ModelArgs
 
 __all__ = (
@@ -155,15 +155,20 @@ def guess(
         data: XrTypes,
         **kwargs: Incomplete,
     ) -> lf.Parameters:
-        """Placeholder for making better heuristic guesses here.
+        """Makes heuristic guesses for parameters based on input data.
+
+        This function sets initial guesses for a set of parameters based on simple
+        heuristics, such as the minimum and mean of the input data. The function
+        is a placeholder for future improvements where better guesses can be made.
 
         Args:
-            data ([TODO:type]): [TODO:description]
-            x (NONE): in this guess function, x should be None.
-            kwargs: [TODO:description]
+            data (XrTypes): Input data for making parameter guesses. The data is used
+                to estimate initial values like background levels and amplitude.
+            kwargs: Additional keyword arguments to update parameter values.
 
         Returns:
-            [TODO:description]
+            lf.Parameters: A set of parameters with initial guesses, potentially updated
+                by the provided `kwargs`.
""" pars = self.make_params() @@ -194,8 +199,22 @@ def __init__(self, **kwargs: Unpack[ModelArgs]) -> None: self.set_param_hint("width", min=0) - def guess(self, data: DataType, **kwargs: Incomplete) -> lf.Parameters: - """Placeholder for making better heuristic guesses here.""" + def guess(self, data: XrTypes, **kwargs: Incomplete) -> lf.Parameters: + """Makes heuristic guesses for parameters based on input data. + + This function sets initial guesses for a set of parameters based on simple + heuristics, such as the minimum and mean of the input data. The function + is a placeholder for future improvements where better guesses can be made. + + Args: + data (XrTypes): Input data for making parameter guesses. The data is used + to estimate initial values like background levels and amplitude. + kwargs: Additional keyword arguments to update parameter values. + + Returns: + lf.Parameters: A set of parameters with initial guesses, potentially updated + by the provided `kwargs`. + """ pars = self.make_params() pars[f"{self.prefix}center"].set(value=0) @@ -229,15 +248,22 @@ def guess( x: None = None, **kwargs: Incomplete, ) -> lf.Parameters: - """Placeholder for making better heuristic guesses here. + """Makes heuristic guesses for parameters based on the input data. + + This function initializes parameter values with simple heuristic estimates, + such as using the minimum and mean values of the data. The `x` parameter is + intentionally ignored, and it should always be `None`. Args: - data ([TODO:type]): [TODO:description] - x (NONE): in this guess function, x should be None. - kwargs: [TODO:description] + data (XrTypes): The input data used to make initial guesses for parameters. + The data's minimum and mean values are used for background + and amplitude estimates. + x (None): This parameter is ignored and should always be `None`. + kwargs: Additional keyword arguments used to update the guessed parameters. Returns: - [TODO:description] + lf.Parameters: A set of parameters initialized with heuristic guesses, + which may be updated with the provided `kwargs`. """ pars = self.make_params() assert x is None diff --git a/src/arpes/fits/fit_models/x_model_mixin.py b/src/arpes/fits/fit_models/x_model_mixin.py index 4d1cb953..63fcb5c3 100644 --- a/src/arpes/fits/fit_models/x_model_mixin.py +++ b/src/arpes/fits/fit_models/x_model_mixin.py @@ -48,7 +48,7 @@ def _prep_parameters( Returns: lf.Parameters - Note that lf.Paramters class not, lf.Parameter + Note that lf.Parameters class not, lf.Parameter Notes: Example of lf.Parameters() @@ -105,20 +105,32 @@ def guess_fit( # noqa: PLR0913 transpose: bool = False, **kwargs: Incomplete, ) -> ModelResult: - """Performs a fit on xarray data after guessing parameters. + """Performs a fit on xarray or ndarray data after guessing parameters. - Params allows you to pass in hints as to what the values and bounds on parameters - should be. Look at the lmfit docs to get hints about structure + This method uses the `lmfit` library for fitting and allows for parameter guesses. + You can pass initial values and bounds for the parameters through the `params` argument. + The fitting can be done with optional weights, and additional keyword arguments can be + passed to the `lmfit.Model.fit` function. 
 
         Args:
-            data (xr.DataArray): [TODO:description]
-            params (lf.Parameters|dict| None): Fitting parameters
-            weights ([TODO:type]): [TODO:description]
-            guess (bool): [TODO:description]
-            prefix_params: [TODO:description]
-            transpose: [TODO:description]
-            kwargs([TODO:type]): pass to lf.Model.fit
-                Additional keyword arguments, passed to model function.
+            data (xr.DataArray | NDArray[np.float64]): The data to fit.
+                It can be either an xarray DataArray or a NumPy ndarray.
+            params (lf.Parameters | dict[str, ParametersArgs] | None, optional): Initial fitting
+                parameters. This can be an `lf.Parameters` object or a dictionary of parameter
+                names and their initial values or bounds.
+            weights (xr.DataArray | NDArray[np.float64] | None, optional): Weights for the fitting
+                process, either as an xarray DataArray or a NumPy ndarray.
+            guess (bool, optional): If True, guess the initial parameters based on the data.
+                Default is True.
+            prefix_params (bool, optional): If True, prefix parameters with the object's prefix.
+                Default is True.
+            transpose (bool, optional): If True, transpose the data before fitting.
+                Default is False.
+            kwargs: Additional keyword arguments passed to the `lmfit.Model.fit` function.
+
+        Returns:
+            ModelResult: The result of the fitting process, including the fit parameters and other
+                information.
         """
         if isinstance(data, xr.DataArray):
             real_data, flat_data, coord_values, new_dim_order = self._real_data_etc_from_xarray(
@@ -229,15 +241,14 @@ def _real_weights_from_xarray(
         xr_weights: xr.DataArray,
         new_dim_order: Sequence[Hashable] | None,
     ) -> NDArray[np.float64]:
-        """Return Weigths ndarray from xarray.
+        """Convert xarray weights to a flattened ndarray with an optional new dimension order.
 
         Args:
-            xr_weights (xr.DataArray): [TODO:description]
-            new_dim_order (Sequence[Hashable] | None): new dimension order
-
+            xr_weights (xr.DataArray): The weights data stored in an xarray DataArray.
+            new_dim_order (Sequence[Hashable] | None): The desired order for dimensions, or None.
 
         Returns:
-            [TODO:description]
+            NDArray[np.float64]: Flattened NumPy array of weights, reordered if specified.
         """
         if self.n_dims == 1:
             return xr_weights.values
@@ -254,13 +265,17 @@ def _real_data_etc_from_xarray(
         dict[str, NDArray[np.float64]],
         Sequence[Hashable] | None,
     ]:
-        """Helper function: Return real_data, flat_data, coord_valuesn, new_dim_order from xarray.
+        """Helper function: Returns real data, flat data, coordinates, and new dimension order.
 
         Args:
-            data: (xr.DataArray) [TODO:description]
+            data (xr.DataArray): The data array containing the information to process.
 
         Returns:
-            real_data, flat_data, coord_values and new_dim_order from xarray
+            tuple: A tuple containing:
+                - real_data (NDArray[np.float64]): The raw data values from the array.
+                - flat_data (NDArray[np.float64]): The flattened data values.
+                - coord_values (dict[str, NDArray[np.float64]]): A dictionary of coordinate values.
+                - new_dim_order (Sequence[Hashable] | None): The new dimension order if changed.
         """
         real_data, flat_data = data.values, data.values
         assert len(real_data.shape) == self.n_dims
diff --git a/src/arpes/fits/hot_pool.py b/src/arpes/fits/hot_pool.py
index f0726bfa..2c00a30a 100644
--- a/src/arpes/fits/hot_pool.py
+++ b/src/arpes/fits/hot_pool.py
@@ -16,13 +16,16 @@ class HotPool:
 
     @property
     def pool(self) -> pool.Pool:
-        """[TODO:summary].
+        """Returns a pool object, creating it if necessary.
+
+        This method lazily initializes a pool object and returns it. If the pool has
+        already been created, it simply returns the existing one.
 
         Args:
-            self ([TODO:type]): [TODO:description]
+            self: The instance of the class calling this method.
 
         Returns:
-            [TODO:description]
+            pool.Pool: A pool object.
         """
         if self._pool is not None:
             return self._pool
@@ -31,10 +34,13 @@ def pool(self) -> pool.Pool:
         return self._pool
 
     def __del__(self) -> None:
-        """[TODO:summary].
+        """Cleans up resources when the object is deleted.
+
+        This method ensures that the pool, if it exists, is closed before the object
+        is destroyed to release any allocated resources.
 
         Returns:
-            [TODO:description]
+            None
         """
         if self._pool is not None:
             self._pool.close()
diff --git a/src/arpes/fits/utilities.py b/src/arpes/fits/utilities.py
index bdbbe8d8..c8f67a53 100644
--- a/src/arpes/fits/utilities.py
+++ b/src/arpes/fits/utilities.py
@@ -67,7 +67,7 @@ def result_to_hints(
         defaults: Returned if `model_result` is None, useful for cell re-evaluation in Jupyter
 
     Returns:
-        A dict containing parameter specifications in key-value rathar than `lmfit.Parameter`
+        A dict containing parameter specifications in key-value rather than `lmfit.Parameter`
         format, as you might pass as `params=` to PyARPES fitting code.
     """
     if model_result is None:
@@ -170,7 +170,7 @@ def broadcast_model(  # noqa: PLR0913
         prefixes: Prefix for the parameter name. Pass to MPWorker that pass to
             broadcast_common.compile_model. When prefixes are specified, the number of prefixes
             must be same as the number of models for fitting. If not specified, the prefix
-            automatically is determined as "a\_", "b\_",....  (We recommend to specifiy them explicitly.)
+            automatically is determined as "a\_", "b\_",....  (We recommend specifying them explicitly.)
         window: A specification of cuts/windows to apply to each curve fit
         parallelize: Whether to parallelize curve fits, defaults to True if unspecified and more
             than 20 fits were requested.
@@ -293,14 +293,17 @@ def unwrap(result_data: str) -> object:  # (Unpickler)
 
 def _fake_wqdm(x: Iterable[T], **kwargs: str | float) -> Iterable[T]:
-    """Fake of tqdm.notebook.tqdm.
+    """A placeholder for tqdm.notebook.tqdm that returns the input iterable unchanged.
+
+    This function simulates the behavior of tqdm for cases where progress tracking
+    is not needed, effectively acting as a no-op that passes the iterable through.
 
     Args:
-        x (Iterable[int]): [TODO:description]
-        kwargs: its dummy parameters, not used.
+        x (Iterable[T]): An iterable to be processed.
+        kwargs: Dummy parameters that are not used in the function.
 
     Returns:
-        Same iterable.
+        Iterable[T]: The same iterable passed as the argument.
     """
     del kwargs  # kwargs is dummy parameter
     return x
diff --git a/src/arpes/io.py b/src/arpes/io.py
index 25e80303..3472838b 100644
--- a/src/arpes/io.py
+++ b/src/arpes/io.py
@@ -254,7 +254,7 @@ def _df_or_list_to_files(
     assert not isinstance(
         df_or_list,
         list | tuple,
-    ), "Expected an interable for a list of the scans to stitch together"
+    ), "Expected an iterable for a list of the scans to stitch together"
     return list(df_or_list)
 
diff --git a/src/arpes/plotting/bands.py b/src/arpes/plotting/bands.py
index 3048bd4d..839ce764 100644
--- a/src/arpes/plotting/bands.py
+++ b/src/arpes/plotting/bands.py
@@ -36,18 +36,20 @@ def plot_with_bands(
     """Makes a dispersion plot with bands overlaid.
 
     Args:
-        data (DataType): ARPES experimental data
-        bands: [TODO:description]
-        title (str): title of the plot
-        ax: [TODO:description]
-        out: [TODO:description]
-        kwargs: pass to data.S.plot()
+        data (xr.DataArray): ARPES experimental data.
+        bands (Sequence[Band]): Collection of bands to overlay on the plot.
+        title (str, optional): Title of the plot. Defaults to the label of `data.S`.
+        ax (Axes, optional): Matplotlib axis to plot on. If None, a new figure is created.
+        out (str or Path, optional): File path to save the plot. If empty, the plot is shown
+            interactively.
+        kwargs: Additional keyword arguments passed to `data.plot()`.
 
     Returns:
-        [TODO:description]
+        Union[Path, Axes]: File path if `out` is specified; otherwise, the Matplotlib axis object.
     """
     if ax is None:
         _, ax = plt.subplots(figsize=(8, 5))
+
     assert isinstance(ax, Axes)
 
     if not title:
@@ -68,5 +70,6 @@ def plot_with_bands(
         filename = path_for_plot(out)
         plt.savefig(filename)
         return filename
+
     plt.show()
     return ax
diff --git a/src/arpes/plotting/dispersion.py b/src/arpes/plotting/dispersion.py
index 6f77ed0a..e0ec37c7 100644
--- a/src/arpes/plotting/dispersion.py
+++ b/src/arpes/plotting/dispersion.py
@@ -27,7 +27,7 @@
     from matplotlib.figure import Figure, FigureBase
     from numpy.typing import NDArray
 
-    from arpes._typing import DataType, PColorMeshKwargs, XrTypes
+    from arpes._typing import PColorMeshKwargs, XrTypes
     from arpes.models.band import Band
 
 __all__ = (
@@ -392,7 +392,7 @@ class LabeledFermiSurfaceParam(TypedDict, total=False):
 
 @save_plot_provenance
 def reference_scan_fermi_surface(
-    data: DataType,
+    data: xr.DataArray,
     **kwargs: Unpack[LabeledFermiSurfaceParam],
 ) -> Path | Axes:
     """A reference plot for Fermi surfaces. Used internally by other code.
@@ -412,8 +412,8 @@ def reference_scan_fermi_surface(
     handles = []
     for index, row in referenced_scans.iterrows():
         scan = load_data(row.id)
-        remapped_coords = remap_coords_to(scan, data) 
+        remapped_coords = remap_coords_to(scan, data)
         dim_order = [ax.get_xlabel(), ax.get_ylabel()]
         ls = ax.plot(
             remapped_coords[dim_order[0]],
@@ -508,20 +508,27 @@ def fancy_dispersion(
     include_symmetry_points: bool = True,
     **kwargs: Unpack[PColorMeshKwargs],
 ) -> Axes | Path:
-    """Generates a 2D ARPES cut with some fancy annotations for throwing plots together.
+    """Generates a 2D ARPES cut with additional annotations, useful for quick presentations.
 
-    Useful for brief slides/quick presentations.
+    This function creates a plot of ARPES data with optional symmetry points and custom styling for
+    quick visualization. It is designed to help create figures rapidly for presentations or reports.
+    Symmetry points are annotated if `include_symmetry_points` is set to True.
 
     Args:
-        data (xr.DataArray): ARPES data.
-        title (str): Title of Figure.
-        ax (Axes): matpplotlib Axes object
-        out (str | Path): str or Path object for output image.
-        include_symmetry_points: [TODO:description]
-        kwargs: pass to xr.Dataset.plot or xr.DataArray.plot()
+        data (xr.DataArray): ARPES data to plot.
+        title (str): Title of the figure. If not provided, the title is derived from the dataset
+            label.
+        ax (Axes, optional): Matplotlib Axes object for plotting. If not provided, a new Axes is
+            created.
+        out (str | Path, optional): Output file path for saving the figure. If not provided, the
+            figure is not saved.
+        include_symmetry_points (bool): Whether to include symmetry points in the plot
+            (default is True).
+        kwargs: Additional keyword arguments passed to `xr.DataArray.plot()` for further
+            customization.
 
     Returns:
-        [TODO:description]
+        Axes | Path: The Axes object containing the plot, or the file path if the plot is saved.
""" if ax is None: _, ax = plt.subplots(figsize=(8, 5)) @@ -579,19 +586,23 @@ def scan_var_reference_plot( norm: Normalize | None = None, out: str | Path = "", ) -> Axes | Path: - """Makes a straightforward plot of a DataArray with reasonable axes. + """Generates a simple plot of a DataArray with appropriately labeled axes. - Used internally by other scripts. + This function is used internally by other scripts to quickly generate plots for DataArrays. It + supports normalization and customization of axes labels and titles. The plot can optionally be + saved to a file. Args: - data: [TODO:description] - title: [TODO:description] - ax: Axes on which to plot. By default, use the current axes. - norm ([TODO:type]): [TODO:description] - out: [TODO:description] + data (xr.DataArray): The input data to plot, typically a DataArray. + title (str): The title of the plot. If not provided, it is derived from the DataArray label. + ax (Axes, optional): The Matplotlib Axes object to plot on. If not provided, a new Axes is + created. + norm (Normalize, optional): Normalization to apply to the plot. Default is None. + out (str | Path, optional): File path to save the plot. If not provided, the plot is not + saved. Returns: - [TODO:description] + Axes | Path: The Axes object containing the plot, or the file path if the plot is saved. """ assert isinstance(data, xr.DataArray) if ax is None: diff --git a/src/arpes/plotting/dos.py b/src/arpes/plotting/dos.py index c46c2ec5..661b340f 100644 --- a/src/arpes/plotting/dos.py +++ b/src/arpes/plotting/dos.py @@ -91,7 +91,7 @@ def plot_dos( Args: data: ARPES data to plot. out (str | Path): Path to the figure. - orientation (Literal["horizontal", "vetical"]): Orientation of the figures. + orientation (Literal["horizontal", "vertical"]): Orientation of the figures. figsize: The figure size (arg of plt.figure()) kwargs: Pass to the original data. diff --git a/src/arpes/plotting/fermi_surface.py b/src/arpes/plotting/fermi_surface.py index 2b533e13..9bfe9c56 100644 --- a/src/arpes/plotting/fermi_surface.py +++ b/src/arpes/plotting/fermi_surface.py @@ -92,22 +92,25 @@ def magnify_circular_regions_plot( # noqa: PLR0913 ax: Axes | None = None, **kwargs: tuple[float, float], ) -> tuple[Figure | None, Axes] | Path: - """Plots a Fermi surface with inset points magnified in an inset. + """Plots a Fermi surface with magnified circular regions as insets. + + This function highlights specified points on a Fermi surface plot by magnifying + their corresponding regions and displaying them as inset circular regions. Args: - data: [TODO:description] - magnified_points: [TODO:description] - mag: [TODO:description] - radius (float): [TODO:description] - cmap: [TODO:description] - color: [TODO:description] - edgecolor (ColorType): [TODO:description] - out: [TODO:description] - ax: [TODO:description] - kwargs: [TODO:description] + data (xr.DataArray): ARPES data to plot. + magnified_points: Points on the surface to magnify. + mag: Magnification factor for the inset regions. + radius: Radius for the circular regions. + cmap: Colormap for the plot. + color: Color of the magnified points. + edgecolor: Color of the borders around the magnified regions. + out: File path to save the plot. + ax: Matplotlib axes to plot on. + kwargs: Additional keyword arguments for customization. Returns: - [TODO:description] + A tuple of figure and axes, or the path to the saved plot. 
""" data_arr = data if isinstance(data, xr.DataArray) else normalize_to_spectrum(data) assert isinstance(data_arr, xr.DataArray) diff --git a/src/arpes/plotting/holoviews.py b/src/arpes/plotting/holoviews.py index 73dfbdc0..c532ecd3 100644 --- a/src/arpes/plotting/holoviews.py +++ b/src/arpes/plotting/holoviews.py @@ -39,7 +39,7 @@ def _fix_xarray_to_fit_with_holoview(dataarray: xr.DataArray) -> xr.DataArray: dataarray (xr.DataArray): input Dataarray Returns: - xr.DataArray, whose coordinates is regularly orderd determined by dataarray.dims. + xr.DataArray, whose coordinates is regularly ordered determined by dataarray.dims. """ for coord_name in dataarray.coords: if coord_name not in dataarray.dims: @@ -54,18 +54,21 @@ def concat_along_phi_ui( dataarray_b: xr.DataArray, **kwargs: Unpack[ProfileViewParam], ) -> hv.util.Dynamic: - """UI for determination of appropriate parameters of concat_along_phi. + """UI for determining the appropriate parameters for the `concat_along_phi` function. Args: - dataarray_a: An AREPS data. - dataarray_b: Another ARPES data. - use_quadmesh (bool): If true, use hv.QuadMesh instead of hv.Image. - In most case, hv.Image is sufficient. However, if the coords is irregulaly spaced, - hv.QuadMesh would be more accurate mapping, but slow. - kwargs: Options for hv.Image/hv.QuadMesh (width, height, cmap, log) + dataarray_a (xr.DataArray): First ARPES data array. + dataarray_b (xr.DataArray): Second ARPES data array. + use_quadmesh (bool): If True, uses `hv.QuadMesh` instead of `hv.Image`. + `hv.Image` is generally sufficient, but if the coordinates are irregularly spaced, + `hv.QuadMesh` provides more accurate mapping, though at a slower performance. + kwargs: Additional options for `hv.Image` or `hv.QuadMesh` + (e.g., `width`, `height`, `cmap`, `log`). Returns: - [TODO:description] + hv.util.Dynamic: A dynamic map (UI) to adjust the parameters of `concat_along_phi` + interactively. + """ dataarray_a = _fix_xarray_to_fit_with_holoview(dataarray_a) dataarray_b = _fix_xarray_to_fit_with_holoview(dataarray_b) @@ -207,15 +210,20 @@ def fit_inspection( ) -> AdjointLayout: """Fit results inspector. + This function generates a set of plots to inspect the fit results of ARPES data. The main plot + shows the measured ARPES data along with the fit and residuals. Additionally, a dynamic profile + view is provided to inspect specific cuts of the data along with the corresponding fit and + residual profiles. The plots are interactive and allow for zooming and panning. + Args: - dataset: [TODO:description] - use_quadmesh (bool): If true, use hv.QuadMesh instead of hv.Image. - In most case, hv.Image is sufficient. However, if the coords is irregulaly spaced, - hv.QuadMesh would be more accurate mapping, but very slow. - kwargs: [TODO:description] + dataset (xr.Dataset): The input dataset containing ARPES data, fit, and residual variables. + use_quadmesh (bool): If True, uses `hv.QuadMesh` instead of `hv.Image` for plotting. + `hv.QuadMesh` is more accurate for irregularly spaced coordinates but may be slower. + kwargs: Additional arguments passed to the plot options, such as plot size, colormap, and + logarithmic scaling. Returns: - [TODO:description] + AdjointLayout: A holoviews AdjointLayout object containing the interactive plots. 
""" kwargs.setdefault("width", 300) kwargs.setdefault("height", 300) diff --git a/src/arpes/plotting/movie.py b/src/arpes/plotting/movie.py index 0dca67cb..cd965008 100644 --- a/src/arpes/plotting/movie.py +++ b/src/arpes/plotting/movie.py @@ -37,19 +37,24 @@ def plot_movie( # noqa: PLR0913 figsize: tuple[float, float] | None = None, **kwargs: Unpack[PColorMeshKwargs], ) -> Path | animation.FuncAnimation: - """Make an animated plot of a 3D dataset using one dimension as "time". + """Creates an animated plot of a 3D dataset using one dimension as "time". Args: data (xr.DataArray): ARPES data - time_dim (str): dimension name for time, default is "delay". - interval_ms: Delay between frames in milliseconds. - fig_ax (tuple[Figure, Axes]): matplotlib object - out: [TODO:description] - figsize (tuple[float, float]) : figure size of the movie. - kwargs: [TODO:description] + time_dim (str): Dimension name for time, default is "delay" + interval_ms (float): Delay between frames in milliseconds + fig_ax (tuple[Figure, Axes]): matplotlib Figure and Axes objects + out (str | Path): Output path for saving the animation (optional) + figsize (tuple[float, float]): Size of the movie figure + kwargs: Additional keyword arguments for the plot + + Returns: + Path | animation.FuncAnimation: The path to the saved animation or the animation object + itself Raises: - TypeError: [TODO:description] + TypeError: If the argument types are incorrect. + """ figsize = figsize or (7.0, 7.0) data = data if isinstance(data, xr.DataArray) else normalize_to_spectrum(data) diff --git a/src/arpes/plotting/parameter.py b/src/arpes/plotting/parameter.py index 043e12a8..bbf38bf1 100644 --- a/src/arpes/plotting/parameter.py +++ b/src/arpes/plotting/parameter.py @@ -33,22 +33,23 @@ def plot_parameter( # noqa: PLR0913 figsize: tuple[float, float] = (7, 5), **kwargs: Unpack[MPLPlotKwargs], ) -> Axes: - """Make a simple scatter plot of a parameter from an ``broadcast_fit` result. + """Creates a scatter plot of a parameter from a `broadcast_fit` result. Args: - fit_data: Fitting result. (broadcast_fit.results) - param_name: The parameter name of fitting. - ax: Axes on which to plot. By default, use the current axes. - shift: [TODO:description] - x_shift: [TODO:description] - two_sigma (bool): [TODO:description] - figsize: [TODO:description] - kwargs: [TODO:description] + fit_data (xr.DataArray): The fitting result, typically from `broadcast_fit.results`. + param_name (str): The name of the parameter to plot. + ax (Axes, optional): The axes on which to plot. If not provided, a new set of axes will be + created. + shift (float, optional): A vertical shift for the plot. Default is 0. + x_shift (float, optional): A horizontal shift for the x-values. Default is 0. + two_sigma (bool, optional): If True, plots the error bars as two standard deviations. + Default is False. + figsize (tuple[float, float], optional): The size of the figure. Default is (7, 5). + kwargs: Additional keyword arguments for the plot (e.g., `color`, `markersize`, etc.). Returns: - [TODO:description] + Axes: The Axes object with the plot. 
""" - """Makes a simple scatter plot of a parameter from an `broadcast_fit` result.""" if ax is None: _, ax = plt.subplots(figsize=figsize) assert isinstance(ax, Axes) diff --git a/src/arpes/plotting/utils.py b/src/arpes/plotting/utils.py index 49168e2b..f28acdb1 100644 --- a/src/arpes/plotting/utils.py +++ b/src/arpes/plotting/utils.py @@ -674,16 +674,17 @@ def imshow_arr( over: AxesImage | None = None, **kwargs: Unpack[IMshowParam], ) -> tuple[Figure | None, AxesImage]: - """Similar to plt.imshow but users different default origin, and sets appropriate extents. + """Display ARPES data using imshow with default settings suited for xr.DataArray. Args: - arr (xr.DataArray): ARPES data - ax (Axes): [TODO:description] - over ([TODO:type]): [TODO:description] - kwargs: pass to ax.imshow + arr (xr.DataArray): ARPES data to be visualized. + ax (Axes | None): The Axes object to plot on; creates a new figure if None. + over (AxesImage | None): Optional, overlays an existing image if provided. + kwargs: Additional arguments to pass to ax.imshow, such as colormap, alpha, etc. Returns: - The axes and quadmesh instance. + tuple: A tuple containing the figure (or None if ax is provided) and the + AxesImage instance resulting from imshow. """ fig: Figure | None = None if ax is None: diff --git a/src/arpes/preparation/axis_preparation.py b/src/arpes/preparation/axis_preparation.py index 593138dd..bf412b61 100644 --- a/src/arpes/preparation/axis_preparation.py +++ b/src/arpes/preparation/axis_preparation.py @@ -56,11 +56,12 @@ def sort_axis(data: xr.DataArray, axis_name: str) -> xr.DataArray: """Sorts slices of `data` along `axis_name` so that they lie in order. Args: - data(xr.DataArray): [TODO:description] - axis_name(str): [TODO:description] + data (xr.DataArray): The xarray data to be sorted. + axis_name (str): The name of the axis along which to sort. - Returns(xr.DataArray): - [TODO:description] + Returns: + xr.DataArray: The sorted xarray data.orts slices of `data` along `axis_name` so that they + lie in order. """ assert isinstance(data, xr.DataArray) copied = data.copy(deep=True) @@ -74,20 +75,21 @@ def sort_axis(data: xr.DataArray, axis_name: str) -> xr.DataArray: @update_provenance("Flip data along axis") def flip_axis( - arr: xr.DataArray, # valuse is used + arr: xr.DataArray, # values is used axis_name: str, *, flip_data: bool = True, ) -> xr.DataArray: - """Flips the coordinate values along an axis w/o changing the data as well. + """Flips the coordinate values along an axis without changing the data. Args: - arr (xr.DataArray): [TODO:description] - axis_name(str): [TODO:description] - flip_data(bool): [TODO:description] + arr (xr.DataArray): The xarray data to be modified. + axis_name (str): The name of the axis to flip. + flip_data (bool): If True, the data will also be flipped along the axis. - Returns(xr.DataArray): - [TODO:description] + Returns: + xr.DataArray: The xarray data with flipped coordinates.Flips the coordinate values along an + axis w/o changing the data as well. """ coords = copy.deepcopy(arr.coords) coords[axis_name] = coords[axis_name][::-1] @@ -162,10 +164,13 @@ def normalize_total(data: XrTypes, *, total_intensity: float = 1000000) -> xr.Da def dim_normalizer( dim_name: str, ) -> Callable[[xr.DataArray], xr.DataArray]: - """Safe partial application of dimension normalization. + """Returns a function for safely applying dimension normalization. Args: - dim_name (str): [TODO:description] + dim_name (str): The name of the dimension to normalize. 
+
+    Returns:
+        Callable: A function that normalizes the dimension of an xarray data.
     """

     def normalize(arr: xr.DataArray) -> xr.DataArray:
@@ -186,16 +191,20 @@ def transform_dataarray_axis(  # noqa: PLR0913
     *,
     remove_old: bool = True,
 ) -> xr.Dataset:
-    """Applies a function onto a DataArray axis.
+    """Applies a function to a DataArray axis.

     Args:
-        func ([TODO:type]): [TODO:description]
-        old_and_new_axis_names (tuple[str, str]) : old and new axis names as the tuple form
-        new_axis ([TODO:type]): [TODO:description]
-        dataset(xr.Dataset): [TODO:description]
-        prep_name ([TODO:type]): [TODO:description]
-        transform_spectra ([TODO:type]): [TODO:description]
-        remove_old ([TODO:type]): [TODO:description]
+        func (Callable): The function to apply to the axis of the DataArray
+        old_and_new_axis_names (tuple[str, str]): Tuple containing the old and new axis names
+        new_axis (NDArray[np.float64] | xr.DataArray): Values for the new axis
+        dataset (xr.Dataset): The dataset to transform
+        prep_name (Callable): Function to prepare the name for the transformed DataArrays
+        transform_spectra (dict[str, xr.DataArray] | None): Dictionary of spectra to transform
+            (default is None)
+        remove_old (bool): Whether to remove the old axis (default is True)
+
+    Returns:
+        xr.Dataset: A new dataset with the transformed axis.
     """
     old_axis_name, new_axis_name = old_and_new_axis_names
diff --git a/src/arpes/provenance.py b/src/arpes/provenance.py
index 2e2756cd..23a68eec 100644
--- a/src/arpes/provenance.py
+++ b/src/arpes/provenance.py
@@ -159,8 +159,8 @@ def update_provenance(
     """A decorator that promotes a function to one that records data provenance.

     Args:
-        what: Description of what transpired, to put into the record.
-        keep_parent_ref: Whether to keep a pointer to the parents in the hierarchy or not.
+        what (str): Description of what transpired, to put into the record.
+        keep_parent_ref (bool): Whether to keep a pointer to the parents in the hierarchy or not.

     Returns:
         A decorator which can be applied to a function.
@@ -169,10 +169,13 @@ def update_provenance(
     def update_provenance_decorator(
         fn: Callable[P, R],
     ) -> Callable[P, R]:
-        """[TODO:summary].
+        """A wrapper function that records data provenance for the execution of a function.

         Args:
-            fn: [TODO:description]
+            fn (Callable): The function for which provenance will be recorded.
+
+        Returns:
+            Callable: A function that has been extended to record data provenance.
         """

         @functools.wraps(fn)
@@ -237,14 +240,14 @@ def save_plot_provenance(plot_fn: Callable[P, R]) -> Callable[P, R]:

     @functools.wraps(plot_fn)
     def func_wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
-        """[TODO:summary].
+        """A wrapper function that records provenance information after generating a plot.

         Args:
-            args: [TODO:description]
-            kwargs: [TODO:description]
+            args: Positional arguments passed to `plot_fn`
+            kwargs: Keyword arguments passed to `plot_fn`

         Returns:
-            [TODO:description]
+            str: The file path where the plot is saved.
         """
         path = plot_fn(*args, **kwargs)
         if isinstance(path, str) and Path(path).exists():
diff --git a/src/arpes/simulation.py b/src/arpes/simulation.py
index 414bdc8c..e9b89907 100644
--- a/src/arpes/simulation.py
+++ b/src/arpes/simulation.py
@@ -427,7 +427,7 @@ def __init__(
             k: The momentum axis.
             omega: The energy axis.
             temperature: The temperature to use for the calculation. Defaults to None.
-            mfl_parameter (tuple[float, float]): The MFL paramter ('a', and 'b').
+            mfl_parameter (tuple[float, float]): The MFL parameter ('a' and 'b').
                 Defaults to (10.0, 1.0)
         """
         super().__init__(k, omega, temperature)
@@ -451,7 +451,7 @@ def __init__(
         k: NDArray[np.float64] | None = None,
         omega: NDArray[np.float64] | None = None,
         temperature: float = 20,
-        gap_paramters: tuple[float, float, float] = (50, 30, 0),
+        gap_parameters: tuple[float, float, float] = (50, 30, 0),
     ) -> None:
         """Initializes from parameters.

         Args:
             k: The momentum axis.
             omega: The energy axis.
             temperature: The temperature to use for the calculation. Defaults to None.
             delta: The gap size.
-            gap_paramters (tuple[float, float, float]): Gap paramter of the BSSCO,
-                Delta, and two Gamma pamaramters (s- and p-wave)
+            gap_parameters (tuple[float, float, float]): Gap parameters of the BSSCO model:
+                Delta, and two Gamma parameters (s- and p-wave).
         """
-        self.delta, self.gamma_s, self.gamma_p = gap_paramters
+        self.delta, self.gamma_s, self.gamma_p = gap_parameters
         super().__init__(k, omega, temperature)

     def digest_to_json(self) -> dict[str, Any]:
diff --git a/src/arpes/utilities/__init__.py b/src/arpes/utilities/__init__.py
index 10cf0f0b..75759999 100644
--- a/src/arpes/utilities/__init__.py
+++ b/src/arpes/utilities/__init__.py
@@ -6,7 +6,7 @@
 from operator import itemgetter
 from typing import TYPE_CHECKING, Any

-from .collections import deep_equals, deep_update
+from .collections import deep_update
 from .combine import concat_along_phi
 from .dict import (
     clean_keys,
diff --git a/src/arpes/utilities/bz.py b/src/arpes/utilities/bz.py
index 517ed657..bb80336f 100644
--- a/src/arpes/utilities/bz.py
+++ b/src/arpes/utilities/bz.py
@@ -59,7 +59,7 @@ def process_kpath(
     path: str,
     cell: Cell,
 ) -> NDArray[np.float64]:
-    """Converts paths consiting of point definitions to raw coordinates.
+    """Converts paths consisting of point definitions to raw coordinates.

     Args:
         path: String that represents the high symmetry points such as "GMK".
diff --git a/src/arpes/utilities/collections.py b/src/arpes/utilities/collections.py index dacb30c5..aec89de5 100644 --- a/src/arpes/utilities/collections.py +++ b/src/arpes/utilities/collections.py @@ -2,14 +2,10 @@ from __future__ import annotations -from collections.abc import Mapping, Sequence -from itertools import starmap +from collections.abc import Mapping from typing import TypeVar -__all__ = ( - "deep_equals", - "deep_update", -) +__all__ = ("deep_update",) T = TypeVar("T") @@ -33,37 +29,3 @@ def deep_update(destination: dict[str, T], source: dict[str, T]) -> dict[str, T] destination[k] = v return destination - - -def deep_equals( - a: T | Sequence[T] | set[T] | Mapping[str, T] | None, - b: T | Sequence[T] | set[T] | Mapping[str, T] | None, -) -> bool: - """An equality check that looks into common collection types.""" - if not isinstance(b, type(a)): - return False - - if isinstance(a, str | float | int | None | set): - return a == b - - if isinstance(a, Sequence) and isinstance(b, Sequence): - if len(a) != len(b): - return False - return all(starmap(deep_equals, zip(a, b, strict=True))) - - if isinstance(a, Mapping) and isinstance(b, Mapping): - return _deep_equals_dict(a, b) - raise TypeError - - -def _deep_equals_dict(a: Mapping, b: Mapping) -> bool: - if set(a.keys()) != set(b.keys()): - return False - - for k in a: - item_a, item_b = a[k], b[k] - - if not deep_equals(item_a, item_b): - return False - - return True diff --git a/src/arpes/utilities/conversion/grids.py b/src/arpes/utilities/conversion/grids.py index 7147d967..bd832d88 100644 --- a/src/arpes/utilities/conversion/grids.py +++ b/src/arpes/utilities/conversion/grids.py @@ -24,11 +24,11 @@ def is_dimension_convertible_to_momentum(dimension_name: str) -> bool: - """Determine whether a dimension can paticipate in the momentum conversion. + """Determine whether a dimension can participate in the momentum conversion. if dimension name is in {"phi", "theta", "beta", "chi", "psi", "hv"}, return True. Originally, is_dimension_unconvertible(dimension_name: str) is defined. - {"phi", "theta", "beta", "chi", "psi", "hv"} can be converted to momemtum. + {"phi", "theta", "beta", "chi", "psi", "hv"} can be converted to momentum. Args: dimension_name (str): [description] diff --git a/src/arpes/utilities/conversion/trapezoid.py b/src/arpes/utilities/conversion/trapezoid.py index 67bb9419..e539fb9a 100644 --- a/src/arpes/utilities/conversion/trapezoid.py +++ b/src/arpes/utilities/conversion/trapezoid.py @@ -157,13 +157,21 @@ def phi_to_phi( binding_energy: NDArray[np.float64], phi: NDArray[np.float64], ) -> NDArray[np.float64]: - """[TODO:summary]. + """Converts the given phi values to a new phi representation based on binding energy. + + This method computes the new phi values based on the provided binding energy and phi values, + and stores the result in `self.phi`. If `self.phi` is already set, it simply returns + the existing value. Args: - binding_energy: [TODO:description] - phi: [TODO:description] - args: [TODO:description] - kwargs: [TODO:description] + binding_energy (NDArray[np.float64]): The array of binding energy values. + phi (NDArray[np.float64]): The array of phi values to be converted. + + Returns: + NDArray[np.float64]: The transformed phi values. + + Raises: + ValueError: If any required attributes are missing or invalid. 
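+
+        Example:
+            A minimal sketch; `converter` stands for an already-constructed trapezoidal
+            coordinate converter instance (its construction is omitted here and is an
+            assumption for illustration):
+
+            ```python
+            import numpy as np
+
+            binding_energy = np.linspace(-0.2, 0.1, 200)
+            phi = np.linspace(-0.15, 0.15, 200)
+            corrected_phi = converter.phi_to_phi(binding_energy, phi)
+            ```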
""" if self.phi is not None: return self.phi @@ -176,13 +184,17 @@ def phi_to_phi_forward( binding_energy: NDArray[np.float64], phi: NDArray[np.float64], ) -> NDArray[np.float64]: - """[TODO:summary]. + """Transforms phi values based on binding energy using a forward method. + + This method computes the new phi values based on the provided binding energy and phi values, + applying a forward transformation. The result is stored in the `phi_out` array. Args: - binding_energy: [TODO:description] - phi: [TODO:description] - args: [TODO:description] - kwargs: [TODO:description] + binding_energy (NDArray[np.float64]): The array of binding energy values. + phi (NDArray[np.float64]): The array of phi values to be converted. + + Returns: + NDArray[np.float64]: The transformed phi values after the forward transformation. """ phi_out = np.zeros_like(phi) _phi_to_phi_forward(binding_energy, phi, phi_out, self.corner_angles) diff --git a/src/arpes/utilities/xarray.py b/src/arpes/utilities/xarray.py index ee7ffdd3..a2736d27 100644 --- a/src/arpes/utilities/xarray.py +++ b/src/arpes/utilities/xarray.py @@ -101,7 +101,7 @@ def apply_dataarray( Args: arr (xr.DataArray): original DataArray. f (Callable): Function to apple the DataArray. - args: argments for "f". + args: arguments for "f". kwargs: keyword arguments for "f" Returns: diff --git a/src/arpes/workflow.py b/src/arpes/workflow.py index 1c45f159..8f533c05 100644 --- a/src/arpes/workflow.py +++ b/src/arpes/workflow.py @@ -83,12 +83,22 @@ def wrapped_with_workspace( *args: P.args, **kwargs: P.kwargs, ) -> R: - """[TODO:summary]. + """Wraps a function execution with a context manager for handling workspace settings. + + This function wraps the execution of another function by setting up the appropriate + workspace environment using the `WorkspaceManager`. The workspace name can be specified + through the `kwargs` as `workspace_name`. The workspace is then passed to the original + function during its execution. Args: - args: args of the original function. - workspace (str | None): [TODO:description] - kwargs: [TODO:description] + args: Arguments for the original function. + workspace_name (str | None): The name of the workspace to be used. + If not provided, defaults to an empty string. + kwargs: Additional keyword arguments for the original function. + The `workspace` is added as a keyword argument. + + Returns: + R: The result returned by the wrapped function. """ workspace_name: str = kwargs.pop("workspace_name", "") with WorkspaceManager(workspace_name=workspace_name): diff --git a/src/arpes/xarray_extensions.py b/src/arpes/xarray_extensions.py index ddfde445..5f2376d9 100644 --- a/src/arpes/xarray_extensions.py +++ b/src/arpes/xarray_extensions.py @@ -478,7 +478,7 @@ def switch_energy_notation(self, nonlinear_order: int = 1) -> None: ) self._obj.attrs["energy_notation"] = "Binding" else: - msg = "Not impremented yet." + msg = "Not implemented yet." raise RuntimeError(msg) @@ -939,7 +939,7 @@ def short_history(self, key: str = "by") -> list: """Return the short version of history. Args: - key (str): key str in recored dict of self.history. (default: "by") + key (str): key str in recorded dict of self.history. (default: "by") """ return [h["record"][key] if isinstance(h, dict) else h for h in self.history] # type: ignore[literal-required] @@ -1129,7 +1129,7 @@ class ARPESProperty(ARPESPropertyBase): @staticmethod def dict_to_html(d: Mapping[str, float | str]) -> str: - """Returnn html format of dict object. + """Return html format of dict object. 
         Args:
             d: dict object
@@ -1333,7 +1333,7 @@ def transpose_to_front(self, dim: str) -> XrTypes:

         Returns:
             (XrTypes) Transposed ARPES data

-        Tooo:
+        Todo:
             Test
         """
         dims = list(self._obj.dims)
@@ -1350,7 +1350,7 @@ def transpose_to_back(self, dim: str) -> XrTypes:

         Returns:
             (XrTypes) Transposed ARPES data.

-        Tooo:
+        Todo:
             Test
         """
         dims = list(self._obj.dims)
@@ -1632,15 +1632,51 @@ def find_spectrum_energy_edges(
         *,
         indices: bool = False,
     ) -> NDArray[np.float64] | NDArray[np.int_]:
-        """Return energy position corresponding to the (1D) spectrum edge.
+        """Return the energy positions corresponding to the edges of the (1D) spectrum.

-        Spectrum edge is infection point of the peak.
+        The spectrum edge is taken at the inflection point of the peak; the edges are
+        located by edge detection along the energy axis after Gaussian smoothing.

         Args:
-            indices (bool): if True, return the pixel (index) number.
+            indices (bool, optional):
+                If `True`, returns the edge positions as pixel (index) numbers.
+                If `False`, returns the edge positions as energy coordinates.
+                Defaults to `False`.

-        Returns: NDArray[np.float64]
-            Energy position
+        Returns:
+            NDArray[np.float64] | NDArray[np.int_]: The energy positions of the spectrum
+                edges, or their indices when `indices=True`.
+
+        Todo:
+            - Add unit tests for edge cases and different data configurations.
+            - Investigate optimal parameters for edge detection
+              (e.g., Gaussian filter size, thresholds).
         """
         assert isinstance(
             self._obj,
@@ -1672,17 +1708,26 @@ def find_spectrum_angular_edges_full(
         indices: bool = False,
         energy_division: float = 0.05,
     ) -> tuple[NDArray[np.float64], NDArray[np.float64], xr.DataArray]:
-        """[TODO:summary].
+        """Finds the angular edges of the spectrum based on energy slicing and rebinning.
+
+        This method identifies the low and high angular edges for each slice of the spectrum
+        within the energy range, which is divided into steps of `energy_division`. For each
+        slice, edges are detected using the Canny edge detection algorithm after applying
+        Gaussian smoothing.

         Args:
-            indices: [TODO:description]
-            energy_division: [TODO:description]
+            indices (bool, optional): If True, returns edge indices; if False, returns physical
+                angular coordinates. Defaults to False.
+            energy_division (float, optional): Specifies the energy division step for rebinning.
+                Smaller values provide finer resolution for edge detection. Defaults to 0.05 eV.

         Returns:
-            [TODO:description]
+            tuple: A tuple containing:
+                - low_edges (NDArray[np.float64]): Values or indices of the low edges
+                  of the spectrum.
+                - high_edges (NDArray[np.float64]): Values or indices of the high edges
+                  of the spectrum.
+                - eV_coords (xr.DataArray): The coordinates of the rebinned energy axis.
+
+        Example:
+            ```python
+            # Assuming `data` is an xarray.DataArray with "eV" and "phi" or "pixel" dimensions
+            low_edges, high_edges, energy_coords = data.S.find_spectrum_angular_edges_full(
+                indices=False,
+                energy_division=0.1,
+            )
+            print("Low edges:", low_edges)
+            print("High edges:", high_edges)
+            print("Energy coordinates:", energy_coords)
+            ```

         Todo:
-            Test
+            - Add unit tests for this function.
""" # as a first pass, we need to find the bottom of the spectrum, we will use this # to select the active region and then to rebin into course steps in energy from 0 @@ -1746,19 +1791,28 @@ def zero_spectrometer_edges( low: Sequence[float] | NDArray[np.float64] | None = None, high: Sequence[float] | NDArray[np.float64] | None = None, ) -> xr.DataArray: - """[TODO:summary]. + """Zeros out the spectrum data outside of the specified low and high edges. + + It uses the provided or inferred edge information, applying cut margins and optionally + interpolating over a given range. Args: - cut_margin: [TODO:description] - interp_range: [TODO:description] - low: [TODO:description] - high: [TODO:description] + cut_margin (int or float, optional): Margin to apply when invalidating data near edges. + Use `int` for pixel-based margins or `float` for angular physical units. + Defaults to 50 pixels or 0.08 in angular units, depending on the data type. + interp_range (float or None, optional): Specifies the interpolation range for edge data. + If provided, the edge values are interpolated within this range. + low (Sequence[float], NDArray[np.float64], or None, optional): Low edge values. + Use this to manually specify the low edge. Defaults to None. + high (Sequence[float], NDArray[np.float64], or None, optional): High edge values. + Use this to manually specify the high edge. Defaults to None. Returns: - [TODO:description] + xr.DataArray: The spectrum data with values outside the edges set to zero. Todo: - Test + - Add tests. + """ assert isinstance(self._obj, xr.DataArray) if low is not None: @@ -1857,16 +1911,21 @@ def find_spectrum_angular_edges( return edges * delta[angular_dim] + self._obj.coords[angular_dim].values[0] def wide_angle_selector(self, *, include_margin: bool = True) -> slice: - """[TODO:summary]. + """Generates a slice for selecting the wide angular range of the spectrum. + + Optionally includes a margin to slightly reduce the range. Args: - include_margin: [TODO:description] + include_margin (bool, optional): If True, includes a margin to shrink the range. + Defaults to True. Returns: - [TODO:description] + slice: A slice object representing the wide angular range of the spectrum. Todo: - Test/Consider to remove + - Add tests. + - Consider removing the function. + """ edges = self.find_spectrum_angular_edges() low_edge, high_edge = np.min(edges), np.max(edges) @@ -1883,13 +1942,18 @@ def wide_angle_selector(self, *, include_margin: bool = True) -> slice: return slice(low_edge, high_edge) def meso_effective_selector(self) -> slice: - """[TODO:summary]. + """Creates a slice to select the "meso-effective" range of the spectrum. + + The range is defined as the upper energy range from `max(energy_edge) - 0.3` to + `max(energy_edge) - 0.1`. Returns: - [TODO:description] + slice: A slice object representing the meso-effective energy range. Todo: - Test/Consider to remove + - Add tests. + - Consider removing the function. + """ energy_edge = self.find_spectrum_energy_edges() return slice(np.max(energy_edge) - 0.3, np.max(energy_edge) - 0.1) @@ -1899,19 +1963,26 @@ def region_sel( *regions: Literal["copper_prior", "wide_angular", "narrow_angular"] | dict[str, DesignatedRegions], ) -> XrTypes: - """[TODO:summary]. + """Filters the data by selecting specified regions and applying those regions to the object. + + Regions can be provided as literal strings or as a dictionary of `DesignatedRegions`. 
         Args:
-            regions: [TODO:description]
+            regions (Literal or dict[str, DesignatedRegions]): The regions to select.
+                Valid regions include:
+                - "copper_prior": A specific region.
+                - "wide_angular": The wide angular region.
+                - "narrow_angular": The narrow angular region.
+                Alternatively, use the `DesignatedRegions` enumeration.

         Returns:
-            [TODO:description]
+            XrTypes: The data with the selected regions applied.

         Raises:
-            NotImplementedError: [TODO:description]
+            NotImplementedError: If a specified region cannot be resolved.

         Todo:
-            Test
+            - Add tests.
         """

         def process_region_selector(
@@ -2178,13 +2249,25 @@ def fermi_edge_reference_plot(
     ) -> Path | Axes:
         """Provides a reference plot for a Fermi edge reference.

+        This function generates a reference plot for a Fermi edge, which can be useful for analyzing
+        energy spectra. It calls the `fermi_edge_reference` function and passes any additional
+        keyword arguments to it for plotting customization. The output file name can be specified
+        using the `out` argument, with a default name pattern.
+
         Args:
-            pattern ([TODO:type]): [TODO:description]
-            out (str | Path): Path name for output figure.
-            kwargs: pass to plotting.fermi_edge.fermi_edge_reference
+            pattern (str): A string pattern for the output file name. The pattern can include
+                placeholders that will be replaced by the label or other variables.
+                Default is "{}.png".
+            out (str | Path): The path for saving the output figure. If set to `None` or `False`,
+                no figure will be saved. If a boolean `True` is passed, it will use the `pattern`
+                to generate the filename.
+            kwargs: Additional arguments passed to the `fermi_edge_reference` function for
+                customizing the plot.

         Returns:
-            [TODO:description]
+            Path | Axes: The path to the saved figure (if `out` is provided), or the Axes object
+                of the plot.
+
         """
         assert isinstance(self._obj, xr.DataArray)
         if out is not None and isinstance(out, bool):
@@ -2198,15 +2281,27 @@ def _referenced_scans_for_spatial_plot(
         pattern: str = "{}.png",
         out: str | Path = "",
     ) -> Path | tuple[Figure, NDArray[np.object_]]:
-        """[TODO:summary].
+        """Helper function for generating a spatial plot of referenced scans.

-        A Helper function.
+        This function assists in generating a spatial plot for referenced scans, either by using a
+        unique identifier or a predefined label. The output file name can be automatically generated
+        or specified by the user. The function calls `reference_scan_spatial` for generating the
+        plot and optionally saves the output figure.

         Args:
-            use_id (bool): [TODO:description]
-            pattern (str): [TODO:description]
-            out (str|bool): if str, Path for output figure. if True,
-                the file name is automatically set. If False/"", no output is given.
+            use_id (bool): If `True`, uses the "id" attribute from the object's metadata as the
+                label. If `False`, uses the predefined label. Default is `True`.
+            pattern (str): A string pattern for the output file name. The placeholder `{}` will be
+                replaced by the label or identifier. Default is `"{}.png"`.
+            out (str | bool): The path to save the output figure. If `True`, the file name is
+                generated using the `pattern`. If `False` or an empty string (`""`), no output is
+                saved.
+
+        Returns:
+            Path | tuple[Figure, NDArray[np.object_]]:
+                - If `out` is provided, returns the path to the saved figure.
+                - Otherwise, returns the Figure and an array of the spatial data.
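+
+        Example:
+            An internal-use sketch; here `xps` stands for a spatial scan such as
+            `example_data.nano_xps`, and the accessor path is an assumption:
+
+            ```python
+            from arpes.io import example_data
+
+            xps = example_data.nano_xps
+            fig, ax_array = xps.S._referenced_scans_for_spatial_plot(
+                use_id=False,
+                out="",  # with no output path, returns (Figure, axes) instead of saving
+            )
+            ```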
+ """ label = self._obj.attrs["id"] if use_id else self.label if isinstance(out, bool) and out is True: @@ -2324,18 +2419,19 @@ def apply_over( copy: bool = True, **selections: Incomplete, ) -> XrTypes: - """[TODO:summary]. + """Applies a function to a data region and updates the dataset with the result. Args: - fn: [TODO:description] - copy: [TODO:description] - selections: [TODO:description] + fn (Callable): The function to apply. + copy (bool, optional): If True, operates on a deep copy of the data. + If False, modifies the data in-place. Defaults to True. + selections (Incomplete): Keyword arguments specifying the region of the data to select. Returns: - [TODO:description] + XrTypes: The dataset after the function has been applied. Todo: - Test + - Add tests. """ assert isinstance(self._obj, xr.DataArray | xr.Dataset) data = self._obj @@ -2416,7 +2512,7 @@ def iter_coords( *, reverse: bool = False, ) -> Iterator[dict[Hashable, float]]: - """Iterator for cooridinates along the axis. + """Iterator for coordinates along the axis. Args: dim_names (Sequence[Hashable]): Dimensions for iteration. @@ -2547,16 +2643,17 @@ def filter_vars( self, f: Callable[[Hashable, xr.DataArray], bool], ) -> xr.Dataset: - """[TODO:summary]. + """Filters data variables based on the specified condition and returns a new dataset. Args: - f: [TODO:description] + f (Callable[[Hashable, xr.DataArray], bool]): A function to filter data variables. + It takes a variable name (key) and its data and returns a boolean. Returns: - [TODO:description] + xr.Dataset: A new dataset with the filtered data variables. Todo: - Test + - Add tests. """ assert isinstance(self._obj, xr.Dataset) # ._obj.data_vars return xr.Dataset( @@ -2569,20 +2666,21 @@ def shift_coords( dims: tuple[str, ...], shift: NDArray[np.float64] | float, ) -> xr.Dataset: - """[TODO:summary]. + """Shifts the coordinates and returns a new dataset with the shifted coordinates. Args: - dims: [TODO:description] - shift: [TODO:description] + dims (tuple[str, ...]): The list of dimensions whose coordinates will be shifted. + shift (NDArray[np.float64] or float): The amount to shift the coordinates. If a float, + the same shift is applied to all dimensions. Returns: - [TODO:description] + xr.Dataset: A new dataset with the shifted coordinates. Raises: - RuntimeError: [TODO:description] + RuntimeError: If an invalid shift amount is provided. Todo: - Test + - Add tests. """ if not isinstance(shift, np.ndarray): shift = np.ones((len(dims),)) * shift @@ -2601,17 +2699,18 @@ def scale_coords( dims: tuple[str, ...], scale: float | NDArray[np.float64], ) -> xr.Dataset: - """[TODO:summary]. + """Scales the coordinates and returns a new dataset with the scaled coordinates. Args: - dims: [TODO:description] - scale: [TODO:description] + dims (tuple[str, ...]): The list of dimensions whose coordinates will be scaled. + scale (float or NDArray[np.float64]): The amount to scale the coordinates. If a float, + the same scaling is applied to all dimensions. Returns: - [TODO:description] + xr.Dataset: A new dataset with the scaled coordinates. Todo: - Test + - Add tests. """ if not isinstance(scale, np.ndarray): n_dims = len(dims) @@ -2750,16 +2849,29 @@ def to_arrays(self) -> tuple[NDArray[np.float64], NDArray[np.float64]]: return (self._obj.coords[self._obj.dims[0]].values, self._obj.values) def clean_outliers(self, clip: float = 0.5) -> xr.DataArray: - """[TODO:summary]. + """Clip outliers in the DataArray by limiting values to a specified percentile range. 
+ + This method modifies the values of an `xarray.DataArray` to ensure that they fall within a + specified range defined by percentiles. Any value below the lower percentile is set to the + lower limit, and any value above the upper percentile is set to the upper limit. Args: - clip: [TODO:description] + clip (float, optional): The percentile range to use for clipping. The lower and upper + bounds are determined by the `clip` value and its complement: + - Lower bound: `clip` percentile. + - Upper bound: `(100 - clip)` percentile. + For example, if `clip=0.5`, the lower 0.5% and upper 99.5% of + the data will be clipped. + Default is 0.5. Returns: - [TODO:description] + xr.DataArray: A new DataArray with outliers clipped to the specified range. + + Raises: + AssertionError: If the underlying object is not an `xarray.DataArray`. Todo: - Test + - Add unit tests to ensure the method behaves as expected. """ assert isinstance(self._obj, xr.DataArray) low, high = np.percentile(self._obj.values, [clip, 100 - clip]) @@ -2776,19 +2888,57 @@ def as_movie( out: str | bool = "", **kwargs: Unpack[PColorMeshKwargs], ) -> Path | animation.FuncAnimation: - """[TODO:summary]. + """Create an animation or save images showing the DataArray's evolution over time. + + This method creates a time-based visualization of an `xarray.DataArray`, either as an + animation or as a sequence of images saved to disk. The `time_dim` parameter specifies + the dimension used for the temporal progression. Args: - time_dim: [TODO:description] - pattern: [TODO:description] - out: [TODO:description] - kwargs: [TODO:description] + time_dim (str, optional): The name of the dimension representing time or progression + in the DataArray. Defaults to "delay". + pattern (str, optional): A format string to name output image files. The string should + include a placeholder (`{}`) for dynamic naming. Defaults to "{}.png". + out (str | bool, optional): Determines the output format: + - If a string is provided, it is used as the base name for the output file or + directory. + - If `True`, the file name is automatically generated using the DataArray's label + and the provided `pattern`. + - If `False` or an empty string, the animation is returned without saving. + Defaults to "". + kwargs (optional): Additional keyword arguments passed to the `plot_movie` function. + These can customize the appearance of the generated images or animation. Returns: - [TODO:description] + Path | animation.FuncAnimation: + - If `out` is specified (as a string or `True`), returns a `Path` to the saved file. + - If `out` is `False` or an empty string, returns a + `matplotlib.animation.FuncAnimation` object. + + Raises: + AssertionError: If the underlying object is not an `xarray.DataArray`. + AssertionError: If `out` is not a valid string when required. + + Example: + ```python + import xarray as xr + + # Create a sample DataArray with a time dimension + data = xr.DataArray( + [[[i + j for j in range(10)] for i in range(10)] for _ in range(5)], + dims=("time", "x", "y"), + coords={"time": range(5), "x": range(10), "y": range(10)}, + ) + + # Generate an animation + animation = data.as_movie(time_dim="time") + # Save as images or a movie file + data.as_movie(time_dim="time", out=True, pattern="frame_{}.png") + ``` Todo: - Test + - Add unit tests to verify functionality with various data configurations. + - Enhance compatibility with additional plot types. 
""" assert isinstance(self._obj, xr.DataArray) @@ -2803,18 +2953,52 @@ def map_axes( fn: Callable[[XrTypes, dict[str, float]], DataType], dtype: DTypeLike = None, ) -> xr.DataArray: - """[TODO:summary]. + """Apply a function along specified axes of the DataArray, creating a new DataArray. + + This method iterates over the coordinates of the specified axes, applies the provided + function to each coordinate, and assigns the result to the corresponding position + in the output DataArray. Optionally, the data type of the output array can be specified. Args: - axes ([TODO:type]): [TODO:description] - fn: [TODO:description] - dtype: [TODO:description] + axes (list[str] | str): The axis or axes along which to iterate and apply the function. + fn (Callable[[XrTypes, dict[str, float]], DataType]): A function that takes the selected + data and its coordinates as input and returns the transformed data. + dtype (DTypeLike, optional): The desired data type for the output DataArray. If not + specified, the type is inferred from the function's output. + + Returns: + xr.DataArray: A new DataArray with the function applied along the specified axes. Raises: - TypeError: [TODO:description] + TypeError: If the input arguments or operations result in a type mismatch. + + Example: + ```python + import xarray as xr + import numpy as np + + # Create a sample DataArray + data = xr.DataArray( + np.random.rand(5, 5), + dims=["x", "y"], + coords={"x": range(5), "y": range(5)}, + ) + + # Define a function to scale data + def scale_fn(data, coord): + scale_factor = coord["x"] + 1 + return data * scale_factor + + # Apply the function along the "x" axis + result = data.map_axes(axes="x", fn=scale_fn) + + print(result) + ``` Todo: - Test + - Add tests to validate the behavior with complex axes configurations. + - Optimize performance for high-dimensional DataArrays. + """ obj = self._obj.copy(deep=True) @@ -2923,14 +3107,14 @@ def map( fn: Callable[[NDArray[np.float64], Any], NDArray[np.float64]], **kwargs: Incomplete, ) -> xr.DataArray: - """[TODO:summary]. + """Applies the specified function to the values of an xarray and returns a new DataArray. Args: - fn (Callable): Function applying to xarray.values - kwargs: [TODO:description] + fn (Callable): The function to apply to the xarray values. + kwargs: Additional arguments to pass to the function. Returns: - [TODO:description] + xr.DataArray: A new DataArray with the function applied to the values. """ return apply_dataarray(self._obj, np.vectorize(fn, **kwargs)) @@ -2943,24 +3127,22 @@ def shift_by( zero_nans: bool = True, shift_coords: bool = False, ) -> xr.DataArray: - """Data shift along the axis. + """Shifts the data along the specified axis. - For now we only support shifting by a one dimensional array + Currently, only supports shifting by a one-dimensional array. Args: - other (xr.DataArray | NDArray): [TODO:description] - we only support shifting by a one dimensional array - shift_axis (str): [TODO:description] - by_axis (str): The dimension name of `other`. When `other` is xr.DataArray, this value - is ignored. - zero_nans (bool): if True, fill 0 for np.nan - shift_coords (bool): [TODO:description] + other (xr.DataArray | NDArray): Data to shift by. Only supports one-dimensional array. + shift_axis (str): The axis to shift along. + by_axis (str): The dimension name of `other`. Ignored when `other` is an xr.DataArray. + zero_nans (bool): If True, fill np.nan with 0. + shift_coords (bool): Whether to shift the coordinates as well. 
-        Returns (xr.DataArray):
-            Shifted xr.DataArray
+        Returns:
+            xr.DataArray: The shifted xr.DataArray.

         Todo:
-            Test
+            - Add tests.
         """
         assert shift_axis, "shift_by must take shift_axis argument."
         data = self._obj.copy(deep=True)
@@ -3006,13 +3188,13 @@ def shift_by(
         return built_data

     def drop_nan(self) -> xr.DataArray:
-        """[TODO:summary]..
+        """Drops the NaN values from the data.

         Returns:
-            [TODO:description]
+            xr.DataArray: The xr.DataArray with NaN values removed.

         Todo:
-            Test
+            - Add tests.
         """
         assert len(self._obj.dims) == 1
diff --git a/tests/conftest.py b/tests/conftest.py
index 8e6eb18e..056c0cfc 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -6,17 +6,18 @@
 from pathlib import Path
 from typing import TYPE_CHECKING, TypedDict

+import pytest
+
 import arpes.config
 import arpes.endstations
-import pytest
 from arpes.io import example_data
-
 from tests.utils import cache_loader

 if TYPE_CHECKING:
     from collections.abc import Callable, Iterator

     import xarray as xr
+
     from arpes._typing import ScanInfo, WorkSpaceType

@@ -32,55 +33,55 @@ class Scenario(TypedDict, total=False):
     file: str


-@pytest.fixture()
+@pytest.fixture
 def dataset_cut() -> xr.Dataset:
     """A fixture for loading Dataset."""
     return example_data.cut


-@pytest.fixture()
+@pytest.fixture
 def dataarray_cut() -> xr.DataArray:
     """A fixture for loading DataArray."""
     return example_data.cut.spectrum


-@pytest.fixture()
+@pytest.fixture
 def dataset_map() -> xr.Dataset:
     """A fixture for loading Dataset."""
     return example_data.map.spectrum


-@pytest.fixture()
+@pytest.fixture
 def dataarray_map() -> xr.DataArray:
     """A fixture for loading DataArray."""
     return example_data.map.spectrum


-@pytest.fixture()
+@pytest.fixture
 def xps_map() -> xr.Dataset:
     """A fixture for loading example_data.xps."""
     return example_data.nano_xps


-@pytest.fixture()
+@pytest.fixture
 def hv_map() -> xr.Dataset:
     """A fixture for loading photonenergy dependence."""
     return example_data.photon_energy


-@pytest.fixture()
+@pytest.fixture
 def dataset_cut2() -> xr.Dataset:
     """A fixture for loading Dataset."""
     return example_data.cut2


-@pytest.fixture()
+@pytest.fixture
 def dataarray_cut2() -> xr.DataArray:
     """A fixture for loading Dataset."""
     return example_data.cut2.spectrum


-@pytest.fixture()
+@pytest.fixture
 def dataset_temperature_dependence() -> xr.Dataset:
     """A fixture for loading Dataset (temperature_dependence)."""
     return example_data.temperature_dependence
@@ -117,7 +118,7 @@ class Sandbox:
 }


-@pytest.fixture()
+@pytest.fixture
 def sandbox_configuration() -> Iterator[Sandbox]:
     """Generates a sandboxed configuration of the ARPES data analysis suite."""
     resources_dir = Path.cwd() / "tests" / "resources"
diff --git a/tests/test_basic_data_loading.py b/tests/test_basic_data_loading.py
index c0dd5c28..f5fca3d3 100644
--- a/tests/test_basic_data_loading.py
+++ b/tests/test_basic_data_loading.py
@@ -5,10 +5,11 @@
 import contextlib
 from typing import TYPE_CHECKING, Any, ClassVar

-import arpes.xarray_extensions  # pylint: disable=unused-import, redefined-outer-name  # noqa: F401
 import numpy as np
 import pytest
 import xarray as xr
+
+import arpes.xarray_extensions  # pylint: disable=unused-import, redefined-outer-name  # noqa: F401
 from arpes.utilities.conversion import convert_to_kspace

 if TYPE_CHECKING:
@@ -426,10 +427,10 @@ def test_load_file_and_basic_attributes(
         data = sandbox_configuration.load(file)
         assert isinstance(data, xr.Dataset)

-        for k in expected:
+        for k, v in expected.items():
             metadata = getattr(data.S, k)
             assert k
-            assert
metadata == expected[k] + assert metadata == v class TestBasicDataLoading: @@ -726,8 +727,7 @@ class TestBasicDataLoading: "eV": [-3.619399, 0.1806009, 0.003999], "phi": [-0.28633, 0.26867, 0.0008409], }, - "offset_coords": { - }, + "offset_coords": {}, }, }, ), diff --git a/tests/test_bz_spec.py b/tests/test_bz_spec.py index 36f4a26c..9b0e0ca0 100644 --- a/tests/test_bz_spec.py +++ b/tests/test_bz_spec.py @@ -2,10 +2,11 @@ import numpy as np import pytest + from arpes.utilities import bz_spec -@pytest.mark.skip() +@pytest.mark.skip def test_bz_points_for_hexagonal_lattice() -> None: """Test for bz_points_for_hexagonal_lattice.""" np.testing.assert_allclose( diff --git a/tests/test_curve_fitting.py b/tests/test_curve_fitting.py index 5ea67b63..64f213d4 100644 --- a/tests/test_curve_fitting.py +++ b/tests/test_curve_fitting.py @@ -2,6 +2,7 @@ import numpy as np import xarray as xr + from arpes.analysis import rebin from arpes.fits import AffineBroadenedFD, LorentzianModel, broadcast_model from arpes.fits.utilities import parse_model diff --git a/tests/test_derivative_analysis.py b/tests/test_derivative_analysis.py index c3e4b0a9..b1b29ae8 100644 --- a/tests/test_derivative_analysis.py +++ b/tests/test_derivative_analysis.py @@ -6,6 +6,7 @@ import numpy as np import pytest + from arpes.analysis import ( curvature1d, curvature2d, diff --git a/tests/test_direct_and_example_data_loading.py b/tests/test_direct_and_example_data_loading.py index fa35f3b7..81e8843d 100644 --- a/tests/test_direct_and_example_data_loading.py +++ b/tests/test_direct_and_example_data_loading.py @@ -6,6 +6,7 @@ import numpy as np import xarray as xr + from arpes.endstations.plugin.ALG_main import ALGMainChamber from arpes.io import load_data, load_example_data diff --git a/tests/test_fits.py b/tests/test_fits.py index 52ca14f6..2b2fe3df 100644 --- a/tests/test_fits.py +++ b/tests/test_fits.py @@ -6,11 +6,12 @@ import matplotlib.pyplot as plt import numpy as np import pytest -from arpes.plotting.fits import plot_fit, plot_fits from matplotlib.axes import Axes +from arpes.plotting.fits import plot_fit, plot_fits + -@pytest.fixture() +@pytest.fixture def mock_model_results() -> list[lf.model.ModelResult]: results = [] for _ in range(4): @@ -30,7 +31,7 @@ def mock_model_results() -> list[lf.model.ModelResult]: return results -@pytest.fixture() +@pytest.fixture def mock_model_result(): mock_result = MagicMock() x = np.linspace(0, 10, 100) diff --git a/tests/test_generic_utilities.py b/tests/test_generic_utilities.py index 3d39ed85..7490a482 100644 --- a/tests/test_generic_utilities.py +++ b/tests/test_generic_utilities.py @@ -1,7 +1,8 @@ """Test for generic utility.""" import pytest -from arpes.utilities import clean_keys, deep_equals, deep_update + +from arpes.utilities import clean_keys, deep_update def test_cldean_keys() -> None: @@ -11,39 +12,6 @@ def test_cldean_keys() -> None: assert cleaned_dict == {"excitation_energy": 4.03, "count_cycle": 100} -@pytest.mark.parametrize( - ("destination", "source", "expected_equal"), - [ - ({}, {}, True), - ({"a": []}, {"a": []}, True), - ({"a": [1.1]}, {"a": [1.2]}, False), - ({"a": [{}, {"b": {"c": [5]}}]}, {"a": [{}, {"b": {"c": [5]}}]}, True), - ], -) -def test_deep_equals( - destination: dict, - source: dict, - *, - expected_equal: bool, -) -> None: - """Test for deep_equals. 
- - [TODO:description] - - Args: - destination: [TODO:description] - source: [TODO:description] - expected_equal: [TODO:description] - - Returns: - [TODO:description] - """ - if expected_equal: - assert deep_equals(destination, source) - else: - assert not deep_equals(destination, source) - - @pytest.mark.parametrize( ("destination", "source", "expected"), [ @@ -66,4 +34,4 @@ def test_deep_update(destination: dict, source: dict, expected: dict) -> None: Returns: [TODO:description] """ - assert deep_equals(deep_update(destination, source), expected) + assert deep_update(destination, source) == expected diff --git a/tests/test_momentum_conversion.py b/tests/test_momentum_conversion.py index 0d5b332b..a2a41736 100644 --- a/tests/test_momentum_conversion.py +++ b/tests/test_momentum_conversion.py @@ -2,10 +2,11 @@ from typing import TYPE_CHECKING -import arpes.xarray_extensions # pylint: disable=unused-import, redefined-outer-name # noqa: F401 import numpy as np import pytest import xarray as xr + +import arpes.xarray_extensions # pylint: disable=unused-import, redefined-outer-name # noqa: F401 from arpes.utilities.conversion import convert_to_kspace from arpes.utilities.conversion.base import CoordinateConverter from arpes.utilities.conversion.forward import ( diff --git a/tests/test_momentum_conversion_forward.py b/tests/test_momentum_conversion_forward.py index f7fb1660..9562bf6a 100644 --- a/tests/test_momentum_conversion_forward.py +++ b/tests/test_momentum_conversion_forward.py @@ -5,6 +5,7 @@ import numpy as np import pytest import xarray as xr + from arpes.fits.fit_models import AffineBroadenedFD, QuadraticModel from arpes.fits.utilities import broadcast_model from arpes.utilities.conversion.forward import ( @@ -19,7 +20,7 @@ RTOL = 1e-2 -@pytest.fixture() +@pytest.fixture def energy_corrected(dataarray_map: xr.DataArray) -> xr.DataArray: """A fixture for loading DataArray.""" fmap = dataarray_map diff --git a/tests/test_plot_with_holoviews.py b/tests/test_plot_with_holoviews.py index ca2e06eb..3d26d784 100644 --- a/tests/test_plot_with_holoviews.py +++ b/tests/test_plot_with_holoviews.py @@ -1,9 +1,10 @@ """Unit test for plotting/holoviews.py.""" import xarray as xr -from arpes.plotting import profile_view from holoviews.core.layout import AdjointLayout +from arpes.plotting import profile_view + class TestProfileView: """Class for profile_view function.""" diff --git a/tests/test_prodigy_itx.py b/tests/test_prodigy_itx.py index 765c5e5d..8f54b2c7 100644 --- a/tests/test_prodigy_itx.py +++ b/tests/test_prodigy_itx.py @@ -5,12 +5,13 @@ import numpy as np import pytest import xarray as xr + from arpes.endstations.prodigy_itx import ProdigyItx, load_sp2 data_dir = Path(__file__).parent.parent / "src" / "arpes" / "example_data" -@pytest.fixture() +@pytest.fixture def sample_itx() -> ProdigyItx: """Fixture.""" with Path(data_dir / "example_itx_data.itx").open(mode="r") as itx_file: @@ -18,7 +19,7 @@ def sample_itx() -> ProdigyItx: return ProdigyItx(itx_data) -@pytest.fixture() +@pytest.fixture def sample_sp2() -> xr.DataArray: """Fixture: produce xr.DataArray.""" return load_sp2(data_dir / "GrIr_111_20230410_1.sp2") diff --git a/tests/test_prodigy_xy.py b/tests/test_prodigy_xy.py index 3d682077..593ca048 100644 --- a/tests/test_prodigy_xy.py +++ b/tests/test_prodigy_xy.py @@ -4,12 +4,13 @@ import numpy as np import pytest + from arpes.endstations.prodigy_xy import ProdigyXY data_dir = Path(__file__).parent.parent / "src" / "arpes" / "example_data" -@pytest.fixture() +@pytest.fixture def 
sample_xy() -> ProdigyXY: """Fixture.""" with Path(data_dir / "BLGr_GK_example_xy_data.xy").open(mode="r") as xy_file: diff --git a/tests/test_stack_plot.py b/tests/test_stack_plot.py index 9180cd1c..3ae45ba7 100644 --- a/tests/test_stack_plot.py +++ b/tests/test_stack_plot.py @@ -3,6 +3,7 @@ import numpy as np import pytest import xarray as xr + from arpes.plotting import stack_plot diff --git a/tests/test_xarray_extensions.py b/tests/test_xarray_extensions.py index 0b73eb5c..6c1d62e2 100644 --- a/tests/test_xarray_extensions.py +++ b/tests/test_xarray_extensions.py @@ -3,6 +3,7 @@ import numpy as np import pytest import xarray as xr + from arpes.fits.fit_models import ( AffineBackgroundModel, AffineBroadenedFD, @@ -284,14 +285,14 @@ def test_switch_energy_notation( with pytest.raises(RuntimeError) as e: hv_map.S.switch_energy_notation() - assert str(e.value) == "Not impremented yet." + assert str(e.value) == "Not implemented yet." with pytest.raises(RuntimeError) as e: hv_map.S.switch_energy_notation() - assert str(e.value) == "Not impremented yet." + assert str(e.value) == "Not implemented yet." with pytest.raises(RuntimeError) as e: hv_map.spectrum.S.switch_energy_notation() - assert str(e.value) == "Not impremented yet." + assert str(e.value) == "Not implemented yet." def test_spectrum_type(self, dataarray_cut: xr.DataArray) -> None: """Test spectrum_type.""" diff --git a/tests/test_xps.py b/tests/test_xps.py index 98f83d30..ed5b6b21 100644 --- a/tests/test_xps.py +++ b/tests/test_xps.py @@ -2,6 +2,7 @@ import numpy as np import xarray as xr + from arpes.analysis.xps import approximate_core_levels