Metadata check #330
Merged 5 commits, Nov 27, 2024
punchbowl/data/data/Level0.yaml: 2 changes (1 addition, 1 deletion)

@@ -49,7 +49,7 @@ Kinds:
      NAXIS: 2
      NAXIS1: 2048
      NAXIS2: 2048
-    omits: [NAXIS3]
+    omits: [Camera and Readout State, Onboard Image Processing, Calibration Data, Spacecraft Location & Environment, NAXIS3]

  PM:
    overrides:
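Throughout these YAML changes, `omits` removes individual keywords or whole sections from the omniheader baseline, while `overrides` replaces values. A rough illustration of that resolution order (the function below is a sketch, not punchbowl's actual implementation):

```python
from collections import OrderedDict


def resolve_section(baseline, omits, overrides):
    """Drop omitted keywords first, then apply overrides to what remains."""
    resolved = OrderedDict((k, v) for k, v in baseline.items() if k not in omits)
    resolved.update(overrides)
    return resolved


# Mirrors the Level0 Kinds change above: NAXIS3 is omitted and NAXIS forced to 2.
baseline = OrderedDict([("NAXIS", 3), ("NAXIS1", 2048), ("NAXIS2", 2048), ("NAXIS3", 5)])
resolved = resolve_section(baseline, omits=["NAXIS3"], overrides={"NAXIS": 2})
print(dict(resolved))  # {'NAXIS': 2, 'NAXIS1': 2048, 'NAXIS2': 2048}
```

Omitting a whole section name (e.g. `Camera and Readout State`) would work the same way one level up, dropping the entire group of keywords at once.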
punchbowl/data/data/Level1.yaml: 4 changes (3 additions, 1 deletion)

@@ -7,6 +7,7 @@ Level:
  overrides:
    DESCRPTN: PUNCH Level-1 data, Calibrated instrumental units in camera coordinates
  File Type and Provenance:
+   omits: [FILE_RAW]
    overrides:
      LEVEL: 1
  Temporal Information:
@@ -45,14 +46,15 @@ Kinds:
      NAXIS: 2
      NAXIS1: 2048
      NAXIS2: 2048
-    omits: [NAXIS3]
+    omits: [Camera and Readout State, Onboard Image Processing, Calibration Data, Spacecraft Location & Environment, NAXIS3]

  QuarticCalibration:
    overrides:
      NAXIS: 3
      NAXIS1: 2048
      NAXIS2: 2048
      NAXIS3: 5
+    omits: [Camera and Readout State, Onboard Image Processing, Calibration Data, Spacecraft Location & Environment]

  PM:
    overrides:
punchbowl/data/data/Level2.yaml: 5 changes (3 additions, 2 deletions)

@@ -7,6 +7,7 @@ Level:
  overrides:
    DESCRPTN: PUNCH Level-2 data, Composite mosaic in output coordinates
  File Type and Provenance:
+   omits: [FILE_RAW]
    overrides:
      LEVEL: 2
      OBSRVTRY: PUNCH
@@ -57,9 +58,9 @@ Kinds:
      CRPIX2A: 2047.5
      CTYPE1A: RA---ARC
      CTYPE2A: DEC--ARC
-    omits: [NAXIS3, FILTER, OBSLAYR1, OBSLAYR2, OBSLAYR3 ,CRPIX3,
+    omits: [NAXIS3, FILTER, OBSLAYR1, OBSLAYR2, OBSLAYR3,CRPIX3,
      PC1_3, PC2_3, PC3_1, PC3_2, PC3_3, CDELT3, CUNIT3, CTYPE3, CRVAL3, CNAME3,
-     CRPIX3A, PC1_3A, PC2_3A, PC3_1A, PC3_2A, PC3_3A, CDELT3A, CUNIT3A, CTYPE3A, CRVAL3A, CNAME3A ]
+     CRPIX3A, PC1_3A, PC2_3A, PC3_1A, PC3_2A, PC3_3A, CDELT3A, CUNIT3A, CTYPE3A, CRVAL3A, CNAME3A]

  Calibration:
    overrides:
punchbowl/data/data/Level3.yaml: 5 changes (3 additions, 2 deletions)

@@ -7,6 +7,7 @@ Level:
  overrides:
    DESCRPTN: PUNCH Level-3 data, Composite mosaic in output coordinates
  File Type and Provenance:
+   omits: [FILE_RAW]
    overrides:
      LEVEL: 3
      OBSRVTRY: PUNCH
@@ -49,7 +50,7 @@ Kinds:
      NAXIS: 2
      NAXIS1: 4096
      NAXIS2: 4096
-    omits: [Velocity, NAXIS3, OBSLAYR1, OBSLAYR2, OBSLAYR3]
+    omits: [Velocity, Spacecraft Location & Environment, NAXIS3, OBSLAYR1, OBSLAYR2, OBSLAYR3]

  Velocity:
    overrides:
@@ -136,7 +137,7 @@ Products:
      OBSCODE: M

  PIM:
-   kinds: [ Polarized ]
+   kinds: [Polarized]
    overrides:
      TITLE: PUNCH Level-3 Intermediate F-corona Subtracted Polarized Mosaic
      OBSTYPE: Polarized mosaic
punchbowl/data/data/LevelL.yaml: 4 changes (1 addition, 3 deletions)

@@ -7,15 +7,13 @@ Level:
  overrides:
    DESCRPTN: PUNCH QuickLook data, Composite mosaic in output coordinates
  File Type and Provenance:
+   omits: [FILE_RAW]
    overrides:
      LEVEL: L
      OBSRVTRY: PUNCH
  Temporal Information:
  Instrument and Spacecraft State:
  World Coordinate System:
- Camera and Readout State:
- Onboard Image Processing:
- Calibration Data:
  Image Statistics and Properties:
  Solar Reference Data:
  Spacecraft Location & Environment:
punchbowl/data/data/LevelQ.yaml: 6 changes (2 additions, 4 deletions)

@@ -7,15 +7,13 @@ Level:
  overrides:
    DESCRPTN: PUNCH QuickPUNCH data, Composite mosaic in output coordinates
  File Type and Provenance:
+   omits: [FILE_RAW]
    overrides:
      LEVEL: Q
      OBSRVTRY: PUNCH
  Temporal Information:
  Instrument and Spacecraft State:
  World Coordinate System:
- Camera and Readout State:
- Onboard Image Processing:
- Calibration Data:
  Image Statistics and Properties:
  Solar Reference Data:
  Spacecraft Location & Environment:
@@ -53,7 +51,7 @@ Kinds:
      NAXIS: 2
      NAXIS1: 4096
      NAXIS2: 4096
-    omits: [NAXIS3, OBSLAYR1, OBSLAYR2, OBSLAYR3]
+    omits: [Spacecraft Location & Environment, NAXIS3, OBSLAYR1, OBSLAYR2, OBSLAYR3]


# Product specifications
punchbowl/data/data/omniheader.csv: 14 changes (7 additions, 7 deletions)

@@ -43,8 +43,8 @@ SECTION,TYPE,KEYWORD,VALUE,COMMENT,DATATYPE,NULLABLE,MUTABLE,DEFAULT
 4,keyword,OBSLAYR2,,Image Mode for second datacube layer,str,TRUE,TRUE,
 4,keyword,OBSLAYR3,,Image Mode for third datacube layer,str,TRUE,TRUE,
 4,keyword,POLAR,,[deg] Polarizer angle or fill value for clear,int,TRUE,TRUE,
-4,keyword,POLAROFF,,Offset in degrees of polarizer from POLAR,float,TRUE,TRUE,0.0
-4,keyword,POLARREF,,[deg] Polarizer reference frame,str,TRUE,TRUE,
+4,keyword,POLAROFF,,[deg] Offset in degrees of polarizer from POLAR,float,TRUE,TRUE,0.0
+4,keyword,POLARREF,,Polarizer reference frame (instrument or solar),str,TRUE,TRUE,
 4,keyword,INSTRUME,,Instrument name,str,TRUE,TRUE,
 4,keyword,TELESCOP,,Satellite name,str,TRUE,TRUE,
 4,keyword,OBSRVTRY,PUNCH,Observatory name,str,TRUE,TRUE,PUNCH
@@ -61,7 +61,7 @@ SECTION,TYPE,KEYWORD,VALUE,COMMENT,DATATYPE,NULLABLE,MUTABLE,DEFAULT
 6,keyword,TELAPSE,48.1296,[s] time between begin and end of exposure,float,TRUE,TRUE,48.1296
 6,keyword,REGION,1,region,int,TRUE,TRUE,1
 6,keyword,READOUT0,3,readout region if region=0,int,TRUE,TRUE,3
-6,keyword,CAMERA,FMCFMD,detector name (TBD),str,TRUE,TRUE,FMCFMD
+6,keyword,CAMERA,FMCFMD,detector name,str,TRUE,TRUE,FMCFMD
 6,keyword,PXBEG1,1,first read-out detector row,int,TRUE,TRUE,1
 6,keyword,PXEND1,2048,last read-out detector row,int,TRUE,TRUE,2048
 6,keyword,PXBEG2,1,first read-out detector column,int,TRUE,TRUE,1
@@ -74,7 +74,7 @@ SECTION,TYPE,KEYWORD,VALUE,COMMENT,DATATYPE,NULLABLE,MUTABLE,DEFAULT
 6,keyword,DSTART2,1,first row of image area on data array,int,TRUE,TRUE,1
 6,keyword,DSTOP2,2048,last row of image area on data array,int,TRUE,TRUE,2048
 6,keyword,IMGCTR,8370,image counter from IDPU,int,TRUE,TRUE,8370
-6,keyword,LEDSTATE,Off,state of LED (TBD),str,TRUE,TRUE,Off
+6,keyword,LEDSTATE,Off,state of LED,str,TRUE,TRUE,Off
 6,keyword,LEDDAC,0,last commanded setting of LED,int,TRUE,TRUE,0
 6,keyword,OFFSET,40,commanded offset value used in camera,int,TRUE,TRUE,40
 6,keyword,GAINCMD,12,commanded gain value,float,TRUE,TRUE,12
@@ -89,8 +89,8 @@ SECTION,TYPE,KEYWORD,VALUE,COMMENT,DATATYPE,NULLABLE,MUTABLE,DEFAULT
 8,keyword,CALCF,,quartic fit filename,str,TRUE,TRUE,
 8,keyword,CALVI,,vignetting filename,str,TRUE,TRUE,
 8,keyword,CALPSF,,PSF filename,str,TRUE,TRUE,
-8,keyword,CALPM,,bad pixel map filename (TBC),str,TRUE,TRUE,
-8,keyword,CALSL,,stray light model filename(TBC),str,TRUE,TRUE,
+8,keyword,CALPM,,bad pixel map filename,str,TRUE,TRUE,
+8,keyword,CALSL,,stray light model filename,str,TRUE,TRUE,
 9,section,COMMENT,Image Statistics and Properties,,str,TRUE,TRUE,Image Statistics and Properties
 9,keyword,BUNIT,Mean Solar Brightness,Units of observation,str,TRUE,TRUE,Mean Solar Brightness
 9,keyword,BSUN_DEF,2.0090000E7,[W/m2/sr] Mean Solar Brightness,float,TRUE,TRUE,2.0090000E7
@@ -150,7 +150,7 @@ SECTION,TYPE,KEYWORD,VALUE,COMMENT,DATATYPE,NULLABLE,MUTABLE,DEFAULT
 12,keyword,BANDWDTH,,Half-width of each radial band in solar radii,float,TRUE,TRUE,
 12,keyword,MAXRAD,,The maximum radius in degrees,int,TRUE,TRUE,
 12,keyword,AZMBINS,,Number of azimuthal bins in the polar remapped images,int,TRUE,TRUE,
-12,keyword,AZMBINF,,Binning factor for binning the polar remapped image over the azimuth,int,TRUE,TRUE,
+12,keyword,AZMBINF,,Azimuthal binning factor,int,TRUE,TRUE,
 12,keyword,PLTBINS,,Number of azimuthal bins in the output flow maps,int,TRUE,TRUE,
 12,keyword,YCENS,,Radial band centers in solar radii,str,TRUE,TRUE,
 12,keyword,RBANDS,,Indices of radial bands to visualize,str,TRUE,TRUE,
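The comment edits above (moving `[deg]` into POLAROFF's comment and out of POLARREF's) follow the FITS convention that a unit string in square brackets leads the keyword comment. A quick check with astropy, using an illustrative comment string:

```python
from astropy.io import fits

# Units belong in square brackets at the start of the comment string;
# a rendered card image is always exactly 80 characters.
card = fits.Card("POLAROFF", 0.0, "[deg] Offset of polarizer from POLAR")
print(str(card))
```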
punchbowl/data/io.py: 7 changes (6 additions, 1 deletion)

@@ -88,11 +88,14 @@ def write_ndcube_to_fits(cube: NDCube,
         header=full_header,
         name="Uncertainty array",
         quantize_level=uncertainty_quantize_level)
+    hdu_provenance = fits.BinTableHDU.from_columns(fits.ColDefs([fits.Column(
+        name="provenance", format="A40", array=np.char.array(cube.meta.provenance))]))

     hdul = cube.wcs.to_fits()
     hdul[0] = fits.PrimaryHDU()
     hdul.insert(1, hdu_data)
     hdul.insert(2, hdu_uncertainty)
+    hdul.insert(3, hdu_provenance)
     hdul.writeto(filename, overwrite=overwrite, checksum=True)
     hdul.close()
     if write_hash:
@@ -152,7 +155,7 @@ def _update_statistics(cube: NDCube) -> None:
     cube.meta["DATAMAX"] = float(np.nanmax(cube.data))


-def load_ndcube_from_fits(path: str | Path, key: str = " ") -> NDCube:
+def load_ndcube_from_fits(path: str | Path, key: str = " ", include_provenance: bool = True) -> NDCube:
     """Load an NDCube from a FITS file."""
     with fits.open(path) as hdul:
         primary_hdu = hdul[1]
@@ -162,6 +165,8 @@ def load_ndcube_from_fits(path: str | Path, key: str = " ") -> NDCube:
         header["CHECKSUM"] = ""
         header["DATASUM"] = ""
         meta = NormalizedMetadata.from_fits_header(header)
+        if include_provenance:
+            meta._provenance = hdul[3].data["provenance"]  # noqa: SLF001
         wcs = WCS(header, hdul, key=key)
         unit = u.ct
punchbowl/data/meta.py: 9 changes (9 additions, 0 deletions)

@@ -208,6 +208,7 @@ def __init__(
         self,
         contents: t.OrderedDict[str, t.OrderedDict[str, MetaField]],
         history: History | None = None,
+        provenance: list[str] | None = None,
         wcs_section_name: str = "World Coordinate System",
     ) -> None:
         """
@@ -219,12 +220,15 @@ def __init__(
             contents of the meta information
         history: History
             history contents for this meta field
+        provenance: list[str]
+            list of files used in the generation of this product
         wcs_section_name: str
             the section title for the WCS section to specially fill

         """
         self._contents = contents
         self._history = history if history is not None else History()
+        self._provenance = provenance if provenance is not None else []
         self._wcs_section_name = wcs_section_name

     def __iter__(self) -> t.Iterator[t.Any]:
@@ -616,6 +620,11 @@ def history(self) -> History:
     def history(self, history: History) -> None:
         self._history = history

+    @property
+    def provenance(self) -> list[str]:
+        """Returns file provenance."""
+        return self._provenance
+
     @staticmethod
     def _validate_key_is_str(key: str) -> None:
         """
punchbowl/data/tests/test_io.py: 2 changes (1 addition, 1 deletion)

@@ -262,7 +262,7 @@ def test_write_punchdata_with_distortion(tmpdir):
     write_ndcube_to_fits(obj, file_path, overwrite=True)

     with fits.open(file_path) as hdul:
-        assert len(hdul) == 5
+        assert len(hdul) == 6

     loaded_cube = load_ndcube_from_fits(file_path)
     assert loaded_cube.wcs.has_distortion
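The expected HDU count rises from 5 to 6 because the provenance table is now inserted at index 3. For a file whose WCS carries a distortion model (which `wcs.to_fits()` serializes as extra lookup-table HDUs), the layout can be sketched as follows; the extension names here are assumptions for illustration, not checked against punchbowl:

```python
# Hypothetical HDU layout after this change, for a cube with a distortion WCS.
expected_layout = [
    (0, "primary"),            # empty primary HDU
    (1, "data"),               # image data (inserted at index 1)
    (2, "uncertainty"),        # uncertainty array (inserted at index 2)
    (3, "provenance"),         # new one-column binary table (inserted at index 3)
    (4, "distortion axis 1"),  # WCS distortion lookup table
    (5, "distortion axis 2"),  # WCS distortion lookup table
]
assert len(expected_layout) == 6  # matches the updated assertion in the test
```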

Two additional changed files: large diffs are not rendered by default (filenames not captured).

punchbowl/level1/tests/data/test_quartic_coeffs.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_0.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_1.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_2.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_3.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_4.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_5.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_6.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_7.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_8.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)
punchbowl/level3/tests/data/test_9.fits: 4 changes (2 additions, 2 deletions; large diff not rendered)

punchbowl/level3/tests/test_celestial_intermediary.py: 18 changes (9 additions, 9 deletions)

@@ -13,7 +13,7 @@


 def test_to_celestial_frame_cutout():
-    data_cube = io.load_ndcube_from_fits(TEST_FILE)
+    data_cube = io.load_ndcube_from_fits(TEST_FILE, include_provenance=False)
     reprojected_cube = celestial_intermediary.to_celestial_frame_cutout(data_cube, cdelt=1)
     assert np.any(np.isfinite(reprojected_cube.data))
     assert np.any(np.isfinite(reprojected_cube.uncertainty.array))
@@ -57,8 +57,8 @@ def test_shift_image_onto(tmp_path, shift1, shift2, is_overlap):
     hdu.data = hdu.data[0]
     hdul.writeto(file2)

-    cube1 = io.load_ndcube_from_fits(file1, key='A')
-    cube2 = io.load_ndcube_from_fits(file2, key='A')
+    cube1 = io.load_ndcube_from_fits(file1, key='A', include_provenance=False)
+    cube2 = io.load_ndcube_from_fits(file2, key='A', include_provenance=False)
     reproj_data_1 = celestial_intermediary.to_celestial_frame_cutout(cube1, cdelt=.6)
     reproj_data_2 = celestial_intermediary.to_celestial_frame_cutout(cube2, cdelt=.6)
@@ -117,8 +117,8 @@ def test_shift_image_onto_3d_cube(tmp_path):
     hdu.header['CRVAL2A'] += 5
     hdul.writeto(file2)

-    cube1 = io.load_ndcube_from_fits(file1, key='A')
-    cube2 = io.load_ndcube_from_fits(file2, key='A')
+    cube1 = io.load_ndcube_from_fits(file1, key='A', include_provenance=False)
+    cube2 = io.load_ndcube_from_fits(file2, key='A', include_provenance=False)
     reproj_data_1 = celestial_intermediary.to_celestial_frame_cutout(cube1, cdelt=.6)
     reproj_data_2 = celestial_intermediary.to_celestial_frame_cutout(cube2, cdelt=.6)
@@ -176,8 +176,8 @@ def test_shift_image_onto_fill_value(tmp_path):
     hdu.header['CRVAL2A'] += 10
     hdul.writeto(file2)

-    cube1 = io.load_ndcube_from_fits(file1, key='A')
-    cube2 = io.load_ndcube_from_fits(file2, key='A')
+    cube1 = io.load_ndcube_from_fits(file1, key='A', include_provenance=False)
+    cube2 = io.load_ndcube_from_fits(file2, key='A', include_provenance=False)
     reproj_data_1 = celestial_intermediary.to_celestial_frame_cutout(cube1, cdelt=.6)
     reproj_data_2 = celestial_intermediary.to_celestial_frame_cutout(cube2, cdelt=.6)
@@ -204,9 +204,9 @@

 def test_shift_image_onto_different_cdelts():
     reproj_data_1 = celestial_intermediary.to_celestial_frame_cutout(
-        io.load_ndcube_from_fits(TEST_FILE, key='A'), cdelt=1.2)
+        io.load_ndcube_from_fits(TEST_FILE, key='A', include_provenance=False), cdelt=1.2)
     reproj_data_2 = celestial_intermediary.to_celestial_frame_cutout(
-        io.load_ndcube_from_fits(TEST_FILE, key='A'), cdelt=1)
+        io.load_ndcube_from_fits(TEST_FILE, key='A', include_provenance=False), cdelt=1)

     with pytest.raises(ValueError, match=".*WCSes must have identical CDELTs.*"):
         celestial_intermediary.shift_image_onto(reproj_data_1, reproj_data_2)
scripts/generate_templates.py: new file (80 additions, 0 deletions)

@@ -0,0 +1,80 @@
import os
from datetime import datetime

import numpy as np
from astropy.nddata import StdDevUncertainty
from astropy.wcs import WCS
from astropy.wcs.utils import add_stokes_axis_to_wcs
from ndcube import NDCube

from punchbowl.data import NormalizedMetadata, get_base_file_name, write_ndcube_to_fits
from punchbowl.data.meta import load_level_spec

LEVELS = ["0", "1", "2", "3", "L", "Q"]


def sample_ndcube(shape, code="PM1", level="0"):
    data = np.zeros(shape).astype(np.float32)
    uncertainty = StdDevUncertainty(np.sqrt(np.abs(data)))
    wcs = WCS(naxis=2)
    wcs.wcs.ctype = "HPLN-ARC", "HPLT-ARC"
    wcs.wcs.cunit = "deg", "deg"
    wcs.wcs.cdelt = 0.1, 0.1
    wcs.wcs.crpix = 0, 0
    wcs.wcs.crval = 1, 1

    if level in ["2", "3"] and code[0] == "P":
        wcs = add_stokes_axis_to_wcs(wcs, 2)

    meta = NormalizedMetadata.load_template(code, level)
    meta['DATE-OBS'] = str(datetime(2024, 1, 1, 0, 0, 0))
    meta['DATE-BEG'] = str(datetime(2024, 1, 1, 0, 0, 0))
    meta['DATE-END'] = str(datetime(2024, 1, 1, 0, 0, 0))
    meta['DATE-AVG'] = str(datetime(2024, 1, 1, 0, 0, 0))
    meta['FILEVRSN'] = "1"
    meta['POLARREF'] = "Instrument"
    meta['POLAROFF'] = 0.0
    return NDCube(data=data, uncertainty=uncertainty, wcs=wcs, meta=meta)


def construct_all_product_headers(directory, level, outpath):
    date_obs = datetime.now()
    level_path = os.path.join(directory, f"Level{level}.yaml")
    level_spec = load_level_spec(level_path)
    product_keys = list(level_spec["Products"].keys())
    # crafts = load_spacecraft_def().keys()
    if level in ["0", "1", "2"]:
        crafts = {'1': '', '2': '', '3': '', '4': ''}.keys()
        shape = (2048, 2048)
    if level in ["2", "3", "Q", "L"]:
        crafts = {'M': '', 'N': ''}.keys()
        shape = (4096, 4096)
    if level in ["Q", "L"]:
        crafts = {'M': '', 'N': ''}.keys()
        shape = (1024, 1024)
    product_keys = sorted(list(set([pc.replace("?", craft) for craft in crafts for pc in product_keys])))
    for pc in product_keys:
        try:
            meta = NormalizedMetadata.load_template(pc, level)
        except Exception as e:
            assert False, f"failed to create {pc} for level {level} because: {e}"
        meta['DATE-OBS'] = str(datetime.now())

        sample_data = sample_ndcube(shape=shape, code=pc, level=level)

        filename = outpath + get_base_file_name(sample_data) + '.fits'

        print('Finished writing ' + filename)

        write_ndcube_to_fits(sample_data, filename=filename, write_hash=False)


if __name__ == "__main__":

    path_yaml = '/Users/clowder/work/punch/punchbowl/punchbowl/data/data/'
    path_output = '/Users/clowder/data/punch/metadata/'

    for level in LEVELS:
        construct_all_product_headers(path_yaml, level, path_output)

    print("Job's finished.")