This issue was moved to a discussion.

Problem with edge effects? #17

Closed
keflavich opened this issue Nov 7, 2023 · 14 comments

Comments

@keflavich
Contributor

We have some images with awkward edges, like this:
[image: data with an awkward edge]

that end up rejecting all of the found stars in the image as "too faint" at this stage:

if verbose:
    print('Extension %s, iteration %2d, found %6d sources; %4d close and '
          '%4d faint sources removed.' %
          (ccd, titer+1, len(xn),
           numpy.sum(~isolatedenough),
           numpy.sum(~brightenough & isolatedenough)))

There doesn't appear to be anything obviously wrong with the weights (they're zero where the data are zero outside the image footprint). However, I suspect a problem with the sky modeling. Zooming in on part of the sky model, we have a corner-like artifact:
[image: zoom on the sky model showing a corner-like artifact]

Any idea how to suppress this?

@keflavich
Contributor Author

The sky model actually looks fine when I load it up (saved it as FITS), so that's not the issue.
[image: sky model saved as FITS]

@schlafly
Owner

schlafly commented Nov 7, 2023

No, I haven't seen that before. The sky image feels pretty simple and robust. You can try calling it separately to see if anything looks weird there. brightenough is just supposed to be >3 sigma; is it finding sources and then removing them, or is it failing to find them at all? If the latter, I worry about the weight image not being right (implying that there is much more noise than there actually is, so nothing gets found), or the PSF. If the former, I don't have any immediate guesses unfortunately.

@keflavich
Contributor Author

It's finding sources and then flagging them out: "210211 faint sources removed".

@schlafly
Owner

schlafly commented Nov 7, 2023

Mmm, that's weird. The peak finding and image fitting do use slightly different sky images with different filtering scales. I can't immediately see why that would cause issues, but in principle if the one for the fit were way too large then all of the sources would appear to be faint and get thrown out. The source detection also uses a single PSF stamp while the fitting uses the appropriate stamp for each location, so if the one stamp that source detection used looked right and somehow all of the other stamps evaluated elsewhere had weird garbage in them, that would cause problems.

I'd probably put a pdb.set_trace() around here
https://github.com/schlafly/crowdsource/blob/master/crowdsource/crowdsource_base.py#L909
and plot some of the stamps.
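For anyone following along, here is the kind of quick stamp sanity check one might run at that breakpoint (a sketch only; `check_stamp` and the synthetic Gaussian are illustrative, not crowdsource API):

```python
import numpy as np

def check_stamp(stamp):
    """Sanity-check a single PSF stamp: finite values, peak near the
    center, and a total flux of order unity (illustrative criteria)."""
    assert np.all(np.isfinite(stamp)), "stamp contains NaN/inf"
    peak = np.unravel_index(np.argmax(stamp), stamp.shape)
    center = ((stamp.shape[0] - 1) / 2, (stamp.shape[1] - 1) / 2)
    off = np.hypot(peak[0] - center[0], peak[1] - center[1])
    return {"sum": float(stamp.sum()), "peak_offset_pix": float(off)}

# Example: a synthetic centered Gaussian stamp that should pass
sz = 39
yy, xx = np.indices((sz, sz)) - (sz - 1) / 2
gauss = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))
gauss /= gauss.sum()          # unit total flux
info = check_stamp(gauss)     # sum ~1, peak offset 0 pixels
```

A stamp with garbage in it (NaNs, a far-off-center peak, or a wildly wrong sum) would fail checks like these immediately.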

@keflavich
Contributor Author

Thanks, that gives some threads to follow. @andrew-saydjari seems to be tracking these down now too on a different data set of mine. @SpacialTree is contending with the problem on these data.

@SpacialTree

Here's what our weights look like:
[image: weight map (fits-image-2023-11-07-16-21-16)]

I don't see anything wrong popping out.

@schlafly
Owner

schlafly commented Nov 7, 2023

Since you find the sources in the initial pass, I no longer think it's the weight map. The concept would have been that if you had the weight map units wrong or something, the code would think that no sources were significant, since the weights were saying that the image was much noisier than it was.

@andrew-saydjari
Collaborator

Also, re: the weight map: if you look at the source significances from im*weight, the peak heights are exactly what you would expect. My money is on the PSF model that was hacked in to import the JWST PSF, and that is where I am spending my debugging time on this issue at the moment.
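As a concrete version of that check: assuming the weight map is an inverse-variance map, the per-pixel significance is im*sqrt(weight), and an injected peak should come out at the expected S/N (all variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                                   # per-pixel noise level
im = rng.normal(0, sigma, (100, 100))
im[50, 50] += 50.0                            # inject a bright "source" peak
weight = np.full_like(im, 1.0 / sigma**2)     # inverse-variance weights

snr = im * np.sqrt(weight)                    # per-pixel significance
peak_snr = snr[50, 50]                        # expect ~25 for a 50-count peak
```

If the weights were off by a large factor (wrong units, say), `peak_snr` would come out far from the expected value, and the code would treat real sources as insignificant.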

@andrew-saydjari
Collaborator

When I have things cleaned up, I will actually push to PR #16, but all of the problems seem to have come from that. A quick and dirty rewrite to (1) get the order of the nstamp axis versus spatial axes right and (2) avoid some absolutely insane broadcasting rules in photutils seems to have resolved the issue. Here is a quick dump of that code. @keflavich seems to think that (2) was introduced by some update in photutils. Thus, I will record that I am using version 1.9.0 of photutils.

import numpy as np

import crowdsource.psf


class WrappedPSFModel(crowdsource.psf.SimplePSF):
    """Wrap a photutils GriddedPSFModel so crowdsource can evaluate it."""

    def __init__(self, psfgridmodel):
        self.psfgridmodel = psfgridmodel
        self.default_stampsz = 39

    def __call__(self, col, row, stampsz=None, deriv=False):

        if stampsz is None:
            stampsz = self.default_stampsz

        parshape = np.broadcast(col, row).shape
        tparshape = parshape if len(parshape) > 0 else (1,)

        # numpy uses row, column notation
        rows, cols = np.indices((stampsz, stampsz)) - (np.array([stampsz, stampsz])-1)[:, None, None] / 2.
        # explicitly broadcast
        col = np.atleast_1d(col)
        row = np.atleast_1d(row)
        rows = rows[:, :, None] + row[None, None, :]
        cols = cols[:, :, None] + col[None, None, :]

        # photutils seems to use column, row notation
        # it returns something in (nstamps, row, col) shape
        stamps = []
        for i in range(len(col)):
            stamps.append(self.psfgridmodel.evaluate(cols[:,:,i], rows[:,:,i], 1, col[i], row[i]))
        stampsS = np.stack(stamps,axis=0)
        stamps = np.transpose(stampsS,axes=(0,2,1))

        if deriv:
            dpsfdrow, dpsfdcol = np.gradient(stamps, axis=(1,2))
            dpsfdrow = dpsfdrow.T
            dpsfdcol = dpsfdcol.T

        ret = stamps
        if parshape != tparshape:
            ret = ret.reshape(stampsz, stampsz)
            if deriv:
                dpsfdrow = dpsfdrow.reshape(stampsz, stampsz)
                dpsfdcol = dpsfdcol.reshape(stampsz, stampsz)
        if deriv:
            ret = (ret, dpsfdcol, dpsfdrow)

        return ret

    def render_model(self, col, row, stampsz=None):
        return None
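A minimal smoke test of the wrapper's stamp-building convention, with a stub standing in for both photutils' GriddedPSFModel and the crowdsource base class (everything here is illustrative, not the real APIs):

```python
import numpy as np

class StubGrid:
    """Stands in for photutils.psf.GriddedPSFModel: evaluate(x, y, flux, x_0, y_0)."""
    def evaluate(self, x, y, flux, x_0, y_0):
        # Circular Gaussian centered on (x_0, y_0), width 2 pixels
        return flux * np.exp(-((x - x_0)**2 + (y - y_0)**2) / (2 * 2.0**2))

def stamps_for(psfgridmodel, col, row, stampsz=39):
    """Reproduce the stamp loop from WrappedPSFModel.__call__ above."""
    col = np.atleast_1d(col)
    row = np.atleast_1d(row)
    rows, cols = np.indices((stampsz, stampsz)) - (stampsz - 1) / 2.
    rows = rows[:, :, None] + row[None, None, :]
    cols = cols[:, :, None] + col[None, None, :]
    stamps = [psfgridmodel.evaluate(cols[:, :, i], rows[:, :, i], 1,
                                    col[i], row[i])
              for i in range(len(col))]
    return np.stack(stamps, axis=0)

stamps = stamps_for(StubGrid(), np.array([100., 200.]), np.array([50., 60.]))
# Two 39x39 stamps, each peaking at the central pixel
```

A healthy wrapper should produce one stamp per source, each peaking at the stamp center regardless of where the source sits in the frame; a swapped axis order shows up here as an off-center or transposed peak.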

The following is a demonstration of my success using crowdsource with this PSF model for a JWST image. Obviously, we could do better with some improvements to the nodeblend maskbit etc. But I really just want a source-based mask for CloudClean, so I will move on to that for now.
[image: crowdsource detections overlaid on the JWST image]

@keflavich
Contributor Author

OK, just reading this now; it looks like all the change is in:

        # photutils seems to use column, row notation
        # it returns something in (nstamps, row, col) shape
        stamps = []
        for i in range(len(col)):
            stamps.append(self.psfgridmodel.evaluate(cols[:,:,i], rows[:,:,i], 1, col[i], row[i]))
        stampsS = np.stack(stamps,axis=0)
        stamps = np.transpose(stampsS,axes=(0,2,1))

This could be made slightly more efficient, but I want to understand if something broke under me or if I somehow got lucky previously.

@andrew-saydjari your images look good, and that's a nice-looking sky model (except for the saturated stars, no surprise there). What parameters did you use in fit_im?

@andrew-saydjari
Collaborator

Thanks. I played around a lot this afternoon with noblend/sharp_maskbits, but ended up just accepting what is essentially just the default parameters. This is adding a few spurious sources to every real source, but I think that is because of the difference between the true and coadded PSF that you noted before, and not a crowdsource setting problem. It might be worth running that gaussian blur test again with this model. For now, I am generating my CloudCovErr mask by running like:

out = fit_im(im_clean, psfmodel_py, weight=wt, dq=d_im,
             ntilex=1, ntiley=1, refit_psf=false, psfderiv=true, fewstars=100,
             maxstars=320000, verbose=true, threshold=5,
             blendthreshu=0.2, psfvalsharpcutfac=0.7, psfsharpsat=0.7,
             miniter=2, maxiter=10);

P.S. I ended up commenting out the last transpose for my work, but there is a transpose between Julia and Python, so you should confirm that for yourself (and I can hack the transpose in more on the Julia side).

P.P.S. I tried the usual list-comprehension Python trick and got no speedup. One could try preallocation or other usual performance tricks, because these fits are taking a long time. I don't have a good handle on whether it is much longer than usual for this crowded a field, but my feeling is yes.
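For what it's worth, one preallocation variant of the stamp loop looks like this (a sketch; the stub `evaluate` stands in for `psfgridmodel.evaluate`, and the savings are modest since only the final `np.stack` copy is avoided):

```python
import numpy as np

def make_stamps_prealloc(evaluate, cols, rows, col, row, stampsz):
    """Fill a preallocated output cube instead of appending to a list
    and calling np.stack, saving one full copy of the stamp cube."""
    n = len(col)
    out = np.empty((n, stampsz, stampsz))
    for i in range(n):
        out[i] = evaluate(cols[:, :, i], rows[:, :, i], 1, col[i], row[i])
    return out

# Illustrative check with a stub evaluate that returns a constant stamp
stampsz, n = 5, 3
cols = np.zeros((stampsz, stampsz, n))
rows = np.zeros((stampsz, stampsz, n))
stub = lambda x, y, flux, x0, y0: np.full((stampsz, stampsz), flux, float)
cube = make_stamps_prealloc(stub, cols, rows, np.arange(n), np.arange(n), stampsz)
```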

@schlafly
Owner

Yeah, I don't think nodeblend / sharp will help you much if the PSF is just wrong. I agree that if this is some kind of ~drizzled image, I would start by convolving the PSF with an appropriate ~top-hat kernel, and then probably adjust the blend threshold
https://github.com/schlafly/crowdsource/blob/master/crowdsource/crowdsource_base.py#L121-L123
to avoid detecting too much shattered-PSF garbage. If you're getting PSF deblending issues on the first iteration, you could also push the maxfilter shape higher to avoid detecting sources too near one another in each iteration.
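For context on the maxfilter point, footprint-based peak suppression can be sketched with scipy (this is an illustration of the idea, not the crowdsource implementation):

```python
import numpy as np
from scipy.ndimage import maximum_filter

def local_peaks(img, size=3, threshold=0.0):
    """Pixels that are the maximum within a (size x size) footprint and
    above threshold; a larger `size` suppresses detections that sit too
    close to a brighter neighbor."""
    return (img == maximum_filter(img, size=size)) & (img > threshold)

img = np.zeros((20, 20))
img[5, 5] = 10.0
img[5, 7] = 8.0   # 2 pixels away: survives size=3, suppressed at size=5
small = local_peaks(img, size=3, threshold=1.0)
big = local_peaks(img, size=5, threshold=1.0)
```

With the larger footprint the fainter neighbor falls inside the brighter source's maximum-filter window and is no longer flagged as a peak.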

But, sorry, summarizing: we don't think there is any particular problem with edge effects, but the PSF implementation for Webb was doing something wonky and you've fixed that?

@keflavich
Contributor Author

Yeah, I think that's what @andrew-saydjari concluded. I haven't gotten to run any tests yet. But we can... relabel this issue, I suppose?

@schlafly
Owner

I may try to convert it to a discussion.

Repository owner locked and limited conversation to collaborators Nov 10, 2023
@schlafly schlafly converted this issue into discussion #19 Nov 10, 2023
