\section{Experience from Precursor Surveys (Melissa) } \label{sec:precursor}
\MLG[inline]{"Experience from Precursor Surveys" section draft complete. Contact the following to verify text and invite as co-authors: Scott Barthelmy ([email protected]), Eric Christensen ([email protected]) and Matthew Graham ([email protected]).}
%Ensure this section answers these questions:
%- How are we planning to scale up from precursor surveys? e.g data rates, number of targets, scaling up of follow-up observations.
% - What are the challenges of scaling up to LSST rates?
% - Data Challenges?
This section summarizes the lessons that can be learned from three ongoing time-domain surveys and/or event-distribution networks: the Gamma-ray Coordinates Network, the Catalina Sky Survey, and the Zwicky Transient Facility.
The contents of this section are derived from invited presentations given by representatives\footnote{Thanks to Scott Barthelmy, Eric Christensen, and Matthew Graham for their talks at the workshop.} of each survey.
\subsection{Gamma-ray Coordinates Network ( {GCN})}
% MLG sourced GCN material from:
% LSST CBW Participants Drive > Presentations - Wednesday Afternoon > GCN_LSST_CBW_June2019_v8-2.pdf (public)
% LSSTBrokerWorkshop19 > SOC's Notes (private)
The {GCN} evolved out of the {BATSE}\footnote{Burst and Transient {Source} Experiment; a part of {NASA}'s Compton Gamma-Ray Observatory.} Coordinates Distribution Network ( {BACODINE}; 1993--1997), and is a complex system that has grown over time.
The {GCN}'s self-described goal is to {\it ``Collect all {transient} information from all sources and distribute it in real-time to all who want it''}, and it sits at the intersection of automated (real-time) and human-in-the-loop analysis.
Compared to the stream of {LSST} alerts -- which will all have similar contents, be distributed within 60 seconds, and be world-public -- the {GCN} supports a wide diversity of message types, contents, origins, filtering, and user-access options.
There are currently two categories of message types: notices (seconds to hours) and circulars (minutes to days)\footnote{A third category of reports, which had a timescale of days, has been discontinued.}.
Message contents can include positions, light-curves, images, spectra, temporal data, and/or telescope {metadata} (pointing, thresholds).
Notices can be automatically generated by observatories, and circulars and reports can be submitted by a human (e.g., via socket connections, email).
Messages are automatically extracted, processed, and verified before circulation.
Filtering options for registered subscribers include brightness, significance, time since discovery, sky location, or source of the notice (e.g., gamma-ray burst, gravitational wave event, and $>$100 other types).
Access options include email, socket connections (including the {IVOA} VOEvent protocol), an online archive, etc.
The {GCN} also supports the restriction of circulation to private networks (e.g., the {LIGO}-VIRGO Collaboration's gravitational wave event notices in observing runs 1 and 2), and daily omnibus summaries.
The {GCN} intends to receive and process at least a small subset of {LSST} alerts that are relevant to its user base.
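As an illustration, the subscriber-side filtering described above (notice type, brightness, sky location) could be sketched as follows; the field names, notice structure, and thresholds here are hypothetical and do not reflect the actual {GCN} formats:

```python
# Hypothetical sketch of GCN-style subscriber filtering; the notice
# fields and preference schema are illustrative, not the real formats.

def passes_filter(notice, prefs):
    """Return True if a notice matches a subscriber's preferences."""
    if notice["type"] not in prefs["types"]:
        return False
    if notice["magnitude"] > prefs["max_magnitude"]:  # fainter = larger mag
        return False
    # Crude sky-location cut (flat-sky approximation, degrees).
    d_ra = notice["ra"] - prefs["center_ra"]
    d_dec = notice["dec"] - prefs["center_dec"]
    if (d_ra**2 + d_dec**2) ** 0.5 > prefs["radius_deg"]:
        return False
    return True

prefs = {"types": {"GRB", "GW"}, "max_magnitude": 19.0,
         "center_ra": 150.0, "center_dec": 2.0, "radius_deg": 10.0}
notice = {"type": "GRB", "magnitude": 17.5, "ra": 151.0, "dec": 1.0}
print(passes_filter(notice, prefs))  # True for this example
```

A production system would of course use proper spherical separations and the $>$100 real notice types, but the per-subscriber predicate structure is the essential idea.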
{\bf Lessons learned from the {GCN} include:}
considerable {monitoring} infrastructure is required to ensure {GCN}'s maximum system live-time ($>$99\%);
a parallel, stand-alone environment is necessary for testing new code and running stress-tests to assess system load and performance;
content purity is ensured by automatic file-format checks every time files are edited;
and, of course, the importance of backing up the data, code, scripts, config files, and documentation.
% One takeaway seems to be that automation is never done!
\subsection{Catalina Sky Survey}
% MLG sourced Catalina material from:
% LSST CBW Participants Drive > Presentations - Wednesday Afternoon > Christensen_Eric_pdf2.pdf
% LSSTBrokerWorkshop19 > SOC's Notes (private)
This survey is optimized for the discovery and follow-up of Near-Earth Asteroids (NEOs), has been funded by {NASA} since 1998, and is immensely successful: Catalina is credited with the discovery of almost half of all known NEOs.
This success can be attributed -- at least in part -- to this singular focus: all aspects of the survey are optimized for the scientific goal of {NEO} discovery, and there are no competing interests.
This clear and resolute aspect garnered strong support from the wider {NEO} follow-up community, which has compounded Catalina's success.
Out of necessity, the {LSST} cadence must accommodate many science goals, but Catalina serves as a good example for brokers intending to focus on a singular {LSST} science area or limited set of science goals.
The Minor Planet {Center} ( {MPC}) plays a key role by ingesting {NEO} candidate reports and making them available to the community.
The {MPC} has strict regulations on event reporting which lead to a $>$99\% purity rate of its database, but is in need of some modernization for the {LSST} era, such as reducing processing latencies and improving database access.
These updates will enable the time-sensitive astrometric follow-up that is required to solve for asteroid orbital parameters.
For example, reducing the delay between initial {NEO} detection and the creation of an ``actionable orbit'' (from which follow-up observations can be planned) will make follow-up more scientifically productive, especially for faster-moving NEOs.
Towards this end, efforts are underway to develop NEOfixer\footnote{See, e.g., \url{https://www.noao.edu/meetings/lsst-tds/presentations/Seaman_NEOfixer.pdf}}, an {NEO} follow-up broker that will subscribe directly to {NEO} candidates from the {MPC}, not the {LSST} alert stream ( {LSST} data will be ingested by the {MPC} daily).
The goal of NEOfixer is to strategically improve the quality of the {NEO} catalog by optimizing worldwide\footnote{NEOfixer's users include professional and amateur astronomers.} {NEO} follow-up, scaled to the demands of future wide-area surveys like {LSST}.
NEOfixer plans to offer subscribers an event prioritization that is customizable to a given observatory's location, aperture, and instrumentation (via a web interface and a direct socket), and the ability for users to report back their follow-up efforts in real time.
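A toy sketch of what such per-observatory prioritization could look like follows; the selection cuts and ranking heuristic are invented for illustration and are not NEOfixer's actual algorithm:

```python
# Toy per-observatory NEO target prioritization, loosely inspired by
# the NEOfixer concept; all weights and cuts here are invented.

def prioritize(targets, limiting_mag, min_altitude_deg):
    """Rank NEO candidates for a single observatory.

    Keeps only targets bright enough for the site's aperture and
    currently above a minimum altitude, then ranks by urgency using a
    hypothetical 'arc length' field: shorter observed arcs mean less
    constrained orbits, so they are listed first.
    """
    observable = [t for t in targets
                  if t["mag"] <= limiting_mag
                  and t["altitude_deg"] >= min_altitude_deg]
    return sorted(observable, key=lambda t: t["arc_days"])

targets = [
    {"name": "2019 AB", "mag": 20.1, "altitude_deg": 55.0, "arc_days": 0.5},
    {"name": "2019 CD", "mag": 22.4, "altitude_deg": 40.0, "arc_days": 3.0},
    {"name": "2019 EF", "mag": 19.0, "altitude_deg": 10.0, "arc_days": 1.0},
]
for t in prioritize(targets, limiting_mag=21.0, min_altitude_deg=20.0):
    print(t["name"])
```

Because the cuts depend only on per-site parameters, the same candidate list yields a different ranked queue at every observatory, which is the core of the customization NEOfixer describes.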
{\bf Lessons learned from the Catalina Sky Survey include:}
adopting a singular science focus can both facilitate the decision-making process for, and amplify the success rate of, a time-domain survey processing system;
interfacing with trusted community infrastructure enables community follow-up, which adds value to a data set;
providing customizable event prioritization for community follow-up helps ensure that resources are well matched to the needed observations.
% The NEO community is currently finding about 2000 new NEOs per year, and about 10-100 times more new main belt asteroids per year. Currently, about 2-3 candidate NEOs are discovered per real NEO.
\subsection{Zwicky Transient Facility ( {ZTF})}
% MLG sourced ZTF material from:
% LSST CBW Participants Drive > Presentations - Wednesday Afternoon > ZTF LSST Brokers.pdf
% LSSTBrokerWorkshop19 > SOC's Notes (private)
In operations since 2018, the {ZTF} covers the northern sky ( {declination} $> -30$ degrees) in filters {\it gri} to a 5$\sigma$ depth of $r$=20.5 mag with a variety of cadences, including up to six visits a night for some areas at some times \citep{2019PASP..131f8003B,2019PASP..131g8001G}.
Alert packets for difference-image sources are released in real time, have a format\footnote{Apache Avro; see \S~\ref{sec:archandtech}.} similar to that of the planned {LSST} alerts, and are currently being consumed by at least $100$ unique sites via community brokers.
The primary follow-up system for the {ZTF} project is the {SED} Machine ( {SEDM}; \citealt{2018PASP..130c5003B}), a low-resolution Integral Field Unit spectrograph with a fully automated reduction {pipeline}.
The {ZTF} had publicly classified $1000$ supernovae as of mid-July 2019.
{\bf Lessons learned from {ZTF} include:}
API access appears to be the most desirable mode of broker connection, as it allows users to design their own interfaces;
as real-bogus scores evolve with new training, users will request that past events be re-classified;
the color information from {\it gri} only is insufficient for some science;
and finally, long alert IDs, which are necessary due to the large number of detections but are not human-friendly (e.g., ZTF19abctlvw), will be generally derided\footnote{One suggestion was to use three 3-letter words in a row, e.g., ``catpawtoy'', as that would provide $\sim$50 million memorable combinations.}.
Since the {ZTF} is still ongoing, more lessons will surely follow.
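The re-classification lesson above -- that evolving real-bogus scores will prompt requests to re-classify past events -- can be sketched as a batch re-scoring pass over archived alerts; the alert fields and the 0.5 decision threshold below are hypothetical, not the actual {ZTF} pipeline:

```python
# Sketch of re-scoring archived alerts after a real-bogus model is
# retrained; alert fields and the 0.5 threshold are hypothetical.

def rescore_archive(alerts, model):
    """Re-apply a (new) real-bogus model to stored alerts in place,
    returning the ids whose real/bogus classification flipped."""
    flipped = []
    for alert in alerts:
        new_score = model(alert["features"])
        was_real = alert["rb_score"] >= 0.5
        is_real = new_score >= 0.5
        alert["rb_score"] = new_score
        if was_real != is_real:
            flipped.append(alert["id"])
    return flipped

archive = [
    {"id": "ZTF19aaaaaaa", "rb_score": 0.4, "features": [0.9]},
    {"id": "ZTF19bbbbbbb", "rb_score": 0.8, "features": [0.2]},
]

def new_model(features):
    """Stand-in for a retrained classifier."""
    return features[0]

print(rescore_archive(archive, new_model))
# prints ['ZTF19aaaaaaa', 'ZTF19bbbbbbb']
```

The operational implication for brokers is that archived alerts cannot be treated as immutable: score fields need versioning, and downstream users need a way to learn which past classifications changed.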