Built site for gh-pages
Quarto GHA Workflow Runner committed Jan 12, 2024
1 parent 29e450c commit b058a7d
Showing 39 changed files with 38,293 additions and 12 deletions.
2 changes: 1 addition & 1 deletion .nojekyll
@@ -1 +1 @@
d75ed5ce
9ff87391
Binary file modified TBEP-CC.docx
Binary file not shown.
Binary file modified TBEP-CC.pdf
Binary file not shown.
561 changes: 553 additions & 8 deletions data.html

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions index.html
@@ -8,7 +8,7 @@

<meta name="author" content="Benjamin D. Best">
<meta name="author" content="Marcus W. Beck">
<meta name="dcterms.date" content="2024-01-11">
<meta name="dcterms.date" content="2024-01-12">

<title>TBEP-CC</title>
<style>
@@ -256,7 +256,7 @@ <h2 id="toc-title">Table of contents</h2>
<div>
<div class="quarto-title-meta-heading">Published</div>
<div class="quarto-title-meta-contents">
<p class="date">2024-01-11</p>
<p class="date">2024-01-12</p>
</div>
</div>

…
30 changes: 29 additions & 1 deletion search.json
@@ -18,7 +18,35 @@
"href": "data.html#task-1.-assessment-of-available-data-and-coverage",
"title": "2  Data",
"section": "2.1 Task 1. Assessment of available data and coverage",
"text": "2.1 Task 1. Assessment of available data and coverage\nData descriptive of the risks of climate change can be obtained from several sources. These may include weather or climatological data, long-term tidal gauge data, or in situ water measurements responsive to climate change. Weather and climatological data could be obtained from local weather stations with long-term data, e.g., Tampa International Airport, and could include measures of air temperature, precipitation, and/or storm intensity/frequency. Tidal gauge data are readily available from the NOAA PORTS data retrieval system. Lastly, in situ water measurements could include water temperature, changes in flow hydrology, salinity, and/or pH. Data used to evaluate potential risks related to ocean acidification should also be explored.\nThe permanency and ease of access of each data source should be noted when making recommendations on indicators to operationalize. Further, indicators that communicate the risks associated with climate change are preferred, as opposed to those that simply indicate change. An example is the number of days in a year when temperature exceeds a critical threshold, as compared to temperature alone. An additional example is frequency of sunny day flooding events, as compared to tidal gauge measurements alone."
"text": "2.1 Task 1. Assessment of available data and coverage\nData descriptive of the risks of climate change can be obtained from several sources. These may include weather or climatological data, long-term tidal gauge data, or in situ water measurements responsive to climate change. Weather and climatological data could be obtained from local weather stations with long-term data, e.g., Tampa International Airport, and could include measures of air temperature, precipitation, and/or storm intensity/frequency. Tidal gauge data are readily available from the NOAA PORTS data retrieval system. Lastly, in situ water measurements could include water temperature, changes in flow hydrology, salinity, and/or pH. Data used to evaluate potential risks related to ocean acidification should also be explored.\nThe permanency and ease of access of each data source should be noted when making recommendations on indicators to operationalize. Further, indicators that communicate the risks associated with climate change are preferred, as opposed to those that simply indicate change. An example is the number of days in a year when temperature exceeds a critical threshold, as compared to temperature alone. An additional example is frequency of sunny day flooding events, as compared to tidal gauge measurements alone.\n\n\nCode\nif (!\"librarian\" %in% rownames(installed.packages()))\n install.packages(\"librarian\")\nlibrarian::shelf(\n dplyr, dygraphs, glue, here, leaflet, lubridate, sf,\n tbep-tech/tbeptools, \n RColorBrewer, readr, rnoaa, terra, tidyr, webshot2,\n quiet = T)\n\n# explicitly list packages for renv::dependencies(); renv::snapshot()\nlibrary(dplyr)\nlibrary(dygraphs)\nlibrary(glue)\nlibrary(here)\nlibrary(leaflet)\nlibrary(librarian)\nlibrary(lubridate)\nlibrary(RColorBrewer)\nlibrary(readr)\nlibrary(rnoaa)\nlibrary(sf)\nlibrary(tbeptools)\nlibrary(terra)\nlibrary(tidyr)\nlibrary(webshot2)\n\noptions(readr.show_col_types = F)"
},
{
"objectID": "data.html#temperature",
"href": "data.html#temperature",
"title": "2  Data",
"section": "2.2 Temperature",
"text": "2.2 Temperature\n\n2.2.1 Observed\nThe rnoaa R package uses NOAA NCDC API v2, which only goes to 2022-09-15.\n\nNCEI Web Services | Climate Data Online (CDO) | National Center for Environmental Information (NCEI)\nData Tools | Climate Data Online (CDO) | National Climatic Data Center (NCDC)\n\n\n2.2.1.1 Weather stations\n\nTampa International Airport\n\nStart Date: 1939-02-01\nEnd Date: 2024-01-07\n\n\nGot token at ncdc.noaa.gov/cdo-web/token. Added variable NOAA_NCDC_CDO_token to:\n\nlocally:\nfile.edit(\"~/.Renviron\")\non GitHub: Repository secrets in Actions secrets · tbep-tech/climate-change-indicators\nGCHN readme\n\nPRCP: Precipitation (tenths of mm)\nTMAX: Maximum temperature (tenths of degrees C)\nTMIN: Minimum temperature (tenths of degrees C)\n\n\n\n\nCode\n# provide NOAA key\noptions(noaakey = Sys.getenv(\"NOAA_NCDC_CDO_token\"))\n\n# Specify datasetid and station\nstn &lt;- \"GHCND:USW00012842\" # TAMPA INTERNATIONAL AIRPORT, FL US\nstn_csv &lt;- here(\"data/tpa_ghcnd.csv\")\nstn_meta_csv &lt;- here(\"data/tpa_meta.csv\")\n\nif (!file.exists(stn_meta_csv)){\n # cache station metadata since timeout from Github Actions\n stn_meta &lt;- ncdc_stations(\n datasetid = \"GHCND\", \n stationid = stn)\n write_csv(stn_meta$data, stn_meta_csv)\n}\nread_csv(stn_meta_csv)\n\n\n# A tibble: 1 × 9\n elevation mindate maxdate latitude name datacoverage id \n &lt;dbl&gt; &lt;date&gt; &lt;date&gt; &lt;dbl&gt; &lt;chr&gt; &lt;dbl&gt; &lt;chr&gt;\n1 1.8 1939-02-01 2024-01-09 28.0 TAMPA INTERNATION… 1 GHCN…\n# ℹ 2 more variables: elevationUnit &lt;chr&gt;, longitude &lt;dbl&gt;\n\n\nCode\nif (!file.exists(stn_csv)){\n\n date_beg &lt;- stn_meta$data$mindate\n date_end &lt;- stn_meta$data$maxdate\n max_rows &lt;- 1000\n vars &lt;- c(\"PRCP\",\"TMIN\",\"TMAX\")\n \n n_vars &lt;- length(vars)\n days_batch &lt;- floor(max_rows / n_vars)\n dates &lt;- unique(c(\n seq(\n ymd(date_beg), \n ymd(date_end), \n by = glue(\"{days_batch} days\")),\n ymd(date_end)))\n \n n_i 
&lt;- length(dates) - 1\n for (i in 1:n_i){\n # for (i in 14:n_i){\n date_beg &lt;- dates[i]\n if (i == n_i){\n date_end &lt;- dates[i+1]\n } else {\n date_end &lt;- dates[i+1] - days(1)\n }\n print(glue(\"{i} of {n_i}: {date_beg} to {date_end} ~ {Sys.time()}\"))\n \n # retry if get Error: Service Unavailable (HTTP 503)\n o &lt;- NULL\n attempt &lt;- 1\n attempt_max &lt;- 10\n while (is.null(o) && attempt &lt;= attempt_max) {\n if (attempt &gt; 1)\n print(glue(\" attempt {attempt}\", .trim = F))\n attempt &lt;- attempt + 1\n try(\n o &lt;- ncdc(\n datasetid = \"GHCND\", \n stationid = stn, \n datatypeid = vars, \n startdate = date_beg,\n enddate = date_end,\n limit = max_rows) )\n }\n \n if (i == 1) {\n df &lt;- o$data\n } else {\n df &lt;- rbind(df, o$data)\n }\n }\n stopifnot(duplicated(df[,1:2])|&gt; sum() == 0)\n \n df &lt;- df |&gt; \n mutate(\n date = as.Date(strptime(\n date, \"%Y-%m-%dT00:00:00\")),\n datatype = recode(\n datatype, \n PRCP = \"precip_mm\", \n TMIN = \"temp_c_min\", \n TMAX = \"temp_c_max\"),\n value = value / 10) |&gt; \n select(\n -station, # station : all \"GHCND:USW00012842\"\n -fl_m, # measurement flag: 3,524 are \"T\" for trace\n -fl_t, # time flag: all \"2400\"\n -fl_q) # quality flag: all \"\"\n \n write_csv(df, stn_csv)\n}\n\nd &lt;- read_csv(stn_csv)\n\nd |&gt; \n select(date, datatype, value) |&gt;\n filter(datatype %in% c(\"temp_c_min\",\"temp_c_max\")) |&gt;\n pivot_wider(\n names_from = datatype, \n values_from = value) |&gt;\n dygraph(main = \"Daily Temperature (ºC)\") |&gt; \n dyOptions(\n colors = brewer.pal(5, \"YlOrRd\")[c(5,3)]) |&gt; \n dySeries(\"temp_c_min\", label = \"min\") |&gt; \n dySeries(\"temp_c_max\", label = \"max\")\n\n\n\n\n\nFigure 2.1: ?(caption)\n\n\n\nCode\nd |&gt; \n select(date, datatype, value) |&gt;\n filter(datatype %in% c(\"precip_mm\")) |&gt;\n pivot_wider(\n names_from = datatype, \n values_from = value) |&gt;\n dygraph(main = \"Daily Precipitation (mm)\") |&gt; \n dySeries(\"precip_mm\", label = 
\"precip\")\n\n\n\n\n\nFigure 2.2: ?(caption)\n\n\n\n\n\n2.2.1.2 Satellite"
},
{
"objectID": "data.html#precipitation",
"href": "data.html#precipitation",
"title": "2  Data",
"section": "2.3 Precipitation",
"text": "2.3 Precipitation"
},
{
"objectID": "data.html#sea-level-rise",
"href": "data.html#sea-level-rise",
"title": "2  Data",
"section": "2.4 Sea Level Rise",
"text": "2.4 Sea Level Rise\nSea level rise occurs from principally two sources: 1) thermal expansion; and 2) freshwater inputs from glacial melting. Data for these trends can be obtained from NOAA’s Sea Level Trends (Figure 2.3)\nTypes of data:\n\nObserved (past, present) - tide gauge - satellite, e.g. Laboratory for Satellite Altimetry / Sea Level Rise\n\nLevel-3 products distributed through NOAA CoastWatch (Sea Level Anomaly and along-track altimetry)\n\nProjected (future). modeled\n\n\n2.4.1 Gauges\n\n\n\nFigure 2.3: Screenshot of NOAA’s Sea Level Trends zoomed into the Tampa Bay.\n\n\n\nPORTS: Tampa Bay PORTS - NOAA Tides & Currents\n\n\n\n2.4.2 Satellite\n\nNOAA / NESDIS / STAR - Laboratory for Satellite Altimetry / Sea Level Rise\n\n\n\nCode\nslr_nc &lt;- here(\"data/slr/slr_map_txj1j2.nc\")\nr_slr_gcs &lt;- rast(slr_nc) # 0.5 degree resolution\nr_slr_mer &lt;- projectRasterForLeaflet(r_slr_gcs, method=\"bilinear\")\n\nb &lt;- st_bbox(tbsegshed)\nr_slr_tb_mer &lt;- rast(slr_nc) |&gt; \n crop(b) # |&gt; \n # projectRasterForLeaflet(method=\"bilinear\")\n# only one value for Tampa Bay extracted at 0.5 degree resolution\n# values(r_slr_tb_mer, mat=F, na.rm=T) # 5.368306\n\nb &lt;- st_bbox(tbshed)\nplet(r_slr_mer, tiles=providers$Esri.OceanBasemap) |&gt; \n addProviderTiles(providers$CartoDB.DarkMatterOnlyLabels) |&gt; \n addPolygons(data = tbsegshed) |&gt;\n fitBounds(\n lng1 = b[[\"xmin\"]], lat1 = b[[\"ymin\"]], \n lng2 = b[[\"xmax\"]], lat2 = b[[\"ymax\"]])\n\n\n\n\n\nFigure 2.4: ?(caption)"
},
{
"objectID": "data.html#severe-weather",
"href": "data.html#severe-weather",
"title": "2  Data",
"section": "2.5 Severe Weather",
"text": "2.5 Severe Weather\n\nSWDI vignette • rnoaa"
},
{
"objectID": "references.html",
…
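The R code added to data.html (quoted in the search.json diff above) works around the NCDC API's 1,000-record response limit by splitting the station's period of record into windows of `floor(max_rows / n_vars)` days, always including the final date, and retrying each request on failure. A minimal JavaScript sketch of that batching arithmetic (the function name and date handling are illustrative, not from the commit):

```javascript
// Split [startISO, endISO] into date windows small enough that each window
// returns at most maxRows records, given nVars records per day — the same
// arithmetic as days_batch <- floor(max_rows / n_vars) in the R code.
function batchDates(startISO, endISO, maxRows, nVars) {
  const daysBatch = Math.floor(maxRows / nVars);
  const MS_DAY = 24 * 60 * 60 * 1000;
  const start = new Date(startISO), end = new Date(endISO);

  // cut points every daysBatch days, always including the end date
  // (mirrors unique(c(seq(...), ymd(date_end))) in the R code)
  const cuts = [];
  for (let t = start.getTime(); t <= end.getTime(); t += daysBatch * MS_DAY)
    cuts.push(new Date(t));
  if (cuts[cuts.length - 1].getTime() !== end.getTime()) cuts.push(end);

  // pair consecutive cuts into [beg, end] windows; interior windows stop one
  // day before the next cut so ranges do not overlap
  const batches = [];
  for (let i = 0; i < cuts.length - 1; i++) {
    const beg = cuts[i];
    const last = (i === cuts.length - 2)
      ? cuts[i + 1]
      : new Date(cuts[i + 1].getTime() - MS_DAY);
    batches.push([beg.toISOString().slice(0, 10),
                  last.toISOString().slice(0, 10)]);
  }
  return batches;
}
```

With `maxRows = 1000` and three variables (PRCP, TMIN, TMAX), this yields 333-day windows, matching the batches the R loop requests.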
272 changes: 272 additions & 0 deletions site_libs/Proj4Leaflet-1.0.1/proj4leaflet.js
@@ -0,0 +1,272 @@
(function (factory) {
var L, proj4;
if (typeof define === 'function' && define.amd) {
// AMD
define(['leaflet', 'proj4'], factory);
} else if (typeof module === 'object' && typeof module.exports === "object") {
// Node/CommonJS
L = require('leaflet');
proj4 = require('proj4');
module.exports = factory(L, proj4);
} else {
// Browser globals
if (typeof window.L === 'undefined' || typeof window.proj4 === 'undefined')
throw 'Leaflet and proj4 must be loaded first';
factory(window.L, window.proj4);
}
}(function (L, proj4) {
if (proj4.__esModule && proj4.default) {
// If proj4 was bundled as an ES6 module, unwrap it to get
// to the actual main proj4 object.
// See discussion in https://github.com/kartena/Proj4Leaflet/pull/147
proj4 = proj4.default;
}

L.Proj = {};

L.Proj._isProj4Obj = function(a) {
return (typeof a.inverse !== 'undefined' &&
typeof a.forward !== 'undefined');
};

L.Proj.Projection = L.Class.extend({
initialize: function(code, def, bounds) {
var isP4 = L.Proj._isProj4Obj(code);
this._proj = isP4 ? code : this._projFromCodeDef(code, def);
this.bounds = isP4 ? def : bounds;
},

project: function (latlng) {
var point = this._proj.forward([latlng.lng, latlng.lat]);
return new L.Point(point[0], point[1]);
},

unproject: function (point, unbounded) {
var point2 = this._proj.inverse([point.x, point.y]);
return new L.LatLng(point2[1], point2[0], unbounded);
},

_projFromCodeDef: function(code, def) {
if (def) {
proj4.defs(code, def);
} else if (proj4.defs[code] === undefined) {
var urn = code.split(':');
if (urn.length > 3) {
code = urn[urn.length - 3] + ':' + urn[urn.length - 1];
}
if (proj4.defs[code] === undefined) {
throw 'No projection definition for code ' + code;
}
}

return proj4(code);
}
});

L.Proj.CRS = L.Class.extend({
includes: L.CRS,

options: {
transformation: new L.Transformation(1, 0, -1, 0)
},

initialize: function(a, b, c) {
var code,
proj,
def,
options;

if (L.Proj._isProj4Obj(a)) {
proj = a;
code = proj.srsCode;
options = b || {};

this.projection = new L.Proj.Projection(proj, options.bounds);
} else {
code = a;
def = b;
options = c || {};
this.projection = new L.Proj.Projection(code, def, options.bounds);
}

L.Util.setOptions(this, options);
this.code = code;
this.transformation = this.options.transformation;

if (this.options.origin) {
this.transformation =
new L.Transformation(1, -this.options.origin[0],
-1, this.options.origin[1]);
}

if (this.options.scales) {
this._scales = this.options.scales;
} else if (this.options.resolutions) {
this._scales = [];
for (var i = this.options.resolutions.length - 1; i >= 0; i--) {
if (this.options.resolutions[i]) {
this._scales[i] = 1 / this.options.resolutions[i];
}
}
}

this.infinite = !this.options.bounds;

},

scale: function(zoom) {
var iZoom = Math.floor(zoom),
baseScale,
nextScale,
scaleDiff,
zDiff;
if (zoom === iZoom) {
return this._scales[zoom];
} else {
// Non-integer zoom, interpolate
baseScale = this._scales[iZoom];
nextScale = this._scales[iZoom + 1];
scaleDiff = nextScale - baseScale;
zDiff = (zoom - iZoom);
return baseScale + scaleDiff * zDiff;
}
},

zoom: function(scale) {
// Find closest number in this._scales, down
var downScale = this._closestElement(this._scales, scale),
downZoom = this._scales.indexOf(downScale),
nextScale,
nextZoom,
scaleDiff;
// Check if scale is downScale => return array index
if (scale === downScale) {
return downZoom;
}
if (downScale === undefined) {
return -Infinity;
}
// Interpolate
nextZoom = downZoom + 1;
nextScale = this._scales[nextZoom];
if (nextScale === undefined) {
return Infinity;
}
scaleDiff = nextScale - downScale;
return (scale - downScale) / scaleDiff + downZoom;
},

distance: L.CRS.Earth.distance,

R: L.CRS.Earth.R,

/* Get the closest lowest element in an array */
_closestElement: function(array, element) {
var low;
for (var i = array.length; i--;) {
if (array[i] <= element && (low === undefined || low < array[i])) {
low = array[i];
}
}
return low;
}
});

L.Proj.GeoJSON = L.GeoJSON.extend({
initialize: function(geojson, options) {
this._callLevel = 0;
L.GeoJSON.prototype.initialize.call(this, geojson, options);
},

addData: function(geojson) {
var crs;

if (geojson) {
if (geojson.crs && geojson.crs.type === 'name') {
crs = new L.Proj.CRS(geojson.crs.properties.name);
} else if (geojson.crs && geojson.crs.type) {
crs = new L.Proj.CRS(geojson.crs.type + ':' + geojson.crs.properties.code);
}

if (crs !== undefined) {
this.options.coordsToLatLng = function(coords) {
var point = L.point(coords[0], coords[1]);
return crs.projection.unproject(point);
};
}
}

// Base class' addData might call us recursively, but
// CRS shouldn't be cleared in that case, since CRS applies
// to the whole GeoJSON, including sub-features.
this._callLevel++;
try {
L.GeoJSON.prototype.addData.call(this, geojson);
} finally {
this._callLevel--;
if (this._callLevel === 0) {
delete this.options.coordsToLatLng;
}
}
}
});

L.Proj.geoJson = function(geojson, options) {
return new L.Proj.GeoJSON(geojson, options);
};

L.Proj.ImageOverlay = L.ImageOverlay.extend({
initialize: function (url, bounds, options) {
L.ImageOverlay.prototype.initialize.call(this, url, null, options);
this._projectedBounds = bounds;
},

// Danger ahead: Overriding internal methods in Leaflet.
// Decided to do this rather than making a copy of L.ImageOverlay
// and doing very tiny modifications to it.
// Future will tell if this was wise or not.
_animateZoom: function (event) {
var scale = this._map.getZoomScale(event.zoom);
var northWest = L.point(this._projectedBounds.min.x, this._projectedBounds.max.y);
var offset = this._projectedToNewLayerPoint(northWest, event.zoom, event.center);

L.DomUtil.setTransform(this._image, offset, scale);
},

_reset: function () {
var zoom = this._map.getZoom();
var pixelOrigin = this._map.getPixelOrigin();
var bounds = L.bounds(
this._transform(this._projectedBounds.min, zoom)._subtract(pixelOrigin),
this._transform(this._projectedBounds.max, zoom)._subtract(pixelOrigin)
);
var size = bounds.getSize();

L.DomUtil.setPosition(this._image, bounds.min);
this._image.style.width = size.x + 'px';
this._image.style.height = size.y + 'px';
},

_projectedToNewLayerPoint: function (point, zoom, center) {
var viewHalf = this._map.getSize()._divideBy(2);
var newTopLeft = this._map.project(center, zoom)._subtract(viewHalf)._round();
var topLeft = newTopLeft.add(this._map._getMapPanePos());

return this._transform(point, zoom)._subtract(topLeft);
},

_transform: function (point, zoom) {
var crs = this._map.options.crs;
var transformation = crs.transformation;
var scale = crs.scale(zoom);

return transformation.transform(point, scale);
}
});

L.Proj.imageOverlay = function (url, bounds, options) {
return new L.Proj.ImageOverlay(url, bounds, options);
};

return L.Proj;
}));
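The `scale()`/`zoom()` pair in `L.Proj.CRS` above handles fractional zoom levels by linear interpolation between the scales of the two nearest integer zooms, and inverts that mapping via `_closestElement`. A standalone sketch of that logic, extracted into pure functions for illustration (these names are not part of the library):

```javascript
// scale(): linearly interpolate between the scales at floor(zoom) and
// floor(zoom) + 1, exactly as L.Proj.CRS.scale() does.
function scaleForZoom(scales, zoom) {
  const iZoom = Math.floor(zoom);
  if (zoom === iZoom) return scales[zoom];
  const base = scales[iZoom], next = scales[iZoom + 1];
  return base + (next - base) * (zoom - iZoom);
}

// zoom(): find the largest scale <= the requested one (the _closestElement
// helper), then interpolate toward the next zoom level; out-of-range scales
// map to -Infinity / Infinity, as in the library.
function zoomForScale(scales, scale) {
  let low, lowIdx = -1;
  for (let i = scales.length; i--; ) {
    if (scales[i] <= scale && (low === undefined || low < scales[i])) {
      low = scales[i];
      lowIdx = i;
    }
  }
  if (low === undefined) return -Infinity;
  if (scale === low) return lowIdx;
  const next = scales[lowIdx + 1];
  if (next === undefined) return Infinity;
  return (scale - low) / (next - low) + lowIdx;
}
```

For a CRS built from `resolutions`, the `_scales` array is the elementwise reciprocal of the resolutions, so these functions round-trip: `zoomForScale(scales, scaleForZoom(scales, z)) === z` for any in-range `z`.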
