From b9c7684a9f10b97d96db266b196a83f5b05114e9 Mon Sep 17 00:00:00 2001
From: taddyb
Date: Tue, 24 Oct 2023 13:59:54 -0400
Subject: [PATCH] Deployed 2cec6b9 with MkDocs version: 1.5.3

---
 projects/index.html      | 3 ++-
 search/search_index.json | 2 +-
 sitemap.xml.gz           | Bin 334 -> 334 bytes
 3 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/projects/index.html b/projects/index.html
index 4a29e24..90c1d55 100644
--- a/projects/index.html
+++ b/projects/index.html
@@ -721,7 +721,8 @@

 Community
 \(\delta\) MC-Juniata-hydroDL2 (Bindas et al. 2023)

-A differentiable routing method that mimics the classical Muskingum-Cunge routing model over a river network but embeds an NN to infer parameterizations for Manning’s roughness (n) and channel geometries from raw reach-scale attributes like catchment areas and sinuosity.
+Manning's n recovery against USGS Data
+A differentiable routing method that mimics the classical Muskingum-Cunge routing model over a river network but embeds an NN to infer parameterizations for Manning’s roughness (n).

 Read More Here

diff --git a/search/search_index.json b/search/search_index.json
index 59995e1..f07badb 100644
--- a/search/search_index.json
+++ b/search/search_index.json
@@ -1 +1 @@
-{"config":{"lang":["en"],"separator":"[\\s\\u200b\\-_,:!=\\[\\]()\"`/]+|\\.(?!\\d)|&[lg]t;|(?!\\b)(?=[A-Z][a-z])","pipeline":["stopWordFilter"],"fields":{"title":{"boost":1000.0},"text":{"boost":1.0},"tags":{"boost":1000000.0}}},"docs":[{"location":"","title":"HydroDL","text":"

Landing page coming soon...

"},{"location":"#benchmarks","title":"Benchmarks","text":""},{"location":"Contribute/","title":"Contribute","text":""},{"location":"Contribute/#tbd","title":"TBD","text":""},{"location":"Example/","title":"Examples","text":"

Several examples related to the above papers are presented here. Click the title link to see each example.

"},{"location":"Example/#1train-a-lstm-data-integration-model-to-make-streamflow-forecast","title":"1.Train a LSTM data integration model to make streamflow forecast","text":"

The dataset used is the NCAR CAMELS dataset. Download CAMELS following this link. Please download both the forcing/observation data (CAMELS time series meteorology, observed flow, meta data (.zip)) and the basin attributes (CAMELS Attributes (.zip)). Put the two unzipped folders under the same directory, e.g. your/path/to/Camels/basin_timeseries_v1p2_metForcing_obsFlow and your/path/to/Camels/camels_attributes_v2.0. Later, set the directory path your/path/to/Camels as the variable rootDatabase inside the code.
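As a quick sanity check of that layout before training, a minimal sketch like the one below can be used; the root path is a placeholder, and only the two folder names come from the instructions above.

# Verify the CAMELS directory layout; the root path is a placeholder.
import os

rootDatabase = "your/path/to/Camels"  # point this at your actual CAMELS directory

forcing_dir = os.path.join(rootDatabase, "basin_timeseries_v1p2_metForcing_obsFlow")
attr_dir = os.path.join(rootDatabase, "camels_attributes_v2.0")
for d in (forcing_dir, attr_dir):
    assert os.path.isdir(d), f"missing {d}; unzip both CAMELS archives first"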

Computational benchmark: training on the CAMELS data (with or without data integration) with 671 basins, 10 years, and 300 epochs completes in ~1 hour on a GPU.

"},{"location":"Installation/","title":"Installation","text":""},{"location":"Installation/#installation","title":"Installation","text":"

There are two different methods for hydroDL installation:

"},{"location":"Installation/#create-a-new-environment-then-activate-it","title":"Create a new environment, then activate it","text":"
conda create -n mhpihydrodl python=3.7\nconda activate mhpihydrodl\n
"},{"location":"Installation/#using-pypi-stable-package","title":"Using PyPI (stable package)","text":"

Install our stable hydroDL package from PyPI (Python version >= 3.0):

pip install hydroDL\n
"},{"location":"Installation/#source-latest-version","title":"Source latest version","text":"

Install our latest hydroDL package from GitHub:

git clone https://github.com/mhpi/hydroDL.git\n

Note: If you want to run our examples directly, please download the example folder (it contains the code and data for these examples).

There is a small compatibility issue with our code when using the latest PyTorch version. Feel free to contact us if you find any issues or bugs that you cannot resolve.

"},{"location":"Quick_start/","title":"Quick Start:","text":"

The detailed quick-start code can be found in tutorial_quick_start.py.

See below for a brief explanation of the major components you need to run a hydroDL model:

# imports\nfrom hydroDL.model.crit import RmseLoss\nfrom hydroDL.model.rnn import CudnnLstmModel as LSTM\nfrom hydroDL.model.train import trainModel\nfrom hydroDL.model.test import testModel\n\n# load your training and testing data \n# x: forcing data (pixels, time, features)\n# c: attribute data (pixels, features)\n# y: observed values (pixels, time, 1)\nx_train, c_train, y_train, x_val, c_val, y_val = load_data(...)\n\n# define your model and loss function\nmodel = LSTM(nx=num_variables, ny=1)\nloss_fn = RmseLoss()\n\n# train your model\nmodel = trainModel(model,\n    x_train,\n    y_train,\n    c_train,\n    loss_fn,\n)\n\n# validate your model\npred = testModel(model,\n             x_val,\n             c_val,\n)\n
"},{"location":"blog/","title":"Blog","text":""},{"location":"docs/","title":"Docs","text":"

HydroDL's differentiable modeling interface is set up to provide a seamless template for using physics models and neural networks together.

Our framework provides a standard for differentiable physics models to follow such that anyone can plug their models together.

"},{"location":"docs/#sections","title":"Sections","text":""},{"location":"docs/#plugins","title":"Plugins","text":"

Plugins are a way to build on open-source deep learning papers and repositories.

"},{"location":"docs/datasets/","title":"Datasets","text":"

The datasets used in hydroDL are individual @dataclass classes used to create a PyTorch torch.utils.data.DataLoader. The classes fall into two groups:

Data

Inputs to the neural networks

Observations

Targets used when training

"},{"location":"docs/datasets/#data","title":"Data","text":"

Data classes are implementations of the following ABC class:

__init__.Data.py
from abc import ABC, abstractmethod\n\nfrom omegaconf import DictConfig\nimport torch\n\n# Dates, Normalize, and Hydrofabric are hydroDL-internal types\n\nclass Data(ABC):\n    @abstractmethod\n    def __init__(self, cfg: DictConfig, dates: Dates, normalize: Normalize):\n        \"\"\"A method to define what inputs are required by a Data object\"\"\"\n        pass\n\n    @abstractmethod\n    def _read_attributes(self) -> None:\n        \"\"\"Abstract method for reading attributes related to the data\"\"\"\n        pass\n\n    @abstractmethod\n    def _read_forcings(self) -> None:\n        \"\"\"Abstract method for reading forcings related to the data\"\"\"\n        pass\n\n    @abstractmethod\n    def _read_data(self) -> None:\n        \"\"\"The method to read all data\"\"\"\n        pass\n\n    @abstractmethod\n    def get_data(self) -> Hydrofabric:\n        \"\"\"Abstract method for retrieving data in the form of a hydrofabric\"\"\"\n        pass\n
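For illustration, a concrete Data implementation might look like the sketch below. The class name CamelsData, the config keys, and the NumPy file format are hypothetical; Dates, Normalize, and Hydrofabric stand in for hydroDL's internal types used in the ABC above.

# A hypothetical Data implementation; names, config keys, and file
# formats are illustrative only, not hydroDL API.
import numpy as np
from omegaconf import DictConfig

class CamelsData(Data):
    def __init__(self, cfg: DictConfig, dates: Dates, normalize: Normalize):
        self.cfg = cfg
        self.dates = dates
        self.normalize = normalize
        self.attributes = None
        self.forcings = None

    def _read_attributes(self) -> None:
        # static basin attributes, path taken from the config
        self.attributes = np.load(self.cfg.data.attributes_path)

    def _read_forcings(self) -> None:
        # time-varying forcings for the configured date range
        self.forcings = np.load(self.cfg.data.forcings_path)

    def _read_data(self) -> None:
        self._read_attributes()
        self._read_forcings()

    def get_data(self) -> Hydrofabric:
        self._read_data()
        return Hydrofabric(self.normalize(self.attributes), self.forcings)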
"},{"location":"docs/experiments/","title":"Experiments","text":"

HydroDL experiments are designed to be both reusable and structured. All experiments are child classes of the base Experiment class:

__init__.Experiment.py
from abc import ABC, abstractmethod\nfrom typing import Dict, Type\n\nimport torch\nimport torch.nn as nn\n\nclass Experiment(ABC):\n    @abstractmethod\n    def run(\n        self,\n        data_loader: torch.utils.data.DataLoader,\n        neural_network: nn.Module,\n        physics_models: Dict[str, Type[nn.Module]],\n    ) -> None:\n        \"\"\"A method that runs your experiment\"\"\"\n        pass\n

The arguments passed to the run function are either class references (physics_models) or fully instantiated objects (data_loader and neural_network), as the sketch below illustrates.
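For illustration, a training experiment might subclass Experiment as follows. The optimizer, the loss, the way physics-model class references are instantiated, and how predictions flow between models are all assumptions, not hydroDL's actual API.

# A hypothetical Experiment subclass; the training details are illustrative.
from typing import Dict, Type

import torch
import torch.nn as nn

class TrainingExperiment(Experiment):
    def run(
        self,
        data_loader: torch.utils.data.DataLoader,
        neural_network: nn.Module,
        physics_models: Dict[str, Type[nn.Module]],
    ) -> None:
        optimizer = torch.optim.Adam(neural_network.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        # physics_models holds class references, so instantiate them here
        # (real hydroDL models also take a DictConfig, omitted in this sketch)
        models = [cls() for cls in physics_models.values()]
        for inputs, targets in data_loader:
            prediction = neural_network(inputs)  # NN infers parameterizations
            for model in models:                 # physics models consume them
                prediction = model(prediction)
            loss = loss_fn(prediction, targets)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()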

"},{"location":"docs/neural_networks/","title":"Neural Networks","text":"

Neural networks are configured similarly to how they are instantiated in other PyTorch packages.

from functools import partial\n\nfrom omegaconf import DictConfig\nimport torch\nimport torch.nn as nn\n\nfrom hydroRoute.neural_networks import Initialization\n\nclass NN(nn.Module):\n    def __init__(self, cfg: DictConfig):\n        super(NN, self).__init__()\n        self.cfg = cfg\n        self.Initialization = Initialization(self.cfg)\n\n    def initialize_weights(self) -> None:\n        \"\"\"\n        Applies the configured initialization function to the\n        network's weights via a partial function\n        \"\"\"\n        func = self.Initialization.get()\n        init_func = partial(self._initialize_weights, func=func)\n        self.apply(init_func)\n\n    @staticmethod\n    def _initialize_weights(m, func) -> None:\n        \"\"\"\n        An internal method used to initialize weights based\n        on a provided initialization function\n        \"\"\"\n        if isinstance(m, nn.Linear):\n            func(m.weight)\n\n    def forward(self, inputs: torch.Tensor) -> torch.Tensor:\n        pass\n
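A usage sketch, assuming the config carries a key that Initialization understands (the key name "initialization" and its value here are guesses, not documented hydroDL options):

# Hypothetical usage; the config key and value are assumptions.
from omegaconf import OmegaConf

cfg = OmegaConf.create({"initialization": "xavier_uniform"})
net = NN(cfg)
net.initialize_weights()  # applies the configured init to every nn.Linear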
"},{"location":"docs/physics_models/","title":"Physics Models","text":"

HydroDL's implemented physics models are all child classes of the PyTorch nn.Module class. By creating your physics model as an nn.Module, you tap into PyTorch's neural network functionality, getting autograd, parameter registration, and GPU support for free.

"},{"location":"docs/physics_models/#basics","title":"Basics","text":"

Our physics models are structured as follows:

from typing import Tuple\n\nfrom omegaconf import DictConfig\nimport torch\nimport torch.nn as nn\n\nclass PhysicsModel(nn.Module):\n    def __init__(self, cfg: DictConfig) -> None:\n        super(PhysicsModel, self).__init__()\n        self.cfg = cfg\n\n    def forward(self, inputs: Tuple[torch.Tensor, ...]) -> torch.Tensor:\n        ...\n

All that is required by a physics model is its specified configuration file. Since there are different requirements for each physics model, you will need to read up on the specific configuration required by each module. A toy example of the pattern follows below.
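To make the pattern concrete, here is a hypothetical toy physics model (a single linear reservoir). The model itself and the config key recession are illustrative, not part of hydroDL.

# A hypothetical toy physics model; the `recession` config key is assumed.
from typing import Tuple

import torch
import torch.nn as nn
from omegaconf import DictConfig, OmegaConf

class LinearReservoir(nn.Module):
    def __init__(self, cfg: DictConfig) -> None:
        super(LinearReservoir, self).__init__()
        self.cfg = cfg
        self.k = cfg.recession  # recession constant; could also be NN-inferred

    def forward(self, inputs: Tuple[torch.Tensor, ...]) -> torch.Tensor:
        precip, storage = inputs
        # explicit Euler step of dS/dt = P - k*S; outflow q = k*S
        storage = storage + precip - self.k * storage
        return self.k * storage

cfg = OmegaConf.create({"recession": 0.1})
model = LinearReservoir(cfg)
outflow = model((torch.rand(10), torch.zeros(10)))  # 10 reaches, one step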

"},{"location":"docs/plugins/","title":"Plugins","text":""},{"location":"docs/plugins/hydrodl/","title":"HydroDL","text":""},{"location":"projects/","title":"Community","text":"

See below for coding projects developed by the community that utilize HydroDL.

"},{"location":"projects/bindas_2023/","title":"\\(\\delta\\) MC-Juniata-hydroDL2","text":"

Info

This paper is currently in preprint

"},{"location":"projects/bindas_2023/#code-release","title":"Code Release","text":"

Will be released upon publication

"},{"location":"projects/bindas_2023/#results","title":"Results","text":"The learned relationship between n and drainage area (square kilometers) for the Juniata River basin according to the trained graph neural network. (a) The distribution of river segments by Manning\u2019s n and drainage area on a linear scale. (b) The same distribution, but on a logarithmic scale with a logarithmic trendline. The network was trained for the period of 2001/02/01 to 2001/03/29. Each dot in the scatter plot represents a 2-km river reach."},{"location":"projects/bindas_2023/#bibtex-citation","title":"Bibtex Citation","text":"
@techreport{bindas2023improving,\n  title={Improving large-basin streamflow simulation using a modular, differentiable, learnable graph model for routing},\n  author={Bindas, Tadd and Tsai, Wen-Ping and Liu, Jiangtao and Rahmani, Farshid and Feng, Dapeng and Bian, Yuchen and Lawson, Kathryn and Shen, Chaopeng},\n  year={2023},\n  institution={Copernicus Meetings}\n}\n
"},{"location":"projects/feng_2023/","title":"\\(\\delta\\) HBV-globe1.0-hydroDL","text":"

Documentation coming soon

"},{"location":"projects/feng_2023/#bibtex-citation","title":"Bibtex Citation","text":"
@article{feng2023deep,\n  title={Deep Dive into Global Hydrologic Simulations: Harnessing the Power of Deep Learning and Physics-informed Differentiable Models ($\delta$HBV-globe1.0-hydroDL)},\n  author={Feng, Dapeng and Beck, Hylke and de Bruijn, Jens and Sahu, Reetik Kumar and Satoh, Yusuke and Wada, Yoshihide and Liu, Jiangtao and Pan, Ming and Lawson, Kathryn and Shen, Chaopeng},\n  journal={Geoscientific Model Development Discussions},\n  volume={2023},\n  pages={1--23},\n  year={2023},\n  publisher={G{\\\"o}ttingen, Germany}\n}\n
"}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\u200b\\-_,:!=\\[\\]()\"`/]+|\\.(?!\\d)|&[lg]t;|(?!\\b)(?=[A-Z][a-z])","pipeline":["stopWordFilter"],"fields":{"title":{"boost":1000.0},"text":{"boost":1.0},"tags":{"boost":1000000.0}}},"docs":[{"location":"","title":"HydroDL","text":"

Landing page coming soon...

"},{"location":"#benchmarks","title":"Benchmarks","text":""},{"location":"Contribute/","title":"Contribute","text":""},{"location":"Contribute/#tbd","title":"TBD","text":""},{"location":"Example/","title":"Examples","text":"

Several examples related to the above papers are presented here. Click the title link to see each example.

"},{"location":"Example/#1train-a-lstm-data-integration-model-to-make-streamflow-forecast","title":"1.Train a LSTM data integration model to make streamflow forecast","text":"

The dataset used is the NCAR CAMELS dataset. Download CAMELS following this link. Please download both the forcing/observation data (CAMELS time series meteorology, observed flow, meta data (.zip)) and the basin attributes (CAMELS Attributes (.zip)). Put the two unzipped folders under the same directory, e.g. your/path/to/Camels/basin_timeseries_v1p2_metForcing_obsFlow and your/path/to/Camels/camels_attributes_v2.0. Later, set the directory path your/path/to/Camels as the variable rootDatabase inside the code.
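As a quick sanity check of that layout before training, a minimal sketch like the one below can be used; the root path is a placeholder, and only the two folder names come from the instructions above.

# Verify the CAMELS directory layout; the root path is a placeholder.
import os

rootDatabase = "your/path/to/Camels"  # point this at your actual CAMELS directory

forcing_dir = os.path.join(rootDatabase, "basin_timeseries_v1p2_metForcing_obsFlow")
attr_dir = os.path.join(rootDatabase, "camels_attributes_v2.0")
for d in (forcing_dir, attr_dir):
    assert os.path.isdir(d), f"missing {d}; unzip both CAMELS archives first"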

Computational benchmark: training on the CAMELS data (with or without data integration) with 671 basins, 10 years, and 300 epochs completes in ~1 hour on a GPU.

"},{"location":"Installation/","title":"Installation","text":""},{"location":"Installation/#installation","title":"Installation","text":"

There are two different methods for hydroDL installation:

"},{"location":"Installation/#create-a-new-environment-then-activate-it","title":"Create a new environment, then activate it","text":"
conda create -n mhpihydrodl python=3.7\nconda activate mhpihydrodl\n
"},{"location":"Installation/#using-pypi-stable-package","title":"Using PyPI (stable package)","text":"

Install our stable hydroDL package from PyPI (Python version >= 3.0):

pip install hydroDL\n
"},{"location":"Installation/#source-latest-version","title":"Source latest version","text":"

Install our latest hydroDL package from GitHub:

git clone https://github.com/mhpi/hydroDL.git\n

Note: If you want to run our examples directly, please download the example folder (it contains the code and data for these examples).

There is a small compatibility issue with our code when using the latest PyTorch version. Feel free to contact us if you find any issues or bugs that you cannot resolve.

"},{"location":"Quick_start/","title":"Quick Start:","text":"

The detailed quick-start code can be found in tutorial_quick_start.py.

See below for a brief explanation of the major components you need to run a hydroDL model:

# imports\nfrom hydroDL.model.crit import RmseLoss\nfrom hydroDL.model.rnn import CudnnLstmModel as LSTM\nfrom hydroDL.model.train import trainModel\nfrom hydroDL.model.test import testModel\n\n# load your training and testing data \n# x: forcing data (pixels, time, features)\n# c: attribute data (pixels, features)\n# y: observed values (pixels, time, 1)\nx_train, c_train, y_train, x_val, c_val, y_val = load_data(...)\n\n# define your model and loss function\nmodel = LSTM(nx=num_variables, ny=1)\nloss_fn = RmseLoss()\n\n# train your model\nmodel = trainModel(model,\n    x_train,\n    y_train,\n    c_train,\n    loss_fn,\n)\n\n# validate your model\npred = testModel(model,\n             x_val,\n             c_val,\n)\n
"},{"location":"blog/","title":"Blog","text":""},{"location":"docs/","title":"Docs","text":"

HydroDL's differentiable modeling interface is set up to provide a seamless template for using physics models and neural networks together.

Our framework provides a standard for differentiable physics models to follow such that anyone can plug their models together.

"},{"location":"docs/#sections","title":"Sections","text":""},{"location":"docs/#plugins","title":"Plugins","text":"

Plugins are a way to build on open-source deep learning papers and repositories.

"},{"location":"docs/datasets/","title":"Datasets","text":"

The datasets used in hydroDL are individual @dataclass classes used to create a PyTorch torch.utils.data.DataLoader. The classes fall into two groups:

Data

Inputs to the neural networks

Observations

Targets used when training

"},{"location":"docs/datasets/#data","title":"Data","text":"

Data classes are implementations of the following ABC class:

__init__.Data.py
from abc import ABC, abstractmethod\n\nfrom omegaconf import DictConfig\nimport torch\n\n# Dates, Normalize, and Hydrofabric are hydroDL-internal types\n\nclass Data(ABC):\n    @abstractmethod\n    def __init__(self, cfg: DictConfig, dates: Dates, normalize: Normalize):\n        \"\"\"A method to define what inputs are required by a Data object\"\"\"\n        pass\n\n    @abstractmethod\n    def _read_attributes(self) -> None:\n        \"\"\"Abstract method for reading attributes related to the data\"\"\"\n        pass\n\n    @abstractmethod\n    def _read_forcings(self) -> None:\n        \"\"\"Abstract method for reading forcings related to the data\"\"\"\n        pass\n\n    @abstractmethod\n    def _read_data(self) -> None:\n        \"\"\"The method to read all data\"\"\"\n        pass\n\n    @abstractmethod\n    def get_data(self) -> Hydrofabric:\n        \"\"\"Abstract method for retrieving data in the form of a hydrofabric\"\"\"\n        pass\n
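For illustration, a concrete Data implementation might look like the sketch below. The class name CamelsData, the config keys, and the NumPy file format are hypothetical; Dates, Normalize, and Hydrofabric stand in for hydroDL's internal types used in the ABC above.

# A hypothetical Data implementation; names, config keys, and file
# formats are illustrative only, not hydroDL API.
import numpy as np
from omegaconf import DictConfig

class CamelsData(Data):
    def __init__(self, cfg: DictConfig, dates: Dates, normalize: Normalize):
        self.cfg = cfg
        self.dates = dates
        self.normalize = normalize
        self.attributes = None
        self.forcings = None

    def _read_attributes(self) -> None:
        # static basin attributes, path taken from the config
        self.attributes = np.load(self.cfg.data.attributes_path)

    def _read_forcings(self) -> None:
        # time-varying forcings for the configured date range
        self.forcings = np.load(self.cfg.data.forcings_path)

    def _read_data(self) -> None:
        self._read_attributes()
        self._read_forcings()

    def get_data(self) -> Hydrofabric:
        self._read_data()
        return Hydrofabric(self.normalize(self.attributes), self.forcings)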
"},{"location":"docs/experiments/","title":"Experiments","text":"

HydroDL experiments are designed to be both reusable and structured. All experiments are child classes of the base Experiment class:

__init__.Experiment.py
from abc import ABC, abstractmethod\nfrom typing import Dict, Type\n\nimport torch\nimport torch.nn as nn\n\nclass Experiment(ABC):\n    @abstractmethod\n    def run(\n        self,\n        data_loader: torch.utils.data.DataLoader,\n        neural_network: nn.Module,\n        physics_models: Dict[str, Type[nn.Module]],\n    ) -> None:\n        \"\"\"A method that runs your experiment\"\"\"\n        pass\n

The arguments passed to the run function are either class references (physics_models) or fully instantiated objects (data_loader and neural_network), as the sketch below illustrates.
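For illustration, a training experiment might subclass Experiment as follows. The optimizer, the loss, the way physics-model class references are instantiated, and how predictions flow between models are all assumptions, not hydroDL's actual API.

# A hypothetical Experiment subclass; the training details are illustrative.
from typing import Dict, Type

import torch
import torch.nn as nn

class TrainingExperiment(Experiment):
    def run(
        self,
        data_loader: torch.utils.data.DataLoader,
        neural_network: nn.Module,
        physics_models: Dict[str, Type[nn.Module]],
    ) -> None:
        optimizer = torch.optim.Adam(neural_network.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        # physics_models holds class references, so instantiate them here
        # (real hydroDL models also take a DictConfig, omitted in this sketch)
        models = [cls() for cls in physics_models.values()]
        for inputs, targets in data_loader:
            prediction = neural_network(inputs)  # NN infers parameterizations
            for model in models:                 # physics models consume them
                prediction = model(prediction)
            loss = loss_fn(prediction, targets)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()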

"},{"location":"docs/neural_networks/","title":"Neural Networks","text":"

Neural networks are configured similarly to how they are instantiated in other PyTorch packages.

from functools import partial\n\nfrom omegaconf import DictConfig\nimport torch\nimport torch.nn as nn\n\nfrom hydroRoute.neural_networks import Initialization\n\nclass NN(nn.Module):\n    def __init__(self, cfg: DictConfig):\n        super(NN, self).__init__()\n        self.cfg = cfg\n        self.Initialization = Initialization(self.cfg)\n\n    def initialize_weights(self) -> None:\n        \"\"\"\n        Applies the configured initialization function to the\n        network's weights via a partial function\n        \"\"\"\n        func = self.Initialization.get()\n        init_func = partial(self._initialize_weights, func=func)\n        self.apply(init_func)\n\n    @staticmethod\n    def _initialize_weights(m, func) -> None:\n        \"\"\"\n        An internal method used to initialize weights based\n        on a provided initialization function\n        \"\"\"\n        if isinstance(m, nn.Linear):\n            func(m.weight)\n\n    def forward(self, inputs: torch.Tensor) -> torch.Tensor:\n        pass\n
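A usage sketch, assuming the config carries a key that Initialization understands (the key name "initialization" and its value here are guesses, not documented hydroDL options):

# Hypothetical usage; the config key and value are assumptions.
from omegaconf import OmegaConf

cfg = OmegaConf.create({"initialization": "xavier_uniform"})
net = NN(cfg)
net.initialize_weights()  # applies the configured init to every nn.Linear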
"},{"location":"docs/physics_models/","title":"Physics Models","text":"

HydroDL's implemented physics models are all child classes of the PyTorch nn.Module class. By creating your physics model as an nn.Module, you tap into PyTorch's neural network functionality, getting autograd, parameter registration, and GPU support for free.

"},{"location":"docs/physics_models/#basics","title":"Basics","text":"

Our physics models are structured as follows:

from typing import Tuple\n\nfrom omegaconf import DictConfig\nimport torch\nimport torch.nn as nn\n\nclass PhysicsModel(nn.Module):\n    def __init__(self, cfg: DictConfig) -> None:\n        super(PhysicsModel, self).__init__()\n        self.cfg = cfg\n\n    def forward(self, inputs: Tuple[torch.Tensor, ...]) -> torch.Tensor:\n        ...\n

All that is required by a physics model is its specified configuration file. Since there are different requirements for each physics model, you will need to read up on the specific configuration required by each module. A toy example of the pattern follows below.
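To make the pattern concrete, here is a hypothetical toy physics model (a single linear reservoir). The model itself and the config key recession are illustrative, not part of hydroDL.

# A hypothetical toy physics model; the `recession` config key is assumed.
from typing import Tuple

import torch
import torch.nn as nn
from omegaconf import DictConfig, OmegaConf

class LinearReservoir(nn.Module):
    def __init__(self, cfg: DictConfig) -> None:
        super(LinearReservoir, self).__init__()
        self.cfg = cfg
        self.k = cfg.recession  # recession constant; could also be NN-inferred

    def forward(self, inputs: Tuple[torch.Tensor, ...]) -> torch.Tensor:
        precip, storage = inputs
        # explicit Euler step of dS/dt = P - k*S; outflow q = k*S
        storage = storage + precip - self.k * storage
        return self.k * storage

cfg = OmegaConf.create({"recession": 0.1})
model = LinearReservoir(cfg)
outflow = model((torch.rand(10), torch.zeros(10)))  # 10 reaches, one step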

"},{"location":"docs/plugins/","title":"Plugins","text":""},{"location":"docs/plugins/hydrodl/","title":"HydroDL","text":""},{"location":"projects/","title":"Community","text":"

See below for coding projects developed by the community that utilize HydroDL.

"},{"location":"projects/bindas_2023/","title":"\\(\\delta\\) MC-Juniata-hydroDL2","text":"

Info

This paper is currently in preprint

"},{"location":"projects/bindas_2023/#code-release","title":"Code Release","text":"

Will be released upon publication

"},{"location":"projects/bindas_2023/#results","title":"Results","text":"The learned relationship between n and drainage area (square kilometers) for the Juniata River basin according to the trained graph neural network. (a) The distribution of river segments by Manning\u2019s n and drainage area on a linear scale. (b) The same distribution, but on a logarithmic scale with a logarithmic trendline. The network was trained for the period of 2001/02/01 to 2001/03/29. Each dot in the scatter plot represents a 2-km river reach."},{"location":"projects/bindas_2023/#bibtex-citation","title":"Bibtex Citation","text":"
@techreport{bindas2023improving,\n  title={Improving large-basin streamflow simulation using a modular, differentiable, learnable graph model for routing},\n  author={Bindas, Tadd and Tsai, Wen-Ping and Liu, Jiangtao and Rahmani, Farshid and Feng, Dapeng and Bian, Yuchen and Lawson, Kathryn and Shen, Chaopeng},\n  year={2023},\n  institution={Copernicus Meetings}\n}\n
"},{"location":"projects/feng_2023/","title":"\\(\\delta\\) HBV-globe1.0-hydroDL","text":"

Documentation coming soon

"},{"location":"projects/feng_2023/#bibtex-citation","title":"Bibtex Citation","text":"
@article{feng2023deep,\n  title={Deep Dive into Global Hydrologic Simulations: Harnessing the Power of Deep Learning and Physics-informed Differentiable Models ($\delta$HBV-globe1.0-hydroDL)},\n  author={Feng, Dapeng and Beck, Hylke and de Bruijn, Jens and Sahu, Reetik Kumar and Satoh, Yusuke and Wada, Yoshihide and Liu, Jiangtao and Pan, Ming and Lawson, Kathryn and Shen, Chaopeng},\n  journal={Geoscientific Model Development Discussions},\n  volume={2023},\n  pages={1--23},\n  year={2023},\n  publisher={G{\\\"o}ttingen, Germany}\n}\n
"}]} \ No newline at end of file diff --git a/sitemap.xml.gz b/sitemap.xml.gz index 8e62845fe43b807c0183fec65fc80a9a5459941b..61c749d9c63e80afd4998f65de493bdb1b84c8aa 100644 GIT binary patch delta 16 XcmX@dbdHH#zMF$Xip^ppyE`KQC1nG; delta 16 XcmX@dbdHH#zMF$%&u8