diff --git a/.DS_Store b/.DS_Store
new file mode 100644
index 0000000..fd6753d
Binary files /dev/null and b/.DS_Store differ
diff --git a/LICENSE b/LICENSE
index 785fc46..daf3031 100644
--- a/LICENSE
+++ b/LICENSE
@@ -1,6 +1,6 @@
MIT License
-Copyright (c) 2023 CIVML
+Copyright (c) 2023 BayesWorks
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
diff --git a/README.md b/README.md
index 695ddd5..15c28a7 100644
--- a/README.md
+++ b/README.md
@@ -1,2 +1,68 @@
-# cutagi-doc
-Documentation Website for cuTAGI
+
+
+# py/cuTAGI Documentation
+
+> cuTAGI is an open-source Bayesian neural network library based on the Tractable Approximate Gaussian Inference (TAGI) theory. It supports various neural network architectures such as fully-connected, convolutional, and transpose convolutional layers, as well as skip connections, pooling, and normalization layers. cuTAGI can perform different tasks such as supervised, unsupervised, and reinforcement learning. The library comes with a Python API called pyTAGI that allows users to easily use the C++ and CUDA backends.
+
+
+## Getting Started
+
+To get started with using our library, check out our:
+
+- [installation guide](guide/install.md) for Windows, macOS, and Linux (CPU + GPU).
+- [quick tutorial](guide/quick-tutorial.md) for a 1D toy problem.
+
+## Examples
+
+In this section, you will find a series of [examples](examples/examples.md) for each available architecture that you can use as a starting point.
+
+## API
+
+Check out our [API reference](api/api.md) for a complete list of all the functions and classes in our library.
+
+## Modules
+
+pyTAGI already includes a set of modules that allow users to make their own models. Check out our [modules reference](modules/modules.md) for a list of classes and functions.
+
+## Contributing
+
+We welcome contributions from the community: 1) fork the project, 2) create a feature branch, and 3) commit your changes.
+
+## Support
+
+If you run into any issues or have any questions, please [open an issue](https://github.com/lhnguyen102/cuTAGI/issues) or contact us at *luongha.nguyen@gmail.com* or *james.goulet@polymtl.ca*.
+
+## Citation
+
+```
+@misc{cutagi2022,
+ Author = {Luong-Ha Nguyen and James-A. Goulet},
+ Title = {cu{TAGI}: a {CUDA} library for {B}ayesian neural networks with Tractable Approximate {G}aussian Inference},
+ Year = {2022},
+ journal = {GitHub repository},
+ howpublished = {https://github.com/lhnguyen102/cuTAGI}
+}
+```
+
+## References
+
+* [Tractable approximate Gaussian inference for Bayesian neural networks](https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf) (James-A. Goulet, Luong-Ha Nguyen, and Said Amiri. JMLR, 2021)
+* [Analytically tractable hidden-states inference in Bayesian neural networks](https://www.jmlr.org/papers/volume23/21-0758/21-0758.pdf) (Luong-Ha Nguyen and James-A. Goulet. JMLR, 2022)
+* [Analytically tractable inference in deep neural networks](https://arxiv.org/pdf/2103.05461.pdf) (Luong-Ha Nguyen and James-A. Goulet. ArXiv 2021)
+* [Analytically tractable Bayesian deep Q-Learning](https://arxiv.org/pdf/2106.11086.pdf) (Luong-Ha Nguyen and James-A. Goulet. ArXiv, 2021)
+
+
+## License
+
+cuTAGI is licensed under the [MIT License](https://github.com/lhnguyen102/cuTAGI/blob/main/LICENSE).
+
+## Acknowledgement
+We would like to thank Miquel Florensa, who wrote and put together this document on his own, for his hard work and commitment to sharing clear and detailed information.
\ No newline at end of file
diff --git a/_sidebar.md b/_sidebar.md
new file mode 100644
index 0000000..04a9c63
--- /dev/null
+++ b/_sidebar.md
@@ -0,0 +1,3 @@
+- [**Home**](/)
+- [About py/cuTAGI](about.md)
+- [Our Team](team.md)
\ No newline at end of file
diff --git a/about.md b/about.md
new file mode 100644
index 0000000..5494833
--- /dev/null
+++ b/about.md
@@ -0,0 +1,19 @@
+
+
+# About py/cuTAGI
+
+The core development of py/cuTAGI has been done by Luong-Ha Nguyen, building upon the theoretical work done at Polytechnique Montreal in collaboration with James-A. Goulet, Bhargob Deka, Van-Dai Vuong and Miquel Florensa. The project started in 2018 when, from our background with large-scale state-space models, we foresaw that it would be possible to perform analytical Bayesian inference in neural networks (see below for our first try at what would become TAGI).
+
+
+
+Following the early proofs of concept with small-scale examples using single-layer MLPs, we gradually expanded the development of TAGI to CNN, autoencoder, and GAN architectures. Then came proofs of concept with reinforcement-learning toy problems, which led to full-scale applications on the Atari and MuJoCo benchmarks. The expansion of TAGI's applicability to new architectures continued with LSTM networks, along with unprecedented features such as analytical uncertainty quantification for Bayesian neural networks, analytical adversarial attacks, inference-based optimization, and general-purpose latent-space inference.
+
+Despite our repeated successes at leveraging analytical inference in neural networks, the key remaining limitation was the lack of an efficient and scalable library for TAGI; as the method relies on neither backpropagation nor gradient descent, it is incompatible with traditional libraries such as PyTorch or TensorFlow. In 2021, Luong-Ha Nguyen decided to lead the development of the new cuTAGI platform, and later the pyTAGI API, with the objective of opening the capabilities of TAGI to the entire community.
\ No newline at end of file
diff --git a/api/_sidebar.md b/api/_sidebar.md
new file mode 100644
index 0000000..48fd0bf
--- /dev/null
+++ b/api/_sidebar.md
@@ -0,0 +1,7 @@
+- pyTAGI API
+
+ - [Metrics](api/metrics.md)
+ - [NetProp](api/netprop.md)
+ - [Param](api/param.md)
+ - [TAGI Network](api/network.md)
+ - [TAGI Utils](api/utils.md)
\ No newline at end of file
diff --git a/api/api.md b/api/api.md
new file mode 100644
index 0000000..75f57a4
--- /dev/null
+++ b/api/api.md
@@ -0,0 +1,7 @@
+# API
+
+ - [Metrics](api/metrics.md)
+ - [NetProp](api/netprop.md)
+ - [Param](api/param.md)
+ - [TAGI Network](api/network.md)
+ - [TAGI Utils](api/utils.md)
\ No newline at end of file
diff --git a/api/metrics.md b/api/metrics.md
new file mode 100644
index 0000000..5ef3b54
--- /dev/null
+++ b/api/metrics.md
@@ -0,0 +1,72 @@
+# metric.py
+
+Measure the accuracy of the prediction.
+
+## *mse* method
+
+```python
+def mse(prediction: np.ndarray, observation: np.ndarray) -> float:
+ """Mean squared error"""
+```
+
+> Calculates the mean squared error between the prediction and observation arrays.
+
+**Parameters**
+- `prediction` (numpy.ndarray): Array containing the predicted values.
+- `observation` (numpy.ndarray): Array containing the observed values.
+
+**Returns**
+- `float`: Mean squared error.
+
+## *log_likelihood* method
+
+```python
+def log_likelihood(prediction: np.ndarray, observation: np.ndarray, std: np.ndarray) -> float:
+ """Compute the averaged log-likelihood"""
+```
+
+> Calculates the averaged log-likelihood between the prediction and observation arrays.
+
+**Parameters**
+- `prediction` (numpy.ndarray): Array containing the predicted values.
+- `observation` (numpy.ndarray): Array containing the observed values.
+- `std` (numpy.ndarray): Array containing the standard deviations.
+
+**Returns**
+- `float`: Averaged log-likelihood.
+
+## *rmse* method
+
+```python
+def rmse(prediction: np.ndarray, observation: np.ndarray) -> None:
+ """Root mean squared error"""
+```
+
+> Calculates the root mean squared error between the prediction and observation arrays.
+
+**Parameters**
+- `prediction` (numpy.ndarray): Array containing the predicted values.
+- `observation` (numpy.ndarray): Array containing the observed values.
+
+## *classification_error* method
+
+```python
+def classification_error(prediction: np.ndarray, label: np.ndarray) -> None:
+ """Compute the classification error"""
+```
+
+> Computes the classification error between the prediction and label arrays.
+
+**Parameters**
+- `prediction` (numpy.ndarray): Array containing the predicted values.
+- `label` (numpy.ndarray): Array containing the true labels.
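+
+Below is a minimal usage sketch for these metrics. The `from pytagi import metric` import path is an assumption; adjust it to wherever `metric.py` is exposed in your installation.
+
+```python
+import numpy as np
+
+from pytagi import metric  # assumed import path
+
+prediction = np.array([1.1, 1.9, 3.2])
+observation = np.array([1.0, 2.0, 3.0])
+std = np.full_like(prediction, 0.2)  # predictive standard deviations
+
+print(metric.mse(prediction, observation))                  # mean squared error
+print(metric.log_likelihood(prediction, observation, std))  # averaged log-likelihood
+```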
diff --git a/api/netprop.md b/api/netprop.md
new file mode 100644
index 0000000..d8b9c79
--- /dev/null
+++ b/api/netprop.md
@@ -0,0 +1,95 @@
+# The NetProp class
+
+The `NetProp` class is a base class for network properties defined in the backend C++/CUDA layer. It provides various attributes and methods for defining network architecture and properties.
+
+## Attributes
+
+- `layers`: A list containing different [layers](api/netprop?id=layer-code) of the network architecture.
+- `nodes`: A list containing the number of hidden units for each layer.
+- `kernels`: A list containing the kernel sizes for convolutional layers.
+- `strides`: A list containing the strides for convolutional layers.
+- `widths`: A list containing the widths of the images.
+- `heights`: A list containing the heights of the images.
+- `filters`: A list containing the number of filters (depth of image) for each layer.
+- `activations`: A list containing the [activation](api/netprop?id=activation-code) function for each layer.
+- `pads`: A list containing the padding applied to the images.
+- `pad_types`: A list containing the types of padding.
+- `shortcuts`: A list containing the layer indices for residual networks.
+- `mu_v2b`: A NumPy array representing the mean of the observation noise squared.
+- `sigma_v2b`: A NumPy array representing the standard deviation of the observation noise squared.
+- `sigma_v`: A float representing the observation noise.
+- `decay_factor_sigma_v`: A float representing the decaying factor for sigma v (default value: 0.99).
+- `sigma_v_min`: A float representing the minimum value of the observation noise (default value: 0.3).
+- `sigma_x`: A float representing the input noise.
+- `is_idx_ud`: A boolean indicating whether or not to update only hidden units in the output layers.
+- `is_output_ud`: A boolean indicating whether or not to update the output layer.
+- `last_backward_layer`: An integer representing the index of the last layer whose hidden states are updated.
+- `nye`: An integer representing the number of observations for hierarchical softmax.
+- `noise_gain`: A float representing the gain for biases parameters relating to noise's hidden states.
+- `noise_type`: A string indicating whether the noise is homoscedastic or heteroscedastic.
+- `batch_size`: An integer representing the number of batches of data.
+- `input_seq_len`: An integer representing the sequence length for LSTM inputs.
+- `output_seq_len`: An integer representing the sequence length for the outputs of the last layer.
+- `seq_stride`: An integer representing the spacing between sequences for the LSTM layer.
+- `multithreading`: A boolean indicating whether or not to run parallel computing using multiple threads.
+- `collect_derivative`: A boolean indicating whether to enable the derivative computation mode.
+- `is_full_cov`: A boolean indicating whether to enable the full covariance mode.
+- `init_method`: A string representing the initialization method, e.g., He and Xavier.
+- `device`: A string indicating either "cpu" or "cuda".
+- `ra_mt`: A float representing the momentum for the normalization layer.
+
+## Example
+
+```python
+from pytagi import NetProp
+
+class RegressionMLP(NetProp):
+ """Multi-layer perceptron for regression task"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1, 1]
+ self.nodes = [13, 50, 50, 1]
+ self.activations = [0, 4, 4, 0]
+ self.batch_size = 10
+ self.sigma_v = 0.3
+ self.sigma_v_min: float = 0.3
+ self.device = "cpu"
+```
+
+## Layer Code
+The following layer codes are used to represent different types of layers in the network:
+
+- 1: Fully-connected layer
+- 2: Convolutional layer
+- 21: Transpose convolutional layer
+- 3: Max pooling layer (currently not supported)
+- 4: Average pooling
+- 5: Layer normalization
+- 6: Batch normalization
+- 7: LSTM layer
+
+## Activation Code
+The following activation codes are used to represent different activation functions:
+
+- 0: No activation
+- 1: Tanh
+- 2: Sigmoid
+- 4: ReLU
+- 5: Softplus
+- 6: Leakyrelu
+- 7: Mixture ReLU
+- 8: Mixture bounded ReLU
+- 9: Mixture sigmoid
+- 10: Softmax with local linearization
+- 11: Remax
+- 12: Hierarchical softmax
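+
+As a second illustration of how these attributes fit together, the sketch below describes a heteroscedastic-noise MLP in which the output layer carries both the mean and the noise hidden states. The layer sizes and hyperparameter values are illustrative assumptions, not a prescribed configuration.
+
+```python
+from pytagi import NetProp
+
+
+class HeterosMLP(NetProp):
+    """Multi-layer perceptron with heteroscedastic noise (illustrative sketch)"""
+
+    def __init__(self) -> None:
+        super().__init__()
+        self.layers = [1, 1, 1, 1]       # fully-connected layers (layer code 1)
+        self.nodes = [1, 100, 100, 2]    # output = [mean, noise hidden state]
+        self.activations = [0, 4, 4, 0]  # ReLU (code 4) on the hidden layers
+        self.batch_size = 10
+        self.sigma_v = 0.0               # no homoscedastic observation noise
+        self.noise_gain = 1.0            # gain for the noise-related biases
+        self.noise_type = "heteros"      # heteroscedastic noise
+        self.init_method = "He"
+        self.device = "cpu"
+```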
diff --git a/api/network.md b/api/network.md
new file mode 100644
index 0000000..c04fca3
--- /dev/null
+++ b/api/network.md
@@ -0,0 +1,216 @@
+# The TagiNetwork class
+
+Python frontend calling the TAGI network in the C++/CUDA backend.
+
+## Attributes
+
+- `network`: Network wrapper that calls the TAGI network from the backend
+- `net_prop`: Network properties
+
+## *constructor* method
+
+> Constructor for the TagiNetwork class.
+
+```python
+def __init__(self, net_prop: NetProp) -> None:
+```
+
+**Parameters**
+- `net_prop`: An instance of the [NetProp class](api/netprop.md) representing the network properties.
+
+## *net_prop* getter method
+
+```python
+@property
+def net_prop(self) -> NetProp:
+    """Get network properties"""
+```
+
+**Returns**
+- `NetProp`: An instance of the [NetProp class](api/netprop.md).
+
+## *net_prop* setter method
+
+```python
+@net_prop.setter
+def net_prop(self, value: NetProp) -> None:
+ """Set network properties"""
+```
+
+**Parameters**
+- `value`: An instance of the [NetProp class](api/netprop.md) class representing the network properties.
+
+## *feed_forward* method
+
+```python
+def feed_forward(self, x_batch: np.ndarray,
+ Sx_batch: np.ndarray,
+ Sx_f_batch: np.ndarray) -> None:
+ """Forward pass the size of x_batch, Sx_batch (B, N)
+ where B is the batch size and N is the data dimension"""
+```
+
+**Parameters**
+- `x_batch`: Input data as a NumPy array.
+- `Sx_batch`: Diagonal variance of input data as a NumPy array.
+- `Sx_f_batch`: Full variance of input data as a NumPy array.
+
+## *connected_feed_forward* method
+
+```python
+def connected_feed_forward(self, ma: np.ndarray, va: np.ndarray,
+ mz: np.ndarray, vz: np.ndarray,
+ jcb: np.ndarray) -> None:
+ """Forward pass for the network that is connected to the other
+ network e.g., decoder network in autoencoder task where its inputs
+ are the outputs of the encoder network."""
+```
+
+**Parameters**
+- `ma`: Mean of activation units as a NumPy array.
+- `va`: Variance of activation units as a NumPy array.
+- `mz`: Mean of hidden states as a NumPy array.
+- `vz`: Variance of hidden states as a NumPy array.
+- `jcb`: Jacobian matrix (da/dz) as a NumPy array.
+
+## *state_feed_backward* method
+
+```python
+def state_feed_backward(self, y_batch: np.ndarray,
+ v_batch: np.ndarray,
+ ud_idx_batch: np.ndarray) -> None:
+ """Update hidden states the size of y_batch, V_batch (B, N)
+ where B is the batch size and N is the data dimension"""
+```
+
+**Parameters**
+- `y_batch`: Observations as a NumPy array.
+- `v_batch`: Variance of observations as a NumPy array.
+- `ud_idx_batch`: Updated indices for the last layer as a NumPy array.
+
+## *param_feed_backward* method
+
+```python
+def param_feed_backward(self) -> None:
+ """Update parameters"""
+```
+
+## *get_network_outputs* method
+
+```python
+def get_network_outputs(self) -> Tuple[np.ndarray, np.ndarray]:
+ """Get output layer's hidden state distribution"""
+```
+
+**Returns**
+- `ma`: Mean of activation units as a NumPy array.
+- `va`: Variance of activation units as a NumPy array.
+
+## *get_network_predictions* method
+
+```python
+def get_network_predictions(self) -> Tuple[np.ndarray, np.ndarray]:
+ """Get distribution of the predictions"""
+```
+
+**Returns**
+- `m_pred`: Mean of predictions as a NumPy array.
+- `v_pred`: Variance of predictions as a NumPy array.
+
+## *get_all_network_outputs* method
+
+```python
+def get_all_network_outputs(self) -> Tuple[np.ndarray, np.ndarray, np.ndarray,
+                                            np.ndarray, np.ndarray]:
+ """Get all hidden states of the output layers"""
+```
+
+**Returns**
+- `ma`: Mean of activations for the output layer as a NumPy array.
+- `va`: Variance of activations for the output layer as a NumPy array.
+- `mz`: Mean of hidden states for the output layer as a NumPy array.
+- `vz`: Variance of hidden states for the output layer as a NumPy array.
+- `jcb`: Jacobian matrix for the output layer as a NumPy array.
+
+## *get_all_network_inputs* method
+
+```python
+def get_all_network_inputs(self) -> Tuple[np.ndarray, np.ndarray, np.ndarray,
+                                           np.ndarray, np.ndarray]:
+    """Get all hidden states of the input layer"""
+```
+
+**Returns**
+- `ma`: Mean of activations for the input layer as a NumPy array.
+- `va`: Variance of activations for the input layer as a NumPy array.
+- `mz`: Mean of hidden states for the input layer as a NumPy array.
+- `vz`: Variance of hidden states for the input layer as a NumPy array.
+- `jcb`: Jacobian matrix for the input layer as a NumPy array.
+
+## *get_derivatives* method
+
+```python
+def get_derivatives(self, layer: int = 0) -> Tuple[np.ndarray, np.ndarray]:
+ """ Compute derivatives of the output layer w.r.t a given layer using TAGI"""
+```
+
+**Parameters**
+- `layer`: Layer index of the network.
+
+**Returns**
+- `mdy`: Mean values of derivatives as a NumPy array.
+- `vdy`: Variance values of derivatives as a NumPy array.
+
+## *get_inovation_mean_var* method
+
+```python
+def get_inovation_mean_var(self, layer: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Get updating quantities for the inovation"""
+```
+
+**Parameters**
+- `layer`: Layer index of the network.
+
+**Returns**
+- `delta_m`: Innovation mean as a NumPy array.
+- `delta_v`: Innovation variance as a NumPy array.
+
+## *get_state_delta_mean_var* method
+
+```python
+def get_state_delta_mean_var(self) -> None:
+    """Get updating quantities for the first layer"""
+```
+
+**Returns**
+- `delta_mz`: Updating quantities for the hidden-state mean of the first layer as a NumPy array.
+- `delta_vz`: Updating quantities for the hidden-state variance of the first layer as a NumPy array.
+
+## *set_parameters* method
+
+```python
+def set_parameters(self, param: Param) -> None:
+ """Set parameter values to network"""
+```
+
+**Parameters**
+- `param`: An instance of the [Param class](api/param.md) representing the parameter values.
+
+## *get_parameters* method
+
+```python
+def get_parameters(self) -> tagi.Param:
+ """Get parameters of network"""
+```
+
+**Returns**
+- `param`: An instance of the [Param class](api/param.md) representing the network parameters.
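+
+The sketch below combines the methods above into one update cycle (forward pass, hidden-state update, parameter update) for the regression MLP shown in the NetProp example. The array layouts, the zero input variances, the empty `ud_idx_batch`, and the import path are assumptions; check the repository examples for the exact conventions.
+
+```python
+import numpy as np
+
+from pytagi import NetProp, TagiNetwork  # assumed import path
+
+
+class RegressionMLP(NetProp):
+    """Toy MLP from the NetProp example"""
+
+    def __init__(self) -> None:
+        super().__init__()
+        self.layers = [1, 1, 1, 1]
+        self.nodes = [13, 50, 50, 1]
+        self.activations = [0, 4, 4, 0]
+        self.batch_size = 10
+        self.sigma_v = 0.3
+        self.device = "cpu"
+
+
+net_prop = RegressionMLP()
+net = TagiNetwork(net_prop)
+
+# One toy batch: 10 observations with 13 features each; shapes follow the
+# (B, N) convention described for feed_forward
+x_batch = np.random.rand(net_prop.batch_size, 13)
+y_batch = np.random.rand(net_prop.batch_size, 1)
+Sx_batch = np.zeros_like(x_batch)                     # no input noise
+Sx_f_batch = np.zeros(0)                              # full covariance disabled (assumption)
+v_batch = np.full_like(y_batch, net_prop.sigma_v**2)  # observation-noise variance
+ud_idx_batch = np.zeros(0, dtype=np.int32)            # unused for regression (assumption)
+
+net.feed_forward(x_batch, Sx_batch, Sx_f_batch)          # forward pass
+net.state_feed_backward(y_batch, v_batch, ud_idx_batch)  # update hidden states
+net.param_feed_backward()                                # update parameters
+
+m_pred, v_pred = net.get_network_predictions()           # predictive mean and variance
+```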
diff --git a/api/param.md b/api/param.md
new file mode 100644
index 0000000..6864b08
--- /dev/null
+++ b/api/param.md
@@ -0,0 +1,62 @@
+# The Param class
+
+The `Param` class is a frontend API for weights and biases.
+
+## Attributes
+
+- `mw`: Mean of weight parameters (Type: `np.ndarray`)
+- `Sw`: Variance of weight parameters (Type: `np.ndarray`)
+- `mb`: Mean of bias parameters (Type: `np.ndarray`)
+- `Sb`: Variance of bias parameters (Type: `np.ndarray`)
+- `mw_sc`: Mean of weight parameters for the residual network (Type: `np.ndarray`)
+- `Sw_sc`: Variance of weight parameters for the residual network (Type: `np.ndarray`)
+- `mb_sc`: Mean of bias parameters for the residual network (Type: `np.ndarray`)
+- `Sb_sc`: Variance of bias parameters for the residual network (Type: `np.ndarray`)
+
+## *constructor* method
+
+```python
+def __init__(self, mw: np.ndarray, Sw: np.ndarray, mb: np.ndarray,
+             Sb: np.ndarray, mw_sc: np.ndarray, Sw_sc: np.ndarray,
+             mb_sc: np.ndarray, Sb_sc: np.ndarray) -> None:
+    """Frontend API for weights and biases"""
+```
+
+> Initialize an instance of the `Param` class.
+
+**Parameters:**
+- `mw` (numpy.ndarray): Mean of weight parameters.
+- `Sw` (numpy.ndarray): Variance of weight parameters.
+- `mb` (numpy.ndarray): Mean of bias parameters.
+- `Sb` (numpy.ndarray): Variance of bias parameters.
+- `mw_sc` (numpy.ndarray): Mean of weight parameters for the residual network.
+- `Sw_sc` (numpy.ndarray): Variance of weight parameters for the residual network.
+- `mb_sc` (numpy.ndarray): Mean of bias parameters for the residual network.
+- `Sb_sc` (numpy.ndarray): Variance of bias parameters for the residual network.
+
+**Example**
+
+```python
+from pytagi import Param
+import numpy as np
+
+mw = np.array([0.5, 0.6, 0.7])
+Sw = np.array([0.1, 0.2, 0.3])
+mb = np.array([0.1, 0.2])
+Sb = np.array([0.01, 0.02])
+mw_sc = np.array([0.8, 0.9, 1.0])
+Sw_sc = np.array([0.05, 0.1, 0.15])
+mb_sc = np.array([0.05, 0.1])
+Sb_sc = np.array([0.005, 0.01])
+
+param = Param(mw, Sw, mb, Sb, mw_sc, Sw_sc, mb_sc, Sb_sc)
+```
diff --git a/api/utils.md b/api/utils.md
new file mode 100644
index 0000000..acb6e01
--- /dev/null
+++ b/api/utils.md
@@ -0,0 +1,415 @@
+# TAGI Utils
+
+---
+
+# The HierarchicalSoftmax class
+
+Hierarchical softmax wrapper. Further details can be found [here](https://building-babylon.net/2017/08/01/hierarchical-softmax).
+
+## *constructor* method
+
+> Constructor for the HierarchicalSoftmax class.
+
+```python
+def __init__(self) -> None:
+ super().__init__()
+```
+
+**Note:** The `super()` function is used to call the constructor of the base class `HrSoftmax`.
+
+---
+
+# The Utils class
+
+Frontend for utility functions from C++/CUDA backend.
+
+## Attributes
+
+- `backend_utils`: Utility functionalities from the backend.
+
+## *constructor* method
+
+> Constructor for the Utils class.
+
+```python
+def __init__(self) -> None:
+```
+
+## *label_to_obs* method
+
+```python
+def label_to_obs(self, labels: np.ndarray,
+ num_classes: int) -> Tuple[np.ndarray, np.ndarray, int]:
+ """Get observations and observation indices of the binary tree for classification"""
+
+```
+
+**Parameters**
+- `labels`: Labels of the dataset as a NumPy array.
+- `num_classes`: Total number of classes.
+
+**Returns**
+- `obs`: Encoded observations of the labels as a NumPy array.
+- `obs_idx`: Indices of the encoded observations in the output vector as a NumPy array.
+- `num_obs`: Number of encoded observations.
+
+## *label_to_one_hot* method
+
+```python
+def label_to_one_hot(self, labels: np.ndarray, num_classes: int) -> np.ndarray:
+ """Get the one hot encoder for each class"""
+
+```
+
+**Parameters**
+- `labels`: Labels of the dataset as a NumPy array.
+- `num_classes`: Total number of classes.
+
+**Returns**
+- `one_hot`: One hot encoder as a NumPy array.
+
+## *load_mnist_images* method
+
+```python
+def load_mnist_images(self, image_file: str, label_file: str,
+ num_images: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Load mnist dataset"""
+
+```
+
+**Parameters**
+- `image_file`: Location of the MNIST image file.
+- `label_file`: Location of the MNIST label file.
+- `num_images`: Number of images to be loaded.
+
+**Returns**
+- `images`: Image dataset as a NumPy array.
+- `labels`: Label dataset as a NumPy array.
+- `num_images`: Total number of images.
+
+## *load_cifar_images* method
+
+```python
+def load_cifar_images(self, image_file: str,
+ num: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Load cifar dataset"""
+
+```
+
+**Parameters**
+- `image_file`: Location of the image file.
+- `num`: Number of images to be loaded.
+
+**Returns**
+- `images`: Image dataset as a NumPy array.
+- `labels`: Label dataset as a NumPy array.
+
+## *get_labels* method
+
+```python
+def get_labels(self, ma: np.ndarray, Sa: np.ndarray,
+ hr_softmax: HierarchicalSoftmax, num_classes: int,
+ batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Convert last layer's hidden state to labels"""
+```
+
+**Parameters**
+- `ma`: Mean of activation units for the output layer as a NumPy array.
+- `Sa`: Variance of activation units for the output layer as a NumPy array.
+- `hr_softmax`: Hierarchical softmax.
+- `num_classes`: Total number of classes.
+- `batch_size`: Number of data in a batch.
+
+**Returns**
+- `pred`: Label prediction as a NumPy array.
+- `prob`: Probability for each label as a NumPy array.
+
+## *get_errors* method
+
+```python
+def get_errors(self, ma: np.ndarray, Sa: np.ndarray, labels: np.ndarray,
+ hr_softmax: HierarchicalSoftmax, num_classes: int,
+ batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Convert last layer's hidden state to labels"""
+```
+
+**Parameters**
+- `ma`: Mean of activation units for the output layer as a NumPy array.
+- `Sa`: Variance of activation units for the output layer as a NumPy array.
+- `labels`: Label dataset as a NumPy array.
+- `hr_softmax`: Hierarchical softmax.
+- `num_classes`: Total number of classes.
+- `batch_size`: Number of data in a batch.
+
+**Returns**
+- `pred`: Label prediction as a NumPy array.
+- `prob`: Probability for each label as a NumPy array.
+
+## *get_hierarchical_softmax* method
+
+```python
+def get_hierarchical_softmax(self, num_classes: int) -> HierarchicalSoftmax:
+ """Convert labels to binary tree"""
+```
+
+**Parameters**
+- `num_classes`: Total number of classes.
+
+**Returns**
+- `hr_softmax`: Hierarchical softmax.
+
+## *obs_to_label_prob* method
+
+```python
+def obs_to_label_prob(self, ma: np.ndarray, Sa: np.ndarray,
+ hr_softmax: HierarchicalSoftmax,
+ num_classes: int) -> np.ndarray:
+ """Convert observation to label probabilities"""
+```
+
+**Parameters**
+- `ma`: Mean of activation units for the output layer as a NumPy array.
+- `Sa`: Variance of activation units for the output layer as a NumPy array.
+- `hr_softmax`: Hierarchical softmax.
+- `num_classes`: Total number of classes.
+
+**Returns**
+- `prob`: Probability for each label as a NumPy array.
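+
+The classification-related utilities above can be chained as in the sketch below. The import path and the MNIST file locations are assumptions, and the unpacking follows the two-element return annotation of `load_mnist_images`.
+
+```python
+import numpy as np
+
+from pytagi import Utils  # assumed import path
+
+utils = Utils()
+num_classes = 10
+
+# Load the raw MNIST files (hypothetical locations)
+images, labels = utils.load_mnist_images(
+    image_file="data/mnist/train-images-idx3-ubyte",
+    label_file="data/mnist/train-labels-idx1-ubyte",
+    num_images=60000,
+)
+
+# Build the binary tree and encode the labels for hierarchical softmax
+hr_softmax = utils.get_hierarchical_softmax(num_classes=num_classes)
+obs, obs_idx, num_obs = utils.label_to_obs(labels=labels, num_classes=num_classes)
+one_hot = utils.label_to_one_hot(labels=labels, num_classes=num_classes)
+
+# After a forward pass, the output statistics (ma, Sa) can be decoded with
+# utils.get_labels(ma, Sa, hr_softmax, num_classes, batch_size).
+```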
+
+## *create_rolling_window* method
+
+```python
+def create_rolling_window(self, data: np.ndarray, output_col: np.ndarray,
+ input_seq_len: int, output_seq_len: int,
+ num_features: int,
+ stride: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Create rolling window for time series data"""
+```
+
+**Parameters**
+- `data`: Dataset as a NumPy array.
+- `output_col`: Indices of the output columns as a NumPy array.
+- `input_seq_len`: Length of the input sequence.
+- `output_seq_len`: Length of the output sequence.
+- `num_features`: Number of features.
+- `stride`: Controls the number of steps for the window movements.
+
+**Returns**
+- `input_data`: Input data for neural networks in sequence as a NumPy array.
+- `output_data`: Output data for neural networks in sequence as a NumPy array.
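+
+A short sketch of the rolling-window construction is given below; the import path and the toy series are assumptions.
+
+```python
+import numpy as np
+
+from pytagi import Utils  # assumed import path
+
+utils = Utils()
+
+# Toy series: 500 time steps, 2 features; forecast the first column one step ahead
+data = np.random.rand(500, 2)
+input_data, output_data = utils.create_rolling_window(
+    data=data,
+    output_col=np.array([0]),
+    input_seq_len=24,   # 24 past steps form one input sequence
+    output_seq_len=1,   # 1 future step forms the target
+    num_features=2,
+    stride=1,           # slide the window one step at a time
+)
+```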
+
+## *get_upper_triu_cov* method
+
+```python
+def get_upper_triu_cov(self, batch_size: int, num_data: int,
+                       sigma: float) -> np.ndarray:
+    """Create an upper triangle covariance matrix for inputs"""
+```
+
+**Parameters**
+- `batch_size`: Batch size as an integer.
+- `num_data`: Number of data as an integer.
+- `sigma`: Sigma value as a float.
+
+**Returns**
+- `vx_f`: Upper triangle covariance matrix for inputs as a NumPy array.
+
+---
+
+## *exponential_scheduler* function
+
+```python
+def exponential_scheduler(curr_v: float, min_v: float, decaying_factor: float,
+ curr_iter: float) -> float:
+ """Exponentially decaying"""
+```
+
+**Parameters**
+- `curr_v`: Current value as a float.
+- `min_v`: Minimum value as a float.
+- `decaying_factor`: Decaying factor as a float.
+- `curr_iter`: Current iteration as a float.
+
+**Returns**
+- `float`: A float representing the result of the exponential decay calculation. The returned value is the maximum of `curr_v * (decaying_factor**curr_iter)` and `min_v`.
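+
+For instance, following the formula above (a sketch; the import path is an assumption):
+
+```python
+from pytagi import exponential_scheduler  # assumed import path
+
+# max(curr_v * decaying_factor**curr_iter, min_v)
+sigma_v_10 = exponential_scheduler(curr_v=1.0, min_v=0.3,
+                                   decaying_factor=0.95, curr_iter=10)  # ~0.599
+sigma_v_30 = exponential_scheduler(curr_v=1.0, min_v=0.3,
+                                   decaying_factor=0.95, curr_iter=30)  # clipped to 0.3
+```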
+
+---
+
+# The Normalizer class
+
+Different methods to normalize the data before feeding it to neural networks.
+
+## *constructor* method
+
+> Constructor for the Normalizer class.
+
+```python
+def __init__(self, method: Union[str, None] = None) -> None:
+```
+
+**Parameters**
+- `method`: Optional. A string representing the normalization method.
+
+## *standardize* method
+
+```python
+def standardize(self, data: np.ndarray, mu: np.ndarray, std: np.ndarray) -> np.ndarray:
+ """Z-score normalization where data_norm = (data - data_mean) / data_std """
+```
+
+**Parameters**
+- `data`: Input data as a NumPy array.
+- `mu`: Mean values of the data as a NumPy array.
+- `std`: Standard deviation values of the data as a NumPy array.
+
+**Returns**
+- `np.ndarray`: Normalized data as a NumPy array.
+
+## *unstandardize* method
+
+```python
+@staticmethod
+def unstandardize(norm_data: np.ndarray, mu: np.ndarray, std: np.ndarray) -> np.ndarray:
+ """Transform standardized data to original space"""
+```
+
+**Parameters**
+- `norm_data`: Standardized data as a NumPy array.
+- `mu`: Mean values of the original data as a NumPy array.
+- `std`: Standard deviation values of the original data as a NumPy array.
+
+**Returns**
+- `np.ndarray`: Unstandardized data in the original space as a NumPy array.
+
+## *unstandardize_std* method
+
+```python
+@staticmethod
+def unstandardize_std(norm_std: np.ndarray, std: np.ndarray) -> np.ndarray:
+ """Transform standardized std to original space"""
+```
+
+**Parameters**
+- `norm_std`: Standardized standard deviation values as a NumPy array.
+- `std`: Standard deviation values of the original data as a NumPy array.
+
+**Returns**
+- `np.ndarray`: Unstandardized standard deviation values in the original space as a NumPy array.
+
+## *max_min_norm* method
+
+```python
+def max_min_norm(self, data: np.ndarray, max_value: np.ndarray, min_value: np.ndarray) -> np.ndarray:
+ """Normalize the data between 0 and 1"""
+```
+
+**Parameters**
+- `data`: Input data as a NumPy array.
+- `max_value`: Maximum values of the data as a NumPy array.
+- `min_value`: Minimum values of the data as a NumPy array.
+
+**Returns**
+- `np.ndarray`: Normalized data between 0 and 1 as a NumPy array.
+
+## *max_min_unnorm* method
+
+```python
+@staticmethod
+def max_min_unnorm(norm_data: np.ndarray, max_value: np.ndarray, min_value: np.ndarray) -> np.ndarray:
+ """Transform max-min normalized data to original space"""
+```
+
+**Parameters**
+- `norm_data`: Max-min normalized data as a NumPy array.
+- `max_value`: Maximum values of the original data as a NumPy array.
+- `min_value`: Minimum values of the original data as a NumPy array.
+
+**Returns**
+- `np.ndarray`: Unnormalized data in the original space as a NumPy array.
+
+## *max_min_unnorm_std* method
+
+```python
+@staticmethod
+def max_min_unnorm_std(norm_std: np.ndarray, max_value: np.ndarray, min_value: np.ndarray) -> np.ndarray:
+ """Transform max-min normalized std to original space"""
+```
+
+**Parameters**
+- `norm_std`: Max-min normalized standard deviation values as a NumPy array.
+- `max_value`: Maximum values of the original data as a NumPy array.
+- `min_value`: Minimum values of the original data as a NumPy array.
+
+**Returns**
+- `np.ndarray`: Unnormalized standard deviation values in the original space as a NumPy array.
+
+## *compute_mean_std* method
+
+```python
+@staticmethod
+def compute_mean_std(data: np.ndarray) -> Tuple[np.ndarray, np.ndarray]:
+ """Compute sample mean and standard deviation"""
+```
+
+**Parameters**
+- `data`: Input data as a NumPy array.
+
+**Returns**
+- `Tuple[np.ndarray, np.ndarray]`: A tuple containing the sample mean and standard deviation as NumPy arrays.
+
+## *compute_max_min* method
+
+```python
+@staticmethod
+def compute_max_min(data: np.ndarray) -> Tuple[np.ndarray, np.ndarray]:
+ """Compute max min values"""
+```
+
+**Parameters**
+- `data`: Input data as a NumPy array.
+
+**Returns**
+- `Tuple[np.ndarray, np.ndarray]`: A tuple containing the maximum and minimum values of the data as NumPy arrays.
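+
+A round-trip sketch using the standardization helpers above is shown below; the bare `Normalizer()` constructor call (leaving `method` as `None`) and the import path are assumptions.
+
+```python
+import numpy as np
+
+from pytagi import Normalizer  # assumed import path
+
+data = np.random.rand(100, 3)
+
+# Compute statistics, standardize, then map back to the original space
+mu, std = Normalizer.compute_mean_std(data)
+normalizer = Normalizer()
+data_norm = normalizer.standardize(data=data, mu=mu, std=std)
+data_back = Normalizer.unstandardize(norm_data=data_norm, mu=mu, std=std)
+
+assert np.allclose(data, data_back)
+```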
+
+---
+
+## *load_param_from_files* function
+
+Load parameters from CSV files and return them as an instance of the Param class.
+
+```python
+def load_param_from_files(mw_file: str, Sw_file: str, mb_file: str,
+ Sb_file: str, mw_sc_file: str, Sw_sc_file: str,
+ mb_sc_file: str, Sb_sc_file: str) -> Param:
+ """Load parameters from CSV files"""
+```
+
+**Parameters**
+- `mw_file`: Path to the CSV file containing mw values.
+- `Sw_file`: Path to the CSV file containing Sw values.
+- `mb_file`: Path to the CSV file containing mb values.
+- `Sb_file`: Path to the CSV file containing Sb values.
+- `mw_sc_file`: Path to the CSV file containing mw_sc values.
+- `Sw_sc_file`: Path to the CSV file containing Sw_sc values.
+- `mb_sc_file`: Path to the CSV file containing mb_sc values.
+- `Sb_sc_file`: Path to the CSV file containing Sb_sc values.
+
+**Returns**
+- `Param`: An instance of the Param class containing the loaded parameter values.
+
+Note: The function assumes that the CSV files have no headers.
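+
+A usage sketch is shown below; the CSV locations are hypothetical and the import path is an assumption.
+
+```python
+from pytagi import load_param_from_files  # assumed import path
+
+# Hypothetical CSV files exported from a previously trained network
+param = load_param_from_files(
+    mw_file="saved/mw.csv", Sw_file="saved/Sw.csv",
+    mb_file="saved/mb.csv", Sb_file="saved/Sb.csv",
+    mw_sc_file="saved/mw_sc.csv", Sw_sc_file="saved/Sw_sc.csv",
+    mb_sc_file="saved/mb_sc.csv", Sb_sc_file="saved/Sb_sc.csv",
+)
+
+# The resulting Param instance can then be loaded into a TagiNetwork
+# with net.set_parameters(param).
+```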
diff --git a/assets/.DS_Store b/assets/.DS_Store
new file mode 100644
index 0000000..a164caf
Binary files /dev/null and b/assets/.DS_Store differ
diff --git a/assets/docsify-themeable.min.js b/assets/docsify-themeable.min.js
new file mode 100644
index 0000000..3068c5d
--- /dev/null
+++ b/assets/docsify-themeable.min.js
@@ -0,0 +1,9 @@
+/*!
+ * docsify-themeable
+ * v0.9.0
+ * https://jhildenbiddle.github.io/docsify-themeable/
+ * (c) 2018-2022 John Hildenbiddle
+ * MIT license
+ */
+!function(){"use strict";function e(){return e=Object.assign||function(e){for(var t=1;t1&&void 0!==arguments[1]?arguments[1]:{},n={mimeType:t.mimeType||null,onBeforeSend:t.onBeforeSend||Function.prototype,onSuccess:t.onSuccess||Function.prototype,onError:t.onError||Function.prototype,onComplete:t.onComplete||Function.prototype},r=Array.isArray(e)?e:[e],o=Array.apply(null,Array(r.length)).map((function(e){return null}));function a(e){var t="string"==typeof e,n=t&&"<"===e.trim().charAt(0);return t&&!n}function s(e,t){n.onError(e,r[t],t)}function c(e,t){var a=n.onSuccess(e,r[t],t);e=!1===a?"":a||e,o[t]=e,-1===o.indexOf(null)&&n.onComplete(o)}var i=document.createElement("a");r.forEach((function(e,t){if(i.setAttribute("href",e),i.href=String(i.href),Boolean(document.all&&!window.atob)&&i.host.split(":")[0]!==location.host.split(":")[0]){if(i.protocol===location.protocol){var r=new XDomainRequest;r.open("GET",e),r.timeout=0,r.onprogress=Function.prototype,r.ontimeout=Function.prototype,r.onload=function(){var e=r.responseText;a(e)?c(e,t):s(r,t)},r.onerror=function(e){s(r,t)},setTimeout((function(){r.send()}),0)}else console.warn("Internet Explorer 9 Cross-Origin (CORS) requests must use the same protocol (".concat(e,")")),s(null,t)}else{var o=new XMLHttpRequest;o.open("GET",e),n.mimeType&&o.overrideMimeType&&o.overrideMimeType(n.mimeType),n.onBeforeSend(o,e,t),o.onreadystatechange=function(){if(4===o.readyState){var e=o.responseText;o.status<400&&a(e)||0===o.status&&a(e)?c(e,t):s(o,t)}},o.send()}}))}function n(e){var n=/\/\*[\s\S]+?\*\//g,o=/(?:@import\s*)(?:url\(\s*)?(?:['"])([^'"]*)(?:['"])(?:\s*\))?(?:[^;]*;)/g,a={rootElement:e.rootElement||document,include:e.include||'style,link[rel="stylesheet"]',exclude:e.exclude||null,filter:e.filter||null,skipDisabled:!1!==e.skipDisabled,useCSSOM:e.useCSSOM||!1,onBeforeSend:e.onBeforeSend||Function.prototype,onSuccess:e.onSuccess||Function.prototype,onError:e.onError||Function.prototype,onComplete:e.onComplete||Function.prototype},s=Array.apply(null,a.rootElement.querySelectorAll(a.include)).filter((function(e){return t=e,n=a.exclude,!(t.matches||t.matchesSelector||t.webkitMatchesSelector||t.mozMatchesSelector||t.msMatchesSelector||t.oMatchesSelector).call(t,n);var t,n})),c=Array.apply(null,Array(s.length)).map((function(e){return null}));function i(){if(-1===c.indexOf(null)){c.reduce((function(e,t,n){return""===t&&e.push(n),e}),[]).reverse().forEach((function(e){return[s,c].forEach((function(t){return t.splice(e,1)}))}));var e=c.join("");a.onComplete(e,c,s)}}function u(e,t,n,r){var o=a.onSuccess(e,n,r);d(e=void 0!==o&&!1===Boolean(o)?"":o||e,n,r,(function(e,r){null===c[t]&&(r.forEach((function(e){return a.onError(e.xhr,n,e.url)})),!a.filter||a.filter.test(e)?c[t]=e:c[t]="",i())}))}function l(e,t){var a=arguments.length>2&&void 0!==arguments[2]?arguments[2]:[],s={};return s.rules=(e.replace(n,"").match(o)||[]).filter((function(e){return-1===a.indexOf(e)})),s.urls=s.rules.map((function(e){return e.replace(o,"$1")})),s.absoluteUrls=s.urls.map((function(e){return r(e,t)})),s.absoluteRules=s.rules.map((function(e,n){var o=s.urls[n],a=r(s.absoluteUrls[n],t);return e.replace(o,a)})),s}function d(e,n,r,o){var s=arguments.length>4&&void 0!==arguments[4]?arguments[4]:[],c=arguments.length>5&&void 0!==arguments[5]?arguments[5]:[],i=l(e,r,c);i.rules.length?t(i.absoluteUrls,{onBeforeSend:function(e,t,r){a.onBeforeSend(e,n,t)},onSuccess:function(e,t,r){var o=a.onSuccess(e,n,t),s=l(e=!1===o?"":o||e,t,c);return 
s.rules.forEach((function(t,n){e=e.replace(t,s.absoluteRules[n])})),e},onError:function(t,a,u){s.push({xhr:t,url:a}),c.push(i.rules[u]),d(e,n,r,o,s,c)},onComplete:function(t){t.forEach((function(t,n){e=e.replace(i.rules[n],t)})),d(e,n,r,o,s,c)}}):o(e,s)}s.length?s.forEach((function(e,n){var o=e.getAttribute("href"),s=e.getAttribute("rel"),l="link"===e.nodeName.toLowerCase()&&o&&s&&-1!==s.toLowerCase().indexOf("stylesheet"),d=!1!==a.skipDisabled&&e.disabled,f="style"===e.nodeName.toLowerCase();if(l&&!d)t(o,{mimeType:"text/css",onBeforeSend:function(t,n,r){a.onBeforeSend(t,e,n)},onSuccess:function(t,a,s){var c=r(o);u(t,n,e,c)},onError:function(t,r,o){c[n]="",a.onError(t,e,r),i()}});else if(f&&!d){var m=e.textContent;a.useCSSOM&&(m=Array.apply(null,e.sheet.cssRules).map((function(e){return e.cssText})).join("")),u(m,n,e,location.href)}else c[n]="",i()})):a.onComplete("",[])}function r(e,t){var n=document.implementation.createHTMLDocument(""),r=n.createElement("base"),o=n.createElement("a");return n.head.appendChild(r),n.body.appendChild(o),r.href=t||document.baseURI||(document.querySelector("base")||{}).href||location.href,o.href=e,o.href}var o=a;function a(e,t,n){e instanceof RegExp&&(e=s(e,n)),t instanceof RegExp&&(t=s(t,n));var r=c(e,t,n);return r&&{start:r[0],end:r[1],pre:n.slice(0,r[0]),body:n.slice(r[0]+e.length,r[1]),post:n.slice(r[1]+t.length)}}function s(e,t){var n=t.match(e);return n?n[0]:null}function c(e,t,n){var r,o,a,s,c,i=n.indexOf(e),u=n.indexOf(t,i+1),l=i;if(i>=0&&u>0){if(e===t)return[i,u];for(r=[],a=n.length;l>=0&&!c;)l==i?(r.push(l),i=n.indexOf(e,l+1)):1==r.length?c=[r.pop(),u]:((o=r.pop())=0?i:u;r.length&&(c=[a,s])}return c}function i(t){var n=arguments.length>1&&void 0!==arguments[1]?arguments[1]:{},r={preserveStatic:!0,removeComments:!1},a=e({},r,n),s=[];function c(e){throw new Error("CSS parse error: ".concat(e))}function i(e){var n=e.exec(t);if(n)return t=t.slice(n[0].length),n}function u(){return i(/^{\s*/)}function l(){return i(/^}/)}function d(){i(/^\s*/)}function f(){if(d(),"/"===t[0]&&"*"===t[1]){for(var e=2;t[e]&&("*"!==t[e]||"/"!==t[e+1]);)e++;if(!t[e])return c("end of comment is missing");var n=t.slice(2,e);return t=t.slice(e+2),{type:"comment",comment:n}}}function m(){for(var e,t=[];e=f();)t.push(e);return a.removeComments?[]:t}function p(){for(d();"}"===t[0];)c("extra closing bracket");var e=i(/^(("(?:\\"|[^"])*"|'(?:\\'|[^'])*'|[^{])+)/);if(e){var n,r=e[0].trim();/\/\*/.test(r)&&(r=r.replace(/\/\*([^*]|[\r\n]|(\*+([^*/]|[\r\n])))*\*\/+/g,""));var o=/["']\w*,\w*["']/.test(r);return o&&(r=r.replace(/"(?:\\"|[^"])*"|'(?:\\'|[^'])*'/g,(function(e){return e.replace(/,/g,"")}))),n=/,/.test(r)?r.split(/\s*(?![^(]*\)),\s*/):[r],o&&(n=n.map((function(e){return e.replace(/\u200C/g,",")}))),n}}function v(){if("@"===t[0])return O();i(/^([;\s]*)+/);var e=/\/\*[^*]*\*+([^/*][^*]*\*+)*\//g,n=i(/^(\*?[-#/*\\\w.]+(\[[0-9a-z_-]+\])?)\s*/);if(n){if(n=n[0].trim(),!i(/^:\s*/))return c("property missing ':'");var r=i(/^((?:\/\*.*?\*\/|'(?:\\'|.)*?'|"(?:\\"|.)*?"|\((\s*'(?:\\'|.)*?'|"(?:\\"|.)*?"|[^)]*?)\s*\)|[^};])+)/),o={type:"declaration",property:n.replace(e,""),value:r?r[0].replace(e,"").trim():""};return i(/^[;\s]*/),o}}function h(){if(!u())return c("missing '{'");for(var e,t=m();e=v();)t.push(e),t=t.concat(m());return l()?t:c("missing '}'")}function y(){d();for(var e,t=[];e=i(/^((\d+\.\d+|\.\d+|\d+)%?|[a-z]+)\s*/);)t.push(e[1]),i(/^,\s*/);if(t.length)return{type:"keyframe",values:t,declarations:h()}}function g(){var e=i(/^@([-\w]+)?keyframes\s*/);if(e){var 
t=e[1];if(!(e=i(/^([-\w]+)\s*/)))return c("@keyframes missing name");var n,r=e[1];if(!u())return c("@keyframes missing '{'");for(var o=m();n=y();)o.push(n),o=o.concat(m());return l()?{type:"keyframes",name:r,vendor:t,keyframes:o}:c("@keyframes missing '}'")}}function b(){if(i(/^@page */))return{type:"page",selectors:p()||[],declarations:h()}}function w(){var e=i(/@(top|bottom|left|right)-(left|center|right|top|middle|bottom)-?(corner)?\s*/);if(e)return{type:"page-margin-box",name:"".concat(e[1],"-").concat(e[2])+(e[3]?"-".concat(e[3]):""),declarations:h()}}function E(){if(i(/^@font-face\s*/))return{type:"font-face",declarations:h()}}function S(){var e=i(/^@supports *([^{]+)/);if(e)return{type:"supports",supports:e[1].trim(),rules:M()}}function C(){if(i(/^@host\s*/))return{type:"host",rules:M()}}function x(){var e=i(/^@media([^{]+)*/);if(e)return{type:"media",media:(e[1]||"").trim(),rules:M()}}function A(){var e=i(/^@custom-media\s+(--[^\s]+)\s*([^{;]+);/);if(e)return{type:"custom-media",name:e[1].trim(),media:e[2].trim()}}function L(){var e=i(/^@([-\w]+)?document *([^{]+)/);if(e)return{type:"document",document:e[2].trim(),vendor:e[1]?e[1].trim():null,rules:M()}}function k(){var e=i(/^@(import|charset|namespace)\s*([^;]+);/);if(e)return{type:e[1],name:e[2].trim()}}function O(){if(d(),"@"===t[0]){var e=k()||E()||x()||g()||S()||L()||A()||C()||b()||w();if(e&&!a.preserveStatic){var n=!1;if(e.declarations)n=e.declarations.some((function(e){return/var\(/.test(e.value)}));else n=(e.keyframes||e.rules||[]).some((function(e){return(e.declarations||[]).some((function(e){return/var\(/.test(e.value)}))}));return n?e:{}}return e}}function _(){if(!a.preserveStatic){var e=o("{","}",t);if(e){var n=/:(?:root|host)(?![.:#(])/.test(e.pre)&&/--\S*\s*:/.test(e.body),r=/var\(/.test(e.body);if(!n&&!r)return t=t.slice(e.end+1),{}}}var s=p()||[],i=a.preserveStatic?h():h().filter((function(e){var t=s.some((function(e){return/:(?:root|host)(?![.:#(])/.test(e)}))&&/^--\S/.test(e.property),n=/var\(/.test(e.value);return t||n}));return s.length||c("selector missing"),{type:"rule",selectors:s,declarations:i}}function M(e){if(!e&&!u())return c("missing '{'");for(var n,r=m();t.length&&(e||"}"!==t[0])&&(n=O()||_());)n.type&&r.push(n),r=r.concat(m());return e||l()?r:c("missing '}'")}return{type:"stylesheet",stylesheet:{rules:M(!0),errors:s}}}function u(t){var n=arguments.length>1&&void 0!==arguments[1]?arguments[1]:{},r={parseHost:!1,store:{},onWarning:function(){}},o=e({},r,n),a=new RegExp(":".concat(o.parseHost?"host":"root","$"));return"string"==typeof t&&(t=i(t,o)),t.stylesheet.rules.forEach((function(e){"rule"===e.type&&e.selectors.some((function(e){return a.test(e)}))&&e.declarations.forEach((function(e,t){var n=e.property,r=e.value;n&&0===n.indexOf("--")&&(o.store[n]=r)}))})),o.store}function l(e){var t=arguments.length>1&&void 0!==arguments[1]?arguments[1]:"",n=arguments.length>2?arguments[2]:void 0,r={charset:function(e){return"@charset "+e.name+";"},comment:function(e){return 0===e.comment.indexOf("__CSSVARSPONYFILL")?"/*"+e.comment+"*/":""},"custom-media":function(e){return"@custom-media "+e.name+" "+e.media+";"},declaration:function(e){return e.property+":"+e.value+";"},document:function(e){return"@"+(e.vendor||"")+"document "+e.document+"{"+o(e.rules)+"}"},"font-face":function(e){return"@font-face{"+o(e.declarations)+"}"},host:function(e){return"@host{"+o(e.rules)+"}"},import:function(e){return"@import "+e.name+";"},keyframe:function(e){return 
e.values.join(",")+"{"+o(e.declarations)+"}"},keyframes:function(e){return"@"+(e.vendor||"")+"keyframes "+e.name+"{"+o(e.keyframes)+"}"},media:function(e){return"@media "+e.media+"{"+o(e.rules)+"}"},namespace:function(e){return"@namespace "+e.name+";"},page:function(e){return"@page "+(e.selectors.length?e.selectors.join(", "):"")+"{"+o(e.declarations)+"}"},"page-margin-box":function(e){return"@"+e.name+"{"+o(e.declarations)+"}"},rule:function(e){var t=e.declarations;if(t.length)return e.selectors.join(",")+"{"+o(t)+"}"},supports:function(e){return"@supports "+e.supports+"{"+o(e.rules)+"}"}};function o(e){for(var o="",a=0;a1&&void 0!==arguments[1]?arguments[1]:{},r={preserveStatic:!0,preserveVars:!1,variables:{},onWarning:function(){}},o=e({},r,n);return"string"==typeof t&&(t=i(t,o)),d(t.stylesheet,(function(e,t){for(var n=0;n1&&void 0!==arguments[1]?arguments[1]:{},n=arguments.length>2?arguments[2]:void 0;if(-1===e.indexOf("var("))return e;var r=o("(",")",e);function a(e){var r=e.split(",")[0].replace(/[\s\n\t]/g,""),o=(e.match(/(?:\s*,\s*){1}(.*)?/)||[])[1],a=Object.prototype.hasOwnProperty.call(t.variables,r)?String(t.variables[r]):void 0,s=a||(o?String(o):void 0),c=n||e;return a||t.onWarning('variable "'.concat(r,'" is undefined')),s&&"undefined"!==s&&s.length>0?p(s,t,c):"var(".concat(c,")")}if(r){if("var"===r.pre.slice(-3)){var s=0===r.body.trim().length;return s?(t.onWarning("var() must contain a non-whitespace string"),e):r.pre.slice(0,-3)+a(r.body)+p(r.post,t)}return r.pre+"(".concat(p(r.body,t),")")+p(r.post,t)}return-1!==e.indexOf("var(")&&t.onWarning('missing closing ")" in the value "'.concat(e,'"')),e}var v="undefined"!=typeof window,h=v&&window.CSS&&window.CSS.supports&&window.CSS.supports("(--a: 0)"),y={group:0,job:0},g={rootElement:v?document:null,shadowDOM:!1,include:"style,link[rel=stylesheet]",exclude:"",variables:{},onlyLegacy:!0,preserveStatic:!0,preserveVars:!1,silent:!1,updateDOM:!0,updateURLs:!0,watch:null,onBeforeSend:function(){},onError:function(){},onWarning:function(){},onSuccess:function(){},onComplete:function(){},onFinally:function(){}},b={cssComments:/\/\*[\s\S]+?\*\//g,cssKeyframes:/@(?:-\w*-)?keyframes/,cssMediaQueries:/@media[^{]+\{([\s\S]+?})\s*}/g,cssUrls:/url\((?!['"]?(?:data|http|\/\/):)['"]?([^'")]*)['"]?\)/g,cssVarDeclRules:/(?::(?:root|host)(?![.:#(])[\s,]*[^{]*{\s*[^}]*})/g,cssVarDecls:/(?:[\s;]*)(-{2}\w[\w-]*)(?:\s*:\s*)([^;]*);/g,cssVarFunc:/var\(\s*--[\w-]/,cssVars:/(?:(?::(?:root|host)(?![.:#(])[\s,]*[^{]*{\s*[^;]*;*\s*)|(?:var\(\s*))(--[^:)]+)(?:\s*[:)])/},w={dom:{},job:{},user:{}},E=!1,S=null,C=0,x=null,A=!1;function L(){var t=arguments.length>0&&void 0!==arguments[0]?arguments[0]:{},r="cssVars(): ",o=e({},g,t);function a(e,t,n,a){!o.silent&&window.console&&console.error("".concat(r).concat(e,"\n"),t),o.onError(e,t,n,a)}function s(e){!o.silent&&window.console&&console.warn("".concat(r).concat(e)),o.onWarning(e)}function c(e){o.onFinally(Boolean(e),h,N()-o.__benchmark)}if(v){if(o.watch)return o.watch=g.watch,k(o),void L(o);if(!1===o.watch&&S&&(S.disconnect(),S=null),!o.__benchmark){if(E===o.rootElement)return void O(t);var d=[].slice.call(o.rootElement.querySelectorAll('[data-cssvars]:not([data-cssvars="out"])'));if(o.__benchmark=N(),o.exclude=[S?'[data-cssvars]:not([data-cssvars=""])':'[data-cssvars="out"]',"link[disabled]:not([data-cssvars])",o.exclude].filter((function(e){return e})).join(","),o.variables=j(o.variables),d.forEach((function(e){var 
t="style"===e.nodeName.toLowerCase()&&e.__cssVars.text,n=t&&e.textContent!==e.__cssVars.text;t&&n&&(e.sheet&&(e.sheet.disabled=!1),e.setAttribute("data-cssvars",""))})),!S){var m=[].slice.call(o.rootElement.querySelectorAll('[data-cssvars="out"]'));m.forEach((function(e){var t=e.getAttribute("data-cssvars-group");(t?o.rootElement.querySelector('[data-cssvars="src"][data-cssvars-group="'.concat(t,'"]')):null)||e.parentNode.removeChild(e)})),C&&d.length2&&void 0!==arguments[2]?arguments[2]:[],d=e({},w.dom,w.user);if(w.job={},r.forEach((function(e,t){var r=n[t];if(e.__cssVars=e.__cssVars||{},e.__cssVars.text=r,b.cssVars.test(r))try{var c=i(r,{preserveStatic:o.preserveStatic,removeComments:!0});u(c,{parseHost:Boolean(o.rootElement.host),store:w.dom,onWarning:s}),e.__cssVars.tree=c}catch(t){a(t.message,e)}})),e(w.job,w.dom),o.updateDOM?(e(w.user,o.variables),e(w.job,w.user)):(e(w.job,w.user,o.variables),e(d,o.variables)),y.job>0&&Boolean(Object.keys(w.job).length>Object.keys(d).length||Boolean(Object.keys(d).length&&Object.keys(w.job).some((function(e){return w.job[e]!==d[e]})))))V(o.rootElement),L(o);else{var m=[],p=[],v=!1;if(o.updateDOM&&y.job++,r.forEach((function(t,r){var c=!t.__cssVars.tree;if(t.__cssVars.tree)try{f(t.__cssVars.tree,e({},o,{variables:w.job,onWarning:s}));var i=l(t.__cssVars.tree);if(o.updateDOM){var u=n[r],d=b.cssVarFunc.test(u);if(t.getAttribute("data-cssvars")||t.setAttribute("data-cssvars","src"),i.length&&d){var h=t.getAttribute("data-cssvars-group")||++y.group,g=i.replace(/\s/g,""),E=o.rootElement.querySelector('[data-cssvars="out"][data-cssvars-group="'.concat(h,'"]'))||document.createElement("style");v=v||b.cssKeyframes.test(i),o.preserveStatic&&t.sheet&&(t.sheet.disabled=!0),E.hasAttribute("data-cssvars")||E.setAttribute("data-cssvars","out"),g===t.textContent.replace(/\s/g,"")?(c=!0,E&&E.parentNode&&(t.removeAttribute("data-cssvars-group"),E.parentNode.removeChild(E))):g!==E.textContent.replace(/\s/g,"")&&([t,E].forEach((function(e){e.setAttribute("data-cssvars-job",y.job),e.setAttribute("data-cssvars-group",h)})),E.textContent=i,m.push(i),p.push(E),E.parentNode||t.parentNode.insertBefore(E,t.nextSibling))}}else t.textContent.replace(/\s/g,"")!==i&&m.push(i)}catch(e){a(e.message,t)}c&&t.setAttribute("data-cssvars","skip"),t.hasAttribute("data-cssvars-job")||t.setAttribute("data-cssvars-job",y.job)})),C=o.rootElement.querySelectorAll('[data-cssvars]:not([data-cssvars="out"])').length,o.shadowDOM)for(var h,g=[].concat(o.rootElement).concat([].slice.call(o.rootElement.querySelectorAll("*"))),S=0;h=g[S];++S)if(h.shadowRoot&&h.shadowRoot.querySelector("style")){var x=e({},o,{rootElement:h.shadowRoot});L(x)}o.updateDOM&&v&&_(o.rootElement),E=!1,o.onComplete(m.join(""),p,JSON.parse(JSON.stringify(w.job)),N()-o.__benchmark),c(p.length)}}}));else document.addEventListener("DOMContentLoaded",(function e(n){L(t),document.removeEventListener("DOMContentLoaded",e)}))}}function k(e){function t(e){var t=n(e)&&e.hasAttribute("disabled"),r=(e.sheet||{}).disabled;return t||r}function n(e){return"link"===e.nodeName.toLowerCase()&&-1!==(e.getAttribute("rel")||"").indexOf("stylesheet")}function r(e){return"style"===e.nodeName.toLowerCase()}window.MutationObserver&&(S&&(S.disconnect(),S=null),(S=new MutationObserver((function(o){o.some((function(o){return function(r){var o=!1;if("attributes"===r.type&&n(r.target)&&!t(r.target)){var 
a="disabled"===r.attributeName,s="href"===r.attributeName,c="skip"===r.target.getAttribute("data-cssvars"),i="src"===r.target.getAttribute("data-cssvars");a?o=!c&&!i:s&&(c?r.target.setAttribute("data-cssvars",""):i&&V(e.rootElement,!0),o=!0)}return o}(o)||function(e){var t=!1;if("childList"===e.type){var n=r(e.target),o="out"===e.target.getAttribute("data-cssvars");t=n&&!o}return t}(o)||function(e){var o=!1;return"childList"===e.type&&(o=[].slice.call(e.addedNodes).some((function(e){var o=1===e.nodeType&&e.hasAttribute("data-cssvars"),a=r(e)&&b.cssVars.test(e.textContent);return!o&&(n(e)||a)&&!t(e)}))),o}(o)||function(t){var n=!1;return"childList"===t.type&&(n=[].slice.call(t.removedNodes).some((function(t){var n=1===t.nodeType,r=n&&"out"===t.getAttribute("data-cssvars"),o=n&&"src"===t.getAttribute("data-cssvars"),a=o;if(o||r){var s=t.getAttribute("data-cssvars-group"),c=e.rootElement.querySelector('[data-cssvars-group="'.concat(s,'"]'));o&&V(e.rootElement,!0),c&&c.parentNode.removeChild(c)}return a}))),n}(o)}))&&L(e)}))).observe(document.documentElement,{attributes:!0,attributeFilter:["disabled","href"],childList:!0,subtree:!0}))}function O(e){var t=arguments.length>1&&void 0!==arguments[1]?arguments[1]:100;clearTimeout(x),x=setTimeout((function(){e.__benchmark=null,L(e)}),t)}function _(e){var t=["animation-name","-moz-animation-name","-webkit-animation-name"].filter((function(e){return getComputedStyle(document.body)[e]}))[0];if(t){for(var n=e.getElementsByTagName("*"),r=[],o="__CSSVARSPONYFILL-KEYFRAMES__",a=0,s=n.length;a0&&void 0!==arguments[0]?arguments[0]:{},t=/^-{2}/;return Object.keys(e).reduce((function(n,r){return n[t.test(r)?r:"--".concat(r.replace(/^-+/,""))]=e[r],n}),{})}function T(e){var t=arguments.length>1&&void 0!==arguments[1]?arguments[1]:location.href,n=document.implementation.createHTMLDocument(""),r=n.createElement("base"),o=n.createElement("a");return n.head.appendChild(r),n.body.appendChild(o),r.href=t,o.href=e,o.href}function N(){return v&&(window.performance||{}).now?window.performance.now():(new Date).getTime()}function V(e){var t=arguments.length>1&&void 0!==arguments[1]&&arguments[1],n=[].slice.call(e.querySelectorAll('[data-cssvars="skip"],[data-cssvars="src"]'));n.forEach((function(e){return e.setAttribute("data-cssvars","")})),t&&(w.dom={})}function q(e,t){var n=arguments.length>2&&void 0!==arguments[2]?arguments[2]:null,r=3===e.childNodes[0].nodeType,o=e.querySelector("ul");if(r&&o){var a=Array.apply(null,e.children).some((function(e){return e.tabIndex>-1})).length;if(!a){var s=document.createElement("span");for(null!==n&&s.setAttribute("tabindex",n),e.insertBefore(s,o);e.childNodes[0]!==s;)s.appendChild(e.childNodes[0])}}}L.reset=function(){for(var e in y.job=0,y.group=0,E=!1,S&&(S.disconnect(),S=null),C=0,x=null,A=!1,w)w[e]={}};var R="0.9.0";function D(e,t){return Number("0."+((window.Docsify||{}).version||"0").replace(/\./g,"")) a");e&&(e.parentNode.innerHTML=e.innerHTML)}))},function(e,t){e.doneEach((function(){var e=Array.apply(null,document.querySelectorAll("body > nav.app-nav > ul > li")),t=Array.apply(null,document.querySelectorAll(".sidebar > nav > ul > li"));e.forEach((function(e){var 
t="focus-within";q(e,"span",0),e.addEventListener("focusin",(function(n){e.contains(document.activeElement)&&e.classList.add(t)})),e.addEventListener("focusout",(function(n){e.contains(document.activeElement)||e.classList.remove(t)}))})),t.forEach((function(e){q(e,"span")}))}))},function(e,t){e.doneEach((function(){Array.apply(null,document.querySelectorAll("pre[data-lang]")).forEach((function(e){var t=e.querySelector("code"),n="language-".concat(e.getAttribute("data-lang"));e.classList.add(n),t.classList.add(n)}))}))},function(e,t){e.mounted((function(){var e=document.querySelector(".content"),t=setInterval((function(){e.textContent.length&&(document.body.classList.add("ready-fix"),clearInterval(t))}),250)})),e.ready((function(){document.body.classList.add("ready-fix")}))},function(e,t){e.init((function(){if(!1!==((window.$docsify||{}).themeable||{}).responsiveTables){var e=window.$docsify.markdown=window.$docsify.markdown||{},t=e.renderer=e.renderer||{};e.smartypants=e.smartypants||!0,t.table=t.table||function(e,t){var n='\n \n
\n '.concat(e," \n ").concat(t," \n
\n
");try{var r=document.createElement("div"),o=document.head.appendChild(document.createElement("style")).sheet,a="_"+Math.random().toString(36).substr(2,9);r.innerHTML=n;var s=r.querySelector("table");Array.apply(null,s.getElementsByTagName("th")).map((function(e){return e.innerHTML.replace(" "," ")})).forEach((function(e,t){var n="#".concat(a," td:nth-child(").concat(t+1,')::before{content:"').concat(e,'";}');o.insertRule(n,o.cssRules.length)})),s.id=a,n=r.innerHTML}catch(e){console.log("Failed to render responsive table: "+e)}return n}}}))}],window.$docsify.plugins||[],[function(e,t){e.ready((function(){var e=document.querySelector(".sidebar .search .clear-button");if(e){var t=document.createElement("button");t.className="clear-button",t.setAttribute("aria-label","Clear search"),t.innerHTML='\n \n \n \n \n \n ',e.parentNode.replaceChild(t,e)}}))},D("4.8.0",(function(e,t){e.ready((function(){var e=document.querySelector(".sidebar .search"),t=document.querySelector(".sidebar .search input[type=search]"),n=document.querySelector(".sidebar .search .clear-button");e&&e.addEventListener("click",(function(r){(r.target===n||n.contains(r.target))&&(e.classList.remove("show"),t.focus())})),t&&t.addEventListener("input",(function(n){t.value.length?e.classList.add("show"):e.classList.remove("show")}))}))})),D("4.8.0",(function(e,t){var n=Element.prototype.matches||Element.prototype.webkitMatchesSelector||Element.prototype.msMatchesSelector;e.doneEach((function(){var e="medium-zoom-image";Array.apply(null,document.querySelectorAll(".".concat(e))).forEach((function(t){var r=n.call(t,"a img"),o=n.call(t,".content img");if(r||!o){var a=t.cloneNode(!0);t.parentNode.replaceChild(a,t),a.classList.remove(e)}}))}))}))]).filter((function(e){return null!==e})),window.$docsify.search=window.$docsify.search||{},window.$docsify.search.hideOtherSidebarContent=!0,window.$docsify.themeable=window.$docsify.themeable||{},window.$docsify.themeable.version=R,window.$docsify.themeable.semver=R.split("."),window.$docsify.themeable.util={cssVars:function(){var e=arguments.length>0&&void 0!==arguments[0]?arguments[0]:B;L(e)}}}}();
+//# sourceMappingURL=docsify-themeable.min.js.map
diff --git a/assets/docsify.min.js b/assets/docsify.min.js
new file mode 100644
index 0000000..76fc755
--- /dev/null
+++ b/assets/docsify.min.js
@@ -0,0 +1 @@
+!function(){function c(i){var o=Object.create(null);return function(e){var n=f(e)?e:JSON.stringify(e);return o[n]||(o[n]=i(e))}}var a=c(function(e){return e.replace(/([A-Z])/g,function(e){return"-"+e.toLowerCase()})}),u=Object.prototype.hasOwnProperty,m=Object.assign||function(e){for(var n=arguments,i=1;i=e||n.classList.contains("hidden")?S(h,"add","sticky"):S(h,"remove","sticky"))}function ee(e,n,o,i){var t=[];null!=(n=l(n))&&(t=k(n,"a"));var a,r=decodeURI(e.toURL(e.getCurrentPath()));return t.sort(function(e,n){return n.href.length-e.href.length}).forEach(function(e){var n=decodeURI(e.getAttribute("href")),i=o?e.parentNode:e;e.title=e.title||e.innerText,0!==r.indexOf(n)||a?S(i,"remove","active"):(a=e,S(i,"add","active"))}),i&&(v.title=a?a.title||a.innerText+" - "+J:J),a}function ne(e,n){for(var i=0;ithis.end&&e>=this.next}[this.direction]}},{key:"_defaultEase",value:function(e,n,i,o){return(e/=o/2)<1?i/2*e*e+n:-i/2*(--e*(e-2)-1)+n}}]),re);function re(){var e=0c){n=n||p;break}n=p}!n||(r=fe[ve(e,n.getAttribute("data-id"))])&&r!==a&&(a&&a.classList.remove("active"),r.classList.add("active"),a=r,!pe&&h.classList.contains("sticky")&&(e=i.clientHeight,r=a.offsetTop+a.clientHeight+40,a=a.offsetTop>=t.scrollTop&&r<=t.scrollTop+e,i.scrollTop=a?t.scrollTop:+r"']/),xe=/[&<>"']/g,Se=/[<>"']|&(?!#?\w+;)/,Ae=/[<>"']|&(?!#?\w+;)/g,$e={"&":"&","<":"<",">":">",'"':""","'":"'"};var ze=/&(#(?:\d+)|(?:#x[0-9A-Fa-f]+)|(?:\w+));?/gi;function Fe(e){return e.replace(ze,function(e,n){return"colon"===(n=n.toLowerCase())?":":"#"===n.charAt(0)?"x"===n.charAt(1)?String.fromCharCode(parseInt(n.substring(2),16)):String.fromCharCode(+n.substring(1)):""})}var Ee=/(^|[^\[])\^/g;var Te=/[^\w:]/g,Ce=/^$|^[a-z][a-z0-9+.-]*:|^[?#]/i;var Re={},je=/^[^:]+:\/*[^/]*$/,Oe=/^([^:]+:)[\s\S]*$/,Le=/^([^:]+:\/*[^/]*)[\s\S]*$/;function qe(e,n){Re[" "+e]||(je.test(e)?Re[" "+e]=e+"/":Re[" "+e]=Pe(e,"/",!0));var i=-1===(e=Re[" "+e]).indexOf(":");return"//"===n.substring(0,2)?i?n:e.replace(Oe,"$1")+n:"/"===n.charAt(0)?i?n:e.replace(Le,"$1")+n:e+n}function Pe(e,n,i){var o=e.length;if(0===o)return"";for(var t=0;tn)i.splice(n);else for(;i.length>=1,e+=e;return i+e},We=we.defaults,Xe=Be,Qe=Ze,Je=Me,Ke=Ve;function en(e,n,i){var o=n.href,t=n.title?Je(n.title):null,n=e[1].replace(/\\([\[\]])/g,"$1");return"!"!==e[0].charAt(0)?{type:"link",raw:i,href:o,title:t,text:n}:{type:"image",raw:i,href:o,title:t,text:Je(n)}}var nn=function(){function e(e){this.options=e||We}return e.prototype.space=function(e){e=this.rules.block.newline.exec(e);if(e)return 1=i.length?e.slice(i.length):e}).join("\n")}(i,n[3]||"");return{type:"code",raw:i,lang:n[2]&&n[2].trim(),text:e}}},e.prototype.heading=function(e){var n=this.rules.block.heading.exec(e);if(n){var i=n[2].trim();return/#$/.test(i)&&(e=Xe(i,"#"),!this.options.pedantic&&e&&!/ $/.test(e)||(i=e.trim())),{type:"heading",raw:n[0],depth:n[1].length,text:i}}},e.prototype.nptable=function(e){e=this.rules.block.nptable.exec(e);if(e){var n={type:"table",header:Qe(e[1].replace(/^ *| *\| *$/g,"")),align:e[2].replace(/^ *|\| *$/g,"").split(/ *\| */),cells:e[3]?e[3].replace(/\n$/,"").split("\n"):[],raw:e[0]};if(n.header.length===n.align.length){for(var i=n.align.length,o=0;o ?/gm,"");return{type:"blockquote",raw:n[0],text:e}}},e.prototype.list=function(e){e=this.rules.block.list.exec(e);if(e){for(var 
n,i,o,t,a,r=e[0],c=e[2],u=1s[1].length:o[1].length>s[0].length||3/i.test(e[0])&&(n=!1),!i&&/^<(pre|code|kbd|script)(\s|>)/i.test(e[0])?i=!0:i&&/^<\/(pre|code|kbd|script)(\s|>)/i.test(e[0])&&(i=!1),{type:this.options.sanitize?"text":"html",raw:e[0],inLink:n,inRawBlock:i,text:this.options.sanitize?this.options.sanitizer?this.options.sanitizer(e[0]):Je(e[0]):e[0]}},e.prototype.link=function(e){var n=this.rules.inline.link.exec(e);if(n){e=n[2].trim();if(!this.options.pedantic&&/^$/.test(e))return;var i=Xe(e.slice(0,-1),"\\");if((e.length-i.length)%2==0)return}else{var o=Ke(n[2],"()");-1 $/.test(e)?i.slice(1):i.slice(1,-1):i)&&i.replace(this.rules.inline._escapes,"$1"),title:o&&o.replace(this.rules.inline._escapes,"$1")},n[0])}},e.prototype.reflink=function(e,n){if((i=this.rules.inline.reflink.exec(e))||(i=this.rules.inline.nolink.exec(e))){var e=(i[2]||i[1]).replace(/\s+/g," ");if((e=n[e.toLowerCase()])&&e.href)return en(i,e,i[0]);var i=i[0].charAt(0);return{type:"text",raw:i,text:i}}},e.prototype.strong=function(e,n,i){void 0===i&&(i="");var o=this.rules.inline.strong.start.exec(e);if(o&&(!o[1]||o[1]&&(""===i||this.rules.inline.punctuation.exec(i)))){n=n.slice(-1*e.length);var t,a="**"===o[0]?this.rules.inline.strong.endAst:this.rules.inline.strong.endUnd;for(a.lastIndex=0;null!=(o=a.exec(n));)if(t=this.rules.inline.strong.middle.exec(n.slice(0,o.index+3)))return{type:"strong",raw:e.slice(0,t[0].length),text:e.slice(2,t[0].length-2)}}},e.prototype.em=function(e,n,i){void 0===i&&(i="");var o=this.rules.inline.em.start.exec(e);if(o&&(!o[1]||o[1]&&(""===i||this.rules.inline.punctuation.exec(i)))){n=n.slice(-1*e.length);var t,a="*"===o[0]?this.rules.inline.em.endAst:this.rules.inline.em.endUnd;for(a.lastIndex=0;null!=(o=a.exec(n));)if(t=this.rules.inline.em.middle.exec(n.slice(0,o.index+2)))return{type:"em",raw:e.slice(0,t[0].length),text:e.slice(1,t[0].length-1)}}},e.prototype.codespan=function(e){var n=this.rules.inline.code.exec(e);if(n){var i=n[2].replace(/\n/g," "),o=/[^ ]/.test(i),e=/^ /.test(i)&&/ $/.test(i);return o&&e&&(i=i.substring(1,i.length-1)),i=Je(i,!0),{type:"codespan",raw:n[0],text:i}}},e.prototype.br=function(e){e=this.rules.inline.br.exec(e);if(e)return{type:"br",raw:e[0]}},e.prototype.del=function(e){e=this.rules.inline.del.exec(e);if(e)return{type:"del",raw:e[0],text:e[2]}},e.prototype.autolink=function(e,n){e=this.rules.inline.autolink.exec(e);if(e){var i,n="@"===e[2]?"mailto:"+(i=Je(this.options.mangle?n(e[1]):e[1])):i=Je(e[1]);return{type:"link",raw:e[0],text:i,href:n,tokens:[{type:"text",raw:i,text:i}]}}},e.prototype.url=function(e,n){var i,o,t,a;if(i=this.rules.inline.url.exec(e)){if("@"===i[2])t="mailto:"+(o=Je(this.options.mangle?n(i[0]):i[0]));else{for(;a=i[0],i[0]=this.rules.inline._backpedal.exec(i[0])[0],a!==i[0];);o=Je(i[0]),t="www."===i[1]?"http://"+o:o}return{type:"link",raw:i[0],text:o,href:t,tokens:[{type:"text",raw:o,text:o}]}}},e.prototype.inlineText=function(e,n,i){e=this.rules.inline.text.exec(e);if(e){i=n?this.options.sanitize?this.options.sanitizer?this.options.sanitizer(e[0]):Je(e[0]):e[0]:Je(this.options.smartypants?i(e[0]):e[0]);return{type:"text",raw:e[0],text:i}}},e}(),Ze=De,Ve=Ne,De=Ue,Ne={newline:/^(?: *(?:\n|$))+/,code:/^( {4}[^\n]+(?:\n(?: *(?:\n|$))*)?)+/,fences:/^ {0,3}(`{3,}(?=[^`\n]*\n)|~{3,})([^\n]*)\n(?:|([\s\S]*?)\n)(?: {0,3}\1[~`]* *(?:\n+|$)|$)/,hr:/^ {0,3}((?:- *){3,}|(?:_ *){3,}|(?:\* *){3,})(?:\n+|$)/,heading:/^ {0,3}(#{1,6})(?=\s|$)(.*)(?:\n+|$)/,blockquote:/^( {0,3}> ?(paragraph|[^\n]*)(?:\n|$))+/,list:/^( {0,3})(bull) 
[\s\S]+?(?:hr|def|\n{2,}(?! )(?! {0,3}bull )\n*|\s*$)/,html:"^ {0,3}(?:<(script|pre|style)[\\s>][\\s\\S]*?(?:\\1>[^\\n]*\\n+|$)|comment[^\\n]*(\\n+|$)|<\\?[\\s\\S]*?(?:\\?>\\n*|$)|\\n*|$)|\\n*|$)|?(tag)(?: +|\\n|/?>)[\\s\\S]*?(?:\\n{2,}|$)|<(?!script|pre|style)([a-z][\\w-]*)(?:attribute)*? */?>(?=[ \\t]*(?:\\n|$))[\\s\\S]*?(?:\\n{2,}|$)|(?!script|pre|style)[a-z][\\w-]*\\s*>(?=[ \\t]*(?:\\n|$))[\\s\\S]*?(?:\\n{2,}|$))",def:/^ {0,3}\[(label)\]: *\n? *([^\s>]+)>?(?:(?: +\n? *| *\n *)(title))? *(?:\n+|$)/,nptable:Ze,table:Ze,lheading:/^([^\n]+)\n {0,3}(=+|-+) *(?:\n+|$)/,_paragraph:/^([^\n]+(?:\n(?!hr|heading|lheading|blockquote|fences|list|html| +\n)[^\n]+)*)/,text:/^[^\n]+/,_label:/(?!\s*\])(?:\\[\[\]]|[^\[\]])+/,_title:/(?:"(?:\\"?|[^"\\])*"|'[^'\n]*(?:\n[^'\n]+)*\n?'|\([^()]*\))/};Ne.def=Ve(Ne.def).replace("label",Ne._label).replace("title",Ne._title).getRegex(),Ne.bullet=/(?:[*+-]|\d{1,9}[.)])/,Ne.item=/^( *)(bull) ?[^\n]*(?:\n(?! *bull ?)[^\n]*)*/,Ne.item=Ve(Ne.item,"gm").replace(/bull/g,Ne.bullet).getRegex(),Ne.listItemStart=Ve(/^( *)(bull)/).replace("bull",Ne.bullet).getRegex(),Ne.list=Ve(Ne.list).replace(/bull/g,Ne.bullet).replace("hr","\\n+(?=\\1?(?:(?:- *){3,}|(?:_ *){3,}|(?:\\* *){3,})(?:\\n+|$))").replace("def","\\n+(?="+Ne.def.source+")").getRegex(),Ne._tag="address|article|aside|base|basefont|blockquote|body|caption|center|col|colgroup|dd|details|dialog|dir|div|dl|dt|fieldset|figcaption|figure|footer|form|frame|frameset|h[1-6]|head|header|hr|html|iframe|legend|li|link|main|menu|menuitem|meta|nav|noframes|ol|optgroup|option|p|param|section|source|summary|table|tbody|td|tfoot|th|thead|title|tr|track|ul",Ne._comment=/|$)/,Ne.html=Ve(Ne.html,"i").replace("comment",Ne._comment).replace("tag",Ne._tag).replace("attribute",/ +[a-zA-Z:_][\w.:-]*(?: *= *"[^"\n]*"| *= *'[^'\n]*'| *= *[^\s"'=<>`]+)?/).getRegex(),Ne.paragraph=Ve(Ne._paragraph).replace("hr",Ne.hr).replace("heading"," {0,3}#{1,6} ").replace("|lheading","").replace("blockquote"," {0,3}>").replace("fences"," {0,3}(?:`{3,}(?=[^`\\n]*\\n)|~{3,})[^\\n]*\\n").replace("list"," {0,3}(?:[*+-]|1[.)]) ").replace("html","?(?:tag)(?: +|\\n|/?>)|<(?:script|pre|style|!--)").replace("tag",Ne._tag).getRegex(),Ne.blockquote=Ve(Ne.blockquote).replace("paragraph",Ne.paragraph).getRegex(),Ne.normal=De({},Ne),Ne.gfm=De({},Ne.normal,{nptable:"^ *([^|\\n ].*\\|.*)\\n {0,3}([-:]+ *\\|[-| :]*)(?:\\n((?:(?!\\n|hr|heading|blockquote|code|fences|list|html).*(?:\\n|$))*)\\n*|$)",table:"^ *\\|(.+)\\n {0,3}\\|?( *[-:]+[-| :]*)(?:\\n *((?:(?!\\n|hr|heading|blockquote|code|fences|list|html).*(?:\\n|$))*)\\n*|$)"}),Ne.gfm.nptable=Ve(Ne.gfm.nptable).replace("hr",Ne.hr).replace("heading"," {0,3}#{1,6} ").replace("blockquote"," {0,3}>").replace("code"," {4}[^\\n]").replace("fences"," {0,3}(?:`{3,}(?=[^`\\n]*\\n)|~{3,})[^\\n]*\\n").replace("list"," {0,3}(?:[*+-]|1[.)]) ").replace("html","?(?:tag)(?: +|\\n|/?>)|<(?:script|pre|style|!--)").replace("tag",Ne._tag).getRegex(),Ne.gfm.table=Ve(Ne.gfm.table).replace("hr",Ne.hr).replace("heading"," {0,3}#{1,6} ").replace("blockquote"," {0,3}>").replace("code"," {4}[^\\n]").replace("fences"," {0,3}(?:`{3,}(?=[^`\\n]*\\n)|~{3,})[^\\n]*\\n").replace("list"," {0,3}(?:[*+-]|1[.)]) ").replace("html","?(?:tag)(?: +|\\n|/?>)|<(?:script|pre|style|!--)").replace("tag",Ne._tag).getRegex(),Ne.pedantic=De({},Ne.normal,{html:Ve("^ *(?:comment *(?:\\n|\\s*$)|<(tag)[\\s\\S]+?\\1> *(?:\\n{2,}|\\s*$)| \\s]*)*?/?> 
*(?:\\n{2,}|\\s*$))").replace("comment",Ne._comment).replace(/tag/g,"(?!(?:a|em|strong|small|s|cite|q|dfn|abbr|data|time|code|var|samp|kbd|sub|sup|i|b|u|mark|ruby|rt|rp|bdi|bdo|span|br|wbr|ins|del|img)\\b)\\w+(?!:|[^\\w\\s@]*@)\\b").getRegex(),def:/^ *\[([^\]]+)\]: *([^\s>]+)>?(?: +(["(][^\n]+[")]))? *(?:\n+|$)/,heading:/^(#{1,6})(.*)(?:\n+|$)/,fences:Ze,paragraph:Ve(Ne.normal._paragraph).replace("hr",Ne.hr).replace("heading"," *#{1,6} *[^\n]").replace("lheading",Ne.lheading).replace("blockquote"," {0,3}>").replace("|fences","").replace("|list","").replace("|html","").getRegex()});Ze={escape:/^\\([!"#$%&'()*+,\-./:;<=>?@\[\]\\^_`{|}~])/,autolink:/^<(scheme:[^\s\x00-\x1f<>]*|email)>/,url:Ze,tag:"^comment|^[a-zA-Z][\\w:-]*\\s*>|^<[a-zA-Z][\\w-]*(?:attribute)*?\\s*/?>|^<\\?[\\s\\S]*?\\?>|^|^",link:/^!?\[(label)\]\(\s*(href)(?:\s+(title))?\s*\)/,reflink:/^!?\[(label)\]\[(?!\s*\])((?:\\[\[\]]?|[^\[\]\\])+)\]/,nolink:/^!?\[(?!\s*\])((?:\[[^\[\]]*\]|\\[\[\]]|[^\[\]])*)\](?:\[\])?/,reflinkSearch:"reflink|nolink(?!\\()",strong:{start:/^(?:(\*\*(?=[*punctuation]))|\*\*)(?![\s])|__/,middle:/^\*\*(?:(?:(?!overlapSkip)(?:[^*]|\\\*)|overlapSkip)|\*(?:(?!overlapSkip)(?:[^*]|\\\*)|overlapSkip)*?\*)+?\*\*$|^__(?![\s])((?:(?:(?!overlapSkip)(?:[^_]|\\_)|overlapSkip)|_(?:(?!overlapSkip)(?:[^_]|\\_)|overlapSkip)*?_)+?)__$/,endAst:/[^punctuation\s]\*\*(?!\*)|[punctuation]\*\*(?!\*)(?:(?=[punctuation_\s]|$))/,endUnd:/[^\s]__(?!_)(?:(?=[punctuation*\s])|$)/},em:{start:/^(?:(\*(?=[punctuation]))|\*)(?![*\s])|_/,middle:/^\*(?:(?:(?!overlapSkip)(?:[^*]|\\\*)|overlapSkip)|\*(?:(?!overlapSkip)(?:[^*]|\\\*)|overlapSkip)*?\*)+?\*$|^_(?![_\s])(?:(?:(?!overlapSkip)(?:[^_]|\\_)|overlapSkip)|_(?:(?!overlapSkip)(?:[^_]|\\_)|overlapSkip)*?_)+?_$/,endAst:/[^punctuation\s]\*(?!\*)|[punctuation]\*(?!\*)(?:(?=[punctuation_\s]|$))/,endUnd:/[^\s]_(?!_)(?:(?=[punctuation*\s])|$)/},code:/^(`+)([^`]|[^`][\s\S]*?[^`])\1(?!`)/,br:/^( {2,}|\\)\n(?!\s*$)/,del:Ze,text:/^(`+|[^`])(?:(?= 
{2,}\n)|[\s\S]*?(?:(?=[\\?@\\[\\]`^{|}~"};Ze.punctuation=Ve(Ze.punctuation).replace(/punctuation/g,Ze._punctuation).getRegex(),Ze._blockSkip="\\[[^\\]]*?\\]\\([^\\)]*?\\)|`[^`]*?`|<[^>]*?>",Ze._overlapSkip="__[^_]*?__|\\*\\*\\[^\\*\\]*?\\*\\*",Ze._comment=Ve(Ne._comment).replace("(?:--\x3e|$)","--\x3e").getRegex(),Ze.em.start=Ve(Ze.em.start).replace(/punctuation/g,Ze._punctuation).getRegex(),Ze.em.middle=Ve(Ze.em.middle).replace(/punctuation/g,Ze._punctuation).replace(/overlapSkip/g,Ze._overlapSkip).getRegex(),Ze.em.endAst=Ve(Ze.em.endAst,"g").replace(/punctuation/g,Ze._punctuation).getRegex(),Ze.em.endUnd=Ve(Ze.em.endUnd,"g").replace(/punctuation/g,Ze._punctuation).getRegex(),Ze.strong.start=Ve(Ze.strong.start).replace(/punctuation/g,Ze._punctuation).getRegex(),Ze.strong.middle=Ve(Ze.strong.middle).replace(/punctuation/g,Ze._punctuation).replace(/overlapSkip/g,Ze._overlapSkip).getRegex(),Ze.strong.endAst=Ve(Ze.strong.endAst,"g").replace(/punctuation/g,Ze._punctuation).getRegex(),Ze.strong.endUnd=Ve(Ze.strong.endUnd,"g").replace(/punctuation/g,Ze._punctuation).getRegex(),Ze.blockSkip=Ve(Ze._blockSkip,"g").getRegex(),Ze.overlapSkip=Ve(Ze._overlapSkip,"g").getRegex(),Ze._escapes=/\\([!"#$%&'()*+,\-./:;<=>?@\[\]\\^_`{|}~])/g,Ze._scheme=/[a-zA-Z][a-zA-Z0-9+.-]{1,31}/,Ze._email=/[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+(@)[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)+(?![-_])/,Ze.autolink=Ve(Ze.autolink).replace("scheme",Ze._scheme).replace("email",Ze._email).getRegex(),Ze._attribute=/\s+[a-zA-Z:_][\w.:-]*(?:\s*=\s*"[^"]*"|\s*=\s*'[^']*'|\s*=\s*[^\s"'=<>`]+)?/,Ze.tag=Ve(Ze.tag).replace("comment",Ze._comment).replace("attribute",Ze._attribute).getRegex(),Ze._label=/(?:\[(?:\\.|[^\[\]\\])*\]|\\.|`[^`]*`|[^\[\]\\`])*?/,Ze._href=/<(?:\\.|[^\n<>\\])+>|[^\s\x00-\x1f]*/,Ze._title=/"(?:\\"?|[^"\\])*"|'(?:\\'?|[^'\\])*'|\((?:\\\)?|[^)\\])*\)/,Ze.link=Ve(Ze.link).replace("label",Ze._label).replace("href",Ze._href).replace("title",Ze._title).getRegex(),Ze.reflink=Ve(Ze.reflink).replace("label",Ze._label).getRegex(),Ze.reflinkSearch=Ve(Ze.reflinkSearch,"g").replace("reflink",Ze.reflink).replace("nolink",Ze.nolink).getRegex(),Ze.normal=De({},Ze),Ze.pedantic=De({},Ze.normal,{strong:{start:/^__|\*\*/,middle:/^__(?=\S)([\s\S]*?\S)__(?!_)|^\*\*(?=\S)([\s\S]*?\S)\*\*(?!\*)/,endAst:/\*\*(?!\*)/g,endUnd:/__(?!_)/g},em:{start:/^_|\*/,middle:/^()\*(?=\S)([\s\S]*?\S)\*(?!\*)|^_(?=\S)([\s\S]*?\S)_(?!_)/,endAst:/\*(?!\*)/g,endUnd:/_(?!_)/g},link:Ve(/^!?\[(label)\]\((.*?)\)/).replace("label",Ze._label).getRegex(),reflink:Ve(/^!?\[(label)\]\s*\[([^\]]*)\]/).replace("label",Ze._label).getRegex()}),Ze.gfm=De({},Ze.normal,{escape:Ve(Ze.escape).replace("])","~|])").getRegex(),_extended_email:/[A-Za-z0-9._+-]+(@)[a-zA-Z0-9-_]+(?:\.[a-zA-Z0-9-_]*[a-zA-Z0-9])+(?![-_])/,url:/^((?:ftp|https?):\/\/|www\.)(?:[a-zA-Z0-9\-]+\.?)+[^\s<]*|^email/,_backpedal:/(?:[^?!.,:;*_~()&]+|\([^)]*\)|&(?![a-zA-Z0-9]+;$)|[?!.,:;*_~)]+(?!$))+/,del:/^(~~?)(?=[^\s~])([\s\S]*?[^\s~])\1(?=[^~]|$)/,text:/^([`~]+|[^`~])(?:(?= {2,}\n)|[\s\S]*?(?:(?=[\\'+(i?e:gn(e,!0))+"
\n":""+(i?e:gn(e,!0))+"
\n"},e.prototype.blockquote=function(e){return"\n"+e+" \n"},e.prototype.html=function(e){return e},e.prototype.heading=function(e,n,i,o){return this.options.headerIds?"\n":""+e+" \n"},e.prototype.hr=function(){return this.options.xhtml?" \n":" \n"},e.prototype.list=function(e,n,i){var o=n?"ol":"ul";return"<"+o+(n&&1!==i?' start="'+i+'"':"")+">\n"+e+""+o+">\n"},e.prototype.listitem=function(e){return""+e+" \n"},e.prototype.checkbox=function(e){return" "},e.prototype.paragraph=function(e){return""+e+"
\n"},e.prototype.table=function(e,n){return"\n\n"+e+" \n"+(n=n&&""+n+" ")+"
\n"},e.prototype.tablerow=function(e){return"\n"+e+" \n"},e.prototype.tablecell=function(e,n){var i=n.header?"th":"td";return(n.align?"<"+i+' align="'+n.align+'">':"<"+i+">")+e+""+i+">\n"},e.prototype.strong=function(e){return""+e+" "},e.prototype.em=function(e){return""+e+" "},e.prototype.codespan=function(e){return""+e+"
"},e.prototype.br=function(){return this.options.xhtml?" ":" "},e.prototype.del=function(e){return""+e+""},e.prototype.link=function(e,n,i){if(null===(e=dn(this.options.sanitize,this.options.baseUrl,e)))return i;e='"+i+" "},e.prototype.image=function(e,n,i){if(null===(e=dn(this.options.sanitize,this.options.baseUrl,e)))return i;i=' ":">"},e.prototype.text=function(e){return e},e}(),ln=function(){function e(){}return e.prototype.strong=function(e){return e},e.prototype.em=function(e){return e},e.prototype.codespan=function(e){return e},e.prototype.del=function(e){return e},e.prototype.html=function(e){return e},e.prototype.text=function(e){return e},e.prototype.link=function(e,n,i){return""+i},e.prototype.image=function(e,n,i){return""+i},e.prototype.br=function(){return""},e}(),vn=function(){function e(){this.seen={}}return e.prototype.serialize=function(e){return e.toLowerCase().trim().replace(/<[!\/a-z].*?>/gi,"").replace(/[\u2000-\u206F\u2E00-\u2E7F\\'!"#$%&()*+,./:;<=>?@[\]^`{|}~]/g,"").replace(/\s/g,"-")},e.prototype.getNextSafeSlug=function(e,n){var i=e,o=0;if(this.seen.hasOwnProperty(i))for(o=this.seen[e];i=e+"-"+ ++o,this.seen.hasOwnProperty(i););return n||(this.seen[e]=o,this.seen[i]=0),i},e.prototype.slug=function(e,n){void 0===n&&(n={});e=this.serialize(e);return this.getNextSafeSlug(e,n.dryrun)},e}(),hn=we.defaults,_n=Ie,mn=function(){function i(e){this.options=e||hn,this.options.renderer=this.options.renderer||new sn,this.renderer=this.options.renderer,this.renderer.options=this.options,this.textRenderer=new ln,this.slugger=new vn}return i.parse=function(e,n){return new i(n).parse(e)},i.parseInline=function(e,n){return new i(n).parseInline(e)},i.prototype.parse=function(e,n){void 0===n&&(n=!0);for(var i,o,t,a,r,c,u,f,p,d,g,s,l,v,h,_="",m=e.length,b=0;bAn error occurred:"+wn(e.message+"",!0)+" ";throw e}}xn.options=xn.setOptions=function(e){return bn(xn.defaults,e),yn(xn.defaults),xn},xn.getDefaults=Me,xn.defaults=we,xn.use=function(a){var n,e=bn({},a);if(a.renderer){var i,r=xn.defaults.renderer||new sn;for(i in a.renderer)!function(o){var t=r[o];r[o]=function(){for(var e=[],n=arguments.length;n--;)e[n]=arguments[n];var i=a.renderer[o].apply(r,e);return i=!1===i?t.apply(r,e):i}}(i);e.renderer=r}if(a.tokenizer){var t,c=xn.defaults.tokenizer||new nn;for(t in a.tokenizer)!function(){var o=c[t];c[t]=function(){for(var e=[],n=arguments.length;n--;)e[n]=arguments[n];var i=a.tokenizer[t].apply(c,e);return i=!1===i?o.apply(c,e):i}}();e.tokenizer=c}a.walkTokens&&(n=xn.defaults.walkTokens,e.walkTokens=function(e){a.walkTokens(e),n&&n(e)}),xn.setOptions(e)},xn.walkTokens=function(e,n){for(var i=0,o=e;iAn error occurred:"+wn(e.message+"",!0)+" ";throw e}},xn.Parser=mn,xn.parser=mn.parse,xn.Renderer=sn,xn.TextRenderer=ln,xn.Lexer=fn,xn.lexer=fn.lex,xn.Tokenizer=nn,xn.Slugger=vn;var Sn=xn.parse=xn;function An(e,i){if(void 0===i&&(i=''),!e||!e.length)return"";var o="";return e.forEach(function(e){var n=e.title.replace(/(<([^>]+)>)/g,"");o+=''+e.title+" ",e.children&&(o+=An(e.children,i))}),i.replace("{inner}",o)}function $n(e,n){return''+n.slice(5).trim()+"
"}function zn(e,o){var t=[],a={};return e.forEach(function(e){var n=e.level||1,i=n-1;o?@[\]^`{|}~]/g;function Tn(e){return e.toLowerCase()}function Cn(e){if("string"!=typeof e)return"";var n=e.trim().replace(/[A-Z]+/g,Tn).replace(/<[^>]+>/g,"").replace(En,"").replace(/\s/g,"-").replace(/-+/g,"-").replace(/^(\d)/,"_$1"),e=Fn[n],e=u.call(Fn,n)?e+1:0;return n=(Fn[n]=e)?n+"-"+e:n}Cn.clear=function(){Fn={}};var Rn={baseURL:"https://github.githubassets.com/images/icons/emoji/",data:{100:"unicode/1f4af.png?v8",1234:"unicode/1f522.png?v8","+1":"unicode/1f44d.png?v8","-1":"unicode/1f44e.png?v8","1st_place_medal":"unicode/1f947.png?v8","2nd_place_medal":"unicode/1f948.png?v8","3rd_place_medal":"unicode/1f949.png?v8","8ball":"unicode/1f3b1.png?v8",a:"unicode/1f170.png?v8",ab:"unicode/1f18e.png?v8",abacus:"unicode/1f9ee.png?v8",abc:"unicode/1f524.png?v8",abcd:"unicode/1f521.png?v8",accept:"unicode/1f251.png?v8",accordion:"unicode/1fa97.png?v8",adhesive_bandage:"unicode/1fa79.png?v8",adult:"unicode/1f9d1.png?v8",aerial_tramway:"unicode/1f6a1.png?v8",afghanistan:"unicode/1f1e6-1f1eb.png?v8",airplane:"unicode/2708.png?v8",aland_islands:"unicode/1f1e6-1f1fd.png?v8",alarm_clock:"unicode/23f0.png?v8",albania:"unicode/1f1e6-1f1f1.png?v8",alembic:"unicode/2697.png?v8",algeria:"unicode/1f1e9-1f1ff.png?v8",alien:"unicode/1f47d.png?v8",ambulance:"unicode/1f691.png?v8",american_samoa:"unicode/1f1e6-1f1f8.png?v8",amphora:"unicode/1f3fa.png?v8",anatomical_heart:"unicode/1fac0.png?v8",anchor:"unicode/2693.png?v8",andorra:"unicode/1f1e6-1f1e9.png?v8",angel:"unicode/1f47c.png?v8",anger:"unicode/1f4a2.png?v8",angola:"unicode/1f1e6-1f1f4.png?v8",angry:"unicode/1f620.png?v8",anguilla:"unicode/1f1e6-1f1ee.png?v8",anguished:"unicode/1f627.png?v8",ant:"unicode/1f41c.png?v8",antarctica:"unicode/1f1e6-1f1f6.png?v8",antigua_barbuda:"unicode/1f1e6-1f1ec.png?v8",apple:"unicode/1f34e.png?v8",aquarius:"unicode/2652.png?v8",argentina:"unicode/1f1e6-1f1f7.png?v8",aries:"unicode/2648.png?v8",armenia:"unicode/1f1e6-1f1f2.png?v8",arrow_backward:"unicode/25c0.png?v8",arrow_double_down:"unicode/23ec.png?v8",arrow_double_up:"unicode/23eb.png?v8",arrow_down:"unicode/2b07.png?v8",arrow_down_small:"unicode/1f53d.png?v8",arrow_forward:"unicode/25b6.png?v8",arrow_heading_down:"unicode/2935.png?v8",arrow_heading_up:"unicode/2934.png?v8",arrow_left:"unicode/2b05.png?v8",arrow_lower_left:"unicode/2199.png?v8",arrow_lower_right:"unicode/2198.png?v8",arrow_right:"unicode/27a1.png?v8",arrow_right_hook:"unicode/21aa.png?v8",arrow_up:"unicode/2b06.png?v8",arrow_up_down:"unicode/2195.png?v8",arrow_up_small:"unicode/1f53c.png?v8",arrow_upper_left:"unicode/2196.png?v8",arrow_upper_right:"unicode/2197.png?v8",arrows_clockwise:"unicode/1f503.png?v8",arrows_counterclockwise:"unicode/1f504.png?v8",art:"unicode/1f3a8.png?v8",articulated_lorry:"unicode/1f69b.png?v8",artificial_satellite:"unicode/1f6f0.png?v8",artist:"unicode/1f9d1-1f3a8.png?v8",aruba:"unicode/1f1e6-1f1fc.png?v8",ascension_island:"unicode/1f1e6-1f1e8.png?v8",asterisk:"unicode/002a-20e3.png?v8",astonished:"unicode/1f632.png?v8",astronaut:"unicode/1f9d1-1f680.png?v8",athletic_shoe:"unicode/1f45f.png?v8",atm:"unicode/1f3e7.png?v8",atom:"atom.png?v8",atom_symbol:"unicode/269b.png?v8",australia:"unicode/1f1e6-1f1fa.png?v8",austria:"unicode/1f1e6-1f1f9.png?v8",auto_rickshaw:"unicode/1f6fa.png?v8",avocado:"unicode/1f951.png?v8",axe:"unicode/1fa93.png?v8",azerbaijan:"unicode/1f1e6-1f1ff.png?v8",b:"unicode/1f171.png?v8",baby:"unicode/1f476.png?v8",baby_bottle:"unicode/1f37c.png?v8",baby_chick:"unicode/1f
424.png?v8",baby_symbol:"unicode/1f6bc.png?v8",back:"unicode/1f519.png?v8",bacon:"unicode/1f953.png?v8",badger:"unicode/1f9a1.png?v8",badminton:"unicode/1f3f8.png?v8",bagel:"unicode/1f96f.png?v8",baggage_claim:"unicode/1f6c4.png?v8",baguette_bread:"unicode/1f956.png?v8",bahamas:"unicode/1f1e7-1f1f8.png?v8",bahrain:"unicode/1f1e7-1f1ed.png?v8",balance_scale:"unicode/2696.png?v8",bald_man:"unicode/1f468-1f9b2.png?v8",bald_woman:"unicode/1f469-1f9b2.png?v8",ballet_shoes:"unicode/1fa70.png?v8",balloon:"unicode/1f388.png?v8",ballot_box:"unicode/1f5f3.png?v8",ballot_box_with_check:"unicode/2611.png?v8",bamboo:"unicode/1f38d.png?v8",banana:"unicode/1f34c.png?v8",bangbang:"unicode/203c.png?v8",bangladesh:"unicode/1f1e7-1f1e9.png?v8",banjo:"unicode/1fa95.png?v8",bank:"unicode/1f3e6.png?v8",bar_chart:"unicode/1f4ca.png?v8",barbados:"unicode/1f1e7-1f1e7.png?v8",barber:"unicode/1f488.png?v8",baseball:"unicode/26be.png?v8",basecamp:"basecamp.png?v8",basecampy:"basecampy.png?v8",basket:"unicode/1f9fa.png?v8",basketball:"unicode/1f3c0.png?v8",basketball_man:"unicode/26f9-2642.png?v8",basketball_woman:"unicode/26f9-2640.png?v8",bat:"unicode/1f987.png?v8",bath:"unicode/1f6c0.png?v8",bathtub:"unicode/1f6c1.png?v8",battery:"unicode/1f50b.png?v8",beach_umbrella:"unicode/1f3d6.png?v8",bear:"unicode/1f43b.png?v8",bearded_person:"unicode/1f9d4.png?v8",beaver:"unicode/1f9ab.png?v8",bed:"unicode/1f6cf.png?v8",bee:"unicode/1f41d.png?v8",beer:"unicode/1f37a.png?v8",beers:"unicode/1f37b.png?v8",beetle:"unicode/1fab2.png?v8",beginner:"unicode/1f530.png?v8",belarus:"unicode/1f1e7-1f1fe.png?v8",belgium:"unicode/1f1e7-1f1ea.png?v8",belize:"unicode/1f1e7-1f1ff.png?v8",bell:"unicode/1f514.png?v8",bell_pepper:"unicode/1fad1.png?v8",bellhop_bell:"unicode/1f6ce.png?v8",benin:"unicode/1f1e7-1f1ef.png?v8",bento:"unicode/1f371.png?v8",bermuda:"unicode/1f1e7-1f1f2.png?v8",beverage_box:"unicode/1f9c3.png?v8",bhutan:"unicode/1f1e7-1f1f9.png?v8",bicyclist:"unicode/1f6b4.png?v8",bike:"unicode/1f6b2.png?v8",biking_man:"unicode/1f6b4-2642.png?v8",biking_woman:"unicode/1f6b4-2640.png?v8",bikini:"unicode/1f459.png?v8",billed_cap:"unicode/1f9e2.png?v8",biohazard:"unicode/2623.png?v8",bird:"unicode/1f426.png?v8",birthday:"unicode/1f382.png?v8",bison:"unicode/1f9ac.png?v8",black_cat:"unicode/1f408-2b1b.png?v8",black_circle:"unicode/26ab.png?v8",black_flag:"unicode/1f3f4.png?v8",black_heart:"unicode/1f5a4.png?v8",black_joker:"unicode/1f0cf.png?v8",black_large_square:"unicode/2b1b.png?v8",black_medium_small_square:"unicode/25fe.png?v8",black_medium_square:"unicode/25fc.png?v8",black_nib:"unicode/2712.png?v8",black_small_square:"unicode/25aa.png?v8",black_square_button:"unicode/1f532.png?v8",blond_haired_man:"unicode/1f471-2642.png?v8",blond_haired_person:"unicode/1f471.png?v8",blond_haired_woman:"unicode/1f471-2640.png?v8",blonde_woman:"unicode/1f471-2640.png?v8",blossom:"unicode/1f33c.png?v8",blowfish:"unicode/1f421.png?v8",blue_book:"unicode/1f4d8.png?v8",blue_car:"unicode/1f699.png?v8",blue_heart:"unicode/1f499.png?v8",blue_square:"unicode/1f7e6.png?v8",blueberries:"unicode/1fad0.png?v8",blush:"unicode/1f60a.png?v8",boar:"unicode/1f417.png?v8",boat:"unicode/26f5.png?v8",bolivia:"unicode/1f1e7-1f1f4.png?v8",bomb:"unicode/1f4a3.png?v8",bone:"unicode/1f9b4.png?v8",book:"unicode/1f4d6.png?v8",bookmark:"unicode/1f516.png?v8",bookmark_tabs:"unicode/1f4d1.png?v8",books:"unicode/1f4da.png?v8",boom:"unicode/1f4a5.png?v8",boomerang:"unicode/1fa83.png?v8",boot:"unicode/1f462.png?v8",bosnia_herzegovina:"unicode/1f1e7-1f1e6.png?v8",botswana:"unicode/1f1
e7-1f1fc.png?v8",bouncing_ball_man:"unicode/26f9-2642.png?v8",bouncing_ball_person:"unicode/26f9.png?v8",bouncing_ball_woman:"unicode/26f9-2640.png?v8",bouquet:"unicode/1f490.png?v8",bouvet_island:"unicode/1f1e7-1f1fb.png?v8",bow:"unicode/1f647.png?v8",bow_and_arrow:"unicode/1f3f9.png?v8",bowing_man:"unicode/1f647-2642.png?v8",bowing_woman:"unicode/1f647-2640.png?v8",bowl_with_spoon:"unicode/1f963.png?v8",bowling:"unicode/1f3b3.png?v8",bowtie:"bowtie.png?v8",boxing_glove:"unicode/1f94a.png?v8",boy:"unicode/1f466.png?v8",brain:"unicode/1f9e0.png?v8",brazil:"unicode/1f1e7-1f1f7.png?v8",bread:"unicode/1f35e.png?v8",breast_feeding:"unicode/1f931.png?v8",bricks:"unicode/1f9f1.png?v8",bride_with_veil:"unicode/1f470-2640.png?v8",bridge_at_night:"unicode/1f309.png?v8",briefcase:"unicode/1f4bc.png?v8",british_indian_ocean_territory:"unicode/1f1ee-1f1f4.png?v8",british_virgin_islands:"unicode/1f1fb-1f1ec.png?v8",broccoli:"unicode/1f966.png?v8",broken_heart:"unicode/1f494.png?v8",broom:"unicode/1f9f9.png?v8",brown_circle:"unicode/1f7e4.png?v8",brown_heart:"unicode/1f90e.png?v8",brown_square:"unicode/1f7eb.png?v8",brunei:"unicode/1f1e7-1f1f3.png?v8",bubble_tea:"unicode/1f9cb.png?v8",bucket:"unicode/1faa3.png?v8",bug:"unicode/1f41b.png?v8",building_construction:"unicode/1f3d7.png?v8",bulb:"unicode/1f4a1.png?v8",bulgaria:"unicode/1f1e7-1f1ec.png?v8",bullettrain_front:"unicode/1f685.png?v8",bullettrain_side:"unicode/1f684.png?v8",burkina_faso:"unicode/1f1e7-1f1eb.png?v8",burrito:"unicode/1f32f.png?v8",burundi:"unicode/1f1e7-1f1ee.png?v8",bus:"unicode/1f68c.png?v8",business_suit_levitating:"unicode/1f574.png?v8",busstop:"unicode/1f68f.png?v8",bust_in_silhouette:"unicode/1f464.png?v8",busts_in_silhouette:"unicode/1f465.png?v8",butter:"unicode/1f9c8.png?v8",butterfly:"unicode/1f98b.png?v8",cactus:"unicode/1f335.png?v8",cake:"unicode/1f370.png?v8",calendar:"unicode/1f4c6.png?v8",call_me_hand:"unicode/1f919.png?v8",calling:"unicode/1f4f2.png?v8",cambodia:"unicode/1f1f0-1f1ed.png?v8",camel:"unicode/1f42b.png?v8",camera:"unicode/1f4f7.png?v8",camera_flash:"unicode/1f4f8.png?v8",cameroon:"unicode/1f1e8-1f1f2.png?v8",camping:"unicode/1f3d5.png?v8",canada:"unicode/1f1e8-1f1e6.png?v8",canary_islands:"unicode/1f1ee-1f1e8.png?v8",cancer:"unicode/264b.png?v8",candle:"unicode/1f56f.png?v8",candy:"unicode/1f36c.png?v8",canned_food:"unicode/1f96b.png?v8",canoe:"unicode/1f6f6.png?v8",cape_verde:"unicode/1f1e8-1f1fb.png?v8",capital_abcd:"unicode/1f520.png?v8",capricorn:"unicode/2651.png?v8",car:"unicode/1f697.png?v8",card_file_box:"unicode/1f5c3.png?v8",card_index:"unicode/1f4c7.png?v8",card_index_dividers:"unicode/1f5c2.png?v8",caribbean_netherlands:"unicode/1f1e7-1f1f6.png?v8",carousel_horse:"unicode/1f3a0.png?v8",carpentry_saw:"unicode/1fa9a.png?v8",carrot:"unicode/1f955.png?v8",cartwheeling:"unicode/1f938.png?v8",cat:"unicode/1f431.png?v8",cat2:"unicode/1f408.png?v8",cayman_islands:"unicode/1f1f0-1f1fe.png?v8",cd:"unicode/1f4bf.png?v8",central_african_republic:"unicode/1f1e8-1f1eb.png?v8",ceuta_melilla:"unicode/1f1ea-1f1e6.png?v8",chad:"unicode/1f1f9-1f1e9.png?v8",chains:"unicode/26d3.png?v8",chair:"unicode/1fa91.png?v8",champagne:"unicode/1f37e.png?v8",chart:"unicode/1f4b9.png?v8",chart_with_downwards_trend:"unicode/1f4c9.png?v8",chart_with_upwards_trend:"unicode/1f4c8.png?v8",checkered_flag:"unicode/1f3c1.png?v8",cheese:"unicode/1f9c0.png?v8",cherries:"unicode/1f352.png?v8",cherry_blossom:"unicode/1f338.png?v8",chess_pawn:"unicode/265f.png?v8",chestnut:"unicode/1f330.png?v8",chicken:"unicode/1f414.png?v8",child:"unico
de/1f9d2.png?v8",children_crossing:"unicode/1f6b8.png?v8",chile:"unicode/1f1e8-1f1f1.png?v8",chipmunk:"unicode/1f43f.png?v8",chocolate_bar:"unicode/1f36b.png?v8",chopsticks:"unicode/1f962.png?v8",christmas_island:"unicode/1f1e8-1f1fd.png?v8",christmas_tree:"unicode/1f384.png?v8",church:"unicode/26ea.png?v8",cinema:"unicode/1f3a6.png?v8",circus_tent:"unicode/1f3aa.png?v8",city_sunrise:"unicode/1f307.png?v8",city_sunset:"unicode/1f306.png?v8",cityscape:"unicode/1f3d9.png?v8",cl:"unicode/1f191.png?v8",clamp:"unicode/1f5dc.png?v8",clap:"unicode/1f44f.png?v8",clapper:"unicode/1f3ac.png?v8",classical_building:"unicode/1f3db.png?v8",climbing:"unicode/1f9d7.png?v8",climbing_man:"unicode/1f9d7-2642.png?v8",climbing_woman:"unicode/1f9d7-2640.png?v8",clinking_glasses:"unicode/1f942.png?v8",clipboard:"unicode/1f4cb.png?v8",clipperton_island:"unicode/1f1e8-1f1f5.png?v8",clock1:"unicode/1f550.png?v8",clock10:"unicode/1f559.png?v8",clock1030:"unicode/1f565.png?v8",clock11:"unicode/1f55a.png?v8",clock1130:"unicode/1f566.png?v8",clock12:"unicode/1f55b.png?v8",clock1230:"unicode/1f567.png?v8",clock130:"unicode/1f55c.png?v8",clock2:"unicode/1f551.png?v8",clock230:"unicode/1f55d.png?v8",clock3:"unicode/1f552.png?v8",clock330:"unicode/1f55e.png?v8",clock4:"unicode/1f553.png?v8",clock430:"unicode/1f55f.png?v8",clock5:"unicode/1f554.png?v8",clock530:"unicode/1f560.png?v8",clock6:"unicode/1f555.png?v8",clock630:"unicode/1f561.png?v8",clock7:"unicode/1f556.png?v8",clock730:"unicode/1f562.png?v8",clock8:"unicode/1f557.png?v8",clock830:"unicode/1f563.png?v8",clock9:"unicode/1f558.png?v8",clock930:"unicode/1f564.png?v8",closed_book:"unicode/1f4d5.png?v8",closed_lock_with_key:"unicode/1f510.png?v8",closed_umbrella:"unicode/1f302.png?v8",cloud:"unicode/2601.png?v8",cloud_with_lightning:"unicode/1f329.png?v8",cloud_with_lightning_and_rain:"unicode/26c8.png?v8",cloud_with_rain:"unicode/1f327.png?v8",cloud_with_snow:"unicode/1f328.png?v8",clown_face:"unicode/1f921.png?v8",clubs:"unicode/2663.png?v8",cn:"unicode/1f1e8-1f1f3.png?v8",coat:"unicode/1f9e5.png?v8",cockroach:"unicode/1fab3.png?v8",cocktail:"unicode/1f378.png?v8",coconut:"unicode/1f965.png?v8",cocos_islands:"unicode/1f1e8-1f1e8.png?v8",coffee:"unicode/2615.png?v8",coffin:"unicode/26b0.png?v8",coin:"unicode/1fa99.png?v8",cold_face:"unicode/1f976.png?v8",cold_sweat:"unicode/1f630.png?v8",collision:"unicode/1f4a5.png?v8",colombia:"unicode/1f1e8-1f1f4.png?v8",comet:"unicode/2604.png?v8",comoros:"unicode/1f1f0-1f1f2.png?v8",compass:"unicode/1f9ed.png?v8",computer:"unicode/1f4bb.png?v8",computer_mouse:"unicode/1f5b1.png?v8",confetti_ball:"unicode/1f38a.png?v8",confounded:"unicode/1f616.png?v8",confused:"unicode/1f615.png?v8",congo_brazzaville:"unicode/1f1e8-1f1ec.png?v8",congo_kinshasa:"unicode/1f1e8-1f1e9.png?v8",congratulations:"unicode/3297.png?v8",construction:"unicode/1f6a7.png?v8",construction_worker:"unicode/1f477.png?v8",construction_worker_man:"unicode/1f477-2642.png?v8",construction_worker_woman:"unicode/1f477-2640.png?v8",control_knobs:"unicode/1f39b.png?v8",convenience_store:"unicode/1f3ea.png?v8",cook:"unicode/1f9d1-1f373.png?v8",cook_islands:"unicode/1f1e8-1f1f0.png?v8",cookie:"unicode/1f36a.png?v8",cool:"unicode/1f192.png?v8",cop:"unicode/1f46e.png?v8",copyright:"unicode/00a9.png?v8",corn:"unicode/1f33d.png?v8",costa_rica:"unicode/1f1e8-1f1f7.png?v8",cote_divoire:"unicode/1f1e8-1f1ee.png?v8",couch_and_lamp:"unicode/1f6cb.png?v8",couple:"unicode/1f46b.png?v8",couple_with_heart:"unicode/1f491.png?v8",couple_with_heart_man_man:"unicode/1f468-2764-1f468.png?v
8",couple_with_heart_woman_man:"unicode/1f469-2764-1f468.png?v8",couple_with_heart_woman_woman:"unicode/1f469-2764-1f469.png?v8",couplekiss:"unicode/1f48f.png?v8",couplekiss_man_man:"unicode/1f468-2764-1f48b-1f468.png?v8",couplekiss_man_woman:"unicode/1f469-2764-1f48b-1f468.png?v8",couplekiss_woman_woman:"unicode/1f469-2764-1f48b-1f469.png?v8",cow:"unicode/1f42e.png?v8",cow2:"unicode/1f404.png?v8",cowboy_hat_face:"unicode/1f920.png?v8",crab:"unicode/1f980.png?v8",crayon:"unicode/1f58d.png?v8",credit_card:"unicode/1f4b3.png?v8",crescent_moon:"unicode/1f319.png?v8",cricket:"unicode/1f997.png?v8",cricket_game:"unicode/1f3cf.png?v8",croatia:"unicode/1f1ed-1f1f7.png?v8",crocodile:"unicode/1f40a.png?v8",croissant:"unicode/1f950.png?v8",crossed_fingers:"unicode/1f91e.png?v8",crossed_flags:"unicode/1f38c.png?v8",crossed_swords:"unicode/2694.png?v8",crown:"unicode/1f451.png?v8",cry:"unicode/1f622.png?v8",crying_cat_face:"unicode/1f63f.png?v8",crystal_ball:"unicode/1f52e.png?v8",cuba:"unicode/1f1e8-1f1fa.png?v8",cucumber:"unicode/1f952.png?v8",cup_with_straw:"unicode/1f964.png?v8",cupcake:"unicode/1f9c1.png?v8",cupid:"unicode/1f498.png?v8",curacao:"unicode/1f1e8-1f1fc.png?v8",curling_stone:"unicode/1f94c.png?v8",curly_haired_man:"unicode/1f468-1f9b1.png?v8",curly_haired_woman:"unicode/1f469-1f9b1.png?v8",curly_loop:"unicode/27b0.png?v8",currency_exchange:"unicode/1f4b1.png?v8",curry:"unicode/1f35b.png?v8",cursing_face:"unicode/1f92c.png?v8",custard:"unicode/1f36e.png?v8",customs:"unicode/1f6c3.png?v8",cut_of_meat:"unicode/1f969.png?v8",cyclone:"unicode/1f300.png?v8",cyprus:"unicode/1f1e8-1f1fe.png?v8",czech_republic:"unicode/1f1e8-1f1ff.png?v8",dagger:"unicode/1f5e1.png?v8",dancer:"unicode/1f483.png?v8",dancers:"unicode/1f46f.png?v8",dancing_men:"unicode/1f46f-2642.png?v8",dancing_women:"unicode/1f46f-2640.png?v8",dango:"unicode/1f361.png?v8",dark_sunglasses:"unicode/1f576.png?v8",dart:"unicode/1f3af.png?v8",dash:"unicode/1f4a8.png?v8",date:"unicode/1f4c5.png?v8",de:"unicode/1f1e9-1f1ea.png?v8",deaf_man:"unicode/1f9cf-2642.png?v8",deaf_person:"unicode/1f9cf.png?v8",deaf_woman:"unicode/1f9cf-2640.png?v8",deciduous_tree:"unicode/1f333.png?v8",deer:"unicode/1f98c.png?v8",denmark:"unicode/1f1e9-1f1f0.png?v8",department_store:"unicode/1f3ec.png?v8",derelict_house:"unicode/1f3da.png?v8",desert:"unicode/1f3dc.png?v8",desert_island:"unicode/1f3dd.png?v8",desktop_computer:"unicode/1f5a5.png?v8",detective:"unicode/1f575.png?v8",diamond_shape_with_a_dot_inside:"unicode/1f4a0.png?v8",diamonds:"unicode/2666.png?v8",diego_garcia:"unicode/1f1e9-1f1ec.png?v8",disappointed:"unicode/1f61e.png?v8",disappointed_relieved:"unicode/1f625.png?v8",disguised_face:"unicode/1f978.png?v8",diving_mask:"unicode/1f93f.png?v8",diya_lamp:"unicode/1fa94.png?v8",dizzy:"unicode/1f4ab.png?v8",dizzy_face:"unicode/1f635.png?v8",djibouti:"unicode/1f1e9-1f1ef.png?v8",dna:"unicode/1f9ec.png?v8",do_not_litter:"unicode/1f6af.png?v8",dodo:"unicode/1f9a4.png?v8",dog:"unicode/1f436.png?v8",dog2:"unicode/1f415.png?v8",dollar:"unicode/1f4b5.png?v8",dolls:"unicode/1f38e.png?v8",dolphin:"unicode/1f42c.png?v8",dominica:"unicode/1f1e9-1f1f2.png?v8",dominican_republic:"unicode/1f1e9-1f1f4.png?v8",door:"unicode/1f6aa.png?v8",doughnut:"unicode/1f369.png?v8",dove:"unicode/1f54a.png?v8",dragon:"unicode/1f409.png?v8",dragon_face:"unicode/1f432.png?v8",dress:"unicode/1f457.png?v8",dromedary_camel:"unicode/1f42a.png?v8",drooling_face:"unicode/1f924.png?v8",drop_of_blood:"unicode/1fa78.png?v8",droplet:"unicode/1f4a7.png?v8",drum:"unicode/1f941.png?v8",duck:"unic
ode/1f986.png?v8",dumpling:"unicode/1f95f.png?v8",dvd:"unicode/1f4c0.png?v8","e-mail":"unicode/1f4e7.png?v8",eagle:"unicode/1f985.png?v8",ear:"unicode/1f442.png?v8",ear_of_rice:"unicode/1f33e.png?v8",ear_with_hearing_aid:"unicode/1f9bb.png?v8",earth_africa:"unicode/1f30d.png?v8",earth_americas:"unicode/1f30e.png?v8",earth_asia:"unicode/1f30f.png?v8",ecuador:"unicode/1f1ea-1f1e8.png?v8",egg:"unicode/1f95a.png?v8",eggplant:"unicode/1f346.png?v8",egypt:"unicode/1f1ea-1f1ec.png?v8",eight:"unicode/0038-20e3.png?v8",eight_pointed_black_star:"unicode/2734.png?v8",eight_spoked_asterisk:"unicode/2733.png?v8",eject_button:"unicode/23cf.png?v8",el_salvador:"unicode/1f1f8-1f1fb.png?v8",electric_plug:"unicode/1f50c.png?v8",electron:"electron.png?v8",elephant:"unicode/1f418.png?v8",elevator:"unicode/1f6d7.png?v8",elf:"unicode/1f9dd.png?v8",elf_man:"unicode/1f9dd-2642.png?v8",elf_woman:"unicode/1f9dd-2640.png?v8",email:"unicode/1f4e7.png?v8",end:"unicode/1f51a.png?v8",england:"unicode/1f3f4-e0067-e0062-e0065-e006e-e0067-e007f.png?v8",envelope:"unicode/2709.png?v8",envelope_with_arrow:"unicode/1f4e9.png?v8",equatorial_guinea:"unicode/1f1ec-1f1f6.png?v8",eritrea:"unicode/1f1ea-1f1f7.png?v8",es:"unicode/1f1ea-1f1f8.png?v8",estonia:"unicode/1f1ea-1f1ea.png?v8",ethiopia:"unicode/1f1ea-1f1f9.png?v8",eu:"unicode/1f1ea-1f1fa.png?v8",euro:"unicode/1f4b6.png?v8",european_castle:"unicode/1f3f0.png?v8",european_post_office:"unicode/1f3e4.png?v8",european_union:"unicode/1f1ea-1f1fa.png?v8",evergreen_tree:"unicode/1f332.png?v8",exclamation:"unicode/2757.png?v8",exploding_head:"unicode/1f92f.png?v8",expressionless:"unicode/1f611.png?v8",eye:"unicode/1f441.png?v8",eye_speech_bubble:"unicode/1f441-1f5e8.png?v8",eyeglasses:"unicode/1f453.png?v8",eyes:"unicode/1f440.png?v8",face_exhaling:"unicode/1f62e-1f4a8.png?v8",face_in_clouds:"unicode/1f636-1f32b.png?v8",face_with_head_bandage:"unicode/1f915.png?v8",face_with_spiral_eyes:"unicode/1f635-1f4ab.png?v8",face_with_thermometer:"unicode/1f912.png?v8",facepalm:"unicode/1f926.png?v8",facepunch:"unicode/1f44a.png?v8",factory:"unicode/1f3ed.png?v8",factory_worker:"unicode/1f9d1-1f3ed.png?v8",fairy:"unicode/1f9da.png?v8",fairy_man:"unicode/1f9da-2642.png?v8",fairy_woman:"unicode/1f9da-2640.png?v8",falafel:"unicode/1f9c6.png?v8",falkland_islands:"unicode/1f1eb-1f1f0.png?v8",fallen_leaf:"unicode/1f342.png?v8",family:"unicode/1f46a.png?v8",family_man_boy:"unicode/1f468-1f466.png?v8",family_man_boy_boy:"unicode/1f468-1f466-1f466.png?v8",family_man_girl:"unicode/1f468-1f467.png?v8",family_man_girl_boy:"unicode/1f468-1f467-1f466.png?v8",family_man_girl_girl:"unicode/1f468-1f467-1f467.png?v8",family_man_man_boy:"unicode/1f468-1f468-1f466.png?v8",family_man_man_boy_boy:"unicode/1f468-1f468-1f466-1f466.png?v8",family_man_man_girl:"unicode/1f468-1f468-1f467.png?v8",family_man_man_girl_boy:"unicode/1f468-1f468-1f467-1f466.png?v8",family_man_man_girl_girl:"unicode/1f468-1f468-1f467-1f467.png?v8",family_man_woman_boy:"unicode/1f468-1f469-1f466.png?v8",family_man_woman_boy_boy:"unicode/1f468-1f469-1f466-1f466.png?v8",family_man_woman_girl:"unicode/1f468-1f469-1f467.png?v8",family_man_woman_girl_boy:"unicode/1f468-1f469-1f467-1f466.png?v8",family_man_woman_girl_girl:"unicode/1f468-1f469-1f467-1f467.png?v8",family_woman_boy:"unicode/1f469-1f466.png?v8",family_woman_boy_boy:"unicode/1f469-1f466-1f466.png?v8",family_woman_girl:"unicode/1f469-1f467.png?v8",family_woman_girl_boy:"unicode/1f469-1f467-1f466.png?v8",family_woman_girl_girl:"unicode/1f469-1f467-1f467.png?v8",family_woman_woman_boy:"unicod
e/1f469-1f469-1f466.png?v8",family_woman_woman_boy_boy:"unicode/1f469-1f469-1f466-1f466.png?v8",family_woman_woman_girl:"unicode/1f469-1f469-1f467.png?v8",family_woman_woman_girl_boy:"unicode/1f469-1f469-1f467-1f466.png?v8",family_woman_woman_girl_girl:"unicode/1f469-1f469-1f467-1f467.png?v8",farmer:"unicode/1f9d1-1f33e.png?v8",faroe_islands:"unicode/1f1eb-1f1f4.png?v8",fast_forward:"unicode/23e9.png?v8",fax:"unicode/1f4e0.png?v8",fearful:"unicode/1f628.png?v8",feather:"unicode/1fab6.png?v8",feelsgood:"feelsgood.png?v8",feet:"unicode/1f43e.png?v8",female_detective:"unicode/1f575-2640.png?v8",female_sign:"unicode/2640.png?v8",ferris_wheel:"unicode/1f3a1.png?v8",ferry:"unicode/26f4.png?v8",field_hockey:"unicode/1f3d1.png?v8",fiji:"unicode/1f1eb-1f1ef.png?v8",file_cabinet:"unicode/1f5c4.png?v8",file_folder:"unicode/1f4c1.png?v8",film_projector:"unicode/1f4fd.png?v8",film_strip:"unicode/1f39e.png?v8",finland:"unicode/1f1eb-1f1ee.png?v8",finnadie:"finnadie.png?v8",fire:"unicode/1f525.png?v8",fire_engine:"unicode/1f692.png?v8",fire_extinguisher:"unicode/1f9ef.png?v8",firecracker:"unicode/1f9e8.png?v8",firefighter:"unicode/1f9d1-1f692.png?v8",fireworks:"unicode/1f386.png?v8",first_quarter_moon:"unicode/1f313.png?v8",first_quarter_moon_with_face:"unicode/1f31b.png?v8",fish:"unicode/1f41f.png?v8",fish_cake:"unicode/1f365.png?v8",fishing_pole_and_fish:"unicode/1f3a3.png?v8",fist:"unicode/270a.png?v8",fist_left:"unicode/1f91b.png?v8",fist_oncoming:"unicode/1f44a.png?v8",fist_raised:"unicode/270a.png?v8",fist_right:"unicode/1f91c.png?v8",five:"unicode/0035-20e3.png?v8",flags:"unicode/1f38f.png?v8",flamingo:"unicode/1f9a9.png?v8",flashlight:"unicode/1f526.png?v8",flat_shoe:"unicode/1f97f.png?v8",flatbread:"unicode/1fad3.png?v8",fleur_de_lis:"unicode/269c.png?v8",flight_arrival:"unicode/1f6ec.png?v8",flight_departure:"unicode/1f6eb.png?v8",flipper:"unicode/1f42c.png?v8",floppy_disk:"unicode/1f4be.png?v8",flower_playing_cards:"unicode/1f3b4.png?v8",flushed:"unicode/1f633.png?v8",fly:"unicode/1fab0.png?v8",flying_disc:"unicode/1f94f.png?v8",flying_saucer:"unicode/1f6f8.png?v8",fog:"unicode/1f32b.png?v8",foggy:"unicode/1f301.png?v8",fondue:"unicode/1fad5.png?v8",foot:"unicode/1f9b6.png?v8",football:"unicode/1f3c8.png?v8",footprints:"unicode/1f463.png?v8",fork_and_knife:"unicode/1f374.png?v8",fortune_cookie:"unicode/1f960.png?v8",fountain:"unicode/26f2.png?v8",fountain_pen:"unicode/1f58b.png?v8",four:"unicode/0034-20e3.png?v8",four_leaf_clover:"unicode/1f340.png?v8",fox_face:"unicode/1f98a.png?v8",fr:"unicode/1f1eb-1f1f7.png?v8",framed_picture:"unicode/1f5bc.png?v8",free:"unicode/1f193.png?v8",french_guiana:"unicode/1f1ec-1f1eb.png?v8",french_polynesia:"unicode/1f1f5-1f1eb.png?v8",french_southern_territories:"unicode/1f1f9-1f1eb.png?v8",fried_egg:"unicode/1f373.png?v8",fried_shrimp:"unicode/1f364.png?v8",fries:"unicode/1f35f.png?v8",frog:"unicode/1f438.png?v8",frowning:"unicode/1f626.png?v8",frowning_face:"unicode/2639.png?v8",frowning_man:"unicode/1f64d-2642.png?v8",frowning_person:"unicode/1f64d.png?v8",frowning_woman:"unicode/1f64d-2640.png?v8",fu:"unicode/1f595.png?v8",fuelpump:"unicode/26fd.png?v8",full_moon:"unicode/1f315.png?v8",full_moon_with_face:"unicode/1f31d.png?v8",funeral_urn:"unicode/26b1.png?v8",gabon:"unicode/1f1ec-1f1e6.png?v8",gambia:"unicode/1f1ec-1f1f2.png?v8",game_die:"unicode/1f3b2.png?v8",garlic:"unicode/1f9c4.png?v8",gb:"unicode/1f1ec-1f1e7.png?v8",gear:"unicode/2699.png?v8",gem:"unicode/1f48e.png?v8",gemini:"unicode/264a.png?v8",genie:"unicode/1f9de.png?v8",genie_man:"unicode/1f9de-
2642.png?v8",genie_woman:"unicode/1f9de-2640.png?v8",georgia:"unicode/1f1ec-1f1ea.png?v8",ghana:"unicode/1f1ec-1f1ed.png?v8",ghost:"unicode/1f47b.png?v8",gibraltar:"unicode/1f1ec-1f1ee.png?v8",gift:"unicode/1f381.png?v8",gift_heart:"unicode/1f49d.png?v8",giraffe:"unicode/1f992.png?v8",girl:"unicode/1f467.png?v8",globe_with_meridians:"unicode/1f310.png?v8",gloves:"unicode/1f9e4.png?v8",goal_net:"unicode/1f945.png?v8",goat:"unicode/1f410.png?v8",goberserk:"goberserk.png?v8",godmode:"godmode.png?v8",goggles:"unicode/1f97d.png?v8",golf:"unicode/26f3.png?v8",golfing:"unicode/1f3cc.png?v8",golfing_man:"unicode/1f3cc-2642.png?v8",golfing_woman:"unicode/1f3cc-2640.png?v8",gorilla:"unicode/1f98d.png?v8",grapes:"unicode/1f347.png?v8",greece:"unicode/1f1ec-1f1f7.png?v8",green_apple:"unicode/1f34f.png?v8",green_book:"unicode/1f4d7.png?v8",green_circle:"unicode/1f7e2.png?v8",green_heart:"unicode/1f49a.png?v8",green_salad:"unicode/1f957.png?v8",green_square:"unicode/1f7e9.png?v8",greenland:"unicode/1f1ec-1f1f1.png?v8",grenada:"unicode/1f1ec-1f1e9.png?v8",grey_exclamation:"unicode/2755.png?v8",grey_question:"unicode/2754.png?v8",grimacing:"unicode/1f62c.png?v8",grin:"unicode/1f601.png?v8",grinning:"unicode/1f600.png?v8",guadeloupe:"unicode/1f1ec-1f1f5.png?v8",guam:"unicode/1f1ec-1f1fa.png?v8",guard:"unicode/1f482.png?v8",guardsman:"unicode/1f482-2642.png?v8",guardswoman:"unicode/1f482-2640.png?v8",guatemala:"unicode/1f1ec-1f1f9.png?v8",guernsey:"unicode/1f1ec-1f1ec.png?v8",guide_dog:"unicode/1f9ae.png?v8",guinea:"unicode/1f1ec-1f1f3.png?v8",guinea_bissau:"unicode/1f1ec-1f1fc.png?v8",guitar:"unicode/1f3b8.png?v8",gun:"unicode/1f52b.png?v8",guyana:"unicode/1f1ec-1f1fe.png?v8",haircut:"unicode/1f487.png?v8",haircut_man:"unicode/1f487-2642.png?v8",haircut_woman:"unicode/1f487-2640.png?v8",haiti:"unicode/1f1ed-1f1f9.png?v8",hamburger:"unicode/1f354.png?v8",hammer:"unicode/1f528.png?v8",hammer_and_pick:"unicode/2692.png?v8",hammer_and_wrench:"unicode/1f6e0.png?v8",hamster:"unicode/1f439.png?v8",hand:"unicode/270b.png?v8",hand_over_mouth:"unicode/1f92d.png?v8",handbag:"unicode/1f45c.png?v8",handball_person:"unicode/1f93e.png?v8",handshake:"unicode/1f91d.png?v8",hankey:"unicode/1f4a9.png?v8",hash:"unicode/0023-20e3.png?v8",hatched_chick:"unicode/1f425.png?v8",hatching_chick:"unicode/1f423.png?v8",headphones:"unicode/1f3a7.png?v8",headstone:"unicode/1faa6.png?v8",health_worker:"unicode/1f9d1-2695.png?v8",hear_no_evil:"unicode/1f649.png?v8",heard_mcdonald_islands:"unicode/1f1ed-1f1f2.png?v8",heart:"unicode/2764.png?v8",heart_decoration:"unicode/1f49f.png?v8",heart_eyes:"unicode/1f60d.png?v8",heart_eyes_cat:"unicode/1f63b.png?v8",heart_on_fire:"unicode/2764-1f525.png?v8",heartbeat:"unicode/1f493.png?v8",heartpulse:"unicode/1f497.png?v8",hearts:"unicode/2665.png?v8",heavy_check_mark:"unicode/2714.png?v8",heavy_division_sign:"unicode/2797.png?v8",heavy_dollar_sign:"unicode/1f4b2.png?v8",heavy_exclamation_mark:"unicode/2757.png?v8",heavy_heart_exclamation:"unicode/2763.png?v8",heavy_minus_sign:"unicode/2796.png?v8",heavy_multiplication_x:"unicode/2716.png?v8",heavy_plus_sign:"unicode/2795.png?v8",hedgehog:"unicode/1f994.png?v8",helicopter:"unicode/1f681.png?v8",herb:"unicode/1f33f.png?v8",hibiscus:"unicode/1f33a.png?v8",high_brightness:"unicode/1f506.png?v8",high_heel:"unicode/1f460.png?v8",hiking_boot:"unicode/1f97e.png?v8",hindu_temple:"unicode/1f6d5.png?v8",hippopotamus:"unicode/1f99b.png?v8",hocho:"unicode/1f52a.png?v8",hole:"unicode/1f573.png?v8",honduras:"unicode/1f1ed-1f1f3.png?v8",honey_pot:"unicode/1f36f.png?v
8",honeybee:"unicode/1f41d.png?v8",hong_kong:"unicode/1f1ed-1f1f0.png?v8",hook:"unicode/1fa9d.png?v8",horse:"unicode/1f434.png?v8",horse_racing:"unicode/1f3c7.png?v8",hospital:"unicode/1f3e5.png?v8",hot_face:"unicode/1f975.png?v8",hot_pepper:"unicode/1f336.png?v8",hotdog:"unicode/1f32d.png?v8",hotel:"unicode/1f3e8.png?v8",hotsprings:"unicode/2668.png?v8",hourglass:"unicode/231b.png?v8",hourglass_flowing_sand:"unicode/23f3.png?v8",house:"unicode/1f3e0.png?v8",house_with_garden:"unicode/1f3e1.png?v8",houses:"unicode/1f3d8.png?v8",hugs:"unicode/1f917.png?v8",hungary:"unicode/1f1ed-1f1fa.png?v8",hurtrealbad:"hurtrealbad.png?v8",hushed:"unicode/1f62f.png?v8",hut:"unicode/1f6d6.png?v8",ice_cream:"unicode/1f368.png?v8",ice_cube:"unicode/1f9ca.png?v8",ice_hockey:"unicode/1f3d2.png?v8",ice_skate:"unicode/26f8.png?v8",icecream:"unicode/1f366.png?v8",iceland:"unicode/1f1ee-1f1f8.png?v8",id:"unicode/1f194.png?v8",ideograph_advantage:"unicode/1f250.png?v8",imp:"unicode/1f47f.png?v8",inbox_tray:"unicode/1f4e5.png?v8",incoming_envelope:"unicode/1f4e8.png?v8",india:"unicode/1f1ee-1f1f3.png?v8",indonesia:"unicode/1f1ee-1f1e9.png?v8",infinity:"unicode/267e.png?v8",information_desk_person:"unicode/1f481.png?v8",information_source:"unicode/2139.png?v8",innocent:"unicode/1f607.png?v8",interrobang:"unicode/2049.png?v8",iphone:"unicode/1f4f1.png?v8",iran:"unicode/1f1ee-1f1f7.png?v8",iraq:"unicode/1f1ee-1f1f6.png?v8",ireland:"unicode/1f1ee-1f1ea.png?v8",isle_of_man:"unicode/1f1ee-1f1f2.png?v8",israel:"unicode/1f1ee-1f1f1.png?v8",it:"unicode/1f1ee-1f1f9.png?v8",izakaya_lantern:"unicode/1f3ee.png?v8",jack_o_lantern:"unicode/1f383.png?v8",jamaica:"unicode/1f1ef-1f1f2.png?v8",japan:"unicode/1f5fe.png?v8",japanese_castle:"unicode/1f3ef.png?v8",japanese_goblin:"unicode/1f47a.png?v8",japanese_ogre:"unicode/1f479.png?v8",jeans:"unicode/1f456.png?v8",jersey:"unicode/1f1ef-1f1ea.png?v8",jigsaw:"unicode/1f9e9.png?v8",jordan:"unicode/1f1ef-1f1f4.png?v8",joy:"unicode/1f602.png?v8",joy_cat:"unicode/1f639.png?v8",joystick:"unicode/1f579.png?v8",jp:"unicode/1f1ef-1f1f5.png?v8",judge:"unicode/1f9d1-2696.png?v8",juggling_person:"unicode/1f939.png?v8",kaaba:"unicode/1f54b.png?v8",kangaroo:"unicode/1f998.png?v8",kazakhstan:"unicode/1f1f0-1f1ff.png?v8",kenya:"unicode/1f1f0-1f1ea.png?v8",key:"unicode/1f511.png?v8",keyboard:"unicode/2328.png?v8",keycap_ten:"unicode/1f51f.png?v8",kick_scooter:"unicode/1f6f4.png?v8",kimono:"unicode/1f458.png?v8",kiribati:"unicode/1f1f0-1f1ee.png?v8",kiss:"unicode/1f48b.png?v8",kissing:"unicode/1f617.png?v8",kissing_cat:"unicode/1f63d.png?v8",kissing_closed_eyes:"unicode/1f61a.png?v8",kissing_heart:"unicode/1f618.png?v8",kissing_smiling_eyes:"unicode/1f619.png?v8",kite:"unicode/1fa81.png?v8",kiwi_fruit:"unicode/1f95d.png?v8",kneeling_man:"unicode/1f9ce-2642.png?v8",kneeling_person:"unicode/1f9ce.png?v8",kneeling_woman:"unicode/1f9ce-2640.png?v8",knife:"unicode/1f52a.png?v8",knot:"unicode/1faa2.png?v8",koala:"unicode/1f428.png?v8",koko:"unicode/1f201.png?v8",kosovo:"unicode/1f1fd-1f1f0.png?v8",kr:"unicode/1f1f0-1f1f7.png?v8",kuwait:"unicode/1f1f0-1f1fc.png?v8",kyrgyzstan:"unicode/1f1f0-1f1ec.png?v8",lab_coat:"unicode/1f97c.png?v8",label:"unicode/1f3f7.png?v8",lacrosse:"unicode/1f94d.png?v8",ladder:"unicode/1fa9c.png?v8",lady_beetle:"unicode/1f41e.png?v8",lantern:"unicode/1f3ee.png?v8",laos:"unicode/1f1f1-1f1e6.png?v8",large_blue_circle:"unicode/1f535.png?v8",large_blue_diamond:"unicode/1f537.png?v8",large_orange_diamond:"unicode/1f536.png?v8",last_quarter_moon:"unicode/1f317.png?v8",last_quarter_moon_with_f
ace:"unicode/1f31c.png?v8",latin_cross:"unicode/271d.png?v8",latvia:"unicode/1f1f1-1f1fb.png?v8",laughing:"unicode/1f606.png?v8",leafy_green:"unicode/1f96c.png?v8",leaves:"unicode/1f343.png?v8",lebanon:"unicode/1f1f1-1f1e7.png?v8",ledger:"unicode/1f4d2.png?v8",left_luggage:"unicode/1f6c5.png?v8",left_right_arrow:"unicode/2194.png?v8",left_speech_bubble:"unicode/1f5e8.png?v8",leftwards_arrow_with_hook:"unicode/21a9.png?v8",leg:"unicode/1f9b5.png?v8",lemon:"unicode/1f34b.png?v8",leo:"unicode/264c.png?v8",leopard:"unicode/1f406.png?v8",lesotho:"unicode/1f1f1-1f1f8.png?v8",level_slider:"unicode/1f39a.png?v8",liberia:"unicode/1f1f1-1f1f7.png?v8",libra:"unicode/264e.png?v8",libya:"unicode/1f1f1-1f1fe.png?v8",liechtenstein:"unicode/1f1f1-1f1ee.png?v8",light_rail:"unicode/1f688.png?v8",link:"unicode/1f517.png?v8",lion:"unicode/1f981.png?v8",lips:"unicode/1f444.png?v8",lipstick:"unicode/1f484.png?v8",lithuania:"unicode/1f1f1-1f1f9.png?v8",lizard:"unicode/1f98e.png?v8",llama:"unicode/1f999.png?v8",lobster:"unicode/1f99e.png?v8",lock:"unicode/1f512.png?v8",lock_with_ink_pen:"unicode/1f50f.png?v8",lollipop:"unicode/1f36d.png?v8",long_drum:"unicode/1fa98.png?v8",loop:"unicode/27bf.png?v8",lotion_bottle:"unicode/1f9f4.png?v8",lotus_position:"unicode/1f9d8.png?v8",lotus_position_man:"unicode/1f9d8-2642.png?v8",lotus_position_woman:"unicode/1f9d8-2640.png?v8",loud_sound:"unicode/1f50a.png?v8",loudspeaker:"unicode/1f4e2.png?v8",love_hotel:"unicode/1f3e9.png?v8",love_letter:"unicode/1f48c.png?v8",love_you_gesture:"unicode/1f91f.png?v8",low_brightness:"unicode/1f505.png?v8",luggage:"unicode/1f9f3.png?v8",lungs:"unicode/1fac1.png?v8",luxembourg:"unicode/1f1f1-1f1fa.png?v8",lying_face:"unicode/1f925.png?v8",m:"unicode/24c2.png?v8",macau:"unicode/1f1f2-1f1f4.png?v8",macedonia:"unicode/1f1f2-1f1f0.png?v8",madagascar:"unicode/1f1f2-1f1ec.png?v8",mag:"unicode/1f50d.png?v8",mag_right:"unicode/1f50e.png?v8",mage:"unicode/1f9d9.png?v8",mage_man:"unicode/1f9d9-2642.png?v8",mage_woman:"unicode/1f9d9-2640.png?v8",magic_wand:"unicode/1fa84.png?v8",magnet:"unicode/1f9f2.png?v8",mahjong:"unicode/1f004.png?v8",mailbox:"unicode/1f4eb.png?v8",mailbox_closed:"unicode/1f4ea.png?v8",mailbox_with_mail:"unicode/1f4ec.png?v8",mailbox_with_no_mail:"unicode/1f4ed.png?v8",malawi:"unicode/1f1f2-1f1fc.png?v8",malaysia:"unicode/1f1f2-1f1fe.png?v8",maldives:"unicode/1f1f2-1f1fb.png?v8",male_detective:"unicode/1f575-2642.png?v8",male_sign:"unicode/2642.png?v8",mali:"unicode/1f1f2-1f1f1.png?v8",malta:"unicode/1f1f2-1f1f9.png?v8",mammoth:"unicode/1f9a3.png?v8",man:"unicode/1f468.png?v8",man_artist:"unicode/1f468-1f3a8.png?v8",man_astronaut:"unicode/1f468-1f680.png?v8",man_beard:"unicode/1f9d4-2642.png?v8",man_cartwheeling:"unicode/1f938-2642.png?v8",man_cook:"unicode/1f468-1f373.png?v8",man_dancing:"unicode/1f57a.png?v8",man_facepalming:"unicode/1f926-2642.png?v8",man_factory_worker:"unicode/1f468-1f3ed.png?v8",man_farmer:"unicode/1f468-1f33e.png?v8",man_feeding_baby:"unicode/1f468-1f37c.png?v8",man_firefighter:"unicode/1f468-1f692.png?v8",man_health_worker:"unicode/1f468-2695.png?v8",man_in_manual_wheelchair:"unicode/1f468-1f9bd.png?v8",man_in_motorized_wheelchair:"unicode/1f468-1f9bc.png?v8",man_in_tuxedo:"unicode/1f935-2642.png?v8",man_judge:"unicode/1f468-2696.png?v8",man_juggling:"unicode/1f939-2642.png?v8",man_mechanic:"unicode/1f468-1f527.png?v8",man_office_worker:"unicode/1f468-1f4bc.png?v8",man_pilot:"unicode/1f468-2708.png?v8",man_playing_handball:"unicode/1f93e-2642.png?v8",man_playing_water_polo:"unicode/1f93d-2642.png?v8",man_sci
entist:"unicode/1f468-1f52c.png?v8",man_shrugging:"unicode/1f937-2642.png?v8",man_singer:"unicode/1f468-1f3a4.png?v8",man_student:"unicode/1f468-1f393.png?v8",man_teacher:"unicode/1f468-1f3eb.png?v8",man_technologist:"unicode/1f468-1f4bb.png?v8",man_with_gua_pi_mao:"unicode/1f472.png?v8",man_with_probing_cane:"unicode/1f468-1f9af.png?v8",man_with_turban:"unicode/1f473-2642.png?v8",man_with_veil:"unicode/1f470-2642.png?v8",mandarin:"unicode/1f34a.png?v8",mango:"unicode/1f96d.png?v8",mans_shoe:"unicode/1f45e.png?v8",mantelpiece_clock:"unicode/1f570.png?v8",manual_wheelchair:"unicode/1f9bd.png?v8",maple_leaf:"unicode/1f341.png?v8",marshall_islands:"unicode/1f1f2-1f1ed.png?v8",martial_arts_uniform:"unicode/1f94b.png?v8",martinique:"unicode/1f1f2-1f1f6.png?v8",mask:"unicode/1f637.png?v8",massage:"unicode/1f486.png?v8",massage_man:"unicode/1f486-2642.png?v8",massage_woman:"unicode/1f486-2640.png?v8",mate:"unicode/1f9c9.png?v8",mauritania:"unicode/1f1f2-1f1f7.png?v8",mauritius:"unicode/1f1f2-1f1fa.png?v8",mayotte:"unicode/1f1fe-1f1f9.png?v8",meat_on_bone:"unicode/1f356.png?v8",mechanic:"unicode/1f9d1-1f527.png?v8",mechanical_arm:"unicode/1f9be.png?v8",mechanical_leg:"unicode/1f9bf.png?v8",medal_military:"unicode/1f396.png?v8",medal_sports:"unicode/1f3c5.png?v8",medical_symbol:"unicode/2695.png?v8",mega:"unicode/1f4e3.png?v8",melon:"unicode/1f348.png?v8",memo:"unicode/1f4dd.png?v8",men_wrestling:"unicode/1f93c-2642.png?v8",mending_heart:"unicode/2764-1fa79.png?v8",menorah:"unicode/1f54e.png?v8",mens:"unicode/1f6b9.png?v8",mermaid:"unicode/1f9dc-2640.png?v8",merman:"unicode/1f9dc-2642.png?v8",merperson:"unicode/1f9dc.png?v8",metal:"unicode/1f918.png?v8",metro:"unicode/1f687.png?v8",mexico:"unicode/1f1f2-1f1fd.png?v8",microbe:"unicode/1f9a0.png?v8",micronesia:"unicode/1f1eb-1f1f2.png?v8",microphone:"unicode/1f3a4.png?v8",microscope:"unicode/1f52c.png?v8",middle_finger:"unicode/1f595.png?v8",military_helmet:"unicode/1fa96.png?v8",milk_glass:"unicode/1f95b.png?v8",milky_way:"unicode/1f30c.png?v8",minibus:"unicode/1f690.png?v8",minidisc:"unicode/1f4bd.png?v8",mirror:"unicode/1fa9e.png?v8",mobile_phone_off:"unicode/1f4f4.png?v8",moldova:"unicode/1f1f2-1f1e9.png?v8",monaco:"unicode/1f1f2-1f1e8.png?v8",money_mouth_face:"unicode/1f911.png?v8",money_with_wings:"unicode/1f4b8.png?v8",moneybag:"unicode/1f4b0.png?v8",mongolia:"unicode/1f1f2-1f1f3.png?v8",monkey:"unicode/1f412.png?v8",monkey_face:"unicode/1f435.png?v8",monocle_face:"unicode/1f9d0.png?v8",monorail:"unicode/1f69d.png?v8",montenegro:"unicode/1f1f2-1f1ea.png?v8",montserrat:"unicode/1f1f2-1f1f8.png?v8",moon:"unicode/1f314.png?v8",moon_cake:"unicode/1f96e.png?v8",morocco:"unicode/1f1f2-1f1e6.png?v8",mortar_board:"unicode/1f393.png?v8",mosque:"unicode/1f54c.png?v8",mosquito:"unicode/1f99f.png?v8",motor_boat:"unicode/1f6e5.png?v8",motor_scooter:"unicode/1f6f5.png?v8",motorcycle:"unicode/1f3cd.png?v8",motorized_wheelchair:"unicode/1f9bc.png?v8",motorway:"unicode/1f6e3.png?v8",mount_fuji:"unicode/1f5fb.png?v8",mountain:"unicode/26f0.png?v8",mountain_bicyclist:"unicode/1f6b5.png?v8",mountain_biking_man:"unicode/1f6b5-2642.png?v8",mountain_biking_woman:"unicode/1f6b5-2640.png?v8",mountain_cableway:"unicode/1f6a0.png?v8",mountain_railway:"unicode/1f69e.png?v8",mountain_snow:"unicode/1f3d4.png?v8",mouse:"unicode/1f42d.png?v8",mouse2:"unicode/1f401.png?v8",mouse_trap:"unicode/1faa4.png?v8",movie_camera:"unicode/1f3a5.png?v8",moyai:"unicode/1f5ff.png?v8",mozambique:"unicode/1f1f2-1f1ff.png?v8",mrs_claus:"unicode/1f936.png?v8",muscle:"unicode/1f4aa.png?v8",mushr
oom:"unicode/1f344.png?v8",musical_keyboard:"unicode/1f3b9.png?v8",musical_note:"unicode/1f3b5.png?v8",musical_score:"unicode/1f3bc.png?v8",mute:"unicode/1f507.png?v8",mx_claus:"unicode/1f9d1-1f384.png?v8",myanmar:"unicode/1f1f2-1f1f2.png?v8",nail_care:"unicode/1f485.png?v8",name_badge:"unicode/1f4db.png?v8",namibia:"unicode/1f1f3-1f1e6.png?v8",national_park:"unicode/1f3de.png?v8",nauru:"unicode/1f1f3-1f1f7.png?v8",nauseated_face:"unicode/1f922.png?v8",nazar_amulet:"unicode/1f9ff.png?v8",neckbeard:"neckbeard.png?v8",necktie:"unicode/1f454.png?v8",negative_squared_cross_mark:"unicode/274e.png?v8",nepal:"unicode/1f1f3-1f1f5.png?v8",nerd_face:"unicode/1f913.png?v8",nesting_dolls:"unicode/1fa86.png?v8",netherlands:"unicode/1f1f3-1f1f1.png?v8",neutral_face:"unicode/1f610.png?v8",new:"unicode/1f195.png?v8",new_caledonia:"unicode/1f1f3-1f1e8.png?v8",new_moon:"unicode/1f311.png?v8",new_moon_with_face:"unicode/1f31a.png?v8",new_zealand:"unicode/1f1f3-1f1ff.png?v8",newspaper:"unicode/1f4f0.png?v8",newspaper_roll:"unicode/1f5de.png?v8",next_track_button:"unicode/23ed.png?v8",ng:"unicode/1f196.png?v8",ng_man:"unicode/1f645-2642.png?v8",ng_woman:"unicode/1f645-2640.png?v8",nicaragua:"unicode/1f1f3-1f1ee.png?v8",niger:"unicode/1f1f3-1f1ea.png?v8",nigeria:"unicode/1f1f3-1f1ec.png?v8",night_with_stars:"unicode/1f303.png?v8",nine:"unicode/0039-20e3.png?v8",ninja:"unicode/1f977.png?v8",niue:"unicode/1f1f3-1f1fa.png?v8",no_bell:"unicode/1f515.png?v8",no_bicycles:"unicode/1f6b3.png?v8",no_entry:"unicode/26d4.png?v8",no_entry_sign:"unicode/1f6ab.png?v8",no_good:"unicode/1f645.png?v8",no_good_man:"unicode/1f645-2642.png?v8",no_good_woman:"unicode/1f645-2640.png?v8",no_mobile_phones:"unicode/1f4f5.png?v8",no_mouth:"unicode/1f636.png?v8",no_pedestrians:"unicode/1f6b7.png?v8",no_smoking:"unicode/1f6ad.png?v8","non-potable_water":"unicode/1f6b1.png?v8",norfolk_island:"unicode/1f1f3-1f1eb.png?v8",north_korea:"unicode/1f1f0-1f1f5.png?v8",northern_mariana_islands:"unicode/1f1f2-1f1f5.png?v8",norway:"unicode/1f1f3-1f1f4.png?v8",nose:"unicode/1f443.png?v8",notebook:"unicode/1f4d3.png?v8",notebook_with_decorative_cover:"unicode/1f4d4.png?v8",notes:"unicode/1f3b6.png?v8",nut_and_bolt:"unicode/1f529.png?v8",o:"unicode/2b55.png?v8",o2:"unicode/1f17e.png?v8",ocean:"unicode/1f30a.png?v8",octocat:"octocat.png?v8",octopus:"unicode/1f419.png?v8",oden:"unicode/1f362.png?v8",office:"unicode/1f3e2.png?v8",office_worker:"unicode/1f9d1-1f4bc.png?v8",oil_drum:"unicode/1f6e2.png?v8",ok:"unicode/1f197.png?v8",ok_hand:"unicode/1f44c.png?v8",ok_man:"unicode/1f646-2642.png?v8",ok_person:"unicode/1f646.png?v8",ok_woman:"unicode/1f646-2640.png?v8",old_key:"unicode/1f5dd.png?v8",older_adult:"unicode/1f9d3.png?v8",older_man:"unicode/1f474.png?v8",older_woman:"unicode/1f475.png?v8",olive:"unicode/1fad2.png?v8",om:"unicode/1f549.png?v8",oman:"unicode/1f1f4-1f1f2.png?v8",on:"unicode/1f51b.png?v8",oncoming_automobile:"unicode/1f698.png?v8",oncoming_bus:"unicode/1f68d.png?v8",oncoming_police_car:"unicode/1f694.png?v8",oncoming_taxi:"unicode/1f696.png?v8",one:"unicode/0031-20e3.png?v8",one_piece_swimsuit:"unicode/1fa71.png?v8",onion:"unicode/1f9c5.png?v8",open_book:"unicode/1f4d6.png?v8",open_file_folder:"unicode/1f4c2.png?v8",open_hands:"unicode/1f450.png?v8",open_mouth:"unicode/1f62e.png?v8",open_umbrella:"unicode/2602.png?v8",ophiuchus:"unicode/26ce.png?v8",orange:"unicode/1f34a.png?v8",orange_book:"unicode/1f4d9.png?v8",orange_circle:"unicode/1f7e0.png?v8",orange_heart:"unicode/1f9e1.png?v8",orange_square:"unicode/1f7e7.png?v8",orangutan:"unicode
/1f9a7.png?v8",orthodox_cross:"unicode/2626.png?v8",otter:"unicode/1f9a6.png?v8",outbox_tray:"unicode/1f4e4.png?v8",owl:"unicode/1f989.png?v8",ox:"unicode/1f402.png?v8",oyster:"unicode/1f9aa.png?v8",package:"unicode/1f4e6.png?v8",page_facing_up:"unicode/1f4c4.png?v8",page_with_curl:"unicode/1f4c3.png?v8",pager:"unicode/1f4df.png?v8",paintbrush:"unicode/1f58c.png?v8",pakistan:"unicode/1f1f5-1f1f0.png?v8",palau:"unicode/1f1f5-1f1fc.png?v8",palestinian_territories:"unicode/1f1f5-1f1f8.png?v8",palm_tree:"unicode/1f334.png?v8",palms_up_together:"unicode/1f932.png?v8",panama:"unicode/1f1f5-1f1e6.png?v8",pancakes:"unicode/1f95e.png?v8",panda_face:"unicode/1f43c.png?v8",paperclip:"unicode/1f4ce.png?v8",paperclips:"unicode/1f587.png?v8",papua_new_guinea:"unicode/1f1f5-1f1ec.png?v8",parachute:"unicode/1fa82.png?v8",paraguay:"unicode/1f1f5-1f1fe.png?v8",parasol_on_ground:"unicode/26f1.png?v8",parking:"unicode/1f17f.png?v8",parrot:"unicode/1f99c.png?v8",part_alternation_mark:"unicode/303d.png?v8",partly_sunny:"unicode/26c5.png?v8",partying_face:"unicode/1f973.png?v8",passenger_ship:"unicode/1f6f3.png?v8",passport_control:"unicode/1f6c2.png?v8",pause_button:"unicode/23f8.png?v8",paw_prints:"unicode/1f43e.png?v8",peace_symbol:"unicode/262e.png?v8",peach:"unicode/1f351.png?v8",peacock:"unicode/1f99a.png?v8",peanuts:"unicode/1f95c.png?v8",pear:"unicode/1f350.png?v8",pen:"unicode/1f58a.png?v8",pencil:"unicode/1f4dd.png?v8",pencil2:"unicode/270f.png?v8",penguin:"unicode/1f427.png?v8",pensive:"unicode/1f614.png?v8",people_holding_hands:"unicode/1f9d1-1f91d-1f9d1.png?v8",people_hugging:"unicode/1fac2.png?v8",performing_arts:"unicode/1f3ad.png?v8",persevere:"unicode/1f623.png?v8",person_bald:"unicode/1f9d1-1f9b2.png?v8",person_curly_hair:"unicode/1f9d1-1f9b1.png?v8",person_feeding_baby:"unicode/1f9d1-1f37c.png?v8",person_fencing:"unicode/1f93a.png?v8",person_in_manual_wheelchair:"unicode/1f9d1-1f9bd.png?v8",person_in_motorized_wheelchair:"unicode/1f9d1-1f9bc.png?v8",person_in_tuxedo:"unicode/1f935.png?v8",person_red_hair:"unicode/1f9d1-1f9b0.png?v8",person_white_hair:"unicode/1f9d1-1f9b3.png?v8",person_with_probing_cane:"unicode/1f9d1-1f9af.png?v8",person_with_turban:"unicode/1f473.png?v8",person_with_veil:"unicode/1f470.png?v8",peru:"unicode/1f1f5-1f1ea.png?v8",petri_dish:"unicode/1f9eb.png?v8",philippines:"unicode/1f1f5-1f1ed.png?v8",phone:"unicode/260e.png?v8",pick:"unicode/26cf.png?v8",pickup_truck:"unicode/1f6fb.png?v8",pie:"unicode/1f967.png?v8",pig:"unicode/1f437.png?v8",pig2:"unicode/1f416.png?v8",pig_nose:"unicode/1f43d.png?v8",pill:"unicode/1f48a.png?v8",pilot:"unicode/1f9d1-2708.png?v8",pinata:"unicode/1fa85.png?v8",pinched_fingers:"unicode/1f90c.png?v8",pinching_hand:"unicode/1f90f.png?v8",pineapple:"unicode/1f34d.png?v8",ping_pong:"unicode/1f3d3.png?v8",pirate_flag:"unicode/1f3f4-2620.png?v8",pisces:"unicode/2653.png?v8",pitcairn_islands:"unicode/1f1f5-1f1f3.png?v8",pizza:"unicode/1f355.png?v8",placard:"unicode/1faa7.png?v8",place_of_worship:"unicode/1f6d0.png?v8",plate_with_cutlery:"unicode/1f37d.png?v8",play_or_pause_button:"unicode/23ef.png?v8",pleading_face:"unicode/1f97a.png?v8",plunger:"unicode/1faa0.png?v8",point_down:"unicode/1f447.png?v8",point_left:"unicode/1f448.png?v8",point_right:"unicode/1f449.png?v8",point_up:"unicode/261d.png?v8",point_up_2:"unicode/1f446.png?v8",poland:"unicode/1f1f5-1f1f1.png?v8",polar_bear:"unicode/1f43b-2744.png?v8",police_car:"unicode/1f693.png?v8",police_officer:"unicode/1f46e.png?v8",policeman:"unicode/1f46e-2642.png?v8",policewoman:"unicode/1f46e-2640.png?v8
",poodle:"unicode/1f429.png?v8",poop:"unicode/1f4a9.png?v8",popcorn:"unicode/1f37f.png?v8",portugal:"unicode/1f1f5-1f1f9.png?v8",post_office:"unicode/1f3e3.png?v8",postal_horn:"unicode/1f4ef.png?v8",postbox:"unicode/1f4ee.png?v8",potable_water:"unicode/1f6b0.png?v8",potato:"unicode/1f954.png?v8",potted_plant:"unicode/1fab4.png?v8",pouch:"unicode/1f45d.png?v8",poultry_leg:"unicode/1f357.png?v8",pound:"unicode/1f4b7.png?v8",pout:"unicode/1f621.png?v8",pouting_cat:"unicode/1f63e.png?v8",pouting_face:"unicode/1f64e.png?v8",pouting_man:"unicode/1f64e-2642.png?v8",pouting_woman:"unicode/1f64e-2640.png?v8",pray:"unicode/1f64f.png?v8",prayer_beads:"unicode/1f4ff.png?v8",pregnant_woman:"unicode/1f930.png?v8",pretzel:"unicode/1f968.png?v8",previous_track_button:"unicode/23ee.png?v8",prince:"unicode/1f934.png?v8",princess:"unicode/1f478.png?v8",printer:"unicode/1f5a8.png?v8",probing_cane:"unicode/1f9af.png?v8",puerto_rico:"unicode/1f1f5-1f1f7.png?v8",punch:"unicode/1f44a.png?v8",purple_circle:"unicode/1f7e3.png?v8",purple_heart:"unicode/1f49c.png?v8",purple_square:"unicode/1f7ea.png?v8",purse:"unicode/1f45b.png?v8",pushpin:"unicode/1f4cc.png?v8",put_litter_in_its_place:"unicode/1f6ae.png?v8",qatar:"unicode/1f1f6-1f1e6.png?v8",question:"unicode/2753.png?v8",rabbit:"unicode/1f430.png?v8",rabbit2:"unicode/1f407.png?v8",raccoon:"unicode/1f99d.png?v8",racehorse:"unicode/1f40e.png?v8",racing_car:"unicode/1f3ce.png?v8",radio:"unicode/1f4fb.png?v8",radio_button:"unicode/1f518.png?v8",radioactive:"unicode/2622.png?v8",rage:"unicode/1f621.png?v8",rage1:"rage1.png?v8",rage2:"rage2.png?v8",rage3:"rage3.png?v8",rage4:"rage4.png?v8",railway_car:"unicode/1f683.png?v8",railway_track:"unicode/1f6e4.png?v8",rainbow:"unicode/1f308.png?v8",rainbow_flag:"unicode/1f3f3-1f308.png?v8",raised_back_of_hand:"unicode/1f91a.png?v8",raised_eyebrow:"unicode/1f928.png?v8",raised_hand:"unicode/270b.png?v8",raised_hand_with_fingers_splayed:"unicode/1f590.png?v8",raised_hands:"unicode/1f64c.png?v8",raising_hand:"unicode/1f64b.png?v8",raising_hand_man:"unicode/1f64b-2642.png?v8",raising_hand_woman:"unicode/1f64b-2640.png?v8",ram:"unicode/1f40f.png?v8",ramen:"unicode/1f35c.png?v8",rat:"unicode/1f400.png?v8",razor:"unicode/1fa92.png?v8",receipt:"unicode/1f9fe.png?v8",record_button:"unicode/23fa.png?v8",recycle:"unicode/267b.png?v8",red_car:"unicode/1f697.png?v8",red_circle:"unicode/1f534.png?v8",red_envelope:"unicode/1f9e7.png?v8",red_haired_man:"unicode/1f468-1f9b0.png?v8",red_haired_woman:"unicode/1f469-1f9b0.png?v8",red_square:"unicode/1f7e5.png?v8",registered:"unicode/00ae.png?v8",relaxed:"unicode/263a.png?v8",relieved:"unicode/1f60c.png?v8",reminder_ribbon:"unicode/1f397.png?v8",repeat:"unicode/1f501.png?v8",repeat_one:"unicode/1f502.png?v8",rescue_worker_helmet:"unicode/26d1.png?v8",restroom:"unicode/1f6bb.png?v8",reunion:"unicode/1f1f7-1f1ea.png?v8",revolving_hearts:"unicode/1f49e.png?v8",rewind:"unicode/23ea.png?v8",rhinoceros:"unicode/1f98f.png?v8",ribbon:"unicode/1f380.png?v8",rice:"unicode/1f35a.png?v8",rice_ball:"unicode/1f359.png?v8",rice_cracker:"unicode/1f358.png?v8",rice_scene:"unicode/1f391.png?v8",right_anger_bubble:"unicode/1f5ef.png?v8",ring:"unicode/1f48d.png?v8",ringed_planet:"unicode/1fa90.png?v8",robot:"unicode/1f916.png?v8",rock:"unicode/1faa8.png?v8",rocket:"unicode/1f680.png?v8",rofl:"unicode/1f923.png?v8",roll_eyes:"unicode/1f644.png?v8",roll_of_paper:"unicode/1f9fb.png?v8",roller_coaster:"unicode/1f3a2.png?v8",roller_skate:"unicode/1f6fc.png?v8",romania:"unicode/1f1f7-1f1f4.png?v8",rooster:"unicode/1f413.png?v
8",rose:"unicode/1f339.png?v8",rosette:"unicode/1f3f5.png?v8",rotating_light:"unicode/1f6a8.png?v8",round_pushpin:"unicode/1f4cd.png?v8",rowboat:"unicode/1f6a3.png?v8",rowing_man:"unicode/1f6a3-2642.png?v8",rowing_woman:"unicode/1f6a3-2640.png?v8",ru:"unicode/1f1f7-1f1fa.png?v8",rugby_football:"unicode/1f3c9.png?v8",runner:"unicode/1f3c3.png?v8",running:"unicode/1f3c3.png?v8",running_man:"unicode/1f3c3-2642.png?v8",running_shirt_with_sash:"unicode/1f3bd.png?v8",running_woman:"unicode/1f3c3-2640.png?v8",rwanda:"unicode/1f1f7-1f1fc.png?v8",sa:"unicode/1f202.png?v8",safety_pin:"unicode/1f9f7.png?v8",safety_vest:"unicode/1f9ba.png?v8",sagittarius:"unicode/2650.png?v8",sailboat:"unicode/26f5.png?v8",sake:"unicode/1f376.png?v8",salt:"unicode/1f9c2.png?v8",samoa:"unicode/1f1fc-1f1f8.png?v8",san_marino:"unicode/1f1f8-1f1f2.png?v8",sandal:"unicode/1f461.png?v8",sandwich:"unicode/1f96a.png?v8",santa:"unicode/1f385.png?v8",sao_tome_principe:"unicode/1f1f8-1f1f9.png?v8",sari:"unicode/1f97b.png?v8",sassy_man:"unicode/1f481-2642.png?v8",sassy_woman:"unicode/1f481-2640.png?v8",satellite:"unicode/1f4e1.png?v8",satisfied:"unicode/1f606.png?v8",saudi_arabia:"unicode/1f1f8-1f1e6.png?v8",sauna_man:"unicode/1f9d6-2642.png?v8",sauna_person:"unicode/1f9d6.png?v8",sauna_woman:"unicode/1f9d6-2640.png?v8",sauropod:"unicode/1f995.png?v8",saxophone:"unicode/1f3b7.png?v8",scarf:"unicode/1f9e3.png?v8",school:"unicode/1f3eb.png?v8",school_satchel:"unicode/1f392.png?v8",scientist:"unicode/1f9d1-1f52c.png?v8",scissors:"unicode/2702.png?v8",scorpion:"unicode/1f982.png?v8",scorpius:"unicode/264f.png?v8",scotland:"unicode/1f3f4-e0067-e0062-e0073-e0063-e0074-e007f.png?v8",scream:"unicode/1f631.png?v8",scream_cat:"unicode/1f640.png?v8",screwdriver:"unicode/1fa9b.png?v8",scroll:"unicode/1f4dc.png?v8",seal:"unicode/1f9ad.png?v8",seat:"unicode/1f4ba.png?v8",secret:"unicode/3299.png?v8",see_no_evil:"unicode/1f648.png?v8",seedling:"unicode/1f331.png?v8",selfie:"unicode/1f933.png?v8",senegal:"unicode/1f1f8-1f1f3.png?v8",serbia:"unicode/1f1f7-1f1f8.png?v8",service_dog:"unicode/1f415-1f9ba.png?v8",seven:"unicode/0037-20e3.png?v8",sewing_needle:"unicode/1faa1.png?v8",seychelles:"unicode/1f1f8-1f1e8.png?v8",shallow_pan_of_food:"unicode/1f958.png?v8",shamrock:"unicode/2618.png?v8",shark:"unicode/1f988.png?v8",shaved_ice:"unicode/1f367.png?v8",sheep:"unicode/1f411.png?v8",shell:"unicode/1f41a.png?v8",shield:"unicode/1f6e1.png?v8",shinto_shrine:"unicode/26e9.png?v8",ship:"unicode/1f6a2.png?v8",shipit:"shipit.png?v8",shirt:"unicode/1f455.png?v8",shit:"unicode/1f4a9.png?v8",shoe:"unicode/1f45e.png?v8",shopping:"unicode/1f6cd.png?v8",shopping_cart:"unicode/1f6d2.png?v8",shorts:"unicode/1fa73.png?v8",shower:"unicode/1f6bf.png?v8",shrimp:"unicode/1f990.png?v8",shrug:"unicode/1f937.png?v8",shushing_face:"unicode/1f92b.png?v8",sierra_leone:"unicode/1f1f8-1f1f1.png?v8",signal_strength:"unicode/1f4f6.png?v8",singapore:"unicode/1f1f8-1f1ec.png?v8",singer:"unicode/1f9d1-1f3a4.png?v8",sint_maarten:"unicode/1f1f8-1f1fd.png?v8",six:"unicode/0036-20e3.png?v8",six_pointed_star:"unicode/1f52f.png?v8",skateboard:"unicode/1f6f9.png?v8",ski:"unicode/1f3bf.png?v8",skier:"unicode/26f7.png?v8",skull:"unicode/1f480.png?v8",skull_and_crossbones:"unicode/2620.png?v8",skunk:"unicode/1f9a8.png?v8",sled:"unicode/1f6f7.png?v8",sleeping:"unicode/1f634.png?v8",sleeping_bed:"unicode/1f6cc.png?v8",sleepy:"unicode/1f62a.png?v8",slightly_frowning_face:"unicode/1f641.png?v8",slightly_smiling_face:"unicode/1f642.png?v8",slot_machine:"unicode/1f3b0.png?v8",sloth:"unicode/1f9a5.p
ng?v8",slovakia:"unicode/1f1f8-1f1f0.png?v8",slovenia:"unicode/1f1f8-1f1ee.png?v8",small_airplane:"unicode/1f6e9.png?v8",small_blue_diamond:"unicode/1f539.png?v8",small_orange_diamond:"unicode/1f538.png?v8",small_red_triangle:"unicode/1f53a.png?v8",small_red_triangle_down:"unicode/1f53b.png?v8",smile:"unicode/1f604.png?v8",smile_cat:"unicode/1f638.png?v8",smiley:"unicode/1f603.png?v8",smiley_cat:"unicode/1f63a.png?v8",smiling_face_with_tear:"unicode/1f972.png?v8",smiling_face_with_three_hearts:"unicode/1f970.png?v8",smiling_imp:"unicode/1f608.png?v8",smirk:"unicode/1f60f.png?v8",smirk_cat:"unicode/1f63c.png?v8",smoking:"unicode/1f6ac.png?v8",snail:"unicode/1f40c.png?v8",snake:"unicode/1f40d.png?v8",sneezing_face:"unicode/1f927.png?v8",snowboarder:"unicode/1f3c2.png?v8",snowflake:"unicode/2744.png?v8",snowman:"unicode/26c4.png?v8",snowman_with_snow:"unicode/2603.png?v8",soap:"unicode/1f9fc.png?v8",sob:"unicode/1f62d.png?v8",soccer:"unicode/26bd.png?v8",socks:"unicode/1f9e6.png?v8",softball:"unicode/1f94e.png?v8",solomon_islands:"unicode/1f1f8-1f1e7.png?v8",somalia:"unicode/1f1f8-1f1f4.png?v8",soon:"unicode/1f51c.png?v8",sos:"unicode/1f198.png?v8",sound:"unicode/1f509.png?v8",south_africa:"unicode/1f1ff-1f1e6.png?v8",south_georgia_south_sandwich_islands:"unicode/1f1ec-1f1f8.png?v8",south_sudan:"unicode/1f1f8-1f1f8.png?v8",space_invader:"unicode/1f47e.png?v8",spades:"unicode/2660.png?v8",spaghetti:"unicode/1f35d.png?v8",sparkle:"unicode/2747.png?v8",sparkler:"unicode/1f387.png?v8",sparkles:"unicode/2728.png?v8",sparkling_heart:"unicode/1f496.png?v8",speak_no_evil:"unicode/1f64a.png?v8",speaker:"unicode/1f508.png?v8",speaking_head:"unicode/1f5e3.png?v8",speech_balloon:"unicode/1f4ac.png?v8",speedboat:"unicode/1f6a4.png?v8",spider:"unicode/1f577.png?v8",spider_web:"unicode/1f578.png?v8",spiral_calendar:"unicode/1f5d3.png?v8",spiral_notepad:"unicode/1f5d2.png?v8",sponge:"unicode/1f9fd.png?v8",spoon:"unicode/1f944.png?v8",squid:"unicode/1f991.png?v8",sri_lanka:"unicode/1f1f1-1f1f0.png?v8",st_barthelemy:"unicode/1f1e7-1f1f1.png?v8",st_helena:"unicode/1f1f8-1f1ed.png?v8",st_kitts_nevis:"unicode/1f1f0-1f1f3.png?v8",st_lucia:"unicode/1f1f1-1f1e8.png?v8",st_martin:"unicode/1f1f2-1f1eb.png?v8",st_pierre_miquelon:"unicode/1f1f5-1f1f2.png?v8",st_vincent_grenadines:"unicode/1f1fb-1f1e8.png?v8",stadium:"unicode/1f3df.png?v8",standing_man:"unicode/1f9cd-2642.png?v8",standing_person:"unicode/1f9cd.png?v8",standing_woman:"unicode/1f9cd-2640.png?v8",star:"unicode/2b50.png?v8",star2:"unicode/1f31f.png?v8",star_and_crescent:"unicode/262a.png?v8",star_of_david:"unicode/2721.png?v8",star_struck:"unicode/1f929.png?v8",stars:"unicode/1f320.png?v8",station:"unicode/1f689.png?v8",statue_of_liberty:"unicode/1f5fd.png?v8",steam_locomotive:"unicode/1f682.png?v8",stethoscope:"unicode/1fa7a.png?v8",stew:"unicode/1f372.png?v8",stop_button:"unicode/23f9.png?v8",stop_sign:"unicode/1f6d1.png?v8",stopwatch:"unicode/23f1.png?v8",straight_ruler:"unicode/1f4cf.png?v8",strawberry:"unicode/1f353.png?v8",stuck_out_tongue:"unicode/1f61b.png?v8",stuck_out_tongue_closed_eyes:"unicode/1f61d.png?v8",stuck_out_tongue_winking_eye:"unicode/1f61c.png?v8",student:"unicode/1f9d1-1f393.png?v8",studio_microphone:"unicode/1f399.png?v8",stuffed_flatbread:"unicode/1f959.png?v8",sudan:"unicode/1f1f8-1f1e9.png?v8",sun_behind_large_cloud:"unicode/1f325.png?v8",sun_behind_rain_cloud:"unicode/1f326.png?v8",sun_behind_small_cloud:"unicode/1f324.png?v8",sun_with_face:"unicode/1f31e.png?v8",sunflower:"unicode/1f33b.png?v8",sunglasses:"unicode/1f60e.png?v8",s
unny:"unicode/2600.png?v8",sunrise:"unicode/1f305.png?v8",sunrise_over_mountains:"unicode/1f304.png?v8",superhero:"unicode/1f9b8.png?v8",superhero_man:"unicode/1f9b8-2642.png?v8",superhero_woman:"unicode/1f9b8-2640.png?v8",supervillain:"unicode/1f9b9.png?v8",supervillain_man:"unicode/1f9b9-2642.png?v8",supervillain_woman:"unicode/1f9b9-2640.png?v8",surfer:"unicode/1f3c4.png?v8",surfing_man:"unicode/1f3c4-2642.png?v8",surfing_woman:"unicode/1f3c4-2640.png?v8",suriname:"unicode/1f1f8-1f1f7.png?v8",sushi:"unicode/1f363.png?v8",suspect:"suspect.png?v8",suspension_railway:"unicode/1f69f.png?v8",svalbard_jan_mayen:"unicode/1f1f8-1f1ef.png?v8",swan:"unicode/1f9a2.png?v8",swaziland:"unicode/1f1f8-1f1ff.png?v8",sweat:"unicode/1f613.png?v8",sweat_drops:"unicode/1f4a6.png?v8",sweat_smile:"unicode/1f605.png?v8",sweden:"unicode/1f1f8-1f1ea.png?v8",sweet_potato:"unicode/1f360.png?v8",swim_brief:"unicode/1fa72.png?v8",swimmer:"unicode/1f3ca.png?v8",swimming_man:"unicode/1f3ca-2642.png?v8",swimming_woman:"unicode/1f3ca-2640.png?v8",switzerland:"unicode/1f1e8-1f1ed.png?v8",symbols:"unicode/1f523.png?v8",synagogue:"unicode/1f54d.png?v8",syria:"unicode/1f1f8-1f1fe.png?v8",syringe:"unicode/1f489.png?v8","t-rex":"unicode/1f996.png?v8",taco:"unicode/1f32e.png?v8",tada:"unicode/1f389.png?v8",taiwan:"unicode/1f1f9-1f1fc.png?v8",tajikistan:"unicode/1f1f9-1f1ef.png?v8",takeout_box:"unicode/1f961.png?v8",tamale:"unicode/1fad4.png?v8",tanabata_tree:"unicode/1f38b.png?v8",tangerine:"unicode/1f34a.png?v8",tanzania:"unicode/1f1f9-1f1ff.png?v8",taurus:"unicode/2649.png?v8",taxi:"unicode/1f695.png?v8",tea:"unicode/1f375.png?v8",teacher:"unicode/1f9d1-1f3eb.png?v8",teapot:"unicode/1fad6.png?v8",technologist:"unicode/1f9d1-1f4bb.png?v8",teddy_bear:"unicode/1f9f8.png?v8",telephone:"unicode/260e.png?v8",telephone_receiver:"unicode/1f4de.png?v8",telescope:"unicode/1f52d.png?v8",tennis:"unicode/1f3be.png?v8",tent:"unicode/26fa.png?v8",test_tube:"unicode/1f9ea.png?v8",thailand:"unicode/1f1f9-1f1ed.png?v8",thermometer:"unicode/1f321.png?v8",thinking:"unicode/1f914.png?v8",thong_sandal:"unicode/1fa74.png?v8",thought_balloon:"unicode/1f4ad.png?v8",thread:"unicode/1f9f5.png?v8",three:"unicode/0033-20e3.png?v8",thumbsdown:"unicode/1f44e.png?v8",thumbsup:"unicode/1f44d.png?v8",ticket:"unicode/1f3ab.png?v8",tickets:"unicode/1f39f.png?v8",tiger:"unicode/1f42f.png?v8",tiger2:"unicode/1f405.png?v8",timer_clock:"unicode/23f2.png?v8",timor_leste:"unicode/1f1f9-1f1f1.png?v8",tipping_hand_man:"unicode/1f481-2642.png?v8",tipping_hand_person:"unicode/1f481.png?v8",tipping_hand_woman:"unicode/1f481-2640.png?v8",tired_face:"unicode/1f62b.png?v8",tm:"unicode/2122.png?v8",togo:"unicode/1f1f9-1f1ec.png?v8",toilet:"unicode/1f6bd.png?v8",tokelau:"unicode/1f1f9-1f1f0.png?v8",tokyo_tower:"unicode/1f5fc.png?v8",tomato:"unicode/1f345.png?v8",tonga:"unicode/1f1f9-1f1f4.png?v8",tongue:"unicode/1f445.png?v8",toolbox:"unicode/1f9f0.png?v8",tooth:"unicode/1f9b7.png?v8",toothbrush:"unicode/1faa5.png?v8",top:"unicode/1f51d.png?v8",tophat:"unicode/1f3a9.png?v8",tornado:"unicode/1f32a.png?v8",tr:"unicode/1f1f9-1f1f7.png?v8",trackball:"unicode/1f5b2.png?v8",tractor:"unicode/1f69c.png?v8",traffic_light:"unicode/1f6a5.png?v8",train:"unicode/1f68b.png?v8",train2:"unicode/1f686.png?v8",tram:"unicode/1f68a.png?v8",transgender_flag:"unicode/1f3f3-26a7.png?v8",transgender_symbol:"unicode/26a7.png?v8",triangular_flag_on_post:"unicode/1f6a9.png?v8",triangular_ruler:"unicode/1f4d0.png?v8",trident:"unicode/1f531.png?v8",trinidad_tobago:"unicode/1f1f9-1f1f9.png?v8",tristan_da_c
unha:"unicode/1f1f9-1f1e6.png?v8",triumph:"unicode/1f624.png?v8",trolleybus:"unicode/1f68e.png?v8",trollface:"trollface.png?v8",trophy:"unicode/1f3c6.png?v8",tropical_drink:"unicode/1f379.png?v8",tropical_fish:"unicode/1f420.png?v8",truck:"unicode/1f69a.png?v8",trumpet:"unicode/1f3ba.png?v8",tshirt:"unicode/1f455.png?v8",tulip:"unicode/1f337.png?v8",tumbler_glass:"unicode/1f943.png?v8",tunisia:"unicode/1f1f9-1f1f3.png?v8",turkey:"unicode/1f983.png?v8",turkmenistan:"unicode/1f1f9-1f1f2.png?v8",turks_caicos_islands:"unicode/1f1f9-1f1e8.png?v8",turtle:"unicode/1f422.png?v8",tuvalu:"unicode/1f1f9-1f1fb.png?v8",tv:"unicode/1f4fa.png?v8",twisted_rightwards_arrows:"unicode/1f500.png?v8",two:"unicode/0032-20e3.png?v8",two_hearts:"unicode/1f495.png?v8",two_men_holding_hands:"unicode/1f46c.png?v8",two_women_holding_hands:"unicode/1f46d.png?v8",u5272:"unicode/1f239.png?v8",u5408:"unicode/1f234.png?v8",u55b6:"unicode/1f23a.png?v8",u6307:"unicode/1f22f.png?v8",u6708:"unicode/1f237.png?v8",u6709:"unicode/1f236.png?v8",u6e80:"unicode/1f235.png?v8",u7121:"unicode/1f21a.png?v8",u7533:"unicode/1f238.png?v8",u7981:"unicode/1f232.png?v8",u7a7a:"unicode/1f233.png?v8",uganda:"unicode/1f1fa-1f1ec.png?v8",uk:"unicode/1f1ec-1f1e7.png?v8",ukraine:"unicode/1f1fa-1f1e6.png?v8",umbrella:"unicode/2614.png?v8",unamused:"unicode/1f612.png?v8",underage:"unicode/1f51e.png?v8",unicorn:"unicode/1f984.png?v8",united_arab_emirates:"unicode/1f1e6-1f1ea.png?v8",united_nations:"unicode/1f1fa-1f1f3.png?v8",unlock:"unicode/1f513.png?v8",up:"unicode/1f199.png?v8",upside_down_face:"unicode/1f643.png?v8",uruguay:"unicode/1f1fa-1f1fe.png?v8",us:"unicode/1f1fa-1f1f8.png?v8",us_outlying_islands:"unicode/1f1fa-1f1f2.png?v8",us_virgin_islands:"unicode/1f1fb-1f1ee.png?v8",uzbekistan:"unicode/1f1fa-1f1ff.png?v8",v:"unicode/270c.png?v8",vampire:"unicode/1f9db.png?v8",vampire_man:"unicode/1f9db-2642.png?v8",vampire_woman:"unicode/1f9db-2640.png?v8",vanuatu:"unicode/1f1fb-1f1fa.png?v8",vatican_city:"unicode/1f1fb-1f1e6.png?v8",venezuela:"unicode/1f1fb-1f1ea.png?v8",vertical_traffic_light:"unicode/1f6a6.png?v8",vhs:"unicode/1f4fc.png?v8",vibration_mode:"unicode/1f4f3.png?v8",video_camera:"unicode/1f4f9.png?v8",video_game:"unicode/1f3ae.png?v8",vietnam:"unicode/1f1fb-1f1f3.png?v8",violin:"unicode/1f3bb.png?v8",virgo:"unicode/264d.png?v8",volcano:"unicode/1f30b.png?v8",volleyball:"unicode/1f3d0.png?v8",vomiting_face:"unicode/1f92e.png?v8",vs:"unicode/1f19a.png?v8",vulcan_salute:"unicode/1f596.png?v8",waffle:"unicode/1f9c7.png?v8",wales:"unicode/1f3f4-e0067-e0062-e0077-e006c-e0073-e007f.png?v8",walking:"unicode/1f6b6.png?v8",walking_man:"unicode/1f6b6-2642.png?v8",walking_woman:"unicode/1f6b6-2640.png?v8",wallis_futuna:"unicode/1f1fc-1f1eb.png?v8",waning_crescent_moon:"unicode/1f318.png?v8",waning_gibbous_moon:"unicode/1f316.png?v8",warning:"unicode/26a0.png?v8",wastebasket:"unicode/1f5d1.png?v8",watch:"unicode/231a.png?v8",water_buffalo:"unicode/1f403.png?v8",water_polo:"unicode/1f93d.png?v8",watermelon:"unicode/1f349.png?v8",wave:"unicode/1f44b.png?v8",wavy_dash:"unicode/3030.png?v8",waxing_crescent_moon:"unicode/1f312.png?v8",waxing_gibbous_moon:"unicode/1f314.png?v8",wc:"unicode/1f6be.png?v8",weary:"unicode/1f629.png?v8",wedding:"unicode/1f492.png?v8",weight_lifting:"unicode/1f3cb.png?v8",weight_lifting_man:"unicode/1f3cb-2642.png?v8",weight_lifting_woman:"unicode/1f3cb-2640.png?v8",western_sahara:"unicode/1f1ea-1f1ed.png?v8",whale:"unicode/1f433.png?v8",whale2:"unicode/1f40b.png?v8",wheel_of_dharma:"unicode/2638.png?v8",wheelchair:"unicode/267f
.png?v8",white_check_mark:"unicode/2705.png?v8",white_circle:"unicode/26aa.png?v8",white_flag:"unicode/1f3f3.png?v8",white_flower:"unicode/1f4ae.png?v8",white_haired_man:"unicode/1f468-1f9b3.png?v8",white_haired_woman:"unicode/1f469-1f9b3.png?v8",white_heart:"unicode/1f90d.png?v8",white_large_square:"unicode/2b1c.png?v8",white_medium_small_square:"unicode/25fd.png?v8",white_medium_square:"unicode/25fb.png?v8",white_small_square:"unicode/25ab.png?v8",white_square_button:"unicode/1f533.png?v8",wilted_flower:"unicode/1f940.png?v8",wind_chime:"unicode/1f390.png?v8",wind_face:"unicode/1f32c.png?v8",window:"unicode/1fa9f.png?v8",wine_glass:"unicode/1f377.png?v8",wink:"unicode/1f609.png?v8",wolf:"unicode/1f43a.png?v8",woman:"unicode/1f469.png?v8",woman_artist:"unicode/1f469-1f3a8.png?v8",woman_astronaut:"unicode/1f469-1f680.png?v8",woman_beard:"unicode/1f9d4-2640.png?v8",woman_cartwheeling:"unicode/1f938-2640.png?v8",woman_cook:"unicode/1f469-1f373.png?v8",woman_dancing:"unicode/1f483.png?v8",woman_facepalming:"unicode/1f926-2640.png?v8",woman_factory_worker:"unicode/1f469-1f3ed.png?v8",woman_farmer:"unicode/1f469-1f33e.png?v8",woman_feeding_baby:"unicode/1f469-1f37c.png?v8",woman_firefighter:"unicode/1f469-1f692.png?v8",woman_health_worker:"unicode/1f469-2695.png?v8",woman_in_manual_wheelchair:"unicode/1f469-1f9bd.png?v8",woman_in_motorized_wheelchair:"unicode/1f469-1f9bc.png?v8",woman_in_tuxedo:"unicode/1f935-2640.png?v8",woman_judge:"unicode/1f469-2696.png?v8",woman_juggling:"unicode/1f939-2640.png?v8",woman_mechanic:"unicode/1f469-1f527.png?v8",woman_office_worker:"unicode/1f469-1f4bc.png?v8",woman_pilot:"unicode/1f469-2708.png?v8",woman_playing_handball:"unicode/1f93e-2640.png?v8",woman_playing_water_polo:"unicode/1f93d-2640.png?v8",woman_scientist:"unicode/1f469-1f52c.png?v8",woman_shrugging:"unicode/1f937-2640.png?v8",woman_singer:"unicode/1f469-1f3a4.png?v8",woman_student:"unicode/1f469-1f393.png?v8",woman_teacher:"unicode/1f469-1f3eb.png?v8",woman_technologist:"unicode/1f469-1f4bb.png?v8",woman_with_headscarf:"unicode/1f9d5.png?v8",woman_with_probing_cane:"unicode/1f469-1f9af.png?v8",woman_with_turban:"unicode/1f473-2640.png?v8",woman_with_veil:"unicode/1f470-2640.png?v8",womans_clothes:"unicode/1f45a.png?v8",womans_hat:"unicode/1f452.png?v8",women_wrestling:"unicode/1f93c-2640.png?v8",womens:"unicode/1f6ba.png?v8",wood:"unicode/1fab5.png?v8",woozy_face:"unicode/1f974.png?v8",world_map:"unicode/1f5fa.png?v8",worm:"unicode/1fab1.png?v8",worried:"unicode/1f61f.png?v8",wrench:"unicode/1f527.png?v8",wrestling:"unicode/1f93c.png?v8",writing_hand:"unicode/270d.png?v8",x:"unicode/274c.png?v8",yarn:"unicode/1f9f6.png?v8",yawning_face:"unicode/1f971.png?v8",yellow_circle:"unicode/1f7e1.png?v8",yellow_heart:"unicode/1f49b.png?v8",yellow_square:"unicode/1f7e8.png?v8",yemen:"unicode/1f1fe-1f1ea.png?v8",yen:"unicode/1f4b4.png?v8",yin_yang:"unicode/262f.png?v8",yo_yo:"unicode/1fa80.png?v8",yum:"unicode/1f60b.png?v8",zambia:"unicode/1f1ff-1f1f2.png?v8",zany_face:"unicode/1f92a.png?v8",zap:"unicode/26a1.png?v8",zebra:"unicode/1f993.png?v8",zero:"unicode/0030-20e3.png?v8",zimbabwe:"unicode/1f1ff-1f1fc.png?v8",zipper_mouth_face:"unicode/1f910.png?v8",zombie:"unicode/1f9df.png?v8",zombie_man:"unicode/1f9df-2642.png?v8",zombie_woman:"unicode/1f9df-2640.png?v8",zzz:"unicode/1f4a4.png?v8"}};function jn(e,t){return e.replace(/<(code|pre|script|template)[^>]*?>[\s\S]+?<\/(code|pre|script|template)>/g,function(e){return e.replace(/:/g,"__colon__")}).replace(//g,function(e){return 
e.replace(/:/g,"__colon__")}).replace(/([a-z]{2,}:)?\/\/[^\s'">)]+/gi,function(e){return e.replace(/:/g,"__colon__")}).replace(/:([a-z0-9_\-+]+?):/g,function(e,n){return i=e,o=n,e=t,n=Rn.data[o],i,i=n?e&&/unicode/.test(n)?''+n.replace("unicode/","").replace(/\.png.*/,"").split("-").map(function(e){return""+e+";"}).join("").concat("︎")+" ":' ':i;var i,o}).replace(/__colon__/g,":")}function On(e){var o={};return{str:e=(e=void 0===e?"":e)&&e.replace(/^('|")/,"").replace(/('|")$/,"").replace(/(?:^|\s):([\w-]+:?)=?([\w-%]+)?/g,function(e,n,i){return-1===n.indexOf(":")?(o[n]=i&&i.replace(/"/g,"")||!0,""):e}).trim(),config:o}}function Ln(e){return(e=void 0===e?"":e).replace(/(<\/?a.*?>)/gi,"")}var qn,Pn=be(function(e){var u,f,p,d,n,g=function(u){var i=/(?:^|\s)lang(?:uage)?-([\w-]+)(?=\s|$)/i,n=0,e={},T={manual:u.Prism&&u.Prism.manual,disableWorkerMessageHandler:u.Prism&&u.Prism.disableWorkerMessageHandler,util:{encode:function e(n){return n instanceof C?new C(n.type,e(n.content),n.alias):Array.isArray(n)?n.map(e):n.replace(/&/g,"&").replace(/=r.reach);m+=_.value.length,_=_.next){var b=_.value;if(i.length>n.length)return;if(!(b instanceof C)){var k,w=1;if(l){if(!(k=R(h,m,n,s))||k.index>=n.length)break;var y=k.index,x=k.index+k[0].length,S=m;for(S+=_.value.length;S<=y;)_=_.next,S+=_.value.length;if(S-=_.value.length,m=S,_.value instanceof C)continue;for(var A=_;A!==i.tail&&(Sr.reach&&(r.reach=E);b=_.prev;z&&(b=j(i,b,z),m+=z.length),O(i,b,w);$=new C(c,g?T.tokenize($,g):$,v,$);_=j(i,b,$),F&&j(i,_,F),1r.reach&&(r.reach=E.reach))}}}}}(e,t,n,t.head,0),function(e){var n=[],i=e.head.next;for(;i!==e.tail;)n.push(i.value),i=i.next;return n}(t)},hooks:{all:{},add:function(e,n){var i=T.hooks.all;i[e]=i[e]||[],i[e].push(n)},run:function(e,n){var i=T.hooks.all[e];if(i&&i.length)for(var o,t=0;o=i[t++];)o(n)}},Token:C};function C(e,n,i,o){this.type=e,this.content=n,this.alias=i,this.length=0|(o||"").length}function R(e,n,i,o){e.lastIndex=n;i=e.exec(i);return i&&o&&i[1]&&(o=i[1].length,i.index+=o,i[0]=i[0].slice(o)),i}function a(){var e={value:null,prev:null,next:null},n={value:null,prev:e,next:null};e.next=n,this.head=e,this.tail=n,this.length=0}function j(e,n,i){var o=n.next,i={value:i,prev:n,next:o};return n.next=i,o.prev=i,e.length++,i}function O(e,n,i){for(var o=n.next,t=0;t"+t.content+""+t.tag+">"},!u.document)return u.addEventListener&&(T.disableWorkerMessageHandler||u.addEventListener("message",function(e){var n=JSON.parse(e.data),i=n.language,e=n.code,n=n.immediateClose;u.postMessage(T.highlight(e,T.languages[i],i)),n&&u.close()},!1)),T;var o=T.util.currentScript();function t(){T.manual||T.highlightAll()}return o&&(T.filename=o.src,o.hasAttribute("data-manual")&&(T.manual=!0)),T.manual||("loading"===(e=document.readyState)||"interactive"===e&&o&&o.defer?document.addEventListener("DOMContentLoaded",t):window.requestAnimationFrame?window.requestAnimationFrame(t):window.setTimeout(t,16)),T}("undefined"!=typeof window?window:"undefined"!=typeof WorkerGlobalScope&&self instanceof WorkerGlobalScope?self:{});e.exports&&(e.exports=g),void 
0!==me&&(me.Prism=g),g.languages.markup={comment:{pattern://,greedy:!0},prolog:{pattern:/<\?[\s\S]+?\?>/,greedy:!0},doctype:{pattern:/"'[\]]|"[^"]*"|'[^']*')+(?:\[(?:[^<"'\]]|"[^"]*"|'[^']*'|<(?!!--)|)*\]\s*)?>/i,greedy:!0,inside:{"internal-subset":{pattern:/(^[^\[]*\[)[\s\S]+(?=\]>$)/,lookbehind:!0,greedy:!0,inside:null},string:{pattern:/"[^"]*"|'[^']*'/,greedy:!0},punctuation:/^$|[[\]]/,"doctype-tag":/^DOCTYPE/i,name:/[^\s<>'"]+/}},cdata:{pattern://i,greedy:!0},tag:{pattern:/<\/?(?!\d)[^\s>\/=$<%]+(?:\s(?:\s*[^\s>\/=]+(?:\s*=\s*(?:"[^"]*"|'[^']*'|[^\s'">=]+(?=[\s>]))|(?=[\s/>])))+)?\s*\/?>/,greedy:!0,inside:{tag:{pattern:/^<\/?[^\s>\/]+/,inside:{punctuation:/^<\/?/,namespace:/^[^\s>\/:]+:/}},"special-attr":[],"attr-value":{pattern:/=\s*(?:"[^"]*"|'[^']*'|[^\s'">=]+)/,inside:{punctuation:[{pattern:/^=/,alias:"attr-equals"},/"|'/]}},punctuation:/\/?>/,"attr-name":{pattern:/[^\s>\/]+/,inside:{namespace:/^[^\s>\/:]+:/}}}},entity:[{pattern:/&[\da-z]{1,8};/i,alias:"named-entity"},/?[\da-f]{1,8};/i]},g.languages.markup.tag.inside["attr-value"].inside.entity=g.languages.markup.entity,g.languages.markup.doctype.inside["internal-subset"].inside=g.languages.markup,g.hooks.add("wrap",function(e){"entity"===e.type&&(e.attributes.title=e.content.replace(/&/,"&"))}),Object.defineProperty(g.languages.markup.tag,"addInlined",{value:function(e,n){var i={};i["language-"+n]={pattern:/(^$)/i,lookbehind:!0,inside:g.languages[n]},i.cdata=/^$/i;i={"included-cdata":{pattern://i,inside:i}};i["language-"+n]={pattern:/[\s\S]+/,inside:g.languages[n]};n={};n[e]={pattern:RegExp(/(<__[^>]*>)(?:))*\]\]>|(?!)/.source.replace(/__/g,function(){return e}),"i"),lookbehind:!0,greedy:!0,inside:i},g.languages.insertBefore("markup","cdata",n)}}),Object.defineProperty(g.languages.markup.tag,"addAttribute",{value:function(e,n){g.languages.markup.tag.inside["special-attr"].push({pattern:RegExp(/(^|["'\s])/.source+"(?:"+e+")"+/\s*=\s*(?:"[^"]*"|'[^']*'|[^\s'">=]+(?=[\s>]))/.source,"i"),lookbehind:!0,inside:{"attr-name":/^[^\s=]+/,"attr-value":{pattern:/=[\s\S]+/,inside:{value:{pattern:/(^=\s*(["']|(?!["'])))\S[\s\S]*(?=\2$)/,lookbehind:!0,alias:[n,"language-"+n],inside:g.languages[n]},punctuation:[{pattern:/^=/,alias:"attr-equals"},/"|'/]}}}})}}),g.languages.html=g.languages.markup,g.languages.mathml=g.languages.markup,g.languages.svg=g.languages.markup,g.languages.xml=g.languages.extend("markup",{}),g.languages.ssml=g.languages.xml,g.languages.atom=g.languages.xml,g.languages.rss=g.languages.xml,function(e){var 
n=/(?:"(?:\\(?:\r\n|[\s\S])|[^"\\\r\n])*"|'(?:\\(?:\r\n|[\s\S])|[^'\\\r\n])*')/;e.languages.css={comment:/\/\*[\s\S]*?\*\//,atrule:{pattern:/@[\w-](?:[^;{\s]|\s+(?![\s{]))*(?:;|(?=\s*\{))/,inside:{rule:/^@[\w-]+/,"selector-function-argument":{pattern:/(\bselector\s*\(\s*(?![\s)]))(?:[^()\s]|\s+(?![\s)])|\((?:[^()]|\([^()]*\))*\))+(?=\s*\))/,lookbehind:!0,alias:"selector"},keyword:{pattern:/(^|[^\w-])(?:and|not|only|or)(?![\w-])/,lookbehind:!0}}},url:{pattern:RegExp("\\burl\\((?:"+n.source+"|"+/(?:[^\\\r\n()"']|\\[\s\S])*/.source+")\\)","i"),greedy:!0,inside:{function:/^url/i,punctuation:/^\(|\)$/,string:{pattern:RegExp("^"+n.source+"$"),alias:"url"}}},selector:{pattern:RegExp("(^|[{}\\s])[^{}\\s](?:[^{};\"'\\s]|\\s+(?![\\s{])|"+n.source+")*(?=\\s*\\{)"),lookbehind:!0},string:{pattern:n,greedy:!0},property:{pattern:/(^|[^-\w\xA0-\uFFFF])(?!\s)[-_a-z\xA0-\uFFFF](?:(?!\s)[-\w\xA0-\uFFFF])*(?=\s*:)/i,lookbehind:!0},important:/!important\b/i,function:{pattern:/(^|[^-a-z0-9])[-a-z0-9]+(?=\()/i,lookbehind:!0},punctuation:/[(){};:,]/},e.languages.css.atrule.inside.rest=e.languages.css;e=e.languages.markup;e&&(e.tag.addInlined("style","css"),e.tag.addAttribute("style","css"))}(g),g.languages.clike={comment:[{pattern:/(^|[^\\])\/\*[\s\S]*?(?:\*\/|$)/,lookbehind:!0,greedy:!0},{pattern:/(^|[^\\:])\/\/.*/,lookbehind:!0,greedy:!0}],string:{pattern:/(["'])(?:\\(?:\r\n|[\s\S])|(?!\1)[^\\\r\n])*\1/,greedy:!0},"class-name":{pattern:/(\b(?:class|extends|implements|instanceof|interface|new|trait)\s+|\bcatch\s+\()[\w.\\]+/i,lookbehind:!0,inside:{punctuation:/[.\\]/}},keyword:/\b(?:break|catch|continue|do|else|finally|for|function|if|in|instanceof|new|null|return|throw|try|while)\b/,boolean:/\b(?:false|true)\b/,function:/\b\w+(?=\()/,number:/\b0x[\da-f]+\b|(?:\b\d+(?:\.\d*)?|\B\.\d+)(?:e[+-]?\d+)?/i,operator:/[<>]=?|[!=]=?=?|--?|\+\+?|&&?|\|\|?|[?*/~^%]/,punctuation:/[{}[\];(),.:]/},g.languages.javascript=g.languages.extend("clike",{"class-name":[g.languages.clike["class-name"],{pattern:/(^|[^$\w\xA0-\uFFFF])(?!\s)[_$A-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*(?=\.(?:constructor|prototype))/,lookbehind:!0}],keyword:[{pattern:/((?:^|\})\s*)catch\b/,lookbehind:!0},{pattern:/(^|[^.]|\.\.\.\s*)\b(?:as|assert(?=\s*\{)|async(?=\s*(?:function\b|\(|[$\w\xA0-\uFFFF]|$))|await|break|case|class|const|continue|debugger|default|delete|do|else|enum|export|extends|finally(?=\s*(?:\{|$))|for|from(?=\s*(?:['"]|$))|function|(?:get|set)(?=\s*(?:[#\[$\w\xA0-\uFFFF]|$))|if|implements|import|in|instanceof|interface|let|new|null|of|package|private|protected|public|return|static|super|switch|this|throw|try|typeof|undefined|var|void|while|with|yield)\b/,lookbehind:!0}],function:/#?(?!\s)[_$a-zA-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*(?=\s*(?:\.\s*(?:apply|bind|call)\s*)?\()/,number:{pattern:RegExp(/(^|[^\w$])/.source+"(?:"+/NaN|Infinity/.source+"|"+/0[bB][01]+(?:_[01]+)*n?/.source+"|"+/0[oO][0-7]+(?:_[0-7]+)*n?/.source+"|"+/0[xX][\dA-Fa-f]+(?:_[\dA-Fa-f]+)*n?/.source+"|"+/\d+(?:_\d+)*n/.source+"|"+/(?:\d+(?:_\d+)*(?:\.(?:\d+(?:_\d+)*)?)?|\.\d+(?:_\d+)*)(?:[Ee][+-]?\d+(?:_\d+)*)?/.source+")"+/(?![\w$])/.source),lookbehind:!0},operator:/--|\+\+|\*\*=?|=>|&&=?|\|\|=?|[!=]==|<<=?|>>>?=?|[-+*/%&|^!=<>]=?|\.{3}|\?\?=?|\?\.?|[~:]/}),g.languages.javascript["class-name"][0].pattern=/(\b(?:class|extends|implements|instanceof|interface|new)\s+)[\w.\\]+/,g.languages.insertBefore("javascript","keyword",{regex:{pattern:/((?:^|[^$\w\xA0-\uFFFF."'\])\s]|\b(?:return|yield))\s*)\/(?:\[(?:[^\]\\\r\n]|\\.)*\]|\\.|[^/\\\[\r\n])+\/[dgimyus]{0,7}(?=(?:\s|\/\*
(?:[^*]|\*(?!\/))*\*\/)*(?:$|[\r\n,.;:})\]]|\/\/))/,lookbehind:!0,greedy:!0,inside:{"regex-source":{pattern:/^(\/)[\s\S]+(?=\/[a-z]*$)/,lookbehind:!0,alias:"language-regex",inside:g.languages.regex},"regex-delimiter":/^\/|\/$/,"regex-flags":/^[a-z]+$/}},"function-variable":{pattern:/#?(?!\s)[_$a-zA-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*(?=\s*[=:]\s*(?:async\s*)?(?:\bfunction\b|(?:\((?:[^()]|\([^()]*\))*\)|(?!\s)[_$a-zA-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*)\s*=>))/,alias:"function"},parameter:[{pattern:/(function(?:\s+(?!\s)[_$a-zA-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*)?\s*\(\s*)(?!\s)(?:[^()\s]|\s+(?![\s)])|\([^()]*\))+(?=\s*\))/,lookbehind:!0,inside:g.languages.javascript},{pattern:/(^|[^$\w\xA0-\uFFFF])(?!\s)[_$a-z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*(?=\s*=>)/i,lookbehind:!0,inside:g.languages.javascript},{pattern:/(\(\s*)(?!\s)(?:[^()\s]|\s+(?![\s)])|\([^()]*\))+(?=\s*\)\s*=>)/,lookbehind:!0,inside:g.languages.javascript},{pattern:/((?:\b|\s|^)(?!(?:as|async|await|break|case|catch|class|const|continue|debugger|default|delete|do|else|enum|export|extends|finally|for|from|function|get|if|implements|import|in|instanceof|interface|let|new|null|of|package|private|protected|public|return|set|static|super|switch|this|throw|try|typeof|undefined|var|void|while|with|yield)(?![$\w\xA0-\uFFFF]))(?:(?!\s)[_$a-zA-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*\s*)\(\s*|\]\s*\(\s*)(?!\s)(?:[^()\s]|\s+(?![\s)])|\([^()]*\))+(?=\s*\)\s*\{)/,lookbehind:!0,inside:g.languages.javascript}],constant:/\b[A-Z](?:[A-Z_]|\dx?)*\b/}),g.languages.insertBefore("javascript","string",{hashbang:{pattern:/^#!.*/,greedy:!0,alias:"comment"},"template-string":{pattern:/`(?:\\[\s\S]|\$\{(?:[^{}]|\{(?:[^{}]|\{[^}]*\})*\})+\}|(?!\$\{)[^\\`])*`/,greedy:!0,inside:{"template-punctuation":{pattern:/^`|`$/,alias:"string"},interpolation:{pattern:/((?:^|[^\\])(?:\\{2})*)\$\{(?:[^{}]|\{(?:[^{}]|\{[^}]*\})*\})+\}/,lookbehind:!0,inside:{"interpolation-punctuation":{pattern:/^\$\{|\}$/,alias:"punctuation"},rest:g.languages.javascript}},string:/[\s\S]+/}},"string-property":{pattern:/((?:^|[,{])[ \t]*)(["'])(?:\\(?:\r\n|[\s\S])|(?!\2)[^\\\r\n])*\2(?=\s*:)/m,lookbehind:!0,greedy:!0,alias:"property"}}),g.languages.insertBefore("javascript","operator",{"literal-property":{pattern:/((?:^|[,{])[ \t]*)(?!\s)[_$a-zA-Z\xA0-\uFFFF](?:(?!\s)[$\w\xA0-\uFFFF])*(?=\s*:)/m,lookbehind:!0,alias:"property"}}),g.languages.markup&&(g.languages.markup.tag.addInlined("script","javascript"),g.languages.markup.tag.addAttribute(/on(?:abort|blur|change|click|composition(?:end|start|update)|dblclick|error|focus(?:in|out)?|key(?:down|up)|load|mouse(?:down|enter|leave|move|out|over|up)|reset|resize|scroll|select|slotchange|submit|unload|wheel)/.source,"javascript")),g.languages.js=g.languages.javascript,void 0!==g&&"undefined"!=typeof document&&(Element.prototype.matches||(Element.prototype.matches=Element.prototype.msMatchesSelector||Element.prototype.webkitMatchesSelector),u={js:"javascript",py:"python",rb:"ruby",ps1:"powershell",psm1:"powershell",sh:"bash",bat:"batch",h:"c",tex:"latex"},d="pre[data-src]:not(["+(f="data-src-status")+'="loaded"]):not(['+f+'="'+(p="loading")+'"])',g.hooks.add("before-highlightall",function(e){e.selector+=", "+d}),g.hooks.add("before-sanity-check",function(e){var 
t,n,i,o,a,r,c=e.element;c.matches(d)&&(e.code="",c.setAttribute(f,p),(t=c.appendChild(document.createElement("CODE"))).textContent="Loading…",i=c.getAttribute("data-src"),"none"===(e=e.language)&&(n=(/\.(\w+)$/.exec(i)||[,"none"])[1],e=u[n]||n),g.util.setLanguage(t,e),g.util.setLanguage(c,e),(n=g.plugins.autoloader)&&n.loadLanguages(e),i=i,o=function(e){c.setAttribute(f,"loaded");var n,i,o=function(e){if(i=/^\s*(\d+)\s*(?:(,)\s*(?:(\d+)\s*)?)?$/.exec(e||"")){var n=Number(i[1]),e=i[2],i=i[3];return e?i?[n,Number(i)]:[n,void 0]:[n,n]}}(c.getAttribute("data-range"));o&&(n=e.split(/\r\n?|\n/g),i=o[0],o=null==o[1]?n.length:o[1],i<0&&(i+=n.length),i=Math.max(0,Math.min(i-1,n.length)),o<0&&(o+=n.length),o=Math.max(0,Math.min(o,n.length)),e=n.slice(i,o).join("\n"),c.hasAttribute("data-start")||c.setAttribute("data-start",String(i+1))),t.textContent=e,g.highlightElement(t)},a=function(e){c.setAttribute(f,"failed"),t.textContent=e},(r=new XMLHttpRequest).open("GET",i,!0),r.onreadystatechange=function(){4==r.readyState&&(r.status<400&&r.responseText?o(r.responseText):400<=r.status?a("✖ Error "+r.status+" while fetching file: "+r.statusText):a("✖ Error: File does not exist or is empty"))},r.send(null))}),n=!(g.plugins.fileHighlight={highlight:function(e){for(var n,i=(e||document).querySelectorAll(d),o=0;n=i[o++];)g.highlightElement(n)}}),g.fileHighlight=function(){n||(console.warn("Prism.fileHighlight is deprecated. Use `Prism.plugins.fileHighlight.highlight` instead."),n=!0),g.plugins.fileHighlight.highlight.apply(this,arguments)})});function Mn(e,n){return"___"+e.toUpperCase()+n+"___"}qn=Prism,Object.defineProperties(qn.languages["markup-templating"]={},{buildPlaceholders:{value:function(o,t,e,a){var r;o.language===t&&(r=o.tokenStack=[],o.code=o.code.replace(e,function(e){if("function"==typeof a&&!a(e))return e;for(var n,i=r.length;-1!==o.code.indexOf(n=Mn(t,i));)++i;return r[i]=e,n}),o.grammar=qn.languages.markup)}},tokenizePlaceholders:{value:function(f,p){var d,g;f.language===p&&f.tokenStack&&(f.grammar=qn.languages[p],d=0,g=Object.keys(f.tokenStack),function e(n){for(var i=0;i=g.length);i++){var o,t,a,r,c,u=n[i];"string"==typeof u||u.content&&"string"==typeof u.content?(t=g[d],a=f.tokenStack[t],o="string"==typeof u?u:u.content,c=Mn(p,t),-1<(r=o.indexOf(c))&&(++d,t=o.substring(0,r),a=new qn.Token(p,qn.tokenize(a,f.grammar),"language-"+p,a),r=o.substring(r+c.length),c=[],t&&c.push.apply(c,e([t])),c.push(a),r&&c.push.apply(c,e([r])),"string"==typeof u?n.splice.apply(n,[i,1].concat(c)):u.content=c)):u.content&&e(u.content)}return n}(f.tokens))}}});function In(t,e){var a=this;this.config=t,this.router=e,this.cacheTree={},this.toc=[],this.cacheTOC={},this.linkTarget=t.externalLinkTarget||"_blank",this.linkRel="_blank"===this.linkTarget?t.externalLinkRel||"noopener":"",this.contentBase=e.getBasePath();var n=this._initRenderer();this.heading=n.heading;var r=o(e=t.markdown||{})?e(Sn,n):(Sn.setOptions(m(e,{renderer:m(n,e.renderer)})),Sn);this._marked=r,this.compile=function(i){var o=!0,e=c(function(e){o=!1;var n="";return i&&(n=f(i)?r(i):r.parser(i),n=t.noEmoji?n:jn(n,t.nativeEmoji),Cn.clear(),n)})(i),n=a.router.parse().file;return o?a.toc=a.cacheTOC[n]:a.cacheTOC[n]=[].concat(a.toc),e}}var Nn={},Hn={markdown:function(e){return{url:e}},mermaid:function(e){return{url:e}},iframe:function(e,n){return{html:'"}},video:function(e,n){return{html:'Not Support "}},audio:function(e,n){return{html:'Not Support "}},code:function(e,n){var 
i=e.match(/\.(\w+)$/);return{url:e,lang:i="md"===(i=n||i&&i[1])?"markdown":i}}};In.prototype.compileEmbed=function(e,n){var i,o,t=On(n),a=t.str,t=t.config;if(n=a,t.include)return T(e)||(e=q(this.contentBase,R(this.router.getCurrentPath()),e)),t.type&&(o=Hn[t.type])?(i=o.call(this,e,n)).type=t.type:(o="code",/\.(md|markdown)/.test(e)?o="markdown":/\.mmd/.test(e)?o="mermaid":/\.html?/.test(e)?o="iframe":/\.(mp4|ogg)/.test(e)?o="video":/\.mp3/.test(e)&&(o="audio"),(i=Hn[o].call(this,e,n)).type=o),i.fragment=t.fragment,i},In.prototype._matchNotCompileLink=function(e){for(var n=this.config.noCompileLinks||[],i=0;i/g.test(o)&&(o=o.replace("\x3c!-- {docsify-ignore} --\x3e",""),e.title=Ln(o),e.ignoreSubHeading=!0),/{docsify-ignore}/g.test(o)&&(o=o.replace("{docsify-ignore}",""),e.title=Ln(o),e.ignoreSubHeading=!0),//g.test(o)&&(o=o.replace("\x3c!-- {docsify-ignore-all} --\x3e",""),e.title=Ln(o),e.ignoreAllSubs=!0),/{docsify-ignore-all}/g.test(o)&&(o=o.replace("{docsify-ignore-all}",""),e.title=Ln(o),e.ignoreAllSubs=!0);i=Cn(t.id||o),t=a.toURL(a.getCurrentPath(),{id:i});return e.slug=t,g.toc.push(e),"'+o+" "},t.code={renderer:e}.renderer.code=function(e,n){var i=Pn.languages[n=void 0===n?"markup":n]||Pn.languages.markup;return''+Pn.highlight(e.replace(/@DOCSIFY_QM@/g,"`"),i,n)+"
"},t.link=(i=(n={renderer:e,router:a,linkTarget:n,linkRel:i,compilerClass:g}).renderer,c=n.router,u=n.linkTarget,n.linkRel,f=n.compilerClass,i.link=function(e,n,i){var o=[],t=On(n=void 0===n?"":n),a=t.str,t=t.config;return u=t.target||u,r="_blank"===u?f.config.externalLinkRel||"noopener":"",n=a,T(e)||f._matchNotCompileLink(e)||t.ignore?(T(e)||"./"!==e.slice(0,2)||(e=document.URL.replace(/\/(?!.*\/).*/,"/").replace("#/./","")+e),o.push(0===e.indexOf("mailto:")?"":'target="'+u+'"'),o.push(0!==e.indexOf("mailto:")&&""!==r?' rel="'+r+'"':"")):(e===f.config.homepage&&(e="README"),e=c.toURL(e,null,c.getCurrentPath())),t.crossorgin&&"_self"===u&&"history"===f.config.routerMode&&-1===f.config.crossOriginLinks.indexOf(e)&&f.config.crossOriginLinks.push(e),t.disabled&&(o.push("disabled"),e="javascript:void(0)"),t.class&&o.push('class="'+t.class+'"'),t.id&&o.push('id="'+t.id+'"'),n&&o.push('title="'+n+'"'),'"+i+" "}),t.paragraph={renderer:e}.renderer.paragraph=function(e){e=/^!>/.test(e)?$n("tip",e):/^\?>/.test(e)?$n("warn",e):""+e+"
";return e},t.image=(o=(i={renderer:e,contentBase:o,router:a}).renderer,p=i.contentBase,d=i.router,o.image=function(e,n,i){var o=e,t=[],a=On(n),r=a.str,a=a.config;return n=r,a["no-zoom"]&&t.push("data-no-zoom"),n&&t.push('title="'+n+'"'),a.size&&(n=(r=a.size.split("x"))[0],(r=r[1])?t.push('width="'+n+'" height="'+r+'"'):t.push('width="'+n+'"')),a.class&&t.push('class="'+a.class+'"'),a.id&&t.push('id="'+a.id+'"'),T(e)||(o=q(p,R(d.getCurrentPath()),e)),0 ":' "}),t.list={renderer:e}.renderer.list=function(e,n,i){n=n?"ol":"ul";return"<"+n+" "+[//.test(e.split('class="task-list"')[0])?'class="task-list"':"",i&&1"+e+""+n+">"},t.listitem={renderer:e}.renderer.listitem=function(e){return/^(]*>)/.test(e)?''+e+" ":""+e+" "},e.origin=t,e},In.prototype.sidebar=function(e,n){var i=this.toc,o=this.router.getCurrentPath(),t="";if(e)t=this.compile(e);else{for(var a=0;a{inner}");this.cacheTree[o]=n}return t},In.prototype.subSidebar=function(e){if(e){var n=this.router.getCurrentPath(),i=this.cacheTree,o=this.toc;o[0]&&o[0].ignoreAllSubs&&o.splice(0),o[0]&&1===o[0].level&&o.shift();for(var t=0;t\n'+e+"\n"}]).links={}:(n=[{type:"html",text:e}]).links={}),a({token:t,embedToken:n}),++u>=c&&a({})}}(n);n.embed.url?X(n.embed.url).then(o):o(n.embed.html)}}({compile:i,embedTokens:c,fetch:n},function(e){var n,i=e.embedToken,e=e.token;e?(n=e.index,p.forEach(function(e){n>e.start&&(n+=e.length)}),m(f,i.links),r=r.slice(0,n).concat(i,r.slice(n+1)),p.push({start:n,length:i.length-1})):(Bn[t]=r.concat(),r.links=Bn[t].links=f,o(r))})}function Yn(e,n,i){var o,t,a,r;return n="function"==typeof i?i(n):"string"==typeof i?(a=[],r=0,(o=i).replace(V,function(n,e,i){a.push(o.substring(r,i-1)),r=i+=n.length+1,a.push(t&&t[n]||function(e){return("00"+("string"==typeof Y[n]?e[Y[n]]():Y[n](e))).slice(-n.length)})}),r!==o.length&&a.push(o.substring(r)),function(e){for(var n="",i=0,o=e||new Date;i404 - Not found","Vue"in window)for(var a=0,r=k(".markdown-section > *").filter(n);ascript").filter(function(e){return!/template/.test(e.type)})[0])||(e=e.innerText.trim())&&new Function(e)()),"Vue"in window){var u,f,p=[],d=Object.keys(i.vueComponents||{});2===t&&d.length&&d.forEach(function(e){window.Vue.options.components[e]||window.Vue.component(e,i.vueComponents[e])}),!Un&&i.vueGlobalOptions&&"function"==typeof i.vueGlobalOptions.data&&(Un=i.vueGlobalOptions.data()),p.push.apply(p,Object.keys(i.vueMounts||{}).map(function(e){return[b(o,e),i.vueMounts[e]]}).filter(function(e){var n=e[0];e[1];return n})),(i.vueGlobalOptions||d.length)&&(u=/{{2}[^{}]*}{2}/,f=/<[^>/]+\s([@:]|v-)[\w-:.[\]]+[=>\s]/,p.push.apply(p,k(".markdown-section > *").filter(function(i){return!p.some(function(e){var n=e[0];e[1];return n===i})}).filter(function(e){return e.tagName.toLowerCase()in(i.vueComponents||{})||e.querySelector(d.join(",")||null)||u.test(e.outerHTML)||f.test(e.outerHTML)}).map(function(e){var n=m({},i.vueGlobalOptions||{});return Un&&(n.data=function(){return Un}),[e,n]})));for(var g=0,s=p;g([^<]*?)$'))&&("color"===n[2]?o.style.background=n[1]+(n[3]||""):(e=n[1],S(o,"add","has-mask"),T(n[1])||(e=q(this.router.getBasePath(),n[1])),o.style.backgroundImage="url("+e+")",o.style.backgroundSize="cover",o.style.backgroundPosition="center center"),i=i.replace(n[0],"")),this._renderTo(".cover-main",i),K()):S(o,"remove","show")},n.prototype._updateRender=function(){var e,n,i,o;e=this,n=l(".app-name-link"),i=e.config.nameLink,o=e.route.path,n&&(f(e.config.nameLink)?n.setAttribute("href",i):"object"==typeof i&&(e=Object.keys(i).filter(function(e){return-1 
':"")),e.coverpage&&(f+=(o=", 100%, 85%",'')),e.logo&&(o=/^data:image/.test(e.logo),n=/(?:http[s]?:)?\/\//.test(e.logo),i=/^\./.test(e.logo),o||n||i||(e.logo=q(this.router.getBasePath(),e.logo))),f+=(i=(n=e).name||"",""+('')+' '),this._renderTo(u,f,!0)):this.rendered=!0,e.mergeNavbar&&s?p=b(".sidebar"):(c.classList.add("app-nav"),e.repo||c.classList.add("no-badge")),e.loadNavbar&&y(p,c),e.themeColor&&(v.head.appendChild(w("div","").firstElementChild),a=e.themeColor,window.CSS&&window.CSS.supports&&window.CSS.supports("(--v:red)")||(e=k("style:not(.inserted),link"),[].forEach.call(e,function(e){"STYLE"===e.nodeName?Q(e,a):"LINK"===e.nodeName&&(e=e.getAttribute("href"),/\.css$/.test(e)&&X(e).then(function(e){e=w("style",e);_.appendChild(e),Q(e,a)}))}))),this._updateRender(),S(h,"ready")},n}(function(e){function n(){e.apply(this,arguments)}return e&&(n.__proto__=e),((n.prototype=Object.create(e&&e.prototype)).constructor=n).prototype.routes=function(){return this.config.routes||{}},n.prototype.matchVirtualRoute=function(t){var a=this.routes(),r=Object.keys(a),c=function(){return null};function u(){var e=r.shift();if(!e)return c(null);var n=A(o=(i="^",0===(o=e).indexOf(i)?o:"^"+o),"$")?o:o+"$",i=t.match(n);if(!i)return u();var o=a[e];if("string"==typeof o)return c(o);if("function"!=typeof o)return u();n=o,e=Xn(),o=e[0];return(0,e[1])(function(e){return"string"==typeof e?c(e):!1===e?c(null):u()}),n.length<=2?o(n(t,i)):n(t,i,o)}return{then:function(e){c=e,u()}}},n}(function(i){function e(){for(var e=[],n=arguments.length;n--;)e[n]=arguments[n];i.apply(this,e),this.route={}}return i&&(e.__proto__=i),((e.prototype=Object.create(i&&i.prototype)).constructor=e).prototype.updateRender=function(){this.router.normalize(),this.route=this.router.parse(),h.setAttribute("data-page",this.route.file)},e.prototype.initRouter=function(){var n=this,e=this.config,e=new("history"===(e.routerMode||"hash")&&t?D:H)(e);this.router=e,this.updateRender(),U=this.route,e.onchange(function(e){n.updateRender(),n._updateRender(),U.path!==n.route.path?(n.$fetch(d,n.$resetEvents.bind(n,e.source)),U=n.route):n.$resetEvents(e.source)})},e}(function(e){function n(){e.apply(this,arguments)}return e&&(n.__proto__=e),((n.prototype=Object.create(e&&e.prototype)).constructor=n).prototype.initLifecycle=function(){var i=this;this._hooks={},this._lifecycle={},["init","mounted","beforeEach","afterEach","doneEach","ready"].forEach(function(e){var n=i._hooks[e]=[];i._lifecycle[e]=function(e){return n.push(e)}})},n.prototype.callHook=function(e,t,a){void 0===a&&(a=d);var r=this._hooks[e],c=this.config.catchPluginErrors,u=function(n){var e=r[n];if(n>=r.length)a(t);else if("function"==typeof e){var i="Docsify plugin error";if(2===e.length)try{e(t,function(e){t=e,u(n+1)})}catch(e){if(!c)throw e;console.error(i,e),u(n+1)}else try{var o=e(t);t=void 0===o?t:o,u(n+1)}catch(e){if(!c)throw e;console.error(i,e),u(n+1)}}else u(n+1)};u(0)},n}(we))))))));function Kn(e,n,i){return Qn&&Qn.abort&&Qn.abort(),Qn=X(e,!0,i)}window.Docsify={util:Me,dom:n,get:X,slugify:Cn,version:"4.13.0"},window.DocsifyCompiler=In,window.marked=Sn,window.Prism=Pn,e(function(e){return new Jn})}();
diff --git a/assets/js/index.js b/assets/js/index.js
new file mode 100644
index 0000000..910b0ac
--- /dev/null
+++ b/assets/js/index.js
@@ -0,0 +1,99 @@
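+// Docsify dark/light theme plugin.
+// Reads the optional `darklightTheme` block from the $docsify config (available
+// here as vm.config), publishes the resulting values as CSS custom properties on
+// :root, and wires up the #docsify-darklight-theme toggle button.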
+const plugin = (hook, vm) => {
+
+ var defaultConfig = {
+ siteFont : "PT Sans",
+ defaultTheme : 'dark',
+ codeFontFamily : 'Roboto Mono, Monaco, courier, monospace',
+ bodyFontSize : '17px',
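+        // Keys inside `dark`/`light` are written verbatim as CSS custom properties
+        // (e.g. `--toogleBackground`), so the misspelled `toogle*` names are left
+        // as-is to keep them in sync with the stylesheet that consumes them.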
+ dark: {
+ accent: '#42b983',
+ toogleBackground : '#ffffff',
+ background: '#091a28',
+ toogleImage : 'url(../../images/icons/sun.svg)'
+ },
+ light: {
+ accent: '#42b983',
+ toogleBackground : '#091a28',
+ background: '#ffffff',
+ toogleImage : 'url(../../images/icons/moon.svg)'
+ }
+ }
+
+ let themeConfig = defaultConfig;
+
+ if(vm.config.hasOwnProperty("darklightTheme")) {
+ for (var [key, value] of Object.entries(vm.config.darklightTheme)) {
+ if(key !== 'light' && key !== 'dark' && key !== 'defaultTheme') {
+ themeConfig[key] = value;
+ }
+ }
+
+ for (var [key, value] of Object.entries(themeConfig)) {
+ if(key !== 'light' && key !== 'dark') {
+ themeConfig[key] = value;
+ document.documentElement.style.setProperty('--'+key , value);
+ }
+ }
+
+ if(vm.config.darklightTheme.hasOwnProperty("dark")) {
+ for (var [key, value] of Object.entries(vm.config.darklightTheme.dark)) {
+ themeConfig.dark[key] = value
+ }
+ }
+
+ if(vm.config.darklightTheme.hasOwnProperty("light")) {
+ for (var [key, value] of Object.entries(vm.config.darklightTheme.light))
+ themeConfig.light[key] = value
+ }
+
+ } else {
+ for (var [key, value] of Object.entries(themeConfig)) {
+ if(key !== 'light' && key !== 'dark') {
+ themeConfig[key] = value;
+ document.documentElement.style.setProperty('--'+key , value);
+ }
+ }
+ }
+
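+    // Prefer the operating-system colour scheme over the configured default.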
+ if (window.matchMedia("(prefers-color-scheme: dark)").matches) {
+ themeConfig.defaultTheme = 'dark';
+ } else if (window.matchMedia("(prefers-color-scheme: light)").matches) {
+ themeConfig.defaultTheme = 'light';
+ }
+
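+    // Apply a theme: hand the stylesheet swap to the switcher in assets/js/main.js
+    // (via the data-link-title attribute and a synthetic click), persist the choice
+    // in localStorage, and push the per-theme colours onto :root as CSS variables.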
+ var setTheme = (theme) => {
+
+ document.getElementById("docsify-darklight-theme").setAttribute("data-link-title", theme);
+ document.getElementById("docsify-darklight-theme").click()
+ localStorage.setItem('DARK_LIGHT_THEME', theme);
+ themeConfig.defaultTheme = theme;
+
+ if(theme == "light") {
+ for (var [key, value] of Object.entries(themeConfig.light))
+ document.documentElement.style.setProperty('--'+key , value)
+ } else if ( theme == "dark") {
+ for (var [key, value] of Object.entries(themeConfig.dark))
+ document.documentElement.style.setProperty('--'+key , value)
+ }
+
+
+ document.getElementById('docsify-darklight-theme')
+ .setAttribute('aria-pressed', theme === 'dark');
+
+ }
+
+
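+    // After every page render: restore the theme persisted in localStorage (falling
+    // back to the default chosen above) and let the toggle button flip between the two.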
+ hook.doneEach(function() {
+ let savedTheme = localStorage.getItem('DARK_LIGHT_THEME')
+ if ( savedTheme == "light" || savedTheme == "dark") {
+ themeConfig.defaultTheme = savedTheme;
+ setTheme(themeConfig.defaultTheme)
+ } else {
+ setTheme(themeConfig.defaultTheme);
+ }
+
+        document.getElementById('docsify-darklight-theme').addEventListener('click', function() {
+            themeConfig.defaultTheme === 'light' ? setTheme('dark') : setTheme('light');
+        });
+ })
+ }
+
+ window.$docsify.plugins = [].concat(plugin, window.$docsify.plugins)
\ No newline at end of file
diff --git a/assets/js/main.js b/assets/js/main.js
new file mode 100644
index 0000000..8bef165
--- /dev/null
+++ b/assets/js/main.js
@@ -0,0 +1,84 @@
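+// Minimal stylesheet switcher used by the dark/light toggle.
+// Clicking any element that carries data-link-href / data-link-title swaps the
+// active <link rel="stylesheet">; the active href is kept in sessionStorage and
+// restored on the next page load within the same session.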
+(function() {
+ function initStyleSwitcher() {
+    var isInitialized = false;
+ var sessionStorageKey = 'activeStylesheetHref';
+
+ function handleSwitch(activeHref, activeTitle) {
+ var activeElm = document.querySelector('link[href*="' + activeHref +'"],link[title="' + activeTitle +'"]');
+
+ if (!activeElm && activeHref) {
+ activeElm = document.createElement('link');
+ activeElm.setAttribute('href', activeHref);
+ activeElm.setAttribute('rel', 'stylesheet');
+ activeElm.setAttribute('title', activeTitle);
+
+ document.head.appendChild(activeElm);
+
+ activeElm.addEventListener('load', function linkOnLoad() {
+ activeElm.removeEventListener('load', linkOnLoad);
+ setActiveLink(activeElm);
+ });
+ }
+ else if (activeElm) {
+ setActiveLink(activeElm);
+ }
+ }
+
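+ // Enables the chosen stylesheet, disables every other titled <link>,
+ // stores the active href in sessionStorage, and refreshes
+ // docsify-themeable's CSS custom properties when that plugin is present.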
+ function setActiveLink(activeElm) {
+ var activeHref = activeElm.getAttribute('href');
+ var activeTitle = activeElm.getAttribute('title');
+ var inactiveElms = document.querySelectorAll('link[title]:not([href*="' + activeHref +'"]):not([title="' + activeTitle +'"])');
+
+ activeElm.setAttribute('rel', (activeElm.rel || '').replace(/\s*alternate/g, '').trim());
+
+ activeElm.disabled = true;
+ activeElm.disabled = false;
+
+ sessionStorage.setItem(sessionStorageKey, activeHref);
+
+ for (var i = 0; i < inactiveElms.length; i++) {
+ var elm = inactiveElms[i];
+
+ elm.disabled = true;
+
+ if (window.browsersyncObserver) {
+ var linkRel = elm.getAttribute('rel') || '';
+ var linkRelAlt = linkRel.indexOf('alternate') > -1 ? linkRel : (linkRel + ' alternate').trim();
+
+ elm.setAttribute('rel', linkRelAlt);
+ }
+ }
+
+ if ((window.$docsify || {}).themeable) {
+ window.$docsify.themeable.util.cssVars();
+ }
+ }
+
+ if (!isInitialized) {
+ isInitialized = true;
+ document.addEventListener('DOMContentLoaded', function() {
+ var activeHref = sessionStorage.getItem(sessionStorageKey);
+
+ if (activeHref) {
+ handleSwitch(activeHref);
+ }
+ });
+
+ document.addEventListener('click', function(evt) {
+ var dataHref = evt.target.getAttribute('data-link-href');
+ var dataTitle = evt.target.getAttribute('data-link-title')
+
+ if (dataHref || dataTitle) {
+ dataTitle = dataTitle
+ || evt.target.textContent
+ || '_' + Math.random().toString(36).substr(2, 9); // UID
+
+ handleSwitch(dataHref, dataTitle);
+ evt.preventDefault();
+ }
+ });
+ }
+ }
+
+ initStyleSwitcher();
+})();
diff --git a/assets/prism-python.js b/assets/prism-python.js
new file mode 100644
index 0000000..e5ea11f
--- /dev/null
+++ b/assets/prism-python.js
@@ -0,0 +1,78 @@
+Prism.languages.python = {
+ 'comment': {
+ pattern: /(^|[^\\])#.*/,
+ lookbehind: true
+ },
+ 'string-interpolation': {
+ pattern: /(?:f|rf|fr)(?:("""|''')[\s\S]*?\1|("|')(?:\\.|(?!\2)[^\\\r\n])*\2)/i,
+ greedy: true,
+ inside: {
+ 'interpolation': {
+ // "{" "}"
+ pattern: /((?:^|[^{])(?:{{)*){(?!{)(?:[^{}]|{(?!{)(?:[^{}]|{(?!{)(?:[^{}])+})+})+}/,
+ lookbehind: true,
+ inside: {
+ 'format-spec': {
+ pattern: /(:)[^:(){}]+(?=}$)/,
+ lookbehind: true
+ },
+ 'conversion-option': {
+ pattern: /![sra](?=[:}]$)/,
+ alias: 'punctuation'
+ },
+ rest: null
+ }
+ },
+ 'string': /[\s\S]+/
+ }
+ },
+ 'triple-quoted-string': {
+ pattern: /(?:[rub]|rb|br)?("""|''')[\s\S]*?\1/i,
+ greedy: true,
+ alias: 'string'
+ },
+ 'string': {
+ pattern: /(?:[rub]|rb|br)?("|')(?:\\.|(?!\1)[^\\\r\n])*\1/i,
+ greedy: true
+ },
+ 'function': {
+ pattern: /((?:^|\s)def[ \t]+)[a-zA-Z_]\w*(?=\s*\()/g,
+ lookbehind: true
+ },
+ 'class-name': {
+ pattern: /(\bclass\s+)\w+/i,
+ lookbehind: true
+ },
+ 'decorator': {
+ pattern: /(^\s*)@\w+(?:\.\w+)*/im,
+ lookbehind: true,
+ alias: ['annotation', 'punctuation'],
+ inside: {
+ 'punctuation': /\./
+ }
+ },
+ 'keyword': /\b(?:and|as|assert|async|await|break|class|continue|def|del|elif|else|except|exec|finally|for|from|global|if|import|in|is|lambda|nonlocal|not|or|pass|print|raise|return|try|while|with|yield)\b/,
+ 'builtin': /\b(?:__import__|abs|all|any|apply|ascii|basestring|bin|bool|buffer|bytearray|bytes|callable|chr|classmethod|cmp|coerce|compile|complex|delattr|dict|dir|divmod|enumerate|eval|execfile|file|filter|format|frozenset|getattr|globals|hasattr|hash|help|hex|id|input|int|float|intern|isinstance|issubclass|iter|len|list|locals|long|map|max|memoryview|min|next|object|oct|open|ord|pow|property|range|raw_input|reduce|reload|repr|reversed|round|set|setattr|slice|sorted|staticmethod|str|sum|super|tuple|type|unichr|unicode|vars|xrange|zip)\b/,
+ 'boolean': /\b(?:True|False|None)\b/,
+ 'number': /(?:\b(?=\d)|\B(?=\.))(?:0[bo])?(?:(?:\d|0x[\da-f])[\da-f]*\.?\d*|\.\d+)(?:e[+-]?\d+)?j?\b/i,
+ 'operator': /[-+%=]=?|!=|\*\*?=?|\/\/?=?|<[<=>]?|>[=>]?|[&|^~]/,
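+ // Custom tokens (not part of the stock Prism Python grammar):
+ // 'after-two-points' matches a type annotation following a colon, and
+ // 'type-annotation-tuple' matches Tuple[...] annotations, so each gets its
+ // own highlighting class.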
+ 'after-two-points': {
+ pattern: /(:)[^:(){}]+(?=}$|\)|,)/,
+ lookbehind: true
+ },
+ 'type-annotation-tuple': {
+ pattern: /\bTuple\[[^\[\]]*\]/,
+ lookbehind: true,
+ inside: {
+ 'tuple-elements': {
+ pattern: /\b\w+\b(?=\s*(?:,|$))/g,
+ lookbehind: true,
+ },
+ }
+ },
+ 'punctuation': /[{}[\];(),.:]/
+};
+
+Prism.languages.python['string-interpolation'].inside['interpolation'].inside.rest = Prism.languages.python;
+
+Prism.languages.py = Prism.languages.python;
diff --git a/assets/search.js b/assets/search.js
new file mode 100644
index 0000000..24736ae
--- /dev/null
+++ b/assets/search.js
@@ -0,0 +1,543 @@
+(function () {
+ /**
+ * Converts a colon formatted string to a object with properties.
+ *
+ * It processes the provided string, looks for any tokens in the format
+ * `:name[=value]`, and converts them into an object that is returned.
+ * For example, ':include :type=code :fragment=demo' is converted to:
+ *
+ * ```
+ * {
+ * include: '',
+ * type: 'code',
+ * fragment: 'demo'
+ * }
+ * ```
+ *
+ * @param {string} str The string to parse.
+ *
+ * @return {object} The original string and parsed object, { str, config }.
+ */
+ function getAndRemoveConfig(str) {
+ if ( str === void 0 ) str = '';
+
+ var config = {};
+
+ if (str) {
+ str = str
+ .replace(/^('|")/, '')
+ .replace(/('|")$/, '')
+ .replace(/(?:^|\s):([\w-]+:?)=?([\w-%]+)?/g, function (m, key, value) {
+ if (key.indexOf(':') === -1) {
+ config[key] = (value && value.replace(/"/g, '')) || true;
+ return '';
+ }
+
+ return m;
+ })
+ .trim();
+ }
+
+ return { str: str, config: config };
+ }
+
+ function removeDocsifyIgnoreTag(str) {
+ return str
+ .replace(/<!-- {docsify-ignore} -->/, '')
+ .replace(/{docsify-ignore}/, '')
+ .replace(/<!-- {docsify-ignore-all} -->/, '')
+ .replace(/{docsify-ignore-all}/, '')
+ .trim();
+ }
+
+ /* eslint-disable no-unused-vars */
+
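+ // In-memory search index: INDEXS[path][slug] = { slug, title, body },
+ // mirrored to localStorage so it can be reused across visits.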
+ var INDEXS = {};
+
+ var LOCAL_STORAGE = {
+ EXPIRE_KEY: 'docsify.search.expires',
+ INDEX_KEY: 'docsify.search.index',
+ };
+
+ function resolveExpireKey(namespace) {
+ return namespace
+ ? ((LOCAL_STORAGE.EXPIRE_KEY) + "/" + namespace)
+ : LOCAL_STORAGE.EXPIRE_KEY;
+ }
+
+ function resolveIndexKey(namespace) {
+ return namespace
+ ? ((LOCAL_STORAGE.INDEX_KEY) + "/" + namespace)
+ : LOCAL_STORAGE.INDEX_KEY;
+ }
+
+ function escapeHtml(string) {
+ var entityMap = {
+ '&': '&amp;',
+ '<': '&lt;',
+ '>': '&gt;',
+ '"': '&quot;',
+ "'": '&#39;',
+ };
+
+ return String(string).replace(/[&<>"']/g, function (s) { return entityMap[s]; });
+ }
+
+ function getAllPaths(router) {
+ var paths = [];
+
+ Docsify.dom
+ .findAll('.sidebar-nav a:not(.section-link):not([data-nosearch])')
+ .forEach(function (node) {
+ var href = node.href;
+ var originHref = node.getAttribute('href');
+ var path = router.parse(href).path;
+
+ if (
+ path &&
+ paths.indexOf(path) === -1 &&
+ !Docsify.util.isAbsolutePath(originHref)
+ ) {
+ paths.push(path);
+ }
+ });
+
+ return paths;
+ }
+
+ function getTableData(token) {
+ if (!token.text && token.type === 'table') {
+ token.cells.unshift(token.header);
+ token.text = token.cells
+ .map(function (rows) {
+ return rows.join(' | ');
+ })
+ .join(' |\n ');
+ }
+ return token.text;
+ }
+
+ function getListData(token) {
+ if (!token.text && token.type === 'list') {
+ token.text = token.raw;
+ }
+ return token.text;
+ }
+
+ function saveData(maxAge, expireKey, indexKey) {
+ localStorage.setItem(expireKey, Date.now() + maxAge);
+ localStorage.setItem(indexKey, JSON.stringify(INDEXS));
+ }
+
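+ // Builds the search index for one page: the Markdown source is lexed with
+ // marked, and every heading up to `depth` starts a new entry whose body
+ // accumulates the text of the tokens that follow it.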
+ function genIndex(path, content, router, depth) {
+ if ( content === void 0 ) content = '';
+
+ var tokens = window.marked.lexer(content);
+ var slugify = window.Docsify.slugify;
+ var index = {};
+ var slug;
+ var title = '';
+
+ tokens.forEach(function (token, tokenIndex) {
+ if (token.type === 'heading' && token.depth <= depth) {
+ var ref = getAndRemoveConfig(token.text);
+ var str = ref.str;
+ var config = ref.config;
+
+ var text = removeDocsifyIgnoreTag(token.text);
+
+ if (config.id) {
+ slug = router.toURL(path, { id: slugify(config.id) });
+ } else {
+ slug = router.toURL(path, { id: slugify(escapeHtml(text)) });
+ }
+
+ if (str) {
+ title = removeDocsifyIgnoreTag(str);
+ }
+
+ index[slug] = { slug: slug, title: title, body: '' };
+ } else {
+ if (tokenIndex === 0) {
+ slug = router.toURL(path);
+ index[slug] = {
+ slug: slug,
+ title: path !== '/' ? path.slice(1) : 'Home Page',
+ body: token.text || '',
+ };
+ }
+
+ if (!slug) {
+ return;
+ }
+
+ if (!index[slug]) {
+ index[slug] = { slug: slug, title: '', body: '' };
+ } else if (index[slug].body) {
+ token.text = getTableData(token);
+ token.text = getListData(token);
+
+ index[slug].body += '\n' + (token.text || '');
+ } else {
+ token.text = getTableData(token);
+ token.text = getListData(token);
+
+ index[slug].body = index[slug].body
+ ? index[slug].body + token.text
+ : token.text;
+ }
+ }
+ });
+ slugify.clear();
+ return index;
+ }
+
+ function ignoreDiacriticalMarks(keyword) {
+ if (keyword && keyword.normalize) {
+ return keyword.normalize('NFD').replace(/[\u0300-\u036f]/g, '');
+ }
+ return keyword;
+ }
+
+ /**
+ * @param {String} query Search query
+ * @returns {Array} Array of results
+ */
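+ // A title hit scores 3 and a body hit 2 per keyword; results are returned
+ // sorted by descending score, with an excerpt around the first body match.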
+ function search(query) {
+ var matchingResults = [];
+ var data = [];
+ Object.keys(INDEXS).forEach(function (key) {
+ data = data.concat(Object.keys(INDEXS[key]).map(function (page) { return INDEXS[key][page]; }));
+ });
+
+ query = query.trim();
+ var keywords = query.split(/[\s\-,\\/]+/);
+ if (keywords.length !== 1) {
+ keywords = [].concat(query, keywords);
+ }
+
+ var loop = function ( i ) {
+ var post = data[i];
+ var matchesScore = 0;
+ var resultStr = '';
+ var handlePostTitle = '';
+ var handlePostContent = '';
+ var postTitle = post.title && post.title.trim();
+ var postContent = post.body && post.body.trim();
+ var postUrl = post.slug || '';
+
+ if (postTitle) {
+ keywords.forEach(function (keyword) {
+ // From https://github.com/sindresorhus/escape-string-regexp
+ var regEx = new RegExp(
+ escapeHtml(ignoreDiacriticalMarks(keyword)).replace(
+ /[|\\{}()[\]^$+*?.]/g,
+ '\\$&'
+ ),
+ 'gi'
+ );
+ var indexTitle = -1;
+ var indexContent = -1;
+ handlePostTitle = postTitle
+ ? escapeHtml(ignoreDiacriticalMarks(postTitle))
+ : postTitle;
+ handlePostContent = postContent
+ ? escapeHtml(ignoreDiacriticalMarks(postContent))
+ : postContent;
+
+ indexTitle = postTitle ? handlePostTitle.search(regEx) : -1;
+ indexContent = postContent ? handlePostContent.search(regEx) : -1;
+
+ if (indexTitle >= 0 || indexContent >= 0) {
+ matchesScore += indexTitle >= 0 ? 3 : indexContent >= 0 ? 2 : 0;
+ if (indexContent < 0) {
+ indexContent = 0;
+ }
+
+ var start = 0;
+ var end = 0;
+
+ start = indexContent < 11 ? 0 : indexContent - 10;
+ end = start === 0 ? 70 : indexContent + keyword.length + 60;
+
+ if (postContent && end > postContent.length) {
+ end = postContent.length;
+ }
+
+ var matchContent =
+ handlePostContent &&
+ '...' +
+ handlePostContent
+ .substring(start, end)
+ .replace(
+ regEx,
+ function (word) { return ("<em class=\"search-keyword\">" + word + "</em>"); }
+ ) +
+ '...';
+
+ resultStr += matchContent;
+ }
+ });
+
+ if (matchesScore > 0) {
+ var matchingPost = {
+ title: handlePostTitle,
+ content: postContent ? resultStr : '',
+ url: postUrl,
+ score: matchesScore,
+ };
+
+ matchingResults.push(matchingPost);
+ }
+ }
+ };
+
+ for (var i = 0; i < data.length; i++) loop( i );
+
+ return matchingResults.sort(function (r1, r2) { return r2.score - r1.score; });
+ }
+
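+ // Builds or restores the index: resolves the page paths (from the sidebar
+ // when `paths: 'auto'`), reuses the cached localStorage index until `maxAge`
+ // expires, and fetches and indexes any page that is not cached yet.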
+ function init(config, vm) {
+ var isAuto = config.paths === 'auto';
+ var paths = isAuto ? getAllPaths(vm.router) : config.paths;
+
+ var namespaceSuffix = '';
+
+ // only in auto mode
+ if (paths.length && isAuto && config.pathNamespaces) {
+ var path = paths[0];
+
+ if (Array.isArray(config.pathNamespaces)) {
+ namespaceSuffix =
+ config.pathNamespaces.filter(
+ function (prefix) { return path.slice(0, prefix.length) === prefix; }
+ )[0] || namespaceSuffix;
+ } else if (config.pathNamespaces instanceof RegExp) {
+ var matches = path.match(config.pathNamespaces);
+
+ if (matches) {
+ namespaceSuffix = matches[0];
+ }
+ }
+ var isExistHome = paths.indexOf(namespaceSuffix + '/') === -1;
+ var isExistReadme = paths.indexOf(namespaceSuffix + '/README') === -1;
+ if (isExistHome && isExistReadme) {
+ paths.unshift(namespaceSuffix + '/');
+ }
+ } else if (paths.indexOf('/') === -1 && paths.indexOf('/README') === -1) {
+ paths.unshift('/');
+ }
+
+ var expireKey = resolveExpireKey(config.namespace) + namespaceSuffix;
+ var indexKey = resolveIndexKey(config.namespace) + namespaceSuffix;
+
+ var isExpired = localStorage.getItem(expireKey) < Date.now();
+
+ INDEXS = JSON.parse(localStorage.getItem(indexKey));
+
+ if (isExpired) {
+ INDEXS = {};
+ } else if (!isAuto) {
+ return;
+ }
+
+ var len = paths.length;
+ var count = 0;
+
+ paths.forEach(function (path) {
+ if (INDEXS[path]) {
+ return count++;
+ }
+
+ Docsify.get(vm.router.getFile(path), false, vm.config.requestHeaders).then(
+ function (result) {
+ INDEXS[path] = genIndex(path, result, vm.router, config.depth);
+ len === ++count && saveData(config.maxAge, expireKey, indexKey);
+ }
+ );
+ });
+ }
+
+ /* eslint-disable no-unused-vars */
+
+ var NO_DATA_TEXT = '';
+ var options;
+
+ function style() {
+ var code = "\n.sidebar {\n padding-top: 0;\n}\n\n.search {\n margin-bottom: 20px;\n padding: 6px;\n border-bottom: 1px solid #eee;\n}\n\n.search .input-wrap {\n display: flex;\n align-items: center;\n}\n\n.search .results-panel {\n display: none;\n}\n\n.search .results-panel.show {\n display: block;\n}\n\n.search input {\n outline: none;\n border: none;\n width: 100%;\n padding: 0.6em 7px;\n font-size: inherit;\n border: 1px solid transparent;\n}\n\n.search input:focus {\n box-shadow: 0 0 5px var(--theme-color, #42b983);\n border: 1px solid var(--theme-color, #42b983);\n}\n\n.search input::-webkit-search-decoration,\n.search input::-webkit-search-cancel-button,\n.search input {\n -webkit-appearance: none;\n -moz-appearance: none;\n appearance: none;\n}\n\n.search input::-ms-clear {\n display: none;\n height: 0;\n width: 0;\n}\n\n.search .clear-button {\n cursor: pointer;\n width: 36px;\n text-align: right;\n display: none;\n}\n\n.search .clear-button.show {\n display: block;\n}\n\n.search .clear-button svg {\n transform: scale(.5);\n}\n\n.search h2 {\n font-size: 17px;\n margin: 10px 0;\n}\n\n.search a {\n text-decoration: none;\n color: inherit;\n}\n\n.search .matching-post {\n border-bottom: 1px solid #eee;\n}\n\n.search .matching-post:last-child {\n border-bottom: 0;\n}\n\n.search p {\n font-size: 14px;\n overflow: hidden;\n text-overflow: ellipsis;\n display: -webkit-box;\n -webkit-line-clamp: 2;\n -webkit-box-orient: vertical;\n}\n\n.search p.empty {\n text-align: center;\n}\n\n.app-name.hide, .sidebar-nav.hide {\n display: none;\n}";
+
+ Docsify.dom.style(code);
+ }
+
+ function tpl(defaultValue) {
+ if ( defaultValue === void 0 ) defaultValue = '';
+
+ var html = "\n    <div class=\"input-wrap\">\n      <input type=\"search\" value=\"" + defaultValue + "\" aria-label=\"Search text\" />\n      <div class=\"clear-button\">\n        <svg width=\"26\" height=\"24\">\n          <circle cx=\"12\" cy=\"12\" r=\"11\" fill=\"#ccc\" />\n          <path stroke=\"white\" stroke-width=\"2\" d=\"M8.25,8.25,15.75,15.75\" />\n          <path stroke=\"white\" stroke-width=\"2\" d=\"M8.25,15.75,15.75,8.25\" />\n        </svg>\n      </div>\n    </div>\n    <div class=\"results-panel\"></div>\n    ";
+ var el = Docsify.dom.create('div', html);
+ var aside = Docsify.dom.find('aside');
+
+ Docsify.dom.toggleClass(el, 'search');
+ Docsify.dom.before(aside, el);
+ }
+
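+ // Renders the results panel for the current query; an empty query hides the
+ // panel and, when hideOtherSidebarContent is set, restores the sidebar.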
+ function doSearch(value) {
+ var $search = Docsify.dom.find('div.search');
+ var $panel = Docsify.dom.find($search, '.results-panel');
+ var $clearBtn = Docsify.dom.find($search, '.clear-button');
+ var $sidebarNav = Docsify.dom.find('.sidebar-nav');
+ var $appName = Docsify.dom.find('.app-name');
+
+ if (!value) {
+ $panel.classList.remove('show');
+ $clearBtn.classList.remove('show');
+ $panel.innerHTML = '';
+
+ if (options.hideOtherSidebarContent) {
+ $sidebarNav && $sidebarNav.classList.remove('hide');
+ $appName && $appName.classList.remove('hide');
+ }
+
+ return;
+ }
+
+ var matchs = search(value);
+
+ var html = '';
+ matchs.forEach(function (post) {
+ html += "<div class=\"matching-post\">\n<a href=\"" + (post.url) + "\">\n<h2>" + (post.title) + "</h2>\n<p>" + (post.content) + "</p>\n</a>\n</div>";
+ });
+
+ $panel.classList.add('show');
+ $clearBtn.classList.add('show');
+ $panel.innerHTML = html || ("<p class=\"empty\">" + NO_DATA_TEXT + "</p>");
+ if (options.hideOtherSidebarContent) {
+ $sidebarNav && $sidebarNav.classList.add('hide');
+ $appName && $appName.classList.add('hide');
+ }
+ }
+
+ function bindEvents() {
+ var $search = Docsify.dom.find('div.search');
+ var $input = Docsify.dom.find($search, 'input');
+ var $inputWrap = Docsify.dom.find($search, '.input-wrap');
+
+ var timeId;
+
+ /**
+ Prevent the sidebar from folding.
+
+ On mobile, tapping the search input would otherwise collapse the
+ sidebar, making it impossible to search.
+ */
+ Docsify.dom.on(
+ $search,
+ 'click',
+ function (e) { return ['A', 'H2', 'P', 'EM'].indexOf(e.target.tagName) === -1 &&
+ e.stopPropagation(); }
+ );
+ Docsify.dom.on($input, 'input', function (e) {
+ clearTimeout(timeId);
+ timeId = setTimeout(function (_) { return doSearch(e.target.value.trim()); }, 100);
+ });
+ Docsify.dom.on($inputWrap, 'click', function (e) {
+ // Click inside the wrap but not on the input (e.g. the clear button) resets the search
+ if (e.target.tagName !== 'INPUT') {
+ $input.value = '';
+ doSearch();
+ }
+ });
+ }
+
+ function updatePlaceholder(text, path) {
+ var $input = Docsify.dom.getNode('.search input[type="search"]');
+
+ if (!$input) {
+ return;
+ }
+
+ if (typeof text === 'string') {
+ $input.placeholder = text;
+ } else {
+ var match = Object.keys(text).filter(function (key) { return path.indexOf(key) > -1; })[0];
+ $input.placeholder = text[match];
+ }
+ }
+
+ function updateNoData(text, path) {
+ if (typeof text === 'string') {
+ NO_DATA_TEXT = text;
+ } else {
+ var match = Object.keys(text).filter(function (key) { return path.indexOf(key) > -1; })[0];
+ NO_DATA_TEXT = text[match];
+ }
+ }
+
+ function updateOptions(opts) {
+ options = opts;
+ }
+
+ function init$1(opts, vm) {
+ var keywords = vm.router.parse().query.s;
+
+ updateOptions(opts);
+ style();
+ tpl(keywords);
+ bindEvents();
+ keywords && setTimeout(function (_) { return doSearch(keywords); }, 500);
+ }
+
+ function update(opts, vm) {
+ updateOptions(opts);
+ updatePlaceholder(opts.placeholder, vm.route.path);
+ updateNoData(opts.noData, vm.route.path);
+ }
+
+ /* eslint-disable no-unused-vars */
+
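+ // Default plugin options; anything the site sets under $docsify.search
+ // overrides these in the install hook below.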
+ var CONFIG = {
+ placeholder: 'Type to search',
+ noData: 'No Results!',
+ paths: 'auto',
+ depth: 2,
+ maxAge: 86400000, // 1 day
+ hideOtherSidebarContent: false,
+ namespace: undefined,
+ pathNamespaces: undefined,
+ };
+
+ var install = function (hook, vm) {
+ var util = Docsify.util;
+ var opts = vm.config.search || CONFIG;
+
+ if (Array.isArray(opts)) {
+ CONFIG.paths = opts;
+ } else if (typeof opts === 'object') {
+ CONFIG.paths = Array.isArray(opts.paths) ? opts.paths : 'auto';
+ CONFIG.maxAge = util.isPrimitive(opts.maxAge) ? opts.maxAge : CONFIG.maxAge;
+ CONFIG.placeholder = opts.placeholder || CONFIG.placeholder;
+ CONFIG.noData = opts.noData || CONFIG.noData;
+ CONFIG.depth = opts.depth || CONFIG.depth;
+ CONFIG.hideOtherSidebarContent =
+ opts.hideOtherSidebarContent || CONFIG.hideOtherSidebarContent;
+ CONFIG.namespace = opts.namespace || CONFIG.namespace;
+ CONFIG.pathNamespaces = opts.pathNamespaces || CONFIG.pathNamespaces;
+ }
+
+ var isAuto = CONFIG.paths === 'auto';
+
+ hook.mounted(function (_) {
+ init$1(CONFIG, vm);
+ !isAuto && init(CONFIG, vm);
+ });
+ hook.doneEach(function (_) {
+ update(CONFIG, vm);
+ isAuto && init(CONFIG, vm);
+ });
+ };
+
+ $docsify.plugins = [].concat(install, $docsify.plugins);
+
+ }());
+
\ No newline at end of file
diff --git a/assets/styles/theme-simple-dark.css b/assets/styles/theme-simple-dark.css
new file mode 100644
index 0000000..34f640b
--- /dev/null
+++ b/assets/styles/theme-simple-dark.css
@@ -0,0 +1,155 @@
+.github-corner{position:absolute;z-index:40;top:0;right:0;border-bottom:0;text-decoration:none}.github-corner svg{height:70px;width:70px;fill:var(--theme-color);color:var(--base-background-color)}.github-corner:hover .octo-arm{-webkit-animation:octocat-wave 560ms ease-in-out;animation:octocat-wave 560ms ease-in-out}@-webkit-keyframes octocat-wave{0%,100%{transform:rotate(0)}20%,60%{transform:rotate(-25deg)}40%,80%{transform:rotate(10deg)}}@keyframes octocat-wave{0%,100%{transform:rotate(0)}20%,60%{transform:rotate(-25deg)}40%,80%{transform:rotate(10deg)}}.progress{position:fixed;z-index:2147483647;top:0;left:0;right:0;height:3px;width:0;background-color:var(--theme-color);transition:width var(--duration-fast),opacity calc(var(--duration-fast)*2)}body.ready-transition:after,body.ready-transition>*:not(.progress){opacity:0;transition:opacity var(--spinner-transition-duration)}body.ready-transition:after{content:"";position:absolute;z-index:1000;top:calc(50% - var(--spinner-size)/2);left:calc(50% - var(--spinner-size)/2);height:var(--spinner-size);width:var(--spinner-size);border:var(--spinner-track-width, 0) solid var(--spinner-track-color);border-left-color:var(--theme-color);border-left-color:var(--theme-color);border-radius:50%;-webkit-animation:spinner var(--duration-slow) infinite linear;animation:spinner var(--duration-slow) infinite linear}body.ready-transition.ready-spinner:after{opacity:1}body.ready-transition.ready-fix:after{opacity:0}body.ready-transition.ready-fix>*:not(.progress){opacity:1;transition-delay:var(--spinner-transition-duration)}@-webkit-keyframes spinner{0%{transform:rotate(0deg)}100%{transform:rotate(360deg)}}@keyframes spinner{0%{transform:rotate(0deg)}100%{transform:rotate(360deg)}}*,*:before,*:after{box-sizing:inherit;font-size:inherit;-webkit-overflow-scrolling:touch;-webkit-tap-highlight-color:rgba(0,0,0,0);-webkit-text-size-adjust:none;-webkit-touch-callout:none}:root{box-sizing:border-box;background-color:var(--base-background-color);font-size:var(--base-font-size);font-weight:var(--base-font-weight);line-height:var(--base-line-height);letter-spacing:var(--base-letter-spacing);color:var(--base-color);-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale;font-smoothing:antialiased}html,button,input,optgroup,select,textarea{font-family:var(--base-font-family)}button,input,optgroup,select,textarea{font-size:100%;margin:0}a{text-decoration:none;-webkit-text-decoration-skip:ink;text-decoration-skip-ink:auto}body{margin:0}hr{height:0;margin:2em 0;border:none;border-bottom:var(--hr-border, 0)}img{max-width:100%;border:0}main{display:block}main.hidden{display:none}mark{background:var(--mark-background);color:var(--mark-color)}pre{font-family:var(--pre-font-family);font-size:var(--pre-font-size);font-weight:var(--pre-font-weight);line-height:var(--pre-line-height)}small{display:inline-block;font-size:var(--small-font-size)}strong{font-weight:var(--strong-font-weight);color:var(--strong-color, currentColor)}sub,sup{font-size:var(--subsup-font-size);line-height:0;position:relative;vertical-align:baseline}sub{bottom:-0.25em}sup{top:-0.5em}body:not([data-platform^=Mac]) *{scrollbar-color:hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.3) hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.1);scrollbar-width:thin}body:not([data-platform^=Mac]) * ::-webkit-scrollbar{width:5px;height:5px}body:not([data-platform^=Mac]) * ::-webkit-scrollbar-thumb{background:hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.3)}body:not([data-platform^=Mac]) * 
::-webkit-scrollbar-track{background:hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.1)}::-moz-selection{background:var(--selection-color)}::selection{background:var(--selection-color)}.emoji{height:var(--emoji-size);vertical-align:middle}.task-list-item{list-style:none}.task-list-item input{margin-right:.5em;margin-left:0;vertical-align:.075em}.markdown-section code[class*=lang-],.markdown-section pre[data-lang]{font-family:var(--code-font-family);font-size:var(--code-font-size);font-weight:var(--code-font-weight);letter-spacing:normal;line-height:var(--code-block-line-height);-moz-tab-size:var(--code-tab-size);-o-tab-size:var(--code-tab-size);tab-size:var(--code-tab-size);text-align:left;white-space:pre;word-spacing:normal;word-wrap:normal;word-break:normal;-webkit-hyphens:none;hyphens:none}.markdown-section pre[data-lang]{position:relative;overflow:hidden;margin:var(--code-block-margin);padding:0;border-radius:var(--code-block-border-radius)}.markdown-section pre[data-lang]::after{content:attr(data-lang);position:absolute;top:.75em;right:.75em;opacity:.6;color:inherit;font-size:var(--font-size-s);line-height:1}.markdown-section pre[data-lang] code{display:block;overflow:auto;padding:var(--code-block-padding)}code[class*=lang-],pre[data-lang]{color:var(--code-theme-text)}pre[data-lang]::-moz-selection,pre[data-lang] ::-moz-selection,code[class*=lang-]::-moz-selection,code[class*=lang-] ::-moz-selection{background:var(--code-theme-selection, var(--selection-color))}pre[data-lang]::-moz-selection, pre[data-lang] ::-moz-selection, code[class*=lang-]::-moz-selection, code[class*=lang-] ::-moz-selection{background:var(--code-theme-selection, var(--selection-color))}pre[data-lang]::selection,pre[data-lang] ::selection,code[class*=lang-]::selection,code[class*=lang-] ::selection{background:var(--code-theme-selection, var(--selection-color))}:not(pre)>code[class*=lang-],pre[data-lang]{background:var(--code-theme-background)}.namespace{opacity:.7}.token.comment,.token.prolog,.token.doctype,.token.cdata{color:var(--code-theme-comment)}.token.punctuation{color:var(--code-theme-punctuation)}.token.property,.token.tag,.token.boolean,.token.number,.token.constant,.token.symbol,.token.deleted{color:var(--code-theme-tag)}.token.selector,.token.attr-name,.token.string,.token.char,.token.builtin,.token.inserted{color:var(--code-theme-selector)}.token.operator,.token.entity,.token.url,.language-css .token.string,.style .token.string{color:var(--code-theme-operator)}.token.atrule,.token.attr-value,.token.keyword{color:var(--code-theme-keyword)}.token.function{color:var(--code-theme-function)}.token.regex,.token.important,.token.variable{color:var(--code-theme-variable)}.token.important,.token.bold{font-weight:bold}.token.italic{font-style:italic}.token.entity{cursor:help}.markdown-section{position:relative;max-width:var(--content-max-width);margin:0 auto;padding:2rem 45px}.app-nav:not(:empty)~main .markdown-section{padding-top:3.5rem}.markdown-section figure,.markdown-section p,.markdown-section ol,.markdown-section ul{margin:1em 0}.markdown-section ol,.markdown-section ul{padding-left:1.5rem}.markdown-section ol ol,.markdown-section ol ul,.markdown-section ul ol,.markdown-section ul ul{margin-top:.15rem;margin-bottom:.15rem}.markdown-section 
a{border-bottom:var(--link-border-bottom);color:var(--link-color);-webkit-text-decoration:var(--link-text-decoration);text-decoration:var(--link-text-decoration);-webkit-text-decoration-color:var(--link-text-decoration-color);text-decoration-color:var(--link-text-decoration-color)}.markdown-section a:hover{border-bottom:var(--link-border-bottom--hover, var(--link-border-bottom, 0));color:var(--link-color--hover, var(--link-color));-webkit-text-decoration:var(--link-text-decoration--hover, var(--link-text-decoration));text-decoration:var(--link-text-decoration--hover, var(--link-text-decoration));-webkit-text-decoration-color:var(--link-text-decoration-color--hover, var(--link-text-decoration-color));text-decoration-color:var(--link-text-decoration-color--hover, var(--link-text-decoration-color))}.markdown-section a.anchor{border-bottom:0;color:inherit;text-decoration:none}.markdown-section a.anchor:hover{text-decoration:underline}.markdown-section blockquote{overflow:visible;margin:2em 0;padding:var(--blockquote-padding);border-width:var(--blockquote-border-width, 0);border-style:var(--blockquote-border-style);border-color:var(--blockquote-border-color);border-radius:var(--blockquote-border-radius);background:var(--blockquote-background);color:var(--blockquote-color);font-family:var(--blockquote-font-family);font-size:var(--blockquote-font-size);font-style:var(--blockquote-font-style);font-weight:var(--blockquote-font-weight);quotes:"“" "”" "‘" "’"}.markdown-section blockquote em{font-family:var(--blockquote-em-font-family);font-size:var(--blockquote-em-font-size);font-style:var(--blockquote-em-font-style);font-weight:var(--blockquote-em-font-weight)}.markdown-section blockquote p:first-child{margin-top:0}.markdown-section blockquote p:first-child:before,.markdown-section blockquote p:first-child:after{color:var(--blockquote-quotes-color);font-family:var(--blockquote-quotes-font-family);font-size:var(--blockquote-quotes-font-size);line-height:0}.markdown-section blockquote p:first-child:before{content:var(--blockquote-quotes-open);margin-right:.15em;vertical-align:-0.45em}.markdown-section blockquote p:first-child:after{content:var(--blockquote-quotes-close);margin-left:.15em;vertical-align:-0.55em}.markdown-section blockquote p:last-child{margin-bottom:0}.markdown-section code{font-family:var(--code-font-family);font-size:var(--code-font-size);font-weight:var(--code-font-weight);line-height:inherit}.markdown-section code:not([class*=lang-]):not([class*=language-]){margin:var(--code-inline-margin);padding:var(--code-inline-padding);border-radius:var(--code-inline-border-radius);background:var(--code-inline-background);color:var(--code-inline-color, currentColor);white-space:nowrap}.markdown-section h1:first-child,.markdown-section h2:first-child,.markdown-section h3:first-child,.markdown-section h4:first-child,.markdown-section h5:first-child,.markdown-section h6:first-child{margin-top:0}.markdown-section h1 a[data-id],.markdown-section h2 a[data-id],.markdown-section h3 a[data-id],.markdown-section h4 a[data-id],.markdown-section h5 a[data-id],.markdown-section h6 a[data-id]{display:inline-block}.markdown-section h1 code,.markdown-section h2 code,.markdown-section h3 code,.markdown-section h4 code,.markdown-section h5 code,.markdown-section h6 code{font-size:.875em}.markdown-section h1+h2,.markdown-section h1+h3,.markdown-section h1+h4,.markdown-section h1+h5,.markdown-section h1+h6,.markdown-section h2+h3,.markdown-section h2+h4,.markdown-section h2+h5,.markdown-section 
h2+h6,.markdown-section h3+h4,.markdown-section h3+h5,.markdown-section h3+h6,.markdown-section h4+h5,.markdown-section h4+h6,.markdown-section h5+h6{margin-top:1rem}.markdown-section h1{margin:var(--heading-h1-margin, var(--heading-margin));padding:var(--heading-h1-padding, var(--heading-padding));border-width:var(--heading-h1-border-width, 0);border-style:var(--heading-h1-border-style);border-color:var(--heading-h1-border-color);font-family:var(--heading-h1-font-family, var(--heading-font-family));font-size:var(--heading-h1-font-size);font-weight:var(--heading-h1-font-weight, var(--heading-font-weight));line-height:var(--base-line-height);color:var(--heading-h1-color, var(--heading-color))}.markdown-section h2{margin:var(--heading-h2-margin, var(--heading-margin));padding:var(--heading-h2-padding, var(--heading-padding));border-width:var(--heading-h2-border-width, 0);border-style:var(--heading-h2-border-style);border-color:var(--heading-h2-border-color);font-family:var(--heading-h2-font-family, var(--heading-font-family));font-size:var(--heading-h2-font-size);font-weight:var(--heading-h2-font-weight, var(--heading-font-weight));line-height:var(--base-line-height);color:var(--heading-h2-color, var(--heading-color))}.markdown-section h3{margin:var(--heading-h3-margin, var(--heading-margin));padding:var(--heading-h3-padding, var(--heading-padding));border-width:var(--heading-h3-border-width, 0);border-style:var(--heading-h3-border-style);border-color:var(--heading-h3-border-color);font-family:var(--heading-h3-font-family, var(--heading-font-family));font-size:var(--heading-h3-font-size);font-weight:var(--heading-h3-font-weight, var(--heading-font-weight));color:var(--heading-h3-color, var(--heading-color))}.markdown-section h4{margin:var(--heading-h4-margin, var(--heading-margin));padding:var(--heading-h4-padding, var(--heading-padding));border-width:var(--heading-h4-border-width, 0);border-style:var(--heading-h4-border-style);border-color:var(--heading-h4-border-color);font-family:var(--heading-h4-font-family, var(--heading-font-family));font-size:var(--heading-h4-font-size);font-weight:var(--heading-h4-font-weight, var(--heading-font-weight));color:var(--heading-h4-color, var(--heading-color))}.markdown-section h5{margin:var(--heading-h5-margin, var(--heading-margin));padding:var(--heading-h5-padding, var(--heading-padding));border-width:var(--heading-h5-border-width, 0);border-style:var(--heading-h5-border-style);border-color:var(--heading-h5-border-color);font-family:var(--heading-h5-font-family, var(--heading-font-family));font-size:var(--heading-h5-font-size);font-weight:var(--heading-h5-font-weight, var(--heading-font-weight));color:var(--heading-h5-color, var(--heading-color))}.markdown-section h6{margin:var(--heading-h6-margin, var(--heading-margin));padding:var(--heading-h6-padding, var(--heading-padding));border-width:var(--heading-h6-border-width, 0);border-style:var(--heading-h6-border-style);border-color:var(--heading-h6-border-color);font-family:var(--heading-h6-font-family, var(--heading-font-family));font-size:var(--heading-h6-font-size);font-weight:var(--heading-h6-font-weight, var(--heading-font-weight));color:var(--heading-h6-color, var(--heading-color))}.markdown-section iframe{margin:1em 0}.markdown-section img{max-width:50%;margin-left:auto;margin-right:auto;display: block;}.markdown-section 
kbd{display:inline-block;min-width:var(--kbd-min-width);margin:var(--kbd-margin);padding:var(--kbd-padding);border:var(--kbd-border);border-radius:var(--kbd-border-radius);background:var(--kbd-background);font-family:inherit;font-size:var(--kbd-font-size);text-align:center;letter-spacing:0;line-height:1;color:var(--kbd-color)}.markdown-section kbd+kbd{margin-left:-0.15em}.markdown-section table{display:block;overflow:auto;margin:1rem 0;border-spacing:0;border-collapse:collapse}.markdown-section th,.markdown-section td{padding:var(--table-cell-padding)}.markdown-section th:not([align]){text-align:left}.markdown-section thead{border-color:var(--table-head-border-color);border-style:solid;border-width:var(--table-head-border-width, 0);background:var(--table-head-background)}.markdown-section th{font-weight:var(--table-head-font-weight);color:var(--strong-color)}.markdown-section td{border-color:var(--table-cell-border-color);border-style:solid;border-width:var(--table-cell-border-width, 0)}.markdown-section tbody{border-color:var(--table-body-border-color);border-style:solid;border-width:var(--table-body-border-width, 0)}.markdown-section tbody tr:nth-child(odd){background:var(--table-row-odd-background)}.markdown-section tbody tr:nth-child(even){background:var(--table-row-even-background)}.markdown-section>ul .task-list-item{margin-left:-1.25em}.markdown-section>ul .task-list-item .task-list-item{margin-left:0}.markdown-section .table-wrapper{overflow-x:auto}.markdown-section .table-wrapper table{display:table;width:100%}.markdown-section .table-wrapper td::before{display:none}@media(max-width: 30em){.markdown-section .table-wrapper tbody,.markdown-section .table-wrapper tr,.markdown-section .table-wrapper td{display:block}.markdown-section .table-wrapper th,.markdown-section .table-wrapper td{border:none}.markdown-section .table-wrapper thead{display:none}.markdown-section .table-wrapper tr{border-color:var(--table-cell-border-color);border-style:solid;border-width:var(--table-cell-border-width, 0);padding:var(--table-cell-padding)}.markdown-section .table-wrapper tr:not(:last-child){border-bottom:0}.markdown-section .table-wrapper td{padding:.15em 0 .15em 8em}.markdown-section .table-wrapper td::before{display:inline-block;float:left;width:8em;margin-left:-8em;font-weight:bold;text-align:left}}.markdown-section .tip,.markdown-section .warn{position:relative;margin:2em 0;padding:var(--notice-padding);border-width:var(--notice-border-width, 0);border-style:var(--notice-border-style);border-color:var(--notice-border-color);border-radius:var(--notice-border-radius);background:var(--notice-background);font-family:var(--notice-font-family);font-weight:var(--notice-font-weight);color:var(--notice-color)}.markdown-section .tip:before,.markdown-section .warn:before{display:inline-block;position:var(--notice-before-position, relative);top:var(--notice-before-top);left:var(--notice-before-left);height:var(--notice-before-height);width:var(--notice-before-width);margin:var(--notice-before-margin);padding:var(--notice-before-padding);border-radius:var(--notice-before-border-radius);line-height:var(--notice-before-line-height);font-family:var(--notice-before-font-family);font-size:var(--notice-before-font-size);font-weight:var(--notice-before-font-weight);text-align:center}.markdown-section .tip{border-width:var(--notice-important-border-width, var(--notice-border-width, 0));border-style:var(--notice-important-border-style, var(--notice-border-style));border-color:var(--notice-important-border-color, 
var(--notice-border-color));background:var(--notice-important-background, var(--notice-background));color:var(--notice-important-color, var(--notice-color))}.markdown-section .tip:before{content:var(--notice-important-before-content, var(--notice-before-content));background:var(--notice-important-before-background, var(--notice-before-background));color:var(--notice-important-before-color, var(--notice-before-color))}.markdown-section .warn{border-width:var(--notice-tip-border-width, var(--notice-border-width, 0));border-style:var(--notice-tip-border-style, var(--notice-border-style));border-color:var(--notice-tip-border-color, var(--notice-border-color));background:var(--notice-tip-background, var(--notice-background));color:var(--notice-tip-color, var(--notice-color))}.markdown-section .warn:before{content:var(--notice-tip-before-content, var(--notice-before-content));background:var(--notice-tip-before-background, var(--notice-before-background));color:var(--notice-tip-before-color, var(--notice-before-color))}.cover{display:none;position:relative;z-index:20;min-height:100vh;flex-direction:column;align-items:center;justify-content:center;padding:calc(var(--cover-border-inset, 0px) + var(--cover-border-width, 0px));color:var(--cover-color);text-align:var(--cover-text-align)}@media screen and (-ms-high-contrast: active),screen and (-ms-high-contrast: none){.cover{height:100vh}}.cover:before,.cover:after{content:"";position:absolute}.cover:before{top:0;bottom:0;left:0;right:0;background-blend-mode:var(--cover-background-blend-mode);background-color:var(--cover-background-color);background-image:var(--cover-background-image);background-position:var(--cover-background-position);background-repeat:var(--cover-background-repeat);background-size:var(--cover-background-size)}.cover:after{top:var(--cover-border-inset, 0);bottom:var(--cover-border-inset, 0);left:var(--cover-border-inset, 0);right:var(--cover-border-inset, 0);border-width:var(--cover-border-width, 0);border-style:solid;border-color:var(--cover-border-color)}.cover a{border-bottom:var(--cover-link-border-bottom);color:var(--cover-link-color);-webkit-text-decoration:var(--cover-link-text-decoration);text-decoration:var(--cover-link-text-decoration);-webkit-text-decoration-color:var(--cover-link-text-decoration-color);text-decoration-color:var(--cover-link-text-decoration-color)}.cover a:hover{border-bottom:var(--cover-link-border-bottom--hover, var(--cover-link-border-bottom));color:var(--cover-link-color--hover, var(--cover-link-color));-webkit-text-decoration:var(--cover-link-text-decoration--hover, var(--cover-link-text-decoration));text-decoration:var(--cover-link-text-decoration--hover, var(--cover-link-text-decoration));-webkit-text-decoration-color:var(--cover-link-text-decoration-color--hover, var(--cover-link-text-decoration-color));text-decoration-color:var(--cover-link-text-decoration-color--hover, var(--cover-link-text-decoration-color))}.cover h1{color:var(--cover-heading-color);position:relative;margin:0;font-size:var(--cover-heading-font-size);font-weight:var(--cover-heading-font-weight);line-height:1.2}.cover h1 a,.cover h1 a:hover{display:block;border-bottom:none;color:inherit;text-decoration:none}.cover h1 small{position:absolute;bottom:0;margin-left:.5em}.cover h1 span{font-size:calc(var(--cover-heading-font-size-min)*1px)}@media(min-width: 26em){.cover h1 span{font-size:calc(var(--cover-heading-font-size-min)*1px + (var(--cover-heading-font-size-max) - var(--cover-heading-font-size-min))*(100vw - 
420px)/604)}}@media(min-width: 64em){.cover h1 span{font-size:calc(var(--cover-heading-font-size-max)*1px)}}.cover blockquote{margin:0;color:var(--cover-blockquote-color);font-size:var(--cover-blockquote-font-size)}.cover blockquote a{color:inherit}.cover ul{padding:0;list-style-type:none}.cover .cover-main{position:relative;z-index:1;max-width:var(--cover-max-width);margin:var(--cover-margin);padding:0 45px}.cover .cover-main>p:last-child{margin:1.25em -0.25em}.cover .cover-main>p:last-child a{display:block;margin:.375em .25em;padding:var(--cover-button-padding);border:var(--cover-button-border);border-radius:var(--cover-button-border-radius);box-shadow:var(--cover-button-box-shadow);background:var(--cover-button-background);text-align:center;-webkit-text-decoration:var(--cover-button-text-decoration);text-decoration:var(--cover-button-text-decoration);-webkit-text-decoration-color:var(--cover-button-text-decoration-color);text-decoration-color:var(--cover-button-text-decoration-color);color:var(--cover-button-color);white-space:nowrap;transition:var(--cover-button-transition)}.cover .cover-main>p:last-child a:hover{border:var(--cover-button-border--hover, var(--cover-button-border));box-shadow:var(--cover-button-box-shadow--hover, var(--cover-button-box-shadow));background:var(--cover-button-background--hover, var(--cover-button-background));-webkit-text-decoration:var(--cover-button-text-decoration--hover, var(--cover-button-text-decoration));text-decoration:var(--cover-button-text-decoration--hover, var(--cover-button-text-decoration));-webkit-text-decoration-color:var(--cover-button-text-decoration-color--hover, var(--cover-button-text-decoration-color));text-decoration-color:var(--cover-button-text-decoration-color--hover, var(--cover-button-text-decoration-color));color:var(--cover-button-color--hover, var(--cover-button-color))}.cover .cover-main>p:last-child a:first-child{border:var(--cover-button-primary-border, var(--cover-button-border));box-shadow:var(--cover-button-primary-box-shadow, var(--cover-button-box-shadow));background:var(--cover-button-primary-background, var(--cover-button-background));-webkit-text-decoration:var(--cover-button-primary-text-decoration, var(--cover-button-text-decoration));text-decoration:var(--cover-button-primary-text-decoration, var(--cover-button-text-decoration));-webkit-text-decoration-color:var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color));text-decoration-color:var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color));color:var(--cover-button-primary-color, var(--cover-button-color))}.cover .cover-main>p:last-child a:first-child:hover{border:var(--cover-button-primary-border--hover, var(--cover-button-border--hover, var(--cover-button-primary-border, var(--cover-button-border))));box-shadow:var(--cover-button-primary-box-shadow--hover, var(--cover-button-box-shadow--hover, var(--cover-button-primary-box-shadow, var(--cover-button-box-shadow))));background:var(--cover-button-primary-background--hover, var(--cover-button-background--hover, var(--cover-button-primary-background, var(--cover-button-background))));-webkit-text-decoration:var(--cover-button-primary-text-decoration--hover, var(--cover-button-text-decoration--hover, var(--cover-button-primary-text-decoration, var(--cover-button-text-decoration))));text-decoration:var(--cover-button-primary-text-decoration--hover, var(--cover-button-text-decoration--hover, var(--cover-button-primary-text-decoration, 
var(--cover-button-text-decoration))));-webkit-text-decoration-color:var(--cover-button-primary-text-decoration-color--hover, var(--cover-button-text-decoration-color--hover, var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color))));text-decoration-color:var(--cover-button-primary-text-decoration-color--hover, var(--cover-button-text-decoration-color--hover, var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color))));color:var(--cover-button-primary-color--hover, var(--cover-button-color--hover, var(--cover-button-primary-color, var(--cover-button-color))))}@media(min-width: 30.01em){.cover .cover-main>p:last-child a{display:inline-block}}.cover .mask{visibility:var(--cover-background-mask-visibility, hidden);position:absolute;top:0;bottom:0;left:0;right:0;background-color:var(--cover-background-mask-color);opacity:var(--cover-background-mask-opacity)}.cover.has-mask .mask{visibility:visible}.cover.show{display:flex}.app-nav{position:absolute;z-index:30;top:calc(35px - .5em*var(--base-line-height));left:45px;right:80px;text-align:right}.app-nav.no-badge{right:45px}.app-nav li>img,.app-nav li>a>img{margin-top:-0.25em;vertical-align:middle}.app-nav li>img:first-child,.app-nav li>a>img:first-child{margin-right:.5em}.app-nav ul,.app-nav li{margin:0;padding:0;list-style:none}.app-nav li{position:relative}.app-nav li a{display:block;line-height:1;transition:var(--navbar-root-transition)}.app-nav>ul>li{display:inline-block;margin:var(--navbar-root-margin)}.app-nav>ul>li:first-child{margin-left:0}.app-nav>ul>li:last-child{margin-right:0}.app-nav>ul>li>a,.app-nav>ul>li>span{padding:var(--navbar-root-padding);border-width:var(--navbar-root-border-width, 0);border-style:var(--navbar-root-border-style);border-color:var(--navbar-root-border-color);border-radius:var(--navbar-root-border-radius);background:var(--navbar-root-background);color:var(--navbar-root-color);-webkit-text-decoration:var(--navbar-root-text-decoration);text-decoration:var(--navbar-root-text-decoration);-webkit-text-decoration-color:var(--navbar-root-text-decoration-color);text-decoration-color:var(--navbar-root-text-decoration-color)}.app-nav>ul>li>a:hover,.app-nav>ul>li>span:hover{background:var(--navbar-root-background--hover, var(--navbar-root-background));border-style:var(--navbar-root-border-style--hover, var(--navbar-root-border-style));border-color:var(--navbar-root-border-color--hover, var(--navbar-root-border-color));color:var(--navbar-root-color--hover, var(--navbar-root-color));-webkit-text-decoration:var(--navbar-root-text-decoration--hover, var(--navbar-root-text-decoration));text-decoration:var(--navbar-root-text-decoration--hover, var(--navbar-root-text-decoration));-webkit-text-decoration-color:var(--navbar-root-text-decoration-color--hover, var(--navbar-root-text-decoration-color));text-decoration-color:var(--navbar-root-text-decoration-color--hover, var(--navbar-root-text-decoration-color))}.app-nav>ul>li>a:not(:last-child),.app-nav>ul>li>span:not(:last-child){padding:var(--navbar-menu-root-padding, var(--navbar-root-padding));background:var(--navbar-menu-root-background, var(--navbar-root-background))}.app-nav>ul>li>a:not(:last-child):hover,.app-nav>ul>li>span:not(:last-child):hover{background:var(--navbar-menu-root-background--hover, var(--navbar-menu-root-background, var(--navbar-root-background--hover, var(--navbar-root-background))))}.app-nav>ul>li>a.active{background:var(--navbar-root-background--active, 
var(--navbar-root-background));border-style:var(--navbar-root-border-style--active, var(--navbar-root-border-style));border-color:var(--navbar-root-border-color--active, var(--navbar-root-border-color));color:var(--navbar-root-color--active, var(--navbar-root-color));-webkit-text-decoration:var(--navbar-root-text-decoration--active, var(--navbar-root-text-decoration));text-decoration:var(--navbar-root-text-decoration--active, var(--navbar-root-text-decoration));-webkit-text-decoration-color:var(--navbar-root-text-decoration-color--active, var(--navbar-root-text-decoration-color));text-decoration-color:var(--navbar-root-text-decoration-color--active, var(--navbar-root-text-decoration-color))}.app-nav>ul>li>a.active:not(:last-child):hover{background:var(--navbar-menu-root-background--active, var(--navbar-menu-root-background, var(--navbar-root-background--active, var(--navbar-root-background))))}.app-nav>ul>li ul{visibility:hidden;position:absolute;top:100%;right:50%;overflow-y:auto;box-sizing:border-box;max-height:50vh;padding:var(--navbar-menu-padding);border-width:var(--navbar-menu-border-width, 0);border-style:solid;border-color:var(--navbar-menu-border-color);border-radius:var(--navbar-menu-border-radius);background:var(--navbar-menu-background);box-shadow:var(--navbar-menu-box-shadow);text-align:left;white-space:nowrap;opacity:0;transform:translate(50%, -0.35em);transition:var(--navbar-menu-transition)}.app-nav>ul>li ul li{white-space:nowrap}.app-nav>ul>li ul a{margin:var(--navbar-menu-link-margin);padding:var(--navbar-menu-link-padding);border-width:var(--navbar-menu-link-border-width, 0);border-style:var(--navbar-menu-link-border-style);border-color:var(--navbar-menu-link-border-color);border-radius:var(--navbar-menu-link-border-radius);background:var(--navbar-menu-link-background);color:var(--navbar-menu-link-color);-webkit-text-decoration:var(--navbar-menu-link-text-decoration);text-decoration:var(--navbar-menu-link-text-decoration);-webkit-text-decoration-color:var(--navbar-menu-link-text-decoration-color);text-decoration-color:var(--navbar-menu-link-text-decoration-color)}.app-nav>ul>li ul a:hover{background:var(--navbar-menu-link-background--hover, var(--navbar-menu-link-background));border-style:var(--navbar-menu-link-border-style--hover, var(--navbar-menu-link-border-style));border-color:var(--navbar-menu-link-border-color--hover, var(--navbar-menu-link-border-color));color:var(--navbar-menu-link-color--hover, var(--navbar-menu-link-color));-webkit-text-decoration:var(--navbar-menu-link-text-decoration--hover, var(--navbar-menu-link-text-decoration));text-decoration:var(--navbar-menu-link-text-decoration--hover, var(--navbar-menu-link-text-decoration));-webkit-text-decoration-color:var(--navbar-menu-link-text-decoration-color--hover, var(--navbar-menu-link-text-decoration-color));text-decoration-color:var(--navbar-menu-link-text-decoration-color--hover, var(--navbar-menu-link-text-decoration-color))}.app-nav>ul>li ul a.active{background:var(--navbar-menu-link-background--active, var(--navbar-menu-link-background));border-style:var(--navbar-menu-link-border-style--active, var(--navbar-menu-link-border-style));border-color:var(--navbar-menu-link-border-color--active, var(--navbar-menu-link-border-color));color:var(--navbar-menu-link-color--active, var(--navbar-menu-link-color));-webkit-text-decoration:var(--navbar-menu-link-text-decoration--active, var(--navbar-menu-link-text-decoration));text-decoration:var(--navbar-menu-link-text-decoration--active, 
var(--navbar-menu-link-text-decoration));-webkit-text-decoration-color:var(--navbar-menu-link-text-decoration-color--active, var(--navbar-menu-link-text-decoration-color));text-decoration-color:var(--navbar-menu-link-text-decoration-color--active, var(--navbar-menu-link-text-decoration-color))}.app-nav>ul>li:hover ul,.app-nav>ul>li:focus ul,.app-nav>ul>li.focus-within ul{visibility:visible;opacity:1;transform:translate(50%, 0)}@media(min-width: 48em){nav.app-nav{margin-left:var(--sidebar-width)}}main{position:relative;overflow-x:hidden;min-height:100vh}.sidebar,.sidebar-toggle,.sidebar+.content{transition:all var(--sidebar-transition-duration) ease-out}@media(min-width: 48em){.sidebar+.content{margin-left:var(--sidebar-width)}}.sidebar{display:flex;flex-direction:column;position:fixed;z-index:10;top:0;right:100%;overflow-x:hidden;overflow-y:auto;height:100vh;width:var(--sidebar-width);padding:var(--sidebar-padding);border-width:var(--sidebar-border-width);border-style:solid;border-color:var(--sidebar-border-color);background:var(--sidebar-background)}.sidebar>h1{margin:0;margin:var(--sidebar-name-margin);padding:var(--sidebar-name-padding);background:var(--sidebar-name-background);color:var(--sidebar-name-color);font-family:var(--sidebar-name-font-family);font-size:var(--sidebar-name-font-size);font-weight:var(--sidebar-name-font-weight);text-align:var(--sidebar-name-text-align)}.sidebar>h1 img{max-width:100%}.sidebar>h1 .app-name-link{color:var(--sidebar-name-color)}body:not([data-platform^=Mac]) .sidebar::-webkit-scrollbar{width:5px}body:not([data-platform^=Mac]) .sidebar::-webkit-scrollbar-thumb{border-radius:50vw}@media(min-width: 48em){.sidebar{position:absolute;transform:translateX(var(--sidebar-width))}}@media print{.sidebar{display:none}}.sidebar-nav,.sidebar nav{order:1;margin:var(--sidebar-nav-margin);padding:var(--sidebar-nav-padding);background:var(--sidebar-nav-background)}.sidebar-nav ul,.sidebar nav ul{margin:0;padding:0;list-style:none}.sidebar-nav ul ul,.sidebar nav ul ul{margin-left:var(--sidebar-nav-indent)}.sidebar-nav a,.sidebar nav a{display:block;overflow:hidden;margin:var(--sidebar-nav-link-margin);padding:var(--sidebar-nav-link-padding);border-width:var(--sidebar-nav-link-border-width, 0);border-style:var(--sidebar-nav-link-border-style);border-color:var(--sidebar-nav-link-border-color);border-radius:var(--sidebar-nav-link-border-radius);background:var(--sidebar-nav-link-background);color:var(--sidebar-nav-link-color);font-weight:var(--sidebar-nav-link-font-weight);white-space:nowrap;-webkit-text-decoration:var(--sidebar-nav-link-text-decoration);text-decoration:var(--sidebar-nav-link-text-decoration);-webkit-text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-overflow:ellipsis;transition:var(--sidebar-nav-link-transition)}.sidebar-nav a img,.sidebar nav a img{margin-top:-0.25em;vertical-align:middle}.sidebar-nav a img:first-child,.sidebar nav a img:first-child{margin-right:.5em}.sidebar-nav a:hover,.sidebar nav a:hover{border-width:var(--sidebar-nav-link-border-width--hover, var(--sidebar-nav-link-border-width, 0));border-style:var(--sidebar-nav-link-border-style--hover, var(--sidebar-nav-link-border-style));border-color:var(--sidebar-nav-link-border-color--hover, var(--sidebar-nav-link-border-color));background:var(--sidebar-nav-link-background--hover, var(--sidebar-nav-link-background));color:var(--sidebar-nav-link-color--hover, 
var(--sidebar-nav-link-color));font-weight:var(--sidebar-nav-link-font-weight--hover, var(--sidebar-nav-link-font-weight));-webkit-text-decoration:var(--sidebar-nav-link-text-decoration--hover, var(--sidebar-nav-link-text-decoration));text-decoration:var(--sidebar-nav-link-text-decoration--hover, var(--sidebar-nav-link-text-decoration));-webkit-text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-decoration-color:var(--sidebar-nav-link-text-decoration-color)}.sidebar-nav ul>li>span,.sidebar-nav ul>li>strong,.sidebar nav ul>li>span,.sidebar nav ul>li>strong{display:block;margin:var(--sidebar-nav-strong-margin);padding:var(--sidebar-nav-strong-padding);border-width:var(--sidebar-nav-strong-border-width, 0);border-style:solid;border-color:var(--sidebar-nav-strong-border-color);color:var(--sidebar-nav-strong-color);font-size:var(--sidebar-nav-strong-font-size);font-weight:var(--sidebar-nav-strong-font-weight);text-transform:var(--sidebar-nav-strong-text-transform)}.sidebar-nav ul>li>span+ul,.sidebar-nav ul>li>strong+ul,.sidebar nav ul>li>span+ul,.sidebar nav ul>li>strong+ul{margin-left:0}.sidebar-nav ul>li:first-child>span,.sidebar-nav ul>li:first-child>strong,.sidebar nav ul>li:first-child>span,.sidebar nav ul>li:first-child>strong{margin-top:0}.sidebar-nav::-webkit-scrollbar,.sidebar nav::-webkit-scrollbar{width:0}@supports(width: env(safe-area-inset)){@media only screen and (orientation: landscape){.sidebar-nav,.sidebar nav{margin-left:calc(env(safe-area-inset-left)/2)}}}.sidebar-nav li>a:before,.sidebar-nav li>strong:before{display:inline-block}.sidebar-nav li>a{background-repeat:var(--sidebar-nav-pagelink-background-repeat);background-size:var(--sidebar-nav-pagelink-background-size)}.sidebar-nav li>a[href^="/"]:not([href*="?id="]),.sidebar-nav li>a[href^="#/"]:not([href*="?id="]){transition:var(--sidebar-nav-pagelink-transition)}.sidebar-nav li>a[href^="/"]:not([href*="?id="]),.sidebar-nav li>a[href^="/"]:not([href*="?id="])~ul a,.sidebar-nav li>a[href^="#/"]:not([href*="?id="]),.sidebar-nav li>a[href^="#/"]:not([href*="?id="])~ul a{padding:var(--sidebar-nav-pagelink-padding, var(--sidebar-nav-link-padding))}.sidebar-nav li>a[href^="/"]:not([href*="?id="]):only-child,.sidebar-nav li>a[href^="#/"]:not([href*="?id="]):only-child{background:var(--sidebar-nav-pagelink-background)}.sidebar-nav li>a[href^="/"]:not([href*="?id="]):not(:only-child),.sidebar-nav li>a[href^="#/"]:not([href*="?id="]):not(:only-child){background:var(--sidebar-nav-pagelink-background--loaded, var(--sidebar-nav-pagelink-background))}.sidebar-nav li.active>a,.sidebar-nav li.collapse>a{border-width:var(--sidebar-nav-link-border-width--active, var(--sidebar-nav-link-border-width));border-style:var(--sidebar-nav-link-border-style--active, var(--sidebar-nav-link-border-style));border-color:var(--sidebar-nav-link-border-color--active, var(--sidebar-nav-link-border-color));background:var(--sidebar-nav-link-background--active, var(--sidebar-nav-link-background));color:var(--sidebar-nav-link-color--active, var(--sidebar-nav-link-color));font-weight:var(--sidebar-nav-link-font-weight--active, var(--sidebar-nav-link-font-weight));-webkit-text-decoration:var(--sidebar-nav-link-text-decoration--active, var(--sidebar-nav-link-text-decoration));text-decoration:var(--sidebar-nav-link-text-decoration--active, var(--sidebar-nav-link-text-decoration));-webkit-text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-decoration-color:var(--sidebar-nav-link-text-decoration-color)}.sidebar-nav 
li.active>a[href^="/"]:not([href*="?id="]):not(:only-child),.sidebar-nav li.active>a[href^="#/"]:not([href*="?id="]):not(:only-child){background:var(--sidebar-nav-pagelink-background--active, var(--sidebar-nav-pagelink-background--loaded, var(--sidebar-nav-pagelink-background)))}.sidebar-nav li.collapse>a[href^="/"]:not([href*="?id="]):not(:only-child),.sidebar-nav li.collapse>a[href^="#/"]:not([href*="?id="]):not(:only-child){background:var(--sidebar-nav-pagelink-background--collapse, var(--sidebar-nav-pagelink-background--loaded, var(--sidebar-nav-pagelink-background)))}.sidebar-nav li.collapse .app-sub-sidebar{display:none}.sidebar-nav>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l1, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l1, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l1, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l1--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l1, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l1--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l1, var(--sidebar-nav-link-before-color))))}.sidebar-nav>ul>li>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l2, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l2, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l2, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l2--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l2, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l2--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l2, var(--sidebar-nav-link-before-color))))}.sidebar-nav>ul>li>ul>li>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l3, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l3, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l3, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li>ul>li>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l3--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l3, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l3--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l3, var(--sidebar-nav-link-before-color))))}.sidebar-nav>ul>li>ul>li>ul>li>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l4, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l4, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l4, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li>ul>li>ul>li>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l4--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l4, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l4--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l4, 
var(--sidebar-nav-link-before-color))))}.sidebar-nav>:last-child{margin-bottom:2rem}.sidebar-toggle,.sidebar-toggle-button{width:var(--sidebar-toggle-width);outline:none}.sidebar-toggle{position:fixed;z-index:11;top:0;bottom:0;left:0;max-width:40px;margin:0;padding:0;border:0;background:rgba(0,0,0,0);-webkit-appearance:none;-moz-appearance:none;appearance:none;cursor:pointer}.sidebar-toggle .sidebar-toggle-button{position:absolute;top:var(--sidebar-toggle-offset-top);left:var(--sidebar-toggle-offset-left);height:var(--sidebar-toggle-height);border-radius:var(--sidebar-toggle-border-radius);border-width:var(--sidebar-toggle-border-width);border-style:var(--sidebar-toggle-border-style);border-color:var(--sidebar-toggle-border-color);background:var(--sidebar-toggle-background, transparent);color:var(--sidebar-toggle-icon-color)}.sidebar-toggle span{position:absolute;top:calc(50% - var(--sidebar-toggle-icon-stroke-width)/2);left:calc(50% - var(--sidebar-toggle-icon-width)/2);height:var(--sidebar-toggle-icon-stroke-width);width:var(--sidebar-toggle-icon-width);background-color:currentColor}.sidebar-toggle span:nth-child(1){margin-top:calc(0px - var(--sidebar-toggle-icon-height)/2)}.sidebar-toggle span:nth-child(3){margin-top:calc(var(--sidebar-toggle-icon-height)/2)}@media(min-width: 48em){.sidebar-toggle{position:absolute;overflow:visible;top:var(--sidebar-toggle-offset-top);bottom:auto;left:0;height:var(--sidebar-toggle-height);transform:translateX(var(--sidebar-width))}.sidebar-toggle .sidebar-toggle-button{top:0}}@media print{.sidebar-toggle{display:none}}@media(max-width: 47.99em){body.close .sidebar,body.close .sidebar-toggle,body.close .sidebar+.content{transform:translateX(var(--sidebar-width))}}@media(min-width: 48em){body.close .sidebar+.content{transform:translateX(0)}}@media(max-width: 47.99em){body.close nav.app-nav,body.close .github-corner{display:none}}@media(min-width: 48em){body.close .sidebar,body.close .sidebar-toggle{transform:translateX(0)}}@media(min-width: 48em){body.close nav.app-nav{margin-left:0}}@media(max-width: 47.99em){body.close .sidebar-toggle{width:100%;max-width:none}body.close .sidebar-toggle span{margin-top:0}body.close .sidebar-toggle span:nth-child(1){transform:rotate(45deg)}body.close .sidebar-toggle span:nth-child(2){display:none}body.close .sidebar-toggle span:nth-child(3){transform:rotate(-45deg)}}@media(min-width: 48em){body.close .sidebar+.content{margin-left:0}}@media(min-width: 48em){body.sticky .sidebar,body.sticky .sidebar-toggle{position:fixed}}body .docsify-copy-code-button,body .docsify-copy-code-button:after{border-radius:var(--border-radius-m, 0);border-top-left-radius:0;border-bottom-right-radius:0;background:var(--copycode-background);color:var(--copycode-color)}body .docsify-copy-code-button span{border-radius:var(--border-radius-s, 0)}body .docsify-pagination-container{border-top:var(--pagination-border-top);color:var(--pagination-color)}body .pagination-item-label{font-size:var(--pagination-label-font-size)}body .pagination-item-label svg{color:var(--pagination-label-color);height:var(--pagination-chevron-height);stroke:var(--pagination-chevron-stroke);stroke-linecap:var(--pagination-chevron-stroke-linecap);stroke-linejoin:var(--pagination-chevron-stroke-linecap);stroke-width:var(--pagination-chevron-stroke-width)}body .pagination-item-title{color:var(--pagination-title-color);font-size:var(--pagination-title-font-size)}body .app-name.hide{display:block}body .sidebar{padding:var(--sidebar-padding)}.sidebar 
.search{margin:0;padding:0;border:0}.sidebar .search input{padding:0;line-height:1;font-size:inherit}.sidebar .search .clear-button{width:auto}.sidebar .search .clear-button svg{transform:scale(1)}.sidebar .search .matching-post{border:none}.sidebar .search p{font-size:inherit}.sidebar .search{order:var(--search-flex-order);margin:var(--search-margin);padding:var(--search-padding);background:var(--search-background)}.sidebar .search a{color:inherit}.sidebar .search h2{margin:var(--search-result-heading-margin);font-size:var(--search-result-heading-font-size);font-weight:var(--search-result-heading-font-weight);color:var(--search-result-heading-color)}.sidebar .search .input-wrap{align-items:stretch;margin:var(--search-input-margin);background-color:var(--search-input-background-color);border-width:var(--search-input-border-width, 0);border-style:solid;border-color:var(--search-input-border-color);border-radius:var(--search-input-border-radius)}.sidebar .search input[type=search]{min-width:0;padding:var(--search-input-padding);border:none;background-color:rgba(0,0,0,0);background-image:var(--search-input-background-image);background-position:var(--search-input-background-position);background-repeat:var(--search-input-background-repeat);background-size:var(--search-input-background-size);font-size:var(--search-input-font-size);color:var(--search-input-color);transition:var(--search-input-transition)}.sidebar .search input[type=search]::-ms-clear{display:none}.sidebar .search input[type=search]::-moz-placeholder{color:var(--search-input-placeholder-color, #808080)}.sidebar .search input[type=search]::placeholder{color:var(--search-input-placeholder-color, #808080)}.sidebar .search input[type=search]::-webkit-input-placeholder{line-height:normal}.sidebar .search input[type=search]:focus{background-color:var(--search-input-background-color--focus, var(--search-input-background-color));background-image:var(--search-input-background-image--focus, var(--search-input-background-image));background-position:var(--search-input-background-position--focus, var(--search-input-background-position));background-size:var(--search-input-background-size--focus, var(--search-input-background-size))}@supports(width: env(safe-area-inset)){@media only screen and (orientation: landscape){.sidebar .search input[type=search]{margin-left:calc(env(safe-area-inset-left)/2)}}}.sidebar .search p{overflow:hidden;text-overflow:ellipsis;-webkit-box-orient:vertical;-webkit-line-clamp:2}.sidebar .search p:empty{text-align:center}.sidebar .search .clear-button{margin:0;padding:0 10px;border:none;line-height:1;background:rgba(0,0,0,0);cursor:pointer}.sidebar .search .clear-button svg circle{fill:var(--search-clear-icon-color1, #808080)}.sidebar .search .clear-button svg path{stroke:var(--search-clear-icon-color2, #fff)}.sidebar .search.show~*:not(h1){display:none}.sidebar .search .results-panel{display:none;color:var(--search-result-item-color);font-size:var(--search-result-item-font-size);font-weight:var(--search-result-item-font-weight)}.sidebar .search .results-panel.show{display:block}.sidebar .search .matching-post{margin:var(--search-result-item-margin);padding:var(--search-result-item-padding)}.sidebar .search .matching-post,.sidebar .search .matching-post:last-child{border-width:var(--search-result-item-border-width, 0) !important;border-style:var(--search-result-item-border-style);border-color:var(--search-result-item-border-color)}.sidebar .search .matching-post p{margin:0}.sidebar .search 
.search-keyword{margin:var(--search-result-keyword-margin);padding:var(--search-result-keyword-padding);border-radius:var(--search-result-keyword-border-radius);background-color:var(--search-result-keyword-background);color:var(--search-result-keyword-color, currentColor);font-style:normal;font-weight:var(--search-result-keyword-font-weight)}.medium-zoom-overlay,.medium-zoom-image--open,.medium-zoom-image--opened{z-index:2147483646 !important}.medium-zoom-overlay{background:var(--zoomimage-overlay-background) !important}:root{--mono-hue: 113;--mono-saturation: 0%;--mono-shade3: hsl(var(--mono-hue), var(--mono-saturation), 20%);--mono-shade2: hsl(var(--mono-hue), var(--mono-saturation), 30%);--mono-shade1: hsl(var(--mono-hue), var(--mono-saturation), 40%);--mono-base: hsl(var(--mono-hue), var(--mono-saturation), 50%);--mono-tint1: hsl(var(--mono-hue), var(--mono-saturation), 70%);--mono-tint2: hsl(var(--mono-hue), var(--mono-saturation), 89%);--mono-tint3: hsl(var(--mono-hue), var(--mono-saturation), 97%);--theme-hue: 204;--theme-saturation: 90%;--theme-lightness: 45%;--theme-color: hsl(var(--theme-hue), var(--theme-saturation), var(--theme-lightness));--modular-scale: 1.333;--modular-scale--2: calc(var(--modular-scale--1) / var(--modular-scale));--modular-scale--1: calc(var(--modular-scale-1) / var(--modular-scale));--modular-scale-1: 1rem;--modular-scale-2: calc(var(--modular-scale-1) * var(--modular-scale));--modular-scale-3: calc(var(--modular-scale-2) * var(--modular-scale));--modular-scale-4: calc(var(--modular-scale-3) * var(--modular-scale));--modular-scale-5: calc(var(--modular-scale-4) * var(--modular-scale));--font-size-xxxl: var(--modular-scale-5);--font-size-xxl: var(--modular-scale-4);--font-size-xl: var(--modular-scale-3);--font-size-l: var(--modular-scale-2);--font-size-m: var(--modular-scale-1);--font-size-s: var(--modular-scale--1);--font-size-xs: var(--modular-scale--2);--duration-slow: 1s;--duration-medium: 0.5s;--duration-fast: 0.25s;--spinner-size: 60px;--spinner-track-width: 4px;--spinner-track-color: rgba(0, 0, 0, 0.15);--spinner-transition-duration: var(--duration-medium)}:root{--base-background-color: #fff;--base-color: var(--mono-shade2);--base-font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";--base-font-size: 16px;--base-font-weight: normal;--base-line-height: 1.7;--emoji-size: calc(var(--base-line-height) * 1em);--hr-border: 1px solid var(--mono-tint2);--mark-background: #ffecb3;--pre-font-family: var(--code-font-family);--pre-font-size: var(--code-font-size);--pre-font-weight: normal;--selection-color: #b4d5fe;--small-font-size: var(--font-size-s);--strong-color: var(--heading-color);--strong-font-weight: 600;--subsup-font-size: var(--font-size-s)}:root{--content-max-width: 75em;--blockquote-background: var(--mono-tint3);--blockquote-border-style: solid;--blockquote-border-radius: var(--border-radius-m);--blockquote-em-font-weight: normal;--blockquote-font-weight: normal;--blockquote-padding: 1.5em;--code-font-family: Inconsolata, Consolas, Menlo, Monaco, "Andale Mono WT", "Andale Mono", "Lucida Console", "DejaVu Sans Mono", "Bitstream Vera Sans Mono", "Courier New", Courier, monospace;--code-font-size: calc(var(--font-size-m) * 0.95);--code-font-weight: normal;--code-tab-size: 4;--code-block-border-radius: var(--border-radius-m);--code-block-line-height: var(--base-line-height);--code-block-margin: 1em 0;--code-block-padding: 1.75em 1.5em 1.5em 
1.5em;--code-inline-background: var(--code-theme-background);--code-inline-border-radius: var(--border-radius-s);--code-inline-color: var(--code-theme-text);--code-inline-margin: 0 0.15em;--code-inline-padding: 0.125em 0.4em;--code-theme-background: var(--mono-tint3);--heading-color: var(--mono-shade3);--heading-margin: 2.5rem 0 0;--heading-h1-border-style: solid;--heading-h1-font-size: var(--font-size-xxl);--heading-h2-border-style: solid;--heading-h2-font-size: var(--font-size-xl);--heading-h3-border-style: solid;--heading-h3-font-size: var(--font-size-l);--heading-h4-border-style: solid;--heading-h4-font-size: var(--font-size-m);--heading-h5-border-style: solid;--heading-h5-font-size: var(--font-size-s);--heading-h6-border-style: solid;--heading-h6-font-size: var(--font-size-xs);--kbd-background: var(--mono-tint3);--kbd-border-radius: var(--border-radius-m);--kbd-margin: 0 0.3em;--kbd-min-width: 2.5em;--kbd-padding: 0.65em 0.5em;--link-text-decoration: underline;--notice-background: var(--mono-tint3);--notice-border-radius: var(--border-radius-m);--notice-border-style: solid;--notice-font-weight: normal;--notice-padding: 1em 1.5em;--notice-before-font-weight: normal;--table-cell-padding: 0.75em 0.5em;--table-head-border-color: var(--table-cell-border-color);--table-head-font-weight: var(--strong-font-weight);--table-row-odd-background: var(--mono-tint3)}:root{--cover-margin: 0 auto;--cover-max-width: 40em;--cover-text-align: center;--cover-background-color: var(--base-background-color);--cover-background-mask-color: var(--base-background-color);--cover-background-mask-opacity: 0.8;--cover-background-position: center center;--cover-background-repeat: no-repeat;--cover-background-size: cover;--cover-blockquote-font-size: var(--font-size-l);--cover-border-color: var(--theme-color);--cover-button-border: 1px solid var(--theme-color);--cover-button-border-radius: var(--border-radius-m);--cover-button-color: var(--theme-color);--cover-button-padding: 0.5em 2rem;--cover-button-text-decoration: none;--cover-button-transition: all var(--duration-fast) ease-in-out;--cover-button-primary-background: var(--theme-color);--cover-button-primary-border: 1px solid var(--theme-color);--cover-button-primary-color: #fff;--cover-heading-color: var(--theme-color);--cover-heading-font-size: var(--font-size-xxl);--cover-heading-font-weight: normal;--cover-link-text-decoration: underline}:root{--navbar-root-border-style: solid;--navbar-root-margin: 0 0 0 1.5em;--navbar-root-transition: all var(--duration-fast);--navbar-menu-background: var(--base-background-color);--navbar-menu-border-radius: var(--border-radius-m);--navbar-menu-box-shadow: rgba(45,45,45,0.05) 0px 0px 1px, rgba(49,49,49,0.05) 0px 1px 2px, rgba(42,42,42,0.05) 0px 2px 4px, rgba(32,32,32,0.05) 0px 4px 8px, rgba(49,49,49,0.05) 0px 8px 16px, rgba(35,35,35,0.05) 0px 16px 32px;--navbar-menu-padding: 0.5em;--navbar-menu-transition: all var(--duration-fast);--navbar-menu-link-border-style: solid;--navbar-menu-link-margin: 0.75em 0.5em;--navbar-menu-link-padding: 0.2em 0}:root{--copycode-background: #808080;--copycode-color: #fff}:root{--docsifytabs-border-color: var(--mono-tint2);--docsifytabs-border-radius-px: var(--border-radius-s);--docsifytabs-tab-background: var(--mono-tint3);--docsifytabs-tab-color: var(--mono-tint1)}:root{--pagination-border-top: 1px solid var(--mono-tint2);--pagination-chevron-height: 0.8em;--pagination-chevron-stroke: currentColor;--pagination-chevron-stroke-linecap: round;--pagination-chevron-stroke-width: 
1px;--pagination-label-font-size: var(--font-size-s);--pagination-title-font-size: var(--font-size-l)}:root{--search-margin: 1.5rem 0 0;--search-input-background-repeat: no-repeat;--search-input-border-color: var(--mono-tint1);--search-input-border-width: 1px;--search-input-padding: 0.5em;--search-flex-order: 1;--search-result-heading-color: var(--heading-color);--search-result-heading-font-size: var(--base-font-size);--search-result-heading-font-weight: normal;--search-result-heading-margin: 0 0 0.25em;--search-result-item-border-color: var(--mono-tint2);--search-result-item-border-style: solid;--search-result-item-border-width: 0 0 1px 0;--search-result-item-font-weight: normal;--search-result-item-padding: 1em 0;--search-result-keyword-background: var(--mark-background);--search-result-keyword-border-radius: var(--border-radius-s);--search-result-keyword-color: var(--mark-color);--search-result-keyword-font-weight: normal;--search-result-keyword-margin: 0 0.1em;--search-result-keyword-padding: 0.2em 0}:root{--zoomimage-overlay-background: rgba(0, 0, 0, 0.875)}:root{--sidebar-background: var(--base-background-color);--sidebar-border-width: 0;--sidebar-padding: 0 25px;--sidebar-transition-duration: var(--duration-fast);--sidebar-width: 17rem;--sidebar-name-font-size: var(--font-size-l);--sidebar-name-font-weight: normal;--sidebar-name-margin: 1.5rem 0 0;--sidebar-name-text-align: center;--sidebar-nav-strong-border-color: var(--sidebar-border-color);--sidebar-nav-strong-color: var(--heading-color);--sidebar-nav-strong-font-weight: var(--strong-font-weight);--sidebar-nav-strong-margin: 1.5em 0 0.5em;--sidebar-nav-strong-padding: 0.25em 0;--sidebar-nav-indent: 1em;--sidebar-nav-margin: 1.5rem 0 0;--sidebar-nav-link-border-style: solid;--sidebar-nav-link-border-width: 0;--sidebar-nav-link-color: var(--base-color);--sidebar-nav-link-font-weight: normal;--sidebar-nav-link-padding: 0.25em 0;--sidebar-nav-link-text-decoration--active: underline;--sidebar-nav-link-text-decoration--hover: underline;--sidebar-nav-link-before-margin: 0 0.35em 0 0;--sidebar-nav-pagelink-transition: var(--sidebar-nav-link-transition);--sidebar-toggle-border-radius: var(--border-radius-s);--sidebar-toggle-border-style: solid;--sidebar-toggle-border-width: 0;--sidebar-toggle-height: 36px;--sidebar-toggle-icon-color: var(--base-color);--sidebar-toggle-icon-height: 10px;--sidebar-toggle-icon-stroke-width: 1px;--sidebar-toggle-icon-width: 16px;--sidebar-toggle-offset-left: 0;--sidebar-toggle-offset-top: calc(35px - (var(--sidebar-toggle-height) / 2));--sidebar-toggle-width: 44px}:root{--code-theme-background: #f3f3f3;--code-theme-comment: #6e8090;--code-theme-function: #dd4a68;--code-theme-keyword: #07a;--code-theme-operator: #a67f59;--code-theme-punctuation: #999;--code-theme-selector: #690;--code-theme-tag: #905;--code-theme-text: #333;--code-theme-variable: #e90}:root{--border-radius-s: 2px;--border-radius-m: 4px;--border-radius-l: 8px;--strong-font-weight: 600;--blockquote-border-color: var(--theme-color);--blockquote-border-radius: 0 var(--border-radius-m) var(--border-radius-m) 0;--blockquote-border-width: 0 0 0 4px;--code-inline-background: var(--mono-tint2);--code-theme-background: var(--mono-tint3);--heading-font-weight: var(--strong-font-weight);--heading-h1-font-weight: 400;--heading-h2-font-weight: 400;--heading-h2-border-color: var(--mono-tint2);--heading-h2-border-width: 0 0 1px 0;--heading-h2-margin: 2.5rem 0 1.5rem;--heading-h2-padding: 0 0 0rem 0;--kbd-border: 1px solid 
var(--mono-tint2);--notice-border-radius: 0 var(--border-radius-m) var(--border-radius-m) 0;--notice-border-width: 0 0 0 4px;--notice-padding: 1em 1.5em 1em 3em;--notice-before-border-radius: 100%;--notice-before-font-weight: bold;--notice-before-height: 1.5em;--notice-before-left: 0.75em;--notice-before-line-height: 1.5;--notice-before-margin: 0 0.25em 0 0;--notice-before-position: absolute;--notice-before-width: var(--notice-before-height);--notice-important-background: hsl(340, 60%, 96%);--notice-important-border-color: hsl(340, 90%, 45%);--notice-important-before-background: var(--notice-important-border-color) url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3E%3Cpath d='M10 14C10 15.1 9.1 16 8 16 6.9 16 6 15.1 6 14 6 12.9 6.9 12 8 12 9.1 12 10 12.9 10 14Z'/%3E%3Cpath d='M10 1.6C10 1.2 9.8 0.9 9.6 0.7 9.2 0.3 8.6 0 8 0 7.4 0 6.8 0.2 6.5 0.6 6.2 0.9 6 1.2 6 1.6 6 1.7 6 1.8 6 1.9L6.8 9.6C6.9 9.9 7 10.1 7.2 10.2 7.4 10.4 7.7 10.5 8 10.5 8.3 10.5 8.6 10.4 8.8 10.3 9 10.1 9.1 9.9 9.2 9.6L10 1.9C10 1.8 10 1.7 10 1.6Z'/%3E%3C/svg%3E") center / 0.875em no-repeat;--notice-important-before-color: #fff;--notice-important-before-content: "";--notice-tip-background: hsl(204, 60%, 96%);--notice-tip-border-color: hsl(204, 90%, 45%);--notice-tip-before-background: var(--notice-tip-border-color) url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3E%3Cpath d='M9.1 0C10.2 0 10.7 0.7 10.7 1.6 10.7 2.6 9.8 3.6 8.6 3.6 7.6 3.6 7 3 7 2 7 1.1 7.7 0 9.1 0Z'/%3E%3Cpath d='M5.8 16C5 16 4.4 15.5 5 13.2L5.9 9.1C6.1 8.5 6.1 8.2 5.9 8.2 5.7 8.2 4.6 8.6 3.9 9.1L3.5 8.4C5.6 6.6 7.9 5.6 8.9 5.6 9.8 5.6 9.9 6.6 9.5 8.2L8.4 12.5C8.2 13.2 8.3 13.5 8.5 13.5 8.7 13.5 9.6 13.2 10.4 12.5L10.9 13.2C8.9 15.2 6.7 16 5.8 16Z'/%3E%3C/svg%3E") center / 0.875em no-repeat;--notice-tip-before-color: #fff;--notice-tip-before-content: "";--table-cell-border-color: var(--mono-tint2);--table-cell-border-width: 1px 0;--cover-background-color: hsl(var(--theme-hue), 25%, 60%);--cover-background-image: radial-gradient(ellipse at center 115%, rgba(255, 255, 255, 0.9), transparent);--cover-blockquote-color: var(--strong-color);--cover-heading-color: #fff;--cover-heading-font-size-max: 56;--cover-heading-font-size-min: 34;--cover-heading-font-weight: 200;--navbar-root-color--active: var(--theme-color);--navbar-menu-border-radius: var(--border-radius-m);--navbar-menu-root-background: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='9.6' height='6' viewBox='0 0 9.6 6'%3E%3Cpath d='M1.5 1.5l3.3 3 3.3-3' stroke-width='1.5' stroke='rgb%28179, 179, 179%29' fill='none' stroke-linecap='square' stroke-linejoin='miter' vector-effect='non-scaling-stroke'/%3E%3C/svg%3E") right no-repeat;--navbar-menu-root-padding: 0 18px 0 0;--search-input-background-color: #fff;--search-input-background-image: url("data:image/svg+xml,%3Csvg height='20px' width='20px' viewBox='0 0 24 24' fill='none' stroke='rgba(0, 0, 0, 0.3)' stroke-width='1.5' stroke-linecap='round' stroke-linejoin='round' preserveAspectRatio='xMidYMid meet' xmlns='http://www.w3.org/2000/svg'%3E%3Ccircle cx='10.5' cy='10.5' r='7.5' vector-effect='non-scaling-stroke'%3E%3C/circle%3E%3Cline x1='21' y1='21' x2='15.8' y2='15.8' vector-effect='non-scaling-stroke'%3E%3C/line%3E%3C/svg%3E");--search-input-background-position: 21px center;--search-input-border-color: var(--sidebar-border-color);--search-input-border-width: 1px 0;--search-input-margin: 0 -25px;--search-input-padding: 0.65em 
1em 0.65em 50px;--search-input-placeholder-color: rgba(0, 0, 0, 0.4);--search-clear-icon-color1: rgba(0, 0, 0, 0.3);--search-result-heading-font-weight: var(--strong-font-weight);--search-result-item-border-color: var(--sidebar-border-color);--search-result-keyword-border-radius: var(--border-radius-s);--sidebar-background: var(--mono-tint3);--sidebar-border-color: var(--mono-tint2);--sidebar-border-width: 0 1px 0 0;--sidebar-name-color: var(--theme-color);--sidebar-name-font-weight: 300;--sidebar-nav-strong-border-width: 0 0 1px 0;--sidebar-nav-strong-font-size: smaller;--sidebar-nav-strong-margin: 2em -25px 0.75em 0;--sidebar-nav-strong-padding: 0.25em 0 0.75em 0;--sidebar-nav-strong-text-transform: uppercase;--sidebar-nav-link-border-color: transparent;--sidebar-nav-link-border-color--active: var(--theme-color);--sidebar-nav-link-border-width: 0 4px 0 0;--sidebar-nav-link-color--active: var(--theme-color);--sidebar-nav-link-margin: 0 -25px 0 0;--sidebar-nav-link-text-decoration: none;--sidebar-nav-link-text-decoration--active: none;--sidebar-nav-link-text-decoration--hover: underline;--sidebar-nav-pagelink-background: no-repeat 2px calc(50% - 2.5px) / 6px 5px linear-gradient(45deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4px), no-repeat 2px calc(50% + 2.5px) / 6px 5px linear-gradient(135deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4px);--sidebar-nav-pagelink-background--active: no-repeat 0px center / 5px 6px linear-gradient(225deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4.25px), no-repeat 5px center / 5px 6px linear-gradient(135deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4.25px);--sidebar-nav-pagelink-background--collapse: no-repeat 2px calc(50% - 2.5px) / 6px 5px linear-gradient(45deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4px), no-repeat 2px calc(50% + 2.5px) / 6px 5px linear-gradient(135deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4px);--sidebar-nav-pagelink-background--loaded: no-repeat 0px center / 5px 6px linear-gradient(225deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4.25px), no-repeat 5px center / 5px 6px linear-gradient(135deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4.25px);--sidebar-nav-pagelink-padding: 0.25em 0 0.25em 20px;--sidebar-nav-pagelink-transition: none;--sidebar-toggle-background: var(--sidebar-border-color);--sidebar-toggle-border-radius: 0 var(--border-radius-s) var(--border-radius-s) 0;--sidebar-toggle-width: 32px}:root{--code-theme-background: #222;--code-theme-comment: #516e7a;--code-theme-function: #f07178;--code-theme-keyword: #c2e78c;--code-theme-operator: #ffcb6b;--code-theme-punctuation: #89ddff;--code-theme-selector: #ffcb6b;--code-theme-tag: #f07178;--code-theme-text: #f3f3f3;--code-theme-variable: #ffcb6b}:root{--mono-hue: 201;--mono-saturation: 18%;--mono-shade3: hsl(var(--mono-hue), var(--mono-saturation), 13%);--mono-shade2: hsl(var(--mono-hue), var(--mono-saturation), 15%);--mono-shade1: hsl(var(--mono-hue), var(--mono-saturation), 17%);--mono-base: hsl(var(--mono-hue), var(--mono-saturation), 19%);--mono-tint1: hsl(var(--mono-hue), var(--mono-saturation), 25%);--mono-tint2: hsl(var(--mono-hue), var(--mono-saturation), 35%);--mono-tint3: hsl(var(--mono-hue), var(--mono-saturation), 43%);--spinner-track-color: rgba(255, 255, 255, 0.15);--base-background-color: var(--mono-base);--base-color: #d3d3d3;--hr-border: 1px solid 
var(--mono-tint2);--mark-background: #ffcb6b;--mark-color: var(--base-background-color);--selection-color: rgba(94, 131, 175, 0.75);--blockquote-background: var(--mono-shade2);--code-inline-background: var(--mono-tint1);--code-theme-background: var(--mono-shade2);--heading-color: #fff;--heading-h2-border-color: var(--mono-tint2);--kbd-background: var(--mono-shade2);--kbd-border: none;--kbd-color: var(--strong-color);--notice-important-background: var(--mono-shade2);--notice-tip-background: var(--mono-shade2);--table-cell-border-color: var(--mono-tint1);--table-row-odd-background: var(--mono-shade2);--cover-background-color: var(--base-background-color);--cover-background-image: radial-gradient(ellipse at center bottom, var(--mono-tint3), transparent);--cover-blockquote-color: var(--mark-background);--cover-button-border: 1px solid var(--mono-tint3);--cover-button-color: #fff;--navbar-menu-background: var(--mono-tint1);--navbar-menu-box-shadow: rgba(0,0,0,0.05) 0px 0px 1px, rgba(0,0,0,0.05) 0px 1px 2px, rgba(0,0,0,0.05) 0px 2px 4px, rgba(0,0,0,0.05) 0px 4px 8px, rgba(0,0,0,0.05) 0px 8px 16px, rgba(0,0,0,0.05) 0px 16px 32px;--copycode-background: var(--mono-tint1);--copycode-color: #fff;--docsifytabs-border-color: var(--mono-tint2);--docsifytabs-tab-background: var(--mono-shade1);--docsifytabs-tab-color: var(--mono-tint2);--pagination-border-top: 1px solid var(--mono-tint2);--pagination-title-color: #fff;--search-input-background-color: var(--mono-shade2);--search-input-background-image: url("data:image/svg+xml,%3Csvg height='20px' width='20px' viewBox='0 0 24 24' fill='none' stroke='rgba(255, 255, 255, 0.3)' stroke-width='1.5' stroke-linecap='round' stroke-linejoin='round' preserveAspectRatio='xMidYMid meet' xmlns='http://www.w3.org/2000/svg'%3E%3Ccircle cx='10.5' cy='10.5' r='7.5' vector-effect='non-scaling-stroke'%3E%3C/circle%3E%3Cline x1='21' y1='21' x2='15.8' y2='15.8' vector-effect='non-scaling-stroke'%3E%3C/line%3E%3C/svg%3E");--search-input-border-color: var(--mono-tint1);--search-input-placeholder-color: rgba(255, 255, 255, 0.4);--search-clear-icon-color1: rgba(255, 255, 255, 0.3);--sidebar-background: var(--mono-shade1);--sidebar-border-color: var(--mono-tint1);--sidebar-nav-pagelink-background: no-repeat 2px calc(50% - 2.5px) / 6px 5px linear-gradient(45deg, transparent 2.75px, var(--mono-tint2) 2.75px 4.25px, transparent 4px), no-repeat 2px calc(50% + 2.5px) / 6px 5px linear-gradient(135deg, transparent 2.75px, var(--mono-tint2) 2.75px 4.25px, transparent 4px);--sidebar-nav-pagelink-background--loaded: no-repeat 0px center / 5px 6px linear-gradient(225deg, transparent 2.75px, var(--mono-tint2) 2.75px 4.25px, transparent 4.25px), no-repeat 5px center / 5px 6px linear-gradient(135deg, transparent 2.75px, var(--mono-tint2) 2.75px 4.25px, transparent 4.25px)}
+
+/*# sourceMappingURL=theme-simple-dark.css.map */
+
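+/* Custom style overrides for the cuTAGI documentation, appended after the base docsify theme */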
+body {
+ line-height: 1.5;
+}
+
+h2 {
+ margin-top: 0.8em !important;
+ margin-bottom: 0.5em !important;
+}
+
+h2 em {
+ color: hsl(341, 42%, 64%);
+}
+
+
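+/* Spacing tweaks for code blocks, notices, and blockquotes */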
+pre {
+ margin-bottom: 0em !important;
+}
+
+p.warn {
+ margin-top: 1.5em !important;
+}
+
+blockquote {
+ margin-top: 0.8em !important;
+ margin-bottom: 0.8em !important;
+ padding: 0.8em !important;
+}
+
+pre code {
+ padding: 0.8em !important;
+ margin: 0 !important;
+}
+
+li code {
+ padding: 0.05em !important;
+ margin: 0 !important;
+}
+
+pre.language-,
+pre.language-sh {
+ margin-top: 0.2em !important;
+ margin-bottom: 0.5em !important;
+}
+
+pre.language- code,
+pre.language-sh code {
+ padding: 0.7em !important;
+ margin: 0 !important;
+}
+
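+/* Top navigation bar: horizontal link list with an animated underline on hover */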
+ul.navbar {
+ list-style-type: none;
+ margin: 0;
+ padding: 0;
+ overflow: hidden;
+ /*background-color: var(--theme-color);*/
+ background-color: var(--base-background-color);
+}
+
+li.navbar {
+ float: left;
+}
+
+li.navbar a.navbar {
+ display: inline-block;
+ position: relative;
+ color: rgb(255, 255, 255);
+ text-align: center;
+ padding: 10px 16px;
+ text-decoration: none;
+}
+
+li.navbar a.navbar::after {
+ content: '';
+ position: absolute;
+ width: 100%;
+ transform: scaleX(0);
+ height: 2px;
+ bottom: 0;
+ left: 0;
+ background-color: #0087ca;
+ transform-origin: bottom right;
+ transition: transform 0.25s ease-out;
+}
+
+li.navbar a.navbar:hover::after {
+ transform: scaleX(1);
+ transform-origin: bottom left;
+}
+
+
+
+li.navbar a img {
+ max-width: 100%;
+ outline: 0;
+ border: 0;
+}
+
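+/* GitHub link: icon and bold colored text aligned in a row */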
+.github-link {
+ display: flex;
+ align-items: center;
+}
+
+.github-icon-container {
+ margin-right: 10px; /* add some spacing between the icon and text */
+}
+
+.github-text-container {
+ color: #0087ca;
+ text-decoration: none;
+ font-weight: bold;
+}
+
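+/* Colors for custom syntax-highlighting token classes */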
+.token.after-two-points,
+.token.type-annotation-tuple {
+ color: rgb(108, 154, 180);
+}
+
+
+/* Dark/light theme toggle button */
+#docsify-darklight-theme {
+ border: none;
+ background-color: transparent;
+ position: relative;
+ float: right;
+ margin-top: 8px;
+ margin-left: 8px;
+ width: 25px;
+ height: 25px;
+ background-repeat: no-repeat;
+ background-image: var(--toogleImage);
+ -o-background-size: cover;
+ -moz-background-size: cover;
+ -webkit-background-size: cover;
+ background-size: cover;
+ cursor: pointer;
+ transition: background-image .15s ease-in-out .15s;
+}
diff --git a/assets/styles/theme-simple.css b/assets/styles/theme-simple.css
new file mode 100644
index 0000000..eac552f
--- /dev/null
+++ b/assets/styles/theme-simple.css
@@ -0,0 +1,179 @@
+.github-corner{position:absolute;z-index:40;top:0;right:0;border-bottom:0;text-decoration:none}.github-corner svg{height:70px;width:70px;fill:var(--theme-color);color:var(--base-background-color)}.github-corner:hover .octo-arm{-webkit-animation:octocat-wave 560ms ease-in-out;animation:octocat-wave 560ms ease-in-out}@-webkit-keyframes octocat-wave{0%,100%{transform:rotate(0)}20%,60%{transform:rotate(-25deg)}40%,80%{transform:rotate(10deg)}}@keyframes octocat-wave{0%,100%{transform:rotate(0)}20%,60%{transform:rotate(-25deg)}40%,80%{transform:rotate(10deg)}}.progress{position:fixed;z-index:2147483647;top:0;left:0;right:0;height:3px;width:0;background-color:var(--theme-color);transition:width var(--duration-fast),opacity calc(var(--duration-fast)*2)}body.ready-transition:after,body.ready-transition>*:not(.progress){opacity:0;transition:opacity var(--spinner-transition-duration)}body.ready-transition:after{content:"";position:absolute;z-index:1000;top:calc(50% - var(--spinner-size)/2);left:calc(50% - var(--spinner-size)/2);height:var(--spinner-size);width:var(--spinner-size);border:var(--spinner-track-width, 0) solid var(--spinner-track-color);border-left-color:var(--theme-color);border-left-color:var(--theme-color);border-radius:50%;-webkit-animation:spinner var(--duration-slow) infinite linear;animation:spinner var(--duration-slow) infinite linear}body.ready-transition.ready-spinner:after{opacity:1}body.ready-transition.ready-fix:after{opacity:0}body.ready-transition.ready-fix>*:not(.progress){opacity:1;transition-delay:var(--spinner-transition-duration)}@-webkit-keyframes spinner{0%{transform:rotate(0deg)}100%{transform:rotate(360deg)}}@keyframes spinner{0%{transform:rotate(0deg)}100%{transform:rotate(360deg)}}*,*:before,*:after{box-sizing:inherit;font-size:inherit;-webkit-overflow-scrolling:touch;-webkit-tap-highlight-color:rgba(0,0,0,0);-webkit-text-size-adjust:none;-webkit-touch-callout:none}:root{box-sizing:border-box;background-color:var(--base-background-color);font-size:var(--base-font-size);font-weight:var(--base-font-weight);line-height:var(--base-line-height);letter-spacing:var(--base-letter-spacing);color:var(--base-color);-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale;font-smoothing:antialiased}html,button,input,optgroup,select,textarea{font-family:var(--base-font-family)}button,input,optgroup,select,textarea{font-size:100%;margin:0}a{text-decoration:none;-webkit-text-decoration-skip:ink;text-decoration-skip-ink:auto}body{margin:0}hr{height:0;margin:2em 0;border:none;border-bottom:var(--hr-border, 0)}img{max-width:100%;border:0}main{display:block}main.hidden{display:none}mark{background:var(--mark-background);color:var(--mark-color)}pre{font-family:var(--pre-font-family);font-size:var(--pre-font-size);font-weight:var(--pre-font-weight);line-height:var(--pre-line-height)}small{display:inline-block;font-size:var(--small-font-size)}strong{font-weight:var(--strong-font-weight);color:var(--strong-color, currentColor)}sub,sup{font-size:var(--subsup-font-size);line-height:0;position:relative;vertical-align:baseline}sub{bottom:-0.25em}sup{top:-0.5em}body:not([data-platform^=Mac]) *{scrollbar-color:hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.3) hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.1);scrollbar-width:thin}body:not([data-platform^=Mac]) * ::-webkit-scrollbar{width:5px;height:5px}body:not([data-platform^=Mac]) * ::-webkit-scrollbar-thumb{background:hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.3)}body:not([data-platform^=Mac]) * 
::-webkit-scrollbar-track{background:hsla(var(--mono-hue), var(--mono-saturation), 50%, 0.1)}::-moz-selection{background:var(--selection-color)}::selection{background:var(--selection-color)}.emoji{height:var(--emoji-size);vertical-align:middle}.task-list-item{list-style:none}.task-list-item input{margin-right:.5em;margin-left:0;vertical-align:.075em}.markdown-section code[class*=lang-],.markdown-section pre[data-lang]{font-family:var(--code-font-family);font-size:var(--code-font-size);font-weight:var(--code-font-weight);letter-spacing:normal;line-height:var(--code-block-line-height);-moz-tab-size:var(--code-tab-size);-o-tab-size:var(--code-tab-size);tab-size:var(--code-tab-size);text-align:left;white-space:pre;word-spacing:normal;word-wrap:normal;word-break:normal;-webkit-hyphens:none;hyphens:none}.markdown-section pre[data-lang]{position:relative;overflow:hidden;margin:var(--code-block-margin);padding:0;border-radius:var(--code-block-border-radius)}.markdown-section pre[data-lang]::after{content:attr(data-lang);position:absolute;top:.75em;right:.75em;opacity:.6;color:inherit;font-size:var(--font-size-s);line-height:1}.markdown-section pre[data-lang] code{display:block;overflow:auto;padding:var(--code-block-padding)}code[class*=lang-],pre[data-lang]{color:var(--code-theme-text)}pre[data-lang]::-moz-selection,pre[data-lang] ::-moz-selection,code[class*=lang-]::-moz-selection,code[class*=lang-] ::-moz-selection{background:var(--code-theme-selection, var(--selection-color))}pre[data-lang]::-moz-selection, pre[data-lang] ::-moz-selection, code[class*=lang-]::-moz-selection, code[class*=lang-] ::-moz-selection{background:var(--code-theme-selection, var(--selection-color))}pre[data-lang]::selection,pre[data-lang] ::selection,code[class*=lang-]::selection,code[class*=lang-] ::selection{background:var(--code-theme-selection, var(--selection-color))}:not(pre)>code[class*=lang-],pre[data-lang]{background:var(--code-theme-background)}.namespace{opacity:.7}.token.comment,.token.prolog,.token.doctype,.token.cdata{color:var(--code-theme-comment)}.token.punctuation{color:var(--code-theme-punctuation)}.token.property,.token.tag,.token.boolean,.token.number,.token.constant,.token.symbol,.token.deleted{color:var(--code-theme-tag)}.token.selector,.token.attr-name,.token.string,.token.char,.token.builtin,.token.inserted{color:var(--code-theme-selector)}.token.operator,.token.entity,.token.url,.language-css .token.string,.style .token.string{color:var(--code-theme-operator)}.token.atrule,.token.attr-value,.token.keyword{color:var(--code-theme-keyword)}.token.function{color:var(--code-theme-function)}.token.regex,.token.important,.token.variable{color:var(--code-theme-variable)}.token.important,.token.bold{font-weight:bold}.token.italic{font-style:italic}.token.entity{cursor:help}.markdown-section{position:relative;max-width:var(--content-max-width);margin:0 auto;padding:2rem 45px}.app-nav:not(:empty)~main .markdown-section{padding-top:3.5rem}.markdown-section figure,.markdown-section p,.markdown-section ol,.markdown-section ul{margin:1em 0}.markdown-section ol,.markdown-section ul{padding-left:1.5rem}.markdown-section ol ol,.markdown-section ol ul,.markdown-section ul ol,.markdown-section ul ul{margin-top:.15rem;margin-bottom:.15rem}.markdown-section 
a{border-bottom:var(--link-border-bottom);color:var(--link-color);-webkit-text-decoration:var(--link-text-decoration);text-decoration:var(--link-text-decoration);-webkit-text-decoration-color:var(--link-text-decoration-color);text-decoration-color:var(--link-text-decoration-color)}.markdown-section a:hover{border-bottom:var(--link-border-bottom--hover, var(--link-border-bottom, 0));color:var(--link-color--hover, var(--link-color));-webkit-text-decoration:var(--link-text-decoration--hover, var(--link-text-decoration));text-decoration:var(--link-text-decoration--hover, var(--link-text-decoration));-webkit-text-decoration-color:var(--link-text-decoration-color--hover, var(--link-text-decoration-color));text-decoration-color:var(--link-text-decoration-color--hover, var(--link-text-decoration-color))}.markdown-section a.anchor{border-bottom:0;color:inherit;text-decoration:none}.markdown-section a.anchor:hover{text-decoration:underline}.markdown-section blockquote{overflow:visible;margin:2em 0;padding:var(--blockquote-padding);border-width:var(--blockquote-border-width, 0);border-style:var(--blockquote-border-style);border-color:var(--blockquote-border-color);border-radius:var(--blockquote-border-radius);background:var(--blockquote-background);color:var(--blockquote-color);font-family:var(--blockquote-font-family);font-size:var(--blockquote-font-size);font-style:var(--blockquote-font-style);font-weight:var(--blockquote-font-weight);quotes:"“" "”" "‘" "’"}.markdown-section blockquote em{font-family:var(--blockquote-em-font-family);font-size:var(--blockquote-em-font-size);font-style:var(--blockquote-em-font-style);font-weight:var(--blockquote-em-font-weight)}.markdown-section blockquote p:first-child{margin-top:0}.markdown-section blockquote p:first-child:before,.markdown-section blockquote p:first-child:after{color:var(--blockquote-quotes-color);font-family:var(--blockquote-quotes-font-family);font-size:var(--blockquote-quotes-font-size);line-height:0}.markdown-section blockquote p:first-child:before{content:var(--blockquote-quotes-open);margin-right:.15em;vertical-align:-0.45em}.markdown-section blockquote p:first-child:after{content:var(--blockquote-quotes-close);margin-left:.15em;vertical-align:-0.55em}.markdown-section blockquote p:last-child{margin-bottom:0}.markdown-section code{font-family:var(--code-font-family);font-size:var(--code-font-size);font-weight:var(--code-font-weight);line-height:inherit}.markdown-section code:not([class*=lang-]):not([class*=language-]){margin:var(--code-inline-margin);padding:var(--code-inline-padding);border-radius:var(--code-inline-border-radius);background:var(--code-inline-background);color:var(--code-inline-color, currentColor);white-space:nowrap}.markdown-section h1:first-child,.markdown-section h2:first-child,.markdown-section h3:first-child,.markdown-section h4:first-child,.markdown-section h5:first-child,.markdown-section h6:first-child{margin-top:0}.markdown-section h1 a[data-id],.markdown-section h2 a[data-id],.markdown-section h3 a[data-id],.markdown-section h4 a[data-id],.markdown-section h5 a[data-id],.markdown-section h6 a[data-id]{display:inline-block}.markdown-section h1 code,.markdown-section h2 code,.markdown-section h3 code,.markdown-section h4 code,.markdown-section h5 code,.markdown-section h6 code{font-size:.875em}.markdown-section h1+h2,.markdown-section h1+h3,.markdown-section h1+h4,.markdown-section h1+h5,.markdown-section h1+h6,.markdown-section h2+h3,.markdown-section h2+h4,.markdown-section h2+h5,.markdown-section 
h2+h6,.markdown-section h3+h4,.markdown-section h3+h5,.markdown-section h3+h6,.markdown-section h4+h5,.markdown-section h4+h6,.markdown-section h5+h6{margin-top:1rem}.markdown-section h1{margin:var(--heading-h1-margin, var(--heading-margin));padding:var(--heading-h1-padding, var(--heading-padding));border-width:var(--heading-h1-border-width, 0);border-style:var(--heading-h1-border-style);border-color:var(--heading-h1-border-color);font-family:var(--heading-h1-font-family, var(--heading-font-family));font-size:var(--heading-h1-font-size);font-weight:var(--heading-h1-font-weight, var(--heading-font-weight));line-height:var(--base-line-height);color:var(--heading-h1-color, var(--heading-color))}.markdown-section h2{margin:var(--heading-h2-margin, var(--heading-margin));padding:var(--heading-h2-padding, var(--heading-padding));border-width:var(--heading-h2-border-width, 0);border-style:var(--heading-h2-border-style);border-color:var(--heading-h2-border-color);font-family:var(--heading-h2-font-family, var(--heading-font-family));font-size:var(--heading-h2-font-size);font-weight:var(--heading-h2-font-weight, var(--heading-font-weight));line-height:var(--base-line-height);color:var(--heading-h2-color, var(--heading-color))}.markdown-section h3{margin:var(--heading-h3-margin, var(--heading-margin));padding:var(--heading-h3-padding, var(--heading-padding));border-width:var(--heading-h3-border-width, 0);border-style:var(--heading-h3-border-style);border-color:var(--heading-h3-border-color);font-family:var(--heading-h3-font-family, var(--heading-font-family));font-size:var(--heading-h3-font-size);font-weight:var(--heading-h3-font-weight, var(--heading-font-weight));color:var(--heading-h3-color, var(--heading-color))}.markdown-section h4{margin:var(--heading-h4-margin, var(--heading-margin));padding:var(--heading-h4-padding, var(--heading-padding));border-width:var(--heading-h4-border-width, 0);border-style:var(--heading-h4-border-style);border-color:var(--heading-h4-border-color);font-family:var(--heading-h4-font-family, var(--heading-font-family));font-size:var(--heading-h4-font-size);font-weight:var(--heading-h4-font-weight, var(--heading-font-weight));color:var(--heading-h4-color, var(--heading-color))}.markdown-section h5{margin:var(--heading-h5-margin, var(--heading-margin));padding:var(--heading-h5-padding, var(--heading-padding));border-width:var(--heading-h5-border-width, 0);border-style:var(--heading-h5-border-style);border-color:var(--heading-h5-border-color);font-family:var(--heading-h5-font-family, var(--heading-font-family));font-size:var(--heading-h5-font-size);font-weight:var(--heading-h5-font-weight, var(--heading-font-weight));color:var(--heading-h5-color, var(--heading-color))}.markdown-section h6{margin:var(--heading-h6-margin, var(--heading-margin));padding:var(--heading-h6-padding, var(--heading-padding));border-width:var(--heading-h6-border-width, 0);border-style:var(--heading-h6-border-style);border-color:var(--heading-h6-border-color);font-family:var(--heading-h6-font-family, var(--heading-font-family));font-size:var(--heading-h6-font-size);font-weight:var(--heading-h6-font-weight, var(--heading-font-weight));color:var(--heading-h6-color, var(--heading-color))}.markdown-section iframe{margin:1em 0}.markdown-section img{max-width:50%;margin-left:auto;margin-right:auto;display: block;}.markdown-section 
kbd{display:inline-block;min-width:var(--kbd-min-width);margin:var(--kbd-margin);padding:var(--kbd-padding);border:var(--kbd-border);border-radius:var(--kbd-border-radius);background:var(--kbd-background);font-family:inherit;font-size:var(--kbd-font-size);text-align:center;letter-spacing:0;line-height:1;color:var(--kbd-color)}.markdown-section kbd+kbd{margin-left:-0.15em}.markdown-section table{display:block;overflow:auto;margin:1rem 0;border-spacing:0;border-collapse:collapse}.markdown-section th,.markdown-section td{padding:var(--table-cell-padding)}.markdown-section th:not([align]){text-align:left}.markdown-section thead{border-color:var(--table-head-border-color);border-style:solid;border-width:var(--table-head-border-width, 0);background:var(--table-head-background)}.markdown-section th{font-weight:var(--table-head-font-weight);color:var(--strong-color)}.markdown-section td{border-color:var(--table-cell-border-color);border-style:solid;border-width:var(--table-cell-border-width, 0)}.markdown-section tbody{border-color:var(--table-body-border-color);border-style:solid;border-width:var(--table-body-border-width, 0)}.markdown-section tbody tr:nth-child(odd){background:var(--table-row-odd-background)}.markdown-section tbody tr:nth-child(even){background:var(--table-row-even-background)}.markdown-section>ul .task-list-item{margin-left:-1.25em}.markdown-section>ul .task-list-item .task-list-item{margin-left:0}.markdown-section .table-wrapper{overflow-x:auto}.markdown-section .table-wrapper table{display:table;width:100%}.markdown-section .table-wrapper td::before{display:none}@media(max-width: 30em){.markdown-section .table-wrapper tbody,.markdown-section .table-wrapper tr,.markdown-section .table-wrapper td{display:block}.markdown-section .table-wrapper th,.markdown-section .table-wrapper td{border:none}.markdown-section .table-wrapper thead{display:none}.markdown-section .table-wrapper tr{border-color:var(--table-cell-border-color);border-style:solid;border-width:var(--table-cell-border-width, 0);padding:var(--table-cell-padding)}.markdown-section .table-wrapper tr:not(:last-child){border-bottom:0}.markdown-section .table-wrapper td{padding:.15em 0 .15em 8em}.markdown-section .table-wrapper td::before{display:inline-block;float:left;width:8em;margin-left:-8em;font-weight:bold;text-align:left}}.markdown-section .tip,.markdown-section .warn{position:relative;margin:2em 0;padding:var(--notice-padding);border-width:var(--notice-border-width, 0);border-style:var(--notice-border-style);border-color:var(--notice-border-color);border-radius:var(--notice-border-radius);background:var(--notice-background);font-family:var(--notice-font-family);font-weight:var(--notice-font-weight);color:var(--notice-color)}.markdown-section .tip:before,.markdown-section .warn:before{display:inline-block;position:var(--notice-before-position, relative);top:var(--notice-before-top);left:var(--notice-before-left);height:var(--notice-before-height);width:var(--notice-before-width);margin:var(--notice-before-margin);padding:var(--notice-before-padding);border-radius:var(--notice-before-border-radius);line-height:var(--notice-before-line-height);font-family:var(--notice-before-font-family);font-size:var(--notice-before-font-size);font-weight:var(--notice-before-font-weight);text-align:center}.markdown-section .tip{border-width:var(--notice-important-border-width, var(--notice-border-width, 0));border-style:var(--notice-important-border-style, var(--notice-border-style));border-color:var(--notice-important-border-color, 
var(--notice-border-color));background:var(--notice-important-background, var(--notice-background));color:var(--notice-important-color, var(--notice-color))}.markdown-section .tip:before{content:var(--notice-important-before-content, var(--notice-before-content));background:var(--notice-important-before-background, var(--notice-before-background));color:var(--notice-important-before-color, var(--notice-before-color))}.markdown-section .warn{border-width:var(--notice-tip-border-width, var(--notice-border-width, 0));border-style:var(--notice-tip-border-style, var(--notice-border-style));border-color:var(--notice-tip-border-color, var(--notice-border-color));background:var(--notice-tip-background, var(--notice-background));color:var(--notice-tip-color, var(--notice-color))}.markdown-section .warn:before{content:var(--notice-tip-before-content, var(--notice-before-content));background:var(--notice-tip-before-background, var(--notice-before-background));color:var(--notice-tip-before-color, var(--notice-before-color))}.cover{display:none;position:relative;z-index:20;min-height:100vh;flex-direction:column;align-items:center;justify-content:center;padding:calc(var(--cover-border-inset, 0px) + var(--cover-border-width, 0px));color:var(--cover-color);text-align:var(--cover-text-align)}@media screen and (-ms-high-contrast: active),screen and (-ms-high-contrast: none){.cover{height:100vh}}.cover:before,.cover:after{content:"";position:absolute}.cover:before{top:0;bottom:0;left:0;right:0;background-blend-mode:var(--cover-background-blend-mode);background-color:var(--cover-background-color);background-image:var(--cover-background-image);background-position:var(--cover-background-position);background-repeat:var(--cover-background-repeat);background-size:var(--cover-background-size)}.cover:after{top:var(--cover-border-inset, 0);bottom:var(--cover-border-inset, 0);left:var(--cover-border-inset, 0);right:var(--cover-border-inset, 0);border-width:var(--cover-border-width, 0);border-style:solid;border-color:var(--cover-border-color)}.cover a{border-bottom:var(--cover-link-border-bottom);color:var(--cover-link-color);-webkit-text-decoration:var(--cover-link-text-decoration);text-decoration:var(--cover-link-text-decoration);-webkit-text-decoration-color:var(--cover-link-text-decoration-color);text-decoration-color:var(--cover-link-text-decoration-color)}.cover a:hover{border-bottom:var(--cover-link-border-bottom--hover, var(--cover-link-border-bottom));color:var(--cover-link-color--hover, var(--cover-link-color));-webkit-text-decoration:var(--cover-link-text-decoration--hover, var(--cover-link-text-decoration));text-decoration:var(--cover-link-text-decoration--hover, var(--cover-link-text-decoration));-webkit-text-decoration-color:var(--cover-link-text-decoration-color--hover, var(--cover-link-text-decoration-color));text-decoration-color:var(--cover-link-text-decoration-color--hover, var(--cover-link-text-decoration-color))}.cover h1{color:var(--cover-heading-color);position:relative;margin:0;font-size:var(--cover-heading-font-size);font-weight:var(--cover-heading-font-weight);line-height:1.2}.cover h1 a,.cover h1 a:hover{display:block;border-bottom:none;color:inherit;text-decoration:none}.cover h1 small{position:absolute;bottom:0;margin-left:.5em}.cover h1 span{font-size:calc(var(--cover-heading-font-size-min)*1px)}@media(min-width: 26em){.cover h1 span{font-size:calc(var(--cover-heading-font-size-min)*1px + (var(--cover-heading-font-size-max) - var(--cover-heading-font-size-min))*(100vw - 
420px)/604)}}@media(min-width: 64em){.cover h1 span{font-size:calc(var(--cover-heading-font-size-max)*1px)}}.cover blockquote{margin:0;color:var(--cover-blockquote-color);font-size:var(--cover-blockquote-font-size)}.cover blockquote a{color:inherit}.cover ul{padding:0;list-style-type:none}.cover .cover-main{position:relative;z-index:1;max-width:var(--cover-max-width);margin:var(--cover-margin);padding:0 45px}.cover .cover-main>p:last-child{margin:1.25em -0.25em}.cover .cover-main>p:last-child a{display:block;margin:.375em .25em;padding:var(--cover-button-padding);border:var(--cover-button-border);border-radius:var(--cover-button-border-radius);box-shadow:var(--cover-button-box-shadow);background:var(--cover-button-background);text-align:center;-webkit-text-decoration:var(--cover-button-text-decoration);text-decoration:var(--cover-button-text-decoration);-webkit-text-decoration-color:var(--cover-button-text-decoration-color);text-decoration-color:var(--cover-button-text-decoration-color);color:var(--cover-button-color);white-space:nowrap;transition:var(--cover-button-transition)}.cover .cover-main>p:last-child a:hover{border:var(--cover-button-border--hover, var(--cover-button-border));box-shadow:var(--cover-button-box-shadow--hover, var(--cover-button-box-shadow));background:var(--cover-button-background--hover, var(--cover-button-background));-webkit-text-decoration:var(--cover-button-text-decoration--hover, var(--cover-button-text-decoration));text-decoration:var(--cover-button-text-decoration--hover, var(--cover-button-text-decoration));-webkit-text-decoration-color:var(--cover-button-text-decoration-color--hover, var(--cover-button-text-decoration-color));text-decoration-color:var(--cover-button-text-decoration-color--hover, var(--cover-button-text-decoration-color));color:var(--cover-button-color--hover, var(--cover-button-color))}.cover .cover-main>p:last-child a:first-child{border:var(--cover-button-primary-border, var(--cover-button-border));box-shadow:var(--cover-button-primary-box-shadow, var(--cover-button-box-shadow));background:var(--cover-button-primary-background, var(--cover-button-background));-webkit-text-decoration:var(--cover-button-primary-text-decoration, var(--cover-button-text-decoration));text-decoration:var(--cover-button-primary-text-decoration, var(--cover-button-text-decoration));-webkit-text-decoration-color:var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color));text-decoration-color:var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color));color:var(--cover-button-primary-color, var(--cover-button-color))}.cover .cover-main>p:last-child a:first-child:hover{border:var(--cover-button-primary-border--hover, var(--cover-button-border--hover, var(--cover-button-primary-border, var(--cover-button-border))));box-shadow:var(--cover-button-primary-box-shadow--hover, var(--cover-button-box-shadow--hover, var(--cover-button-primary-box-shadow, var(--cover-button-box-shadow))));background:var(--cover-button-primary-background--hover, var(--cover-button-background--hover, var(--cover-button-primary-background, var(--cover-button-background))));-webkit-text-decoration:var(--cover-button-primary-text-decoration--hover, var(--cover-button-text-decoration--hover, var(--cover-button-primary-text-decoration, var(--cover-button-text-decoration))));text-decoration:var(--cover-button-primary-text-decoration--hover, var(--cover-button-text-decoration--hover, var(--cover-button-primary-text-decoration, 
var(--cover-button-text-decoration))));-webkit-text-decoration-color:var(--cover-button-primary-text-decoration-color--hover, var(--cover-button-text-decoration-color--hover, var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color))));text-decoration-color:var(--cover-button-primary-text-decoration-color--hover, var(--cover-button-text-decoration-color--hover, var(--cover-button-primary-text-decoration-color, var(--cover-button-text-decoration-color))));color:var(--cover-button-primary-color--hover, var(--cover-button-color--hover, var(--cover-button-primary-color, var(--cover-button-color))))}@media(min-width: 30.01em){.cover .cover-main>p:last-child a{display:inline-block}}.cover .mask{visibility:var(--cover-background-mask-visibility, hidden);position:absolute;top:0;bottom:0;left:0;right:0;background-color:var(--cover-background-mask-color);opacity:var(--cover-background-mask-opacity)}.cover.has-mask .mask{visibility:visible}.cover.show{display:flex}.app-nav{position:absolute;z-index:30;top:calc(35px - .5em*var(--base-line-height));left:45px;right:80px;text-align:right}.app-nav.no-badge{right:45px}.app-nav li>img,.app-nav li>a>img{margin-top:-0.25em;vertical-align:middle}.app-nav li>img:first-child,.app-nav li>a>img:first-child{margin-right:.5em}.app-nav ul,.app-nav li{margin:0;padding:0;list-style:none}.app-nav li{position:relative}.app-nav li a{display:block;line-height:1;transition:var(--navbar-root-transition)}.app-nav>ul>li{display:inline-block;margin:var(--navbar-root-margin)}.app-nav>ul>li:first-child{margin-left:0}.app-nav>ul>li:last-child{margin-right:0}.app-nav>ul>li>a,.app-nav>ul>li>span{padding:var(--navbar-root-padding);border-width:var(--navbar-root-border-width, 0);border-style:var(--navbar-root-border-style);border-color:var(--navbar-root-border-color);border-radius:var(--navbar-root-border-radius);background:var(--navbar-root-background);color:var(--navbar-root-color);-webkit-text-decoration:var(--navbar-root-text-decoration);text-decoration:var(--navbar-root-text-decoration);-webkit-text-decoration-color:var(--navbar-root-text-decoration-color);text-decoration-color:var(--navbar-root-text-decoration-color)}.app-nav>ul>li>a:hover,.app-nav>ul>li>span:hover{background:var(--navbar-root-background--hover, var(--navbar-root-background));border-style:var(--navbar-root-border-style--hover, var(--navbar-root-border-style));border-color:var(--navbar-root-border-color--hover, var(--navbar-root-border-color));color:var(--navbar-root-color--hover, var(--navbar-root-color));-webkit-text-decoration:var(--navbar-root-text-decoration--hover, var(--navbar-root-text-decoration));text-decoration:var(--navbar-root-text-decoration--hover, var(--navbar-root-text-decoration));-webkit-text-decoration-color:var(--navbar-root-text-decoration-color--hover, var(--navbar-root-text-decoration-color));text-decoration-color:var(--navbar-root-text-decoration-color--hover, var(--navbar-root-text-decoration-color))}.app-nav>ul>li>a:not(:last-child),.app-nav>ul>li>span:not(:last-child){padding:var(--navbar-menu-root-padding, var(--navbar-root-padding));background:var(--navbar-menu-root-background, var(--navbar-root-background))}.app-nav>ul>li>a:not(:last-child):hover,.app-nav>ul>li>span:not(:last-child):hover{background:var(--navbar-menu-root-background--hover, var(--navbar-menu-root-background, var(--navbar-root-background--hover, var(--navbar-root-background))))}.app-nav>ul>li>a.active{background:var(--navbar-root-background--active, 
var(--navbar-root-background));border-style:var(--navbar-root-border-style--active, var(--navbar-root-border-style));border-color:var(--navbar-root-border-color--active, var(--navbar-root-border-color));color:var(--navbar-root-color--active, var(--navbar-root-color));-webkit-text-decoration:var(--navbar-root-text-decoration--active, var(--navbar-root-text-decoration));text-decoration:var(--navbar-root-text-decoration--active, var(--navbar-root-text-decoration));-webkit-text-decoration-color:var(--navbar-root-text-decoration-color--active, var(--navbar-root-text-decoration-color));text-decoration-color:var(--navbar-root-text-decoration-color--active, var(--navbar-root-text-decoration-color))}.app-nav>ul>li>a.active:not(:last-child):hover{background:var(--navbar-menu-root-background--active, var(--navbar-menu-root-background, var(--navbar-root-background--active, var(--navbar-root-background))))}.app-nav>ul>li ul{visibility:hidden;position:absolute;top:100%;right:50%;overflow-y:auto;box-sizing:border-box;max-height:50vh;padding:var(--navbar-menu-padding);border-width:var(--navbar-menu-border-width, 0);border-style:solid;border-color:var(--navbar-menu-border-color);border-radius:var(--navbar-menu-border-radius);background:var(--navbar-menu-background);box-shadow:var(--navbar-menu-box-shadow);text-align:left;white-space:nowrap;opacity:0;transform:translate(50%, -0.35em);transition:var(--navbar-menu-transition)}.app-nav>ul>li ul li{white-space:nowrap}.app-nav>ul>li ul a{margin:var(--navbar-menu-link-margin);padding:var(--navbar-menu-link-padding);border-width:var(--navbar-menu-link-border-width, 0);border-style:var(--navbar-menu-link-border-style);border-color:var(--navbar-menu-link-border-color);border-radius:var(--navbar-menu-link-border-radius);background:var(--navbar-menu-link-background);color:var(--navbar-menu-link-color);-webkit-text-decoration:var(--navbar-menu-link-text-decoration);text-decoration:var(--navbar-menu-link-text-decoration);-webkit-text-decoration-color:var(--navbar-menu-link-text-decoration-color);text-decoration-color:var(--navbar-menu-link-text-decoration-color)}.app-nav>ul>li ul a:hover{background:var(--navbar-menu-link-background--hover, var(--navbar-menu-link-background));border-style:var(--navbar-menu-link-border-style--hover, var(--navbar-menu-link-border-style));border-color:var(--navbar-menu-link-border-color--hover, var(--navbar-menu-link-border-color));color:var(--navbar-menu-link-color--hover, var(--navbar-menu-link-color));-webkit-text-decoration:var(--navbar-menu-link-text-decoration--hover, var(--navbar-menu-link-text-decoration));text-decoration:var(--navbar-menu-link-text-decoration--hover, var(--navbar-menu-link-text-decoration));-webkit-text-decoration-color:var(--navbar-menu-link-text-decoration-color--hover, var(--navbar-menu-link-text-decoration-color));text-decoration-color:var(--navbar-menu-link-text-decoration-color--hover, var(--navbar-menu-link-text-decoration-color))}.app-nav>ul>li ul a.active{background:var(--navbar-menu-link-background--active, var(--navbar-menu-link-background));border-style:var(--navbar-menu-link-border-style--active, var(--navbar-menu-link-border-style));border-color:var(--navbar-menu-link-border-color--active, var(--navbar-menu-link-border-color));color:var(--navbar-menu-link-color--active, var(--navbar-menu-link-color));-webkit-text-decoration:var(--navbar-menu-link-text-decoration--active, var(--navbar-menu-link-text-decoration));text-decoration:var(--navbar-menu-link-text-decoration--active, 
var(--navbar-menu-link-text-decoration));-webkit-text-decoration-color:var(--navbar-menu-link-text-decoration-color--active, var(--navbar-menu-link-text-decoration-color));text-decoration-color:var(--navbar-menu-link-text-decoration-color--active, var(--navbar-menu-link-text-decoration-color))}.app-nav>ul>li:hover ul,.app-nav>ul>li:focus ul,.app-nav>ul>li.focus-within ul{visibility:visible;opacity:1;transform:translate(50%, 0)}@media(min-width: 48em){nav.app-nav{margin-left:var(--sidebar-width)}}main{position:relative;overflow-x:hidden;min-height:100vh}.sidebar,.sidebar-toggle,.sidebar+.content{transition:all var(--sidebar-transition-duration) ease-out}@media(min-width: 48em){.sidebar+.content{margin-left:var(--sidebar-width)}}.sidebar{display:flex;flex-direction:column;position:fixed;z-index:10;top:0;right:100%;overflow-x:hidden;overflow-y:auto;height:100vh;width:var(--sidebar-width);padding:var(--sidebar-padding);border-width:var(--sidebar-border-width);border-style:solid;border-color:var(--sidebar-border-color);background:var(--sidebar-background)}.sidebar>h1{margin:0;margin:var(--sidebar-name-margin);padding:var(--sidebar-name-padding);background:var(--sidebar-name-background);color:var(--sidebar-name-color);font-family:var(--sidebar-name-font-family);font-size:var(--sidebar-name-font-size);font-weight:var(--sidebar-name-font-weight);text-align:var(--sidebar-name-text-align)}.sidebar>h1 img{max-width:100%}.sidebar>h1 .app-name-link{color:var(--sidebar-name-color)}body:not([data-platform^=Mac]) .sidebar::-webkit-scrollbar{width:5px}body:not([data-platform^=Mac]) .sidebar::-webkit-scrollbar-thumb{border-radius:50vw}@media(min-width: 48em){.sidebar{position:absolute;transform:translateX(var(--sidebar-width))}}@media print{.sidebar{display:none}}.sidebar-nav,.sidebar nav{order:1;margin:var(--sidebar-nav-margin);padding:var(--sidebar-nav-padding);background:var(--sidebar-nav-background)}.sidebar-nav ul,.sidebar nav ul{margin:0;padding:0;list-style:none}.sidebar-nav ul ul,.sidebar nav ul ul{margin-left:var(--sidebar-nav-indent)}.sidebar-nav a,.sidebar nav a{display:block;overflow:hidden;margin:var(--sidebar-nav-link-margin);padding:var(--sidebar-nav-link-padding);border-width:var(--sidebar-nav-link-border-width, 0);border-style:var(--sidebar-nav-link-border-style);border-color:var(--sidebar-nav-link-border-color);border-radius:var(--sidebar-nav-link-border-radius);background:var(--sidebar-nav-link-background);color:var(--sidebar-nav-link-color);font-weight:var(--sidebar-nav-link-font-weight);white-space:nowrap;-webkit-text-decoration:var(--sidebar-nav-link-text-decoration);text-decoration:var(--sidebar-nav-link-text-decoration);-webkit-text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-overflow:ellipsis;transition:var(--sidebar-nav-link-transition)}.sidebar-nav a img,.sidebar nav a img{margin-top:-0.25em;vertical-align:middle}.sidebar-nav a img:first-child,.sidebar nav a img:first-child{margin-right:.5em}.sidebar-nav a:hover,.sidebar nav a:hover{border-width:var(--sidebar-nav-link-border-width--hover, var(--sidebar-nav-link-border-width, 0));border-style:var(--sidebar-nav-link-border-style--hover, var(--sidebar-nav-link-border-style));border-color:var(--sidebar-nav-link-border-color--hover, var(--sidebar-nav-link-border-color));background:var(--sidebar-nav-link-background--hover, var(--sidebar-nav-link-background));color:var(--sidebar-nav-link-color--hover, 
var(--sidebar-nav-link-color));font-weight:var(--sidebar-nav-link-font-weight--hover, var(--sidebar-nav-link-font-weight));-webkit-text-decoration:var(--sidebar-nav-link-text-decoration--hover, var(--sidebar-nav-link-text-decoration));text-decoration:var(--sidebar-nav-link-text-decoration--hover, var(--sidebar-nav-link-text-decoration));-webkit-text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-decoration-color:var(--sidebar-nav-link-text-decoration-color)}.sidebar-nav ul>li>span,.sidebar-nav ul>li>strong,.sidebar nav ul>li>span,.sidebar nav ul>li>strong{display:block;margin:var(--sidebar-nav-strong-margin);padding:var(--sidebar-nav-strong-padding);border-width:var(--sidebar-nav-strong-border-width, 0);border-style:solid;border-color:var(--sidebar-nav-strong-border-color);color:var(--sidebar-nav-strong-color);font-size:var(--sidebar-nav-strong-font-size);font-weight:var(--sidebar-nav-strong-font-weight);text-transform:var(--sidebar-nav-strong-text-transform)}.sidebar-nav ul>li>span+ul,.sidebar-nav ul>li>strong+ul,.sidebar nav ul>li>span+ul,.sidebar nav ul>li>strong+ul{margin-left:0}.sidebar-nav ul>li:first-child>span,.sidebar-nav ul>li:first-child>strong,.sidebar nav ul>li:first-child>span,.sidebar nav ul>li:first-child>strong{margin-top:0}.sidebar-nav::-webkit-scrollbar,.sidebar nav::-webkit-scrollbar{width:0}@supports(width: env(safe-area-inset)){@media only screen and (orientation: landscape){.sidebar-nav,.sidebar nav{margin-left:calc(env(safe-area-inset-left)/2)}}}.sidebar-nav li>a:before,.sidebar-nav li>strong:before{display:inline-block}.sidebar-nav li>a{background-repeat:var(--sidebar-nav-pagelink-background-repeat);background-size:var(--sidebar-nav-pagelink-background-size)}.sidebar-nav li>a[href^="/"]:not([href*="?id="]),.sidebar-nav li>a[href^="#/"]:not([href*="?id="]){transition:var(--sidebar-nav-pagelink-transition)}.sidebar-nav li>a[href^="/"]:not([href*="?id="]),.sidebar-nav li>a[href^="/"]:not([href*="?id="])~ul a,.sidebar-nav li>a[href^="#/"]:not([href*="?id="]),.sidebar-nav li>a[href^="#/"]:not([href*="?id="])~ul a{padding:var(--sidebar-nav-pagelink-padding, var(--sidebar-nav-link-padding))}.sidebar-nav li>a[href^="/"]:not([href*="?id="]):only-child,.sidebar-nav li>a[href^="#/"]:not([href*="?id="]):only-child{background:var(--sidebar-nav-pagelink-background)}.sidebar-nav li>a[href^="/"]:not([href*="?id="]):not(:only-child),.sidebar-nav li>a[href^="#/"]:not([href*="?id="]):not(:only-child){background:var(--sidebar-nav-pagelink-background--loaded, var(--sidebar-nav-pagelink-background))}.sidebar-nav li.active>a,.sidebar-nav li.collapse>a{border-width:var(--sidebar-nav-link-border-width--active, var(--sidebar-nav-link-border-width));border-style:var(--sidebar-nav-link-border-style--active, var(--sidebar-nav-link-border-style));border-color:var(--sidebar-nav-link-border-color--active, var(--sidebar-nav-link-border-color));background:var(--sidebar-nav-link-background--active, var(--sidebar-nav-link-background));color:var(--sidebar-nav-link-color--active, var(--sidebar-nav-link-color));font-weight:var(--sidebar-nav-link-font-weight--active, var(--sidebar-nav-link-font-weight));-webkit-text-decoration:var(--sidebar-nav-link-text-decoration--active, var(--sidebar-nav-link-text-decoration));text-decoration:var(--sidebar-nav-link-text-decoration--active, var(--sidebar-nav-link-text-decoration));-webkit-text-decoration-color:var(--sidebar-nav-link-text-decoration-color);text-decoration-color:var(--sidebar-nav-link-text-decoration-color)}.sidebar-nav 
li.active>a[href^="/"]:not([href*="?id="]):not(:only-child),.sidebar-nav li.active>a[href^="#/"]:not([href*="?id="]):not(:only-child){background:var(--sidebar-nav-pagelink-background--active, var(--sidebar-nav-pagelink-background--loaded, var(--sidebar-nav-pagelink-background)))}.sidebar-nav li.collapse>a[href^="/"]:not([href*="?id="]):not(:only-child),.sidebar-nav li.collapse>a[href^="#/"]:not([href*="?id="]):not(:only-child){background:var(--sidebar-nav-pagelink-background--collapse, var(--sidebar-nav-pagelink-background--loaded, var(--sidebar-nav-pagelink-background)))}.sidebar-nav li.collapse .app-sub-sidebar{display:none}.sidebar-nav>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l1, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l1, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l1, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l1--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l1, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l1--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l1, var(--sidebar-nav-link-before-color))))}.sidebar-nav>ul>li>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l2, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l2, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l2, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l2--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l2, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l2--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l2, var(--sidebar-nav-link-before-color))))}.sidebar-nav>ul>li>ul>li>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l3, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l3, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l3, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li>ul>li>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l3--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l3, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l3--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l3, var(--sidebar-nav-link-before-color))))}.sidebar-nav>ul>li>ul>li>ul>li>ul>li>a:before{content:var(--sidebar-nav-link-before-content-l4, var(--sidebar-nav-link-before-content));margin:var(--sidebar-nav-link-before-margin-l4, var(--sidebar-nav-link-before-margin));color:var(--sidebar-nav-link-before-color-l4, var(--sidebar-nav-link-before-color))}.sidebar-nav>ul>li>ul>li>ul>li>ul>li.active>a:before{content:var(--sidebar-nav-link-before-content-l4--active, var(--sidebar-nav-link-before-content--active, var(--sidebar-nav-link-before-content-l4, var(--sidebar-nav-link-before-content))));color:var(--sidebar-nav-link-before-color-l4--active, var(--sidebar-nav-link-before-color--active, var(--sidebar-nav-link-before-color-l4, 
var(--sidebar-nav-link-before-color))))}.sidebar-nav>:last-child{margin-bottom:2rem}.sidebar-toggle,.sidebar-toggle-button{width:var(--sidebar-toggle-width);outline:none}.sidebar-toggle{position:fixed;z-index:11;top:0;bottom:0;left:0;max-width:40px;margin:0;padding:0;border:0;background:rgba(0,0,0,0);-webkit-appearance:none;-moz-appearance:none;appearance:none;cursor:pointer}.sidebar-toggle .sidebar-toggle-button{position:absolute;top:var(--sidebar-toggle-offset-top);left:var(--sidebar-toggle-offset-left);height:var(--sidebar-toggle-height);border-radius:var(--sidebar-toggle-border-radius);border-width:var(--sidebar-toggle-border-width);border-style:var(--sidebar-toggle-border-style);border-color:var(--sidebar-toggle-border-color);background:var(--sidebar-toggle-background, transparent);color:var(--sidebar-toggle-icon-color)}.sidebar-toggle span{position:absolute;top:calc(50% - var(--sidebar-toggle-icon-stroke-width)/2);left:calc(50% - var(--sidebar-toggle-icon-width)/2);height:var(--sidebar-toggle-icon-stroke-width);width:var(--sidebar-toggle-icon-width);background-color:currentColor}.sidebar-toggle span:nth-child(1){margin-top:calc(0px - var(--sidebar-toggle-icon-height)/2)}.sidebar-toggle span:nth-child(3){margin-top:calc(var(--sidebar-toggle-icon-height)/2)}@media(min-width: 48em){.sidebar-toggle{position:absolute;overflow:visible;top:var(--sidebar-toggle-offset-top);bottom:auto;left:0;height:var(--sidebar-toggle-height);transform:translateX(var(--sidebar-width))}.sidebar-toggle .sidebar-toggle-button{top:0}}@media print{.sidebar-toggle{display:none}}@media(max-width: 47.99em){body.close .sidebar,body.close .sidebar-toggle,body.close .sidebar+.content{transform:translateX(var(--sidebar-width))}}@media(min-width: 48em){body.close .sidebar+.content{transform:translateX(0)}}@media(max-width: 47.99em){body.close nav.app-nav,body.close .github-corner{display:none}}@media(min-width: 48em){body.close .sidebar,body.close .sidebar-toggle{transform:translateX(0)}}@media(min-width: 48em){body.close nav.app-nav{margin-left:0}}@media(max-width: 47.99em){body.close .sidebar-toggle{width:100%;max-width:none}body.close .sidebar-toggle span{margin-top:0}body.close .sidebar-toggle span:nth-child(1){transform:rotate(45deg)}body.close .sidebar-toggle span:nth-child(2){display:none}body.close .sidebar-toggle span:nth-child(3){transform:rotate(-45deg)}}@media(min-width: 48em){body.close .sidebar+.content{margin-left:0}}@media(min-width: 48em){body.sticky .sidebar,body.sticky .sidebar-toggle{position:fixed}}body .docsify-copy-code-button,body .docsify-copy-code-button:after{border-radius:var(--border-radius-m, 0);border-top-left-radius:0;border-bottom-right-radius:0;background:var(--copycode-background);color:var(--copycode-color)}body .docsify-copy-code-button span{border-radius:var(--border-radius-s, 0)}body .docsify-pagination-container{border-top:var(--pagination-border-top);color:var(--pagination-color)}body .pagination-item-label{font-size:var(--pagination-label-font-size)}body .pagination-item-label svg{color:var(--pagination-label-color);height:var(--pagination-chevron-height);stroke:var(--pagination-chevron-stroke);stroke-linecap:var(--pagination-chevron-stroke-linecap);stroke-linejoin:var(--pagination-chevron-stroke-linecap);stroke-width:var(--pagination-chevron-stroke-width)}body .pagination-item-title{color:var(--pagination-title-color);font-size:var(--pagination-title-font-size)}body .app-name.hide{display:block}body .sidebar{padding:var(--sidebar-padding)}.sidebar 
.search{margin:0;padding:0;border:0}.sidebar .search input{padding:0;line-height:1;font-size:inherit}.sidebar .search .clear-button{width:auto}.sidebar .search .clear-button svg{transform:scale(1)}.sidebar .search .matching-post{border:none}.sidebar .search p{font-size:inherit}.sidebar .search{order:var(--search-flex-order);margin:var(--search-margin);padding:var(--search-padding);background:var(--search-background)}.sidebar .search a{color:inherit}.sidebar .search h2{margin:var(--search-result-heading-margin);font-size:var(--search-result-heading-font-size);font-weight:var(--search-result-heading-font-weight);color:var(--search-result-heading-color)}.sidebar .search .input-wrap{align-items:stretch;margin:var(--search-input-margin);background-color:var(--search-input-background-color);border-width:var(--search-input-border-width, 0);border-style:solid;border-color:var(--search-input-border-color);border-radius:var(--search-input-border-radius)}.sidebar .search input[type=search]{min-width:0;padding:var(--search-input-padding);border:none;background-color:rgba(0,0,0,0);background-image:var(--search-input-background-image);background-position:var(--search-input-background-position);background-repeat:var(--search-input-background-repeat);background-size:var(--search-input-background-size);font-size:var(--search-input-font-size);color:var(--search-input-color);transition:var(--search-input-transition)}.sidebar .search input[type=search]::-ms-clear{display:none}.sidebar .search input[type=search]::-moz-placeholder{color:var(--search-input-placeholder-color, #808080)}.sidebar .search input[type=search]::placeholder{color:var(--search-input-placeholder-color, #808080)}.sidebar .search input[type=search]::-webkit-input-placeholder{line-height:normal}.sidebar .search input[type=search]:focus{background-color:var(--search-input-background-color--focus, var(--search-input-background-color));background-image:var(--search-input-background-image--focus, var(--search-input-background-image));background-position:var(--search-input-background-position--focus, var(--search-input-background-position));background-size:var(--search-input-background-size--focus, var(--search-input-background-size))}@supports(width: env(safe-area-inset)){@media only screen and (orientation: landscape){.sidebar .search input[type=search]{margin-left:calc(env(safe-area-inset-left)/2)}}}.sidebar .search p{overflow:hidden;text-overflow:ellipsis;-webkit-box-orient:vertical;-webkit-line-clamp:2}.sidebar .search p:empty{text-align:center}.sidebar .search .clear-button{margin:0;padding:0 10px;border:none;line-height:1;background:rgba(0,0,0,0);cursor:pointer}.sidebar .search .clear-button svg circle{fill:var(--search-clear-icon-color1, #808080)}.sidebar .search .clear-button svg path{stroke:var(--search-clear-icon-color2, #fff)}.sidebar .search.show~*:not(h1){display:none}.sidebar .search .results-panel{display:none;color:var(--search-result-item-color);font-size:var(--search-result-item-font-size);font-weight:var(--search-result-item-font-weight)}.sidebar .search .results-panel.show{display:block}.sidebar .search .matching-post{margin:var(--search-result-item-margin);padding:var(--search-result-item-padding)}.sidebar .search .matching-post,.sidebar .search .matching-post:last-child{border-width:var(--search-result-item-border-width, 0) !important;border-style:var(--search-result-item-border-style);border-color:var(--search-result-item-border-color)}.sidebar .search .matching-post p{margin:0}.sidebar .search 
.search-keyword{margin:var(--search-result-keyword-margin);padding:var(--search-result-keyword-padding);border-radius:var(--search-result-keyword-border-radius);background-color:var(--search-result-keyword-background);color:var(--search-result-keyword-color, currentColor);font-style:normal;font-weight:var(--search-result-keyword-font-weight)}.medium-zoom-overlay,.medium-zoom-image--open,.medium-zoom-image--opened{z-index:2147483646 !important}.medium-zoom-overlay{background:var(--zoomimage-overlay-background) !important}:root{--mono-hue: 113;--mono-saturation: 0%;--mono-shade3: hsl(var(--mono-hue), var(--mono-saturation), 20%);--mono-shade2: hsl(var(--mono-hue), var(--mono-saturation), 30%);--mono-shade1: hsl(var(--mono-hue), var(--mono-saturation), 40%);--mono-base: hsl(var(--mono-hue), var(--mono-saturation), 50%);--mono-tint1: hsl(var(--mono-hue), var(--mono-saturation), 70%);--mono-tint2: hsl(var(--mono-hue), var(--mono-saturation), 89%);--mono-tint3: hsl(var(--mono-hue), var(--mono-saturation), 97%);--theme-hue: 204;--theme-saturation: 90%;--theme-lightness: 45%;--theme-color: hsl(var(--theme-hue), var(--theme-saturation), var(--theme-lightness));--modular-scale: 1.333;--modular-scale--2: calc(var(--modular-scale--1) / var(--modular-scale));--modular-scale--1: calc(var(--modular-scale-1) / var(--modular-scale));--modular-scale-1: 1rem;--modular-scale-2: calc(var(--modular-scale-1) * var(--modular-scale));--modular-scale-3: calc(var(--modular-scale-2) * var(--modular-scale));--modular-scale-4: calc(var(--modular-scale-3) * var(--modular-scale));--modular-scale-5: calc(var(--modular-scale-4) * var(--modular-scale));--font-size-xxxl: var(--modular-scale-5);--font-size-xxl: var(--modular-scale-4);--font-size-xl: var(--modular-scale-3);--font-size-l: var(--modular-scale-2);--font-size-m: var(--modular-scale-1);--font-size-s: var(--modular-scale--1);--font-size-xs: var(--modular-scale--2);--duration-slow: 1s;--duration-medium: 0.5s;--duration-fast: 0.25s;--spinner-size: 60px;--spinner-track-width: 4px;--spinner-track-color: rgba(0, 0, 0, 0.15);--spinner-transition-duration: var(--duration-medium)}:root{--base-background-color: #fff;--base-color: var(--mono-shade2);--base-font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";--base-font-size: 16px;--base-font-weight: normal;--base-line-height: 1.7;--emoji-size: calc(var(--base-line-height) * 1em);--hr-border: 1px solid var(--mono-tint2);--mark-background: #ffecb3;--pre-font-family: var(--code-font-family);--pre-font-size: var(--code-font-size);--pre-font-weight: normal;--selection-color: #b4d5fe;--small-font-size: var(--font-size-s);--strong-color: var(--heading-color);--strong-font-weight: 600;--subsup-font-size: var(--font-size-s)}:root{--content-max-width: 75em;--blockquote-background: var(--mono-tint3);--blockquote-border-style: solid;--blockquote-border-radius: var(--border-radius-m);--blockquote-em-font-weight: normal;--blockquote-font-weight: normal;--blockquote-padding: 1.5em;--code-font-family: Inconsolata, Consolas, Menlo, Monaco, "Andale Mono WT", "Andale Mono", "Lucida Console", "DejaVu Sans Mono", "Bitstream Vera Sans Mono", "Courier New", Courier, monospace;--code-font-size: calc(var(--font-size-m) * 0.95);--code-font-weight: normal;--code-tab-size: 4;--code-block-border-radius: var(--border-radius-m);--code-block-line-height: var(--base-line-height);--code-block-margin: 1em 0;--code-block-padding: 1.75em 1.5em 1.5em 
1.5em;--code-inline-background: var(--code-theme-background);--code-inline-border-radius: var(--border-radius-s);--code-inline-color: var(--code-theme-text);--code-inline-margin: 0 0.15em;--code-inline-padding: 0.125em 0.4em;--code-theme-background: var(--mono-tint3);--heading-color: var(--mono-shade3);--heading-margin: 2.5rem 0 0;--heading-h1-border-style: solid;--heading-h1-font-size: var(--font-size-xxl);--heading-h2-border-style: solid;--heading-h2-font-size: var(--font-size-xl);--heading-h3-border-style: solid;--heading-h3-font-size: var(--font-size-l);--heading-h4-border-style: solid;--heading-h4-font-size: var(--font-size-m);--heading-h5-border-style: solid;--heading-h5-font-size: var(--font-size-s);--heading-h6-border-style: solid;--heading-h6-font-size: var(--font-size-xs);--kbd-background: var(--mono-tint3);--kbd-border-radius: var(--border-radius-m);--kbd-margin: 0 0.3em;--kbd-min-width: 2.5em;--kbd-padding: 0.65em 0.5em;--link-text-decoration: underline;--notice-background: var(--mono-tint3);--notice-border-radius: var(--border-radius-m);--notice-border-style: solid;--notice-font-weight: normal;--notice-padding: 1em 1.5em;--notice-before-font-weight: normal;--table-cell-padding: 0.75em 0.5em;--table-head-border-color: var(--table-cell-border-color);--table-head-font-weight: var(--strong-font-weight);--table-row-odd-background: var(--mono-tint3)}:root{--cover-margin: 0 auto;--cover-max-width: 40em;--cover-text-align: center;--cover-background-color: var(--base-background-color);--cover-background-mask-color: var(--base-background-color);--cover-background-mask-opacity: 0.8;--cover-background-position: center center;--cover-background-repeat: no-repeat;--cover-background-size: cover;--cover-blockquote-font-size: var(--font-size-l);--cover-border-color: var(--theme-color);--cover-button-border: 1px solid var(--theme-color);--cover-button-border-radius: var(--border-radius-m);--cover-button-color: var(--theme-color);--cover-button-padding: 0.5em 2rem;--cover-button-text-decoration: none;--cover-button-transition: all var(--duration-fast) ease-in-out;--cover-button-primary-background: var(--theme-color);--cover-button-primary-border: 1px solid var(--theme-color);--cover-button-primary-color: #fff;--cover-heading-color: var(--theme-color);--cover-heading-font-size: var(--font-size-xxl);--cover-heading-font-weight: normal;--cover-link-text-decoration: underline}:root{--navbar-root-border-style: solid;--navbar-root-margin: 0 0 0 1.5em;--navbar-root-transition: all var(--duration-fast);--navbar-menu-background: var(--base-background-color);--navbar-menu-border-radius: var(--border-radius-m);--navbar-menu-box-shadow: rgba(45,45,45,0.05) 0px 0px 1px, rgba(49,49,49,0.05) 0px 1px 2px, rgba(42,42,42,0.05) 0px 2px 4px, rgba(32,32,32,0.05) 0px 4px 8px, rgba(49,49,49,0.05) 0px 8px 16px, rgba(35,35,35,0.05) 0px 16px 32px;--navbar-menu-padding: 0.5em;--navbar-menu-transition: all var(--duration-fast);--navbar-menu-link-border-style: solid;--navbar-menu-link-margin: 0.75em 0.5em;--navbar-menu-link-padding: 0.2em 0}:root{--copycode-background: #808080;--copycode-color: #fff}:root{--docsifytabs-border-color: var(--mono-tint2);--docsifytabs-border-radius-px: var(--border-radius-s);--docsifytabs-tab-background: var(--mono-tint3);--docsifytabs-tab-color: var(--mono-tint1)}:root{--pagination-border-top: 1px solid var(--mono-tint2);--pagination-chevron-height: 0.8em;--pagination-chevron-stroke: currentColor;--pagination-chevron-stroke-linecap: round;--pagination-chevron-stroke-width: 
1px;--pagination-label-font-size: var(--font-size-s);--pagination-title-font-size: var(--font-size-l)}:root{--search-margin: 1.5rem 0 0;--search-input-background-repeat: no-repeat;--search-input-border-color: var(--mono-tint1);--search-input-border-width: 1px;--search-input-padding: 0.5em;--search-flex-order: 1;--search-result-heading-color: var(--heading-color);--search-result-heading-font-size: var(--base-font-size);--search-result-heading-font-weight: normal;--search-result-heading-margin: 0 0 0.25em;--search-result-item-border-color: var(--mono-tint2);--search-result-item-border-style: solid;--search-result-item-border-width: 0 0 1px 0;--search-result-item-font-weight: normal;--search-result-item-padding: 1em 0;--search-result-keyword-background: var(--mark-background);--search-result-keyword-border-radius: var(--border-radius-s);--search-result-keyword-color: var(--mark-color);--search-result-keyword-font-weight: normal;--search-result-keyword-margin: 0 0.1em;--search-result-keyword-padding: 0.2em 0}:root{--zoomimage-overlay-background: rgba(0, 0, 0, 0.875)}:root{--sidebar-background: var(--base-background-color);--sidebar-border-width: 0;--sidebar-padding: 0 25px;--sidebar-transition-duration: var(--duration-fast);--sidebar-width: 17rem;--sidebar-name-font-size: var(--font-size-l);--sidebar-name-font-weight: normal;--sidebar-name-margin: 1.5rem 0 0;--sidebar-name-text-align: center;--sidebar-nav-strong-border-color: var(--sidebar-border-color);--sidebar-nav-strong-color: var(--heading-color);--sidebar-nav-strong-font-weight: var(--strong-font-weight);--sidebar-nav-strong-margin: 1.5em 0 0.5em;--sidebar-nav-strong-padding: 0.25em 0;--sidebar-nav-indent: 1em;--sidebar-nav-margin: 1.5rem 0 0;--sidebar-nav-link-border-style: solid;--sidebar-nav-link-border-width: 0;--sidebar-nav-link-color: var(--base-color);--sidebar-nav-link-font-weight: normal;--sidebar-nav-link-padding: 0.25em 0;--sidebar-nav-link-text-decoration--active: underline;--sidebar-nav-link-text-decoration--hover: underline;--sidebar-nav-link-before-margin: 0 0.35em 0 0;--sidebar-nav-pagelink-transition: var(--sidebar-nav-link-transition);--sidebar-toggle-border-radius: var(--border-radius-s);--sidebar-toggle-border-style: solid;--sidebar-toggle-border-width: 0;--sidebar-toggle-height: 36px;--sidebar-toggle-icon-color: var(--base-color);--sidebar-toggle-icon-height: 10px;--sidebar-toggle-icon-stroke-width: 1px;--sidebar-toggle-icon-width: 16px;--sidebar-toggle-offset-left: 0;--sidebar-toggle-offset-top: calc(35px - (var(--sidebar-toggle-height) / 2));--sidebar-toggle-width: 44px}:root{--code-theme-background: #f3f3f3; --code-theme-comment: #6e8090;--code-theme-function: #dd4a68;--code-theme-keyword: #07a;--code-theme-operator: #a67f59;--code-theme-punctuation: #999;--code-theme-selector: #690;--code-theme-tag: #905;--code-theme-text: #ffffff;--code-theme-variable: #e90}:root{--border-radius-s: 2px;--border-radius-m: 4px;--border-radius-l: 8px;--strong-font-weight: 600;--blockquote-border-color: var(--theme-color);--blockquote-border-radius: 0 var(--border-radius-m) var(--border-radius-m) 0;--blockquote-border-width: 0 0 0 4px;--code-inline-background: var(--mono-tint2);--code-theme-background: var(--mono-tint3);--heading-font-weight: var(--strong-font-weight);--heading-h1-font-weight: 400;--heading-h2-font-weight: 400;--heading-h2-border-color: var(--mono-tint2);--heading-h2-border-width: 0 0 1px 0;--heading-h2-margin: 2.5rem 0 1.5rem;--heading-h2-padding: 0 0 1rem 0;--kbd-border: 1px solid 
var(--mono-tint2);--notice-border-radius: 0 var(--border-radius-m) var(--border-radius-m) 0;--notice-border-width: 0 0 0 4px;--notice-padding: 1em 1.5em 1em 3em;--notice-before-border-radius: 100%;--notice-before-font-weight: bold;--notice-before-height: 1.5em;--notice-before-left: 0.75em;--notice-before-line-height: 1.5;--notice-before-margin: 0 0.25em 0 0;--notice-before-position: absolute;--notice-before-width: var(--notice-before-height);--notice-important-background: hsl(340, 60%, 96%);--notice-important-border-color: hsl(340, 90%, 45%);--notice-important-before-background: var(--notice-important-border-color) url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3E%3Cpath d='M10 14C10 15.1 9.1 16 8 16 6.9 16 6 15.1 6 14 6 12.9 6.9 12 8 12 9.1 12 10 12.9 10 14Z'/%3E%3Cpath d='M10 1.6C10 1.2 9.8 0.9 9.6 0.7 9.2 0.3 8.6 0 8 0 7.4 0 6.8 0.2 6.5 0.6 6.2 0.9 6 1.2 6 1.6 6 1.7 6 1.8 6 1.9L6.8 9.6C6.9 9.9 7 10.1 7.2 10.2 7.4 10.4 7.7 10.5 8 10.5 8.3 10.5 8.6 10.4 8.8 10.3 9 10.1 9.1 9.9 9.2 9.6L10 1.9C10 1.8 10 1.7 10 1.6Z'/%3E%3C/svg%3E") center / 0.875em no-repeat;--notice-important-before-color: #fff;--notice-important-before-content: "";--notice-tip-background: hsl(204, 60%, 96%);--notice-tip-border-color: hsl(204, 90%, 45%);--notice-tip-before-background: var(--notice-tip-border-color) url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3E%3Cpath d='M9.1 0C10.2 0 10.7 0.7 10.7 1.6 10.7 2.6 9.8 3.6 8.6 3.6 7.6 3.6 7 3 7 2 7 1.1 7.7 0 9.1 0Z'/%3E%3Cpath d='M5.8 16C5 16 4.4 15.5 5 13.2L5.9 9.1C6.1 8.5 6.1 8.2 5.9 8.2 5.7 8.2 4.6 8.6 3.9 9.1L3.5 8.4C5.6 6.6 7.9 5.6 8.9 5.6 9.8 5.6 9.9 6.6 9.5 8.2L8.4 12.5C8.2 13.2 8.3 13.5 8.5 13.5 8.7 13.5 9.6 13.2 10.4 12.5L10.9 13.2C8.9 15.2 6.7 16 5.8 16Z'/%3E%3C/svg%3E") center / 0.875em no-repeat;--notice-tip-before-color: #fff;--notice-tip-before-content: "";--table-cell-border-color: var(--mono-tint2);--table-cell-border-width: 1px 0;--cover-background-color: hsl(var(--theme-hue), 25%, 60%);--cover-background-image: radial-gradient(ellipse at center 115%, rgba(255, 255, 255, 0.9), transparent);--cover-blockquote-color: var(--strong-color);--cover-heading-color: #fff;--cover-heading-font-size-max: 56;--cover-heading-font-size-min: 34;--cover-heading-font-weight: 200;--navbar-root-color--active: var(--theme-color);--navbar-menu-border-radius: var(--border-radius-m);--navbar-menu-root-background: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='9.6' height='6' viewBox='0 0 9.6 6'%3E%3Cpath d='M1.5 1.5l3.3 3 3.3-3' stroke-width='1.5' stroke='rgb%28179, 179, 179%29' fill='none' stroke-linecap='square' stroke-linejoin='miter' vector-effect='non-scaling-stroke'/%3E%3C/svg%3E") right no-repeat;--navbar-menu-root-padding: 0 18px 0 0;--search-input-background-color: #fff;--search-input-background-image: url("data:image/svg+xml,%3Csvg height='20px' width='20px' viewBox='0 0 24 24' fill='none' stroke='rgba(0, 0, 0, 0.3)' stroke-width='1.5' stroke-linecap='round' stroke-linejoin='round' preserveAspectRatio='xMidYMid meet' xmlns='http://www.w3.org/2000/svg'%3E%3Ccircle cx='10.5' cy='10.5' r='7.5' vector-effect='non-scaling-stroke'%3E%3C/circle%3E%3Cline x1='21' y1='21' x2='15.8' y2='15.8' vector-effect='non-scaling-stroke'%3E%3C/line%3E%3C/svg%3E");--search-input-background-position: 21px center;--search-input-border-color: var(--sidebar-border-color);--search-input-border-width: 1px 0;--search-input-margin: 0 -25px;--search-input-padding: 0.65em 
1em 0.65em 50px;--search-input-placeholder-color: rgba(0, 0, 0, 0.4);--search-clear-icon-color1: rgba(0, 0, 0, 0.3);--search-result-heading-font-weight: var(--strong-font-weight);--search-result-item-border-color: var(--sidebar-border-color);--search-result-keyword-border-radius: var(--border-radius-s);--sidebar-background: var(--mono-tint3);--sidebar-border-color: var(--mono-tint2);--sidebar-border-width: 0 1px 0 0;--sidebar-name-color: var(--theme-color);--sidebar-name-font-weight: 300;--sidebar-nav-strong-border-width: 0 0 1px 0;--sidebar-nav-strong-font-size: smaller;--sidebar-nav-strong-margin: 2em -25px 0.75em 0;--sidebar-nav-strong-padding: 0.25em 0 0.75em 0;--sidebar-nav-strong-text-transform: uppercase;--sidebar-nav-link-border-color: transparent;--sidebar-nav-link-border-color--active: var(--theme-color);--sidebar-nav-link-border-width: 0 4px 0 0;--sidebar-nav-link-color--active: var(--theme-color);--sidebar-nav-link-margin: 0 -25px 0 0;--sidebar-nav-link-text-decoration: none;--sidebar-nav-link-text-decoration--active: none;--sidebar-nav-link-text-decoration--hover: underline;--sidebar-nav-pagelink-background: no-repeat 2px calc(50% - 2.5px) / 6px 5px linear-gradient(45deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4px), no-repeat 2px calc(50% + 2.5px) / 6px 5px linear-gradient(135deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4px);--sidebar-nav-pagelink-background--active: no-repeat 0px center / 5px 6px linear-gradient(225deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4.25px), no-repeat 5px center / 5px 6px linear-gradient(135deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4.25px);--sidebar-nav-pagelink-background--collapse: no-repeat 2px calc(50% - 2.5px) / 6px 5px linear-gradient(45deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4px), no-repeat 2px calc(50% + 2.5px) / 6px 5px linear-gradient(135deg, transparent 2.75px, var(--theme-color) 2.75px 4.25px, transparent 4px);--sidebar-nav-pagelink-background--loaded: no-repeat 0px center / 5px 6px linear-gradient(225deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4.25px), no-repeat 5px center / 5px 6px linear-gradient(135deg, transparent 2.75px, var(--mono-tint1) 2.75px 4.25px, transparent 4.25px);--sidebar-nav-pagelink-padding: 0.25em 0 0.25em 20px;--sidebar-nav-pagelink-transition: none;--sidebar-toggle-background: var(--sidebar-border-color);--sidebar-toggle-border-radius: 0 var(--border-radius-s) var(--border-radius-s) 0;--sidebar-toggle-width: 32px}
+
+/*# sourceMappingURL=theme-simple.css.map */
+
+body {
+ line-height: 1.5;
+}
+
+h2 {
+ margin-top: 0.8em !important;
+ margin-bottom: 0.5em !important;
+ padding: 0em !important;
+}
+
+h2 em {
+ color:hsl(341, 42%, 64%);
+}
+
+pre {
+ margin-bottom: 0em !important;
+}
+
+p.warn {
+ margin-top: 1.5em !important;
+}
+
+blockquote {
+ margin-top: 0.8em !important;
+ margin-bottom: 0.8em !important;
+ padding: 0.8em !important;
+}
+
+h2 code {
+ color:rgb(0, 0, 0) !important;
+ background-color: #bbbbbb !important;
+}
+
+p code {
+ color:rgb(0, 0, 0) !important;
+ background-color: #bbbbbb !important;
+}
+
+pre code {
+ padding: 0.8em !important;
+ margin: 0 !important;
+}
+
+li code {
+ padding: 0.05em !important;
+ margin: 0 !important;
+ color:rgb(0, 0, 0) !important;
+ background-color: #bbbbbb !important;
+}
+
+pre.language- {
+ margin-top: 0.2em !important;
+ margin-bottom: 0.5em !important;
+}
+
+pre.language- code {
+ padding: 0.7em !important;
+ margin: 0 !important;
+ color:rgb(0, 0, 0) !important;
+ background-color: #bbbbbb !important;
+}
+
+pre.language-sh {
+ margin-top: 0.2em !important;
+ margin-bottom: 0.5em !important;
+}
+
+pre.language-python {
+ margin-top: 0.2em !important;
+ margin-bottom: 0.5em !important;
+ background-color: #000000;
+ opacity:0.8;
+}
+
+pre.language-sh code {
+ padding: 0.7em !important;
+ margin: 0 !important;
+ color:rgb(0, 0, 0) !important;
+ background-color: #bbbbbb !important;
+}
+
+ul.navbar {
+ list-style-type: none;
+ margin: 0;
+ padding: 0;
+ overflow: hidden;
+ /*background-color: var(--theme-color);*/
+ background-color: var(--base-background-color);
+}
+
+li.navbar {
+ float: left;
+}
+
+li.navbar a.navbar {
+ display: inline-block;
+ position: relative;
+ color: rgb(0, 0, 0);
+ text-align: center;
+ padding: 10px 16px;
+ text-decoration: none;
+}
+
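+/* Animated underline for navbar links: the ::after bar scales from 0 to full width on hover */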
+li.navbar a.navbar::after {
+ content: '';
+ position: absolute;
+ width: 100%;
+ transform: scaleX(0);
+ height: 2px;
+ bottom: 0;
+ left: 0;
+ background-color: #0087ca;
+ transform-origin: bottom right;
+ transition: transform 0.25s ease-out;
+}
+
+li.navbar a.navbar:hover::after {
+ transform: scaleX(1);
+ transform-origin: bottom left;
+}
+
+
+
+li.navbar a img {
+ max-width: 100%;
+ outline: 0;
+ border: 0;
+}
+
+.github-link {
+ display: flex;
+ align-items: center;
+}
+
+.github-icon-container {
+ margin-right: 10px; /* add some spacing between the icon and text */
+}
+
+.github-text-container {
+ color: #0087ca;
+ text-decoration: none;
+ font-weight: bold;
+}
+
+
+.token.after-two-points {
+ color: rgb(108, 154, 180);
+ }
+
+
+.token.type-annotation-tuple {
+ color: rgb(108, 154, 180);
+}
+
+
+ /* toggler */
+
+ #docsify-darklight-theme {
+ border: none;
+ background-color: transparent;
+ position: flex;
+ float: right;
+ margin-top: 8px;
+ margin-left: 8px;
+ width: 25px;
+ height: 25px;
+ background-repeat: no-repeat;
+ background-image: var(--toogleImage);
+ -o-background-size: cover;
+ -moz-background-size: cover;
+ -webkit-background-size:cover;
+ background-size: cover;
+ cursor: pointer;
+ transition: background-image .15s ease-in-out .15s
+}
\ No newline at end of file
diff --git a/assets/zoom-image.min.js b/assets/zoom-image.min.js
new file mode 100644
index 0000000..6164281
--- /dev/null
+++ b/assets/zoom-image.min.js
@@ -0,0 +1 @@
+!function(){function t(e){return"IMG"===e.tagName}function d(e){function t(){for(var e=arguments,t=arguments.length,o=Array(t),n=0;nu.scrollOffset&&setTimeout(a,150))}),window.addEventListener("resize",a);var f={open:i,close:a,toggle:o,update:function(){var e=0 None:
+ self.num_epochs = num_epochs
+ self.data_loader = data_loader
+
+ # Encoder network
+ self.encoder_prop = encoder_prop
+ self.encoder = TagiNetwork(self.encoder_prop)
+ if encoder_param is not None:
+ self.encoder.set_parameters(param=encoder_param)
+
+ # Decoder network
+ self.decoder_prop = decoder_prop
+ self.decoder = TagiNetwork(self.decoder_prop)
+ if decoder_param is not None:
+ self.decoder.set_parameters(decoder_param)
+ self.viz = viz
+ self.dtype = dtype
+
+ def train(self) -> None:
+ """Train encoder and decoder"""
+ # Initialization
+ assert self.encoder_prop.batch_size == self.decoder_prop.batch_size
+
+ # Inputs
+ batch_size = self.encoder_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ # Outputs
+ V_batch, empty_ud_idx_batch = self.init_outputs(batch_size)
+
+ input_data, _, _, _ = self.data_loader["train"]
+ num_data = input_data.shape[0]
+ num_iter = int(num_data / batch_size)
+ pbar = tqdm(range(self.num_epochs))
+ for epoch in pbar:
+ # Decay the observation noise sigma_v over epochs
+ self.decoder_prop.sigma_v = exponential_scheduler(
+ curr_v=self.decoder_prop.sigma_v,
+ min_v=self.decoder_prop.sigma_v_min,
+ decaying_factor=self.decoder_prop.decay_factor_sigma_v,
+ curr_iter=epoch)
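+ # Refresh the observation noise matrix with the decayed sigma_v**2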
+ V_batch = V_batch * 0.0 + self.decoder_prop.sigma_v**2
+
+ for i in range(num_iter):
+ # Momentum for batch norm layer
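+ # (ra_mt = 0.0 on the first step initializes the running statistics; 0.9 applies an exponential moving average afterwards)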
+ if (i == 0 and epoch == 0):
+ self.encoder.net_prop.ra_mt = 0.0
+ self.decoder.net_prop.ra_mt = 0.0
+ else:
+ self.encoder.net_prop.ra_mt = 0.9
+ self.decoder.net_prop.ra_mt = 0.9
+
+ # Get data
+ idx = np.random.choice(num_data, size=batch_size)
+ x_batch = input_data[idx, :]
+
+ # Encoder's feed forward
+ self.encoder.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+
+ # Decoder's feed forward
+ ma, va, mz, vz, jcb = self.encoder.get_all_network_outputs()
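+ # Pass the encoder's output moments (means, variances and Jacobian) straight into the decoder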
+ self.decoder.connected_feed_forward(ma=ma,
+ va=va,
+ mz=mz,
+ vz=vz,
+ jcb=jcb)
+
+ # Decoder's feed backward for states & parameters
+ self.decoder.state_feed_backward(x_batch, V_batch,
+ empty_ud_idx_batch)
+ self.decoder.param_feed_backward()
+
+ # Encoder's feed backward for states & parameters
+ enc_delta_mz_init, enc_delta_vz_init = self.encoder.get_state_delta_mean_var(
+ )
+
+ # Encoder's feed backward for state & parameters
+ self.encoder.state_feed_backward(enc_delta_mz_init,
+ enc_delta_vz_init,
+ empty_ud_idx_batch)
+ self.encoder.param_feed_backward()
+
+ # Progress bar
+ pbar.set_description(
+ f"Epoch# {epoch: 0}|{i * batch_size + len(x_batch):>5}|{num_data: 1}"
+ )
+
+ self.predict()
+
+ def predict(self) -> None:
+ """Generate images"""
+ # Inputs
+ batch_size = self.encoder_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ generated_images = []
+ for count, (x_batch, _) in enumerate(self.data_loader["test"]):
+ # Disable running-average updates for the batch norm layer
+ self.encoder.net_prop.ra_mt = 1.0
+ self.decoder.net_prop.ra_mt = 1.0
+
+ # Encoder's feed forward
+ self.encoder.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+
+ # Decoder's feed forward
+ ma, va, mz, vz, jcb = self.encoder.get_all_network_outputs()
+ self.decoder.connected_feed_forward(ma=ma,
+ va=va,
+ mz=mz,
+ vz=vz,
+ jcb=jcb)
+
+ # Get images
+ norm_pred, _ = self.decoder.get_network_predictions()
+ generated_images.append(norm_pred)
+
+ # Only first 100 images
+ if count * batch_size > 100:
+ break
+
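+ # Flatten all batches and keep the pixels of the first 100 images (nodes[0] pixels per image)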
+ generated_images = np.stack(generated_images).flatten()
+ generated_images = generated_images[:self.encoder_prop.nodes[0] * 100]
+
+ # Visualization
+ if self.viz is not None:
+ n_row = 10
+ n_col = 10
+ self.viz.plot_images(n_row=n_row,
+ n_col=n_col,
+ imgs=generated_images)
+
+ def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for inputs"""
+ Sx_batch = np.zeros((batch_size, self.encoder_prop.nodes[0]),
+ dtype=self.dtype)
+
+ Sx_f_batch = np.array([], dtype=self.dtype)
+
+ return Sx_batch, Sx_f_batch
+
+ def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for outputs"""
+ # Outputs
+ V_batch = np.zeros((batch_size, self.decoder_prop.nodes[-1]),
+ dtype=self.dtype) + self.decoder_prop.sigma_v**2
+ ud_idx_batch = np.zeros((batch_size, 0), dtype=np.int32)
+
+ return V_batch, ud_idx_batch
diff --git a/code/autoencoder_runner.py b/code/autoencoder_runner.py
new file mode 100644
index 0000000..0631116
--- /dev/null
+++ b/code/autoencoder_runner.py
@@ -0,0 +1,50 @@
+import numpy as np
+from visualizer import ImageViz
+
+from python_examples.autoencoder import Autoencoder
+from python_examples.data_loader import MnistDataloader
+from python_examples.model import MnistDecoder, MnistEncoder
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_epochs = 10
+ mu = np.array([0.1309])
+ sigma = np.array([1])
+ img_size = np.array([1, 28, 28])
+ x_train_file = "./data/mnist/train-images-idx3-ubyte"
+ y_train_file = "./data/mnist/train-labels-idx1-ubyte"
+ x_test_file = "./data/mnist/t10k-images-idx3-ubyte"
+ y_test_file = "./data/mnist/t10k-labels-idx1-ubyte"
+
+ # Model
+ encoder_prop = MnistEncoder()
+ decoder_prop = MnistDecoder()
+
+ # Data loader
+ reg_data_loader = MnistDataloader(batch_size=encoder_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Visualization
+ viz = ImageViz(task_name="autoencoder",
+ data_name="mnist",
+ mu=mu,
+ sigma=sigma,
+ img_size=img_size)
+
+ # Train and test
+ reg_task = Autoencoder(num_epochs=num_epochs,
+ data_loader=data_loader,
+ encoder_prop=encoder_prop,
+ decoder_prop=decoder_prop,
+ viz=viz)
+ reg_task.train()
+ reg_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/classification.py b/code/classification.py
new file mode 100644
index 0000000..2dde066
--- /dev/null
+++ b/code/classification.py
@@ -0,0 +1,151 @@
+###############################################################################
+# File: classification.py
+# Description: Example of classification task using pytagi
+# Authors: Luong-Ha Nguyen & James-A. Goulet
+# Created: October 19, 2022
+# Updated: November 12, 2022
+# Contact: luongha.nguyen@gmail.com & james.goulet@polymtl.ca
+# Copyright (c) 2022 Luong-Ha Nguyen & James-A. Goulet. Some rights reserved.
+###############################################################################
+from typing import Tuple
+
+import numpy as np
+from tqdm import tqdm
+
+import pytagi.metric as metric
+from pytagi import NetProp, TagiNetwork
+from pytagi import HierarchicalSoftmax, Utils
+
+
+class Classifier:
+ """Classifier images"""
+
+ hr_softmax: HierarchicalSoftmax
+ utils: Utils = Utils()
+
+ def __init__(self, num_epochs: int, data_loader: dict, net_prop: NetProp,
+ num_classes: int) -> None:
+ self.num_epochs = num_epochs
+ self.data_loader = data_loader
+ self.net_prop = net_prop
+ self.num_classes = num_classes
+ self.network = TagiNetwork(self.net_prop)
+
+ @property
+ def num_classes(self) -> int:
+ """Get number of classes"""
+
+ return self._num_classes
+
+ @num_classes.setter
+ def num_classes(self, value: int) -> None:
+ """Set number of classes"""
+ self._num_classes = value
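+ # Rebuild the hierarchical softmax for the new class count and set the
+ # network's number of observations (nye) to match its binary encoding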
+ self.hr_softmax = self.utils.get_hierarchical_softmax(
+ self._num_classes)
+ self.net_prop.nye = self.hr_softmax.num_obs
+
+ def train(self) -> None:
+ """Train the network using TAGI"""
+
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ # Outputs
+ V_batch, _ = self.init_outputs(batch_size)
+
+ input_data, output_data, output_idx, labels = self.data_loader["train"]
+ num_data = input_data.shape[0]
+ num_iter = int(num_data / batch_size)
+ pbar = tqdm(range(self.num_epochs))
+ error_rates = []
+ for epoch in pbar:
+ for i in range(num_iter):
+ # Get data
+ idx = np.random.choice(num_data, size=batch_size)
+ x_batch = input_data[idx, :]
+ y_batch = output_data[idx, :]
+ ud_idx_batch = output_idx[idx, :]
+ label = labels[idx]
+
+ # Feed forward
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+
+ # Update hidden states
+ self.network.state_feed_backward(y_batch, V_batch,
+ ud_idx_batch)
+
+ # Update parameters
+ self.network.param_feed_backward()
+
+ # Error rate
+ ma_pred, Sa_pred = self.network.get_network_outputs()
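+ # Decode the output means/variances into class predictions via the hierarchical softmax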
+ pred, _ = self.utils.get_labels(ma=ma_pred,
+ Sa=Sa_pred,
+ hr_softmax=self.hr_softmax,
+ num_classes=self.num_classes,
+ batch_size=batch_size)
+
+ error_rate = metric.classification_error(prediction=pred,
+ label=label)
+ error_rates.append(error_rate)
+ if i % 1000 == 0 and i > 0:
+ extracted_error_rate = np.hstack(error_rates)
+ avg_error_rate = np.mean(extracted_error_rate[-100:])
+ pbar.set_description(
+ f"Epoch# {epoch: 0}|{i * batch_size + len(x_batch):>5}|{num_data: 1}\t Error rate: {avg_error_rate * 100:>7.2f}%"
+ )
+
+ # Validate on test set after each epoch
+ self.predict()
+
+ def predict(self) -> None:
+ """Make prediction using TAGI"""
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ preds = []
+ labels = []
+ for x_batch, y_batch in self.data_loader["test"]:
+ # Predictions
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+ ma, Sa = self.network.get_network_outputs()
+ pred, _ = self.utils.get_labels(ma=ma,
+ Sa=Sa,
+ hr_softmax=self.hr_softmax,
+ num_classes=self.num_classes,
+ batch_size=batch_size)
+
+ # Store data
+ preds.append(pred)
+ labels.append(y_batch)
+
+ preds = np.stack(preds).flatten()
+ labels = np.stack(labels).flatten()
+
+ # Compute classification error rate
+ error_rate = metric.classification_error(prediction=preds,
+ label=labels)
+
+ print("#############")
+ print(f"Error rate : {error_rate * 100: 0.2f}%")
+
+ def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for inputs"""
+ Sx_batch = np.zeros((batch_size, self.net_prop.nodes[0]),
+ dtype=np.float32)
+
+ Sx_f_batch = np.array([], dtype=np.float32)
+
+ return Sx_batch, Sx_f_batch
+
+ def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for outputs"""
+ # Outputs
+ V_batch = np.zeros((batch_size, self.net_prop.nodes[-1]),
+ dtype=np.float32) + self.net_prop.sigma_v**2
+ ud_idx_batch = np.zeros((batch_size, 0), dtype=np.int32)
+
+ return V_batch, ud_idx_batch
diff --git a/code/classification_runner.py b/code/classification_runner.py
new file mode 100644
index 0000000..27e674a
--- /dev/null
+++ b/code/classification_runner.py
@@ -0,0 +1,35 @@
+from python_examples.classification import Classifier
+from python_examples.data_loader import ClassificationDataloader
+from python_examples.model import ConvCifarMLP
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_epochs = 50
+ x_train_file = "./data/cifar/x_train.csv"
+ y_train_file = "./data/cifar/y_train.csv"
+ x_test_file = "./data/cifar/x_test.csv"
+ y_test_file = "./data/cifar/y_test.csv"
+
+ # Model
+ net_prop = ConvCifarMLP()
+
+ # Data loader
+ reg_data_loader = ClassificationDataloader(batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ clas_task = Classifier(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ num_classes=10)
+ clas_task.train()
+ clas_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/data_loader.py b/code/data_loader.py
new file mode 100644
index 0000000..ba65dea
--- /dev/null
+++ b/code/data_loader.py
@@ -0,0 +1,321 @@
+###############################################################################
+# File: dataloader.py
+# Description: Prepare data for neural networks
+# Authors: Luong-Ha Nguyen & James-A. Goulet
+# Created: October 12, 2022
+# Updated: October 30, 2022
+# Contact: luongha.nguyen@gmail.com & james.goulet@polymtl.ca
+# Copyright (c) 2022 Luong-Ha Nguyen & James-A. Goulet. Some rights reserved.
+###############################################################################
+from abc import ABC, abstractmethod
+
+import numpy as np
+import pandas as pd
+from pytagi import Normalizer, Utils
+
+
+class DataloaderBase(ABC):
+ """Dataloader template"""
+
+ normalizer: Normalizer = Normalizer()
+
+ def __init__(self, batch_size: int) -> None:
+ self.batch_size = batch_size
+
+ @abstractmethod
+    def process_data(self) -> dict:
+        """Process raw data into a dataloader dictionary"""
+        raise NotImplementedError
+
+ def create_data_loader(self, raw_input: np.ndarray,
+ raw_output: np.ndarray) -> list:
+ """Create dataloader based on batch size"""
+ num_input_data = raw_input.shape[0]
+ num_output_data = raw_output.shape[0]
+ assert num_input_data == num_output_data
+
+ # Even indices
+ even_indices = self.split_evenly(num_input_data, self.batch_size)
+
+ if np.mod(num_input_data, self.batch_size) != 0:
+            # Remainder indices
+ rem_indices = self.split_reminder(num_input_data, self.batch_size)
+ even_indices.append(rem_indices)
+
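+        # Every chunk now holds exactly `chunk_size` indices (the remainder was padded), so they stack into a 2-D array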
+ indices = np.stack(even_indices)
+ input_data = raw_input[indices]
+ output_data = raw_output[indices]
+ dataset = []
+ for x_batch, y_batch in zip(input_data, output_data):
+ dataset.append((x_batch, y_batch))
+ return dataset
+
+ @staticmethod
+    def split_data(data: np.ndarray,
+                   test_ratio: float = 0.2,
+                   val_ratio: float = 0.0) -> dict:
+        """Split data into training, validation, and test sets"""
+        num_data = data.shape[0]
+        splits = {}
+        if val_ratio != 0.0:
+            end_val_idx = num_data - int(test_ratio * num_data)
+            end_train_idx = int(end_val_idx - val_ratio * end_val_idx)
+            splits["train"] = data[:end_train_idx]
+            splits["val"] = data[end_train_idx:end_val_idx]
+            splits["test"] = data[end_val_idx:]
+        else:
+            end_train_idx = num_data - int(test_ratio * num_data)
+            splits["train"] = data[:end_train_idx]
+            splits["val"] = []
+            splits["test"] = data[end_train_idx:]
+
+        return splits
+
+ @staticmethod
+ def load_data_from_csv(data_file: str) -> pd.DataFrame:
+ """Load data from csv file"""
+
+ data = pd.read_csv(data_file, skiprows=1, delimiter=",", header=None)
+
+ return data.values
+
+ @staticmethod
+ def split_evenly(num_data, chunk_size: int):
+ """split data evenly"""
+ indices = np.arange(int(num_data - np.mod(num_data, chunk_size)))
+
+ return np.split(indices, int(np.floor(num_data / chunk_size)))
+
+ @staticmethod
+ def split_reminder(num_data: int, chunk_size: int):
+ """Pad the reminder"""
+ indices = np.arange(num_data)
+ reminder_start = int(num_data - np.mod(num_data, chunk_size))
+ num_samples = chunk_size - (num_data - reminder_start)
+ random_idx = np.random.choice(indices, size=num_samples, replace=False)
+ reminder_idx = indices[reminder_start:]
+
+ return np.concatenate((random_idx, reminder_idx))
+
+
+class RegressionDataLoader(DataloaderBase):
+ """Load and format data that are feeded to the neural network.
+ The user must provide the input and output data file in *csv"""
+
+ def __init__(self, batch_size: int, num_inputs: int,
+ num_outputs: int) -> None:
+ super().__init__(batch_size)
+ self.num_inputs = num_inputs
+ self.num_outputs = num_outputs
+
+ def process_data(self, x_train_file: str, y_train_file: str,
+ x_test_file: str, y_test_file: str) -> dict:
+ """Process data from the csv file"""
+
+ # Load data
+ x_train = self.load_data_from_csv(x_train_file)
+ y_train = self.load_data_from_csv(y_train_file)
+ x_test = self.load_data_from_csv(x_test_file)
+ y_test = self.load_data_from_csv(y_test_file)
+
+ # Normalizer
+ x_mean, x_std = self.normalizer.compute_mean_std(
+ np.concatenate((x_train, x_test)))
+ y_mean, y_std = self.normalizer.compute_mean_std(
+ np.concatenate((y_train, y_test)))
+
+ x_train = self.normalizer.standardize(data=x_train,
+ mu=x_mean,
+ std=x_std)
+ y_train = self.normalizer.standardize(data=y_train,
+ mu=y_mean,
+ std=y_std)
+ x_test = self.normalizer.standardize(data=x_test, mu=x_mean, std=x_std)
+ y_test = self.normalizer.standardize(data=y_test, mu=y_mean, std=y_std)
+
+ # Dataloader
+ data_loader = {}
+ data_loader["train"] = (x_train, y_train)
+ data_loader["test"] = self.create_data_loader(raw_input=x_test,
+ raw_output=y_test)
+ data_loader["x_norm_param_1"] = x_mean
+ data_loader["x_norm_param_2"] = x_std
+ data_loader["y_norm_param_1"] = y_mean
+ data_loader["y_norm_param_2"] = y_std
+
+ return data_loader
+
+
+class MnistDataloader(DataloaderBase):
+ """Data loader for mnist dataset"""
+
+ def process_data(self, x_train_file: str, y_train_file: str,
+ x_test_file: str, y_test_file: str) -> dict:
+ """Process mnist images"""
+ # Initialization
+ utils = Utils()
+ num_train_images = 60000
+ num_test_images = 10000
+
+        # Training set
+ train_images, train_labels = utils.load_mnist_images(
+ image_file=x_train_file,
+ label_file=y_train_file,
+ num_images=num_train_images)
+
+ y_train, y_train_idx, num_enc_obs = utils.label_to_obs(
+ labels=train_labels, num_classes=10)
+ x_mean, x_std = self.normalizer.compute_mean_std(train_images)
+ x_std = 1
+
+ # Test set
+ test_images, test_labels = utils.load_mnist_images(
+ image_file=x_test_file,
+ label_file=y_test_file,
+ num_images=num_test_images)
+
+ # Normalizer
+ x_train = self.normalizer.standardize(data=train_images,
+ mu=x_mean,
+ std=x_std)
+ x_test = self.normalizer.standardize(data=test_images,
+ mu=x_mean,
+ std=x_std)
+
+ y_train = y_train.reshape((num_train_images, num_enc_obs))
+ y_train_idx = y_train_idx.reshape((num_train_images, num_enc_obs))
+ x_train = x_train.reshape((num_train_images, 28, 28))
+ x_test = x_test.reshape((num_test_images, 28, 28))
+
+ # Data loader
+ data_loader = {}
+ data_loader["train"] = (x_train, y_train, y_train_idx, train_labels)
+ data_loader["test"] = self.create_data_loader(raw_input=x_test,
+ raw_output=test_labels)
+ data_loader["x_norm_param_1"] = x_mean
+ data_loader["x_norm_param_2"] = x_std
+
+ return data_loader
+
+
+class TimeSeriesDataloader(DataloaderBase):
+ """Data loader for time series"""
+
+ def __init__(self, batch_size: int, output_col: np.ndarray,
+ input_seq_len: int, output_seq_len: int, num_features: int,
+ stride: int) -> None:
+ super().__init__(batch_size)
+ self.output_col = output_col
+ self.input_seq_len = input_seq_len
+ self.output_seq_len = output_seq_len
+ self.num_features = num_features
+ self.stride = stride
+
+ def process_data(self, x_train_file: str, datetime_train_file: str,
+ x_test_file: str, datetime_test_file: str) -> dict:
+ """Process time series"""
+ # Initialization
+ utils = Utils()
+
+ # Load data
+ x_train = self.load_data_from_csv(x_train_file)
+ datetime_train = self.load_data_from_csv(datetime_train_file)
+
+ x_test = self.load_data_from_csv(x_test_file)
+ datetime_test = self.load_data_from_csv(datetime_test_file)
+
+ # Normalizer
+ x_mean, x_std = self.normalizer.compute_mean_std(x_train)
+ x_train = self.normalizer.standardize(data=x_train,
+ mu=x_mean,
+ std=x_std)
+ x_test = self.normalizer.standardize(data=x_test, mu=x_mean, std=x_std)
+
+ # Create rolling windows
+ x_train_rolled, y_train_rolled = utils.create_rolling_window(
+ data=x_train,
+ output_col=self.output_col,
+ input_seq_len=self.input_seq_len,
+ output_seq_len=self.output_seq_len,
+ num_features=self.num_features,
+ stride=self.stride)
+
+ x_test_rolled, y_test_rolled = utils.create_rolling_window(
+ data=x_test,
+ output_col=self.output_col,
+ input_seq_len=self.input_seq_len,
+ output_seq_len=self.output_seq_len,
+ num_features=self.num_features,
+ stride=self.stride)
+
+ # Dataloader
+ data_loader = {}
+ data_loader["train"] = (x_train_rolled, y_train_rolled)
+ data_loader["test"] = self.create_data_loader(raw_input=x_test_rolled,
+ raw_output=y_test_rolled)
+ # Store normalization parameters
+ data_loader["x_norm_param_1"] = x_mean
+ data_loader["x_norm_param_2"] = x_std
+ data_loader["y_norm_param_1"] = x_mean[self.output_col]
+ data_loader["y_norm_param_2"] = x_std[self.output_col]
+
+        # NOTE: Datetimes are saved for visualization purposes
+ data_loader["datetime_train"] = [
+ np.datetime64(date) for date in np.squeeze(datetime_train)
+ ]
+ data_loader["datetime_test"] = [
+ np.datetime64(date) for date in np.squeeze(datetime_test)
+ ]
+
+ return data_loader
+
+class ClassificationDataloader(DataloaderBase):
+ """Data loader for csv dataset for classification"""
+
+ def __init__(self, batch_size: int) -> None:
+ super().__init__(batch_size)
+
+ def process_data(self, x_train_file: str, y_train_file: str,
+ x_test_file: str, y_test_file: str) -> dict:
+ """Process data from the csv file"""
+
+ utils = Utils()
+ num_train_images = 50000
+ num_test_images = 10000
+
+ # Load data
+ train_images = self.load_data_from_csv(x_train_file)
+ y_train = self.load_data_from_csv(y_train_file)
+ test_images = self.load_data_from_csv(x_test_file)
+ y_test = self.load_data_from_csv(y_test_file)
+ train_labels = np.argmax(y_train, axis=1)
+ test_labels = np.argmax(y_test, axis=1)
+
+ y_train, y_train_idx, num_enc_obs = utils.label_to_obs(
+ labels=train_labels, num_classes=10)
+ x_mean, x_std = self.normalizer.compute_mean_std(train_images)
+ x_std = 1
+
+ # Normalizer
+ x_train = self.normalizer.standardize(data=train_images,
+ mu=x_mean,
+ std=x_std)
+ x_test = self.normalizer.standardize(data=test_images,
+ mu=x_mean,
+ std=x_std)
+
+
+ y_train = y_train.reshape((num_train_images, num_enc_obs))
+ y_train_idx = y_train_idx.reshape((num_train_images, num_enc_obs))
+ x_train = x_train.reshape((num_train_images, 32, 32, 3))
+ x_test = x_test.reshape((num_test_images, 32, 32, 3))
+
+ # Data loader
+ data_loader = {}
+ data_loader["train"] = (x_train, y_train, y_train_idx, train_labels)
+ data_loader["test"] = self.create_data_loader(raw_input=x_test,
+ raw_output=test_labels)
+ data_loader["x_norm_param_1"] = x_mean
+ data_loader["x_norm_param_2"] = x_std
+
+ return data_loader
diff --git a/code/derivative_regression_runner.py b/code/derivative_regression_runner.py
new file mode 100644
index 0000000..46b67da
--- /dev/null
+++ b/code/derivative_regression_runner.py
@@ -0,0 +1,44 @@
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import DervMLP
+from python_examples.regression import Regression
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_inputs = 1
+ num_outputs = 1
+ num_epochs = 50
+ x_train_file = "./data/toy_example/derivative_x_train_1D.csv"
+ y_train_file = "./data/toy_example/derivative_y_train_1D.csv"
+ x_test_file = "./data/toy_example/derivative_x_test_1D.csv"
+ y_test_file = "./data/toy_example/derivative_y_test_1D.csv"
+
+ # Model
+ net_prop = DervMLP()
+
+ # Data loader
+ reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ viz = PredictionViz(task_name="derivative", data_name="toy1D")
+ reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+ reg_task.train()
+ reg_task.compute_derivatives(
+ layer=0,
+ truth_derv_file="./data/toy_example/derivative_dy_test_1D.csv")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/electricity_time_series_runner.py b/code/electricity_time_series_runner.py
new file mode 100644
index 0000000..fffff4b
--- /dev/null
+++ b/code/electricity_time_series_runner.py
@@ -0,0 +1,75 @@
+from visualizer import PredictionViz
+
+from python_examples.data_loader import TimeSeriesDataloader
+from python_examples.model import TimeSeriesLSTM
+from python_examples.time_series_forecaster import TimeSeriesForecaster
+from pytagi import load_param_from_files
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_epochs = 50
+ output_col = [0]
+ num_features = 1
+ input_seq_len = 5
+ output_seq_len = 1
+ seq_stride = 1
+ x_train_file = "./data/UCI/Electricity/x_train_file.csv"
+ datetime_train_file = "./data/UCI/Electricity/datetime_train_file.csv"
+ x_test_file = "./data/UCI/Electricity/x_test_file.csv"
+ datetime_test_file = "./data/UCI/Electricity/datetime_test_file.csv"
+
+ '''
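+    Uncomment this section, and the `param` argument passed to the TimeSeriesForecaster object below, in order to load pretrained parameters.
+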
+ # Load pretrained weights and biases
+ mw_file = "./saved_param/lstm_demo_2lstm_1_mw.csv"
+ Sw_file = "./saved_param/lstm_demo_2lstm_2_Sw.csv"
+ mb_file = "./saved_param/lstm_demo_2lstm_3_mb.csv"
+ Sb_file = "./saved_param/lstm_demo_2lstm_4_Sb.csv"
+ mw_sc_file = "./saved_param/lstm_demo_2lstm_5_mw_sc.csv"
+ Sw_sc_file = "./saved_param/lstm_demo_2lstm_6_Sw_sc.csv"
+ mb_sc_file = "./saved_param/lstm_demo_2lstm_7_mb_sc.csv"
+ Sb_sc_file = "./saved_param/lstm_demo_2lstm_8_Sb_sc.csv"
+ param = load_param_from_files(mw_file=mw_file,
+ Sw_file=Sw_file,
+ mb_file=mb_file,
+ Sb_file=Sb_file,
+ mw_sc_file=mw_sc_file,
+ Sw_sc_file=Sw_sc_file,
+ mb_sc_file=mb_sc_file,
+ Sb_sc_file=Sb_sc_file)
+ '''
+
+ # Model
+ net_prop = TimeSeriesLSTM(input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ seq_stride=seq_stride)
+
+ # Data loader
+ ts_data_loader = TimeSeriesDataloader(batch_size=net_prop.batch_size,
+ output_col=output_col,
+ input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ num_features=num_features,
+ stride=seq_stride)
+ data_loader = ts_data_loader.process_data(
+ x_train_file=x_train_file,
+ datetime_train_file=datetime_train_file,
+ x_test_file=x_test_file,
+ datetime_test_file=datetime_test_file)
+
+    # Visualizer
+ viz = PredictionViz(task_name="forecasting", data_name="Global active power")
+
+ # Train and test
+ reg_task = TimeSeriesForecaster(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ #param=param,
+ viz=viz)
+ reg_task.train()
+ reg_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/fullcov_regression_runner.py b/code/fullcov_regression_runner.py
new file mode 100644
index 0000000..3c5ac6e
--- /dev/null
+++ b/code/fullcov_regression_runner.py
@@ -0,0 +1,42 @@
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import FullCovMLP
+from python_examples.regression import Regression
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_inputs = 1
+ num_outputs = 1
+ num_epochs = 50
+ x_train_file = "./data/toy_example/x_train_1D_full_cov.csv"
+ y_train_file = "./data/toy_example/y_train_1D_full_cov.csv"
+ x_test_file = "./data/toy_example/x_test_1D_full_cov.csv"
+ y_test_file = "./data/toy_example/y_test_1D_full_cov.csv"
+
+ # Model
+ net_prop = FullCovMLP()
+
+ # Data loader
+ reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ viz = PredictionViz(task_name="full_cov_regression", data_name="toy1D")
+ reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+ reg_task.train()
+ reg_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/heteros_regression_runner.py b/code/heteros_regression_runner.py
new file mode 100644
index 0000000..d6ab03e
--- /dev/null
+++ b/code/heteros_regression_runner.py
@@ -0,0 +1,42 @@
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import HeterosMLP
+from python_examples.regression import Regression
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_inputs = 1
+ num_outputs = 1
+ num_epochs = 50
+ x_train_file = "./data/toy_example/x_train_1D_noise_inference.csv"
+ y_train_file = "./data/toy_example/y_train_1D_noise_inference.csv"
+ x_test_file = "./data/toy_example/x_test_1D_noise_inference.csv"
+ y_test_file = "./data/toy_example/y_test_1D_noise_inference.csv"
+
+ # Model
+ net_prop = HeterosMLP()
+
+ # Data loader
+ reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ viz = PredictionViz(task_name="heteros_regression", data_name="toy1D")
+ reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+ reg_task.train()
+ reg_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/model.py b/code/model.py
new file mode 100644
index 0000000..ed744db
--- /dev/null
+++ b/code/model.py
@@ -0,0 +1,252 @@
+###############################################################################
+# File: model.py
+# Description: Different examples of how to build models in pytagi
+# Authors: Luong-Ha Nguyen & James-A. Goulet
+# Created: October 12, 2022
+# Updated: March 12, 2023
+# Contact: luongha.nguyen@gmail.com & james.goulet@polymtl.ca
+# Copyright (c) 2022 Luong-Ha Nguyen & James-A. Goulet. Some rights reserved.
+###############################################################################
+from pytagi import NetProp
+
+
+class RegressionMLP(NetProp):
+ """Multi-layer perceptron for regression task"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1]
+ self.nodes = [1, 50, 1]
+ self.activations = [0, 4, 0]
+ self.batch_size = 4
+ self.sigma_v = 0.06
+ self.sigma_v_min: float = 0.06
+ self.device = "cpu"
+
+
+class HeterosMLP(NetProp):
+ """Multi-layer preceptron for regression task where the
+ output's noise varies overtime"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [1, 100, 100, 2] # output layer = [mean, std]
+ self.activations: list = [0, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 0
+ self.sigma_v_min: float = 0
+ self.noise_type: str = "heteros"
+ self.noise_gain: float = 1.0
+ self.init_method: str = "He"
+ self.device: str = "cpu"
+
+
+class DervMLP(NetProp):
+ """Multi-layer perceptron for computing the derivative of a
+ regression task"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [1, 64, 64, 1]
+ self.activations: list = [0, 1, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 0.3
+ self.sigma_v_min: float = 0.1
+ self.decay_factor_sigma_v: float = 0.99
+ self.collect_derivative: bool = True
+ self.init_method: str = "He"
+
+
+class FullCovMLP(NetProp):
+ """Multi-layer perceptron for performing full-covariance prediction and
+ inference"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [1, 30, 30, 1]
+ self.activations: list = [0, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 0.5
+ self.sigma_v_min: float = 0.065
+ self.decay_factor_sigma_v: float = 0.95
+ self.sigma_x: float = 0.3485
+ self.is_full_cov: bool = True
+ self.multithreading: bool = True
+ self.device: str = "cpu"
+
+
+class MnistMLP(NetProp):
+ """Multi-layer perceptron for mnist classificaiton.
+
+ NOTE: The number of hidden states for last layer is 11 because
+ TAGI use the hierarchical softmax for the classification task.
+ Further details can be found in
+ https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf
+ """
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1, 1]
+ self.nodes = [784, 100, 100, 11]
+ self.activations = [0, 7, 7, 12]
+ self.batch_size = 100
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cpu"
+
+class ConvMLP(NetProp):
+ """Multi-layer perceptron for mnist classificaiton.
+
+ NOTE: The number of hidden states for last layer is 11 because
+ TAGI use the hierarchical softmax for the classification task.
+ Further details can be found in
+ https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf
+ """
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 4, 2, 4, 1, 1]
+ self.nodes = [784, 0, 0, 0, 0, 150, 11]
+ self.kernels = [4, 3, 5, 3, 1, 1, 1]
+ self.strides = [1, 1, 2, 1, 2, 0, 0]
+ self.widths = [28, 27, 13, 9, 4, 1, 1]
+ self.heights = [28, 27, 13, 9, 4, 1, 1]
+ self.filters = [1, 32, 32, 64, 64, 0, 1]
+ self.pads = [0, 1, 0, 0, 0, 0, 0]
+ self.pad_types = [0, 1, 0, 0, 0, 0, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cuda"
+
+class ConvBatchNormMLP(NetProp):
+ """Multi-layer perceptron for mnist classificaiton."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 6, 4, 2, 6, 4, 1, 1]
+ self.nodes = [784, 0, 0, 0, 0, 0, 0, 150, 11]
+ self.kernels = [4, 3, 1, 5, 3, 1, 1, 1, 1]
+ self.strides = [1, 1, 1, 2, 1, 1, 2, 0, 0]
+ self.widths = [28, 27, 27, 13, 9, 9, 4, 1, 1]
+ self.heights = [28, 27, 27, 13, 9, 9, 4, 1, 1]
+ self.filters = [1, 32, 32, 32, 64, 64, 64, 0, 1]
+ self.pads = [0, 1, 0, 0, 0, 0, 0, 0, 0]
+ self.pad_types = [0, 1, 0, 0, 0, 0, 0, 0, 0]
+ self.activations = [0, 4, 0, 0, 4, 0, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cuda"
+
+class ConvCifarMLP(NetProp):
+ """Multi-layer perceptron for cifar classificaiton."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 4, 2, 4, 2, 4, 1, 1]
+ self.nodes = [3072, 0, 0, 0, 0, 0, 0, 64, 11]
+ self.kernels = [3, 5, 3, 5, 3, 5, 3, 1, 1]
+ self.strides = [1, 1, 2, 1, 2, 1, 2, 0, 0]
+ self.widths = [32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.heights = [32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.filters = [3, 32, 32, 32, 32, 64, 64, 64, 1]
+ self.pads = [0, 1, 1, 1, 1, 1, 1, 0, 0]
+ self.pad_types = [0, 2, 1, 2, 1, 2, 1, 0, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cuda"
+
+class SoftmaxMnistMLP(NetProp):
+ """Multi-layer perceptron for mnist classificaiton."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1, 1]
+ self.nodes = [784, 100, 100, 10]
+ self.activations = [0, 4, 4, 11]
+ self.batch_size = 10
+ self.sigma_v = 2
+ self.is_idx_ud = False
+ self.multithreading = True
+ self.device = "cpu"
+
+
+class TimeSeriesLSTM(NetProp):
+ """LSTM for time series forecasting"""
+
+ def __init__(self,
+ input_seq_len: int,
+ output_seq_len: int,
+ seq_stride: int = 1,
+ *args,
+ **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ self.layers: list = [1, 7, 7, 1]
+ self.nodes: list = [1, 5, 5, 1]
+ self.activations: list = [0, 0, 0, 0]
+ self.batch_size: int = 10
+ self.input_seq_len: int = input_seq_len
+ self.output_seq_len: int = output_seq_len
+ self.seq_stride: int = seq_stride
+ self.sigma_v: float = 2
+ self.sigma_v_min: float = 0.3
+ self.decay_factor_sigma_v: float = 0.95
+ self.multithreading: bool = False
+ self.device: str = "cpu"
+
+
+class MnistEncoder(NetProp):
+ """Encoder network for Mnist example"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [2, 2, 6, 4, 2, 6, 4, 1, 1]
+ self.nodes: list = [784, 0, 0, 0, 0, 0, 0, 100, 10]
+ self.kernels: list = [3, 1, 3, 3, 1, 3, 1, 1, 1]
+ self.strides: list = [1, 0, 2, 1, 0, 2, 0, 0, 0]
+ self.widths: list = [28, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.heights: list = [28, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.filters: list = [1, 16, 16, 16, 32, 32, 32, 1, 1]
+ self.pads: list = [1, 0, 1, 1, 0, 1, 0, 0, 0]
+ self.pad_types: list = [1, 0, 2, 1, 0, 2, 0, 0, 0]
+ self.activations: list = [0, 4, 0, 0, 4, 0, 0, 4, 0]
+ self.batch_size: int = 10
+ self.is_output_ud: bool = False
+ self.init_method: str = "He"
+ self.device: str = "cuda"
+
+
+class MnistDecoder(NetProp):
+ """Decoder network for Mnist example"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 21, 21, 21]
+ self.nodes: list = [10, 1568, 0, 0, 784]
+ self.kernels: list = [1, 3, 3, 3, 1]
+ self.strides: list = [0, 2, 2, 1, 0]
+ self.widths: list = [0, 7, 0, 0, 0]
+ self.heights: list = [0, 7, 0, 0, 0]
+ self.filters: list = [1, 32, 32, 16, 1]
+ self.pads: list = [0, 1, 1, 1, 0]
+ self.pad_types: list = [0, 2, 2, 1, 0]
+ self.activations: list = [0, 4, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 8
+ self.sigma_v_min: float = 2
+ self.is_idx_ud: bool = False
+ self.last_backward_layer: int = 0
+ self.decay_factor_sigma_v: float = 0.95
+ self.init_method: str = "He"
+ self.device: str = "cuda"
diff --git a/code/regression.py b/code/regression.py
new file mode 100644
index 0000000..74d7e02
--- /dev/null
+++ b/code/regression.py
@@ -0,0 +1,232 @@
+###############################################################################
+# File: regression.py
+# Description: Example of regression task using pytagi
+# Authors: Luong-Ha Nguyen & James-A. Goulet
+# Created: October 12, 2022
+# Updated: November 07, 2022
+# Contact: luongha.nguyen@gmail.com & james.goulet@polymtl.ca
+# Copyright (c) 2022 Luong-Ha Nguyen & James-A. Goulet. Some rights reserved.
+###############################################################################
+from typing import Union, Tuple
+
+import numpy as np
+import pandas as pd
+from tqdm import tqdm
+
+import pytagi.metric as metric
+from pytagi import NetProp, TagiNetwork
+from pytagi import Normalizer as normalizer
+from pytagi import Utils, exponential_scheduler
+from visualizer import PredictionViz
+
+
+class Regression:
+ """Regression task using TAGI"""
+
+ utils: Utils = Utils()
+
+ def __init__(self,
+ num_epochs: int,
+ data_loader: dict,
+ net_prop: NetProp,
+ dtype=np.float32,
+ viz: Union[PredictionViz, None] = None) -> None:
+ self.num_epochs = num_epochs
+ self.data_loader = data_loader
+ self.net_prop = net_prop
+ self.network = TagiNetwork(self.net_prop)
+ self.dtype = dtype
+ self.viz = viz
+
+ def train(self) -> None:
+ """Train the network using TAGI"""
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ # Outputs
+ V_batch, ud_idx_batch = self.init_outputs(batch_size)
+
+ input_data, output_data = self.data_loader["train"]
+ num_data = input_data.shape[0]
+ num_iter = int(num_data / batch_size)
+ pbar = tqdm(range(self.num_epochs))
+ for epoch in pbar:
+ # Decaying observation's variance
+ self.net_prop.sigma_v = exponential_scheduler(
+ curr_v=self.net_prop.sigma_v,
+ min_v=self.net_prop.sigma_v_min,
+ decaying_factor=self.net_prop.decay_factor_sigma_v,
+ curr_iter=epoch)
+ V_batch = V_batch * 0.0 + self.net_prop.sigma_v**2
+
+ for i in range(num_iter):
+ # Get data
+ idx = np.random.choice(num_data, size=batch_size)
+ x_batch = input_data[idx, :]
+ y_batch = output_data[idx, :]
+
+ # Feed forward
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+
+ # Update hidden states
+ self.network.state_feed_backward(y_batch, V_batch,
+ ud_idx_batch)
+
+ # Update parameters
+ self.network.param_feed_backward()
+
+ # Loss
+ norm_pred, _ = self.network.get_network_predictions()
+ pred = normalizer.unstandardize(
+ norm_data=norm_pred,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+ obs = normalizer.unstandardize(
+ norm_data=y_batch,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+ mse = metric.mse(pred, obs)
+ pbar.set_description(
+ f"Epoch# {epoch: 0}|{i * batch_size + len(x_batch):>5}|{num_data: 1}\t mse: {mse:>7.2f}"
+ )
+
+ def predict(self, std_factor: int = 1) -> None:
+ """Make prediction using TAGI"""
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ mean_predictions = []
+ variance_predictions = []
+ y_test = []
+ x_test = []
+ for x_batch, y_batch in self.data_loader["test"]:
+            # Predictions
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+ ma, Sa = self.network.get_network_predictions()
+
+ mean_predictions.append(ma)
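+            # Total predictive variance = output variance + observation noise variance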
+ variance_predictions.append(Sa + self.net_prop.sigma_v**2)
+ x_test.append(x_batch)
+ y_test.append(y_batch)
+
+ mean_predictions = np.stack(mean_predictions).flatten()
+ std_predictions = (np.stack(variance_predictions).flatten())**0.5
+ y_test = np.stack(y_test).flatten()
+ x_test = np.stack(x_test).flatten()
+
+ # Unnormalization
+ mean_predictions = normalizer.unstandardize(
+ norm_data=mean_predictions,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+ std_predictions = normalizer.unstandardize_std(
+ norm_std=std_predictions, std=self.data_loader["y_norm_param_2"])
+
+ x_test = normalizer.unstandardize(
+ norm_data=x_test,
+ mu=self.data_loader["x_norm_param_1"],
+ std=self.data_loader["x_norm_param_2"])
+ y_test = normalizer.unstandardize(
+ norm_data=y_test,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+
+ # Compute log-likelihood
+ mse = metric.mse(mean_predictions, y_test)
+ log_lik = metric.log_likelihood(prediction=mean_predictions,
+ observation=y_test,
+ std=std_predictions)
+
+ error_rate = metric.error_rate(mean_predictions, y_test)
+
+ # Visualization
+ if self.viz is not None:
+ self.viz.plot_predictions(
+ x_train=None,
+ y_train=None,
+ x_test=x_test,
+ y_test=y_test,
+ y_pred=mean_predictions,
+ sy_pred=std_predictions,
+ std_factor=std_factor,
+ label="diag",
+ title="Diagonal covariance",
+ )
+
+ print("#############")
+ print(f"MSE : {mse: 0.2f}")
+ print(f"Log-likelihood: {log_lik: 0.2f}")
+ print(f"Error rate : {error_rate: 0.2f}")
+
+ def compute_derivatives(self,
+ layer: int = 0,
+ truth_derv_file: Union[None, str] = None) -> None:
+ """Compute dervative of a given layer"""
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ mean_derv = []
+ variance_derv = []
+ x_test = []
+ for x_batch, _ in self.data_loader["test"]:
+            # Predictions
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+ mdy, vdy = self.network.get_derivatives(layer)
+
+ mean_derv.append(mdy)
+ variance_derv.append(vdy)
+ x_test.append(x_batch)
+
+ mean_derv = np.stack(mean_derv).flatten()
+ std_derv = (np.stack(variance_derv).flatten())**0.5
+ x_test = np.stack(x_test).flatten()
+
+ # Unnormalization
+ x_test = normalizer.unstandardize(
+ norm_data=x_test,
+ mu=self.data_loader["x_norm_param_1"],
+ std=self.data_loader["x_norm_param_2"])
+
+ if truth_derv_file is not None:
+ truth_dev_test = pd.read_csv(truth_derv_file,
+ skiprows=1,
+ delimiter=",",
+ header=None)
+ self.viz.plot_predictions(
+ x_train=None,
+ y_train=None,
+ x_test=x_test,
+ y_test=truth_dev_test.values,
+ y_pred=mean_derv,
+ sy_pred=std_derv,
+ std_factor=3,
+ label="deriv",
+ title="Neural Network's Derivative",
+ )
+
+ def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for inputs"""
+ Sx_batch = np.zeros((batch_size, self.net_prop.nodes[0]),
+ dtype=self.dtype)
+
+ Sx_f_batch = np.array([], dtype=self.dtype)
+ if self.net_prop.is_full_cov:
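+            # Full-covariance mode: pass the upper-triangular input covariance built from sigma_x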
+ Sx_f_batch = self.utils.get_upper_triu_cov(
+ batch_size=batch_size,
+ num_data=self.net_prop.nodes[0],
+ sigma=self.net_prop.sigma_x)
+ Sx_batch = Sx_batch + self.net_prop.sigma_x**2
+
+ return Sx_batch, Sx_f_batch
+
+ def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for outputs"""
+ # Outputs
+ V_batch = np.zeros((batch_size, self.net_prop.nodes[-1]),
+ dtype=self.dtype) + self.net_prop.sigma_v**2
+ ud_idx_batch = np.zeros((batch_size, 0), dtype=np.int32)
+
+ return V_batch, ud_idx_batch
diff --git a/code/regression_runner.py b/code/regression_runner.py
new file mode 100644
index 0000000..403514d
--- /dev/null
+++ b/code/regression_runner.py
@@ -0,0 +1,48 @@
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import RegressionMLP
+from python_examples.regression import Regression
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_inputs = 1
+ num_outputs = 1
+ num_epochs = 50
+ x_train_file = "./data/toy_example/x_train_1D.csv"
+ y_train_file = "./data/toy_example/y_train_1D.csv"
+ x_test_file = "./data/toy_example/x_test_1D.csv"
+ y_test_file = "./data/toy_example/y_test_1D.csv"
+
+ # Model
+    net_prop = RegressionMLP()
+
+ # Data loader
+ reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ viz = PredictionViz(task_name="regression", data_name="toy1D")
+ reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+ reg_task.train()
+ reg_task.predict(std_factor=3)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/resnet_cifar10_classification_runner.py b/code/resnet_cifar10_classification_runner.py
new file mode 100644
index 0000000..bc23bd8
--- /dev/null
+++ b/code/resnet_cifar10_classification_runner.py
@@ -0,0 +1,60 @@
+from python_examples.classification import Classifier
+from python_examples.data_loader import ClassificationDataloader
+from pytagi import NetProp
+
+class ConvCifarMLP(NetProp):
+ """Multi-layer perceptron for cifar classificaiton."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ #------------------- #Input-----------------#Stage1-------------------------#Stage2-------------------------#Stage3-------------------------#Stage4-------------------------#Output---
+ self.layers = [2, 2, 4, 2, 4, 2, 4, 2, 4, 2, 4, 1, 1]
+ self.nodes = [3072, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 256, 11]
+ self.kernels = [7, 1, 3, 1, 3, 1, 3, 1, 3, 1, 3, 1, 1]
+ self.strides = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0]
+ self.widths = [128, 128, 64, 64, 32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.heights = [128, 128, 64, 64, 32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.filters = [3, 32, 32, 32, 32, 64, 64, 128, 128, 256, 256, 1, 1]
+ self.pads = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]
+ self.pad_types = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 12]
+ self.shortcuts = [-1, 3, -1, 6, -1, -1, -1, 7, -1, -1, -1, -1, -1]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.sigma_v_min = 0.2
+ self.decay_factor_sigma_v = 0.975
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.init_method: str = "He"
+ self.device = "cuda"
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_epochs = 1
+ x_train_file = "./data/cifar10/x_train.csv"
+ y_train_file = "./data/cifar10/y_train.csv"
+ x_test_file = "./data/cifar10/x_test.csv"
+ y_test_file = "./data/cifar10/y_test.csv"
+
+ # Model
+ net_prop = ConvCifarMLP()
+
+ # Data loader
+ reg_data_loader = ClassificationDataloader(batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ clas_task = Classifier(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ num_classes=10)
+ clas_task.train()
+ clas_task.predict()
+
+
+if __name__ == "__main__":
+ main()
\ No newline at end of file
diff --git a/code/time_series_forecaster.py b/code/time_series_forecaster.py
new file mode 100644
index 0000000..fff70b1
--- /dev/null
+++ b/code/time_series_forecaster.py
@@ -0,0 +1,174 @@
+###############################################################################
+# File: time_series_forecaster.py
+# Description: Example of time series forecasting
+# Authors: Luong-Ha Nguyen & James-A. Goulet
+# Created: October 26, 2022
+# Updated: November 12, 2022
+# Contact: luongha.nguyen@gmail.com & james.goulet@polymtl.ca
+# Copyright (c) 2022 Luong-Ha Nguyen & James-A. Goulet. Some rights reserved.
+###############################################################################
+from typing import Union, Tuple
+
+import numpy as np
+import pytagi.metric as metric
+from pytagi import NetProp, Param, TagiNetwork
+from pytagi import Normalizer as normalizer
+from pytagi import exponential_scheduler
+from tqdm import tqdm
+from visualizer import PredictionViz
+
+
+class TimeSeriesForecaster:
+ """Time series forecaster using TAGI"""
+
+ def __init__(self,
+ num_epochs: int,
+ data_loader: dict,
+ net_prop: NetProp,
+ param: Union[Param, None] = None,
+ viz: Union[PredictionViz, None] = None,
+ dtype=np.float32) -> None:
+ self.num_epochs = num_epochs
+ self.data_loader = data_loader
+ self.net_prop = net_prop
+ self.network = TagiNetwork(self.net_prop)
+ self.viz = viz
+ self.dtype = dtype
+ if param is not None:
+ self.network.set_parameters(param=param)
+
+ def train(self) -> None:
+ """Train LSTM network"""
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ # Outputs
+ V_batch, ud_idx_batch = self.init_outputs(batch_size)
+
+ input_data, output_data = self.data_loader["train"]
+ num_data = input_data.shape[0]
+ num_iter = int(num_data / batch_size)
+ pbar = tqdm(range(self.num_epochs))
+ for epoch in pbar:
+ # Decaying observation's variance
+ self.net_prop.sigma_v = exponential_scheduler(
+ curr_v=self.net_prop.sigma_v,
+ min_v=self.net_prop.sigma_v_min,
+ decaying_factor=self.net_prop.decay_factor_sigma_v,
+ curr_iter=epoch)
+ V_batch = V_batch * 0.0 + self.net_prop.sigma_v**2
+
+ for i in range(num_iter):
+ # Get data
+ idx = np.random.choice(num_data, size=batch_size)
+ x_batch = input_data[idx, :]
+ y_batch = output_data[idx, :]
+
+ # Feed forward
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+
+ # Update hidden states
+ self.network.state_feed_backward(y_batch, V_batch,
+ ud_idx_batch)
+
+ # Update parameters
+ self.network.param_feed_backward()
+
+ # Loss
+ norm_pred, _ = self.network.get_network_predictions()
+ pred = normalizer.unstandardize(
+ norm_data=norm_pred,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+ obs = normalizer.unstandardize(
+ norm_data=y_batch,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+ mse = metric.mse(pred, obs)
+
+ # Progress bar
+ pbar.set_description(
+ f"Epoch# {epoch: 0}|{i * batch_size + len(x_batch):>5}|{num_data: 1}\t mse: {mse:>7.2f}"
+ )
+
+ def predict(self) -> None:
+ """Make prediction for time series using TAGI"""
+ # Inputs
+ batch_size = self.net_prop.batch_size
+ Sx_batch, Sx_f_batch = self.init_inputs(batch_size)
+
+ mean_predictions = []
+ variance_predictions = []
+ y_test = []
+ x_test = []
+ for x_batch, y_batch in self.data_loader["test"]:
+            # Predictions
+ self.network.feed_forward(x_batch, Sx_batch, Sx_f_batch)
+ ma, Sa = self.network.get_network_predictions()
+
+ mean_predictions.append(ma)
+ variance_predictions.append(Sa + self.net_prop.sigma_v**2)
+ x_test.append(x_batch)
+ y_test.append(y_batch)
+
+ mean_predictions = np.stack(mean_predictions).flatten()
+ std_predictions = (np.stack(variance_predictions).flatten())**0.5
+ y_test = np.stack(y_test).flatten()
+ x_test = np.stack(x_test).flatten()
+
+ mean_predictions = normalizer.unstandardize(
+ norm_data=mean_predictions,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+
+ std_predictions = normalizer.unstandardize_std(
+ norm_std=std_predictions, std=self.data_loader["y_norm_param_2"])
+
+ y_test = normalizer.unstandardize(
+ norm_data=y_test,
+ mu=self.data_loader["y_norm_param_1"],
+ std=self.data_loader["y_norm_param_2"])
+
+ # Compute log-likelihood
+ mse = metric.mse(mean_predictions, y_test)
+ log_lik = metric.log_likelihood(prediction=mean_predictions,
+ observation=y_test,
+ std=std_predictions)
+
+ # Visualization
+ if self.viz is not None:
+ self.viz.plot_predictions(
+ x_train=None,
+ y_train=None,
+ x_test=self.data_loader["datetime_test"][:len(y_test)],
+ y_test=y_test,
+ y_pred=mean_predictions,
+ sy_pred=std_predictions,
+ std_factor=1,
+ label="time_series_forecasting",
+ title=r"\textbf{Time Series Forecasting}",
+ time_series=True)
+
+ print("#############")
+ print(f"MSE : {mse: 0.2f}")
+ print(f"Log-likelihood: {log_lik: 0.2f}")
+
+ def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for inputs"""
+ Sx_batch = np.zeros(
+ (batch_size * self.net_prop.input_seq_len, self.net_prop.nodes[0]),
+ dtype=self.dtype)
+
+ Sx_f_batch = np.array([], dtype=self.dtype)
+
+ return Sx_batch, Sx_f_batch
+
+ def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initnitalize the covariance matrix for outputs"""
+ # Outputs
+ V_batch = np.zeros((batch_size, self.net_prop.nodes[-1]),
+ dtype=self.dtype) + self.net_prop.sigma_v**2
+ ud_idx_batch = np.zeros((batch_size, 0), dtype=np.int32)
+
+ return V_batch, ud_idx_batch
diff --git a/code/time_series_runner.py b/code/time_series_runner.py
new file mode 100644
index 0000000..40553f7
--- /dev/null
+++ b/code/time_series_runner.py
@@ -0,0 +1,78 @@
+from visualizer import PredictionViz
+
+from python_examples.data_loader import TimeSeriesDataloader
+from python_examples.model import TimeSeriesLSTM
+from python_examples.time_series_forecaster import TimeSeriesForecaster
+from pytagi import load_param_from_files
+
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_epochs = 50
+ output_col = [0]
+ num_features = 1
+ input_seq_len = 5
+ output_seq_len = 1
+ seq_stride = 1
+ x_train_file = "./data/toy_time_series/x_train_sin_data.csv"
+ datetime_train_file = "./data/toy_time_series/train_sin_datetime.csv"
+ x_test_file = "./data/toy_time_series/x_test_sin_data.csv"
+ datetime_test_file = "./data/toy_time_series/test_sin_datetime.csv"
+
+ '''
+    Uncomment this section, and the `param` argument passed to the TimeSeriesForecaster object below, in order to load pretrained parameters.
+ You can generate them by running build/main cfg_time_series_2lstm.txt (and setting load_param to false)
+
+ # Load pretrained weights and biases
+ mw_file = "./saved_param/lstm_demo_2lstm_1_mw.csv"
+ Sw_file = "./saved_param/lstm_demo_2lstm_2_Sw.csv"
+ mb_file = "./saved_param/lstm_demo_2lstm_3_mb.csv"
+ Sb_file = "./saved_param/lstm_demo_2lstm_4_Sb.csv"
+ mw_sc_file = "./saved_param/lstm_demo_2lstm_5_mw_sc.csv"
+ Sw_sc_file = "./saved_param/lstm_demo_2lstm_6_Sw_sc.csv"
+ mb_sc_file = "./saved_param/lstm_demo_2lstm_7_mb_sc.csv"
+ Sb_sc_file = "./saved_param/lstm_demo_2lstm_8_Sb_sc.csv"
+ param = load_param_from_files(mw_file=mw_file,
+ Sw_file=Sw_file,
+ mb_file=mb_file,
+ Sb_file=Sb_file,
+ mw_sc_file=mw_sc_file,
+ Sw_sc_file=Sw_sc_file,
+ mb_sc_file=mb_sc_file,
+ Sb_sc_file=Sb_sc_file)
+ '''
+
+ # Model
+ net_prop = TimeSeriesLSTM(input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ seq_stride=seq_stride)
+
+ # Data loader
+ ts_data_loader = TimeSeriesDataloader(batch_size=net_prop.batch_size,
+ output_col=output_col,
+ input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ num_features=num_features,
+ stride=seq_stride)
+ data_loader = ts_data_loader.process_data(
+ x_train_file=x_train_file,
+ datetime_train_file=datetime_train_file,
+ x_test_file=x_test_file,
+ datetime_test_file=datetime_test_file)
+
+    # Visualizer
+ viz = PredictionViz(task_name="forecasting", data_name="sin_signal")
+
+ # Train and test
+ reg_task = TimeSeriesForecaster(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ #param=param,
+ viz=viz)
+ reg_task.train()
+ reg_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/code/uci_heteros_regression_runner.py b/code/uci_heteros_regression_runner.py
new file mode 100644
index 0000000..ae1cccc
--- /dev/null
+++ b/code/uci_heteros_regression_runner.py
@@ -0,0 +1,54 @@
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.regression import Regression
+from pytagi import NetProp
+
+class HeterosUCIMLP(NetProp):
+ """Multi-layer preceptron for regression task where the
+ output's noise varies overtime"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [13, 50, 50, 2] # output layer = [mean, std]
+ self.activations: list = [0, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 2
+ self.sigma_v_min: float = 0.3
+ self.noise_type: str = "heteros"
+ self.noise_gain: float = 1.0
+ self.init_method: str = "He"
+ self.device: str = "cpu"
+
+def main():
+ """Training and testing API"""
+ # User-input
+ num_inputs = 1
+ num_outputs = 1
+ num_epochs = 50
+ x_train_file = "./data/toy_example/x_train_1D_noise_inference.csv"
+ y_train_file = "./data/toy_example/y_train_1D_noise_inference.csv"
+ x_test_file = "./data/toy_example/x_test_1D_noise_inference.csv"
+ y_test_file = "./data/toy_example/y_test_1D_noise_inference.csv"
+
+ # Model
+ net_prop = HeterosUCIMLP()
+
+ # Data loader
+ reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+ data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+
+ # Train and test
+ reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop)
+ reg_task.train()
+ reg_task.predict()
+
+
+if __name__ == "__main__":
+ main()
diff --git a/examples/.DS_Store b/examples/.DS_Store
new file mode 100644
index 0000000..9eab516
Binary files /dev/null and b/examples/.DS_Store differ
diff --git a/examples/_sidebar.md b/examples/_sidebar.md
new file mode 100644
index 0000000..20547d4
--- /dev/null
+++ b/examples/_sidebar.md
@@ -0,0 +1,4 @@
+ - [**FNN examples**](examples/fnn/fnn-examples.md)
+ - [**CNN examples**](examples/cnn/cnn-examples.md)
+ - [**LSTM examples**](examples/lstm/lstm-examples.md)
+ - [**Autoencoders examples**](examples/ae/autoencoder-examples.md)
diff --git a/examples/ae/_sidebar.md b/examples/ae/_sidebar.md
new file mode 100644
index 0000000..195590e
--- /dev/null
+++ b/examples/ae/_sidebar.md
@@ -0,0 +1,2 @@
+- [Autoencoder MNIST](examples/ae/autoencoder-mnist.md)
+- [Autoencoder CIFAR10](examples/ae/autoencoder-cifar10.md)
\ No newline at end of file
diff --git a/examples/ae/autoencoder-cifar10.md b/examples/ae/autoencoder-cifar10.md
new file mode 100644
index 0000000..8aaf915
--- /dev/null
+++ b/examples/ae/autoencoder-cifar10.md
@@ -0,0 +1,164 @@
+# Autoencoder CIFAR10
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/05
+**Description:** This example shows how to train an autoencoder to reconstruct the CIFAR10 images.
+
+*Github Source code*
+
+---
+
+## 1. Setup
+
+We first import the required modules: the numpy library, the ImageViz visualizer, the Autoencoder class, the ClassificationDataloader, and NetProp, which is used to define the encoder and decoder networks.
+
+```python
+import numpy as np
+from visualizer import ImageViz
+
+from python_examples.autoencoder import Autoencoder
+from python_examples.data_loader import ClassificationDataloader
+from pytagi import NetProp
+```
+
+?> Note that these modules are described [here](modules/modules.md) and that their source code lives in the *python_examples* directory; if your modules are in another directory, you must change these paths.
+
+## 2. Prepare the data
+
+We define the number of epochs, some model properties and the paths to the data. Notice that the data is stored as CSV files and divided into four files: the training images, the training labels, the test images and the test labels.
+
+```python
+# User-input
+num_epochs = 10  # run for 10 epochs
+mu = np.array([0.1309,0.1309,0.1309]) # mean of each input
+sigma = np.array([1,1,1]) # standard deviation of each input
+img_size = np.array([3, 32, 32]) # size of image input
+x_train_file = "./data/cifar10/x_train.csv"
+y_train_file = "./data/cifar10/y_train.csv"
+x_test_file = "./data/cifar10/x_test.csv"
+y_test_file = "./data/cifar10/y_test.csv"
+```
+
+*You can find the data used on the [CIFAR10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html) website.*
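+
+The loader in step 4 below reads the images and labels from CSV files: one flattened 3072-value row per image, one-hot rows for the labels, and a header row that is skipped on load. As a minimal illustrative sketch — the helper name, file locations, and channel ordering are assumptions, not part of pyTAGI — the original CIFAR-10 Python batches could be converted to this layout as follows:
+
+```python
+# Hypothetical conversion: CIFAR-10 Python batches -> CSV files for ClassificationDataloader
+import pickle
+
+import numpy as np
+import pandas as pd
+
+
+def cifar_batches_to_csv(batch_files: list, x_file: str, y_file: str) -> None:
+    """Write flattened images and one-hot labels to CSV (header row included)."""
+    images, labels = [], []
+    for batch_file in batch_files:
+        with open(batch_file, "rb") as f:
+            batch = pickle.load(f, encoding="bytes")
+        images.append(batch[b"data"])                # (N, 3072) uint8, raw CIFAR row layout
+        labels.append(np.asarray(batch[b"labels"]))
+    one_hot = np.eye(10, dtype=int)[np.concatenate(labels)]
+    pd.DataFrame(np.concatenate(images)).to_csv(x_file, index=False)
+    pd.DataFrame(one_hot).to_csv(y_file, index=False)
+
+
+cifar_batches_to_csv([f"./cifar-10-batches-py/data_batch_{i}" for i in range(1, 6)],
+                     x_train_file, y_train_file)
+cifar_batches_to_csv(["./cifar-10-batches-py/test_batch"], x_test_file, y_test_file)
+```
+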
+## 3. Create the model
+
+In this example we will create a model consisting of an encoder and a decoder that allows us to reconstruct the original images from the CIFAR10 dataset. Find out more about the architecture in [Analytically Tractable Inference in Deep Neural Networks](https://arxiv.org/pdf/2103.05461.pdf).
+
+```python
+class CifarEncoder(NetProp):
+ """Encoder network for Cifar10 example"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [2, 2, 6, 4, 2, 6, 4, 1, 1]
+ self.nodes: list = [3072, 0, 0, 0, 0, 0, 0, 100, 10]
+ self.kernels: list = [3, 1, 3, 3, 1, 3, 1, 1, 1]
+ self.strides: list = [1, 0, 2, 1, 0, 2, 0, 0, 0]
+ self.widths: list = [28, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.heights: list = [28, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.filters: list = [3, 16, 16, 16, 32, 32, 32, 1, 1]
+ self.pads: list = [1, 0, 1, 1, 0, 1, 0, 0, 0]
+ self.pad_types: list = [1, 0, 2, 1, 0, 2, 0, 0, 0]
+ self.activations: list = [0, 4, 0, 0, 4, 0, 0, 4, 0]
+ self.batch_size: int = 10
+ self.is_output_ud: bool = False
+ self.init_method: str = "He"
+ self.device: str = "cuda"
+
+
+class CifarDecoder(NetProp):
+ """Decoder network for Cifar10 example"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 21, 21, 21]
+ self.nodes: list = [10, 1568, 0, 0, 3072]
+ self.kernels: list = [1, 3, 3, 3, 1]
+ self.strides: list = [0, 2, 2, 1, 0]
+ self.widths: list = [0, 7, 0, 0, 0]
+ self.heights: list = [0, 7, 0, 0, 0]
+ self.filters: list = [1, 32, 32, 16, 3]
+ self.pads: list = [0, 1, 1, 1, 0]
+ self.pad_types: list = [0, 2, 2, 1, 0]
+ self.activations: list = [0, 4, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 8
+ self.sigma_v_min: float = 2
+ self.is_idx_ud: bool = False
+ self.last_backward_layer: int = 0
+ self.decay_factor_sigma_v: float = 0.95
+ self.init_method: str = "He"
+ self.device: str = "cuda"
+
+```
+
+```python
+# Model
+encoder_prop = CifarEncoder()
+decoder_prop = CifarDecoder()
+```
+
+## 4. Load the data
+
+The next step is to load the data. We will use the [ClassificationDataloader class](modules/data-loader?id=data-loader) to load the data from the CSV files, passing the batch size and the data paths to the class.
+
+```python
+# Data loader
+ae_data_loader = ClassificationDataloader(batch_size=encoder_prop.batch_size)
+data_loader = ae_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create visualizer
+
+In order to visualize the reconstruction of the images we can use the ImageViz class.
+
+```python
+# Visualization
+viz = ImageViz(task_name="autoencoder",
+ data_name="cifar",
+ mu=mu,
+ sigma=sigma,
+ img_size=img_size)
+```
+
+> Learn more about the ImageViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Create the autoencoder object
+
+Once the data is processed, we can create the autoencoder object. We will pass the number of epochs, the data loader, the encoder and decoder properties, and the visualization object to the class.
+
+```python
+# Train and test
+ae_task = Autoencoder(num_epochs=num_epochs,
+ data_loader=data_loader,
+ encoder_prop=encoder_prop,
+ decoder_prop=decoder_prop,
+ viz=viz)
+```
+
+> Find out more about the [Autoencoder class](modules/autoencoder.md).
+
+## 7. Train and evaluate the model
+
+Finally, we can train and evaluate the model. We will call the train and predict methods of the autoencoder object.
+
+```python
+ae_task.train()
+ae_task.predict()
+```
+
+## 8. Results
+
+If we created the visualizer object, we can visualize the results. The following figure shows the reconstructed images.
+
+![autoencoder cifar10](../../images/cifar_autoencoder_disp.png)
diff --git a/examples/ae/autoencoder-examples.md b/examples/ae/autoencoder-examples.md
new file mode 100644
index 0000000..7ec8eb0
--- /dev/null
+++ b/examples/ae/autoencoder-examples.md
@@ -0,0 +1,5 @@
+# Autoencoder examples
+
+- [Autoencoder MNIST](examples/ae/autoencoder-mnist.md)
+
+- [Autoencoder CIFAR10](examples/ae/autoencoder-cifar10.md)
\ No newline at end of file
diff --git a/examples/ae/autoencoder-mnist.md b/examples/ae/autoencoder-mnist.md
new file mode 100644
index 0000000..4aa8bb6
--- /dev/null
+++ b/examples/ae/autoencoder-mnist.md
@@ -0,0 +1,117 @@
+# Autoencoder MNIST
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/05
+**Description:** This example shows how to train an autoencoder to reconstruct the MNIST images.
+
+*Github Source code*
+
+---
+
+## 1. Setup
+
+We first import the required modules: the numpy library, the ImageViz, the autoencoder, the data loader and the encoder/decoder classes.
+
+```python
+import numpy as np
+from visualizer import ImageViz
+
+from python_examples.autoencoder import Autoencoder
+from python_examples.data_loader import MnistDataloader
+from python_examples.model import MnistDecoder, MnistEncoder
+```
+
+?> Note that these modules are described [here](modules/modules.md) and that their source code lives in the *python_examples* directory; if your modules are in another directory, you must change these paths.
+
+## 2. Prepare the data
+
+We define the number of epochs, some model properties and the paths to the data. Notice that the data is in ubyte format and divided into four files: the training images, the training labels, the test images and the test labels.
+
+```python
+# User-input
+num_epochs = 10  # run for 10 epochs
+mu = np.array([0.1309]) # mean of each input
+sigma = np.array([1]) # standard deviation of each input
+img_size = np.array([1, 28, 28]) # size of image input
+x_train_file = "./data/mnist/train-images-idx3-ubyte"
+y_train_file = "./data/mnist/train-labels-idx1-ubyte"
+x_test_file = "./data/mnist/t10k-images-idx3-ubyte"
+y_test_file = "./data/mnist/t10k-labels-idx1-ubyte"
+```
+
+*You can find the data used in the [MNIST data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/mnist) folder of the repository.*
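+
+If you want to sanity-check these ubyte (IDX) files before training, they can be decoded with plain NumPy. The sketch below is only illustrative and is not part of pyTAGI; it assumes the standard IDX layout (a big-endian magic number whose last byte gives the number of dimensions, followed by the dimension sizes and the raw bytes):
+
+```python
+import numpy as np
+
+
+def read_idx(path: str) -> np.ndarray:
+    """Decode an IDX (ubyte) file into a NumPy array."""
+    with open(path, "rb") as f:
+        magic = int(np.frombuffer(f.read(4), dtype=">u4")[0])
+        num_dims = magic & 0xFF  # last byte of the magic number
+        dims = np.frombuffer(f.read(4 * num_dims), dtype=">u4").astype(int)
+        return np.frombuffer(f.read(), dtype=np.uint8).reshape(dims)
+
+
+print(read_idx("./data/mnist/train-images-idx3-ubyte").shape)  # expected: (60000, 28, 28)
+print(read_idx("./data/mnist/train-labels-idx1-ubyte").shape)  # expected: (60000,)
+```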
+
+## 3. Create the model
+
+In this example we will create a model consisting of an encoder and a decoder that allows us to reconstruct the original images. Find out more about the architecture in [Analytically Tractable Inference in Deep Neural Networks](https://arxiv.org/pdf/2103.05461.pdf).
+
+```python
+# Model
+encoder_prop = MnistEncoder()
+decoder_prop = MnistDecoder()
+```
+
+## 4. Load the data
+
+The next step is to load the data. We will use the [MnistDataloader class](modules/data-loader?id=data-loader) to load the MNIST data and we will pass the batch size of the encoder and the data paths to the class.
+
+```python
+# Data loader
+ae_data_loader = MnistDataloader(batch_size=encoder_prop.batch_size)
+data_loader = ae_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create visualizer
+
+In order to visualize the reconstruction of the images we can use the ImageViz class.
+
+```python
+# Visualization
+viz = ImageViz(task_name="autoencoder",
+ data_name="mnist",
+ mu=mu,
+ sigma=sigma,
+ img_size=img_size)
+```
+
+> Learn more about the ImageViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Create the autoencoder object
+
+Once the data is processed, we can create the autoencoder object. We will pass the number of epochs, the data loader, the encoder and decoder properties, and the visualization object to the class.
+
+```python
+# Train and test
+ae_task = Autoencoder(num_epochs=num_epochs,
+ data_loader=data_loader,
+ encoder_prop=encoder_prop,
+ decoder_prop=decoder_prop,
+ viz=viz)
+```
+
+> Find out more about the [Autoencoder class](modules/autoencoder.md).
+
+## 7. Train and evaluate the model
+
+Finally, we can train and evaluate the model. We will call the train and predict methods of the autoencoder object.
+
+```python
+ae_task.train()
+ae_task.predict()
+```
+
+## 8. Results
+
+If we created the visualizer object, we can visualize the results. The following figure shows the reconstructed images.
+
+![autoencoder mnist](../../images/mnist_autoencoder_disp.png)
diff --git a/examples/cnn/2-conv-mnist.md b/examples/cnn/2-conv-mnist.md
new file mode 100644
index 0000000..e4fca9d
--- /dev/null
+++ b/examples/cnn/2-conv-mnist.md
@@ -0,0 +1,101 @@
+# 2-Convolutional MNIST
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/04/21
+**Description:** This example shows how to train a convolutional neural network (CNN) to classify the MNIST dataset.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+We first import the required modules: the classifier, the data loader and the model.
+
+```python
+from python_examples.classification import Classifier
+from python_examples.data_loader import MnistDataloader
+from python_examples.model import ConvMLP
+
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+We define the number of epochs and the paths to the data. Note that the data is in ubyte format and split into four files: the training images, the training labels, the test images, and the test labels.
+
+```python
+# User-input
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/mnist/train-images-idx3-ubyte"
+y_train_file = "./data/mnist/train-labels-idx1-ubyte"
+x_test_file = "./data/mnist/t10k-images-idx3-ubyte"
+y_test_file = "./data/mnist/t10k-labels-idx1-ubyte"
+```
+
+*You can find the data used in the [MNIST data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/mnist) folder of the repository.*
+
+## 3. Create the model
+
+In this example we will create a model with two convolutional layers and a batch size of 16, and we will use hierarchical softmax for the classification task. Find out more about the [ConvMLP class](modules/models?id=_2-conv-mnist-classification-mlp-class) and all its parameters.
+
+```python
+# Model
+net_prop = ConvMLP()
+```
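+
+For reference, here is a rough sketch of what such a two-convolutional-layer property class could look like, following the same layer-code convention as the ConvCifarMLP class shown in the CIFAR10 example of this documentation. All values below are illustrative assumptions; the actual ConvMLP definition lives in *python_examples/model.py* and may differ:
+
+```python
+from pytagi import NetProp
+
+
+class TwoConvMLP(NetProp):
+    """Hypothetical 2-conv-layer network for MNIST classification."""
+
+    def __init__(self) -> None:
+        super().__init__()
+        # Layer-type codes follow the convention of the other example classes
+        # in this documentation (assumed: 1 = fully connected, 2 = conv, 4 = pooling).
+        self.layers = [2, 2, 4, 2, 4, 1, 1]
+        self.nodes = [784, 0, 0, 0, 0, 150, 11]    # 28x28 input, 150 hidden units
+        self.kernels = [4, 3, 5, 3, 1, 1, 1]
+        self.strides = [1, 2, 1, 2, 0, 0, 0]
+        self.widths = [28, 0, 0, 0, 0, 0, 0]
+        self.heights = [28, 0, 0, 0, 0, 0, 0]
+        self.filters = [1, 32, 32, 64, 64, 1, 1]   # the 32-64 filters mentioned in the results
+        self.pads = [1, 0, 0, 0, 0, 0, 0]
+        self.pad_types = [1, 0, 0, 0, 0, 0, 0]
+        self.activations = [0, 4, 0, 4, 0, 4, 12]  # output code copied from the CIFAR10 example
+        self.batch_size = 16
+        self.sigma_v = 1
+        self.is_idx_ud = True
+        self.multithreading = True
+        self.device = "cuda"
+```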
+
+## 4. Load the data
+
+The next step is to load the data. We will use the [MnistDataloader class](modules/data-loader?id=data-loader) to load the data and we will pass the batch size and the data paths to the class.
+
+```python
+# Data loader
+class_data_loader = MnistDataloader(batch_size=net_prop.batch_size)
+data_loader = class_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create the classification object
+
+Once we have processed the data, we can create the classifier object. We will pass the number of epochs, the data loader, the network properties, and the number of classes to the class. In this case the number of classes is 10 because we are classifying the MNIST dataset.
+
+```python
+# Train and test
+clas_task = Classifier(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ num_classes=10)
+```
+
+> Find out more about the [Classifier class](modules/classifier.md).
+
+## 6. Train and evaluate the model
+
+Finally, we can train and evaluate the model. We will call the train and predict methods of the classifier object.
+
+```python
+clas_task.train()
+clas_task.predict()
+```
+
+## 7. Results
+
+In this section we examine the performance of the model trained with cuTAGI and compare the results with those of a backpropagation (BP) model.
+
+| Model | Error rate at e = 1 [%] | Error rate at e = E [%] | Epochs (E) | Batch size (B) |
+| :------: | :---------------------: | :---------------------: | :--------: | :------------: |
+| **TAGI** | 2.07 | 0.65 | 50 | 16 |
+| BP | - | 0.67 | 1000 | 128 |
+
+?> The table above compares the classification accuracy with the results from [Wan et al.](http://proceedings.mlr.press/v28/wan13.pdf) where both approaches use the same CNN architecture with 2 convolutional layers (32-64) and a fully connected layer with 150 hidden units.
diff --git a/examples/cnn/3-conv-cifar10.md b/examples/cnn/3-conv-cifar10.md
new file mode 100644
index 0000000..b7bc4ac
--- /dev/null
+++ b/examples/cnn/3-conv-cifar10.md
@@ -0,0 +1,125 @@
+# 3-Convolutional CIFAR10
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/05
+**Description:** This example shows how to train a convolutional neural network (CNN) to classify the CIFAR10 dataset.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+We first import the required modules: the classifier, the data loader and the model.
+
+```python
+from python_examples.classification import Classifier
+from python_examples.data_loader import ClassificationDataloader
+from pytagi import NetProp
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+We define the number of epochs and the paths to the data. Note that the data is provided as CSV files and split into four: the training images, the training labels, the test images, and the test labels.
+
+```python
+num_epochs = 50
+x_train_file = "./data/cifar/x_train.csv"
+y_train_file = "./data/cifar/y_train.csv"
+x_test_file = "./data/cifar/x_test.csv"
+y_test_file = "./data/cifar/y_test.csv"
+```
+
+*You can find the data used on the [CIFAR10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html) website.*
+
+## 3. Create the model
+
+In this example we will create a model with three convolutional layers and a batch size of 16, and we will use hierarchical softmax for the classification task. Find out more about the architecture in [Analytically Tractable Inference in Deep Neural Networks](https://arxiv.org/pdf/2103.05461.pdf).
+
+```python
+class ConvCifarMLP(NetProp):
+ """Multi-layer perceptron for cifar classificaiton."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 4, 2, 4, 2, 4, 1, 1]
+ self.nodes = [3072, 0, 0, 0, 0, 0, 0, 64, 11]
+ self.kernels = [5, 3, 5, 3, 5, 3, 1, 1, 1]
+ self.strides = [1, 2, 1, 2, 1, 2, 0, 0, 0]
+ self.widths = [32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.heights = [32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.filters = [3, 32, 32, 32, 32, 64, 64, 1, 1]
+ self.pads = [2, 1, 2, 1, 2, 1, 0, 0, 0]
+ self.pad_types = [1, 2, 1, 2, 1, 2, 0, 0, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.sigma_v_min = 0.3
+ self.decay_factor_sigma_v = 0.975
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.init_method: str = "He"
+ self.device = "cuda"
+```
+
+```python
+# Model
+net_prop = ConvCifarMLP()
+```
+
+## 4. Load the data
+
+The next step is to load the data. We will use the [ClassificationDataloader class](modules/data-loader?id=data-loader) to load the data and we will pass the batch size and the data paths to the class.
+
+```python
+# Data loader
+class_data_loader = ClassificationDataloader(batch_size=net_prop.batch_size)
+data_loader = class_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create the classification object
+
+Once we have processed the data, we can create the classifier object. We will pass the number of epochs, the data loader, the network properties, and the number of classes to the class. In this case the number of classes is 10 because we are classifying the CIFAR10 dataset.
+
+```python
+# Train and test
+clas_task = Classifier(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ num_classes=10)
+```
+
+> Find out more about the [Classifier class](modules/classifier.md).
+
+## 6. Train and evaluate the model
+
+Finally, we can train and evaluate the model. We will call the train and predict methods of the classifier object.
+
+```python
+clas_task.train()
+clas_task.predict()
+```
+
+## 7. Results
+
+In this section we examine the performance of the model trained with cuTAGI and compare the results with those of a backpropagation (BP) model.
+
+| Model | Error rate at e = 1 [%] | Error rate at e = E [%] | Epochs (E) | Batch size (B) |
+| :------: | :---------------------: | :---------------------: | :--------: | :------------: |
+| **TAGI** | 52.71 | 29.66 | 50 | 16 |
+| BP | - | 23.5 | 150 | 128 |
+
+?> The table above compares the classification accuracy with the results from [Wan et al.](http://proceedings.mlr.press/v28/wan13.pdf) where both approaches use the same CNN architecture with 3 convolutional layers (32-16-8) and a fully connected layer with 64 hidden units.
diff --git a/examples/cnn/_sidebar.md b/examples/cnn/_sidebar.md
new file mode 100644
index 0000000..bd1daa7
--- /dev/null
+++ b/examples/cnn/_sidebar.md
@@ -0,0 +1,4 @@
+- [2-conv MNIST](examples/cnn/2-conv-mnist.md)
+- [Batch Norm MNIST](examples/cnn/batch-norm-mnist.md)
+- [3-conv CIFAR10](examples/cnn/3-conv-cifar10.md)
+- [RESNET18 - CIFAR10](examples/cnn/resnet18-cifar10.md)
\ No newline at end of file
diff --git a/examples/cnn/batch-norm-mnist.md b/examples/cnn/batch-norm-mnist.md
new file mode 100644
index 0000000..649bfc4
--- /dev/null
+++ b/examples/cnn/batch-norm-mnist.md
@@ -0,0 +1,101 @@
+# Batch normalization on MNIST
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/04/24
+**Description:** This example shows how to train a convolutional neural network (CNN) with batch normalization to classify the MNIST dataset.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+We first import the required modules: the classifier, the data loader and the model.
+
+```python
+from python_examples.classification import Classifier
+from python_examples.data_loader import MnistDataloader
+from python_examples.model import ConvBatchNormMLP
+
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+We define the number of epochs and the paths to the data. Note that the data is in ubyte format and split into four files: the training images, the training labels, the test images, and the test labels.
+
+```python
+# User-input
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/mnist/train-images-idx3-ubyte"
+y_train_file = "./data/mnist/train-labels-idx1-ubyte"
+x_test_file = "./data/mnist/t10k-images-idx3-ubyte"
+y_test_file = "./data/mnist/t10k-labels-idx1-ubyte"
+```
+
+*You can find the data used in the [MNIST data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/mnist) folder of the repository.*
+
+## 3. Create the model
+
+In this example we will create a model with two convolutional layers and batch normalization, a batch size of 16, and hierarchical softmax for the classification task. Find out more about the [ConvBatchNormMLP class](modules/models?id=_2-conv-mnist-classification-mlp-class) and all its parameters.
+
+```python
+# Model
+net_prop = ConvBatchNormMLP()
+```
+
+## 4. Load the data
+
+The next step is to load the data. We will use the [MnistDataloader class](modules/data-loader?id=data-loader) to load the data and we will pass the batch size and the data paths to the class.
+
+```python
+# Data loader
+class_data_loader = MnistDataloader(batch_size=net_prop.batch_size)
+data_loader = class_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create the classification object
+
+Once we have processed the data, we can create the classifier object. We will pass the number of epochs, the data loader, the network properties, and the number of classes to the class. In this case the number of classes is 10 because we are classifying the MNIST dataset.
+
+```python
+# Train and test
+clas_task = Classifier(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ num_classes=10)
+```
+
+> Find out more about the [Classifier class](modules/classifier.md).
+
+## 6. Train and evaluate the model
+
+Finally, we can train and evaluate the model. We will call the train and predict methods of the classifier object.
+
+```python
+clas_task.train()
+clas_task.predict()
+```
+
+## 7. Results
+
+In this section we examine the performance of the model trained with cuTAGI and compare the results with those of a backpropagation (BP) model.
+
+| Model | Error rate at e = 1 [%] | Error rate at e = E [%] | Epochs (E) | Batch size (B) |
+| :------: | :---------------------: | :---------------------: | :--------: | :------------: |
+| **TAGI** | 2.13 | 0.96 | 50 | 16 |
+| BP | - | 0.46 | 300 | 128 |
+
+?> The table above compares the classification accuracy with the results from [Lei et al.](https://link.springer.com/article/10.1007/s42452-019-1903-4) where both approaches use a similar CNN architecture with 2 convolutional layers (32-64) and a fully connected layer with 150 hidden units in TAGI and 1280 hidden units in BP.
diff --git a/examples/cnn/cnn-examples.md b/examples/cnn/cnn-examples.md
new file mode 100644
index 0000000..4bbb9c4
--- /dev/null
+++ b/examples/cnn/cnn-examples.md
@@ -0,0 +1,6 @@
+# CNN examples
+
+- [2-conv MNIST](examples/cnn/2-conv-mnist.md)
+- [Batch Norm MNIST](examples/cnn/batch-norm-mnist.md)
+- [3-conv CIFAR10](examples/cnn/3-conv-cifar10.md)
+- [RESNET18 CIFAR10](examples/cnn/resnet18-cifar10.md)
\ No newline at end of file
diff --git a/examples/cnn/resnet18-cifar10.md b/examples/cnn/resnet18-cifar10.md
new file mode 100644
index 0000000..607842a
--- /dev/null
+++ b/examples/cnn/resnet18-cifar10.md
@@ -0,0 +1,128 @@
+# ResNet18 CIFAR10
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/05
+**Description:** This example shows how to train a residual network to classify the CIFAR10 dataset.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+We first import the required modules: the classifier, the data loader and the model.
+
+```python
+from python_examples.classification import Classifier
+from python_examples.data_loader import ClassificationDataloader
+from pytagi import NetProp
+
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+We define the number of epochs and the paths to the data. Note that the data is provided as CSV files and split into four: the training images, the training labels, the test images, and the test labels.
+
+```python
+num_epochs = 50
+x_train_file = "./data/cifar/x_train.csv"
+y_train_file = "./data/cifar/y_train.csv"
+x_test_file = "./data/cifar/x_test.csv"
+y_test_file = "./data/cifar/y_test.csv"
+```
+
+*You can find the data used on the [CIFAR10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html) website.*
+
+## 3. Create the model
+
+In this example we are going to use a [Resnet18 architecture](https://arxiv.org/abs/1512.03385).
+
+```python
+class ResnetCifarMLP(NetProp):
+ """Multi-layer perceptron for cifar classificaiton."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ #-------------------#Input-----------------#Stage1-------------------------#Stage2-------------------------#Stage3-------------------------#Stage4-------------------------#Output---
+ self.layers = [2, 2, 4, 2, 4, 2, 4, 2, 4, 2, 4, 2, 4, 2, 4, 2, 4, 2, 4, 1, 1]
+ self.nodes = [3072, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 256, 11]
+ self.kernels = [7, 1, 3, 1, 3, 1, 3, 1, 3, 1, 3, 1, 3, 1, 3, 1, 3, 1, 3, 1, 1]
+ self.strides = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0]
+ self.widths = [256, 256, 128, 128, 64, 64, 32, 32, 32, 32, 16, 16, 8, 8, 8, 4, 4, 4, 4, 1, 1]
+ self.heights = [256, 256, 128, 128, 64, 64, 32, 32, 32, 32, 16, 16, 8, 8, 8, 4, 4, 4, 4, 1, 1]
+ self.filters = [3, 64, 64, 64, 64, 64, 64, 128, 128, 128, 256, 256, 256, 256, 256, 512, 512, 512, 512, 512, 1]
+ self.pads = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0]
+ self.pad_types = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 0, 4, 12]
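+        # Skip-connection routing for the residual blocks (-1 presumably means no shortcut).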
+ self.shortcuts = [-1, 3, -1, 6, -1, 8, -1, 10, -1, 12, -1, 14, -1, 16, -1, 18, -1, 20, -1, -1, -1]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.sigma_v_min = 0.2
+ self.decay_factor_sigma_v = 0.975
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.init_method: str = "He"
+ self.device = "cuda"
+```
+
+```python
+# Model
+net_prop = ResnetCifarMLP()
+```
+
+## 4. Load the data
+
+The next step is to load the data. We will use the [ClassificationDataloader class](modules/data-loader?id=data-loader) to load the data and we will pass the batch size and the data paths to the class.
+
+```python
+# Data loader
+class_data_loader = ClassificationDataloader(batch_size=net_prop.batch_size)
+data_loader = class_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create the classification object
+
+Once we have processed the data, we can create the classifier object. We will pass the number of epochs, the data loader, the network properties, and the number of classes to the class. In this case the number of classes is 10 because we are classifying the CIFAR10 dataset.
+
+```python
+# Train and test
+clas_task = Classifier(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ num_classes=10)
+```
+
+> Find out more about the [Classifier class](modules/classifier.md).
+
+## 6. Train and evaluate the model
+
+Finally, we can train and evaluate the model. We will call the train and predict methods of the classifier object.
+
+```python
+clas_task.train()
+clas_task.predict()
+```
+
+## 7. Results
+
+In this section we examine the performance of the model trained with cuTAGI and compare the results with those of a backpropagation (BP) model.
+
+| Model | Error rate at e = 1 [%] | Error rate at e = E [%] | Epochs (E) | Batch size (B) |
+| :------: | :---------------------: | :---------------------: | :--------: | :------------: |
+| **TAGI** | 72.4 | 21.5 | 50 | 16 |
+| BP | - | 14.0 | 160 | 128 |
+
+?> The table above compares the classification accuracy with the results from [Osawa et al.](https://www.researchgate.net/publication/333650027_Practical_Deep_Learning_with_Bayesian_Principles) where they use a Resnet18 trained with backpropagation.
diff --git a/examples/examples.md b/examples/examples.md
new file mode 100644
index 0000000..67951f9
--- /dev/null
+++ b/examples/examples.md
@@ -0,0 +1,15 @@
+## FNN examples
+
+[FNN examples](examples/fnn/fnn-examples.md)
+
+## CNN examples
+
+[CNN examples](examples/cnn/cnn-examples.md)
+
+## LSTM examples
+
+[LSTM examples](examples/lstm/lstm-examples.md)
+
+## Autoencoder examples
+
+[Autoencoder examples](examples/ae/autoencoder-examples.md)
diff --git a/examples/fnn/1d-toy-derivative.md b/examples/fnn/1d-toy-derivative.md
new file mode 100644
index 0000000..853990d
--- /dev/null
+++ b/examples/fnn/1d-toy-derivative.md
@@ -0,0 +1,110 @@
+# 1D toy regression problem with derivatives
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/10
+**Description:** This example shows how to solve a 1D toy regression problem with derivatives using a FNN.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import DervMLP
+from python_examples.regression import Regression
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use a 1D toy dataset. The data is generated from a function with random noise. The goal is to learn the function from the data.
+
+```python
+# User-input
+num_inputs = 1 # 1 explanatory variable
+num_outputs = 1 # 1 predicted output
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/toy_example/derivative_x_train_1D.csv"
+y_train_file = "./data/toy_example/derivative_y_train_1D.csv"
+x_test_file = "./data/toy_example/derivative_x_test_1D.csv"
+y_test_file = "./data/toy_example/derivative_y_test_1D.csv"
+```
+
+*You can find the data used in the [toy_example data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/toy_example) folder of the repository.*
+
+?>We can plot the training data points and the trend line we want to learn.
+
+![1D toy derivative problem data](../../images/1D_toy_regression_derivative_data.png)
+
+## 3. Create the model
+
+We will use an FNN with a simple architecture as defined in the DervMLP class, which is suited for this basic regression problem with derivatives. Find out more about the [DervMLP class](modules/models?id=derivative-regression-mlp-class).
+
+```python
+# Model
+net_prop = DervMLP()
+```
+
+> If you want to use a different model, you can create your own class; just make sure that it inherits from the NetProp class. More information is available on the [models page](modules/models?id=netprop-class).
+
+## 4. Load the data
+
+We will make use of the [RegressionDataLoader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test input and output files in **csv** format.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create visualizer
+
+In order to visualize the predictions of the regression we can use the PredictionViz class. This class will create a window with the true function, the predicted function and the confidence intervals.
+
+```python
+viz = PredictionViz(task_name="derivative", data_name="toy1D")
+```
+
+> Learn more about the PredictionViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Train and evaluate the model
+
+Using the [regression class](modules/regression?id=regression-class) that makes use of TAGI, we will train and compute the derivatives of the specified layer. We can also specify a file with the true derivatives to compare the results when visualizing.
+
+```python
+reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+
+reg_task.train()
+reg_task.compute_derivatives(
+ layer=0,
+ truth_derv_file="./data/toy_example/derivative_dy_test_1D.csv")
+```
+
+## 7. Visualize the results
+
+?> If you have created the visualization object and passed it to the regression object, a new window will pop up with the results.
+
+![1D toy regression derivative problem](../../images/1D_toy_regression_derivative.png)
+
+*The black line is the true function, the red line is the predicted function, and the red zone shows the confidence intervals.*
diff --git a/examples/fnn/1d-toy-fullcov.md b/examples/fnn/1d-toy-fullcov.md
new file mode 100644
index 0000000..79ffe43
--- /dev/null
+++ b/examples/fnn/1d-toy-fullcov.md
@@ -0,0 +1,113 @@
+# 1D toy regression problem with full covariance
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/10
+**Description:** This example shows how to solve a 1D toy regression problem with full covariance using a FNN.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import FullCovMLP
+from python_examples.regression import Regression
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use a 1D toy dataset. The data is generated from a function with random noise. The goal is to learn the function from the data.
+
+```python
+# User-input
+num_inputs = 1 # 1 explanatory variable
+num_outputs = 1 # 1 predicted output
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/toy_example/x_train_1D_full_cov.csv"
+y_train_file = "./data/toy_example/y_train_1D_full_cov.csv"
+x_test_file = "./data/toy_example/x_test_1D_full_cov.csv"
+y_test_file = "./data/toy_example/y_test_1D_full_cov.csv"
+```
+
+*You can find the data used in the [toy_example data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/toy_example) folder of the repository.*
+
+?>We can plot the training data points and the trend line we want to learn.
+
+![1D toy full covariance problem data](../../images/1D_toy_regression_fullcov_data.png)
+
+## 3. Create the model
+
+We will use an FNN with a simple architecture as defined in the FullCovMLP class, which is suited for this basic regression problem with full covariance. Find out more about the [FullCovMLP class](modules/models?id=full-covariance-regression-mlp-class).
+
+```python
+# Model
+net_prop = FullCovMLP()
+```
+
+> If you want to use a different model, you can create your own class; just make sure that it inherits from the NetProp class. More information is available on the [models page](modules/models?id=netprop-class).
+
+## 4. Load the data
+
+We will make use of the [RegressionDataLoader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test input and output files in **csv** format.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create visualizer
+
+In order to visualize the predictions of the regression we can use the PredictionViz class. This class will create a window with the true function, the predicted function and the confidence intervals.
+
+```python
+viz = PredictionViz(task_name="full_cov_regression", data_name="toy1D")
+```
+
+> Learn more about the PredictionViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Train and evaluate the model
+
+Using the [regression class](modules/regression?id=regression-class) that makes use of TAGI, we will train and test the model. When predicting, we can specify the standard deviation factor used to compute the confidence intervals.
+
+```python
+reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+
+reg_task.train()
+reg_task.predict()
+```
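+
+For instance, to widen the plotted bands to three standard deviations, as done in the basic 1D example of this documentation, the prediction call can take the optional *std_factor* argument:
+
+```python
+reg_task.predict(std_factor=3)
+```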
+
+## 7. Visualize the results
+
+At the end of the execution the results will be printed in the console as seen below.
+
+> MSE : 2.96
+> Log-likelihood: -3.67
+
+?> If you have created the visualization object and passed it to the regression object, a new window will pop up with the results.
+
+![1D toy regression full covariance problem](../../images/1D_toy_regression_fullcov.png)
+
+*The black line is the true function, the red line is the predicted function, and the red zone shows the confidence intervals.*
diff --git a/examples/fnn/1d-toy-heteros.md b/examples/fnn/1d-toy-heteros.md
new file mode 100644
index 0000000..98f73bc
--- /dev/null
+++ b/examples/fnn/1d-toy-heteros.md
@@ -0,0 +1,113 @@
+# 1D toy regression problem with heteroscedasticity
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/03/14
+**Description:** This example shows how to solve a 1D toy regression problem with heteroscedasticity using a FNN.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import HeterosMLP
+from python_examples.regression import Regression
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use a 1D toy dataset. The data is generated from a function with random noise. The goal is to learn the function from the data.
+
+```python
+# User-input
+num_inputs = 1 # 1 explanatory variable
+num_outputs = 1 # 1 predicted output
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/toy_example/x_train_1D_noise_inference.csv"
+y_train_file = "./data/toy_example/y_train_1D_noise_inference.csv"
+x_test_file = "./data/toy_example/x_test_1D_noise_inference.csv"
+y_test_file = "./data/toy_example/y_test_1D_noise_inference.csv"
+```
+
+*You can find the data used in the [toy_example data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/toy_example) folder of the repository.*
+
+?>We can plot the training data points and the trend line we want to learn.
+
+![1D toy regression problem data](../../images/1D_toy_regression_heteros_data.png)
+
+## 3. Create the model
+
+We will use an FNN with a simple architecture as defined in the HeterosMLP class, which is suited for this basic regression problem with heteroscedasticity. Find out more about the [HeterosMLP class](modules/models?id=heteroscedastic-regression-mlp-class).
+
+```python
+# Model
+net_prop = HeterosMLP()
+```
+
+> If you want to use a different model, you can create your own class; just make sure that it inherits from the NetProp class. More information is available on the [models page](modules/models?id=netprop-class).
+
+## 4. Load the data
+
+We will make use of the [RegressionDataLoader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test input and output files in **csv** format.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create visualizer
+
+In order to visualize the predictions of the regression we can use the PredictionViz class. This class will create a window with the true function, the predicted function and the confidence intervals.
+
+```python
+viz = PredictionViz(task_name="heteros_regression", data_name="toy1D")
+```
+
+> Learn more about the PredictionViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Train and evaluate the model
+
+Using the [regression class](modules/regression?id=regression-class) that makes use of TAGI, we will train and test the model. When predicting, we can specify the standard deviation factor used to compute the confidence intervals.
+
+```python
+reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+
+reg_task.train()
+reg_task.predict()
+```
+
+## 7. Visualize the results
+
+At the end of the execution the results will be printed in the console as seen below.
+
+> MSE : 2.10
+> Log-likelihood: -0.15
+
+?> If you have created the visualization object and passed it to the regression object, a new window will pop up with the results.
+
+![1D toy regression heteroscedastic problem](../../images/1D_toy_regression_heteros.png)
+
+*The black line is the true function, the red line is the predicted function, and the red zone shows the confidence intervals.*
diff --git a/examples/fnn/1d-toy.md b/examples/fnn/1d-toy.md
new file mode 100644
index 0000000..f091dea
--- /dev/null
+++ b/examples/fnn/1d-toy.md
@@ -0,0 +1,114 @@
+# 1D toy regression problem
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/03/15
+**Description:** This example shows how to use a simple feedforward neural network to solve a 1D toy regression problem.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from visualizer import PredictionViz
+
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.model import RegressionMLP
+from python_examples.regression import Regression
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use a 1D toy dataset. The data is generated from a function with random noise. The goal is to learn the function from the data.
+
+```python
+# User-input
+num_inputs = 1 # 1 explanatory variable
+num_outputs = 1 # 1 predicted output
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/toy_example/x_train_1D.csv"
+y_train_file = "./data/toy_example/y_train_1D.csv"
+x_test_file = "./data/toy_example/x_test_1D.csv"
+y_test_file = "./data/toy_example/y_test_1D.csv"
+```
+
+*You can find the data used in the [toy_example data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/toy_example) folder of the repository.*
+
+?>We can plot the training data points and the trend line we want to learn.
+
+![1D toy regression problem data](../../images/1D_toy_regression_data.png)
+
+
+## 3. Create the model
+
+We will use an FNN with a simple architecture as defined in the RegressionMLP class, which is suited for this basic regression problem. Find out more about the [RegressionMLP class](modules/models?id=regression-mlp-class).
+
+```python
+# Model
+net_prop = RegressionMLP()
+```
+
+> If you want to use a different model, you can create your own class; just make sure that it inherits from the NetProp class. More information is available on the [models page](modules/models?id=netprop-class).
+
+## 4. Load the data
+
+We will make use of the [RegressionDataLoader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test input and output files in **csv** format.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Create visualizer
+
+In order to visualize the predictions of the regression we can use the PredictionViz class. This class will create a window with the true function, the predicted function and the confidence intervals.
+
+```python
+viz = PredictionViz(task_name="regression", data_name="toy1D")
+```
+
+> Learn more about the PredictionViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Train and evaluate the model
+
+Using the [regression class](modules/regression?id=regression-class) that makes use of TAGI, we will train and test the model. When predicting, we can specify the standard deviation factor used to compute the confidence intervals.
+
+```python
+reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+
+reg_task.train()
+reg_task.predict(std_factor=3)
+```
+
+## 7. Visualize the results
+
+At the end of the execution the results will be printed in the console as seen below.
+
+> MSE : 1026.14
+> Log-likelihood: -5.89
+
+?> If you have created the visualization object and passed it to the regression object, a new window will pop up with the results.
+
+![1D toy regression problem](../../images/1D_toy_regression.png)
+
+*The black line is the true function, the red line is the predicted function, and the red zone shows the confidence intervals.*
diff --git a/examples/fnn/_sidebar.md b/examples/fnn/_sidebar.md
new file mode 100644
index 0000000..4ef63c1
--- /dev/null
+++ b/examples/fnn/_sidebar.md
@@ -0,0 +1,5 @@
+- [1D toy regression problem](examples/fnn/1d-toy.md)
+- [1D toy regression problem with heteroscedasticity](examples/fnn/1d-toy-heteros.md)
+- [1D toy regression problem with full covariance](examples/fnn/1d-toy-fullcov.md)
+- [1D toy regression problem with derivatives](examples/fnn/1d-toy-derivative.md)
+- [UCI Large regression dataset (heteroscedastic)](examples/fnn/uci-large-heteros.md)
\ No newline at end of file
diff --git a/examples/fnn/fnn-examples.md b/examples/fnn/fnn-examples.md
new file mode 100644
index 0000000..05be702
--- /dev/null
+++ b/examples/fnn/fnn-examples.md
@@ -0,0 +1,7 @@
+# FNN examples
+
+- [1D toy regression problem](examples/fnn/1d-toy.md)
+- [1D toy regression problem with heteroscedasticity](examples/fnn/1d-toy-heteros.md)
+- [1D toy regression problem with full covariance](examples/fnn/1d-toy-fullcov.md)
+- [1D toy regression problem with derivatives](examples/fnn/1d-toy-derivative.md)
+- [UCI Large regression dataset (heteroscedastic)](examples/fnn/uci-large-heteros.md)
diff --git a/examples/fnn/uci-large-heteros.md b/examples/fnn/uci-large-heteros.md
new file mode 100644
index 0000000..b5db2d6
--- /dev/null
+++ b/examples/fnn/uci-large-heteros.md
@@ -0,0 +1,107 @@
+# UCI regression problem with heteroscedasticity
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/23
+**Description:** This example shows how to predict housing prices in Boston using an FNN with heteroscedasticity.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from python_examples.data_loader import RegressionDataLoader
+from python_examples.regression import Regression
+from pytagi import NetProp
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use the Boston housing dataset and try to predict the housing prices given 13 characteristics.
+
+```python
+# User-input
+num_inputs = 13 # 13 explanatory variables
+num_outputs = 1 # 1 predicted output
+num_epochs = 50 # run for 50 epochs
+x_train_file = "./data/UCI/Boston_housing/x_train.csv"
+y_train_file = "./data/UCI/Boston_housing/y_train.csv"
+x_test_file = "./data/UCI/Boston_housing/x_test.csv"
+y_test_file = "./data/UCI/Boston_housing/y_test.csv"
+```
+
+*You can find the data used in the [UCI data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/UCI) folder of the repository.*
+
+## 3. Create the model
+
+We will use an FNN with a simple architecture as defined in the HeterosUCIMLP class below, which is suited for this regression problem with heteroscedasticity.
+
+```python
+class HeterosUCIMLP(NetProp):
+ """Multi-layer preceptron for regression task where the
+ output's noise varies overtime"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1, 1]
+ self.nodes = [13, 50, 50, 2] # output layer = [mean, std]
+ self.activations = [0, 4, 4, 0]
+ self.batch_size = 10
+ self.sigma_v = 2
+ self.sigma_v_min = 0.3
+ self.noise_gain = 1.0
+ self.noise_type = "heteros"
+ self.init_method = "He"
+ self.device = "cpu"
+```
+
+```python
+# Model
+net_prop = HeterosUCIMLP()
+```
+
+## 4. Load the data
+
+We will make use of the [RegressionDataLoader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test input and output files in **csv** format.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## 5. Train and evaluate the model
+
+Using the [regression class](modules/regression?id=regression-class) that makes use of TAGI, we will train and test the model. When predicting, we can specify the standard deviation factor used to compute the confidence intervals.
+
+```python
+reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop)
+
+reg_task.train()
+reg_task.predict()
+```
+
+## 6. Results
+
+At the end of the execution the results will be printed in the console as seen below.
+
+> MSE : 4.66
+> Log-likelihood: -3.84
diff --git a/examples/lstm/_sidebar.md b/examples/lstm/_sidebar.md
new file mode 100644
index 0000000..3e58f3a
--- /dev/null
+++ b/examples/lstm/_sidebar.md
@@ -0,0 +1,2 @@
+- [Toy time series problem](examples/lstm/toy-time-series.md)
+- [Electricity time series problem](examples/lstm/electricity-time-series.md)
diff --git a/examples/lstm/electricity-time-series.md b/examples/lstm/electricity-time-series.md
new file mode 100644
index 0000000..4a3b4d7
--- /dev/null
+++ b/examples/lstm/electricity-time-series.md
@@ -0,0 +1,124 @@
+# Electricity time series problem
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/25
+**Description:** This example shows how to use the time series forecaster to solve an LSTM forecasting problem predicting electricity consumption.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from visualizer import PredictionViz
+
+from python_examples.data_loader import TimeSeriesDataloader
+from python_examples.model import TimeSeriesLSTM
+from python_examples.time_series_forecaster import TimeSeriesForecaster
+from pytagi import load_param_from_files
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use the individual household electric power consumption dataset. The goal is to learn to predict the next timesteps from the previous ones.
+
+```python
+# User-input
+num_epochs = 50 # run for 50 epochs
+output_col = [0] # column to predict
+num_features = 1 # number of features
+input_seq_len = 5 # input sequence length
+output_seq_len = 1 # output sequence length
+seq_stride = 1 # stride between input sequences
+x_train_file = "./data/UCI/Electricity/x_train_file.csv"
+datetime_train_file = "./data/UCI/Electricity/datetime_train_file.csv"
+x_test_file = "./data/UCI/Electricity/x_test_file.csv"
+datetime_test_file = "./data/UCI/Electricity/datetime_test_file.csv"
+```
+
+*You can find the data used on the [UCI webpage](http://archive.ics.uci.edu/ml/datasets/Individual+household+electric+power+consumption).*
+
+?>We can plot the training data points and the trend line we want to learn.
+
+![Electricity LSTM problem data](../../images/electricity_lstm_data.png)
+
+## 3. Create the model
+
+We will use a simple LSTM architecture as defined in the TimeSeriesLSTM class, which is suited for time series forecasting. Find out more about the [TimeSeriesLSTM class](modules/models?id=lstm-for-time-series-forecasting).
+
+```python
+# Model
+net_prop = TimeSeriesLSTM(input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ seq_stride=seq_stride)
+```
+
+> If you want to use a different model, you can create your own class; just make sure that it inherits from the NetProp class. More information is available on the [models page](modules/models?id=netprop-class).
+
+## 4. Load the data
+
+We will make use of the [TimeSeriesDataloader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test data and datetime files in **csv** format.
+
+```python
+# Data loader
+ts_data_loader = TimeSeriesDataloader(batch_size=net_prop.batch_size,
+ output_col=output_col,
+ input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ num_features=num_features,
+ stride=seq_stride)
+data_loader = ts_data_loader.process_data(
+ x_train_file=x_train_file,
+ datetime_train_file=datetime_train_file,
+ x_test_file=x_test_file,
+ datetime_test_file=datetime_test_file)
+```
+
+## 5. Create visualizer
+
+To visualize the forecaster's predictions we can use the PredictionViz class. This class will create a window with the true function, the predicted function and the confidence intervals.
+
+```python
+# Visualizer
+viz = PredictionViz(task_name="forecasting", data_name="Global active power")
+```
+
+> Learn more about the PredictionViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Train and evaluate the model
+
+Using the [TimeSeriesForecaster class](modules/time-series-forecaster.md) that makes use of TAGI, we will train and test the model. When predicting, we can specify the standard deviation factor used to compute the confidence intervals.
+
+```python
+# Train and test
+ts_task = TimeSeriesForecaster(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+
+ts_task.train()
+ts_task.predict()
+```
+
+## 7. Visualize the results
+
+At the end of the execution the results will be printed in the console as seen below.
+
+> MSE : 0.07
+> Log-likelihood: -0.50
+
+?> If you have created the visualization object and passed it to the forecaster object, a new window will pop up with the results.
+
+![Electricity LSTM problem data](../../images/electricity_lstm_full.png)
+
+*The black line is the true function, the red line is the predicted function, and the red zone shows the confidence intervals.*
diff --git a/examples/lstm/lstm-examples.md b/examples/lstm/lstm-examples.md
new file mode 100644
index 0000000..027bef1
--- /dev/null
+++ b/examples/lstm/lstm-examples.md
@@ -0,0 +1,4 @@
+# LSTM examples
+
+- [Toy time series problem](examples/lstm/toy-time-series.md)
+- [Electricity time series problem](examples/lstm/electricity-time-series.md)
diff --git a/examples/lstm/toy-time-series.md b/examples/lstm/toy-time-series.md
new file mode 100644
index 0000000..9cfdcfa
--- /dev/null
+++ b/examples/lstm/toy-time-series.md
@@ -0,0 +1,124 @@
+# Toy time series problem
+
+**Author:** [Miquel Florensa](https://www.linkedin.com/in/miquel-florensa/)
+**Date:** 2023/05/25
+**Description:** This example shows how to use the time series forecaster to solve an LSTM toy problem.
+
+
+
+
+
+
+ Github Source code
+
+
+
+---
+
+## 1. Setup
+
+```python
+from visualizer import PredictionViz
+
+from python_examples.data_loader import TimeSeriesDataloader
+from python_examples.model import TimeSeriesLSTM
+from python_examples.time_series_forecaster import TimeSeriesForecaster
+from pytagi import load_param_from_files
+```
+
+?> Note that these modules are described [here](modules/modules.md) and their source code is in the *python_examples* directory. If your modules live in a different directory, you must adjust these paths.
+
+## 2. Prepare the data
+
+In this simple example we will use a toy dataset. The data is generated from a sine function. The goal is to learn to predict the next timesteps from the previous ones.
+
+```python
+# User-input
+num_epochs = 50 # run for 50 epochs
+output_col = [0] # column to predict
+num_features = 1 # number of features
+input_seq_len = 5 # input sequence length
+output_seq_len = 1 # output sequence length
+seq_stride = 1 # stride between input sequences
+x_train_file = "./data/toy_time_series/x_train_sin_data.csv"
+datetime_train_file = "./data/toy_time_series/train_sin_datetime.csv"
+x_test_file = "./data/toy_time_series/x_test_sin_data.csv"
+datetime_test_file = "./data/toy_time_series/test_sin_datetime.csv"
+```
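+
+To make the sequence settings concrete, here is a small illustration (not part of the example script) of how a series is cut into input/output windows with *input_seq_len=5*, *output_seq_len=1* and *seq_stride=1*; the actual slicing is handled internally by the TimeSeriesDataloader:
+
+```python
+# Toy series used only to illustrate the windowing.
+series = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
+
+windows = [(series[i:i + 5], series[i + 5:i + 6])
+           for i in range(len(series) - 5)]
+# windows[0] -> ([0.0, 0.1, 0.2, 0.3, 0.4], [0.5])
+# windows[1] -> ([0.1, 0.2, 0.3, 0.4, 0.5], [0.6])
+```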
+
+*You can find the data used in the [toy time series data](https://github.com/lhnguyen102/cuTAGI/tree/main/data/toy_time_series) folder of the repository.*
+
+?>We can plot the training data points and the trend line we want to learn.
+
+![Toy LSTM problem data](../../images/toy_lstm_data.png)
+
+## 3. Create the model
+
+We will use a simple LSTM architecture as defined in the TimeSeriesLSTM class, which is suited for time series forecasting. Find out more about the [TimeSeriesLSTM class](modules/models?id=lstm-for-time-series-forecasting).
+
+```python
+# Model
+net_prop = TimeSeriesLSTM(input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ seq_stride=seq_stride)
+```
+
+> If you want to use a different model, you can create your own class; just make sure that it inherits from the NetProp class. More information is available on the [models page](modules/models?id=netprop-class).
+
+## 4. Load the data
+
+We will make use of the [TimeSeriesDataloader](modules/data-loader?id=data-loader) class to load and process the data. The *process_data* function requires the training and test data and datetime files in **csv** format.
+
+```python
+# Data loader
+ts_data_loader = TimeSeriesDataloader(batch_size=net_prop.batch_size,
+ output_col=output_col,
+ input_seq_len=input_seq_len,
+ output_seq_len=output_seq_len,
+ num_features=num_features,
+ stride=seq_stride)
+data_loader = ts_data_loader.process_data(
+ x_train_file=x_train_file,
+ datetime_train_file=datetime_train_file,
+ x_test_file=x_test_file,
+ datetime_test_file=datetime_test_file)
+```
+
+## 5. Create visualizer
+
+To visualize the forecaster's predictions we can use the PredictionViz class. This class will create a window with the true function, the predicted function and the confidence intervals.
+
+```python
+# Visualzier
+viz = PredictionViz(task_name="forecasting", data_name="sin_signal")
+```
+
+> Learn more about the PredictionViz class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).
+
+## 6. Train and evaluate the model
+
+Using the [TimeSeriesForecaster class](modules/time-series-forecaster.md) that makes use of TAGI, we will train and test the model. When predicting, we can specify the standard deviation factor used to compute the confidence intervals.
+
+```python
+# Train and test
+ts_task = TimeSeriesForecaster(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+
+ts_task.train()
+ts_task.predict()
+```
+
+## 7. Visualize the results
+
+At the end of the execution the results will be printed in the console as seen below.
+
+> MSE : 0.03
+> Log-likelihood: -1.11
+
+?> If you have created the visualization object and passed it to the forecaster object, a new window will pop up with the results.
+
+![Toy LSTM problem](../../images/toy_lstm.png)
+
+*The black line is the true function, the red line is the predicted function, and the red zone shows the confidence intervals.*
diff --git a/guide/_sidebar.md b/guide/_sidebar.md
new file mode 100644
index 0000000..1532fa0
--- /dev/null
+++ b/guide/_sidebar.md
@@ -0,0 +1,4 @@
+- **User Guide**
+
+ - [Installing cuTAGI](guide/install.md)
+ - [Quick Tutorial](guide/quick-tutorial.md)
diff --git a/guide/install.md b/guide/install.md
new file mode 100644
index 0000000..0376587
--- /dev/null
+++ b/guide/install.md
@@ -0,0 +1,142 @@
+## Prerequisites for Local Installation
+* Compiler with C++14 support
+* CMake>=3.23
+* CUDA toolkit (optional)
+
+## `pytagi` Installation
+`pytagi` is a Python wrapper around the C++/CUDA backend for the TAGI method. Developers can install either the [distributed](#pypi-installation) or the [local](#local-installation) version of `pytagi`. Currently `pytagi` only supports Python >=3.9 on both macOS and Ubuntu.
+
+### Create Miniconda Environment
+We recommend installing Miniconda to manage the Python environment, although `pytagi` works well with other alternatives.
+1. Install miniconda by following these [instructions](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html#system-requirements)
+2. Create a conda environment
+ ```
+ conda create --name your_env_name python=3.10
+ ```
+3. Activate conda environment
+ ```
+ conda activate your_env_name
+ ```
+
+### PyPI Installation
+1. [Create conda environment](#create-miniconda-environment)
+2. Install requirements
+ ```
+ pip install -r requirements.txt
+ ```
+3. Install `pytagi`
+ ```
+ pip install pytagi
+ ```
+4. Test `pytagi` package
+ ```sh
+ python -m python_examples.regression_runner
+ ```
+NOTE: This PyPI distributed version does not require the codebase in this repository. Developers can create their own applications (see [python_examples](python_examples)).
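+
+As an extra sanity check, you can also try importing the base class used throughout the examples in this documentation (a minimal sketch; the regression runner above remains the more complete test):
+```python
+from pytagi import NetProp
+
+net_prop = NetProp()
+print(type(net_prop))
+```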
+
+### Local Installation
+1. Clone this repository. Note that the `git submodule` command clones [pybind11](https://github.com/pybind/pybind11), which provides the Python bindings for the C++/CUDA code.
+ ```
+ git clone https://github.com/lhnguyen102/cuTAGI.git
+ cd cuTAGI
+ git submodule update --init --recursive
+ ```
+2. [Create conda environment](#create-miniconda-environment)
+3. Install requirements
+ ```
+ pip install -r requirements.txt
+ ```
+4. Install `pytagi` package
+ ```sh
+ pip install .
+ ```
+5. Test `pytagi` package
+ ```sh
+ python -m python_examples.regression_runner
+ ```
+
+## `cutagi` Installation
+`cutagi` is the native C++/CUDA implementation of the TAGI method. We highly recommend installing cuTAGI with Docker to simplify the installation.
+
+
+### Docker Build
+1. Install Docker by following these [instructions](https://docs.docker.com/get-docker/)
+2. Build docker image
+ * CPU build
+ ```sh
+ bash bin/build.sh
+ ```
+ * CUDA build
+ ```sh
+ bash bin/build.sh -d cuda
+ ```
+*NOTE: During the build and run, make sure that the Docker Desktop application is open. The commands for running tasks such as classification and regression can be found [here](#docker-run).
+
+### Ubuntu 20.04
+1. Install [CUDA toolkit](https://developer.nvidia.com/cuda-toolkit) >=10.1 in `/usr/local/` and add the CUDA location to PATH, for example by adding the following to your `~/.bashrc`:
+ ```sh
+ export PATH="/usr/local/cuda-10.1/bin:$PATH"
+ export LD_LIBRARY_PATH="/usr/local/cuda-10.1/lib64:$LD_LIBRARY_PATH"
+ ```
+2. Install GCC compiler by entering this line in `Terminal`
+ ```sh
+ sudo apt install g++
+ ```
+3. Install CMake by following [these instructions](https://cmake.org/install/)
+
+4. Build the project using CMake by navigating to the folder `cuTAGI` and entering these lines in `Terminal`
+ ```sh
+ cmake . -B build
+ cmake --build build --config RelWithDebInfo -j 16
+ ```
+
+### Windows
+1. Download and install MS Visual Studio 2019 community and C/C++ by following [these instructions](https://docs.microsoft.com/en-us/cpp/build/vscpp-step-0-installation?view=msvc-170)
+
+2. Install [CUDA toolkit](https://developer.nvidia.com/cuda-toolkit) >=10.1 and add CUDA location to Environment variables [(see Step 5.3)](https://towardsdatascience.com/installing-tensorflow-with-cuda-cudnn-and-gpu-support-on-windows-10-60693e46e781)
+
+3. Copy all extension files from CUDA to MS Visual Studio. See this [link](https://github.com/mitsuba-renderer/mitsuba2/issues/103#issuecomment-618378963) for further details.
+ ```sh
+ COPY FROM C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\extras\visual_studio_integration\MSBuildExtensions
+ TO C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Microsoft\VC\v160\BuildCustomizations
+ ```
+4. Download and install CMake [Windows x64 Installer](https://cmake.org/download/) and add the install directory (e.g., `C:\Program Files\CMake\bin`) to PATH in [Environment variables](https://docs.microsoft.com/en-us/previous-versions/office/developer/sharepoint-2010/ee537574(v=office.14))
+
+5. Add CMake CUDA compiler to [Environment variables](https://docs.microsoft.com/en-us/previous-versions/office/developer/sharepoint-2010/ee537574(v=office.14)).
+ ```sh
+ variable = CMAKE_CUDA_COMPILER
+ value = C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.1\bin\nvcc.exe
+ ```
+6. Build the project using CMake by navigating to the folder `cuTAGI` and entering these lines in `Command Prompt`
+ ```sh
+ cmake . -B build
+ cmake --build build --config RelWithDebInfo -j 16
+ ```
+
+*NOTE: Users must enter the CUDA version installed on their machine. Here, we illustrate the installation with CUDA v10.1 (see step 1 for Ubuntu and steps 3 & 5 for Windows).
+
+### Mac OS (CPU Version)
+1. [Install gcc and g++](https://formulae.brew.sh/formula/gcc) via `Terminal`
+ ```sh
+ brew install gcc
+ ```
+2. Install CMake by following [these instructions](https://cmake.org/install/)
+
+3. [Add CMake to PATH](https://code2care.org/pages/permanently-set-path-variable-in-mac-zsh-shell). Add the following line to your `.zshrc` file
+ ```sh
+ export PATH="/Applications/CMake.app/Contents/bin/:$PATH"
+ ```
+
+4. Build the project using CMake by navigating to the folder `cuTAGI` and entering these lines in `Terminal`
+ ```sh
+ cmake . -B build
+ cmake --build build --config RelWithDebInfo -j 16
+ ```
+
+### VS Code
+1. Install gcc and g++ for your operating system (Ubuntu, Windows, or macOS)
+2. Install CMake
+3. Install [the following prerequisites](https://code.visualstudio.com/docs/cpp/cmake-linux)
+* Visual Studio Code
+* C++ extension for VS Code
+* CMake Tools extension for VS Code
\ No newline at end of file
diff --git a/guide/quick-tutorial.md b/guide/quick-tutorial.md
new file mode 100644
index 0000000..90a9da3
--- /dev/null
+++ b/guide/quick-tutorial.md
@@ -0,0 +1,81 @@
+
+
+## Introduction
+
+In this tutorial, we will see how to use pytagi to solve a simple regression problem. We will use a 1D toy dataset and a feedforward neural network (FNN) with a simple architecture.
+
+## Define user input and data
+
+In this simple example, we will use a 1D toy dataset. The dataset is composed of 10 training samples and 100 test samples and can be found in the [github repository](https://github.com/lhnguyen102/cuTAGI/tree/main/data/toy_example).
+
+```python
+# User-input
+num_inputs = 1
+num_outputs = 1
+x_train_file = "./data/toy_example/x_train_1D.csv"
+y_train_file = "./data/toy_example/y_train_1D.csv"
+x_test_file = "./data/toy_example/x_test_1D.csv"
+y_test_file = "./data/toy_example/y_test_1D.csv"
+```
+
+## Build Regression Model
+
+We will use an FNN with a simple architecture, relying on the `RegressionMLP` class, which is already defined and suited to this basic regression problem (you can find the class implementation [here](modules/models?id=regression-mlp-class)).
+
+```python
+# Model
+net_prop = RegressionMLP()
+```
+
+If you want to use a different model, you can define your own class; just make sure that it inherits from the `NetProp` class (more information on the [models page](modules/models?id=mlp-generic-class)), as sketched below.
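+
+For instance, a minimal sketch of a custom model could look as follows. The layer codes mirror those used on the [models page](modules/models.md); `CustomMLP` is a hypothetical name chosen for illustration:
+
+```python
+from pytagi import NetProp
+
+class CustomMLP(NetProp):
+    """Hypothetical MLP with two hidden layers of 100 ReLU units"""
+
+    def __init__(self) -> None:
+        super().__init__()
+        self.layers = [1, 1, 1, 1]       # layer types
+        self.nodes = [1, 100, 100, 1]    # units per layer
+        self.activations = [0, 4, 4, 0]  # 4 = ReLU
+        self.batch_size = 4
+        self.sigma_v = 0.06              # observation noise's standard deviation
+        self.device = "cpu"
+```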
+
+## Data loader
+
+We will make use of the [RegressionDataLoader](modules/data-loader.md) class to load and process the data. The *process_data* function requires the training and test input and output files in **csv** format.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## Train and test the model
+
+Using the [regression class](modules/regression?id=regression-class), which relies on TAGI, we will train and test the model. To perform this task, we also need to specify the number of epochs.
+
+```python
+# Optional: Visualize the test using visualizer.py
+viz = PredictionViz(task_name="regression", data_name="toy1D")
+
+num_epochs = 50
+
+# Train and test
+reg_task = Regression(num_epochs=num_epochs,
+ data_loader=data_loader,
+ net_prop=net_prop,
+ viz=viz)
+reg_task.train()
+reg_task.predict(std_factor=3)
+```
+
+*Learn more about the `PredictionViz` class [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).*
+
+## Results
+
+Since we have created a visualization object and passed it to the regression class, we will be able to see the prediction results at the end of the run. The black line is the true function, the red line is the predicted function, and the red zone is the confidence region.
+
+![1D toy regression problem](../images/1D_toy_regression.png)
diff --git a/images/.DS_Store b/images/.DS_Store
new file mode 100644
index 0000000..2e542a7
Binary files /dev/null and b/images/.DS_Store differ
diff --git a/images/1D_toy_regression.png b/images/1D_toy_regression.png
new file mode 100644
index 0000000..ed4603a
Binary files /dev/null and b/images/1D_toy_regression.png differ
diff --git a/images/1D_toy_regression_data.png b/images/1D_toy_regression_data.png
new file mode 100644
index 0000000..b4aa3fb
Binary files /dev/null and b/images/1D_toy_regression_data.png differ
diff --git a/images/1D_toy_regression_derivative.png b/images/1D_toy_regression_derivative.png
new file mode 100644
index 0000000..fbf4f0a
Binary files /dev/null and b/images/1D_toy_regression_derivative.png differ
diff --git a/images/1D_toy_regression_derivative_data.png b/images/1D_toy_regression_derivative_data.png
new file mode 100644
index 0000000..ca83d98
Binary files /dev/null and b/images/1D_toy_regression_derivative_data.png differ
diff --git a/images/1D_toy_regression_fullcov.png b/images/1D_toy_regression_fullcov.png
new file mode 100644
index 0000000..c1657ed
Binary files /dev/null and b/images/1D_toy_regression_fullcov.png differ
diff --git a/images/1D_toy_regression_fullcov_data.png b/images/1D_toy_regression_fullcov_data.png
new file mode 100644
index 0000000..43b9e02
Binary files /dev/null and b/images/1D_toy_regression_fullcov_data.png differ
diff --git a/images/1D_toy_regression_heteros.png b/images/1D_toy_regression_heteros.png
new file mode 100644
index 0000000..002da8d
Binary files /dev/null and b/images/1D_toy_regression_heteros.png differ
diff --git a/images/1D_toy_regression_heteros_data.png b/images/1D_toy_regression_heteros_data.png
new file mode 100644
index 0000000..bd1ea64
Binary files /dev/null and b/images/1D_toy_regression_heteros_data.png differ
diff --git a/images/GitHub-Mark.png b/images/GitHub-Mark.png
new file mode 100644
index 0000000..9490ffc
Binary files /dev/null and b/images/GitHub-Mark.png differ
diff --git a/images/TAGI_2018.png b/images/TAGI_2018.png
new file mode 100644
index 0000000..71bfd47
Binary files /dev/null and b/images/TAGI_2018.png differ
diff --git a/images/architectures/arch-3-cov-cifar.png b/images/architectures/arch-3-cov-cifar.png
new file mode 100644
index 0000000..0d818f8
Binary files /dev/null and b/images/architectures/arch-3-cov-cifar.png differ
diff --git a/images/bayeswork.jpg b/images/bayeswork.jpg
new file mode 100644
index 0000000..d30d8ce
Binary files /dev/null and b/images/bayeswork.jpg differ
diff --git a/images/bayeswork_v2.jpg b/images/bayeswork_v2.jpg
new file mode 100644
index 0000000..d65a79b
Binary files /dev/null and b/images/bayeswork_v2.jpg differ
diff --git a/images/cifar_autoencoder.png b/images/cifar_autoencoder.png
new file mode 100644
index 0000000..b152408
Binary files /dev/null and b/images/cifar_autoencoder.png differ
diff --git a/images/cifar_autoencoder_disp.png b/images/cifar_autoencoder_disp.png
new file mode 100644
index 0000000..5ce594a
Binary files /dev/null and b/images/cifar_autoencoder_disp.png differ
diff --git a/images/cupyTAGI.png b/images/cupyTAGI.png
new file mode 100644
index 0000000..fadfd49
Binary files /dev/null and b/images/cupyTAGI.png differ
diff --git a/images/electricity_lstm.png b/images/electricity_lstm.png
new file mode 100644
index 0000000..9b94680
Binary files /dev/null and b/images/electricity_lstm.png differ
diff --git a/images/electricity_lstm_data.png b/images/electricity_lstm_data.png
new file mode 100644
index 0000000..18e2b62
Binary files /dev/null and b/images/electricity_lstm_data.png differ
diff --git a/images/electricity_lstm_full.png b/images/electricity_lstm_full.png
new file mode 100644
index 0000000..120d1f5
Binary files /dev/null and b/images/electricity_lstm_full.png differ
diff --git a/images/icons/docsify-darklight-theme-logo.png b/images/icons/docsify-darklight-theme-logo.png
new file mode 100644
index 0000000..ea307cc
Binary files /dev/null and b/images/icons/docsify-darklight-theme-logo.png differ
diff --git a/images/icons/favicon.png b/images/icons/favicon.png
new file mode 100644
index 0000000..a372b48
Binary files /dev/null and b/images/icons/favicon.png differ
diff --git a/images/icons/moon.svg b/images/icons/moon.svg
new file mode 100644
index 0000000..bc0bf10
--- /dev/null
+++ b/images/icons/moon.svg
@@ -0,0 +1 @@
+
\ No newline at end of file
diff --git a/images/icons/sun.svg b/images/icons/sun.svg
new file mode 100644
index 0000000..8fb9960
--- /dev/null
+++ b/images/icons/sun.svg
@@ -0,0 +1 @@
+
\ No newline at end of file
diff --git a/images/mnist_autoencoder_disp.png b/images/mnist_autoencoder_disp.png
new file mode 100644
index 0000000..dda347b
Binary files /dev/null and b/images/mnist_autoencoder_disp.png differ
diff --git a/images/toy_lstm.png b/images/toy_lstm.png
new file mode 100644
index 0000000..9fe2b1e
Binary files /dev/null and b/images/toy_lstm.png differ
diff --git a/images/toy_lstm_data.png b/images/toy_lstm_data.png
new file mode 100644
index 0000000..437701e
Binary files /dev/null and b/images/toy_lstm_data.png differ
diff --git a/index.html b/index.html
new file mode 100644
index 0000000..bd97a0b
--- /dev/null
+++ b/index.html
@@ -0,0 +1,113 @@
+
+
+
+
+
+
+
+
+ cuTAGI Tutorial
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ Loading ...
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/modules/_sidebar.md b/modules/_sidebar.md
new file mode 100644
index 0000000..6936bcc
--- /dev/null
+++ b/modules/_sidebar.md
@@ -0,0 +1,9 @@
+- [**pyTAGI Modules**](modules/modules.md)
+
+ - [Models](modules/models.md)
+ - [Dataloader](modules/data-loader.md)
+ - [Regression](modules/regression.md)
+ - [Classification](modules/classification.md)
+ - [Time Series Forecaster](modules/time_series_forecaster.md)
+ - [Autoencoder](modules/autoencoder.md)
+
diff --git a/modules/autoencoder.md b/modules/autoencoder.md
new file mode 100644
index 0000000..d8243d9
--- /dev/null
+++ b/modules/autoencoder.md
@@ -0,0 +1,111 @@
+# Autoencoder class
+
+The `Autoencoder` class is responsible for performing the autoencoder task using the TAGI algorithm.
+
+
+
+
+
+
+ GitHub Source code
+
+
+
+## Attributes
+
+- `utils`: An instance of the `Utils` class.
+- `num_epochs`: The number of epochs for training.
+- `data_loader`: A dictionary containing the data loader.
+- `encoder_prop`: An instance of the `NetProp` class representing the properties of the encoder network.
+- `decoder_prop`: An instance of the `NetProp` class representing the properties of the decoder network.
+- `encoder`: An instance of the `TagiNetwork` class representing the encoder network.
+- `decoder`: An instance of the `TagiNetwork` class representing the decoder network.
+- `viz`: An optional instance of the `ImageViz` class for visualization.
+- `dtype`: The data type (default: `np.float32`).
+
+## *constructor* method
+
+> Constructor for the Autoencoder class.
+
+```python
+def __init__(
+ self,
+ num_epochs: int,
+ data_loader: dict,
+ encoder_prop: NetProp,
+ decoder_prop: NetProp,
+ encoder_param: Union[Param, None] = None,
+ decoder_param: Union[Param, None] = None,
+ viz: Union[ImageViz, None] = None,
+ dtype=np.float32
+) -> None:
+```
+
+**Parameters**
+- `num_epochs`: An integer representing the number of epochs for training.
+- `data_loader`: A dictionary containing the data loader.
+- `encoder_prop`: An instance of the `NetProp` class representing the properties of the encoder network.
+- `decoder_prop`: An instance of the `NetProp` class representing the properties of the decoder network.
+- `encoder_param`: An optional instance of the `Param` class representing the parameters of the encoder network (default: `None`).
+- `decoder_param`: An optional instance of the `Param` class representing the parameters of the decoder network (default: `None`).
+- `viz`: An optional instance of the `ImageViz` class for visualization (default: `None`).
+- `dtype`: The data type (default: `np.float32`).
+
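+As a minimal usage sketch (assuming `MnistEncoder` and `MnistDecoder` from the [models page](modules/models.md) and a `data_loader` dictionary built with the `MnistDataloader` class):
+
+```python
+ae_task = Autoencoder(num_epochs=10,
+                      data_loader=data_loader,
+                      encoder_prop=MnistEncoder(),
+                      decoder_prop=MnistDecoder())
+ae_task.train()  # also calls predict() at the end of training
+```
+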
+## *train* method
+
+```python
+def train(self) -> None:
+ """Train encoder and decoder"""
+```
+
+Trains the encoder and decoder networks using the TAGI algorithm. It performs the following steps:
+1. Initializes inputs and outputs.
+2. Performs training iterations for each epoch.
+3. Updates the network parameters and hidden states.
+4. Computes the loss and displays the progress.
+5. Calls the `predict` method.
+
+## *predict* method
+
+```python
+def predict(self) -> None:
+ """Generate images"""
+```
+
+Generates images using the trained encoder and decoder networks. It performs the following steps:
+1. Initializes inputs.
+2. Makes predictions using the encoder and decoder networks.
+3. Retrieves the generated images.
+4. Performs visualization if the `viz` attribute is not `None`.
+
+## *init_inputs* method
+
+```python
+def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initialize the covariance matrix for inputs"""
+```
+
+Initializes the covariance matrix for the inputs. It returns the initialized covariance matrices `Sx_batch` and `Sx_f_batch`.
+
+
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+**Returns**
+- A tuple containing the initialized covariance matrices `Sx_batch` and `Sx_f_batch`.
+
+## *init_outputs* method
+
+```python
+def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initialize the covariance matrix for outputs"""
+```
+
+Initializes the covariance matrix for the outputs. It returns the initialized covariance matrices `V_batch` and `ud_idx_batch`.
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+**Returns**
+- A tuple containing the initialized covariance matrices `V_batch` and `ud_idx_batch`.
\ No newline at end of file
diff --git a/modules/classification.md b/modules/classification.md
new file mode 100644
index 0000000..a7f6faf
--- /dev/null
+++ b/modules/classification.md
@@ -0,0 +1,118 @@
+# The Classifier class
+
+The `Classifier` class is responsible for performing image classification using the TAGI algorithm.
+
+
+
+
+
+
+ Github Source code
+
+
+
+
+## Attributes
+
+- `hr_softmax`: Instance of the HierarchicalSoftmax class.
+- `utils`: Instance of the Utils class.
+- `num_epochs`: Number of epochs for training.
+- `data_loader`: Dictionary containing data loaders.
+- `net_prop`: Instance of the NetProp class representing the network properties.
+- `num_classes`: Number of classes.
+- `network`: Instance of the TagiNetwork class representing the network.
+
+## *constructor* method
+
+> Constructor for the Classifier class.
+
+```python
+def __init__(self, num_epochs: int, data_loader: dict, net_prop: NetProp,
+ num_classes: int) -> None:
+```
+
+**Parameters**
+- `num_epochs`: An integer representing the number of epochs for training.
+- `data_loader`: A dictionary containing data loaders.
+- `net_prop`: An instance of the NetProp class representing the network properties.
+- `num_classes`: An integer representing the number of classes.
+
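+As a minimal usage sketch (assuming `MnistMLP` from the [models page](modules/models.md) and a `data_loader` dictionary built with the `MnistDataloader` class):
+
+```python
+clf_task = Classifier(num_epochs=10,
+                      data_loader=data_loader,
+                      net_prop=MnistMLP(),
+                      num_classes=10)
+clf_task.train()
+clf_task.predict()
+```
+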
+## *num_classes* getter method
+
+```python
+@property
+def num_classes(self) -> int:
+ """Get number of classes"""
+```
+
+**Returns**
+- `int`: The number of classes.
+
+## *num_classes* setter method
+
+```python
+@num_classes.setter
+def num_classes(self, value: int) -> None:
+ """Set number of classes"""
+```
+
+**Parameters**
+- `value`: An integer representing the number of classes.
+
+## *train* method
+
+```python
+def train(self) -> None:
+ """Train the network using TAGI"""
+```
+
+> See [TAGI](https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf) paper for more information.
+
+## *predict* method
+
+```python
+def predict(self) -> None:
+ """Make prediction using TAGI"""
+```
+
+> See [TAGI](https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf) paper for more information.
+
+## *train_one_hot* method
+
+```python
+def train_one_hot(self) -> None:
+ """Train the network using TAGI with one-hot encoded labels"""
+```
+
+## *predict_one_hot* method
+
+```python
+def predict_one_hot(self) -> None:
+ """Make predictions using TAGI with one-hot encoded labels."""
+```
+
+## *init_inputs* method
+
+```python
+def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+    """Initialize the covariance matrix for inputs"""
+```
+
+**Parameters**
+ - `batch_size` (int): The batch size.
+
+**Returns**
+ - `Tuple[np.ndarray, np.ndarray]`: A tuple containing the input covariance matrices.
+
+## *init_outputs* method
+
+```python
+def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+    """Initialize the covariance matrix for outputs"""
+```
+
+**Parameters**
+ - `batch_size` (int): The batch size.
+
+**Returns**
+ - `Tuple[np.ndarray, np.ndarray]`: A tuple containing the output covariance matrices.
diff --git a/modules/data-loader.md b/modules/data-loader.md
new file mode 100644
index 0000000..9f052f5
--- /dev/null
+++ b/modules/data-loader.md
@@ -0,0 +1,292 @@
+
+
+# data_loader.py
+
+
+
+
+
+
+ Github Source code
+
+
+
+# The DataloaderBase class
+
+This class represents a template for a data loader.
+
+## Attributes
+
+- `normalizer`: An instance of the [Normalizer](api/utils?id=the-normalizer-class) class.
+
+## *constructor* method
+
+> Constructor for the DataloaderBase class.
+
+```python
+def __init__(self, batch_size: int) -> None:
+```
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+## *process_data* method
+
+```python
+@abstractmethod
+def process_data(self) -> dict:
+ """Abstract method for processing the data"""
+```
+
+**Returns**
+- `dict`: A dictionary containing the processed data.
+
+## *create_data_loader* method
+
+```python
+def create_data_loader(self, raw_input: np.ndarray, raw_output: np.ndarray) -> list:
+ """Create dataloader based on batch size"""
+```
+
+**Parameters**
+- `raw_input`: Raw input data as a NumPy array.
+- `raw_output`: Raw output data as a NumPy array.
+
+**Returns**
+- `list`: A list of tuples representing the input-output pairs in each batch.
+
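+A small illustrative sketch (using the `RegressionDataLoader` subclass described below; each element of the returned list is assumed to be an input/output pair):
+
+```python
+import numpy as np
+
+# Toy data: 10 samples with one input and one output feature
+x = np.random.rand(10, 1)
+y = np.sin(x)
+
+loader = RegressionDataLoader(batch_size=4, num_inputs=1, num_outputs=1)
+batches = loader.create_data_loader(raw_input=x, raw_output=y)
+for x_batch, y_batch in batches:
+    print(x_batch.shape, y_batch.shape)
+```
+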
+## *split_data* method
+
+```python
+@staticmethod
+def split_data(data: int, test_ratio: float = 0.2, val_ratio: float = 0.0) -> dict:
+ """Split data into training, validation, and test sets"""
+```
+
+**Parameters**
+- `data`: Input data as a NumPy array.
+- `test_ratio`: Optional. Float representing the ratio of test data (default: 0.2).
+- `val_ratio`: Optional. Float representing the ratio of validation data (default: 0.0).
+
+**Returns**
+- `dict`: A dictionary containing the split data sets.
+
+## *load_data_from_csv* method
+
+```python
+@staticmethod
+def load_data_from_csv(data_file: str) -> pd.DataFrame:
+ """Load data from a CSV file"""
+```
+
+**Parameters**
+- `data_file`: Path to the CSV file.
+
+**Returns**
+- `pd.DataFrame`: The loaded data as a Pandas DataFrame.
+
+## *split_evenly* method
+
+```python
+@staticmethod
+def split_evenly(num_data, chunk_size: int):
+ """Split data evenly"""
+```
+
+**Parameters**
+- `num_data`: The number of data points.
+- `chunk_size`: The size of each chunk.
+
+**Returns**
+- `np.ndarray`: An array of indices representing the split data.
+
+## *split_reminder* method
+
+```python
+@staticmethod
+def split_reminder(num_data: int, chunk_size: int):
+ """Pad the reminder"""
+```
+
+**Parameters**
+- `num_data`: The number of data points.
+- `chunk_size`: The size of each chunk.
+
+**Returns**
+- `np.ndarray`: An array of indices representing the split data.
+
+---
+
+
+
+
+
+
+
+
+
+
+
+# The RegressionDataLoader class
+
+A class for loading and formatting data that is fed to a neural network. This class inherits from the DataloaderBase class.
+
+## *constructor* method
+
+> Constructor for the RegressionDataLoader class.
+
+```python
+def __init__(self, batch_size: int, num_inputs: int, num_outputs: int) -> None:
+```
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+- `num_inputs`: An integer representing the number of input features.
+- `num_outputs`: An integer representing the number of output features.
+
+## *process_data* method
+
+```python
+def process_data(self, x_train_file: str, y_train_file: str,
+ x_test_file: str, y_test_file: str) -> dict:
+ """Process data from the csv file"""
+```
+
+**Parameters**
+- `x_train_file`: A string representing the file path of the input training data in CSV format.
+- `y_train_file`: A string representing the file path of the output training data in CSV format.
+- `x_test_file`: A string representing the file path of the input testing data in CSV format.
+- `y_test_file`: A string representing the file path of the output testing data in CSV format.
+
+**Returns**
+- `dict`: A dictionary containing the processed data and normalization parameters.
+
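+A minimal sketch using the 1D toy dataset paths from the quick tutorial:
+
+```python
+reg_data_loader = RegressionDataLoader(batch_size=4, num_inputs=1, num_outputs=1)
+data_loader = reg_data_loader.process_data(
+    x_train_file="./data/toy_example/x_train_1D.csv",
+    y_train_file="./data/toy_example/y_train_1D.csv",
+    x_test_file="./data/toy_example/x_test_1D.csv",
+    y_test_file="./data/toy_example/y_test_1D.csv")
+```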
+
+
+
+
+
+
+
+
+
+
+# MnistDataloader class
+
+Data loader for the MNIST dataset.
+
+## *process_data* method
+
+```python
+def process_data(self, x_train_file: str, y_train_file: str,
+ x_test_file: str, y_test_file: str) -> dict:
+ """Process mnist images"""
+```
+
+**Parameters**
+- `x_train_file`: Path to the file containing mnist training images.
+- `y_train_file`: Path to the file containing mnist training labels.
+- `x_test_file`: Path to the file containing mnist test images.
+- `y_test_file`: Path to the file containing mnist test labels.
+
+**Returns**
+- `dict`: A dictionary containing the processed data.
+
+
+
+
+
+
+
+
+
+
+
+# The TimeSeriesDataloader class
+
+Data loader for time series.
+
+## *constructor* method
+
+> Constructor for the TimeSeriesDataloader class.
+
+```python
+def __init__(self, batch_size: int, output_col: np.ndarray,
+ input_seq_len: int, output_seq_len: int, num_features: int,
+ stride: int) -> None:
+```
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+- `output_col`: A NumPy array representing the output column.
+- `input_seq_len`: An integer representing the length of the input sequence.
+- `output_seq_len`: An integer representing the length of the output sequence.
+- `num_features`: An integer representing the number of features.
+- `stride`: An integer representing the stride.
+
+## *process_data* method
+
+```python
+def process_data(self, x_train_file: str, datetime_train_file: str,
+ x_test_file: str, datetime_test_file: str) -> dict:
+ """Process time series"""
+```
+
+**Parameters**
+- `x_train_file`: A string representing the file path for training input data.
+- `datetime_train_file`: A string representing the file path for training datetime data.
+- `x_test_file`: A string representing the file path for testing input data.
+- `datetime_test_file`: A string representing the file path for testing datetime data.
+
+**Returns**
+- `dict`: A dictionary containing the processed data.
+
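+A hypothetical configuration (the file names below are placeholders) for a univariate series, predicting one step ahead from the previous 24 observations:
+
+```python
+import numpy as np
+
+ts_loader = TimeSeriesDataloader(batch_size=10,
+                                 output_col=np.array([0]),
+                                 input_seq_len=24,
+                                 output_seq_len=1,
+                                 num_features=1,
+                                 stride=1)
+data_loader = ts_loader.process_data(x_train_file="x_train.csv",
+                                     datetime_train_file="datetime_train.csv",
+                                     x_test_file="x_test.csv",
+                                     datetime_test_file="datetime_test.csv")
+```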
+
+
+
+
+
+
+
+
+
+
+# ClassificationDataloader class
+
+Data loader for CSV datasets used for classification.
+
+## *constructor* method
+
+> Constructor for the ClassificationDataloader class.
+
+```python
+def __init__(self, batch_size: int) -> None:
+```
+
+**Parameters**
+- `batch_size`: An integer representing the batch size for the data loader.
+
+## *process_data* method
+
+```python
+def process_data(self, x_train_file: str, y_train_file: str,
+ x_test_file: str, y_test_file: str) -> dict:
+ """Process data from the csv file"""
+```
+
+**Parameters**
+- `x_train_file`: File path for the training images CSV file.
+- `y_train_file`: File path for the training labels CSV file.
+- `x_test_file`: File path for the test images CSV file.
+- `y_test_file`: File path for the test labels CSV file.
+
+**Returns**
+- `dict`: A dictionary containing the processed data.
diff --git a/modules/models.md b/modules/models.md
new file mode 100644
index 0000000..a7dd6f4
--- /dev/null
+++ b/modules/models.md
@@ -0,0 +1,330 @@
+# Models
+
+?> **All models explained in this document can be found in the [python examples](https://github.com/lhnguyen102/cuTAGI/blob/main/python_examples/model.py) source code.**
+
+In order to use the pyTAGI library, it is necessary to create a model class that inherits from the `NetProp` class. `NetProp` is essentially a wrapper around cuTAGI and is described in detail in the [Tagi Network API](api/tagi-network.md) section. You therefore need to import it first.
+
+```python
+from pytagi import NetProp
+```
+
+## MLP Generic class
+
+!> *This class is not implemented in the original code; it is only a suggestion that may or may not be useful.*
+
+This class acts as a generic base for the models below. It can be customized as desired and takes the following arguments:
+
+- layers: list of integers.
+- nodes: list of integers.
+- activations: list of integers.
+- batch_size: integer.
+- *sigma_v: float.
+- *sigma_v_min: float.
+- *noise_type: string.
+- *noise_gain: float.
+- *init_method: string.
+- *device: string.
+
+*Arguments marked with \* are not mandatory.*
+
+```python
+from typing import Union
+
+class MLP(NetProp):
+ """Multi-layer perceptron"""
+
+ def __init__(self,
+ layers: list,
+ nodes: list,
+ activations: list,
+ batch_size: int,
+ sigma_v: Union[float , None] = None,
+ sigma_v_min: Union[float, None] = None,
+ noise_type: Union[str, None] = None,
+ noise_gain: Union[float, None] = None,
+ init_method: Union[str, None] = "He",
+ device: Union[str, None] = "cpu") -> None:
+ super().__init__()
+ self.layers = layers
+ self.nodes = nodes
+ self.activations = activations
+ self.batch_size = batch_size
+ if sigma_v is not None:
+ self.sigma_v = sigma_v
+ if sigma_v_min is not None:
+ self.sigma_v_min = sigma_v_min
+ if noise_type is not None:
+ self.noise_type = noise_type
+ if noise_gain is not None:
+ self.noise_gain = noise_gain
+        self.init_method = init_method
+ self.device = device
+```
+
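+As an illustration, the (hypothetical) generic class above could be instantiated as follows for a regression network with two hidden layers of 50 ReLU units:
+
+```python
+net_prop = MLP(layers=[1, 1, 1, 1],
+               nodes=[1, 50, 50, 1],
+               activations=[0, 4, 4, 0],
+               batch_size=10,
+               sigma_v=0.3)
+```
+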
+## Regression MLP class
+
+This simple model has one input layer, one hidden layer, and one output layer. The input layer has a single variable, the hidden layer has 50 hidden units, and the output layer has one variable. The activation function of the hidden layer is ReLU and the batch size is four. The observation noise's standard deviation and its minimum are both set to 0.06. When one wishes to use a scheduler to decrease `sigma_v` over epochs, `sigma_v_min` should be chosen to be smaller than `sigma_v` (note: this is commonly the case for CNNs).
+
+```python
+# Model
+from pytagi import NetProp
+
+class RegressionMLP(NetProp):
+ """Multi-layer perceptron for regression task"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1] # [input layer, hidden layer, output layer]
+ self.nodes = [1, 50, 1] # [#inputs, #hidden units, #outputs ]
+ self.activations = [0, 4, 0] # [~, ReLU activation, ~ ]
+ self.batch_size = 4 # Number of observation per batch
+ self.sigma_v = 0.06 # Observation error's standard deviation
+ self.sigma_v_min: float = 0.06 # Min. observation error's std for the scheduler
+ self.device = "cpu" # CPU computations
+```
+
+## Heteroscedastic Regression MLP class
+
+```python
+class HeterosMLP(NetProp):
+    """Multi-layer perceptron for regression tasks where the
+    output's noise varies over time"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [1, 100, 100, 2] # output layer = [mean, std]
+ self.activations: list = [0, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 0
+ self.sigma_v_min: float = 0
+ self.noise_type: str = "heteros"
+ self.noise_gain: float = 1.0
+ self.init_method: str = "He"
+ self.device: str = "cpu"
+```
+
+## Full-Covariance Regression MLP class
+
+```python
+
+class FullCovMLP(NetProp):
+ """Multi-layer perceptron for performing full-covariance prediction and
+ inference"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [1, 30, 30, 1]
+ self.activations: list = [0, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 0.5
+ self.sigma_v_min: float = 0.065
+ self.decay_factor_sigma_v: float = 0.95
+ self.sigma_x: float = 0.3485
+ self.is_full_cov: bool = True
+ self.multithreading: bool = True
+ self.device: str = "cpu"
+```
+
+## Derivative Regression MLP class
+
+```python
+class DervMLP(NetProp):
+ """Multi-layer perceptron for computing the derivative of a
+ regression task"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 1, 1]
+ self.nodes: list = [1, 64, 64, 1]
+ self.activations: list = [0, 1, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 0.3
+ self.sigma_v_min: float = 0.1
+ self.decay_factor_sigma_v: float = 0.99
+ self.collect_derivative: bool = True
+ self.init_method: str = "He"
+```
+
+## MNIST Classification MLP class
+
+```python
+class MnistMLP(NetProp):
+ """Multi-layer perceptron for mnist classification.
+
+    NOTE: The number of hidden states for the last layer is 11 because
+    TAGI uses the hierarchical softmax for the classification task.
+ Further details can be found in
+ https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf
+ """
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1, 1]
+ self.nodes = [784, 100, 100, 11]
+ self.activations = [0, 4, 4, 0]
+ self.batch_size = 10
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cpu"
+```
+
+## 2 CONV. MNIST Classification MLP class
+
+```python
+class ConvMLP(NetProp):
+    """Multi-layer perceptron for MNIST classification using a CNN with 2
+ convolutional layers."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 4, 2, 4, 1, 1]
+ self.nodes = [784, 0, 0, 0, 0, 20, 11]
+ self.kernels = [4, 3, 5, 3, 1, 1, 1]
+ self.strides = [1, 2, 1, 2, 0, 0, 0]
+ self.widths = [28, 27, 13, 9, 4, 1, 1]
+ self.heights = [28, 27, 13, 9, 4, 1, 1]
+ self.filters = [1, 32, 32, 64, 64, 150, 1]
+ self.pads = [1, 0, 0, 0, 0, 0, 0]
+ self.pad_types = [1, 0, 0, 0, 0, 0, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cuda"
+```
+
+## BATCH NORMALIZATION MLP class
+
+```python
+class ConvBatchNormMLP(NetProp):
+    """Multi-layer perceptron for MNIST classification using a CNN with batch
+ normalization."""
+
+ """TODO: This class is not yet implemented."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 6, 4, 2, 6, 4, 1, 1]
+ self.nodes = [784, 0, 0, 0, 0, 0, 0, 150, 11]
+ self.kernels = [4, 3, 1, 5, 3, 1, 1, 1, 1]
+ self.strides = [1, 1, 1, 2, 1, 1, 2, 0, 0]
+ self.widths = [28, 27, 27, 13, 9, 9, 4, 1, 1]
+ self.heights = [28, 27, 27, 13, 9, 9, 4, 1, 1]
+ self.filters = [1, 32, 32, 32, 64, 64, 64, 0, 1]
+ self.pads = [0, 1, 0, 0, 0, 0, 0, 0, 0]
+ self.pad_types = [0, 1, 0, 0, 0, 0, 0, 0, 0]
+ self.activations = [0, 4, 0, 0, 4, 0, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cpu"
+```
+
+## 3 CONV CIFAR10 Classification MLP class
+
+```python
+class ConvCifarMLP(NetProp):
+    """Multi-layer perceptron for CIFAR-10 classification."""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [2, 2, 4, 2, 4, 2, 4, 1, 1]
+ self.nodes = [3072, 0, 0, 0, 0, 0, 0, 64, 11]
+ self.kernels = [3, 5, 3, 5, 3, 5, 3, 1, 1]
+ self.strides = [1, 1, 2, 1, 2, 1, 2, 0, 0]
+ self.widths = [32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.heights = [32, 32, 16, 16, 8, 8, 4, 1, 1]
+ self.filters = [3, 32, 32, 32, 32, 64, 64, 64, 1]
+ self.pads = [0, 1, 1, 1, 1, 1, 1, 0, 0]
+ self.pad_types = [0, 2, 1, 2, 1, 2, 1, 0, 0]
+ self.activations = [0, 4, 0, 4, 0, 4, 0, 4, 12]
+ self.batch_size = 16
+ self.sigma_v = 1
+ self.is_idx_ud = True
+ self.multithreading = True
+ self.device = "cuda"
+```
+
+## LSTM for Time Series Forecasting
+
+```python
+class TimeSeriesLSTM(NetProp):
+ """LSTM for time series forecasting"""
+
+ def __init__(self,
+ input_seq_len: int,
+ output_seq_len: int,
+ seq_stride: int = 1,
+ *args,
+ **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+ self.layers: list = [1, 7, 7, 1]
+ self.nodes: list = [1, 5, 5, 1]
+ self.activations: list = [0, 0, 0, 0]
+ self.batch_size: int = 10
+ self.input_seq_len: int = input_seq_len
+ self.output_seq_len: int = output_seq_len
+ self.seq_stride: int = seq_stride
+ self.sigma_v: float = 2
+ self.sigma_v_min: float = 0.3
+ self.decay_factor_sigma_v: float = 0.95
+ self.multithreading: bool = False
+ self.device: str = "cpu"
+```
+
+## MNIST Encoder
+
+```python
+class MnistEncoder(NetProp):
+ """Encoder network for Mnist example"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [2, 2, 6, 4, 2, 6, 4, 1, 1]
+ self.nodes: list = [784, 0, 0, 0, 0, 0, 0, 100, 10]
+ self.kernels: list = [3, 1, 3, 3, 1, 3, 1, 1, 1]
+ self.strides: list = [1, 0, 2, 1, 0, 2, 0, 0, 0]
+ self.widths: list = [28, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.heights: list = [28, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.filters: list = [1, 16, 16, 16, 32, 32, 32, 1, 1]
+ self.pads: list = [1, 0, 1, 1, 0, 1, 0, 0, 0]
+ self.pad_types: list = [1, 0, 2, 1, 0, 2, 0, 0, 0]
+ self.activations: list = [0, 4, 0, 0, 4, 0, 0, 4, 0]
+ self.batch_size: int = 10
+ self.is_output_ud: bool = False
+ self.init_method: str = "He"
+ self.device: str = "cuda"
+```
+
+## MNIST Decoder
+
+```python
+class MnistDecoder(NetProp):
+ """Decoder network for Mnist example"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers: list = [1, 1, 21, 21, 21]
+ self.nodes: list = [10, 1568, 0, 0, 784]
+ self.kernels: list = [1, 3, 3, 3, 1]
+ self.strides: list = [0, 2, 2, 1, 0]
+ self.widths: list = [0, 7, 0, 0, 0]
+ self.heights: list = [0, 7, 0, 0, 0]
+ self.filters: list = [1, 32, 32, 16, 1]
+ self.pads: list = [0, 1, 1, 1, 0]
+ self.pad_types: list = [0, 2, 2, 1, 0]
+ self.activations: list = [0, 4, 4, 4, 0]
+ self.batch_size: int = 10
+ self.sigma_v: float = 8
+ self.sigma_v_min: float = 2
+ self.is_idx_ud: bool = False
+ self.last_backward_layer: int = 0
+ self.decay_factor_sigma_v: float = 0.95
+ self.init_method: str = "He"
+ self.device: str = "cuda"
+```
diff --git a/modules/modules.md b/modules/modules.md
new file mode 100644
index 0000000..58c514d
--- /dev/null
+++ b/modules/modules.md
@@ -0,0 +1,10 @@
+# Modules
+
+pyTAGI already has some modules implemented. These modules enable users to create a model, load data, and perform different tasks such as regression, classification, and so on. The following modules are available:
+
+- [**Models**](modules/models.md): This module contains the classes that define the models that can be used with pyTAGI.
+- [**Data loader**](modules/data-loader.md): Set of classes that can be used to load data.
+- [**Regression**](modules/regression.md): This module contains the classes that perform regression tasks.
+- [**Classification**](modules/classification.md): This module contains the classes that perform classification tasks.
+- [**Time Series Forecaster**](modules/time_series_forecaster.md): This module contains the classes that perform time series forecasting tasks.
+- [**Autoencoder**](modules/autoencoder.md): This module contains the classes that perform autoencoder tasks.
diff --git a/modules/regression.md b/modules/regression.md
new file mode 100644
index 0000000..fc295a5
--- /dev/null
+++ b/modules/regression.md
@@ -0,0 +1,112 @@
+# Regression class
+
+The `Regression` class is responsible for performing regression using the TAGI algorithm.
+
+
+
+
+
+
+ Github Source code
+
+
+
+
+## Attributes
+
+- `utils`: An instance of the `Utils` class.
+- `num_epochs`: The number of epochs for training.
+- `data_loader`: A dictionary containing the data loader.
+- `net_prop`: An instance of the `NetProp` class representing the network properties.
+- `network`: An instance of the `TagiNetwork` class.
+- `dtype`: The data type (default: `np.float32`).
+- `viz`: An optional instance of the `PredictionViz` class for visualization.
+
+## *constructor* method
+
+> Constructor for the Regression class.
+
+```python
+def __init__(self, num_epochs: int, data_loader: dict, net_prop: NetProp, dtype=np.float32, viz: Union[PredictionViz, None] = None) -> None:
+```
+
+**Parameters**
+- `num_epochs`: An integer representing the number of epochs for training.
+- `data_loader`: A dictionary containing the data loader.
+- `net_prop`: An instance of the `NetProp` class representing the network properties.
+- `dtype`: The data type (default: `np.float32`).
+- `viz`: An optional instance of the `PredictionViz` class for visualization.
+
+## *train* method
+
+```python
+def train(self) -> None:
+ """Train the network using TAGI"""
+```
+
+Trains the network using the TAGI algorithm. It performs the following steps:
+1. Initializes inputs and outputs.
+2. Performs training iterations for each epoch.
+3. Updates the network parameters and hidden states.
+4. Computes the loss (mean squared error) and displays the progress.
+
+> See [TAGI](https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf) paper for more information.
+
+## *predict* method
+
+```python
+def predict(self, std_factor: int = 1) -> None:
+ """Make prediction using TAGI"""
+```
+
+Makes predictions using the TAGI algorithm. It performs the following steps:
+1. Initializes inputs.
+2. Makes predictions using the trained network.
+3. Unnormalizes the predictions.
+4. Computes the mean squared error and log-likelihood of the predictions.
+5. Prints the results.
+
+> See [TAGI](https://www.jmlr.org/papers/volume22/20-1009/20-1009.pdf) paper for more information.
+
+## *compute_derivatives* method
+
+```python
+def compute_derivatives(self, layer: int = 0, truth_derv_file: Union[None, str] = None) -> None:
+ """Compute derivative of a given layer"""
+```
+
+Computes the derivative of a given layer in the network. It performs the following steps:
+1. Initializes inputs.
+2. Computes the derivatives using the trained network.
+3. Unnormalizes the inputs.
+4. Optionally plots the predictions against the ground truth derivatives.
+
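+A minimal sketch (assuming `DervMLP` from the [models page](modules/models.md), which sets `collect_derivative = True`, and a suitable `data_loader` dictionary):
+
+```python
+reg_task = Regression(num_epochs=50,
+                      data_loader=data_loader,
+                      net_prop=DervMLP())
+reg_task.train()
+reg_task.compute_derivatives(layer=0)
+```
+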
+## *init_inputs* method
+
+```python
+def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initialize the covariance matrix for inputs"""
+```
+
+Initializes the covariance matrix for inputs. It returns the initialized covariance matrices.
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+**Returns**
+- A tuple containing the initialized covariance matrices for inputs.
+
+## *init_outputs* method
+
+```python
+def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initialize the covariance matrix for outputs"""
+```
+
+Initializes the covariance matrix for outputs. It returns the initialized covariance matrices.
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+**Returns**
+- A tuple containing the initialized covariance matrices for outputs.
diff --git a/modules/time_series_forecaster.md b/modules/time_series_forecaster.md
new file mode 100644
index 0000000..cba2c6d
--- /dev/null
+++ b/modules/time_series_forecaster.md
@@ -0,0 +1,90 @@
+# TimeSeriesForecaster class
+
+The `TimeSeriesForecaster` class is responsible for time series forecasting using the TAGI algorithm.
+
+
+
+
+
+
+ Github Source code
+
+
+
+## Attributes
+
+- `num_epochs`: An integer representing the number of epochs for training.
+- `data_loader`: A dictionary containing the data loader.
+- `net_prop`: An instance of the `NetProp` class representing the network properties.
+- `network`: An instance of the `TagiNetwork` class.
+- `viz`: An optional instance of the `PredictionViz` class for visualization.
+- `dtype`: The data type (default: `np.float32`).
+
+## *constructor* method
+
+```python
+def __init__(self, num_epochs: int, data_loader: dict,
+ net_prop: NetProp, param: Union[Param, None] = None,
+ viz: Union[PredictionViz, None] = None, dtype=np.float32) -> None:
+```
+
+**Parameters**
+- `num_epochs`: An integer representing the number of epochs for training.
+- `data_loader`: A dictionary containing the data loader.
+- `net_prop`: An instance of the `NetProp` class representing the network properties.
+- `param`: An optional instance of the `Param` class for setting network parameters (default: `None`).
+- `viz`: An optional instance of the `PredictionViz` class for visualization (default: `None`).
+- `dtype`: The data type (default: `np.float32`).
+
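+A minimal usage sketch (assuming `TimeSeriesLSTM` from the [models page](modules/models.md) and a `data_loader` dictionary built with the `TimeSeriesDataloader` class):
+
+```python
+net_prop = TimeSeriesLSTM(input_seq_len=24, output_seq_len=1, seq_stride=1)
+
+ts_task = TimeSeriesForecaster(num_epochs=50,
+                               data_loader=data_loader,
+                               net_prop=net_prop)
+ts_task.train()
+ts_task.predict()
+```
+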
+## *train* method
+
+```python
+def train(self) -> None:
+ """Train LSTM network"""
+```
+
+Trains the LSTM network using the TAGI algorithm. It performs the following steps:
+1. Initializes inputs and outputs.
+2. Performs training iterations for each epoch.
+3. Updates the hidden states, network parameters, and loss.
+4. Displays the training progress.
+
+## *predict* method
+
+```python
+def predict(self) -> None:
+ """Make prediction for time series using TAGI"""
+```
+
+Makes predictions for time series using the TAGI algorithm. It performs the following steps:
+1. Initializes inputs.
+2. Makes predictions using the trained network.
+3. Unnormalizes the predictions.
+4. Computes the mean squared error, log-likelihood, and prints the results.
+5. If visualization is enabled, plots the predictions.
+
+## *init_inputs* method
+
+```python
+def init_inputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initialize the covariance matrix for inputs"""
+```
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+**Returns**
+- A tuple containing two numpy arrays: `Sx_batch` and `Sx_f_batch`.
+
+## *init_outputs* method
+
+```python
+def init_outputs(self, batch_size: int) -> Tuple[np.ndarray, np.ndarray]:
+ """Initialize the covariance matrix for outputs"""
+```
+
+**Parameters**
+- `batch_size`: An integer representing the batch size.
+
+**Returns**
+- A tuple containing two numpy arrays: `V_batch` and `ud_idx_batch`.
\ No newline at end of file
diff --git a/quick_tutorial.md b/quick_tutorial.md
new file mode 100644
index 0000000..478bd93
--- /dev/null
+++ b/quick_tutorial.md
@@ -0,0 +1,103 @@
+
+
+# 1D toy regression problem
+
+## Introduction
+
+In this tutorial, we will see how to use pytagi to solve a simple regression problem. We will use a 1D toy dataset and a feedforward neural network (FNN) with a simple architecture.
+
+## Define user input and data
+
+In this simple example, we will use a 1D toy dataset. The dataset is composed of 10 training samples and 100 test samples.
+
+```python
+# User-input
+num_inputs = 1
+num_outputs = 1
+num_epochs = 50
+x_train_file = "./data/toy_example/x_train_1D.csv"
+y_train_file = "./data/toy_example/y_train_1D.csv"
+x_test_file = "./data/toy_example/x_test_1D.csv"
+y_test_file = "./data/toy_example/y_test_1D.csv"
+```
+
+## Build Regression Model
+
+We will use an FNN with a simple architecture, relying on the `RegressionMLP` class explained [here](#regression-mlp-class).
+
+```python
+# Model
+net_prop = RegressionMLP() #MLP model configuration
+```
+
+## Data loader
+
+We will use the RegressionDataLoader class explained [here](tutorial_data_loader.md) to load and process the data.
+
+```python
+# Data loader
+reg_data_loader = RegressionDataLoader(num_inputs=num_inputs,
+ num_outputs=num_outputs,
+ batch_size=net_prop.batch_size)
+data_loader = reg_data_loader.process_data(x_train_file=x_train_file,
+ y_train_file=y_train_file,
+ x_test_file=x_test_file,
+ y_test_file=y_test_file)
+```
+
+## Train and test the model
+
+Using the [regression class](https://github.com/lhnguyen102/cuTAGI/blob/main/python_examples/regression.py), which relies on TAGI, we will train the model using analytical inference and then test it.
+
+```python
+# Optional: Visualize the test using visualizer.py
+viz = PredictionViz(task_name="regression", data_name="toy1D")
+
+# Train and test
+reg_task = Regression(num_epochs=num_epochs,
+                      data_loader=data_loader,
+                      net_prop=net_prop,
+                      viz=viz)
+reg_task.train()                # Train by inferring parameter values
+reg_task.predict(std_factor=3)  # Plot the 3-sigma confidence region
+```
+
+*The `PredictionViz` class is defined [here](https://github.com/lhnguyen102/cuTAGI/blob/main/visualizer.py).*
+
+## Results
+
+The results are shown in the following figure. The black line is the true function, the red line is the predicted function, and the red zone is the confidence region.
+
+
+
+
+
+## Regression MLP class
+
+The model has one input layer, one hidden layer, and one output layer. The input layer has a single variable, the hidden layer has 50 hidden units, and the output layer has one variable. The activation function of the hidden layer is ReLU and the batch size is four. The observation noise's standard deviation and its minimum are both set to 0.06. When one wishes to use a scheduler to decrease `sigma_v` over epochs, `sigma_v_min` should be chosen to be smaller than `sigma_v` (note: this is commonly the case for CNNs).
+
+```python
+# Model
+from pytagi import NetProp
+
+class RegressionMLP(NetProp):
+ """Multi-layer perceptron for regression task"""
+
+ def __init__(self) -> None:
+ super().__init__()
+ self.layers = [1, 1, 1] # [input layer, hidden layer, output layer]
+ self.nodes = [1, 50, 1] # [#inputs, #hidden units, #outputs ]
+ self.activations = [0, 4, 0] # [~, ReLU activation, ~ ]
+ self.batch_size = 4 # Number of observation per batch
+ self.sigma_v = 0.06 # Observation error's standard deviation
+ self.sigma_v_min: float = 0.06 # Min. observation error's std for the scheduler
+ self.device = "cpu" # CPU computations
+```
diff --git a/team.md b/team.md
new file mode 100644
index 0000000..24ee1df
--- /dev/null
+++ b/team.md
@@ -0,0 +1,9 @@
+# Who are we?
+The py/cuTAGI library has been developed by
+- Luong-Ha Nguyen (Main developer & administrator)
+- Miquel Florensa (Unit tests & examples)
+- James-A. Goulet (Methodological development)
+
+With methodological support from
+- Bhargob Deka (Uncertainty quantification with TAGI-V)
+- Van-Dai Vuong (LSTM architecture)
\ No newline at end of file