eo-learn makes extraction of valuable information from satellite imagery easy.
The availability of open Earth observation (EO) data through the Copernicus and Landsat programs represents an unprecedented resource for many EO applications, ranging from ocean, land-use and land-cover monitoring to disaster control, emergency services and humanitarian relief. Given the large amount of high-spatial-resolution data at high revisit frequency, techniques that can automatically extract complex patterns from such spatio-temporal data are needed.
eo-learn is a collection of open source Python packages that have been developed to seamlessly access and process spatio-temporal image sequences acquired by any satellite fleet in a timely and automatic manner. eo-learn is easy to use, its design is modular, and it encourages collaboration: sharing and reusing the specific tasks that make up a typical EO value-extraction workflow, such as cloud masking, image co-registration, feature extraction and classification. Everyone is free to use any of the available tasks and is encouraged to improve them, develop new ones and share them with the rest of the community.
eo-learn makes extracting valuable information from satellite imagery as easy as defining a sequence of operations to be performed on that imagery. The image below illustrates a processing chain that maps water in satellite imagery by thresholding the Normalised Difference Water Index in a user-specified region of interest.
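As a minimal sketch of that thresholding step in plain NumPy (this is not the actual eo-learn task chain; the band values and the zero threshold are illustrative assumptions):

```python
import numpy as np

def ndwi_water_mask(green: np.ndarray, nir: np.ndarray, threshold: float = 0.0) -> np.ndarray:
    """Compute NDWI = (green - nir) / (green + nir) and threshold it into a boolean water mask."""
    ndwi = (green - nir) / (green + nir + 1e-10)  # small epsilon avoids division by zero
    return ndwi > threshold

# Toy 2x2 reflectance patch: high green + low NIR is characteristic of water
green = np.array([[0.30, 0.05], [0.30, 0.05]])
nir = np.array([[0.05, 0.30], [0.05, 0.30]])
print(ndwi_water_mask(green, nir))  # water where NDWI > 0
```

In eo-learn, each step of such a chain (loading data, computing the index, thresholding) is wrapped in its own reusable task.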
The eo-learn library acts as a bridge between the Earth observation/remote sensing field and the Python ecosystem for data science and machine learning. The library is written in Python and uses NumPy arrays to store and handle remote sensing data. Its aim is, on the one hand, to lower the entry barrier to remote sensing for non-experts and, on the other, to bring the state-of-the-art computer vision, machine learning and deep learning tools of the Python ecosystem to remote sensing experts.
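For instance, a time series of multispectral images fits naturally into a single NumPy array. The time × height × width × channels layout below mirrors the convention eo-learn uses for temporal raster data; the sizes are made up:

```python
import numpy as np

# 5 acquisition dates, a 100x100-pixel patch, 4 spectral bands
timestamps, height, width, bands = 5, 100, 100, 4
data = np.zeros((timestamps, height, width, bands), dtype=np.float32)

# Standard NumPy slicing then gives, e.g., the full time series of band 0
# for a single pixel, or one complete image for a single date:
pixel_series = data[:, 50, 50, 0]   # shape (5,)
single_image = data[2]              # shape (100, 100, 4)
```

Because the data is an ordinary array, any NumPy-based machine learning tool can consume it directly.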
eo-learn is divided into several subpackages according to their functionality and external dependencies, so users need not install the entire package, only the parts they need. At the moment the following subpackages are available:
- eo-learn-core - The main subpackage, which implements the basic building blocks (EOPatch, EOTask and EOWorkflow) and commonly used functionalities.
- eo-learn-coregistration - The subpackage that deals with image co-registration.
- eo-learn-features - A collection of utilities for extracting data properties and manipulating features.
- eo-learn-geometry - Geometry subpackage used for geometric transformations and conversion between vector and raster data.
- eo-learn-io - Input/output subpackage that deals with obtaining data from Sentinel Hub services or saving and loading data locally.
- eo-learn-mask - The subpackage used for masking data and calculating cloud masks.
- eo-learn-ml-tools - Various tools that can be used before or after the machine learning process.
- eo-learn-visualization - Visualization tools for the core elements of eo-learn.
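The EOPatch/EOTask/EOWorkflow pattern can be sketched with illustrative stand-ins. These are not the real eo-learn-core classes, which are richer (typed features, dependency graphs, I/O); the class and feature names here are made up for the example:

```python
import numpy as np

class Patch:
    """Minimal EOPatch-like container: named arrays covering the same area."""
    def __init__(self):
        self.data = {}

class AddNdviTask:
    """Task that derives a new feature (NDVI) from existing bands."""
    def execute(self, patch):
        nir, red = patch.data["NIR"], patch.data["RED"]
        patch.data["NDVI"] = (nir - red) / (nir + red + 1e-10)
        return patch

class ThresholdTask:
    """Task that turns a continuous feature into a boolean mask."""
    def __init__(self, feature, threshold=0.0):
        self.feature, self.threshold = feature, threshold
    def execute(self, patch):
        patch.data[self.feature + "_MASK"] = patch.data[self.feature] > self.threshold
        return patch

def run_workflow(patch, tasks):
    """Apply tasks in sequence; each consumes and returns the patch."""
    for task in tasks:
        patch = task.execute(patch)
    return patch

patch = Patch()
patch.data["NIR"] = np.array([[0.5, 0.1]])
patch.data["RED"] = np.array([[0.1, 0.5]])
patch = run_workflow(patch, [AddNdviTask(), ThresholdTask("NDVI")])
print(patch.data["NDVI_MASK"])  # vegetation where NDVI > 0
```

Because every task exposes the same execute interface and reads from and writes to a shared patch, tasks can be shared, recombined and reused across workflows, which is exactly the collaboration model eo-learn encourages.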
The package requires Python version >= 3.6. It can be installed with:
pip install eo-learn
In order to avoid heavy package dependencies it is possible to install each subpackage separately:
pip install eo-learn-core
pip install eo-learn-coregistration
pip install eo-learn-features
pip install eo-learn-geometry
pip install eo-learn-io
pip install eo-learn-mask
pip install eo-learn-ml-tools
pip install eo-learn-visualization
Before installing eo-learn on Windows, it is recommended to first install the following packages from the Unofficial Windows wheels repository:
gdal
rasterio
shapely
fiona
cartopy (required by eo-learn-visualization[FULL])
One of the dependencies of the eo-learn-mask subpackage is the lightgbm package. On Windows it requires a 64-bit Python distribution. If you run into problems during installation, please check the LightGBM installation guide.
Part of the eo-learn-visualization subpackage requires additional dependencies which are not installed by default. These can be installed with:
pip install eo-learn-visualization[FULL]
Thanks to the maintainers of the conda-forge feedstock (@benhuff, @dcunn, @mwilson8, @oblute, @rluria14), eo-learn can be installed from the conda-forge channel as follows:
conda config --add channels conda-forge
conda install eo-learn
In order to avoid heavy package dependencies it is possible to install each subpackage separately:
conda install eo-learn-core
conda install eo-learn-coregistration
conda install eo-learn-features
conda install eo-learn-geometry
conda install eo-learn-io
conda install eo-learn-mask
conda install eo-learn-ml-tools
conda install eo-learn-visualization
For more information on the package content, visit readthedocs.
If you would like to contribute to eo-learn, check out our contribution guidelines.
- Introducing eo-learn (by Devis Peressutti)
- Land Cover Classification with eo-learn: Part 1 - Mastering Satellite Image Data in an Open-Source Python Environment (by Matic Lubej)
- Land Cover Classification with eo-learn: Part 2 - Going from Data to Predictions in the Comfort of Your Laptop (by Matic Lubej)
- Land Cover Classification with eo-learn: Part 3 - Pushing Beyond the Point of “Good Enough” (by Matic Lubej)
- Innovations in satellite measurements for development
- Use eo-learn with AWS SageMaker (by Drew Bollinger)
- Spatio-Temporal Deep Learning: An Application to Land Cover Classification (by Anze Zupanc)
- Tree Cover Prediction with Deep Learning (by Daniel Moraite)
- NoRSC19 Workshop on eo-learn
- Tracking a rapidly changing planet (by Development Seed)
- Land Cover Monitoring System (by Jovan Visnjic and Matej Aleksandrov)
- eo-learn Webinar (by Anze Zupanc)
- Cloud Masks at Your Service
Feel free to ask questions about the package and its use cases on the Sentinel Hub forum, or raise an issue on GitHub.
You are welcome to send your feedback to the package authors, the EO Research team, through any of the Sentinel Hub communication channels.
See LICENSE.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 776115.