
# UIMVDR

This is the repository for UIMVDR (unsupervised improved minimum variance distortionless response).

The article was accepted at INTERSPEECH 2024. Link to arXiv

## Abstract

Neural networks have recently become the dominant approach to sound separation. Their good performance relies on large datasets of isolated recordings. For speech and music, isolated single channel data are readily available; however, the same does not hold in the multi-channel case, and with most other sound classes. Multi-channel methods have the potential to outperform single channel approaches as they can exploit both spatial and spectral features, but the lack of training data remains a challenge. We propose unsupervised improved minimum variance distortionless response (UIMVDR), which enables multi-channel separation to leverage in-the-wild single-channel data through unsupervised training and beamforming. Results show that UIMVDR generalizes well and improves separation performance compared to supervised models, particularly in cases with limited supervised data. By using data available online, it also reduces the effort required to gather data for multi-channel approaches.
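As background for the beamforming step mentioned in the abstract, the classic MVDR beamformer computes, for each frequency bin, the weights that minimize output power subject to a distortionless constraint toward the target: w = R⁻¹d / (dᴴR⁻¹d). The sketch below is a minimal, generic illustration of that formula, not the UIMVDR implementation from this repository; the function name and shapes are assumptions for the example.

```python
import numpy as np

def mvdr_weights(R, d):
    """MVDR beamforming weights for a single frequency bin.

    R: (M, M) complex Hermitian spatial covariance matrix of
       noise/interference, for M microphones.
    d: (M,) complex steering vector toward the target source.
    Returns the (M,) weight vector w = R^-1 d / (d^H R^-1 d),
    which satisfies the distortionless constraint w^H d = 1.
    """
    # Solve R x = d instead of forming R^-1 explicitly (more stable).
    Rinv_d = np.linalg.solve(R, d)
    return Rinv_d / (d.conj() @ Rinv_d)
```

Applying `w.conj()` to the multi-channel spectrogram at that bin then yields the beamformed output; UIMVDR itself estimates the required statistics from the network's separated signals (see the paper for details).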

## Get the code

```bash
# Clone the repository
git clone https://github.com/introlab/uimvdr.git
# Go to the root folder
cd uimvdr
# Install the dependencies
pip install -r requirements.txt
```

## Get the Multi-Channel Free Sound Test Dataset (MCFSTD)

Zenodo link

## Pretrained models

Get the pretrained models on Google Drive

## Improvements

If you have comments or suggestions to improve the project, please open an issue.

## Authors

- Jacob Kealey (@JacobKealey)
- John Hershey
- François Grondin (@FrancoisGrondin)

## Licence

## Acknowledgments

Thanks to Jusper Lee for his PyTorch implementation of Conv-TasNet: https://github.com/JusperLee/Conv-TasNet

The work done here was supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) and by the Fonds de Recherche du Québec en Nature et Technologies (FRQNT).

## IntRoLab

IntRoLab - Laboratoire de robotique intelligente / interactive / intégrée / interdisciplinaire (Intelligent / Interactive / Integrated / Interdisciplinary Robot Lab) @ Université de Sherbrooke