This repository contains the code to reproduce the results reported in the paper A Solver-Free Framework for Scalable Learning in Neural ILP Architectures, accepted at NeurIPS 2022. We also provide the core components of our technique as a lightweight Python package.
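If you only want the package rather than the full experimental setup, it may be installable directly from PyPI; the distribution name below is an assumption, so check the repository for the actual name:

pip install ilploss

To reproduce the experiments instead, clone the repository and set up the environment: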
git clone https://github.com/dair-iitd/ilploss
cd ilploss
conda env create -f env_export.yaml
conda activate ilploss
Download and unzip the data from here into a directory named data/.
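The exact archive name depends on the download; assuming it is a single zip file (data.zip below is hypothetical), the setup might look like:

mkdir -p data
unzip data.zip -d data/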
We recommend mamba, which drastically speeds up conda environment creation.
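Assuming conda is already installed, a common way to use it is to install mamba into the base environment and swap it in for the environment-creation step above:

conda install -n base -c conda-forge mamba
mamba env create -f env_export.yaml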
The installation has been tested on Linux.
./trainer.py --config <path-to-config>
All our experiments are available as config files in the conf/ directory. For example, to train and test ILP-Loss on random constraints for the binary domain with 8 ground-truth constraints and dataset seed 0, run:
./trainer.py --config conf/random_constraints/binary_random/ilploss/8x16/0.yaml
@inproceedings{ilploss,
author = {Nandwani, Yatin and Ranjan, Rishabh and Mausam and Singla, Parag},
title = {A Solver-Free Framework for Scalable Learning in Neural ILP Architectures},
booktitle = {Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, November 29-December 1, 2022},
year = {2022},
}