Gradient Estimation for Exactly-k Constraints

Abstract

The exactly-k constraint is ubiquitous in machine learning and scientific applications, such as ensuring that the sum of electric charges in a neutral atom is zero. However, enforcing such constraints in machine learning models while allowing differentiable learning is challenging. In this work, we aim to provide a “cookbook” for seamlessly incorporating exactly-k constraints into machine learning models by extending a recent gradient estimator from Bernoulli variables to Gaussian and Poisson variables, utilizing constraint probabilities. We show the effectiveness of our proposed gradient estimators in synthetic experiments, and further demonstrate the practical utility of our approach by training neural networks to predict partial charges for metal-organic frameworks, aiding virtual screening in chemistry. Our proposed method not only enhances the capability of learning models but also expands their applicability to a wider range of scientific domains where satisfaction of constraints is crucial.
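The key ingredient named in the abstract, differentiating through constraint probabilities, can be made concrete with a small sketch. The following is a minimal illustration under simplifying assumptions, not the paper's implementation: for independent Bernoulli variables with success probabilities p_i, the exactly-k constraint probability is the Poisson-binomial probability P(Σ_i X_i = k), which a standard dynamic program computes exactly; because the recursion is differentiable, automatic differentiation yields gradients of the constraint probability with respect to the p_i. The function name `exactly_k_prob` is hypothetical.

```python
import torch

def exactly_k_prob(p: torch.Tensor, k: int) -> torch.Tensor:
    """P(sum_i X_i = k) for independent X_i ~ Bernoulli(p_i),
    via the Poisson-binomial dynamic program. Differentiable in p."""
    # dp[j] = P(exactly j of the variables processed so far equal 1)
    dp = torch.zeros(k + 1, dtype=p.dtype)
    dp[0] = 1.0
    for pi in p:
        # Prepend a zero so dp[:-1] aligns with "one more success".
        shifted = torch.cat([torch.zeros(1, dtype=p.dtype), dp[:-1]])
        dp = (1 - pi) * dp + pi * shifted
    return dp[k]

# Usage: train logits so that exactly k=3 of 10 Bernoullis fire,
# by maximizing the (log) constraint probability.
logits = torch.randn(10, requires_grad=True)
p = torch.sigmoid(logits)
loss = -torch.log(exactly_k_prob(p, k=3))
loss.backward()
print(logits.grad)
```

For Gaussian variables, the analogous constraint probability is available in closed form, since a sum of independent Gaussians is itself Gaussian, which is one reason extending such estimators beyond Bernoulli variables is natural.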

Publication
In Proceedings of the NeurIPS Workshop on AI for Scientific Discovery: From Theory to Practice, 2023
Zhe Zeng
Assistant Professor

I do research in probabilistic ML and neurosymbolic AI to enable and support real-world decision-making in the presence of probabilistic uncertainty and symbolic knowledge, where that knowledge may take the form of graph structures or logical, arithmetic, and physical constraints.