Deep learning has enabled some of the most disruptive technological breakthroughs of recent years. Image and speech recognition, autonomous driving, virtual assistants, and language translation are only a few areas exemplifying the deep learning revolution. Distributed deep learning has accelerated this revolution by reducing training and inference times for sophisticated deep neural networks through the use of high-performance computing resources. From neuroscience to cosmology, many fields are already exploring different neural network architectures, coupled with data generated by simulations or observations, to solve challenging science problems.
Speaker: Murat Keceli, Argonne Leadership Computing Facility
Quantum chemistry lies at the interface of physics and chemistry, where the goal is to describe chemical properties using the principles of quantum mechanics. The predictions enabled by quantum chemistry are invaluable for understanding and controlling chemical reactions and for discovering novel materials and drugs. However, accurate predictions come at a computational cost that increases sharply with system size. Neural network potentials (NNPs) provide an alternative path: performing quantum chemistry with deep learning methods trained on data generated by ab initio simulations. Recently, we have seen successful applications of NNPs that reach chemical accuracy for relative energies both in compositional space (a set of different molecules) and in configurational space (a set of different configurations of the same molecule). In this talk, I'll review the developments in this area with a particular focus on the SchNet and ANI NNPs.