Abstract
Machine learning advances chemistry and materials science by enabling large-scale exploration of chemical space based on quantum chemical calculations. While these models supply fast and accurate predictions of atomistic chemical properties, they do not explicitly capture the electronic degrees of freedom of a molecule, which limits their applicability for reactive chemistry and chemical analysis. Here we present a deep learning framework for the prediction of the quantum mechanical wavefunction in a local basis of atomic orbitals from which all other ground-state properties can be derived. This approach retains full access to the electronic structure via the wavefunction at force-field-like efficiency and captures quantum mechanics in an analytically differentiable representation. On several examples, we demonstrate that this opens promising avenues to perform inverse design of molecular structures for targeted electronic property optimisation and a clear path towards increased synergy of machine learning and quantum chemistry.
Introduction
Machine learning (ML) methods reach ever deeper into quantum chemistry and materials simulation, delivering predictive models of interatomic potential energy surfaces^{1,2,3,4,5,6}, molecular forces^{7,8}, electron densities^{9}, density functionals^{10}, and molecular response properties such as polarisabilities^{11} and infrared spectra^{12}. Large data sets of molecular properties calculated from quantum chemistry or measured from experiment are likewise being used to construct predictive models to explore the vast chemical compound space^{13,14,15,16,17}, to find new sustainable catalyst materials^{18}, and to design new synthetic pathways^{19}. Recent research has explored the potential role of machine learning in constructing approximate quantum chemical methods^{20}, as well as predicting MP2 and coupled cluster energies from Hartree–Fock orbitals^{21,22}. There have also been approaches that use neural networks as a basis representation of the wavefunction^{23,24,25}.
Most existing ML models have in common that they learn from quantum chemistry to describe molecular properties as scalar, vector, or tensor fields^{26,27}. Figure 1a shows schematically how quantum chemistry data of different electronic properties, such as energies or dipole moments, is used to construct individual ML models for the respective properties. This allows for the efficient exploration of chemical space with respect to these properties. Yet, these ML models do not explicitly capture the electronic degrees of freedom in molecules that lie at the heart of quantum chemistry. All chemical concepts and physical molecular properties are determined by the electronic Schrödinger equation and derive from the ground-state wavefunction. Thus, an electronic structure ML model that directly predicts the ground-state wavefunction (see Fig. 1b) would not only provide access to all ground-state properties, but could open avenues towards new approximate quantum chemistry methods based on an interface between ML and quantum chemistry. Hegde and Bowen^{28} have explored this idea using kernel ridge regression to predict the band structure and ballistic transmission in a limited study on strained single-species bulk systems with up to four atomic orbitals. Another recent example of this scheme is the prediction of coupled-cluster singles and doubles amplitudes from MP2-derived properties by Townsend and Vogiatzis^{29}.
In this work, we develop a deep learning framework that provides an accurate ML model of molecular electronic structure via a direct representation of the electronic Hamiltonian in a local basis representation. The model provides a seamless interface between quantum mechanics and ML by predicting the eigenvalue spectrum and molecular orbitals (MOs) of the Hamiltonian for organic molecules close to ‘chemical accuracy’ (~0.04 eV). This is achieved by training a flexible ML model to capture the chemical environment of atoms in molecules and of pairs of atoms. Thereby, it provides access to electronic properties that are important for the chemical interpretation of reactions, such as charge populations, bond orders, as well as dipole and quadrupole moments, without the need for specialised ML models for each property. We demonstrate how our model retains the conceptual strength of quantum chemistry by performing an ML-driven molecular dynamics simulation of malondialdehyde showing the evolution of the electronic structure during a proton transfer while reducing the computational cost by 2–3 orders of magnitude. As we obtain a symmetry-adapted and analytically differentiable representation of the electronic structure, we are able to optimise electronic properties, such as the HOMO–LUMO gap, in a step towards inverse design of molecular structures. Beyond that, we show that the electronic structure predicted by our approach may serve as input to further quantum chemical calculations. For example, wavefunction restarts based on this ML model provide a significant speed-up of the self-consistent field procedure (SCF) due to a reduced number of iterations, without loss of accuracy. The latter showcases that quantum chemistry and machine learning can be used in tandem for future electronic structure methods.
Results
Atomic representation of molecular electronic structure
In quantum chemistry, the wavefunction associated with the electronic Hamiltonian \(\hat H\) is typically expressed by antisymmetrised products of single-electron functions or molecular orbitals. These are represented in a local atomic orbital basis of spherical atomic functions \(\left| {\psi _m} \right\rangle = \mathop {\sum}\nolimits_i {c_m^i\left| {\phi _i} \right\rangle }\) with varying angular momentum. As a consequence, one can write the electronic Schrödinger equation in matrix form

$$\mathbf{H}\mathbf{c}_m = \epsilon_m \mathbf{S}\mathbf{c}_m \qquad (1)$$
where the Hamiltonian matrix H may correspond to the Fock or Kohn–Sham matrix, depending on the chosen level of theory^{30}. In both cases, the Hamiltonian and overlap matrices are defined as:

$$H_{ij} = \left\langle {\phi _i} \right|\hat H\left| {\phi _j} \right\rangle \qquad (2)$$

and

$$S_{ij} = \left\langle {\phi _i} \right|\left. {\phi _j} \right\rangle \qquad (3)$$
The eigenvalues \(\epsilon _m\) and electronic wavefunction coefficients \(c_m^i\) contain the same information as H and S, where the electronic eigenvalues are naturally invariant to rigid molecular rotations, translations or permutation of equivalent atoms. Unfortunately, as a function of atomic coordinates and changing molecular configurations, eigenvalues and wavefunction coefficients are not well-behaved or smooth. State degeneracies and electronic level crossings provide a challenge to the direct prediction of eigenvalues and wavefunctions with ML techniques. We address this problem with a deep learning architecture that directly describes the Hamiltonian matrix in a local atomic orbital representation.
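For concreteness, Eq. (1) is a generalised eigenvalue problem that standard linear algebra routines solve directly. The minimal Python sketch below (with illustrative 2×2 matrices, not model output) shows how all orbital energies and coefficients follow from a single call:

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative 2x2 Hamiltonian and overlap matrices in a
# non-orthogonal atomic-orbital basis (toy values, not model output).
H = np.array([[-1.0, -0.4],
              [-0.4, -0.5]])
S = np.array([[1.0, 0.2],
              [0.2, 1.0]])

# Solve the generalised eigenvalue problem H c_m = eps_m S c_m.
eps, C = eigh(H, S)   # eigenvalues ascending; columns of C are MO coefficients

# Coefficients are orthonormal in the overlap metric: C^T S C = I.
assert np.allclose(C.T @ S @ C, np.eye(2))
```

The same call scales to the >100-orbital matrices discussed later; only the matrix dimensions change.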
SchNOrb deep learning framework
SchNOrb (SchNet for Orbitals) presents a framework that captures the electronic structure in a local representation of atomic orbitals that is common in quantum chemistry. Figure 2a gives an overview of the proposed architecture. SchNOrb extends the deep tensor neural network SchNet^{31} to represent electronic wavefunctions. The core idea is to construct symmetry-adapted pairwise features \({\mathbf{\Omega }}_{ij}^l\) to represent the block of the Hamiltonian matrix corresponding to atoms i, j. They are written as a product of rotationally invariant (λ = 0) and covariant (λ > 0) components \({\boldsymbol{\omega }}_{ij}^\lambda\) which ensures that – given a sufficiently large feature space – all rotational symmetries up to angular momentum l can be represented:

$$\mathbf{\Omega}_{ij}^{l} = \mathop{\prod}\limits_{\lambda = 0}^{l} \boldsymbol{\omega}_{ij}^{\lambda}, \qquad \boldsymbol{\omega}_{ij}^{\lambda} = \mathbf{p}_{ij}^{\lambda} \otimes \left( \mathbf{W}^{\lambda} \frac{\mathbf{r}_{ij}}{\left\| \mathbf{r}_{ij} \right\|} \right)$$
Here, \(\mathbf{r}_{ij}\) is the vector pointing from atom i to atom j, \({\mathbf{p}}_{ij}^\lambda \in {\Bbb R}^B\) are rotationally invariant coefficients and \({\mathbf{W}}^\lambda \in {\Bbb R}^{3 \times D}\) are learnable parameters projecting the features along D randomly chosen directions. This allows the different factors of \({\mathbf{\Omega }}_{ij}^l \in {\Bbb R}^{B \cdot D}\) to be rotated relative to each other and further increases the flexibility of the model for D > 3. In the case of λ = 0, the coefficients are independent of the directions due to rotational invariance.
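A rough sketch of how such pair features could be assembled is given below. The shapes, the random stand-ins for the learnable quantities, and the exact factor form are illustrative assumptions rather than the published implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
B, D, l_max = 8, 4, 2          # feature size, directions, max angular momentum (toy values)

r_ij = np.array([0.7, -0.2, 0.5])
r_hat = r_ij / np.linalg.norm(r_ij)

# Learnable direction projections W^lambda (random stand-ins here).
W = [rng.standard_normal((3, D)) for _ in range(l_max + 1)]
# Rotationally invariant coefficients p^lambda from the atomistic network.
p = [rng.standard_normal(B) for _ in range(l_max + 1)]

# lambda = 0 factor is direction-independent; lambda > 0 factors pick up
# covariance through the projection of r_hat onto the D directions.
omega = [np.outer(p[0], np.ones(D))]
for lam in range(1, l_max + 1):
    omega.append(np.outer(p[lam], r_hat @ W[lam]))

# Pair features: elementwise product of all factors, flattened to B*D.
Omega = np.ones((B, D))
for w in omega:
    Omega = Omega * w
Omega = Omega.reshape(-1)
assert Omega.shape == (B * D,)
```

The elementwise product of λ covariant factors is what lets a single feature vector carry angular information up to angular momentum l.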
We obtain the coefficients \({\mathbf{p}}_{ij}^\lambda\) from an atomistic neural network, as shown in Fig. 2a. Starting from atom type embeddings \({\mathbf{x}}_i^0\), rotationally invariant representations of atomistic environments \({\mathbf{x}}_i^T\) are computed by applying T consecutive interaction refinements. These are by construction invariant with respect to rotation, translation and permutations of atoms. This part of the architecture is equivalent to the SchNet model for atomistic predictions (see refs. ^{32,33}). In addition, we construct representations of atom pairs i,j that will enable the prediction of the coefficients \({\mathbf{p}}_{ij}^\lambda\). This is achieved by 2L + 1 SchNOrb interaction blocks, which compute the coefficients \({\mathbf{p}}_{ij}^\lambda\) with a given angular momentum λ with respect to the atomic environment of the respective atom pair ij. This corresponds to adapting the atomic orbital interaction based on the presence and position of atomic orbitals in the vicinity of the atom pair. As shown in Fig. 2b, the coefficient matrix depends on pair interactions mlp_{pair} of atoms i, j as well as environment interactions mlp_{env} of atom pairs (i, m) and (n, j) for neighbouring atoms m, n. These are crucial to enable the model to capture the orientation of the atom pair within the molecule for which pairwise interactions of atomic environments are not sufficient.
The Hamiltonian matrix is obtained by treating on-site and off-site blocks separately. Given a basis of atomic orbitals up to angular momentum L, we require pairwise environments with angular momenta up to 2L to describe all Hamiltonian blocks.
The predicted Hamiltonian is obtained through symmetrisation \({\mathbf{H}} = \frac{1}{2}({\tilde{\mathbf{H}}} + {\tilde{\mathbf{H}}}^{{\intercal}} )\). H_{off} and H_{on} are modelled by neural networks that are described in detail in the Methods section. The overlap matrix S can be obtained in the same manner. Based on this, the orbital energies and coefficients can be calculated according to Eq. (1). The computational cost of the diagonalisation is negligible for the molecules and basis sets we study here (<1 ms). For large basis sets and molecules, when the diagonalisation starts to dominate the computational cost, our method requires only a single diagonalisation instead of one per SCF step. In addition to the Hamiltonian and overlap matrices, we predict the total energy separately as a sum over atom-wise energy contributions, in analogy to the conventional SchNet treatment^{32}, to drive the molecular dynamics simulations.
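The symmetrisation step can be sketched as follows; the raw matrix here is a random stand-in for the block-assembled network output:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ao = 6
H_tilde = rng.standard_normal((n_ao, n_ao))   # raw block-assembled output

# Symmetrise: H = 1/2 (H~ + H~^T) guarantees a Hermitian prediction.
H = 0.5 * (H_tilde + H_tilde.T)
assert np.allclose(H, H.T)

# A single diagonalisation then yields the full eigenvalue spectrum,
# versus one diagonalisation per iteration in a conventional SCF loop.
eps = np.linalg.eigvalsh(H)
```

Enforcing symmetry at the output level, rather than hoping the network learns it, keeps the eigenvalues real by construction.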
Learning electronic structure and derived properties
The proposed SchNOrb architecture allows us to perform predictions of total energies, Hamiltonian and overlap matrices in an end-to-end fashion using a combined regression loss. We train separate neural networks for several data sets of water as well as ethanol, malondialdehyde, and uracil from the MD17 dataset^{7}. The reference calculations were performed with Hartree–Fock (HF) and density functional theory (DFT) with the PBE exchange–correlation functional^{34}. The employed Gaussian atomic orbital bases include angular momenta up to l = 2 (d-orbitals). We augment the training data by adding rotated geometries and correspondingly rotated Hamiltonian and overlap matrices to learn the correct rotational symmetries (see Methods section). Detailed model and training settings for each data set are listed in Supplementary Table 1.
As Supplementary Table 2 shows, the total energies are predicted with mean absolute errors below 2 meV for all molecules. The predictions show mean absolute errors below 8 meV for the Hamiltonian and below 1 × 10^{−4} for the overlap matrices. We examine how these errors propagate to orbital energies and coefficients. Figure 2e shows mean absolute errors for energies of the lowest 20 molecular orbitals for ethanol reference calculations using DFT as well as HF. The errors for the DFT reference data are consistently lower. Beyond that, the occupied orbitals (1–13) are predicted with higher accuracy (<20 meV) than the virtual orbitals (~100 meV). We conjecture that the larger error for virtual orbitals arises from the fact that these are not strictly defined by the underlying data from the HF and Kohn–Sham DFT calculations. Virtual orbitals are only defined up to an arbitrary unitary transformation. Their physical interpretation is limited and, in HF and DFT, they do not enter the description of ground-state properties. For the remaining data sets, the average errors of the occupied orbitals are <10 meV for water and malondialdehyde, as well as 48 meV for uracil. This is shown in detail in Supplementary Fig. 1. The orbital coefficients are predicted with cosine similarities ≥90% (see Supplementary Fig. 2). Figure 2f depicts the predicted and reference orbital energies for the frontier MOs of ethanol (solid and dotted lines, respectively), as well as the orbital shapes derived from the coefficients. Both occupied and unoccupied energy levels are reproduced with high accuracy, including the highest occupied (HOMO) and lowest unoccupied (LUMO) orbitals. This trend is also reflected in the overall shape of the orbitals. Even the slightly higher deviations in the orbital energies observed for the third and fourth unoccupied orbital only result in minor deformations.
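The reported cosine similarities between predicted and reference coefficient vectors could be computed along the following lines. This sketch assumes plain Euclidean inner products (the actual evaluation may account for the overlap metric) and takes the absolute value because eigenvector signs are arbitrary:

```python
import numpy as np

def cosine_similarity(c_pred, c_ref):
    """Cosine similarity between two MO coefficient vectors.

    Eigenvector signs are arbitrary, so the absolute value is compared.
    """
    cos = np.dot(c_pred, c_ref) / (np.linalg.norm(c_pred) * np.linalg.norm(c_ref))
    return abs(cos)

c_ref = np.array([0.6, 0.8, 0.0])
c_pred = np.array([-0.58, -0.81, -0.05])   # predicted up to a global sign flip
sim = cosine_similarity(c_pred, c_ref)
assert sim > 0.9
```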
The learned covariance of molecular orbitals for rotations of a water molecule is shown in Supplementary Fig. 3.
The ML model uses about 93 million parameters to predict a large Hamiltonian matrix with >100 atomic orbitals. This size is comparable to state-of-the-art neural networks for the generation of similarly sized images^{35}. Supplementary Table 6 shows the computational costs of calculating the reference data, training the network and predicting Hamiltonians. While training of SchNOrb took about 80 h, performing the required DFT reference calculations remains the bottleneck for obtaining a trained network, in particular for larger molecules. Our approach to predicting Hamiltonian matrices leads to accelerations of 2–3 orders of magnitude.
As SchNOrb learns the electronic structure of molecular systems, all chemical properties that are defined as quantum mechanical operators on the wavefunctions can be computed from the ML prediction without the need to train a separate model. We investigate this feature by directly calculating electronic dipole and quadrupole moments from the orbital coefficients predicted by SchNOrb, as well as the HF total energies for the ethanol molecule. The corresponding mean absolute errors are reported in Supplementary Tables 4 and 5. The calculation of energies and forces from the coefficients requires the evaluation of the core Hamiltonian (HF) or exchange–correlation terms (DFT), for which we currently resort to the ORCA code. To avoid this computational overhead and obtain highly accurate predictions for molecular dynamics simulations with mean absolute errors below 1 meV, we predict energies and forces directly as a sum of atomic contributions^{31}. Regarding the electrostatic moments, excellent agreement with the electronic structure reference is observed for the majority of molecules (<0.054 D for dipoles and <0.058 D Å for quadrupoles). The only deviation from this trend is observed for uracil, where a loss function minimising only the errors of Hamiltonian and overlap matrices is too limited. The dipole moment depends strongly on the molecular electron density derived from the orbital coefficients, which are never learned directly.
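As an illustration of evaluating an operator on the predicted wavefunction, a dipole moment can be assembled from the one-particle density matrix and atomic-orbital dipole integrals. The function below is a generic sketch; the integrals would in practice come from a quantum chemistry code such as ORCA, and the toy inputs are synthetic:

```python
import numpy as np

def dipole_moment(P, dip_ints, Z, R):
    """Total dipole (atomic units) from a density matrix and AO dipole integrals.

    P        : (n_ao, n_ao) one-particle density matrix
    dip_ints : (3, n_ao, n_ao) integrals <phi_i| r |phi_j>  (hypothetical input)
    Z, R     : nuclear charges (n_at,) and positions (n_at, 3)
    """
    mu_el = -np.einsum('ij,xij->x', P, dip_ints)   # electrons carry charge -1
    mu_nuc = np.einsum('a,ax->x', Z, R)
    return mu_el + mu_nuc

# Synthetic example: two electrons centred on one nucleus at the origin.
P = np.array([[2.0, 0.0],
              [0.0, 0.0]])
dip = np.zeros((3, 2, 2))       # symmetric orbital: vanishing dipole integrals
Z = np.array([2.0])
R = np.zeros((1, 3))
assert np.allclose(dipole_moment(P, dip, Z, R), 0.0)   # neutral, symmetric system
```

Quadrupole moments follow the same pattern with second-moment integrals in place of the dipole integrals.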
Beyond that, we have studied the prediction accuracy for ethanol when using the larger def2-TZVP basis set, which includes f-orbitals (l = 3). While the predictions of the Hamiltonian and overlap matrices remain remarkably accurate, with mean absolute errors of 8.3 meV and 10^{−6}, respectively, the derived properties exhibit large errors, e.g., an MAE of 0.4775 eV for the orbital energies. For large numbers of orbitals, errors in the Hamiltonian can accumulate through the diagonalisation. This problem could be solved by improving the neural network architecture to further reduce the prediction error or by introducing a density-dependent term into the loss function, which will be explored in future investigations.
In this case, a similar accuracy to that of the other data sets could in principle be reached by adding more reference data points. The above results demonstrate the utility of combining a learned Hamiltonian with quantum operators. This makes it possible to access a wide range of chemical properties without the need for explicitly developing specialised neural network architectures.
Chemical insights from electronic deep learning
Recently, much research has focused on explaining predictions of ML models^{36,37,38}, aiming both at the validation of the model^{39,40} and at the extraction of scientific insight^{17,31,41}. However, these methods explain ML predictions either in terms of the input space, atom types and positions in this case, or latent features such as local chemical potentials^{31,42}. In quantum chemistry, by contrast, it is more common to analyse electronic properties in terms of the MOs and properties derived from the electronic wavefunction, which are direct output quantities of the SchNOrb architecture.
Molecular orbitals encode the distribution of electrons in a molecule, thus offering direct insights into its underlying electronic structure. They form the basis for a wealth of chemical bonding analysis schemes, bridging the gap between quantum mechanics and abstract chemical concepts, such as bond orders and atomic partial charges^{30}. These quantities are invaluable tools in understanding and interpreting chemical processes based on molecular reactivity and chemical bonding strength. As SchNOrb yields the MOs, we are able to apply population analysis to our ML predictions. Figure 2d shows Löwdin partial atomic charges and bond orders for the uracil molecule. Löwdin charges provide a chemically intuitive measure for the electron distribution and can, for example, aid in identifying potential nucleophilic or electrophilic reaction sites in a molecule. The negatively charged carbonyl oxygens in uracil, for example, are involved in forming RNA base pairs. The corresponding bond orders provide information on the connectivity and types of bonds between atoms. In the case of uracil, the two double bonds of the carbonyl groups are easily recognisable (bond orders 2.12 and 2.14, respectively). However, it is also possible to identify electron delocalisation effects in the pyrimidine ring, where the carbon double bond donates electron density to its neighbours. A population analysis for malondialdehyde, as well as population prediction errors for all molecules, can be found in Supplementary Fig. 4 and Supplementary Table 3.
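A Löwdin population analysis of the kind shown in Fig. 2d reduces to a few matrix operations on the density and overlap matrices; a minimal sketch, using a synthetic two-orbital, H2-like example, could look as follows:

```python
import numpy as np

def loewdin_charges(P, S, Z, ao_to_atom):
    """Löwdin partial charges q_A = Z_A - sum_{i in A} (S^1/2 P S^1/2)_ii."""
    w, V = np.linalg.eigh(S)                 # S is symmetric positive definite
    S_half = V @ np.diag(np.sqrt(w)) @ V.T   # matrix square root of S
    pop = np.diag(S_half @ P @ S_half)       # per-orbital Löwdin populations
    q = np.array(Z, dtype=float)
    for i, a in enumerate(ao_to_atom):
        q[a] -= pop[i]
    return q

# Synthetic H2-like system: one doubly occupied, symmetric bonding MO.
S = np.array([[1.0, 0.3],
              [0.3, 1.0]])
c = np.array([1.0, 1.0])
c = c / np.sqrt(c @ S @ c)                   # normalise in the AO metric
P = 2.0 * np.outer(c, c)                     # density of the doubly occupied MO
q = loewdin_charges(P, S, np.array([1, 1]), [0, 1])
assert np.allclose(q, 0.0)                   # homonuclear molecule: neutral atoms
```

Bond orders follow analogously from the off-diagonal blocks of the Löwdin-orthogonalised density matrix.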
The SchNOrb architecture enables an accurate prediction of the electronic structure across molecular configuration space, which provides for rich chemical interpretation during molecular reaction dynamics. Figure 3a shows an excerpt of a molecular dynamics simulation of malondialdehyde that was driven by atomic forces predicted using SchNOrb. It depicts the proton transfer together with the relevant MOs and the electronic density. Supplementary Video 1 shows a side-by-side comparison between the predicted and reference HOMO−2 orbital during this excerpt of the trajectory. The density paints an intuitive picture of the reaction as it migrates along with the hydrogen. This exchange of electron density during proton transfer is also reflected in the orbitals. Their dynamical rearrangement indicates an alternation between single and double bonds. The latter effect is hard to recognise based on the density alone and demonstrates the wealth of information encoded in the molecular wavefunctions.
Figure 3b depicts the forces the different MOs exert onto the hydrogen atom exchanged during the proton transfer. All forces are projected onto the reaction coordinate, where positive values correspond to a force driving the proton towards the product state. In the initial configuration I, most forces lead to attraction of the hydrogen atom to the right oxygen. In the intermediate configuration II, orbital rearrangement results in a situation where the majority of orbital force contributions on the hydrogen atom become minimal, representing mostly non-bonding character between oxygens and hydrogen. One exception is MO 13, depicted in the inset of Fig. 3b. Due to a minor deviation from a symmetric O–H–O arrangement, the orbital represents a one-sided O–H bond, exerting forces that promote the reaction. The intrinsic fluctuations during the proton transfer molecular dynamics are captured by the MOs, as can be seen in Fig. 3c. This shows the distribution of orbital energies encountered during the reaction. As would be expected, both HOMO−2 and HOMO−3 (inset, orange and blue, respectively), which strongly participate in the proton transfer, show significantly broadened peaks due to strong energy variations in the dynamics. This example nicely shows the chemically intuitive interpretation that can be obtained by the electronic structure prediction of SchNOrb.
Deep learningenhanced quantum chemistry
An essential paradigm of chemistry is that the molecular structure defines chemical properties. Inverse chemical design turns this paradigm on its head by enabling property-driven chemical structure exploration. The SchNOrb framework constitutes a suitable tool to enable inverse chemical design due to its analytic representation of the electronic structure in terms of the atomic positions. We can therefore obtain analytic derivatives with respect to the atomic positions, which provide the ability to optimise electronic properties. Figure 4a shows the minimisation and maximisation of the HOMO–LUMO gap \(\epsilon _{{\mathrm{gap}}}\) of malondialdehyde as an example. We perform gradient descent and ascent from a randomly selected configuration r_{ref} until convergence at r_{min} and r_{max}, respectively. We are able to identify structures which minimise and maximise the gap from its initial value of 3.15 eV to 2.68 eV at r_{min} and 3.59 eV at r_{max}. While in this proof of concept these changes were predominantly caused by local deformations in the carbon–carbon bonds indicated in Fig. 4a, they present an encouraging prospect of how electronic surrogate models such as SchNOrb can contribute to computational chemical design using more sophisticated optimisation methods, such as alchemical derivatives^{43} or reinforcement learning^{44}.
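The gradient-based gap optimisation can be illustrated with a toy one-dimensional surrogate. Here, finite differences stand in for the analytic derivatives that SchNOrb provides, and the two-level Hamiltonian model is purely illustrative:

```python
import numpy as np

def hamiltonian(x):
    """Toy stand-in for a learned H(r): a single coordinate x tunes the
    coupling between two electronic levels."""
    return np.array([[-1.0, -0.3 * x],
                     [-0.3 * x, -0.2]])

def gap(x):
    eps = np.linalg.eigvalsh(hamiltonian(x))
    return eps[1] - eps[0]      # HOMO-LUMO-like gap of the two-level system

x, h, lr = 0.5, 1e-5, 0.1
for _ in range(50):
    # Central finite difference; SchNOrb would supply this gradient analytically.
    grad = (gap(x + h) - gap(x - h)) / (2 * h)
    x += lr * grad              # gradient ascent maximises the gap
assert gap(x) > gap(0.5)
```

Gradient descent with `x -= lr * grad` would minimise the gap instead; in the full model, x is replaced by all atomic coordinates.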
ML applications for electronic structure methods have usually been one-directional, i.e., ML models are trained to predict the outputs of calculations. On the other hand, models in the spirit of Fig. 1b, such as SchNOrb, offer the prospect of a deeper integration with quantum chemistry methods by substituting parts of the electronic structure calculation. SchNOrb directly predicts wavefunctions based on quantum chemistry data, which, in turn, can serve as input for further quantum chemical calculations. For example, in the context of HF or DFT calculations, the relevant equations are solved via a self-consistent field approach (SCF) that determines a set of MOs. The convergence with respect to SCF iteration steps largely determines the computational speed of an electronic structure calculation and strongly depends on the quality of the initialisation for the wavefunction. The coefficients predicted by SchNOrb can serve as such an initialisation of SCF calculations. To this end, we generated wavefunction files for the ORCA quantum chemistry package^{45} from the predicted SchNOrb coefficients, which were then used to initialise SCF calculations. Figure 4b depicts the SCF convergence for three sets of computations on the uracil molecule: using the standard initialisation techniques of quantum chemistry codes, and the SchNOrb coefficients with or without a second-order solver. Nominally, only small improvements are observed using SchNOrb coefficients in combination with a conventional SCF solver. This is due to the various strategies employed in electronic structure codes in order to provide a numerically robust SCF procedure. By performing SCF calculations with a second-order solver, which would not converge using a less accurate starting point than our SchNOrb MO coefficients, the efficiency of our combined ML and second-order SCF approach becomes apparent.
Convergence is obtained in only a fraction of the original iterations, reducing the number of cycles by ~77%. Similarly, Supplementary Fig. 5 shows a reduction of SCF iterations by ~73% for malondialdehyde. However, since second-order optimisation steps are more costly, it is more time-efficient to perform conventional SOSCF, which reduces the convergence time by 13% and 16% for uracil and malondialdehyde, respectively.
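The SCF initialisation itself amounts to building a guess density from the predicted occupied orbitals. A minimal closed-shell sketch, using an orthonormal toy basis and a hypothetical helper name, is:

```python
import numpy as np

def guess_density(C, n_electrons):
    """Closed-shell SCF guess density from predicted MO coefficients.

    C columns are MOs sorted by orbital energy; occupying the lowest
    n_electrons / 2 orbitals gives P = 2 C_occ C_occ^T.
    (Hypothetical helper; actual codes read this from a wavefunction file.)
    """
    n_occ = n_electrons // 2
    C_occ = C[:, :n_occ]
    return 2.0 * C_occ @ C_occ.T

# Toy example: 2 electrons in a 2-orbital orthonormal basis.
C = np.array([[0.8, -0.6],
              [0.6,  0.8]])
P = guess_density(C, 2)
assert np.isclose(np.trace(P), 2.0)   # density integrates to the electron count
```

The closer this guess is to the converged density, the fewer SCF cycles are needed, which is the effect exploited above.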
It should be noted that this combined approach does not introduce any approximations into the electronic structure method itself and yields exactly the same results as the full computation. Another example of integration of the SchNOrb deep learning framework with quantum chemistry, as shown in Fig. 1b, is the use of predicted wavefunctions and MO energies based on Hartree–Fock as a starting point for post-Hartree–Fock correlation methods such as Møller–Plesset perturbation theory (MP2). Supplementary Table 5 presents the mean absolute error of an MP2 calculation for ethanol based on wavefunctions predicted from SchNOrb. The associated prediction error for the test set is 83 meV. Compared to the overall HF and MP2 energy, the relative error of SchNOrb amounts to 0.01% and 0.06%, respectively. For the MP2 correlation energy, we observe a deviation of 17%, the reason for which is the inclusion of virtual orbitals in the calculation of the MP2 integrals. However, even in this case, the total error only amounts to a deviation of 93 meV.
Discussion
The SchNOrb framework provides an analytical expression for the electronic wavefunctions in a local atomic orbital representation as a function of molecular composition and atom positions. While previous approaches have predicted Hamiltonians of single-species bulk materials in a small basis set for limited geometric deformations^{28}, SchNOrb has been shown to enable accurate predictions of molecular Hamiltonians in a basis of >100 atomic orbitals up to angular momentum l = 2 for a much larger configuration space obtained from molecular dynamics simulations. As a consequence, the model provides access to atomic derivatives of wavefunctions, which include molecular orbital energy derivatives and Hamiltonian derivatives that can be used to approximate nonadiabatic couplings^{46}, as well as higher-order derivatives that describe the electron–nuclear response of the molecule. Thus, the SchNOrb framework preserves the benefits of interatomic potentials while enabling access to the electronic structure as predicted by quantum chemistry methods.
SchNOrb opens up completely new applications to ML-enhanced molecular simulation. This includes the construction of interatomic potentials with electronic properties that can facilitate efficient photochemical simulations during surface hopping trajectory dynamics or Ehrenfest-type mean-field simulations, but also enables the development of new ML-enhanced approaches to inverse molecular design via electronic property optimisation.
The SchNOrb neural network architecture demonstrates that an accurate prediction of electronic structure is feasible. However, with an increasing number of atomic orbitals, the diagonalisation of the Hamiltonian leads to the accumulation of prediction errors and becomes the bottleneck of the prediction. Thus, for larger molecules and basis sets, the accuracy of SchNOrb will have to be further improved. More research on the architecture is also required in order to reduce the required amount of training data and parameters, e.g., by adding more prior knowledge to the model. An important step in this direction is to encode the full rotational symmetries of the basis into the architecture, replacing the current data augmentation scheme. Alternatively, on the basis of the SchNOrb framework, intelligent preprocessing of quantum chemistry data in the form of effective Hamiltonians or optimised minimal basis representations^{47} can be developed in the future. Such preprocessing will also pave the way towards the prediction of the electronic structure based on post-HF correlated wavefunction methods and post-DFT quasiparticle methods.
This work serves as a first proof of principle that direct ML models of electronic structure based on quantum chemistry can be constructed and used to enhance further quantum chemistry calculations. We have presented an immediate consequence of this by reducing the number of DFT SCF iterations with wavefunctions predicted via SchNOrb. The presented model delivers derived electronic properties that can be formulated as quantum mechanical expectation values. This provides an important step towards a full integration of ML and quantum chemistry into the scientific discovery cycle.
Methods
Reference data
Reference configurations were sampled at random from the MD17 dataset^{7} for each molecule. The number of selected configurations per molecule is given in Supplementary Table 1. All reference calculations were carried out with the ORCA quantum chemistry code^{45} using the def2-SVP basis set^{48}. Integration grid levels of 4 and 5 were employed during SCF iterations and the final computation of properties, respectively. Unless stated otherwise, the default ORCA SCF procedure was used, which is based on the Pulay method^{49}. For the remaining cases, the Newton–Raphson procedure implemented in ORCA was employed as a second-order SCF solver. SCF convergence criteria were set to VeryTight. DFT calculations were carried out using the PBE functional^{34}. For ethanol, additional HF computations were performed.
Molecular dynamics simulations for malondialdehyde were carried out with SchNetPack^{50}. The equations of motion were integrated using a time step of 0.5 fs. Simulation temperatures were kept at 300 K with a Langevin thermostat^{51} employing a time constant of 100 fs. Trajectories were propagated for a total of 50 ps, of which the first 10 ps were discarded.
Details on the neural network architecture
In the following, we describe the neural network depicted in Fig. 2 in detail. We use shifted softplus activation functions

$$\mathrm{ssp}(x) = \ln \left( 0.5\,e^{x} + 0.5 \right)$$

throughout the architecture. Linear layers are written as

$$\mathrm{linear}(\mathbf{x}) = \mathbf{W}^{\intercal}\mathbf{x} + \mathbf{b}$$
with input \({\mathbf{x}} \in {\Bbb R}^{n_{{\mathrm{in}}}}\), weights \({\mathbf{W}} \in {\Bbb R}^{n_{{\mathrm{in}}} \times n_{{\mathrm{out}}}}\) and bias \({\mathbf{b}} \in {\Bbb R}^{n_{{\mathrm{out}}}}\). Fully-connected neural networks with one hidden layer are written as

$$\mathrm{mlp}(\mathbf{x}) = \mathbf{W}_{2}^{\intercal}\,\mathrm{ssp}\left( \mathbf{W}_{1}^{\intercal}\mathbf{x} + \mathbf{b}_{1} \right) + \mathbf{b}_{2}$$
with weights \({\mathbf{W}}_1 \in {\Bbb R}^{n_{{\mathrm{in}}} \times n_{{\mathrm{hidden}}}}\) and \({\mathbf{W}}_2 \in {\Bbb R}^{n_{{\mathrm{hidden}}} \times n_{{\mathrm{out}}}}\) and biases b_{1}, b_{2} accordingly. Model parameters are shared within layers across atoms and interactions, but never across layers. We omit layer indices for clarity.
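The activation and layer definitions above translate directly into code; a minimal NumPy sketch (random weights for shape illustration only):

```python
import numpy as np

def ssp(x):
    """Shifted softplus activation: ln(0.5 e^x + 0.5); note ssp(0) = 0."""
    return np.logaddexp(0.0, x) - np.log(2.0)

def linear(x, W, b):
    """Linear layer x W + b (row-vector convention)."""
    return x @ W + b

def mlp(x, W1, b1, W2, b2):
    """Fully-connected network with one hidden ssp layer."""
    return linear(ssp(linear(x, W1, b1)), W2, b2)

# Shape illustration with random weights (n_in = 4, n_hidden = 8, n_out = 2).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 8)), np.zeros(8)
W2, b2 = rng.standard_normal((8, 2)), np.zeros(2)
out = mlp(rng.standard_normal(4), W1, b1, W2, b2)
assert out.shape == (2,)
```

The shift by ln 2 keeps the activation zero-centred at the origin while remaining smooth and monotonic, which helps training stability.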
The representations of atomic environments are constructed with the same neural network structure as in SchNet. In the following, we summarise this first part of the model; for further details, please refer to Schütt et al.^{32}. First, each atom is assigned an initial element-specific embedding

$$\mathbf{x}_{i}^{0} = \mathbf{a}_{Z_i} \in {\Bbb R}^{B}$$
where Z_{i} is the nuclear charge and B is the number of atomwise features. In this work, we use B = 1000 for all models. The representations are refined using SchNet interaction layers (Fig. 2b). The main component is a continuousfilter convolutional layer (cfconv)
which takes a spatial filter \({\mathbf{W}}_{{\mathrm{filter}}}:{\Bbb R} \to {\Bbb R}^B\)

$${\mathbf{W}}_{{\mathrm{filter}}}(r_{ij}) = {\mathrm{mlp}}_{{\mathrm{filter}}}({\mathbf{g}}(r_{ij}))$$

with radial basis functions

$$g_k(r_{ij}) = \exp \left( - \frac{(r_{ij} - k\Delta\mu)^2}{2\Delta\mu^2} \right), \quad 0 \le k\Delta\mu \le r_c,$$

where r_{c} is the cutoff radius and Δμ is the grid spacing of the radial basis function expansion of the interatomic distance r_{ij}. While this adds spatial information to the environment representations for each feature separately, the cross-talk between features is performed atom-wise by fully-connected layers (linear and mlp_{atom} in Fig. 2b) to obtain the refinements \({\mathbf{v}}_i^t\), where t is the current interaction iteration. The refined atom representations are then

$${\mathbf{x}}_i^{t+1} = {\mathbf{x}}_i^t + {\mathbf{v}}_i^t.$$
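Assuming a Gaussian radial basis expansion of the interatomic distance and an element-wise filter product as described above, the cfconv step can be sketched in NumPy as follows (the filter-generating network is replaced by a random linear stand-in):

```python
import numpy as np

def rbf(r_ij, r_c=5.0, d_mu=0.1):
    # Gaussian radial basis expansion of r_ij on a grid [0, r_c) with spacing d_mu
    mu = np.arange(0.0, r_c, d_mu)
    return np.exp(-((r_ij - mu) ** 2) / (2 * d_mu ** 2))

def cfconv(X, R, W_filter):
    # X: (n_atoms, B) atom features; R: (n_atoms, 3) positions
    # W_filter maps the radial basis expansion to a B-dimensional filter
    n_atoms, B = X.shape
    X_new = np.zeros_like(X)
    for i in range(n_atoms):
        for j in range(n_atoms):
            if i == j:
                continue
            r_ij = np.linalg.norm(R[i] - R[j])
            # element-wise (Hadamard) product of neighbour features and filter
            X_new[i] += X[j] * W_filter(rbf(r_ij))
    return X_new

rng = np.random.default_rng(1)
B = 6
n_rbf = len(np.arange(0.0, 5.0, 0.1))
W = rng.normal(size=(n_rbf, B))
filt = lambda g: g @ W            # stand-in for mlp_filter
X = rng.normal(size=(3, B))
R = rng.normal(size=(3, 3))
out = cfconv(X, R, filt)          # refined features, shape (3, B)
```

In the actual model, the output of this convolution is further processed atom-wise before being added as a refinement.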
These representations of atomic environments are employed by SchNet to predict chemical properties via atom-wise contributions. In order to extend this scheme to the prediction of the Hamiltonian, however, we need to construct representations of pairwise environments in a second interaction phase.
The Hamiltonian matrix is of the form

$${\mathbf{H}} = \left( \begin{array}{ccc} {\mathbf{H}}_{11} & \cdots & {\mathbf{H}}_{1n} \\ \vdots & \ddots & \vdots \\ {\mathbf{H}}_{n1} & \cdots & {\mathbf{H}}_{nn} \end{array} \right),$$

where a matrix block \({\mathbf{H}}_{ij} \in {\Bbb R}^{n_{{\mathrm{ao}},i} \times n_{{\mathrm{ao}},j}}\) depends on the atoms i, j within their chemical environment, as well as on the choice of the n_{ao,i} and n_{ao,j} atomic orbitals, respectively. SchNOrb therefore builds representations of these embedded atom pairs based on the previously constructed representations of atomic environments. This is achieved through the SchNOrb interaction module (see Fig. 2c).
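As a concrete illustration of this block structure, the sketch below assembles a full matrix from per-pair blocks, where atom i contributes n_ao[i] orbitals; the orbital counts are arbitrary example values:

```python
import numpy as np

def assemble(blocks, n_ao):
    # blocks[(i, j)] is the (n_ao[i], n_ao[j]) block H_ij of the full matrix
    offsets = np.concatenate(([0], np.cumsum(n_ao)))
    n = offsets[-1]
    H = np.zeros((n, n))
    for (i, j), Hij in blocks.items():
        H[offsets[i]:offsets[i + 1], offsets[j]:offsets[j + 1]] = Hij
    return H

n_ao = [5, 2, 2]          # e.g. one heavy atom and two hydrogens (example values)
rng = np.random.default_rng(2)
blocks = {(i, j): rng.normal(size=(n_ao[i], n_ao[j]))
          for i in range(3) for j in range(3)}
H = assemble(blocks, n_ao)   # full (9, 9) matrix
```

SchNOrb predicts exactly these blocks, so the full matrix dimension grows with the number of atoms and the basis set size.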
First, a raw representation of atom pairs is obtained using a factorised tensor layer^{31,52}:

$${\mathbf{p}}_{ij} = {\mathrm{linear}}_1({\mathbf{x}}_i) \circ {\mathrm{linear}}_1({\mathbf{x}}_j) \circ {\mathbf{W}}_{{\mathrm{filter}}}(r_{ij}).$$

The layers \({\mathrm{linear}}_1:{\Bbb R}^B \mapsto {\Bbb R}^B\) map the atom representations to the factors, while the filter-generating network \({\mathbf{W}}_{{\mathrm{filter}}}:{\Bbb R} \to {\Bbb R}^B\) is defined analogously to Eq. (12) and directly maps to the factor space. In analogy to how the SchNet interactions are used to build atomic environments, SchNOrb interactions are applied several times, where each instance further refines the atomic environments of the atom pairs with additive corrections
as well as constructing pairwise features:
where mlp_{pair} models the direct interactions of atoms i, j while mlp_{env} models the interactions of the pair with neighbouring atoms. As described above, the atom pair coefficients are used to form a basis set
where λ corresponds to the angular momentum channel and \({\mathbf{W}} \in {\Bbb R}^{3 \times D}\) are learnable parameters that project along D directions. For all results in this work, we used D = 4. For interactions between s-orbitals, we consider the special case λ = 0, where the features along all directions are equal due to rotational invariance. At this point, \({\boldsymbol{\omega }}_{ij}^0\) is rotationally invariant and \({\boldsymbol{\omega }}_{ij}^{\lambda > 0}\) is covariant. On this basis, we obtain features with higher angular momenta using:
where the features \({\mathbf{\Omega }}_{ij}^l\) possess angular momentum l. These SchNOrb representations of atom pairs embedded in their chemical environment, built on top of the SchNet atom-wise features, serve in the next step to predict the corresponding blocks of the ground-state Hamiltonian.
Finally, we assemble the Hamiltonian and overlap matrices to be predicted. Each atom pair block is predicted from the corresponding features \({\mathbf{\Omega }}_{ij}^l\):
where we restrict the network to linear layers in order to conserve the angular momenta:
with \({\mathrm{linear}}_{{\mathrm{off}}},{\mathrm{linear}}_{{\mathrm{on}}}:{\Bbb R}^{(2L + 1) \times D} \to {\Bbb R}^{n_{{\mathrm{ao,max}}} \times n_{{\mathrm{ao,max}}}}\), i.e., mapping to the maximal number of atomic orbitals in the data. A mask is then applied to each matrix block to retain only the valid part \({\tilde{\mathbf{H}}}_{ij} \in {\Bbb R}^{n_{{\mathrm{ao}},i} \times n_{{\mathrm{ao}},j}}\). Finally, we symmetrise the predicted Hamiltonian:

$${\mathbf{H}} = \frac{1}{2}\left( {\tilde{\mathbf{H}}} + {\tilde{\mathbf{H}}}^\top \right).$$
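The masking and symmetrisation steps can be sketched as follows; the raw prediction is a random stand-in for the network output, and the maximal block size is an example value:

```python
import numpy as np

def predict_block(raw, n_i, n_j):
    # raw: (n_max, n_max) network output for one atom pair;
    # mask it down to the (n_i, n_j) part that is valid for this pair
    return raw[:n_i, :n_j]

def symmetrise(H_tilde):
    # enforce H = H^T on the assembled raw prediction
    return 0.5 * (H_tilde + H_tilde.T)

rng = np.random.default_rng(3)
raw_block = rng.normal(size=(14, 14))     # n_max = 14 as an example
H_01 = predict_block(raw_block, 5, 2)     # block between a 5- and a 2-orbital atom
H_tilde = rng.normal(size=(9, 9))         # stand-in for the assembled prediction
H = symmetrise(H_tilde)                   # exactly symmetric by construction
```

Symmetrising guarantees real eigenvalues for the predicted Hamiltonian regardless of network errors.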
The overlap matrix is obtained similarly with blocks
The prediction of the total energy is obtained analogously to SchNet as a sum over atom-wise energy contributions:

$$E = \mathop{\sum}\limits_{i} E_i.$$
Data augmentation
While SchNOrb constructs features \({\mathbf{\Omega}}_{ij}^{l}\) and \({\mathbf{\Omega }}_{ii}^{l}\) with angular momenta such that the Hamiltonian matrix can be represented as a linear combination of them, it does not encode the full rotational symmetry a priori. However, SchNOrb can learn this symmetry, provided the training data contains sufficiently many rotations of each molecule. To reduce the number of required reference calculations, we randomly rotate configurations before each training epoch using Wigner \(\cal{D}\) rotation matrices^{53}. Given a randomly sampled rotation R, the applied transformations are

$${\mathbf{r}}_i \to {\mathbf{R}}\,{\mathbf{r}}_i, \quad {\mathbf{F}}_i \to {\mathbf{R}}\,{\mathbf{F}}_i, \quad {\mathbf{H}} \to {\mathbf{D}}\,{\mathbf{H}}\,{\mathbf{D}}^\top, \quad {\mathbf{S}} \to {\mathbf{D}}\,{\mathbf{S}}\,{\mathbf{D}}^\top,$$

for atom positions r_{i}, atomic forces F_{i}, Hamiltonian matrix H and overlap matrix S, where D denotes the block-diagonal Wigner \(\cal{D}\) matrix corresponding to R in the atomic orbital basis.
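For the special case of a single p-orbital block, the Wigner D matrix reduces to the Cartesian rotation matrix itself, which allows a compact sketch of the augmentation; general blocks require proper Wigner D matrices, which are omitted here:

```python
import numpy as np

def random_rotation(rng):
    # random orthogonal matrix via QR decomposition, forced to det = +1
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return Q * np.linalg.det(Q)

rng = np.random.default_rng(4)
R = random_rotation(rng)

r = rng.normal(size=(5, 3))                       # atom positions
F = rng.normal(size=(5, 3))                       # atomic forces
H = rng.normal(size=(3, 3)); H = 0.5 * (H + H.T)  # a single p-orbital block

r_rot = r @ R.T          # r_i -> R r_i
F_rot = F @ R.T          # F_i -> R F_i
H_rot = R @ H @ R.T      # H -> D H D^T, with D = R for a p-orbital block
```

The similarity transform leaves the eigenvalue spectrum of the block unchanged, as required for a physically consistent augmentation.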
Neural network training
For training, we used a combined loss on energies E, atomic forces F, Hamiltonian matrices H and overlap matrices S simultaneously, where the variables marked with a tilde refer to the corresponding predictions and ρ determines the trade-off between total energies and forces. The neural networks were trained with stochastic gradient descent using the ADAM optimiser^{54}. We reduced the learning rate by a decay factor of 0.8 after t_{patience} epochs without improvement of the validation loss. Training is stopped at lr ≤ 5 × 10^{−6}. The mini-batch sizes, patience and data set sizes are listed in Supplementary Table 1. Afterwards, the model with the lowest validation error is selected for testing.
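The learning-rate schedule described above can be sketched as plateau-based decay; the patience value below is a placeholder (the per-model values are listed in Supplementary Table 1):

```python
def train_schedule(val_losses, lr=1e-3, decay=0.8, patience=3, lr_min=5e-6):
    # decay lr by `decay` after `patience` epochs without validation
    # improvement; stop once lr falls to or below lr_min
    best, bad_epochs = float("inf"), 0
    for loss in val_losses:
        if lr <= lr_min:
            break
        if loss < best:
            best, bad_epochs = loss, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                lr *= decay
                bad_epochs = 0
    return lr

# a stagnating validation loss drives the learning rate down to the stop criterion
final_lr = train_schedule([1.0] * 200)
```

In practice this logic corresponds to a reduce-on-plateau scheduler combined with a minimum-learning-rate stopping criterion.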
Data availability
All datasets used in this work are available at http://www.quantummachine.org/datasets.
Code availability
All code developed in this work will be made available upon request.
References
1. Behler, J. & Parrinello, M. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett. 98, 146401 (2007).
2. Braams, B. J. & Bowman, J. M. Permutationally invariant potential energy surfaces in high dimensionality. Int. Rev. Phys. Chem. 28, 577–606 (2009).
3. Bartók, A. P., Payne, M. C., Kondor, R. & Csányi, G. Gaussian approximation potentials: the accuracy of quantum mechanics, without the electrons. Phys. Rev. Lett. 104, 136403 (2010).
4. Smith, J. S., Isayev, O. & Roitberg, A. E. ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chem. Sci. 8, 3192–3203 (2017).
5. Podryabinkin, E. V. & Shapeev, A. V. Active learning of linearly parametrized interatomic potentials. Comput. Mater. Sci. 140, 171–180 (2017).
6. Podryabinkin, E. V., Tikhonov, E. V., Shapeev, A. V. & Oganov, A. R. Accelerating crystal structure prediction by machine-learning interatomic potentials with active learning. Phys. Rev. B 99, 064114 (2019).
7. Chmiela, S. et al. Machine learning of accurate energy-conserving molecular force fields. Sci. Adv. 3, e1603015 (2017).
8. Chmiela, S., Sauceda, H. E., Müller, K.-R. & Tkatchenko, A. Towards exact molecular dynamics simulations with machine-learned force fields. Nat. Commun. 9, 3887 (2018).
9. Ryczko, K., Strubbe, D. A. & Tamblyn, I. Deep learning and density-functional theory. Phys. Rev. A 100, 022512 (2019).
10. Brockherde, F. et al. Bypassing the Kohn-Sham equations with machine learning. Nat. Commun. 8, 872 (2017).
11. Wilkins, D. M. et al. Accurate molecular polarizabilities with coupled cluster theory and machine learning. Proc. Natl Acad. Sci. USA 116, 3401–3406 (2019).
12. Gastegger, M., Behler, J. & Marquetand, P. Machine learning molecular dynamics for the simulation of infrared spectra. Chem. Sci. 8, 6924–6935 (2017).
13. Rupp, M., Tkatchenko, A., Müller, K.-R. & von Lilienfeld, O. A. Fast and accurate modeling of molecular atomization energies with machine learning. Phys. Rev. Lett. 108, 058301 (2012).
14. Eickenberg, M., Exarchakis, G., Hirn, M. & Mallat, S. In Adv. Neural Inf. Process. Syst. 30, 6543–6552 (Curran Associates, Inc., 2017).
15. von Lilienfeld, O. A. Quantum machine learning in chemical compound space. Angew. Chem. Int. Ed. 57, 4164–4169 (2018).
16. Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O. & Dahl, G. E. In Proceedings of the 34th International Conference on Machine Learning, 1263–1272 (2017).
17. Jha, D. et al. ElemNet: deep learning the chemistry of materials from only elemental composition. Sci. Rep. 8, 17593 (2018).
18. Kitchin, J. R. Machine learning in catalysis. Nat. Catal. 1, 230–232 (2018).
19. Maryasin, B., Marquetand, P. & Maulide, N. Machine learning for organic synthesis: are robots replacing chemists? Angew. Chem. Int. Ed. 57, 6978–6980 (2018).
20. Li, H., Collins, C., Tanha, M., Gordon, G. J. & Yaron, D. J. A density functional tight binding layer for deep learning of chemical Hamiltonians. J. Chem. Theory Comput. 14, 5764–5776 (2018).
21. Welborn, M., Cheng, L. & Miller, T. F. III Transferability in machine learning for electronic structure via the molecular orbital basis. J. Chem. Theory Comput. 14, 4772–4779 (2018).
22. Cheng, L., Welborn, M., Christensen, A. S. & Miller, T. F. A universal density matrix functional from molecular orbital-based machine learning: transferability across organic molecules. J. Chem. Phys. 150, 131103 (2019).
23. Sugawara, M. Numerical solution of the Schrödinger equation by neural network and genetic algorithm. Comput. Phys. Commun. 140, 366–380 (2001).
24. Manzhos, S. & Carrington, T. An improved neural network method for solving the Schrödinger equation. Can. J. Chem. 87, 864–871 (2009).
25. Carleo, G. & Troyer, M. Solving the quantum many-body problem with artificial neural networks. Science 355, 602–606 (2017).
26. Grisafi, A., Wilkins, D. M., Csányi, G. & Ceriotti, M. Symmetry-adapted machine learning for tensorial properties of atomistic systems. Phys. Rev. Lett. 120, 036002 (2018).
27. Thomas, N. et al. Tensor field networks: rotation- and translation-equivariant neural networks for 3D point clouds. Preprint at https://arxiv.org/abs/1802.08219 (2018).
28. Hegde, G. & Bowen, R. C. Machine-learned approximations to density functional theory Hamiltonians. Sci. Rep. 7, 42669 (2017).
29. Townsend, J. & Vogiatzis, K. D. Data-driven acceleration of the coupled-cluster singles and doubles iterative solver. J. Phys. Chem. Lett. 10, 4129–4135 (2019).
30. Cramer, C. J. Essentials of Computational Chemistry: Theories and Models (John Wiley & Sons, 2004).
31. Schütt, K. T., Arbabzadah, F., Chmiela, S., Müller, K.-R. & Tkatchenko, A. Quantum-chemical insights from deep tensor neural networks. Nat. Commun. 8, 13890 (2017).
32. Schütt, K. T., Sauceda, H. E., Kindermans, P.-J., Tkatchenko, A. & Müller, K.-R. SchNet – a deep learning architecture for molecules and materials. J. Chem. Phys. 148, 241722 (2018).
33. Schütt, K. T. et al. In Adv. Neural Inf. Process. Syst. 30, 992–1002 (2017).
34. Perdew, J. P., Burke, K. & Ernzerhof, M. Generalized gradient approximation made simple. Phys. Rev. Lett. 77, 3865–3868 (1996).
35. Brock, A., Donahue, J. & Simonyan, K. Large scale GAN training for high fidelity natural image synthesis. In International Conference on Learning Representations https://openreview.net/forum?id=B1xsqj09Fm (2019).
36. Bach, S. et al. On pixel-wise explanations for non-linear classifier decisions by layer-wise relevance propagation. PLoS ONE 10, e0130140 (2015).
37. Montavon, G., Samek, W. & Müller, K.-R. Methods for interpreting and understanding deep neural networks. Digit. Signal Process. 73, 1–15 (2018).
38. Kindermans, P.-J. et al. In Int. Conf. Learn. Representations https://openreview.net/forum?id=Hkn7CBaTW (2018).
39. Kim, B. et al. In Proc. 35th Int. Conf. Mach. Learn., 2668–2677 (2018).
40. Lapuschkin, S. et al. Unmasking Clever Hans predictors and assessing what machines really learn. Nat. Commun. 10, 1096 (2019).
41. De, S., Bartók, A. P., Csányi, G. & Ceriotti, M. Comparing molecules and solids across structural and alchemical space. Phys. Chem. Chem. Phys. 18, 13754–13769 (2016).
42. Schütt, K. T., Gastegger, M., Tkatchenko, A. & Müller, K.-R. In Explainable AI: Interpreting, Explaining and Visualizing Deep Learning, 311–330 (Springer, 2019).
43. to Baben, M., Achenbach, J. & von Lilienfeld, O. Guiding ab initio calculations by alchemical derivatives. J. Chem. Phys. 144, 104103 (2016).
44. You, J., Liu, B., Ying, Z., Pande, V. & Leskovec, J. In Adv. Neural Inf. Process. Syst. 31, 6410–6421 (2018).
45. Neese, F. The ORCA program system. WIREs Comput. Mol. Sci. 2, 73–78 (2012).
46. Maurer, R. J., Askerka, M., Batista, V. S. & Tully, J. C. Ab initio tensorial electronic friction for molecules on metal surfaces: nonadiabatic vibrational relaxation. Phys. Rev. B 94, 115432 (2016).
47. Lu, W. C. et al. Molecule intrinsic minimal basis sets. I. Exact resolution of ab initio optimized molecular orbitals in terms of deformed atomic minimal-basis orbitals. J. Chem. Phys. 120, 2629–2637 (2004).
48. Weigend, F. & Ahlrichs, R. Balanced basis sets of split valence, triple zeta valence and quadruple zeta valence quality for H to Rn: design and assessment of accuracy. Phys. Chem. Chem. Phys. 7, 3297–3305 (2005).
49. Pulay, P. Convergence acceleration of iterative sequences. The case of SCF iteration. Chem. Phys. Lett. 73, 393–398 (1980).
50. Schütt, K. T. et al. SchNetPack: a deep learning toolbox for atomistic systems. J. Chem. Theory Comput. 15, 448–455 (2018).
51. Bussi, G. & Parrinello, M. Accurate sampling using Langevin dynamics. Phys. Rev. E 75, 056707 (2007).
52. Sutskever, I., Martens, J. & Hinton, G. E. In Proceedings of the 28th International Conference on Machine Learning, 1017–1024 (2011).
53. Schober, C., Reuter, K. & Oberhofer, H. Critical analysis of fragment-orbital DFT schemes for the calculation of electronic coupling values. J. Chem. Phys. 144, 054103 (2016).
54. Kingma, D. P. & Ba, J. Adam: a method for stochastic optimization. In International Conference on Learning Representations https://arxiv.org/abs/1412.6980 (2014).
Acknowledgements
We gratefully acknowledge support by the Institute for Pure and Applied Mathematics (IPAM) at the University of California, Los Angeles during a long program workshop. R.J.M. acknowledges funding through a UKRI Future Leaders Fellowship (MR/S016023/1). K.T.S. and K.R.M. acknowledge support by the Federal Ministry of Education and Research (BMBF) for the Berlin Center for Machine Learning (01IS18037A). This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 792572. Computing resources have been provided by the Scientific Computing Research Technology Platform of the University of Warwick and the EPSRC-funded high-end computing Materials Chemistry Consortium (EP/R029431/1). K.R.M. acknowledges partial financial support by the German Ministry for Education and Research (BMBF) under Grants 01IS14013A-E, 01GQ1115 and 01GQ0850; by the Deutsche Forschungsgemeinschaft (DFG) under Grant MATH+, EXC 2046/1, Project ID 390685689; and by the Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (Nos. 2017-0-00451, 2017-0-01779). Correspondence to R.J.M., K.R.M. and A.T.
Author information
Contributions
K.T.S. and R.J.M. proposed the approach of this work. K.T.S. conceived and implemented SchNOrb. M.G. and R.J.M. carried out the reference calculations. K.T.S. and M.G. performed the molecular dynamics simulations and prepared the figures. K.T.S., M.G., and R.J.M. designed the analyses and wrote the paper. K.T.S., M.G., A.T., K.R.M., and R.J.M. discussed results and commented on the manuscript.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Peer review information Nature Communications thanks the anonymous reviewers for their contribution to the peer review of this work. Peer reviewer reports are available.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Schütt, K. T., Gastegger, M., Tkatchenko, A. et al. Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions. Nat. Commun. 10, 5024 (2019). https://doi.org/10.1038/s41467-019-12875-2