Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kristof T. Schütt is active.

Publication


Featured research published by Kristof T. Schütt.


Nature Communications | 2017

Quantum-chemical insights from deep tensor neural networks

Kristof T. Schütt; Farhad Arbabzadah; Stefan Chmiela; Klaus-Robert Müller; Alexandre Tkatchenko

Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol⁻¹) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems.
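
The size-extensive, sum-of-atomic-contributions idea behind this model can be sketched in a few lines: per-atom embeddings are refined by distance-dependent pairwise interactions, and the total energy is a sum over atoms. The weights below are random and the architecture is heavily simplified, so this is an illustration of the principle only, not the published deep tensor neural network.

```python
import numpy as np

def predict_energy(charges, positions, embed_dim=8, n_passes=2, seed=0):
    """Toy sketch of a DTNN-style model: per-atom embeddings (indexed
    by nuclear charge) are refined by distance-gated pairwise messages,
    and the total energy is a sum of per-atom contributions, which
    makes the prediction size-extensive by construction.
    Weights are random here; a real model learns them from data."""
    rng = np.random.default_rng(seed)
    embed = rng.normal(size=(100, embed_dim))        # one row per element Z
    W = rng.normal(size=(embed_dim, embed_dim)) / embed_dim
    w_out = rng.normal(size=embed_dim)

    c = embed[charges]                               # (n_atoms, embed_dim)
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    mask = 1.0 - np.eye(len(charges))                # no self-interaction
    gauss = np.exp(-d ** 2) * mask                   # distance-dependent gate

    for _ in range(n_passes):
        c = c + gauss @ np.tanh(c @ W)               # aggregate neighbor messages
    return (np.tanh(c) @ w_out).sum()                # sum of atomic energies
```

A quick check of size-extensivity: placing a second, non-interacting copy of a molecule far away doubles the predicted energy exactly, because the Gaussian gate vanishes between the copies.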


Physical Review B | 2014

How to represent crystal structures for machine learning: Towards fast prediction of electronic properties

Kristof T. Schütt; H. Glawe; F. Brockherde; Antonio Sanna; Klaus-Robert Müller; E. K. U. Gross

High-throughput density functional calculations of solids are highly time-consuming. As an alternative, we propose a machine learning approach for the fast prediction of solid-state properties. To achieve this, local spin-density approximation calculations are used as a training set. We focus on predicting the value of the density of electronic states at the Fermi energy. We find that conventional representations of the input data, such as the Coulomb matrix, are not suitable for the training of learning machines in the case of periodic solids. We propose a novel crystal structure representation for which learning and competitive prediction accuracies become possible within an unrestricted class of spd systems of arbitrary unit-cell size.


Science Advances | 2017

Machine learning of accurate energy-conserving molecular force fields

Stefan Chmiela; Alexandre Tkatchenko; Huziel Sauceda; Igor Poltavsky; Kristof T. Schütt; Klaus-Robert Müller

Using conservation of energy—a fundamental property of closed classical and quantum mechanical systems—we develop an efficient gradient-domain machine learning (GDML) approach to construct accurate molecular force fields using a restricted number of samples from ab initio molecular dynamics (AIMD) trajectories. The GDML implementation is able to reproduce global potential energy surfaces of intermediate-sized molecules with an accuracy of 0.3 kcal mol⁻¹ for energies and 1 kcal mol⁻¹ Å⁻¹ for atomic forces using only 1000 conformational geometries for training. We demonstrate this accuracy for AIMD trajectories of molecules, including benzene, toluene, naphthalene, ethanol, uracil, and aspirin. The challenge of constructing conservative force fields is accomplished in our work by learning in a Hilbert space of vector-valued functions that obey the law of energy conservation. The GDML approach enables quantitative molecular dynamics simulations for molecules at a fraction of the cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods.
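
The central trick is to define forces as the exact negative gradient of a single scalar model, so energy conservation holds by construction rather than being learned approximately. The toy sketch below uses a Gaussian-kernel energy surrogate with random, untrained weights; it is not the actual GDML kernel, but it shows the conservative-by-construction property.

```python
import numpy as np

def energy(x, centers, weights, gamma=0.5):
    """Toy scalar energy surrogate: a Gaussian-kernel expansion."""
    k = np.exp(-gamma * np.sum((x - centers) ** 2, axis=1))
    return weights @ k

def force(x, centers, weights, gamma=0.5):
    """Analytic negative gradient of energy(); because the force is
    derived from a scalar potential, it is conservative by
    construction."""
    diff = x - centers                               # (n_centers, dim)
    k = np.exp(-gamma * np.sum(diff ** 2, axis=1))
    return 2.0 * gamma * (weights * k) @ diff        # -dE/dx

def work_along(path, centers, weights):
    """Midpoint-rule line integral of the force along a path."""
    w = 0.0
    for p, q in zip(path[:-1], path[1:]):
        w += force((p + q) / 2.0, centers, weights) @ (q - p)
    return w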


Security and Artificial Intelligence | 2012

Early detection of malicious behavior in JavaScript code

Kristof T. Schütt; Marius Kloft; Alexander Bikadorov; Konrad Rieck

Malicious JavaScript code is widely used for exploiting vulnerabilities in web browsers and infecting users with malicious software. Static detection methods fail to protect from this threat, as they are unable to cope with the complexity and dynamics of interpreted code. In contrast, the dynamic analysis of JavaScript code at run-time has proven to be effective in identifying malicious behavior. During the execution of the code, however, damage may already take place and thus an early detection is critical for effective protection. In this paper, we introduce EarlyBird: a detection method optimized for early identification of malicious behavior in JavaScript code. The method uses machine learning techniques for jointly optimizing the accuracy and the time of detection. In an evaluation with hundreds of real attacks, EarlyBird precisely identifies malicious behavior while limiting the amount of malicious code that is executed by a factor of 2 (43%) on average.
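
The early-detection idea can be sketched as a running score over observed run-time events: each event contributes a learned weight, and an alarm fires as soon as the cumulative score crosses a threshold, before the remaining code executes. The event names and weights below are hypothetical illustrations, not the paper's learned model or feature set.

```python
def early_detect(events, weights, threshold):
    """Toy sketch of prefix-based detection: score the trace as it
    unfolds and raise an alarm as soon as the cumulative score
    crosses the threshold, returning how many events were allowed
    to execute before detection. Returns None if no alarm fires."""
    score = 0.0
    for i, ev in enumerate(events):
        score += weights.get(ev, 0.0)                # unknown events score 0
        if score >= threshold:
            return i + 1                             # events executed so far
    return None
```

In the paper, the weights and threshold are chosen jointly so that benign traces stay below the threshold while malicious traces cross it as early as possible.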


Journal of Chemical Physics | 2018

SchNet - A Deep Learning Architecture for Molecules and Materials

Kristof T. Schütt; Huziel Sauceda; Pieter-Jan Kindermans; Alexandre Tkatchenko; Klaus-Robert Müller

Deep learning has led to a paradigm shift in artificial intelligence, including web, text, and image search, speech recognition, as well as bioinformatics, with growing impact in chemical physics. Machine learning, in general, and deep learning, in particular, are ideally suited to representing quantum-mechanical interactions, enabling us to model nonlinear potential-energy surfaces or to enhance the exploration of chemical compound space. Here we present the deep learning architecture SchNet that is specifically designed to model atomistic systems by making use of continuous-filter convolutional layers. We demonstrate the capabilities of SchNet by accurately predicting a range of properties across chemical space for molecules and materials, where our model learns chemically plausible embeddings of atom types across the periodic table. Finally, we employ SchNet to predict potential-energy surfaces and energy-conserving force fields for molecular dynamics simulations of small molecules and perform an exemplary study on the quantum-mechanical properties of C20-fullerene that would have been infeasible with regular ab initio molecular dynamics.
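
The continuous-filter convolution at the core of SchNet replaces grid-based filters with filter values generated from the continuous interatomic distances. A minimal numpy sketch with random, untrained weights is shown below: a single linear filter-generating layer over a Gaussian radial basis, whereas the real SchNet uses a trained filter network, cosine cutoffs, and stacked interaction blocks.

```python
import numpy as np

def cfconv(features, positions, n_rbf=16, cutoff=5.0, seed=0):
    """Minimal sketch of a continuous-filter convolution: filter
    values are generated from interatomic distances via a radial-basis
    expansion and a (random, untrained) linear layer, then applied
    element-wise to neighbor features and summed over neighbors."""
    rng = np.random.default_rng(seed)
    n_atoms, n_feat = features.shape
    W = rng.normal(size=(n_rbf, n_feat)) / np.sqrt(n_rbf)

    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    mu = np.linspace(0.0, cutoff, n_rbf)             # radial basis centers
    rbf = np.exp(-((d[..., None] - mu) ** 2))        # (n, n, n_rbf)
    filters = rbf @ W                                # (n, n, n_feat)
    filters *= (1.0 - np.eye(n_atoms))[..., None]    # drop self-interaction
    return np.einsum("ijf,jf->if", filters, features)
```

Because the positions enter only through pairwise distances, the output is invariant under rotations and translations of the molecule, which is one reason this layer suits atomistic systems.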


arXiv: Machine Learning | 2017

PatternNet and PatternLRP - Improving the interpretability of neural networks.

Pieter-Jan Kindermans; Kristof T. Schütt; Maximilian Alber; Klaus-Robert Müller; Sven Dähne


Neural Information Processing Systems | 2017

SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

Kristof T. Schütt; Pieter-Jan Kindermans; Huziel Enoc Sauceda Felix; Stefan Chmiela; Alexandre Tkatchenko; Klaus-Robert Müller


arXiv: Machine Learning | 2016

Investigating the influence of noise and distractors on the interpretation of neural networks.

Pieter-Jan Kindermans; Kristof T. Schütt; Klaus-Robert Müller; Sven Dähne


arXiv: Machine Learning | 2018

The (Un)reliability of saliency methods

Pieter-Jan Kindermans; Sara Hooker; Julius Adebayo; Kristof T. Schütt; Maximilian Alber; Sven Dähne; Dumitru Erhan; Been Kim


International Conference on Learning Representations | 2018

Learning how to explain neural networks: PatternNet and PatternAttribution

Pieter-Jan Kindermans; Kristof T. Schütt; Maximilian Alber; Klaus-Robert Müller; Dumitru Erhan; Been Kim; Sven Dähne

Collaboration


Dive into Kristof T. Schütt's collaborations.

Top Co-Authors

Klaus-Robert Müller
Technical University of Berlin

Maximilian Alber
Technical University of Berlin

Sven Dähne
Technical University of Berlin

Stefan Chmiela
Technical University of Berlin

Been Kim
Massachusetts Institute of Technology