Nanoparticles with a solid, inorganic core surrounded by long-chain organic ligands have many useful properties and applications. A key feature of these materials is that their properties can be tuned to an application, which makes preliminary simulations appealing as a way to narrow the design space before going into the lab. From a simulation perspective, however, nanoparticles are large and expensive to simulate at the atomic level. A collection of methods exists to take gross structural information and produce a potential fit for simulations at the molecular level. In this work, five such methods (along with a few variations on them) were applied to a series of increasingly large molecules to see how they perform at the most aggressive level of coarse graining. The methods were compared on how well they reproduced structural information about the molecules, and on how much they sped up the dynamics of those systems. To make meaningful comparisons between these results, the uncertainty in the results needs to be known. Since large simulations are involved, running multiple replicate simulations is expensive. However, Shanbhag (Shanbhag, 2013) recently proposed a method to obtain the uncertainty in diffusion coefficients from a single molecular dynamics simulation, by bootstrapping the atomic trajectories to generate estimates. This method was originally tested only on a simple system, so its validity on more complicated systems needed to be verified. This work tested the validity of the method by running two hundred Lennard-Jones simulations, performing bootstrapping on each, and finding the percentage of bootstrap results that failed to capture the overall mean. This was repeated under different conditions and potentials to determine exactly when and how badly the method fails.
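The coverage test described above can be sketched in a few lines. This is an illustrative toy, not the dissertation's code: the per-particle diffusion estimates are fabricated from a normal distribution, and the sample sizes and confidence level are invented for the example. The idea is the same, though: bootstrap each "simulation" by resampling its particles, build a confidence interval, and count how often the interval misses the overall mean.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(per_particle_D, n_boot=1000, alpha=0.05):
    """Resample particles with replacement and return a percentile
    confidence interval for the mean diffusion coefficient."""
    n = len(per_particle_D)
    means = np.array([
        rng.choice(per_particle_D, size=n, replace=True).mean()
        for _ in range(n_boot)
    ])
    return np.quantile(means, alpha / 2), np.quantile(means, 1 - alpha / 2)

# Stand-in for 200 independent simulations, each yielding one diffusion
# estimate per particle (values here are synthetic, not simulation output).
true_D = 1.0
n_sims, n_particles = 200, 50
failures = 0
for _ in range(n_sims):
    per_particle_D = rng.normal(true_D, 0.2, size=n_particles)
    lo, hi = bootstrap_ci(per_particle_D)
    if not (lo <= true_D <= hi):
        failures += 1

print(f"fraction of intervals missing the overall mean: {failures / n_sims:.3f}")
```

For well-behaved independent data this failure fraction should sit near the nominal 5%; the dissertation's point is that correlated trajectory data can push it well away from that.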
The bootstrapping comparisons showed that simulations start out with a certain level of underestimation, the exact amount depending on how strongly the particles interact. When unweighted least squares regression is applied to the mean squared displacement, the underestimation approaches a minimum once the simulation has run long enough for the particles to traverse the simulation box. Methods that put more emphasis on short-time data do not recover gracefully from the initial effects of correlation. Armed with the ability to measure this uncertainty, the effects of coarse graining were studied. Inverse Boltzmann best reproduced structural information, at the cost of added computation. Of the computationally cheap methods, hypernetted chain tended to reproduce structural information best, while the potential of mean force and force averaging were typically among the worst. As for transferability among the pure methods, force averaging was fairly transferable, hypernetted chain less so, and Inverse Boltzmann suffered from overfitting (though this problem is improved by calculating a bridge function). While coarse graining was expected to speed up dynamics, the hope was that the speedup would be consistent: it was not.
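The unweighted least squares step mentioned above can be sketched as follows, using the Einstein relation MSD(t) = 2dDt to recover a diffusion coefficient from the slope of the mean squared displacement. The data here is synthetic (a known D plus noise), invented purely to make the sketch runnable; it is not from the dissertation's simulations.

```python
import numpy as np

d = 3                # spatial dimension
D_true = 0.5         # diffusion coefficient used to fabricate the data
rng = np.random.default_rng(1)

t = np.linspace(0.1, 10.0, 100)
msd = 2 * d * D_true * t + rng.normal(0.0, 0.05, size=t.size)

# Unweighted least squares fit through the origin:
# minimizing sum((msd - slope*t)^2) gives slope = sum(t*msd) / sum(t*t).
slope = np.dot(t, msd) / np.dot(t, t)
D_est = slope / (2 * d)
print(f"estimated D = {D_est:.3f}")
```

Because the fit is unweighted, the long-time (large-MSD) points dominate, which is why this estimator settles down once particles have had time to traverse the box, whereas schemes that emphasize short-time data stay exposed to the early correlated regime.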
Keywords
bootstrapping, coarse graining, error estimation
Date of Defense
November 3, 2017.
A Dissertation submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
Includes bibliographical references.
Sachin Shanbhag, Professor Directing Dissertation; Per Arne Rikvold, University Representative; Chen Huang, Committee Member; Jose Mendoza-Cortes, Committee Member; Dennis Slice, Committee Member.
Florida State University
Crysup, B. R. (2017). Making Material Simulation Faster: Coarse Graining, Bridging and Bootstrapping. Retrieved from http://purl.flvc.org/fsu/fd/FSU_FALL2017_Crysup_fsu_0071E_14203