Current Search: Srivastava, Anuj
Search results
 Title
 Statistical Shape Analysis Of Simplified Neuronal Trees.
 Creator

Duncan, Adam, Klassen, Eric, Srivastava, Anuj
 Abstract/Description

Neuron morphology plays a central role in characterizing cognitive health and functionality of brain structures. The problem of quantifying neuron shapes and capturing statistical variability of shapes is difficult because neurons differ both in geometry and in topology. This paper develops a mathematical representation of neuronal trees, restricting to the trees that consist of: (1) a main branch viewed as a parameterized curve in R3, and (2) some number of secondary branches, also parameterized curves in R3, which emanate from the main branch at arbitrary points. It imposes a metric on the representation space, in order to compare neuronal shapes, and to obtain optimal deformations (geodesics) across arbitrary trees. The key idea is to impose certain equivalence relations that allow trees with different geometries and topologies to be compared efficiently. The combinatorial problem of matching side branches across trees is reduced to a linear assignment with well-known efficient solutions. This framework is then applied to comparing, clustering, and classifying neurons using fully automated algorithms. The framework is illustrated on three datasets of neuron reconstructions, specifically showing geodesic paths and cross-validated classification between experimental groups.
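The reduction of side-branch matching to linear assignment can be illustrated with a small sketch. The cost matrix below is hypothetical; in the paper the costs would come from the elastic shape metric between branch curves.

```python
import itertools

def match_branches(cost):
    """Solve a small linear assignment problem by brute force: find the
    one-to-one matching of rows (side branches of tree A) to columns
    (side branches of tree B) with minimal total cost. For larger trees
    one would use an efficient solver such as the Hungarian algorithm
    (e.g. scipy.optimize.linear_sum_assignment)."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in itertools.permutations(range(n)):
        c = sum(cost[i][perm[i]] for i in range(n))
        if c < best_cost:
            best_perm, best_cost = perm, c
    return best_perm, best_cost

# Hypothetical matching costs between three side branches of each tree.
cost = [[4.0, 1.0, 3.0],
        [2.0, 0.5, 5.0],
        [3.0, 2.0, 2.0]]
perm, total = match_branches(cost)
```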
 Date Issued
 2018-09-01
 Identifier
 FSU_libsubv1_wos_000444259500002, 10.1214/17-AOAS1107
 Format
 Citation
 Title
 The F-Table: A Data Structure for Rendering Photo-Accurate Images of Faces from Experimentally Acquired Reflectance.
 Creator

Song, Hui, Banks, David, Srivastava, Anuj, Liu, Xiuwen, Department of Computer Science, Florida State University
 Abstract/Description

My thesis is that a realistic image of a human face can be generated using a computational simulation of light transport. I demonstrate this thesis by acquiring the three-dimensional geometry of a face, measuring the reflectance function from the face, and generating a graphical image that accurately matches a photograph of the face. I also show that the faithful simulation of the physics of light transport is important for synthesizing a realistic image of the face. One important consequence of this work is that it can be used in the future to recognize a face based on a photograph. By rendering realistic images of geometric meshes of faces using accurate reflectance functions and faithful physics, photorealistic images can be compared pixel by pixel to digitized photographs to search for the closest match between a face in a geometric database and a face in a photograph.
 Date Issued
 2004
 Identifier
 FSU_migr_etd1637
 Format
 Thesis
 Title
 A Study of Image Representations for Content-Based Image Retrieval.
 Creator

Sun, Donghu, Liu, Xiuwen, Srivastava, Anuj, Schwartz, Daniel, Department of Computer Science, Florida State University
 Abstract/Description

The performance of a content-based image retrieval system depends on the representation of images. As a typical image consists of different objects, image segmentation is needed for more accurate representations of contents. The first part of this thesis describes a generic image segmentation algorithm based on local spectral histograms of images. This algorithm, demonstrated by experimental results, is shown to be effective for both texture and non-texture images, and comparable to other segmentation algorithms. Due to the time constraints of an image retrieval system, the second part of this thesis focuses on low-dimensional representations of images. By analyzing the semantics of commonly used linear subspace representations through sampling their intrinsic generalizations, their limitations are illustrated, and a nonlinear representation, called Spectral Subspace Analysis (SSA), that overcomes these limitations is proposed. In addition, to obtain optimal retrieval performance, an algorithm for learning optimal representations is developed by formulating the problem as an optimization on a Grassmann manifold and exploiting the underlying geometry of the manifold. Experimental results on different data sets show that both the SSA representation and the learned optimal representations can improve retrieval performance significantly for content-based image retrieval systems.
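A local spectral histogram, in simplified form, is the concatenation of normalized histograms of filter responses over an image window. A minimal sketch follows; the three-filter bank here is a placeholder for the thesis's richer bank.

```python
import numpy as np

def spectral_histogram(window, bins=8):
    """Concatenated, normalized histograms of filter responses over an
    image window. The filter bank (intensity plus x/y first
    differences) is a minimal stand-in; windows from different textures
    yield different feature vectors, which is what drives segmentation."""
    responses = [
        window,                   # intensity "filter"
        np.diff(window, axis=0),  # vertical first difference
        np.diff(window, axis=1),  # horizontal first difference
    ]
    feats = []
    for r in responses:
        h, _ = np.histogram(r, bins=bins, range=(-1.0, 1.0))
        feats.append(h / h.sum())
    return np.concatenate(feats)
```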
 Date Issued
 2004
 Identifier
 FSU_migr_etd1505
 Format
 Thesis
 Title
 Reducing the WCET of Applications on Low End Embedded Systems.
 Creator

Zhao, Wankang, Whalley, David, Srivastava, Anuj, Baker, Theodore P., Engelen, Robert A. van, Gallivan, Kyle, Department of Computer Science, Florida State University
 Abstract/Description

Applications in embedded systems often need to meet specified timing constraints. It is advantageous not only to calculate the Worst-Case Execution Time (WCET) of an application, but also to perform transformations that attempt to reduce the WCET, since an application with a lower WCET will be less likely to violate its timing constraints. A compiler has been integrated with a timing analyzer to obtain the WCET of a program on demand during compilation. This environment is used to investigate three different types of compiler optimization techniques to reduce WCET. First, an interactive compilation system has been developed that allows a user to interact with a compiler and get feedback regarding the WCET. In addition, a genetic algorithm is used to automatically search for an effective optimization phase sequence to reduce the WCET. Second, a WCET code positioning optimization has been investigated that uses worst-case path information to reorder basic blocks so that branch penalties can be reduced in the worst-case path. Third, WCET path optimizations, similar to frequent path optimizations, are used to reduce the WCET. There are several contributions in this work. To the best of our knowledge, this is the first compiler that interacts with a timing analyzer to use WCET predictions during the compilation of applications. The dissertation demonstrates that a genetic algorithm search can find an optimization sequence that simultaneously improves both WCET and code size. New compiler optimizations have been developed that use worst-case path information from a timing analyzer. The results show that the WCET code positioning algorithms typically find the optimal layout of the basic blocks with the minimal WCET. It is also shown that frequent path optimizations can be applied on worst-case paths, using worst-case path information from a timing analyzer, to reduce WCET. These new compiler optimizations described in this dissertation not only significantly reduce WCET, but also are completely automatic.
 Date Issued
 2005
 Identifier
 FSU_migr_etd0528
 Format
 Thesis
 Title
 Discontinuous Galerkin Spectral Element Approximations on Moving Meshes for Wave Scattering from Reflective Moving Boundaries.
 Creator

AcostaMinoli, Cesar Augusto, Kopriva, David, Srivastava, Anuj, Hussaini, M. Yousuff, Sussman, Mark, Ewald, Brian, Department of Mathematics, Florida State University
 Abstract/Description

This dissertation develops and evaluates a high-order method to compute wave scattering from moving boundaries. Specifically, we derive and evaluate a Discontinuous Galerkin Spectral Element Method (DGSEM) with an Arbitrary Lagrangian-Eulerian (ALE) mapping to compute conservation laws on moving meshes, along with numerical boundary conditions for Maxwell's equations, the linear Euler equations, and the nonlinear Euler gas dynamics equations to calculate the numerical flux on reflective moving boundaries. We use one of a family of explicit time integrators such as Adams-Bashforth or low-storage explicit Runge-Kutta. The approximations preserve the discrete metric identities and the Discrete Geometric Conservation Law (DGCL) by construction. We present time-step refinement studies with moving meshes to validate the moving mesh approximations. The test problems include propagation of an electromagnetic Gaussian plane wave, a cylindrical pressure wave propagating in a subsonic flow, and a vortex convecting in a uniform inviscid subsonic flow. Each problem is computed on a time-deforming mesh with three methods used to calculate the mesh velocities: from exact differentiation, from the integration of an acceleration equation, and from numerical differentiation of the mesh position. In addition, we present four numerical examples using Maxwell's equations, one example using the linear Euler equations, and one more example using the nonlinear Euler equations to validate these approximations. These are: reflection of light from a constantly moving mirror, reflection of light from a constantly moving cylinder, reflection of light from a vibrating mirror, reflection of sound in linear acoustics, and dipole sound generation by an oscillating cylinder in an inviscid flow.
 Date Issued
 2011
 Identifier
 FSU_migr_etd0111
 Format
 Thesis
 Title
 Analysis of the Wealth Distribution at Equilibrium in a Heterogeneous Agent Economy.
 Creator

Badshah, Muffasir H., Srivastava, Anuj, Beaumont, Paul, Wu, Wei, Kercheval, Alec, Department of Statistics, Florida State University
 Abstract/Description

This paper aims at analyzing a macro economy with a continuum of infinitely-lived households that make rational decisions about consumption and wealth savings in the face of employment and aggregate productivity shocks. The heterogeneous population structure arises when households differ in wealth and employment status, against which they cannot insure. In this framework, the household wealth evolution is modeled as a mixture Markov process. The stationary wealth distributions are obtained using eigenstructures of transition matrices under the Perron-Frobenius theorem. This step is utilized repeatedly to find the equilibrium state of the system, and it leads to an efficient framework for studying the dynamic general equilibrium. A systematic evaluation of the equilibrium state under different initial conditions is further presented and analyzed.
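The stationary-distribution step can be sketched as an eigen-computation on a small hypothetical transition matrix; the model's actual matrices track joint wealth/employment states.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to one).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

# By Perron-Frobenius, the spectral radius of a stochastic matrix is 1,
# and the stationary distribution is the left eigenvector for that
# eigenvalue, normalized to sum to one.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```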
 Date Issued
 2010
 Identifier
 FSU_migr_etd0844
 Format
 Thesis
 Title
 A Block Incremental Algorithm for Computing Dominant Singular Subspaces.
 Creator

Baker, Christopher Grover, Gallivan, Kyle, Srivastava, Anuj, Engelen, Robert van, Department of Computer Science, Florida State University
 Abstract/Description

This thesis presents and evaluates a generic algorithm for incrementally computing the dominant singular subspaces of a matrix. The relationship between the generality of the results and the necessary computation is explored. The performance of this method, both numerical and computational, is discussed in terms of the algorithmic parameters, such as block size and acceptance threshold. Bounds on the error are presented along with a posteriori approximations of these bounds. Finally, a group of methods is proposed that iteratively improve the accuracy of the computed results and the quality of the bounds.
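The flavor of such an incremental computation can be sketched as follows: a simplified block update that absorbs one column block at a time, without the error bounds or refinement methods developed in the thesis.

```python
import numpy as np

def incremental_svd(blocks, k):
    """Incrementally track the k dominant left singular vectors and
    singular values of [b0 | b1 | ...] by absorbing one column block
    at a time (a simplified block-incremental SVD update)."""
    U = S = None
    for B in blocks:
        if U is None:
            U, S, _ = np.linalg.svd(B, full_matrices=False)
        else:
            C = U.T @ B
            R = B - U @ C               # residual w.r.t. current subspace
            Q, Rr = np.linalg.qr(R)
            # small SVD of the expanded middle factor
            K = np.block([[np.diag(S), C],
                          [np.zeros((Q.shape[1], S.size)), Rr]])
            Uk, S, _ = np.linalg.svd(K, full_matrices=False)
            U = np.hstack([U, Q]) @ Uk
        U, S = U[:, :k], S[:k]          # truncate to dominant subspace
    return U, S
```

On an exactly rank-k matrix the truncation discards nothing, so the incremental result matches a batch SVD; in general the truncation introduces an error that the thesis bounds.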
 Date Issued
 2004
 Identifier
 FSU_migr_etd0961
 Format
 Thesis
 Title
 Optimal Linear Representations of Images under Diverse Criteria.
 Creator

Rubinshtein, Evgenia, Srivastava, Anuj, Liu, Xiuwen, Huffer, Fred, Chicken, Eric, Department of Statistics, Florida State University
 Abstract/Description

Image analysis often requires dimension reduction before statistical analysis, in order to apply sophisticated procedures. Motivated by eventual applications, a variety of criteria have been proposed: reconstruction error, class separation, non-Gaussianity using kurtosis, sparseness, mutual information, recognition of objects, and their combinations. Although some criteria have analytical solutions, the remaining ones require numerical approaches. We present geometric tools for finding linear projections that optimize a given criterion for a given data set. The main idea is to formulate a problem of optimization on a Grassmann or a Stiefel manifold, and to use the differential geometry of the underlying space to construct optimization algorithms. Purely deterministic updates lead to local solutions, and the addition of random components allows for stochastic gradient searches that eventually lead to global solutions. We demonstrate these results using several image datasets, including natural images and facial images.
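The manifold-optimization idea can be sketched with a deterministic gradient search on the Stiefel manifold for a simple quadratic criterion, chosen because its optimum is known and checkable; the thesis's criteria, and the stochastic-gradient variant for global search, are richer.

```python
import numpy as np

def optimize_subspace(A, k, steps=500, lr=0.05, seed=0):
    """Gradient ascent on the Stiefel/Grassmann manifold: maximize
    F(U) = tr(U^T A U) over k-dimensional subspaces. The Euclidean
    gradient is projected onto the tangent space at U, and a QR
    retraction keeps the iterate on the manifold. The optimum of this
    stand-in F is the dominant eigen-subspace of symmetric A."""
    rng = np.random.default_rng(seed)
    U, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for _ in range(steps):
        G = 2 * A @ U                    # Euclidean gradient of F
        G = G - U @ (U.T @ G)            # project onto tangent space
        U, _ = np.linalg.qr(U + lr * G)  # retract back to manifold
    return U
```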
 Date Issued
 2006
 Identifier
 FSU_migr_etd1926
 Format
 Thesis
 Title
 Constructing A Revised Version of the Face Stimulus Assessment to Measure Formal Elements: A Pilot Study.
 Creator

Mattson, Donald C., Gussak, David E., Srivastava, Anuj, Rosal, Marcia L., Villeneuve, Pat, Department of Art Education, Florida State University
 Abstract/Description

The Face Stimulus Assessment-Revised (FSA-R) is an art-based instrument constructed from elements of the Face Stimulus Assessment (FSA; Betts, 2003). The pilot test involved computerized rating of formal elements between those with Major Depressive Disorder (n = 20) and controls without a known diagnosis of Major Depressive Disorder (n = 20). Significance resulted from a multiple t-test analysis of the data. In response to the hypothesis that the formal elements of color and/or free space from the FSA-R, rated by public domain image analysis software (PDIAS), can distinguish Major Depression artwork from control group artwork, this study concluded that certain colors and free space distinguished the groups. Those with Major Depression drew less purple (t(38) = 2.95, p = .05, d = .96), less orange (t(38) = 2.28, p = .05, d = .70), and more left free space (t(38) = 2.26, p = .05, d = .73) than controls. As a result, it may be possible for the FSA-R to become a standardized instrument for screening for Major Depression.
 Date Issued
 2011
 Identifier
 FSU_migr_etd2665
 Format
 Thesis
 Title
 A Bayesian MRF Framework for Labeling Terrain Using Hyperspectral Imaging.
 Creator

Neher, Robert E., Srivastava, Anuj, Liu, Xiuwen, Huffer, Fred, Wegkamp, Marten, Department of Statistics, Florida State University
 Abstract/Description

We explore the non-Gaussianity of hyperspectral data and present probability models that capture the variability of hyperspectral images. In particular, we present a nonparametric probability distribution that models the distribution of the hyperspectral data after reducing the dimension of the data via either principal components or Fisher's discriminant analysis. We also explore the directional differences in observed images and present two parametric distributions, the generalized Laplacian and the Bessel K form, that model well the non-Gaussian behavior of the directional differences. We then propose a model, using Bayesian inference and Markov random fields, that labels each spatial site by incorporating the information of the nonparametric distribution of the data and the parametric distributions of the directional differences, along with a prior distribution that favors smooth labeling. We then test our model on actual hyperspectral data and present the results, using the Washington D.C. Mall and Indian Springs rural area data sets.
 Date Issued
 2004
 Identifier
 FSU_migr_etd2691
 Format
 Thesis
 Title
 Robust Change Detection and Change Point Estimation for Poisson Count Processes.
 Creator

Perry, Marcus B., Pignatiello, Joseph J., Srivastava, Anuj, Simpson, James R., Zhang, Chuck, Department of Industrial and Manufacturing Engineering, Florida State University
 Abstract/Description

Poisson count processes are often used to model the number of occurrences over some interval unit. In an industrial quality control setting, these processes are often used to model the number of nonconformities per unit of product. Current methods for monitoring and estimating changes in Poisson count processes assume that the magnitude and type of change are known a priori. Since these are rarely known in practice, this dissertation reports on the development and evaluation of several methods for detecting and estimating change points when the magnitude and type of change are unknown. Instead, the only assumption required is that the type of change belongs to a family of monotonic change types. Results indicate that the methodologies proposed throughout this dissertation provide robust detection and estimation capabilities (relative to current methods) with regard to the magnitude and type of monotonic change that may be present.
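As a point of contrast, the classical approach with a known change type can be sketched as a maximum-likelihood search for a single step change in the Poisson rate; the dissertation's methods drop this known-type assumption, requiring only monotonicity.

```python
import math

def poisson_changepoint(counts):
    """Maximum-likelihood estimate of a single step change point in a
    Poisson count sequence: for each split point tau, fit separate
    rates before and after, and keep the split with the highest
    profile likelihood."""
    def loglik(seg):
        lam = sum(seg) / len(seg)
        if lam == 0:
            return 0.0
        # the log(k!) terms are constant across splits and cancel
        return sum(k * math.log(lam) - lam for k in seg)
    best_tau, best_ll = None, -math.inf
    for tau in range(1, len(counts)):
        ll = loglik(counts[:tau]) + loglik(counts[tau:])
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau
```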
 Date Issued
 2004
 Identifier
 FSU_migr_etd2014
 Format
 Thesis
 Title
 Single- and Multiple-Objective Stochastic Programming Models with Applications to Aerodynamics.
 Creator

Croicu, Ana-Maria, Hussaini, M. Yousuff, Srivastava, Anuj, Kopriva, David, Wang, Qi, Department of Mathematics, Florida State University
 Abstract/Description

Deterministic design assumes that there is no uncertainty in the modeling parameters and, as a consequence, no variability in the simulation outputs. Therefore, deterministic optimal designs that are obtained without taking uncertainty into account are usually unreliable. This is the case with transonic shape optimization, where the randomness in the cruise Mach number might have a significant impact on the optimal geometric design. In this context, a stochastic search turns out to be more appropriate. Approaches to stochastic optimization have followed a variety of modeling philosophies, but little has been done to systematically compare different models. The goal of this thesis is to present a comparison between two stochastic optimization algorithms, with an emphasis on applications, especially airfoil shape optimization. Single-objective and multi-objective optimization programs are analyzed as well. The relationship between the expected minimum value (EMV) criterion and the minimum expected value (MEV) criterion is explored, and it is shown that, under favorable conditions, a better optimal point can be obtained via the EMV approach. Unfortunately, the advantages of using the EMV approach are far outweighed by its prohibitive computational cost.
 Date Issued
 2005
 Identifier
 FSU_migr_etd3027
 Format
 Thesis
 Title
 Statistical Modelling and Applications of Neural Spike Trains.
 Creator

Lawhern, Vernon, Wu, Wei, Contreras, Robert J., Srivastava, Anuj, Huffer, Fred, Niu, Xufeng, Department of Statistics, Florida State University
 Abstract/Description

In this thesis we investigate statistical modelling of neural activity in the brain. We first develop a framework that extends the state-space Generalized Linear Model (GLM) of Eden and colleagues [20] to include the effects of hidden states. These states, collectively, represent variables which are not observed (or even observable) in the modeling process but can nonetheless have an impact on the neural activity. We then develop a framework that allows us to input a priori target information into the model. We examine both of these modelling frameworks on motor cortex data recorded from monkeys performing different target-driven hand and arm movement tasks. Finally, we perform temporal coding analysis of sensory stimulation using principled statistical models and show the efficacy of our approach.
 Date Issued
 2011
 Identifier
 FSU_migr_etd3251
 Format
 Thesis
 Title
 Automated Face Tracking and Recognition.
 Creator

Hesher, Matthew Curtis, Erlebacher, Gordon, Srivastava, Anuj, Gallivan, Kyle, Department of Computer Science, Florida State University
 Abstract/Description

We have considered the problem of tracking and recognition using a three-dimensional representation of human faces. First we present a review of the research in the tracking and recognition fields, including a list of several commercially available face tracking and recognition systems. Next, two algorithms are described: one for tracking faces from observed images and one for recognition of faces from observed geometries. The tracking algorithm uses the 3D shape and texture of a human face to estimate the changing position and orientation of a real face in a video image sequence. The recognition algorithm uses principal component analysis (PCA) of range images generated from the 3D shape of a human face to create a database of low-dimensional face representations for efficient recognition. Range images are robust to illumination and texture variations and thus avoid some of the current limitations in face recognition.
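The recognition pipeline can be sketched as eigenface-style PCA applied to flattened range (depth) images; the images below are random stand-ins for actual range data.

```python
import numpy as np

def pca_gallery(range_images, k):
    """Build a k-dimensional PCA basis from flattened range images and
    project the gallery into it (eigenface-style, but on depth rather
    than intensity)."""
    X = np.array([im.ravel() for im in range_images], dtype=float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # principal directions = right singular vectors of the centered data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:k]
    return mean, basis, Xc @ basis.T

def identify(probe, mean, basis, gallery_coords):
    """Nearest-neighbor match of a probe range image in PCA space."""
    c = (probe.ravel() - mean) @ basis.T
    d = np.linalg.norm(gallery_coords - c, axis=1)
    return int(np.argmin(d))
```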
 Date Issued
 2003
 Identifier
 FSU_migr_etd4074
 Format
 Thesis
 Title
 Functional Data Analysis and Partial Shape Matching in the Square Root Velocity Framework.
 Creator

Robinson, Daniel T., Klassen, Eric, Reina, Laura, Bellenot, Steven, Mio, Washington, Srivastava, Anuj, Department of Mathematics, Florida State University
 Abstract/Description

We investigate two problems in elastic curve shape analysis, working within the context of the square root velocity (SRV) framework. The first of these is to develop specialized algorithms for the analysis of one-dimensional curves, which are just real-valued functions. In this particularly simple case, the elastic matching problem can be stated as a finite combinatorial problem in which the optimal solution can be found exactly. We also develop a method for groupwise alignment, and use it to compute Karcher means of collections of functions. Second, we consider the problem of finding optimal partial matches between curves in Euclidean space within the SRV framework, and present algorithms and heuristics to solve this problem. Finally, we give a brief overview of libsrvf, an open-source software library providing implementations of the algorithms developed in the course of this work.
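For real-valued functions the SRV transform itself is one line; a sketch on sampled functions, with finite differences standing in for the derivative:

```python
import numpy as np

def srvf(f, t):
    """Square root velocity function of a sampled real-valued f(t):
    q(t) = sign(f'(t)) * sqrt(|f'(t)|). The L2 distance between two
    SRVFs is the elastic distance underlying the SRV framework."""
    df = np.gradient(f, t)
    return np.sign(df) * np.sqrt(np.abs(df))

t = np.linspace(0.0, 1.0, 101)
q_line = srvf(t, t)      # f(t) = t   ->  q = 1 everywhere
q_quad = srvf(t**2, t)   # f(t) = t^2 ->  q^2 = |f'| = 2t (interior points)
```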
 Date Issued
 2012
 Identifier
 FSU_migr_etd5424
 Format
 Thesis
 Title
 Statistical Models on Human Shapes with Application to Bayesian Image Segmentation and Gait Recognition.
 Creator

Kaziska, David M., Srivastava, Anuj, Mio, Washington, Chicken, Eric, Wegkamp, Marten, Department of Statistics, Florida State University
 Abstract/Description

In this dissertation we develop probability models for human shapes and apply those probability models to the problems of image segmentation and human identi_cation by gait recognition. To build probability models on human shapes, we consider human shape to be realizations of random variables on a space of simple closed curves and a space of elastic curves. Both of these spaces are quotient spaces of in_nite dimensional manifolds. Our probability models arise through Tangent Principal...
Show moreIn this dissertation we develop probability models for human shapes and apply those probability models to the problems of image segmentation and human identi_cation by gait recognition. To build probability models on human shapes, we consider human shape to be realizations of random variables on a space of simple closed curves and a space of elastic curves. Both of these spaces are quotient spaces of in_nite dimensional manifolds. Our probability models arise through Tangent Principal Component Analysis, a method of studying probability models on manifolds by projecting them onto a tangent plane to the manifold. Since we put the tangent plane at the Karcher mean of sample shapes, we begin our study by examining statistical properties of Karcher means on manifolds. We derive theoretical results for the location of Karcher means on certain manifolds, and perform a simulation study of properties of Karcher means on our shape space. Turning to the speci_c problem of distributions on human shapes we examine alternatives for probability models and _nd that kernel density estimators perform well. We use this model to sample shapes and to perform shape testing. The _rst application we consider is human detection in infrared images. We pursue this application using Bayesian image segmentation, in which our proposed human in an image is a maximum likelihood estimate, obtained using a prior distribution on human shapes and a likelihood arising from a divergence measure on the pixels in the image. We then consider human identi_cation by gait recognition. We examine human gait as a cyclostationary process on the space of elastic curves and develop a metric on processes based on the geodesic distance between sequences on that space. 
We develop and demonstrate a framework for gait recognition based on this metric, which includes the following elements: automatic detection of gait cycles, interpolation to register gait cycles, computation of a mean gait cycle, and identification by matching a test cycle to the nearest member of a training set. We perform the matching both by an exhaustive search of the training set and through an expedited method using cluster-based trees and boosting.
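The Karcher mean and tangent-plane construction described above can be sketched on the simplest curved manifold, the unit sphere. This is an illustrative sketch only, not the dissertation's shape-space implementation; the gradient step size and tolerance are arbitrary choices.

```python
import numpy as np

def log_map(p, x):
    """Log map on the unit sphere: tangent vector at p pointing toward x,
    with length equal to the geodesic (arc) distance."""
    v = x - np.dot(x, p) * p
    nv = np.linalg.norm(v)
    theta = np.arccos(np.clip(np.dot(p, x), -1.0, 1.0))
    return np.zeros_like(p) if nv < 1e-12 else (theta / nv) * v

def exp_map(p, v):
    """Exp map on the unit sphere: follow tangent v from p along a geodesic."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * (v / nv)

def karcher_mean(points, iters=100, step=0.5, tol=1e-10):
    """Gradient descent on the Frechet variance: repeatedly move the
    estimate along the mean of the log maps to the sample points."""
    mu = points[0] / np.linalg.norm(points[0])
    for _ in range(iters):
        g = np.mean([log_map(mu, x) for x in points], axis=0)
        if np.linalg.norm(g) < tol:
            break
        mu = exp_map(mu, step * g)
    return mu
```

Tangent PCA then amounts to mapping each sample to the tangent space at the computed mean via `log_map` and running ordinary PCA on those tangent vectors.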
 Date Issued
 2005
 Identifier
 FSU_migr_etd3275
 Format
 Thesis
 Title
 On the Geometry of Hurwitz Surfaces.
 Creator

Vogeler, Roger, Bowers, Philip L., Heil, Wolfgang H., Klassen, Eric P., Quine, John R., Srivastava, Anuj, Department of Mathematics, Florida State University
 Abstract/Description

A Riemann surface of genus g has at most 84(g − 1) automorphisms. A Hurwitz surface is one for which this maximum is attained; the corresponding group of automorphisms is called a Hurwitz group. By uniformization, the surface admits a hyperbolic structure wherein the automorphisms act by isometry. Such isometries descend from the (2,3,7) triangle group T acting on the universal cover H2. We develop a combinatorial approach which leads to a classification of the conjugacy classes of hyperbolic elements of T, arranged by length. This allows us to study the closed geodesics of Hurwitz surfaces by performing calculations in the corresponding Hurwitz groups. We identify the systoles and other short curves on most of the Hurwitz surfaces of genus less than 10,000. We also determine which of these surfaces are chiral and which are amphichiral. In addition, we show that certain families of closed geodesics are simple on every Hurwitz surface.
 Date Issued
 2003
 Identifier
 FSU_migr_etd4544
 Format
 Thesis
 Title
 Riemannian Shape Analysis of Curves and Surfaces.
 Creator

Kurtek, Sebastian, Srivastava, Anuj, Klassen, Eric, Wu, Wei, Huffer, Fred, Dryden, Ian, Department of Statistics, Florida State University
 Abstract/Description

Shape analysis of curves and surfaces is a very important tool in many applications ranging from computer vision to bioinformatics and medical imaging. There are many difficulties when analyzing shapes of parameterized curves and surfaces. Firstly, it is important to develop representations and metrics such that the analysis is invariant to parameterization in addition to the standard transformations (rigid motion and scaling). Furthermore, under the chosen representations and metrics, the analysis must be performed on infinite-dimensional and sometimes nonlinear spaces, which poses an additional difficulty. In this work, we develop and apply methods which address these issues. We begin by defining a framework for shape analysis of parameterized open curves and extend these ideas to shape analysis of surfaces. We utilize the presented frameworks in various classification experiments spanning multiple application areas. In the case of curves, we consider the problem of clustering DT-MRI brain fibers, classification of protein backbones, modeling and segmentation of signatures, and statistical analysis of biosignals. In the case of surfaces, we perform disease classification using 3D anatomical structures in the brain, classification of handwritten digits by viewing images as quadrilateral surfaces, and finally classification of cropped facial surfaces. We provide two additional extensions of the general shape analysis frameworks that are the focus of this dissertation. The first one considers shape analysis of marked spherical surfaces where, in addition to the surface information, we are given a set of manually or automatically generated landmarks. This requires additional constraints on the definition of the reparameterization group and is applicable in many domains, especially medical imaging and graphics. Second, we consider reflection symmetry analysis of planar closed curves and spherical surfaces. 
Here, we also provide an example of disease detection based on brain asymmetry measures. We close with a brief summary and a discussion of open problems, which we plan on exploring in the future.
 Date Issued
 2012
 Identifier
 FSU_migr_etd4963
 Format
 Thesis
 Title
 A Novel Riemannian Metric for Analyzing Spherical Functions with Applications to HARDI Data.
 Creator

Ncube, Sentibaleng, Srivastava, Anuj, Klassen, Eric, Wu, Wei, Niu, Xufeng, Department of Statistics, Florida State University
 Abstract/Description

We propose a novel Riemannian framework for analyzing orientation distribution functions (ODFs), or their probability density functions (PDFs), in HARDI data sets for use in comparing, interpolating, averaging, and denoising PDFs. This is accomplished by separating shape and orientation features of PDFs, and then analyzing them separately under their own Riemannian metrics. We formulate the action of the rotation group on the space of PDFs, and define the shape space as the quotient space of PDFs modulo the rotations. In other words, any two PDFs are compared in: (1) shape by rotationally aligning one PDF to another, using the Fisher-Rao distance on the aligned PDFs, and (2) orientation by comparing their rotation matrices. This idea improves upon the results from using the Fisher-Rao metric in analyzing PDFs directly, a technique that is being used increasingly, and leads to geodesic interpolations that are biologically feasible. This framework leads to definitions and efficient computations for the Karcher mean that provide tools for improved interpolation and denoising. We demonstrate these ideas, using an experimental setup involving several PDFs.
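The shape-comparison step above rests on the Fisher-Rao distance between densities. A minimal numerical sketch (not the HARDI pipeline itself, and omitting the rotational alignment step): under the square-root map ψ = √p, the Fisher-Rao metric becomes the L2 metric on the unit Hilbert sphere, so the distance reduces to an arc length.

```python
import numpy as np

def fisher_rao_distance(p, q, dx):
    """Fisher-Rao geodesic distance between two discretized PDFs on a
    common grid with spacing dx. The square-root densities lie on the
    unit sphere in L2, so the distance is arccos of their inner product."""
    psi_p, psi_q = np.sqrt(p), np.sqrt(q)
    inner = np.sum(psi_p * psi_q) * dx
    return np.arccos(np.clip(inner, -1.0, 1.0))
```

For two unit-variance Gaussians separated by a shift, this distance grows monotonically with the shift and is symmetric in its arguments, unlike, say, the Kullback-Leibler divergence.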
 Date Issued
 2011
 Identifier
 FSU_migr_etd5064
 Format
 Thesis
 Title
 A Riemannian Framework for Annotated Curves Analysis.
 Creator

Liu, Wei, Srivastava, Anuj, Zhang, Jinfeng, Klassen, Eric P., Huffer, Fred, Department of Statistics, Florida State University
 Abstract/Description

We propose a Riemannian framework for shape analysis of annotated curves, curves that have certain attributes defined along them, in addition to their geometries. These attributes may be in the form of vector-valued functions, discrete landmarks, or symbolic labels, and provide auxiliary information along the curves. The resulting shape analysis, that is, comparing, matching, and deforming, is naturally influenced by the auxiliary functions. Our idea is to construct curves in higher dimensions using both geometric and auxiliary coordinates, and analyze shapes of these curves. The difficulty comes from the need for removing different groups from different components: the shape is invariant to rigid motion, global scale, and reparameterization, while the auxiliary component is usually invariant only to the reparameterization. Thus, the removal of some transformations (rigid motion and global scale) is restricted only to the geometric coordinates, while the reparameterization group is removed for all coordinates. We demonstrate this framework using a number of experiments.
 Date Issued
 2011
 Identifier
 FSU_migr_etd4997
 Format
 Thesis
 Title
 Monte Carlo Likelihood Estimation for Conditional Autoregressive Models with Application to Sparse Spatiotemporal Data.
 Creator

Bain, Rommel, Huffer, Fred, Becker, Betsy, Niu, Xufeng, Srivastava, Anuj, Department of Statistics, Florida State University
 Abstract/Description

Spatiotemporal modeling is increasingly used in a diverse array of fields, such as ecology, epidemiology, health care research, transportation, economics, and other areas where data arise from a spatiotemporal process. Spatiotemporal models describe the relationship between observations collected from different spatiotemporal sites. The modeling of spatiotemporal interactions arising from spatiotemporal data is done by incorporating the space-time dependence into the covariance structure. A main goal of spatiotemporal modeling is the estimation and prediction of the underlying process that generates the observations under study and the parameters that govern the process. Furthermore, analysis of the spatiotemporal correlation of variables can be used for estimating values at sites where no measurements exist. In this work, we develop a framework for estimating quantities that are functions of complete spatiotemporal data when the spatiotemporal data is incomplete. We present two classes of conditional autoregressive (CAR) models (the homogeneous CAR (HCAR) model and the weighted CAR (WCAR) model) for the analysis of sparse spatiotemporal data (the log of monthly mean zooplankton biomass) collected on a spatiotemporal lattice by the California Cooperative Oceanic Fisheries Investigations (CalCOFI). These models allow for spatiotemporal dependencies between nearest neighbor sites on the spatiotemporal lattice. Typically, CAR model likelihood inference is quite complicated because of the intractability of the CAR model's normalizing constant. Sparse spatiotemporal data further complicates likelihood inference. We implement Monte Carlo likelihood (MCL) estimation methods for parameter estimation of our HCAR and WCAR models. Monte Carlo likelihood estimation provides an approximation for intractable likelihood functions. We demonstrate our framework by giving estimates for several different quantities that are functions of the complete CalCOFI time series data.
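The Monte Carlo likelihood idea, approximating an intractable normalizing-constant ratio by importance sampling from the model at a reference parameter, can be illustrated on a toy one-dimensional model. This is not the CAR model of the dissertation; the unnormalized density exp(-θx²) and the parameter values are invented for illustration, chosen because Z(θ) = √(π/θ) is known in closed form.

```python
import numpy as np

def mc_norm_const_ratio(theta, theta0, samples):
    """Geyer-style importance-sampling estimate of Z(theta)/Z(theta0),
    using draws from the model at reference parameter theta0.
    Unnormalized density here: h_theta(x) = exp(-theta * x**2)."""
    log_w = -(theta - theta0) * samples**2
    return np.mean(np.exp(log_w))

rng = np.random.default_rng(0)
theta0 = 1.0
# exp(-theta0 * x^2) normalized is N(0, 1/(2*theta0))
xs = rng.normal(0.0, np.sqrt(1.0 / (2 * theta0)), size=200_000)
est = mc_norm_const_ratio(1.5, theta0, xs)
exact = np.sqrt(theta0 / 1.5)   # Z(theta) = sqrt(pi/theta) analytically
```

In MCL one maximizes the resulting approximate log-likelihood over θ; here the closed-form ratio lets us check the estimator directly.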
 Date Issued
 2013
 Identifier
 FSU_migr_etd7283
 Format
 Thesis
 Title
 Non-Intrusive Methods for Probabilistic Uncertainty Quantification and Global Sensitivity Analysis in Nonlinear Stochastic Phenomena.
 Creator

Liu, Yaning, Hussaini, M. Yousuff, Okten, Giray, Srivastava, Anuj, Sussman, Mark, Department of Mathematics, Florida State University
 Abstract/Description

The objective of this work is to quantify uncertainty and perform global sensitivity analysis for nonlinear models with a moderate or large number of stochastic parameters. We implement non-intrusive methods that do not require modification of the programming code of the underlying deterministic model. To avoid the curse of dimensionality, two methods, namely sampling methods and high-dimensional model representation, are employed to propagate uncertainty and compute global sensitivity indices. Variance-based global sensitivity analysis identifies significant and insignificant model parameters. It also provides a basis for reducing a model's stochastic dimension by freezing identified insignificant model parameters at their nominal values. The dimension-reduced model can then be analyzed efficiently. We use uncertainty quantification and global sensitivity analysis in three applications. The first application is to the Rothermel wildland surface fire spread model, which consists of around 80 nonlinear algebraic equations and 24 parameters. We find the reduced models for the selected model outputs and apply efficient sampling methods to quantify the uncertainty. High-dimensional model representation is also applied to the Rothermel model for comparison. The second application is to a recently developed biological model that describes inflammatory host response to a bacterial infection. The model involves four nonlinear coupled ordinary differential equations, and the dimension of the stochastic space is 16. We compute global sensitivity indices for all parameters and build a dimension-reduced model. The sensitivity results, combined with experiments, can improve the validity of the model. The third application quantifies the uncertainty of weather derivative models and investigates model robustness based on global sensitivity analysis. Three commonly used weather derivative models for the daily average temperature are considered. 
The one which is least influenced by an increase of the parametric uncertainty level is identified as robust. In summary, the following contributions are made in this dissertation: 1. The optimization of sensitivity derivative enhanced sampling that guarantees variance reduction and improved estimation of stochastic moments. 2. The combination of optimized sensitivity derivative enhanced sampling with randomized quasi-Monte Carlo sampling, and adaptive Monte Carlo sampling, to achieve higher convergence rates. 3. The construction of cut-HDMR component functions based on Gauss quadrature points, which results in a more accurate surrogate model; derivation of an integral form of low-order partial variances based on cut-HDMR; and efficient computation of global sensitivity analysis based on cut-HDMR. 4. The application of efficient sampling methods, RS-HDMR, and cut-HDMR for the quantification of Rothermel's wildland fire surface spread model. 5. The uncertainty quantification and global sensitivity analysis of a newly developed immune response model with parametric uncertainty. 6. The uncertainty quantification of weather derivative models and the analysis of model robustness based on global sensitivity analysis.
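Variance-based global sensitivity analysis as described above can be sketched with a standard pick-freeze estimator of the first-order Sobol index. This is an illustrative sketch on an invented linear test function, not the Rothermel or immune-response models; sample size and seed are arbitrary.

```python
import numpy as np

def first_order_sobol(f, d, i, n, rng):
    """Pick-freeze estimate of the first-order Sobol index S_i.
    Two independent U(0,1)^d sample matrices are drawn; in the second,
    coordinate i is 'frozen' to the first matrix's values, so the
    covariance of the two outputs estimates Var(E[Y | X_i])."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    AB = B.copy()
    AB[:, i] = A[:, i]          # freeze input i across the two designs
    yA, yAB = f(A), f(AB)
    return np.cov(yA, yAB)[0, 1] / np.var(yA, ddof=1)
```

For Y = 4X1 + 2X2 + X3 with independent uniform inputs, the exact first-order index of X1 is 16/21, since each Xi contributes a_i²/12 to the variance; parameters with indices near zero are candidates for freezing at nominal values.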
 Date Issued
 2013
 Identifier
 FSU_migr_etd8681
 Format
 Thesis
 Title
 2D Affine and Projective Shape Analysis, and Bayesian Elastic Active Contours.
 Creator

Bryner, Darshan W., Srivastava, Anuj, Klassen, Eric, Gallivan, Kyle, Huffer, Fred, Wu, Wei, Zhang, Jinfeng, Department of Statistics, Florida State University
 Abstract/Description

An object of interest in an image can be characterized to some extent by the shape of its external boundary. Current techniques for shape analysis consider the notion of shape to be invariant to the similarity transformations (rotation, translation, and scale), but oftentimes in 2D images of 3D scenes, perspective effects can transform shapes of objects in a more complicated manner than what can be modeled by the similarity transformations alone. Therefore, we develop a general Riemannian framework for shape analysis where metrics and related quantities are invariant to larger groups, the affine and projective groups, that approximate such transformations that arise from perspective skews. Highlighting two possibilities for representing object boundaries, namely ordered points (or landmarks) and parameterized curves, we study different combinations of these representations (points and curves) and transformations (affine and projective). Specifically, we provide solutions to three out of four situations and develop algorithms for computing geodesics and intrinsic sample statistics, leading up to Gaussian-type statistical models, and classifying test shapes using such models learned from training data. In the case of parameterized curves, an added issue is to obtain invariance to the reparameterization group. The geodesics are constructed by particularizing the path-straightening algorithm to geometries of the current manifolds and are used, in turn, to compute shape statistics and Gaussian-type shape models. We demonstrate these ideas using a number of examples from shape and activity recognition. After developing such Gaussian-type shape models, we present a variational framework for naturally incorporating these shape models as prior knowledge to guide active contours for boundary extraction in images. 
This so-called Bayesian active contour framework is especially suitable for images where boundary estimation is difficult due to low contrast, low resolution, and the presence of noise and clutter. In traditional active contour models, curves are driven towards a minimum of an energy composed of image and smoothing terms. We introduce an additional shape term based on shape models of known relevant shape classes. The minimization of this total energy, using iterated gradient-based updates of curves, leads to an improved segmentation of object boundaries. We demonstrate this Bayesian approach to segmentation using a number of shape classes in many imaging scenarios, including the synthetic imaging modalities of SAS (synthetic aperture sonar) and SAR (synthetic aperture radar), for which accurate boundary extractions are notoriously difficult to obtain. In practice, the training shapes used for prior-shape models may be collected from viewing angles different from those for the test images and thus may exhibit a shape variability brought about by perspective effects. Therefore, by allowing for a prior shape model to be invariant to, say, affine transformations of curves, we propose an active contour algorithm where the resulting segmentation is robust to perspective skews.
 Date Issued
 2013
 Identifier
 FSU_migr_etd8534
 Format
 Thesis
 Title
 Elastic Shape Analysis of RNAs and Proteins.
 Creator

Laborde, Jose M., Srivastava, Anuj, Zhang, Jinfeng, Klassen, Eric, McGee, Daniel, Department of Statistics, Florida State University
 Abstract/Description

Proteins and RNAs are molecular machines performing biological functions in the cells of all organisms. Automatic comparison and classification of these biomolecules are fundamental yet open problems in the field of Structural Bioinformatics. An outstanding unsolved issue is the definition and efficient computation of a formal distance between any two biomolecules. Current methods use alignment scores, which are not proper distances, to derive statistical tests for comparison and classification. This work applies Elastic Shape Analysis (ESA), a method recently developed in computer vision, to construct rigorous mathematical and statistical frameworks for the comparison, clustering, and classification of proteins and RNAs. ESA treats biomolecular structures as 3D parameterized curves, which are represented with a special map called the square-root velocity function (SRVF). In the resulting shape space of elastic curves, one can perform statistical analysis of curves as if they were random variables. One can compare, match, and deform one curve into another, as well as compute averages and covariances of curve populations, and perform hypothesis testing and classification of curves according to their shapes. We have successfully applied ESA to the comparison and classification of protein and RNA structures. We further extend the ESA framework to incorporate additional non-geometric information that tags the shape of the molecules (namely, the sequence of nucleotide/amino-acid letters for RNAs/proteins and, in the latter case, also the labels for the so-called secondary structure). The biological representation is chosen such that the ESA framework continues to be mathematically formal. We have achieved superior classification of RNA functions compared to state-of-the-art methods on benchmark RNA datasets, which has led to the publication of this work in the journal Nucleic Acids Research (NAR). 
Based on the ESA distances, we have also developed a fast method to classify protein domains by using a representative set of protein structures generated by a clustering-based technique we call Multiple Centroid Class Partitioning (MCCP). Comparison with other standard approaches showed that MCCP significantly improves the accuracy while keeping the representative set smaller than those of the other methods. The current schemes for the classification and organization of proteins (such as SCOP and CATH) assume a discrete space of their structures, where a protein is classified into one and only one class in a hierarchical tree structure. Our recent study, and studies by other researchers, showed that the protein structure space is more continuous than discrete. To capture the complex but quantifiable continuous nature of protein structures, we propose to organize these molecules using a network model, where individual proteins are mapped to possibly multiple nodes of classes, each associated with a probability. Structural classes will then be connected to form a network based on overlaps of corresponding probability distributions in the structural space.
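The square-root velocity function mentioned above has a simple discrete form; a minimal sketch follows (the curve and grid are illustrative, and this omits the reparameterization-invariant matching that ESA builds on top of the representation):

```python
import numpy as np

def srvf(curve, t):
    """Square-root velocity function of a parameterized curve in R^n:
    q(t) = c'(t) / sqrt(|c'(t)|). The L2 distance between SRVFs is
    invariant to translation, and to scale after normalizing ||q||."""
    deriv = np.gradient(curve, t, axis=0)          # finite-difference c'(t)
    speed = np.linalg.norm(deriv, axis=1)
    speed = np.maximum(speed, 1e-12)               # guard against |c'| = 0
    return deriv / np.sqrt(speed)[:, None]
```

For a straight line the SRVF is constant, and translating the curve leaves it unchanged, which is the invariance the abstract relies on.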
 Date Issued
 2013
 Identifier
 FSU_migr_etd8586
 Format
 Thesis
 Title
 Shape Analysis of Curves in Higher Dimensions.
 Creator

Wells, Linda Crystal, Klassen, Eric, Chicken, Eric, Srivastava, Anuj, Mio, Washington, Nichols, Warren, Department of Mathematics, Florida State University
 Abstract/Description

In this dissertation we will discuss geodesics between open curves and also between closed curves in Rn where n ≥ 2. In order to calculate these geodesics, we will form a Riemannian metric on a space of smooth curves with nonvanishing derivative. The metric will be invariant with respect to scaling, translation, rotation, and reparametrization. Using this metric we will define a distance between two curves invariant to the above-mentioned transformations. This distance function will be defined utilizing isometries that map our curves into a subspace of L2, where geodesics are already defined, and then map that geodesic back to the space of curves we are working in. Then we apply our metric to the geodesic to define the distance between the two initial curves. Our applications include the categorization of 2D open curves, 3D open curves, and 3D closed curves, including facial curves. The case of curves in R2 was studied by Laurent Younes, Peter W. Michor, Jayant Shah, and David Mumford.
 Date Issued
 2013
 Identifier
 FSU_migr_etd7658
 Format
 Thesis
 Title
 Statistical Analysis of Trajectories on Riemannian Manifolds.
 Creator

Su, Jingyong, Srivastava, Anuj, Klassen, Erik, Huffer, Fred, Zhang, Jinfeng, Department of Statistics, Florida State University
 Abstract/Description

This thesis consists of two distinct topics. First, we present a framework for estimation and analysis of trajectories on Riemannian manifolds. Second, we propose a framework for detecting, classifying, and estimating shapes in point cloud data. This thesis mainly focuses on statistical analysis of trajectories that take values on nonlinear manifolds. There are many difficulties when analyzing temporal trajectories on nonlinear manifolds. First, the observed data are always noisy and discrete, at unsynchronized times. Second, trajectories are observed under arbitrary temporal evolutions. In this work, we first address the problem of estimating full smooth trajectories on nonlinear manifolds using only a set of time-indexed points, for use in interpolation, smoothing, and prediction of dynamic systems. Furthermore, we study statistical analysis of trajectories that take values on nonlinear Riemannian manifolds and are observed under arbitrary temporal evolutions. The problem of analyzing such temporal trajectories, including registration, comparison, modeling, and evaluation, arises in many applications. We introduce a quantity that provides both a cost function for temporal registration and a proper distance for comparison of trajectories. This distance, in turn, is used to define statistical summaries, such as the sample means and covariances, of given trajectories and Gaussian-type models to capture their variability. Both theoretical proofs and experimental results are provided to validate our work. The problems of detecting, classifying, and estimating shapes in point cloud data are important due to their general applicability in image analysis, computer vision, and graphics. They are challenging because the data is typically noisy, cluttered, and unordered. 
We study these problems using a fully statistical model where the data is modeled using a Poisson process on the object's boundary (curves or surfaces), corrupted by additive noise and a clutter process. Using likelihood functions dictated by the model, we develop a generalized likelihood ratio test for detecting a shape in a point cloud. Additionally, we develop a procedure for estimating most likely shapes in observed point clouds under given shape hypotheses. We demonstrate this framework using examples of 2D and 3D shape detection and estimation in both real and simulated data, and the use of this framework in shape retrieval from a 3D shape database.
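Estimating smooth trajectories from time-indexed points on a manifold relies on geodesic interpolation between consecutive samples; on the unit sphere this is the familiar slerp formula. This is a sketch of that single building block under assumed unit-vector inputs, not the dissertation's full smoothing framework.

```python
import numpy as np

def slerp(p, q, tau):
    """Geodesic (great-circle) interpolation between unit vectors p and q,
    for tau in [0, 1]; the result stays on the unit sphere."""
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if theta < 1e-12:
        return p
    return (np.sin((1 - tau) * theta) * p + np.sin(tau * theta) * q) / np.sin(theta)
```

Chaining slerp across an ordered list of time-indexed points gives a piecewise-geodesic trajectory that can then be smoothed or resampled at synchronized times.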
 Date Issued
 2013
 Identifier
 FSU_migr_etd7619
 Format
 Thesis
 Title
 Improving Inference in Population Genetics Using Statistics.
 Creator

Palczewski, Michal, Beerli, Peter, Srivastava, Anuj, Erlebacher, Gordon, Lemmon, Alan, Slice, Dennis, Department of Scientific Computing, Florida State University
 Abstract/Description

My studies at Florida State University focused on using computers and statistics to solve problems in population genetics. I have created models and algorithms that have the potential to improve the statistical analysis of population genetics. Population genetic data is often noisy and thus requires the use of statistics to draw meaning from it. This dissertation consists of three main projects. The first project involves parallel evaluation and model inference on multilocus data sets. Bayes factors are used for model selection. We used thermodynamic integration to calculate these Bayes factors. To take advantage of parallel processing and distribute the calculation across a high-performance computing cluster, I developed a new method to split the Bayes factor calculation into independent units and then combine them later. The next project, the Transition Probability Structured Coalescence [TSPC], involved the creation of a continuous approximation to the discrete migration process used in the structured coalescent that is commonly used to infer migration rates in biological populations. Previous methods required the simulation of these migration events, but there is little power to estimate the time and occurrence of these events. In my method, they are replaced with a one-dimensional numerical integration. The third project involved the development of a model for the inference of the time of speciation. Previous models used a set time to delineate a speciation event, treating speciation as a point process. Instead, this point process is replaced with a parameterized speciation model where each lineage speciates according to a parameterized distribution. This is effectively a broader model that allows both very quick and very slow speciation. It also includes the previous model as a limiting case. 
These three projects, although rather independent of each other, improve the inference of population genetic models and thus allow better analyses of genetic data in fields such as phylogeography, conservation, and epidemiology.
 Date Issued
 2013
 Identifier
 FSU_migr_etd7540
 Format
 Thesis
 Title
 Throughput Improvement in Multihop Ad Hoc Network Using Adaptive Carrier Sensing Range and Contention Window.
 Creator

Acholem, Onyekachi, Harvey, Bruce, Zhang, Zhenghao, Srivastava, Anuj, Roberts, Rodney, Foo, Simon, Department of Electrical and Computer Engineering, Florida State University
 Abstract/Description

Demand for decentralized, wireless, ad hoc systems, where hosts are free to leave or join, to replace wired communication systems has seen phenomenal growth. Such networks need little or no infrastructure support to operate. Deploying these networks, as in wireless sensor networks (WSNs), opens new frontiers and opportunities to collect and process data from remote locations. The large number of nodes in these wireless networks invariably results in higher node densities and increased levels of network interference. Interference mitigation is therefore crucial to ensuring these networks operate efficiently. The lack of network planning and regulation for such networks often requires the access strategy to be distributed and adaptive to network conditions. The goal of this research is to design an algorithm, employing mathematical tools, that optimizes spatial reuse among nodes in the ad hoc network so that multiple communications between nodes can proceed simultaneously, thereby maximizing network throughput. To maximize spatial reuse, the IEEE 802.11 Medium Access Control (MAC) protocol is modified so that each transmitting node can fine-tune its data rate and carrier sense range adaptively based on minimal local receiver-response data. All nodes must be able to detect and communicate with their neighbors in order to determine the network structure, execute network functions, and transmit collated information back to the remote node. The network topology is discovered using clustering schemes such as the K-means technique, which minimizes the Euclidean distance between random nodes. Each cluster has a cluster head that keeps track of local information about the nodes in its cluster.
A further goal of this research is to demonstrate that the physical carrier sensing incorporated in the 802.11 MAC protocol can adaptively optimize the sensing threshold of the nodes and minimize interference within the network without the benefit of the request-to-send/clear-to-send (RTS/CTS) handshake of virtual carrier sensing. Considerable nodal energy and packet overhead are saved by turning off the RTS/CTS handshake process. An analytic design is presented for acquiring the optimal sensing threshold given a network topology, data rate, and transmit/receive power of the nodes. Two major issues to be addressed in improving spatial reuse are: (1) the optimal range of transmit data rate and carrier sense threshold for maximum network capacity, and (2) the relationship between the carrier sense threshold and the contention window. Furthermore, results from this research show that tuning the carrier sense threshold and contention window offers several advantages, including considerably higher aggregate throughput than that obtained from a static carrier sense threshold network with no previous knowledge of the network topology. This enables nodes to sustain a high data rate while keeping the adverse effect of collisions on neighboring simultaneous communications to a minimum. In the end, the communication protocol is improved to achieve better utilization of the scarce wireless spectrum. The simulation and performance evaluation tools used in this work are the Network Simulator 2 (NS-2) and the AWK and Perl programming languages.
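The topology-discovery step described above can be sketched with a minimal K-means (Lloyd's algorithm) implementation, with cluster heads chosen as the nodes nearest each centroid. Node positions and parameters are invented for the example; this is not the dissertation's simulation setup.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: minimize Euclidean distance to centroids."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each node to its nearest centroid
        d = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned nodes
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

# two well-separated groups of node positions
rng = np.random.default_rng(1)
nodes = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(10, 0.5, (20, 2))])
labels, centroids = kmeans(nodes, k=2)
# cluster head: the node closest to its cluster's centroid
heads = [int(np.argmin(np.linalg.norm(nodes - c, axis=1))) for c in centroids]
```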
 Date Issued
 2010
 Identifier
 FSU_migr_etd0108
 Format
 Thesis
 Title
 Uncertainty Quantification of Nonlinear Stochastic Phenomena.
 Creator

Jimenez, Edwin, Hussaini, M. Y., Srivastava, Anuj, Sussman, Mark, Kopriva, David, Department of Mathematics, Florida State University
 Abstract/Description

The present work quantifies uncertainty in two nonlinear problems using efficient sampling methods and polynomial chaos expansions. The first application is the Rothermel wildland fire spread model. This model consists of a nonlinear system of algebraic and transcendental equations relating environmental variables (input parameter groups), such as fuel type, fuel moisture, terrain, and wind, that describe the fire environment. The second application quantifies aeroacoustic uncertainty of a Joukowski airfoil in stochastic vortical gusts. The stochastic gusts are described by random variables that model the gust amplitudes and frequency. The quantification of uncertainty is measured in terms of statistical moments. We construct moment estimates using a variance reduction procedure as well as an efficient stochastic collocation method.
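A one-variable stochastic collocation sketch conveys the idea (illustrative only; the fire-spread and gust models are far richer): Gauss-Hermite quadrature turns moments of a nonlinear function of a Gaussian input into a small weighted sum of model evaluations, checkable here against the lognormal distribution's known moments.

```python
import numpy as np

# Stochastic collocation in one random dimension: for g(Z) with Z ~ N(0,1),
# probabilists' Gauss-Hermite quadrature gives moments from only 10 runs
# of the "model" g, here g(z) = exp(z) so the exact answers are known.
nodes, weights = np.polynomial.hermite_e.hermegauss(10)
w = weights / np.sqrt(2 * np.pi)   # normalize against the standard normal pdf
g = np.exp(nodes)                  # model evaluated at the collocation points
mean = np.sum(w * g)               # exact value: exp(1/2)
var = np.sum(w * g**2) - mean**2   # exact value: e^2 - e
```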
 Date Issued
 2009
 Identifier
 FSU_migr_etd3511
 Format
 Thesis
 Title
 Functional Component Analysis and Regression Using Elastic Methods.
 Creator

Tucker, J. Derek, Srivastava, Anuj, Wu, Wei, Klassen, Eric, Huffer, Fred, Department of Statistics, Florida State University
 Abstract/Description

Constructing generative models for functional observations is an important task in statistical functional data analysis. In general, functional data contain both phase (or x, or horizontal) and amplitude (or y, or vertical) variability. Traditional methods often ignore the phase variability and focus solely on the amplitude variation, using cross-sectional techniques such as functional principal component analysis for dimension reduction and regression for data modeling. Ignoring phase variability leads to a loss of structure in the data and inefficiency in data models. Moreover, most methods use a "pre-processing" alignment step to remove the phase variability, without considering a more natural joint solution. This dissertation presents three approaches to this problem. The first relies on separating the phase (x-axis) and amplitude (y-axis) and then modeling these components using joint distributions. This separation, in turn, is performed using a technique called elastic alignment of functions, which involves a new mathematical representation of functional data. Then, using separate principal components for the phase and amplitude components, it imposes joint probability models on the principal coefficients of these components while respecting the nonlinear geometry of the phase representation space. The second combines the phase variability into the objective function for two component analysis methods, functional principal component analysis and functional principal least squares. This creates a more complete solution, as the phase variability is removed while the components are simultaneously extracted. The third approach combines the phase variability into the functional linear regression model and then extends the model to logistic and multinomial logistic regression. By incorporating the phase variability, a more parsimonious regression model is obtained and, therefore, more accurate prediction of observations is achieved.
These models are then easily extended from functional data to curves (which are essentially functions in R2) to perform regression with curves as predictors. These ideas are demonstrated using random sampling for models estimated from simulated and real datasets, and are shown to be superior to models that ignore phase-amplitude separation. Furthermore, the models are applied to classification of functional data and achieve high performance in applications involving SONAR signals of underwater objects, handwritten signatures, periodic body movements recorded by smart phones, and physiological data.
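The elastic representation referred to above is, in the SRSF framework named elsewhere on this page, the square-root slope function q = sign(f′)·√|f′|. A minimal numerical sketch (assuming a uniform grid; illustrative, not the dissertation's code) computes it and checks one of its basic identities, that ‖q‖² equals the total variation of f.

```python
import numpy as np

def srsf(f, t):
    """Square-Root Slope Function q = sign(f') * sqrt(|f'|), numerically."""
    df = np.gradient(f, t)
    return np.sign(df) * np.sqrt(np.abs(df))

t = np.linspace(0.0, 1.0, 2001)
f = np.sin(2 * np.pi * t)
q = srsf(f, t)
# The squared L2 norm of q recovers the total variation of f
# (equal to 4 for one full sine period), one reason this representation
# is convenient for elastic (warping-invariant) comparisons.
tv = np.sum(q ** 2) * (t[1] - t[0])
```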
 Date Issued
 2014
 Identifier
 FSU_migr_etd9106
 Format
 Thesis
 Title
 Parametric and Nonparametric Spherical Regression with Diffeomorphisms.
 Creator

Rosenthal, Michael, Srivastava, Anuj, Wu, Wei, Klassen, Eric, Pati, Debdeep, Department of Statistics, Florida State University
 Abstract/Description

Spherical regression explores relationships between pairs of variables on spherical domains. Spherical data have become more prevalent in biological, gaming, geographical, and meteorological investigations, creating a need for tools that analyze such data. Previous works on spherical regression have focused on rigid parametric models or nonparametric kernel smoothing methods, leaving a large gap with no intermediate options available. This work develops two such intermediate models: a parametric model using projective linear transformations and a nonparametric model using diffeomorphic maps from a sphere to itself. The models are estimated in a maximum-likelihood framework using gradient-based optimizations. For the parametric model, an efficient Newton-Raphson algorithm is derived and asymptotic analysis is developed. A first-order roughness penalty is specified for the nonparametric model using the Jacobian of diffeomorphisms. The prediction performance of the proposed models is compared with state-of-the-art methods using simulated and real data involving plate tectonics, cloud deformations, wind, accelerometer, bird migration, and vectorcardiogram data.
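The rigid parametric baseline that these models generalize, rotation regression on the sphere, has a closed-form least-squares estimator (the orthogonal Procrustes solution via SVD). A sketch on synthetic noise-free data, not taken from the dissertation:

```python
import numpy as np

def fit_rotation(X, Y):
    """Least-squares rotation R so that Y ~ X @ R.T (i.e. y_i = R x_i),
    for unit vectors stored as rows of X and Y (orthogonal Procrustes)."""
    U, _, Vt = np.linalg.svd(Y.T @ X)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against reflections
    return U @ np.diag([1.0, 1.0, d]) @ Vt

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # points on the unit sphere
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
Y = X @ R_true.T                               # rotated responses
R_hat = fit_rotation(X, Y)
```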
 Date Issued
 2014
 Identifier
 FSU_migr_etd9082
 Format
 Thesis
 Title
 Developing SRSF Shape Analysis Techniques for Applications in Neuroscience and Genomics.
 Creator

Wesolowski, Sergiusz, Wu, Wei, Bertram, R. (Richard), Srivastava, Anuj, Beerli, Peter, Mio, Washington, Florida State University, College of Arts and Sciences, Department of Mathematics
 Abstract/Description

This dissertation focuses on exploring the capabilities of the SRSF statistical shape analysis framework through various applications. Each application gives rise to a specific mathematical shape analysis model. The theoretical investigation of the models, driven by real data problems, gives rise to new tools and theorems necessary to conduct sound inference in the space of shapes. From a theoretical standpoint, robustness results are provided for the estimation of model parameters, and an ANOVA-like statistical testing procedure is discussed. The projects were the result of collaboration between theoretical and application-focused research groups: the Shape Analysis Group in the Department of Statistics at Florida State University, the Center of Genomics and Personalized Medicine at FSU, and FSU's Department of Neuroscience. As a consequence, each of the projects consists of two aspects: the theoretical investigation of the mathematical model and the application driven by a real-life problem. The application components are similar from the data modeling standpoint: in each case the problem is set in an infinite-dimensional space, elements of which are experimental data points that can be viewed as shapes. The three projects are: "A new framework for Euclidean summary statistics in the neural spike train space": this project provides a statistical framework for analyzing spike train data and a new noise-removal procedure for neural spike trains; the framework adapts the SRSF elastic metric to the space of point patterns to provide a new notion of distance. "SRSF shape analysis for sequencing data reveal new differentiating patterns": this project uses the shape interpretation of Next Generation Sequencing data to provide a new point of view on exon-level gene activity; the novel approach reveals new differential gene behavior that cannot be captured by state-of-the-art techniques. Code is available online in a GitHub repository.
"How changes in shape of nucleosomal DNA near TSS influence changes of gene expression": the result of this work is a novel shape analysis model explaining the relation between changes in the arrangement of DNA on nucleosomes and changes in differential gene expression.
 Date Issued
 2017
 Identifier
 FSU_FALL2017_Wesolowski_fsu_0071E_14177
 Format
 Thesis
 Title
 Riemannian Manifold Trust-Region Methods with Applications to Eigenproblems.
 Creator

Baker, Christopher Grover, Gallivan, Kyle, Absil, Pierre-Antoine, Krothapalli, Anjaneyulu, Erlebacher, Gordon, Srivastava, Anuj, Hussaini, Yousuff, Department of Scientific Computing, Florida State University
 Abstract/Description

This thesis presents and evaluates a generic algorithm for incrementally computing the dominant singular subspaces of a matrix. The relationship between the generality of the results and the necessary computation is explored, and it is shown that more efficient computation can be obtained by relaxing the algebraic constraints on the factorization. The performance of this method, both numerical and computational, is discussed in terms of the algorithmic parameters, such as block size and acceptance threshold. Bounds on the error are presented, along with a posteriori approximations of these bounds. Finally, a group of methods is proposed which iteratively improve the accuracy of the computed results and the quality of the bounds.
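The expand-and-truncate idea behind incremental dominant-subspace tracking can be sketched as follows. This is a simplified illustration, not the thesis algorithm; on an exactly rank-5 matrix the streamed result matches a batch SVD, while in general each truncation introduces the kind of error the thesis bounds.

```python
import numpy as np

def update_subspace(U, S, C, k):
    """Fold new columns C into a rank-k dominant left subspace (U, S):
    expand with the part of C outside span(U), then re-truncate."""
    proj = U.T @ C
    Q, R = np.linalg.qr(C - U @ proj)          # new directions + coefficients
    K = np.block([[np.diag(S), proj],
                  [np.zeros((Q.shape[1], len(S))), R]])
    Uk, Sk, _ = np.linalg.svd(K, full_matrices=False)  # small dense SVD
    U_new = np.hstack([U, Q]) @ Uk
    return U_new[:, :k], Sk[:k]

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 5)) @ rng.normal(size=(5, 40))   # exactly rank 5
k = 5
# initialize from the first block of columns, then stream in the rest
U, S, _ = np.linalg.svd(A[:, :10], full_matrices=False)
U, S = U[:, :k], S[:k]
for j in range(10, 40, 10):
    U, S = update_subspace(U, S, A[:, j:j + 10], k)
```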
 Date Issued
 2008
 Identifier
 FSU_migr_etd0926
 Format
 Thesis
 Title
 Elastic Functional Principal Component Analysis for Modeling and Testing of Functional Data.
 Creator

Duncan, Megan, Srivastava, Anuj, Klassen, E., Huffer, Fred W., Wu, Wei, Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

Statistical analysis of functional data requires tools for comparing, summarizing, and modeling observed functions as elements of a function space. A key issue in Functional Data Analysis (FDA) is the presence of phase variability in the observed data; a successful statistical model of functional data has to account for it, otherwise the ensuing inferences can be inferior. Recent methods for FDA include steps for phase separation or functional alignment. For example, Elastic Functional Principal Component Analysis (Elastic FPCA) uses the strengths of Functional Principal Component Analysis (FPCA), along with tools from Elastic FDA, to perform joint phase-amplitude separation and modeling. A related problem in FDA is to quantify and test for the amount of phase variability in given data. We develop two types of hypothesis tests for testing the significance of phase variability: a metric-based approach and a model-based approach. The metric-based approach treats phase and amplitude as independent components and uses their respective metrics to apply the Friedman-Rafsky test, Schilling's nearest neighbors, and the energy test to test the differences between functions and their amplitudes. In the model-based test, we use concordance correlation coefficients as a tool to quantify the agreement between functions and their reconstructions using FPCA and Elastic FPCA. We demonstrate this framework using a number of simulated and real datasets, including weather, Tecator, and growth data.
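The FPCA building block is easy to sketch on a discretized grid: center the sampled functions and take an SVD, so the right singular vectors are the principal component functions. The synthetic data below are invented for illustration, not the weather, Tecator, or growth sets.

```python
import numpy as np

# Discretized functional PCA: rows of F are functions sampled on a grid t.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
n = 60
# two known modes of amplitude variability plus observation noise
scores1 = rng.normal(0, 2.0, n)
scores2 = rng.normal(0, 0.5, n)
F = (np.outer(scores1, np.sin(2 * np.pi * t)) +
     np.outer(scores2, np.cos(2 * np.pi * t)) +
     rng.normal(0, 0.05, (n, len(t))))
Fc = F - F.mean(axis=0)                      # subtract the mean function
U, S, Vt = np.linalg.svd(Fc, full_matrices=False)
explained = S**2 / np.sum(S**2)              # variance explained per component
pc1 = Vt[0]                                  # dominant PC function (up to sign)
```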
 Date Issued
 2018
 Identifier
 2018_Sp_Duncan_fsu_0071E_14470
 Format
 Thesis
 Title
 Elastic Functional Regression Model.
 Creator

Ahn, Kyungmin, Srivastava, Anuj, Klassen, E., Wu, Wei, Huffer, Fred W., Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

Functional variables serve important roles as predictors in a variety of pattern recognition and vision applications. Focusing on a specific subproblem, termed scalar-on-function regression, most current approaches adopt the standard L2 inner product to form a link between functional predictors and scalar responses. These methods may perform poorly when predictor functions contain nuisance phase variability, i.e., when predictors are temporally misaligned due to noise. While a simple solution could be to pre-align predictors as a pre-processing step before applying a regression model, this alignment is seldom optimal from the perspective of regression. In this dissertation, we propose a new approach, termed elastic functional regression, where alignment is included in the regression model itself and is performed in conjunction with the estimation of other model parameters. This model is based on a norm-preserving warping of predictors, not the standard time warping of functions, and provides better prediction in situations where the shape or the amplitude of the predictor is more useful than its phase. We demonstrate the effectiveness of this framework using simulated and real data.
 Date Issued
 2018
 Identifier
 2018_Sp_Ahn_fsu_0071E_14452
 Format
 Thesis
 Title
 Statistical Shape Analysis of Neuronal Tree Structures.
 Creator

Duncan, Adam, Srivastava, Anuj, Klassen, E., Wu, Wei, Huffer, Fred W., Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

Neuron morphology plays a central role in characterizing cognitive health and functionality of brain structures. The problem of quantifying neuron shapes, and capturing statistical variability of shapes, is difficult because axons and dendrites have tree structures that differ in both geometry and topology. In this work, we restrict to trees that consist of: (1) a main branch viewed as a parameterized curve in ℝ³, and (2) some number of secondary branches, also parameterized curves in ℝ³, which emanate from the main branch at arbitrary points. We present two shape-analytic frameworks which each give a metric structure to the set of such tree shapes. Both frameworks are based on an elastic metric on the space of curves with certain shape-preserving nuisance variables modded out. In the first framework, the side branches are treated as a continuum of curve-valued annotations to the main branch. In the second framework, the side branches are treated as discrete entities and are matched to each other by permutation. We show geodesic deformations between tree shapes in both frameworks, and we show Fréchet means and modes of variability, as well as cross-validated classification between different experimental groups, using the second framework. We conclude with a smaller project which extends some of these ideas to more general weighted attributed graphs.
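The permutation matching of side branches in the second framework is a linear assignment problem. A toy sketch with invented branch features uses brute force over permutations; `scipy.optimize.linear_sum_assignment` solves the same problem in polynomial time for larger trees.

```python
import itertools
import numpy as np

# Matching side branches across two trees as a linear assignment problem:
# cost[i, j] = distance between branch i of tree 1 and branch j of tree 2.
rng = np.random.default_rng(0)
branches1 = rng.normal(size=(4, 10))     # 4 branches, crudely featurized
perm_true = np.array([2, 0, 3, 1])       # tree 2 is a shuffled, jittered copy
branches2 = branches1[perm_true] + rng.normal(0, 0.01, (4, 10))
cost = np.linalg.norm(branches1[:, None] - branches2[None], axis=2)
# brute force over the 4! permutations; a Hungarian-algorithm solver
# (e.g. scipy.optimize.linear_sum_assignment) scales to many branches
best = min(itertools.permutations(range(4)),
           key=lambda p: sum(cost[i, p[i]] for i in range(4)))
```

Here `best[i]` is the branch of tree 2 matched to branch `i` of tree 1, i.e. the inverse of `perm_true`.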
 Date Issued
 2018
 Identifier
 2018_Sp_Duncan_fsu_0071E_14500
 Format
 Thesis
 Title
 Sparsity-Regularized Learning for Nano-Metrology.
 Creator

Li, Xin, Park, Chiwoo, Srivastava, Anuj, Liang, Zhiyong, Vanli, Omer Arda, Florida State University, College of Engineering, Department of Industrial and Manufacturing Engineering
 Abstract/Description

The key objective of nanomaterial metrology is to extract relevant information on nanostructure for quantitatively correlating structure-property relationships with functionality. Historic improvements in instrumentation platforms have enabled comprehensive capture of the information stream both globally and locally. For example, impressive progress in scanning transmission electron microscopy (STEM) has given access to vibrational spectroscopic signals such as atomically resolved electron energy loss spectroscopy (EELS) and, most recently, ptychography. This is particularly pertinent in the scanning probe microscopy (SPM) community, which has seen a rapidly growing trend toward simultaneous capture of multiple imaging channels and increasing data sizes. Meanwhile, signal processing analysis has remained the same, depending on simple physics models. This approach by definition ignores the material behaviors associated with deviations from simple physics models and hence requires more complex dynamic models. Introduction of such models, in turn, can lead to spurious growth of free parameters, potential overfitting, etc. To derive the signal analysis pathways necessitated by the large, complex datasets generated by progress in instrumentation hardware, we propose data-physics, inference-driven approaches for high-veracity and information-rich nanomaterial metrology. Mathematically, we found structural sparsity regularizations extremely useful; these are explained in the corresponding applications in later chapters. In a nutshell, we overview the following contributions: 1. We proposed a physics-infused semiparametric regression approach for estimating the size distribution of nanoparticles from DLS measurements, yielding more details of the size distribution than the traditional methodology. Our methodology expands the DLS capability of characterizing heterogeneously shaped nanoparticles.
2. We proposed a two-level structural sparsity regularized regression model and correspondingly developed a variant of the group orthogonal matching pursuit algorithm for simultaneously estimating global periodic structure and detecting local outlier structures in noisy STEM images. We believe this is an important step toward automatic phase. 3. We develop and implement a universal real-time image reconstruction algorithm from rapid and sparse STEM scans for noninvasive and high-dynamic-range imaging. We build and open-source the systematic platform that fundamentally pushes the evolution of STEM for both imaging and e-beam-based atom-by-atom fabrication, forming a marriage between the imaging and manipulation modes via intelligent and adaptive responses to the real-time material evolution.
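A plain (ungrouped) orthogonal matching pursuit sketch conveys the flavor of the sparse estimation in contribution 2; the grouped, two-level variant developed in the dissertation is more involved. All data and names below are synthetic and invented for the example.

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Greedy orthogonal matching pursuit for sparse regression y ~ A x."""
    resid, support = y.copy(), []
    for _ in range(n_nonzero):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ resid)))
        support.append(j)
        # re-fit least squares on the selected support, update residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(80, 200))          # 80 noisy measurements, 200 atoms
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [3.0, -2.0, 1.5]  # 3-sparse ground truth
y = A @ x_true
x_hat = omp(A, y, n_nonzero=3)
```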
 Date Issued
 2018
 Identifier
 2018_Su_Li_fsu_0071E_14718_Comp
 Format
 Set of related objects
 Title
 Quasi-Monte Carlo and Markov Chain Quasi-Monte Carlo Methods in Estimation and Prediction of Time Series Models.
 Creator

Tzeng, Yu-Ying, Ökten, Giray, Beaumont, Paul M., Srivastava, Anuj, Kercheval, Alec N., Kim, Kyounghee (Professor of Mathematics), Florida State University, College of Arts and Sciences, Department of Mathematics
 Abstract/Description

Randomized quasi-Monte Carlo (RQMC) methods were first developed in the mid-1990s as a hybrid of Monte Carlo and quasi-Monte Carlo (QMC) methods. They were designed to have the superior error-reduction properties of low-discrepancy sequences while remaining amenable to the statistical error analysis that Monte Carlo methods enjoy. RQMC methods are used successfully in applications such as option pricing, high-dimensional numerical integration, and uncertainty quantification. This dissertation discusses the use of RQMC and QMC methods in econometric time series analysis. In time series simulation, the two main problems are parameter estimation and forecasting. The parameter estimation problem involves the use of Markov chain Monte Carlo (MCMC) algorithms such as Metropolis-Hastings and Gibbs sampling. In Chapter 3, we use an approximately completely uniformly distributed sequence recently discussed by Owen et al. [2005], and an RQMC sequence introduced by Ökten [2009], in some MCMC algorithms to estimate the parameters of a Probit and an SVlogAR(1) model. Numerical results are used to compare these sequences with standard Monte Carlo simulation. In the time series forecasting literature, there was an earlier attempt to use QMC by Li and Winker [2003], which did not provide a rigorous error analysis. Chapter 4 presents how RQMC can be used in time series forecasting with a proper error analysis. Numerical results are used to compare various sequences for a simple AR(1) model. We then apply RQMC to compute the value-at-risk and expected shortfall measures for a stock portfolio whose returns follow a highly nonlinear Markov-switching stochastic volatility model which does not admit analytical solutions for the returns distribution. The proper use of QMC and RQMC methods in Monte Carlo and Markov chain Monte Carlo algorithms can greatly reduce the computational error in many applications from the sciences, engineering, economics, and finance.
This dissertation brings the proper (R)QMC methodology to time series simulation and discusses the advantages, as well as the limitations, of the methodology compared to standard Monte Carlo methods.
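The randomization idea is easy to demonstrate in one dimension: randomly shifting a low-discrepancy (van der Corput) point set keeps its structure while making each point uniform, so the estimator is unbiased and independent shifts yield an empirical standard error. A toy integration example, not taken from the dissertation:

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    seq = np.zeros(n)
    for i in range(n):
        f, k, x = 1.0, i + 1, 0.0
        while k > 0:
            f /= base
            k, r = divmod(k, base)
            x += r * f
        seq[i] = x
    return seq

# Randomized QMC via a random shift modulo 1: each shifted point set is a
# valid QMC rule, and the 20 independent shifts behave like i.i.d. replicates.
rng = np.random.default_rng(0)
f = lambda u: np.exp(u)                 # integrand with known integral e - 1
base_points = van_der_corput(512)
estimates = []
for _ in range(20):
    u = (base_points + rng.random()) % 1.0
    estimates.append(f(u).mean())
est = np.mean(estimates)
stderr = np.std(estimates, ddof=1) / np.sqrt(len(estimates))
```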
 Date Issued
 2017
 Identifier
 FSU_SUMMER2017_Tzeng_fsu_0071E_13607
 Format
 Thesis
 Title
 Large-Scale Multi-Target Tracking Problem for Interacting Targets.
 Creator

Vo, Garret Dan, Park, Chiwoo, Srivastava, Anuj, Liang, Zhiyong (Richard), Vanli, Omer Arda, Florida State University, FAMU-FSU College of Engineering (Tallahassee, Fla.), Department of Industrial and Manufacturing Engineering
 Abstract/Description

The unique physical properties of nanoparticles depend on their sizes and shapes. Therefore, an ability to precisely control the size of nanoparticles and tune their morphology will allow scientists and engineers to modify their physical properties, which will lead to many potential applications. Precisely controlling nanoparticles' sizes and shapes requires a deep understanding of their growth mechanism, for which direct observation and quantitative analysis are both necessary. For direct observation, the in situ electron microscopy method has shown promise, because it enables researchers to see the growth process in video recordings in which each frame is an image from an electron microscope. However, this method yields a vast number of electron images, so analyzing these images to monitor the nanoparticles' growth is a challenging task. The objective of this dissertation is to develop an automation process to capture the complex growth events of nanoparticles in a sequence of electron microscope images. The automation process consists of two tasks: detect nanoparticles in an electron microscope image that has a nonuniform background and significant noise, and then track the detected nanoparticles across a large number of video frames obtained from a single camera. Complex interactions exist among the nanoparticles in each frame, and the tracking algorithm must capture them. Two solutions are proposed in this dissertation. To detect nanoparticles, an electron microscope image is converted to a binary image through a process called image binarization. To perform the image binarization step, the background of the electron microscope image is first estimated with a robust regression technique and then subtracted from the input image.
Afterwards, a global thresholding algorithm is applied to the subtracted result to obtain the binary image. To track the detected nanoparticles across a large number of video frames, an online algorithm has been created. This algorithm leverages multi-way data association, which is capable of tracking complex interactions among nanoparticles but suffers from computational inefficiency for large numbers of video frames. The online algorithm forms fragmented trajectories between two consecutive frames (i.e., frame-by-frame data association). When missed associations between nanoparticles occur, the algorithm augments the missed nanoparticles to the nanoparticles in the second frame of the frame-by-frame data association step. The algorithm then continues forming trajectories with multi-way data association for the incoming video frame. When these augmented nanoparticles are associated within the sliding window, the algorithm creates tracks that connect the missed nanoparticles at their respective time frames to their correspondents in the incoming video frame. While working on the second solution, we also created a computer simulation model to generate multi-target datasets with their respective ground-truth associations. The generated datasets and ground-truth associations serve as benchmark data for testing and evaluating multi-target tracking algorithms. The simulation model serves two purposes: to cover the full complexity of multi-target tracking scenarios, which public datasets lack, and to provide ground-truth tracking and association so that multi-target tracking algorithms can be evaluated without any manual video annotation.
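The two-step detector described in this abstract (robust background estimation, subtraction, then global thresholding) can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the polynomial-surface background model fit by iteratively reweighted least squares with Huber weights, and the Otsu global threshold, are stand-in assumptions for the unspecified robust regression and thresholding choices.

```python
import numpy as np

def estimate_background(img, degree=2, iters=5):
    """Robustly fit a low-order polynomial surface to the image background
    via iteratively reweighted least squares with Huber weights, so bright
    particles (outliers) barely influence the fit."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = xx.ravel() / w, yy.ravel() / h
    z = img.ravel().astype(float)
    # Design matrix of 2-D polynomial terms x^i y^j with i + j <= degree.
    A = np.stack([x**i * y**j
                  for i in range(degree + 1)
                  for j in range(degree + 1 - i)], axis=1)
    wts = np.ones_like(z)
    for _ in range(iters):
        sw = np.sqrt(wts)
        coef, *_ = np.linalg.lstsq(A * sw[:, None], z * sw, rcond=None)
        r = z - A @ coef
        s = 1.4826 * np.median(np.abs(r)) + 1e-12      # robust scale (MAD)
        wts = np.minimum(1.0, 1.345 * s / np.maximum(np.abs(r), 1e-12))
    return (A @ coef).reshape(h, w)

def otsu_threshold(img, bins=256):
    """Global threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(img.ravel(), bins=bins)
    p = hist / hist.sum()
    c = 0.5 * (edges[:-1] + edges[1:])
    w0, mu0 = np.cumsum(p), np.cumsum(p * c)
    w1, mu = 1.0 - w0, np.sum(p * c)
    between = w0 * w1 * (mu0 / np.maximum(w0, 1e-12)
                         - (mu - mu0) / np.maximum(w1, 1e-12)) ** 2
    return c[np.argmax(between)]

def binarize(img):
    """Background-subtract, then threshold, as in the two-step detector."""
    fg = img - estimate_background(img)
    return fg > otsu_threshold(fg)
```

On a synthetic frame with a smooth intensity gradient plus a bright particle, the background fit absorbs the gradient and the threshold isolates the particle pixels.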
 Date Issued
 2019
 Identifier
 2019_Summer_Vo_fsu_0071E_15279_P
 Format
 Set of related objects
 Title
 Shape Data Analysis for Machine Learning in Power Systems Applications.
 Creator

Cordova Guillen, Jose David, Pamidi, Sastry V., Srivastava, Anuj, Ozguven, Eren Erman, Li, Hui, Foo, Simon Y., Florida State University, FAMU-FSU College of Engineering, Department of Electrical and Computer Engineering
 Abstract/Description

This dissertation proposes the use of the shape of data as a new feature to improve existing, and develop new, machine learning and deep learning algorithms for different power systems applications. The new features are obtained through Shape Data Analysis (SDA), an emerging field in statistics. SDA is used to obtain the shape of the data structure in order to observe the different patterns that develop under abnormal conditions in distribution networks, as well as to determine the shape of load curves to improve existing electrical load forecasting algorithms. Specifically, shape-based data analysis is implemented and developed for two applications: electrical fault detection and short-term electrical load forecasting. The proposed algorithms are implemented on data collected from Intelligent Electronic Devices (IEDs), Phasor Measurement Units (PMUs), and Supervisory Control and Data Acquisition (SCADA) systems in power distribution networks.
 Date Issued
 2019
 Identifier
 2019_Spring_CordovaGuillen_fsu_0071E_12807
 Format
 Thesis
 Title
 Bayesian Tractography Using Geometric Shape Priors.
 Creator

Dong, Xiaoming, Srivastava, Anuj, Klassen, E. (Eric), Wu, Wei, Huffer, Fred W. (Fred William), Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

Diffusion-weighted imaging (DWI) and tractography have been developed over decades and are key elements in recent large-scale efforts to map the human brain. Together, the two techniques provide a unique possibility to access the macroscopic structure and connectivity of the human brain noninvasively and in vivo. The information obtained not only helps visualize brain connectivity and segment the brain into different functional areas, but also provides tools for understanding major cognitive diseases such as multiple sclerosis, schizophrenia, and epilepsy. Much effort has been devoted to this area. On the one hand, a vast spectrum of tractography algorithms has been developed in recent years, ranging from deterministic approaches through probabilistic methods to global tractography; on the other hand, various mathematical models, such as the diffusion tensor, multi-tensor models, spherical deconvolution, and Q-ball modeling, have been developed to better exploit the acquisition-dependent DWI signal. Despite considerable progress in this area, current methods still face many challenges, such as sensitivity to noise, large numbers of false positive/negative fibers, inability to handle complex fiber geometry, and expensive computational cost. More importantly, recent research has shown that, even with high-quality data, the results of current tractography methods may not improve, suggesting that it is unlikely that an anatomically accurate map of the human brain can be obtained solely from the diffusion profile. Motivated by these issues, this dissertation develops a global approach that incorporates anatomically validated geometric shape priors when reconstructing neuronal fibers. The fiber tracts between regions of interest are initialized and updated via deformations based on gradients of the posterior energy defined in this work.
This energy has contributions from the diffusion data, the shape prior information, and a roughness penalty. The dissertation first describes and demonstrates the proposed method on a 2D dataset and then extends it to 3D phantom data and real brain data. The results show that the proposed method is relatively immune to issues such as noise, complicated fiber structures such as crossing and kissing fibers, and false positive fibers, and achieves more interpretable tractography results.
 Date Issued
 2019
 Identifier
 2019_Spring_DONG_fsu_0071E_15144
 Format
 Thesis
 Title
 Belief Function Theory: Monte Carlo Methods and Application to Stock Markets.
 Creator

Salehy, Seyyed Nima, Ökten, Giray, Srivastava, Anuj, Cogan, Nicholas G., Fahim, Arash, Florida State University, College of Arts and Sciences, Department of Mathematics
 Abstract/Description

Belief function theory, also known as Dempster-Shafer theory or evidence theory, gives a general framework for quantifying, representing, and managing uncertainty, and it is widely used in applications ranging from artificial intelligence to accounting. Belief function theory provides tools to combine the opinions (belief functions) of several sources, among which Dempster's rule of combination is the most commonly used. The main drawback of using Dempster's rule to combine belief functions is its computational complexity, which limits its application to small numbers of belief functions. We introduce a family of new Monte Carlo and quasi-Monte Carlo algorithms aimed at approximating Dempster's rule of combination. We then present numerical results showing the superiority of the new methods over existing ones. The algorithms are then used to implement stock investment strategies based on Dempster-Shafer theory. We introduce a new strategy and apply it to the U.S. stock market over a certain period of time. Numerical results suggest that the strategies based on belief function theory outperform the S&P 500 index, with our new strategy giving the best returns.
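As a concrete illustration of the Monte Carlo idea (not the dissertation's own algorithms, whose details are not given in the abstract), Dempster's rule for two mass functions can be approximated by rejection sampling: draw a focal element from each source, discard conflicting draws, and normalize the counts of the non-empty intersections.

```python
import random
from collections import defaultdict

def mc_dempster(m1, m2, n_samples=100_000, seed=0):
    """Monte Carlo approximation of Dempster's rule of combination.

    m1, m2: dicts mapping focal elements (frozensets) to their masses.
    Samples a focal element from each mass function, keeps the
    intersection when it is non-empty, and normalizes the tallies;
    the rejection rate estimates the conflict mass K.
    """
    rng = random.Random(seed)
    sets1, w1 = zip(*m1.items())
    sets2, w2 = zip(*m2.items())
    counts = defaultdict(int)
    kept = 0
    for _ in range(n_samples):
        b = rng.choices(sets1, weights=w1)[0]
        c = rng.choices(sets2, weights=w2)[0]
        inter = b & c
        if inter:            # discard conflicting draws (empty intersection)
            counts[inter] += 1
            kept += 1
    return {a: n / kept for a, n in counts.items()}
```

For two small mass functions the estimate converges to the exact combined masses, while the cost stays linear in the sample count rather than growing with the number of focal-element pairs.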
 Date Issued
 2019
 Identifier
 2019_Spring_SALEHY_fsu_0071E_15151
 Format
 Thesis
 Title
 Design and Analysis of Response Surface Designs with Restricted Randomization.
 Creator

Wesley, Wayne R., Simpson, James R., Srivastava, Anuj, Parker, Peter A., Pignatiello, Joseph J., Department of Industrial and Manufacturing Engineering, Florida State University
 Abstract/Description

Many industrial experiments are conducted under conditions that do not permit complete randomization of all the experimental factors. In response surface methodology, whenever there are restrictions on randomization, the experimental procedure usually follows the split-plot design approach. Split-plot designs are used when there are factors that are difficult or costly to change or adjust during an experiment, and they are currently generating renewed interest because of their usefulness and practical application in industrial settings. Despite the work accomplished through various research efforts, there is still a need to understand the optimality properties of these designs for second-order response surface models. This dissertation develops an analytical approach for computing various optimality properties for the assessment of second-order split-plot designs. The approach involves a thorough investigation of the impact of restricted randomization on the information matrix, which characterizes much of the relationship between the design points and the proposed response surface model for split-plot designs. Several important insights are presented for the construction of second-order split-plot designs. In addition, the analytical equations reported compute exact design optimality values and are more efficient than currently available methods. A particular feature of these analytical equations is that they are functions of the design parameters: the radius and the variance ratio. Further, a significant result is the ability to efficiently compute the exact value of the integrated prediction variance for both split-plot and completely randomized designs. The computational procedures presented allow easy evaluation of the impact of changes in the design structure and variance ratio on the optimality properties of second-order split-plot designs.
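To make the role of the information matrix concrete: under the usual split-plot error structure, runs sharing a whole plot are correlated, so the generalized-least-squares information matrix is X'V⁻¹X with V = I + dZZ', where Z assigns runs to whole plots and d is the whole-plot-to-subplot variance ratio. The sketch below is an illustrative numerical version of this standard setup, not the dissertation's closed-form analytical equations; the function names and the compound-symmetry form of V are our assumptions.

```python
import numpy as np

def splitplot_information(X, whole_plot_ids, d=1.0):
    """Information matrix X' V^{-1} X of a split-plot design, where
    V = I + d * Z Z' encodes the restricted randomization: runs that
    share a whole plot are correlated through the variance ratio d."""
    n = X.shape[0]
    ids = np.asarray(whole_plot_ids)
    plots = np.unique(ids)
    Z = (ids[:, None] == plots[None, :]).astype(float)   # run-to-plot map
    V = np.eye(n) + d * Z @ Z.T
    return X.T @ np.linalg.solve(V, X)

def d_criterion(X, whole_plot_ids, d=1.0):
    """D-optimality value: determinant of the information matrix,
    a function of the variance ratio d as in the abstract."""
    return float(np.linalg.det(splitplot_information(X, whole_plot_ids, d)))
```

Setting d = 0 recovers the completely randomized case (X'X), and increasing d shows how the whole-plot correlation erodes the D-criterion of the same design points.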
 Date Issued
 2006
 Identifier
 FSU_migr_etd1180
 Format
 Thesis
 Title
 Stochastic Models and Inferences for Commodity Futures Pricing.
 Creator

Ncube, Moeti M., Srivastava, Anuj, Doran, James, Mason, Patrick, Niu, Xufeng, Huffer, Fred, Wu, Wei, Department of Statistics, Florida State University
 Abstract/Description

The stochastic modeling of financial assets is essential to the valuation of financial products and to investment decisions. These models are governed by parameters that are estimated through a process known as calibration. Current procedures typically perform a grid-search optimization of a given objective function over a specified parameter space. These methods can be computationally intensive and require restrictions on the parameter space to achieve timely convergence. In this thesis, we propose an alternative Kalman Smoother Expectation Maximization procedure (KSEM) that can jointly estimate all the parameters and produces a better model fit than alternative estimation procedures. Further, we consider the additional complexity of modeling the jumps or spikes that may occur in a time series. For this calibration we develop a Particle Smoother Expectation Maximization procedure (PSEM) for the optimization of nonlinear systems. This is an entirely new estimation approach, and we provide several examples of its application.
 Date Issued
 2009
 Identifier
 FSU_migr_etd2707
 Format
 Thesis
 Title
 Inferences in Shape Spaces with Applications to Image Analysis and Computer Vision.
 Creator

Joshi, Shantanu H., Srivastava, Anuj, Meyer-Baese, Anke, Klassen, Eric, Roberts, Rodney, Foo, Simon Y., Fisher, John W., Department of Electrical and Computer Engineering, Florida State University
 Abstract/Description

Shapes of boundaries can play an important role in characterizing objects in images. Shape analysis involves choosing mathematical representations of shapes, deriving tools for quantifying shape differences, and characterizing imaged objects according to the shapes of their boundaries. We describe an approach for statistical analysis of shapes of closed curves using ideas from differential geometry. In this thesis, we initially focus on characterizing shapes of continuous curves, both open and closed, in R^2, and then propose extensions to more general elastic curves in R^n. Under appropriate constraints that remove shape-preserving transformations, these curves form infinite-dimensional, nonlinear spaces, called shape spaces. We impose a Riemannian structure on the shape space and construct geodesic paths under different metrics. Geodesic paths are used to accomplish a variety of tasks, including the definition of a metric to compare shapes, the computation of intrinsic statistics for a set of shapes, and the definition of intrinsic probability models on shape spaces. Riemannian metrics allow for the development of a set of tools for computing intrinsic statistics for a set of shapes and clustering them hierarchically for efficient retrieval. Pursuing this idea, we also present algorithms to compute simple shape statistics (means and covariances) and derive probability models on shape spaces using local principal component analysis (PCA), called tangent PCA (TPCA). These concepts are demonstrated using a number of applications: (i) unsupervised clustering of imaged objects according to their shapes, (ii) developing statistical shape models of human silhouettes in infrared surveillance images, (iii) interpolation of endo- and epicardial boundaries in echocardiographic image sequences, and (iv) using shape statistics to test phylogenetic hypotheses.
Finally, we present a framework for incorporating prior information about high-probability shapes in the process of contour extraction and object recognition in images. Here one studies shapes as elements of an infinite-dimensional, nonlinear quotient space, and statistics of shapes are defined and computed intrinsically using the differential geometry of this shape space. Prior models on shapes are constructed using probability distributions on tangent bundles of shape spaces. Similar to past work on active contours, where curves are driven by vector fields based on image gradients and roughness penalties, we also incorporate prior shape knowledge in the form of gradient fields on curves. Through experimental results, we demonstrate the use of prior shape models in the estimation of object boundaries, and their success in handling partial obscuration and missing data. Furthermore, we describe the use of this framework in shape-based object recognition and classification. This Bayesian shape extraction approach is found to yield a significant improvement in the detection of objects in the presence of occlusions or obscurations.
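The tangent PCA (TPCA) idea can be illustrated on the simplest curved space, the unit sphere: map the data points into the tangent space at a reference point with the log map, then run ordinary PCA there. This toy sketch is only an analogy for the infinite-dimensional shape-space construction in the thesis; the function names and the choice of the sphere are ours.

```python
import numpy as np

def sphere_log(p, q):
    """Log map on the unit sphere: the tangent vector at p that the
    geodesic to q starts with (length = geodesic distance)."""
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(p)
    v = q - c * p                       # component of q orthogonal to p
    return theta * v / np.linalg.norm(v)

def tangent_pca(points, base):
    """Tangent PCA: lift points to the tangent space at `base` via the
    log map, then do ordinary PCA on the resulting tangent vectors."""
    V = np.stack([sphere_log(base, q) for q in points])
    Vc = V - V.mean(axis=0)
    _, s, Wt = np.linalg.svd(Vc, full_matrices=False)
    variances = s**2 / max(len(points) - 1, 1)
    return variances, Wt                # principal variances, directions
```

Data lying along a single geodesic through the base point should yield one dominant principal variance, with the remaining variances at numerical zero.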
 Date Issued
 2007
 Identifier
 FSU_migr_etd3697
 Format
 Thesis
 Title
 Probabilistic Uncertainty Analysis and Its Applications in Option Models.
 Creator

Namihira, Motoi J., Kopriva, David A., Srivastava, Anuj, Ewald, Brian, Hussaini, M. Yousuff, Nichols, Warren, Okten, Giray, Department of Mathematics, Florida State University
 Abstract/Description

In this work we quantify the effect of uncertainty in volatility on the prices and Deltas of American and European puts using probabilistic uncertainty analysis. We review the current methods of uncertainty analysis, including worst-case (scenario) analysis and Monte Carlo, and provide an in-depth review of polynomial chaos in both one and multiple dimensions. We develop a numerically stable method of generating orthogonal polynomials that is used in the practical construction of the polynomial chaos basis functions. We also develop a semi-analytic density transform method that is 200 times faster and 1000 times more accurate than the Monte Carlo based kernel density method. Finally, we analyze the European and American put option models assuming a historically observed distribution for the volatility. We find that the sensitivity to uncertainty in volatility is greatest for the price of ATM puts and tapers as one moves away from the strike. The Delta, however, exhibits the least sensitivity when ATM and is most sensitive when moderately ITM. The price uncertainty for ITM American puts is less than the price uncertainty of equivalent European puts. For OTM options, the price uncertainty is similar between American and European puts. The uncertainty in the Delta of ITM American puts is greater than the uncertainty of equivalent European puts. For OTM puts, the uncertainty in Delta is similar between American and European puts. For the American put, uncertainty in volatility introduces uncertainty in the location of the optimal exercise boundary, thereby making optimal exercise decisions more difficult.
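Orthogonal polynomials for polynomial chaos bases are commonly generated with the Stieltjes procedure, which builds the three-term recurrence coefficients against a discretized measure. The sketch below shows that standard technique (our assumption of the method, not necessarily the author's exact algorithm) and checks it against the known probabilists' Hermite recurrence for a Gaussian measure.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def stieltjes(nodes, weights, n):
    """Three-term recurrence coefficients (a_j, b_j) of the polynomials
    orthogonal w.r.t. the discrete measure sum_k weights[k]*delta(nodes[k]):
        p_{j+1}(x) = (x - a_j) p_j(x) - b_j p_{j-1}(x),   b_0 := 0.
    Stieltjes procedure: evaluate p_j on the nodes, form weighted inner
    products for a_j and b_j, then step the recurrence."""
    a, b = np.zeros(n), np.zeros(n)
    p_prev = np.zeros_like(nodes)            # p_{-1}
    p_cur = np.ones_like(nodes)              # p_0
    norm_prev = 1.0
    for j in range(n):
        norm = np.sum(weights * p_cur**2)
        a[j] = np.sum(weights * nodes * p_cur**2) / norm
        if j > 0:
            b[j] = norm / norm_prev
        norm_prev = norm
        p_prev, p_cur = p_cur, (nodes - a[j]) * p_cur - b[j] * p_prev
    return a, b

# Discretize the standard normal measure with Gauss-Hermite nodes; the
# recovered recurrence should match the probabilists' Hermite polynomials,
# which satisfy a_j = 0 and b_j = j.
x, w = hermegauss(40)
a, b = stieltjes(x, w / w.sum(), 8)
```

Working with evaluations on quadrature nodes, rather than with monomial moments, is what gives the procedure its numerical stability.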
 Date Issued
 2013
 Identifier
 FSU_migr_etd7525
 Format
 Thesis
 Title
 Metabolite Identification by Nuclear Magnetic Resonance Spectroscopy.
 Creator

Bingol, Ahmet Kerem, Logan, Timothy M., Brüschweiler, Rafael, Zhou, HuanXiang, Li, Hong, Srivastava, Anuj, Program in Molecular Biophysics, Florida State University
 Abstract/Description

The metabolic makeup of a biological system is a key determinant of its biological state, providing detailed insights into its function. Identification and quantification of the metabolites in a system form critical components of metabolomics. Nuclear magnetic resonance (NMR) spectroscopy is a unique tool for this purpose, providing a wealth of atomic-detail information without requiring extensive fractionation of samples. So far, a majority of NMR metabolomics studies have been performed using 1D NMR techniques because of the short duration of the experiments. The drawback of 1D NMR is the high occurrence of peak overlaps, which impairs metabolite identification and quantification. The use of multidimensional NMR techniques can resolve peak overlaps and provide connectivity information for atoms within molecules, thereby outweighing the longer measurement times. In this thesis, we introduce novel approaches to identify metabolites using multidimensional NMR spectroscopy. Our main approach consists of two major steps: in the first step, the metabolite mixture is deconvoluted into its individual components; in the second step, each individual component is analyzed using its NMR spectrum. In order to achieve fast, robust, and (semi-)automated deconvolution, the DeCoDeC technique is introduced and applied to a variety of 1H and 13C TOCSY based NMR spectra. Deconvoluted TOCSY traces are directly queried against metabolite databanks for identification. Since many metabolites are not present in metabolite databanks, we developed a strategy to extract their carbon backbone structures (topologies), which is a prerequisite for de novo structure determination. This led to the determination of 112 topologies of unique metabolites in E. coli from a single sample, which constitutes the "topolome" of a cell. The topolome is dominated by the carbon topologies of carbohydrates (34.8%) and amino acids (45.5%), which can constitute building blocks of more complex structures.
Furthermore, since databanks are designed for querying 1D NMR spectra, querying TOCSY traces against the 1D NMR spectra in databanks resulted in imperfect matches. To overcome this, we created a customized 13C TOCSY database, which substantially improved the accuracy of database queries of 13C TOCSY traces. Together, these new tools open up the prospect of routine yet accurate analysis of an increasingly complex and diverse range of molecular solutions, including metabolomics samples.
 Date Issued
 2013
 Identifier
 FSU_migr_etd7298
 Format
 Thesis
 Title
 Tools for Statistical Analysis on Shape Spaces of Three-Dimensional Objects.
 Creator

Xie, Qian, Srivastava, Anuj, Klassen, E. (Eric), Huffer, Fred W. (Fred William), Wu, Wei, Zhang, Jinfeng, Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

With the increasing popularity of information technology, especially electronic imaging techniques, large amounts of high-dimensional data such as 3D shapes have become pervasive in science, engineering, and even daily life in recent years. Though the quantity of data is huge, the extraction of relevant knowledge from those data is still limited. How to understand data in a meaningful way is generally an open problem. The specific challenges include finding adequate mathematical representations of the data and designing proper algorithms to process them. The existing tools for analyzing high-dimensional data, including 3D shape data, are found to be insufficient, as they usually suffer from many factors such as misalignments, noise, and clutter. This thesis attempts to develop a framework for processing, analyzing, and understanding high-dimensional data, especially 3D shapes, by proposing a set of statistical tools including theory, algorithms, and optimization applied to practical problems. In particular, the following aspects of shape analysis are considered: 1. A framework adopting the SRNF representation, based on parallel transport of deformations across surfaces in the shape space, leads to statistical analysis of shape data. Three main analyses are conducted under this framework: (1) computing geodesics when either two end surfaces, or the starting surface and an initial deformation, are given; (2) parallel transporting deformations across surfaces; and (3) sampling random surfaces. 2. Computational efficiency plays an important role in performing statistical shape analysis on large datasets of 3D objects. To speed up the previous method, a framework with a numerical solution is introduced by approximating the inverse mapping, and it reduces the computational cost by an order of magnitude. 3. The geometrical and morphological information of 3D objects, i.e., their shapes, can be analyzed explicitly using boundaries extracted from the original image scans.
An alternative idea is to consider variability in shapes directly from their embedding images. A novel framework is proposed to unify three important tasks: registering, comparing, and modeling images. 4. Finally, the spatial deformations learned from registering images are modeled using the GRID-based decomposition. This model provides a way to decompose a large deformation into local, fundamental ones, so that shape differences between images are easily interpretable. We conclude this thesis with the conclusions drawn from this research and, in the last chapter, discuss potential future directions of statistical shape analysis, from both methodological and application perspectives.
 Date Issued
 2015
 Identifier
 FSU_migr_etd9495
 Format
 Thesis
 Title
 A Framework for Comparing Shape Distributions.
 Creator

Henning, Wade, Srivastava, Anuj, Alamo, Rufina G., Huffer, Fred W. (Fred William), Wu, Wei, Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

The problem of comparing shape populations is present in many branches of science, including nanomanufacturing, medical imaging, particle analysis, fisheries, seed science, and computer vision. Researchers in these fields have traditionally characterized the profiles in these sets using combinations of scalar-valued descriptor features, like aspect ratio or roughness, whose distributions are easy to compare using classical statistics. However, there is a desire in this community for a single comprehensive feature that uniquely defines these profiles. The shape of the profile itself is such a feature. Shape features have traditionally been studied individually, and comparing the distributions underlying sets of shapes is challenging. Since the data come in the form of samples from shape populations, we use kernel methods to estimate the underlying shape densities. We then take a metric approach and define a proper distance, termed the Fisher-Rao distance, to quantify differences between any two densities. This distance can be used for clustering, classification, and other types of statistical modeling; however, this dissertation focuses on comparing shape populations via a classical two-sample hypothesis test, with the populations characterized by their respective probability densities on shape space. Since we are interested in the shapes of planar closed curves, and the space of such curves is infinite-dimensional, there are theoretical issues in defining and estimating densities on this space. We therefore use a spherical multidimensional scaling algorithm to project shape distributions onto the unit two-sphere, which allows us to use a von Mises-Fisher kernel for density estimation. The estimated densities are then compared using the Fisher-Rao distance, which, in turn, is estimated using Monte Carlo methods. This distance estimate is used as the test statistic for the two-sample hypothesis test mentioned above.
We use a bootstrap approach to perform the test and to evaluate population classification performance. We demonstrate these ideas using applications from industrial and chemical engineering.
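Under the square-root representation used throughout this line of work, the Fisher-Rao distance between two densities reduces to the arc length between their square roots on the unit sphere in L2, i.e., the arccos of their Bhattacharyya coefficient. The sketch below uses grid quadrature for the inner product as a simplification; the dissertation estimates it by Monte Carlo instead.

```python
import numpy as np

def fisher_rao_distance(p, q, dx):
    """Fisher-Rao geodesic distance between densities p and q sampled on a
    common uniform grid with spacing dx. Square roots of densities lie on
    the unit sphere in L2, so the geodesic distance is the arccos of their
    inner product, the Bhattacharyya coefficient."""
    bc = np.sum(np.sqrt(p * q)) * dx
    return float(np.arccos(np.clip(bc, -1.0, 1.0)))

# Example: unit-variance Gaussians whose means are m apart have
# Bhattacharyya coefficient exp(-m**2 / 8), so the Fisher-Rao distance
# is arccos(exp(-m**2 / 8)) -- a closed form to check against.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
gauss = lambda mu: np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)
d = fisher_rao_distance(gauss(0.0), gauss(1.0), dx)
```

Because the distance is a proper metric on the space of densities, it can serve directly as the two-sample test statistic described above.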
 Date Issued
 2014
 Identifier
 FSU_migr_etd9185
 Format
 Thesis
 Title
 Geometric Approaches for Analysis of Images, Densities and Trajectories on Manifolds.
 Creator

Zhang, Zhengwu, Srivastava, Anuj, Klassen, E. (Eric), Wu, Wei, Pati, Debdeep, Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

In this dissertation, we focus on the problem of analyzing high-dimensional functional data using geometric approaches. The term functional data here refers to images, densities, and trajectories on manifolds. The nature of these data imposes difficulties on statistical analysis. First, the objects are functional data, which are infinite-dimensional; one needs to explore possible representations of each type such that the representations facilitate subsequent statistical analysis. Second, the representation spaces are often nonlinear manifolds, so proper Riemannian structures are necessary to compare objects. Third, the analysis and comparison of objects must be invariant to certain nuisance variables. For example, comparison between two images should be invariant to their blur levels, and comparison between time-indexed trajectories on manifolds should be invariant to their temporal evolution rates. We start by introducing frameworks for representing, comparing, and analyzing functions in Euclidean space, including signals, images, and densities, where the comparisons are invariant to the Gaussian blur present in these objects. Applications in blur-level matching, blurred image recognition, image classification, and two-sample hypothesis testing are discussed. Next, we present frameworks for analyzing longitudinal trajectories on a manifold M, where the analysis is invariant to the reparameterization action (temporal variation). In particular, we are interested in analyzing trajectories on two manifolds: the two-sphere and the set of symmetric positive-definite matrices. Applications such as bird migration and hurricane track analysis, visual speech recognition, and hand gesture recognition are used to demonstrate the advantages of the proposed frameworks. In the end, a Bayesian framework for clustering shapes of curves is presented, with examples of clustering cell shapes and protein structures.
 Date Issued
 2015
 Identifier
 FSU_migr_etd9503
 Format
 Thesis