Current Search: Beaumont, Paul M.
Search results
 Title
 Reevaluating the Link Between Volatility and Growth.
 Creator

Yigit, Fatma Pinar, Norrbin, Stefan C., Beaumont, Paul M., Schlagenhauf, Don, Department of Economics, Florida State University
 Abstract/Description

Until the 1980s, business cycle theory and growth theory were viewed as separate and unrelated. The last two decades, however, saw a growing literature examining the link between growth and short-term fluctuations. Two primary views emerged: one suggesting a negative connection and the other a positive connection between volatility and growth. The purpose of this study is to understand the relationship between growth and output volatility. The focus is to find a link between growth and output volatility, either negative as Ramey and Ramey (1995) concluded or positive as Kormendi and Meguire (1985) claimed. This study also expands on the work of Ramey and Ramey (1995) by considering a larger sample size and different time ranges using a more recent Penn World Table dataset. The robustness of Ramey and Ramey's (1995) empirical study is examined. The first part of this paper examines the tradeoff between growth and volatility by assuming that volatility differs among countries but not over time. The results show some support for the negative relationship using the updated dataset, although the evidence is weaker than Ramey and Ramey's (1995) strong evidence for a negative relationship. A stronger relationship was found when the data were sampled differently, using a moving-period measure. These results show more support for the negative relationship.
 Date Issued
 2004
 Identifier
 FSU_migr_etd0664
 Format
 Thesis
 Title
 Essays on the Role of Trade Frictions in International Economics.
 Creator

Yoshimine, Koichi, Norrbin, Stefan C., Huffer, Fred W., Beaumont, Paul M., Garriga, Carlos, Department of Economics, Florida State University
 Abstract/Description

This dissertation consists of three essays. The first essay examines the effects of tax differentials on the trade balance across countries. Given that intrafirm trade accounts for a sizable share of the world's international trade, the income-shifting activities of multinational firms can be expected to bias the trade balance in many countries. Specifically, an increase in the relative tax liability in one country is expected to decrease the trade balance of that country. Using proxies for the effective tax liability of 19 OECD countries, the cointegrating regressions show significantly negative relationships between tax differentials and the trade balance among relatively small industrial countries. The second essay asks whether the empirically observed home biases in international trade are accounted for by a theoretical model. It has been pointed out that trade among individual Canadian provinces is much larger than trade between individual Canadian provinces and individual U.S. states; there is a similar tendency in trade among the OECD member countries. Obstfeld and Rogoff (2000) claim that such a bias can be explained if one takes into account the interaction between transaction costs and the elasticity of substitution. This study tests their claim using a dynamic general equilibrium model in which agents pay proportional transaction costs. The simulation results show that the bias levels generated by plausible values for transaction costs and elasticity are not particularly inconsistent with the observed levels in the U.S.-Canada relationship. The third essay tests a version of an international real business cycle model aimed at examining the effect on exchange-rate volatility of market segmentation generated by a trade friction across countries. Obstfeld and Rogoff (2000) argue that segmentation in international goods markets can explain the empirically observed real exchange-rate volatility.
In this study, a trade cost in the goods market combined with income heterogeneity of consumers endogenously generates market segmentation by preventing a fraction of consumers from participating in international trade. Under such circumstances, the volatility of the exchange rate does rise, but it remains below the observed level, suggesting that trade costs alone cannot explain the anomalous exchange-rate behavior.
 Date Issued
 2004
 Identifier
 FSU_migr_etd0867
 Format
 Thesis
 Title
 Volatility Linkages in Growth and Asset Pricing.
 Creator

Ozer, Gorkem, Beaumont, Paul M., Kercheval, Alec, Norrbin, Stefan C., Zuehlke, Thomas W., Department of Economics, Florida State University
 Abstract/Description

The relationship between economic growth and volatility has been the subject of numerous theoretical and empirical studies in economics. In Chapter 2, I review this extensive literature. Key among the empirical results are the Ramey and Ramey (1995) finding that volatility has a negative effect on growth and the Sachs and Warner (1995) finding that resource-rich countries grow relatively more slowly than other countries. These two findings motivated us to examine whether the volatility effect on growth is sensitive to the natural resource endowment. These results are presented in Chapter 3, where we also extend Ramey and Ramey (1995) by using time-varying volatility and moving-window volatility measures and by exploring alternative control variable sets. Although we do not confirm all of the results of Ramey and Ramey (1995), we conclude that the effect of volatility on growth is robust to the inclusion of natural resource endowments, different control variables, and different volatility formulations. In Chapter 4, we explore the role of various sources of variation on asset prices. Specifically, we examine the effect of noisy earnings reports on the equity premium in an asset pricing model. In our model, consumers make their investment decisions based on preliminary announcements of earnings reports and make their consumption decisions after the revisions introduced by the release of the actual earnings reports. Consequently, the stochastic discount factor used for asset price determination is based on the preliminary announcements rather than the true earnings process. The variance of the revisions plays an important role in the decisions of the consumers. If the variance of revisions is high, agents will tend to ignore the announcements and rely on the mean of historical earnings realizations. This tends to smooth the stochastic discount factor in the pricing equation, which reduces the equity premium in the model.
Therefore, the equity premium puzzle is even more severe than reported by Mehra and Prescott (1985) when imperfect earnings forecasts are accounted for and consumers face a signal extraction problem in earnings.
 Date Issued
 2005
 Identifier
 FSU_migr_etd2439
 Format
 Thesis
 Title
 Essays on Productivity, Labor Allocations, and Intangible Capital.
 Creator

Malik, Kashif Z. (Kashif Zaheer), Marquis, Milton H., Kercheval, Alec N., Norrbin, Stefan C., Beaumont, Paul M., Department of Economics, Florida State University
 Abstract/Description

The first essay conducts a robustness analysis of Gali's (1999) results. Following Gali's identification strategy, the model is extended to the sectoral level within the private sector. The paper also looks at two important breaks: the 1973 recession and 1984, the beginning of the "Great Moderation." The private-sector results suggest that non-technology shocks, rather than technology shocks, are the major cause of business cycle fluctuations. The sectoral data produce the same conclusion with the exception of one sector. Most of the results do not change across the pre- and post-recession and Great Moderation dates. This essay reinforces the notion that technology shocks play a limited role in the aggregate short-run fluctuations of business cycles. These results pose a challenge to modern real business cycle theory. The question of whether hours decline in response to a technology shock attracted a great deal of research in the last decade. The second essay investigates the response of hours in a three-variable model (productivity, hours, and corporate profits) using a vector autoregression with long-run and short-run restrictions. The model imposes three restrictions: technology shocks affect productivity permanently, hours shocks and profit shocks do not affect productivity in the long run, and profit shocks do not affect hours contemporaneously. The results are more encouraging for real business cycle theory and are inconsistent with the conclusion that technology shocks play a limited role in business cycle fluctuations. An important finding is that profits matter empirically, since including them changes the response of hours to a technology shock. With profits added to the model, hours do not decline after a productivity shock: although the initial impact is negative, they recover in the first quarter and comove with productivity. The response to an hours shock is, however, consistent with Gali (1999): hours worked increase in response to a shock to employment.
Recent empirical research has argued that intangible capital has played an important role in explaining productivity gains over the last two decades. In the third essay, intangible capital is introduced into an otherwise standard real business cycle model. Firms expend resources to create intangible capital, which is an additional input in the production function. Since firms' investment in intangible capital is procyclical, a firm earns positive profits despite being competitive. The firm increases investment in intangible capital in response to both temporary and permanent productivity shocks. Intangible capital also plays a significant role in producing endogenous movement in productivity. Firms use more labor and physical capital to produce intangible capital since it raises productivity and future profits. However, there is a tradeoff between current-period profits and investment in intangible capital. A permanent technology shock results in a higher share of labor and capital being allocated to creating intangible capital, which decreases profits in the current period; however, the higher investment in intangible capital raises future profits.
 Date Issued
 2011
 Identifier
 FSU_migr_etd5012
 Format
 Thesis
 Title
 Jump Dependence and Multidimensional Default Risk: A New Class of Structural Models with Stochastic Intensities.
 Creator

Garreau, Pierre, Kercheval, Alec N., Marquis, Milton H., Beaumont, Paul M., Kopriva, David A., Okten, Giray, Department of Mathematics, Florida State University
 Abstract/Description

This thesis presents a new structural framework for multidimensional default risk. The time of default is the first jump of the log-returns of the stock price of a firm below a stochastic default level. When the stock price is an exponential Lévy process, this new formulation is equivalent to a default model with stochastic intensity where the intensity process is parametrized by a Lévy measure. This framework calibrates well to various term structures of credit default swaps. Furthermore, the dependence between the default times of firms within a basket of credit securities is the result of the jump dependence of their respective stock prices: this class of models makes the link between the equity and credit markets. As an application, we show the valuation of a first-to-default swap. To motivate this new framework, we compute the default probability in a traditional structural model of default where the firm value follows a general Lévy process. This is made possible via the resolution of a partial integro-differential equation (PIDE). We solve this equation numerically using a spectral element method based on the approximation of the solution with high-order polynomials described in Garreau and Kopriva (2013). This method is able to handle the sharp kernels in the integral term. It is faster than the competing numerical Laplace transform methods used for first passage time problems, and can be used to compute the price of exotic options with barriers. This PIDE approach does not, however, extend well to higher dimensions. To understand the joint defaults of our new framework, we investigate the dependence structures of Lévy processes. We show that for two one-dimensional Lévy processes to form a two-dimensional Lévy process, their joint survival times need to satisfy a two-dimensional version of the memoryless property. We make the link with bivariate exponential random variables and the Marshall-Olkin copula.
This result yields a necessary construction of dependent Lévy processes and a characterization theorem for Poisson random measures, and has important ramifications for default models with jointly conditionally Poisson processes.
 Date Issued
 2013
 Identifier
 FSU_migr_etd8555
 Format
 Thesis
 Title
 Stripping the Yield Curve with Maximally Smooth Forward Curves.
 Creator

Jerassy-Etzion, Yaniv, Beaumont, Paul M., Tenenbaum, Gershon, Rasmussen, David, Norrbin, Stefan, Department of Economics, Florida State University
 Abstract/Description

Continuous discount functions and forward rate curves are needed for nearly all asset pricing applications. Unfortunately, forward curves are not directly observable, so they must be constructed from existing fixed-income security prices. In this paper I present two algorithms to construct maximally smooth forward rate and discount curves from the term structure of on-the-run U.S. Treasury bills and bonds. I use on-the-run Treasuries to get the most recent and liquid prices available. The maximum smoothness criterion produces more accurate prices for derivatives such as swaps and ensures that no artificial arbitrage will be introduced when using the constructed forward curve to price out-of-sample securities. When coupon bonds are included among the securities, it is necessary both to strip the coupon payments and to interpolate the spot curve. To be consistent, these steps must be done simultaneously, but this complication usually leads to highly nonlinear algorithms. The first method I describe uses an iterated, piecewise, quartic polynomial interpolation (IPQPI) of the forward curve that requires only the solution of linear equations while maintaining minimal pricing errors and maximum smoothness of the interpolated curves. The second method uses a genetic programming (GP) algorithm that searches over the space of differentiable functions for maximally smooth forward curves with minimal pricing errors. I find that the IPQPI method performs better than the GP and other algorithms commonly used in industry and academia.
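As background for the stripping problem described here, the relationship between discount factors, spot rates, and forward rates can be shown with a toy curve; the discount factors below are invented, and neither the IPQPI nor the GP algorithm is reproduced:

```python
import numpy as np

# Hypothetical discount factors P(0, t), t in years, as might be stripped
# from on-the-run Treasury prices (values invented for illustration).
maturities = np.array([1.0, 2.0, 3.0, 4.0])
discounts = np.array([0.970, 0.938, 0.904, 0.869])

# Continuously compounded spot rates: P(0, t) = exp(-r(t) * t).
spot = -np.log(discounts) / maturities

# Implied forward rates between adjacent maturities:
# f(t1, t2) = ln(P(0, t1) / P(0, t2)) / (t2 - t1).
forward = np.log(discounts[:-1] / discounts[1:]) / np.diff(maturities)

print(np.round(spot, 4), np.round(forward, 4))
```

A smooth interpolation of `forward` between such knots, chosen to minimize a roughness criterion subject to repricing the input securities, is the kind of object the maximally smooth algorithms construct.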
 Date Issued
 2010
 Identifier
 FSU_migr_etd3526
 Format
 Thesis
 Title
 Certification and Fiduciary Liability in the U.S. Financial Advisor Industry.
 Creator

Gough, Jeffrey M., Benson, Bruce L., Autore, Donald M., Semykina, Anastasia, Beaumont, Paul M., Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

The U.S. financial advisor industry seems to have all of the necessary ingredients for a classic lemons problem: an information asymmetry between advisor and client, frequent conflicts of interest between advisor compensation and client goals, and uncertain quality stemming from the many credence-good features of financial products and advice. In this study, the first of two empirical investigations sets out to test whether voluntary advisor certification helps to mitigate this potential lemons problem. Using a unique dataset constructed from public registration records, certification appears successful in this mitigation role. Specifically, higher quality is evident among certified advisors who, after controlling for other observable characteristics, are estimated to have 30 percent lower rates of disclosure events per year of experience than their non-certified counterparts. It is argued that this quality segmentation is the result of an underlying signaling mechanism. Current federal law imposes fiduciary liability upon financial advisors registered as investment adviser representatives (IARs). However, despite offering similar services, this same liability standard is not applied to advisors who register as brokers. Many groups, including the U.S. Congress, express concern about how this fiduciary discrepancy creates negative consequences for broker clients. While well-defined consequences are often left unspecified, it is typically implied that brokers, owing fewer legal obligations to their clients, engage in more misconduct than IARs. The second empirical investigation of this study sets out to test whether this presumed misconduct is evident in higher rates of customer complaints. Despite popular concern, several methods of analysis are unable to find any statistically significant difference between the complaint rates of brokers and IARs.
Instead, it appears that advisors with dual registration (a group that presumably accepts some fiduciary obligations) are the most likely to generate significantly higher complaint rates. This study proceeds to defend the use of customer complaints as an indirect measure of misconduct and describes several factors which might mitigate the effects of a fiduciary discrepancy.
 Date Issued
 2014
 Identifier
 FSU_migr_etd9179
 Format
 Thesis
 Title
 Value-at-Risk and Expected Shortfall Estimation via Randomized Quasi-Monte Carlo Methods and Comparative Analysis.
 Creator

Franke, Stephen Robert, Marquis, Milton H., Ökten, Giray, Beaumont, Paul M., Fournier, Gary M., Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

Randomized quasi-Monte Carlo methods have been shown to offer estimates with smaller variances than estimates obtained with Monte Carlo. This dissertation examines the application of randomized quasi-Monte Carlo methods in the context of value-at-risk and expected shortfall, two measures of downside risk associated with financial portfolios. It finds that while the randomized quasi-Monte Carlo estimates retain the variance-reduction property when applied to these risk measures of financial portfolios, the reduced standard errors have a rate of convergence much closer to 1/√M than the potential 1/M described by the theory for the 22-day time horizon of value-at-risk and expected shortfall. The rate of convergence increased for the 8-day horizon, suggesting that the advantages of randomized quasi-Monte Carlo estimation in terms of standard error and accuracy of estimates improve for shorter time horizons of these risk measures and are no worse for longer horizons.
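The comparison this abstract describes can be sketched with a deliberately simple portfolio whose loss is standard normal, so the true value-at-risk and expected shortfall are known in closed form; the loss model, sample size, and scrambled-Sobol' choice below are illustrative assumptions, not the dissertation's setup:

```python
import numpy as np
from scipy.stats import norm, qmc

alpha, n = 0.99, 2**12  # confidence level and sample size (a power of 2)

def var_es(u):
    """Estimate VaR and ES from uniforms u by inverting the loss CDF."""
    losses = np.sort(norm.ppf(np.clip(u, 1e-12, 1 - 1e-12)))
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()  # average loss beyond the VaR level
    return var, es

# Plain Monte Carlo uniforms versus randomized (scrambled) Sobol' points.
var_mc, es_mc = var_es(np.random.default_rng(0).random(n))
var_q, es_q = var_es(qmc.Sobol(d=1, scramble=True, seed=0).random(n).ravel())

# Closed-form references for the standard normal loss.
z = norm.ppf(alpha)
var_true, es_true = z, norm.pdf(z) / (1 - alpha)
```

Repeating the scrambled draw with different seeds and comparing the spread of `var_q` across replications to the spread of `var_mc` is the standard way to measure the variance reduction and its convergence rate.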
 Date Issued
 2018
 Identifier
 2018_Fall_Franke_fsu_0071E_14840
 Format
 Thesis
 Title
 Geography, Economic Institutions, Political Institutions, and Economic Performance.
 Creator

Ferraro, Amanda Catherine, Gwartney, James D., Norrbin, Stefan C., Beaumont, Paul M., Sherron, Katie A., Kitchens, Carl T., Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

Does geography impact economic growth directly, even after accounting for economic and political institutions? This paper explores which countries are the most geographically disadvantaged and whether these disadvantages play a role in their economic growth and per capita income levels. A group of the 30 most geographically disadvantaged countries is determined by summing multiple geography variables to capture the overall disadvantages these countries face. The differences in per capita income levels and growth rates between these countries and other developing countries are analyzed to assess the disadvantage these geographic characteristics impose. This analysis explores how important geography is to growth relative to economic and political institutions, and whether the effects of geography change over time.
 Date Issued
 2017
 Identifier
 FSU_FALL2017_Ferraro_fsu_0071N_14141
 Format
 Thesis
 Title
 Parma: Applications of Vector-Autoregressive Models to Biological Inference with an Emphasis on Procrustes-Based Data.
 Creator

Soda, K. James (Kenneth James), Slice, Dennis E., Beaumont, Paul M., Beerli, Peter, Meyer-Baese, Anke, Shanbhag, Sachin, Florida State University, College of Arts and Sciences, Department of Scientific Computing
 Abstract/Description

Show moreMany phenomena in ecology, evolution, and organismal biology relate to how a system changes through time. Unfortunately, most of the statistical methods that are common in these fields represent samples as static scalars or vectors. Since variables in temporallydynamic systems do not have stable values this representation is unideal. Differential equation and basis function representations provide alternative systems for description, but they are also not without drawbacks of their own. Differential equations are typically outside the scope of statistical inference, and basis function representations rely on functions that solely relate to the original data in regards to qualitative appearance, not in regards to any property of the original system. In this dissertation, I propose that vector autoregressivemoving average (VARMA) and vector autoregressive (VAR) processes can represent temporallydynamic systems. Under this strategy, each sample is a time series, instead of a scalar or vector. Unlike differential equations, these representations facilitate statistical description and inference, and, unlike basis function representations, these processes directly relate to an emergent property of dynamic systems, their crosscovariance structure. In the first chapter, I describe how VAR representations for biological systems lead to both a metric for the difference between systems, the Euclidean process distance, and to a statistical test to assess whether two time series may have originated from a single VAR process, the likelihood ratio test for a common process. Using simulated time series, I demonstrate that the likelihood ratio test for a common process has a true Type I error rate that is close to the prespecified nominal error rate, regardless of the number of subseries in the system or of the order of the processes. Further, using the Euclidean process distance as a measure of difference, I establish power curves for the test using logistic regression. 
The test has a high probability of rejecting a false null hypothesis, even for modest differences between series. In addition, I illustrate that if two competitors follow the LotkaVolterra equations for competition with some additional white noise, the system deviates from VAR assumptions. Yet, the test can still differentiate between a simulation based on these equations in which the constraints on the system change and a simulation where the constraints do not change. Although the Type I error rate is inflated in this scenario, the degree of inflation does not appear to be larger when the system deviates more noticeably from model assumptions. In the second chapter, I investigate the likelihood ratio test for a common process's performance with shape trajectory data. Shape trajectories are an extension of geometric morphometric data in which a sample is a set of temporallyordered shapes as opposed to a single static shape. Like all geometric morphometric data, each shape in a trajectory is inherently highdimensional. Since the number of parameters in a VAR representation grows quadratically with the number of subseries, shape trajectory data will often require dimension reduction before a VAR representation can be estimated, but the effects that this reduction will have on subsequent inferences remains unclear. In this study, I simulated shape trajectories based on the movements of roundworms. I then reduced the number of variables that described each shape using principle components analysis. Based on these lower dimensional representations, I estimated the likelihood ratio test's Type I error rate and power with the simulated trajectories. In addition, I also used the same workflow on an empirical dataset of women walking (originally from Morris13) but also tried varying amounts of preprocessing before applying the workflow as well. 
The likelihood ratio test's Type I error rate was mildly inflated with the simulated shape trajectories but had a high probability of rejecting false null hypotheses. Without preprocessing, the likelihood ratio test for a common process had a highly inflated Type I error rate with the empirical data, but when the sampling density is lowered and the number of cycles is standardized within a comparison the degree of inflation becomes comparable to that of the simulated shape trajectories. Yet, these preprocessing steps do not appear to negatively impact the test's power. Visualization is a crucial step in geometric morphometric studies, but there are currently few, if any, methods to visualize differences in shape trajectories. To address this absence, I propose an extension to the classic vectordisplacement diagram. In this new procedure, the VAR representations for two trajectories' processes generate two simulated trajectories that share the same shocks. Then, a vectordisplacement diagram compares the simulated shapes at each time step. The set of all diagrams then illustrates the difference between the trajectories' processes. I assessed the validity of this procedure using two simulated shape trajectories, one based on the movements of roundworms and the other on the movements of earthworms. The result provided mixed results. Some diagrams do show comparisons between shapes that are similar to those in the original trajectories but others do not. Of particular note, diagrams show a bias towards whichever trajectory's process was used to generate pseudorandom shocks. This implies that the shocks to the system are just as crucial a component to a trajectory's behavior as the VAR model itself. Finally, in the third chapter I discuss a new R library to study dynamic systems and represent them as VAR and VARMA processes, iPARMA. 
Since certain processes can have multiple VARMA representations, the routines in this library place an emphasis on the reverse echelon form: for every process, there is only one VARMA model in reverse echelon form. The routines in iPARMA cover a diverse set of topics, but they generally fall into one of four categories: simulation and study, model estimation, hypothesis testing, and visualization methods for shape trajectories. Within the chapter, I discuss highlights and features of key routines' algorithms, as well as how they differ from analogous routines in the R package MTS. In many regards, this dissertation is foundational, so it opens a number of avenues for future research. One major area for further work involves alternative ways to represent a system as a VAR or VARMA process. For example, the parameter estimates in a VAR or VARMA model could depict a process as a point in parameter space. Other potentially fruitful areas include the extension of representational applications to other families of time series models, such as cointegrated models, or altering the generalized Procrustes algorithm to better suit shape trajectories. Based on these extensions, it is my hope that statistical inference based on stochastic process representations will help expand what systems biologists are able to study and what questions they are able to answer.
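The reduce-then-estimate workflow this abstract describes can be sketched in a few lines. This is a minimal illustration on assumed toy data, not the dissertation's actual code; `pca_reduce` and `fit_var1` are hypothetical helper names, and the landmark count and latent dimension are arbitrary.

```python
import numpy as np

def pca_reduce(X, k):
    """Project a T x p data matrix onto its first k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data gives the principal directions in Vt.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                 # T x k score matrix

def fit_var1(Y):
    """Least-squares estimate of A in the VAR(1) model y_t = A y_{t-1} + e_t."""
    C, *_ = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)
    return C.T                           # k x k coefficient matrix

rng = np.random.default_rng(0)
T, p, k = 500, 40, 3                     # long trajectory, 40 shape coordinates, keep 3 PCs
A_true = np.diag([0.9, 0.5, 0.2])        # stable latent VAR(1)
latent = np.zeros((T, k))
for t in range(1, T):
    latent[t] = A_true @ latent[t - 1] + rng.normal(size=k)
mix = rng.normal(size=(k, p))            # embed the latent series in p dimensions
X = latent @ mix + 0.01 * rng.normal(size=(T, p))

scores = pca_reduce(X, k)
A_hat = fit_var1(scores)
print(A_hat.shape)                       # (3, 3): a full VAR would need a 40 x 40 matrix
```

The estimated matrix lives in the PCA score coordinates, so it matches `A_true` only up to a change of basis; the point is the parameter count, which drops from p² to k².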
 Date Issued
 2017
 Identifier
 FSU_SUMMER2017_Soda_fsu_0071E_13917_P
 Format
 Set of related objects
 Title
 Multiple Imputation Methods for Large Multi-Scale Data Sets with Missing or Suppressed Values.
 Creator

Cao, Jian, Beaumont, Paul M., Duke, D. W., Norrbin, Stefan C., Ökten, Giray, Cano-Urbina, Javier, Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

Without proper treatment, direct analysis of data sets with missing or suppressed values can lead to biased results. Among all of the missing data handling methods, multiple imputation (MI) methods are regarded as the state of the art. The multiply imputed data sets can, on the one hand, generate unbiased estimates and, on the other hand, provide a reliable way to adjust standard errors based on missing data uncertainty. Despite many advantages, existing MI methods perform poorly on complicated multi-scale data, especially when the data set is large. The large data set of interest to us is the Quarterly Census of Employment and Wages (QCEW), which records the employment and wages of every establishment in the US. These detailed data are aggregated up through three scales: industry structure, geographic levels, and time. The size of the QCEW data is as large as 210 × 2217 × 3193 ≈ 1.5 billion observations. For privacy reasons the data are heavily suppressed, and this missingness can appear anywhere in this complicated structure. The existing methods are either accurate or fast, but not both, in handling the QCEW data. Our goal is to develop an MI method capable of handling the missing value problem of large multi-scale data sets both accurately and efficiently. This research addresses this goal in three directions. First, I improve the accuracy of the fastest MI method, the bootstrapping-based Expectation Maximization (EMB) algorithm, by equipping it with a Multi-Scale Updating step. This updating step uses the information from the singular covariance matrix to take the multi-scale structure into account and to simulate more accurate imputations. Second, I improve the MI method by using a quasi-Monte Carlo technique to accelerate its convergence. Finally, I develop a Sequential Parallel Imputation method which can detect the structure and missing pattern of large data sets and partition them into small data sets automatically.
The resulting Parallel Sequential Multi-Scale Bootstrapping Expectation Maximization Multiple Imputation (PSIMBEMMI) method is accurate, very fast, and can be applied to very large data sets.
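The bootstrap-plus-EM idea behind EMB can be sketched on a toy bivariate data set. This is a simplified illustration (the E-step below uses only the conditional mean, omitting the variance correction of full EM), and `em_normal` and `emb_impute` are hypothetical names, not routines from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

def em_normal(x, y, n_iter=50):
    """Simplified EM for a bivariate normal when some y entries are NaN.

    E-step: replace missing y by its conditional mean given x.
    M-step: re-estimate the mean and covariance from the completed data."""
    miss = np.isnan(y)
    y_hat = np.where(miss, np.nanmean(y), y)
    for _ in range(n_iter):
        data = np.column_stack([x, y_hat])
        mu, cov = data.mean(axis=0), np.cov(data.T)
        beta = cov[0, 1] / cov[0, 0]
        y_hat[miss] = mu[1] + beta * (x[miss] - mu[0])
    return mu, cov

def emb_impute(x, y, m=5):
    """Bootstrap-EM multiple imputation: return m completed copies of y."""
    miss = np.isnan(y)
    completed = []
    for _ in range(m):
        idx = rng.integers(0, len(x), len(x))          # bootstrap the rows
        mu, cov = em_normal(x[idx], y[idx])            # perturbed parameter draw
        beta = cov[0, 1] / cov[0, 0]
        resid_sd = np.sqrt(max(cov[1, 1] - beta * cov[0, 1], 1e-12))
        y_imp = y.copy()
        y_imp[miss] = (mu[1] + beta * (x[miss] - mu[0])
                       + rng.normal(scale=resid_sd, size=miss.sum()))
        completed.append(y_imp)
    return completed

# toy data: y = 2x + noise, with roughly 30% of y suppressed
n = 400
x = rng.normal(size=n)
y = 2 * x + rng.normal(scale=0.5, size=n)
y[rng.random(n) < 0.3] = np.nan
copies = emb_impute(x, y)
print(len(copies), np.isnan(copies[0]).any())          # 5 False
```

Each bootstrap replicate supplies a different parameter estimate, so the spread across the m completed data sets reflects missing-data uncertainty, which is what the standard-error adjustment in MI relies on.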
 Date Issued
 2018
 Identifier
 2018_Su_Cao_fsu_0071E_14706
 Format
 Thesis
 Title
 Scalable Nonconvex Optimization Algorithms: Theory and Applications.
 Creator

Wang, Zhifeng, She, Yiyuan, Beaumont, Paul M., Niu, Xufeng, Zhang, Jinfeng, Florida State University, College of Arts and Sciences, Department of Statistics
 Abstract/Description

Modern statistical problems often involve minimizing objective functions that are not necessarily convex or smooth. In this study, we develop scalable algorithms for nonconvex optimization with statistical guarantees. We first investigate a broad surrogate framework defined by generalized Bregman divergence functions for developing scalable algorithms. Local linear approximation, mirror descent, iterative thresholding, and DC programming can all be viewed as particular instances. The Bregman re-characterization enables us to choose suitable measures of computational error to establish global convergence rate results even for nonconvex problems in high-dimensional settings. Moreover, under some regularity conditions, the sequence of iterates in Bregman surrogate optimization can be shown to approach the statistical truth within the desired accuracy geometrically fast. The algorithms can be accelerated with careful control of the relaxation and stepsize parameters. Simulation studies are performed to support the theoretical results. An important application of nonconvex optimization is robust estimation. Outliers occur widely in big-data and high-dimensional applications and may severely affect statistical estimation and inference. A framework of outlier-resistant estimation is introduced to robustify an arbitrarily given loss function. It has a close connection to the method of trimming but explicitly includes outlyingness parameters for all samples, which greatly facilitates computation, theory, and parameter tuning. To address the issues of nonconvexity and nonsmoothness, we develop scalable algorithms that are easy to implement and enjoy guaranteed fast convergence. In particular, we introduce a new means to alleviate the requirement on the initial value, so that on regular datasets the number of data resamplings can be substantially reduced.
Moreover, based on combined statistical and computational treatments, we are able to develop new tools for nonasymptotic robust analysis regarding a general loss. The obtained estimators, though not necessarily globally or even locally optimal, enjoy minimax rate optimality in both low dimensions and high dimensions. Experiments in regression, classification, and neural networks show excellent performance of the proposed methodology in robust parameter estimation and variable selection.
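Iterative thresholding, one of the surrogate instances named in this abstract, can be illustrated with a plain ISTA solver for an l1-penalized least-squares problem. This is a generic sketch of the technique, not the authors' algorithm; the problem sizes and penalty level are arbitrary choices for the demonstration.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(X, y, lam, n_iter=500):
    """Iterative soft thresholding for min_b 0.5*||y - Xb||^2 + lam*||b||_1."""
    L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the smooth part
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)         # gradient step on the quadratic loss...
        b = soft_threshold(b - grad / L, lam / L)   # ...then the proximal step
    return b

rng = np.random.default_rng(2)
n, p = 100, 20
X = rng.normal(size=(n, p))
b_true = np.zeros(p)
b_true[:3] = [3.0, -2.0, 1.5]            # sparse ground truth
y = X @ b_true + 0.1 * rng.normal(size=n)
b_hat = ista(X, y, lam=1.0)
print(np.nonzero(np.abs(b_hat) > 0.1)[0])  # typically recovers the 3 active coefficients
```

Swapping the quadratic upper bound for a generalized Bregman surrogate changes which majorizer is minimized at each step, but the iterate structure (linearize, then apply a simple proximal map) is the same.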
 Date Issued
 2018
 Identifier
 2018_Su_Wang_fsu_0071E_14775
 Format
 Thesis
 Title
 Domestic Effects of Federal Regulation of Oil and Gas Industries.
 Creator

Gmeiner, Robert James, Holcombe, Randall G., Feiock, Richard C., Beaumont, Paul M., Norrbin, Stefan C., Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

The capture theory of regulation concludes that regulatory agencies tend to be captured by the firms they regulate, so that regulations benefit those regulated firms. This paper examines the cumulative effects of federal regulations on the oil and gas industry and finds that regulations have benefited the more powerful economic interests in that industry, consistent with the capture theory. Regulations have tended to narrow refiners' margins and are associated with positive return and negative volatility responses for the stocks of vertically integrated firms, which are the largest players in the industry. This narrowing of margins is mostly a long-term effect, but it has some short-run effects, more on input prices than output prices, further benefiting vertically integrated firms. Refining regulations affect input prices more than output prices primarily by affecting demand for certain types of crude oil. Effects of regulation on input prices, and even on the relative quantities of different inputs, are robust, whereas effects of regulation on output prices are far more tenuous. There is no clear evidence that consumers are worse off because of the regulatory environment, but the robust empirical evidence does indicate that the regulatory environment differentially benefits large vertically integrated producers.
 Date Issued
 2018
 Identifier
 2018_Sp_Gmeiner_fsu_0071E_14287
 Format
 Thesis
 Title
 Quasi-Monte Carlo and Markov Chain Quasi-Monte Carlo Methods in Estimation and Prediction of Time Series Models.
 Creator

Tzeng, Yu-Ying, Ökten, Giray, Beaumont, Paul M., Srivastava, Anuj, Kercheval, Alec N., Kim, Kyounghee (Professor of Mathematics), Florida State University, College of Arts and Sciences, Department of Mathematics
 Abstract/Description

Randomized quasi-Monte Carlo (RQMC) methods were first developed in the mid-1990s as a hybrid of Monte Carlo and quasi-Monte Carlo (QMC) methods. They were designed to have the superior error reduction properties of low-discrepancy sequences while remaining amenable to the statistical error analysis that Monte Carlo methods enjoy. RQMC methods are used successfully in applications such as option pricing, high-dimensional numerical integration, and uncertainty quantification. This dissertation discusses the use of RQMC and QMC methods in econometric time series analysis. In time series simulation, the two main problems are parameter estimation and forecasting. The parameter estimation problem involves the use of Markov chain Monte Carlo (MCMC) algorithms such as Metropolis-Hastings and Gibbs sampling. In Chapter 3, we use an approximately completely uniformly distributed sequence, recently discussed by Owen et al. [2005], and an RQMC sequence introduced by Ökten [2009], in some MCMC algorithms to estimate the parameters of a Probit and an SV-logAR(1) model. Numerical results are used to compare these sequences with standard Monte Carlo simulation. In the time series forecasting literature, there was an earlier attempt to use QMC by Li and Winker [2003], which did not provide a rigorous error analysis. Chapter 4 presents how RQMC can be used in time series forecasting with a proper error analysis. Numerical results are used to compare various sequences for a simple AR(1) model. We then apply RQMC to compute the value-at-risk and expected shortfall measures for a stock portfolio whose returns follow a highly nonlinear Markov switching stochastic volatility model that does not admit analytical solutions for the returns distribution. The proper use of QMC and RQMC methods in Monte Carlo and Markov chain Monte Carlo algorithms can greatly reduce the computational error in many applications in science, engineering, economics, and finance.
This dissertation brings the proper (R)QMC methodology to time series simulation and discusses the advantages as well as the limitations of the methodology compared to standard Monte Carlo methods.
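The random-shift randomization that underlies many RQMC estimators can be sketched for a one-step AR(1) forecast. This is a toy illustration with a van der Corput sequence and uniform innovations, not the sequences studied in the dissertation; `rqmc_forecast` is a hypothetical name.

```python
import numpy as np

def van_der_corput(n, base=2):
    """First n points of the van der Corput low-discrepancy sequence."""
    seq = np.zeros(n)
    for i in range(n):
        f, k = 1.0, i + 1
        while k > 0:
            f /= base
            seq[i] += f * (k % base)
            k //= base
    return seq

def rqmc_forecast(x0, phi, n=1024, shifts=10, rng=None):
    """Randomly shifted QMC estimate of E[x_1] where x_1 = phi*x_0 + u, u ~ U(-0.5, 0.5).

    Each random shift gives an unbiased estimate; the spread across shifts
    yields a statistical error bar, which unrandomized QMC lacks."""
    rng = rng or np.random.default_rng(0)
    pts = van_der_corput(n)
    ests = []
    for _ in range(shifts):
        u = (pts + rng.random()) % 1.0 - 0.5   # Cranley-Patterson rotation
        ests.append(np.mean(phi * x0 + u))
    return np.mean(ests), np.std(ests)

mean, err = rqmc_forecast(x0=1.0, phi=0.8)
print(mean, err)   # mean is close to the true forecast value 0.8, err is tiny
```

With n = 1024 low-discrepancy points the per-shift integration error is on the order of 1/n, far below the 1/sqrt(n) rate a plain Monte Carlo estimate would give at the same sample size.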
 Date Issued
 2017
 Identifier
 FSU_SUMMER2017_Tzeng_fsu_0071E_13607
 Format
 Thesis
 Title
 A Stock Market Agent-Based Model Using Evolutionary Game Theory and Quantum Mechanical Formalism.
 Creator

Montin, Benoit S., Nolder, Craig A., Huffer, Fred W., Case, Bettye Anne, Beaumont, Paul M., Kercheval, Alec N., Sumners, DeWitt L., Department of Mathematics, Florida State University
 Abstract/Description

The financial market is modelled as a complex self-organizing system. Three economic agents interact in a simplified economy and seek to maximize their wealth. Replicator dynamics are used as a myopic behavioral rule to describe how agents learn and benefit from their experiences. Stock price fluctuations result from interactions between economic agents, budget constraints, and conservation laws. Time is discrete. Invariant distributions over the state space, that is to say, probability measures that remain unchanged by the one-period transition rule, form stochastic equilibria for our composite system. When agents make mistakes, there is a unique stochastic steady state which reflects the average and limit behavior. Convergence of the iterates occurs at a geometric rate in the total variation norm. Interestingly, when the probability of making a mistake tends to zero, the invariant distribution converges weakly to a stochastic equilibrium for the model without mistakes. Most agent-based computational economies rely heavily on simulations. Having adopted a simple representation of financial markets, we have been able to prove the above theoretical results and gain intuition about complexity economics. The impact of simple monetary policies, such as a decrease of the risk-free rate of interest, on the limit stock price distribution has been analyzed. Of interest as well, the limit stock log-return distribution presents real-world features (skewness and leptokurtosis) that more traditional models usually fail to explain or consider. Our artificial market is incomplete. The bid and ask prices of a vanilla call option have been computed to illustrate option pricing in our setting.
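Replicator dynamics, the behavioral rule mentioned above, can be illustrated with a toy two-strategy game; the dissertation's three-agent economy is far richer, and the payoff matrix below is an arbitrary example.

```python
import numpy as np

def replicator_step(x, payoff):
    """One discrete replicator-dynamics update: strategies with above-average
    payoff gain population share, those below average lose it."""
    fitness = payoff @ x          # expected payoff of each strategy
    avg = x @ fitness             # population-average payoff
    return x * fitness / avg

# hypothetical 2-strategy coordination game (diagonal payoffs only)
payoff = np.array([[2.0, 0.0],
                   [0.0, 1.0]])
x = np.array([0.6, 0.4])          # initial population shares
for _ in range(50):
    x = replicator_step(x, payoff)
print(x.round(3))                 # the higher-payoff strategy takes over
```

The update is myopic in exactly the sense the abstract describes: agents shift toward whatever performed well last period, with no forward-looking optimization.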
 Date Issued
 2004
 Identifier
 FSU_migr_etd2331
 Format
 Thesis
 Title
 Asset Pricing in a Lucas Framework with Boundedly Rational, Heterogeneous Agents.
 Creator

Culham, Andrew J. (Andrew James), Beaumont, Paul M., Kercheval, Alec N., Schlagenhauf, Don, Goncharov, Yevgeny, Kopriva, David, Department of Mathematics, Florida State University
 Abstract/Description

The standard dynamic general equilibrium model of financial markets does a poor job of explaining the empirical facts observed in real market data. The common assumptions of homogeneous investors and rational expectations equilibrium are thought to be major factors leading to this poor performance. In an attempt to relax these assumptions, the literature has seen the emergence of agent-based computational models where artificial economies are populated with agents who trade in stylized asset markets. Although they offer a great deal of flexibility, these agent-based models have often been criticized by the theoretical community because the agents are too limited in their analytical abilities. In this work, we create an artificial market with a single risky asset and populate it with fully optimizing, forward-looking, infinitely lived, heterogeneous agents. We restrict the state space of our agents by not allowing them to observe the aggregate distribution of wealth, so they are required to compute their conditional demand functions while simultaneously learning the equations of motion for the aggregate state variables. We develop an efficient and flexible model code that can be used to explore a wide range of asset pricing questions while remaining consistent with conventional asset pricing theory. We validate our model and code against known analytical solutions as well as against a new analytical result for agents with differing discount rates. Our simulation results for general cases without known analytical solutions show that, in general, agents' asset holdings converge to a steady-state distribution and the agents are able to learn the equilibrium prices despite the restricted state space. Further work will be necessary to determine whether the exceptional cases have some fundamental theoretical explanation or can be attributed to numerical issues.
We conjecture that convergence to the equilibrium is global and that the market-clearing price acts to guide the agents' forecasts toward that equilibrium.
 Date Issued
 2007
 Identifier
 FSU_migr_etd2948
 Format
 Thesis
 Title
 Investigations of Behavioral Phenomena in Auctions and Gambles.
 Creator

White, Robert A., Pevnitskaya, Svetlana A., Großer, Jens Willi, Beaumont, Paul M., Isaac, R. Mark (Robert Mark), Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

The first two chapters of this dissertation investigate the connection between the rich theoretical environment of first-price auctions and the patterns of bidding behavior that result from specific treatments in the form of social comparison feedback. In the first chapter we find that the behavioral treatments we designed can be used to increase charity auction revenue. In the second chapter we find that subjects do want to compare themselves to other subjects when it is costless, though not always, and that subjects are sometimes willing to pay for the ability to compare themselves to other subjects. The third chapter of this dissertation is an introduction to the notable research to date on choice in the presence of risk. The fourth chapter is a theoretical and experimental evaluation of the current state-of-the-art models of choice in the presence of risk. We find that half of our experimental subjects behave as though they have either constant relative or constant absolute risk aversion and that the behavior of nearly all subjects can be explained by a non-expected utility theory.
 Date Issued
 2015
 Identifier
 FSU_migr_etd9486
 Format
 Thesis
 Title
 Asset Pricing Equilibria for Heterogeneous, Limited-Information Agents.
 Creator

Jones, Dawna Candice, Kercheval, Alec N., Beaumont, Paul M., Van Winkle, David H., Nichols, Warren, Ökten, Giray, Florida State University, College of Arts and Sciences, Department of Mathematics
 Abstract/Description

The standard general equilibrium asset pricing models typically make two simplifying assumptions: homogeneous agents and the existence of a rational expectations equilibrium. This setting sometimes yields outcomes that are inconsistent with the empirical findings. We hypothesize that allowing agent heterogeneity could assist in replicating the empirical results. However, the inclusion of heterogeneity in models where agents are fully rational proves impossible to solve without severe simplifying assumptions. The reason for this difficulty is that heterogeneous agent models generate an endogenously complicated distribution of wealth across the agents. The state space for each agent's optimization problem includes the complex dynamics of the wealth distribution, and there is no general way to characterize the interaction between the distribution of wealth and the macroeconomic aggregates. To address this issue, we implement an agent-based model where the agents have bounded rationality. In our model, we have a complete markets economy with two agents and two assets. The agents are heterogeneous and utility maximizing with constant relative risk aversion (CRRA) preferences. How the agents address the stochastic behaviour of the evolution of the wealth distribution is central to our task, since aggregate prices depend on this behaviour. An important component of this dissertation involves dealing with the computational difficulty of dynamic heterogeneous-agent models. That is, in order to predict prices, agents need a way to keep track of the evolution of the wealth distribution. We do this by allowing each agent to assume that a price-equivalent representative agent exists and that the representative agent has a constant coefficient of relative risk aversion. In so doing, the agents are able to formulate predictive pricing and demand functions which allow them to predict aggregate prices and make consumption and investment decisions each period.
However, the agents' predictions are only approximately correct. Therefore, we introduce a learning mechanism to maintain the required level of accuracy in the agents' price predictions. From this setup, we find that the model, with learning, will converge over time to an approximate expectations equilibrium, provided that the initial conditions are close enough to the rational expectations equilibrium prices. Two main contributions of our work are: 1) to formulate a new concept of approximate equilibria, and 2) to show how equilibria can be approximated numerically, despite the fact that the true state space at any point in time is mathematically complex. These contributions offer the possibility of characterizing a new class of asset pricing models where agents are heterogeneous and only slightly limited in their rationality. That is, the partially informed agents in our model are able to forecast and utility-maximize only as well as economists who face problems of estimating aggregate variables. By using an exogenously assigned adaptive learning rule, we analyse this implementation in a Lucas-type heterogeneous agent model. We focus on the sensitivity of the risk parameter and the convergence of the model to an approximate expectations equilibrium. We also study the extent to which adaptive learning is able to explain the empirical findings in an asset pricing model with heterogeneous agents.
 Date Issued
 2015
 Identifier
 FSU_migr_etd9624
 Format
 Thesis
 Title
 Controlling Influence: The Development and Function of Labor Law in Saudi Arabia.
 Creator

Balcer, Jordan, Hanley, Will, Gaiser, Adam R., Beaumont, Paul M., Florida State University, College of Social Sciences and Public Policy, Program in International Affairs
 Abstract/Description

Saudi Arabia has long been regarded as one of the largest oil producers in the world, with a vast amount of wealth. The country's meteoric rise to its current status can be attributed to the development of its oil fields by an American company that gained virtual control over the economy of a fledgling nation. This situation prompted the country to establish a codified labor law, which gradually gave control back to the state. This project examines the 1969 Labor and Workmen Code, the factors that caused its creation, and what prompted its continued evolution to its current form. This thesis explores why it was necessary for the Saudi Arabian government to create a codified labor law and abandon Sharia (Islamic law) in commercial matters, in addition to how the state currently uses the law to keep a firm grasp on its natural resources. The lessons learned from the American-led ARAMCO period (1938-1980) were incorporated into the law, creating a turning point in Saudi Arabian history that allowed the government to reclaim control of its economy. Many sources are used in this thesis, but the most substantial is the Labor and Workmen Law itself, because it contains specific provisions that were enacted to curb American influence. Translated sources from the Saudi Ministry of Labor and the Saudi Arabian Monetary Agency are also used to highlight the current form of the law, while many secondary sources bolster the argument that the Saudi Arabian government established a labor code that would ensure a reduction of American hegemony while also making the state the sole influence in labor and commercial matters.
 Date Issued
 2014
 Identifier
 FSU_migr_etd9134
 Format
 Thesis
 Title
 Asset Market Dynamics of Heterogeneous Agent Models with Learning.
 Creator

Guan, Yuanying, Beaumont, Paul M., Kercheval, Alec N., Marquis, Milton, Mesterton-Gibbons, Mike, Nichols, Warren D., Department of Mathematics, Florida State University
 Abstract/Description

The standard Lucas asset pricing model makes two common assumptions: homogeneous agents and a rational expectations equilibrium. However, these assumptions are unrealistic for real financial markets. In this work, we relax these assumptions and establish a Lucas-type agent-based asset pricing model. We create an artificial economy with a single risky asset and populate it with heterogeneous, boundedly rational, utility-maximizing, infinitely lived, and forward-looking agents. We restrict agents' information by allowing them to use only available information when they make optimal choices. With independent, identically distributed market returns, agents are able to compute their policy functions and the equilibrium pricing function with Duffie's method (Duffie, 1988) without perfect information about the market. When agents are out of equilibrium, they simultaneously compute their policy functions with predictive pricing functions and use adaptive learning schemes to learn the motion of the correct pricing function. Agents are able to learn the correct equilibrium pricing function for certain risk and learning parameters. In some other cases, the market price has excess volatility and the trading volume is very high. Simulations of the market behavior show rich dynamics, including a whole cascade from period-doubling bifurcations to chaos. We apply the full families theory (De Melo and Van Strien, 1993) to prove that the rich dynamics do not come from numerical errors but are embedded in the structure of our dynamical system.
 Date Issued
 2011
 Identifier
 FSU_migr_etd3938
 Format
 Thesis
 Title
 Financial Assets in a Heterogeneous Agent General Equilibrium Model with Aggregate and Idiosyncratic Risk.
 Creator

Schmerbeck, Aaron J., Beaumont, Paul M., Kercheval, Alec N., Nolder, Craig, Marquis, Milton, Schlagenhauf, Don, Department of Economics, Florida State University
 Abstract/Description

The financial economics profession has determined that a dynamic, stochastic, general equilibrium (DSGE) model with identical agents does not produce the price and trading dynamics realized in financial markets. There has been quite a bit of research over the last three decades extending heterogeneity to the Lucas asset pricing framework to address this issue. Once the assumption of homogeneous agents is relaxed, the problem becomes increasingly complex due to a state space that includes the wealth distribution, continuation utilities, and wealth distribution dynamics. To establish a more computationally feasible model, special modifications have been made, such as allowing heterogeneity in idiosyncratic shocks but not in risk aversion, including aggregate or idiosyncratic risk (but not both), or assuming no growth in the economy (steady state). In this research, I define a DSGE model with heterogeneous agents. This heterogeneity refers to differing CRRA utilities through risk aversion. The economy has growth due to the assumed dividend process. Agents face idiosyncratic and aggregate shocks in a complete markets setting. The framework of the provided algorithm enables issues to be addressed beyond homogeneous agent models. The numerical simulation results of this model exhibit considerable asset price volatility and high trading volume. These results occur even in the complete markets setting, where investors are expected to fully insure. Given these dynamics from the simulations of the algorithm, I demonstrate the ability to calibrate this model to address specific financial economic issues, such as the equity premium puzzle. Importantly, this exercise assumes realistic agent parameters for risk aversion and discount factors, relative to economic theory.
 Date Issued
 2014
 Identifier
 FSU_migr_etd9088
 Format
 Thesis
 Title
 Modelling Limit Order Book Dynamics Using Hawkes Processes.
 Creator

Chen, Yuanda, Kercheval, Alec N., Beaumont, Paul M., Ewald, Brian D., Zhu, Lingjiong, Florida State University, College of Arts and Sciences, Department of Mathematics
 Abstract/Description

The Hawkes process serves as a natural choice for modeling self-exciting dynamics, such as the behavior of an electronic exchange-hosted limit order book (LOB). However, because analytical solutions are lacking, probability estimates of future events often must rely on Monte Carlo simulation. Although Monte Carlo simulation is well suited to path-dependent problems, it has the limitation that high computation time is often required to achieve good accuracy. This is a concern in fields like algorithmic trading, where fast calculation is essential. In this dissertation we propose the use of a 4-dimensional Hawkes process to model the LOB and to forecast mid-price movement probabilities using Monte Carlo simulation. We study the feasibility of making this prediction quickly enough to be applicable in practice. We show that fast predictions are feasible, and show in tests on real data that the model has some trading value in forecasting mid-price movements. This dissertation also compares the performance of several popular computer languages (Python, MATLAB, Cython, and C) in single-core experiments, and examines scalability for parallel computing using Cython and C.
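Since the probability estimates above rest on Monte Carlo simulation of the Hawkes process, a simulation sketch may clarify the mechanics. Below is a hedged univariate version (the dissertation's model is 4-dimensional) using Ogata's thinning algorithm with an exponential kernel; the parameter values are illustrative assumptions:

```python
# Illustrative sketch: simulate a univariate Hawkes process with intensity
#   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
# via Ogata's thinning algorithm. Requires alpha/beta < 1 for stability.
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Return the sorted event times of one sample path on [0, horizon]."""
    rng = random.Random(seed)
    events = []
    t = 0.0
    while t < horizon:
        # With an exponential kernel, the intensity decays between events,
        # so the intensity at the current time bounds it until the next event.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)  # accepted: each event raises future intensity
    return events
```

Repeating such paths and counting outcomes is the Monte Carlo step whose cost motivates the dissertation's language and parallelization comparisons.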
 Date Issued
 2017
 Identifier
 FSU_2017SP_Chen_fsu_0071E_13187
 Format
 Thesis
 Title
 Three Essays on Competition in Regional Oligopoly.
 Creator

Landgraf, Steven William, Isaac, R. Mark, Kantor, Shawn Everett, Perfect, Steven Bruce, Beaumont, Paul M., Kitchens, Carl T., Florida State University, College of Social Sciences and Public Policy, Department of Economics
 Abstract/Description

This dissertation examines competitive issues among utility-type industries characterized by oligopolistic market structures with the potential for market power. These markets are characterized by geographically constrained competition, from a subnational region down to the neighborhood level. Insights from this research can be used to inform policy decisions that attempt to rectify negative consequences of concentrated market structure in essential industries. Chapter 1 analyzes the potential impact that entry by public (government-owned) Internet Service Providers (ISPs) might have on investment in quality by private incumbent ISPs in local markets. The estimates indicate that the presence of a public entry threat is associated with lower maximum upload and download speeds offered by private cable and DSL providers. However, the presence of a public threat encourages private firm entry, so it is not clear that the negative effects on speed are due to crowding out. In states where municipal entry is made more difficult by regulation, these effects disappear. Therefore, restrictive regulation of municipal broadband has a nontrivial effect on competition. Chapter 2 estimates the minimum number of ISPs required to bring a local market to a competitive level. The estimates, while imprecise, suggest that ISP markets become competitive with three firms, which corroborates previous research in the literature. This suggests that inducing competitive outcomes through entry-promotion policies might be a reasonable goal given the natural tendency towards oligopoly in the industry. However, the technology mix of the local market might affect policy recommendations. Chapter 3 determines whether asymmetric pass-through of costs to prices occurs in wholesale electricity markets, specifically in the pass-through of natural gas prices to natural-gas-fueled electricity generation. I find preliminary evidence suggesting that generator output prices respond more strongly to positive changes in the natural gas price than to negative changes, and that the size of the asymmetry is more pronounced for generators that have a greater potential to exercise market power.
 Date Issued
 2018
 Identifier
 2018_Su_Landgraf_fsu_0071E_14719
 Format
 Thesis
 Title
 Learning and Motion Planning for Gait-Based Legged Robots.
 Creator

Harper, Mario Yuuji, Erlebacher, Gordon, Collins, E., Beaumont, Paul M., Clark, Jonathan E., Shanbhag, Sachin, Meyer-Bäse, Anke, Florida State University, College of Arts and Sciences, Department of Scientific Computing
 Abstract/Description

Animals have demonstrated the capacity to traverse many complex unstructured terrains at high speeds by utilizing effective locomotion regimes. Motion in difficult and uncertain environments has seen only partial success on traditional wheeled or track-based robots and is limited to slow deliberative maneuvers on legged robots, which focus on maintaining continuous stability through proper foothold selection. While legged robots have demonstrated successful navigation across many complex surfaces, motion planning algorithms currently fail to consider the unique mobility characteristics that honor the natural self-stabilizing dynamics of gait-based locomotion such as running and climbing. This dissertation outlines some of the specific motion planning challenges faced when attempting to plan for legged systems with dynamic gaits, with specific instances demonstrated by four robots: the dynamic running platforms XRL, LLAMA, and Minitaur, and the dynamic climbing platform TAILS. Using a unique implementation of Sampling Based Model Predictive Optimization (SBMPO) designed expressly for dynamic legged robots, we demonstrate the ability to learn kinodynamic models, motion plan through obstacles on varied terrains, and navigate on vertical walls. This research has pioneered the technique that allows dynamic legged robots to navigate while honoring the natural dynamics of robot gait. Further, this document describes the methods and algorithms that enabled Florida State University to be the first in the world to demonstrate motion planning on a dynamic climbing robot. This work is demonstrated in simulation and verified through hardware experiments on canonical motion planning scenarios, in controlled laboratory settings, and in unstructured terrains. Finally, this work has opened the field of dynamic legged robot intelligence for future researchers by enabling fundamental navigation and planning, efficient real-time algorithms for onboard computing, and the development of techniques to account for complex constrained motions unique to individual robots and terrains.
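The core idea of sampling-based model-predictive planning is to sample control inputs, propagate them through a kinodynamic model, and search best-first toward the goal. The toy sketch below is only in the spirit of SBMPO: the point-mass model, costs, and parameters are illustrative assumptions, not the dissertation's implementation.

```python
# Toy sampling-based planner: sample controls, integrate a trivial kinodynamic
# model (a 2-D point with bounded speed), and expand best-first toward the goal.
# Everything here is an assumption for illustration, not the SBMPO codebase.
import heapq
import math
import random

def plan(start, goal, expansions=200, samples=8, dt=0.5, seed=0):
    """Return a list of states from start to near the goal, or None."""
    rng = random.Random(seed)

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Priority queue entries: (cost-so-far + heuristic, cost-so-far, state, path)
    frontier = [(dist(start, goal), 0.0, start, [start])]
    while frontier and expansions > 0:
        expansions -= 1
        _, g, state, path = heapq.heappop(frontier)
        if dist(state, goal) < 0.5:
            return path  # close enough to the goal region
        for _ in range(samples):
            # Sample a control input (heading, speed) and integrate the model.
            theta = rng.uniform(0.0, 2.0 * math.pi)
            speed = rng.uniform(0.0, 1.0)
            nxt = (state[0] + speed * math.cos(theta) * dt,
                   state[1] + speed * math.sin(theta) * dt)
            ng = g + speed * dt  # path-length cost
            heapq.heappush(frontier, (ng + dist(nxt, goal), ng, nxt, path + [nxt]))
    return None  # expansion budget exhausted without reaching the goal
```

Because successors come from sampled controls run through the model rather than from a fixed grid, any constraint a gait imposes can be honored simply by restricting what the model will accept, which is the property the abstract emphasizes for dynamic gaits.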
 Date Issued
 2018
 Identifier
 2018_Fall_Harper_fsu_0071E_14735
 Format
 Thesis