Department of Mathematical Sciences

University of Nevada, Las Vegas


Statistics Colloquium/Seminar Series

2014-2015

 

[2006-2008] [2008-2009] [2009-2010] [2010-2011] [2011-2012] [2012-2013] [2013-2014]

 

For more information, contact the Colloquium/Seminar Coordinator, Dr. Hokwon Cho

(For Math Dept colloquia/seminars, see: Math Dept Seminar)

 

Fall 2014


Friday, September 26
CBC-C112, 11:30 am

(refreshments at 11:15 am)

Prof. Jaechoul Lee
Department of Mathematics

Boise State University

Title: Trends in Extreme United States Temperatures


[Abstract]  Extreme temperatures have profound societal, ecological, and economic impacts. While most scientists concur that average temperatures in the contiguous United States since 1900 have warmed in aggregate, there is no a priori reason to believe that temporal trends in averages and extremes will exhibit the same patterns during this period. Indeed, under minor regularity conditions, the sample mean and maximum of stationary time series are statistically independent in large samples.
     This talk presents trend estimation methods for monthly maximum and minimum temperature time series observed in the 48 conterminous United States over the last century. Previous authors have suggested that minimum temperatures are warming faster than maximum temperatures in the United States; this aspect can be rigorously investigated via the methods discussed in this study. Here, statistical models with extreme value and change point features are used to estimate trends and their standard errors. Spatial smoothing is then applied to extract general structure. The results show that monthly maximum temperatures are often not changing greatly; perhaps surprisingly, many stations show some cooling. In contrast, the minimum temperatures show significant warming. Overall, the southeastern United States shows the least warming (even some cooling), while the western United States, northern Midwest, and New England have experienced the most warming.
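As a rough illustration of the first step in this kind of analysis, the sketch below extracts monthly block maxima from a simulated daily series and fits a least-squares trend line to them. The data, trend size, and noise level are invented for illustration only and are not from the talk, which uses extreme value and change point models rather than a plain OLS fit.

```python
import random

def block_maxima(series, block_len):
    """Maximum of each consecutive length-block_len block of the series."""
    return [max(series[i:i + block_len])
            for i in range(0, len(series) - block_len + 1, block_len)]

def ols_slope(y):
    """Least-squares slope of y regressed on the time index 0, 1, ..., n-1."""
    n = len(y)
    xbar = (n - 1) / 2.0
    ybar = sum(y) / n
    num = sum((i - xbar) * (v - ybar) for i, v in enumerate(y))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

rng = random.Random(0)
# simulated "daily temperatures": a small warming trend plus noise
daily = [0.001 * t + rng.gauss(0.0, 3.0) for t in range(30 * 120)]
maxima = block_maxima(daily, 30)   # 120 "monthly" maxima
slope = ols_slope(maxima)          # estimated trend per month in the maxima
```

The point of working with block maxima is that their trend need not match the trend in the daily averages, which is the contrast the talk examines.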



Friday, October 17
CBC-C112, 11:30 am

(refreshments at 11:15 am)

Prof. Guogen Shan
Department of Environmental & Occupational Health

University of Nevada, Las Vegas

Title: Exact Statistical Inference for Comparing

Two Independent Poisson Rates


[Abstract]  Two fundamental problems for comparing two independent Poisson rates are considered: p-value calculation and confidence interval construction. Exact tests for p-value calculation are always preferable because they guarantee the test size in small to medium sample settings. Han (2008) compared the performance of partial maximization p-values based on the Wald test statistic, the likelihood ratio test statistic, and the score test statistic, as well as the conditional p-value. These four testing procedures do not perform consistently, as the results depend on the choice of test statistic for general alternatives. We consider an approach based on estimation and partial maximization, and compare it with the procedures studied by Han (2008) for testing superiority. The approach based on partial maximization using the score test is recommended due to its comparable performance and computational advantage in large sample settings. Additionally, the approach based on estimation and partial maximization performs consistently for all three test statistics. We also examine exact one-sided confidence limits for the ratio of two independent Poisson rates. The Buehler method is utilized to obtain exact limits, in conjunction with existing approximate limits. The exact limits respect the coverage requirement, and they are as small as possible under certain mild conditions.
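The conditional p-value mentioned in the abstract has a simple closed form: conditioning on the total count, the first count is binomial under the null boundary. A minimal sketch (the function name and the one-sided superiority setup are illustrative assumptions, not taken from the talk):

```python
from math import comb

def conditional_pvalue(x1, t1, x2, t2):
    """One-sided conditional exact p-value for H0: rate1 <= rate2.

    Conditional on the total count n = x1 + x2, X1 ~ Binomial(n, p0) under
    the null boundary, with p0 = t1 / (t1 + t2) for exposure times t1, t2.
    """
    n = x1 + x2
    p0 = t1 / (t1 + t2)
    # P(X1 >= x1) under Binomial(n, p0)
    return sum(comb(n, k) * p0**k * (1.0 - p0)**(n - k)
               for k in range(x1, n + 1))
```

For example, with equal exposures and counts 10 vs 2, the conditional test reduces to the probability of seeing at least 10 successes in 12 fair binomial trials. The unconditional partial-maximization approaches discussed in the talk instead maximize the p-value over the nuisance parameter.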



Friday, October 24
CBC-C122, 11:30 am

(refreshments at 11:15 am)

Prof. Sanjib Basu
Statistics Division

Northern Illinois University

Title: A unified competing risks cure rate model with application

in cancer survival data


[Abstract]  A competing risks framework refers to multiple risks acting simultaneously on a subject. A cure rate model postulates a fraction of the subjects to be cured or failure-free, and can be formulated as a mixture model or, alternatively, as a bounded cumulative hazard model.  We develop models that unify the competing risks and cure rate approaches. The identifiability of these unified models is studied in detail.  We describe Bayesian analysis of these models, and discuss conceptual, methodological and computational issues related to model fitting and model selection. We describe detailed applications to survival data from breast cancer patients in the Surveillance, Epidemiology, and End Results (SEER) program of the National Cancer Institute (NCI) of the United States.



Friday, November 7
CBC-C112, 11:00 am

(refreshments at 10:45 am)

Prof. Usha Govindarajulu
Department of Epidemiology and Biostatistics

State University of New York, Medical Center

Brooklyn, New York

Title: Frailty models: Applications to biomedical and genetic studies


[Abstract]  In this talk, we provide a tutorial as an overview and general framework of frailty modeling and estimation for multiplicative hazards models in the context of biomedical and genetic studies. We will also briefly discuss other topics in frailty models, such as diagnostic methods for model adequacy and inference in frailty models. Some examples of analyses using multivariate frailty models in a non-parametric hazards setting on biomedical datasets will be shown and the implications of choosing to use frailty and relevance to genetic applications will also be discussed.



Friday, November 14
CBC-C112, 11:30 am

(refreshments at 11:15 am)

Prof. James Flegal
Department of Statistics

University of California, Riverside

Title: Relative fixed-width stopping rules for Markov chain Monte Carlo simulations


[Abstract]  Markov chain Monte Carlo (MCMC) simulations are commonly employed for estimating features of a target distribution, particularly for Bayesian inference.  A fundamental challenge is determining when these simulations should stop.  We consider a sequential stopping rule that terminates the simulation when the width of a confidence interval is sufficiently small relative to the size of the target parameter.  Specifically, we propose relative magnitude and relative standard deviation stopping rules in the context of MCMC.  In each setting, we develop sufficient conditions for asymptotic validity, that is, conditions to ensure the simulation will terminate with probability one and the resulting confidence intervals will have the proper coverage probability.  Our results are applicable in a wide variety of MCMC estimation settings, such as expectation, quantile, or simultaneous multivariate estimation.  Finally, we investigate the finite sample properties through a variety of examples and provide some recommendations to practitioners.
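A relative standard deviation stopping rule can be sketched in a few lines: stop when the confidence-interval half-width for the estimated mean is small relative to the sample standard deviation. In the toy sketch below, an AR(1) chain stands in for an MCMC sampler, and the batch-means settings, tolerance, and minimum-effort term are illustrative assumptions, not the authors' recommendations.

```python
import math
import random
import statistics

def batch_means_se(x, n_batches=20):
    """Batch-means standard error of the sample mean of a correlated series."""
    b = len(x) // n_batches
    means = [statistics.fmean(x[i * b:(i + 1) * b]) for i in range(n_batches)]
    return statistics.stdev(means) / math.sqrt(n_batches)

def run_until_stable(eps=0.05, min_n=1_000, max_n=200_000, seed=1):
    """Run a toy chain until the 95% CI half-width for the mean is at most
    eps times the sample standard deviation (a relative sd stopping rule)."""
    rng = random.Random(seed)
    x, state = [], 0.0
    while len(x) < max_n:
        # AR(1) chain as a stand-in for an MCMC sampler (illustrative only)
        state = 0.9 * state + rng.gauss(0.0, 1.0)
        x.append(state)
        if len(x) >= min_n and len(x) % 1_000 == 0:
            half_width = 1.96 * batch_means_se(x)
            # the 1/n term forces a minimum amount of simulation effort
            if half_width + 1.0 / len(x) <= eps * statistics.stdev(x):
                break
    return len(x), statistics.fmean(x)

n, est = run_until_stable()
```

Because the target here has mean zero, the estimate at termination should be close to zero; the validity conditions in the talk are what guarantee this kind of rule terminates with proper coverage in general.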


 

Spring 2015


Friday, February 6, 2015
CBC-C126, 11:30 am

(refreshments at 11:15 am)

Dr. Aaron Luttman
National Security Technologies

U.S. Department of Energy


Title: Bayesian Methods for Image Reconstruction

with Feature-preserving Priors


[Abstract]  Many problems in image reconstruction, such as deconvolution and Radon or Abel inversion, can be formulated as linear inverse problems, and these problems frequently arise in the analysis of diagnostics in large-scale experimentation for the U.S. Department of Energy’s (DOE) science-based Stockpile Stewardship program. Classical, variational approaches to linear inverse problems can give excellent reconstructions, but there has been a recent shift within the DOE requiring that uncertainty estimates accompany the “solutions” that are reported. In this work we will present a Markov chain Monte Carlo approach to computing solutions to linear inverse problems in imaging, sampling from a posterior distribution derived from a true Poisson likelihood and a smoothness prior. We will also present a hierarchical Bayesian model for spatially adaptive regularization that allows the data to drive the structure of the prior, rather than enforcing smoothness a priori. The techniques will be demonstrated on real data captured at a U.S. Department of Energy X-ray radiography facility at the Nevada National Security Site.



Friday, March 20, 2015
CBC-C126, 11:30 am

(refreshments at 11:15 am)

Prof. Nalini Ravishanker
Department of Statistics

University of Connecticut, Storrs


Title: Estimating Function Approaches for Nonlinear Time Series


[Abstract]  Several classes of linear and nonlinear time series models are useful in many application areas, including biology, business, economics, engineering, and finance. There is ongoing research on developing fast and accurate methods for estimation and prediction. In situations where we can only specify the first few conditional moments of the observed process, but not the probability distributions, the framework of martingale estimating functions (EFs) provides an optimal approach for developing inference. This talk discusses linear EFs and the more informative quadratic EFs, which can be derived when information on higher-order conditional moments of the process is available. The EF approach is especially useful in practice when recursive estimates of model parameters can be derived, resulting in a computationally fast estimation approach. We illustrate this approach for different classes of nonlinear time series models, such as generalized duration models and random coefficient autoregressive models with heavy-tailed errors, which are useful in financial data analysis. Simulation studies enable us to assess the accuracy of the recursive estimates and the importance of good initial values for starting the recursions.



Friday, March 27, 2015
CBC-C126, 11:30 am

(refreshments at 11:15 am)

Prof. Subrata Kundu
Department of Statistics

George Washington University

Title: Nonparametric and Parametric Estimation in Software Reliability


[Abstract]  Non-homogeneous Poisson process (NHPP) models form a significant subclass of the many software reliability models proposed in the literature. First, we discuss some logical implications of NHPP models and the estimability of the underlying parameters. We prove an important limitation of NHPP models for which the limit of the expected number of failures m(t), as the testing time t tends to infinity, is finite: the parameters of those models cannot be estimated consistently as the testing time approaches infinity. Then, we present a nonparametric method for estimating ν, the number of bugs in a code, and investigate its properties. Our results show that the proposed estimator performs well in terms of bias and asymptotic normality, while the MLE of ν, derived under the assumption that the common renewal distribution is exponential, may be seriously biased if that assumption does not hold. We also present a new parametric approach.
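As a concrete example of the finite-limit setting, the classic Goel-Okumoto NHPP model has mean function m(t) = a(1 - e^{-bt}), which increases to the finite limit a. The sketch below is only an illustration of that mean function; the talk's result concerns the (in)consistency of estimating such parameters, which the sketch does not attempt.

```python
import math

def goel_okumoto_mean(t, a, b):
    """Expected failures by time t under Goel-Okumoto: m(t) = a(1 - e^{-bt}).

    m(t) increases monotonically to the finite limit a as t grows, which is
    exactly the finite-limit case where consistent estimation breaks down.
    """
    return a * (1.0 - math.exp(-b * t))
```

For instance, with a = 50 eventual bugs and detection rate b = 0.1, the expected failure count approaches 50 no matter how long testing continues, so additional testing time yields diminishing information about the parameters.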



Friday, April 10, 2015
CBC-C126, 11:30 am

(refreshments at 11:15 am)

Prof. Manoj Chacko
Department of Statistics

University of Kerala, India

Title: Ranked Set Sampling and its Applications in Parametric Estimation


[Abstract]  Ranked set sampling (RSS) is applicable whenever a set of sampling units can be ranked easily, either by a judgment method or based on the measurement of an auxiliary variable on the selected units. In this work, we consider ranked set sampling in which units are ranked based on measurements of an easily and exactly measurable auxiliary variable X that is correlated with the study variable Y, in order to estimate the parameters associated with Y. We assume both a Morgenstern-type bivariate exponential distribution and a bivariate Pareto distribution for (X, Y). Modifications of RSS such as extreme ranked set sampling and moving extreme ranked set sampling are also considered to obtain estimators of the parameters associated with Y. An efficiency comparison is made among all estimators considered in this work.
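The RSS mechanics described above can be sketched in a short simulation: in each cycle, one set of units is drawn per judgment rank, the set is ranked by the cheap auxiliary X, and the study variable Y is measured only on the unit holding that rank. The bivariate normal setup, correlation value, and function name below are illustrative assumptions; the talk instead uses Morgenstern-type bivariate exponential and bivariate Pareto models.

```python
import random
import statistics

def rss_sample(rng, set_size, cycles, rho=0.9):
    """Draw one balanced RSS sample of size set_size * cycles.

    In each cycle, for judgment rank i = 1..set_size, a fresh set of
    set_size units is drawn, ranked by the auxiliary X, and Y is measured
    only on the unit with rank i.
    """
    ys = []
    for _ in range(cycles):
        for rank in range(set_size):
            units = []
            for _ in range(set_size):
                x = rng.gauss(0.0, 1.0)
                # Y correlated with X (correlation rho)
                y = rho * x + (1.0 - rho**2) ** 0.5 * rng.gauss(0.0, 1.0)
                units.append((x, y))
            units.sort(key=lambda u: u[0])   # rank by the auxiliary X only
            ys.append(units[rank][1])        # measure Y on that ranked unit
    return ys

rng = random.Random(7)
sample = rss_sample(rng, set_size=3, cycles=50)
est = statistics.fmean(sample)               # RSS estimator of E[Y]
```

Because each measured Y comes from a known judgment rank, the RSS mean typically has smaller variance than a simple random sample mean of the same size, which is the efficiency gain the talk's comparisons quantify.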


 

 
