This book is concerned with statistical methods for the analysis of sample survey data. A survey may consist of data collected from a questionnaire or from measurements, such as those taken as part of a quality control process. The book updates and extends the successful volume edited by Skinner, Holt and Smith, 'Analysis of Complex Surveys'. The focus is on the methodological issues that arise when applying statistical methods to sample survey data, and the impact of complex sampling schemes is discussed in detail. Further issues, such as how to deal with missing data and measurement error, are also critically discussed. There have been significant improvements in statistical software implementing complex sampling schemes (e.g. SUDAAN, STATA, WESVAR, PC CARP) in the last decade, and there is a greater need for practical advice for those analysing survey data. To ensure a broad audience, the statistical theory is made accessible through practical examples. The book will be accessible to a broad audience of statisticians but will primarily be of interest to practitioners analysing survey data. Increased awareness among social scientists of the variety of powerful statistical methods available will make this book a useful reference.
Managing uncertainties in industrial systems is a daily challenge to ensure improved design, robust operation, accountable performance and responsive risk control. Authored by a leading European network of experts representing a cross-section of industries, Uncertainty in Industrial Practice aims to provide a reference for the dissemination of uncertainty treatment in any type of industry. It is concerned with the quantification of uncertainties in the presence of data, model(s) and knowledge about the system, and offers a technical contribution to decision-making processes whilst acknowledging industrial constraints. The approach presented can be applied to a range of business contexts, from research or early design through to certification or in-service processes. The authors aim to foster optimal trade-offs between literature-referenced methodologies and the simplified approaches often inevitable in practice, owing to the data, time or budget limitations of technical decision-makers. Uncertainty in Industrial Practice:
- Features recent uncertainty case studies carried out in the nuclear, air & space, oil, mechanical and civil engineering industries, set in a common methodological framework.
- Presents methods for organizing and treating uncertainties in a generic and prioritized perspective.
- Illustrates the practical difficulties and solutions encountered according to the level of complexity, the information available, and regulatory and financial constraints.
- Discusses best practice in uncertainty modeling, propagation and sensitivity analysis through a variety of statistical and numerical methods.
- Reviews recent standards, references and available software, providing an essential resource for engineers and risk analysts in a wide variety of industries.
This book provides a guide to dealing with quantitative uncertainty in engineering and modelling and is aimed at practitioners, including risk-industry regulators and academics wishing to develop industry-realistic methodologies.
Handbook of Computational Econometrics examines the state of the art of computational econometrics and provides exemplary studies dealing with computational issues arising from a wide spectrum of econometric fields, including such topics as bootstrapping, the evaluation of econometric software, and algorithms for control, optimization and estimation. Each topic is fully introduced before proceeding to a more in-depth examination of the relevant methodologies, with valuable illustrations. This book:
- Provides self-contained treatments of issues in computational econometrics, with illustrations and invaluable bibliographies.
- Brings together contributions from leading researchers.
- Develops the techniques needed to carry out computational econometrics.
- Features network studies, non-parametric estimation, optimization techniques, Bayesian estimation and inference, testing methods, time-series analysis, linear and nonlinear methods, VAR analysis, bootstrapping developments, signal extraction, and software history and evaluation.
This book will appeal to econometricians, financial statisticians, econometric researchers and students of econometrics at both graduate and advanced undergraduate levels.
Longitudinal surveys involve collecting data from multiple subjects on multiple occasions. They are typically used to collect data on social, economic, educational and health-related issues, and they serve as an important tool for economists, sociologists and other researchers. Focusing on the design, implementation and analysis of longitudinal surveys, Methodology of Longitudinal Surveys discusses the current state of the art in carrying out these surveys. The book also covers issues that arise in surveys that collect longitudinal data via retrospective methods. Aimed at researchers and practitioners analyzing data from statistical surveys, the book will also be suitable as supplementary reading for graduate students of survey statistics. This book:
- Covers all the main stages in the design, implementation and analysis of longitudinal surveys.
- Reviews recent developments in the field, including the use of dependent interviewing and mixed-mode data collection.
- Discusses the state of the art in sampling, weighting and non-response adjustment.
- Features worked examples throughout using real data.
- Addresses issues arising from the collection of data via retrospective methods, as well as ethical issues, confidentiality and non-response bias.
- Is written by an international team of contributors consisting of some of the most respected survey methodology experts in the field.
'This well-written book provides a clear and accessible treatment of the theory of discrete and continuous-time Markov chains, with an emphasis on applications. The mathematical treatment is precise and rigorous without superfluous detail, and the results are immediately illustrated in illuminating examples. This book will be extremely useful to anybody teaching a course on Markov processes.' Jean-François Le Gall, Professor at Université de Paris-Orsay, France.
Markov processes are the class of stochastic processes whose past and future are conditionally independent, given their present state. They constitute important models in many applied fields. After an introduction to the Monte Carlo method, this book describes discrete-time Markov chains, the Poisson process and continuous-time Markov chains. It also presents numerous applications, including Markov chain Monte Carlo, simulated annealing, hidden Markov models, annotation and alignment of genomic sequences, control and filtering, phylogenetic tree reconstruction and queuing networks. The last chapter is an introduction to stochastic calculus and mathematical finance. Features include:
- The Monte Carlo method, discrete-time Markov chains, the Poisson process and continuous-time jump Markov processes.
- An introduction to diffusion processes, mathematical finance and stochastic calculus.
- Applications of Markov processes to various fields, ranging from mathematical biology to financial engineering and computer science.
- Numerous exercises and problems, with solutions to most of them.
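The Markov property described above, that the future depends on the past only through the present, can be illustrated with a minimal simulation. The two-state 'weather' chain and its transition probabilities below are invented purely for illustration:

```python
import random

# Hypothetical 2-state weather chain; transition probabilities are
# invented for illustration only.
P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}

def step(state, rng):
    # The next state depends only on the current state (Markov property).
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(n, rng):
    state, counts = "sunny", {"sunny": 0, "rainy": 0}
    for _ in range(n):
        state = step(state, rng)
        counts[state] += 1
    return counts

counts = simulate(100_000, random.Random(42))
```

For this particular chain the stationary distribution works out to (5/7, 2/7), and the long-run state frequencies in `counts` approach it.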
Geostatistics is essential for environmental scientists. Weather and climate vary from place to place, soil varies at every scale at which it is examined, and even man-made attributes, such as the distribution of pollution, vary. The techniques used in geostatistics are ideally suited to the needs of environmental scientists, who use them to make the best of sparse data for prediction, and to plan future surveys when resources are limited. Geostatistical technology has advanced greatly in the last few years, and many of these developments are being incorporated into the practitioner's repertoire. This second edition describes these techniques for environmental scientists. Topics such as stochastic simulation, sampling, data screening, spatial covariances, the variogram and its modeling, and spatial prediction by kriging are described in rich detail. At each stage the underlying theory is fully explained and the rationale behind the choices given, allowing the reader to appreciate the assumptions and constraints involved.
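As a rough illustration of the variogram-based kriging prediction mentioned above, the sketch below implements ordinary kriging with an assumed exponential variogram model. The sample locations, values and variogram parameters are invented for illustration and are not taken from the book:

```python
import numpy as np

# Illustrative ordinary-kriging sketch; all data and parameters are invented.
def variogram(h, sill=1.0, rng=10.0, nugget=0.0):
    # Exponential semivariance model gamma(h).
    return nugget + sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, x0):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = variogram(d)          # semivariances between data points
    A[n, :n] = A[:n, n] = 1.0         # unbiasedness constraint (weights sum to 1)
    A[n, n] = 0.0
    b = np.empty(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    b[n] = 1.0
    w = np.linalg.solve(A, b)         # kriging weights plus Lagrange multiplier
    pred = w[:n] @ z                  # best linear unbiased prediction at x0
    var = w @ b                       # kriging (prediction) variance
    return pred, var

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
z = np.array([1.0, 2.0, 1.5, 3.0])
pred, var = ordinary_kriging(xy, z, np.array([0.5, 0.5]))
```

With no nugget effect, the predictor interpolates the data exactly: at a sampled location the prediction equals the observed value and the kriging variance is zero.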
This book provides a pedagogical examination of the way in which stochastic models are encountered in applied sciences and techniques such as physics, engineering, biology and genetics, economics and the social sciences. It covers Markov and semi-Markov models, as well as their particular cases: Poisson and renewal processes, branching processes, Ehrenfest models, genetic models, optimal stopping, reliability, reservoir theory, storage models, and queuing systems. Given this comprehensive treatment of the subject, students and researchers in the applied sciences, as well as anyone looking for an introduction to stochastic models, will find this title invaluable.
Praise for the First Edition: "For a beginner [this book] is a treasure trove; for an experienced person it can provide new ideas on how better to pursue the subject of applied statistics." —Journal of Quality Technology
Sensibly organized for quick reference, Statistical Rules of Thumb, Second Edition compiles simple rules that are widely applicable, robust and elegant, each capturing a key statistical concept. This unique guide to the use of statistics for designing, conducting and analyzing research studies illustrates real-world statistical applications through examples from fields such as public health and environmental studies. Along with an insightful discussion of the reasoning behind each technique, this easy-to-use handbook also conveys the various possibilities statisticians must consider when designing and conducting a study or analyzing its data. Each chapter presents clearly defined rules related to inference, covariation, experimental design, consultation and data representation, and each rule is organized and discussed under five succinct headings: introduction; statement and illustration of the rule; derivation of the rule; a concluding discussion; and exploration of the concept's extensions. The author also introduces new rules of thumb for topics such as sample size for ratio analysis, absolute and relative risk, ANCOVA cautions and dichotomization of continuous variables. Additional features of the Second Edition include:
- Additional rules on Bayesian topics.
- New chapters on observational studies and evidence-based medicine (EBM).
- Additional emphasis on variation and causation.
- Updated material with new references, examples and sources.
A related website provides a rich learning environment and contains additional rules, presentations by the author, and a message board where readers can share their own strategies and discoveries.
Statistical Rules of Thumb, Second Edition is an ideal supplementary book for courses in experimental design and survey research methods at the upper-undergraduate and graduate levels. It also serves as an indispensable reference for statisticians, researchers, consultants, and scientists who would like to develop an understanding of the statistical foundations of their research efforts. A related website www.vanbelle.org provides additional rules, author presentations and more.
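A widely quoted rule of thumb of this kind gives the per-group sample size for comparing two means: n ≈ 16/Δ², where Δ = (μ₁ − μ₀)/σ is the standardized difference, under a two-sided 5% test with 80% power. A minimal sketch:

```python
# Widely quoted sample-size rule of thumb for a two-group comparison:
# n per group ~ 16 / Delta**2, where Delta = (mu1 - mu0) / sigma is the
# standardized difference (two-sided alpha = 0.05, power = 0.80).
# The exact factor is 2 * (1.96 + 0.8416)**2 ~ 15.7, rounded up to 16.
def n_per_group(delta):
    """Approximate sample size per group for standardized difference delta."""
    return 16.0 / delta ** 2

print(n_per_group(0.5))  # a medium effect of Delta = 0.5 needs ~64 per group
```

The appeal of such rules is that they can be evaluated mentally at the design stage, before any formal power calculation is run.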
A comprehensive, step-by-step introduction to wavelets in statistics. What are wavelets? What makes them increasingly indispensable in statistical nonparametrics? Why are they suitable for "time-scale" applications? How are they used to solve such problems as denoising, regression, or density estimation? Where can one find up-to-date information on these newly "discovered" mathematical objects? These are some of the questions Brani Vidakovic answers in Statistical Modeling by Wavelets. Providing a much-needed introduction to the latest tools afforded statisticians by wavelet theory, Vidakovic compiles, organizes, and explains in depth research previously available only in disparate journal articles. He carefully balances statistical and mathematical techniques, supplementing the material with a wealth of examples, more than 100 illustrations, and extensive references, with data sets and S-Plus wavelet overviews made available for downloading over the Internet. Both introductory and data-oriented modeling topics are featured, including:
- Continuous and discrete wavelet transformations.
- Statistical optimality properties of wavelet shrinkage.
- Theoretical aspects of wavelet density estimation.
- Bayesian modeling in the wavelet domain.
- Properties of wavelet-based random functions and densities.
- Several novel and important wavelet applications in statistics.
- Wavelet methods in time series.
Accessible to anyone with a background in advanced calculus and algebra, Statistical Modeling by Wavelets promises to become the standard reference for statisticians and engineers seeking a comprehensive introduction to an emerging field.
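The denoising-by-shrinkage idea mentioned above can be sketched with the simplest wavelet, the Haar transform. This is an illustrative toy, not the book's code, and the rule shown is plain soft thresholding of the detail coefficients:

```python
import numpy as np

# Minimal Haar wavelet shrinkage sketch (illustrative only).
# Signal length must be a power of two.
def haar_dwt(x):
    coeffs, approx = [], np.asarray(x, dtype=float)
    while len(approx) > 1:
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)  # averages (smooth part)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)  # differences (details)
        coeffs.append(d)
        approx = a
    return approx, coeffs

def haar_idwt(approx, coeffs):
    a = approx
    for d in reversed(coeffs):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def soft_threshold(coeffs, t):
    # Wavelet shrinkage: pull detail coefficients toward zero by t.
    return [np.sign(d) * np.maximum(np.abs(d) - t, 0.0) for d in coeffs]
```

Denoising a signal amounts to `haar_idwt(a, soft_threshold(c, t))` for a suitable threshold t; without thresholding the transform reconstructs the input exactly.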
A new look at weak-convergence methods in metric spaces, from a master of probability theory. In this new edition, Patrick Billingsley updates his classic work Convergence of Probability Measures to reflect developments of the past thirty years. Widely known for his straightforward approach and reader-friendly style, Dr. Billingsley presents a clear, precise, up-to-date account of probability limit theory in metric spaces. He incorporates many examples and applications that illustrate the power and utility of this theory in a range of disciplines, from analysis and number theory to statistics, engineering, economics, and population biology. With an emphasis on the simplicity of the mathematics and smooth transitions between topics, the Second Edition boasts major revisions of the sections on dependent random variables, as well as new sections on relative measure, on lacunary trigonometric series, and on the Poisson-Dirichlet distribution as a description of the long cycles in permutations and the large divisors of integers. Assuming only standard measure-theoretic probability and metric-space topology, Convergence of Probability Measures provides statisticians and mathematicians with basic tools of probability theory as well as a springboard to the "industrial-strength" literature available today.