Explores wide-ranging applications of modeling and simulation techniques that allow readers to conduct research and ask "What if?"

Principles of Modeling and Simulation: A Multidisciplinary Approach is the first book to provide an introduction to modeling and simulation techniques across diverse areas of study. Numerous researchers from the fields of social science, engineering, computer science, and business have collaborated on this work to explore the multifaceted uses of computational modeling while illustrating their applications in common spreadsheets. The book is organized into three succinct parts:

* Principles of Modeling and Simulation provides a brief history of modeling and simulation, outlines its many functions, and explores the advantages and disadvantages of using models in problem solving. Two major reasons to employ modeling and simulation are illustrated through the study of a specific problem in conjunction with related applications, giving readers insight into complex concepts.

* Theoretical Underpinnings examines various modeling techniques and introduces readers to two significant simulation concepts: discrete event simulation and simulation of continuous systems. This section details the two primary ways in which humans interface with simulations, and it explains the meaning and importance of verification and validation.

* Practical Domains delves into specific topics related to transportation, business, medicine, social science, and enterprise decision support. The challenges of modeling and simulation are discussed, along with advanced applied principles such as representation techniques, integration into the application infrastructure, and emerging technologies.
With its accessible style and wealth of real-world examples, Principles of Modeling and Simulation: A Multidisciplinary Approach is a valuable book for modeling and simulation courses at the upper-undergraduate and graduate levels. It is also an indispensable reference for researchers and practitioners working in statistics, mathematics, engineering, computer science, economics, and the social sciences who would like to further develop their understanding and knowledge of the field.
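As a taste of the "what if" experiments of the kind the book builds in spreadsheets, here is a minimal Monte Carlo sketch (an illustration of this reviewer's, not taken from the book; the stock level and demand distribution are made up):

```python
import random

# Hypothetical "what if" scenario: if daily demand is uniform on 0..9 units
# and we stock 6, how often do we sell out?
random.seed(42)

stock = 6
trials = 100_000

# Count simulated days on which demand meets or exceeds the stock level.
sellouts = sum(1 for _ in range(trials) if random.randint(0, 9) >= stock)

print(f"estimated P(sell out) = {sellouts / trials:.3f}")  # exact answer: 0.4
```

Changing `stock` and re-running is exactly the kind of "what if" question the spreadsheet models in the book are built to answer.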
This book explains the concept of spatial data quality, a key theory for minimizing the risks of data misuse in a specific decision-making context. Drawing together chapters written by authors who are specialists in their particular fields, it presents both the data producer's and the data user's perspectives on evaluating the quality of the vector or raster data they produce and use. It also covers the key concepts in this field, such as: how to describe the quality of vector or raster data; how to enhance this quality; how to evaluate and document it, using methods such as metadata; how to communicate it to users; and how to relate it to the decision-making process. Also included is a Foreword written by Professor Michael F. Goodchild.
Switching processes, invented by the author in 1977, are the main tool used in the investigation of traffic problems from automotive to telecommunications. The book provides a new approach to low-traffic problems based on the analysis of flows of rare events and queuing models. In the case of fast switching, averaging principle and diffusion approximation results are proved and applied to the investigation of transient phenomena for wide classes of overloaded queuing networks. The book is devoted to developing the asymptotic theory for the class of switching queuing models, which covers models in a Markov or semi-Markov environment, models under the influence of flows of external or internal perturbations, unreliable and hierarchic networks, etc.
Many scientific, medical, or engineering problems raise the issue of recovering physical quantities from indirect measurements; for instance, detecting or quantifying flaws or cracks within a material from acoustic or electromagnetic measurements at its surface is an essential problem of non-destructive evaluation. The concept of inverse problems originates precisely from the idea of inverting the laws of physics to recover a quantity of interest from measurable data. Unfortunately, most inverse problems are ill-posed, which means that precise and stable solutions are not easy to devise. Regularization is the key concept for solving inverse problems. The goal of this book is to deal with inverse problems and regularized solutions using Bayesian statistical tools, with a particular view to signal and image estimation. The first three chapters introduce the theoretical notions that make it possible to cast inverse problems within a mathematical framework. The next three chapters address the fundamental inverse problem of deconvolution in a comprehensive manner. Chapters 7 and 8 deal with advanced statistical questions linked to image estimation. In the last five chapters, the main tools introduced in the previous chapters are put into a practical context in important application areas, such as astronomy and medical imaging.
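The ill-posedness and regularization ideas described above can be illustrated with a small numerical sketch (this reviewer's illustration, not taken from the book; all parameter values are hypothetical): a one-dimensional deconvolution in which naive inversion amplifies noise, while a simple quadratic (Tikhonov) penalty, the simplest special case of the Bayesian regularization the book develops, stabilizes the solution.

```python
import numpy as np

# Hypothetical 1-D deconvolution: recover a signal x from blurred, noisy
# data y = H x + noise, where naive inversion of H is unstable.
rng = np.random.default_rng(0)

n = 50
x_true = np.zeros(n)
x_true[20:30] = 1.0                      # a simple "box" signal

# H: convolution with a short smoothing kernel -> an ill-conditioned matrix.
kernel = [0.25, 0.5, 0.25]
H = np.zeros((n, n))
for i in range(n):
    for j, k in enumerate(kernel):
        col = i + j - 1
        if 0 <= col < n:
            H[i, col] = k

y = H @ x_true + 0.05 * rng.standard_normal(n)

# Naive inversion amplifies the noise; Tikhonov regularization stabilizes it
# by solving  min ||H x - y||^2 + lam ||x||^2,  i.e. (H^T H + lam I) x = H^T y.
lam = 1e-2
x_naive = np.linalg.solve(H, y)
x_reg = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

print("condition number of H:", np.linalg.cond(H))
print("naive error:      ", np.linalg.norm(x_naive - x_true))
print("regularized error:", np.linalg.norm(x_reg - x_true))
```

In Bayesian terms, the penalty weight `lam` plays the role of a prior precision on the unknown signal, which is the connection the book makes precise.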
Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics, and stochastic processes) that are usually covered separately despite the similarity of the underlying mathematical theory. This title aims to redress that situation. It includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.
An accessible, clearly organized survey of the basic topics of measure theory for students and researchers in mathematics, statistics, and physics

In order to fully understand and appreciate advanced probability, analysis, and advanced mathematical statistics, a rudimentary knowledge of measure theory and related subjects must first be obtained. The Theory of Measures and Integration illuminates the fundamental ideas of the subject, fascinating in their own right, for both students and researchers, providing a useful theoretical background as well as a solid foundation for further inquiry. Eric Vestrup's patient and measured text presents the major results of classical measure and integration theory in a clear and rigorous fashion. Besides offering the mainstream fare, the author also offers detailed discussions of extensions, the structure of Borel and Lebesgue sets, set-theoretic considerations, the Riesz representation theorem, and the Hardy-Littlewood theorem, among other topics, employing a presentation style that is both evenly paced and user-friendly. Chapters include:
* Measurable Functions
* The Lp Spaces
* The Radon-Nikodym Theorem
* Products of Two Measure Spaces
* Arbitrary Products of Measure Spaces
Sections conclude with exercises that range in difficulty between easy "finger exercises" and substantial, independent points of interest. These more difficult exercises are accompanied by detailed hints and outlines. They demonstrate optional side paths in the subject as well as alternative ways of presenting the mainstream topics. In writing his proofs and notation, Vestrup targets the person who wants all of the details shown up front.
Ideal for graduate students in mathematics, statistics, and physics, as well as strong undergraduates in these disciplines and practicing researchers, The Theory of Measures and Integration proves both an able primary text for a real analysis sequence with a focus on measure theory and a helpful background text for advanced courses in probability and statistics.
Shorter, more concise chapters provide flexible coverage of the subject. Expanded coverage includes uncertainty and randomness, prior distributions, predictivism, estimation, analysis of variance, and classification and imaging. The book also includes topics not covered in other books, such as the de Finetti Transform. Author S. James Press is a leading modern authority on Bayesian statistics.
An important role of diagnostic medicine research is to estimate and compare the accuracies of diagnostic tests. This book provides a comprehensive account of statistical methods for the design and analysis of diagnostic studies, including sample size calculations, estimation of the accuracy of a diagnostic test, comparison of the accuracies of competing diagnostic tests, and regression analysis of diagnostic accuracy data. Discussing recently developed methods for the correction of verification bias and imperfect reference bias, methods for the analysis of clustered diagnostic accuracy data, and meta-analysis methods, Statistical Methods in Diagnostic Medicine explains:
* Common measures of diagnostic accuracy and designs for diagnostic accuracy studies
* Methods of estimation and hypothesis testing of the accuracy of diagnostic tests
* Meta-analysis
* Advanced analytic techniques, including methods for comparing correlated ROC curves in multi-reader studies, correcting verification bias, and correcting for an imperfect gold standard
Thoroughly detailed with numerous applications and end-of-chapter problems, as well as a related FTP site providing FORTRAN program listings, data sets, and instructional hints, Statistical Methods in Diagnostic Medicine is a valuable addition to the literature of the field, serving as a much-needed guide for both clinicians and advanced students.
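The basic accuracy measures the book starts from can be sketched numerically (an illustration of this reviewer's, not taken from the book or its FORTRAN programs; the counts and prevalence are hypothetical):

```python
# Basic accuracy measures for a binary diagnostic test, from hypothetical
# study counts.
tp, fn = 45, 5     # diseased patients: test positive / test negative
tn, fp = 80, 20    # healthy patients: test negative / test positive

sensitivity = tp / (tp + fn)          # P(test + | diseased)
specificity = tn / (tn + fp)          # P(test - | healthy)

# The positive predictive value depends on prevalence via Bayes' rule.
prevalence = 0.10
ppv = (sensitivity * prevalence) / (
    sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
)

print(f"sensitivity = {sensitivity:.2f}")
print(f"specificity = {specificity:.2f}")
print(f"PPV at 10% prevalence = {ppv:.2f}")
```

The prevalence dependence of PPV is one reason the book's more careful designs and corrections (for verification bias, for an imperfect gold standard) matter in practice.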
A path-breaking account of Markov decision processes: theory and computation

This book's clear presentation of theory, numerous chapter-end problems, and development of a unified method for the computation of optimal policies in both discrete and continuous time make it an excellent course text for graduate students and advanced undergraduates. Its comprehensive coverage of important recent advances in stochastic dynamic programming makes it a valuable working resource for operations research professionals, management scientists, engineers, and others. Stochastic Dynamic Programming and the Control of Queueing Systems presents the theory of optimization under the finite horizon, infinite horizon discounted, and average cost criteria. It then shows how optimal rules of operation (policies) for each criterion may be numerically determined. A great wealth of examples from the application area of the control of queueing systems is presented. Nine numerical programs for the computation of optimal policies are fully explicated. The Pascal source code for the programs is available for viewing and downloading on the Wiley Web site at www.wiley.com/products/subject/mathematics. The site contains a link to the author's own Web site and is also a place where readers may discuss developments on the programs or other aspects of the material.
The source files are also available via FTP at ftp://ftp.wiley.com/public/sci_tech_med/stochastic.

Stochastic Dynamic Programming and the Control of Queueing Systems features:
* Path-breaking advances in Markov decision process techniques, brought together for the first time in book form
* A theorem/proof format (proofs may be omitted without loss of continuity)
* Development of a unified method for the computation of optimal rules of system operation
* Numerous examples drawn mainly from the control of queueing systems
* Detailed discussions of nine numerical programs
* Helpful chapter-end problems
* Appendices with complete treatment of background material
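The kind of computation the book's numerical programs perform can be roughly sketched as follows (a sketch of this reviewer's in Python, not the book's Pascal code; the queue size, rates, costs, and reward are all hypothetical): value iteration for a discounted admission-control problem on a small queue.

```python
import numpy as np

# Value iteration for a discounted admission-control queue (hypothetical
# parameters). State s = number of customers (0..N); on each arrival we
# choose to admit (earning a reward but lengthening the queue) or reject.
N = 10                    # buffer size
lam, mu = 0.6, 1.0        # arrival and service rates
beta = 0.9                # discount factor per uniformized step
hold = 1.0                # holding cost per customer per step
reward = 5.0              # reward for each admitted customer

V = np.zeros(N + 1)
for _ in range(500):                       # iterate the Bellman operator
    V_new = np.empty_like(V)
    for s in range(N + 1):
        serve = V[max(s - 1, 0)]           # after a service completion
        # On an arrival, take the better of admitting and rejecting.
        admit = reward + V[s + 1] if s < N else V[s]
        arrive = max(admit, V[s])
        V_new[s] = -hold * s + beta * (lam * arrive + mu * serve) / (lam + mu)
    V = V_new

# The resulting policy: admit while the queue is short enough.
policy = [1 if s < N and reward + V[s + 1] >= V[s] else 0 for s in range(N + 1)]
print("admit? per state:", policy)
```

This is only the discounted criterion; the book treats the finite horizon and average cost criteria in the same unified computational framework.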
The analysis of variance is presented as an exploratory component of data analysis, while retaining the customary least squares fitting methods. Balanced data layouts are used to reveal key ideas and techniques for exploration. The approach emphasizes both the individual observations and the separate parts that the analysis produces. Most chapters include exercises, and the appendices give selected percentage points of the Gaussian, t, F, chi-squared, and studentized range distributions.
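The least-squares decomposition behind the analysis of variance can be sketched numerically (this reviewer's illustration, not taken from the book; a balanced one-way layout with made-up treatment effects):

```python
import numpy as np

# Balanced one-way layout: `groups` treatments, `reps` observations each,
# with hypothetical treatment effects plus Gaussian noise.
rng = np.random.default_rng(1)

groups, reps = 4, 6
effects = np.array([0.0, 0.5, 1.0, 1.5])

# y[i, j]: j-th observation under treatment i
y = effects[:, None] + rng.standard_normal((groups, reps))

grand_mean = y.mean()
group_means = y.mean(axis=1)

# Least squares splits each observation into
#   grand mean + treatment effect + residual,
# and the sums of squares add up accordingly.
ss_between = reps * np.sum((group_means - grand_mean) ** 2)
ss_within = np.sum((y - group_means[:, None]) ** 2)
ss_total = np.sum((y - grand_mean) ** 2)

df_between = groups - 1
df_within = groups * (reps - 1)
f_stat = (ss_between / df_between) / (ss_within / df_within)

print(f"SS between = {ss_between:.3f}, SS within = {ss_within:.3f}")
print(f"F({df_between}, {df_within}) = {f_stat:.2f}")
```

The exploratory emphasis described above amounts to looking at each of these separate parts (effects and residuals), not just the final F statistic.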