Mathematics

Various books in the Mathematics genre

Statistical and Managerial Techniques for Six Sigma Methodology. Theory and Application

Barone Stefano

Six Sigma methodology is a business management strategy that seeks to improve the quality of process output by identifying and removing the causes of errors and minimizing variability in manufacturing and business processes. This book examines the Six Sigma methodology by illustrating the most widespread tools and techniques involved in its application. Both managerial and statistical aspects are analysed, allowing the reader to apply these tools in the field. Furthermore, the book offers insight into variation and risk management and focuses on the structure and organizational aspects of Six Sigma projects. Key features: • Presents both statistical and managerial aspects of Six Sigma, covering both basic and more advanced statistical techniques. • Provides clear examples and case studies to illustrate the concepts and methodologies used in Six Sigma. • Written by experienced authors in the field. This textbook is ideal for graduates studying Six Sigma for Black Belt and Green Belt qualifications, as well as for engineering and quality management courses. Business consultants and consultancy firms implementing Six Sigma will also benefit from this book.
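
A core Six Sigma metric behind the "minimizing variability" goal described above is defects per million opportunities (DPMO). As a minimal sketch of this standard calculation (the function name and the sample data are hypothetical, not taken from the book):

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities, a standard Six Sigma metric."""
    total_opportunities = units * opportunities_per_unit
    return defects / total_opportunities * 1_000_000

# Hypothetical example: 3 defects found across 1,000 units,
# each unit inspected at 10 opportunities for error.
rate = dpmo(3, 1000, 10)  # 300.0 DPMO
```

The resulting DPMO is typically mapped to a sigma level via standard conversion tables.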

Basic Statistics. A Primer for the Biomedical Sciences

Clark Virginia A.

New Edition of a Classic Guide to Statistical Applications in the Biomedical Sciences In the last decade, there have been significant changes in the way statistics is incorporated into biostatistical, medical, and public health research. Addressing the need for a modernized treatment of these statistical applications, Basic Statistics, Fourth Edition presents relevant, up-to-date coverage of research methodology using careful explanations of basic statistics and how they are used to address practical problems that arise in medical and public health settings. Through concise and easy-to-follow presentations, readers will learn to interpret and examine data by applying common statistical tools, such as sampling, random assignment, and survival analysis. Continuing the tradition of its predecessor, this new edition outlines a thorough discussion of different kinds of studies and guides readers through the important, related decision-making processes, such as determining what information is needed and planning the collection process. The book equips readers with the knowledge to carry out these practices by explaining the various types of studies that are commonly conducted in the fields of medicine and public health, and how the level of evidence varies depending on the area of research. Data screening and data entry into statistical programs are explained and accompanied by illustrations of statistical analyses and graphs.
Additional features of the Fourth Edition include: • A new chapter on data collection that outlines the initial steps in planning biomedical and public health studies • A new chapter on nonparametric statistics that includes a discussion and application of the Sign test, the Wilcoxon Signed Rank test, and the Wilcoxon Rank Sum test and its relationship to the Mann-Whitney U test • An updated introduction to survival analysis that includes the Kaplan-Meier method for graphing the survival function and a brief introduction to tests for comparing survival functions • Incorporation of modern statistical software, such as SAS, Stata, SPSS, and Minitab, into the presented discussion of data analysis • Updated references at the end of each chapter Basic Statistics, Fourth Edition is an ideal book for courses on biostatistics, medicine, and public health at the upper-undergraduate and graduate levels. It is also appropriate as a reference for researchers and practitioners who would like to refresh their fundamental understanding of statistical techniques.
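
The Sign test mentioned among the new nonparametric material reduces to an exact binomial calculation on paired differences. The sketch below is a generic textbook formulation (not the book's own code), applied to hypothetical paired measurements:

```python
from math import comb

def sign_test(before, after):
    """Exact two-sided sign test for paired samples; ties are discarded."""
    diffs = [b - a for b, a in zip(before, after) if b != a]
    n = len(diffs)
    k = sum(d > 0 for d in diffs)  # count of positive differences
    # Under the null hypothesis, k ~ Binomial(n, 1/2)
    cdf = lambda m: sum(comb(n, i) for i in range(m + 1)) / 2 ** n
    return min(1.0, 2 * min(cdf(k), 1 - cdf(k - 1)))

# Hypothetical before/after measurements for 8 subjects:
p = sign_test([142, 150, 138, 160, 155, 149, 151, 158],
              [130, 141, 135, 152, 148, 140, 145, 150])
```

Libraries such as scipy.stats provide production implementations of this and the Wilcoxon tests; the sketch only exposes the underlying binomial logic.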

Discrete Fourier Analysis and Wavelets. Applications to Signal and Image Processing

Broughton S. Allen

A thorough guide to the classical and contemporary mathematical methods of modern signal and image processing Discrete Fourier Analysis and Wavelets presents a thorough introduction to the mathematical foundations of signal and image processing. Key concepts and applications are addressed in a thought-provoking manner and are implemented using vector, matrix, and linear algebra methods. With a balanced focus on mathematical theory and computational techniques, this self-contained book equips readers with the essential knowledge needed to transition smoothly from mathematical models to practical digital data applications. The book first establishes a complete vector space and matrix framework for analyzing signals and images. Classical methods such as the discrete Fourier transform, the discrete cosine transform, and their application to JPEG compression are outlined, followed by coverage of the Fourier series and the general theory of inner product spaces and orthogonal bases. The book then addresses convolution, filtering, and windowing techniques for signals and images. Finally, modern approaches are introduced, including wavelets and the theory of filter banks as a means of understanding the multiscale localized analysis underlying the JPEG 2000 compression standard. Throughout the book, examples using image compression demonstrate how mathematical theory translates into application. Additional applications such as progressive transmission of images, image denoising, spectrographic analysis, and edge detection are discussed. Each chapter provides a series of exercises as well as a MATLAB project that allows readers to apply mathematical concepts to solving real problems. Additional MATLAB routines are available via the book's related Web site.
With its insightful treatment of the underlying mathematics in image compression and signal processing, Discrete Fourier Analysis and Wavelets is an ideal book for mathematics, engineering, and computer science courses at the upper-undergraduate and beginning graduate levels. It is also a valuable resource for mathematicians, engineers, and other practitioners who would like to learn more about the relevance of mathematics in digital data processing.
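
The discrete Fourier transform at the heart of the classical material above reduces to a single summation. A naive O(N²) sketch (real implementations use the FFT; the test signal is an illustrative choice, not from the book):

```python
import cmath
import math

def dft(x):
    """Naive DFT: X[k] = sum over n of x[n] * exp(-2*pi*i*k*n/N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# A pure cosine at frequency 1 concentrates its energy in bins 1 and N-1.
signal = [math.cos(2 * math.pi * n / 8) for n in range(8)]
spectrum = dft(signal)
```

For the length-8 cosine above, bins 1 and 7 each carry magnitude N/2 = 4 and the remaining bins are numerically zero, illustrating the orthogonality on which the DCT and JPEG compression also rest.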

Statistics and Probability with Applications for Engineers and Scientists

Guttman Irwin

Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Probability with Applications for Engineers and Scientists covers descriptive statistics first, then goes on to discuss the fundamentals of probability theory. Along with case studies, examples, and real-world data sets, the book incorporates clear instructions on how to use the statistical packages Minitab® and Microsoft® Office Excel® to analyze various data sets. The book also features: • Detailed discussions on sampling distributions, statistical estimation of population parameters, hypothesis testing, reliability theory, statistical quality control including Phase I and Phase II control charts, and process capability indices • A clear presentation of nonparametric methods and simple and multiple linear regression methods, as well as a brief discussion of the logistic regression method • Comprehensive guidance on the design of experiments, including randomized block designs, one- and two-way layout designs, Latin square designs, random effects and mixed effects models, factorial and fractional factorial designs, and response surface methodology • A companion website containing data sets for Minitab and Microsoft Office Excel, as well as JMP® routines and results Assuming no background in probability and statistics, Statistics and Probability with Applications for Engineers and Scientists features a unique, yet tried-and-true, approach that is ideal for all undergraduate students as well as statistical practitioners who analyze and illustrate real-world data in engineering and the natural sciences.
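
Of the quality-control topics listed above, the process capability indices have especially compact definitions: Cp compares the specification width to the natural process spread, while Cpk also penalizes off-center processes. A sketch with hypothetical specification limits (not data from the book):

```python
def capability_indices(mean, std, lsl, usl):
    """Cp = (USL - LSL) / (6*sigma); Cpk = min(USL - mu, mu - LSL) / (3*sigma)."""
    cp = (usl - lsl) / (6 * std)
    cpk = min(usl - mean, mean - lsl) / (3 * std)
    return cp, cpk

# Hypothetical process: mean 10.2, sigma 0.5, spec limits 8.0 to 12.0
cp, cpk = capability_indices(10.2, 0.5, 8.0, 12.0)
```

Here Cp ≈ 1.33 but Cpk = 1.2, reflecting that the process mean sits closer to the upper limit.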

Lower Previsions

Troffaes Matthias C.M.

This book has two main purposes. On the one hand, it provides a concise and systematic development of the theory of lower previsions, based on the concept of acceptability, in the spirit of the work of Williams and Walley. On the other hand, it also extends this theory to deal with unbounded quantities, which abound in practical applications. Following Williams, we start out with sets of acceptable gambles. From those, we derive rationality criteria (avoiding sure loss and coherence) and inference methods (natural extension) for unconditional lower previsions. We then proceed to study various aspects of the resulting theory, including the concept of expectation (linear previsions), limits, vacuous models, classical propositional logic, lower oscillations, and monotone convergence. We discuss n-monotonicity for lower previsions, and relate lower previsions to Choquet integration, belief functions, random sets, possibility measures, various integrals, symmetry, and representation theorems based on the Bishop-De Leeuw theorem. Next, we extend the framework of sets of acceptable gambles to include unbounded quantities as well. As before, we again derive rationality criteria and inference methods for lower previsions, this time also allowing for conditioning. We apply this theory to construct extensions of lower previsions from bounded random quantities to a larger set of random quantities, based on ideas borrowed from the theory of Dunford integration. A first step is to extend a lower prevision to random quantities that are bounded on the complement of a null set (essentially bounded random quantities). This extension is achieved by a natural extension procedure that can be motivated by a rationality axiom stating that adding null random quantities does not affect acceptability.
In a further step, we approximate unbounded random quantities by sequences of bounded ones, and, in essence, we identify those for which the induced lower prevision limit does not depend on the details of the approximation. We call those random quantities 'previsible'. We study previsibility by cut sequences, and arrive at a simple sufficient condition. For the 2-monotone case, we establish a Choquet integral representation for the extension. For the general case, we prove that the extension can always be written as an envelope of Dunford integrals. We end with some examples of the theory.

Statistics for Spatio-Temporal Data

Wikle Christopher K.

Winner of the 2013 DeGroot Prize. A state-of-the-art presentation of spatio-temporal processes, bridging classic ideas with modern hierarchical statistical modeling concepts and the latest computational methods. Noel Cressie and Christopher K. Wikle are also winners of the 2011 PROSE Award in the Mathematics category for the book "Statistics for Spatio-Temporal Data" (2011), published by John Wiley and Sons. (The PROSE awards, for Professional and Scholarly Excellence, are given by the Association of American Publishers, the national trade association of the US book publishing industry.) Statistics for Spatio-Temporal Data has now been reprinted with small corrections to the text and the bibliography. The overall content and pagination of the new printing remain the same; the difference comes in the form of corrections to typographical errors, editing of incomplete and missing references, and some updated spatio-temporal interpretations. From understanding environmental processes and climate trends to developing new technologies for mapping public-health data and the spread of invasive species, there is a high demand for statistical analyses of data that take spatial, temporal, and spatio-temporal information into account. Statistics for Spatio-Temporal Data presents a systematic approach to key quantitative techniques that incorporate the latest advances in statistical computing as well as hierarchical, particularly Bayesian, statistical modeling, with an emphasis on dynamical spatio-temporal models. Cressie and Wikle supply a unique presentation that incorporates ideas from the areas of time series and spatial statistics as well as stochastic processes. Beginning with separate treatments of temporal data and spatial data, the book combines these concepts to discuss spatio-temporal statistical methods for understanding complex processes.
Topics of coverage include: • Exploratory methods for spatio-temporal data, including visualization, spectral analysis, empirical orthogonal function analysis, and LISAs • Spatio-temporal covariance functions, spatio-temporal kriging, and time series of spatial processes • Development of hierarchical dynamical spatio-temporal models (DSTMs), with discussion of linear and nonlinear DSTMs and computational algorithms for their implementation • Quantifying and exploring spatio-temporal variability in scientific applications, including case studies based on real-world environmental data Throughout the book, interesting applications demonstrate the relevance of the presented concepts. Vivid, full-color graphics emphasize the visual nature of the topic, and a related FTP site contains supplementary material. Statistics for Spatio-Temporal Data is an excellent book for a graduate-level course on spatio-temporal statistics. It is also a valuable reference for researchers and practitioners in the fields of applied mathematics, engineering, and the environmental and health sciences.

Making Sense of Data I. A Practical Guide to Exploratory Data Analysis and Data Mining

Johnson Wayne P.

Praise for the First Edition “…a well-written book on data analysis and data mining that provides an excellent foundation…” —CHOICE “This is a must-read book for learning practical statistics and data analysis…” —Computing Reviews.com A proven go-to guide for data analysis, Making Sense of Data I: A Practical Guide to Exploratory Data Analysis and Data Mining, Second Edition focuses on basic data analysis approaches that are necessary to make timely and accurate decisions in a diverse range of projects. Based on the authors’ practical experience in implementing data analysis and data mining, the new edition provides clear explanations that guide readers from almost every field of study. In order to facilitate the needed steps when handling a data analysis or data mining project, a step-by-step approach aids professionals in carefully analyzing data and implementing results, leading to the development of smarter business decisions. The tools to summarize and interpret data in order to master data analysis are integrated throughout, and the Second Edition also features: • Updated exercises for both manual and computer-aided implementation with accompanying worked examples • New appendices with coverage on the freely available Traceis™ software, including tutorials using data from a variety of disciplines such as the social sciences, engineering, and finance • New topical coverage on multiple linear regression and logistic regression to provide a range of widely used and transparent approaches • Additional real-world examples of data preparation to establish a practical background for making decisions from data Making Sense of Data I: A Practical Guide to Exploratory Data Analysis and Data Mining, Second Edition is an excellent reference for researchers and professionals who need to achieve effective decision making from data.
The Second Edition is also an ideal textbook for undergraduate and graduate-level courses in data analysis and data mining and is appropriate for cross-disciplinary courses found within computer science and engineering departments.

Handbook of Web Surveys

Bethlehem Jelke

BEST PRACTICES TO CREATE AND IMPLEMENT HIGHLY EFFECTIVE WEB SURVEYS Exclusively combining design and sampling issues, Handbook of Web Surveys presents a theoretical yet practical approach to creating and conducting web surveys. From the history of web surveys to various modes of data collection to tips for detecting error, this book thoroughly introduces readers to this cutting-edge technique and offers tips for creating successful web surveys. The authors provide a history of web surveys and go on to explore the advantages and disadvantages of this mode of data collection. Common challenges involving under-coverage, self-selection, and measurement errors are discussed, as are topics including: • Sampling designs and estimation procedures • Comparing web surveys to face-to-face, telephone, and mail surveys • Errors in web surveys • Mixed-mode surveys • Weighting techniques, including post-stratification, generalized regression estimation, and raking ratio estimation • Use of propensity scores to correct bias • Web panels Real-world examples illustrate the discussed concepts, methods, and techniques, with related data freely available on the book's website. Handbook of Web Surveys is an essential reference for researchers in the fields of government, business, economics, and the social sciences who utilize technology to gather, analyze, and draw results from data. It is also a suitable supplement for survey methods courses at the upper-undergraduate and graduate levels.

Theory of Computational Complexity

Ko Ker-I

Praise for the First Edition “…complete, up-to-date coverage of computational complexity theory…the book promises to become the standard reference on computational complexity.” —Zentralblatt MATH A thorough revision based on advances in the field of computational complexity and readers’ feedback, the Second Edition of Theory of Computational Complexity presents updates to the principles and applications essential to understanding modern computational complexity theory. The new edition continues to serve as a comprehensive resource on the use of software and computational approaches for solving algorithmic problems and the related difficulties that can be encountered. Maintaining extensive and detailed coverage, Theory of Computational Complexity, Second Edition, examines the theory and methods behind complexity theory, such as computational models, decision tree complexity, circuit complexity, and probabilistic complexity. The Second Edition also features recent developments in areas such as NP-completeness theory, as well as: • A new combinatorial proof of the PCP theorem based on the notion of expander graphs, a research area in the field of computer science • Additional exercises at varying levels of difficulty to further test comprehension of the presented material • End-of-chapter literature reviews that summarize each topic and offer additional sources for further study Theory of Computational Complexity, Second Edition, is an excellent textbook for courses on computational theory and complexity at the graduate level. The book is also a useful reference for practitioners in the fields of computer science, engineering, and mathematics who utilize state-of-the-art software and computational methods to conduct research.

Combinatorial Reasoning. An Introduction to the Art of Counting

DeTemple Duane

Written by two well-known scholars in the field, Combinatorial Reasoning: An Introduction to the Art of Counting presents a clear and comprehensive introduction to the concepts and methodology of beginning combinatorics. Focusing on modern techniques and applications, the book develops a variety of effective approaches to solving counting problems. Balancing abstract ideas with specific topical coverage, the book utilizes real-world examples with problems ranging from basic calculations that are designed to develop fundamental concepts to more challenging exercises that allow for a deeper exploration of complex combinatorial situations. Simple cases are treated first before moving on to general and more advanced cases. Additional features of the book include: • Approximately 700 carefully structured problems designed for readers at multiple levels, many with hints and/or short answers • Numerous examples that illustrate problem solving using both combinatorial reasoning and sophisticated algorithmic methods • A novel approach to the study of recurrence sequences, which simplifies many proofs and calculations • Concrete examples and diagrams interspersed throughout to further aid comprehension of abstract concepts • A chapter-by-chapter review to clarify the most crucial concepts covered Combinatorial Reasoning: An Introduction to the Art of Counting is an excellent textbook for upper-undergraduate and beginning graduate-level courses on introductory combinatorics and discrete mathematics.
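
Recurrence sequences of the kind highlighted above typically arise directly from counting arguments. A classic instance (a standard example, not taken from the book) counts binary strings with no two adjacent 1s:

```python
def no_adjacent_ones(n):
    """Count binary strings of length n containing no '11'.
    Recurrence: a(n) = a(n-1) + a(n-2), with a(0) = 1, a(1) = 2.
    A valid string either ends in '0' (append to any valid string of
    length n-1) or ends in '01' (append to one of length n-2)."""
    if n == 0:
        return 1
    a, b = 1, 2  # a(0), a(1)
    for _ in range(n - 1):
        a, b = b, a + b
    return b

# For n = 3 the 5 valid strings are 000, 001, 010, 100, 101.
```

The answer is a shifted Fibonacci number, illustrating how a simple recurrence replaces case-by-case enumeration.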