The book provides an overview of the most advanced quantum information geometric techniques, which can help quantum communication theorists analyze properties of quantum channels such as security or additivity. Each section addresses an area of major research in quantum information theory and quantum communication networks. The authors present the fundamental theoretical results of quantum information theory, while also detailing advanced quantum communication protocols with a clear mathematical and information-theoretic background. This book bridges the gap between quantum physics, quantum information theory, and practical engineering.
Big Data is a new field, with many technological challenges to be understood in order to use it to its full potential. These challenges arise at all stages of working with Big Data, beginning with data generation and acquisition. The storage and management phase presents two critical challenges: infrastructure, for storage and transportation, and conceptual models. Finally, extracting meaning from Big Data requires complex analysis. Here the authors propose metaheuristics as a solution to these challenges: first, they can deal with large-scale problems, and second, they are flexible and therefore easily adaptable to different types of data and different contexts. The use of metaheuristics to overcome some of these data mining challenges is introduced and justified in the first part of the book, alongside a specific protocol for the performance evaluation of algorithms. An introduction to metaheuristics follows. The second part of the book details a number of data mining tasks, including clustering, association rules, supervised classification and feature selection, before explaining how metaheuristics can be used to deal with them. This book is designed to be self-contained, so that readers can understand all of the concepts discussed within it, and to provide an overview of recent applications of metaheuristics to knowledge discovery problems in the context of Big Data.
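To give a flavour of the kind of metaheuristic the book applies to data mining tasks, here is a minimal simulated-annealing sketch. It is an illustrative example under our own assumptions (the function names and the toy objective are not from the book): the algorithm occasionally accepts worse moves, which lets it escape local optima on large, rugged search spaces.

```python
import math
import random

def simulated_annealing(objective, x0, neighbor, t0=1.0, cooling=0.95, iters=1000):
    """Generic simulated annealing: accept a worse neighbour with
    probability exp(-delta / T), where T decreases geometrically."""
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x)
        fy = objective(y)
        # Always accept improvements; accept degradations with
        # a probability that shrinks as the temperature drops.
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy objective with many local minima.
random.seed(0)
f = lambda v: v * v + 10 * math.sin(3 * v)
x, fx = simulated_annealing(f, 5.0, lambda v: v + random.uniform(-0.5, 0.5))
```

In a data mining setting the same skeleton applies: the "solution" becomes a clustering assignment or a feature subset, and `neighbor` becomes a small perturbation of it.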
A focused guide for healthcare simulation operations in education and training. With the growing use of simulation within the field of healthcare, Healthcare Simulation: A Guide for Operations Specialists provides a much needed resource for developing the roles and responsibilities of simulation operations specialists. The book illustrates the current state and evolution of the simulation professional workforce and discusses the topics necessary for the development of these pivotal roles. The book promotes the value of simulation-based education in healthcare and its associated outcomes while clarifying the operational requirements of successful simulations. Featuring numerous contributions from international experts, consultants, and specialists, Healthcare Simulation: A Guide for Operations Specialists presents advances in healthcare simulation techniques and also features: coverage of the best practices and available technologies for healthcare simulation operations specialists within healthcare education, training, and assessment; interdisciplinary, practical examples throughout to help readers better understand the presented material; an overview of the many facets of day-to-day operations within a healthcare simulation program; and discussions regarding the concurrent need for understanding proper patient care that accompanies the human-to-machine interface in patient simulation. Healthcare Simulation: A Guide for Operations Specialists is an excellent reference for healthcare simulation professionals including administrators, medical directors, managers, simulation technologists, faculty members, and educators in academic and healthcare settings. The book is also a useful supplementary textbook for graduate-level courses related to simulation and certificate programs in simulation education and simulation operations.
Programming multi-core and many-core computing systems. Sabri Pllana, Linnaeus University, Sweden; Fatos Xhafa, Technical University of Catalonia, Spain. Provides state-of-the-art methods for programming multi-core and many-core systems. The book comprises a selection of twenty-two chapters covering: fundamental techniques and algorithms; programming approaches; methodologies and frameworks; scheduling and management; testing and evaluation methodologies; and case studies for programming multi-core and many-core systems. Program development for multi-core processors, especially for heterogeneous multi-core processors, is significantly more complex than for single-core processors. However, programmers have traditionally been trained for the development of sequential programs, and only a small percentage of them have experience with parallel programming. In the past, only a relatively small group of programmers interested in High Performance Computing (HPC) was concerned with parallel programming issues, but the situation has changed dramatically with the appearance of multi-core processors in commonly used computing systems. It is expected that with the pervasiveness of multi-core processors, parallel programming will become mainstream. The pervasiveness of multi-core processors affects a large spectrum of systems, from embedded and general-purpose to high-end computing systems. This book assists programmers in mastering the efficient programming of multi-core systems, which is of paramount importance for the software-intensive industry as it moves toward a more effective product-development cycle. Key features: lessons, challenges, and roadmaps ahead; real-world examples and case studies; guidance for mastering the efficient programming of multi-core and many-core systems. The book serves as a reference for a larger audience of practitioners, young researchers and graduate-level students. A basic level of programming knowledge is required to use this book.
Evolutionary computation algorithms are employed to minimize functions with a large number of variables. Biogeography-based optimization (BBO) is an optimization algorithm based on the science of biogeography, which studies the migration patterns of species. These migration paradigms provide the main logic behind BBO. Due to the cross-disciplinary nature of optimization problems, there is a need to develop multiple approaches to tackle them and to study the theoretical reasoning behind their performance. This book explains the mathematical model of the BBO algorithm and its variants created to cope with continuous-domain problems (with and without constraints) and combinatorial problems.
This book presents a substantial description of the principles and applications of digital holography. The first part of the book deals with the mathematical basics and linear filtering theory necessary to approach the topic. The next part describes the fundamentals of diffraction theory and exhaustively details the numerical computation of diffracted fields using FFT algorithms. A thorough presentation of the principles of holography and digital holography, including digital color holography, is given in the third part. A special section is devoted to the algorithms and methods for the numerical reconstruction of holograms. There is also a chapter devoted to digital holographic interferometry with applications in holographic microscopy, quantitative phase contrast imaging, multidimensional deformation investigations, surface shape measurements, fluid mechanics, refractive index investigations, synthetic aperture imaging and information encrypting. Keys to understanding the differences between digital holography and speckle interferometry, and examples of software for hologram reconstruction, are also treated briefly. Contents 1. Mathematical Prerequisites. 2. The Scalar Theory of Diffraction. 3. Calculating Diffraction by Fast Fourier Transform. 4. Fundamentals of Holography. 5. Digital Off-Axis Fresnel Holography. 6. Reconstructing Wavefronts Propagated through an Optical System. 7. Digital Holographic Interferometry and Its Applications. Appendix. Examples of Digital Hologram Reconstruction Programs
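The FFT-based diffraction calculation at the core of digital hologram reconstruction can be illustrated compactly. The sketch below is our own simplified version of the single-FFT discrete Fresnel transform commonly used for off-axis hologram reconstruction (not code from the book's appendix); constant amplitude and phase prefactors are omitted.

```python
import numpy as np

def fresnel_fft(field, wavelength, z, dx):
    """Single-FFT discrete Fresnel transform: multiply the hologram-plane
    field by a quadratic phase factor (chirp), then take one 2-D FFT.
    field: complex n x n array sampled with pitch dx;
    z: reconstruction distance; constant prefactors are dropped."""
    n = field.shape[0]
    x = (np.arange(n) - n / 2) * dx
    X, Y = np.meshgrid(x, x)
    chirp = np.exp(1j * np.pi * (X**2 + Y**2) / (wavelength * z))
    # fftshift recentres the zero spatial frequency in the output plane.
    return np.fft.fftshift(np.fft.fft2(field * chirp))
```

Because the chirp has unit modulus, the transform conserves energy up to the usual unnormalized-FFT factor, which gives a quick sanity check on an implementation.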
This book provides an introduction to spatial analysis of disaggregated (or micro) spatial data. Particular emphasis is put on spatial data compilation and the structuring of the connections between observations. Descriptive analysis methods for spatial data are presented in order to identify and measure spatial dependency, both global and local. The authors then focus on autoregressive spatial models, which control for spatial dependency between the residuals of a basic linear statistical model, a dependency that violates one of the basic hypotheses of the ordinary least squares approach. This book is an accessible reference for students looking to work with spatial data but who lack an advanced statistical theoretical background.
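The standard descriptive statistic for the global spatial dependency mentioned above is Moran's I. A minimal sketch (our own illustrative implementation, not the book's code) computes it from a variable and a spatial weights matrix:

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x and spatial weights matrix w
    (w[i, j] > 0 when observations i and j are neighbours).
    Values near +1 indicate clustering of similar values, values near
    -1 dispersion, and values near -1/(n-1) no spatial dependence."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = x - x.mean()                      # deviations from the mean
    num = (w * np.outer(z, z)).sum()      # cross-products of neighbours
    return n / w.sum() * num / (z @ z)
```

For instance, four units in a line with rook-contiguity weights and a monotonically increasing variable yield a positive I, signalling positive spatial autocorrelation; it is precisely this kind of residual pattern that motivates the autoregressive models treated later in the book.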
Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields. Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service–Oriented Approach focuses on how to design, analyze, and deploy web service–based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also presents recent research and development results. This informative reference features application scenarios that include healthcare and biomedical applications, such as personalized healthcare processing, DNA sequence data processing, and electrocardiogram wave analysis, and presents: updated research and development results on the composition technologies of web services for increasingly sophisticated service requirements from various users and communities; fundamental methods such as Petri nets and social network analysis to advance the theory and applications of workflow design and web service composition; practical and real applications of the developed theory and methods for such platforms as personalized healthcare and Biomedical Informatics Grids; and the authors' efforts on advancing service composition methods for both business and scientific software systems, with theoretical and empirical contributions. With workflow-driven service composition and reuse being a hot topic in both academia and industry, this book is ideal for researchers, engineers, scientists, professionals, and students who work on service computing, software engineering, business and scientific workflow management, the internet, and management information systems (MIS).
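Petri nets, one of the fundamental methods cited above for workflow design, reduce to tokens held in places and moved by transitions. The following is a minimal sketch under our own naming (not the authors' code) of the core firing rule:

```python
def fire(marking, pre, post):
    """Fire a Petri-net transition if it is enabled.

    marking: dict place -> token count (current state)
    pre:     dict place -> tokens the transition consumes
    post:    dict place -> tokens the transition produces
    Returns the new marking, or None if the transition is disabled."""
    if any(marking.get(p, 0) < k for p, k in pre.items()):
        return None  # some input place lacks tokens: not enabled
    new = dict(marking)
    for p, k in pre.items():
        new[p] -= k
    for p, k in post.items():
        new[p] = new.get(p, 0) + k
    return new

# A two-step workflow: a token moves from 'start' to 'done'
# when the single task-transition fires.
m0 = {"start": 1, "done": 0}
m1 = fire(m0, {"start": 1}, {"done": 1})
```

In workflow analysis, reachable markings of such a net model the possible states of a service composition, which is what makes properties like deadlock-freedom checkable.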
The Fundamental Science in "Computer Science" Is the Science of Thought. For the first time, the collective genius of the great 18th-century German cognitive philosopher-scientists Immanuel Kant, Georg Wilhelm Friedrich Hegel, and Arthur Schopenhauer has been integrated into modern 21st-century computer science. In contrast to the languishing mainstream of Artificial Intelligence, this book takes the human thought system as its model, resulting in an entirely different approach. This book presents the architecture of a thoroughly and broadly educated human mind as translated into modern software engineering design terms. The result is The Autonomous System, based on dynamic logic and the architecture of the human mind. With its human-like intelligence, it is capable of rational thought, reasoning, and an understanding of itself and its tasks. "A system of thoughts must always have an architectural structure." —Arthur Schopenhauer, The World as Will and Presentation
Nowadays, billions of lines of code are written in the COBOL programming language. This book offers an analysis, a diagnosis, a strategy, a model-driven development (MDD) method and a tool to transform legacy COBOL into modernized applications that comply with Internet computing, Service-Oriented Architecture (SOA) and the cloud. It serves as a blueprint for those in charge of finding solutions to this considerable challenge.