Morgan and Claypool Synthesis Digital Library

788 Results Returned

  • Digital Control in Power Electronics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book presents to the reader, whether an electrical engineering student in power electronics or a design engineer, some typical power converter control problems and their basic digital solutions, based on the most widespread digital control techniques. The presentation is focused on different applications of the same power converter topology, the half-bridge voltage source inverter, considered both in its single- and three-phase implementation. This is chosen as the case study because, besides being simple and well known, it allows the discussion of a significant spectrum of the more frequently encountered digital control applications in power electronics, from digital pulse width modulation (DPWM) and space vector modulation (SVM), to inverter output current and voltage control. The book aims to serve two purposes: to give a basic, introductory knowledge of the digital control techniques applied to power converters, and to raise interest in discrete-time control theory, stimulating new developments in its application to switching power converters.
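
    A minimal sketch of the kind of discrete-time control loop the book introduces (not an excerpt from the book; the first-order load model, gains Kp and Ki, and sampling period Ts are illustrative assumptions) can be written in a few lines of Python:

      # Digital PI current controller driving a clamped PWM duty cycle (illustrative only).
      Ts, Kp, Ki = 1.0 / 20e3, 0.05, 40.0   # 20 kHz sampling and assumed controller gains
      L, R, Vdc = 2e-3, 0.5, 400.0          # assumed inductance, resistance, DC-link voltage
      i, integ, i_ref = 0.0, 0.0, 10.0      # load current, integrator state, reference (A)

      for k in range(2000):
          err = i_ref - i
          integ += Ki * Ts * err
          duty = max(0.0, min(1.0, Kp * err + integ))   # clamp to a valid duty cycle
          v_avg = duty * Vdc                            # average voltage applied to the load
          i += Ts * (v_avg - R * i) / L                 # forward-Euler update of the RL load

      print(f"steady-state current ~ {i:.2f} A (reference {i_ref} A)")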

  • A Tutorial on Queuing and Trunking with Applications to Communications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The motivation for developing this synthesis lecture was to provide a tutorial on queuing and trunking, with extensions to networks of queues, suitable for supplementing courses in communications, stochastic processes, and networking. An essential component of this lecture is MATLAB-based demonstrations and exercises, which can be easily modified to enable the student to observe and evaluate the impact of changing parameters, arrival and departure statistics, queuing disciplines, the number of servers, and other important aspects of the underlying system model. Much of the work in this lecture is based on Poisson statistics, since Poisson models are useful because they are analytically tractable and provide a useful approximation for many applications. We recognize that the validity of Poisson statistics is questionable for a number of networking applications, and therefore we briefly discuss self-similar models and the Hurst parameter, long-term dependent models, the Pareto distribution, and other related topics. Appropriate references are given for continued study on these topics. The initial chapters of this book consider individual queues in isolation. The systems studied consist of an arrival process, a single queue with a particular queuing discipline, and one or more servers. While this allows us to study the basic concepts of queuing and trunking, modern data networks consist of many queues that interact in complex ways. While many of these interactions defy analysis, the final chapter introduces a model of a network of queues in which, after being served in one queue, customers may join another queue. The key result for this model is known as Jackson's Theorem. Finally, we state the BCMP Theorem, which can be viewed as a further extension of Jackson's Theorem, and present Kleinrock's formula, which can be viewed as the network version of Little's Theorem. Table of Contents: Introduction / Poisson, Erlang, and Pareto Distributions / A Brief Introduction to Queueing Theory / Blocking and Delay / Networks of Queues
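
    As a rough stand-in for the MATLAB demonstrations mentioned above (a sketch of standard textbook formulas, not material from the lecture; the traffic values are assumptions), the Erlang B blocking probability and basic M/M/1 quantities via Little's theorem can be computed as follows:

      def erlang_b(offered_load, servers):
          """Blocking probability of an M/M/c/c loss system via the stable recursion."""
          b = 1.0
          for c in range(1, servers + 1):
              b = offered_load * b / (c + offered_load * b)
          return b

      print("Erlang B blocking, 300 erlangs on 320 trunks:", erlang_b(300.0, 320))

      lam, mu = 8.0, 10.0                     # assumed M/M/1 arrival and service rates
      rho = lam / mu
      L = rho / (1.0 - rho)                   # mean number in system
      W = L / lam                             # Little's theorem: L = lam * W
      print("M/M/1 mean number in system:", L, " mean time in system:", W)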

  • Nonlinear Source Separation

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The purpose of this lecture book is to present the state of the art in nonlinear blind source separation, in a form appropriate for students, researchers and developers. Source separation deals with the problem of recovering sources that are observed in a mixed condition. When we have little knowledge about the sources and about the mixture process, we speak of blind source separation. Linear blind source separation is a relatively well-studied subject; nonlinear blind source separation is still at a less advanced stage, but has seen several significant developments in the last few years. This publication reviews the main nonlinear separation methods, including the separation of post-nonlinear mixtures, and the MISEP, ensemble learning and kTDSEP methods for generic mixtures. These methods are studied in significant depth. A historical overview is also presented, mentioning most of the relevant results on nonlinear blind source separation that have been presented over the years.

  • Fundamentals of Respiratory System and Sounds Analysis

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Breath sounds have long been important indicators of respiratory health and disease. Acoustical monitoring of respiratory sounds has been used by researchers for various diagnostic purposes. A few decades ago, physicians relied on their hearing to detect any symptomatic signs in the respiratory sounds of their patients. However, with the aid of computer technology and digital signal processing techniques in recent years, breath sound analysis has drawn much attention because of its diagnostic capabilities. Computerized respiratory sound analysis can now quantify changes in lung sounds, make permanent records of the measurements made, and produce graphical representations that help with the diagnosis and treatment of patients suffering from lung diseases. Digital signal processing techniques have been widely used to derive characteristic features of the lung sounds for both diagnostic and treatment-assessment purposes. Although the analytical techniques of signal processing are largely independent of the application, interpretation of their results on biological data, i.e., respiratory sounds, requires substantial understanding of the physiological system involved. This lecture series begins with an overview of the anatomy and physiology related to the human respiratory system, and proceeds to advanced research in respiratory sound analysis and modeling, and their application as diagnostic aids. Although some of the signal processing techniques used are explained briefly, the intention of this book is not to describe the analytical methods of signal processing but their application and how the results can be interpreted. The book is written for engineers with university-level knowledge of mathematics and digital signal processing.
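
    As a small illustration of the kind of preprocessing such analysis assumes (a generic sketch, not taken from the book; the sampling rate, band edges, and synthetic signal are assumptions), a breath-sound recording might be band-limited and reduced to an intensity envelope before feature extraction:

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 8000                                    # assumed sampling rate (Hz)
      t = np.arange(0, 2.0, 1.0 / fs)
      x = np.random.randn(t.size)                  # stand-in for a recorded breath sound

      # Band-pass to the range where most normal lung-sound energy lies (~100-2000 Hz).
      b, a = butter(4, [100 / (fs / 2), 2000 / (fs / 2)], btype="band")
      y = filtfilt(b, a, x)

      # A crude RMS intensity envelope, often the first feature derived from breath sounds.
      frame = int(0.05 * fs)
      envelope = np.sqrt(np.convolve(y ** 2, np.ones(frame) / frame, mode="same"))
      print("peak RMS intensity:", envelope.max())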

  • Face Detection and Adaptation

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Face detection, because of its vast array of applications, is one of the most active research areas in computer vision. In this book, we review various approaches to face detection developed in the past decade, with more emphasis on boosting-based learning algorithms. We then present a series of algorithms that are empowered by the statistical view of boosting and the concept of multiple instance learning. We start by describing a boosting learning framework that is capable of handling billions of training examples. It differs from traditional bootstrapping schemes in that no intermediate thresholds need to be set during training, yet the total number of negative examples used for feature selection remains constant and focused (on the poorly performing ones). A multiple instance pruning scheme is then adopted to set the intermediate thresholds after boosting learning. This algorithm generates detectors that are both fast and accurate. We then present two multiple instance learning schemes for face detection, multiple instance learning boosting (MILBoost) and winner-take-all multiple category boosting (WTA-McBoost). MILBoost addresses the uncertainty in accurately pinpointing the location of the object being detected, while WTA-McBoost addresses the uncertainty in determining the most appropriate subcategory label for multiview object detection. Both schemes can resolve the ambiguity of the labeling process and reduce outliers during training, which leads to improved detector performance. In many applications, a detector trained with generic data sets may not perform optimally in a new environment. We propose detector adaptation, which is a promising solution for this problem. We present an adaptation scheme based on the Taylor expansion of the boosting learning objective function, and we propose to store the second-order statistics of the generic training data for future adaptation. We show that with a small amount of labeled data in the new environment, the detector's performance can be greatly improved. We also present two interesting applications where boosting learning was applied successfully. The first application is face verification for filtering and ranking image/video search results on celebrities. We present boosted multi-task learning (MTL), yet another boosting learning algorithm that extends MILBoost with a graphical model. Since the available number of training images for each celebrity may be limited, learning individual classifiers for each person may cause overfitting. MTL jointly learns classifiers for multiple people by sharing a few boosting classifiers in order to avoid overfitting. The second application addresses the need for speaker detection in conference rooms. The goal is to find who is speaking, given a microphone array and a panoramic video of the room. We show that by combining audio and visual features in a boosting framework, we can determine the speaker's position very accurately. Finally, we offer our thoughts on future directions for face detection. Table of Contents: A Brief Survey of the Face Detection Literature / Cascade-based Real-Time Face Detection / Multiple Instance Learning for Face Detection / Detector Adaptation / Other Applications / Conclusions and Future Work

  • Camera Networks: The Acquisition and Analysis of Videos over Wide Areas

    Copyright Year: 2012

    Morgan and Claypool eBooks

    As networks of video cameras are installed in many applications like security and surveillance, environmental monitoring, disaster response, and assisted living facilities, among others, image understanding in camera networks is becoming an important area of research and technology development. There are many challenges that need to be addressed in the process. Some of them are listed below: - Traditional computer vision challenges in tracking and recognition, robustness to pose, illumination, occlusion, clutter, recognition of objects, and activities; - Aggregating local information for wide area scene understanding, like obtaining stable, long-term tracks of objects; - Positioning of the cameras and dynamic control of pan-tilt-zoom (PTZ) cameras for optimal sensing; - Distributed processing and scene analysis algorithms; - Resource constraints imposed by different applications like security and surveillance, environmental monitoring, disaster response, assisted living facilities, etc. In this book, we focus on the basic research problems in camera networks, review the current state-of-the-art and present a detailed description of some of the recently developed methodologies. The major underlying theme in all the work presented is to take a network-centric view whereby the overall decisions are made at the network level. This is sometimes achieved by accumulating all the data at a central server, while at other times by exchanging decisions made by individual cameras based on their locally sensed data. Chapter One starts with an overview of the problems in camera networks and the major research directions. Some of the currently available experimental testbeds are also discussed here. One of the fundamental tasks in the analysis of dynamic scenes is to track objects. Since camera networks cover a large area, the systems need to be able to track over such wide areas where there could be both overlapping and non-overlapping fields of view of the cameras, as addressed in Chapter Two. Distributed processing is another challenge in camera networks, and recent methods have shown how to do tracking, pose estimation and calibration in a distributed environment. Consensus algorithms that enable these tasks are described in Chapter Three. Chapter Four summarizes a few approaches on object and activity recognition in both distributed and centralized camera network environments. All these methods have focused primarily on the analysis side given that images are being obtained by the cameras. Efficient utilization of such networks often calls for active sensing, whereby the acquisition and analysis phases are closely linked. We discuss this issue in detail in Chapter Five and show how collaborative and opportunistic sensing in a camera network can be achieved. Finally, Chapter Six concludes the book by highlighting the major directions for future research. Table of Contents: An Introduction to Camera Networks / Wide-Area Tracking / Distributed Processing in Camera Networks / Object and Activity Recognition / Active Sensing / Future Research Directions

  • Statistical Relational Artificial Intelligence: Logic, Probability, and Computation

    Copyright Year: 2016

    Morgan and Claypool eBooks

    An intelligent agent interacting with the real world will encounter individual people, courses, test results, drug prescriptions, chairs, boxes, etc., and needs to reason about properties of these individuals and relations among them as well as cope with uncertainty. Uncertainty has been studied in probability theory and graphical models, and relations have been studied in logic, in particular in the predicate calculus and its extensions. This book examines the foundations of combining logic and probability into what are called relational probabilistic models. It introduces representations, inference, and learning techniques for probability, logic, and their combinations. The book focuses on two representations in detail: Markov logic networks, a relational extension of undirected graphical models and weighted first-order predicate calculus formulas, and Problog, a probabilistic extension of logic programs that can also be viewed as a Turing-complete relational extension of Bayesian networks.

  • Digital System Verification: A Combined Formal Methods and Simulation Framework

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 50 years of development, simulation methods have reached a degree of maturity; however, new advances continue to be developed in the area. A simulation approach for functional verification can theoretically validate all possible behaviors of a design but requires excessive computational resources. Rapidly evolving markets demand short design cycles, while the increasing complexity of a design causes simulation approaches to provide less and less coverage. Formal verification is an attractive alternative since 100% coverage can be achieved; however, large designs impose unrealistic computational requirements. Combining formal verification and simulation into a single integrated circuit validation framework is an attractive alternative. This book focuses on an Integrated Design Validation (IDV) system that provides a framework for design validation and takes advantage of current technology in the areas of simulation and formal verification, resulting in a practical validation engine with reasonable runtime. After surveying the basic principles of formal verification and simulation, this book describes the IDV approach to integrated circuit functional validation. Table of Contents: Introduction / Formal Methods Background / Simulation Approaches / Integrated Design Validation System / Conclusion and Summary

  • Stochastic Network Optimization with Application to Communication and Queueing Systems

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This text presents a modern theory of analysis, control, and optimization for dynamic networks. Mathematical techniques of Lyapunov drift and Lyapunov optimization are developed and shown to enable constrained optimization of time averages in general stochastic systems. The focus is on communication and queueing systems, including wireless networks with time-varying channels, mobility, and randomly arriving traffic. A simple drift-plus-penalty framework is used to optimize time averages such as throughput, throughput-utility, power, and distortion. Explicit performance-delay tradeoffs are provided to illustrate the cost of approaching optimality. This theory is also applicable to problems in operations research and economics, where energy-efficient and profit-maximizing decisions must be made without knowing the future. Topics in the text include the following: - Queue stability theory - Backpressure, max-weight, and virtual queue methods - Primal-dual methods for non-convex stochastic utility maximization - Universal scheduling theory for arbitrary sample paths - Approximate and randomized scheduling theory - Optimization of renewal systems and Markov decision systems. Detailed examples and numerous problem set questions are provided to reinforce the main concepts. Table of Contents: Introduction / Introduction to Queues / Dynamic Scheduling Example / Optimizing Time Averages / Optimizing Functions of Time Averages / Approximate Scheduling / Optimization of Renewal Systems / Conclusions
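
    A toy version of the max-weight idea listed above (a sketch under assumed Bernoulli arrivals and random ON/OFF channels, not an example from the text): in every slot, serve the queue with the largest backlog-times-rate product.

      import random

      N, T = 3, 20000
      arrival_prob = [0.2, 0.3, 0.25]       # assumed per-slot arrival probabilities
      Q, backlog = [0] * N, 0

      for t in range(T):
          rates = [random.choice([0, 1]) for _ in range(N)]   # random ON/OFF channel states
          # Max-weight: serve the queue maximizing Q[i] * rates[i] in this slot.
          i = max(range(N), key=lambda k: Q[k] * rates[k])
          if rates[i] and Q[i] > 0:
              Q[i] -= 1
          for k in range(N):
              Q[k] += random.random() < arrival_prob[k]
          backlog += sum(Q)

      print("time-average total backlog:", backlog / T)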

  • Information and Human Values

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book seeks to advance our understanding of the relationship between information and human values by synthesizing the complementary but typically disconnected threads in the literature, reflecting on my 15 years of research on the relationship between information and human values, advancing our intellectual understanding of the key facets of this topic, and encouraging further research to continue exploring this important and timely research topic. The book begins with an explanation of what human values are and why they are important. Next, three distinct literatures on values, information, and technology are analyzed and synthesized, including the social psychology literature on human values, the information studies literature on the core values of librarianship, and the human-computer interaction literature on value-sensitive design. After that, three detailed case studies are presented based on reflections on a wide range of research studies. The first case study focuses on the role of human values in the design and use of educational simulations. The second case study focuses on the role of human values in the design and use of computational models. The final case study explores human values in communication via, about, or using information technology. The book concludes by laying out a values and design cycle for studying values in information and presenting an agenda for further research.

  • Performance Modeling, Stochastic Networks, and Statistical Multiplexing: Second Edition

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in computing performance measures. The monograph also covers stochastic network theory, including Markovian networks. Recent results on network utility optimization and connections to stochastic insensitivity are discussed. Also presented are ideas of large-buffer and many-sources asymptotics that play an important role in understanding statistical multiplexing. In particular, the important concept of effective bandwidths as mappings from queueing-level phenomena to loss network models is clearly presented, along with a detailed discussion of accurate approximations for large networks. Table of Contents: Introduction to Traffic Models and Analysis / Queues and Performance Analysis / Loss Models for Networks / Stochastic Networks and Insensitivity / Statistical Multiplexing
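
    The effective-bandwidth mapping mentioned here is easy to illustrate (a standard textbook computation, not taken from the monograph; the source parameters, link rate, and QoS exponent are assumptions): for i.i.d. per-slot arrivals A, the effective bandwidth at exponent theta is (1/theta) log E[exp(theta A)], which lies between the mean and the peak rate.

      import math

      def effective_bandwidth(theta, peak_rate, on_prob):
          """(1/theta) * log E[exp(theta*A)] for an ON/OFF source sending peak_rate when ON."""
          return math.log((1 - on_prob) + on_prob * math.exp(theta * peak_rate)) / theta

      theta = 0.1                              # assumed QoS exponent (larger = stricter QoS)
      peak, p_on, link = 10.0, 0.3, 622.0      # assumed peak rate, ON probability, link rate
      eb = effective_bandwidth(theta, peak, p_on)
      print(f"effective bandwidth {eb:.2f} per source; "
            f"admissible sources: {int(link // eb)} (vs. {int(link // (peak * p_on))} by mean, "
            f"{int(link // peak)} by peak)")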

  • Reasoning with Probabilistic and Deterministic Graphical Models: Exact Algorithms

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Graphical models (e.g., Bayesian and constraint networks, influence diagrams, and Markov decision processes) have become a central paradigm for knowledge representation and reasoning in both artificial intelligence and computer science in general. These models are used to perform many reasoning tasks, such as scheduling, planning and learning, diagnosis and prediction, design, hardware and software verification, and bioinformatics. These problems can be stated as the formal tasks of constraint satisfaction and satisfiability, combinatorial optimization, and probabilistic inference. It is well known that the tasks are computationally hard, but research during the past three decades has yielded a variety of principles and techniques that significantly advanced the state of the art. In this book we provide comprehensive coverage of the primary exact algorithms for reasoning with such models. The main feature exploited by the algorithms is the model's graph. We present inference-based, message-passing schemes (e.g., variable elimination) and search-based, conditioning schemes (e.g., cycle-cutset conditioning and AND/OR search). Each class possesses distinguishing characteristics and in particular has different time vs. space behavior. We emphasize the dependence of both schemes on a few graph parameters such as the treewidth, cycle-cutset, and (the pseudo-tree) height. We believe the principles outlined here would serve well in moving forward to approximation and anytime-based schemes. The target audience of this book is researchers and students in the artificial intelligence and machine learning area, and beyond.
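
    A minimal sketch of the variable-elimination idea (an illustrative three-node chain with made-up probability tables, not an example from the book): summing variables out one at a time turns a global sum over the joint distribution into a sequence of small local computations.

      import numpy as np

      # Chain Bayesian network A -> B -> C with assumed conditional probability tables.
      P_A = np.array([0.6, 0.4])                    # P(A)
      P_B_given_A = np.array([[0.7, 0.3],           # P(B | A), rows indexed by A
                              [0.2, 0.8]])
      P_C_given_B = np.array([[0.9, 0.1],           # P(C | B), rows indexed by B
                              [0.4, 0.6]])

      # Eliminate A, then B: P(C) = sum_B [ sum_A P(A) P(B|A) ] P(C|B)
      msg_B = P_A @ P_B_given_A       # factor over B after summing out A
      P_C = msg_B @ P_C_given_B       # factor over C after summing out B
      print("P(C):", P_C)             # a proper distribution; it sums to 1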

  • Pragmatic Circuits: Signals and Filters

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing with such non-periodic signals. The two other volumes in the Pragmatic Circuits series include titles on DC and Time Domain and Frequency Domain. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    New data acquisition techniques are emerging and are providing fast and efficient means for multidimensional spatial data collection. Airborne LIDAR surveys, SAR satellites, stereo-photogrammetry and mobile mapping systems are increasingly used for the digital reconstruction of the environment. All these systems provide extremely high volumes of raw data, often enriched with other sensor data (e.g., beam intensity). Improving methods to process and visually analyze this massive amount of geospatial and user-generated data is crucial to increase the efficiency of organizations and to better manage societal challenges. Within this context, this book proposes an up-to-date view of computational methods and tools for spatio-temporal data fusion, multivariate surface generation, and feature extraction, along with their main applications for surface approximation and rainfall analysis. The book is intended to attract interest from different fields, such as computer vision, computer graphics, geomatics, and remote sensing, working on the common goal of processing 3D data. To this end, it presents and compares methods that process and analyze the massive amount of geospatial data in order to support better management of societal challenges through more timely and better decision making, independent of a specific data modeling paradigm (e.g., 2D vector data, regular grids or 3D point clouds). We also show how current research is developing from the traditional layered approach, adopted by most GIS software, to intelligent methods for integrating existing data sets that might contain important information on a geographical area and environmental phenomenon. These services combine traditional map-oriented visualization with fully 3D visual decision support methods and exploit semantics-oriented information (e.g., a-priori knowledge, annotations, segmentations) when processing, merging, and integrating big pre-existing data sets.

  • Articulation and Intelligibility

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Immediately following the Second World War, between 1947 and 1955, several classic papers quantified the fundamentals of human speech information processing and recognition. In 1947 French and Steinberg published their classic study on the articulation index. In 1948 Claude Shannon published his famous work on the theory of information. In 1950 Fletcher and Galt published their theory of the articulation index, a theory that Fletcher had worked on for 30 years, which integrated his classic works on loudness and speech perception with models of speech intelligibility. In 1951 George Miller then wrote the first book Language and Communication, analyzing human speech communication with Claude Shannon's just published theory of information. Finally in 1955 George Miller published the first extensive analysis of phone decoding, in the form of confusion matrices, as a function of the speech-to-noise ratio. This work extended the Bell Labs' speech articulation studies with ideas from Shannon's information theory. Both Miller and Fletcher showed that speech, as a code, is incredibly robust to mangling distortions of filtering and noise. Regrettably, much of this early work was forgotten. While the key science of information theory blossomed, other than the work of George Miller, it was rarely applied to aural speech research. The robustness of speech, which is the most amazing thing about the speech code, has rarely been studied. It is my belief (i.e., assumption) that we can analyze speech intelligibility with the scientific method. The quantitative analysis of speech intelligibility requires both science and art. The scientific component requires an error analysis of spoken communication, which depends critically on the use of statistics, information theory, and psychophysical methods. The artistic component depends on knowing how to restrict the problem in such a way that progress may be made. It is critical to tease out the relevant from the irrelevant and dig for the key issues. This will focus us on the decoding of nonsense phonemes with no visual component, which have been mangled by filtering and noise. This monograph is a summary and theory of human speech recognition. It builds on and integrates the work of Fletcher, Miller, and Shannon. The long-term goal is to develop a quantitative theory for predicting the recognition of speech sounds. In Chapter 2 the theory is developed for maximum entropy (MaxEnt) speech sounds, also called nonsense speech. In Chapter 3, context is factored in. The book is largely reflective, and quantitative, with a secondary goal of providing an historical context, along with the many deep insights found in these early works.

  • Automated Grammatical Error Detection for Language Learners, Second Edition

    Copyright Year: 2014

    Morgan and Claypool eBooks

    It has been estimated that over a billion people are using or learning English as a second or foreign language, and the numbers are growing not only for English but for other languages as well. These language learners provide a burgeoning market for tools that help identify and correct learners' writing errors. Unfortunately, the errors targeted by typical commercial proofreading tools do not include those aspects of a second language that are hardest to learn. This volume describes the types of constructions English language learners find most difficult: constructions containing prepositions, articles, and collocations. It provides an overview of the automated approaches that have been developed to identify and correct these and other classes of learner errors in a number of languages. Error annotation and system evaluation are particularly important topics in grammatical error detection because there are no commonly accepted standards. Chapters in the book describe the options available to researchers, recommend best practices for reporting results, and present annotation and evaluation schemes. The final chapters explore recent innovative work that opens new directions for research. It is the authors' hope that this volume will continue to contribute to the growing interest in grammatical error detection by encouraging researchers to take a closer look at the field and its many challenging problems.

  • Visual Object Recognition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The visual recognition problem is central to computer vision research. From robotics to information retrieval, many desired applications demand the ability to identify and localize categories, places, and objects. This tutorial overviews computer vision algorithms for visual object recognition and image classification. We introduce primary representations and learning approaches, with an emphasis on recent advances in the field. The target audience consists of researchers or students working in AI, robotics, or vision who would like to understand what methods and representations are available for these problems. This lecture summarizes what is and isn't possible to do reliably today, and overviews key concepts that could be employed in systems requiring visual categorization. Table of Contents: Introduction / Overview: Recognition of Specific Objects / Local Features: Detection and Description / Matching Local Features / Geometric Verification of Matched Features / Example Systems: Specific-Object Recognition / Overview: Recognition of Generic Object Categories / Representations for Object Categories / Generic Object Detection: Finding and Scoring Candidates / Learning Generic Object Category Models / Example Systems: Generic Object Recognition / Other Considerations and Current Challenges / Conclusions

  • Real-Time Image and Video Processing

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book presents an overview of the guidelines and strategies for transitioning an image or video processing algorithm from a research environment into a real-time constrained environment. Such guidelines and strategies are scattered in the literature of various disciplines, including image processing, computer engineering, and software engineering, and thus have not previously appeared in one place. By bringing these strategies into one place, the book is intended to serve the greater community of researchers, practicing engineers, and industrial professionals who are interested in taking an image or video processing algorithm from a research environment to an actual real-time implementation on a resource-constrained hardware platform. These strategies consist of algorithm simplifications, hardware architectures, and software methods. Throughout the book, carefully selected representative examples from the literature are presented to illustrate the discussed concepts. After reading the book, readers will have been exposed to a wide variety of techniques and tools, which they can then employ to design a real-time image or video processing system.

  • Performance Modeling of Communication Networks with Markov Chains

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is an introduction to Markov chain modeling with applications to communication networks. It begins with a general introduction to performance modeling in Chapter 1, where we introduce different performance models. We then introduce basic ideas of Markov chain modeling: the Markov property, the discrete time Markov chain (DTMC) and the continuous time Markov chain (CTMC). We also discuss how to find the steady state distributions from these Markov chains and how they can be used to compute the system performance metric. The solution methodologies include a balance equation technique, a limiting probability technique, and uniformization. We try to minimize the theoretical aspects of the Markov chain so that the book is easily accessible to readers without deep mathematical backgrounds. We then introduce how to develop a Markov chain model with simple applications: a forwarding system, cellular system blocking, slotted ALOHA, a Wi-Fi model, and a multichannel-based LAN model. The examples cover CTMC, DTMC, birth-death processes and non-birth-death processes. We then introduce more difficult examples in Chapter 4, which are related to wireless LAN networks: the Bianchi model and a Multi-Channel MAC model with fixed duration. These models are more advanced than those introduced in Chapter 3 because they require more advanced concepts such as the renewal-reward theorem and the queueing network model. We introduce these concepts in the appendix as needed so that readers can follow them without difficulty. We hope that this textbook will be helpful to students, researchers, and network practitioners who want to understand and use mathematical modeling techniques. Table of Contents: Performance Modeling / Markov Chain Modeling / Developing Markov Chain Performance Models / Advanced Markov Chain Models
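
    For instance, the steady-state distribution of a small DTMC follows from the balance equations pi P = pi together with normalization (a generic numerical sketch with an assumed transition matrix, not one of the book's models):

      import numpy as np

      # Assumed 3-state DTMC transition matrix (rows sum to 1).
      P = np.array([[0.9, 0.1, 0.0],
                    [0.2, 0.7, 0.1],
                    [0.0, 0.3, 0.7]])

      # Solve pi (P - I) = 0 with sum(pi) = 1 by replacing one balance equation
      # with the normalization condition.
      A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
      b = np.array([0.0, 0.0, 1.0])
      pi = np.linalg.solve(A, b)
      print("steady-state distribution:", pi)   # performance metrics follow from pi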

  • Scholarly Communication on the Academic Social Web

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Collaboration among scholars has always been recognized as a fundamental feature in scientific discovery. The ever-increasing diversity among disciplines and complexity of research problems make it all the more necessary to collaborate in order to keep up with the fast pace of innovation and advance knowledge. Along with the rapidly developing Internet communication technologies and the increasing popularity of the social web, we have observed many important developments of scholarly collaboration on the academic social web. In this lecture, we review the rapid transformation of scholarly collaboration on various academic social web platforms, and examine how these platforms have facilitated academics throughout their research life cycle: from forming ideas, collecting data, and authoring articles to disseminating findings. We refer to the term academic social web platforms in this lecture as a category of Web 2.0 tools or online platforms (such as CiteULike, Mendeley, academia.edu, and ResearchGate) that enable and facilitate scholarly information exchange and participation. We will also examine scholars' collaboration behaviors, including sharing academic resources, exchanging opinions, following each other's research, keeping up with current research trends, and, most importantly, building up their professional networks. Inspired by the model developed by G. Olson, Olson, and Venolia (2000) on factors for successful scientific collaboration, our examination of the status of scholarly collaboration on the academic social web has four emphases: technology readiness, coupling work, building common ground, and collaboration readiness. Finally, we will talk about the insights and challenges that all these online scholarly collaboration activities pose to the research communities engaged in supporting online scholarly collaboration. This lecture aims to help researchers and practitioners understand the development of scholarly collaboration on the academic social web, and to build up an active community of scholars who are interested in this topic.

  • Advances in Modern Blind Signal Separation Algorithms: Theory and Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With human-computer interactions and hands-free communications becoming overwhelmingly important in the new millennium, recent research efforts have been increasingly focusing on state-of-the-art multi-microphone signal processing solutions to improve speech intelligibility in adverse environments. One such prominent statistical signal processing technique is blind signal separation (BSS). BSS was first introduced in the early 1990s and quickly emerged as an area of intense research activity showing huge potential in numerous applications. BSS comprises the task of 'blindly' recovering a set of unknown signals, the so-called sources, from their observed mixtures, based on very little to almost no prior knowledge about the source characteristics or the mixing structure. The goal of BSS is to process multi-sensory observations of an inaccessible set of signals in a manner that reveals their individual (and original) form, by exploiting the spatial and temporal diversity readily accessible through a multi-microphone configuration. Proceeding blindly exhibits a number of advantages, since assumptions about the room configuration and the source-to-sensor geometry can be relaxed without affecting overall efficiency. This booklet investigates one of the most commercially attractive applications of BSS, which is the simultaneous recovery of signals inside a reverberant (naturally echoing) environment, using two (or more) microphones. In this paradigm, each microphone captures not only the direct contributions from each source, but also several reflected copies of the original signals at different propagation delays. These recordings are referred to as the convolutive mixtures of the original sources. The goal of this booklet in the lecture series is to provide insight into recent advances in algorithms, which are ideally suited for blind signal separation of convolutive speech mixtures. More importantly, specific emphasis is given to practical applications of the developed BSS algorithms associated with real-life scenarios. The developed algorithms are put in the context of modern DSP devices, such as hearing aids and cochlear implants, where design requirements dictate low power consumption and call for portability and compact size. Along these lines, this booklet focuses on modern BSS algorithms which address (1) the limited amount of processing power and (2) the small number of microphones available to the end-user. Table of Contents: Fundamentals of blind signal separation / Modern blind signal separation algorithms / Application of blind signal processing strategies to noise reduction for the hearing-impaired / Conclusions and future challenges / Bibliography
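
    To make the basic setting concrete (the booklet itself addresses the harder convolutive case), here is a hedged sketch of separating an instantaneous, linear two-source mixture with scikit-learn's FastICA; the source signals and mixing matrix are made-up assumptions:

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 4000)
      s1 = np.sin(2 * np.pi * 7 * t)                  # assumed source 1
      s2 = np.sign(np.sin(2 * np.pi * 3 * t))         # assumed source 2
      S = np.c_[s1, s2] + 0.01 * rng.standard_normal((t.size, 2))

      A = np.array([[1.0, 0.6], [0.4, 1.0]])          # assumed instantaneous mixing matrix
      X = S @ A.T                                     # two "microphone" observations

      ica = FastICA(n_components=2, random_state=0)
      S_hat = ica.fit_transform(X)                    # recovered sources, up to scale and order
      print("recovered source matrix shape:", S_hat.shape)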

  • iRODS Primer: Integrated Rule-Oriented Data System

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Policy-based data management enables the creation of community-specific collections. Every collection is created for a purpose. The purpose defines the set of properties that will be associated with the collection. The properties are enforced by management policies that control the execution of procedures that are applied whenever data are ingested or accessed. The procedures generate state information that defines the outcome of enforcing the management policy. The state information can be queried to validate assessment criteria and verify that the required collection properties have been conserved. The integrated Rule-Oriented Data System implements the data management framework required to support policy-based data management. Policies are turned into computer-actionable Rules. Procedures are composed from a Micro-service-oriented architecture. The result is a highly extensible and tunable system that can enforce management policies, automate administrative tasks, and periodically validate assessment criteria. Table of Contents: Introduction / Integrated Rule-Oriented Data System / iRODS Architecture / Rule-Oriented Programming / The iRODS Rule System / iRODS Micro-services / Example Rules / Extending iRODS / Appendix A: iRODS Shell Commands / Appendix B: Rulegen Grammar / Appendix C: Exercises / Author Biographies

  • Operating System Security

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Operating systems provide the fundamental mechanisms for securing computer processing. Since the 1960s, operating systems designers have explored how to build "secure" operating systems - operating systems whose mechanisms protect the system against a motivated adversary. Recently, the importance of ensuring such security has become a mainstream issue for all operating systems. In this book, we examine past research that outlines the requirements for a secure operating system and research that implements example systems that aim for such requirements. For system designs that aimed to satisfy these requirements, we see that the complexity of software systems often results in implementation challenges that we are still exploring to this day. However, if a system design does not aim for achieving the secure operating system requirements, then its security features fail to protect the system in a myriad of ways. We also study systems that have been retrofitted with secure operating system features after an initial deployment. In all cases, the conflict between function on one hand and security on the other leads to difficult choices and the potential for unwise compromises. From this book, we hope that systems designers and implementors will learn the requirements for operating systems that effectively enforce security and will better understand how to manage the balance between function and security. Table of Contents: Introduction / Access Control Fundamentals / Multics / Security in Ordinary Operating Systems / Verifiable Security Goals / Security Kernels / Securing Commercial Operating Systems / Case Study: Solaris Trusted Extensions / Case Study: Building a Secure Operating System for Linux / Secure Capability Systems / Secure Virtual Machine Systems / System Assurance

  • Resource-Oriented Architecture Patterns for Webs of Data

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The surge of interest in the REpresentational State Transfer (REST) architectural style, the Semantic Web, and Linked Data has resulted in the development of innovative, flexible, and powerful systems that embrace one or more of these compatible technologies. However, most developers, architects, Information Technology managers, and platform owners have only been exposed to the basics of resource-oriented architectures. This book is an attempt to catalog and elucidate several reusable solutions that have been seen in the wild in the now increasingly familiar "patterns book" style. These are not turnkey implementations, but rather useful strategies for solving certain problems in the development of modern, resource-oriented systems, both on the public Web and within an organization's firewalls.

  • Instant Recovery with Write-Ahead Logging: Page Repair, System Restart, and Media Restore

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Traditional theory and practice of write-ahead logging and of database recovery techniques revolve around three failure classes: transaction failures resolved by rollback; system failures (typically software faults) resolved by restart with log analysis, “redo,” and “undo” phases; and media failures (typically hardware faults) resolved by restore operations that combine multiple types of backups and log replay. The recent addition of single-page failures and single-page recovery has opened new opportunities far beyond its original aim of immediate, lossless repair of single-page wear-out in novel or traditional storage hardware. In the contexts of system and media failures, efficient single-page recovery enables on-demand incremental “redo” and “undo” as part of system restart or media restore operations. This can give the illusion of practically instantaneous restart and restore: instant restart permits processing new queries and updates seconds after system reboot, and instant restore permits resuming queries and updates on empty replacement media as if those were already fully recovered. In addition to these instant recovery techniques, the discussion introduces much faster offline restore operations without slowdown in backup operations and with hardly any slowdown in log archiving operations. The new restore techniques also render differential and incremental backups obsolete, complete backup commands on the database server practically instantly, and even permit taking full backups without imposing any load on the database server.

  • Die-stacking Architecture

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The emerging three-dimensional (3D) chip architectures, with their intrinsic capability of reducing the wire length, promise attractive solutions to reduce the delay of interconnects in future microprocessors. 3D memory stacking enables much higher memory bandwidth for future chip-multiprocessor design, mitigating the "memory wall" problem. In addition, heterogeneous integration enabled by 3D technology can also result in innovative designs for future microprocessors. This book first provides a brief introduction to this emerging technology, and then presents a variety of approaches to designing future 3D microprocessor systems, by leveraging the benefits of low latency, high bandwidth, and heterogeneous integration capability which are offered by 3D technology.

  • Particle Swarm Optimization: A Physics-Based Approach

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This work aims to provide a new introduction to the particle swarm optimization methods using a formal analogy with physical systems. By postulating that the swarm motion behaves similarly to both classical and quantum particles, we establish a direct connection between what are usually assumed to be separate fields of study, optimization and physics. Within this framework, it becomes quite natural to derive the recently introduced quantum PSO algorithm from the Hamiltonian or the Lagrangian of the dynamical system. The physical theory of the PSO is used to suggest some improvements in the algorithm itself, like temperature acceleration techniques and the periodic boundary condition. At the end, we provide a panorama of applications demonstrating the power of the PSO, classical and quantum, in handling difficult engineering problems. The goal of this work is to provide a general multi-disciplinary view on various topics in physics, mathematics, and engineering by illustrating their interdependence within the unified framework of the swarm dynamics. Table of Contents: Introduction / The Classical Particle Swarm Optimization Method / Boundary Conditions for the PSO Method / The Quantum Particle Swarm Optimization / Bibliography / Index
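
    The classical update rule the book starts from fits in a few lines (a generic sketch minimizing a simple test function; the swarm size, inertia weight, and acceleration coefficients are assumed, and nothing here is specific to the quantum variant):

      import random

      def f(x):                                    # sphere test function to minimize
          return sum(xi * xi for xi in x)

      dim, n, iters = 5, 20, 200
      w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration coefficients
      pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
      vel = [[0.0] * dim for _ in range(n)]
      pbest = [p[:] for p in pos]
      gbest = min(pbest, key=f)[:]

      for _ in range(iters):
          for i in range(n):
              for d in range(dim):
                  r1, r2 = random.random(), random.random()
                  vel[i][d] = (w * vel[i][d]
                               + c1 * r1 * (pbest[i][d] - pos[i][d])
                               + c2 * r2 * (gbest[d] - pos[i][d]))
                  pos[i][d] += vel[i][d]
              if f(pos[i]) < f(pbest[i]):
                  pbest[i] = pos[i][:]
                  if f(pbest[i]) < f(gbest):
                      gbest = pbest[i][:]

      print("best objective value found:", f(gbest))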

  • Basic Probability Theory for Biomedical Engineers

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems—as well as pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first chapter. The second chapter introduces the topic of random variables. Later chapters simply expand upon these key ideas and extend the range of application. A considerable effort has been made to develop the theory in a logical manner—developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology—both within the engineering community as well as the probability and statistics literature. Biomedical engineering examples are introduced throughout the text and a large number of self-study problems are available for the reader.

  • Microcontrollers Fundamentals for Engineers and Scientists

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book provides practicing scientists and engineers a tutorial on the fundamental concepts and use of microcontrollers. Today, microcontrollers, or single integrated circuit (chip) computers, play critical roles in almost all instrumentation and control systems. Most existing books are written for undergraduate and graduate students taking an electrical and/or computer engineering course. Furthermore, these texts have been written with a particular model of microcontroller as the target discussion. These textbooks also require a requisite knowledge of digital design fundamentals. This textbook presents the fundamental concepts common to all microcontrollers. Our goals are to present the overarching theory of microcontroller operation and to provide a detailed discussion on constituent subsystems available in most microcontrollers. With such goals, we envision that the theory discussed in this book can be readily applied to a wide variety of microcontroller technologies, allowing practicing scientists and engineers to become acquainted with basic concepts prior to beginning a design involving a specific microcontroller. We have found that the fundamental principles of a given microcontroller are easily transferred to other controllers. Although this is a relatively small book, it is packed with useful information for quickly coming up to speed on microcontroller concepts.

  • Analysis of Oriented Texture: With application to the Detection of Architectural Distortion in Mammograms

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over the entire region of analysis in the image is called the orientation field. High-level analysis relates to the discovery of patterns in the orientation field, usually by associating the structure perceived in the orientation field with a geometrical model. This book presents an analysis of several important methods for the detection of oriented features in images, and a discussion of the phase portrait method for high-level analysis of orientation fields. In order to illustrate the concepts developed throughout the book, an application is presented of the phase portrait method to computer-aided detection of architectural distortion in mammograms. Table of Contents: Detection of Oriented Features in Images / Analysis of Oriented Patterns Using Phase Portraits / Optimization Techniques / Detection of Sites of Architectural Distortion in Mammograms
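
    A low-level orientation field of the kind described here can be estimated from locally averaged image gradients (a generic structure-tensor sketch on a synthetic striped image, not the book's method or data):

      import numpy as np
      from scipy.ndimage import sobel, uniform_filter

      # Synthetic image with oriented stripes (stands in for a region of interest).
      y, x = np.mgrid[0:128, 0:128]
      img = np.sin(0.2 * (x + 0.5 * y))

      gx, gy = sobel(img, axis=1), sobel(img, axis=0)
      Jxx = uniform_filter(gx * gx, size=9)         # locally averaged structure tensor
      Jyy = uniform_filter(gy * gy, size=9)
      Jxy = uniform_filter(gx * gy, size=9)

      # Dominant local gradient orientation (radians), one value per pixel; the texture
      # orientation field is perpendicular to it.
      theta = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)
      print("orientation field shape:", theta.shape, " sample angle:", theta[64, 64])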

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High Performance Datacenter Networks:Architectures, Algorithms, and Opportunities

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication-intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both supercomputing and cloud computing, the network enables distributed applications to communicate and interoperate in an orchestrated and efficient way. This book describes the design and engineering tradeoffs of datacenter networks. It describes interconnection networks from topology and network architecture to routing algorithms, and presents opportunities for taking advantage of the emerging technology trends that are influencing router microarchitecture. With the emergence of "many-core" processor chips, it is evident that we will also need "many-port" routing chips to provide a bandwidth-rich network to avoid the performance-limiting effects of Amdahl's Law. We provide an overview of conventional topologies and their routing algorithms and show how technology, signaling rates and cost-effective optics are motivating new network topologies that scale up to millions of hosts. The book also provides detailed case studies of two high performance parallel computer systems and their networks. Table of Contents: Introduction / Background / Topology Basics / High-Radix Topologies / Routing / Scalable Switch Microarchitecture / System Packaging / Case Studies / Closing Remarks View full abstract»
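
    The reference to Amdahl's Law can be made concrete with a few lines of arithmetic. The sketch below is a generic illustration, not taken from the book; the parallel fractions and core counts are arbitrary. It shows how even a small serial or communication-bound fraction caps the speedup of a many-core system, which is the argument for bandwidth-rich, many-port networks.

        # Minimal illustration (not from the book): Amdahl's Law, the reason a serial or
        # network-bound fraction of the work limits overall speedup on many cores.
        def amdahl_speedup(parallel_fraction, n_units):
            """Speedup of a workload whose parallel fraction scales over n_units."""
            serial = 1.0 - parallel_fraction
            return 1.0 / (serial + parallel_fraction / n_units)

        for p in (0.90, 0.99):
            print(f"parallel fraction {p:.2f}: "
                  f"64 cores -> {amdahl_speedup(p, 64):.1f}x, "
                  f"1024 cores -> {amdahl_speedup(p, 1024):.1f}x")
        # Even at 99% parallel work, the remaining 1% caps speedup below 100x,
        # no matter how many cores are added.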

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Three-Dimensional Integration and Modeling:A Revolution in RF and Wireless Packaging

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book presents a step-by-step discussion of the 3D integration approach for the development of compact system-on-package (SOP) front-ends. Various examples of fully-integrated passive building blocks (cavity/microstrip filters, duplexers, antennas), as well as a multilayer ceramic (LTCC) V-band transceiver front-end module, demonstrate the revolutionary effects of this approach in RF/Wireless packaging and multifunctional miniaturization. Designs covered are based on novel ideas and are presented for the first time for millimeter-wave (60 GHz) ultrabroadband wireless modules. Table of Contents: Introduction / Background on Technologies for Millimeter-Wave Passive Front-Ends / Three-Dimensional Packaging in Multilayer Organic Substrates / Microstrip-Type Integrated Passives / Cavity-Type Integrated Passives / Three-Dimensional Antenna Architectures / Fully Integrated Three-Dimensional Passive Front-Ends / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Learner-Centered Design of Computing Education:Research on Computing for Everyone

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Computing education is in enormous demand. Many students (both children and adults) are realizing that they will need programming in the future. This book presents the argument that they are not all going to use programming in the same way and for the same purposes. What do we mean when we talk about teaching everyone to program? When we target a broad audience, should we have the same goals as computer science education for professional software developers? How do we design computing education that works for everyone? This book proposes use of a learner-centered design approach to create computing education for a broad audience. It considers several reasons for teaching computing to everyone and how the different reasons lead to different choices about learning goals and teaching methods. The book reviews the history of the idea that programming isn’t just for the professional software developer. It uses research studies on teaching computing in liberal arts programs, to graphic designers, and to high school teachers in order to explore the idea that computer science for everyone requires us to re-think how we teach and what we teach. The conclusion describes how we might create computing education for everyone. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Perspective on Single-Channel Frequency-Domain Speech Enhancement

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book focuses on a class of single-channel noise reduction methods that are performed in the frequency domain via the short-time Fourier transform (STFT). The simplicity and relative effectiveness of this class of approaches make them the dominant choice in practical systems. Even though many popular algorithms have been proposed through more than four decades of continuous research, there are a number of critical areas where our understanding and capabilities still remain quite rudimentary, especially with respect to the relationship between noise reduction and speech distortion. All existing frequency-domain algorithms, no matter how they are developed, have one feature in common: the solution is eventually expressed as a gain function applied to the STFT of the noisy signal only in the current frame. As a result, the narrowband signal-to-noise ratio (SNR) cannot be improved, and any gains achieved in noise reduction on the fullband basis come with a price to pay, which is speech distortion. In this book, we present a new perspective on the problem by exploiting the difference between speech and typical noise in circularity and interframe self-correlation, which were ignored in the past. By gathering the STFT of the microphone signal of the current frame, its complex conjugate, and the STFTs in the previous frames, we construct several new, multiple-observation signal models similar to a microphone array system: there are multiple noisy speech observations, and their speech components are correlated but not completely coherent while their noise components are presumably uncorrelated. Therefore, the multichannel Wiener filter and the minimum variance distortionless response (MVDR) filter that were usually associated with microphone arrays will be developed for single-channel noise reduction in this book. This might instigate a paradigm shift geared toward speech distortionless noise reduction techniques. View full abstract»
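
    The common structure the abstract describes (a real-valued gain applied to the STFT of the noisy signal in the current frame) can be sketched in a few lines of Python. The Wiener-style gain and the crude flat noise-PSD estimate below are illustrative assumptions, not the particular algorithms analyzed in the book.

        # Hedged sketch of the generic current-frame gain structure described above.
        # The Wiener-like gain and the flat noise PSD are illustrative assumptions.
        import numpy as np

        def enhance_frame(noisy_frame, noise_psd, eps=1e-12):
            """Apply a per-bin gain to one windowed frame of the noisy signal."""
            spectrum = np.fft.rfft(noisy_frame * np.hanning(len(noisy_frame)))
            noisy_psd = np.abs(spectrum) ** 2
            # Rough SNR estimate per bin; the gain shrinks bins dominated by noise.
            snr = np.maximum(noisy_psd / (noise_psd + eps) - 1.0, 0.0)
            gain = snr / (snr + 1.0)
            return np.fft.irfft(gain * spectrum, n=len(noisy_frame))

        rng = np.random.default_rng(0)
        frame = np.sin(2 * np.pi * 0.05 * np.arange(256)) + 0.3 * rng.standard_normal(256)
        noise_psd = np.full(129, 0.3 ** 2 * 256 / 2)   # crude flat noise-PSD assumption
        print(enhance_frame(frame, noise_psd).shape)   # (256,)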

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Exploitation of a Ship's Magnetic Field Signatures

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Surface ship and submarine magnetic field signatures have been exploited for over 80 years by naval influence mines, and both underwater and airborne surveillance systems. The generating mechanism of the four major shipboard sources of magnetic fields is explained, along with a detailed description of the induced and permanent ferromagnetic signature characteristics. A brief historical summary of magnetic naval mine development during World War II is followed by a discussion of important improvements found in modern weapons, including an explanation of the damage mechanism for non-contact explosions. A strategy for selecting an optimum mine actuation threshold is given. A multi-layered defensive strategy against naval mines is outlined, with graphical explanations of the relationships between ship signature reduction and minefield clearing effectiveness. In addition to a brief historical discussion of underwater and airborne submarine surveillance systems and magnetic field sensing principles, mathematical formulations are presented for computing the expected target signal strengths and noise levels for several barrier types. Besides the sensor self-noise, equations for estimating geomagnetic, ocean surface wave, platform, and vector sensor motion noises are given, along with simple algorithms for their reduction. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics provides a comprehensive tutorial of the most widely used method for solving Maxwell's equations -- the Finite-Difference Time-Domain method. This book is an essential guide for students, researchers, and professional engineers who want to gain a fundamental knowledge of the FDTD method. It can accompany an undergraduate or entry-level graduate course or be used for self-study. The book provides all the background required to either research or apply the FDTD method for the solution of Maxwell's equations to practical problems in engineering and science. Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics guides the reader through the foundational theory of the FDTD method, starting with the one-dimensional transmission-line problem and then progressing to the solution of Maxwell's equations in three dimensions. It also provides step-by-step guides to modeling physical sources, lumped-circuit components, absorbing boundary conditions, perfectly matched layer absorbers, and sub-cell structures. Post-processing methods such as network parameter extraction and far-field transformations are also detailed. Efficient implementations of the FDTD method in a high-level language are also provided. Table of Contents: Introduction / 1D FDTD Modeling of the Transmission Line Equations / Yee Algorithm for Maxwell's Equations / Source Excitations / Absorbing Boundary Conditions / The Perfectly Matched Layer (PML) Absorbing Medium / Subcell Modeling / Post Processing View full abstract»
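
    For readers who want a feel for the method before opening the book, the following minimal Python sketch implements a normalized 1D FDTD (Yee) update loop with a soft Gaussian source. The grid size, step count, and source parameters are arbitrary choices for illustration, not an implementation taken from the text.

        # Minimal, self-contained 1D FDTD (Yee) sketch in normalized units (c*dt/dz = 1).
        import numpy as np

        nz, nt = 200, 400
        ez = np.zeros(nz)          # electric field samples
        hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
        S = 1.0                    # Courant number in normalized units

        for n in range(nt):
            hy += S * (ez[1:] - ez[:-1])            # update H from the spatial difference of E
            ez[1:-1] += S * (hy[1:] - hy[:-1])      # update E from the spatial difference of H
            ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source at the center

        print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")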

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Smartphone-Based Real-Time Digital Signal Processing

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Real-time or applied digital signal processing courses are offered as follow-ups to conventional or theory-oriented digital signal processing courses in many engineering programs for the purpose of teaching students the technical know-how for putting signal processing algorithms or theory into practical use. These courses normally involve access to a teaching laboratory that is equipped with hardware boards, in particular DSP boards, together with their supporting software. A number of textbooks have been written discussing how to achieve real-time implementation on these hardware boards. This book discusses how smartphones can be used as hardware boards for real-time implementation of signal processing algorithms as an alternative to the hardware boards that are currently being used in signal processing teaching laboratories. The fact that mobile devices, in particular smartphones, have now become powerful processing platforms has led to the development of this book, thus enabling students to use their own smartphones to run signal processing algorithms in real time, considering that these days nearly all students possess smartphones. Changing the hardware platforms that are currently used in applied or real-time signal processing courses to smartphones creates a truly mobile laboratory experience or environment for students. In addition, it relieves the cost burden associated with using a dedicated signal processing board, noting that the software development tools for smartphones are free of charge and well-developed. This book is written in such a way that it can be used as a textbook for applied or real-time digital signal processing courses offered at many universities. Ten lab experiments that are commonly encountered in such courses are covered in the book. This book is written primarily for those who are already familiar with signal processing concepts and are interested in their real-time and practical aspects. Similar to existing real-time courses, knowledge of C programming is assumed. This book can also be used as a self-study guide for those who wish to become familiar with signal processing app development on either Android or iPhone smartphones. All the lab codes can be obtained as a software package from http://sites.fastspring.com/bookcodes/product/bookcodes View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Concise Introduction to Multiagent Systems and Distributed Artificial Intelligence

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Multiagent systems is an expanding field that blends classical fields like game theory and decentralized control with modern fields like computer science and machine learning. This monograph provides a concise introduction to the subject, covering the theoretical foundations as well as more recent developments in a coherent and readable manner. The text is centered on the concept of an agent as decision maker. Chapter 1 is a short introduction to the field of multiagent systems. Chapter 2 covers the basic theory of single-agent decision making under uncertainty. Chapter 3 is a brief introduction to game theory, explaining classical concepts like Nash equilibrium. Chapter 4 deals with the fundamental problem of coordinating a team of collaborative agents. Chapter 5 studies the problem of multiagent reasoning and decision making under partial observability. Chapter 6 focuses on the design of protocols that are stable against manipulations by self-interested agents. Chapter 7 provides a short introduction to the rapidly expanding field of multiagent reinforcement learning. The material can be used for teaching a half-semester course on multiagent systems covering, roughly, one chapter per lecture. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Engineering Economics and Decision Analysis

    Copyright Year: 2012

    Morgan and Claypool eBooks

    In this text, the authors cover two general topics: basic engineering economics and risk analysis. Within the topic of engineering economics are discussions on the time value of money and interest relationships. These interest relationships are used to define certain project criteria that are used by engineers and project managers to select the best economic choice among several alternatives. Projects examined will include both income- and service-producing investments. The effects of escalation, inflation, and taxes on the economic analysis of alternatives are discussed. Risk analysis incorporates the concepts of probability and statistics in the evaluation of alternatives. This allows management to determine the probability of success or failure of the project. Two types of sensitivity analyses are presented. The first is referred to as the range approach, while the second uses probabilistic concepts to determine a measure of the risk involved. The authors have designed the text to assist individuals in preparing to successfully complete the economics portions of the Fundamentals of Engineering Exam. Table of Contents: Introduction / Interest and the Time Value of Money / Project Evaluation Methods / Service Producing Investments / Income Producing Investments / Determination of Project Cash Flow / Financial Leverage / Basic Statistics and Probability / Sensitivity Analysis View full abstract»
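
    As a small taste of the time-value-of-money relationships mentioned above, the Python sketch below discounts a hypothetical cash-flow series to its net present worth. The cost, revenue, and interest-rate figures are invented for illustration and are not examples from the book.

        # Illustrative sketch (hypothetical numbers): discounting end-of-year cash flows
        # to a net present worth, the basic time-value-of-money calculation.
        def present_worth(cash_flows, rate):
            """Discount a list of end-of-year cash flows (year 0 first) at `rate`."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        # A project costing 10,000 now and returning 3,000 per year for 5 years, at 8%.
        flows = [-10_000] + [3_000] * 5
        npw = present_worth(flows, 0.08)
        print(f"net present worth: {npw:,.2f}")   # positive, so the project clears the 8% hurdle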

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering: Women and Leadership

    Copyright Year: 2008

    Morgan and Claypool eBooks

    In this book we explore a sea change occurring in leadership for academic women in the sciences and engineering. Our approach is a two-pronged one: On the one hand, we outline the nature of the changes and their sources, both in various literatures and from program research results. On the other hand, we specify and provide detail about the persistent problems and obstacles that remain as barriers to women’s full participation in academic science and engineering, their career advancement and success, and, most important, their role as leaders in making change. At the heart of this book is our goal to give some shape to the research, practice, and programs developed by women academic leaders making institutional change in the sciences and engineering. Table of Contents: Women in a New Era of Academic Leadership / Background: Academic Leadership for Women in Science and Engineering / Gender and Leadership: Theories and Applications / Women in Engineering Leadership Institute: Critical Issues for Women Academic Engineers as Leaders / From Success Stories to Success Strategies: Leadership for Promoting Diversity in Academic Science and Engineering / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Theory of Timed I/O Automata

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This monograph presents the timed input/output automaton (TIOA) modeling framework, a basic mathematical framework to support description and analysis of timed (computing) systems. Timed systems are systems in which desirable correctness or performance properties of the system depend on the timing of events, not just on the order of their occurrence. Timed systems are employed in a wide range of domains including communications, embedded systems, real-time operating systems, and automated control. Many applications involving timed systems have strong safety, reliability, and predictability requirements, which makes it important to have methods for systematic design of systems and rigorous analysis of timing-dependent behavior. An important feature of the TIOA framework is its support for decomposing timed system descriptions. In particular, the framework includes a notion of external behavior for a TIOA, which captures its discrete interactions with its environment. The framework also defines what it means for one TIOA to implement another, based on an inclusion relationship between their external behavior sets, and defines notions of simulations, which provide sufficient conditions for demonstrating implementation relationships. The framework includes a composition operation for TIOAs, which respects external behavior, and a notion of receptiveness, which implies that a TIOA does not block the passage of time. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multidimensional Databases and Data Warehousing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The present book's subject is multidimensional data models and data modeling concepts as they are applied in real data warehouses. The book aims to present the most important concepts within this subject in a precise and understandable manner. The book's coverage of fundamental concepts includes data cubes and their elements, such as dimensions, facts, and measures and their representation in a relational setting; it includes architecture-related concepts; and it includes the querying of multidimensional databases. The book also covers advanced multidimensional concepts that are considered to be particularly important. This coverage includes advanced dimension-related concepts such as slowly changing dimensions, degenerate and junk dimensions, outriggers, parent-child hierarchies, and unbalanced, non-covering, and non-strict hierarchies. The book offers a principled overview of key implementation techniques that are particularly important to multidimensional databases, including materialized views, bitmap indices, join indices, and star join processing. The book ends with a chapter that presents the literature on which the book is based and offers further readings for those readers who wish to engage in more in-depth study of specific aspects of the book's subject. Table of Contents: Introduction / Fundamental Concepts / Advanced Concepts / Implementation Issues / Further Readings View full abstract»
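
    The cube vocabulary used above (dimensions, facts, measures) can be illustrated with a toy aggregation; the Python sketch below rolls a few hypothetical sales facts up over chosen dimensions. It is a conceptual illustration only, not how the book implements multidimensional storage or querying.

        # Hedged sketch with hypothetical sales data: facts carry dimension values plus a measure,
        # and a roll-up aggregates the measure over a chosen subset of dimensions.
        from collections import defaultdict

        # Each fact row: dimension values (store, product, month) plus a measure (amount).
        facts = [
            ("Aalborg", "Bread", "2010-01", 120.0),
            ("Aalborg", "Milk",  "2010-01",  80.0),
            ("Odense",  "Bread", "2010-01",  95.0),
            ("Odense",  "Bread", "2010-02", 110.0),
        ]

        def roll_up(rows, dims):
            """Aggregate the measure over the chosen dimensions (a coarser cube view)."""
            position = {"store": 0, "product": 1, "month": 2}
            out = defaultdict(float)
            for row in rows:
                out[tuple(row[position[d]] for d in dims)] += row[3]
            return dict(out)

        print(roll_up(facts, ["product"]))           # total sales per product
        print(roll_up(facts, ["store", "month"]))    # total sales per store per month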

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    On the Efficient Determination of Most Near Neighbors:Horseshoes, Hand Grenades, Web Search and Other Situations When Close Is Close Enough, Second Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The time-worn aphorism "close only counts in horseshoes and hand grenades" is clearly inadequate. Close also counts in golf, shuffleboard, archery, darts, curling, and other games of accuracy in which hitting the precise center of the target isn't to be expected every time, or in which we can expect to be driven from the target by skilled opponents. This book is not devoted to sports discussions, but to efficient algorithms for determining pairs of closely related web pages—and a few other situations in which we have found that inexact matching is good enough — where proximity suffices. We will not, however, attempt to be comprehensive in the investigation of probabilistic algorithms, approximation algorithms, or even techniques for organizing the discovery of nearest neighbors. We are more concerned with finding nearby neighbors; if they are not particularly close by, we are not particularly interested. In thinking of when approximation is sufficient, remember the oft-told joke about two campers sitting around after dinner. They hear noises coming towards them. One of them reaches for a pair of running shoes and starts to don them. The second then notes that even with running shoes, they cannot hope to outrun a bear, to which the first notes that most likely the bear will be satiated after catching the slower of them. We seek problems in which we don't need to be faster than the bear, just faster than the others fleeing the bear. View full abstract»
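
    One standard tool for this kind of inexact matching of web pages is a MinHash signature over word shingles, which estimates Jaccard similarity cheaply. The Python sketch below is an illustrative choice of technique and is not necessarily the algorithm developed in the book; the two example documents are invented.

        # Hedged sketch: MinHash signatures for spotting near-duplicate documents.
        import hashlib

        def shingles(text, k=4):
            """Break a document into overlapping k-word shingles."""
            words = text.split()
            return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

        def minhash(items, num_hashes=64):
            """Signature: the minimum hash of the set under several seeded hash functions."""
            return [min(int(hashlib.md5(f"{seed}:{s}".encode()).hexdigest(), 16) for s in items)
                    for seed in range(num_hashes)]

        def estimated_jaccard(sig_a, sig_b):
            """Fraction of matching signature positions approximates Jaccard similarity."""
            return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

        doc1 = "the quick brown fox jumps over the lazy dog near the river bank"
        doc2 = "the quick brown fox jumps over the lazy cat near the river bank"
        print(estimated_jaccard(minhash(shingles(doc1)), minhash(shingles(doc2))))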

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Testing iOS Apps with HadoopUnit:Rapid Distributed GUI Testing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Smartphone users have come to expect high-quality apps. This has increased the importance of software testing in mobile software development. Unfortunately, testing apps—particularly the GUI—can be very time-consuming. Exercising every user interface element and verifying transitions between different views of the app under test quickly becomes problematic. For example, execution of iOS GUI test suites using Apple’s UI Automation framework can take an hour or more if the app’s interface is complicated. The longer it takes to run a test, the less frequently the test can be run, which in turn reduces software quality. This book describes how to accelerate the testing process for iOS apps using HadoopUnit, a distributed test execution environment that leverages the parallelism inherent in the Hadoop platform. HadoopUnit was previously used to run unit and system tests in the cloud. It has been modified to perform GUI testing of iOS apps on a small-scale cluster—a modest computing infrastructure available to almost every developer. Experimental results have shown that distributed test execution with HadoopUnit can significantly outperform the test execution on a single machine, even if the size of the cluster used for the execution is as small as two nodes. This means that the approach described in this book could be adopted without a huge investment in IT resources. HadoopUnit is a cost-effective solution for reducing lengthy test execution times of system-level GUI testing of iOS apps. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Biomedical Transport Processes

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Transport processes represent important life-sustaining elements in all humans. These include mass transfer processes, including gas exchange in the lungs, transport across capillaries and alveoli, transport across the kidneys, and transport across cell membranes. These mass transfer processes affect how oxygen and carbon dioxide are exchanged in your bloodstream, how metabolic waste products are removed from your blood, how nutrients are transported to tissues, and how all cells function throughout the body. A discussion of kidney dialysis and gas exchange mechanisms is included. Another element in biomedical transport processes is that of momentum transport and fluid flow. This describes how blood is propelled from the heart and throughout the cardiovascular system, how blood elements affect the body, including gas exchange, infection control, clotting of blood, and blood flow resistance, which affects cardiac work. A discussion of the measurement of the blood resistance to flow (viscosity), blood flow, and pressure is also included. A third element in transport processes in the human body is that of heat transfer, including heat transfer inside the body towards the periphery as well as heat transfer from the body to the environment. A discussion of temperature measurements and body protection in extreme heat conditions is also included. Table of Contents: Biomedical Mass Transport / Biofluid Mechanics and Momentum Transport / Biomedical Heat Transport View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing Asynchronous Circuits using NULL Convention Logic (NCL)

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Designing Asynchronous Circuits using NULL Convention Logic (NCL) begins with an introduction to asynchronous (clockless) logic in general, and then focuses on delay-insensitive asynchronous logic design using the NCL paradigm. The book details design of input-complete and observable dual-rail and quad-rail combinational circuits, and then discusses implementation of sequential circuits, which require datapath feedback. Next, throughput optimization techniques are presented, including pipelining, embedding registration, early completion, and NULL cycle reduction. Subsequently, low-power design techniques, such as wavefront steering and Multi-Threshold CMOS (MTCMOS) for NCL, are discussed. The book culminates with a comprehensive design example of an optimized Greatest Common Divisor circuit. Readers should have prior knowledge of basic logic design concepts, such as Boolean algebra and Karnaugh maps. After studying this book, readers should have a good understanding of the differences between asynchronous and synchronous circuits, and should be able to design arbitrary NCL circuits, optimized for area, throughput, and power. Table of Contents: Introduction to Asynchronous Logic / Overview of NULL Convention Logic (NCL) / Combinational NCL Circuit Design / Sequential NCL Circuit Design / NCL Throughput Optimization / Low-Power NCL Design / Comprehensive NCL Design Example View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Single-Instruction Multiple-Data Execution

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Having hit power limitations to even more aggressive out-of-order execution in processor cores, many architects in the past decade have turned to single-instruction-multiple-data (SIMD) execution to increase single-threaded performance. SIMD execution, or having a single instruction drive execution of an identical operation on multiple data items, was already well established as a technique to efficiently exploit data parallelism. Furthermore, support for it was already included in many commodity processors. However, in the past decade, SIMD execution has seen a dramatic increase in the set of applications using it, which has motivated big improvements in hardware support in mainstream microprocessors. The easiest way to provide a big performance boost to SIMD hardware is to make it wider—i.e., increase the number of data items the hardware operates on simultaneously. Indeed, microprocessor vendors have done this. However, as we exploit more data parallelism in applications, certain challenges can negatively impact performance. In particular, conditional execution, noncontiguous memory accesses, and the presence of some dependences across data items are key roadblocks to achieving peak performance with SIMD execution. This book first describes data parallelism, and why it is so common in popular applications. We then describe SIMD execution, and explain where its performance and energy benefits come from compared to other techniques to exploit parallelism. Finally, we describe SIMD hardware support in current commodity microprocessors. This includes both expected design tradeoffs, as well as unexpected ones, as we work to overcome challenges encountered when trying to map real software to SIMD execution. View full abstract»
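
    The contrast between scalar and data-parallel execution, and the masking idiom often used in place of per-element branches, can be illustrated with NumPy, whose whole-array operations typically map onto the CPU's SIMD units. The sketch below is a conceptual analogy, not a description of any particular instruction set.

        # Hedged illustration of the data parallelism SIMD exploits, plus masking for
        # the conditional-execution challenge mentioned above.
        import numpy as np

        a = np.arange(1_000_000, dtype=np.float32)
        b = np.ones_like(a)

        # Scalar view (one add per loop iteration):
        #   for i in range(len(a)): c[i] = a[i] + b[i]
        # SIMD-style view: the same add expressed over whole vectors; NumPy dispatches
        # to vectorized kernels that typically use the CPU's SIMD units under the hood.
        c = a + b

        # Conditional execution via a mask rather than a per-element branch.
        mask = a > 500_000
        c = np.where(mask, c * 2.0, c)
        print(c[:3], c[-3:])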

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Essential Principles for Autonomous Robotics

    Copyright Year: 2013

    Morgan and Claypool eBooks

    From driving, flying, and swimming, to digging for unknown objects in space exploration, autonomous robots take on varied shapes and sizes. In part, autonomous robots are designed to perform tasks that are too dirty, dull, or dangerous for humans. With nontrivial autonomy and volition, they may soon claim their own place in human society. These robots will be our allies as we strive for understanding our natural and man-made environments and build positive synergies around us. Although we may never perfect replication of biological capabilities in robots, we must harness the inevitable emergence of robots that synchronizes with our own capacities to live, learn, and grow. This book is a snapshot of motivations and methodologies for our collective attempts to transform our lives and enable us to cohabit with robots that work with and for us. It reviews and guides the reader to seminal and continual developments that are the foundations for successful paradigms. It attempts to demystify the abilities and limitations of robots. It is a progress report on the continuing work that will fuel future endeavors. Table of Contents: Part I: Preliminaries/Agency, Motion, and Anatomy/Behaviors / Architectures / Affect/Sensors / Manipulators/Part II: Mobility/Potential Fields/Roadmaps / Reactive Navigation / Multi-Robot Mapping: Brick and Mortar Strategy / Part III: State of the Art / Multi-Robotics Phenomena / Human-Robot Interaction / Fuzzy Control / Decision Theory and Game Theory / Part IV: On the Horizon / Applications: Macro and Micro Robots / References / Author Biography / Discussion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Landmarking and Segmentation of 3D CT Images

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Segmentation and landmarking of computed tomographic (CT) images of pediatric patients are important and useful in computer-aided diagnosis (CAD), treatment planning, and objective analysis of normal as well as pathological regions. Identification and segmentation of organs and tissues in the presence of tumors are difficult. Automatic segmentation of the primary tumor mass in neuroblastoma could facilitate reproducible and objective analysis of the tumor's tissue composition, shape, and size. However, due to the heterogeneous tissue composition of the neuroblastic tumor, ranging from low-attenuation necrosis to high-attenuation calcification, segmentation of the tumor mass is a challenging problem. In this context, methods are described in this book for identification and segmentation of several abdominal and thoracic landmarks to assist in the segmentation of neuroblastic tumors in pediatric CT images. Methods to identify and segment automatically the peripheral artifacts and tissues, the rib structure, the vertebral column, the spinal canal, the diaphragm, and the pelvic surface are described. Techniques are also presented to evaluate quantitatively the results of segmentation of the vertebral column, the spinal canal, the diaphragm, and the pelvic girdle by comparing with the results of independent manual segmentation performed by a radiologist. The use of the landmarks and removal of several tissues and organs are shown to assist in limiting the scope of the tumor segmentation process to the abdomen, to lead to the reduction of the false-positive error, and to improve the result of segmentation of neuroblastic tumors. Table of Contents: Introduction to Medical Image Analysis / Image Segmentation / Experimental Design and Database / Ribs, Vertebral Column, and Spinal Canal / Delineation of the Diaphragm / Delineation of the Pelvic Girdle / Application of Landmarking / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Easy Path to Convex Analysis and Applications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Convex optimization has an increasing impact on many areas of mathematics, applied sciences, and practical applications. It is now being taught at many universities and being used by researchers of different fields. As convex analysis is the mathematical foundation for convex optimization, having deep knowledge of convex analysis helps students and researchers apply its tools more effectively. The main goal of this book is to provide easy access to the most fundamental parts of convex analysis and its applications to optimization. Modern techniques of variational analysis are employed to clarify and simplify some basic proofs in convex analysis and build the theory of generalized differentiation for convex functions and sets in finite dimensions. We also present new applications of convex analysis to location problems in connection with many interesting geometric problems such as the Fermat-Torricelli problem, the Heron problem, the Sylvester problem, and their generalizations. Of course, we do not expect to touch every aspect of convex analysis, but the book consists of sufficient material for a first course on this subject. It can also serve as supplemental reading material for a course on convex optimization and applications. View full abstract»
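
    As a concrete example of the location problems mentioned above, the Python sketch below solves a small Fermat-Torricelli instance (find the point minimizing the sum of distances to given points) with the classical Weiszfeld iteration. The points are arbitrary and the method is a standard one, not necessarily the treatment given in the book.

        # Illustrative sketch: Weiszfeld's iteration for the Fermat-Torricelli point.
        import numpy as np

        def weiszfeld(points, iters=100, eps=1e-9):
            x = points.mean(axis=0)                      # start at the centroid
            for _ in range(iters):
                d = np.linalg.norm(points - x, axis=1)
                d = np.maximum(d, eps)                   # avoid division by zero at a data point
                w = 1.0 / d
                x = (points * w[:, None]).sum(axis=0) / w.sum()
            return x

        pts = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])
        print(weiszfeld(pts))   # the point minimizing total distance to the three vertices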

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Contextual Analysis of Videos

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Keywords: video context analysis, interactive swarms, particle swarm optimization, multi-target tracking, social behavior, crowded scenes, abnormality detection, visual surveillance, manifold embedding, crowd analysis, spatio-temporal Laplacian Eigenmap. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Resilient Architecture Design for Voltage Variation

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Shrinking feature size and diminishing supply voltage are making circuits sensitive to supply voltage fluctuations within the microprocessor, caused by normal workload activity changes. If left unattended, voltage fluctuations can lead to timing violations or even transistor lifetime issues that degrade processor robustness. Mechanisms that learn to tolerate, avoid, and eliminate voltage fluctuations based on program and microarchitectural events can help steer the processor clear of danger, thus enabling tighter voltage margins that improve performance or lower power consumption. We describe the problem of voltage variation and the factors that influence this variation during processor design and operation. We also describe a variety of runtime hardware and software mitigation techniques that either tolerate, avoid, and/or eliminate voltage violations. We hope processor architects will find the information useful since tolerance, avoidance, and elimination are generalizable constructs that can serve as a basis for addressing other reliability challenges as well. Table of Contents: Introduction / Modeling Voltage Variation / Understanding the Characteristics of Voltage Variation / Traditional Solutions and Emerging Solution Forecast / Allowing and Tolerating Voltage Emergencies / Predicting and Avoiding Voltage Emergencies / Eliminating Recurring Voltage Emergencies / Future Directions on Resiliency View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Lying by Approximation:The Truth about Finite Element Analysis

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In teaching an introduction to the finite element method at the undergraduate level, a prudent mix of theory and applications is often sought. In many cases, analysts use the finite element method to perform parametric studies on potential designs to size parts, weed out less desirable design scenarios, and predict system behavior under load. In this book, we discuss common pitfalls encountered by many finite element analysts, in particular, students encountering the method for the first time. We present a variety of simple problems in axial, bending, torsion, and shear loading that combine the students' knowledge of theoretical mechanics, numerical methods, and approximations particular to the finite element method itself. We also present case studies in which analyses are coupled with experiments to emphasize validation, illustrate where interpretations of numerical results can be misleading, and what can be done to allay such tendencies. Challenges in presenting the necessary mix of theory and applications in a typical undergraduate course are discussed. We also discuss a list of tips and rules of thumb for applying the method in practice. Table of Contents: Preface / Acknowledgments / Guilty Until Proven Innocent / Let's Get Started / Where We Begin to Go Wrong / It's Only a Model / Wisdom Is Doing It / Summary / Afterword / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Database Anonymization:Privacy Models, Data Utility, and Microaggregation-based Inter-model Connections

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The current social and economic context increasingly demands open data to improve scientific research and decision making. However, when published data refer to individual respondents, disclosure risk limitation techniques must be implemented to anonymize the data and guarantee by design the fundamental right to privacy of the subjects the data refer to. Disclosure risk limitation has a long record in the statistical and computer science research communities, who have developed a variety of privacy-preserving solutions for data releases. This Synthesis Lecture provides a comprehensive overview of the fundamentals of privacy in data releases, focusing on the computer science perspective. Specifically, we detail the privacy models, anonymization methods, and utility and risk metrics that have been proposed so far in the literature. In addition, as a more advanced topic, we identify and discuss in detail connections between several privacy models (i.e., how to accumulate the privacy guarantees they offer to achieve more robust protection and when such guarantees are equivalent or complementary); we also explore the links between anonymization methods and privacy models (how anonymization methods can be used to enforce privacy models and thereby offer ex ante privacy guarantees). These latter topics are relevant to researchers and advanced practitioners, who will gain a deeper understanding of the available data anonymization solutions and the privacy guarantees they can offer. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Electronics: Book 2:Amplifiers: Analysis and Design

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book, Amplifiers: Analysis and Design, is the second of four books of a larger work, Fundamentals of Electronics. It comprises four chapters that describe the fundamentals of amplifier performance. Beginning with a review of two-port analysis, the first chapter introduces the modeling of the response of transistors to AC signals. Basic one-transistor amplifiers are extensively discussed. The next chapter expands the discussion to multiple-transistor amplifiers. The coverage of simple amplifiers is concluded with a chapter that examines power amplifiers. This discussion defines the limits of small-signal analysis and explores the realm where these simplifying assumptions are no longer valid and distortion becomes present. The final chapter concludes the book with the first of two chapters in Fundamentals of Electronics on the significant topic of feedback amplifiers. Fundamentals of Electronics has been designed primarily for use in an upper division course in electronics for electrical engineering students. Typically such a course spans a full academic year consisting of two semesters or three quarters. As such, Amplifiers: Analysis and Design, and two other books, Electronic Devices and Circuit Applications, and Active Filters and Amplifier Frequency Response, form an appropriate body of material for such a course. Secondary applications include the use with Electronic Devices and Circuit Applications in a one-semester electronics course for engineers or as a reference for practicing engineers. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multi-Core Cache Hierarchies

    Copyright Year: 2011

    Morgan and Claypool eBooks

    A key determinant of overall system performance and power dissipation is the cache hierarchy, since access to off-chip memory consumes many more cycles and much more energy than on-chip accesses. In addition, multi-core processors are expected to place ever higher bandwidth demands on the memory system. All these issues make it important to avoid off-chip memory access by improving the efficiency of the on-chip cache. Future multi-core processors will have many large cache banks connected by a network and shared by many cores. Hence, many important problems must be solved: cache resources must be allocated across many cores, data must be placed in cache banks that are near the accessing core, and the most important data must be identified for retention. Finally, difficulties in scaling existing technologies require adapting to and exploiting new technology constraints. The book attempts a synthesis of recent cache research that has focused on innovations for multi-core processors. It is an excellent starting point for early-stage graduate students, researchers, and practitioners who wish to understand the landscape of recent cache research. The book is suitable as a reference for advanced computer architecture classes as well as for experienced researchers and VLSI engineers. Table of Contents: Basic Elements of Large Cache Design / Organizing Data in CMP Last Level Caches / Policies Impacting Cache Hit Rates / Interconnection Networks within Large Caches / Technology / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Private Information Retrieval

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book deals with Private Information Retrieval (PIR), a technique allowing a user to retrieve an element from a server in possession of a database without revealing to the server which element is retrieved. PIR has been widely applied to protect the privacy of the user in querying a service provider on the Internet. For example, using PIR, one can query a location-based service provider about the nearest car park without revealing one's location to the server. The first PIR approach was introduced by Chor, Goldreich, Kushilevitz, and Sudan in 1995 in a multi-server setting, where the user retrieves information from multiple database servers, each of which has a copy of the same database. To ensure user privacy in the multi-server setting, the servers must be trusted not to collude. In 1997, Kushilevitz and Ostrovsky constructed the first single-database PIR. Since then, many efficient PIR solutions have been discovered. Beginning with a thorough survey of single-database PIR techniques, this text focuses on the latest technologies and applications in the field of PIR. The main categories are illustrated with recently proposed PIR-based solutions by the authors. Because of its up-to-date treatment of the topic, this text will be highly beneficial to researchers and industry professionals in information security and privacy. View full abstract»
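
    A toy version of the multi-server idea attributed above to Chor, Goldreich, Kushilevitz, and Sudan can be written in a few lines: each of two non-colluding servers XORs together the records named in a random-looking query, and the user XORs the two answers to recover one record. The Python sketch below is a didactic simplification, not a scheme from the book.

        # Didactic two-server PIR sketch: each query set alone looks uniformly random,
        # but the XOR of the two answers recovers exactly the wanted record.
        import secrets

        def server_answer(db, index_set):
            """Each server XORs together the records the query asks for."""
            ans = 0
            for i in index_set:
                ans ^= db[i]
            return ans

        db = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88]   # database held by both servers
        want = 5                                                 # index the user wants

        # Query 1 is a uniformly random subset; query 2 is the same set with `want` toggled.
        q1 = {i for i in range(len(db)) if secrets.randbits(1)}
        q2 = q1 ^ {want}

        record = server_answer(db, q1) ^ server_answer(db, q2)
        assert record == db[want]
        print(hex(record))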

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    MATLAB for Engineering and the Life Sciences

    Copyright Year: 2011

    Morgan and Claypool eBooks

    In recent years, the life sciences have embraced simulation as an important tool in biomedical research. Engineers are also using simulation as a powerful step in the design process. In both arenas, MATLAB has become the gold standard. It is easy to learn, flexible, and has a large and growing user base. MATLAB for Engineering and the Life Sciences is a self-guided tour of the basic functionality of MATLAB along with the functions that are most commonly used in biomedical engineering and other life sciences. Although the text is written for undergraduates, graduate students and academics, those in industry may also find value in learning MATLAB through biologically inspired examples. For instructors, the book is intended to take the emphasis off of learning syntax so that the course can focus more on algorithmic thinking. Although it is not assumed that the reader has taken differential equations or a linear algebra class, there are short introductions to many of these concepts. Following a short history of computing, the MATLAB environment is introduced. Next, vectors and matrices are discussed, followed by matrix-vector operations. The core programming elements of MATLAB are introduced in three successive chapters on scripts, loops, and conditional logic. The last three chapters outline how to manage the input and output of data, create professional quality graphics, and find and use MATLAB toolboxes. Throughout, biomedical examples are used to illustrate MATLAB's capabilities. Table of Contents: Introduction / Matlab Programming Environment / Vectors / Matrices / Matrix -- Vector Operations / Scripts and Functions / Loops / Conditional Logic / Data In, Data Out / Graphics / Toolboxes View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Neural Network Methods in Natural Language Processing

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Neural networks are a family of powerful machine learning models. This book focuses on the application of neural network models to natural language data. The first half of the book (Parts I and II) covers the basics of supervised machine learning and feed-forward neural networks, the basics of working with machine learning over language data, and the use of vector-based rather than symbolic representations for words. It also covers the computation-graph abstraction, which allows one to easily define and train arbitrary neural networks, and is the basis behind the design of contemporary neural network software libraries. The second part of the book (Parts III and IV) introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models. These architectures and techniques are the driving force behind state-of-the-art algorithms for machine translation, syntactic parsing, and many other applications. Finally, we also discuss tree-shaped networks, structured prediction, and the prospects of multi-task learning. View full abstract»
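
    A minimal flavor of the feed-forward, vector-based models covered in the first half of the book is sketched below in Python: word vectors are averaged and passed through one hidden layer to a softmax. The vocabulary, dimensions, and untrained weights are hypothetical and purely illustrative.

        # Illustrative sketch: a one-hidden-layer feed-forward network over averaged word vectors.
        import numpy as np

        rng = np.random.default_rng(0)
        vocab = {"the": 0, "movie": 1, "was": 2, "great": 3, "terrible": 4}
        E = rng.normal(size=(len(vocab), 8))      # word embedding matrix (vectors, not symbols)
        W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
        W2, b2 = rng.normal(size=(16, 2)), np.zeros(2)

        def predict(tokens):
            x = E[[vocab[t] for t in tokens]].mean(axis=0)    # continuous bag of words
            h = np.tanh(x @ W1 + b1)                          # hidden layer
            logits = h @ W2 + b2
            return np.exp(logits) / np.exp(logits).sum()      # softmax over two classes

        print(predict(["the", "movie", "was", "great"]))      # untrained, so outputs are arbitrary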

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multipath Effects in GPS Receivers:A Primer

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Autonomous vehicles use global navigation satellite systems (GNSS) to provide a position within a few centimeters of truth. Centimeter positioning requires accurate measurement of each satellite's direct path propagation time. Multipath corrupts the propagation time estimate by creating a time-varying bias. A GNSS receiver model is developed and the effects of multipath are investigated. MATLAB™ code is provided to enable readers to run simple GNSS receiver simulations. More specifically, GNSS signal models are presented and multipath mitigation techniques are described for various multipath conditions. Appendices are included in the booklet to derive some of the basics of early-minus-late code synchronization methods. Details on the numerically controlled oscillator and its properties are also given in the appendix. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Privacy-Preserving Data Publishing:An Overview

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Privacy preservation has become a major issue in many data analysis applications. When a data set is released to other parties for data analysis, privacy-preserving techniques are often required to reduce the possibility of identifying sensitive information about individuals. For example, in medical data, sensitive information can be the fact that a particular patient suffers from HIV. In spatial data, sensitive information can be a specific location of an individual. In web surfing data, the information that a user browses certain websites may be considered sensitive. Consider a dataset containing some sensitive information that is to be released to the public. In order to protect sensitive information, the simplest solution is not to disclose the information. However, this would be overkill, since it would hinder the process of data analysis over the data from which we can find interesting patterns. Moreover, in some applications, the data must be disclosed under government regulations. Alternatively, the data owner can first modify the data such that the modified data can guarantee privacy and, at the same time, the modified data retains sufficient utility and can be released to other parties safely. This process is usually called privacy-preserving data publishing. In this monograph, we study how the data owner can modify the data and how the modified data can preserve privacy and protect sensitive information. Table of Contents: Introduction / Fundamental Concepts / One-Time Data Publishing / Multiple-Time Data Publishing / Graph Data / Other Data Types / Future Research Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Measuring User Engagement

    Copyright Year: 2014

    Morgan and Claypool eBooks

    User engagement refers to the quality of the user experience that emphasizes the positive aspects of interacting with an online application and, in particular, the desire to use that application longer and repeatedly. User engagement is a key concept in the design of online applications (whether for desktop, tablet or mobile), motivated by the observation that successful applications are not just used, but are engaged with. Users invest time, attention, and emotion in their use of technology, and seek to satisfy pragmatic and hedonic needs. Measurement is critical for evaluating whether online applications are able to successfully engage users, and may inform the design and use of applications. User engagement is a multifaceted, complex phenomenon; this gives rise to a number of potential measurement approaches. Common ways to evaluate user engagement include using self-report measures, e.g., questionnaires; observational methods, e.g., facial expression analysis, speech analysis; neuro-physiological signal processing methods, e.g., respiratory and cardiovascular accelerations and decelerations, muscle spasms; and web analytics, e.g., number of site visits, click depth. These methods represent various trade-offs in terms of the setting (laboratory versus "in the wild"), object of measurement (user behaviour, affect or cognition) and scale of data collected. For instance, small-scale user studies are deep and rich, but limited in terms of generalizability, whereas large-scale web analytic studies are powerful but negate users' motivation and context. The focus of this book is how user engagement is currently being measured and various considerations for its measurement. Our goal is to leave readers with an appreciation of the various ways in which to measure user engagement, and their associated strengths and weaknesses. We emphasize the multifaceted nature of user engagement and the unique contextual constraints that come to bear upon attempts to measure engagement in different settings, and across different user groups and web domains. At the same time, this book advocates for the development of "good" measures and good measurement practices that will advance the study of user engagement and improve our understanding of this construct, which has become so vital in our wired world. Table of Contents: Preface / Acknowledgments / Introduction and Scope / Approaches Based on Self-Report Methods / Approaches Based on Physiological Measurements / Approaches Based on Web Analytics / Beyond Desktop, Single Site, and Single Task / Enhancing the Rigor of User Engagement Methods and Measures / Conclusions and Future Research Directions / Bibliography / Authors' Biographies / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis Techniques for Information Security

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Increasingly, our critical infrastructures are reliant on computers. We see examples of such infrastructures in several domains, including medical, power, telecommunications, and finance. Although automation has advantages, increased reliance on computers exposes our critical infrastructures to a wider variety and higher likelihood of accidental failures and malicious attacks. Disruption of services caused by such undesired events can have catastrophic effects, such as the loss of essential services and huge financial losses. The increased reliance of critical services on our cyberinfrastructure and the dire consequences of security breaches have highlighted the importance of information security. Authorization, security protocols, and software security are three central areas in security in which there have been significant advances in developing systematic foundations and analysis methods that work for practical systems. This book provides an introduction to this work, covering representative approaches, illustrated by examples, and providing pointers to additional work in the area. Table of Contents: Introduction / Foundations / Detecting Buffer Overruns Using Static Analysis / Analyzing Security Policies / Analyzing Security Protocols View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Nonimaging Optics in Solar Energy

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Nonimaging optics is a subdiscipline of optics whose development over the last 35–40 years was led by scientists from the University of Chicago and other cooperating individuals and institutions. The approach provides a formalism that allows the design of optical devices that approach the maximum physically attainable geometric concentration for a given set of optical tolerances. This means that it has the potential to revolutionize the design of solar concentrators. In this monograph, the basic practical applications of the techniques of nonimaging optics to solar energy collection and concentration are developed and explained. The formalism for designing a wide variety of concentrator types, such as the compound parabolic concentrator and its many embodiments and variations, is presented. Both advantages and limitations of the approach are reviewed. Practical and economic aspects of concentrator design for both thermal and photovoltaic applications are discussed as well. The whole range of concentrator applications, from simple low-concentration nontracking designs to ultrahigh-concentration multistage configurations, is covered. Table of Contents: Introduction / CPCs / Practical Design of CPC Thermal Collectors / Practical Design of CPC PV Concentrators / Two-Stage Nonimaging Concentrators for Solar Thermal Applications / Two-Stage Nonimaging Concentrators for Solar PV Applications / Selected Demonstrations of Nonimaging Concentrator Performance / The Importance of Economic Factors in Effective Solar Concentrator Design / Ultrahigh Concentration / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Evolving agents to play games is a promising technology. It can provide entertaining opponents for games like Chess or Checkers, matched to a human opponent as an alternative to the perfect and unbeatable opponents embodied by current artificial intelligences. Evolved agents also permit us to explore the strategy space of mathematical games like Prisoner's Dilemma and Rock-Paper-Scissors. This book summarizes, explores, and extends recent work showing that there are many unsuspected factors that must be controlled in order to create a plausible or useful set of agents for modeling cooperation and conflict, deal making, or other social behaviors. The book also provides a proposal for an agent training protocol that is intended as a step toward being able to train humaniform agents—in other words, agents that plausibly model human behavior. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Envisionment and Discovery Collaboratory (EDC): Explorations in Human-Centered Informatics with Tabletop Computing Environments

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The Envisionment and Discovery Collaboratory (EDC) is a long-term research platform exploring immersive socio-technical environments in which stakeholders can collaboratively frame and solve problems and discuss and make decisions in a variety of application domains and different disciplines. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2014

    Morgan and Claypool eBooks

    A response of the engineering profession to the challenges of security, poverty and underdevelopment, environmental sustainability, and native cultures is described. Ethical codes, which govern the behavior of engineers, are examined from a historical perspective linking the prevailing codes to models of the natural world. A new ethical code based on a recently introduced model of Nature as an integral community is provided and discussed. Applications of the new code are described using a case study approach. With the ethical code based on an integral community in place, new design algorithms are developed and also explored using case studies. Implications of the proposed changes in ethics and design on engineering education are considered. Table of Contents: Preface / Acknowledgments / Introduction / Engineering Ethics / Models of the Earth / Engineering in a Morally Deep World / Engineering Design in a Morally Deep World / Implications for Engineering Education / Final Thoughts / References / Author's Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modern Image Quality Assessment

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This Lecture book is about objective image quality assessment—where the aim is to provide computational models that can automatically predict perceptual image quality. The early years of the 21st century have witnessed a tremendous growth in the use of digital images as a means for representing and communicating information. A considerable body of research literature is devoted to methods for improving the appearance of images, or for maintaining the appearance of images that are processed. Nevertheless, the quality of digital images, processed or otherwise, is rarely perfect. Images are subject to distortions during acquisition, compression, transmission, processing, and reproduction. To maintain, control, and enhance the quality of images, it is important for image acquisition, management, communication, and processing systems to be able to identify and quantify image quality degradations. The goals of this book are as follows: (a) to introduce the fundamentals of image quality assessment, and to explain the relevant engineering problems, (b) to give a broad treatment of the current state-of-the-art in image quality assessment, by describing leading algorithms that address these engineering problems, and (c) to provide new directions for future research, by introducing recent models and paradigms that significantly differ from those used in the past. The book is written to be accessible to university students curious about the state-of-the-art of image quality assessment, expert industrial R&D engineers seeking to implement image/video quality assessment systems for specific applications, and academic theorists interested in developing new algorithms for image quality assessment or using existing algorithms to design or optimize other image processing applications. View full abstract»
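
    For orientation, the sketch below computes the classical baseline metric, peak signal-to-noise ratio (PSNR), against which objective perceptual models of the kind surveyed in the book are routinely compared; it is a generic NumPy example rather than code from the book, and the test images are random stand-ins.

        # Baseline full-reference metric: PSNR between a reference image and a distorted copy.
        import numpy as np

        def psnr(reference, distorted, peak=255.0):
            """Peak signal-to-noise ratio in dB for two equal-shaped image arrays."""
            mse = np.mean((np.asarray(reference, float) - np.asarray(distorted, float)) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

        rng = np.random.default_rng(0)
        ref = rng.integers(0, 256, (64, 64)).astype(float)       # stand-in "original" image
        noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255)
        print("PSNR (dB):", round(psnr(ref, noisy), 2))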

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A, B, See... in 3D: A Workbook to Improve 3-D Visualization Skills

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The workbook provides over 100 3-D visualization exercises challenging the student to create three dimensions from two. It is a powerful and effective way to help engineering and architecture educators teach spatial visualization. Most of the 3-D visualization exercises currently being used by students in Design and Graphics classes present the objects in isometric views already in 3-D, asking the viewer to create multiple views, fold patterns, manipulate, reflect, or rotate them. The exercises that present the objects in incomplete multiview projections, asking the students to add missing lines, mostly use real 3-D objects that are more easily recognizable, to help the student correlate 2-D with 3-D. This workbook uses a different approach. Each view of the solid represents a letter of the alphabet. The letters are by definition 2-D representations, and when they are combined to create a 3-D object, visualizing it becomes quite a challenge. This workbook is intended for Engineering, Architecture, and Art students and faculty who want to increase their 3-D visualization skills. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Microcontroller Programming and Interfacing Texas Instruments MSP430: Part II

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book provides a thorough introduction to the Texas Instruments MSP430 microcontroller. The MSP430 is a 16-bit reduced instruction set (RISC) processor that features ultra low power consumption and integrated digital and analog hardware. Variants of the MSP430 microcontroller have been in production since 1993. This provides for a host of MSP430 products including evaluation boards, compilers, and documentation. A thorough introduction to the MSP430 line of microcontrollers, programming techniques, and interface concepts is provided along with considerable tutorial information with many illustrated examples. Each chapter provides laboratory exercises to apply what has been presented in the chapter. The book is intended for an upper level undergraduate course in microcontrollers or mechatronics but may also be used as a reference for capstone design projects. Also, practicing engineers already familiar with another microcontroller, who require a quick tutorial on the microcontroller, will find this book very useful. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Multi-Channel Resource Allocation: Throughput, Delay, and Complexity

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The last decade has seen an unprecedented growth in the demand for wireless services. These services are fueled by applications that often require not only high data rates, but also very low latency to function as desired. However, as wireless networks grow and support increasingly large numbers of users, the control algorithms that allocate wireless resources must also incur only low complexity in order to be implemented in practice. Therefore, there is a pressing need to develop wireless control algorithms that can achieve both high throughput and low delay, but with low-complexity operations. While these three performance metrics, i.e., throughput, delay, and complexity, are widely acknowledged as being among the most important for modern wireless networks, existing approaches often have had to sacrifice a subset of them in order to optimize the others, leading to wireless resource allocation algorithms that either suffer poor performance or are difficult to implement. In contrast, the recent results presented in this book demonstrate that, by cleverly taking advantage of multiple physical or virtual channels, one can develop new low-complexity algorithms that attain both provably high throughput and provably low delay. The book covers both the intra-cell and network-wide settings. In each case, after the pitfalls of existing approaches are examined, new systematic methodologies are provided to develop algorithms that perform provably well in all three dimensions. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Web Page Recommendation Models: Theory and Algorithms

    Copyright Year: 2010

    Morgan and Claypool eBooks

    One of the application areas of data mining is the World Wide Web (WWW or Web), which serves as a huge, widely distributed, global information service for every kind of information such as news, advertisements, consumer information, financial management, education, government, e-commerce, health services, and many other information services. The Web also contains a rich and dynamic collection of hyperlink information, Web page access and usage information, providing sources for data mining. The amount of information on the Web is growing rapidly, as is the number of Web sites and Web pages per Web site. Consequently, it has become more difficult to find relevant and useful information for Web users. Web usage mining is concerned with guiding the Web users to discover useful knowledge and supporting them in decision-making. In that context, predicting the needs of a Web user as she visits Web sites has gained importance. The requirement for predicting user needs in order to guide the user in a Web site and improve the usability of the Web site can be addressed by recommending pages to the user that are related to the interest of the user at that time. This monograph gives an overview of the research in the area of discovering and modeling the users' interest in order to recommend related Web pages. The Web page recommender systems studied in this monograph are categorized according to the data mining algorithms they use for recommendation. Table of Contents: Introduction to Web Page Recommender Systems / Preprocessing for Web Page Recommender Models / Pattern Extraction / Evaluation Metrics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fixed-Point Signal Processing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book is intended to fill the gap between the "ideal precision" digital signal processing (DSP) that is widely taught, and the limited precision implementation skills that are commonly required in fixed-point processors and field programmable gate arrays (FPGAs). These skills are often neglected at the university level, particularly for undergraduates. We have attempted to create a resource both for a DSP elective course and for the practicing engineer with a need to understand fixed-point implementation. Although we assume a background in DSP, Chapter 2 contains a review of basic theory and Chapter 3 reviews random processes to support the noise model of quantization error. Chapter 4 details the binary arithmetic that underlies fixed-point processors and then introduces fractional format for binary numbers. Chapter 5 covers the noise model for quantization error and the effects of coefficient quantization in filters. Because of the numerical sensitivity of IIR filters, they are used extensively as an example system in both Chapters 5 and 6. Fortunately, the principles of dealing with limited precision can be applied to a wide variety of numerically sensitive systems, not just IIR filters. Chapter 6 discusses the problems of product roundoff error and various methods of scaling to avoid overflow. Chapter 7 discusses limit cycle effects and a few common methods for minimizing them. There are a number of simple exercises integrated into the text to allow you to test your understanding. Answers to the exercises are included in the footnotes. A number of MATLAB examples are provided in the text. They generally assume access to the Fixed-Point Toolbox. If you lack access to this software, consider either purchasing or requesting an evaluation license from The Mathworks. The code listed in the text and other helpful MATLAB code is also available at http://www.morganclaypool.com/page/padgett and http://www.rose-hulman.edu/padgett/fpsp. You will also find MATLAB exercises designed to demonstrate each of the four types of error discussed in Chapters 5 and 6. Simulink examples are also provided on the web site. Table of Contents: Getting Started / DSP Concepts / Random Processes and Noise / Fixed Point Numbers / Quantization Effects: Data and Coefficients / Quantization Effects - Round-Off Noise and Overflow / Limit Cycles View full abstract»
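
    The following small Python/NumPy sketch (ours, not the book's MATLAB code) illustrates the fractional Q15 format and the additive rounding-noise model mentioned above: it quantizes a well-scaled signal to 16-bit fractional values and compares the measured error power with the delta^2/12 model.

        # Q15 fractional quantization and the rounding-noise model (illustrative only).
        import numpy as np

        def to_q15(x):
            """Round to the nearest Q15 value (1 sign bit, 15 fractional bits), with saturation."""
            q = np.round(np.asarray(x) * 2 ** 15)
            return np.clip(q, -2 ** 15, 2 ** 15 - 1) / 2 ** 15

        rng = np.random.default_rng(0)
        x = rng.uniform(-1, 1, 100_000)           # well-scaled "ideal precision" signal
        err = to_q15(x) - x                       # quantization error

        delta = 2.0 ** -15                        # quantization step
        print("measured error power:", np.mean(err ** 2))
        print("model delta^2 / 12  :", delta ** 2 / 12)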

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Speech Models

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Speech dynamics refer to the temporal characteristics in all stages of the human speech communication process. This speech “chain” starts with the formation of a linguistic message in a speaker's brain and ends with the arrival of the message in a listener's brain. Given the intricacy of the dynamic speech process and its fundamental importance in human communication, this monograph is intended to provide comprehensive material on mathematical models of speech dynamics and to address the following issues: How do we make sense of the complex speech process in terms of its functional role of speech communication? How do we quantify the special role of speech timing? How do the dynamics relate to the variability of speech that has often been said to seriously hamper automatic speech recognition? How do we put the dynamic process of speech into a quantitative form to enable detailed analyses? And finally, how can we incorporate the knowledge of speech dynamics into computerized speech analysis and recognition algorithms? The answers to all these questions require building and applying computational models for the dynamic speech process. What are the compelling reasons for carrying out dynamic speech modeling? We provide the answer in two related aspects. First, scientific inquiry into the human speech code has been relentlessly pursued for several decades. As an essential carrier of human intelligence and knowledge, speech is the most natural form of human communication. Embedded in the speech code are linguistic (as well as para-linguistic) messages, which are conveyed through four levels of the speech chain. Underlying the robust encoding and transmission of the linguistic messages are the speech dynamics at all the four levels. Mathematical modeling of speech dynamics provides an effective tool in the scientific methods of studying the speech chain. Such scientific studies help understand why humans speak as they do and how humans exploit redundancy and variability by way of multitiered dynamic processes to enhance the efficiency and effectiveness of human speech communication. Second, advancement of human language technology, especially that in automatic recognition of natural-style human speech, is also expected to benefit from comprehensive computational modeling of speech dynamics. The limitations of current speech recognition technology are serious and are well known. A commonly acknowledged and frequently discussed weakness of the statistical model underlying current speech recognition technology is the lack of adequate dynamic modeling schemes to provide correlation structure across the temporal speech observation sequence. Unfortunately, due to a variety of reasons, the majority of current research activities in this area favor only incremental modifications and improvements to the existing HMM-based state-of-the-art. For example, while the dynamic and correlation modeling is known to be an important topic, most of the systems nevertheless employ only an ultra-weak form of speech dynamics; e.g., differential or delta parameters. Strong-form dynamic speech modeling, which is the focus of this monograph, may serve as an ultimate solution to this problem. After the introduction chapter, the main body of this monograph consists of four chapters. They cover various aspects of theory, algorithms, and applications of dynamic speech models, and provide a comprehensive survey of the research work in this area spanning the past 20 years. This monograph is intended as advanced material in speech and signal processing for graduate-level teaching, for professionals and engineering practitioners, as well as for seasoned researchers and engineers specializing in speech processing. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Biosystems Approach to Industrial Patient Monitoring and Diagnostic Devices

    Copyright Year: 2008

    Morgan and Claypool eBooks

    A medical device is an apparatus that uses engineering and scientific principles to interface to physiology and diagnose or treat a disease. In this Lecture, we specifically consider those medical devices that are computer based, and are therefore referred to as medical instruments. Further, the medical instruments we discuss are those that incorporate system theory into their designs. We divide these types of instruments into those that provide continuous observation and those that provide a single snapshot of health information. These instruments are termed patient monitoring devices and diagnostic devices, respectively. Within this Lecture, we highlight some of the common system theory techniques that are part of the toolkit of medical device engineers in industry. These techniques include the pseudorandom binary sequence, adaptive filtering, wavelet transforms, the autoregressive moving average model with exogenous input, artificial neural networks, fuzzy models, and fuzzy control. Because the clinical usage requirements for patient monitoring and diagnostic devices are so high, system theory is the preferred substitute for heuristic, empirical processing during noise artifact minimization and classification. Table of Contents: Preface / Medical Devices / System Theory / Patient Monitoring Devices / Diagnostic Devices / Conclusion / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing for User Engagement: Aesthetic and Attractive User Interfaces

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work oriented applications such as games, education and emerging interactive Web 2.0. The book starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion and mood. The perspective of aesthetics is expanded towards interaction and engagement to propose design treatments, metaphors, and interactive techniques which can promote user interest, excitement and satisfying experiences. This is followed by reviewing the design process and design treatments which can promote aesthetic perception and engaging interaction. The final part of the book provides design guidelines and principles drawn from the interaction and graphical design literature which are cross-referenced to issues in the design process. Examples of designs and design treatments are given to illustrate principles and advice, accompanied by critical reflection. Table of Contents: Introduction / Psychology of User Engagement / UE Design Process / Design Principles and Guidelines / Perspectives and Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    VIVO: A Semantic Approach to Scholarly Networking and Discovery

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The world of scholarship is changing rapidly. Increasing demands on scholars, the growing size and complexity of questions and problems to be addressed, and advances in sophistication of data collection, analysis, and presentation require new approaches to scholarship. A ubiquitous, open information infrastructure for scholarship, consisting of linked open data, open-source software tools, and a community committed to sustainability are emerging to meet the needs of scholars today. This book provides an introduction to VIVO, http://vivoweb.org/, a tool for representing information about research and researchers -- their scholarly works, research interests, and organizational relationships. VIVO provides an expressive ontology, tools for managing the ontology, and a platform for using the ontology to create and manage linked open data for scholarship and discovery. Begun as a project at Cornell and further developed by an NIH funded consortium, VIVO is now being established as an open source project with community participation from around the world. By the end of 2012, over 20 countries and 50 organizations will provide information in VIVO format on more than one million researchers and research staff, including publications, research resources, events, funding, courses taught, and other scholarly activity. The rapid growth of VIVO and of VIVO-compatible data sources speaks to the fundamental need to transform scholarship for the 21st century. Table of Contents: Scholarly Networking Needs and Desires / The VIVO Ontology / Implementing VIVO and Filling It with Life / Case Study: University of Colorado at Boulder / Case Study: Weill Cornell Medical College / Extending VIVO / Analyzing and Visualizing VIVO Data / The Future of VIVO: Growing the Community View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing for Digital Reading

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Reading is a complex human activity that has evolved, and co-evolved, with technology over thousands of years. Mass printing in the fifteenth century firmly established what we know as the modern book, with its physical format of covers and paper pages, and now-standard features such as page numbers, footnotes, and diagrams. Today, electronic documents are enabling paperless reading supported by eReading technologies such as Kindles and Nooks, yet a high proportion of users still opt to print on paper before reading. This persistent habit of "printing to read" is one sign of the shortcomings of digital documents -- although the popularity of eReaders is one sign of the shortcomings of paper. How do we get the best of both worlds? The physical properties of paper (for example, it is light, thin, and flexible) contribute to the ease with which physical documents are manipulated; but these properties have a completely different set of affordances to their digital equivalents. Paper can be folded, ripped, or scribbled on almost subconsciously -- activities that require significant cognitive attention in their digital form, if they are even possible. The nearly subliminal interaction that comes from years of learned behavior with paper has been described as lightweight interaction, which is achieved when a person actively reads an article in a way that is so easy and unselfconscious that they are not apt to remember their actions later. Reading is now in a period of rapid change, and digital text is fast becoming the predominant mode of reading. As a society, we are merely at the start of the journey of designing truly effective tools for handling digital text. This book investigates the advantages of paper, how the affordances of paper can be realized in digital form, and what forms best support lightweight interaction for active reading. To understand how to design for the future, we review the ways reading technology and reader behavior have both changed and remained constant over hundreds of years. We explore the reasoning behind reader behavior and introduce and evaluate several user interface designs that implement these lightweight properties familiar from our everyday use of paper. We start by looking back, reviewing the development of reading technology and the progress of research on reading over many years. Drawing key concepts from this review, we move forward to develop and test methods for creating new and more effective interactions for supporting digital reading. Finally, we lay down a set of lightweight attributes which can be used as evidence-based guidelines to improve the usability of future digital reading technologies. By the end of this book, then, we hope you will be equipped to critique the present state of digital reading, and to better design and evaluate new interaction styles and technologies. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Crafting your Research Future: A Guide to Successful Master's and PhD Degrees in Science & Engineering

    Copyright Year: 2012

    Morgan and Claypool eBooks

    What is it like to be a researcher or a scientist? For young people, including graduate students and junior faculty members in universities, how can they identify good ideas for research? How do they conduct solid research to verify and realize their new ideas? How can they formulate their ideas and research results into high-quality articles, and publish them in highly competitive journals and conferences? What are effective ways to supervise graduate students so that they can establish themselves quickly in their research careers? In this book, Ling and Yang answer these questions in a step-by-step manner with specific and concrete examples from their first-hand research experience. Table of Contents: Acknowledgments / Preface / Basics of Research / Goals of Ph.D. Research / Getting Started: Finding New Ideas and Organizing Your Plans / Conducting Solid Research / Writing and Publishing Papers / Misconceptions and Tips for Paper Writing / Writing and Defending a Ph.D. Thesis / Life after Ph.D. / Summary / References / Author Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Link Reversal Algorithms

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Link reversal is a versatile algorithm design technique that has been used in numerous distributed algorithms for a variety of problems. The common thread in these algorithms is that the distributed system is viewed as a graph, with vertices representing the computing nodes and edges representing some other feature of the system (for instance, point-to-point communication channels or a conflict relationship). Each algorithm assigns a virtual direction to the edges of the graph, producing a directed version of the original graph. As the algorithm proceeds, the virtual directions of some of the links in the graph change in order to accomplish some algorithm-specific goal. The criterion for changing link directions is based on information that is local to a node (such as the node having no outgoing links) and thus this approach scales well, a feature that is desirable for distributed algorithms. This monograph presents, in a tutorial way, a representative sampling of the work on link-reversal-based distributed algorithms. The algorithms considered solve routing, leader election, mutual exclusion, distributed queueing, scheduling, and resource allocation. The algorithms can be roughly divided into two types, those that assume a more abstract graph model of the networks, and those that take into account more realistic details of the system. In particular, these more realistic details include the communication between nodes, which may be through asynchronous message passing, and possible changes in the graph, for instance, due to movement of the nodes. We have not attempted to provide a comprehensive survey of all the literature on these topics. Instead, we have focused in depth on a smaller number of fundamental papers, whose common thread is that link reversal provides a way for nodes in the system to observe their local neighborhoods, take only local actions, and yet cause global problems to be solved. We conjecture that future interesting uses of link reversal are yet to be discovered. Table of Contents: Introduction / Routing in a Graph: Correctness / Routing in a Graph: Complexity / Routing and Leader Election in a Distributed System / Mutual Exclusion in a Distributed System / Distributed Queueing / Scheduling in a Graph / Resource Allocation in a Distributed System / Conclusion View full abstract»
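
    As a rough, self-contained sketch of the full-reversal idea described above (our illustration, not an algorithm transcribed from the monograph), the Python function below repeatedly lets every node other than the destination that has no outgoing link reverse all of its incident links, until the graph is destination-oriented; it assumes a connected graph whose initial orientation is acyclic.

        # Full link reversal on a directed graph given as a set of edges (u, v) meaning u -> v.
        def full_reversal(edges, destination):
            edges = set(edges)
            nodes = {x for e in edges for x in e}
            while True:
                has_outgoing = {u for (u, _v) in edges}
                sinks = [n for n in nodes if n != destination and n not in has_outgoing]
                if not sinks:                      # every node now has a route toward the destination
                    return edges
                for s in sinks:                    # a sink reverses all of its (incoming) links
                    edges = {(s, u) if v == s else (u, v) for (u, v) in edges}

        # Example: a small graph initially oriented partly away from destination "D".
        print(full_reversal({("A", "B"), ("B", "C"), ("D", "A")}, "D"))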

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Electromagnetics 2: Quasistatics and Waves

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book is the second of two volumes which have been created to provide an understanding of the basic principles and applications of electromagnetic fields for electrical engineering students. Fundamentals of Electromagnetics Vol 2: Quasistatics and Waves examines how the low-frequency models of lumped elements are modified to include parasitic elements. For even higher frequencies, wave behavior in space and on transmission lines is explained. Finally, the textbook concludes with details of transmission line properties and applications. Upon completion of this book and its companion Fundamentals of Electromagnetics Vol 1: Internal Behavior of Lumped Elements, with a focus on the DC and low-frequency behavior of electromagnetic fields within lumped elements, students will have gained the necessary knowledge to progress to advanced studies of electromagnetics. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    OFDM Systems for Wireless Communications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Orthogonal Frequency Division Multiplexing (OFDM) systems are widely used in the standards for digital audio/video broadcasting, WiFi and WiMax. Being a frequency-domain approach to communications, OFDM has important advantages in dealing with the frequency-selective nature of high data rate wireless communication channels. As the needs for operating with higher data rates become more pressing, OFDM systems have emerged as an effective physical-layer solution. This short monograph is intended as a tutorial which highlights the deleterious aspects of the wireless channel and presents why OFDM is a good choice as a modulation that can transmit at high data rates. The system-level approach we shall pursue will also point out the disadvantages of OFDM systems especially in the context of peak to average ratio, and carrier frequency synchronization. Finally, simulation of OFDM systems will be given due prominence. Simple MATLAB programs are provided for bit error rate simulation using a discrete-time OFDM representation. Software is also provided to simulate the effects of inter-block-interference, inter-carrier-interference and signal clipping on the error rate performance. Different components of the OFDM system are described, and detailed implementation notes are provided for the programs. The program can be downloaded here. Table of Contents: Introduction / Modeling Wireless Channels / Baseband OFDM System / Carrier Frequency Offset / Peak to Average Power Ratio / Simulation of the Performance of OFDM Systems / Conclusions View full abstract»
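
    The book's simulations are in MATLAB; as a language-agnostic illustration of the discrete-time OFDM representation it describes, the NumPy sketch below sends one QPSK-modulated OFDM block through a toy multipath channel, removes the cyclic prefix, and recovers the bits with a one-tap per-subcarrier equalizer (the FFT size, prefix length, and channel taps are arbitrary assumptions, not values from the book).

        # One noiseless OFDM block: IFFT, cyclic prefix, multipath channel, FFT, one-tap equalizer.
        import numpy as np

        N, cp = 64, 16                                           # subcarriers and cyclic-prefix length
        rng = np.random.default_rng(0)
        bits = rng.integers(0, 2, 2 * N)
        symbols = (2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)   # QPSK mapping

        tx = np.fft.ifft(symbols, N)                             # time-domain OFDM block
        tx = np.concatenate([tx[-cp:], tx])                      # prepend cyclic prefix

        h = np.array([1.0, 0.4, 0.2])                            # toy channel, shorter than the prefix
        rx = np.convolve(tx, h)[: N + cp]

        Y = np.fft.fft(rx[cp : cp + N], N)                       # drop prefix, back to frequency domain
        eq = Y / np.fft.fft(h, N)                                # one-tap equalizer per subcarrier

        bits_hat = np.empty(2 * N, dtype=int)
        bits_hat[0::2], bits_hat[1::2] = (eq.real > 0), (eq.imag > 0)
        assert np.array_equal(bits_hat, bits)                    # noiseless channel: perfect recovery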

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Generating Plans from Proofs: The Interpolation-based Approach to Query Reformulation

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Query reformulation refers to a process of translating a source query—a request for information in some high-level logic-based language—into a target plan that abides by certain interface restrictions. Many practical problems in data management can be seen as instances of the reformulation problem. For example, the problem of translating an SQL query written over a set of base tables into another query written over a set of views; the problem of implementing a query via translating to a program calling a set of database APIs; the problem of implementing a query using a collection of web services. In this book we approach query reformulation in a very general setting that encompasses all the problems above, by relating it to a line of research within mathematical logic. For many decades logicians have looked at the problem of converting "implicit definitions" into "explicit definitions," using an approach known as interpolation. We will review the theory of interpolation, and explain its close connection with query reformulation. We will give a detailed look at how the interpolation-based approach is used to generate translations between logic-based queries over different vocabularies, and also how it can be used to go from logic-based queries to programs. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This volume presents novel computational models for representing digital humans and their interactions with other virtual characters and meaningful environments. In this context, we describe efficient algorithms to animate, control, and author human-like agents having their own set of unique capabilities, personalities, and desires. We begin with the lowest level of footstep determination to steer agents in collision-free paths. Steering choices are controlled by navigation in complex environments, including multi-domain planning with dynamically changing situations. Virtual agents are given perceptual capabilities analogous to those of real people, including sound perception, multi-sense attention, and understanding of environment semantics which affect their behavior choices. The roles and impacts of individual attributes, such as memory and personality are explored. The animation challenges of integrating a number of simultaneous behavior and movement demands on an agent are addressed through an open source software system. Finally, the creation of stories and narratives with groups of agents subject to planning and environmental constraints culminates the presentation. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mismatch and Noise in Modern IC Processes

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Component variability, mismatch, and various noise effects are major contributors to design limitations in most modern IC processes. Mismatch and Noise in Modern IC Processes examines these related effects and how they affect the building block circuits of modern integrated circuits, from the perspective of a circuit designer. Variability usually refers to a large scale variation that can occur on a wafer to wafer and lot to lot basis, and over long distances on a wafer. This phenomenon is well understood and the effects of variability are included in most integrated circuit design with the use of corner or statistical component models. Mismatch, which is the emphasis of section I of the book, is a local level of variability that leaves the characteristics of adjacent transistors unmatched. This is of particular concern in certain analog and memory systems, but also has an effect on digital logic schemes, where uncertainty is introduced into delay times, which can reduce margins and introduce 'race' conditions. Noise is a dynamic effect that causes a local mismatch or variability that can vary during operation of a circuit, and is considered in section II. Noise can be the result of atomic effects in devices or circuit interactions, and both of these are discussed in terms of analog and digital circuitry. Table of Contents: Part I: Mismatch / Introduction / Variability and Mismatch in Digital Systems / Variability and Mismatch in Analog Systems I / Variability and Mismatch in Analog Systems II / Lifetime-Induced Variability / Mismatch in Nonconventional Processes / Mismatch Correction Circuits / Part II: Noise / Component and Digital Circuit Noise / Noise Effects in Digital Systems / Noise Effects in Analog Systems / Circuit Design to Minimize Noise Effects / Noise Considerations in SOI View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Physically Unclonable Functions (PUFs): Applications, Models, and Future Directions

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Today, embedded systems are used in many security-critical applications, from access control, electronic tickets, sensors, and smart devices (e.g., wearables) to automotive applications and critical infrastructures. These systems are increasingly used to produce and process both security-critical and privacy-sensitive data, which bear many security and privacy risks. Establishing trust in the underlying devices and making them resistant to software and hardware attacks is a fundamental requirement in many applications and a challenging, yet unsolved, task. Solutions solely based on software can never ensure their own integrity and trustworthiness, while resource constraints and economic factors often prevent the integration of sophisticated security hardware and cryptographic co-processors. In this context, Physically Unclonable Functions (PUFs) are an emerging and promising technology to establish trust in embedded systems with minimal hardware requirements. This book explores the design of trusted embedded systems based on PUFs. Specifically, it focuses on the integration of PUFs into secure and efficient cryptographic protocols that are suitable for a variety of embedded systems. It exemplarily discusses how PUFs can be integrated into lightweight device authentication and attestation schemes, which are popular and highly relevant applications of PUFs in practice. For the integration of PUFs into secure cryptographic systems, it is essential to have a clear view of their properties. This book gives an overview of different approaches to evaluate the properties of PUF implementations and presents the results of a large scale security analysis of different PUF types implemented in application-specific integrated circuits (ASICs). To analyze the security of PUF-based schemes as is common in modern cryptography, it is necessary to have a security framework for PUFs and PUF-based systems. In this book, we give a flavor of the formal modeling of PUFs, which is in its beginnings and is still undergoing further refinement in current research. The objective of this book is to provide a comprehensive overview of the current state of secure PUF-based cryptographic system design and the related challenges and limitations. Table of Contents: Preface / Introduction / Basics of Physically Unclonable Functions / Attacks on PUFs and PUF-based Systems / Advanced PUF Concepts / PUF Implementations and Evaluation / PUF-based Cryptographic Protocols / Security Model for PUF-based Systems / Conclusion / Terms and Abbreviations / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tensor Properties of Solids: Part Two: Transport Properties of Solids

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Tensor Properties of Solids presents the phenomenological development of solid state properties represented as matter tensors in two parts: Part I on equilibrium tensor properties and Part II on transport tensor properties. Part I begins with an introduction to tensor notation, transformations, algebra, and calculus together with the matrix representations. Crystallography, as it relates to tensor properties of crystals, completes the background treatment. A generalized treatment of solid-state equilibrium thermodynamics leads to the systematic correlation of equilibrium tensor properties. This is followed by developments covering first-, second-, third-, and higher-order tensor effects. Included are the generalized compliance and rigidity matrices for first-order tensor properties, Maxwell relations, effect of measurement conditions, and the dependent coupled effects and use of interaction diagrams. Part I concludes with the second- and higher-order effects, including numerous optical tensor properties. Part II presents the driving forces and fluxes for the well-known proper conductivities. An introduction to irreversible thermodynamics includes the concepts of microscopic reversibility, Onsager's reciprocity principle, entropy density production, and the proper choice of the transport parameters. This is followed by the force-flux equations for electronic charge and heat flow and the relationships between the proper conductivities and phenomenological coefficients. The thermoelectric effects in solids are discussed and extended to the piezothermoelectric and piezoresistance tensor effects. The subjects of thermomagnetic, galvanomagnetic, and thermogalvanomagnetic effects are developed together with other higher-order magnetotransport property tensors. A glossary of terms, expressions, and symbols is provided at the end of the text, and end-of-chapter problems are provided on request. Endnotes provide the necessary references for further reading. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Embedded Systems: Using ANSI C and the Arduino Development Environment

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many electrical and computer engineering projects involve some kind of embedded system in which a microcontroller sits at the center as the primary source of control. The recently developed Arduino development platform includes an inexpensive hardware development board hosting an eight-bit ATMEL ATmega-family processor and a Java-based software-development environment. These features allow an embedded systems beginner to focus their attention on learning how to write embedded software instead of wasting time overcoming the learning curve of engineering CAD tools. The goal of this text is to introduce fundamental methods for creating embedded software in general, with a focus on ANSI C. The Arduino development platform provides a great means for accomplishing this task. As such, this work presents embedded software development using 100% ANSI C for the Arduino's ATmega328P processor. We deviate from using the Arduino-specific Wiring libraries in an attempt to provide the most general embedded methods. In this way, the reader will acquire essential knowledge necessary for work on future projects involving other processors. Particular attention is paid to the notorious issue of using C pointers in order to gain direct access to microprocessor registers, which ultimately allow control over all peripheral interfacing. Table of Contents: Introduction / ANSI C / Introduction to Arduino / Embedded Debugging / ATmega328P Architecture / General-Purpose Input/Output / Timer Ports / Analog Input Ports / Interrupt Processing / Serial Communications / Assembly Language / Non-volatile Memory View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    MATLAB® Software for the Code Excited Linear Prediction Algorithm: The Federal Standard-1016

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book describes several modules of the Code Excited Linear Prediction (CELP) algorithm. The authors use the Federal Standard-1016 CELP MATLAB® software to describe in detail several functions and parameter computations associated with analysis-by-synthesis linear prediction. The book begins with a description of the basics of linear prediction followed by an overview of the FS-1016 CELP algorithm. Subsequent chapters describe the various modules of the CELP algorithm in detail. In each chapter, an overall functional description of CELP modules is provided along with detailed illustrations of their MATLAB® implementation. Several code examples and plots are provided to highlight some of the key CELP concepts. A link to the MATLAB® code is provided within the book. Table of Contents: Introduction to Linear Predictive Coding / Autocorrelation Analysis and Linear Prediction / Line Spectral Frequency Computation / Spectral Distortion / The Codebook Search / The FS-1016 Decoder View full abstract»
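
    As a minimal companion to the linear prediction material, the Python sketch below (our own, in place of the book's MATLAB, and not FS-1016 code) computes short-term LPC coefficients for one frame with the autocorrelation method by solving the Toeplitz normal equations directly; the frame itself is a synthetic stand-in for voiced speech.

        # Autocorrelation-method linear prediction for a single speech-like frame.
        import numpy as np

        def lpc_autocorrelation(frame, order=10):
            """Return predictor coefficients a[1..order] such that x[n] ~ sum_k a[k] * x[n-k]."""
            frame = np.asarray(frame, float)
            r = np.correlate(frame, frame, mode="full")[len(frame) - 1:][: order + 1]
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])  # Toeplitz
            return np.linalg.solve(R, r[1 : order + 1])

        rng = np.random.default_rng(0)
        n = np.arange(160)                           # one 20 ms frame at 8 kHz
        frame = np.sin(2 * np.pi * 0.05 * n) * np.exp(-n / 200.0) + 0.01 * rng.standard_normal(160)
        print(lpc_autocorrelation(frame, order=10))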

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Atmel AVR Microcontroller Primer: Programming and Interfacing

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers a primer on the Atmel AVR microcontroller. Our approach is to provide the fundamental skills to quickly get up and operating with this internationally popular microcontroller. The Atmel ATmega16 is used as a representative sample of the AVR line. The knowledge you gain on the ATmega16 can be easily translated to every other microcontroller in the AVR line. We cover the main subsystems aboard the ATmega16, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying hardware and software to exercise the subsystem. In all examples, we use the C programming language. We conclude with a detailed chapter describing how to interface the microcontroller to a wide variety of input and output devices. Table of Contents: Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog-to-Digital Conversion / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / ATmega16 Register Set / ATmega16 Header File View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Arduino Microcontroller Processing for Everyone: Part II

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about the Arduino microcontroller and the Arduino concept. The visionary Arduino team of Massimo Banzi, David Cuartielles, Tom Igoe, Gianluca Martino, and David Mellis launched a new innovation in microcontroller hardware in 2005, the concept of open source hardware. Their approach was to openly share details of microcontroller-based hardware design platforms to stimulate the sharing of ideas and promote innovation. This concept has been popular in the software world for many years. This book is intended for a wide variety of audiences including students of the fine arts, middle and senior high school students, engineering design students, and practicing scientists and engineers. To meet this wide audience, the book has been divided into sections to satisfy the need of each reader. The book contains many software and hardware examples to assist the reader in developing a wide variety of systems. For the examples, the Arduino Duemilanove, with its Atmel ATmega328 processor, is employed as the target platform. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Elastic Shape Analysis of Three-Dimensional Objects

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Statistical analysis of shapes of 3D objects is an important problem with a wide range of applications. This analysis is difficult for many reasons, including the fact that objects differ in both geometry and topology. In this manuscript, we narrow the problem by focusing on objects with fixed topology, say objects that are diffeomorphic to unit spheres, and develop tools for analyzing their geometries. The main challenges in this problem are to register points across objects and to perform analysis while being invariant to certain shape-preserving transformations. We develop a comprehensive framework for analyzing shapes of spherical objects, i.e., objects that are embeddings of a unit sphere in ℝ³, including tools for: quantifying shape differences, optimally deforming shapes into each other, summarizing shape samples, extracting principal modes of shape variability, and modeling shape variability associated with populations. An important strength of this framework is that it is elastic: it performs alignment, registration, and comparison in a single unified framework, while being invariant to shape-preserving transformations. The approach is essentially Riemannian in the following sense. We specify natural mathematical representations of surfaces of interest, and impose Riemannian metrics that are invariant to the actions of the shape-preserving transformations. In particular, they are invariant to reparameterizations of surfaces. While these metrics are too complicated to allow broad usage in practical applications, we introduce a novel representation, termed square-root normal fields (SRNFs), that transforms a particular invariant elastic metric into the standard L² metric. As a result, one can use standard techniques from functional data analysis for registering, comparing, and summarizing shapes. Specifically, this results in: pairwise registration of surfaces; computation of geodesic paths encoding optimal deformations; computation of Karcher means and covariances under the shape metric; tangent Principal Component Analysis (PCA) and extraction of dominant modes of variability; and finally, modeling of shape variability using wrapped normal densities. These ideas are demonstrated using two case studies: the analysis of surfaces denoting human bodies in terms of shape and pose variability; and the clustering and classification of the shapes of subcortical brain structures for use in medical diagnosis. This book develops these ideas without assuming advanced knowledge in differential geometry and statistics. We summarize some basic tools from differential geometry in the appendices, and introduce additional concepts and terminology as needed in the individual chapters. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dependency Parsing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Dependency-based methods for syntactic parsing have become increasingly popular in natural language processing in recent years. This book gives a thorough introduction to the methods that are most widely used today. After an introduction to dependency grammar and dependency parsing, followed by a formal characterization of the dependency parsing problem, the book surveys the three major classes of parsing models that are in current use: transition-based, graph-based, and grammar-based models. It continues with a chapter on evaluation and one on the comparison of different methods, and it closes with a few words on current trends and future prospects of dependency parsing. The book presupposes a knowledge of basic concepts in linguistics and computer science, as well as some knowledge of parsing methods for constituency-based representations. Table of Contents: Introduction / Dependency Parsing / Transition-Based Parsing / Graph-Based Parsing / Grammar-Based Parsing / Evaluation / Comparison / Final Thoughts View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    In order to sustain Moore's Law-based device scaling, principal attention has focused on device architectural innovations for improved device performance as per ITRS projections for technology nodes up to 10 nm. Efficient integration of lower substrate temperatures (<300K) into these innovatively configured device structures can enable industry professionals to keep up with the Moore's Law-based scaling curve, conforming with ITRS projections of device performance outcome values. In this prospective review e-book, the authors have systematically reviewed the research results based on scaled device architectures, identified key bottlenecks to sustained scaling-based performance, and, through original device simulation outcomes for a conventional long-channel MOSFET, extracted the variation profile of threshold voltage as a function of substrate temperature, which will be instrumental in reducing subthreshold leakage current in the temperature range 100K-300K. An exploitation methodology to regulate the die temperature to enable the efficient performance of a high-density VLSI circuit is also documented in order to make the lower substrate temperature operation of VLSI circuits and systems on chip process compatible. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Efficient Quadrature Rules for Illumination Integrals: From Quasi Monte Carlo to Bayesian Monte Carlo

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Rendering photorealistic images is a costly process which can take up to several days in the case of high quality images. In most cases, the task of sampling the incident radiance function to evaluate the illumination integral is responsible for an important share of the computation time. Therefore, to reach acceptable rendering times, the illumination integral must be evaluated using a limited set of samples. Such a restriction raises the question of how to obtain the most accurate approximation possible with such a limited set of samples. One must thus ensure that sampling produces the highest amount of information possible by carefully placing and weighting the limited set of samples. Furthermore, the integral evaluation should take into account not only the information brought by sampling but also possible information available prior to sampling, such as the integrand smoothness. This idea of sparse information and the need to fully exploit the little information available is present throughout this book. The presented methods correspond to the state-of-the-art solutions in computer graphics, and take into account information which had so far been underexploited (or even neglected) by the previous approaches. The intended audiences are Ph.D. students and researchers in the field of realistic image synthesis or global illumination algorithms, or any person with a solid background in graphics and numerical techniques. View full abstract»
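
    As a toy Python example of the problem stated above (not one of the book's quadrature rules): estimating a hemispherical illumination integral of L(ω)·cosθ with a small, fixed budget of uniformly placed samples, which is the plain Monte Carlo baseline that quasi-Monte Carlo and Bayesian Monte Carlo methods aim to improve upon; the radiance function here is an invented stand-in.

        # Plain Monte Carlo estimate of I = integral over the hemisphere of L(omega) * cos(theta).
        import numpy as np

        rng = np.random.default_rng(0)

        def radiance(theta, phi):
            return 1.0 + np.cos(theta) ** 4          # toy incident radiance: bright lobe near the zenith

        n = 64                                       # the limited sample budget
        u1, u2 = rng.random(n), rng.random(n)
        theta = np.arccos(u1)                        # directions uniform over the hemisphere
        phi = 2.0 * np.pi * u2
        pdf = 1.0 / (2.0 * np.pi)                    # uniform hemisphere density (per solid angle)

        estimate = np.mean(radiance(theta, phi) * np.cos(theta) / pdf)
        print("Monte Carlo estimate:", estimate)     # exact value for this toy L is 4*pi/3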

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Phonocardiography Signal Processing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The auscultation method is an important diagnostic indicator for hemodynamic anomalies. Heart sound classification and analysis play an important role in the auscultative diagnosis. The term phonocardiography refers to the tracing technique of heart sounds and the recording of cardiac acoustic vibrations by means of a microphone-transducer. Therefore, understanding the nature and source of this signal is important for developing competent tools for further analysis and processing, in order to enhance and optimize the cardiac clinical diagnostic approach. This book gives the reader an inclusive view of the main aspects in phonocardiography signal processing. Table of Contents: Introduction to Phonocardiography Signal Processing / Phonocardiography Acoustics Measurement / PCG Signal Processing Framework / Phonocardiography Wavelets Analysis / Phonocardiography Spectral Analysis / PCG Pattern Classification / Special Application of Phonocardiography / Phonocardiography Acoustic Imaging and Mapping View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Distributed Computing by Oblivious Mobile Robots

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The study of what can be computed by a team of autonomous mobile robots, originally started in robotics and AI, has become increasingly popular in theoretical computer science (especially in distributed computing), where it is now an integral part of the investigations on computability by mobile entities. The robots are identical computational entities located and able to move in a spatial universe; they operate without explicit communication and are usually unable to remember the past; they are extremely simple, with limited resources, and individually quite weak. However, collectively the robots are capable of performing complex tasks, and form a system with desirable fault-tolerant and self-stabilizing properties. The research has been concerned with the computational aspects of such systems. In particular, the focus has been on the minimal capabilities that the robots should have in order to solve a problem. This book focuses on the recent algorithmic results in the field of distributed computing by oblivious mobile robots (unable to remember the past). After introducing the computational model with its nuances, we focus on basic coordination problems: pattern formation, gathering, scattering, leader election, as well as on dynamic tasks such as flocking. For each of these problems, we provide a snapshot of the state of the art, reviewing the existing algorithmic results. In doing so, we outline solution techniques, and we analyze the impact of the different assumptions on the robots' computability power. Table of Contents: Introduction / Computational Models / Gathering and Convergence / Pattern Formation / Scatterings and Coverings / Flocking / Other Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Communication Networks: A Concise Introduction

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book results from many years of teaching an upper division course on communication networks in the EECS department at the University of California, Berkeley. It is motivated by the perceived need for an easily accessible textbook that puts emphasis on the core concepts behind current and next generation networks. After an overview of how today's Internet works and a discussion of the main principles behind its architecture, we discuss the key ideas behind Ethernet, WiFi networks, routing, internetworking and TCP. To make the book as self-contained as possible, brief discussions of probability and Markov chain concepts are included in the appendices. This is followed by a brief discussion of mathematical models that provide insight into the operations of network protocols. Next, the main ideas behind the new generation of wireless networks based on WiMAX and LTE, and the notion of QoS are presented. A concise discussion of the physical layer technologies underlying various networks is also included. Finally, a sampling of topics is presented that may have significant influence on the future evolution of networks, including overlay networks like content delivery and peer-to-peer networks, sensor networks, distributed algorithms, Byzantine agreement and source compression. Table of Contents: The Internet / Principles / Ethernet / WiFi / Routing / Internetworking / Transport / Models / WiMAX & LTE / QOS / Physical Layer / Additional Topics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Big Data Integration

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The big data era is upon us: data are being generated, analyzed, and used at an unprecedented scale, and data-driven decision making is sweeping through all aspects of society. Since the value of data explodes when it can be linked and fused with other data, addressing the big data integration (BDI) challenge is critical to realizing the promise of big data. BDI differs from traditional data integration along the dimensions of volume, velocity, variety, and veracity. First, not only can data sources contain a huge volume of data, but also the number of data sources is now in the millions. Second, because of the rate at which newly collected data are made available, many of the data sources are very dynamic, and the number of data sources is also rapidly exploding. Third, data sources are extremely heterogeneous in their structure and content, exhibiting considerable variety even for substantially similar entities. Fourth, the data sources are of widely differing qualities, with significant differences in the coverage, accuracy and timeliness of data provided. This book explores the progress that has been made by the data integration community on the topics of schema alignment, record linkage and data fusion in addressing these novel challenges faced by big data integration. Each of these topics is covered in a systematic way: first starting with a quick tour of the topic in the context of traditional data integration, followed by a detailed, example-driven exposition of recent innovative techniques that have been proposed to address the BDI challenges of volume, velocity, variety, and veracity. Finally, it presents emerging topics and opportunities that are specific to BDI, identifying promising directions for the data integration community. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Strategic Health Technology Incorporation

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Technology is essential to the delivery of health care but it is still only a tool that needs to be deployed wisely to ensure beneficial outcomes at reasonable costs. Among various categories of health technology, medical equipment has the unique distinction of requiring both high initial investments and costly maintenance during its entire useful life. This characteristic does not, however, imply that medical equipment is more costly than other categories, provided that it is managed properly. The foundation of a sound technology management process is the planning and acquisition of equipment, collectively called technology incorporation. This lecture presents a rational, strategic process for technology incorporation based on experience, some successful and many unsuccessful, accumulated in industrialized and developing countries over the last three decades. The planning step is focused on establishing a Technology Incorporation Plan (TIP) using data collected from an audit of existing technology, evaluating needs, impacts, costs, and benefits, and consolidating the information collected for decision making. The acquisition step implements TIP by selecting equipment based on technical, regulatory, financial, and supplier considerations, and procuring it using one of the multiple forms of purchasing or agreements with suppliers. This incorporation process is generic enough to be used, with suitable adaptations, for a wide variety of health organizations with different sizes and acuity levels, ranging from health clinics to community hospitals to major teaching hospitals and even to entire health systems. Such a broadly applicable process is possible because it is based on a conceptual framework composed of in-depth analysis of the basic principles that govern each stage of technology lifecycle. Using this incorporation process, successful TIPs have been created and implemented, thereby contributing to the improvement of healthcare services and limiting the associated expenses. Table of Contents: Introduction / Conceptual Framework / The Incorporation Process / Discussion / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Boolean Differential Calculus

    Copyright Year: 2017

    Morgan and Claypool eBooks

    The Boolean Differential Calculus (BDC) is a very powerful theory that extends the basic concepts of Boolean Algebras significantly. Its applications are based on Boolean spaces 𝔹 and 𝔹ⁿ, Boolean operations, and basic structures such as Boolean Algebras and Boolean Rings, Boolean functions, Boolean equations, Boolean inequalities, incompletely specified Boolean functions, and Boolean lattices of Boolean functions. These basics, sometimes also called switching theory, are widely used in many modern information processing applications. The BDC extends the known concepts and allows the consideration of changes of function values. Such changes can be explored for pairs of function values as well as for whole subspaces. The BDC defines a small number of derivative and differential operations. Many existing theorems are very welcome and allow new insights due to possible transformations of problems. The available operations of the BDC have been efficiently implemented in several software packages. The common use of the basic concepts and the BDC opens a very wide field of applications. The roots of the BDC go back to the practical problem of testing digital circuits. The BDC deals with changes of signals which are very important in applications of the analysis and the synthesis of digital circuits. The comprehensive evaluation and utilization of properties of Boolean functions allow, for instance, to decompose Boolean functions very efficiently; this can be applied not only in circuit design, but also in data mining. Other examples for the use of the BDC are the detection of hazards or cryptography. The knowledge of the BDC gives the scientists and engineers an extended insight into Boolean problems leading to new applications, e.g., the use of Boolean lattices of Boolean functions. View full abstract»
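    As a minimal, self-contained illustration of the simplest derivative operation mentioned above (not code from the book), the derivative of a Boolean function with respect to one variable can be computed as df/dx_i = f(..., x_i = 0, ...) XOR f(..., x_i = 1, ...); the example function below is an arbitrary choice:

        # Simple Boolean derivative: df/dx_i = f(x_i = 0) XOR f(x_i = 1),
        # evaluated for every assignment of the remaining variables.

        def boolean_derivative(f, i, n):
            """Return df/dx_i as a dict mapping each assignment (tuple of 0/1,
            with position i fixed to 0) to the derivative value 0 or 1."""
            result = {}
            for m in range(2 ** n):
                x = [(m >> b) & 1 for b in range(n)]
                if x[i] == 1:
                    continue                      # visit each pair of cofactors once
                x0, x1 = list(x), list(x)
                x1[i] = 1
                result[tuple(x)] = f(*x0) ^ f(*x1)
            return result

        # Example: f(a, b, c) = (a AND b) OR c; its derivative w.r.t. a is b AND NOT c.
        f = lambda a, b, c: (a & b) | c
        print(boolean_derivative(f, 0, 3))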

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Speech Recognition Algorithms based on Weighted Finite-State Transducers

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book introduces the theory, algorithms, and implementation techniques for efficient decoding in speech recognition mainly focusing on the Weighted Finite-State Transducer (WFST) approach. The decoding process for speech recognition is viewed as a search problem whose goal is to find a sequence of words that best matches an input speech signal. Since this process becomes computationally more expensive as the system vocabulary size increases, research has long been devoted to reducing the computational cost. Recently, the WFST approach has become an important state-of-the-art speech recognition technology, because it offers improved decoding speed with fewer recognition errors compared with conventional methods. However, it is not easy to understand all the algorithms used in this framework, and they are still in a black box for many people. In this book, we review the WFST approach and aim to provide comprehensive interpretations of WFST operations and decoding algorithms to help anyone who wants to understand, develop, and study WFST-based speech recognizers. We also mention recent advances in this framework and its applications to spoken language processing. Table of Contents: Introduction / Brief Overview of Speech Recognition / Introduction to Weighted Finite-State Transducers / Speech Recognition by Weighted Finite-State Transducers / Dynamic Decoders with On-the-fly WFST Operations / Summary and Perspective View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Articulatory Speech Synthesis from the Fluid Dynamics of the Vocal Apparatus

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book addresses the problem of articulatory speech synthesis based on computed vocal tract geometries and the basic physics of sound production in it. Unlike conventional methods based on analysis/synthesis using the well-known source filter model, which assumes the independence of the excitation and filter, we treat the entire vocal apparatus as one mechanical system that produces sound by means of fluid dynamics. The vocal apparatus is represented as a three-dimensional time-varying mechanism and the sound propagation inside it is due to the non-planar propagation of acoustic waves through a viscous, compressible fluid described by the Navier-Stokes equations. We propose a combined minimum energy and minimum jerk criterion to compute the dynamics of the vocal tract during articulation. Theoretical error bounds and experimental results show that this method obtains a close match to the phonetic target positions while avoiding abrupt changes in the articulatory trajectory. The vocal folds are set into aerodynamic oscillation by the flow of air from the lungs. The modulated air stream then excites the moving vocal tract. This method shows strong evidence for source-filter interaction. Based on our results, we propose that the articulatory speech production model has the potential to synthesize speech and provide a compact parameterization of the speech signal that can be useful in a wide variety of speech signal processing problems. Table of Contents: Introduction / Literature Review / Estimation of Dynamic Articulatory Parameters / Construction of Articulatory Model Based on MRI Data / Vocal Fold Excitation Models / Experimental Results of Articulatory Synthesis / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Stream Management

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many applications process high volumes of streaming data, among them Internet traffic analysis, financial tickers, and transaction log mining. In general, a data stream is an unbounded data set that is produced incrementally over time, rather than being available in full before its processing begins. In this lecture, we give an overview of recent research in stream processing, ranging from answering simple queries on high-speed streams to loading real-time data feeds into a streaming warehouse for off-line analysis. We will discuss two types of systems for end-to-end stream processing: Data Stream Management Systems (DSMSs) and Streaming Data Warehouses (SDWs). A traditional database management system typically processes a stream of ad-hoc queries over relatively static data. In contrast, a DSMS evaluates static (long-running) queries on streaming data, making a single pass over the data and using limited working memory. In the first part of this lecture, we will discuss research problems in DSMSs, such as continuous query languages, non-blocking query operators that continually react to new data, and continuous query optimization. The second part covers SDWs, which combine the real-time responsiveness of a DSMS (loading new data as soon as they arrive) with a data warehouse's ability to manage terabytes of historical data on secondary storage. Table of Contents: Introduction / Data Stream Management Systems / Streaming Data Warehouses / Conclusions View full abstract»
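    As a toy sketch of the DSMS idea described above (an illustration only, not an excerpt from the lecture), a continuous query can be written as a long-running operator that makes a single pass over the stream and keeps only bounded working memory, here a moving average over the last few items:

        # Continuous moving-average "query" over an unbounded stream:
        # single pass, bounded memory (only the last `window` items are kept).
        from collections import deque

        def continuous_moving_average(stream, window=5):
            buf, total = deque(), 0.0
            for value in stream:                  # tuples processed in arrival order
                buf.append(value)
                total += value
                if len(buf) > window:
                    total -= buf.popleft()        # evict the oldest item
                yield total / len(buf)            # emit an answer as each tuple arrives

        if __name__ == "__main__":
            import random
            ticks = (random.gauss(100.0, 5.0) for _ in range(20))   # made-up feed
            for avg in continuous_moving_average(ticks, window=5):
                print(f"{avg:.2f}")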

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Articular Cartilage Tissue Engineering

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Cartilage injuries in children and adolescents are increasingly observed, with roughly 20% of knee injuries in adolescents requiring surgery. In the US alone, costs of osteoarthritis (OA) are in excess of $65 billion per year (both medical costs and lost wages). Comorbidities are common with OA and are also costly to manage. Articular cartilage's low friction and high capacity to bear load make it critical in the movement of one bone against another, and its lack of a sustained natural healing response has necessitated a plethora of therapies. Tissue engineering is an emerging technology at the threshold of translation to clinical use. Replacement cartilage can be constructed in the laboratory to recapitulate the functional requirements of native tissues. This book outlines the biomechanical and biochemical characteristics of articular cartilage in both normal and pathological states, through development and aging. It also provides a historical perspective of past and current cartilage treatments and previous tissue engineering efforts. Methods and standards for evaluating the function of engineered tissues are discussed, and current cartilage products are presented with an analysis of the United States Food and Drug Administration regulatory pathways that products must follow to market. This book was written to serve as a reference for researchers seeking to learn about articular cartilage, for undergraduate and graduate level courses, and as a compendium of articular cartilage tissue engineering design criteria. Table of Contents: Hyaline Articular Cartilage / Cartilage Aging and Pathology / In Vitro / Bioreactors / Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Entity Resolution in the Web of Data

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In recent years, several knowledge bases have been built to enable large-scale knowledge sharing, but also an entity-centric Web search, mixing both structured data and text querying. These knowledge bases offer machine-readable descriptions of real-world entities, e.g., persons, places, published on the Web as Linked Data. However, due to the different information extraction tools and curation policies employed by knowledge bases, multiple, complementary and sometimes conflicting descriptions of the same real-world entities may be provided. Entity resolution aims to identify different descriptions that refer to the same entity appearing either within or across knowledge bases. The objective of this book is to present the new entity resolution challenges stemming from the openness of the Web of data in describing entities by an unbounded number of knowledge bases, the semantic and structural diversity of the descriptions provided across domains even for the same real-world entities, as well as the autonomy of knowledge bases in terms of adopted processes for creating and curating entity descriptions. The scale, diversity, and graph structuring of entity descriptions in the Web of data essentially challenge how two descriptions can be effectively compared for similarity, but also how resolution algorithms can efficiently avoid examining pairwise all descriptions. The book covers a wide spectrum of entity resolution issues at the Web scale, including basic concepts and data structures, main resolution tasks and workflows, as well as state-of-the-art algorithmic techniques and experimental trade-offs. View full abstract»
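    As a minimal sketch of how resolution algorithms can avoid examining all pairs of descriptions (one common technique, blocking, is shown; the records, blocking key, and similarity test are illustrative assumptions rather than the book's methods), candidate pairs can be restricted to descriptions that share a cheap key:

        # Blocking for entity resolution: compare descriptions only within blocks
        # that share a cheap key, instead of examining all pairs.
        from collections import defaultdict
        from itertools import combinations

        records = [
            {"id": 1, "name": "Barack Obama",  "city": "Washington"},
            {"id": 2, "name": "B. Obama",      "city": "Washington D.C."},
            {"id": 3, "name": "Angela Merkel", "city": "Berlin"},
            {"id": 4, "name": "A. Merkel",     "city": "Berlin"},
        ]

        def blocking_key(rec):
            """Cheap key: last token of the name, lower-cased."""
            return rec["name"].split()[-1].lower()

        def similar(a, b):
            """Toy similarity test: overlapping city tokens."""
            return bool(set(a["city"].lower().split()) & set(b["city"].lower().split()))

        blocks = defaultdict(list)
        for rec in records:
            blocks[blocking_key(rec)].append(rec)

        matches = [
            (a["id"], b["id"])
            for block in blocks.values()
            for a, b in combinations(block, 2)    # pairwise only inside a block
            if similar(a, b)
        ]
        print(matches)   # [(1, 2), (3, 4)]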

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multimodal Imaging in Neurology: Special Focus on MRI Applications and MEG

    Copyright Year: 2007

    Morgan and Claypool eBooks

    The field of brain imaging is developing at a rapid pace and has greatly advanced the areas of cognitive and clinical neuroscience. The availability of neuroimaging techniques, especially magnetic resonance imaging (MRI), functional MRI (fMRI), diffusion tensor imaging (DTI) and magnetoencephalography (MEG) and magnetic source imaging (MSI) has brought about breakthroughs in neuroscience. To obtain comprehensive information about the activity of the human brain, different analytical approaches should be complemented. Thus, in "intermodal multimodality" imaging, great efforts have been made to combine the highest spatial resolution (MRI, fMRI) with the best temporal resolution (MEG or EEG). "Intramodal multimodality" imaging combines various functional MRI techniques (e.g., fMRI, DTI, and/or morphometric/volumetric analysis). The multimodal approach is conceptually based on the combination of different noninvasive functional neuroimaging tools, their registration and cointegration. In particular, the combination of imaging applications that map different functional systems is useful, such as fMRI as a technique for the localization of cortical function and DTI as a technique for mapping of white matter fiber bundles or tracts. This booklet gives an insight into the wide field of multimodal imaging with respect to concepts, data acquisition, and postprocessing. Examples for intermodal and intramodal multimodality imaging are also demonstrated. Table of Contents: Introduction / Neurological Measurement Techniques and First Steps of Postprocessing / Coordinate Transformation / Examples for Multimodal Imaging / Clinical Aspects of Multimodal Imaging / References / Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Translating Euclid: Designing a Human-Centered Mathematics

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Translating Euclid reports on an effort to transform geometry for students from a stylus-and-clay-tablet corpus of historical theorems to a stimulating computer-supported collaborative-learning inquiry experience. The origin of geometry was a turning point in the pre-history of informatics, literacy, and rational thought. Yet, this triumph of human intellect became ossified through historic layers of systematization, beginning with Euclid's organization of the Elements of geometry. Often taught by memorization of procedures, theorems, and proofs, geometry in schooling rarely conveys its underlying intellectual excitement. The recent development of dynamic-geometry software offers an opportunity to translate the study of geometry into a contemporary vernacular. However, this involves transformations along multiple dimensions of the conceptual and practical context of learning. Translating Euclid steps through the multiple challenges involved in redesigning geometry education to take advantage of computer support. Networked computers portend an interactive approach to exploring dynamic geometry as well as broadened prospects for collaboration. The proposed conception of geometry emphasizes the central role of the construction of dependencies as a design activity, integrating human creation and mathematical discovery to form a human-centered approach to mathematics. This book chronicles an iterative effort to adapt technology, theory, pedagogy and practice to support this vision of collaborative dynamic geometry and to evolve the approach through on-going cycles of trial with students and refinement of resources. It thereby provides a case study of a design-based research effort in computer-supported collaborative learning from a human-centered informatics perspective. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Association for Multi-Object Visual Tracking

    Copyright Year: 2016

    Morgan and Claypool eBooks

    In the human quest for scientific knowledge, empirical evidence is collected by visual perception. Tracking with computer vision takes on the important role of revealing complex patterns of motion that exist in the world we live in. Multi-object tracking algorithms provide new information on how groups and individual group members move through three-dimensional space. They enable us to study in depth the relationships between individuals in moving groups. These may be interactions of pedestrians on a crowded sidewalk, living cells under a microscope, or bats emerging in large numbers from a cave. Being able to track pedestrians is important for urban planning; analysis of cell interactions supports research on biomaterial design; and the study of bat and bird flight can guide the engineering of aircraft. We were inspired by this multitude of applications to consider the crucial component needed to advance a single-object tracking system to a multi-object tracking system: data association. Data association in the most general sense is the process of matching information about newly observed objects with information that was previously observed about them. This information may be about their identities, positions, or trajectories. Algorithms for data association search for matches that optimize certain match criteria and are subject to physical conditions. They can therefore be formulated as solving a "constrained optimization problem", the problem of optimizing an objective function of some variables in the presence of constraints on these variables. As such, data association methods have a strong mathematical grounding and are valuable general tools for computer vision researchers. This book serves as a tutorial on data association methods, intended for both students and experts in computer vision. We describe the basic research problems, review the current state of the art, and present some recently developed approaches. The book covers multi-object tracking in two and three dimensions. We consider two imaging scenarios involving either single cameras or multiple cameras with overlapping fields of view, requiring across-time and across-view data association methods. In addition to methods that match new measurements to already established tracks, we describe methods that match trajectory segments, also called tracklets. The book presents a principled application of data association to solve two interesting tasks: first, analyzing the movements of groups of free-flying animals and second, reconstructing the movements of groups of pedestrians. We conclude by discussing exciting directions for future research. View full abstract»
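    One classic way to pose frame-to-frame data association as a constrained optimization problem (a sketch under assumed toy positions, not necessarily one of the book's formulations) is as a minimum-cost assignment between existing tracks and new detections, solvable with a Hungarian-style algorithm:

        # Frame-to-frame data association as minimum-cost assignment:
        # each track is matched to at most one detection, minimizing total distance.
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        tracks     = np.array([[10.0, 10.0], [50.0, 20.0], [30.0, 40.0]])   # made-up 2D positions
        detections = np.array([[52.0, 19.0], [11.0,  9.0], [29.0, 41.0]])

        # cost[i, j] = Euclidean distance from track i to detection j
        cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)

        row_ind, col_ind = linear_sum_assignment(cost)   # optimal one-to-one matching
        for i, j in zip(row_ind, col_ind):
            print(f"track {i} -> detection {j}  (cost {cost[i, j]:.2f})")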

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Artificial Organs

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The replacement or augmentation of failing human organs with artificial devices and systems has been an important element in health care for several decades. Such devices as kidney dialysis to augment failing kidneys, artificial heart valves to replace failing human valves, cardiac pacemakers to reestablish normal cardiac rhythm, and heart assist devices to augment a weakened human heart have assisted millions of patients over the previous 50 years and offer lifesaving technology for tens of thousands of patients each year. Significant advances in these biomedical technologies have continually occurred during this period, saving numerous lives with cutting edge technologies. Each of these artificial organ systems will be described in detail in separate sections of this lecture. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Statistical Language Models for Information Retrieval

    Copyright Year: 2009

    Morgan and Claypool eBooks

    As online information grows dramatically, search engines such as Google are playing a more and more important role in our lives. Critical to all search engines is the problem of designing an effective retrieval model that can rank documents accurately for a given query. This has been a central research problem in information retrieval for several decades. In the past ten years, a new generation of retrieval models, often referred to as statistical language models, has been successfully applied to solve many different information retrieval problems. Compared with the traditional models such as the vector space model, these new models have a more sound statistical foundation and can leverage statistical estimation to optimize retrieval parameters. They can also be more easily adapted to model non-traditional and complex retrieval problems. Empirically, they tend to achieve comparable or better performance than a traditional model with less effort on parameter tuning. This book systematically reviews the large body of literature on applying statistical language models to information retrieval with an emphasis on the underlying principles, empirically effective language models, and language models developed for non-traditional retrieval tasks. All the relevant literature has been synthesized to make it easy for a reader to digest the research progress achieved so far and see the frontier of research in this area. The book also offers practitioners an informative introduction to a set of practically useful language models that can effectively solve a variety of retrieval problems. No prior knowledge about information retrieval is required, but some basic knowledge about probability and statistics would be useful for fully digesting all the details. Table of Contents: Introduction / Overview of Information Retrieval Models / Simple Query Likelihood Retrieval Model / Complex Query Likelihood Model / Probabilistic Distance Retrieval Model / Language Models for Special Retrieval Tasks / Language Models for Latent Topic Analysis / Conclusions View full abstract»
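    To make the flavor of these models concrete, here is a minimal sketch of the simple query-likelihood retrieval model with Jelinek-Mercer smoothing, score(d, q) = sum over query terms of log((1 - lambda) P(t|d) + lambda P(t|collection)); the toy documents, query, and smoothing weight are assumptions for illustration only:

        # Query-likelihood scoring with Jelinek-Mercer smoothing over toy documents.
        import math
        from collections import Counter

        docs = {
            "d1": "statistical language models for information retrieval".split(),
            "d2": "the vector space model ranks documents for a query".split(),
            "d3": "language models estimate the probability of text".split(),
        }

        collection = Counter(t for d in docs.values() for t in d)
        coll_len = sum(collection.values())

        def score(query, doc, lam=0.5):
            tf, dlen = Counter(doc), len(doc)
            s = 0.0
            for t in query:
                p_doc  = tf[t] / dlen
                p_coll = collection[t] / coll_len
                if p_doc == 0.0 and p_coll == 0.0:
                    continue                      # term unseen in the whole collection
                s += math.log((1 - lam) * p_doc + lam * p_coll)
            return s

        query = "language models retrieval".split()
        for name, doc in docs.items():
            print(name, round(score(query, doc), 3))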

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Outlier Detection for Temporal Data

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Outlier (or anomaly) detection is a very broad field which has been studied in the context of a large number of research areas like statistics, data mining, sensor networks, environmental science, distributed systems, spatio-temporal mining, etc. Initial research in outlier detection focused on time series-based outliers (in statistics). Since then, outlier detection has been studied on a large variety of data types including high-dimensional data, uncertain data, stream data, network data, time series data, spatial data, and spatio-temporal data. While there have been many tutorials and surveys for general outlier detection, we focus on outlier detection for temporal data in this book. A large number of applications generate temporal datasets. For example, in our everyday life, various kinds of records like credit, personnel, financial, judicial, medical, etc., are all temporal. This stresses the need for an organized and detailed study of outliers with respect to such temporal data. In the past decade, there has been a lot of research on various forms of temporal data including consecutive data snapshots, series of data snapshots and data streams. Besides the initial work on time series, researchers have focused on rich forms of data including multiple data streams, spatio-temporal data, network data, community distribution data, etc. Compared to general outlier detection, techniques for temporal outlier detection are very different. In this book, we present an organized picture of both recent and past research in temporal outlier detection. We start with the basics and then ramp up the reader to the main ideas in state-of-the-art outlier detection techniques. We motivate the importance of temporal outlier detection and briefly describe the challenges beyond those of usual outlier detection. Then, we present a taxonomy of proposed techniques for temporal outlier detection. Such techniques broadly include statistical techniques (like AR models, Markov models, histograms, neural networks), distance- and density-based approaches, grouping-based approaches (clustering, community detection), network-based approaches, and spatio-temporal outlier detection approaches. We summarize by presenting a wide collection of applications where temporal outlier detection techniques have been applied to discover interesting outliers. View full abstract»
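    As a toy sketch of the simplest statistical flavor of temporal outlier detection (window size, threshold, and data are illustrative assumptions, not techniques lifted from the book), a point can be flagged when it deviates from the mean of a trailing window by more than a few standard deviations:

        # Trailing-window z-score outlier detection for a univariate time series.
        import statistics

        def temporal_outliers(series, window=10, k=3.0):
            outliers = []
            for i in range(window, len(series)):
                hist = series[i - window:i]
                mu, sigma = statistics.mean(hist), statistics.pstdev(hist)
                if sigma > 0 and abs(series[i] - mu) > k * sigma:
                    outliers.append((i, series[i]))
            return outliers

        data = [10, 11, 9, 10, 12, 11, 10, 9, 11, 10, 35, 10, 11, 9, 10]   # made-up series
        print(temporal_outliers(data, window=10, k=3.0))   # flags the spike at index 10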

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automated Grammatical Error Detection for Language Learners

    Copyright Year: 2010

    Morgan and Claypool eBooks

    It has been estimated that over a billion people are using or learning English as a second or foreign language, and the numbers are growing not only for English but for other languages as well. These language learners provide a burgeoning market for tools that help identify and correct learners' writing errors. Unfortunately, the errors targeted by typical commercial proofreading tools do not include those aspects of a second language that are hardest to learn. This volume describes the types of constructions English language learners find most difficult -- constructions containing prepositions, articles, and collocations. It provides an overview of the automated approaches that have been developed to identify and correct these and other classes of learner errors in a number of languages. Error annotation and system evaluation are particularly important topics in grammatical error detection because there are no commonly accepted standards. Chapters in the book describe the options available to researchers, recommend best practices for reporting results, and present annotation and evaluation schemes. The final chapters explore recent innovative work that opens new directions for research. It is the authors' hope that this volume will contribute to the growing interest in grammatical error detection by encouraging researchers to take a closer look at the field and its many challenging problems. Table of Contents: Introduction / History of Automated Grammatical Error Detection / Special Problems of Language Learners / Language Learner Data / Evaluating Error Detection Systems / Article and Preposition Errors / Collocation Errors / Different Approaches for Different Errors / Annotating Learner Errors / New Directions / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Representations, Transformations, and Statistics for Visual Reasoning

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Analytical reasoning techniques are methods by which users explore their data to obtain insight and knowledge that can directly support situational awareness and decision making. Recently, the analytical reasoning process has been augmented through the use of interactive visual representations and tools which utilize cognitive, design and perceptual principles. These tools are commonly referred to as visual analytics tools, and the underlying methods and principles have roots in a variety of disciplines. This chapter provides young researchers with an overview of common visual representations and statistical analysis methods utilized in a variety of visual analytics systems. The application and design of visualization and analytical algorithms are subject to design decisions, parameter choices, and many conflicting requirements. As such, this chapter attempts to provide an initial set of guidelines for the creation of the visual representation, including pitfalls and areas where the graphics can be enhanced through interactive exploration. Basic analytical methods are explored as a means of enhancing the visual analysis process, moving from visual analysis to visual analytics. Table of Contents: Data Types / Color Schemes / Data Preconditioning / Visual Representations and Analysis / Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    CAD/CAM of Sculptured Surfaces on Multi-Axis NC Machine

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Many products are designed with aesthetic sculptured surfaces to enhance their appearance, an important factor in customer satisfaction, especially for automotive and consumer electronics products. In other cases, products have sculptured surfaces to meet functional requirements. Functional surfaces interact with the environment or with other surfaces. Because of this, functional surfaces can also be called dynamic surfaces. Functional surfaces do not possess the property of sliding over themselves, which causes significant complexity in the machining of sculptured surfaces. The application of multiaxis numerically controlled (NC) machines is the only way to machine sculptured surfaces efficiently. Reduction of machining time is a critical issue when machining sculptured surfaces on multiaxis NC machines. To reduce the machining cost of a sculptured surface, the machining time must be as short as possible. Table of Contents: Introduction / Analytical Representation of Sculptured Surfaces / Kinematics of Sculptured-Surface Machining / Analytical Description of the Geometry of Contact of the Sculptured Surface and of the Generating Surface of the Form-Cutting Tool / Form-Cutting Tools of Optimal Design / Conditions of Proper Sculptured-Surface Generation / Predicted Accuracy of the Machined Sculptured Surface / Optimal Sculptured-Surface Machining View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Practical Guide to Testing Wireless Smartphone Applications

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Testing applications for mobile phones is difficult, time-consuming, and hard to do effectively. Many people have limited their testing efforts to hands-on testing of an application on a few physical handsets, and they have to repeat the process every time a new version of the software is ready to test. They may miss many of the permutations of real-world use, and as a consequence their users are left with the unpleasant mess of a failing application on their phone. Test automation can help to increase the range and scope of testing, while reducing the overhead of manual testing of each version of the software. However, automation is not a panacea, particularly for mobile applications, so we need to pick our test automation challenges wisely. This book is intended to help software and test engineers pick appropriately to achieve more and, as a consequence, deliver better-quality, working software to users. This Synthesis lecture provides practical advice based on direct experience of using software test automation to help improve the testing of a wide range of mobile phone applications, including the latest AJAX applications. The focus is on applications that rely on a wireless network connection to a remote server, however the principles may apply to other related fields and applications. We start by explaining terms and some of the key challenges involved in testing smartphone applications. Subsequent chapters describe a type of application, e.g. markup, AJAX, Client, followed by a related chapter on how to test each of these applications. Common test automation techniques are covered in a separate chapter, and finally there is a brief chapter on when to test manually. The book also contains numerous pointers and links to further material to help you to improve your testing using automation appropriately. Table of Contents: Introduction / Markup Languages / Testing Techniques for Markup Applications / AJAX Mobile Applications / Testing Mobile AJAX Applications / Client Applications / Testing Techniques for Client Applications / Common Techniques / When to Test Manually / Future Work / Appendix A: Links and References / Appendix B: Data Connectivity / Appendix C: Configuring Your Machine View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis and Design of Substrate Integrated Waveguide Using Efficient 2D Hybrid Method

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Substrate integrated waveguide (SIW) is a new type of transmission line. It implements a waveguide on a piece of printed circuit board by emulating the side walls of the waveguide using two rows of metal posts. It inherits the merits both of the microstrip, for compact size and easy integration, and of the waveguide, for low radiation loss, and thus opens another door to designing efficient microwave circuits and antennas at a low cost. This book presents a two-dimensional full-wave analysis method to investigate an SIW circuit composed of metal and dielectric posts. It combines the cylindrical eigenfunction expansion and the method of moments to avoid geometrical discretization of the posts. The method is presented step-by-step, with all the necessary formulations provided for a practitioner who wants to implement this method on their own. This book covers the SIW circuit printed on either homogeneous or inhomogeneous substrate, the microstrip-to-SIW transition and the speed-up technique for the simulation of symmetrical SIW circuits. Different types of SIW circuits are shown and simulated using the proposed method. In addition, several slot antennas and horn antennas fabricated using the SIW technology are also given. Table of Contents: Introduction / SIW Circuits Composed of Metallic Posts / SIW Circuits with Dielectric Posts / Even-Odd Mode Analysis of a Symmetrical Circuit / Microstrip to SIW Transition and Half Mode SIW / SIW Antennas View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Transactional Memory

    Copyright Year: 2007

    Morgan and Claypool eBooks

    The advent of multicore processors has renewed interest in the idea of incorporating transactions into the programming model used to write parallel programs. This approach, known as transactional memory, offers an alternative, and hopefully better, way to coordinate concurrent threads. The ACI (atomicity, consistency, isolation) properties of transactions provide a foundation to ensure that concurrent reads and writes of shared data do not produce inconsistent or incorrect results. At a higher level, a computation wrapped in a transaction executes atomically – either it completes successfully and commits its result in its entirety or it aborts. In addition, isolation ensures the transaction produces the same result as if no other transactions were executing concurrently. Although transactions are not a parallel programming panacea, they shift much of the burden of synchronizing and coordinating parallel computations from a programmer to a compiler, runtime system, and hardware. The challenge for the system implementers is to build an efficient transactional memory infrastructure. This book presents an overview of the state of the art in the design and implementation of transactional memory systems, as of early summer 2006. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multi-Objective Decision Making

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Many real-world decision problems have multiple objectives. For example, when choosing a medical treatment plan, we want to maximize the efficacy of the treatment, but also minimize the side effects. These objectives typically conflict, e.g., we can often increase the efficacy of the treatment, but at the cost of more severe side effects. In this book, we outline how to deal with multiple objectives in decision-theoretic planning and reinforcement learning algorithms. To illustrate this, we employ the popular problem classes of multi-objective Markov decision processes (MOMDPs) and multi-objective coordination graphs (MO-CoGs). First, we discuss different use cases for multi-objective decision making, and why they often necessitate explicitly multi-objective algorithms. We advocate a utility-based approach to multi-objective decision making, i.e., that what constitutes an optimal solution to a multi-objective decision problem should be derived from the available information about user utility. We show how different assumptions about user utility and what types of policies are allowed lead to different solution concepts, which we outline in a taxonomy of multi-objective decision problems. Second, we show how to create new methods for multi-objective decision making using existing single-objective methods as a basis. Focusing on planning, we describe two ways of creating multi-objective algorithms: in the inner loop approach, the inner workings of a single-objective method are adapted to work with multi-objective solution concepts; in the outer loop approach, a wrapper is created around a single-objective method that solves the multi-objective problem as a series of single-objective problems. After discussing the creation of such methods for the planning setting, we discuss how these approaches apply to the learning setting. Next, we discuss three promising application domains for multi-objective decision making algorithms: energy, health, and infrastructure and transportation. Finally, we conclude by outlining important open problems and promising future directions. View full abstract»
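    As a minimal sketch of the utility-based view (the policy values and weights below are made-up numbers, not results from the book; actual MOMDP solvers would compute such value sets), a linear utility function over the objectives turns selecting from a set of candidate policies into a single-objective choice:

        # Linear scalarization: pick the policy maximizing w . V for a given
        # user utility weight vector w over (efficacy, 1 - side effects).
        policy_values = {
            "aggressive treatment": (0.90, 0.30),
            "moderate treatment":   (0.70, 0.70),
            "conservative":         (0.40, 0.95),
        }

        def best_policy(weights):
            """Return the policy with the highest linearly scalarized utility."""
            return max(policy_values,
                       key=lambda p: sum(w * v for w, v in zip(weights, policy_values[p])))

        print(best_policy((0.8, 0.2)))   # efficacy-focused user -> aggressive treatment
        print(best_policy((0.3, 0.7)))   # side-effect-averse user -> conservative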

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Geometry of Walker Manifolds

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book, which focuses on the study of curvature, is an introduction to various aspects of pseudo-Riemannian geometry. We shall use Walker manifolds (pseudo-Riemannian manifolds which admit a non-trivial parallel null plane field) to exemplify some of the main differences between the geometry of Riemannian manifolds and the geometry of pseudo-Riemannian manifolds and thereby illustrate phenomena in pseudo-Riemannian geometry that are quite different from those which occur in Riemannian geometry, i.e. for indefinite as opposed to positive definite metrics. Indefinite metrics are important in many diverse physical contexts: classical cosmological models (general relativity) and string theory to name but two. Walker manifolds appear naturally in numerous physical settings and provide examples of extremal mathematical situations as will be discussed presently. To describe the geometry of a pseudo-Riemannian manifold, one must first understand the curvature of the manifold. We shall analyze a wide variety of curvature properties and we shall derive both geometrical and topological results. Special attention will be paid to manifolds of dimension 3 as these are quite tractable. We then pass to the 4 dimensional setting as a gateway to higher dimensions. Since the book is aimed at a very general audience (and in particular to an advanced undergraduate or to a beginning graduate student), no more than a basic course in differential geometry is required in the way of background. To keep our treatment as self-contained as possible, we shall begin with two elementary chapters that provide an introduction to basic aspects of pseudo-Riemannian geometry before beginning on our study of Walker geometry. An extensive bibliography is provided for further reading. Math subject classifications: Primary: 53B20 -- (PACS: 02.40.Hw) Secondary: 32Q15, 51F25, 51P05, 53B30, 53C50, 53C80, 58A30, 83F05, 85A04 Table of Contents: Basic Algebraic Notions / Basic Geometrical Notions / Walker Structures / Three-Dimensional Lorentzian Walker Manifolds / Four-Dimensional Walker Manifolds / The Spectral Geometry of the Curvature Tensor / Hermitian Geometry / Special Walker Manifolds View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Numerical Methods for Linear Complementarity Problems in Physics-Based Animation

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Linear complementarity problems (LCPs) have for many years been used in physics-based animation to model contact forces between rigid bodies in contact. More recently, LCPs have found their way into the realm of fluid dynamics. Here, LCPs are used to model boundary conditions with fluid-wall contacts. LCPs have also started to appear in deformable models and granular simulations. There is an increasing need for numerical methods to solve the resulting LCPs with all these new applications. This book provides a numerical foundation for such methods, especially suited for use in computer graphics. This book is mainly intended for a researcher/Ph.D. student/post-doc/professor who wants to study the algorithms and do more work/research in this area. Programmers might have to invest some time brushing up on math skills, for this we refer to Appendices A and B. The reader should be familiar with linear algebra and differential calculus. We provide pseudo code for all the numerical methods, which should be comprehensible by any computer scientist with rudimentary programming skills. The reader can find an online supplementary code repository, containing Matlab implementations of many of the core methods covered in these notes, as well as a few Python implementations [Erleben, 2011]. View full abstract»
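    For a flavor of the numerical methods such a text covers, here is a minimal projected Gauss-Seidel sketch for the LCP "find z >= 0 with w = A z + b >= 0 and z^T w = 0"; the matrix and vector are toy inputs, and this is not code from the book's supplementary repository:

        # Projected Gauss-Seidel (PGS) iteration for a small linear complementarity problem.
        import numpy as np

        def projected_gauss_seidel(A, b, iterations=100):
            z = np.zeros_like(b)
            for _ in range(iterations):
                for i in range(len(b)):
                    r = b[i] + A[i] @ z - A[i, i] * z[i]   # residual excluding the z_i term
                    z[i] = max(0.0, -r / A[i, i])          # project onto z_i >= 0
            return z

        A = np.array([[ 4.0, -1.0,  0.0],
                      [-1.0,  4.0, -1.0],
                      [ 0.0, -1.0,  4.0]])
        b = np.array([-1.0, 2.0, -3.0])

        z = projected_gauss_seidel(A, b)
        w = A @ z + b
        print("z =", z, " w =", w, " z.w =", z @ w)   # complementarity z.w should be ~0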

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semiotic Engineering Methods for Scientific Research in HCI

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Semiotic engineering was originally proposed as a semiotic approach to designing user interface languages. Over the years, with research done at the Department of Informatics of the Pontifical Catholic University of Rio de Janeiro, it evolved into a semiotic theory of human-computer interaction (HCI). It views HCI as computer-mediated communication between designers and users at interaction time. The system speaks for its designers in various types of conversations specified at design time. These conversations communicate the designers' understanding of who the users are, what they know the users want or need to do, in which preferred ways, and why. The designers' message to users includes even the interactive language in which users will have to communicate back with the system in order to achieve their specific goals. Hence, the process is, in fact, one of communication about communication, or metacommunication. Semiotic engineering has two methods to evaluate the quality of metacommunication in HCI: the semiotic inspection method (SIM) and the communicability evaluation method (CEM). Up to now, they have been mainly used and discussed in technical contexts, focusing on how to detect problems and how to improve the metacommunication of specific systems. In this book, Clarisse de Souza and Carla Leitão discuss how SIM and CEM, which are both qualitative methods, can also be used in scientific contexts to generate new knowledge about HCI. The discussion goes into deep considerations about scientific methodology, calling the reader's attention to the essence of qualitative methods in research and the kinds of results they can produce. To illustrate their points, the authors present an extensive case study with a free open-source digital audio editor called Audacity. They show how the results obtained with a triangulation of SIM and CEM point at new research avenues not only for semiotic engineering and HCI but also for other areas of computer science such as software engineering and programming. Table of Contents: Introduction / Essence of Semiotic Engineering / Semiotic Engineering Methods / Case Study with Audacity / Lessons Learned with Semiotic Engineering Methods / The Near Future of Semiotic Engineering View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Scalability Challenges in Web Search Engines

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In this book, we aim to provide a fairly comprehensive overview of the scalability and efficiency challenges in large-scale web search engines. More specifically, we cover the issues involved in the design of three separate systems that are commonly available in every web-scale search engine: web crawling, indexing, and query processing systems. We present the performance challenges encountered in these systems and review a wide range of design alternatives employed as solutions to these challenges, specifically focusing on algorithmic and architectural optimizations. We discuss the available optimizations at different computational granularities, ranging from a single computer node to a collection of data centers. We provide some hints to both the practitioners and theoreticians involved in the field about the way large-scale web search engines operate and the adopted design choices. Moreover, we survey the efficiency literature, providing pointers to a large number of relatively important research papers. Finally, we discuss some open research problems in the context of search engine efficiency. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Webometrics: Quantitative Web Research for the Social Sciences

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Webometrics is concerned with measuring aspects of the web: web sites, web pages, parts of web pages, words in web pages, hyperlinks, web search engine results. The importance of the web itself as a communication medium and for hosting an increasingly wide array of documents, from journal articles to holiday brochures, needs no introduction. Given this huge and easily accessible source of information, there are limitless possibilities for measuring or counting on a huge scale (e.g., the number of web sites, the number of web pages, the number of blogs) or on a smaller scale (e.g., the number of web sites in Ireland, the number of web pages in the CNN web site, the number of blogs mentioning Barack Obama before the 2008 presidential campaign). This book argues that it can be useful for social scientists to measure aspects of the web and explains how this can be achieved on both a small and large scale. The book is intended for social scientists with research topics that are wholly or partly online (e.g., social networks, news, political communication) and social scientists with offline research topics with an online reflection, even if this is not a core component (e.g., diaspora communities, consumer culture, linguistic change). The book is also intended for library and information science students in the belief that the knowledge and techniques described will be useful for them to guide and aid other social scientists in their research. In addition, the techniques and issues are all directly relevant to library and information science research problems. Table of Contents: Introduction / Web Impact Assessment / Link Analysis / Blog Searching / Automatic Search Engine Searches: LexiURL Searcher / Web Crawling: SocSciBot / Search Engines and Data Reliability / Tracking User Actions Online / Advanced Techniques / Summary and Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Taxobook: Applications, Implementation, and Integration in Search (Part 3 of a 3-Part Series)

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book is the third of a three-part series on taxonomies, and covers putting your taxonomy into use in as many ways as possible to maximize retrieval for your users. Chapter 1 suggests several items to research and consider before you start your implementation and integration process. It explores the different pieces of software that you will need for your system and what features to look for in each. Chapter 2 launches with a discussion of how taxonomy terms can be used within a workflow, connecting two—or more—taxonomies, and intelligent coordination of platforms and taxonomies. Microsoft SharePoint is a widely used and popular program, and I consider their use of taxonomies in this chapter. Following that is a discussion of taxonomies and semantic integration and then the relationship between indexing and the hierarchy of a taxonomy. Chapter 3 (“How is a Taxonomy Connected to Search?”) provides discussions and examples of putting taxonomies into use in practical applications. It discusses displaying content based on search, how taxonomy is connected to search, using a taxonomy to guide a searcher, tools for search, including search engines, crawlers and spiders, and search software, the parts of a search-capable system, and then how to assemble that search-capable system. This chapter also examines how to measure quality in search, the different kinds of search, and theories on search from several famous theoreticians—two from the 18th and 19th centuries, and two contemporary. Following that is a section on inverted files, parsing, discovery, and clustering. While you probably don’t need a comprehensive understanding of these concepts to build a solid, workable system, enough information is provided for the reader to see how they fit into the overall scheme. This chapter concludes with a look at faceted search and some possibilities for search interfaces. Chapter 4, “Implementing a Taxonomy in a Database or on a Website,” starts where many content systems really should—with the authors, or at least the people who create the content. This chapter discusses matching up various groups of related data to form connections, data visualization and text analytics, and mobile and e-commerce applications for taxonomies. Finally, Chapter 5 presents some educated guesses about the future of knowledge organization. Table of Contents: List of Figures / Preface / Acknowledgments / On Your Mark, Get Ready …. WAIT! Things to Know Before You Start the Implementation Step / Taxonomy and Thesaurus Implementation / How is a Taxonomy Connected to Search? / Implementing a Taxonomy in a Database or on a Website / What Lies Ahead for Knowledge Organization? / Glossary / End Notes / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Ever since its invention in the 1980s, the compound semiconductor heterojunction-based high electron mobility transistor (HEMT) has been widely used in radio frequency (RF) applications. This book provides readers with broad coverage on techniques and new trends of HEMT, employing leading compound semiconductors, III-N and III-V materials. The content includes an overview of GaN HEMT device-scaling technologies and experimental research breakthroughs in fabricating various GaN MOSHEMT transistors. Readers are offered an inspiring example of monolithic integration of HEMT with LEDs, too. The authors compile the most relevant aspects of III-V HEMT, including the current status of state-of-the-art HEMTs, their possibility of replacing the Si CMOS transistor channel, and growth opportunities of III-V materials on an Si substrate. With detailed exploration and explanations, the book is a helpful source suitable for anyone learning about and working on compound semiconductor devices. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantics Empowered Web 3.0: Managing Enterprise, Social, Sensor, and Cloud-based Data and Services for Advanced Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    After the traditional document-centric Web 1.0 and user-generated content focused Web 2.0, Web 3.0 has become a repository of an ever-growing variety of Web resources that include data and services associated with enterprises, social networks, sensors, cloud, as well as mobile and other devices that constitute the Internet of Things. These pose unprecedented challenges in terms of heterogeneity (variety), scale (volume), and continuous changes (velocity), as well as present corresponding opportunities if they can be exploited. Just as semantics has played a critical role in dealing with data heterogeneity in the past to provide interoperability and integration, it is playing an even more critical role in dealing with the challenges and helping users and applications exploit all forms of Web 3.0 data. This book presents a unified approach to harness and exploit all forms of contemporary Web resources using the core principle of associating meaning with data through conceptual or domain models and semantic descriptions, including annotations, and through advanced semantic techniques for search, integration, and analysis. It discusses the use of Semantic Web standards and techniques when appropriate, but also advocates the use of lighter weight, easier to use, and more scalable options when they are more suitable. The authors' extensive experience spanning research and prototypes to development of operational applications and commercial technologies and products guides the treatment of the material. Table of Contents: Role of Semantics and Metadata / Types and Models of Semantics / Annotation -- Adding Semantics to Data / Semantics for Enterprise Data / Semantics for Services / Semantics for Sensor Data / Semantics for Social Data / Semantics for Cloud Computing / Semantics for Advanced Applications View full abstract»
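
    As a minimal illustration of associating meaning with data, the sketch below (not from the book; the vocabulary prefixes such as ex: are invented for the example) represents a fact as subject-predicate-object triples and runs a trivial query over them.

      # Toy sketch: machine-readable semantics as subject-predicate-object triples.
      triples = [
          ("ex:HudsonRiver", "rdf:type", "ex:River"),
          ("ex:HudsonRiver", "ex:lengthKm", 507),
          ("ex:HudsonRiver", "rdfs:label", "Hudson River"),
      ]

      # A query over the annotated data: every entity typed as ex:River.
      rivers = {s for (s, p, o) in triples if p == "rdf:type" and o == "ex:River"}
      print(rivers)   # {'ex:HudsonRiver'}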

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Capstone Design Courses, Part Two: Preparing Biomedical Engineers for the Real World

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The biomedical engineering senior capstone design course is probably the most important course taken by undergraduate biomedical engineering students. It provides them with the opportunity to apply what they have learned in previous years, develop their communication, teamwork, project management, and design skills, and learn about the product development process. It prepares students for professional practice and serves as a preview of what it will be like to work as a biomedical engineer. The capstone design experience can change the way engineering students think about technology, themselves, society, and the world around them. It can make them aware of their potential to make a positive contribution to healthcare throughout the world and generate excitement for, and pride in, the engineering profession. Ideas for how to organize, structure, and manage a senior capstone design course for biomedical and other engineering students are presented here. These ideas will be helpful to faculty who are creating a new design course, expanding a current design program, or just looking for some ideas for improving an existing course. The better we can make these courses, the more "industry ready" our students will be, and the better prepared they will be for meaningful, successful careers in biomedical engineering. This book is the second part of a series covering Capstone Design Courses for biomedical engineers. Part I is available online here and in print (ISBN 9781598292923) and covers the following topics: Purpose, Goals, and Benefits; Designing a Course to Meet Student Needs; Enhancing the Capstone Design Courses; Meeting the Changing Needs of Future Engineers. Table of Contents: The Myth of the "Industry-Ready" Engineer / Recent Trends and the Current State of Capstone Design / Preparing Students for Capstone Design / Helping Students Recognize the Value of Capstone Design Courses / Developing Teamwork Skills / Incorporating Design Controls / Learning to Identify Problems, Unmet Needs, and New Product Opportunities / Design Verification and Validation / Liability Issues with Assistive Technology Projects / Standards in Capstone Design Courses and the Engineering Curriculum / Design Transfer and Design for Manufacturability / Learning from other Engineering Disciplines: Capstone Design Conferences / Maintaining a Relevant, Up-to-Date Capstone Design Course / Active Learning in Capstone Design Courses / Showcasing Student Projects: National Student Design Competitions / Managing Student Expectations of the "Real World" / Career Management and Professional Development / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Extreme Value Theory-Based Methods for Visual Recognition

    Copyright Year: 2017

    Morgan and Claypool eBooks

    A common feature of many approaches to modeling sensory statistics is an emphasis on capturing the "average." From early representations in the brain, to highly abstracted class categories in machine learning for classification tasks, central-tendency models based on the Gaussian distribution are a seemingly natural and obvious choice for modeling sensory data. However, insights from neuroscience, psychology, and computer vision suggest an alternate strategy: preferentially focusing representational resources on the extremes of the distribution of sensory inputs. The notion of treating extrema near a decision boundary as features is not necessarily new, but a comprehensive statistical theory of recognition based on extrema is only now just emerging in the computer vision literature. This book begins by introducing the statistical Extreme Value Theory (EVT) for visual recognition. In contrast to central-tendency modeling, it is hypothesized that distributions near decision boundaries form a more powerful model for recognition tasks by focusing coding resources on data that are arguably the most diagnostic features. EVT has several important properties: strong statistical grounding, better modeling accuracy near decision boundaries than Gaussian modeling, the ability to model asymmetric decision boundaries, and accurate prediction of the probability of an event beyond our experience. The second part of the book uses the theory to describe a new class of machine learning algorithms for decision making that are a measurable advance beyond the state-of-the-art. This includes methods for post-recognition score analysis, information fusion, multi-attribute spaces, and calibration of supervised machine learning algorithms. View full abstract»
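
    The contrast with central-tendency modeling can be sketched in a few lines. The example below is illustrative only (synthetic scores, with a Weibull chosen as one common EVT tail model; it is not the authors' code): it fits the upper tail of a pool of non-match scores and asks how surprising a new score is under that tail model.

      # Hedged sketch of post-recognition score analysis on the distribution tail.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      scores = rng.normal(0.3, 0.1, 10_000)      # synthetic non-match similarity scores
      tail = np.sort(scores)[-200:]              # keep only the extreme upper tail

      # Fit a Weibull to the tail, measured from just below its smallest value.
      shift = tail[0] - 1e-6
      c, loc, scale = stats.weibull_min.fit(tail - shift, floc=0)

      def prob_beyond(score):
          """Probability mass the fitted tail model places above `score`."""
          return stats.weibull_min.sf(score - shift, c, loc=loc, scale=scale)

      print(prob_beyond(0.75))   # small value -> unusually high for a non-match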

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Linguistic Annotation and Text Analytics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Linguistic annotation and text analytics are active areas of research and development, with academic conferences and industry events such as the Linguistic Annotation Workshops and the annual Text Analytics Summits. This book provides a basic introduction to both fields, and aims to show that good linguistic annotations are the essential foundation for good text analytics. After briefly reviewing the basics of XML, with practical exercises illustrating in-line and stand-off annotations, a chapter is devoted to explaining the different levels of linguistic annotations. The reader is encouraged to create example annotations using the WordFreak linguistic annotation tool. The next chapter shows how annotations can be created automatically using statistical NLP tools, and compares two sets of tools, the OpenNLP and Stanford NLP tools. The second half of the book describes different annotation formats and gives practical examples of how to interchange annotations between different formats using XSLT transformations. The two main text analytics architectures, GATE and UIMA, are then described and compared, with practical exercises showing how to configure and customize them. The final chapter is an introduction to text analytics, describing the main applications and functions including named entity recognition, coreference resolution and information extraction, with practical examples using both open source and commercial tools. Copies of the example files, scripts, and stylesheets used in the book are available from the companion website, located at the book website. Table of Contents: Working with XML / Linguistic Annotation / Using Statistical NLP Tools / Annotation Interchange / Annotation Architectures / Text Analytics View full abstract»
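
    The difference between in-line and stand-off annotation, which the early exercises revolve around, can be shown with a toy example (mine, not the book's): the same two named entities encoded once inside the text and once as character offsets that leave the text untouched.

      # Toy illustration of in-line vs. stand-off annotation.
      text = "Alice visited Paris"

      # In-line: the annotation is embedded in the text itself (XML-style).
      inline = '<s><ne type="person">Alice</ne> visited <ne type="location">Paris</ne></s>'

      # Stand-off: the text stays untouched; annotations point to character offsets.
      standoff = [
          {"start": 0,  "end": 5,  "type": "person"},
          {"start": 14, "end": 19, "type": "location"},
      ]

      for ann in standoff:
          print(text[ann["start"]:ann["end"]], ann["type"])
      # Alice person / Paris location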

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Collaborative Web Search: Who, What, Where, When, and Why

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Today, Web search is treated as a solitary experience. Web browsers and search engines are typically designed to support a single user, working alone. However, collaboration on information-seeking tasks is actually commonplace. Students work together to complete homework assignments, friends seek information about joint entertainment opportunities, family members jointly plan vacation travel, and colleagues jointly conduct research for their projects. As improved networking technologies and the rise of social media simplify the process of remote collaboration, and large, novel display form-factors simplify the process of co-located group work, researchers have begun to explore ways to facilitate collaboration on search tasks. This lecture investigates the who, what, where, when, and why of collaborative search, and gives insight into how emerging solutions can address collaborators' needs. Table of Contents: Introduction / Who? / What? / Where? / When? / Why? / Conclusion: How? View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Databases on Modern Hardware: How to Stop Underutilization and Love Multicores

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Data management systems enable various influential applications from high-performance online services (e.g., social networks like Twitter and Facebook or financial markets) to big data analytics (e.g., scientific exploration, sensor networks, business intelligence). As a result, data management systems have been one of the main drivers for innovations in the database and computer architecture communities for several decades. Recent hardware trends require software to take advantage of the abundant parallelism existing in modern and future hardware. The traditional design of the data management systems, however, faces inherent scalability problems due to its tightly coupled components. In addition, it cannot exploit the full capability of the aggressive micro-architectural features of modern processors. As a result, today's most commonly used server types remain largely underutilized, leading to a huge waste of hardware resources and energy. In this book, we shed light on the challenges present while running DBMS on modern multicore hardware. We divide the material into two dimensions of scalability: implicit/vertical and explicit/horizontal. The first part of the book focuses on the vertical dimension: it describes the instruction- and data-level parallelism opportunities in a core coming from the hardware and software side. In addition, it examines the sources of under-utilization in a modern processor and presents insights and hardware/software techniques to better exploit the microarchitectural resources of a processor by improving cache locality at the right level of the memory hierarchy. The second part focuses on the horizontal dimension, i.e., scalability bottlenecks of database applications at the level of multicore and multisocket multicore architectures. It first presents a systematic way of eliminating such bottlenecks in online transaction processing workloads, which is based on minimizing unbounded communication, and shows several techniques that minimize bottlenecks in major components of database management systems. Then, it demonstrates the data and work sharing opportunities for analytical workloads, and reviews advanced scheduling mechanisms that are aware of nonuniform memory accesses and alleviate bandwidth saturation. View full abstract»
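
    A rough feel for the vertical, data-parallel dimension can be had even in Python (a deliberately simplified sketch, not drawn from the book): scanning a column with one vectorized pass over contiguous data is the access pattern that lets hardware exploit SIMD units and caches, unlike an interpreter-driven row-at-a-time loop.

      # Row-at-a-time vs. column-at-a-time scan over the same data.
      import numpy as np

      n = 100_000
      price = np.random.rand(n)                  # one column of a table
      qty = np.random.randint(1, 10, n)          # another column

      # Row-at-a-time: one interpreted iteration per row.
      total_rows = 0.0
      for i in range(n):
          total_rows += price[i] * qty[i]

      # Column-at-a-time: one vectorized pass over contiguous arrays.
      total_columns = float(np.dot(price, qty))
      print(round(total_rows, 2), round(total_columns, 2))   # same result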

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2017

    Morgan and Claypool eBooks

    The purpose of this book is to cover essential aspects of vehicle suspension systems and provide an easy approach for their analysis and design. It is intended specifically for undergraduate students and anyone with an interest in design and analysis of suspension systems. In order to simplify the understanding of more difficult concepts, the book uses a step-by-step approach along with pictures, graphs and examples. The book begins with the introduction of the role of suspensions in cars and a description of their main components. The types of suspensions are discussed and their differences reviewed. The mechanisms or geometries of different suspension systems are introduced and the tools for their analysis are discussed. In addition, vehicle vibration is reviewed in detail and models are developed to study vehicle ride comfort. View full abstract»
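
    As a hint of the vibration models the book builds toward, here is a minimal single-degree-of-freedom sketch (parameter values assumed for illustration, not taken from the book): a sprung mass on a spring and damper responding to a step change in road height.

      # One-degree-of-freedom body-on-suspension model hitting a 5 cm road step.
      from scipy.integrate import solve_ivp

      m, k, c = 300.0, 20_000.0, 1_500.0     # sprung mass (kg), spring (N/m), damper (N s/m)
      road = 0.05                            # step in road height (m)

      def rhs(t, y):
          z, zdot = y                        # body displacement and velocity
          return [zdot, (k * (road - z) - c * zdot) / m]

      sol = solve_ivp(rhs, (0, 2), [0.0, 0.0], max_step=0.01)
      print(f"peak body displacement: {sol.y[0].max() * 100:.1f} cm")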

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Deep Web Query Interface Understanding and Integration

    Copyright Year: 2012

    Morgan and Claypool eBooks

    There are millions of searchable data sources on the Web and to a large extent their contents can only be reached through their own query interfaces. There is an enormous interest in making the data in these sources easily accessible. There are primarily two general approaches to achieve this objective. The first is to surface the contents of these sources from the deep Web and add the contents to the index of regular search engines. The second is to integrate the searching capabilities of these sources and support integrated access to them. In this book, we introduce the state-of-the-art techniques for extracting, understanding, and integrating the query interfaces of deep Web data sources. These techniques are critical for producing an integrated query interface for each domain. The interface serves as the mediator for searching all data sources in the concerned domain. While query interface integration is only relevant for the deep Web integration approach, the extraction and understanding of query interfaces are critical for both deep Web exploration approaches. This book aims to provide in-depth and comprehensive coverage of the key technologies needed to create high quality integrated query interfaces automatically. The following technical issues are discussed in detail in this book: query interface modeling, query interface extraction, query interface clustering, query interface matching, query interface attribute integration, and query interface integration. Table of Contents: Introduction / Query Interface Representation and Extraction / Query Interface Clustering and Categorization / Query Interface Matching / Query Interface Attribute Integration / Query Interface Integration / Summary and Future Research View full abstract»
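
    Query interface matching, one of the steps listed above, can be caricatured with plain string similarity (a toy sketch, far simpler than the book's actual techniques; the labels are invented): each attribute label of one interface is paired with its closest label in another.

      # Naive attribute matching between two deep-Web query interfaces.
      from difflib import SequenceMatcher

      interface_a = ["title", "author name", "publication year"]
      interface_b = ["book title", "author", "year of publication"]

      def best_match(label, candidates):
          """Return the candidate label most similar to `label`."""
          return max(candidates, key=lambda c: SequenceMatcher(None, label, c).ratio())

      for label in interface_a:
          print(label, "->", best_match(label, interface_b))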

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Intelligent Autonomous Robotics: A Robot Soccer Case Study

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Robotics technology has recently advanced to the point of being widely accessible for relatively low-budget research, as well as for graduate, undergraduate, and even secondary and primary school education. This lecture provides an example of how to productively use a cutting-edge advanced robotics platform for education and research by providing a detailed case study with the Sony AIBO robot, a vision-based legged robot. The case study used for this lecture is the UT Austin Villa RoboCup Four-Legged Team. This lecture describes both the development process and the technical details of its end result. The main contributions of this lecture are (i) a roadmap for new classes and research groups interested in intelligent autonomous robotics who are starting from scratch with a new robot, and (ii) documentation of the algorithms behind our own approach on the AIBOs with the goal of making them accessible for use on other vision-based and/or legged robot platforms. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book serves as a practical guide to simulation of 3D deformable solids using the Finite Element Method (FEM). It reviews a number of topics related to the theory and implementation of FEM approaches: measures of deformation, constitutive laws of nonlinear materials, tetrahedral discretizations, and model reduction techniques for real-time simulation. Simulations of deformable solids are important in many applications in computer graphics, including film special effects, computer games, and virtual surgery. The Finite Element Method has become a popular tool in many such applications. Variants of FEM catering to both offline and real-time simulation have had a mature presence in computer graphics literature. This book is designed for readers familiar with numerical simulation in computer graphics, who would like to obtain a cohesive picture of the various FEM simulation methods available, their strengths and weaknesses, and their applicability in various simulation scenarios. The book is also a practical implementation guide for the visual effects developer, offering a lean yet adequate synopsis of the underlying mathematical theory. Chapter 1 introduces the quantitative descriptions used to capture the deformation of elastic solids, the concept of strain energy, and discusses how force and stress result as a response to deformation. Chapter 2 reviews a number of constitutive models, i.e., analytical laws linking deformation to the resulting force, that have successfully been used in various graphics-oriented simulation tasks. Chapter 3 summarizes how deformation and force can be computed discretely on a tetrahedral mesh, and how an implicit integrator can be structured around this discretization. Finally, Chapter 4 presents the state of the art in model reduction techniques for real-time FEM solid simulation and discusses which techniques are suitable for which applications. Topics discussed in this chapter include linear modal analysis, modal warping, subspace simulation, and domain decomposition. View full abstract»
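
    The quantities introduced in the first chapters can be computed in a few lines for a single element. The sketch below is mine (material parameters and the test deformation are arbitrary): it builds the deformation gradient of one tetrahedron from its rest and deformed shape matrices and evaluates a St. Venant-Kirchhoff energy density from the Green strain.

      # Deformation gradient and St. Venant-Kirchhoff energy density of one tetrahedron.
      import numpy as np

      X = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)  # rest positions
      x = X.copy()
      x[1] = [1.2, 0.0, 0.0]                      # stretch the element along one edge

      Dm = (X[1:] - X[0]).T                       # rest-shape matrix (3x3)
      Ds = (x[1:] - x[0]).T                       # deformed-shape matrix (3x3)
      F = Ds @ np.linalg.inv(Dm)                  # deformation gradient

      mu, lam = 1.0, 1.0                          # Lame parameters (arbitrary here)
      E = 0.5 * (F.T @ F - np.eye(3))             # Green strain tensor
      psi = mu * np.sum(E * E) + 0.5 * lam * np.trace(E) ** 2
      print(psi)                                  # strain energy density of this element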

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    FPGA-Accelerated Simulation of Computer Systems

    Copyright Year: 2014

    Morgan and Claypool eBooks

    To date, the most common simulators of computer systems are software-based and run on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed for FPGA accelerated simulation, survey the state-of-the-art in FPGA-accelerated simulation, and describe in detail selected instances of the described techniques. Table of Contents: Preface / Acknowledgments / Introduction / Simulator Background / Accelerating Computer System Simulators with FPGAs / Simulation Virtualization / Categorizing FPGA-based Simulators / Conclusion / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Biologic Foundations for Skeletal Tissue Engineering

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Tissue engineering research for bone and joint applications entails multidisciplinary teams bringing together the needed expertise in anatomy, biology, biochemistry, pathophysiology, materials science, biomechanics, fluidics, and clinical and veterinary orthopedics. It is the goal of this volume to provide students and investigators who are entering this exciting area with an understanding of the biologic foundations necessary to appreciate the problems in bone and cartilage that may benefit from innovative tissue engineering approaches. This volume includes state-of-the-art information about bone and cartilage physiology at the levels of cell and molecular biology, tissue structure, developmental processes, their metabolic and structural functions, responses to injury, mechanisms of post-natal healing and graft incorporation, the many congenital and acquired disorders, effects of aging, and current clinical standards of care. It reviews the strengths and limitations of various experimental animal models, sources of cells, composition and design of scaffolds, activities of growth factors and genes to enhance histogenesis, and the need for new materials in the context of cell-based and cell-free tissue engineering. These building blocks constitute the dynamic environments in which innovative approaches are needed for addressing debilitating disorders of the skeleton. It is likely that a single tactic will not be sufficient for different applications because of variations in the systemic and local environments. The realization that tissue regeneration is complex and dynamic underscores the continuing need for innovative multidisciplinary investigations, with an eye to simple and safe therapies for disabled patients. Table of Contents: Introduction / Structure and Function of Bone and Cartilage Tissue / Development / Responses to Injury and Grafting / Clinical Applications for Skeletal Tissue Engineering / Animal Models / Tissue Engineering Principles for Bone and Cartilage / Perspectives View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    GPU-Based Techniques for Global Illumination Effects

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. The book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make the book self-contained, the most important concepts of local illumination and global illumination rendering, graphics hardware, and Direct3D/HLSL programming are reviewed in the first chapters. After these introductory chapters we warm up with simple methods including shadow and environment mapping, then we move on toward advanced concepts aiming at global illumination rendering. Since it would have been impossible to give a rigorous review of all approaches proposed in this field, we go into the details of just a few methods solving each particular global illumination effect. However, a short discussion of the state of the art and links to the bibliography are also provided to refer the interested reader to techniques that are not detailed in this book. The implementation of the selected methods is also presented in HLSL, and we discuss their observed performance, merits, and disadvantages. In the last chapter, we also review how these techniques can be integrated in an advanced game engine and present case studies of their exploitation in games. Having gone through this book, the reader will have an overview of the state of the art, will be able to apply and improve these techniques, and most importantly, will be capable of developing brand new GPU algorithms. Table of Contents: Global Illumination Rendering / Local Illumination Rendering Pipeline of GPUs / Programming and Controlling GPUs / Simple Improvements of the Local Illumination Model / Ray Casting on the GPU / Specular Effects with Rasterization / Diffuse and Glossy Indirect Illumination / Pre-computation Aided Global Illumination / Participating Media Rendering / Fake Global Illumination / Postprocessing Effects / Integrating GI Effects in Games and Virtual Reality Systems / Bibliography View full abstract»
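
    Two of the local-illumination building blocks the introductory chapters review can be written down directly (a small sketch in Python rather than HLSL, with made-up vectors): the Lambertian cosine term and the mirror reflection direction used for environment-map lookups.

      # Lambertian term and mirror reflection direction for one surface point.
      import numpy as np

      def normalize(v):
          return v / np.linalg.norm(v)

      n = normalize(np.array([0.0, 1.0, 0.0]))    # surface normal
      l = normalize(np.array([1.0, 1.0, 0.0]))    # direction toward the light
      v = normalize(np.array([0.0, 1.0, 1.0]))    # direction toward the viewer

      diffuse = max(np.dot(n, l), 0.0)            # Lambert's cosine term
      r = 2.0 * np.dot(n, v) * n - v              # mirror reflection of the view direction
      print(diffuse, r)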

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Libraries and the Semantic Web: An Introduction to Its Applications and Opportunities for Libraries

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book covers the concept of the Semantic Web—what it is, the components that comprise it, including Linked Data, and the various ways that libraries are engaged in contributing to its development in making library resources and services ever more accessible to end-users. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computational Methods for Integrating Vision and Language

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Modeling data from visual and linguistic modalities together creates opportunities for better understanding of both, and supports many useful applications. Examples of dual visual-linguistic data include images with keywords, video with narrative, and figures in documents. We consider two key task-driven themes: translating from one modality to another (e.g., inferring annotations for images) and understanding the data using all modalities, where one modality can help disambiguate information in another. The multiple modalities can either be essentially semantically redundant (e.g., keywords provided by a person looking at the image), or largely complementary (e.g., metadata such as the camera used). Redundancy and complementarity are two endpoints of a scale, and we observe that good performance on translation requires some redundancy, and that joint inference is most useful where some information is complementary. Computational methods discussed are broadly organized into ones for simple keywords, ones going beyond keywords toward natural language, and ones considering sequential aspects of natural language. Methods for keywords are further organized based on localization of semantics, going from words about the scene taken as a whole, to words that apply to specific parts of the scene, to relationships between parts. Methods going beyond keywords are organized by the linguistic roles that are learned, exploited, or generated. These include proper nouns, adjectives, spatial and comparative prepositions, and verbs. More recent developments in dealing with sequential structure include automated captioning of scenes and video, alignment of video and text, and automated answering of questions about scenes depicted in images. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis of the MPEG-1 Layer III (MP3) Algorithm using MATLAB

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The MPEG-1 Layer III (MP3) algorithm is one of the most successful audio formats for consumer audio storage and for transfer and playback of music on digital audio players. The MP3 compression standard along with the AAC (Advanced Audio Coding) algorithm are associated with the most successful music players of the last decade. This book describes the fundamentals and the MATLAB implementation details of the MP3 algorithm. Several of the tedious processes in MP3 are supported by demonstrations using MATLAB software. The book presents the theoretical concepts and algorithms used in the MP3 standard. The implementation details and simulations with MATLAB complement the theoretical principles. The extensive list of references enables the reader to perform a more detailed study on specific aspects of the algorithm and gain exposure to advancements in perceptual coding. Table of Contents: Introduction / Analysis Subband Filter Bank / Psychoacoustic Model II / MDCT / Bit Allocation, Quantization and Coding / Decoder View full abstract»
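
    To give a flavor of the transforms involved, here is a direct (unoptimized) MDCT of one block, written in Python rather than the book's MATLAB; the block length and test signal are made up for illustration.

      # Direct MDCT: 2N input samples -> N coefficients.
      import numpy as np

      def mdct(block):
          """MDCT of a length-2N block, returning N coefficients."""
          two_n = len(block)
          n_half = two_n // 2
          n = np.arange(two_n)
          k = np.arange(n_half)
          basis = np.cos(np.pi / n_half * (n[None, :] + 0.5 + n_half / 2) * (k[:, None] + 0.5))
          return basis @ block

      x = np.sin(2 * np.pi * 5 * np.arange(36) / 36)   # one 36-sample block (N = 18)
      print(mdct(x)[:4])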

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering Thermodynamics and 21st Century Energy Problems: A Textbook Companion for Student Engagement

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Energy is a basic human need; technologies for energy conversion and use are fundamental to human survival. As energy technology evolves to meet demands for development and ecological sustainability in the 21st century, engineers need to have up-to-date skills and knowledge to meet the creative challenges posed by current and future energy problems. Further, engineers need to cultivate a commitment to and passion for lifelong learning which will enable us to actively engage new developments in the field. This undergraduate textbook companion seeks to develop these capacities in tomorrow's engineers in order to provide for future energy needs around the world. This book is designed to complement traditional texts in engineering thermodynamics, and thus is organized to accompany explorations of the First and Second Laws, fundamental property relations, and various applications across engineering disciplines. It contains twenty modules targeted toward meeting five often-neglected ABET outcomes: ethics, communication, lifelong learning, social context, and contemporary issues. The modules are based on pedagogies of liberation, used for decades in the humanities and social sciences for instilling critical thinking and reflective action in students by bringing attention to power relations in the classroom and in the world. This book is intended to produce a conversation and creative exploration around how to teach and learn thermodynamics differently. Because liberative pedagogies are at their heart relational, it is important to maintain spaces for discussing classroom practices with these modules, and for sharing ideas for implementing critical pedagogies in engineering contexts. The reader is therefore encouraged to visit the book's blog. Table of Contents: What and Why? / The First Law: Making Theory Relevant / The Second Law and Property Relations / Thinking Big Picture about Energy and Sustainability View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tissue Engineering of Temporomandibular Joint Cartilage

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The temporomandibular joint (TMJ) is a site of intense morbidity for millions of people, especially young, pre-menopausal women. Central to TMJ afflictions are the cartilaginous tissues of the TMJ, especially those of the disc and condylar cartilage, which play crucial roles in normal function of this unusual joint. Damage or disease to these tissues significantly impacts a patient's quality of life by making common activities such as talking and eating difficult and painful. Unfortunately, these tissues have limited ability to heal, necessitating the development of treatments for repair or replacement. The burgeoning field of tissue engineering holds promise that replacement tissues can be constructed in the laboratory to recapitulate the functional requirements of native tissues. This book outlines the biomechanical, biochemical, and anatomical characteristics of the disc and condylar cartilage, and also provides a historical perspective of past and current TMJ treatments and previous tissue engineering efforts. This book was written to serve as a reference for researchers seeking to learn about the TMJ, for undergraduate and graduate level courses, and as a compendium of TMJ tissue engineering design criteria. Table of Contents: The Temporomandibular Joint / Fibrocartilage of the TMJ Disc / Cartilage of the Mandibular Condyle / Tissue Engineering of the Disc / Tissue Engineering of the Mandibular Condyle / Current Perspectives View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quantum Computing for Computer Architects, Second Edition

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore the systems-level challenges in achieving scalable, fault-tolerant quantum computation. In this lecture, we provide an engineering-oriented introduction to quantum computation with an overview of the theory behind key quantum algorithms. Next, we look at architectural case studies based upon experimental data and future projections for quantum computation implemented using trapped ions. While we focus here on architectures targeted for realization using trapped ions, the techniques for quantum computer architecture design, quantum fault-tolerance, and compilation described in this lecture are applicable to many other physical technologies that may be viable candidates for building a large-scale quantum computing system. We also discuss general issues involved with programming a quantum computer, as well as work on quantum architectures based on quantum teleportation. Finally, we consider some of the open issues remaining in the design of quantum computers. Table of Contents: Introduction / Basic Elements for Quantum Computation / Key Quantum Algorithms / Building Reliable and Scalable Quantum Architectures / Simulation of Quantum Computation / Architectural Elements / Case Study: The Quantum Logic Array Architecture / Programming the Quantum Architecture / Using the QLA for Quantum Simulation: The Transverse Ising Model / Teleportation-Based Quantum Architecture / Concluding Remarks View full abstract»
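
    The basic elements mentioned above reduce to small linear algebra. The sketch below (mine, not from the lecture) prepares an equal superposition by applying a Hadamard gate to |0> and reads off the resulting measurement probabilities.

      # A qubit, a Hadamard gate, and measurement probabilities as plain linear algebra.
      import numpy as np

      ket0 = np.array([1.0, 0.0])                     # |0>
      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

      state = H @ ket0                                # (|0> + |1>) / sqrt(2)
      probs = np.abs(state) ** 2                      # measurement probabilities
      print(probs)                                    # [0.5 0.5]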

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    XML Retrieval

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Documents usually have a content and a structure. The content refers to the text of the document, whereas the structure refers to how a document is logically organized. An increasingly common way to encode the structure is through the use of a mark-up language. Nowadays, the most widely used mark-up language for representing structure is the eXtensible Mark-up Language (XML). XML can be used to provide a focused access to documents, i.e. returning XML elements, such as sections and paragraphs, instead of whole documents in response to a query. Such focused strategies are of particular benefit for information repositories containing long documents, or documents covering a wide variety of topics, where users are directed to the most relevant content within a document. The increased adoption of XML to represent a document structure requires the development of tools to effectively access documents marked-up in XML. This book provides a detailed description of query languages, indexing strategies, ranking algorithms, and presentation scenarios developed to access XML documents. Major advances in XML retrieval were seen from 2002 as a result of INEX, the Initiative for Evaluation of XML Retrieval. INEX, also described in this book, provided test sets for evaluating XML retrieval effectiveness. Many of the developments and results described in this book were investigated within INEX. Table of Contents: Introduction / Basic XML Concepts / Historical Perspectives / Query Languages / Indexing Strategies / Ranking Strategies / Presentation Strategies / Evaluating XML Retrieval Effectiveness / Conclusions View full abstract»
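
    Focused access can be illustrated in miniature (a toy sketch, not the book's methods; the document and query are invented): instead of returning the whole article, only the XML elements whose text matches the query are returned.

      # Returning matching XML elements (paragraphs) rather than whole documents.
      import xml.etree.ElementTree as ET

      doc = ET.fromstring(
          "<article>"
          "<sec><p>XML encodes document structure.</p></sec>"
          "<sec><p>Ranking strategies score individual elements.</p></sec>"
          "</article>"
      )

      query = "ranking"
      hits = [p for p in doc.iter("p") if query in p.text.lower()]
      for p in hits:
          print(p.text)        # only the relevant paragraph is returned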

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Path Problems in Networks

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The algebraic path problem is a generalization of the shortest path problem in graphs. Various instances of this abstract problem have appeared in the literature, and similar solutions have been independently discovered and rediscovered. The repeated appearance of a problem is evidence of its relevance. This book aims to help current and future researchers add this powerful tool to their arsenal, so that they can easily identify and use it in their own work. Path problems in networks can be conceptually divided into two parts: A distillation of the extensive theory behind the algebraic path problem, and an exposition of a broad range of applications. First of all, the shortest path problem is presented so as to fix terminology and concepts: existence and uniqueness of solutions, robustness to parameter changes, and centralized and distributed computation algorithms. Then, these concepts are generalized to the algebraic context of semirings. Methods for creating new semirings, useful for modeling new problems, are provided. A large part of the book is then devoted to numerous applications of the algebraic path problem, ranging from mobile network routing to BGP routing to social networks. These applications show what kind of problems can be modeled as algebraic path problems; they also serve as examples on how to go about modeling new problems. This monograph will be useful to network researchers, engineers, and graduate students. It can be used either as an introduction to the topic, or as a quick reference to the theoretical facts, algorithms, and application examples. The theoretical background assumed for the reader is that of a graduate or advanced undergraduate student in computer science or engineering. Some familiarity with algebra and algorithms is helpful, but not necessary. Algebra, in particular, is used as a convenient and concise language to describe problems that are essentially combinatorial. Table of Contents: Classical Shortest Path / The Algebraic Path Problem / Properties and Computation of Solutions / Applications / Related Areas / List of Semirings and Applications View full abstract»
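
    The step from shortest paths to the algebraic path problem is easy to sketch (illustrative code, not the book's): a Floyd-Warshall-style iteration is written against an abstract pair of operations and then instantiated with the (min, +) semiring to recover ordinary shortest paths; other semirings, such as (max, min) for widest paths, drop into the same routine with suitable edge weights.

      # Generic all-pairs path solver parameterized by a semiring (oplus, otimes).
      INF = float("inf")

      def algebraic_paths(weights, oplus, otimes):
          """All-pairs path weights over an arbitrary semiring."""
          n = len(weights)
          d = [row[:] for row in weights]
          for k in range(n):
              for i in range(n):
                  for j in range(n):
                      d[i][j] = oplus(d[i][j], otimes(d[i][k], d[k][j]))
          return d

      w = [[0, 3, INF],
           [INF, 0, 2],
           [7, INF, 0]]

      shortest = algebraic_paths(w, min, lambda a, b: a + b)   # (min, +) semiring
      print(shortest[0][2])                                    # 5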

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Hard Problems in Software Testing: Solutions Using Testing as a Service (TaaS)

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book summarizes the current hard problems in software testing as voiced by leading practitioners in the field. The problems were identified through a series of workshops, interviews, and surveys. Some of the problems are timeless, such as education and training, while others such as system security have recently emerged as increasingly important. The book also provides an overview of the current state of Testing as a Service (TaaS) based on an exploration of existing commercial offerings and a survey of academic research. TaaS is a relatively new development that offers software testers the elastic computing capabilities and generous storage capacity of the cloud on an as-needed basis. Some of the potential benefits of TaaS include automated provisioning of test execution environments and support for rapid feedback in agile development via continuous regression testing. The book includes a case study of a representative web application and three commercial TaaS tools to determine which hard problems in software testing are amenable to a TaaS solution. The findings suggest there remains a significant gap that must be addressed before TaaS can be fully embraced by the industry, particularly in the areas of tester education and training and a need for tools supporting more types of testing. The book includes a roadmap for enhancing TaaS to help bridge the gap between potential benefits and actual results. Table of Contents: Introduction / Hard Problems in Software Testing / Testing as a Service (TaaS) / Case Study and Gap Analysis / Summary / Appendix A: Hard Problems in Software Testing Survey / Appendix B: Google App Engine Code Examples / Appendix C: Sauce Labs Code Examples / References / Author Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Lifelong Machine Learning

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Lifelong Machine Learning (or Lifelong Learning) is an advanced machine learning paradigm that learns continuously, accumulates the knowledge learned in previous tasks, and uses it to help future learning. In the process, the learner becomes more and more knowledgeable and effective at learning. This learning ability is one of the hallmarks of human intelligence. However, the current dominant machine learning paradigm learns in isolation: given a training dataset, it runs a machine learning algorithm on the dataset to produce a model. It makes no attempt to retain the learned knowledge and use it in future learning. Although this isolated learning paradigm has been very successful, it requires a large number of training examples, and is only suitable for well-defined and narrow tasks. In comparison, we humans can learn effectively with a few examples because we have accumulated so much knowledge in the past which enables us to learn with little data or effort. Lifelong learning aims to achieve this capability. As statistical machine learning matures, it is time to make a major effort to break the isolated learning tradition and to study lifelong learning to bring machine learning to new heights. Applications such as intelligent assistants, chatbots, and physical robots that interact with humans and systems in real-life environments are also calling for such lifelong learning capabilities. Without the ability to accumulate the learned knowledge and use it to learn more knowledge incrementally, a system will probably never be truly intelligent. This book serves as an introductory text and survey to lifelong learning. View full abstract»
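
    A very rough caricature of the contrast with isolated learning (my sketch, using scikit-learn and synthetic data; real lifelong learning systems retain far richer knowledge than raw model weights): a linear model trained on one task is updated incrementally on a related task instead of being retrained from scratch.

      # Carrying learned parameters from one task into the next.
      import numpy as np
      from sklearn.linear_model import SGDClassifier

      rng = np.random.default_rng(0)
      X_task1 = rng.normal(0, 1, (200, 5)); y_task1 = (X_task1[:, 0] > 0).astype(int)
      X_task2 = rng.normal(0, 1, (50, 5));  y_task2 = (X_task2[:, 0] + 0.2 * X_task2[:, 1] > 0).astype(int)

      model = SGDClassifier(random_state=0)
      model.partial_fit(X_task1, y_task1, classes=[0, 1])   # learn task 1
      model.partial_fit(X_task2, y_task2)                   # continue on a related task 2
      print(model.score(X_task2, y_task2))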

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Finite State Machine Datapath Design, Optimization, and Implementation

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Finite State Machine Datapath Design, Optimization, and Implementation explores the design space of combined FSM/Datapath implementations. The lecture starts by examining performance issues in digital systems such as clock skew and its effect on setup and hold time constraints, and the use of pipelining for increasing system clock frequency. This is followed by definitions for latency and throughput, with associated resource tradeoffs explored in detail through the use of dataflow graphs and scheduling tables applied to examples taken from digital signal processing applications. Also, design issues relating to functionality, interfacing, and performance for different types of memories commonly found in ASICs and FPGAs such as FIFOs, single-ports, and dual-ports are examined. Selected design examples are presented in implementation-neutral Verilog code and block diagrams, with associated design files available as downloads for both Altera Quartus and Xilinx Virtex FPGA platforms. A working knowledge of Verilog, logic synthesis, and basic digital design techniques is required. This lecture is suitable as a companion to the synthesis lecture titled Introduction to Logic Synthesis using Verilog HDL. Table of Contents: Calculating Maximum Clock Frequency / Improving Design Performance / Finite State Machine with Datapath (FSMD) Design / Embedded Memory Usage in Finite State Machine with Datapath (FSMD) Designs View full abstract»
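
    The setup-time constraint behind the maximum clock frequency calculation can be stated as a one-line timing budget. The numbers below are invented for illustration, and charging worst-case skew against the path is one common conservative convention, not necessarily the lecture's exact formulation.

      # Back-of-the-envelope maximum clock frequency from a setup-time budget.
      t_clk_to_q = 0.5   # ns, source register clock-to-output delay
      t_logic    = 2.8   # ns, worst-case combinational path delay
      t_setup    = 0.4   # ns, destination register setup time
      t_skew     = 0.3   # ns, clock skew budgeted against the path (worst case)

      t_min_period = t_clk_to_q + t_logic + t_setup + t_skew
      f_max_mhz = 1000.0 / t_min_period
      print(f"minimum period {t_min_period:.1f} ns -> about {f_max_mhz:.0f} MHz")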

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Ethics for Bioengineers

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Increasingly, biomedical scientists and engineers are involved in projects, design, or research and development that involve humans or animals. The book presents general concepts on professionalism and the regulation of the profession of engineering, including a discussion of what ethics and moral conduct are, ethical theories, and the codes of ethics that are most relevant for engineers. An ethical decision-making process is suggested. Other issues such as conflicts of interest, plagiarism, intellectual property, confidentiality, privacy, fraud, and corruption are presented. General guidelines, the process for obtaining ethics approval from Ethics Review Boards, and the importance of obtaining informed consent from volunteers recruited for studies are presented. A discussion on research with animals is included. Ethical dilemmas focus on reproductive technologies, stem cells, cloning, genetic testing, and designer babies. The book includes a discussion on ethics and the technologies of body enhancement and of regeneration. The importance of assessing the impact of technology on people, society, and on our planet is stressed. Particular attention is given to nanotechnologies, the environment, and issues that pertain to developing countries. Ideas on gender, culture, and ethics focus on how research and access to medical services have, at times, been discriminatory towards women. The cultural aspects focus on organ transplantation in Japan, and a case study of an Aboriginal child in Canada; both examples show the impact that culture can have on how care is provided or accepted. The final section of the book discusses data collection and analysis and offers a guideline for honest reporting of results, avoiding fraud, or unethical approaches. The appendix presents a few case studies where fraud and/or unethical research have occurred. Table of Contents: Introduction to Ethics / Experiments with Human Subjects or Animals / Examples of Ethical Dilemmas in Biomedical Research / Technology and Society / Gender, Culture, and Ethics / Data Collection and Analysis View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Cardiac Tissue Engineering: Principles, Materials, and Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Cardiac tissue engineering aims at repairing damaged heart muscle and producing human cardiac tissues for application in drug toxicity studies. This book offers a comprehensive overview of the cardiac tissue engineering strategies, including presenting and discussing the various concepts in use, research directions and applications. Essential basic information on the major components in cardiac tissue engineering, namely cell sources and biomaterials, is firstly presented to the readers, followed by a detailed description of their implementation in different strategies, broadly divided to cellular and acellular ones. In cellular approaches, the biomaterials are used to increase cell retention after implantation or as scaffolds when bioengineering the cardiac patch, in vitro. In acellular approaches, the biomaterials are used as ECM replacement for damaged cardiac ECM after MI, or, in combination with growth factors, the biomaterials assume an additional function as a depot for prolonged factor activity for the effective recruitment of repairing cells. The book also presents technological innovations aimed to improve the quality of the cardiac patches, such as bioreactor applications, stimulation patterns and prevascularization. This book could be of interest not only from an educational perspective (i.e. for graduate students), but also for researchers and medical professionals, to offer them fresh views on novel and powerful treatment strategies. We hope that the reader will find a broad spectrum of ideas and possibilities described in this book both interesting and convincing. Table of Contents: Introduction / The Heart: Structure, Cardiovascular Diseases, and Regeneration / Cell Sources for Cardiac Tissue Engineering / Biomaterials: Polymers, Scaffolds, and Basic Design Criteria / Biomaterials as Vehicles for Stem Cell Delivery and Retention in the Infarct / Bioengineering of Cardiac Patches, In Vitro / Perfusion Bioreactors and Stimulation Patterns in Cardiac Tissue Engineering / Vascularization of Cardiac Patches / Acellular Biomaterials for Cardiac Repair / Biomaterial-based Controlled Delivery of Bioactive Molecules for Myocardial Regeneration View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sensory Organ Replacement and Repair

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The senses of human hearing and sight are often taken for granted by many individuals until they are lost or adversely affected. Millions of individuals suffer from partial or total hearing loss and millions of others have impaired vision. The technologies associated with augmenting these two human senses range from simple hearing aids to complex cochlear implants, and from (now commonplace) intraocular lenses to complex artificial corneas. The areas of human hearing and human sight will be described in detail with the associated array of technologies also described. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Models of Horizontal Eye Movements, Part II: A 3rd Order Linear Saccade Model

    Copyright Year: 2010

    Morgan and Claypool eBooks

    There are five different types of eye movements: saccades, smooth pursuit, vestibular ocular eye movements, optokinetic eye movements, and vergence eye movements. The purpose of this book is focused primarily on mathematical models of the horizontal saccadic eye movement system and the smooth pursuit system, rather than on how visual information is processed. A saccade is a fast eye movement used to acquire a target by placing the image of the target on the fovea. Smooth pursuit is a slow eye movement used to track a target as it moves by keeping the target on the fovea. The vestibular ocular movement is used to keep the eyes on a target during brief head movements. The optokinetic eye movement is a combination of saccadic and slow eye movements that keeps a full-field image stable on the retina during sustained head rotation. Each of these movements is a conjugate eye movement, that is, movements of both eyes together driven by a common neural source. A vergence movement is a non-conjugate eye movement allowing the eyes to track targets as they come closer or farther away. In this book, a 2009 version of a state-of-the-art model is presented for horizontal saccades that is 3rd-order and linear, and controlled by a physiologically based time-optimal neural network. The oculomotor plant and saccade generator are the basic elements of the saccadic system. The control of saccades is initiated by the superior colliculus and terminated by the cerebellar fa