Morgan and Claypool Synthesis Digital Library

779 Results Returned

  • Introduction to Chinese Natural Language Processing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book introduces Chinese language-processing issues and techniques to readers who already have a basic background in natural language processing (NLP). Since the major difference between Chinese and Western languages is at the word level, the book primarily focuses on Chinese morphological analysis and introduces the concept, structure, and interword semantics of Chinese words. The following topics are covered: a general introduction to Chinese NLP; Chinese characters, morphemes, and words and the characteristics of Chinese words that have to be considered in NLP applications; Chinese word segmentation; unknown word detection; word meaning and Chinese linguistic resources; interword semantics based on word collocation and NLP techniques for collocation extraction. Table of Contents: Introduction / Words in Chinese / Challenges in Chinese Morphological Processing / Chinese Word Segmentation / Unknown Word Identification / Word Meaning / Chinese Collocations / Automatic Chinese Collocation Extraction / Appendix / References / Author Biographies
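
    The word segmentation problem mentioned above is often introduced with dictionary-based baselines. Below is a minimal, illustrative sketch of forward maximum matching; the tiny dictionary and sentence are made-up examples, not material from the book.

```python
# Forward maximum matching: a simple dictionary-based Chinese word segmenter.
# The dictionary and test sentence are illustrative placeholders.
def fmm_segment(text, dictionary, max_len=4):
    words = []
    i = 0
    while i < len(text):
        # Try the longest dictionary match starting at position i.
        for j in range(min(max_len, len(text) - i), 0, -1):
            candidate = text[i:i + j]
            if j == 1 or candidate in dictionary:
                words.append(candidate)
                i += j
                break
    return words

dictionary = {"研究", "研究生", "生命", "起源"}
# Greedy matching picks 研究生 here, illustrating the ambiguity problems the book discusses.
print(fmm_segment("研究生命起源", dictionary))  # ['研究生', '命', '起源']
```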

  • Central Nervous System Tissue Engineering: Current Considerations and Strategies

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Combating neural degeneration from injury or disease is extremely difficult in the brain and spinal cord, i.e., the central nervous system (CNS). Unlike the peripheral nerves, CNS neurons are confronted by physical and chemical restrictions that prevent proper healing and restoration of function. The CNS is vital to bodily function, and loss of any part of it can severely and permanently alter a person's quality of life. Tissue engineering could offer much needed solutions to regenerate or replace damaged CNS tissue. This review will discuss current CNS tissue engineering approaches integrating scaffolds, cells, and stimulation techniques. Hydrogels are commonly used CNS tissue engineering scaffolds to stimulate and enhance regeneration, but fiber meshes and other porous structures show specific utility depending on application. CNS-relevant cell sources have focused on implantation of exogenous cells or stimulation of endogenous populations. Somatic cells of the CNS are rarely utilized for tissue engineering; however, glial cells of the peripheral nervous system (PNS) may be used to myelinate and protect damaged spinal cord. Pluripotent and multipotent stem cells offer alternative cell sources due to continuing advancements in identification and differentiation of these cells. Finally, physical, chemical, and electrical guidance cues are extremely important to neural cells, serving important roles in development and adulthood. These guidance cues are being integrated into tissue engineering approaches. Of particular interest is the inclusion of cues to guide stem cells to differentiate into CNS cell types, as well as to guide neuron targeting. This review should provide the reader with a broad understanding of CNS tissue engineering challenges and tactics, with the goal of fostering the future development of biologically inspired designs. Table of Contents: Introduction / Anatomy of the CNS and Progression of Neurological Damage / Biomaterials for Scaffold Preparation / Cell Sources for CNS TE / Stimulation and Guidance / Concluding Remarks

  • The Engineering Design Challenge: A Unique Opportunity

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The Engineering Design Challenge addresses teaching engineering design and presents design projects for first-year students and interdisciplinary design ventures. A short philosophy and background of engineering design are discussed. The organization of the University of Wyoming first-year Introduction to Engineering program is presented with an emphasis on the first-year design challenges. These challenges are presented in a format readily incorporated in other first-year programs. The interdisciplinary design courses address the institutional constraints and present organizational approaches that resolve these issues. Student results are summarized and briefly assessed. A series of short intellectual problems are included to initiate discussion and understanding of design issues. Sample syllabi, research paper requirements, and oral presentation evaluation sheets are included.

  • Estimation of Cortical Connectivity in Humans: Advanced Signal Processing Techniques

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In the last ten years many different brain imaging devices have conveyed a lot of information about brain functioning in different experimental conditions. In every case, biomedical engineers, together with mathematicians, physicists, and physicians, are called on to process the signals related to brain activity in order to extract meaningful and robust information to correlate with the external behavior of the subjects. In this effort, different signal processing tools used in telecommunications and other fields of engineering, or even the social sciences, have been adapted and re-used in the neuroscience field. The present book offers a short presentation of several methods for the estimation of the cortical connectivity of the human brain. The methods presented here are relatively simple to implement and robust, and can return valuable information about the causality of the activation of the different cortical areas in humans using noninvasive electroencephalographic recordings. The knowledge of such signal processing tools will enrich the arsenal of computational methods that an engineer or a mathematician can apply to the processing of brain signals. Table of Contents: Introduction / Estimation of the Effective Connectivity from Stationary Data by Structural Equation Modeling / Estimation of the Functional Connectivity from Stationary Data by Multivariate Autoregressive Methods / Estimation of Cortical Activity by the use of Realistic Head Modeling / Application: Estimation of Connectivity from Movement-Related Potentials / Application to High-Resolution EEG Recordings in a Cognitive Task (Stroop Test) / Application to Data Related to the Intention of Limb Movements in Normal Subjects and in a Spinal Cord Injured Patient / The Instantaneous Estimation of the Time-Varying Cortical Connectivity by Adaptive Multivariate Estimators / Time-Varying Connectivity from Event-Related Potentials
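
    As a rough illustration of the multivariate autoregressive (MVAR) modeling mentioned in the table of contents, the sketch below fits a first-order MVAR model to multichannel data by least squares. The synthetic data and channel count are arbitrary assumptions; real connectivity estimators build on such fits with considerably more care (model-order selection, significance testing, realistic head models).

```python
import numpy as np

# Minimal MVAR(1) fit: X[:, t] ≈ A @ X[:, t-1] + noise.
# Synthetic 3-channel data; channel 1 is driven by channel 0 to mimic a directed coupling.
rng = np.random.default_rng(0)
T = 2000
A_true = np.array([[0.5, 0.0, 0.0],
                   [0.4, 0.3, 0.0],
                   [0.0, 0.0, 0.2]])
X = np.zeros((3, T))
for t in range(1, T):
    X[:, t] = A_true @ X[:, t - 1] + rng.normal(0, 1, 3)

# Least-squares estimate of the coefficient matrix from the data alone.
Z, Y = X[:, :-1], X[:, 1:]
A_hat = Y @ Z.T @ np.linalg.inv(Z @ Z.T)
print(np.round(A_hat, 2))  # large off-diagonal entries suggest directed influence between channels
```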

  • Data Cleaning: A Practical Perspective

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Data warehouses consolidate various activities of a business and often form the backbone for generating reports that support important business decisions. Errors in data tend to creep in for a variety of reasons. Some of these reasons include errors during input data collection and errors while merging data collected independently across different databases. These errors in data warehouses often result in erroneous upstream reports, and could impact business decisions negatively. Therefore, one of the critical challenges while maintaining large data warehouses is that of ensuring that the quality of data in the data warehouse remains high. The process of maintaining high data quality is commonly referred to as data cleaning. In this book, we first discuss the goals of data cleaning. Often, the goals of data cleaning are not well defined and could mean different solutions in different scenarios. Toward clarifying these goals, we abstract out a common set of data cleaning tasks that often need to be addressed. This abstraction allows us to develop solutions for these common data cleaning tasks. We then discuss a few popular approaches for developing such solutions. In particular, we focus on an operator-centric approach for developing a data cleaning platform. The operator-centric approach involves the development of customizable operators that could be used as building blocks for developing common solutions. This is similar to the approach of relational algebra for query processing. The basic set of operators can be put together to build complex queries. Finally, we discuss the development of custom scripts which leverage the basic data cleaning operators along with relational operators to implement effective solutions for data cleaning tasks.
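
    To make the operator-centric idea concrete, here is a small, hypothetical sketch of composable cleaning operators chained into a pipeline. The operator names and the toy records are illustrative only and are not the operators defined in the book.

```python
# Tiny composable cleaning operators; each takes and returns a list of records (dicts).
def trim_whitespace(records, field):
    return [{**r, field: r[field].strip()} for r in records]

def standardize_case(records, field):
    return [{**r, field: r[field].title()} for r in records]

def dedupe(records, key):
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def pipeline(records, *steps):
    # Apply each operator in order, like composing relational operators into a query plan.
    for step in steps:
        records = step(records)
    return records

rows = [{"name": "  alice smith "}, {"name": "Alice Smith"}, {"name": "bob jones"}]
clean = pipeline(rows,
                 lambda rs: trim_whitespace(rs, "name"),
                 lambda rs: standardize_case(rs, "name"),
                 lambda rs: dedupe(rs, "name"))
print(clean)  # [{'name': 'Alice Smith'}, {'name': 'Bob Jones'}]
```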

  • Image Understanding Using Sparse Representations

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Image understanding has been playing an increasingly crucial role in several inverse problems and computer vision. Sparse models form an important component in image understanding, since they emulate the activity of neural receptors in the primary visual cortex of the human brain. Sparse methods have been utilized in several learning problems because of their ability to provide parsimonious, interpretable, and efficient models. Exploiting the sparsity of natural signals has led to advances in several application areas including image compression, denoising, inpainting, compressed sensing, blind source separation, super-resolution, and classification. The primary goal of this book is to present the theory and algorithmic considerations in using sparse models for image understanding and computer vision applications. To this end, algorithms for obtaining sparse representations and their performance guarantees are discussed in the initial chapters. Furthermore, approaches for designing ov rcomplete, data-adapted dictionaries to model natural images are described. The development of theory behind dictionary learning involves exploring its connection to unsupervised clustering and analyzing its generalization characteristics using principles from statistical learning theory. An exciting application area that has benefited extensively from the theory of sparse representations is compressed sensing of image and video data. Theory and algorithms pertinent to measurement design, recovery, and model-based compressed sensing are presented. The paradigm of sparse models, when suitably integrated with powerful machine learning frameworks, can lead to advances in computer vision applications such as object recognition, clustering, segmentation, and activity recognition. Frameworks that enhance the performance of sparse models in such applications by imposing constraints based on the prior discriminatory information and the underlying geometrical structure, and kernelizing the spa se coding and dictionary learning methods are presented. In addition to presenting theoretical fundamentals in sparse learning, this book provides a platform for interested readers to explore the vastly growing application domains of sparse representations. View full abstract»
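
    As a small illustration of computing a sparse representation, the following sketch runs the iterative soft-thresholding algorithm (ISTA) for the l1-regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. The random dictionary, sparse signal, and regularization weight are arbitrary assumptions, and ISTA is only one of several algorithms a book like this would cover.

```python
import numpy as np

def ista(A, b, lam, iters=500):
    """Iterative soft-thresholding for 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1/L with L the Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - step * A.T @ (A @ x - b)            # gradient step on the smooth term
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft-threshold (proximal step)
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 200))                      # overcomplete "dictionary"
x_true = np.zeros(200); x_true[[3, 70, 150]] = [1.5, -2.0, 1.0]
b = A @ x_true + 0.01 * rng.normal(size=50)
x_hat = ista(A, b, lam=0.1)
print(np.nonzero(np.abs(x_hat) > 0.1)[0])           # should roughly recover the true support
```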

  • Constructing Knowledge Art: An Experiential Perspective on Crafting Participatory Representations

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book is about how people (we refer to them as practitioners) can help guide participants in creating representations of issues or ideas, such as collaborative diagrams, especially in the context of Participatory Design (PD). At their best, such representations can reach a very high level of expressiveness and usefulness, an ideal we refer to as Knowledge Art. Achieving that level requires effective engagement, often aided by facilitators or other practitioners. Most PD research focuses on tools and methods, or on participant experience. The next source of advantage is to better illuminate the role of practitioners: the people working with participants, tools, and methods in service of a project’s larger goals. Just like participants, practitioners experience challenges, interactions, and setbacks, and come up with creative ways to address them while maintaining their stance of service to participants and stakeholders. Our research interest is in understanding what moves and choices practitioners make that either help or hinder participants’ engagement with representations. We present a theoretical framework that looks at these choices from the experiential perspectives of narrative, aesthetics, ethics, sensemaking, and improvisation, and apply it to five diverse case studies of actual practice. Table of Contents: Acknowledgments / Introduction / Participatory Design and Representational Practice / Dimensions of Knowledge Art / Case Studies / Discussion and Conclusions / Appendix: Knowledge Art Analytics / Bibliography / Author Biographies

  • From Tool to Partner: The Evolution of Human-Computer Interaction

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This is the first comprehensive history of human-computer interaction (HCI). Whether you are a user experience professional or an academic researcher, whether you identify with computer science, human factors, information systems, information science, design, or communication, you can discover how your experiences fit into the expanding field of HCI. You can determine where to look for relevant information in other fields—and where you won’t find it. This book describes the different fields that have participated in improving our digital tools. It is organized chronologically, describing major developments across fields in each period. Computer use has changed radically, but many underlying forces are constant. Technology has changed rapidly, human nature very little. An irresistible force meets an immovable object. The exponential rate of technological change gives us little time to react before technology moves on. Patterns and trajectories described in this book provide your best chance to anticipate what could come next. We have reached a turning point. Tools that we built for ourselves to use are increasingly influencing how we use them, in ways that are planned and sometimes unplanned. The book ends with issues worthy of consideration as we explore the new world that we and our digital partners are shaping.

  • Biomedical Image Analysis: Tracking

    Copyright Year: 2006

    Morgan and Claypool eBooks

    In biological and medical imaging applications, tracking objects in motion is a critical task. This book describes the state-of-the-art in biomedical tracking techniques. We begin by detailing methods for tracking using active contours, which have been highly successful in biomedical applications. The book next covers the major probabilistic methods for tracking. Starting with the basic Bayesian model, we describe the Kalman filter and conventional tracking methods that use centroid and correlation measurements for target detection. Innovations such as the extended Kalman filter and the interacting multiple model open the door to capturing complex biological objects in motion. A salient highlight of the book is the introduction of the recently emerged particle filter, which promises to solve tracking problems that were previously intractable by conventional means. Another unique feature of Biomedical Image Analysis: Tracking is the explanation of shape-based methods for biomedical image analysis. Methods for both rigid and nonrigid objects are depicted. Each chapter in the book puts forth biomedical case studies that illustrate the methods in action.
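
    Since the Kalman filter is central to the probabilistic tracking methods surveyed above, here is a minimal constant-velocity Kalman filter in one spatial dimension. The noise levels and the synthetic trajectory are illustrative assumptions rather than a biomedical dataset.

```python
import numpy as np

# Constant-velocity Kalman filter tracking a 1-D position from noisy measurements.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (position, velocity)
H = np.array([[1.0, 0.0]])               # we only measure position
Q = 0.01 * np.eye(2)                     # process noise covariance (assumed)
R = np.array([[1.0]])                    # measurement noise covariance (assumed)

x = np.array([0.0, 0.0])                 # initial state estimate
P = np.eye(2)                            # initial estimate covariance

rng = np.random.default_rng(0)
true_pos = np.cumsum(np.full(50, 0.5))              # object moving at 0.5 units/frame
measurements = true_pos + rng.normal(0, 1.0, 50)    # noisy detections

for z in measurements:
    x, P = F @ x, F @ P @ F.T + Q                   # predict
    y = z - H @ x                                   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P                     # update

print(x)  # filtered position and velocity estimate after the last frame
```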

  • Algorithms for Reinforcement Learning

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Reinforcement learning is a learning paradigm concerned with learning to control a system so as to maximize a numerical performance measure that expresses a long-term objective. What distinguishes reinforcement learning from supervised learning is that only partial feedback is given to the learner about the learner's predictions. Further, the predictions may have long-term effects through influencing the future state of the controlled system. Thus, time plays a special role. The goal in reinforcement learning is to develop efficient learning algorithms, as well as to understand the algorithms' merits and limitations. Reinforcement learning is of great interest because of the large number of practical applications that it can be used to address, ranging from problems in artificial intelligence to operations research or control engineering. In this book, we focus on those algorithms of reinforcement learning that build on the powerful theory of dynamic programming. We give a fairly comprehensive catalog of learning problems, describe the core ideas, note a large number of state-of-the-art algorithms, and discuss their theoretical properties and limitations. Table of Contents: Markov Decision Processes / Value Prediction Problems / Control / For Further Exploration
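
    To ground the dynamic-programming viewpoint described above, the sketch below runs value iteration on a tiny, made-up two-state Markov decision process; the transition probabilities, rewards, and discount factor are arbitrary assumptions.

```python
# Value iteration on a toy MDP with states {0, 1} and actions {0, 1}.
# P[s][a] is a list of (probability, next_state, reward) tuples (illustrative numbers).
P = {
    0: {0: [(0.9, 0, 0.0), (0.1, 1, 1.0)],
        1: [(0.2, 0, 0.0), (0.8, 1, 1.0)]},
    1: {0: [(1.0, 1, 2.0)],
        1: [(0.5, 0, 0.0), (0.5, 1, 2.0)]},
}
gamma = 0.9
V = {s: 0.0 for s in P}

# Repeatedly apply the Bellman optimality operator until (approximate) convergence.
for _ in range(200):
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a]) for a in P[s])
         for s in P}

# Greedy policy with respect to the converged value function.
policy = {s: max(P[s], key=lambda a: sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a]))
          for s in P}
print(V, policy)
```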

  • The Datacenter as a Computer: An Introduction to the Design of Warehouse-Scale Machines (freely available from IEEE)

    Copyright Year: 2009

    Morgan and Claypool eBooks

    As computation continues to move into the cloud, the computing platform of interest no longer resembles a pizza box or a refrigerator, but a warehouse full of computers. These new large datacenters are quite different from traditional hosting facilities of earlier times and cannot be viewed simply as a collection of co-located servers. Large portions of the hardware and software resources in these facilities must work in concert to efficiently deliver good levels of Internet service performance, something that can only be achieved by a holistic approach to their design and deployment. In other words, we must treat the datacenter itself as one massive warehouse-scale computer (WSC). We describe the architecture of WSCs, the main factors influencing their design, operation, and cost structure, and the characteristics of their software base. We hope it will be useful to architects and programmers of today's WSCs, as well as those of future many-core platforms which may one day implement the equivalent of today's WSCs on a single board. Table of Contents: Introduction / Workloads and Software Infrastructure / Hardware Building Blocks / Datacenter Basics / Energy and Power Efficiency / Modeling Costs / Dealing with Failures and Repairs / Closing Remarks

  • Chaotic Maps: Dynamics, Fractals, and Rapid Fluctuations

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book consists of lecture notes for a semester-long introductory graduate course on dynamical systems and chaos taught by the authors at Texas A&M University and Zhongshan University, China. There are ten chapters in the main body of the book, covering an elementary theory of chaotic maps in finite-dimensional spaces. The topics include one-dimensional dynamical systems (interval maps), bifurcations, general topological, symbolic dynamical systems, fractals, and a class of infinite-dimensional dynamical systems which are induced by interval maps, plus rapid fluctuations of chaotic maps as a new viewpoint developed by the authors in recent years. Two appendices are also provided in order to ease the transition for the readership from discrete-time dynamical systems to continuous-time dynamical systems, governed by ordinary and partial differential equations. Table of Contents: Simple Interval Maps and Their Iterations / Total Variations of Iterates of Maps / Ordering among Periods: The Sharkovski Theorem / Bifurcation Theorems for Maps / Homoclinicity. Lyapunoff Exponents / Symbolic Dynamics, Conjugacy and Shift Invariant Sets / The Smale Horseshoe / Fractals / Rapid Fluctuations of Chaotic Maps on R^N / Infinite-dimensional Systems Induced by Continuous-Time Difference Equations
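
    As a concrete taste of the interval maps discussed above, the sketch below iterates the logistic map x_{n+1} = r*x_n*(1 - x_n) from two nearby initial points. The parameter value r = 4 (a standard chaotic regime) and the initial conditions are illustrative choices, not examples from the lecture notes.

```python
# Iterating the logistic map to illustrate sensitive dependence on initial conditions.
def logistic_orbit(x0, r=4.0, n=30):
    orbit = [x0]
    for _ in range(n):
        orbit.append(r * orbit[-1] * (1.0 - orbit[-1]))
    return orbit

a = logistic_orbit(0.200000)
b = logistic_orbit(0.200001)   # a nearby initial condition
for i in (0, 10, 20, 30):
    print(i, round(a[i], 6), round(b[i], 6), round(abs(a[i] - b[i]), 6))
# The two orbits agree early on but diverge markedly after a few dozen iterations.
```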

  • Robot Learning from Human Teachers

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Learning from Demonstration (LfD) explores techniques for learning a task policy from examples provided by a human teacher. The field of LfD has grown into an extensive body of literature over the past 30 years, with a wide variety of approaches for encoding human demonstrations and modeling skills and tasks. Additionally, we have recently seen a focus on gathering data from non-expert human teachers (i.e., domain experts but not robotics experts). In this book, we provide an introduction to the field with a focus on the unique technical challenges associated with designing robots that learn from naive human teachers. We begin, in the introduction, with a unification of the various terminology seen in the literature as well as an outline of the design choices one has in designing an LfD system. Chapter 2 gives a brief survey of the psychology literature that provides insights from human social learning that are relevant to designing robotic social learners. Chapter 3 walks through an LfD interaction, surveying the design choices one makes and state-of-the-art approaches in prior work. First is the choice of input: how the human teacher interacts with the robot to provide demonstrations. Next is the choice of modeling technique. Currently, there is a dichotomy in the field between approaches that model low-level motor skills and those that model high-level tasks composed of primitive actions. We devote a chapter to each of these. Chapter 7 is devoted to interactive and active learning approaches that allow the robot to refine an existing task model. And finally, Chapter 8 provides best practices for evaluation of LfD systems, with a focus on how to approach experiments with human subjects in this domain.

  • Engineers for Korea

    Copyright Year: 2013

    Morgan and Claypool eBooks

    “The engineer is bearer of the nation’s industrialization,” says the tower pictured on the front cover. President Park Chung-hee (1917-1979) was seeking to scale up a unified national identity through industrialization, with engineers as iconic leaders. But Park encountered huge obstacles in what he called the “second economy” of mental nationalism. Technical workers had long been subordinate to classically-trained scholar officials. Even as the country became an industrial powerhouse, the makers of engineers never found approaches to techno-national formation—engineering education and training—that Koreans would wholly embrace. This book follows the fraught attempts of engineers to identify with Korea as a whole. It is for engineers, both Korean and non-Korean, who seek to become better critical analysts of their own expertise, identities, and commitments. It is for non-engineers who encounter or are affected by Korean engineers and engineering, and want to understand and engage them. It is for researchers who serve as critical participants in the making of engineers and puzzle over the contents and effects of techno-national formation.

  • Computer-aided Detection of Architectural Distortion in Prior Mammograms of Interval Cancer

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Architectural distortion is an important and early sign of breast cancer, but because of its subtlety, it is a common cause of false-negative findings on screening mammograms. Screening mammograms obtained prior to the detection of cancer could contain subtle signs of early stages of breast cancer, in particular, architectural distortion. This book presents image processing and pattern recognition techniques to detect architectural distortion in prior mammograms of interval-cancer cases. The methods are based upon Gabor filters, phase portrait analysis, procedures for the analysis of the angular spread of power, fractal analysis, Laws' texture energy measures derived from geometrically transformed regions of interest (ROIs), and Haralick's texture features. With Gabor filters and phase-portrait analysis, 4,224 ROIs were automatically obtained from 106 prior mammograms of 56 interval-cancer cases, including 301 true-positive ROIs related to architectural distortion, and from 52 mammograms of 13 normal cases. For each ROI, the fractal dimension, the entropy of the angular spread of power, 10 Laws' texture energy measures, and Haralick's 14 texture features were computed. The areas under the receiver operating characteristic (ROC) curves obtained using the features selected by stepwise logistic regression and the leave-one-image-out method are 0.77 with the Bayesian classifier, 0.76 with Fisher linear discriminant analysis, and 0.79 with a neural network classifier. Free-response ROC analysis indicated sensitivities of 0.80 and 0.90 at 5.7 and 8.8 false positives (FPs) per image, respectively, with the Bayesian classifier and the leave-one-image-out method. The present study has demonstrated the ability to detect early signs of breast cancer 15 months ahead of the time of clinical diagnosis, on average, for interval-cancer cases, with a sensitivity of 0.8 at 5.7 FP/image. The presented computer-aided detection techniques, dedicated to accurate detection and localization of architectural distortion, could lead to efficient detection of early and subtle signs of breast cancer at pre-mass-formation stages. Table of Contents: Introduction / Detection of Early Signs of Breast Cancer / Detection and Analysis of Oriented Patterns / Detection of Potential Sites of Architectural Distortion / Experimental Set Up and Datasets / Feature Selection and Pattern Classification / Analysis of Oriented Patterns Related to Architectural Distortion / Detection of Architectural Distortion in Prior Mammograms / Concluding Remarks
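
    Gabor filters are the first stage of the oriented-pattern analysis described above. The sketch below builds a single real Gabor kernel and a small bank of orientations; the wavelength, orientation, and aspect parameters are made-up illustrative values, not the filter bank design used in the book.

```python
import numpy as np

def gabor_kernel(size=31, wavelength=8.0, theta=0.0, sigma=4.0, gamma=0.5):
    """Real-valued Gabor kernel: an oriented sinusoid under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_r = x * np.cos(theta) + y * np.sin(theta)      # rotate coordinates by theta
    y_r = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_r**2 + (gamma * y_r)**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_r / wavelength)
    return envelope * carrier

# A small bank of orientations, as one would convolve with an image to detect oriented structures.
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 8, endpoint=False)]
print(bank[0].shape, len(bank))   # (31, 31) 8
```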

  • Information Theory Tools for Computer Graphics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Information theory (IT) tools, widely used in scientific fields such as engineering, physics, genetics, neuroscience, and many others, are also emerging as useful transversal tools in computer graphics. In this book, we present the basic concepts of IT and how they have been applied to the graphics areas of radiosity, adaptive ray-tracing, shape descriptors, viewpoint selection and saliency, scientific visualization, and geometry simplification. Some of the approaches presented, such as the viewpoint techniques, are now the state of the art in visualization. Almost all of the techniques presented in this book have been previously published in peer-reviewed conference proceedings or international journals. Here, we have stressed their common aspects and presented them in a unified way, so the reader can clearly see which problems IT tools can help solve, which specific tools to use, and how to apply them. A basic level of knowledge in computer graphics is required, but basic concepts in IT are presented. The intended audiences are both students and practitioners of the fields above and related areas in computer graphics. In addition, IT practitioners will learn about these applications. Table of Contents: Information Theory Basics / Scene Complexity and Refinement Criteria for Radiosity / Shape Descriptors / Refinement Criteria for Ray-Tracing / Viewpoint Selection and Mesh Saliency / View Selection in Scientific Visualization / Viewpoint-based Geometry Simplification

  • Learning to Rank for Information Retrieval and Natural Language Processing: Second Edition

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Learning to rank refers to machine learning techniques for training a model in a ranking task. Learning to rank is useful for many applications in information retrieval, natural language processing, and data mining. Intensive studies have been conducted on its problems recently, and significant progress has been made. This lecture gives an introduction to the area including the fundamental problems, major approaches, theories, applications, and future work. The author begins by showing that various ranking problems in information retrieval and natural language processing can be formalized as two basic ranking tasks, namely ranking creation (or simply ranking) and ranking aggregation. In ranking creation, given a request, one wants to generate a ranking list of offerings based on the features derived from the request and the offerings. In ranking aggregation, given a request, as well as a number of ranking lists of offerings, one wants to generate a new ranking list of the offerings. Ranking creation (or ranking) is the major problem in learning to rank. It is usually formalized as a supervised learning task. The author gives detailed explanations on learning for ranking creation and ranking aggregation, including training and testing, evaluation, feature creation, and major approaches. Many methods have been proposed for ranking creation. The methods can be categorized as the pointwise, pairwise, and listwise approaches according to the loss functions they employ. They can also be categorized according to the techniques they employ, such as the SVM based, Boosting based, and Neural Network based approaches. The author also introduces some popular learning to rank methods in detail. These include: PRank, OC SVM, McRank, Ranking SVM, IR SVM, GBRank, RankNet, ListNet & ListMLE, AdaRank, SVM MAP, SoftRank, LambdaRank, LambdaMART, Borda Count, Markov Chain, and CRanking. The author explains several example applications of learning to rank including web search, collaborative filtering, definition search, keyphrase extraction, query dependent summarization, and re-ranking in machine translation. A formulation of learning for ranking creation is given in the statistical learning framework. Ongoing and future research directions for learning to rank are also discussed. Table of Contents: Learning to Rank / Learning for Ranking Creation / Learning for Ranking Aggregation / Methods of Learning to Rank / Applications of Learning to Rank / Theory of Learning to Rank / Ongoing and Future Work
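
    To illustrate the pairwise approach mentioned above, the sketch below trains a linear scoring function with a hinge loss on preference pairs, the same idea that underlies Ranking SVM, here in a simplified, unregularized stochastic-gradient form. The feature vectors, labels, and margin threshold are synthetic assumptions, not an example from the lecture.

```python
import numpy as np

# Pairwise learning to rank: learn w so that score(x_i) > score(x_j)
# whenever document i is preferred to document j.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(200, 3))                       # synthetic document features
relevance = X @ w_true + 0.1 * rng.normal(size=200)

pairs = [(i, j) for i in range(200) for j in range(200)
         if relevance[i] > relevance[j] + 0.5][:2000]

w, lr = np.zeros(3), 0.01
for epoch in range(20):
    for i, j in pairs:
        margin = (X[i] - X[j]) @ w
        if margin < 1.0:                            # hinge loss is active for this pair
            w += lr * (X[i] - X[j])                 # push the preferred document above the other

# Pairwise accuracy on the training preferences (should be high).
acc = np.mean([(X[i] - X[j]) @ w > 0 for i, j in pairs])
print(round(acc, 3), np.round(w, 2))
```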

  • Particle Swarm Optimization: A Physics-Based Approach

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This work aims to provide a new introduction to the particle swarm optimization methods using a formal analogy with physical systems. By postulating that the swarm motion behaves similarly to both classical and quantum particles, we establish a direct connection between what are usually assumed to be separate fields of study, optimization and physics. Within this framework, it becomes quite natural to derive the recently introduced quantum PSO algorithm from the Hamiltonian or the Lagrangian of the dynamical system. The physical theory of the PSO is used to suggest some improvements in the algorithm itself, such as temperature acceleration techniques and the periodic boundary condition. At the end, we provide a panorama of applications demonstrating the power of the PSO, classical and quantum, in handling difficult engineering problems. The goal of this work is to provide a general multi-disciplinary view on various topics in physics, mathematics, and engineering by illustrating their interdependence within the unified framework of the swarm dynamics. Table of Contents: Introduction / The Classical Particle Swarm Optimization Method / Boundary Conditions for the PSO Method / The Quantum Particle Swarm Optimization / Bibliography / Index
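
    For reference, the classical PSO update discussed above takes the general shape sketched below. The objective function (a simple sphere), swarm size, and the inertia/acceleration coefficients are commonplace illustrative choices, not tuned values from the book.

```python
import numpy as np

def pso_minimize(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Classical particle swarm optimization with inertia weight w."""
    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, (n_particles, dim))        # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Velocity update: inertia + attraction to personal and global bests.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, f(gbest)

print(pso_minimize(lambda z: np.sum(z**2)))   # converges near the origin
```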

  • Geometric Programming for Design Equation Development and Cost/Profit Optimization: (with illustrative case study problems and solutions), Third Edition

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Geometric Programming is used for cost minimization, profit maximization, obtaining cost ratios, and the development of generalized design equations for the primal variables. The early pioneers of geometric programming—Zener, Duffin, Peterson, Beightler, Wilde, and Phillips—played important roles in its development. Five new case studies have been added to the third edition. There are five major sections: (1) Introduction, History and Theoretical Fundamentals; (2) Cost Minimization Applications with Zero Degrees of Difficulty; (3) Profit Maximization Applications with Zero Degrees of Difficulty; (4) Applications with Positive Degrees of Difficulty; and (5) Summary, Future Directions, and Geometric Programming Theses & Dissertations Titles. The various solution techniques presented are the constrained derivative approach, condensation of terms approach, dimensional analysis approach, and transformed dual approach. A primary goal of this work is to have readers develop more case studies and new solution techniques to further the application of geometric programming.
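
    As a worked illustration of a zero-degree-of-difficulty problem of the kind treated in the book (this specific example is a generic textbook-style one, not drawn from its case studies), consider minimizing a three-term posynomial in two variables; the dual weights then follow directly from the normality and orthogonality conditions.

```latex
% Minimize g(x,y) = 40 x^{-1} y^{-1} + 20 x + 10 y  (T = 3 terms, N = 2 variables,
% so degrees of difficulty = T - (N+1) = 0 and the dual weights are determined uniquely).
\begin{align*}
&\text{Normality: } \delta_1 + \delta_2 + \delta_3 = 1, \qquad
 \text{Orthogonality: } -\delta_1 + \delta_2 = 0,\; -\delta_1 + \delta_3 = 0
 \;\Rightarrow\; \delta_1 = \delta_2 = \delta_3 = \tfrac{1}{3},\\[4pt]
&g^{*} = \left(\frac{40}{1/3}\right)^{1/3}\left(\frac{20}{1/3}\right)^{1/3}
          \left(\frac{10}{1/3}\right)^{1/3} = (120\cdot 60\cdot 30)^{1/3} = 60,\\[4pt]
&20x = \delta_2\, g^{*} = 20 \Rightarrow x = 1, \qquad
 10y = \delta_3\, g^{*} = 20 \Rightarrow y = 2, \qquad
 40x^{-1}y^{-1} = 20 \;\checkmark
\end{align*}
```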

  • A Guide to Visual Multi-Level Interface Design From Synthesis of Empirical Study Evidence

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Displaying multiple levels of data visually has been proposed to address the challenge of limited screen space. Although many previous empirical studies have addressed different aspects of this question, the information visualization research community does not currently have a clearly articulated consensus on how, when, or even if displaying data at multiple levels is effective. To shed more light on this complex topic, we conducted a systematic review of 22 existing multi-level interface studies to extract high-level design guidelines. To facilitate discussion, we cast our analysis findings into a four-point decision tree: (1) When are multi-level displays useful? (2) What should the higher visual levels display? (3) Should the different visual levels be displayed simultaneously, or one at a time? (4) Should the visual levels be embedded in a single display, or separated into multiple displays? Our analysis resulted in three design guidelines: (1) the number of levels in display and data should match; (2) high visual levels should only display task-relevant information; (3) simultaneous display, rather than temporal switching, is suitable for tasks with multi-level answers. Table of Contents: Introduction / Terminology / Methodology / Summary of Studies / Decision 1: Single or Multi-level Interface? / Decision 2: How to Create the High-Level Displays? / Decision 3: Simultaneous or Temporal Displays of the Multiple Visual Levels / Decision 4: How to Spatially Arrange the Visual Levels, Embedded or Separate? / Limitations of Study / Design Recommendations / Discussion and Future Work

  • Quality of Service in Wireless Networks Over Unlicensed Spectrum

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This Synthesis Lecture presents a discussion of Quality of Service (QoS) in wireless networks over unlicensed spectrum. The topic is presented from the point of view of protocols for wireless networks (e.g., 802.11) rather than the physical layer point of view usually discussed for cellular networks in the licensed wireless spectrum. A large number of mobile multimedia wireless applications are being deployed over WiFi (IEEE 802.11) and Bluetooth wireless networks, and the number will increase in the future as more phones, tablets, and laptops are equipped with these unlicensed spectrum wireless interfaces. Achieving QoS objectives in wireless networks is challenging due to limited wireless resources, wireless node interference, wireless shared media, node mobility, and diverse topologies. The author presents the QoS problem as (1) an optimization problem with different constraints coming from the interference, mobility, and wireless resource constraints and (2) an algorithmic problem with fundamental algorithmic functions within wireless resource management and protocols. Table of Contents: Preface / Basics of Quality of Service in Wireless Networks / QoS-Aware Resource Allocation / Bandwidth Management / Delay Management / Routing / Acknowledgment / References / Author Biography

  • Multiculturalism and Information and Communication Technology

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Research on multiculturalism and information and communication technology (ICT) has been important to understanding recent history, planning for future large-scale initiatives, and understanding unrealized expectations for social and technological change. This interdisciplinary area of research has examined interactions between ICT and culture at the group and society levels. However, there is debate within the literature as to the nature of the relationship between culture and technology. In this synthesis, we suggest that the tensions result from the competing ideologies that drive researchers, allowing us to conceptualize the relationship between culture and ICT under three primary models, each with its own assumptions: 1) Social informatics, 2) Social determinism, and 3) Technological determinism. Social informatics views the relationship to be one of sociotechnical interaction, in which culture and ICTs affect each other mutually and iteratively, rather than linearly; the vast majority of the literature approaches the relationship between ICT and culture under the assumptions of social informatics. From a socially deterministic perspective, ICTs are viewed as the dependent variable in the equation, whereas, from a technologically deterministic perspective, ICTs are an independent variable. The issues of multiculturalism and ICTs have attracted much scholarly attention and have been explored in a myriad of contexts, with substantial literature on global development, social and political issues, business and public administration, as well as education and scholarly collaboration. We synthesize here research in the areas of global development, social and political issues, and business collaboration. Finally, we conclude by proposing under-explored areas for future research.

  • High-Speed Digital System Design

    Copyright Year: 2006

    Morgan and Claypool eBooks

    High-Speed Digital System Design bridges the gap from theory to implementation in the real world. Systems with clock speeds even in the low-megahertz range qualify as high-speed. Proper design results in quality digital transmissions and lowers the chance for errors. This book is for computer and electrical engineers who may or may not have learned electromagnetic theory. The presentation style allows readers to quickly begin designing their own high-speed systems and diagnosing existing designs for errors. After studying this book, readers will be able to: design the power distribution system for a printed circuit board to minimize noise; plan the layers of a PCB for signals, power, and ground to maximize signal quality and minimize noise; include test structures in the printed circuit board to easily diagnose manufacturing mistakes; choose the best PCB design parameters, such as trace width, height, and routed path, to ensure the most stable characteristic impedance; determine the correct termination to minimize reflections; predict the delay caused by a given PCB trace; minimize driver power consumption using AC terminations; compensate for discontinuities along a PCB trace; use pre-emphasis and equalization techniques to counteract lossy transmission lines; determine the amount of crosstalk between two traces; and diagnose existing PCBs to determine the sources of errors.

  • Computational Modeling of Narrative

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The field of narrative (or story) understanding and generation is one of the oldest in natural language processing (NLP) and artificial intelligence (AI), which is hardly surprising, since storytelling is such a fundamental and familiar intellectual and social activity. In recent years, the demands of interactive entertainment and interest in the creation of engaging narratives with life-like characters have provided a fresh impetus to this field. This book provides an overview of the principal problems, approaches, and challenges faced today in modeling the narrative structure of stories. The book introduces classical narratological concepts from literary theory and their mapping to computational approaches. It demonstrates how research in AI and NLP has modeled character goals, causality, and time using formalisms from planning, case-based reasoning, and temporal reasoning, and discusses fundamental limitations in such approaches. It proposes new representations for embedded narratives and fictional entities, for assessing the pace of a narrative, and offers an empirical theory of audience response. These notions are incorporated into an annotation scheme called NarrativeML. The book identifies key issues that need to be addressed, including annotation methods for long literary narratives, the representation of modality and habituality, and characterizing the goals of narrators. It also suggests a future characterized by advanced text mining of narrative structure from large-scale corpora and the development of a variety of useful authoring aids. This is the first book to provide a systematic foundation that integrates together narratology, AI, and computational linguistics. It can serve as a narratology primer for computer scientists and an elucidation of computational narratology for literary theorists. It is written in a highly accessible manner and is intended for use by a broad scientific audience that includes linguists (computational and formal semanticists), AI researchers, cognitive scientists, computer scientists, game developers, and narrative theorists. Table of Contents: List of Figures / List of Tables / Narratological Background / Characters as Intentional Agents / Time / Plot / Summary and Future Directions

  • Trading Agents

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Automated trading in electronic markets is one of the most common and consequential applications of autonomous software agents. Design of effective trading strategies requires thorough understanding of how market mechanisms operate, and appreciation of strategic issues that commonly manifest in trading scenarios. Drawing on research in auction theory and artificial intelligence, this book presents core principles of strategic reasoning that apply to market situations. The author illustrates trading strategy choices through examples of concrete market environments, such as eBay, as well as abstract market models defined by configurations of auctions and traders. Techniques for addressing these choices constitute essential building blocks for the design of trading strategies for rich market applications. The lecture assumes no prior background in game theory, auction theory, or artificial intelligence. Table of Contents: Introduction / Example: Bidding on eBay / Auction Fundamentals / Continuous Double Auctions / Interdependent Markets / Conclusion
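
    The eBay example mentioned above is commonly analyzed as proxy bidding, which resolves to a second-price-style outcome among the submitted maximum bids. The sketch below and its bid values are an illustrative toy, not the book's own treatment of the mechanism.

```python
# eBay-style proxy bidding: each bidder submits a maximum bid, the highest bidder wins,
# and the price settles one increment above the second-highest maximum (toy example).
def proxy_auction(max_bids, increment=0.50):
    ranked = sorted(max_bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, top = ranked[0]
    second = ranked[1][1] if len(ranked) > 1 else 0.0
    price = min(top, second + increment)
    return winner, price

bids = {"alice": 27.00, "bob": 31.50, "carol": 25.25}
print(proxy_auction(bids))   # ('bob', 27.5): bob pays just above alice's maximum
```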

  • Digital Image Processing for Ophthalmology: Detection and Modeling of Retinal Vascular Architecture

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The monitoring of the effects of retinopathy on the visual system can be assisted by analyzing the vascular architecture of the retina. This book presents methods based on Gabor filters to detect blood vessels in fundus images of the retina. Forty images of the retina from the Digital Retinal Images for Vessel Extraction (DRIVE) database were used to evaluate the performance of the methods. The results demonstrate high efficiency in the detection of blood vessels with an area under the receiver operating characteristic curve of 0.96. Monitoring the openness of the major temporal arcade (MTA) could facilitate improved diagnosis and optimized treatment of retinopathy. This book presents methods for the detection and modeling of the MTA, including the generalized Hough transform to detect parabolic forms. Results obtained with 40 images of the DRIVE database, compared with hand-drawn traces of the MTA, indicate a mean distance to the closest point of about 0.24 mm. This book illustrates applications of the methods mentioned above for the analysis of the effects of proliferative diabetic retinopathy and retinopathy of prematurity on retinal vascular architecture.

  • Adiabatic Quantum Computation and Quantum Annealing: Theory and Practice

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Adiabatic quantum computation (AQC) is an alternative to the better-known gate model of quantum computation. The two models are polynomially equivalent, but otherwise quite dissimilar: one property that distinguishes AQC from the gate model is its analog nature. Quantum annealing (QA) describes a type of heuristic search algorithm that can be implemented to run in the "native instruction set" of an AQC platform. D-Wave Systems Inc. manufactures quantum annealing processor chips that exploit quantum properties to realize QA computations in hardware. The chips form the centerpiece of a novel computing platform designed to solve NP-hard optimization problems. Starting with a 16-qubit prototype announced in 2007, the company has launched and sold increasingly larger models: the 128-qubit D-Wave One system was announced in 2010 and the 512-qubit D-Wave Two system arrived on the scene in 2013. A 1,000-qubit model is expected to be available in 2014. This monograph presents an introductory overview of this unusual and rapidly developing approach to computation. We start with a survey of basic principles of quantum computation and what is known about the AQC model and the QA algorithm paradigm. Next we review the D-Wave technology stack and discuss some challenges to building and using quantum computing systems at a commercial scale. The last chapter reviews some experimental efforts to understand the properties and capabilities of these unusual platforms. The discussion throughout is aimed at an audience of computer scientists with little background in quantum computation or in physics.
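
    As a loose classical analogue of the annealing-style search described above, the sketch below runs simulated annealing on a tiny random Ising energy function E(s) = -sum_{i<j} J_ij s_i s_j, the kind of objective that annealing hardware is designed to minimize. The problem size, couplings, and cooling schedule are arbitrary illustrative choices, and this classical heuristic is not the quantum algorithm itself.

```python
import numpy as np

# Simulated annealing on a small random Ising model (a classical stand-in for annealing-style search).
rng = np.random.default_rng(0)
n = 12
J = rng.normal(size=(n, n))
J = np.triu(J, 1); J = J + J.T                   # symmetric couplings, zero diagonal

def energy(s):
    return -0.5 * s @ J @ s                      # equals -sum_{i<j} J_ij s_i s_j

s = rng.choice([-1, 1], size=n)
best_s, best_e = s.copy(), energy(s)
for T in np.geomspace(2.0, 0.01, 5000):          # slowly lower the temperature
    i = rng.integers(n)
    delta = 2 * s[i] * (J[i] @ s)                # energy change from flipping spin i
    if delta < 0 or rng.random() < np.exp(-delta / T):
        s[i] = -s[i]
        e = energy(s)
        if e < best_e:
            best_s, best_e = s.copy(), e
print(best_e, best_s)
```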

  • Generalized Transmission Line Method to Study the Far-zone Radiation of Antennas Under a Multilayer Structure

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book gives a step-by-step presentation of a generalized transmission line method to study the far-zone radiation of antennas under a multilayer structure. Normally, a radiation problem requires a full-wave analysis, which may be time-consuming. The beauty of the generalized transmission line method is that it transforms the radiation problem for a specific type of structure, say the multilayer structure excited by an antenna, into a circuit problem that can be efficiently analyzed. Using the Reciprocity Theorem and far-field approximation, the method computes the far-zone radiation due to a Hertzian dipole within a multilayer structure by solving an equivalent transmission line circuit. Since an antenna can be modeled as a set of Hertzian dipoles, the method could be used to predict the far-zone radiation of an antenna under a multilayer structure. The analytical expression for the far-zone field is derived for a structure with or without a polarizer. The procedure of obtaining the Hertzian dipole model that is required by the generalized transmission line method is also described. Several examples are given to demonstrate the capabilities, accuracy, and efficiency of this method. Table of Contents: Antennas Under a Multilayer Dielectric Slab / Antennas Under a Polarized Multilayer Structure / Hertzian Dipole Model for an Antenna / Bibliography / Biography

  • Finite State Machine Datapath Design, Optimization, and Implementation

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Finite State Machine Datapath Design, Optimization, and Implementation explores the design space of combined FSM/Datapath implementations. The lecture starts by examining performance issues in digital systems such as clock skew and its effect on setup and hold time constraints, and the use of pipelining for increasing system clock frequency. This is followed by definitions for latency and throughput, with associated resource tradeoffs explored in detail through the use of dataflow graphs and scheduling tables applied to examples taken from digital signal processing applications. Also, design issues relating to functionality, interfacing, and performance for different types of memories commonly found in ASICs and FPGAs, such as FIFOs, single-ports, and dual-ports, are examined. Selected design examples are presented in implementation-neutral Verilog code and block diagrams, with associated design files available as downloads for both Altera Quartus and Xilinx Virtex FPGA platforms. A working knowledge of Verilog, logic synthesis, and basic digital design techniques is required. This lecture is suitable as a companion to the synthesis lecture titled Introduction to Logic Synthesis using Verilog HDL. Table of Contents: Calculating Maximum Clock Frequency / Improving Design Performance / Finite State Machine with Datapath (FSMD) Design / Embedded Memory Usage in Finite State Machine with Datapath (FSMD) Designs

  • Fieldwork for Healthcare: Guidance for Investigating Human Factors in Computing Systems

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Conducting fieldwork for investigating technology use in healthcare is a challenging undertaking, and yet there is little in the way of community support and guidance for conducting these studies. There is a need for better knowledge sharing and resources to facilitate learning. This is the second of two volumes designed as a collective graduate guidebook for conducting fieldwork in healthcare. This volume brings together thematic chapters that draw out issues and lessons learned from practical experience. Researchers who have first-hand experience of conducting healthcare fieldwork collaborated to write these chapters. This volume contains insights, tips, and tricks from studies in clinical and non-clinical environments, from hospital to home. This volume starts with an introduction to the ethics and governance procedures a researcher might encounter when conducting fieldwork in this sensitive study area. Subsequent chapters address specific aspects of conducting situated healthcare research. Chapters on readying the researcher and relationships in the medical domain break down some of the complex social aspects of this type of research. They are followed by chapters on the practicalities of collecting data and implementing interventions, which focus on domain-specific issues that may arise. Finally, we close the volume by discussing the management of impact in healthcare fieldwork. The guidance contained in these chapters enables new researchers to form their project plans and also their contingency plans in this complex and challenging domain. For more experienced researchers, it offers advice and support through familiar stories and experiences. For supervisors and teachers, it offers a source of reference and debate. Together with the first volume, Fieldwork for Healthcare: Case Studies Investigating Human Factors in Computing Systems, these books provide a substantive resource on how to conduct fieldwork in healthcare. Table of Contents: Preface / Acknowledgments / Ethics, Governance, and Patient and Public Involvement in Healthcare / Readying the Researcher for Fieldwork in Healthcare / Establishing and Maintaining Relationships in Healthcare Fields / Practicalities of Data Collection in Healthcare Fieldwork / Healthcare Intervention Studies “In the Wild” / Impact of Fieldwork in Healthcare: Understanding Impact on Researchers, Research, Practice, and Beyond / References / Biographies

  • Mismatch and Noise in Modern IC Processes

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Component variability, mismatch, and various noise effects are major contributors to design limitations in most modern IC processes. Mismatch and Noise in Modern IC Processes examines these related effects and how they affect the building block circuits of modern integrated circuits, from the perspective of a circuit designer. Variability usually refers to a large-scale variation that can occur on a wafer-to-wafer and lot-to-lot basis, and over long distances on a wafer. This phenomenon is well understood, and the effects of variability are included in most integrated circuit design with the use of corner or statistical component models. Mismatch, which is the emphasis of section I of the book, is a local level of variability that leaves the characteristics of adjacent transistors unmatched. This is of particular concern in certain analog and memory systems, but also has an effect on digital logic schemes, where uncertainty is introduced into delay times, which can reduce margins and introduce 'race' conditions. Noise is a dynamic effect that causes a local mismatch or variability that can vary during operation of a circuit, and is considered in section II. Noise can be the result of atomic effects in devices or circuit interactions, and both of these are discussed in terms of analog and digital circuitry. Table of Contents: Part I: Mismatch / Introduction / Variability and Mismatch in Digital Systems / Variability and Mismatch in Analog Systems I / Variability and Mismatch in Analog Systems II / Lifetime-Induced Variability / Mismatch in Nonconventional Processes / Mismatch Correction Circuits / Part II: Noise / Component and Digital Circuit Noise / Noise Effects in Digital Systems / Noise Effects in Analog Systems / Circuit Design to Minimize Noise Effects / Noise Considerations in SOI

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Radar Detection Schemes Under Mismatched Signal Models

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Adaptive detection of signals embedded in correlated Gaussian noise has been an active field of research over the last few decades. This topic is important in many areas of signal processing such as, just to give some examples, radar, sonar, communications, and hyperspectral imaging. Most of the existing adaptive algorithms have been designed following the lead of the derivation of Kelly's detector, which assumes perfect knowledge of the target steering vector. However, in realistic scenarios, mismatches are likely to occur due to both environmental and instrumental factors. When a mismatched signal is present in the data under test, conventional algorithms may suffer severe performance degradation. The presence of strong interferers in the cell under test makes the detection task even more challenging. An effective way to cope with this scenario relies on the use of "tunable" detectors, i.e., detectors capable of changing their directivity through the tuning of proper parameters. The aim of this book is to present some recent advances in the design of tunable detectors and the focus is on the so-called two-stage detectors, i.e., adaptive algorithms obtained by cascading two detectors with opposite behaviors. We derive exact closed-form expressions for the resulting probability of false alarm and the probability of detection for both matched and mismatched signals embedded in homogeneous Gaussian noise. It turns out that such solutions guarantee a wide operational range in terms of tunability while retaining, at the same time, an overall performance in the presence of matched signals commensurate with Kelly's detector. Table of Contents: Introduction / Adaptive Radar Detection of Targets / Adaptive Detection Schemes for Mismatched Signals / Enhanced Adaptive Sidelobe Blanking Algorithms / Conclusions View full abstract»
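
    To make the false-alarm probability mentioned above concrete, the sketch below estimates P_fa by Monte Carlo for a plain fixed-threshold energy detector in white Gaussian noise; this is a generic textbook illustration, not Kelly's detector or the two-stage schemes derived in the book, and the threshold and sample count are arbitrary assumptions.

      import numpy as np

      # Generic energy detector: declare a detection if the summed power of N
      # noise-only samples exceeds a threshold; estimate P_fa by simulation.
      rng = np.random.default_rng(1)
      N = 16               # samples integrated per test (assumed)
      threshold = 32.0     # detection threshold (assumed)
      trials = 200_000

      noise = rng.standard_normal((trials, N))     # unit-variance Gaussian noise
      test_statistic = (noise ** 2).sum(axis=1)    # chi-square with N degrees of freedom
      p_fa = float(np.mean(test_statistic > threshold))
      print(f"estimated P_fa = {p_fa:.4f}")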

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    PSpice for Digital Signal Processing

    Copyright Year: 2007

    Morgan and Claypool eBooks

    PSpice for Digital Signal Processing is the last in a series of five books using Cadence Orcad PSpice version 10.5 and introduces a novel approach to learning digital signal processing (DSP). DSP is traditionally taught using Matlab/Simulink software, but this approach has some inherent weaknesses for students, particularly at the introductory level. The ‘plug in variables and play’ nature of these software packages can lure the student into thinking they possess an understanding they don’t actually have, because these systems produce results quickly without revealing what is going on. However, it must be said that, for advanced-level work, Matlab/Simulink really excels. In this book we start by examining basic signals, starting with sampled signals and dealing with the concept of digital frequency. The delay part, which is the heart of DSP, is explained and applied initially to simple FIR and IIR filters. We examine linear time-invariant systems starting with the difference equation and applying the z-transform to produce a range of filter types, i.e., low-pass, high-pass, and bandpass. The important concept of convolution is examined, and here we demonstrate the usefulness of the 'log' command in Probe for giving the correct display to demonstrate the 'flip n slip' method. Digital oscillators, including quadrature carrier generation, are then examined. Several filter design methods are considered and include the bilinear transform, impulse invariant, and window techniques. Included also is a treatment of the raised-cosine family of filters. A range of DSP applications are then considered and include the Hilbert transform, a single sideband modulator using the Hilbert transform and quad oscillators, integrators, and differentiators. Decimation and interpolation are simulated to demonstrate the usefulness of the multi-sampling environment. Decimation is also applied in a treatment of digital receivers. Lastly, we look at some musical applications for DSP such as reverberation/echo using real-world signals imported into PSpice using the program Wav2Ascii. The zero-forcing equalizer is dealt with in a simplistic manner and illustrates the effectiveness of equalizing signals in a receiver after transmission. View full abstract»
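
    As a plain-code counterpart to the sampled-signal and difference-equation material summarized above (written in Python rather than PSpice purely for illustration; the sample rate, tone frequency, and coefficient are assumed values), the following sketch evaluates a first-order IIR low-pass filter y[n] = (1 - a)·x[n] + a·y[n-1] on a sampled sinusoid:

      import numpy as np

      # Illustrative first-order IIR low-pass: y[n] = (1 - a) * x[n] + a * y[n - 1].
      fs = 8000.0      # sample rate in Hz (assumed)
      f0 = 800.0       # tone frequency in Hz (assumed)
      a = 0.9          # feedback coefficient (assumed)
      n = np.arange(256)
      x = np.sin(2 * np.pi * f0 * n / fs)      # digital frequency f0/fs cycles per sample

      y = np.zeros_like(x)
      for i in range(len(x)):
          y[i] = (1 - a) * x[i] + a * (y[i - 1] if i > 0 else 0.0)

      print("input peak :", round(float(x.max()), 3))
      print("output peak:", round(float(y.max()), 3))   # attenuated by the low-pass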

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Boolean Differential Equations

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The Boolean Differential Calculus (BDC) is a very powerful theory that extends the structure of a Boolean Algebra significantly. Based on a small number of definitions, many theorems have been proven. The available operations have been efficiently implemented in several software packages. There is a very wide field of applications. While a Boolean Algebra is focused on values of logic functions, the BDC allows the evaluation of changes of function values. Such changes can be explored for pairs of function values as well as for whole subspaces. Due to the same basic data structures, the BDC can be applied to any task described by logic functions and equations together with the Boolean Algebra. The BDC can be widely used for the analysis, synthesis, and testing of digital circuits. Generally speaking, a Boolean differential equation (BDE) is an equation in which elements of the BDC appear. It includes variables, functions, and derivative operations of these functions. The solution of such a BDE is a set of Boolean functions. This is a significant extension of Boolean equations, which have sets of Boolean vectors as solutions. In the simplest BDE a derivative operation of the BDC on the left-hand side is equal to a logic function on the right-hand side. The solution of such a simple BDE means to execute an operation which is inverse to the given derivative. BDEs can be applied in the same fields as the BDC, however, their possibility to express sets of Boolean functions extends the application field significantly. View full abstract»
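
    A minimal worked instance of the simplest case described above may help fix the notation (standard definitions; the particular functions below are chosen for illustration and are not an example from the book). The simple derivative of f with respect to x_1 is

      \frac{\partial f(x_1, x_2)}{\partial x_1} \;=\; f(0, x_2) \oplus f(1, x_2),

    so the Boolean differential equation

      \frac{\partial f(x_1, x_2)}{\partial x_1} \;=\; x_2

    asks for every function whose value changes under a flip of x_1 exactly when x_2 = 1. Its solution is the set of four functions
    \{\, x_1 x_2,\;\; \overline{x_1 x_2},\;\; \overline{x}_1 x_2,\;\; x_1 \lor \overline{x}_2 \,\}, i.e., a set of Boolean functions rather than a set of Boolean vectors.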

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multiresolution Frequency Domain Technique for Electromagnetics

    Copyright Year: 2012

    Morgan and Claypool eBooks

    In this book, a general frequency domain numerical method similar to the finite difference frequency domain (FDFD) technique is presented. The proposed method, called the multiresolution frequency domain (MRFD) technique, is based on orthogonal Battle-Lemarie and biorthogonal Cohen-Daubechies-Feauveau (CDF) wavelets. The objective of developing this new technique is to achieve a frequency domain scheme which exhibits improved computational efficiency figures compared to the traditional FDFD method: reduced memory and simulation time requirements while retaining numerical accuracy. The newly introduced MRFD scheme is successfully applied to the analysis of a number of electromagnetic problems, such as computation of resonance frequencies of one- and three-dimensional resonators, analysis of propagation characteristics of general guided wave structures, and electromagnetic scattering from two-dimensional dielectric objects. The efficiency characteristics of MRFD techniques based on different wavelets are compared to each other and that of the FDFD method. Results indicate that the MRFD techniques provide substantial savings in terms of execution time and memory requirements, compared to the traditional FDFD method. Table of Contents: Introduction / Basics of the Finite Difference Method and Multiresolution Analysis / Formulation of the Multiresolution Frequency Domain Schemes / Application of MRFD Formulation to Closed Space Structures / Application of MRFD Formulation to Open Space Structures / A Multiresolution Frequency Domain Formulation for Inhomogeneous Media / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Continuum Biomechanics

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book is concerned with the study of continuum mechanics applied to biological systems, i.e., continuum biomechanics. This vast and exciting subject allows description of when a bone may fracture due to excessive loading, how blood behaves as both a solid and fluid, down to how cells respond to mechanical forces that lead to changes in their behavior, a process known as mechanotransduction. We have written for senior undergraduate students and first year graduate students in mechanical or biomedical engineering, but individuals working at biotechnology companies that deal in biomaterials or biomechanics should also find the information presented relevant and easily accessible. Table of Contents: Tensor Calculus / Kinematics of a Continuum / Stress / Elasticity / Fluids / Blood and Circulation / Viscoelasticity / Poroelasticity and Thermoelasticity / Biphasic Theory View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Landmarking and Segmentation of 3D CT Images

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Segmentation and landmarking of computed tomographic (CT) images of pediatric patients are important and useful in computer-aided diagnosis (CAD), treatment planning, and objective analysis of normal as well as pathological regions. Identification and segmentation of organs and tissues in the presence of tumors are difficult. Automatic segmentation of the primary tumor mass in neuroblastoma could facilitate reproducible and objective analysis of the tumor's tissue composition, shape, and size. However, due to the heterogeneous tissue composition of the neuroblastic tumor, ranging from low-attenuation necrosis to high-attenuation calcification, segmentation of the tumor mass is a challenging problem. In this context, methods are described in this book for identification and segmentation of several abdominal and thoracic landmarks to assist in the segmentation of neuroblastic tumors in pediatric CT images. Methods to identify and segment automatically the peripheral artifacts and tissues, the rib structure, the vertebral column, the spinal canal, the diaphragm, and the pelvic surface are described. Techniques are also presented to evaluate quantitatively the results of segmentation of the vertebral column, the spinal canal, the diaphragm, and the pelvic girdle by comparing with the results of independent manual segmentation performed by a radiologist. The use of the landmarks and removal of several tissues and organs are shown to assist in limiting the scope of the tumor segmentation process to the abdomen, to lead to the reduction of the false-positive error, and to improve the result of segmentation of neuroblastic tumors. Table of Contents: Introduction to Medical Image Analysis / Image Segmentation / Experimental Design and Database / Ribs, Vertebral Column, and Spinal Canal / Delineation of the Diaphragm / Delineation of the Pelvic Girdle / Application of Landmarking / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Speech Models

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Speech dynamics refer to the temporal characteristics in all stages of the human speech communication process. This speech “chain” starts with the formation of a linguistic message in a speaker's brain and ends with the arrival of the message in a listener's brain. Given the intricacy of the dynamic speech process and its fundamental importance in human communication, this monograph is intended to provide comprehensive material on mathematical models of speech dynamics and to address the following issues: How do we make sense of the complex speech process in terms of its functional role of speech communication? How do we quantify the special role of speech timing? How do the dynamics relate to the variability of speech that has often been said to seriously hamper automatic speech recognition? How do we put the dynamic process of speech into a quantitative form to enable detailed analyses? And finally, how can we incorporate the knowledge of speech dynamics into computerized speech analysis and recognition algorithms? The answers to all these questions require building and applying computational models for the dynamic speech process. What are the compelling reasons for carrying out dynamic speech modeling? We provide the answer in two related aspects. First, scientific inquiry into the human speech code has been relentlessly pursued for several decades. As an essential carrier of human intelligence and knowledge, speech is the most natural form of human communication. Embedded in the speech code are linguistic (as well as para-linguistic) messages, which are conveyed through four levels of the speech chain. Underlying the robust encoding and transmission of the linguistic messages are the speech dynamics at all the four levels. Mathematical modeling of speech dynamics provides an effective tool in the scientific methods of studying the speech chain. Such scientific studies help understand why humans speak as they do and how humans exploit redundancy and variability by way of multitiered dynamic processes to enhance the efficiency and effectiveness of human speech communication. Second, advancement of human language technology, especially that in automatic recognition of natural-style human speech, is also expected to benefit from comprehensive computational modeling of speech dynamics. The limitations of current speech recognition technology are serious and are well known. A commonly acknowledged and frequently discussed weakness of the statistical model underlying current speech recognition technology is the lack of adequate dynamic modeling schemes to provide correlation structure across the temporal speech observation sequence. Unfortunately, due to a variety of reasons, the majority of current research activities in this area favor only incremental modifications and improvements to the existing HMM-based state-of-the-art. For example, while dynamic and correlation modeling is known to be an important topic, most of the systems nevertheless employ only an ultra-weak form of speech dynamics, e.g., differential or delta parameters. Strong-form dynamic speech modeling, which is the focus of this monograph, may serve as an ultimate solution to this problem. After the introduction chapter, the main body of this monograph consists of four chapters. They cover various aspects of theory, algorithms, and applications of dynamic speech models, and provide a comprehensive survey of the research work in this area spanning the past 20 years. This monograph is intended as advanced material on speech and signal processing for graduate-level teaching, for professionals and engineering practitioners, as well as for seasoned researchers and engineers specialized in speech processing. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Despeckle Filtering for Ultrasound Imaging and Video, Volume I: Algorithms and Software, Second Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    It is well known that speckle is a multiplicative noise that degrades image and video quality and the visual expert's evaluation in ultrasound imaging and video. This motivates the need for robust image and video despeckling techniques for both routine clinical practice and tele-consultation. The goal of this book (the first of two volumes) is to introduce the problem of speckle occurring in ultrasound image and video as well as the theoretical background (equations), the algorithmic steps, and the MATLAB™ code for the following group of despeckle filters: linear filtering, nonlinear filtering, anisotropic diffusion filtering, and wavelet filtering. This book proposes a comparative evaluation framework of these despeckle filters based on texture analysis, image quality evaluation metrics, and visual evaluation by medical experts. Despeckle noise reduction through the application of these filters will improve the visual observation quality or it may be used as a pre-processing step for further automated analysis, such as image and video segmentation, and texture characterization in ultrasound cardiovascular imaging, as well as in bandwidth reduction in ultrasound video transmission for telemedicine applications. The aforementioned topics will be covered in detail in the companion book to this one. Furthermore, in order to facilitate further applications we have developed in MATLAB™ two different toolboxes that integrate image (IDF) and video (VDF) despeckle filtering, texture analysis, and image and video quality evaluation metrics. The code for these toolsets is open source and these are available to download complementary to the two books. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Design and Development of RFID and RFID-Enabled Sensors on Flexible Low Cost Substrates

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book presents a step-by-step discussion of the design and development of radio frequency identification (RFID) and RFID-enabled sensors on flexible low cost substrates for UHF frequency bands. Various examples of fully functional building blocks (design and fabrication of antennas, integration with ICs and microcontrollers, power sources, as well as inkjet-printing techniques) demonstrate the revolutionary effect of this approach in low cost RFID and RFID-enabled sensors fields. This approach could be easily extended to other microwave and wireless applications as well. The first chapter describes the basic functionality and the physical and IT-related principles underlying RFID and sensors technology. Chapter two explains in detail inkjet-printing technology, providing the characterization of the conductive ink, which consists of nano-silver-particles, while highlighting the importance of this technology as a fast and simple fabrication technique especially on flexible organic substrates such as Liquid Crystal Polymer (LCP) or paper-based substrates. Chapter three demonstrates several compact inkjet-printed UHF RFID antennas using antenna matching techniques to match IC's complex impedance as prototypes to provide the proof of concept of this technology. Chapter four discusses the benefits of using conformal magnetic material as a substrate for miniaturized high-frequency circuit applications. In addition, in Chapter five, the authors also touch upon the state-of-the-art area of fully-integrated wireless sensor modules on organic substrates and show the first ever 2D sensor integration with an RFID tag module on paper, as well as the possibility of 3D multilayer paper-based RF/microwave structures. Table of Contents: Radio Frequency Identification Introduction / Flexible Organic Low Cost Substrates / Benchmarking RFID Prototypes on Organic Substrates / Conformal Magnetic Composite RFID Tags / Inkjet-Printed RFID-Enabled Sensors View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    DSP for MATLAB™ and LabVIEW™ III: Digital Filter Design

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book is Volume III of the series DSP for MATLAB™ and LabVIEW™. Volume III covers digital filter design, including the specific topics of FIR design via windowed-ideal-lowpass filter, FIR highpass, bandpass, and bandstop filter design from windowed-ideal lowpass filters, FIR design using the transition-band-optimized Frequency Sampling technique (implemented by Inverse-DFT or Cosine/Sine Summation Formulas), design of equiripple FIRs of all standard types including Hilbert Transformers and Differentiators via the Remez Exchange Algorithm, design of Butterworth, Chebyshev (Types I and II), and Elliptic analog prototype lowpass filters, conversion of analog lowpass prototype filters to highpass, bandpass, and bandstop filters, and conversion of analog filters to digital filters using the Impulse Invariance and Bilinear Transform techniques. Certain filter topologies specific to FIRs are also discussed, as are two simple FIR types, the Comb and Moving Average filters. The entire series consists of four volumes that collectively cover basic digital signal processing in a practical and accessible manner, but which nonetheless include all essential foundation mathematics. As the series title implies, the scripts (of which there are more than 200) described in the text and supplied in code form here will run on both MATLAB™ and LabVIEW™. The text for all volumes contains many examples, and many useful computational scripts, augmented by demonstration scripts and LabVIEW™ Virtual Instruments (VIs) that can be run to illustrate various signal processing concepts graphically on the user's computer screen. Volume I consists of four chapters that collectively set forth a brief overview of the field of digital signal processing, useful signals and concepts (including convolution, recursion, difference equations, LTI systems, etc.), conversion from the continuous to discrete domain and back (i.e., analog-to-digital and digital-to-analog conversion), aliasing, the Nyquist rate, normalized frequency, sample rate conversion and Mu-law compression, and signal processing principles including correlation, the correlation sequence, the Real DFT, correlation by convolution, matched filtering, simple FIR filters, and simple IIR filters. Chapter four of Volume I, in particular, provides an intuitive or "first principle" understanding of how digital filtering and frequency transforms work. Volume II provides detailed coverage of discrete frequency transforms, including a brief overview of common frequency transforms, both discrete and continuous, followed by detailed treatments of the Discrete Time Fourier Transform (DTFT), the z-Transform (including definition and properties, the inverse z-transform, frequency response via z-transform, and alternate filter realization topologies including Direct Form, Direct Form Transposed, Cascade Form, Parallel Form, and Lattice Form), and the Discrete Fourier Transform (DFT) (including Discrete Fourier Series, the DFT-IDFT pair, DFT of common signals, bin width, sampling duration, and sample rate, the FFT, the Goertzel Algorithm, Linear, Periodic, and Circular convolution, DFT Leakage, and computation of the Inverse DFT).
Volume IV, the culmination of the series, is an introductory treatment of LMS Adaptive Filtering and applications, and covers cost functions, performance surfaces, coefficient perturbation to estimate the gradient, the LMS algorithm, response of the LMS algorithm to narrow-band signals, and various topologies such as ANC (Active Noise Cancelling) or system modeling, Periodic Signal Removal/Prediction/Adaptive Line Enhancement (ALE), Interference Cancellation, Echo Cancellation (with single- and dual-H topologies), and Inverse Filtering/Deconvolution/Equalization. Table of Contents: Principles of FIR Design / FIR Design Techniques / Classical IIR Design View full abstract»
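
    As a minimal illustration of the windowed-ideal-lowpass FIR design named above (sketched here in Python/NumPy rather than MATLAB™ or LabVIEW™; the filter length and cutoff are arbitrary assumptions), the ideal sinc impulse response is truncated and shaped by a Hamming window:

      import numpy as np

      # Windowed-ideal-lowpass FIR design (window method); parameters are illustrative.
      M = 51                           # filter length, odd so the filter is symmetric (assumed)
      fc = 0.2                         # cutoff as a fraction of the sample rate (assumed)
      n = np.arange(M) - (M - 1) / 2   # index centered on the middle tap

      h_ideal = 2 * fc * np.sinc(2 * fc * n)   # ideal lowpass impulse response
      h = h_ideal * np.hamming(M)              # window to control ripple -> realizable FIR

      # Quick check of the magnitude response at DC and at the Nyquist frequency.
      w = np.linspace(0, np.pi, 512)
      H = np.array([np.sum(h * np.exp(-1j * wk * np.arange(M))) for wk in w])
      print("gain at DC     :", round(float(abs(H[0])), 3))     # ~1 in the passband
      print("gain at Nyquist:", round(float(abs(H[-1])), 6))    # deep in the stopband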

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Theory of Timed I/O Automata, Second Edition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This monograph presents the Timed Input/Output Automaton (TIOA) modeling framework, a basic mathematical framework to support description and analysis of timed (computing) systems. Timed systems are systems in which desirable correctness or performance properties of the system depend on the timing of events, not just on the order of their occurrence. Timed systems are employed in a wide range of domains including communications, embedded systems, real-time operating systems, and automated control. Many applications involving timed systems have strong safety, reliability, and predictability requirements, which make it important to have methods for systematic design of systems and rigorous analysis of timing-dependent behavior. The TIOA framework also supports description and analysis of timed distributed algorithms -- distributed algorithms whose correctness and performance depend on the relative speeds of processors, accuracy of local clocks, or communication delay bounds. Such algorithms arise, for example, in traditional and wireless communications, networks of mobile devices, and shared-memory multiprocessors. The need to prove rigorous theoretical results about timed distributed algorithms makes it important to have a suitable mathematical foundation. An important feature of the TIOA framework is its support for decomposing timed system descriptions. In particular, the framework includes a notion of external behavior for a timed I/O automaton, which captures its discrete interactions with its environment. The framework also defines what it means for one TIOA to implement another, based on an inclusion relationship between their external behavior sets, and defines notions of simulations, which provide sufficient conditions for demonstrating implementation relationships. The framework includes a composition operation for TIOAs, which respects external behavior, and a notion of receptiveness, which implies that a TIOA does not block the passage of time. The TIOA framework also defines the notion of a property and what it means for a property to be a safety or a liveness property. It includes results that capture common proof methods for showing that automata satisfy properties. Table of Contents: Introduction / Mathematical Preliminaries / Describing Timed System Behavior / Timed Automata / Operations on Timed Automata / Properties for Timed Automata / Timed I/O Automata / Operations on Timed I/O Automata / Conclusions and Future Work View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Databases on Modern Hardware: How to Stop Underutilization and Love Multicores

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Data management systems enable various influential applications from high-performance online services (e.g., social networks like Twitter and Facebook or financial markets) to big data analytics (e.g., scientific exploration, sensor networks, business intelligence). As a result, data management systems have been one of the main drivers for innovations in the database and computer architecture communities for several decades. Recent hardware trends require software to take advantage of the abundant parallelism existing in modern and future hardware. The traditional design of the data management systems, however, faces inherent scalability problems due to its tightly coupled components. In addition, it cannot exploit the full capability of the aggressive micro-architectural features of modern processors. As a result, today's most commonly used server types remain largely underutilized, leading to a huge waste of hardware resources and energy. In this book, we shed light on the challenges present while running DBMS on modern multicore hardware. We divide the material into two dimensions of scalability: implicit/vertical and explicit/horizontal. The first part of the book focuses on the vertical dimension: it describes the instruction- and data-level parallelism opportunities in a core coming from the hardware and software side. In addition, it examines the sources of under-utilization in a modern processor and presents insights and hardware/software techniques to better exploit the microarchitectural resources of a processor by improving cache locality at the right level of the memory hierarchy. The second part focuses on the horizontal dimension, i.e., scalability bottlenecks of database applications at the level of multicore and multisocket multicore architectures. It first presents a systematic way of eliminating such bottlenecks in online transaction processing workloads, which is based on minimizing unbounded communication, and shows several techniques that minimize bottlenecks in major components of database management systems. Then, it demonstrates the data and work sharing opportunities for analytical workloads, and reviews advanced scheduling mechanisms that are aware of nonuniform memory accesses and alleviate bandwidth saturation. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Interactive Technologies for Autism: A Review

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Development, deployment, and evaluation of interactive technologies for individuals with autism have been rapidly increasing over the last decade. There is great promise for the use of these types of technologies to enrich interventions, facilitate communication, and support data collection. Emerging technologies in this area also have the potential to enhance assessment and diagnosis of individuals with autism, to understand the nature of autism, and to help researchers conduct basic and applied research. This book provides an in-depth review of the historical and state-of-the-art use of technology by and for individuals with autism. The intention is to give readers a comprehensive background in order to understand what has been done and what promises and challenges lie ahead. By providing a classification scheme and general review, this book can also help technology designers and researchers better understand what technologies have been successful, what problems remain open, and where innovations can further address challenges and opportunities for individuals with autism and the variety of stakeholders connected to them. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modern Image Quality Assessment

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This Lecture book is about objective image quality assessment—where the aim is to provide computational models that can automatically predict perceptual image quality. The early years of the 21st century have witnessed a tremendous growth in the use of digital images as a means for representing and communicating information. A considerable percentage of this literature is devoted to methods for improving the appearance of images, or for maintaining the appearance of images that are processed. Nevertheless, the quality of digital images, processed or otherwise, is rarely perfect. Images are subject to distortions during acquisition, compression, transmission, processing, and reproduction. To maintain, control, and enhance the quality of images, it is important for image acquisition, management, communication, and processing systems to be able to identify and quantify image quality degradations. The goals of this book are as follows: a) to introduce the fundamentals of image quality assessment, and to explain the relevant engineering problems, b) to give a broad treatment of the current state-of-the-art in image quality assessment, by describing leading algorithms that address these engineering problems, and c) to provide new directions for future research, by introducing recent models and paradigms that significantly differ from those used in the past. The book is written to be accessible to university students curious about the state-of-the-art of image quality assessment, expert industrial R&D engineers seeking to implement image/video quality assessment systems for specific applications, and academic theorists interested in developing new algorithms for image quality assessment or using existing algorithms to design or optimize other image processing applications. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introductory Tiling Theory for Computer Graphics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Tiling theory is an elegant branch of mathematics that has applications in several areas of computer science. The most immediate application area is graphics, where tiling theory has been used in the contexts of texture generation, sampling theory, remeshing, and of course the generation of decorative patterns. The combination of a solid theoretical base (complete with tantalizing open problems), practical algorithmic techniques, and exciting applications make tiling theory a worthwhile area of study for practitioners and students in computer science. This synthesis lecture introduces the mathematical and algorithmic foundations of tiling theory to a computer graphics audience. The goal is primarily to introduce concepts and terminology, clear up common misconceptions, and state and apply important results. The book also describes some of the algorithms and data structures that allow several aspects of tiling theory to be used in practice. Table of Contents: Introduction / Tiling Basics / Symmetry / Tilings by Polygons / Isohedral Tilings / Nonperiodic and Aperiodic Tilings / Survey View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Theory and Rate Distortion Theory for Communications and Compression

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the coverage of some standard topics is shortened or eliminated, the standard, but important, topics of the chain rules for entropy and mutual information, relative entropy, the data processing inequality, and the Markov chain condition receive a full treatment. Similarly, lossless source coding techniques presented include the Lempel-Ziv-Welch coding method. The material on rate distortion theory and exploring fundamental limits on lossy source coding covers the often-neglected Shannon lower bound and the Shannon backward channel condition, rate distortion theory for sources with memory, and the extremely practical topic of rate distortion functions for composite sources. The target audience for the book consists of graduate students at the master's degree level and practicing engineers. It is hoped that practicing engineers can work through this book and comprehend the key results needed to understand the utility of information theory and rate distortion theory and then utilize the results presented to analyze and perhaps improve the communications and compression systems with which they are familiar. View full abstract»
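
    For orientation, the chain rules and the data processing inequality that the abstract singles out can be stated compactly (standard definitions, reproduced here for reference rather than excerpted from the book):

      H(X, Y) \;=\; H(X) + H(Y \mid X), \qquad
      I(X; Y) \;=\; H(X) - H(X \mid Y) \;=\; H(X) + H(Y) - H(X, Y),

      I(X_1, \dots, X_n; Y) \;=\; \sum_{i=1}^{n} I(X_i; Y \mid X_1, \dots, X_{i-1}),

      X \to Y \to Z \ \text{(Markov chain)} \;\Longrightarrow\; I(X; Z) \,\le\, I(X; Y).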

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Integration: The Relational Logic Approach

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Data integration is a critical problem in our increasingly interconnected but inevitably heterogeneous world. There are numerous data sources available in organizational databases and on public information systems like the World Wide Web. Not surprisingly, the sources often use different vocabularies and different data structures, being created, as they are, by different people, at different times, for different purposes. The goal of data integration is to provide programmatic and human users with integrated access to multiple, heterogeneous data sources, giving each user the illusion of a single, homogeneous database designed for his or her specific need. The good news is that, in many cases, the data integration process can be automated. This book is an introduction to the problem of data integration and a rigorous account of one of the leading approaches to solving this problem, viz., the relational logic approach. Relational logic provides a theoretical framework for discussing data integration. Moreover, in many important cases, it provides algorithms for solving the problem in a computationally practical way. In many respects, relational logic does for data integration what relational algebra did for database theory several decades ago. A companion web site provides interactive demonstrations of the algorithms. Table of Contents: Preface / Interactive Edition / Introduction / Basic Concepts / Query Folding / Query Planning / Master Schema Management / Appendix / References / Index / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Optimization and Mathematical Modeling in Computer Architecture

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In this book we give an overview of modeling techniques used to describe computer systems to mathematical optimization tools. We give a brief introduction to various classes of mathematical optimization frameworks with special focus on mixed integer linear programming which provides a good balance between solver time and expressiveness. We present four detailed case studies -- instruction set customization, data center resource management, spatial architecture scheduling, and resource allocation in tiled architectures -- showing how MILP can be used and quantifying by how much it outperforms traditional design exploration techniques. This book should help a skilled systems designer to learn techniques for using MILP in their problems, and the skilled optimization expert to understand the types of computer systems problems that MILP can be applied to. View full abstract»
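
    To give a flavor of posing a computer-systems question as a mixed integer linear program, here is a toy resource-allocation model invented for illustration (it assumes the open-source PuLP package as the modeling front end and is not one of the book's case studies): choose how many instances of each accelerator to place on a die so that the modeled speedup is maximized within an area budget.

      from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpStatus

      # Toy MILP; all speedup and area numbers are invented for illustration.
      accelerators = {"fft": (3.0, 4), "crypto": (2.0, 2), "matmul": (5.0, 7)}  # name: (speedup, area)
      area_budget = 10

      prob = LpProblem("accelerator_selection", LpMaximize)
      count = {a: LpVariable(f"n_{a}", lowBound=0, upBound=3, cat="Integer") for a in accelerators}

      prob += lpSum(accelerators[a][0] * count[a] for a in accelerators)                 # objective: total speedup
      prob += lpSum(accelerators[a][1] * count[a] for a in accelerators) <= area_budget  # area constraint

      prob.solve()
      print(LpStatus[prob.status], {a: int(count[a].value()) for a in accelerators})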

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Transient Signals on Transmission Lines: An Introduction to Non-Ideal Effects and Signal Integrity Issues in Electrical Systems

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This lecture provides an introduction to transmission line effects in the time domain. Fundamentals including time of flight, impedance discontinuities, proper termination schemes, nonlinear and reactive loads, and crosstalk are considered. Required prerequisite knowledge is limited to conventional circuit theory. The material is intended to supplement standard textbooks for use with undergraduate students in electrical engineering or computer engineering. The contents should also be of value to practicing engineers with interests in signal integrity and high-speed digital design. Table of Contents: Introduction / Solution of the Transmission Line Equations / DC Signals on a Resistively Loaded Transmission Line / Termination Schemes / Equivalent Circuits, Cascaded Lines, and Fan-Outs / Initially-Charged Transmission Lines / Finite Duration Pulses on Transmission Lines / Transmission Lines with Reactive Terminations / Lines with Nonlinear Loads / Crosstalk on Weakly Coupled Transmission Lines View full abstract»
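
    The two first-pass calculations the lecture builds on, time of flight along a line and reflection at an impedance discontinuity, reduce to standard relations worth keeping in mind (stated here for reference, with L' and C' the per-unit-length inductance and capacitance):

      t_{\mathrm{flight}} \;=\; \frac{\ell}{v_p}, \qquad v_p \;=\; \frac{1}{\sqrt{L'C'}}, \qquad
      \Gamma_L \;=\; \frac{Z_L - Z_0}{Z_L + Z_0},

    so a matched load (Z_L = Z_0) produces no reflection, an open circuit gives \Gamma_L = +1, and a short circuit gives \Gamma_L = -1.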

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Exa-scale computing needs to re-examine the existing hardware platform that can support intensive data-oriented computing. Since the main bottleneck is from memory, we aim to develop an energy-efficient in-memory computing platform in this book. First, the models of spin-transfer torque magnetic tunnel junction and racetrack memory are presented. Next, we show that spintronics could be a candidate for future data-oriented computing for storage, logic, and interconnect. As a result, by utilizing spintronics, in-memory-based computing has been applied for data encryption and machine learning. The implementations of in-memory AES, Simon cipher, as well as interconnect are explained in detail. In addition, in-memory-based machine learning and face recognition are also illustrated in this book. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Essential Principles for Autonomous Robotics

    Copyright Year: 2013

    Morgan and Claypool eBooks

    From driving, flying, and swimming, to digging for unknown objects in space exploration, autonomous robots take on varied shapes and sizes. In part, autonomous robots are designed to perform tasks that are too dirty, dull, or dangerous for humans. With nontrivial autonomy and volition, they may soon claim their own place in human society. These robots will be our allies as we strive for understanding our natural and man-made environments and build positive synergies around us. Although we may never perfect replication of biological capabilities in robots, we must harness the inevitable emergence of robots that synchronizes with our own capacities to live, learn, and grow. This book is a snapshot of motivations and methodologies for our collective attempts to transform our lives and enable us to cohabit with robots that work with and for us. It reviews and guides the reader to seminal and continual developments that are the foundations for successful paradigms. It attempts to demystify the abilities and limitations of robots. It is a progress report on the continuing work that will fuel future endeavors. Table of Contents: Part I: Preliminaries / Agency, Motion, and Anatomy / Behaviors / Architectures / Affect / Sensors / Manipulators / Part II: Mobility / Potential Fields / Roadmaps / Reactive Navigation / Multi-Robot Mapping: Brick and Mortar Strategy / Part III: State of the Art / Multi-Robotics Phenomena / Human-Robot Interaction / Fuzzy Control / Decision Theory and Game Theory / Part IV: On the Horizon / Applications: Macro and Micro Robots / References / Author Biography / Discussion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Anthropology of Services: Toward a Practice Approach to Designing Services

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book explores the possibility for an anthropology of services and outlines a practice approach to designing services. The reader is taken on a journey that Blomberg and Darrah have been on for the better part of a decade from their respective positions helping to establish a services research group within a large global enterprise and an applied anthropology master's program at a Silicon Valley university. They delve into the world of services to understand both how services are being conceptualized today and the possible benefits that might result from taking an anthropological view on services and their design. The authors argue that the anthropological gaze can be useful precisely because it combines attention to details of everyday life with consideration of the larger milieu in which those details make sense. Furthermore, it asks us to reflect upon and assess our own perspectives on that which we hope to understand and change. Central to their exploration is the question of how to conceptualize and engage with the world of services given their heterogeneity, the increasing global importance of the service economy, and the possibilities introduced for an engaged scholarship on service design. While discourse on services and service design can imply something distinctively new, the authors point to parallels with what is known about how humans have engaged with each other and the material world over millennia. Establishing the ubiquity of services as a starting point, the authors go on to consider the limits of design when the boundaries and connections between what can be designed and what can only be performed are complex and deeply mediated. In this regard the authors outline a practice approach to designing that acknowledges that designing involves participating in a social context, that design and use occur in concert, that people populate a world that has been largely built by and with others, and that formal models of services are impoverished representations of human performance. An Anthropology of Services draws attention to the conceptual and methodological messiness of service worlds while providing the reader with strategies for intervening in these worlds for human betterment as complex and challenging as that may be. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Game Theory for Wireless Engineers

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The application of mathematical analysis to wireless networks has met with limited success, due to the complexity of mobility and traffic models, coupled with the dynamic topology and the unpredictability of link quality that characterize such networks. The ability to model individual, independent decision makers whose actions potentially affect all other decision makers makes game theory particularly attractive to analyze the performance of ad hoc networks. Game theory is a field of applied mathematics that describes and analyzes interactive decision situations. It consists of a set of analytical tools that predict the outcome of complex interactions among rational entities, where rationality demands a strict adherence to a strategy based on perceived or measured results. In the early to mid-1990s, game theory was applied to networking problems including flow control, congestion control, routing and pricing of Internet services. More recently, there has been growing interest in adopting game-theoretic methods to model today's leading communications and networking issues, including power control and resource sharing in wireless and peer-to-peer networks. This work presents fundamental results in game theory and their application to wireless communications and networking. We discuss normal-form, repeated, and Markov games with examples selected from the literature. We also describe ways in which learning can be modeled in game theory, with direct applications to the emerging field of cognitive radio. Finally, we discuss challenges and limitations in the application of game theory to the analysis of wireless systems. We do not assume familiarity with game theory. We introduce major game theoretic models and discuss applications of game theory including medium access, routing, energy-efficient protocols, and others. We seek to provide the reader with a foundational understanding of the current research on game theory applied to wireless communications and networking. View full abstract»
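
    As a small companion to the normal-form material described above, the sketch below enumerates the pure-strategy Nash equilibria of a two-player payoff matrix by checking best responses; the payoffs are invented for illustration and are not taken from the lecture.

      from itertools import product

      # payoffs[i][j] = (payoff to row player, payoff to column player); values are invented.
      payoffs = [
          [(3, 3), (0, 4)],
          [(4, 0), (1, 1)],   # a Prisoner's-Dilemma-like structure
      ]

      def pure_nash(payoffs):
          rows, cols = len(payoffs), len(payoffs[0])
          equilibria = []
          for i, j in product(range(rows), range(cols)):
              row_best = all(payoffs[i][j][0] >= payoffs[k][j][0] for k in range(rows))
              col_best = all(payoffs[i][j][1] >= payoffs[i][k][1] for k in range(cols))
              if row_best and col_best:
                  equilibria.append((i, j))
          return equilibria

      print(pure_nash(payoffs))   # -> [(1, 1)]: mutual "defection" is the unique equilibrium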

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Modern Blind Signal Separation Algorithms: Theory and Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With human-computer interactions and hands-free communications becoming overwhelmingly important in the new millennium, recent research efforts have been increasingly focusing on state-of-the-art multi-microphone signal processing solutions to improve speech intelligibility in adverse environments. One such prominent statistical signal processing technique is blind signal separation (BSS). BSS was first introduced in the early 1990s and quickly emerged as an area of intense research activity showing huge potential in numerous applications. BSS comprises the task of 'blindly' recovering a set of unknown signals, the so-called sources, from their observed mixtures, based on very little to almost no prior knowledge about the source characteristics or the mixing structure. The goal of BSS is to process multi-sensory observations of an inaccessible set of signals in a manner that reveals their individual (and original) form, by exploiting the spatial and temporal diversity, readily accessible through a multi-microphone configuration. Proceeding blindly exhibits a number of advantages, since assumptions about the room configuration and the source-to-sensor geometry can be relaxed without affecting overall efficiency. This booklet investigates one of the most commercially attractive applications of BSS, which is the simultaneous recovery of signals inside a reverberant (naturally echoing) environment, using two (or more) microphones. In this paradigm, each microphone captures not only the direct contributions from each source, but also several reflected copies of the original signals at different propagation delays. These recordings are referred to as the convolutive mixtures of the original sources. The goal of this booklet in the lecture series is to provide insight on recent advances in algorithms, which are ideally suited for blind signal separation of convolutive speech mixtures. More importantly, specific emphasis is given in practical applications of the developed BSS algorithms associated with real-life scenarios. The developed algorithms are put in the context of modern DSP devices, such as hearing aids and cochlear implants, where design requirements dictate low power consumption and call for portability and compact size. Along these lines, this booklet focuses on modern BSS algorithms which address (1) the limited amount of processing power and (2) the small number of microphones available to the end-user. Table of Contents: Fundamentals of blind signal separation / Modern blind signal separation algorithms / Application of blind signal processing strategies to noise reduction for the hearing-impaired / Conclusions and future challenges / Bibliography View full abstract»
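
    The convolutive mixing model referred to above can be written out explicitly. With J sources s_j, I microphones x_i, and room impulse responses h_{ij} of length K, each observation is

      x_i(n) \;=\; \sum_{j=1}^{J} \sum_{k=0}^{K-1} h_{ij}(k)\, s_j(n - k), \qquad i = 1, \dots, I,

    and blind separation seeks demixing filters that recover the s_j from the x_i alone.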

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sequential Monte Carlo Methods for Nonlinear Discrete-Time Filtering

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In these notes, we introduce particle filtering as a recursive importance sampling method that approximates the minimum-mean-square-error (MMSE) estimate of a sequence of hidden state vectors in scenarios where the joint probability distribution of the states and the observations is non-Gaussian and, therefore, closed-form analytical expressions for the MMSE estimate are generally unavailable. We begin the notes with a review of Bayesian approaches to static (i.e., time-invariant) parameter estimation. In the sequel, we describe the solution to the problem of sequential state estimation in linear, Gaussian dynamic models, which corresponds to the well-known Kalman (or Kalman-Bucy) filter. Finally, we move to the general nonlinear, non-Gaussian stochastic filtering problem and present particle filtering as a sequential Monte Carlo approach to solve that problem in a statistically optimal way. We review several techniques to improve the performance of particle filters, including importance function optimization, particle resampling, Markov Chain Monte Carlo move steps, auxiliary particle filtering, and regularized particle filtering. We also discuss Rao-Blackwellized particle filtering as a technique that is particularly well-suited for many relevant applications such as fault detection and inertial navigation. Finally, we conclude the notes with a discussion on the emerging topic of distributed particle filtering using multiple processors located at remote nodes in a sensor network. Throughout the notes, we often assume a more general framework than in most introductory textbooks by allowing either the observation model or the hidden state dynamic model to include unknown parameters. In a fully Bayesian fashion, we treat those unknown parameters also as random variables. Using suitable dynamic conjugate priors, that approach can be applied then to perform joint state and parameter estimation. Table of Contents: Introduction / Bayesian Estimation of Static Vectors / The Stochastic Filtering Problem / Sequential Monte Carlo Methods / Sampling/Importance Resampling (SIR) Filter / Importance Function Selection / Markov Chain Monte Carlo Move Step / Rao-Blackwellized Particle Filters / Auxiliary Particle Filter / Regularized Particle Filters / Cooperative Filtering with Multiple Observers / Application Examples / Summary View full abstract»
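
    A minimal bootstrap (sampling/importance resampling) particle filter for a scalar nonlinear model can be sketched in a few lines of Python; the state and observation models, noise levels, and particle count below are illustrative choices, not the examples worked in the notes.

      import numpy as np

      # Bootstrap particle filter for x_k = 0.5*x_{k-1} + 8*cos(k) + process noise,
      # observed through y_k = x_k**2 / 20 + measurement noise (illustrative model).
      rng = np.random.default_rng(0)
      T, N = 50, 1000       # time steps, particles
      q, r = 1.0, 1.0       # process / measurement noise std. dev. (assumed)

      x_true = np.zeros(T); y = np.zeros(T)
      for k in range(1, T):
          x_true[k] = 0.5 * x_true[k - 1] + 8 * np.cos(k) + q * rng.standard_normal()
          y[k] = x_true[k] ** 2 / 20 + r * rng.standard_normal()

      particles = rng.standard_normal(N)
      estimates = np.zeros(T)
      for k in range(1, T):
          # propagate through the state dynamics (proposal = prior, hence "bootstrap")
          particles = 0.5 * particles + 8 * np.cos(k) + q * rng.standard_normal(N)
          # weight each particle by the likelihood of the new observation
          w = np.exp(-0.5 * ((y[k] - particles ** 2 / 20) / r) ** 2)
          w /= w.sum()
          estimates[k] = np.sum(w * particles)                 # MMSE estimate
          # multinomial resampling to combat weight degeneracy
          particles = particles[rng.choice(N, size=N, p=w)]

      print("RMS estimation error:", round(float(np.sqrt(np.mean((estimates - x_true) ** 2))), 2))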

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multi-Objective Decision Making

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Many real-world decision problems have multiple objectives. For example, when choosing a medical treatment plan, we want to maximize the efficacy of the treatment, but also minimize the side effects. These objectives typically conflict, e.g., we can often increase the efficacy of the treatment, but at the cost of more severe side effects. In this book, we outline how to deal with multiple objectives in decision-theoretic planning and reinforcement learning algorithms. To illustrate this, we employ the popular problem classes of multi-objective Markov decision processes (MOMDPs) and multi-objective coordination graphs (MO-CoGs). First, we discuss different use cases for multi-objective decision making, and why they often necessitate explicitly multi-objective algorithms. We advocate a utility-based approach to multi-objective decision making, i.e., that what constitutes an optimal solution to a multi-objective decision problem should be derived from the available information about user utility. We show how different assumptions about user utility and what types of policies are allowed lead to different solution concepts, which we outline in a taxonomy of multi-objective decision problems. Second, we show how to create new methods for multi-objective decision making using existing single-objective methods as a basis. Focusing on planning, we describe two ways of creating multi-objective algorithms: in the inner loop approach, the inner workings of a single-objective method are adapted to work with multi-objective solution concepts; in the outer loop approach, a wrapper is created around a single-objective method that solves the multi-objective problem as a series of single-objective problems. After discussing the creation of such methods for the planning setting, we discuss how these approaches apply to the learning setting. Next, we discuss three promising application domains for multi-objective decision making algorithms: energy, health, and infrastructure and transportation. Finally, we conclude by outlining important open problems and promising future directions. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Geographical Design: Spatial Cognition and Geographical Information Science

    Copyright Year: 2011

    Morgan and Claypool eBooks

    With GIS technologies ranging from Google Maps and Google Earth to the use of smart phones and in-car navigation systems, spatial knowledge is often acquired and communicated through geographic information technologies. This monograph describes the interplay between spatial cognition research and use of spatial interfaces. It begins by reviewing what is known about how humans process spatial concepts and then moves on to discuss how interfaces can be improved to take advantage of those capabilities. Special attention is given to a variety of innovative geographical platforms that provide users with an intuitive understanding and support the further acquisition of spatial knowledge. The monograph concludes with a discussion of a number of outstanding issues, including the changing nature of maps as the primary spatial interface, concerns about privacy for spatial information, and a look at the future of user-centered spatial information systems. Table of Contents: Introduction / Spatial Cognition / Technologies / Cognitive Interfaces for Wayfinding / Open Issues / For More Information View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantics in Mobile Sensing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The dramatic progress of smartphone technologies has ushered in a new era of mobile sensing, where traditional wearable on-body sensors are being rapidly superseded by various embedded sensors in our smartphones. For example, a typical smartphone today has, at the very least, a GPS, WiFi, Bluetooth, triaxial accelerometer, and gyroscope. Alongside, new accessories are emerging such as proximity, magnetometer, barometer, temperature, and pressure sensors. Even the default microphone can act as an acoustic sensor to track noise exposure, for example. These sensors act as a "lens" to understand the user's context along different dimensions. Data can be passively collected from these sensors without interrupting the user. As a result, this new era of mobile sensing has fueled significant interest in understanding what can be extracted from such sensor data both instantaneously as well as considering volumes of time series from these sensors. For example, GPS logs can be used to automatically determine the significant places associated with a user's life (e.g., home, office, shopping areas). The logs may also reveal travel patterns, and how a user moves from one place to another (e.g., driving or using public transport). These may be used to proactively inform the user about delays or relevant promotions from shops along his "regular" route. Similarly, accelerometer logs can be used to measure a user's average walking speed, compute step counts, identify gait, and estimate calories burnt per day. The key objective is to provide better services to end users. The objective of this book is to inform the reader of the methodologies and techniques for extracting meaningful information (called "semantics") from sensors on our smartphones. These techniques form the cornerstone of several application areas utilizing smartphone sensor data. We discuss technical challenges and algorithmic solutions for modeling and mining knowledge from smartphone-resident sensor data streams. This book devotes two chapters to dive deep into a set of highly available, commoditized sensors---the positioning sensor (GPS) and motion sensor (accelerometer). Furthermore, this book has a chapter devoted to energy-efficient computation of semantics, as battery life is a major concern for user experience. View full abstract»
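
    As a small illustration of the kind of semantics such sensors can yield, the sketch below counts steps from a triaxial accelerometer trace with a simple peak detector; the threshold, minimum peak spacing, and synthetic walking signal are assumptions for the example, not the book's algorithm.

    ```python
    import numpy as np

    def count_steps(ax, ay, az, threshold=11.0, min_gap=10):
        """Count peaks in the acceleration magnitude |a| (m/s^2) that exceed `threshold`,
        keeping detected peaks at least `min_gap` samples apart."""
        mag = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
        steps, last_peak = 0, -min_gap
        for i in range(1, len(mag) - 1):
            is_peak = mag[i] >= mag[i - 1] and mag[i] >= mag[i + 1]
            if is_peak and mag[i] > threshold and i - last_peak >= min_gap:
                steps += 1
                last_peak = i
        return steps

    t = np.arange(0.0, 10.0, 0.02)                       # 10 s of samples at 50 Hz
    ax = 9.81 + 3.0 * np.sin(2.0 * np.pi * 2.0 * t)      # synthetic walking signal, ~2 steps/s
    print(count_steps(ax, np.zeros_like(t), np.zeros_like(t)))  # roughly 20
    ```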

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quorum Systems: With Applications to Storage and Consensus

    Copyright Year: 2012

    Morgan and Claypool eBooks

    A quorum system is a collection of subsets of nodes, called quorums, with the property that each pair of quorums has a non-empty intersection. Quorum systems are the key mathematical abstraction for ensuring consistency in fault-tolerant and highly available distributed computing. Critical for many applications since the early days of distributed computing, quorum systems have evolved from simple majorities of a set of processes to complex hierarchical collections of sets, tailored for general adversarial structures. The initial non-empty intersection property has been refined many times to account for, e.g., a stronger (Byzantine) adversarial model, latency considerations, or better availability. This monograph is an overview of the evolution and refinement of quorum systems, with emphasis on their role in two fundamental applications: distributed read/write storage and consensus. Table of Contents: Introduction / Preliminaries / Classical Quorum Systems / Classical Quorum-Based Emulations / Byzantine Quorum Systems / Latency-efficient Quorum Systems / Probabilistic Quorum Systems View full abstract»
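
    A minimal sketch of the defining intersection property, using majority quorums as the classical example (the node set and quorum construction here are illustrative, not from the monograph):

    ```python
    from itertools import combinations

    def is_quorum_system(quorums):
        """True if every pair of quorums has a non-empty intersection."""
        return all(q1 & q2 for q1, q2 in combinations(quorums, 2))

    # Majority quorums over five nodes: every subset of size three.
    nodes = {1, 2, 3, 4, 5}
    majorities = [set(c) for c in combinations(nodes, 3)]
    print(is_quorum_system(majorities))        # True
    print(is_quorum_system([{1, 2}, {3, 4}]))  # False: two disjoint sets cannot both be quorums
    ```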

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Meta-Smith Charts and Their Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book presents the developments and potential applications of Meta-Smith charts, which can be applied to practical and useful transmission line problems (e.g., metamaterial transmission lines and nonreciprocal transmission lines). These problems lie beyond what the standard Smith chart can handle effectively. As any RF engineer is aware, a key property of the Smith chart is the insight it provides, even in very complex design processes. Like the Smith chart, Meta-Smith charts provide a useful way of visualizing transmission line phenomena. They provide useful physical insight, and they can also assist in solving related problems effectively. This book can be used as a companion guide in studying Microwave Engineering for senior undergraduate students as well as for graduate students. It is also recommended for researchers in the RF community, especially those working with periodic transmission line structures and metamaterial transmission lines. Problems are also provided at the end of each chapter for readers to gain a better understanding of the material presented in this book. Table of Contents: Essential Transmission Line Theory / Theory of CCITLs / Theory of BCITLs / Meta-Smith Charts for CCITLs and BCITLs / Applications of Meta-Smith Charts View full abstract»
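
    The Meta-Smith constructions themselves are developed in the book; purely as background, the sketch below evaluates the standard Smith chart relation they generalize, mapping a load impedance to the reflection coefficient that the chart plots (the 50-ohm characteristic impedance and the example loads are arbitrary choices):

    ```python
    def reflection_coefficient(z_load, z0=50.0):
        """Gamma = (Z_L - Z_0) / (Z_L + Z_0); the Smith chart is a plot of Gamma in the complex plane."""
        return (z_load - z0) / (z_load + z0)

    print(reflection_coefficient(complex(25.0, 25.0)))  # mismatched, partly reactive load
    print(reflection_coefficient(50.0))                 # matched load gives Gamma = 0
    ```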

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantic Similarity from Natural Language and Ontology Analysis

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Artificial Intelligence federates numerous scientific fields with the aim of developing machines able to assist human operators performing complex treatments---most of which demand high cognitive skills (e.g. learning or decision processes). Central to this quest is to give machines the ability to estimate the likeness or similarity between things in the way human beings estimate the similarity between stimuli. In this context, this book focuses on semantic measures: approaches designed for comparing semantic entities such as units of language, e.g. words, sentences, or concepts and instances defined into knowledge bases. The aim of these measures is to assess the similarity or relatedness of such semantic entities by taking into account their semantics, i.e. their meaning---intuitively, the words tea and coffee, which both refer to stimulating beverages, will be estimated to be more semantically similar than the words toffee (confection) and coffee, even though the latter pair has a higher syntactic similarity. The two state-of-the-art approaches for estimating and quantifying semantic similarities/relatedness of semantic entities are presented in detail: the first one relies on corpora analysis and is based on Natural Language Processing techniques and semantic models while the second is based on more or less formal, computer-readable and workable forms of knowledge such as semantic networks, thesauri or ontologies. Semantic measures are widely used today to compare units of language, concepts, instances or even resources indexed by them (e.g., documents, genes). They are central elements of a large variety of Natural Language Processing applications and knowledge-based treatments, and have therefore naturally been subject to intensive and interdisciplinary research efforts during the last decades. Beyond a simple inventory and categorization of existing measures, the aim of this monograph is to guide novices as well as researchers in these domains toward a better understanding of semantic similarity estimation and more generally semantic measures. To this end, we propose an in-depth characterization of existing proposals by discussing their features, the assumptions on which they are based and empirical results regarding their performance in particular applications. By answering these questions and by providing a detailed discussion on the foundations of semantic measures, our aim is to give the reader key knowledge required to: (i) select the most relevant methods according to a particular usage context, (ii) understand the challenges offered to this field of study, (iii) identify room for improvement in state-of-the-art approaches and (iv) stimulate creativity toward the development of new approaches. With this aim, several definitions, theoretical and practical details, as well as concrete applications are presented. View full abstract»
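
    A minimal sketch of the corpus-based flavor of semantic measure mentioned above, using cosine similarity over toy distributional vectors (the context features and counts are invented for illustration and are not taken from the book):

    ```python
    import numpy as np

    def cosine(u, v):
        """Cosine similarity between two distributional (e.g., co-occurrence) vectors."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Toy co-occurrence counts over the contexts ("drink", "hot", "sweet", "chewy").
    vectors = {
        "tea":    np.array([8.0, 6.0, 2.0, 0.0]),
        "coffee": np.array([9.0, 7.0, 1.0, 0.0]),
        "toffee": np.array([0.0, 1.0, 7.0, 6.0]),
    }
    print(cosine(vectors["tea"], vectors["coffee"]))     # high: semantically similar
    print(cosine(vectors["toffee"], vectors["coffee"]))  # low, despite the similar spelling
    ```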

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Content-based Retrieval of Medical Images: Landmarking, Indexing, and Relevance Feedback

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Content-based image retrieval (CBIR) is the process of retrieval of images from a database that are similar to a query image, using measures derived from the images themselves, rather than relying on accompanying text or annotation. To achieve CBIR, the contents of the images need to be characterized by quantitative features; the features of the query image are compared with the features of each image in the database and images having high similarity with respect to the query image are retrieved and displayed. CBIR of medical images is a useful tool and could provide radiologists with assistance in the form of a display of relevant past cases. One of the challenging aspects of CBIR is to extract features from the images to represent their visual, diagnostic, or application-specific information content. In this book, methods are presented for preprocessing, segmentation, landmarking, feature extraction, and indexing of mammograms for CBIR. The preprocessing steps include anisotropic diffusion and the Wiener filter to remove noise and perform image enhancement. Techniques are described for segmentation of the breast and fibroglandular disk, including maximum entropy, a moment-preserving method, and Otsu's method. Image processing techniques are described for automatic detection of the nipple and the edge of the pectoral muscle via analysis in the Radon domain. By using the nipple and the pectoral muscle as landmarks, mammograms are divided into their internal, external, upper, and lower parts for further analysis. Methods are presented for feature extraction using texture analysis, shape analysis, granulometric analysis, moments, and statistical measures. The CBIR system presented provides options for retrieval using the Kohonen self-organizing map and the k-nearest-neighbor method. Methods are described for inclusion of expert knowledge to reduce the semantic gap in CBIR, including the query point movement method for relevance feedback (RFb). Analysis of performance is described in terms of precision, recall, and relevance-weighted precision of retrieval. Results of application to a clinical database of mammograms are presented, including the input of expert radiologists into the CBIR and RFb processes. Models are presented for integration of CBIR and computer-aided diagnosis (CAD) with a picture archival and communication system (PACS) for efficient workflow in a hospital. Table of Contents: Introduction to Content-based Image Retrieval / Mammography and CAD of Breast Cancer / Segmentation and Landmarking of Mammograms / Feature Extraction and Indexing of Mammograms / Content-based Retrieval of Mammograms / Integration of CBIR and CAD into Radiological Workflow View full abstract»
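
    As a rough illustration of the retrieval step only (not the book's system), the sketch below ranks database entries by Euclidean distance between feature vectors, i.e., a k-nearest-neighbor query; the case names and feature values are hypothetical:

    ```python
    import numpy as np

    def retrieve(query_features, database, k=3):
        """Return the k database cases whose feature vectors are closest to the query."""
        dists = [(name, float(np.linalg.norm(np.asarray(feats) - np.asarray(query_features))))
                 for name, feats in database.items()]
        return sorted(dists, key=lambda item: item[1])[:k]

    # Hypothetical texture/shape feature vectors for a few past cases.
    db = {
        "case_01": [0.21, 0.80, 0.33],
        "case_02": [0.25, 0.78, 0.30],
        "case_03": [0.90, 0.10, 0.55],
        "case_04": [0.22, 0.79, 0.35],
    }
    print(retrieve([0.23, 0.81, 0.32], db, k=2))  # the two most similar past cases
    ```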

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Strategic Health Technology Incorporation

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Technology is essential to the delivery of health care but it is still only a tool that needs to be deployed wisely to ensure beneficial outcomes at reasonable costs. Among various categories of health technology, medical equipment has the unique distinction of requiring both high initial investments and costly maintenance during its entire useful life. This characteristic does not, however, imply that medical equipment is more costly than other categories, provided that it is managed properly. The foundation of a sound technology management process is the planning and acquisition of equipment, collectively called technology incorporation. This lecture presents a rational, strategic process for technology incorporation based on experience, some successful and many unsuccessful, accumulated in industrialized and developing countries over the last three decades. The planning step is focused on establishing a Technology Incorporation Plan (TIP) using data collected from an audit of existing technology, evaluating needs, impacts, costs, and benefits, and consolidating the information collected for decision making. The acquisition step implements TIP by selecting equipment based on technical, regulatory, financial, and supplier considerations, and procuring it using one of the multiple forms of purchasing or agreements with suppliers. This incorporation process is generic enough to be used, with suitable adaptations, for a wide variety of health organizations with different sizes and acuity levels, ranging from health clinics to community hospitals to major teaching hospitals and even to entire health systems. Such a broadly applicable process is possible because it is based on a conceptual framework composed of in-depth analysis of the basic principles that govern each stage of technology lifecycle. Using this incorporation process, successful TIPs have been created and implemented, thereby contributing to the improvement of healthcare services and limiting the associated expenses. Table of Contents: Introduction / Conceptual Framework / The Incorporation Process / Discussion / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Managing Event Information: Modeling, Retrieval, and Applications

    Copyright Year: 2011

    Morgan and Claypool eBooks

    With the proliferation of citizen reporting, smart mobile devices, and social media, an increasing number of people are beginning to generate information about events they observe and participate in. A significant fraction of this information contains multimedia data to share the experience with their audience. A systematic information modeling and management framework is necessary to capture this widely heterogeneous, schemaless, potentially humongous information produced by many different people. This book is an attempt to examine the modeling, storage, querying, and applications of such an event management system in a holistic manner. It uses a semantic-web style graph-based view of events, and shows how this event model, together with its query facility, can be used toward emerging applications like semi-automated storytelling. Table of Contents: Introduction / Event Data Models / Implementing an Event Data Model / Querying Events / Storytelling with Events / An Emerging Application / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis and Visualization of Citation Networks

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Citation analysis—the exploration of reference patterns in the scholarly and scientific literature—has long been applied in a number of social sciences to study research impact, knowledge flows, and knowledge networks. It has important information science applications as well, particularly in knowledge representation and in information retrieval. Recent years have seen a burgeoning interest in citation analysis to help address research, management, or information service issues such as university rankings, research evaluation, or knowledge domain visualization. This renewed and growing interest stems from significant improvements in the availability and accessibility of digital bibliographic data (both citation and full text) and of relevant computer technologies. The former provides large amounts of data and the latter the necessary tools for researchers to conduct new types of large-scale citation analysis, even without privileged access to special data collections. Exciting new developments are thus emerging in many aspects of citation analysis. This book critically examines both theory and practical techniques of citation network analysis and visualization, one of the two main types of citation analysis (the other being evaluative citation analysis). To set the context for its main theme, the book begins with a discussion of the foundations of citation analysis in general, including an overview of what can and what cannot be done with citation analysis (Chapter 1). An in-depth examination of the generally accepted steps and procedures for citation network analysis follows, including the concepts and techniques that are associated with each step (Chapter 2). Individual issues that are particularly important in citation network analysis are then scrutinized, namely: field delineation and data sources for citation analysis (Chapter 3); disambiguation of names and references (Chapter 4); and visualization of citation networks (Chapter 5). Sufficient technical detail is provided in each chapter so that the book can serve as a practical how-to guide to conducting citation network analysis and visualization studies. While the discussion of most of the topics in this book applies to all types of citation analysis, the structure of the text and the details of procedures, examples, and tools covered here are geared to citation network analysis rather than evaluative citation analysis. This conscious choice was based on the authors’ observation that, compared to evaluative citation analysis, citation network analysis has not been covered nearly as well by dedicated books, despite the fact that it has not been subject to nearly as much severe criticism and has been substantially enriched in recent years with new theory and techniques from research areas such as network science, social network analysis, or information visualization. View full abstract»
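
    As a small illustration of the kind of object the book analyzes, the sketch below builds a toy citation network as a directed graph and computes two common indicators over it; the paper identifiers are hypothetical and the choice of indicators is ours, not the book's:

    ```python
    import networkx as nx

    # An edge (a, b) means "paper a cites paper b"; the identifiers are hypothetical.
    G = nx.DiGraph()
    G.add_edges_from([("P1", "P2"), ("P1", "P3"), ("P2", "P3"), ("P4", "P3"), ("P4", "P2")])

    print(dict(G.in_degree()))         # times cited, per paper
    print(nx.pagerank(G, alpha=0.85))  # a prestige-style indicator over the whole network
    ```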

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Signal Processing of Random Physiological Signals

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This lecture book is intended to be an accessible and comprehensive introduction to random signal processing with an emphasis on the real-world applications of biosignals. Although the material has been written and developed primarily for advanced undergraduate biomedical engineering students, it will also be of interest to engineers and interested biomedical professionals of any discipline seeking an introduction to the field. Within education, most biomedical engineering programs aim to provide the knowledge required of a graduate student, while undergraduate programs are geared toward designing circuits and evaluating only cardiac signals. Very few programs teach the processes with which to evaluate brainwave, sleep, respiratory sounds, heart valve sounds, electromyograms, electro-oculograms, or random signals acquired from the body. The primary goal of this lecture book is to help the reader understand the time- and frequency-domain processes which may be used to evaluate random physiological signals. A secondary goal is to learn the evaluation of actual mammalian data without spending most of the time writing software programs. This publication utilizes “DADiSP”, a digital signal processing software package from the DSP Development Corporation. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to the Finite Element Method in Electromagnetics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This series lecture is an introduction to the finite element method with applications in electromagnetics. The finite element method is a numerical method that is used to solve boundary-value problems characterized by a partial differential equation and a set of boundary conditions. The geometrical domain of a boundary-value problem is discretized using sub-domain elements, called the finite elements, and the differential equation is applied to a single element after it is brought to a “weak” integro-differential form. A set of shape functions is used to represent the primary unknown variable in the element domain. A set of linear equations is obtained for each element in the discretized domain. A global matrix system is formed after the assembly of all elements. This lecture is divided into two chapters. Chapter 1 describes one-dimensional boundary-value problems with applications to electrostatic problems described by Poisson's equation. The accuracy of the finite element method is evaluated for linear and higher order elements by computing the numerical error based on two different definitions. Chapter 2 describes two-dimensional boundary-value problems in the areas of electrostatics and electrodynamics (time-harmonic problems). For the second category, an absorbing boundary condition was imposed at the exterior boundary to simulate undisturbed wave propagation toward infinity. Computations of the numerical error were performed in order to evaluate the accuracy and effectiveness of the method in solving electromagnetic problems. Both chapters are accompanied by a number of Matlab codes which can be used by the reader to solve one- and two-dimensional boundary-value problems. These codes can be downloaded from the publisher's URL: www.morganclaypool.com/page/polycarpou This lecture is written primarily for the nonexpert engineer or the undergraduate or graduate student who wants to learn, for the first time, the finite element method with applications to electromagnetics. It is also targeted for research engineers who have knowledge of other numerical techniques and want to familiarize themselves with the finite element method. The lecture begins with the basics of the method, including formulating a boundary-value problem using a weighted-residual method and the Galerkin approach, and continues with imposing all three types of boundary conditions including absorbing boundary conditions. Another important topic of emphasis is the development of shape functions including those of higher order. In simple words, this series lecture provides the reader with all information necessary for someone to apply successfully the finite element method to one- and two-dimensional boundary-value problems in electromagnetics. It is suitable for newcomers in the field of finite elements in electromagnetics. View full abstract»
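
    For readers who want a feel for the method before opening the Matlab codes, here is a minimal sketch (in Python, not from the book) of the same workflow for the 1D Poisson problem: element-by-element assembly of a global matrix system from linear shape functions, followed by imposing Dirichlet boundary conditions and solving:

    ```python
    import numpy as np

    def fem_1d_poisson(f, n_elements=8):
        """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 using linear finite elements."""
        n_nodes = n_elements + 1
        x = np.linspace(0.0, 1.0, n_nodes)
        h = x[1] - x[0]
        K = np.zeros((n_nodes, n_nodes))                       # global stiffness matrix
        b = np.zeros(n_nodes)                                  # global load vector
        ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness matrix
        for e in range(n_elements):
            idx = [e, e + 1]
            K[np.ix_(idx, idx)] += ke                          # assemble the element contribution
            b[idx] += f(0.5 * (x[e] + x[e + 1])) * h / 2.0     # midpoint-rule element load
        u = np.zeros(n_nodes)
        u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], b[1:-1])      # impose the Dirichlet conditions
        return x, u

    x, u = fem_1d_poisson(lambda xm: 1.0)
    print(u.max())  # close to 0.125, the maximum of the exact solution u(x) = x(1 - x)/2
    ```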

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bad to the Bone: Crafting Electronic Systems with BeagleBone Black, Second Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    BeagleBone Black is a low-cost, open hardware computer uniquely suited to interact with sensors and actuators directly and over the Web. Introduced in April 2013 by BeagleBoard.org, a community of developers first established in early 2008, BeagleBone Black is used frequently to build vision-enabled robots, home automation systems, artistic lighting systems, and countless other do-it-yourself and professional projects. BeagleBone variants include the original BeagleBone and the newer BeagleBone Black, both hosting a powerful 32-bit, super-scalar ARM Cortex A8 processor capable of running numerous mobile and desktop-capable operating systems, typically variants of Linux including Debian, Android, and Ubuntu. Yet, BeagleBone is small enough to fit in a mint tin box. The "Bone" may be used in a wide variety of projects from middle school science fair projects to senior design projects to first prototypes of very complex systems. Novice users may access the power of the Bone through the user-friendly BoneScript software, which is used through a Web browser on most major operating systems, including Microsoft Windows, Apple Mac OS X, or the Linux operating systems. Seasoned users may take full advantage of the Bone's power using the underlying Linux-based operating system, a host of feature extension boards (Capes) and a wide variety of Linux community open source libraries. This book provides an introduction to this powerful computer and has been designed for a wide variety of users including the first time novice through the seasoned embedded system design professional. The book contains background theory on system operation coupled with many well-documented, illustrative examples. Examples for novice users are centered on motivational, fun robot projects while advanced projects follow the theme of assistive technology and image-processing applications. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    On-Chip Photonic Interconnects: A Computer Architect's Perspective

    Copyright Year: 2013

    Morgan and Claypool eBooks

    As the number of cores on a chip continues to climb, architects will need to address both bandwidth and power consumption issues related to the interconnection network. Electrical interconnects are not likely to scale well to a large number of processors for energy efficiency reasons, and the problem is compounded by the fact that there is a fixed total power budget for a die, dictated by the amount of heat that can be dissipated without special (and expensive) cooling and packaging techniques. Thus, there is a need to seek alternatives to electrical signaling for on-chip interconnection applications. Photonics, which has a fundamentally different mechanism of signal propagation, offers the potential to not only overcome the drawbacks of electrical signaling, but also enable the architect to build energy efficient, scalable systems. The purpose of this book is to introduce computer architects to the possibilities and challenges of working with photons and designing on-chip photonic interconnection networks. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Trustworthy Policies for Distributed Repositories

    Copyright Year: 2016

    Morgan and Claypool eBooks

    A trustworthy repository provides assurance in the form of management documents, event logs, and audit trails that digital objects are being managed correctly. The assurance includes plans for the sustainability of the repository, the accession of digital records, the management of technology evolution, and the mitigation of the risk of data loss. A detailed assessment is provided by the ISO-16363:2012 standard, "Space data and information transfer systems—Audit and certification of trustworthy digital repositories." This book examines whether the ISO specification for trustworthiness can be enforced by computer actionable policies. An implementation of the policies is provided and the policies are sorted into categories for procedures to manage externally generated documents, specify repository parameters, specify preservation metadata attributes, specify audit mechanisms for all preservation actions, specify control of preservation operations, and control preservation properties as technology evolves. An application of the resulting procedures is made to enforce trustworthiness within National Science Foundation data management plans. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Intelligent Systems in Traffic and Transportation

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Urban mobility is not only one of the pillars of modern economic systems, but also a key issue in the quest for equality of opportunity, since it can improve access to other services. Currently, however, there are a number of negative issues related to traffic, especially in mega-cities, such as economic issues (opportunity costs caused by delays), environmental issues (externalities related to emissions of pollutants), and social issues (traffic accidents). Solutions to these issues are more and more closely tied to information and communication technology. Indeed, a search in the technical literature (using the keyword "urban traffic" to filter out articles on data network traffic) retrieved the following number of articles (as of December 3, 2013): 9,443 (ACM Digital Library), 26,054 (Scopus), and 1,730,000 (Google Scholar). Moreover, articles listed in the ACM query relate to conferences as diverse as MobiCom, CHI, PADS, and AAMAS. This means that there is a big and diverse community of computer scientists and computer engineers who tackle research that is connected to the development of intelligent traffic and transportation systems. It is also possible to see that this community is growing, and that research projects are getting more and more interdisciplinary. To foster the cooperation among the involved communities, this book aims at giving a broad introduction to the basic but relevant concepts related to transportation systems, targeting researchers and practitioners from computer science and information technology. In addition, the second part of the book gives a panorama of some of the most exciting and newest technologies, originating in computer science and computer engineering, that are now being employed in projects related to car-to-car communication, interconnected vehicles, car navigation, platooning, crowd sensing and sensor networks, among others. This material will also be of interest to engineers and researchers from the traffic and transportation community. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Web Page Recommendation Models: Theory and Algorithms

    Copyright Year: 2010

    Morgan and Claypool eBooks

    One of the application areas of data mining is the World Wide Web (WWW or Web), which serves as a huge, widely distributed, global information service for every kind of information such as news, advertisements, consumer information, financial management, education, government, e-commerce, health services, and many other information services. The Web also contains a rich and dynamic collection of hyperlink information, Web page access and usage information, providing sources for data mining. The amount of information on the Web is growing rapidly, as well as the number of Web sites and Web pages per Web site. Consequently, it has become more difficult to find relevant and useful information for Web users. Web usage mining is concerned with guiding the Web users to discover useful knowledge and supporting them for decision-making. In that context, predicting the needs of a Web user as she visits Web sites has gained importance. The requirement for predicting user needs in order to guide the user in a Web site and improve the usability of the Web site can be addressed by recommending pages to the user that are related to the interest of the user at that time. This monograph gives an overview of the research in the area of discovering and modeling the users' interest in order to recommend related Web pages. The Web page recommender systems studied in this monograph are categorized according to the data mining algorithms they use for recommendation. Table of Contents: Introduction to Web Page Recommender Systems / Preprocessing for Web Page Recommender Models / Pattern Extraction / Evaluation Metrics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Active Learning

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The key idea behind active learning is that a machine learning algorithm can perform better with less training if it is allowed to choose the data from which it learns. An active learner may pose "queries," usually in the form of unlabeled data instances to be labeled by an "oracle" (e.g., a human annotator) that already understands the nature of the problem. This sort of approach is well-motivated in many modern machine learning and data mining applications, where unlabeled data may be abundant or easy to come by, but training labels are difficult, time-consuming, or expensive to obtain. This book is a general introduction to active learning. It outlines several scenarios in which queries might be formulated, and details many query selection algorithms which have been organized into four broad categories, or "query selection frameworks." We also touch on some of the theoretical foundations of active learning, and conclude with an overview of the strengths and weaknesses of these approaches in practice, including a summary of ongoing work to address these open challenges and opportunities. Table of Contents: Automating Inquiry / Uncertainty Sampling / Searching Through the Hypothesis Space / Minimizing Expected Error and Variance / Exploiting Structure in Data / Theory / Practical Considerations View full abstract»
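
    A minimal sketch of uncertainty sampling, the simplest query selection framework named in the Table of Contents; the toy probabilistic classifier and the pool of unlabeled instances are invented for illustration:

    ```python
    import numpy as np

    def least_confident_query(predict_proba, unlabeled_pool):
        """Return the index of the pool instance whose most probable label is least certain."""
        probs = predict_proba(unlabeled_pool)    # shape: (n_instances, n_classes)
        confidence = probs.max(axis=1)
        return int(np.argmin(confidence))

    def toy_proba(pool):
        """A toy binary classifier over 1-D inputs: a sigmoid centered at x = 0.5."""
        p1 = 1.0 / (1.0 + np.exp(-10.0 * (np.asarray(pool) - 0.5)))
        return np.column_stack([1.0 - p1, p1])

    pool = [0.05, 0.48, 0.90, 0.52]
    print(least_confident_query(toy_proba, pool))  # an instance near the decision boundary
    ```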

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Communication Networks: A Concise Introduction

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book results from many years of teaching an upper division course on communication networks in the EECS department at the University of California, Berkeley. It is motivated by the perceived need for an easily accessible textbook that puts emphasis on the core concepts behind current and next generation networks. After an overview of how today's Internet works and a discussion of the main principles behind its architecture, we discuss the key ideas behind Ethernet, WiFi networks, routing, internetworking and TCP. To make the book as self-contained as possible, brief discussions of probability and Markov chain concepts are included in the appendices. This is followed by a brief discussion of mathematical models that provide insight into the operations of network protocols. Next, the main ideas behind the new generation of wireless networks based on WiMAX and LTE, and the notion of QoS are presented. A concise discussion of the physical layer technologies underlying various networks is also included. Finally, a sampling of topics is presented that may have significant influence on the future evolution of networks including overlay networks like content delivery and peer-to-peer networks, sensor networks, distributed algorithms, Byzantine agreement and source compression. Table of Contents: The Internet / Principles / Ethernet / WiFi / Routing / Internetworking / Transport / Models / WiMAX & LTE / QOS / Physical Layer / Additional Topics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Capstone Design Courses, Part Two: Preparing Biomedical Engineers for the Real World

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The biomedical engineering senior capstone design course is probably the most important course taken by undergraduate biomedical engineering students. It provides them with the opportunity to apply what they have learned in previous years, develop their communication, teamwork, project management, and design skills, and learn about the product development process. It prepares students for professional practice and serves as a preview of what it will be like to work as a biomedical engineer. The capstone design experience can change the way engineering students think about technology, themselves, society, and the world around them. It can make them aware of their potential to make a positive contribution to healthcare throughout the world and generate excitement for, and pride in, the engineering profession. Ideas for how to organize, structure, and manage a senior capstone design course for biomedical and other engineering students are presented here. These ideas will be helpful to faculty who are creating a new design course, expanding a current design program, or just looking for some ideas for improving an existing course. The better we can make these courses, the more "industry ready" our students will be, and the better prepared they will be for meaningful, successful careers in biomedical engineering. This book is the second part of a series covering Capstone Design Courses for biomedical engineers. Part I is available online here and in print (ISBN 9781598292923) and covers the following topics: Purpose, Goals, and Benefits; Designing a Course to Meet Student Needs; Enhancing the Capstone Design Courses; Meeting the Changing Needs of Future Engineers. Table of Contents: The Myth of the "Industry-Ready" Engineer / Recent Trends and the Current State of Capstone Design / Preparing Students for Capstone Design / Helping Students Recognize the Value of Capstone Design Courses / Developing Teamwork Skills / Incorporating Design Controls / Learning to Identify Problems, Unmet Needs, and New Product Opportunities / Design Verification and Validation / Liability Issues with Assistive Technology Projects / Standards in Capstone Design Courses and the Engineering Curriculum / Design Transfer and Design for Manufacturability / Learning from other Engineering Disciplines: Capstone Design Conferences / Maintaining a Relevant, Up-to-Date Capstone Design Course / Active Learning in Capstone Design Courses / Showcasing Student Projects: National Student Design Competitions / Managing Student Expectations of the "Real World" / Career Management and Professional Development / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Surface Computing and Collaborative Analysis Work

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personnel face in securing networks from attackers, and intelligence analysts encounter when analyzing intelligence data. Both of these activities are becoming increasingly collaborative endeavors, and there are huge opportunities for improving collaboration by leveraging surface computing. This work highlights for interaction designers and software developers the particular challenges and opportunities presented by interaction with surfaces. We have reviewed hundreds of recent research papers, and report on advancements in the fields of surface-enabled collaborative analytic work, interactive techniques for surface technologies, and useful theory that can provide direction to interaction design work. We also offer insight into issues that arise when developing applications for multi-touch surfaces derived from our own experiences creating collaborative applications. We present these insights at a level appropriate for all members of the software design and development team. Table of Contents: List of Figures / Acknowledgments / Figure Credits / Purpose and Direction / Surface Technologies and Collaborative Analysis Systems / Interacting with Surface Technologies / Collaborative Work Enabled by Surfaces / The Theory and the Design of Surface Applications / The Development of Surface Applications / Concluding Comments / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Global Warming and the Future of the Earth

    Copyright Year: 2007

    Morgan and Claypool eBooks

    The globally averaged surface temperature of the Earth has increased during the past century by about 0.7°C. Most of the increase can be attributed to the greenhouse effect, the increase in the atmospheric concentration of carbon dioxide that is emitted when fossil fuels are burned to produce energy. The book begins with the important distinction between weather and climate, followed by data showing how carbon dioxide has increased and the incontrovertible evidence that it is caused by burning fossil fuels (i.e., coal, oil, and natural gas). I also address the inevitable skepticism that global warming arouses and offer a number of responses to the global warming skeptics. After dealing with the skeptics, I analyze both the current and future effects of global warming. These future effects are based on scenarios or “storylines” put forth by the International Institute for Applied Systems Analysis. In closing, I address the controversial (and grim) suggestion that we have already passed the “tipping point,” which is the time after which, regardless of our future actions, global warming will cause considerable hardship on human society. I intend this book to be approachable for all concerned citizens, but especially students of the sciences and engineering who will soon be in a position to make a difference in the areas of energy and the environment. I have tried to frame the debate in terms of what the engineering community must do to help combat global warming. We have no choice but to think in terms of global environmental constraints as we design new power plants, factories, automobiles, buildings, and homes. The best thing for scientists to do is to present what we know, clearly separating what is known from what is suspected, in a non-apocalyptic manner. If matters are clearly and passionately presented to the public, we must be prepared to accept the will of the people. This presents the scientific community with an enormous responsibility, perhaps unlike any we have had in the past. Contents: Weather and Climate (and a Little History) / Are the Concentrations of Greenhouse Gases in the Atmosphere Increasing? / The Greenhouse Effect and the Evidence of Global Warming / The Skeptics: Are Their Doubts Scientifically Valid? / Impacts: The "So What" Question / The Bottom Line View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bridging the Gap Between Engineering and the Global World: A Case Study of the Coconut (Coir) Fiber Industry in Kerala, India

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Over the last two decades, globalization has had a profound impact on how we view the world and its sustainability. One group of professionals that lies at the heart of sustainability is the engineers. Engineers are trained problem solvers, required to implement technical solutions and are at the forefront of the development of new technologies. Although engineers play a critical role in sustainability, traditional engineering programs typically only focus on the technocentric and ecocentric dimensions of sustainability, providing little training on the sociocentric dimension. With more and more interest in sustainability, it is becoming increasingly important to also provide engineers with an awareness of sociocentric issues and the necessary skills to address them. The aim of this book is to provide engineering educators with a real-life case study that can be brought into existing courses to help bridge the gap between engineering and the global world. The case study focuses on how our engineering study of different natural plant fibers for soil erosion control led us to small villages in Kerala, India, where marginalized women workers often stand waist deep in water several hours a day, clean and beat coconuts by hand, and separate and spin coconut (coir) fibers into yarn by hand, for very low wages. The case study provides insight into the three dimensions of sustainability (technocentric, ecocentric, and sociocentric) and how they come together in a typical engineering problem. Table of Contents: Reinforcing the Classroom / Natural Plant Fibers for Engineering Applications: Technocentric and Ecocentric Dimensions of Sustainability / The Coir Fiber Industry in Kerala, India: Sociocentric Dimension of Sustainability / Case Study / Conclusion / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Light Field Sampling

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Light field is one of the most representative image-based rendering techniques that generate novel virtual views from images instead of 3D models. The light field capture and rendering process can be considered as a procedure of sampling the light rays in the space and interpolating those rays to form novel views. As a result, light field can be studied as a high-dimensional signal sampling problem, which has attracted a lot of research interest and become a convergence point between computer graphics and signal processing, and even computer vision. This lecture focuses on answering two questions regarding light field sampling, namely how many images are needed for a light field, and if that number is limited, where we should capture them. The book can be divided into three parts. First, we give a complete analysis of uniform sampling of IBR data. By introducing the surface plenoptic function, we are able to analyze the Fourier spectrum of non-Lambertian and occluded scenes. Given the spectrum, we also apply the generalized sampling theorem on the IBR data, which results in better rendering quality than rectangular sampling for complex scenes. Such uniform sampling analysis provides general guidelines on how the images in IBR should be taken. For instance, it shows that non-Lambertian and occluded scenes often require a higher sampling rate. Next, we describe a very general sampling framework named freeform sampling. Freeform sampling handles three kinds of problems: sample reduction, minimum sampling rate to meet an error requirement, and minimization of reconstruction error given a fixed number of samples. When the to-be-reconstructed function values are unknown, freeform sampling becomes active sampling. Algorithms of active sampling are developed for light field and show better results than the traditional uniform sampling approach. Third, we present a self-reconfigurable camera array that we developed, which features a very efficient algorithm for real-time rendering and the ability to automatically reconfigure the cameras to improve the rendering quality. Both are based on active sampling. Our camera array is able to render dynamic scenes interactively at high quality. To the best of our knowledge, it is the first camera array that can reconfigure the camera positions automatically. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Block Transceivers: OFDM and Beyond

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The demand for data traffic over mobile communication networks has substantially increased during the last decade. As a result, these mobile broadband devices use the available spectrum intensively, requiring the search for new technologies. In transmissions where the channel presents a frequency-selective behavior, multicarrier modulation (MCM) schemes have proven to be more efficient, in terms of spectral usage, than conventional modulations and spread spectrum techniques. The orthogonal frequency-division multiplexing (OFDM) is the most popular MCM method, since it not only increases spectral efficiency but also yields simple transceivers. All OFDM-based systems, including the single-carrier with frequency-domain equalization (SC-FD), transmit redundancy in order to cope with the problem of interference among symbols. This book presents OFDM-inspired systems that are able to, at most, halve the amount of redundancy used by OFDM systems while keeping the computational complexity comparable. Such systems, herein called memoryless linear time-invariant (LTI) transceivers with reduced redundancy, require low-complexity arithmetical operations and fast algorithms. In addition, whenever the block transmitter and receiver have memory and/or are linear time-varying (LTV), it is possible to reduce the redundancy in the transmission even further, as also discussed in this book. For the transceivers with memory it is possible to eliminate the redundancy at the cost of making the channel equalization more difficult. Moreover, when time-varying block transceivers are also employed, then the amount of redundancy can be as low as a single symbol per block, regardless of the size of the channel memory. With the techniques presented in the book it is possible to address what lies beyond the use of OFDM-related solutions in broadband transmissions. Table of Contents: The Big Picture / Transmultiplexers / OFDM / Memoryless LTI Transceivers with Reduced Redundancy / FIR LTV Transceivers with Reduced Redundancy View full abstract»
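
    As an illustration of the redundancy the book sets out to reduce, the sketch below simulates the classical OFDM transceiver: a block IFFT, a cyclic prefix as the transmitted redundancy, and one-tap per-subcarrier equalization at the receiver (block length, prefix length, and channel taps are arbitrary choices, and the code is not taken from the book):

    ```python
    import numpy as np

    def ofdm_tx(symbols, cp_len):
        """Modulate one block onto orthogonal subcarriers and prepend a cyclic prefix."""
        x = np.fft.ifft(symbols)
        return np.concatenate([x[-cp_len:], x])   # the prefix is the transmitted redundancy

    def ofdm_rx(received, channel, cp_len, n):
        """Drop the prefix and equalize each subcarrier with a single complex division."""
        y = np.convolve(received, channel)[cp_len:cp_len + n]
        return np.fft.fft(y) / np.fft.fft(channel, n)

    n, cp = 8, 3
    symbols = np.exp(1j * np.pi / 2 * np.random.randint(0, 4, n))  # QPSK symbols
    h = np.array([1.0, 0.4, 0.2])                                  # frequency-selective channel
    recovered = ofdm_rx(ofdm_tx(symbols, cp), h, cp, n)
    print(np.allclose(recovered, symbols))  # True: the prefix removes inter-block interference
    ```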

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Communication and Agreement Abstractions for Fault-Tolerant Asynchronous Distributed Systems

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Understanding distributed computing is not an easy task. This is due to the many facets of uncertainty one has to cope with and master in order to produce correct distributed software. Considering the uncertainty created by asynchrony and process crash failures in the context of message-passing systems, the book focuses on the main abstractions that one has to understand and master in order to be able to produce software with guaranteed properties. These fundamental abstractions are communication abstractions that allow the processes to communicate consistently (namely the register abstraction and the reliable broadcast abstraction), and the consensus agreement abstraction that allows them to cooperate despite failures. As they give a precise meaning to the words "communicate" and "agree" despite asynchrony and failures, these abstractions allow distributed programs to be designed with properties that can be stated and proved. Impossibility results are associated with these abstractions. Hence, in order to circumvent these impossibilities, the book relies on the failure detector approach, and, consequently, that approach to fault-tolerance is central to the book. Table of Contents: List of Figures / The Atomic Register Abstraction / Implementing an Atomic Register in a Crash-Prone Asynchronous System / The Uniform Reliable Broadcast Abstraction / Uniform Reliable Broadcast Abstraction Despite Unreliable Channels / The Consensus Abstraction / Consensus Algorithms for Asynchronous Systems Enriched with Various Failure Detectors / Constructing Failure Detectors View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High-Level Structures for Quantum Computing

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book is concerned with the models of quantum computation. Information processing based on the rules of quantum mechanics provides us with new opportunities for developing more efficient algorithms and protocols. However, to harness the power offered by quantum information processing it is essential to control the behavior of quantum mechanical objects in a precise manner. As this seems to be conceptually difficult at the level of quantum states and unitary gates, high-level quantum programming languages have been proposed for this purpose. The aim of this book is to provide an introduction to abstract models of computation used in quantum information theory. Starting from the abstract models of Turing machine and finite automata, we introduce the models of Boolean circuits and Random Access Machine and use them to present quantum programming techniques and quantum programming languages. Table of Contents: Introduction / Turing machines / Quantum Finite State Automata / Computational Circuits / Random Access Machines / Quantum Programming Environment / Quantum Programming Languages / Imperative quantum programming / Functional Quantum Programming / Outlook View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Protection from Insider Threats

    Copyright Year: 2012

    Morgan and Claypool eBooks

    As data represent a key asset for today's organizations, the problem of how to protect this data from theft and misuse is at the forefront of these organizations' minds. Even though today several data security techniques are available to protect data and computing infrastructures, many such techniques -- such as firewalls and network security tools -- are unable to protect data from attacks posed by those working on an organization's "inside." These "insiders" usually have authorized access to relevant information systems, making it extremely challenging to block the misuse of information while still allowing them to do their jobs. This book discusses several techniques that can provide effective protection against attacks posed by people working on the inside of an organization. Chapter One introduces the notion of insider threat and reports some data about data breaches due to insider threats. Chapter Two covers authentication and access control techniques, and Chapter Three shows how these general security techniques can be extended and used in the context of protection from insider threats. Chapter Four addresses anomaly detection techniques that are used to determine anomalies in data accesses by insiders. These anomalies are often indicative of potential insider data attacks and therefore play an important role in protection from these attacks. Security information and event management (SIEM) tools and fine-grained auditing are discussed in Chapter Five. These tools aim at collecting, analyzing, and correlating -- in real-time -- any information and event that may be relevant for the security of an organization. As such, they can be a key element in finding a solution to such undesirable insider threats. Chapter Six goes on to provide a survey of techniques for separation-of-duty (SoD). SoD is an important principle that, when implemented in systems and tools, can strengthen data protection from malicious insiders. However, to date, very few approaches have been proposed for implementing SoD in systems. In Chapter Seven, a short survey of a commercial product is presented, which provides different techniques for protection from malicious users with system privileges -- such as a DBA in database management systems. Finally, in Chapter Eight, the book concludes with a few remarks and additional research directions. Table of Contents: Introduction / Authentication / Access Control / Anomaly Detection / Security Information and Event Management and Auditing / Separation of Duty / Case Study: Oracle Database Vault / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Design of Reconfigurable Antennas Using Graph Models

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This lecture discusses the use of graph models to represent reconfigurable antennas. The rise of antennas that adapt to their environment and change their operation based on the user's request has not been met with clear design guidelines. There is a need to propose some rules for the optimization of any reconfigurable antenna design and performance. Since reconfigurable antennas are seen as a collection of self-organizing parts, graph models can be introduced to relate each possible topology to a corresponding electromagnetic performance in terms of achieving a characteristic frequency of operation, impedance, and polarization. These models help designers understand reconfigurable antenna structures and enhance their functionality since they transform antennas from bulky devices into mathematical and software accessible models. The use of graphs facilitates the software control and cognition ability of reconfigurable antennas while optimizing their performance. This lecture also discusses the reduction of redundancy, complexity and reliability of reconfigurable antennas and reconfigurable antenna arrays. The full analysis of these parameters allows a better reconfigurable antenna implementation in wireless and space communications platforms. The use of graph models to reduce the complexity while preserving the reliability of reconfigurable antennas allows better incorporation in applications such as cognitive radio, MIMO, satellite communications, and personal communication systems. A swifter response time is achieved with lower cost and losses. This lecture is written for individuals who wish to venture into the field of reconfigurable antennas, with a little prior experience in this area, and learn how graph rules and theory, mainly used in the fields of computer science, networking, and control systems, can be applied to electromagnetic structures. This lecture will walk the reader through a design and analysis process of reconfigurable antennas using graph models with a practical and theoretical outlook. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    GPU-Based Techniques for Global Illumination Effects

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. The book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make the book self-contained, the most important concepts of local illumination and global illumination rendering, graphics hardware, and Direct3D/HLSL programming are reviewed in the first chapters. After these introductory chapters we warm up with simple methods including shadow and environment mapping, then we move on toward advanced concepts aiming at global illumination rendering. Since it would have been impossible to give a rigorous review of all approaches proposed in this field, we go into the details of just a few methods solving each particular global illumination effect. However, a short discussion of the state of the art and links to the bibliography are also provided to refer the interested reader to techniques that are not detailed in this book. The implementation of the selected methods is also presented in HLSL, and we discuss their observed performance, merits, and disadvantages. In the last chapter, we also review how these techniques can be integrated in an advanced game engine and present case studies of their exploitation in games. Having gone through this book, the reader will have an overview of the state of the art, will be able to apply and improve these techniques, and most importantly, will be capable of developing brand new GPU algorithms. Table of Contents: Global Illumination Rendering / Local Illumination Rendering Pipeline of GPUs / Programming and Controlling GPUs / Simple Improvements of the Local Illumination Model / Ray Casting on the GPU / Specular Effects with Rasterization / Diffuse and Glossy Indirect Illumination / Pre-computation Aided Global Illumination / Participating Media Rendering / Fake Global Illumination / Postprocessing Effects / Integrating GI Effects in Games and Virtual Reality Systems / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Arabic Natural Language Processing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book provides system developers and researchers in natural language processing and computational linguistics with the necessary background information for working with the Arabic language. The goal is to introduce Arabic linguistic phenomena and review the state-of-the-art in Arabic processing. The book discusses Arabic script, phonology, orthography, morphology, syntax and semantics, with a final chapter on machine translation issues. The chapter sizes correspond more or less to what is linguistically distinctive about Arabic, with morphology getting the lion's share, followed by Arabic script. No previous knowledge of Arabic is needed. This book is designed for computer scientists and linguists alike. The focus of the book is on Modern Standard Arabic; however, notes on practical issues related to Arabic dialects and languages written in the Arabic script are presented in different chapters. Table of Contents: What is "Arabic"? / Arabic Script / Arabic Phonology and Orthography / Arabic Morphology / Computational Morphology Tasks / Arabic Syntax / A Note on Arabic Semantics / A Note on Arabic and Machine Translation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Query Processing over Uncertain Databases

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Due to measurement errors, transmission loss, or injected noise for privacy protection, uncertainty exists in the data of many real applications. However, query processing techniques for deterministic data cannot be directly applied to uncertain data because they do not have mechanisms to handle data uncertainty. Therefore, efficient and effective manipulation of uncertain data is a practical yet challenging research topic. In this book, we start from the data models for imprecise and uncertain data, move on to defining different semantics for queries on uncertain data, and finally discuss the advanced query processing techniques for various probabilistic queries in uncertain databases. The book serves as a comprehensive guideline for query processing over uncertain databases. Table of Contents: Introduction / Uncertain Data Models / Spatial Query Semantics over Uncertain Data Models / Spatial Query Processing over Uncertain Databases / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    iRODS Primer: Integrated Rule-Oriented Data System

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Policy-based data management enables the creation of community-specific collections. Every collection is created for a purpose. The purpose defines the set of properties that will be associated with the collection. The properties are enforced by management policies that control the execution of procedures that are applied whenever data are ingested or accessed. The procedures generate state information that defines the outcome of enforcing the management policy. The state information can be queried to validate assessment criteria and verify that the required collection properties have been conserved. The integrated Rule-Oriented Data System implements the data management framework required to support policy-based data management. Policies are turned into computer actionable Rules. Procedures are composed from a Micro-service-oriented architecture. The result is a highly extensible and tunable system that can enforce management policies, automate administrative tasks, and periodically validate assessment criteria. Table of Contents: Introduction / Integrated Rule-Oriented Data System / iRODS Architecture / Rule-Oriented Programming / The iRODS Rule System / iRODS Micro-services / Example Rules / Extending iRODS / Appendix A: iRODS Shell Commands / Appendix B: Rulegen Grammar / Appendix C: Exercises / Author Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Making Claims: The Claim as a Knowledge Design, Capture, and Sharing Tool in HCI

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Human-centered informatics (HCI) is a young discipline that is still defining its core components, with approaches rooted in engineering, science, and creative design. In the spirit of this book series, this book explores HCI as an intersection point for different perspectives of computing and information technology, seeking to understand how groups of designers can communicate with an increasingly diverse set of colleagues on a broadening set of problems. In so doing, this book traces the evolution of claims as a way to capture and share knowledge, particularly in comparison to other approaches like patterns and issues. Claims can be a centrally important aspect in HCI design efforts, either consciously by targeted design techniques or through ingrained habits of experienced designers. An examination of claims, their uses in design, and the possibilities for explicit use in future collaborative design endeavors seeks to inspire their further development and use in HCI design. Table of Contents: What are Claims? / Knowing and Sharing / Evolution of Claims / Using Claims / Looking Forward View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Duplicate Detection

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With the ever-increasing volume of data, data quality problems abound. Multiple, yet different representations of the same real-world objects in data, duplicates, are one of the most intriguing data quality problems. The effects of such duplicates are detrimental; for instance, bank customers can obtain duplicate identities, inventory levels are monitored incorrectly, catalogs are mailed multiple times to the same household, etc. Automatically detecting duplicates is difficult: First, duplicate representations are usually not identical but slightly differ in their values. Second, in principle all pairs of records should be compared, which is infeasible for large volumes of data. This lecture examines closely the two main components to overcome these difficulties: (i) Similarity measures are used to automatically identify duplicates when comparing two records. Well-chosen similarity measures improve the effectiveness of duplicate detection. (ii) Algorithms are developed to perform on very large volumes of data in search for duplicates. Well-designed algorithms improve the efficiency of duplicate detection. Finally, we discuss methods to evaluate the success of duplicate detection. Table of Contents: Data Cleansing: Introduction and Motivation / Problem Definition / Similarity Functions / Duplicate Detection Algorithms / Evaluating Detection Success / Conclusion and Outlook / Bibliography View full abstract»
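    To make the two components concrete (a minimal sketch, not the lecture's algorithms): a token-based Jaccard similarity stands in for the similarity measure, and a crude blocking key stands in for an efficiency-oriented algorithm, so that only records sharing the key are compared rather than all pairs. The records, key definition, and threshold below are hypothetical.

```python
# Minimal sketch (illustrative only): Jaccard token similarity plus blocking.
from collections import defaultdict

def jaccard(a: str, b: str) -> float:
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 1.0

def block_key(rec: str) -> str:
    # First three letters of the last token of the name field (hypothetical key).
    return rec.split(",")[0].split()[-1][:3].lower()

records = [
    "John A. Smith, 12 Main Street",
    "Jon Smith, 12 Main St",
    "Mary Jones, 5 Oak Avenue",
]

blocks = defaultdict(list)
for r in records:
    blocks[block_key(r)].append(r)

# Compare only within blocks instead of all pairs of records.
for group in blocks.values():
    for i in range(len(group)):
        for j in range(i + 1, len(group)):
            sim = jaccard(group[i], group[j])
            if sim > 0.3:  # hypothetical decision threshold
                print(f"candidate duplicate ({sim:.2f}): {group[i]!r} ~ {group[j]!r}")
```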

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Generating Plans from Proofs: The Interpolation-based Approach to Query Reformulation

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Query reformulation refers to a process of translating a source query—a request for information in some high-level logic-based language—into a target plan that abides by certain interface restrictions. Many practical problems in data management can be seen as instances of the reformulation problem. Examples include the problem of translating an SQL query written over a set of base tables into another query written over a set of views; the problem of implementing a query via translating to a program calling a set of database APIs; and the problem of implementing a query using a collection of web services. In this book we approach query reformulation in a very general setting that encompasses all the problems above, by relating it to a line of research within mathematical logic. For many decades logicians have looked at the problem of converting "implicit definitions" into "explicit definitions," using an approach known as interpolation. We will review the theory of interpolation, and explain its close connection with query reformulation. We will give a detailed look at how the interpolation-based approach is used to generate translations between logic-based queries over different vocabularies, and also how it can be used to go from logic-based queries to programs. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Distributed Computing by Oblivious Mobile Robots

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The study of what can be computed by a team of autonomous mobile robots, originally started in robotics and AI, has become increasingly popular in theoretical computer science (especially in distributed computing), where it is now an integral part of the investigations on computability by mobile entities. The robots are identical computational entities located and able to move in a spatial universe; they operate without explicit communication and are usually unable to remember the past; they are extremely simple, with limited resources, and individually quite weak. However, collectively the robots are capable of performing complex tasks, and form a system with desirable fault-tolerant and self-stabilizing properties. The research has been concerned with the computational aspects of such systems. In particular, the focus has been on the minimal capabilities that the robots should have in order to solve a problem. This book focuses on the recent algorithmic results in the field of distributed computing by oblivious mobile robots (unable to remember the past). After introducing the computational model with its nuances, we focus on basic coordination problems: pattern formation, gathering, scattering, leader election, as well as on dynamic tasks such as flocking. For each of these problems, we provide a snapshot of the state of the art, reviewing the existing algorithmic results. In doing so, we outline solution techniques, and we analyze the impact of the different assumptions on the robots' computability power. Table of Contents: Introduction / Computational Models / Gathering and Convergence / Pattern Formation / Scatterings and Coverings / Flocking / Other Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Control in Power Electronics, 2nd Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book presents the reader, whether an electrical engineering student in power electronics or a design engineer, with a selection of power converter control problems and their basic digital solutions, based on the most widespread digital control techniques. The presentation is primarily focused on different applications of the same power converter topology, the half-bridge voltage source inverter, considered both in its single- and three-phase implementation. This is chosen as the test case because, besides being simple and well known, it allows the discussion of a significant spectrum of the most frequently encountered digital control applications in power electronics, from digital pulse width modulation (DPWM) and space vector modulation (SVM), to inverter output current and voltage control, ending with the relatively more complex VSI applications related to the so-called smart-grid scenario. This book aims to serve two purposes: (1) to give a basic, introductory knowledge of the digital control techniques applied to power converters; and (2) to raise the interest for discrete time control theory, stimulating new developments in its application to switching power converters. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modeling Bipolar Power Semiconductor Devices

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book presents physics-based models of bipolar power semiconductor devices and their implementation in MATLAB and Simulink. The devices are subdivided into different regions, and the operation in each region, along with the interactions at the interfaces, is analyzed using the basic semiconductor physics equations that govern device behavior. The Fourier series solution is used to solve the ambipolar diffusion equation in the lightly doped drift region of the devices. In addition to the external electrical characteristics, internal physical and electrical information, such as the junction voltages and the carrier distribution in different regions of the device, can be obtained using the models. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Research Infrastructures for Hardware Accelerators

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Hardware acceleration in the form of customized datapath and control circuitry tuned to specific applications has gained popularity for its promise to utilize transistors more efficiently. Historically, the computer architecture community has focused on general-purpose processors, and extensive research infrastructure has been developed to support research efforts in this domain. Envisioning future computing systems with a diverse set of general-purpose cores and accelerators, computer architects must add accelerator-related research infrastructures to their toolboxes to explore future heterogeneous systems. This book serves as a primer for the field, as an overview of the vast literature on accelerator architectures and their design flows, and as a resource guidebook for researchers working in related areas. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics provides a comprehensive tutorial of the most widely used method for solving Maxwell's equations -- the Finite Difference Time-Domain Method. This book is an essential guide for students, researchers, and professional engineers who want to gain a fundamental knowledge of the FDTD method. It can accompany an undergraduate or entry-level graduate course or be used for self-study. The book provides all the background required to either research or apply the FDTD method for the solution of Maxwell's equations to practical problems in engineering and science. Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics guides the reader through the foundational theory of the FDTD method starting with the one-dimensional transmission-line problem and then progressing to the solution of Maxwell's equations in three dimensions. It also provides step-by-step guides to modeling physical sources, lumped-circuit components, absorbing boundary conditions, perfectly matched layer absorbers, and sub-cell structures. Post processing methods such as network parameter extraction and far-field transformations are also detailed. Efficient implementations of the FDTD method in a high-level language are also provided. Table of Contents: Introduction / 1D FDTD Modeling of the Transmission Line Equations / Yee Algorithm for Maxwell's Equations / Source Excitations / Absorbing Boundary Conditions / The Perfectly Matched Layer (PML) Absorbing Medium / Subcell Modeling / Post Processing View full abstract»
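    A minimal sketch of the kind of one-dimensional update scheme the book starts from, in normalized units and with illustrative parameters (not code from the text): electric and magnetic field samples are staggered in space and time and advanced with a leapfrog update, here driven by a Gaussian soft source.

```python
# Minimal 1D FDTD sketch (free space, normalized units, illustrative parameters).
import numpy as np

nx, nt = 200, 400          # grid cells, time steps
ez = np.zeros(nx)          # electric field samples
hy = np.zeros(nx - 1)      # magnetic field samples, staggered half a cell
courant = 0.5              # normalized Courant number (<= 1 for stability)

for n in range(nt):
    # Update H from the spatial difference of E (staggered in space and time).
    hy += courant * (ez[1:] - ez[:-1])
    # Update E from the spatial difference of H (fixed, reflecting boundaries).
    ez[1:-1] += courant * (hy[1:] - hy[:-1])
    # Gaussian pulse soft source near the left edge of the grid.
    ez[20] += np.exp(-((n - 60) / 15.0) ** 2)

print("peak |Ez| after propagation:", float(np.abs(ez).max()))
```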

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Libraries Applications: CBIR, Education, Social Networks, eScience/Simulation, and GIS

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Digital libraries (DLs) have evolved since their launch in 1991 into an important type of information system, with widespread application. This volume advances that trend further by describing new research and development in the DL field that builds upon the 5S (Societies, Scenarios, Spaces, Structures, Streams) framework, which is discussed in three other DL volumes in this series. While the 5S framework may be used to describe many types of information systems, and is likely to have even broader utility and appeal, we focus here on digital libraries. Drawing upon six (Akbar, Kozievitch, Leidig, Li, Murthy, Park) completed and two (Chen, Fouh) in-process dissertations, as well as the efforts of collaborating researchers, and scores of related publications, presentations, tutorials, and reports, this book demonstrates the applicability of 5S in five digital library application areas that also have importance in the context of the WWW, Web 2.0, and innovative information systems. By integrating surveys of the state of the art, new research, connections with formalization, case studies, and exercises/projects, this book can serve as a textbook for those interested in computing, information, and/or library science. Chapter 1 focuses on images, explaining how they connect with information retrieval, in the context of CBIR systems. Chapter 2 gives two case studies of DLs used in education, which is one of the most common applications of digital libraries. Chapter 3 covers social networks, which are at the heart of work on Web 2.0, explaining the construction and use of deduced graphs that can enhance retrieval and recommendation. Chapter 4 demonstrates the value of DLs in eScience, focusing, in particular, on cyber-infrastructure for simulation. Chapter 5 surveys geospatial information in DLs, with a case study on geocoding. Given this rich content, we trust that anyone interested in digital libraries, or in related systems, will find this volume to be motivating, intellectually satisfying, and useful. We hope it will help move digital libraries forward into a science as well as a practice. We hope it will help build a community that will address the needs of the next generation of DLs. Table of Contents: Content-Based Image Retrieval / Education / Social Networks in Digital Libraries / eScience and Simulation Digital Libraries / Geospatial Information / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tensor Properties of Solids

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Tensor Properties of Solids presents the phenomenological development of solid state properties represented as matter tensors in two parts: Part I on equilibrium tensor properties and Part II on transport tensor properties. Part I begins with an introduction to tensor notation, transformations, algebra, and calculus together with the matrix representations. Crystallography, as it relates to tensor properties of crystals, completes the background treatment. A generalized treatment of solid-state equilibrium thermodynamics leads to the systematic correlation of equilibrium tensor properties. This is followed by developments covering first-, second-, third-, and higher-order tensor effects. Included are the generalized compliance and rigidity matrices for first-order tensor properties, Maxwell relations, effect of measurement conditions, and the dependent coupled effects and use of interaction diagrams. Part I concludes with the second- and higher-order effects, including numerous optical tensor properties. Part II presents the driving forces and fluxes for the well-known proper conductivities. An introduction to irreversible thermodynamics includes the concepts of microscopic reversibility, Onsager's reciprocity principle, entropy density production, and the proper choice of the transport parameters. This is followed by the force-flux equations for electronic charge and heat flow and the relationships between the proper conductivities and phenomenological coefficients. The thermoelectric effects in solids are discussed and extended to the piezothermoelectric and piezoresistance tensor effects. The subjects of thermomagnetic, galvanomagnetic, and thermogalvanomagnetic effects are developed together with other higher-order magnetotransport property tensors. A glossary of terms, expressions, and symbols is provided at the end of the text, and end-of-chapter problems are provided on request. Endnotes provide the necessary references for further reading. Table of Contents: I. Equilibrium Tensor Properties of Solids / Introduction / Introduction to Tensor Notation, Tensor Transformations, Tensor Calculus, and Matrix Representation / Crystal Systems, Symmetry Elements, and Symmetry Transformations / Generalized Thermostatics and the Systematic Correlation of Physical Properties / The Dependent Coupled Effects and the Interrelationships Between First-Order Tensor Properties - Use of Interaction Diagrams / Third- and Fourth-Rank Tensor Properties - Symmetry Considerations / Second- and Higher-Order Effects - Symmetry Considerations / II. Transport Properties of Solids / Introduction to Transport Properties and the Thermodynamics of Irreversible Processes / Thermoelectric, Piezothermoelectric, and Diffusive Effects in Solids / Effect of Magnetic Field on the Transport Properties / Appendix A: Magnetic Tensor Properties, Magnetic Crystals, and the Combined Space-Time Transformations / Endnotes / Glossary / Biography / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2014

    Morgan and Claypool eBooks

    A response of the engineering profession to the challenges of security, poverty and underdevelopment, environmental sustainability, and native cultures is described. Ethical codes, which govern the behavior of engineers, are examined from a historical perspective linking the prevailing codes to models of the natural world. A new ethical code based on a recently introduced model of Nature as an integral community is provided and discussed. Applications of the new code are described using a case study approach. With the ethical code based on an integral community in place, new design algorithms are developed and also explored using case studies. Implications of the proposed changes in ethics and design on engineering education are considered. Table of Contents: Preface / Acknowledgments / Introduction / Engineering Ethics / Models of the Earth / Engineering in a Morally Deep World / Engineering Design in a Morally Deep World / Implications for Engineering Education / Final Thoughts / References / Author's Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Provenance Data in Social Media

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Social media shatters the barrier to communicate anytime, anywhere for people of all walks of life. The publicly available, virtually free information in social media poses a new challenge to consumers who have to discern whether a piece of information published in social media is reliable. For example, it can be difficult to understand the motivations behind a statement passed from one user to another, without knowing the person who originated the message. Additionally, false information can be propagated through social media, resulting in embarrassment or irreversible damage. Provenance data associated with a social media statement can help dispel rumors, clarify opinions, and confirm facts. However, provenance data about social media statements is not readily available to users today. Currently, providing this data to users requires changing the social media infrastructure or offering subscription services. Taking advantage of social media features, research in this nascent field spearheads the search for a way to provide provenance data to social media users, thus leveraging social media itself by mining it for the provenance data. Searching for provenance data reveals an interesting problem space requiring the development and application of new metrics in order to provide meaningful provenance data to social media users. This lecture reviews the current research on information provenance, explores exciting research opportunities to address pressing needs, and shows how data mining can enable a social media user to make informed judgements about statements published in social media. Table of Contents: Information Provenance in Social Media / Provenance Attributes / Provenance via Network Information / Provenance Data View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Model-Driven Software Engineering in Practice

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book discusses how model-based approaches can improve the daily practice of software professionals. This is known as Model-Driven Software Engineering (MDSE) or, simply, Model-Driven Engineering (MDE). MDSE practices have proved to increase efficiency and effectiveness in software development, as demonstrated by various quantitative and qualitative studies. MDSE adoption in the software industry is foreseen to grow exponentially in the near future, e.g., due to the convergence of software development and business analysis. The aim of this book is to provide you with an agile and flexible tool to introduce you to the MDSE world, thus allowing you to quickly understand its basic principles and techniques and to choose the right set of MDSE instruments for your needs so that you can start to benefit from MDSE right away. The book is organized into two main parts. The first part discusses the foundations of MDSE in terms of basic concepts (i.e., models and transformations), driving principles, application scenarios and current standards, like the well-known MDA initiative proposed by OMG (Object Management Group) as well as the practices on how to integrate MDSE in existing development processes. The second part deals with the technical aspects of MDSE, spanning from the basics on when and how to build a domain-specific modeling language, to the description of Model-to-Text and Model-to-Model transformations, and the tools that support the management of MDSE projects. The book is targeted to a diverse set of readers, spanning: professionals, CTOs, CIOs, and team managers who need to have a bird's-eye view of the matter, so as to take the appropriate decisions when it comes to choosing the best development techniques for their company or team; software analysts, developers, or designers who expect to use MDSE for improving everyday work productivity, either by applying the basic modeling techniques and notations or by defining new domain-specific modeling languages and applying end-to-end MDSE practices in the software factory; and academic teachers and students to address undergrad and postgrad courses on MDSE. In addition to the contents of the book, more resources are provided on the book's website, including the examples presented in the book. Table of Contents: Introduction / MDSE Principles / MDSE Use Cases / Model-Driven Architecture (MDA) / Integration of MDSE in your Development Process / Modeling Languages at a Glance / Developing your Own Modeling Language / Model-to-Model Transformations / Model-to-Text Transformations / Managing Models / Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This synthesis lecture presents an intuitive introduction to the mathematics of motion and deformation in computer graphics. Starting with familiar concepts in graphics, such as Euler angles, quaternions, and affine transformations, we illustrate that a mathematical theory behind these concepts enables us to develop the techniques for efficient/effective creation of computer animation. This book, therefore, serves as a good guidepost to mathematics (differential geometry and Lie theory) for students of geometric modeling and animation in computer graphics. Experienced developers and researchers will also benefit from this book, since it gives a comprehensive overview of mathematical approaches that are particularly useful in character modeling, deformation, and animation. View full abstract»
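    As a small illustration of one of the familiar concepts mentioned above (a sketch using standard conventions, not material from the lecture): rotating a 3D point with a unit quaternion built from an axis and an angle.

```python
# Minimal sketch: rotate a 3D point with a unit quaternion q = (w, x, y, z).
import math

def quat_from_axis_angle(axis, angle):
    ax, ay, az = axis
    n = math.sqrt(ax * ax + ay * ay + az * az)
    s = math.sin(angle / 2.0) / n
    return (math.cos(angle / 2.0), ax * s, ay * s, az * s)

def quat_mul(a, b):
    # Hamilton product of two quaternions.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def rotate(point, q):
    # p' = q p q*, with the point embedded as a pure quaternion (0, x, y, z).
    w, x, y, z = q
    p = (0.0, *point)
    conj = (w, -x, -y, -z)
    return quat_mul(quat_mul(q, p), conj)[1:]

# Rotate (1, 0, 0) by 90 degrees about the z-axis; expect roughly (0, 1, 0).
q = quat_from_axis_angle((0.0, 0.0, 1.0), math.pi / 2)
print(rotate((1.0, 0.0, 0.0), q))
```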

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Network Simulation

    Copyright Year: 2006

    Morgan and Claypool eBooks

    A detailed introduction to the design, implementation, and use of network simulation tools is presented. The requirements and issues faced in the design of simulators for wired and wireless networks are discussed. Abstractions such as packet- and fluid-level network models are covered. Several existing simulations are given as examples, with details and rationales regarding design decisions presented. Issues regarding performance and scalability are discussed in detail, describing how one can utilize distributed simulation methods to increase the scale and performance of a simulation environment. Finally, a case study of two simulation tools that were developed using distributed simulation techniques is presented. This text is essential to any student, researcher, or network architect desiring a detailed understanding of how network simulation tools are designed, implemented, and used. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Resource-Oriented Architecture Patterns for Webs of Data

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The surge of interest in the REpresentational State Transfer (REST) architectural style, the Semantic Web, and Linked Data has resulted in the development of innovative, flexible, and powerful systems that embrace one or more of these compatible technologies. However, most developers, architects, Information Technology managers, and platform owners have only been exposed to the basics of resource-oriented architectures. This book is an attempt to catalog and elucidate several reusable solutions that have been seen in the wild in the now increasingly familiar "patterns book" style. These are not turnkey implementations, but rather, useful strategies for solving certain problems in the development of modern, resource-oriented systems, both on the public Web and within an organization's firewalls. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Applications of Affine and Weyl Geometry

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Pseudo-Riemannian geometry is, to a large extent, the study of the Levi-Civita connection, which is the unique torsion-free connection compatible with the metric structure. There are, however, other affine connections which arise in different contexts, such as conformal geometry, contact structures, Weyl structures, and almost Hermitian geometry. In this book, we reverse this point of view and instead associate an auxiliary pseudo-Riemannian structure of neutral signature to certain affine connections and use this correspondence to study both geometries. We examine Walker structures, Riemannian extensions, and Kähler-Weyl geometry from this viewpoint. This book is intended to be accessible to mathematicians who are not expert in the subject and to students with a basic grounding in differential geometry. Consequently, the first chapter contains a comprehensive introduction to the basic results and definitions we shall need---proofs are included of many of these results to make it as self-contained as possible. Para-complex geometry plays an important role throughout the book and consequently is treated carefully in various chapters, as is the representation theory underlying various results. It is a feature of this book that, rather than regarding para-complex geometry as an adjunct to complex geometry, we shall often introduce the para-complex concepts first and only later pass to the complex setting. The second and third chapters are devoted to the study of various kinds of Riemannian extensions that associate to an affine structure on a manifold a corresponding metric of neutral signature on its cotangent bundle. These play a role in various questions involving the spectral geometry of the curvature operator and homogeneous connections on surfaces. The fourth chapter deals with Kähler-Weyl geometry, which lies, in a certain sense, midway between affine geometry and Kähler geometry. Another feature of the book is that we have tried wherever possible to find the original references in the subject for possible historical interest. Thus, we have cited the seminal papers of Levi-Civita, Ricci, Schouten, and Weyl, to name but a few exemplars. We have also given different proofs of various results than those that are given in the literature, to take advantage of the unified treatment of the area given herein. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modeling Digital Switching Circuits with Linear Algebra

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Modeling Digital Switching Circuits with Linear Algebra describes an approach for modeling digital information and circuitry that is an alternative to Boolean algebra. While the Boolean algebraic model has been wildly successful and is responsible for many advances in modern information technology, the approach described in this book offers new insight and different ways of solving problems. Modeling the bit as a vector instead of a scalar value in the set {0, 1} allows digital circuits to be characterized with transfer functions in the form of a linear transformation matrix. The use of transfer functions is ubiquitous in many areas of engineering and their rich background in linear systems theory and signal processing is easily applied to digital switching circuits with this model. The common tasks of circuit simulation and justification are specific examples of the application of the linear algebraic model and are described in detail. The advantages offered by the new model as compared to traditional methods are emphasized throughout the book. Furthermore, the new approach is easily generalized to other types of information processing circuits such as those based upon multiple-valued or quantum logic; thus providing a unifying mathematical framework common to each of these areas. Modeling Digital Switching Circuits with Linear Algebra provides a blend of theoretical concepts and practical issues involved in implementing the method for circuit design tasks. Data structures are described and are shown to not require any more resources for representing the underlying matrices and vectors than those currently used in modern electronic design automation (EDA) tools based on the Boolean model. Algorithms are described that perform simulation, justification, and other common EDA tasks in an efficient manner that are competitive with conventional design tools. The linear algebraic model can be used to implement common EDA tasks directly upon a structural netlist thus avoiding the intermediate step of transforming a circuit description into a representation of a set of switching functions as is commonly the case when conventional Boolean techniques are used. Implementation results are provided that empirically demonstrate the practicality of the linear algebraic model. View full abstract»
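    A minimal sketch of the bit-as-vector idea described above (illustrative notation, not necessarily the book's): the scalar bits 0 and 1 become the vectors [1,0] and [0,1], and a two-input gate becomes a 2x4 transfer matrix applied to the Kronecker product of its input vectors.

```python
# Minimal sketch: bits as vectors, gates as linear transfer matrices.
import numpy as np

ZERO = np.array([1, 0])   # bit value 0
ONE = np.array([0, 1])    # bit value 1

# Columns correspond to inputs 00, 01, 10, 11; each column is the output vector.
AND = np.column_stack([ZERO, ZERO, ZERO, ONE])   # 2x4 transfer matrix
XOR = np.column_stack([ZERO, ONE, ONE, ZERO])    # 2x4 transfer matrix

def apply_gate(gate, a, b):
    # The joint input state is the Kronecker (tensor) product of the two bits.
    return gate @ np.kron(a, b)

for a, b in [(ZERO, ZERO), (ZERO, ONE), (ONE, ZERO), (ONE, ONE)]:
    print("AND:", apply_gate(AND, a, b), " XOR:", apply_gate(XOR, a, b))
```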

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Circuit Simulation using Multisim Workbench

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Multisim is now the de facto standard for circuit simulation. It is a SPICE-based circuit simulator which combines analog, discrete-time, and mixed-mode circuits. In addition, it is the only simulator which incorporates microcontroller simulation in the same environment. It also includes a tool for printed circuit board design. Advanced Circuit Simulation Using Multisim Workbench is a companion book to Circuit Analysis Using Multisim, published by Morgan & Claypool in 2011. This new book covers advanced analyses and the creation of models and subcircuits. It also includes coverage of transmission lines, the special elements which are used to connect components in PCBs and integrated circuits. Finally, it includes a description of Ultiboard, the tool for PCB creation from a circuit description in Multisim. Both books completely cover most of the important features available for a successful circuit simulation with Multisim. Table of Contents: Models and Subcircuits / Transmission Line / Other Types of Analyses / Simulating Microcontrollers / PCB Design With Ultiboard View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Instant Recovery with Write-Ahead Logging: Page Repair, System Restart, and Media Restore

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Traditional theory and practice of write-ahead logging and of database recovery techniques revolve around three failure classes: transaction failures resolved by rollback; system failures (typically software faults) resolved by restart with log analysis, “redo,” and “undo” phases; and media failures (typically hardware faults) resolved by restore operations that combine multiple types of backups and log replay. The recent addition of single-page failures and single-page recovery has opened new opportunities far beyond its original aim of immediate, lossless repair of single-page wear-out in novel or traditional storage hardware. In the contexts of system and media failures, efficient single-page recovery enables on-demand incremental “redo” and “undo” as part of system restart or media restore operations. This can give the illusion of practically instantaneous restart and restore: instant restart permits processing new queries and updates seconds after system reboot, and instant restore permits resuming queries and updates on empty replacement media as if those were already fully recovered. In addition to these instant recovery techniques, the discussion introduces much faster offline restore operations without slowdown in backup operations and with hardly any slowdown in log archiving operations. The new restore techniques also render differential and incremental backups obsolete, complete backup commands on the database server practically instantly, and even permit taking full backups without imposing any load on the database server. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Probabilistic Approaches to Recommendations

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust formal mathematical framework to model these assumptions and study their effects in the recommendation process. This book starts with a brief summary of the recommendation problem and its challenges and a review of some widely used techniques. Next, we introduce and discuss probabilistic approaches for modeling preference data. We focus our attention on methods based on latent factors, such as mixture models, probabilistic matrix factorization, and topic models, for explicit and implicit preference data. These methods represent a significant advance in the research and technology of recommendation. The resulting models allow us to identify complex patterns in preference data, which can be exploited to predict future purchases effectively. The extreme sparsity of preference data poses serious challenges to the modeling of user preferences, especially in the cases where few observations are available. Bayesian inference techniques elegantly address the need for regularization, and their integration with latent factor modeling helps to boost the performances of the basic techniques. We summarize the strengths and weaknesses of several approaches by considering two different but related evaluation perspectives, namely, rating prediction and recommendation accuracy. Furthermore, we describe how probabilistic methods based on latent factors enable the exploitation of preference patterns in novel applications beyond rating prediction or recommendation accuracy. We finally discuss the application of probabilistic techniques in two additional scenarios, characterized by the availability of side information besides preference data. In summary, the book categorizes the myriad probabilistic approaches to recommendations and provides guidelines for their adoption in real-world situations. View full abstract»
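    As a concrete, toy-scale illustration of the latent factor methods mentioned above (a sketch of the regularized, MAP-style view of probabilistic matrix factorization, not the book's implementation): fit user and item factor matrices to the observed entries of a small rating matrix by gradient descent with L2 regularization, then predict the missing cells. The data, dimensions, and hyperparameters are hypothetical.

```python
# Minimal sketch: latent-factor rating prediction via regularized factorization.
import numpy as np

rng = np.random.default_rng(0)
# Toy ratings matrix; 0 means "unobserved".
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0

k, lam, lr = 2, 0.05, 0.01           # latent dimensions, regularization, step size
U = 0.1 * rng.standard_normal((R.shape[0], k))   # user factors
V = 0.1 * rng.standard_normal((R.shape[1], k))   # item factors

for _ in range(2000):
    err = (R - U @ V.T) * mask       # error on observed entries only
    U += lr * (err @ V - lam * U)
    V += lr * (err.T @ U - lam * V)

print(np.round(U @ V.T, 1))          # predictions, including the missing cells
```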

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automatic Detection of Verbal Deception

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The attempt to spot deception through its correlates in human behavior has a long history. Until recently, these efforts have concentrated on identifying individual "cues" that might occur with deception. However, with the advent of computational means to analyze language and other human behavior, we now have the ability to determine whether there are consistent clusters of differences in behavior that might be associated with a false statement as opposed to a true one. While its focus is on verbal behavior, this book describes a range of behaviors—physiological, gestural as well as verbal—that have been proposed as indicators of deception. An overview of the primary psychological and cognitive theories that have been offered as explanations of deceptive behaviors gives context for the description of specific behaviors. The book also addresses the differences between data collected in a laboratory and "real-world" data with respect to the emotional and cognitive state of the liar. It discusses sources of real-world data and problematic issues in its collection and identifies the primary areas in which applied studies based on real-world data are critical, including police, security, border crossing, customs, and asylum interviews; congressional hearings; financial reporting; legal depositions; human resource evaluation; predatory communications that include Internet scams, identity theft, and fraud; and false product reviews. Having established the background, this book concentrates on computational analyses of deceptive verbal behavior that have enabled the field of deception studies to move from individual cues to overall differences in behavior. The computational work is organized around the features used for classification, from n-grams through syntax to predicate-argument and rhetorical structure. The book concludes with a set of open questions that the computational work has generated. View full abstract»
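    A minimal sketch of the simplest feature family mentioned above (illustrative only): extracting word n-gram counts from a statement, which a classifier could then use alongside syntactic and discourse-level features.

```python
# Minimal sketch: word n-gram counts as classification features.
from collections import Counter

def ngrams(text, n):
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# Hypothetical statement; a real pipeline would vectorize many such statements.
statement = "I did not see the defendant at the scene that night"
print(ngrams(statement, 2).most_common(3))
```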

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Wireless Network Pricing

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Today's wireless communications and networking practices are tightly coupled with economic considerations, to the extent that it is almost impossible to make a sound technology choice without understanding the corresponding economic implications. This book aims at providing a foundational introduction on how microeconomics, and pricing theory in particular, can help us to understand and build better wireless networks. The book can be used as lecture notes for a course in the field of network economics, or a reference book for wireless engineers and applied economists to understand how pricing mechanisms influence the fast growing modern wireless industry. This book first covers the basics of wireless communication technologies and microeconomics, before going in-depth about several pricing models and their wireless applications. The pricing models include social optimal pricing, monopoly pricing, price differentiation, oligopoly pricing, and network externalities, supported by introductory discussions of convex optimization and game theory. The wireless applications include wireless video streaming, service provider competitions, cellular usage-based pricing, network partial price differentiation, wireless spectrum leasing, distributed power control, and cellular technology upgrade. More information related to the book (including references, slides, and videos) can be found at ncel.ie.cuhk.edu.hk/content/wireless-network-pricing. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Judgment Aggregation: A Primer

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Judgment aggregation is a mathematical theory of collective decision-making. It concerns the methods whereby individual opinions about logically interconnected issues of interest can, or cannot, be aggregated into one collective stance. Aggregation problems have traditionally been of interest for disciplines like economics and the political sciences, as well as philosophy, from which judgment aggregation itself originates, but have recently captured the attention of disciplines like computer science, artificial intelligence and multi-agent systems. Judgment aggregation has emerged in the last decade as a unifying paradigm for the formalization and understanding of aggregation problems. Still, no comprehensive presentation of the theory is available to date. This Synthesis Lecture aims at filling this gap by presenting the key motivations, results, abstractions and techniques underpinning it. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering Thermodynamics and 21st Century Energy Problems: A Textbook Companion for Student Engagement

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Energy is a basic human need; technologies for energy conversion and use are fundamental to human survival. As energy technology evolves to meet demands for development and ecological sustainability in the 21st century, engineers need to have up-to-date skills and knowledge to meet the creative challenges posed by current and future energy problems. Further, engineers need to cultivate a commitment to and passion for lifelong learning which will enable us to actively engage new developments in the field. This undergraduate textbook companion seeks to develop these capacities in tomorrow's engineers in order to provide for future energy needs around the world. This book is designed to complement traditional texts in engineering thermodynamics, and thus is organized to accompany explorations of the First and Second Laws, fundamental property relations, and various applications across engineering disciplines. It contains twenty modules targeted toward meeting five often-neglected ABET outcomes: ethics, communication, lifelong learning, social context, and contemporary issues. The modules are based on pedagogies of liberation, used for decades in the humanities and social sciences for instilling critical thinking and reflective action in students by bringing attention to power relations in the classroom and in the world. This book is intended to produce a conversation and creative exploration around how to teach and learn thermodynamics differently. Because liberative pedagogies are at their heart relational, it is important to maintain spaces for discussing classroom practices with these modules, and for sharing ideas for implementing critical pedagogies in engineering contexts. The reader is therefore encouraged to visit the book's blog. Table of Contents: What and Why? / The First Law: Making Theory Relevant / The Second Law and Property Relations / Thinking Big Picture about Energy and Sustainability View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    HCI Theory: Classical, Modern, and Contemporary

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Theory is the bedrock of many sciences, providing a rigorous method to advance knowledge, through testing and falsifying hypotheses about observable phenomena. To begin with, the nascent field of HCI followed the scientific method, borrowing theories from cognitive science to test theories about user performance at the interface. But HCI has emerged as an eclectic interdiscipline rather than a well-defined science. It now covers all aspects of human life, from birth to bereavement, through all manner of computing, from device ecologies to nano-technology. It comes as no surprise that the role of theory in HCI has also greatly expanded from the early days of scientific testing to include other functions such as describing, explaining, critiquing, and as the basis for generating new designs. The book charts the theoretical developments in HCI, both past and present, reflecting on how they have shaped the field. It explores both the rhetoric and the reality: how theories have been conceptualized, what was promised, how they have been used, and which have made the most impact in the field -- and the reasons for this. Finally, it looks to the future and asks whether theory will continue to have a role, and, if so, what this might be. Table of Contents: Introduction / The Backdrop to HCI Theory / The Role and Contribution of Theory in HCI / Classical Theories / Modern Theories / Contemporary Theory / Discussion / Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Impossibility Results for Distributed Computing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    To understand the power of distributed systems, it is necessary to understand their inherent limitations: what problems cannot be solved in particular systems, or without sufficient resources (such as time or space). This book presents key techniques for proving such impossibility results and applies them to a variety of different problems in a variety of different system models. Insights gained from these results are highlighted, aspects of a problem that make it difficult are isolated, features of an architecture that make it inadequate for solving certain problems efficiently are identified, and different system models are compared. Table of Contents: Acknowledgments / Introduction / Indistinguishability / Shifting and Scaling / Scenario Arguments / Information Theory Arguments / Covering Arguments / Valency Arguments / Combinatorial Arguments / Reductions and Simulations / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Scalability Challenges in Web Search Engines

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In this book, we aim to provide a fairly comprehensive overview of the scalability and efficiency challenges in large-scale web search engines. More specifically, we cover the issues involved in the design of three separate systems that are commonly available in every web-scale search engine: web crawling, indexing, and query processing systems. We present the performance challenges encountered in these systems and review a wide range of design alternatives employed as solutions to these challenges, specifically focusing on algorithmic and architectural optimizations. We discuss the available optimizations at different computational granularities, ranging from a single computer node to a collection of data centers. We provide some hints to both the practitioners and theoreticians involved in the field about the way large-scale web search engines operate and the adopted design choices. Moreover, we survey the efficiency literature, providing pointers to a large number of relatively important research papers. Finally, we discuss some open research problems in the context of search engine efficiency. View full abstract»

  • Freely Available from IEEE

    The Datacenter as a Computer: An Introduction to the Design of Warehouse-Scale Machines

    Copyright Year: 2013

    Morgan and Claypool eBooks

    After nearly four years of substantial academic and industrial developments in warehouse-scale computing, we are delighted to present our first major update to this lecture. The increased popularity of public clouds has made WSC software techniques relevant to a larger pool of programmers since our first edition. Therefore, we expanded Chapter 2 to reflect our better understanding of WSC software systems and the toolbox of software techniques for WSC programming. In Chapter 3, we added to our coverage of the evolving landscape of wimpy vs. brawny server trade-offs, and we now present an overview of WSC interconnects and storage systems that was promised but lacking in the original edition. Thanks largely to the help of our new co-author, Google Distinguished Engineer Jimmy Clidaras, the material on facility mechanical and power distribution design has been updated and greatly extended (see Chapters 4 and 5). Chapters 6 and 7 have also been revamped significantly. We hope this revised edition continues to meet the needs of educators and professionals in this area. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Physical Design and Query Compilation

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Query compilation is the problem of translating user requests formulated over purely conceptual and domain specific ways of understanding data, commonly called logical designs, to efficient executable programs called query plans. Such plans access various concrete data sources through their low-level, often iterator-based interfaces. An appreciation of the concrete data sources, their interfaces and how such capabilities relate to logical design is commonly called a physical design. This book is an introduction to the fundamental methods underlying database technology that solves the problem of query compilation. The methods are presented in terms of first-order logic which serves as the vehicle for specifying physical design, expressing user requests and query plans, and understanding how query plans implement user requests. Table of Contents: Introduction / Logical Design and User Queries / Basic Physical Design and Query Plans / On Practical Physical Design / Query Compilation and Plan Synthesis / Updating Data View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Hardware Malware

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In our digital world, integrated circuits are present in nearly every moment of our daily life. Even when using the coffee machine in the morning, or driving our car to work, we interact with integrated circuits. The increasing spread of information technology in virtually all areas of life in the industrialized world offers a broad range of attack vectors. So far, mainly software-based attacks have been considered and investigated, while hardware-based attacks have attracted comparatively little interest. The design and production process of integrated circuits is mostly decentralized due to financial and logistical reasons. Therefore, a high level of trust has to be established between the parties involved in the hardware development lifecycle. During the complex production chain, malicious attackers can insert non-specified functionality by exploiting untrusted processes and backdoors. This work deals with the ways in which such hidden, non-specified functionality can be introduced into hardware systems. After briefly outlining the development and production process of hardware systems, we systematically describe a new type of threat, the hardware Trojan. We provide a historical overview of the development of research activities in this field to show the growing interest of international research in this topic. Current work is considered in more detail. We discuss the components that make up a hardware Trojan as well as the parameters that are relevant for an attack. Furthermore, we describe current approaches for detecting, localizing, and avoiding hardware Trojans to combat them effectively. Moreover, this work develops a comprehensive taxonomy of countermeasures and explains in detail how specific problems are solved. In a final step, we provide an overview of related work and offer an outlook on further research in this field. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Click Models for Web Search

    Copyright Year: 2015

    Morgan and Claypool eBooks

    With the rapid growth of web search in recent years, the problem of modeling its users has started to attract more and more attention from the information retrieval community. This has several motivations. By building a model of user behavior we are essentially developing a better understanding of a user, which ultimately helps us to deliver a better search experience. A model of user behavior can also be used as a predictive device for non-observed items such as document relevance, which makes it useful for improving search result ranking. Finally, in many situations experimenting with real users is just infeasible and hence user simulations based on accurate models play an essential role in understanding the implications of algorithmic changes to search engine results or presentation changes to the search engine result page. In this survey we summarize advances in modeling user click behavior on a web search engine result page. We present simple click models as well as more complex models aimed at capturing non-trivial user behavior patterns on modern search engine result pages. We discuss how these models compare to each other, what challenges they have, and what ways there are to address these challenges. We also study the problem of evaluating click models and discuss the main applications of click models. View full abstract»
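    As one illustration of the "simple click models" mentioned above, the sketch below implements the cascade model, in which the user scans results top to bottom and stops at the first click. The attractiveness values are made up for the example.

    ```python
    # Minimal sketch of the cascade click model: the user examines results from
    # top to bottom, clicks a result with probability equal to its attractiveness,
    # and stops scanning after the first click.
    import random

    def simulate_cascade(attractiveness, rng=random.Random(0)):
        """Return the clicked rank (0-based) for one simulated session, or None."""
        for rank, a in enumerate(attractiveness):
            if rng.random() < a:
                return rank
        return None

    def click_probabilities(attractiveness):
        """P(click at rank r) = a_r * prod_{i<r} (1 - a_i) under the cascade model."""
        probs, survive = [], 1.0
        for a in attractiveness:
            probs.append(survive * a)
            survive *= (1.0 - a)
        return probs

    attr = [0.6, 0.3, 0.1]            # hypothetical attractiveness of three results
    print(click_probabilities(attr))  # -> [0.6, 0.12, 0.028]
    ```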

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Integral: A Crux for Analysis

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book treats all of the most commonly used theories of the integral. After motivating the idea of integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had calculus and some exposure to upper division mathematics. Table of Contents: Introduction / The Riemann Integral / The Lebesgue Integral / Comparison of the Riemann and Lebesgue Integrals / Other Theories of the Integral View full abstract»
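    For orientation, the two central definitions the book compares can be stated briefly (standard notation assumed): the Riemann integral partitions the domain, while the Lebesgue integral is built up from simple functions measured on the range.

    ```latex
    % Riemann integral as a limit of Riemann sums over tagged partitions of [a,b]:
    \int_a^b f(x)\,dx \;=\; \lim_{\|P\|\to 0} \sum_{i=1}^{n} f(t_i)\,(x_i - x_{i-1}),
    \qquad t_i \in [x_{i-1}, x_i].
    % Lebesgue integral of a nonnegative measurable f over a measurable set E:
    \int_E f \, d\mu \;=\; \sup\Big\{ \int_E s \, d\mu \;:\; 0 \le s \le f,\ s \text{ simple} \Big\}.
    ```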

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Underwater Communications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Underwater vehicles and underwater moorings are increasing in tactical importance. As such, it is critical to have a robust and secure communication system connecting underwater vehicles on a long seaborne mission and a ground station. As a matter of fact, the deployment of efficient communication links with underwater vehicles is one of the greatest technological challenges presently confronted by the world's naval forces. To circumvent most of the limitations involved in the use of RF or acoustic channels for perfectly secure communications with underwater vehicles, it is worth considering the feasibility of an optical channel to facilitate a two-way satellite communication link secured via perfectly secure ciphers enabled by a quantum key distribution protocol. This book offers a concise review of underwater communications systems. Our approach is pedagogical, placing a strong emphasis on the physics behind the attenuating properties of the oceanic environment and the propagation of electromagnetic signals in the ELF, VLF, and optical bands. We assume the reader is familiar with the basic principles of classical electrodynamics and optics. The system design, components, and noise analysis of an underwater optical communications device are discussed in detail. Furthermore, we offer simulations of the performance of the communication system for different types of ocean waters. Our final conclusion is that it appears to be feasible to design and build underwater communications using optical classical and quantum channels secured with quantum key distribution protocols. Table of Contents: Introduction / Electrodynamics of Attenuating Media / Underwater Communication Channels / Underwater Optical Communications: Technology / Underwater Optical Communications: Noise Analysis / Underwater Optical Communications: System Performance / Underwater Quantum Communications / Conclusions View full abstract»
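    A single standard formula conveys why the ELF, VLF, and optical bands matter here: the skin depth of a conducting medium such as seawater shrinks rapidly with frequency, so conventional RF is absorbed within centimeters. This is the textbook good-conductor approximation, stated only for orientation; the book's own derivations are more detailed.

    ```latex
    % Skin depth of a conducting medium (good-conductor approximation):
    \delta \;=\; \sqrt{\frac{2}{\omega \mu \sigma}} \;=\; \frac{1}{\sqrt{\pi f \mu \sigma}}
    % With seawater values \sigma \approx 4~\mathrm{S/m} and \mu \approx \mu_0,
    % \delta is on the order of meters at ELF/VLF frequencies but only
    % centimeters at typical RF frequencies.
    ```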

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Captains of Energy: Systems Dynamics from an Energy Perspective

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In teaching an introduction to transport or systems dynamics modeling at the undergraduate level, it is possible to lose pedagogical traction in a sea of abstract mathematics. What the mathematical modeling of time-dependent system behavior offers is a venue in which students can be taught that physical analogies exist between what they likely perceive as distinct areas of study in the physical sciences. We introduce a storyline whose characters are superheroes that store and dissipate energy in dynamic systems. Introducing students to the overarching conservation laws helps develop the analogy that ties the different disciplines together under a common umbrella of system energy. In this book, we use the superhero cast to present the effort-flow analogy and its relationship to the conservation principles of mass, momentum, energy, and electrical charge. We use a superhero movie script common to mechanical, electrical, fluid, and thermal engineering systems to illustrate how to apply the analogy to arrive at governing differential equations describing the systems' behavior in time. Ultimately, we show how only two types of differential equation, and therefore only two types of system response, are possible. This novel approach of storytelling and a movie script is used to help make the mathematics of lumped system modeling more approachable for students. View full abstract»
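    For readers wanting the punchline in symbols, the two canonical lumped-system forms alluded to above can be sketched as follows (notation assumed here: time constant \tau, natural frequency \omega_n, damping ratio \zeta, static gain K, input u, output y).

    ```latex
    % First-order system: exponential rise or decay toward the new equilibrium.
    \tau \frac{dy}{dt} + y = K\,u(t)
    % Second-order system: overdamped, critically damped, or oscillatory response.
    \frac{1}{\omega_n^2}\frac{d^2 y}{dt^2} + \frac{2\zeta}{\omega_n}\frac{dy}{dt} + y = K\,u(t)
    ```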

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Reconfigurable Supercomputing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigurable parallel codes. We hope to show that FPGA acceleration, based on the exploitation of data parallelism, pipelining, and concurrency, remains promising in view of the diminishing improvements in traditional processor and system design. Table of Contents: FPGA Technology / Reconfigurable Supercomputing / Algorithmic Considerations / FPGA Programming Languages / Case Study: Sorting / Alternative Technologies and Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automated Grammatical Error Detection for Language Learners

    Copyright Year: 2010

    Morgan and Claypool eBooks

    It has been estimated that over a billion people are using or learning English as a second or foreign language, and the numbers are growing not only for English but for other languages as well. These language learners provide a burgeoning market for tools that help identify and correct learners' writing errors. Unfortunately, the errors targeted by typical commercial proofreading tools do not include those aspects of a second language that are hardest to learn. This volume describes the types of constructions English language learners find most difficult -- constructions containing prepositions, articles, and collocations. It provides an overview of the automated approaches that have been developed to identify and correct these and other classes of learner errors in a number of languages. Error annotation and system evaluation are particularly important topics in grammatical error detection because there are no commonly accepted standards. Chapters in the book describe the options available to researchers, recommend best practices for reporting results, and present annotation and evaluation schemes. The final chapters explore recent innovative work that opens new directions for research. It is the authors' hope that this volume will contribute to the growing interest in grammatical error detection by encouraging researchers to take a closer look at the field and its many challenging problems. Table of Contents: Introduction / History of Automated Grammatical Error Detection / Special Problems of Language Learners / Language Learner Data / Evaluating Error Detection Systems / Article and Preposition Errors / Collocation Errors / Different Approaches for Different Errors / Annotating Learner Errors / New Directions / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Phonocardiography Signal Processing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The auscultation method is an important diagnostic indicator for hemodynamic anomalies. Heart sound classification and analysis play an important role in the auscultative diagnosis. The term phonocardiography refers to the tracing technique of heart sounds and the recording of cardiac acoustics vibration by means of a microphone-transducer. Therefore, understanding the nature and source of this signal is important for developing competent tools for further analysis and processing, in order to enhance and optimize the cardiac clinical diagnostic approach. This book gives the reader an inclusive view of the main aspects in phonocardiography signal processing. Table of Contents: Introduction to Phonocardiography Signal Processing / Phonocardiography Acoustics Measurement / PCG Signal Processing Framework / Phonocardiography Wavelets Analysis / Phonocardiography Spectral Analysis / PCG Pattern Classification / Special Application of Phonocardiography / Phonocardiography Acoustic Imaging and Mapping View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Differential Privacy: From Theory to Practice

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Over the last decade, differential privacy (DP) has emerged as the de facto standard privacy notion for research in privacy-preserving data analysis and publishing. The DP notion offers strong privacy guarantees and has been applied to many data analysis tasks. This Synthesis Lecture is the first of two volumes on differential privacy. This lecture differs from the existing books and surveys on differential privacy in that we take an approach balancing theory and practice. We focus on empirical accuracy performances of algorithms rather than asymptotic accuracy guarantees. At the same time, we try to explain why these algorithms have those empirical accuracy performances. We also take a balanced approach regarding the semantic meanings of differential privacy, explaining both its strong guarantees and its limitations. We start by inspecting the definition and basic properties of DP, and the main primitives for achieving DP. Then, we give a detailed discussion on the semantic privacy guarantee provided by DP and the caveats when applying DP. Next, we review the state-of-the-art mechanisms for publishing histograms for low-dimensional datasets, mechanisms for conducting machine learning tasks such as classification, regression, and clustering, and mechanisms for publishing information to answer marginal queries for high-dimensional datasets. Finally, we explain the sparse vector technique, including the many errors that have been made in the literature using it. The planned Volume 2 will cover usage of DP in other settings, including high-dimensional datasets, graph datasets, local setting, location privacy, and so on. We will also discuss various relaxations of DP. View full abstract»
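    As a concrete example of the kind of basic DP primitive referred to above, the sketch below implements the Laplace mechanism for a counting query; the dataset and parameter values are illustrative.

    ```python
    # Minimal sketch of the Laplace mechanism: add noise drawn from
    # Laplace(0, sensitivity/epsilon) to a numeric query answer.
    import random

    def laplace_mechanism(true_answer, sensitivity, epsilon, rng=random.Random(0)):
        """Return an epsilon-differentially-private answer to a numeric query."""
        scale = sensitivity / epsilon
        # A Laplace variate is the difference of two i.i.d. exponential variates.
        noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
        return true_answer + noise

    ages = [23, 35, 41, 29, 52]
    true_count = sum(1 for a in ages if a > 30)   # counting query: sensitivity 1
    print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
    ```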

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Distributed Graph Coloring: Fundamentals and Recent Developments

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The objective of our monograph is to cover the developments on the theoretical foundations of distributed symmetry breaking in the message-passing model. We hope that our monograph will stimulate further progress in this exciting area. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Path planning and navigation are indispensable components for controlling autonomous agents in interactive virtual worlds. Given the growing demands on the size and complexity of modern virtual worlds, a number of new techniques have been developed for achieving intelligent navigation for the next generation of interactive multi-agent simulations. This book reviews the evolution of several related techniques, starting from classical planning and computational geometry techniques and then gradually moving toward more advanced topics with a focus on recent developments from the work of the authors. The covered topics range from discrete search and geometric representations to planning under different types of constraints and harnessing the power of graphics hardware in order to address Euclidean shortest paths and discrete search for multiple agents under limited time budgets. The use of planning algorithms beyond path planning is also discussed in the areas of crowd animation and whole-body motion planning for virtual characters. View full abstract»
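    As a pointer to what "discrete search" looks like in practice, the sketch below runs A* on a small 4-connected grid; the grid, start, goal, and heuristic are illustrative and not taken from the book.

    ```python
    # Compact A* search on a 4-connected grid (0 = free cell, 1 = obstacle).
    import heapq

    def astar(grid, start, goal):
        rows, cols = len(grid), len(grid[0])
        def h(p):                                   # Manhattan-distance heuristic
            return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        open_heap = [(h(start), 0, start, None)]
        came_from, g_cost = {}, {start: 0}
        while open_heap:
            _, g, node, parent = heapq.heappop(open_heap)
            if node in came_from:                   # already expanded
                continue
            came_from[node] = parent
            if node == goal:                        # reconstruct the path
                path = []
                while node is not None:
                    path.append(node)
                    node = came_from[node]
                return path[::-1]
            r, c = node
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    ng = g + 1
                    if ng < g_cost.get((nr, nc), float("inf")):
                        g_cost[(nr, nc)] = ng
                        heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc), node))
        return None                                 # no path exists

    grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
    print(astar(grid, (0, 0), (2, 0)))
    ```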

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Retrieval Evaluation

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Evaluation has always played a major role in information retrieval, with the early pioneers such as Cyril Cleverdon and Gerard Salton laying the foundations for most of the evaluation methodologies in use today. The retrieval community has been extremely fortunate to have such a well-grounded evaluation paradigm during a period when most of the human language technologies were just developing. This lecture has the goal of explaining where these evaluation methodologies came from and how they have continued to adapt to the vastly changed environment in the search engine world today. The lecture starts with a discussion of the early evaluation of information retrieval systems, starting with the Cranfield testing in the early 1960s, continuing with the Lancaster "user" study for MEDLARS, and presenting the various test collection investigations by the SMART project and by groups in Britain. The emphasis in this chapter is on the how and the why of the various methodologies developed. The second chapter covers the more recent "batch" evaluations, examining the methodologies used in the various open evaluation campaigns such as TREC, NTCIR (emphasis on Asian languages), CLEF (emphasis on European languages), INEX (emphasis on semi-structured data), etc. Here again the focus is on the how and why, and in particular on the evolution of the older evaluation methodologies to handle new information access techniques. This includes how the test collection techniques were modified and how the metrics were changed to better reflect operational environments. The final chapters look at evaluation issues in user studies -- the interactive part of information retrieval, including a look at the search log studies mainly done by the commercial search engines. Here the goal is to show, via case studies, how the high-level issues of experimental design affect the final evaluations. Table of Contents: Introduction and Early History / "Batch" Evaluation Since 1992 / Interactive Evaluation / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and Society: Working Towards Social Justice, Part I: Engineering and Society

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Engineers work in an increasingly complex entanglement of ideas, people, cultures, technology, systems and environments. Today, decisions made by engineers often have serious implications for not only their clients but for society as a whole and the natural world. Such decisions may potentially influence cultures, ways of living, as well as alter ecosystems which are in delicate balance. In order to make appropriate decisions and to co-create ideas and innovations within and among the complex networks of communities which currently exist and are shaped by our decisions, we need to regain our place as professionals, to realise the significance of our work and to take responsibility in a much deeper sense. Engineers must develop the 'ability to respond' to emerging needs of all people, across all cultures. To do this requires insights and knowledge which are at present largely within the domain of the social and political sciences but which need to be shared with our students in ways which are meaningful and relevant to engineering. This book attempts to do just that. In Part 1 Baillie introduces ideas associated with the ways in which engineers relate to the communities in which they work. Drawing on scholarship from science and technology studies, globalisation and development studies, as well as work in science communication and dialogue, this introductory text sets the scene for an engineering community which engages with the public. In Part 2 Catalano frames the thinking processes necessary to create ethical and just decisions in engineering, to understand the implications of our current decision making processes and think about ways in which we might adapt these to become more socially just in the future. In Part 3 Baillie and Catalano have provided case studies of everyday issues such as water, garbage and alarm clocks, to help us consider how we might see through the lenses of our new knowledge from Parts 1 and 2 and apply this to our everyday existence as engineers. Table of Contents: Introduction / Throwing Away Rubbish / Turning on the Tap / Awakened by an Alarm Clock / Driving the SUV / Travelling to Waikiki Beach View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and War: Militarism, Ethics, Institutions, Alternatives

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book investigates the close connections between engineering and war, broadly understood, and the conceptual and structural barriers that face those who would seek to loosen those connections. It shows how military institutions and interests have long influenced engineering education, research, and practice and how they continue to shape the field in the present. The book also provides a generalized framework for responding to these influences useful to students and scholars of engineering, as well as reflective practitioners. The analysis draws on philosophy, history, critical theory, and technology studies to understand the connections between engineering and war and how they shape our very understandings of what engineering is and what it might be. After providing a review of diverse dimensions of engineering itself, the analysis shifts to different dimensions of the connections between engineering and war. First, it considers the ethics of war generally and then explores questions of integrity for engineering practitioners facing career decisions relating to war. Next, it considers the historical rise of the military-industrial-academic complex, especially from World War II to the present. Finally, it considers a range of responses to the militarization of engineering from those who seek to unsettle the status quo. Only by confronting the ethical, historical, and political consequences of engineering for warfare, this book argues, can engineering be sensibly reimagined. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    On the Efficient Determination of Most Near Neighbors: Horseshoes, Hand Grenades, Web Search and Other Situations When Close Is Close Enough, Second Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The time-worn aphorism "close only counts in horseshoes and hand grenades" is clearly inadequate. Close also counts in golf, shuffleboard, archery, darts, curling, and other games of accuracy in which hitting the precise center of the target isn't to be expected every time, or in which we can expect to be driven from the target by skilled opponents. This book is not devoted to sports discussions, but to efficient algorithms for determining pairs of closely related web pages—and a few other situations in which we have found that inexact matching is good enough — where proximity suffices. We will not, however, attempt to be comprehensive in the investigation of probabilistic algorithms, approximation algorithms, or even techniques for organizing the discovery of nearest neighbors. We are more concerned with finding nearby neighbors; if they are not particularly close by, we are not particularly interested. In thinking of when approximation is sufficient, remember the oft-told joke about two campers sitting around after dinner. They hear noises coming towards them. One of them reaches for a pair of running shoes, and starts to don them. The second then notes that even with running shoes, they cannot hope to outrun a bear, to which the first notes that most likely the bear will be satiated after catching the slower of them. We seek problems in which we don't need to be faster than the bear, just faster than the others fleeing the bear. View full abstract»
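    One widely used family of techniques for this kind of "close enough" matching of web pages is shingling combined with MinHash signatures; the sketch below illustrates the idea on two short documents. It is a generic illustration, not the specific algorithm developed in the book.

    ```python
    # Illustrative near-duplicate detection via word shingles + MinHash:
    # the fraction of agreeing signature positions estimates Jaccard similarity.
    import hashlib

    def shingles(text, k=4):
        """Set of all k-word shingles in the text."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

    def minhash_signature(shingle_set, num_hashes=64):
        """For each seeded hash function, keep the minimum hash over the set."""
        return [min(int(hashlib.md5(f"{seed}:{s}".encode()).hexdigest(), 16)
                    for s in shingle_set)
                for seed in range(num_hashes)]

    def estimated_resemblance(sig_a, sig_b):
        return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

    doc1 = "close only counts in horseshoes and hand grenades and web search"
    doc2 = "close only counts in horseshoes hand grenades and in web search"
    print(estimated_resemblance(minhash_signature(shingles(doc1)),
                                minhash_signature(shingles(doc2))))
    ```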

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Understanding Atrial Fibrillation: The Signal Processing Contribution

    Copyright Year: 2008

    Morgan and Claypool eBooks

    The book presents recent advances in signal processing techniques for modeling, analysis, and understanding of the heart's electrical activity during atrial fibrillation. This arrhythmia is the most commonly encountered in clinical practice and its complex and metamorphic nature represents a challenging problem for clinicians, engineers, and scientists. Research on atrial fibrillation has stimulated the development of a wide range of signal processing tools to better understand the mechanisms ruling its initiation, maintenance, and termination. This book provides undergraduate and graduate students, as well as researchers and practicing engineers, with an overview of techniques, including time domain techniques for atrial wave extraction, time-frequency analysis for exploring wave dynamics, and nonlinear techniques to characterize the ventricular response and the organization of atrial activity. The book includes an introductory chapter about atrial fibrillation and its mechanisms, treatment, and management. The successive chapters are dedicated to the analysis of atrial signals recorded on the body surface and to the quantification of ventricular response. The rest of the book explores techniques to characterize endo- and epicardial recordings and to model atrial conduction. Under the appearance of being a monothematic book on atrial fibrillation, the reader will not only recognize common problems of biomedical signal processing but also discover that analysis of atrial fibrillation is a unique challenge for developing and testing novel signal processing tools. Table of Contents: Analysis of Ventricular Response During Atrial Fibrillation / Organization Measures of Atrial Activity During Fibrillation / Modeling Atrial Fibrillation: From Myocardial Cells to ECG / Algorithms for Atrial Tachyarrhythmia Detection for Long-Term Monitoring with Implantable Devices View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Notion of Relevance in Information Science: Everybody knows what relevance is. But, what is it really?

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Everybody knows what relevance is. It is a "ya'know" notion, concept, idea–no need to explain whatsoever. Searching for relevant information using information technology (IT) became a ubiquitous activity in contemporary information society. Relevant information means information that pertains to the matter or problem at hand—it is directly connected with effective communication. The purpose of this book is to trace the evolution and with it the history of thinking and research on relevance in information science and related fields from the human point of view. The objective is to synthesize what we have learned about relevance in several decades of investigation about the notion in information science. This book deals with how people deal with relevance—it does not cover how systems deal with relevance; it does not deal with algorithms. Spurred by advances in information retrieval (IR) and information systems of various kinds in the handling of relevance, a number of basic questions are raised: But what is relevance to start with? What are some of its properties and manifestations? How do people treat relevance? What affects relevance assessments? What are the effects of inconsistent human relevance judgments on tests of relative performance of different IR algorithms or approaches? These general questions are discussed in detail. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Enhancing Information Security and Privacy by Combining Biometrics with Cryptography

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book deals with "crypto-biometrics", a relatively new and multi-disciplinary area of research (started in 1998). Combining biometrics and cryptography provides multiple advantages, such as revocability, template diversity, better verification accuracy, and generation of cryptographically usable keys that are strongly linked to the user identity. In this text, a thorough review of the subject is provided and then some of the main categories are illustrated with recently proposed systems by the authors. Beginning with the basics, this text deals with various aspects of crypto-biometrics, including review, cancelable biometrics, cryptographic key generation from biometrics, and crypto-biometric key sharing protocols. Because of the thorough treatment of the topic, this text will be highly beneficial to researchers and industry professionals in information security and privacy. Table of Contents: Introduction / Cancelable Biometric System / Cryptographic Key Regeneration Using Biometrics / Biometrics-Based Secure Authentication Protocols / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Transmission-Line Modeling (TLM) Method in Electromagnetics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book presents the topic in electromagnetics known as the Transmission-Line Modeling or Matrix method (TLM). While it is written for engineering students at graduate and advanced undergraduate levels, it is also highly suitable for specialists in computational electromagnetics working in industry, who wish to become familiar with the topic. The main method of implementation of TLM is via the time-domain differential equations; however, it can also be implemented via the frequency-domain differential equations. The emphasis in this book is on the time-domain TLM. Physical concepts are emphasized here before embarking onto mathematical development in order to provide simple, straightforward suggestions for the development of models that can then be readily programmed for further computations. Sections with strong mathematical flavors have been included where there are clear methodological advantages forming the basis for developing practical modeling tools. The book can be read at different depths depending on the background of the reader, and can be consulted as and when the need arises. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Cardiac Tissue Engineering: Principles, Materials, and Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Cardiac tissue engineering aims at repairing damaged heart muscle and producing human cardiac tissues for application in drug toxicity studies. This book offers a comprehensive overview of the cardiac tissue engineering strategies, including presenting and discussing the various concepts in use, research directions and applications. Essential basic information on the major components in cardiac tissue engineering, namely cell sources and biomaterials, is firstly presented to the readers, followed by a detailed description of their implementation in different strategies, broadly divided to cellular and acellular ones. In cellular approaches, the biomaterials are used to increase cell retention after implantation or as scaffolds when bioengineering the cardiac patch, in vitro. In acellular approaches, the biomaterials are used as ECM replacement for damaged cardiac ECM after MI, or, in combination with growth factors, the biomaterials assume an additional function as a depot for prolonged factor activity for the effective recruitment of repairing cells. The book also presents technological innovations aimed to improve the quality of the cardiac patches, such as bioreactor applications, stimulation patterns and prevascularization. This book could be of interest not only from an educational perspective (i.e. for graduate students), but also for researchers and medical professionals, to offer them fresh views on novel and powerful treatment strategies. We hope that the reader will find a broad spectrum of ideas and possibilities described in this book both interesting and convincing. Table of Contents: Introduction / The Heart: Structure, Cardiovascular Diseases, and Regeneration / Cell Sources for Cardiac Tissue Engineering / Biomaterials: Polymers, Scaffolds, and Basic Design Criteria / Biomaterials as Vehicles for Stem Cell Delivery and Retention in the Infarct / Bioengineering of Cardiac Patches, In Vitro / Perfusion Bioreactors and Stimulation Patterns in Cardiac Tissue Engineering / Vascularization of Cardiac Patches / Acellular Biomaterials for Cardiac Repair / Biomaterial-based Controlled Delivery of Bioactive Molecules for Myocardial Regeneration View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Replicated Data Management for Mobile Computing

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Managing data in a mobile computing environment invariably involves caching or replication. In many cases, a mobile device has access only to data that is stored locally, and much of that data arrives via replication from other devices, PCs, and services. Given portable devices with limited resources, weak or intermittent connectivity, and security vulnerabilities, data replication serves to increase availability, reduce communication costs, foster sharing, and enhance survivability of critical information. Mobile systems have employed a variety of distributed architectures from client–server caching to peer-to-peer replication. Such systems generally provide weak consistency models in which read and update operations can be performed at any replica without coordination with other devices. The design of a replication protocol then centers on issues of how to record, propagate, order, and filter updates. Some protocols utilize operation logs, whereas others replicate state. Systems might provide best-effort delivery, using gossip protocols or multicast, or guarantee eventual consistency for arbitrary communication patterns, using recently developed pairwise, knowledge-driven protocols. Additionally, systems must detect and resolve the conflicts that arise from concurrent updates using techniques ranging from version vectors to read–write dependency checks. This lecture explores the choices faced in designing a replication protocol, with particular emphasis on meeting the needs of mobile applications. It presents the inherent trade-offs and implicit assumptions in alternative designs. The discussion is grounded by including case studies of research and commercial systems including Coda, Ficus, Bayou, Sybase’s iAnywhere, and Microsoft’s Sync Framework. Table of Contents: Introduction / System Models / Data Consistency / Replicated Data Protocols / Partial Replication / Conflict Management / Case Studies / Conclusions / Bibliography View full abstract»
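    To make the version-vector technique mentioned above concrete, the sketch below shows how two replicas' vectors can be compared to decide whether one version supersedes the other or a conflict must be resolved; replica names and counters are illustrative.

    ```python
    # Minimal sketch of version vectors: each replica counts the updates it has
    # originated; two versions conflict when neither vector dominates the other.
    def dominates(v1, v2):
        """True if v1 has seen at least every update recorded in v2."""
        return all(v1.get(replica, 0) >= count for replica, count in v2.items())

    def compare(v1, v2):
        if dominates(v1, v2) and dominates(v2, v1):
            return "equal"
        if dominates(v1, v2):
            return "v1 newer"
        if dominates(v2, v1):
            return "v2 newer"
        return "conflict"                 # concurrent updates: needs resolution

    replica_a = {"A": 2, "B": 1}          # A updated twice, has seen one update from B
    replica_b = {"A": 1, "B": 2}          # B updated twice, has seen one update from A
    print(compare(replica_a, replica_b))              # -> "conflict"
    print(compare({"A": 2, "B": 2}, replica_a))       # -> "v1 newer"
    ```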

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Theoretical Foundations for Digital Libraries: the 5S (Societies, Scenarios, Spaces, Structures, Streams) Approach

    Copyright Year: 2012

    Morgan and Claypool eBooks

    In 1991, a group of researchers chose the term digital libraries to describe an emerging field of research, development, and practice. Since then, Virginia Tech has had funded research in this area, largely through its Digital Library Research Laboratory. This book is the first in a four book series that reports our key findings and current research investigations. Underlying this book series are six completed dissertations (Gonçalves, Kozievitch, Leidig, Murthy, Shen, Torres), eight dissertations underway, and many masters theses. These reflect our experience with a long string of prototype or production systems developed in the lab, such as CITIDEL, CODER, CTRnet, Ensemble, ETANA, ETD-db, MARIAN, and Open Digital Libraries. There are hundreds of related publications, presentations, tutorials, and reports. We have built upon that work so this book, and the others in the series, will address digital library related needs in many computer science, information science, and library science (e.g., LIS) courses, as well as the requirements of researchers, developers, and practitioners. Much of the early work in the digital library field struck a balance between addressing real-world needs, integrating methods from related areas, and advancing an ever-expanding research agenda. Our work has fit in with these trends, but simultaneously has been driven by a desire to provide a firm conceptual and formal basis for the field. Our aim has been to move from engineering to science. We claim that our 5S (Societies, Scenarios, Spaces, Structures, Streams) framework, discussed in publications dating back to at least 1998, provides a suitable basis. This book introduces 5S, and the key theoretical and formal aspects of the 5S framework. While the 5S framework may be used to describe many types of information systems, and is likely to have even broader utility and appeal, we focus here on digital libraries. Our view of digital libraries is broad, so further generalization should be straightforward. We have connected with related fields, including hypertext/hypermedia, information storage and retrieval, knowledge management, machine learning, multimedia, personal information management, and Web 2.0. Applications have included managing not only publications, but also archaeological information, educational resources, fish images, scientific datasets, and scientific experiments/simulations. Table of Contents: Introduction / Exploration / Mathematical Preliminaries / Minimal Digital Library / Archaeological Digital Libraries / 5S Results: Lemmas, Proofs, and 5SSuite / Glossary / Bibliography / Authors' Biographies / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multithreading Architecture

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Multithreaded architectures now appear across the entire range of computing devices, from the highest-performing general purpose devices to low-end embedded processors. Multithreading enables a processor core to more effectively utilize its computational resources, as a stall in one thread need not cause execution resources to be idle. This enables the computer architect to maximize performance within area constraints, power constraints, or energy constraints. However, the architectural options for the processor designer or architect looking to implement multithreading are quite extensive and varied, as evidenced not only by the research literature but also by the variety of commercial implementations. This book introduces the basic concepts of multithreading, describes a number of models of multithreading, and then develops the three classic models (coarse-grain, fine-grain, and simultaneous multithreading) in greater detail. It describes a wide variety of architectural and software design tradeoffs, as well as opportunities specific to multithreading architectures. Finally, it details a number of important commercial and academic hardware implementations of multithreading. Table of Contents: Introduction / Multithreaded Execution Models / Coarse-Grain Multithreading / Fine-Grain Multithreading / Simultaneous Multithreading / Managing Contention / New Opportunities for Multithreaded Processors / Experimentation and Metrics / Implementations of Multithreaded Processors / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Full-Text (Substring) Indexes in External Memory

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Nowadays, textual databases are among the most rapidly growing collections of data. Some of these collections contain a new type of data that differs from classical numerical or textual data. These are long sequences of symbols, not divided into well-separated small tokens (words). The most prominent among such collections are databases of biological sequences, which are experiencing today an unprecedented growth rate. Starting in 2008, the "1000 Genomes Project" has been launched with the ultimate goal of collecting the sequences of an additional 1,500 human genomes, 500 each of European, African, and East Asian origin. This will produce an extensive catalog of human genetic variations. The size of just the raw sequences in this catalog would be about 5 terabytes. Querying strings without well-separated tokens poses a different set of challenges, typically addressed by building full-text indexes, which provide effective structures to index all the substrings of the given strings. Since full-text indexes occupy more space than the raw data, it is often necessary to use disk space for their construction. However, until recently, the construction of full-text indexes in secondary storage was considered impractical due to excessive I/O costs. Despite this, algorithms developed in the last decade demonstrated that efficient external construction of full-text indexes is indeed possible. This book is about large-scale construction and usage of full-text indexes. We focus mainly on suffix trees, and show efficient algorithms that can convert suffix trees to other kinds of full-text indexes and vice versa. There are four parts in this book. They are a mix of string searching theory with the reality of external memory constraints. The first part introduces general concepts of full-text indexes and shows the relationships between them. The second part presents the first series of external-memory construction algorithms that can handle the construction of full-text indexes for moderately large strings in the order of few gigabytes. The third part presents algorithms that scale for very large strings. The final part examines queries that can be facilitated by disk-resident full-text indexes. Table of Contents: Structures for Indexing Substrings / External Construction of Suffix Trees / Scaling Up: When the Input Exceeds the Main Memory / Queries for Disk-based Indexes / Conclusions and Open Problems View full abstract»
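    For readers new to the topic, the tiny in-memory sketch below shows what one such full-text index, the suffix array, represents: the sorted order of all suffixes, which lets any substring be located by binary search. The book's subject is constructing such indexes (chiefly suffix trees) at external-memory scale; this sketch only illustrates the structure itself.

    ```python
    # Naive suffix array construction and substring search (in-memory toy only).
    import bisect

    def suffix_array(text):
        """O(n^2 log n) construction; fine for a short example string."""
        return sorted(range(len(text)), key=lambda i: text[i:])

    def find_substring(text, sa, pattern):
        """Positions of pattern in text, via binary search over sorted suffixes."""
        suffixes = [text[i:] for i in sa]
        lo = bisect.bisect_left(suffixes, pattern)
        hi = bisect.bisect_right(suffixes, pattern + "\uffff")
        return sorted(sa[lo:hi])

    text = "banana"
    sa = suffix_array(text)   # [5, 3, 1, 0, 4, 2]: "a", "ana", "anana", "banana", "na", "nana"
    print(find_substring(text, sa, "ana"))   # -> [1, 3]
    ```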

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Electronically Scanned Arrays

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Scanning arrays present the radar or communications engineer with the ultimate in antenna flexibility. They also present a multitude of new opportunities and new challenges that need to be addressed. In order to describe the needs for scanned array development, this book begins with a brief discussion of the history that led to present array antennas. This text is a compact but comprehensive treatment of the scanned array, from the underlying basis for array pattern behavior to the engineering choices leading to successful design. The book describes the scanned array in terms of radiation from apertures and wire antennas and introduces the effects resulting directly from scanning, including beam broadening, impedance mismatch, gain reduction, and pattern squint, as well as the effects of array periodicity, including grating and quantization lobes and array blindness. The text also presents the engineering tools for improving pattern control and array efficiency, including lattice selection, subarray technology and pattern synthesis. Equations and figures quantify the phenomena being described and provide the reader with the tools to trade off various performance features. The discussions proceed beyond the introductory material and to the state of the art in modern array design. Contents: Basic Principles and Applications of Array Antennas / Element Coupling Effects in Array Antennas / Array Pattern Synthesis / Subarray Techniques for Limited Field of View and Wide Band Applications View full abstract»
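    The scanning effects listed above follow from the standard array factor of a uniform linear array, stated here for orientation (symbols as usually defined; not a reproduction of the book's notation):

    ```latex
    % Array factor of an N-element uniform linear array with spacing d,
    % steered to angle \theta_0:
    AF(\theta) \;=\; \sum_{n=0}^{N-1} e^{\,j n k d\,(\sin\theta - \sin\theta_0)},
    \qquad k = \frac{2\pi}{\lambda}.
    % Grating lobes appear at angles \theta_g where the inter-element phase
    % shift advances by a full cycle:
    \frac{d}{\lambda}\,\bigl(\sin\theta_g - \sin\theta_0\bigr) \;=\; \pm 1, \pm 2, \ldots
    ```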

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Social Semantic Web Mining

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The past ten years have seen a rapid growth in the numbers of people signing up to use Web-based social networks (hundreds of millions of new members are now joining the main services each year) with a large amount of content being shared on these networks (tens of billions of content items are shared each month). With this growth in usage and data being generated, there are many opportunities to discover the knowledge that is often inherent but somewhat hidden in these networks. Web mining techniques are being used to derive this hidden knowledge. In addition, the Semantic Web, including the Linked Data initiative to connect previously disconnected datasets, is making it possible to connect data from across various social spaces through common representations and agreed upon terms for people, content items, etc. In this book, we detail some current research being carried out to semantically represent the implicit and explicit structures on the Social Web, along with the techniques being used to elicit relevant knowledge from these structures, and we present the mechanisms that can be used to intelligently mesh these semantic representations with intelligent knowledge discovery processes. We begin this book with an overview of the origins of the Web, and then show how web intelligence can be derived from a combination of web and Social Web mining. We give an overview of the Social and Semantic Webs, followed by a description of the combined Social Semantic Web (along with some of the possibilities it affords), and the various semantic representation formats for the data created in social networks and on social media sites. Provenance and provenance mining is an important aspect here, especially when data is combined from multiple services. We will expand on the subject of provenance and especially its importance in relation to social data. We will describe extensions to social semantic vocabularies specifically designed for community mining purposes (SIOCM). In the last three chapters, we describe how the combination of web intelligence and social semantic data can be used to derive knowledge from the Social Web, starting at the community level (macro), and then moving through group mining (meso) to user profile mining (micro). Table of Contents: Acknowledgments / Grant Aid / Introduction and the Web / Web Mining / The Social Web / The Semantic Web / The Social Semantic Web / Social Semantic Web Mining / Social Semantic Web Mining of Communities / Social Semantic Web Mining of Groups / Social Semantic Web Mining of Users / Conclusions / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Probabilistic Ranking Techniques in Relational Databases

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Ranking queries are widely used in data exploration, data analysis and decision making scenarios. While most of the currently proposed ranking techniques focus on deterministic data, several emerging applications involve data that are imprecise or uncertain. Ranking uncertain data raises new challenges in query semantics and processing, making conventional methods inapplicable. Furthermore, the interplay between ranking and uncertainty models introduces new dimensions for ordering query results that do not exist in the traditional settings. This lecture describes new formulations and processing techniques for ranking queries on uncertain data. The formulations are based on marriage of traditional ranking semantics with possible worlds semantics under widely-adopted uncertainty models. In particular, we focus on discussing the impact of tuple-level and attribute-level uncertainty on the semantics and processing techniques of ranking queries. Under the tuple-level uncertainty model, we describe new processing techniques leveraging the capabilities of relational database systems to recognize and handle data uncertainty in score-based ranking. Under the attribute-level uncertainty model, we describe new probabilistic ranking models and a set of query evaluation algorithms, including sampling-based techniques. We also discuss supporting rank join queries on uncertain data, and we show how to extend current rank join methods to handle uncertainty in scoring attributes. Table of Contents: Introduction / Uncertainty Models / Query Semantics / Methodologies / Uncertain Rank Join / Conclusion View full abstract»
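    To make the possible-worlds semantics concrete, the sketch below enumerates every world induced by tuple-level existence probabilities and accumulates each tuple's probability of ranking first; the tuples, probabilities, and scores are invented for the example.

    ```python
    # Illustrative top-1 probability under tuple-level uncertainty: enumerate all
    # possible worlds (subsets of tuples) weighted by existence probabilities.
    def top1_probabilities(tuples):
        """tuples: list of (tuple_id, existence_probability, score)."""
        n = len(tuples)
        result = {tid: 0.0 for tid, _, _ in tuples}
        for mask in range(1 << n):                  # each bitmask is one possible world
            world, p_world = [], 1.0
            for i, (tid, p, score) in enumerate(tuples):
                if mask & (1 << i):
                    world.append((score, tid))
                    p_world *= p
                else:
                    p_world *= (1.0 - p)
            if world:
                result[max(world)[1]] += p_world    # highest-scored tuple wins this world
        return result

    tuples = [("t1", 0.9, 100), ("t2", 0.6, 90), ("t3", 0.5, 80)]
    print(top1_probabilities(tuples))   # t1: 0.9, t2: 0.1*0.6 = 0.06, t3: 0.1*0.4*0.5 = 0.02
    ```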

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Human Factors in Healthcare: A Field Guide to Continuous Improvement

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Have you ever experienced the burden of an adverse event or a near-miss in healthcare and wished there was a way to mitigate it? This book walks you through a classic adverse event as a case study and shows you how. It is a practical guide to continuously improving your healthcare environment, processes, tools, and ultimate outcomes, through the discipline of human factors. Using this book, you as a healthcare professional can improve patient safety and quality of care. Adverse events are a major concern in healthcare today. As the complexity of healthcare increases, with technological advances and information overload, the field of human factors offers practical approaches to understand the situation, mitigate risk, and improve outcomes. The first part of this book presents a human factors conceptual framework, and the second part offers a systematic, pragmatic approach. Both the framework and the approach are employed to analyze and understand healthcare situations, both proactively, for constant improvement, and reactively, learning from adverse events. This book guides healthcare professionals through the process of mapping the environmental and human factors; assessing them in relation to the tasks each person performs; recognizing how gaps in the fit between human capabilities and the demands of the task in the environment have a ripple effect that increases risk; and drawing conclusions about what types of changes facilitate improvement and mitigate risk, thereby contributing to improved healthcare outcomes. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Processing on FPGAs

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Roughly a decade ago, power consumption and heat dissipation concerns forced the semiconductor industry to radically change its course, shifting from sequential to parallel computing. Unfortunately, improving performance of applications has now become much more difficult than in the good old days of frequency scaling. This is also affecting databases and data processing applications in general, and has led to the popularity of so-called data appliances—specialized data processing engines, where software and hardware are sold together in a closed box. Field-programmable gate arrays (FPGAs) increasingly play an important role in su