By Topic

Morgan & Claypool Synthesis Digital Library

715 Results Returned

  • Pragmatic Electrical Engineering: Systems & Instruments

    William Eccles
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Pragmatic Electrical Engineering: Systems and Instruments is about some of the non-energy parts of electrical systems, the parts that control things and measure physical parameters. The primary topics are control systems and their characterization, instrumentation, signals, and electromagnetic compatibility. This text features a large number of completely worked examples to aid the reader in understanding how the various principles fit together. While electrical engineers may find this material useful as a review, engineers in other fields can use this short lecture text as a modest introduction to these non-energy parts of electrical systems. Some knowledge of basic d-c circuits and of phasors in the sinusoidal steady state is presumed. Table of Contents: Closed-Loop Control Systems / Characterizing a System / Instrumentation / Processing Signals / Electromagnetic Compatibility

  • The Answer Machine

    Susan Feldman
    Copyright Year: 2012

    Morgan and Claypool eBooks

    The Answer Machine is a practical, non-technical guide to the technologies behind information seeking and analysis. It introduces search and content analytics to software buyers, knowledge managers, and searchers who want to understand and design effective online environments. The book describes how search evolved from an expert-only to an end user tool. It provides an overview of search engines, categorization and clustering, natural language processing, content analytics, and visualization technologies. Detailed profiles for Web search, eCommerce search, eDiscovery, and enterprise search contrast the types of users, uses, tasks, technologies, and interaction designs for each. These variables shape each application, although the underlying technologies are the same. Types of information tasks and the trade-offs between precision and recall, time, volume and precision, and privacy vs. personalization are discussed within this context. The book examines trends toward convenient, context-aware computing, big data and analytics technologies, conversational systems, and answer machines. The Answer Machine explores IBM Watson's DeepQA technology and describes how it is used to answer health care and Jeopardy questions. The book concludes by discussing the implications of these advances: how they will change the way we run our businesses, practice medicine, govern, or conduct our lives in the digital age. Table of Contents: Introduction / The Query Process and Barriers to Finding Information Online / Online Search: An Evolution / Search and Discovery Technologies: An Overview / Information Access: A Spectrum of Needs and Uses / Future Tense: The Next Era in Information Access and Discovery / Answer Machines
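
    The precision/recall trade-off mentioned in this abstract is easy to make concrete. The short Python sketch below computes both measures for a hypothetical query; the document identifiers and sets are made up for illustration and are not taken from the book.

      # Illustrative only: precision and recall for a hypothetical query.
      retrieved = {"doc1", "doc2", "doc3", "doc5"}           # documents the engine returned
      relevant = {"doc2", "doc3", "doc4", "doc6", "doc7"}    # documents that actually answer the query

      true_positives = retrieved & relevant
      precision = len(true_positives) / len(retrieved)   # fraction of results that are relevant
      recall = len(true_positives) / len(relevant)       # fraction of relevant items that were found
      print(f"precision = {precision:.2f}, recall = {recall:.2f}")   # 0.50, 0.40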

  • A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems

    Peter Vermaas ; Peter Kroes ; Ibo van de Poel ; Maarten Franssen
    Copyright Year: 2010

    Morgan and Claypool eBooks

    In A Philosophy of Technology: From Technical Artefacts to Sociotechnical Systems, technology is analysed from a series of different perspectives. The analysis starts by focussing on the most tangible products of technology, called technical artefacts, and then builds step-wise towards considering those artefacts within their context of use, and ultimately as embedded in encompassing sociotechnical systems that also include humans as operators and social rules like legislation. Philosophical characterisations are given of technical artefacts, their context of use and of sociotechnical systems. Analyses are presented of how technical artefacts are designed in engineering and what types of technological knowledge is involved in engineering. And the issue is considered how engineers and others can or cannot influence the development of technology. These characterisations are complemented by ethical analyses of the moral status of technical artefacts and the possibilities and impossibilities for engineers to influence this status when designing artefacts and the sociotechnical systems in which artefacts are embedded. The running example in the book is aviation, where aeroplanes are examples of technical artefacts and the world aviation system is an example of a sociotechnical system. Issues related to the design of quiet aeroplane engines and the causes of aviation accidents are analysed for illustrating the moral status of designing, and the role of engineers therein. Table of Contents: Technical Artefacts / Technical Designing / Ethics and Designing / Technological Knowledge / Sociotechnical Systems / The Role of Social Factors in Technological Development / Ethics and Unintended Consequences of Technology

  • Code Division Multiple Access (CDMA)

    R. Michael Buehrer
    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book covers the basic aspects of Code Division Multiple Access or CDMA. It begins with an introduction to the basic ideas behind fixed and random access systems in order to demonstrate the difference between CDMA and the more widely understood TDMA, FDMA or CSMA. Secondly, a review is presented of the basic spread spectrum techniques used in CDMA systems, including direct sequence, frequency-hopping and time-hopping approaches. The basic concept of CDMA is presented, followed by the four basic principles of CDMA systems that impact their performance: interference averaging, universal frequency reuse, soft handoff, and statistical multiplexing. The focus of the discussion will then shift to applications. The most common application of CDMA currently is cellular systems. A detailed discussion on cellular voice systems based on CDMA, specifically IS-95, is presented. The capacity of such systems will be examined as well as performance enhancement techniques such as coding and spatial filtering. Also discussed are Third Generation CDMA cellular systems and how they differ from Second Generation systems. A second application of CDMA that is covered is spread spectrum packet radio networks. Finally, there is an examination of multi-user detection and interference cancellation and how such techniques impact CDMA networks. This book should be of interest and value to engineers, advanced students, and researchers in communications.
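
    The direct-sequence idea summarized in this abstract can be illustrated in a few lines. The toy Python sketch below lets two users share a channel with orthogonal +/-1 spreading codes and recovers each data stream by correlation; the codes, bits, and noiseless channel are assumptions made for illustration, not material from the book.

      import numpy as np

      code_a = np.array([+1, +1, -1, -1])     # user A's spreading code
      code_b = np.array([+1, -1, +1, -1])     # user B's spreading code (orthogonal to A's)
      bits_a = np.array([+1, -1, +1])         # user A's data (BPSK symbols)
      bits_b = np.array([-1, -1, +1])         # user B's data

      # Spread each bit over the code length and sum the two transmissions.
      tx = np.repeat(bits_a, 4) * np.tile(code_a, 3) + np.repeat(bits_b, 4) * np.tile(code_b, 3)

      # Despread: correlate the received signal with each user's code per bit interval.
      rx = tx.reshape(3, 4)
      print(np.sign(rx @ code_a))             # recovers [ 1 -1  1]
      print(np.sign(rx @ code_b))             # recovers [-1 -1  1]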

  • A Concise Introduction to Models and Methods for Automated Planning

    Hector Geffner ; Blai Bonet
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Planning is the model-based approach to autonomous behavior where the agent behavior is derived automatically from a model of the actions, sensors, and goals. The main challenges in planning are computational as all models, whether featuring uncertainty and feedback or not, are intractable in the worst case when represented in compact form. In this book, we look at a variety of models used in AI planning, and at the methods that have been developed for solving them. The goal is to provide a modern and coherent view of planning that is precise, concise, and mostly self-contained, without being shallow. For this, we make no attempt at covering the whole variety of planning approaches, ideas, and applications, and focus on the essentials. The target audience of the book are students and researchers interested in autonomous behavior and planning from an AI, engineering, or cognitive science perspective. Table of Contents: Preface / Planning and Autonomous Behavior / Classical Planning: Full Information and Deterministic Actions / Classical Planning: Variations and Extensions / Beyond Classical Planning: Transformations / Planning with Sensing: Logical Models / MDP Planning: Stochastic Actions and Full Feedback / POMDP Planning: Stochastic Actions and Partial Feedback / Discussion / Bibliography / Author's Biography

  • Faceted Search

    Daniel Tunkelang
    Copyright Year: 2009

    Morgan and Claypool eBooks

    We live in an information age that requires us, more than ever, to represent, access, and use information. Over the last several decades, we have developed a modern science and technology for information retrieval, relentlessly pursuing the vision of a "memex" that Vannevar Bush proposed in his seminal article, "As We May Think." Faceted search plays a key role in this program. Faceted search addresses weaknesses of conventional search approaches and has emerged as a foundation for interactive information retrieval. User studies demonstrate that faceted search provides more effective information-seeking support to users than best-first search. Indeed, faceted search has become increasingly prevalent in online information access systems, particularly for e-commerce and site search. In this lecture, we explore the history, theory, and practice of faceted search. Although we cannot hope to be exhaustive, our aim is to provide sufficient depth and breadth to offer a useful resource to both researchers and practitioners. Because faceted search is an area of interest to computer scientists, information scientists, interface designers, and usability researchers, we do not assume that the reader is a specialist in any of these fields. Rather, we offer a self-contained treatment of the topic, with an extensive bibliography for those who would like to pursue particular aspects in more depth. Table of Contents: I. Key Concepts / Introduction: What Are Facets? / Information Retrieval / Faceted Information Retrieval / II. Research and Practice / Academic Research / Commercial Applications / III. Practical Concerns / Back-End Concerns / Front-End Concerns / Conclusion / Glossary

  • High-Speed Digital System Design

    Justin Davis
    Copyright Year: 2006

    Morgan and Claypool eBooks

    High-Speed Digital System Design bridges the gap from theory to implementation in the real world. Systems with clock speeds in the low megahertz range qualify as high-speed. Proper design results in quality digital transmissions and lowers the chance for errors. This book is for computer and electrical engineers who may or may not have learned electromagnetic theory. The presentation style allows readers to quickly begin designing their own high-speed systems and diagnosing existing designs for errors. After studying this book, readers will be able to: Design the power distribution system for a printed circuit board to minimize noise Plan the layers of a PCB for signals, power, and ground to maximize signal quality and minimize noise Include test structures in the printed circuit board to easily diagnose manufacturing mistakes Choose the best PCB design parameters such as trace width, height, and routed path to ensure the most stable characteristic impedance Determine the correct termination to minimize reflections Predict the delay caused by a given PCB trace Minimize driver power consumption using AC terminations Compensate for discontinuities along a PCB trace Use pre-emphasis and equalization techniques to counteract lossy transmission lines Determine the amount of crosstalk between two traces Diagnose existing PCBs to determine the sources of errors
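
    As a taste of the impedance and delay calculations this abstract refers to, the Python sketch below evaluates widely quoted IPC-2141-style rule-of-thumb formulas for a surface microstrip. The stack-up numbers are invented for illustration, and the formulas are coarse approximations, not the design procedure taught in the book.

      import math

      def microstrip_z0(h_mm, w_mm, t_mm, er):
          """Approximate characteristic impedance of a surface microstrip (ohms)."""
          return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

      def microstrip_delay_ps_per_inch(er):
          """Approximate propagation delay of a surface microstrip (ps/inch)."""
          return 85.0 * math.sqrt(0.475 * er + 0.67)

      print(microstrip_z0(h_mm=0.2, w_mm=0.3, t_mm=0.035, er=4.3))   # ~53 ohms on an FR-4-like stack-up
      print(microstrip_delay_ps_per_inch(4.3))                       # ~140 ps/inch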

  • Introduction to Logic Synthesis using Verilog HDL

    Robert B. Reese ; Mitchell A. Thornton
    Copyright Year: 2006

    Morgan and Claypool eBooks

    Introduction to Logic Synthesis Using Verilog HDL explains how to write accurate Verilog descriptions of digital systems that can be synthesized into digital system netlists with desirable characteristics. The book contains numerous Verilog examples that begin with simple combinational networks and progress to synchronous sequential logic systems. Common pitfalls in the development of synthesizable Verilog HDL are also discussed along with methods for avoiding them. The target audience is anyone with a basic understanding of digital logic principles who wishes to learn how to model digital systems in the Verilog HDL in a manner that also allows for automatic synthesis. A wide range of readers, from hobbyists and undergraduate students to seasoned professionals, will find this a compelling and approachable work. The book provides concise coverage of the material and includes many examples, enabling readers to quickly generate high-quality synthesizable Verilog models.

  • Pragmatic Circuits: Signals and Filters

    William J. Eccles
    Copyright Year: 2006

    Morgan and Claypool eBooks

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing with such non-periodic signals. The two other volumes in the Pragmatic Circuits series include titles on DC and Time Domain and Frequency Domain. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits.
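
    Since the abstract mentions using Butterworth's results to design filters methodically, a minimal sketch follows using SciPy's standard Butterworth design routine. The order, cutoff, and sampling rate are arbitrary illustrative choices, and this is library-based design rather than the hand procedure the book teaches.

      import numpy as np
      from scipy import signal

      fs = 48_000                                               # sampling rate, Hz
      b, a = signal.butter(N=4, Wn=1_000, btype="low", fs=fs)   # 4th-order low-pass, 1 kHz cutoff

      # Magnitude response at a few frequencies: roughly 0 dB in the passband,
      # about -3 dB at the cutoff, and roughly -24 dB one octave above it.
      w, h = signal.freqz(b, a, worN=[500, 1_000, 2_000], fs=fs)
      for f, mag in zip(w, np.abs(h)):
          print(f"{f:6.0f} Hz: {20 * np.log10(mag):6.1f} dB")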

  • Communication and Agreement Abstractions for Fault-Tolerant Asynchronous Distributed Systems

    Michel Raynal
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Understanding distributed computing is not an easy task. This is due to the many facets of uncertainty one has to cope with and master in order to produce correct distributed software. Considering the uncertainty created by asynchrony and process crash failures in the context of message-passing systems, the book focuses on the main abstractions that one has to understand and master in order to be able to produce software with guaranteed properties. These fundamental abstractions are communication abstractions that allow the processes to communicate consistently (namely the register abstraction and the reliable broadcast abstraction), and the consensus agreement abstraction that allows them to cooperate despite failures. As they give a precise meaning to the words "communicate" and "agree" despite asynchrony and failures, these abstractions allow distributed programs to be designed with properties that can be stated and proved. Impossibility results are associated with these abstractions. Hence, in order to circumvent these impossibilities, the book relies on the failure detector approach, and, consequently, that approach to fault-tolerance is central to the book. Table of Contents: List of Figures / The Atomic Register Abstraction / Implementing an Atomic Register in a Crash-Prone Asynchronous System / The Uniform Reliable Broadcast Abstraction / Uniform Reliable Broadcast Abstraction Despite Unreliable Channels / The Consensus Abstraction / Consensus Algorithms for Asynchronous Systems Enriched with Various Failure Detectors / Constructing Failure Detectors

  • Gazing at Games: An Introduction to Eye Tracking Control

    Veronica Sundstedt
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Eye tracking is a process that identifies a specific point in both space and time that is being looked at by the observer. This information can also be used in real-time to control applications using the eyes. Recent innovations in the video game industry include alternative input modalities to provide an enhanced, more immersive user experience. In particular, eye gaze control has recently been explored as an input modality in video games. This book is an introduction for those interested in using eye tracking to control or analyze video games and virtual environments. Key concepts are illustrated through three case studies in which gaze control and voice recognition have been used in combination to control virtual characters and applications. The lessons learned in the case studies are presented and issues relating to incorporating eye tracking in interactive applications are discussed. The reader will be given an introduction to human visual attention, eye movements and eye tracking technologies. Previous work in the field of studying fixation behavior in games and using eye tracking for video game interaction will also be presented. The final chapter discusses ideas for how this field can be developed further to create richer interaction for characters and crowds in virtual environments. Alternative means of interaction in video games are especially important for disabled users for whom traditional techniques, such as mouse and keyboard, may be far from ideal. This book is also relevant for those wishing to use gaze control in applications other than games. Table of Contents: Introduction / The Human Visual System / Eye Tracking / Eye Tracking in Video Games / Gaze and Voice Controlled Video Games: Case Study I and II / Gaze and Voice Controlled Drawing: Case Study III / Conclusion

  • Data Integration: The Relational Logic Approach

    Michael Genesereth
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Data integration is a critical problem in our increasingly interconnected but inevitably heterogeneous world. There are numerous data sources available in organizational databases and on public information systems like the World Wide Web. Not surprisingly, the sources often use different vocabularies and different data structures, being created, as they are, by different people, at different times, for different purposes. The goal of data integration is to provide programmatic and human users with integrated access to multiple, heterogeneous data sources, giving each user the illusion of a single, homogeneous database designed for his or her specific need. The good news is that, in many cases, the data integration process can be automated. This book is an introduction to the problem of data integration and a rigorous account of one of the leading approaches to solving this problem, viz., the relational logic approach. Relational logic provides a theoretical framework for discussing data integration. Moreover, in many important cases, it provides algorithms for solving the problem in a computationally practical way. In many respects, relational logic does for data integration what relational algebra did for database theory several decades ago. A companion web site provides interactive demonstrations of the algorithms. Table of Contents: Preface / Interactive Edition / Introduction / Basic Concepts / Query Folding / Query Planning / Master Schema Management / Appendix / References / Index / Author Biography

  • P2P Techniques for Decentralized Applications

    Esther Pacitti ; Reza Akbarinia ; Manal El-Dick
    Copyright Year: 2012

    Morgan and Claypool eBooks

    As an alternative to traditional client-server systems, Peer-to-Peer (P2P) systems provide major advantages in terms of scalability, autonomy and dynamic behavior of peers, and decentralization of control. Thus, they are well suited for large-scale data sharing in distributed environments. Most of the existing P2P approaches for data sharing rely on either structured networks (e.g., DHTs) for efficient indexing, or unstructured networks for ease of deployment, or some combination. However, these approaches have some limitations, such as lack of freedom for data placement in DHTs, and high latency and high network traffic in unstructured networks. To address these limitations, gossip protocols, which are easy to deploy and scale well, can be exploited. In this book, we will give an overview of these different P2P techniques and architectures, discuss their trade-offs, and illustrate their use for decentralizing several large-scale data sharing applications. Table of Contents: P2P Overlays, Query Routing, and Gossiping / Content Distribution in P2P Systems / Recommendation Systems / Top-k Query Processing in P2P Systems
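
    The gossip protocols highlighted in this abstract spread information in roughly a logarithmic number of rounds. The following minimal push-gossip simulation is only meant to convey that behaviour; the peer count, fanout, and round structure are assumptions for illustration and not a protocol from the book.

      import random

      def gossip_rounds(num_peers=1000, fanout=2, seed=42):
          """Rounds until every peer holds a message injected at peer 0."""
          random.seed(seed)
          informed = {0}
          rounds = 0
          while len(informed) < num_peers:
              newly = set()
              for _ in informed:
                  # each informed peer pushes to `fanout` peers chosen at random
                  newly.update(random.sample(range(num_peers), fanout))
              informed |= newly
              rounds += 1
          return rounds

      print(gossip_rounds())   # typically on the order of log(num_peers) rounds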

  • Design of Reconfigurable Antennas Using Graph Models

    Joseph Costantine ; Youssef Tawk ; Christos Christodoulou
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This lecture discusses the use of graph models to represent reconfigurable antennas. The rise of antennas that adapt to their environment and change their operation based on the user's request hasn't been met with clear design guidelines. There is a need to propose some rules for the optimization of any reconfigurable antenna design and performance. Since reconfigurable antennas are seen as a collection of self-organizing parts, graph models can be introduced to relate each possible topology to a corresponding electromagnetic performance in terms of achieving a characteristic frequency of operation, impedance, and polarization. These models help designers understand reconfigurable antenna structures and enhance their functionality since they transform antennas from bulky devices into mathematical and software accessible models. The use of graphs facilitates the software control and cognition ability of reconfigurable antennas while optimizing their performance. This lecture also discusses the reduction of redundancy, complexity and reliability of reconfigurable antennas and reconfigurable antenna arrays. The full analysis of these parameters allows a better reconfigurable antenna implementation in wireless and space communications platforms. The use of graph models to reduce the complexity while preserving the reliability of reconfigurable antennas allows better incorporation in applications such as cognitive radio, MIMO, satellite communications, and personal communication systems. A swifter response time is achieved with less cost and losses. This lecture is written for individuals who wish to venture into the field of reconfigurable antennas, with a little prior experience in this area, and learn how graph rules and theory, mainly used in the field of computer science, networking, and control systems can be applied to electromagnetic structures. This lecture will walk the reader through a design and analysis process of reconfigurable antennas using graph models with a practical and theoretical outlook.

  • Multidimensional Databases and Data Warehousing

    Christian Jensen ; Torben Bach Pedersen ; Christian Thomsen
    Copyright Year: 2010

    Morgan and Claypool eBooks

    The present book's subject is multidimensional data models and data modeling concepts as they are applied in real data warehouses. The book aims to present the most important concepts within this subject in a precise and understandable manner. The book's coverage of fundamental concepts includes data cubes and their elements, such as dimensions, facts, and measures and their representation in a relational setting; it includes architecture-related concepts; and it includes the querying of multidimensional databases. The book also covers advanced multidimensional concepts that are considered to be particularly important. This coverage includes advanced dimension-related concepts such as slowly changing dimensions, degenerate and junk dimensions, outriggers, parent-child hierarchies, and unbalanced, non-covering, and non-strict hierarchies. The book offers a principled overview of key implementation techniques that are particularly important to multidimensional databases, including materialized views, bitmap indices, join indices, and star join processing. The book ends with a chapter that presents the literature on which the book is based and offers further readings for those readers who wish to engage in more in-depth study of specific aspects of the book's subject. Table of Contents: Introduction / Fundamental Concepts / Advanced Concepts / Implementation Issues / Further Readings
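
    To make the cube terminology above concrete, the pandas sketch below aggregates a toy fact table along two dimensions, with the margins standing in for roll-ups to the ALL level. The table contents are invented for illustration and are not examples from the book.

      import pandas as pd

      # Toy fact table: each row is a sale with two dimension attributes and one measure.
      facts = pd.DataFrame({
          "store":   ["Berlin", "Berlin", "Aarhus", "Aarhus", "Aarhus"],
          "quarter": ["Q1", "Q2", "Q1", "Q1", "Q2"],
          "amount":  [100, 150, 80, 40, 200],
      })

      # Total amount per (store, quarter) cell, i.e. a small two-dimensional cube.
      cube = facts.pivot_table(index="store", columns="quarter", values="amount",
                               aggfunc="sum", margins=True, margins_name="ALL")
      print(cube)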

  • Conceptual Models: Core to Good Design

    Jeff Johnson ; Austin Henderson
    Copyright Year: 2011

    Morgan and Claypool eBooks

    People make use of software applications in their activities, applying them as tools in carrying out tasks. That this use should be good for people--easy, effective, efficient, and enjoyable--is a principal goal of design. In this book, we present the notion of Conceptual Models, and argue that Conceptual Models are core to achieving good design. From years of helping companies create software applications, we have come to believe that building applications without Conceptual Models is just asking for designs that will be confusing and difficult to learn, remember, and use. We show how Conceptual Models are the central link between the elements involved in application use: people's tasks (task domains), the use of tools to perform the tasks, the conceptual structure of those tools, the presentation of the conceptual model (i.e., the user interface), the language used to describe it, its implementation, and the learning that people must do to use the application. We further show that putting a Conceptual Model at the center of the design and development process can pay rich dividends: designs that are simpler and mesh better with users' tasks, avoidance of unnecessary features, easier documentation, faster development, improved customer uptake, and decreased need for training and customer support. Table of Contents: Using Tools / Start with the Conceptual Model / Definition / Structure / Example / Essential Modeling / Optional Modeling / Process / Value / Epilogue

  • Optical Interconnects

    Ray T. Chen ; Chulchae Choi
    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book describes fully embedded board level optical interconnect in detail including the fabrication of the thin-film VCSEL array, its characterization, thermal management, the fabrication of the optical interconnection layer, and the integration of devices on a flexible waveguide film. All the optical components are buried within electrical PCB layers in a fully embedded board level optical interconnect. Therefore, we can save footprint on the top real estate of the PCB and relieve packaging difficulty by separating fabrication processes. To realize fully embedded board level optical interconnects, many stumbling blocks need to be addressed such as thin-film transmitter and detector, thermal management, process compatibility, reliability, cost effective fabrication process, and easy integration. The material presented eventually will relieve such concerns and make the integration of optical interconnection highly feasible. The hybrid integration of the optical interconnection layer and electrical layers is ongoing.

  • Engineering and Society: Working Towards Social Justice, Part III: Windows on Society

    George Catalano ; Caroline Baillie
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Engineers work in an increasingly complex entanglement of ideas, people, cultures, technology, systems and environments. Today, decisions made by engineers often have serious implications for not only their clients but for society as a whole and the natural world. Such decisions may potentially influence cultures, ways of living, as well as alter ecosystems which are in delicate balance. In order to make appropriate decisions and to co-create ideas and innovations within and among the complex networks of communities which currently exist and are shaped by our decisions, we need to regain our place as professionals, to realise the significance of our work and to take responsibility in a much deeper sense. Engineers must develop the 'ability to respond' to emerging needs of all people, across all cultures. To do this requires insights and knowledge which are at present largely within the domain of the social and political sciences but which need to be shared with our students in ways which are meaningful and relevant to engineering. This book attempts to do just that. In Part 1 Baillie introduces ideas associated with the ways in which engineers relate to the communities in which they work. Drawing on scholarship from science and technology studies, globalisation and development studies, as well as work in science communication and dialogue, this introductory text sets the scene for an engineering community which engages with the public. In Part 2 Catalano frames the thinking processes necessary to create ethical and just decisions in engineering, to understand the implications of our current decision making processes and think about ways in which we might adapt these to become more socially just in the future. In Part 3 Baillie and Catalano have provided case studies of everyday issues such as water, garbage and alarm clocks, to help us consider how we might see through the lenses of our new knowledge from Parts 1 and 2 and apply this to our everyday existence as engineers.

  • iRODS Primer: Integrated Rule-Oriented Data System

    Arcot Rajasekar ; Reagan Moore ; Chien-Yi Hou ; Christopher A. Lee
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Policy-based data management enables the creation of community-specific collections. Every collection is created for a purpose. The purpose defines the set of properties that will be associated with the collection. The properties are enforced by management policies that control the execution of procedures that are applied whenever data are ingested or accessed. The procedures generate state information that defines the outcome of enforcing the management policy. The state information can be queried to validate assessment criteria and verify that the required collection properties have been conserved. The integrated Rule-Oriented Data System implements the data management framework required to support policy-based data management. Policies are turned into computer actionable Rules. Procedures are composed from a Micro-service-oriented architecture. The result is a highly extensible and tunable system that can enforce management policies, automate administrative tasks, and periodically validate assessment criteria. Table of Contents: Introduction / Integrated Rule-Oriented Data System / iRODS Architecture / Rule-Oriented Programming / The iRODS Rule System / iRODS Micro-services / Example Rules / Extending iRODS / Appendix A: iRODS Shell Commands / Appendix B: Rulegen Grammar / Appendix C: Exercises / Author Biographies

  • Translating Euclid: Designing a Human-Centered Mathematics

    Gerry Stahl
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Translating Euclid reports on an effort to transform geometry for students from a stylus-and-clay-tablet corpus of historical theorems to a stimulating computer-supported collaborative-learning inquiry experience. The origin of geometry was a turning point in the pre-history of informatics, literacy, and rational thought. Yet, this triumph of human intellect became ossified through historic layers of systematization, beginning with Euclid’s organization of the Elements of geometry. Often taught by memorization of procedures, theorems, and proofs, geometry in schooling rarely conveys its underlying intellectual excitement. The recent development of dynamic-geometry software offers an opportunity to translate the study of geometry into a contemporary vernacular. However, this involves transformations along multiple dimensions of the conceptual and practical context of learning. Translating Euclid steps through the multiple challenges involved in redesigning geometry education to take advantage of computer support. Networked computers portend an interactive approach to exploring dynamic geometry as well as broadened prospects for collaboration. The proposed conception of geometry emphasizes the central role of the construction of dependencies as a design activity, integrating human creation and mathematical discovery to form a human-centered approach to mathematics. This book chronicles an iterative effort to adapt technology, theory, pedagogy and practice to support this vision of collaborative dynamic geometry and to evolve the approach through on-going cycles of trial with students and refinement of resources. It thereby provides a case study of a design-based research effort in computer-supported collaborative learning from a human-centered informatics perspective.

  • The Paradigm Shift to Multimodality in Contemporary Computer Interfaces

    Sharon Oviatt ; Philip R. Cohen
    Copyright Year: 2015

    Morgan and Claypool eBooks

    During the last decade, cell phones with multimodal interfaces based on combined new media have become the dominant computer interface worldwide. Multimodal interfaces support mobility and expand the expressive power of human input to computers. They have shifted the fulcrum of human-computer interaction much closer to the human. This book explains the foundation of human-centered multimodal interaction and interface design, based on the cognitive and neurosciences, as well as the major benefits of multimodal interfaces for human cognition and performance. It describes the data-intensive methodologies used to envision, prototype, and evaluate new multimodal interfaces. From a system development viewpoint, this book outlines major approaches for multimodal signal processing, fusion, architectures, and techniques for robustly interpreting users' meaning. Multimodal interfaces have been commercialized extensively for field and mobile applications during the last decade. Research also is growing rapidly in areas like multimodal data analytics, affect recognition, accessible interfaces, embedded and robotic interfaces, machine learning and new hybrid processing approaches, and similar topics. The expansion of multimodal interfaces is part of the long-term evolution of more expressively powerful input to computers, a trend that will substantially improve support for human cognition and performance.

  • How We Cope with Digital Technology

    Phil Turner
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Digital technology has become a defining characteristic of modern life. Almost everyone uses it, we all rely on it, and many of us own a multitude of devices. What is more, we all expect to be able to use these technologies "straight out the box." This lecture discusses how we are able to do this without apparent problems. We are able to use digital technology because we have learned to cope with it. "To cope" is used in philosophy to mean "absorbed engagement," that is, we use our smart phones and tablet computers with little or no conscious effort. In human-computer interaction this kind of use is more often described as intuitive. While this, of course, is testament to improved design, our interest in this lecture is in the human side of these interactions. We cope with technology because we are familiar with it. We define familiarity as the readiness to engage with technology which arises from being repeatedly exposed to it—often from birth. This exposure involves the frequent use of it and seeing people all around us using it every day. Digital technology has become as common a feature of our everyday lives as the motor car, TV, credit card, cutlery, or a dozen other things which we also use without conscious deliberation. We will argue that we cope with digital technology in the same way as we do these other technologies by means of this everyday familiarity. But this is only half of the story. We also regularly support or scaffold our use of technology. These scaffolding activities are described as "epistemic actions" which we adopt to make it easier for us to accomplish our goals. With digital technology these epistemic actions include appropriating it to more closely meet our needs. In summary, coping is a situated, embodied, and distributed description of how we use digital technology. Table of Contents: Introduction / Familiarity / Coping / Epistemic Scaffolding / Coping in Context / Bibliography / Author Biography

  • Business Processes: A Database Perspective

    Tova Milo ; Daniel Deutch
    Copyright Year: 2012

    Morgan and Claypool eBooks

    While classic data management focuses on the data itself, research on Business Processes also considers the context in which this data is generated and manipulated, namely the processes, users, and goals that this data serves. This provides the analysts a better perspective of the organizational needs centered around the data. As such, this research is of fundamental importance. Much of the success of database systems in the last decade is due to the beauty and elegance of the relational model and its declarative query languages, combined with a rich spectrum of underlying evaluation and optimization techniques, and efficient implementations. Much like the case for traditional database research, elegant modeling and rich underlying technology are likely to be highly beneficial for the Business Process owners and their users; both can benefit from easy formulation and analysis of the processes. While there have been many important advances in this research in recent years, there is still much to be desired: specifically, there have been many works that focus on the process behavior (flow), and many that focus on its data, but only very few works have dealt with both. This book surveys the state-of-the-art in a database approach to Business Process modeling and analysis, describes the progress towards a holistic flow-and-data framework for these tasks, and highlights the current gaps and research directions. Table of Contents: Introduction / Modeling / Querying Business Processes / Other Issues / Conclusion

  • Crafting your Research Future: A Guide to Successful Master's and PhD Degrees in Science & Engineering

    Charles Ling ; Qiang Yang
    Copyright Year: 2012

    Morgan and Claypool eBooks

    What is it like to be a researcher or a scientist? For young people, including graduate students and junior faculty members in universities, how can they identify good ideas for research? How do they conduct solid research to verify and realize their new ideas? How can they formulate their ideas and research results into high-quality articles, and publish them in highly competitive journals and conferences? What are effective ways to supervise graduate students so that they can establish themselves quickly in their research careers? In this book, Ling and Yang answer these questions in a step-by-step manner with specific and concrete examples from their first-hand research experience. Table of Contents: Acknowledgments / Preface / Basics of Research / Goals of Ph.D. Research / Getting Started: Finding New Ideas and Organizing Your Plans / Conducting Solid Research / Writing and Publishing Papers / Misconceptions and Tips for Paper Writing / Writing and Defending a Ph.D. Thesis / Life after Ph.D. / Summary / References / Author Biographies

  • Developing Embedded Software using DaVinci and OMAP Technology

    B. I. Pawate
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book discusses how to develop embedded products using DaVinci & OMAP Technology from Texas Instruments Incorporated. It presents a single software platform for diverse hardware platforms. DaVinci & OMAP Technology refers to the family of processors, development tools, software products, and support. While DaVinci Technology is driven by the needs of consumer video products such as IP network cameras, networked projectors, digital signage and portable media players, OMAP Technology is driven by the needs of wireless products such as smart phones. Texas Instruments offers a wide variety of processing devices to meet our users' price and performance needs. These vary from single digital signal processing devices to complex, system-on-chip (SoC) devices with multiple processors and peripherals. As a software developer you question: Do I need to become an expert in signal processing and learn the details of these complex devices before I can use them in my application? As a senior executive you wonder: How can I reduce my engineering development cost? How can I move from one processor to another from Texas Instruments without incurring a significant development cost? This book addresses these questions with sample code and gives an insight into the software architecture and associated component software products that make up this software platform. As an example, we show how we develop an IP network camera. Using this software platform, you can choose to focus on the application and quickly create a product without having to learn the details of the underlying hardware or signal processing algorithms. Alternatively, you can choose to differentiate at both the application as well as the signal processing layer by developing and adding your algorithms using the xDAIS for Digital Media, xDM, guidelines for component software. Finally, you may use one code base across different hardware platforms. Table of Contents: Software Platform / More about xDM, VISA, & CE / Building a Product Based on DaVinci Technology / Reducing Development Cost / eXpressDSP Digital Media (xDM) / Sample Application Using xDM / Embedded Peripheral Software Interface (EPSI) / Sample Application Using EPSI / Sample Application Using EPSI and xDM / IP Network Camera on DM355 Using TI Software / Adding your secret sauce to the Signal Processing Layer (SPL) / Further Reading

  • Power Electronics for Modern Wind Turbines

    Frede Blaabjerg ; Zhe Chen
    Copyright Year: 2006

    Morgan and Claypool eBooks

    Wind energy is now the world's fastest growing energy source. In the past 10 years, the global wind energy capacity has increased rapidly. The installed global wind power capacity has grown to 47.317 GW from about 3.5 GW in 1994. The global wind power industry installed 7976 MW in 2004, an increase in total installed generating capacity of 20%. The phenomenal growth in the wind energy industry can be attributed to concerns about environmental issues, and research and development of innovative cost-reducing technologies. Denmark is a leading producer of wind turbines in the world, with an almost 40% share of the total worldwide production. The wind energy industry is a giant contributor to the Danish economy. In Denmark, the 3117 MW (in 2004) wind power is supplied by approximately 5500 wind turbines. Individuals and cooperatives own around 80% of the capacity. Denmark will increase the percentage of energy produced from wind to 25% by 2008, and aims for a 50% wind share of energy production by 2025. Wind technology has improved significantly over the past two decades, and almost all of the aspects related to the wind energy technology are still under active research and development. However, this monograph will introduce some basics of the electrical and power electronic aspects involved with modern wind generation systems, including modern power electronics and converters, electric generation and conversion systems for both fixed speed and variable speed systems, control techniques for wind turbines, configurations of wind farms, and the issues of integrating wind turbines into power systems.

  • Modern EMC Analysis Techniques Volume II: Models and Applications

    Nikolaos V. Kantartzis ; Theodoros D. Tsiboukis
    Copyright Year: 2008

    Morgan and Claypool eBooks

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of modern real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, numerical investigations delve into printed circuit boards, monolithic microwave integrated circuits, radio frequency microelectromechanical systems as well as to the critical issues of electromagnetic interference, immunity, shielding, and signal integrity. Biomedical problems and EMC test facility characterizations are also thoroughly covered by means of diverse time-domain models and accurate implementations. Furthermore, the analysis covers the case of large-scale applications and electrostatic discharge problems, while special attention is drawn to the impact of contemporary materials in the EMC world, such as double negative metamaterials, bi-isotropic media, and several others. Table of Contents: Introduction / Printed Circuit Boards in EMC Structures / Electromagnetic Interference, Immunity, Shielding, and Signal Integrity / Bioelectromagnetic Problems: Human Exposure to Electromagnetic Fields / Time-Domain Characterization of EMC Test Facilities / Large-Scale EMC and Electrostatic Discharge Problems / Contemporary Material Modeling in EMC Applications

  • Data Representations, Transformations, and Statistics for Visual Reasoning

    Ross Maciejewski
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Analytical reasoning techniques are methods by which users explore their data to obtain insight and knowledge that can directly support situational awareness and decision making. Recently, the analytical reasoning process has been augmented through the use of interactive visual representations and tools which utilize cognitive, design and perceptual principles. These tools are commonly referred to as visual analytics tools, and the underlying methods and principles have roots in a variety of disciplines. This chapter provides an introduction to young researchers as an overview of common visual representations and statistical analysis methods utilized in a variety of visual analytics systems. The application and design of visualization and analytical algorithms are subject to design decisions, parameter choices, and many conflicting requirements. As such, this chapter attempts to provide an initial set of guidelines for the creation of the visual representation, including pitfalls and areas where the graphics can be enhanced through interactive exploration. Basic analytical methods are explored as a means of enhancing the visual analysis process, moving from visual analysis to visual analytics. Table of Contents: Data Types / Color Schemes / Data Preconditioning / Visual Representations and Analysis / Summary

  • Control System Synthesis: A Factorization Approach, Part II

    Mathukumalli Vidyasagar
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book introduces the so-called "stable factorization approach" to the synthesis of feedback controllers for linear control systems. The key to this approach is to view the multi-input, multi-output (MIMO) plant for which one wishes to design a controller as a matrix over the fraction field F associated with a commutative ring with identity, denoted by R, which also has no divisors of zero. In this setting, the set of single-input, single-output (SISO) stable control systems is precisely the ring R, while the set of stable MIMO control systems is the set of matrices whose elements all belong to R. The set of unstable, meaning not necessarily stable, control systems is then taken to be the field of fractions F associated with R in the SISO case, and the set of matrices with elements in F in the MIMO case. The central notion introduced in the book is that, in most situations of practical interest, every matrix P whose elements belong to F can be "factored" as a "ratio" of two matrices N, D whose elements belong to R, in such a way that N, D are coprime. In the familiar case where the ring R corresponds to the set of bounded-input, bounded-output (BIBO)-stable rational transfer functions, coprimeness is equivalent to two functions not having any common zeros in the closed right half-plane including infinity. However, the notion of coprimeness extends readily to discrete-time systems, distributed-parameter systems in both the continuous- as well as discrete-time domains, and to multi-dimensional systems. Thus the stable factorization approach enables one to capture all these situations within a common framework. The key result in the stable factorization approach is the parametrization of all controllers that stabilize a given plant. It is shown that the set of all stabilizing controllers can be parametrized by a single parameter R, whose elements all belong to R. Moreover, every transfer matrix in the closed-loop system is an affine function of the design parameter R. Thus problems of reliable stabilization, disturbance rejection, robust stabilization etc. can all be formulated in terms of choosing an appropriate R. This is a reprint of the book Control System Synthesis: A Factorization Approach originally published by M.I.T. Press in 1985.
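
    For readers unfamiliar with the parametrization described above, the single-input, single-output case can be stated compactly in LaTeX; the notation below is generic textbook notation chosen here under the abstract's assumptions, not necessarily the book's own symbols.

      % Stable factorization in the SISO case: P = n/d with n, d coprime in R,
      % i.e. x n + y d = 1 for some x, y in R.
      P = \frac{n}{d}, \qquad x\,n + y\,d = 1 .
      % All stabilizing controllers are parametrized by a single r in R:
      C(r) = \frac{x + r\,d}{y - r\,n}, \qquad y - r\,n \neq 0 ,
      % and every closed-loop transfer function is affine in r, for example
      \frac{P\,C}{1 + P\,C} = n\,(x + r\,d).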

  • Understanding Circuits: Learning Problem Solving Using Circuit Analysis

    Khalid Sayood
    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book/lecture is intended for a college freshman level class in problem solving, where the particular problems deal with electrical and electronic circuits. It can also be used in a junior/senior level class in high school to teach circuit analysis. The basic problem-solving paradigm used in this book is that of resolution of a problem into its component parts. The reader learns how to break down circuits of varying levels of complexity into simpler parts using this paradigm. The problem-solving exercises also familiarize the reader with a number of different circuit components including resistors, capacitors, diodes, transistors, and operational amplifiers and their use in practical circuits. The reader should come away with both an understanding of how to approach complex problems and a “feel” for electrical and electronic circuits.
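
    The decomposition paradigm described above amounts to reducing a circuit piece by piece. The short Python sketch below does this for a made-up series-parallel resistor network; the component values are illustrative only and are not taken from the book.

      def series(*rs):
          return sum(rs)

      def parallel(*rs):
          return 1.0 / sum(1.0 / r for r in rs)

      # A 100-ohm resistor in series with 300 ohms in parallel with 150 ohms.
      r_eq = series(100, parallel(300, 150))
      print(r_eq)        # 200.0 ohms
      print(10 / r_eq)   # 0.05 A drawn from a 10 V source (Ohm's law on the reduced circuit)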

  • Studies of Work and the Workplace in HCI: Concepts and Techniques

    Graham Button ; Wes Sharrock
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book has two purposes. First, to introduce the study of work and the workplace as a method for informing the design of computer systems to be used at work. We primarily focus on the predominant way in which the organization of work has been approached within the field of human-computer interaction (HCI), which is from the perspective of ethnomethodology. We locate studies of work in HCI within its intellectual antecedents, and describe paradigmatic examples and case studies. Second, we hope to provide those who are intending to conduct the type of fieldwork that studies of work and the workplace draw on with suggestions as to how they can go about their own work of developing observations about the settings they encounter. These suggestions take the form of a set of maxims that we have found useful while conducting the studies we have been involved in. We draw from our own fieldwork notes in order to illustrate these maxims. In addition we also offer some homilies about how to make observations; again, these are ones we have found useful in our own work. Table of Contents: Motivation / Overview: A Paradigmatic Case / Scientific Foundations / Detailed Description / Case Study / How to Conduct Ethnomethodological Studies of Work / Making Observations / Current Status

  • Fundamentals of Engineering Economics and Decision Analysis

    David Whitman ; Ronald Terry
    Copyright Year: 2012

    Morgan and Claypool eBooks

    The authors cover two general topics: basic engineering economics and risk analysis in this text. Within the topic of engineering economics are discussions on the time value of money and interest relationships. These interest relationships are used to define certain project criteria that are used by engineers and project managers to select the best economic choice among several alternatives. Projects examined will include both income- and service-producing investments. The effects of escalation, inflation, and taxes on the economic analysis of alternatives are discussed. Risk analysis incorporates the concepts of probability and statistics in the evaluation of alternatives. This allows management to determine the probability of success or failure of the project. Two types of sensitivity analyses are presented. The first is referred to as the range approach while the second uses probabilistic concepts to determine a measure of the risk involved. The authors have designed the text to assist individuals to prepare to successfully complete the economics portions of the Fundamentals of Engineering Exam. Table of Contents: Introduction / Interest and the Time Value of Money / Project Evaluation Methods / Service Producing Investments / Income Producing Investments / Determination of Project Cash Flow / Financial Leverage / Basic Statistics and Probability / Sensitivity Analysis
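
    The time-value-of-money relationships at the heart of the text can be summarized with a few standard formulas, written here in LaTeX; the numeric example is an illustration chosen for this listing, not one of the book's problems.

      % Compound amount, present worth, and capital recovery:
      F = P\,(1+i)^{n}, \qquad P = \frac{F}{(1+i)^{n}}, \qquad
      A = P\,\frac{i\,(1+i)^{n}}{(1+i)^{n} - 1}
      % Worked number: P = $1000 at i = 8% over n = 5 years gives
      % F = 1000 * 1.08^5, approximately $1469.33.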

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Concepts:From Books to Cyberspace Identities

    Gary Marchionini
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Information is essential to all human activity, and information in electronic form both amplifies and augments human information interactions. This lecture surveys some of the different classical meanings of information, focuses on the ways that electronic technologies are affecting how we think about these senses of information, and introduces an emerging sense of information that has implications for how we work, play, and interact with others. The evolution of computers and electronic networks, together with people's uses and adaptations of these tools, manifests a dynamic space called cyberspace. Our traces of activity in cyberspace give rise to a new sense of information as instantaneous identity states that I term proflection of self. Proflections of self influence how others act toward us. Four classical senses of information are described as context for this new form of information. The four senses selected for inclusion here are the following: thought and memory, communication process, artifact, and energy. Human mental activity and state (thought and memory) have neurological, cognitive, and affective facets. The act of informing (communication process) is considered from the perspective of human intentionality and technical developments that have dramatically amplified human communication capabilities. Information artifacts comprise a common sense of information that gives rise to a variety of information industries. Energy is the most general sense of information and is considered from the point of view of physical, mental, and social state change. This sense includes information theory as a measurable reduction in uncertainty. This lecture emphasizes how electronic representations have blurred media boundaries and added computational behaviors that yield new forms of information interaction, which, in turn, are stored, aggregated, and mined to create profiles that represent our cyber identities. Table of Contents: The Many Meanings of Information / Information as Thought and Memory / Information as Communication Process / Information as Artifact / Information as Energy / Information as Identity in Cyberspace: The Fifth Voice / Conclusion and Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Waveform-Agile Sensing for Tracking

    Sandeep Prasad Sira ; Antonia Papandreou-Suppappola ; Darryl Morrell
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Recent advances in sensor technology and information processing afford a new flexibility in the design of waveforms for agile sensing. Sensors are now developed with the ability to dynamically choose their transmit or receive waveforms in order to optimize an objective cost function. This has exposed a new paradigm of significant performance improvements in active sensing: dynamic waveform adaptation to environmental conditions, target structures, or information features. The manuscript provides a review of recent advances in waveform-agile sensing for target tracking applications. A dynamic waveform selection and configuration scheme is developed for two active sensors that track one or multiple mobile targets. A detailed description of two sequential Monte Carlo algorithms for agile tracking is presented, together with relevant Matlab code and simulation studies, to demonstrate the benefits of dynamic waveform adaptation. The work will be of interest not only to practitioners of radar and sonar, but also to those working in other applications where waveforms can be dynamically designed, such as communications and biosensing. Table of Contents: Waveform-Agile Target Tracking Application Formulation / Dynamic Waveform Selection with Application to Narrowband and Wideband Environments / Dynamic Waveform Selection for Tracking in Clutter / Conclusions / CRLB Evaluation for Gaussian Envelope GFM Chirp from the Ambiguity Function / CRLB Evaluation from the Complex Envelope View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Proxemic Interactions:From Theory to Practice

    Nicolai Marquardt ; Saul Greenberg
    Copyright Year: 2015

    Morgan and Claypool eBooks

    In the everyday world, much of what we do as social beings is dictated by how we perceive and manage our interpersonal space. This is called proxemics. At its simplest, people naturally correlate physical distance to social distance. We believe that people’s expectations of proxemics can be exploited in interaction design to mediate their interactions with devices (phones, tablets, computers, appliances, large displays) contained within a small ubiquitous computing ecology. Just as people expect increasing engagement and intimacy as they approach others, so should they naturally expect increasing connectivity and interaction possibilities as they bring themselves and their devices in close proximity to one another. This is called Proxemic Interactions. This book concerns the design of proxemic interactions within such future proxemic-aware ecologies. It imagines a world of devices that have fine-grained knowledge of nearby people and other devices—how they move into range, their precise distance, their identity, and even their orientation—and how such knowledge can be exploited to design interaction techniques. The first part of this book concerns theory. After introducing proxemics, we operationalize proxemics for ubicomp interaction via the Proxemic Interactions framework that designers can use to mediate people’s interactions with digital devices. The framework, in part, identifies five key dimensions of proxemic measures (distance, orientation, movement, identity, and location) to consider when designing proxemic-aware ubicomp systems. The second part of this book applies this theory to practice via three case studies of proxemic-aware systems that react continuously to people’s and devices’ proxemic relationships. The case studies explore the application of proxemics in small-space ubicomp ecologies by considering first person-to-device, then device-to-device, and finally person-to-person and device-to-device proxemic relationships. We also offer a critical perspective on proxemic interactions in the form of “dark patterns,” where knowledge of proxemics may (and likely will) be easily exploited to the detriment of the user. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Biomedical Transport Processes

    Gerald Miller
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Transport processes represent important life-sustaining elements in all humans. These include mass transfer processes such as gas exchange in the lungs, transport across capillaries and alveoli, transport across the kidneys, and transport across cell membranes. These mass transfer processes affect how oxygen and carbon dioxide are exchanged in your bloodstream, how metabolic waste products are removed from your blood, how nutrients are transported to tissues, and how all cells function throughout the body. A discussion of kidney dialysis and gas exchange mechanisms is included. Another element in biomedical transport processes is that of momentum transport and fluid flow. This describes how blood is propelled from the heart and throughout the cardiovascular system, how blood elements affect the body, including gas exchange, infection control, clotting of blood, and blood flow resistance, which affects cardiac work. A discussion of the measurement of the blood resistance to flow (viscosity), blood flow, and pressure is also included. A third element in transport processes in the human body is that of heat transfer, including heat transfer inside the body towards the periphery as well as heat transfer from the body to the environment. A discussion of temperature measurements and body protection in extreme heat conditions is also included. Table of Contents: Biomedical Mass Transport / Biofluid Mechanics and Momentum Transport / Biomedical Heat Transport View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Complexity of Noise:A Philosophical Outlook on Quantum Error Correction

    Amit Hagar
    Copyright Year: 2010

    Morgan and Claypool eBooks

    In quantum computing, where algorithms exist that can solve computational problems more efficiently than any known classical algorithms, the elimination of errors that result from external disturbances or from imperfect gates has become the "holy grail", and a worldwide quest for a large scale fault-tolerant, and computationally superior, quantum computer is currently taking place. Optimists rely on the premise that, under a certain threshold of errors, an arbitrarily long fault-tolerant quantum computation can be achieved with only moderate (i.e., at most polynomial) overhead in computational cost. Pessimists, on the other hand, object that there are in principle (as opposed to merely technological) reasons why such machines are still nonexistent, and that no matter what gadgets are used, large scale quantum computers will never be computationally superior to classical ones. Lacking a complete empirical characterization of quantum noise, the debate on the physical possibility of such machines invites philosophical scrutiny. Making this debate more precise by suggesting a novel statistical mechanical perspective thereof is the goal of this project. Table of Contents: Introduction / The Curse of the Open System / To Balance a Pencil on Its Tip / Universality at All Cost / Coda View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    DSP for MATLAB™ and LabVIEW™ III:Digital Filter Design

    Forester W. Isen
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book is Volume III of the series DSP for MATLAB™ and LabVIEW™. Volume III covers digital filter design, including the specific topics of FIR design via windowed-ideal-lowpass filter, FIR highpass, bandpass, and bandstop filter design from windowed-ideal lowpass filters, FIR design using the transition-band-optimized Frequency Sampling technique (implemented by Inverse-DFT or Cosine/Sine Summation Formulas), design of equiripple FIRs of all standard types including Hilbert Transformers and Differentiators via the Remez Exchange Algorithm, design of Butterworth, Chebyshev (Types I and II), and Elliptic analog prototype lowpass filters, conversion of analog lowpass prototype filters to highpass, bandpass, and bandstop filters, and conversion of analog filters to digital filters using the Impulse Invariance and Bilinear Transform techniques. Certain filter topologies specific to FIRs are also discussed, as are two simple FIR types, the Comb and Moving Average filters. The entire series consists of four volumes that collectively cover basic digital signal processing in a practical and accessible manner, but which nonetheless include all essential foundation mathematics. As the series title implies, the scripts (of which there are more than 200) described in the text and supplied in code form here will run on both MATLAB™ and LabVIEW™. The text for all volumes contains many examples, and many useful computational scripts, augmented by demonstration scripts and LabVIEW™ Virtual Instruments (VIs) that can be run to illustrate various signal processing concepts graphically on the user's computer screen. Volume I consists of four chapters that collectively set forth a brief overview of the field of digital signal processing, useful signals and concepts (including convolution, recursion, difference equations, LTI systems, etc), conversion from the continuous to discrete domain and back (i.e., analog-to-digital and digital-to-analog conversion), aliasing, the Nyquist rate, normalized frequency, sample rate conversion and Mu-law compression, and signal processing principles including correlation, the correlation sequence, the Real DFT, correlation by convolution, matched filtering, simple FIR filters, and simple IIR filters. Chapter four of Volume I, in particular, provides an intuitive or "first principle" understanding of how digital filtering and frequency transforms work. Volume II provides detailed coverage of discrete frequency transforms, including a brief overview of common frequency transforms, both discrete and continuous, followed by detailed treatments of the Discrete Time Fourier Transform (DTFT), the z-Transform (including definition and properties, the inverse z-transform, frequency response via z-transform, and alternate filter realization topologies including Direct Form, Direct Form Transposed, Cascade Form, Parallel Form, and Lattice Form), and the Discrete Fourier Transform (DFT) (including Discrete Fourier Series, the DFT-IDFT pair, DFT of common signals, bin width, sampling duration, and sample rate, the FFT, the Goertzel Algorithm, Linear, Periodic, and Circular convolution, DFT Leakage, and computation of the Inverse DFT). 
Volume IV, the culmination of the series, is an introductory treatment of LMS Adaptive Filtering and applications, and covers cost functions, performance surfaces, coefficient perturbation to estimate the gradient, the LMS algorithm, response of the LMS algorithm to narrow-band signals, and various topologies such as ANC (Active Noise Cancelling) or system modeling, Periodic Signal Removal/Prediction/Adaptive Line Enhancement (ALE), Interference Cancellation, Echo Cancellation (with single- and dual-H topologies), and Inverse Filtering/Deconvolution/Equalization. Table of Contents: Principles of FIR Design / FIR Design Techniques / Classical IIR Design View full abstract»
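
    As a rough illustration of the first technique listed for Volume III, the sketch below designs an FIR lowpass filter by windowing an ideal (sinc) impulse response. It is written in Python/NumPy rather than the MATLAB™/LabVIEW™ scripts supplied with the series, and the tap count, cutoff, and Hamming window are arbitrary choices, not the book's parameters.

```python
# Minimal sketch of FIR lowpass design by windowing an ideal (sinc) impulse
# response. Parameter values are illustrative only; this is not code from the
# book's scripts.
import numpy as np

def windowed_sinc_lowpass(num_taps, cutoff):
    """cutoff is the normalized cutoff frequency (0 < cutoff < 0.5, cycles/sample)."""
    n = np.arange(num_taps) - (num_taps - 1) / 2.0
    ideal = 2 * cutoff * np.sinc(2 * cutoff * n)   # ideal lowpass impulse response
    window = np.hamming(num_taps)                  # taper to control ripple
    h = ideal * window
    return h / np.sum(h)                           # normalize DC gain to 1

h = windowed_sinc_lowpass(num_taps=41, cutoff=0.1)
```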

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics

    Stephen Gedney
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics provides a comprehensive tutorial of the most widely used method for solving Maxwell's equations -- the Finite Difference Time-Domain Method. This book is an essential guide for students, researchers, and professional engineers who want to gain a fundamental knowledge of the FDTD method. It can accompany an undergraduate or entry-level graduate course or be used for self-study. The book provides all the background required to either research or apply the FDTD method for the solution of Maxwell's equations to practical problems in engineering and science. Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics guides the reader through the foundational theory of the FDTD method starting with the one-dimensional transmission-line problem and then progressing to the solution of Maxwell's equations in three dimensions. It also provides step-by-step guides to modeling physical sources, lumped-circuit components, absorbing boundary conditions, perfectly matched layer absorbers, and sub-cell structures. Post processing methods such as network parameter extraction and far-field transformations are also detailed. Efficient implementations of the FDTD method in a high level language are also provided. Table of Contents: Introduction / 1D FDTD Modeling of the Transmission Line Equations / Yee Algorithm for Maxwell's Equations / Source Excitations / Absorbing Boundary Conditions / The Perfectly Matched Layer (PML) Absorbing Medium / Subcell Modeling / Post Processing View full abstract»
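
    To give a flavor of the leapfrog updates the book builds up from the one-dimensional case, here is a bare-bones 1D FDTD sketch in Python with normalized units; the grid size, source, and boundary handling are illustrative simplifications, not the author's implementation.

```python
# A bare-bones 1D FDTD (Yee leapfrog) update loop in normalized units
# (epsilon = mu = 1), in the spirit of the book's opening 1D example.
import numpy as np

nz, nt = 200, 500
c, dz = 1.0, 1.0
dt = dz / c                      # "magic" time step for the 1D case (Courant number = 1)

ez = np.zeros(nz)                # electric field at integer grid points
hy = np.zeros(nz - 1)            # magnetic field at half grid points

for n in range(nt):
    hy += (dt / dz) * (ez[1:] - ez[:-1])           # update H from the spatial difference of E
    ez[1:-1] += (dt / dz) * (hy[1:] - hy[:-1])     # update E from the spatial difference of H
    ez[nz // 2] += np.exp(-((n - 30) ** 2) / 100)  # soft Gaussian source at the grid center
```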

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    GPU-Based Techniques for Global Illumination Effects

    Laszlo Szirmay-Kalos ; Laszlo Szecsi ; Mateu Sbert
    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. The book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make the book self-contained, the most important concepts of local illumination and global illumination rendering, graphics hardware, and Direct3D/HLSL programming are reviewed in the first chapters. After these introductory chapters we warm up with simple methods including shadow and environment mapping, then we move on toward advanced concepts aiming at global illumination rendering. Since it would have been impossible to give a rigorous review of all approaches proposed in this field, we go into the details of just a few methods solving each particular global illumination effect. However, a short discussion of the state of the art and links to the bibliography are also provided to refer the interested reader to techniques that are not detailed in this book. The implementation of the selected methods is also presented in HLSL, and we discuss their observed performance, merits, and disadvantages. In the last chapter, we also review how these techniques can be integrated in an advanced game engine and present case studies of their exploitation in games. Having gone through this book, the reader will have an overview of the state of the art, will be able to apply and improve these techniques, and most importantly, will be capable of developing brand new GPU algorithms. Table of Contents: Global Illumination Rendering / Local Illumination Rendering Pipeline of GPUs / Programming and Controlling GPUs / Simple Improvements of the Local Illumination Model / Ray Casting on the GPU / Specular Effects with Rasterization / Diffuse and Glossy Indirect Illumination / Pre-computation Aided Global Illumination / Participating Media Rendering / Fake Global Illumination / Postprocessing Effects / Integrating GI Effects in Games and Virtual Reality Systems / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Cardiac Tissue Engineering:Principles, Materials, and Applications

    Smadar Cohen ; Emil Ruvinov ; Yulia Sapir
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Cardiac tissue engineering aims at repairing damaged heart muscle and producing human cardiac tissues for application in drug toxicity studies. This book offers a comprehensive overview of the cardiac tissue engineering strategies, including presenting and discussing the various concepts in use, research directions and applications. Essential basic information on the major components in cardiac tissue engineering, namely cell sources and biomaterials, is first presented to the readers, followed by a detailed description of their implementation in different strategies, broadly divided into cellular and acellular ones. In cellular approaches, the biomaterials are used to increase cell retention after implantation or as scaffolds when bioengineering the cardiac patch, in vitro. In acellular approaches, the biomaterials are used as ECM replacement for damaged cardiac ECM after MI, or, in combination with growth factors, the biomaterials assume an additional function as a depot for prolonged factor activity for the effective recruitment of repairing cells. The book also presents technological innovations aimed to improve the quality of the cardiac patches, such as bioreactor applications, stimulation patterns and prevascularization. This book could be of interest not only from an educational perspective (i.e. for graduate students), but also for researchers and medical professionals, to offer them fresh views on novel and powerful treatment strategies. We hope that the reader will find a broad spectrum of ideas and possibilities described in this book both interesting and convincing. Table of Contents: Introduction / The Heart: Structure, Cardiovascular Diseases, and Regeneration / Cell Sources for Cardiac Tissue Engineering / Biomaterials: Polymers, Scaffolds, and Basic Design Criteria / Biomaterials as Vehicles for Stem Cell Delivery and Retention in the Infarct / Bioengineering of Cardiac Patches, In Vitro / Perfusion Bioreactors and Stimulation Patterns in Cardiac Tissue Engineering / Vascularization of Cardiac Patches / Acellular Biomaterials for Cardiac Repair / Biomaterial-based Controlled Delivery of Bioactive Molecules for Myocardial Regeneration View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Metasearch Engine Technology

    Weiyi Meng ; Clement Yu
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Among the search tools currently on the Web, search engines are the most well known thanks to the popularity of major search engines such as Google and Yahoo!. While extremely successful, these major search engines do have serious limitations. This book introduces large-scale metasearch engine technology, which has the potential to overcome the limitations of the major search engines. Essentially, a metasearch engine is a search system that supports unified access to multiple existing search engines by passing the queries it receives to its component search engines and aggregating the returned results into a single ranked list. A large-scale metasearch engine has thousands or more component search engines. While metasearch engines were initially motivated by their ability to combine the search coverage of multiple search engines, there are also other benefits such as the potential to obtain better and fresher results and to reach the Deep Web. The following major components of large-scale metasearch engines will be discussed in detail in this book: search engine selection, search engine incorporation, and result merging. Highly scalable and automated solutions for these components are emphasized. The authors make a strong case for the viability of the large-scale metasearch engine technology as a competitive technology for Web search. Table of Contents: Introduction / Metasearch Engine Architecture / Search Engine Selection / Search Engine Incorporation / Result Merging / Summary and Future Research View full abstract»
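
    The dispatch-and-merge pattern described above can be summarized in a few lines; the sketch below uses a naive score normalization as a placeholder for the search engine selection, incorporation, and result merging techniques the book actually develops, and the engine callables are hypothetical.

```python
# Conceptual sketch of the metasearch pattern: forward a query to several
# component engines and merge the returned lists into one ranking. The
# normalization here is a crude stand-in, not the book's algorithms.
from typing import Callable, Dict, List, Tuple

Result = Tuple[str, float]                      # (document URL, engine-specific score)
Engine = Callable[[str], List[Result]]

def metasearch(query: str, engines: Dict[str, Engine], top_k: int = 10) -> List[Result]:
    merged: Dict[str, float] = {}
    for name, search in engines.items():
        results = search(query)
        if not results:
            continue
        best = max(score for _, score in results) or 1.0
        for url, score in results:
            # normalize each engine's scores to [0, 1] and sum across engines
            merged[url] = merged.get(url, 0.0) + score / best
    return sorted(merged.items(), key=lambda item: item[1], reverse=True)[:top_k]
```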

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Practical Global Illumination with Irradiance Caching

    Jaroslav Krivanek ; Pascal Gautron
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Irradiance caching is a ray tracing-based technique for computing global illumination on diffuse surfaces. Specifically, it addresses the computation of indirect illumination bouncing off one diffuse object onto another. The sole purpose of irradiance caching is to make this computation reasonably fast. The main idea is to perform the indirect illumination sampling only at a selected set of locations in the scene, store the results in a cache, and reuse the cached value at other points through fast interpolation. This book is for anyone interested in making a production-ready implementation of irradiance caching that reliably renders artifact-free images. Since its invention 20 years ago, the irradiance caching algorithm has been successfully used to accelerate global illumination computation in the Radiance lighting simulation system. Its widespread use had to wait until computers became fast enough to consider global illumination in film production rendering. Since then, its use has become ubiquitous. Virtually all commercial and open-source rendering software base the global illumination computation upon irradiance caching. Although elegant and powerful, the algorithm in its basic form often fails to produce artifact-free images. Unfortunately, practical information on implementing the algorithm is scarce. The main objective of this book is to show the irradiance caching algorithm along with all the details and tricks upon which the success of its practical implementation is dependent. In addition, we discuss some extensions of the basic algorithm, such as a GPU implementation for interactive global illumination computation and temporal caching that exploits temporal coherence to suppress flickering in animations. Our goal is to show the material without being overly theoretical. However, the reader should have some basic understanding of rendering concepts, ray tracing in particular. Familiarity with global illumination is useful but not necessary to read this book. Table of Contents: Introduction to Ray Tracing and Global Illumination / Irradiance Caching Core / Practical Rendering with Irradiance Caching / Irradiance Caching in a Complete Global Illumination / Irradiance Caching on Graphics Hardware / Temporal Irradiance Caching View full abstract»
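
    The cache-and-interpolate idea can be conveyed with a toy sketch; the validity test and inverse-distance weighting below are simplifications standing in for the error-based weights and gradients of the actual irradiance caching algorithm.

```python
# Toy illustration of the core idea: reuse cached irradiance records within a
# validity radius instead of re-sampling indirect illumination everywhere.
import math

class IrradianceCache:
    def __init__(self):
        self.records = []                       # list of (position, irradiance, radius)

    def add(self, position, irradiance, radius):
        self.records.append((position, irradiance, radius))

    def lookup(self, p):
        """Return interpolated irradiance at p, or None if no record is usable."""
        weights, total = [], 0.0
        for pos, irr, radius in self.records:
            d = math.dist(p, pos)
            if d < radius:                      # record considered valid near its sample point
                w = 1.0 / (d + 1e-6)            # crude stand-in for the real weighting scheme
                weights.append((w, irr))
                total += w
        if not weights:
            return None                         # caller must sample indirect lighting here
        return sum(w * irr for w, irr in weights) / total
```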

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Science Fiction Prototyping:Designing the Future with Science Fiction

    Brian David Johnson
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Science fiction is the playground of the imagination. If you are interested in science or fascinated with the future then science fiction is where you explore new ideas and let your dreams and nightmares duke it out on the safety of the page or screen. But what if we could use science fiction to do more than that? What if we could use science fiction based on science fact to not only imagine our future but develop new technologies and products? What if we could use stories, movies and comics as a kind of tool to explore the real world implications and uses of future technologies today? Science Fiction Prototyping is a practical guide to using fiction as a way to imagine our future in a whole new way. Filled with history, real world examples and conversations with experts like best-selling science fiction author Cory Doctorow, Dark Horse Comics senior editor Chris Warner and Hollywood science expert Sidney Perkowitz, Science Fiction Prototyping will give you the tools you need to begin designing the future with science fiction. The future is Brian David Johnson’s business. As a futurist at Intel Corporation, his charter is to develop an actionable vision for computing in 2021. His work is called “future casting”—using ethnographic field studies, technology research, trend data, and even science fiction to create a pragmatic vision of consumers and computing. Johnson has been pioneering development in artificial intelligence, robotics, and reinventing TV. He speaks and writes extensively about future technologies in articles and scientific papers as well as science fiction short stories and novels (Fake Plastic Love and Screen Future: The Future of Entertainment, Computing and the Devices We Love). He has directed two feature films and is an illustrator and commissioned painter. Table of Contents: Preface / Foreword / Epilogue / Dedication / Acknowledgments / 1. The Future Is in Your Hands / 2. Religious Robots and Runaway Were-Tigers: Brief Overview of the Science and the Fiction that Went Into Two SF Prototypes / 3. How to Build Your Own SF Prototype in Five Steps or Less / 4. I, Robot: From Asimov to Doctorow: Exploring Short Fiction as an SF Prototype and a Conversation With Cory Doctorow / 5. The Men in the Moon: Exploring Movies as an SF Prototype and a Conversation with Sidney Perkowitz / 6. Science in the Gutters: Exploring Comics as an SF Prototype and a Conversation With Chris Warner / 7. Making the Future: Now that You Have Developed Your SF Prototype, What’s Next? / 8. Einstein’s Thought Experiments and Asimov’s Second Dream / Appendix A: The SF Prototypes / Notes / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Accurate Computation of Mathieu Functions

    Andrew Peterson ; Malcolm Bibby
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This lecture presents a modern approach for the computation of Mathieu functions. These functions find application in boundary value analysis such as electromagnetic scattering from elliptic cylinders and flat strips, as well as the analogous acoustic and optical problems, and many other applications in science and engineering. The authors review the traditional approach used for these functions, show its limitations, and provide an alternative "tuned" approach enabling improved accuracy and convergence. The performance of this approach is investigated for a wide range of parameters and machine precision. Examples from electromagnetic scattering are provided for illustration and to show the convergence of the typical series that employ Mathieu functions for boundary value analysis. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bionanotechnology

    Elisabeth S. Papazoglou ; Aravind Parthasarathy
    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book aims to provide vital information about the growing field of bionanotechnology for undergraduate and graduate students, as well as working professionals in various fields. The fundamentals of nanotechnology are covered along with several specific bionanotechnology applications, including nanobioimaging and drug delivery, which is a growing $100 billion industry. The uniqueness of the field has been brought out with unparalleled lucidity; a balance between important insight into the synthetic methods of preparing stable nano-structures and a focus driven by medical applications educates and informs the reader on the impact of this emerging field. Critical examination of potential threats followed by a current global outlook completes the discussion. In short, the book takes you through a journey from fundamentals to frontiers of bionanotechnology so that you can understand and make informed decisions on the impact of bionano on your career and business. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Power

    William Eccles
    Copyright Year: 2008

    Morgan and Claypool eBooks

    Pragmatic Power is focused on just three aspects of the AC electrical power system that supplies and moves the vast majority of electrical energy nearly everywhere in the world: three-phase power systems, transformers, and induction motors. The reader needs to have had an introduction to electrical circuits and AC power, although the text begins with a review of the basics of AC power. Balanced three-phase systems are studied by developing their single-phase equivalents. The study includes a look at how the cost of "power" is affected by reactive power and power factor. Transformers are considered as a circuit element in a power system, one that can be reasonably modeled to simplify system analysis. Induction motors are presented as the most common way to change electrical energy into rotational energy. Examples include the correct selection of an induction motor for a particular rotating load. All of these topics include completely worked examples to aid the reader in understanding how to apply what has been learned. This short lecture book will be of use to students at any level of engineering, not just electrical, because it is intended for the practicing engineer or scientist looking for a practical, applied introduction to AC power systems. The author's "pragmatic" and applied style gives a unique and helpful "nonidealistic, practical, and opinionated" introduction to the topic. Table of Contents: Three-Phase Power: 3 > 3 x 1 / Transformers: Edison Lost / Induction Motors: Just One Moving Part View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Representing and Reasoning with Qualitative Preferences:Tools and Applications

    Ganesh Ram Santhanam ; Samik Basu ; Vasant Honavar
    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book provides a tutorial introduction to modern techniques for representing and reasoning about qualitative preferences with respect to a set of alternatives. The syntax and semantics of several preference representation languages, including CP-nets, TCP-nets, CI-nets, and CP-theories, are reviewed. Some key problems in reasoning about preferences are introduced, including determining whether one alternative is preferred to another, or whether they are equivalent, with respect to a given set of preferences. These tasks can be reduced to model checking in temporal logic. Specifically, an induced preference graph that represents a given set of preferences can be efficiently encoded using a Kripke Structure for Computation Tree Logic (CTL). One can translate preference queries with respect to a set of preferences into an equivalent set of formulae in CTL, such that the CTL formula is satisfied whenever the preference query holds. This allows us to use a model checker to reason about preferences, i.e., answer preference queries, and to obtain a justification as to why a preference query is satisfied (or not) with respect to a set of preferences. This book defines the notions of the equivalence of two sets of preferences, including what it means for one set of preferences to subsume another, and shows how to answer preferential equivalence and subsumption queries using model checking. Furthermore, this book demonstrates how to generate alternatives ordered by preference, along with providing ways to deal with inconsistent preference specifications. A description of CRISNER, an open-source software implementation of the model checking approach to qualitative preference reasoning in CP-nets, TCP-nets, and CP-theories, is included, as well as examples illustrating its use. View full abstract»
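
    As a rough intuition for dominance queries over an induced preference graph, the sketch below answers "is a preferred to b?" by simple graph reachability on a hand-made set of improving flips; the book's actual approach encodes the graph as a Kripke structure and answers such queries by CTL model checking, which this toy does not attempt.

```python
# Toy sketch: alternative `better` dominates `worse` if `worse` can reach
# `better` through a chain of "improving" edges in the induced preference
# graph. The graph below is hand-made, not derived from CP-net semantics.
from collections import deque

def is_preferred(better, worse, improving_edges):
    """improving_edges maps an alternative to the alternatives one flip better."""
    seen, frontier = {worse}, deque([worse])
    while frontier:
        current = frontier.popleft()
        if current == better:
            return True
        for nxt in improving_edges.get(current, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

edges = {"b": ["c"], "c": ["a"]}      # b -> c -> a: each step is an improving flip
print(is_preferred("a", "b", edges))  # True
```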

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Control Grid Motion Estimation for Efficient Application of Optical Flow

    Christine M. Zwart ; David Frakes
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Motion estimation is a long-standing cornerstone of image and video processing. Most notably, motion estimation serves as the foundation for many of today's ubiquitous video coding standards including H.264. Motion estimators also play key roles in countless other applications that serve the consumer, industrial, biomedical, and military sectors. Of the many available motion estimation techniques, optical flow is widely regarded as most flexible. The flexibility offered by optical flow is particularly useful for complex registration and interpolation problems, but comes at a considerable computational expense. As the volume and dimensionality of data that motion estimators are applied to continue to grow, that expense becomes more and more costly. Control grid motion estimators based on optical flow can accomplish motion estimation with flexibility similar to pure optical flow, but at a fraction of the computational expense. Control grid methods also offer the added benefit of representing motion far more compactly than pure optical flow. This booklet explores control grid motion estimation and provides implementations of the approach that apply to data of multiple dimensionalities. Important current applications of control grid methods including registration and interpolation are also developed. Table of Contents: Introduction / Control Grid Interpolation (CGI) / Application of CGI to Registration Problems / Application of CGI to Interpolation Problems / Discussion and Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Path Problems in Networks

    John Baras ; George Theodorakopoulos
    Copyright Year: 2010

    Morgan and Claypool eBooks

    The algebraic path problem is a generalization of the shortest path problem in graphs. Various instances of this abstract problem have appeared in the literature, and similar solutions have been independently discovered and rediscovered. The repeated appearance of a problem is evidence of its relevance. This book aims to help current and future researchers add this powerful tool to their arsenal, so that they can easily identify and use it in their own work. Path problems in networks can be conceptually divided into two parts: A distillation of the extensive theory behind the algebraic path problem, and an exposition of a broad range of applications. First of all, the shortest path problem is presented so as to fix terminology and concepts: existence and uniqueness of solutions, robustness to parameter changes, and centralized and distributed computation algorithms. Then, these concepts are generalized to the algebraic context of semirings. Methods for creating new semirings, useful for modeling new problems, are provided. A large part of the book is then devoted to numerous applications of the algebraic path problem, ranging from mobile network routing to BGP routing to social networks. These applications show what kind of problems can be modeled as algebraic path problems; they also serve as examples on how to go about modeling new problems. This monograph will be useful to network researchers, engineers, and graduate students. It can be used either as an introduction to the topic, or as a quick reference to the theoretical facts, algorithms, and application examples. The theoretical background assumed for the reader is that of a graduate or advanced undergraduate student in computer science or engineering. Some familiarity with algebra and algorithms is helpful, but not necessary. Algebra, in particular, is used as a convenient and concise language to describe problems that are essentially combinatorial. Table of Contents: Classical Shortest Path / The Algebraic Path Problem / Properties and Computation of Solutions / Applications / Related Areas / List of Semirings and Applications View full abstract»
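
    The generalization from shortest paths to semirings can be illustrated with a small sketch in which the same all-pairs iteration is reused with different "plus" (choice) and "times" (path extension) operations; the two semirings shown are standard textbook examples, not excerpts from the book.

```python
# Sketch of the semiring generalization: a Floyd-Warshall-style iteration that
# works for any semiring once plus/times are supplied.
import math

def algebraic_paths(weights, plus, times):
    """Iterate over an n x n matrix of edge weights; diagonal holds the times-identity."""
    n = len(weights)
    d = [row[:] for row in weights]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                d[i][j] = plus(d[i][j], times(d[i][k], d[k][j]))
    return d

# Shortest paths: (min, +) semiring with +inf meaning "no edge".
shortest = lambda w: algebraic_paths(w, plus=min, times=lambda a, b: a + b)
# Widest (maximum-bottleneck) paths: (max, min) semiring with 0 meaning "no edge".
widest = lambda w: algebraic_paths(w, plus=max, times=min)

inf = math.inf
w = [[0, 3, inf],
     [inf, 0, 4],
     [inf, inf, 0]]
print(shortest(w))   # entry [0][2] becomes 7 (path 0 -> 1 -> 2)
```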

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Datacenter as a Computer:An Introduction to the Design of Warehouse-Scale Machines

    Luis Andre Barroso ; Jimmy Clidaras ; Urs Hoelzle
    Copyright Year: 2013

    Morgan and Claypool eBooks

    After nearly four years of substantial academic and industrial developments in warehouse-scale computing, we are delighted to present our first major update to this lecture. The increased popularity of public clouds has made WSC software techniques relevant to a larger pool of programmers since our first edition. Therefore, we expanded Chapter 2 to reflect our better understanding of WSC software systems and the toolbox of software techniques for WSC programming. In Chapter 3, we added to our coverage of the evolving landscape of wimpy vs. brawny server trade-offs, and we now present an overview of WSC interconnects and storage systems that was promised but lacking in the original edition. Thanks largely to the help of our new co-author, Google Distinguished Engineer Jimmy Clidaras, the material on facility mechanical and power distribution design has been updated and greatly extended (see Chapters 4 and 5). Chapters 6 and 7 have also been revamped significantly. We hope this revised edition continues to meet the needs of educators and professionals in this area. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Discriminative Learning for Speech Recognition

    Xiaodong He ; Li Deng
    Copyright Year: 2008

    Morgan and Claypool eBooks

    In this book, we introduce the background and mainstream methods of probabilistic modeling and discriminative parameter optimization for speech recognition. The specific models treated in depth include the widely used exponential-family distributions and the hidden Markov model. A detailed study is presented on unifying the common objective functions for discriminative learning in speech recognition, namely maximum mutual information (MMI), minimum classification error, and minimum phone/word error. The unification is presented, with rigorous mathematical analysis, in a common rational-function form. This common form enables the use of the growth transformation (or extended Baum–Welch) optimization framework in discriminative learning of model parameters. In addition to all the necessary introduction of the background and tutorial material on the subject, we also included technical details on the derivation of the parameter optimization formulas for exponential-family distributions, discrete hidden Markov models (HMMs), and continuous-density HMMs in discriminative learning. Selected experimental results obtained firsthand by the authors are presented to show that discriminative learning can lead to superior speech recognition performance over conventional parameter learning. Details on major algorithmic implementation issues with practical significance are provided to enable practitioners to directly carry the theory from the earlier part of the book into engineering practice. Table of Contents: Introduction and Background / Statistical Speech Recognition: A Tutorial / Discriminative Learning: A Unified Objective Function / Discriminative Learning Algorithm for Exponential-Family Distributions / Discriminative Learning Algorithm for Hidden Markov Model / Practical Implementation of Discriminative Learning / Selected Experimental Results / Epilogue / Major Symbols Used in the Book and Their Descriptions / Mathematical Notation / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Research Infrastructures for Hardware Accelerators

    Yakun Sophia Shao ; David Brooks
    Copyright Year: 2015

    Morgan and Claypool eBooks

    Hardware acceleration in the form of customized datapath and control circuitry tuned to specific applications has gained popularity for its promise to utilize transistors more efficiently. Historically, the computer architecture community has focused on general-purpose processors, and extensive research infrastructure has been developed to support research efforts in this domain. Envisioning future computing systems with a diverse set of general-purpose cores and accelerators, computer architects must add accelerator-related research infrastructures to their toolboxes to explore future heterogeneous systems. This book serves as a primer for the field, as an overview of the vast literature on accelerator architectures and their design flows, and as a resource guidebook for researchers working in related areas. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and Sustainable Community Development

    Juan Lucena ; Jen Schneider ; Jon A. Leydens
    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book, Engineering and Sustainable Community Development, presents an overview of engineering as it relates to humanitarian engineering, service learning engineering, or engineering for community development, often called sustainable community development (SCD). The topics covered include a history of engineers and development, the problems of using industry-based practices when designing for communities, how engineers can prepare to work with communities, and listening in community development. It also includes two case studies -- one of engineers developing a windmill for a community in India, and a second of an engineer "mapping communities" in Honduras to empower people to use water effectively -- and student perspectives and experiences on one curricular model dealing with community development. Table of Contents: Introduction / Engineers and Development: From Empires to Sustainable Development / Why Design for Industry Will Not Work as Design for Community / Engineering with Community / Listening to Community / ESCD Case Study 1: Sika Dhari's Windmill / ESCD Case Study 2: Building Organizations and Mapping Communities in Honduras / Students' Perspectives on ESCD: A Course Model / Beyond Engineers and Community: A Path Forward View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing for User Engagement:Aesthetic and Attractive User Interfaces

    Alistair Sutcliffe
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work oriented applications such as games, education and emerging interactive Web 2.0. The chapter starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion and mood. The perspective of aesthetics is expanded towards interaction and engagement to propose design treatments, metaphors, and interactive techniques which can promote user interest, excitement and satisfying experiences. This is followed by reviewing the design process and design treatments which can promote aesthetic perception and engaging interaction. The final part of the chapter provides design guidelines and principles drawn from the interaction and graphical design literature which are cross-referenced to issues in the design process. Examples of designs and design treatments are given to illustrate principles and advice, accompanied by critical reflection. Table of Contents: Introduction / Psychology of User Engagement / UE Design Process / Design Principles and Guidelines / Perspectives and Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computer-aided Detection of Architectural Distortion in Prior Mammograms of Interval Cancer

    Shantanu Banik ; Rangaraj Rangayyan ; J. E. Leo Desautels
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Architectural distortion is an important and early sign of breast cancer, but because of its subtlety, it is a common cause of false-negative findings on screening mammograms. Screening mammograms obtained prior to the detection of cancer could contain subtle signs of early stages of breast cancer, in particular, architectural distortion. This book presents image processing and pattern recognition techniques to detect architectural distortion in prior mammograms of interval-cancer cases. The methods are based upon Gabor filters, phase portrait analysis, procedures for the analysis of the angular spread of power, fractal analysis, Laws' texture energy measures derived from geometrically transformed regions of interest (ROIs), and Haralick's texture features. With Gabor filters and phase-portrait analysis, 4,224 ROIs were automatically obtained from 106 prior mammograms of 56 interval-cancer cases, including 301 true-positive ROIs related to architectural distortion, and from 52 mammograms of 13 normal cases. For each ROI, the fractal dimension, the entropy of the angular spread of power, 10 Laws' texture energy measures, and Haralick's 14 texture features were computed. The areas under the receiver operating characteristic (ROC) curves obtained using the features selected by stepwise logistic regression and the leave-one-image-out method are 0.77 with the Bayesian classifier, 0.76 with Fisher linear discriminant analysis, and 0.79 with a neural network classifier. Free-response ROC analysis indicated sensitivities of 0.80 and 0.90 at 5.7 and 8.8 false positives (FPs) per image, respectively, with the Bayesian classifier and the leave-one-image-out method. The present study has demonstrated the ability to detect early signs of breast cancer 15 months ahead of the time of clinical diagnosis, on the average, for interval-cancer cases, with a sensitivity of 0.8 at 5.7 FP/image. The presented computer-aided detection techniques, dedicated to accurate detection and localization of architectural distortion, could lead to efficient detection of early and subtle signs of breast cancer at pre-mass-formation stages. Table of Contents: Introduction / Detection of Early Signs of Breast Cancer / Detection and Analysis of Oriented Patterns / Detection of Potential Sites of Architectural Distortion / Experimental Set Up and Datasets / Feature Selection and Pattern Classification / Analysis of Oriented Patterns Related to Architectural Distortion / Detection of Architectural Distortion in Prior Mammograms / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Strategic Health Technology Incorporation

    Binseng Wang
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Technology is essential to the delivery of health care but it is still only a tool that needs to be deployed wisely to ensure beneficial outcomes at reasonable costs. Among various categories of health technology, medical equipment has the unique distinction of requiring both high initial investments and costly maintenance during its entire useful life. This characteristic does not, however, imply that medical equipment is more costly than other categories, provided that it is managed properly. The foundation of a sound technology management process is the planning and acquisition of equipment, collectively called technology incorporation. This lecture presents a rational, strategic process for technology incorporation based on experience, some successful and many unsuccessful, accumulated in industrialized and developing countries over the last three decades. The planning step is focused on establishing a Technology Incorporation Plan (TIP) using data collected from an audit of existing technology, evaluating needs, impacts, costs, and benefits, and consolidating the information collected for decision making. The acquisition step implements TIP by selecting equipment based on technical, regulatory, financial, and supplier considerations, and procuring it using one of the multiple forms of purchasing or agreements with suppliers. This incorporation process is generic enough to be used, with suitable adaptations, for a wide variety of health organizations with different sizes and acuity levels, ranging from health clinics to community hospitals to major teaching hospitals and even to entire health systems. Such a broadly applicable process is possible because it is based on a conceptual framework composed of in-depth analysis of the basic principles that govern each stage of technology lifecycle. Using this incorporation process, successful TIPs have been created and implemented, thereby contributing to the improvement of healthcare services and limiting the associated expenses. Table of Contents: Introduction / Conceptual Framework / The Incorporation Process / Discussion / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Understanding Atrial Fibrillation:The Signal Processing Contribution, Part II

    Luca Mainardi ; Leif Sornmo ; Sergio Cerutti
    Copyright Year: 2008

    Morgan and Claypool eBooks

    The book presents recent advances in signal processing techniques for modeling, analysis, and understanding of the heart's electrical activity during atrial fibrillation. This arrhythmia is the most commonly encountered in clinical practice and its complex and metamorphic nature represents a challenging problem for clinicians, engineers, and scientists. Research on atrial fibrillation has stimulated the development of a wide range of signal processing tools to better understand the mechanisms ruling its initiation, maintenance, and termination. This book provides undergraduate and graduate students, as well as researchers and practicing engineers, with an overview of techniques, including time domain techniques for atrial wave extraction, time-frequency analysis for exploring wave dynamics, and nonlinear techniques to characterize the ventricular response and the organization of atrial activity. The book includes an introductory chapter about atrial fibrillation and its mechanisms, treatment, and management. The successive chapters are dedicated to the analysis of atrial signals recorded on the body surface and to the quantification of ventricular response. The rest of the book explores techniques to characterize endo- and epicardial recordings and to model atrial conduction. Although the book may appear to be monothematic, the reader will not only recognize common problems of biomedical signal processing but also discover that the analysis of atrial fibrillation is a unique challenge for developing and testing novel signal processing tools. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Basic Simulation Models of Phase Tracking Devices Using MATLAB

    William Tranter ; Ratchaneekorn Thamvichai ; Tamal Bose
    Copyright Year: 2010

    Morgan and Claypool eBooks

    The Phase-Locked Loop (PLL), and many of the devices used for frequency and phase tracking, carrier and symbol synchronization, demodulation, and frequency synthesis, are fundamental building blocks in today's complex communications systems. It is therefore essential for both students and practicing communications engineers interested in the design and implementation of modern communication systems to understand and have insight into the behavior of these important and ubiquitous devices. Since the PLL behaves as a nonlinear device (at least during acquisition), computer simulation can be used to great advantage in gaining insight into the behavior of the PLL and the devices derived from the PLL. The purpose of this Synthesis Lecture is to provide basic theoretical analyses of the PLL and devices derived from the PLL and simulation models suitable for supplementing undergraduate and graduate courses in communications. The Synthesis Lecture is also suitable for self-study by practicing engineers. A significant component of this book is a set of basic MATLAB-based simulations that illustrate the operating characteristics of PLL-based devices and enable the reader to investigate the impact of varying system parameters. Rather than providing a comprehensive treatment of the underlying theory of phase-locked loops, theoretical analyses are provided in sufficient detail in order to explain how simulations are developed. The references point to sources currently available that treat this subject in considerable technical depth and are suitable for additional study. Download MATLAB codes (.zip) Table of Contents: Introduction / Basic PLL Theory / Structures Developed From The Basic PLL / Simulation Models / MATLAB Simulations / Noise Performance Analysis View full abstract»
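
    In the same spirit as the lecture's simulation models, the sketch below steps a simple discrete-time PLL with a sinusoidal phase detector and a proportional-plus-integral loop filter; it is written in Python/NumPy rather than MATLAB, and the gains, sample rate, and frequency step are arbitrary illustrative values, not the book's parameters.

```python
# Minimal discrete-time PLL tracking a small frequency step (illustrative only).
import numpy as np

fs = 10_000.0                      # sample rate, Hz
n = np.arange(20_000)
f_in = 100.0 + 2.0 * (n > 5_000)   # input frequency steps from 100 Hz to 102 Hz
phase_in = 2 * np.pi * np.cumsum(f_in) / fs

kp, ki = 0.2, 0.01                 # proportional/integral loop-filter gains (assumed)
theta, integ = 0.0, 0.0
vco_freq = 2 * np.pi * 100.0 / fs  # VCO center frequency in radians per sample
phase_err = np.zeros_like(phase_in)

for k in range(len(n)):
    err = np.sin(phase_in[k] - theta)        # sinusoidal phase detector output
    phase_err[k] = err
    integ += ki * err                        # integral branch of the loop filter
    theta += vco_freq + kp * err + integ     # advance the VCO phase

# After the step at sample 5000, phase_err transiently grows and then settles
# back toward zero as the loop re-acquires the new frequency.
```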

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Neural Interfacing:Forging the Human-Machine Connection

    Thomas D. Coates
    Copyright Year: 2008

    Morgan and Claypool eBooks

    In the past 50 years there has been an explosion of interest in the development of technologies whose end goal is to connect the human brain and/or nervous system directly to computers. Once the subject of science fiction, the technologies necessary to accomplish this goal are rapidly becoming reality. In laboratories around the globe, research is being undertaken to restore function to the physically disabled, to replace areas of the brain damaged by disease or trauma and to augment human abilities. Building neural interfaces and neuro-prosthetics relies on a diverse array of disciplines such as neuroscience, engineering, medicine and microfabrication just to name a few. This book presents a short history of neural interfacing (N.I.) research and introduces the reader to some of the current efforts to develop neural prostheses. The book is intended as an introduction for the college freshman or others wishing to learn more about the field. A resource guide is included for students along with a list of laboratories conducting N.I. research and universities with N.I. related tracks of study. Table of Contents: Neural Interfaces Past and Present / Current Neuroprosthesis Research / Conclusion / Resources for Students View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Zero Effort Technologies: Considerations, Challenges, and Use in Health, Wellness, and Rehabilitation

    Alex Mihailidis ; Jennifer Boger ; Jesse Hoey ; Tizneem Jiancaro
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book introduces zero-effort technologies (ZETs), an emerging class of technology that requires little or no effort from the people who use it. ZETs use advanced techniques, such as computer vision, sensor fusion, decision-making and planning, and machine learning to autonomously operate through the collection, analysis, and application of data about the user and his/her context. This book gives an overview of ZETs, presents concepts in the development of pervasive intelligent technologies and environments for health and rehabilitation, along with an in-depth discussion of the design principles that this approach entails. The book concludes with a discussion of specific ZETs that have applied these design principles with the goal of ensuring the safety and well-being of the people who use them, such as older adults with dementia and provides thoughts regarding future directions of the field. Table of Contents: Lecture Overview / Introduction to Zero Effort Technologies / Designing ZETs / Building and Evaluating ZETs / Examples of ZETs / Conclusions and Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Aspects of Differential Geometry I

    Peter Gilkey ; JeongHyeong Park ; Ramon Vazquez-Lorenzo
    Copyright Year: 2015

    Morgan and Claypool eBooks

    Differential Geometry is a wide field. We have chosen to concentrate upon certain aspects that are appropriate for an introduction to the subject; we have not attempted an encyclopedic treatment. In Book I, we focus on preliminaries. Chapter 1 provides an introduction to multivariable calculus and treats the Inverse Function Theorem, Implicit Function Theorem, the theory of the Riemann Integral, and the Change of Variable Theorem. Chapter 2 treats smooth manifolds, the tangent and cotangent bundles, and Stokes' Theorem. Chapter 3 is an introduction to Riemannian geometry. The Levi-Civita connection is presented, geodesics introduced, the Jacobi operator is discussed, and the Gauss-Bonnet Theorem is proved. The material is appropriate for an undergraduate course in the subject. We have given some different proofs than those that are classically given and there is some new material in these volumes. For example, the treatment of the Chern-Gauss-Bonnet Theorem for pseudo-Riemannian manifolds with boundary is new. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Biomedical Engineering:Biomechanics and Bioelectricity

    Douglas Christensen
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Intended as an introduction to the field of biomedical engineering, this book covers the topics of biomechanics (Part I) and bioelectricity (Part II). Each chapter emphasizes a fundamental principle or law, such as Darcy's Law, Poiseuille's Law, Hooke's Law, Starling's Law, levers, and work in the area of fluid, solid, and cardiovascular biomechanics. In addition, electrical laws and analysis tools are introduced, including Ohm's Law, Kirchhoff's Laws, Coulomb's Law, capacitors, and the fluid/electrical analogy. Culminating the electrical portion are chapters covering Nernst and membrane potentials and Fourier transforms. Examples are solved throughout the book and problems with answers are given at the end of each chapter. A semester-long Major Project that models the human systemic cardiovascular system, utilizing both a Matlab numerical simulation and an electrical analog circuit, ties many of the book's concepts together. Table of Contents: Ohm's Law: Current, Voltage and Resistance / Kirchhoff's Voltage and Current Laws: Circuit Analysis / Operational Amplifiers / Coulomb's Law, Capacitors and the Fluid/Electrical Analogy / Series and Parallel Combinations / Thevenin Equivalent Circuits / Nernst Potential: Cell Membrane Equivalent Circuit / Fourier Transforms: Alternating Currents (AC) View full abstract»
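
    As a small, hedged taste of the bioelectricity material, the Python sketch below evaluates the Nernst equation for a potassium gradient; the physical constants are standard, but the ion concentrations are generic textbook-style values rather than numbers from this book.

        import math

        R = 8.314      # gas constant, J/(mol*K)
        F = 96485.0    # Faraday constant, C/mol
        T = 310.0      # body temperature, K

        def nernst(c_out_mM, c_in_mM, z=1):
            """Equilibrium (Nernst) membrane potential in volts for one ion species."""
            return (R * T) / (z * F) * math.log(c_out_mM / c_in_mM)

        # Illustrative potassium concentrations: about 5 mM outside and 140 mM inside a cell.
        e_k = nernst(5.0, 140.0, z=1)
        print(f"K+ Nernst potential: {e_k * 1000:.1f} mV")   # roughly -89 mV at 37 degC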

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Zhaojun Liu ; Tongde Huang ; Qiang Li ; Xing Lu ; Xinbo Zou
    Copyright Year: 2016

    Morgan and Claypool eBooks

    Ever since its invention in the 1980s, the compound semiconductor heterojunction-based high electron mobility transistor (HEMT) has been widely used in radio frequency (RF) applications. This book provides readers with broad coverage on techniques and new trends of HEMT, employing leading compound semiconductors, III-N and III-V materials. The content includes an overview of GaN HEMT device-scaling technologies and experimental research breakthroughs in fabricating various GaN MOSHEMT transistors. Readers are offered an inspiring example of monolithic integration of HEMT with LEDs, too. The authors compile the most relevant aspects of III-V HEMT, including the current status of state-of-the-art HEMTs, their possibility of replacing the Si CMOS transistor channel, and growth opportunities of III-V materials on an Si substrate. With detailed exploration and explanations, the book is a helpful source suitable for anyone learning about and working on compound semiconductor devices. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Embedded Systems: Using ANSI C and the Arduino Development Environment

    David Russell
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many electrical and computer engineering projects involve some kind of embedded system in which a microcontroller sits at the center as the primary source of control. The recently developed Arduino development platform includes an inexpensive hardware development board hosting an eight-bit ATMEL ATmega-family processor and a Java-based software development environment. These features allow an embedded systems beginner to focus on learning how to write embedded software instead of wasting time overcoming the learning curve of engineering CAD tools. The goal of this text is to introduce fundamental methods for creating embedded software in general, with a focus on ANSI C. The Arduino development platform provides a great means for accomplishing this task. As such, this work presents embedded software development using 100% ANSI C for the Arduino's ATmega328P processor. We deviate from using the Arduino-specific Wiring libraries in an attempt to provide the most general embedded methods. In this way, the reader will acquire essential knowledge necessary for work on future projects involving other processors. Particular attention is paid to the notorious issue of using C pointers in order to gain direct access to microprocessor registers, which ultimately allow control over all peripheral interfacing. Table of Contents: Introduction / ANSI C / Introduction to Arduino / Embedded Debugging / ATmega328P Architecture / General-Purpose Input/Output / Timer Ports / Analog Input Ports / Interrupt Processing / Serial Communications / Assembly Language / Non-volatile Memory View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tensor Properties of Solids:Part Two: Transport Properties of Solids

    Richard F. Tinder
    Copyright Year: 2007

    Morgan and Claypool eBooks

    Tensor Properties of Solids presents the phenomenological development of solid state properties represented as matter tensors in two parts: Part I on equilibrium tensor properties and Part II on transport tensor properties. Part I begins with an introduction to tensor notation, transformations, algebra, and calculus together with the matrix representations. Crystallography, as it relates to tensor properties of crystals, completes the background treatment. A generalized treatment of solid-state equilibrium thermodynamics leads to the systematic correlation of equilibrium tensor properties. This is followed by developments covering first-, second-, third-, and higher-order tensor effects. Included are the generalized compliance and rigidity matrices for first-order tensor properties, Maxwell relations, effect of measurement conditions, and the dependent coupled effects and use of interaction diagrams. Part I concludes with the second- and higher-order effects, including numerous optical tensor properties. Part II presents the driving forces and fluxes for the well-known proper conductivities. An introduction to irreversible thermodynamics includes the concepts of microscopic reversibility, Onsager's reciprocity principle, entropy density production, and the proper choice of the transport parameters. This is followed by the force-flux equations for electronic charge and heat flow and the relationships between the proper conductivities and phenomenological coefficients. The thermoelectric effects in solids are discussed and extended to the piezothermoelectric and piezoresistance tensor effects. The subjects of thermomagnetic, galvanomagnetic, and thermogalvanomagnetic effects are developed together with other higher-order magnetotransport property tensors. A glossary of terms, expressions, and symbols is provided at the end of the text, and end-of-chapter problems are provided on request. Endnotes provide the necessary references for further reading. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Merging Languages and Engineering:Partnering Across the Disciplines

    John Grandin
    Copyright Year: 2013

    Morgan and Claypool eBooks

    At the University of Rhode Island over 25% of engineering undergraduates simultaneously complete a second degree in German, French, Spanish, or Chinese. They furthermore spend an entire year abroad, one semester as exchange students at a partner university and six months as professional engineering interns at a cooperating company. With a close-to 100% placement rate, over 400 graduates, and numerous national awards, the URI International Engineering Program (IEP) is a proven path of preparation for young engineers in today's global workplace. The author of this volume, John Grandin, is an emeritus professor of German who developed and led the IEP for twenty-three years. In these pages, he provides a two-pronged approach to explain the origin and history of this program rooted in such an unusual merger of two traditionally distinct higher education disciplines. He looks first at himself to explain how and why he became an international educator and what led him to his lasting passion for the IEP. He then provides an historical overview of the program's origin and growth, including looks at the bumps and bruises and ups and downs along the way. Grandin hopes that this story will be of use and value to other educators determined to reform higher education and align it with the needs of the 21st Century. Table of Contents: How I became a Professor of German / My Unexpected Path to Engineering / Building a Network of Support / Sidetracked by a Stint in the Dean's Office / Reshaping the Language Mission / Struggling to Institutionalize / Partnering with Universities Abroad / Going into the Hotel and Restaurant Business / Taking the Lead Nationally / Building the Chinese IEP / Staying Involved after Retirement / The Broader Message for Higher Education / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital System Verification:A Combined Formal Methods and Simulation Framework

    Lun Li ; Mitchel Thornton
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 50 years of development, simulation methods have reached a degree of maturity; however, new advances continue to be made in the area. A simulation approach for functional verification can theoretically validate all possible behaviors of a design but requires excessive computational resources. Rapidly evolving markets demand short design cycles while the increasing complexity of a design causes simulation approaches to provide less and less coverage. Formal verification is an attractive alternative since 100% coverage can be achieved; however, large designs impose unrealistic computational requirements. Combining formal verification and simulation into a single integrated circuit validation framework is an attractive alternative. This book focuses on an Integrated Design Validation (IDV) system that provides a framework for design validation and takes advantage of current technology in the areas of simulation and formal verification resulting in a practical validation engine with reasonable runtime. After surveying the basic principles of formal verification and simulation, this book describes the IDV approach to integrated circuit functional validation. Table of Contents: Introduction / Formal Methods Background / Simulation Approaches / Integrated Design Validation System / Conclusion and Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Community Detection and Mining in Social Media

    Lei Tang ; Huan Liu
    Copyright Year: 2010

    Morgan and Claypool eBooks

    The past decade has witnessed the emergence of participatory Web and social media, bringing people together in many creative ways. Millions of users are playing, tagging, working, and socializing online, demonstrating new forms of collaboration, communication, and intelligence that were hardly imaginable just a short time ago. Social media also helps reshape business models, sway opinions and emotions, and open up numerous possibilities to study human interaction and collective behavior at an unparalleled scale. This lecture, from a data mining perspective, introduces characteristics of social media, reviews representative tasks of computing with social media, and illustrates associated challenges. It introduces basic concepts, presents state-of-the-art algorithms with easy-to-understand examples, and recommends effective evaluation methods. In particular, we discuss graph-based community detection techniques and many important extensions that handle dynamic, heterogeneous networks in social media. We also demonstrate how discovered patterns of communities can be used for social media mining. The concepts, algorithms, and methods presented in this lecture can help harness the power of social media and support building socially-intelligent systems. This book is an accessible introduction to the study of community detection and mining in social media. It is essential reading for students, researchers, and practitioners in disciplines and applications where social media is a key source of data that piques our curiosity to understand, manage, innovate, and excel. This book is supported by additional materials, including lecture slides, the complete set of figures, key references, some toy data sets used in the book, and the source code of representative algorithms. The readers are encouraged to visit the book website for the latest information. Table of Contents: Social Media and Social Computing / Nodes, Ties, and Influence / Community Detection and Evaluation / Communities in Heterogeneous Networks / Social Media Mining View full abstract»
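
    As a hedged illustration of the kind of graph-based technique the lecture surveys, the Python sketch below builds a tiny friendship graph and extracts communities by greedy modularity maximization with NetworkX; the toy edges are invented for the example, and this is only one of many detection methods the book discusses.

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        # Toy friendship graph: two tight circles joined by a single bridging edge.
        G = nx.Graph()
        G.add_edges_from([
            ("ann", "bob"), ("bob", "cat"), ("cat", "ann"),   # circle 1
            ("dan", "eve"), ("eve", "fay"), ("fay", "dan"),   # circle 2
            ("cat", "dan"),                                    # weak tie between the circles
        ])

        # Greedy modularity maximization, one classic graph-based community detector.
        for i, members in enumerate(greedy_modularity_communities(G), start=1):
            print(f"community {i}: {sorted(members)}")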

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Circuit Simulation using Multisim Workbench

    David Baez-Lopez ; Felix E. Guerrero-Castro ; Ofelia Delfina Cervantes-Villagomez
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Multisim is now the de facto standard for circuit simulation. It is a SPICE-based circuit simulator which combines analog, discrete-time, and mixed-mode circuits. In addition, it is the only simulator which incorporates microcontroller simulation in the same environment. It also includes a tool for printed circuit board design. Advanced Circuit Simulation Using Multisim Workbench is a companion book to Circuit Analysis Using Multisim, published by Morgan & Claypool in 2011. This new book covers advanced analyses and the creation of models and subcircuits. It also includes coverage of transmission lines, the special elements which are used to connect components in PCBs and integrated circuits. Finally, it includes a description of Ultiboard, the tool for PCB creation from a circuit description in Multisim. Both books completely cover most of the important features available for a successful circuit simulation with Multisim. Table of Contents: Models and Subcircuits / Transmission Line / Other Types of Analyses / Simulating Microcontrollers / PCB Design With Ultiboard View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multi-Core Cache Hierarchies

    Rajeev Balasubramonian ; Norman P. Jouppi ; Naveen Muralimanohar
    Copyright Year: 2011

    Morgan and Claypool eBooks

    A key determinant of overall system performance and power dissipation is the cache hierarchy, since access to off-chip memory consumes many more cycles and energy than on-chip accesses. In addition, multi-core processors are expected to place ever higher bandwidth demands on the memory system. All these issues make it important to avoid off-chip memory access by improving the efficiency of the on-chip cache. Future multi-core processors will have many large cache banks connected by a network and shared by many cores. Hence, many important problems must be solved: cache resources must be allocated across many cores, data must be placed in cache banks that are near the accessing core, and the most important data must be identified for retention. Finally, difficulties in scaling existing technologies require adapting to and exploiting new technology constraints. The book attempts a synthesis of recent cache research that has focused on innovations for multi-core processors. It is an excellent starting point for early-stage graduate students, researchers, and practitioners who wish to understand the landscape of recent cache research. The book is suitable as a reference for advanced computer architecture classes as well as for experienced researchers and VLSI engineers. Table of Contents: Basic Elements of Large Cache Design / Organizing Data in CMP Last Level Caches / Policies Impacting Cache Hit Rates / Interconnection Networks within Large Caches / Technology / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Light Field Sampling

    Cha Zhang ; Tsuhan Chen
    Copyright Year: 2006

    Morgan and Claypool eBooks

    Light field is one of the most representative image-based rendering techniques that generate novel virtual views from images instead of 3D models. The light field capture and rendering process can be considered as a procedure of sampling the light rays in the space and interpolating those in novel views. As a result, light field can be studied as a high-dimensional signal sampling problem, which has attracted a lot of research interest and become a convergence point between computer graphics and signal processing, and even computer vision. This lecture focuses on answering two questions regarding light field sampling, namely how many images are needed for a light field, and, if such a number is limited, where we should capture them. The book can be divided into three parts. First, we give a complete analysis on uniform sampling of IBR data. By introducing the surface plenoptic function, we are able to analyze the Fourier spectrum of non-Lambertian and occluded scenes. Given the spectrum, we also apply the generalized sampling theorem on the IBR data, which results in better rendering quality than rectangular sampling for complex scenes. Such uniform sampling analysis provides general guidelines on how the images in IBR should be taken. For instance, it shows that non-Lambertian and occluded scenes often require a higher sampling rate. Next, we describe a very general sampling framework named freeform sampling. Freeform sampling handles three kinds of problems: sample reduction, minimum sampling rate to meet an error requirement, and minimization of reconstruction error given a fixed number of samples. When the to-be-reconstructed function values are unknown, freeform sampling becomes active sampling. Algorithms of active sampling are developed for light field and show better results than the traditional uniform sampling approach. Third, we present a self-reconfigurable camera array that we developed, which features a very efficient algorithm for real-time rendering and the ability to automatically reconfigure the cameras to improve the rendering quality. Both are based on active sampling. Our camera array is able to render dynamic scenes interactively at high quality. To the best of our knowledge, it is the first camera array that can reconfigure the camera positions automatically. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modeling Digital Switching Circuits with Linear Algebra

    Mitchell A. Thornton
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Modeling Digital Switching Circuits with Linear Algebra describes an approach for modeling digital information and circuitry that is an alternative to Boolean algebra. While the Boolean algebraic model has been wildly successful and is responsible for many advances in modern information technology, the approach described in this book offers new insight and different ways of solving problems. Modeling the bit as a vector instead of a scalar value in the set {0, 1} allows digital circuits to be characterized with transfer functions in the form of a linear transformation matrix. The use of transfer functions is ubiquitous in many areas of engineering and their rich background in linear systems theory and signal processing is easily applied to digital switching circuits with this model. The common tasks of circuit simulation and justification are specific examples of the application of the linear algebraic model and are described in detail. The advantages offered by the new model as compared to traditional methods are emphasized throughout the book. Furthermore, the new approach is easily generalized to other types of information processing circuits such as those based upon multiple-valued or quantum logic; thus providing a unifying mathematical framework common to each of these areas. Modeling Digital Switching Circuits with Linear Algebra provides a blend of theoretical concepts and practical issues involved in implementing the method for circuit design tasks. Data structures are described and are shown to not require any more resources for representing the underlying matrices and vectors than those currently used in modern electronic design automation (EDA) tools based on the Boolean model. Algorithms are described that perform simulation, justification, and other common EDA tasks in an efficient manner that are competitive with conventional design tools. The linear algebraic model can be used to implement common EDA tasks directly upon a structural netlist thus avoiding the intermediate step of transforming a circuit description into a representation of a set of switching functions as is commonly the case when conventional Boolean techniques are used. Implementation results are provided that empirically demonstrate the practicality of the linear algebraic model. View full abstract»
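
    To make the vector view of a bit concrete, here is a small NumPy sketch in the spirit of the model described above; the exact encoding convention (0 as [1, 0], 1 as [0, 1]) and the gate matrix are assumed for illustration rather than quoted from the book.

        import numpy as np

        # Assumed encoding: bit 0 -> [1, 0], bit 1 -> [0, 1].
        ZERO = np.array([1, 0])
        ONE = np.array([0, 1])

        # Transfer matrix of a two-input AND gate. It acts on the Kronecker product of
        # the input vectors; columns correspond to the joint inputs 00, 01, 10, 11.
        AND = np.array([
            [1, 1, 1, 0],   # output bit 0 for inputs 00, 01, 10
            [0, 0, 0, 1],   # output bit 1 only for input 11
        ])

        def simulate_and(a, b):
            """Simulate the gate by one matrix-vector multiplication."""
            return AND @ np.kron(a, b)

        for a, b in [(ZERO, ZERO), (ZERO, ONE), (ONE, ZERO), (ONE, ONE)]:
            print(a, b, "->", simulate_and(a, b))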

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Health Care Engineering, Part I:Clinical Engineering and Technology Management

    Monique Frize
    Copyright Year: 2013

    Morgan and Claypool eBooks

    The first chapter describes the health care delivery systems in Canada and in the U.S. This is followed by examples of various approaches used to measure physiological variables in humans, either for the purpose of diagnosis or monitoring potential disease conditions; a brief description of sensor technologies is included. The function and role of the clinical engineer in managing medical technologies in industrialized and in developing countries are presented. This is followed by a chapter on patient safety (mainly electrical safety and electromagnetic interference); it includes a section on how to minimize liability and how to develop a quality assurance program for technology management. The next chapter discusses applications of telemedicine, including technical, social, and ethical issues. The last chapter presents a discussion on the impact of technology on health care and the technology assessment process. This two-part book consolidates material that supports courses on technology development and management issues in health care institutions. It can be useful for anyone involved in design, development, or research, whether in industry, hospitals, or government. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Search-User Interface Design

    Max Wilson
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Search User Interfaces (SUIs) represent the gateway between people who have a task to complete, and the repositories of information and data stored around the world. Not surprisingly, therefore, there are many communities who have a vested interest in the way SUIs are designed. There are people who study how humans search for information, and people who study how humans use computers. There are people who study good user interface design, and people who design aesthetically pleasing user interfaces. There are also people who curate and manage valuable information resources, and people who design effective algorithms to retrieve results from them. While it would be easy for one community to reject another for their limited ability to design a good SUI, the truth is that they all can, and they all have made valuable contributions. Fundamentally, therefore, we must accept that designing a great SUI means leveraging the knowledge and skills from all of these communities. The aim of this book is to at least acknowledge, if not integrate, all of these perspectives to bring the reader into a multidisciplinary mindset for how we should think about SUI design. Further, this book aims to provide the reader with a framework for thinking about how different innovations each contribute to the overall design of a SUI. With this framework and a multidisciplinary perspective in hand, the book then continues by reviewing: early, successful, established, and experimental concepts for SUI design. The book then concludes by discussing how we can analyse and evaluate the on-going developments in SUI design, as this multidisciplinary area of research moves forwards. Finally, in reviewing these many SUIs and SUI features, the book finishes by extracting a series of 20 SUI design recommendations that are listed in the conclusions. Table of Contents: Introduction / Searcher-Computer Interaction / Early Search User Interfaces / Modern Search User Interfaces / Experimental Search User Interfaces / Evaluating Search User Interfaces / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Estimating the Query Difficulty for Information Retrieval

    David Carmel ; Elad Yom-Tov
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many information retrieval (IR) systems suffer from a radical variance in performance when responding to users' queries. Even for systems that succeed very well on average, the quality of results returned for some of the queries is poor. Thus, it is desirable that IR systems be able to identify "difficult" queries so they can be handled properly. Understanding why some queries are inherently more difficult than others is essential for IR, and a good answer to this important question will help search engines to reduce the variance in performance, hence better serving their customers' needs. Estimating the query difficulty is an attempt to quantify the quality of search results retrieved for a query from a given collection of documents. This book discusses the reasons that cause search engines to fail for some of the queries, and then reviews recent approaches for estimating query difficulty in the IR field. It then describes a common methodology for evaluating the prediction quality of those estimators, and experiments with some of the predictors applied by various IR methods over several TREC benchmarks. Finally, it discusses potential applications that can utilize query difficulty estimators by handling each query individually and selectively, based upon its estimated difficulty. Table of Contents: Introduction - The Robustness Problem of Information Retrieval / Basic Concepts / Query Performance Prediction Methods / Pre-Retrieval Prediction Methods / Post-Retrieval Prediction Methods / Combining Predictors / A General Model for Query Difficulty / Applications of Query Difficulty Estimation / Summary and Conclusions View full abstract»
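
    As one concrete, hedged example of a pre-retrieval predictor of the kind reviewed in the book, the sketch below scores a query by the average inverse document frequency of its terms over a toy collection; the corpus and the particular scoring choice are illustrative only.

        import math
        from collections import Counter

        # Toy document collection (illustrative only).
        docs = [
            "the cat sat on the mat",
            "the dogs and the cats are pets",
            "quantum error correction codes",
            "the weather is nice today",
        ]
        N = len(docs)
        df = Counter(term for d in docs for term in set(d.split()))   # document frequencies

        def avg_idf(query):
            """Pre-retrieval difficulty proxy: mean inverse document frequency of query terms.
            Low scores hint that the query is built from common, non-discriminative words."""
            terms = query.split()
            return sum(math.log((N + 1) / (df.get(t, 0) + 1)) for t in terms) / len(terms)

        print(avg_idf("quantum correction"))   # rare, specific terms -> higher score
        print(avg_idf("the"))                  # very common term     -> lower score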

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Database Replication

    Bettina Kemme ; Ricardo Jimenez-Peris ; Marta Patino-Martinez
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even to geographically distributed clients, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and there are many hurdles to overcome. At the forefront is replica control: assuring that data copies remain consistent when updates occur. There exist many alternatives in regard to where updates can occur and when changes are propagated to data copies, how changes are applied, where the replication tool is located, etc. A particular challenge is to combine replica control with transaction management as it requires several operations to be treated as a single logical unit, and it provides atomicity, consistency, isolation and durability across the replicated system. The book provides a categorization of replica control mechanisms, presents several replica and concurrency control mechanisms in detail, and discusses many of the issues that arise when such solutions need to be implemented within or on top of relational database systems. Furthermore, the book presents the tasks that are needed to build a fault-tolerant replication solution, provides an overview of load-balancing strategies that allow load to be equally distributed across all replicas, and introduces the concept of self-provisioning that allows the replicated system to dynamically decide on the number of replicas that are needed to handle the current load. As performance evaluation is a crucial aspect when developing a replication tool, the book presents an analytical model of the scalability potential of various replication solutions. For readers who are only interested in getting a good overview of the challenges of database replication and the general mechanisms of how to implement replication solutions, we recommend reading Chapters 1 to 4. For readers who want to get a more complete picture and a discussion of advanced issues, we further recommend Chapters 5, 8, 9, and 10. Finally, Chapters 6 and 7 are of interest to those who want to become familiar with thorough algorithm design and correctness reasoning. Table of Contents: Overview / 1-Copy-Equivalence and Consistency / Basic Protocols / Replication Architecture / The Scalability of Replication / Eager Replication and 1-Copy-Serializability / 1-Copy-Snapshot Isolation / Lazy Replication / Self-Configuration and Elasticity / Other Aspects of Replication View full abstract»
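
    As a deliberately simplified sketch of the replica-control trade-off discussed above, the toy code below contrasts eager propagation (secondaries updated inside the write path) with lazy propagation (changes queued and applied later), showing how lazy secondaries can briefly serve stale reads; transactions, failures, and concurrency are all omitted.

        class Replica:
            def __init__(self):
                self.data = {}

        class ReplicatedStore:
            """Primary-copy toy store: one primary, several secondaries."""
            def __init__(self, n_secondaries=2, eager=True):
                self.primary = Replica()
                self.secondaries = [Replica() for _ in range(n_secondaries)]
                self.eager = eager
                self.pending = []                 # change log for lazy propagation

            def write(self, key, value):
                self.primary.data[key] = value
                if self.eager:
                    for r in self.secondaries:    # replicas updated inside the write path
                        r.data[key] = value
                else:
                    self.pending.append((key, value))

            def propagate(self):
                for key, value in self.pending:
                    for r in self.secondaries:
                        r.data[key] = value
                self.pending.clear()

        lazy = ReplicatedStore(eager=False)
        lazy.write("x", 1)
        print(lazy.secondaries[0].data.get("x"))   # None -- replica is stale until propagation
        lazy.propagate()
        print(lazy.secondaries[0].data.get("x"))   # 1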

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Estimation of Cortical Connectivity in Humans: Advanced Signal Processing Techniques

    Laura Astolfi ; Fabio Babiloni
    Copyright Year: 2007

    Morgan and Claypool eBooks

    In the last ten years many different brain imaging devices have conveyed a lot of information about the brain functioning in different experimental conditions. In every case, the biomedical engineers, together with mathematicians, physicists and physicians, are called upon to process the signals related to brain activity in order to extract meaningful and robust information to correlate with the external behavior of the subjects. In this attempt, different signal processing tools used in telecommunications and other fields of engineering, or even social sciences, have been adapted and re-used in the neuroscience field. This book offers a short presentation of several methods for the estimation of the cortical connectivity of the human brain. The methods presented here are relatively simple to implement, robust, and can return valuable information about the causality of the activation of the different cortical areas in humans using non-invasive electroencephalographic recordings. The knowledge of such signal processing tools will enrich the arsenal of computational methods that an engineer or a mathematician can apply to the processing of brain signals. Table of Contents: Introduction / Estimation of the Effective Connectivity from Stationary Data by Structural Equation Modeling / Estimation of the Functional Connectivity from Stationary Data by Multivariate Autoregressive Methods / Estimation of Cortical Activity by the use of Realistic Head Modeling / Application: Estimation of Connectivity from Movement-Related Potentials / Application to High-Resolution EEG Recordings in a Cognitive Task (Stroop Test) / Application to Data Related to the Intention of Limb Movements in Normal Subjects and in a Spinal Cord Injured Patient / The Instantaneous Estimation of the Time-Varying Cortical Connectivity by Adaptive Multivariate Estimators / Time-Varying Connectivity from Event-Related Potentials View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Scheduling and Congestion Control for Wireless and Processing Networks

    Libin Jiang ; Jean Walrand
    Copyright Year: 2010

    Morgan and Claypool eBooks

    In this book, we consider the problem of achieving the maximum throughput and utility in a class of networks with resource-sharing constraints. This is a classical problem of great importance. In the context of wireless networks, we first propose a fully distributed scheduling algorithm that achieves the maximum throughput. Inspired by CSMA (Carrier Sense Multiple Access), which is widely deployed in today's wireless networks, our algorithm is simple, asynchronous, and easy to implement. Second, using a novel maximal-entropy technique, we combine the CSMA scheduling algorithm with congestion control to approach the maximum utility. Also, we further show that CSMA scheduling is a modular MAC-layer algorithm that can work with other protocols in the transport layer and network layer. Third, for wireless networks where packet collisions are unavoidable, we establish a general analytical model and extend the above algorithms to that case. Stochastic Processing Networks (SPNs) model manufacturing, communication, and service systems. In manufacturing networks, for example, tasks require parts and resources to produce other parts. SPNs are more general than queueing networks and pose novel challenges to throughput-optimum scheduling. We propose a "deficit maximum weight" (DMW) algorithm to achieve throughput optimality and maximize the net utility of the production in SPNs. Table of Contents: Introduction / Overview / Scheduling in Wireless Networks / Utility Maximization in Wireless Networks / Distributed CSMA Scheduling with Collisions / Stochastic Processing Networks View full abstract»
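
    The toy simulation below gives a hedged feel for idealized CSMA dynamics: each link in a small conflict graph attempts transmission with some probability and defers when a conflicting link already holds the channel. The attempt and release probabilities are invented, and the book's actual algorithm (which adapts link aggressiveness using queue information) is considerably richer.

        import random

        # Conflict graph: links that cannot transmit at the same time (illustrative).
        conflicts = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
        p_attempt = 0.3        # per-slot attempt probability for an idle link
        p_release = 0.5        # per-slot probability that an active link finishes

        random.seed(0)
        active = set()
        service = {link: 0 for link in conflicts}
        for _ in range(10_000):
            # Active links finish (release the channel) with some probability.
            active = {l for l in active if random.random() > p_release}
            # Idle links sense the channel and attempt only if no conflicting link is active;
            # the sequential scan emulates the idealized assumption of perfect carrier sensing.
            for link in conflicts:
                if link not in active and not (conflicts[link] & active):
                    if random.random() < p_attempt:
                        active.add(link)
            for link in active:
                service[link] += 1

        for link, slots in service.items():
            print(link, slots / 10_000)   # fraction of slots in which each link transmitted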

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Shimeng Yu
    Copyright Year: 2016

    Morgan and Claypool eBooks

    RRAM technology has made significant progress in the past decade as a competitive candidate for the next generation non-volatile memory (NVM). This lecture is a comprehensive tutorial of metal oxide-based RRAM technology from device fabrication to array architecture design. State-of-the-art RRAM device performances, characterization, and modeling techniques are summarized, and the design considerations of the RRAM integration to large-scale array with peripheral circuits are discussed. Chapter 2 introduces the RRAM device fabrication techniques and methods to eliminate the forming process, and will show its scalability down to sub-10 nm regime. Then the device performances such as programming speed, variability control, and multi-level operation are presented, and finally the reliability issues such as cycling endurance and data retention are discussed. Chapter 3 discusses the RRAM physical mechanism, and the materials characterization techniques to observe the conductive filaments and the electrical characterization techniques to study the electronic conduction processes. It also presents the numerical device modeling techniques for simulating the evolution of the conductive filaments as well as the compact device modeling techniques for circuit-level design. Chapter 4 discusses the two common RRAM array architectures for large-scale integration: one-transistor-one-resistor (1T1R) and cross-point architecture with selector. The write/read schemes are presented and the peripheral circuitry design considerations are discussed. Finally, a 3D integration approach is introduced for building ultra-high density RRAM array. Chapter 5 is a brief summary and will give an outlook for RRAM’s potential novel applications beyond the NVM applications. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering, Poverty, and the Earth

    George D. Catalano
    Copyright Year: 2007

    Morgan and Claypool eBooks

    In the present work, the growing awareness in engineering of the profession’s responsibility towards the environment and the poor is considered. The following approach is taken: a brief overview of the issues of poverty particularly in the U.S. and the deterioration of the natural world with a focus on the Arctic is provided. Case studies involving New Orleans in the aftermath of Hurricane Katrina and the status of polar bears in a time of shrinking Arctic ice cover are detailed. Recent developments in engineering related to the issues of poverty and the environment are discussed. A new paradigm for engineering based on the works of Leonardo Boff and Thomas Berry, one that places an important emphasis upon a community, is explored. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineers Engaging Community:Water and Energy

    Carolyn Oldham ; Gregory Crebbin ; Stephen Dobbs ; Andrea Gaynor
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Water and energy are fundamental elements of community well-being and economic development, and a key focus of engineering efforts the world over. As such, they offer outstanding opportunities for the development of socially just engineering practices. This work examines the engineering of water and energy systems with a focus on issues of social justice and sustainability. A key theme running through the work is engaging community on water and energy engineering projects: How is this achieved in diverse contexts? And, what can we learn from past failures and successes in water and energy engineering? The book includes a detailed case study of issues involved in the provision of water and energy, among other needs, in a developing and newly independent nation, East Timor. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Microcontroller Programming and Interfacing TI MSP430:Part I

    Steven Barrett ; Daniel Pack
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book provides a thorough introduction to the Texas Instruments MSP430 microcontroller. The MSP430 is a 16-bit reduced instruction set (RISC) processor that features ultra low power consumption and integrated digital and analog hardware. Variants of the MSP430 microcontroller have been in production since 1993. This provides for a host of MSP430 products including evaluation boards, compilers, and documentation. A thorough introduction to the MSP430 line of microcontrollers, programming techniques, and interface concepts are provided along with considerable tutorial information with many illustrated examples. Each chapter provides laboratory exercises to apply what has been presented in the chapter. The book is intended for an upper level undergraduate course in microcontrollers or mechatronics but may also be used as a reference for capstone design projects. Also, practicing engineers already familiar with another microcontroller, who require a quick tutorial on the microcontroller, will find this book very useful. Table of Contents: Timer Systems / Resets and Interrupts / Analog Peripherals / Communication Systems / System Level Design View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Constraint-Based Temporal Reasoning

    Roman Bartak ; Robert A. Morris ; K. Brent Venable
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Solving challenging computational problems involving time has been a critical component in the development of artificial intelligence systems almost since the inception of the field. This book provides a concise introduction to the core computational elements of temporal reasoning for use in AI systems for planning and scheduling, as well as systems that extract temporal information from data. It presents a survey of temporal frameworks based on constraints, both qualitative and quantitative, as well as of major temporal consistency techniques. The book also introduces the reader to more recent extensions to the core model that allow AI systems to explicitly represent temporal preferences and temporal uncertainty. This book is intended for students and researchers interested in constraint-based temporal reasoning. It provides a self-contained guide to the different representations of time, as well as examples of recent applications of time in AI systems. View full abstract»
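
    As a hedged illustration of the quantitative constraint frameworks covered, the Python sketch below checks the consistency of a Simple Temporal Problem by running Floyd-Warshall on its distance graph, where a negative cycle means the constraints cannot all be satisfied; the events and bounds are made up for the example.

        import itertools

        INF = float("inf")

        def stp_consistent(events, constraints):
            """Check consistency of a Simple Temporal Problem.
            constraints maps (x, y) to an upper bound on time(y) - time(x); the problem
            is consistent iff the corresponding distance graph has no negative cycle."""
            d = {(a, b): (0.0 if a == b else INF) for a in events for b in events}
            for (x, y), bound in constraints.items():
                d[(x, y)] = min(d[(x, y)], bound)
            for k, i, j in itertools.product(events, repeat=3):   # Floyd-Warshall
                if d[(i, k)] + d[(k, j)] < d[(i, j)]:
                    d[(i, j)] = d[(i, k)] + d[(k, j)]
            return all(d[(e, e)] >= 0 for e in events)

        events = ["start", "leave_home", "arrive_work"]
        constraints = {
            ("start", "leave_home"): 10,         # leave at most 10 minutes after start
            ("leave_home", "arrive_work"): 30,   # commute takes at most 30 minutes
            ("arrive_work", "start"): -35,       # arrive at least 35 minutes after start
        }
        print(stp_consistent(events, constraints))   # True, since 35 <= 10 + 30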

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Graph-Based Semi-Supervised Learning

    Amarnag Subramanya ; Partha Pratim Talukdar
    Copyright Year: 2014

    Morgan and Claypool eBooks

    While labeled data is expensive to prepare, ever-increasing amounts of unlabeled data are becoming widely available. In order to adapt to this phenomenon, several semi-supervised learning (SSL) algorithms, which learn from labeled as well as unlabeled data, have been developed. In a separate line of work, researchers have started to realize that graphs provide a natural way to represent data in a variety of domains. Graph-based SSL algorithms, which bring together these two lines of work, have been shown to outperform the state-of-the-art in many applications in speech processing, computer vision, natural language processing, and other areas of Artificial Intelligence. Recognizing this promising and emerging area of research, this synthesis lecture focuses on graph-based SSL algorithms (e.g., label propagation methods). Our hope is that after reading this book, the reader will walk away with the following: (1) an in-depth knowledge of the current state-of-the-art in graph-based SSL algorithms, and the ability to implement them; (2) the ability to decide on the suitability of graph-based SSL methods for a problem; and (3) familiarity with different applications where graph-based SSL methods have been successfully applied. Table of Contents: Introduction / Graph Construction / Learning and Inference / Scalability / Applications / Future Work / Bibliography / Authors' Biographies / Index View full abstract»
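
    To make the idea concrete, here is a tiny, hedged sketch of plain label propagation, the family of methods the lecture emphasizes: labels spread from two seed nodes across a similarity graph by repeated neighbor averaging. The graph, seeds, and the simple update rule are illustrative; the book develops more careful formulations (normalization, regularization, scalability).

        # Undirected similarity graph as adjacency lists (toy data).
        graph = {
            1: [2, 3], 2: [1, 3], 3: [1, 2, 4],
            4: [3, 5], 5: [4, 6], 6: [5],
        }
        seeds = {1: +1.0, 6: -1.0}    # two labeled nodes, one per class

        score = {n: seeds.get(n, 0.0) for n in graph}
        for _ in range(100):          # iterate the propagation until scores settle
            new = {}
            for n in graph:
                if n in seeds:
                    new[n] = seeds[n]                       # clamp labeled nodes
                else:
                    new[n] = sum(score[m] for m in graph[n]) / len(graph[n])   # neighbor average
            score = new

        for n in sorted(graph):
            print(n, "+1" if score[n] > 0 else "-1", round(score[n], 3))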

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computer Architecture Techniques for Power-Efficiency

    Stefanos Kaxiras ; Margaret Martonosi
    Copyright Year: 2008

    Morgan and Claypool eBooks

    In the last few years, power dissipation has become an important design constraint, on par with performance, in the design of new computer systems. Whereas in the past, the primary job of the computer architect was to translate improvements in operating frequency and transistor count into performance, now power efficiency must be taken into account at every step of the design process. While for some time, architects have been successful in delivering 40% to 50% annual improvement in processor performance, costs that were previously brushed aside eventually caught up. The most critical of these costs is the inexorable increase in power dissipation and power density in processors. Power dissipation issues have catalyzed new topic areas in computer architecture, resulting in a substantial body of work on more power-efficient architectures. Power dissipation, coupled with diminishing performance gains, was also the main cause for the switch from single-core to multi-core architectures and for the slowdown in frequency increases. This book aims to document some of the most important architectural techniques that were invented, proposed, and applied to reduce both dynamic power and static power dissipation in processors and memory hierarchies. A significant number of techniques have been proposed for a wide range of situations and this book synthesizes those techniques by focusing on their common characteristics. Table of Contents: Introduction / Modeling, Simulation, and Measurement / Using Voltage and Frequency Adjustments to Manage Dynamic Power / Optimizing Capacitance and Switching Activity to Reduce Dynamic Power / Managing Static (Leakage) Power / Conclusions View full abstract»
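
    As a back-of-the-envelope illustration of why voltage and frequency adjustment is such a powerful dynamic-power lever (one of the technique families documented here), the snippet below evaluates the standard P = alpha * C * V^2 * f model; the capacitance, activity factor, and operating points are made-up example numbers.

        # Dynamic power model: P_dyn ~= alpha * C * V^2 * f
        def dynamic_power(alpha, c_farads, v_volts, f_hz):
            return alpha * c_farads * v_volts ** 2 * f_hz

        alpha, C = 0.2, 1.5e-9                           # activity factor, switched capacitance
        p_nominal = dynamic_power(alpha, C, 1.2, 3.0e9)  # nominal voltage/frequency point
        p_scaled = dynamic_power(alpha, C, 0.9, 2.0e9)   # lower voltage and lower frequency

        saving = 100 * (1 - p_scaled / p_nominal)
        print(f"nominal: {p_nominal:.2f} W, scaled: {p_scaled:.2f} W ({saving:.0f}% less)")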

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High Dynamic Range Video

    Karol Myszkowski ; Rafal Mantiuk ; Grzegorz Krawczyk
    Copyright Year: 2008

    Morgan and Claypool eBooks

    As new displays and cameras offer enhanced color capabilities, there is a need to extend the precision of digital content. High Dynamic Range (HDR) imaging encodes images and video with higher than normal 8 bit-per-color-channel precision, enabling representation of the complete color gamut and the full visible range of luminance. However, to realize the transition from traditional to HDR imaging, it is necessary to develop imaging algorithms that work with the high-precision data. To make such algorithms effective and feasible in practice, it is necessary to take advantage of the limitations of the human visual system by aligning the data shortcomings to those of the human eye, thus limiting storage and processing precision. Therefore, human visual perception is the key component of the solutions we discuss in this book. This book presents a complete pipeline for HDR image and video processing from acquisition, through compression and quality evaluation, to display. At the HDR image and video acquisition stage, specialized HDR sensors or multi-exposure techniques suitable for traditional cameras are discussed. Then, we present a practical solution for pixel values calibration in terms of photometric or radiometric quantities, which are required in some technically oriented applications. Also, we cover the problem of efficient image and video compression and encoding either for storage or transmission purposes, including the aspect of backward compatibility with existing formats. Finally, we review existing HDR display technologies and the associated problems of image contrast and brightness adjustment. For this purpose, tone mapping is employed to accommodate HDR content to LDR devices. Conversely, the so-called inverse tone mapping is required to upgrade LDR content for displaying on HDR devices. We overview HDR-enabled image and video quality metrics, which are needed to verify algorithms at all stages of the pipeline. Additionally, we cover successful examples of the HDR technology applications, in particular, in computer graphics and computer vision. The goal of this book is to present all discussed components of the HDR pipeline with the main focus on video. For some pipeline stages HDR video solutions are either not well established or do not exist at all, in which case we describe techniques for single HDR images. In such cases we attempt to select techniques which can be extended into the temporal domain. Whenever needed, relevant background information on human perception is given, which enables better understanding of the design choices behind the discussed algorithms and HDR equipment. Table of Contents: Introduction / Representation of an HDR Image / HDR Image and Video Acquisition / HDR Image Quality / HDR Image, Video, and Texture Compression / Tone Reproduction / HDR Display Devices / LDR2HDR: Recovering Dynamic Range in Legacy Content / HDRI in Computer Graphics / Software View full abstract»
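
    As a minimal, hedged example of one stage of that pipeline, the code below applies a simple global tone-mapping curve (a Reinhard-style L/(1+L) compression) to synthetic HDR luminance values so they fit an 8-bit display range; the operators treated in the book additionally account for perception, local contrast, and temporal coherence in video.

        import numpy as np

        # Synthetic HDR luminance values spanning several orders of magnitude (cd/m^2).
        hdr = np.array([[0.01, 0.5, 2.0],
                        [10.0, 150.0, 4000.0]])

        def tone_map_global(luminance, exposure=0.5):
            """Reinhard-style global operator: compress to [0, 1), then quantize to 8 bits."""
            scaled = exposure * luminance
            ldr = scaled / (1.0 + scaled)        # smooth compression of the highlights
            return np.round(255 * ldr).astype(np.uint8)

        print(tone_map_global(hdr))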

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Provenance Data in Social Media

    Geoffrey Barbier ; Zhuo Feng ; Pritam Gundecha ; Huan Liu
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Social media shatters the barrier to communicate anytime anywhere for people of all walks of life. The publicly available, virtually free information in social media poses a new challenge to consumers who have to discern whether a piece of information published in social media is reliable. For example, it can be difficult to understand the motivations behind a statement passed from one user to another, without knowing the person who originated the message. Additionally, false information can be propagated through social media, resulting in embarrassment or irreversible damages. Provenance data associated with a social media statement can help dispel rumors, clarify opinions, and confirm facts. However, provenance data about social media statements is not readily available to users today. Currently, providing this data to users requires changing the social media infrastructure or offering subscription services. Taking advantage of social media features, research in this nascent field spearheads the search for a way to provide provenance data to social media users, thus leveraging social media itself by mining it for the provenance data. Searching for provenance data reveals an interesting problem space requiring the development and application of new metrics in order to provide meaningful provenance data to social media users. This lecture reviews the current research on information provenance, explores exciting research opportunities to address pressing needs, and shows how data mining can enable a social media user to make informed judgements about statements published in social media. Table of Contents: Information Provenance in Social Media / Provenance Attributes / Provenance via Network Information / Provenance Data View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tragedy in the Gulf:A Call for a New Engineering Ethic

    George D. Catalano
    Copyright Year: 2010

    Morgan and Claypool eBooks

    The recent tragedy in the Gulf of Mexico and resultant ethical consequences for the engineering profession are introduced and discussed. The need for a new engineering ethic is identified and introduced based upon advancements in science, complex systems and eco-philosophy. Motivations for introducing a new ethic rather than modifying existing ethics are also discussed. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Transient Electro-Thermal Modeling of Bipolar Power Semiconductor Devices

    Tanya Kirilova Gachovska ; Jerry Hudgins ; Bin Du ; Enrico Santi
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book presents physics-based electro-thermal models of bipolar power semiconductor devices including their packages, and describes their implementation in MATLAB and Simulink. It is a continuation of our first book Modeling of Bipolar Power Semiconductor Devices. The device electrical models are developed by subdividing the devices into different regions and the operations in each region, along with the interactions at the interfaces, are analyzed using the basic semiconductor physics equations that govern device behavior. The Fourier series solution is used to solve the ambipolar diffusion equation in the lightly doped drift region of the devices. In addition to the external electrical characteristics, internal physical and electrical information, such as junction voltages and carrier distribution in different regions of the device, can be obtained using the models. The instantaneous dissipated power, calculated using the electrical device models, serves as input to the thermal model (RC network with constant and nonconstant thermal resistance and thermal heat capacity, or Fourier thermal model) of the entire module or package, which computes the junction temperature of the device. Once an updated junction temperature is calculated, the temperature-dependent semiconductor material parameters are re-calculated and used with the device electrical model in the next time-step of the simulation. The physics-based electro-thermal models can be used for optimizing device and package design and also for validating extracted parameters of the devices. The thermal model can be used alone for monitoring the junction temperature of a power semiconductor device, and the resulting simulation results used as an indicator of the health and reliability of the semiconductor power device. View full abstract»
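
    A minimal sketch of the thermal side, under assumed values: a Foster-style RC ladder is driven by the instantaneous dissipated power, and the resulting junction temperature would feed the temperature-dependent electrical model (not shown) at the next time step. The thermal resistances, capacitances, and power waveform are invented for illustration.

        # Foster-style RC thermal network driven by the dissipated power (values invented).
        R = [0.05, 0.20]     # thermal resistances of the stages, K/W
        C = [0.50, 5.00]     # thermal heat capacities of the stages, J/K
        T_AMB = 25.0         # ambient/case temperature, degC
        DT = 1e-3            # simulation time step, s

        rise = [0.0 for _ in R]   # temperature rise across each RC stage

        def step(power_w):
            """Advance the network one time step; return the junction temperature."""
            for i, (r, c) in enumerate(zip(R, C)):
                # Explicit Euler update of dT_i/dt = P/C_i - T_i/(R_i*C_i)
                rise[i] += DT * (power_w / c - rise[i] / (r * c))
            return T_AMB + sum(rise)

        tj = T_AMB
        for _ in range(1000):          # one second of constant 100 W dissipation
            tj = step(100.0)
        print(f"junction temperature after 1 s: {tj:.1f} degC")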

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Privacy in Social Networks

    Elena Zheleva ; Evimaria Terzi ; Lise Getoor
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This synthesis lecture provides a survey of work on privacy in online social networks (OSNs). This work encompasses concerns of users as well as service providers and third parties. Our goal is to approach such concerns from a computer-science perspective, and to build upon existing work on privacy, security, statistical modeling, and databases to provide an overview of the technical and algorithmic issues related to privacy in OSNs. We start our survey by introducing a simple OSN data model and describe common statistical-inference techniques that can be used to infer potentially sensitive information. Next, we describe some privacy definitions and privacy mechanisms for data publishing. Finally, we describe a set of recent techniques for modeling, evaluating, and managing individual users' privacy risk within the context of OSNs. Table of Contents: Introduction / A Model for Online Social Networks / Types of Privacy Disclosure / Statistical Methods for Inferring Information in Networks / Anonymity and Differential Privacy / Attacks and Privacy-preserving Mechanisms / Models of Information Sharing / Users' Privacy Risk / Management of Privacy Settings View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    XML Retrieval

    Mounia Lalmas
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Documents usually have both content and structure. The content refers to the text of the document, whereas the structure refers to how a document is logically organized. An increasingly common way to encode the structure is through the use of a mark-up language. Nowadays, the most widely used mark-up language for representing structure is the eXtensible Mark-up Language (XML). XML can be used to provide focused access to documents, i.e. returning XML elements, such as sections and paragraphs, instead of whole documents in response to a query. Such focused strategies are of particular benefit for information repositories containing long documents, or documents covering a wide variety of topics, where users are directed to the most relevant content within a document. The increased adoption of XML to represent document structure requires the development of tools to effectively access documents marked-up in XML. This book provides a detailed description of the query languages, indexing strategies, ranking algorithms, and presentation scenarios developed to access XML documents. Major advances in XML retrieval were seen from 2002 as a result of INEX, the Initiative for the Evaluation of XML Retrieval. INEX, also described in this book, provided test sets for evaluating XML retrieval effectiveness. Many of the developments and results described in this book were investigated within INEX. Table of Contents: Introduction / Basic XML Concepts / Historical Perspectives / Query Languages / Indexing Strategies / Ranking Strategies / Presentation Strategies / Evaluating XML Retrieval Effectiveness / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Library Technologies: Complex Objects, Annotation, Ontologies, Classification, Extraction, and Security

    Edward A. Fox ; Ricardo da Silva Torres
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Digital libraries (DLs) have introduced new technologies, as well as leveraging, enhancing, and integrating related technologies, since the early 1990s. These efforts have been enriched through a formal approach, e.g., the 5S (Societies, Scenarios, Spaces, Structures, Streams) framework, which is discussed in two earlier volumes in this series. This volume should help advance work not only in DLs, but also in the WWW and other information systems. Drawing upon four (Kozievitch, Murthy, Park, Yang) completed and three (Elsherbiny, Farag, Srinivasan) in-process dissertations, as well as the efforts of collaborating researchers and scores of related publications, presentations, tutorials, and reports, this book should advance the DL field with regard to at least six key technologies. By integrating surveys of the state-of-the-art, new research, connections with formalization, case studies, and exercises/projects, this book can serve as a computing or information science textbook. It can support studies in cyber-security, document management, hypertext/hypermedia, IR, knowledge management, LIS, multimedia, and machine learning. Chapter 1, with a case study on fingerprint collections, focuses on complex (composite, compound) objects, connecting DL and related work on buckets, DCC, and OAI-ORE. Chapter 2, discussing annotations, as in hypertext/hypermedia, emphasizes parts of documents, including images as well as text, managing superimposed information. The SuperIDR system, and prototype efforts with Flickr, should motivate further development and standardization related to annotation, which would benefit all DL and WWW users. Chapter 3, on ontologies, explains how they help with browsing, query expansion, focused crawling, and classification. This chapter connects DLs with the Semantic Web, and uses CTRnet as an example. Chapter 4, on (hierarchical) classification, leverages LIS theory, as well as machine learning, and is important for DLs as well as the WWW. Chapter 5, on extraction from text, covers document segmentation, as well as how to construct a database from heterogeneous collections of references (from ETDs); i.e., converting strings to canonical forms. Chapter 6 surveys the security approaches used in information systems, and explains how those approaches can apply to digital libraries which are not fully open. Given this rich content, those interested in DLs will be able to find solutions to key problems, using the right technologies and methods. We hope this book will help show how formal approaches can enhance the development of suitable technologies and how they can be better integrated with DLs and other information systems. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Planning with Markov Decision Processes: An AI Perspective

    Mausam ; Andrey Kolobov
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Markov Decision Processes (MDPs) are widely popular in Artificial Intelligence for modeling sequential decision-making scenarios with probabilistic dynamics. They are the framework of choice when designing an intelligent agent that needs to act for long periods of time in an environment where its actions could have uncertain outcomes. MDPs are actively researched in two related subareas of AI, probabilistic planning and reinforcement learning. Probabilistic planning assumes known models for the agent's goals and domain dynamics, and focuses on determining how the agent should behave to achieve its objectives. On the other hand, reinforcement learning additionally learns these models based on the feedback the agent gets from the environment. This book provides a concise introduction to the use of MDPs for solving probabilistic planning problems, with an emphasis on the algorithmic perspective. It covers the whole spectrum of the field, from the basics to state-of-the-art optimal and approximation algorithms. We first describe the theoretical foundations of MDPs and the fundamental solution techniques for them. We then discuss modern optimal algorithms based on heuristic search and the use of structured representations. A major focus of the book is on the numerous approximation schemes for MDPs that have been developed in the AI literature. These include determinization-based approaches, sampling techniques, heuristic functions, dimensionality reduction, and hierarchical representations. Finally, we briefly introduce several extensions of the standard MDP classes that model and solve even more complex planning problems. Table of Contents: Introduction / MDPs / Fundamental Algorithms / Heuristic Search Algorithms / Symbolic Algorithms / Approximation Algorithms / Advanced Notes View full abstract»
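
    As a concrete flavour of the fundamental solution techniques mentioned above, the sketch below runs value iteration, one classic dynamic-programming algorithm for MDPs, on a tiny invented two-state problem; the states, actions, transition probabilities and rewards are illustrative only, not taken from the book.

    ```python
    # Value iteration on a toy MDP, as one example of the fundamental
    # algorithms mentioned above. States, actions, transitions, and rewards
    # are invented for illustration.

    # transitions[s][a] = list of (probability, next_state, reward)
    transitions = {
        0: {"stay": [(1.0, 0, 0.0)], "go": [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
        1: {"stay": [(1.0, 1, 1.0)], "go": [(1.0, 0, 0.0)]},
    }
    gamma = 0.9  # discount factor

    V = {s: 0.0 for s in transitions}
    for _ in range(200):  # repeat the Bellman backup until (nearly) converged
        V = {
            s: max(
                sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for outcomes in actions.values()
            )
            for s, actions in transitions.items()
        }

    # greedy policy extracted from the converged value function
    policy = {
        s: max(actions, key=lambda a: sum(p * (r + gamma * V[s2])
                                          for p, s2, r in actions[a]))
        for s, actions in transitions.items()
    }
    print(V, policy)
    ```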

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Database Anonymization: Privacy Models, Data Utility, and Microaggregation-based Inter-model Connections

    Josep Domingo-Ferrer ; David Sanchez ; Jordi Soria-Comas
    Copyright Year: 2016

    Morgan and Claypool eBooks

    The current social and economic context increasingly demands open data to improve scientific research and decision making. However, when published data refer to individual respondents, disclosure risk limitation techniques must be implemented to anonymize the data and guarantee by design the fundamental right to privacy of the subjects the data refer to. Disclosure risk limitation has a long record in the statistical and computer science research communities, which have developed a variety of privacy-preserving solutions for data releases. This Synthesis Lecture provides a comprehensive overview of the fundamentals of privacy in data releases, focusing on the computer science perspective. Specifically, we detail the privacy models, anonymization methods, and utility and risk metrics that have been proposed so far in the literature. In addition, as a more advanced topic, we identify and discuss in detail connections between several privacy models (i.e., how the privacy guarantees they offer can be accumulated to achieve more robust protection, and when such guarantees are equivalent or complementary); we also explore the links between anonymization methods and privacy models (how anonymization methods can be used to enforce privacy models and thereby offer ex ante privacy guarantees). These latter topics are relevant to researchers and advanced practitioners, who will gain a deeper understanding of the available data anonymization solutions and the privacy guarantees they can offer. View full abstract»
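
    As a small illustration of the microaggregation idea named in the title, the sketch below groups numerical records into clusters of at least k and publishes each cluster's centroid in place of the original values. The sort-then-group heuristic and the example records are simplifications chosen for illustration, not the book's algorithms.

    ```python
    # Toy microaggregation: sort records, form groups of at least k,
    # and publish each group's centroid. The sorting heuristic is a
    # simplification chosen for illustration.

    def microaggregate(records, k=3):
        ordered = sorted(records)                      # records are numeric tuples
        groups = [ordered[i:i + k] for i in range(0, len(ordered), k)]
        if len(groups) > 1 and len(groups[-1]) < k:    # fold a short tail group back
            groups[-2].extend(groups.pop())
        anonymized = []
        for g in groups:
            centroid = tuple(sum(col) / len(g) for col in zip(*g))
            anonymized.extend([centroid] * len(g))     # every member gets the centroid
        return anonymized

    print(microaggregate([(30, 50000), (32, 52000), (29, 48000),
                          (45, 90000), (47, 95000), (44, 88000)], k=3))
    ```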

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Intelligent Systems in Traffic and Transportation

    Ana L. C. Bazzan ; Franziska Klugl
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Urban mobility is not only one of the pillars of modern economic systems, but also a key issue in the quest for equality of opportunity, since it can improve access to other services. Currently, however, there are a number of negative issues related to traffic, especially in mega-cities, such as economic issues (the opportunity cost of delays), environmental issues (externalities related to emissions of pollutants), and social issues (traffic accidents). Solutions to these issues are more and more closely tied to information and communication technology. Indeed, a search in the technical literature (using the keyword "urban traffic" to filter out articles on data network traffic) retrieved the following number of articles (as of December 3, 2013): 9,443 (ACM Digital Library), 26,054 (Scopus), and 1,730,000 (Google Scholar). Moreover, articles listed in the ACM query relate to conferences as diverse as MobiCom, CHI, PADS, and AAMAS. This means that there is a large and diverse community of computer scientists and computer engineers who tackle research that is connected to the development of intelligent traffic and transportation systems. It is also possible to see that this community is growing, and that research projects are getting more and more interdisciplinary. To foster cooperation among the involved communities, this book aims at giving a broad introduction to the basic but relevant concepts related to transportation systems, targeting researchers and practitioners from computer science and information technology. In addition, the second part of the book gives a panorama of some of the newest and most exciting technologies, originating in computer science and computer engineering, that are now being employed in projects related to car-to-car communication, interconnected vehicles, car navigation, platooning, crowd sensing and sensor networks, among others. This material will also be of interest to engineers and researchers from the traffic and transportation community. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Private Information Retrieval

    Xun Yi ; Russell Paulet ; Elisa Bertino
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book deals with Private Information Retrieval (PIR), a technique allowing a user to retrieve an element from a server in possession of a database without revealing to the server which element is retrieved. PIR has been widely applied to protect the privacy of the user in querying a service provider on the Internet. For example, by PIR, one can query a location-based service provider about the nearest car park without revealing one's location to the server. The first PIR approach was introduced by Chor, Goldreich, Kushilevitz and Sudan in 1995 in a multi-server setting, where the user retrieves information from multiple database servers, each of which has a copy of the same database. To ensure user privacy in the multi-server setting, the servers must be trusted not to collude. In 1997, Kushilevitz and Ostrovsky constructed the first single-database PIR. Since then, many efficient PIR solutions have been discovered. Beginning with a thorough survey of single-database PIR techniques, this text focuses on the latest technologies and applications in the field of PIR. The main categories are illustrated with recently proposed PIR-based solutions by the authors. Because it offers an up-to-date treatment of the topic, this text will be highly beneficial to researchers and industry professionals in information security and privacy. View full abstract»
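
    The multi-server setting of Chor, Goldreich, Kushilevitz and Sudan can be conveyed with the classic two-server XOR construction over a database of bits, sketched below; this is a textbook toy protocol for intuition, not one of the authors' proposed solutions, and it assumes the two servers do not collude.

    ```python
    # Two-server PIR toy example over a database of bits. Each server sees
    # only a uniformly random subset of indices, so neither learns which bit
    # is wanted (assuming the two servers do not collude).
    import secrets

    db = [1, 0, 1, 1, 0, 0, 1, 0]          # both servers hold this database
    n, wanted = len(db), 5                  # the user secretly wants db[5]

    # user: pick a random index set S1, and S2 = S1 with 'wanted' toggled
    S1 = {i for i in range(n) if secrets.randbits(1)}
    S2 = S1 ^ {wanted}

    # each server XORs together the bits at the indices it was asked for
    answer1 = 0
    for i in S1:
        answer1 ^= db[i]
    answer2 = 0
    for i in S2:
        answer2 ^= db[i]

    # the user combines the two answers to recover the wanted bit
    print(answer1 ^ answer2 == db[wanted])  # True
    ```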

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Radar Detection Schemes Under Mismatched Signal Models

    Francesco Bandiera ; Danilo Orlando ; Giuseppe Ricci
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Adaptive detection of signals embedded in correlated Gaussian noise has been an active field of research in recent decades. This topic is important in many areas of signal processing such as, just to give some examples, radar, sonar, communications, and hyperspectral imaging. Most of the existing adaptive algorithms have been designed following the lead of the derivation of Kelly's detector, which assumes perfect knowledge of the target steering vector. However, in realistic scenarios, mismatches are likely to occur due to both environmental and instrumental factors. When a mismatched signal is present in the data under test, conventional algorithms may suffer severe performance degradation. The presence of strong interferers in the cell under test makes the detection task even more challenging. An effective way to cope with this scenario relies on the use of "tunable" detectors, i.e., detectors capable of changing their directivity through the tuning of proper parameters. The aim of this book is to present some recent advances in the design of tunable detectors, and the focus is on the so-called two-stage detectors, i.e., adaptive algorithms obtained by cascading two detectors with opposite behaviors. We derive exact closed-form expressions for the resulting probability of false alarm and the probability of detection for both matched and mismatched signals embedded in homogeneous Gaussian noise. It turns out that such solutions guarantee a wide operational range in terms of tunability while retaining, at the same time, an overall performance in the presence of matched signals commensurate with that of Kelly's detector. Table of Contents: Introduction / Adaptive Radar Detection of Targets / Adaptive Detection Schemes for Mismatched Signals / Enhanced Adaptive Sidelobe Blanking Algorithms / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Storing Clocked Programs Inside DNA: A Simplifying Framework for Nanocomputing

    Jessica Chang ; Dennis Shasha
    Copyright Year: 2011

    Morgan and Claypool eBooks

    In the history of modern computation, large mechanical calculators preceded computers. A person would sit there punching keys according to a procedure and a number would eventually appear. Once calculators became fast enough, it became obvious that the critical path was the punching rather than the calculation itself. That is what made the stored program concept vital to further progress. Once the instructions were stored in the machine, the entire computation could run at the speed of the machine. This book shows how to do the same thing for DNA computing. Rather than asking a robot or a person to pour in specific strands at different times in order to cause a DNA computation to occur (by analogy to a person punching numbers and operations into a mechanical calculator), the DNA instructions are stored within the solution and guide the entire computation. We show how to store straight line programs, conditionals, loops, and a rudimentary form of subroutines. To achieve this goal, the book proposes a complete language for describing the intrinsic topology of DNA complexes and nanomachines, along with the dynamics of such a system. We then describe dynamic behavior using a set of basic transitions, which operate on a small neighborhood within a complex in a well-defined way. These transitions can be formalized as purely syntactical functions of the string representations. Building on that foundation, the book proposes a novel machine motif which constitutes an instruction stack, allowing for the clocked release of an arbitrary sequence of DNA instruction or data strands. The clock mechanism is built of special strands of DNA called "tick" and "tock." Each time a "tick" and "tock" enter a DNA solution, a strand is released from an instruction stack (by analogy to the way in which a clock cycle in an electronic computer causes a new instruction to enter a processing unit). As long as there remain strands on the stack, the next cycle will release a new instruction strand. Regardless of the actual strand or component to be released at any particular clock step, the "tick" and "tock" fuel strands remain the same, thus shifting the burden of work away from the end user of a machine and easing operation. Pre-loaded stacks enable the concept of a stored program to be realized as a physical DNA mechanism. A conceptual example is given of such a stack operating a walker device. The stack allows a user to operate such a clocked walker by means of simple repetition of adding two fuel types, in contrast to the previous mechanism of adding a unique fuel -- at least 12 different types of strands -- for each step of the mechanism. We demonstrate by a series of experiments conducted in Ned Seeman's lab that it is possible to "initialize" a clocked stored program DNA machine. We end the book with a discussion of the design features of a programming language for clocked DNA programming. There is a lot left to do. Table of Contents: Introduction / Notation / Topological Description of DNA Computing / Machines and Motifs / Experiment: Storing Clocked Programs in DNA / A Clocked DNA Programming Language View full abstract»
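
    Purely as a software analogy of the clocked instruction stack described above (no strand-displacement chemistry is modelled), the toy sketch below releases one pre-loaded instruction for each "tick"/"tock" pair; the class and the walker instructions are invented for illustration.

    ```python
    # Abstract software analogy of the clocked instruction stack: each
    # "tick"/"tock" pair releases the next pre-loaded instruction. No DNA
    # chemistry is modelled; this only illustrates the stored-program idea.

    class ClockedStack:
        def __init__(self, instructions):
            self.stack = list(reversed(instructions))  # pre-loaded program
            self._ticked = False

        def tick(self):
            self._ticked = True                        # first fuel strand arrives

        def tock(self):
            released = None
            if self._ticked and self.stack:
                released = self.stack.pop()            # release next instruction
            self._ticked = False
            return released

    program = ["move_leg_A", "move_leg_B", "move_leg_A"]  # walker steps (illustrative)
    stack = ClockedStack(program)
    for _ in program:
        stack.tick()
        print("released:", stack.tock())
    ```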

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Generating Plans from Proofs: The Interpolation-based Approach to Query Reformulation

    Michael Benedikt ; Julien Leblay ; Balder ten Cate ; Efthymia Tsamoura
    Copyright Year: 2016

    Morgan and Claypool eBooks

    Query reformulation refers to a process of translating a source query—a request for information in some high-level logic-based language—into a target plan that abides by certain interface restrictions. Many practical problems in data management can be seen as instances of the reformulation problem. For example: the problem of translating an SQL query written over a set of base tables into another query written over a set of views; the problem of implementing a query by translating it into a program calling a set of database APIs; and the problem of implementing a query using a collection of web services. In this book we approach query reformulation in a very general setting that encompasses all the problems above, by relating it to a line of research within mathematical logic. For many decades logicians have looked at the problem of converting "implicit definitions" into "explicit definitions," using an approach known as interpolation. We will review the theory of interpolation, and explain its close connection with query reformulation. We will give a detailed look at how the interpolation-based approach is used to generate translations between logic-based queries over different vocabularies, and also how it can be used to go from logic-based queries to programs. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bacterial Sensors: Synthetic Design and Application Principles

    Jan Roelof van der Meer
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Bacterial reporters are live, genetically engineered cells with promising application in bioanalytics. They contain genetic circuitry to produce a cellular sensing element, which detects the target compound and relays the detection to specific synthesis of so-called reporter proteins (the presence or activity of which is easy to quantify). Bioassays with bacterial reporters are a useful complement to chemical analytics because they measure biological responses rather than total chemical concentrations. Simple bacterial reporter assays may also replace more costly chemical methods as a first line sample analysis technique. Recent promising developments integrate bacterial reporter cells with microsystems to produce bacterial biosensors. This lecture presents an in-depth treatment of the synthetic biological design principles of bacterial reporters, the engineering of which started as simple recombinant DNA puzzles, but has now become a more rational approach of choosing and combining sensing, controlling and reporting DNA 'parts'. Several examples of existing bacterial reporter designs and their genetic circuitry will be illustrated. Besides the design principles, the lecture also focuses on the application principles of bacterial reporter assays. A variety of assay formats will be illustrated, and principles of quantification will be dealt with. In addition to this discussion, substantial reference material is supplied in various Annexes. Table of Contents: Short History of the use of Bacteria for Biosensing and Bioreporting / Genetic Engineering Concepts / Measuring with Bioreporters / Epilogue View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Binary Modification: Tools, Techniques and Applications

    Kim Hazelwood
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Dynamic binary modification tools form a software layer between a running application and the underlying operating system, providing the powerful opportunity to inspect and potentially modify every user-level guest application instruction that executes. Toolkits built upon this technology have enabled computer architects to build powerful simulators and emulators for design-space exploration, compiler writers to analyze and debug the code generated by their compilers, software developers to fully explore the features, bottlenecks, and performance of their software, and even end-users to extend the functionality of proprietary software running on their computers. Several dynamic binary modification systems are freely available today that place this power into the hands of the end user. While these systems are quite complex internally, they mask that complexity with an easy-to-learn API that allows a typical user to ramp up fairly quickly and build any of a number of powerful tools. Meanwhile, these tools are robust enough to form the foundation for software products in use today. This book serves as a primer for researchers interested in dynamic binary modification systems, their internal design structure, and the wide range of tools that can be built leveraging these systems. The hands-on examples presented throughout form a solid foundation for designing and constructing more complex tools, with an appreciation for the techniques necessary to make those tools robust and efficient. Meanwhile, the reader will get an appreciation for the internal design of the engines themselves. Table of Contents: Dynamic Binary Modification: Overview / Using a Dynamic Binary Modifier / Program Analysis and Debugging / Active Program Modification / Architectural Exploration / Advanced System Internals / Historical Perspectives / Summary and Observations View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Retrieval Models: Foundations and Relationships

    Thomas Roelleke
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Information Retrieval (IR) models are a core component of IR research and IR systems. The past decade brought a consolidation of the family of IR models, which by 2000 consisted of relatively isolated views on TF-IDF (Term-Frequency times Inverse-Document-Frequency) as the weighting scheme in the vector-space model (VSM), the probabilistic relevance framework (PRF), the binary independence retrieval (BIR) model, BM25 (Best-Match Version 25, the main instantiation of the PRF/BIR), and language modelling (LM). Also, the early 2000s saw the arrival of divergence from randomness (DFR). Regarding intuition and simplicity, though LM is clear from a probabilistic point of view, several people stated: "It is easy to understand TF-IDF and BM25. For LM, however, we understand the math, but we do not fully understand why it works." This book takes a horizontal approach, gathering the foundations of TF-IDF, PRF, BIR, Poisson, BM25, LM, probabilistic inference networks (PINs), and divergence-based models. The aim is to create a consolidated and balanced view on the main models. A particular focus of this book is on the "relationships between models." This includes an overview of the main frameworks (PRF, logical IR, VSM, generalized VSM) and a pairing of TF-IDF with other models. It becomes evident that TF-IDF and LM measure the same thing, namely the dependence (overlap) between document and query. The Poisson probability helps to establish probabilistic, non-heuristic roots for TF-IDF, and the Poisson parameter, average term frequency, is a binding link between several retrieval models and model parameters. Table of Contents: List of Figures / Preface / Acknowledgments / Introduction / Foundations of IR Models / Relationships Between IR Models / Summary & Research Outlook / Bibliography / Author's Biography / Index View full abstract»
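
    For readers who want the TF-IDF baseline in front of them, the sketch below computes one common instantiation (raw term frequency times logarithmic inverse document frequency) over a toy corpus; it is not the specific formulation analysed in the book.

    ```python
    # Minimal TF-IDF sketch using raw term frequency and a logarithmic IDF;
    # one common instantiation of the weighting scheme discussed above.
    import math
    from collections import Counter

    docs = ["the cat sat on the mat",
            "the dog sat on the log",
            "cats and dogs"]
    tokenized = [d.split() for d in docs]
    N = len(tokenized)

    def tf_idf(term, doc_tokens):
        tf = Counter(doc_tokens)[term]                  # raw term frequency
        df = sum(1 for d in tokenized if term in d)     # document frequency
        idf = math.log(N / df) if df else 0.0           # inverse document frequency
        return tf * idf

    # "cat" is rare (high idf); "the" is common (low idf)
    print(tf_idf("cat", tokenized[0]), tf_idf("the", tokenized[0]))
    ```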

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Finite State Machine Datapath Design, Optimization, and Implementation

    Justin Davis ; Robert Reese
    Copyright Year: 2007

    Morgan and Claypool eBooks

    Finite State Machine Datapath Design, Optimization, and Implementation explores the design space of combined FSM/Datapath implementations. The lecture starts by examining performance issues in digital systems such as clock skew and its effect on setup and hold time constraints, and the use of pipelining for increasing system clock frequency. This is followed by definitions for latency and throughput, with associated resource tradeoffs explored in detail through the use of dataflow graphs and scheduling tables applied to examples taken from digital signal processing applications. Also, design issues relating to functionality, interfacing, and performance for different types of memories commonly found in ASICs and FPGAs, such as FIFOs, single-ports, and dual-ports, are examined. Selected design examples are presented in implementation-neutral Verilog code and block diagrams, with associated design files available as downloads for both Altera Quartus and Xilinx Virtex FPGA platforms. A working knowledge of Verilog, logic synthesis, and basic digital design techniques is required. This lecture is suitable as a companion to the synthesis lecture titled Introduction to Logic Synthesis using Verilog HDL. Table of Contents: Calculating Maximum Clock Frequency / Improving Design Performance / Finite State Machine with Datapath (FSMD) Design / Embedded Memory Usage in Finite State Machine with Datapath (FSMD) Designs View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Hardware Malware

    Edgar Weippl ; Christian Krieg ; Adrian Dabrowski ; Heidelinde Hobel
    Copyright Year: 2013

    Morgan and Claypool eBooks

    In our digital world, integrated circuits are present in nearly every moment of our daily life. Even when using the coffee machine in the morning, or driving our car to work, we interact with integrated circuits. The increasing spread of information technology in virtually all areas of life in the industrialized world offers a broad range of attack vectors. So far, mainly software-based attacks have been considered and investigated, while hardware-based attacks have attracted comparatively little interest. The design and production process of integrated circuits is mostly decentralized due to financial and logistical reasons. Therefore, a high level of trust has to be established between the parties involved in the hardware development lifecycle. During the complex production chain, malicious attackers can insert non-specified functionality by exploiting untrusted processes and backdoors. This work deals with the ways in which such hidden, non-specified functionality can be introduced into hardware systems. After briefly outlining the development and production process of hardware systems, we systematically describe a new type of threat, the hardware Trojan. We provide a historical overview of the development of research activities in this field to show the growing interest of international research in this topic. Current work is considered in more detail. We discuss the components that make up a hardware Trojan as well as the parameters that are relevant for an attack. Furthermore, we describe current approaches for detecting, localizing, and avoiding hardware Trojans to combat them effectively. Moreover, this work develops a comprehensive taxonomy of countermeasures and explains in detail how specific problems are solved. In a final step, we provide an overview of related work and offer an outlook on further research in this field. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Aaron Swartz's The Programmable Web: An Unfinished Work

    Aaron Swartz
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This short work is the first draft of a book manuscript by Aaron Swartz written for the series "Synthesis Lectures on the Semantic Web" at the invitation of its editor, James Hendler. Unfortunately, the book wasn't completed before Aaron's death in January 2013. As a tribute, the editor and publisher are publishing the work digitally without cost. From the author's introduction: " . . . we will begin by trying to understand the architecture of the Web -- what it got right and, occasionally, what it got wrong, but most importantly why it is the way it is. We will learn how it allows both users and search engines to co-exist peacefully while supporting everything from photo-sharing to financial transactions. We will continue by considering what it means to build a program on top of the Web -- how to write software that both fairly serves its immediate users as well as the developers who want to build on top of it. Too often, an API is bolted on top of an existing application, as an afterthought or a completely separate piece. But, as we'll see, when a web application is designed properly, APIs naturally grow out of it and require little effort to maintain. Then we'll look into what it means for your application to be not just another tool for people and software to use, but part of the ecology -- a section of the programmable web. This means exposing your data to be queried and copied and integrated, even without explicit permission, into the larger software ecosystem, while protecting users' freedom. Finally, we'll close with a discussion of that much-maligned phrase, 'the Semantic Web,' and try to understand what it would really mean." View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Background Subtraction: Theory and Practice

    Ahmed Elgammal
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Background subtraction is a widely used concept for detection of moving objects in videos. In the last two decades there has been a lot of development in designing algorithms for background subtraction, as well as wide use of these algorithms in various important applications, such as visual surveillance, sports video analysis, motion capture, etc. Various statistical approaches have been proposed to model scene backgrounds. The concept of background subtraction also has been extended to detect objects from videos captured from moving cameras. This book reviews the concept and practice of background subtraction. We discuss several traditional statistical background subtraction models, including the widely used parametric Gaussian mixture models and non-parametric models. We also discuss the issue of shadow suppression, which is essential for human motion analysis applications. The book discusses approaches and tradeoffs for background maintenance, and also reviews many of the recent developments in the background subtraction paradigm. Recent advances in developing algorithms for background subtraction from moving cameras are described, including motion-compensation-based approaches and motion-segmentation-based approaches. View full abstract»
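
    A minimal running-average background model with a fixed threshold, far simpler than the Gaussian mixture models surveyed in the book, is enough to show the basic detection loop; the sketch below assumes grayscale frames given as NumPy arrays and invented parameter values.

    ```python
    # Minimal running-average background subtraction with a fixed threshold.
    # Much simpler than the statistical models surveyed in the book, but it
    # shows the basic loop. Frames are grayscale NumPy arrays.
    import numpy as np

    def foreground_masks(frames, alpha=0.05, threshold=25.0):
        background = frames[0].astype(np.float64)      # initialize from first frame
        masks = []
        for frame in frames[1:]:
            frame = frame.astype(np.float64)
            diff = np.abs(frame - background)
            masks.append(diff > threshold)             # True where a pixel changed
            # update the background only where the scene looks static
            background = np.where(diff <= threshold,
                                  (1 - alpha) * background + alpha * frame,
                                  background)
        return masks

    # tiny synthetic example: a bright "object" enters the second frame
    f0 = np.zeros((4, 4))
    f1 = f0.copy()
    f1[1:3, 1:3] = 200
    print(foreground_masks([f0, f1])[0].astype(int))
    ```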

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Camera Networks: The Acquisition and Analysis of Videos over Wide Areas

    Amit Roy-Chowdhury ; Bi Song
    Copyright Year: 2012

    Morgan and Claypool eBooks

    As networks of video cameras are installed in many applications like security and surveillance, environmental monitoring, disaster response, and assisted living facilities, among others, image understanding in camera networks is becoming an important area of research and technology development. There are many challenges that need to be addressed in the process. Some of them are listed below: traditional computer vision challenges in tracking and recognition, robustness to pose, illumination, occlusion, clutter, and recognition of objects and activities; aggregating local information for wide area scene understanding, like obtaining stable, long-term tracks of objects; positioning of the cameras and dynamic control of pan-tilt-zoom (PTZ) cameras for optimal sensing; distributed processing and scene analysis algorithms; and resource constraints imposed by different applications like security and surveillance, environmental monitoring, disaster response, and assisted living facilities. In this book, we focus on the basic research problems in camera networks, review the current state-of-the-art and present a detailed description of some of the recently developed methodologies. The major underlying theme in all the work presented is to take a network-centric view whereby the overall decisions are made at the network level. This is sometimes achieved by accumulating all the data at a central server, while at other times by exchanging decisions made by individual cameras based on their locally sensed data. Chapter One starts with an overview of the problems in camera networks and the major research directions. Some of the currently available experimental testbeds are also discussed here. One of the fundamental tasks in the analysis of dynamic scenes is to track objects. Since camera networks cover a large area, the systems need to be able to track over such wide areas where there could be both overlapping and non-overlapping fields of view of the cameras, as addressed in Chapter Two. Distributed processing is another challenge in camera networks, and recent methods have shown how to do tracking, pose estimation and calibration in a distributed environment. Consensus algorithms that enable these tasks are described in Chapter Three. Chapter Four summarizes a few approaches on object and activity recognition in both distributed and centralized camera network environments. All these methods have focused primarily on the analysis side given that images are being obtained by the cameras. Efficient utilization of such networks often calls for active sensing, whereby the acquisition and analysis phases are closely linked. We discuss this issue in detail in Chapter Five and show how collaborative and opportunistic sensing in a camera network can be achieved. Finally, Chapter Six concludes the book by highlighting the major directions for future research. Table of Contents: An Introduction to Camera Networks / Wide-Area Tracking / Distributed Processing in Camera Networks / Object and Activity Recognition / Active Sensing / Future Research Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quorum Systems: With Applications to Storage and Consensus

    Marko Vukolic
    Copyright Year: 2012

    Morgan and Claypool eBooks

    A quorum system is a collection of subsets of nodes, called quorums, with the property that each pair of quorums has a non-empty intersection. Quorum systems are the key mathematical abstraction for ensuring consistency in fault-tolerant and highly available distributed computing. Critical for many applications since the early days of distributed computing, quorum systems have evolved from simple majorities of a set of processes to complex hierarchical collections of sets, tailored for general adversarial structures. The initial non-empty intersection property has been refined many times to account for, e.g., a stronger (Byzantine) adversarial model, latency considerations or better availability. This monograph is an overview of the evolution and refinement of quorum systems, with emphasis on their role in two fundamental applications: distributed read/write storage and consensus. Table of Contents: Introduction / Preliminaries / Classical Quorum Systems / Classical Quorum-Based Emulations / Byzantine Quorum Systems / Latency-efficient Quorum Systems / Probabilistic Quorum Systems View full abstract»
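
    The defining pairwise-intersection property can be checked directly on the classic majority construction; the sketch below enumerates all majority quorums over five nodes and verifies that every pair intersects (an illustrative check, not one of the constructions analysed in the monograph).

    ```python
    # Check the defining property of a quorum system -- every pair of quorums
    # intersects -- on the classic majority construction over five nodes.
    from itertools import combinations

    nodes = range(5)
    quorum_size = len(nodes) // 2 + 1                  # simple majority: 3 of 5
    quorums = [set(q) for q in combinations(nodes, quorum_size)]

    # any two majority quorums must share at least one node (3 + 3 > 5)
    pairwise = all(q1 & q2 for q1, q2 in combinations(quorums, 2))
    print(len(quorums), "quorums; pairwise intersection holds:", pairwise)
    ```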

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Patient-Centered Design of Cognitive Assistive Technology for Traumatic Brain Injury Telerehabilitation

    Elliot Cole
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Computer software has been productive in helping individuals with cognitive disabilities. Personalizing the user interface is an important strategy in designing software for these users, because of the barriers created by conventional user interfaces for the cognitively disabled. Cognitive assistive technology (CAT) has typically been used to provide help with everyday activities, outside of cognitive rehabilitation therapy. This book describes a quarter century of computing R&D at the Institute for Cognitive Prosthetics, focusing on the needs of individuals with cognitive disabilities from brain injury. Models and methods from Human Computer Interaction (HCI) have been particularly valuable, initially in illuminating those needs. Subsequently HCI methods have expanded CAT to be powerful rehabilitation therapy tools, restoring some damaged cognitive abilities which have resisted conventional therapy. Patient-Centered Design (PCD) emerged as a design methodology which incorporates both clinical and technical factors. PCD also takes advantage of the patient's ability to redesign and refine the user interface, and to achieve a very good fit between user and system. Cognitive Prosthetics Telerehabilitation is a powerful therapy modality. Essential characteristics are delivering service to patients in their own home, having the patient's priority activities be the focus of therapy, using cognitive prosthetic software which applies Patient Centered Design, and videoconferencing with a workspace shared between therapist and patient. Cognitive Prosthetics Telerehabilitation has a rich set of advantages for the many stakeholders involved with brain injury rehabilitation. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introductory Medical Imaging

    Anil Bharath
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book provides an introduction to the principles of several of the more widely used methods in medical imaging. Intended for engineering students, it provides a final-year undergraduate- or graduate-level introduction to several imaging modalities, including MRI, ultrasound, and X-Ray CT. The emphasis of the text is on mathematical models for imaging and image reconstruction physics. Emphasis is also given to sources of imaging artefacts. Such topics are usually not addressed across the different imaging modalities in one book, and this is a notable strength of the treatment given here. Table of Contents: Introduction / Diagnostic X-Ray Imaging / X-Ray CT / Ultrasonics / Pulse-Echo Ultrasonic Imaging / Doppler Velocimetry / An Introduction to MRI View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Trust in Social Media

    Jiliang Tang ; Huan Liu
    Copyright Year: 2015

    Morgan and Claypool eBooks

    Social media greatly enables people to participate in online activities and shatters the barrier for online users to create and share information at any place at any time. However, the explosion of user-generated content poses novel challenges for online users to find relevant information, or, View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Embedded System Design with the Atmel AVR Microcontroller: Part I

    Steven Barrett
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers an advanced treatment of the Atmel AVR microcontroller. This book is intended as a follow-on to a previously published book, titled Atmel AVR Microcontroller Primer: Programming and Interfacing. Some of the content from this earlier text is retained for completeness. This book will emphasize advanced programming and interfacing skills. We focus on system level design consisting of several interacting microcontroller subsystems. The first chapter discusses the system design process. Our approach is to provide the skills to quickly get up to speed to operate the internationally popular Atmel AVR microcontroller line by developing systems level design skills. We use the Atmel ATmega164 as a representative sample of the AVR line. The knowledge you gain on this microcontroller can be easily translated to every other microcontroller in the AVR line. In succeeding chapters, we cover the main subsystems aboard the microcontroller, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying software for the subsystem. We then provide advanced examples exercising some of the features discussed. In all examples, we use the C programming language. The code provided can be readily adapted to the wide variety of compilers available for the Atmel AVR microcontroller line. We also include a chapter describing how to interface the microcontroller to a wide variety of input and output devices. The book concludes with several detailed system level design examples employing the Atmel AVR microcontroller. Table of Contents: Embedded Systems Design / Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog to Digital Conversion (ADC) / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / System Level Design View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Biomedical Engineering: Biomechanics and Bioelectricity, Part II

    Douglas A. Christensen
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Intended as an introduction to the field of biomedical engineering, this book covers the topics of biomechanics (Part I) and bioelectricity (Part II). Each chapter emphasizes a fundamental principle or law, such as Darcy's Law, Poiseuille's Law, Hooke's Law, Starling's Law, levers, and work in the area of fluid, solid, and cardiovascular biomechanics. In addition, electrical laws and analysis tools are introduced, including Ohm's Law, Kirchhoff's Laws, Coulomb's Law, capacitors, and the fluid/electrical analogy. Culminating the electrical portion are chapters covering Nernst and membrane potentials and Fourier transforms. Examples are solved throughout the book and problems with answers are given at the end of each chapter. A semester-long Major Project that models the human systemic cardiovascular system, utilizing both a Matlab numerical simulation and an electrical analog circuit, ties many of the book's concepts together. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Metaphor: A Computational Perspective

    Tony Veale ; Ekaterina Shutova ; Beata Beigman Klebanov
    Copyright Year: 2016

    Morgan and Claypool eBooks

    The literary imagination may take flight on the wings of metaphor, but hard-headed scientists are just as likely as doe-eyed poets to reach for a metaphor when the descriptive need arises. Metaphor is a pervasive aspect of every genre of text and every register of speech, and is as useful for describing the inner workings of a "black hole" (itself a metaphor) as it is the affairs of the human heart. The ubiquity of metaphor in natural language thus poses a significant challenge for Natural Language Processing (NLP) systems and their builders, who cannot afford to wait until the problems of literal language have been solved before turning their attention to figurative phenomena. This book offers a comprehensive approach to the computational treatment of metaphor and its figurative brethren—including simile, analogy, and conceptual blending—that does not shy away from their important cognitive and philosophical dimensions. Veale, Shutova, and Beigman Klebanov approach metaphor from multiple computational perspectives, providing coverage of both symbolic and statistical approaches to interpretation and paraphrase generation, while also considering key contributions from philosophy on what constitutes the "meaning" of a metaphor. This book also surveys available metaphor corpora and discusses protocols for metaphor annotation. Any reader with an interest in metaphor, from beginning researchers to seasoned scholars, will find this book to be an invaluable guide to what is a fascinating linguistic phenomenon. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Integral Equation Methods for Electromagnetic and Elastic Waves

    Weng Chew ; Mei-Song Tong ; Bin Hu
    Copyright Year: 2008

    Morgan and Claypool eBooks

    Integral Equation Methods for Electromagnetic and Elastic Waves is an outgrowth of several years of work. There have been no recent books on integral equation methods. There are books written on integral equations, but either they have been around for a while, or they were written by mathematicians. Much of the knowledge in integral equation methods still resides in journal papers. With this book, important relevant knowledge for integral equations is consolidated in one place and researchers need only read the pertinent chapters in this book to gain important knowledge needed for integral equation research. Also, learning the fundamentals of linear elastic wave theory does not require a quantum leap for electromagnetic practitioners. Integral equation methods have been around for several decades, and their introduction to electromagnetics has been due to the seminal works of Richmond and Harrington in the 1960s. There was a surge in the interest in this topic in the 1980s (notably the work of Wilton and his coworkers) due to increased computing power. The interest in this area was on the wane when it was demonstrated that differential equation methods, with their sparse matrices, can solve many problems more efficiently than integral equation methods. Recently, due to the advent of fast algorithms, there has been a revival in integral equation methods in electromagnetics. Much of our work in recent years has been in fast algorithms for integral equations, which prompted our interest in integral equation methods. While previously, only tens of thousands of unknowns could be solved by integral equation methods, now, tens of millions of unknowns can be solved with fast algorithms. This has prompted new enthusiasm in integral equation methods. Table of Contents: Introduction to Computational Electromagnetics / Linear Vector Space, Reciprocity, and Energy Conservation / Introduction to Integral Equations / Integral Equations for Penetrable Objects / Low-Frequency Problems in Integral Equations / Dyadic Green's Function for Layered Media and Integral Equations / Fast Inhomogeneous Plane Wave Algorithm for Layered Media / Electromagnetic Wave versus Elastic Wave / Glossary of Acronyms View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Practical Guide to Testing Wireless Smartphone Applications

    Julian Harty
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Testing applications for mobile phones is difficult, time-consuming, and hard to do effectively. Many people have limited their testing efforts to hands-on testing of an application on a few physical handsets, and they have to repeat the process every time a new version of the software is ready to test. They may miss many of the permutations of real-world use, and as a consequence their users are left with the unpleasant mess of a failing application on their phone. Test automation can help to increase the range and scope of testing, while reducing the overhead of manual testing of each version of the software. However automation is not a panacea, particularly for mobile applications, so we need to pick our test automation challenges wisely. This book is intended to help software and test engineers pick appropriately so they can achieve more and, as a consequence, deliver better-quality, working software to users. This Synthesis lecture provides practical advice based on direct experience of using software test automation to help improve the testing of a wide range of mobile phone applications, including the latest AJAX applications. The focus is on applications that rely on a wireless network connection to a remote server; however, the principles may apply to other related fields and applications. We start by explaining terms and some of the key challenges involved in testing smartphone applications. Subsequent chapters each describe a type of application (e.g., markup, AJAX, client), followed by a related chapter on how to test each of these applications. Common test automation techniques are covered in a separate chapter, and finally there is a brief chapter on when to test manually. The book also contains numerous pointers and links to further material to help you to improve your testing using automation appropriately. Table of Contents: Introduction / Markup Languages / Testing Techniques for Markup Applications / AJAX Mobile Applications / Testing Mobile AJAX Applications / Client Applications / Testing Techniques for Client Applications / Common Techniques / When to Test Manually / Future Work / Appendix A: Links and References / Appendix B: Data Connectivity / Appendix C: Configuring Your Machine View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multi-Pitch Estimation

    Mads Christensen ; Andreas Jakobsson
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Periodic signals can be decomposed into sets of sinusoids having frequencies that are integer multiples of a fundamental frequency. The problem of finding such fundamental frequencies from noisy observations is important in many speech and audio applications, where it is commonly referred to as pitch estimation. These applications include analysis, compression, separation, enhancement, automatic transcription and many more. In this book, an introduction to pitch estimation is given and a number of statistical methods for pitch estimation are presented. The basic signal models and associated estimation theoretical bounds are introduced, and the properties of speech and audio signals are discussed and illustrated. The presented methods include both single- and multi-pitch estimators based on statistical approaches, like maximum likelihood and maximum a posteriori methods, filtering methods based on both static and optimal adaptive designs, and subspace methods based on the principles of subspace orthogonality and shift-invariance. The application of these methods to analysis of speech and audio signals is demonstrated using both real and synthetic signals, and their performance is assessed under various conditions and their properties discussed. Finally, the estimators are compared in terms of computational and statistical efficiency, generalizability and robustness. Table of Contents: Fundamentals / Statistical Methods / Filtering Methods / Subspace Methods / Amplitude Estimation View full abstract»
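
    To make the harmonic signal model concrete, the sketch below estimates a single fundamental frequency by harmonic summation over an FFT magnitude spectrum of a synthetic signal; it is a deliberately simplified stand-in for the statistical and filtering estimators developed in the book, with all parameters chosen for illustration.

    ```python
    # Single-pitch estimation by harmonic summation: score each candidate
    # fundamental by the summed spectral magnitude at its first few integer
    # harmonics. A simplified stand-in for the book's statistical estimators.
    import numpy as np

    np.random.seed(0)
    fs = 8000.0
    t = np.arange(0, 0.5, 1 / fs)
    f0_true = 220.0
    # synthetic periodic signal: three harmonics plus a little noise
    x = sum(np.sin(2 * np.pi * f0_true * k * t) / k for k in (1, 2, 3))
    x += 0.05 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(x * np.hanning(x.size)))
    freqs = np.fft.rfftfreq(x.size, 1 / fs)

    def harmonic_score(f0, n_harmonics=3):
        idx = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harmonics + 1)]
        return spectrum[idx].sum()

    candidates = np.arange(80.0, 500.0, 1.0)
    estimate = candidates[np.argmax([harmonic_score(f) for f in candidates])]
    print(f"estimated f0 ~ {estimate:.1f} Hz (true {f0_true} Hz)")
    ```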

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Metric Learning

    Aurelien Bellet ; Amaury Habrard ; Marc Sebban
    Copyright Year: 2015

    Morgan and Claypool eBooks

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learning literature that covers algorithms, theory and applications for both numerical and structured data. We first introduce relevant definitions and classic metric functions, as well as examples of their use in machine learning and data mining. We then review a wide range of metric learning algorithms, starting with the simple setting of linear distance and similarity learning. We show how one may scale up these methods to very large amounts of training data. To go beyond the linear case, we discuss methods that learn nonlinear metrics or multiple linear metrics throughout the feature space, and review methods for more complex settings such as multi-task and semi-supervised learning. Although most of the existing work has focused on numerical data, we cover the literature on metric learning for structured data like strings, trees, graphs and time series. In the more technical part of the book, we present some recent statistical frameworks for analyzing the generalization performance in metric learning and derive results for some of the algorithms presented earlier. Finally, we illustrate the relevance of metric learning in real-world problems through a series of successful applications to computer vision, bioinformatics and information retrieval. View full abstract»
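
    The linear metrics discussed in the simple setting are typically Mahalanobis-type distances of the form d_M(x, y) = sqrt((x - y)^T M (x - y)) with M positive semidefinite; the sketch below only evaluates such a metric for a fixed, hand-chosen M rather than learning it from data.

    ```python
    # A linear (Mahalanobis-type) metric d_M(x, y) = sqrt((x - y)^T M (x - y)),
    # the object learned in the "simple setting" mentioned above. Here M is a
    # fixed, hand-chosen positive semidefinite matrix, not one learned from data.
    import numpy as np

    def mahalanobis(x, y, M):
        d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
        return float(np.sqrt(d @ M @ d))

    # building M as L^T L guarantees positive semidefiniteness
    L = np.array([[2.0, 0.0],     # stretch the first feature
                  [0.0, 0.5]])    # shrink the second feature
    M = L.T @ L

    print(mahalanobis([1.0, 0.0], [0.0, 0.0], M))   # 2.0: first axis counts more
    print(mahalanobis([0.0, 1.0], [0.0, 0.0], M))   # 0.5: second axis counts less
    ```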

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Globalization, Engineering, and Creativity

    John Reader
    Copyright Year: 2006

    Morgan and Claypool eBooks

    The text addresses the impact of globalization within engineering, particularly on working practices and prospects for creativity. It suggests that accepted norms of economic activity create enclosures and thresholds within the profession, which—as engineers increase their awareness (reflexivity)—will shape the future of engineering, and the values which underpin it. It is aimed at practicing engineers and those in training and is an introduction to the social and political context currently setting new challenges for the profession. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mining Heterogeneous Information Networks: Principles and Methodologies

    Yizhou Sun ; Jiawei Han
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Real-world physical and abstract data objects are interconnected, forming gigantic, interconnected networks. By structuring these data objects and interactions between these objects into multiple types, such networks become semi-structured heterogeneous information networks. Most real-world applications that handle big data, including interconnected social media and social networks, scientific, engineering, or medical information systems, online e-commerce systems, and most database systems, can be structured into heterogeneous information networks. Therefore, effective analysis of large-scale heterogeneous information networks poses an interesting but critical challenge. In this book, we investigate the principles and methodologies of mining heterogeneous information networks. Departing from many existing network models that view interconnected data as homogeneous graphs or networks, our semi-structured heterogeneous information network model leverages the rich semantics of typed nodes and links in a network and uncovers surprisingly rich knowledge from the network. This semi-structured heterogeneous network modeling leads to a series of new principles and powerful methodologies for mining interconnected data, including: (1) rank-based clustering and classification; (2) meta-path-based similarity search and mining; (3) relation strength-aware mining, and many other potential developments. This book introduces this new research frontier and points out some promising research directions. Table of Contents: Introduction / Ranking-Based Clustering / Classification of Heterogeneous Information Networks / Meta-Path-Based Similarity Search / Meta-Path-Based Relationship Prediction / Relation Strength-Aware Clustering with Incomplete Attributes / User-Guided Clustering via Meta-Path Selection / Research Frontiers View full abstract»
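
    As a hedged illustration of meta-path-based similarity in the spirit of the methods described above, the sketch below computes a PathSim-style score along a symmetric Author-Paper-Author meta-path; the toy incidence counts are invented.

    ```python
    # Hedged sketch of meta-path-based similarity: for a symmetric meta-path such
    # as Author-Paper-Author, similarity between two nodes is the number of path
    # instances connecting them, normalized by each node's path instances to itself.
    import numpy as np

    W = np.array([[2, 1, 0],     # author-paper incidence counts (toy data)
                  [1, 1, 0],     # rows: authors, cols: papers
                  [0, 0, 3]])
    M = W @ W.T                  # commuting matrix for Author-Paper-Author

    def pathsim(i, j):
        return 2.0 * M[i, j] / (M[i, i] + M[j, j])

    print(pathsim(0, 1))   # authors 0 and 1 share papers -> high similarity
    print(pathsim(0, 2))   # no shared papers -> 0.0
    ```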

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multimedia Information Retrieval

    Stefan Rueger
    Copyright Year: 2009

    Morgan and Claypool eBooks

    At its very core multimedia information retrieval means the process of searching for and finding multimedia documents; the corresponding research field is concerned with building the best possible multimedia search engines. The intriguing bit here is that the query itself can be a multimedia excerpt: For example, when you walk around in an unknown place and stumble across an interesting landmark, would it not be great if you could just take a picture with your mobile phone and send it to a service that finds a similar picture in a database and tells you more about the building -- and about its significance, for that matter? This book goes further by examining the full matrix of a variety of query modes versus document types. How do you retrieve a music piece by humming? What if you want to find news video clips on forest fires using a still image? The text discusses underlying techniques and common approaches to facilitate multimedia search engines from metadata driven retrieval, via piggy-back text retrieval where automated processes create text surrogates for multimedia, automated image annotation and content-based retrieval. The latter is studied in great depth looking at features and distances, and how to effectively combine them for efficient retrieval, to a point where the readers have the ingredients and recipe in their hands for building their own multimedia search engines. Supporting users in their resource discovery mission when hunting for multimedia material is not a technological indexing problem alone. We look at interactive ways of engaging with repositories through browsing and relevance feedback, roping in geographical context, and providing visual summaries for videos. The book concludes with an overview of state-of-the-art research projects in the area of multimedia information retrieval, which gives an indication of the research and development trends and, thereby, a glimpse of the future world. Table of Contents: What is Multimedia Information Retrieval? / Basic Multimedia Search Technologies / Content-based Retrieval in Depth / Added Services / Multimedia Information Retrieval Research / Summary View full abstract»
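
    A minimal, assumption-laden sketch of content-based retrieval as described above: each item is reduced to a feature vector (here a toy brightness histogram) and the collection is ranked by distance to the query's features. Real systems combine many features and distances; everything below is illustrative.

    ```python
    # Hedged sketch of content-based retrieval: feature extraction + distance ranking.
    import numpy as np

    def histogram_feature(image, bins=8):
        h, _ = np.histogram(image, bins=bins, range=(0.0, 1.0), density=True)
        return h

    rng = np.random.default_rng(1)
    means = [0.2, 0.4, 0.5, 0.7, 0.9]               # toy "images" of different brightness
    collection = [np.clip(rng.normal(m, 0.05, (32, 32)), 0, 1) for m in means]
    query = np.clip(rng.normal(0.7, 0.05, (32, 32)), 0, 1)

    q = histogram_feature(query)
    dists = [np.abs(histogram_feature(img) - q).sum() for img in collection]
    print(int(np.argmin(dists)))                    # 3: the image with brightness 0.7
    ```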

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automated Grammatical Error Detection for Language Learners, Second Edition

    Claudia Leacock ; Martin Chodorow ; Michael Gamon ; Joel Tetreault
    Copyright Year: 2014

    Morgan and Claypool eBooks

    It has been estimated that over a billion people are using or learning English as a second or foreign language, and the numbers are growing not only for English but for other languages as well. These language learners provide a burgeoning market for tools that help identify and correct learners' writing errors. Unfortunately, the errors targeted by typical commercial proofreading tools do not include those aspects of a second language that are hardest to learn. This volume describes the types of constructions English language learners find most difficult: constructions containing prepositions, articles, and collocations. It provides an overview of the automated approaches that have been developed to identify and correct these and other classes of learner errors in a number of languages. Error annotation and system evaluation are particularly important topics in grammatical error detection because there are no commonly accepted standards. Chapters in the book describe the options available to researchers, recommend best practices for reporting results, and present annotation and evaluation schemes. The final chapters explore recent innovative work that opens new directions for research. It is the authors' hope that this volume will continue to contribute to the growing interest in grammatical error detection by encouraging researchers to take a closer look at the field and its many challenging problems. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Exploratory Search:Beyond the Query-Response Paradigm

    Ryen White ; Resa Roth
    Copyright Year: 2009

    Morgan and Claypool eBooks

    As information becomes more ubiquitous and the demands that searchers have on search systems grow, there is a need to support search behaviors beyond simple lookup. Information seeking is the process or activity of attempting to obtain information in both human and technological contexts. Exploratory search describes an information-seeking problem context that is open-ended, persistent, and multifaceted, and information-seeking processes that are opportunistic, iterative, and multitactical. Exploratory searchers aim to solve complex problems and develop enhanced mental capacities. Exploratory search systems support this through symbiotic human-machine relationships that provide guidance in exploring unfamiliar information landscapes. Exploratory search has gained prominence in recent years. There is an increased interest from the information retrieval, information science, and human-computer interaction communities in moving beyond the traditional turn-taking interaction model supported by major Web search engines, and toward support for human intelligence amplification and information use. In this lecture, we introduce exploratory search, relate it to relevant extant research, outline the features of exploratory search systems, discuss the evaluation of these systems, and suggest some future directions for supporting exploratory search. Exploratory search is a new frontier in the search domain and is becoming increasingly important in shaping our future world. Table of Contents: Introduction / Defining Exploratory Search / Related Work / Features of Exploratory Search Systems / Evaluation of Exploratory Search Systems / Future Directions and Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Atmel AVR Microcontroller Primer:Programming and Interfacing

    Steven F. Barrett ; Daniel J. Pack
    Copyright Year: 2007

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers a primer on the Atmel AVR microcontroller. Our approach is to provide the fundamental skills to quickly get up and operating with this internationally popular microcontroller. The Atmel ATmega16 is used as a representative sample of the AVR line. The knowledge you gain on the ATmega16 can be easily translated to every other microcontroller in the AVR line. We cover the main subsystems aboard the ATmega16, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying hardware and software to exercise the subsystem. In all examples, we use the C programming language. We conclude with a detailed chapter describing how to interface the microcontroller to a wide variety of input and output devices. Table of Contents: Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog-to-Digital Conversion / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / ATmega16 Register Set / ATmega16 Header File View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Case-Based Reasoning:A Concise Introduction

    Beatriz Lopez
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Case-based reasoning is a methodology with a long tradition in artificial intelligence that brings together reasoning and machine learning techniques to solve problems based on past experiences or cases. Given a problem to be solved, reasoning involves the use of methods to retrieve similar past cases in order to reuse their solution for the problem at hand. Once the problem has been solved, learning methods can be applied to improve the knowledge based on past experiences. In spite of being a broad methodology applied in industry and services, case-based reasoning has often been forgotten in both artificial intelligence and machine learning books. The aim of this book is to present a concise introduction to case-based reasoning providing the essential building blocks for the design of case-based reasoning systems, as well as to bring together the main research lines in this field to encourage students to solve current CBR challenges. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Statistical Language Models for Information Retrieval

    Chengxiang Zhai
    Copyright Year: 2009

    Morgan and Claypool eBooks

    As online information grows dramatically, search engines such as Google are playing a more and more important role in our lives. Critical to all search engines is the problem of designing an effective retrieval model that can rank documents accurately for a given query. This has been a central research problem in information retrieval for several decades. In the past ten years, a new generation of retrieval models, often referred to as statistical language models, has been successfully applied to solve many different information retrieval problems. Compared with the traditional models such as the vector space model, these new models have a more sound statistical foundation and can leverage statistical estimation to optimize retrieval parameters. They can also be more easily adapted to model non-traditional and complex retrieval problems. Empirically, they tend to achieve comparable or better performance than a traditional model with less effort on parameter tuning. This book systematically reviews the large body of literature on applying statistical language models to information retrieval with an emphasis on the underlying principles, empirically effective language models, and language models developed for non-traditional retrieval tasks. All the relevant literature has been synthesized to make it easy for a reader to digest the research progress achieved so far and see the frontier of research in this area. The book also offers practitioners an informative introduction to a set of practically useful language models that can effectively solve a variety of retrieval problems. No prior knowledge about information retrieval is required, but some basic knowledge about probability and statistics would be useful for fully digesting all the details. Table of Contents: Introduction / Overview of Information Retrieval Models / Simple Query Likelihood Retrieval Model / Complex Query Likelihood Model / Probabilistic Distance Retrieval Model / Language Models for Special Retrieval Tasks / Language Models for Latent Topic Analysis / Conclusions View full abstract»
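
    A hedged sketch of one empirically effective model in this family, the query-likelihood model with Dirichlet prior smoothing; the toy corpus and the smoothing parameter mu are invented for illustration.

    ```python
    # Hedged sketch: rank documents by log p(q|d), where
    # p(w|d) = (c(w; d) + mu * p(w|C)) / (|d| + mu).
    import math
    from collections import Counter

    docs = {"d1": "the cat sat on the mat".split(),
            "d2": "dogs and cats are pets".split()}
    collection = Counter(w for d in docs.values() for w in d)
    N = sum(collection.values())
    mu = 100.0

    def score(query, doc):
        counts, dlen = Counter(doc), len(doc)
        s = 0.0
        for w in query:
            p_c = collection[w] / N                    # collection language model
            p_d = (counts[w] + mu * p_c) / (dlen + mu) # smoothed document model
            s += math.log(p_d) if p_d > 0 else float("-inf")
        return s

    query = "cat mat".split()
    print(sorted(docs, key=lambda d: score(query, docs[d]), reverse=True))  # d1 first
    ```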

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Incentive-Centric Semantic Web Application Engineering

    Elena Simperl ; Roberta Cuel ; Martin Stein
    Copyright Year: 2013

    Morgan and Claypool eBooks

    While many Web 2.0-inspired approaches to semantic content authoring do acknowledge motivation and incentives as the main drivers of user involvement, the amount of useful human contributions actually available will always remain a scarce resource. Complementarily, there are aspects of semantic content authoring in which automatic techniques have proven to perform reliably, and the added value of human (and collective) intelligence is often a question of cost and timing. The challenge that this book attempts to tackle is how these two approaches (machine- and human-driven computation) could be combined in order to improve the cost-performance ratio of creating, managing, and meaningfully using semantic content. To do so, we need to first understand how theories and practices from social sciences and economics about user behavior and incentives could be applied to semantic content authoring. We will introduce a methodology to help software designers to embed incentives-minded functionalities into semantic applications, as well as best practices and guidelines. We will present several examples of such applications, addressing tasks such as ontology management, media annotation, and information extraction, which have been built with these considerations in mind. These examples illustrate key design issues of incentivized Semantic Web applications that might have a significant effect on the success and sustainable development of the applications: the suitability of the task and knowledge domain to the intended audience, and the mechanisms set up to ensure high-quality contributions and extensive user involvement. Table of Contents: Semantic Data Management: A Human-driven Process / Fundamentals of Motivation and Incentives / Case Study: Motivating Employees to Annotate Content / Case Study: Building a Community of Practice Around Web Service Management and Annotation / Case Study: Games with a Purpose for Semantic Content Creation / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Acoustical Impulse Response Functions of Music Performance Halls

    Douglas Frey ; Rangaraj Rangayyan ; Victor Coelho
    Copyright Year: 2013

    Morgan and Claypool eBooks

    Digital measurement of the analog acoustical parameters of a music performance hall is difficult. The aim of such work is to create a digital acoustical derivation that is an accurate numerical representation of the complex analog characteristics of the hall. The present study describes the exponential sine sweep (ESS) measurement process in the derivation of an acoustical impulse response function (AIRF) of three music performance halls in Canada. It examines specific difficulties of the process, such as preventing the external effects of the measurement transducers from corrupting the derivation, and provides solutions, such as the use of filtering techniques in order to remove such unwanted effects. In addition, the book presents a novel method of numerical verification through mean-squared error (MSE) analysis in order to determine how accurately the derived AIRF represents the acoustical behavior of the actual hall. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Ontology-Based Interpretation of Natural Language

    Philipp Cimiano ; Christina Unger ; John McCrae
    Copyright Year: 2014

    Morgan and Claypool eBooks

    For humans, understanding a natural language sentence or discourse is so effortless that we hardly ever think about it. For machines, however, the task of interpreting natural language, especially grasping meaning beyond the literal content, has proven extremely difficult and requires a large amount of background knowledge. This book focuses on the interpretation of natural language with respect to specific domain knowledge captured in ontologies. The main contribution is an approach that puts ontologies at the center of the interpretation process. This means that ontologies not only provide a formalization of domain knowledge necessary for interpretation but also support and guide the construction of meaning representations. We start with an introduction to ontologies and demonstrate how linguistic information can be attached to them by means of the ontology lexicon model lemon. These lexica then serve as basis for the automatic generation of grammars, which we use to compositionally construct meaning representations that conform with the vocabulary of an underlying ontology. As a result, the level of representational granularity is not driven by language but by the semantic distinctions made in the underlying ontology and thus by distinctions that are relevant in the context of a particular domain. We highlight some of the challenges involved in the construction of ontology-based meaning representations, and show how ontologies can be exploited for ambiguity resolution and the interpretation of temporal expressions. Finally, we present a question answering system that combines all tools and techniques introduced throughout the book in a real-world application, and sketch how the presented approach can scale to larger, multi-domain scenarios in the context of the Semantic Web. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Transactional Memory

    James R. Larus ; Ravi Rajwar
    Copyright Year: 2007

    Morgan and Claypool eBooks

    The advent of multicore processors has renewed interest in the idea of incorporating transactions into the programming model used to write parallel programs. This approach, known as transactional memory, offers an alternative, and hopefully better, way to coordinate concurrent threads. The ACI (atomicity, consistency, isolation) properties of transactions provide a foundation to ensure that concurrent reads and writes of shared data do not produce inconsistent or incorrect results. At a higher level, a computation wrapped in a transaction executes atomically – either it completes successfully and commits its result in its entirety or it aborts. In addition, isolation ensures the transaction produces the same result as if no other transactions were executing concurrently. Although transactions are not a parallel programming panacea, they shift much of the burden of synchronizing and coordinating parallel computations from a programmer to a compiler, runtime system, and hardware. The challenge for the system implementers is to build an efficient transactional memory infrastructure. This book presents an overview of the state of the art in the design and implementation of transactional memory systems, as of early summer 2006. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Speech Models

    Li Deng
    Copyright Year: 2006

    Morgan and Claypool eBooks

    Speech dynamics refer to the temporal characteristics in all stages of the human speech communication process. This speech “chain” starts with the formation of a linguistic message in a speaker's brain and ends with the arrival of the message in a listener's brain. Given the intricacy of the dynamic speech process and its fundamental importance in human communication, this monograph is intended to provide comprehensive material on mathematical models of speech dynamics and to address the following issues: How do we make sense of the complex speech process in terms of its functional role in speech communication? How do we quantify the special role of speech timing? How do the dynamics relate to the variability of speech that has often been said to seriously hamper automatic speech recognition? How do we put the dynamic process of speech into a quantitative form to enable detailed analyses? And finally, how can we incorporate the knowledge of speech dynamics into computerized speech analysis and recognition algorithms? The answers to all these questions require building and applying computational models for the dynamic speech process. What are the compelling reasons for carrying out dynamic speech modeling? We provide the answer in two related aspects. First, scientific inquiry into the human speech code has been relentlessly pursued for several decades. As an essential carrier of human intelligence and knowledge, speech is the most natural form of human communication. Embedded in the speech code are linguistic (as well as para-linguistic) messages, which are conveyed through four levels of the speech chain. Underlying the robust encoding and transmission of the linguistic messages are the speech dynamics at all the four levels. Mathematical modeling of speech dynamics provides an effective tool in the scientific methods of studying the speech chain. Such scientific studies help understand why humans speak as they do and how humans exploit redundancy and variability by way of multitiered dynamic processes to enhance the efficiency and effectiveness of human speech communication. Second, advancement of human language technology, especially that in automatic recognition of natural-style human speech, is also expected to benefit from comprehensive computational modeling of speech dynamics. The limitations of current speech recognition technology are serious and are well known. A commonly acknowledged and frequently discussed weakness of the statistical model underlying current speech recognition technology is the lack of adequate dynamic modeling schemes to provide correlation structure across the temporal speech observation sequence. Unfortunately, due to a variety of reasons, the majority of current research activities in this area favor only incremental modifications and improvements to the existing HMM-based state-of-the-art. For example, while the dynamic and correlation modeling is known to be an important topic, most of the systems nevertheless employ only an ultra-weak form of speech dynamics; e.g., differential or delta parameters. Strong-form dynamic speech modeling, which is the focus of this monograph, may serve as an ultimate solution to this problem. After the introduction chapter, the main body of this monograph consists of four chapters. They cover various aspects of theory, algorithms, and applications of dynamic speech models, and provide a comprehensive survey of the research work in this area spanning the past 20 years.
This monograph is intended as advanced material on speech and signal processing for graduate-level teaching, for professionals and engineering practitioners, as well as for seasoned researchers and engineers specialized in speech processing. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sparse Representations for Radar with MATLAB® Examples

    Peter Knee
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Although the field of sparse representations is relatively new, research activities in academic and industrial research labs are already producing encouraging results. The sparse signal or parameter model motivated several researchers and practitioners to explore high complexity/wide bandwidth applications such as Digital TV, MRI processing, and certain defense applications. The potential signal processing advancements in this area may influence radar technologies. This book presents the basic mathematical concepts along with a number of useful MATLAB® examples to emphasize the practical implementations both inside and outside the radar field. Table of Contents: Radar Systems: A Signal Processing Perspective / Introduction to Sparse Representations / Dimensionality Reduction / Radar Signal Processing Fundamentals / Sparse Representations in Radar View full abstract»
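
    The book's examples are in MATLAB®; as a hedged, language-neutral illustration of the underlying idea, the NumPy sketch below recovers a sparse vector from compressed measurements with plain matching pursuit (dictionary, sparsity level, and iteration count are all illustrative).

    ```python
    # Hedged sketch of sparse recovery by matching pursuit: given y = A x with
    # sparse x, greedily pick the dictionary atom most correlated with the residual.
    import numpy as np

    def matching_pursuit(A, y, n_iter=10):
        x = np.zeros(A.shape[1])
        r = y.copy()
        for _ in range(n_iter):
            k = int(np.argmax(np.abs(A.T @ r)))          # best-matching atom
            coef = A[:, k] @ r / (A[:, k] @ A[:, k])
            x[k] += coef
            r -= coef * A[:, k]
        return x

    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 50))
    A /= np.linalg.norm(A, axis=0)                       # unit-norm atoms
    x_true = np.zeros(50); x_true[[3, 27]] = [1.5, -2.0]
    x_hat = matching_pursuit(A, A @ x_true)
    print(np.flatnonzero(np.abs(x_hat) > 0.5))           # typically recovers {3, 27}
    ```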

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Duplicate Detection

    Felix Naumann ; Melanie Herschel
    Copyright Year: 2010

    Morgan and Claypool eBooks

    With the ever increasing volume of data, data quality problems abound. Multiple, yet different representations of the same real-world objects in data, duplicates, are one of the most intriguing data quality problems. The effects of such duplicates are detrimental; for instance, bank customers can obtain duplicate identities, inventory levels are monitored incorrectly, catalogs are mailed multiple times to the same household, etc. Automatically detecting duplicates is difficult: First, duplicate representations are usually not identical but slightly differ in their values. Second, in principle all pairs of records should be compared, which is infeasible for large volumes of data. This lecture examines closely the two main components to overcome these difficulties: (i) Similarity measures are used to automatically identify duplicates when comparing two records. Well-chosen similarity measures improve the effectiveness of duplicate detection. (ii) Algorithms are developed to perform on very large volumes of data in search for duplicates. Well-designed algorithms improve the efficiency of duplicate detection. Finally, we discuss methods to evaluate the success of duplicate detection. Table of Contents: Data Cleansing: Introduction and Motivation / Problem Definition / Similarity Functions / Duplicate Detection Algorithms / Evaluating Detection Success / Conclusion and Outlook / Bibliography View full abstract»
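
    A minimal sketch of the two ingredients the lecture describes, under toy assumptions: a string similarity measure over record pairs and a naive all-pairs candidate generation (real systems replace the latter with blocking or sorted neighborhoods). The records and threshold below are invented.

    ```python
    # Hedged sketch: pairwise record comparison with a generic string similarity.
    from difflib import SequenceMatcher
    from itertools import combinations

    records = ["Jon Smith, 12 Main St",
               "John Smith, 12 Main Street",
               "Jane Doe, 4 Oak Ave"]

    def similarity(a, b):
        # difflib's ratio stands in for a tuned, domain-specific similarity measure
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    duplicates = [(a, b) for a, b in combinations(records, 2)
                  if similarity(a, b) > 0.8]
    print(duplicates)   # the two "Smith" records are flagged as likely duplicates
    ```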

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Social Media and Library Services

    Lorri Mon
    Copyright Year: 2015

    Morgan and Claypool eBooks

    The rise of social media technologies has created new ways to seek and share information for millions of users worldwide, but also has presented new challenges for libraries in meeting users where they are within social spaces. From social networking sites such as Facebook and Google+, and microblogging platforms such as Twitter and Tumblr to the image and video sites of YouTube, Flickr, Instagram, and to geotagging sites such as Foursquare, libraries have responded by establishing footholds within a variety of social media platforms and seeking new ways of engaging with online users in social spaces. Libraries are also responding to new social review sites such as Yelp and Tripadvisor, awareness sites including StumbleUpon, Pinterest, Goodreads, and Reddit, and social question-and-answer (Q&A) sites such as Yahoo! Answers—sites which engage social media users in functions similar to traditional library content curation, readers' advisory, information and referral, and reference services. Establishing a social media presence extends the library's physical manifestation into virtual space and increases the library's visibility, reach, and impact. However, beyond simply establishing a social presence for the library, a greater challenge is building effective and engaging social media sites that successfully adapt a library's visibility, voice, and presence to the unique contexts, audiences, and cultures within diverse social media sites. This lecture examines the research and theory on social media and libraries, providing an overview of what is known and what is not yet known about libraries and social media. Chapter 1 focuses on the social media environments within which libraries are establishing a presence, including how social media sites differ from each other, yet work together within a social ecosphere. Chapter 2 examines how libraries are engaging with users across a variety of social media platforms and the extent to which libraries are involved in using these different social media platforms, as well as the activities of libraries in presenting a social "self," sharing information, and interacting with users via social media. Chapter 3 explores metrics and measures for assessing the impact of the library's activity in social media sites. The book concludes with Chapter 4 on evolving directions for libraries and social media, including potential implications of new and emerging technologies for libraries in social spaces. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Envisionment and Discovery Collaboratory (EDC):Explorations in Human-Centered Informatics with Tabletop Computing Environments

    Ernesto G. Arias ; Hal Eden ; Gerhard Fischer
    Copyright Year: 2015

    Morgan and Claypool eBooks

    The Envisionment and Discovery Collaboratory (EDC) is a long-term research platform exploring immersive socio-technical environments in which stakeholders can collaboratively frame and solve problems and discuss and make decisions in a variety of application domains and different disciplines. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Deformable Surface 3D Reconstruction from Monocular Images

    Matthieu Salzmann ; Pascal Fua
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Being able to recover the shape of 3D deformable surfaces from a single video stream would make it possible to field reconstruction systems that run on widely available hardware without requiring specialized devices. However, because many different 3D shapes can have virtually the same projection, such monocular shape recovery is inherently ambiguous. In this survey, we will review the two main classes of techniques that have proved most effective so far: The template-based methods that rely on establishing correspondences with a reference image in which the shape is already known, and non-rigid structure-from-motion techniques that exploit points tracked across the sequences to reconstruct a completely unknown shape. In both cases, we will formalize the approach, discuss its inherent ambiguities, and present the practical solutions that have been proposed to resolve them. To conclude, we will suggest directions for future research. Table of Contents: Introduction / Early Approaches to Non-Rigid Reconstruction / Formalizing Template-Based Reconstruction / Performing Template-Based Reconstruction / Formalizing Non-Rigid Structure from Motion / Performing Non-Rigid Structure from Motion / Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tremor:From Pathogenesis to Treatment

    Giuliana Grimaldi ; Mario Manto
    Copyright Year: 2008

    Morgan and Claypool eBooks

    Tremor represents one of the most common movement disorders worldwide. It affects both sexes and may occur at any age. In most cases, tremor is disabling and causes social difficulties, resulting in poorer quality of life. Tremor is now recognized as a public health issue given the aging of the population. Tremor is a complex phenomenon that has attracted the attention of scientists from various disciplines. Tremor results from dynamic interactions between multiple synaptically coupled neuronal systems and the biomechanical, physical, and electrical properties of the external effectors. There have been major advances in our understanding of tremor pathogenesis these last three decades, thanks to new imaging techniques and genetic discoveries. Moreover, significant progress in computer technologies, developments of reliable and unobtrusive wearable sensors, improvements in miniaturization, and advances in signal processing have opened new perspectives for the accurate characterization and daily monitoring of tremor. New therapies are emerging. In this book, we provide an overview of tremor from pathogenesis to therapeutic aspects. We review the definitions, the classification of the varieties of tremor, and the contribution of central versus peripheral mechanisms. Neuroanatomical, neurophysiological, neurochemical, and pharmacological topics related to tremor are pointed out. Our goals are to explain the fundamental basis of tremor generation, to show the recent technological developments, especially in instrumentation, which are reshaping research and clinical practice, and to provide up-to-date information related to emerging therapies. The integrative transdisciplinary approach has been used, combining engineering and physiological principles to diagnose, monitor, and treat tremor. Guidelines for evaluation of tremor are explained. This book has been written for biomedical engineering students, engineers, researchers, medical students, biologists, neurologists, and biomedical professionals of any discipline looking for an updated and multidisciplinary overview of tremor. It can be used for biomedical courses. Table of Contents: Introduction / Anatomical Overview of the Central and Peripheral Nervous System / Physiology of the Nervous System / Characterization of Tremor / Principal Disorders Associated with Tremor / Quantification of Tremor / Mechanisms of Tremor / Treatments View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Understanding User-Web Interactions via Web Analytics

    Bernard J. Jansen
    Copyright Year: 2009

    Morgan and Claypool eBooks

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empirical method. These foundational elements are illuminated further through a brief history of Web analytics from the original transaction log studies in the 1960s through the information science investigations of library systems to the focus on Websites, systems, and applications. Following a discussion of on-going interaction data within the clickstream created using log files and page tagging for analytics of Website and search logs, the lecture then presents a Web analytic process to convert these basic data to meaningful key performance indicators in order to measure likely converts that are tailored to the organizational goals or potential opportunities. Supplementary data collection techniques are addressed, including surveys and laboratory studies. The overall goal of this lecture is to provide implementable information and a methodology for understanding Web analytics in order to improve Web systems, increase customer satisfaction, and target revenue through effective analysis of user–Website interactions. Table of Contents: Understanding Web Analytics / The Foundations of Web Analytics: Theory and Methods / The History of Web Analytics / Data Collection for Web Analytics / Web Analytics Fundamentals / Web Analytics Strategy / Web Analytics as Competitive Intelligence / Supplementary Methods for Augmenting Web Analytics / Search Log Analytics / Conclusion / Key Terms / Blogs for Further Reading / References View full abstract»
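
    As a hedged illustration of turning clickstream data into a key performance indicator, the sketch below computes a simple session conversion rate; the session records and goal page are invented.

    ```python
    # Hedged sketch: sessions that reach a goal page divided by all sessions.
    from collections import defaultdict

    clicks = [("s1", "/home"), ("s1", "/product"), ("s1", "/checkout"),
              ("s2", "/home"), ("s2", "/search"),
              ("s3", "/product"), ("s3", "/checkout")]

    sessions = defaultdict(list)
    for session_id, page in clicks:
        sessions[session_id].append(page)

    converted = sum(1 for pages in sessions.values() if "/checkout" in pages)
    print(f"conversion rate: {converted / len(sessions):.0%}")   # 67%
    ```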

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quantitative Neurophysiology

    Joseph V. Tranquillo
    Copyright Year: 2008

    Morgan and Claypool eBooks

    Quantitative Neurophysiology is a supplementary text for a junior or senior level course in neuroengineering. It may also serve as a quick-start for graduate students in engineering, physics or neuroscience as well as for faculty interested in becoming familiar with the basics of quantitative neuroscience. The first chapter is a review of the structure of the neuron and anatomy of the brain. Chapters 2-6 derive the theory of active and passive membranes, electrical propagation in axons and dendrites and the dynamics of the synapse. Chapter 7 is an introduction to modeling networks of neurons and artificial neural networks. Chapters 8 and 9 address the recording and decoding of extracellular potentials. The final chapter has descriptions of a number of more advanced or new topics in neuroengineering. Throughout the text, vocabulary is introduced which will enable students to read more advanced literature and communicate with other scientists and engineers working in the neurosciences. Numerical methods are outlined so students with programming knowledge can implement the models presented in the text. Analogies are used to clarify topics and reinforce key concepts. Finally, homework and simulation problems are available at the end of each chapter. View full abstract»
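
    A minimal sketch of the sort of numerical exercise the text enables, assuming a passive (leaky) membrane driven by a current step and forward-Euler integration; all parameter values are illustrative rather than taken from the book.

    ```python
    # Hedged sketch: forward-Euler integration of a passive membrane,
    # C dV/dt = -(V - E_rest)/R + I(t).
    import numpy as np

    C, R, E_rest = 1.0, 10.0, -70.0          # nF, MOhm, mV (toy values)
    dt, T = 0.1, 100.0                        # ms
    steps = int(T / dt)
    V = np.empty(steps); V[0] = E_rest
    I = np.zeros(steps); I[int(20 / dt):int(80 / dt)] = 1.5   # nA current step

    for k in range(1, steps):
        dV = (-(V[k - 1] - E_rest) / R + I[k - 1]) / C
        V[k] = V[k - 1] + dt * dV

    print(round(V.max(), 1))   # approaches E_rest + I*R = -55 mV during the step
    ```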

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Chaotic Maps:Dynamics, Fractals, and Rapid Fluctuations

    Goong Chen ; Yu Huang
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book consists of lecture notes for a semester-long introductory graduate course on dynamical systems and chaos taught by the authors at Texas A&M University and Zhongshan University, China. There are ten chapters in the main body of the book, covering an elementary theory of chaotic maps in finite-dimensional spaces. The topics include one-dimensional dynamical systems (interval maps), bifurcations, general topological, symbolic dynamical systems, fractals and a class of infinite-dimensional dynamical systems which are induced by interval maps, plus rapid fluctuations of chaotic maps as a new viewpoint developed by the authors in recent years. Two appendices are also provided in order to ease the transitions for the readership from discrete-time dynamical systems to continuous-time dynamical systems, governed by ordinary and partial differential equations. Table of Contents: Simple Interval Maps and Their Iterations / Total Variations of Iterates of Maps / Ordering among Periods: The Sharkovski Theorem / Bifurcation Theorems for Maps / Homoclinicity. Lyapunoff Exponents / Symbolic Dynamics, Conjugacy and Shift Invariant Sets / The Smale Horseshoe / Fractals / Rapid Fluctuations of Chaotic Maps on RN / Infinite-dimensional Systems Induced by Continuous-Time Difference Equations View full abstract»
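
    A hedged illustration using the simplest interval map treated in such courses, the logistic map x -> r x (1 - x): two nearby initial conditions separate rapidly at r = 4, the hallmark of the chaotic behavior the notes study.

    ```python
    # Hedged sketch: sensitive dependence on initial conditions for the logistic map.
    def logistic_orbit(x0, r=4.0, n=30):
        xs = [x0]
        for _ in range(n):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    a = logistic_orbit(0.2)
    b = logistic_orbit(0.2 + 1e-8)           # tiny perturbation of the start point
    print([round(abs(x - y), 4) for x, y in zip(a, b)][::5])   # separation grows to O(1)
    ```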

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computational Modeling of Narrative

    Inderjeet Mani
    Copyright Year: 2012

    Morgan and Claypool eBooks

    The field of narrative (or story) understanding and generation is one of the oldest in natural language processing (NLP) and artificial intelligence (AI), which is hardly surprising, since storytelling is such a fundamental and familiar intellectual and social activity. In recent years, the demands of interactive entertainment and interest in the creation of engaging narratives with life-like characters have provided a fresh impetus to this field. This book provides an overview of the principal problems, approaches, and challenges faced today in modeling the narrative structure of stories. The book introduces classical narratological concepts from literary theory and their mapping to computational approaches. It demonstrates how research in AI and NLP has modeled character goals, causality, and time using formalisms from planning, case-based reasoning, and temporal reasoning, and discusses fundamental limitations in such approaches. It proposes new representations for embedded narratives and fictional entities, for assessing the pace of a narrative, and offers an empirical theory of audience response. These notions are incorporated into an annotation scheme called NarrativeML. The book identifies key issues that need to be addressed, including annotation methods for long literary narratives, the representation of modality and habituality, and characterizing the goals of narrators. It also suggests a future characterized by advanced text mining of narrative structure from large-scale corpora and the development of a variety of useful authoring aids. This is the first book to provide a systematic foundation that integrates together narratology, AI, and computational linguistics. It can serve as a narratology primer for computer scientists and an elucidation of computational narratology for literary theorists. It is written in a highly accessible manner and is intended for use by a broad scientific audience that includes linguists (computational and formal semanticists), AI researchers, cognitive scientists, computer scientists, game developers, and narrative theorists. Table of Contents: List of Figures / List of Tables / Narratological Background / Characters as Intentional Agents / Time / Plot / Summary and Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Genome Refactoring

    Natalie Kuldell ; Neal Lerner
    Copyright Year: 2009

    Morgan and Claypool eBooks

    The science of biology celebrates the discovery and understanding of biological systems that already exist in nature. In parallel, the engineering of biology must learn how to make use of our understanding of the natural world to design and build new useful biological systems. "Synthetic biology" represents one example of recent work to engineer biological systems. This emerging field aims to replace the ad hoc process of assembling biological systems by primarily developing tools to assemble reliable-but-complex living organisms from standard components that can later be reused in new combination. The focus of this book is "genome refactoring," one of several approaches to manage the complexity of a biological system in which the goal is to redesign the genetic elements that encode a living form--preserving the function of that form but encoding it with a genome far easier to study and extend. This book presents genome refactoring in two ways: as an important aspect of the emerging field of synthetic biology and as a powerful teaching tool to train would be professionals in the subject. Chapters focus on the overarching goals of synthetic biology and their alignment with the motivations and achievements in genome engineering; the engineering frameworks of refactoring, including genome synthesis, standardization of biological parts, and abstraction; a detailed description of the bacteriophages that have been refactored up to this point; and the methods of refactoring and contexts for that work drawn from the bacteriophage M13. Overall, these examples offer readers the potential for synthetic biology and the areas in need of further research. If successful, synthetic biology and genome refactoring could address any number of persistent societal needs, including sustainable energy, affordable and effective medicine, and green manufacturing practices. Table of Contents: Tools for Genome Engineering and Synthetic Biology / Bacteriophage as Templates for Refactoring / Methods/Teaching Protocols for M13 Reengineering / Writing and Speaking as Biological Engineers / Summary and Future Directions / Appendix A / Appendix B / Appendix C View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computational Genomic Signatures

    Ozkan Ufuk Nalbantoglu ; Khalid Sayood
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Recent advances in the development of sequencing technology have resulted in a deluge of genomic data. In order to make sense of this data, there is an urgent need for algorithms for data processing and quantitative reasoning. An emerging in silico approach, called computational genomic signatures, addresses this need by representing global species-specific features of genomes using simple mathematical models. This text introduces the general concept of computational genomic signatures, and it reviews some of the DNA sequence models which can be used as computational genomic signatures. The text takes the position that a practical computational genomic signature consists of both a model and a measure for computing the distance or similarity between models. Therefore, a discussion of sequence similarity/distance measurement in the context of computational genomic signatures is presented. The remainder of the text covers various applications of computational genomic signatures in the areas of metagenomics, phylogenetics and the detection of horizontal gene transfer. Table of Contents: Genome Signatures, Definition and Background / Other Computational Characterizations as Genome Signatures / Measuring Distance of Biological Sequences Using Genome Signatures / Applications: Phylogeny Construction / Applications: Metagenomics / Applications: Horizontal DNA Transfer Detection View full abstract»
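
    A minimal sketch of one simple compositional signature, a normalized k-mer frequency profile compared with a plain Euclidean distance; actual signatures and distance measures treated in the book are more refined, and the sequences below are toy data.

    ```python
    # Hedged sketch: k-mer frequency profile as a genome signature, plus a
    # simple Euclidean distance between two signatures.
    import itertools
    import math
    from collections import Counter

    def kmer_signature(seq, k=2):
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = sum(counts.values())
        kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
        return [counts[m] / total for m in kmers]

    def distance(sig_a, sig_b):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

    g1, g2 = "ATATATATGCGCAT" * 10, "GGGCCCGGGCCCGG" * 10
    print(distance(kmer_signature(g1), kmer_signature(g2)))   # compositionally distant
    ```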

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Biomedical Image Analysis: Tracking

    Scott T. Acton ; Nilanjan Ray
    Copyright Year: 2006

    Morgan and Claypool eBooks

    In biological and medical imaging applications, tracking objects in motion is a critical task. This book describes the state-of-the-art in biomedical tracking techniques. We begin by detailing methods for tracking using active contours, which have been highly successful in biomedical applications. The book next covers the major probabilistic methods for tracking. Starting with the basic Bayesian model, we describe the Kalman filter and conventional tracking methods that use centroid and correlation measurements for target detection. Innovations such as the extended Kalman filter and the interacting multiple model open the door to capturing complex biological objects in motion. A salient highlight of the book is the introduction of the recently emerged particle filter, which promises to solve tracking problems that were previously intractable by conventional means. Another unique feature of Biomedical Image Analysis: Tracking is the explanation of shape-based methods for biomedical image analysis. Methods for both rigid and nonrigid objects are depicted. Each chapter in the book puts forth biomedical case studies that illustrate the methods in action. View full abstract»
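
    As a hedged illustration of the probabilistic tracking machinery described above, the sketch below runs a basic one-dimensional constant-velocity Kalman filter on noisy position measurements; the model matrices and noise levels are illustrative.

    ```python
    # Hedged sketch: 1-D constant-velocity Kalman filter on noisy centroid positions.
    import numpy as np

    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (position, velocity)
    H = np.array([[1.0, 0.0]])               # only position is measured
    Q, R = 0.01 * np.eye(2), np.array([[1.0]])

    x, P = np.array([0.0, 0.0]), np.eye(2)
    rng = np.random.default_rng(0)
    for t in range(1, 20):
        z = 0.8 * t + rng.normal(scale=1.0)   # noisy position of a moving object
        x, P = F @ x, F @ P @ F.T + Q         # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ (np.array([z]) - H @ x)   # update with the new measurement
        P = (np.eye(2) - K @ H) @ P

    print(x)   # estimated [position, velocity]; velocity near 0.8
    ```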

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Lying by Approximation:The Truth about Finite Element Analysis

    Vincent C. Prantil ; Christopher Papadopoulos ; Paul D. Gessler
    Copyright Year: 2013

    Morgan and Claypool eBooks

    In teaching an introduction to the finite element method at the undergraduate level, a prudent mix of theory and applications is often sought. In many cases, analysts use the finite element method to perform parametric studies on potential designs to size parts, weed out less desirable design scenarios, and predict system behavior under load. In this book, we discuss common pitfalls encountered by many finite element analysts, in particular, students encountering the method for the first time. We present a variety of simple problems in axial, bending, torsion, and shear loading that combine the students' knowledge of theoretical mechanics, numerical methods, and approximations particular to the finite element method itself. We also present case studies in which analyses are coupled with experiments to emphasize validation, illustrate where interpretations of numerical results can be misleading, and what can be done to allay such tendencies. Challenges in presenting the necessary mix of theory and applications in a typical undergraduate course are discussed. We also discuss a list of tips and rules of thumb for applying the method in practice. Table of Contents: Preface / Acknowledgments / Guilty Until Proven Innocent / Let's Get Started / Where We Begin to Go Wrong / It's Only a Model / Wisdom Is Doing It / Summary / Afterword / Bibliography / Authors' Biographies View full abstract»
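
    A minimal worked example in the spirit of the simple problems mentioned above, assuming an axial bar fixed at one end, loaded at the other, and meshed with two linear elements; the computed tip displacement can be checked against the exact PL/(AE). Material and load values are illustrative.

    ```python
    # Hedged sketch: assemble and solve a two-element axial bar problem.
    import numpy as np

    E, A, L, P = 200e9, 1e-4, 2.0, 1000.0        # Pa, m^2, m, N (toy values)
    n_el = 2
    le = L / n_el
    k_el = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness

    K = np.zeros((n_el + 1, n_el + 1))
    for e in range(n_el):                        # assemble global stiffness matrix
        K[e:e + 2, e:e + 2] += k_el

    f = np.zeros(n_el + 1); f[-1] = P            # point load at the free end
    u = np.zeros(n_el + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])    # impose u[0] = 0 at the fixed end

    print(u[-1], P * L / (A * E))                # both 1e-04 m
    ```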

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Probabilistic Databases

    Dan Suciu ; Dan Olteanu ; Christopher Ré ; Christoph Koch
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for representing large probabilistic databases, by decomposing them into tuple-independent tables, block-independent-disjoint tables, or U-databases. Then it discusses two classes of techniques for query evaluation on probabilistic databases. In extensional query evaluation, the entire probabilistic inference can be pushed into the database engine and, therefore, processed as effectively as the evaluation of standard SQL queries. The relational queries that can be evaluated this way are called safe queries. In intensional query evaluation, the probabilistic inference is performed over a propositional formula called lineage expression: every relational query can be evaluated this way, but the data complexity dramatically depends on the query being evaluated, and can be #P-hard. The book also discusses some advanced topics in probabilistic data management such as top-k query processing, sequential probabilistic databases, indexing and materialized views, and Monte Carlo databases. Table of Contents: Overview / Data and Query Model / The Query Evaluation Problem / Extensional Query Evaluation / Intensional Query Evaluation / Advanced Techniques View full abstract»
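
    A hedged sketch of extensional evaluation over a tuple-independent table: each tuple is present independently with its probability, so a simple existential (safe) query has answer probability 1 - prod(1 - p_i) over the qualifying tuples. The table below is invented.

    ```python
    # Hedged sketch: probability of an existential query over a tuple-independent table.
    likes = [("alice", "coffee", 0.9),     # (person, item, tuple probability)
             ("bob",   "coffee", 0.5),
             ("carol", "tea",    0.8)]

    def prob_exists(pred):
        """P(at least one qualifying tuple is present), assuming tuple independence."""
        q = 1.0
        for person, item, p in likes:
            if pred(person, item):
                q *= (1.0 - p)
        return 1.0 - q

    print(prob_exists(lambda person, item: item == "coffee"))   # 1 - 0.1*0.5 = 0.95
    ```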

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Judgment Aggregation:A Primer

    Davide Grossi ; Gabriella Pigozzi
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Judgment aggregation is a mathematical theory of collective decision-making. It concerns the methods whereby individual opinions about logically interconnected issues of interest can, or cannot, be aggregated into one collective stance. Aggregation problems have traditionally been of interest for disciplines like economics and the political sciences, as well as philosophy, where judgment aggregation itself originates from, but have recently captured the attention of disciplines like computer science, artificial intelligence and multi-agent systems. Judgment aggregation has emerged in the last decade as a unifying paradigm for the formalization and understanding of aggregation problems. Still, no comprehensive presentation of the theory is available to date. This Synthesis Lecture aims at filling this gap by presenting the key motivations, results, abstractions and techniques underpinning it. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Relativistic Flight Mechanics and Space Travel

    Richard F. Tinder
    Copyright Year: 2006

    Morgan and Claypool eBooks

    Relativistic Flight Mechanics and Space Travel is about the fascinating prospect of future human space travel. Its purpose is to demonstrate that such ventures may not be as difficult as one might believe and are certainly not impossible. The foundations for relativistic flight mechanics are provided in a clear and instructive manner by using well established principles which are used to explore space flight possibilities within and beyond our galaxy. The main substance of the book begins with a background review of Einstein's Special Theory of Relativity as it pertains to relativistic flight mechanics and space travel. The book explores the dynamics and kinematics of relativistic space flight from the point of view of the astronauts in the spacecraft and compares these with those observed by earth's scientists and engineers -- differences that are quite surprising. A quasi historical treatment leads quite naturally into the central subject areas of the book where attention is focused on various issues not ordinarily covered by such treatment. To accomplish this, numerous simple thought experiments are used to bring rather complicated subject matter down to a level easily understood by most readers with an engineering or science background. The primary subjects regarding photon rocketry and space travel are covered in some depth and include a flight plan together with numerous calculations represented in graphical form. A geometric treatment of relativistic effects by using Minkowski diagrams is included for completeness. The book concludes with brief discussions of other prospective, even exotic, transport systems for relativistic space travel. A glossary and simple end-of-chapter problems with answers enhance the learning process. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Control System Synthesis:A Factorization Approach

    Mathukumalli Vidyasagar
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book introduces the so-called "stable factorization approach" to the synthesis of feedback controllers for linear control systems. The key to this approach is to view the multi-input, multi-output (MIMO) plant for which one wishes to design a controller as a matrix over the fraction field F associated with a commutative ring with identity, denoted by R, which also has no divisors of zero. In this setting, the set of single-input, single-output (SISO) stable control systems is precisely the ring R, while the set of stable MIMO control systems is the set of matrices whose elements all belong to R. The set of unstable, meaning not necessarily stable, control systems is then taken to be the field of fractions F associated with R in the SISO case, and the set of matrices with elements in F in the MIMO case. The central notion introduced in the book is that, in most situations of practical interest, every matrix P whose elements belong to F can be "factored" as a "ratio" of two matrices N, D whose elements belong to R, in such a way that N, D are coprime. In the familiar case where the ring R corresponds to the set of bounded-input, bounded-output (BIBO)-stable rational transfer functions, coprimeness is equivalent to two functions not having any common zeros in the closed right half-plane including infinity. However, the notion of coprimeness extends readily to discrete-time systems, distributed-parameter systems in both the continuous- as well as discrete-time domains, and to multi-dimensional systems. Thus the stable factorization approach enables one to capture all these situations within a common framework. The key result in the stable factorization approach is the parametrization of all controllers that stabilize a given plant. It is shown that the set of all stabilizing controllers can be parametrized by a single parameter R, whose elements all belong to R. Moreover, every transfer matrix in the closed-loop system is an affine function of the design parameter R. Thus problems of reliable stabilization, disturbance rejection, robust stabilization, etc. can all be formulated in terms of choosing an appropriate R. This is a reprint of the book Control System Synthesis: A Factorization Approach originally published by M.I.T. Press in 1985. View full abstract»
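
    As a hedged illustration of the parametrization described above, one common single-input, single-output statement is the following: writing the plant as p = n/d with n, d coprime in R and satisfying x n + y d = 1, every stabilizing controller takes the form below for some stable parameter r (notation here is illustrative and may differ from the book's).

    ```latex
    % One common SISO statement of the stabilizing-controller parametrization
    % (illustrative notation; the book treats the general ring/MIMO setting).
    % Plant p = n/d, with n, d coprime in R and Bezout identity x n + y d = 1:
    \[
      c \;=\; \frac{x + d\,r}{y - n\,r}, \qquad r \in R, \quad y - n r \neq 0 .
    \]
    % Check: the closed-loop "denominator" is d(y - n r) + n(x + d r) = d y + n x = 1,
    % so the loop is stable for every stable choice of r.
    ```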

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Clear Speech:Technologies that Enable the Expression and Reception of Language

    Frank Rudzicz
    Copyright Year: 2016

    Morgan and Claypool eBooks

    Approximately 10% of North Americans have some communication disorder. These can be physical, as in cerebral palsy and Parkinson's disease; cognitive, as in Alzheimer's disease and dementia generally; or both physical and cognitive, as in stroke. In fact, deteriorations in language are often the early hallmarks of broader diseases associated with older age, which is especially relevant since aging populations across many nations will result in a drastic increase in the prevalence of these types of disorders. A significant change to how healthcare is administered, brought on by these aging populations, will increase the workload of speech-language pathologists, therapists, and caregivers who are often already overloaded. Fortunately, modern speech technology, such as automatic speech recognition, has matured to the point where it can now have a profound positive impact on the lives of millions of people living with various types of disorders. This book serves as a common ground for two communities: clinical linguists (e.g., speech-language pathologists) and technologists (e.g., computer scientists). This book examines the neurological and physical causes of several speech disorders and their clinical effects, and demonstrates how modern technology can be used in practice to manage those effects and improve one's quality of life. This book is intended for a broad audience, from undergraduates to more senior researchers, as well as to users of these technologies and their therapists. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Publishing and Using Cultural Heritage Linked Data on the Semantic Web

    Eero Hyvonen
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Cultural Heritage (CH) data is syntactically and semantically heterogeneous, multilingual, semantically rich, and highly interlinked. It is produced in a distributed, open fashion by museums, libraries, archives, and media organizations, as well as individual persons. Managing publication of such richness and variety of content on the Web, and at the same time supporting distributed, interoperable content creation processes, poses challenges where traditional publication approaches need to be re-thought. Application of the principles and technologies of Linked Data and the Semantic Web is a new, promising approach to address these problems. This development is leading to the creation of large national and international CH portals, such as Europeana, to large open data repositories, such as the Linked Open Data Cloud, and to massive publications of linked library data in the U.S., Europe, and Asia. Cultural Heritage has become one of the most successful application domains of Linked Data and Semantic Web technologies. This book gives an overview of why, when, and how Linked (Open) Data and Semantic Web technologies can be employed in practice in publishing CH collections and other content on the Web. The text first motivates and presents a general semantic portal model and publishing framework as a solution approach to distributed semantic content creation, based on an ontology infrastructure. On the Semantic Web, such an infrastructure includes shared metadata models, ontologies, and logical reasoning, and is supported by shared ontology and other Web services that ease the use of the new technology and linked data in legacy cataloging systems. The goal of all this is to provide lay users and researchers with new, more intelligent and usable Web applications that can be utilized by other Web applications, too, via well-defined Application Programming Interfaces (APIs). At the same time, it is possible to provide publishing organizations with more cost-efficient solutions for content creation and publication. This book is targeted at computer scientists, museum curators, librarians, archivists, and other CH professionals interested in Linked Data and CH applications on the Semantic Web. The text is focused on practice and applications, making it suitable for students, researchers, and practitioners developing Web services and applications of CH, as well as for CH managers willing to understand the technical issues and challenges involved in linked data publication. Table of Contents: Cultural Heritage on the Semantic Web / Portal Model for Collaborative CH Publishing / Requirements for Publishing Linked Data / Metadata Schemas / Domain Vocabularies and Ontologies / Logic Rules for Cultural Heritage / Cultural Content Creation / Semantic Services for Human and Machine Users / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Three-Dimensional Integration and Modeling:A Revolution in RF and Wireless Packaging

    Jong-Hoon Lee ; Manos M. Tentzeris
    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book presents a step-by-step discussion of the 3D integration approach for the development of compact system-on-package (SOP) front-ends. Various examples of fully integrated passive building blocks (cavity/microstrip filters, duplexers, antennas), as well as a multilayer ceramic (LTCC) V-band transceiver front-end module, demonstrate the revolutionary effects of this approach in RF/wireless packaging and multifunctional miniaturization. Designs covered are based on novel ideas and are presented for the first time for millimeter-wave (60 GHz) ultra-broadband wireless modules. Table of Contents: Introduction / Background on Technologies for Millimeter-Wave Passive Front-Ends / Three-Dimensional Packaging in Multilayer Organic Substrates / Microstrip-Type Integrated Passives / Cavity-Type Integrated Passives / Three-Dimensional Antenna Architectures / Fully Integrated Three-Dimensional Passive Front-Ends / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Taxobook: History, Theories, and Concepts of Knowledge Organization (Part 1 of a 3-Part Series)

    Marjorie Hlava
    Copyright Year: 2014

    Morgan and Claypool eBooks

    This is the first volume in a series about creating and maintaining taxonomies and their practical applications, especially in search functions. In Book 1 (The Taxobook: History, Theories, and Concepts of Knowledge Organization), the author introduces the very foundations of classification, starting with the ancient Greek philosophers Plato and Aristotle, as well as Theophrastus and the Roman Pliny the Elder. They were first in a line of distinguished thinkers and philosophers to ponder the organization of the world around them and attempt to apply a structure or framework to that world. The author continues by discussing the works and theories of several other philosophers from Medieval and Renaissance times, including Saints Aquinas and Augustine, William of Occam, Andrea Cesalpino, Carl Linnaeus, and René Descartes. In the 17th, 18th, and 19th centuries, John Locke, Immanuel Kant, James Frederick Ferrier, Charles Ammi Cutter, and Melvil Dewey contributed greatly to the theories of classification systems and knowledge organization. Cutter and Dewey, especially, created systems that are still in use today. Chapter 8 covers the contributions of Shiyali Ramamrita Ranganathan, who is considered by many to be the “father of modern library science.” He created the concept of faceted vocabularies, which are widely used (even if they are not well understood) on many e-commerce websites. Following the discussions and historical review, the author has included a glossary that covers all three books of this series so that it can be referenced as you work your way through the second and third volumes. The author believes that it is important to understand the history of knowledge organization and the differing viewpoints of various philosophers, even if that understanding is only that the differing viewpoints simply exist. Knowing the differing viewpoints will help answer the fundamental questions: Why do we want to build taxonomies? How do we build them to serve multiple points of view? Table of Contents: List of Figures / Preface / Acknowledgments / Origins of Knowledge Organization Theory: Early Philosophy of Knowledge / Saints and Traits: Realism and Nominalism / Arranging the Flowers… and the Birds, and the Insects, and Everything Else: Early Naturalists and Taxonomies / The Age of Enlightenment Impacts Knowledge Theory / 18th-Century Developments: Knowledge Theory Coming to the Foreground / High Resolution: Classification Sharpens in the 19th and 20th Centuries / Outlining the World and Its Parts / Facets: An Indian Mathematician and Children’s Toys at Selfridge’s / Points of Knowledge / Glossary / End Notes / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Markov Logic:An Interface Layer for Artificial Intelligence

    Pedro Domingos ; Daniel Lowd
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Most subfields of computer science have an interface layer via which applications communicate with the infrastructure, and this is key to their success (e.g., the Internet in networking, the relational model in databases, etc.). So far this interface layer has been missing in AI. First-order logic and probabilistic graphical models each have some of the necessary features, but a viable interface layer requires combining both. Markov logic is a powerful new language that accomplishes this by attaching weights to first-order formulas and treating them as templates for features of Markov random fields. Most statistical models in wide use are special cases of Markov logic, and first-order logic is its infinite-weight limit. Inference algorithms for Markov logic combine ideas from satisfiability, Markov chain Monte Carlo, belief propagation, and resolution. Learning algorithms make use of conditional likelihood, convex optimization, and inductive logic programming. Markov logic has been successfully applied to problems in information extraction and integration, natural language processing, robot mapping, social networks, computational biology, and others, and is the basis of the open-source Alchemy system. Table of Contents: Introduction / Markov Logic / Inference / Learning / Extensions / Applications / Conclusion View full abstract»
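
    For orientation, the distribution defined by a Markov logic network is usually written as follows (a standard statement of the model, reproduced here only as a pointer to the book's development):

        P(X = x) = \frac{1}{Z} \exp\Big( \sum_{i} w_{i}\, n_{i}(x) \Big),

    where n_i(x) is the number of true groundings of formula F_i in the world x, w_i is that formula's weight, and Z is the normalizing partition function. Letting every w_i grow without bound forces all formulas to hold in any world with nonzero probability, which is the sense in which first-order logic is the infinite-weight limit mentioned above.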

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Microcontroller Programming and Interfacing Texas Instruments MSP430:Part II

    Steven F. Barrett ; Daniel J. Pack
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book provides a thorough introduction to the Texas Instruments MSP430 microcontroller. The MSP430 is a 16-bit reduced instruction set (RISC) processor that features ultra-low power consumption and integrated digital and analog hardware. Variants of the MSP430 microcontroller have been in production since 1993, which has given rise to a host of MSP430 products including evaluation boards, compilers, and documentation. A thorough introduction to the MSP430 line of microcontrollers, programming techniques, and interface concepts is provided, along with considerable tutorial information and many illustrated examples. Each chapter provides laboratory exercises to apply what has been presented in the chapter. The book is intended for an upper-level undergraduate course in microcontrollers or mechatronics but may also be used as a reference for capstone design projects. Also, practicing engineers already familiar with another microcontroller, who require a quick tutorial on the microcontroller, will find this book very useful. View full abstract»
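
    A minimal sketch of the register-level style of programming the book teaches (illustrative only, not an example taken from the text; it assumes a LaunchPad-style board with an LED on port pin P1.0 and the standard msp430.h device header):

        /* Toggle the LED on P1.0 using a crude software delay loop.       */
        /* Illustrative sketch; the board and LED pin are assumptions.     */
        #include <msp430.h>

        int main(void)
        {
            volatile unsigned int i;

            WDTCTL = WDTPW | WDTHOLD;   /* stop the watchdog timer         */
            P1DIR |= BIT0;              /* make P1.0 a digital output      */

            for (;;) {
                P1OUT ^= BIT0;          /* toggle the LED                  */
                for (i = 0; i < 50000; i++)
                    ;                   /* busy-wait delay                 */
            }
        }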

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Memory Consistency and Cache Coherence

    Daniel Sorin ; Mark Hill ; David Wood
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Many modern computer systems and most multicore chips (chip multiprocessors) support shared memory in hardware. In a shared memory system, each of the processor cores may read and write to a single shared address space. For a shared memory machine, the memory consistency model defines the architecturally visible behavior of its memory system. Consistency definitions provide rules about loads and stores (or memory reads and writes) and how they act upon memory. As part of supporting a memory consistency model, many machines also provide cache coherence protocols that ensure that multiple cached copies of data are kept up-to-date. The goal of this primer is to provide readers with a basic understanding of consistency and coherence. This understanding includes both the issues that must be solved as well as a variety of solutions. We present both high-level concepts and specific, concrete examples from real-world systems. Table of Contents: Preface / Introduction to Consistency and Coherence / Coherence Basics / Memory Consistency Motivation and Sequential Consistency / Total Store Order and the x86 Memory Model / Relaxed Memory Consistency / Coherence Protocols / Snooping Coherence Protocols / Directory Coherence Protocols / Advanced Topics in Coherence / Author Biographies View full abstract»
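
    A classic "store buffering" litmus test makes the stakes concrete (an illustrative sketch, not an example from the book): under sequential consistency the outcome r1 == 0 and r2 == 0 is impossible, yet on x86-TSO hardware it can occur because each core's store may still sit in its store buffer when the other core's load executes. Observing the relaxed outcome in practice requires repeating the race many times.

        /* Store-buffering litmus test (intentionally racy; run repeatedly). */
        #include <pthread.h>
        #include <stdio.h>

        volatile int x = 0, y = 0;
        volatile int r1 = 0, r2 = 0;

        static void *thread0(void *arg) { (void)arg; x = 1; r1 = y; return NULL; }
        static void *thread1(void *arg) { (void)arg; y = 1; r2 = x; return NULL; }

        int main(void)
        {
            pthread_t t0, t1;
            pthread_create(&t0, NULL, thread0, NULL);
            pthread_create(&t1, NULL, thread1, NULL);
            pthread_join(t0, NULL);
            pthread_join(t1, NULL);
            printf("r1=%d r2=%d\n", r1, r2);  /* (0,0) betrays store reordering */
            return 0;
        }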

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Remote Sensing Image Processing

    Gustavo Camps-Valls ; Devis Tuia ; Luis Gomez-Chova ; Sandra Jimenez