By Topic

Morgan and Claypool Synthesis Digital Library

768 Results Returned

  • DSP for MATLAB™ and LabVIEW™ II: Discrete Frequency Transforms

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book is Volume II of the series DSP for MATLAB™ and LabVIEW™. This volume provides detailed coverage of discrete frequency transforms, including a brief overview of common frequency transforms, both discrete and continuous, followed by detailed treatments of the Discrete Time Fourier Transform (DTFT), the z-Transform (including definition and properties, the inverse z-transform, frequency response via the z-transform, and alternate filter realization topologies, including Direct Form, Direct Form Transposed, Cascade Form, Parallel Form, and Lattice Form), and the Discrete Fourier Transform (DFT) (including the Discrete Fourier Series, the DFT-IDFT pair, the DFT of common signals, bin width, sampling duration and sample rate, the FFT, the Goertzel Algorithm, Linear, Periodic, and Circular convolution, DFT Leakage, and computation of the Inverse DFT). The entire series consists of four volumes that collectively cover basic digital signal processing in a practical and accessible manner, but which nonetheless include all essential foundation mathematics. As the series title implies, the scripts (of which there are more than 200) described in the text and supplied in code form here will run on both MATLAB™ and LabVIEW™. The text for all volumes contains many examples, and many useful computational scripts, augmented by demonstration scripts and LabVIEW™ Virtual Instruments (VIs) that can be run to illustrate various signal processing concepts graphically on the user's computer. Volume I consists of four chapters that collectively set forth a brief overview of the field of digital signal processing, useful signals and concepts (including convolution, recursion, difference equations, LTI systems, etc.), conversion from the continuous to discrete domain and back (i.e., analog-to-digital and digital-to-analog conversion), aliasing, the Nyquist rate, normalized frequency, sample rate conversion and Mu-law compression, and signal processing principles including correlation, the correlation sequence, the Real DFT, correlation by convolution, matched filtering, simple FIR filters, and simple IIR filters. Chapter 4 of Volume I, in particular, provides an intuitive or "first principle" understanding of how digital filtering and frequency transforms work, preparing the reader for the present volume (Volume II). Volume III of the series covers digital filter design (FIR design using Windowing, Frequency Sampling, and Optimum Equiripple techniques, and Classical IIR design) and Volume IV, the culmination of the series, is an introductory treatment of LMS Adaptive Filtering and applications. Table of Contents: The Discrete Time Fourier Transform / The z-Transform / The DFT
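
    As a rough illustration of the bin width, sample rate, and DFT leakage ideas listed above, the following minimal sketch computes a DFT with the FFT and locates the peak bin. It uses Python/NumPy rather than the MATLAB™ or LabVIEW™ scripts supplied with the book, and the signal parameters are arbitrary illustrative values.

        import numpy as np

        fs = 8000          # sample rate in Hz (illustrative value)
        N = 1024           # number of DFT points
        f0 = 1050.0        # test-tone frequency in Hz (illustrative value)
        n = np.arange(N)

        x = np.cos(2 * np.pi * f0 * n / fs)   # sampled cosine

        X = np.fft.fft(x)                     # DFT computed via the FFT
        bin_width = fs / N                    # Hz per DFT bin
        peak_bin = np.argmax(np.abs(X[:N // 2]))

        print(f"bin width = {bin_width:.2f} Hz")
        print(f"peak bin  = {peak_bin} -> {peak_bin * bin_width:.2f} Hz")
        # Because f0 is not an exact multiple of fs/N, some energy leaks into
        # neighboring bins (DFT leakage, one of the topics covered in the book).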

  • The Envisionment and Discovery Collaboratory (EDC): Explorations in Human-Centered Informatics with Tabletop Computing Environments

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The Envisionment and Discovery Collaboratory (EDC) is a long-term research platform exploring immersive socio-technical environments in which stakeholders can collaboratively frame and solve problems and discuss and make decisions in a variety of application domains and different disciplines.

  • The Integral: A Crux for Analysis

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book treats all of the most commonly used theories of the integral. After motivating the idea of integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had calculus and some exposure to upper division mathematics. Table of Contents: Introduction / The Riemann Integral / The Lebesgue Integral / Comparison of the Riemann and Lebesgue Integrals / Other Theories of the Integral

  • Virtual Crowds: Methods, Simulation, and Control

    Copyright Year: 2008

    Morgan and Claypool eBooks

    There are many applications of computer animation and simulation where it is necessary to model virtual crowds of autonomous agents. Some of these applications include site planning, education, entertainment, training, and human factors analysis for building evacuation. Other applications include simulations of scenarios where masses of people gather, flow, and disperse, such as transportation centers, sporting events, and concerts. Most crowd simulations include only basic locomotive behaviors possibly coupled with a few stochastic actions. Our goal in this survey is to establish a baseline of techniques and requirements for simulating large-scale virtual human populations. Sometimes, these populations might be mutually engaged in a common activity such as evacuation from a building or area; other times they may be going about their individual and personal agendas of work, play, leisure, travel, or spectating. Computational methods to model one set of requirements may not mesh well with good approaches to another. By including both crowd and individual goals and constraints in a comprehensive computational model, we expect to simulate the visual texture and contextual behaviors of groups of seemingly sentient beings. Table of Contents: Introduction / Crowd Simulation Methodology Survey / Individual Differences in Crowds / Framework (HiDAC + MACES + CAROSA) / HiDAC: Local Motion / MACES: Wayfinding with Communication and Roles / CAROSA: Functional Crowds / Initializing a Scenario / Evaluating Crowds

  • Narrowband Direction of Arrival Estimation for Antenna Arrays

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book provides an introduction to narrowband array signal processing, classical and subspace-based direction of arrival (DOA) estimation with an extensive discussion on adaptive direction of arrival algorithms. The book begins with a presentation of the basic theory, equations, and data models of narrowband arrays. It then discusses basic beamforming methods and describes how they relate to DOA estimation. Several of the most common classical and subspace-based direction of arrival methods are discussed. The book concludes with an introduction to subspace tracking and shows how subspace tracking algorithms can be used to form an adaptive DOA estimator. Simulation software and additional bibliography are given at the end of the book. Table of Contents: Introduction / Background on Array Processing / Nonadaptive Direction of Arrival Estimation / Adaptive Direction of Arrival Estimation / Appendix
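
    The sketch below is a minimal delay-and-sum (conventional beamforming) DOA scan for a uniform linear array in Python/NumPy. The array geometry, source angle, and noise level are invented for the demo, and it is not one of the subspace or adaptive methods covered in the book.

        import numpy as np

        rng = np.random.default_rng(0)

        M = 8                 # number of array elements (illustrative)
        d = 0.5               # element spacing in wavelengths
        theta_src = 20.0      # true source direction in degrees (assumed for the demo)
        snapshots = 200

        def steering(theta_deg):
            """Steering vector of a uniform linear array with spacing d wavelengths."""
            k = 2 * np.pi * d * np.sin(np.radians(theta_deg))
            return np.exp(1j * k * np.arange(M))

        # Simulate snapshots: one narrowband source plus white noise
        s = (rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots)) / np.sqrt(2)
        noise = 0.1 * (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots)))
        X = np.outer(steering(theta_src), s) + noise

        R = X @ X.conj().T / snapshots        # sample covariance matrix

        # Conventional (delay-and-sum) spatial spectrum: scan candidate angles
        angles = np.arange(-90, 90.5, 0.5)
        power = [np.real(steering(a).conj() @ R @ steering(a)) for a in angles]
        print("estimated DOA:", angles[int(np.argmax(power))], "degrees")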

  • Neural Network Methods in Natural Language Processing

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Neural networks are a family of powerful machine learning models. This book focuses on the application of neural network models to natural language data. The first half of the book (Parts I and II) covers the basics of supervised machine learning and feed-forward neural networks, the basics of working with machine learning over language data, and the use of vector-based rather than symbolic representations for words. It also covers the computation-graph abstraction, which allows one to easily define and train arbitrary neural networks, and is the basis behind the design of contemporary neural network software libraries.

    The second part of the book (Parts III and IV) introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models. These architectures and techniques are the driving force behind state-of-the-art algorithms for machine translation, syntactic parsing, and many other applications. Finally, we also discuss tree-shaped networks, structured prediction, and the prospects of multi-task learning.
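
    As a rough illustration of the feed-forward networks over word embeddings covered in the first half of the book, the following NumPy sketch performs a forward pass for a single three-word window. The toy vocabulary and all dimensions are made up, and a real implementation would use a computation-graph library with automatic differentiation rather than hand-written forward code.

        import numpy as np

        rng = np.random.default_rng(0)

        vocab = {"the": 0, "cat": 1, "sat": 2, "down": 3}   # toy vocabulary (illustrative)
        d_emb, d_hid, n_classes = 8, 16, 3

        E  = rng.normal(scale=0.1, size=(len(vocab), d_emb))    # embedding table
        W1 = rng.normal(scale=0.1, size=(3 * d_emb, d_hid))     # hidden layer (3-word window)
        b1 = np.zeros(d_hid)
        W2 = rng.normal(scale=0.1, size=(d_hid, n_classes))     # output layer
        b2 = np.zeros(n_classes)

        def forward(window_words):
            """Forward pass: embed, concatenate, tanh hidden layer, softmax output."""
            x = np.concatenate([E[vocab[w]] for w in window_words])
            h = np.tanh(x @ W1 + b1)
            z = h @ W2 + b2
            p = np.exp(z - z.max())
            return p / p.sum()

        print(forward(["the", "cat", "sat"]))   # class probabilities for one window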

  • Scalability Challenges in Web Search Engines

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In this book, we aim to provide a fairly comprehensive overview of the scalability and efficiency challenges in large-scale web search engines. More specifically, we cover the issues involved in the design of three separate systems that are commonly available in every web-scale search engine: web crawling, indexing, and query processing systems. We present the performance challenges encountered in these systems and review a wide range of design alternatives employed as solutions to these challenges, specifically focusing on algorithmic and architectural optimizations. We discuss the available optimizations at different computational granularities, ranging from a single computer node to a collection of data centers. We provide hints to both the practitioners and theoreticians involved in the field about the way large-scale web search engines operate and the design choices they adopt. Moreover, we survey the efficiency literature, providing pointers to a large number of important research papers. Finally, we discuss some open research problems in the context of search engine efficiency.

  • Copyright Year: 2015

    Morgan and Claypool eBooks

    This book serves as a practical guide to simulation of 3D deformable solids using the Finite Element Method (FEM). It reviews a number of topics related to the theory and implementation of FEM approaches: measures of deformation, constitutive laws of nonlinear materials, tetrahedral discretizations, and model reduction techniques for real-time simulation. Simulations of deformable solids are important in many applications in computer graphics, including film special effects, computer games, and virtual surgery. The Finite Element Method has become a popular tool in many such applications. Variants of FEM catering to both offline and real-time simulation have had a mature presence in computer graphics literature. This book is designed for readers familiar with numerical simulation in computer graphics, who would like to obtain a cohesive picture of the various FEM simulation methods available, their strengths and weaknesses, and their applicability in various simulation scenarios. The book is also a practical implementation guide for the visual effects developer, offering a lean yet adequate synopsis of the underlying mathematical theory. Chapter 1 introduces the quantitative descriptions used to capture the deformation of elastic solids, the concept of strain energy, and discusses how force and stress result as a response to deformation. Chapter 2 reviews a number of constitutive models, i.e., analytical laws linking deformation to the resulting force, that have successfully been used in various graphics-oriented simulation tasks. Chapter 3 summarizes how deformation and force can be computed discretely on a tetrahedral mesh, and how an implicit integrator can be structured around this discretization. Finally, Chapter 4 presents the state of the art in model reduction techniques for real-time FEM solid simulation and discusses which techniques are suitable for which applications. Topics discussed in this chapter include linear modal analysis, modal warping, subspace simulation, and domain decomposition.

  • Speech Enhancement in the Karhunen-Loève Expansion Domain

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book is devoted to the study of the problem of speech enhancement, whose objective is the recovery of a signal of interest (i.e., speech) from noisy observations. Typically, the recovery process is accomplished by passing the noisy observations through a linear filter (or a linear transformation). Since both the desired speech and undesired noise are filtered at the same time, the most critical issue of speech enhancement resides in how to design a proper optimal filter that can fully take advantage of the difference between the speech and noise statistics to mitigate the noise effect as much as possible while keeping the perceived speech identical to its original form. The optimal filters can be designed either in the time domain or in a transform space. As the title indicates, this book focuses on developing and analyzing optimal filters in the Karhunen-Loève expansion (KLE) domain. We begin by describing the basic problem of speech enhancement and the fundamental principles for solving it in the time domain. We then explain how the problem can be equivalently formulated in the KLE domain. Next, we divide the general problem in the KLE domain into four groups, depending on whether interframe and interband information is accounted for, leading to four linear models for speech enhancement in the KLE domain. For each model, we introduce signal processing measures to quantify the performance of speech enhancement, discuss the formation of different cost functions, and address the optimization of these cost functions for the derivation of different optimal filters. Both theoretical analysis and experiments will be provided to study the performance of these filters, and the links between the KLE-domain and time-domain optimal filters will be examined. Table of Contents: Introduction / Problem Formulation / Optimal Filters in the Time Domain / Linear Models for Signal Enhancement in the KLE Domain / Optimal Filters in the KLE Domain with Model 1 / Optimal Filters in the KLE Domain with Model 2 / Optimal Filters in the KLE Domain with Model 3 / Optimal Filters in the KLE Domain with Model 4 / Experimental Study
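
    The following toy NumPy sketch illustrates the general idea of filtering in the KLE domain: transform noisy frames with the eigenvectors of the frame covariance, apply Wiener-style gains per KLE coefficient, and transform back. It assumes the noise variance is known (a real system would estimate it) and is not one of the book's specific Models 1-4.

        import numpy as np

        rng = np.random.default_rng(0)

        L, n_frames, sigma2 = 32, 400, 0.04   # frame length, frame count, noise variance (assumed known)

        # Toy "speech": a low-frequency oscillation, framed into columns
        t = np.arange(L * n_frames)
        clean = 0.5 * np.sin(2 * np.pi * t / 37.0)
        noisy = clean + np.sqrt(sigma2) * rng.standard_normal(t.size)
        Y = noisy.reshape(n_frames, L).T          # L x n_frames matrix of frames

        # KLE: eigendecomposition of the noisy-frame covariance
        R_y = Y @ Y.T / n_frames
        lam, Q = np.linalg.eigh(R_y)

        # Wiener-style gain per KLE coefficient: (signal variance) / (noisy variance)
        gain = np.clip((lam - sigma2) / np.maximum(lam, 1e-12), 0.0, 1.0)

        X_hat = Q @ (gain[:, None] * (Q.T @ Y))   # filter in the KLE domain, transform back
        enhanced = X_hat.T.reshape(-1)

        print("MSE before:", np.mean((noisy - clean) ** 2))
        print("MSE after :", np.mean((enhanced - clean) ** 2))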

  • Incomplete Data and Data Dependencies in Relational Databases

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The chase has long been used as a central tool to analyze dependencies and their effect on queries. It has been applied to different relevant problems in database theory such as query optimization, query containment and equivalence, dependency implication, and database schema design. Recent years have seen a renewed interest in the chase as an important tool in several database applications, such as data exchange and integration, query answering in incomplete data, and many others. It is well known that the chase algorithm might be non-terminating and thus, in order for it to find practical applicability, it is crucial to identify cases where its termination is guaranteed. Another important aspect to consider when dealing with the chase is that it can introduce null values into the database, thereby leading to incomplete data. Thus, in several scenarios where the chase is used the problem of dealing with data dependencies and incomplete data arises. This book discusses fundamental issues concerning data dependencies and incomplete data with a particular focus on the chase and its applications in different database areas. We report recent results about the crucial issue of identifying conditions that guarantee the chase termination. Different database applications where the chase is a central tool are discussed with particular attention devoted to query answering in the presence of data dependencies and database schema design. Table of Contents: Introduction / Relational Databases / Incomplete Databases / The Chase Algorithm / Chase Termination / Data Dependencies and Normal Forms / Universal Repairs / Chase and Database Applications
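
    As a minimal illustration of how a chase step can introduce labeled nulls, the sketch below applies one invented tuple-generating dependency (tgd) to a toy instance. The relations and the dependency are hypothetical and chosen so that the chase terminates; in general, as the abstract notes, it may not.

        import itertools

        # Source instance: Emp(name, dept)
        emp = {("alice", "cs"), ("bob", "math")}
        # Target relation to be populated: Head(dept, manager)
        head = set()

        # tgd (illustrative): for every Emp(n, d) there must exist some m with Head(d, m)
        null_counter = itertools.count(1)

        def chase_step():
            """Apply the tgd once wherever it is violated, inventing labeled nulls."""
            changed = False
            for (_, dept) in emp:
                if not any(d == dept for (d, _) in head):
                    head.add((dept, f"_N{next(null_counter)}"))   # fresh labeled null
                    changed = True
            return changed

        while chase_step():   # terminates for this dependency; in general the chase may not
            pass

        print(head)           # e.g. {('cs', '_N1'), ('math', '_N2')}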

  • Introduction to Reconfigurable Supercomputing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book covers technologies, applications, tools, languages, procedures, advantages, and disadvantages of reconfigurable supercomputing using Field Programmable Gate Arrays (FPGAs). The target audience is the community of users of High Performance Computers (HPC) who may benefit from porting their applications into a reconfigurable environment. As such, this book is intended to guide the HPC user through the many algorithmic considerations, hardware alternatives, usability issues, programming languages, and design tools that need to be understood before embarking on the creation of reconfigurable parallel codes. We hope to show that FPGA acceleration, based on the exploitation of data parallelism, pipelining, and concurrency, remains promising in view of the diminishing improvements in traditional processor and system design. Table of Contents: FPGA Technology / Reconfigurable Supercomputing / Algorithmic Considerations / FPGA Programming Languages / Case Study: Sorting / Alternative Technologies and Concluding Remarks

  • Power Electronics for Photovoltaic Power Systems

    Copyright Year: 2015

    Morgan and Claypool eBooks

    World energy demand has been increasing rapidly with population growth and rising standards of living. The world population has nearly doubled in the last 40 years, from 3.7 billion people to the present 7 billion people. It is anticipated that world population will grow towards 8 billion around 2030. Furthermore, conventional fossil fuel supplies are becoming unsustainable as energy demand in large emerging economies such as China and India rises tremendously; China is expected to increase its energy demand by 75% and India by 100% over the next 25 years. With dwindling natural resources, many countries throughout the world have increasingly invested in renewable resources such as photovoltaics (PV) and wind. The world has seen immense growth in global photovoltaic power generation over the last few decades. For example, in Australia, renewable resources represented nearly 15% of total power generation in 2013. Among renewable resources, solar and wind account for 38% of generation. In the near future, energy in the domestic and industrial sectors will become "ubiquitous," with consumers having multiple sources from which to get their energy. Another such prediction is that co-location of solar and electrical storage will see rapid growth in global domestic and industrial sectors; conventional power companies, which dominate the electricity market, will face increasing challenges in maintaining their incumbent business models. The efficiency, reliability, and cost-effectiveness of the power converters used to interface PV panels to the mains grid and other types of off-grid loads are of major concern in the process of system design. This book describes state-of-the-art power electronic converter topologies used in various PV power conversion schemes. It aims to present the reader with a wide variety of topologies applied in different circumstances so that the reader can make an educated choice for a given application.

  • Pragmatic Electrical Engineering: Fundamentals

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics. All of the material in the text is illustrated with completely-worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practical, applied look at the energy side of electrical systems. The author's "pragmatic" and applied style gives a unique and helpful "non-idealistic, practical, opinionated" introduction to the topic. Table of Contents: Basic Stuff / Power of the Sine / Three-Phase Power Systems / Transformers / Machines / Electromagnetics

  • Multi-Objective Decision Making

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Many real-world decision problems have multiple objectives. For example, when choosing a medical treatment plan, we want to maximize the efficacy of the treatment, but also minimize the side effects. These objectives typically conflict, e.g., we can often increase the efficacy of the treatment, but at the cost of more severe side effects. In this book, we outline how to deal with multiple objectives in decision-theoretic planning and reinforcement learning algorithms. To illustrate this, we employ the popular problem classes of multi-objective Markov decision processes (MOMDPs) and multi-objective coordination graphs (MO-CoGs).

    First, we discuss different use cases for multi-objective decision making, and why they often necessitate explicitly multi-objective algorithms. We advocate a utility-based approach to multi-objective decision making, i.e., that what constitutes an optimal solution to a multi-objective decision problem should be derived from the available information about user utility. We show how different assumptions about user utility and what types of policies are allowed lead to different solution concepts, which we outline in a taxonomy of multi-objective decision problems.

    Second, we show how to create new methods for multi-objective decision making using existing single-objective methods as a basis. Focusing on planning, we describe two ways of creating multi-objective algorithms: in the inner loop approach, the inner workings of a single-objective method are adapted to work with multi-objective solution concepts; in the outer loop approach, a wrapper is created around a single-objective method that solves the multi-objective problem as a series of single-objective problems. After discussing the creation of such methods for the planning setting, we discuss how these approaches apply to the learning setting.

    Next, we discuss three promising application domains for multi-objective decision-making algorithms: energy, health, and infrastructure and transportation. Finally, we conclude by outlining important open problems and promising future directions.
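
    The following sketch illustrates the outer-loop idea for linear user utilities: solve a series of single-objective problems for different weight vectors and keep the distinct optima. The candidate policy value vectors are hard-coded, invented numbers standing in for the output of a planner.

        import numpy as np

        # Value vectors of candidate policies (illustrative 2-objective returns),
        # e.g., (treatment efficacy, negated severity of side effects)
        policy_values = np.array([
            [1.0, 9.0],
            [4.0, 7.0],   # Pareto-optimal but never best for a linear utility (inside the hull)
            [6.0, 6.0],
            [8.0, 3.0],
            [9.0, 1.0],
            [5.0, 4.0],   # dominated by [6, 6]
        ])

        def solve_scalarized(w):
            """Single-objective 'solver': pick the policy maximizing w . V."""
            return int(np.argmax(policy_values @ w))

        # Outer loop: sweep weights and collect the set of policies that are optimal
        # for at least one weighting (an approximate convex coverage set).
        ccs = set()
        for w1 in np.linspace(0.0, 1.0, 101):
            ccs.add(solve_scalarized(np.array([w1, 1.0 - w1])))

        print("policies in the coverage set:", sorted(ccs))   # expect [0, 2, 3, 4]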

  • Multiculturalism and Information and Communication Technology

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Research on multiculturalism and information and communication technology (ICT) has been important to understanding recent history, planning for future large-scale initiatives, and understanding unrealized expectations for social and technological change. This interdisciplinary area of research has examined interactions between ICT and culture at the group and society levels. However, there is debate within the literature as to the nature of the relationship between culture and technology. In this synthesis, we suggest that the tensions result from the competing ideologies that drive researchers, allowing us to conceptualize the relationship between culture and ICT under three primary models, each with its own assumptions: 1) social informatics, 2) social determinism, and 3) technological determinism. Social informatics views the relationship to be one of sociotechnical interaction, in which culture and ICTs affect each other mutually and iteratively, rather than linearly; the vast majority of the literature approaches the relationship between ICT and culture under the assumptions of social informatics. From a socially deterministic perspective, ICTs are viewed as the dependent variable in the equation, whereas, from a technologically deterministic perspective, ICTs are an independent variable. The issues of multiculturalism and ICTs have attracted much scholarly attention and have been explored under a myriad of contexts, with substantial literature on global development, social and political issues, business and public administration, as well as education and scholarly collaboration. We synthesize here research in the areas of global development, social and political issues, and business collaboration. Finally, we conclude by proposing under-explored areas for future research directions.

  • Natural Language Processing for Historical Texts

    Copyright Year: 2012

    Morgan and Claypool eBooks

    More and more historical texts are becoming available in digital form. Digitization of paper documents is motivated by the aim of preserving cultural heritage and making it more accessible, both to laypeople and scholars. As digital images cannot be searched for text, digitization projects increasingly strive to create digital text, which can be searched and otherwise automatically processed, in addition to facsimiles. Indeed, the emerging field of digital humanities heavily relies on the availability of digital text for its studies. Together with the increasing availability of historical texts in digital form, there is a growing interest in applying natural language processing (NLP) methods and tools to historical texts. However, the specific linguistic properties of historical texts -- the lack of standardized orthography, in particular -- pose special challenges for NLP. This book aims to give an introduction to NLP for historical texts and an overview of the state of the art in this field. The book starts with an overview of methods for the acquisition of historical texts (scanning and OCR), discusses text encoding and annotation schemes, and presents examples of corpora of historical texts in a variety of languages. The book then discusses specific methods, such as creating part-of-speech taggers for historical languages or handling spelling variation. A final chapter analyzes the relationship between NLP and the digital humanities. Certain recently emerging textual genres, such as SMS, social media, and chat messages, or newsgroup and forum postings share a number of properties with historical texts, for example, nonstandard orthography and grammar, and profuse use of abbreviations. The methods and techniques required for the effective processing of historical texts are thus also of interest for research in other domains. Table of Contents: Introduction / NLP and Digital Humanities / Spelling in Historical Texts / Acquiring Historical Texts / Text Encoding and Annotation Schemes / Handling Spelling Variation / NLP Tools for Historical Languages / Historical Corpora / Conclusion / Bibliography

  • Humanitarian Engineering

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Humanitarian Engineering reviews the development of engineering as a distinct profession and of the humanitarian movement as a special socio-political practice. Having noted that the two developments were situated in the same geographical and historical space -- that is, in Europe and North America beginning in the 1700s -- the book argues for a mutual influence and synthesis that has previously been lacking. In this spirit, the first of two central chapters describes humanitarian engineering as the artful drawing on science to direct the resources of nature with active compassion to meet the basic needs of all -- especially the powerless, poor, or otherwise marginalized. A second central chapter then considers strategies for education in humanitarian engineering so conceived. Two final chapters consider challenges and implications. Table of Contents: Engineering / Humanitarianism / Humanitarian Engineering / Humanitarian Engineering Education / Challenges / Conclusion: Humanizing Technology

  • Quality of Service in Wireless Networks Over Unlicensed Spectrum

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This Synthesis Lecture presents a discussion of Quality of Service (QoS) in wireless networks over unlicensed spectrum. The topic is presented from the point of view of protocols for wireless networks (e.g., 802.11) rather than the physical layer point of view usually discussed for cellular networks in the licensed wireless spectrum. A large number of mobile multimedia wireless applications are being deployed over WiFi (IEEE 802.11) and Bluetooth wireless networks, and the number will increase in the future as more phones, tablets, and laptops are equipped with these unlicensed spectrum wireless interfaces. Achieving QoS objectives in wireless networks is challenging due to limited wireless resources, interference between wireless nodes, wireless shared media, node mobility, and diverse topologies. The author presents the QoS problem as (1) an optimization problem with different constraints arising from interference, mobility, and limited wireless resources, and (2) an algorithmic problem with fundamental algorithmic functions within wireless resource management and protocols. Table of Contents: Preface / Basics of Quality of Service in Wireless Networks / QoS-Aware Resource Allocation / Bandwidth Management / Delay Management / Routing / Acknowledgment / References / Author Biography

  • Lying by Approximation: The Truth about Finite Element Analysis

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In teaching an introduction to the finite element method at the undergraduate level, a prudent mix of theory and applications is often sought. In many cases, analysts use the finite element method to perform parametric studies on potential designs to size parts, weed out less desirable design scenarios, and predict system behavior under load. In this book, we discuss common pitfalls encountered by many finite element analysts, in particular, students encountering the method for the first time. We present a variety of simple problems in axial, bending, torsion, and shear loading that combine the students' knowledge of theoretical mechanics, numerical methods, and approximations particular to the finite element method itself. We also present case studies in which analyses are coupled with experiments to emphasize validation, illustrate where interpretations of numerical results can be misleading, and what can be done to allay such tendencies. Challenges in presenting the necessary mix of theory and applications in a typical undergraduate course are discussed. We also provide a list of tips and rules of thumb for applying the method in practice. Table of Contents: Preface / Acknowledgments / Guilty Until Proven Innocent / Let's Get Started / Where We Begin to Go Wrong / It's Only a Model / Wisdom Is Doing It / Summary / Afterword / Bibliography / Authors' Biographies
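
    As a minimal example of the kind of simple axial-loading problem mentioned above, the sketch below assembles and solves a 1D bar fixed at one end with a point load at the other, using two-node linear elements. The material, geometry, and load values are arbitrary illustrative numbers.

        import numpy as np

        E, A, L_total, P = 200e9, 1e-4, 2.0, 1000.0   # Pa, m^2, m, N (illustrative values)
        n_elem = 4
        n_nodes = n_elem + 1
        Le = L_total / n_elem

        # Assemble the global stiffness matrix from 2-node linear elements
        K = np.zeros((n_nodes, n_nodes))
        ke = (E * A / Le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        for e in range(n_elem):
            K[e:e + 2, e:e + 2] += ke

        F = np.zeros(n_nodes)
        F[-1] = P                      # axial point load at the free end

        # Apply the fixed boundary condition at node 0 and solve the reduced system
        u = np.zeros(n_nodes)
        u[1:] = np.linalg.solve(K[1:, 1:], F[1:])

        print("FEM tip displacement:", u[-1])
        print("exact PL/(EA)       :", P * L_total / (E * A))   # linear elements are exact here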

  • Developing Embedded Software using DaVinci and OMAP Technology

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book discusses how to develop embedded products using DaVinci & OMAP Technology from Texas Instruments Incorporated. It presents a single software platform for diverse hardware platforms. DaVinci & OMAP Technology refers to the family of processors, development tools, software products, and support. While DaVinci Technology is driven by the needs of consumer video products such as IP network cameras, networked projectors, digital signage, and portable media players, OMAP Technology is driven by the needs of wireless products such as smart phones. Texas Instruments offers a wide variety of processing devices to meet users' price and performance needs. These vary from single digital signal processing devices to complex, system-on-chip (SoC) devices with multiple processors and peripherals. As a software developer, you may ask: Do I need to become an expert in signal processing and learn the details of these complex devices before I can use them in my application? As a senior executive, you may wonder: How can I reduce my engineering development cost? How can I move from one processor to another from Texas Instruments without incurring a significant development cost? This book addresses these questions with sample code and gives insight into the software architecture and associated component software products that make up this software platform. As an example, we show how we develop an IP network camera. Using this software platform, you can choose to focus on the application and quickly create a product without having to learn the details of the underlying hardware or signal processing algorithms. Alternatively, you can choose to differentiate at both the application and the signal processing layer by developing and adding your algorithms using the xDAIS for Digital Media, xDM, guidelines for component software. Finally, you may use one code base across different hardware platforms. Table of Contents: Software Platform / More about xDM, VISA, & CE / Building a Product Based on DaVinci Technology / Reducing Development Cost / eXpressDSP Digital Media (xDM) / Sample Application Using xDM / Embedded Peripheral Software Interface (EPSI) / Sample Application Using EPSI / Sample Application Using EPSI and xDM / IP Network Camera on DM355 Using TI Software / Adding your secret sauce to the Signal Processing Layer (SPL) / Further Reading

  • Graph-Based Semi-Supervised Learning

    Copyright Year: 2014

    Morgan and Claypool eBooks

    While labeled data is expensive to prepare, ever-increasing amounts of unlabeled data are becoming widely available. In order to adapt to this phenomenon, several semi-supervised learning (SSL) algorithms, which learn from labeled as well as unlabeled data, have been developed. In a separate line of work, researchers have started to realize that graphs provide a natural way to represent data in a variety of domains. Graph-based SSL algorithms, which bring together these two lines of work, have been shown to outperform the state-of-the-art in many applications in speech processing, computer vision, natural language processing, and other areas of Artificial Intelligence. Recognizing this promising and emerging area of research, this synthesis lecture focuses on graph-based SSL algorithms (e.g., label propagation methods). Our hope is that after reading this book, the reader will walk away with the following: (1) an in-depth knowledge of the current state-of-the-art in graph-based SSL algorithms, and the ability to implement them; (2) the ability to decide on the suitability of graph-based SSL methods for a problem; and (3) familiarity with different applications where graph-based SSL methods have been successfully applied. Table of Contents: Introduction / Graph Construction / Learning and Inference / Scalability / Applications / Future Work / Bibliography / Authors' Biographies / Index
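
    The sketch below is a minimal label-propagation example of the kind of graph-based SSL algorithm mentioned above: iterate Y <- D^(-1) W Y while clamping the labeled nodes. The graph and labels are invented for illustration.

        import numpy as np

        # Symmetric weighted adjacency matrix of a 6-node graph (illustrative)
        W = np.array([
            [0, 1, 1, 0, 0, 0],
            [1, 0, 1, 0, 0, 0],
            [1, 1, 0, 1, 0, 0],
            [0, 0, 1, 0, 1, 1],
            [0, 0, 0, 1, 0, 1],
            [0, 0, 0, 1, 1, 0],
        ], dtype=float)

        labels = {0: 0, 5: 1}            # node -> class for the two labeled nodes
        n, n_classes = W.shape[0], 2

        Y = np.zeros((n, n_classes))
        for node, c in labels.items():
            Y[node, c] = 1.0

        D_inv = np.diag(1.0 / W.sum(axis=1))
        for _ in range(100):             # propagate, then clamp the labeled nodes
            Y = D_inv @ W @ Y
            for node, c in labels.items():
                Y[node] = 0.0
                Y[node, c] = 1.0

        print(Y.argmax(axis=1))          # expected: [0 0 0 1 1 1]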

  • The Reaction Wheel Pendulum

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This monograph describes the Reaction Wheel Pendulum, the newest inverted-pendulum-like device for control education and research. We discuss the history and background of the reaction wheel pendulum and other similar experimental devices. We develop mathematical models of the reaction wheel pendulum in depth, including linear and nonlinear models, and models of the sensors and actuators that are used for feedback control. We treat various aspects of the control problem, from linear control of the motor, to stabilization of the pendulum about an equilibrium configuration using linear control, to the nonlinear control problem of swingup control. We also discuss hybrid and switching control, which is useful for switching between the swingup and balance controllers. We also discuss important practical issues such as friction modeling and friction compensation, quantization of sensor signals, and saturation. This monograph can be used as a supplement for courses in feedback control at the undergraduate level, courses in mechatronics, or courses in linear and nonlinear state space control at the graduate level. It can also be used as a laboratory manual and as a reference for research in nonlinear control.

  • Customizable Computing

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis Lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory customization, and interconnect optimization. In addition to a discussion of the general techniques and classification of different approaches used in each area, we also highlight and illustrate some of the most successful design examples in each category and discuss their impact on performance and energy efficiency. We hope that this work captures the state-of-the-art research and development of customizable architectures and serves as a useful reference basis for further research, design, and implementation for large-scale deployment in future computing systems.

  • PSpice for Analog Communications Engineering

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In PSpice for Analog Communications Engineering we simulate the difficult principles of analog modulation using the superb free simulation software Cadence Orcad PSpice V10.5. While use is made of analog behavioral model parts (ABM), we use actual circuitry in most of the simulation circuits. For example, we use the 4-quadrant multiplier IC AD633 as a modulator and import real speech as the modulating source, and look at the trapezoidal method for measuring the modulation index. Modulation is the process of relocating signals to different parts of the radio frequency spectrum by modifying certain parameters of the carrier in accordance with the modulating/information signals. In amplitude modulation, the modulating source changes the carrier amplitude, but in frequency modulation it causes the carrier frequency to change (and in phase modulation, the carrier phase). The digital equivalents of these modulation techniques are examined in PSpice for Digital Communications Engineering, where we examine QAM, FSK, PSK, and variants. We examine a range of oscillators and plot Nyquist diagrams showing the marginal stability of these systems. The superheterodyne principle, the backbone of modern receivers, is simulated using discrete components, followed by simulations of complete AM and FM receivers. In this exercise we examine the problems of matching individual stages and the use of double-tuned RF circuits to accommodate the large FM signal bandwidth.
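
    As a simple numerical counterpart to the AM experiments described above (a Python/NumPy sketch, not a PSpice schematic), the following generates an AM waveform and recovers the modulation index from the envelope extremes. The tone frequencies and the index are arbitrary illustrative values.

        import numpy as np
        from scipy.signal import hilbert   # envelope detection via the analytic signal

        fs = 100_000                       # sample rate, Hz (illustrative values throughout)
        t = np.arange(0, 0.01, 1 / fs)
        fc, fm, m = 10_000, 500, 0.6       # carrier, modulating tone, modulation index

        message = np.cos(2 * np.pi * fm * t)
        am = (1 + m * message) * np.cos(2 * np.pi * fc * t)

        envelope = np.abs(hilbert(am))     # ideal envelope detector
        e_max, e_min = envelope.max(), envelope.min()
        # m = (Emax - Emin) / (Emax + Emin), the same quantity the trapezoidal method estimates
        print("estimated modulation index:", (e_max - e_min) / (e_max + e_min))   # ~0.6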

  • A Short Introduction to Preferences: Between AI and Social Choice

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Computational social choice is an expanding field that merges classical topics like economics and voting theory with more modern topics like artificial intelligence, multiagent systems, and computational complexity. This book provides a concise introduction to the main research lines in this field, covering aspects such as preference modelling, uncertainty reasoning, social choice, stable matching, and computational aspects of preference aggregation and manipulation. The book is centered around the notion of preference reasoning, both in the single-agent and the multi-agent setting. It presents the main approaches to modeling and reasoning with preferences, with particular attention to two popular and powerful formalisms, soft constraints and CP-nets. The authors consider preference elicitation and various forms of uncertainty in soft constraints. They review the most relevant results in voting, with special attention to computational social choice. Finally, the book considers preferences in matching problems. The book is intended for students and researchers who may be interested in an introduction to preference reasoning and multi-agent preference aggregation, and who want to know the basic notions and results in computational social choice. Table of Contents: Introduction / Preference Modeling and Reasoning / Uncertainty in Preference Reasoning / Aggregating Preferences / Stable Marriage Problems

  • Location Systems: An Introduction to the Technology Behind Location Awareness

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Advances in electronic location technology and the coming of age of mobile computing have opened the door for location-aware applications to permeate all aspects of everyday life. Location is at the core of a large number of high-value applications ranging from the life-and-death context of emergency response to serendipitous social meet-ups. For example, the market for GPS products and services alone is expected to grow to US$200 billion by 2015. Unfortunately, there is no single location technology that is good for every situation and exhibits high accuracy, low cost, and universal coverage. In fact, high accuracy and good coverage seldom coexist, and when they do, it comes at an extreme cost. Instead, the modern localization landscape is a kaleidoscope of location systems based on a multitude of different technologies including satellite, mobile telephony, 802.11, ultrasound, and infrared among others. This lecture introduces researchers and developers to the most popular technologies and systems for location estimation and the challenges and opportunities that accompany their use. For each technology, we discuss the history of its development, the various systems that are based on it, and their trade-offs and their effects on cost and performance. We also describe technology-independent algorithms that are commonly used to smooth streams of location estimates and improve the accuracy of object tracking. Finally, we provide an overview of the wide variety of application domains where location plays a key role, and discuss opportunities and new technologies on the horizon. Table of Contents: Introduction / The Global Positioning System / Infrared and Ultrasonic Systems / Location Estimation with 802.11 / Cellular-Based Systems / Other Approaches / Improving Localization Accuracy / Location-Based Applications and Services / Challenges and Opportunities / References
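
    One of the simplest technology-independent smoothers alluded to above is exponential smoothing of a stream of noisy position fixes. The sketch below is illustrative only, with invented coordinates and noise levels; real trackers often use Kalman or particle filters instead.

        import random

        random.seed(0)

        def smooth(fixes, alpha):
            """Exponentially smooth a stream of (lat, lon) fixes; alpha in (0, 1]."""
            est, out = None, []
            for lat, lon in fixes:
                if est is None:
                    est = (lat, lon)
                else:
                    est = (alpha * lat + (1 - alpha) * est[0],
                           alpha * lon + (1 - alpha) * est[1])
                out.append(est)
            return out

        # Simulated fixes: a stationary user near (47.6097, -122.3331) with GPS-like jitter
        true_lat, true_lon = 47.6097, -122.3331
        fixes = [(true_lat + random.gauss(0, 1e-4), true_lon + random.gauss(0, 1e-4))
                 for _ in range(20)]

        smoothed = smooth(fixes, alpha=0.3)      # lower alpha = smoother but laggier
        print("last raw fix      :", fixes[-1])
        print("last smoothed fix :", smoothed[-1])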

  • Copyright Year: 2017

    Morgan and Claypool eBooks

    This synthesis lecture presents an intuitive introduction to the mathematics of motion and deformation in computer graphics. Starting with familiar concepts in graphics, such as Euler angles, quaternions, and affine transformations, we illustrate that a mathematical theory behind these concepts enables us to develop the techniques for efficient/effective creation of computer animation.

    This book, therefore, serves as a good guidepost to mathematics (differential geometry and Lie theory) for students of geometric modeling and animation in computer graphics. Experienced developers and researchers will also benefit from this book, since it gives a comprehensive overview of mathematical approaches that are particularly useful in character modeling, deformation, and animation.
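
    As a small example of the kind of concept listed above, the sketch below rotates a vector with a quaternion via q v q*. It is illustrative Python/NumPy, not code from the book.

        import numpy as np

        def quat_mul(a, b):
            """Hamilton product of quaternions given as (w, x, y, z)."""
            w1, x1, y1, z1 = a
            w2, x2, y2, z2 = b
            return np.array([
                w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2,
            ])

        def rotate(v, axis, angle):
            """Rotate vector v by `angle` radians about the unit `axis` using q v q*."""
            axis = np.asarray(axis, dtype=float)
            axis /= np.linalg.norm(axis)
            q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
            q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
            return quat_mul(quat_mul(q, np.concatenate([[0.0], v])), q_conj)[1:]

        print(rotate(np.array([1.0, 0.0, 0.0]), [0, 0, 1], np.pi / 2))   # ~[0, 1, 0]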

  • User-Centered Agile Methods

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With the introduction and popularization of Agile methods of software development, existing relationships and working agreements between user experience groups and developers are being disrupted. Agile methods introduce new concepts: the Product Owner, the Customer (but not the user), short iterations, User Stories. Where do UX professionals fit in this new world? Agile methods also bring a new mindset -- no big design, no specifications, minimal planning -- which conflict with the needs of UX design. This lecture discusses the key elements of Agile for the UX community and describes strategies UX people can use to contribute effectively in an Agile team, overcome key weaknesses in Agile methods as typically implemented, and produce a more robust process and more successful designs. We present a process combining the best practices of Contextual Design, a leading approach to user-centered design, with those of Agile development. Table of Contents: Introduction / Common Agile Methods / Agile Culture / Best Practices for Integrating UX with Agile / Structure of a User-Centered Agile Process / Structuring Projects / Conclusion

  • Data Representations, Transformations, and Statistics for Visual Reasoning

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Analytical reasoning techniques are methods by which users explore their data to obtain insight and knowledge that can directly support situational awareness and decision making. Recently, the analytical reasoning process has been augmented through the use of interactive visual representations and tools which utilize cognitive, design, and perceptual principles. These tools are commonly referred to as visual analytics tools, and the underlying methods and principles have roots in a variety of disciplines. This chapter provides young researchers with an introduction to and overview of common visual representations and statistical analysis methods utilized in a variety of visual analytics systems. The application and design of visualization and analytical algorithms are subject to design decisions, parameter choices, and many conflicting requirements. As such, this chapter attempts to provide an initial set of guidelines for the creation of the visual representation, including pitfalls and areas where the graphics can be enhanced through interactive exploration. Basic analytical methods are explored as a means of enhancing the visual analysis process, moving from visual analysis to visual analytics. Table of Contents: Data Types / Color Schemes / Data Preconditioning / Visual Representations and Analysis / Summary

  • Oral Communication Excellence for Engineers and Scientists

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Many of us have implemented oral communication instruction in our design courses, lab courses, and other courses where students give presentations. Others have students give presentations without instruction on how to become a better presenter. Many of us, then, could use a concise book that guides us on what instruction on oral communication should include, based on input from executives from different settings. This instruction will help our students get jobs and make them more likely to move up the career ladder, especially in these hard economic times. Oral Communication Excellence for Engineers and Scientists: Based on Executive Input is the tool we need. It is based on input from over 75 executives with engineering or science degrees, leading organizations that employ engineers and scientists. For the presentation chapter, the executives described what makes a “stellar presentation.” And for every other chapter, they gave input on, for example, how to communicate effectively in meetings and in teams, how to excel at phone communication, how to communicate electronically to supplement oral communication, and how to meet the challenges of oral communication. They also provided tips on cross-cultural communication, listening, choosing the appropriate medium for a communication, elevator pitches, and posters; and using oral communication to network on the job. Oral Communication Excellence for Engineers and Scientists includes exercises and activities for students and professionals, based on instruction that has improved Georgia Tech’s students’ presentation skills at a statistically significant level. Slides demonstrating best practices are included from Capstone Design students around the country. Table of Contents: Introduction / Background Preparation / Presentation: Customizing to your Audience / Presentation: Telling your Story / Presentation: Displaying Key Information / Delivering the Presentation / Other Oral Communication Skills / Advanced Oral Communication Skills / References

  • Modern Image Quality Assessment

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This lecture book is about objective image quality assessment, where the aim is to provide computational models that can automatically predict perceptual image quality. The early years of the 21st century have witnessed a tremendous growth in the use of digital images as a means for representing and communicating information. A considerable percentage of this literature is devoted to methods for improving the appearance of images, or for maintaining the appearance of images that are processed. Nevertheless, the quality of digital images, processed or otherwise, is rarely perfect. Images are subject to distortions during acquisition, compression, transmission, processing, and reproduction. To maintain, control, and enhance the quality of images, it is important for image acquisition, management, communication, and processing systems to be able to identify and quantify image quality degradations. The goals of this book are as follows: a) to introduce the fundamentals of image quality assessment, and to explain the relevant engineering problems, b) to give a broad treatment of the current state-of-the-art in image quality assessment, by describing leading algorithms that address these engineering problems, and c) to provide new directions for future research, by introducing recent models and paradigms that significantly differ from those used in the past. The book is written to be accessible to university students curious about the state-of-the-art of image quality assessment, expert industrial R&D engineers seeking to implement image/video quality assessment systems for specific applications, and academic theorists interested in developing new algorithms for image quality assessment or using existing algorithms to design or optimize other image processing applications.
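
    For contrast with the perceptual models the book develops, the following sketch computes the simplest full-reference measures, MSE and PSNR, on a synthetic stand-in for a reference image and a distorted copy; all array contents are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        reference = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in 8-bit image
        distorted = np.clip(reference + rng.normal(0, 5, reference.shape), 0, 255)

        mse = np.mean((reference - distorted) ** 2)
        psnr = 10 * np.log10(255.0 ** 2 / mse)   # peak signal-to-noise ratio in dB
        print(f"MSE = {mse:.2f}, PSNR = {psnr:.2f} dB")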

  • Privacy for Location-based Services

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Sharing of location data enables numerous exciting applications, such as location-based queries, location-based social recommendations, monitoring of traffic and air pollution levels, etc. Disclosing exact user locations raises serious privacy concerns, as locations may give away sensitive information about individuals' health status, alternative lifestyles, political and religious affiliations, etc. Preserving location privacy is an essential requirement towards the successful deployment of location-based applications. These lecture notes provide an overview of the state-of-the-art in location privacy protection. A diverse body of solutions is reviewed, including methods that use location generalization, cryptographic techniques or differential privacy. The most prominent results are discussed, and promising directions for future work are identified.

  • Circuit Analysis with Multisim

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or both. Fourier analysis is discussed in the context of transient analysis. Next, we treat AC analysis to simulate the frequency response of a circuit. Then, we introduce diodes, transistors, and circuits composed of them and perform DC, transient, and AC analyses. The book ends with simulation of digital circuits. A practical approach is followed through the chapters, using step-by-step examples to introduce new Multisim circuit elements, tools, analyses, and virtual instruments for measurement. The examples are clearly commented and illustrated. The different tools available in Multisim are used when appropriate so readers learn which analyses are available to them. This is part of the learning outcomes that should result after each set of end-of-chapter exercises is worked out. Table of Contents: Introduction to Circuit Simulation / Resistive Circuits / Time Domain Analysis -- Transient Analysis / Frequency Domain Analysis -- AC Analysis / Semiconductor Devices / Digital Circuits
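
    DC analysis of a resistive circuit ultimately reduces to solving the nodal equations G v = i. The sketch below does this directly in Python/NumPy for a small, invented two-node circuit; it is not Multisim usage, just the underlying computation a simulator performs.

        import numpy as np

        # Circuit (illustrative): 1 mA source into node 1, R1 = 1 kΩ between nodes 1 and 2,
        # R2 = 2 kΩ from node 2 to ground, R3 = 10 kΩ from node 1 to ground.
        R1, R2, R3, I_src = 1e3, 2e3, 10e3, 1e-3

        G = np.array([
            [1/R1 + 1/R3, -1/R1       ],
            [-1/R1,        1/R1 + 1/R2],
        ])
        I = np.array([I_src, 0.0])

        v = np.linalg.solve(G, I)        # node voltages relative to ground
        print(f"V(node 1) = {v[0]:.3f} V, V(node 2) = {v[1]:.3f} V")   # ~2.308 V and ~1.538 V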

  • Relational and XML Data Exchange

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Data exchange is the problem of finding an instance of a target schema, given an instance of a source schema and a specification of the relationship between the source and the target. Such a target instance should correctly represent information from the source instance under the constraints imposed by the target schema, and it should allow one to evaluate queries on the target instance in a way that is semantically consistent with the source data. Data exchange is an old problem that re-emerged as an active research topic recently, due to the increased need for exchange of data in various formats, often in e-business applications. In this lecture, we give an overview of the basic concepts of data exchange in both relational and XML contexts. We give examples of data exchange problems, and we introduce the main tasks that need to be addressed. We then discuss relational data exchange, concentrating on issues such as relational schema mappings, materializing target instances (including canonical solutions and cores), query answering, and query rewriting. After that, we discuss metadata management, i.e., handling schema mappings themselves. We pay particular attention to operations on schema mappings, such as composition and inverse. Finally, we describe both data exchange and metadata management in the context of XML. We use mappings based on transforming tree patterns, and we show that they lead to a host of new problems that did not arise in the relational case, but they need to be addressed for XML. These include consistency issues for mappings and schemas, as well as imposing tighter restrictions on mappings and queries to achieve tractable query answering in data exchange. Table of Contents: Overview / Relational Mappings and Data Exchange / Metadata Management / XML Mappings and Data Exchange View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantics in Mobile Sensing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The dramatic progress of smartphone technologies has ushered in a new era of mobile sensing, where traditional wearable on-body sensors are being rapidly superseded by various embedded sensors in our smartphones. For example, a typical smartphone today has at the very least a GPS, WiFi, Bluetooth, triaxial accelerometer, and gyroscope. Alongside, new accessories are emerging such as proximity, magnetometer, barometer, temperature, and pressure sensors. Even the default microphone can act as an acoustic sensor to track noise exposure, for example. These sensors act as a "lens" to understand the user's context along different dimensions. Data can be passively collected from these sensors without interrupting the user. As a result, this new era of mobile sensing has fueled significant interest in understanding what can be extracted from such sensor data both instantaneously as well as considering volumes of time series from these sensors. For example, GPS logs can be used to automatically determine the significant places associated with a user's life (e.g., home, office, shopping areas). The logs may also reveal travel patterns, and how a user moves from one place to another (e.g., driving or using public transport). These may be used to proactively inform the user about delays and relevant promotions from shops along his or her "regular" route. Similarly, accelerometer logs can be used to measure a user's average walking speed, compute step counts, perform gait identification, and estimate calories burnt per day. The key objective is to provide better services to end users. The objective of this book is to inform the reader of the methodologies and techniques for extracting meaningful information (called "semantics") from sensors on our smartphones. These techniques form the cornerstone of several application areas utilizing smartphone sensor data. We discuss technical challenges and algorithmic solutions for modeling and mining knowledge from smartphone-resident sensor data streams. This book devotes two chapters to a deep dive into a set of highly available, commoditized sensors: the positioning sensor (GPS) and the motion sensor (accelerometer). Furthermore, this book has a chapter devoted to energy-efficient computation of semantics, as battery life is a major concern for user experience. View full abstract»
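    To make the accelerometer example above concrete, a deliberately naive step counter can simply threshold the magnitude of the acceleration signal and count upward crossings; real systems of the kind the book discusses use more robust filtering. The sketch below is illustrative only, and the threshold value is an arbitrary assumption.

        import math

        def count_steps(samples, threshold=11.0):
            """Naive step counter over (ax, ay, az) accelerometer readings in m/s^2.
            Counts upward crossings of a magnitude threshold (illustrative value)."""
            steps, above = 0, False
            for ax, ay, az in samples:
                mag = math.sqrt(ax * ax + ay * ay + az * az)
                if mag > threshold and not above:   # rising edge: count one step
                    steps += 1
                    above = True
                elif mag <= threshold:
                    above = False
            return steps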

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Interactive Technologies for Autism:A Review

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Development, deployment, and evaluation of interactive technologies for individuals with autism have been rapidly increasing over the last decade. There is great promise for the use of these types of technologies to enrich interventions, facilitate communication, and support data collection. Emerging technologies in this area also have the potential to enhance assessment and diagnosis of individuals with autism, to understand the nature of autism, and to help researchers conduct basic and applied research. This book provides an in-depth review of the historical and state-of-the-art use of technology by and for individuals with autism. The intention is to give readers a comprehensive background in order to understand what has been done and what promises and challenges lie ahead. By providing a classification scheme and general review, this book can also help technology designers and researchers better understand what technologies have been successful, what problems remain open, and where innovations can further address challenges and opportunities for individuals with autism and the variety of stakeholders connected to them. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Articular Cartilage Tissue Engineering

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Cartilage injuries in children and adolescents are increasingly observed, with roughly 20% of knee injuries in adolescents requiring surgery. In the US alone, costs of osteoarthritis (OA) are in excess of $65 billion per year (both medical costs and lost wages). Comorbidities are common with OA and are also costly to manage. Articular cartilage's low friction and high capacity to bear load make it critical in the movement of one bone against another, and its lack of a sustained natural healing response has necessitated a plethora of therapies. Tissue engineering is an emerging technology at the threshold of translation to clinical use. Replacement cartilage can be constructed in the laboratory to recapitulate the functional requirements of native tissues. This book outlines the biomechanical and biochemical characteristics of articular cartilage in both normal and pathological states, through development and aging. It also provides a historical perspective of past and current cartilage treatments and previous tissue engineering efforts. Methods and standards for evaluating the function of engineered tissues are discussed, and current cartilage products are presented with an analysis of the United States Food and Drug Administration regulatory pathways that products must follow to market. This book was written to serve as a reference for researchers seeking to learn about articular cartilage, for undergraduate and graduate level courses, and as a compendium of articular cartilage tissue engineering design criteria. Table of Contents: Hyaline Articular Cartilage / Cartilage Aging and Pathology / In Vitro / Bioreactors / Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Practical Global Illumination with Irradiance Caching

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Irradiance caching is a ray tracing-based technique for computing global illumination on diffuse surfaces. Specifically, it addresses the computation of indirect illumination bouncing off one diffuse object onto another. The sole purpose of irradiance caching is to make this computation reasonably fast. The main idea is to perform the indirect illumination sampling only at a selected set of locations in the scene, store the results in a cache, and reuse the cached value at other points through fast interpolation. This book is for anyone interested in making a production-ready implementation of irradiance caching that reliably renders artifact-free images. Since its invention 20 years ago, the irradiance caching algorithm has been successfully used to accelerate global illumination computation in the Radiance lighting simulation system. Its widespread use had to wait until computers became fast enough to consider global illumination in film production rendering. Since then, its use has become ubiquitous. Virtually all commercial and open-source rendering software base the global illumination computation upon irradiance caching. Although elegant and powerful, the algorithm in its basic form often fails to produce artifact-free images. Unfortunately, practical information on implementing the algorithm is scarce. The main objective of this book is to show the irradiance caching algorithm along with all the details and tricks upon which the success of its practical implementation is dependent. In addition, we discuss some extensions of the basic algorithm, such as a GPU implementation for interactive global illumination computation and temporal caching that exploits temporal coherence to suppress flickering in animations. Our goal is to show the material without being overly theoretical. However, the reader should have some basic understanding of rendering concepts, ray tracing in particular. Familiarity with global illumination is useful but not necessary to read this book. Table of Contents: Introduction to Ray Tracing and Global Illumination / Irradiance Caching Core / Practical Rendering with Irradiance Caching / Irradiance Caching in a Complete Global Illumination / Irradiance Caching on Graphics Hardware / Temporal Irradiance Caching View full abstract»
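    The core idea just described (sample indirect illumination sparsely, cache it, and interpolate elsewhere) can be sketched in a few lines of Python; this is an editorial illustration, not code from the book. The weighting follows the commonly cited Ward-style scheme in spirit, while the exact formula, the record fields, and the tolerance value alpha are illustrative assumptions.

        import math

        def dot(a, b):
            return sum(x * y for x, y in zip(a, b))

        def record_weight(rec, p, n):
            """Interpolation weight of cached record rec at shading point p with normal n.
            rec = {"p": sample position, "n": sample normal, "E": scalar irradiance,
                   "R": harmonic-mean distance to the surfaces seen when E was sampled}."""
            d = math.dist(rec["p"], p) / rec["R"] + math.sqrt(max(0.0, 1.0 - dot(rec["n"], n)))
            return float("inf") if d == 0 else 1.0 / d

        def irradiance(p, n, cache, sample_hemisphere, alpha=0.15):
            """Interpolate irradiance at (p, n) from usable cache records; otherwise
            take a fresh (expensive) hemisphere sample and add it to the cache.
            alpha is the error tolerance: smaller means more samples, fewer artifacts."""
            usable = [(w, r) for r in cache if (w := record_weight(r, p, n)) > 1.0 / alpha]
            if usable:
                return sum(w * r["E"] for w, r in usable) / sum(w for w, _ in usable)
            rec = sample_hemisphere(p, n)   # traces many rays; returns a new cache record
            cache.append(rec)
            return rec["E"]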

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Taxobook:Principles and Practices of Building Taxonomies, Part 2 of a 3-Part Series

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book outlines the basic principles of creation and maintenance of taxonomies and thesauri. It also provides step by step instructions for building a taxonomy or thesaurus and discusses the various ways to get started on a taxonomy construction project. Often, the first step is to get management and budgetary approval, so I start this book with a discussion of reasons to embark on the taxonomy journey. From there I move on to a discussion of metadata and how taxonomies and metadata are related, and then consider how, where, and why taxonomies are used. Information architecture has its cornerstone in taxonomies and metadata. While a good discussion of information architecture is beyond the scope of this work, I do provide a brief discussion of the interrelationships among taxonomies, metadata, and information architecture. Moving on to the central focus of this book, I introduce the basics of taxonomies, including a definition of vocabulary control and why it is so important, how indexing and tagging relate to taxonomies, a few of the types of tagging, and a definition and discussion of post- and pre-coordinate indexing. After that I present the concept of a hierarchical structure for vocabularies and discuss the differences among various kinds of controlled vocabularies, such as taxonomies, thesauri, authority files, and ontologies. Once you have a green light for your project, what is the next step? Here I present a few options for the first phase of taxonomy construction and then a more detailed discussion of metadata and markup languages. I believe that it is important to understand the markup languages (SGML and XML specifically, and HTML to a lesser extent) in relation to information structure, and how taxonomies and metadata feed into that structure. After that, I present the steps required to build a taxonomy, from defining the focus, through collecting and organizing terms, analyzing your vocabulary for even coverage over subject areas, filling in gaps, and creating relationships between terms, to applying those terms to your content. Here I offer a cautionary note: don’t believe that your taxonomy is “done!” Regular, scheduled maintenance is an important—critical, really—component of taxonomy construction projects. After you’ve worked through the steps in this book, you will be ready to move on to integrating your taxonomy into the workflow of your organization. This is covered in Book 3 of this series. Table of Contents: List of Figures / Preface / Acknowledgments / Building a Case for Building a Taxonomy / Taxonomy Basics / Getting Started / Terms: The Building Blocks of a Taxonomy / Building the Structure of Your Taxonomy / Evaluation and Maintenance / Standards and Taxonomies / Glossary / End Notes / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Decidability of Parameterized Verification

    Copyright Year: 2015

    Morgan and Claypool eBooks

    While the classic model checking problem is to decide whether a finite system satisfies a specification, the goal of parameterized model checking is to decide, given finite systems 𝐌(n) parameterized by n ∈ ℕ, whether, for all n ∈ ℕ, the system 𝐌(n) satisfies a specification. In this book we consider the important case of 𝐌(n) being a concurrent system, where the number of replicated processes depends on the parameter n but each process is independent of n. Examples are cache coherence protocols, networks of finite-state agents, and systems that solve mutual exclusion or scheduling problems. Further examples are abstractions of systems, where the processes of the original systems actually depend on the parameter. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Retrieval Evaluation

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Evaluation has always played a major role in information retrieval, with the early pioneers such as Cyril Cleverdon and Gerard Salton laying the foundations for most of the evaluation methodologies in use today. The retrieval community has been extremely fortunate to have such a well-grounded evaluation paradigm during a period when most of the human language technologies were just developing. This lecture has the goal of explaining where these evaluation methodologies came from and how they have continued to adapt to the vastly changed environment in the search engine world today. The lecture starts with a discussion of the early evaluation of information retrieval systems, beginning with the Cranfield testing in the early 1960s, continuing with the Lancaster "user" study for MEDLARS, and presenting the various test collection investigations by the SMART project and by groups in Britain. The emphasis in this chapter is on the how and the why of the various methodologies developed. The second chapter covers the more recent "batch" evaluations, examining the methodologies used in the various open evaluation campaigns such as TREC, NTCIR (emphasis on Asian languages), CLEF (emphasis on European languages), INEX (emphasis on semi-structured data), etc. Here again the focus is on the how and why, and in particular on the evolution of the older evaluation methodologies to handle new information access techniques. This includes how the test collection techniques were modified and how the metrics were changed to better reflect operational environments. The final chapters look at evaluation issues in user studies -- the interactive part of information retrieval, including a look at the search log studies mainly done by the commercial search engines. Here the goal is to show, via case studies, how the high-level issues of experimental design affect the final evaluations. Table of Contents: Introduction and Early History / "Batch" Evaluation Since 1992 / Interactive Evaluation / Conclusion View full abstract»
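    Although the lecture is about methodology rather than code, the batch evaluations it surveys ultimately reduce to computing effectiveness metrics over ranked result lists; two of the most common, precision at k and average precision, are shown below purely for illustration.

        def precision_at_k(ranked, relevant, k):
            """Fraction of the top-k ranked documents that are relevant."""
            return sum(1 for d in ranked[:k] if d in relevant) / k

        def average_precision(ranked, relevant):
            """Mean of precision@k over the ranks k at which relevant documents appear."""
            hits, total = 0, 0.0
            for k, d in enumerate(ranked, start=1):
                if d in relevant:
                    hits += 1
                    total += hits / k
            return total / len(relevant) if relevant else 0.0

        # Relevant documents {A, C}; a system ranks them A, B, C.
        print(average_precision(["A", "B", "C"], {"A", "C"}))   # 0.833...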

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multimodal Imaging in Neurology: Special Focus on MRI Applications and MEG

    Copyright Year: 2007

    Morgan and Claypool eBooks

    The field of brain imaging is developing at a rapid pace and has greatly advanced the areas of cognitive and clinical neuroscience. The availability of neuroimaging techniques, especially magnetic resonance imaging (MRI), functional MRI (fMRI), diffusion tensor imaging (DTI) and magnetoencephalography (MEG) and magnetic source imaging (MSI) has brought about breakthroughs in neuroscience. To obtain comprehensive information about the activity of the human brain, different analytical approaches should be complemented. Thus, in "intermodal multimodality" imaging, great efforts have been made to combine the highest spatial resolution (MRI, fMRI) with the best temporal resolution (MEG or EEG). "Intramodal multimodality" imaging combines various functional MRI techniques (e.g., fMRI, DTI, and/or morphometric/volumetric analysis). The multimodal approach is conceptually based on the combination of different noninvasive functional neuroimaging tools, their registration and cointegration. In particular, the combination of imaging applications that map different functional systems is useful, such as fMRI as a technique for the localization of cortical function and DTI as a technique for mapping of white matter fiber bundles or tracts. This booklet gives an insight into the wide field of multimodal imaging with respect to concepts, data acquisition, and postprocessing. Examples for intermodal and intramodal multimodality imaging are also demonstrated. Table of Contents: Introduction / Neurological Measurement Techniques and First Steps of Postprocessing / Coordinate Transformation / Examples for Multimodal Imaging / Clinical Aspects of Multimodal Imaging / References / Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    What Is Global Engineering Education For?:The Making of International Educators, Part I

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Global engineering offers the seductive image of engineers figuring out how to optimize work through collaboration and mobility. Its biggest challenge to engineers, however, is more fundamental and difficult: to better understand what they know and value qua engineers and why. This volume reports an experimental effort to help sixteen engineering educators produce "personal geographies" describing what led them to make risky career commitments to international and global engineering education. The contents of their diverse trajectories stand out in extending far beyond the narrower image of producing globally-competent engineers. Their personal geographies repeatedly highlight experiences of incongruence beyond home countries that provoked them to see themselves and understand their knowledge differently. The experiences were sufficiently profound to motivate them to design educational experiences that could provoke engineering students in similar ways. For nine engineers, gaining new international knowledge challenged assumptions that engineering work and life are limited to purely technical practices, compelling explicit attention to broader value commitments. For five non-engineers and two hybrids, gaining new international knowledge fueled ambitions to help engineering students better recognize and critically examine the broader value commitments in their work. A background chapter examines the historical emergence of international engineering education in the United States, and an epilogue explores what it might take to integrate practices of critical self-analysis more systematically in the education and training of engineers. Two appendices and two online supplements describe the unique research process that generated these personal geographies, especially the workshop at the U.S. National Academy of Engineering in which authors were prohibited from participating in discussions of their manuscripts. Table of Contents: Communicating Across Cultures: Humanities in the International Education of Engineers (Bernd Widdig) / Linking Language Proficiency and the Professions (Michael Nugent) / Language, Life, and Pathways to Global Competency for Engineers (and Everyone Else) (Phil McKnight) / Bridging Two Worlds (John M. Grandin) / Opened Eyes: From Moving Up to Helping Students See (Gayle G. Elliott) / What is Engineering for? A Search for Engineering beyond Militarism and Free-markets (Juan Lucena) / Location, Knowledge, and Desire: From Two Conservatisms to Engineering Cultures and Countries (Gary Lee Downey) / Epilogue - Beyond Global Competence: Implications for Engineering Pedagogy (Gary Lee Downey) View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    NS Simulator for Beginners

    Copyright Year: 2012

    Morgan and Claypool eBooks

    NS-2 is an open-source discrete event network simulator which is widely used both by the research community and by people involved in the standardization protocols of the IETF. The goal of this book is twofold: on one hand to learn how to use the NS-2 simulator, and on the other hand, to become acquainted with and to understand the operation of some of the simulated objects using NS-2 simulations. The book is intended to help students, engineers or researchers who need not have much background in programming or who want to learn through simple examples how to analyse some simulated objects using NS-2. Simulations may differ from each other in many aspects: the applications, topologies, parameters of network objects (links, nodes) and protocols used, etc. The first chapter is a general introduction to the book, where the importance of NS-2 as a tool for a good comprehension of networks and protocols is stated. In the next chapters we present special topics such as TCP, RED, etc., using NS-2 as a tool for better understanding the protocols. We provide in the appendices a review of Random Variables and Confidence Intervals, as well as a first sketch for using the new NS-3 simulator. Table of Contents: Introduction / NS-2 Simulator Preliminaries / How to work with trace files / Description and simulation of TCP/IP / Routing and network dynamics / RED: Random Early Discard / Differentiated Services / Mobile Networks and Wireless Local Area Networks / Classical queueing models / Tcl and C++ linkage View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modern EMC Analysis Techniques Volume I:Time-Domain Computational Schemes

    Copyright Year: 2008

    Morgan and Claypool eBooks

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it provides a detailed presentation of all well-known algorithms, elucidating their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite integration technique. Moreover, alternative schemes, such as the finite-element, the finite-volume, the multiresolution time-domain methods and many others, are presented, while particular attention is drawn to hybrid approaches. To this aim, the general aspects for the correct implementation of the previous algorithms are also exemplified. At the end of every section, an elaborate reference on the prominent pros and possible cons, always in the light of EMC modeling, assists the reader to retrieve the gist of each formulation and decide on his/her best possible selection according to the problem under investigation. Table of Contents: Fundamental Time-Domain Methodologies for EMC Analysis / Alternative Time-Domain Techniques in EMC Modeling / Principal Implementation Issues of Time-Domain EMC Simulation View full abstract»
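    For readers unfamiliar with the finite-difference time-domain method that anchors this volume, its core is a leapfrog update of interleaved electric and magnetic fields on a staggered grid. The one-dimensional, free-space sketch below (normalized units, illustrative only, not code from the book) shows the structure of that update loop.

        import numpy as np

        # 1-D free-space FDTD leapfrog update in normalized units (illustration only).
        nz, nt = 200, 500
        ez = np.zeros(nz)        # electric field samples
        hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell
        for t in range(nt):
            hy += 0.5 * (ez[1:] - ez[:-1])             # update H from the curl of E
            ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])       # update E from the curl of H
            ez[nz // 2] += np.exp(-((t - 30) / 10.0) ** 2)   # soft Gaussian source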

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Probability Theory for Biomedical Engineers

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF. The aim of all three books is to serve as an introduction to probability theory. The audience includes students, engineers and researchers presenting applications of this theory to a wide variety of problems, as well as those pursuing these topics at a more advanced level. The theory material is presented in a logical manner—developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Pertinent biomedical engineering examples appear throughout the text. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most sections. View full abstract»
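    As a compact illustration of the final chapter's theme (the distribution of a function of a random variable), the standard monotone-transformation result, stated here in general form rather than quoted from the book, is:

        If $Y = g(X)$ with $g$ strictly monotone and differentiable, then
        $$ f_Y(y) = f_X\bigl(g^{-1}(y)\bigr)\,\left|\frac{d}{dy}\,g^{-1}(y)\right| . $$
        For example, if $X \sim \mathcal{N}(0,1)$ and $Y = aX + b$ with $a \neq 0$, then
        $f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y-b}{a}\right)$, i.e., $Y \sim \mathcal{N}(b, a^2)$.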

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Jordan Canonical Form:Theory and Practice

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Jordan Canonical Form (JCF) is one of the most important, and useful, concepts in linear algebra. The JCF of a linear transformation, or of a matrix, encodes all of the structural information about that linear transformation, or matrix. This book is a careful development of JCF. After beginning with background material, we introduce Jordan Canonical Form and related notions: eigenvalues, (generalized) eigenvectors, and the characteristic and minimum polynomials. We decide the question of diagonalizability, and prove the Cayley-Hamilton theorem. Then we present a careful and complete proof of the fundamental theorem: Let V be a finite-dimensional vector space over the field of complex numbers C, and let T : V → V be a linear transformation. Then T has a Jordan Canonical Form. This theorem has an equivalent statement in terms of matrices: Let A be a square matrix with complex entries. Then A is similar to a matrix J in Jordan Canonical Form, i.e., there is an invertible matrix P and a matrix J in Jordan Canonical Form with A = PJP⁻¹. We further present an algorithm to find P and J, assuming that one can factor the characteristic polynomial of A. In developing this algorithm we introduce the eigenstructure picture (ESP) of a matrix, a pictorial representation that makes JCF clear. The ESP of A determines J, and a refinement, the labeled eigenstructure picture (ℓESP) of A, determines P as well. We illustrate this algorithm with copious examples, and provide numerous exercises for the reader. View full abstract»
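    The matrix statement A = PJP⁻¹ can be checked directly with a computer algebra system; the snippet below is an editorial illustration using SymPy (not material from the book), applied to a small integer matrix whose Jordan form contains a 2x2 block for a repeated eigenvalue.

        from sympy import Matrix

        A = Matrix([[ 5,  4,  2,  1],
                    [ 0,  1, -1, -1],
                    [-1, -1,  3,  0],
                    [ 1,  1, -1,  2]])

        P, J = A.jordan_form()          # J is the Jordan Canonical Form of A
        assert A == P * J * P.inv()     # verifies A = P J P^(-1) exactly
        print(J)                        # for this example, J has a 2x2 Jordan block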

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    New Models for Population Protocols

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Wireless sensor networks are about to be part of everyday life. Homes and workplaces capable of self-controlling and adapting air-conditioning for different temperature and humidity levels, sleepless forests ready to detect and react in case of a fire, vehicles able to avoid sudden obstacles or possibly able to self-organize routes to avoid congestion, and so on, will probably be commonplace in the very near future. Mobility plays a central role in such systems and so does passive mobility, that is, mobility of the network stemming from the environment itself. The population protocol model was an intellectual invention aiming to describe such systems in a minimalistic and analysis-friendly way. Having as a starting-point the inherent limitations but also the fundamental establishments of the population protocol model, we try in this monograph to present some realistic and practical enhancements that give birth to some new and surprisingly powerful (for these kinds of systems) computational models. Table of Contents: Population Protocols / The Computational Power of Population Protocols / Enhancing the model / Mediated Population Protocols and Symmetry / Passively Mobile Machines that Use Restricted Space / Conclusions and Open Research Directions / Acronyms / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Latency and Distortion of Electromagnetic Trackers for Augmented Reality Systems

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Augmented reality (AR) systems are often used to superimpose virtual objects or information on a scene to improve situational awareness. Delays in the display system or inaccurate registration of objects destroy the sense of immersion a user experiences when using AR systems. AC electromagnetic trackers are ideal for these applications when combined with head orientation prediction to compensate for display system delays. Unfortunately, because of magnetic field distortion, these trackers do not perform well in environments that contain conductive or ferrous materials unless expensive calibration techniques are used. In our work we focus on both the prediction and distortion compensation aspects of this application, developing a "small footprint" predictive filter for display lag compensation and a simplified calibration system for AC magnetic trackers. In the first phase of our study we presented a novel method of tracking angular head velocity from quaternion orientation using an Extended Kalman Filter in both single model (DQEKF) and multiple model (MMDQ) implementations. In the second phase of our work we have developed a new method of mapping the magnetic field generated by the tracker without high precision measurement equipment. This method uses simple fixtures with multiple sensors in a rigid geometry to collect magnetic field data in the tracking volume. We have developed a new algorithm to process the collected data and generate a map of the magnetic field distortion that can be used to compensate distorted measurement data. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing for Gesture and Tangible Interaction

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Interactive technology is increasingly integrated with physical objects that do not have a traditional keyboard and mouse style of interaction, and many do not even have a display. These objects require new approaches to interaction design, referred to as post-WIMP (Windows, Icons, Menus, and Pointer) or as embodied interaction design. This book provides an overview of the design opportunities and issues associated with two embodied interaction modalities that allow us to leave the traditional keyboard behind: tangible and gesture interaction. We explore the issues in designing for this new age of interaction by highlighting the significance and contexts for these modalities. We explore the design of tangible interaction with a reconceptualization of the traditional keyboard as a Tangible Keyboard, and the design of interactive three-dimensional (3D) models as Tangible Models. We explore the design of gesture interaction through the design of gesture-based commands for a walk-up-and-use information display, and through the design of a gesture-based dialogue for the willful marionette. We conclude with design principles for tangible and gesture interaction and a call for research on the cognitive effects of these modalities. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Packets with Deadlines:A Framework for Real-Time Wireless Networks

    Copyright Year: 2013

    Morgan and Claypool eBooks

    With the explosive increase in the number of mobile devices and applications, it is anticipated that wireless traffic will increase exponentially in the coming years. Moreover, future wireless networks will carry a wide variety of flows, such as video streaming, online gaming, and VoIP, which have various quality of service (QoS) requirements. Therefore, a new mechanism that can provide satisfactory performance to this complete variety of flows, in a coherent and unified framework, is needed. In this book, we introduce a framework for real-time wireless networks. This consists of a model that jointly addresses several practical concerns for real-time wireless networks, including per-packet delay bounds, throughput requirements, and heterogeneity of wireless channels. We detail how this framework can be employed to address a wide range of problems, including admission control, packet scheduling, and utility maximization. Table of Contents: Preface / Introduction / A Study of the Base Case / Admission Control / Scheduling Policies / Utility Maximization without Rate Adaptation / Utility Maximization with Rate Adaptation / Systems with Both Real-Time Flows and Non-Real-Time Flows / Broadcasting and Network Coding / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Intelligent Autonomous Robotics:A Robot Soccer Case Study

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Robotics technology has recently advanced to the point of being widely accessible for relatively low-budget research, as well as for graduate, undergraduate, and even secondary and primary school education. This lecture provides an example of how to productively use a cutting-edge advanced robotics platform for education and research by providing a detailed case study with the Sony AIBO robot, a vision-based legged robot. The case study used for this lecture is the UT Austin Villa RoboCup Four-Legged Team. This lecture describes both the development process and the technical details of its end result. The main contributions of this lecture are (i) a roadmap for new classes and research groups interested in intelligent autonomous robotics who are starting from scratch with a new robot, and (ii) documentation of the algorithms behind our own approach on the AIBOs with the goal of making them accessible for use on other vision-based and/or legged robot platforms. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing and Evaluating Usable Technology in Industrial Research:Three Case Studies

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about HCI research in an industrial research setting. It is based on the experiences of two researchers at the IBM T. J. Watson Research Center. Over the last two decades, Drs. John and Clare-Marie Karat have conducted HCI research to create innovative usable technology for users across a variety of domains. We begin the book by introducing the reader to the context of industrial research as well as a set of common themes or guidelines to consider in conducting HCI research in practice. Then case study examples of HCI approaches to the design and evaluation of usable solutions for people are presented and discussed in three domain areas: conversational speech technologies, personalization in eCommerce, and security and privacy policy management technologies. In each of the case studies, the authors illustrate and discuss examples of HCI approaches to design and evaluation that worked well and those that did not. They discuss what was learned over time about different HCI methods in practice, and changes that were made to the HCI tools used over time. The Karats discuss trade-offs and issues related to time, resources, and money and the value derived from different HCI methods in practice. These decisions are ones that need to be made regularly in the industrial sector. Similarities and differences with the types of decisions made in this regard in academia will be discussed. The authors then use the context of the three case studies in the three research domains to draw insights and conclusions about the themes that were introduced in the beginning of the book. The Karats conclude with their perspective about the future of HCI industrial research. Table of Contents: Introduction: Themes and Structure of the Book / Case Study 1: Conversational Speech Technologies: Automatic Speech Recognition (ASR) / Case Study 2: Personalization in eCommerce / Case Study 3: Security and Privacy Policy Management Technologies / Insights and Conclusions / The Future of Industrial HCI Research View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Power

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Pragmatic Power is focused on just three aspects of the AC electrical power system that supplies and moves the vast majority of electrical energy nearly everywhere in the world: three-phase power systems, transformers, and induction motors. The reader needs to have had an introduction to electrical circuits and AC power, although the text begins with a review of the basics of AC power. Balanced three-phase systems are studied by developing their single-phase equivalents. The study includes a look at how the cost of "power" is affected by reactive power and power factor. Transformers are considered as a circuit element in a power system, one that can be reasonably modeled to simplify system analysis. Induction motors are presented as the most common way to change electrical energy into rotational energy. Examples include the correct selection of an induction motor for a particular rotating load. All of these topics include completely worked examples to aid the reader in understanding how to apply what has been learned. This short lecture book will be of use to students at any level of engineering, not just electrical, because it is intended for the practicing engineer or scientist looking for a practical, applied introduction to AC power systems. The author's "pragmatic" and applied style gives a unique and helpful "nonidealistic, practical, and opinionated" introduction to the topic. Table of Contents: Three-Phase Power: 3 > 3 x 1 / Transformers: Edison Lost / Induction Motors: Just One Moving Part View full abstract»
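    In the spirit of the worked examples the book describes (three-phase quantities, power factor, and its effect on cost), here is a small self-contained calculation; the voltage, current, and power-factor values are invented purely for illustration and are not taken from the text.

        import math

        # Balanced three-phase load (illustrative values): 480 V line-to-line,
        # 25 A line current, power factor 0.82 lagging.
        V_ll, I_line, pf = 480.0, 25.0, 0.82

        S = math.sqrt(3) * V_ll * I_line        # apparent power, VA
        P = S * pf                              # real power, W
        Q = S * math.sin(math.acos(pf))         # reactive power, var (lagging)
        print(f"S = {S/1e3:.2f} kVA, P = {P/1e3:.2f} kW, Q = {Q/1e3:.2f} kvar")

        # Capacitive kvar needed to raise the power factor to 0.95:
        Q_target = P * math.tan(math.acos(0.95))
        print(f"capacitor bank: {(Q - Q_target)/1e3:.2f} kvar")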

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multiantenna Systems for MIMO Communications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Advanced communication scenarios demand the development of new systems where antenna theory, channel propagation and communication models are seen from a common perspective as a way to understand and optimize the system as a whole. In this context, a comprehensive multiantenna formulation for multiple-input multiple-output systems is presented with a special emphasis on the connection of the electromagnetic and communication principles. Starting from the capacity for a multiantenna system, the book reviews radiation, propagation, and communication mechanisms, paying particular attention to the vectorial, directional, and time-frequency characteristics of the wireless communication equation for low- and high-scattering environments. Based on the previous concepts, different space-time methods for diversity and multiplexing applications are discussed, multiantenna modeling is studied, and specific tools are introduced to analyze the antenna coupling mechanisms and formulate appropriate decorrelation techniques. Miniaturization techniques for closely spaced antennas are studied, and their fundamental limits and optimization strategies are reviewed. Finally, different practical multiantenna topologies for new communication applications are presented, and their main parameters are discussed. A relevant feature is a collection of synthesis exercises that review the main topics of the book and introduce state-of-the-art system architectures and parameters, facilitating its use either as a textbook or as a support tool for multiantenna systems design. Table of Contents: Principles of Multiantenna Communication Systems / The Radio Channel for MIMO Communication Systems / Coding Theory for MIMO Communication Systems / Antenna Modeling for MIMO Communication Systems / Design of MPAs for MIMO Communication Systems / Design Examples and Performance Analysis of Different MPAs / References / List of Acronyms / List of Symbols / Operators and Mathematical Symbols View full abstract»
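    The "capacity for a multiantenna system" that the book takes as its starting point is commonly written, for a channel matrix H with equal power allocation across the transmit antennas, as C = log2 det(I + (SNR/Nt) H Hᴴ). The snippet below evaluates that standard formula for one randomly drawn channel; the i.i.d. Rayleigh-fading assumption and the 10 dB SNR are illustrative choices, not values from the book.

        import numpy as np

        def mimo_capacity(H, snr_linear):
            """Capacity in bits/s/Hz of one channel realization H (Nr x Nt),
            with equal power allocation across the Nt transmit antennas."""
            nr, nt = H.shape
            M = np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T)
            return float(np.real(np.log2(np.linalg.det(M))))

        rng = np.random.default_rng(0)
        nr = nt = 4
        # One i.i.d. Rayleigh-fading channel realization (illustrative assumption).
        H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        print(mimo_capacity(H, snr_linear=10 ** (10 / 10)))   # capacity at 10 dB SNR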

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Scattering Analysis of Periodic Structures Using Finite-Difference Time-Domain

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Periodic structures are of great importance in electromagnetics due to their wide range of applications such as frequency selective surfaces (FSS), electromagnetic band gap (EBG) structures, periodic absorbers, meta-materials, and many others. The aim of this book is to develop efficient computational algorithms to analyze the scattering properties of various electromagnetic periodic structures using the finite-difference time-domain periodic boundary condition (FDTD/PBC) method. A new FDTD/PBC-based algorithm is introduced to analyze general skewed grid periodic structures while another algorithm is developed to analyze dispersive periodic structures. Moreover, the proposed algorithms are successfully integrated with the generalized scattering matrix (GSM) technique, identified as the hybrid FDTD-GSM algorithm, to efficiently analyze multilayer periodic structures. All the developed algorithms are easy to implement and are efficient in both computational time and memory usage. These algorithms are validated through several numerical test cases. The computational methods presented in this book will help scientists and engineers to investigate and design novel periodic structures and to explore other research frontiers in electromagnetics. Table of Contents: Introduction / FDTD Method and Periodic Boundary Conditions / Skewed Grid Periodic Structures / Dispersive Periodic Structures / Multilayered Periodic Structures / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    GPU-Based Techniques for Global Illumination Effects

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. The book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make the book self-contained, the most important concepts of local illumination and global illumination rendering, graphics hardware, and Direct3D/HLSL programming are reviewed in the first chapters. After these introductory chapters we warm up with simple methods including shadow and environment mapping, then we move on toward advanced concepts aiming at global illumination rendering. Since it would have been impossible to give a rigorous review of all approaches proposed in this field, we go into the details of just a few methods solving each particular global illumination effect. However, a short discussion of the state of the art and links to the bibliography are also provided to refer the interested reader to techniques that are not detailed in this book. The implementation of the selected methods is also presented in HLSL, and we discuss their observed performance, merits, and disadvantages. In the last chapter, we also review how these techniques can be integrated in an advanced game engine and present case studies of their exploitation in games. Having gone through this book, the reader will have an overview of the state of the art, will be able to apply and improve these techniques, and most importantly, will be capable of developing brand new GPU algorithms. Table of Contents: Global Illumination Rendering / Local Illumination Rendering Pipeline of GPUs / Programming and Controlling GPUs / Simple Improvements of the Local Illumination Model / Ray Casting on the GPU / Specular Effects with Rasterization / Diffuse and Glossy Indirect Illumination / Pre-computation Aided Global Illumination / Participating Media Rendering / Fake Global Illumination / Postprocessing Effects / Integrating GI Effects in Games and Virtual Reality Systems / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computational Modeling of Human Language Acquisition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Human language acquisition has been studied for centuries, but using computational modeling for such studies is a relatively recent trend. However, computational approaches to language learning have become increasingly popular, mainly due to advances in developing machine learning techniques, and the availability of vast collections of experimental data on child language learning and child-adult interaction. Many of the existing computational models attempt to study the complex task of learning a language under cognitive plausibility criteria (such as memory and processing limitations that humans face), and to explain the developmental stages observed in children. By simulating the process of child language learning, computational models can show us which linguistic representations are learnable from the input that children have access to, and which mechanisms yield the same patterns of behaviour that children exhibit during this process. In doing so, computational modeling provides insight into the plausible mechanisms involved in human language acquisition, and inspires the development of better language models and techniques. This book provides an overview of the main research questions in the field of human language acquisition. It reviews the most commonly used computational frameworks, methodologies and resources for modeling child language learning, and the evaluation techniques used for assessing these computational models. The book is aimed at cognitive scientists who want to become familiar with the available computational methods for investigating problems related to human language acquisition, as well as computational linguists who are interested in applying their skills to the study of child language acquisition. Different aspects of language learning are discussed in separate chapters, including the acquisition of individual words, the general regularities which govern word and sentence form, and the associations between form and meaning. For each of these aspects, the challenges of the task are discussed and the relevant empirical findings on children are summarized. Furthermore, the existing computational models that attempt to simulate the task under study are reviewed, and a number of case studies are presented. Table of Contents: Overview / Computational Models of Language Learning / Learning Words / Putting Words Together / Form-Meaning Associations / Final Thoughts View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Collaborative Web Search:Who, What, Where, When, and Why

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Today, Web search is treated as a solitary experience. Web browsers and search engines are typically designed to support a single user, working alone. However, collaboration on information-seeking tasks is actually commonplace. Students work together to complete homework assignments, friends seek information about joint entertainment opportunities, family members jointly plan vacation travel, and colleagues jointly conduct research for their projects. As improved networking technologies and the rise of social media simplify the process of remote collaboration, and large, novel display form-factors simplify the process of co-located group work, researchers have begun to explore ways to facilitate collaboration on search tasks. This lecture investigates the who, what, where, when and why of collaborative search, and gives insight into how emerging solutions can address collaborators' needs. Table of Contents: Introduction / Who? / What? / Where? / When? / Why? / Conclusion: How? View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    On the Efficient Determination of Most Near Neighbors:Horseshoes, Hand Grenades, Web Search and Other Situations When Close is Close Enough

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The time-worn aphorism "close only counts in horseshoes and hand-grenades" is clearly inadequate. Close also counts in golf, shuffleboard, archery, darts, curling, and other games of accuracy in which hitting the precise center of the target isn't to be expected every time, or in which we can expect to be driven from the target by skilled opponents. This lecture is not devoted to sports discussions, but to efficient algorithms for determining pairs of closely related web pages -- and a few other situations in which we have found that inexact matching is good enough; where proximity suffices. We will not, however, attempt to be comprehensive in the investigation of probabilistic algorithms, approximation algorithms, or even techniques for organizing the discovery of nearest neighbors. We are more concerned with finding nearby neighbors; if they are not particularly close by, we are not particularly interested. In thinking of when approximation is sufficient, remember the oft-told joke about two campers sitting around after dinner. They hear noises coming towards them. One of them reaches for a pair of running shoes, and starts to don them. The second then notes that even with running shoes, they cannot hope to outrun a bear, to which the first notes that most likely the bear will be satiated after catching the slower of them. We seek problems in which we don't need to be faster than the bear, just faster than the others fleeing the bear. View full abstract»
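    One widely used family of techniques for the "closely related web pages" problem sketched above is shingling combined with MinHash signatures; it is offered here only as an editorial illustration of what "close enough" matching looks like in code, with no claim that it matches the lecture's own algorithms.

        import hashlib

        def shingles(text, k=5):
            """Set of k-word shingles (overlapping word windows) of a document."""
            words = text.split()
            return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

        def minhash_signature(shingle_set, num_hashes=64):
            """For each of num_hashes seeded hash functions, keep the minimum hash
            value over the document's shingles."""
            return [min(int(hashlib.md5(f"{seed}:{s}".encode()).hexdigest(), 16)
                        for s in shingle_set)
                    for seed in range(num_hashes)]

        def estimated_similarity(sig_a, sig_b):
            """Fraction of matching positions approximates the Jaccard similarity."""
            return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

        a = minhash_signature(shingles("the quick brown fox jumps over the lazy dog today"))
        b = minhash_signature(shingles("the quick brown fox jumped over the lazy dog today"))
        print(estimated_similarity(a, b))   # estimate of the Jaccard similarity of the shingle sets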

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Joint Source-Channel Video Transmission

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book deals with the problem of joint source-channel video transmission, i.e., the joint optimal allocation of resources at the application layer and the other network layers, such as data rate adaptation, channel coding, power adaptation in wireless networks, quality of service (QoS) support from the network, and packet scheduling, for efficient video transmission. Real-time video communication applications, such as videoconferencing, video telephony, and on-demand video streaming, have gained increased popularity. However, a key problem in video transmission over the existing Internet and wireless networks is the incompatibility between the nature of the network conditions and the QoS requirements (in terms, for example, of bandwidth, delay, and packet loss) of real-time video applications. To deal with this incompatibility, a natural approach is to adapt the end-system to the network. The joint source-channel coding approach aims to efficiently perform content-aware cross-layer resource allocation, thus increasing the communication efficiency of multiple network layers. Our purpose in this book is to review the basic elements of the state-of-the-art approaches toward joint source-channel video transmission for wired and wireless systems. In this book, we present a general resource-distortion optimization framework, which is used throughout the book to guide our discussions on various techniques of joint source-channel video transmission. In this framework, network resources from multiple layers are assigned to each video packet according to its level of importance. It provides not only an optimization benchmark against which the performance of other sub-optimal systems can be evaluated, but also a useful tool for assessing the effectiveness of different error control components in practical system design. This book is therefore written to be accessible to researchers, expert industrial R&D engineers, and university students who are interested in the cutting-edge technologies in joint source-channel video transmission. Contents: Introduction / Elements of a Video Communication System / Joint Source-Channel Coding / Error-Resilient Video Coding / Channel Modeling and Channel Coding / Internet Video Transmission / Wireless Video Transmission / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Higher-Order FDTD Schemes for Waveguides and Antenna Structures

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This publication provides a comprehensive and systematically organized coverage of higher order finite-difference time-domain or FDTD schemes, demonstrating their potential role as a powerful modeling tool in computational electromagnetics. Special emphasis is placed on the analysis of contemporary waveguide and antenna structures. Acknowledged as a significant breakthrough in the evolution of the original Yee's algorithm, the higher order FDTD operators remain the subject of ongoing scientific research. Among their indisputable merits, one can distinguish the enhanced levels of accuracy even for coarse grid resolutions, the fast convergence rates, and the adjustable stability. In fact, as the fabrication standards of modern systems get stricter, it is apparent that such properties become very appealing for the accomplishment of elaborate and credible designs. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Communication

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book introduces the fundamentals of information communication. First, concepts and characteristics of information and information communication are summarized. Then, five classic models of information communication are introduced. The mechanisms and fundamental laws of the information transmission process are also discussed. In order to realize information communication, impediments in the information communication process are identified and analyzed. To investigate the implications of Internet information communication, patterns and characteristics of information communication in the Internet and Web 2.0 environment are also analyzed. Finally, case studies are provided to help readers understand the theory. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Logic

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Pragmatic Logic presents the analysis and design of digital logic systems. The author begins with a brief study of binary and hexadecimal number systems and then looks at the basics of Boolean algebra. The study of logic circuits is divided into two parts, combinational logic, which has no memory, and sequential logic, which does. Numerous examples highlight the principles being presented. The text ends with an introduction to digital logic design using Verilog, a hardware description language. The chapter on Verilog can be studied along with the other chapters in the text. After the reader has completed combinational logic in Chapters 4 and 5, sections 9.1 and 9.2 would be appropriate. Similarly, the rest of Chapter 9 could be studied after completing sequential logic in Chapters 6 and 7. This short lecture book will be of use to students at any level of electrical or computer engineering and for practicing engineers or scientists in any field looking for a practical and applied introduction to digital logic. The author's "pragmatic" and applied style gives a unique and helpful "non-idealist, practical, opinionated" introduction to digital systems. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Search-User Interface Design

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Search User Interfaces (SUIs) represent the gateway between people who have a task to complete, and the repositories of information and data stored around the world. Not surprisingly, therefore, there are many communities who have a vested interest in the way SUIs are designed. There are people who study how humans search for information, and people who study how humans use computers. There are people who study good user interface design, and people who design aesthetically pleasing user interfaces. There are also people who curate and manage valuable information resources, and people who design effective algorithms to retrieve results from them. While it would be easy for one community to reject another for their limited ability to design a good SUI, the truth is that they all can, and they all have made valuable contributions. Fundamentally, therefore, we must accept that designing a great SUI means leveraging the knowledge and skills from all of these communities. The aim of this book is to at least acknowledge, if not integrate, all of these perspectives to bring the reader into a multidisciplinary mindset for how we should think about SUI design. Further, this book aims to provide the reader with a framework for thinking about how different innovations each contribute to the overall design of a SUI. With this framework and a multidisciplinary perspective in hand, the book then continues by reviewing early, successful, established, and experimental concepts for SUI design. The book then concludes by discussing how we can analyse and evaluate the on-going developments in SUI design, as this multidisciplinary area of research moves forwards. Finally, in reviewing these many SUIs and SUI features, the book finishes by extracting a series of 20 SUI design recommendations that are listed in the conclusions. Table of Contents: Introduction / Searcher-Computer Interaction / Early Search User Interfaces / Modern Search User Interfaces / Experimental Search User Interfaces / Evaluating Search User Interfaces / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Path planning and navigation are indispensable components for controlling autonomous agents in interactive virtual worlds. Given the growing demands on the size and complexity of modern virtual worlds, a number of new techniques have been developed for achieving intelligent navigation for the next generation of interactive multi-agent simulations. This book reviews the evolution of several related techniques, starting from classical planning and computational geometry techniques and then gradually moving toward more advanced topics with a focus on recent developments from the work of the authors. The covered topics range from discrete search and geometric representations to planning under different types of constraints and harnessing the power of graphics hardware in order to address Euclidean shortest paths and discrete search for multiple agents under limited time budgets. The use of planning algorithms beyond path planning is also discussed in the areas of crowd animation and whole-body motion planning for virtual characters. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tensor Properties of Solids:Part Two: Transport Properties of Solids

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Tensor Properties of Solids presents the phenomenological development of solid state properties represented as matter tensors in two parts: Part I on equilibrium tensor properties and Part II on transport tensor properties. Part I begins with an introduction to tensor notation, transformations, algebra, and calculus together with the matrix representations. Crystallography, as it relates to tensor properties of crystals, completes the background treatment. A generalized treatment of solid-state equilibrium thermodynamics leads to the systematic correlation of equilibrium tensor properties. This is followed by developments covering first-, second-, third-, and higher-order tensor effects. Included are the generalized compliance and rigidity matrices for first-order tensor properties, Maxwell relations, effect of measurement conditions, and the dependent coupled effects and use of interaction diagrams. Part I concludes with the second- and higher-order effects, including numerous optical tensor properties. Part II presents the driving forces and fluxes for the well-known proper conductivities. An introduction to irreversible thermodynamics includes the concepts of microscopic reversibility, Onsager's reciprocity principle, entropy density production, and the proper choice of the transport parameters. This is followed by the force-flux equations for electronic charge and heat flow and the relationships between the proper conductivities and phenomenological coefficients. The thermoelectric effects in solids are discussed and extended to the piezothermoelectric and piezoresistance tensor effects. The subjects of thermomagnetic, galvanomagnetic, and thermogalvanomagnetic effects are developed together with other higher-order magnetotransport property tensors. A glossary of terms, expressions, and symbols is provided at the end of the text, and end-of-chapter problems are provided on request. Endnotes provide the necessary references for further reading. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    General Game Playing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    General game players are computer systems able to play strategy games based solely on formal game descriptions supplied at "runtime" (in other words, they don't know the rules until the game starts). Unlike specialized game players, such as Deep Blue, general game players cannot rely on algorithms designed in advance for specific games; they must discover such algorithms themselves. General game playing expertise depends on intelligence on the part of the game player and not just intelligence of the programmer of the game player. GGP is an interesting application in its own right. It is intellectually engaging and more than a little fun. But it is much more than that. It provides a theoretical framework for modeling discrete dynamic systems and defining rationality in a way that takes into account problem representation and complexities like incompleteness of information and resource bounds. It has practical applications in areas where these features are important, e.g., in business and law. More fundamentally, it raises questions about the nature of intelligence and serves as a laboratory in which to evaluate competing approaches to artificial intelligence. This book is an elementary introduction to General Game Playing (GGP). (1) It presents the theory of General Game Playing and leading GGP technologies. (2) It shows how to create GGP programs capable of competing against other programs and humans. (3) It offers a glimpse of some of the real-world applications of General Game Playing. Table of Contents: Preface / Introduction / Game Description / Game Management / Game Playing / Small Single-Player Games / Small Multiple-Player Games / Heuristic Search / Probabilistic Search / Propositional Nets / General Game Playing With Propnets / Factoring / Discovery of Heuristics / Logic / Analyzing Games with Logic / Solving Single-Player Games with Logic / Discovering Heuristics with Logic / Games with Incomplete Information / Games with Historical Constraints / Incomplete Game Descriptions / Advanced General Game Playing / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Anthropology of Services:Toward a Practice Approach to Designing Services

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book explores the possibility for an anthropology of services and outlines a practice approach to designing services. The reader is taken on a journey that Blomberg and Darrah have been on for the better part of a decade from their respective positions helping to establish a services research group within a large global enterprise and an applied anthropology master's program at a Silicon Valley university. They delve into the world of services to understand both how services are being conceptualized today and the possible benefits that might result from taking an anthropological view on services and their design. The authors argue that the anthropological gaze can be useful precisely because it combines attention to details of everyday life with consideration of the larger milieu in which those details make sense. Furthermore, it asks us to reflect upon and assess our own perspectives on that which we hope to understand and change. Central to their exploration is the question of how to conceptualize and engage with the world of services given their heterogeneity, the increasing global importance of the service economy, and the possibilities introduced for an engaged scholarship on service design. While discourse on services and service design can imply something distinctively new, the authors point to parallels with what is known about how humans have engaged with each other and the material world over millennia. Establishing the ubiquity of services as a starting point, the authors go on to consider the limits of design when the boundaries and connections between what can be designed and what can only be performed are complex and deeply mediated. In this regard the authors outline a practice approach to designing that acknowledges that designing involves participating in a social context, that design and use occur in concert, that people populate a world that has been largely built by and with others, and that formal models of services are impoverished representations of human performance. An Anthropology of Services draws attention to the conceptual and methodological messiness of service worlds while providing the reader with strategies for intervening in these worlds for human betterment, as complex and challenging as that may be. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Three-Dimensional Integration and Modeling:A Revolution in RF and Wireless Packaging

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book presents a step-by-step discussion of the 3D integration approach for the development of compact system-on-package (SOP) front-ends. Various examples of fully-integrated passive building blocks (cavity/microstrip filters, duplexers, antennas), as well as a multilayer ceramic (LTCC) V-band transceiver front-end module, demonstrate the revolutionary effects of this approach in RF/Wireless packaging and multifunctional miniaturization. Designs covered are based on novel ideas and are presented for the first time for millimeter-wave (60 GHz) ultrabroadband wireless modules. Table of Contents: Introduction / Background on Technologies for Millimeter-Wave Passive Front-Ends / Three-Dimensional Packaging in Multilayer Organic Substrates / Microstrip-Type Integrated Passives / Cavity-Type Integrated Passives / Three-Dimensional Antenna Architectures / Fully Integrated Three-Dimensional Passive Front-Ends / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Discriminative Learning for Speech Recognition

    Copyright Year: 2008

    Morgan and Claypool eBooks

    In this book, we introduce the background and mainstream methods of probabilistic modeling and discriminative parameter optimization for speech recognition. The specific models treated in depth include the widely used exponential-family distributions and the hidden Markov model. A detailed study is presented on unifying the common objective functions for discriminative learning in speech recognition, namely maximum mutual information (MMI), minimum classification error, and minimum phone/word error. The unification is presented, with rigorous mathematical analysis, in a common rational-function form. This common form enables the use of the growth transformation (or extended Baum–Welch) optimization framework in discriminative learning of model parameters. In addition to all the necessary introduction of the background and tutorial material on the subject, we also included technical details on the derivation of the parameter optimization formulas for exponential-family distributions, discrete hidden Markov models (HMMs), and continuous-density HMMs in discriminative learning. Selected experimental results obtained firsthand by the authors are presented to show that discriminative learning can lead to superior speech recognition performance over conventional parameter learning. Details on major algorithmic implementation issues with practical significance are provided to enable practitioners to translate the theory in the earlier part of the book directly into engineering practice. Table of Contents: Introduction and Background / Statistical Speech Recognition: A Tutorial / Discriminative Learning: A Unified Objective Function / Discriminative Learning Algorithm for Exponential-Family Distributions / Discriminative Learning Algorithm for Hidden Markov Model / Practical Implementation of Discriminative Learning / Selected Experimental Results / Epilogue / Major Symbols Used in the Book and Their Descriptions / Mathematical Notation / Bibliography View full abstract»
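
    For orientation, the maximum mutual information (MMI) criterion mentioned above can be written, in one common notation (the symbols here are generic, not necessarily those used in the book), as

        F_MMI(\Lambda) = \sum_{r=1}^{R} \log \frac{ p_\Lambda(X_r \mid s_r) \, P(s_r) }{ \sum_{s} p_\Lambda(X_r \mid s) \, P(s) }

    where X_r is the r-th training utterance, s_r its reference transcript, and the denominator sums over competing hypotheses s. Because this objective is a ratio of two positive functions of the model parameters \Lambda, it already has the rational-function structure that makes growth-transformation (extended Baum-Welch) updates applicable.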

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modeling Digital Switching Circuits with Linear Algebra

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Modeling Digital Switching Circuits with Linear Algebra describes an approach for modeling digital information and circuitry that is an alternative to Boolean algebra. While the Boolean algebraic model has been wildly successful and is responsible for many advances in modern information technology, the approach described in this book offers new insight and different ways of solving problems. Modeling the bit as a vector instead of a scalar value in the set {0, 1} allows digital circuits to be characterized with transfer functions in the form of a linear transformation matrix. The use of transfer functions is ubiquitous in many areas of engineering and their rich background in linear systems theory and signal processing is easily applied to digital switching circuits with this model. The common tasks of circuit simulation and justification are specific examples of the application of the linear algebraic model and are described in detail. The advantages offered by the new model as compared to traditional methods are emphasized throughout the book. Furthermore, the new approach is easily generalized to other types of information processing circuits such as those based upon multiple-valued or quantum logic; thus providing a unifying mathematical framework common to each of these areas. Modeling Digital Switching Circuits with Linear Algebra provides a blend of theoretical concepts and practical issues involved in implementing the method for circuit design tasks. Data structures are described and are shown to not require any more resources for representing the underlying matrices and vectors than those currently used in modern electronic design automation (EDA) tools based on the Boolean model. Algorithms are described that perform simulation, justification, and other common EDA tasks in an efficient manner that are competitive with conventional design tools. The linear algebraic model can be used to implement common EDA tasks directly upon a structural netlist thus avoiding the intermediate step of transforming a circuit description into a representation of a set of switching functions as is commonly the case when conventional Boolean techniques are used. Implementation results are provided that empirically demonstrate the practicality of the linear algebraic model. View full abstract»
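
    A minimal sketch of the "bit as a vector" idea (illustrative only; the book's exact conventions and data structures may differ): logic 0 maps to the vector [1, 0], logic 1 to [0, 1], and a two-input gate becomes a 2x4 transfer matrix acting on the Kronecker product of its input vectors.

        import numpy as np

        ZERO = np.array([1, 0])   # logic value 0 as a vector
        ONE  = np.array([0, 1])   # logic value 1 as a vector

        # Columns correspond to the input pairs 00, 01, 10, 11 (Kronecker order).
        AND = np.array([[1, 1, 1, 0],
                        [0, 0, 0, 1]])
        OR  = np.array([[1, 0, 0, 0],
                        [0, 1, 1, 1]])

        def apply_gate(transfer, a, b):
            """Simulate a two-input gate as a linear transformation."""
            return transfer @ np.kron(a, b)

        if __name__ == "__main__":
            print(apply_gate(AND, ONE, ONE))    # -> [0 1], i.e., logic 1
            print(apply_gate(OR, ZERO, ZERO))   # -> [1 0], i.e., logic 0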

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Phase Change Memory: From Devices to Systems

    Copyright Year: 2011

    Morgan and Claypool eBooks

    As conventional memory technologies such as DRAM and Flash run into scaling challenges, architects and system designers are forced to look at alternative technologies for building future computer systems. This synthesis lecture begins by listing the requirements for a next generation memory technology and briefly surveys the landscape of novel non-volatile memories. Among these, Phase Change Memory (PCM) is emerging as a leading contender, and the authors discuss the material, device, and circuit advances underlying this exciting technology. The lecture then describes architectural solutions to enable PCM for main memories. Finally, the authors explore the impact of such byte-addressable non-volatile memories on future storage and system designs. Table of Contents: Next Generation Memory Technologies / Architecting PCM for Main Memories / Tolerating Slow Writes in PCM / Wear Leveling for Durability / Wear Leveling Under Adversarial Settings / Error Resilience in Phase Change Memories / Storage and System Design With Emerging Non-Volatile Memories View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Outlier Detection for Temporal Data

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Outlier (or anomaly) detection is a very broad field which has been studied in the context of a large number of research areas like statistics, data mining, sensor networks, environmental science, distributed systems, spatio-temporal mining, etc. Initial research in outlier detection focused on time series-based outliers (in statistics). Since then, outlier detection has been studied on a large variety of data types including high-dimensional data, uncertain data, stream data, network data, time series data, spatial data, and spatio-temporal data. While there have been many tutorials and surveys for general outlier detection, we focus on outlier detection for temporal data in this book. A large number of applications generate temporal datasets. For example, in our everyday life, various kinds of records like credit, personnel, financial, judicial, medical, etc., are all temporal. This stresses the need for an organized and detailed study of outliers with respect to such temporal data. In the past decade, there has been a lot of research on various forms of temporal data including consecutive data snapshots, series of data snapshots and data streams. Besides the initial work on time series, researchers have focused on rich forms of data including multiple data streams, spatio-temporal data, network data, community distribution data, etc. Compared to general outlier detection, techniques for temporal outlier detection are very different. In this book, we will present an organized picture of both recent and past research in temporal outlier detection. We start with the basics and then ramp up the reader to the main ideas in state-of-the-art outlier detection techniques. We motivate the importance of temporal outlier detection and briefly describe the challenges beyond usual outlier detection. Then, we present a taxonomy of proposed techniques for temporal outlier detection. Such techniques broadly include statistical techniques (like AR models, Markov models, histograms, neural networks), distance- and density-based approaches, grouping-based approaches (clustering, community detection), network-based approaches, and spatio-temporal outlier detection approaches. We summarize by presenting a wide collection of applications where temporal outlier detection techniques have been applied to discover interesting outliers. View full abstract»
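
    As a toy instance of the statistical techniques listed above (a generic sketch, not one of the book's algorithms; the window size, threshold, and data are arbitrary), a point in a time series can be flagged as an outlier when it deviates strongly from a trailing window:

        from statistics import mean, stdev

        def sliding_zscore_outliers(series, window=20, threshold=3.0):
            """Return indices whose value lies more than `threshold` standard
            deviations away from the mean of the preceding `window` points."""
            outliers = []
            for t in range(window, len(series)):
                hist = series[t - window:t]
                mu, sigma = mean(hist), stdev(hist)
                if sigma > 0 and abs(series[t] - mu) / sigma > threshold:
                    outliers.append(t)
            return outliers

        if __name__ == "__main__":
            data = [10 + (i % 5) * 0.1 for i in range(60)]  # mildly varying series
            data[30] = 25.0                                  # injected anomaly
            print(sliding_zscore_outliers(data))             # -> [30]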

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Negotiation of an energy purchase and sales agreement between a host industrial complex and the owner of a co-located combined heat and power (CHP) facility is a complex process between two inter-dependent parties forming a close long-term relationship. This case study examines the components of the agreement that require engineering input and the process of negotiation that is often led by an engineer. Outside reading is included with recommended course work and references for further study and professional development. A project management approach to the preparation phase of negotiating is presented. The study examines example calculations needed to establish components and priorities within the negotiating strategy for the industrial complex and the CHP owner from a real-world example. Students have a chance to develop hypothetical negotiating points for either side with proposed opening positions. The outcome of the case study is summarized for reference. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Design of Reconfigurable Antennas Using Graph Models

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This lecture discusses the use of graph models to represent reconfigurable antennas. The rise of antennas that adapt to their environment and change their operation based on the user's request has not been matched by clear design guidelines. There is a need to propose some rules for the optimization of any reconfigurable antenna design and performance. Since reconfigurable antennas are seen as a collection of self-organizing parts, graph models can be introduced to relate each possible topology to a corresponding electromagnetic performance in terms of achieving a characteristic frequency of operation, impedance, and polarization. These models help designers understand reconfigurable antenna structures and enhance their functionality since they transform antennas from bulky devices into mathematical and software accessible models. The use of graphs facilitates the software control and cognition ability of reconfigurable antennas while optimizing their performance. This lecture also discusses the reduction of redundancy, complexity and reliability of reconfigurable antennas and reconfigurable antenna arrays. The full analysis of these parameters allows a better reconfigurable antenna implementation in wireless and space communications platforms. The use of graph models to reduce the complexity while preserving the reliability of reconfigurable antennas allows better incorporation in applications such as cognitive radio, MIMO, satellite communications, and personal communication systems. A swifter response time is achieved with less cost and losses. This lecture is written for individuals who wish to venture into the field of reconfigurable antennas, with a little prior experience in this area, and learn how graph rules and theory, mainly used in the fields of computer science, networking, and control systems, can be applied to electromagnetic structures. This lecture will walk the reader through a design and analysis process of reconfigurable antennas using graph models with a practical and theoretical outlook. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Compression in the Memory Hierarchy

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This synthesis lecture presents the current state-of-the-art in applying low-latency, lossless hardware compression algorithms to cache, memory, and the memory/cache link. There are many non-trivial challenges that must be addressed to make data compression work well in this context. First, since compressed data must be decompressed before it can be accessed, decompression latency ends up on the critical memory access path. This imposes a significant constraint on the choice of compression algorithms. Second, while conventional memory systems store fixed-size entities like data types, cache blocks, and memory pages, these entities will suddenly vary in size in a memory system that employs compression. Dealing with variable size entities in a memory system using compression has a significant impact on the way caches are organized and how to manage the resources in main memory. We systematically discuss solutions in the open literature to these problems. Chapter 2 provides the foundations of data compression by first introducing the fundamental concept of value locality. We then introduce a taxonomy of compression algorithms and show how previously proposed algorithms fit within that logical framework. Chapter 3 discusses the different ways that cache memory systems can employ compression, focusing on the trade-offs between latency, capacity, and complexity of alternative ways to compact compressed cache blocks. Chapter 4 discusses issues in applying data compression to main memory and Chapter 5 covers techniques for compressing data on the cache-to-memory links. This book should help a skilled memory system designer understand the fundamental challenges in applying compression to the memory hierarchy and introduce him/her to the state-of-the-art techniques in addressing them. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Basic Feedback Controls in Biomedicine

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This textbook is intended for undergraduate students (juniors or seniors) in Biomedical Engineering, with the main goal of helping these students learn about classical control theory and its application in physiological systems. In addition, students should be able to apply the Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) Controls and Simulation Modules to mammalian physiology. The first four chapters review previous work on differential equations for electrical and mechanical systems. Chapters 5 through 8 present the general types and characteristics of feedback control systems and root locus, frequency response, and analysis of stability and margins. Chapters 9 through 12 cover basic LabVIEW programming, the control module with its palettes, and the simulation module with its palettes. Chapters 13 through 17 present various physiological models with several LabVIEW control analyses. These chapters cover control of the heart (heart rate, stroke volume, and cardiac output), the vestibular system and its role in governing equilibrium and perceived orientation, the vestibulo-ocular reflex in stabilizing an image on the surface of the retina during head movement, mechanical control models of human gait (walking movement), and the respiratory control model. The latter chapters (Chapters 13-17) combine details from my class lecture notes in regard to the application of LabVIEW control programming by the class to produce the control virtual instruments and graphical displays (root locus, Bode plots, and Nyquist plot). This textbook was developed in cooperation with National Instruments personnel. Table of Contents: Electrical System Equations / Mechanical Translation Systems / Mechanical Rotational Systems / Thermal Systems and Systems Representation / Characteristics and Types of Feedback Control Systems / Root Locus / Frequency Response Analysis / Stability and Margins / Introduction to LabVIEW / Control Design in LabVIEW / Simulation in LabVIEW / LabVIEW Control Design and Simulation Exercise / Cardiac Control / Vestibular Control System / Vestibulo-Ocular Control System / Gait and Stance Control System / Respiratory Control System View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Context-Aware Mobile Computing:Affordances of Space, Social Awareness, and Social Influence

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The integration of ubiquitous mobile computing resources into physical spaces can potentially affect the development, maintenance, and transformation of communities and social interactions and relations within a particular context or location. Ubiquitous mobile computing allows users to engage in activities in diverse physical locations, to access resources specific to the location, and to communicate directly or indirectly with others. Mobile technologies can potentially enhance social interactions and users' experiences, extend both social and informational resources available in context, and greatly alter the nature and quality of our interactions. Activities using mobile devices in context generate complex systems of interactions, and the benefits of ubiquity and mobility can be easily lost if that complexity is not appreciated and understood. This monograph attempts to address issues of using and designing location-based computing systems and the use of these tools to enhance social awareness, navigate in spaces, extend interactions, and influence others. Table of Contents: Introduction / Space, Place, and Context / Creating a Sense of Presence and Awareness with Mobile Tools / Mobile Computing: A Tool for Social Influence to Change Behavior / Ethical Issues and Final Thoughts View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Metric Learning

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learning literature that covers algorithms, theory and applications for both numerical and structured data. We first introduce relevant definitions and classic metric functions, as well as examples of their use in machine learning and data mining. We then review a wide range of metric learning algorithms, starting with the simple setting of linear distance and similarity learning. We show how one may scale up these methods to very large amounts of training data. To go beyond the linear case, we discuss methods that learn nonlinear metrics or multiple linear metrics throughout the feature space, and review methods for more complex settings such as multi-task and semi-supervised learning. Although most of the existing work has focused on numerical data, we cover the literature on metric learning for structured data like strings, trees, graphs and time series. In the more technical part of the book, we present some recent statistical frameworks for analyzing the generalization performance in metric learning and derive results for some of the algorithms presented earlier. Finally, we illustrate the relevance of metric learning in real-world problems through a series of successful applications to computer vision, bioinformatics and information retrieval. View full abstract»
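
    In the linear setting mentioned above, the learned object is typically a (squared) Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y) with M = L^T L positive semidefinite. The small sketch below simply evaluates such a distance for a made-up L; it does not learn L from data.

        import numpy as np

        def mahalanobis_sq(x, y, L):
            """Squared Mahalanobis distance induced by M = L^T L."""
            diff = L @ (x - y)          # equivalent to (x - y)^T L^T L (x - y)
            return float(diff @ diff)

        if __name__ == "__main__":
            L = np.array([[2.0, 0.0],   # stretch the first coordinate
                          [0.0, 0.5]])  # shrink the second
            x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
            print(mahalanobis_sq(x, y, L))     # 4.25 under the learned-style metric
            print(float((x - y) @ (x - y)))    # 2.0 under plain Euclidean distance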

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and War:Militarism, Ethics, Institutions, Alternatives

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book investigates the close connections between engineering and war, broadly understood, and the conceptual and structural barriers that face those who would seek to loosen those connections. It shows how military institutions and interests have long influenced engineering education, research, and practice and how they continue to shape the field in the present. The book also provides a generalized framework for responding to these influences useful to students and scholars of engineering, as well as reflective practitioners. The analysis draws on philosophy, history, critical theory, and technology studies to understand the connections between engineering and war and how they shape our very understandings of what engineering is and what it might be. After providing a review of diverse dimensions of engineering itself, the analysis shifts to different dimensions of the connections between engineering and war. First, it considers the ethics of war generally and then explores questions of integrity for engineering practitioners facing career decisions relating to war. Next, it considers the historical rise of the military-industrial-academic complex, especially from World War II to the present. Finally, it considers a range of responses to the militarization of engineering from those who seek to unsettle the status quo. Only by confronting the ethical, historical, and political consequences of engineering for warfare, this book argues, can engineering be sensibly reimagined. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mapped Vector Basis Functions for Electromagnetic Integral Equations

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The method-of-moments solution of the electric field and magnetic field integral equations (EFIE and MFIE) is extended to conducting objects modeled with curved cells. These techniques are important for electromagnetic scattering, antenna, radar signature, and wireless communication applications. Vector basis functions of the divergence-conforming and curl-conforming types are explained, and specific interpolatory and hierarchical basis functions are reviewed. Procedures for mapping these basis functions from a reference domain to a curved cell, while preserving the desired continuity properties on curved cells, are discussed in detail. For illustration, results are presented for examples that employ divergence-conforming basis functions with the EFIE and curl-conforming basis functions with the MFIE. The intended audience includes electromagnetic engineers with some previous familiarity with numerical techniques. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Logic Circuit Testing

    Copyright Year: 2008

    Morgan and Claypool eBooks

    An Introduction to Logic Circuit Testing provides detailed coverage of techniques for test generation and testable design of digital electronic circuits/systems. The material covered in the book should be sufficient for a course, or part of a course, in digital circuit testing for senior-level undergraduate and first-year graduate students in Electrical Engineering and Computer Science. The book will also be a valuable resource for engineers working in the industry. This book has four chapters. Chapter 1 deals with various types of faults that may occur in very large scale integration (VLSI)-based digital circuits. Chapter 2 introduces the major concepts of all test generation techniques such as redundancy, fault coverage, sensitization, and backtracking. Chapter 3 introduces the key concepts of testability, followed by some ad hoc design-for-testability rules that can be used to enhance testability of combinational circuits. Chapter 4 deals with test generation and response evaluation techniques used in BIST (built-in self-test) schemes for VLSI chips. Table of Contents: Introduction / Fault Detection in Logic Circuits / Design for Testability / Built-in Self-Test / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Geometric Programming for Design and Cost Optimization:With Illustrative Case Study Problems and Solutions

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Geometric programming is used for design and cost optimization, the development of generalized design relationships, cost ratios for specific problems, and profit maximization. The early pioneers of the process, Zener, Duffin, Peterson, Beightler, Wilde, and Phillips, played important roles in the development of geometric programming. There are three major areas: 1) Introduction, History, and Theoretical Fundamentals, 2) Applications with Zero Degrees of Difficulty, and 3) Applications with Positive Degrees of Difficulty. The primal-dual relationships are used to illustrate how to determine the primal variables from the dual solution and how to determine additional dual equations when the degrees of difficulty are positive. A new technique for determining additional equations for the dual, Dimensional Analysis, is demonstrated. The various solution techniques of the constrained derivative approach, the condensation of terms, and dimensional analysis are illustrated with example problems. The goal of this work is to have readers develop more case studies to further the application of this exciting tool. Table of Contents: Introduction / Brief History of Geometric Programming / Theoretical Considerations / The Optimal Box Design Case Study / Trash Can Case Study / The Open Cargo Shipping Box Case Study / Metal Casting Cylindrical Riser Case Study / Inventory Model Case Study / Process Furnace Design Case Study / Gas Transmission Pipeline Case Study / Profit Maximization Case Study / Material Removal/Metal Cutting Economics Case Study / Journal Bearing Design Case Study / Metal Casting Hemispherical Top Cylindrical Side Riser Case Study / Liquefied Petroleum Gas (LPG) Cylinders Case Study / Material Removal/Metal Cutting Economics with Two Constraints / The Open Cargo Shipping Box with Skids / Profit Maximization Considering Decreasing Cost Functions of Inventory Policy / Summary and Future Directions / Thesis and Dissertations on Geometric Programming View full abstract»
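
    For readers new to the topic, the standard unconstrained posynomial form behind the zero-degree-of-difficulty case studies can be stated as follows (conventional geometric-programming notation, not necessarily the symbols used in the book): minimize

        g_0(t) = \sum_{j=1}^{T} c_j \prod_{i=1}^{N} t_i^{a_{ij}},  with  c_j > 0,  t_i > 0,

    whose dual maximizes

        v(\delta) = \prod_{j=1}^{T} (c_j / \delta_j)^{\delta_j}   subject to   \sum_j \delta_j = 1  (normality)   and   \sum_j a_{ij} \delta_j = 0  for i = 1, ..., N  (orthogonality).

    The degrees of difficulty equal T - (N + 1); when this is zero, the normality and orthogonality conditions determine the dual weights \delta_j directly, which is what makes the zero-degree-of-difficulty case studies solvable by hand.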

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Full-Text (Substring) Indexes in External Memory

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Nowadays, textual databases are among the most rapidly growing collections of data. Some of these collections contain a new type of data that differs from classical numerical or textual data. These are long sequences of symbols, not divided into well-separated small tokens (words). The most prominent among such collections are databases of biological sequences, which today are experiencing an unprecedented growth rate. In 2008, the "1000 Genomes Project" was launched with the ultimate goal of collecting the sequences of an additional 1,500 human genomes, 500 each of European, African, and East Asian origin. This will produce an extensive catalog of human genetic variations. The size of just the raw sequences in this catalog would be about 5 terabytes. Querying strings without well-separated tokens poses a different set of challenges, typically addressed by building full-text indexes, which provide effective structures to index all the substrings of the given strings. Since full-text indexes occupy more space than the raw data, it is often necessary to use disk space for their construction. However, until recently, the construction of full-text indexes in secondary storage was considered impractical due to excessive I/O costs. Despite this, algorithms developed in the last decade demonstrated that efficient external construction of full-text indexes is indeed possible. This book is about large-scale construction and usage of full-text indexes. We focus mainly on suffix trees, and show efficient algorithms that can convert suffix trees to other kinds of full-text indexes and vice versa. There are four parts in this book. They are a mix of string searching theory with the reality of external memory constraints. The first part introduces general concepts of full-text indexes and shows the relationships between them. The second part presents the first series of external-memory construction algorithms that can handle the construction of full-text indexes for moderately large strings on the order of a few gigabytes. The third part presents algorithms that scale for very large strings. The final part examines queries that can be facilitated by disk-resident full-text indexes. Table of Contents: Structures for Indexing Substrings / External Construction of Suffix Trees / Scaling Up: When the Input Exceeds the Main Memory / Queries for Disk-based Indexes / Conclusions and Open Problems View full abstract»
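
    The in-memory toy below (a sketch under simplifying assumptions, far from the external-memory algorithms the book covers) shows what a full-text (substring) index does: a naively built suffix array answers "where does this pattern occur?" by binary search over sorted suffixes.

        from bisect import bisect_left, bisect_right

        def build_suffix_array(text):
            # O(n^2 log n) construction; fine for a toy, impractical at genome scale.
            return sorted(range(len(text)), key=lambda i: text[i:])

        def find_occurrences(text, sa, pattern):
            # Materializing all suffixes is also a toy-only shortcut.
            suffixes = [text[i:] for i in sa]
            lo = bisect_left(suffixes, pattern)
            hi = bisect_right(suffixes, pattern + "\uffff")
            return sorted(sa[lo:hi])

        if __name__ == "__main__":
            text = "banana"
            sa = build_suffix_array(text)
            print(find_occurrences(text, sa, "ana"))  # -> [1, 3]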

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High Dynamic Range Video

    Copyright Year: 2008

    Morgan and Claypool eBooks

    As new displays and cameras offer enhanced color capabilities, there is a need to extend the precision of digital content. High Dynamic Range (HDR) imaging encodes images and video with higher than normal 8-bit-per-color-channel precision, enabling representation of the complete color gamut and the full visible range of luminance. However, to realize the transition from traditional to HDR imaging, it is necessary to develop imaging algorithms that work with the high-precision data. To make such algorithms effective and feasible in practice, it is necessary to take advantage of the limitations of the human visual system by aligning the data shortcomings to those of the human eye, thus limiting storage and processing precision. Therefore, human visual perception is the key component of the solutions we discuss in this book. This book presents a complete pipeline for HDR image and video processing from acquisition, through compression and quality evaluation, to display. At the HDR image and video acquisition stage, specialized HDR sensors or multi-exposure techniques suitable for traditional cameras are discussed. Then, we present a practical solution for pixel value calibration in terms of photometric or radiometric quantities, which are required in some technically oriented applications. Also, we cover the problem of efficient image and video compression and encoding either for storage or transmission purposes, including the aspect of backward compatibility with existing formats. Finally, we review existing HDR display technologies and the associated problems of image contrast and brightness adjustment. For this purpose tone mapping is employed to accommodate HDR content to LDR devices. Conversely, the so-called inverse tone mapping is required to upgrade LDR content for displaying on HDR devices. We overview HDR-enabled image and video quality metrics, which are needed to verify algorithms at all stages of the pipeline. Additionally, we cover successful examples of HDR technology applications, in particular, in computer graphics and computer vision. The goal of this book is to present all discussed components of the HDR pipeline with the main focus on video. For some pipeline stages HDR video solutions are either not well established or do not exist at all, in which case we describe techniques for single HDR images. In such cases we attempt to select techniques that can be extended into the temporal domain. Whenever needed, relevant background information on human perception is given, which enables better understanding of the design choices behind the discussed algorithms and HDR equipment. Table of Contents: Introduction / Representation of an HDR Image / HDR Image and Video Acquisition / HDR Image Quality / HDR Image, Video, and Texture Compression / Tone Reproduction / HDR Display Devices / LDR2HDR: Recovering Dynamic Range in Legacy Content / HDRI in Computer Graphics / Software View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Applications of Zero-Suppressed Decision Diagrams

    Copyright Year: 2014

    Morgan and Claypool eBooks

    A zero-suppressed decision diagram (ZDD) is a data structure to represent objects that typically contain many zeros. Applications include combinatorial problems, such as graphs, circuits, faults, and data mining. This book consists of four chapters on the applications of ZDDs. The first chapter by Alan Mishchenko introduces the ZDD. It compares ZDDs to BDDs, showing why a more compact representation is usually achieved in a ZDD. The focus is on sets of subsets and on sum-of-products (SOP) expressions. Methods to generate all the prime implicants (PIs), and to generate irredundant SOPs are shown. A list of papers on the applications of ZDDs is also presented. In the appendix, ZDD procedures in the CUDD package are described. The second chapter by Tsutomu Sasao shows methods to generate PIs and irredundant SOPs using a divide and conquer method. This chapter helps the reader to understand the methods presented in the first chapter. The third chapter by Shin-Ichi Minato introduces the "frontier-based" method that efficiently enumerates certain subsets of a graph. The final chapter by Shinobu Nagayama shows a method to match strings of characters. This is important in routers, for example, where one must match the address information of an internet packet to the proper output port. It shows that ZDDs are more compact than BDDs in solving this important problem. Each chapter contains exercises, and the appendix contains their solutions. Table of Contents: Preface / Acknowledgments / Introduction to Zero-Suppressed Decision Diagrams / Efficient Generation of Prime Implicants and Irredundant Sum-of-Products Expressions / The Power of Enumeration--BDD/ZDD-Based Algorithms for Tackling Combinatorial Explosion / Regular Expression Matching Using Zero-Suppressed Decision Diagrams / Authors' and Editors' Biographies / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sequential Monte Carlo Methods for Nonlinear Discrete-Time Filtering

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In these notes, we introduce particle filtering as a recursive importance sampling method that approximates the minimum-mean-square-error (MMSE) estimate of a sequence of hidden state vectors in scenarios where the joint probability distribution of the states and the observations is non-Gaussian and, therefore, closed-form analytical expressions for the MMSE estimate are generally unavailable. We begin the notes with a review of Bayesian approaches to static (i.e., time-invariant) parameter estimation. In the sequel, we describe the solution to the problem of sequential state estimation in linear, Gaussian dynamic models, which corresponds to the well-known Kalman (or Kalman-Bucy) filter. Finally, we move to the general nonlinear, non-Gaussian stochastic filtering problem and present particle filtering as a sequential Monte Carlo approach to solve that problem in a statistically optimal way. We review several techniques to improve the performance of particle filters, including importance function optimization, particle resampling, Markov Chain Monte Carlo move steps, auxiliary particle filtering, and regularized particle filtering. We also discuss Rao-Blackwellized particle filtering as a technique that is particularly well-suited for many relevant applications such as fault detection and inertial navigation. Finally, we conclude the notes with a discussion on the emerging topic of distributed particle filtering using multiple processors located at remote nodes in a sensor network. Throughout the notes, we often assume a more general framework than in most introductory textbooks by allowing either the observation model or the hidden state dynamic model to include unknown parameters. In a fully Bayesian fashion, we treat those unknown parameters also as random variables. Using suitable dynamic conjugate priors, that approach can then be applied to perform joint state and parameter estimation. Table of Contents: Introduction / Bayesian Estimation of Static Vectors / The Stochastic Filtering Problem / Sequential Monte Carlo Methods / Sampling/Importance Resampling (SIR) Filter / Importance Function Selection / Markov Chain Monte Carlo Move Step / Rao-Blackwellized Particle Filters / Auxiliary Particle Filter / Regularized Particle Filters / Cooperative Filtering with Multiple Observers / Application Examples / Summary View full abstract»
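
    A minimal bootstrap (sampling/importance resampling) particle filter for a scalar random-walk state observed in Gaussian noise is sketched below; it is an illustrative toy, not the notes' algorithms, and the noise levels and observations are made up. The steps per time instant are those named above: propagate, weight by the observation likelihood, estimate, resample.

        import math
        import random

        def gauss_pdf(x, mu, sigma):
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

        def particle_filter(observations, n_particles=500, q=1.0, r=1.0):
            particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
            estimates = []
            for y in observations:
                # 1. Propagate through the state model x_t = x_{t-1} + process noise.
                particles = [p + random.gauss(0.0, q) for p in particles]
                # 2. Weight each particle by the observation likelihood p(y | x).
                weights = [gauss_pdf(y, p, r) for p in particles]
                total = sum(weights) or 1.0
                weights = [w / total for w in weights]
                # 3. Approximate the MMSE estimate by the weighted particle mean.
                estimates.append(sum(w * p for w, p in zip(weights, particles)))
                # 4. Resample to concentrate particles in high-likelihood regions.
                particles = random.choices(particles, weights=weights, k=n_particles)
            return estimates

        if __name__ == "__main__":
            ys = [0.3, 0.9, 1.8, 2.7, 3.1]   # made-up observations of a drifting state
            print(["%.2f" % e for e in particle_filter(ys)])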

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Trustworthy Policies for Distributed Repositories

    Copyright Year: 2016

    Morgan and Claypool eBooks

    A trustworthy repository provides assurance in the form of management documents, event logs, and audit trails that digital objects are being managed correctly. The assurance includes plans for the sustainability of the repository, the accession of digital records, the management of technology evolution, and the mitigation of the risk of data loss. A detailed assessment is provided by the ISO-16363:2012 standard, "Space data and information transfer systems—Audit and certification of trustworthy digital repositories." This book examines whether the ISO specification for trustworthiness can be enforced by computer actionable policies. An implementation of the policies is provided and the policies are sorted into categories for procedures to manage externally generated documents, specify repository parameters, specify preservation metadata attributes, specify audit mechanisms for all preservation actions, specify control of preservation operations, and control preservation properties as technology evolves. An application of the resulting procedures is made to enforce trustworthiness within National Science Foundation data management plans. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Geometric Continuity of Curves and Surfaces

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book is written for students, CAD system users, and software developers who are interested in geometric continuity, a notion needed in everyday practice of Computer-Aided Design and also a hot subject of research. It contains a description of the classical geometric spline curves and a solid theoretical basis for various constructions of smooth surfaces. Textbooks on computer graphics usually cover the most basic and necessary information about spline curves and surfaces in order to explain simple algorithms. In textbooks on geometric design, one can find more details, more algorithms and more theory. This book teaches how various parts of the theory can be gathered together and turned into constructions of smooth curves and smooth surfaces of arbitrary topology. The mathematical background needed to understand this book is similar to what is necessary to read other textbooks on geometric design; most of it is basic linear algebra and analysis. More advanced mathematical material is introduced using elementary explanations. Reading Geometric Continuity of Curves and Surfaces provides an excellent opportunity to recall and exercise necessary mathematical notions and it may be your next step towards better practice and higher understanding of design principles. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automated Metadata in Multimedia Information Systems

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Improvements in network bandwidth along with dramatic drops in digital storage and processing costs have resulted in the explosive growth of multimedia (combinations of text, image, audio, and video) resources on the Internet and in digital repositories. A suite of computer technologies delivering speech, image, and natural language understanding can automatically derive descriptive metadata for such resources. Difficulties for end users ensue, however, with the tremendous volume and varying quality of automated metadata for multimedia information systems. This lecture surveys automatic metadata creation methods for dealing with multimedia information resources, using broadcast news, documentaries, and oral histories as examples. Strategies for improving the utility of such metadata are discussed, including computationally intensive approaches, leveraging multimodal redundancy, folding in context, and leaving precision-recall tradeoffs under user control. Interfaces building from automatically generated metadata are presented, illustrating the use of video surrogates in multimedia information systems. Traditional information retrieval evaluation is discussed through the annual National Institute of Standards and Technology TRECVID forum, with experiments on exploratory search extending the discussion beyond fact-finding to broader, longer term search activities of learning, analysis, synthesis, and discovery. Table of Contents: Evolution of Multimedia Information Systems: 1990-2008 / Survey of Automatic Metadata Creation Methods / Refinement of Automatic Metadata / Multimedia Surrogates / End-User Utility for Metadata and Surrogates: Effectiveness, Efficiency, and Satisfaction View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multidimensional Databases and Data Warehousing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The present book's subject is multidimensional data models and data modeling concepts as they are applied in real data warehouses. The book aims to present the most important concepts within this subject in a precise and understandable manner. The book's coverage of fundamental concepts includes data cubes and their elements, such as dimensions, facts, and measures and their representation in a relational setting; it includes architecture-related concepts; and it includes the querying of multidimensional databases. The book also covers advanced multidimensional concepts that are considered to be particularly important. This coverage includes advanced dimension-related concepts such as slowly changing dimensions, degenerate and junk dimensions, outriggers, parent-child hierarchies, and unbalanced, non-covering, and non-strict hierarchies. The book offers a principled overview of key implementation techniques that are particularly important to multidimensional databases, including materialized views, bitmap indices, join indices, and star join processing. The book ends with a chapter that presents the literature on which the book is based and offers further readings for those readers who wish to engage in more in-depth study of specific aspects of the book's subject. Table of Contents: Introduction / Fundamental Concepts / Advanced Concepts / Implementation Issues / Further Readings View full abstract»
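
    A toy flavor of the fact/dimension/measure vocabulary above (the schema and numbers are invented, and a real warehouse would of course use a DBMS rather than Python dictionaries): a small fact table is rolled up along chosen dimensions by summing the measure.

        from collections import defaultdict

        # (product, city, quarter) dimension values with a sales-amount measure.
        facts = [
            ("laptop", "Aalborg", "Q1", 1200.0),
            ("laptop", "Aalborg", "Q2",  900.0),
            ("phone",  "Aarhus",  "Q1",  300.0),
            ("phone",  "Aalborg", "Q1",  450.0),
        ]

        def roll_up(facts, keep):
            """Aggregate the measure over every dimension not listed in `keep`."""
            totals = defaultdict(float)
            for product, city, quarter, amount in facts:
                dims = {"product": product, "city": city, "quarter": quarter}
                key = tuple(dims[d] for d in keep)
                totals[key] += amount
            return dict(totals)

        if __name__ == "__main__":
            print(roll_up(facts, keep=["city"]))               # total sales by city
            print(roll_up(facts, keep=["product", "quarter"])) # sales by product and quarter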

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High Performance Datacenter Networks:Architectures, Algorithms, and Opportunities

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Datacenter networks provide the communication substrate for large parallel computer systems that form the ecosystem for high performance computing (HPC) systems and modern Internet applications. The design of new datacenter networks is motivated by an array of applications ranging from communication-intensive climatology, complex material simulations and molecular dynamics to such Internet applications as Web search, language translation, collaborative Internet applications, streaming video and voice-over-IP. For both Supercomputing and Cloud Computing the network enables distributed applications to communicate and interoperate in an orchestrated and efficient way. This book describes the design and engineering tradeoffs of datacenter networks. It describes interconnection networks from topology and network architecture to routing algorithms, and presents opportunities for taking advantage of the emerging technology trends that are influencing router microarchitecture. With the emergence of "many-core" processor chips, it is evident that we will also need "many-port" routing chips to provide a bandwidth-rich network to avoid the performance-limiting effects of Amdahl's Law. We provide an overview of conventional topologies and their routing algorithms and show how technology, signaling rates and cost-effective optics are motivating new network topologies that scale up to millions of hosts. The book also provides detailed case studies of two high performance parallel computer systems and their networks. Table of Contents: Introduction / Background / Topology Basics / High-Radix Topologies / Routing / Scalable Switch Microarchitecture / System Packaging / Case Studies / Closing Remarks View full abstract»
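
    The abstract's reference to Amdahl's Law can be made concrete with a back-of-the-envelope calculation; the parallel fraction and host counts below are illustrative values, not figures from the book.

```python
# Amdahl's Law: speedup = 1 / ((1 - p) + p / n), where p is the parallelizable
# fraction of the work and n is the number of processors. Illustrative numbers only.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even with a million hosts, a 1% serial fraction caps speedup near 100x,
# which is one reason a bandwidth-rich, low-latency network matters at scale.
for n in (64, 1024, 1_000_000):
    print(n, round(amdahl_speedup(0.99, n), 1))
```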

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Biologic Foundations for Skeletal Tissue Engineering

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Tissue engineering research for bone and joint applications entails multidisciplinary teams bringing together the needed expertise in anatomy, biology, biochemistry, pathophysiology, materials science, biomechanics, fluidics, and clinical and veterinary orthopedics. It is the goal of this volume to provide students and investigators who are entering this exciting area with an understanding of the biologic foundations necessary to appreciate the problems in bone and cartilage that may benefit from innovative tissue engineering approaches. This volume includes state-of-the-art information about bone and cartilage physiology at the levels of cell and molecular biology, tissue structure, developmental processes, their metabolic and structural functions, responses to injury, mechanisms of post-natal healing and graft incorporation, the many congenital and acquired disorders, effects of aging, and current clinical standards of care. It reviews the strengths and limitations of various experimental animal models, sources of cells, composition and design of scaffolds, activities of growth factors and genes to enhance histogenesis, and the need for new materials in the context of cell-based and cell-free tissue engineering. These building blocks constitute the dynamic environments in which innovative approaches are needed for addressing debilitating disorders of the skeleton. It is likely that a single tactic will not be sufficient for different applications because of variations in the systemic and local environments. The realizations that tissue regeneration is complex and dynamic underscore the continuing need for innovative multidisciplinary investigations, with an eye to simple and safe therapies for disabled patients. Table of Contents: Introduction / Structure and Function of Bone and Cartilage Tissue / Development / Responses to Injury and Grafting / Clinical Applications for Skeletal Tissue Engineering / Animal Models / Tissue Engineering Principles for Bone and Cartilage / Perspectives View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Graph Mining:Laws, Tools, and Case Studies

    Copyright Year: 2012

    Morgan and Claypool eBooks

    What does the Web look like? How can we find patterns, communities, and outliers in a social network? Which are the most central nodes in a network? These are the questions that motivate this work. Networks and graphs appear in many diverse settings, for example in social networks, computer-communication networks (intrusion detection, traffic management), protein-protein interaction networks in biology, document-text bipartite graphs in text retrieval, person-account graphs in financial fraud detection, and others. In this work, first we list several surprising patterns that real graphs tend to follow. Then we give a detailed list of generators that try to mirror these patterns. Generators are important, because they can help with "what if" scenarios, extrapolations, and anonymization. Then we provide a list of powerful tools for graph analysis, and specifically spectral methods (Singular Value Decomposition (SVD)), tensors, and case studies like the famous "PageRank" algorithm and the "HITS" algorithm for ranking web search results. Finally, we conclude with a survey of tools and observations from related fields like sociology, which provide complementary viewpoints. Table of Contents: Introduction / Patterns in Static Graphs / Patterns in Evolving Graphs / Patterns in Weighted Graphs / Discussion: The Structure of Specific Graphs / Discussion: Power Laws and Deviations / Summary of Patterns / Graph Generators / Preferential Attachment and Variants / Incorporating Geographical Information / The RMat / Graph Generation by Kronecker Multiplication / Summary and Practitioner's Guide / SVD, Random Walks, and Tensors / Tensors / Community Detection / Influence/Virus Propagation and Immunization / Case Studies / Social Networks / Other Related Work / Conclusions View full abstract»
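
    Since PageRank is named as one of the case studies, a toy version of the algorithm may help fix the idea. The sketch below runs power iteration on an invented four-node link graph; it illustrates the general technique, not the book's code.

```python
# Minimal PageRank by power iteration on a toy adjacency list.
import numpy as np

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}   # hypothetical graph
nodes = sorted(links)
index = {n: i for i, n in enumerate(nodes)}

# Column-stochastic transition matrix: column j spreads node j's rank over its out-links.
M = np.zeros((len(nodes), len(nodes)))
for src, outs in links.items():
    for dst in outs:
        M[index[dst], index[src]] = 1.0 / len(outs)

damping = 0.85
rank = np.full(len(nodes), 1.0 / len(nodes))
for _ in range(100):
    rank = (1 - damping) / len(nodes) + damping * (M @ rank)

print(dict(zip(nodes, np.round(rank, 3))))
```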

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Double-Grid Finite-Difference Frequency-Domain (DG-FDFD) Method for Scattering from Chiral Objects

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book presents the application of the overlapping-grids approach to solving chiral material problems using the FDFD method. Because two grids are used in the technique, we name this method the Double-Grid Finite-Difference Frequency-Domain (DG-FDFD) method. As a result of this new approach, the electric and magnetic field components are defined at every node in the computation space. Thus, there is no need to perform averaging during the calculations as in the aforementioned FDFD technique [16]. We formulate general 3D frequency-domain numerical methods based on the double-grid (DG-FDFD) approach for general bianisotropic materials. The validity of the derived formulations for different scattering problems has been shown by comparing the obtained results to exact solutions and to other solutions obtained using different numerical methods. Table of Contents: Introduction / Chiral Media / Basics of the Finite-Difference Frequency-Domain (FDFD) Method / The Double-Grid Finite-Difference Frequency-Domain (DG-FDFD) Method for Bianisotropic Medium / Scattering From Three-Dimensional Chiral Structures / Improving Time and Memory Efficiencies of FDFD Methods / Conclusions / Appendix A: Notations / Appendix B: Near to Far Field Transformation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Forensic Science:Issues, Methods, and Challenges

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Digital forensic science, or digital forensics, is the application of scientific tools and methods to identify, collect, and analyze digital (data) artifacts in support of legal proceedings. From a more technical perspective, it is the process of reconstructing the relevant sequence of events that have led to the currently observable state of a target IT system or (digital) artifacts. Over the last three decades, the importance of digital evidence has grown in lockstep with the fast societal adoption of information technology, which has resulted in the continuous accumulation of data at an exponential rate. Simultaneously, there has been a rapid growth in network connectivity and the complexity of IT systems, leading to more complex behavior that needs to be investigated. The goal of this book is to provide a systematic technical overview of digital forensic techniques, primarily from the point of view of computer science. This allows us to put the field in the broader perspective of a host of related areas and gain better insight into the computational challenges facing forensics, as well as draw inspiration for addressing them. This is needed as some of the challenges faced by digital forensics, such as cloud computing, require qualitatively different approaches; the sheer volume of data to be examined also requires new means of processing it. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Arabic Natural Language Processing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book provides system developers and researchers in natural language processing and computational linguistics with the necessary background information for working with the Arabic language. The goal is to introduce Arabic linguistic phenomena and review the state-of-the-art in Arabic processing. The book discusses Arabic script, phonology, orthography, morphology, syntax and semantics, with a final chapter on machine translation issues. The chapter sizes correspond more or less to what is linguistically distinctive about Arabic, with morphology getting the lion's share, followed by Arabic script. No previous knowledge of Arabic is needed. This book is designed for computer scientists and linguists alike. The focus of the book is on Modern Standard Arabic; however, notes on practical issues related to Arabic dialects and languages written in the Arabic script are presented in different chapters. Table of Contents: What is "Arabic"? / Arabic Script / Arabic Phonology and Orthography / Arabic Morphology / Computational Morphology Tasks / Arabic Syntax / A Note on Arabic Semantics / A Note on Arabic and Machine Translation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Microcontroller Programming and Interfacing TI MSP430:Part I

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book provides a thorough introduction to the Texas Instruments MSP430 microcontroller. The MSP430 is a 16-bit reduced instruction set (RISC) processor that features ultra-low power consumption and integrated digital and analog hardware. Variants of the MSP430 microcontroller have been in production since 1993, which has given rise to a host of MSP430 products including evaluation boards, compilers, and documentation. A thorough introduction to the MSP430 line of microcontrollers, programming techniques, and interface concepts is provided, along with considerable tutorial information and many illustrated examples. Each chapter provides laboratory exercises to apply what has been presented in the chapter. The book is intended for an upper-level undergraduate course in microcontrollers or mechatronics but may also be used as a reference for capstone design projects. Also, practicing engineers already familiar with another microcontroller, who require a quick tutorial on this microcontroller, will find this book very useful. Table of Contents: Timer Systems / Resets and Interrupts / Analog Peripherals / Communication Systems / System Level Design View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and Society:Working Towards Social Justice, Part II: Decisions in the 21st Century

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Part II: Engineering Decisions in the 21st Century. Engineers work in an increasingly complex entanglement of ideas, people, cultures, technology, systems and environments. Today, decisions made by engineers often have serious implications for not only their clients but for society as a whole and the natural world. Such decisions may potentially influence cultures, ways of living, as well as alter ecosystems which are in delicate balance. In order to make appropriate decisions and to co-create ideas and innovations within and among the complex networks of communities which currently exist and are shaped by our decisions, we need to regain our place as professionals, to realise the significance of our work and to take responsibility in a much deeper sense. Engineers must develop the 'ability to respond' to emerging needs of all people, across all cultures. To do this requires insights and knowledge which are at present largely within the domain of the social and political sciences but which need to be shared with our students in ways which are meaningful and relevant to engineering. This book attempts to do just that. In Part 1 Baillie introduces ideas associated with the ways in which engineers relate to the communities in which they work. Drawing on scholarship from science and technology studies, globalisation and development studies, as well as work in science communication and dialogue, this introductory text sets the scene for an engineering community which engages with the public. In Part 2 Catalano frames the thinking processes necessary to create ethical and just decisions in engineering, to understand the implications of our current decision making processes and think about ways in which we might adapt these to become more socially just in the future. In Part 3 Baillie and Catalano have provided case studies of everyday issues such as water, garbage and alarm clocks, to help us consider how we might see through the lenses of our new knowledge from Parts 1 and 2 and apply this to our everyday existence as engineers. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    XML Retrieval

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Documents usually have content and structure. The content refers to the text of the document, whereas the structure refers to how a document is logically organized. An increasingly common way to encode the structure is through the use of a mark-up language. Nowadays, the most widely used mark-up language for representing structure is the eXtensible Mark-up Language (XML). XML can be used to provide focused access to documents, i.e. returning XML elements, such as sections and paragraphs, instead of whole documents in response to a query. Such focused strategies are of particular benefit for information repositories containing long documents, or documents covering a wide variety of topics, where users are directed to the most relevant content within a document. The increased adoption of XML to represent a document structure requires the development of tools to effectively access documents marked-up in XML. This book provides a detailed description of query languages, indexing strategies, ranking algorithms, and presentation scenarios developed to access XML documents. Major advances in XML retrieval were seen from 2002 as a result of INEX, the Initiative for Evaluation of XML Retrieval. INEX, also described in this book, provided test sets for evaluating XML retrieval effectiveness. Many of the developments and results described in this book were investigated within INEX. Table of Contents: Introduction / Basic XML Concepts / Historical Perspectives / Query Languages / Indexing Strategies / Ranking Strategies / Presentation Strategies / Evaluating XML Retrieval Effectiveness / Conclusions View full abstract»
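
    The idea of focused retrieval, returning elements rather than whole documents, can be illustrated in a few lines. The sketch below uses Python's standard ElementTree on an invented document; real XML retrieval systems rank elements rather than simply filtering them, so this only shows the element-granularity idea.

```python
# Return matching <p> elements instead of the whole document (tag names invented).
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<article>
  <sec><title>Introduction</title><p>XML encodes document structure.</p></sec>
  <sec><title>Retrieval</title><p>Focused retrieval returns elements.</p></sec>
</article>
""")

query = "retrieval"
for p in doc.iter("p"):
    if query in p.text.lower():
        print(p.text)
```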

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Cross-Language Information Retrieval

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Search for information is no longer exclusively limited to the native language of the user, but is more and more extended to other languages. This gives rise to the problem of cross-language information retrieval (CLIR), whose goal is to find relevant information written in a language different from that of the query. In addition to the problems of monolingual information retrieval (IR), translation is the key problem in CLIR: one should translate either the query or the documents from one language to another. However, this translation problem is not identical to full-text machine translation (MT): the goal is not to produce a human-readable translation, but a translation suitable for finding relevant documents. Specific translation methods are thus required. The goal of this book is to provide a comprehensive description of the specific problems arising in CLIR, the solutions proposed in this area, as well as the remaining problems. The book starts with a general description of the monolingual IR and CLIR problems. Different classes of approaches to translation are then presented: approaches using an MT system, dictionary-based translation and approaches based on parallel and comparable corpora. In addition, the typical retrieval effectiveness using different approaches is compared. It will be shown that translation approaches specifically designed for CLIR can rival and outperform high-quality MT systems. Finally, the book offers a look into the future that draws a strong parallel between query expansion in monolingual IR and query translation in CLIR, suggesting that many approaches developed in monolingual IR can be adapted to CLIR. The book can be used as an introduction to CLIR. Advanced readers can also find more technical details and discussions about the remaining research challenges in the future. It is suitable for new researchers who intend to carry out research on CLIR. Table of Contents: Preface / Introduction / Using Manually Constructed Translation Systems and Resources for CLIR / Translation Based on Parallel and Comparable Corpora / Other Methods to Improve CLIR / A Look into the Future: Toward a Unified View of Monolingual IR and CLIR? / References / Author Biography View full abstract»
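
    One of the approaches the book covers, dictionary-based query translation, can be sketched very simply: replace each query term with its candidate translations and retrieve with the translated query. The tiny bilingual dictionary below is invented for illustration.

```python
# Dictionary-based query translation for CLIR (toy example, invented dictionary).
bilingual = {
    "maison": ["house", "home"],
    "verte": ["green"],
}

def translate_query(query: str) -> list[str]:
    translated = []
    for term in query.lower().split():
        # Keep terms with no dictionary entry (e.g., names) untranslated.
        translated.extend(bilingual.get(term, [term]))
    return translated

print(translate_query("maison verte"))   # ['house', 'home', 'green']
```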

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Resilient Architecture Design for Voltage Variation

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Shrinking feature size and diminishing supply voltage are making circuits sensitive to supply voltage fluctuations within the microprocessor, caused by normal workload activity changes. If left unattended, voltage fluctuations can lead to timing violations or even transistor lifetime issues that degrade processor robustness. Mechanisms that learn to tolerate, avoid, and eliminate voltage fluctuations based on program and microarchitectural events can help steer the processor clear of danger, thus enabling tighter voltage margins that improve performance or lower power consumption. We describe the problem of voltage variation and the factors that influence this variation during processor design and operation. We also describe a variety of runtime hardware and software mitigation techniques that either tolerate, avoid, and/or eliminate voltage violations. We hope processor architects will find the information useful since tolerance, avoidance, and elimination are generalizable constructs that can serve as a basis for addressing other reliability challenges as well. Table of Contents: Introduction / Modeling Voltage Variation / Understanding the Characteristics of Voltage Variation / Traditional Solutions and Emerging Solution Forecast / Allowing and Tolerating Voltage Emergencies / Predicting and Avoiding Voltage Emergencies / Eliminating Recurring Voltage Emergencies / Future Directions on Resiliency View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Linked Data:Evolving the Web into a Global Data Space

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The World Wide Web has enabled the creation of a global information space comprising linked documents. As the Web becomes ever more enmeshed with our daily lives, there is a growing desire for direct access to raw data not currently available on the Web or bound up in hypertext documents. Linked Data provides a publishing paradigm in which not only documents, but also data, can be a first class citizen of the Web, thereby enabling the extension of the Web with a global data space based on open standards - the Web of Data. In this Synthesis lecture we provide readers with a detailed technical introduction to Linked Data. We begin by outlining the basic principles of Linked Data, including coverage of relevant aspects of Web architecture. The remainder of the text is based around two main themes - the publication and consumption of Linked Data. Drawing on a practical Linked Data scenario, we provide guidance and best practices on: architectural approaches to publishing Linked Data; choosing URIs and vocabularies to identify and describe resources; deciding what data to return in a description of a resource on the Web; methods and frameworks for automated linking of data sets; and testing and debugging approaches for Linked Data deployments. We give an overview of existing Linked Data applications and then examine the architectures that are used to consume Linked Data from the Web, alongside existing tools and frameworks that enable these. Readers can expect to gain a rich technical understanding of Linked Data fundamentals, as the basis for application development, research or further study. Table of Contents: List of Figures / Introduction / Principles of Linked Data / The Web of Data / Linked Data Design Considerations / Recipes for Publishing Linked Data / Consuming Linked Data / Summary and Outlook View full abstract»
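
    As a rough sketch of the Linked Data principles described above, the example below names things with URIs and describes them with RDF triples using the rdflib library; the namespace and resources are invented, and dereferencing a real Linked Data URI would return such a description over HTTP.

```python
# Describing resources identified by URIs with RDF triples (invented example data).
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, EX.knows, EX.bob))
g.add((EX.alice, EX.name, Literal("Alice")))

# Everything this tiny graph knows about http://example.org/alice.
for s, p, o in g.triples((EX.alice, None, None)):
    print(s, p, o)
```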

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mathematical Tools for Shape Analysis and Description

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book is a guide for researchers and practitioners to the new frontiers of 3D shape analysis and the complex mathematical tools most methods rely on. The target reader includes students, researchers and professionals with an undergraduate mathematics background, who wish to understand the mathematics behind shape analysis. The authors begin with a quick review of basic concepts in geometry, topology, differential geometry, and proceed to advanced notions of algebraic topology, always keeping an eye on the application of the theory, through examples of shape analysis methods such as 3D segmentation, correspondence, and retrieval. A number of research solutions in the field come from advances in pure and applied mathematics, as well as from the re-reading of classical theories and their adaptation to the discrete setting. In a world where disciplines (fortunately) have blurred boundaries, the authors believe that this guide will help to bridge the distance between theory and practice. Table of Contents: Acknowledgments / Figure Credits / About this Book / 3D Shape Analysis in a Nutshell / Geometry, Topology, and Shape Representation / Differential Geometry and Shape Analysis / Spectral Methods for Shape Analysis / Maps and Distances between Spaces / Algebraic Topology and Topology Invariants / Differential Topology and Shape Analysis / Reeb Graphs / Morse and Morse-Smale Complexes / Topological Persistence / Beyond Geometry and Topology / Resources / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Biomedical Engineering:Biomechanics and Bioelectricity

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Intended as an introduction to the field of biomedical engineering, this book covers the topics of biomechanics (Part I) and bioelectricity (Part II). Each chapter emphasizes a fundamental principle or law, such as Darcy's Law, Poiseuille's Law, Hooke's Law, Starling's Law, levers, and work in the area of fluid, solid, and cardiovascular biomechanics. In addition, electrical laws and analysis tools are introduced, including Ohm's Law, Kirchhoff's Laws, Coulomb's Law, capacitors, and the fluid/electrical analogy. Culminating the electrical portion are chapters covering Nernst and membrane potentials and Fourier transforms. Examples are solved throughout the book and problems with answers are given at the end of each chapter. A semester-long Major Project that models the human systemic cardiovascular system, utilizing both a Matlab numerical simulation and an electrical analog circuit, ties many of the book's concepts together. Table of Contents: Ohm's Law: Current, Voltage and Resistance / Kirchhoff's Voltage and Current Laws: Circuit Analysis / Operational Amplifiers / Coulomb's Law, Capacitors and the Fluid/Electrical Analogy / Series and Parallel Combinations / Thevenin Equivalent Circuits / Nernst Potential: Cell Membrane Equivalent Circuit / Fourier Transforms: Alternating Currents (AC) View full abstract»
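
    A worked example of one of the laws listed above, the Nernst potential, may be useful here; the potassium concentrations below are typical textbook values chosen only for illustration.

```python
# Nernst potential: E = (R*T / (z*F)) * ln([ion]_out / [ion]_in).
import math

R = 8.314      # gas constant, J/(mol*K)
T = 310.0      # body temperature, K
F = 96485.0    # Faraday constant, C/mol
z = 1          # valence of K+

K_out, K_in = 5.0, 140.0   # mmol/L, typical textbook values
E = (R * T / (z * F)) * math.log(K_out / K_in)
print(f"E_K is about {E * 1000:.0f} mV")   # roughly -89 mV
```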

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Statistical Language Models for Information Retrieval

    Copyright Year: 2009

    Morgan and Claypool eBooks

    As online information grows dramatically, search engines such as Google are playing a more and more important role in our lives. Critical to all search engines is the problem of designing an effective retrieval model that can rank documents accurately for a given query. This has been a central research problem in information retrieval for several decades. In the past ten years, a new generation of retrieval models, often referred to as statistical language models, has been successfully applied to solve many different information retrieval problems. Compared with the traditional models such as the vector space model, these new models have a more sound statistical foundation and can leverage statistical estimation to optimize retrieval parameters. They can also be more easily adapted to model non-traditional and complex retrieval problems. Empirically, they tend to achieve comparable or better performance than a traditional model with less effort on parameter tuning. This book systematically reviews the large body of literature on applying statistical language models to information retrieval with an emphasis on the underlying principles, empirically effective language models, and language models developed for non-traditional retrieval tasks. All the relevant literature has been synthesized to make it easy for a reader to digest the research progress achieved so far and see the frontier of research in this area. The book also offers practitioners an informative introduction to a set of practically useful language models that can effectively solve a variety of retrieval problems. No prior knowledge about information retrieval is required, but some basic knowledge about probability and statistics would be useful for fully digesting all the details. Table of Contents: Introduction / Overview of Information Retrieval Models / Simple Query Likelihood Retrieval Model / Complex Query Likelihood Model / Probabilistic Distance Retrieval Model / Language Models for Special Retrieval Tasks / Language Models for Latent Topic Analysis / Conclusions View full abstract»
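
    The simple query likelihood model mentioned in the table of contents can be sketched in a few lines: score each document by the probability of generating the query from a smoothed document language model. The toy collection and the Dirichlet prior value below are illustrative only.

```python
# Query likelihood with Dirichlet smoothing:
# P(t|D) = (tf(t,D) + mu * P(t|C)) / (|D| + mu); score(Q,D) = sum_t log P(t|D).
import math
from collections import Counter

docs = {
    "d1": "the quick brown fox".split(),
    "d2": "language models rank documents for a query".split(),
}
collection = [t for d in docs.values() for t in d]
coll_freq = Counter(collection)
mu = 2000.0

def score(query: str, doc: list[str]) -> float:
    tf = Counter(doc)
    total = 0.0
    for t in query.split():
        p_coll = coll_freq[t] / len(collection)
        if p_coll == 0:
            continue   # in this toy sketch, skip terms unseen in the collection
        total += math.log((tf[t] + mu * p_coll) / (len(doc) + mu))
    return total

print(sorted(docs, key=lambda d: score("language query", docs[d]), reverse=True))
```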

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computer Architecture Performance Evaluation Methods

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Performance evaluation is at the foundation of computer architecture research and development. Contemporary microprocessors are so complex that architects cannot design systems based on intuition and simple models only. Adequate performance evaluation methods are absolutely crucial to steer the research and development process in the right direction. However, rigorous performance evaluation is non-trivial as there are multiple aspects to performance evaluation, such as picking workloads, selecting an appropriate modeling or simulation approach, running the model and interpreting the results using meaningful metrics. Each of these aspects is equally important and a performance evaluation method that lacks rigor in any of these crucial aspects may lead to inaccurate performance data and may drive research and development in a wrong direction. The goal of this book is to present an overview of the current state-of-the-art in computer architecture performance evaluation, with a special emphasis on methods for exploring processor architectures. The book focuses on fundamental concepts and ideas for obtaining accurate performance data. The book covers various topics in performance evaluation, ranging from performance metrics, to workload selection, to various modeling approaches including mechanistic and empirical modeling. And because simulation is by far the most prevalent modeling technique, more than half the book's content is devoted to simulation. The book provides an overview of the simulation techniques in the computer designer's toolbox, followed by various simulation acceleration techniques including sampled simulation, statistical simulation, parallel simulation and hardware-accelerated simulation. Table of Contents: Introduction / Performance Metrics / Workload Design / Analytical Performance Modeling / Simulation / Sampled Simulation / Statistical Simulation / Parallel Simulation and Hardware Acceleration / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Processor Microarchitecture:An Implementation Perspective

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This lecture presents a study of the microarchitecture of contemporary microprocessors. The focus is on implementation aspects, with discussions on their implications in terms of performance, power, and cost of state-of-the-art designs. The lecture starts with an overview of the different types of microprocessors and a review of the microarchitecture of cache memories. Then, it describes the implementation of the fetch unit, where special emphasis is made on the required support for branch prediction. The next section is devoted to instruction decode with special focus on the particular support to decoding x86 instructions. The next chapter presents the allocation stage and pays special attention to the implementation of register renaming. Afterward, the issue stage is studied. Here, the logic to implement out-of-order issue for both memory and non-memory instructions is thoroughly described. The following chapter focuses on the instruction execution and describes the different functional units that can be found in contemporary microprocessors, as well as the implementation of the bypass network, which has an important impact on the performance. Finally, the lecture concludes with the commit stage, where it describes how the architectural state is updated and recovered in case of exceptions or misspeculations. This lecture is intended for an advanced course on computer architecture, suitable for graduate students or senior undergrads who want to specialize in the area of computer architecture. It is also intended for practitioners in the industry in the area of microprocessor design. The book assumes that the reader is familiar with the main concepts regarding pipelining, out-of-order execution, cache memories, and virtual memory. Table of Contents: Introduction / Caches / The Instruction Fetch Unit / Decode / Allocation / The Issue Stage / Execute / The Commit Stage / References / Author Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Image Processing for Ophthalmology:Detection of the Optic Nerve Head

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Fundus images of the retina are color images of the eye taken by specially designed digital cameras. Ophthalmologists rely on fundus images to diagnose various diseases that affect the eye, such as diabetic retinopathy and retinopathy of prematurity. A crucial preliminary step in the analysis of retinal images is the identification and localization of important anatomical structures, such as the optic nerve head (ONH), the macula, and the major vascular arcades. Identification of the ONH is an important initial step in the detection and analysis of the anatomical structures and pathological features in the retina. Different types of retinal pathology may be detected and analyzed via the application of appropriately designed techniques of digital image processing and pattern recognition. Computer-aided analysis of retinal images has the potential to facilitate quantitative and objective analysis of retinal lesions and abnormalities. Accurate identification and localization of retinal features and lesions could contribute to improved diagnosis, treatment, and management of retinopathy. This book presents an introduction to diagnostic imaging of the retina and an overview of image processing techniques for ophthalmology. In particular, digital image processing algorithms and pattern analysis techniques for the detection of the ONH are described. In fundus images, the ONH usually appears as a bright region, white or yellow in color, and is indicated as the convergent area of the network of blood vessels. Use of the geometrical and intensity characteristics of the ONH, as well as the property that the ONH represents the location of entrance of the blood vessels and the optic nerve into the retina, is demonstrated in developing the methods. The image processing techniques described in the book include morphological filters for preprocessing fundus images, filters for edge detection, the Hough transform for the detection of lines and circles, Gabor filters to detect the blood vessels, and phase portrait analysis for the detection of convergent or node-like patterns. Illustrations of application of the methods to fundus images from two publicly available databases are presented, in terms of locating the center and the boundary of the ONH. Methods for quantitative evaluation of the results of detection of the ONH using measures of overlap and free-response receiver operating characteristics are also described. Table of Contents: Introduction / Computer-aided Analysis of Images of the Retina / Detection of Geometrical Patterns / Datasets and Experimental Setup / Detection of the Optic Nerve Head Using the Hough Transform / Detection of the Optic Nerve Head Using Phase Portraits / Concluding Remarks View full abstract»
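
    To make the Hough transform step concrete, the sketch below looks for circular candidates in a fundus image using OpenCV; the file name is a placeholder and the parameter values are illustrative, not the tuned settings described in the book.

```python
# Circle detection with the Hough transform as a rough ONH candidate finder.
import cv2
import numpy as np

img = cv2.imread("fundus.png")                 # placeholder file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)                 # lightly suppress vessel detail

circles = cv2.HoughCircles(
    gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=100,
    param1=100, param2=30, minRadius=20, maxRadius=80,
)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"candidate centre ({x}, {y}), radius {r}")
```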

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Human Side of Engineering

    Copyright Year: 2017

    Morgan and Claypool eBooks

    While in many university courses attention is given to the human side, as opposed to the technical side of engineering, it is by and large an afterthought. Engineering is, however, a technical, social, and personal activity. Several studies show that engineering is a community activity of professionals in which communication is central to the engineering task. Increasingly, technology impacts everyone in society. Acting as a professional community, engineers have an awesome power to influence society but they can only act for the common good if they understand the nature of our society. To achieve such understanding they have to understand themselves. This book is about understanding ourselves in order to understand others, and understanding others in order to understand ourselves in the context of engineering and the society it serves. To achieve this understanding this book takes the reader on 12 intellectual journeys that frame the big questions confronting the engineering professions. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Hybrid Imagination:Science and Technology in Cultural Perspective

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book presents a cultural perspective on scientific and technological development. As opposed to the "story-lines" of economic innovation and social construction that tend to dominate both the popular and scholarly literature on science, technology and society (or STS), the authors offer an alternative approach, devoting special attention to the role played by social and cultural movements in the making of science and technology. They show how social and cultural movements, from the Renaissance of the late 15th century to the environmental and global justice movements of our time, have provided contexts, or sites, for mixing scientific knowledge and technical skills from different fields and social domains into new combinations, thus fostering what the authors term a "hybrid imagination." Such a hybrid imagination is especially important today, as a way to counter the competitive and commercial "hubris" that is so much taken for granted in contemporary science and engineering discourses and practices with a sense of cooperation and social responsibility. The book portrays the history of science and technology as an underlying tension between hubris -- literally the ambition to "play god" on the part of many a scientist and engineer and neglect the consequences - and a hybrid imagination, connecting scientific "facts" and technological "artifacts" with cultural understanding. The book concludes with chapters on the recent transformations in the modes of scientific and technological production since the Second World War and the contending approaches to "greening" science and technology in relation to the global quest for sustainable development. The book is based on a series of lectures that were given by Andrew Jamison at the Technical University of Denmark in 2010 and draws on the authors' many years of experience in teaching non-technical, or contextual knowledge, to science and engineering students. The book has been written as part of the Program of Research on Opportunities and Challenges in Engineering Education in Denmark (PROCEED) supported by the Danish Strategic Research Council from 2010 to 2013. Table of Contents: Introduction / Perceptions of Science and Technology / Where Did Science and Technology Come From? / Science, Technology and Industrialization / Science, Technology and Modernization / Science, Technology and Globalization / The Greening of Science and Technology View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Estimation of Cortical Connectivity in Humans: Advanced Signal Processing Techniques

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In the last ten years, many different brain imaging devices have conveyed a great deal of information about brain functioning under different experimental conditions. In every case, biomedical engineers, together with mathematicians, physicists, and physicians, are called on to process the signals related to brain activity in order to extract meaningful and robust information to correlate with the external behavior of the subjects. In this effort, different signal processing tools used in telecommunications and other fields of engineering, and even in the social sciences, have been adapted and re-used in the neuroscience field. This book offers a short presentation of several methods for the estimation of the cortical connectivity of the human brain. The methods presented here are relatively simple to implement and robust, and they can return valuable information about the causality of the activation of the different cortical areas in humans using non-invasive electroencephalographic recordings. Knowledge of these signal processing tools will enrich the arsenal of computational methods that an engineer or a mathematician can apply to the processing of brain signals. Table of Contents: Introduction / Estimation of the Effective Connectivity from Stationary Data by Structural Equation Modeling / Estimation of the Functional Connectivity from Stationary Data by Multivariate Autoregressive Methods / Estimation of Cortical Activity by the use of Realistic Head Modeling / Application: Estimation of Connectivity from Movement-Related Potentials / Application to High-Resolution EEG Recordings in a Cognitive Task (Stroop Test) / Application to Data Related to the Intention of Limb Movements in Normal Subjects and in a Spinal Cord Injured Patient / The Instantaneous Estimation of the Time-Varying Cortical Connectivity by Adaptive Multivariate Estimators / Time-Varying Connectivity from Event-Related Potentials View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Simplified Models for Assessing Heat and Mass Transfer in Evaporative Towers

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The aim of this book is to supply valid and reasonable parameters to guide the choice of the right model of industrial evaporative tower according to operating conditions, which vary with the particular industrial context: power plants, chemical plants, food processing plants, and other industrial facilities each have specific assets and requirements that must be satisfied. Evaporative cooling is increasingly employed whenever a significant water flow, at a temperature not far from ambient, is needed to remove a substantial heat load; it cools the water flow through partial evaporation of that same flow. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tensor Properties of Solids

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Tensor Properties of Solids presents the phenomenological development of solid state properties represented as matter tensors in two parts: Part I on equilibrium tensor properties and Part II on transport tensor properties. Part I begins with an introduction to tensor notation, transformations, algebra, and calculus together with the matrix representations. Crystallography, as it relates to tensor properties of crystals, completes the background treatment. A generalized treatment of solid-state equilibrium thermodynamics leads to the systematic correlation of equilibrium tensor properties. This is followed by developments covering first-, second-, third-, and higher-order tensor effects. Included are the generalized compliance and rigidity matrices for first-order tensor properties, Maxwell relations, effect of measurement conditions, and the dependent coupled effects and use of interaction diagrams. Part I concludes with the second- and higher-order effects, including numerous optical tensor properties. Part II presents the driving forces and fluxes for the well-known proper conductivities. An introduction to irreversible thermodynamics includes the concepts of microscopic reversibility, Onsager's reciprocity principle, entropy density production, and the proper choice of the transport parameters. This is followed by the force-flux equations for electronic charge and heat flow and the relationships between the proper conductivities and phenomenological coefficients. The thermoelectric effects in solids are discussed and extended to the piezothermoelectric and piezoresistance tensor effects. The subjects of thermomagnetic, galvanomagnetic, and thermogalvanomagnetic effects are developed together with other higher-order magnetotransport property tensors. A glossary of terms, expressions, and symbols is provided at the end of the text, and end-of-chapter problems are provided on request. Endnotes provide the necessary references for further reading. Table of Contents: I. Equilibrium Tensor Properties of Solids / Introduction / Introduction to Tensor Notation, Tensor Transformations, Tensor Calculus, and Matrix Representation / Crystal Systems, Symmetry Elements, and Symmetry Transformations / Generalized Thermostatics and the Systematic Correlation of Physical Properties / The Dependent Coupled Effects and the Interrelationships Between First-Order Tensor Properties - Use of Interaction Diagrams / Third- and Fourth-Rank Tensor Properties - Symmetry Considerations / Second- and Higher-Order Effects - Symmetry Considerations / II. Transport Properties of Solids / Introduction to Transport Properties and the Thermodynamics of Irreversible Processes / Thermoelectric, Piezothermoelectric, and Diffusive Effects in Solids / Effect of Magnetic Field on the Transport Properties / Appendix A: Magnetic Tensor Properties, Magnetic Crystals, and the Combined Space-Time Transformations / Endnotes / Glossary / Biography / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Logic

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book is a gentle but rigorous introduction to formal logic. It is intended primarily for use at the college level. However, it can also be used for advanced secondary school students, and it can be used at the start of graduate school for those who have not yet seen the material. The approach to teaching logic used here emerged from more than 20 years of teaching logic to students at Stanford University and from teaching logic to tens of thousands of others via online courses on the World Wide Web. The approach differs from that taken by other books in logic in two essential ways, one having to do with content, the other with form. Like many other books on logic, this one covers logical syntax and semantics and proof theory plus induction. However, unlike other books, this book begins with Herbrand semantics rather than the more traditional Tarskian semantics. This approach makes the material considerably easier for students to understand and leaves them with a deeper understanding of what logic is all about. The primary content difference concerns the semantics of the logic that is taught. In addition to this text, there are online exercises (with automated grading), online logic tools and applications, online videos of lectures, and an online forum for discussion. They are available at logic.stanford.edu/intrologic/. Table of Contents: Introduction / Propositional Logic / Propositional Proofs / Propositional Resolution / Satisfiability / Herbrand Logic / Herbrand Logic Proofs / Resolution / Induction / First Order Logic View full abstract»
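
    The propositional part of the book can be illustrated with a brute-force truth-table check for satisfiability; the formula below, (p or q) and (not p or r), is an arbitrary example, not one from the text.

```python
# Enumerate all truth assignments and keep the ones that satisfy the formula.
from itertools import product

def formula(p: bool, q: bool, r: bool) -> bool:
    return (p or q) and ((not p) or r)

models = [vals for vals in product([False, True], repeat=3) if formula(*vals)]
print(f"satisfiable: {bool(models)}; one model (p, q, r) = {models[0]}")
```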

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Web Indicators for Research Evaluation:A Practical Guide

    Copyright Year: 2016

    Morgan and Claypool eBooks

    In recent years there has been an increasing demand for research evaluation within universities and other research-based organisations. In parallel, there has been an increasing recognition that traditional citation-based indicators are not able to reflect the societal impacts of research and are slow to appear. This has led to the creation of new indicators for different types of research impact as well as timelier indicators, mainly derived from the Web. These indicators have been called altmetrics, webometrics or just web metrics. This book describes and evaluates a range of web indicators for aspects of societal or scholarly impact, discusses the theory and practice of using and evaluating web indicators for research assessment and outlines practical strategies for obtaining many web indicators. In addition to describing impact indicators for traditional scholarly outputs, such as journal articles and monographs, it also covers indicators for videos, datasets, software and other non-standard scholarly outputs. The book describes strategies to analyse web indicators for individual publications as well as to compare the impacts of groups of publications. The practical part of the book includes descriptions of how to use the free software Webometric Analyst to gather and analyse web data. This book is written for information science undergraduate and Master’s students that are learning about alternative indicators or scientometrics as well as Ph.D. students and other researchers and practitioners using indicators to help assess research impact or to study scholarly communication. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Planning with Markov Decision Processes:An AI Perspective

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Markov Decision Processes (MDPs) are widely popular in Artificial Intelligence for modeling sequential decision-making scenarios with probabilistic dynamics. They are the framework of choice when designing an intelligent agent that needs to act for long periods of time in an environment where its actions could have uncertain outcomes. MDPs are actively researched in two related subareas of AI, probabilistic planning and reinforcement learning. Probabilistic planning assumes known models for the agent's goals and domain dynamics, and focuses on determining how the agent should behave to achieve its objectives. On the other hand, reinforcement learning additionally learns these models based on the feedback the agent gets from the environment. This book provides a concise introduction to the use of MDPs for solving probabilistic planning problems, with an emphasis on the algorithmic perspective. It covers the whole spectrum of the field, from the basics to state-of-the-art optimal and approximation algorithms. We first describe the theoretical foundations of MDPs and the fundamental solution techniques for them. We then discuss modern optimal algorithms based on heuristic search and the use of structured representations. A major focus of the book is on the numerous approximation schemes for MDPs that have been developed in the AI literature. These include determinization-based approaches, sampling techniques, heuristic functions, dimensionality reduction, and hierarchical representations. Finally, we briefly introduce several extensions of the standard MDP classes that model and solve even more complex planning problems. Table of Contents: Introduction / MDPs / Fundamental Algorithms / Heuristic Search Algorithms / Symbolic Algorithms / Approximation Algorithms / Advanced Notes View full abstract»
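
    One of the fundamental algorithms covered, value iteration, fits in a few lines for a toy MDP; the two states, transition probabilities, and rewards below are invented for illustration.

```python
# Synchronous value iteration: V(s) <- max_a sum_outcomes p * (r + gamma * V(s')).
# P[s][a] is a list of (probability, next_state, reward) triples (invented MDP).
P = {
    0: {"stay": [(1.0, 0, 0.0)], "go": [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
    1: {"stay": [(1.0, 1, 1.0)], "go": [(1.0, 0, 0.0)]},
}
gamma = 0.9
V = {0: 0.0, 1: 0.0}

for _ in range(200):
    V = {
        s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
               for outcomes in P[s].values())
        for s in P
    }

print({s: round(v, 2) for s, v in V.items()})
```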

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Perspectives on Business Intelligence

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In the 1980s, traditional Business Intelligence (BI) systems focused on the delivery of reports that describe the state of business activities in the past, answering questions such as "How did our sales perform during the last quarter?" A decade later, there was a shift to more interactive content that presented how the business was performing at the present time, answering questions like "How are we doing right now?" Today, the focus of BI users is on the future: "Given what I did before and how I am currently doing this quarter, how will I do next quarter?" Furthermore, fuelled by the demands of Big Data, BI systems are going through a time of incredible change. Predictive analytics, high volume data, unstructured data, social data, mobile, consumable analytics, and data visualization are all examples of demands and capabilities that have become critical within just the past few years, and are growing at an unprecedented pace. This book introduces research problems and solutions on various aspects central to next-generation BI systems. It begins with a chapter on an industry perspective on how BI has evolved, and discusses how game-changing trends have drastically reshaped the landscape of BI. One of the game changers is the shift toward the consumerization of BI tools. As a result, for BI tools to be successfully used by business users (rather than IT departments), the tools need a business model, rather than a data model. One chapter of the book surveys four different types of business modeling. However, even with the existence of a business model for users to express queries, the data that can meet the needs are still captured within a data model. The next chapter on vivification addresses the problem of closing the gap, which is often significant, between the business and the data models. Moreover, Big Data forces BI systems to integrate and consolidate multiple, and often wildly different, data sources. One chapter gives an overview of several integration architectures for dealing with the challenges that need to be overcome. While the book so far focuses on the usual structured relational data, the remaining chapters turn to unstructured data, an ever-increasing and important component of Big Data. One chapter on information extraction describes methods for dealing with the extraction of relations from free text and the web. Finally, BI users need tools to visualize and interpret new and complex types of information in a way that is compelling, intuitive, and accurate. The last chapter gives an overview of information visualization for decision support and text. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Semi-Supervised Learning

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Semi-supervised learning is a learning paradigm concerned with the study of how computers and natural systems such as humans learn in the presence of both labeled and unlabeled data. Traditionally, learning has been studied either in the unsupervised paradigm (e.g., clustering, outlier detection) where all the data are unlabeled, or in the supervised paradigm (e.g., classification, regression) where all the data are labeled. The goal of semi-supervised learning is to understand how combining labeled and unlabeled data may change the learning behavior, and design algorithms that take advantage of such a combination. Semi-supervised learning is of great interest in machine learning and data mining because it can use readily available unlabeled data to improve supervised learning tasks when the labeled data are scarce or expensive. Semi-supervised learning also shows potential as a quantitative tool to understand human category learning, where most of the input is self-evidently unlabeled. In this introductory book, we present some popular semi-supervised learning models, including self-training, mixture models, co-training and multiview learning, graph-based methods, and semi-supervised support vector machines. For each model, we discuss its basic mathematical formulation. The success of semi-supervised learning depends critically on some underlying assumptions. We emphasize the assumptions made by each model and give counterexamples when appropriate to demonstrate the limitations of the different models. In addition, we discuss semi-supervised learning for cognitive psychology. Finally, we give a computational learning theoretic perspective on semi-supervised learning, and we conclude the book with a brief discussion of open questions in the field. Table of Contents: Introduction to Statistical Machine Learning / Overview of Semi-Supervised Learning / Mixture Models and EM / Co-Training / Graph-Based Semi-Supervised Learning / Semi-Supervised Support Vector Machines / Human Semi-Supervised Learning / Theory and Outlook View full abstract»
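
    Self-training, the first model listed above, can be sketched as a short loop: fit on the labeled points, pseudo-label the unlabeled points the classifier is most confident about, and refit. The scikit-learn classifier, synthetic data, and confidence threshold below are illustrative choices, not the book's.

```python
# Self-training sketch on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=0)
labeled = np.zeros(len(y), dtype=bool)
labeled[:20] = True                          # pretend only 20 points are labeled

clf = LogisticRegression()
for _ in range(5):
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[~labeled])
    confident = proba.max(axis=1) > 0.95     # most confident unlabeled points
    if not confident.any():
        break
    idx = np.flatnonzero(~labeled)[confident]
    y[idx] = clf.predict(X[idx])             # assign pseudo-labels
    labeled[idx] = True

print(f"labeled after self-training: {labeled.sum()} of {len(y)}")
```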

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Database Replication

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Database replication is widely used for fault-tolerance, scalability and performance. The failure of one database replica does not stop the system from working as available replicas can take over the tasks of the failed replica. Scalability can be achieved by distributing the load across all replicas, and adding new replicas should the load increase. Finally, database replication can provide fast local access, even for geographically distributed clients, if data copies are located close to them. Despite its advantages, replication is not a straightforward technique to apply, and there are many hurdles to overcome. At the forefront is replica control: assuring that data copies remain consistent when updates occur. There exist many alternatives in regard to where updates can occur and when changes are propagated to data copies, how changes are applied, where the replication tool is located, etc. A particular challenge is to combine replica control with transaction management as it requires several operations to be treated as a single logical unit, and it provides atomicity, consistency, isolation and durability across the replicated system. The book provides a categorization of replica control mechanisms, presents several replica and concurrency control mechanisms in detail, and discusses many of the issues that arise when such solutions need to be implemented within or on top of relational database systems. Furthermore, the book presents the tasks that are needed to build a fault-tolerant replication solution, provides an overview of load-balancing strategies that allow load to be equally distributed across all replicas, and introduces the concept of self-provisioning that allows the replicated system to dynamically decide on the number of replicas that are needed to handle the current load. As performance evaluation is a crucial aspect when developing a replication tool, the book presents an analytical model of the scalability potential of various replication solutions. For readers that are only interested in getting a good overview of the challenges of database replication and the general mechanisms of how to implement replication solutions, we recommend reading Chapters 1 to 4. For readers that want to get a more complete picture and a discussion of advanced issues, we further recommend Chapters 5, 8, 9, and 10. Finally, Chapters 6 and 7 are of interest for those who want to get familiar with thorough algorithm design and correctness reasoning. Table of Contents: Overview / 1-Copy-Equivalence and Consistency / Basic Protocols / Replication Architecture / The Scalability of Replication / Eager Replication and 1-Copy-Serializability / 1-Copy-Snapshot Isolation / Lazy Replication / Self-Configuration and Elasticity / Other Aspects of Replication View full abstract»
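    As a toy illustration of one point in the design space just described (a primary copy that eagerly pushes every write to all secondaries before acknowledging), the following Python sketch is illustrative only; it does not reproduce any protocol from the book, and the class and method names are invented for the example.

        # Illustrative primary-copy replication sketch (not a protocol from the book).
        # The primary applies each write locally and eagerly propagates it to all
        # secondaries before acknowledging, so copies stay consistent absent failures.
        class Replica:
            def __init__(self, name):
                self.name = name
                self.data = {}

            def apply(self, key, value):
                self.data[key] = value

        class Primary(Replica):
            def __init__(self, name, secondaries):
                super().__init__(name)
                self.secondaries = secondaries

            def write(self, key, value):
                self.apply(key, value)                # update the primary copy
                for replica in self.secondaries:      # eager (synchronous) propagation
                    replica.apply(key, value)
                return "committed"

        s1, s2 = Replica("s1"), Replica("s2")
        p = Primary("primary", [s1, s2])
        p.write("account:42", 100)
        print(s1.data, s2.data)   # both copies hold the same value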

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Hypermedia Genes:An Evolutionary Perspective on Concepts, Models, and Architectures

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The design space of information services evolved from seminal works through a set of prototypical hypermedia systems and matured in open and widely accessible web-based systems. The original concepts of hypermedia systems are now expressed in different forms and shapes. The first works on hypertext invented the term itself, laid out the foundational concept of association or link, and highlighted navigation as the core paradigm for future information systems. The first engineered systems demonstrated architectural requirements and models and fostered the emergence of the conceptual models related to information systems and information design. The artifacts for interaction, navigation, and search grew from the pioneering systems. Multimedia added a new dimension to hypertext, and mutated the term into hypermedia. The adaptation of the primitive models and mechanisms to the space of continuous media led to a further conceptual level and to the reinvention of information design methods. Hypermedia systems also became an ideal space for collaboration and cooperative work. Information access and sharing, and group work were enabled and empowered by distributed hypermedia systems. As with many technologies, a winning technical paradigm, in our case the World Wide Web, concentrated the design options, the architectural choices and the interaction and navigation styles. Since the late nineties, the Web became the standard framework for hypermedia systems, and integrated a large number of the initial concepts and techniques. Yet, other paths are still open. This lecture maps a simple "genome" of hypermedia systems, based on an initial survey of primitive systems that established architectural and functional characteristics, or traits. These are analyzed and consolidated using phylogenetic analysis tools, to infer families of systems and evolution opportunities. This method may prove to be inspiring for more systematic perspectives of technological landscapes. Table of Contents: Introduction / Original Visions and Concepts / Steps in the Evolution / Information and Structured Documents / Web-Based Environments / Some Research Trends / A Framework of Traits / A Phylogenetic Analysis / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Publishing and Using Cultural Heritage Linked Data on the Semantic Web

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Cultural Heritage (CH) data is syntactically and semantically heterogeneous, multilingual, semantically rich, and highly interlinked. It is produced in a distributed, open fashion by museums, libraries, archives, and media organizations, as well as individual persons. Managing publication of such richness and variety of content on the Web, and at the same time supporting distributed, interoperable content creation processes, poses challenges where traditional publication approaches need to be re-thought. Application of the principles and technologies of Linked Data and the Semantic Web is a new, promising approach to address these problems. This development is leading to the creation of large national and international CH portals, such as Europeana, to large open data repositories, such as the Linked Open Data Cloud, and massive publications of linked library data in the U.S., Europe, and Asia. Cultural Heritage has become one of the most successful application domains of Linked Data and Semantic Web technologies. This book gives an overview of why, when, and how Linked (Open) Data and Semantic Web technologies can be employed in practice in publishing CH collections and other content on the Web. The text first motivates and presents a general semantic portal model and publishing framework as a solution approach to distributed semantic content creation, based on an ontology infrastructure. On the Semantic Web, such an infrastructure includes shared metadata models, ontologies, and logical reasoning, and is supported by shared ontology and other Web services easing the use of the new technology and linked data in legacy cataloging systems. The goal of all this is to provide lay users and researchers with new, more intelligent and usable Web applications that can be utilized by other Web applications, too, via well-defined Application Programming Interfaces (APIs). At the same time, it is possible to provide publishing organizations with more cost-efficient solutions for content creation and publication. This book is targeted at computer scientists, museum curators, librarians, archivists, and other CH professionals interested in Linked Data and CH applications on the Semantic Web. The text is focused on practice and applications, making it suitable for students, researchers, and practitioners developing Web services and applications of CH, as well as for CH managers willing to understand the technical issues and challenges involved in linked data publication. Table of Contents: Cultural Heritage on the Semantic Web / Portal Model for Collaborative CH Publishing / Requirements for Publishing Linked Data / Metadata Schemas / Domain Vocabularies and Ontologies / Logic Rules for Cultural Heritage / Cultural Content Creation / Semantic Services for Human and Machine Users / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Linguistic Structure Prediction

    Copyright Year: 2011

    Morgan and Claypool eBooks

    A major part of natural language processing now depends on the use of text data to build linguistic analyzers. We consider statistical, computational approaches to modeling linguistic structure. We seek to unify across many approaches and many kinds of linguistic structures. Assuming a basic understanding of natural language processing and/or machine learning, we seek to bridge the gap between the two fields. Approaches to decoding (i.e., carrying out linguistic structure prediction) and supervised and unsupervised learning of models that predict discrete structures as outputs are the focus. We also survey natural language processing problems to which these methods are being applied, and we address related topics in probabilistic inference, optimization, and experimental methodology. Table of Contents: Representations and Linguistic Data / Decoding: Making Predictions / Learning Structure from Annotated Data / Learning Structure from Incomplete Data / Beyond Decoding: Inference View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Theory Tools for Computer Graphics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Information theory (IT) tools, widely used in scientific fields such as engineering, physics, genetics, neuroscience, and many others, are also emerging as useful transversal tools in computer graphics. In this book, we present the basic concepts of IT and how they have been applied to the graphics areas of radiosity, adaptive ray-tracing, shape descriptors, viewpoint selection and saliency, scientific visualization, and geometry simplification. Some of the approaches presented, such as the viewpoint techniques, are now the state of the art in visualization. Almost all of the techniques presented in this book have been previously published in peer-reviewed conference proceedings or international journals. Here, we have stressed their common aspects and presented them in a unified way, so the reader can clearly see which problems IT tools can help solve, which specific tools to use, and how to apply them. A basic level of knowledge in computer graphics is required, but basic concepts in IT are presented. The intended audiences are both students and practitioners of the fields above and related areas in computer graphics. In addition, IT practitioners will learn about these applications. Table of Contents: Information Theory Basics / Scene Complexity and Refinement Criteria for Radiosity / Shape Descriptors / Refinement Criteria for Ray-Tracing / Viewpoint Selection and Mesh Saliency / View Selection in Scientific Visualization / Viewpoint-based Geometry Simplification View full abstract»
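    To give a concrete flavor of the basic quantity these tools build on, the snippet below computes the Shannon entropy of a discrete distribution, for instance a normalized luminance histogram. It is a generic illustration written for this summary, not code from the book; the example histogram is invented.

        # Shannon entropy of a discrete distribution: H(p) = -sum_i p_i log2 p_i.
        # Generic illustration; in graphics the p_i might come from a normalized
        # luminance histogram or from visibility estimates in a radiosity setting.
        import math

        def shannon_entropy(probabilities):
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        histogram = [8, 4, 2, 2]                  # e.g., counts of pixel intensities
        total = sum(histogram)
        p = [c / total for c in histogram]
        print(shannon_entropy(p))                 # 1.75 bits for this distribution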

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Domain-Sensitive Temporal Tagging

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book covers the topic of temporal tagging, the detection of temporal expressions and the normalization of their semantics to some standard format. It places a special focus on the challenges and opportunities of domain-sensitive temporal tagging. After providing background knowledge on the concept of time, the book continues with a comprehensive survey of current research on temporal tagging. The authors provide an overview of existing techniques and tools, and highlight key issues that need to be addressed. This book is a valuable resource for researchers and application developers who need to become familiar with the topic and want to know the recent trends, current tools and techniques, as well as different application domains in which temporal information is of utmost importance. Due to the prevalence of temporal expressions in diverse types of documents and the importance of temporal information in any information space, temporal tagging is an important task in natural language processing (NLP), and applications of several domains can benefit from the output of temporal taggers to provide more meaningful and useful results. In recent years, temporal tagging has been an active field in NLP and computational linguistics. Several approaches to temporal tagging have been proposed, annotation standards have been developed, gold standard data sets have been created, and research competitions have been organized. Furthermore, some temporal taggers have also been made publicly available so that temporal tagging output is not just exploited in research, but is finding its way into real world applications. In addition, this book particularly focuses on domain-specific temporal tagging of documents. This is a crucial aspect as different types of documents (e.g., news articles, narratives, and colloquial texts) result in diverse challenges for temporal taggers and should be processed in a domain-sensitive manner. View full abstract»
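    As a deliberately tiny, hypothetical illustration of the two subtasks named above (detection of a temporal expression and normalization of its semantics to a standard value), the sketch below recognizes one simple date pattern and maps it to an ISO-like value; real temporal taggers of the kind surveyed in the book handle far richer phenomena. All names and the pattern itself are invented for the example.

        # Toy temporal tagger: detects one date pattern and normalizes it to an
        # ISO-like YYYY-MM-DD value. Purely illustrative; not from the book.
        import re

        MONTHS = {"january": 1, "february": 2, "march": 3, "april": 4, "may": 5,
                  "june": 6, "july": 7, "august": 8, "september": 9,
                  "october": 10, "november": 11, "december": 12}

        def tag_temporal(text):
            pattern = re.compile(r"\b(" + "|".join(MONTHS) + r")\s+(\d{1,2}),\s+(\d{4})\b",
                                 re.IGNORECASE)
            results = []
            for match in pattern.finditer(text):
                month, day, year = match.groups()
                value = f"{int(year):04d}-{MONTHS[month.lower()]:02d}-{int(day):02d}"
                results.append((match.group(0), value))   # (surface form, normalized value)
            return results

        print(tag_temporal("The workshop was held on March 5, 2016 in Berlin."))
        # [('March 5, 2016', '2016-03-05')]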

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Remote Sensing Image Processing

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Earth observation is the field of science concerned with the problem of monitoring and modeling the processes on the Earth's surface and their interaction with the atmosphere. The Earth is continuously monitored with advanced optical and radar sensors. The images are analyzed and processed to deliver useful products to individual users, agencies and public administrations. To deal with these problems, remote sensing image processing is nowadays a mature research area, and the techniques developed in the field allow many real-life applications with great societal value. For instance, urban monitoring, fire detection or flood prediction can have a great impact on economical and environmental issues. To attain such objectives, remote sensing has turned into a multidisciplinary field of science that embraces physics, signal theory, computer science, electronics and communications. From a machine learning and signal/image processing point of view, all the applications are tackled under specific formalisms, such as classification and clustering, regression and function approximation, data coding, restoration and enhancement, source unmixing, data fusion or feature selection and extraction. This book covers some of the fields in a comprehensive way. Table of Contents: Remote Sensing from Earth Observation Satellites / The Statistics of Remote Sensing Images / Remote Sensing Feature Selection and Extraction / Classification / Spectral Mixture Analysis / Estimation of Physical Parameters View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Community Detection and Mining in Social Media

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The past decade has witnessed the emergence of participatory Web and social media, bringing people together in many creative ways. Millions of users are playing, tagging, working, and socializing online, demonstrating new forms of collaboration, communication, and intelligence that were hardly imaginable just a short time ago. Social media also helps reshape business models, sway opinions and emotions, and open up numerous possibilities to study human interaction and collective behavior at an unparalleled scale. This lecture, from a data mining perspective, introduces characteristics of social media, reviews representative tasks of computing with social media, and illustrates associated challenges. It introduces basic concepts, presents state-of-the-art algorithms with easy-to-understand examples, and recommends effective evaluation methods. In particular, we discuss graph-based community detection techniques and many important extensions that handle dynamic, heterogeneous networks in social media. We also demonstrate how discovered patterns of communities can be used for social media mining. The concepts, algorithms, and methods presented in this lecture can help harness the power of social media and support building socially-intelligent systems. This book is an accessible introduction to the study of community detection and mining in social media. It is essential reading for students, researchers, and practitioners in disciplines and applications where social media is a key source of data that piques our curiosity to understand, manage, innovate, and excel. This book is supported by additional materials, including lecture slides, the complete set of figures, key references, some toy data sets used in the book, and the source code of representative algorithms. The readers are encouraged to visit the book website for the latest information. Table of Contents: Social Media and Social Computing / Nodes, Ties, and Influence / Community Detection and Evaluation / Communities in Heterogeneous Networks / Social Media Mining View full abstract»
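    As a small, generic illustration of graph-based community detection (not an algorithm lifted from this lecture), the sketch below runs synchronous label propagation on a toy adjacency list: every node repeatedly adopts the most frequent label among its neighbors until labels stabilize, and densely connected groups end up sharing a label. The toy graph, the deterministic tie-break, and all names are invented for the example; practical implementations usually randomize update order and tie-breaking.

        # Toy synchronous label propagation on an undirected graph (illustrative only).
        from collections import Counter

        def label_propagation(adjacency, max_iters=50):
            labels = {node: node for node in adjacency}     # every node starts alone
            for _ in range(max_iters):
                new_labels = {}
                for node, neighbors in adjacency.items():
                    counts = Counter(labels[n] for n in neighbors)
                    if not counts:
                        new_labels[node] = labels[node]
                        continue
                    top = max(counts.values())
                    # deterministic tie-break for reproducibility of the toy example
                    new_labels[node] = min(l for l, c in counts.items() if c == top)
                if new_labels == labels:
                    break
                labels = new_labels
            return labels

        # Two dense groups, {a, b, c} and {d, e, f}, joined by the single edge c-d.
        graph = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
                 "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"]}
        print(label_propagation(graph))   # a, b, c share one label; d, e, f share another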

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Object Databases:Object-Oriented and Object-Relational Design

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Object-oriented databases were originally developed as an alternative to relational database technology for the representation, storage, and access of non-traditional data forms that were increasingly found in advanced applications of database technology. After much debate regarding object-oriented versus relational database technology, object-oriented extensions were eventually incorporated into relational technology to create object-relational databases. Both object-oriented databases and object-relational databases, collectively known as object databases, provide inherent support for object features, such as object identity, classes, inheritance hierarchies, and associations between classes using object references. This monograph presents the fundamentals of object databases, with a specific focus on conceptual modeling of object database designs. After an introduction to the fundamental concepts of object-oriented data, the monograph provides a review of object-oriented conceptual modeling techniques using side-by-side Enhanced Entity Relationship diagrams and Unified Modeling Language conceptual class diagrams that feature class hierarchies with specialization constraints and object associations. These object-oriented conceptual models provide the basis for introducing case studies that illustrate the use of object features within the design of object-oriented and object-relational databases. For the object-oriented database perspective, the Object Data Management Group data definition language provides a portable, language-independent specification of an object schema, together with an SQL-like object query language. LINQ (Language INtegrated Query) is presented as a case study of an object query language together with its use in the db4o open-source object-oriented database. For the object-relational perspective, the object-relational features of the SQL standard are presented together with an accompanying case study of the object-relational features of Oracle. For completeness of coverage, an appendix provides a mapping of object-oriented conceptual designs to the relational model and its associated constraints. Table of Contents: List of Figures / List of Tables / Introduction to Object Databases / Object-Oriented Databases / Object-Relational Databases View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Business Processes:A Database Perspective

    Copyright Year: 2012

    Morgan and Claypool eBooks

    While classic data management focuses on the data itself, research on Business Processes also considers the context in which this data is generated and manipulated, namely the processes, users, and goals that this data serves. This provides analysts with a better perspective of the organizational needs centered around the data. As such, this research is of fundamental importance. Much of the success of database systems in the last decade is due to the beauty and elegance of the relational model and its declarative query languages, combined with a rich spectrum of underlying evaluation and optimization techniques, and efficient implementations. Much like the case for traditional database research, elegant modeling and rich underlying technology are likely to be highly beneficial for the Business Process owners and their users; both can benefit from easy formulation and analysis of the processes. While there have been many important advances in this research in recent years, there is still much to be desired: specifically, there have been many works that focus on the process's behavior (flow), and many that focus on its data, but only very few works have dealt with both. This book describes the state-of-the-art in a database approach to Business Process modeling and analysis, the progress towards a holistic flow-and-data framework for these tasks, and highlights the current gaps and research directions. Table of Contents: Introduction / Modeling / Querying Business Processes / Other Issues / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Learning Programming Using Matlab

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book is intended for anyone trying to learn the fundamentals of computer programming. The chapters lead the reader through the various steps required for writing a program, introducing the MATLAB® constructs in the process. MATLAB® is used to teach programming because it has a simple programming environment. It has a low initial overhead which allows the novice programmer to begin programming immediately and allows the users to easily debug their programs. This is especially useful for people who have a “mental block” about computers. Although MATLAB® is a high-level language and interactive environment that enables the user to perform computationally intensive tasks faster than with traditional programming languages such as C, C++, and Fortran, the author shows that it can also be used as a programming learning tool for novices. There are a number of exercises at the end of each chapter which should help users become comfortable with the language. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Theory of Timed I/O Automata, Second Edition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This monograph presents the Timed Input/Output Automaton (TIOA) modeling framework, a basic mathematical framework to support description and analysis of timed (computing) systems. Timed systems are systems in which desirable correctness or performance properties of the system depend on the timing of events, not just on the order of their occurrence. Timed systems are employed in a wide range of domains including communications, embedded systems, real-time operating systems, and automated control. Many applications involving timed systems have strong safety, reliability, and predictability requirements, which make it important to have methods for systematic design of systems and rigorous analysis of timing-dependent behavior. The TIOA framework also supports description and analysis of timed distributed algorithms -- distributed algorithms whose correctness and performance depend on the relative speeds of processors, accuracy of local clocks, or communication delay bounds. Such algorithms arise, for example, in traditional and wireless communications, networks of mobile devices, and shared-memory multiprocessors. The need to prove rigorous theoretical results about timed distributed algorithms makes it important to have a suitable mathematical foundation. An important feature of the TIOA framework is its support for decomposing timed system descriptions. In particular, the framework includes a notion of external behavior for a timed I/O automaton, which captures its discrete interactions with its environment. The framework also defines what it means for one TIOA to implement another, based on an inclusion relationship between their external behavior sets, and defines notions of simulations, which provide sufficient conditions for demonstrating implementation relationships. The framework includes a composition operation for TIOAs, which respects external behavior, and a notion of receptiveness, which implies that a TIOA does not block the passage of time. The TIOA framework also defines the notion of a property and what it means for a property to be a safety or a liveness property. It includes results that capture common proof methods for showing that automata satisfy properties. Table of Contents: Introduction / Mathematical Preliminaries / Describing Timed System Behavior / Timed Automata / Operations on Timed Automata / Properties for Timed Automata / Timed I/O Automata / Operations on Timed I/O Automata / Conclusions and Future Work View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Understanding User-Web Interactions via Web Analytics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing, in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empirical method. These foundational elements are illuminated further through a brief history of Web analytics from the original transaction log studies in the 1960s through the information science investigations of library systems to the focus on Websites, systems, and applications. Following a discussion of on-going interaction data within the clickstream created using log files and page tagging for analytics of Website and search logs, the lecture then presents a Web analytics process to convert these basic data into meaningful key performance indicators, tailored to organizational goals or potential opportunities, in order to measure likely conversions. Supplementary data collection techniques are addressed, including surveys and laboratory studies. The overall goal of this lecture is to provide implementable information and a methodology for understanding Web analytics in order to improve Web systems, increase customer satisfaction, and target revenue through effective analysis of user–Website interactions. Table of Contents: Understanding Web Analytics / The Foundations of Web Analytics: Theory and Methods / The History of Web Analytics / Data Collection for Web Analytics / Web Analytics Fundamentals / Web Analytics Strategy / Web Analytics as Competitive Intelligence / Supplementary Methods for Augmenting Web Analytics / Search Log Analytics / Conclusion / Key Terms / Blogs for Further Reading / References View full abstract»
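    As a minimal, hypothetical example of turning raw clickstream records into one key performance indicator, the snippet below computes a conversion rate from a tiny in-memory log. The field names, the "/checkout/complete" goal page, and the toy records are all invented for illustration and are not drawn from the lecture.

        # Toy clickstream -> KPI computation (illustrative only).
        # Each record is (session_id, page); a session "converts" if it ever
        # reaches the hypothetical goal page "/checkout/complete".
        from collections import defaultdict

        clicks = [
            ("s1", "/home"), ("s1", "/product/42"), ("s1", "/checkout/complete"),
            ("s2", "/home"), ("s2", "/search?q=lamp"),
            ("s3", "/product/7"), ("s3", "/checkout/complete"),
        ]

        pages_by_session = defaultdict(list)
        for session, page in clicks:
            pages_by_session[session].append(page)

        sessions = len(pages_by_session)
        conversions = sum("/checkout/complete" in pages
                          for pages in pages_by_session.values())
        print(f"sessions={sessions} conversions={conversions} "
              f"conversion_rate={conversions / sessions:.1%}")   # 66.7% for this toy log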

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Space-Time Computing with Temporal Neural Networks

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, its energy efficiency is truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended to give both background and lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author.

    As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In this paradigm, time is used as a freely available resource for both communication and computation.

    Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher-level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.

    View full abstract»
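    As a generic, simplified illustration of a spiking neuron, the sketch below implements a leaky integrate-and-fire model: it is not the specific neuron model developed in this book, and all parameter values are invented. The neuron integrates weighted input spikes over discrete time steps, its potential leaks between spikes, and it fires when the potential crosses a threshold, which is why the relative timing of inputs matters.

        # Leaky integrate-and-fire neuron sketch (generic illustration only; the
        # book develops its own, more detailed spiking-neuron model).
        def lif_neuron(input_spike_times, weight=1.0, leak=0.9, threshold=2.5, steps=20):
            """Return the time steps at which the neuron emits an output spike."""
            potential, out_spikes = 0.0, []
            for t in range(steps):
                potential *= leak                  # membrane potential decays each step
                if t in input_spike_times:
                    potential += weight            # an input spike adds its synaptic weight
                if potential >= threshold:
                    out_spikes.append(t)           # fire ...
                    potential = 0.0                # ... and reset
            return out_spikes

        # Closely spaced input spikes accumulate and cause a firing; widely spaced
        # spikes leak away before the threshold is ever reached.
        print(lif_neuron({1, 2, 3, 4}))   # [3]
        print(lif_neuron({1, 10, 19}))    # []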

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Physical-Layer Network Coding

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The concept of physical-layer network coding (PNC) was proposed in 2006 for application in wireless networks. Since then it has developed into a subfield of communications and networking with a wide following. This book is a primer on PNC. It is the outcome of a set of lecture notes for a course for beginning graduate students at The Chinese University of Hong Kong. The target audience is expected to have some prior background knowledge in communication theory and wireless communications, but not working knowledge at the research level. Indeed, a goal of this book/course is to allow the reader to gain a deeper appreciation of the various nuances of wireless communications and networking by focusing on problems arising from the study of PNC. Specifically, we introduce the tools and techniques needed to solve problems in PNC, and many of these tools and techniques are drawn from the more general disciplines of signal processing, communications, and networking: PNC is used as a pivot to learn about the fundamentals of signal processing techniques and wireless communications in general. We feel that such a problem-centric approach will give the reader a more in-depth understanding of these disciplines and allow him/her to see first-hand how the techniques of these disciplines can be applied to solve real research problems. As a primer, this book does not cover many advanced materials related to PNC. PNC is an active research field and many new results will no doubt be forthcoming in the near future. We believe that this book will provide a good contextual framework for the interpretation of these advanced results should the reader decide to probe further into the field of PNC. View full abstract»
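    To convey the core idea in miniature, the following sketch (written for this summary, not taken from the book) shows the two-way relay exchange that motivates PNC: end nodes A and B transmit simultaneously, the relay maps the superimposed signal directly to the XOR of their bits and broadcasts it, and each end node recovers the other's bit by XORing the broadcast with its own bit. Real PNC operates on noisy analog waveforms; here noiseless BPSK symbols stand in for them.

        # Two-way relay channel with physical-layer network coding, in caricature.
        # Noise is ignored and BPSK symbols (+1/-1) stand in for the real waveforms.
        def bpsk(bit):
            return 1 if bit == 0 else -1

        def relay_pnc(bit_a, bit_b):
            superimposed = bpsk(bit_a) + bpsk(bit_b)      # what the relay actually receives
            return 0 if abs(superimposed) == 2 else 1     # map the sum directly to a XOR b

        for a in (0, 1):
            for b in (0, 1):
                x = relay_pnc(a, b)                       # relay broadcasts a XOR b
                assert (x ^ a) == b and (x ^ b) == a      # each end recovers the other's bit
        print("each end node recovers the other's bit from the broadcast XOR")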

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multi-Pitch Estimation

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Periodic signals can be decomposed into sets of sinusoids having frequencies that are integer multiples of a fundamental frequency. The problem of finding such fundamental frequencies from noisy observations is important in many speech and audio applications, where it is commonly referred to as pitch estimation. These applications include analysis, compression, separation, enhancement, automatic transcription and many more. In this book, an introduction to pitch estimation is given and a number of statistical methods for pitch estimation are presented. The basic signal models and associated estimation theoretical bounds are introduced, and the properties of speech and audio signals are discussed and illustrated. The presented methods include both single- and multi-pitch estimators based on statistical approaches, like maximum likelihood and maximum a posteriori methods, filtering methods based on both static and optimal adaptive designs, and subspace methods based on the principles of subspace orthogonality and shift-invariance. The application of these methods to analysis of speech and audio signals is demonstrated using both real and synthetic signals, and their performance is assessed under various conditions and their properties discussed. Finally, the estimators are compared in terms of computational and statistical efficiency, generalizability and robustness. Table of Contents: Fundamentals / Statistical Methods / Filtering Methods / Subspace Methods / Amplitude Estimation View full abstract»
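    In the notation commonly used for this problem (the symbols here are generic and not necessarily those of the book), the harmonic signal model underlying these single-pitch estimators can be written as:

        % Harmonic signal model for pitch estimation (generic notation).
        % x(n): observed signal, omega_0: fundamental (pitch) frequency in radians/sample,
        % L: number of harmonics, a_l and phi_l: amplitude and phase of the l-th harmonic,
        % e(n): observation noise.
        x(n) = \sum_{l=1}^{L} a_l \cos(l\,\omega_0 n + \phi_l) + e(n), \qquad n = 0, \ldots, N-1

    The multi-pitch case then superimposes several such harmonic sets, one per fundamental frequency.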

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Statistics is Easy! 2nd Edition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along with programs (in the language Python) to calculate them using resampling, and finally illustrates the use of the measures and programs in a case study. The text uses junior high school algebra and many examples to explain the concepts. The ideal reader has mastered at least elementary mathematics, likes to think procedurally, and is comfortable with computers. Table of Contents: The Basic Idea / Pragmatic Considerations when Using Resampling / Terminology / The Essential Stats / Case Study: New Mexico's 2004 Presidential Ballots / References / Bias Corrected Confidence Intervals / Appendix B View full abstract»
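    As a small example in the spirit of the book's Python programs (written for this summary, not copied from the book), the following computes a bootstrap confidence interval for a sample mean by resampling with replacement and reading off percentiles of the resampled means; the data values are invented.

        # Bootstrap confidence interval for the mean by resampling with replacement.
        # Written for this summary; the book supplies its own Python programs.
        import random

        def bootstrap_ci(sample, num_resamples=10_000, confidence=0.90):
            means = []
            for _ in range(num_resamples):
                resample = [random.choice(sample) for _ in sample]   # draw with replacement
                means.append(sum(resample) / len(resample))
            means.sort()
            tail = (1.0 - confidence) / 2.0
            lo = means[int(tail * num_resamples)]
            hi = means[int((1.0 - tail) * num_resamples) - 1]
            return lo, hi

        data = [12, 15, 9, 14, 11, 13, 16, 10, 12, 14]
        print("observed mean:", sum(data) / len(data))
        print("90% bootstrap CI for the mean:", bootstrap_ci(data))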

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Storing Clocked Programs Inside DNA:A Simplifying Framework for Nanocomputing

    Copyright Year: 2011

    Morgan and Claypool eBooks

    In the history of modern computation, large mechanical calculators preceded computers. A person would sit there punching keys according to a procedure and a number would eventually appear. Once calculators became fast enough, it became obvious that the critical path was the punching rather than the calculation itself. That is what made the stored program concept vital to further progress. Once the instructions were stored in the machine, the entire computation could run at the speed of the machine. This book shows how to do the same thing for DNA computing. Rather than asking a robot or a person to pour in specific strands at different times in order to cause a DNA computation to occur (by analogy to a person punching numbers and operations into a mechanical calculator), the DNA instructions are stored within the solution and guide the entire computation. We show how to store straight line programs, conditionals, loops, and a rudimentary form of subroutines. To achieve this goal, the book proposes a complete language for describing the intrinsic topology of DNA complexes and nanomachines, along with the dynamics of such a system. We then describe dynamic behavior using a set of basic transitions, which operate on a small neighborhood within a complex in a well-defined way. These transitions can be formalized as purely syntactical functions of the string representations. Building on that foundation, the book proposes a novel machine motif which constitutes an instruction stack, allowing for the clocked release of an arbitrary sequence of DNA instruction or data strands. The clock mechanism is built of special strands of DNA called "tick" and "tock." Each time a "tick" and "tock" enter a DNA solution, a strand is released from an instruction stack (by analogy to the way in which a clock cycle in an electronic computer causes a new instruction to enter a processing unit). As long as there remain strands on the stack, the next cycle will release a new instruction strand. Regardless of the actual strand or component to be released at any particular clock step, the "tick" and "tock" fuel strands remain the same, thus shifting the burden of work away from the end user of a machine and easing operation. Pre-loaded stacks enable the concept of a stored program to be realized as a physical DNA mechanism. A conceptual example is given of such a stack operating a walker device. The stack allows for a user to operate such a clocked walker by means of simple repetition of adding two fuel types, in contrast to the previous mechanism of adding a unique fuel -- at least 12 different types of strands -- for each step of the mechanism. We demonstrate by a series of experiments conducted in Ned Seeman's lab that it is possible to "initialize" a clocked stored program DNA machine. We end the book with a discussion of the design features of a programming language for clocked DNA programming. There is a lot left to do. Table of Contents: Introduction / Notation / Topological Description of DNA Computing / Machines and Motifs / Experiment: Storing Clocked Programs in DNA / A Clocked DNA Programming Language View full abstract»
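    As an abstract, software-level caricature of the clocking idea described above (it models only the bookkeeping, not the actual strand-displacement chemistry, and all names are invented), the sketch below releases one pre-loaded instruction per completed tick/tock pair, so the operator only ever adds the same two fuel inputs.

        # Software caricature of a clocked DNA instruction stack: the same two fuel
        # inputs ("tick" then "tock") release the next pre-loaded instruction strand.
        class InstructionStack:
            def __init__(self, strands):
                self.pending = list(strands)       # pre-loaded program strands, in release order
                self.saw_tick = False

            def add_fuel(self, fuel):
                """Feed 'tick' then 'tock'; each completed pair releases one strand."""
                if fuel == "tick":
                    self.saw_tick = True
                    return None
                if fuel == "tock" and self.saw_tick and self.pending:
                    self.saw_tick = False
                    return self.pending.pop(0)     # release the next instruction strand
                return None

        stack = InstructionStack(["step-right", "bind-cargo", "step-left"])
        for _ in range(3):
            stack.add_fuel("tick")
            print(stack.add_fuel("tock"))          # step-right, bind-cargo, step-left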

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Arduino Microcontroller Processing for Everyone:Part I

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about the Arduino microcontroller and the Arduino concept. The visionary Arduino team of Massimo Banzi, David Cuartielles, Tom Igoe, Gianluca Martino, and David Mellis launched a new innovation in microcontroller hardware in 2005, the concept of open source hardware. Their approach was to openly share details of microcontroller-based hardware design platforms to stimulate the sharing of ideas and promote innovation. This concept has been popular in the software world for many years. This book is intended for a wide variety of audiences including students of the fine arts, middle and senior high school students, engineering design students, and practicing scientists and engineers. To meet this wide audience, the book has been divided into sections to satisfy the need of each reader. The book contains many software and hardware examples to assist the reader in developing a wide variety of systems. For the examples, the Arduino Duemilanove, based on the Atmel ATmega328, is employed as the target processor. Table of Contents: Getting Started / Programming / Embedded Systems Design / Serial Communication Subsystem / Analog to Digital Conversion (ADC) / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Spoken Dialogue Systems

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Considerable progress has been made in recent years in the development of dialogue systems that support robust and efficient human-machine interaction using spoken language. Spoken dialogue technology allows various interactive applications to be built and used for practical purposes, and research focuses on issues that aim to increase the system's communicative competence by including aspects of error correction, cooperation, multimodality, and adaptation in context. This book gives a comprehensive view of state-of-the-art techniques that are used to build spoken dialogue systems. It provides an overview of the basic issues such as system architectures, various dialogue management methods, system evaluation, and also surveys advanced topics concerning extensions of the basic model to more conversational setups. The goal of the book is to provide an introduction to the methods, problems, and solutions that are used in dialogue system development and evaluation. It presents dialogue modelling and system development issues relevant in both academic and industrial environments and also discusses requirements and challenges for advanced interaction management and future research. Table of Contents: Preface / Introduction to Spoken Dialogue Systems / Dialogue Management / Error Handling / Case Studies: Advanced Approaches to Dialogue Management / Advanced Issues / Methodologies and Practices of Evaluation / Future Directions / References / Author Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Ontology-Based Interpretation of Natural Language

    Copyright Year: 2014

    Morgan and Claypool eBooks

    For humans, understanding a natural language sentence or discourse is so effortless that we hardly ever think about it. For machines, however, the task of interpreting natural language, especially grasping meaning beyond the literal content, has proven extremely difficult and requires a large amount of background knowledge. This book focuses on the interpretation of natural language with respect to specific domain knowledge captured in ontologies. The main contribution is an approach that puts ontologies at the center of the interpretation process. This means that ontologies not only provide a formalization of domain knowledge necessary for interpretation but also support and guide the construction of meaning representations. We start with an introduction to ontologies and demonstrate how linguistic information can be attached to them by means of the ontology lexicon model lemon. These lexica then serve as the basis for the automatic generation of grammars, which we use to compositionally construct meaning representations that conform to the vocabulary of an underlying ontology. As a result, the level of representational granularity is not driven by language but by the semantic distinctions made in the underlying ontology and thus by distinctions that are relevant in the context of a particular domain. We highlight some of the challenges involved in the construction of ontology-based meaning representations, and show how ontologies can be exploited for ambiguity resolution and the interpretation of temporal expressions. Finally, we present a question answering system that combines all tools and techniques introduced throughout the book in a real-world application, and sketch how the presented approach can scale to larger, multi-domain scenarios in the context of the Semantic Web. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Representation Discovery using Harmonic Analysis

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Representations are at the heart of artificial intelligence (AI). This book is devoted to the problem of representation discovery: how can an intelligent system construct representations from its experience? Representation discovery re-parameterizes the state space - prior to the application of information retrieval, machine learning, or optimization techniques - facilitating later inference processes by constructing new task-specific bases adapted to the state space geometry. This book presents a general approach to representation discovery using the framework of harmonic analysis, in particular Fourier and wavelet analysis. Biometric compression methods, the compact disc, the computerized axial tomography (CAT) scanner in medicine, JPEG compression, and spectral analysis of time-series data are among the many applications of classical Fourier and wavelet analysis. A central goal of this book is to show that these analytical tools can be generalized from their usual setting in (infinite-dimensional) Euclidean spaces to discrete (finite-dimensional) spaces typically studied in many subfields of AI. Generalizing harmonic analysis to discrete spaces poses many challenges: a discrete representation of the space must be adaptively acquired; basis functions are not pre-defined, but rather must be constructed. Algorithms for efficiently computing and representing bases require dealing with the curse of dimensionality. However, the benefits can outweigh the costs, since the extracted basis functions outperform parametric bases as they often reflect the irregular shape of a particular state space. Case studies from computer graphics, information retrieval, machine learning, and state space planning are used to illustrate the benefits of the proposed framework, and the challenges that remain to be addressed. Representation discovery is an actively developing field, and the author hopes this book will encourage other researchers to explore this exciting area of research. Table of Contents: Overview / Vector Spaces / Fourier Bases on Graphs / Multiscale Bases on Graphs / Scaling to Large Spaces / Case Study: State-Space Planning / Case Study: Computer Graphics / Case Study: Natural Language / Future Directions View full abstract»
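    As a concrete, minimal illustration of what "Fourier bases on graphs" means (a generic sketch written for this summary, not code from the book), the snippet below builds the combinatorial Laplacian of a small path graph and computes its eigenvectors, which play the role of Fourier basis functions on that discrete space; eigenvectors with larger eigenvalues oscillate faster over the graph.

        # Graph Fourier basis: eigenvectors of the combinatorial Laplacian L = D - A.
        # Generic illustration for a 5-node path graph; not code from the book.
        import numpy as np

        A = np.zeros((5, 5))
        for i in range(4):                     # path graph 0-1-2-3-4
            A[i, i + 1] = A[i + 1, i] = 1

        D = np.diag(A.sum(axis=1))             # degree matrix
        L = D - A                              # combinatorial graph Laplacian

        eigenvalues, eigenvectors = np.linalg.eigh(L)
        print(np.round(eigenvalues, 3))        # 0 first; larger values mean higher "frequency"
        print(np.round(eigenvectors[:, 1], 3)) # smoothest nonconstant basis function on the path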

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Natural Language Processing for the Semantic Web

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book introduces core natural language processing (NLP) technologies to non-experts in an easily accessible way, as a series of building blocks that lead the user to understand key technologies, why they are required, and how to integrate them into Semantic Web applications. Natural language processing and Semantic Web technologies have different, but complementary roles in data management. Combining these two technologies enables structured and unstructured data to merge seamlessly. Semantic Web technologies aim to convert unstructured data to meaningful representations, which benefit enormously from the use of NLP technologies, thereby enabling applications such as connecting text to Linked Open Data, connecting texts to each other, semantic searching, information visualization, and modeling of user behavior in online networks. The first half of this book describes the basic NLP processing tools: tokenization, part-of-speech tagging, and morphological analysis, in addition to the main tools required for an information extraction system (named entity recognition and relation extraction) which build on these components. The second half of the book explains how Semantic Web and NLP technologies can enhance each other, for example via semantic annotation, ontology linking, and population. These chapters also discuss sentiment analysis, a key component in making sense of textual data, and the difficulties of performing NLP on social media, as well as some proposed solutions. The book finishes by investigating some applications of these tools, focusing on semantic search and visualization, modeling user behavior, and an outlook on the future. View full abstract»
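    As a deliberately tiny illustration of the building-block view taken here (entity recognition followed by linking mentions to Linked Open Data), the sketch below uses a hand-made gazetteer and example DBpedia-style URIs; it merely stands in for the much richer components the book describes, and the data and function names are invented.

        # Toy pipeline: gazetteer-based entity recognition and linking of mentions
        # to (example) Linked Open Data URIs. Illustrative only; not from the book.
        GAZETTEER = {
            "Berlin": "http://dbpedia.org/resource/Berlin",
            "Ada Lovelace": "http://dbpedia.org/resource/Ada_Lovelace",
        }

        def annotate(text):
            annotations = []
            for mention, uri in GAZETTEER.items():
                start = text.find(mention)
                if start != -1:
                    annotations.append({"mention": mention, "start": start,
                                        "end": start + len(mention), "uri": uri})
            return annotations

        sentence = "Ada Lovelace never visited Berlin."
        for a in annotate(sentence):
            print(a)    # mention, character offsets, and linked URI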

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Control Grid Motion Estimation for Efficient Application of Optical Flow

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Motion estimation is a long-standing cornerstone of image and video processing. Most notably, motion estimation serves as the foundation for many of today's ubiquitous video coding standards including H.264. Motion estimators also play key roles in countless other applications that serve the consumer, industrial, biomedical, and military sectors. Of the many available motion estimation techniques, optical flow is widely regarded as most flexible. The flexibility offered by optical flow is particularly useful for complex registration and interpolation problems, but comes at a considerable computational expense. As the volume and dimensionality of data that motion estimators are applied to continue to grow, that expense becomes more and more costly. Control grid motion estimators based on optical flow can accomplish motion estimation with flexibility similar to pure optical flow, but at a fraction of the computational expense. Control grid methods also offer the added benefit of representing motion far more compactly than pure optical flow. This booklet explores control grid motion estimation and provides implementations of the approach that apply to data of multiple dimensionalities. Important current applications of control grid methods including registration and interpolation are also developed. Table of Contents: Introduction / Control Grid Interpolation (CGI) / Application of CGI to Registration Problems / Application of CGI to Interpolation Problems / Discussion and Conclusions View full abstract»
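    For readers unfamiliar with optical flow, the standard brightness-constancy constraint that underlies these estimators can be written as follows (generic notation, not necessarily that of the booklet); control grid methods reduce cost by estimating the motion field only at grid nodes and interpolating it elsewhere.

        % Brightness-constancy (optical flow) constraint, generic notation:
        % I(x, y, t) is image intensity; (u, v) is the motion at a pixel;
        % I_x, I_y, I_t are the spatial and temporal partial derivatives of I.
        I_x\,u + I_y\,v + I_t = 0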

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamental Biomechanics in Bone Tissue Engineering

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This eight-chapter monograph intends to present basic principles and applications of biomechanics in bone tissue engineering in order to assist tissue engineers in design and use of tissue-engineered products for repair and replacement of damaged/deformed bone tissues. Briefly, Chapter 1 gives an overall review of biomechanics in the field of bone tissue engineering. Chapter 2 provides detailed information regarding the composition and architecture of bone. Chapter 3 discusses the current methodologies for mechanical testing of bone properties (i.e., elastic, plastic, damage/fracture, viscoelastic/viscoplastic properties). Chapter 4 presents the current understanding of the mechanical behavior of bone and the associated underlying mechanisms. Chapter 5 discusses the structure and properties of scaffolds currently used for bone tissue engineering applications. Chapter 6 gives a brief discussion of current mechanical and structural tests of repair/tissue engineered bone tissues. Chapter 7 summarizes the properties of repair/tissue engineered bone tissues currently attained. Finally, Chapter 8 discusses the current issues regarding biomechanics in the area of bone tissue engineering. Table of Contents: Introduction / Bone Composition and Structure / Current Mechanical Test Methodologies / Mechanical Behavior of Bone / Structure and Properties of Scaffolds for Bone Tissue Regeneration / Mechanical and Structural Evaluation of Repair/Tissue Engineered Bone / Mechanical and Structural Properties of Tissues Engineered/Repair Bone / Current Issues of Biomechanics in Bone Tissue Engineering View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sustaining Moore’s Law:Uncertainty Leading to a Certainty of IoT Revolution

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In 1965, Intel co-founder Gordon Moore, in "Cramming more components onto Integrated Circuits" in Electronics Magazine (April 19, 1965), made the observation that, in the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years. Since its inception in 1965 until recent times, this law has been used in the semiconductor industry to guide investments for long-term planning as well as to set targets for research and development. These investments have helped in a productive utilization of wealth, which created more employment opportunities for semiconductor industry professionals. In this way, the development of Moore's Law has helped sustain the progress of today's knowledge-based economy. While Moore's Law has, on one hand, helped drive investments toward technological and economic growth, thereby benefiting the consumers with more powerful electronic gadgets, Moore's Law has indirectly also helped to fuel other innovations in the global economy. However, the Law of diminishing returns is now questioning the sustainability of further evolution of Moore's Law and its ability to sustain the progress of today's knowledge-based economy. The lack of liquidity in the global economy is truly bringing the entire industry to a standstill and the dark clouds of an economic depression are hovering over the global economy. What factors have been ignored by the global semiconductor industry leading to a demise of Moore's Law? Do the existing business models prevalent in the semiconductor industry pose any problems? Have supply chains made that progress unsustainable? In today's globalized world, have businesses been able to sustain national interests while driving the progress of Moore's Law? Could the semiconductor industry help the entire global economy move toward a radiance of the new crimson dawn, beyond the veil of the darkest night by sustaining the progress of Moore's Law? The entire semiconductor industry is now clamoring for a fresh approach to overcome existing barriers to the progress of Moore's Law, and this book delivers just that. Moore's Law can easily continue for the foreseeable future if the chip manufacturing industry becomes sustainable by having a balanced economy. The sustainable progress of Moore's Law advocates the "heresy" of transforming the current economic orthodoxy of monopoly capitalism into free-market capitalism. The next big thing that everybody is looking forward to after the mobile revolution is the "Internet of Things" (IoT) revolution. While some analysts forecast that the IoT market would achieve 5.4 billion connections worldwide by 2020, the poor consumer purchasing power in the global economy makes this forecast truly questionable. Sustaining Moore's Law presents a blueprint for sustaining the progress of Moore's Law to bring about the IoT Revolution in the global economy. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mobile User Research:A Practical Guide

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This book will give you a practical overview of several methods and approaches for designing mobile technologies and conducting mobile user research, including how to understand behavior and evaluate how such technologies are being (or may be) used out in the world. Each chapter includes case studies from our own work and highlights advantages, limitations, and very practical steps that should be taken to increase the validity of the studies you conduct and the data you collect. This book is intended as a practical guide for conducting mobile research focused on the user and their experience. We hope that the depth and breadth of case studies presented, as well as specific best practices, will help you to design the best technologies possible and choose appropriate methods to gather ethical, reliable, and generalizable data to explore the use of mobile technologies out in the world. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Surface Computing and Collaborative Analysis Work

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The primary topic of this book is large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personnel face in securing networks from attackers, and intelligence analysts encounter when analyzing intelligence data. Both of these activities are becoming increasingly collaborative endeavors, and there are huge opportunities for improving collaboration by leveraging surface computing. This work highlights for interaction designers and software developers the particular challenges and opportunities presented by interaction with surfaces. We have reviewed hundreds of recent research papers, and report on advancements in the fields of surface-enabled collaborative analytic work, interactive techniques for surface technologies, and useful theory that can provide direction to interaction design work. We also offer insight into issues that arise when developing applications for multi-touch surfaces derived from our own experiences creating collaborative applications. We present these insights at a level appropriate for all members of the software design and development team. Table of Contents: List of Figures / Acknowledgments / Figure Credits / Purpose and Direction / Surface Technologies and Collaborative Analysis Systems / Interacting with Surface Technologies / Collaborative Work Enabled by Surfaces / The Theory and the Design of Surface Applications / The Development of Surface Applications / Concluding Comments / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    How We Cope with Digital Technology

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Digital technology has become a defining characteristic of modern life. Almost everyone uses it, we all rely on it, and many of us own a multitude of devices. What is more, we all expect to be able to use these technologies "straight out of the box." This lecture discusses how we are able to do this without apparent problems. We are able to use digital technology because we have learned to cope with it. "To cope" is used in philosophy to mean "absorbed engagement," that is, we use our smart phones and tablet computers with little or no conscious effort. In human-computer interaction this kind of use is more often described as intuitive. While this, of course, is testament to improved design, our interest in this lecture is in the human side of these interactions. We cope with technology because we are familiar with it. We define familiarity as the readiness to engage with technology which arises from being repeatedly exposed to it—often from birth. This exposure involves the frequent use of it and seeing people all around us using it every day. Digital technology has become as common a feature of our everyday lives as the motor car, TV, credit card, cutlery, or a dozen other things which we also use without conscious deliberation. We will argue that we cope with digital technology in the same way as we do these other technologies by means of this everyday familiarity. But this is only half of the story. We also regularly support or scaffold our use of technology. These scaffolding activities are described as "epistemic actions" which we adopt to make it easier for us to accomplish our goals. With digital technology these epistemic actions include appropriating it to meet our needs more closely. In summary, coping is a situated, embodied, and distributed description of how we use digital technology. Table of Contents: Introduction / Familiarity / Coping / Epistemic Scaffolding / Coping in Context / Bibliography / Author Biography View full abstract»