
Morgan & Claypool Synthesis Digital Library

636 Results Returned

  • Query Processing over Uncertain Databases

    Chen, L. ; Lian, X.
    DOI: 10.2200/S00465ED1V01Y201212DTM033
    Copyright Year: 2012

    Morgan and Claypool eBooks

    Due to measurement errors, transmission loss, or noise injected for privacy protection, uncertainty exists in the data of many real applications. However, query processing techniques for deterministic data cannot be directly applied to uncertain data because they have no mechanisms for handling data uncertainty. Therefore, efficient and effective manipulation of uncertain data is a practical yet challenging research topic. In this book, we start from the data models for imprecise and uncertain data, move on to defining different semantics for queries on uncertain data, and finally discuss advanced query processing techniques for various probabilistic queries in uncertain databases. The book serves as a comprehensive guideline for query processing over uncertain databases. Table of Contents: Introduction / Uncertain Data Models / Spatial Query Semantics over Uncertain Data Models / Spatial Query Processing over Uncertain Databases / Conclusion

  • Near Field Communication: Recent Developments and Library Implications

    McHugh, S. ; Yarmey, K.
    DOI: 10.2200/S00570ED1V01Y201403ETL002
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Near Field Communication is a radio frequency technology that allows objects, such as mobile phones, computers, tags, or posters, to exchange information wirelessly across a small distance. This report on the progress of Near Field Communication reviews the features and functionality of the technology and summarizes the broad spectrum of its current and anticipated applications. We explore the development of NFC technology in recent years, introduce the major stakeholders in the NFC ecosystem, and project its movement toward mainstream adoption. Several examples of early implementation of NFC in libraries are highlighted, primarily involving the use of NFC to enhance discovery by linking books or other physical objects with digital information about library resources, but also including applications of NFC to collection management and self-checkout. Future uses of NFC in libraries, such as smart posters or other enhanced outreach, are envisioned as well as the potential for the "touch paradigm" and "Internet of things" to transform the ways in which library users interact with the information environment. Conscious of the privacy and security of our patrons, we also address continuing concerns related to NFC technology and its expected applications, recommending caution, awareness, and education as immediate next steps for librarians.

  • Engineers for Korea

    Han, K. ; Downey, G.
    DOI: 10.2200/S00582ED1V01Y201406GES005
    Copyright Year: 2013

    Morgan and Claypool eBooks

    “The engineer is bearer of the nation’s industrialization,” says the tower pictured on the front cover. President Park Chung-hee (1917-1979) was seeking to scale up a unified national identity through industrialization, with engineers as iconic leaders. But Park encountered huge obstacles in what he called the “second economy” of mental nationalism. Technical workers had long been subordinate to classically-trained scholar officials. Even as the country became an industrial powerhouse, the makers of engineers never found approaches to techno-national formation—engineering education and training—that Koreans would wholly embrace. This book follows the fraught attempts of engineers to identify with Korea as a whole. It is for engineers, both Korean and non-Korean, who seek to become better critical analysts of their own expertise, identities, and commitments. It is for non-engineers who encounter or are affected by Korean engineers and engineering, and want to understand and engage them. It is for researchers who serve as critical participants in the making of engineers and puzzle over the contents and effects of techno-national formation.

  • Introduction to Engineering: A Starter's Guide with Hands-On Digital Multimedia and Robotics Explorations

    Karam, L. ; Mounsef, N.
    DOI: 10.2200/S00140ED1V01Y200806ENG007
    Copyright Year: 2008

    Morgan and Claypool eBooks

    This lecture provides a hands-on glimpse of the field of electrical and computer engineering. The broad range of hands-on applications utilizes LabVIEW and the NI-SPEEDY-33 hardware to explore concepts such as basic computer input and output, basic robotic principles, and introductory signal processing and communication concepts such as signal generation, modulation, music, speech, and audio and image/video processing. These principles and technologies are introduced in a very practical way and are fundamental to many of the electronic and computerized devices we use today. Some examples include audio level meter and audio effects, music synthesizer, real-time autonomous robot, image and video analysis, and DTMF modulation found in touch-tone telephone systems. Table of Contents: Getting Familiar with LabVIEW and SPEEDY-33 / Applications using LEDs and Switches using the SPEEDY-33 / Noise Removal / Music Equalizer / Telephone / Digital Audio Effects: Echo and Reverb / Music Composer / Introduction to Robotics / AM Radio / Modem / Digital Image Processing Fundamentals / Applications using USB Camera / Appendix: VIs at a Glance

  • Control System Synthesis: A Factorization Approach

    Vidyasagar, M.
    DOI: 10.2200/S00351ED1V01Y201105CRM002
    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book introduces the so-called "stable factorization approach" to the synthesis of feedback controllers for linear control systems. The key to this approach is to view the multi-input, multi-output (MIMO) plant for which one wishes to design a controller as a matrix over the fraction field F associated with a commutative ring with identity, denoted by R, which also has no divisors of zero. In this setting, the set of single-input, single-output (SISO) stable control systems is precisely the ring R, while the set of stable MIMO control systems is the set of matrices whose elements all belong to R. The set of unstable, meaning not necessarily stable, control systems is then taken to be the field of fractions F associated with R in the SISO case, and the set of matrices with elements in F in the MIMO case. The central notion introduced in the book is that, in most situations of practical interest, every matrix P whose elements belong to F can be "factored" as a "ratio" of two matrices N, D whose elements belong to R, in such a way that N and D are coprime. In the familiar case where the ring R corresponds to the set of bounded-input, bounded-output (BIBO)-stable rational transfer functions, coprimeness is equivalent to the two functions not having any common zeros in the closed right half-plane, including infinity. However, the notion of coprimeness extends readily to discrete-time systems, distributed-parameter systems in both the continuous- and discrete-time domains, and to multi-dimensional systems. Thus the stable factorization approach enables one to capture all these situations within a common framework. The key result in the stable factorization approach is the parametrization of all controllers that stabilize a given plant. It is shown that the set of all stabilizing controllers can be parametrized by a single matrix parameter R, whose elements all belong to R. Moreover, every transfer matrix in the closed-loop system is an affine function of the design parameter R. Thus problems of reliable stabilization, disturbance rejection, robust stabilization, etc., can all be formulated in terms of choosing an appropriate R. This is a reprint of the book Control System Synthesis: A Factorization Approach, originally published by M.I.T. Press in 1985.
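    As an illustrative sketch of the parametrization this abstract describes, in the SISO case (our own generic notation, not an excerpt from the book): given a coprime factorization p = n/d over the ring R with a Bézout identity, every stabilizing controller takes the form below, and every closed-loop transfer function is affine in the free parameter r.

```latex
\[
  p = \frac{n}{d}, \qquad x n + y d = 1, \qquad n,\, d,\, x,\, y \in R,
\]
\[
  c(r) = \frac{x + r d}{\,y - r n\,}, \qquad r \in R, \quad y - r n \neq 0,
\]
\[
  \frac{p\,c(r)}{1 + p\,c(r)} \;=\; n\,(x + r d) \quad \text{(affine in } r\text{)}.
\]
```

    The last identity follows by substituting the factorization and using the Bézout identity to cancel the denominator.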

  • Spaces of Interaction, Places for Experience

    Benyon, D.
    DOI: 10.2200/S00595ED1V01Y201409HCI022
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Spaces of Interaction, Places for Experience is a book about Human-Computer Interaction (HCI), interaction design (ID) and user experience (UX) in the age of ubiquitous computing. The book explores interaction and experience through the different spaces that contribute to interaction until it arrives at an understanding of the rich and complex places for experience that will be the focus of the next period for interaction design. The book begins by looking at the multilayered nature of interaction and UX—not just with new technologies, but with technologies that are embedded in the world. People inhabit a medium, or rather many media, which allow them to extend themselves, physically, mentally, and emotionally in many directions. The medium that people inhabit includes physical and semiotic material that combine to create user experiences. People feel more or less present in these media and more or less engaged with the content of the media. From this understanding of people in media, the book explores some philosophical and practical issues about designing interactions. The book journeys through the design of physical space, digital space, information space, conceptual space and social space. It explores concepts of space and place, digital ecologies, information architecture, conceptual blending and technology spaces at work and in the home. It discusses navigation of spaces and how people explore and find their way through environments. Finally the book arrives at the concept of a blended space where the physical and digital are tightly interwoven and people experience the blended space as a whole. The design of blended spaces needs to be driven by an understanding of the correspondences between the physical and the digital, by an understanding of conceptual blending and by the desire to design at a human scale. There is no doubt that HCI and ID are changing. The design of “microinteractions” remains important, but there is a bigger picture to consider.
    UX is spread across devices, over time and across physical spaces. The commingling of the physical and the digital in blended spaces leads to new social spaces and new conceptual spaces. UX concerns the navigation of these spaces as much as it concerns the design of buttons and screens for apps. By taking a spatial perspective on interaction, the book provides new insights into the evolving nature of interaction design.

  • Adiabatic Quantum Computation and Quantum Annealing: Theory and Practice

    McGeoch, C.
    DOI: 10.2200/S00585ED1V01Y201407QMC008
    Copyright Year: 2014

    Morgan and Claypool eBooks

    Adiabatic quantum computation (AQC) is an alternative to the better-known gate model of quantum computation. The two models are polynomially equivalent, but otherwise quite dissimilar: one property that distinguishes AQC from the gate model is its analog nature. Quantum annealing (QA) describes a type of heuristic search algorithm that can be implemented to run in the "native instruction set" of an AQC platform. D-Wave Systems Inc. manufactures quantum annealing processor chips that exploit quantum properties to realize QA computations in hardware. The chips form the centerpiece of a novel computing platform designed to solve NP-hard optimization problems. Starting with a 16-qubit prototype announced in 2007, the company has launched and sold increasingly larger models: the 128-qubit D-Wave One system was announced in 2010 and the 512-qubit D-Wave Two system arrived on the scene in 2013. A 1,000-qubit model is expected to be available in 2014. This monograph presents an introductory overview of this unusual and rapidly developing approach to computation. We start with a survey of basic principles of quantum computation and what is known about the AQC model and the QA algorithm paradigm. Next we review the D-Wave technology stack and discuss some challenges to building and using quantum computing systems at a commercial scale. The last chapter reviews some experimental efforts to understand the properties and capabilities of these unusual platforms. The discussion throughout is aimed at an audience of computer scientists with little background in quantum computation or in physics.

  • Health Care Engineering, Part I: Clinical Engineering and Technology Management

    Frize, M.
    DOI: 10.2200/S00540ED1V01Y201310BME050
    Copyright Year: 2013

    Morgan and Claypool eBooks

    The first chapter describes the health care delivery systems in Canada and in the U.S. This is followed by examples of various approaches used to measure physiological variables in humans, either for the purpose of diagnosis or monitoring potential disease conditions; a brief description of sensor technologies is included. The function and role of the clinical engineer in managing medical technologies in industrialized and in developing countries are presented. This is followed by a chapter on patient safety (mainly electrical safety and electromagnetic interference); it includes a section on how to minimize liability and how to develop a quality assurance program for technology management. The next chapter discusses applications of telemedicine, including technical, social, and ethical issues. The last chapter presents a discussion on the impact of technology on health care and the technology assessment process. This two-part book consolidates material that supports courses on technology development and management issues in health care institutions. It can be useful for anyone involved in design, development, or research, whether in industry, hospitals, or government.

  • Statistics is Easy!

    Shasha, D. ; Wilson, M.
    DOI: 10.2200/S00142ED1V01Y200807MAS001
    Copyright Year: 2009

    Morgan and Claypool eBooks

    Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life one cannot usually be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along with programs (in the language Python) to calculate them using resampling, and finally illustrates the use of the measures and programs in a case study. The text uses junior high school algebra and many examples to explain the concepts. The ideal reader has mastered at least elementary mathematics, likes to think procedurally, and is comfortable with computers. Table of Contents: The Basic Idea / Bias Corrected Confidence Intervals / Pragmatic Considerations When Using Resampling / Terminology / The Essential Stats / Case Study: New Mexico's 2004 Presidential Ballots / References
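    As a brief illustration of the resampling idea behind this book (a sketch with hypothetical data and helper names, not code from the text): the bootstrap draws many samples with replacement and reads a confidence interval off the percentiles of the recomputed statistic.

```python
import random

def bootstrap_ci(data, stat, n_resamples=10000, alpha=0.10, seed=0):
    """Estimate a (1 - alpha) confidence interval for `stat` by resampling
    the data with replacement: the distribution-free counting idea."""
    rng = random.Random(seed)
    stats = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_resamples)
    )
    lo = stats[int(n_resamples * (alpha / 2))]
    hi = stats[int(n_resamples * (1 - alpha / 2))]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
sample = [12, 15, 9, 14, 11, 13, 10, 16]  # hypothetical measurements
low, high = bootstrap_ci(sample, mean)    # 90% interval for the mean
print(low, high)
```

    Because only counting, sampling, and sorting are involved, no assumption about the data's underlying distribution is needed.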

  • The Paradigm Shift to Multimodality in Contemporary Computer Interfaces

    Oviatt, S. ; Cohen, P.
    DOI: 10.2200/S00636ED1V01Y201503HCI030
    Copyright Year: 2015

    Morgan and Claypool eBooks

    During the last decade, cell phones with multimodal interfaces based on combined new media have become the dominant computer interface worldwide. Multimodal interfaces support mobility and expand the expressive power of human input to computers. They have shifted the fulcrum of human-computer interaction much closer to the human. This book explains the foundation of human-centered multimodal interaction and interface design, based on the cognitive and neurosciences, as well as the major benefits of multimodal interfaces for human cognition and performance. It describes the data-intensive methodologies used to envision, prototype, and evaluate new multimodal interfaces. From a system development viewpoint, this book outlines major approaches for multimodal signal processing, fusion, architectures, and techniques for robustly interpreting users' meaning. Multimodal interfaces have been commercialized extensively for field and mobile applications during the last decade. Research also is growing rapidly in areas like multimodal data analytics, affect recognition, accessible interfaces, embedded and robotic interfaces, machine learning and new hybrid processing approaches, and similar topics. The expansion of multimodal interfaces is part of the long-term evolution of more expressively powerful input to computers, a trend that will substantially improve support for human cognition and performance.

  • Quantitative Neurophysiology

    Tranquillo, J.
    DOI: 10.2200/S00125ED1V01Y200808BME021
    Copyright Year: 2008

    Morgan and Claypool eBooks

    Quantitative Neurophysiology is a supplementary text for a junior- or senior-level course in neuroengineering. It may also serve as a quick-start for graduate students in engineering, physics or neuroscience as well as for faculty interested in becoming familiar with the basics of quantitative neuroscience. The first chapter is a review of the structure of the neuron and anatomy of the brain. Chapters 2-6 derive the theory of active and passive membranes, electrical propagation in axons and dendrites and the dynamics of the synapse. Chapter 7 is an introduction to modeling networks of neurons and artificial neural networks. Chapters 8 and 9 address the recording and decoding of extracellular potentials. The final chapter has descriptions of a number of more advanced or new topics in neuroengineering. Throughout the text, vocabulary is introduced which will enable students to read more advanced literature and communicate with other scientists and engineers working in the neurosciences. Numerical methods are outlined so students with programming knowledge can implement the models presented in the text. Analogies are used to clarify topics and reinforce key concepts. Finally, homework and simulation problems are available at the end of each chapter.
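    The passive-membrane theory covered in the book's early chapters lends itself to exactly the kind of numerical implementation the abstract mentions; the following forward-Euler sketch (our own illustrative parameter values, not the book's) integrates C dV/dt = -g_leak (V - E_leak) + I_inj.

```python
def simulate_passive_membrane(i_inj, dt=0.1, c_m=1.0, g_leak=0.1, e_leak=-70.0):
    """Forward-Euler integration of a passive (leaky) membrane:
    C dV/dt = -g_leak * (V - E_leak) + I_inj.
    Times in ms, voltages in mV; c_m and g_leak per unit area."""
    v = e_leak  # start at rest
    trace = []
    for i in i_inj:
        dv = (-g_leak * (v - e_leak) + i) / c_m
        v += dt * dv
        trace.append(v)
    return trace

# 50 ms of constant current injection (hypothetical amplitude).
trace = simulate_passive_membrane([1.0] * 500)
# V relaxes exponentially toward E_leak + I/g_leak = -70 + 10 = -60 mV
# with time constant tau = C/g_leak = 10 ms.
print(round(trace[-1], 2))
```

    After five time constants the membrane has essentially reached its steady-state depolarization, which is the behavior the analytic solution predicts.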

  • Pragmatic Electrical Engineering: Fundamentals

    Eccles, W.
    DOI: 10.2200/S00242ED1V01Y201105DCS031
    Copyright Year: 2011

    Morgan and Claypool eBooks

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics. All of the material in the text is illustrated with completely worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practical, applied look at the energy side of electrical systems. The author's "pragmatic" and applied style gives a unique and helpful "non-idealistic, practical, opinionated" introduction to the topic. Table of Contents: Basic Stuff / Power of the Sine / Three-Phase Power Systems / Transformers / Machines / Electromagnetics

  • Lectures on Financial Mathematics: Discrete Asset Pricing

    Anderson, G. ; Kercheval, A.
    DOI: 10.2200/S00293ED1V01Y201008MAS007
    Copyright Year: 2010

    Morgan and Claypool eBooks

    This is a short book on the fundamental concepts of the no-arbitrage theory of pricing financial derivatives. Its scope is limited to the general discrete setting of models for which the set of possible states is finite and so is the set of possible trading times--this includes the popular binomial tree model. This setting has the advantage of being fairly general while not requiring a sophisticated understanding of analysis at the graduate level. Topics include understanding the several variants of "arbitrage", the fundamental theorems of asset pricing in terms of martingale measures, and applications to forwards and futures. The authors' motivation is to present the material in a way that clarifies as much as possible why the often confusing basic facts are true. Therefore the ideas are organized from a mathematical point of view with the emphasis on understanding exactly what is under the hood and how it works. Every effort is made to include complete explanations and proofs, and the reader is encouraged to work through the exercises throughout the book. The intended audience is students and other readers who have an undergraduate background in mathematics, including exposure to linear algebra, some advanced calculus, and basic probability. The book has been used in earlier forms with students in the MS program in Financial Mathematics at Florida State University, and is a suitable text for students at that level. Students who seek a second look at these topics may also find this book useful. Table of Contents: Overture: Single-Period Models / The General Discrete Model / The Fundamental Theorems of Asset Pricing / Forwards and Futures / Incomplete Markets
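    The single-period pricing idea at the heart of this discrete theory can be sketched as follows (hypothetical numbers; the risk-neutral probability q comes from the no-arbitrage condition down < 1 + r < up):

```python
def one_period_price(s0, up, down, r, payoff):
    """Risk-neutral pricing in a one-period binomial model.
    q = (1 + r - down) / (up - down) is the martingale-measure probability
    under which the discounted stock price is a martingale."""
    q = (1.0 + r - down) / (up - down)
    assert 0.0 < q < 1.0, "no-arbitrage requires down < 1 + r < up"
    return (q * payoff(s0 * up) + (1.0 - q) * payoff(s0 * down)) / (1.0 + r)

# European call, strike 100, on a stock at 100 that moves to 120 or 90,
# with one-period interest rate 5% (all numbers hypothetical).
price = one_period_price(100.0, 1.2, 0.9, 0.05, lambda s: max(s - 100.0, 0.0))
print(round(price, 4))  # discounted risk-neutral expectation of the payoff
```

    Here q = (1.05 - 0.9)/(1.2 - 0.9) = 0.5, so the price is 0.5 * 20 / 1.05, the discounted expected payoff under the martingale measure rather than any real-world probability.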

  • Jordan Canonical Form: Application to Differential Equations

    Weintraub, S.
    DOI: 10.2200/S00146ED1V01Y200808MAS002
    Copyright Year: 2008

    Morgan and Claypool eBooks

    Jordan Canonical Form (JCF) is one of the most important, and useful, concepts in linear algebra. In this book we develop JCF and show how to apply it to solving systems of differential equations. We first develop JCF, including the concepts involved in it—eigenvalues, eigenvectors, and chains of generalized eigenvectors. We begin with the diagonalizable case and then proceed to the general case, but we do not present a complete proof. Indeed, our interest here is not in JCF per se, but in one of its important applications. We devote the bulk of our attention in this book to showing how to apply JCF to solve systems of constant-coefficient first order differential equations, where it is a very effective tool. We cover all situations—homogeneous and inhomogeneous systems; real and complex eigenvalues. We also treat the closely related topic of the matrix exponential. Our discussion is mostly confined to the 2-by-2 and 3-by-3 cases, and we present a wealth of examples that illustrate all the possibilities in these cases (and of course, exercises for the reader). Table of Contents: Jordan Canonical Form / Solving Systems of Linear Differential Equations / Background Results: Bases, Coordinates, and Matrices / Properties of the Complex Exponential
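    As a minimal worked illustration of the application described above (our own notation, not an excerpt from the book): for a single 2-by-2 Jordan block the matrix exponential is explicit, and it solves the constant-coefficient system directly.

```latex
\[
  J = \begin{pmatrix} \lambda & 1 \\ 0 & \lambda \end{pmatrix},
  \qquad
  e^{Jt} = e^{\lambda t}\begin{pmatrix} 1 & t \\ 0 & 1 \end{pmatrix},
\]
\[
  x' = A x, \quad A = P J P^{-1}
  \;\Longrightarrow\;
  x(t) = P\, e^{Jt}\, P^{-1} x(0),
\]
```

    where the columns of P form a chain of generalized eigenvectors; the factor t multiplying e^{λt} is exactly the contribution of the generalized eigenvector.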

  • Estimating the Query Difficulty for Information Retrieval

    Carmel, D. ; Yom-Tov, E.
    DOI: 10.2200/S00235ED1V01Y201004ICR015
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many information retrieval (IR) systems suffer from wide variance in performance when responding to users' queries. Even for systems that succeed very well on average, the quality of results returned for some of the queries is poor. Thus, it is desirable that IR systems be able to identify "difficult" queries so they can be handled properly. Understanding why some queries are inherently more difficult than others is essential for IR, and a good answer to this important question will help search engines reduce the variance in performance and thus better serve their customers' needs. Estimating the query difficulty is an attempt to quantify the quality of search results retrieved for a query from a given collection of documents. This book discusses the reasons that cause search engines to fail for some of the queries, and then reviews recent approaches for estimating query difficulty in the IR field. It then describes a common methodology for evaluating the prediction quality of those estimators, and experiments with some of the predictors applied by various IR methods over several TREC benchmarks. Finally, it discusses potential applications that can utilize query difficulty estimators by handling each query individually and selectively, based upon its estimated difficulty. Table of Contents: Introduction - The Robustness Problem of Information Retrieval / Basic Concepts / Query Performance Prediction Methods / Pre-Retrieval Prediction Methods / Post-Retrieval Prediction Methods / Combining Predictors / A General Model for Query Difficulty / Applications of Query Difficulty Estimation / Summary and Conclusions
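    One of the simplest pre-retrieval predictors in this literature is the average inverse document frequency of the query terms; a minimal sketch over a toy collection (the data and the exact IDF variant here are assumptions for illustration):

```python
import math

def avg_idf(query_terms, collection):
    """Average inverse document frequency of the query terms: a classic
    pre-retrieval difficulty signal. Lower values suggest a vaguer,
    likely harder query; the +1 smoothing avoids division by zero."""
    n_docs = len(collection)
    idfs = []
    for term in query_terms:
        df = sum(1 for doc in collection if term in doc)
        idfs.append(math.log((n_docs + 1) / (df + 1)))
    return sum(idfs) / len(idfs)

# Hypothetical toy collection: each document is its set of terms.
docs = [
    {"query", "processing", "databases"},
    {"query", "difficulty", "retrieval"},
    {"neural", "networks"},
]
print(avg_idf({"query"}, docs))       # frequent term -> low IDF
print(avg_idf({"difficulty"}, docs))  # rarer term -> higher IDF
```

    A predictor like this costs nothing at query time because it needs no retrieval run, which is exactly what distinguishes the pre-retrieval methods from the post-retrieval ones surveyed in the book.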

  • Basic Probability Theory for Biomedical Engineers

    Enderle, J. ; Farden, D. ; Krause, D.
    DOI: 10.2200/S00037ED1V01Y200606BME005
    Copyright Year: 2006

    Morgan and Claypool eBooks

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers, and scientists at all levels of background and experience for the application of this theory to a wide variety of problems—as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first chapter. The second chapter introduces the topic of random variables. Later chapters simply expand upon these key ideas and extend the range of application. A considerable effort has been made to develop the theory in a logical manner—developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology—both within the engineering community as well as the probability and statistics literature. Biomedical engineering examples are introduced throughout the text and a large number of self-study problems are available for the reader.

  • A Little Book on Teaching: A Beginner's Guide for Educators of Engineering and Applied Science

    Barrett, S.
    DOI: 10.2200/S00406ED1V01Y201203ENG017
    Copyright Year: 2012

    Morgan and Claypool eBooks

    It is often a challenging and overwhelming transition to go from being a student to being a teacher. Many new faculty members of engineering and science have to make this dramatic transition in a very short time. In the closing months of your Ph.D. program you are trying to complete your research, finish and defend your dissertation, find a job, move to a new location, and start a new job as a faculty member. If you are lucky, you've had the opportunity to serve as a teaching assistant and possibly have taught a university-level course. If you have served as a research assistant, your teaching opportunities may have been limited. Somehow, in this quick transition from student to teacher, one is supposed to become a good teacher and be ready for the first day of school. This book is intended as a basic primer on college-level teaching and learning for a new faculty member of engineering and applied science. New faculty members in other disciplines will find much of the information applicable to their area of expertise as well. First and foremost, this book is about learning and teaching. However, it also provides helpful information on related topics such as mentorship, student challenges, graduate students, tenure, promotion, and accreditation. This book is also intended as a reference for seasoned professionals. It is a good reference for those mentoring the next generation of college educators. Table of Contents: List of Figures / What Makes a Great Teacher? / A Little Learning Theory / Preparation for the First Day of Classes / Assessment / Beyond the First Day

  • An Anthropology of Services: Toward a Practice Approach to Designing Services

    Blomberg, J. ; Darrah, C.
    DOI: 10.2200/S00628ED1V01Y201502HCI026
    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book explores the possibility for an anthropology of services and outlines a practice approach to designing services. The reader is taken on a journey that Blomberg and Darrah have been on for the better part of a decade from their respective positions helping to establish a services research group within a large global enterprise and an applied anthropology master's program at a Silicon Valley university. They delve into the world of services to understand both how services are being conceptualized today and the possible benefits that might result from taking an anthropological view on services and their design. The authors argue that the anthropological gaze can be useful precisely because it combines attention to details of everyday life with consideration of the larger milieu in which those details make sense. Furthermore, it asks us to reflect upon and assess our own perspectives on that which we hope to understand and change. Central to their exploration is the question of how to conceptualize and engage with the world of services given their heterogeneity, the increasing global importance of the service economy, and the possibilities introduced for an engaged scholarship on service design. While discourse on services and service design can imply something distinctively new, the authors point to parallels with what is known about how humans have engaged with each other and the material world over millennia. Establishing the ubiquity of services as a starting point, the authors go on to consider the limits of design when the boundaries and connections between what can be designed and what can only be performed are complex and deeply mediated. 
    In this regard the authors outline a practice approach to designing that acknowledges that designing involves participating in a social context, that design and use occur in concert, that people populate a world that has been largely built by and with others, and that formal models of services are impoverished representations of human performance. An Anthropology of Services draws attention to the conceptual and methodological messiness of service worlds while providing the reader with strategies for intervening in these worlds for human betterment, as complex and challenging as that may be.

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Deformable Surface 3D Reconstruction from Monocular Images

    Salzmann, M. ; Fua, P.
    DOI: 10.2200/S00273ED1V01Y201006CAC010
    Copyright Year: 2010

    Morgan and Claypool eBooks

    Being able to recover the shape of 3D deformable surfaces from a single video stream would make it possible to field reconstruction systems that run on widely available hardware without requiring specialized devices. However, because many different 3D shapes can have virtually the same projection, such monocular shape recovery is inherently ambiguous. In this survey, we will review the two main classes of techniques that have proved most effective so far: the template-based methods that rely on establishing correspondences with a reference image in which the shape is already known, and non-rigid structure-from-motion techniques that exploit points tracked across the sequences to reconstruct a completely unknown shape. In both cases, we will formalize the approach, discuss its inherent ambiguities, and present the practical solutions that have been proposed to resolve them. To conclude, we will suggest directions for future research. Table of Contents: Introduction / Early Approaches to Non-Rigid Reconstruction / Formalizing Template-Based Reconstruction / Performing Template-Based Reconstruction / Formalizing Non-Rigid Structure from Motion / Performing Non-Rigid Structure from Motion / Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Answer Machine

    Feldman, S.
    DOI: 10.2200/S00442ED1V01Y201208ICR023
    Copyright Year: 2012

    Morgan and Claypool eBooks

    The Answer Machine is a practical, non-technical guide to the technologies behind information seeking and analysis. It introduces search and content analytics to software buyers, knowledge managers, and searchers who want to understand and design effective online environments. The book describes how search evolved from an expert-only to an end-user tool. It provides an overview of search engines, categorization and clustering, natural language processing, content analytics, and visualization technologies. Detailed profiles for Web search, eCommerce search, eDiscovery, and enterprise search contrast the types of users, uses, tasks, technologies, and interaction designs for each. These variables shape each application, although the underlying technologies are the same. Types of information tasks and the trade-offs between precision and recall, time, volume and precision, and privacy vs. personalization are discussed within this context. The book examines trends toward convenient, context-aware computing, big data and analytics technologies, conversational systems, and answer machines. The Answer Machine explores IBM Watson's DeepQA technology and describes how it is used to answer health care and Jeopardy! questions. The book concludes by discussing the implications of these advances: how they will change the way we run our businesses, practice medicine, govern, or conduct our lives in the digital age. Table of Contents: Introduction / The Query Process and Barriers to Finding Information Online / Online Search: An Evolution / Search and Discovery Technologies: An Overview / Information Access: A Spectrum of Needs and Uses / Future Tense: The Next Era in Information Access and Discovery / Answer Machines View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Keyword Search in Databases

    Yu, J. ; Qin, L. ; Chang, L.
    DOI: 10.2200/S00231ED1V01Y200912DTM001
    Copyright Year: 2009

    Morgan and Claypool eBooks

    It has become highly desirable to provide users with flexible ways to query/search information over databases as simply as with keyword search in Google. This book surveys the recent developments on keyword search over databases, and focuses on finding structural information among objects in a database using a set of keywords. Such structural information to be returned can be either trees or subgraphs representing how the objects that contain the required keywords are interconnected in a relational database or in an XML database. The structural keyword search is completely different from finding documents that contain all the user-given keywords. The former focuses on the interconnected object structures, whereas the latter focuses on the object content. The book is organized as follows. In Chapter 1, we highlight the main research issues on the structural keyword search in different contexts. In Chapter 2, we focus on supporting structural keyword search in a relational database management system using the SQL query language. We concentrate on how to generate a set of SQL queries that can find all the structural information among records in a relational database completely, and how to evaluate the generated set of SQL queries efficiently. In Chapter 3, we discuss graph algorithms for structural keyword search by treating an entire relational database as a large data graph. In Chapter 4, we discuss structural keyword search in a large tree-structured XML database. In Chapter 5, we highlight several interesting research issues regarding keyword search on databases. The book can be used as either an extended survey for people who are interested in the structural keyword search or a reference book for a postgraduate course on the related topics. Table of Contents: Introduction / Schema-Based Keyword Search on Relational Databases / Graph-Based Keyword Search / Keyword Search in XML Databases / Other Topics for Keyword Search on Databases View full abstract»
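The graph-based search the abstract describes (Chapter 3) treats the database as a data graph and looks for a small tree connecting nodes that match the keywords. A minimal sketch, assuming a toy adjacency-list graph and using a standard Steiner-tree-style heuristic (pick the root minimizing the summed distance to the nearest match for each keyword), might look like this; the function names and data layout are illustrative, not the book's API:

```python
from collections import deque

def bfs_dists(graph, src):
    """Unweighted shortest-path distances from src via BFS."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def keyword_search(graph, text, keywords):
    """Return (root, cost): the node minimizing the sum of shortest-path
    distances to the nearest node containing each keyword, i.e. the root
    of a cheap 'answer tree' (a common heuristic, not an exact Steiner tree)."""
    matches = {k: [n for n, t in text.items() if k in t.lower()]
               for k in keywords}
    if any(not m for m in matches.values()):
        return None  # some keyword has no match anywhere
    best = None
    for root in graph:
        d = bfs_dists(graph, root)
        total, ok = 0, True
        for k in keywords:
            ds = [d[n] for n in matches[k] if n in d]
            if not ds:
                ok = False
                break
            total += min(ds)
        if ok and (best is None or total < best[1]):
            best = (root, total)
    return best
```

For example, on a three-node chain where one end matches "smith" and the other matches "database", any node roots an answer tree of total distance 2.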

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Proxemic Interactions:From Theory to Practice

    Marquardt, N. ; Greenberg, S.
    DOI: 10.2200/S00619ED1V01Y201502HCI025
    Copyright Year: 2015

    Morgan and Claypool eBooks

    In the everyday world, much of what we do as social beings is dictated by how we perceive and manage our interpersonal space. This is called proxemics. At its simplest, people naturally correlate physical distance to social distance. We believe that people’s expectations of proxemics can be exploited in interaction design to mediate their interactions with devices (phones, tablets, computers, appliances, large displays) contained within a small ubiquitous computing ecology. Just as people expect increasing engagement and intimacy as they approach others, so should they naturally expect increasing connectivity and interaction possibilities as they bring themselves and their devices in close proximity to one another. This is called Proxemic Interactions. This book concerns the design of proxemic interactions within such future proxemic-aware ecologies. It imagines a world of devices that have fine-grained knowledge of nearby people and other devices—how they move into range, their precise distance, their identity, and even their orientation—and how such knowledge can be exploited to design interaction techniques. The first part of this book concerns theory. After introducing proxemics, we operationalize proxemics for ubicomp interaction via the Proxemic Interactions framework that designers can use to mediate people’s interactions with digital devices. The framework, in part, identifies five key dimensions of proxemic measures (distance, orientation, movement, identity, and location) to consider when designing proxemic-aware ubicomp systems. The second part of this book applies this theory to practice via three case studies of proxemic-aware systems that react continuously to people’s and devices’ proxemic relationships. The case studies explore the application of proxemics in small-space ubicomp ecologies by considering first person-to-device, then device-to-device, and finally person-to-person and device-to-device proxemic relationships. 
We also offer a critical perspective on proxemic interactions in the form of “dark patterns,” where knowledge of proxemics may (and likely will) be easily exploited to the detriment of the user. View full abstract»
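The "distance maps to social distance" idea at the heart of proxemics is often grounded in Edward Hall's interpersonal zones. A minimal sketch of how a proxemic-aware system might discretize a sensed distance into such a zone follows; the thresholds are Hall's commonly cited values and would be tuned per deployment, and the function is purely illustrative:

```python
def proxemic_zone(distance_m):
    """Map a measured person-to-device distance (in meters) to one of
    Edward Hall's interpersonal zones. Thresholds (0.45 m, 1.2 m, 3.6 m)
    are the commonly cited values, not fixed constants of the framework."""
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    if distance_m < 3.6:
        return "social"
    return "public"
```

A real proxemic-aware system would combine this with the other four dimensions the framework names (orientation, movement, identity, location) rather than react to distance alone.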

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mellin Transform Method for Integral Evaluation

    Fikioris, G.
    DOI: 10.2200/S00076ED1V01Y200612CEM013
    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book introduces the Mellin-transform method for the exact calculation of one-dimensional definite integrals, and illustrates the application of this method to electromagnetics problems. Once the basics have been mastered, one quickly realizes that the method is extremely powerful, often yielding closed-form expressions very difficult to come up with by other methods or to deduce from the usual tables of integrals. Yet, as opposed to other methods, the present method is very straightforward to apply; it usually requires laborious calculations, but little ingenuity. Two functions, the generalized hypergeometric function and the Meijer G-function, are very much related to the Mellin-transform method and arise frequently when the method is applied. Because these functions can be automatically handled by modern numerical routines, they are now much more useful than they were in the past. The Mellin-transform method and the two aforementioned functions are discussed first. Then the method is applied in three examples to obtain results, which, at least in the antenna/electromagnetics literature, are believed to be new. In the first example, a closed-form expression, as a generalized hypergeometric function, is obtained for the power radiated by a constant-current circular-loop antenna. The second example concerns the admittance of a 2-D slot antenna. In both these examples, the exact closed-form expressions are applied to improve upon existing formulas in standard antenna textbooks. In the third example, a very simple expression for an integral arising in recent, unpublished studies of unbounded, biaxially anisotropic media is derived. Additional examples are also briefly discussed. View full abstract»
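As a reminder of the machinery the abstract refers to, the standard Mellin transform pair is

```latex
F(s) = \mathcal{M}[f](s) = \int_0^\infty f(x)\, x^{s-1}\, dx,
\qquad
f(x) = \frac{1}{2\pi i}\int_{c-i\infty}^{c+i\infty} F(s)\, x^{-s}\, ds,
```

and the Parseval-type identity that drives the integral-evaluation method is

```latex
\int_0^\infty f(x)\, g(x)\, dx
= \frac{1}{2\pi i}\int_{c-i\infty}^{c+i\infty} F(s)\, G(1-s)\, ds,
```

so a hard real integral on the left becomes a contour integral on the right that can often be closed and evaluated by residues, yielding hypergeometric or Meijer G-function series.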

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Wavelet Image Compression

    Pearlman, W.
    DOI: 10.2200/S00464ED1V01Y201212IVM013
    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book explains the stages necessary to create a wavelet compression system for images and describes state-of-the-art systems used in image compression standards and current research. It starts with a high-level discussion of the properties of the wavelet transform, especially the decomposition into multi-resolution subbands. It continues with an exposition of the null-zone, uniform quantization used in most subband coding systems and the optimal allocation of bitrate to the different subbands. Then the image compression systems of the FBI Fingerprint Compression Standard and the JPEG2000 Standard are described in detail. Following that, the set partitioning coders SPECK and SPIHT, and EZW are explained in detail and compared, via a fictitious wavelet transform, in terms of the actions taken and the number of bits coded in a single pass through the top bit plane. The presentation teaches that, besides producing efficient compression, these coding systems, except for the FBI Standard, are capable of writing bitstreams that have attributes of rate scalability, resolution scalability, and random access decoding. Many diagrams and tables accompany the text to aid understanding. The book is generous in pointing out references and resources to help the reader who wishes to expand his knowledge, know the origins of the methods, or find resources for running the various algorithms or building his own coding system. Table of Contents: Introduction / Characteristics of the Wavelet Transform / Generic Wavelet-based Coding Systems / The FBI Fingerprint Image Compression Standard / Set Partition Embedded Block (SPECK) Coding / Tree-based Wavelet Transform Coding Systems / Rate Control for Embedded Block Coders / Conclusion View full abstract»
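The rate scalability the abstract mentions comes from bit-plane (embedded) coding: coefficients are transmitted one magnitude bit plane at a time, so truncating the bitstream anywhere still yields a valid, coarser reconstruction. A toy sketch, far simpler than SPECK/SPIHT/EZW (no significance trees, signs omitted for brevity), with illustrative function names:

```python
def bitplane_encode(coeffs, num_planes):
    """Emit one bit per coefficient magnitude for each bit plane,
    from most to least significant."""
    mags = [abs(c) for c in coeffs]
    planes = []
    for p in range(num_planes - 1, -1, -1):
        planes.append([(m >> p) & 1 for m in mags])
    return planes

def bitplane_decode(planes, num_planes):
    """Rebuild magnitudes from however many planes were received;
    fewer planes simply means a coarser approximation."""
    n = len(planes[0])
    mags = [0] * n
    for i, plane in enumerate(planes):
        p = num_planes - 1 - i
        for j, bit in enumerate(plane):
            mags[j] |= bit << p
    return mags
```

Decoding all three planes of `[5, -3, 2, 7]` recovers the exact magnitudes, while decoding only the top plane gives the quantized approximation `[4, 0, 0, 4]`, illustrating embedded, rate-scalable decoding.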

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Concise Introduction to Multiagent Systems and Distributed Artificial Intelligence

    Vlassis, N.
    DOI: 10.2200/S00091ED1V01Y200705AIM002
    Copyright Year: 2007

    Morgan and Claypool eBooks

    Multiagent systems is an expanding field that blends classical fields like game theory and decentralized control with modern fields like computer science and machine learning. This monograph provides a concise introduction to the subject, covering the theoretical foundations as well as more recent developments in a coherent and readable manner. The text is centered on the concept of an agent as decision maker. Chapter 1 is a short introduction to the field of multiagent systems. Chapter 2 covers the basic theory of single-agent decision making under uncertainty. Chapter 3 is a brief introduction to game theory, explaining classical concepts like Nash equilibrium. Chapter 4 deals with the fundamental problem of coordinating a team of collaborative agents. Chapter 5 studies the problem of multiagent reasoning and decision making under partial observability. Chapter 6 focuses on the design of protocols that are stable against manipulations by self-interested agents. Chapter 7 provides a short introduction to the rapidly expanding field of multiagent reinforcement learning. The material can be used for teaching a half-semester course on multiagent systems covering, roughly, one chapter per lecture. View full abstract»
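The Nash equilibrium concept from the book's game-theory chapter is easy to make concrete for pure strategies: a joint action is an equilibrium when no player can improve its own payoff by deviating unilaterally. A minimal enumeration sketch for a two-player game given as payoff matrices (illustrative code, not from the monograph):

```python
def pure_nash(payoff_a, payoff_b):
    """Enumerate all pure-strategy Nash equilibria of a two-player game.
    payoff_a[i][j] / payoff_b[i][j] are the players' payoffs when player A
    plays row i and player B plays column j."""
    n, m = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(n):
        for j in range(m):
            # (i, j) is an equilibrium iff each action is a best response
            a_best = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n))
            b_best = all(payoff_b[i][j] >= payoff_b[i][l] for l in range(m))
            if a_best and b_best:
                equilibria.append((i, j))
    return equilibria
```

On the Prisoner's Dilemma (action 0 = cooperate, 1 = defect), the only pure equilibrium is mutual defection, even though mutual cooperation pays both players more.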