Morgan & Claypool Synthesis Digital Library

820 Results Returned

  • Cyber-Physical Security and Privacy in the Electric Smart Grid

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This book focuses on the combined cyber and physical security issues in advanced electric smart grids. Existing standards are compared with classical results, and the security and privacy principles of current practice are illustrated. The book points the way toward future development of advanced smart grids that operate in a peer-to-peer fashion, thus requiring a different security model. Future defenses are proposed that include information flow analysis and attestation systems that rely on fundamental physical properties of the smart grid system.

  • Mobile Platforms and Development Environments

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Mobile platform development has lately become a technological war zone with extremely dynamic and fluid movement, especially in the smart phone and tablet market space. This Synthesis lecture is a guide to the latest developments of the key mobile platforms that are shaping the mobile platform industry. The book covers the three currently dominant native platforms -- iOS, Android and Windows Phone -- along with the device-agnostic HTML5 mobile web platform. The lecture also covers location-based services (LBS), which can be considered a platform in their own right. The lecture utilizes a sample application (TwitterSearch) that the authors show programmed on each of the platforms. Audiences who may benefit from this lecture include: (1) undergraduate and graduate students taking mobile computing classes or self-learning the mobile platform programmability road map; (2) academic and industrial researchers working on mobile computing R&D projects; (3) mobile app developers for a specific platform who may be curious about other platforms; (4) system integrator consultants and firms concerned with mobilizing businesses and enterprise apps; and (5) industries including health care, logistics, mobile workforce management, mobile commerce and payment systems, and mobile search and advertisement. Table of Contents: From the Newton to the iPhone / iOS / Android / Windows Phone / Mobile Web / Platform-in-Platform: Location-Based Services (LBS) / The Future of Mobile Platforms / TwitterSearch Sample Application

  • Spaces of Interaction, Places for Experience

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Spaces of Interaction, Places for Experience is a book about Human-Computer Interaction (HCI), interaction design (ID) and user experience (UX) in the age of ubiquitous computing. The book explores interaction and experience through the different spaces that contribute to interaction until it arrives at an understanding of the rich and complex places for experience that will be the focus of the next period for interaction design. The book begins by looking at the multilayered nature of interaction and UX—not just with new technologies, but with technologies that are embedded in the world. People inhabit a medium, or rather many media, which allow them to extend themselves, physically, mentally, and emotionally in many directions. The medium that people inhabit includes physical and semiotic material that combine to create user experiences. People feel more or less present in these media and more or less engaged with the content of the media. From this understanding of people in media, the book explores some philosophical and practical issues about designing interactions. The book journeys through the design of physical space, digital space, information space, conceptual space and social space. It explores concepts of space and place, digital ecologies, information architecture, conceptual blending and technology spaces at work and in the home. It discusses navigation of spaces and how people explore and find their way through environments. Finally, the book arrives at the concept of a blended space where the physical and digital are tightly interwoven and people experience the blended space as a whole. The design of blended spaces needs to be driven by an understanding of the correspondences between the physical and the digital, by an understanding of conceptual blending and by the desire to design at a human scale. There is no doubt that HCI and ID are changing. The design of “microinteractions” remains important, but there is a bigger picture to consider.
UX is spread across devices, over time and across physical spaces. The commingling of the physical and the digital in blended spaces leads to new social spaces and new conceptual spaces. UX concerns the navigation of these spaces as much as it concerns the design of buttons and screens for apps. By taking a spatial perspective on interaction, the book provides new insights into the evolving nature of interaction design.

  • RFID Explained: A Primer on Radio Frequency Identification Technologies

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This lecture provides an introduction to Radio Frequency Identification (RFID), a technology enabling automatic identification of objects at a distance without requiring line-of-sight. Electronic tagging can be divided into technologies that have a power source (active tags), and those that are powered by the tag interrogation signal (passive tags); the focus here is on passive tags. An overview of the principles of the technology divides passive tags into devices that use either near field or far field coupling to communicate with a tag reader. The strengths and weaknesses of the approaches are considered, along with the standards that have been put in place by ISO and EPCGlobal to promote interoperability and the ubiquitous adoption of the technology. A section of the lecture has been dedicated to the principles of reading co-located tags, as this represents a significant challenge for a technology that may one day be able to automatically identify all of the items in your shopping cart in just a few seconds. In fact, RFID applications are already quite extensive and this lecture classifies the primary uses. Some variants of modern RFID can also be integrated with sensors, enabling the technology to be extended to measure parameters in the local environment, such as temperature and pressure. The uses and applications of RFID sensors are further described and classified. Later we examine important lessons surrounding the deployment of RFID drawn from the Wal-Mart and Metro AG store experiences, along with deployments in some more exploratory settings. Extensions of RFID that make use of read/write memory integrated with the tag are also discussed, in particular looking at novel near-term opportunities. Privacy and social implications surrounding the use of RFID inspire recurring debates whenever there is discussion of large-scale deployment; we examine the pros and cons of the issues and approaches for mitigating the problems.
Finally, the remaining challenges of RFID are considered and we look to the future possibilities for the technology. Table of Contents: Introduction / Principles of Radio Frequency Identification / RFID Industry Standards / Reading Co-located RFID Tags / Applications of RFID Tagging / RFID Incorporating Sensing / Deployment and Experience with RFID Systems / Privacy, Kill Switches, and Blocker Tags / Opportunities for RFID Integrated with Memory / Challenges, Future Technology, and Conclusion

  • Representations of Multiple-Valued Logic Functions

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Compared to binary switching functions, multiple-valued (MV) functions offer more compact representations of the information content of signals modeled by logic functions and, therefore, their use fits very well in the general setting of data compression attempts and approaches. The first task in dealing with such signals is to provide mathematical methods for their representation in a way that will make their application in practice feasible. Representations of Multiple-Valued Logic Functions is aimed at providing an accessible introduction to the mathematical techniques that are necessary for applying the related implementation methods and tools. This book presents in a uniform way different representations of multiple-valued logic functions, including functional expressions, spectral representations on finite Abelian groups, and their graphical counterparts (various related decision diagrams). Three-valued, or ternary, functions are traditionally used as the first extension from the binary case. They have the useful feature that the ratio between the number of bits and the number of different values that can be encoded with that number of bits is favourable for ternary functions. Four-valued functions, also called quaternary functions, are particularly attractive since, in practical realization within today's prevalent binary-circuit environment, they may be easily encoded by binary values and realized with two-stable-state circuits. At the same time, there has been considerably more progress in the design of four-valued logic circuits than for other p-valued functions. Therefore, this book is written using a hands-on approach such that, after introducing the general and necessarily abstract background theory, the presentation is based on a large number of examples for ternary and quaternary functions that should provide an intuitive understanding of the various representation methods and the interconnections among them.

  • Engineering: Women and Leadership

    Copyright Year: 2008

    Morgan and Claypool eBooks

    In this book we explore a sea change occurring in leadership for academic women in the sciences and engineering. Our approach is a two-pronged one: On the one hand, we outline the nature of the changes and their sources, both in various literatures and from program research results. On the other hand, we specify and provide detail about the persistent problems and obstacles that remain as barriers to women’s full participation in academic science and engineering, their career advancement and success, and, most important, their role as leaders in making change. At the heart of this book is our goal to give some shape to the research, practice, and programs developed by women academic leaders making institutional change in the sciences and engineering. Table of Contents: Women in a New Era of Academic Leadership / Background: Academic Leadership for Women in Science and Engineering / Gender and Leadership: Theories and Applications / Women in Engineering Leadership Institute: Critical Issues for Women Academic Engineers as Leaders / From Success Stories to Success Strategies: Leadership for Promoting Diversity in Academic Science and Engineering / Conclusion

  • Joint Source-Channel Video Transmission

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book deals with the problem of joint source-channel video transmission, i.e., the joint optimal allocation of resources at the application layer and the other network layers, such as data rate adaptation, channel coding, power adaptation in wireless networks, quality of service (QoS) support from the network, and packet scheduling, for efficient video transmission. Real-time video communication applications, such as videoconferencing, video telephony, and on-demand video streaming, have gained increased popularity. However, a key problem in video transmission over the existing Internet and wireless networks is the incompatibility between the nature of the network conditions and the QoS requirements (in terms, for example, of bandwidth, delay, and packet loss) of real-time video applications. To deal with this incompatibility, a natural approach is to adapt the end-system to the network. The joint source-channel coding approach aims to efficiently perform content-aware cross-layer resource allocation, thus increasing the communication efficiency of multiple network layers. Our purpose in this book is to review the basic elements of the state-of-the-art approaches toward joint source-channel video transmission for wired and wireless systems. In this book, we present a general resource-distortion optimization framework, which is used throughout the book to guide our discussions on various techniques of joint source-channel video transmission. In this framework, network resources from multiple layers are assigned to each video packet according to its level of importance. It provides not only an optimization benchmark against which the performance of other sub-optimal systems can be evaluated, but also a useful tool for assessing the effectiveness of different error control components in practical system design. 
This book is therefore written to be accessible to researchers, expert industrial R&D engineers, and university students who are interested in the cutting-edge technologies in joint source-channel video transmission. Contents: Introduction / Elements of a Video Communication System / Joint Source-Channel Coding / Error-Resilient Video Coding / Channel Modeling and Channel Coding / Internet Video Transmission / Wireless Video Transmission / Conclusions

  • A Little Book on Teaching: A Beginner's Guide for Educators of Engineering and Applied Science

    Copyright Year: 2012

    Morgan and Claypool eBooks

    It is often a challenging and overwhelming transition to go from being a student to being a teacher. Many new faculty members in engineering and science have to make this dramatic transition in a very short time. In the same closing months of your Ph.D. program you are trying to complete your research, finish and defend your dissertation, find a job, move to a new location, and start a new job as a faculty member. If you are lucky, you've had the opportunity to serve as a teaching assistant and possibly have taught a university-level course. If you have served as a research assistant, your teaching opportunities may have been limited. Somehow, in this quick transition from student to teacher, one is supposed to become a good teacher and be ready for the first day of school. This book is intended as a basic primer on college-level teaching and learning for a new faculty member of engineering and applied science. New faculty members in other disciplines will find much of the information applicable to their area of expertise as well. First and foremost, this book is about learning and teaching. However, it also provides helpful information on related topics such as mentorship, student challenges, graduate students, tenure, promotion, and accreditation. This book is also intended as a reference for seasoned professionals. It is a good reference for those mentoring the next generation of college educators. Table of Contents: List of Figures / What makes a Great Teacher? / A little learning theory / Preparation for the first day of classes / Assessment / Beyond the first day

  • Articulation and Intelligibility

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Immediately following the Second World War, between 1947 and 1955, several classic papers quantified the fundamentals of human speech information processing and recognition. In 1947 French and Steinberg published their classic study on the articulation index. In 1948 Claude Shannon published his famous work on the theory of information. In 1950 Fletcher and Galt published their theory of the articulation index, a theory that Fletcher had worked on for 30 years, which integrated his classic works on loudness and speech perception with models of speech intelligibility. In 1951 George Miller wrote the first book, Language and Communication, analyzing human speech communication with Claude Shannon's just-published theory of information. Finally in 1955 George Miller published the first extensive analysis of phone decoding, in the form of confusion matrices, as a function of the speech-to-noise ratio. This work extended the Bell Labs' speech articulation studies with ideas from Shannon's information theory. Both Miller and Fletcher showed that speech, as a code, is incredibly robust to mangling distortions of filtering and noise. Regrettably, much of this early work was forgotten. While the key science of information theory blossomed, other than the work of George Miller it was rarely applied to aural speech research. The robustness of speech, which is the most amazing thing about the speech code, has rarely been studied. It is my belief (i.e., assumption) that we can analyze speech intelligibility with the scientific method. The quantitative analysis of speech intelligibility requires both science and art. The scientific component requires an error analysis of spoken communication, which depends critically on the use of statistics, information theory, and psychophysical methods. The artistic component depends on knowing how to restrict the problem in such a way that progress may be made.
It is critical to tease out the relevant from the irrelevant and dig for the key issues. This will focus us on the decoding of nonsense phonemes with no visual component, which have been mangled by filtering and noise. This monograph is a summary and theory of human speech recognition. It builds on and integrates the work of Fletcher, Miller, and Shannon. The long-term goal is to develop a quantitative theory for predicting the recognition of speech sounds. In Chapter 2 the theory is developed for maximum entropy (MaxEnt) speech sounds, also called nonsense speech. In Chapter 3, context is factored in. The book is largely reflective, and quantitative, with a secondary goal of providing an historical context, along with the many deep insights found in these early works.

  • Fundamentals of Spread Spectrum Modulation

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This lecture covers the fundamentals of spread spectrum modulation, which can be defined as any modulation technique that requires a transmission bandwidth much greater than the modulating signal bandwidth, independent of the bandwidth of the modulating signal. After reviewing basic digital modulation techniques, the principal forms of spread spectrum modulation are described. One of the most important components of a spread spectrum system is the spreading code, and several types and their characteristics are described. The most essential operation required at the receiver in a spread spectrum system is code synchronization, which is usually broken down into the operations of acquisition and tracking. Means for performing these operations are discussed next. Finally, the performance of spread spectrum systems is of fundamental interest, and the effect of jamming is considered, both without and with the use of forward error correction coding. The presentation ends with consideration of spread spectrum systems in the presence of other users. For more complete treatments of spread spectrum, the reader is referred to [1, 2, 3].
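    The spreading-code idea summarized in this abstract can be sketched in a few lines of code. The toy example below is a minimal direct-sequence illustration under stated assumptions: the 7-chip code and the majority-vote despreader are illustrative choices, not taken from the lecture.

```python
# Toy direct-sequence spreading sketch: each data bit is XORed with a
# pseudo-noise (PN) chip sequence, expanding the signal by the spreading
# factor; the receiver despreads with the same code. The 7-chip code
# below is an illustrative assumption, not a code from the lecture.

PN_CODE = [1, 0, 0, 1, 1, 1, 0]  # assumed length-7 PN chip sequence

def spread(bits):
    """Spread each data bit over len(PN_CODE) chips."""
    return [b ^ c for b in bits for c in PN_CODE]

def despread(chips):
    """Compare each chip block against the PN code and majority-vote."""
    n = len(PN_CODE)
    bits = []
    for i in range(0, len(chips), n):
        block = chips[i:i + n]
        matches = sum(ch == c for ch, c in zip(block, PN_CODE))
        bits.append(0 if matches > n // 2 else 1)
    return bits

data = [1, 0, 1, 1]
tx = spread(data)
assert despread(tx) == data   # clean channel: bits recovered exactly
tx[3] ^= 1                    # corrupt one chip (narrowband jamming)
assert despread(tx) == data   # redundancy across chips still recovers bits
```

    The last two assertions hint at why spreading buys jamming resistance: a single corrupted chip is outvoted by the remaining chips of the same bit.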

  • Graph Mining: Laws, Tools, and Case Studies

    Copyright Year: 2012

    Morgan and Claypool eBooks

    What does the Web look like? How can we find patterns, communities, and outliers in a social network? Which are the most central nodes in a network? These are the questions that motivate this work. Networks and graphs appear in many diverse settings, for example in social networks, computer-communication networks (intrusion detection, traffic management), protein-protein interaction networks in biology, document-text bipartite graphs in text retrieval, person-account graphs in financial fraud detection, and others. In this work, first we list several surprising patterns that real graphs tend to follow. Then we give a detailed list of generators that try to mirror these patterns. Generators are important, because they can help with "what if" scenarios, extrapolations, and anonymization. Then we provide a list of powerful tools for graph analysis, specifically spectral methods (Singular Value Decomposition (SVD)), tensors, and case studies like the famous "PageRank" algorithm and the "HITS" algorithm for ranking web search results. Finally, we conclude with a survey of tools and observations from related fields like sociology, which provide complementary viewpoints. Table of Contents: Introduction / Patterns in Static Graphs / Patterns in Evolving Graphs / Patterns in Weighted Graphs / Discussion: The Structure of Specific Graphs / Discussion: Power Laws and Deviations / Summary of Patterns / Graph Generators / Preferential Attachment and Variants / Incorporating Geographical Information / The RMat / Graph Generation by Kronecker Multiplication / Summary and Practitioner's Guide / SVD, Random Walks, and Tensors / Tensors / Community Detection / Influence/Virus Propagation and Immunization / Case Studies / Social Networks / Other Related Work / Conclusions
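    The PageRank case study mentioned in this abstract can be illustrated with a short power-iteration sketch. The graph, damping factor, and iteration count below are illustrative assumptions, not values from the book.

```python
# Minimal PageRank by power iteration over an adjacency list.
# The three-node graph and damping factor 0.85 are assumptions
# chosen for illustration, not examples from the book.

def pagerank(graph, damping=0.85, iters=100):
    """graph: dict mapping node -> list of out-neighbours."""
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if not outs:                           # dangling node:
                for u in nodes:                    # spread rank evenly
                    new[u] += damping * rank[v] / n
            else:
                for u in outs:
                    new[u] += damping * rank[v] / len(outs)
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
r = pagerank(web)
assert abs(sum(r.values()) - 1.0) < 1e-9  # ranks form a distribution
assert r["c"] > r["b"]                    # c has more in-links than b
```

    Power iteration is the simplest of the spectral methods the book surveys: the rank vector converges to the dominant eigenvector of the (damped) random-walk transition matrix.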

  • Instant Recovery with Write-Ahead Logging: Page Repair, System Restart, Media Restore, and System Failover, Second Edition

    Copyright Year: 2016

    Morgan and Claypool eBooks

    <p> Traditional theory and practice of write-ahead logging and of database recovery focus on three failure classes: transaction failures (typically due to deadlocks) resolved by transaction rollback; system failures (typically power or software faults) resolved by restart with log analysis, "redo," and "undo" phases; and media failures (typically hardware faults) resolved by restore operations that combine multiple types of backups and log replay. </p> <p> The recent addition of single-page failures and single-page recovery has opened new opportunities far beyond the original aim of immediate, lossless repair of single-page wear-out in novel or traditional storage hardware. In the contexts of system and media failures, efficient single-page recovery enables on-demand incremental "redo" and "undo" as part of system restart or media restore operations. This can give the illusion of practically instantaneous restart and restore: instant restart permits processing new queries and updates seconds after system reboot, and instant restore permits resuming queries and updates on empty replacement media as if those were already fully recovered. In the context of node and network failures, instant restart and instant restore combine to enable practically instant failover from a failing database node to one holding merely an out-of-date backup and a log archive, yet without loss of data, updates, or transactional integrity. </p> <p> In addition to these instant recovery techniques, the discussion introduces self-repairing indexes and much faster offline restore operations, which impose no slowdown in backup operations and hardly any slowdown in log archiving operations. The new restore techniques also render differential and incremental backups obsolete, complete backup commands on a database server practically instantly, and even permit taking full up-to-date backups without imposing any load on the database server.
</p> <p> Compared to the first version of this book, this second edition adds sections on applications of single-page repair, instant restart, single-pass restore, and instant restore. Moreover, it adds sections on instant failover among nodes in a cluster, applications of instant failover, recovery for file systems and data files, and the performance of instant restart and instant restore.</p>

  • VIVO: A Semantic Approach to Scholarly Networking and Discovery

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The world of scholarship is changing rapidly. Increasing demands on scholars, the growing size and complexity of questions and problems to be addressed, and advances in sophistication of data collection, analysis, and presentation require new approaches to scholarship. A ubiquitous, open information infrastructure for scholarship, consisting of linked open data, open-source software tools, and a community committed to sustainability, is emerging to meet the needs of scholars today. This book provides an introduction to VIVO, http://vivoweb.org/, a tool for representing information about research and researchers -- their scholarly works, research interests, and organizational relationships. VIVO provides an expressive ontology, tools for managing the ontology, and a platform for using the ontology to create and manage linked open data for scholarship and discovery. Begun as a project at Cornell and further developed by an NIH-funded consortium, VIVO is now being established as an open-source project with community participation from around the world. By the end of 2012, over 20 countries and 50 organizations will provide information in VIVO format on more than one million researchers and research staff, including publications, research resources, events, funding, courses taught, and other scholarly activity. The rapid growth of VIVO and of VIVO-compatible data sources speaks to the fundamental need to transform scholarship for the 21st century. Table of Contents: Scholarly Networking Needs and Desires / The VIVO Ontology / Implementing VIVO and Filling It with Life / Case Study: University of Colorado at Boulder / Case Study: Weill Cornell Medical College / Extending VIVO / Analyzing and Visualizing VIVO Data / The Future of VIVO: Growing the Community

  • Linked Data: Evolving the Web into a Global Data Space

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The World Wide Web has enabled the creation of a global information space comprising linked documents. As the Web becomes ever more enmeshed with our daily lives, there is a growing desire for direct access to raw data not currently available on the Web or bound up in hypertext documents. Linked Data provides a publishing paradigm in which not only documents, but also data, can be a first-class citizen of the Web, thereby enabling the extension of the Web with a global data space based on open standards - the Web of Data. In this Synthesis lecture we provide readers with a detailed technical introduction to Linked Data. We begin by outlining the basic principles of Linked Data, including coverage of relevant aspects of Web architecture. The remainder of the text is based around two main themes - the publication and consumption of Linked Data. Drawing on a practical Linked Data scenario, we provide guidance and best practices on: architectural approaches to publishing Linked Data; choosing URIs and vocabularies to identify and describe resources; deciding what data to return in a description of a resource on the Web; methods and frameworks for automated linking of data sets; and testing and debugging approaches for Linked Data deployments. We give an overview of existing Linked Data applications and then examine the architectures that are used to consume Linked Data from the Web, alongside existing tools and frameworks that enable these. Readers can expect to gain a rich technical understanding of Linked Data fundamentals, as the basis for application development, research or further study. Table of Contents: List of Figures / Introduction / Principles of Linked Data / The Web of Data / Linked Data Design Considerations / Recipes for Publishing Linked Data / Consuming Linked Data / Summary and Outlook
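    The publishing side of Linked Data described in this abstract boils down to stating facts as RDF triples and serializing them at dereferenceable URIs. The sketch below shows a minimal N-Triples serializer under stated assumptions: the example.org URIs are hypothetical, while the dcterms predicates are real Dublin Core terms commonly used for this purpose.

```python
# Minimal sketch of the Linked Data publishing model: facts are
# (subject, predicate, object) triples, serialized here as N-Triples.
# The example.org URIs are illustrative, not from the book.

def to_ntriples(triples):
    """Serialize (s, p, o) tuples; URIs in angle brackets, literals quoted."""
    lines = []
    for s, p, o in triples:
        obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
        lines.append(f"<{s}> <{p}> {obj} .")
    return "\n".join(lines)

triples = [
    ("http://example.org/book/linked-data",
     "http://purl.org/dc/terms/title",
     "Linked Data: Evolving the Web into a Global Data Space"),
    ("http://example.org/book/linked-data",
     "http://purl.org/dc/terms/creator",
     "http://example.org/person/author-1"),
]

doc = to_ntriples(triples)
assert doc.count(" .") == 2                              # one triple per line
assert "<http://example.org/person/author-1>" in doc    # link to another resource
```

    The second triple is the essential Linked Data move: the object is itself a URI, so a consumer can dereference it and discover more data.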

  • Community Detection and Mining in Social Media

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The past decade has witnessed the emergence of the participatory Web and social media, bringing people together in many creative ways. Millions of users are playing, tagging, working, and socializing online, demonstrating new forms of collaboration, communication, and intelligence that were hardly imaginable just a short time ago. Social media also helps reshape business models, sway opinions and emotions, and open up numerous possibilities to study human interaction and collective behavior on an unparalleled scale. This lecture, from a data mining perspective, introduces characteristics of social media, reviews representative tasks of computing with social media, and illustrates associated challenges. It introduces basic concepts, presents state-of-the-art algorithms with easy-to-understand examples, and recommends effective evaluation methods. In particular, we discuss graph-based community detection techniques and many important extensions that handle dynamic, heterogeneous networks in social media. We also demonstrate how discovered patterns of communities can be used for social media mining. The concepts, algorithms, and methods presented in this lecture can help harness the power of social media and support building socially intelligent systems. This book is an accessible introduction to the study of community detection and mining in social media. It is essential reading for students, researchers, and practitioners in disciplines and applications where social media is a key source of data that piques our curiosity to understand, manage, innovate, and excel. This book is supported by additional materials, including lecture slides, the complete set of figures, key references, some toy data sets used in the book, and the source code of representative algorithms. The readers are encouraged to visit the book website for the latest information.
Table of Contents: Social Media and Social Computing / Nodes, Ties, and Influence / Community Detection and Evaluation / Communities in Heterogeneous Networks / Social Media Mining

  • Brain-Machine Interface Engineering

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Neural interfaces are one of the most exciting emerging technologies to impact bioengineering and neuroscience because they enable an alternate communication channel linking the nervous system directly with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that have led to the decoding of information from neural interfaces, this book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Modern Blind Signal Separation Algorithms:Theory and Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With human-computer interactions and hands-free communications becoming overwhelmingly important in the new millennium, recent research efforts have been increasingly focusing on state-of-the-art multi-microphone signal processing solutions to improve speech intelligibility in adverse environments. One such prominent statistical signal processing technique is blind signal separation (BSS). BSS was first introduced in the early 1990s and quickly emerged as an area of intense research activity showing huge potential in numerous applications. BSS comprises the task of 'blindly' recovering a set of unknown signals, the so-called sources, from their observed mixtures, based on very little to almost no prior knowledge about the source characteristics or the mixing structure. The goal of BSS is to process multi-sensory observations of an inaccessible set of signals in a manner that reveals their individual (and original) form, by exploiting the spatial and temporal diversity readily accessible through a multi-microphone configuration. Proceeding blindly exhibits a number of advantages, since assumptions about the room configuration and the source-to-sensor geometry can be relaxed without affecting overall efficiency. This booklet investigates one of the most commercially attractive applications of BSS, which is the simultaneous recovery of signals inside a reverberant (naturally echoing) environment, using two (or more) microphones. In this paradigm, each microphone captures not only the direct contributions from each source, but also several reflected copies of the original signals at different propagation delays. These recordings are referred to as the convolutive mixtures of the original sources. The goal of this booklet in the lecture series is to provide insight on recent advances in algorithms, which are ideally suited for blind signal separation of convolutive speech mixtures. 
More importantly, specific emphasis is given to practical applications of the developed BSS algorithms associated with real-life scenarios. The developed algorithms are put in the context of modern DSP devices, such as hearing aids and cochlear implants, where design requirements dictate low power consumption and call for portability and compact size. Along these lines, this booklet focuses on modern BSS algorithms which address (1) the limited amount of processing power and (2) the small number of microphones available to the end-user. Table of Contents: Fundamentals of blind signal separation / Modern blind signal separation algorithms / Application of blind signal processing strategies to noise reduction for the hearing-impaired / Conclusions and future challenges / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Applications of Affine and Weyl Geometry

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Pseudo-Riemannian geometry is, to a large extent, the study of the Levi-Civita connection, which is the unique torsion-free connection compatible with the metric structure. There are, however, other affine connections which arise in different contexts, such as conformal geometry, contact structures, Weyl structures, and almost Hermitian geometry. In this book, we reverse this point of view and instead associate an auxiliary pseudo-Riemannian structure of neutral signature to certain affine connections and use this correspondence to study both geometries. We examine Walker structures, Riemannian extensions, and Kähler--Weyl geometry from this viewpoint. This book is intended to be accessible to mathematicians who are not expert in the subject and to students with a basic grounding in differential geometry. Consequently, the first chapter contains a comprehensive introduction to the basic results and definitions we shall need---proofs are included of many of these results to make it as self-contained as possible. Para-complex geometry plays an important role throughout the book and consequently is treated carefully in various chapters, as is the representation theory underlying various results. It is a feature of this book that, rather than regarding para-complex geometry as an adjunct to complex geometry, we shall often introduce the para-complex concepts first and only later pass to the complex setting. The second and third chapters are devoted to the study of various kinds of Riemannian extensions that associate to an affine structure on a manifold a corresponding metric of neutral signature on its cotangent bundle. These play a role in various questions involving the spectral geometry of the curvature operator and homogeneous connections on surfaces. The fourth chapter deals with Kähler--Weyl geometry, which lies, in a certain sense, midway between affine geometry and Kähler geometry. 
Another feature of the book is that we have tried wherever possible to find the original references in the subject for possible historical interest. Thus, we have cited the seminal papers of Levi-Civita, Ricci, Schouten, and Weyl, to name but a few exemplars. We have also given proofs of various results that differ from those given in the literature, to take advantage of the unified treatment of the area given herein. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Theory and Rate Distortion Theory for Communications and Compression

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book is very specifically targeted to problems in communications and compression by providing the fundamental principles and results in information theory and rate distortion theory for these applications and presenting methods that have proved and will prove useful in analyzing and designing real systems. The chapters contain treatments of entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that is unique to this concise book. While the coverage of some standard topics is shortened or eliminated, the standard, but important, topics of the chain rules for entropy and mutual information, relative entropy, the data processing inequality, and the Markov chain condition receive a full treatment. Similarly, lossless source coding techniques presented include the Lempel-Ziv-Welch coding method. The material on rate distortion theory and exploring fundamental limits on lossy source coding covers the often-neglected Shannon lower bound and the Shannon backward channel condition, rate distortion theory for sources with memory, and the extremely practical topic of rate distortion functions for composite sources. The target audience for the book consists of graduate students at the master's degree level and practicing engineers. It is hoped that practicing engineers can work through this book and comprehend the key results needed to understand the utility of information theory and rate distortion theory and then utilize the results presented to analyze and perhaps improve the communications and compression systems with which they are familiar. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Combating Bad Weather Part I:Rain Removal from Video

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Current vision systems are designed to perform in normal weather conditions. However, no vision system can escape severe weather. Bad weather reduces scene contrast and visibility, which degrades the performance of various computer vision algorithms such as object tracking, segmentation, and recognition. Thus, current vision systems must include mechanisms that enable them to perform well in bad weather conditions such as rain and fog. Rain causes spatial and temporal intensity variations in images or video frames. These intensity changes are due to the random distribution and high velocities of the raindrops. Fog causes low contrast and whiteness in the image and leads to a shift in the color. This book studies rain and fog from the perspective of vision. The book has two main goals: 1) removal of rain from videos captured by a moving and static camera, 2) removal of fog from images and videos captured by a moving single uncalibrated camera system. The book begins with a literature survey. Pros and cons of the selected prior art algorithms are described, and a general framework for the development of an efficient rain removal algorithm is explored. Temporal and spatiotemporal properties of rain pixels are analyzed, and using these properties, two rain removal algorithms for videos captured by a static camera are developed. For the removal of rain, the temporal and spatiotemporal algorithms require fewer consecutive frames, which reduces buffer size and delay. These algorithms do not assume the shape, size, and velocity of raindrops, which makes them robust to different rain conditions (i.e., heavy rain, light rain, and moderate rain). In a practical situation, there is no ground truth available for rain video. Thus, a no-reference quality metric is very useful in measuring the efficacy of rain removal algorithms. 
Temporal variance and spatiotemporal variance are presented in this book as no-reference quality metrics. An efficient rain removal algorithm using meteorological properties of rain is developed. The relation among the orientation of the raindrops, wind velocity, and terminal velocity is established. This relation is used in the estimation of shape-based features of the raindrop. Meteorological property-based features help to discriminate between rain and non-rain pixels. Most of the prior art algorithms are designed for videos captured by a static camera. The use of global motion compensation with all rain removal algorithms designed for videos captured by a static camera results in better accuracy for videos captured by a moving camera. Qualitative and quantitative results confirm that the probabilistic temporal, spatiotemporal, and meteorological algorithms outperformed other prior art algorithms in terms of perceptual quality, buffer size, execution delay, and system cost. The work presented in this book can find wide application in entertainment industries, transportation, tracking, and consumer electronics. Table of Contents: Acknowledgments / Introduction / Analysis of Rain / Dataset and Performance Metrics / Important Rain Detection Algorithms / Probabilistic Approach for Detection and Removal of Rain / Impact of Camera Motion on Detection of Rain / Meteorological Approach for Detection and Removal of Rain from Videos / Conclusion and Scope of Future Work / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Particle Swarm Optimization:A Physics-Based Approach

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This work aims to provide a new introduction to the particle swarm optimization methods using a formal analogy with physical systems. By postulating that the swarm motion behaves similarly to both classical and quantum particles, we establish a direct connection between what are usually assumed to be separate fields of study, optimization and physics. Within this framework, it becomes quite natural to derive the recently introduced quantum PSO algorithm from the Hamiltonian or the Lagrangian of the dynamical system. The physical theory of the PSO is used to suggest some improvements in the algorithm itself, like temperature acceleration techniques and the periodic boundary condition. At the end, we provide a panorama of applications demonstrating the power of the PSO, classical and quantum, in handling difficult engineering problems. The goal of this work is to provide a general multi-disciplinary view on various topics in physics, mathematics, and engineering by illustrating their interdependence within the unified framework of the swarm dynamics. Table of Contents: Introduction / The Classical Particle Swarm Optimization Method / Boundary Conditions for the PSO Method / The Quantum Particle Swarm Optimization / Bibliography / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Circuits:Signals and Filters

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing with such non-periodic signals. The two other volumes in the Pragmatic Circuits series include titles on DC and Time Domain and Frequency Domain. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and Society:Working Towards Social Justice, Part III: Windows on Society

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Engineers work in an increasingly complex entanglement of ideas, people, cultures, technology, systems and environments. Today, decisions made by engineers often have serious implications for not only their clients but for society as a whole and the natural world. Such decisions may potentially influence cultures, ways of living, as well as alter ecosystems which are in delicate balance. In order to make appropriate decisions and to co-create ideas and innovations within and among the complex networks of communities which currently exist and are shaped by our decisions, we need to regain our place as professionals, to realise the significance of our work and to take responsibility in a much deeper sense. Engineers must develop the 'ability to respond' to emerging needs of all people, across all cultures. To do this requires insights and knowledge which are at present largely within the domain of the social and political sciences but which need to be shared with our students in ways which are meaningful and relevant to engineering. This book attempts to do just that. In Part 1 Baillie introduces ideas associated with the ways in which engineers relate to the communities in which they work. Drawing on scholarship from science and technology studies, globalisation and development studies, as well as work in science communication and dialogue, this introductory text sets the scene for an engineering community which engages with the public. In Part 2 Catalano frames the thinking processes necessary to create ethical and just decisions in engineering, to understand the implications of our current decision making processes and think about ways in which we might adapt these to become more socially just in the future. 
In Part 3 Baillie and Catalano have provided case studies of everyday issues such as water, garbage and alarm clocks, to help us consider how we might see through the lenses of our new knowledge from Parts 1 and 2 and apply this to our everyday existence as engineers. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    From Tool to Partner:The Evolution of Human-Computer Interaction

    Copyright Year: 2017

    Morgan and Claypool eBooks

    <p>This is the first comprehensive history of human-computer interaction (HCI). Whether you are a user experience professional or an academic researcher, whether you identify with computer science, human factors, information systems, information science, design, or communication, you can discover how your experiences fit into the expanding field of HCI. You can determine where to look for relevant information in other fields—and where you won’t find it.</p><p>This book describes the different fields that have participated in improving our digital tools. It is organized chronologically, describing major developments across fields in each period. Computer use has changed radically, but many underlying forces are constant. Technology has changed rapidly, human nature very little. An irresistible force meets an immovable object. The exponential rate of technological change gives us little time to react before technology moves on. Patterns and trajectories described in this book provide your best chance to anticipate what could come next.</p><p>We have reached a turning point. Tools that we built for ourselves to use are increasingly influencing how we use them, in ways that are planned and sometimes unplanned. The book ends with issues worthy of consideration as we explore the new world that we and our digital partners are shaping.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Light Field Sampling

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Light field is one of the most representative image-based rendering techniques that generate novel virtual views from images instead of 3D models. The light field capture and rendering process can be considered as a procedure of sampling the light rays in the space and interpolating those in novel views. As a result, light field can be studied as a high-dimensional signal sampling problem, which has attracted a lot of research interest and become a convergence point between computer graphics and signal processing, and even computer vision. This lecture focuses on answering two questions regarding light field sampling, namely how many images are needed for a light field, and if such a number is limited, where we should capture them. The book can be divided into three parts. First, we give a complete analysis on uniform sampling of IBR data. By introducing the surface plenoptic function, we are able to analyze the Fourier spectrum of non-Lambertian and occluded scenes. Given the spectrum, we also apply the generalized sampling theorem on the IBR data, which results in better rendering quality than rectangular sampling for complex scenes. Such uniform sampling analysis provides general guidelines on how the images in IBR should be taken. For instance, it shows that non-Lambertian and occluded scenes often require a higher sampling rate. Next, we describe a very general sampling framework named freeform sampling. Freeform sampling handles three kinds of problems: sample reduction, minimum sampling rate to meet an error requirement, and minimization of reconstruction error given a fixed number of samples. When the to-be-reconstructed function values are unknown, freeform sampling becomes active sampling. Algorithms of active sampling are developed for light field and show better results than the traditional uniform sampling approach. 
Third, we present a self-reconfigurable camera array that we developed, which features a very efficient algorithm for real-time rendering and the ability to automatically reconfigure the cameras to improve the rendering quality. Both are based on active sampling. Our camera array is able to render dynamic scenes interactively at high quality. To the best of our knowledge, it is the first camera array that can reconfigure the camera positions automatically. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Metasearch Engine Technology

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Among the search tools currently on the Web, search engines are the most well known thanks to the popularity of major search engines such as Google and Yahoo!. While extremely successful, these major search engines do have serious limitations. This book introduces large-scale metasearch engine technology, which has the potential to overcome the limitations of the major search engines. Essentially, a metasearch engine is a search system that supports unified access to multiple existing search engines by passing the queries it receives to its component search engines and aggregating the returned results into a single ranked list. A large-scale metasearch engine has thousands or more component search engines. While metasearch engines were initially motivated by their ability to combine the search coverage of multiple search engines, there are also other benefits such as the potential to obtain better and fresher results and to reach the Deep Web. The following major components of large-scale metasearch engines will be discussed in detail in this book: search engine selection, search engine incorporation, and result merging. Highly scalable and automated solutions for these components are emphasized. The authors make a strong case for the viability of the large-scale metasearch engine technology as a competitive technology for Web search. Table of Contents: Introduction / Metasearch Engine Architecture / Search Engine Selection / Search Engine Incorporation / Result Merging / Summary and Future Research View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Meta-Smith Charts and Their Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book presents the developments and potential applications of Meta-Smith charts, which can be applied to practical and useful transmission line problems (e.g., metamaterial transmission lines and nonreciprocal transmission lines). These problems are beyond the capability of the standard Smith chart to be applied effectively. As any RF engineer is aware, a key property of the Smith chart is the insight it provides, even in very complex design processes. Like the Smith chart, Meta-Smith charts provide a useful way of visualizing transmission line phenomena. They provide useful physical insight, and they can also assist in solving related problems effectively. This book can be used as a companion guide in studying Microwave Engineering for senior undergraduate students as well as for graduate students. It is also recommended for researchers in the RF community, especially those working with periodic transmission line structures and metamaterial transmission lines. Problems are also provided at the end of each chapter for readers to gain a better understanding of the material presented in this book. Table of Contents: Essential Transmission Line Theory / Theory of CCITLs / Theory of BCITLs / Meta-Smith Charts for CCITLs and BCITLs / Applications of Meta-Smith Charts View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Circuits:Frequency Domain

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Pragmatic Circuits: Frequency Domain goes through the Laplace transform to get from the time domain to topics that include the s-plane, Bode diagrams, and the sinusoidal steady state. This second of three volumes ends with a-c power, which, although it is just a special case of the sinusoidal steady state, is an important topic with unique techniques and terminology. Pragmatic Circuits: Frequency Domain is focused on the frequency domain. In other words, time will no longer be the independent variable in our analysis. The two other volumes in the Pragmatic Circuits series include titles on DC and Time Domain and Signals and Filters. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Privacy Risk Analysis

    Copyright Year: 2016

    Morgan and Claypool eBooks

    <p><i>Privacy Risk Analysis</i> fills a gap in the existing literature by providing an introduction to the basic notions, requirements, and main steps of conducting a privacy risk analysis.</p><p>The deployment of new information technologies can lead to significant privacy risks, and a privacy impact assessment should be conducted before designing a product or system that processes personal data. However, while existing privacy impact assessment frameworks and guidelines provide a good deal of detail on organizational aspects (including budget allocation, resource allocation, stakeholder consultation, etc.), they are much vaguer on the technical part, in particular on the actual risk assessment task. For privacy impact assessments to keep up their promises and really play a decisive role in enhancing privacy protection, they should be more precise with regard to these technical aspects.</p><p>This book is an excellent resource for anyone developing and/or currently running a risk analysis, as it defines the notions of personal data, stakeholders, risk sources, feared events, and privacy harms, all while showing how these notions are used in the risk analysis process. It includes a running smart grids example to illustrate all the notions discussed in the book.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Boolean Differential Calculus

    Copyright Year: 2017

    Morgan and Claypool eBooks

    <p>The Boolean Differential Calculus (BDC) is a very powerful theory that extends the basic concepts of Boolean Algebras significantly.</p> <p>Its applications are based on Boolean spaces 𝔹 and 𝔹ⁿ, Boolean operations, and basic structures such as Boolean Algebras and Boolean Rings, Boolean functions, Boolean equations, Boolean inequalities, incompletely specified Boolean functions, and Boolean lattices of Boolean functions. These basics, sometimes also called switching theory, are widely used in many modern information processing applications.</p> <p>The BDC extends the known concepts and allows the consideration of changes of function values. Such changes can be explored for pairs of function values as well as for whole subspaces. The BDC defines a small number of derivative and differential operations. Many existing theorems are very welcome and allow new insights due to possible transformations of problems. The available operations of the BDC have been efficiently implemented in several software packages.</p> <p>The common use of the basic concepts and the BDC opens a very wide field of applications. The roots of the BDC go back to the practical problem of testing digital circuits. The BDC deals with changes of signals which are very important in applications of the analysis and the synthesis of digital circuits. The comprehensive evaluation and utilization of properties of Boolean functions allows one, for instance, to decompose Boolean functions very efficiently; this can be applied not only in circuit design, but also in data mining. Other examples for the use of the BDC are the detection of hazards or cryptography. The knowledge of the BDC gives scientists and engineers an extended insight into Boolean problems leading to new applications, e.g., the use of Boolean lattices of Boolean functions.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Electronics:Book 4 Oscillators and Advanced Electronics Topics

    Copyright Year: 2016

    Morgan and Claypool eBooks

    <p>This book, <i>Oscillators and Advanced Electronics Topics</i>, is the final book of a larger, four-book set, Fundamentals of Electronics. It consists of five chapters that further develop practical electronic applications based on the fundamental principles developed in the first three books. </p><p> This book begins by extending the principles of electronic feedback circuits to linear oscillator circuits. The second chapter explores non-linear oscillation, waveform generation, and waveshaping. The third chapter focuses on providing clean, reliable power for electronic applications where voltage regulation and transient suppression are the focus. Fundamentals of communication circuitry form the basis for the fourth chapter with voltage-controlled oscillators, mixers, and phase-lock loops being the primary focus. The final chapter expands upon early discussions of logic gate operation (introduced in Book 1) to explore gate speed and advanced gate topologies. </p><p> Fundamentals of Electronics has been designed primarily for use in upper division courses in electronics for electrical engineering students and for working professionals. Typically such courses span a full academic year plus an additional semester or quarter. As such, Oscillators and Advanced Electronics Topics and the three companion books of Fundamentals of Electronics form an appropriate body of material for such courses.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semi-Supervised Learning and Domain Adaptation in Natural Language Processing

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book introduces basic supervised learning algorithms applicable to natural language processing (NLP) and shows how the performance of these algorithms can often be improved by exploiting the marginal distribution of large amounts of unlabeled data. One reason for that is data sparsity, i.e., the limited amounts of data we have available in NLP. However, in most real-world NLP applications our labeled data is also heavily biased. This book introduces extensions of supervised learning algorithms to cope with data sparsity and different kinds of sampling bias. This book is intended to be both readable by first-year students and interesting to the expert audience. My intention was to introduce what is necessary to appreciate the major challenges we face in contemporary NLP related to data sparsity and sampling bias, without wasting too much time on details about supervised learning algorithms or particular NLP applications. I use text classification, part-of-speech tagging, and dependency parsing as running examples, and limit myself to a small set of cardinal learning algorithms. I have worried less about theoretical guarantees ("this algorithm never does too badly") than about useful rules of thumb ("in this case this algorithm may perform really well"). In NLP, data is so noisy, biased, and non-stationary that few theoretical guarantees can be established and we are typically left with our gut feelings and a catalogue of crazy ideas. I hope this book will provide its readers with both. Throughout the book we include snippets of Python code and empirical evaluations, when relevant. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Assistive Technology Design for Intelligence Augmentation

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Assistive Technology Design for Intelligence Augmentation presents a series of frameworks, perspectives, and design guidelines drawn from disciplines spanning urban design, artificial intelligence, sociology, and new forms of collaborative work, as well as the author's experience in designing systems for people with cognitive disabilities. Many of the topics explored came from the author's graduate studies at the Center for LifeLong Learning and Design, part of the Department of Computer Science and the Institute of Cognitive Science at the University of Colorado, Boulder. The members of the Center for LifeLong Learning and Design came from a wide range of design perspectives including computer science, molecular biology, journalism, architecture, assistive technology (AT), urban design, sociology, and psychology. The main emphasis of this book is to provide leverage for understanding the problems that the AT designer faces rather than facilitating the design process itself. Looking at the designer's task with these lenses often changes the nature of the problem to be solved. The main body of this book consists of a series of short chapters describing a particular approach, its applicability and relevance to design for intelligence augmentation in complex computationally supported systems, and examples in research and the marketplace. The final part of the book consists of listing source documents for each of the topics and a reading list for further exploration. This book provides an introduction to perspectives and frameworks that are not commonly taught in presentations of AT design, and it may also provide valuable design insights to general human-computer interaction and computer-supported cooperative work researchers and practitioners. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Future of Personal Information Management, Part 1: Our Information, Always and Forever

    Copyright Year: 2012

    Morgan and Claypool eBooks

    We are well into a second age of digital information. Our information is moving from the desktop to the laptop to the "palmtop" and up into an amorphous cloud on the Web. How can one manage both the challenges and opportunities of this new world of digital information? What does the future hold? This book provides an important update on the rapidly expanding field of personal information management (PIM). Part I (Always and Forever) introduces the essentials of PIM. Information is personal for many reasons. It's the information on our hard drives we couldn't bear to lose. It's the information about us that we don't want to share. It's the distracting information demanding our attention even as we try to do something else. It's the information we don't know about but need to. Through PIM, we control personal information. We integrate information into our lives in useful ways. We make it "ours." With basics established, Part I proceeds to explore a critical interplay between personal information "always" at hand through mobile devices and "forever" on the Web. How does information stay "ours" in such a world? Part II (Building Places of Our Own for Digital Information) will be available in the Summer of 2012, and will consist of the following chapters: Chapter 5. Technologies to eliminate PIM?: We have seen astonishing advances in the technologies of information management -- in particular, to aid in the storing, structuring and searching of information. These technologies will certainly change the way we do PIM; will they eliminate the need for PIM altogether? Chapter 6. GIM and the social fabric of PIM: We don't (and shouldn't) manage our information in isolation. Group information management (GIM) -- especially the kind practiced more informally in households and smaller project teams -- goes hand in glove with good PIM. Chapter 7. PIM by design: Methodologies, principles, questions and considerations as we seek to understand PIM better and to build PIM into our tools, techniques and training. Chapter 8. To each of us, our own: Just as we must each be a student of our own practice of PIM, we must also be a designer of this practice. This concluding chapter looks at tips, traps and tradeoffs as we work to build a practice of PIM and "places" of our own for personal information. Table of Contents: A New Age of Information / The Basics of PIM / Our Information, Always at Hand / Our Information, Forever on the Web View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis of Oriented Texture: With Application to the Detection of Architectural Distortion in Mammograms

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over the entire region of analysis in the image is called the orientation field. High-level analysis relates to the discovery of patterns in the orientation field, usually by associating the structure perceived in the orientation field with a geometrical model. This book presents an analysis of several important methods for the detection of oriented features in images, and a discussion of the phase portrait method for high-level analysis of orientation fields. In order to illustrate the concepts developed throughout the book, an application is presented of the phase portrait method to computer-aided detection of architectural distortion in mammograms. Table of Contents: Detection of Oriented Features in Images / Analysis of Oriented Patterns Using Phase Portraits / Optimization Techniques / Detection of Sites of Architectural Distortion in Mammograms View full abstract»
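The low-level step described above, estimating an orientation field of local magnitudes and angles, can be sketched with a basic gradient-based approach. This is only the simplest of the detection methods the book analyzes, and the stripe image below is an assumption for illustration.

```python
import numpy as np

# Gradient-based orientation field sketch: the local feature orientation
# is taken perpendicular to the intensity gradient. (Illustrative; the
# book covers several, more robust detection methods.)
def orientation_field(img):
    gy, gx = np.gradient(img.astype(float))   # gradients along rows, cols
    magnitude = np.hypot(gx, gy)              # local orientation strength
    orientation = (np.arctan2(gy, gx) + np.pi / 2) % np.pi  # angles in [0, pi)
    return magnitude, orientation

# Vertical stripes: gradients are horizontal, so the estimated
# orientation field is vertical (pi/2) wherever it is defined.
img = np.tile([0.0, 1.0, 0.0, 1.0], (4, 1))
mag, ori = orientation_field(img)
```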

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Kalman Filtering with MATLAB Examples

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The Kalman filter is the Bayesian optimum solution to the problem of sequentially estimating the states of a dynamical system in which the state evolution and measurement processes are both linear and Gaussian. Given the ubiquity of such systems, the Kalman filter finds use in a variety of applications, e.g., target tracking, guidance and navigation, and communications systems. The purpose of this book is to present a brief introduction to Kalman filtering. The theoretical framework of the Kalman filter is first presented, followed by examples showing its use in practical applications. Extensions of the method to nonlinear problems and distributed applications are discussed. A software implementation of the algorithm in the MATLAB programming language is provided, as well as MATLAB code for several example applications discussed in the manuscript. View full abstract»
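To make the sequential-estimation idea concrete, here is a minimal one-dimensional Kalman filter sketch in Python (the book's examples are in MATLAB; the constant-state model and the noise variances below are assumptions for illustration, not taken from the book).

```python
import numpy as np

# 1D Kalman filter for a nearly constant state observed in Gaussian noise:
#   x_k = x_{k-1} + w_k (process variance q),  z_k = x_k + v_k (variance r).
# Illustrative parameters; not from the book.
def kalman_1d(measurements, q=1e-4, r=0.01, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update: blend prediction and measurement
        p = (1 - k) * p
        estimates.append(x)
    return estimates

rng = np.random.default_rng(0)
truth = 1.25
zs = truth + rng.normal(0.0, 0.1, size=200)   # noisy measurements
est = kalman_1d(zs)                           # converges toward truth
```

The gain `k` weights the new measurement against the running estimate exactly as the Bayesian optimum requires for this linear-Gaussian model, which is the property the abstract highlights.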

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Atmel AVR Microcontroller Primer: Programming and Interfacing

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers a primer on the Atmel AVR microcontroller. Our approach is to provide the fundamental skills to quickly get up and operating with this internationally popular microcontroller. The Atmel ATmega16 is used as a representative sample of the AVR line. The knowledge you gain on the ATmega16 can be easily translated to every other microcontroller in the AVR line. We cover the main subsystems aboard the ATmega16, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying hardware and software to exercise the subsystem. In all examples, we use the C programming language. We conclude with a detailed chapter describing how to interface the microcontroller to a wide variety of input and output devices. Table of Contents: Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog-to-Digital Conversion / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / ATmega16 Register Set / ATmega16 Header File View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Merging Languages and Engineering: Partnering Across the Disciplines

    Copyright Year: 2013

    Morgan and Claypool eBooks

    At the University of Rhode Island over 25% of engineering undergraduates simultaneously complete a second degree in German, French, Spanish, or Chinese. They furthermore spend an entire year abroad, one semester as exchange students at a partner university and six months as professional engineering interns at a cooperating company. With a close-to 100% placement rate, over 400 graduates, and numerous national awards, the URI International Engineering Program (IEP) is a proven path of preparation for young engineers in today's global workplace. The author of this volume, John Grandin, is an emeritus professor of German who developed and led the IEP for twenty-three years. In these pages, he provides a two-pronged approach to explain the origin and history of this program rooted in such an unusual merger of two traditionally distinct higher education disciplines. He looks first at himself to explain how and why he became an international educator and what led him to his lasting passion for the IEP. He then provides an historical overview of the program's origin and growth, including looks at the bumps and bruises and ups and downs along the way. Grandin hopes that this story will be of use and value to other educators determined to reform higher education and align it with the needs of the 21st Century. Table of Contents: How I became a Professor of German / My Unexpected Path to Engineering / Building a Network of Support / Sidetracked by a Stint in the Dean's Office / Reshaping the Language Mission / Struggling to Institutionalize / Partnering with Universities Abroad / Going into the Hotel and Restaurant Business / Taking the Lead Nationally / Building the Chinese IEP / Staying Involved after Retirement / The Broader Message for Higher Education / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Search-Based Applications: At the Confluence of Search and Database Technologies

    Copyright Year: 2010

    Morgan and Claypool eBooks

    We are poised at a major turning point in the history of information management via computers. Recent evolutions in computing, communications, and commerce are fundamentally reshaping the ways in which we humans interact with information, and generating enormous volumes of electronic data along the way. As a result of these forces, what will data management technologies, and their supporting software and system architectures, look like in ten years? It is difficult to say, but we can see the future taking shape now in a new generation of information access platforms that combine strategies and structures of two familiar -- and previously quite distinct -- technologies, search engines and databases, and in a new model for software applications, the Search-Based Application (SBA), which offers a pragmatic way to solve both well-known and emerging information management challenges as of now. Search engines are the world's most familiar and widely deployed information access tool, used by hundreds of millions of people every day to locate information on the Web, but few are aware they can now also be used to provide precise, multidimensional information access and analysis that is hard to distinguish from current database applications, yet endowed with the usability and massive scalability of Web search. In this book, we hope to introduce Search-Based Applications to a wider audience, using real case studies to show how this flexible technology can be used to intelligently aggregate large volumes of unstructured data (like Web pages) and structured data (like database content), and to make that data available in a highly contextual, quasi real-time manner to a wide base of users for a varied range of purposes. We also hope to shed light on the general convergences underway in search and database disciplines, convergences that make SBAs possible, and which serve as harbingers of information management paradigms and technologies to come. Table of Contents: Search-Based Applications / Evolving Business Information Access Needs / Origins and Histories / Data Models and Storage / Data Collection/Population / Data Processing / Data Retrieval / Data Security, Usability, Performance, Cost / Summary Evolutions and Convergences / SBA Platforms / SBA Uses and Preconditions / Anatomy of a Search-Based Application / Case Study: GEFCO / Case Study: Urbanizer / Case Study: National Postal Agency / Future Directions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Energy-Efficient Scheduling under Delay Constraints for Wireless Networks

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Packet delay and energy consumption are important considerations in wireless and sensor networks as these metrics directly affect the quality of service of the application and the resource consumption of the network; especially for a rapidly growing class of real-time applications that impose strict restrictions on packet delays. Dynamic rate control is a novel technique for adapting the transmission rate of wireless devices, almost in real-time, to opportunistically exploit time-varying channel conditions as well as changing traffic patterns. Since power consumption is not a linear function of the rate and varies significantly with the channel conditions, adapting the rate has significant benefits in minimizing energy consumption. These benefits have prompted significant research in developing algorithms for achieving optimal rate adaptation while satisfying quality of service requirements. In this book, we provide a comprehensive study of dynamic rate control for energy minimization under packet delay constraints. We present several formulations and approaches adopted in the literature ranging from discrete-time formulations and dynamic programming based solutions to continuous-time approaches utilizing ideas from network calculus and stochastic optimal control theory. The goal of this book is to expose the reader to the important problem of wireless data transmission with delay constraints and to the rich set of tools developed in recent years to address it. Table of Contents: Introduction / Transmission Rate Adaptation under Deadline Constraints / Average Delay Constraints View full abstract»
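The key observation above, that power is a convex, nonlinear function of rate, so spreading a transmission over the available time saves energy, can be checked with a tiny numeric sketch. The Shannon-like power curve P(r) = 2^r - 1 is a common modeling assumption, not a formula taken from the book.

```python
# Energy of a rate schedule under a convex power-rate curve P(r) = 2**r - 1
# (normalized Shannon-like model; an illustrative assumption).
def energy(rates):
    return sum(2 ** r - 1 for r in rates)

B, T = 8, 4                      # 8 bits due within a 4-slot deadline
e_const = energy([B / T] * T)    # constant rate: 4 * (2**2 - 1) = 12
e_burst = energy([B, 0, 0, 0])   # bursty: 2**8 - 1 = 255
```

By Jensen's inequality the constant-rate schedule minimizes energy among all schedules meeting the deadline, which is the intuition behind the rate-adaptation formulations the book surveys.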

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Biosystems Approach to Industrial Patient Monitoring and Diagnostic Devices

    Copyright Year: 2008

    Morgan and Claypool eBooks

    A medical device is an apparatus that uses engineering and scientific principles to interface to physiology and diagnose or treat a disease. In this Lecture, we specifically consider those medical devices that are computer based, and are therefore referred to as medical instruments. Further, the medical instruments we discuss are those that incorporate system theory into their designs. We divide these types of instruments into those that provide continuous observation and those that provide a single snapshot of health information. These instruments are termed patient monitoring devices and diagnostic devices, respectively. Within this Lecture, we highlight some of the common system theory techniques that are part of the toolkit of medical device engineers in industry. These techniques include the pseudorandom binary sequence, adaptive filtering, wavelet transforms, the autoregressive moving average model with exogenous input, artificial neural networks, fuzzy models, and fuzzy control. Because the clinical usage requirements for patient monitoring and diagnostic devices are so high, system theory is the preferred substitute for heuristic, empirical processing during noise artifact minimization and classification. Table of Contents: Preface / Medical Devices / System Theory / Patient Monitoring Devices / Diagnostic Devices / Conclusion / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Electrical Engineering: Fundamentals

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Pragmatic Electrical Engineering: Fundamentals introduces the fundamentals of the energy-delivery part of electrical systems. It begins with a study of basic electrical circuits and then focuses on electrical power. Three-phase power systems, transformers, induction motors, and magnetics are the major topics. All of the material in the text is illustrated with completely-worked examples to guide the student to a better understanding of the topics. This short lecture book will be of use at any level of engineering, not just electrical. Its goal is to provide the practicing engineer with a practical, applied look at the energy side of electrical systems. The author's "pragmatic" and applied style gives a unique and helpful "non-idealistic, practical, opinionated" introduction to the topic. Table of Contents: Basic Stuff / Power of the Sine / Three-Phase Power Systems / Transformers / Machines / Electromagnetics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Neural Interfacing: Forging the Human-Machine Connection

    Copyright Year: 2008

    Morgan and Claypool eBooks

    In the past 50 years there has been an explosion of interest in the development of technologies whose end goal is to connect the human brain and/or nervous system directly to computers. Once the subject of science fiction, the technologies necessary to accomplish this goal are rapidly becoming reality. In laboratories around the globe, research is being undertaken to restore function to the physically disabled, to replace areas of the brain damaged by disease or trauma and to augment human abilities. Building neural interfaces and neuro-prosthetics relies on a diverse array of disciplines such as neuroscience, engineering, medicine and microfabrication, just to name a few. This book presents a short history of neural interfacing (N.I.) research and introduces the reader to some of the current efforts to develop neural prostheses. The book is intended as an introduction for the college freshman or others wishing to learn more about the field. A resource guide is included for students along with a list of laboratories conducting N.I. research and universities with N.I. related tracks of study. Table of Contents: Neural Interfaces Past and Present / Current Neuroprosthesis Research / Conclusion / Resources for Students View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Compound Semiconductor Materials and Devices

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Ever since its invention in the 1980s, the compound semiconductor heterojunction-based high electron mobility transistor (HEMT) has been widely used in radio frequency (RF) applications. This book provides readers with broad coverage on techniques and new trends of HEMT, employing leading compound semiconductors, III-N and III-V materials. The content includes an overview of GaN HEMT device-scaling technologies and experimental research breakthroughs in fabricating various GaN MOSHEMT transistors. Readers are offered an inspiring example of monolithic integration of HEMT with LEDs, too. The authors compile the most relevant aspects of III-V HEMT, including the current status of state-of-art HEMTs, their possibility of replacing the Si CMOS transistor channel, and growth opportunities of III-V materials on an Si substrate. With detailed exploration and explanations, the book is a helpful source suitable for anyone learning about and working on compound semiconductor devices. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Wang Tiles in Computer Graphics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Many complex signals in computer graphics, such as point distributions and textures, cannot be efficiently synthesized and stored. This book presents tile-based methods based on Wang tiles and corner tiles to solve both these problems. Instead of synthesizing a complex signal when needed, the signal is synthesized beforehand over a small set of Wang tiles or corner tiles. Arbitrarily large amounts of that signal can then efficiently be generated when needed by generating a stochastic tiling, and storing only a small set of tiles reduces storage requirements. A tile-based method for generating a complex signal consists of a method for synthesizing the signal over a set of Wang tiles or corner tiles, and a method for generating a stochastic tiling using the set of tiles. The method for generating a stochastic tiling using the set of tiles is independent of the signal. This book covers scanline stochastic tiling algorithms and direct stochastic tiling algorithms for Wang tiles and corner tiles. The method for synthesizing the signal over a set of tiles is dependent on the signal. This book covers tile-based methods for texture synthesis and for generating Poisson disk distributions. This book also explores several applications such as tile-based texture mapping and procedural modeling and texturing. Although the methods for constructing a complex signal over a set of Wang tiles or corner tiles are dependent on the signal, the general idea behind these methods generalizes to other kinds of signals. The methods presented in this book therefore have the potential to make the generation and storage of almost any complex signal efficient. Table of Contents: Introduction / Wang Tiles and Corner Tiles / Tiling Algorithms for Wang Tiles and Corner Tiles / Tile-Based Methods for Texture Synthesis / Tile-Based Methods for Generating Poisson Disk Distributions / Applications of Poisson Disk Distributions View full abstract»
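The scanline stochastic tiling idea mentioned above can be sketched in a few lines: place tiles row by row, choosing each one uniformly among the tiles whose west and north edge colors match the already-placed neighbors. The complete two-color tile set below is a hypothetical example, and the book's algorithms are considerably more refined.

```python
import random

# Scanline stochastic tiling sketch: a Wang tile is a tuple of
# (north, east, south, west) edge colors. (Illustrative simplification.)
def stochastic_tiling(tiles, rows, cols, seed=0):
    rng = random.Random(seed)
    grid = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # A candidate must match the south edge of the tile above
            # and the east edge of the tile to the left.
            cands = [t for t in tiles
                     if (r == 0 or t[0] == grid[r - 1][c][2])
                     and (c == 0 or t[3] == grid[r][c - 1][1])]
            grid[r][c] = rng.choice(cands)
    return grid

# Complete tile set over two edge colors, so a valid candidate always
# exists at every step of the scanline.
tiles = [(n, e, s, w) for n in (0, 1) for e in (0, 1)
         for s in (0, 1) for w in (0, 1)]
tiling = stochastic_tiling(tiles, 8, 8)
```

Because only the small tile set is stored, arbitrarily large tilings (and hence arbitrarily large amounts of the signal synthesized over the tiles) can be generated on demand.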

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics provides a comprehensive tutorial of the most widely used method for solving Maxwell's equations -- the Finite-Difference Time-Domain Method. This book is an essential guide for students, researchers, and professional engineers who want to gain a fundamental knowledge of the FDTD method. It can accompany an undergraduate or entry-level graduate course or be used for self-study. The book provides all the background required to either research or apply the FDTD method for the solution of Maxwell's equations to practical problems in engineering and science. Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics guides the reader through the foundational theory of the FDTD method starting with the one-dimensional transmission-line problem and then progressing to the solution of Maxwell's equations in three dimensions. It also provides step-by-step guides to modeling physical sources, lumped-circuit components, absorbing boundary conditions, perfectly matched layer absorbers, and sub-cell structures. Post-processing methods such as network parameter extraction and far-field transformations are also detailed. Efficient implementations of the FDTD method in a high-level language are also provided. Table of Contents: Introduction / 1D FDTD Modeling of the Transmission Line Equations / Yee Algorithm for Maxwell's Equations / Source Excitations / Absorbing Boundary Conditions / The Perfectly Matched Layer (PML) Absorbing Medium / Subcell Modeling / Post Processing View full abstract»
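The one-dimensional starting point the book describes can be sketched as a bare-bones leapfrog update loop on a staggered (Yee) grid. The normalized units, unit Courant number, and hard Gaussian source below are assumptions for illustration; the book develops the scheme rigorously and extends it to three dimensions.

```python
import numpy as np

# Minimal 1D FDTD sketch in normalized units with Courant number 1.
# (Illustrative only; not code from the book.)
nz, nt = 200, 150
ez = np.zeros(nz)       # electric field at integer grid points
hy = np.zeros(nz - 1)   # magnetic field, staggered half a cell
for t in range(nt):
    # Faraday's law: update H from the spatial difference of E.
    hy += np.diff(ez)
    # Ampere's law: update E from the spatial difference of H.
    ez[1:-1] += np.diff(hy)
    # Additive hard source: a Gaussian pulse injected at one node.
    ez[50] += np.exp(-((t - 30) / 10.0) ** 2)
```

The endpoints of `ez` are never updated, which acts as a simple perfectly conducting boundary; the absorbing boundary conditions and PML treated in the book replace exactly this part of the loop.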

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quantum Computing for Computer Architects, Second Edition

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore the systems-level challenges in achieving scalable, fault-tolerant quantum computation. In this lecture, we provide an engineering-oriented introduction to quantum computation with an overview of the theory behind key quantum algorithms. Next, we look at architectural case studies based upon experimental data and future projections for quantum computation implemented using trapped ions. While we focus here on architectures targeted for realization using trapped ions, the techniques for quantum computer architecture design, quantum fault-tolerance, and compilation described in this lecture are applicable to many other physical technologies that may be viable candidates for building a large-scale quantum computing system. We also discuss general issues involved with programming a quantum computer as well as a discussion of work on quantum architectures based on quantum teleportation. Finally, we consider some of the open issues remaining in the design of quantum computers.
Table of Contents: Introduction / Basic Elements for Quantum Computation / Key Quantum Algorithms / Building Reliable and Scalable Quantum Architectures / Simulation of Quantum Computation / Architectural Elements / Case Study: The Quantum Logic Array Architecture / Programming the Quantum Architecture / Using the QLA for Quantum Simulation: The Transverse Ising Model / Teleportation-Based Quantum Architecture / Concluding Remarks View full abstract»
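As minimal background for the "Basic Elements" material listed above: a quantum gate is a unitary matrix acting on a state vector of amplitudes. The snippet below is a generic textbook illustration (not code from the lecture), applying a Hadamard gate to |0⟩ to produce an equal superposition.

```python
import numpy as np

# Hadamard gate applied to |0>: the state becomes an equal superposition,
# so measurement yields 0 or 1 with probability 1/2 each (Born rule).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])
state = H @ ket0                 # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(state) ** 2       # measurement probabilities
```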

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Paradigm Shift to Multimodality in Contemporary Computer Interfaces

    Copyright Year: 2015

    Morgan and Claypool eBooks

    During the last decade, cell phones with multimodal interfaces based on combined new media have become the dominant computer interface worldwide. Multimodal interfaces support mobility and expand the expressive power of human input to computers. They have shifted the fulcrum of human-computer interaction much closer to the human. This book explains the foundation of human-centered multimodal interaction and interface design, based on the cognitive and neurosciences, as well as the major benefits of multimodal interfaces for human cognition and performance. It describes the data-intensive methodologies used to envision, prototype, and evaluate new multimodal interfaces. From a system development viewpoint, this book outlines major approaches for multimodal signal processing, fusion, architectures, and techniques for robustly interpreting users' meaning. Multimodal interfaces have been commercialized extensively for field and mobile applications during the last decade. Research also is growing rapidly in areas like multimodal data analytics, affect recognition, accessible interfaces, embedded and robotic interfaces, machine learning and new hybrid processing approaches, and similar topics. The expansion of multimodal interfaces is part of the long-term evolution of more expressively powerful input to computers, a trend that will substantially improve support for human cognition and performance. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Query Processing over Uncertain Databases

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Due to measurement errors, transmission loss, or injected noise for privacy protection, uncertainty exists in the data of many real applications. However, query processing techniques for deterministic data cannot be directly applied to uncertain data because they do not have mechanisms to handle data uncertainty. Therefore, efficient and effective manipulation of uncertain data is a practical yet challenging research topic. In this book, we start from the data models for imprecise and uncertain data, move on to defining different semantics for queries on uncertain data, and finally discuss the advanced query processing techniques for various probabilistic queries in uncertain databases. The book serves as a comprehensive guideline for query processing over uncertain databases. Table of Contents: Introduction / Uncertain Data Models / Spatial Query Semantics over Uncertain Data Models / Spatial Query Processing over Uncertain Databases / Conclusion View full abstract»
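One common probabilistic query semantics, a threshold range query over a discrete-alternatives model of uncertain objects, can be sketched as follows. The model, threshold, and data here are illustrative simplifications; the book defines the models and query semantics precisely.

```python
# Probabilistic threshold range query sketch: each uncertain object is a
# list of (value, probability) alternatives; report objects whose total
# probability of lying in [lo, hi] reaches the threshold tau.
# (Illustrative model and data, not the book's definitions.)
def prob_range_query(objects, lo, hi, tau):
    hits = []
    for name, alts in objects.items():
        p = sum(pr for v, pr in alts if lo <= v <= hi)
        if p >= tau:
            hits.append((name, p))
    return hits

objs = {
    "A": [(1.0, 0.5), (2.0, 0.5)],   # certainly within [0, 3]
    "B": [(1.5, 0.2), (9.0, 0.8)],   # only 20% within [0, 3]
}
hits = prob_range_query(objs, 0.0, 3.0, tau=0.5)
```

Note how the answer is a set of (object, probability) pairs rather than a plain set of objects, which is exactly why deterministic query processing does not carry over unchanged.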

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mechanical Testing for the Biomechanics Engineer: A Practical Guide

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Mechanical testing is a useful tool in the field of biomechanics. Classic biomechanics employs mechanical testing for a variety of purposes. For instance, testing may be used to determine the mechanical properties of bone under a variety of loading modes and various conditions including age and disease state. In addition, testing may be used to assess fracture fixation procedures to justify clinical approaches. Mechanical testing may also be used to test implants and biomaterials to determine mechanical strength and appropriateness for clinical purposes. While the information from a mechanical test will vary, there are basics that need to be understood to properly conduct mechanical testing. This book will attempt to provide the reader not only with the basic theory of conducting mechanical testing, but will also focus on providing practical insights and examples. Table of Contents: Preface / Fundamentals / Accuracy and Measurement Tools / Design / Testing Machine Design and Fabrication / Fixture Design and Applications / Additional Considerations in a Biomechanics Test / Laboratory Examples and Additional Equations / Appendices: Practical Orthopedic Biomechanics Problems / Bibliography / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Integral Equation Methods for Electromagnetic and Elastic Waves

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Integral Equation Methods for Electromagnetic and Elastic Waves is an outgrowth of several years of work. There have been no recent books on integral equation methods. There are books written on integral equations, but either they have been around for a while, or they were written by mathematicians. Much of the knowledge in integral equation methods still resides in journal papers. With this book, important relevant knowledge for integral equations is consolidated in one place and researchers need only read the pertinent chapters in this book to gain important knowledge needed for integral equation research. Also, learning the fundamentals of linear elastic wave theory does not require a quantum leap for electromagnetic practitioners. Integral equation methods have been around for several decades, and their introduction to electromagnetics has been due to the seminal works of Richmond and Harrington in the 1960s. There was a surge in the interest in this topic in the 1980s (notably the work of Wilton and his coworkers) due to increased computing power. The interest in this area was on the wane when it was demonstrated that differential equation methods, with their sparse matrices, can solve many problems more efficiently than integral equation methods. Recently, due to the advent of fast algorithms, there has been a revival in integral equation methods in electromagnetics. Much of our work in recent years has been in fast algorithms for integral equations, which prompted our interest in integral equation methods. While previously, only tens of thousands of unknowns could be solved by integral equation methods, now, tens of millions of unknowns can be solved with fast algorithms. This has prompted new enthusiasm in integral equation methods. 
Table of Contents: Introduction to Computational Electromagnetics / Linear Vector Space, Reciprocity, and Energy Conservation / Introduction to Integral Equations / Integral Equations for Penetrable Objects / Low-Frequency Problems in Integral Equations / Dyadic Green's Function for Layered Media and Integral Equations / Fast Inhomogeneous Plane Wave Algorithm for Layered Media / Electromagnetic Wave versus Elastic Wave / Glossary of Acronyms View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sentiment Analysis and Opinion Mining

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Sentiment analysis and opinion mining is the field of study that analyzes people's opinions, sentiments, evaluations, attitudes, and emotions from written language. It is one of the most active research areas in natural language processing and is also widely studied in data mining, Web mining, and text mining. In fact, this research has spread outside of computer science to the management sciences and social sciences due to its importance to business and society as a whole. The growing importance of sentiment analysis coincides with the growth of social media such as reviews, forum discussions, blogs, micro-blogs, Twitter, and social networks. For the first time in human history, we now have a huge volume of opinionated data recorded in digital form for analysis. Sentiment analysis systems are being applied in almost every business and social domain because opinions are central to almost all human activities and are key influencers of our behaviors. Our beliefs and perceptions of reality, and the choices we make, are largely conditioned on how others see and evaluate the world. For this reason, when we need to make a decision we often seek out the opinions of others. This is true not only for individuals but also for organizations. This book is a comprehensive introductory and survey text. It covers all important topics and the latest developments in the field with over 400 references. It is suitable for students, researchers and practitioners who are interested in social media analysis in general and sentiment analysis in particular. Lecturers can readily use it in class for courses on natural language processing, social media analysis, text mining, and data mining. Lecture slides are also available online. 
Table of Contents: Preface / Sentiment Analysis: A Fascinating Problem / The Problem of Sentiment Analysis / Document Sentiment Classification / Sentence Subjectivity and Sentiment Classification / Aspect-Based Sentiment Analysis / Sentiment Lexicon Generation / Opinion Summarization / Analysis of Comparative Opinions / Opinion Search and Retrieval / Opinion Spam Detection / Quality of Reviews / Concluding Remarks / Bibliography / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Landmarking and Segmentation of 3D CT Images

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Segmentation and landmarking of computed tomographic (CT) images of pediatric patients are important and useful in computer-aided diagnosis (CAD), treatment planning, and objective analysis of normal as well as pathological regions. Identification and segmentation of organs and tissues in the presence of tumors are difficult. Automatic segmentation of the primary tumor mass in neuroblastoma could facilitate reproducible and objective analysis of the tumor's tissue composition, shape, and size. However, due to the heterogeneous tissue composition of the neuroblastic tumor, ranging from low-attenuation necrosis to high-attenuation calcification, segmentation of the tumor mass is a challenging problem. In this context, methods are described in this book for identification and segmentation of several abdominal and thoracic landmarks to assist in the segmentation of neuroblastic tumors in pediatric CT images. Methods to identify and segment automatically the peripheral artifacts and tissues, the rib structure, the vertebral column, the spinal canal, the diaphragm, and the pelvic surface are described. Techniques are also presented to evaluate quantitatively the results of segmentation of the vertebral column, the spinal canal, the diaphragm, and the pelvic girdle by comparing with the results of independent manual segmentation performed by a radiologist. The use of the landmarks and removal of several tissues and organs are shown to assist in limiting the scope of the tumor segmentation process to the abdomen, to lead to the reduction of the false-positive error, and to improve the result of segmentation of neuroblastic tumors. Table of Contents: Introduction to Medical Image Analysis / Image Segmentation / Experimental Design and Database / Ribs, Vertebral Column, and Spinal Canal / Delineation of the Diaphragm / Delineation of the Pelvic Girdle / Application of Landmarking / Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Resource-Oriented Architecture Patterns for Webs of Data

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The surge of interest in the REpresentational State Transfer (REST) architectural style, the Semantic Web, and Linked Data has resulted in the development of innovative, flexible, and powerful systems that embrace one or more of these compatible technologies. However, most developers, architects, Information Technology managers, and platform owners have only been exposed to the basics of resource-oriented architectures. This book is an attempt to catalog and elucidate several reusable solutions that have been seen in the wild in the now increasingly familiar "patterns book" style. These are not turnkey implementations, but rather, useful strategies for solving certain problems in the development of modern, resource-oriented systems, both on the public Web and within an organization's firewalls. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Nonlinear Source Separation

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The purpose of this lecture book is to present the state of the art in nonlinear blind source separation, in a form appropriate for students, researchers and developers. Source separation deals with the problem of recovering sources that are observed in a mixed condition. When we have little knowledge about the sources and about the mixture process, we speak of blind source separation. Linear blind source separation is a relatively well studied subject; nonlinear blind source separation is still in a less advanced stage, but has seen several significant developments in the last few years. This publication reviews the main nonlinear separation methods, including the separation of post-nonlinear mixtures, and the MISEP, ensemble learning and kTDSEP methods for generic mixtures. These methods are studied with a significant depth. A historical overview is also presented, mentioning most of the relevant results on nonlinear blind source separation that have been presented over the years. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Control System Synthesis: A Factorization Approach, Part II

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book introduces the so-called "stable factorization approach" to the synthesis of feedback controllers for linear control systems. The key to this approach is to view the multi-input, multi-output (MIMO) plant for which one wishes to design a controller as a matrix over the fraction field F associated with a commutative ring with identity, denoted by R, which also has no divisors of zero. In this setting, the set of single-input, single-output (SISO) stable control systems is precisely the ring R, while the set of stable MIMO control systems is the set of matrices whose elements all belong to R. The set of unstable, meaning not necessarily stable, control systems is then taken to be the field of fractions F associated with R in the SISO case, and the set of matrices with elements in F in the MIMO case. The central notion introduced in the book is that, in most situations of practical interest, every matrix P whose elements belong to F can be "factored" as a "ratio" of two matrices N, D whose elements belong to R, in such a way that N, D are coprime. In the familiar case where the ring R corresponds to the set of bounded-input, bounded-output (BIBO)-stable rational transfer functions, coprimeness is equivalent to two functions not having any common zeros in the closed right half-plane including infinity. However, the notion of coprimeness extends readily to discrete-time systems, distributed-parameter systems in both the continuous- as well as discrete-time domains, and to multi-dimensional systems. Thus the stable factorization approach enables one to capture all these situations within a common framework. The key result in the stable factorization approach is the parametrization of all controllers that stabilize a given plant. It is shown that the set of all stabilizing controllers can be parametrized by a single parameter R, whose elements all belong to R. 
Moreover, every transfer matrix in the closed-loop system is an affine function of the design parameter R. Thus problems of reliable stabilization, disturbance rejection, robust stabilization, etc. can all be formulated in terms of choosing an appropriate R. This is a reprint of the book Control System Synthesis: A Factorization Approach originally published by M.I.T. Press in 1985. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing and Evaluating Usable Technology in Industrial Research: Three Case Studies

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about HCI research in an industrial research setting. It is based on the experiences of two researchers at the IBM T. J. Watson Research Center. Over the last two decades, Drs. John and Clare-Marie Karat have conducted HCI research to create innovative usable technology for users across a variety of domains. We begin the book by introducing the reader to the context of industrial research as well as a set of common themes or guidelines to consider in conducting HCI research in practice. Then case study examples of HCI approaches to the design and evaluation of usable solutions for people are presented and discussed in three domain areas: conversational speech technologies, personalization in eCommerce, and security and privacy policy management technologies. In each of the case studies, the authors illustrate and discuss examples of HCI approaches to design and evaluation that worked well and those that did not. They discuss what was learned over time about different HCI methods in practice, and changes that were made to the HCI tools used over time. The Karats discuss trade-offs and issues related to time, resources, and money and the value derived from different HCI methods in practice. These decisions are ones that need to be made regularly in the industrial sector. Similarities and differences with the types of decisions made in this regard in academia will be discussed. The authors then use the context of the three case studies in the three research domains to draw insights and conclusions about the themes that were introduced in the beginning of the book. The Karats conclude with their perspective about the future of HCI industrial research. 
Table of Contents: Introduction: Themes and Structure of the Book / Case Study 1: Conversational Speech Technologies: Automatic Speech Recognition (ASR) / Case Study 2: Personalization in eCommerce / Case Study 3: Security and Privacy Policy Management Technologies / Insights and Conclusions / The Future of Industrial HCI Research View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Incentive-Centric Semantic Web Application Engineering

    Copyright Year: 2013

    Morgan and Claypool eBooks

    While many Web 2.0-inspired approaches to semantic content authoring do acknowledge motivation and incentives as the main drivers of user involvement, the amount of useful human contributions actually available will always remain a scarce resource. Complementarily, there are aspects of semantic content authoring in which automatic techniques have proven to perform reliably, and the added value of human (and collective) intelligence is often a question of cost and timing. The challenge that this book attempts to tackle is how these two approaches (machine- and human-driven computation) could be combined in order to improve the cost-performance ratio of creating, managing, and meaningfully using semantic content. To do so, we need to first understand how theories and practices from social sciences and economics about user behavior and incentives could be applied to semantic content authoring. We will introduce a methodology to help software designers embed incentives-minded functionalities into semantic applications, as well as best practices and guidelines. We will present several examples of such applications, addressing tasks such as ontology management, media annotation, and information extraction, which have been built with these considerations in mind. These examples illustrate key design issues of incentivized Semantic Web applications that might have a significant effect on the success and sustainable development of the applications: the suitability of the task and knowledge domain to the intended audience, and the mechanisms set up to ensure high-quality contributions and extensive user involvement. Table of Contents: Semantic Data Management: A Human-driven Process / Fundamentals of Motivation and Incentives / Case Study: Motivating Employees to Annotate Content / Case Study: Building a Community of Practice Around Web Service Management and Annotation / Case Study: Games with a Purpose for Semantic Content Creation / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Aspects of Differential Geometry III

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Differential Geometry is a wide field. We have chosen to concentrate upon certain aspects that are appropriate for an introduction to the subject; we have not attempted an encyclopedic treatment. Book III is aimed at the first-year graduate level but is certainly accessible to advanced undergraduates. It deals with invariance theory and discusses invariants both of Weyl and not of Weyl type; the Chern‒Gauss‒Bonnet formula is treated from this point of view. Homothety homogeneity, local homogeneity, stability theorems, and Walker geometry are discussed. Ricci solitons are presented in the contexts of Riemannian, Lorentzian, and affine geometry. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineers, Society, and Sustainability

    Copyright Year: 2011

    Morgan and Claypool eBooks

    sustainability, actor-network theory, consumption, ecological modernisation, infrastructure, water, socio-technical systems, environmental ethics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Geometric Programming for Design and Cost Optimization: With Illustrative Case Study Problems and Solutions

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Geometric programming is used for design and cost optimization, the development of generalized design relationships, cost ratios for specific problems, and profit maximization. The early pioneers of the process (Zener, Duffin, Peterson, Beightler, Wilde, and Phillips) played important roles in the development of geometric programming. There are three major areas: 1) Introduction, History, and Theoretical Fundamentals, 2) Applications with Zero Degrees of Difficulty, and 3) Applications with Positive Degrees of Difficulty. The primal-dual relationships are used to illustrate how to determine the primal variables from the dual solution and how to determine additional dual equations when the degrees of difficulty are positive. A new technique for determining additional equations for the dual, Dimensional Analysis, is demonstrated. The various solution techniques of the constrained derivative approach, the condensation of terms, and dimensional analysis are illustrated with example problems. The goal of this work is to have readers develop more case studies to further the application of this exciting tool. 
Table of Contents: Introduction / Brief History of Geometric Programming / Theoretical Considerations / The Optimal Box Design Case Study / Trash Can Case Study / The Open Cargo Shipping Box Case Study / Metal Casting Cylindrical Riser Case Study / Inventory Model Case Study / Process Furnace Design Case Study / Gas Transmission Pipeline Case Study / Profit Maximization Case Study / Material Removal/Metal Cutting Economics Case Study / Journal Bearing Design Case Study / Metal Casting Hemispherical Top Cylindrical Side Riser Case Study / Liquefied Petroleum Gas (LPG) Cylinders Case Study / Material Removal/Metal Cutting Economics with Two Constraints / The Open Cargo Shipping Box with Skids / Profit Maximization Considering Decreasing Cost Functions of Inventory Policy / Summary and Future Directions / Thesis and Dissertations on Geometric Programming View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis Techniques for Information Security

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Increasingly our critical infrastructures are reliant on computers. We see examples of such infrastructures in several domains, including medical, power, telecommunications, and finance. Although automation has advantages, increased reliance on computers exposes our critical infrastructures to a wider variety and higher likelihood of accidental failures and malicious attacks. Disruption of services caused by such undesired events can have catastrophic effects, such as disruption of essential services and huge financial losses. The increased reliance of critical services on our cyberinfrastructure and the dire consequences of security breaches have highlighted the importance of information security. Authorization, security protocols, and software security are three central areas in security in which there have been significant advances in developing systematic foundations and analysis methods that work for practical systems. This book provides an introduction to this work, covering representative approaches, illustrated by examples, and providing pointers to additional work in the area. Table of Contents: Introduction / Foundations / Detecting Buffer Overruns Using Static Analysis / Analyzing Security Policies / Analyzing Security Protocols View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Arduino Microcontroller Processing for Everyone: Part II

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about the Arduino microcontroller and the Arduino concept. The visionary Arduino team of Massimo Banzi, David Cuartielles, Tom Igoe, Gianluca Martino, and David Mellis launched a new innovation in microcontroller hardware in 2005, the concept of open source hardware. Their approach was to openly share details of microcontroller-based hardware design platforms to stimulate the sharing of ideas and promote innovation. This concept has been popular in the software world for many years. This book is intended for a wide variety of audiences including students of the fine arts, middle and senior high school students, engineering design students, and practicing scientists and engineers. To meet this wide audience, the book has been divided into sections to satisfy the need of each reader. The book contains many software and hardware examples to assist the reader in developing a wide variety of systems. For the examples, the Arduino Duemilanove and the Atmel ATmega328 are employed as the target processor. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineers Engaging Community:Water and Energy

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Water and energy are fundamental elements of community well-being and economic development, and a key focus of engineering efforts the world over. As such, they offer outstanding opportunities for the development of socially just engineering practices. This work examines the engineering of water and energy systems with a focus on issues of social justice and sustainability. A key theme running through the work is engaging community on water and energy engineering projects: How is this achieved in diverse contexts? And, what can we learn from past failures and successes in water and energy engineering? The book includes a detailed case study of issues involved in the provision of water and energy, among other needs, in a developing and newly independent nation, East Timor. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Theory Tools for Computer Graphics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Information theory (IT) tools, widely used in scientific fields such as engineering, physics, genetics, neuroscience, and many others, are also emerging as useful transversal tools in computer graphics. In this book, we present the basic concepts of IT and how they have been applied to the graphics areas of radiosity, adaptive ray-tracing, shape descriptors, viewpoint selection and saliency, scientific visualization, and geometry simplification. Some of the approaches presented, such as the viewpoint techniques, are now the state of the art in visualization. Almost all of the techniques presented in this book have been previously published in peer-reviewed conference proceedings or international journals. Here, we have stressed their common aspects and presented them in a unified way, so the reader can clearly see which problems IT tools can help solve, which specific tools to use, and how to apply them. A basic level of knowledge in computer graphics is required but basic concepts in IT are presented. The intended audiences are both students and practitioners of the fields above and related areas in computer graphics. In addition, IT practitioners will learn about these applications. Table of Contents: Information Theory Basics / Scene Complexity and Refinement Criteria for Radiosity / Shape Descriptors / Refinement Criteria for Ray-Tracing / Viewpoint Selection and Mesh Saliency / View Selection in Scientific Visualization / Viewpoint-based Geometry Simplification View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Wireless Network Pricing

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Today's wireless communications and networking practices are tightly coupled with economic considerations, to the extent that it is almost impossible to make a sound technology choice without understanding the corresponding economic implications. This book aims at providing a foundational introduction on how microeconomics, and pricing theory in particular, can help us to understand and build better wireless networks. The book can be used as lecture notes for a course in the field of network economics, or a reference book for wireless engineers and applied economists to understand how pricing mechanisms influence the fast growing modern wireless industry. This book first covers the basics of wireless communication technologies and microeconomics, before going in-depth about several pricing models and their wireless applications. The pricing models include social optimal pricing, monopoly pricing, price differentiation, oligopoly pricing, and network externalities, supported by introductory discussions of convex optimization and game theory. The wireless applications include wireless video streaming, service provider competitions, cellular usage-based pricing, network partial price differentiation, wireless spectrum leasing, distributed power control, and cellular technology upgrade. More information related to the book (including references, slides, and videos) can be found at ncel.ie.cuhk.edu.hk/content/wireless-network-pricing. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analytical Performance Modeling for Computer Systems

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking, and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems in which they are interested. Describing a complicated system abstractly with mathematical equations requires a careful choice of assumptions and approximations. These assumptions and approximations make the model tractable, but they must not remove essential characteristics of the system, nor introduce spurious properties. To help the reader understand the choices and their implications, this book discusses the analytical models in 20 research papers. These papers cover a broad range of topics: processors and disks, databases and multimedia, worms and wireless, etc. An Appendix provides some questions for readers to exercise their understanding of the models in these papers. Table of Contents: Preliminaries / Concepts and Little's Law / Single Queues / Open Systems / Markov Chains / Closed Systems / Bottlenecks and Flow Equivalence / Deterministic Approximations / Transient Analysis / Experimental Validation and Analysis / Analysis with an Analytical Model View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Integral: A Crux for Analysis

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book treats all of the most commonly used theories of the integral. After motivating the idea of integral, we devote a full chapter to the Riemann integral and the next to the Lebesgue integral. Another chapter compares and contrasts the two theories. The concluding chapter offers brief introductions to the Henstock integral, the Daniell integral, the Stieltjes integral, and other commonly used integrals. The purpose of this book is to provide a quick but accurate (and detailed) introduction to all aspects of modern integration theory. It should be accessible to any student who has had calculus and some exposure to upper division mathematics. Table of Contents: Introduction / The Riemann Integral / The Lebesgue Integral / Comparison of the Riemann and Lebesgue Integrals / Other Theories of the Integral View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Libraries for Cultural Heritage: Development, Outcomes, and Challenges from European Perspectives

    Copyright Year: 2017

    Morgan and Claypool eBooks

    <p>European digital libraries have existed in diverse forms and with quite different functions, priorities, and aims. However, there are some common features of European-based initiatives that are relevant to non-European communities. There are now many more challenges and changes than ever before, and the development rate of new digital libraries is ever accelerating. Delivering educational, cultural, and research resources, especially from major scientific and cultural organizations, has become a core mission of these organizations. Using these resources they will be able to investigate, educate, and elucidate, in order to promote and disseminate and to preserve civilization. Extremely important in conceptualizing the digital environment priorities in Europe was its cultural heritage and the feeling that these rich resources should be open to Europe and the global community.</p> <p>In this book we focus on European digitized heritage and digital culture, and its potential in the digital age. We specifically look at the EU and its approaches to digitization and digital culture, problems detected, and achievements reached, all with an emphasis on digital cultural heritage. We seek to report on important documents that were prepared on digitization; copyright and related documents; research and education in the digital libraries field under the auspices of the EU; some other European and national initiatives; and funded projects.</p> <p>The aim of this book is to discuss the development of digital libraries in the European context by presenting, primarily to non-European communities interested in digital libraries, the phenomena, initiatives, and developments that dominated in Europe. We describe the main projects and their outcomes, and shine a light on the number of challenges that have been inspiring new approaches, cooperative efforts, and the use of research methodology at different stages of the digital libraries development. 
The specific goals are reflected in the structure of the book, which can be conceived as a guide to several main topics and sub-topics. However, the author’s scope is far from being comprehensive, since the field of digital libraries is very complex and digital libraries for cultural heritage is even more so.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Wavelet Image Compression

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book explains the stages necessary to create a wavelet compression system for images and describes state-of-the-art systems used in image compression standards and current research. It starts with a high-level discussion of the properties of the wavelet transform, especially the decomposition into multi-resolution subbands. It continues with an exposition of the null-zone uniform quantization used in most subband coding systems and the optimal allocation of bitrate to the different subbands. Then the image compression systems of the FBI Fingerprint Compression Standard and the JPEG2000 Standard are described in detail. Following that, the set partitioning coders SPECK, SPIHT, and EZW are explained in detail and compared, using a fictitious wavelet transform, in the actions taken and the number of bits coded in a single pass in the top bit plane. The presentation teaches that, besides producing efficient compression, these coding systems, except for the FBI Standard, are capable of writing bitstreams that have attributes of rate scalability, resolution scalability, and random access decoding. Many diagrams and tables accompany the text to aid understanding. The book is generous in pointing out references and resources to help readers who wish to expand their knowledge, know the origins of the methods, or find resources for running the various algorithms or building their own coding systems. Table of Contents: Introduction / Characteristics of the Wavelet Transform / Generic Wavelet-based Coding Systems / The FBI Fingerprint Image Compression Standard / Set Partition Embedded Block (SPECK) Coding / Tree-based Wavelet Transform Coding Systems / Rate Control for Embedded Block Coders / Conclusion View full abstract»
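    The null-zone (deadzone) uniform quantizer described in this abstract can be sketched in a few lines; the step size and the midpoint reconstruction rule below are illustrative choices, not the parameters of any particular standard:

```python
def deadzone_quantize(x, step):
    """Null-zone uniform quantizer: coefficients with |x| < step fall
    into the (wider) zero bin; the rest map to uniform bins."""
    sign = -1 if x < 0 else 1
    return sign * int(abs(x) // step)

def dequantize(q, step):
    """Reconstruct at the midpoint of the bin; the zero bin decodes to 0."""
    if q == 0:
        return 0.0
    sign = -1 if q < 0 else 1
    return sign * (abs(q) + 0.5) * step
```

    Small wavelet coefficients, which carry little energy, are zeroed out cheaply by the widened zero bin; this is where much of the compression gain comes from.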

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Deep Web Query Interface Understanding and Integration

    Copyright Year: 2012

    Morgan and Claypool eBooks

    There are millions of searchable data sources on the Web and to a large extent their contents can only be reached through their own query interfaces. There is an enormous interest in making the data in these sources easily accessible. There are primarily two general approaches to achieve this objective. The first is to surface the contents of these sources from the deep Web and add the contents to the index of regular search engines. The second is to integrate the searching capabilities of these sources and support integrated access to them. In this book, we introduce the state-of-the-art techniques for extracting, understanding, and integrating the query interfaces of deep Web data sources. These techniques are critical for producing an integrated query interface for each domain. The interface serves as the mediator for searching all data sources in the concerned domain. While query interface integration is only relevant for the deep Web integration approach, the extraction and understanding of query interfaces are critical for both deep Web exploration approaches. This book aims to provide in-depth and comprehensive coverage of the key technologies needed to create high quality integrated query interfaces automatically. The following technical issues are discussed in detail in this book: query interface modeling, query interface extraction, query interface clustering, query interface matching, query interface attribute integration, and query interface integration. Table of Contents: Introduction / Query Interface Representation and Extraction / Query Interface Clustering and Categorization / Query Interface Matching / Query Interface Attribute Integration / Query Interface Integration / Summary and Future Research View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Digital Heritage Reconstruction Using Super-resolution and Inpainting

    Copyright Year: 2016

    Morgan and Claypool eBooks

    <p>Heritage sites across the world have witnessed a number of natural calamities, sabotage and damage from visitors, resulting in their present ruined condition. Many sites are now restricted to reduce the risk of further damage. Yet these masterpieces are significant cultural icons and critical markers of past civilizations that future generations need to see. A digitally reconstructed heritage site could diminish further harm by using immersive navigation or walkthrough systems for virtual environments. An exciting key element for the viewer is observing fine details of the historic work and viewing monuments in their undamaged form. This book presents image super-resolution methods and techniques for automatically detecting and inpainting damaged regions in heritage monuments, in order to provide an enhanced visual experience.</p> <p>The book presents techniques to obtain higher resolution photographs of the digitally reconstructed monuments, and the resulting images can serve as input to immersive walkthrough systems. It begins with the discussion of two novel techniques for image super-resolution and an approach for inpainting a user-supplied region in the given image, followed by a technique to simultaneously perform super-resolution and inpainting of given missing regions. It then introduces a method for automatically detecting and repairing the damage to dominant facial regions in statues, followed by a few approaches for automatic crack repair in images of heritage scenes. This book is a giant step toward ensuring that the iconic sites of our past are always available, and will never be truly lost.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Essentials of Applied Mathematics for Engineers and Scientists: Second Edition

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The Second Edition of this popular book on practical mathematics for engineers includes new and expanded chapters on perturbation methods and theory. This is a book about linear partial differential equations that are common in engineering and the physical sciences. It will be useful to graduate students and advanced undergraduates in all engineering fields, to students of physics, chemistry, geophysics, and other physical sciences, and to professional engineers who wish to learn how advanced mathematics can be used in their professions. The reader will learn about applications to heat transfer, fluid flow, and mechanical vibrations. The book is written in such a way that solution methods and application to physical problems are emphasized. There are many examples presented in detail and fully explained in their relation to the real world. References to suggested further reading are included. The topics covered include classical separation of variables and orthogonal functions, Laplace transforms, complex variables, and Sturm-Liouville transforms. This second edition includes two new and revised chapters on perturbation methods and singular perturbation theory of differential equations. Table of Contents: Partial Differential Equations in Engineering / The Fourier Method: Separation of Variables / Orthogonal Sets of Functions / Series Solutions of Ordinary Differential Equations / Solutions Using Fourier Series and Integrals / Integral Transforms: The Laplace Transform / Complex Variables and the Laplace Inversion Integral / Solutions with Laplace Transforms / Sturm-Liouville Transforms / Introduction to Perturbation Methods / Singular Perturbation Theory of Differential Equations / Appendix A: The Roots of Certain Transcendental Equations View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Practical Global Illumination with Irradiance Caching

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Irradiance caching is a ray tracing-based technique for computing global illumination on diffuse surfaces. Specifically, it addresses the computation of indirect illumination bouncing off one diffuse object onto another. The sole purpose of irradiance caching is to make this computation reasonably fast. The main idea is to perform the indirect illumination sampling only at a selected set of locations in the scene, store the results in a cache, and reuse the cached values at other points through fast interpolation. This book is for anyone interested in making a production-ready implementation of irradiance caching that reliably renders artifact-free images. Since its invention 20 years ago, the irradiance caching algorithm has been successfully used to accelerate global illumination computation in the Radiance lighting simulation system. Its widespread use had to wait until computers became fast enough to consider global illumination in film production rendering. Since then, its use has become ubiquitous. Virtually all commercial and open-source rendering software bases the global illumination computation upon irradiance caching. Although elegant and powerful, the algorithm in its basic form often fails to produce artifact-free images. Unfortunately, practical information on implementing the algorithm is scarce. The main objective of this book is to show the irradiance caching algorithm along with all the details and tricks upon which the success of its practical implementation depends. In addition, we discuss some extensions of the basic algorithm, such as a GPU implementation for interactive global illumination computation and temporal caching that exploits temporal coherence to suppress flickering in animations. Our goal is to show the material without being overly theoretical. However, the reader should have some basic understanding of rendering concepts, ray tracing in particular. Familiarity with global illumination is useful but not necessary to read this book.
Table of Contents: Introduction to Ray Tracing and Global Illumination / Irradiance Caching Core / Practical Rendering with Irradiance Caching / Irradiance Caching in a Complete Global Illumination / Irradiance Caching on Graphics Hardware / Temporal Irradiance Caching View full abstract»
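    The main idea stated in the abstract (sample indirect illumination at a few locations, cache the results, and interpolate elsewhere) can be sketched as follows. `compute_irradiance` stands in for the expensive hemisphere sampling, and the inverse-distance weights are a simplification of the error-based weights used in real implementations:

```python
import math

def make_cache():
    # Each record: (position, irradiance, validity radius)
    return []

def query_irradiance(cache, pos, compute_irradiance, radius=1.0):
    """Reuse nearby cached samples via weighted interpolation; fall back
    to a fresh (expensive) sample when none is close enough."""
    weighted, total = 0.0, 0.0
    for p, e, r in cache:
        d = math.dist(pos, p)
        if d < r:
            w = 1.0 / (d + 1e-6)    # simplified interpolation weight
            weighted += w * e
            total += w
    if total > 0.0:
        return weighted / total
    e = compute_irradiance(pos)     # expensive hemisphere sampling (stub)
    cache.append((pos, e, radius))
    return e
```

    A second query near a cached point interpolates instead of resampling, which is the source of the speedup, and, when the weights are too permissive, of the artifacts the book teaches how to avoid.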

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Architecture: The Design of Digital Information Spaces

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Information Architecture is about organizing and simplifying information, designing and integrating information spaces/systems, and creating ways for people to find and interact with information content. Its goal is to help people understand and manage information and make the right decisions accordingly. In ever-changing social, organizational, and technological contexts, Information Architects not only design individual information spaces (e.g., individual websites, software applications, and mobile devices), but also tackle strategic aggregation and integration of multiple information spaces across websites, channels, modalities, and platforms. Not only do they create predetermined navigation pathways, but they also provide tools and rules for people to organize information on their own and get connected with others. Information Architects work with multi-disciplinary teams to determine the user experience strategy based on user needs and business goals, and make sure the strategy gets carried out by following the user-centered design (UCD) process via close collaboration with others. Drawing on the authors' extensive experience as HCI researchers, User Experience Design practitioners, and Information Architecture instructors, this book provides a balanced view of the IA discipline by applying IA theories, design principles, and guidelines to IA and UX practices. It also covers advanced topics such as Enterprise IA, Global IA, and Mobile IA. In addition to new and experienced IA practitioners, this book is written for undergraduate and graduate level students in Information Architecture, Information Sciences, Human Computer Interaction, Information Systems, and related disciplines.
Table of Contents: Information Architecture Concepts / Information Architecture and Web 2.0 / IA Research, Design and Evaluation / Organization and Navigation Systems / User Information Behavior and Design Implications / Interaction Design / Enterprise IA and IA in Practice / Global Information Architecture / Mobile Information Architecture / The Future of Information Architecture View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Stochastic Partial Differential Equations for Computer Vision with Uncertain Data

    Copyright Year: 2017

    Morgan and Claypool eBooks

    <p>In image processing and computer vision applications such as medical or scientific image data analysis, as well as in industrial scenarios, images are used as input measurement data. It is good scientific practice that proper measurements be equipped with error and uncertainty estimates. For many applications, not only the measured values, but also their errors and uncertainties, should be—and more and more frequently are—taken into account for further processing. This error and uncertainty propagation must be done for every processing step such that the final result comes with a reliable precision estimate.</p> <p>The goal of this book is to introduce the reader to recent advances in uncertainty quantification and error propagation for computer vision, image processing, and image analysis that are based on partial differential equations (PDEs). It presents a concept with which error propagation and sensitivity analysis can be formulated with a set of basic operations. The approach discussed in this book has the potential for application in all areas of quantitative computer vision, image processing, and image analysis. In particular, it might help medical imaging finally become a scientific discipline that is characterized by the classical paradigms of observation, measurement, and error awareness.</p> <p>This book is comprised of eight chapters. After an introduction to the goals of the book (Chapter 1), we present a brief review of PDEs and their numerical treatment (Chapter 2), PDE-based image processing (Chapter 3), and the numerics of stochastic PDEs (Chapter 4). We then proceed to define the concept of stochastic images (Chapter 5), describe how to accomplish image processing and computer vision with stochastic images (Chapter 6), and demonstrate the use of these principles for accomplishing sensitivity analysis (Chapter 7). Chapter 8 concludes the book and highlights new research topics for the future.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Surface Computing and Collaborative Analysis Work

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The primary topic of this book is large surface computing devices (wall-mounted or tabletop) with touch interfaces, and their application to collaborative data analysis, an increasingly important and prevalent activity. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personnel face in securing networks from attackers, and intelligence analysts encounter when analyzing intelligence data. Both of these activities are becoming increasingly collaborative endeavors, and there are huge opportunities for improving collaboration by leveraging surface computing. This work highlights for interaction designers and software developers the particular challenges and opportunities presented by interaction with surfaces. We have reviewed hundreds of recent research papers, and report on advancements in the fields of surface-enabled collaborative analytic work, interactive techniques for surface technologies, and useful theory that can provide direction to interaction design work. We also offer insight into issues that arise when developing applications for multi-touch surfaces derived from our own experiences creating collaborative applications.
We present these insights at a level appropriate for all members of the software design and development team. Table of Contents: List of Figures / Acknowledgments / Figure Credits / Purpose and Direction / Surface Technologies and Collaborative Analysis Systems / Interacting with Surface Technologies / Collaborative Work Enabled by Surfaces / The Theory and the Design of Surface Applications / The Development of Surface Applications / Concluding Comments / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Datacenter Design and Management: A Computer Architect’s Perspective

    Copyright Year: 2016

    Morgan and Claypool eBooks

    An era of big data demands datacenters, which house the computing infrastructure that translates raw data into valuable information. This book defines datacenters broadly, as large distributed systems that perform parallel computation for diverse users. These systems exist in multiple forms—private and public—and are built at multiple scales. Datacenter design and management is multifaceted, requiring the simultaneous pursuit of multiple objectives. Performance, efficiency, and fairness are first-order design and management objectives, each of which can be viewed from several perspectives. This book surveys datacenter research from a computer architect's perspective, addressing challenges in applications, design, management, server simulation, and system simulation. This perspective complements the rich bodies of work in datacenters as warehouse-scale systems, which study the implications for the infrastructure that encloses computing equipment, and in datacenters as distributed systems, which abstract away details of processor and memory subsystems. This book is written for first- or second-year graduate students in computer architecture and may be helpful for those in computer systems. The goal of this book is to prepare computer architects for datacenter-oriented research by describing prevalent perspectives and the state of the art. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Embedded Systems Interfacing for Engineers using the Freescale HCS08 Microcontroller I: Machine Language Programming

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers with an advanced treatment of the Atmel AVR microcontroller. This book is intended as a follow-on to a previously published book, titled Atmel AVR Microcontroller Primer: Programming and Interfacing. Some of the content from this earlier text is retained for completeness. This book emphasizes advanced programming and interfacing skills. We focus on system-level design consisting of several interacting microcontroller subsystems. The first chapter discusses the system design process. Our approach is to provide the skills to quickly get up to speed to operate the internationally popular Atmel AVR microcontroller line by developing system-level design skills. We use the Atmel ATmega164 as a representative sample of the AVR line. The knowledge you gain on this microcontroller can be easily translated to every other microcontroller in the AVR line. In succeeding chapters, we cover the main subsystems aboard the microcontroller, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying software for the subsystem. We then provide advanced examples exercising some of the features discussed. In all examples, we use the C programming language. The code provided can be readily adapted to the wide variety of compilers available for the Atmel AVR microcontroller line. We also include a chapter describing how to interface the microcontroller to a wide variety of input and output devices. The book concludes with several detailed system level design examples employing the Atmel AVR microcontroller. Table of Contents: Embedded Systems Design / Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog to Digital Conversion (ADC) / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / System Level Design View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Background Subtraction: Theory and Practice

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Background subtraction is a widely used concept for detecting moving objects in videos. In the last two decades there has been a lot of development in designing algorithms for background subtraction, as well as wide use of these algorithms in various important applications, such as visual surveillance, sports video analysis, and motion capture. Various statistical approaches have been proposed to model scene backgrounds. The concept of background subtraction has also been extended to detect objects from videos captured by moving cameras. This book reviews the concept and practice of background subtraction. We discuss several traditional statistical background subtraction models, including the widely used parametric Gaussian mixture models and non-parametric models. We also discuss the issue of shadow suppression, which is essential for human motion analysis applications. This book discusses approaches and tradeoffs for background maintenance. It also reviews many of the recent developments in the background subtraction paradigm. Recent advances in developing algorithms for background subtraction from moving cameras are described, including motion-compensation-based approaches and motion-segmentation-based approaches. View full abstract»
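    As a point of reference for the statistical models this book surveys, the simplest background model is a per-pixel exponential running average with a fixed threshold. The learning rate and threshold below are illustrative; the Gaussian mixture models covered in the book replace this single average with several adaptive modes per pixel:

```python
def update_background(bg, frame, alpha=0.05):
    """Exponential running average: the background slowly absorbs change."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def foreground_mask(bg, frame, threshold=30.0):
    """A pixel is foreground when it deviates from the background model
    by more than the threshold."""
    return [abs(f - b) > threshold for b, f in zip(bg, frame)]
```

    Even this baseline exposes the central tradeoff of background maintenance: a large alpha absorbs stopped objects into the background, while a small alpha is slow to adapt to lighting changes.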

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Biomedical Image Analysis: Tracking

    Copyright Year: 2006

    Morgan and Claypool eBooks

    In biological and medical imaging applications, tracking objects in motion is a critical task. This book describes the state-of-the-art in biomedical tracking techniques. We begin by detailing methods for tracking using active contours, which have been highly successful in biomedical applications. The book next covers the major probabilistic methods for tracking. Starting with the basic Bayesian model, we describe the Kalman filter and conventional tracking methods that use centroid and correlation measurements for target detection. Innovations such as the extended Kalman filter and the interacting multiple model open the door to capturing complex biological objects in motion. A salient highlight of the book is the introduction of the recently emerged particle filter, which promises to solve tracking problems that were previously intractable by conventional means. Another unique feature of Biomedical Image Analysis: Tracking is the explanation of shape-based methods for biomedical image analysis. Methods for both rigid and nonrigid objects are depicted. Each chapter in the book puts forth biomedical case studies that illustrate the methods in action. View full abstract»
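    The Kalman filter underlying the probabilistic trackers described in this abstract reduces, for a scalar state with a random-walk motion model, to a single predict/update cycle. The noise variances below are illustrative:

```python
def kalman_step(x, p, z, q=0.01, r=1.0):
    """One cycle of a scalar Kalman filter.
    x, p: state estimate and its variance; z: new measurement;
    q, r: process and measurement noise variances."""
    p = p + q                 # predict: uncertainty grows over time
    k = p / (p + r)           # Kalman gain
    x = x + k * (z - x)       # update: blend prediction and measurement
    p = (1 - k) * p           # uncertainty shrinks after the measurement
    return x, p
```

    Fed a stream of noisy centroid measurements, the estimate converges toward the true position while the variance settles; the extended Kalman filter and the particle filter covered in the book generalize this cycle to nonlinear, non-Gaussian motion.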

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Private Information Retrieval

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book deals with Private Information Retrieval (PIR), a technique allowing a user to retrieve an element from a server in possession of a database without revealing to the server which element is retrieved. PIR has been widely applied to protect the privacy of the user in querying a service provider on the Internet. For example, by PIR, one can query a location-based service provider about the nearest car park without revealing his location to the server. The first PIR approach was introduced by Chor, Goldreich, Kushilevitz and Sudan in 1995 in a multi-server setting, where the user retrieves information from multiple database servers, each of which has a copy of the same database. To ensure user privacy in the multi-server setting, the servers must be trusted not to collude. In 1997, Kushilevitz and Ostrovsky constructed the first single-database PIR. Since then, many efficient PIR solutions have been discovered. Beginning with a thorough survey of single-database PIR techniques, this text focuses on the latest technologies and applications in the field of PIR. The main categories are illustrated with recently proposed PIR-based solutions by the authors. Because of the latest treatment of the topic, this text will be highly beneficial to researchers and industry professionals in information security and privacy. View full abstract»
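    The 1995 two-server scheme mentioned in this abstract can be illustrated on a database of bits: the client sends a random selection of indices to one server and the same selection with the target index toggled to the other; each server replies with the XOR of its selected bits, and the XOR of the two replies is the target bit. This toy sketch (function names are illustrative) assumes the two servers do not collude:

```python
import secrets
from functools import reduce

def pir_query(n, i):
    """Client: a random selection vector and its copy with entry i toggled."""
    s1 = [secrets.randbelow(2) for _ in range(n)]
    s2 = list(s1)
    s2[i] ^= 1
    return s1, s2

def pir_answer(db, sel):
    """Server: XOR of the database bits picked out by the selection vector."""
    return reduce(lambda a, b: a ^ b, (d for d, s in zip(db, sel) if s), 0)

def pir_reconstruct(a1, a2):
    """Client: the two answers differ exactly in the contribution of bit i."""
    return a1 ^ a2
```

    Each server sees only a uniformly random selection vector, so neither learns which index was queried; the single-database constructions surveyed in the book replace the second server with cryptographic assumptions.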

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Foundations of Data Quality Management

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Data quality is one of the most important problems in data management. A database system typically aims to support the creation, maintenance, and use of large amounts of data, focusing on the quantity of data. However, real-life data are often dirty: inconsistent, duplicated, inaccurate, incomplete, or stale. Dirty data in a database routinely generate misleading or biased analytical results and decisions, and lead to loss of revenues, credibility, and customers. With this comes the need for data quality management. In contrast to traditional data management tasks, data quality management enables the detection and correction of errors in the data, syntactic or semantic, in order to improve the quality of the data and hence add value to business processes. While data quality has been a longstanding problem for decades, the prevalent use of the Web has increased the risks, on an unprecedented scale, of creating and propagating dirty data. This monograph gives an overview of fundamental issues underlying central aspects of data quality, namely, data consistency, data deduplication, data accuracy, data currency, and information completeness. We promote a uniform logical framework for dealing with these issues, based on data quality rules. The text is organized into seven chapters, focusing on relational data. Chapter One introduces data quality issues. A conditional dependency theory is developed in Chapter Two, for capturing data inconsistencies. It is followed by practical techniques in Chapter Three for discovering conditional dependencies, and for detecting inconsistencies and repairing data based on conditional dependencies. Matching dependencies are introduced in Chapter Four, as matching rules for data deduplication. A theory of relative information completeness is studied in Chapter Five, revising the classical Closed World Assumption and the Open World Assumption, to characterize incomplete information in the real world.
A data currency model is presented in Chapter Six, to identify the current values of entities in a database and to answer queries with the current values, in the absence of reliable timestamps. Finally, interactions between these data quality issues are explored in Chapter Seven. Important theoretical results and practical algorithms are covered, but formal proofs are omitted. The bibliographical notes contain pointers to papers in which the results were presented and proven, as well as references to materials for further reading. This text is intended for a seminar course at the graduate level. It is also intended to serve as a useful resource for researchers and practitioners interested in the study of data quality. The fundamental research on data quality draws on several areas, including mathematical logic, computational complexity, and database theory. It has raised as many questions as it has answered, and is a rich source of questions and vitality. Table of Contents: Data Quality: An Overview / Conditional Dependencies / Cleaning Data with Conditional Dependencies / Data Deduplication / Information Completeness / Data Currency / Interactions between Data Quality Issues View full abstract»
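    A conditional dependency of the kind developed in this monograph can be checked mechanically: the sketch below flags pairs of rows that satisfy a condition, agree on a left-hand-side attribute, yet disagree on the right-hand side. The attribute names are hypothetical:

```python
def cfd_violations(rows, cond, lhs, rhs):
    """Return index pairs violating the conditional dependency
    'for rows where cond holds, lhs determines rhs'."""
    seen = {}   # lhs value -> (first row index, its rhs value)
    bad = []
    for idx, row in enumerate(rows):
        if not cond(row):
            continue
        key = row[lhs]
        if key in seen and seen[key][1] != row[rhs]:
            bad.append((seen[key][0], idx))
        else:
            seen.setdefault(key, (idx, row[rhs]))
    return bad

# Example: for UK records (cc == '44'), zip code determines street.
rows = [
    {"cc": "44", "zip": "EH4 8LE", "street": "Mayfield"},
    {"cc": "44", "zip": "EH4 8LE", "street": "Crichton"},
    {"cc": "01", "zip": "EH4 8LE", "street": "Main St"},
]
violations = cfd_violations(rows, lambda r: r["cc"] == "44", "zip", "street")
```

    Here the two UK rows share a zip code but disagree on the street, while the non-UK row is exempt from the rule; the exemption is exactly what makes the dependency conditional rather than a classical functional dependency.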

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Theory and Applications of Gaussian Quadrature Methods

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Gaussian quadrature is a powerful technique for numerical integration that falls under the broad category of spectral methods. The purpose of this work is to provide an introduction to the theory and practice of Gaussian quadrature. We study the approximation theory of trigonometric and orthogonal polynomials and related functions and examine the analytical framework of Gaussian quadrature. We discuss Gaussian quadrature for bandlimited functions, a topic inspired by some recent developments in the analysis of prolate spheroidal wave functions. Algorithms for the computation of the quadrature nodes and weights are described. Several applications of Gaussian quadrature are given, ranging from the evaluation of special functions to pseudospectral methods for solving differential equations. Software realization of select algorithms is provided. Table of Contents: Introduction / Approximating with Polynomials and Related Functions / Gaussian Quadrature / Applications / Links to Mathematical Software View full abstract»
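    As a concrete instance of the theory this book introduces, the two-point Gauss-Legendre rule has nodes ±1/√3 and unit weights on [-1, 1], and integrates every polynomial of degree at most three exactly. The helper below is a minimal sketch:

```python
import math

# Two-point Gauss-Legendre rule on [-1, 1]: nodes +-1/sqrt(3), weights 1.
NODES = (-1.0 / math.sqrt(3.0), 1.0 / math.sqrt(3.0))
WEIGHTS = (1.0, 1.0)

def gauss_legendre_2pt(f, a, b):
    """Integrate f over [a, b] by mapping the reference nodes affinely."""
    mid, half = (a + b) / 2.0, (b - a) / 2.0
    return half * sum(w * f(mid + half * x) for w, x in zip(WEIGHTS, NODES))
```

    With just two function evaluations, `gauss_legendre_2pt(lambda x: x**3 + x**2, 0.0, 1.0)` reproduces the exact value 7/12 up to rounding, illustrating the degree-(2n-1) exactness that makes Gaussian quadrature attractive.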

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Globalization, Engineering, and Creativity

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The text addresses the impact of globalization within engineering, particularly on working practices and prospects for creativity. It suggests that accepted norms of economic activity create enclosures and thresholds within the profession, which—as engineers increase their awareness (reflexivity)—will shape the future of engineering, and the values which underpin it. It is aimed at practicing engineers and those in training and is an introduction to the social and political context currently setting new challenges for the profession. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Logic Circuit Testing

    Copyright Year: 2008

    Morgan and Claypool eBooks

    An Introduction to Logic Circuit Testing provides detailed coverage of techniques for test generation and testable design of digital electronic circuits/systems. The material covered in the book should be sufficient for a course, or part of a course, in digital circuit testing for senior-level undergraduate and first-year graduate students in Electrical Engineering and Computer Science. The book will also be a valuable resource for engineers working in industry. This book has four chapters. Chapter 1 deals with various types of faults that may occur in very large scale integration (VLSI)-based digital circuits. Chapter 2 introduces the major concepts of all test generation techniques such as redundancy, fault coverage, sensitization, and backtracking. Chapter 3 introduces the key concepts of testability, followed by some ad hoc design-for-testability rules that can be used to enhance the testability of combinational circuits. Chapter 4 deals with test generation and response evaluation techniques used in BIST (built-in self-test) schemes for VLSI chips. Table of Contents: Introduction / Fault Detection in Logic Circuits / Design for Testability / Built-in Self-Test / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Capstone Design Courses, Part Two: Preparing Biomedical Engineers for the Real World

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The biomedical engineering senior capstone design course is probably the most important course taken by undergraduate biomedical engineering students. It provides them with the opportunity to apply what they have learned in previous years, develop their communication, teamwork, project management, and design skills, and learn about the product development process. It prepares students for professional practice and serves as a preview of what it will be like to work as a biomedical engineer. The capstone design experience can change the way engineering students think about technology, themselves, society, and the world around them. It can make them aware of their potential to make a positive contribution to healthcare throughout the world and generate excitement for, and pride in, the engineering profession. Ideas for how to organize, structure, and manage a senior capstone design course for biomedical and other engineering students are presented here. These ideas will be helpful to faculty who are creating a new design course, expanding a current design program, or just looking for some ideas for improving an existing course. The better we can make these courses, the more "industry ready" our students will be, and the better prepared they will be for meaningful, successful careers in biomedical engineering. This book is the second part of a series covering Capstone Design Courses for biomedical engineers. Part I is available online here and in print (ISBN 9781598292923) and covers the following topics: Purpose, Goals, and Benefits; Designing a Course to Meet Student Needs; Enhancing the Capstone Design Courses; Meeting the Changing Needs of Future Engineers. Table of Contents: The Myth of the "Industry-Ready" Engineer / Recent Trends and the Current State of Capstone Design / Preparing Students for Capstone Design / Helping Students Recognize the Value of Capstone Design Courses / Developing Teamwork Skills / Incorporating Design Controls / Learning to Identify Problems, Unmet Needs, and New Product Opportunities / Design Verification and Validation / Liability Issues with Assistive Technology Projects / Standards in Capstone Design Courses and the Engineering Curriculum / Design Transfer and Design for Manufacturability / Learning from other Engineering Disciplines: Capstone Design Conferences / Maintaining a Relevant, Up-to-Date Capstone Design Course / Active Learning in Capstone Design Courses / Showcasing Student Projects: National Student Design Competitions / Managing Student Expectations of the "Real World" / Career Management and Professional Development / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Resilient Architecture Design for Voltage Variation

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Shrinking feature size and diminishing supply voltage are making circuits sensitive to supply voltage fluctuations within the microprocessor, caused by normal workload activity changes. If left unattended, voltage fluctuations can lead to timing violations or even transistor lifetime issues that degrade processor robustness. Mechanisms that learn to tolerate, avoid, and eliminate voltage fluctuations based on program and microarchitectural events can help steer the processor clear of danger, thus enabling tighter voltage margins that improve performance or lower power consumption. We describe the problem of voltage variation and the factors that influence this variation during processor design and operation. We also describe a variety of runtime hardware and software mitigation techniques that either tolerate, avoid, and/or eliminate voltage violations. We hope processor architects will find the information useful since tolerance, avoidance, and elimination are generalizable constructs that can serve as a basis for addressing other reliability challenges as well. Table of Contents: Introduction / Modeling Voltage Variation / Understanding the Characteristics of Voltage Variation / Traditional Solutions and Emerging Solution Forecast / Allowing and Tolerating Voltage Emergencies / Predicting and Avoiding Voltage Emergencies / Eliminating Recurring Voltage Emergencies / Future Directions on Resiliency View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mellin Transform Method for Integral Evaluation

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book introduces the Mellin-transform method for the exact calculation of one-dimensional definite integrals, and illustrates the application of this method to electromagnetics problems. Once the basics have been mastered, one quickly realizes that the method is extremely powerful, often yielding closed-form expressions very difficult to come up with by other methods or to deduce from the usual tables of integrals. Yet, as opposed to other methods, the present method is very straightforward to apply; it usually requires laborious calculations, but little ingenuity. Two functions, the generalized hypergeometric function and the Meijer G-function, are very much related to the Mellin-transform method and arise frequently when the method is applied. Because these functions can be automatically handled by modern numerical routines, they are now much more useful than they were in the past. The Mellin-transform method and the two aforementioned functions are discussed first. Then the method is applied in three examples to obtain results which, at least in the antenna/electromagnetics literature, are believed to be new. In the first example, a closed-form expression, as a generalized hypergeometric function, is obtained for the power radiated by a constant-current circular-loop antenna. The second example concerns the admittance of a 2-D slot antenna. In both of these examples, the exact closed-form expressions are applied to improve upon existing formulas in standard antenna textbooks. In the third example, a very simple expression for an integral arising in recent, unpublished studies of unbounded, biaxially anisotropic media is derived. Additional examples are also briefly discussed. View full abstract»
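For orientation, the transform pair at the heart of the method is the standard Mellin transform and its inversion (a reminder supplied here for context, not drawn from the book itself):

```latex
% Mellin transform of f(x), analytic in a vertical strip a < Re(s) < b
F(s) = \mathcal{M}\{f\}(s) = \int_0^{\infty} x^{s-1} f(x)\, dx
% Inversion along a contour Re(s) = c lying inside the strip
f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} F(s)\, x^{-s}\, ds
```

Closing the inversion contour and summing residues is the step that typically produces the generalized hypergeometric and Meijer G-function forms the abstract mentions.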

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Short Introduction to Preferences: Between AI and Social Choice

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Computational social choice is an expanding field that merges classical topics like economics and voting theory with more modern topics like artificial intelligence, multiagent systems, and computational complexity. This book provides a concise introduction to the main research lines in this field, covering aspects such as preference modelling, uncertainty reasoning, social choice, stable matching, and computational aspects of preference aggregation and manipulation. The book is centered around the notion of preference reasoning, both in the single-agent and the multi-agent setting. It presents the main approaches to modeling and reasoning with preferences, with particular attention to two popular and powerful formalisms, soft constraints and CP-nets. The authors consider preference elicitation and various forms of uncertainty in soft constraints. They review the most relevant results in voting, with special attention to computational social choice. Finally, the book considers preferences in matching problems. The book is intended for students and researchers who may be interested in an introduction to preference reasoning and multi-agent preference aggregation, and who want to know the basic notions and results in computational social choice. Table of Contents: Introduction / Preference Modeling and Reasoning / Uncertainty in Preference Reasoning / Aggregating Preferences / Stable Marriage Problems View full abstract»
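As a small taste of the preference-aggregation material surveyed here, the classical Borda rule (one of the voting rules studied in computational social choice) can be sketched in a few lines of Python; the function name and the three-agent profile below are our own illustration, not taken from the book:

```python
from collections import Counter

def borda(profile):
    """Borda count: with m candidates, a candidate earns m-1 points
    for each first place, m-2 for each second place, ..., 0 for last."""
    scores = Counter()
    for ranking in profile:
        m = len(ranking)
        for position, candidate in enumerate(ranking):
            scores[candidate] += m - 1 - position
    return scores

# Three agents ranking candidates a, b, c (most- to least-preferred).
profile = [("a", "b", "c"), ("b", "c", "a"), ("b", "a", "c")]
print(borda(profile).most_common())  # b wins with 5 points
```

Manipulation questions of the kind the book reviews ask, for example, whether an agent can misreport its ranking to change such an outcome in its favor.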

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quorum Systems: With Applications to Storage and Consensus

    Copyright Year: 2012

    Morgan and Claypool eBooks

    A quorum system is a collection of subsets of nodes, called quorums, with the property that each pair of quorums has a non-empty intersection. Quorum systems are the key mathematical abstraction for ensuring consistency in fault-tolerant and highly available distributed computing. Critical for many applications since the early days of distributed computing, quorum systems have evolved from simple majorities of a set of processes to complex hierarchical collections of sets, tailored for general adversarial structures. The initial non-empty intersection property has been refined many times to account for, e.g., stronger (Byzantine) adversarial models, latency considerations, or better availability. This monograph is an overview of the evolution and refinement of quorum systems, with emphasis on their role in two fundamental applications: distributed read/write storage and consensus. Table of Contents: Introduction / Preliminaries / Classical Quorum Systems / Classical Quorum-Based Emulations / Byzantine Quorum Systems / Latency-efficient Quorum Systems / Probabilistic Quorum Systems View full abstract»
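To make the intersection property concrete, here is a minimal Python sketch (our own illustration, not from the monograph; the name is_quorum_system is ours) that checks the defining property and verifies it for the classical majority construction:

```python
from itertools import combinations

def is_quorum_system(quorums):
    """A quorum system requires every pair of quorums to intersect."""
    return all(q1 & q2 for q1, q2 in combinations(quorums, 2))

# Majority quorums over 5 nodes: any two 3-of-5 subsets must overlap.
nodes = {1, 2, 3, 4, 5}
majorities = [set(c) for c in combinations(nodes, 3)]
print(is_quorum_system(majorities))           # True
print(is_quorum_system([{1, 2}, {3, 4}]))     # False: disjoint quorums
```

The pairwise overlap is what lets a writer contact one quorum and a later reader contact any other quorum while still meeting at least one node that saw the write.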

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Chaotic Maps: Dynamics, Fractals, and Rapid Fluctuations

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book consists of lecture notes for a semester-long introductory graduate course on dynamical systems and chaos taught by the authors at Texas A&M University and Zhongshan University, China. There are ten chapters in the main body of the book, covering an elementary theory of chaotic maps in finite-dimensional spaces. The topics include one-dimensional dynamical systems (interval maps), bifurcations, general topological and symbolic dynamical systems, fractals, and a class of infinite-dimensional dynamical systems which are induced by interval maps, plus rapid fluctuations of chaotic maps as a new viewpoint developed by the authors in recent years. Two appendices are also provided in order to ease the transitions for the readership from discrete-time dynamical systems to continuous-time dynamical systems, governed by ordinary and partial differential equations. Table of Contents: Simple Interval Maps and Their Iterations / Total Variations of Iterates of Maps / Ordering among Periods: The Sharkovski Theorem / Bifurcation Theorems for Maps / Homoclinicity. Lyapunoff Exponents / Symbolic Dynamics, Conjugacy and Shift Invariant Sets / The Smale Horseshoe / Fractals / Rapid Fluctuations of Chaotic Maps on R^N / Infinite-dimensional Systems Induced by Continuous-Time Difference Equations View full abstract»
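The interval maps and iterations that open the book are easy to experiment with numerically. The sketch below (our own illustration, not from the lecture notes) iterates the logistic map x ↦ r·x·(1−x) at r = 4 and shows the sensitive dependence on initial conditions characteristic of chaotic maps:

```python
def logistic(x, r=4.0):
    """One iterate of the logistic interval map on [0, 1]."""
    return r * x * (1.0 - x)

def orbit(x0, n, r=4.0):
    """The first n iterates of x0 under the logistic map."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(xs[-1], r))
    return xs

# Two nearby initial conditions separate rapidly at r = 4,
# the hallmark of a positive Lyapunov exponent.
a = orbit(0.2, 20)
b = orbit(0.2000001, 20)
print(abs(a[-1] - b[-1]))
```

After 20 iterates the initial separation of 1e-7 has been amplified by roughly a factor of 2 per step, so the two orbits are no longer usefully close.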

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Microcontrollers Fundamentals for Engineers and Scientists

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book provides practicing scientists and engineers a tutorial on the fundamental concepts and use of microcontrollers. Today, microcontrollers, or single integrated circuit (chip) computers, play critical roles in almost all instrumentation and control systems. Most existing books are written for undergraduate and graduate students taking an electrical and/or computer engineering course. Furthermore, these texts have been written with a particular model of microcontroller as the target discussion. These textbooks also require prior knowledge of digital design fundamentals. This textbook presents the fundamental concepts common to all microcontrollers. Our goals are to present the overarching theory of microcontroller operation and to provide a detailed discussion of the constituent subsystems available in most microcontrollers. With such goals, we envision that the theory discussed in this book can be readily applied to a wide variety of microcontroller technologies, allowing practicing scientists and engineers to become acquainted with basic concepts prior to beginning a design involving a specific microcontroller. We have found that the fundamental principles of a given microcontroller are easily transferred to other controllers. Although this is a relatively small book, it is packed with useful information for quickly coming up to speed on microcontroller concepts. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Metaphor: A Computational Perspective

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The literary imagination may take flight on the wings of metaphor, but hard-headed scientists are just as likely as doe-eyed poets to reach for a metaphor when the descriptive need arises. Metaphor is a pervasive aspect of every genre of text and every register of speech, and is as useful for describing the inner workings of a "black hole" (itself a metaphor) as it is the affairs of the human heart. The ubiquity of metaphor in natural language thus poses a significant challenge for Natural Language Processing (NLP) systems and their builders, who cannot afford to wait until the problems of literal language have been solved before turning their attention to figurative phenomena. This book offers a comprehensive approach to the computational treatment of metaphor and its figurative brethren—including simile, analogy, and conceptual blending—that does not shy away from their important cognitive and philosophical dimensions. Veale, Shutova, and Beigman Klebanov approach metaphor from multiple computational perspectives, providing coverage of both symbolic and statistical approaches to interpretation and paraphrase generation, while also considering key contributions from philosophy on what constitutes the "meaning" of a metaphor. This book also surveys available metaphor corpora and discusses protocols for metaphor annotation. Any reader with an interest in metaphor, from beginning researchers to seasoned scholars, will find this book to be an invaluable guide to what is a fascinating linguistic phenomenon. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Web Indicators for Research Evaluation: A Practical Guide

    Copyright Year: 2016

    Morgan and Claypool eBooks

    In recent years there has been an increasing demand for research evaluation within universities and other research-based organisations. In parallel, there has been an increasing recognition that traditional citation-based indicators are not able to reflect the societal impacts of research and are slow to appear. This has led to the creation of new indicators for different types of research impact as well as timelier indicators, mainly derived from the Web. These indicators have been called altmetrics, webometrics or just web metrics. This book describes and evaluates a range of web indicators for aspects of societal or scholarly impact, discusses the theory and practice of using and evaluating web indicators for research assessment and outlines practical strategies for obtaining many web indicators. In addition to describing impact indicators for traditional scholarly outputs, such as journal articles and monographs, it also covers indicators for videos, datasets, software and other non-standard scholarly outputs. The book describes strategies to analyse web indicators for individual publications as well as to compare the impacts of groups of publications. The practical part of the book includes descriptions of how to use the free software Webometric Analyst to gather and analyse web data. This book is written for information science undergraduate and Master’s students who are learning about alternative indicators or scientometrics, as well as Ph.D. students and other researchers and practitioners using indicators to help assess research impact or to study scholarly communication. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Waveform-Agile Sensing for Tracking

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Recent advances in sensor technology and information processing afford a new flexibility in the design of waveforms for agile sensing. Sensors are now developed with the ability to dynamically choose their transmit or receive waveforms in order to optimize an objective cost function. This has exposed a new paradigm of significant performance improvements in active sensing: dynamic waveform adaptation to environment conditions, target structures, or information features. The manuscript provides a review of recent advances in waveform-agile sensing for target tracking applications. A dynamic waveform selection and configuration scheme is developed for two active sensors that track one or multiple mobile targets. A detailed description of two sequential Monte Carlo algorithms for agile tracking is presented, together with relevant Matlab code and simulation studies, to demonstrate the benefits of dynamic waveform adaptation. The work will be of interest not only to practitioners of radar and sonar, but also to those in other applications where waveforms can be dynamically designed, such as communications and biosensing. Table of Contents: Waveform-Agile Target Tracking Application Formulation / Dynamic Waveform Selection with Application to Narrowband and Wideband Environments / Dynamic Waveform Selection for Tracking in Clutter / Conclusions / CRLB Evaluation for Gaussian Envelope GFM Chirp from the Ambiguity Function / CRLB Evaluation from the Complex Envelope View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering Thermodynamics and 21st Century Energy Problems: A Textbook Companion for Student Engagement

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Energy is a basic human need; technologies for energy conversion and use are fundamental to human survival. As energy technology evolves to meet demands for development and ecological sustainability in the 21st century, engineers need to have up-to-date skills and knowledge to meet the creative challenges posed by current and future energy problems. Further, engineers need to cultivate a commitment to and passion for lifelong learning which will enable us to actively engage new developments in the field. This undergraduate textbook companion seeks to develop these capacities in tomorrow's engineers in order to provide for future energy needs around the world. This book is designed to complement traditional texts in engineering thermodynamics, and thus is organized to accompany explorations of the First and Second Laws, fundamental property relations, and various applications across engineering disciplines. It contains twenty modules targeted toward meeting five often-neglected ABET outcomes: ethics, communication, lifelong learning, social context, and contemporary issues. The modules are based on pedagogies of liberation, used for decades in the humanities and social sciences for instilling critical thinking and reflective action in students by bringing attention to power relations in the classroom and in the world. This book is intended to produce a conversation and creative exploration around how to teach and learn thermodynamics differently. Because liberative pedagogies are at their heart relational, it is important to maintain spaces for discussing classroom practices with these modules, and for sharing ideas for implementing critical pedagogies in engineering contexts. The reader is therefore encouraged to visit the book's blog. Table of Contents: What and Why? / The First Law: Making Theory Relevant / The Second Law and Property Relations / Thinking Big Picture about Energy and Sustainability View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering Principles in Everyday Life for Non-Engineers

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book is about the role of some engineering principles in our everyday lives. Engineers study these principles and use them in the design and analysis of the products and systems with which they work. The same principles play basic and influential roles in our everyday lives as well. Whether the concept of entropy, the moments of inertia, the natural frequency, the Coriolis acceleration, or the electromotive force, the roles and effects of these phenomena are the same in a system designed by an engineer or created by nature. This shows that learning about these engineering concepts helps us to understand why certain things happen or behave the way they do, and that these concepts are not strange phenomena invented by individuals only for their own use; rather, they are part of our everyday physical and natural world, put to our benefit by engineers and scientists. Learning about these principles might also help attract more qualified and interested high school and college students to the engineering fields. Each chapter of this book explains one of these principles through examples, discussions, and at times, simple equations. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Memory Consistency and Cache Coherence

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Many modern computer systems and most multicore chips (chip multiprocessors) support shared memory in hardware. In a shared memory system, each of the processor cores may read and write to a single shared address space. For a shared memory machine, the memory consistency model defines the architecturally visible behavior of its memory system. Consistency definitions provide rules about loads and stores (or memory reads and writes) and how they act upon memory. As part of supporting a memory consistency model, many machines also provide cache coherence protocols that ensure that multiple cached copies of data are kept up-to-date. The goal of this primer is to provide readers with a basic understanding of consistency and coherence. This understanding includes both the issues that must be solved as well as a variety of solutions. We present both high-level concepts as well as specific, concrete examples from real-world systems. Table of Contents: Preface / Introduction to Consistency and Coherence / Coherence Basics / Memory Consistency Motivation and Sequential Consistency / Total Store Order and the x86 Memory Model / Relaxed Memory Consistency / Coherence Protocols / Snooping Coherence Protocols / Directory Coherence Protocols / Advanced Topics in Coherence / Author Biographies View full abstract»
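A standard way to see what a consistency model permits is a litmus test. The toy interpreter below (our own sketch, not code from the primer) enumerates every sequentially consistent interleaving of the classic "store buffering" test and shows that the outcome r1 = r2 = 0 never occurs under sequential consistency, even though TSO machines such as x86 can produce it via store buffers:

```python
from itertools import permutations

# Store-buffering litmus test: thread 1 stores x then loads y;
# thread 2 stores y then loads x. Memory starts as x = y = 0.
T1 = [("store", "x", 1), ("load", "y", "r1")]
T2 = [("store", "y", 1), ("load", "x", "r2")]

def sc_outcomes():
    """All (r1, r2) results reachable under sequential consistency."""
    outcomes = set()
    ops = [(0, 0), (0, 1), (1, 0), (1, 1)]  # (thread, op index)
    for order in permutations(ops):
        # Keep only interleavings that preserve each thread's program order.
        if order.index((0, 0)) < order.index((0, 1)) and \
           order.index((1, 0)) < order.index((1, 1)):
            mem, regs = {"x": 0, "y": 0}, {}
            for t, i in order:
                kind, addr, val = (T1 if t == 0 else T2)[i]
                if kind == "store":
                    mem[addr] = val
                else:
                    regs[val] = mem[addr]
            outcomes.add((regs["r1"], regs["r2"]))
    return outcomes

print(sorted(sc_outcomes()))  # (0, 0) never appears under SC
```

Under SC some store must precede both loads in every total order, so at least one load observes a 1; relaxed models break exactly this guarantee.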

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Proxemic Interactions: From Theory to Practice

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In the everyday world, much of what we do as social beings is dictated by how we perceive and manage our interpersonal space. This is called proxemics. At its simplest, people naturally correlate physical distance to social distance. We believe that people’s expectations of proxemics can be exploited in interaction design to mediate their interactions with devices (phones, tablets, computers, appliances, large displays) contained within a small ubiquitous computing ecology. Just as people expect increasing engagement and intimacy as they approach others, so should they naturally expect increasing connectivity and interaction possibilities as they bring themselves and their devices in close proximity to one another. This is called Proxemic Interactions. This book concerns the design of proxemic interactions within such future proxemic-aware ecologies. It imagines a world of devices that have fine-grained knowledge of nearby people and other devices—how they move into range, their precise distance, their identity, and even their orientation—and how such knowledge can be exploited to design interaction techniques. The first part of this book concerns theory. After introducing proxemics, we operationalize proxemics for ubicomp interaction via the Proxemic Interactions framework that designers can use to mediate people’s interactions with digital devices. The framework, in part, identifies five key dimensions of proxemic measures (distance, orientation, movement, identity, and location) to consider when designing proxemic-aware ubicomp systems. The second part of this book applies this theory to practice via three case studies of proxemic-aware systems that react continuously to people’s and devices’ proxemic relationships. The case studies explore the application of proxemics in small-space ubicomp ecologies by considering first person-to-device, then device-to-device, and finally person-to-person and device-to-device proxemic relationships. We also offer a critical perspective on proxemic interactions in the form of “dark patterns,” where knowledge of proxemics may (and likely will) be easily exploited to the detriment of the user. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    PSpice for Filters and Transmission Lines

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In this book, PSpice for Filters and Transmission Lines, we examine a range of active and passive filters where each design is simulated using the latest Cadence Orcad V10.5 PSpice capture software. These filters cannot match the very high order digital signal processing (DSP) filters considered in PSpice for Digital Signal Processing, but nevertheless these filters have many uses. The active filters considered were designed using Butterworth and Chebychev approximation loss functions rather than using the ‘cookbook approach’, so that the final design will meet a given specification in an exacting manner. Switched-capacitor filter circuits are examined and here we see how useful PSpice/Probe is in demonstrating how these filters filter, as it were. Two-port networks are discussed as an introduction to transmission lines and, using a series of problems, we demonstrate quarter-wave and single-stub matching. The concept of time domain reflectometry as a fault-location tool on transmission lines is then examined. In the last chapter we discuss the technique of importing and exporting speech signals into a PSpice schematic using a tailor-made program, Wav2ascii. This is a novel technique that greatly extends the simulation boundaries of PSpice. Various digital circuits are also examined at the end of this chapter to demonstrate the use of the bus structure and other techniques. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designed Technologies for Healthy Aging 

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Designed Technologies for Healthy Aging identifies and presents a variety of contemporary technologies to support older adults’ abilities to perform everyday activities. Efforts of industry, laboratories, and learning institutions are documented under four major categories: social connections, independent self care, healthy home and active lifestyle. The book contains well-documented and illustrative recent examples of designed technologies—ranging from wearable devices, to mobile applications, to assistive robots— on the broad areas of design and computation, including industrial design, interaction design, graphic design, human-computer interaction, software engineering, and artificial intelligence. Table of Contents: Acknowledgments / Introduction / Social Connections / Independent Self Care / Healthy Home / Active Lifestyle / Conclusion / Contributors / Companies, Laboratories and Institutions / About the Author View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Higher-Order FDTD Schemes for Waveguides and Antenna Structures

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This publication provides comprehensive and systematically organized coverage of higher-order finite-difference time-domain (FDTD) schemes, demonstrating their potential role as a powerful modeling tool in computational electromagnetics. Special emphasis is placed on the analysis of contemporary waveguide and antenna structures. Acknowledged as a significant breakthrough in the evolution of the original Yee algorithm, higher-order FDTD operators remain the subject of ongoing scientific research. Among their indisputable merits, one can distinguish the enhanced levels of accuracy even for coarse grid resolutions, the fast convergence rates, and the adjustable stability. In fact, as the fabrication standards of modern systems get stricter, it is apparent that such properties become very appealing for the accomplishment of elaborate and credible designs. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and Social Justice

    Copyright Year: 2008

    Morgan and Claypool eBooks

    The profession of engineering in the United States has historically served the status quo, feeding an ever-expanding materialistic and militaristic culture, remaining relatively unresponsive to public concerns, and without significant pressure for change from within. This book calls upon engineers to cultivate a passion for social justice and peace and to develop the skill and knowledge set needed to take practical action for change within the profession. Because many engineers do not receive education and training that support the kinds of critical thinking, reflective decision-making, and effective action necessary to achieve social change, engineers concerned with social justice can feel powerless and isolated as they remain complicit. Utilizing techniques from radical pedagogies of liberation and other movements for social justice, this book presents a roadmap for engineers to become empowered and engage one another in a process of learning and action for social justice and peace. Table of Contents: What Do We Mean by Social Justice? / Mindsets in Engineering / Engineering and Social Injustice / Toward a More Socially Just Engineering / Turning Knowledge into Action: Strategies for Change / Parting Lessons for the Continuing Struggle View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Communication Networks:A Concise Introduction

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book results from many years of teaching an upper division course on communication networks in the EECS department at the University of California, Berkeley. It is motivated by the perceived need for an easily accessible textbook that puts emphasis on the core concepts behind current and next generation networks. After an overview of how today's Internet works and a discussion of the main principles behind its architecture, we discuss the key ideas behind Ethernet, WiFi networks, routing, internetworking and TCP. To make the book as self-contained as possible, brief discussions of probability and Markov chain concepts are included in the appendices. This is followed by a brief discussion of mathematical models that provide insight into the operations of network protocols. Next, the main ideas behind the new generation of wireless networks based on WiMAX and LTE, and the notion of QoS are presented. A concise discussion of the physical layer technologies underlying various networks is also included. Finally, a sampling of topics is presented that may have significant influence on the future evolution of networks, including overlay networks like content delivery and peer-to-peer networks, sensor networks, distributed algorithms, Byzantine agreement and source compression. Table of Contents: The Internet / Principles / Ethernet / WiFi / Routing / Internetworking / Transport / Models / WiMAX & LTE / QOS / Physical Layer / Additional Topics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Informing Chemical Engineering Decisions with Data, Research, and Government Resources

    Copyright Year: 2013

    Morgan and Claypool eBooks

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Exploring Context in Information Behavior: Seeker, Situation, Surroundings, and Shared Identities

    Copyright Year: 2017

    Morgan and Claypool eBooks

    The field of human information behavior runs the gamut of processes from the realization of a need or gap in understanding, to the search for information from one or more sources to fill that gap, to the use of that information to complete a task at hand or to satisfy a curiosity, as well as other behaviors such as avoiding information or finding information serendipitously. Designers of mechanisms, tools, and computer-based systems to facilitate this seeking and search process often lack a full knowledge of the context surrounding the search. This context may vary depending on the job or role of the person; individual characteristics such as personality, domain knowledge, age, gender, perception of self, etc.; the task at hand; the source and the channel and their degree of accessibility and usability; and the relationship that the seeker shares with the source. Yet researchers have yet to agree on what context really means. While there have been various research studies incorporating context, and biennial conferences on context in information behavior, there is still no clear definition of what context is, what its boundaries are, and what elements and variables comprise context. In this book, we look at the many definitions of and the theoretical and empirical studies on context, and I attempt to map the conceptual space of context in information behavior. I propose theoretical frameworks to map the boundaries, elements, and variables of context. I then discuss how to incorporate these frameworks and variables in the design of research studies on context. We then arrive at a unified definition of context. This book should provide designers of search systems with a better understanding of context as they seek to meet the needs and demands of information seekers. 
It will be an important resource for researchers in Library and Information Science, especially doctoral students looking for one resource that covers an exhaustive range of the most current literature related to context, the best selection of classics, and a synthesis of these into theoretical frameworks and a unified definition. The book should help to move forward research in the field by clarifying the elements, variables, and views that are pertinent. In particular, the list of elements to be considered, and the variables associated with each element will be extremely useful to researchers wanting to include the influences of context in their studies. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Phase Change Memory: From Devices to Systems

    Copyright Year: 2011

    Morgan and Claypool eBooks

    As conventional memory technologies such as DRAM and Flash run into scaling challenges, architects and system designers are forced to look at alternative technologies for building future computer systems. This synthesis lecture begins by listing the requirements for a next generation memory technology and briefly surveys the landscape of novel non-volatile memories. Among these, Phase Change Memory (PCM) is emerging as a leading contender, and the authors discuss the material, device, and circuit advances underlying this exciting technology. The lecture then describes architectural solutions to enable PCM for main memories. Finally, the authors explore the impact of such byte-addressable non-volatile memories on future storage and system designs. Table of Contents: Next Generation Memory Technologies / Architecting PCM for Main Memories / Tolerating Slow Writes in PCM / Wear Leveling for Durability / Wear Leveling Under Adversarial Settings / Error Resilience in Phase Change Memories / Storage and System Design With Emerging Non-Volatile Memories View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modeling and Data Mining in Blogosphere

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book offers a comprehensive overview of the various concepts and research issues about blogs or weblogs. It introduces techniques and approaches, tools and applications, and evaluation methodologies with examples and case studies. Blogs allow people to express their thoughts, voice their opinions, and share their experiences and ideas. Blogs also facilitate interactions among individuals creating a network with unique characteristics. Through the interactions individuals experience a sense of community. We elaborate on approaches that extract communities and cluster blogs based on information of the bloggers. Open standards and low barrier to publication in Blogosphere have transformed information consumers to producers, generating an overwhelming amount of ever-increasing knowledge about the members, their environment and symbiosis. We elaborate on approaches that sift through humongous blog data sources to identify influential and trustworthy bloggers leveraging content and network information. Spam blogs or "splogs" are an increasing concern in Blogosphere and are discussed in detail with the approaches leveraging supervised machine learning algorithms and interaction patterns. We elaborate on data collection procedures, provide resources for blog data repositories, mention various visualization and analysis tools in Blogosphere, and explain conventional and novel evaluation methodologies, to help perform research in the Blogosphere. The book is supported by additional material, including lecture slides as well as the complete set of figures used in the book, and the reader is encouraged to visit the book website for the latest information. Table of Contents: Modeling Blogosphere / Blog Clustering and Community Discovery / Influence and Trust / Spam Filtering in Blogosphere / Data Collection and Evaluation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multidimensional Databases and Data Warehousing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The present book's subject is multidimensional data models and data modeling concepts as they are applied in real data warehouses. The book aims to present the most important concepts within this subject in a precise and understandable manner. The book's coverage of fundamental concepts includes data cubes and their elements, such as dimensions, facts, and measures and their representation in a relational setting; it includes architecture-related concepts; and it includes the querying of multidimensional databases. The book also covers advanced multidimensional concepts that are considered to be particularly important. This coverage includes advanced dimension-related concepts such as slowly changing dimensions, degenerate and junk dimensions, outriggers, parent-child hierarchies, and unbalanced, non-covering, and non-strict hierarchies. The book offers a principled overview of key implementation techniques that are particularly important to multidimensional databases, including materialized views, bitmap indices, join indices, and star join processing. The book ends with a chapter that presents the literature on which the book is based and offers further readings for those readers who wish to engage in more in-depth study of specific aspects of the book's subject. Table of Contents: Introduction / Fundamental Concepts / Advanced Concepts / Implementation Issues / Further Readings View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Qualitative HCI Research: Going Behind the Scenes

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Human-Computer Interaction (HCI) addresses problems of interaction design: understanding user needs to inform design, delivering novel designs that meet user needs, and evaluating new and existing designs to determine their success in meeting user needs. Qualitative methods have an essential role to play in this enterprise, particularly in understanding user needs and behaviours and evaluating situated use of technology. Qualitative methods allow HCI researchers to ask questions where the answers are more complex and interesting than "true" or "false," and may also be unexpected. In this lecture, we draw on the analogy of making a documentary film to discuss important issues in qualitative HCI research: historically, films were presented as finished products, giving the viewer little insight into the production process; more recently, there has been a trend to go behind the scenes to expose some of the painstaking work that went into creating the final cut. Similarly, in qualitative research, the essential work behind the scenes is rarely discussed. There are many "how to" guides for particular methods, but few texts that start with the purpose of a study and then discuss the important details of how to select a suitable method, how to adapt it to fit the study context, or how to deal with unexpected challenges that arise. We address this gap by presenting a repertoire of qualitative techniques for understanding user needs, practices and experiences with technology for the purpose of informing design. We also discuss practical considerations such as tactics for recruiting participants and ways of getting started when faced with a pile of interview transcripts. Our particular focus is on semi-structured qualitative studies, which occupy a space between ethnography and surveys—typically involving observations, interviews and similar methods for data gathering, and methods of analysis based on systematic coding of data. 
Just as a documentary team faces challenges that often go unreported when arranging expeditions or interviews and gathering and editing footage within time and budget constraints, so the qualitative research team faces challenges in obtaining ethical clearance, recruiting participants, analysing data, choosing how and what to report, etc. We present illustrative examples drawn from prior experience to bring to life the purpose, planning and practical considerations of doing qualitative studies for interaction design. We include takeaway checklists for planning, conducting, reporting and evaluating semi-structured qualitative studies. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Face Detection and Adaptation

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Face detection, because of its vast array of applications, is one of the most active research areas in computer vision. In this book, we review various approaches to face detection developed in the past decade, with more emphasis on boosting-based learning algorithms. We then present a series of algorithms that are empowered by the statistical view of boosting and the concept of multiple instance learning. We start by describing a boosting learning framework that is capable of handling billions of training examples. It differs from traditional bootstrapping schemes in that no intermediate thresholds need to be set during training, yet the total number of negative examples used for feature selection remains constant and focused (on the poor performing ones). A multiple instance pruning scheme is then adopted to set the intermediate thresholds after boosting learning. This algorithm generates detectors that are both fast and accurate. We then present two multiple instance learning schemes for face detection, multiple instance learning boosting (MILBoost) and winner-take-all multiple category boosting (WTA-McBoost). MILBoost addresses the uncertainty in accurately pinpointing the location of the object being detected, while WTA-McBoost addresses the uncertainty in determining the most appropriate subcategory label for multiview object detection. Both schemes can resolve the ambiguity of the labeling process and reduce outliers during training, which leads to improved detector performances. In many applications, a detector trained with generic data sets may not perform optimally in a new environment. We propose detection adaption, which is a promising solution for this problem. We present an adaptation scheme based on the Taylor expansion of the boosting learning objective function, and we propose to store the second order statistics of the generic training data for future adaptation. 
We show that with a small amount of labeled data in the new environment, the detector's performance can be greatly improved. We also present two interesting applications where boosting learning was applied successfully. The first application is face verification for filtering and ranking image/video search results on celebrities. We present boosted multi-task learning (MTL), yet another boosting learning algorithm that extends MILBoost with a graphical model. Since the available number of training images for each celebrity may be limited, learning individual classifiers for each person may cause overfitting. MTL jointly learns classifiers for multiple people by sharing a few boosting classifiers in order to avoid overfitting. The second application addresses the need for speaker detection in conference rooms. The goal is to find who is speaking, given a microphone array and a panoramic video of the room. We show that by combining audio and visual features in a boosting framework, we can determine the speaker's position very accurately. Finally, we offer our thoughts on future directions for face detection. Table of Contents: A Brief Survey of the Face Detection Literature / Cascade-based Real-Time Face Detection / Multiple Instance Learning for Face Detection / Detector Adaptation / Other Applications / Conclusions and Future Work View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Reduction of a Ship's Magnetic Field Signatures

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Decreasing the magnetic field signature of a naval vessel will reduce its susceptibility to detonating naval influence mines and the probability of a submarine being detected by underwater barriers and maritime patrol aircraft. Both passive and active techniques for reducing the magnetic signatures produced by a vessel's ferromagnetism, roll-induced eddy currents, corrosion-related sources, and stray fields are presented. Mathematical models of simple hull shapes are used to predict the levels of signature reduction that might be achieved through the use of alternate construction materials. Also, the process of demagnetizing a steel-hulled ship is presented, along with the operation of shaft-grounding systems, paints, and alternate configurations for power distribution cables. In addition, active signature reduction technologies are described, such as degaussing and deamping, which attempt to cancel the fields surrounding a surface ship or submarine rather than eliminate its source. Table of Contents: Introduction / Passive Magnetic Silencing Techniques / Active Signature Compensation / Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dependency Parsing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Dependency-based methods for syntactic parsing have become increasingly popular in natural language processing in recent years. This book gives a thorough introduction to the methods that are most widely used today. After an introduction to dependency grammar and dependency parsing, followed by a formal characterization of the dependency parsing problem, the book surveys the three major classes of parsing models that are in current use: transition-based, graph-based, and grammar-based models. It continues with a chapter on evaluation and one on the comparison of different methods, and it closes with a few words on current trends and future prospects of dependency parsing. The book presupposes a knowledge of basic concepts in linguistics and computer science, as well as some knowledge of parsing methods for constituency-based representations. Table of Contents: Introduction / Dependency Parsing / Transition-Based Parsing / Graph-Based Parsing / Grammar-Based Parsing / Evaluation / Comparison / Final Thoughts View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    RFID Security and Privacy

    Copyright Year: 2013

    Morgan and Claypool eBooks

    As a fast-evolving new area, RFID security and privacy has quickly grown from a hungry infant to an energetic teenager during recent years. Much of the exciting development in this area is summarized in this book with rigorous analyses and insightful comments. In particular, a systematic overview on RFID security and privacy is provided at both the physical and network level. At the physical level, RFID security means that RFID devices should be identified with assurance in the presence of attacks, while RFID privacy requires that RFID devices should be identified without disclosure of any valuable information about the devices. At the network level, RFID security means that RFID information should be shared with authorized parties only, while RFID privacy further requires that RFID information should be shared without disclosure of valuable RFID information to any honest-but-curious server which coordinates information sharing. Not only does this book summarize the past, but it also provides new research results, especially at the network level. Several future directions are envisioned to be promising for advancing the research in this area. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multiantenna Systems for MIMO Communications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Advanced communication scenarios demand the development of new systems where antenna theory, channel propagation and communication models are seen from a common perspective as a way to understand and optimize the system as a whole. In this context, a comprehensive multiantenna formulation for multiple-input multiple-output systems is presented with a special emphasis on the connection of the electromagnetic and communication principles. Starting from the capacity for a multiantenna system, the book reviews radiation, propagation, and communication mechanisms, paying particular attention to the vectorial, directional, and time-frequency characteristics of the wireless communication equation for low- and high-scattering environments. Based on the previous concepts, different space-time methods for diversity and multiplexing applications are discussed, multiantenna modeling is studied, and specific tools are introduced to analyze the antenna coupling mechanisms and formulate appropriate decorrelation techniques. Miniaturization techniques for closely spaced antennas are studied, and their fundamental limits and optimization strategies are reviewed. Finally, different practical multiantenna topologies for new communication applications are presented, and their main parameters discussed. A relevant feature is a collection of synthesis exercises that review the main topics of the book and introduce state-of-the-art system architectures and parameters, facilitating its use either as a textbook or as a support tool for multiantenna systems design. Table of Contents: Principles of Multiantenna Communication Systems / The Radio Channel for MIMO Communication Systems / Coding Theory for MIMO Communication Systems / Antenna Modeling for MIMO Communication Systems / Design of MPAs for MIMO Communication Systems / Design Examples and Performance Analysis of Different MPAs / References / List of Acronyms / List of Symbols / Operators and Mathematical Symbols View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sharing Network Resources

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Resource Allocation lies at the heart of network control. In the early days of the Internet the scarcest resource was bandwidth, but as the network has evolved to become an essential utility in the lives of billions, the nature of the resource allocation problem has changed. This book attempts to describe the facets of resource allocation that are most relevant to modern networks. It is targeted at graduate students and researchers who have an introductory background in networking and who desire to internalize core concepts before designing new protocols and applications. We start from the fundamental question: what problem does network resource allocation solve? This leads us, in Chapter 1, to examine what it means to satisfy a set of user applications that have different requirements of the network, and to problems in Social Choice Theory. We find that while capturing these preferences in terms of utility is clean and rigorous, there are significant limitations to this choice. Chapter 2 focuses on sharing divisible resources such as links and spectrum. Both of these resources are somewhat atypical -- a link is most accurately modeled as a queue in our context, but this leads to the analytical intractability of queueing theory, and spectrum allocation methods involve dealing with interference, a poorly understood phenomenon. Chapters 3 and 4 are introductions to two allocation workhorses: auctions and matching. In these chapters we allow the users to game the system (i.e., to be strategic), but don't allow them to collude. In Chapter 5, we relax this restriction and focus on collaboration. Finally, in Chapter 6, we discuss the theoretical yet fundamental issue of stability. Here, our contribution is mostly on making a mathematically abstruse subdiscipline more accessible without losing too much generality. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Logic

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book is a gentle but rigorous introduction to formal logic. It is intended primarily for use at the college level. However, it can also be used for advanced secondary school students, and it can be used at the start of graduate school for those who have not yet seen the material. The approach to teaching logic used here emerged from more than 20 years of teaching logic to students at Stanford University and from teaching logic to tens of thousands of others via online courses on the World Wide Web. The approach differs from that taken by other books in logic in two essential ways, one having to do with content, the other with form. Like many other books on logic, this one covers logical syntax and semantics and proof theory plus induction. However, unlike other books, this book begins with Herbrand semantics rather than the more traditional Tarskian semantics. This approach makes the material considerably easier for students to understand and leaves them with a deeper understanding of what logic is all about. The primary content difference concerns the semantics of the logic that is taught. In addition to this text, there are online exercises (with automated grading), online logic tools and applications, online videos of lectures, and an online forum for discussion. They are available at logic.stanford.edu/intrologic/. Table of Contents: Introduction / Propositional Logic / Propositional Proofs / Propositional Resolution / Satisfiability / Herbrand Logic / Herbrand Logic Proofs / Resolution / Induction / First Order Logic View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Interaction for Visualization

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Visualization has become a valuable means for data exploration and analysis. Interactive visualization combines expressive graphical representations and effective user interaction. Although interaction is an important component of visualization approaches, much of the visualization literature tends to pay more attention to the graphical representation than to interaction. The goal of this work is to strengthen the interaction side of visualization. Based on a brief review of general aspects of interaction, we develop an interaction-oriented view on visualization. This view comprises five key aspects: the data, the tasks, the technology, the human, as well as the implementation. Picking up these aspects individually, we elaborate several interaction methods for visualization. We introduce a multi-threading architecture for efficient interactive exploration. We present interaction techniques for different types of data (e.g., multivariate data, spatio-temporal data, graphs) and different visualization tasks (e.g., exploratory navigation, visual comparison, visual editing). With respect to technology, we illustrate approaches that utilize modern interaction modalities (e.g., touch, tangibles, proxemics) as well as classic ones. While the human is important throughout this work, we also consider automatic methods to assist the interactive part. In addition to solutions for individual problems, a major contribution of this work is the overarching view of interaction in visualization as a whole. This includes a critical discussion of interaction, the identification of links between the key aspects of interaction, and the formulation of research topics for future work with a focus on interaction. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automatic Parallelization: An Overview of Fundamental Compiler Techniques

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and discuss transformations that expose parallelism to target shared memory multicore and vector processors. We then discuss some problems that arise when parallelizing programs for execution on distributed memory machines. Finally, we conclude with an overview of solving Diophantine equations and suggestions for further readings in the topics of this book to enable the interested reader to delve deeper into the field. Table of Contents: Introduction and overview / Dependence analysis, dependence graphs and alias analysis / Program parallelization / Transformations to modify and eliminate dependences / Transformation of iterative and recursive constructs / Compiling for distributed memory machines / Solving Diophantine equations / A guide to further reading View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    What Is Global Engineering Education For?: The Making of International Educators, Part I

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Global engineering offers the seductive image of engineers figuring out how to optimize work through collaboration and mobility. Its biggest challenge to engineers, however, is more fundamental and difficult: to better understand what they know and value qua engineers and why. This volume reports an experimental effort to help sixteen engineering educators produce "personal geographies" describing what led them to make risky career commitments to international and global engineering education. The contents of their diverse trajectories stand out in extending far beyond the narrower image of producing globally-competent engineers. Their personal geographies repeatedly highlight experiences of incongruence beyond home countries that provoked them to see themselves and understand their knowledge differently. The experiences were sufficiently profound to motivate them to design educational experiences that could provoke engineering students in similar ways. For nine engineers, gaining new international knowledge challenged assumptions that engineering work and life are limited to purely technical practices, compelling explicit attention to broader value commitments. For five non-engineers and two hybrids, gaining new international knowledge fueled ambitions to help engineering students better recognize and critically examine the broader value commitments in their work. A background chapter examines the historical emergence of international engineering education in the United States, and an epilogue explores what it might take to integrate practices of critical self-analysis more systematically in the education and training of engineers. Two appendices and two online supplements describe the unique research process that generated these personal geographies, especially the workshop at the U.S. National Academy of Engineering in which authors were prohibited from participating in discussions of their manuscripts. 
Table of Contents: Communicating Across Cultures: Humanities in the International Education of Engineers (Bernd Widdig) / Linking Language Proficiency and the Professions (Michael Nugent) / Language, Life, and Pathways to Global Competency for Engineers (and Everyone Else) (Phil McKnight) / Bridging Two Worlds (John M. Grandin) / Opened Eyes: From Moving Up to Helping Students See (Gayle G. Elliott) / What is Engineering for? A Search for Engineering beyond Militarism and Free-markets (Juan Lucena) / Location, Knowledge, and Desire: From Two Conservatisms to Engineering Cultures and Countries (Gary Lee Downey) / Epilogue - Beyond Global Competence: Implications for Engineering Pedagogy (Gary Lee Downey) View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Computational Electronics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Computational Electronics is devoted to state-of-the-art numerical techniques and physical models used in the simulation of semiconductor devices from a semi-classical perspective. Computational electronics, as a part of the general Technology Computer Aided Design (TCAD) field, has become increasingly important as the cost of semiconductor manufacturing has grown exponentially, with a concurrent need to reduce the time from design to manufacture. The motivation for this volume is the need within the modeling and simulation community for a comprehensive text which spans basic drift-diffusion modeling, through energy balance and hydrodynamic models, and finally particle based simulation. One unique feature of this book is a specific focus on numerical examples, particularly the use of commercially available software in the TCAD community. The concept for this book originated from a first year graduate course on computational electronics, taught now for several years, in the Electrical Engineering Department at Arizona State University. Numerous exercises and projects were derived from this course and have been included. The prerequisite knowledge is a fundamental understanding of basic semiconductor physics and the physical models for various device technologies such as pn diodes, bipolar junction transistors, and field effect transistors. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing Asynchronous Circuits using NULL Convention Logic (NCL)

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Designing Asynchronous Circuits using NULL Convention Logic (NCL) begins with an introduction to asynchronous (clockless) logic in general, and then focuses on delay-insensitive asynchronous logic design using the NCL paradigm. The book details design of input-complete and observable dual-rail and quad-rail combinational circuits, and then discusses implementation of sequential circuits, which require datapath feedback. Next, throughput optimization techniques are presented, including pipelining, embedding registration, early completion, and NULL cycle reduction. Subsequently, low-power design techniques, such as wavefront steering and Multi-Threshold CMOS (MTCMOS) for NCL, are discussed. The book culminates with a comprehensive design example of an optimized Greatest Common Divisor circuit. Readers should have prior knowledge of basic logic design concepts, such as Boolean algebra and Karnaugh maps. After studying this book, readers should have a good understanding of the differences between asynchronous and synchronous circuits, and should be able to design arbitrary NCL circuits, optimized for area, throughput, and power. Table of Contents: Introduction to Asynchronous Logic / Overview of NULL Convention Logic (NCL) / Combinational NCL Circuit Design / Sequential NCL Circuit Design / NCL Throughput Optimization / Low-Power NCL Design / Comprehensive NCL Design Example View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Information Architecture: The Design and Integration of Information Spaces

    Copyright Year: 2017

    Morgan and Claypool eBooks

<p>Information Architecture is about organizing and simplifying information, designing and integrating information spaces/systems, and creating ways for people to find and interact with information content. Its goal is to help people understand and manage information and make the right decisions accordingly. This updated and revised edition of the book looks at integrated information spaces in the web context and beyond, with a focus on putting theories and principles into practice.</p><p>In the ever-changing social, organizational, and technological contexts, information architects not only design individual information spaces (e.g., websites, software applications, and mobile devices), but also tackle strategic aggregation and integration of multiple information spaces across websites, channels, modalities, and platforms. Not only do they create predetermined navigation pathways, but they also provide tools and rules for people to organize information on their own and get connected with others.</p><p>Information architects work with multi-disciplinary teams to determine the user experience strategy based on user needs and business goals, and make sure the strategy gets carried out by following the user-centered design (UCD) process via close collaboration with others. Drawing on the authors’ extensive experience as HCI researchers, User Experience Design practitioners, and Information Architecture instructors, this book provides a balanced view of the IA discipline by applying theories, design principles, and guidelines to IA and UX practices. It also covers advanced topics such as iterative design, UX decision support, and global and mobile IA considerations. Major revisions include moving away from a web-centric view toward multi-channel, multi-device experiences. Concepts such as responsive design, emerging design principles, and user-centered methods such as Agile, Lean UX, and Design Thinking are discussed and related to IA processes and practices.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sparse Representations for Radar with MATLAB® Examples

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Although the field of sparse representations is relatively new, research activities in academic and industrial research labs are already producing encouraging results. The sparse signal or parameter model motivated several researchers and practitioners to explore high complexity/wide bandwidth applications such as Digital TV, MRI processing, and certain defense applications. The potential signal processing advancements in this area may influence radar technologies. This book presents the basic mathematical concepts along with a number of useful MATLAB® examples to emphasize the practical implementations both inside and outside the radar field. Table of Contents: Radar Systems: A Signal Processing Perspective / Introduction to Sparse Representations / Dimensionality Reduction / Radar Signal Processing Fundamentals / Sparse Representations in Radar View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Continuum Biomechanics

    Copyright Year: 2008

    Morgan and Claypool eBooks

This book is concerned with the study of continuum mechanics applied to biological systems, i.e., continuum biomechanics. This vast and exciting subject allows description of phenomena ranging from when a bone may fracture due to excessive loading, to how blood behaves as both a solid and a fluid, down to how cells respond to mechanical forces that lead to changes in their behavior, a process known as mechanotransduction. We have written for senior undergraduate students and first-year graduate students in mechanical or biomedical engineering, but individuals working at biotechnology companies that deal in biomaterials or biomechanics should also find the information presented relevant and easily accessible. Table of Contents: Tensor Calculus / Kinematics of a Continuum / Stress / Elasticity / Fluids / Blood and Circulation / Viscoelasticity / Poroelasticity and Thermoelasticity / Biphasic Theory View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Mobile Agent Rendezvous Problem in the Ring

    Copyright Year: 2010

    Morgan and Claypool eBooks

Mobile agent computing is being used in fields as diverse as artificial intelligence, computational economics and robotics. Agents' ability to adapt dynamically and execute asynchronously and autonomously brings potential advantages in terms of fault-tolerance, flexibility and simplicity. This monograph focuses on studying mobile agents as modelled in distributed systems research and in particular within the framework of research performed in the distributed algorithms community. It studies the fundamental question of how to achieve rendezvous, the gathering of two or more agents at the same node of a network. Like leader election, such an operation is a useful subroutine in more general computations that may require the agents to synchronize, share information, divide up chores, etc. The work provides an introduction to the algorithmic issues raised by the rendezvous problem in the distributed computing setting. For the most part our investigation concentrates on the simplest case of two agents attempting to rendezvous on a ring network. Other situations including multiple agents, faulty nodes and other topologies are also examined. An extensive bibliography provides many pointers to related work not covered in the text. The presentation has a distinctly algorithmic, rigorous, distributed computing flavor and most results should be easily accessible to advanced undergraduate and graduate students in computer science and mathematics departments. Table of Contents: Models for Mobile Agent Computing / Deterministic Rendezvous in a Ring / Multiple Agent Rendezvous in a Ring / Randomized Rendezvous in a Ring / Other Models / Other Topologies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Fundamentals of Analysis for Talented Freshmen

    Copyright Year: 2016

    Morgan and Claypool eBooks

This book assumes the students know some of the basic facts about Calculus. We are very rigorous and expose them to the proofs and the ideas which produce them. In three chapters, this book covers the relevant number systems and the material usually found in a junior-senior advanced Calculus course. It is designed to be a one-semester course for "talented" freshmen. Moreover, it presents a way of thinking about mathematics that will make it much easier to learn more of this subject and be a good preparation for more of the undergraduate curriculum. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Semantics Empowered Web 3.0: Managing Enterprise, Social, Sensor, and Cloud-based Data and Services for Advanced Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    After the traditional document-centric Web 1.0 and user-generated content focused Web 2.0, Web 3.0 has become a repository of an ever growing variety of Web resources that include data and services associated with enterprises, social networks, sensors, cloud, as well as mobile and other devices that constitute the Internet of Things. These pose unprecedented challenges in terms of heterogeneity (variety), scale (volume), and continuous changes (velocity), as well as present corresponding opportunities if they can be exploited. Just as semantics has played a critical role in dealing with data heterogeneity in the past to provide interoperability and integration, it is playing an even more critical role in dealing with the challenges and helping users and applications exploit all forms of Web 3.0 data. This book presents a unified approach to harness and exploit all forms of contemporary Web resources using the core principles of ability to associate meaning with data through conceptual or domain models and semantic descriptions including annotations, and through advanced semantic techniques for search, integration, and analysis. It discusses the use of Semantic Web standards and techniques when appropriate, but also advocates the use of lighter weight, easier to use, and more scalable options when they are more suitable. The authors' extensive experience spanning research and prototypes to development of operational applications and commercial technologies and products guide the treatment of the material. Table of Contents: Role of Semantics and Metadata / Types and Models of Semantics / Annotation -- Adding Semantics to Data / Semantics for Enterprise Data / Semantics for Services / Semantics for Sensor Data / Semantics for Social Data / Semantics for Cloud Computing / Semantics for Advanced Applications View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Atmel AVR Microcontroller Primer: Programming and Interfacing, Second Edition

    Copyright Year: 2012

    Morgan and Claypool eBooks

This textbook provides practicing scientists and engineers a primer on the Atmel AVR microcontroller. In this second edition we highlight the popular ATmega164 microcontroller and other pin-for-pin controllers in the family with a complement of flash memory up to 128 kbytes. The second edition also adds a chapter on embedded system design fundamentals and provides extended examples on two different autonomous robots. Our approach is to provide the fundamental skills to quickly get up and operating with this internationally popular microcontroller. We cover the main subsystems aboard the ATmega164, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying hardware and software to exercise the subsystem. In all examples, we use the C programming language. We include a detailed chapter describing how to interface the microcontroller to a wide variety of input and output devices and conclude with several system level examples. Table of Contents: Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog-to-Digital Conversion / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / Embedded Systems Design View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Markov Logic: An Interface Layer for Artificial Intelligence

    Copyright Year: 2009

    Morgan and Claypool eBooks

Most subfields of computer science have an interface layer via which applications communicate with the infrastructure, and this is key to their success (e.g., the Internet in networking, the relational model in databases, etc.). So far this interface layer has been missing in AI. First-order logic and probabilistic graphical models each have some of the necessary features, but a viable interface layer requires combining both. Markov logic is a powerful new language that accomplishes this by attaching weights to first-order formulas and treating them as templates for features of Markov random fields. Most statistical models in wide use are special cases of Markov logic, and first-order logic is its infinite-weight limit. Inference algorithms for Markov logic combine ideas from satisfiability, Markov chain Monte Carlo, belief propagation, and resolution. Learning algorithms make use of conditional likelihood, convex optimization, and inductive logic programming. Markov logic has been successfully applied to problems in information extraction and integration, natural language processing, robot mapping, social networks, computational biology, and others, and is the basis of the open-source Alchemy system. Table of Contents: Introduction / Markov Logic / Inference / Learning / Extensions / Applications / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    PSpice for Digital Signal Processing

    Copyright Year: 2007

    Morgan and Claypool eBooks

PSpice for Digital Signal Processing is the last in a series of five books using Cadence Orcad PSpice version 10.5 and introduces a very novel approach to learning digital signal processing (DSP). DSP is traditionally taught using Matlab/Simulink software, which has some inherent weaknesses for students, particularly at the introductory level. The ‘plug in variables and play’ nature of these software packages can lure the student into thinking they possess an understanding they don’t actually have, because these systems produce results quickly without revealing what is going on. However, it must be said that, for advanced-level work, Matlab/Simulink really excels. In this book we start by examining basic signals, starting with sampled signals and dealing with the concept of digital frequency. The delay part, which is the heart of DSP, is explained and applied initially to simple FIR and IIR filters. We examine linear time-invariant systems starting with the difference equation and applying the z-transform to produce a range of filter types, i.e., low-pass, high-pass, and bandpass. The important concept of convolution is examined, and here we demonstrate the usefulness of the 'log' command in Probe for giving the correct display to demonstrate the 'flip n slip' method. Digital oscillators, including quadrature carrier generation, are then examined. Several filter design methods are considered, including the bilinear transform, impulse invariant, and window techniques. Included also is a treatment of the raised-cosine family of filters. A range of DSP applications are then considered, including the Hilbert transform, single sideband modulation using the Hilbert transform and quad oscillators, integrators, and differentiators. Decimation and interpolation are simulated to demonstrate the usefulness of the multi-sampling environment. Decimation is also applied in a treatment on digital receivers. Lastly, we look at some musical applications for DSP such as reverberation/echo using real-world signals imported into PSpice using the program Wav2Ascii. The zero-forcing equalizer is dealt with in a simplistic manner and illustrates the effectiveness of equalizing signals in a receiver after transmission. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Matrices in Engineering Problems

    Copyright Year: 2011

    Morgan and Claypool eBooks

This book is intended as an undergraduate text introducing matrix methods as they relate to engineering problems. It begins with the fundamentals of mathematics of matrices and determinants. Matrix inversion is discussed, with an introduction of the well known reduction methods. Equation sets are viewed as vector transformations, and the conditions of their solvability are explored. Orthogonal matrices are introduced with examples showing application to many problems requiring three dimensional thinking. The angular velocity matrix is shown to emerge from the differentiation of the 3-D orthogonal matrix, leading to the discussion of particle and rigid body dynamics. The book continues with the eigenvalue problem and its application to multi-variable vibrations. Because the eigenvalue problem requires some operations with polynomials, a separate discussion of these is given in an appendix. The example of the vibrating string is given with a comparison of the matrix analysis to the continuous solution. Table of Contents: Matrix Fundamentals / Determinants / Matrix Inversion / Linear Simultaneous Equation Sets / Orthogonal Transforms / Matrix Eigenvalue Analysis / Matrix Analysis of Vibrating Systems View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Robot Learning from Human Teachers

    Copyright Year: 2014

    Morgan and Claypool eBooks

Learning from Demonstration (LfD) explores techniques for learning a task policy from examples provided by a human teacher. The field of LfD has grown into an extensive body of literature over the past 30 years, with a wide variety of approaches for encoding human demonstrations and modeling skills and tasks. Additionally, we have recently seen a focus on gathering data from non-expert human teachers (i.e., domain experts but not robotics experts). In this book, we provide an introduction to the field with a focus on the unique technical challenges associated with designing robots that learn from naive human teachers. We begin, in the introduction, with a unification of the various terminology seen in the literature as well as an outline of the design choices one has in designing an LfD system. Chapter 2 gives a brief survey of the psychology literature that provides insights from human social learning that are relevant to designing robotic social learners. Chapter 3 walks through an LfD interaction, surveying the design choices one makes and state-of-the-art approaches in prior work. First is the choice of input, how the human teacher interacts with the robot to provide demonstrations. Next is the choice of modeling technique. Currently, there is a dichotomy in the field between approaches that model low-level motor skills and those that model high-level tasks composed of primitive actions. We devote a chapter to each of these. Chapter 7 is devoted to interactive and active learning approaches that allow the robot to refine an existing task model. And finally, Chapter 8 provides best practices for evaluation of LfD systems, with a focus on how to approach experiments with human subjects in this domain. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Pragmatic Circuits: D-C and Time Domain

    Copyright Year: 2006

    Morgan and Claypool eBooks

Pragmatic Circuits: DC and Time Domain deals primarily with circuits and how they function, beginning with a review of Kirchhoff's and Ohm's laws, analysis of d-c circuits and op-amps, and the sinusoidal steady state. The author then looks at formal circuit analysis through nodal and mesh equations. Useful theorems like Thevenin's are added to the circuits toolbox. This first of three volumes ends with a chapter on design. The two follow-up volumes in the Pragmatic Circuits series include titles on Frequency Domain and Signals and Filters. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Cardiac Tissue Engineering: Principles, Materials, and Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

Cardiac tissue engineering aims at repairing damaged heart muscle and producing human cardiac tissues for application in drug toxicity studies. This book offers a comprehensive overview of the cardiac tissue engineering strategies, including presenting and discussing the various concepts in use, research directions and applications. Essential basic information on the major components in cardiac tissue engineering, namely cell sources and biomaterials, is firstly presented to the readers, followed by a detailed description of their implementation in different strategies, broadly divided to cellular and acellular ones. In cellular approaches, the biomaterials are used to increase cell retention after implantation or as scaffolds when bioengineering the cardiac patch, in vitro. In acellular approaches, the biomaterials are used as ECM replacement for damaged cardiac ECM after MI, or, in combination with growth factors, the biomaterials assume an additional function as a depot for prolonged factor activity for the effective recruitment of repairing cells. The book also presents technological innovations aimed to improve the quality of the cardiac patches, such as bioreactor applications, stimulation patterns and prevascularization. This book could be of interest not only from an educational perspective (i.e. for graduate students), but also for researchers and medical professionals, to offer them fresh views on novel and powerful treatment strategies. We hope that the reader will find a broad spectrum of ideas and possibilities described in this book both interesting and convincing.
Table of Contents: Introduction / The Heart: Structure, Cardiovascular Diseases, and Regeneration / Cell Sources for Cardiac Tissue Engineering / Biomaterials: Polymers, Scaffolds, and Basic Design Criteria / Biomaterials as Vehicles for Stem Cell Delivery and Retention in the Infarct / Bioengineering of Cardiac Patches, In Vitro / Perfusion Bioreactors and Stimulation Patterns in Cardiac Tissue Engineering / Vascularization of Cardiac Patches / Acellular Biomaterials for Cardiac Repair / Biomaterial-based Controlled Delivery of Bioactive Molecules for Myocardial Regeneration View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mobile Platform Security

    Copyright Year: 2013

    Morgan and Claypool eBooks

Recently, mobile security has garnered considerable interest in both the research community and industry due to the popularity of smartphones. The current smartphone platforms are open systems that allow application development, also for malicious parties. To protect the mobile device, its user, and other mobile ecosystem stakeholders such as network operators, application execution is controlled by a platform security architecture. This book explores how such mobile platform security architectures work. We present a generic model for mobile platform security architectures: the model illustrates commonly used security mechanisms and techniques in mobile devices and allows a systematic comparison of different platforms. We analyze several mobile platforms using the model. In addition, this book explains hardware-security mechanisms typically present in a mobile device. We also discuss enterprise security extensions for mobile platforms and survey recent research in the area of mobile platform security. The objective of this book is to provide a comprehensive overview of the current status of mobile platform security for students, researchers, and practitioners. Table of Contents: Preface / Introduction / Platform Security Model / Mobile Platforms / Platform Comparison / Mobile Hardware Security / Enterprise Security Extensions / Platform Security Research / Conclusions / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Incomplete Data and Data Dependencies in Relational Databases

    Copyright Year: 2012

    Morgan and Claypool eBooks

The chase has long been used as a central tool to analyze dependencies and their effect on queries. It has been applied to different relevant problems in database theory such as query optimization, query containment and equivalence, dependency implication, and database schema design. Recent years have seen a renewed interest in the chase as an important tool in several database applications, such as data exchange and integration, query answering in incomplete data, and many others. It is well known that the chase algorithm might be non-terminating and thus, in order for it to find practical applicability, it is crucial to identify cases where its termination is guaranteed. Another important aspect to consider when dealing with the chase is that it can introduce null values into the database, thereby leading to incomplete data. Thus, in several scenarios where the chase is used the problem of dealing with data dependencies and incomplete data arises. This book discusses fundamental issues concerning data dependencies and incomplete data with a particular focus on the chase and its applications in different database areas. We report recent results about the crucial issue of identifying conditions that guarantee chase termination. Different database applications where the chase is a central tool are discussed with particular attention devoted to query answering in the presence of data dependencies and database schema design. Table of Contents: Introduction / Relational Databases / Incomplete Databases / The Chase Algorithm / Chase Termination / Data Dependencies and Normal Forms / Universal Repairs / Chase and Database Applications View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

Fieldwork for Healthcare: Guidance for Investigating Human Factors in Computing Systems

    Copyright Year: 2014

    Morgan and Claypool eBooks

Conducting fieldwork for investigating technology use in healthcare is a challenging undertaking, and yet there is little in the way of community support and guidance for conducting these studies. There is a need for better knowledge sharing and resources to facilitate learning. This is the second of two volumes designed as a collective graduate guidebook for conducting fieldwork in healthcare. This volume brings together thematic chapters that draw out issues and lessons learned from practical experience. Researchers who have first-hand experience of conducting healthcare fieldwork collaborated to write these chapters. This volume contains insights, tips, and tricks from studies in clinical and non-clinical environments, from hospital to home. This volume starts with an introduction to the ethics and governance procedures a researcher might encounter when conducting fieldwork in this sensitive study area. Subsequent chapters address specific aspects of conducting situated healthcare research. Chapters on readying the researcher and relationships in the medical domain break down some of the complex social aspects of this type of research. They are followed by chapters on the practicalities of collecting data and implementing interventions, which focus on domain-specific issues that may arise. Finally, we close the volume by discussing the management of impact in healthcare fieldwork. The guidance contained in these chapters enables new researchers to form their project plans and also their contingency plans in this complex and challenging domain. For more experienced researchers, it offers advice and support through familiar stories and experiences. For supervisors and teachers, it offers a source of reference and debate. Together with the first volume, Fieldwork for Healthcare: Case Studies Investigating Human Factors in Computing Systems, these books provide a substantive resource on how to conduct fieldwork in healthcare.
Table of Contents: Preface / Acknowledgments / Ethics, Governance, and Patient and Public Involvement in Healthcare / Readying the Researcher for Fieldwork in Healthcare / Establishing and Maintaining Relationships in Healthcare Fields / Practicalities of Data Collection in Healthcare Fieldwork / Healthcare Intervention Studies “In the Wild” / Impact of Fieldwork in Healthcare: Understanding Impact on Researchers, Research, Practice, and Beyond / References / Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

A Blossoming Development of Splines

    Copyright Year: 2006

    Morgan and Claypool eBooks

In this lecture, we study Bézier and B-spline curves and surfaces, mathematical representations for free-form curves and surfaces that are common in CAD systems and are used to design aircraft and automobiles, as well as in modeling packages used by the computer animation industry. Bézier/B-splines represent polynomials and piecewise polynomials in a geometric manner using sets of control points that define the shape of the surface. The primary analysis tool used in this lecture is blossoming, which gives an elegant labeling of the control points that allows us to analyze their properties geometrically. Blossoming is used to explore both Bézier and B-spline curves, and in particular to investigate continuity properties, change of basis algorithms, forward differencing, B-spline knot multiplicity, and knot insertion algorithms. We also look at triangle diagrams (which are closely related to blossoming), direct manipulation of B-spline curves, NURBS curves, and triangular and tensor product surfaces. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

The Maximum Consensus Problem: Recent Algorithmic Advances

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Outlier-contaminated data is a fact of life in computer vision. For computer vision applications to perform reliably and accurately in practical settings, the processing of the input data must be conducted in a robust manner. In this context, the maximum consensus robust criterion plays a critical role by allowing the quantity of interest to be estimated from noisy and outlier-prone visual measurements. The <i>maximum consensus problem</i> refers to the problem of optimizing the quantity of interest according to the maximum consensus criterion. This book provides an overview of the algorithms for performing this optimization. The emphasis is on the basic operation or "inner workings" of the algorithms, and on their mathematical characteristics in terms of optimality and efficiency. The applicability of the techniques to common computer vision tasks is also highlighted. By collecting existing techniques in a single article, this book aims to trigger further developments in this theoretically interesting and practically important area. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Basic Probability Theory for Biomedical Engineers

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers, and scientists at all levels of background and experience for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability, and these are all presented in the first chapter. The second chapter introduces the topic of random variables. Later chapters simply expand upon these key ideas and extend the range of application. A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology, both within the engineering community as well as the probability and statistics literature. Biomedical engineering examples are introduced throughout the text, and a large number of self-study problems are available for the reader. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sensory Organ Replacement and Repair

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The senses of human hearing and sight are often taken for granted by many individuals until they are lost or adversely affected. Millions of individuals suffer from partial or total hearing loss and millions of others have impaired vision. The technologies associated with augmenting these two human senses range from simple hearing aids to complex cochlear implants, and from (now commonplace) intraocular lenses to complex artificial corneas. The areas of human hearing and human sight will be described in detail with the associated array of technologies also described. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High Dynamic Range Video

    Copyright Year: 2008

    Morgan and Claypool eBooks

    As new displays and cameras offer enhanced color capabilities, there is a need to extend the precision of digital content. High Dynamic Range (HDR) imaging encodes images and video with higher than the normal 8-bit-per-color-channel precision, enabling representation of the complete color gamut and the full visible range of luminance. However, to realize the transition from traditional to HDR imaging, it is necessary to develop imaging algorithms that work with the high-precision data. To make such algorithms effective and feasible in practice, it is necessary to take advantage of the limitations of the human visual system by aligning the data shortcomings to those of the human eye, thus limiting storage and processing precision. Therefore, human visual perception is the key component of the solutions we discuss in this book. This book presents a complete pipeline for HDR image and video processing, from acquisition, through compression and quality evaluation, to display. At the HDR image and video acquisition stage, specialized HDR sensors or multi-exposure techniques suitable for traditional cameras are discussed. Then, we present a practical solution for calibrating pixel values in terms of photometric or radiometric quantities, which are required in some technically oriented applications. Also, we cover the problem of efficient image and video compression and encoding, either for storage or transmission purposes, including the aspect of backward compatibility with existing formats. Finally, we review existing HDR display technologies and the associated problems of image contrast and brightness adjustment. For this purpose, tone mapping is employed to accommodate HDR content to LDR devices. Conversely, the so-called inverse tone mapping is required to upgrade LDR content for display on HDR devices. We overview HDR-enabled image and video quality metrics, which are needed to verify algorithms at all stages of the pipeline. Additionally, we cover successful examples of HDR technology applications, in particular in computer graphics and computer vision. The goal of this book is to present all discussed components of the HDR pipeline, with the main focus on video. For some pipeline stages HDR video solutions are either not well established or do not exist at all, in which case we describe techniques for single HDR images. In such cases we attempt to select techniques that can be extended into the temporal domain. Whenever needed, relevant background information on human perception is given, which enables better understanding of the design choices behind the discussed algorithms and HDR equipment. Table of Contents: Introduction / Representation of an HDR Image / HDR Image and Video Acquisition / HDR Image Quality / HDR Image, Video, and Texture Compression / Tone Reproduction / HDR Display Devices / LDR2HDR: Recovering Dynamic Range in Legacy Content / HDRI in Computer Graphics / Software View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    MRTD (Multi Resolution Time Domain) Method in Electromagnetics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book presents a method that allows the use of multiresolution principles in a time-domain electromagnetic modeling technique applicable to general structures. The multiresolution time-domain (MRTD) technique, as it is often called, is presented for general basis functions. Additional techniques presented here allow the modeling of complex structures using a subcell representation that permits modeling discrete electromagnetic effects at individual equivalent grid points. This is accomplished by transforming the application of the effects at individual points in the grid into the wavelet domain. In this work, the MRTD technique is derived for a general wavelet basis using a relatively compact vector notation that both makes the technique easier to understand and illustrates the differences between MRTD basis functions. In addition, techniques such as the uniaxial perfectly matched layer (UPML) for arbitrary wavelet resolution and non-uniform gridding are presented. Using these techniques, any structure that can be simulated in Yee-FDTD can be modeled in MRTD. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Concise Introduction to Models and Methods for Automated Planning

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Planning is the model-based approach to autonomous behavior where the agent behavior is derived automatically from a model of the actions, sensors, and goals. The main challenges in planning are computational as all models, whether featuring uncertainty and feedback or not, are intractable in the worst case when represented in compact form. In this book, we look at a variety of models used in AI planning, and at the methods that have been developed for solving them. The goal is to provide a modern and coherent view of planning that is precise, concise, and mostly self-contained, without being shallow. For this, we make no attempt at covering the whole variety of planning approaches, ideas, and applications, and focus on the essentials. The target audience of the book are students and researchers interested in autonomous behavior and planning from an AI, engineering, or cognitive science perspective. Table of Contents: Preface / Planning and Autonomous Behavior / Classical Planning: Full Information and Deterministic Actions / Classical Planning: Variations and Extensions / Beyond Classical Planning: Transformations / Planning with Sensing: Logical Models / MDP Planning: Stochastic Actions and Full Feedback / POMDP Planning: Stochastic Actions and Partial Feedback / Discussion / Bibliography / Author's Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Extreme Value Theory-Based Methods for Visual Recognition

    Copyright Year: 2017

    Morgan and Claypool eBooks

    A common feature of many approaches to modeling sensory statistics is an emphasis on capturing the "average." From early representations in the brain, to highly abstracted class categories in machine learning for classification tasks, central-tendency models based on the Gaussian distribution are a seemingly natural and obvious choice for modeling sensory data. However, insights from neuroscience, psychology, and computer vision suggest an alternate strategy: preferentially focusing representational resources on the extremes of the distribution of sensory inputs. The notion of treating extrema near a decision boundary as features is not necessarily new, but a comprehensive statistical theory of recognition based on extrema is only now just emerging in the computer vision literature. This book begins by introducing the statistical Extreme Value Theory (EVT) for visual recognition. In contrast to central-tendency modeling, it is hypothesized that distributions near decision boundaries form a more powerful model for recognition tasks by focusing coding resources on data that are arguably the most diagnostic features. EVT has several important properties: strong statistical grounding, better modeling accuracy near decision boundaries than Gaussian modeling, the ability to model asymmetric decision boundaries, and accurate prediction of the probability of an event beyond our experience. The second part of the book uses the theory to describe a new class of machine learning algorithms for decision making that are a measurable advance beyond the state-of-the-art. This includes methods for post-recognition score analysis, information fusion, multi-attribute spaces, and calibration of supervised machine learning algorithms. View full abstract»
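    As a hedged illustration (not taken from the book), EVT-based score calibration commonly models the tail of scores with a Weibull distribution; assuming shape and scale parameters have already been fitted to the tail, a raw recognition score can then be mapped to a probability:

```python
import math

def weibull_cdf(x, shape, scale):
    """CDF of a Weibull distribution, a standard EVT tail model."""
    if x <= 0:
        return 0.0
    return 1.0 - math.exp(-((x / scale) ** shape))

def calibrated_confidence(score, tail_shape, tail_scale):
    """Map a raw recognition score to a probability of exceeding the
    modeled tail. The parameters are assumed to be pre-fitted;
    fitting itself (e.g., by maximum likelihood) is not shown here."""
    return weibull_cdf(score, tail_shape, tail_scale)
```

    At x equal to the scale parameter the Weibull CDF evaluates to 1 - e^(-1), regardless of the shape parameter.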

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Elastic Shape Analysis of Three-Dimensional Objects

    Copyright Year: 2017

    Morgan and Claypool eBooks

    <p>Statistical analysis of shapes of 3D objects is an important problem with a wide range of applications. This analysis is difficult for many reasons, including the fact that objects differ in both geometry and topology. In this manuscript, we narrow the problem by focusing on objects with fixed topology, say objects that are diffeomorphic to unit spheres, and develop tools for analyzing their geometries. The main challenges in this problem are to register points across objects and to perform analysis while being invariant to certain shape-preserving transformations.</p> <p>We develop a comprehensive framework for analyzing shapes of spherical objects, i.e., objects that are embeddings of a unit sphere in ℝ³, including tools for: quantifying shape differences, optimally deforming shapes into each other, summarizing shape samples, extracting principal modes of shape variability, and modeling shape variability associated with populations. An important strength of this framework is that it is elastic: it performs alignment, registration, and comparison in a single unified framework, while being invariant to shape-preserving transformations.</p> <p>The approach is essentially Riemannian in the following sense. We specify natural mathematical representations of surfaces of interest, and impose Riemannian metrics that are invariant to the actions of the shape-preserving transformations. In particular, they are invariant to reparameterizations of surfaces. While these metrics are too complicated to allow broad usage in practical applications, we introduce a novel representation, termed square-root normal fields (SRNFs), that transforms a particular invariant elastic metric into the standard L² metric. As a result, one can use standard techniques from functional data analysis for registering, comparing, and summarizing shapes. 
Specifically, this results in: pairwise registration of surfaces; computation of geodesic paths encoding optimal deformations; computation of Karcher means and covariances under the shape metric; tangent Principal Component Analysis (PCA) and extraction of dominant modes of variability; and finally, modeling of shape variability using wrapped normal densities.</p> <p>These ideas are demonstrated using two case studies: the analysis of surfaces denoting human bodies in terms of shape and pose variability; and the clustering and classification of the shapes of subcortical brain structures for use in medical diagnosis.</p> <p>This book develops these ideas without assuming advanced knowledge in differential geometry and statistics. We summarize some basic tools from differential geometry in the appendices, and introduce additional concepts and terminology as needed in the individual chapters.</p> View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Numerical Methods for Linear Complementarity Problems in Physics-Based Animation

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Linear complementarity problems (LCPs) have for many years been used in physics-based animation to model contact forces between rigid bodies in contact. More recently, LCPs have found their way into the realm of fluid dynamics. Here, LCPs are used to model boundary conditions with fluid-wall contacts. LCPs have also started to appear in deformable models and granular simulations. There is an increasing need for numerical methods to solve the resulting LCPs with all these new applications. This book provides a numerical foundation for such methods, especially suited for use in computer graphics. This book is mainly intended for a researcher/Ph.D. student/post-doc/professor who wants to study the algorithms and do more work/research in this area. Programmers might have to invest some time brushing up on math skills, for this we refer to Appendices A and B. The reader should be familiar with linear algebra and differential calculus. We provide pseudo code for all the numerical methods, which should be comprehensible by any computer scientist with rudimentary programming skills. The reader can find an online supplementary code repository, containing Matlab implementations of many of the core methods covered in these notes, as well as a few Python implementations [Erleben, 2011]. View full abstract»
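    As an illustrative sketch (simplified from what such a book would cover in full), projected Gauss-Seidel (PGS) is a standard iterative LCP solver in physics-based animation: it finds z ≥ 0 with Az + b ≥ 0 and z·(Az + b) = 0 by sweeping over the variables and clamping each update at zero. Fixed iteration count and dense matrix storage here are simplifications for illustration:

```python
def pgs_solve(A, b, iterations=200):
    """Projected Gauss-Seidel for the LCP:
    find z >= 0 such that w = A z + b >= 0 and z . w = 0.
    A is a square matrix (list of rows) with nonzero diagonal."""
    n = len(b)
    z = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            # Residual of row i at the current iterate.
            r = b[i] + sum(A[i][j] * z[j] for j in range(n))
            # Gauss-Seidel step, projected onto z_i >= 0.
            z[i] = max(0.0, z[i] - r / A[i][i])
    return z
```

    For A = [[2,0],[0,2]] and b = [-4, 1], the solver returns z = [2, 0]: the first constraint is active (w₀ = 0) while the second is inactive (z₁ = 0, w₁ = 1 > 0).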

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mining Latent Entity Structures

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The "big data" era is characterized by an explosion of information in the form of digital data collections, ranging from scientific knowledge, to social media, news, and everyone's daily life. Examples of such collections include scientific publications, enterprise logs, news articles, social media, and general web pages. Valuable knowledge about multi-typed entities is often hidden in the unstructured or loosely structured, interconnected data. Mining latent structures around entities uncovers hidden knowledge such as implicit topics, phrases, entity roles and relationships. In this monograph, we investigate the principles and methodologies of mining latent entity structures from massive unstructured and interconnected data. We propose a text-rich information network model for modeling data in many different domains. This leads to a series of new principles and powerful methodologies for mining latent structures, including (1) latent topical hierarchy, (2) quality topical phrases, ( ) entity roles in hierarchical topical communities, and (4) entity relations. This book also introduces applications enabled by the mined structures and points out some promising research directions. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advanced Circuit Simulation using Multisim Workbench

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Multisim is now the de facto standard for circuit simulation. It is a SPICE-based circuit simulator which combines analog, discrete-time, and mixed-mode circuits. In addition, it is the only simulator which incorporates microcontroller simulation in the same environment. It also includes a tool for printed circuit board design. Advanced Circuit Simulation Using Multisim Workbench is a companion book to Circuit Analysis Using Multisim, published by Morgan & Claypool in 2011. This new book covers advanced analyses and the creation of models and subcircuits. It also includes coverage of transmission lines, the special elements which are used to connect components in PCBs and integrated circuits. Finally, it includes a description of Ultiboard, the tool for PCB creation from a circuit description in Multisim. Both books completely cover most of the important features available for a successful circuit simulation with Multisim. Table of Contents: Models and Subcircuits / Transmission Line / Other Types of Analyses / Simulating Microcontrollers / PCB Design With Ultiboard View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Microcontroller Education: Do It Yourself, Reinvent the Wheel, Code to Learn

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Microcontroller education has experienced tremendous change in recent years. This book attempts to keep pace with the most recent technology while holding an opposing attitude to the <i>No Need to Reinvent the Wheel</i> philosophy. The chosen strategies are in agreement with the employment of today's flexible and low-cost <i>Do-It-Yourself (DIY)</i> microcontroller hardware, along with an embedded C programming approach able to be adapted to different hardware and software development platforms. Modern embedded C compilers employ built-in features for keeping programs short and manageable and, hence, speeding up the development process. However, those features eliminate the reusability of the source code among diverse systems. The recommended programming approach relies on the motto <i>Code More to Learn Even More</i>, and directs the reader toward low-level accessibility of the microcontroller device. The examples addressed herein are designed to meet the demands of the <i>Electrical & Electronic Engineering</i> discipline, where the microcontroller learning processes definitely bear the major responsibility. The programming strategies are in line with the two virtues of the C programming language, that is, the adaptability of the source code and the low-level accessibility of the hardware system. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Laplacian Spectral Distances and Kernels: Theory, Computation, and Applications

    Copyright Year: 2017

    Morgan and Claypool eBooks

    <p>In geometry processing and shape analysis, several applications have been addressed through the properties of the Laplacian spectral kernels and distances, such as commute time, biharmonic, diffusion, and wave distances.</p> <p>Within this context, this book is intended to provide a common background on the definition and computation of the Laplacian spectral kernels and distances for geometry processing and shape analysis. To this end, we define a unified representation of the isotropic and anisotropic discrete Laplacian operator on surfaces and volumes; then, we introduce the associated differential equations, i.e., the harmonic equation, the Laplacian eigenproblem, and the heat equation. Filtering the Laplacian spectrum, we introduce the Laplacian spectral distances, which generalize the commute-time, biharmonic, diffusion, and wave distances, and their discretization in terms of the Laplacian spectrum. As main applications, we discuss the design of smooth functions and the Laplacian smoothing of noisy scalar functions.</p> <p>All the reviewed numerical schemes are discussed and compared in terms of robustness, approximation accuracy, and computational cost, thus supporting the reader in the selection of the most appropriate with respect to shape representation, computational resources, and target application.</p> View full abstract»
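    As a hedged sketch of one of the distances the abstract mentions (illustrative, not reproduced from the book): once the Laplacian eigenvalues λ_k and eigenvectors φ_k are available, the diffusion distance at time t between two vertices follows directly from the spectral formula d_t(i,j)² = Σ_k e^(-2λ_k t) (φ_k(i) - φ_k(j))²:

```python
import math

def diffusion_distance(evals, evecs, i, j, t):
    """Diffusion distance between vertices i and j from a (possibly
    truncated) set of Laplacian eigenpairs.
    evals: eigenvalues; evecs: eigenvectors, each a list over vertices."""
    return math.sqrt(sum(math.exp(-2.0 * lam * t) * (phi[i] - phi[j]) ** 2
                         for lam, phi in zip(evals, evecs)))
```

    The constant eigenvector (eigenvalue 0) contributes nothing, and contributions of higher-frequency eigenpairs decay as t grows, so the distance shrinks with diffusion time.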

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Resistance Spot Welding: Fundamentals and Applications for the Automotive Industry

    Copyright Year: 2017

    Morgan and Claypool eBooks

    The early chapters of this book provide thorough coverage of resistance spot welding fundamentals and principles. Topics covered include lobe and current range curves, contact resistance vs. electrode force, dynamic resistance, heat balance, nugget growth, etc. Equipment issues such as machine types, power supplies, and electrodes are addressed. Subsequent chapters focus on specific spot welding challenges to modern automotive manufacturing. Approaches to welding modern materials including advanced high strength steels, coated steels, and aluminum alloys are covered in much detail. The final chapters focus on many common production and quality control issues, such as electrode wear, monitoring and testing, computational modeling, and welding codes. The overall goal of the book is to provide a comprehensive resource for automotive engineers and technicians who work with modern spot welding equipment and automotive materials. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Electronically Scanned Arrays

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Scanning arrays present the radar or communications engineer with the ultimate in antenna flexibility. They also present a multitude of new opportunities and new challenges that need to be addressed. In order to describe the needs for scanned array development, this book begins with a brief discussion of the history that led to present array antennas. This text is a compact but comprehensive treatment of the scanned array, from the underlying basis for array pattern behavior to the engineering choices leading to successful design. The book describes the scanned array in terms of radiation from apertures and wire antennas and introduces the effects resulting directly from scanning, including beam broadening, impedance mismatch, gain reduction, and pattern squint, as well as the effects of array periodicity, including grating and quantization lobes and array blindness. The text also presents the engineering tools for improving pattern control and array efficiency, including lattice selection, subarray technology, and pattern synthesis. Equations and figures quantify the phenomena being described and provide the reader with the tools to trade off various performance features. The discussions proceed beyond the introductory material to the state of the art in modern array design. Contents: Basic Principles and Applications of Array Antennas / Element Coupling Effects in Array Antennas / Array Pattern Synthesis / Subarray Techniques for Limited Field of View and Wide Band Applications View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    New Prospects of Integrating Low Substrate Temperatures with Scaling-Sustained Device Architectural Innovation

    Copyright Year: 2016

    Morgan and Claypool eBooks