Morgan and Claypool Synthesis Digital Library

829 Results Returned

  • Chronobioengineering: Introduction to Biological Rhythms with Applications, Volume 1

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book represents the first in a two-volume set on biological rhythms. This volume focuses on supporting the claim that biological rhythms are universal and essential characteristics of living organisms, critical for the proper functioning of any living system. The author begins by examining the potential reasons for the evolution of biological rhythms: (1) the need for complex, goal-oriented devices to control the timing of their activities; (2) the inherent tendency of feedback control systems to oscillate; and (3) the existence of stable and powerful geophysical cycles to which all organisms must adapt. To investigate the second reason, the author enlists the help of biomedical engineering students to develop mathematical models of various biological systems. One such model involves a typical endocrine feedback system. By adjusting various model parameters, it was found that creating an oscillation in any component of the model generated a rhythmic cascade that made the entire system oscillate. This same approach was used to show how daily light/dark cycles could cascade rhythmic patterns throughout ecosystems and within organisms. Following up on these results, the author discusses how the twin requirements of internal synchronization (precise temporal order necessary for the proper functioning of organisms as complex, goal-oriented devices) and external synchronization (aligning organisms' behavior and physiology with geophysical cycles) supported the evolution of biological clocks. The author then investigates the clock systems that evolved using both conceptual and mathematical models, with the assistance of Dr. Bahrad Sokhansanj, who contributes a chapter on mathematical formulations and models of rhythmic phenomena. With the ubiquity of biological rhythms established, the author suggests a new classification system: the F4LM approach (Function; Frequency; waveForm; Flexibility; Level of biological system expressing rhythms; and Mode of rhythm generation) to investigate biological rhythms. This approach is first used on the more familiar cardiac cycle and then on neural rhythms as exemplified and measured by the electroencephalogram. During the process of investigating neural cycles, the author finds yet another reason for the evolution of biological rhythms: physical constraints, such as those imposed upon long-distance neural signaling. In addition, a common theme emerges of a select number of autorhythmic biological oscillators imposing coherent rhythmicity on a larger network or system. During the course of the volume, the author uses a variety of observations, models, experimental results, and arguments to support the original claim of the importance and universality of biological rhythms. In Volume 2, the author will move from the establishment of the critical nature of biological rhythms to how these phenomena may be used to improve human health, well-being, and productivity. In a sense, Volume 1 focuses on the chronobio aspect of chronobioengineering while Volume 2 investigates methods of translating this knowledge into applications, the engineering aspect of chronobioengineering.
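
    The feedback-oscillation argument invites a quick numerical experiment. The sketch below is an illustration in that spirit rather than the author's model: a delayed negative-feedback loop of the sort found in endocrine axes, with every parameter value chosen purely for demonstration.

      # Illustrative delayed negative-feedback loop (Python). A sharp
      # response plus a long feedback delay yields sustained oscillation;
      # all parameter values here are assumptions for demonstration.
      DELAY = 20       # feedback delay, in time steps
      DECAY = 0.05     # fractional hormone clearance per step
      SETPOINT = 3.0   # level at which secretion is half-maximal

      history = [0.0] * DELAY   # circular buffer of past hormone levels
      level = 0.0
      for t in range(300):
          delayed = history[t % DELAY]                         # level from t - DELAY
          secretion = 1.0 / (1.0 + (delayed / SETPOINT) ** 8)  # negative feedback
          level += secretion - DECAY * level                   # secretion minus clearance
          history[t % DELAY] = level
          if t % 25 == 0:
              print(f"t={t:3d}  level={level:5.2f}")

    Shortening DELAY to one step lets the same loop ring down to a steady state, which is the contrast behind the claim that feedback plus delay is a natural source of rhythm.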

  • High Dynamic Range Image Reconstruction

    Copyright Year: 2008

    Morgan and Claypool eBooks

    High dynamic range imaging (HDRI) is an emerging field that has the potential to cause a great scientific and technological impact in the near future. Although new, this field is large and complex, with non-trivial relations to many different areas, such as image synthesis, computer vision, video and image processing, digital photography, and special effects, among others. For these reasons, HDRI has been extensively researched in recent years and, consequently, the related scientific literature is vast. As an indication that the field is reaching maturity, tutorials and books on HDRI have appeared. Moreover, this new resource has already reached interested practitioners in various application areas. In this book, we do not aim at covering the whole field of high dynamic range imaging and its applications, since it is a broad subject that is still evolving. Instead, our intent is to cover the basic principles behind HDRI and focus on one of the currently most important problems, both theoretically and practically: the reconstruction of high dynamic range images from regular low dynamic range pictures. Table of Contents: Introduction / Digital Image / Imaging Devices and Calibration / HDR Reconstruction / HDRI Acquisition and Visualization / Tone Enhancement / References / Biography
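
    The reconstruction problem described above has a simple core that can be sketched in a few lines. This is a rough illustration, not the book's algorithm: it merges already-linearized exposures (a real pipeline would first recover the camera response curve), trusting mid-range pixels most.

      # Merge several low dynamic range exposures into a radiance estimate.
      # Assumes linear pixel values in [0, 1]; weights favor mid-range pixels.
      import numpy as np

      def merge_exposures(images, exposure_times):
          num = np.zeros_like(images[0])
          den = np.zeros_like(images[0])
          for img, t in zip(images, exposure_times):
              w = 1.0 - np.abs(2.0 * img - 1.0)   # low weight near 0 and 1
              num += w * img / t                  # each shot votes for radiance
              den += w
          return num / np.maximum(den, 1e-6)

      times = [0.25, 1.0, 4.0]                    # simulated exposure times (s)
      scene = np.random.rand(4, 4)                # hypothetical true radiance
      shots = [np.clip(scene * t, 0.0, 1.0) for t in times]
      print(merge_exposures(shots, times).round(3))  # approximates `scene`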

  • A Practical Guide to Testing Wireless Smartphone Applications

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Testing applications for mobile phones is difficult, time-consuming, and hard to do effectively. Many people have limited their testing efforts to hands-on testing of an application on a few physical handsets, and they have to repeat the process every time a new version of the software is ready to test. They may miss many of the permutations of real-world use, and as a consequence their users are left with the unpleasant mess of a failing application on their phone. Test automation can help to increase the range and scope of testing, while reducing the overhead of manual testing of each version of the software. However, automation is not a panacea, particularly for mobile applications, so we need to pick our test automation challenges wisely. This book is intended to help software and test engineers pick appropriately to achieve more, and as a consequence deliver better quality, working software to users. This Synthesis lecture provides practical advice based on direct experience of using software test automation to help improve the testing of a wide range of mobile phone applications, including the latest AJAX applications. The focus is on applications that rely on a wireless network connection to a remote server, however the principles may apply to other related fields and applications. We start by explaining terms and some of the key challenges involved in testing smartphone applications. Subsequent chapters each describe a type of application (e.g., markup, AJAX, client), followed by a related chapter on how to test each of these applications. Common test automation techniques are covered in a separate chapter, and finally there is a brief chapter on when to test manually. The book also contains numerous pointers and links to further material to help you to improve your testing using automation appropriately. Table of Contents: Introduction / Markup Languages / Testing Techniques for Markup Applications / AJAX Mobile Applications / Testing Mobile AJAX Applications / Client Applications / Testing Techniques for Client Applications / Common Techniques / When to Test Manually / Future Work / Appendix A: Links and References / Appendix B: Data Connectivity / Appendix C: Configuring Your Machine

  • Progress in Applications of Boolean Functions

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book brings together five topics on the application of Boolean functions. They are 1. Equivalence classes of Boolean functions: The number of n-variable functions is large, even for values as small as n = 6, and there has been much research on classifying functions. There are many classifications, each with its own distinct merit. 2. Boolean functions for cryptography: The process of encrypting/decrypting plaintext messages often depends on Boolean functions with specific properties. For example, highly nonlinear functions are valued because they are less susceptible to linear attacks. 3. Boolean differential calculus: An operation analogous to taking the derivative of a real-valued function offers important insight into the properties of Boolean functions. One can derive tests or determine susceptibility to hazards. 4. Reversible logic: Most logic functions are irreversible; it is impossible to reconstruct the input, given the output. However, Boolean functions that are reversible are necessary for quantum computing, and hold significant promise for low-power computing. 5. Data mining: The process of extracting subtle patterns from enormous amounts of data has benefited from the use of a graph-based representation of Boolean functions. This has applications in surveillance, fraud detection, scientific discovery including bio-informatics, genetics, medicine, and education. Written by experts, these chapters present a tutorial view of new and emerging technologies in Boolean functions. Table of Contents: Equivalence Classes of Boolean Functions / Boolean Functions for Cryptography / Boolean Differential Calculus / Synthesis of Boolean Functions in Reversible Logic / Data Mining Using Binary Decision Diagrams
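
    Topic 3 is easy to make concrete. The snippet below, a minimal sketch and not from the book, computes the Boolean derivative df/dx_i = f(..., x_i, ...) XOR f(..., NOT x_i, ...), which equals 1 exactly on the inputs where f is sensitive to x_i.

      # Boolean derivative of an n-variable function with respect to bit i.
      from itertools import product

      def boolean_derivative(f, n, i):
          table = {}
          for bits in product((0, 1), repeat=n):
              flipped = list(bits)
              flipped[i] ^= 1                       # toggle variable i
              table[bits] = f(*bits) ^ f(*flipped)  # XOR of the two values
          return table

      f = lambda a, b, c: a & b                          # f does not depend on c
      print(set(boolean_derivative(f, 3, 2).values()))   # {0}: df/dc == 0
      print(boolean_derivative(f, 3, 0))                 # equals b on every input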

  • Business Processes: A Database Perspective

    Copyright Year: 2012

    Morgan and Claypool eBooks

    While classic data management focuses on the data itself, research on Business Processes also considers the context in which this data is generated and manipulated, namely the processes, users, and goals that this data serves. This provides analysts with a better perspective of the organizational needs centered around the data. As such, this research is of fundamental importance. Much of the success of database systems in the last decade is due to the beauty and elegance of the relational model and its declarative query languages, combined with a rich spectrum of underlying evaluation and optimization techniques, and efficient implementations. Much like the case for traditional database research, elegant modeling and rich underlying technology are likely to be highly beneficial for the Business Process owners and their users; both can benefit from easy formulation and analysis of the processes. While there have been many important advances in this research in recent years, there is still much to be desired: specifically, there have been many works that focus on the processes' behavior (flow), and many that focus on their data, but only very few works have dealt with both. This book describes the state of the art in a database approach to Business Process modeling and analysis, covers the progress toward a holistic flow-and-data framework for these tasks, and highlights the current gaps and research directions. Table of Contents: Introduction / Modeling / Querying Business Processes / Other Issues / Conclusion

  • The Datacenter as a Computer: An Introduction to the Design of Warehouse-Scale Machines

    Copyright Year: 2009

    Morgan and Claypool eBooks

    As computation continues to move into the cloud, the computing platform of interest no longer resembles a pizza box or a refrigerator, but a warehouse full of computers. These new large datacenters are quite different from traditional hosting facilities of earlier times and cannot be viewed simply as a collection of co-located servers. Large portions of the hardware and software resources in these facilities must work in concert to efficiently deliver good levels of Internet service performance, something that can only be achieved by a holistic approach to their design and deployment. In other words, we must treat the datacenter itself as one massive warehouse-scale computer (WSC). We describe the architecture of WSCs, the main factors influencing their design, operation, and cost structure, and the characteristics of their software base. We hope it will be useful to architects and programmers of today's WSCs, as well as those of future many-core platforms which may one day implement the equivalent of today's WSCs on a single board. Table of Contents: Introduction / Workloads and Software Infrastructure / Hardware Building Blocks / Datacenter Basics / Energy and Power Efficiency / Modeling Costs / Dealing with Failures and Repairs / Closing Remarks

  • The Complexity of Noise: A Philosophical Outlook on Quantum Error Correction

    Copyright Year: 2010

    Morgan and Claypool eBooks

    In quantum computing, where algorithms exist that can solve computational problems more efficiently than any known classical algorithms, the elimination of errors that result from external disturbances or from imperfect gates has become the "holy grail", and a worldwide quest for a large scale fault-tolerant, and computationally superior, quantum computer is currently taking place. Optimists rely on the premise that, under a certain threshold of errors, an arbitrarily long fault-tolerant quantum computation can be achieved with only moderate (i.e., at most polynomial) overhead in computational cost. Pessimists, on the other hand, object that there are in-principle (as opposed to merely technological) reasons why such machines are still nonexistent, and that no matter what gadgets are used, large scale quantum computers will never be computationally superior to classical ones. Lacking a complete empirical characterization of quantum noise, the debate on the physical possibility of such machines invites philosophical scrutiny. Making this debate more precise by suggesting a novel statistical mechanical perspective thereof is the goal of this project. Table of Contents: Introduction / The Curse of the Open System / To Balance a Pencil on Its Tip / Universality at All Cost / Coda

  • Privacy Risk Analysis

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Privacy Risk Analysis fills a gap in the existing literature by providing an introduction to the basic notions, requirements, and main steps of conducting a privacy risk analysis. The deployment of new information technologies can lead to significant privacy risks, and a privacy impact assessment should be conducted before designing a product or system that processes personal data. However, while existing privacy impact assessment frameworks and guidelines provide a good deal of detail on organizational aspects (including budget allocation, resource allocation, stakeholder consultation, etc.), they are much vaguer on the technical part, in particular on the actual risk assessment task. For privacy impact assessments to live up to their promises and really play a decisive role in enhancing privacy protection, they should be more precise with regard to these technical aspects. This book is an excellent resource for anyone developing and/or currently running a risk analysis, as it defines the notions of personal data, stakeholders, risk sources, feared events, and privacy harms, all while showing how these notions are used in the risk analysis process. It includes a running smart grids example to illustrate all the notions discussed in the book.

  • High-Speed Digital System Design

    Copyright Year: 2006

    Morgan and Claypool eBooks

    High-Speed Digital System Design bridges the gap from theory to implementation in the real world. Systems with clock speeds in the low megahertz range already qualify as high-speed. Proper design results in quality digital transmissions and lowers the chance of errors. This book is for computer and electrical engineers who may or may not have learned electromagnetic theory. The presentation style allows readers to quickly begin designing their own high-speed systems and diagnosing existing designs for errors. After studying this book, readers will be able to: design the power distribution system for a printed circuit board to minimize noise; plan the layers of a PCB for signals, power, and ground to maximize signal quality and minimize noise; include test structures in the printed circuit board to easily diagnose manufacturing mistakes; choose the best PCB design parameters, such as trace width, height, and routed path, to ensure the most stable characteristic impedance; determine the correct termination to minimize reflections; predict the delay caused by a given PCB trace; minimize driver power consumption using AC terminations; compensate for discontinuities along a PCB trace; use pre-emphasis and equalization techniques to counteract lossy transmission lines; determine the amount of crosstalk between two traces; and diagnose existing PCBs to determine the sources of errors.
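
    As a taste of the impedance calculations involved, here is the widely used IPC approximation for a surface microstrip trace, implemented as a sketch (not taken from this book). Treat it as an estimate with a limited validity range; field solvers take over where such closed forms break down.

      # Approximate characteristic impedance of a surface microstrip trace.
      import math

      def microstrip_z0(h, w, t, er):
          """h: dielectric height, w: trace width, t: trace thickness
          (all in the same units); er: relative permittivity."""
          return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

      # Hypothetical FR-4 example: 10 mil height, 16 mil width, 1.4 mil copper
      print(f"Z0 = {microstrip_z0(10.0, 16.0, 1.4, 4.3):.1f} ohms")  # ~52 ohms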

  • Merging Languages and Engineering: Partnering Across the Disciplines

    Copyright Year: 2013

    Morgan and Claypool eBooks

    At the University of Rhode Island over 25% of engineering undergraduates simultaneously complete a second degree in German, French, Spanish, or Chinese. They furthermore spend an entire year abroad, one semester as exchange students at a partner university and six months as professional engineering interns at a cooperating company. With a close-to-100% placement rate, over 400 graduates, and numerous national awards, the URI International Engineering Program (IEP) is a proven path of preparation for young engineers in today's global workplace. The author of this volume, John Grandin, is an emeritus professor of German who developed and led the IEP for twenty-three years. In these pages, he provides a two-pronged approach to explain the origin and history of this program rooted in such an unusual merger of two traditionally distinct higher education disciplines. He looks first at himself to explain how and why he became an international educator and what led him to his lasting passion for the IEP. He then provides a historical overview of the program's origin and growth, including looks at the bumps and bruises and ups and downs along the way. Grandin hopes that this story will be of use and value to other educators determined to reform higher education and align it with the needs of the 21st century. Table of Contents: How I Became a Professor of German / My Unexpected Path to Engineering / Building a Network of Support / Sidetracked by a Stint in the Dean's Office / Reshaping the Language Mission / Struggling to Institutionalize / Partnering with Universities Abroad / Going into the Hotel and Restaurant Business / Taking the Lead Nationally / Building the Chinese IEP / Staying Involved after Retirement / The Broader Message for Higher Education / Conclusions

  • Despeckle Filtering for Ultrasound Imaging and Video, Volume I: Algorithms and Software, Second Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    It is well known that speckle is a multiplicative noise that degrades image and video quality and hampers the visual expert's evaluation in ultrasound imaging and video. This creates the need for robust despeckling image and video techniques for both routine clinical practice and tele-consultation. The goal of this book (the first of two volumes) is to introduce the problem of speckle occurring in ultrasound image and video as well as the theoretical background (equations), the algorithmic steps, and the MATLAB™ code for the following groups of despeckle filters: linear filtering, nonlinear filtering, anisotropic diffusion filtering, and wavelet filtering. This book proposes a comparative evaluation framework for these despeckle filters based on texture analysis, image quality evaluation metrics, and visual evaluation by medical experts. Despeckle noise reduction through the application of these filters will improve the visual observation quality, or it may be used as a pre-processing step for further automated analysis, such as image and video segmentation and texture characterization in ultrasound cardiovascular imaging, as well as in bandwidth reduction in ultrasound video transmission for telemedicine applications. The aforementioned topics are covered in detail in the companion book to this one. Furthermore, in order to facilitate further applications, we have developed in MATLAB™ two different toolboxes that integrate image (IDF) and video (VDF) despeckle filtering, texture analysis, and image and video quality evaluation metrics. The code for both toolboxes is open source and available for download as a complement to the two books.
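
    One member of the book's linear-filtering group can be sketched compactly. The following Lee-style local-statistics filter is an illustrative Python rendering (the book itself works in MATLAB), with window size and noise variance chosen arbitrarily.

      # Lee-style despeckle filter: smooth flat regions, preserve edges.
      import numpy as np

      def lee_filter(img, win=5, noise_var=0.25):
          pad = win // 2
          padded = np.pad(img, pad, mode="reflect")
          out = np.empty_like(img, dtype=float)
          for r in range(img.shape[0]):
              for c in range(img.shape[1]):
                  patch = padded[r:r + win, c:c + win]
                  mean, var = patch.mean(), patch.var()
                  k = max(var - noise_var, 0.0) / max(var, 1e-9)  # adaptive gain
                  out[r, c] = mean + k * (img[r, c] - mean)
          return out

      speckled = np.random.gamma(4.0, 0.25, (32, 32))    # unit-mean speckle field
      print(speckled.var(), lee_filter(speckled).var())  # variance drops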

  • Broadband Quantum Cryptography

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Quantum cryptography is a rapidly developing field that draws from a number of disciplines, from quantum optics to information theory to electrical engineering. By combining some fundamental quantum mechanical principles of single photons with various aspects of information theory, quantum cryptography represents a fundamental shift in the basis for security from numerical complexity to the fundamental physical nature of the communications channel. As such, it promises the holy grail of data security: theoretically unbreakable encryption. Of course, implementing quantum cryptography in real broadband communications systems poses some unique challenges, including generating single photons, distilling random keys from the quantum key distribution process, and maintaining security at both the theoretical and practical level. Overall, quantum cryptography has a place in the history of secret keeping as a novel and potentially useful paradigm shift in the approach to broadband data encryption. Table of Contents: Introduction / Elements of Classical Cryptography / The Quantum Mechanics of Photons / Fundamentals of Quantum Key Distribution / Information Theory and Key Reconciliation / Components for Broadband QKD / A Survey of QKD Implementations / Conclusion - QKD in the Marketplace
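
    The "distilling random keys" step has a well-known first stage that is easy to show in miniature. This toy sketch (an illustration, not a secure implementation) performs BB84 basis sifting over an ideal channel; real systems follow it with error estimation, reconciliation, and privacy amplification.

      # Toy BB84 sifting: keep only the bits where the two bases agree.
      import random

      n = 32
      alice_bits = [random.randint(0, 1) for _ in range(n)]
      alice_bases = [random.choice("+x") for _ in range(n)]
      bob_bases = [random.choice("+x") for _ in range(n)]

      # Ideal channel: matching bases reproduce Alice's bit, otherwise random.
      bob_bits = [a if ab == bb else random.randint(0, 1)
                  for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

      sifted = [(a, b) for a, b, ab, bb
                in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
      print(f"kept {len(sifted)} of {n} raw bits;",
            "no errors" if all(a == b for a, b in sifted) else "errors present")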

  • Data Integration: The Relational Logic Approach

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Data integration is a critical problem in our increasingly interconnected but inevitably heterogeneous world. There are numerous data sources available in organizational databases and on public information systems like the World Wide Web. Not surprisingly, the sources often use different vocabularies and different data structures, being created, as they are, by different people, at different times, for different purposes. The goal of data integration is to provide programmatic and human users with integrated access to multiple, heterogeneous data sources, giving each user the illusion of a single, homogeneous database designed for his or her specific needs. The good news is that, in many cases, the data integration process can be automated. This book is an introduction to the problem of data integration and a rigorous account of one of the leading approaches to solving this problem, viz., the relational logic approach. Relational logic provides a theoretical framework for discussing data integration. Moreover, in many important cases, it provides algorithms for solving the problem in a computationally practical way. In many respects, relational logic does for data integration what relational algebra did for database theory several decades ago. A companion web site provides interactive demonstrations of the algorithms. Table of Contents: Preface / Interactive Edition / Introduction / Basic Concepts / Query Folding / Query Planning / Master Schema Management / Appendix / References / Index / Author Biography

  • Globalization, Engineering, and Creativity

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The text addresses the impact of globalization within engineering, particularly on working practices and prospects for creativity. It suggests that accepted norms of economic activity create enclosures and thresholds within the profession, which—as engineers increase their awareness (reflexivity)—will shape the future of engineering, and the values which underpin it. It is aimed at practicing engineers and those in training and is an introduction to the social and political context currently setting new challenges for the profession.

  • Finite State Machine Datapath Design, Optimization, and Implementation

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Finite State Machine Datapath Design, Optimization, and Implementation explores the design space of combined FSM/Datapath implementations. The lecture starts by examining performance issues in digital systems, such as clock skew and its effect on setup and hold time constraints, and the use of pipelining for increasing system clock frequency. This is followed by definitions for latency and throughput, with associated resource tradeoffs explored in detail through the use of dataflow graphs and scheduling tables applied to examples taken from digital signal processing applications. Also, design issues relating to functionality, interfacing, and performance for different types of memories commonly found in ASICs and FPGAs, such as FIFOs, single-ports, and dual-ports, are examined. Selected design examples are presented in implementation-neutral Verilog code and block diagrams, with associated design files available as downloads for both Altera Quartus and Xilinx Virtex FPGA platforms. A working knowledge of Verilog, logic synthesis, and basic digital design techniques is required. This lecture is suitable as a companion to the synthesis lecture titled Introduction to Logic Synthesis using Verilog HDL. Table of Contents: Calculating Maximum Clock Frequency / Improving Design Performance / Finite State Machine with Datapath (FSMD) Design / Embedded Memory Usage in Finite State Machine with Datapath (FSMD) Designs
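
    The first chapter's headline calculation fits in one line. In the usual single-cycle timing budget from standard timing analysis (stated here as a reminder, not as an excerpt; skew is written as a penalty, though its sign depends on its direction relative to data flow):

      f_{\max} \le \frac{1}{t_{\mathrm{clk\text{-}to\text{-}Q}} + t_{\mathrm{logic,max}} + t_{\mathrm{setup}} + t_{\mathrm{skew}}}

    Raising the maximum clock frequency therefore means attacking the largest term in the denominator, which is what the pipelining discussion targets.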

  • Bad to the Bone: Crafting Electronic Systems with BeagleBone Black, Second Edition

    Copyright Year: 2015

    Morgan and Claypool eBooks

    BeagleBone Black is a low-cost, open hardware computer uniquely suited to interact with sensors and actuators directly and over the Web. Introduced in April 2013 by BeagleBoard.org, a community of developers first established in early 2008, BeagleBone Black is used frequently to build vision-enabled robots, home automation systems, artistic lighting systems, and countless other do-it-yourself and professional projects. BeagleBone variants include the original BeagleBone and the newer BeagleBone Black, both hosting a powerful 32-bit, super-scalar ARM Cortex A8 processor capable of running numerous mobile and desktop-capable operating systems, typically variants of Linux including Debian, Android, and Ubuntu. Yet the BeagleBone is small enough to fit in a mint tin. The "Bone" may be used in a wide variety of projects, from middle school science fair projects to senior design projects to first prototypes of very complex systems. Novice users may access the power of the Bone through the user-friendly BoneScript software, accessed through a Web browser on most major operating systems, including Microsoft Windows, Apple Mac OS X, and Linux. Seasoned users may take full advantage of the Bone's power using the underlying Linux-based operating system, a host of feature extension boards (Capes), and a wide variety of Linux community open source libraries. This book provides an introduction to this powerful computer and has been designed for a wide variety of users, from the first-time novice through the seasoned embedded system design professional. The book contains background theory on system operation coupled with many well-documented, illustrative examples. Examples for novice users are centered on motivational, fun robot projects while advanced projects follow the theme of assistive technology and image-processing applications.
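
    For flavor, here is roughly what a first "blink" program looks like on the Bone in Python. The book's own introductory examples use BoneScript; this sketch instead assumes the separately installed Adafruit_BBIO library, and the header pin P8_10 is an arbitrary choice.

      # Blink an LED wired to header pin P8_10 (assumes Adafruit_BBIO).
      import time
      import Adafruit_BBIO.GPIO as GPIO

      LED = "P8_10"
      GPIO.setup(LED, GPIO.OUT)
      try:
          for _ in range(10):
              GPIO.output(LED, GPIO.HIGH)   # LED on
              time.sleep(0.5)
              GPIO.output(LED, GPIO.LOW)    # LED off
              time.sleep(0.5)
      finally:
          GPIO.cleanup()                    # release the pin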

  • Mobile Platform Security

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Recently, mobile security has garnered considerable interest in both the research community and industry due to the popularity of smartphones. The current smartphone platforms are open systems that allow application development, including by malicious parties. To protect the mobile device, its user, and other mobile ecosystem stakeholders such as network operators, application execution is controlled by a platform security architecture. This book explores how such mobile platform security architectures work. We present a generic model for mobile platform security architectures: the model illustrates commonly used security mechanisms and techniques in mobile devices and allows a systematic comparison of different platforms. We analyze several mobile platforms using the model. In addition, this book explains hardware-security mechanisms typically present in a mobile device. We also discuss enterprise security extensions for mobile platforms and survey recent research in the area of mobile platform security. The objective of this book is to provide a comprehensive overview of the current status of mobile platform security for students, researchers, and practitioners. Table of Contents: Preface / Introduction / Platform Security Model / Mobile Platforms / Platform Comparison / Mobile Hardware Security / Enterprise Security Extensions / Platform Security Research / Conclusions / Bibliography / Authors' Biographies

  • Multiple-Valued Logic: Concepts and Representations

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Multiple-Valued Logic: Concepts and Representations begins with a survey of the use of multiple-valued logic in several modern application areas, including electronic design automation algorithms and circuit design. The mathematical basis and concepts of various algebras and systems of multiple-valued logic are provided, including comparisons among various systems and examples of their application. The book also provides an examination of alternative representations of multiple-valued logic suitable for implementation as data structures in automated computer applications. Decision diagram structures for multiple-valued applications are described in detail, with particular emphasis on the recently developed quantum multiple-valued decision diagram. Table of Contents: Multiple Valued Logic Applications / MVL Concepts and Algebra / Functional Representations / Reversible and Quantum Circuits / Quantum Multiple-Valued Decision Diagrams / Summary / Bibliography
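
    The algebra chapters' starting point can be shown in a few lines. In the common Post-algebra convention for r-valued logic (a standard choice, not a quote from the book), conjunction is MIN, disjunction is MAX, and negation reflects a value about the top of the range.

      # Post-algebra operators for r-valued logic, r = 4 here.
      R = 4  # truth values 0..3

      AND = lambda a, b: min(a, b)
      OR = lambda a, b: max(a, b)
      NOT = lambda a: (R - 1) - a

      print(AND(1, 3), OR(1, 3))         # 1 3
      print([NOT(a) for a in range(R)])  # [3, 2, 1, 0]
      # With R = 2 these reduce to the familiar Boolean AND, OR, and NOT.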

  • Genome Refactoring

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The science of biology celebrates the discovery and understanding of biological systems that already exist in nature. In parallel, the engineering of biology must learn how to make use of our understanding of the natural world to design and build new useful biological systems. "Synthetic biology" represents one example of recent work to engineer biological systems. This emerging field aims to replace the ad hoc process of assembling biological systems by primarily developing tools to assemble reliable-but-complex living organisms from standard components that can later be reused in new combinations. The focus of this book is "genome refactoring," one of several approaches to manage the complexity of a biological system, in which the goal is to redesign the genetic elements that encode a living form--preserving the function of that form but encoding it with a genome far easier to study and extend. This book presents genome refactoring in two ways: as an important aspect of the emerging field of synthetic biology and as a powerful teaching tool to train would-be professionals in the subject. Chapters focus on the overarching goals of synthetic biology and their alignment with the motivations and achievements in genome engineering; the engineering frameworks of refactoring, including genome synthesis, standardization of biological parts, and abstraction; a detailed description of the bacteriophages that have been refactored up to this point; and the methods of refactoring and contexts for that work drawn from the bacteriophage M13. Overall, these examples show readers the potential of synthetic biology and the areas in need of further research. If successful, synthetic biology and genome refactoring could address any number of persistent societal needs, including sustainable energy, affordable and effective medicine, and green manufacturing practices. Table of Contents: Tools for Genome Engineering and Synthetic Biology / Bacteriophage as Templates for Refactoring / Methods/Teaching Protocols for M13 Reengineering / Writing and Speaking as Biological Engineers / Summary and Future Directions / Appendix A / Appendix B / Appendix C

  • iRODS Primer: Integrated Rule-Oriented Data System

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Policy-based data management enables the creation of community-specific collections. Every collection is created for a purpose. The purpose defines the set of properties that will be associated with the collection. The properties are enforced by management policies that control the execution of procedures that are applied whenever data are ingested or accessed. The procedures generate state information that defines the outcome of enforcing the management policy. The state information can be queried to validate assessment criteria and verify that the required collection properties have been conserved. The integrated Rule-Oriented Data System implements the data management framework required to support policy-based data management. Policies are turned into computer-actionable Rules. Procedures are composed from a Micro-service-oriented architecture. The result is a highly extensible and tunable system that can enforce management policies, automate administrative tasks, and periodically validate assessment criteria. Table of Contents: Introduction / Integrated Rule-Oriented Data System / iRODS Architecture / Rule-Oriented Programming / The iRODS Rule System / iRODS Micro-services / Example Rules / Extending iRODS / Appendix A: iRODS Shell Commands / Appendix B: Rulegen Grammar / Appendix C: Exercises / Author Biographies

  • Block Transceivers: OFDM and Beyond

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The demand for data traffic over mobile communication networks has substantially increased during the last decade. As a result, these mobile broadband devices consume the available spectrum fiercely, requiring the search for new technologies. In transmissions where the channel presents a frequency-selective behavior, multicarrier modulation (MCM) schemes have proven to be more efficient, in terms of spectral usage, than conventional modulations and spread spectrum techniques. Orthogonal frequency-division multiplexing (OFDM) is the most popular MCM method, since it not only increases spectral efficiency but also yields simple transceivers. All OFDM-based systems, including the single-carrier with frequency-domain equalization (SC-FD) systems, transmit redundancy in order to cope with the problem of interference among symbols. This book presents OFDM-inspired systems that are able to, at most, halve the amount of redundancy used by OFDM systems while keeping the computational complexity comparable. Such systems, herein called memoryless linear time-invariant (LTI) transceivers with reduced redundancy, require low-complexity arithmetical operations and fast algorithms. In addition, whenever the block transmitter and receiver have memory and/or are linear time-varying (LTV), it is possible to reduce the redundancy in the transmission even further, as also discussed in this book. For the transceivers with memory it is possible to eliminate the redundancy at the cost of making the channel equalization more difficult. Moreover, when time-varying block transceivers are also employed, then the amount of redundancy can be as low as a single symbol per block, regardless of the size of the channel memory. With the techniques presented in the book it is possible to address what lies beyond the use of OFDM-related solutions in broadband transmissions. Table of Contents: The Big Picture / Transmultiplexers / OFDM / Memoryless LTI Transceivers with Reduced Redundancy / FIR LTV Transceivers with Reduced Redundancy
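
    The redundancy being halved is easiest to see in OFDM's cyclic prefix. The sketch below (illustrative sizes, not the book's reduced-redundancy transceivers) shows the standard mechanism those systems improve on: the CP samples are pure redundancy, but they turn a multipath channel into one complex gain per subcarrier.

      # OFDM over a 3-tap channel: the cyclic prefix makes equalization trivial.
      import numpy as np

      N, CP = 16, 4                                    # subcarriers, prefix length
      symbols = np.random.choice([1+1j, 1-1j, -1+1j, -1-1j], N)  # QPSK block
      tx = np.fft.ifft(symbols)                        # multicarrier modulation
      tx_cp = np.concatenate([tx[-CP:], tx])           # the CP = redundancy

      h = np.array([0.9, 0.3, 0.1])                    # multipath channel taps
      rx = np.convolve(tx_cp, h)[CP:CP + N]            # receiver discards the CP

      eq = np.fft.fft(rx) / np.fft.fft(h, N)           # one-tap equalizer per bin
      print(np.allclose(eq, symbols))                  # True: block recovered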

  • Rethinking Quaternions

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Quaternion multiplication can be used to rotate vectors in three dimensions. Therefore, in computer graphics, quaternions have three principal applications: to increase speed and reduce storage for calculations involving rotations, to avoid distortions arising from numerical inaccuracies caused by floating point computations with rotations, and to interpolate between two rotations for key frame animation. Yet while the formal algebra of quaternions is well-known in the graphics community, the derivations of the formulas for this algebra and the geometric principles underlying this algebra are not well understood. The goals of this monograph are to provide a fresh, geometric interpretation for quaternions, appropriate for contemporary computer graphics, based on mass-points; to present better ways to visualize quaternions, and the effect of quaternion multiplication on points and vectors in three dimensions, using insights from the algebra and geometry of multiplication in the complex plane; to derive the formula for quaternion multiplication from first principles; to develop simple, intuitive proofs of the sandwiching formulas for rotation and reflection; and to show how to apply sandwiching to compute perspective projections. In addition to these theoretical issues, we also address some computational questions. We develop straightforward formulas for converting back and forth between quaternion and matrix representations for rotations, reflections, and perspective projections, and we discuss the relative advantages and disadvantages of the quaternion and matrix representations for these transformations. Moreover, we show how to avoid distortions due to floating point computations with rotations by using unit quaternions to represent rotations. We also derive the formula for spherical linear interpolation, and we explain how to apply this formula to interpolate between two rotations for key frame animation. Finally, we explain the role of quaternions in low-dimensional Clifford algebras, and we show how to apply the Clifford algebra for R3 to model rotations, reflections, and perspective projections. To help the reader understand the concepts and formulas presented here, we have incorporated many exercises in order to clarify and elaborate some of the key points in the text. Table of Contents: Preface / Theory / Computation / Rethinking Quaternions and Clifford Algebras / References / Further Reading / Author Biography
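
    The sandwiching formula mentioned above is compact enough to demonstrate directly. This is a plain Python sketch of the standard computation (quaternions as (w, x, y, z) tuples), not an excerpt from the monograph.

      # Rotate a vector v by an angle about an axis using the sandwich q v q*.
      import math

      def qmul(a, b):
          w1, x1, y1, z1 = a
          w2, x2, y2, z2 = b
          return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
                  w1*x2 + x1*w2 + y1*z2 - z1*y2,
                  w1*y2 - x1*z2 + y1*w2 + z1*x2,
                  w1*z2 + x1*y2 - y1*x2 + z1*w2)

      def rotate(v, axis, angle):
          s = math.sin(angle / 2) / math.sqrt(sum(c * c for c in axis))
          q = (math.cos(angle / 2), *(c * s for c in axis))  # unit quaternion
          qc = (q[0], -q[1], -q[2], -q[3])                   # conjugate
          return qmul(qmul(q, (0.0, *v)), qc)[1:]            # q v q*

      # A quarter turn of the x-axis about z lands on the y-axis:
      print([round(c, 6) for c in rotate((1, 0, 0), (0, 0, 1), math.pi / 2)])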

  • Automated Grammatical Error Detection for Language Learners, Second Edition

    Copyright Year: 2014

    Morgan and Claypool eBooks

    It has been estimated that over a billion people are using or learning English as a second or foreign language, and the numbers are growing not only for English but for other languages as well. These language learners provide a burgeoning market for tools that help identify and correct learners' writing errors. Unfortunately, the errors targeted by typical commercial proofreading tools do not include those aspects of a second language that are hardest to learn. This volume describes the types of constructions English language learners find most difficult: constructions containing prepositions, articles, and collocations. It provides an overview of the automated approaches that have been developed to identify and correct these and other classes of learner errors in a number of languages. Error annotation and system evaluation are particularly important topics in grammatical error detection because there are no commonly accepted standards. Chapters in the book describe the options available to researchers, recommend best practices for reporting results, and present annotation and evaluation schemes. The final chapters explore recent innovative work that opens new directions for research. It is the authors' hope that this volume will continue to contribute to the growing interest in grammatical error detection by encouraging researchers to take a closer look at the field and its many challenging problems.

  • Exploratory Search: Beyond the Query-Response Paradigm

    Copyright Year: 2013

    Morgan and Claypool eBooks

    As information becomes more ubiquitous and the demands that searchers have on search systems grow, there is a need to support search behaviors beyond simple lookup. Information seeking is the process or activity of attempting to obtain information in both human and technological contexts. Exploratory search describes an information-seeking problem context that is open-ended, persistent, and multifaceted, and information-seeking processes that are opportunistic, iterative, and multitactical. Exploratory searchers aim to solve complex problems and develop enhanced mental capacities. Exploratory search systems support this through symbiotic human-machine relationships that provide guidance in exploring unfamiliar information landscapes. Exploratory search has gained prominence in recent years. There is an increased interest from the information retrieval, information science, and human-computer interaction communities in moving beyond the traditional turn-taking interaction model supported by major Web search engines, and toward support for human intelligence amplification and information use. In this lecture, we introduce exploratory search, relate it to relevant extant research, outline the features of exploratory search systems, discuss the evaluation of these systems, and suggest some future directions for supporting exploratory search. Exploratory search is a new frontier in the search domain and is becoming increasingly important in shaping our future world. Table of Contents: Introduction / Defining Exploratory Search / Related Work / Features of Exploratory Search Systems / Evaluation of Exploratory Search Systems / Future Directions and Concluding Remarks

  • Visual Information Retrieval using Java and LIRE

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Visual information retrieval (VIR) is an active and vibrant research area that attempts to provide means for organizing, indexing, annotating, and retrieving visual information (images and videos) from large, unstructured repositories. The goal of VIR is to retrieve matches ranked by their relevance to a given query, which is often expressed as an example image and/or a series of keywords. During its early years (1995-2000), the research efforts were dominated by content-based approaches contributed primarily by the image and video processing community. During the past decade, it was widely recognized that the challenges imposed by the lack of coincidence between an image's visual contents and its semantic interpretation, also known as the semantic gap, required a clever use of textual metadata (in addition to information extracted from the image's pixel contents) to make image and video retrieval solutions efficient and effective. The need to bridge (or at least narrow) the semantic gap has been one of the driving forces behind current VIR research. Additionally, other related research problems and market opportunities have started to emerge, offering a broad range of exciting problems for computer scientists and engineers to work on. In this introductory book, we focus on a subset of VIR problems where the media consists of images, and the indexing and retrieval methods are based on the pixel contents of those images -- an approach known as content-based image retrieval (CBIR). We present an implementation-oriented overview of CBIR concepts, techniques, algorithms, and figures of merit. Most chapters are supported by examples written in Java, using Lucene (an open-source Java-based indexing and search implementation) and LIRE (Lucene Image REtrieval), an open-source Java-based library for CBIR. Table of Contents: Introduction / Information Retrieval: Selected Concepts and Techniques / Visual Features / Indexing Visual Features / LIRE: An Extensible Java CBIR Library / Concluding Remarks
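
    The basic CBIR loop (extract a feature, index it, rank by distance) can be previewed without any of the Java machinery. The sketch below is deliberately generic, with gray-level histograms and L1 distance as stand-ins for LIRE's richer descriptors; it is not LIRE's API.

      # Minimal CBIR: histogram features, L1 distance, ranked results.
      import numpy as np

      def histogram_feature(img, bins=16):
          h, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
          return h / max(h.sum(), 1)               # normalized histogram

      def search(query_img, index):
          qf = histogram_feature(query_img)
          scored = [(np.abs(qf - f).sum(), name) for name, f in index.items()]
          return sorted(scored)                    # smallest distance first

      images = {f"img{i}": np.random.rand(32, 32) ** (i + 1) for i in range(4)}
      index = {name: histogram_feature(im) for name, im in images.items()}
      print(search(images["img2"], index)[0])      # img2 ranks first (distance 0)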

  • PSpice for Digital Signal Processing

    Copyright Year: 2007

    Morgan and Claypool eBooks

    PSpice for Digital Signal Processing is the last in a series of five books using Cadence Orcad PSpice version 10.5 and introduces a very novel approach to learning digital signal processing (DSP). DSP is traditionally taught using Matlab/Simulink software, which has some inherent weaknesses for students, particularly at the introductory level. The 'plug in variables and play' nature of these software packages can lure the student into thinking they possess an understanding they don't actually have, because these systems produce results quickly without revealing what is going on. However, it must be said that, for advanced-level work, Matlab/Simulink really excels. In this book we start by examining basic signals, starting with sampled signals and dealing with the concept of digital frequency. The delay part, which is the heart of DSP, is explained and applied initially to simple FIR and IIR filters. We examine linear time-invariant systems, starting with the difference equation and applying the z-transform to produce a range of filter types, i.e., low-pass, high-pass, and bandpass. The important concept of convolution is examined, and here we demonstrate the usefulness of the 'log' command in Probe for giving the correct display to demonstrate the 'flip n slip' method. Digital oscillators, including quadrature carrier generation, are then examined. Several filter design methods are considered, including the bilinear transform, impulse invariant, and window techniques. Included also is a treatment of the raised-cosine family of filters. A range of DSP applications are then considered, including the Hilbert transform, single sideband modulators using the Hilbert transform and quad oscillators, integrators, and differentiators. Decimation and interpolation are simulated to demonstrate the usefulness of the multi-sampling environment. Decimation is also applied in a treatment on digital receivers. Lastly, we look at some musical applications for DSP, such as reverberation/echo using real-world signals imported into PSpice using the program Wav2Ascii. The zero-forcing equalizer is dealt with in a simplistic manner and illustrates the effectiveness of equalizing signals in a receiver after transmission.
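
    The difference-equation starting point is worth seeing in executable form. This is a generic first-order IIR low-pass, y[n] = b*x[n] + a*y[n-1], written in Python as a neutral illustration (the book itself builds such filters as PSpice schematics); the coefficients are arbitrary.

      # First-order IIR low-pass filter from its difference equation.
      def iir_lowpass(x, a=0.9, b=0.1):
          y, prev = [], 0.0
          for sample in x:
              prev = b * sample + a * prev   # the delay element holds y[n-1]
              y.append(prev)
          return y

      step = [0.0] * 4 + [1.0] * 12          # unit step input
      print([round(v, 3) for v in iir_lowpass(step)])
      # Output rises toward 1.0, the DC gain b / (1 - a) = 1.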

  • Computational Electronics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Computational Electronics is devoted to state-of-the-art numerical techniques and physical models used in the simulation of semiconductor devices from a semi-classical perspective. Computational electronics, as a part of the general Technology Computer Aided Design (TCAD) field, has become increasingly important as the cost of semiconductor manufacturing has grown exponentially, with a concurrent need to reduce the time from design to manufacture. The motivation for this volume is the need within the modeling and simulation community for a comprehensive text which spans basic drift-diffusion modeling, through energy balance and hydrodynamic models, and finally particle-based simulation. One unique feature of this book is a specific focus on numerical examples, particularly the use of commercially available software in the TCAD community. The concept for this book originated from a first-year graduate course on computational electronics, taught now for several years, in the Electrical Engineering Department at Arizona State University. Numerous exercises and projects were derived from this course and have been included. The prerequisite knowledge is a fundamental understanding of basic semiconductor physics and the physical models for various device technologies such as pn diodes, bipolar junction transistors, and field effect transistors.
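
    The drift-diffusion level the book starts from can be recalled in one line. For electrons in one dimension (the standard semi-classical form, stated here as a reminder rather than an excerpt; holes are analogous), the current density combines a drift and a diffusion term, with the Einstein relation tying the two coefficients together:

      J_n = q \mu_n n E + q D_n \frac{dn}{dx}, \qquad \frac{D_n}{\mu_n} = \frac{k_B T}{q}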

  • Geographical Design: Spatial Cognition and Geographical Information Science

    Copyright Year: 2011

    Morgan and Claypool eBooks

    With GIS technologies ranging from Google Maps and Google Earth to the use of smart phones and in-car navigation systems, spatial knowledge is often acquired and communicated through geographic information technologies. This monograph describes the interplay between spatial cognition research and the use of spatial interfaces. It begins by reviewing what is known about how humans process spatial concepts and then moves on to discuss how interfaces can be improved to take advantage of those capabilities. Special attention is given to a variety of innovative geographical platforms that provide users with an intuitive understanding and support the further acquisition of spatial knowledge. The monograph concludes with a discussion of a number of outstanding issues, including the changing nature of maps as the primary spatial interface, concerns about privacy for spatial information, and a look at the future of user-centered spatial information systems. Table of Contents: Introduction / Spatial Cognition / Technologies / Cognitive Interfaces for Wayfinding / Open Issues / For More Information

  • Fundamentals of Plastics Thermoforming

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The process of heating and reshaping plastics sheet and film materials has been in use since the beginning of the plastics industry. This process is known as thermoforming. Today this process is used for industrial products including signage, housings, and hot tubs. It also produces much of the packaging in use today, including blister packs, egg cartons, and food storage containers. This process has many advantages over other methods of producing these products, but it has some limitations. This book has a twofold purpose. It is designed to be used as a textbook for a course on thermoforming. It is also intended to be an application guide for professionals in the field of thermoforming, including manufacturing, process, and quality engineers, and managers. This book is focused on process application rather than theory. It refers to real products and processes with the intent of understanding the real issues faced in this industry. In addition to materials and processes, part and tool design are covered. Quality control is critical to any operation, and this is also covered in this text. Two areas of focus in today's industry are Lean operations and environmental issues; both of these topics are also included. Table of Contents: Introduction / Plastics Materials / Thermoforming Process Overview / The Forming Process / Part Design / Mold and Tool Design / Quality Control Issues / Lean Operations / Environmental Issues

  • The Effects of Hypergravity and Microgravity on Biomedical Experiments

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Take one elephant and one man to the top of a tower and drop both simultaneously. Which will hit the ground first? You are a pilot of a jet fighter performing a high-speed loop. Will you pass out during the maneuver? How can you simulate being an astronaut with your feet still firmly placed on planet Earth? In the aerospace environment, human, animal, and plant physiology differs significantly from that on Earth, and this book provides reasons for some of these changes. The challenges encountered by pilots in their missions can have implications for the health and safety of not only themselves but others. Knowing the effects of hypergravity on the human body during high-speed flight led to the development of human centrifuges. We also need to better understand the physiological responses of living organisms in space. It is therefore necessary to simulate weightlessness through the use of specially adapted equipment, such as clinostats, tilt tables, and body suspension devices. Each of these ideas, and more, is addressed in this review of the physical concepts related to space flights, microgravity, and hypergravity simulations. Basic theories, such as Newton's law and Einstein's principle, are explained, followed by a look at the biomedical effects of experiments performed in space life sciences institutes, universities, and space agencies. Table of Contents: General Concepts in Physics - Definition of Physical Terms / The Effects of Hypergravity on Biomedical Experiments / The Effects of Microgravity on Biomedical Experiments / References

  • Intermediate Probability Theory for Biomedical Engineers

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments, and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner—developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology—both within the engineering community as well as the probability and statistics literature. The aim is to prepare students for the application of this theory to a wide variety of problems, as well as to give practicing engineers and researchers a tool to pursue these topics at a more advanced level. Pertinent biomedical engineering examples are used throughout the text.
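
    The volume's two central objects have standard definitions worth stating (these are textbook forms, not excerpts). For a continuous random variable X with density f_X, the expectation and the characteristic function are

      E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx, \qquad
      \phi_X(t) = E\left[ e^{jtX} \right] = \int_{-\infty}^{\infty} e^{jtx} f_X(x)\, dx,

    and the moments follow from derivatives of \phi_X at t = 0, which is what makes the characteristic function such a convenient bookkeeping device.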

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Hardware Prefetching

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Since the 1970s, microprocessor-based digital platforms have been riding Moore’s law, allowing for doubling of density for the same area roughly every two years. However, whereas microprocessor fabrication has focused on increasing instruction execution rate, memory fabrication technologies have focused primarily on an increase in capacity with negligible increase in speed. This divergent trend in performance between the processors and memory has led to a phenomenon referred to as the “Memory Wall.” To overcome the memory wall, designers have resorted to a hierarchy of cache memory levels, which rely on the principle of memory access locality to reduce the observed memory access time and the performance gap between processors and memory. Unfortunately, important workload classes exhibit adverse memory access patterns that baffle the simple policies built into modern cache hierarchies to move instructions and data across cache levels. As such, processors often spend much time idling upon a demand fetch of memory blocks that miss in higher cache levels. Prefetching—predicting future memory accesses and issuing requests for the corresponding memory blocks in advance of explicit accesses—is an effective approach to hide memory access latency. There have been a myriad of proposed prefetching techniques, and nearly every modern processor includes some hardware prefetching mechanisms targeting simple and regular memory access patterns. This primer offers an overview of the various classes of hardware prefetchers for instructions and data proposed in the research literature, and presents examples of techniques incorporated into modern microprocessors. View full abstract»
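
    To make the idea concrete, here is a minimal sketch (ours, not from the primer) of a per-PC stride prefetcher, one of the simple regular-pattern mechanisms the book surveys: it remembers the last address and stride seen for each load instruction and, once the stride repeats, issues a prefetch for the next block.

        # Minimal stride-prefetcher sketch: tracks (last_addr, stride, confirmed)
        # per load PC and prefetches addr + stride once the stride repeats.
        class StridePrefetcher:
            def __init__(self):
                self.table = {}   # pc -> [last_addr, stride, confirmed]

            def access(self, pc, addr):
                entry = self.table.get(pc)
                if entry is None:
                    self.table[pc] = [addr, 0, False]
                    return None
                last_addr, stride, _ = entry
                new_stride = addr - last_addr
                confirmed = (new_stride == stride and stride != 0)
                self.table[pc] = [addr, new_stride, confirmed]
                return addr + new_stride if confirmed else None

        pf = StridePrefetcher()
        for a in [100, 116, 132, 148]:          # a loop striding by 16 bytes
            hint = pf.access(pc=0x400, addr=a)
            print(a, "-> prefetch", hint)       # hints appear once the stride is confirmed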

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mining and Communities:Understanding the Context of Engineering Practice

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Mining has been entangled with the development of communities in all continents since the beginning of large-scale resource extraction. It has brought great wealth and prosperity, as well as great misery and environmental destruction. Today, there is a greater awareness of the urgent need for engineers to meet the challenge of extracting declining mineral resources more efficiently, with positive and equitable social impact and minimal environmental impact. Many engineering disciplines—from software to civil engineering—play a role in the life of a mine, from its inception and planning to its operation and final closure. The companies that employ these engineers are expected to uphold human rights, address community needs, and be socially responsible. While many believe it is possible for mines to make a profit and achieve these goals simultaneously, others believe that these are contradictory aims. This book narrates the social experience of mining in two very different settings—Papua New Guinea and Western Australia—to illustrate how political, economic, and cultural contexts can complicate the simple idea of "community engagement." View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Electronics: Book 2:Amplifiers: Analysis and Design

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book, Amplifiers: Analysis and Design, is the second of four books of a larger work, Fundamentals of Electronics. It comprises four chapters that describe the fundamentals of amplifier performance. Beginning with a review of two-port analysis, the first chapter introduces the modeling of the response of transistors to AC signals. Basic one-transistor amplifiers are extensively discussed. The next chapter expands the discussion to multiple-transistor amplifiers. The coverage of simple amplifiers is concluded with a chapter that examines power amplifiers. This discussion defines the limits of small-signal analysis and explores the realm where these simplifying assumptions are no longer valid and distortion becomes present. The final chapter concludes the book with the first of two chapters in Fundamentals of Electronics on the significant topic of feedback amplifiers. Fundamentals of Electronics has been designed primarily for use in an upper-division course in electronics for electrical engineering students. Typically such a course spans a full academic year consisting of two semesters or three quarters. As such, Amplifiers: Analysis and Design, and two other books, Electronic Devices and Circuit Applications, and Active Filters and Amplifier Frequency Response, form an appropriate body of material for such a course. Secondary applications include use with Electronic Devices and Circuit Applications in a one-semester electronics course for engineers or as a reference for practicing engineers. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Stream Management

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many applications process high volumes of streaming data, among them Internet traffic analysis, financial tickers, and transaction log mining. In general, a data stream is an unbounded data set that is produced incrementally over time, rather than being available in full before its processing begins. In this lecture, we give an overview of recent research in stream processing, ranging from answering simple queries on high-speed streams to loading real-time data feeds into a streaming warehouse for off-line analysis. We will discuss two types of systems for end-to-end stream processing: Data Stream Management Systems (DSMSs) and Streaming Data Warehouses (SDWs). A traditional database management system typically processes a stream of ad-hoc queries over relatively static data. In contrast, a DSMS evaluates static (long-running) queries on streaming data, making a single pass over the data and using limited working memory. In the first part of this lecture, we will discuss research problems in DSMSs, such as continuous query languages, non-blocking query operators that continually react to new data, and continuous query optimization. The second part covers SDWs, which combine the real-time response of a DSMS (loading new data as soon as they arrive) with a data warehouse's ability to manage terabytes of historical data on secondary storage. Table of Contents: Introduction / Data Stream Management Systems / Streaming Data Warehouses / Conclusions View full abstract»
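
    As a toy illustration of the DSMS side (our sketch, not taken from the lecture), the continuous query "average of the last n items" can be evaluated in a single pass with bounded working memory:

        from collections import deque

        def windowed_average(stream, n):
            """Continuous query: emit the mean of the last n items, one pass,
            bounded memory -- the defining constraints of a DSMS operator."""
            window, total = deque(), 0.0
            for x in stream:
                window.append(x)
                total += x
                if len(window) > n:
                    total -= window.popleft()
                yield total / len(window)

        for avg in windowed_average([3, 5, 7, 9, 11], n=3):
            print(round(avg, 2))   # 3.0, 4.0, 5.0, 7.0, 9.0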

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Engineering Economics and Decision Analysis

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The authors cover two general topics in this text: basic engineering economics and risk analysis. Within the topic of engineering economics are discussions on the time value of money and interest relationships. These interest relationships are used to define certain project criteria that are used by engineers and project managers to select the best economic choice among several alternatives. Projects examined will include both income- and service-producing investments. The effects of escalation, inflation, and taxes on the economic analysis of alternatives are discussed. Risk analysis incorporates the concepts of probability and statistics in the evaluation of alternatives. This allows management to determine the probability of success or failure of the project. Two types of sensitivity analyses are presented. The first is referred to as the range approach, while the second uses probabilistic concepts to determine a measure of the risk involved. The authors have designed the text to assist individuals preparing to successfully complete the economics portions of the Fundamentals of Engineering Exam. Table of Contents: Introduction / Interest and the Time Value of Money / Project Evaluation Methods / Service Producing Investments / Income Producing Investments / Determination of Project Cash Flow / Financial Leverage / Basic Statistics and Probability / Sensitivity Analysis View full abstract»
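
    The time-value-of-money relationships at the heart of such project criteria reduce to discounting each cash flow by (1 + i)^t; a small sketch (ours, with illustrative numbers of our own):

        def npv(rate, cash_flows):
            """Net present value of cash_flows[t] occurring at end of year t."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        # Two hypothetical alternatives at a 10% minimum rate of return:
        a = [-1000, 500, 500, 500]     # heavier up-front investment
        b = [-600, 300, 300, 300]
        print(f"NPV(A) = {npv(0.10, a):.2f}")   # 243.43
        print(f"NPV(B) = {npv(0.10, b):.2f}")   # 146.06 -> A is the better choice here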

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Lectures on Financial Mathematics:Discrete Asset Pricing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This is a short book on the fundamental concepts of the no-arbitrage theory of pricing financial derivatives. Its scope is limited to the general discrete setting of models for which the set of possible states is finite and so is the set of possible trading times--this includes the popular binomial tree model. This setting has the advantage of being fairly general while not requiring a sophisticated understanding of analysis at the graduate level. Topics include understanding the several variants of "arbitrage", the fundamental theorems of asset pricing in terms of martingale measures, and applications to forwards and futures. The authors' motivation is to present the material in a way that clarifies as much as possible why the often confusing basic facts are true. Therefore the ideas are organized from a mathematical point of view with the emphasis on understanding exactly what is under the hood and how it works. Every effort is made to include complete explanations and proofs, and the reader is encouraged to work through the exercises throughout the book. The intended audience is students and other readers who have an undergraduate background in mathematics, including exposure to linear algebra, some advanced calculus, and basic probability. The book has been used in earlier forms with students in the MS program in Financial Mathematics at Florida State University, and is a suitable text for students at that level. Students who seek a second look at these topics may also find this book useful. Table of Contents: Overture: Single-Period Models / The General Discrete Model / The Fundamental Theorems of Asset Pricing / Forwards and Futures / Incomplete Markets View full abstract»
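
    A single-period binomial example makes the martingale-measure idea concrete (our sketch of the standard textbook construction, not code from the book): the risk-neutral probability q reprices the stock correctly, and the derivative's no-arbitrage price is its discounted expectation under q.

        def binomial_call_price(s0, u, d, r, strike):
            """One-period binomial model: stock moves to s0*u or s0*d, riskless
            growth factor is (1+r). Requires d < 1+r < u (no arbitrage)."""
            q = (1.0 + r - d) / (u - d)              # risk-neutral up-probability
            payoff_up = max(s0 * u - strike, 0.0)
            payoff_dn = max(s0 * d - strike, 0.0)
            return (q * payoff_up + (1.0 - q) * payoff_dn) / (1.0 + r)

        print(binomial_call_price(s0=100, u=1.2, d=0.9, r=0.05, strike=100))  # ~9.52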

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Ensemble Methods in Data Mining:Improving Accuracy Through Combining Predictions

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Ensemble methods have been called the most influential development in Data Mining and Machine Learning in the past decade. They combine multiple models into one that is usually more accurate than the best of its components. Ensembles can provide a critical boost to industrial challenges -- from investment timing to drug discovery, and fraud detection to recommendation systems -- where predictive accuracy is more vital than model interpretability. Ensembles are useful with all modeling algorithms, but this book focuses on decision trees to explain them most clearly. After describing trees and their strengths and weaknesses, the authors provide an overview of regularization -- today understood to be a key reason for the superior performance of modern ensembling algorithms. The book continues with a clear description of two recent developments: Importance Sampling (IS) and Rule Ensembles (RE). IS reveals classic ensemble methods -- bagging, random forests, and boosting -- to be special cases of a single algorithm, thereby showing how to improve their accuracy and speed. REs are linear rule models derived from decision tree ensembles. They are the most interpretable version of ensembles, which is essential to applications such as credit scoring and fault diagnosis. Lastly, the authors explain the paradox of how ensembles achieve greater accuracy on new data despite their (apparently much greater) complexity. This book is aimed at novice and advanced analytic researchers and practitioners -- especially in Engineering, Statistics, and Computer Science. Those with little exposure to ensembles will learn why and how to employ this breakthrough method, and advanced practitioners will gain insight into building even more powerful models. Throughout, snippets of code in R are provided to illustrate the algorithms described and to encourage the reader to try the techniques. The authors are industry experts in data mining and machine learning who are also adjunct professors and popular speakers. Although early pioneers in discovering and using ensembles, they here distill and clarify the recent groundbreaking work of leading academics (such as Jerome Friedman) to bring the benefits of ensembles to practitioners. Table of Contents: Ensembles Discovered / Predictive Learning and Decision Trees / Model Complexity, Model Selection and Regularization / Importance Sampling and the Classic Ensemble Methods / Rule Ensembles and Interpretation Statistics / Ensemble Complexity View full abstract»
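
    The bagging idea can be tried in a few lines; the book's own snippets are in R, so the following Python equivalent is our sketch and assumes scikit-learn is installed:

        # Bagging: average many trees grown on bootstrap samples of the data.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import BaggingClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        X, y = make_classification(n_samples=500, n_features=20, random_state=0)
        single = DecisionTreeClassifier(random_state=0)
        bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                   random_state=0)
        print("single tree:", cross_val_score(single, X, y, cv=5).mean())
        print("bagged 100 :", cross_val_score(bagged, X, y, cv=5).mean())
        # the bagged ensemble typically scores noticeably higher than one tree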

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Biomedical Image Analysis:Segmentation

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The sequel to the popular lecture book entitled Biomedical Image Analysis: Tracking, this book on Biomedical Image Analysis: Segmentation tackles the challenging task of segmenting biological and medical images. The problem of partitioning multidimensional biomedical data into meaningful regions is perhaps the main roadblock in the automation of biomedical image analysis. Whether the modality of choice is MRI, PET, ultrasound, SPECT, CT, or one of a myriad of microscopy platforms, image segmentation is a vital step in analyzing the constituent biological or medical targets. This book provides a state-of-the-art, comprehensive look at biomedical image segmentation that is accessible to well-equipped undergraduates, graduate students, and research professionals in the biology, biomedical, medical, and engineering fields. Active model methods that have emerged in the last few years are a focus of the book, including parametric active contour and active surface models, active shape models and geometric active contours that adapt to the image topology. Additionally, Biomedical Image Analysis: Segmentation details attractive new methods that use graph theory in segmentation of biomedical imagery. Finally, the use of exciting new scale space tools in biomedical image analysis is reported. Table of Contents: Introduction / Parametric Active Contours / Active Contours in a Bayesian Framework / Geometric Active Contours / Segmentation with Graph Algorithms / Scale-Space Image Filtering for Segmentation View full abstract»
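
    For readers new to the problem, the flavor of segmentation can be conveyed with something far simpler than the active-contour and graph methods the book covers; a region-growing sketch of ours (not a method from the book):

        from collections import deque

        def region_grow(img, seed, tol=0.2):
            """Segment the 4-connected region around `seed` whose pixels differ
            from the seed value by at most `tol` (simple flood fill)."""
            rows, cols = len(img), len(img[0])
            ref, seen, queue = img[seed[0]][seed[1]], {seed}, deque([seed])
            while queue:
                r, c = queue.popleft()
                for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                    if (0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen
                            and abs(img[nr][nc] - ref) <= tol):
                        seen.add((nr, nc))
                        queue.append((nr, nc))
            return seen

        img = [[0.1, 0.1, 0.9],
               [0.1, 0.2, 0.9],
               [0.8, 0.9, 0.9]]
        print(sorted(region_grow(img, seed=(0, 0))))
        # [(0, 0), (0, 1), (1, 0), (1, 1)] -- the dark upper-left region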

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Key Issues Regarding Digital Libraries:Evaluation and Integration

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This is the second book based on the 5S (Societies, Scenarios, Spaces, Structures, Streams) approach to digital libraries (DLs). Leveraging the first volume, on Theoretical Foundations, we focus on the key issues of evaluation and integration. These cross-cutting issues serve as a bridge for those interested in DLs, connecting the introduction and formal discussion in the first book, with the coverage of key technologies in the third book, and of illustrative applications in the fourth book. These two topics have central importance in the DL field, allowing it to be treated scientifically as well as practically. In the scholarly world, we only really understand something if we know how to measure and evaluate it. In the Internet era of distributed information systems, we only can be practical at scale if we integrate across both systems and their associated content. Evaluation of DLs must take place at multiple levels, so we can address the different entities and their associated measures. Thus, for digital objects, we assess accessibility, pertinence, preservability, relevance, significance, similarity, and timeliness. Other measures are specific to higher-level constructs like metadata, collections, catalogs, repositories, and services. We tie these together through a case study of the 5SQual tool, which we designed and implemented to perform an automatic quantitative evaluation of DLs. Thus, across the Information Life Cycle, we describe metrics and software useful to assess the quality of DLs, and demonstrate utility with regard to representative application areas: archaeology and education. Though integration has been a challenge since the earliest work on DLs, we provide the first comprehensive 5S-based formal description of the DL integration problem, cast in the context of related work. Since archaeology is a fundamentally distributed enterprise, we describe ETANADL, for integrating Near Eastern Archeology sites and information. Thus, we show how 5S-based modeling can lead to integrated services and content. While the first book adopts a minimalist and formal approach to DLs, and provides a systematic and functional method to design and implement DL exploring services, here we broaden to practical DLs with richer metamodels, demonstrating the power of 5S for integration and evaluation. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Basic Probability Theory for Biomedical Engineers

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers, and scientists at all levels of background and experience for the application of this theory to a wide variety of problems—as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first chapter. The second chapter introduces the topic of random variables. Later chapters simply expand upon these key ideas and extend the range of application. A considerable effort has been made to develop the theory in a logical manner—developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology—both within the engineering community as well as the probability and statistics literature. Biomedical engineering examples are introduced throughout the text and a large number of self-study problems are available for the reader. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Humanitarian Engineering

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Humanitarian Engineering reviews the development of engineering as a distinct profession and of the humanitarian movement as a special socio-political practice. Having noted that the two developments were situated in the same geographical and historical space -- that is, in Europe and North America beginning in the 1700s -- the book argues for a mutual influence and synthesis that has previously been lacking. In this spirit, the first of two central chapters describes humanitarian engineering as the artful drawing on science to direct the resources of nature with active compassion to meet the basic needs of all -- especially the powerless, poor, or otherwise marginalized. A second central chapter then considers strategies for education in humanitarian engineering so conceived. Two final chapters consider challenges and implications. Table of Contents: Engineering / Humanitarianism / Humanitarian Engineering / Humanitarian Engineering Education / Challenges / Conclusion: Humanizing Technology View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Space-Time Computing with Temporal Neural Networks

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, its energy efficiency is truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended both to give background and to lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author.

    As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm. Drawing from the biological features, a mathematics-based computational paradigm is constructed. The key feature is spiking neurons that perform communication and processing in space-time, with emphasis on time. In this paradigm, time is used as a freely available resource for both communication and computation.

    Neuron models are first discussed in general, and one is chosen for detailed development. Using the model, single-neuron computation is first explored. Neuron inputs are encoded as spike patterns, and the neuron is trained to identify input pattern similarities. Individual neurons are building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering. Similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher-level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is described.

    View full abstract»
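
    As a minimal point of entry into spiking-neuron modeling (our sketch; the book develops a richer neuron model of its own), a leaky integrate-and-fire unit accumulates weighted input spikes, decays toward rest, and fires on crossing threshold, so the timing of inputs, not just their count, determines the output:

        # Leaky integrate-and-fire neuron, discrete time. Input spikes arrive as
        # (time_step, weight) pairs; the membrane potential decays each step.
        def lif_neuron(input_spikes, steps=50, leak=0.9, threshold=1.0):
            incoming = {}
            for t, w in input_spikes:
                incoming[t] = incoming.get(t, 0.0) + w
            v, out = 0.0, []
            for t in range(steps):
                v = v * leak + incoming.get(t, 0.0)
                if v >= threshold:           # fire and reset
                    out.append(t)
                    v = 0.0
            return out

        # Three coincident weak inputs fire the neuron; spread out, they do not.
        print(lif_neuron([(5, 0.4), (6, 0.4), (7, 0.4)]))   # [7]
        print(lif_neuron([(5, 0.4), (15, 0.4), (25, 0.4)])) # []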

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamentals of Electromagnetics 2:Quasistatics and Waves

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book is the second of two volumes which have been created to provide an understanding of the basic principles and applications of electromagnetic fields for electrical engineering students. Fundamentals of Electromagnetics Vol 2: Quasistatics and Waves examines how the low-frequency models of lumped elements are modified to include parasitic elements. For even higher frequencies, wave behavior in space and on transmission lines is explained. Finally, the textbook concludes with details of transmission line properties and applications. Upon completion of this book and its companion Fundamentals of Electromagnetics Vol 1: Internal Behavior of Lumped Elements, which focuses on the DC and low-frequency behavior of electromagnetic fields within lumped elements, students will have gained the necessary knowledge to progress to advanced studies of electromagnetics. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Modern Blind Signal Separation Algorithms:Theory and Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With human-computer interactions and hands-free communications becoming overwhelmingly important in the new millennium, recent research efforts have been increasingly focusing on state-of-the-art multi-microphone signal processing solutions to improve speech intelligibility in adverse environments. One such prominent statistical signal processing technique is blind signal separation (BSS). BSS was first introduced in the early 1990s and quickly emerged as an area of intense research activity showing huge potential in numerous applications. BSS comprises the task of 'blindly' recovering a set of unknown signals, the so-called sources, from their observed mixtures, based on very little to almost no prior knowledge about the source characteristics or the mixing structure. The goal of BSS is to process multi-sensory observations of an inaccessible set of signals in a manner that reveals their individual (and original) form, by exploiting the spatial and temporal diversity, readily accessible through a multi-microphone configuration. Proceeding blindly exhibits a number of advantages, since assumptions about the room configuration and the source-to-sensor geometry can be relaxed without affecting overall efficiency. This booklet investigates one of the most commercially attractive applications of BSS, which is the simultaneous recovery of signals inside a reverberant (naturally echoing) environment, using two (or more) microphones. In this paradigm, each microphone captures not only the direct contributions from each source, but also several reflected copies of the original signals at different propagation delays. These recordings are referred to as the convolutive mixtures of the original sources. The goal of this booklet in the lecture series is to provide insight on recent advances in algorithms, which are ideally suited for blind signal separation of convolutive speech mixtures. More importantly, specific emphasis is given in practical applications of the developed BSS algorithms associated with real-life scenarios. The developed algorithms are put in the context of modern DSP devices, such as hearing aids and cochlear implants, where design requirements dictate low power consumption and call for portability and compact size. Along these lines, this booklet focuses on modern BSS algorithms which address (1) the limited amount of processing power and (2) the small number of microphones available to the end-user. Table of Contents: Fundamentals of blind signal separation / Modern blind signal separation algorithms / Application of blind signal processing strategies to noise reduction for the hearing-impaired / Conclusions and future challenges / Bibliography View full abstract»
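
    The flavor of BSS is easiest to see in the instantaneous (non-convolutive) two-channel case, which is much simpler than the reverberant problem this booklet treats; our sketch below whitens the mixtures and then searches for the rotation that maximizes non-Gaussianity, a basic ICA-style recipe:

        import numpy as np

        t = np.linspace(0, 1, 4000)
        s = np.vstack([np.sin(2 * np.pi * 13 * t),            # two independent sources
                       np.sign(np.sin(2 * np.pi * 7 * t))])
        x = np.array([[0.7, 0.3], [0.4, 0.6]]) @ s            # observed mixtures

        # Whiten: decorrelate and normalize the mixtures.
        x = x - x.mean(axis=1, keepdims=True)
        d, e = np.linalg.eigh(np.cov(x))
        z = np.diag(d ** -0.5) @ e.T @ x

        # Search the remaining rotation for maximum non-Gaussianity (kurtosis).
        def kurt(y):
            return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

        best = max((abs(kurt(np.cos(a) * z[0] + np.sin(a) * z[1])), a)
                   for a in np.linspace(0, np.pi / 2, 181))
        a = best[1]
        y = np.array([[np.cos(a), np.sin(a)], [-np.sin(a), np.cos(a)]]) @ z
        for i in range(2):   # each output should correlate strongly with one source
            print([round(abs(np.corrcoef(y[i], s[j])[0, 1]), 2) for j in range(2)])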

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Joint Source-Channel Video Transmission

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book deals with the problem of joint source-channel video transmission, i.e., the joint optimal allocation of resources at the application layer and the other network layers, such as data rate adaptation, channel coding, power adaptation in wireless networks, quality of service (QoS) support from the network, and packet scheduling, for efficient video transmission. Real-time video communication applications, such as videoconferencing, video telephony, and on-demand video streaming, have gained increased popularity. However, a key problem in video transmission over the existing Internet and wireless networks is the incompatibility between the nature of the network conditions and the QoS requirements (in terms, for example, of bandwidth, delay, and packet loss) of real-time video applications. To deal with this incompatibility, a natural approach is to adapt the end-system to the network. The joint source-channel coding approach aims to efficiently perform content-aware cross-layer resource allocation, thus increasing the communication efficiency of multiple network layers. Our purpose in this book is to review the basic elements of the state-of-the-art approaches toward joint source-channel video transmission for wired and wireless systems. In this book, we present a general resource-distortion optimization framework, which is used throughout the book to guide our discussions on various techniques of joint source-channel video transmission. In this framework, network resources from multiple layers are assigned to each video packet according to its level of importance. It provides not only an optimization benchmark against which the performance of other sub-optimal systems can be evaluated, but also a useful tool for assessing the effectiveness of different error control components in practical system design. This book is therefore written to be accessible to researchers, expert industrial R&D engineers, and university students who are interested in the cutting-edge technologies in joint source-channel video transmission. Contents: Introduction / Elements of a Video Communication System / Joint Source-Channel Coding / Error-Resilient Video Coding / Channel Modeling and Channel Coding / Internet Video Transmission / Wireless Video Transmission / Conclusions View full abstract»
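
    A resource-distortion framework of this kind can be summarized in one line (standard formulation, our notation): choose the per-packet resource assignments $\mathbf{r}$ to minimize expected distortion subject to a resource budget,

        $$\min_{\mathbf{r}}\ \mathbb{E}\bigl[D(\mathbf{r})\bigr]
          \quad\text{subject to}\quad R(\mathbf{r})\le R_{\max},$$

    which in practice is relaxed to the unconstrained Lagrangian $\min_{\mathbf{r}}\,\mathbb{E}[D(\mathbf{r})]+\lambda R(\mathbf{r})$, with the multiplier $\lambda\ge 0$ swept until the budget is met.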

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Linguistic Fundamentals for Natural Language Processing:100 Essentials from Morphology and Syntax

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Many NLP tasks have at their core a subtask of extracting the dependencies—who did what to whom—from natural language sentences. This task can be understood as the inverse of the problem solved in different ways by diverse human languages, namely, how to indicate the relationship between different parts of a sentence. Understanding how languages solve the problem can be extremely useful in both feature design and error analysis in the application of machine learning to NLP. Likewise, understanding cross-linguistic variation can be important for the design of MT systems and other multilingual applications. The purpose of this book is to present in a succinct and accessible fashion information about the morphological and syntactic structure of human languages that can be useful in creating more linguistically sophisticated, more language-independent, and thus more successful NLP systems. Table of Contents: Acknowledgments / Introduction/motivation / Morphology: Introduction / Morphophonology / Morphosyntax / Syntax: Introduction / Parts of speech / Heads, arguments, and adjuncts / Argument types and grammatical functions / Mismatches between syntactic position and semantic roles / Resources / Bibliography / Author's Biography / General Index / Index of Languages View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Image-Based Visualization:Interactive Multidimensional Data Exploration

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Our society has entered a data-driven era, one in which not only are enormous amounts of data being generated daily but there are also growing expectations placed on the analysis of this data. Some data have become simply too large to be displayed and some have too short a lifespan to be handled properly with classical visualization or analysis methods. In order to address these issues, this book explores the potential solutions where we not only visualize data, but also allow users to be able to interact with it. Therefore, this book will focus on two main topics: large dataset visualization and interaction. Graphics cards and their image processing power can leverage large data visualization but they can also be of great interest to support interaction. Therefore, this book will show how to take advantage of graphics card computation power with techniques called GPGPUs (general-purpose computing on graphics processing units). As specific examples, this book details GPGPU usages to produce fast enough visualization to be interactive with improved brushing techniques, fast animations between different data representations, and view simplifications (i.e., static and dynamic bundling techniques). Since data storage and memory limitation is less and less of an issue, we will also present techniques to reduce computation time by using memory as a new tool to solve computationally challenging problems. We will investigate innovative data processing techniques: while classical algorithms are expressed in data space (e.g., computation on geographic locations), we will express them in graphic space (e.g., a raster map, like a screen composed of pixels). This consists of two steps: (1) a data representation is built using straightforward visualization techniques; and (2) the resulting image undergoes purely graphical transformations using image processing techniques. This type of technique is called image-based visualization. The goal of this book is to explore new computing techniques using image-based techniques to provide efficient visualizations and user interfaces for the exploration of large datasets. This book concentrates on the areas of information visualization, visual analytics, computer graphics, and human-computer interaction. This book opens up a whole field of study, including the scientific validation of these techniques, their limitations, and their generalizations to different types of datasets. View full abstract»
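
    The two-step "express it in graphic space" idea can be shown even without a GPU; our sketch below (assuming numpy and scipy are available, whereas the book's versions run as GPGPU passes) rasterizes the data once and then works only on the fixed-size image:

        import numpy as np
        from scipy.ndimage import uniform_filter   # assumes scipy is installed

        rng = np.random.default_rng(1)
        pts = rng.normal(loc=[[0.3], [0.7]], scale=0.08, size=(2, 100_000))

        # Step 1: build the data representation -- rasterize points to a pixel grid.
        density, _, _ = np.histogram2d(pts[0], pts[1], bins=256, range=[[0, 1], [0, 1]])

        # Step 2: purely graphical transformation -- e.g., smooth the image; from
        # here on, cost depends on the 256x256 raster, not on the 100,000 points.
        smoothed = uniform_filter(density, size=5)
        print(density.shape, float(smoothed.max()))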

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bad to the Bone:Crafting Electronic Systems with BeagleBone and BeagleBone Black

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This comprehensive book provides detailed materials for both novice and experienced programmers using all BeagleBone variants which host a powerful 32-bit, super-scalar TI Sitara ARM Cortex A8 processor. Authored by Steven F. Barrett and Jason Kridner, a seasoned ECE educator along with the founder of Beagleboard.org, respectively, the work may be used in a wide variety of projects from science fair projects to university courses and senior design projects to first prototypes of very complex systems. Beginners may access the power of the "Bone" through the user-friendly Bonescript examples. Seasoned users may take full advantage of the Bone's power using the underlying Linux-based operating system, a host of feature extension boards (Capes) and a wide variety of Linux community open source libraries. The book contains background theory on system operation coupled with many well-documented, illustrative examples. Examples for novice users are centered on motivational, fun robot projects while advanced projects follow the theme of assistive technology and image processing applications. View full abstract»
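
    The book's beginner examples use Bonescript (JavaScript); for consistency with the other sketches here, the following is our Python take on the classic first exercise, blinking an LED through the Bone's Linux sysfs GPIO interface. The pin number 60 is an assumption, so check your board's header map before use:

        # Blink an LED via the legacy sysfs GPIO interface on the Bone's Linux.
        import time

        GPIO = "/sys/class/gpio"
        PIN = "60"            # assumed pin number -- verify against your header map

        with open(f"{GPIO}/export", "w") as f:       # may fail if already exported
            f.write(PIN)
        with open(f"{GPIO}/gpio{PIN}/direction", "w") as f:
            f.write("out")
        for _ in range(10):
            for value in ("1", "0"):                 # toggle once per second
                with open(f"{GPIO}/gpio{PIN}/value", "w") as f:
                    f.write(value)
                time.sleep(0.5)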

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    PSpice for Circuit Theory and Electronic Devices

    Copyright Year: 2007

    Morgan and Claypool eBooks

    PSpice for Circuit Theory and Electronic Devices is one of a series of five PSpice books and introduces the latest Cadence Orcad PSpice version 10.5 by simulating a range of DC and AC exercises. It is aimed primarily at those wishing to get up to speed with this version but will be of use to high school students, undergraduate students, and of course, lecturers. Circuit theorems are applied to a range of circuits and the calculations by hand after analysis are then compared to the simulated results. The Laplace transform and the s-plane are used to analyze CR and LR circuits where transient signals are involved. Here, the Probe output graphs demonstrate what a great learning tool PSpice is by providing the reader with a visual verification of any theoretical calculations. Series and parallel-tuned resonant circuits are investigated where the difficult concepts of dynamic impedance and selectivity are best understood by sweeping different circuit parameters through a range of values. Obtaining semiconductor device characteristics as a laboratory exercise has fallen out of favour of late, but nevertheless, is still a useful exercise for understanding or modelling semiconductor devices. Inverting and non-inverting operational amplifier characteristics such as gain-bandwidth are investigated and we will see the dependency of bandwidth on the gain using the performance analysis facility. Power amplifiers are examined where PSpice/Probe demonstrates very nicely the problems of cross-over distortion and other problems associated with power transistors. We examine power supplies and the problems of regulation, ground bounce, and power factor correction. Lastly, we look at MOSFET device characteristics and show how these devices are used to form basic CMOS logic gates such as NAND and NOR gates. View full abstract»
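
    Hand analysis can be cross-checked outside PSpice too; a small sketch of ours comparing the Laplace-domain result v(t) = V(1 - e^(-t/RC)) for a charging CR circuit against a crude numerical solution:

        import math

        V, R, C = 5.0, 1e3, 1e-6          # 5 V step into a 1 kOhm / 1 uF network
        tau, dt = R * C, 1e-6
        v, t = 0.0, 0.0
        for _ in range(3000):              # simulate 3 ms, i.e., 3 time constants
            v += dt * (V - v) / tau        # C dv/dt = (V - v)/R  (forward Euler)
            t += dt
        analytic = V * (1.0 - math.exp(-t / tau))
        print(f"euler {v:.4f} V  vs  analytic {analytic:.4f} V")  # both ~4.75 V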

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Hard Problems in Software Testing:Solutions Using Testing as a Service (TaaS)

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book summarizes the current hard problems in software testing as voiced by leading practitioners in the field. The problems were identified through a series of workshops, interviews, and surveys. Some of the problems are timeless, such as education and training, while others, such as system security, have recently emerged as increasingly important. The book also provides an overview of the current state of Testing as a Service (TaaS) based on an exploration of existing commercial offerings and a survey of academic research. TaaS is a relatively new development that offers software testers the elastic computing capabilities and generous storage capacity of the cloud on an as-needed basis. Some of the potential benefits of TaaS include automated provisioning of test execution environments and support for rapid feedback in agile development via continuous regression testing. The book includes a case study of a representative web application and three commercial TaaS tools to determine which hard problems in software testing are amenable to a TaaS solution. The findings suggest there remains a significant gap that must be addressed before TaaS can be fully embraced by the industry, particularly in the areas of tester education and training and in tool support for more types of testing. The book includes a roadmap for enhancing TaaS to help bridge the gap between potential benefits and actual results. Table of Contents: Introduction / Hard Problems in Software Testing / Testing as a Service (TaaS) / Case Study and Gap Analysis / Summary / Appendix A: Hard Problems in Software Testing Survey / Appendix B: Google App Engine Code Examples / Appendix C: Sauce Labs Code Examples / References / Author Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sentiment Analysis and Opinion Mining

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Sentiment analysis and opinion mining is the field of study that analyzes people's opinions, sentiments, evaluations, attitudes, and emotions from written language. It is one of the most active research areas in natural language processing and is also widely studied in data mining, Web mining, and text mining. In fact, this research has spread outside of computer science to the management sciences and social sciences due to its importance to business and society as a whole. The growing importance of sentiment analysis coincides with the growth of social media such as reviews, forum discussions, blogs, micro-blogs, Twitter, and social networks. For the first time in human history, we now have a huge volume of opinionated data recorded in digital form for analysis. Sentiment analysis systems are being applied in almost every business and social domain because opinions are central to almost all human activities and are key influencers of our behaviors. Our beliefs and perceptions of reality, and the choices we make, are largely conditioned on how others see and evaluate the world. For this reason, when we need to make a decision we often seek out the opinions of others. This is true not only for individuals but also for organizations. This book is a comprehensive introductory and survey text. It covers all important topics and the latest developments in the field with over 400 references. It is suitable for students, researchers and practitioners who are interested in social media analysis in general and sentiment analysis in particular. Lecturers can readily use it in class for courses on natural language processing, social media analysis, text mining, and data mining. Lecture slides are also available online. Table of Contents: Preface / Sentiment Analysis: A Fascinating Problem / The Problem of Sentiment Analysis / Document Sentiment Classification / Sentence Subjectivity and Sentiment Classification / Aspect-Based Sentiment Analysis / Sentiment Lexicon Generation / Opinion Summarization / Analysis of Comparative Opinions / Opinion Search and Retrieval / Opinion Spam Detection / Quality of Reviews / Concluding Remarks / Bibliography / Author Biography View full abstract»
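
    Document-level sentiment classification in its very simplest, lexicon-based form fits in a few lines; this toy sketch (ours, with a made-up mini-lexicon) hints at why the book's aspect-based and lexicon-generation chapters are needed:

        LEXICON = {"good": 1, "great": 2, "love": 2, "bad": -1, "awful": -2,
                   "disappointing": -2}
        NEGATORS = {"not", "never", "no"}

        def sentiment(text):
            """Sum lexicon scores, flipping polarity right after a negation word."""
            score, flip = 0, 1
            for word in text.lower().split():
                if word in NEGATORS:
                    flip = -1
                    continue
                score += flip * LEXICON.get(word, 0)
                flip = 1
            return score

        print(sentiment("the camera is great but the battery is awful"))  # 0
        print(sentiment("not bad at all"))                                # 1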

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Network Simulation

    Copyright Year: 2006

    Morgan and Claypool eBooks

    A detailed introduction to the design, implementation, and use of network simulation tools is presented. The requirements and issues faced in the design of simulators for wired and wireless networks are discussed. Abstractions such as packet- and fluid-level network models are covered. Several existing simulations are given as examples, with details and rationales regarding design decisions presented. Issues regarding performance and scalability are discussed in detail, describing how one can utilize distributed simulation methods to increase the scale and performance of a simulation environment. Finally, a case study of two simulation tools that have been developed using distributed simulation techniques is presented. This text is essential to any student, researcher, or network architect desiring a detailed understanding of how network simulation tools are designed, implemented, and used. View full abstract»
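
    At the core of every packet-level simulator of this kind is a time-ordered event loop; a minimal sketch of ours (hypothetical three-hop topology, fixed link latency):

        import heapq

        # Minimal discrete-event engine: events are (time, seq, callback).
        events, seq = [], 0

        def schedule(time, callback):
            global seq
            heapq.heappush(events, (time, seq, callback))
            seq += 1

        def send_packet(now, hop, latency=1.5):
            print(f"t={now:4.1f}  packet at hop {hop}")
            if hop < 3:   # forward to the next hop after the link latency
                schedule(now + latency, lambda t: send_packet(t, hop + 1))

        schedule(0.0, lambda t: send_packet(t, 0))
        while events:                         # run the simulation to completion
            now, _, callback = heapq.heappop(events)
            callback(now)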

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Libraries and the Semantic Web:An Introduction to Its Applications and Opportunities for Libraries

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book covers the concept of the Semantic Web—what it is, the components that comprise it, including Linked Data, and the various ways that libraries are engaged in contributing to its development in making library resources and services ever more accessible to end-users. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Adaptive High-Resolution Sensor Waveform Design for Tracking

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Recent innovations in modern radar for designing transmitted waveforms, coupled with new algorithms for adaptively selecting the waveform parameters at each time step, have resulted in improvements in tracking performance. Of particular interest are waveforms that can be mathematically designed to have reduced ambiguity function sidelobes, as their use can lead to an increase in the target state estimation accuracy. Moreover, adaptively positioning the sidelobes can reveal weak target returns by reducing interference from stronger targets. The manuscript provides an overview of recent advances in the design of multicarrier phase-coded waveforms based on Bjorck constant-amplitude zero-autocorrelation (CAZAC) sequences for use in an adaptive waveform selection scheme for multiple target tracking. The adaptive waveform design is formulated using sequential Monte Carlo techniques that need to be matched to the high resolution measurements. The work will be of interest to both practitioners and researchers in radar as well as to researchers in other applications where high resolution measurements can have significant benefits. Table of Contents: Introduction / Radar Waveform Design / Target Tracking with a Particle Filter / Single Target Tracking with LFM and CAZAC Sequences / Multiple Target Tracking / Conclusions View full abstract»
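
    The CAZAC property itself is easy to verify numerically; our sketch below uses the better-known Zadoff-Chu family, a different CAZAC construction than the Bjorck sequences the manuscript builds on, to check constant amplitude and zero cyclic autocorrelation:

        import numpy as np

        def zadoff_chu(root, n):
            """Zadoff-Chu CAZAC sequence of odd length n, gcd(root, n) = 1."""
            k = np.arange(n)
            return np.exp(-1j * np.pi * root * k * (k + 1) / n)

        z = zadoff_chu(root=5, n=31)
        amp = np.abs(z)                                   # constant amplitude
        acf = [abs(np.vdot(z, np.roll(z, s))) for s in range(1, 31)]
        print("amplitude spread:", float(amp.max() - amp.min()))   # ~0
        print("max cyclic autocorrelation sidelobe:", max(acf))    # ~0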

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Oral Communication Excellence for Engineers and Scientists

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Many of us have implemented oral communication instruction in our design courses, lab courses, and other courses where students give presentations. Others have students give presentations without instruction on how to become a better presenter. Many of us, then, could use a concise book that guides us on what instruction on oral communication should include, based on input from executives from different settings. This instruction will help our students get jobs and make them more likely to move up the career ladder, especially in these hard economic times. Oral Communication Excellence for Engineers and Scientists: Based on Executive Input is the tool we need. It is based on input from over 75 executives with engineering or science degrees, leading organizations that employ engineers and scientists. For the presentation chapter, the executives described what makes a “stellar presentation.” And for every other chapter, they gave input—on, for example, how to effectively communicate in meetings and in teams, how to excel at phone communication, how to communicate electronically to supplement oral communication, and how to meet the challenges of oral communication. They also provided tips on cross-cultural communication, listening, choosing the appropriate medium for a communication, elevator pitches, and posters; and using oral communication to network on the job. Oral Communication Excellence for Engineers and Scientists includes exercises and activities for students and professionals, based on instruction that has improved Georgia Tech’s students’ presentation skills at a statistically significant level. Slides demonstrating best practices are included from Capstone Design students around the country. Table of Contents: Introduction / Background Preparation / Presentation: Customizing to your Audience / Presentation: Telling your Story / Presentation: Displaying Key Information / Delivering the Presentation / Other Oral Communication Skills / Advanced Oral Communication Skills / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    P2P Techniques for Decentralized Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    As an alternative to traditional client-server systems, Peer-to-Peer (P2P) systems provide major advantages in terms of scalability, autonomy and dynamic behavior of peers, and decentralization of control. Thus, they are well suited for large-scale data sharing in distributed environments. Most of the existing P2P approaches for data sharing rely on either structured networks (e.g., DHTs) for efficient indexing, or unstructured networks for ease of deployment, or some combination. However, these approaches have some limitations, such as lack of freedom for data placement in DHTs, and high latency and high network traffic in unstructured networks. To address these limitations, gossip protocols which are easy to deploy and scale well, can be exploited. In this book, we will give an overview of these different P2P techniques and architectures, discuss their trade-offs, and illustrate their use for decentralizing several large-scale data sharing applications. Table of Contents: P2P Overlays, Query Routing, and Gossiping / Content Distribution in P2P Systems / Recommendation Systems / Top-k Query Processing in P2P Systems View full abstract»
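
    The structured (DHT) side of that trade-off rests on consistent hashing; a minimal ring sketch of ours (hypothetical peer names), where each key is owned by the first peer clockwise from its hash, which is precisely why data placement is not free in a DHT:

        import bisect
        import hashlib

        def h(name, space=2**16):
            return int(hashlib.sha1(name.encode()).hexdigest(), 16) % space

        peers = sorted((h(p), p) for p in ["peer-a", "peer-b", "peer-c", "peer-d"])
        ring = [hv for hv, _ in peers]

        def lookup(key):
            """A key is owned by the first peer at or after its hash (wrapping)."""
            i = bisect.bisect_left(ring, h(key)) % len(ring)
            return peers[i][1]

        for key in ["movie.mkv", "song.mp3", "paper.pdf"]:
            print(key, "->", lookup(key))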

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to the Locally-Corrected Nyström Method

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This lecture provides a tutorial introduction to the Nyström and locally-corrected Nyström methods when used for the numerical solutions of the common integral equations of two-dimensional electromagnetic fields. These equations exhibit kernel singularities that complicate their numerical solution. Classical and generalized Gaussian quadrature rules are reviewed. The traditional Nyström method is summarized, and applied to the magnetic field equation for illustration. To obtain high order accuracy in the numerical results, the locally-corrected Nyström method is developed and applied to both the electric field and magnetic field equations. In the presence of target edges, where current or charge density singularities occur, the method must be extended through the use of appropriate singular basis functions and special quadrature rules. This extension is also described. Table of Contents: Introduction / Classical Quadrature Rules / The Classical Nyström Method / The Locally-Corrected Nyström Method / Generalized Gaussian Quadrature / LCN Treatment of Edge Singularities View full abstract»
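
    The classical Nyström idea, before any local correction, is simply "replace the integral with a quadrature rule and collocate at the nodes"; our sketch for a smooth one-dimensional Fredholm equation of the second kind, phi(x) - lam * integral of K(x,y) phi(y) dy = f(x), with a manufactured exact solution:

        import numpy as np

        # Smooth test kernel; f is manufactured so that phi(x) = x is exact.
        lam = 0.5
        K = lambda x, y: x * y
        phi_exact = lambda x: x
        f = lambda x: x - lam * x / 3.0        # x - lam * integral(x*y*y, y=0..1)

        n = 8                                   # Gauss-Legendre nodes on [0, 1]
        nodes, weights = np.polynomial.legendre.leggauss(n)
        nodes, weights = (nodes + 1) / 2, weights / 2

        # Nystrom: solve (I - lam * K W) phi = f at the quadrature nodes.
        A = np.eye(n) - lam * K(nodes[:, None], nodes[None, :]) * weights[None, :]
        phi = np.linalg.solve(A, f(nodes))
        print("max error:", float(np.max(np.abs(phi - phi_exact(nodes)))))  # ~1e-16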

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Hardware and Software Support for Virtualization

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This book focuses on the core question of the necessary architectural support provided by hardware to efficiently run virtual machines, and of the corresponding design of the hypervisors that run them. Virtualization is still possible when the instruction set architecture lacks such support, but the hypervisor remains more complex and must rely on additional techniques.

    Despite the focus on architectural support in current architectures, some historical perspective is necessary to appropriately frame the problem. The first half of the book provides the historical perspective of the theoretical framework developed four decades ago by Popek and Goldberg. It also describes earlier systems that enabled virtualization despite the lack of architectural support in hardware.

    As is often the case, theory defines a necessary—but not sufficient—set of features, and modern architectures are the result of the combination of the theoretical framework with insights derived from practical systems. The second half of the book describes state-of-the-art support for virtualization in both x86-64 and ARM processors. This book includes an in-depth description of the CPU, memory, and I/O virtualization of these two processor architectures, as well as case studies on the Linux/KVM, VMware, and Xen hypervisors. It concludes with a performance comparison of virtualization on current-generation x86- and ARM-based systems across multiple hypervisors. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Cross-Language Information Retrieval

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Search for information is no longer exclusively limited within the native language of the user, but is more and more extended to other languages. This gives rise to the problem of cross-language information retrieval (CLIR), whose goal is to find relevant information written in a language different from that of the query. In addition to the problems of monolingual information retrieval (IR), translation is the key problem in CLIR: one should translate either the query or the documents from one language to another. However, this translation problem is not identical to full-text machine translation (MT): the goal is not to produce a human-readable translation, but a translation suitable for finding relevant documents. Specific translation methods are thus required. The goal of this book is to provide a comprehensive description of the specific problems arising in CLIR, the solutions proposed in this area, as well as the remaining problems. The book starts with a general description of the monolingual IR and CLIR problems. Different classes of approaches to translation are then presented: approaches using an MT system, dictionary-based translation and approaches based on parallel and comparable corpora. In addition, the typical retrieval effectiveness using different approaches is compared. It will be shown that translation approaches specifically designed for CLIR can rival and outperform high-quality MT systems. Finally, the book offers a look into the future that draws a strong parallel between query expansion in monolingual IR and query translation in CLIR, suggesting that many approaches developed in monolingual IR can be adapted to CLIR. The book can be used as an introduction to CLIR. Advanced readers can also find more technical details and discussions about the remaining research challenges in the future. It is suitable to new researchers who intend to carry out research on CLIR. Table of Contents: Preface / Introduction / Using Manually Constructed Translation Systems and Resources for CLIR / Translation Based on Parallel and Comparable Corpora / Other Methods to Improve CLIR / A Look into the Future: Toward a Unified View of Monolingual IR and CLIR? / References / Author Biography View full abstract»
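
    Dictionary-based query translation, the simplest class of approach surveyed, maps each query term onto weighted translation candidates rather than forcing a single "best" rendering as full MT would; a toy sketch of ours, with a made-up mini-dictionary:

        # Toy bilingual dictionary: each source term maps to weighted candidate
        # translations (weights would come from a real lexicon or corpus).
        DICT = {
            "voiture": [("car", 0.7), ("automobile", 0.3)],
            "rouge":   [("red", 1.0)],
        }

        def translate_query(query):
            """Structured query translation: keep all weighted candidates;
            unknown terms pass through untranslated."""
            return [DICT.get(term, [(term, 1.0)]) for term in query.lower().split()]

        print(translate_query("Voiture rouge"))
        # [[('car', 0.7), ('automobile', 0.3)], [('red', 1.0)]]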

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Interactive Technologies for Autism:A Review

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Development, deployment, and evaluation of interactive technologies for individuals with autism have been rapidly increasing over the last decade. There is great promise for the use of these types of technologies to enrich interventions, facilitate communication, and support data collection. Emerging technologies in this area also have the potential to enhance assessment and diagnosis of individuals with autism, to understand the nature of autism, and to help researchers conduct basic and applied research. This book provides an in-depth review of the historical and state-of-the-art use of technology by and for individuals with autism. The intention is to give readers a comprehensive background in order to understand what has been done and what promises and challenges lie ahead. By providing a classification scheme and general review, this book can also help technology designers and researchers better understand what technologies have been successful, what problems remain open, and where innovations can further address challenges and opportunities for individuals with autism and the variety of stakeholders connected to them. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamics and Control of DC-DC Converters

    Copyright Year: 2018

    Morgan and Claypool eBooks

    DC-DC converters have many applications in the modern world. They provide the required power to communication backbones, they are used in digital devices like laptops and cell phones, and they have widespread applications in electric cars, to name just a few.

    DC-DC converters require negative feedback to provide a suitable output voltage or current for the load. Obtaining a stable output voltage or current in the presence of disturbances such as input voltage changes and/or output load changes seems impossible without some form of control.

    This book aims to teach the art of controller design for DC-DC converters. Chapter 1 introduces DC-DC converters briefly. It is assumed that the reader has a basic knowledge of DC-DC converters (i.e., a basic course in power electronics).

    The reader learns the disadvantages of open-loop control in Chapter 2. Simulation of DC-DC converters with the aid of Simulink® is discussed in this chapter as well. Extracting the dynamic models of DC-DC converters is studied in Chapter 3. We show how MATLAB® and a software package named KUCA can be used to do the cumbersome and error-prone process of modeling automatically. Obtaining the transfer functions using PSIM® is studied as well.

    These days, software is an integral part of the engineering sciences, and control engineering is no exception. Keeping this in mind, we design the controllers using MATLAB® in Chapter 4.

    Finally, references are provided at the end of each chapter to suggest more information for an interested reader. The intended audiences for this book are practicing engineers and academics. View full abstract»
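
    For orientation (our sketch; the book itself works with Simulink and PSIM models), the averaged model of a buck converter is a driven LC filter whose steady-state output is D times Vin, which is exactly why an input-voltage disturbance moves the output in open loop:

        # Averaged buck-converter model: L di/dt = D*Vin - v,  C dv/dt = i - v/R.
        L, C, R, D = 100e-6, 100e-6, 10.0, 0.5    # illustrative component values
        dt = 1e-6
        i = v = 0.0
        vin = 12.0
        for step in range(40000):
            if step == 20000:
                vin = 10.0                 # input sag: open loop cannot reject it
            i += dt * (D * vin - v) / L
            v += dt * (i - v / R) / C
            if step in (19999, 39999):
                print(f"t={step * dt * 1e3:.0f} ms  vout={v:.2f} V")
        # prints ~6.00 V (D*12) then ~5.00 V (D*10): hence the need for feedback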

    Joint Source Channel Coding Using Arithmetic Codes

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Based on the encoding process, arithmetic codes can be viewed as tree codes, and current proposals for decoding arithmetic codes with forbidden symbols belong to sequential decoding algorithms and their variants. In this monograph, we propose a new way of looking at arithmetic codes with forbidden symbols. If a limit is imposed on the maximum value of a key parameter in the encoder, this modified arithmetic encoder can also be modeled as a finite state machine, and the code generated can be treated as a variable-length trellis code. The number of states used can be reduced, and techniques used for decoding convolutional codes, such as the list Viterbi decoding algorithm, can be applied directly on the trellis. The finite state machine interpretation can be easily migrated to the Markov source case. We can encode Markov sources without considering the conditional probabilities, while using the list Viterbi decoding algorithm, which utilizes the conditional probabilities. We can also use context-based arithmetic coding to exploit the conditional probabilities of the Markov source and apply a finite state machine interpretation to this problem. The finite state machine interpretation also allows us to more systematically understand arithmetic codes with forbidden symbols, and to find their partial distance spectrum. We also propose arithmetic codes with memories, which use high-memory but low-implementation-precision arithmetic codes. The low implementation precision results in a state machine with less complexity, and the introduced input memories allow us to switch the probability functions used for arithmetic coding. Combining these two methods gives us a huge parameter space of arithmetic codes with forbidden symbols. Hence we can choose codes with better distance properties while maintaining the encoding efficiency and decoding complexity. A construction and search method is proposed, and simulation results show that we can achieve a performance similar to that of turbo codes when we apply this approach to rate-2/3 arithmetic codes. Table of Contents: Introduction / Arithmetic Codes / Arithmetic Codes with Forbidden Symbols / Distance Property and Code Construction / Conclusion
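
    The role of the forbidden symbol is easy to see in the interval update itself: a fraction of every coding interval is reserved and never assigned to a real symbol, so a decoder that lands in the reserved region has provably seen a channel error. The sketch below (Python; the source probabilities and the gap EPS are illustrative) shows only this encoder-side interval narrowing, not the full trellis decoder.

        # Interval update for an arithmetic encoder with a forbidden symbol.
        EPS = 0.1                     # probability mass reserved, never encoded
        P = {"a": 0.6, "b": 0.4}      # illustrative source probabilities

        def encode(symbols):
            low, high = 0.0, 1.0
            for s in symbols:
                span = high - low
                cum = 0.0
                for sym, p in P.items():
                    width = p * (1.0 - EPS) * span  # shrink by the forbidden gap
                    if sym == s:
                        high = low + cum + width
                        low = low + cum
                        break
                    cum += width
                # the top EPS fraction of the old interval is the "forbidden"
                # region: reaching it during decoding signals an error
            return (low + high) / 2.0   # any number inside the final interval

        print(encode("abba"))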

    Outlier Detection for Temporal Data

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Outlier (or anomaly) detection is a very broad field which has been studied in the context of a large number of research areas like statistics, data mining, sensor networks, environmental science, distributed systems, spatio-temporal mining, etc. Initial research in outlier detection focused on time series-based outliers (in statistics). Since then, outlier detection has been studied on a large variety of data types including high-dimensional data, uncertain data, stream data, network data, time series data, spatial data, and spatio-temporal data. While there have been many tutorials and surveys for general outlier detection, we focus on outlier detection for temporal data in this book. A large number of applications generate temporal datasets. For example, in our everyday life, various kinds of records like credit, personnel, financial, judicial, medical, etc., are all temporal. This stresses the need for an organized and detailed study of outliers with respect to such temporal data. In the past decade, there has been a lot of research on various forms of temporal data including consecutive data snapshots, series of data snapshots, and data streams. Besides the initial work on time series, researchers have focused on rich forms of data including multiple data streams, spatio-temporal data, network data, community distribution data, etc. Compared to general outlier detection, techniques for temporal outlier detection are very different. In this book, we present an organized picture of both recent and past research in temporal outlier detection. We start with the basics and then ramp up the reader to the main ideas in state-of-the-art outlier detection techniques. We motivate the importance of temporal outlier detection and outline the challenges it poses beyond those of conventional outlier detection. We then present a taxonomy of the proposed techniques for temporal outlier detection. These techniques broadly include statistical techniques (like AR models, Markov models, histograms, neural networks), distance- and density-based approaches, grouping-based approaches (clustering, community detection), network-based approaches, and spatio-temporal outlier detection approaches. We summarize by presenting a wide collection of applications where temporal outlier detection techniques have been applied to discover interesting outliers.
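
    As a taste of the statistical family of detectors surveyed in the book, the sketch below (Python; the window length and threshold are arbitrary choices) flags a point as an outlier when it deviates from the mean of the preceding window by more than k standard deviations.

        from statistics import mean, stdev

        def window_outliers(series, w=10, k=3.0):
            # Flag points more than k standard deviations away from the
            # mean of the preceding w observations.
            flags = []
            for i in range(w, len(series)):
                window = series[i - w:i]
                mu, sigma = mean(window), stdev(window)
                flags.append(sigma > 0 and abs(series[i] - mu) > k * sigma)
            return flags

        data = [10, 11, 9, 10, 10, 11, 10, 9, 10, 11, 30, 10]
        print(window_outliers(data))   # [True, False]: the spike to 30 is flagged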

    A Tutorial on Queuing and Trunking with Applications to Communications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The motivation for developing this synthesis lecture was to provide a tutorial on queuing and trunking, with extensions to networks of queues, suitable for supplementing courses in communications, stochastic processes, and networking. An essential component of this lecture is MATLAB-based demonstrations and exercises, which can be easily modified to enable the student to observe and evaluate the impact of changing parameters, arrival and departure statistics, queuing disciplines, the number of servers, and other important aspects of the underlying system model. Much of the work in this lecture is based on Poisson statistics, since Poisson models are analytically tractable and provide a useful approximation for many applications. We recognize that the validity of Poisson statistics is questionable for a number of networking applications, and we therefore briefly discuss self-similar models and the Hurst parameter, long-term dependent models, the Pareto distribution, and other related topics. Appropriate references are given for continued study on these topics. The initial chapters of this book consider individual queues in isolation. The systems studied consist of an arrival process, a single queue with a particular queuing discipline, and one or more servers. While this allows us to study the basic concepts of queuing and trunking, modern data networks consist of many queues that interact in complex ways. While many of these interactions defy analysis, the final chapter introduces a model of a network of queues in which, after being served in one queue, customers may join another queue. The key result for this model is known as Jackson's Theorem. Finally, we state the BCMP Theorem, which can be viewed as a further extension of Jackson's Theorem, and present Kleinrock's formula, which can be viewed as the network version of Little's Theorem. Table of Contents: Introduction / Poisson, Erlang, and Pareto Distributions / A Brief Introduction to Queueing Theory / Blocking and Delay / Networks of Queues
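
    A staple of the trunking material is the Erlang B formula, which gives the blocking probability of an M/M/c/c system; its standard recursion is compact enough to show here (Python; the trunk count and offered load in the example are arbitrary).

        def erlang_b(servers, offered_load):
            # Erlang B blocking probability via the standard recursion
            # B(0) = 1, B(n) = a*B(n-1) / (n + a*B(n-1)), a in erlangs.
            b = 1.0
            for n in range(1, servers + 1):
                b = offered_load * b / (n + offered_load * b)
            return b

        print(erlang_b(10, 6.0))   # blocking with 10 trunks at 6 erlangs, ~4.3%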

    Introduction to Embedded Systems: Using ANSI C and the Arduino Development Environment

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many electrical and computer engineering projects involve some kind of embedded system in which a microcontroller sits at the center as the primary source of control. The recently developed Arduino development platform includes an inexpensive hardware development board hosting an eight-bit ATMEL ATmega-family processor and a Java-based software development environment. These features allow embedded systems beginners to focus their attention on learning how to write embedded software instead of struggling with the learning curve of engineering CAD tools. The goal of this text is to introduce fundamental methods for creating embedded software in general, with a focus on ANSI C. The Arduino development platform provides a great means for accomplishing this task. As such, this work presents embedded software development using 100% ANSI C for the Arduino's ATmega328P processor. We deviate from using the Arduino-specific Wiring libraries in an attempt to present the most general embedded methods. In this way, the reader will acquire essential knowledge necessary for work on future projects involving other processors. Particular attention is paid to the notorious issue of using C pointers to gain direct access to microprocessor registers, which ultimately allow control over all peripheral interfacing. Table of Contents: Introduction / ANSI C / Introduction to Arduino / Embedded Debugging / ATmega328P Architecture / General-Purpose Input/Output / Timer Ports / Analog Input Ports / Interrupt Processing / Serial Communications / Assembly Language / Non-volatile Memory

    Introduction to Engineering: A Starter's Guide with Hands-On Analog Multimedia Explorations

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This lecture provides a hands-on glimpse of the field of electrical engineering. The introduced applications utilize the NI ELVIS hardware and software platform to explore concepts such as circuits, power, analog sensing, and introductory analog signal processing such as signal generation, analog filtering, and audio and music processing. These principles and technologies are introduced in a very practical way and are fundamental to many of the electronic devices we use today. Some examples include photodetection, an analog signal (audio, light, temperature) level meter, and an analog music equalizer. Table of Contents: Getting Familiar with NI ELVIS / Analog Signal Level Meter Using LEDs / Noise Removal Using Analog Filters / Music Equalizer Using Op-Amps: Volume and Treble Control / Music Composer Using 555 Timers

    Acoustical Impulse Response Functions of Music Performance Halls

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Digital measurement of the analog acoustical parameters of a music performance hall is difficult. The aim of such work is to create a digital acoustical derivation that is an accurate numerical representation of the complex analog characteristics of the hall. The present study describes the exponential sine sweep (ESS) measurement process in the derivation of an acoustical impulse response function (AIRF) of three music performance halls in Canada. It examines specific difficulties of the process, such as preventing the external effects of the measurement transducers from corrupting the derivation, and provides solutions, such as the use of filtering techniques in order to remove such unwanted effects. In addition, the book presents a novel method of numerical verification through mean-squared error (MSE) analysis in order to determine how accurately the derived AIRF represents the acoustical behavior of the actual hall.
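
    For reference, the exponential sine sweep itself has a closed form (Farina's widely used formulation); the sketch below generates one in Python, with the frequency range, duration, and sampling rate chosen arbitrarily for illustration.

        import math

        def ess(f1, f2, T, fs):
            # Exponential sine sweep from f1 to f2 Hz over T seconds:
            # x(t) = sin(w1*L*(exp(t/L) - 1)), with L = T / ln(w2/w1).
            w1, w2 = 2 * math.pi * f1, 2 * math.pi * f2
            L = T / math.log(w2 / w1)
            return [math.sin(w1 * L * (math.exp(n / (fs * L)) - 1.0))
                    for n in range(int(T * fs))]

        sweep = ess(20.0, 20000.0, 10.0, 48000)   # 10 s sweep, 20 Hz - 20 kHz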

    Mathematical Magnetohydrodynamics

    Copyright Year: 2018

    Morgan and Claypool eBooks

    Fundamentals of mathematical magnetohydrodynamics (MHD) start with definitions of the major variables and parameters in MHD fluids (also known as MHD media), and specifically plasmas encountered in nature as well as in engineering systems, e.g., metallurgy or thermonuclear fusion power. Collisions within such fluids are then examined, as well as the motion of individual particles. Next, the basic principles of MHD fluids are introduced, along with transport phenomena, medium boundaries, and surface interactions, followed by waves and resonances of all sorts in MHD media. The account concludes with a description of the main MHD fluid types, including plasma in fusion power generation.

    Advances in Multi-Channel Resource Allocation: Throughput, Delay, and Complexity

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The last decade has seen an unprecedented growth in the demand for wireless services. These services are fueled by applications that often require not only high data rates, but also very low latency to function as desired. However, as wireless networks grow and support increasingly large numbers of users, their control algorithms must also incur only low complexity in order to be implemented in practice. There is therefore a pressing need to develop wireless control algorithms that can achieve both high throughput and low delay, but with low-complexity operations. While these three performance metrics, i.e., throughput, delay, and complexity, are widely acknowledged as being among the most important for modern wireless networks, existing approaches have often had to sacrifice a subset of them in order to optimize the others, leading to wireless resource allocation algorithms that either suffer from poor performance or are difficult to implement. In contrast, the recent results presented in this book demonstrate that, by cleverly taking advantage of multiple physical or virtual channels, one can develop new low-complexity algorithms that attain both provably high throughput and provably low delay. The book covers both the intra-cell and network-wide settings. In each case, after the pitfalls of existing approaches are examined, new systematic methodologies are provided to develop algorithms that perform provably well in all three dimensions.

    C Programming and Numerical Analysis: An Introduction

    Copyright Year: 2018

    Morgan and Claypool eBooks

    This book is aimed at those in engineering/scientific fields who have never learned programming before but are eager to master the C language quickly so as to immediately apply it to problem solving in numerical analysis. The book skips unnecessary formality but explains all the important aspects of C essential for numerical analysis. Topics covered in numerical analysis include single and simultaneous equations, differential equations, numerical integration, and simulations by random numbers. In the Appendices, quick tutorials for gnuplot, Octave/MATLAB, and FORTRAN for C users are provided.

    An Introduction to Constraint-Based Temporal Reasoning

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Solving challenging computational problems involving time has been a critical component in the development of artificial intelligence systems almost since the inception of the field. This book provides a concise introduction to the core computational elements of temporal reasoning for use in AI systems for planning and scheduling, as well as systems that extract temporal information from data. It presents a survey of temporal frameworks based on constraints, both qualitative and quantitative, as well as of major temporal consistency techniques. The book also introduces the reader to more recent extensions to the core model that allow AI systems to explicitly represent temporal preferences and temporal uncertainty. This book is intended for students and researchers interested in constraint-based temporal reasoning. It provides a self-contained guide to the different representations of time, as well as examples of recent applications of time in AI systems.
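
    The quantitative core of such systems can be illustrated with a Simple Temporal Network: constraints of the form t_j - t_i <= w form a weighted distance graph, and the network is consistent exactly when that graph has no negative cycle. The sketch below (Python; the three-event example is invented) checks this with Floyd-Warshall.

        INF = float("inf")

        def stn_consistent(n, constraints):
            # constraints are (i, j, w) edges meaning t_j - t_i <= w;
            # consistency holds iff the distance graph has no negative cycle.
            d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
            for i, j, w in constraints:
                d[i][j] = min(d[i][j], w)
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        if d[i][k] + d[k][j] < d[i][j]:
                            d[i][j] = d[i][k] + d[k][j]
            return all(d[i][i] >= 0 for i in range(n))

        # event 1 occurs 10-20 after event 0; event 2 occurs 30-40 after
        # event 1 and at most 50 after event 0: consistent, prints True
        print(stn_consistent(3, [(0, 1, 20), (1, 0, -10), (1, 2, 40),
                                 (2, 1, -30), (0, 2, 50)]))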

    Cyber Foraging: Bridging Mobile and Cloud Computing

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This lecture provides an introduction to cyber foraging, a topic that lies at the intersection of mobile and cloud computing. Cyber foraging dynamically augments the computing resources of mobile computers by opportunistically exploiting fixed computing infrastructure in the surrounding environment. In a cyber foraging system, application functionality is dynamically partitioned between the mobile computer and infrastructure servers that store data and execute computation on behalf of mobile users. The location of application functionality changes in response to user mobility, platform characteristics, and variation in resources such as network bandwidth and CPU load. Cyber foraging also introduces a new, surrogate computing tier that lies between mobile users and cloud data centers. Surrogates are wired, infrastructure servers that offer much greater computing resources than those offered by small, battery-powered mobile devices. Surrogates are geographically distributed to be as close as possible to mobile computers so that they can provide substantially better response time to network requests than that provided by servers in cloud data centers. For instance, surrogates may be co-located with wireless hotspots in coffee shops, airport lounges, and other public locations. This lecture first describes how cyber foraging systems dynamically partition data and computation. It shows how dynamic partitioning can often yield better performance, energy efficiency, and application quality than static thin-client or thick-client approaches for dividing functionality between cloud and mobile computers. The lecture then describes the design of the surrogate computing tier. It shows how strong isolation can enable third-party computers to host computation and store data on behalf of nearby mobile devices. It then describes how surrogates can provide reasonable security and privacy guarantees to the mobile computers that use them. The lecture concludes with a discussion of data staging, in which surrogates temporarily store data in transit between cloud servers and mobile computers in order to improve transfer bandwidth and energy efficiency. Table of Contents: Introduction / Partitioning / Management / Security and Privacy / Data Staging / Challenges and Opportunities
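
    The partitioning decision often reduces to a simple break-even test: offloading pays off only when remote compute time plus the time to ship the required state beats local execution. The sketch below (Python) encodes that inequality; all numbers in the example are illustrative assumptions, not measurements from the lecture.

        def should_offload(cycles, s_mobile, s_server, data_bytes, bandwidth):
            # Offload iff remote compute time + transfer time < local time.
            t_local = cycles / s_mobile
            t_remote = cycles / s_server + data_bytes / bandwidth
            return t_remote < t_local

        # 10-Gcycle task: 1 GHz phone vs. 10 GHz surrogate, 5 MB at 10 MB/s
        print(should_offload(10e9, 1e9, 10e9, 5e6, 10e6))  # True: 1.5 s < 10 s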

    Designing for User Engagement: Aesthetic and Attractive User Interfaces

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun, and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work-oriented applications such as games, education, and the emerging interactive Web 2.0. The book starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion, and mood. The perspective of aesthetics is expanded towards interaction and engagement to propose design treatments, metaphors, and interactive techniques which can promote user interest, excitement, and satisfying experiences. This is followed by a review of the design process and of design treatments which can promote aesthetic perception and engaging interaction. The final part of the book provides design guidelines and principles, drawn from the interaction and graphical design literature, which are cross-referenced to issues in the design process. Examples of designs and design treatments are given to illustrate principles and advice, accompanied by critical reflection. Table of Contents: Introduction / Psychology of User Engagement / UE Design Process / Design Principles and Guidelines / Perspectives and Conclusions

    Medical Equipment Maintenance: Management and Oversight

    Copyright Year: 2012

    Morgan and Claypool eBooks

    In addition to being essential for safe and effective patient care, medical equipment also has significant impact on the income and, thus, vitality of healthcare organizations. For this reason, its maintenance and management requires careful supervision by healthcare administrators, many of whom may not have the technical background to understand all of the relevant factors. This book presents the basic elements of medical equipment maintenance and management required of healthcare leaders responsible for managing or overseeing this function. It will enable these individuals to understand their professional responsibilities, as well as what they should expect from their supervised staff and how to measure and benchmark staff performance against equivalent performance levels at similar organizations. The book opens with a foundational summary of the laws, regulations, codes, and standards that are applicable to the maintenance and management of medical equipment in healthcare organizations. Next, the core functions of the team responsible for maintenance and management are described in sufficient detail for managers and overseers. Then the methods and measures for determining the effectiveness and efficiency of equipment maintenance and management are presented to allow performance management and benchmarking comparisons. The challenges and opportunities of managing healthcare organizations of different sizes, acuity levels, and geographical locations are discussed. Extensive bibliographic sources and material for further study are provided to assist students and healthcare leaders interested in acquiring more detailed knowledge. Table of Contents: Introduction / Regulatory Framework / Core Functions of Medical Equipment Maintenance and Management / CE Department Management / Performance Management / Discussion and Conclusions

    Engineering the Knee Meniscus

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The knee meniscus was once thought to be a vestigial tissue, but is now known to be instrumental in imparting stability, shock absorption, load transmission, and stress distribution within the knee joint. Unfortunately, most damage to the meniscus cannot be effectively healed by the body. Meniscus tissue engineering offers a possible solution to this problem by striving to create replacement tissue that may be implanted into a defect site. With a strong focus on structure-function relationships, this book details the essential anatomical, biochemical, and mechanical aspects of this versatile tissue and reviews current meniscus tissue engineering strategies and repair techniques. We have written this text such that undergraduate students, graduate students, and researchers will find it useful as a first foray into tissue engineering, a cohesive study of the meniscus, or a reference for meniscus engineering specifications. Table of Contents: Structure-Function Relationships of the Knee Meniscus / Pathophysiology and the Need for Tissue Engineering / Tissue Engineering of the Knee Meniscus / Current Therapies and Future Directions

    Pragmatic Circuits: Signals and Filters

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power, but the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic; the Fourier transform provides a way of dealing with such non-periodic signals. The two other volumes in the Pragmatic Circuits series cover DC and Time Domain and Frequency Domain. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits.
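
    The Butterworth result the author leans on is the maximally flat magnitude response, |H(jf)| = 1 / sqrt(1 + (f/fc)^(2n)); the short sketch below (Python, with an assumed 1 kHz cutoff and 4th-order filter) evaluates it at a few frequencies.

        import math

        def butterworth_mag(f, fc, n):
            # Magnitude of an n-th order Butterworth low-pass response;
            # fc is the -3 dB cutoff frequency.
            return 1.0 / math.sqrt(1.0 + (f / fc) ** (2 * n))

        for f in (100.0, 1000.0, 10000.0):   # example points, fc = 1 kHz, n = 4
            print(f, 20 * math.log10(butterworth_mag(f, 1000.0, 4)))
        # ~0 dB in the passband, -3 dB at cutoff, then -80 dB per decade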

    Circuit Analysis with Multisim

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or both. Fourier analysis is discussed in the context of transient analysis. Next, we make a treatment of AC analysis to simulate the frequency response of a circuit. Then, we introduce diodes, transistors, and circuits composed of them and perform DC, transient, and AC analyses. The book ends with simulation of digital circuits. A practical approach is followed through the chapters, using step-by-step examples to introduce new Multisim circuit elements, tools, analyses, and virtual instruments for measurement. The examples are clearly commented and illustrated. The different tools available on Multisim are used when appropriate so readers learn which analyses are available to them. This is part of the learning outcomes that should result after each set of end-of-chapter exercises is worked out. Table of Contents: Introduction to Circuit Simulation / Resistive Circuits / Time Domain Analysis -- Transient Analysis / Frequency Domain Analysis -- AC Analysis / Semiconductor Devices / Digital Circuits

    Understanding Circuits: Learning Problem Solving Using Circuit Analysis

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book/lecture is intended for a college freshman-level class in problem solving, where the particular problems deal with electrical and electronic circuits. It can also be used in a junior/senior-level class in high school to teach circuit analysis. The basic problem-solving paradigm used in this book is that of resolution of a problem into its component parts. The reader learns how to break circuits of varying levels of complexity into these component parts using this paradigm. The problem-solving exercises also familiarize the reader with a number of different circuit components including resistors, capacitors, diodes, transistors, and operational amplifiers and their use in practical circuits. The reader should come away with both an understanding of how to approach complex problems and a “feel” for electrical and electronic circuits.

    Datacenter Design and Management: A Computer Architect’s Perspective

    Copyright Year: 2016

    Morgan and Claypool eBooks

    An era of big data demands datacenters, which house the computing infrastructure that translates raw data into valuable information. This book defines datacenters broadly, as large distributed systems that perform parallel computation for diverse users. These systems exist in multiple forms—private and public—and are built at multiple scales. Datacenter design and management is multifaceted, requiring the simultaneous pursuit of multiple objectives. Performance, efficiency, and fairness are first-order design and management objectives, each of which can be viewed from several perspectives. This book surveys datacenter research from a computer architect's perspective, addressing challenges in applications, design, management, server simulation, and system simulation. This perspective complements the rich bodies of work on datacenters as warehouse-scale systems, which study the implications for the infrastructure that encloses computing equipment, and on datacenters as distributed systems, which abstract away details of the processor and memory subsystems. This book is written for first- or second-year graduate students in computer architecture and may be helpful for those in computer systems. The goal of this book is to prepare computer architects for datacenter-oriented research by describing prevalent perspectives and the state of the art.

    An Easy Path to Convex Analysis and Applications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Convex optimization has an increasing impact on many areas of mathematics, applied sciences, and practical applications. It is now being taught at many universities and being used by researchers of different fields. As convex analysis is the mathematical foundation for convex optimization, having deep knowledge of convex analysis helps students and researchers apply its tools more effectively. The main goal of this book is to provide an easy access to the most fundamental parts of convex analysis and its applications to optimization. Modern techniques of variational analysis are employed to clarify and simplify some basic proofs in convex analysis and build the theory of generalized differentiation for convex functions and sets in finite dimensions. We also present new applications of convex analysis to location problems in connection with many interesting geometric problems such as the Fermat-Torricelli problem, the Heron problem, the Sylvester problem, and their generalizations. Of course, we do not expect to touch every aspect of convex analysis, but the book consists of sufficient material for a first course on this subject. It can also serve as supplemental reading material for a course on convex optimization and applications.
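
    As a taste of the location problems mentioned above: the classical Fermat-Torricelli point (the point minimizing the sum of distances to given anchors) can be computed with Weiszfeld's fixed-point iteration, sketched below in Python with an invented three-point example.

        import math

        def weiszfeld(points, iters=100):
            # Fixed-point iteration for the point minimizing the sum of
            # Euclidean distances to the given anchors (geometric median).
            x = [sum(p[0] for p in points) / len(points),
                 sum(p[1] for p in points) / len(points)]
            for _ in range(iters):
                num, den = [0.0, 0.0], 0.0
                for p in points:
                    d = math.hypot(x[0] - p[0], x[1] - p[1])
                    if d == 0:
                        return list(p)      # iterate landed on an anchor
                    num[0] += p[0] / d
                    num[1] += p[1] / d
                    den += 1.0 / d
                x = [num[0] / den, num[1] / den]
            return x

        print(weiszfeld([(0, 0), (4, 0), (2, 3)]))   # the Fermat point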

    Relational and XML Data Exchange

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Data exchange is the problem of finding an instance of a target schema, given an instance of a source schema and a specification of the relationship between the source and the target. Such a target instance should correctly represent information from the source instance under the constraints imposed by the target schema, and it should allow one to evaluate queries on the target instance in a way that is semantically consistent with the source data. Data exchange is an old problem that re-emerged as an active research topic recently, due to the increased need for exchange of data in various formats, often in e-business applications. In this lecture, we give an overview of the basic concepts of data exchange in both relational and XML contexts. We give examples of data exchange problems, and we introduce the main tasks that need to be addressed. We then discuss relational data exchange, concentrating on issues such as relational schema mappings, materializing target instances (including canonical solutions and cores), query answering, and query rewriting. After that, we discuss metadata management, i.e., handling schema mappings themselves. We pay particular attention to operations on schema mappings, such as composition and inverse. Finally, we describe both data exchange and metadata management in the context of XML. We use mappings based on transforming tree patterns, and we show that they lead to a host of new problems that did not arise in the relational case, but need to be addressed for XML. These include consistency issues for mappings and schemas, as well as imposing tighter restrictions on mappings and queries to achieve tractable query answering in data exchange. Table of Contents: Overview / Relational Mappings and Data Exchange / Metadata Management / XML Mappings and Data Exchange

    Project Management for Engineering Design

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This lecture book is an introduction to project management. It will be of use for engineering students working on project design in all engineering disciplines and will also be of high value to practicing engineers in the work force. Few engineering programs prepare students in methods of project design and configuration management used within industry and government. This book emphasizes teams throughout and includes coverage of an introduction to project management, project definition, researching intellectual property (patent search), project scope, idealizing and conceptualizing a design, converting product requirements to engineering specifications, project integration, project communications management, and conducting design reviews. The overall objectives of the book are for the readers to understand and manage their project by employing the good engineering practice used by medical and other industries in the design and development of medical devices, engineered products, and systems. The goal is for the engineer and student to work well on large projects requiring a team environment, and to effectively communicate technical matters in both written documents and oral presentations.

    General Game Playing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    General game players are computer systems able to play strategy games based solely on formal game descriptions supplied at "runtime" (in other words, they don't know the rules until the game starts). Unlike specialized game players, such as Deep Blue, general game players cannot rely on algorithms designed in advance for specific games; they must discover such algorithms themselves. General game playing expertise depends on intelligence on the part of the game player and not just intelligence of the programmer of the game player. GGP is an interesting application in its own right. It is intellectually engaging and more than a little fun. But it is much more than that. It provides a theoretical framework for modeling discrete dynamic systems and defining rationality in a way that takes into account problem representation and complexities like incompleteness of information and resource bounds. It has practical applications in areas where these features are important, e.g., in business and law. More fundamentally, it raises questions about the nature of intelligence and serves as a laboratory in which to evaluate competing approaches to artificial intelligence. This book is an elementary introduction to General Game Playing (GGP). (1) It presents the theory of General Game Playing and leading GGP technologies. (2) It shows how to create GGP programs capable of competing against other programs and humans. (3) It offers a glimpse of some of the real-world applications of General Game Playing. Table of Contents: Preface / Introduction / Game Description / Game Management / Game Playing / Small Single-Player Games / Small Multiple-Player Games / Heuristic Search / Probabilistic Search / Propositional Nets / General Game Playing With Propnets / Factoring / Discovery of Heuristics / Logic / Analyzing Games with Logic / Solving Single-Player Games with Logic / Discovering Heuristics with Logic / Games with Incomplete Information / Games with Historical Constraints / Incomplete Game Descriptions / Advanced General Game Playing / Authors' Biographies
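
    Because a general game player sees only a formal description, its search must be game-independent. The sketch below (Python) shows the bare minimum: plain minimax over an abstract game interface; the methods legal_moves, next_state, is_terminal, and payoff are assumed for illustration and are not part of any particular GGP framework.

        def best_move(game, state, role):
            # Plain minimax for a two-player, zero-sum, turn-taking game
            # exposed through an abstract (assumed) interface.
            def value(s, maximizing):
                if game.is_terminal(s):
                    return game.payoff(s, role)
                vals = [value(game.next_state(s, m), not maximizing)
                        for m in game.legal_moves(s)]
                return max(vals) if maximizing else min(vals)
            return max(game.legal_moves(state),
                       key=lambda m: value(game.next_state(state, m), False))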

    Aspects of Differential Geometry II

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Differential Geometry is a wide field. We have chosen to concentrate upon certain aspects that are appropriate for an introduction to the subject; we have not attempted an encyclopedic treatment. Book II deals with more advanced material than Book I and is aimed at the graduate level. Chapter 4 deals with additional topics in Riemannian geometry. Properties of real analytic curves given by a single ODE and of surfaces given by a pair of ODEs are studied, and the volume of geodesic balls is treated. An introduction to both holomorphic and Kähler geometry is given. In Chapter 5, the basic properties of de Rham cohomology are discussed, the Hodge Decomposition Theorem, Poincaré duality, and the Künneth formula are proved, and a brief introduction to the theory of characteristic classes is given. In Chapter 6, Lie groups and Lie algebras are dealt with. The exponential map, the classical groups, and geodesics in the context of a bi-invariant metric are discussed. The de Rham cohomology of compact Lie groups and the Peter--Weyl Theorem are treated. In Chapter 7, material concerning homogeneous spaces and symmetric spaces is presented. Book II concludes in Chapter 8, where the relationship between simplicial cohomology, singular cohomology, sheaf cohomology, and de Rham cohomology is established. We have given some different proofs than those that are classically given, and there is some new material in these volumes. For example, the treatment of the total curvature and length of curves given by a single ODE is new, as is the discussion of the total Gaussian curvature of a surface defined by a pair of ODEs.

    Natural Language Processing for Social Media: Second Edition

    Copyright Year: 2017

    Morgan and Claypool eBooks

    In recent years, online social networking has revolutionized interpersonal communication. Research on language analysis in social media has been increasingly focusing on its impact on our daily lives, both on a personal and a professional level. Natural language processing (NLP) is one of the most promising avenues for social media data processing. It is a scientific challenge to develop powerful methods and algorithms which extract relevant information from a large volume of data coming from multiple sources and languages in various formats or in free form. We discuss the challenges in analyzing social media texts in contrast with traditional documents. Research methods in information extraction, automatic categorization and clustering, automatic summarization and indexing, and statistical machine translation need to be adapted to a new kind of data. This book reviews the current research on NLP tools and methods for processing the non-traditional information from social media data that is available in large amounts (big data), and shows how innovative NLP approaches can integrate appropriate linguistic information in various fields such as social media monitoring, healthcare, business intelligence, industry, marketing, and security and defence. We review the existing evaluation metrics for NLP and social media applications, and the new efforts in evaluation campaigns or shared tasks on new datasets collected from social media. Such tasks are organized by the Association for Computational Linguistics (such as SemEval tasks) or by the National Institute of Standards and Technology via the Text REtrieval Conference (TREC) and the Text Analysis Conference (TAC). In the concluding chapter, we discuss the importance of this dynamic discipline and its great potential for NLP in the coming decade, in the context of changes in mobile technology, cloud computing, virtual reality, and social networking. In this second edition, we have added information about recent progress in the tasks and applications presented in the first edition. We discuss new methods and their results. The number of research projects and publications that use social media data is constantly increasing due to continuously growing amounts of social media data and the need to automatically process them. We have added 85 new references to the more than 300 references from the first edition. Besides updating each section, we have added a new application (digital marketing) to the section on media monitoring and we have augmented the section on healthcare applications with an extended discussion of recent research on detecting signs of mental illness from social media.

    Working Together Apart: Collaboration over the Internet

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Increasingly, teams are working together when they are not in the same location, even though there are many challenges to doing so successfully. Here we review the latest insights into these matters, guided by a framework that we have developed during two decades of research on this topic. This framework organizes a series of factors that we have found to differentiate between successful and unsuccessful distributed collaborations. We then review the kinds of technology options that are available today, focusing more on types of technologies rather than specific instances. We describe a database of geographically distributed projects we have studied and introduce the Collaboration Success Wizard, an online tool for assessing past, present, or planned distributed collaborations. We close with a set of recommendations for individuals, managers, and those higher in the organizations who wish to support distance work.

    An Introduction to Logic Circuit Testing

    Copyright Year: 2008

    Morgan and Claypool eBooks

    An Introduction to Logic Circuit Testing provides a detailed coverage of techniques for test generation and testable design of digital electronic circuits/systems. The material covered in the book should be sufficient for a course, or part of a course, in digital circuit testing for senior-level undergraduate and first-year graduate students in Electrical Engineering and Computer Science. The book will also be a valuable resource for engineers working in the industry. This book has four chapters. Chapter 1 deals with various types of faults that may occur in very large scale integration (VLSI)-based digital circuits. Chapter 2 introduces the major concepts of all test generation techniques such as redundancy, fault coverage, sensitization, and backtracking. Chapter 3 introduces the key concepts of testability, followed by some ad hoc design-for-testability rules that can be used to enhance testability of combinational circuits. Chapter 4 deals with test generation and response evaluation techniques used in BIST (built-in self-test) schemes for VLSI chips. Table of Contents: Introduction / Fault Detection in Logic Circuits / Design for Testability / Built-in Self-Test / References
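
    The notions of fault coverage and test sets are concrete enough to demonstrate in a few lines: the sketch below (Python; the one-gate circuit y = (a AND b) OR c and the candidate test set are invented) simulates every single stuck-at fault against the tests and reports the coverage.

        def circuit(a, b, c, fault=None):
            # fault = (node, stuck_value) on one of 'a', 'b', 'c', 'ab', 'y'.
            v = {"a": a, "b": b, "c": c}
            if fault and fault[0] in v:
                v[fault[0]] = fault[1]
            ab = v["a"] & v["b"]
            if fault and fault[0] == "ab":
                ab = fault[1]
            y = ab | v["c"]
            if fault and fault[0] == "y":
                y = fault[1]
            return y

        faults = [(n, s) for n in ("a", "b", "c", "ab", "y") for s in (0, 1)]
        tests = [(1, 1, 0), (0, 1, 0), (1, 0, 0), (0, 0, 1)]
        detected = {f for f in faults for t in tests
                    if circuit(*t, fault=f) != circuit(*t)}
        print("fault coverage: %d/%d" % (len(detected), len(faults)))  # 10/10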

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics provides a comprehensive tutorial of the most widely used method for solving Maxwell's equations -- the Finite Difference Time-Domain Method. This book is an essential guide for students, researchers, and professional engineers who want to gain a fundamental knowledge of the FDTD method. It can accompany an undergraduate or entry-level graduate course or be used for self-study. The book provides all the background required to either research or apply the FDTD method for the solution of Maxwell's equations to practical problems in engineering and science. Introduction to the Finite-Difference Time-Domain (FDTD) Method for Electromagnetics guides the reader through the foundational theory of the FDTD method, starting with the one-dimensional transmission-line problem and then progressing to the solution of Maxwell's equations in three dimensions. It also provides step-by-step guides to modeling physical sources, lumped-circuit components, absorbing boundary conditions, perfectly matched layer absorbers, and sub-cell structures. Post-processing methods such as network parameter extraction and far-field transformations are also detailed. Efficient implementations of the FDTD method in a high-level language are also provided. Table of Contents: Introduction / 1D FDTD Modeling of the Transmission Line Equations / Yee Algorithm for Maxwell's Equations / Source Excitations / Absorbing Boundary Conditions / The Perfectly Matched Layer (PML) Absorbing Medium / Subcell Modeling / Post Processing
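
    The heart of the method fits in a few lines: a leapfrog update in time over fields staggered in space. The sketch below (Python, 1-D free space in normalized units with a Courant number of 0.5 and a soft Gaussian source; no absorbing boundaries, so pulses reflect at the ends) mirrors the one-dimensional starting point of the book.

        import math

        N, STEPS = 200, 400
        ez = [0.0] * N                 # electric field samples
        hy = [0.0] * N                 # magnetic field samples (staggered)

        for t in range(STEPS):
            for k in range(N - 1):     # update H from the spatial change in E
                hy[k] += 0.5 * (ez[k + 1] - ez[k])
            for k in range(1, N):      # update E from the spatial change in H
                ez[k] += 0.5 * (hy[k] - hy[k - 1])
            ez[N // 2] += math.exp(-((t - 30) ** 2) / 100.0)  # soft source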

    Latent Semantic Mapping: Principles and Applications

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Latent semantic mapping (LSM) is a generalization of latent semantic analysis (LSA), a paradigm originally developed to capture hidden word patterns in a text document corpus. In information retrieval, LSA enables retrieval on the basis of conceptual content, instead of merely matching words between queries and documents. It operates under the assumption that there is some latent semantic structure in the data, which is partially obscured by the randomness of word choice with respect to retrieval. Algebraic and/or statistical techniques are brought to bear to estimate this structure and get rid of the obscuring "noise." This results in a parsimonious continuous parameter description of words and documents, which then replaces the original parameterization in indexing and retrieval. This approach exhibits three main characteristics: (1) discrete entities (words and documents) are mapped onto a continuous vector space; (2) this mapping is determined by global correlation patterns; and (3) dimensionality reduction is an integral part of the process. Such fairly generic properties are advantageous in a variety of different contexts, which motivates a broader interpretation of the underlying paradigm. The outcome (LSM) is a data-driven framework for modeling meaningful global relationships implicit in large volumes of (not necessarily textual) data. This monograph gives a general overview of the framework, and underscores the multifaceted benefits it can bring to a number of problems in natural language understanding and spoken language processing. It concludes with a discussion of the inherent tradeoffs associated with the approach, and some perspectives on its general applicability to data-driven information extraction. Contents: I. Principles / Introduction / Latent Semantic Mapping / LSM Feature Space / Computational Effort / Probabilistic Extensions / II. Applications / Junk E-mail Filtering / Semantic Classification / Language Modeling / Pronunciation Modeling / Speaker Verification / TTS Unit Selection / III. Perspectives / Discussion / Conclusion / Bibliography
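
    Computationally, the mapping rests on a truncated singular value decomposition of a co-occurrence matrix; the sketch below (Python with NumPy, using a random matrix as a stand-in for real counts and an arbitrary rank) shows how words and documents end up as vectors in one continuous space.

        import numpy as np

        W = np.random.rand(1000, 200)      # stand-in for word-document counts
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        k = 50                             # retained dimensions (arbitrary here)
        word_vecs = U[:, :k] * s[:k]       # one k-dim vector per word
        doc_vecs = Vt[:k].T * s[:k]        # one k-dim vector per document
        # similarity between any word/document pair is now a vector comparison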

    Community Detection and Mining in Social Media

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The past decade has witnessed the emergence of participatory Web and social media, bringing people together in many creative ways. Millions of users are playing, tagging, working, and socializing online, demonstrating new forms of collaboration, communication, and intelligence that were hardly imaginable just a short time ago. Social media also helps reshape business models, sway opinions and emotions, and opens up numerous possibilities to study human interaction and collective behavior on an unparalleled scale. This lecture, from a data mining perspective, introduces characteristics of social media, reviews representative tasks of computing with social media, and illustrates associated challenges. It introduces basic concepts, presents state-of-the-art algorithms with easy-to-understand examples, and recommends effective evaluation methods. In particular, we discuss graph-based community detection techniques and many important extensions that handle dynamic, heterogeneous networks in social media. We also demonstrate how discovered patterns of communities can be used for social media mining. The concepts, algorithms, and methods presented in this lecture can help harness the power of social media and support building socially intelligent systems. This book is an accessible introduction to the study of community detection and mining in social media. It is essential reading for students, researchers, and practitioners in disciplines and applications where social media is a key source of data that piques our curiosity to understand, manage, innovate, and excel. This book is supported by additional materials, including lecture slides, the complete set of figures, key references, some toy data sets used in the book, and the source code of representative algorithms. The readers are encouraged to visit the book website for the latest information. Table of Contents: Social Media and Social Computing / Nodes, Ties, and Influence / Community Detection and Evaluation / Communities in Heterogeneous Networks / Social Media Mining
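
    As one concrete example of the graph-based techniques covered, label propagation detects communities with nothing more than repeated majority votes among neighbors; the sketch below (Python; the five-node graph is invented, and ties are broken arbitrarily) converges to one label per community.

        import random

        def label_propagation(adj, rounds=20):
            # adj: {node: [neighbors]}. Each node repeatedly adopts the
            # label most common among its neighbors.
            labels = {v: v for v in adj}
            nodes = list(adj)
            for _ in range(rounds):
                random.shuffle(nodes)
                for v in nodes:
                    if adj[v]:
                        counts = {}
                        for u in adj[v]:
                            counts[labels[u]] = counts.get(labels[u], 0) + 1
                        labels[v] = max(counts, key=counts.get)
            return labels

        adj = {1: [2, 3], 2: [1, 3], 3: [1, 2], 4: [5], 5: [4]}
        print(label_propagation(adj))  # {1,2,3} share one label; {4,5} another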

    A Guide to Convolutional Neural Networks for Computer Vision

    Copyright Year: 2018

    Morgan and Claypool eBooks

    Computer vision has become increasingly important and effective in recent years due to its wide-ranging applications in areas as diverse as smart surveillance and monitoring, health and medicine, sports and recreation, robotics, drones, and self-driving cars. Visual recognition tasks, such as image classification, localization, and detection, are the core building blocks of many of these applications, and recent developments in Convolutional Neural Networks (CNNs) have led to outstanding performance in these state-of-the-art visual recognition tasks and systems. As a result, CNNs now form the crux of deep learning algorithms in computer vision. This self-contained guide will benefit those who seek to both understand the theory behind CNNs and to gain hands-on experience on the application of CNNs in computer vision. It provides a comprehensive introduction to CNNs starting with the essential concepts behind neural networks: training, regularization, and optimization of CNNs. The book also discusses a wide range of loss functions, network layers, and popular CNN architectures, reviews the different techniques for the evaluation of CNNs, and presents some popular CNN tools and libraries that are commonly used in computer vision. Further, this text describes and discusses case studies that are related to the application of CNN in computer vision, including image classification, object detection, semantic segmentation, scene understanding, and image generation. This book is ideal for undergraduate and graduate students, as no prior background knowledge in the field is required to follow the material, as well as new researchers, developers, engineers, and practitioners who are interested in gaining a quick understanding of CNN models.
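
    At the bottom of every CNN sits the convolution operation itself; the sketch below (Python with NumPy) implements a bare, valid-mode 2-D version of it: the loop a real framework replaces with heavily optimized kernels.

        import numpy as np

        def conv2d(image, kernel):
            # Valid-mode 2-D cross-correlation: slide the kernel over the
            # image and take a weighted sum at every position.
            H, W = image.shape
            kh, kw = kernel.shape
            out = np.empty((H - kh + 1, W - kw + 1))
            for i in range(out.shape[0]):
                for j in range(out.shape[1]):
                    out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
            return out

        # a crude horizontal-edge filter applied to a random 8x8 "image"
        print(conv2d(np.random.rand(8, 8), np.array([[1.0], [-1.0]])).shape)  # (7, 8)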

    Analysis Techniques for Information Security

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Increasingly our critical infrastructures are reliant on computers. We see examples of such infrastructures in several domains, including medical, power, telecommunications, and finance. Although automation has advantages, increased reliance on computers exposes our critical infrastructures to a wider variety and higher likelihood of accidental failures and malicious attacks. Disruption of services caused by such undesired events can have catastrophic effects, such as disruption of essential services and huge financial losses. The increased reliance of critical services on our cyberinfrastructure and the dire consequences of security breaches have highlighted the importance of information security. Authorization, security protocols, and software security are three central areas in security in which there have been significant advances in developing systematic foundations and analysis methods that work for practical systems. This book provides an introduction to this work, covering representative approaches, illustrated by examples, and providing pointers to additional work in the area. Table of Contents: Introduction / Foundations / Detecting Buffer Overruns Using Static Analysis / Analyzing Security Policies / Analyzing Security Protocols

    Content-based Retrieval of Medical Images: Landmarking, Indexing, and Relevance Feedback

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Content-based image retrieval (CBIR) is the process of retrieval of images from a database that are similar to a query image, using measures derived from the images themselves, rather than relying on accompanying text or annotation. To achieve CBIR, the contents of the images need to be characterized by quantitative features; the features of the query image are compared with the features of each image in the database, and images having high similarity with respect to the query image are retrieved and displayed. CBIR of medical images is a useful tool and could provide radiologists with assistance in the form of a display of relevant past cases. One of the challenging aspects of CBIR is to extract features from the images to represent their visual, diagnostic, or application-specific information content. In this book, methods are presented for preprocessing, segmentation, landmarking, feature extraction, and indexing of mammograms for CBIR. The preprocessing steps include anisotropic diffusion and the Wiener filter to remove noise and perform image enhancement. Techniques are described for segmentation of the breast and fibroglandular disk, including maximum entropy, a moment-preserving method, and Otsu's method. Image processing techniques are described for automatic detection of the nipple and the edge of the pectoral muscle via analysis in the Radon domain. By using the nipple and the pectoral muscle as landmarks, mammograms are divided into their internal, external, upper, and lower parts for further analysis. Methods are presented for feature extraction using texture analysis, shape analysis, granulometric analysis, moments, and statistical measures. The CBIR system presented provides options for retrieval using the Kohonen self-organizing map and the k-nearest-neighbor method. Methods are described for inclusion of expert knowledge to reduce the semantic gap in CBIR, including the query point movement method for relevance feedback (RFb). Analysis of performance is described in terms of precision, recall, and relevance-weighted precision of retrieval. Results of application to a clinical database of mammograms are presented, including the input of expert radiologists into the CBIR and RFb processes. Models are presented for integration of CBIR and computer-aided diagnosis (CAD) with a picture archival and communication system (PACS) for efficient workflow in a hospital. Table of Contents: Introduction to Content-based Image Retrieval / Mammography and CAD of Breast Cancer / Segmentation and Landmarking of Mammograms / Feature Extraction and Indexing of Mammograms / Content-based Retrieval of Mammograms / Integration of CBIR and CAD into Radiological Workflow
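
    The k-nearest-neighbor retrieval option reduces to ranking stored cases by distance in feature space; the sketch below (Python; the three-feature vectors are invented stand-ins for real mammogram descriptors) returns the k most similar cases for a query.

        import math

        def retrieve(query_vec, database, k=3):
            # Rank (case_id, feature_vector) entries by Euclidean distance
            # to the query's features and return the k closest.
            dist = lambda u, v: math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
            return sorted(database, key=lambda e: dist(query_vec, e[1]))[:k]

        db = [("case1", (0.2, 0.9, 0.4)), ("case2", (0.8, 0.1, 0.3)),
              ("case3", (0.25, 0.85, 0.5))]
        print([cid for cid, _ in retrieve((0.2, 0.9, 0.45), db, k=2)])
        # ['case1', 'case3']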

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering, Poverty, and the Earth

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In the present work, the growing awareness in engineering of the profession’s responsibility towards the environment and the poor is considered. The following approach is taken: a brief overview of the issues of poverty particularly in the U.S. and the deterioration of the natural world with a focus on the Arctic is provided. Case studies involving New Orleans in the aftermath of Hurricane Katrina and the status of polar bears in a time of shrinking Arctic ice cover are detailed. Recent developments in engineering related to the issues of poverty and the environment are discussed. A new paradigm for engineering based on the works of Leonardo Boff and Thomas Berry, one that places an important emphasis upon a community, is explored. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    High Fidelity Haptic Rendering

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The human haptic system, among all senses, provides unique and bidirectional communication between humans and their physical environment. Yet, to date, most human-computer interactive systems have focused primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Extending the frontier of visual computing, haptic interfaces, or force feedback devices, have the potential to increase the quality of human-computer interaction by accommodating the sense of touch. They provide an attractive augmentation to visual display and enhance the level of understanding of complex data sets. They have been effectively used for a number of applications including molecular docking, manipulation of nano-materials, surgical training, virtual prototyping, and digital sculpting. Compared with visual and auditory display, haptic rendering has extremely demanding computational requirements. In order to maintain a stable system while displaying smooth and realistic forces and torques, high haptic update rates in the range of 500-1000 Hz or more are typically used. Haptics present many new challenges to researchers and developers in computer graphics and interactive techniques. Some of the critical issues include the development of novel data structures to encode shape and material properties, as well as new techniques for geometry processing, data analysis, physical modeling, and haptic visualization. This synthesis examines some of the latest developments on haptic rendering, while looking forward to exciting future research in this area. It presents novel haptic rendering algorithms that take advantage of the human haptic sensory modality. Specifically it discusses different rendering techniques for various geometric representations (e.g. point-based, polygonal, multiresolution, distance fields, etc), as well as textured surfaces. It also shows how psychophysics of touch can provide the foundational design guidelines for developing perceptually driven force models and concludes with possible applications and issues to consider in future algorithmic design, validating rendering techniques, and evaluating haptic interfaces. View full abstract»
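
    To illustrate why update rates matter, here is a minimal penalty-based contact force model of the kind used in haptic rendering (the gains are arbitrary; this is a sketch, not an algorithm from the book). A loop evaluating it must run at roughly 1 kHz to keep the rendered contact stable.

```python
def contact_force(penetration: float, velocity: float,
                  k: float = 500.0, b: float = 2.5) -> float:
    """Spring-damper penalty force along the contact normal.

    penetration: depth of the haptic probe inside the surface (m)
    velocity:    probe velocity along the contact normal (m/s)
    """
    if penetration <= 0.0:
        return 0.0                      # no contact, no force
    return k * penetration - b * velocity
```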

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Engineering and Social Justice

    Copyright Year: 2008

    Morgan and Claypool eBooks

    The profession of engineering in the United States has historically served the status quo, feeding an ever-expanding materialistic and militaristic culture, remaining relatively unresponsive to public concerns, and without significant pressure for change from within. This book calls upon engineers to cultivate a passion for social justice and peace and to develop the skill and knowledge set needed to take practical action for change within the profession. Because many engineers do not receive education and training that support the kinds of critical thinking, reflective decision-making, and effective action necessary to achieve social change, engineers concerned with social justice can feel powerless and isolated as they remain complicit. Utilizing techniques from radical pedagogies of liberation and other movements for social justice, this book presents a roadmap for engineers to become empowered and engage one another in a process of learning and action for social justice and peace. Table of Contents: What Do We Mean by Social Justice? / Mindsets in Engineering / Engineering and Social Injustice / Toward a More Socially Just Engineering / Turning Knowledge into Action: Strategies for Change / Parting Lessons for the Continuing Struggle View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Query Answer Authentication

    Copyright Year: 2012

    Morgan and Claypool eBooks

    In data publishing, the owner delegates the role of satisfying user queries to a third-party publisher. As the servers of the publisher may be untrusted or susceptible to attacks, we cannot assume that they would always process queries correctly, hence there is a need for users to authenticate their query answers. This book introduces various notions that the research community has studied for defining the correctness of a query answer. In particular, it is important to guarantee the completeness, authenticity and minimality of the answer, as well as its freshness. We present authentication mechanisms for a wide variety of queries in the context of relational and spatial databases, text retrieval, and data streams. We also explain the cryptographic protocols from which the authentication mechanisms derive their security properties. Table of Contents: Introduction / Cryptography Foundation / Relational Queries / Spatial Queries / Text Search Queries / Data Streams / Conclusion View full abstract»
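
    As a hint of the cryptographic foundation, here is the verification step for a Merkle hash tree, one widely used building block for authenticating query answers (a generic sketch; the book covers the actual constructions and which correctness properties each one guarantees).

```python
import hashlib

def digest(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_answer(leaf: bytes, auth_path, signed_root: bytes) -> bool:
    """auth_path: list of (sibling_digest, sibling_is_left) pairs from the
    leaf up to the root. The user recomputes the root hash and compares it
    against the root digest signed by the data owner."""
    node = digest(leaf)
    for sibling, sibling_is_left in auth_path:
        node = digest(sibling + node) if sibling_is_left else digest(node + sibling)
    return node == signed_root
```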

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quantum Radar

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book offers a concise review of quantum radar theory. Our approach is pedagogical, placing emphasis on the physics behind the operation of a hypothetical quantum radar. We concentrate our discussion on the two major models proposed to date: interferometric quantum radar and quantum illumination. In addition, this book offers some new results, including an analytical study of quantum interferometry in the X-band radar region with a variety of atmospheric conditions, a derivation of a quantum radar equation, and a discussion of quantum radar jamming. This book assumes the reader is familiar with the basic principles of non-relativistic quantum mechanics, special relativity, and classical electrodynamics. Our discussion of quantum electrodynamics and its application to quantum radar is brief, but all the relevant equations are presented in the text. In addition, the reader is not required to have any specialized knowledge on classical radar theory. Table of Contents: Introduction / The Photon / Photon Scattering / Classical Radar Theory / Quantum Radar Theory / Quantum Radar Cross Section / Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Anomaly Detection as a Service:Challenges, Advances, and Opportunities

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Anomaly detection has been a long-standing security approach with versatile applications, ranging from securing server programs in critical environments, to detecting insider threats in enterprises, to anti-abuse detection for online social networks. Despite the seemingly diverse application domains, anomaly detection solutions share similar technical challenges, such as how to accurately recognize various normal patterns, how to reduce false alarms, how to adapt to concept drifts, and how to minimize performance impact. They also share similar detection approaches and evaluation methods, such as feature extraction, dimension reduction, and experimental evaluation. The main purpose of this book is to help advance the real-world adoption and deployment of anomaly detection technologies by systematizing the body of existing knowledge on anomaly detection. This book is focused on data-driven anomaly detection for software, systems, and networks against advanced exploits and attacks, but also touches on a number of applications, including fraud detection and insider threats. We explain the key technical components in anomaly detection workflows, give an in-depth description of the state-of-the-art data-driven anomaly-based security solutions, and, more importantly, point out promising new research directions. This book emphasizes the need and challenges for deploying service-oriented anomaly detection in practice, where clients can outsource the detection to dedicated security providers and enjoy the protection without tending to the intricate details. View full abstract»
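
    A minimal sketch of the data-driven idea (a naive per-feature deviation score, far simpler than the solutions the book systematizes): learn a profile of normal behavior from attack-free data, then alarm on large deviations.

```python
import numpy as np

def fit_profile(normal_data):
    """Learn a simple per-feature profile from attack-free training data."""
    return normal_data.mean(axis=0), normal_data.std(axis=0) + 1e-9

def anomaly_score(x, profile):
    """Largest per-feature deviation from the profile, in standard deviations."""
    mean, std = profile
    return np.abs((x - mean) / std).max()

# Example: alarm when any feature drifts more than 4 sigma from normal.
profile = fit_profile(np.random.default_rng(0).normal(size=(1000, 8)))
print(anomaly_score(np.full(8, 6.0), profile) > 4.0)   # True: raise an alert
```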

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Combating Bad Weather Part I:Rain Removal from Video

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Current vision systems are designed to perform in normal weather conditions. However, no vision system can escape severe weather. Bad weather reduces scene contrast and visibility, which degrades the performance of various computer vision algorithms such as object tracking, segmentation, and recognition. Thus, current vision systems must include mechanisms that enable them to perform up to the mark in bad weather conditions such as rain and fog. Rain causes spatial and temporal intensity variations in images or video frames. These intensity changes are due to the random distribution and high velocities of the raindrops. Fog causes low contrast and whiteness in the image and leads to a shift in the color. This book studies rain and fog from the perspective of vision. The book has two main goals: 1) removal of rain from videos captured by a moving and static camera, and 2) removal of fog from images and videos captured by a moving single uncalibrated camera system. The book begins with a literature survey. Pros and cons of the selected prior art algorithms are described, and a general framework for the development of an efficient rain removal algorithm is explored. Temporal and spatiotemporal properties of rain pixels are analyzed, and using these properties, two rain removal algorithms for videos captured by a static camera are developed. For the removal of rain, the temporal and spatiotemporal algorithms require fewer consecutive frames, which reduces buffer size and delay. These algorithms do not assume the shape, size, or velocity of raindrops, which makes them robust to different rain conditions (i.e., heavy, light, and moderate rain). In a practical situation, there is no ground truth available for rain video, so no-reference quality metrics are very useful for measuring the efficacy of rain removal algorithms. Temporal variance and spatiotemporal variance are presented in this book as no-reference quality metrics. An efficient rain removal algorithm using meteorological properties of rain is developed. The relation among the orientation of the raindrops, wind velocity, and terminal velocity is established. This relation is used in the estimation of shape-based features of the raindrops. Meteorological property-based features help to discriminate rain and non-rain pixels. Most of the prior art algorithms are designed for videos captured by a static camera. The use of global motion compensation with all rain removal algorithms designed for videos captured by a static camera results in better accuracy for videos captured by a moving camera. Qualitative and quantitative results confirm that the probabilistic temporal, spatiotemporal, and meteorological algorithms outperform other prior art algorithms in terms of perceptual quality, buffer size, execution delay, and system cost. The work presented in this book can find wide application in the entertainment industries, transportation, tracking, and consumer electronics. Table of Contents: Acknowledgments / Introduction / Analysis of Rain / Dataset and Performance Metrics / Important Rain Detection Algorithms / Probabilistic Approach for Detection and Removal of Rain / Impact of Camera Motion on Detection of Rain / Meteorological Approach for Detection and Removal of Rain from Videos / Conclusion and Scope of Future Work / Bibliography / Authors' Biographies View full abstract»
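
    The temporal property exploited by the static-camera algorithms can be sketched directly (the threshold and array shapes are illustrative): rain streaks produce short-lived intensity spikes, so pixels with high temporal variance are rain candidates.

```python
import numpy as np

def rain_candidate_mask(frames, var_threshold=50.0):
    """frames: (T, H, W) grayscale frames from a static camera.

    Raindrops cover a pixel for only a frame or two, causing brief positive
    intensity fluctuations, while background pixels stay stable over time.
    """
    temporal_variance = frames.astype(np.float32).var(axis=0)
    return temporal_variance > var_threshold   # (H, W) boolean mask
```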

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Double-Grid Finite-Difference Frequency-Domain (DG-FDFD) Method for Scattering from Chiral Objects

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book presents the application of the overlapping grids approach to solve chiral material problems using the FDFD method. Because two grids are used in the technique, we name this method the Double-Grid Finite-Difference Frequency-Domain (DG-FDFD) method. As a result of this new approach, the electric and magnetic field components are defined at every node in the computation space. Thus, there is no need to perform averaging during the calculations as in the aforementioned FDFD technique [16]. We formulate general 3D frequency-domain numerical methods based on the double-grid (DG-FDFD) approach for general bianisotropic materials. The validity of the derived formulations for different scattering problems has been shown by comparing the obtained results to exact solutions and to solutions obtained using other numerical methods. Table of Contents: Introduction / Chiral Media / Basics of the Finite-Difference Frequency-Domain (FDFD) Method / The Double-Grid Finite-Difference Frequency-Domain (DG-FDFD) Method for Bianisotropic Medium / Scattering From Three-Dimensional Chiral Structures / Improving Time and Memory Efficiencies of FDFD Methods / Conclusions / Appendix A: Notations / Appendix B: Near to Far Field Transformation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    HCI Theory:Classical, Modern, and Contemporary

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Theory is the bedrock of many sciences, providing a rigorous method to advance knowledge through testing and falsifying hypotheses about observable phenomena. To begin with, the nascent field of HCI followed the scientific method, borrowing theories from cognitive science to test hypotheses about user performance at the interface. But HCI has emerged as an eclectic interdiscipline rather than a well-defined science. It now covers all aspects of human life, from birth to bereavement, through all manner of computing, from device ecologies to nano-technology. It comes as no surprise that the role of theory in HCI has also greatly expanded from the early days of scientific testing to include other functions such as describing, explaining, critiquing, and serving as the basis for generating new designs. The book charts the theoretical developments in HCI, both past and present, reflecting on how they have shaped the field. It explores both the rhetoric and the reality: how theories have been conceptualized, what was promised, how they have been used, and which have made the most impact in the field -- and the reasons for this. Finally, it looks to the future and asks whether theory will continue to have a role, and, if so, what this might be. Table of Contents: Introduction / The Backdrop to HCI Theory / The Role and Contribution of Theory in HCI / Classical Theories / Modern Theories / Contemporary Theory / Discussion / Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Model-Driven Software Engineering in Practice

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book discusses how model-based approaches can improve the daily practice of software professionals. This is known as Model-Driven Software Engineering (MDSE) or, simply, Model-Driven Engineering (MDE). MDSE practices have proved to increase efficiency and effectiveness in software development, as demonstrated by various quantitative and qualitative studies. MDSE adoption in the software industry is foreseen to grow exponentially in the near future, e.g., due to the convergence of software development and business analysis. The aim of this book is to provide you with an agile and flexible tool to introduce you to the MDSE world, thus allowing you to quickly understand its basic principles and techniques and to choose the right set of MDSE instruments for your needs so that you can start to benefit from MDSE right away. The book is organized into two main parts. The first part discusses the foundations of MDSE in terms of basic concepts (i.e., models and transformations), driving principles, application scenarios and current standards, like the well-known MDA initiative proposed by OMG (Object Management Group) as well as the practices on how to integrate MDSE in existing development processes. The second part deals with the technical aspects of MDSE, spanning from the basics on when and how to build a domain-specific modeling language, to the description of Model-to-Text and Model-to-Model transformations, and the tools that support the management of MDSE projects. The book is targeted to a diverse set of readers, spanning: professionals, CTOs, CIOs, and team managers that need to have a bird's eye vision on the matter, so as to take the appropriate decisions when it comes to choosing the best development techniques for their company or team; software analysts, developers, or designers that expect to use MDSE for improving everyday work productivity, either by applying the basic modeling techniques and notations or by defining new domain-specific modeling languages and applying end-to-end MDSE practices in the software factory; and academic teachers and students to address undergrad and postgrad courses on MDSE. In addition to the contents of the book, more resources are provided on the book's website, including the examples presented in the book. Table of Contents: Introduction / MDSE Principles / MDSE Use Cases / Model-Driven Architecture (MDA) / Integration of MDSE in your Development Process / Modeling Languages at a Glance / Developing your Own Modeling Language / Model-to-Model Transformations / Model-to-Text Transformations / Managing Models / Summary View full abstract»
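
    As a flavor of the Model-to-Text material, the toy transformation below (an invented model format, not a standard such as MOF or an actual MDSE tool) walks a simple class model and emits source text from it.

```python
# Toy model-to-text transformation: a class model expressed as plain data
# is rendered into Python source by a template-like walk over the model.
model = {"name": "Book", "attributes": [("title", "str"), ("year", "int")]}

def model_to_text(cls):
    args = ", ".join(f"{name}: {typ}" for name, typ in cls["attributes"])
    lines = [f"class {cls['name']}:",
             f"    def __init__(self, {args}):"]
    lines += [f"        self.{name} = {name}" for name, _ in cls["attributes"]]
    return "\n".join(lines)

print(model_to_text(model))   # emits a Book class with two attributes
```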

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Geometric Programming for Design Equation Development and Cost/Profit Optimization:(with illustrative case study problems and solutions), Third Edition

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Geometric Programming is used for cost minimization, profit maximization, obtaining cost ratios, and the development of generalized design equations for the primal variables. The early pioneers of geometric programming—Zener, Duffin, Peterson, Beightler, Wilde, and Phillips—played important roles in its development. Five new case studies have been added to the third edition. There are five major sections: (1) Introduction, History and Theoretical Fundamentals; (2) Cost Minimization Applications with Zero Degrees of Difficulty; (3) Profit Maximization Applications with Zero Degrees of Difficulty; (4) Applications with Positive Degrees of Difficulty; and (5) Summary, Future Directions, and Geometric Programming Theses & Dissertations Titles. The various solution techniques presented are the constrained derivative approach, condensation of terms approach, dimensional analysis approach, and transformed dual approach. A primary goal of this work is to have readers develop more case studies and new solution techniques to further the application of geometric programming. View full abstract»
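
    A one-line worked example of zero degrees of difficulty (standard textbook material, not one of the book's case studies): with two terms and one variable, the dual weights are fixed by the normality and orthogonality conditions alone, and the dual function yields the optimum without solving the primal.

```latex
\min_{t>0}\; g(t) = t + t^{-1},
\qquad \text{degrees of difficulty} = T-(N+1) = 2-(1+1) = 0.
% Normality and orthogonality determine the dual weights uniquely:
\delta_1+\delta_2 = 1, \qquad \delta_1-\delta_2 = 0
\;\Longrightarrow\; \delta_1=\delta_2=\tfrac12 .
% The dual function then gives the minimum directly:
g^{*}=\Bigl(\tfrac{c_1}{\delta_1}\Bigr)^{\delta_1}
      \Bigl(\tfrac{c_2}{\delta_2}\Bigr)^{\delta_2}
     =2^{1/2}\,2^{1/2}=2,\qquad\text{attained at } t=1 .
```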

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mathematical Basics of Motion and Deformation in Computer Graphics

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This synthesis lecture presents an intuitive introduction to the mathematics of motion and deformation in computer graphics. Starting with familiar concepts in graphics, such as Euler angles, quaternions, and affine transformations, we illustrate that a mathematical theory behind these concepts enables us to develop the techniques for efficient/effective creation of computer animation. This book, therefore, serves as a good guidepost to mathematics (differential geometry and Lie theory) for students of geometric modeling and animation in computer graphics. Experienced developers and researchers will also benefit from this book, since it gives a comprehensive overview of mathematical approaches that are particularly useful in character modeling, deformation, and animation. Table of Contents: Preface / Symbols and Notations / Introduction / Rigid Transformation / Affine Transformation / Exponential and Logarithm of Matrices / 2D Affine Transformation between Two Triangles / Global 2D Shape Interpolation / Parametrizing 3D Positive Affine Transformations / Further Readings / Bibliography / Authors' Biographies View full abstract»
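
    For instance, the quaternion machinery the lecture builds on fits in a few lines (a generic sketch, not code from the book): a rotation becomes a unit quaternion, and composing rotations becomes quaternion multiplication.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians
    about the unit vector `axis`: q = (cos(a/2), sin(a/2) * axis)."""
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(q, r):
    """Hamilton product: the rotation r followed by the rotation q."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

q90 = quat_from_axis_angle((0.0, 0.0, 1.0), math.pi / 2)
print(quat_mul(q90, q90))   # ~ (0, 0, 0, 1): a 180-degree turn about z
```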

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automated Software Diversity

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Whereas user-facing applications are often written in modern languages, the firmware, operating system, support libraries, and virtual machines that underpin just about any modern computer system are still written in low-level languages that value flexibility and performance over convenience and safety. Programming errors in low-level code are often exploitable and can, in the worst case, give adversaries unfettered access to the compromised host system. This book provides an introduction to and overview of automatic software diversity techniques that, in one way or another, use randomization to greatly increase the difficulty of exploiting the vast amounts of low-level code in existence. Diversity-based defenses are motivated by the observation that a single attack will fail against multiple targets with unique attack surfaces. We introduce the many, often complementary, ways that one can diversify attack surfaces and provide an accessible guide to more than two decades' worth of research on the topic. We also discuss techniques used in conjunction with diversity to prevent accidental disclosure of randomized program aspects and present an in-depth case study of one of our own diversification solutions. View full abstract»
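
    One of the simplest diversifying transformations in this literature is NOP insertion; the toy sketch below (pseudo-instructions and an illustrative probability, not the authors' solution) shows how a seeded randomized build gives every binary a different code layout.

```python
import random

def diversify(instructions, seed, nop_probability=0.3):
    """Insert no-ops at random points so each build of the same program
    has a distinct layout, breaking exploits that hard-code addresses
    from one reference binary (a toy model of NOP-insertion diversity)."""
    rng = random.Random(seed)
    out = []
    for instruction in instructions:
        if rng.random() < nop_probability:
            out.append("nop")
        out.append(instruction)
    return out

print(diversify(["push", "mov", "call", "ret"], seed=1))
print(diversify(["push", "mov", "call", "ret"], seed=2))  # different layout
```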

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Numerical Methods for Linear Complementarity Problems in Physics-Based Animation

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Linear complementarity problems (LCPs) have for many years been used in physics-based animation to model contact forces between rigid bodies in contact. More recently, LCPs have found their way into the realm of fluid dynamics. Here, LCPs are used to model boundary conditions with fluid-wall contacts. LCPs have also started to appear in deformable models and granular simulations. There is an increasing need for numerical methods to solve the resulting LCPs with all these new applications. This book provides a numerical foundation for such methods, especially suited for use in computer graphics. This book is mainly intended for a researcher/Ph.D. student/post-doc/professor who wants to study the algorithms and do more work/research in this area. Programmers might have to invest some time brushing up on math skills; for this, we refer to Appendices A and B. The reader should be familiar with linear algebra and differential calculus. We provide pseudo code for all the numerical methods, which should be comprehensible by any computer scientist with rudimentary programming skills. The reader can find an online supplementary code repository, containing Matlab implementations of many of the core methods covered in these notes, as well as a few Python implementations [Erleben, 2011]. View full abstract»
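
    A representative method from this family is projected Gauss-Seidel, written out below as a compact sketch (the book's pseudo code and the supplementary repository are the authoritative versions).

```python
import numpy as np

def projected_gauss_seidel(A, b, iterations=200):
    """Solve the LCP: find x >= 0 with Ax + b >= 0 and x'(Ax + b) = 0.

    Suited to the symmetric positive (semi-)definite matrices arising in
    contact problems: each sweep updates one variable at a time and then
    projects it back onto the nonnegative orthant."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(iterations):
        for i in range(len(b)):
            residual = b[i] + A[i] @ x - A[i, i] * x[i]
            x[i] = max(0.0, -residual / A[i, i])
    return x
```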

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Multiantenna Systems for MIMO Communications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Advanced communication scenarios demand the development of new systems where antenna theory, channel propagation, and communication models are seen from a common perspective as a way to understand and optimize the system as a whole. In this context, a comprehensive multiantenna formulation for multiple-input multiple-output systems is presented, with a special emphasis on the connection of the electromagnetic and communication principles. Starting from the capacity of a multiantenna system, the book reviews radiation, propagation, and communication mechanisms, paying particular attention to the vectorial, directional, and time-frequency characteristics of the wireless communication equation for low- and high-scattering environments. Based on the previous concepts, different space-time methods for diversity and multiplexing applications are discussed, multiantenna modeling is studied, and specific tools are introduced to analyze the antenna coupling mechanisms and formulate appropriate decorrelation techniques. Miniaturization techniques for closely spaced antennas are studied, and their fundamental limits and optimization strategies are reviewed. Finally, different practical multiantenna topologies for new communication applications are presented, and their main parameters discussed. A relevant feature is a collection of synthesis exercises that review the main topics of the book and introduce state-of-the-art system architectures and parameters, facilitating its use either as a textbook or as a support tool for multiantenna systems design. Table of Contents: Principles of Multiantenna Communication Systems / The Radio Channel for MIMO Communication Systems / Coding Theory for MIMO Communication Systems / Antenna Modeling for MIMO Communication Systems / Design of MPAs for MIMO Communication Systems / Design Examples and Performance Analysis of Different MPAs / References / List of Acronyms / List of Symbols / Operators and Mathematical Symbols View full abstract»
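
    The capacity expression that such a formulation starts from is the standard one, shown here for equal power allocation across transmit antennas (the book's notation may differ):

```latex
% Capacity of an N_t-transmit, N_r-receive MIMO channel H at receive SNR \rho:
C \;=\; \log_2 \det\!\Bigl( \mathbf{I}_{N_r}
        + \tfrac{\rho}{N_t}\, \mathbf{H}\mathbf{H}^{\mathsf{H}} \Bigr)
\quad \text{bits/s/Hz}.
```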

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Translating Euclid:Designing a Human-Centered Mathematics

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Translating Euclid reports on an effort to transform geometry for students from a stylus-and-clay-tablet corpus of historical theorems to a stimulating computer-supported collaborative-learning inquiry experience. The origin of geometry was a turning point in the pre-history of informatics, literacy, and rational thought. Yet, this triumph of human intellect became ossified through historic layers of systematization, beginning with Euclid’s organization of the Elements of geometry. Often taught by memorization of procedures, theorems, and proofs, geometry in schooling rarely conveys its underlying intellectual excitement. The recent development of dynamic-geometry software offers an opportunity to translate the study of geometry into a contemporary vernacular. However, this involves transformations along multiple dimensions of the conceptual and practical context of learning. Translating Euclid steps through the multiple challenges involved in redesigning geometry education to take advantage of computer support. Networked computers portend an interactive approach to exploring dynamic geometry as well as broadened prospects for collaboration. The proposed conception of geometry emphasizes the central role of the construction of dependencies as a design activity, integrating human creation and mathematical discovery to form a human-centered approach to mathematics. This book chronicles an iterative effort to adapt technology, theory, pedagogy and practice to support this vision of collaborative dynamic geometry and to evolve the approach through on-going cycles of trial with students and refinement of resources. It thereby provides a case study of a design-based research effort in computer-supported collaborative learning from a human-centered informatics perspective. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Mobile Agent Rendezvous Problem in the Ring

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Mobile agent computing is being used in fields as diverse as artificial intelligence, computational economics and robotics. Agents' ability to adapt dynamically and execute asynchronously and autonomously brings potential advantages in terms of fault-tolerance, flexibility and simplicity. This monograph focuses on studying mobile agents as modelled in distributed systems research and in particular within the framework of research performed in the distributed algorithms community. It studies the fundamental question of how to achieve rendezvous, the gathering of two or more agents at the same node of a network. Like leader election, such an operation is a useful subroutine in more general computations that may require the agents to synchronize, share information, divide up chores, etc. The work provides an introduction to the algorithmic issues raised by the rendezvous problem in the distributed computing setting. For the most part our investigation concentrates on the simplest case of two agents attempting to rendezvous on a ring network. Other situations including multiple agents, faulty nodes and other topologies are also examined. An extensive bibliography provides many pointers to related work not covered in the text. The presentation has a distinctly algorithmic, rigorous, distributed computing flavor and most results should be easily accessible to advanced undergraduate and graduate students in computer science and mathematics departments. Table of Contents: Models for Mobile Agent Computing / Deterministic Rendezvous in a Ring / Multiple Agent Rendezvous in a Ring / Randomized Rendezvous in a Ring / Other Models / Other Topologies View full abstract»
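
    To get a feel for the problem, the simulation below (a toy random-walk strategy, not one of the book's algorithms) estimates how long two anonymous agents on a ring take to meet when each one independently either moves one step or waits in every round.

```python
import random

def randomized_rendezvous(n=16):
    """Rounds until two agents on an n-node ring occupy the same node.
    Each round, each agent independently stays put or steps clockwise;
    randomness breaks the symmetry that defeats deterministic agents."""
    a, b = random.randrange(n), random.randrange(n)
    rounds = 0
    while a != b:
        a = (a + random.randint(0, 1)) % n
        b = (b + random.randint(0, 1)) % n
        rounds += 1
    return rounds

# Average meeting time over many trials on a 16-node ring.
print(sum(randomized_rendezvous() for _ in range(1000)) / 1000)
```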

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Design and Development of RFID and RFID-Enabled Sensors on Flexible Low Cost Substrates

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book presents a step-by-step discussion of the design and development of radio frequency identification (RFID) and RFID-enabled sensors on flexible low cost substrates for UHF frequency bands. Various examples of fully functional building blocks (design and fabrication of antennas, integration with ICs and microcontrollers, power sources, as well as inkjet-printing techniques) demonstrate the revolutionary effect of this approach in low cost RFID and RFID-enabled sensors fields. This approach could be easily extended to other microwave and wireless applications as well. The first chapter describes the basic functionality and the physical and IT-related principles underlying RFID and sensors technology. Chapter two explains in detail inkjet-printing technology, providing the characterization of the conductive ink, which consists of nano-silver-particles, while highlighting the importance of this technology as a fast and simple fabrication technique, especially on flexible organic substrates such as Liquid Crystal Polymer (LCP) or paper-based substrates. Chapter three demonstrates several compact inkjet-printed UHF RFID antennas using antenna matching techniques to match the IC's complex impedance as prototypes to provide the proof of concept of this technology. Chapter four discusses the benefits of using conformal magnetic material as a substrate for miniaturized high-frequency circuit applications. In addition, in Chapter five, the authors also touch upon the state-of-the-art area of fully-integrated wireless sensor modules on organic substrates and show the first ever 2D sensor integration with an RFID tag module on paper, as well as the possibility of 3D multilayer paper-based RF/microwave structures. Table of Contents: Radio Frequency Identification Introduction / Flexible Organic Low Cost Substrates / Benchmarking RFID Prototypes on Organic Substrates / Conformal Magnetic Composite RFID Tags / Inkjet-Printed RFID-Enabled Sensors View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Information Architecture:The Design and Integration of Information Spaces

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Information Architecture is about organizing and simplifying information, designing and integrating information spaces/systems, and creating ways for people to find and interact with information content. Its goal is to help people understand and manage information and make the right decisions accordingly. This updated and revised edition of the book looks at integrated information spaces in the web context and beyond, with a focus on putting theories and principles into practice. In the ever-changing social, organizational, and technological contexts, information architects not only design individual information spaces (e.g., websites, software applications, and mobile devices), but also tackle strategic aggregation and integration of multiple information spaces across websites, channels, modalities, and platforms. Not only do they create predetermined navigation pathways, but they also provide tools and rules for people to organize information on their own and get connected with others. Information architects work with multi-disciplinary teams to determine the user experience strategy based on user needs and business goals, and make sure the strategy gets carried out by following the user-centered design (UCD) process via close collaboration with others. Drawing on the authors’ extensive experience as HCI researchers, User Experience Design practitioners, and Information Architecture instructors, this book provides a balanced view of the IA discipline by applying theories, design principles, and guidelines to IA and UX practices. It also covers advanced topics such as iterative design, UX decision support, and global and mobile IA considerations. Major revisions include moving away from a web-centric view toward multi-channel, multi-device experiences. Concepts such as responsive design, emerging design principles, and user-centered methods such as Agile, Lean UX, and Design Thinking are discussed and related to IA processes and practices. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Understanding Atrial Fibrillation:The Signal Processing Contribution, Part II

    Copyright Year: 2008

    Morgan and Claypool eBooks

    The book presents recent advances in signal processing techniques for modeling, analysis, and understanding of the heart's electrical activity during atrial fibrillation. This arrhythmia is the most commonly encountered in clinical practice, and its complex and metamorphic nature represents a challenging problem for clinicians, engineers, and scientists. Research on atrial fibrillation has stimulated the development of a wide range of signal processing tools to better understand the mechanisms ruling its initiation, maintenance, and termination. This book provides undergraduate and graduate students, as well as researchers and practicing engineers, with an overview of techniques, including time domain techniques for atrial wave extraction, time-frequency analysis for exploring wave dynamics, and nonlinear techniques to characterize the ventricular response and the organization of atrial activity. The book includes an introductory chapter about atrial fibrillation and its mechanisms, treatment, and management. The successive chapters are dedicated to the analysis of atrial signals recorded on the body surface and to the quantification of ventricular response. The rest of the book explores techniques to characterize endo- and epicardial recordings and to model atrial conduction. Although it may appear to be a monothematic book on atrial fibrillation, the reader will not only recognize common problems of biomedical signal processing but also discover that the analysis of atrial fibrillation is a unique challenge for developing and testing novel signal processing tools. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Modeling and Data Mining in Blogosphere

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book offers a comprehensive overview of the various concepts and research issues about blogs or weblogs. It introduces techniques and approaches, tools and applications, and evaluation methodologies with examples and case studies. Blogs allow people to express their thoughts, voice their opinions, and share their experiences and ideas. Blogs also facilitate interactions among individuals creating a network with unique characteristics. Through the interactions individuals experience a sense of community. We elaborate on approaches that extract communities and cluster blogs based on information of the bloggers. Open standards and low barrier to publication in Blogosphere have transformed information consumers to producers, generating an overwhelming amount of ever-increasing knowledge about the members, their environment and symbiosis. We elaborate on approaches that sift through humongous blog data sources to identify influential and trustworthy bloggers leveraging content and network information. Spam blogs or "splogs" are an increasing concern in Blogosphere and are discussed in detail with the approaches leveraging supervised machine learning algorithms and interaction patterns. We elaborate on data collection procedures, provide resources for blog data repositories, mention various visualization and analysis tools in Blogosphere, and explain conventional and novel evaluation methodologies, to help perform research in the Blogosphere. The book is supported by additional material, including lecture slides as well as the complete set of figures used in the book, and the reader is encouraged to visit the book website for the latest information. Table of Contents: Modeling Blogosphere / Blog Clustering and Community Discovery / Influence and Trust / Spam Filtering in Blogosphere / Data Collection and Evaluation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Smart Antennas

    Copyright Year: 2007

    Morgan and Claypool eBooks

    As the demand for mobile communications constantly increases, the need for better coverage, improved capacity, and higher transmission quality rises. Thus, a more efficient use of the radio spectrum is required. Smart antenna systems are capable of efficiently utilizing the radio spectrum and promise an effective solution to the problems of present wireless systems while achieving reliable and robust high-speed, high-data-rate transmission. The purpose of this book is to provide the reader a broad view of the system aspects of smart antennas. In fact, smart antenna systems comprise several critical areas such as individual antenna array design, signal processing algorithms, space-time processing, wireless channel modeling and coding, and network performance. In this book we include an overview of smart antenna concepts, introduce some of the areas that impact smart antennas, and examine the influence of the interaction and integration of these areas on Mobile Ad Hoc Networks. In addition, the general principles and major benefits of using space-time processing are introduced, especially employing multiple-input multiple-output (MIMO) techniques. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Cleaning:A Practical Perspective

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Data warehouses consolidate various activities of a business and often form the backbone for generating reports that support important business decisions. Errors in data tend to creep in for a variety of reasons. Some of these reasons include errors during input data collection and errors while merging data collected independently across different databases. These errors in data warehouses often result in erroneous upstream reports, and could impact business decisions negatively. Therefore, one of the critical challenges while maintaining large data warehouses is that of ensuring the quality of data in the data warehouse remains high. The process of maintaining high data quality is commonly referred to as data cleaning. In this book, we first discuss the goals of data cleaning. Often, the goals of data cleaning are not well defined and could mean different solutions in different scenarios. Toward clarifying these goals, we abstract out a common set of data cleaning tasks that often need to be addressed. This abstraction allows us to develop solutions for these common data cleaning tasks. We then discuss a few popular approaches for developing such solutions. In particular, we focus on an operator-centric approach for developing a data cleaning platform. The operator-centric approach involves the development of customizable operators that could be used as building blocks for developing common solutions. This is similar to the approach of relational algebra for query processing. The basic set of operators can be put together to build complex queries. Finally, we discuss the development of custom scripts which leverage the basic data cleaning operators along with relational operators to implement effective solutions for data cleaning tasks. View full abstract»
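
    The operator-centric idea can be pictured with a toy pipeline (hypothetical operators and record format, not the book's platform): each operator does one narrow task, and cleaning solutions are compositions of operators.

```python
def trim(record):
    """Operator: strip stray whitespace from every string field."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def normalize_phone(record):
    """Operator: keep only digits in the phone field, if present."""
    if "phone" in record:
        record["phone"] = "".join(ch for ch in record["phone"] if ch.isdigit())
    return record

def dedupe(records, key):
    """Operator: drop records whose key value was already seen."""
    seen, out = set(), []
    for r in records:
        if r[key] not in seen:
            seen.add(r[key])
            out.append(r)
    return out

def clean(records):
    records = [normalize_phone(trim(r)) for r in records]
    return dedupe(records, key="phone")

records = [{"name": " Ann ", "phone": "(555) 010-1234"},
           {"name": "Ann",   "phone": "555 010 1234"}]
print(clean(records))   # one record survives normalization + dedup
```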

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Visual Object Recognition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The visual recognition problem is central to computer vision research. From robotics to information retrieval, many desired applications demand the ability to identify and localize categories, places, and objects. This tutorial overviews computer vision algorithms for visual object recognition and image classification. We introduce primary representations and learning approaches, with an emphasis on recent advances in the field. The target audience consists of researchers or students working in AI, robotics, or vision who would like to understand what methods and representations are available for these problems. This lecture summarizes what is and isn't possible to do reliably today, and overviews key concepts that could be employed in systems requiring visual categorization. Table of Contents: Introduction / Overview: Recognition of Specific Objects / Local Features: Detection and Description / Matching Local Features / Geometric Verification of Matched Features / Example Systems: Specific-Object Recognition / Overview: Recognition of Generic Object Categories / Representations for Object Categories / Generic Object Detection: Finding and Scoring Candidates / Learning Generic Object Category Models / Example Systems: Generic Object Recognition / Other Considerations and Current Challenges / Conclusions View full abstract»
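
    The local-feature pipeline in the specific-object chapters looks like this in practice (OpenCV's ORB used as a stand-in detector/descriptor; the file names are placeholders):

```python
import cv2

img1 = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE)   # reference object
img2 = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)    # cluttered scene

# Detect local features and compute binary descriptors for both images.
orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors; cross-checking keeps only mutually best matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
good = matches[:50]   # these would then go to geometric verification
```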

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Design of Reconfigurable Antennas Using Graph Models

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This lecture discusses the use of graph models to represent reconfigurable antennas. The rise of antennas that adapt to their environment and change their operation based on the user's request has not been met with clear design guidelines. There is a need to propose rules for the optimization of any reconfigurable antenna design and performance. Since reconfigurable antennas are seen as a collection of self-organizing parts, graph models can be introduced to relate each possible topology to a corresponding electromagnetic performance in terms of achieving a characteristic frequency of operation, impedance, and polarization. These models help designers understand reconfigurable antenna structures and enhance their functionality, since they transform antennas from bulky devices into mathematical and software-accessible models. The use of graphs facilitates the software control and cognition ability of reconfigurable antennas while optimizing their performance. This lecture also discusses reducing the redundancy and complexity of reconfigurable antennas and reconfigurable antenna arrays, as well as their reliability. The full analysis of these parameters allows a better reconfigurable antenna implementation in wireless and space communications platforms. The use of graph models to reduce the complexity while preserving the reliability of reconfigurable antennas allows better incorporation in applications such as cognitive radio, MIMO, satellite communications, and personal communication systems. A swifter response time is achieved with less cost and fewer losses. This lecture is written for individuals who wish to venture into the field of reconfigurable antennas with little prior experience in this area, and learn how graph rules and theory, mainly used in the fields of computer science, networking, and control systems, can be applied to electromagnetic structures. This lecture walks the reader through a design and analysis process for reconfigurable antennas using graph models, with a practical and theoretical outlook. View full abstract»
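
    A minimal rendering of the idea (invented antenna parts and switch set, purely illustrative): antenna parts are vertices, switches are edges, each on/off assignment is a topology, and only topologies that keep the fed part connected are electromagnetically meaningful.

```python
import itertools

# RF switches between antenna parts, modeled as graph edges.
switches = [("feed", "arm1"), ("feed", "arm2"), ("arm1", "stub")]

def reachable_from_feed(closed_switches):
    """Parts electrically connected to the feed for one switch setting."""
    reach = {"feed"}
    changed = True
    while changed:
        changed = False
        for u, v in closed_switches:
            if (u in reach) != (v in reach):   # edge crosses the boundary
                reach |= {u, v}
                changed = True
    return reach

# Enumerate every topology and the parts it actually drives.
for k in range(len(switches) + 1):
    for setting in itertools.combinations(switches, k):
        print(sorted(setting), "->", sorted(reachable_from_feed(setting)))
```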

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Big Data Integration

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The big data era is upon us: data are being generated, analyzed, and used at an unprecedented scale, and data-driven decision making is sweeping through all aspects of society. Since the value of data explodes when it can be linked and fused with other data, addressing the big data integration (BDI) challenge is critical to realizing the promise of big data. BDI differs from traditional data integration along the dimensions of volume, velocity, variety, and veracity. First, not only can data sources contain a huge volume of data, but also the number of data sources is now in the millions. Second, because of the rate at which newly collected data are made available, many of the data sources are very dynamic, and the number of data sources is also rapidly exploding. Third, data sources are extremely heterogeneous in their structure and content, exhibiting considerable variety even for substantially similar entities. Fourth, the data sources are of widely differing qualities, with significant differences in the coverage, accuracy and timeliness of data provided. This book explores the progress that has been made by the data integration community on the topics of schema alignment, record linkage and data fusion in addressing these novel challenges faced by big data integration. Each of these topics is covered in a systematic way: first starting with a quick tour of the topic in the context of traditional data integration, followed by a detailed, example-driven exposition of recent innovative techniques that have been proposed to address the BDI challenges of volume, velocity, variety, and veracity. Finally, it presents emerging topics and opportunities that are specific to BDI, identifying promising directions for the data integration community. View full abstract»
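
    Record linkage illustrates the volume point concretely: all-pairs comparison is quadratic, so practical systems first block on a cheap key and compare only within blocks. The sketch below uses a toy similarity measure and blocking key (both illustrative, not techniques from the book).

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold=0.85) -> bool:
    """Toy record-linkage predicate based on string similarity."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def link(records_a, records_b):
    """Blocking on the first letter of the name avoids all-pairs matching."""
    blocks = {}
    for r in records_b:
        blocks.setdefault(r["name"][:1].lower(), []).append(r)
    return [(a, b)
            for a in records_a
            for b in blocks.get(a["name"][:1].lower(), [])
            if similar(a["name"], b["name"])]
```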

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    MPEG-4 Beyond Conventional Video Coding:Object Coding, Resilience, and Scalability

    Copyright Year: 2006

    Morgan and Claypool eBooks

    An important merit of the MPEG-4 video standard is that it not only provided tools and algorithms for enhancing the compression efficiency of existing MPEG-2 and H.263 standards but also contributed key innovative solutions for new multimedia applications such as real-time video streaming to PCs and cell phones over the Internet and wireless networks, interactive services, and multimedia access. Many of these solutions are currently used in practice or have been important stepping-stones for new standards and technologies. In this book, we do not aim at providing a complete reference for MPEG-4 video as many excellent references on the topic already exist. Instead, we focus on three topics that we believe formed key innovations of MPEG-4 video and that will continue to serve as an inspiration and basis for new, emerging standards, products, and technologies. The three topics highlighted in this book are object-based coding and scalability, Fine Granularity Scalability, and error resilience tools. This book is aimed at engineering students as well as professionals interested in learning about these MPEG-4 technologies for multimedia streaming and interaction. Finally, it is not intended as a substitute or manual for the MPEG-4 standard, but rather as a tutorial focused on the principles and algorithms underlying it. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mapped Vector Basis Functions for Electromagnetic Integral Equations

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The method-of-moments solution of the electric field and magnetic field integral equations (EFIE and MFIE) is extended to conducting objects modeled with curved cells. These techniques are important for electromagnetic scattering, antenna, radar signature, and wireless communication applications. Vector basis functions of the divergence-conforming and curl-conforming types are explained, and specific interpolatory and hierarchical basis functions are reviewed. Procedures for mapping these basis functions from a reference domain to a curved cell, while preserving the desired continuity properties on curved cells, are discussed in detail. For illustration, results are presented for examples that employ divergence-conforming basis functions with the EFIE and curl-conforming basis functions with the MFIE. The intended audience includes electromagnetic engineers with some previous familiarity with numerical techniques. View full abstract»
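
    The two mappings at the heart of this material are the standard contravariant and covariant Piola transforms, stated here for reference (the book develops them in detail). With J the Jacobian of the map from the reference cell to the curved cell, a reference basis function maps as:

```latex
% Divergence-conforming basis functions (normal continuity preserved):
\mathbf{v}(\mathbf{x}) \;=\; \frac{1}{\det\mathbf{J}}\,
    \mathbf{J}\,\hat{\mathbf{v}}(\hat{\mathbf{x}}) ,
\qquad
% Curl-conforming basis functions (tangential continuity preserved):
\mathbf{v}(\mathbf{x}) \;=\; \mathbf{J}^{-\mathsf{T}}\,
    \hat{\mathbf{v}}(\hat{\mathbf{x}}) .
```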

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bandwidth Extension of Speech Using Perceptual Criteria

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Bandwidth extension of speech is used in the International Telecommunication Union G.729.1 standard in which the narrowband bitstream is combined with quantized high-band parameters. Although this system produces high-quality wideband speech, the additional bits used to represent the high band can be further reduced. In addition to the algorithm used in the G.729.1 standard, bandwidth extension methods based on spectrum prediction have also been proposed. Although these algorithms do not require additional bits, they perform poorly when the correlation between the low and the high band is weak. In this book, two wideband speech coding algorithms that rely on bandwidth extension are developed. The algorithms operate as wrappers around existing narrowband compression schemes. More specifically, in these algorithms, the low band is encoded using an existing toll-quality narrowband system, whereas the high band is generated using the proposed extension techniques. The first method relies only on transmitted high-band information to generate the wideband speech. The second algorithm uses a constrained minimum mean square error estimator that combines transmitted high-band envelope information with a predictive scheme driven by narrowband features. Both algorithms make use of novel perceptual models based on loudness that determine optimum quantization strategies for wideband recovery and synthesis. Objective and subjective evaluations reveal that the proposed system performs at a lower average bit rate while improving speech quality when compared to other similar algorithms. View full abstract»
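
    For contrast with the estimators developed in the book, the crudest classical bandwidth-extension baseline is spectral folding, which fits in three lines: zero-stuffing doubles the sampling rate and mirrors the low-band spectrum into the new high band (illustrative only; the folded band still needs envelope shaping).

```python
import numpy as np

def spectral_fold(narrowband):
    """Zero-stuffing upsampler: the spectral image created in the new
    high band serves as a crude excitation for bandwidth extension."""
    wideband = np.zeros(2 * len(narrowband), dtype=float)
    wideband[::2] = narrowband
    return wideband
```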

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Background Subtraction:Theory and Practice

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Background subtraction is a widely used concept for detection of moving objects in videos. In the last two decades there has been a lot of development in designing algorithms for background subtraction, as well as wide use of these algorithms in various important applications, such as visual surveillance, sports video analysis, motion capture, etc. Various statistical approaches have been proposed to model scene backgrounds. The concept of background subtraction also has been extended to detect objects from videos captured from moving cameras. This book reviews the concept and practice of background subtraction. We discuss several traditional statistical background subtraction models, including the widely used parametric Gaussian mixture models and non-parametric models. We also discuss the issue of shadow suppression, which is essential for human motion analysis applications. This book discusses approaches and tradeoffs for background maintenance. This book also reviews many of the recent developments in the background subtraction paradigm. Recent advances in developing algorithms for background subtraction from moving cameras are described, including motion-compensation-based approaches and motion-segmentation-based approaches. View full abstract»
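
    In practice, the parametric Gaussian mixture approach discussed here is available off the shelf; a minimal OpenCV loop (the video path is a placeholder) with the shadow suppression step looks like this:

```python
import cv2

cap = cv2.VideoCapture("video.mp4")
# MOG2 is OpenCV's Gaussian-mixture background model.
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)   # 255 = foreground, 127 = shadow
    # Suppress shadow pixels (value 127) before further motion analysis.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)

cap.release()
```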

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Systems Engineering:Building Successful Systems

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book provides an overview of systems engineering, its important elements, and aspects of management that will lead in the direction of building systems with a greater likelihood of success. Emphasis is placed upon the following elements: - How the systems approach is defined, and how it guides the systems engineering processes - How systems thinking helps in combination with the systems approach and systems engineering - Time lines that define the life cycle dimensions of a system - System properties, attributes, features, measures and parameters - Approaches to architecting systems - Dealing with requirements, synthesis, analysis and cost effectiveness considerations - Life cycle costing of systems - Modeling, simulation and other analysis methods - Technology and its interplay with risk and its management - Systems acquisition and integration - Systems of systems - Thinking outside the box - Success and failure factors - Software engineering - Standards - Systems engineering management. Together, these top-level aspects of systems engineering need to be understood and mastered in order to improve the way we build systems, as they typically become larger and more complex. Table of Contents: Definitions and Background / The Systems Approach / Systems Thinking / Key Elements of Systems Engineering / The Life Cycle Dimension / System Properties, Attributes and Features (PAFs) / Measures and Parameters / Architecting / Functional Decomposition / Requirements Engineering / Synthesis / Analysis / Cost-Effectiveness / Life Cycle Costing / Modeling and Simulation / Other Analysis Relationships / The Role of Technology / Risk Management / Testing, Verification, and Validation / Integration / Systems Engineering Management / Project Management / Software Engineering / Systems Acquisition / Systems of Systems / Thinking Outside the Box / Ten Failure Factors / A Success Audit / Standards View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fundamental Biomechanics in Bone Tissue Engineering

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This eight-chapter monograph intends to present basic principles and applications of biomechanics in bone tissue engineering in order to assist tissue engineers in design and use of tissue-engineered products for repair and replacement of damaged/deformed bone tissues. Briefly, Chapter 1 gives an overall review of biomechanics in the field of bone tissue engineering. Chapter 2 provides detailed information regarding the composition and architecture of bone. Chapter 3 discusses the current methodologies for mechanical testing of bone properties (i.e., elastic, plastic, damage/fracture, viscoelastic/viscoplastic properties). Chapter 4 presents the current understanding of the mechanical behavior of bone and the associated underlying mechanisms. Chapter 5 discusses the structure and properties of scaffolds currently used for bone tissue engineering applications. Chapter 6 gives a brief discussion of current mechanical and structural tests of repair/tissue engineered bone tissues. Chapter 7 summarizes the properties of repair/tissue engineered bone tissues currently attained. Finally, Chapter 8 discusses the current issues regarding biomechanics in the area of bone tissue engineering. Table of Contents: Introduction / Bone Composition and Structure / Current Mechanical Test Methodologies / Mechanical Behavior of Bone / Structure and Properties of Scaffolds for Bone Tissue Regeneration / Mechanical and Structural Evaluation of Repair/Tissue Engineered Bone / Mechanical and Structural Properties of Tissues Engineered/Repair Bone / Current Issues of Biomechanics in Bone Tissue Engineering View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Nonlinear Source Separation

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The purpose of this lecture book is to present the state of the art in nonlinear blind source separation, in a form appropriate for students, researchers, and developers. Source separation deals with the problem of recovering sources that are observed mixed together. When we have little knowledge about the sources and about the mixing process, we speak of blind source separation. Linear blind source separation is a relatively well studied subject; nonlinear blind source separation is still at a less advanced stage, but has seen several significant developments in the last few years. This publication reviews the main nonlinear separation methods, including the separation of post-nonlinear mixtures, and the MISEP, ensemble learning, and kTDSEP methods for generic mixtures. These methods are studied in significant depth. A historical overview is also presented, mentioning most of the relevant results on nonlinear blind source separation that have been presented over the years. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Lung Sounds:An Advanced Signal Processing Perspective

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Lung sounds auscultation is often the first noninvasive resource for detection and discrimination of respiratory pathologies available to the physician through the use of the stethoscope. Hearing interpretation, though, was the only means of appreciation of the lung sounds diagnostic information for many decades. Nevertheless, in recent years, computerized auscultation combined with signal processing techniques has boosted the diagnostic capabilities of lung sounds. The latter were traditionally analyzed and characterized by morphological changes in the time domain using statistical measures, by spectral properties in the frequency domain using simple spectral analysis, or by nonstationary properties in a joint time–frequency domain using the short-time Fourier transform. Advanced signal processing techniques, however, have emerged in the last decade, broadening the perspective in lung sounds analysis. The scope of this book is to present up-to-date signal processing techniques that have been applied to the area of lung sound analysis. It starts with a description of the nature of lung sounds and continues with the introduction of new domains in their representation, new denoising techniques, and concludes with some reflective implications, both from engineers’ and physicians’ perspectives. Issues of nonstationarity, nonlinearity, non-Gaussianity, modeling, and classification of lung sounds are addressed with new methodologies, revealing a more realistic approach to their pragmatic nature. Advanced denoising techniques that effectively circumvent the noise presence (e.g., heart sound interference, background noise) in lung sound recordings are described, providing the physician with high-quality auscultative data. The book offers useful information both to engineers and physicians interested in bioacoustics, clearly demonstrating the current trends in lung sound analysis. Table of Contents: The Nature of Lung Sound Signals / New Domains in LS Representation / Denoising Techniques / Reflective Implications View full abstract»
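
    The short-time Fourier transform mentioned above is the classic entry point to joint time–frequency analysis of lung sounds. As a rough illustration (our own, not taken from the book), the following Python sketch computes an STFT of a synthetic signal standing in for a recording; the sampling rate, window length, and the 400 Hz "wheeze"-like tone are all assumptions of the example.

    ```python
    import numpy as np
    from scipy.signal import stft

    fs = 8000                          # assumed sampling rate, Hz
    t = np.arange(0, 2.0, 1 / fs)
    x = 0.5 * np.random.randn(t.size)  # broadband "breath" noise stand-in
    wheeze = slice(fs, int(1.5 * fs))  # a 400 Hz tone in one 0.5 s stretch
    x[wheeze] += np.sin(2 * np.pi * 400 * t[wheeze])

    f, seg_t, Z = stft(x, fs=fs, nperseg=256)  # short-time Fourier transform
    power = np.abs(Z) ** 2                     # time-frequency power map
    print(power.shape)                         # (frequency bins, time frames)
    ```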

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Integral Equation Methods for Electromagnetic and Elastic Waves

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Integral Equation Methods for Electromagnetic and Elastic Waves is an outgrowth of several years of work. There have been no recent books on integral equation methods. There are books written on integral equations, but either they have been around for a while, or they were written by mathematicians. Much of the knowledge in integral equation methods still resides in journal papers. With this book, important relevant knowledge for integral equations is consolidated in one place, and researchers need only read the pertinent chapters in this book to gain the knowledge needed for integral equation research. Also, learning the fundamentals of linear elastic wave theory does not require a quantum leap for electromagnetic practitioners. Integral equation methods have been around for several decades, and their introduction to electromagnetics has been due to the seminal works of Richmond and Harrington in the 1960s. There was a surge in the interest in this topic in the 1980s (notably the work of Wilton and his coworkers) due to increased computing power. The interest in this area was on the wane when it was demonstrated that differential equation methods, with their sparse matrices, can solve many problems more efficiently than integral equation methods. Recently, due to the advent of fast algorithms, there has been a revival of integral equation methods in electromagnetics. Much of our work in recent years has been in fast algorithms for integral equations, which prompted our interest in integral equation methods. While previously only tens of thousands of unknowns could be solved by integral equation methods, now tens of millions of unknowns can be solved with fast algorithms. This has prompted new enthusiasm in integral equation methods. Table of Contents: Introduction to Computational Electromagnetics / Linear Vector Space, Reciprocity, and Energy Conservation / Introduction to Integral Equations / Integral Equations for Penetrable Objects / Low-Frequency Problems in Integral Equations / Dyadic Green's Function for Layered Media and Integral Equations / Fast Inhomogeneous Plane Wave Algorithm for Layered Media / Electromagnetic Wave versus Elastic Wave / Glossary of Acronyms View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Electrical Engineering: Systems & Instruments

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Pragmatic Electrical Engineering: Systems and Instruments is about some of the non-energy parts of electrical systems, the parts that control things and measure physical parameters. The primary topics are control systems and their characterization, instrumentation, signals, and electromagnetic compatibility. This text features a large number of completely worked examples to aid the reader in understanding how the various principles fit together. While electrical engineers may find this material useful as a review, engineers in other fields can use this short lecture text as a modest introduction to these non-energy parts of electrical systems. Some knowledge of basic d-c circuits and of phasors in the sinusoidal steady state is presumed. Table of Contents: Closed-Loop Control Systems / Characterizing a System / Instrumentation / Processing Signals / Electromagnetic Compatibility View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Communication Networks:A Concise Introduction

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book results from many years of teaching an upper division course on communication networks in the EECS department at the University of California, Berkeley. It is motivated by the perceived need for an easily accessible textbook that puts emphasis on the core concepts behind current and next generation networks. After an overview of how today's Internet works and a discussion of the main principles behind its architecture, we discuss the key ideas behind Ethernet, WiFi networks, routing, internetworking, and TCP. To make the book as self-contained as possible, brief discussions of probability and Markov chain concepts are included in the appendices. This is followed by a brief discussion of mathematical models that provide insight into the operations of network protocols. Next, the main ideas behind the new generation of wireless networks based on WiMAX and LTE, and the notion of QoS, are presented. A concise discussion of the physical layer technologies underlying various networks is also included. Finally, a sampling of topics is presented that may have significant influence on the future evolution of networks, including overlay networks like content delivery and peer-to-peer networks, sensor networks, distributed algorithms, Byzantine agreement, and source compression. Table of Contents: The Internet / Principles / Ethernet / WiFi / Routing / Internetworking / Transport / Models / WiMAX & LTE / QOS / Physical Layer / Additional Topics View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Faceted Search

    Copyright Year: 2009

    Morgan and Claypool eBooks

    We live in an information age that requires us, more than ever, to represent, access, and use information. Over the last several decades, we have developed a modern science and technology for information retrieval, relentlessly pursuing the vision of a "memex" that Vannevar Bush proposed in his seminal article, "As We May Think." Faceted search plays a key role in this program. Faceted search addresses weaknesses of conventional search approaches and has emerged as a foundation for interactive information retrieval. User studies demonstrate that faceted search provides more effective information-seeking support to users than best-first search. Indeed, faceted search has become increasingly prevalent in online information access systems, particularly for e-commerce and site search. In this lecture, we explore the history, theory, and practice of faceted search. Although we cannot hope to be exhaustive, our aim is to provide sufficient depth and breadth to offer a useful resource to both researchers and practitioners. Because faceted search is an area of interest to computer scientists, information scientists, interface designers, and usability researchers, we do not assume that the reader is a specialist in any of these fields. Rather, we offer a self-contained treatment of the topic, with an extensive bibliography for those who would like to pursue particular aspects in more depth. Table of Contents: I. Key Concepts / Introduction: What Are Facets? / Information Retrieval / Faceted Information Retrieval / II. Research and Practice / Academic Research / Commercial Applications / III. Practical Concerns / Back-End Concerns / Front-End Concerns / Conclusion / Glossary View full abstract»
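
    A minimal sketch may help illustrate the mechanics: a faceted search system maintains, for the current result set, counts of documents under each facet value, and narrows the set when the user selects a value. The Python below is our own toy illustration; the documents and field names are invented.

    ```python
    from collections import Counter

    # Toy collection; the fields and values are invented for the example.
    docs = [
        {"title": "Trail Shoe", "brand": "Acme", "color": "red",  "price": 90},
        {"title": "Road Shoe",  "brand": "Acme", "color": "blue", "price": 120},
        {"title": "Sandal",     "brand": "Zeta", "color": "red",  "price": 40},
    ]

    def facet_counts(results, field):
        """How many current results fall under each value of a facet."""
        return Counter(d[field] for d in results)

    def refine(results, field, value):
        """Narrow the result set when the user selects a facet value."""
        return [d for d in results if d[field] == value]

    print(facet_counts(docs, "brand"))   # Counter({'Acme': 2, 'Zeta': 1})
    red = refine(docs, "color", "red")   # user clicks the "red" facet
    print(facet_counts(red, "brand"))    # counts recomputed for the refined set
    ```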

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A, B, See... in 3D:A Workbook to Improve 3-D Visualization Skills

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The workbook provides over 100 3-D visualization exercises challenging the student to create three dimensions from two. It is a powerful and effective way to help engineering and architecture educators teach spatial visualization. Most of the 3-D visualization exercises currently being used by students in Design and Graphics classes present the objects in isometric views already in 3-D, asking the viewer to create multiple views, fold patterns, manipulate, reflect, or rotate them. The exercises that present objects in incomplete multiview projections and ask students to add missing lines mostly use real 3-D objects that are easily recognizable, to help the student correlate 2-D with 3-D. This workbook uses a different approach. Each view of the solid represents a letter of the alphabet. The letters are by definition 2-D representations, and when they are combined to create a 3-D object, visualizing it becomes quite a challenge. This workbook is intended for Engineering, Architecture, and Art students and faculty who want to increase their 3-D visualization skills. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Blocks and Chains:Introduction to Bitcoin, Cryptocurrencies, and Their Consensus Mechanisms

    Copyright Year: 2017

    Morgan and Claypool eBooks

    The new field of cryptographic currencies and consensus ledgers, commonly referred to as blockchains, is receiving increasing interest from various different communities. These communities are very diverse and include, among others: technical enthusiasts, activist groups, researchers from various disciplines, start-ups, large enterprises, public authorities, banks, financial regulators, businessmen, investors, and also criminals. The scientific community adapted relatively slowly to this emerging and fast-moving field of cryptographic currencies and consensus ledgers. This was one reason that, for quite a while, the only resources available were the Bitcoin source code, blog and forum posts, mailing lists, and other online publications. The original Bitcoin paper which initiated the hype was itself published online without any prior peer review. Following the original publication spirit of the Bitcoin paper, a lot of innovation in this field has repeatedly come from the community itself in the form of online publications and online conversations instead of established peer-reviewed scientific publishing. On the one side, this spirit of fast free software development, combined with the business aspects of cryptographic currencies, as well as the interests of today's time-to-market focused industry, produced a flood of publications, whitepapers, and prototypes. On the other side, this has led to deficits in systematization and a gap between practice and the theoretical understanding of this new field. This book aims to further close this gap and presents a well-structured overview of this broad field from a technical viewpoint. The archetype for modern cryptographic currencies and consensus ledgers is Bitcoin and its underlying Nakamoto consensus. Therefore we describe the inner workings of this protocol in great detail and discuss its relations to other derived systems. View full abstract»
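
    At the heart of Nakamoto consensus is hash-based proof-of-work: searching for a nonce that drives a cryptographic hash below a difficulty target. The Python sketch below conveys the idea only; real Bitcoin mining applies double SHA-256 to an 80-byte block header and compares against a compact-encoded target, not a string of leading zero hex digits.

    ```python
    import hashlib

    def mine(block_data: str, difficulty: int = 4) -> int:
        """Search for a nonce whose SHA-256 digest starts with `difficulty`
        zero hex digits. A toy stand-in for Bitcoin's header hashing."""
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce
            nonce += 1

    nonce = mine("prev_hash|merkle_root|timestamp")
    print(nonce)  # expensive to find, but anyone can verify with one hash
    ```

    The asymmetry in the last comment is what makes the scheme work as a consensus mechanism: work is costly to produce and nearly free to check.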

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Protection from Insider Threats

    Copyright Year: 2012

    Morgan and Claypool eBooks

    As data represent a key asset for today's organizations, the problem of how to protect this data from theft and misuse is at the forefront of these organizations' minds. Even though today several data security techniques are available to protect data and computing infrastructures, many such techniques -- such as firewalls and network security tools -- are unable to protect data from attacks posed by those working on an organization's "inside." These "insiders" usually have authorized access to relevant information systems, making it extremely challenging to block the misuse of information while still allowing them to do their jobs. This book discusses several techniques that can provide effective protection against attacks posed by people working on the inside of an organization. Chapter One introduces the notion of insider threat and reports some data about data breaches due to insider threats. Chapter Two covers authentication and access control techniques, and Chapter Three shows how these general security techniques can be extended and used in the context of protection from insider threats. Chapter Four addresses anomaly detection techniques that are used to determine anomalies in data accesses by insiders. These anomalies are often indicative of potential insider data attacks and therefore play an important role in protection from these attacks. Security information and event management (SIEM) tools and fine-grained auditing are discussed in Chapter Five. These tools aim at collecting, analyzing, and correlating -- in real-time -- any information and event that may be relevant for the security of an organization. As such, they can be a key element in finding a solution to such undesirable insider threats. Chapter Six goes on to provide a survey of techniques for separation-of-duty (SoD). SoD is an important principle that, when implemented in systems and tools, can strengthen data protection from malicious insiders. However, to date, very few approaches have been proposed for implementing SoD in systems. In Chapter Seven, a short survey of a commercial product is presented, which provides different techniques for protection from malicious users with system privileges -- such as a DBA in database management systems. Finally, in Chapter Eight, the book concludes with a few remarks and additional research directions. Table of Contents: Introduction / Authentication / Access Control / Anomaly Detection / Security Information and Event Management and Auditing / Separation of Duty / Case Study: Oracle Database Vault / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Datalog and Logic Databases

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The use of logic in databases started in the late 1960s. In the early 1970s Codd formalized databases in terms of the relational calculus and the relational algebra. A major influence on the use of logic in databases was the development of the field of logic programming. Logic provides a convenient formalism for studying classical database problems and has the important property of being declarative, that is, it allows one to express what she wants rather than how to get it. For a long time, relational calculus and algebra were considered the relational database languages. However, there are simple operations, such as computing the transitive closure of a graph, which cannot be expressed with these languages. Datalog is a declarative query language for relational databases based on the logic programming paradigm. One of the peculiarities that distinguishes Datalog from query languages like relational algebra and calculus is recursion, which gives Datalog the capability to express queries like computing a graph transitive closure. Recent years have witnessed a revival of interest in Datalog in a variety of emerging application domains such as data integration, information extraction, networking, program analysis, security, cloud computing, ontology reasoning, and many others. The aim of this book is to present the basics of Datalog, some of its extensions, and recent applications to different domains. View full abstract»
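
    The transitive-closure example reads naturally in Datalog as two rules, path(X,Y) :- edge(X,Y) and path(X,Y) :- edge(X,Z), path(Z,Y), and a naive bottom-up evaluation simply applies the rules until no new facts appear. The Python sketch below mimics that fixpoint computation on a toy edge relation (our illustration, not code from the book).

    ```python
    # Datalog program for transitive closure:
    #   path(X, Y) :- edge(X, Y).
    #   path(X, Y) :- edge(X, Z), path(Z, Y).
    # Naive bottom-up evaluation: apply the rules until a fixpoint is reached.

    edges = {("a", "b"), ("b", "c"), ("c", "d")}

    def transitive_closure(edge_facts):
        path = set(edge_facts)  # first rule seeds path with every edge
        while True:
            derived = {(x, y)
                       for (x, z) in edge_facts
                       for (w, y) in path if w == z} - path
            if not derived:     # no new facts: fixpoint reached
                return path
            path |= derived

    print(sorted(transitive_closure(edges)))  # includes ("a", "d"), etc.
    ```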

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Representations, Transformations, and Statistics for Visual Reasoning

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Analytical reasoning techniques are methods by which users explore their data to obtain insight and knowledge that can directly support situational awareness and decision making. Recently, the analytical reasoning process has been augmented through the use of interactive visual representations and tools which utilize cognitive, design, and perceptual principles. These tools are commonly referred to as visual analytics tools, and the underlying methods and principles have roots in a variety of disciplines. This book provides young researchers with an overview of common visual representations and statistical analysis methods utilized in a variety of visual analytics systems. The application and design of visualization and analytical algorithms are subject to design decisions, parameter choices, and many conflicting requirements. As such, this book attempts to provide an initial set of guidelines for the creation of the visual representation, including pitfalls and areas where the graphics can be enhanced through interactive exploration. Basic analytical methods are explored as a means of enhancing the visual analysis process, moving from visual analysis to visual analytics. Table of Contents: Data Types / Color Schemes / Data Preconditioning / Visual Representations and Analysis / Summary View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Estimating the Query Difficulty for Information Retrieval

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many information retrieval (IR) systems suffer from a radical variance in performance when responding to users' queries. Even for systems that succeed very well on average, the quality of results returned for some of the queries is poor. Thus, it is desirable that IR systems will be able to identify "difficult" queries so they can be handled properly. Understanding why some queries are inherently more difficult than others is essential for IR, and a good answer to this important question will help search engines to reduce the variance in performance, hence better serving their customers' needs. Estimating the query difficulty is an attempt to quantify the quality of search results retrieved for a query from a given collection of documents. This book discusses the reasons that cause search engines to fail for some of the queries, and then reviews recent approaches for estimating query difficulty in the IR field. It then describes a common methodology for evaluating the prediction quality of those estimators, and experiments with some of the predictors applied by various IR methods over several TREC benchmarks. Finally, it discusses potential applications that can utilize query difficulty estimators by handling each query individually and selectively, based upon its estimated difficulty. Table of Contents: Introduction - The Robustness Problem of Information Retrieval / Basic Concepts / Query Performance Prediction Methods / Pre-Retrieval Prediction Methods / Post-Retrieval Prediction Methods / Combining Predictors / A General Model for Query Difficulty / Applications of Query Difficulty Estimation / Summary and Conclusions View full abstract»
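
    Pre-retrieval predictors of the kind reviewed in such work estimate difficulty from the query and collection statistics alone, before any documents are retrieved. One of the simplest such signals is the average inverse document frequency of the query terms; the Python sketch below is our own illustration with invented collection statistics, not a predictor prescribed by the book.

    ```python
    import math

    # Invented collection statistics: collection size and per-term
    # document frequency.
    N = 1_000_000
    df = {"jaguar": 1_200, "car": 90_000, "the": 950_000}

    def avg_idf(query_terms, default_df=1):
        """Average inverse document frequency of the query terms; higher
        values suggest a more discriminative (often easier) query."""
        return sum(math.log(N / df.get(t, default_df))
                   for t in query_terms) / len(query_terms)

    print(round(avg_idf(["jaguar", "car"]), 2))  # specific terms -> higher
    print(round(avg_idf(["the", "car"]), 2))     # common terms  -> lower
    ```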

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Exploratory Causal Analysis with Time Series Data

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments. Data analysis techniques are required for identifying causal information and relationships directly from such observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data sets. Exploratory causal analysis (ECA) provides a framework for exploring potential causal structures in time series data sets and is characterized by a myopic goal to determine which data series from a given set of series might be seen as the primary driver. In this work, ECA is used on several synthetic and empirical data sets, and it is found that all of the tested time series causality tools agree with each other (and intuitive notions of causality) for many simple systems but can provide conflicting causal inferences for more complicated systems. It is proposed that such disagreements between different time series causality tools during ECA might provide deeper insight into the data than could be found otherwise. View full abstract»
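
    Granger causality, one of the tools named above, asks whether the past of one series improves prediction of another beyond what the target's own past provides. The NumPy sketch below implements a bare-bones variance-ratio version (no significance testing), a simplification of the statistics used in practice; the lag order and the synthetic data are assumptions of the example.

    ```python
    import numpy as np

    def ar_residual_var(Y, X):
        """Least-squares fit of Y on X; return mean squared residual."""
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.mean((Y - X @ beta) ** 2)

    def granger_score(x, y, lag=2):
        """Log variance ratio: does adding lagged x improve prediction of y?"""
        n = len(y)
        Y = y[lag:]
        own = np.column_stack([y[lag - k:n - k] for k in range(1, lag + 1)])
        both = np.column_stack([own] + [x[lag - k:n - k][:, None]
                                        for k in range(1, lag + 1)])
        return np.log(ar_residual_var(Y, own) / ar_residual_var(Y, both))

    rng = np.random.default_rng(0)
    x = rng.standard_normal(500)
    y = np.roll(x, 1) + 0.5 * rng.standard_normal(500)  # y driven by lagged x
    y[0] = 0.0                                          # kill roll wraparound
    print(granger_score(x, y))  # clearly positive: x "Granger-causes" y
    print(granger_score(y, x))  # near zero in the other direction
    ```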

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Frequency Domain Hybrid Finite Element Methods in Electromagnetics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book provides a brief overview of the popular Finite Element Method (FEM) and its hybrid versions for electromagnetics, with applications to radar scattering, antennas and arrays, guided structures, microwave components, frequency selective surfaces, periodic media, and RF materials characterizations and related topics. It starts by presenting concepts based on Hilbert and Sobolev spaces as well as Curl and Divergence spaces for generating matrices, useful in all engineering simulation methods. It then proceeds to present applications of the finite element and finite element-boundary integral methods for scattering and radiation. Applications to periodic media, metamaterials, and bandgap structures are also included. The hybrid volume integral equation method for high-contrast dielectrics is presented for the first time. Another unique feature of the book is the inclusion of design optimization techniques and their integration within commercial numerical analysis packages for shape and material design. To aid the reader with the method's utility, an entire chapter is devoted to two-dimensional problems. The book can be considered as an update on the latest developments since the publication of our earlier book (Finite Element Method for Electromagnetics, IEEE Press, 1998). The latter is certainly a complementary companion to this one. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Adaptive Mesh Refinement in Time-Domain Numerical Electromagnetics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This monograph is a comprehensive presentation of state-of-the-art methodologies that can dramatically enhance the efficiency of the finite-difference time-domain (FDTD) technique, the most popular electromagnetic field solver of the time-domain form of Maxwell's equations. These methodologies are aimed at optimally tailoring the computational resources needed for the wideband simulation of microwave and optical structures to their geometry, as well as the nature of the field solutions they support. That is achieved by the development of robust “adaptive meshing” approaches, which amount to varying the total number of unknown field quantities in the course of the simulation to adapt to temporally or spatially localized field features. While mesh adaptation is an extremely desirable FDTD feature, known to reduce simulation times by orders of magnitude, it is not always robust. The specific techniques presented in this book are characterized by stability and robustness. Therefore, they are excellent computer analysis and design (CAD) tools. The book starts by introducing the FDTD technique, along with challenges related to its application to the analysis of real-life microwave and optical structures. It then proceeds to developing an adaptive mesh refinement method based on the use of multiresolution analysis and, more specifically, the Haar wavelet basis. Furthermore, a new method to embed a moving adaptive mesh in FDTD, the dynamically adaptive mesh refinement (AMR) FDTD technique, is introduced and explained in detail. To highlight the properties of the theoretical tools developed in the text, a number of applications are presented, including: microwave integrated circuits (microstrip filters, couplers, spiral inductors, cavities); optical power splitters, Y-junctions, and couplers; optical ring resonators; and nonlinear optical waveguides. Building on first principles of time-domain electromagnetic simulations, this book presents advanced concepts and cutting-edge modeling techniques in an intuitive way for programmers, engineers, and graduate students. It is designed to provide a solid reference for highly efficient time-domain solvers, employed in a wide range of exciting applications in microwave/millimeter-wave and optical engineering. View full abstract»
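
    Underlying all of this is the FDTD leapfrog update of Maxwell's curl equations on a staggered (Yee) grid. The following 1-D vacuum sketch in Python, with normalized units and no mesh adaptation, shows the core update loop that adaptive refinement accelerates; the grid size, Courant number, and Gaussian source are assumptions of the example.

    ```python
    import numpy as np

    nx, nt = 200, 500
    ez = np.zeros(nx)        # electric field samples
    hy = np.zeros(nx - 1)    # magnetic field on the staggered half-grid
    c = 0.5                  # Courant number (c0*dt/dx); <= 1 for stability

    for n in range(nt):
        hy += c * (ez[1:] - ez[:-1])                  # H update from curl of E
        ez[1:-1] += c * (hy[1:] - hy[:-1])            # E update from curl of H
        ez[nx // 2] += np.exp(-((n - 30) / 10) ** 2)  # soft Gaussian source

    print(float(np.max(np.abs(ez))))  # pulse amplitude after propagation
    ```

    Adaptive meshing, in essence, refines the spatial step locally wherever the field exhibits fine features, instead of paying for the finest resolution everywhere as this uniform-grid loop does.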

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Natural Language Processing for the Semantic Web

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book introduces core natural language processing (NLP) technologies to non-experts in an easily accessible way, as a series of building blocks that lead the user to understand key technologies, why they are required, and how to integrate them into Semantic Web applications. Natural language processing and Semantic Web technologies have different, but complementary roles in data management. Combining these two technologies enables structured and unstructured data to merge seamlessly. Semantic Web technologies aim to convert unstructured data to meaningful representations, which benefit enormously from the use of NLP technologies, thereby enabling applications such as connecting text to Linked Open Data, connecting texts to each other, semantic searching, information visualization, and modeling of user behavior in online networks. The first half of this book describes the basic NLP processing tools: tokenization, part-of-speech tagging, and morphological analysis, in addition to the main tools required for an information extraction system (named entity recognition and relation extraction) which build on these components. The second half of the book explains how Semantic Web and NLP technologies can enhance each other, for example via semantic annotation, ontology linking, and population. These chapters also discuss sentiment analysis, a key component in making sense of textual data, and the difficulties of performing NLP on social media, as well as some proposed solutions. The book finishes by investigating some applications of these tools, focusing on semantic search and visualization, modeling user behavior, and an outlook on the future. View full abstract»
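
    The building blocks named above (tokenization, part-of-speech tagging, named entity recognition) can be demonstrated in a few lines with an off-the-shelf pipeline. The sketch below uses spaCy, our tool choice for illustration rather than one prescribed by the book, and assumes the en_core_web_sm model has been downloaded.

    ```python
    import spacy

    # Assumes: pip install spacy && python -m spacy download en_core_web_sm
    nlp = spacy.load("en_core_web_sm")

    doc = nlp("Tim Berners-Lee proposed the Semantic Web while working at CERN.")

    for token in doc:              # tokenization, POS tagging, lemmatization
        print(token.text, token.pos_, token.lemma_)

    for ent in doc.ents:           # named entity recognition
        print(ent.text, ent.label_)  # e.g. "Tim Berners-Lee" PERSON
    ```

    The recognized entities are exactly the spans one would then link to Linked Open Data resources in a Semantic Web application.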

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Lifelong Machine Learning

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Lifelong Machine Learning (or Lifelong Learning) is an advanced machine learning paradigm that learns continuously, accumulates the knowledge learned in previous tasks, and uses it to help future learning. In the process, the learner becomes more and more knowledgeable and effective at learning. This learning ability is one of the hallmarks of human intelligence. However, the current dominant machine learning paradigm learns in isolation: given a training dataset, it runs a machine learning algorithm on the dataset to produce a model. It makes no attempt to retain the learned knowledge and use it in future learning. Although this isolated learning paradigm has been very successful, it requires a large number of training examples, and is only suitable for well-defined and narrow tasks. In comparison, we humans can learn effectively with a few examples because we have accumulated so much knowledge in the past, which enables us to learn with little data or effort. Lifelong learning aims to achieve this capability. As statistical machine learning matures, it is time to make a major effort to break the isolated learning tradition and to study lifelong learning to bring machine learning to new heights. Applications such as intelligent assistants, chatbots, and physical robots that interact with humans and systems in real-life environments are also calling for such lifelong learning capabilities. Without the ability to accumulate the learned knowledge and use it to learn more knowledge incrementally, a system will probably never be truly intelligent. This book serves as an introductory text and survey to lifelong learning. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Model-Driven Software Engineering in Practice:Second Edition

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This book discusses how model-based approaches can improve the daily practice of software professionals. This is known as Model-Driven Software Engineering (MDSE) or, simply, Model-Driven Engineering (MDE). MDSE practices have proved to increase efficiency and effectiveness in software development, as demonstrated by various quantitative and qualitative studies. MDSE adoption in the software industry is foreseen to grow exponentially in the near future, e.g., due to the convergence of software development and business analysis. The aim of this book is to provide you with an agile and flexible tool to introduce you to the MDSE world, thus allowing you to quickly understand its basic principles and techniques and to choose the right set of MDSE instruments for your needs so that you can start to benefit from MDSE right away. The book is organized into two main parts. The first part discusses the foundations of MDSE in terms of basic concepts (i.e., models and transformations), driving principles, application scenarios, and current standards, like the well-known MDA initiative proposed by OMG (Object Management Group), as well as the practices on how to integrate MDSE in existing development processes. The second part deals with the technical aspects of MDSE, spanning from the basics on when and how to build a domain-specific modeling language, to the description of Model-to-Text and Model-to-Model transformations, and the tools that support the management of MDSE projects. The second edition of the book features: a set of completely new topics, including a full example of the creation of a new modeling language (IFML) and a discussion of modeling issues and approaches in specific domains, like business process modeling, user interaction modeling, and enterprise architecture; a complete revision of examples, figures, and text, for improving readability, understandability, and coherence; better formulation of definitions and dependencies between concepts and ideas; and the addition of a complete index of book content. In addition to the contents of the book, more resources are provided on the book's website http://www.mdse-book.com, including the examples presented in the book. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Remote Sensing Image Processing

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Earth observation is the field of science concerned with the problem of monitoring and modeling the processes on the Earth's surface and their interaction with the atmosphere. The Earth is continuously monitored with advanced optical and radar sensors. The images are analyzed and processed to deliver useful products to individual users, agencies, and public administrations. To deal with these problems, remote sensing image processing is nowadays a mature research area, and the techniques developed in the field allow many real-life applications with great societal value. For instance, urban monitoring, fire detection, or flood prediction can have a great impact on economical and environmental issues. To attain such objectives, the remote sensing community has turned into a multidisciplinary field of science that embraces physics, signal theory, computer science, electronics, and communications. From a machine learning and signal/image processing point of view, all the applications are tackled under specific formalisms, such as classification and clustering, regression and function approximation, data coding, restoration and enhancement, source unmixing, data fusion, or feature selection and extraction. This book covers some of these fields in a comprehensive way. Table of Contents: Remote Sensing from Earth Observation Satellites / The Statistics of Remote Sensing Images / Remote Sensing Feature Selection and Extraction / Classification / Spectral Mixture Analysis / Estimation of Physical Parameters View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Introduction to Models of Online Peer-to-Peer Social Networking

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book concerns peer-to-peer applications and mechanisms operating on the Internet, particularly those that are not fully automated and involve significant human interaction. So, the realm of interest is the intersection of distributed systems and online social networking. Generally, simple models are described to clarify the ideas. Beginning with short overviews of caching, graph theory and game theory, we cover the basic ideas of structured and unstructured search. We then describe a simple framework for reputations and for iterated referrals and consensus. This framework is applied to a problem of sybil identity management. The fundamental result for iterated Byzantine consensus for a relatively important issue is also given. Finally, a straightforward epidemic model is used to describe the propagation of malware online and for BitTorrent-style file-sharing. This short book can be used as a preliminary orientation to this subject matter. References are given for the interested student to papers with good survey and tutorial content and to those with more advanced treatments of specific topics. For an instructor, this book is suitable for a one-semester seminar course. Alternatively, it could be the framework for a semester's worth of lectures where the instructor would supplement each chapter with additional lectures on related or more advanced subject matter. A basic background is required in the areas of computer networking, probability theory, stochastic processes, and queueing. Table of Contents: Networking overview / Graphs / Games / Search in structured networks / Search in unstructured networks / Transactions, reputations, and referrals / False Referrals / Peer-to-peer file sharing / Consensus in dynamical belief systems / Byzantine consensus / Epidemics View full abstract»
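
    The epidemic model of the final chapter can be illustrated with the classic SIR equations, integrated here with simple Euler steps; the rates and initial fractions below are invented for the example, and the same dynamics serve as a first model of malware propagation and file-sharing spread.

    ```python
    # Susceptible-Infected-Recovered (SIR) dynamics via Euler integration.
    beta, gamma = 0.3, 0.1      # infection and recovery rates (invented)
    s, i, r = 0.99, 0.01, 0.0   # population fractions
    dt = 0.1

    for _ in range(2000):
        new_inf = beta * s * i * dt   # mass-action infection term
        new_rec = gamma * i * dt      # recoveries
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec

    print(round(s, 3), round(i, 3), round(r, 3))  # epidemic has burned out
    ```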

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Captains of Energy:Systems Dynamics from an Energy Perspective

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In teaching an introduction to transport or systems dynamics modeling at the undergraduate level, it is possible to lose pedagogical traction in a sea of abstract mathematics. What the mathematical modeling of time-dependent system behavior offers is a venue in which students can be taught that physical analogies exist between what they likely perceive as distinct areas of study in the physical sciences. We introduce a storyline whose characters are superheroes that store and dissipate energy in dynamic systems. Introducing students to the overarching conservation laws helps develop the analogy that ties the different disciplines together under a common umbrella of system energy. In this book, we use the superhero cast to present the effort-flow analogy and its relationship to the conservation principles of mass, momentum, energy, and electrical charge. We use a superhero movie script common to mechanical, electrical, fluid, and thermal engineering systems to illustrate how to apply the analogy to arrive at governing differential equations describing the systems' behavior in time. Ultimately, we show how only two types of differential equation, and therefore only two types of system response, are possible. This novel approach of storytelling and a movie script is used to help make the mathematics of lumped system modeling more approachable for students. View full abstract»
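
    As an example of one of the two response types, consider the first-order effort-flow system tau*dy/dt + y = u: the same equation describes an RC circuit charging, a thermal mass warming, or a fluid reservoir filling. The Python sketch below integrates its step response with Euler steps; the time constant and input value are arbitrary choices for the example.

    ```python
    import numpy as np

    # First-order effort-flow system: tau * dy/dt + y = u (step response).
    tau, u = 2.0, 1.0     # time constant and step input (arbitrary)
    dt, T = 0.01, 10.0
    y, ys = 0.0, []

    for _ in np.arange(0.0, T, dt):
        y += dt * (u - y) / tau   # Euler step of dy/dt = (u - y) / tau
        ys.append(y)

    print(round(ys[-1], 3))  # approaches u; reaches ~63% of u at t = tau
    ```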

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Speech Models

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Speech dynamics refer to the temporal characteristics in all stages of the human speech communication process. This speech “chain” starts with the formation of a linguistic message in a speaker's brain and ends with the arrival of the message in a listener's brain. Given the intricacy of the dynamic speech process and its fundamental importance in human communication, this monograph is intended to provide comprehensive material on mathematical models of speech dynamics and to address the following issues: How do we make sense of the complex speech process in terms of its functional role of speech communication? How do we quantify the special role of speech timing? How do the dynamics relate to the variability of speech that has often been said to seriously hamper automatic speech recognition? How do we put the dynamic process of speech into a quantitative form to enable detailed analyses? And finally, how can we incorporate the knowledge of speech dynamics into computerized speech analysis and recognition algorithms? The answers to all these questions require building and applying computational models for the dynamic speech process. What are the compelling reasons for carrying out dynamic speech modeling? We provide the answer in two related aspects. First, scientific inquiry into the human speech code has been relentlessly pursued for several decades. As an essential carrier of human intelligence and knowledge, speech is the most natural form of human communication. Embedded in the speech code are linguistic (as well as para-linguistic) messages, which are conveyed through four levels of the speech chain. Underlying the robust encoding and transmission of the linguistic messages are the speech dynamics at all the four levels. Mathematical modeling of speech dynamics provides an effective tool in the scientific methods of studying the speech chain. Such scientific studies help understand why humans speak as they do and how humans exploit redundancy and variability by way of multitiered dynamic processes to enhance the efficiency and effectiveness of human speech communication. Second, advancement of human language technology, especially that in automatic recognition of natural-style human speech, is also expected to benefit from comprehensive computational modeling of speech dynamics. The limitations of current speech recognition technology are serious and are well known. A commonly acknowledged and frequently discussed weakness of the statistical model underlying current speech recognition technology is the lack of adequate dynamic modeling schemes to provide correlation structure across the temporal speech observation sequence. Unfortunately, due to a variety of reasons, the majority of current research activities in this area favor only incremental modifications and improvements to the existing HMM-based state of the art. For example, while dynamic and correlation modeling is known to be an important topic, most of the systems nevertheless employ only an ultra-weak form of speech dynamics, e.g., differential or delta parameters. Strong-form dynamic speech modeling, which is the focus of this monograph, may serve as an ultimate solution to this problem. After the introduction chapter, the main body of this monograph consists of four chapters. They cover various aspects of theory, algorithms, and applications of dynamic speech models, and provide a comprehensive survey of the research work in this area spanning the past 20 years. This monograph is intended as advanced material in speech and signal processing for graduate-level teaching, for professionals and engineering practitioners, and for seasoned researchers and engineers specializing in speech processing. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Relativistic Flight Mechanics and Space Travel

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Relativistic Flight Mechanics and Space Travel is about the fascinating prospect of future human space travel. Its purpose is to demonstrate that such ventures may not be as difficult as one might believe and are certainly not impossible. The foundations for relativistic flight mechanics are provided in a clear and instructive manner by using well-established principles, which are used to explore space flight possibilities within and beyond our galaxy. The main substance of the book begins with a background review of Einstein's Special Theory of Relativity as it pertains to relativistic flight mechanics and space travel. The book explores the dynamics and kinematics of relativistic space flight from the point of view of the astronauts in the spacecraft and compares these with those observed by Earth's scientists and engineers, differences that are quite surprising. A quasi-historical treatment leads quite naturally into the central subject areas of the book, where attention is focused on various issues not ordinarily covered by such treatment. To accomplish this, numerous simple thought experiments are used to bring rather complicated subject matter down to a level easily understood by most readers with an engineering or science background. The primary subjects regarding photon rocketry and space travel are covered in some depth and include a flight plan together with numerous calculations represented in graphical form. A geometric treatment of relativistic effects by using Minkowski diagrams is included for completeness. The book concludes with brief discussions of other prospective, even exotic, transport systems for relativistic space travel. A glossary and simple end-of-chapter problems with answers enhance the learning process. View full abstract»
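
    The flavor of the book's calculations can be suggested with the simplest case: a constant-velocity cruise, where proper time aboard the ship is coordinate time divided by the Lorentz factor. The Python below works one such example; the speed and distance are invented, and the book's actual flight plans include acceleration phases that require the hyperbolic-motion formulas instead.

    ```python
    import math

    c = 299_792_458.0  # speed of light, m/s

    def lorentz_gamma(v):
        """Lorentz factor for speed v."""
        return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

    # Cruise at 0.9c to a star 10 light-years away (constant velocity only).
    v = 0.9 * c
    earth_years = 10 / 0.9                        # coordinate time, ~11.1 yr
    ship_years = earth_years / lorentz_gamma(v)   # proper time aboard
    print(round(lorentz_gamma(v), 3))             # gamma ~ 2.294
    print(round(ship_years, 2))                   # ~4.84 years for the crew
    ```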

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Binary Modification:Tools, Techniques and Applications

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Dynamic binary modification tools form a software layer between a running application and the underlying operating system, providing the powerful opportunity to inspect and potentially modify every user-level guest application instruction that executes. Toolkits built upon this technology have enabled computer architects to build powerful simulators and emulators for design-space exploration, compiler writers to analyze and debug the code generated by their compilers, software developers to fully explore the features, bottlenecks, and performance of their software, and even end-users to extend the functionality of proprietary software running on their computers. Several dynamic binary modification systems are freely available today that place this power into the hands of the end user. While these systems are quite complex internally, they mask that complexity with an easy-to-learn API that allows a typical user to ramp up fairly quickly and build any of a number of powerful tools. Meanwhile, these tools are robust enough to form the foundation for software products in use today. This book serves as a primer for researchers interested in dynamic binary modification systems, their internal design structure, and the wide range of tools that can be built leveraging these systems. The hands-on examples presented throughout form a solid foundation for designing and constructing more complex tools, with an appreciation for the techniques necessary to make those tools robust and efficient. Meanwhile, the reader will get an appreciation for the internal design of the engines themselves. Table of Contents: Dynamic Binary Modification: Overview / Using a Dynamic Binary Modifier / Program Analysis and Debugging / Active Program Modification / Architectural Exploration / Advanced System Internals / Historical Perspectives / Summary and Observations View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Survive and Thrive:A Guide for Untenured Faculty

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The experience of an untenured faculty member is highly dependent on the quality of the mentoring they receive. This mentoring may come from a number of different sources, and the concept of developing a constellation of mentors is highly recommended, but a mentoring relationship that is guided by the mentee's needs will be the most productive. Often, however, the mentee does not know their own needs, what questions to ask, and what topics they should discuss with a mentor. This book provides a guide to the mentoring process for untenured faculty. Perspectives are provided and questions posed on topics ranging from establishing scholarly expertise and developing professional networks to personal health and balancing responsibilities. The questions posed are not intended for the mentee to answer in isolation, rather a junior faculty member should approach these questions throughout their untenured years with the help of their mentors. Survive and Thrive: A Guide for Untenured Faculty will help to facilitate the mentoring process and lead junior faculty to a path where they can move beyond just surviving and truly thrive in their position. Table of Contents: Tough Questions About Why You Are Here / Joining Your Department and Discipline / Establishing Expertise / Developing Networks, Relationships, and Mentoring Activities / Getting Support and Evaluating Your Personal Health / Planning for the Future / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Intelligent Autonomous Robotics:A Robot Soccer Case Study

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Robotics technology has recently advanced to the point of being widely accessible for relatively low-budget research, as well as for graduate, undergraduate, and even secondary and primary school education. This lecture provides an example of how to productively use a cutting-edge advanced robotics platform for education and research by providing a detailed case study with the Sony AIBO robot, a vision-based legged robot. The case study used for this lecture is the UT Austin Villa RoboCup Four-Legged Team. This lecture describes both the development process and the technical details of its end result. The main contributions of this lecture are (i) a roadmap for new classes and research groups interested in intelligent autonomous robotics who are starting from scratch with a new robot, and (ii) documentation of the algorithms behind our own approach on the AIBOs with the goal of making them accessible for use on other vision-based and/or legged robot platforms. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Managing Event Information: Modeling, Retrieval, and Applications

    Copyright Year: 2011

    Morgan and Claypool eBooks

    With the proliferation of citizen reporting, smart mobile devices, and social media, an increasing number of people are beginning to generate information about events they observe and participate in. A significant fraction of this information contains multimedia data to share the experience with their audience. A systematic information modeling and management framework is necessary to capture this widely heterogeneous, schemaless, potentially humongous information produced by many different people. This book is an attempt to examine the modeling, storage, querying, and applications of such an event management system in a holistic manner. It uses a semantic-web style graph-based view of events, and shows how this event model, together with its query facility, can be used toward emerging applications like semi-automated storytelling. Table of Contents: Introduction / Event Data Models / Implementing an Event Data Model / Querying Events / Storytelling with Events / An Emerging Application / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Approximability of Optimization Problems through Adiabatic Quantum Computation

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The adiabatic quantum computation (AQC) is based on the adiabatic theorem to approximate solutions of the Schrödinger equation. The design of an AQC algorithm involves the construction of a Hamiltonian that describes the behavior of the quantum system. This Hamiltonian is expressed as a linear interpolation of an initial Hamiltonian whose ground state is easy to compute, and a final Hamiltonian whose ground state corresponds to the solution of a given combinatorial optimization problem. The adiabatic theorem asserts that if the time evolution of a quantum system described by a Hamiltonian is large enough, then the system remains close to its ground state. An AQC algorithm uses the adiabatic theorem to approximate the ground state of the final Hamiltonian that corresponds to the solution of the given optimization problem. In this book, we investigate the computational simulation of AQC algorithms applied to the MAX-SAT problem. A symbolic analysis of the AQC solution is given in order to understand the involved computational complexity of AQC algorithms. This approach can be extended to other combinatorial optimization problems and can be used for the classical simulation of an AQC algorithm where a Hamiltonian problem is constructed. This construction requires the computation of a sparse matrix of dimension 2ⁿ × 2ⁿ, by means of tensor products, where n is the dimension of the quantum system. Also, a general scheme to design AQC algorithms is proposed, based on a natural correspondence between optimization Boolean variables and quantum bits. Combinatorial graph problems are in correspondence with pseudo-Boolean maps that are reduced in polynomial time to quadratic maps. Finally, the relation among NP-hard problems is investigated, as well as its logical representability, and is applied to the design of AQC algorithms. It is shown that every monadic second-order logic (MSOL) expression has associated pseudo-Boolean maps that can be obtained by expanding the given expression, and also can be reduced to quadratic forms. Table of Contents: Preface / Acknowledgments / Introduction / Approximability of NP-hard Problems / Adiabatic Quantum Computing / Efficient Hamiltonian Construction / AQC for Pseudo-Boolean Optimization / A General Strategy to Solve NP-Hard Problems / Conclusions / Bibliography / Authors' Biographies View full abstract»
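
    The linear interpolation described above is easy to simulate classically for a toy system. The NumPy sketch below builds a two-qubit H(s) = (1 - s)H0 + sH1 with tensor products (np.kron) and tracks the instantaneous ground state by exact diagonalization; the diagonal cost vector is an invented stand-in for a real MAX-SAT Hamiltonian.

    ```python
    import numpy as np

    X = np.array([[0.0, 1.0], [1.0, 0.0]])  # Pauli-X
    I = np.eye(2)

    # Initial Hamiltonian: transverse field; ground state is the
    # uniform superposition over all basis states.
    H0 = -(np.kron(X, I) + np.kron(I, X))

    # Final Hamiltonian: diagonal cost over |00>, |01>, |10>, |11>,
    # minimized at |11> (invented, not a real MAX-SAT instance).
    H1 = np.diag([2.0, 1.0, 1.0, 0.0])

    for s in np.linspace(0.0, 1.0, 5):
        H = (1 - s) * H0 + s * H1          # linear interpolation H(s)
        evals, evecs = np.linalg.eigh(H)
        gs = evecs[:, 0]                   # instantaneous ground state
        print(f"s={s:.2f}  E0={evals[0]: .3f}  probs={np.round(gs**2, 2)}")
    ```

    Run slowly enough, the adiabatic theorem says the physical system follows this instantaneous ground state, so measuring at s = 1 returns the optimizing assignment with high probability.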

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Building a Better World with Our Information:The Future of Personal Information Management, Part 3

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Part 1 of "The Future of" series covers the fundamentals of personal information management (PIM) and then explores the seismic shift, already well underway, toward a world where our information is always at hand (thanks to our devices) and "forever" on the web. Part 2, "Transforming Technologies to Manage Our Information," provides a more focused look at technologies for managing information. The opening chapter discusses "natural interface" technologies of input/output to free us from keyboard, screen, and mouse. Successive chapters then explore technologies to save, search, and structure our information. A concluding chapter introduces the possibility that we may see dramatic reductions in the "clerical tax" we pay as we work with our information. The focus in this concluding Part 3 of the series shifts to the practical and to the near future. What can we do, now or soon, to manage our information better? And, as we do so, how might we build a better world? Part 3 is in three chapters: Chapter 10. Group Information Management and the Social Fabric in PIM. How do we preserve and promote our PIM practices as we interact with others at home, at school, at work, at play, and in wider, even global, communities? Chapter 11. PIM by Design. What principles guide us? How can developers build better tools for PIM? How can the rest of us make better use of the tools we already have? Chapter 12. To Each of Us, Our Own concludes with an exploration of the ways each of us, individually, can develop better practices for the management of our information in service of the lives we wish to live and toward a better world we all must share. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Controlling Energy Demand in Mobile Computing Systems

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This lecture provides an introduction to the problem of managing the energy demand of mobile devices. Reducing energy consumption, primarily with the goal of extending the lifetime of battery-powered devices, has emerged as a fundamental challenge in mobile computing and wireless communication. The focus of this lecture is on a systems approach in which software techniques exploit state-of-the-art architectural features rather than relying only upon advances in low-power circuitry or the slow improvements in battery technology to solve the problem. Fortunately, there are many opportunities to innovate on managing energy demand at the higher levels of a mobile system. Increasingly, device components offer low-power modes that enable software to directly affect the energy consumption of the system. The challenge is to design resource management policies that effectively use these capabilities. The lecture begins by providing the necessary foundations, including basic energy terminology and widely accepted metrics, system models of how power is consumed by a device, and measurement methods and tools available for experimental evaluation. For components that offer low-power modes, management policies are considered that address the questions of when to power down to a lower power state and when to power back up to a higher power state. These policies rely on detecting periods when the device is idle, as well as on techniques for modifying the access patterns of a workload to increase opportunities for power state transitions. For processors with frequency and voltage scaling capabilities, dynamic scheduling policies are developed that determine points during execution when those settings can be changed without harming quality-of-service constraints. The interactions and tradeoffs among the power management policies of multiple devices are discussed. We explore how effective power management on one component of a system may have either a positive or a negative impact on overall energy consumption or on the design of policies for another component. The important role that application-level involvement may play in energy management is described, with several examples of cross-layer cooperation. Application program interfaces (APIs) that provide information flow across the application-OS boundary are valuable tools in encouraging development of energy-aware applications. Finally, we summarize the key lessons of this lecture and discuss future directions in managing energy demand. View full abstract»
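
    One of the simplest policies of the kind discussed above is a fixed-timeout power-down rule: power the component down once it has been idle longer than a threshold, and pay a wake-up cost when the next request arrives. The sketch below is a hedged illustration of that tradeoff; the class, its fields, and all power and energy numbers are made-up placeholders, not measurements or code from the lecture.

    ```python
    # A minimal sketch of a fixed-timeout power-down policy evaluated over a
    # trace of idle periods; all numbers are hypothetical placeholders.
    from dataclasses import dataclass


    @dataclass
    class TimeoutPolicy:
        timeout: float        # seconds of idleness before powering down
        idle_power: float     # watts while idle but still powered up
        sleep_power: float    # watts in the low-power state
        wakeup_energy: float  # joules paid on each transition back to active

        def energy(self, idle_gaps):
            """Energy consumed over a trace of idle-gap lengths, in seconds."""
            total = 0.0
            for gap in idle_gaps:
                if gap <= self.timeout:
                    # Gap too short: the device never powers down.
                    total += gap * self.idle_power
                else:
                    # Wait out the timeout, sleep the rest, pay to wake up.
                    total += self.timeout * self.idle_power
                    total += (gap - self.timeout) * self.sleep_power
                    total += self.wakeup_energy
            return total


    # Compare two thresholds on the same hypothetical trace of idle periods.
    trace = [0.2, 5.0, 0.8, 12.0, 3.0]
    for t in (1.0, 4.0):
        policy = TimeoutPolicy(timeout=t, idle_power=0.8,
                               sleep_power=0.05, wakeup_energy=0.4)
        print(f"timeout={t:.1f}s -> {policy.energy(trace):.2f} J")
    ```

    The break-even point, where the energy saved in the sleep state equals the wake-up cost, is exactly what idle-detection policies of this kind try to estimate.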

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Arabic Natural Language Processing

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book provides system developers and researchers in natural language processing and computational linguistics with the necessary background information for working with the Arabic language. The goal is to introduce Arabic linguistic phenomena and review the state-of-the-art in Arabic processing. The book discusses Arabic script, phonology, orthography, morphology, syntax and semantics, with a final chapter on machine translation issues. The chapter sizes correspond more or less to what is linguistically distinctive about Arabic, with morphology getting the lion's share, followed by Arabic script. No previous knowledge of Arabic is needed. This book is designed for computer scientists and linguists alike. The focus of the book is on Modern Standard Arabic; however, notes on practical issues related to Arabic dialects and languages written in the Arabic script are presented in different chapters. Table of Contents: What is "Arabic"? / Arabic Script / Arabic Phonology and Orthography / Arabic Morphology / Computational Morphology Tasks / Arabic Syntax / A Note on Arabic Semantics / A Note on Arabic and Machine Translation View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bayesian Analysis in Natural Language Processing

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Natural language processing (NLP) went through a profound transformation in the mid-1980s when it shifted to make heavy use of corpora and data-driven techniques to analyze language. Since then, the use of statistical techniques in NLP has evolved in several ways. One such example of evolution took place in the late 1990s and early 2000s, when full-fledged Bayesian machinery was introduced to NLP. This Bayesian approach to NLP has come to address various shortcomings of the frequentist approach and to enrich it, especially in the unsupervised setting, where statistical learning is done without target prediction examples. We cover the methods and algorithms that are needed to fluently read Bayesian learning papers in NLP and to do research in the area. These methods and algorithms are partially borrowed from both machine learning and statistics and are partially developed "in-house" in NLP. We cover inference techniques such as Markov chain Monte Carlo sampling and variational inference, Bayesian estimation, and nonparametric modeling. We also cover fundamental concepts in Bayesian statistics such as prior distributions, conjugacy, and generative modeling. Finally, we cover some of the fundamental modeling techniques in NLP, such as grammar modeling, and their use with Bayesian analysis. View full abstract»
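
    Conjugacy, one of the fundamental concepts listed above, can be shown in a few lines: a symmetric Dirichlet prior over a unigram distribution is conjugate to the multinomial likelihood, so the posterior predictive probability of a word is simply its observed count plus a prior pseudo-count, normalized. The toy corpus and hyperparameter below are illustrative choices, not examples from the book.

    ```python
    # A worked example of Dirichlet-multinomial conjugacy for a unigram
    # language model; corpus and hyperparameter are illustrative.
    from collections import Counter

    alpha = 0.5  # symmetric Dirichlet hyperparameter (pseudo-count per type)
    vocab = ["the", "cat", "sat", "mat"]
    corpus = ["the", "cat", "sat", "the", "the", "mat"]

    counts = Counter(corpus)
    total = len(corpus)

    # Posterior predictive under the Dirichlet-multinomial:
    # p(w | data) = (count(w) + alpha) / (N + alpha * |V|)
    for w in vocab:
        p = (counts[w] + alpha) / (total + alpha * len(vocab))
        print(f"p({w!r} | data) = {p:.3f}")
    ```

    Because the prior and posterior share the same family, updates like this stay in closed form, which is what makes samplers and variational methods for richer models tractable.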

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Extreme Value Theory-Based Methods for Visual Recognition

    Copyright Year: 2017

    Morgan and Claypool eBooks

    A common feature of many approaches to modeling sensory statistics is an emphasis on capturing the "average." From early representations in the brain, to highly abstracted class categories in machine learning for classification tasks, central-tendency models based on the Gaussian distribution are a seemingly natural and obvious choice for modeling sensory data. However, insights from neuroscience, psychology, and computer vision suggest an alternate strategy: preferentially focusing representational resources on the extremes of the distribution of sensory inputs. The notion of treating extrema near a decision boundary as features is not necessarily new, but a comprehensive statistical theory of recognition based on extrema is only now emerging in the computer vision literature. This book begins by introducing the statistical Extreme Value Theory (EVT) for visual recognition. In contrast to central-tendency modeling, it is hypothesized that distributions near decision boundaries form a more powerful model for recognition tasks by focusing coding resources on data that are arguably the most diagnostic features. EVT has several important properties: strong statistical grounding, better modeling accuracy near decision boundaries than Gaussian modeling, the ability to model asymmetric decision boundaries, and accurate prediction of the probability of an event beyond our experience. The second part of the book uses the theory to describe a new class of machine learning algorithms for decision making that are a measurable advance beyond