
Morgan & Claypool Synthesis Digital Library

758 Results Returned

  • Modeling and Data Mining in Blogosphere

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book offers a comprehensive overview of the various concepts and research issues about blogs or weblogs. It introduces techniques and approaches, tools and applications, and evaluation methodologies with examples and case studies. Blogs allow people to express their thoughts, voice their opinions, and share their experiences and ideas. Blogs also facilitate interactions among individuals, creating a network with unique characteristics. Through these interactions individuals experience a sense of community. We elaborate on approaches that extract communities and cluster blogs based on information about the bloggers. Open standards and the low barrier to publication in the Blogosphere have transformed information consumers into producers, generating an overwhelming amount of ever-increasing knowledge about the members, their environment and symbiosis. We elaborate on approaches that sift through humongous blog data sources to identify influential and trustworthy bloggers, leveraging content and network information. Spam blogs or "splogs" are an increasing concern in the Blogosphere and are discussed in detail with approaches leveraging supervised machine learning algorithms and interaction patterns. We elaborate on data collection procedures, provide resources for blog data repositories, mention various visualization and analysis tools in the Blogosphere, and explain conventional and novel evaluation methodologies, to help perform research in the Blogosphere. The book is supported by additional material, including lecture slides as well as the complete set of figures used in the book, and the reader is encouraged to visit the book website for the latest information. Table of Contents: Modeling Blogosphere / Blog Clustering and Community Discovery / Influence and Trust / Spam Filtering in Blogosphere / Data Collection and Evaluation
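
    As a concrete illustration of the supervised splog-filtering idea mentioned above, here is a minimal sketch (not the book's own pipeline) that trains a text classifier on a toy labeled corpus; it assumes scikit-learn is installed, and the posts, labels, and features are invented for illustration.

      # Illustrative splog (spam blog) classifier: TF-IDF features plus
      # logistic regression on toy data; real systems would add interaction
      # and network features as the book describes.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      posts = ["cheap pills buy now click here",
               "my trip to the mountains last weekend",
               "win money fast guaranteed links",
               "thoughts on the new city library"]
      labels = [1, 0, 1, 0]  # 1 = splog, 0 = legitimate blog (toy labels)

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression())
      model.fit(posts, labels)
      print(model.predict(["guaranteed cheap links click now"]))  # likely [1]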

  • Model-Driven Software Engineering in Practice: Second Edition

    Copyright Year: 2017

    Morgan and Claypool eBooks

    This book discusses how model-based approaches can improve the daily practice of software professionals. This is known as Model-Driven Software Engineering (MDSE) or, simply, Model-Driven Engineering (MDE).

    MDSE practices have proved to increase efficiency and effectiveness in software development, as demonstrated by various quantitative and qualitative studies. MDSE adoption in the software industry is foreseen to grow exponentially in the near future, e.g., due to the convergence of software development and business analysis.

    The aim of this book is to provide you with an agile and flexible tool to introduce you to the MDSE world, thus allowing you to quickly understand its basic principles and techniques and to choose the right set of MDSE instruments for your needs so that you can start to benefit from MDSE right away.

    The book is organized into two main parts.

    • The first part discusses the foundations of MDSE in terms of basic concepts (i.e., models and transformations), driving principles, application scenarios, and current standards, like the well-known MDA initiative proposed by OMG (Object Management Group), as well as the practices on how to integrate MDSE in existing development processes.
    • The second part deals with the technical aspects of MDSE, spanning from the basics on when and how to build a domain-specific modeling language, to the description of Model-to-Text and Model-to-Model transformations, and the tools that support the management of MDSE projects.

    The second edition of the book features:

    • a set of completely new topics, including: a full example of the creation of a new modeling language (IFML), and discussion of modeling issues and approaches in specific domains, like business process modeling, user interaction modeling, and enterprise architecture
    • complete revision of examples, figures, and text, for improving readability, understandability, and coherence
    • better formulation of definitions and dependencies between concepts and ideas
    • addition of a complete index of book content

    In addition to the contents of the book, more resources are provided on the book's website http://www.mdse-book.com, including the examples presented in the book.
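
    To make the Model-to-Text idea concrete, here is a toy transformation sketch; the dictionary stands in for a metamodel-conformant model and the generated class shape is invented, so this is only a schematic of the technique, not any MDSE tool's actual API.

      # Toy Model-to-Text transformation: emit a Java-like class from a model.
      # A real MDSE toolchain would read a metamodel-conformant model rather
      # than a plain dictionary.
      model = {"name": "Book", "attributes": [("title", "String"), ("year", "int")]}

      def model_to_text(m):
          lines = [f"public class {m['name']} {{"]
          for attr, typ in m["attributes"]:
              lines.append(f"    private {typ} {attr};")
          lines.append("}")
          return "\n".join(lines)

      print(model_to_text(model))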

  • Interacting with Information

    Copyright Year: 2011

    Morgan and Claypool eBooks

    We live in an "information age," but information is only useful when it is interpreted by people and applied in the context of their goals and activities. The volume of information to which people have access is growing at an incredible rate, vastly outstripping people's ability to assimilate and manage it. In order to design technologies that better support information work, it is necessary to better understand the details of that work. In this lecture, we review the situations (physical, social and temporal) in which people interact with information. We also discuss how people interact with information in terms of an "information journey," in which people, iteratively, do the following: recognise a need for information, find information, interpret and evaluate that information in the context of their goals, and use the interpretation to support their broader activities. People's information needs may be explicit and clearly articulated but, conversely, may be tacit, exploratory and evolving. Widely used tools supporting information access, such as searching on the Web and in digital libraries, support clearly defined information requirements well, but they provide limited support for other information needs. Most other stages of the information journey are poorly supported at present. Novel design solutions are unlikely to be purely digital, but to exploit the rich variety of information resources, digital, physical and social, that are available. Theories of information interaction and sensemaking can highlight new design possibilities that augment human capabilities. We review relevant theories and findings for understanding information behaviours, and we review methods for evaluating information working tools, to both assess existing tools and identify requirements for the future. Table of Contents: Introduction: Pervasive Information Interactions / Background: Information Interaction at the Crossroads of Research Traditions / The Situations: Physical, Social and Temporal / The Behaviors: Understanding the "Information Journey" / The Technologies: Supporting the Information Journey / Studying User Behaviors and Needs for Information Interaction / Looking to the Future / Further Reading

  • Engineering and Society: Working Towards Social Justice, Part III: Windows on Society

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Engineers work in an increasingly complex entanglement of ideas, people, cultures, technology, systems and environments. Today, decisions made by engineers often have serious implications not only for their clients but for society as a whole and the natural world. Such decisions may potentially influence cultures and ways of living, as well as alter ecosystems which are in delicate balance. In order to make appropriate decisions and to co-create ideas and innovations within and among the complex networks of communities which currently exist and are shaped by our decisions, we need to regain our place as professionals, to realise the significance of our work and to take responsibility in a much deeper sense. Engineers must develop the 'ability to respond' to emerging needs of all people, across all cultures. To do this requires insights and knowledge which are at present largely within the domain of the social and political sciences but which need to be shared with our students in ways which are meaningful and relevant to engineering. This book attempts to do just that. In Part 1 Baillie introduces ideas associated with the ways in which engineers relate to the communities in which they work. Drawing on scholarship from science and technology studies, globalisation and development studies, as well as work in science communication and dialogue, this introductory text sets the scene for an engineering community which engages with the public. In Part 2 Catalano frames the thinking processes necessary to create ethical and just decisions in engineering, to understand the implications of our current decision-making processes and think about ways in which we might adapt these to become more socially just in the future. In Part 3 Baillie and Catalano have provided case studies of everyday issues such as water, garbage and alarm clocks, to help us consider how we might see through the lenses of our new knowledge from Parts 1 and 2 and apply this to our everyday existence as engineers.

  • Stochastic Network Optimization with Application to Communication and Queueing Systems

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This text presents a modern theory of analysis, control, and optimization for dynamic networks. Mathematical techniques of Lyapunov drift and Lyapunov optimization are developed and shown to enable constrained optimization of time averages in general stochastic systems. The focus is on communication and queueing systems, including wireless networks with time-varying channels, mobility, and randomly arriving traffic. A simple drift-plus-penalty framework is used to optimize time averages such as throughput, throughput-utility, power, and distortion. Explicit performance-delay tradeoffs are provided to illustrate the cost of approaching optimality. This theory is also applicable to problems in operations research and economics, where energy-efficient and profit-maximizing decisions must be made without knowing the future. Topics in the text include the following:

    • Queue stability theory
    • Backpressure, max-weight, and virtual queue methods
    • Primal-dual methods for non-convex stochastic utility maximization
    • Universal scheduling theory for arbitrary sample paths
    • Approximate and randomized scheduling theory
    • Optimization of renewal systems and Markov decision systems

    Detailed examples and numerous problem set questions are provided to reinforce the main concepts. Table of Contents: Introduction / Introduction to Queues / Dynamic Scheduling Example / Optimizing Time Averages / Optimizing Functions of Time Averages / Approximate Scheduling / Optimization of Renewal Systems / Conclusions
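
    A minimal numerical sketch of the drift-plus-penalty rule for a single queue, trading average power against queue backlog; the arrival rate, power levels, rate-power curve, and weight V are all invented for illustration and are not examples from the text.

      # Drift-plus-penalty for one queue: each slot, pick power p minimizing
      # V*p - Q[t]*mu(p), trading the penalty (power) against queue drift.
      import math, random

      random.seed(0)
      powers = [0.0, 0.5, 1.0, 2.0]             # feasible power levels (assumed)
      mu = lambda p: math.log(1.0 + p)          # concave rate-power curve (assumed)
      V, Q, used_power = 50.0, 0.0, 0.0

      T = 10000
      for t in range(T):
          a = 1.0 if random.random() < 0.4 else 0.0         # random arrivals
          p = min(powers, key=lambda x: V * x - Q * mu(x))  # drift-plus-penalty
          Q = max(Q - mu(p), 0.0) + a                       # queue update
          used_power += p

      print(f"average power {used_power / T:.3f}, final backlog {Q:.1f}")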

  • Arduino Microcontroller Processing for Everyone: Part I

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about the Arduino microcontroller and the Arduino concept. The visionary Arduino team of Massimo Banzi, David Cuartielles, Tom Igoe, Gianluca Martino, and David Mellis launched a new innovation in microcontroller hardware in 2005: the concept of open source hardware. Their approach was to openly share details of microcontroller-based hardware design platforms to stimulate the sharing of ideas and promote innovation. This concept has been popular in the software world for many years. This book is intended for a wide variety of audiences, including students of the fine arts, middle and senior high school students, engineering design students, and practicing scientists and engineers. To meet this wide audience, the book has been divided into sections to satisfy the need of each reader. The book contains many software and hardware examples to assist the reader in developing a wide variety of systems. For the examples, the Arduino Duemilanove and the Atmel ATmega328 are employed as the target platform. Table of Contents: Getting Started / Programming / Embedded Systems Design / Serial Communication Subsystem / Analog to Digital Conversion (ADC) / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing

  • Natural Language Processing for Historical Texts

    Copyright Year: 2012

    Morgan and Claypool eBooks

    More and more historical texts are becoming available in digital form. Digitization of paper documents is motivated by the aim of preserving cultural heritage and making it more accessible, both to laypeople and scholars. As digital images cannot be searched for text, digitization projects increasingly strive to create digital text, which can be searched and otherwise automatically processed, in addition to facsimiles. Indeed, the emerging field of digital humanities heavily relies on the availability of digital text for its studies. Together with the increasing availability of historical texts in digital form, there is a growing interest in applying natural language processing (NLP) methods and tools to historical texts. However, the specific linguistic properties of historical texts -- the lack of standardized orthography, in particular -- pose special challenges for NLP. This book aims to give an introduction to NLP for historical texts and an overview of the state of the art in this field. The book starts with an overview of methods for the acquisition of historical texts (scanning and OCR), discusses text encoding and annotation schemes, and presents examples of corpora of historical texts in a variety of languages. The book then discusses specific methods, such as creating part-of-speech taggers for historical languages or handling spelling variation. A final chapter analyzes the relationship between NLP and the digital humanities. Certain recently emerging textual genres, such as SMS, social media, and chat messages, or newsgroup and forum postings share a number of properties with historical texts, for example, nonstandard orthography and grammar, and profuse use of abbreviations. The methods and techniques required for the effective processing of historical texts are thus also of interest for research in other domains. Table of Contents: Introduction / NLP and Digital Humanities / Spelling in Historical Texts / Acquiring Historical Texts / Text Encoding and Annotation Schemes / Handling Spelling Variation / NLP Tools for Historical Languages / Historical Corpora / Conclusion / Bibliography
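
    As a toy illustration of the spelling-variation problem discussed above, the sketch below maps historical variants to their closest entries in a modern lexicon by string similarity; the word lists are invented, and real normalizers use learned rewrite rules rather than generic similarity matching.

      # Toy normalization of historical spelling variants: map each token to
      # its most similar modern lexicon entry.
      import difflib

      modern_lexicon = ["king", "queen", "night", "enough", "through"]
      historical = ["kyng", "quene", "nyght", "ynough", "thorow"]

      for word in historical:
          match = difflib.get_close_matches(word, modern_lexicon, n=1, cutoff=0.0)
          print(word, "->", match[0])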

  • Introduction to Biomedical Engineering: Biomechanics and Bioelectricity

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Intended as an introduction to the field of biomedical engineering, this book covers the topics of biomechanics (Part I) and bioelectricity (Part II). Each chapter emphasizes a fundamental principle or law, such as Darcy's Law, Poiseuille's Law, Hooke's Law, Starling's Law, levers, and work in the area of fluid, solid, and cardiovascular biomechanics. In addition, electrical laws and analysis tools are introduced, including Ohm's Law, Kirchhoff's Laws, Coulomb's Law, capacitors, and the fluid/electrical analogy. Culminating the electrical portion are chapters covering Nernst and membrane potentials and Fourier transforms. Examples are solved throughout the book and problems with answers are given at the end of each chapter. A semester-long Major Project that models the human systemic cardiovascular system, utilizing both a Matlab numerical simulation and an electrical analog circuit, ties many of the book's concepts together. Table of Contents: Ohm's Law: Current, Voltage and Resistance / Kirchhoff's Voltage and Current Laws: Circuit Analysis / Operational Amplifiers / Coulomb's Law, Capacitors and the Fluid/Electrical Analogy / Series and Parallel Combinations / Thevenin Equivalent Circuits / Nernst Potential: Cell Membrane Equivalent Circuit / Fourier Transforms: Alternating Currents (AC)
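
    Since each chapter is built around a law, here is a worked example of the Nernst potential mentioned above, computed for potassium at body temperature; the ion concentrations are generic textbook values, not figures taken from this book.

      # Nernst potential: E = (R*T)/(z*F) * ln([ion]_out / [ion]_in)
      import math

      R, F = 8.314, 96485.0      # gas constant J/(mol*K), Faraday constant C/mol
      T = 310.0                  # body temperature in kelvin
      z = 1                      # valence of K+
      K_out, K_in = 5.0, 140.0   # typical K+ concentrations in mM

      E = (R * T) / (z * F) * math.log(K_out / K_in)
      print(f"E_K = {E * 1000:.1f} mV")   # about -89 mV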

  • Reading and Writing the Electronic Book

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Developments over the last twenty years have fueled considerable speculation about the future of the book and of reading itself. This book begins with a gloss over the history of electronic books, including the social and technical forces that have shaped their development. The focus then shifts to reading and how we interact with what we read: basic issues such as legibility, annotation, and navigation are examined as aspects of reading that ebooks inherit from their print legacy. Because reading is fundamentally communicative, I also take a closer look at the sociality of reading: how we read in a group and how we share what we read. Studies of reading and ebook use are integrated throughout the book, but Chapter 5 "goes meta" to explore how a researcher might go about designing his or her own reading-related studies. No book about ebooks is complete without an explicit discussion of content preparation, i.e., how the electronic book is written. Hence, Chapter 6 delves into the underlying representation of ebooks and efforts to create and apply markup standards to them. This chapter also examines how print genres have made the journey to digital and how some emerging digital genres might be realized as ebooks. Finally, Chapter 7 discusses some beyond-the-book functionality: how can ebook platforms be transformed into portable personal libraries? In the end, my hope is that by the time the reader reaches the end of this book, he or she will feel equipped to perform the next set of studies, write the next set of articles, invent new ebook functionality, or simply engage in a heated argument with the stranger in seat 17C about the future of reading. Table of Contents: Preface / Figure Credits / Introduction / Reading / Interaction / Reading as a Social Activity / Studying Reading / Beyond the Book / References / Author Biography

  • Learning from Multiple Social Networks

    Copyright Year: 2016

    Morgan and Claypool eBooks

    With the proliferation of social network services, more and more social users, such as individuals and organizations, are simultaneously involved in multiple social networks for various purposes. In fact, multiple social networks characterize the same social users from different perspectives, and their contexts are usually consistent or complementary rather than independent. Hence, as compared to using information from a single social network, appropriate aggregation of multiple social networks offers us a better way to comprehensively understand the given social users.

    Learning across multiple social networks brings opportunities to new services and applications as well as new insights on user online behaviors, yet it raises tough challenges: (1) How can we map different social network accounts to the same social users? (2) How can we complete the item-wise and block-wise missing data? (3) How can we leverage the relatedness among sources to strengthen the learning performance? And (4) how can we jointly model the dual heterogeneities: multiple tasks exist for the given application and each task has various features from multiple sources? These questions have been largely unexplored to date.

    We noticed this timely opportunity, and in this book we present some state-of-the-art theories and novel practical applications on aggregation of multiple social networks. In particular, we first introduce multi-source dataset construction. We then introduce how to effectively and efficiently complete the item-wise and block-wise missing data, which are caused by the inactive social users in some social networks. We next detail the proposed multi-source mono-task learning model and its application in volunteerism tendency prediction. As a counterpart, we also present a mono-source multi-task learning model and apply it to user interest inference. We seamlessly unify these models with the so-called multi-source multi-task learning, and demonstrate several application scenarios, such as occupation prediction. Finally, we conclude the book and lay out the future research directions in multiple social network learning, including the privacy issues and source complementarity modeling.

    This is preliminary research on learning from multiple social networks, and we hope it can inspire more active researchers to work on this exciting area. If we have seen further it is by standing on the shoulders of giants.
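
    One standard baseline for the item-wise missing-data problem raised above is low-rank matrix completion; the sketch below uses iterated truncated SVD on synthetic data and is a generic technique, not necessarily the completion model proposed in the book.

      # Toy low-rank completion of a "user x attribute" matrix via iterated
      # truncated SVD: fill missing entries, project to rank 1, repeat.
      import numpy as np

      rng = np.random.default_rng(0)
      true = rng.random((6, 1)) @ rng.random((1, 5))   # rank-1 ground truth
      mask = rng.random(true.shape) > 0.3              # observed entries
      X = np.where(mask, true, 0.0)

      for _ in range(50):
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          s[1:] = 0.0                                  # keep only rank 1
          X_hat = (U * s) @ Vt
          X = np.where(mask, true, X_hat)              # reimpose observations

      print("max error on missing entries:", np.abs((X_hat - true)[~mask]).max())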

  • The Human Side of Engineering

    Copyright Year: 2017

    Morgan and Claypool eBooks

    While in many university courses attention is given to the human side of engineering, as opposed to the technical side, it is by and large an afterthought. Engineering is, however, a technical, social, and personal activity. Several studies show that engineering is a community activity of professionals in which communication is central to the engineering task. Increasingly, technology impacts everyone in society. Acting as a professional community, engineers have an awesome power to influence society, but they can only act for the common good if they understand the nature of our society. To achieve such understanding they have to understand themselves. This book is about understanding ourselves in order to understand others, and understanding others in order to understand ourselves, in the context of engineering and the society it serves. To achieve this understanding, this book takes the reader on 12 intellectual journeys that frame the big questions confronting the engineering professions.

  •

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Ever since its invention in the 1980s, the compound semiconductor heterojunction-based high electron mobility transistor (HEMT) has been widely used in radio frequency (RF) applications. This book provides readers with broad coverage of techniques and new trends of HEMT, employing leading compound semiconductors, III-N and III-V materials. The content includes an overview of GaN HEMT device-scaling technologies and experimental research breakthroughs in fabricating various GaN MOSHEMT transistors. Readers are offered an inspiring example of monolithic integration of HEMT with LEDs, too. The authors compile the most relevant aspects of III-V HEMT, including the current status of state-of-the-art HEMTs, their possibility of replacing the Si CMOS transistor channel, and growth opportunities of III-V materials on an Si substrate. With detailed exploration and explanations, the book is a helpful source suitable for anyone learning about and working on compound semiconductor devices.

  • Control System Synthesis: A Factorization Approach, Part II

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book introduces the so-called "stable factorization approach" to the synthesis of feedback controllers for linear control systems. The key to this approach is to view the multi-input, multi-output (MIMO) plant for which one wishes to design a controller as a matrix over the fraction field F associated with a commutative ring with identity, denoted by R, which also has no divisors of zero. In this setting, the set of single-input, single-output (SISO) stable control systems is precisely the ring R, while the set of stable MIMO control systems is the set of matrices whose elements all belong to R. The set of unstable, meaning not necessarily stable, control systems is then taken to be the field of fractions F associated with R in the SISO case, and the set of matrices with elements in F in the MIMO case. The central notion introduced in the book is that, in most situations of practical interest, every matrix P whose elements belong to F can be "factored" as a "ratio" of two matrices N, D whose elements belong to R, in such a way that N, D are coprime. In the familiar case where the ring R corresponds to the set of bounded-input, bounded-output (BIBO)-stable rational transfer functions, coprimeness is equivalent to the two functions not having any common zeros in the closed right half-plane, including infinity. However, the notion of coprimeness extends readily to discrete-time systems, distributed-parameter systems in both the continuous- as well as discrete-time domains, and to multi-dimensional systems. Thus the stable factorization approach enables one to capture all these situations within a common framework. The key result in the stable factorization approach is the parametrization of all controllers that stabilize a given plant. It is shown that the set of all stabilizing controllers can be parametrized by a single matrix parameter R, whose elements all belong to R. Moreover, every transfer matrix in the closed-loop system is an affine function of the design parameter R. Thus problems of reliable stabilization, disturbance rejection, robust stabilization, etc. can all be formulated in terms of choosing an appropriate R. This is a reprint of the book Control System Synthesis: A Factorization Approach originally published by M.I.T. Press in 1985.
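
    In the familiar BIBO-stable rational setting, the parametrization the abstract refers to can be written out explicitly for a SISO plant; a sketch of the standard form, writing \mathcal{R} for the ring to avoid clashing with the parameter R:

      \begin{aligned}
        p &= n\,d^{-1}, \qquad n, d \in \mathcal{R} \text{ coprime: } x n + y d = 1 \text{ for some } x, y \in \mathcal{R},\\
        c(R) &= \frac{x + R d}{y - R n}, \qquad R \in \mathcal{R} \text{ arbitrary (with } y - R n \neq 0\text{)},\\
        \frac{1}{1 + p\,c(R)} &= d\,(y - R n), \qquad \text{affine in } R.
      \end{aligned}

    Every other closed-loop transfer function is likewise affine in R, which is what makes the design problems listed above tractable.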

  • DSP for MATLAB™ and LabVIEW™ IV: LMS Adaptive Filters

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book is Volume IV of the series DSP for MATLAB™ and LabVIEW™. Volume IV is an introductory treatment of LMS Adaptive Filtering and applications, and covers cost functions, performance surfaces, coefficient perturbation to estimate the gradient, the LMS algorithm, response of the LMS algorithm to narrow-band signals, and various topologies such as ANC (Active Noise Cancelling) or system modeling, Noise Cancellation, Interference Cancellation, Echo Cancellation (with single- and dual-H topologies), and Inverse Filtering/Deconvolution. The entire series consists of four volumes that collectively cover basic digital signal processing in a practical and accessible manner, but which nonetheless include all essential foundation mathematics. As the series title implies, the scripts here will run on both MATLAB™ and LabVIEW™. The text for all volumes contains many examples, and many useful computational scripts, augmented by demonstration scripts and LabVIEW™ Virtual Instruments (VIs) that can be run to illustrate various signal processing concepts graphically on the user's computer screen. Volume I consists of four chapters that collectively set forth a brief overview of the field of digital signal processing, useful signals and concepts (including convolution, recursion, difference equations, LTI systems, etc.), conversion from the continuous to discrete domain and back (i.e., analog-to-digital and digital-to-analog conversion), aliasing, the Nyquist rate, normalized frequency, sample rate conversion and Mu-law compression, and signal processing principles including correlation, the correlation sequence, the Real DFT, correlation by convolution, matched filtering, simple FIR filters, and simple IIR filters. Chapter 4 of Volume I, in particular, provides an intuitive or "first principle" understanding of how digital filtering and frequency transforms work. Volume II provides detailed coverage of discrete frequency transforms, including a brief overview of common frequency transforms, both discrete and continuous, followed by detailed treatments of the Discrete Time Fourier Transform (DTFT), the z-Transform (including definition and properties, the inverse z-transform, frequency response via z-transform, and alternate filter realization topologies including Direct Form, Direct Form Transposed, Cascade Form, Parallel Form, and Lattice Form), and the Discrete Fourier Transform (DFT) (including Discrete Fourier Series, the DFT-IDFT pair, DFT of common signals, bin width, sampling duration, and sample rate, the FFT, the Goertzel Algorithm, Linear, Periodic, and Circular convolution, DFT Leakage, and computation of the Inverse DFT). Volume III covers digital filter design, including the specific topics of FIR design via windowed-ideal-lowpass filter, FIR highpass, bandpass, and bandstop filter design from windowed-ideal lowpass filters, FIR design using the transition-band-optimized Frequency Sampling technique (implemented by Inverse-DFT or Cosine/Sine Summation Formulas), design of equiripple FIRs of all standard types including Hilbert Transformers and Differentiators via the Remez Exchange Algorithm, design of Butterworth, Chebyshev (Types I and II), and Elliptic analog prototype lowpass filters, conversion of analog lowpass prototype filters to highpass, bandpass, and bandstop filters, and conversion of analog filters to digital filters using the Impulse Invariance and Bilinear Transform techniques. Certain filter topologies specific to FIRs are also discussed, as are two simple FIR types, the Comb and Moving Average filters. Table of Contents: Introduction To LMS Adaptive Filtering / Applied Adaptive Filtering
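
    The core LMS coefficient update covered in this volume fits in a few lines; the sketch below identifies a toy unknown FIR system, written in Python for self-containment even though the book's own scripts target MATLAB™ and LabVIEW™ (the step size and tap values are invented).

      # LMS adaptive filter identifying an unknown 3-tap FIR system.
      import numpy as np

      rng = np.random.default_rng(1)
      unknown = np.array([0.5, -0.3, 0.2])   # system to identify (illustrative)
      N, L, mu = 5000, 3, 0.05               # samples, taps, step size
      w = np.zeros(L)                        # adaptive coefficients
      x = rng.standard_normal(N)             # white input signal

      for n in range(L, N):
          xn = x[n - L + 1:n + 1][::-1]      # most recent L samples, newest first
          d = unknown @ xn                   # desired signal (unknown system out)
          e = d - w @ xn                     # error
          w += 2 * mu * e * xn               # LMS update

      print("estimated taps:", np.round(w, 3))   # approaches [0.5, -0.3, 0.2]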

  • Visual Object Recognition

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The visual recognition problem is central to computer vision research. From robotics to information retrieval, many desired applications demand the ability to identify and localize categories, places, and objects. This tutorial overviews computer vision algorithms for visual object recognition and image classification. We introduce primary representations and learning approaches, with an emphasis on recent advances in the field. The target audience consists of researchers or students working in AI, robotics, or vision who would like to understand what methods and representations are available for these problems. This lecture summarizes what is and isn't possible to do reliably today, and overviews key concepts that could be employed in systems requiring visual categorization. Table of Contents: Introduction / Overview: Recognition of Specific Objects / Local Features: Detection and Description / Matching Local Features / Geometric Verification of Matched Features / Example Systems: Specific-Object Recognition / Overview: Recognition of Generic Object Categories / Representations for Object Categories / Generic Object Detection: Finding and Scoring Candidates / Learning Generic Object Category Models / Example Systems: Generic Object Recognition / Other Considerations and Current Challenges / Conclusions

  • Recognizing Textual Entailment: Models and Applications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In the last few years, a number of NLP researchers have developed and participated in the task of Recognizing Textual Entailment (RTE). This task encapsulates Natural Language Understanding capabilities within a very simple interface: recognizing when the meaning of a text snippet is contained in the meaning of a second piece of text. This simple abstraction of an exceedingly complex problem has broad appeal partly because it can also be conceived as a component in other NLP applications, from Machine Translation to Semantic Search to Information Extraction. It also avoids commitment to any specific meaning representation and reasoning framework, broadening its appeal within the research community. This level of abstraction also facilitates evaluation, a crucial component of any technological advancement program. This book explains the RTE task formulation adopted by the NLP research community, and gives a clear overview of research in this area. It draws out commonalities in this research, detailing the intuitions behind dominant approaches and their theoretical underpinnings. This book has been written with a wide audience in mind, but is intended to inform all readers about the state of the art in this fascinating field, to give a clear understanding of the principles underlying RTE research to date, and to highlight the short- and long-term research goals that will advance this technology.
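
    As a flavor of the task interface, here is the classic word-overlap baseline, a deliberately weak method the RTE literature often uses as a reference point; the threshold and example sentences are invented.

      # Toy RTE baseline: predict entailment when enough hypothesis words
      # also occur in the text.
      def entails(text, hypothesis, threshold=0.8):
          t = set(text.lower().split())
          h = set(hypothesis.lower().split())
          return len(t & h) / len(h) >= threshold

      print(entails("a cat sat quietly on the mat", "a cat sat on the mat"))  # True
      print(entails("a cat sat quietly on the mat", "a dog barked"))          # False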

  • Embedded System Design with the Atmel AVR Microcontroller: Part I

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers an advanced treatment of the Atmel AVR microcontroller. This book is intended as a follow-on to a previously published book, titled Atmel AVR Microcontroller Primer: Programming and Interfacing. Some of the content from this earlier text is retained for completeness. This book will emphasize advanced programming and interfacing skills. We focus on system level design consisting of several interacting microcontroller subsystems. The first chapter discusses the system design process. Our approach is to provide the skills to quickly get up to speed to operate the internationally popular Atmel AVR microcontroller line by developing systems level design skills. We use the Atmel ATmega164 as a representative sample of the AVR line. The knowledge you gain on this microcontroller can be easily translated to every other microcontroller in the AVR line. In succeeding chapters, we cover the main subsystems aboard the microcontroller, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying software for the subsystem. We then provide advanced examples exercising some of the features discussed. In all examples, we use the C programming language. The code provided can be readily adapted to the wide variety of compilers available for the Atmel AVR microcontroller line. We also include a chapter describing how to interface the microcontroller to a wide variety of input and output devices. The book concludes with several detailed system level design examples employing the Atmel AVR microcontroller. Table of Contents: Embedded Systems Design / Atmel AVR Architecture Overview / Serial Communication Subsystem / Analog to Digital Conversion (ADC) / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing / System Level Design

  • The Smart Grid: Adapting the Power System to New Challenges

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book links the challenges to which the electricity network is exposed with the range of new technology, methodologies and market mechanisms known under the name "smart grid." The main challenges will be described by the way in which they impact the electricity network: the introduction of renewable electricity production, energy efficiency, the introduction and further opening of the electricity market, increasing demands for reliability and voltage quality, and the growing need for more transport capacity in the grid. Three fundamentally different types of solutions are distinguished in this book: solutions only involving the electricity network (like HVDC and active distribution networks), solutions including the network users but under the control of the network operator (like requirements on production units and curtailment), and fully market-driven solutions (like demand response). An overview is given of the various solutions to the challenges that are possible with new technology; this includes some that are actively discussed elsewhere and others that are somewhat forgotten. Linking the different solutions with the needs of the electricity network, in the light of the various challenges, is a recurring theme in this book. Table of Contents: Introduction / The Challenges / Solutions in the Grid / Participation of Network Users / Market Incentives / Discussion / Conclusions

  • Approximability of Optimization Problems through Adiabatic Quantum Computation

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The adiabatic quantum computation (AQC) is based on the adiabatic theorem to approximate solutions of the Schrödinger equation. The design of an AQC algorithm involves the construction of a Hamiltonian that describes the behavior of the quantum system. This Hamiltonian is expressed as a linear interpolation of an initial Hamiltonian whose ground state is easy to compute, and a final Hamiltonian whose ground state corresponds to the solution of a given combinatorial optimization problem. The adiabatic theorem asserts that if the evolution time of a quantum system described by a Hamiltonian is large enough, then the system remains close to its ground state. An AQC algorithm uses the adiabatic theorem to approximate the ground state of the final Hamiltonian that corresponds to the solution of the given optimization problem. In this book, we investigate the computational simulation of AQC algorithms applied to the MAX-SAT problem. A symbolic analysis of the AQC solution is given in order to understand the involved computational complexity of AQC algorithms. This approach can be extended to other combinatorial optimization problems and can be used for the classical simulation of an AQC algorithm where a Hamiltonian problem is constructed. This construction requires the computation of a sparse matrix of dimension 2ⁿ × 2ⁿ, by means of tensor products, where n is the dimension of the quantum system. Also, a general scheme to design AQC algorithms is proposed, based on a natural correspondence between optimization Boolean variables and quantum bits. Combinatorial graph problems are in correspondence with pseudo-Boolean maps that are reduced in polynomial time to quadratic maps. Finally, the relation among NP-hard problems is investigated, as well as its logical representability, and is applied to the design of AQC algorithms. It is shown that every monadic second-order logic (MSOL) expression has associated pseudo-Boolean maps that can be obtained by expanding the given expression, and also can be reduced to quadratic forms. Table of Contents: Preface / Acknowledgments / Introduction / Approximability of NP-hard Problems / Adiabatic Quantum Computing / Efficient Hamiltonian Construction / AQC for Pseudo-Boolean Optimization / A General Strategy to Solve NP-Hard Problems / Conclusions / Bibliography / Authors' Biographies
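
    The interpolation at the heart of AQC is easy to write down for a tiny system; the sketch below builds H(s) = (1-s)H0 + sHf for two qubits with dense matrices (the toy diagonal penalty stands in for a count of violated clauses and is not an instance from the book).

      # Linear interpolation H(s) = (1-s)*H0 + s*Hf for a 2-qubit system,
      # with H0 assembled from tensor products of 2x2 operators.
      import numpy as np

      I = np.eye(2)
      X = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X

      H0 = -(np.kron(X, I) + np.kron(I, X))    # ground state: uniform superposition
      Hf = np.diag([1.0, 0.0, 2.0, 1.0])       # toy count of violated clauses

      for s in (0.0, 0.5, 1.0):
          H = (1 - s) * H0 + s * Hf
          print(f"s = {s:.1f}: ground-state energy {np.linalg.eigvalsh(H)[0]:+.3f}")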

  • Privacy for Location-based Services

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Sharing of location data enables numerous exciting applications, such as location-based queries, location-based social recommendations, monitoring of traffic and air pollution levels, etc. Disclosing exact user locations raises serious privacy concerns, as locations may give away sensitive information about individuals' health status, alternative lifestyles, political and religious affiliations, etc. Preserving location privacy is an essential requirement towards the successful deployment of location-based applications. These lecture notes provide an overview of the state-of-the-art in location privacy protection. A diverse body of solutions is reviewed, including methods that use location generalization, cryptographic techniques or differential privacy. The most prominent results are discussed, and promising directions for future work are identified.
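
    Among the generalization methods surveyed, the simplest is spatial cloaking: report only a coarse grid cell instead of the exact position. A minimal sketch, with an invented cell size:

      # Toy spatial cloaking: snap an exact location to the center of a
      # coarse grid cell before sharing it.
      import math

      def generalize(lat, lon, cell_deg=0.01):        # ~1 km cells (rough)
          snap = lambda v: (math.floor(v / cell_deg) + 0.5) * cell_deg
          return snap(lat), snap(lon)

      print(generalize(40.7589, -73.9851))            # exact -> cell center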

  • Introduction to Biomedical Engineering: Biomechanics and Bioelectricity, Part II

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Intended as an introduction to the field of biomedical engineering, this book covers the topics of biomechanics (Part I) and bioelectricity (Part II). Each chapter emphasizes a fundamental principle or law, such as Darcy's Law, Poiseuille's Law, Hooke's Law, Starling's Law, levers, and work in the area of fluid, solid, and cardiovascular biomechanics. In addition, electrical laws and analysis tools are introduced, including Ohm's Law, Kirchhoff's Laws, Coulomb's Law, capacitors, and the fluid/electrical analogy. Culminating the electrical portion are chapters covering Nernst and membrane potentials and Fourier transforms. Examples are solved throughout the book and problems with answers are given at the end of each chapter. A semester-long Major Project that models the human systemic cardiovascular system, utilizing both a Matlab numerical simulation and an electrical analog circuit, ties many of the book's concepts together.

  • Project Management for Engineering Design

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This lecture book is an introduction to project management. It will be of use for engineering students working on project design in all engineering disciplines and will also be of high value to practicing engineers in the work force. Few engineering programs prepare students in methods of project design and configuration management used within industry and government. This book emphasizes teams throughout and includes coverage of an introduction to project management, project definition, researching intellectual property (patent search), project scope, idealizing and conceptualizing a design, converting product requirements to engineering specifications, project integration, project communications management, and conducting design reviews. The overall objectives of the book are for the readers to understand and manage their project by employing the good engineering practice used by medical and other industries in design and development of medical devices, engineered products and systems. The goal is for the engineer and student to work well on large projects requiring a team environment, and to effectively communicate technical matters in both written documents and oral presentations.

  • Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Many NLP tasks have at their core a subtask of extracting the dependencies—who did what to whom—from natural language sentences. This task can be understood as the inverse of the problem solved in different ways by diverse human languages, namely, how to indicate the relationship between different parts of a sentence. Understanding how languages solve the problem can be extremely useful in both feature design and error analysis in the application of machine learning to NLP. Likewise, understanding cross-linguistic variation can be important for the design of MT systems and other multilingual applications. The purpose of this book is to present in a succinct and accessible fashion information about the morphological and syntactic structure of human languages that can be useful in creating more linguistically sophisticated, more language-independent, and thus more successful NLP systems. Table of Contents: Acknowledgments / Introduction/motivation / Morphology: Introduction / Morphophonology / Morphosyntax / Syntax: Introduction / Parts of speech / Heads, arguments, and adjuncts / Argument types and grammatical functions / Mismatches between syntactic position and semantic roles / Resources / Bibliography / Author's Biography / General Index / Index of Languages

  • Analysis and Design of Transmitarray Antennas

    Copyright Year: 2017

    Morgan and Claypool eBooks

    In recent years, transmitarray antennas have attracted growing interest among many antenna researchers. Transmitarrays combine both optical and antenna array theory, leading to a low-profile design with high gain, high radiation efficiency, and versatile radiation performance for many wireless communication systems. In this book, comprehensive analysis, new methodologies, and novel designs of transmitarray antennas are presented.

    • Detailed analysis for the design of planar space-fed array antennas is presented. The basics of aperture field distribution and the analysis of the array elements are described. The radiation performances (directivity and gain) are discussed using an array theory approach, and the impacts of element phase errors are demonstrated.
    • The performance of transmitarray design using the multilayer frequency selective surfaces (M-FSS) approach is carefully studied, and the transmission phase limit, which is generally independent of the selection of a specific element shape, is revealed. The maximum transmission phase range is determined based on the number of layers, substrate permittivity, and the separations between layers.
    • In order to reduce transmitarray design complexity and cost, three different methods have been investigated. As a result, one design is performed using quad-layer cross-slot elements with no dielectric material and another using triple-layer spiral dipole elements. Both designs were fabricated and tested at X-band for deep space communications. Furthermore, the radiation pattern characteristics were studied under different feed polarization conditions and oblique angles of incident field from the feed.
    • New design methodologies are proposed to improve the bandwidth of transmitarray antennas through the control of the transmission phase range of the elements. These design techniques are validated through the fabrication and testing of two quad-layer transmitarray antennas at Ku-band.
    • A single-feed quad-beam transmitarray antenna with 50 degrees elevation separation between the beams is investigated, designed, fabricated, and tested at Ku-band.

    In summary, various challenges in the analysis and design of transmitarray antennas are addressed in this book. New methodologies to improve the bandwidth of transmitarray antennas have been demonstrated. Several prototypes have been fabricated and tested, demonstrating the desirable features and potential new applications of transmitarray antennas.
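
    Behind the directivity and gain discussion sit the standard aperture relations (general antenna-theory formulas, not results specific to this book):

      D_{\max} = \frac{4\pi A}{\lambda^{2}}, \qquad G = e_{\mathrm{ap}} \, \frac{4\pi A}{\lambda^{2}},

    where A is the aperture area, \lambda the wavelength, and e_{\mathrm{ap}} the aperture efficiency absorbing taper, spillover, and element phase-error losses.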

  • Metaphor: A Computational Perspective

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The literary imagination may take flight on the wings of metaphor, but hard-headed scientists are just as likely as doe-eyed poets to reach for a metaphor when the descriptive need arises. Metaphor is a pervasive aspect of every genre of text and every register of speech, and is as useful for describing the inner workings of a "black hole" (itself a metaphor) as it is the affairs of the human heart. The ubiquity of metaphor in natural language thus poses a significant challenge for Natural Language Processing (NLP) systems and their builders, who cannot afford to wait until the problems of literal language have been solved before turning their attention to figurative phenomena. This book offers a comprehensive approach to the computational treatment of metaphor and its figurative brethren—including simile, analogy, and conceptual blending—that does not shy away from their important cognitive and philosophical dimensions. Veale, Shutova, and Beigman Klebanov approach metaphor from multiple computational perspectives, providing coverage of both symbolic and statistical approaches to interpretation and paraphrase generation, while also considering key contributions from philosophy on what constitutes the "meaning" of a metaphor. This book also surveys available metaphor corpora and discusses protocols for metaphor annotation. Any reader with an interest in metaphor, from beginning researchers to seasoned scholars, will find this book to be an invaluable guide to what is a fascinating linguistic phenomenon.

  • Semantic Interaction for Visual Analytics: Inferring Analytical Reasoning for Model Steering

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book discusses semantic interaction, a user interaction methodology for visual analytic applications that more closely couples the visual reasoning processes of people with the computation. This methodology affords user interaction on visual data representations that are native to the domain of the data.

    User interaction in visual analytics systems is critical to enabling visual data exploration. Interaction transforms people from mere viewers to active participants in the process of analyzing and understanding data. This discourse between people and data enables people to understand aspects of their data, such as structure, patterns, trends, outliers, and other properties that ultimately result in insight. Through interacting with visualizations, users engage in sensemaking, a process of developing and understanding relationships within datasets through foraging and synthesis.

    The book provides a description of the principles of semantic interaction, providing design guidelines for the integration of semantic interaction into visual analytics, examples of existing technologies that leverage semantic interaction, and a discussion of how to evaluate these technologies. Semantic interaction has the potential to increase the effectiveness of visual analytic technologies and opens possibilities for a fundamentally new design space for user interaction in visual analytics systems.

  • Power-Efficient Computer Architectures: Recent Advances

    Copyright Year: 2014

    Morgan and Claypool eBooks

    As Moore's Law and Dennard scaling trends have slowed, the challenges of building high-performance computer architectures while maintaining acceptable power efficiency levels have heightened. Over the past ten years, architecture techniques for power efficiency have shifted from primarily focusing on module-level efficiencies, toward more holistic design styles based on parallelism and heterogeneity. This work highlights and synthesizes recent techniques and trends in power-efficient computer architecture. Table of Contents: Introduction / Voltage and Frequency Management / Heterogeneity and Specialization / Communication and Memory Systems / Conclusions / Bibliography / Authors' Biographies

  • Dependency Parsing

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Dependency-based methods for syntactic parsing have become increasingly popular in natural language processing in recent years. This book gives a thorough introduction to the methods that are most widely used today. After an introduction to dependency grammar and dependency parsing, followed by a formal characterization of the dependency parsing problem, the book surveys the three major classes of parsing models that are in current use: transition-based, graph-based, and grammar-based models. It continues with a chapter on evaluation and one on the comparison of different methods, and it closes with a few words on current trends and future prospects of dependency parsing. The book presupposes a knowledge of basic concepts in linguistics and computer science, as well as some knowledge of parsing methods for constituency-based representations. Table of Contents: Introduction / Dependency Parsing / Transition-Based Parsing / Graph-Based Parsing / Grammar-Based Parsing / Evaluation / Comparison / Final Thoughts
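
    To make the transition-based idea concrete, here is a minimal arc-standard sketch that applies a hand-chosen action sequence; in a real parser a trained classifier picks each action, and the example sentence is invented.

      # Minimal arc-standard transition system: SHIFT moves a word onto the
      # stack; LEFT/RIGHT attach the second-from-top or top item as a
      # dependent and record the arc as (head, dependent).
      def parse(words, actions):
          stack, buffer, arcs = [], list(range(len(words))), []
          for act in actions:
              if act == "SHIFT":
                  stack.append(buffer.pop(0))
              elif act == "LEFT":                    # stack[-2] <- stack[-1]
                  dep = stack.pop(-2)
                  arcs.append((stack[-1], dep))
              elif act == "RIGHT":                   # stack[-2] -> stack[-1]
                  dep = stack.pop()
                  arcs.append((stack[-1], dep))
          return arcs

      words = ["economic", "news", "dropped"]
      print(parse(words, ["SHIFT", "SHIFT", "LEFT", "SHIFT", "LEFT"]))
      # -> [(1, 0), (2, 1)]: "economic" modifies "news", which depends on "dropped"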

  • Packets with Deadlines: A Framework for Real-Time Wireless Networks

    Copyright Year: 2013

    Morgan and Claypool eBooks

    With the explosive increase in the number of mobile devices and applications, it is anticipated that wireless traffic will increase exponentially in the coming years. Moreover, future wireless networks will carry a wide variety of flows, such as video streaming, online gaming, and VoIP, which have various quality of service (QoS) requirements. Therefore, a new mechanism that can provide satisfactory performance to the complete variety of flows, in a coherent and unified framework, is needed. In this book, we introduce a framework for real-time wireless networks. This consists of a model that jointly addresses several practical concerns for real-time wireless networks, including per-packet delay bounds, throughput requirements, and heterogeneity of wireless channels. We detail how this framework can be employed to address a wide range of problems, including admission control, packet scheduling, and utility maximization. Table of Contents: Preface / Introduction / A Study of the Base Case / Admission Control / Scheduling Policies / Utility Maximization without Rate Adaptation / Utility Maximization with Rate Adaptation / Systems with Both Real-Time Flows and Non-Real-Time Flows / Broadcasting and Network Coding / Bibliography / Authors' Biographies

  • Analysis of Oriented Texture: With Application to the Detection of Architectural Distortion in Mammograms

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over the entire region of analysis in the image is called the orientation field. High-level analysis relates to the discovery of patterns in the orientation field, usually by associating the structure perceived in the orientation field with a geometrical model. This book presents an analysis of several important methods for the detection of oriented features in images, and a discussion of the phase portrait method for high-level analysis of orientation fields. In order to illustrate the concepts developed throughout the book, an application is presented of the phase portrait method to computer-aided detection of architectural distortion in mammograms. Table of Contents: Detection of Oriented Features in Images / Analysis of Oriented Patterns Using Phase Portraits / Optimization Techniques / Detection of Sites of Architectural Distortion in Mammograms
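
    A minimal sketch of the low-level step described above: estimate the dominant local orientation from image gradients via the (here globally averaged) structure tensor; the sinusoidal test pattern is synthetic, and a real analysis would smooth the tensor locally to obtain a full orientation field.

      # Orientation from image gradients: the dominant gradient angle is
      # 0.5 * arctan2(2*Gxy, Gxx - Gyy); texture ridges run perpendicular.
      import numpy as np

      rng = np.random.default_rng(0)
      y, x = np.mgrid[0:64, 0:64]
      img = np.sin(0.4 * (x + 0.5 * y)) + 0.05 * rng.standard_normal((64, 64))

      gy, gx = np.gradient(img)                  # np.gradient returns d/dy, d/dx
      Gxx, Gyy, Gxy = gx * gx, gy * gy, gx * gy
      theta = 0.5 * np.arctan2(2 * Gxy.mean(), (Gxx - Gyy).mean())
      print(f"dominant gradient angle: {np.degrees(theta):.1f} degrees")  # ~26.6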

  • Tragedy in the Gulf: A Call for a New Engineering Ethic

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The recent tragedy in the Gulf of Mexico and resultant ethical consequences for the engineering profession are introduced and discussed. The need for a new engineering ethic is identified and introduced based upon advancements in science, complex systems and eco-philosophy. Motivations for introducing a new ethic rather than modifying existing ethics are also discussed.

  • Design and the Digital Divide: Insights from 40 Years in Computer Support for Older and Disabled People

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Demographic trends and increasing support costs mean that good design for older and disabled people is an economic necessity, as well as a moral imperative. Alan Newell has been described as "a visionary who stretches the imagination of all of us" and "truly ahead of his time." This monograph describes research ranging from developing communication systems for non-speaking and hearing-impaired people to technology to support older people, and addresses the particular challenges older people have with much modern technology. Alan recounts the insights gained from this research journey, and recommends a philosophy, and design practices, to reduce the "Digital Divide" between users of information technology and those who are excluded by the poor design of many current systems. How to create and lead interdisciplinary teams, and the practical and ethical challenges of working in clinically related fields are discussed. The concepts of "Ordinary and Extra-ordinary HCI", "User Sensitive Inclusive Design", and "Design for Dynamic Diversity", and the use of "Creative Design" techniques are suggested as extensions of "User Centered" and "Universal Design." Also described are the use of professional theatre and other methods for raising designers' awareness of the challenges faced by older and disabled people, ways of engaging with these groups, and of ascertaining what they "want" rather than just what they "need." This monograph will give all Human Computer Interaction (HCI) practitioners and designers of both mainstream and specialized IT equipment much food for thought.

  • Engineering and Society: Working Towards Social Justice, Part I: Engineering and Society

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Engineers work in an increasingly complex entanglement of ideas, people, cultures, technology, systems and environments. Today, decisions made by engineers often have serious implications for not only their clients but for society as a whole and the natural world. Such decisions may potentially influence cultures, ways of living, as well as alter ecosystems which are in delicate balance. In order to make appropriate decisions and to co-create ideas and innovations within and among the complex networks of communities which currently exist and are shaped by our decisions, we need to regain our place as professionals, to realise the significance of our work and to take responsibility in a much deeper sense. Engineers must develop the 'ability to respond' to emerging needs of all people, across all cultures. To do this requires insights and knowledge which are at present largely within the domain of the social and political sciences but which need to be shared with our students in ways which are meaningful and relevant to engineering. This book attempts to do just that. In Part 1 Baillie introduces ideas associated with the ways in which engineers relate to the communities in which they work. Drawing on scholarship from science and technology studies, globalisation and development studies, as well as work in science communication and dialogue, this introductory text sets the scene for an engineering community which engages with the public. In Part 2 Catalano frames the thinking processes necessary to create ethical and just decisions in engineering, to understand the implications of our current decision making processes and think about ways in which we might adapt these to become more socially just in the future. In Part 3 Baillie and Catalano have provided case studies of everyday issues such as water, garbage and alarm clocks, to help us consider how we might see through the lenses of our new knowledge from Parts 1 and 2 and apply this to our everyday existence as engineers. Table of Contents: Introduction / Throwing Away Rubbish / Turning on the Tap / Awakened by an Alarm Clock / Driving the SUV / Travelling to Waikiki Beach

  • Data Management in the Cloud: Challenges and Opportunities

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Cloud computing has emerged as a successful paradigm of service-oriented computing and has revolutionized the way computing infrastructure is used. This success has seen a proliferation in the number of applications that are being deployed in various cloud platforms. There has also been an increase in the scale of the data generated as well as consumed by such applications. Scalable database management systems form a critical part of the cloud infrastructure. The attempt to address the challenges posed by the management of big data has led to a plethora of systems. This book aims to clarify some of the important concepts in the design space of scalable data management in cloud computing infrastructures. Some of the questions that this book aims to answer are: which systems are appropriate for a specific set of application requirements, what are the research challenges in data management for the cloud, and what is novel in the cloud for database researchers? We also aim to address one basic question: does cloud computing pose new challenges in scalable data management, or is it just a reincarnation of old problems? We provide a comprehensive background study of state-of-the-art systems for scalable data management and analysis. We also identify important aspects in the design of different systems and the applicability and scope of these systems. A thorough understanding of current solutions and a precise characterization of the design space are essential for clearing the "cloudy skies of data management" and ensuring the success of DBMSs in the cloud, thus emulating the success enjoyed by relational databases in traditional enterprise settings. Table of Contents: Introduction / Distributed Data Management / Cloud Data Management: Early Trends / Transactions on Co-located Data / Transactions on Distributed Data / Multi-tenant Database Systems / Concluding Remarks

  • Children’s Internet Search: Using Roles to Understand Children’s Search Behavior

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Searching the Internet and the ability to competently use search engines are increasingly becoming an important part of children’s daily lives. Whether mobile or at home, children use search interfaces to explore personal interests, complete academic assignments, and have social interaction. However, engaging with search also means engaging with an ever-changing and evolving search landscape. There are continual software updates, multiple devices used to search (e.g., phones, tablets), an increasing use of social media, and constantly updated Internet content. For young searchers, this can require infinite adaptability or mean being hopelessly confused. This book offers a perspective centered on children’s search experiences as a whole instead of thinking of search as a process with separate and potentially problematic steps. Reading the prior literature with a child-centered view of search reveals that children have been remarkably consistent over time as searchers, displaying the same search strategies regardless of the landscape of search. However, no research has synthesized these consistent patterns in children’s search across the literature, and only recently have these patterns been uncovered as distinct search roles, or searcher types. Based on a four-year longitudinal study on children’s search experiences, this book weaves together the disparate evidence in the literature through the use of 9 search roles for children ages 7-15. The search role framework has a distinct advantage because it encourages adult stakeholders to design children’s search tools to support and educate children at their existing levels of search strength and deficit, rather than expecting children to adapt to a transient search landscape.

  • Joint Source-Channel Video Transmission

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book deals with the problem of joint source-channel video transmission, i.e., the joint optimal allocation of resources at the application layer and the other network layers, such as data rate adaptation, channel coding, power adaptation in wireless networks, quality of service (QoS) support from the network, and packet scheduling, for efficient video transmission. Real-time video communication applications, such as videoconferencing, video telephony, and on-demand video streaming, have gained increased popularity. However, a key problem in video transmission over the existing Internet and wireless networks is the incompatibility between the nature of the network conditions and the QoS requirements (in terms, for example, of bandwidth, delay, and packet loss) of real-time video applications. To deal with this incompatibility, a natural approach is to adapt the end-system to the network. The joint source-channel coding approach aims to efficiently perform content-aware cross-layer resource allocation, thus increasing the communication efficiency of multiple network layers. Our purpose in this book is to review the basic elements of the state-of-the-art approaches toward joint source-channel video transmission for wired and wireless systems. In this book, we present a general resource-distortion optimization framework, which is used throughout the book to guide our discussions on various techniques of joint source-channel video transmission. In this framework, network resources from multiple layers are assigned to each video packet according to its level of importance. It provides not only an optimization benchmark against which the performance of other sub-optimal systems can be evaluated, but also a useful tool for assessing the effectiveness of different error control components in practical system design. This book is therefore written to be accessible to researchers, expert industrial R&D engineers, and university students who are interested in the cutting-edge technologies in joint source-channel video transmission. Contents: Introduction / Elements of a Video Communication System / Joint Source-Channel Coding / Error-Resilient Video Coding / Channel Modeling and Channel Coding / Internet Video Transmission / Wireless Video Transmission / Conclusions
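
    The importance-driven allocation idea can be sketched as a greedy marginal-utility loop. This is a deliberate simplification: each packet's 'gain' below is a hypothetical distortion-reduction function (assumed diminishing), whereas real schemes optimize jointly across layers.

        # Give the next unit of network resource to the packet with the best
        # distortion reduction per unit cost, until the budget runs out.
        def allocate(packets, budget):
            plan = {p["id"]: 0 for p in packets}
            spent = 0
            while True:
                best = max(packets,
                           key=lambda p: p["gain"](plan[p["id"]]) / p["cost"])
                if best["gain"](plan[best["id"]]) <= 0 or spent + best["cost"] > budget:
                    break
                plan[best["id"]] += 1
                spent += best["cost"]
            return plan

        # e.g. an I-frame packet with high, slowly diminishing gain wins most units:
        # allocate([{"id": "I", "cost": 2, "gain": lambda u: 10 - u},
        #           {"id": "B", "cost": 1, "gain": lambda u: 3 - u}], budget=6)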

  • Query Processing over Uncertain Databases

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Due to measurement errors, transmission loss, or injected noise for privacy protection, uncertainty exists in the data of many real applications. However, query processing techniques for deterministic data cannot be directly applied to uncertain data because they do not have mechanisms to handle data uncertainty. Therefore, efficient and effective manipulation of uncertain data is a practical yet challenging research topic. In this book, we start from the data models for imprecise and uncertain data, move on to defining different semantics for queries on uncertain data, and finally discuss the advanced query processing techniques for various probabilistic queries in uncertain databases. The book serves as a comprehensive guideline for query processing over uncertain databases. Table of Contents: Introduction / Uncertain Data Models / Spatial Query Semantics over Uncertain Data Models / Spatial Query Processing over Uncertain Databases / Conclusion
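
    For a flavor of probabilistic query semantics, the sketch below evaluates an existential predicate under the common tuple-independent model, where each tuple carries its own membership probability:

        # Probability that at least one uncertain tuple satisfies the query
        # predicate, assuming tuples are independent: 1 - P(no match).
        def prob_exists(tuples, pred):
            """tuples: list of (value, probability) pairs."""
            p_none = 1.0
            for value, p in tuples:
                if pred(value):
                    p_none *= 1.0 - p
            return 1.0 - p_none

        # e.g. P(any sensor reading exceeds 30) for three noisy readings:
        # prob_exists([(28, 0.9), (31, 0.6), (35, 0.2)], lambda v: v > 30) == 0.68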

  • Embedded System Design with the Atmel AVR Microcontroller: Part II

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This textbook provides practicing scientists and engineers an advanced treatment of the Atmel AVR microcontroller. This book is intended as a follow-on to a previously published book, titled Atmel AVR Microcontroller Primer: Programming and Interfacing. Some of the content from this earlier text is retained for completeness. This book will emphasize advanced programming and interfacing skills. We focus on system level design consisting of several interacting microcontroller subsystems. The first chapter discusses the system design process. Our approach is to provide the skills to quickly get up to speed to operate the internationally popular Atmel AVR microcontroller line by developing systems level design skills. We use the Atmel ATmega164 as a representative sample of the AVR line. The knowledge you gain on this microcontroller can be easily translated to every other microcontroller in the AVR line. In succeeding chapters, we cover the main subsystems aboard the microcontroller, providing a short theory section followed by a description of the related microcontroller subsystem with accompanying software for the subsystem. We then provide advanced examples exercising some of the features discussed. In all examples, we use the C programming language. The code provided can be readily adapted to the wide variety of compilers available for the Atmel AVR microcontroller line. We also include a chapter describing how to interface the microcontroller to a wide variety of input and output devices. The book concludes with several detailed system level design examples employing the Atmel AVR microcontroller.

  • Mellin Transform Method for Integral Evaluation

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book introduces the Mellin-transform method for the exact calculation of one-dimensional definite integrals, and illustrates the application of this method to electromagnetics problems. Once the basics have been mastered, one quickly realizes that the method is extremely powerful, often yielding closed-form expressions very difficult to come up with using other methods or to deduce from the usual tables of integrals. Yet, as opposed to other methods, the present method is very straightforward to apply; it usually requires laborious calculations, but little ingenuity. Two functions, the generalized hypergeometric function and the Meijer G-function, are very much related to the Mellin-transform method and arise frequently when the method is applied. Because these functions can be automatically handled by modern numerical routines, they are now much more useful than they were in the past. The Mellin-transform method and the two aforementioned functions are discussed first. Then the method is applied in three examples to obtain results, which, at least in the antenna/electromagnetics literature, are believed to be new. In the first example, a closed-form expression, as a generalized hypergeometric function, is obtained for the power radiated by a constant-current circular-loop antenna. The second example concerns the admittance of a 2-D slot antenna. In both these examples, the exact closed-form expressions are applied to improve upon existing formulas in standard antenna textbooks. In the third example, a very simple expression for an integral arising in recent, unpublished studies of unbounded, biaxially anisotropic media is derived. Additional examples are also briefly discussed.
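
    For orientation, here are the standard definitions behind the method (general background, not an excerpt from the book): the Mellin transform, its classic prototype, and the Parseval-type relation that drives the technique,

        \tilde{f}(s) = \int_0^{\infty} x^{s-1} f(x)\,dx, \qquad
        \int_0^{\infty} x^{s-1} e^{-x}\,dx = \Gamma(s),

        \int_0^{\infty} f(x)\,g(x)\,dx
          = \frac{1}{2\pi i}\int_{c-i\infty}^{c+i\infty} \tilde{f}(s)\,\tilde{g}(1-s)\,ds,

    so a hard real-axis integral becomes a contour integral evaluated by residues, which is where generalized hypergeometric and Meijer G-function answers typically arise.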

  • Interaction for Visualization

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Visualization has become a valuable means for data exploration and analysis. Interactive visualization combines expressive graphical representations and effective user interaction. Although interaction is an important component of visualization approaches, much of the visualization literature tends to pay more attention to the graphical representation than to interaction. The goal of this work is to strengthen the interaction side of visualization. Based on a brief review of general aspects of interaction, we develop an interaction-oriented view on visualization. This view comprises five key aspects: the data, the tasks, the technology, the human, as well as the implementation. Picking up these aspects individually, we elaborate several interaction methods for visualization. We introduce a multi-threading architecture for efficient interactive exploration. We present interaction techniques for different types of data (e.g., multivariate data, spatio-temporal data, graphs) and different visualization tasks (e.g., exploratory navigation, visual comparison, visual editing). With respect to technology, we illustrate approaches that utilize modern interaction modalities (e.g., touch, tangibles, proxemics) as well as classic ones. While the human is important throughout this work, we also consider automatic methods to assist the interactive part. In addition to solutions for individual problems, a major contribution of this work is the overarching view of interaction in visualization as a whole. This includes a critical discussion of interaction, the identification of links between the key aspects of interaction, and the formulation of research topics for future work with a focus on interaction.
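
    The multi-threading point can be illustrated in a few lines: interaction events and expensive recomputation live on different threads, so the interface never blocks (a minimal sketch of the decoupling idea, not the architecture from the text):

        import threading, queue

        events, results = queue.Queue(), queue.Queue()

        def worker(compute):
            while True:
                request = events.get()
                if request is None:                 # shutdown signal
                    break
                results.put(compute(request))       # UI thread polls for output

        t = threading.Thread(target=worker, args=(lambda r: r ** 2,), daemon=True)
        t.start()
        events.put(7)          # e.g. a brushing gesture triggering re-aggregation
        print(results.get())   # 49, rendered whenever it arrives
        events.put(None)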

  • Dynamic Speech Models

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Speech dynamics refer to the temporal characteristics in all stages of the human speech communication process. This speech “chain” starts with the formation of a linguistic message in a speaker's brain and ends with the arrival of the message in a listener's brain. Given the intricacy of the dynamic speech process and its fundamental importance in human communication, this monograph is intended to provide comprehensive material on mathematical models of speech dynamics and to address the following issues: How do we make sense of the complex speech process in terms of its functional role of speech communication? How do we quantify the special role of speech timing? How do the dynamics relate to the variability of speech that has often been said to seriously hamper automatic speech recognition? How do we put the dynamic process of speech into a quantitative form to enable detailed analyses? And finally, how can we incorporate the knowledge of speech dynamics into computerized speech analysis and recognition algorithms? The answers to all these questions require building and applying computational models for the dynamic speech process. What are the compelling reasons for carrying out dynamic speech modeling? We provide the answer in two related aspects. First, scientific inquiry into the human speech code has been relentlessly pursued for several decades. As an essential carrier of human intelligence and knowledge, speech is the most natural form of human communication. Embedded in the speech code are linguistic (as well as para-linguistic) messages, which are conveyed through four levels of the speech chain. Underlying the robust encoding and transmission of the linguistic messages are the speech dynamics at all four levels. Mathematical modeling of speech dynamics provides an effective tool in the scientific methods of studying the speech chain. Such scientific studies help understand why humans speak as they do and how humans exploit redundancy and variability by way of multitiered dynamic processes to enhance the efficiency and effectiveness of human speech communication. Second, advancement of human language technology, especially that in automatic recognition of natural-style human speech, is also expected to benefit from comprehensive computational modeling of speech dynamics. The limitations of current speech recognition technology are serious and are well known. A commonly acknowledged and frequently discussed weakness of the statistical model underlying current speech recognition technology is the lack of adequate dynamic modeling schemes to provide correlation structure across the temporal speech observation sequence. Unfortunately, due to a variety of reasons, the majority of current research activities in this area favor only incremental modifications and improvements to the existing HMM-based state of the art. For example, while dynamic and correlation modeling is known to be an important topic, most of the systems nevertheless employ only an ultra-weak form of speech dynamics, e.g., differential or delta parameters. Strong-form dynamic speech modeling, which is the focus of this monograph, may serve as an ultimate solution to this problem. After the introduction chapter, the main body of this monograph consists of four chapters. They cover various aspects of theory, algorithms, and applications of dynamic speech models, and provide a comprehensive survey of the research work in this area spanning the past 20 years. This monograph is intended as advanced material on speech and signal processing for graduate-level teaching, for professionals and engineering practitioners, as well as for seasoned researchers and engineers specialized in speech processing.
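
    The contrast with delta features can be made concrete: a strong-form dynamic model posits an explicit hidden trajectory. Below is a minimal sketch with made-up parameters, in which a hidden articulatory-like state relaxes toward a phonetic target and observations are a noisy mapping of that state:

        import numpy as np

        # Hidden state x follows target-directed dynamics; observation o is a
        # (here linear) mapping of x plus noise. All parameters illustrative.
        def simulate(T, target, rate=0.3, C=1.5, obs_noise=0.05, seed=0):
            rng = np.random.default_rng(seed)
            x, xs, obs = 0.0, [], []
            for _ in range(T):
                x = x + rate * (target - x)        # smooth approach to target
                o = C * x + obs_noise * rng.standard_normal()
                xs.append(x)
                obs.append(o)
            return np.array(xs), np.array(obs)

    Delta parameters summarize such a trajectory only through local differences; a model like this represents the trajectory itself.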

  • Understanding the Financial Score

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Financial statements and information drawn from them confront us daily: in the media, in corporate annual reports, in the treasurer’s reports for clubs or religious groups, in documents provided to employees and managers, as one considers alternative investments, and in documents provided by homeowners’ associations and government agencies. Various readers of a company’s “financial score” make decisions based on financial information: the company’s managers devise actions to improve operations; investors buy or sell the corporation’s securities; creditors decide how much to lend; customers judge the reliability of this supplier; potential employees decide whether to invest their careers in the company. If you are training to be an accountant, find another book. This book’s objective is to increase your ability to draw useful information from financial statements, and thus to make better decisions—in both your personal life and your professional life. Studying this book should help you be a better manager. That is both its objective and its perspective. The book starts at square one; it assumes no prior knowledge on your part. To increase your financial literacy, you will learn the common nomenclature (but not esoteric jargon) used by accountants and financial experts. You will be equipped to ask insightful questions of experts, to engage them and your colleagues in thoughtful debates about financial and accounting issues, and to make better decisions. Table of Contents: The Balance Sheet / The Income Statement / Valuation / Timing / Capital Structure / Cash Flow / Evaluating with Ratios / Cost Accounting / Budgeting and Forecasting / Rules and Integrity / Appendix: Scorekeeping at Not-for-Profits
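
    In the spirit of the ratio-analysis chapter, a tiny worked example (all numbers invented):

        # Four common ratios computed from a toy balance sheet and income
        # statement; these are standard definitions, not figures from the book.
        balance = {"current_assets": 400, "current_liabilities": 250,
                   "total_liabilities": 600, "equity": 500}
        income = {"net_income": 75, "revenue": 900}

        current_ratio = balance["current_assets"] / balance["current_liabilities"]  # 1.6
        debt_to_equity = balance["total_liabilities"] / balance["equity"]           # 1.2
        net_margin = income["net_income"] / income["revenue"]                       # ~8.3%
        return_on_equity = income["net_income"] / balance["equity"]                 # 15%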

  • Instant Recovery with Write-Ahead Logging: Page Repair, System Restart, and Media Restore

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Traditional theory and practice of write-ahead logging and of database recovery techniques revolve around three failure classes: transaction failures resolved by rollback; system failures (typically software faults) resolved by restart with log analysis, “redo,” and “undo” phases; and media failures (typically hardware faults) resolved by restore operations that combine multiple types of backups and log replay. The recent addition of single-page failures and single-page recovery has opened new opportunities far beyond its original aim of immediate, lossless repair of single-page wear-out in novel or traditional storage hardware. In the contexts of system and media failures, efficient single-page recovery enables on-demand incremental “redo” and “undo” as part of system restart or media restore operations. This can give the illusion of practically instantaneous restart and restore: instant restart permits processing new queries and updates seconds after system reboot, and instant restore permits resuming queries and updates on empty replacement media as if those were already fully recovered. In addition to these instant recovery techniques, the discussion introduces much faster offline restore operations without slowdown in backup operations and with hardly any slowdown in log archiving operations. The new restore techniques also render differential and incremental backups obsolete, complete backup commands on the database server practically instantly, and even permit taking full backups without imposing any load on the database server.
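
    The mechanism behind single-page repair can be sketched in a few lines: each page records the LSN of its latest applied update, log records for one page are chained backward, and repair replays only that page's chain (illustrative data structures, not an actual engine):

        # log: dict lsn -> (prev_lsn_for_same_page, redo_fn); chain tail = newest.
        def recover_page(page, page_chain_tail, log):
            pending, lsn = [], page_chain_tail
            while lsn is not None and lsn > page["page_lsn"]:
                prev, redo = log[lsn]        # walk backward to the stale point
                pending.append((lsn, redo))
                lsn = prev
            for lsn, redo in reversed(pending):
                redo(page)                   # replay oldest-first
                page["page_lsn"] = lsn
            return page

    On-demand calls to a routine like this, triggered when a query first touches a stale page, are what make restart and restore appear instantaneous.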

  • User-Centered Agile Methods

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With the introduction and popularization of Agile methods of software development, existing relationships and working agreements between user experience groups and developers are being disrupted. Agile methods introduce new concepts: the Product Owner, the Customer (but not the user), short iterations, User Stories. Where do UX professionals fit in this new world? Agile methods also bring a new mindset -- no big design, no specifications, minimal planning -- which conflicts with the needs of UX design. This lecture discusses the key elements of Agile for the UX community and describes strategies UX people can use to contribute effectively in an Agile team, overcome key weaknesses in Agile methods as typically implemented, and produce a more robust process and more successful designs. We present a process combining the best practices of Contextual Design, a leading approach to user-centered design, with those of Agile development. Table of Contents: Introduction / Common Agile Methods / Agile Culture / Best Practices for Integrating UX with Agile / Structure of a User-Centered Agile Process / Structuring Projects / Conclusion

  • Engineering, Poverty, and the Earth

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In the present work, the growing awareness in engineering of the profession’s responsibility towards the environment and the poor is considered. The following approach is taken: a brief overview of the issues of poverty particularly in the U.S. and the deterioration of the natural world with a focus on the Arctic is provided. Case studies involving New Orleans in the aftermath of Hurricane Katrina and the status of polar bears in a time of shrinking Arctic ice cover are detailed. Recent developments in engineering related to the issues of poverty and the environment are discussed. A new paradigm for engineering based on the works of Leonardo Boff and Thomas Berry, one that places an important emphasis upon a community, is explored.

  • Advances in Waveform-Agile Sensing for Tracking

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Recent advances in sensor technology and information processing afford a new flexibility in the design of waveforms for agile sensing. Sensors are now developed with the ability to dynamically choose their transmit or receive waveforms in order to optimize an objective cost function. This has exposed a new paradigm of significant performance improvements in active sensing: dynamic waveform adaptation to environment conditions, target structures, or information features. The manuscript provides a review of recent advances in waveform-agile sensing for target tracking applications. A dynamic waveform selection and configuration scheme is developed for two active sensors that track one or multiple mobile targets. A detailed description of two sequential Monte Carlo algorithms for agile tracking is presented, together with relevant Matlab code and simulation studies, to demonstrate the benefits of dynamic waveform adaptation. The work will be of interest not only to practitioners of radar and sonar, but also to those working in other applications where waveforms can be dynamically designed, such as communications and biosensing. Table of Contents: Waveform-Agile Target Tracking Application Formulation / Dynamic Waveform Selection with Application to Narrowband and Wideband Environments / Dynamic Waveform Selection for Tracking in Clutter / Conclusions / CRLB Evaluation for Gaussian Envelope GFM Chirp from the Ambiguity Function / CRLB Evaluation from the Complex Envelope
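
    A compact bootstrap (sequential Monte Carlo) tracker conveys the flavor; in the waveform-agile setting, the measurement noise R below would depend on the waveform chosen at each step so as to minimize predicted tracking error. All parameters are illustrative, and the book's own examples are in Matlab rather than Python.

        import numpy as np

        # Bootstrap particle filter for a 1-D target: propagate, weight by the
        # measurement likelihood, resample, and report the posterior mean.
        def smc_track(zs, n=500, q=0.1, R=0.5, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.normal(0.0, 1.0, n)                  # particle positions
            estimates = []
            for z in zs:
                x = x + rng.normal(0.0, q, n)            # process noise
                w = np.exp(-0.5 * (z - x) ** 2 / R)      # likelihood weights
                w /= w.sum()
                x = x[rng.choice(n, n, p=w)]             # resample
                estimates.append(x.mean())
            return estimates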

  • Brain-Machine Interface Engineering

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Neural interfaces are one of the most exciting emerging technologies to impact bioengineering and neuroscience because they enable an alternate communication channel linking directly the nervous system with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that have led to the decoding of information from neural interfaces, this book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems
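
    As a flavor of the input-output family of models, the sketch below fits a linear decoder from binned spike counts to a kinematic variable with ridge regression, one standard regularization technique (synthetic data, illustrative only):

        import numpy as np

        # Ridge-regularized linear decoder: W = (X'X + lam I)^-1 X'y.
        def fit_decoder(X, y, lam=1.0):
            d = X.shape[1]
            return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

        rng = np.random.default_rng(0)
        X = rng.poisson(3.0, (200, 10)).astype(float)   # time bins x neurons
        y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=200)
        W = fit_decoder(X, y)                           # predict with X_new @ W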

  • Modern Image Quality Assessment

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This Lecture book is about objective image quality assessment—where the aim is to provide computational models that can automatically predict perceptual image quality. The early years of the 21st century have witnessed a tremendous growth in the use of digital images as a means for representing and communicating information. A considerable percentage of the image processing literature is devoted to methods for improving the appearance of images, or for maintaining the appearance of images that are processed. Nevertheless, the quality of digital images, processed or otherwise, is rarely perfect. Images are subject to distortions during acquisition, compression, transmission, processing, and reproduction. To maintain, control, and enhance the quality of images, it is important for image acquisition, management, communication, and processing systems to be able to identify and quantify image quality degradations. The goals of this book are as follows: a) to introduce the fundamentals of image quality assessment, and to explain the relevant engineering problems, b) to give a broad treatment of the current state-of-the-art in image quality assessment, by describing leading algorithms that address these engineering problems, and c) to provide new directions for future research, by introducing recent models and paradigms that significantly differ from those used in the past. The book is written to be accessible to university students curious about the state-of-the-art of image quality assessment, expert industrial R&D engineers seeking to implement image/video quality assessment systems for specific applications, and academic theorists interested in developing new algorithms for image quality assessment or using existing algorithms to design or optimize other image processing applications.
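
    The baseline against which perceptual models are motivated is simple enough to state in full: mean squared error and PSNR, which are easy to compute but correlate poorly with perceived quality.

        import numpy as np

        # Peak signal-to-noise ratio between a reference and a test image.
        def psnr(ref, test, peak=255.0):
            mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

    Two distortions with equal MSE can look dramatically different, which is exactly the gap the perceptual models surveyed in the book aim to close.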

  • Analysis Techniques for Information Security

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Increasingly our critical infrastructures are reliant on computers. We see examples of such infrastructures in several domains, including medical, power, telecommunications, and finance. Although automation has advantages, increased reliance on computers exposes our critical infrastructures to a wider variety and higher likelihood of accidental failures and malicious attacks. Disruption of services caused by such undesired events can have catastrophic effects, such as disruption of essential services and huge financial losses. The increased reliance of critical services on our cyberinfrastructure and the dire consequences of security breaches have highlighted the importance of information security. Authorization, security protocols, and software security are three central areas in security in which there have been significant advances in developing systematic foundations and analysis methods that work for practical systems. This book provides an introduction to this work, covering representative approaches, illustrated by examples, and providing pointers to additional work in the area. Table of Contents: Introduction / Foundations / Detecting Buffer Overruns Using Static Analysis / Analyzing Security Policies / Analyzing Security Protocols
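
    A toy example conveys the flavor of static analysis for buffer overruns; real analyses track value ranges symbolically across all program paths, but even this straight-line check over a tiny made-up program representation illustrates the idea:

        # Each tuple is (operation, buffer name, size or write index).
        program = [("alloc", "buf", 8),       # like: char buf[8];
                   ("write", "buf", 4),       # buf[4] = ...  (fine)
                   ("write", "buf", 9)]       # buf[9] = ...  (overrun)

        sizes = {}
        for op, name, n in program:
            if op == "alloc":
                sizes[name] = n
            elif op == "write" and n >= sizes.get(name, 0):
                print(f"possible overrun: index {n} into buffer of size "
                      f"{sizes.get(name, 0)}")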

  • Privacy-Preserving Data Publishing: An Overview

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Privacy preservation has become a major issue in many data analysis applications. When a data set is released to other parties for data analysis, privacy-preserving techniques are often required to reduce the possibility of identifying sensitive information about individuals. For example, in medical data, sensitive information can be the fact that a particular patient suffers from HIV. In spatial data, sensitive information can be a specific location of an individual. In web surfing data, the information that a user browses certain websites may be considered sensitive. Suppose a dataset containing sensitive information is to be released to the public. In order to protect sensitive information, the simplest solution is not to disclose the information. However, this would be overkill, since it would hinder the process of data analysis over the data, from which we can find interesting patterns. Moreover, in some applications, the data must be disclosed under government regulations. Alternatively, the data owner can first modify the data such that the modified data can guarantee privacy and, at the same time, retains sufficient utility and can be released to other parties safely. This process is usually called privacy-preserving data publishing. In this monograph, we study how the data owner can modify the data and how the modified data can preserve privacy and protect sensitive information. Table of Contents: Introduction / Fundamental Concepts / One-Time Data Publishing / Multiple-Time Data Publishing / Graph Data / Other Data Types / Future Research Directions
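
    One classic way to "modify the data" is generalization until a privacy criterion such as k-anonymity holds (named here as a standard example; the book covers a range of criteria). A minimal sketch over (age, zipcode) quasi-identifiers:

        from collections import Counter

        # Coarsen quasi-identifiers in fixed steps until every combination
        # occurs at least k times; fall back to full suppression otherwise.
        def generalize(records, k):
            """records: list of (age, zipcode-string) pairs."""
            for age_bucket, zip_digits in [(1, 5), (5, 3), (10, 0)]:
                rel = [(age // age_bucket * age_bucket,
                        z[:zip_digits] + "*" * (5 - zip_digits))
                       for age, z in records]
                if min(Counter(rel).values()) >= k:
                    return rel
            return [("*", "*****")] * len(records)

        # e.g. generalize([(34, "53715"), (36, "53710"), (35, "53712")], k=3)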

  • Adiabatic Quantum Computation and Quantum Annealing: Theory and Practice

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Adiabatic quantum computation (AQC) is an alternative to the better-known gate model of quantum computation. The two models are polynomially equivalent, but otherwise quite dissimilar: one property that distinguishes AQC from the gate model is its analog nature. Quantum annealing (QA) describes a type of heuristic search algorithm that can be implemented to run in the "native instruction set" of an AQC platform. D-Wave Systems Inc. manufactures quantum annealing processor chips that exploit quantum properties to realize QA computations in hardware. The chips form the centerpiece of a novel computing platform designed to solve NP-hard optimization problems. Starting with a 16-qubit prototype announced in 2007, the company has launched and sold increasingly larger models: the 128-qubit D-Wave One system was announced in 2010 and the 512-qubit D-Wave Two system arrived on the scene in 2013. A 1,000-qubit model is expected to be available in 2014. This monograph presents an introductory overview of this unusual and rapidly developing approach to computation. We start with a survey of basic principles of quantum computation and what is known about the AQC model and the QA algorithm paradigm. Next we review the D-Wave technology stack and discuss some challenges to building and using quantum computing systems at a commercial scale. The last chapter reviews some experimental efforts to understand the properties and capabilities of these unusual platforms. The discussion throughout is aimed at an audience of computer scientists with little background in quantum computation or in physics.
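
    A classical caricature helps fix ideas: simulated annealing minimizes the same Ising-form energy that a quantum annealer attacks, but with thermal rather than quantum fluctuations (illustrative parameters throughout):

        import math, random

        # Minimize E(s) = sum_{i<j} J[i][j] s_i s_j + sum_i h[i] s_i over
        # spins s_i in {-1, +1} by single-spin flips under a cooling schedule.
        def anneal(J, h, steps=5000, T0=2.0, seed=0):
            rng = random.Random(seed)
            n = len(h)
            s = [rng.choice((-1, 1)) for _ in range(n)]
            for t in range(steps):
                T = T0 * (1 - t / steps) + 1e-3
                i = rng.randrange(n)
                # energy change from flipping spin i
                dE = -2 * s[i] * (h[i] + sum(J[i][j] * s[j]
                                             for j in range(n) if j != i))
                if dE <= 0 or rng.random() < math.exp(-dE / T):
                    s[i] = -s[i]
            return s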

  • Bacterial Sensors: Synthetic Design and Application Principles

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Bacterial reporters are live, genetically engineered cells with promising application in bioanalytics. They contain genetic circuitry to produce a cellular sensing element, which detects the target compound and relays the detection to specific synthesis of so-called reporter proteins (the presence or activity of which is easy to quantify). Bioassays with bacterial reporters are a useful complement to chemical analytics because they measure biological responses rather than total chemical concentrations. Simple bacterial reporter assays may also replace more costly chemical methods as a first line sample analysis technique. Recent promising developments integrate bacterial reporter cells with microsystems to produce bacterial biosensors. This lecture presents an in-depth treatment of the synthetic biological design principles of bacterial reporters, the engineering of which started as simple recombinant DNA puzzles, but has now become a more rational approach of choosing and combining sensing, controlling and reporting DNA 'parts'. Several examples of existing bacterial reporter designs and their genetic circuitry will be illustrated. Besides the design principles, the lecture also focuses on the application principles of bacterial reporter assays. A variety of assay formats will be illustrated, and principles of quantification will be dealt with. In addition to this discussion, substantial reference material is supplied in various Annexes. Table of Contents: Short History of the use of Bacteria for Biosensing and Bioreporting / Genetic Engineering Concepts / Measuring with Bioreporters / Epilogue

  • Introduction to Secure Outsourcing Computation

    Copyright Year: 2016

    Morgan and Claypool eBooks

    With the rapid development of cloud computing, enterprises and individuals can outsource their sensitive data to the cloud server, where they can enjoy high-quality data storage and computing services in a ubiquitous manner. This is known as the outsourcing computation paradigm. Recently, the problem of securely outsourcing various expensive computations or storage has attracted considerable attention in the academic community. In this book, we focus on the latest technologies and applications of secure outsourcing computations. Specifically, we introduce the state-of-the-art research on securely outsourcing specific functions, such as scientific computations, cryptographic basic operations, and verifiable large databases with updates. The constructions for specific functions use various design tricks and thus result in very efficient protocols for real-world applications. The topic of outsourcing computation is a hot research issue nowadays. Thus, this book will be beneficial to academic researchers in the field of cloud computing and big data security.
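
    One standard verification trick in this area, offered here as an illustration rather than as one of the book's constructions, lets a weak client check an outsourced matrix product C = AB in O(n^2) time per round (Freivalds' check):

        import random

        # Test A(Br) == Cr for random 0/1 vectors r; a wrong C survives each
        # round with probability at most 1/2, so 'rounds' trials suffice.
        def verify_product(A, B, C, rounds=20, seed=0):
            rng, n = random.Random(seed), len(A)
            mul = lambda M, v: [sum(M[i][j] * v[j] for j in range(n))
                                for i in range(n)]
            for _ in range(rounds):
                r = [rng.randrange(2) for _ in range(n)]
                if mul(A, mul(B, r)) != mul(C, r):
                    return False           # cheating server detected
            return True                    # accept; error prob <= 2**-rounds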

  • Antennas with Non-Foster Matching Networks

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Most antenna engineers are likely to believe that antennas are one technology that is more or less impervious to the rapidly advancing semiconductor industry. However, as demonstrated in this lecture, there is a way to incorporate active components into an antenna and transform it into a new kind of radiating structure that can take advantage of the latest advances in analog circuit design. The approach for making this transformation is to make use of non-Foster circuit elements in the matching network of the antenna. By doing so, we are no longer constrained by the laws of physics that apply to passive antennas. However, we must now design and construct very touchy active circuits. This new antenna technology is now in its infancy. The contributions of this lecture are (1) to summarize the current state-of-the-art in this subject, and (2) to introduce some new theoretical and practical tools for helping us to continue the advancement of this technology.
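
    The name refers to Foster's reactance theorem, quoted here as standard circuit-theory background: the reactance of a lossless passive one-port must satisfy

        \frac{dX(\omega)}{d\omega} > 0,

    whereas an active "negative capacitor" -C realizes

        X(\omega) = \frac{1}{\omega C}, \qquad
        \frac{dX}{d\omega} = -\frac{1}{\omega^{2} C} < 0,

    which is what lets such a matching network cancel an electrically small antenna's capacitive reactance over a broad band rather than only at a single resonance.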

  • Introduction to Embedded Systems: Using ANSI C and the Arduino Development Environment

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many electrical and computer engineering projects involve some kind of embedded system in which a microcontroller sits at the center as the primary source of control. The recently developed Arduino development platform includes an inexpensive hardware development board hosting an eight-bit ATMEL ATmega-family processor and a Java-based software-development environment. These features allow an embedded-systems beginner to focus on learning how to write embedded software instead of wasting time overcoming the learning curve of engineering CAD tools. The goal of this text is to introduce fundamental methods for creating embedded software in general, with a focus on ANSI C. The Arduino development platform provides a great means for accomplishing this task. As such, this work presents embedded software development using 100% ANSI C for the Arduino's ATmega328P processor. We deviate from using the Arduino-specific Wiring libraries in an attempt to provide the most general embedded methods. In this way, the reader will acquire essential knowledge necessary for work on future projects involving other processors. Particular attention is paid to the notorious issue of using C pointers in order to gain direct access to microprocessor registers, which ultimately allow control over all peripheral interfacing. Table of Contents: Introduction / ANSI C / Introduction to Arduino / Embedded Debugging / ATmega328P Architecture / General-Purpose Input/Output / Timer Ports / Analog Input Ports / Interrupt Processing / Serial Communications / Assembly Language / Non-volatile Memory

  • Grammatical Inference for Computational Linguistics

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book provides a thorough introduction to the subfield of theoretical computer science known as grammatical inference from a computational linguistic perspective. Grammatical inference provides principled methods for developing computationally sound algorithms that learn structure from strings of symbols. The relationship to computational linguistics is natural because many research problems in computational linguistics are learning problems on words, phrases, and sentences: What algorithm can take as input some finite amount of data (for instance a corpus, annotated or otherwise) and output a system that behaves "correctly" on specific tasks? Throughout the text, the key concepts of grammatical inference are interleaved with illustrative examples drawn from problems in computational linguistics. Special attention is paid to the notion of "learning bias." In the context of computational linguistics, such bias can be thought to reflect common (ideally universal) properties of natural languages. This bias can be incorporated either by identifying a learnable class of languages which contains the language to be learned or by using particular strategies for optimizing parameter values. Examples are drawn largely from two linguistic domains (phonology and syntax) which span major regions of the Chomsky Hierarchy (from regular to context-sensitive classes). The conclusion summarizes the major lessons and open questions that grammatical inference brings to computational linguistics.
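
    Many algorithms in this field start from a prefix-tree acceptor built over positive data, which a learner (for example RPNI) then collapses by merging states under its learning bias. A minimal construction sketch:

        # Build a prefix-tree acceptor: states are prefixes of the sample,
        # transitions extend a prefix by one symbol, samples are accepting.
        def prefix_tree(samples):
            states, trans, final = {""}, {}, set()
            for w in samples:
                for i in range(len(w)):
                    p, a = w[:i], w[i]
                    states.add(p + a)
                    trans[(p, a)] = p + a
                final.add(w)
            return states, trans, final

        # prefix_tree(["ab", "abb", "b"]) yields states "", "a", "ab", "abb", "b"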

  • Representing and Reasoning with Qualitative Preferences: Tools and Applications

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book provides a tutorial introduction to modern techniques for representing and reasoning about qualitative preferences with respect to a set of alternatives. The syntax and semantics of several preference representation languages, including CP-nets, TCP-nets, CI-nets, and CP-theories, are reviewed. Some key problems in reasoning about preferences are introduced, including determining whether one alternative is preferred to another, or whether they are equivalent, with respect to a given set of preferences. These tasks can be reduced to model checking in temporal logic. Specifically, an induced preference graph that represents a given set of preferences can be efficiently encoded using a Kripke structure for Computational Tree Logic (CTL). One can translate preference queries with respect to a set of preferences into an equivalent set of formulae in CTL, such that the CTL formula is satisfied whenever the preference query holds. This allows us to use a model checker to reason about preferences, i.e., answer preference queries, and to obtain a justification as to why a preference query is satisfied (or not) with respect to a set of preferences. This book defines the notions of the equivalence of two sets of preferences, including what it means for one set of preferences to subsume another, and shows how to answer preferential equivalence and subsumption queries using model checking. Furthermore, this book demonstrates how to generate alternatives ordered by preference, along with providing ways to deal with inconsistent preference specifications. A description of CRISNER, an open source software implementation of the model checking approach to qualitative preference reasoning in CP-nets, TCP-nets, and CP-theories, is included, as well as examples illustrating its use.
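
    The induced preference graph makes dominance a reachability question, which is exactly what a model checker answers; the sketch below performs the same check directly with breadth-first search. Here improving_flips is a hypothetical function enumerating the single-variable "improving flips" licensed by the preference statements.

        from collections import deque

        # 'better' dominates 'worse' iff 'better' is reachable from 'worse'
        # through a chain of improving flips in the induced preference graph.
        def dominates(improving_flips, worse, better):
            seen, frontier = {worse}, deque([worse])
            while frontier:
                for nxt in improving_flips(frontier.popleft()):
                    if nxt == better:
                        return True
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return False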

  • Combating Bad Weather Part II: Fog Removal from Image and Video

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Every year lives and property are lost in road accidents. About one-fourth of these accidents are due to poor visibility in foggy weather. At present, there is no algorithm that is specifically designed for the removal of fog from videos, and applying a single-image fog removal algorithm to each video frame is a time-consuming and costly affair. It is demonstrated that, with intelligent use of temporal redundancy, fog removal algorithms designed for a single image can be extended to real-time video. Results confirm that the presented framework for extending single-image fog removal algorithms to video can reduce the complexity to a great extent with no loss of perceptual quality, paving the way for real-life application of video fog removal. To remove fog, an efficient algorithm based on anisotropic diffusion is developed. It uses a new dark channel assumption for the initialization of the airlight map and anisotropic diffusion for its refinement, which yields a better airlight map estimate. The algorithm requires only a single image captured by an uncalibrated camera system, and it can be applied in both RGB and HSI color spaces; this book shows that using the HSI color space reduces the complexity further. The algorithm requires pre- and post-processing steps for better restoration of the foggy image; these steps have either data-driven or constant parameters, so no user intervention is needed. The presented algorithm is independent of the fog's intensity, so it performs well even in heavy fog. Qualitative and quantitative results confirm that it outperforms previous algorithms in terms of perceptual quality, color fidelity, and execution time. The work presented in this book can find wide application in the entertainment industry, transportation, tracking, and consumer electronics. Table of Contents: Acknowledgments / Introduction / Analysis of Fog / Dataset and Performance Metrics / Important Fog Removal Algorithms / Single-Image Fog Removal Using an Anisotropic Diffusion / Video Fog Removal Framework Using an Uncalibrated Single Camera System / Conclusions and Future Directions / Bibliography / Authors' Biographies
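
    The first step can be sketched directly: the dark channel of a hazy image, the per-pixel minimum over color channels and a local patch, which seeds the airlight-map initialization before the diffusion-based refinement described above (naive unoptimized loop):

        import numpy as np

        # Dark channel: min over RGB, then min over a square patch.
        def dark_channel(img, patch=15):
            """img: HxWx3 RGB array in [0, 1]."""
            mins = img.min(axis=2)
            h, w = mins.shape
            pad = patch // 2
            padded = np.pad(mins, pad, mode="edge")
            out = np.empty_like(mins)
            for i in range(h):
                for j in range(w):
                    out[i, j] = padded[i:i + patch, j:j + patch].min()
            return out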

  • Die-stacking Architecture

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The emerging three-dimensional (3D) chip architectures, with their intrinsic capability of reducing the wire length, promise attractive solutions to reduce the delay of interconnects in future microprocessors. 3D memory stacking enables much higher memory bandwidth for future chip-multiprocessor design, mitigating the "memory wall" problem. In addition, heterogeneous integration enabled by 3D technology can also result in innovative designs for future microprocessors. This book first provides a brief introduction to this emerging technology, and then presents a variety of approaches to designing future 3D microprocessor systems, by leveraging the benefits of low latency, high bandwidth, and heterogeneous integration capability which are offered by 3D technology.

  • Mobile Interactions in Context: A Designerly Way Toward Digital Ecology

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book presents a contextual approach to designing contemporary interactive mobile computer systems as integral parts of ubiquitous computing environments. Interactive mobile systems, services, and devices have become functional design objects that we care deeply about. Although their look, feel, and features impact our everyday lives as we orchestrate them in concert with a plethora of other computing technologies, these artifacts are not well understood or created through traditional methods of user-centered design and usability engineering. Contrary to more traditional IT artifacts, they constitute holistic user experiences of value and pleasure that require careful attention to the variety, complexity, and dynamics of their usage. Hence, the design of mobile interactions proposed in this book transcends existing approaches by using the ensemble of form and context as its central unit of analysis. As such, it promotes a designerly way of achieving convergence between form and context through a contextually grounded, wholeness sensitive, and continually unfolding process of design.

  • Simplified Models for Assessing Heat and Mass Transfer in Evaporative Towers

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The aim of this book is to supply valid and reasonable parameters to guide the choice of the right model of industrial evaporative tower for operating conditions that vary with the particular industrial context: power plants, chemical plants, food processing plants, and other industrial facilities are characterized by specific assets and requirements that have to be satisfied. Evaporative cooling is increasingly employed whenever a significant water flow, at a temperature not greatly different from ambient, is needed to remove a substantial heat load; it cools the water flow through partial evaporation of that same flow.
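
    The classical starting point for simplified evaporative-tower models is Merkel's equation, quoted here as general background (the book layers its own modeling choices on top); it lumps heat and mass transfer into a single enthalpy-difference driving force:

        \frac{K a V}{L} =
          \int_{T_{\mathrm{cold}}}^{T_{\mathrm{hot}}}
          \frac{c_w \, dT}{h_{\mathrm{sat}}(T) - h_{\mathrm{air}}},

    where K is the mass-transfer coefficient, a the interfacial area per unit volume, V the fill volume, L the water mass flow rate, c_w the specific heat of water, h_sat(T) the enthalpy of saturated air at the local water temperature, and h_air the enthalpy of the bulk air stream.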

  • Developing Embedded Software using DaVinci and OMAP Technology

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book discusses how to develop embedded products using DaVinci & OMAP Technology from Texas Instruments Incorporated. It presents a single software platform for diverse hardware platforms. DaVinci & OMAP Technology refers to the family of processors, development tools, software products, and support. While DaVinci Technology is driven by the needs of consumer video products such as IP network cameras, networked projectors, digital signage and portable media players, OMAP Technology is driven by the needs of wireless products such as smart phones. Texas Instruments offers a wide variety of processing devices to meet our users' price and performance needs. These vary from single digital signal processing devices to complex, system-on-chip (SoC) devices with multiple processors and peripherals. As a software developer you may ask: Do I need to become an expert in signal processing and learn the details of these complex devices before I can use them in my application? As a senior executive you may wonder: How can I reduce my engineering development cost? How can I move from one processor to another from Texas Instruments without incurring a significant development cost? This book addresses these questions with sample code and gives an insight into the software architecture and associated component software products that make up this software platform. As an example, we show how we develop an IP network camera. Using this software platform, you can choose to focus on the application and quickly create a product without having to learn the details of the underlying hardware or signal processing algorithms. Alternatively, you can choose to differentiate at both the application as well as the signal processing layer by developing and adding your algorithms using the xDAIS for Digital Media, xDM, guidelines for component software. Finally, you may use one code base across different hardware platforms. Table of Contents: Software Platform / More about xDM, VISA, & CE / Building a Product Based on DaVinci Technology / Reducing Development Cost / eXpressDSP Digital Media (xDM) / Sample Application Using xDM / Embedded Peripheral Software Interface (EPSI) / Sample Application Using EPSI / Sample Application Using EPSI and xDM / IP Network Camera on DM355 Using TI Software / Adding your secret sauce to the Signal Processing Layer (SPL) / Further Reading

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Geometry of Walker Manifolds

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book, which focuses on the study of curvature, is an introduction to various aspects of pseudo-Riemannian geometry. We shall use Walker manifolds (pseudo-Riemannian manifolds which admit a non-trivial parallel null plane field) to exemplify some of the main differences between the geometry of Riemannian manifolds and the geometry of pseudo-Riemannian manifolds and thereby illustrate phenomena in pseudo-Riemannian geometry that are quite different from those which occur in Riemannian geometry, i.e. for indefinite as opposed to positive definite metrics. Indefinite metrics are important in many diverse physical contexts: classical cosmological models (general relativity) and string theory to name but two. Walker manifolds appear naturally in numerous physical settings and provide examples of extremal mathematical situations as will be discussed presently. To describe the geometry of a pseudo-Riemannian manifold, one must first understand the curvature of the manifold. We shall analyze a wide variety of curvature properties and we shall derive both geometrical and topological results. Special attention will be paid to manifolds of dimension 3 as these are quite tractable. We then pass to the 4 dimensional setting as a gateway to higher dimensions. Since the book is aimed at a very general audience (and in particular to an advanced undergraduate or to a beginning graduate student), no more than a basic course in differential geometry is required in the way of background. To keep our treatment as self-contained as possible, we shall begin with two elementary chapters that provide an introduction to basic aspects of pseudo-Riemannian geometry before beginning on our study of Walker geometry. An extensive bibliography is provided for further reading. Math subject classifications: Primary: 53B20 -- (PACS: 02.40.Hw) Secondary: 32Q15, 51F25, 51P05, 53B30, 53C50, 53C80, 58A30, 83F05, 85A04 Table of Contents: Basic Algebraic Notions / Basic Geometrical Notions / Walker Structures / Three-Dimensional Lorentzian Walker Manifolds / Four-Dimensional Walker Manifolds / The Spectral Geometry of the Curvature Tensor / Hermitian Geometry / Special Walker Manifolds View full abstract»
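
    For readers who want the defining property in symbols, the following is a compact restatement of the abstract's "non-trivial parallel null plane field" (the notation is ours, not necessarily the book's):

    ```latex
    % (M, g) is a Walker manifold if it carries a distribution D that is
    % null with respect to g and parallel under the Levi-Civita connection:
    \exists\, \mathcal{D} \subset TM,\ \ \operatorname{rank}\mathcal{D} = r > 0,
    \qquad g|_{\mathcal{D}\times\mathcal{D}} = 0,
    \qquad \nabla_X \sigma \in \Gamma(\mathcal{D})
    \ \ \text{for all } X \in \Gamma(TM),\ \sigma \in \Gamma(\mathcal{D}).
    ```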

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Adaptive Mesh Refinement in Time-Domain Numerical Electromagnetics

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This monograph is a comprehensive presentation of state-of-the-art methodologies that can dramatically enhance the efficiency of the finite-difference time-domain (FDTD) technique, the most popular electromagnetic field solver of the time-domain form of Maxwell's equations. These methodologies are aimed at optimally tailoring the computational resources needed for the wideband simulation of microwave and optical structures to their geometry, as well as the nature of the field solutions they support. That is achieved by the development of robust “adaptive meshing” approaches, which amount to varying the total number of unknown field quantities in the course of the simulation to adapt to temporally or spatially localized field features. While mesh adaptation is an extremely desirable FDTD feature, known to reduce simulation times by orders of magnitude, it is not always robust. The specific techniques presented in this book are characterized by stability and robustness. Therefore, they are excellent computer analysis and design (CAD) tools. The book starts by introducing the FDTD technique, along with challenges related to its application to the analysis of real-life microwave and optical structures. It then proceeds to developing an adaptive mesh refinement method based on the use of multiresolution analysis and, more specifically, the Haar wavelet basis. Furthermore, a new method to embed a moving adaptive mesh in FDTD, the dynamically adaptive mesh refinement (AMR) FDTD technique, is introduced and explained in detail. To highlight the properties of the theoretical tools developed in the text, a number of applications are presented, including: microwave integrated circuits (microstrip filters, couplers, spiral inductors, cavities); optical power splitters, Y-junctions, and couplers; optical ring resonators; and nonlinear optical waveguides. Building on first principles of time-domain electromagnetic simulations, this book presents advanced concepts and cutting-edge modeling techniques in an intuitive way for programmers, engineers, and graduate students. It is designed to provide a solid reference for highly efficient time-domain solvers, employed in a wide range of exciting applications in microwave/millimeter-wave and optical engineering. View full abstract»
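
    To make the baseline concrete, here is a minimal uniform-grid 1D FDTD loop (the Yee leapfrog scheme) of the kind that adaptive mesh refinement methods accelerate; the grid size, step count, and source pulse are illustrative choices, not parameters from the book:

    ```python
    # Minimal 1D FDTD sketch in normalized units (c = 1, dt = dx,
    # the "magic" time step in one dimension).
    import numpy as np

    nx, nt = 200, 300
    ez = np.zeros(nx)          # electric field at integer grid points
    hy = np.zeros(nx - 1)      # magnetic field at half-integer points

    src, t0, spread = nx // 2, 40.0, 12.0   # Gaussian soft source (illustrative)

    for n in range(nt):
        hy += ez[1:] - ez[:-1]          # update H from the curl of E
        ez[1:-1] += hy[1:] - hy[:-1]    # update E from the curl of H
        ez[src] += np.exp(-((n - t0) / spread) ** 2)

    print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
    ```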

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Virtual Design of an Audio Lifelogging System: Tools for IoT Systems

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The availability of inexpensive, custom, highly integrated circuits is enabling some very powerful systems that bring together sensors, smart phones, wearables, cloud computing, and other technologies. To design these types of complex systems we are advocating a top-down simulation methodology to identify problems early. This approach enables software development to start prior to expensive chip and hardware development. We call the overall approach virtual design. This book explains why simulation has become important for chip design and provides an introduction to some of the simulation methods used. The audio lifelogging research project demonstrates the virtual design process in practice. The goals of this book are to: explain how silicon design has become more closely involved with system design; show how virtual design enables top-down design; explain the utility of simulation at different abstraction levels; and show how open source simulation software was used in audio lifelogging. The target audience for this book is faculty, engineers, and students who are interested in developing digital devices for Internet of Things (IoT) types of products. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Three-Dimensional Integration and Modeling: A Revolution in RF and Wireless Packaging

    Copyright Year: 2007

    Morgan and Claypool eBooks

    This book presents a step-by-step discussion of the 3D integration approach for the development of compact system-on-package (SOP) front-ends. Various examples of fully-integrated passive building blocks (cavity/microstrip filters, duplexers, antennas), as well as a multilayer ceramic (LTCC) V-band transceiver front-end module, demonstrate the revolutionary effects of this approach in RF/Wireless packaging and multifunctional miniaturization. Designs covered are based on novel ideas and are presented for the first time for millimeter-wave (60 GHz) ultrabroadband wireless modules. Table of Contents: Introduction / Background on Technologies for Millimeter-Wave Passive Front-Ends / Three-Dimensional Packaging in Multilayer Organic Substrates / Microstrip-Type Integrated Passives / Cavity-Type Integrated Passives / Three-Dimensional Antenna Architectures / Fully Integrated Three-Dimensional Passive Front-Ends / References View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing for User Engagement: Aesthetic and Attractive User Interfaces

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work oriented applications such as games, education and emerging interactive Web 2.0. The chapter starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion and mood. The perspective of aesthetics is expanded towards interaction and engagement to propose design treatments, metaphors, and interactive techniques which can promote user interest, excitement and satisfying experiences. This is followed by reviewing the design process and design treatments which can promote aesthetic perception and engaging interaction. The final part of the chapter provides design guidelines and principles drawn from the interaction and graphical design literature which are cross-referenced to issues in the design process. Examples of designs and design treatments are given to illustrate principles and advice, accompanied by critical reflection. Table of Contents: Introduction / Psychology of User Engagement / UE Design Process / Design Principles and Guidelines / Perspectives and Conclusions View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Private Information Retrieval

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book deals with Private Information Retrieval (PIR), a technique allowing a user to retrieve an element from a server in possession of a database without revealing to the server which element is retrieved. PIR has been widely applied to protect the privacy of the user in querying a service provider on the Internet. For example, by PIR, one can query a location-based service provider about the nearest car park without revealing his location to the server. The first PIR approach was introduced by Chor, Goldreich, Kushilevitz and Sudan in 1995 in a multi-server setting, where the user retrieves information from multiple database servers, each of which has a copy of the same database. To ensure user privacy in the multi-server setting, the servers must be trusted not to collude. In 1997, Kushilevitz and Ostrovsky constructed the first single-database PIR. Since then, many efficient PIR solutions have been discovered. Beginning with a thorough survey of single-database PIR techniques, this text focuses on the latest technologies and applications in the field of PIR. The main categories are illustrated with recently proposed PIR-based solutions by the authors. Because of the latest treatment of the topic, this text will be highly beneficial to researchers and industry professionals in information security and privacy. View full abstract»
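
    As a concrete illustration of the multi-server setting, here is a sketch of the classic two-server scheme of Chor, Goldreich, Kushilevitz, and Sudan for a database of single-bit records; each server sees only a uniformly random index set, so neither learns which bit is wanted (assuming, as the abstract notes, that the servers do not collude):

    ```python
    # Two-server PIR for a bit database: everything XOR-cancels except bit i.
    import secrets
    from functools import reduce

    def xor_of(db, subset):
        return reduce(lambda acc, j: acc ^ db[j], subset, 0)

    def pir_query(db_a, db_b, i):
        n = len(db_a)
        s1 = {j for j in range(n) if secrets.randbits(1)}  # uniform random subset
        s2 = s1 ^ {i}                  # same subset with membership of i flipped
        a1 = xor_of(db_a, s1)          # server A answers XOR over its subset
        a2 = xor_of(db_b, s2)          # server B (identical DB copy) likewise
        return a1 ^ a2                 # all terms cancel except db[i]

    db = [1, 0, 1, 1, 0, 0, 1, 0]
    assert all(pir_query(db, db, i) == db[i] for i in range(len(db)))
    print("recovered every bit without revealing its index to either server")
    ```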

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Hardware Prefetching

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Since the 1970s, microprocessor-based digital platforms have been riding Moore’s law, allowing for doubling of density for the same area roughly every two years. However, whereas microprocessor fabrication has focused on increasing instruction execution rate, memory fabrication technologies have focused primarily on an increase in capacity with negligible increase in speed. This divergent trend in performance between the processors and memory has led to a phenomenon referred to as the “Memory Wall.” To overcome the memory wall, designers have resorted to a hierarchy of cache memory levels, which rely on the principle of memory access locality to reduce the observed memory access time and the performance gap between processors and memory. Unfortunately, important workload classes exhibit adverse memory access patterns that baffle the simple policies built into modern cache hierarchies to move instructions and data across cache levels. As such, processors often spend much time idling upon a demand fetch of memory blocks that miss in higher cache levels. Prefetching—predicting future memory accesses and issuing requests for the corresponding memory blocks in advance of explicit accesses—is an effective approach to hide memory access latency. There have been a myriad of proposed prefetching techniques, and nearly every modern processor includes some hardware prefetching mechanisms targeting simple and regular memory access patterns. This primer offers an overview of the various classes of hardware prefetchers for instructions and data proposed in the research literature, and presents examples of techniques incorporated into modern microprocessors. View full abstract»
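
    As a flavor of the simplest class of data prefetcher covered in such surveys, here is a toy stride prefetcher: it tracks the last address and stride per load instruction (PC) and issues prefetches once the stride repeats. The table layout, confidence threshold, and prefetch degree are illustrative, not any particular processor's design:

    ```python
    # Toy per-PC stride prefetcher model.
    class StridePrefetcher:
        def __init__(self, degree=2):
            self.table = {}       # pc -> (last_addr, stride, confidence)
            self.degree = degree  # how many blocks to prefetch ahead

        def access(self, pc, addr):
            prefetches = []
            last, stride, conf = self.table.get(pc, (addr, 0, 0))
            new_stride = addr - last
            conf = conf + 1 if new_stride == stride and stride != 0 else 0
            if conf >= 2:         # stride confirmed twice: issue prefetches
                prefetches = [addr + k * stride for k in range(1, self.degree + 1)]
            self.table[pc] = (addr, new_stride, conf)
            return prefetches

    pf = StridePrefetcher()
    for addr in range(0x1000, 0x1000 + 64 * 10, 64):   # unit-stride stream
        issued = pf.access(pc=0x400, addr=addr)
        if issued:
            print(f"access {addr:#x} -> prefetch {[hex(a) for a in issued]}")
    ```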

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Arduino Microcontroller: Processing for Everyone! Second Edition

    Copyright Year: 2012

    Morgan and Claypool eBooks

    This book is about the Arduino microcontroller and the Arduino concept. The visionary Arduino team of Massimo Banzi, David Cuartielles, Tom Igoe, Gianluca Martino, and David Mellis launched a new innovation in microcontroller hardware in 2005, the concept of open source hardware. Their approach was to openly share details of microcontroller-based hardware design platforms to stimulate the sharing of ideas and promote innovation. This concept has been popular in the software world for many years. This book is intended for a wide variety of audiences including students of the fine arts, middle and senior high school students, engineering design students, and practicing scientists and engineers. To meet this wide audience, the book has been divided into sections to satisfy the need of each reader. The book contains many software and hardware examples to assist the reader in developing a wide variety of systems. For the examples, the Arduino UNO R3 and the Atmel ATmega328 are employed as the target processors. The second edition has been updated with the latest on the Arduino UNO R3 processor, changes to the Arduino Development Environment and several extended examples. Table of Contents: Getting Started / Programming / Embedded Systems Design / Serial Communication Subsystem / Analog to Digital Conversion (ADC) / Interrupt Subsystem / Timing Subsystem / Atmel AVR Operating Parameters and Interfacing View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Code Division Multiple Access (CDMA)

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This book covers the basic aspects of Code Division Multiple Access or CDMA. It begins with an introduction to the basic ideas behind fixed and random access systems in order to demonstrate the difference between CDMA and the more widely understood TDMA, FDMA or CSMA. Secondly, a review of basic spread spectrum techniques used in CDMA systems is presented, including direct sequence, frequency-hopping, and time-hopping approaches. The basic concept of CDMA is presented, followed by the four basic principles of CDMA systems that impact their performance: interference averaging, universal frequency reuse, soft handoff, and statistical multiplexing. The focus of the discussion will then shift to applications. The most common application of CDMA currently is cellular systems. A detailed discussion on cellular voice systems based on CDMA, specifically IS-95, is presented. The capacity of such systems will be examined as well as performance enhancement techniques such as coding and spatial filtering. Also discussed are Third Generation CDMA cellular systems and how they differ from Second Generation systems. A second application of CDMA that is covered is spread spectrum packet radio networks. Finally, there is an examination of multi-user detection and interference cancellation and how such techniques impact CDMA networks. This book should be of interest and value to engineers, advanced students, and researchers in communications. View full abstract»
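
    The core direct-sequence idea fits in a few lines: users share the band via orthogonal spreading codes, and each receiver recovers its bit by correlating against its own code. This is a minimal sketch with length-4 Walsh codes and no noise or multipath (all values are illustrative):

    ```python
    # Two-user direct-sequence CDMA with orthogonal Walsh codes.
    import numpy as np

    walsh_a = np.array([1, 1, -1, -1])   # orthogonal chip sequences
    walsh_b = np.array([1, -1, 1, -1])   # (rows of a 4x4 Hadamard matrix)

    bit_a, bit_b = +1, -1                # BPSK data bits for users A and B
    channel = bit_a * walsh_a + bit_b * walsh_b   # superposition on the air

    # Despreading: correlate with each user's code and normalize.
    est_a = np.sign(channel @ walsh_a / len(walsh_a))
    est_b = np.sign(channel @ walsh_b / len(walsh_b))
    print(est_a, est_b)   # -> 1.0 -1.0, both bits recovered
    ```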

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Circuits: D-C and Time Domain

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Pragmatic Circuits: DC and Time Domain deals primarily with circuits and how they function, beginning with a review of Kirchhoff's and Ohm's laws, analysis of d-c circuits and op-amps, and the sinusoidal steady state. The author then looks at formal circuit analysis through nodal and mesh equations. Useful theorems like Thevenin's are added to the circuits toolbox. This first of three volumes ends with a chapter on design. The two follow-up volumes in the Pragmatic Circuits series include titles on Frequency Domain and Signals and Filters. These short lecture books will be of use to students at any level of electrical engineering and for practicing engineers, or scientists, in any field looking for a practical and applied introduction to circuits and signals. The author's “pragmatic” and applied style gives a unique and helpful “non-idealistic, practical, opinionated” introduction to circuits. View full abstract»
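
    As a taste of the formal circuit analysis the book builds up to, here is nodal analysis reduced to a linear solve, G v = i: write Kirchhoff's current law at each non-reference node and solve for the node voltages. The topology and component values below are invented for illustration:

    ```python
    # Nodal analysis of a small resistive network.
    # Circuit (hypothetical): 12 V source behind R1 = 1k into node 1;
    # R2 = 2k from node 1 to ground; R3 = 3k from node 1 to node 2;
    # R4 = 4k from node 2 to ground.
    import numpy as np

    G1, G2, G3, G4 = 1/1e3, 1/2e3, 1/3e3, 1/4e3
    Vs = 12.0

    G = np.array([[G1 + G2 + G3, -G3],
                  [-G3,           G3 + G4]])   # conductance matrix
    i = np.array([Vs * G1, 0.0])               # source current injections

    v = np.linalg.solve(G, i)
    print(f"node voltages: v1 = {v[0]:.3f} V, v2 = {v[1]:.3f} V")
    ```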

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2014

    Morgan and Claypool eBooks

    A response of the engineering profession to the challenges of security, poverty and underdevelopment, environmental sustainability, and native cultures is described. Ethical codes, which govern the behavior of engineers, are examined from a historical perspective linking the prevailing codes to models of the natural world. A new ethical code based on a recently introduced model of Nature as an integral community is provided and discussed. Applications of the new code are described using a case study approach. With the ethical code based on an integral community in place, new design algorithms are developed and also explored using case studies. Implications of the proposed changes in ethics and design on engineering education are considered. Table of Contents: Preface / Acknowledgments / Introduction / Engineering Ethics / Models of the Earth / Engineering in a Morally Deep World / Engineering Design in a Morally Deep World / Implications for Engineering Education / Final Thoughts / References / Author's Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Linguistic Annotation and Text Analytics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Linguistic annotation and text analytics are active areas of research and development, with academic conferences and industry events such as the Linguistic Annotation Workshops and the annual Text Analytics Summits. This book provides a basic introduction to both fields, and aims to show that good linguistic annotations are the essential foundation for good text analytics. After briefly reviewing the basics of XML, with practical exercises illustrating in-line and stand-off annotations, a chapter is devoted to explaining the different levels of linguistic annotations. The reader is encouraged to create example annotations using the WordFreak linguistic annotation tool. The next chapter shows how annotations can be created automatically using statistical NLP tools, and compares two sets of tools, the OpenNLP and Stanford NLP tools. The second half of the book describes different annotation formats and gives practical examples of how to interchange annotations between different formats using XSLT transformations. The two main text analytics architectures, GATE and UIMA, are then described and compared, with practical exercises showing how to configure and customize them. The final chapter is an introduction to text analytics, describing the main applications and functions including named entity recognition, coreference resolution and information extraction, with practical examples using both open source and commercial tools. Copies of the example files, scripts, and stylesheets used in the book are available from the companion website, located at the book website. Table of Contents: Working with XML / Linguistic Annotation / Using Statistical NLP Tools / Annotation Interchange / Annotation Architectures / Text Analytics View full abstract»
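
    The in-line versus stand-off distinction is easy to show concretely. In the sketch below (the example sentence and label names are ours, not the book's), the in-line version embeds markup in the text itself, while the stand-off version leaves the source untouched and records character offsets separately:

    ```python
    # In-line vs. stand-off annotation of the same named entities.
    text = "Dublin is the capital of Ireland."

    # In-line: the annotation lives inside the document.
    inline = text.replace("Dublin", '<ne type="LOCATION">Dublin</ne>', 1)

    # Stand-off: (start, end, label) triples referencing the raw text.
    standoff = [(0, 6, "LOCATION"), (25, 32, "LOCATION")]

    print(inline)
    for start, end, label in standoff:
        print(f"{label}: {text[start:end]!r} at [{start}:{end})")
    ```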

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fractal Analysis of Breast Masses in Mammograms

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Fractal analysis is useful in digital image processing for the characterization of shape roughness and gray-scale texture or complexity. Breast masses present shape and gray-scale characteristics in mammograms that vary between benign masses and malignant tumors. This book demonstrates the use of fractal analysis to classify breast masses as benign masses or malignant tumors based on the irregularity exhibited in their contours and the gray-scale variability exhibited in their mammographic images. A few different approaches are described to estimate the fractal dimension (FD) of the contour of a mass, including the ruler method, box-counting method, and the power spectral analysis (PSA) method. Procedures are also described for the estimation of the FD of the gray-scale image of a mass using the blanket method and the PSA method. To facilitate comparative analysis of FD as a feature for pattern classification of breast masses, several other shape features and texture measures are described in the book. The shape features described include compactness, spiculation index, fractional concavity, and Fourier factor. The texture measures described are statistical measures derived from the gray-level cooccurrence matrix of the given image. Texture measures reveal properties about the spatial distribution of the gray levels in the given image; therefore, the performance of texture measures may be dependent on the resolution of the image. For this reason, an analysis of the effect of spatial resolution or pixel size on texture measures in the classification of breast masses is presented in the book. The results demonstrated in the book indicate that fractal analysis is more suitable for characterization of the shape than the gray-level variations of breast masses, with area under the receiver operating characteristic curve of up to 0.93 with a dataset of 111 mammographic images of masses. The methods and results presented in the book are useful for computer-aided diagnosis of breast cancer. Table of Contents: Computer-Aided Diagnosis of Breast Cancer / Detection and Analysis of Breast Masses / Datasets of Images of Breast Masses / Methods for Fractal Analysis / Pattern Classification / Results of Classification of Breast Masses / Concluding Remarks View full abstract»
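
    The box-counting method mentioned above is straightforward to sketch: cover the binary contour image with boxes of shrinking size s, count the occupied boxes N(s), and estimate FD as the slope of log N(s) against log(1/s). A minimal version follows (the test image and box sizes are illustrative):

    ```python
    # Box-counting estimate of fractal dimension on a binary mask.
    import numpy as np

    def box_count(mask: np.ndarray, size: int) -> int:
        """Number of size x size boxes containing at least one set pixel."""
        h, w = mask.shape
        count = 0
        for r in range(0, h, size):
            for c in range(0, w, size):
                if mask[r:r + size, c:c + size].any():
                    count += 1
        return count

    def fractal_dimension(mask: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
        counts = [box_count(mask, s) for s in sizes]
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    # Sanity check on a straight line, whose FD should be close to 1.
    img = np.zeros((256, 256), dtype=bool)
    img[128, :] = True
    print(f"estimated FD of a line: {fractal_dimension(img):.2f}")
    ```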

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Selected Asymptotic Methods with Applications to Electromagnetics and Antennas

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book describes and illustrates the application of several asymptotic methods that have proved useful in the authors' research in electromagnetics and antennas. We first define asymptotic approximations and expansions and explain these concepts in detail. We then develop certain prerequisites from complex analysis such as power series, multivalued functions (including the concepts of branch points and branch cuts), and the all-important gamma function. Of particular importance is the idea of analytic continuation (of functions of a single complex variable); our discussions here include some recent, direct applications to antennas and computational electromagnetics. Then, specific methods are discussed. These include integration by parts and the Riemann-Lebesgue lemma, the use of contour integration in conjunction with other methods, techniques related to Laplace's method and Watson's lemma, the asymptotic behavior of certain Fourier sine and cosine transforms, and the Poisson summation formula (including its version for finite sums). Often underutilized in the literature are asymptotic techniques based on the Mellin transform; our treatment of this subject complements the techniques presented in our recent Synthesis Lecture on the exact (not asymptotic) evaluation of integrals. View full abstract»
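
    As one representative result from this toolbox, Watson's lemma links the small-argument expansion of an integrand to the large-parameter expansion of its Laplace-type integral:

    ```latex
    % If f(t) ~ sum_{n>=0} a_n t^n as t -> 0+ (with suitable growth bounds on f),
    % then, as x -> +infinity,
    \int_0^\infty e^{-xt} f(t)\,dt \;\sim\; \sum_{n=0}^{\infty} \frac{a_n\, n!}{x^{\,n+1}} .
    ```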

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Social Semantic Web Mining

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The past ten years have seen a rapid growth in the numbers of people signing up to use Web-based social networks (hundreds of millions of new members are now joining the main services each year) with a large amount of content being shared on these networks (tens of billions of content items are shared each month). With this growth in usage and data being generated, there are many opportunities to discover the knowledge that is often inherent but somewhat hidden in these networks. Web mining techniques are being used to derive this hidden knowledge. In addition, the Semantic Web, including the Linked Data initiative to connect previously disconnected datasets, is making it possible to connect data from across various social spaces through common representations and agreed upon terms for people, content items, etc. In this book, we detail some current research being carried out to semantically represent the implicit and explicit structures on the Social Web, along with the techniques being used to elicit relevant knowledge from these structures, and we present the mechanisms that can be used to intelligently mesh these semantic representations with intelligent knowledge discovery processes. We begin this book with an overview of the origins of the Web, and then show how web intelligence can be derived from a combination of web and Social Web mining. We give an overview of the Social and Semantic Webs, followed by a description of the combined Social Semantic Web (along with some of the possibilities it affords), and the various semantic representation formats for the data created in social networks and on social media sites. Provenance and provenance mining is an important aspect here, especially when data is combined from multiple services. We will expand on the subject of provenance and especially its importance in relation to social data. We will describe extensions to social semantic vocabularies specifically designed for community mining purposes (SIOCM). In the last three chapters, we describe how the combination of web intelligence and social semantic data can be used to derive knowledge from the Social Web, starting at the community level (macro), and then moving through group mining (meso) to user profile mining (micro). Table of Contents: Acknowledgments / Grant Aid / Introduction and the Web / Web Mining / The Social Web / The Semantic Web / The Social Semantic Web / Social Semantic Web Mining / Social Semantic Web Mining of Communities / Social Semantic Web Mining of Groups / Social Semantic Web Mining of Users / Conclusions / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Business Processes: A Database Perspective

    Copyright Year: 2012

    Morgan and Claypool eBooks

    While classic data management focuses on the data itself, research on Business Processes also considers the context in which this data is generated and manipulated, namely the processes, users, and goals that this data serves. This provides analysts with a better perspective of the organizational needs centered around the data. As such, this research is of fundamental importance. Much of the success of database systems in the last decade is due to the beauty and elegance of the relational model and its declarative query languages, combined with a rich spectrum of underlying evaluation and optimization techniques, and efficient implementations. Much like the case for traditional database research, elegant modeling and rich underlying technology are likely to be highly beneficial for the Business Process owners and their users; both can benefit from easy formulation and analysis of the processes. While there have been many important advances in this research in recent years, there is still much to be desired: specifically, there have been many works that focus on a process's behavior (flow), and many that focus on its data, but only very few works have dealt with both. This book surveys the state-of-the-art in a database approach to Business Process modeling and analysis, describes the progress towards a holistic flow-and-data framework for these tasks, and highlights the current gaps and research directions. Table of Contents: Introduction / Modeling / Querying Business Processes / Other Issues / Conclusion View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Easy Path to Convex Analysis and Applications

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Convex optimization has an increasing impact on many areas of mathematics, applied sciences, and practical applications. It is now being taught at many universities and being used by researchers of different fields. As convex analysis is the mathematical foundation for convex optimization, having deep knowledge of convex analysis helps students and researchers apply its tools more effectively. The main goal of this book is to provide an easy access to the most fundamental parts of convex analysis and its applications to optimization. Modern techniques of variational analysis are employed to clarify and simplify some basic proofs in convex analysis and build the theory of generalized differentiation for convex functions and sets in finite dimensions. We also present new applications of convex analysis to location problems in connection with many interesting geometric problems such as the Fermat-Torricelli problem, the Heron problem, the Sylvester problem, and their generalizations. Of course, we do not expect to touch every aspect of convex analysis, but the book consists of sufficient material for a first course on this subject. It can also serve as supplemental reading material for a course on convex optimization and applications. View full abstract»
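
    The Fermat-Torricelli problem mentioned above also shows how convex objectives invite simple iterative schemes. Below is a sketch of the classical Weiszfeld iteration (one standard method, not necessarily the book's treatment): minimize the sum of Euclidean distances to given anchors by repeatedly re-weighting them by inverse distance. It assumes the iterate never lands exactly on an anchor:

    ```python
    # Weiszfeld iteration for the Fermat-Torricelli (geometric median) problem.
    import numpy as np

    def weiszfeld(points: np.ndarray, iters: int = 200) -> np.ndarray:
        x = points.mean(axis=0)             # start at the centroid
        for _ in range(iters):
            d = np.linalg.norm(points - x, axis=1)
            w = 1.0 / np.maximum(d, 1e-12)  # inverse-distance weights
            x = (w[:, None] * points).sum(axis=0) / w.sum()
        return x

    pts = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])   # illustrative anchors
    x_star = weiszfeld(pts)
    print(f"Fermat point ~ {x_star.round(3)}, total distance = "
          f"{np.linalg.norm(pts - x_star, axis=1).sum():.4f}")
    ```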

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Impossibility Results for Distributed Computing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    To understand the power of distributed systems, it is necessary to understand their inherent limitations: what problems cannot be solved in particular systems, or without sufficient resources (such as time or space). This book presents key techniques for proving such impossibility results and applies them to a variety of different problems in a variety of different system models. Insights gained from these results are highlighted, aspects of a problem that make it difficult are isolated, features of an architecture that make it inadequate for solving certain problems efficiently are identified, and different system models are compared. Table of Contents: Acknowledgments / Introduction / Indistinguishability / Shifting and Scaling / Scenario Arguments / Information Theory Arguments / Covering Arguments / Valency Arguments / Combinatorial Arguments / Reductions and Simulations / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Qualitative HCI Research: Going Behind the Scenes

    Copyright Year: 2016

    Morgan and Claypool eBooks

    Human-Computer Interaction (HCI) addresses problems of interaction design: understanding user needs to inform design, delivering novel designs that meet user needs, and evaluating new and existing designs to determine their success in meeting user needs. Qualitative methods have an essential role to play in this enterprise, particularly in understanding user needs and behaviours and evaluating situated use of technology. Qualitative methods allow HCI researchers to ask questions where the answers are more complex and interesting than "true" or "false," and may also be unexpected. In this lecture, we draw on the analogy of making a documentary film to discuss important issues in qualitative HCI research: historically, films were presented as finished products, giving the viewer little insight into the production process; more recently, there has been a trend to go behind the scenes to expose some of the painstaking work that went into creating the final cut. Similarly, in qualitative research, the essential work behind the scenes is rarely discussed. There are many "how to" guides for particular methods, but few texts that start with the purpose of a study and then discuss the important details of how to select a suitable method, how to adapt it to fit the study context, or how to deal with unexpected challenges that arise. We address this gap by presenting a repertoire of qualitative techniques for understanding user needs, practices and experiences with technology for the purpose of informing design. We also discuss practical considerations such as tactics for recruiting participants and ways of getting started when faced with a pile of interview transcripts. Our particular focus is on semi-structured qualitative studies, which occupy a space between ethnography and surveys—typically involving observations, interviews and similar methods for data gathering, and methods of analysis based on systematic coding of data. Just as a documentary team faces challenges that often go unreported when arranging expeditions or interviews and gathering and editing footage within time and budget constraints, so the qualitative research team faces challenges in obtaining ethical clearance, recruiting participants, analysing data, choosing how and what to report, etc. We present illustrative examples drawn from prior experience to bring to life the purpose, planning and practical considerations of doing qualitative studies for interaction design. We include takeaway checklists for planning, conducting, reporting and evaluating semi-structured qualitative studies. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Exploratory Search: Beyond the Query-Response Paradigm

    Copyright Year: 2009

    Morgan and Claypool eBooks

    As information becomes more ubiquitous and the demands that searchers have on search systems grow, there is a need to support search behaviors beyond simple lookup. Information seeking is the process or activity of attempting to obtain information in both human and technological contexts. Exploratory search describes an information-seeking problem context that is open-ended, persistent, and multifaceted, and information-seeking processes that are opportunistic, iterative, and multitactical. Exploratory searchers aim to solve complex problems and develop enhanced mental capacities. Exploratory search systems support this through symbiotic human-machine relationships that provide guidance in exploring unfamiliar information landscapes. Exploratory search has gained prominence in recent years. There is an increased interest from the information retrieval, information science, and human-computer interaction communities in moving beyond the traditional turn-taking interaction model supported by major Web search engines, and toward support for human intelligence amplification and information use. In this lecture, we introduce exploratory search, relate it to relevant extant research, outline the features of exploratory search systems, discuss the evaluation of these systems, and suggest some future directions for supporting exploratory search. Exploratory search is a new frontier in the search domain and is becoming increasingly important in shaping our future world. Table of Contents: Introduction / Defining Exploratory Search / Related Work / Features of Exploratory Search Systems / Evaluation of Exploratory Search Systems / Future Directions and Concluding Remarks View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Theoretical Foundations for Digital Libraries: the 5S (Societies, Scenarios, Spaces, Structures, Streams) Approach

    Copyright Year: 2012

    Morgan and Claypool eBooks

    In 1991, a group of researchers chose the term digital libraries to describe an emerging field of research, development, and practice. Since then, Virginia Tech has had funded research in this area, largely through its Digital Library Research Laboratory. This book is the first in a four book series that reports our key findings and current research investigations. Underlying this book series are six completed dissertations (Gonçalves, Kozievitch, Leidig, Murthy, Shen, Torres), eight dissertations underway, and many masters theses. These reflect our experience with a long string of prototype or production systems developed in the lab, such as CITIDEL, CODER, CTRnet, Ensemble, ETANA, ETD-db, MARIAN, and Open Digital Libraries. There are hundreds of related publications, presentations, tutorials, and reports. We have built upon that work so this book, and the others in the series, will address digital library related needs in many computer science, information science, and library science (e.g., LIS) courses, as well as the requirements of researchers, developers, and practitioners. Much of the early work in the digital library field struck a balance between addressing real-world needs, integrating methods from related areas, and advancing an ever-expanding research agenda. Our work has fit in with these trends, but simultaneously has been driven by a desire to provide a firm conceptual and formal basis for the field. Our aim has been to move from engineering to science. We claim that our 5S (Societies, Scenarios, Spaces, Structures, Streams) framework, discussed in publications dating back to at least 1998, provides a suitable basis. This book introduces 5S, and the key theoretical and formal aspects of the 5S framework. While the 5S framework may be used to describe many types of information systems, and is likely to have even broader utility and appeal, we focus here on digital libraries. Our view of digital libraries is broad, so further generalization should be straightforward. We have connected with related fields, including hypertext/hypermedia, information storage and retrieval, knowledge management, machine learning, multimedia, personal information management, and Web 2.0. Applications have included managing not only publications, but also archaeological information, educational resources, fish images, scientific datasets, and scientific experiments/simulations. Table of Contents: Introduction / Exploration / Mathematical Preliminaries / Minimal Digital Library / Archaeological Digital Libraries / 5S Results: Lemmas, Proofs, and 5SSuite / Glossary / Bibliography / Authors' Biographies / Index View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Community Detection and Mining in Social Media

    Copyright Year: 2010

    Morgan and Claypool eBooks

    The past decade has witnessed the emergence of participatory Web and social media, bringing people together in many creative ways. Millions of users are playing, tagging, working, and socializing online, demonstrating new forms of collaboration, communication, and intelligence that were hardly imaginable just a short time ago. Social media also helps reshape business models, sway opinions and emotions, and opens up numerous possibilities to study human interaction and collective behavior in an unparalleled scale. This lecture, from a data mining perspective, introduces characteristics of social media, reviews representative tasks of computing with social media, and illustrates associated challenges. It introduces basic concepts, presents state-of-the-art algorithms with easy-to-understand examples, and recommends effective evaluation methods. In particular, we discuss graph-based community detection techniques and many important extensions that handle dynamic, heterogeneous networks in social media. We also demonstrate how discovered patterns of communities can be used for social media mining. The concepts, algorithms, and methods presented in this lecture can help harness the power of social media and support building socially-intelligent systems. This book is an accessible introduction to the study of community detection and mining in social media. It is an essential reading for students, researchers, and practitioners in disciplines and applications where social media is a key source of data that piques our curiosity to understand, manage, innovate, and excel. This book is supported by additional materials, including lecture slides, the complete set of figures, key references, some toy data sets used in the book, and the source code of representative algorithms. The readers are encouraged to visit the book website for the latest information. Table of Contents: Social Media and Social Computing / Nodes, Ties, and Influence / Community Detection and Evaluation / Communities in Heterogeneous Networks / Social Media Mining View full abstract»
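
    Many of the graph-based methods alluded to above score a candidate partition with Newman's modularity, Q = (1/2m) Σ_ij [A_ij − k_i k_j/(2m)] δ(c_i, c_j). Here is a from-scratch sketch on a toy graph (two triangles joined by a bridge; the graph and labels are invented for illustration):

    ```python
    # Modularity of a given node-to-community assignment.
    import itertools

    edges = [(0, 1), (0, 2), (1, 2),      # a tight triangle ...
             (3, 4), (3, 5), (4, 5),      # ... another triangle ...
             (2, 3)]                      # ... and one bridge between them

    def modularity(edges, communities):
        m = len(edges)
        deg = {}
        for u, v in edges:
            deg[u] = deg.get(u, 0) + 1
            deg[v] = deg.get(v, 0) + 1
        adj = set(edges) | {(v, u) for u, v in edges}
        q = 0.0
        for u, v in itertools.product(deg, deg):
            if communities[u] == communities[v]:
                a = 1.0 if (u, v) in adj else 0.0
                q += a - deg[u] * deg[v] / (2.0 * m)
        return q / (2.0 * m)

    split = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
    lumped = {n: "A" for n in range(6)}
    print(f"two communities: Q = {modularity(edges, split):.3f}")    # high
    print(f"single community: Q = {modularity(edges, lumped):.3f}")  # ~0
    ```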

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bioinstrumentation

    Copyright Year: 2006

    Morgan and Claypool eBooks

    This short book provides basic information about bioinstrumentation and electric circuit theory. Many biomedical instruments use a transducer or sensor to convert a signal created by the body into an electric signal. Our goal here is to develop expertise in electric circuit theory applied to bioinstrumentation. We begin with a description of variables used in circuit theory, charge, current, voltage, power and energy. Next, Kirchhoff's current and voltage laws are introduced, followed by resistance, simplifications of resistive circuits and voltage and current calculations. Circuit analysis techniques are then presented, followed by inductance and capacitance, and solutions of circuits using the differential equation method. Finally, the operational amplifier and time varying signals are introduced. This lecture is written for a student or researcher or engineer who has completed the first two years of an engineering program (i.e., 3 semesters of calculus and differential equations). A considerable effort has been made to develop the theory in a logical manner—developing special mathematical skills as needed. At the end of the short book is a wide selection of problems, ranging from simple to complex. View full abstract»
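
    As a worked example of the differential-equation method described above: applying Kirchhoff's voltage law to a series RC circuit driven by a step source V_s yields a first-order ODE whose solution is the familiar exponential charging curve:

    ```latex
    RC\,\frac{dv}{dt} + v = V_s,\qquad v(0) = 0
    \;\;\Longrightarrow\;\;
    v(t) = V_s\left(1 - e^{-t/RC}\right),\qquad
    i(t) = \frac{V_s}{R}\, e^{-t/RC}.
    ```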

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    PSpice for Filters and Transmission Lines

    Copyright Year: 2007

    Morgan and Claypool eBooks

    In this book, PSpice for Filters and Transmission Lines, we examine a range of active and passive filters where each design is simulated using the latest Cadence Orcad V10.5 PSpice capture software. These filters cannot match the very high order digital signal processing (DSP) filters considered in PSpice for Digital Signal Processing, but nevertheless these filters have many uses. The active filters considered were designed using Butterworth and Chebychev approximation loss functions rather than using the ‘cookbook approach’ so that the final design will meet a given specification in an exacting manner. Switched-capacitor filter circuits are examined and here we see how useful PSpice/Probe is in demonstrating how these filters, filter, as it were. Two-port networks are discussed as an introduction to transmission lines and, using a series of problems, we demonstrate quarter-wave and single-stub matching. The concept of time domain reflectometry as a fault location tool on transmission lines is then examined. In the last chapter we discuss the technique of importing and exporting speech signals into a PSpice schematic using a tailor-made program, Wav2ascii. This is a novel technique that greatly extends the simulation boundaries of PSpice. Various digital circuits are also examined at the end of this chapter to demonstrate the use of the bus structure and other techniques. View full abstract»
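
    The quarter-wave matching exercises rest on one formula worth having at hand: the input impedance of a lossless line of characteristic impedance Z_1 and length l, which for βl = π/2 inverts the load impedance:

    ```latex
    Z_{\mathrm{in}} = Z_1\,\frac{Z_L + j Z_1 \tan\beta l}{Z_1 + j Z_L \tan\beta l}
    \;\xrightarrow{\ \beta l = \pi/2\ }\; \frac{Z_1^{2}}{Z_L},
    \qquad\text{so choosing } Z_1 = \sqrt{Z_0 Z_L}\ \text{ gives } Z_{\mathrm{in}} = Z_0 .
    ```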

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    On the Efficient Determination of Most Near Neighbors: Horseshoes, Hand Grenades, Web Search and Other Situations When Close is Close Enough

    Copyright Year: 2012

    Morgan and Claypool eBooks

    The time-worn aphorism "close only counts in horseshoes and hand-grenades" is clearly inadequate. Close also counts in golf, shuffleboard, archery, darts, curling, and other games of accuracy in which hitting the precise center of the target isn't to be expected every time, or in which we can expect to be driven from the target by skilled opponents. This lecture is not devoted to sports discussions, but to efficient algorithms for determining pairs of closely related web pages -- and a few other situations in which we have found that inexact matching is good enough; where proximity suffices. We will not, however, attempt to be comprehensive in the investigation of probabilistic algorithms, approximation algorithms, or even techniques for organizing the discovery of nearest neighbors. We are more concerned with finding nearby neighbors; if they are not particularly close by, we are not particularly interested. In thinking of when approximation is sufficient, remember the oft-told joke about two campers sitting around after dinner. They hear noises coming towards them. One of them reaches for a pair of running shoes, and starts to don them. The second then notes that even with running shoes, they cannot hope to outrun a bear, to which the first notes that most likely the bear will be satiated after catching the slower of them. We seek problems in which we don't need to be faster than the bear, just faster than the others fleeing the bear. View full abstract»
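
    One standard way to operationalize "close enough" for web pages is to compare their sets of word k-shingles by Jaccard similarity; production systems approximate this with sketches such as MinHash, but the exact computation below (with made-up page texts) shows the idea:

    ```python
    # Shingle-based near-duplicate detection, exact Jaccard version.
    def shingles(text: str, k: int = 3) -> set:
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if a | b else 1.0

    page1 = "the quick brown fox jumps over the lazy dog"
    page2 = "the quick brown fox leaps over the lazy dog"
    sim = jaccard(shingles(page1), shingles(page2))
    print(f"Jaccard similarity on 3-shingles: {sim:.2f}")   # one changed word
    ```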

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bandwidth Extension of Speech Using Perceptual Criteria

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Bandwidth extension of speech is used in the International Telecommunication Union G.729.1 standard in which the narrowband bitstream is combined with quantized high-band parameters. Although this system produces high-quality wideband speech, the additional bits used to represent the high band can be further reduced. In addition to the algorithm used in the G.729.1 standard, bandwidth extension methods based on spectrum prediction have also been proposed. Although these algorithms do not require additional bits, they perform poorly when the correlation between the low and the high band is weak. In this book, two wideband speech coding algorithms that rely on bandwidth extension are developed. The algorithms operate as wrappers around existing narrowband compression schemes. More specifically, in these algorithms, the low band is encoded using an existing toll-quality narrowband system, whereas the high band is generated using the proposed extension techniques. The first method relies only on transmitted high-band information to generate the wideband speech. The second algorithm uses a constrained minimum mean square error estimator that combines transmitted high-band envelope information with a predictive scheme driven by narrowband features. Both algorithms make use of novel perceptual models based on loudness that determine optimum quantization strategies for wideband recovery and synthesis. Objective and subjective evaluations reveal that the proposed system performs at a lower average bit rate while improving speech quality when compared to other similar algorithms. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Primer on Physical-Layer Network Coding

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The concept of physical-layer network coding (PNC) was proposed in 2006 for application in wireless networks. Since then it has developed into a subfield of communications and networking with a wide following. This book is a primer on PNC. It is the outcome of a set of lecture notes for a course for beginning graduate students at The Chinese University of Hong Kong. The target audience is expected to have some prior background knowledge in communication theory and wireless communications, but not working knowledge at the research level. Indeed, a goal of this book/course is to allow the reader to gain a deeper appreciation of the various nuances of wireless communications and networking by focusing on problems arising from the study of PNC. Specifically, we introduce the tools and techniques needed to solve problems in PNC, and many of these tools and techniques are drawn from the more general disciplines of signal processing, communications, and networking: PNC is used as a pivot to learn about the fundamentals of signal processing techniques and wireless communications in general. We feel that such a problem-centric approach will give the reader a more in-depth understanding of these disciplines and allow him/her to see first-hand how the techniques of these disciplines can be applied to solve real research problems. As a primer, this book does not cover many advanced materials related to PNC. PNC is an active research field and many new results will no doubt be forthcoming in the near future. We believe that this book will provide a good contextual framework for the interpretation of these advanced results should the reader decide to probe further into the field of PNC. View full abstract»
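
    The core PNC observation fits in a few lines: when nodes A and B transmit BPSK symbols simultaneously, the relay can map the superimposed signal straight to the XOR of their bits rather than decoding each bit separately. For bits mapped to ±1, the noiseless sum is ±2 when the bits agree (XOR = 0) and 0 when they differ (XOR = 1). A toy simulation (the threshold and noise level are illustrative):

    ```python
    # Relay-side PNC mapping for BPSK, with additive Gaussian noise.
    import random

    def relay_xor(bit_a: int, bit_b: int, noise: float = 0.0) -> int:
        sym = (1 - 2 * bit_a) + (1 - 2 * bit_b) + noise  # superposed signal
        return 0 if abs(sym) > 1.0 else 1                # threshold detector

    random.seed(1)
    trials, errors = 10_000, 0
    for _ in range(trials):
        a, b = random.randint(0, 1), random.randint(0, 1)
        if relay_xor(a, b, noise=random.gauss(0, 0.3)) != a ^ b:
            errors += 1
    print(f"relay XOR error rate at this (illustrative) noise level: "
          f"{errors / trials:.4f}")
    ```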

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Action Programming Languages

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Artificial systems that think and behave intelligently are one of the most exciting and challenging goals of Artificial Intelligence. Action Programming is the art and science of devising high-level control strategies for autonomous systems which employ a mental model of their environment and which reason about their actions as a means to achieve their goals. Applications of this programming paradigm include autonomous software agents, mobile robots with high-level reasoning capabilities, and General Game Playing. These lecture notes give an in-depth introduction to the current state-of-the-art in action programming. The main topics are knowledge representation for actions, procedural action programming, planning, agent logic programs, and reactive, behavior-based agents. The only prerequisite for understanding the material in these lecture notes is some general programming experience and basic knowledge of classical first-order logic. Table of Contents: Introduction / Mathematical Preliminaries / Procedural Action Programs / Action Programs and Planning / Declarative Action Programs / Reactive Action Programs / Suggested Further Reading View full abstract»
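
    To make "actions with declarative preconditions and effects" concrete, here is a minimal STRIPS-flavored sketch (generic textbook material, not a particular language from these notes; the fluent and action names are invented):

    ```python
    # A two-block world: an action applies only if its preconditions hold.
    state = {"on(a,table)", "on(b,table)", "clear(a)", "clear(b)"}

    actions = {
        "stack(a,b)": {
            "pre": {"clear(a)", "clear(b)", "on(a,table)"},
            "add": {"on(a,b)"},
            "del": {"clear(b)", "on(a,table)"},
        },
    }

    def apply(state: set, name: str) -> set:
        act = actions[name]
        if not act["pre"] <= state:
            raise ValueError(f"preconditions of {name} not satisfied")
        return (state - act["del"]) | act["add"]

    goal = {"on(a,b)"}
    state = apply(state, "stack(a,b)")
    print("goal reached:", goal <= state)   # -> True
    ```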

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Automated Metadata in Multimedia Information Systems

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Improvements in network bandwidth along with dramatic drops in digital storage and processing costs have resulted in the explosive growth of multimedia (combinations of text, image, audio, and video) resources on the Internet and in digital repositories. A suite of computer technologies delivering speech, image, and natural language understanding can automatically derive descriptive metadata for such resources. Difficulties for end users ensue, however, with the tremendous volume and varying quality of automated metadata for multimedia information systems. This lecture surveys automatic metadata creation methods for dealing with multimedia information resources, using broadcast news, documentaries, and oral histories as examples. Strategies for improving the utility of such metadata are discussed, including computationally intensive approaches, leveraging multimodal redundancy, folding in context, and leaving precision-recall tradeoffs under user control. Interfaces building from automatically generated metadata are presented, illustrating the use of video surrogates in multimedia information systems. Traditional information retrieval evaluation is discussed through the annual National Institute of Standards and Technology TRECVID forum, with experiments on exploratory search extending the discussion beyond fact-finding to broader, longer term search activities of learning, analysis, synthesis, and discovery. Table of Contents: Evolution of Multimedia Information Systems: 1990-2008 / Survey of Automatic Metadata Creation Methods / Refinement of Automatic Metadata / Multimedia Surrogates / End-User Utility for Metadata and Surrogates: Effectiveness, Efficiency, and Satisfaction View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Surface Computing and Collaborative Analysis Work

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Large surface computing devices (wall-mounted or tabletop) with touch interfaces and their application to collaborative data analysis, an increasingly important and prevalent activity, is the primary topic of this book. Our goals are to outline the fundamentals of surface computing (a still maturing technology), review relevant work on collaborative data analysis, describe frameworks for understanding collaborative processes, and provide a better understanding of the opportunities for research and development. We describe surfaces as display technologies with which people can interact directly, and emphasize how interaction design changes when designing for large surfaces. We review efforts to use large displays, surfaces or mixed display environments to enable collaborative analytic activity. Collaborative analysis is important in many domains, but to provide concrete examples and a specific focus, we frequently consider analysis work in the security domain, and in particular the challenges security personnel face in securing networks from attackers, and intelligence analysts encounter when analyzing intelligence data. Both of these activities are becoming increasingly collaborative endeavors, and there are huge opportunities for improving collaboration by leveraging surface computing. This work highlights for interaction designers and software developers the particular challenges and opportunities presented by interaction with surfaces. We have reviewed hundreds of recent research papers, and report on advancements in the fields of surface-enabled collaborative analytic work, interactive techniques for surface technologies, and useful theory that can provide direction to interaction design work. We also offer insight into issues that arise when developing applications for multi-touch surfaces derived from our own experiences creating collaborative applications. We present these insights at a level appropriate for all members of the software design and development team. Table of Contents: List of Figures / Acknowledgments / Figure Credits / Purpose and Direction / Surface Technologies and Collaborative Analysis Systems / Interacting with Surface Technologies / Collaborative Work Enabled by Surfaces / The Theory and the Design of Surface Applications / The Development of Surface Applications / Concluding Comments / Bibliography / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Scalability Challenges in Web Search Engines

    Copyright Year: 2015

    Morgan and Claypool eBooks

    In this book, we aim to provide a fairly comprehensive overview of the scalability and efficiency challenges in large-scale web search engines. More specifically, we cover the issues involved in the design of three separate systems that are commonly available in every web-scale search engine: web crawling, indexing, and query processing systems. We present the performance challenges encountered in these systems and review a wide range of design alternatives employed as solutions to these challenges, specifically focusing on algorithmic and architectural optimizations. We discuss the available optimizations at different computational granularities, ranging from a single computer node to a collection of data centers. We provide some hints to both the practitioners and theoreticians involved in the field about the way large-scale web search engines operate and the adopted design choices. Moreover, we survey the efficiency literature, providing pointers to a large number of relatively important research papers. Finally, we discuss some open research problems in the context of search engine efficiency. View full abstract»
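
    At the heart of the indexing and query processing systems discussed here sits the inverted index: a mapping from each term to a postings list of document identifiers, with conjunctive (AND) queries answered by intersecting postings, shortest list first. A toy version with made-up documents:

    ```python
    # Build a tiny inverted index and answer conjunctive queries.
    from collections import defaultdict

    docs = {
        1: "web crawling at scale",
        2: "index compression for web search",
        3: "query processing in search engines",
    }

    index = defaultdict(list)             # term -> sorted list of doc ids
    for doc_id in sorted(docs):
        for term in set(docs[doc_id].split()):
            index[term].append(doc_id)

    def conjunctive(*terms):
        """AND query: intersect postings, starting from the shortest list."""
        postings = sorted((index.get(t, []) for t in terms), key=len)
        result = set(postings[0])
        for p in postings[1:]:
            result &= set(p)
        return sorted(result)

    print(conjunctive("web", "search"))   # -> [2]
    print(conjunctive("search"))          # -> [2, 3]
    ```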

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Tissue Engineering of Temporomandibular Joint Cartilage

    Copyright Year: 2009

    Morgan and Claypool eBooks

    The temporomandibular joint (TMJ) is a site of intense morbidity for millions of people, especially young, pre-menopausal women. Central to TMJ afflictions are the cartilaginous tissues of the TMJ, especially those of the disc and condylar cartilage, which play crucial roles in normal function of this unusual joint. Damage or disease to these tissues significantly impacts a patient's quality of life by making common activities such as talking and eating difficult and painful. Unfortunately, these tissues have limited ability to heal, necessitating the development of treatments for repair or replacement. The burgeoning field of tissue engineering holds promise that replacement tissues can be constructed in the laboratory to recapitulate the functional requirements of native tissues. This book outlines the biomechanical, biochemical, and anatomical characteristics of the disc and condylar cartilage, and also provides a historical perspective of past and current TMJ treatments and previous tissue engineering efforts. This book was written to serve as a reference for researchers seeking to learn about the TMJ, for undergraduate and graduate level courses, and as a compendium of TMJ tissue engineering design criteria. Table of Contents: The Temporomandibular Joint / Fibrocartilage of the TMJ Disc / Cartilage of the Mandibular Condyle / Tissue Engineering of the Disc / Tissue Engineering of the Mandibular Condyle / Current Perspectives View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Testing iOS Apps with HadoopUnit: Rapid Distributed GUI Testing

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Smartphone users have come to expect high-quality apps. This has increased the importance of software testing in mobile software development. Unfortunately, testing apps—particularly the GUI—can be very time-consuming. Exercising every user interface element and verifying transitions between different views of the app under test quickly becomes problematic. For example, execution of iOS GUI test suites using Apple’s UI Automation framework can take an hour or more if the app’s interface is complicated. The longer it takes to run a test, the less frequently the test can be run, which in turn reduces software quality. This book describes how to accelerate the testing process for iOS apps using HadoopUnit, a distributed test execution environment that leverages the parallelism inherent in the Hadoop platform. HadoopUnit was previously used to run unit and system tests in the cloud. It has been modified to perform GUI testing of iOS apps on a small-scale cluster—a modest computing infrastructure available to almost every developer. Experimental results have shown that distributed test execution with HadoopUnit can significantly outperform the test execution on a single machine, even if the size of the cluster used for the execution is as small as two nodes. This means that the approach described in this book could be adopted without a huge investment in IT resources. HadoopUnit is a cost-effective solution for reducing lengthy test execution times of system-level GUI testing of iOS apps. View full abstract»
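
    The underlying idea is that independent GUI test suites can run concurrently on separate workers. A minimal conceptual sketch in Python, using local threads in place of Hadoop map tasks (the commands are hypothetical placeholders, not HadoopUnit's actual interface):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical test suite commands; with HadoopUnit each would become a map
# task on a cluster node rather than a local worker thread.
TEST_SUITES = [["echo", "run UI test suite A"],
               ["echo", "run UI test suite B"]]

def run_suite(cmd):
    # Suites are independent, so they can execute concurrently.
    result = subprocess.run(cmd, capture_output=True, text=True)
    return cmd[-1], result.returncode, result.stdout.strip()

with ThreadPoolExecutor(max_workers=2) as pool:
    for name, code, output in pool.map(run_suite, TEST_SUITES):
        print(name, code, output)
```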

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Pragmatic Power

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Pragmatic Power is focused on just three aspects of the AC electrical power system that supplies and moves the vast majority of electrical energy nearly everywhere in the world: three-phase power systems, transformers, and induction motors. The reader needs to have had an introduction to electrical circuits and AC power, although the text begins with a review of the basics of AC power. Balanced three-phase systems are studied by developing their single-phase equivalents. The study includes a look at how the cost of "power" is affected by reactive power and power factor. Transformers are considered as a circuit element in a power system, one that can be reasonably modeled to simplify system analysis. Induction motors are presented as the most common way to change electrical energy into rotational energy. Examples include the correct selection of an induction motor for a particular rotating load. All of these topics include completely worked examples to aid the reader in understanding how to apply what has been learned. This short lecture book will be of use to students at any level of engineering, not just electrical, because it is intended for the practicing engineer or scientist looking for a practical, applied introduction to AC power systems. The author's "pragmatic" and applied style gives a unique and helpful "nonidealistic, practical, and opinionated" introduction to the topic. Table of Contents: Three-Phase Power: 3 > 3 x 1 / Transformers: Edison Lost / Induction Motors: Just One Moving Part View full abstract»
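
    A worked example of the kind of balanced three-phase calculation the book teaches (the numbers here are invented for illustration, not taken from the book):

```latex
% Balanced three-phase load, line-to-line voltage V_{LL} = 480\,\mathrm{V},
% line current I_L = 100\,\mathrm{A}, power factor \cos\theta = 0.8 lagging.
S = \sqrt{3}\,V_{LL}\,I_L = \sqrt{3}\times 480 \times 100 \approx 83.1\ \mathrm{kVA},
\qquad P = S\cos\theta \approx 66.5\ \mathrm{kW},
\qquad Q = S\sin\theta \approx 49.9\ \mathrm{kvar}
```

    Reducing the reactive power Q (raising the power factor toward unity) lowers the apparent power S, and hence the line current, needed to deliver the same real power P; this is why a poor power factor raises the cost of "power".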

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing and Evaluating Usable Technology in Industrial Research: Three Case Studies

    Copyright Year: 2010

    Morgan and Claypool eBooks

    This book is about HCI research in an industrial research setting. It is based on the experiences of two researchers at the IBM T. J. Watson Research Center. Over the last two decades, Drs. John and Clare-Marie Karat have conducted HCI research to create innovative usable technology for users across a variety of domains. We begin the book by introducing the reader to the context of industrial research as well as a set of common themes or guidelines to consider in conducting HCI research in practice. Then case study examples of HCI approaches to the design and evaluation of usable solutions for people are presented and discussed in three domain areas: conversational speech technologies, personalization in eCommerce, and security and privacy policy management technologies. In each of the case studies, the authors illustrate and discuss examples of HCI approaches to design and evaluation that worked well and those that did not. They discuss what was learned over time about different HCI methods in practice, and changes that were made to the HCI tools used over time. The Karats discuss trade-offs and issues related to time, resources, and money and the value derived from different HCI methods in practice. These decisions are ones that need to be made regularly in the industrial sector. Similarities and differences with the types of decisions made in this regard in academia will be discussed. The authors then use the context of the three case studies in the three research domains to draw insights and conclusions about the themes that were introduced in the beginning of the book. The Karats conclude with their perspective about the future of HCI industrial research. Table of Contents: Introduction: Themes and Structure of the Book / Case Study 1: Conversational Speech Technologies: Automatic Speech Recognition (ASR) / Case Study 2: Personalization in eCommerce / Case Study 3: Security and Privacy Policy Management Technologies / Insights and Conclusions / The Future of Industrial HCI Research View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Intelligent Autonomous Robotics: A Robot Soccer Case Study

    Copyright Year: 2007

    Morgan and Claypool eBooks

    Robotics technology has recently advanced to the point of being widely accessible for relatively low-budget research, as well as for graduate, undergraduate, and even secondary and primary school education. This lecture provides an example of how to productively use a cutting-edge advanced robotics platform for education and research by providing a detailed case study with the Sony AIBO robot, a vision-based legged robot. The case study used for this lecture is the UT Austin Villa RoboCup Four-Legged Team. This lecture describes both the development process and the technical details of its end result. The main contributions of this lecture are (i) a roadmap for new classes and research groups interested in intelligent autonomous robotics who are starting from scratch with a new robot, and (ii) documentation of the algorithms behind our own approach on the AIBOs with the goal of making them accessible for use on other vision-based and/or legged robot platforms. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Control Grid Motion Estimation for Efficient Application of Optical Flow

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Motion estimation is a long-standing cornerstone of image and video processing. Most notably, motion estimation serves as the foundation for many of today's ubiquitous video coding standards including H.264. Motion estimators also play key roles in countless other applications that serve the consumer, industrial, biomedical, and military sectors. Of the many available motion estimation techniques, optical flow is widely regarded as most flexible. The flexibility offered by optical flow is particularly useful for complex registration and interpolation problems, but comes at a considerable computational expense. As the volume and dimensionality of data that motion estimators are applied to continue to grow, that expense becomes more and more costly. Control grid motion estimators based on optical flow can accomplish motion estimation with flexibility similar to pure optical flow, but at a fraction of the computational expense. Control grid methods also offer the added benefit of representing motion far more compactly than pure optical flow. This booklet explores control grid motion estimation and provides implementations of the approach that apply to data of multiple dimensionalities. Important current applications of control grid methods including registration and interpolation are also developed. Table of Contents: Introduction / Control Grid Interpolation (CGI) / Application of CGI to Registration Problems / Application of CGI to Interpolation Problems / Discussion and Conclusions View full abstract»
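
    The compactness comes from estimating motion only at grid nodes and interpolating everywhere else. A minimal sketch (not the book's implementation) of bilinear interpolation of motion vectors inside one control-grid cell:

```python
import numpy as np

def interp_motion(corner_mv, u, v):
    """Bilinear interpolation of a motion vector inside one control-grid cell.

    corner_mv: 4x2 array of (dx, dy) at the corners, ordered TL, TR, BL, BR;
    (u, v) in [0, 1] x [0, 1] is the normalized position inside the cell.
    """
    tl, tr, bl, br = np.asarray(corner_mv, dtype=float)
    top = (1 - u) * tl + u * tr
    bottom = (1 - u) * bl + u * br
    return (1 - v) * top + v * bottom

# Motion varies smoothly across the cell from just four node vectors.
print(interp_motion([[1, 0], [3, 0], [1, 2], [3, 2]], 0.5, 0.5))  # -> [2. 1.]
```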

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Vision-Based Interaction

    Copyright Year: 2013

    Morgan and Claypool eBooks

    In its early years, the field of computer vision was largely motivated by researchers seeking computational models of biological vision and solutions to practical problems in manufacturing, defense, and medicine. For the past two decades or so, there has been an increasing interest in computer vision as an input modality in the context of human-computer interaction. Such vision-based interaction can endow interactive systems with visual capabilities similar to those important to human-human interaction, in order to perceive non-verbal cues and incorporate this information in applications such as interactive gaming, visualization, art installations, intelligent agent interaction, and various kinds of command and control tasks. Enabling this kind of rich, visual and multimodal interaction requires interactive-time solutions to problems such as detecting and recognizing faces and facial expressions, determining a person's direction of gaze and focus of attention, tracking movement of the body, and recognizing various kinds of gestures. In building technologies for vision-based interaction, there are choices to be made as to the range of possible sensors employed (e.g., single camera, stereo rig, depth camera), the precision and granularity of the desired outputs, the mobility of the solution, usability issues, etc. Practical considerations dictate that there is not a one-size-fits-all solution to the variety of interaction scenarios; however, there are principles and methodological approaches common to a wide range of problems in the domain. While new sensors such as the Microsoft Kinect are having a major influence on the research and practice of vision-based interaction in various settings, they are just a starting point for continued progress in the area. In this book, we discuss the landscape of history, opportunities, and challenges in this area of vision-based interaction; we review the state-of-the-art and seminal works in detecting and recognizing the human body and its components; we explore both static and dynamic approaches to "looking at people" vision problems; and we place the computer vision work in the context of other modalities and multimodal applications. Readers should gain a thorough understanding of current and future possibilities of computer vision technologies in the context of human-computer interaction. View full abstract»
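
    As a small illustration of one such building block, face detection with OpenCV's stock Haar cascade (the input frame is hypothetical, and this is not code from the book):

```python
import cv2

# Classic Haar-cascade face detector shipped with OpenCV; detecting faces is
# one of the basic "looking at people" capabilities interfaces build on.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(image_path):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Returns a list of (x, y, w, h) face bounding boxes.
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

print(detect_faces("frame.png"))  # "frame.png" is a hypothetical input frame
```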

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Designing for Gesture and Tangible Interaction

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Interactive technology is increasingly integrated with physical objects that do not have a traditional keyboard and mouse style of interaction, and many do not even have a display. These objects require new approaches to interaction design, referred to as post-WIMP (Windows, Icons, Menus, and Pointer) or as embodied interaction design.

    This book provides an overview of the design opportunities and issues associated with two embodied interaction modalities that allow us to leave the traditional keyboard behind: tangible and gesture interaction. We explore the issues in designing for this new age of interaction by highlighting the significance and contexts for these modalities. We explore the design of tangible interaction with a reconceptualization of the traditional keyboard as a Tangible Keyboard, and the design of interactive three-dimensional (3D) models as Tangible Models. We explore the design of gesture interaction through the design of gesture-based commands for a walk-up-and-use information display, and through the design of a gesture-based dialogue for the willful marionette. We conclude with design principles for tangible and gesture interaction and a call for research on the cognitive effects of these modalities. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Neural Interfacing: Forging the Human-Machine Connection

    Copyright Year: 2008

    Morgan and Claypool eBooks

    In the past 50 years there has been an explosion of interest in the development of technologies whose end goal is to connect the human brain and/or nervous system directly to computers. Once the subject of science fiction, the technologies necessary to accomplish this goal are rapidly becoming reality. In laboratories around the globe, research is being undertaken to restore function to the physically disabled, to replace areas of the brain damaged by disease or trauma and to augment human abilities. Building neural interfaces and neuro-prosthetics relies on a diverse array of disciplines such as neuroscience, engineering, medicine and microfabrication just to name a few. This book presents a short history of neural interfacing (N.I.) research and introduces the reader to some of the current efforts to develop neural prostheses. The book is intended as an introduction for the college freshman or others wishing to learn more about the field. A resource guide is included for students along with a list of laboratories conducting N.I. research and universities with N.I. related tracks of study. Table of Contents: Neural Interfaces Past and Present / Current Neuroprosthesis Research / Conclusion / Resources for Students View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Image-Based Modeling of Plants and Trees

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Plants and trees are among the most complex natural objects. Much work has been done attempting to model them, with varying degrees of success. In this book, we review the various approaches in computer graphics, which we categorize as rule-based, image-based, and sketch-based methods. We describe our approaches for modeling plants and trees using images. Image-based approaches have the distinct advantage that the resulting model inherits the realistic shape and complexity of a real plant or tree. We use different techniques for modeling plants (with relatively large leaves) and trees (with relatively small leaves). With plants, we model each leaf from images, while for trees, the leaves are only approximated due to their small size and large number. Both techniques start with the same initial step of structure from motion on multiple images of the plant or tree that is to be modeled. For our plant modeling system, because we need to model the individual leaves, these leaves need to be segmented out from the images. We designed our plant modeling system to be interactive, automating the process of shape recovery while relying on the user to provide simple hints on segmentation. Segmentation is performed in both image and 3D spaces, allowing the user to easily visualize its effect immediately. Using the segmented image and 3D data, the geometry of each leaf is then automatically recovered from the multiple views by fitting a deformable leaf model. Our system also allows the user to easily reconstruct branches in a similar manner. To model trees, because of the large leaf count, small image footprint, and widespread occlusions, we do not model the leaves exactly as we do for plants. Instead, we populate the tree with leaf replicas from segmented source images to reconstruct the overall tree shape. In addition, we use the shape patterns of visible branches to predict those of obscured branches. As a result, we are able to design our tree modeling system so as to minimize user intervention. We also handle the special case of modeling a tree from only a single image. Here, the user is required to draw strokes on the image to indicate the tree crown (so that the leaf region is approximately known) and to refine the recovery of branches. As before, we concatenate the shape patterns from a library to generate the 3D shape. To substantiate the effectiveness of our systems, we show realistic reconstructions of a variety of plants and trees from images. Finally, we offer our thoughts on improving our systems and on the remaining challenges associated with plant and tree modeling. Table of Contents: Introduction / Review of Plant and Tree Modeling Techniques / Image-Based Technique for Modeling Plants / Image-Based Technique for Modeling Trees / Single Image Tree Modeling / Summary and Concluding Remarks / Acknowledgments View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    GPU-Based Techniques for Global Illumination Effects

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book presents techniques to render photo-realistic images by programming the Graphics Processing Unit (GPU). We discuss effects such as mirror reflections, refractions, caustics, diffuse or glossy indirect illumination, radiosity, single or multiple scattering in participating media, tone reproduction, glow, and depth of field. The book targets game developers, graphics programmers, and also students with some basic understanding of computer graphics algorithms, rendering APIs like Direct3D or OpenGL, and shader programming. In order to make the book self-contained, the most important concepts of local illumination and global illumination rendering, graphics hardware, and Direct3D/HLSL programming are reviewed in the first chapters. After these introductory chapters we warm up with simple methods including shadow and environment mapping, then we move on toward advanced concepts aiming at global illumination rendering. Since it would have been impossible to give a rigorous review of all approaches proposed in this field, we go into the details of just a few methods solving each particular global illumination effect. However, a short discussion of the state of the art and links to the bibliography are also provided to refer the interested reader to techniques that are not detailed in this book. The implementation of the selected methods is also presented in HLSL, and we discuss their observed performance, merits, and disadvantages. In the last chapter, we also review how these techniques can be integrated in an advanced game engine and present case studies of their exploitation in games. Having gone through this book, the reader will have an overview of the state of the art, will be able to apply and improve these techniques, and most importantly, will be capable of developing brand new GPU algorithms. Table of Contents: Global Illumination Rendering / Local Illumination Rendering Pipeline of GPUs / Programming and Controlling GPUs / Simple Improvements of the Local Illumination Model / Ray Casting on the GPU / Specular Effects with Rasterization / Diffuse and Glossy Indirect Illumination / Pre-computation Aided Global Illumination / Participating Media Rendering / Fake Global Illumination / Postprocessing Effects / Integrating GI Effects in Games and Virtual Reality Systems / Bibliography View full abstract»
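
    One concrete formula behind the simpler warm-up effects: environment mapping approximates a mirror by looking up the reflected view direction in a precomputed map (standard reflection identity, stated here for reference rather than quoted from the book):

```latex
% For a unit surface normal N and a unit incident view direction V
% (pointing toward the surface), the mirror-reflected direction that
% indexes the environment map is
R = V - 2\,(N \cdot V)\,N
```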

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantics Empowered Web 3.0: Managing Enterprise, Social, Sensor, and Cloud-based Data and Services for Advanced Applications

    Copyright Year: 2012

    Morgan and Claypool eBooks

    After the traditional document-centric Web 1.0 and user-generated content focused Web 2.0, Web 3.0 has become a repository of an ever growing variety of Web resources that include data and services associated with enterprises, social networks, sensors, cloud, as well as mobile and other devices that constitute the Internet of Things. These pose unprecedented challenges in terms of heterogeneity (variety), scale (volume), and continuous changes (velocity), as well as present corresponding opportunities if they can be exploited. Just as semantics has played a critical role in dealing with data heterogeneity in the past to provide interoperability and integration, it is playing an even more critical role in dealing with the challenges and helping users and applications exploit all forms of Web 3.0 data. This book presents a unified approach to harness and exploit all forms of contemporary Web resources using the core principle of associating meaning with data through conceptual or domain models and semantic descriptions, including annotations, and through advanced semantic techniques for search, integration, and analysis. It discusses the use of Semantic Web standards and techniques when appropriate, but also advocates the use of lighter-weight, easier-to-use, and more scalable options when they are more suitable. The authors' extensive experience spanning research and prototypes to development of operational applications and commercial technologies and products guide the treatment of the material. Table of Contents: Role of Semantics and Metadata / Types and Models of Semantics / Annotation – Adding Semantics to Data / Semantics for Enterprise Data / Semantics for Services / Semantics for Sensor Data / Semantics for Social Data / Semantics for Cloud Computing / Semantics for Advanced Applications View full abstract»
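
    A minimal sketch of the core idea of attaching machine-readable meaning to data, using the rdflib Python library with a hypothetical vocabulary and a made-up sensor reading (illustrative only; the book works with Semantic Web standards and techniques more broadly):

```python
from rdflib import Graph, Literal, Namespace, URIRef

# Hypothetical vocabulary and sensor; the point is that the annotations make
# the data's meaning explicit and queryable, independent of its source.
EX = Namespace("http://example.org/vocab#")
g = Graph()
sensor = URIRef("http://example.org/sensor/42")
g.add((sensor, EX.observes, EX.Temperature))
g.add((sensor, EX.hasValue, Literal(21.5)))
g.add((sensor, EX.locatedIn, EX.Room101))
print(g.serialize(format="turtle"))
```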

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Joint Source Channel Coding Using Arithmetic Codes

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Based on the encoding process, arithmetic codes can be viewed as tree codes and current proposals for decoding arithmetic codes with forbidden symbols belong to sequential decoding algorithms and their variants. In this monograph, we propose a new way of looking at arithmetic codes with forbidden symbols. If a limit is imposed on the maximum value of a key parameter in the encoder, this modified arithmetic encoder can also be modeled as a finite state machine and the code generated can be treated as a variable-length trellis code. The number of states used can be reduced and techniques used for decoding convolutional codes, such as the list Viterbi decoding algorithm, can be applied directly on the trellis. The finite state machine interpretation can be easily migrated to the Markov source case. We can encode Markov sources without considering the conditional probabilities, while using the list Viterbi decoding algorithm which utilizes the conditional probabilities. We can also use context-based arithmetic coding to exploit the conditional probabilities of the Markov source and apply a finite state machine interpretation to this problem. The finite state machine interpretation also allows us to more systematically understand arithmetic codes with forbidden symbols. It allows us to find the partial distance spectrum of arithmetic codes with forbidden symbols. We also propose arithmetic codes with memories, which use high memory but low implementation precision. The low implementation precision results in a state machine with less complexity. The introduced input memories allow us to switch the probability functions used for arithmetic coding. Combining these two methods gives us a huge parameter space of the arithmetic codes with forbidden symbols. Hence we can choose codes with better distance properties while maintaining the encoding efficiency and decoding complexity. A construction and search method is proposed and simulation results show that we can achieve a performance similar to that of turbo codes when we apply this approach to rate 2/3 arithmetic codes. Table of Contents: Introduction / Arithmetic Codes / Arithmetic Codes with Forbidden Symbols / Distance Property and Code Construction / Conclusion View full abstract»
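
    The forbidden-symbol idea can be sketched in a few lines: reserve a fraction of every coding interval for a symbol the encoder never emits, so a channel error eventually drives the decoder into that region and is detected. A minimal floating-point sketch (a real coder uses integer renormalization; this is not the authors' implementation):

```python
def encode(bits, p0=0.5, eps=0.1):
    """Arithmetic-encode a bit string, reserving a forbidden region of size eps.

    A fraction eps of every interval belongs to a symbol that is never emitted;
    corrupted data eventually lands the decoder there, flagging the error.
    """
    low, high = 0.0, 1.0
    for b in bits:
        usable = (high - low) * (1.0 - eps)   # top eps of the interval stays forbidden
        split = low + usable * p0
        low, high = (low, split) if b == 0 else (split, low + usable)
    return (low + high) / 2                   # any value in the final interval encodes the string

print(encode([0, 1, 1, 0]))
```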

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Noise-Resilient Computing

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Noise abatement is the key problem of small-scale circuit design. New computational paradigms are needed -- as these circuits shrink, they become very vulnerable to noise and soft errors. In this lecture, we present a probabilistic computation framework for improving the resiliency of logic gates and circuits under random conditions induced by voltage or current fluctuation. Among many probabilistic techniques for modeling such devices, only a few models satisfy the requirements of efficient hardware implementation -- specifically, Boltzmann machines and Markov Random Field (MRF) models. These models have similar built-in noise-immunity characteristics based on feedback mechanisms. In probabilistic models, the values 0 and 1 of logic functions are replaced by degrees of beliefs that these values occur. An appropriate metric for degree of belief is probability. We discuss various approaches for noise-resilient logic gate design, and propose a novel design taxonomy based on implementation of the MRF model by a new type of binary decision diagram (BDD), called a cyclic BDD. In this approach, logic gates and circuits are designed using 2-to-1 bi-directional switches. Such circuits are often modeled using Shannon expansions with the corresponding graph-based implementation, BDDs. Simulation experiments are reported to show the noise immunity of the proposed structures. Audiences who may benefit from this lecture include graduate students taking classes on advanced computing device design, and academic and industrial researchers. Table of Contents: Introduction to probabilistic computation models / Nanoscale circuits and fluctuation problems / Estimators and Metrics / MRF Models of Logic Gates / Neuromorphic models / Noise-tolerance via error correcting / Conclusion and future work View full abstract»
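
    A toy illustration of the underlying shift from Boolean values to degrees of belief, assuming independent inputs (this is not the book's MRF formulation):

```python
def nand_belief(pa, pb):
    """Probability that a NAND output is 1, given independent input beliefs.

    In probabilistic gate models, the 0/1 signal values are replaced by the
    probability that the value is 1.
    """
    return 1.0 - pa * pb

# Clean inputs (certainly 1, 1) vs. noisy inputs.
print(nand_belief(1.0, 1.0))   # 0.0  -> output is confidently 0
print(nand_belief(0.9, 0.8))   # 0.28 -> noise leaves residual uncertainty
```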

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Sensory Organ Replacement and Repair

    Copyright Year: 2006

    Morgan and Claypool eBooks

    The senses of human hearing and sight are often taken for granted by many individuals until they are lost or adversely affected. Millions of individuals suffer from partial or total hearing loss and millions of others have impaired vision. The technologies associated with augmenting these two human senses range from simple hearing aids to complex cochlear implants, and from (now commonplace) intraocular lenses to complex artificial corneas. Both human hearing and human sight are described in detail, along with the associated array of technologies. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Interactive Shape Design

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Providing an intuitive modeling system, which would enable us to communicate about any free-form shape we have in mind at least as quickly as with real-world tools, is one of the main challenges of digital shape design. The user should ideally be able to create, deform, and progressively add details to a shape, without being aware of the underlying mathematical representation or being tied to any constraint on the geometrical or topological nature of the model. This book presents the field of interactive shape design from this perspective. Since interactively creating a shape builds on the human ability to model by gesture, we note that the recent advances in interactive shape design can be classified as those that rely on sculpting as opposed to sketching metaphors. Our synthetic presentation of these strategies enables us to compare the different families of solutions, discuss open issues, and identify directions for future research. Table of Contents: Introduction / Sculpting Metaphors / Sketching Systems / Future Directions: Modeling by Gesture View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Transforming Technologies to Manage Our Information: The Future of Personal Information Management, Part II

    Copyright Year: 2013

    Morgan and Claypool eBooks

    With its theme, "Our Information, Always and Forever," Part I of this book covers the basics of personal information management (PIM) including six essential activities of PIM and six (different) ways in which information can be personal to us. Part I then goes on to explore key issues that arise in the "great migration" of our information onto the Web and into a myriad of mobile devices. Part II provides a more focused look at technologies for managing information that promise to profoundly alter our practices of PIM and, through these practices, the way we lead our lives. Part II is in five chapters:

    - Chapter 5. Technologies of Input and Output. Technologies in support of gesture, touch, voice, and even eye movements combine to support a more natural user interface (NUI). Technologies of output include glasses and "watch" watches. Output will also increasingly be animated with options to "zoom".
    - Chapter 6. Technologies to Save Our Information. We can opt for "life logs" to record our experiences with increasing fidelity. What will we use these logs for? And what isn't recorded that should be?
    - Chapter 7. Technologies to Search Our Information. The potential for personalized search is enormous and mostly yet to be realized. Persistent searches, situated in our information landscape, will allow us to maintain a diversity of projects and areas of interest without a need to continually switch from one to another to handle incoming information.
    - Chapter 8. Technologies to Structure Our Information. Structure is key if we are to keep, find, and make effective use of our information. But how best to structure? And how best to share structured information between the applications we use, with other people, and also with ourselves over time? What lessons can we draw from the failures and successes in web-based efforts to share structure?
    - Chapter 9. PIM Transformed and Transforming: Stories from the Past, Present and Future. Part II concludes with a comparison between Licklider's world of information in 1957 and our own world of information today. And then we consider what the world of information is likely to look like in 2057. Licklider estimated that he spent 85% of his "thinking time" in activities that were clerical and mechanical and might (someday) be delegated to the computer. What percentage of our own time is spent with the clerical and mechanical? What about in 2057? View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Webometrics: Quantitative Web Research for the Social Sciences

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Webometrics is concerned with measuring aspects of the web: web sites, web pages, parts of web pages, words in web pages, hyperlinks, web search engine results. The importance of the web itself as a communication medium and for hosting an increasingly wide array of documents, from journal articles to holiday brochures, needs no introduction. Given this huge and easily accessible source of information, there are limitless possibilities for measuring or counting on a huge scale (e.g., the number of web sites, the number of web pages, the number of blogs) or on a smaller scale (e.g., the number of web sites in Ireland, the number of web pages in the CNN web site, the number of blogs mentioning Barack Obama before the 2008 presidential campaign). This book argues that it can be useful for social scientists to measure aspects of the web and explains how this can be achieved on both a small and large scale. The book is intended for social scientists with research topics that are wholly or partly online (e.g., social networks, news, political communication) and social scientists with offline research topics with an online reflection, even if this is not a core component (e.g., diaspora communities, consumer culture, linguistic change). The book is also intended for library and information science students in the belief that the knowledge and techniques described will be useful for them to guide and aid other social scientists in their research. In addition, the techniques and issues are all directly relevant to library and information science research problems. Table of Contents: Introduction / Web Impact Assessment / Link Analysis / Blog Searching / Automatic Search Engine Searches: LexiURL Searcher / Web Crawling: SocSciBot / Search Engines and Data Reliability / Tracking User Actions Online / Advanced Techniques / Summary and Future Directions View full abstract»
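
    A small-scale measurement of the kind described here can be a few lines of code. A hedged sketch that counts the hyperlinks on a single page using the requests and BeautifulSoup libraries (the target URL is a placeholder; the book's own tools are LexiURL Searcher and SocSciBot):

```python
import requests
from bs4 import BeautifulSoup

def count_outlinks(url):
    """Count hyperlinks on one page: a tiny webometric measurement."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = [a["href"] for a in soup.find_all("a", href=True)]
    return len(links), links[:5]  # total count plus a small sample

total, sample = count_outlinks("https://example.org/")  # hypothetical target
print(total, sample)
```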

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    An Anthropology of Services:Toward a Practice Approach to Designing Services

    Copyright Year: 2015

    Morgan and Claypool eBooks

    This book explores the possibility for an anthropology of services and outlines a practice approach to designing services. The reader is taken on a journey that Blomberg and Darrah have been on for the better part of a decade from their respective positions helping to establish a services research group within a large global enterprise and an applied anthropology master's program at a Silicon Valley university. They delve into the world of services to understand both how services are being conceptualized today and the possible benefits that might result from taking an anthropological view on services and their design. The authors argue that the anthropological gaze can be useful precisely because it combines attention to details of everyday life with consideration of the larger milieu in which those details make sense. Furthermore, it asks us to reflect upon and assess our own perspectives on that which we hope to understand and change. Central to their exploration is the question of how to conceptualize and engage with the world of services given their heterogeneity, the increasing global importance of the service economy, and the possibilities introduced for an engaged scholarship on service design. While discourse on services and service design can imply something distinctively new, the authors point to parallels with what is known about how humans have engaged with each other and the material world over millennia. Establishing the ubiquity of services as a starting point, the authors go on to consider the limits of design when the boundaries and connections between what can be designed and what can only be performed are complex and deeply mediated. In this regard the authors outline a practice approach to designing that acknowledges that designing involves participating in a social context, that design and use occur in concert, that people populate a world that has been largely built by and with others, and that formal models of services are impoverished representations of human performance. An Anthropology of Services draws attention to the conceptual and methodological messiness of service worlds while providing the reader with strategies for intervening in these worlds for human betterment as complex and challenging as that may be. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Concise Introduction to Models and Methods for Automated Planning

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Planning is the model-based approach to autonomous behavior where the agent behavior is derived automatically from a model of the actions, sensors, and goals. The main challenges in planning are computational, as all models, whether featuring uncertainty and feedback or not, are intractable in the worst case when represented in compact form. In this book, we look at a variety of models used in AI planning, and at the methods that have been developed for solving them. The goal is to provide a modern and coherent view of planning that is precise, concise, and mostly self-contained, without being shallow. For this, we make no attempt at covering the whole variety of planning approaches, ideas, and applications, and focus on the essentials. The target audience of the book are students and researchers interested in autonomous behavior and planning from an AI, engineering, or cognitive science perspective. Table of Contents: Preface / Planning and Autonomous Behavior / Classical Planning: Full Information and Deterministic Actions / Classical Planning: Variations and Extensions / Beyond Classical Planning: Transformations / Planning with Sensing: Logical Models / MDP Planning: Stochastic Actions and Full Feedback / POMDP Planning: Stochastic Actions and Partial Feedback / Discussion / Bibliography / Author's Biography View full abstract»
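
    At its simplest, classical planning with full information and deterministic actions is search over states. A minimal sketch (a toy formulation with add-only effects, not the book's notation):

```python
from collections import deque

def plan(initial, goal, actions):
    """Breadth-first search over states: classical planning in miniature.

    actions: maps an action name to a (precondition, effect) pair of
    frozensets of facts; effects here only add facts, for simplicity.
    """
    frontier, seen = deque([(frozenset(initial), [])]), set()
    while frontier:
        state, path = frontier.popleft()
        if goal <= state:
            return path                      # shortest plan reaching the goal
        if state in seen:
            continue
        seen.add(state)
        for name, (pre, eff) in actions.items():
            if pre <= state:                 # action applicable in this state
                frontier.append((state | eff, path + [name]))
    return None

actions = {"boil": (frozenset({"water"}), frozenset({"hot_water"})),
           "brew": (frozenset({"hot_water", "tea"}), frozenset({"cup_of_tea"}))}
print(plan({"water", "tea"}, {"cup_of_tea"}, actions))  # -> ['boil', 'brew']
```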

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Quantifying Research Integrity

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Institutions typically treat research integrity violations as black and white, right or wrong. The result is that the wide range of grayscale nuances that separate accident, carelessness, and bad practice from deliberate fraud and malpractice often get lost. This lecture looks at how to quantify the grayscale range in three kinds of research integrity violations: plagiarism, data falsification, and image manipulation.

    Quantification works best with plagiarism, because the essential one-to-one matching algorithms are well known and established tools for detecting when matches exist. Questions remain, however, about how many matching words of what kind in what location in which discipline constitute reasonable suspicion of fraudulent intent. Different disciplines take different perspectives on quantity and location. Quantification is harder with data falsification, because the original data are often not available, and because experimental replication remains surprisingly difficult. The same is true with image manipulation, where tools exist for detecting certain kinds of manipulations, but where the tools are also easily defeated.

    This lecture looks at how to prevent violations of research integrity from a pragmatic viewpoint, and at what steps institutions and publishers can take to discourage problems beyond the usual ethical admonitions. There are no simple answers, but two measures can help: the systematic use of detection tools and requiring original data and images. These alone do not suffice, but they represent a start.

    The scholarly community needs a better awareness of the complexity of research integrity decisions. Only an open and widespread international discussion can bring about a consensus on where the boundary lines are and when grayscale problems shade into black. One goal of this work is to move that discussion forward. View full abstract»
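
    As a hedged illustration of the one-to-one matching idea behind plagiarism detection (a toy metric, not any production detector):

```python
def shared_ngrams(a, b, n=3):
    """Fraction of word n-grams in text a that also occur in text b.

    N-gram overlap is the core of most text-matching detectors; how large a
    fraction signals misconduct is exactly the grayscale question.
    """
    def grams(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    ga, gb = grams(a), grams(b)
    return len(ga & gb) / len(ga) if ga else 0.0

print(shared_ngrams("the quick brown fox jumps",
                    "a quick brown fox sleeps"))  # -> 0.333...
```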

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantic Breakthrough in Drug Discovery

    Copyright Year: 2014

    Morgan and Claypool eBooks

    The current drug development paradigm—sometimes expressed as, "One disease, one target, one drug"—is under question, as relatively few drugs have reached the market in the last two decades. Meanwhile, the research focus of drug discovery is being placed on the study of drug action on biological systems as a whole, rather than on individual components of such systems. The vast amount of biological information about genes and proteins and their modulation by small molecules is pushing drug discovery to its next critical steps, involving the integration of chemical knowledge with these biological databases. Systematic integration of these heterogeneous datasets and the provision of algorithms to mine the integrated datasets would enable investigation of the complex mechanisms of drug action; however, traditional approaches face challenges in the representation and integration of multi-scale datasets, and in the discovery of underlying knowledge in the integrated datasets. The Semantic Web, envisioned to enable machines to understand and respond to complex human requests and to retrieve relevant, yet distributed, data, has the potential to trigger system-level chemical-biological innovations. Chem2Bio2RDF is presented as an example of utilizing Semantic Web technologies to enable intelligent analyses for drug discovery. Table of Contents: Introduction / Data Representation and Integration Using RDF / Data Representation and Integration Using OWL / Finding Complex Biological Relationships in PubMed Articles using Bio-LDA / Integrated Semantic Approach for Systems Chemical Biology Knowledge Discovery / Semantic Link Association Prediction / Conclusions / References / Authors' Biographies View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Science Fiction Prototyping: Designing the Future with Science Fiction

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Science fiction is the playground of the imagination. If you are interested in science or fascinated with the future then science fiction is where you explore new ideas and let your dreams and nightmares duke it out on the safety of the page or screen. But what if we could use science fiction to do more than that? What if we could use science fiction based on science fact to not only imagine our future but develop new technologies and products? What if we could use stories, movies and comics as a kind of tool to explore the real world implications and uses of future technologies today? Science Fiction Prototyping is a practical guide to using fiction as a way to imagine our future in a whole new way. Filled with history, real-world examples and conversations with experts like best-selling science fiction author Cory Doctorow, senior editor at Dark Horse Comics Chris Warner and Hollywood science expert Sidney Perkowitz, Science Fiction Prototyping will give you the tools you need to begin designing the future with science fiction. The future is Brian David Johnson’s business. As a futurist at Intel Corporation, his charter is to develop an actionable vision for computing in 2021. His work is called “future casting”—using ethnographic field studies, technology research, trend data, and even science fiction to create a pragmatic vision of consumers and computing. Johnson has been pioneering development in artificial intelligence, robotics, and reinventing TV. He speaks and writes extensively about future technologies in articles and scientific papers as well as science fiction short stories and novels (Fake Plastic Love and Screen Future: The Future of Entertainment, Computing and the Devices We Love). He has directed two feature films and is an illustrator and commissioned painter. Table of Contents: Preface / Foreword / Epilogue / Dedication / Acknowledgments / 1. The Future Is in Your Hands / 2. Religious Robots and Runaway Were-Tigers: Brief Overview of the Science and the Fiction that Went Into Two SF Prototypes / 3. How to Build Your Own SF Prototype in Five Steps or Less / 4. I, Robot: From Asimov to Doctorow: Exploring Short Fiction as an SF Prototype and a Conversation With Cory Doctorow / 5. The Men in the Moon: Exploring Movies as an SF Prototype and a Conversation with Sidney Perkowitz / 6. Science in the Gutters: Exploring Comics as an SF Prototype and a Conversation With Chris Warner / 7. Making the Future: Now that You Have Developed Your SF Prototype, What’s Next? / 8. Einstein’s Thought Experiments and Asimov’s Second Dream / Appendix A: The SF Prototypes / Notes / Author Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Copyright Year: 2016

    Morgan and Claypool eBooks

    As technologists, we are constantly exploring and pushing the limits of our own disciplines, and we accept the notion that the efficiencies of new technologies are advancing at a very rapid rate. However, we rarely have time to contemplate the broader impact of these technologies as they affect and amplify adjacent technology disciplines. This book therefore focuses on the potential impact of those technologies, but it is not intended as a technical manuscript. In this book, we consider our progress and current position on arbitrary popular concepts of future scenarios rather than the typical measurements of cycles per second or milliwatts. We compare our current human cultural situation to other past historic events as we anticipate the future social impact of rapidly accelerating technologies. We also rely on measurements based on specific events highlighting the breadth of the impact of accelerating semiconductor technologies rather than the specific rate of advance of any particular semiconductor technology. These measurements certainly lack the mathematical precision and repeatability to which technologists are accustomed, but the material that we are dealing with—the social objectives and future political structures of humanity—does not permit a high degree of mathematical accuracy. Our conclusion draws from the concept of Singularity. It seems certain that at the rate at which our technologies are advancing, we will exceed the ability of our post-Industrial Revolution structures to absorb these new challenges, and we cannot accurately anticipate what those future social structures will resemble. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Narrowband Direction of Arrival Estimation for Antenna Arrays

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book provides an introduction to narrowband array signal processing, classical and subspace-based direction of arrival (DOA) estimation with an extensive discussion on adaptive direction of arrival algorithms. The book begins with a presentation of the basic theory, equations, and data models of narrowband arrays. It then discusses basic beamforming methods and describes how they relate to DOA estimation. Several of the most common classical and subspace-based direction of arrival methods are discussed. The book concludes with an introduction to subspace tracking and shows how subspace tracking algorithms can be used to form an adaptive DOA estimator. Simulation software and additional bibliography are given at the end of the book. Table of Contents: Introduction / Background on Array Processing / Nonadaptive Direction of Arrival Estimation / Adaptive Direction of Arrival Estimation / Appendix View full abstract»
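
    As a compact illustration of the subspace approach, a textbook MUSIC sketch with NumPy, synthetic data, and a uniform linear array at half-wavelength spacing (illustrative parameters; not the book's simulation software):

```python
import numpy as np

def music_spectrum(R, n_sources, n_elements, angles):
    """MUSIC pseudospectrum for a uniform linear array (half-wavelength spacing).

    R is the spatial covariance matrix of the array snapshots; peaks of the
    pseudospectrum indicate the directions of arrival.
    """
    _, vecs = np.linalg.eigh(R)             # eigenvalues in ascending order
    En = vecs[:, :n_elements - n_sources]   # noise subspace eigenvectors
    spectrum = []
    for theta in angles:
        a = np.exp(1j * np.pi * np.arange(n_elements) * np.sin(theta))
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spectrum)

# Synthetic example: one source at 20 degrees, 8-element array.
m, theta0 = 8, np.deg2rad(20)
a0 = np.exp(1j * np.pi * np.arange(m) * np.sin(theta0))
R = np.outer(a0, a0.conj()) + 0.01 * np.eye(m)       # signal plus weak noise
angles = np.deg2rad(np.linspace(-90, 90, 181))
print(np.rad2deg(angles[np.argmax(music_spectrum(R, 1, m, angles))]))  # ~20.0
```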

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Integral Equation Methods for Electromagnetic and Elastic Waves

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Integral Equation Methods for Electromagnetic and Elastic Waves is an outgrowth of several years of work. There have been no recent books on integral equation methods. There are books written on integral equations, but either they have been around for a while, or they were written by mathematicians. Much of the knowledge in integral equation methods still resides in journal papers. With this book, important relevant knowledge for integral equations are consolidated in one place and researchers need only read the pertinent chapters in this book to gain important knowledge needed for integral equation research. Also, learning the fundamentals of linear elastic wave theory does not require a quantum leap for electromagnetic practitioners. Integral equation methods have been around for several decades, and their introduction to electromagnetics has been due to the seminal works of Richmond and Harrington in the 1960s. There was a surge in the interest in this topic in the 1980s (notably the work of Wilton and his coworkers) due to increased computing power. The interest in this area was on the wane when it was demonstrated that differential equation methods, with their sparse matrices, can solve many problems more efficiently than integral equation methods. Recently, due to the advent of fast algorithms, there has been a revival in integral equation methods in electromagnetics. Much of our work in recent years has been in fast algorithms for integral equations, which prompted our interest in integral equation methods. While previously, only tens of thousands of unknowns could be solved by integral equation methods, now, tens of millions of unknowns can be solved with fast algorithms. This has prompted new enthusiasm in integral equation methods. Table of Contents: Introduction to Computational Electromagnetics / Linear Vector Space, Reciprocity, and Energy Conservation / Introduction to Integral Equations / Integral Equations for Penetrable Objects / Low-Frequency Problems in Integral Equations / Dyadic Green's Function for Layered Media and Integral Equations / Fast Inhomogeneous Plane Wave Algorithm for Layered Media / Electromagnetic Wave versus Elastic Wave / Glossary of Acronyms View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Fuzzy Information Retrieval

    Copyright Year: 2017

    Morgan and Claypool eBooks

    Information retrieval used to mean looking through thousands of strings of text to find words or symbols that matched a user's query. Today, there are many models that help index and search more effectively so retrieval takes a lot less time. Information retrieval (IR) is often seen as a subfield of computer science and shares some modeling, applications, storage, and techniques with other disciplines such as artificial intelligence, database management, and parallel computing. This book introduces the topic of IR and how it differs from other computer science disciplines. A discussion of the history of modern IR is briefly presented, and the notation of IR as used in this book is defined. The complex notion of relevance is discussed. Some applications of IR are noted as well, since IR has many practical uses today. Using information retrieval with fuzzy logic to search for software terms can help find software components and ultimately help increase the reuse of software. This is just one practical application of IR that is covered in this book.

    Some of the classical models of IR are presented as a contrast to extending the Boolean model. This includes a brief mention of the source of weights for the various models. In a typical retrieval environment, answers are either yes or no, i.e., on or off. On the other hand, fuzzy logic can bring in a "degree of" match, vs. a crisp, i.e., strict, match. This, too, is looked at and explored in much detail, showing how it can be applied to information retrieval. Fuzzy logic is often considered a soft computing application, and this book explores how IR with fuzzy logic and its membership functions as weights can help indexing, querying, and matching. Since fuzzy set theory and logic are explored in IR systems, an explanation of where the "fuzz" is ensues.

    The concept of relevance feedback, including pseudo-relevance feedback, is explored for the various models of IR. For the extended Boolean model, the use of genetic algorithms for relevance feedback is delved into.

    The concept of query expansion is explored using rough set theory. Various term relationships are modeled and presented, and the model is extended for fuzzy retrieval. An example using the UMLS terms is also presented. The model is also extended for term relationships beyond synonyms.

    Finally, this book looks at clustering, both crisp and fuzzy, to see how that can improve retrieval performance. An example is presented to illustrate the concepts. View full abstract»
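
    A minimal sketch of degree-of-match retrieval, with fuzzy AND taken as the minimum of membership degrees (hypothetical index weights; illustrative only):

```python
# Document-term membership degrees (index weights in [0, 1]); made-up data.
index = {
    "d1": {"fuzzy": 0.9, "retrieval": 0.7},
    "d2": {"fuzzy": 0.3, "logic": 0.8},
}

def degree(doc, term):
    return index[doc].get(term, 0.0)

def and_query(doc, t1, t2):
    # Fuzzy AND = min of memberships; fuzzy OR would be max.
    return min(degree(doc, t1), degree(doc, t2))

# Degree-of-match ranking instead of a crisp yes/no answer.
for doc in sorted(index, key=lambda d: -and_query(d, "fuzzy", "retrieval")):
    print(doc, and_query(doc, "fuzzy", "retrieval"))
# d1 matches to degree 0.7; d2 to 0.0, though "fuzzy" alone matches it to 0.3.
```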

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Probabilistic Databases

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for representing large probabilistic databases, by decomposing them into tuple-independent tables, block-independent-disjoint tables, or U-databases. Then it discusses two classes of techniques for query evaluation on probabilistic databases. In extensional query evaluation, the entire probabilistic inference can be pushed into the database engine and, therefore, processed as effectively as the evaluation of standard SQL queries. The relational queries that can be evaluated this way are called safe queries. In intensional query evaluation, the probabilistic inference is performed over a propositional formula called lineage expression: every relational query can be evaluated this way, but the data complexity dramatically depends on the query being evaluated, and can be #P-hard. The book also discusses some advanced topics in probabilistic data management such as top-k query processing, sequential probabilistic databases, indexing and materialized views, and Monte Carlo databases. Table of Contents: Overview / Data and Query Model / The Query Evaluation Problem / Extensional Query Evaluation / Intensional Query Evaluation / Advanced Techniques View full abstract»
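
    For instance, extensional evaluation of a safe existential query over a tuple-independent table reduces to simple arithmetic on tuple probabilities. A minimal sketch with made-up data:

```python
# Tuple-independent table: each tuple exists with its own probability.
# Query: does any tuple with name == "ann" exist?  Under tuple independence,
# P(exists) = 1 - prod(1 - p_i) over the qualifying tuples.
rows = [("ann", 0.6), ("ann", 0.5), ("bob", 0.9)]

def prob_exists(rows, name):
    p_none = 1.0
    for n, p in rows:
        if n == name:
            p_none *= (1.0 - p)     # probability that all qualifying tuples are absent
    return 1.0 - p_none

print(prob_exists(rows, "ann"))     # 1 - 0.4 * 0.5 = 0.8
```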

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    On-Chip Photonic Interconnects:A Computer Architect's Perspective

    Copyright Year: 2013

    Morgan and Claypool eBooks

    As the number of cores on a chip continues to climb, architects will need to address both bandwidth and power consumption issues related to the interconnection network. Electrical interconnects are not likely to scale well to a large number of processors for energy efficiency reasons, and the problem is compounded by the fact that there is a fixed total power budget for a die, dictated by the amount of heat that can be dissipated without special (and expensive) cooling and packaging techniques. Thus, there is a need to seek alternatives to electrical signaling for on-chip interconnection applications. Photonics, which has a fundamentally different mechanism of signal propagation, offers the potential to not only overcome the drawbacks of electrical signaling, but also enable the architect to build energy efficient, scalable systems. The purpose of this book is to introduce computer architects to the possibilities and challenges of working with photons and designing on-chip photonic interconnection networks. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Visual Information Retrieval using Java and LIRE

    Copyright Year: 2013

    Morgan and Claypool eBooks

    Visual information retrieval (VIR) is an active and vibrant research area, which attempts to provide means for organizing, indexing, annotating, and retrieving visual information (images and videos) from large, unstructured repositories. The goal of VIR is to retrieve matches ranked by their relevance to a given query, which is often expressed as an example image and/or a series of keywords. During its early years (1995-2000), the research efforts were dominated by content-based approaches contributed primarily by the image and video processing community. During the past decade, it was widely recognized that the challenges imposed by the lack of coincidence between an image's visual contents and its semantic interpretation, also known as the semantic gap, required a clever use of textual metadata (in addition to information extracted from the image's pixel contents) to make image and video retrieval solutions efficient and effective. The need to bridge (or at least narrow) the semantic gap has been one of the driving forces behind current VIR research. Additionally, other related research problems and market opportunities have started to emerge, offering a broad range of exciting problems for computer scientists and engineers to work on. In this introductory book, we focus on a subset of VIR problems where the media consists of images, and the indexing and retrieval methods are based on the pixel contents of those images -- an approach known as content-based image retrieval (CBIR). We present an implementation-oriented overview of CBIR concepts, techniques, algorithms, and figures of merit. Most chapters are supported by examples written in Java, using Lucene (an open-source Java-based indexing and search implementation) and LIRE (Lucene Image REtrieval), an open-source Java-based library for CBIR. Table of Contents: Introduction / Information Retrieval: Selected Concepts and Techniques / Visual Features / Indexing Visual Features / LIRE: An Extensible Java CBIR Library / Concluding Remarks View full abstract»
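
    Since Lucene and LIRE are Java libraries, the following language-neutral Python sketch only illustrates the CBIR pipeline the book builds with them: extract a global feature per image, then rank the repository by distance to the query feature. The grayscale histogram here is a stand-in for real descriptors such as CEDD.

        import numpy as np

        def feature(image):                        # image: 2-D array of gray levels
            hist, _ = np.histogram(image, bins=16, range=(0, 256))
            return hist / hist.sum()               # normalized global descriptor

        def rank(query_img, repository):           # repository: {name: image}
            q = feature(query_img)
            dists = {name: np.linalg.norm(q - feature(img))
                     for name, img in repository.items()}
            return sorted(dists.items(), key=lambda kv: kv[1])

        rng = np.random.default_rng(0)
        repo = {f"img{i}": rng.integers(0, 256, (64, 64)) for i in range(5)}
        print(rank(repo["img0"], repo)[:3])        # the query image itself ranks first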

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Bridging the Gap Between Engineering and the Global World:A Case Study of the Coconut (Coir) Fiber Industry in Kerala, India

    Copyright Year: 2008

    Morgan and Claypool eBooks

    Over the last two decades, globalization has had a profound impact on how we view the world and its sustainability. One group of professionals that lies at the heart of sustainability is the engineers. Engineers are trained problem solvers, required to implement technical solutions, and are at the forefront of the development of new technologies. Although engineers play a critical role in sustainability, traditional engineering programs typically only focus on the technocentric and ecocentric dimensions of sustainability, providing little training on the sociocentric dimension. With more and more interest in sustainability, it is becoming increasingly important to also provide engineers with an awareness of sociocentric issues and the necessary skills to address them. The aim of this book is to provide engineering educators with a real-life case study that can be brought into existing courses to help bridge the gap between engineering and the global world. The case study focuses on how our engineering study of different natural plant fibers for soil erosion control led us to small villages in Kerala, India, where marginalized women workers often stand waist deep in water several hours a day, clean and beat coconuts by hand, and separate and spin coconut (coir) fibers into yarn by hand, for very low wages. The case study provides insight into the three dimensions of sustainability (technocentric, ecocentric, and sociocentric) and how they come together in a typical engineering problem. Table of Contents: Reinforcing the Classroom / Natural Plant Fibers for Engineering Applications: Technocentric and Ecocentric Dimensions of Sustainability / The Coir Fiber Industry in Kerala, India: Sociocentric Dimension of Sustainability / Case Study / Conclusion / Bibliography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Activity Theory in HCI:Fundamentals and Reflections

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Activity theory -- a conceptual framework originally developed by Aleksei Leontiev -- has its roots in the socio-cultural tradition in Russian psychology. The foundational concept of the theory is human activity, which is understood as purposeful, mediated, and transformative interaction between human beings and the world. Since the early 1990s, activity theory has been a visible landmark in the theoretical landscape of Human-Computer Interaction (HCI). Along with some other frameworks, such as distributed cognition and phenomenology, it established itself as a leading post-cognitivist approach in HCI and interaction design. In this book we discuss the conceptual foundations of activity theory and its contribution to HCI research. After making the case for theory in HCI and briefly discussing the contribution of activity theory to the field (Chapter One) we introduce the historical roots, main ideas, and principles of activity theory (Chapter Two). After that we present in-depth analyses of three issues which we consider of special importance to current developments in HCI and interaction design, namely: agency (Chapter Three), experience (Chapter Four), and activity-centric computing (Chapter Five). We conclude the book with reflections on challenges and prospects for further development of activity theory in HCI (Chapter Six). Table of Contents: Introduction: Activity theory and the changing face of HCI / Basic concepts and principles of activity theory / Agency / Activity and experience / Activity-centric computing / Activity theory and the development of HCI View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Matrices in Engineering Problems

    Copyright Year: 2011

    Morgan and Claypool eBooks

    This book is intended as an undergraduate text introducing matrix methods as they relate to engineering problems. It begins with the fundamentals of the mathematics of matrices and determinants. Matrix inversion is discussed, with an introduction to the well-known reduction methods. Equation sets are viewed as vector transformations, and the conditions of their solvability are explored. Orthogonal matrices are introduced with examples showing application to many problems requiring three dimensional thinking. The angular velocity matrix is shown to emerge from the differentiation of the 3-D orthogonal matrix, leading to the discussion of particle and rigid body dynamics. The book continues with the eigenvalue problem and its application to multi-variable vibrations. Because the eigenvalue problem requires some operations with polynomials, a separate discussion of these is given in an appendix. The example of the vibrating string is given with a comparison of the matrix analysis to the continuous solution. Table of Contents: Matrix Fundamentals / Determinants / Matrix Inversion / Linear Simultaneous Equation Sets / Orthogonal Transforms / Matrix Eigenvalue Analysis / Matrix Analysis of Vibrating Systems View full abstract»
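
    A hedged sketch of the book's closing theme: for an undamped system M x'' + K x = 0, trying x = v sin(wt) turns the equations of motion into the eigenvalue problem K v = w^2 M v, so the natural frequencies are the square roots of the eigenvalues of M^-1 K. The masses and stiffnesses below are illustrative.

        import numpy as np

        M = np.diag([2.0, 1.0])                    # mass matrix
        K = np.array([[ 6.0, -2.0],
                      [-2.0,  4.0]])               # stiffness matrix

        w2, modes = np.linalg.eig(np.linalg.inv(M) @ K)
        print("natural frequencies:", np.sqrt(w2)) # rad/s, up to ordering
        print("mode shapes (columns):\n", modes)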

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Database Anonymization:Privacy Models, Data Utility, and Microaggregation-based Inter-model Connections

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The current social and economic context increasingly demands open data to improve scientific research and decision making. However, when published data refer to individual respondents, disclosure risk limitation techniques must be implemented to anonymize the data and guarantee by design the fundamental right to privacy of the subjects the data refer to. Disclosure risk limitation has a long record in the statistical and computer science research communities, who have developed a variety of privacy-preserving solutions for data releases. This Synthesis Lecture provides a comprehensive overview of the fundamentals of privacy in data releases focusing on the computer science perspective. Specifically, we detail the privacy models, anonymization methods, and utility and risk metrics that have been proposed so far in the literature. In addition, as a more advanced topic, we identify and discuss in detail connections between several privacy models (i.e., how to accumulate the privacy guarantees they offer to achieve more robust protection and when such guarantees are equivalent or complementary); we also explore the links between anonymization methods and privacy models (how anonymization methods can be used to enforce privacy models and thereby offer ex ante privacy guarantees). These latter topics are relevant to researchers and advanced practitioners, who will gain a deeper understanding of the available data anonymization solutions and the privacy guarantees they can offer. View full abstract»
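
    A minimal sketch of univariate microaggregation, one family of anonymization methods the lecture connects to privacy models: sort the records, form groups of at least k, and release each group's centroid so every published value is shared by k or more records. The data and choice of k are illustrative.

        def microaggregate(values, k=3):
            ordered = sorted(values)
            groups = [ordered[i:i + k] for i in range(0, len(ordered), k)]
            if len(groups) > 1 and len(groups[-1]) < k:
                groups[-2].extend(groups.pop())    # fold a short tail into the previous group
            out = []
            for g in groups:
                centroid = sum(g) / len(g)
                out.extend([centroid] * len(g))    # each released value occurs >= k times
            return out

        print(microaggregate([34, 51, 29, 44, 60, 38, 47], k=3))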

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Social Media and Library Services

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The rise of social media technologies has created new ways to seek and share information for millions of users worldwide, but also has presented new challenges for libraries in meeting users where they are within social spaces. From social networking sites such as Facebook and Google+ and microblogging platforms such as Twitter and Tumblr, to the image and video sites of YouTube, Flickr, and Instagram, to geotagging sites such as Foursquare, libraries have responded by establishing footholds within a variety of social media platforms and seeking new ways of engaging with online users in social spaces. Libraries are also responding to new social review sites such as Yelp and Tripadvisor, awareness sites including StumbleUpon, Pinterest, Goodreads, and Reddit, and social question-and-answer (Q&A) sites such as Yahoo! Answers, sites which engage social media users in functions similar to traditional library content curation, readers' advisory, information and referral, and reference services. Establishing a social media presence extends the library's physical manifestation into virtual space and increases the library's visibility, reach, and impact. However, beyond simply establishing a social presence for the library, a greater challenge is building effective and engaging social media sites that successfully adapt a library's visibility, voice, and presence to the unique contexts, audiences, and cultures within diverse social media sites. This lecture examines the research and theory on social media and libraries, providing an overview of what is known and what is not yet known about libraries and social media. Chapter 1 focuses on the social media environments within which libraries are establishing a presence, including how social media sites differ from each other, yet work together within a social ecosphere. Chapter 2 examines how libraries are engaging with users across a variety of social media platforms and the extent to which libraries are involved in using these different social media platforms, as well as the activities of libraries in presenting a social "self," sharing information, and interacting with users via social media. Chapter 3 explores metrics and measures for assessing the impact of the library's activity in social media sites. The book concludes with Chapter 4 on evolving directions for libraries and social media, including potential implications of new and emerging technologies for libraries in social spaces. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Engineering Design Challenge:A Unique Opportunity

    Copyright Year: 2013

    Morgan and Claypool eBooks

    The Engineering Design Challenge addresses teaching engineering design and presents design projects for first-year students and interdisciplinary design ventures. A short philosophy and background of engineering design are discussed. The organization of the University of Wyoming first-year Introduction to Engineering program is presented with an emphasis on the first-year design challenges. These challenges are presented in a format readily incorporated in other first-year programs. The interdisciplinary design courses address the institutional constraints and present organizational approaches that resolve these issues. Student results are summarized and briefly assessed. A series of short intellectual problems are included to initiate discussion and understanding of design issues. Sample syllabi, research paper requirements, and oral presentation evaluation sheets are included. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Single-Instruction Multiple-Data Execution

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Having hit power limitations to even more aggressive out-of-order execution in processor cores, many architects in the past decade have turned to single-instruction-multiple-data (SIMD) execution to increase single-threaded performance. SIMD execution, or having a single instruction drive execution of an identical operation on multiple data items, was already well established as a technique to efficiently exploit data parallelism. Furthermore, support for it was already included in many commodity processors. However, in the past decade, SIMD execution has seen a dramatic increase in the set of applications using it, which has motivated big improvements in hardware support in mainstream microprocessors. The easiest way to provide a big performance boost to SIMD hardware is to make it wider, i.e., increase the number of data items hardware operates on simultaneously. Indeed, microprocessor vendors have done this. However, as we exploit more data parallelism in applications, certain challenges can negatively impact performance. In particular, conditional execution, noncontiguous memory accesses, and the presence of some dependences across data items are key roadblocks to achieving peak performance with SIMD execution. This book first describes data parallelism, and why it is so common in popular applications. We then describe SIMD execution, and explain where its performance and energy benefits come from compared to other techniques to exploit parallelism. Finally, we describe SIMD hardware support in current commodity microprocessors. This includes both expected design tradeoffs, as well as unexpected ones, as we work to overcome challenges encountered when trying to map real software to SIMD execution. View full abstract»
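
    A hedged illustration, with NumPy standing in for a real vector ISA, of the conditional-execution roadblock described above: a data-dependent branch becomes a per-lane mask, both arms are effectively evaluated for all lanes, and the mask selects the results, which is why divergence erodes peak SIMD performance.

        import numpy as np

        x = np.array([4.0, -1.0, 9.0, -16.0])
        mask = x >= 0                                      # per-lane predicate
        result = np.where(mask, np.sqrt(np.abs(x)), 0.0)   # both arms computed lane-wise
        print(mask, result)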

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Negative Quantum Channels:An Introduction to Quantum Maps that are Not Completely Positive

    Copyright Year: 2014

    Morgan and Claypool eBooks

    This book is a brief introduction to negative quantum channels, i.e., linear, trace-preserving (and consistent) quantum maps that are not completely positive. The flat and sharp operators are introduced and explained. Complete positivity is presented as a mathematical property, but it is argued that complete positivity is not a physical requirement of all quantum operations. Negativity, a measure of the lack of complete positivity, is proposed as a tool for empirically testing complete positivity assumptions. Table of Contents: Preface / Acknowledgments / Introduction and Definition of Terms / Tomography / Non-Positive Reduced Dynamics / Complete Positivity / Physical Motivation of Complete Positivity / Measures of Complete Positivity / Negative Channels / Negative Climates with Diagonal Composite Dynamics / Rabi Channels / Physical Motivations for Sharp Operations / Negative Qubit Channel Examples with Multi-Qubit Baths / Proposed Experimental Demonstration of Negativity / Implications of Negative Channels / Uses for Negative Channels / Conclusions / Bibliography / Author's Biography View full abstract»
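
    For orientation, a textbook characterization that this discussion presupposes (the book's own notation and preferred negativity measure may differ): a linear map is completely positive exactly when its Choi matrix is positive semidefinite,

        \[
          C_\Lambda \;=\; (\mathrm{id}\otimes\Lambda)\,|\Omega\rangle\langle\Omega| \;\ge\; 0,
          \qquad
          |\Omega\rangle \;=\; \tfrac{1}{\sqrt{d}}\sum_{i=1}^{d} |i\,i\rangle,
        \]

    and a natural negativity measure is the total magnitude of the negative eigenvalues of $C_\Lambda$.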

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Smart Antennas

    Copyright Year: 2007

    Morgan and Claypool eBooks

    As the demand for mobile communications constantly increases, the need for better coverage, improved capacity, and higher transmission quality rises. Thus, a more efficient use of the radio spectrum is required. Smart antenna systems are capable of efficiently utilizing the radio spectrum and promise an effective solution to the present wireless systems' problems while achieving reliable and robust high-speed, high-data-rate transmission. The purpose of this book is to provide the reader a broad view of the system aspects of smart antennas. In fact, smart antenna systems comprise several critical areas such as individual antenna array design, signal processing algorithms, space-time processing, wireless channel modeling and coding, and network performance. In this book we include an overview of smart antenna concepts, introduce some of the areas that impact smart antennas, and examine the influence of interaction and integration of these areas to Mobile Ad-Hoc Networks. In addition, the general principles and major benefits of using space-time processing are introduced, especially employing multiple-input multiple-output (MIMO) techniques. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Virtual Reality and Virtual Environments in 10 Lectures

    Copyright Year: 2015

    Morgan and Claypool eBooks

    The book is based on the material originally developed for the course on Virtual Reality, which the author was teaching at Tampere University of Technology, as well as the course on Virtual Environments that the author prepared for the University for Advancing Studies at Tempe, Arizona. This original purpose has influenced the structure of this book as well as the depth to which we explore the presented concepts. Our intention in this book is therefore to give an introduction to the important issues regarding a series of related concepts of Virtual Reality, Augmented Reality, and Virtual Environments. We do not attempt to go into any of these issues in depth but rather outline general principles and discuss them in a sense broad enough to provide sufficient foundations for further study. In other words, we aim to provide a set of keywords to the reader in order to give him a good starting point from which he could go on and explore any of these issues in detail. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Practical Global Illumination with Irradiance Caching

    Copyright Year: 2009

    Morgan and Claypool eBooks

    Irradiance caching is a ray tracing-based technique for computing global illumination on diffuse surfaces. Specifically, it addresses the computation of indirect illumination bouncing off one diffuse object onto another. The sole purpose of irradiance caching is to make this computation reasonably fast. The main idea is to perform the indirect illumination sampling only at a selected set of locations in the scene, store the results in a cache, and reuse the cached value at other points through fast interpolation. This book is for anyone interested in making a production-ready implementation of irradiance caching that reliably renders artifact-free images. Since its invention 20 years ago, the irradiance caching algorithm has been successfully used to accelerate global illumination computation in the Radiance lighting simulation system. Its widespread use had to wait until computers became fast enough to consider global illumination in film production rendering. Since then, its use is ubiquitous. Virtually all commercial and open-source rendering software base the global illumination computation upon irradiance caching. Although elegant and powerful, the algorithm in its basic form often fails to produce artifact-free images. Unfortunately, practical information on implementing the algorithm is scarce. The main objective of this book is to show the irradiance caching algorithm along with all the details and tricks upon which the success of its practical implementation is dependent. In addition, we discuss some extensions of the basic algorithm, such as a GPU implementation for interactive global illumination computation and temporal caching that exploits temporal coherence to suppress flickering in animations. Our goal is to show the material without being overly theoretical. However, the reader should have some basic understanding of rendering concepts, ray tracing in particular. Familiarity with global illumination is useful but not necessary to read this book. Table of Contents: Introduction to Ray Tracing and Global Illumination / Irradiance Caching Core / Practical Rendering with Irradiance Caching / Irradiance Caching in a Complete Global Illumination / Irradiance Caching on Graphics Hardware / Temporal Irradiance Caching View full abstract»
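
    A sketch of the core interpolation step, using the weight from Ward's original formulation: a cached record at point p_i with normal n_i, harmonic-mean distance R_i, and irradiance E_i contributes w_i = 1 / (|p - p_i|/R_i + sqrt(1 - n.n_i)) whenever w_i exceeds a cutoff. The records and cutoff below are illustrative, not output from a real renderer.

        import numpy as np

        records = [  # (position, normal, R_i, cached irradiance E_i)
            (np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), 1.0, 0.8),
            (np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), 0.7, 0.6),
        ]

        def interpolate(p, n, cutoff=2.0):
            num = den = 0.0
            for pi, ni, Ri, Ei in records:
                w = 1.0 / max(1e-9, np.linalg.norm(p - pi) / Ri
                              + np.sqrt(max(0.0, 1.0 - n @ ni)))
                if w > cutoff:                     # record is close enough to reuse
                    num += w * Ei
                    den += w
            return num / den if den else None      # None -> sample and add a new record

        print(interpolate(np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])))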

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Mobile Platforms and Development Environments

    Copyright Year: 2012

    Morgan and Claypool eBooks

    Mobile platform development has lately become a technological war zone with extremely dynamic and fluid movement, especially in the smart phone and tablet market space. This Synthesis Lecture is a guide to the latest developments of the key mobile platforms that are shaping the mobile platform industry. The book covers the three currently dominant native platforms -- iOS, Android and Windows Phone -- along with the device-agnostic HTML5 mobile web platform. The lecture also covers location-based services (LBS) which can be considered as a platform in its own right. The lecture utilizes a sample application (TwitterSearch) that the authors show programmed on each of the platforms. Audiences who may benefit from this lecture include: (1) undergraduate and graduate students taking mobile computing classes or self-learning the mobile platform programmability road map; (2) academic and industrial researchers working on mobile computing R&D projects; (3) mobile app developers for a specific platform who may be curious about other platforms; (4) system integrator consultants and firms concerned with mobilizing businesses and enterprise apps; and (5) industries including health care, logistics, mobile workforce management, mobile commerce and payment systems and mobile search and advertisement. Table of Contents: From the Newton to the iPhone / iOS / Android / Windows Phone / Mobile Web / Platform-in-Platform: Location-Based Services (LBS) / The Future of Mobile Platforms / TwitterSearch Sample Application View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Spectral Analysis of Signals:The Missing Data Case

    Copyright Year: 2006

    Morgan and Claypool eBooks

    Spectral estimation is important in many fields including astronomy, meteorology, seismology, communications, economics, speech analysis, medical imaging, radar, sonar, and underwater acoustics. Most existing spectral estimation algorithms are devised for uniformly sampled complete-data sequences. However, the spectral estimation for data sequences with missing samples is also important in many applications ranging from astronomical time series analysis to synthetic aperture radar imaging with angular diversity. For spectral estimation in the missing-data case, the challenge is how to extend the existing spectral estimation techniques to deal with these missing-data samples. Recently, nonparametric adaptive filtering based techniques have been developed successfully for various missing-data problems. Collectively, these algorithms provide a comprehensive toolset for the missing-data problem based exclusively on the nonparametric adaptive filter-bank approaches, which are robust and accurate, and can provide high resolution and low sidelobes. In this book, we present these algorithms for both one-dimensional and two-dimensional spectral estimation problems. View full abstract»
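
    As a hedged, off-the-shelf point of contrast to the adaptive filter-bank estimators developed in the book, the Lomb-Scargle periodogram below handles the same missing-data setting by estimating a spectrum directly from irregular sample times. The signal parameters are illustrative.

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(1)
        t = np.sort(rng.uniform(0, 10, 200))           # nonuniform, gapped sampling
        y = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(t.size)

        freqs = np.linspace(0.1, 5.0, 500)             # Hz
        power = lombscargle(t, y, 2 * np.pi * freqs)   # expects angular frequencies
        print("peak near 1.5 Hz:", freqs[np.argmax(power)])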

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Implanted Antennas in Medical Wireless Communications

    Copyright Year: 2006

    Morgan and Claypool eBooks

    One of the main objectives of this lecture is to summarize the results of recent research activities of the authors on the subject of implanted antennas for medical wireless communication systems. It is anticipated that ever more sophisticated medical devices will be implanted inside the human body for medical telemetry and telemedicine. To establish effective and efficient wireless links with these devices, it is pivotal to give special attention to the antenna designs that are required to be low profile, small, safe, and cost effective. In this book, it is demonstrated how advanced electromagnetic numerical techniques can be utilized to design these antennas in as realistic a human body environment as possible. It is also shown how simplified models can assist the initial design of these antennas in an efficient manner. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Linked Lexical Knowledge Bases:Foundations and Applications

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book conveys the fundamentals of Linked Lexical Knowledge Bases (LLKB) and sheds light on their different aspects from various perspectives, focusing on their construction and use in natural language processing (NLP). It characterizes a wide range of both expert-based and collaboratively constructed lexical knowledge bases. Only basic familiarity with NLP is required and this book has been written for both students and researchers in NLP and related fields who are interested in knowledge-based approaches to language analysis and their applications.

    Lexical Knowledge Bases (LKBs) are indispensable in many areas of natural language processing, as they encode human knowledge of language in machine-readable form, and as such, they are required as a reference when machines attempt to interpret natural language in accordance with human perception. In recent years, numerous research efforts have led to the insight that to make the best use of available knowledge, the orchestrated exploitation of different LKBs is necessary. This allows us to not only extend the range of covered words and senses, but also gives us the opportunity to obtain a richer knowledge representation when a particular meaning of a word is covered in more than one resource. Examples where such an orchestrated usage of LKBs proved beneficial include word sense disambiguation, semantic role labeling, semantic parsing, and text classification.

    This book presents different kinds of automatic, manual, and collaborative linkings between LKBs. A special chapter is devoted to the linking algorithms employing text-based, graph-based, and joint modeling methods. Following this, it presents a set of higher-level NLP tasks and algorithms, effectively utilizing the knowledge in LLKBs. Among them, you will find advanced methods, e.g., distant supervision, or continuous vector space models of knowledge bases (KBs), that have become widely used at the time of this book's writing. Finally, multilingual applications of LLKBs, such as cross-lingual semantic relatedness and computer-aided translation, are discussed, as well as tools and interfaces for exploring LLKBs, followed by conclusions and future research directions. View full abstract»
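
    A hedged sketch of one text-based linking method of the kind the linking-algorithms chapter covers: align senses across two resources by the word overlap of their glosses, a Lesk-style similarity. The glosses are invented for illustration, not taken from WordNet or Wiktionary.

        def overlap(gloss_a, gloss_b):
            a, b = set(gloss_a.lower().split()), set(gloss_b.lower().split())
            return len(a & b) / max(1, len(a | b))     # Jaccard similarity

        resource_a = {"bank#1": "sloping land beside a body of water",
                      "bank#2": "a financial institution that accepts deposits"}
        resource_b = {"bank_river": "land along the side of a river or lake",
                      "bank_money": "an institution for deposits and loans"}

        for sense_a, gloss in resource_a.items():      # link each sense to its best match
            best = max(resource_b, key=lambda s: overlap(gloss, resource_b[s]))
            print(sense_a, "->", best)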

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Advances in Modern Blind Signal Separation Algorithms:Theory and Applications

    Copyright Year: 2010

    Morgan and Claypool eBooks

    With human-computer interactions and hands-free communications becoming overwhelmingly important in the new millennium, recent research efforts have been increasingly focusing on state-of-the-art multi-microphone signal processing solutions to improve speech intelligibility in adverse environments. One such prominent statistical signal processing technique is blind signal separation (BSS). BSS was first introduced in the early 1990s and quickly emerged as an area of intense research activity showing huge potential in numerous applications. BSS comprises the task of 'blindly' recovering a set of unknown signals, the so-called sources, from their observed mixtures, based on very little to almost no prior knowledge about the source characteristics or the mixing structure. The goal of BSS is to process multi-sensory observations of an inaccessible set of signals in a manner that reveals their individual (and original) form, by exploiting the spatial and temporal diversity readily accessible through a multi-microphone configuration. Proceeding blindly exhibits a number of advantages, since assumptions about the room configuration and the source-to-sensor geometry can be relaxed without affecting overall efficiency. This booklet investigates one of the most commercially attractive applications of BSS, which is the simultaneous recovery of signals inside a reverberant (naturally echoing) environment, using two (or more) microphones. In this paradigm, each microphone captures not only the direct contributions from each source, but also several reflected copies of the original signals at different propagation delays. These recordings are referred to as the convolutive mixtures of the original sources. The goal of this booklet in the lecture series is to provide insight into recent advances in algorithms, which are ideally suited for blind signal separation of convolutive speech mixtures. More importantly, specific emphasis is given to practical applications of the developed BSS algorithms associated with real-life scenarios. The developed algorithms are put in the context of modern DSP devices, such as hearing aids and cochlear implants, where design requirements dictate low power consumption and call for portability and compact size. Along these lines, this booklet focuses on modern BSS algorithms which address (1) the limited amount of processing power and (2) the small number of microphones available to the end-user. Table of Contents: Fundamentals of blind signal separation / Modern blind signal separation algorithms / Application of blind signal processing strategies to noise reduction for the hearing-impaired / Conclusions and future challenges / Bibliography View full abstract»
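
    A hedged demonstration of the BSS idea on the simpler instantaneous mixing case (the book's focus is the harder convolutive case): scikit-learn's FastICA recovers two sources from two simulated microphone mixtures, up to permutation and scaling. The sources and mixing matrix are illustrative.

        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 1, 4000)
        s = np.c_[np.sin(2 * np.pi * 7 * t),              # source 1: sinusoid
                  np.sign(np.sin(2 * np.pi * 3 * t))]     # source 2: square wave
        A = np.array([[1.0, 0.6],
                      [0.4, 1.0]])                        # instantaneous 2-mic mixing
        x = s @ A.T                                       # observed mixtures

        est = FastICA(n_components=2, random_state=0).fit_transform(x)
        print(est.shape)    # (4000, 2): separated sources, up to order and scale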

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Semantic Mining of Social Networks

    Copyright Year: 2015

    Morgan and Claypool eBooks

    Online social networks have already become a bridge connecting our physical daily life with the (web-based) information space. This connection produces a huge volume of data, not only about the information itself, but also about user behavior. The ubiquity of the social Web and the wealth of social data offer us unprecedented opportunities for studying the interaction patterns among users so as to understand the dynamic mechanisms underlying different networks, something that was previously difficult to explore due to the lack of available data. In this book, we present the architecture of the research for social network mining, from a microscopic point of view. We focus on investigating several key issues in social networks. Specifically, we begin with analytics of social interactions between users. The first kinds of questions we try to answer are: What are the fundamental factors that form the different categories of social ties? How have reciprocal relationships been developed from parasocial relationships? How do connected users further form groups? Another theme addressed in this book is the study of social influence. Social influence occurs when one's opinions, emotions, or behaviors are affected by others, intentionally or unintentionally. Considerable research has been conducted to verify the existence of social influence in various networks. However, few literature studies address how to quantify the strength of influence between users from different aspects. In Chapter 4 and in [138], we have studied how to model and predict user behaviors. One fundamental problem is distinguishing the effects of different social factors such as social influence, homophily, and individuals' characteristics. We introduce a probabilistic model to address this problem. Finally, we use an academic social network, ArnetMiner, as an example to demonstrate how we apply the introduced technologies for mining real social networks. In this system, we try to mine knowledge from both the informative (publication) network and the social (collaboration) network, and to understand the interaction mechanisms between the two networks. The system has been in operation since 2006 and has already attracted millions of users from more than 220 countries/regions. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Dynamic Binary Modification:Tools, Techniques and Applications

    Copyright Year: 2011

    Morgan and Claypool eBooks

    Dynamic binary modification tools form a software layer between a running application and the underlying operating system, providing the powerful opportunity to inspect and potentially modify every user-level guest application instruction that executes. Toolkits built upon this technology have enabled computer architects to build powerful simulators and emulators for design-space exploration, compiler writers to analyze and debug the code generated by their compilers, software developers to fully explore the features, bottlenecks, and performance of their software, and even end-users to extend the functionality of proprietary software running on their computers. Several dynamic binary modification systems are freely available today that place this power into the hands of the end user. While these systems are quite complex internally, they mask that complexity with an easy-to-learn API that allows a typical user to ramp up fairly quickly and build any of a number of powerful tools. Meanwhile, these tools are robust enough to form the foundation for software products in use today. This book serves as a primer for researchers interested in dynamic binary modification systems, their internal design structure, and the wide range of tools that can be built leveraging these systems. The hands-on examples presented throughout form a solid foundation for designing and constructing more complex tools, with an appreciation for the techniques necessary to make those tools robust and efficient. Meanwhile, the reader will get an appreciation for the internal design of the engines themselves. Table of Contents: Dynamic Binary Modification: Overview / Using a Dynamic Binary Modifier / Program Analysis and Debugging / Active Program Modification / Architectural Exploration / Advanced System Internals / Historical Perspectives / Summary and Observations View full abstract»
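
    The engines the book covers expose C/C++ instrumentation callbacks at the machine-instruction level; as a loose Python analogy of the same idea, sys.settrace below installs a callback that observes every executed line of a "guest" function and counts events without modifying its source.

        import sys
        from collections import Counter

        counts = Counter()

        def tracer(frame, event, arg):                 # called on every traced event
            if event == "line":
                counts[(frame.f_code.co_name, frame.f_lineno)] += 1
            return tracer

        def guest():                                   # the application under analysis
            total = 0
            for i in range(3):
                total += i
            return total

        sys.settrace(tracer)
        guest()
        sys.settrace(None)
        print(counts.most_common(3))                   # hottest lines, a toy profiler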

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Analysis of the MPEG-1 Layer III (MP3) Algorithm using MATLAB

    Copyright Year: 2011

    Morgan and Claypool eBooks

    The MPEG-1 Layer III (MP3) algorithm is one of the most successful audio formats for consumer audio storage and for transfer and playback of music on digital audio players. The MP3 compression standard along with the AAC (Advanced Audio Coding) algorithm are associated with the most successful music players of the last decade. This book describes the fundamentals and the MATLAB implementation details of the MP3 algorithm. Several of the tedious processes in MP3 are supported by demonstrations using MATLAB software. The book presents the theoretical concepts and algorithms used in the MP3 standard. The implementation details and simulations with MATLAB complement the theoretical principles. The extensive list of references enables the reader to perform a more detailed study on specific aspects of the algorithm and gain exposure to advancements in perceptual coding. Table of Contents: Introduction / Analysis Subband Filter Bank / Psychoacoustic Model II / MDCT / Bit Allocation, Quantization and Coding / Decoder View full abstract»
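
    A hedged NumPy sketch of the MDCT stage the book implements in MATLAB: a window of 2N samples maps to N coefficients via X[k] = sum_n x[n] cos[(pi/N)(n + 1/2 + N/2)(k + 1/2)], and 50%-overlapped frames give perfect reconstruction in the full codec. Windowing and block switching are omitted here.

        import numpy as np

        def mdct(frame):                   # frame of 2N samples -> N coefficients
            N = frame.size // 2
            n = np.arange(2 * N)[:, None]
            k = np.arange(N)[None, :]
            basis = np.cos(np.pi / N * (n + 0.5 + N / 2) * (k + 0.5))
            return frame @ basis

        x = np.random.default_rng(0).standard_normal(36)   # one short-block-sized frame
        print(mdct(x).shape)                               # -> (18,)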

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Understanding User-Web Interactions via Web Analytics

    Copyright Year: 2009

    Morgan and Claypool eBooks

    This lecture presents an overview of the Web analytics process, with a focus on providing insight and actionable outcomes from collecting and analyzing Internet data. The lecture first provides an overview of Web analytics, providing, in essence, a condensed version of the entire lecture. The lecture then outlines the theoretical and methodological foundations of Web analytics in order to make obvious the strengths and shortcomings of Web analytics as an approach. These foundational elements include the psychological basis in behaviorism and methodological underpinning of trace data as an empirical method. These foundational elements are illuminated further through a brief history of Web analytics from the original transaction log studies in the 1960s through the information science investigations of library systems to the focus on Websites, systems, and applications. Following a discussion of on-going interaction data within the clickstream created using log files and page tagging for analytics of Website and search logs, the lecture then presents a Web analytic process to convert these basic data to meaningful key performance indicators in order to measure likely conversions, tailored to the organizational goals or potential opportunities. Supplementary data collection techniques are addressed, including surveys and laboratory studies. The overall goal of this lecture is to provide implementable information and a methodology for understanding Web analytics in order to improve Web systems, increase customer satisfaction, and target revenue through effective analysis of user–Website interactions. Table of Contents: Understanding Web Analytics / The Foundations of Web Analytics: Theory and Methods / The History of Web Analytics / Data Collection for Web Analytics / Web Analytics Fundamentals / Web Analytics Strategy / Web Analytics as Competitive Intelligence / Supplementary Methods for Augmenting Web Analytics / Search Log Analytics / Conclusion / Key Terms / Blogs for Further Reading / References View full abstract»
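
    A hedged sketch of the step from raw clickstream records to a key performance indicator: computing a conversion rate per traffic source from page-view tuples. The log schema and the "/checkout" goal page are illustrative, not a prescribed format.

        from collections import defaultdict

        log = [  # (session_id, traffic_source, page)
            ("s1", "search", "/home"), ("s1", "search", "/checkout"),
            ("s2", "email",  "/home"),
            ("s3", "search", "/home"), ("s3", "search", "/product"),
        ]

        sessions = defaultdict(lambda: {"source": None, "converted": False})
        for sid, source, page in log:
            sessions[sid]["source"] = source
            sessions[sid]["converted"] |= (page == "/checkout")

        totals = defaultdict(lambda: [0, 0])       # source -> [conversions, sessions]
        for s in sessions.values():
            totals[s["source"]][0] += s["converted"]
            totals[s["source"]][1] += 1

        for source, (conv, n) in totals.items():
            print(source, f"{conv}/{n} = {conv / n:.0%} conversion")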

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Epistemology of Intelligent Semantic Web Systems

    Copyright Year: 2016

    Morgan and Claypool eBooks

    The Semantic Web is a young discipline, even if only in comparison to other areas of computer science. Nonetheless, it already exhibits an interesting history and evolution. This book is a reflection on this evolution, aiming to take a snapshot of where we are at this specific point in time, and also showing what might be the focus of future research.

    This book provides both a conceptual and practical view of this evolution, especially targeted at readers who are starting research in this area and as support material for their supervisors. From a conceptual point of view, it highlights and discusses key questions that have animated the research community: what does it mean to be a Semantic Web system and how is it different from other types of systems, such as knowledge systems or web-based information systems? From a more practical point of view, the core of the book introduces a simple conceptual framework which characterizes Intelligent Semantic Web Systems. We describe this framework, the components it includes, and give pointers to some of the approaches and technologies that might be used to implement them. We also look in detail at concrete systems falling under the category of Intelligent Semantic Web Systems, according to the proposed framework, allowing us to compare them, analyze their strengths and weaknesses, and identify the key fundamental challenges still open for researchers to tackle. View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Outlier Detection for Temporal Data

    Copyright Year: 2014

    Morgan and Claypool eBooks

    Outlier (or anomaly) detection is a very broad field which has been studied in the context of a large number of research areas like statistics, data mining, sensor networks, environmental science, distributed systems, spatio-temporal mining, etc. Initial research in outlier detection focused on time series-based outliers (in statistics). Since then, outlier detection has been studied on a large variety of data types including high-dimensional data, uncertain data, stream data, network data, time series data, spatial data, and spatio-temporal data. While there have been many tutorials and surveys for general outlier detection, we focus on outlier detection for temporal data in this book. A large number of applications generate temporal datasets. For example, in our everyday life, various kinds of records like credit, personnel, financial, judicial, medical, etc., are all temporal. This stresses the need for an organized and detailed study of outliers with respect to such temporal data. In the past decade, there has been a lot of research on various forms of temporal data including consecutive data snapshots, series of data snapshots and data streams. Besides the initial work on time series, researchers have focused on rich forms of data including multiple data streams, spatio-temporal data, network data, community distribution data, etc. Compared to general outlier detection, techniques for temporal outlier detection are very different. In this book, we will present an organized picture of both recent and past research in temporal outlier detection. We start with the basics and then ramp up the reader to the main ideas in state-of-the-art outlier detection techniques. We motivate the importance of temporal outlier detection and briefly describe the challenges beyond usual outlier detection. Then, we present a taxonomy of proposed techniques for temporal outlier detection. Such techniques broadly include statistical techniques (like AR models, Markov models, histograms, neural networks), distance- and density-based approaches, grouping-based approaches (clustering, community detection), network-based approaches, and spatio-temporal outlier detection approaches. We summarize by presenting a wide collection of applications where temporal outlier detection techniques have been applied to discover interesting outliers. View full abstract»
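
    A minimal sketch of one statistical technique from the taxonomy: flag a point as a temporal outlier when it deviates from a short moving-average prediction by more than a few standard deviations. The window size and threshold are illustrative choices.

        def temporal_outliers(series, window=5, threshold=3.0):
            flagged = []
            for i in range(window, len(series)):
                hist = series[i - window:i]
                mean = sum(hist) / window
                std = (sum((v - mean) ** 2 for v in hist) / window) ** 0.5 or 1e-9
                if abs(series[i] - mean) > threshold * std:
                    flagged.append(i)              # point breaks the local model
            return flagged

        data = [10, 11, 10, 12, 11, 10, 45, 11, 12, 10]   # 45 is the anomaly
        print(temporal_outliers(data))                     # -> [6]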

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    The Garbage Crisis:A Global Challenge for Engineers

    Copyright Year: 2013

    Morgan and Claypool eBooks

    This book will focus on "Waste Management," a serious global issue and engineers' responsibility towards finding better solutions for its sustainable management. Solid waste management is one of the major environmental burdens in both developed and developing countries alike. An alarming rate of solid waste generation trends can be seen as a result of globalization, industrialization, and rapid economic development. However, low-income and marginalized sectors in society suffer most from the unfavorable conditions deriving from poor waste management. Solid waste management is not a mere technical challenge. The environmental impact, socio-economic, cultural, institutional, legal, and political aspects are fundamental in planning, designing, and maintaining a sustainable waste management system in any country. Engineers have a major role to play in designing proper systems that integrate stakeholders, waste system elements, and sustainability aspects of waste management. This book is part of a focused collection from a project on Engineering and Education for Social and Environmental Justice. It takes an explicitly social and environmental justice stance on waste and attempts to assess the social impact of waste management on those who are also the most economically vulnerable and least powerful in the society. We hope that this book will assist our readers to think critically and understand the framework of socially and environmentally just waste management. Table of Contents: Introduction / Towards a Just Politics of Waste Management / Expertise, Indigenous People, and the Site 41 Landfill / Waste Management in the Global North / Waste Management in the Global South: A Sri Lankan Case Study / Assessing the Feasibility of Waste for Life in the Western Province of Sri Lanka View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Introduction to Engineering:A Starter's Guide with Hands-On Analog Multimedia Explorations

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This lecture provides a hands-on glimpse of the field of electrical engineering. The introduced applications utilize the NI ELVIS hardware and software platform to explore concepts such as circuits, power, analog sensing, and introductory analog signal processing such as signal generation, analog filtering, and audio and music processing. These principles and technologies are introduced in a very practical way and are fundamental to many of the electronic devices we use today. Some examples include photodetection, analog signal (audio, light, temperature) level meter, and analog music equalizer. Table of Contents: Getting Familiar with NI ELVIS / Analog Signal Level Meter Using LEDs / Noise Removal Using Analog Filters / Music Equalizer Using Op-Amps: Volume and Treble Control / Music Composer Using 555 Timers View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Geometric Continuity of Curves and Surfaces

    Copyright Year: 2016

    Morgan and Claypool eBooks

    This book is written for students, CAD system users and software developers who are interested in geometric continuity, a notion needed in everyday practice of Computer-Aided Design and also a hot subject of research. It contains a description of the classical geometric spline curves and a solid theoretical basis for various constructions of smooth surfaces. Textbooks on computer graphics usually cover the most basic and necessary information about spline curves and surfaces in order to explain simple algorithms. In textbooks on geometric design, one can find more details, more algorithms and more theory. This book teaches how various parts of the theory can be gathered together and turned into constructions of smooth curves and smooth surfaces of arbitrary topology.

    The mathematical background needed to understand this book is similar to what is necessary to read other textbooks on geometric design; most of it is basic linear algebra and analysis. More advanced mathematical material is introduced using elementary explanations. Reading Geometric Continuity of Curves and Surfaces provides an excellent opportunity to recall and exercise necessary mathematical notions and it may be your next step towards better practice and higher understanding of design principles. View full abstract»
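
    As a concrete instance of the smoothness notions the book builds on (standard definitions, not quoted from the text), parametric and geometric continuity at a curve joint differ as follows:

        Two segments $P, Q : [0,1] \to \mathbb{R}^3$ join with $C^1$ continuity when
        $P(1) = Q(0)$ and $P'(1) = Q'(0)$, but only $G^1$ (geometric) continuity when
        the tangent directions agree up to a positive factor:
        \[
          P(1) = Q(0), \qquad P'(1) = \lambda\, Q'(0), \quad \lambda > 0 .
        \]

    A reparameterization can remove the difference, which is why the geometric notion is the natural one for shape.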

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Generalized Transmission Line Method to Study the Far-zone Radiation of Antennas Under a Multilayer Structure

    Copyright Year: 2008

    Morgan and Claypool eBooks

    This book gives a step-by-step presentation of a generalized transmission line method to study the far-zone radiation of antennas under a multilayer structure. Normally, a radiation problem requires a full wave analysis which may be time consuming. The beauty of the generalized transmission line method is that it transforms the radiation problem for a specific type of structure, say the multilayer structure excited by an antenna, into a circuit problem that can be efficiently analyzed. Using the Reciprocity Theorem and far-field approximation, the method computes the far-zone radiation due to a Hertzian dipole within a multilayer structure by solving an equivalent transmission line circuit. Since an antenna can be modeled as a set of Hertzian dipoles, the method could be used to predict the far-zone radiation of an antenna under a multilayer structure. The analytical expression for the far-zone field is derived for a structure with or without a polarizer. The procedure of obtaining the Hertzian dipole model that is required by the generalized transmission line method is also described. Several examples are given to demonstrate the capabilities, accuracy, and efficiency of this method. Table of Contents: Antennas Under a Multilayer Dielectric Slab / Antennas Under a Polarized Multilayer Structure / Hertzian Dipole Model for an Antenna / Bibliography / Biography View full abstract»

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Data Stream Management

    Copyright Year: 2010

    Morgan and Claypool eBooks

    Many applications process high volumes of streaming data, among them Internet traffic analysis, financial tickers, and transaction log mining. In general, a data stream is an unbounded data set that is produced incrementally over time, rather than being available in full before its processing begins. In this lecture, we give an overview of recent research in stream processing, ranging from answering simple queries on high-speed streams to loading real-time data feeds into a streaming warehouse for off-line analysis. We will discuss two types of systems for end-to-end stream processing: Data Stream Management Systems (DSMSs) and Streaming Data Warehouses (SDWs). A traditional database management system typically processes a stream of ad-hoc queries over relatively static data. In contrast, a DSMS evaluates static (long-running) queries on streaming data, making a single pass over the data and using limited working memory. In the first part of this lecture, we will discuss research problems in DSMSs, such as continuous query languages, non-blocking query operators that continually react to new data, and continuous query optimization. The second part covers SDWs, which combine the real-time response of a DSMS (loading new data as soon as they arrive) with a data warehouse's ability to manage terabytes of historical data on secondary storage. Table of Contents: Introduction / Data Stream Management Systems / Streaming Data Warehouses / Conclusions View full abstract»
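
    A hedged sketch of a DSMS-style non-blocking operator: a continuous sliding-window average that consumes each tuple exactly once and keeps only O(window) state, emitting an updated answer as data arrive. The window size is illustrative.

        from collections import deque

        def windowed_average(stream, window=3):
            buf, total = deque(), 0.0
            for value in stream:              # single pass over unbounded input
                buf.append(value)
                total += value
                if len(buf) > window:
                    total -= buf.popleft()    # expire the oldest tuple
                yield total / len(buf)        # continuous, incremental answer

        for avg in windowed_average([4, 8, 6, 10, 2], window=3):
            print(avg)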

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    Faceted Search

    Copyright Year: 2009

    Morgan and Claypool eBooks

    We live in an information age that requires us, more than ever, to represent, access, and use information. Over the last several decades, we have developed a modern science and technology for information retrieval, relentlessly pursuing the vision of a "memex" that Vannevar Bush proposed in his seminal article, "As We May Think." Faceted search plays a key role in this program. Faceted search addresses weaknesses of conventional search approaches and has emerged as a foundation for interactive information retrieval. User studies demonstrate that faceted search provides more effective information-seeking support to users than best-first search. Indeed, faceted search has become increasingly prevalent in online information access systems, particularly for e-commerce and site search. In this lecture, we explore the history, theory, and practice of faceted search. Although we cannot hope to be exhaustive, our aim is to provide sufficient depth and breadth to offer a useful resource to both researchers and practitioners. Because faceted search is an area of interest to computer scientists, information scientists, interface designers, and usability researchers, we do not assume that the reader is a specialist in any of these fields. Rather, we offer a self-contained treatment of the topic, with an extensive bibliography for those who would like to pursue particular aspects in more depth. Table of Contents: I. Key Concepts / Introduction: What Are Facets? / Information Retrieval / Faceted Information Retrieval / II. Research and Practice / Academic Research / Commercial Applications / III. Practical Concerns / Back-End Concerns / Front-End Concerns / Conclusion / Glossary View full abstract»
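
    A minimal sketch of the mechanism behind faceted navigation: for each facet value, count how many items in the current result set carry it, so the interface can offer refinements with result counts. The catalog fields are illustrative.

        from collections import Counter

        items = [
            {"format": "ebook", "year": 2009, "topic": "IR"},
            {"format": "ebook", "year": 2013, "topic": "graphics"},
            {"format": "print", "year": 2009, "topic": "IR"},
        ]

        def facet_counts(results, facet):
            return Counter(item[facet] for item in results)

        print(facet_counts(items, "year"))                  # Counter({2009: 2, 2013: 1})
        narrowed = [i for i in items if i["year"] == 2009]  # user clicks the 2009 facet
        print(facet_counts(narrowed, "topic"))              # counts within the refinement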

  • Full text access may be available. Click article title to sign in or learn about subscription options.

    A Gyrovector Space Approach to Hyperbolic Geometry

    Copyright Year: 2009

    Morgan and Claypool eBooks