
Potentials, IEEE

Issue 5 • Date Dec. 1995-Jan. 1996

  • Getting on board

    Publication Year: 1995

  • Fast simulation of HDL models

    Publication Year: 1995 , Page(s): 14 - 17

    The use of large application-specific integrated circuits (ASICs), typically with at least 50,000 (50K) gates, is on the rise. Hardware design methodology has turned to computer-aided engineering (CAE) tools, built around hardware description languages (HDLs), to simulate designs and verify performance. Digital systems are becoming increasingly complex, and simulating designs with conventional or optimized (sequential) HDL simulators, or even with specialized hardware accelerators, cannot keep up with this growth. It is not unusual for a single simulation run with one test vector for a 50K-gate ASIC design to take hours, which increases hardware design cost and prolongs time-to-market. One solution is to reduce the total amount of simulation performed, or simply to simulate a less complex model; however, this increases the risk of producing a faulty circuit. Unfortunately, these shortcuts are common in today's competitive hardware design industry, where the company that first releases a new and better design is the one most likely to survive. Hence, there are serious doubts in the hardware design community that existing HDL tools are suitable for large, complex designs. Ways exist to improve the situation, however; one approach we believe is promising is to use parallel simulation methodologies.

  • Best of both worlds [formal and semi-formal software engineering]

    Publication Year: 1995 , Page(s): 29 - 32

    The objective of software engineering is modeling software systems to support effective system synthesis and analysis. The traditional approach uses semi-formal models and accompanying design methodologies. Formal models offer an alternative, supporting precise representation and proof capabilities. These two approaches are complementary, not competitive. Formal methods have had difficulty finding their way into the software engineering lifecycle: although excellent CASE tools exist to support semi-formal methods, tools supporting formal methods have yet to emerge. Model checkers, proof tools and the like exist for formal models, but they provide little support for the software lifecycle, and traditional CASE environments provide little or no support for formal models. In addition, formal methods and the mathematics necessary to support them have not found their way into the standard software engineering curriculum. Thus, formal methods practitioners have found it difficult to present their work in some industrial settings. Using formal and semi-formal methods together should satisfy both the need to predict behavior and the need for understandable system representations. Although this article presents only structured analysis, formal methods can contribute to virtually any semi-formal software engineering methodology in the same manner.

  • Local area networks

    Publication Year: 1995 , Page(s): 6 - 10
    Cited by:  Papers (1)

    The processing power of today's computers is relatively ineffective unless they can be efficiently interconnected, both locally and globally. On the horizon is asynchronous transfer mode (ATM), which addresses the switching paradigm for broadband technologies. ATM allocates the total network bandwidth and time to bring multimedia services to the user. Thus, the future of the LAN is ensured: it is the technology for connecting global databases, worldwide information sources, and data/video/voice communication, to mention just a few. The ability to communicate anywhere, anytime has permeated the workplace and personal lives of people. Local area networks provide this ability, both as the medium for linking the office/work environment and as a gateway to the world.

  • Using fuzzy set theory

    Publication Year: 1995 , Page(s): 33 - 35

    Two main trends are discussed: fuzzy set theory and fuzzy logic, which build upon classical set theory and logic, respectively. Three features distinguish these approaches: (1) the use of so-called linguistic variables, instead of or together with numeric variables; (2) the use of fuzzy conditional statements to represent simple relations between variables; and (3) the characterization of complex relations by fuzzy algorithms. Fuzzy linguistic variables and fuzzy algorithms offer an effective, more flexible way to describe the behavior of systems too complex for a classical mathematical model. They have been very successful in economics, management science, artificial intelligence, information retrieval, pattern recognition, image processing, psychology, biology, and other fields rendered inherently fuzzy due to the unpredictable behavior of their components. Expert systems, fuzzy neural computing and pattern recognition are discussed in some detail.
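    The linguistic variables and fuzzy conditionals the abstract describes can be sketched in a few lines of Python. This is an illustrative toy, not material from the article: the variable name, membership ranges, and rule are invented.

```python
# A linguistic variable "temperature" partitioned into fuzzy sets,
# each defined by a triangular membership function.

def triangular(a, b, c):
    """Return a membership function that peaks at b and is zero outside [a, c]."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# The fuzzy values of the linguistic variable (ranges are invented).
temperature = {
    "cold": triangular(-10, 0, 15),
    "warm": triangular(10, 20, 30),
    "hot":  triangular(25, 35, 50),
}

# A fuzzy conditional: IF temperature is warm THEN fan_speed is medium.
# The rule fires to the degree that its antecedent holds.
def rule_strength(x):
    return temperature["warm"](x)

for x in (12, 20, 28):
    degrees = {name: round(mu(x), 2) for name, mu in temperature.items()}
    print(x, degrees, "rule fires at", round(rule_strength(x), 2))
```

    Note that at 20 degrees a reading belongs fully to "warm", while at 12 degrees it belongs partially to both "cold" and "warm" — the graded membership that distinguishes fuzzy sets from classical ones.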

  • Waiting in line [queuing theory]

    Publication Year: 1995 , Page(s): 11 - 13
    Cited by:  Papers (2)

    The queuing phenomenon pervades the activities of our lives. We have to wait in line whenever the number of servers, or the service rate of a server, does not match the rate at which customers arrive in the queue. Queuing theory is the formal analysis of this phenomenon, seeking the optimum configuration so that everybody gets service without waiting in line for a long time. After introducing queuing theory, the author discusses real-world applications of the theory.
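    The balance between arrival rate and service rate that the abstract mentions is captured by the classic M/M/1 model (one server, random arrivals and service times). The sketch below uses the standard textbook formulas, not anything from the article; the rates in the example are invented.

```python
# Steady-state metrics for an M/M/1 queue: customers arrive at rate lam,
# a single server works at rate mu. Stable only while lam < mu.

def mm1_metrics(lam, mu):
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                  # server utilization
    L = rho / (1 - rho)             # mean number of customers in the system
    W = 1 / (mu - lam)              # mean time in system (Little's law: L = lam * W)
    Wq = rho / (mu - lam)           # mean time spent waiting in line
    return {"utilization": rho, "avg_in_system": L,
            "avg_time_in_system": W, "avg_wait": Wq}

# Example: 8 customers/hour arrive, the server handles 10/hour.
m = mm1_metrics(8, 10)
print(m)
```

    With these numbers the server is 80% utilized and the average wait is already 0.4 hours; as the arrival rate approaches the service rate, waiting times grow without bound, which is why matching the two rates matters so much in practice.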

  • Power SAWs

    Publication Year: 1995 , Page(s): 18 - 20

    Applications employing digital signal processing (DSP) have mushroomed over the past fifteen years. DSP techniques are now employed in television and radio receivers, telephones, automobile engines, medical diagnostics, robotics, and military systems, to name just a few. DSP's popularity stems from its flexibility, and its growth parallels the advances made in low-cost computers and specialized integrated circuits (ICs). In a few applications, however, digital techniques are either too slow or consume too much power to be practical; examples include ultra-high frequency (UHF) filters and convolvers. In such applications, analog techniques can often perform the same tasks much more efficiently. Of the analog techniques, those employing surface acoustic waves (SAWs) offer unique capabilities for processing signals in the 30 MHz to 2 GHz frequency range. We examine this technology and review some of its applications.

  • Mirroring our thought processes [recurrent neural network and time series in forecasting]

    Publication Year: 1995 , Page(s): 36 - 41

    To employ simple exponential smoothing in statistical forecasting, we essentially have to assume that the time series fluctuates about a gradually changing mean level. Forecasts are created iteratively as weighted averages of observed values in the time series, with the weights assigned unequally: heavier weights go to the most recent observations, and exponentially declining weights to observations made far in the past. Yet simple exponential smoothing alone cannot guarantee accurate predictions; one still has to monitor the forecasting system to determine whether the weights need to be adjusted to reduce forecasting errors. Since artificial neural network (ANN) technology provides weight-adjusting algorithms, we propose using a special ANN architecture, a simple recurrent neural network, to give a simple exponential smoothing forecasting system an adaptive weighting scheme.
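    The classical baseline the abstract builds on can be written as a short recursion, F(t+1) = alpha*x(t) + (1-alpha)*F(t), where the fixed constant alpha is exactly the weight the article proposes letting a recurrent network adapt. The sketch below shows only the classical scheme with an invented demand series; it is not the article's network.

```python
# Simple exponential smoothing: each one-step-ahead forecast is a weighted
# average of past observations with exponentially declining weights.

def exponential_smoothing(series, alpha):
    """Return one-step-ahead forecasts; alpha in (0, 1) controls recency weighting."""
    forecast = series[0]            # initialize with the first observation
    forecasts = [forecast]
    for x in series[1:]:
        forecast = alpha * x + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

demand = [100, 104, 101, 108, 110, 107]   # invented example data
print(exponential_smoothing(demand, alpha=0.3))
```

    Unrolling the recursion shows that an observation k steps in the past receives weight alpha*(1-alpha)**k, the exponential decline described above; a larger alpha tracks recent changes faster but smooths less.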


Aims & Scope

IEEE Potentials is the magazine dedicated to undergraduate and graduate students and young professionals.


Meet Our Editors

Editor-in-Chief
David Tian
Carnegie Mellon University
david.tian@ieee.org