
IEEE Transactions on Systems Science and Cybernetics

Issue 3 • September 1968


  • Table of contents

    Publication Year: 1968, Page(s): c1
    Freely Available from IEEE
  • IEEE Systems Science and Cybernetics Group

    Publication Year: 1968, Page(s): c2
    Freely Available from IEEE
  • Foreword

    Publication Year: 1968, Page(s): 199
    Freely Available from IEEE
  • A Tutorial Introduction to Decision Theory

    Publication Year: 1968, Page(s): 200 - 210
    Cited by: Papers (8)

    Decision theory provides a rational framework for choosing between alternative courses of action when the consequences resulting from this choice are imperfectly known. Two streams of thought serve as the foundations: utility theory and the inductive use of probability theory. The intent of this paper is to provide a tutorial introduction to this increasingly important area of systems science. The foundations are developed on an axiomatic basis, and a simple example, the "anniversary problem," is used to illustrate decision theory. The concept of the value of information is developed and demonstrated. At times mathematical rigor has been subordinated to provide a clear and readily accessible exposition of the fundamental assumptions and concepts of decision theory. A sampling of the many elegant and rigorous treatments of decision theory is provided among the references.

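
    The two foundations named in the abstract, utility theory plus the inductive use of probability, combine mechanically once payoffs and a prior are written down. The sketch below works a two-state, two-action decision in the spirit of the "anniversary problem" and prices clairvoyance; every number in it (the 0.2 prior and all payoffs) is an illustrative assumption, not a value from the paper.

    ```python
    # Illustrative payoffs and prior (assumptions, not from the paper).
    P_ANNIVERSARY = 0.2  # assumed prior probability that today is the anniversary

    # payoffs[action][state]: assumed utility of each (action, state) pair
    payoffs = {
        "buy flowers": {"anniversary": 10, "not": 5},
        "do nothing":  {"anniversary": -10, "not": 8},
    }

    def expected_payoff(action, p):
        return p * payoffs[action]["anniversary"] + (1 - p) * payoffs[action]["not"]

    # Best action under the prior, i.e. the decision without further information.
    best_action = max(payoffs, key=lambda a: expected_payoff(a, P_ANNIVERSARY))
    value_prior = expected_payoff(best_action, P_ANNIVERSARY)

    # With perfect information the best action is chosen separately in each state.
    value_perfect = (P_ANNIVERSARY * max(row["anniversary"] for row in payoffs.values())
                     + (1 - P_ANNIVERSARY) * max(row["not"] for row in payoffs.values()))

    evpi = value_perfect - value_prior  # the most one should pay for clairvoyance
    ```

    The gap between the two values is the expected value of perfect information, the quantity the paper develops as "the value of information."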
  • The Foundations of Decision Analysis

    Publication Year: 1968, Page(s): 211 - 219
    Cited by: Papers (27)

    Decision analysis has emerged from theory to practice to form a discipline for balancing the many factors that bear upon a decision. Unusual features of the discipline are the treatment of uncertainty through subjective probability and of attitude toward risk through utility theory. Capturing the structure of problem relationships occupies a central position; the process can be visualized in a graphical problem space. These features are combined with other preference measures to produce a useful conceptual model for analyzing decisions, the decision analysis cycle. In its three phases (deterministic, probabilistic, and informational), the cycle progressively determines the importance of variables in deterministic, probabilistic, and economic environments. The ability to assign an economic value to the complete or partial elimination of uncertainty through experimentation is a particularly important characteristic. Recent applications in business and government indicate that the increased logical scope afforded by decision analysis offers new opportunities for rationality to those who wish it.

  • Decision Analysis in a Corporation

    Publication Year: 1968, Page(s): 220 - 226
    Cited by: Papers (1)

    The practical advantage of decision analysis is the decomposition of a complex problem into simpler parts that it makes possible. The consequences of a decision can be described in terms of contingent payoffs and then evaluated via independently assessed risk preferences (codified in a utility measure) and likelihood judgments (codified in a probability measure). In principle an (axiomatically consistent) individual can or should specify the requisite measures directly via various devices. Decision analysis in a multiperson enterprise, however, requires additional methods to construct such measures when they exist, or alternative measures if necessary. This paper reviews a "cooperative sharing" approach to multiperson decision analysis and compares it to the economic theory of risk markets. The qualitative properties of surrogate measures for an enterprise are described and related to the role of financial instruments, such as stocks and bonds. Cooperative behavior is assumed for the most part, but the role of game theory under uncertainty is also described briefly.

  • Prior Probabilities

    Publication Year: 1968, Page(s): 227 - 241
    Cited by: Papers (126) | Patents (1)

    In decision theory, mathematical analysis shows that once the sampling distribution, loss function, and sample are specified, the only remaining basis for a choice among different admissible decisions lies in the prior probabilities. Therefore, the logical foundations of decision theory cannot be put in fully satisfactory form until the old problem of arbitrariness (sometimes called "subjectiveness") in assigning prior probabilities is resolved. The principle of maximum entropy represents one step in this direction. Its use is illustrated, and a correspondence property between maximum-entropy probabilities and frequencies is demonstrated. The consistency of this principle with the principles of conventional "direct probability" analysis is illustrated by showing that many known results may be derived by either method. However, an ambiguity remains in setting up a prior on a continuous parameter space because the results lack invariance under a change of parameters; thus a further principle is needed. It is shown that in many problems, including some of the most important in practice, this ambiguity can be removed by applying methods of group theoretical reasoning which have long been used in theoretical physics. By finding the group of transformations on the parameter space which convert the problem into an equivalent one, a basic desideratum of consistency can be stated in the form of functional equations which impose conditions on, and in some cases fully determine, an "invariant measure" on the parameter space.

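
    The principle of maximum entropy can be made concrete on Jaynes' familiar dice illustration: among all distributions on faces 1..6 with a prescribed mean, entropy is maximized by p_k ∝ exp(λk), with the multiplier λ fixed by the mean constraint. The sketch below solves for λ by bisection; the target mean of 4.5 follows the well-known dice example, and the numeric tolerances are arbitrary choices.

    ```python
    import math

    FACES = range(1, 7)

    def maxent_dist(lam):
        """Maximum-entropy distribution on 1..6 for multiplier lam: p_k ∝ exp(lam*k)."""
        w = [math.exp(lam * k) for k in FACES]
        z = sum(w)
        return [wk / z for wk in w]

    def dist_mean(p):
        return sum(k * pk for k, pk in zip(FACES, p))

    def solve_lambda(target_mean, lo=-10.0, hi=10.0, tol=1e-12):
        # dist_mean(maxent_dist(lam)) is increasing in lam, so bisection applies.
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if dist_mean(maxent_dist(mid)) < target_mean:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    p = maxent_dist(solve_lambda(4.5))  # maxent prior with mean constrained to 4.5
    ```

    Because the constrained mean 4.5 exceeds the uniform mean 3.5, the solved λ is positive and the probabilities rise monotonically with the face value.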
  • The Widget Problem Revisited

    Publication Year: 1968, Page(s): 241 - 248
    Cited by: Papers (3)

    The Jaynes "widget problem" is reviewed as an example of an application of the principle of maximum entropy in the making of decisions. The exact solution yields an unusual probability distribution. The problem illustrates why some kinds of decisions can be made intuitively and accurately, but would be difficult to rationalize without the principle of maximum entropy.

  • Probabilistic Information Processing Systems: Design and Evaluation

    Publication Year: 1968, Page(s): 248 - 265
    Cited by: Papers (8)

    A Probabilistic Information Processing System (PIP) uses men and machines in a novel way to perform diagnostic information processing. Men estimate likelihood ratios for each datum and each pair of hypotheses under consideration or a sufficient subset of these pairs. A computer aggregates these estimates by means of Bayes' theorem of probability theory into a posterior distribution that reflects the impact of all available data on all hypotheses being considered. Such a system circumvents human conservatism in information processing, the inability of men to aggregate information in such a way as to modify their opinions as much as the available data justify. It also fragments the job of evaluating diagnostic information into small separable tasks. The posterior distributions that are a PIP's output may be used as a guide to human decision making or may be combined with a payoff matrix to make decisions by means of the principle of maximizing expected value. A large simulation-type experiment compared a PIP with three other information processing systems in a simulated strategic war setting of the 1970s. The difference between PIP and its competitors was that in PIP the information was aggregated by computer, while in the other three systems, the operators aggregated the information in their heads. PIP processed the information dramatically more efficiently than did any competitor. Data that would lead PIP to give 99:1 odds in favor of a hypothesis led the next best system to give 4½:1 odds.

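
    The computer's share of the work in a PIP is a single application of Bayes' theorem in odds form: posterior odds equal prior odds times the product of the human-estimated likelihood ratios. A minimal sketch, with likelihood-ratio values that are illustrative assumptions, not data from the experiment:

    ```python
    # Bayes' theorem in odds form, the aggregation step a PIP delegates to the machine.

    def posterior_odds(prior_odds, likelihood_ratios):
        """Odds(H1:H2 | data) = Odds(H1:H2) * product of P(datum|H1)/P(datum|H2)."""
        odds = prior_odds
        for lr in likelihood_ratios:
            odds *= lr
        return odds

    # Three data, each judged twice as likely under H1 as under H2:
    odds = posterior_odds(1.0, [2.0, 2.0, 2.0])   # even prior odds, three updates
    prob_h1 = odds / (1 + odds)                   # odds converted to a probability
    ```

    The multiplicative form is what lets each datum be judged as a small separable task, exactly the fragmentation the abstract describes.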
  • Economic Objectives and Decision Problems

    Publication Year: 1968, Page(s): 266 - 270
    Cited by: Papers (2)

    This paper surveys some classical decision problems with and without uncertainty. From the survey, it is concluded that the natural generalization of these problems leads to the problem of describing preference orderings over sets of stochastic processes. It is shown that the decision maker can describe a preference ordering of this kind by stating that he is exposed to a risk, represented by a stochastic process, and that his objective is to find the decision which will minimize the probability of his ruin. If this probability is equal to one, the natural objective is to maximize the expected time before ruin occurs.

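
    The "minimize the probability of ruin" objective can be illustrated on the simplest risk process, the classical unit-stake gambler's-ruin walk. That model and the numbers below are choices made here for illustration; the paper works with general stochastic processes.

    ```python
    # Classical gambler's-ruin probability for a random walk that gains 1 with
    # probability p_win and loses 1 otherwise, absorbing at 0 (ruin) and `target`.

    def ruin_probability(p_win, capital, target):
        """P(hit 0 before reaching `target`) starting from `capital`."""
        if p_win == 0.5:
            return (target - capital) / target
        r = (1 - p_win) / p_win          # the ratio q/p driving the classical formula
        return (r**capital - r**target) / (1 - r**target)

    ruin_fair = ruin_probability(0.5, 5, 10)     # fair game from half the target
    ruin_favored = ruin_probability(0.6, 5, 10)  # favorable game, same position
    ```

    Comparing decisions by the ruin probabilities they induce is the preference ordering the abstract proposes: here the favorable game is preferred because it is ruined less often.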
  • The Consistent Assessment and Fairing of Preference Functions

    Publication Year: 1968, Page(s): 270 - 278
    Cited by: Papers (2)

    When a decision maker is assessing a preference (utility) function for assets (wealth), it is natural for him to start by making some quantitative assessments of the certainty equivalents of a few simple gambles and some qualitative statements specifying any regions in which he feels risk-averse or risk-seeking and any regions in which he feels decreasingly or increasingly risk-averse or risk-seeking. Several questions then arise. Does any preference function exist which satisfies all the quantitative and qualitative restrictions simultaneously, that is, are the restrictions consistent? If so, how far do they determine the preference function? How might one fair a "smooth" function satisfying the restrictions? This paper is addressed to these questions. First the problem is introduced in some detail, and the concepts involved reviewed. Then the case is considered where the qualitative restrictions only specify regions of risk-aversion or risk-seeking. It turns out in this case that all the restrictions are linear in certain quantities, so that the existence problem is essentially one of satisfying linear constraints. Furthermore, finding the maximum or minimum solution at a specified point is exactly a linear programming problem. Also discussed briefly are the possibility that some smoothing problems might simply introduce a nonlinear objective function (though the general smoothing problem is more complicated) and the problem of making the derivative of the preference function continuous (which is not always possible). If regions of increasing or decreasing risk-aversion are also given, the problem becomes much more difficult.

  • The Development of a Corporate Risk Policy for Capital Investment Decisions

    Publication Year: 1968, Page(s): 279 - 300
    Cited by: Papers (3) | Patents (2)

    A corporate utility function plays a key role in the application of decision theory. This paper describes how such a utility function was evolved as a risk policy for capital investment decisions. First, 36 corporate executives were interviewed and their risk attitudes were quantified. From the responses of the interviewees, a mathematical function was developed that could reflect each interviewee's attitude. The fit of the function was tested by checking the reaction of the interviewees to adjusted responses. The functional form that led the interviewees to prefer the adjusted responses to their initial responses was finally accepted. The mathematical form of the function was considered a flexible pattern for a risk policy. The assumption was made that the corporate risk policy would be of this pattern. With the pattern for a risk policy set, it was possible to simplify the method of deriving a particular individual's risk attitude. Using the simplified method, the corporate policy makers were interviewed once more. The results from these interviews were then used as a starting point in two negotiation sessions. As a result of these negotiation sessions, the policy makers agreed on a risk policy for trial purposes. They also agreed to develop a number of major projects using the concepts of risk analysis and the certainty equivalent.

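
    One standard way to codify a risk attitude of the kind elicited in such interviews is an exponential utility u(x) = -exp(-x/R) with risk tolerance R; the certainty equivalent of a gamble is then u⁻¹(E[u(x)]). The functional form, the gamble, and the value of R below are illustrative assumptions, not the form the paper's negotiations settled on.

    ```python
    import math

    def certainty_equivalent(outcomes, probs, risk_tolerance):
        """Certain amount valued equally with the gamble under exponential utility."""
        eu = sum(p * -math.exp(-x / risk_tolerance) for x, p in zip(outcomes, probs))
        return -risk_tolerance * math.log(-eu)

    # A 50/50 gamble between 0 and 100 has expected value 50; a risk-averse
    # policy values it at strictly less than 50.
    ce = certainty_equivalent([0.0, 100.0], [0.5, 0.5], risk_tolerance=100.0)
    ```

    Agreeing on a single risk-tolerance-style parameter is one way a negotiated corporate risk policy can be reduced to a number that prices every project's uncertainty consistently.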
  • An Abbreviated States of the World Decision Model

    Publication Year: 1968, Page(s): 300 - 306

    Terminal acts and states of the world are specified, but detailed consequences may not be explicitly listed in an abbreviated states model. Axiomatic analyses of this model, with and without experimentation prior to the selection of a terminal act, are presented. In the abbreviated model, state probabilities are not defined, but the analysis is similar in many ways to more specialized models that permit derivations of state probabilities.

  • Memory Limitation and Multistage Decision Processes

    Publication Year: 1968, Page(s): 307 - 316
    Cited by: Papers (10)

    Sequential decision models have heretofore assumed a full memory decision maker. That is, the model is permitted to retain, to any degree of precision, all information needed to optimize decision performance. This information may include functions or variables that change with observations and thus often implies a decision maker which possesses a large amount of soft (erasable) memory. In simple multistage decision problems soft memory can be reduced to two variables: the log-odds ratio L and the available number of observations n. The log-odds ratio is a quantitative measure of the decision maker's opinion of the cause of the observed variate. This paper examines the effect of limiting the decision maker's soft memory by specifying an m-bit register for the random variable L. The theory for limited memory multistage decision processes is presented in which there are two simple hypotheses. Numerical results indicate that the 3-bit memory is, for practical purposes, equivalent to a full memory decision maker.

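
    The soft-memory variable can be followed in a toy version of the setup: a full-memory decision maker accumulates the exact log-odds L, while a limited one rounds L into an m-bit register after every observation. The rounding scheme below is a naive uniform quantizer over an assumed range, cruder than the decision-process design the paper analyzes, and the per-datum log-likelihood increments are made up for illustration.

    ```python
    def quantize(L, bits, lo=-5.0, hi=5.0):
        """Round L to one of 2**bits evenly spaced levels covering [lo, hi]."""
        levels = 2 ** bits
        step = (hi - lo) / (levels - 1)
        idx = round((min(max(L, lo), hi) - lo) / step)
        return lo + idx * step

    def final_log_odds(observation_llrs, bits=None):
        """Accumulate per-observation log-likelihood ratios, optionally m-bit."""
        L = 0.0
        for llr in observation_llrs:
            L += llr
            if bits is not None:
                L = quantize(L, bits)   # the m-bit register limits soft memory
        return L

    llrs = [0.8, -0.3, 0.8, 0.8, -0.3]      # assumed per-datum log-odds increments
    exact = final_log_odds(llrs)             # full (soft) memory
    coarse = final_log_odds(llrs, bits=3)    # 3-bit register, the case the paper highlights
    ```

    Even this crude quantizer preserves the sign of L, and hence the terminal decision, in this example; the paper's numerical results make the stronger claim that a well-designed 3-bit register is practically equivalent to full memory.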
  • Bayesian Autoregressive Time Series Analysis

    Publication Year: 1968, Page(s): 317 - 324
    Cited by: Papers (1)

    Two Bayesian autoregressive time series models for partially observable dynamic processes are presented. In the first model, a general inference procedure is developed for the situation in which k previous values of the time series plus a change error determine the next value. This general model is specialized to an example in which the observational and change errors follow a normal probability law; the results for k = 1 are given and discussed. The second general model adds the facility for simultaneously inferring an unknown and unchanging parameter of the time series. This model is specialized to the same normal example presented earlier, with the precision of the change error as the unknown process parameter.

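
    For the flavor of the k = 1 normal case: inferring the coefficient a in x_t = a·x_{t-1} + e_t with e_t ~ N(0, σ²) and σ² known is a conjugate normal update. This simplified sketch omits the observational error the paper does treat; the prior and the short series below are illustrative assumptions.

    ```python
    def ar1_posterior(xs, sigma2, prior_mean, prior_var):
        """Conjugate normal posterior for the AR(1) coefficient given series xs."""
        sxx = sum(x * x for x in xs[:-1])                  # sum of x_{t-1}^2
        sxy = sum(a * b for a, b in zip(xs[:-1], xs[1:]))  # sum of x_{t-1} * x_t
        post_var = 1.0 / (1.0 / prior_var + sxx / sigma2)  # precisions add
        post_mean = post_var * (prior_mean / prior_var + sxy / sigma2)
        return post_mean, post_var

    xs = [1.0, 0.7, 0.55, 0.35, 0.3]   # assumed short decaying series
    post_mean, post_var = ar1_posterior(xs, sigma2=0.01,
                                        prior_mean=0.0, prior_var=1.0)
    ```

    Each observed transition tightens the posterior (the variance shrinks), which is the sequential-learning behavior the paper's general procedure formalizes.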
  • The Economic Value of Analysis and Computation

    Publication Year: 1968, Page(s): 325 - 332

    This paper shows how the decision analysis approach can be used to determine the most economic method of carrying out computations or analyses. A primary decision problem is first formulated to obtain a structure for the analysis. Then several computational or analytical procedures, which can be used to analyze the primary decision problem in greater detail, are evaluated to select the most economic procedure. The purpose of each of these procedures is to increase the available information about uncertain parameters before making the primary decision, thereby yielding a "better" decision. Each procedure is evaluated by combining the value structure of the primary decision problem with a model of that procedure. The procedures considered in this paper are clairvoyance, complete analysis, Monte Carlo analysis, and numerical analysis. An example of a bidding problem is used to illustrate the results.

  • A Decision Analysis of Model Selection

    Publication Year: 1968, Page(s): 333 - 342
    Cited by: Papers (1)

    This paper describes a conceptual framework within which several alternate model forms for a particular process can be considered simultaneously. The development is in decision theoretic terms with the primary emphasis on the setting of a vector of control variables rather than the selection of a model per se. The argument centers about the role of observed data in altering the state of information about the appropriate model form and its parameters. The basic ideas are illustrated by means of a simple example of modeling a binary source.

  • Decision Analysis for Product Development

    Publication Year: 1968, Page(s): 342 - 354

    The decision analysis described considers four major installation development alternatives upon which management decisions were required. The analysis has a number of phases. The technical-economic system is first modeled deterministically to describe the characteristics of the business. The deterministic model, or expected value business model, is then exercised in a sensitivity phase to determine what parameters are most influential to the outcomes. The analysis uses present worth methods for the time preference for money. Pricing strategies and market feedback capability are included. Adjunct businesses that develop as a result of the original business have been modeled in the deterministic model. The deterministic or nominal value outcomes show a clear progression of improvement over a 20 year period for the more technically advanced installation developments. The cost of development is an influential contributor to deterministic outcomes. The influence on near term outcomes of development costs was sufficient to make the issue of time value of outcomes very significant. The value of identifying the short term and long term outcomes on the basis of the time value of outcomes becomes identifiable as a significant contribution to decision making. The thinking which contributed to choosing the parameters used in the uncertainty analysis proved to be an important means of clarifying the issues surrounding what uncertainty analysis of a given development alternative venture really consists of.

  • An Application of Decision Theory to a Medical Diagnosis-Treatment Problem

    Publication Year: 1968, Page(s): 355 - 362
    Cited by: Papers (3)

    The decision faced by a physician when confronted by a patient with an undetermined disease may be simply stated as: "What course of action, in the form of diagnostic tests and/or treatments, should be taken?" In most cases, this problem can be characterized as a sequential decision under uncertainty. Since this is a class of problems for which decision theory has proved a useful tool, it appears fruitful to attempt to apply it to the physician's problem. In this paper, this possibility is explored by describing the application of decision theoretic techniques to a specific case. We first comment on why we believe the proposed model is more appropriate than other methods of treating the problem. Then the proposed model is briefly described in the abstract. The main body of the paper describes a specific problem and its solution by decision theoretic techniques. In the final section, some of the shortcomings of the particular analysis and some of the problems that might be encountered in a more general setting are pointed out.

  • Contributors

    Publication Year: 1968, Page(s): 363 - 366
    Freely Available from IEEE
  • Information for authors

    Publication Year: 1968, Page(s): 366b
    Freely Available from IEEE