
Fourth IEEE Conference on Cognitive Informatics (ICCI 2005)

Date: 8-10 Aug. 2005


Displaying results 1-25 of 45
  • Cognitive computation: the Ersatz Brain project

    Publication Year: 2005 , Page(s): 2 - 3

    The following discusses the Ersatz Brain project, a brain-like computing system at Brown University. It covers the proposed hardware and hardware parameters, the software application strategy and potential software applications suited to the architecture, the project's technological vision, and its expected results.

  • Psychological experiments on the cognitive complexities of fundamental control structures of software systems

    Publication Year: 2005 , Page(s): 4 - 5

    The measurement of cognitive complexity and functional size of software systems is an age-old problem in software engineering. Although the symbolic complexity of software may be measured in lines of code, the functional complexity of software is too abstract to be measured or even estimated. Because numerous attributes of software systems depend heavily on the understanding and measurability of software functional complexity, it has to be formally treated and empirically studied using cognitive informatics and theoretical software engineering methodologies. This talk shows that the cognitive functional size (CFS) of software is a product of its architectural and operational complexities, based on studies in cognitive informatics and abstract system theory. The fundamental basic control structures (BCSs) are elicited from software architectural and behavioral specifications and descriptions. The cognitive weights of these BCSs are derived and calibrated via a series of psychological experiments. On this basis, the CFS of a software system may be rigorously measured and analyzed in the unit of function-object (FO).
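
    As an illustration of the weighted-BCS idea behind CFS, the sketch below computes a CFS-style measure. The weight values and the (inputs + outputs) × weight form are assumptions for illustration only, not the calibrated weights or exact formula from the talk.

```python
# CFS-style sketch: total cognitive weight of the basic control
# structures (BCSs) times the number of inputs/outputs. The weights
# below and the exact formula are illustrative assumptions, not the
# calibrated values from the psychological experiments.
BCS_WEIGHTS = {
    "sequence": 1,    # straight-line statements
    "branch": 3,      # if-then-else
    "iteration": 7,   # loops
    "recursion": 11,  # recursive calls
}

def cognitive_weight(bcs_counts):
    """Sum of weighted BCS occurrences for one component."""
    return sum(BCS_WEIGHTS[b] * n for b, n in bcs_counts.items())

def cfs(n_inputs, n_outputs, bcs_counts):
    """CFS in function-object (FO) units: (I + O) x total weight."""
    return (n_inputs + n_outputs) * cognitive_weight(bcs_counts)

# A component with 2 inputs, 1 output, 3 sequences, 1 branch, 1 loop:
print(cfs(2, 1, {"sequence": 3, "branch": 1, "iteration": 1}))  # 39
```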

  • Some advances in cognitive informatics

    Publication Year: 2005 , Page(s): 6 - 7

    This paper summarizes a few recent developments in cognitive informatics, with special emphasis on signal processing for autonomic computing and its metrics.

  • Fixpoint semantics for rule-base anomalies

    Publication Year: 2005 , Page(s): 10 - 17
    Cited by:  Papers (3)

    A crucial component of an intelligent system is its knowledge base, which contains knowledge about a problem domain. Knowledge base development involves domain analysis, context space definition, ontological specification, and knowledge acquisition, codification, and verification. Knowledge base anomalies can affect the correctness and performance of an intelligent system. In this paper, we adopt a fixpoint semantics based on a multi-valued logic for a knowledge base. We then use the fixpoint semantics to provide formal definitions for four types of knowledge base anomalies: inconsistency, redundancy, incompleteness, and circularity. We believe such formal definitions of knowledge base anomalies help pave the way for a more effective knowledge base verification process.
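
    To make the fixpoint idea concrete, here is a minimal sketch for a propositional rule base: a two-valued immediate-consequence iteration plus a simple circularity check. The paper's semantics is multi-valued, so this only illustrates the general mechanism, not the paper's construction.

```python
# Minimal fixpoint (T_P) sketch for a propositional rule base: facts are
# atoms, rules are (body, head) pairs. Two-valued here; the paper uses a
# multi-valued logic.
def least_fixpoint(facts, rules):
    """Iterate the immediate-consequence operator until nothing changes."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if set(body) <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

def circular_rules(rules):
    """Detect circularity: atoms that transitively depend on themselves."""
    deps = {}
    for body, head in rules:
        deps.setdefault(head, set()).update(body)
    def reaches(a, b, seen=()):
        if a in seen:
            return False
        return b in deps.get(a, set()) or any(
            reaches(d, b, seen + (a,)) for d in deps.get(a, set()))
    return {h for h in deps if reaches(h, h)}

rules = [(["a"], "b"), (["b"], "c"), (["c"], "b")]
print(least_fixpoint({"a"}, rules))   # all of a, b, c are derivable
print(circular_rules(rules))          # b and c depend on themselves
```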

  • The cognitive processes of abstraction and formal inferences

    Publication Year: 2005 , Page(s): 18 - 26
    Cited by:  Papers (3)

    Theoretical research is predominantly an inductive process, while applied research is mainly a deductive process. Both inference processes are based on the cognitive process and means of abstraction. This paper describes the cognitive processes of abstraction and of formal inferences such as deduction, induction, abduction, and analogy. The hierarchy of abstraction and the descriptivity of abstract means at different levels are analyzed. A set of mathematical models of formal inference methodologies is developed. Formal descriptions of the five cognitive processes of abstraction and inference are presented using real-time process algebra (RTPA). Applications of abstraction and formal inference to complicated problems in large-scale software system development are discussed.

  • A brain-like computer for cognitive software applications: the Ersatz Brain project

    Publication Year: 2005 , Page(s): 27 - 36
    Cited by:  Papers (2)

    We want to design a computer suited to the efficient execution of software now being developed that displays human-like cognitive abilities. Examples of these potential software applications include natural language understanding, text processing, conceptually based Internet search, natural human-computer interfaces, cognitively based data mining, sensor fusion, and image understanding. The requirements of the proposed software are primary in shaping our hardware design. The hardware architecture is based on a few ideas taken from the anatomy of the mammalian neocortex. In common with other such attempts, it is a massively parallel, two-dimensional array of CPUs and their associated memory. However, the design used in this project (1) uses an approximation to cortical computation called the network of networks, which holds that the basic computing unit in the cortex is not a single neuron but small groups of neurons working together in attractor networks, and (2) assumes that connections in the cortex are very sparse. The resulting architecture depends largely on local data movement.
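
    The attractor-network building block mentioned above can be sketched with a toy Hopfield-style network: a stored pattern becomes a fixed point, and a noisy input settles back onto it. This is a generic textbook sketch, not the project's actual architecture.

```python
# Toy attractor network (Hopfield-style): small groups of units settle
# into stored patterns. Generic illustration of the "network of
# networks" computing unit, not the Ersatz Brain design itself.
def train(patterns):
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, state, steps=5):
    s = list(state)
    for _ in range(steps):
        for i in range(len(s)):
            h = sum(W[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

stored = [1, -1, 1, -1, 1, -1]
W = train([stored])
noisy = [1, 1, 1, -1, 1, -1]   # one unit flipped
print(recall(W, noisy))        # recovers the stored pattern
```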

  • Cognitive modelling using a logic-based algebra

    Publication Year: 2005 , Page(s): 37 - 42

    We give an approach to cognitive modelling that allows for richer expression than one based simply on the firing of sets of neurons. The object language of the approach is first-order logic augmented by the operations of an algebra, PSEN. Some operations useful for this kind of modelling are postulated: combination, comparison, and inhibition of sets of sentences. It is shown how these operations can be realised using PSEN.

  • Case-based introspective learning

    Publication Year: 2005 , Page(s): 43 - 48

    Introspective learning, as a method for improving learning efficiency, has become an active area of research. In this paper, introspective learning and a general introspective learning mode are discussed. Some related problems, such as meta-level reasoning, the taxonomy of failures, and the relation between case-based reasoning and introspective learning, are presented. Given the importance of case-based reasoning in introspective learning, a case representation and a case retrieval mechanism appropriate to introspective learning are described in detail.

  • Topological-based classification using artificial gene networks

    Publication Year: 2005 , Page(s): 49 - 56
    Cited by:  Papers (1)

    The topological properties of artificial gene networks, which rely on assessing similarities between the expression profiles of gene pairs, are expected to be consistent with specific cellular states; moreover, some topological properties of the network vary significantly across cellular states. In this study, we propose a novel and highly efficient computational framework for topological-based classification using microarray gene expression data. Given the high prediction accuracy of this approach, we believe topological-based classification offers a noticeable advantage. We used microarray gene expression data sets containing 14 classes of cancer to construct 14 basic artificial gene networks. For each test sample, we add the sample to the data set of each class and reconstruct all of the networks. The cancer type is then classified according to the correlation of topological quantities between the basic artificial gene networks and the reconstructed networks. Total classification accuracy reaches 78.26% on the test data set (95.83% on the standard data set). After screening the quality of the standard data set, we achieve a prediction accuracy of 86.48% on the test data set, higher than in previous studies.

  • A unified approach to fractal dimensions

    Publication Year: 2005 , Page(s): 58 - 72
    Cited by:  Papers (12)

    Many scientific papers treat the diversity of fractal dimensions as mere variations on either the same theme or a single definition. There is a need for a unified approach to fractal dimensions, for there are fundamental differences between their definitions. This paper presents a new description of three essential classes of fractal dimensions based on: (i) morphology, (ii) entropy, and (iii) transforms, all unified through the generalized entropy-based Renyi fractal dimension spectrum. It discusses practical algorithms for computing 15 different fractal dimensions representing the classes. Although the individual dimensions have already been described in the literature, the unified approach presented in this paper is unique in terms of (i) its progressive development of the fractal dimension concept, (ii) similarity in the definitions and expressions, (iii) analysis of the relations between the dimensions, and (iv) their taxonomy. As a result, a number of new observations have been made and new applications discovered. Of particular interest are behavioural processes such as dishabituation; irreversible and birth-death growth phenomena, e.g., diffusion-limited aggregates (DLAs), dielectric discharges, and cellular automata; dynamical nonstationary transient processes such as speech and transients in radio transmitters; multifractal optimization of image compression using learning vector quantization with Kohonen's self-organizing feature maps (SOFMs); and multifractal-based signal denoising.
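
    The unifying quantity referred to is the generalized Renyi dimension spectrum; in its standard form (the notation here may differ from the paper's),

    D_q = \lim_{\varepsilon \to 0} \frac{1}{q-1} \, \frac{\log \sum_i p_i^q}{\log \varepsilon},

    where p_i is the measure in the i-th box of size \varepsilon. The classes above fall out as special cases: q = 0 gives the box-counting (morphological) dimension, q \to 1 the information (entropy) dimension, and q = 2 the correlation dimension.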

  • Theoretical study on a new information entropy and its use in attribute reduction

    Publication Year: 2005 , Page(s): 73 - 79
    Cited by:  Papers (1)

    The positive region in the rough set framework and Shannon conditional entropy are two traditional uncertainty measurements, usually used as heuristic metrics in attribute reduction. In this paper, a new information entropy is first systematically compared with Shannon entropy, showing its competence as another uncertainty measurement. Then, given a decision system, we theoretically analyze how these three metrics vary under two opposite circumstances: when condition (decision) granularities merge while decision (condition) granularities remain unchanged. The conditions that keep these measurements unchanged in the above situations are also identified. These results give a new information view of attribute reduction and a clearer understanding of the quantitative relations among the different views defined by the three uncertainty measurements. It is shown that the requirement for reducing a condition attribute in the new information view is more rigorous than in the other two views, and that the three views are equivalent in a consistent decision system.
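
    One of the two traditional metrics named above, Shannon conditional entropy H(D|C) over a decision table, can be sketched directly (the paper's new entropy is not specified in the abstract, so only the baseline is shown):

```python
# Shannon conditional entropy H(D|C) over a decision table: group rows
# by their condition-attribute values, then average the decision entropy
# within each granule. Baseline metric only; not the paper's new entropy.
from math import log2
from collections import Counter

def conditional_entropy(rows, cond_idx, dec_idx):
    """rows: tuples; cond_idx: condition attribute positions."""
    n = len(rows)
    groups = {}
    for r in rows:
        key = tuple(r[i] for i in cond_idx)
        groups.setdefault(key, []).append(r[dec_idx])
    h = 0.0
    for decisions in groups.values():
        p_c = len(decisions) / n
        for count in Counter(decisions).values():
            p = count / len(decisions)
            h -= p_c * p * log2(p)
    return h

table = [(0, "yes"), (0, "no"), (1, "yes"), (1, "yes")]
print(conditional_entropy(table, [0], 1))  # 0.5: the first granule is impure
```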

  • A new model of generating chaos

    Publication Year: 2005 , Page(s): 80 - 85

    To address the difficulty of controlling the randomness of chaotic series generated by neural networks (NNs), this paper presents a new model for generating chaos based on an adaptive neuro-fuzzy inference system (ANFIS) whose internal parameters are adjusted by a delaminating-adaptation genetic algorithm. Simulations show that the new model can generate chaotic series with excellent randomness, and that this randomness can be intelligently controlled by the delaminating-adaptation genetic algorithm. The new model is suitable for spread-spectrum communications.
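
    The ANFIS-based generator itself cannot be reproduced from the abstract; as a point of reference, the classical logistic map below is the standard minimal chaotic-series generator (an assumption for illustration, not the paper's model):

```python
# Baseline chaotic sequence from the logistic map x_{n+1} = r*x*(1 - x).
# At r = 4 the map is fully chaotic on [0, 1]; nearby seeds diverge
# quickly. Illustrative stand-in for the paper's ANFIS-based generator.
def logistic_series(x0, r=4.0, n=10):
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

series = logistic_series(0.2, r=4.0, n=8)
print(series)
```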

  • Robust independent component analysis for cognitive informatics

    Publication Year: 2005 , Page(s): 86 - 92

    This paper evaluates the outlier sensitivity of five independent component analysis (ICA) algorithms (FastICA, extended Infomax, JADE, RADICAL, and β-divergence) using: (i) the Amari separation performance index, (ii) the optimum angle of rotation error, and (iii) the contrast function difference, in an outlier-contaminated mixture simulation. The Amari separation performance index reveals a strong sensitivity to outliers of JADE and of FastICA with 3rd- and 4th-order nonlinearities. However, the two contrast measures demonstrate conclusively that β-divergence is the least outlier-sensitive algorithm, followed by RADICAL, FastICA (exponential and hyperbolic-tangent nonlinearities), extended Infomax, JADE, and FastICA (3rd- and 4th-order nonlinearities) in an outlier-contaminated mixture of two uniformly distributed signals. The novelty of this paper is the development of an unbiased optimization-landscape environment for assessing outlier sensitivity, as well as the optimum angle of rotation error and the contrast function difference as promising new measures of the outlier sensitivity of ICA algorithms.
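
    The first of the three measures, the Amari separation performance index, has a standard closed form: it scores how far the combined matrix P = W·A (unmixing times mixing) is from a scaled permutation, with 0 meaning perfect separation. The normalization used below maps the worst case to 1; the paper may normalize differently.

```python
# Amari separation performance index for a square matrix P = W @ A.
# 0 means P is a scaled permutation (perfect separation); with the
# normalization chosen here the worst case scores 1.
def amari_index(P):
    n = len(P)
    A = [[abs(v) for v in row] for row in P]
    row_term = sum(sum(row) / max(row) - 1 for row in A)
    cols = [[A[i][j] for i in range(n)] for j in range(n)]
    col_term = sum(sum(c) / max(c) - 1 for c in cols)
    return (row_term + col_term) / (2 * n * (n - 1))

perfect = [[0, 2], [1, 0]]   # a scaled permutation matrix
print(amari_index(perfect))  # 0.0
mixed = [[1, 1], [1, 1]]     # no separation at all
print(amari_index(mixed))    # 1.0
```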

  • A novel fuzzy neural network: the vague neural network

    Publication Year: 2005 , Page(s): 94 - 99

    A fuzzy neural network, which combines an artificial neural network with fuzzy logic, is regarded as one of the most promising intelligent systems. Fuzzy neural networks inherit a problem from fuzzy theory: the fuzzy membership function is a single value that combines the evidence for and against a pattern without indicating how much there is of each, so it cannot produce the most reasonable classification and recognition results. Vague sets, characterized by a truth-membership function and a false-membership function, can solve this problem. Based on vague set theory, this paper puts forward a new kind of FNN, the vague neural network (VNN), and discusses its properties. The VNN is then applied to the problem of fault diagnosis and shows good performance.

  • Analysing the role of perceived self-efficacy in information processing for Web-based information systems

    Publication Year: 2005 , Page(s): 100 - 109

    From the perspective of individuals' beliefs about their cognitive abilities to process information, understanding the use of Web-based information systems is a big challenge. Ensuring competent processing of information, on the basis of users' mental processes and states, then helps guarantee successful use of Web-based information systems. This issue becomes critical, however, when users interact with familiar and unfamiliar information systems with varying perceptions of their abilities to use them; these perceptions are users' perceptions of self-efficacy. This paper addresses the issue by proposing an information processing model based on user perceived self-efficacy. The proposed theoretical model is tested and validated in a three-phase empirical study, yielding useful insights into the varying perceptions of self-efficacy that account for information processing when using familiar and unfamiliar systems.

  • Study of the protein-protein interaction networks via random graph approach

    Publication Year: 2005 , Page(s): 110 - 119
    Cited by:  Papers (2)

    We employ the random graph theory approach to analyze the protein-protein interaction database DIP for seven species. Several global topological parameters are used to characterize the protein-protein interaction networks (PINs) of each species. We find that the seven PINs are well approximated by scale-free networks and by hierarchical models, with the possible exception of the fruit fly. In particular, we determine that the E. coli and yeast PINs are well represented by the stochastic and deterministic hierarchical network models, respectively. These results suggest that the hierarchical network model is a better description of certain species' PINs and may not be a universal feature across different species. Furthermore, we demonstrate that PINs are quite robust under random perturbation in which up to 50% of the nodes are rewired or removed, or 50% of the edges are removed. An average node degree correlation study supports the finding that nodes of low connectivity are correlated, whereas nodes of high connectivity are not directly linked.
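
    The robustness test described can be sketched on a toy graph: remove a fraction of nodes at random and measure the size of the largest connected component (the graph and fractions here are illustrative, not the DIP data):

```python
# Robustness sketch: delete 50% of nodes from a toy graph at random and
# measure the giant-component size among the survivors.
import random

def giant_component(nodes, edges):
    """Size of the largest connected component over the given nodes."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:   # keep only edges between survivors
            adj[a].add(b)
            adj[b].add(a)
    best, seen = 0, set()
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            v = stack.pop()
            if v not in comp:
                comp.add(v)
                stack.extend(adj[v] - comp)
        seen |= comp
        best = max(best, len(comp))
    return best

random.seed(1)
nodes = list(range(20))
edges = [(i, (i + 1) % 20) for i in range(20)]  # a 20-node ring
survivors = random.sample(nodes, 10)            # remove 50% of nodes
print(giant_component(survivors, edges))
```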

  • A model of attention-guided visual sparse coding

    Publication Year: 2005 , Page(s): 120 - 125
    Cited by:  Papers (2)

    Sparse coding theory demonstrates that neurons in the primary visual cortex form a statistically sparse representation of natural scenes, but a typical scene contains many different patterns (corresponding to cortical neurons) that compete for neural representation because of the visual system's limited processing capacity. We propose an attention-guided sparse coding model with two modules: a nonuniform sampling module simulating the retina, and a data-driven attention module based on response saliency. Our experimental results show that the model notably decreases the number of coefficients that may be activated while retaining the main visual information.

  • How does the memory work? By timed-arc Petri nets

    Publication Year: 2005 , Page(s): 128 - 135

    In this paper we use the graphical formalism of timed-arc Petri nets to specify cognitive systems. In addition to the usual characteristics of Petri nets, this timed extension can capture the enabling or disabling of a transition as a consequence of elapsed time. This feature is useful for representing systems in which delays can strongly modify the state, such as the memorization process. To illustrate the suitability of the model, we formally represent a cognitive model of memory. The model exhibits a very natural conception of concurrency and shows how elapsing time can cross the threshold that determines whether a perception enters long-term memory. The memory structure and working assumed here are discussed further in Squire et al. (1993), Solso (1999), and Wang and Wang (2002).
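
    The timing mechanism can be sketched in a few lines: tokens carry an age, and a transition is enabled only while a token's age lies inside the arc's time interval. The interval values below are illustrative, not taken from the paper's memory model.

```python
# Toy timed-arc Petri net step: tokens carry an age; a transition fires
# only while some token age lies within the arc's interval. Models a
# time window (e.g., for a percept to reach long-term memory).
def enabled(token_ages, interval):
    lo, hi = interval
    return any(lo <= age <= hi for age in token_ages)

def elapse(token_ages, dt):
    return [age + dt for age in token_ages]

ages = [0.0]                  # a fresh perception token
window = (2.0, 5.0)           # consolidation window (illustrative)
ages = elapse(ages, 3.0)
print(enabled(ages, window))  # True: within the window
ages = elapse(ages, 4.0)
print(enabled(ages, window))  # False: the window has passed
```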

  • Automatic high-dimensional association rule generation for large relational data sets

    Publication Year: 2005 , Page(s): 136 - 143

    Data mining extracts knowledge from large amounts of data. It has been used in applications ranging from business and marketing to bioinformatics and genomics. Many currently available data mining algorithms, however, generate relatively simple rules that include only a small number of attributes. Moreover, these algorithms need to build decision trees, which takes a significant amount of time owing to the large number of attributes and the lack of field knowledge. In this paper, we therefore propose a method that automatically generates high-dimensional association rules in large data sets with high accuracy and broad coverage.
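
    For context, association rule generation with support and confidence thresholds can be sketched Apriori-style on a toy transaction set; this is the standard baseline, far simpler than the paper's method, and the thresholds are illustrative.

```python
# Minimal support/confidence association-rule sketch over itemsets of
# size 2 and 3 (Apriori-style baseline, not the paper's algorithm).
from itertools import combinations

def rules(transactions, min_support=0.5, min_conf=0.6):
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    out = []
    for size in (2, 3):
        for subset in combinations(items, size):
            support = sum(set(subset) <= t for t in transactions) / n
            if support < min_support:
                continue
            for k in range(1, size):
                for lhs in combinations(subset, k):
                    lhs_sup = sum(set(lhs) <= t for t in transactions) / n
                    if support / lhs_sup >= min_conf:
                        rhs = tuple(i for i in subset if i not in lhs)
                        out.append((lhs, rhs, support / lhs_sup))
    return out

data = [{"a", "b"}, {"a", "b", "c"}, {"a", "c"}, {"b", "c"}]
for lhs, rhs, conf in rules(data):
    print(lhs, "->", rhs, round(conf, 2))
```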

  • Map element extraction model for pedestrian route guidance map

    Publication Year: 2005 , Page(s): 144 - 153
    Cited by:  Papers (1)

    A model that can generate compact and intelligible route guidance maps is essential for distributing efficient pedestrian route guidance information through mobile terminals. In this research, we focus on the fact that existing route guidance maps have been prepared by considering the characteristics of people's spatial cognition. By analyzing which roads and buildings are represented in existing maps, i.e., which map elements are important and necessary for pedestrian route guidance, we construct a model that can extract key map elements from a geographical database. The route guidance maps generated by the proposed model are shown and evaluated in comparison with existing maps.

  • Dimension reduction of microarray data based on local tangent space alignment

    Publication Year: 2005 , Page(s): 154 - 159
    Cited by:  Papers (2)

    We introduce a new nonlinear dimension reduction method, local tangent space alignment (LTSA), to deal with the difficulty of analyzing high-dimensional, nonlinear microarray data. We first analyze the applicability of the method and propose a reconstruction error for LTSA. The method is tested on the Iris data set and on acute leukemia microarray data. The results show good visualization performance, and LTSA outperforms PCA in determining the reduced dimension, with only subtle change in clustering correctness after dimension reduction. Nonlinear dimension reduction techniques thus have a promising perspective in microarray data analysis.

  • Cognitive information fusion of georeferenced data for tactical applications

    Publication Year: 2005 , Page(s): 160 - 166

    Georeferenced data from multiple autonomous sources for defined areas of interest (AOIs) can be fused and analyzed in support of various decision-making processes such as risk assessment, emergency response, situation awareness, and tactical planning. However, data from multiple heterogeneous sources may differ in format, scale, quality, and coverage. These characteristics of multi-source spatial data limit the use of traditional statistical methods for information fusion. In this paper, cognitive beliefs are proposed to represent georeferenced data from different sources, so that uncertainty caused by inaccurate or partial information can be modeled. By applying the belief combination rule, cognitive beliefs can be fused to provide better support for decision makers.
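
    The abstract does not name its belief combination rule; assuming Dempster's classical rule of combination, fusing two basic belief assignments looks like this (the sensors, hypotheses, and mass values are invented for illustration):

```python
# Dempster's rule of combination for two basic belief assignments.
# Focal elements are frozensets over the frame of discernment; mass on
# intersecting focal elements is kept, conflicting mass is renormalized
# away. The rule choice itself is an assumption.
def combine(m1, m2):
    raw, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                raw[inter] = raw.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    return {k: v / (1 - conflict) for k, v in raw.items()}

# Two sources reporting on whether an area is {risk} or {safe}:
m1 = {frozenset({"risk"}): 0.7, frozenset({"risk", "safe"}): 0.3}
m2 = {frozenset({"risk"}): 0.6, frozenset({"risk", "safe"}): 0.4}
fused = combine(m1, m2)
print(fused[frozenset({"risk"})])  # belief in "risk" strengthens to 0.88
```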

  • Towards a unified theory of spoken language processing

    Publication Year: 2005 , Page(s): 167 - 172
    Cited by:  Papers (2)

    Spoken language processing is arguably the most sophisticated behavior of the most complex organism in the known universe and, unsurprisingly, scientists still have much to learn about how it works. Meanwhile, automated spoken language processing systems have begun to emerge in commercial applications, not as a result of any deep insight into the way humans process language, but largely as a consequence of a 'data-driven' approach to building practical systems. At the same time, computational models of human spoken language processing have begun to emerge; although this has raised interest in the relationship between human and machine behavior, the performance of the best models appears to be asymptoting some way short of the capabilities of the human listener/speaker. This paper discusses these issues and argues for the derivation of a 'unifying theory' capable of explaining and predicting both human and machine spoken language processing behavior, which would serve both communities and represent a long-term 'grand challenge' for the scientific community in the emerging field of 'cognitive informatics'.

  • On cognitive properties of human factors in engineering

    Publication Year: 2005 , Page(s): 174 - 182
    Cited by:  Papers (1)

    Human traits and needs are the fundamental force underlying almost all phenomena in human task performance, engineering organizations, and societies. This paper explores the cognitive foundations of human traits and the cognitive properties of human factors in engineering. The fundamental traits of human beings are identified, and the hierarchical model of basic human needs is formally described. The characteristics of human factors and their influences in engineering organizations and socialization are explored. Based on the models of basic human traits, needs, and their influences, the driving forces behind human factors in engineering and society are revealed. A formal model of human errors in task performance is derived, and case studies of the error model in software engineering are presented.

  • How to disregard irrelevant stimulus dimensions: evidence from comparative visual search

    Publication Year: 2005 , Page(s): 183 - 192

    To what extent is it possible to disregard stimulus dimensions that are irrelevant to a certain task? This question was tackled in three experiments using the paradigm of comparative visual search. Reaction times and eye-movement data were recorded in order to study the cognitive processes in this series of tasks. For the data analysis, task-specific variables were defined and their values computed across subjects and tasks. The results show that on the basis of top-down processes only, it is easier to ignore shape information, and that disregarding color information requires additional bottom-up processes.