2008 International Conference on Machine Learning and Cybernetics

Date: 12-15 July 2008

Displaying Results 1-25 of 91
  • A labeling scheme based on Markov Random Fields and Gaussian mixture models for hyperspectral images

    Publication Year: 2008, Page(s): 3619-3624
    Cited by:  Papers (1)

    A new method for surface feature labeling of hyperspectral images is presented within the framework of Bayesian labeling based on Markov random fields (MRFs). After the dimensionality of the hyperspectral image is reduced by PCA, a kernel density estimator and a Gaussian mixture model (GMM) are used to capture the non-Gaussian statistics of the dimension-reduced images and of their difference images, respectively. Furthermore, one component of the GMM is chosen to describe the energy of the difference images in order to improve classification accuracy. An MRF maximum a posteriori estimation problem is formulated, and the final labels are obtained by simulated annealing. Additionally, the GMM-based labeling result is compared with a generalized Laplacian (GL) model. Experimental results show that the algorithm is efficient and robust for surface feature labeling.

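    The optimization step described above (GMM data terms combined with an MRF prior, minimized by simulated annealing) can be sketched compactly. The following is a minimal illustration, not the authors' implementation; the Potts smoothness weight beta, the cooling schedule, and the toy inputs are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def neighbors(i, j, H, W):
    # 4-connected neighborhood of pixel (i, j)
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < H and 0 <= nj < W:
            yield ni, nj

def sa_label(neg_loglik, beta=1.0, T0=3.0, cooling=0.95, sweeps=30):
    """MAP labeling under a Potts MRF prior via simulated annealing.

    neg_loglik: (H, W, K) per-pixel, per-class negative log-likelihoods
    (in the paper these would come from a GMM fitted to PCA-reduced bands).
    """
    H, W, K = neg_loglik.shape
    labels = neg_loglik.argmin(axis=2)          # maximum-likelihood start
    T = T0
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                def energy(k):
                    # data term + beta * number of disagreeing neighbors
                    pen = sum(k != labels[ni, nj]
                              for ni, nj in neighbors(i, j, H, W))
                    return neg_loglik[i, j, k] + beta * pen
                k_new = rng.integers(K)
                dE = energy(k_new) - energy(labels[i, j])
                if dE < 0 or rng.random() < np.exp(-dE / T):
                    labels[i, j] = k_new        # accept the proposed label
        T *= cooling                             # anneal the temperature
    return labels

# Toy run: two classes on a 20x20 grid with random data terms
print(sa_label(rng.random((20, 20, 2))).shape)
```
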
  • Construction of bivariate nonseparable compactly supported biorthogonal wavelets

    Publication Year: 2008, Page(s): 3625-3629
    Cited by:  Papers (1)

    Bivariate wavelets have become a powerful tool in image processing. We present a method for constructing bivariate nonseparable compactly supported biorthogonal filters. Based on the theory of bivariate multiresolution analysis, these filters generate bivariate nonseparable compactly supported biorthogonal scaling functions and the related wavelets. Finally, we give two simple numerical examples of bivariate nonseparable compactly supported biorthogonal filters satisfying the perfect reconstruction condition.

  • Neural data mining for credit card fraud detection

    Publication Year: 2008, Page(s): 3630-3634
    Cited by:  Papers (4)

    Due to rapid advances in electronic commerce technology, the use of credit cards has increased dramatically. As the credit card becomes the most popular mode of payment, credit card fraud has grown increasingly rampant in recent years. In this paper, we model the sequence of operations in credit card transaction processing using a confidence-based neural network. Receiver operating characteristic (ROC) analysis is also introduced to ensure the accuracy and effectiveness of fraud detection. The neural network is initially trained with synthetic data; if an incoming credit card transaction is not accepted by the trained neural network model (NNM) with sufficient confidence, it is considered fraudulent. The paper shows how the confidence value, the neural network algorithm, and ROC analysis can be combined to perform credit card fraud detection.

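    As a rough illustration of pairing a confidence score with ROC analysis, the sketch below computes an ROC curve and its AUC over synthetic fraud scores. The score distributions and thresholding scheme are assumptions for demonstration, not the paper's data or network.

```python
import numpy as np

def roc_curve(scores, labels):
    """TPR/FPR swept over all score thresholds.
    labels: 1 = fraud, 0 = genuine; scores: model confidence of fraud."""
    order = np.argsort(-scores)
    labels = labels[order]
    tps = np.cumsum(labels)           # true positives at each cutoff
    fps = np.cumsum(1 - labels)       # false positives at each cutoff
    tpr = tps / max(labels.sum(), 1)
    fpr = fps / max((1 - labels).sum(), 1)
    return fpr, tpr

rng = np.random.default_rng(1)
# Synthetic scores: fraudulent transactions tend to score higher
genuine = rng.normal(0.3, 0.10, 500)
fraud = rng.normal(0.7, 0.15, 50)
scores = np.concatenate([genuine, fraud])
labels = np.concatenate([np.zeros(500), np.ones(50)]).astype(int)

fpr, tpr = roc_curve(scores, labels)
auc = np.trapz(tpr, fpr)              # area under the ROC curve
print(f"AUC = {auc:.3f}")
```
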
  • Propagation modeling and analysis of viruses in P2P networks

    Publication Year: 2008, Page(s): 3635-3640

    To counter virus attacks in P2P file-sharing networks, a model of virus propagation in P2P networks is proposed, based on a detailed analysis of the characteristics of file sharing and virus propagation. To examine the effects of different parameters in this model, large-scale simulation experiments are carried out. Numerical analysis of the impact of P2P-related factors, such as the number of initially infected peers, the downloading rate, and the recovery rate, shows that the attack performance of viruses is very sensitive to P2P system parameters.

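    The abstract gives no equations, so the sketch below uses a standard SIR-style compartment model as a plausible stand-in, with the downloading rate acting as the infection rate; all parameter values and the sensitivity experiment over initially infected peers are assumptions.

```python
import numpy as np

def simulate(N=10000, I0=10, download_rate=0.3, recovery_rate=0.05,
             steps=300, dt=1.0):
    """SIR-style sketch of virus spread in a P2P file-sharing network.
    download_rate plays the role of the per-contact infection rate."""
    S, I, R = N - I0, I0, 0
    history = []
    for _ in range(steps):
        new_inf = download_rate * S * I / N * dt   # downloads of infected files
        new_rec = recovery_rate * I * dt           # disinfected peers
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        history.append(I)
    return np.array(history)

# Sensitivity to the number of initially infected peers
for i0 in (1, 10, 100):
    peak = simulate(I0=i0).max()
    print(f"I0={i0:4d}  peak infected ~ {peak:.0f}")
```
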
  • The Beat-wave signal regression based on least squares reproducing kernel support vector machine

    Publication Year: 2008, Page(s): 3641-3645

    The kernel function of a support vector machine (SVM) is an important factor in the performance of the SVM. Based on the conditions for support vector kernel functions and reproducing kernel (RK) theory, a least squares RK support vector machine (LS-RKSVM) with a reproducing kernel on the Sobolev Hilbert space H1(R; a, b) is proposed for regressing beat-wave signals. The choice of RK is important in SVM techniques; the RK function enhances the generalization ability of the least squares support vector machine (LS-SVM) method. Simulation results illustrate the feasibility of the proposed method, and the model gives better experimental results.

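    LS-SVM training reduces to solving a single linear system, which is easy to show concretely. The sketch below uses a Gaussian RBF kernel as a stand-in for the paper's Sobolev-space reproducing kernel (an assumption) and regresses a synthetic beat-wave-like signal.

```python
import numpy as np

def lssvm_fit(X, y, gamma=100.0, sigma=0.5):
    """Least squares SVM regression: solve the KKT linear system
        [ 0   1^T         ] [b]     [0]
        [ 1   K + I/gamma ] [a]  =  [y]
    An RBF kernel stands in here for the paper's reproducing kernel."""
    n = len(X)
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]

    def predict(Xq):
        Kq = np.exp(-((Xq[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
        return Kq @ alpha + b
    return predict

# Beat-wave-like signal: sum of two close frequencies plus noise
t = np.linspace(0, 1, 120)
y = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 11 * t)
f = lssvm_fit(t, y + 0.05 * np.random.default_rng(2).normal(size=t.size))
print(f"RMSE = {np.sqrt(np.mean((f(t) - y) ** 2)):.4f}")
```
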
  • A security model to protect sensitive information flows based on trusted computing technologies

    Publication Year: 2008, Page(s): 3646-3650
    Cited by:  Papers (2)

    Sensitive information leakage caused by insiders has long been troublesome. Based on trusted computing technologies, the paper designs a secure information flow control model (SIFCM) to be applied in sensitive organizations. SIFCM protects sensitive information by encrypting any sensitive information flowing out of the system, guaranteeing its confidentiality against decryption outside the system boundary. The model relies on the hardware protection capability of trusted computing to prevent the keys from being obtained by unauthorized persons.

  • An adaptive epidemic broadcast mechanism for mobile ad hoc networks

    Publication Year: 2008, Page(s): 3651-3656

    A major problem with on-demand routing protocols for MANETs is the high-cost flooding associated with the route discovery process. Among the many optimizations of pure flooding, the simple epidemic algorithm, which emulates the spread of an infection in a crowded population, is one of the promising approaches. Based on the observation that a phase transition phenomenon occurs under relatively realistic ad hoc conditions with modest node mobility, our study shows that adapting the packet retransmission probability to local topology information can greatly enhance the overall performance of on-demand routing protocols for MANETs. The simulation results also show that the adaptive gossip-based flooding offers reasonable scalability in a relatively large-scale network.

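    The core idea, adapting the retransmission probability to local topology, can be illustrated with a small simulation on a random geometric graph. The specific adaptation rule (probability inversely proportional to node degree) and all parameters are assumptions, not the paper's protocol.

```python
import numpy as np

rng = np.random.default_rng(3)

def gossip_coverage(n=400, radius=0.08, base_p=0.9, n_ref=8):
    """Epidemic broadcast on a random geometric graph. Each node holding
    the packet rebroadcasts with a probability adapted to its degree:
    fewer neighbors -> higher probability (assumed adaptation rule)."""
    pts = rng.random((n, 2))
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    adj = (d < radius) & ~np.eye(n, dtype=bool)
    received = np.zeros(n, dtype=bool)
    received[0] = True
    frontier = [0]
    while frontier:
        nxt = []
        for u in frontier:
            deg = adj[u].sum()
            p = min(1.0, base_p * n_ref / max(deg, 1))  # adaptive gossip prob.
            if rng.random() < p:                         # rebroadcast?
                for v in np.flatnonzero(adj[u] & ~received):
                    received[v] = True
                    nxt.append(v)
        frontier = nxt
    return received.mean()

print(f"coverage ~ {gossip_coverage():.2%}")
```
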
  • An enhanced scheme of enforcing DTE security policy based on trusted computing technology

    Publication Year: 2008, Page(s): 3657-3662

    As a classical security policy, DTE (domain and type enforcement) is widely used to protect the integrity of information and is implemented in many well-known secure operating systems. However, most systems that implement the DTE security policy face three main problems: 1) the policy-enforcement module can be tampered with or bypassed before it is loaded; 2) the contents of the security policy file can easily be disclosed or modified; 3) the system is prone to "changed-name" attacks. Trusted computing provides novel ideas and methods for solving such information security problems. The paper presents an enhanced, scalable scheme for enforcing the DTE security policy based on trusted computing technology that addresses the problems above. It analyzes the overall design of the scheme in detail and implements a prototype system to demonstrate its feasibility. Experimental results show that the scheme has acceptable performance overhead.

  • Crawling technology on eDonkey network and characteristics analysis

    Publication Year: 2008, Page(s): 3663-3667

    The eDonkey network, with its large number of shared resources, has become a popular P2P file-sharing network. In order to analyze its characteristics, this paper provides a crawling method that prevents the crawler from being added to a server's blacklist and breaks the limit on the number of results returned when the crawler searches the server. The paper then summarizes the distribution characteristics of the servers and of the shared files.

  • Cryptanalysis and improvement of a password-based key exchange protocol

    Publication Year: 2008, Page(s): 3668-3672

    In 2002, Yeh and Sun proposed a simple authenticated key agreement protocol resistant to password guessing attacks, and they provided a formal proof of security to show its strength against both passive and active adversaries. However, the Yeh-Sun scheme has security flaws. In this paper, we analyze the scheme's security and show that it cannot resist the stolen-verifier attack or the man-in-the-middle attack. We then present an improved version of the Yeh-Sun scheme that resists the stolen-verifier attack combined with the man-in-the-middle attack.

  • Improvement upon architecture of TCG credentials

    Publication Year: 2008, Page(s): 3673-3678

    The Trusted Computing Group (TCG) has developed specifications for computing platforms that create a foundation of trust for software processes based on credentials. However, our research on those credentials found that their management is too complicated to implement, and the complication leads to underlying insecurity. The paper proposes a new credential architecture that uses the EK credential and a platform identity credential to accomplish the trust mechanism. This is a significant improvement: the five credentials previously required are reduced to three. The proposal provides evidence of superior security and availability.

  • Research of BLP and Biba dynamic union model based on check domain

    Publication Year: 2008, Page(s): 3679-3683

    This paper puts forward a confidentiality and integrity dynamic union model, based on a check domain, to resolve the contradiction between BLP and Biba. First, all combinations of the BLP and Biba models are analyzed. Then the concept of a check domain is introduced; within this domain, confidentiality and integrity levels can be checked and changed. When BLP and Biba cannot be satisfied simultaneously, a subject cannot operate on an object directly; only by making use of the check domain and a privileged subject can the subject operate on the object indirectly. Finally, a series of security rules is given that binds BLP and Biba together in practice. Objects that are read or appended are signed by the subjects who launched the operation, for accident tracking. The model resolves the availability conflict between the BLP and Biba models, so the confidentiality and integrity of the system can be assured while the system retains high usability.

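    The contradiction the paper addresses is easy to see in code: BLP permits reading down and writing up on confidentiality levels, while Biba permits the opposite on integrity levels, so enforcing both strictly confines a subject to objects at exactly its own levels. The sketch below demonstrates this with the standard BLP and Biba checks; the level assignments are illustrative only.

```python
def blp_ok(op, subj_conf, obj_conf):
    # Bell-LaPadula: no read up, no write down (confidentiality)
    return obj_conf <= subj_conf if op == "read" else obj_conf >= subj_conf

def biba_ok(op, subj_int, obj_int):
    # Biba: no read down, no write up (integrity)
    return obj_int >= subj_int if op == "read" else obj_int <= subj_int

def union_ok(op, subj, obj):
    """Strictly enforce both models at once."""
    return (blp_ok(op, subj["conf"], obj["conf"])
            and biba_ok(op, subj["int"], obj["int"]))

subj = {"conf": 2, "int": 2}
objects = {"low": {"conf": 1, "int": 1},
           "same": {"conf": 2, "int": 2},
           "high": {"conf": 3, "int": 3}}
for name, obj in objects.items():
    allowed = [op for op in ("read", "write") if union_ok(op, subj, obj)]
    print(name, allowed)
# Only the object at exactly the subject's levels remains accessible;
# this is the deadlock the check domain is designed to resolve.
```
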
  • Research on intrusion detection systems based on Levenberg-Marquardt algorithm

    Publication Year: 2008, Page(s): 3684-3688
    Cited by:  Papers (1)

    Intrusion detection systems are a hot spot in information security research. The paper puts forward an intrusion detection model called LMBP. It first studies different optimization algorithms for BP (back propagation) networks and then combines an improved BP algorithm with the Levenberg-Marquardt algorithm to devise LMBP. Based on this model, the LM-HIDS intrusion detection system is built. The experimental results show that applying the Levenberg-Marquardt algorithm to optimize the BP network is an efficient way to detect host intrusions.

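    The Levenberg-Marquardt update that LMBP relies on interpolates between gradient descent and Gauss-Newton through a damping term. The sketch below shows the standard update on a small curve-fitting problem; applying it to BP network weights follows the same pattern, with J the Jacobian of per-sample errors. The damping schedule and the test function are assumptions.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, w0, mu=1e-2, iters=50):
    """Damped step: delta = (J^T J + mu I)^{-1} J^T r, with adaptive mu."""
    w = w0.astype(float)
    for _ in range(iters):
        r, J = residual(w), jacobian(w)
        step = np.linalg.solve(J.T @ J + mu * np.eye(len(w)), J.T @ r)
        w_new = w - step
        if (residual(w_new) ** 2).sum() < (r ** 2).sum():
            w, mu = w_new, mu * 0.7      # accept: lean toward Gauss-Newton
        else:
            mu *= 2.0                    # reject: lean toward gradient descent
    return w

# Fit y = a * exp(b x) to noisy data
rng = np.random.default_rng(4)
x = np.linspace(0, 1, 40)
y = 2.0 * np.exp(1.5 * x) + 0.05 * rng.normal(size=x.size)
res = lambda w: w[0] * np.exp(w[1] * x) - y
jac = lambda w: np.column_stack([np.exp(w[1] * x),
                                 w[0] * x * np.exp(w[1] * x)])
print(levenberg_marquardt(res, jac, np.array([1.0, 1.0])))
```
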
  • A parametric method for edge detection based on recursive mean-separate image decomposition

    Publication Year: 2008, Page(s): 3689-3694
    Cited by:  Papers (1)

    Edge detection plays an important role in the field of computer vision. A parametric edge detection method based on recursive mean-separate image decomposition is introduced, along with a method for automatic parameter selection and two thresholding methods. Experimental results show that the proposed method outperforms many popular edge detectors, including Sobel, Prewitt, Frei-Chen, and Canny, both visually and by quantitative edge map evaluation. Proper parameter selection can also provide segmentation of materials such as potential threat objects in X-ray luggage scan images.

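    The abstract does not define the decomposition precisely, so the following is only a guess at its flavor: recursively splitting each region at its mean intensity to produce binary layers, whose edge maps could then be fused. Treat every detail here as an assumption rather than the paper's algorithm.

```python
import numpy as np

def mean_separate(img, levels=3):
    """Recursively split the intensity range at the regional mean,
    producing one binary layer per level (a guessed decomposition)."""
    layers, regions = [], [np.ones_like(img, dtype=bool)]
    for _ in range(levels):
        nxt, layer = [], np.zeros_like(img, dtype=bool)
        for mask in regions:
            if not mask.any():
                continue
            m = img[mask].mean()
            hi = mask & (img >= m)      # above-mean part of the region
            lo = mask & (img < m)       # below-mean part of the region
            layer |= hi
            nxt += [hi, lo]
        layers.append(layer)
        regions = nxt
    return layers

rng = np.random.default_rng(5)
img = rng.random((64, 64)) + np.linspace(0, 1, 64)  # gradient plus noise
for k, L in enumerate(mean_separate(img)):
    print(f"level {k}: {L.mean():.2%} of pixels above their regional mean")
```
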
  • An image scrambling algorithm using parameter bases M-sequences

    Publication Year: 2008, Page(s): 3695-3698
    Cited by:  Papers (1)

    Image scrambling is a useful approach to securing image data by transforming the image into an unintelligible format. This paper introduces a new parameter-based M-sequence that can be produced by a series of shift registers, together with a new image scrambling algorithm based on it. The user can change the security key r, which indicates the number of shift operations to be applied, or the distance parameter p, to generate many different M-sequences. This makes the scrambled images difficult to decode, providing a high level of security protection for the images. The presented algorithm can encrypt 2-D or 3-D images in one step. It also performs well under image attacks such as filtering (data loss) and noise. Because it is a straightforward and easily implemented process, the algorithm is suitable for real-time applications.

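    An M-sequence generator is just a maximal-length linear feedback shift register, and a key-dependent permutation built from it conveys the flavor of the scrambling scheme. The sketch below is a loose illustration: the way the keys r and p enter, and the use of the sequence as a sort key, are assumptions rather than the paper's construction.

```python
import numpy as np

def m_sequence(taps, state, length):
    """Binary M-sequence from a Fibonacci LFSR with the given feedback
    tap indices and a nonzero initial state (list of bits)."""
    bits = []
    for _ in range(length):
        bits.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return np.array(bits, dtype=np.uint8)

def scramble(img, r=3, p=1):
    """Permute pixels with an M-sequence-driven order; keys r (shift
    count) and p (distance) mirror the paper's parameters in spirit."""
    n = img.size
    seq = m_sequence([0, 3], [1, 0, 0, 1], n)   # x^4 + x + 1 LFSR
    seq = np.roll(seq, r * p)                    # key-dependent shift
    order = np.argsort(seq, kind="stable")       # key-driven permutation
    return img.reshape(-1)[order].reshape(img.shape), order

def unscramble(scr, order):
    out = np.empty_like(scr.reshape(-1))
    out[order] = scr.reshape(-1)
    return out.reshape(scr.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
scr, order = scramble(img)
assert np.array_equal(unscramble(scr, order), img)  # lossless
print(scr)
```
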
  • Dynamic role assignment based on X.509 PMI mechanism for mobile agent systems

    Publication Year: 2008, Page(s): 3699-3703
    Cited by:  Papers (1)

    Unauthorized access to agent systems by malicious entities is becoming a serious threat to agent-based communications. An X.509 attribute-certificate-based access control mechanism for mobile agent systems is given, whose authorization policies can be dynamically adapted to the constantly evolving threats from computer systems and networks. The privilege management infrastructure (PMI) based on the X.509 standard provides an interoperability solution among varied agent platforms by using an external authorization data structure, namely the attribute certificate fields. A role-based access control (RBAC) mechanism for agent systems based on attribute certificates is designed.

  • Enhancement of alaryngeal speech utilizing spectral subtraction and minimum statistics

    Publication Year: 2008, Page(s): 3704-3709
    Cited by:  Papers (2)

    This paper proposes improvements to the electrolarynx, a device that allows a patient to speak after the larynx is removed. Speech through existing electrolarynx devices is corrupted by high levels of noise and sounds unnatural. The proposed algorithm is based on spectral subtraction techniques and modifies the magnitude of the speech signal in the frequency domain. With the introduction of discrete cosine transform (DCT) domain analysis using minimum statistics, the proposed algorithm effectively reduces the high levels of noise generated by the electrolarynx. Unlike existing methods, it does not require a voice activity detector, and the DCT domain is more proficient at isolating speech signal energy. The new algorithm is readily adaptable to hardware implementation and has the potential to be included in a handheld electrolarynx device in the future.

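    The two ingredients named above, spectral subtraction and a minimum-statistics noise estimate that removes the need for a voice activity detector, can be sketched together. This sketch works on short-time Fourier magnitudes for brevity where the paper uses the DCT domain; the window sizes, spectral floor, and test signal are assumptions.

```python
import numpy as np

def spectral_subtract(x, frame=256, hop=128, win_len=20, floor=0.02):
    """Spectral subtraction with a minimum-statistics noise estimate:
    noise magnitude per bin is tracked as the minimum over a sliding
    window of past frames, so no voice activity detector is needed."""
    w = np.hanning(frame)
    n_frames = 1 + (len(x) - frame) // hop
    spectra = np.array([np.fft.rfft(w * x[i*hop:i*hop+frame])
                        for i in range(n_frames)])
    mags = np.abs(spectra)
    out = np.zeros(len(x))
    history = []
    for i in range(n_frames):
        history.append(mags[i])
        if len(history) > win_len:
            history.pop(0)
        noise = np.min(history, axis=0)            # minimum statistics
        clean = np.maximum(mags[i] - noise, floor * mags[i])
        frame_out = np.fft.irfft(clean * np.exp(1j * np.angle(spectra[i])))
        out[i*hop:i*hop+frame] += w * frame_out    # overlap-add synthesis
    return out

rng = np.random.default_rng(6)
t = np.arange(8000) / 8000
sig = np.sin(2 * np.pi * 200 * t) + 0.3 * rng.normal(size=t.size)
print(f"output RMS: {np.sqrt(np.mean(spectral_subtract(sig) ** 2)):.3f}")
```
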
  • JPEG steganalysis using color correlation and training on clean images only

    Publication Year: 2008, Page(s): 3710-3713

    Steganalysis has become an important emerging technique for detecting secret messages embedded in a clean image. Universal steganalysis is especially useful because it does not require prior knowledge of the embedding procedure. However, the detection results of most universal methods are largely determined by a training procedure on a mixture of clean images and stego-images, which is not practically feasible. Moreover, many color steganalysis methods give color coefficients no special consideration and can thus be viewed as simple extensions of the analysis for grayscale images. To capture the distinctive features of clean images, we propose a novel predictor based on the intra- and inter-color correlations of wavelet coefficients. This method achieves higher detection rates under a blind condition in which only clean images are involved at the training stage.

  • Progressive Edge Detection on multi-bit images using polynomial-based binarization

    Publication Year: 2008, Page(s): 3714-3719
    Cited by:  Papers (1)

    Edge detection is an important image processing operation with applications such as 3D reconstruction, recognition, image enhancement, image restoration, and compression. Several edge detectors have been developed over the past decades, although no single edge detector is best suited for all applications [S]. This paper presents a new concept in edge detection that is better suited for application-specific image processing. The grayscale or multi-bit image is mapped to a set of several binary images; edge detection algorithms are applied to these binary images, and the individual edge maps are fused. A novel polynomial-based binarization method is also presented, and an evolutionary approach to the fusion of edge maps makes the algorithm adaptive. Experimental results show that this new concept has several advantages: it produces edges of better quality, and it can be used in a 'progressive' manner to save computation and increase usability. Further, it can take advantage of multiple popular edge detection algorithms (Danahy et al., 2007; Danahy et al., 2006; Danahy, 2006).

  • Steganography detection using RBFNN

    Publication Year: 2008, Page(s): 3720-3725
    Cited by:  Papers (2)

    A machine learning approach based on alpha-trimmed mean feature preprocessing is introduced to determine whether secret messages are hidden within JPEG images. The paper also integrates a multi-stage preprocessing sequence into the classification system, comprising feature generation from an image dataset of steganographic and clean images, feature ranking and selection, feature extraction, and data standardization. A neural network using radial basis functions trains the classifier to carry out the decision-making process, labeling each analyzed image as either steganographic or clean. Computer simulations have shown that classification accuracy increases by 40% when feature preprocessing is used within the complete detection system compared with a system without it. In addition, the alpha-trimmed mean statistic (which includes the mean and median as special cases) results in higher classification accuracy.

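    The alpha-trimmed mean named above is a standard robust statistic that interpolates between the mean (alpha = 0) and the median (alpha near 0.5). A minimal sketch follows, with synthetic outlier-contaminated values standing in for the paper's steganalysis features.

```python
import numpy as np

def alpha_trimmed_mean(x, alpha=0.1):
    """Mean of the sample after discarding the alpha fraction of the
    smallest and largest values; robust to outliers."""
    x = np.sort(np.asarray(x).ravel())
    k = int(alpha * len(x))
    return x[k:len(x) - k].mean() if len(x) > 2 * k else np.median(x)

rng = np.random.default_rng(7)
features = rng.normal(0, 1, 200)
features[:5] = 50.0                 # inject outliers
print(f"plain mean        = {features.mean():.3f}")
print(f"trimmed (a = 0.1) = {alpha_trimmed_mean(features, 0.1):.3f}")
```
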
  • A network problem diagnosis expert system based on web services

    Publication Year: 2008, Page(s): 3726-3731
    Cited by:  Papers (3)

    Web services are a building block of service-oriented architecture (SOA), a hot research topic in recent years, with applications in collaborative commerce and other service-oriented systems. This study incorporates two major theoretical frameworks, expert systems and Web services, and applies them to computer network problem diagnosis in order to help users find appropriate recommendations and solutions for the network problems facing them. The system is developed in the ASP.NET environment to demonstrate the separation of control and knowledge. SQL Server is selected as the knowledge base storage tool to give managers more flexibility and allow them to easily modify the contents of the knowledge and rules. Further, because of the cross-platform nature of Web services and the fact that the simple object access protocol (SOAP) uses the hypertext transfer protocol (HTTP) as its basic transport, more and more platforms will support Internet browsing as mobile computing technology and equipment mature. The system is therefore in line with this trend and can be applied across different software and hardware platforms to maximize its benefits.

  • A rulespaces for Semantic Web services based on tuplespace and SWRL

    Publication Year: 2008, Page(s): 3732-3737
    Cited by:  Papers (1)

    Coordination is a vital aspect of any interaction between computer agents in open distributed systems such as the Web. As distributed systems become more ubiquitous, autonomous, and complex, the need to ground them on a common data model grows stronger. In particular, Semantic Web services are based on synchronous message exchange and are thus incompatible with the REST architectural model of the (Semantic) Web. Recent advances in middleware technologies propose semantics-aware tuplespaces as an infrastructure for coping with these issues; however, none of the existing related work supports rules in its extensions of the tuplespace. This paper presents RuleSpaces, a combination of tuplespace computing and the Semantic Web rule language. We give an overview of the conceptual model, discuss the necessary extensions to the original tuplespace, and describe an initial implementation of the RuleSpaces prototype system.

  • Accident prevention system based on semantic network

    Publication Year: 2008, Page(s): 3738-3743
    Cited by:  Papers (1)

    Humans handle huge amounts of dynamic, structurally complex, and ever-growing information in their daily lives; nowadays the most typical example is the World Wide Web, a fertile sea of knowledge that people draw on both individually and in business. Web mining is therefore one of the most important techniques, and a new generation of Web mining techniques analyzes Web information by searching, recommending, surfing, and visualizing the Web. The semantic network is a new approach to Web content mining that combines the advantages of fuzzy logical search and semantic analysis. The objective of this paper is to build an accident prevention system by means of a semantic network. The system is built from a number of ranking algorithms based on generality and novelty measures extracted from an accident database.

  • Fuzzy system reliability analysis based on level(1−β,1−α) interval-valued fuzzy numbers and using statistical data

    Publication Year: 2008, Page(s): 3744-3749

    In this paper, we consider the fuzzy reliability of serial systems and of parallel systems. We use statistical data to derive level 1−α fuzzy numbers and level (1−β, 1−α) interval-valued fuzzy numbers. We then compute the fuzzy reliability of both serial and parallel systems based on the level (1−β, 1−α) interval-valued fuzzy numbers.

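    The system-level computation reduces to interval arithmetic once each component's reliability is represented as an interval (for example, a cut of a level (1−β, 1−α) interval-valued fuzzy number). Because serial and parallel reliability are monotone in each component, evaluating the interval endpoints suffices. The component intervals below are made-up examples, not the paper's data.

```python
import numpy as np

def serial(rel):
    """Interval reliability of a serial system: product of intervals."""
    lo = np.prod([r[0] for r in rel])
    hi = np.prod([r[1] for r in rel])
    return lo, hi

def parallel(rel):
    """Interval reliability of a parallel system: 1 - prod(1 - R)."""
    lo = 1 - np.prod([1 - r[0] for r in rel])
    hi = 1 - np.prod([1 - r[1] for r in rel])
    return lo, hi

# Each component's reliability as an interval (illustrative values)
components = [(0.90, 0.95), (0.85, 0.92), (0.88, 0.96)]
print("serial  :", tuple(round(v, 4) for v in serial(components)))
print("parallel:", tuple(round(v, 4) for v in parallel(components)))
```
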
  • On laws of large numbers for L-R fuzzy variables

    Publication Year: 2008, Page(s): 3750-3755

    In this study, we discuss laws of large numbers for T-independent L-R fuzzy variables based on continuous Archimedean t-norms and the expected value of a fuzzy variable. First, using continuous Archimedean t-norms, we derive several convergence properties of sums of L-R fuzzy variables, in credibility measure and in expected value, respectively. Then, on the basis of these convergence properties, we establish some laws of large numbers for T-independent L-R fuzzy variables.
