
Signal Processing, IEEE Transactions on

Issue 2  Part 2 • Date Feb. 2005

This issue contains several parts. See also: Part 1

  • Table of contents

    Publication Year: 2005, Page(s): c1
  • IEEE Transactions on Signal Processing publication information

    Publication Year: 2005, Page(s): c2
  • A Message From the Editor-in-Chief and Guest Editorial: Supplement on Secure Media—II

    Publication Year: 2005, Page(s): 745
  • Secret key estimation in sequential steganography

    Publication Year: 2005, Page(s): 746 - 757
    Cited by: Papers (11)

    We define sequential steganography as the class of embedding algorithms that hide messages in consecutive (time, spatial, or frequency domain) features of a host signal. This work presents a steganalysis method that estimates the secret key used in sequential embedding. Steganalysis is posed as the detection of abrupt jumps in the statistics of a stego signal. Stationary and nonstationary host signals with low, medium, and high signal-to-noise ratio (SNR) embedding are considered. A locally most powerful steganalysis detector for the low-SNR case is also derived. Several techniques that make the steganalysis algorithm work for nonstationary digital image steganalysis are also presented. Extensive experimental results illustrate the strengths and weaknesses of the proposed steganalysis algorithm.
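
A minimal sketch of the change-point idea in this abstract (not the authors' detector): sequential embedding shifts the local statistics of the host signal at an unknown sample, and a simple CUSUM-style statistic can estimate where that abrupt jump occurs.

```python
import random

def change_point(xs):
    """Estimate the index where the mean of xs jumps, as the argmax of the
    centered cumulative sum |S_k - (k/n) * S_n|."""
    n = len(xs)
    total = sum(xs)
    prefix, best_k, best_val = 0.0, 0, -1.0
    for k in range(1, n):
        prefix += xs[k - 1]
        val = abs(prefix - (k / n) * total)
        if val > best_val:
            best_val, best_k = val, k
    return best_k

random.seed(0)
# host: zero-mean noise; embedding starts at sample 100 and shifts the mean
stego = [random.gauss(0.0, 0.1) for _ in range(100)] + \
        [random.gauss(1.0, 0.1) for _ in range(100)]
print(change_point(stego))  # close to 100
```

The paper's method is more general (nonstationary hosts, locally most powerful tests); this only illustrates why key estimation reduces to change-point detection.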

  • Exposing digital forgeries by detecting traces of resampling

    Publication Year: 2005, Page(s): 758 - 767
    Cited by: Papers (33) | Patents (7)

    The unique stature of photographs as a definitive recording of events is being diminished due, in part, to the ease with which digital images can be manipulated and altered. Although good forgeries may leave no visual clues of having been tampered with, they may, nevertheless, alter the underlying statistics of an image. For example, we describe how resampling (e.g., scaling or rotating) introduces specific statistical correlations, and describe how these correlations can be automatically detected in any portion of an image. This technique works in the absence of any digital watermark or signature. We show the efficacy of this approach on uncompressed TIFF images, and JPEG and GIF images with minimal compression. We expect this technique to be among the first of many tools that will be needed to expose digital forgeries.
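
An illustrative sketch of the underlying idea (not the paper's EM-based algorithm): after 2x upsampling with linear interpolation, every interpolated sample is an exact linear combination of its neighbors, so the interpolation residual vanishes periodically — exactly the kind of statistical correlation the detector looks for.

```python
import random

def upsample2(xs):
    """Naive 2x upsampling by linear interpolation."""
    out = []
    for a, b in zip(xs, xs[1:]):
        out += [a, (a + b) / 2]
    out.append(xs[-1])
    return out

def residuals(xs):
    """Deviation of each interior sample from the average of its neighbors."""
    return [abs(xs[i] - (xs[i - 1] + xs[i + 1]) / 2) for i in range(1, len(xs) - 1)]

random.seed(1)
sig = [random.random() for _ in range(50)]
res = residuals(upsample2(sig))
# residuals at interpolated positions (even indices of res) are all ~0,
# while residuals elsewhere are generically nonzero
print(max(res[0::2]), max(res))
```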

  • Cryptanalysis of optimal differential energy watermarking (DEW) and a modified robust scheme

    Publication Year: 2005, Page(s): 768 - 775
    Cited by: Papers (1)

    This work presents a cryptanalytic methodology for the well-known "Optimal Differential Energy Watermarking (DEW)" scheme. The DEW scheme divides the image into disjoint regions, each containing two subregions. The watermark is a secret binary string in which each bit is inserted in one of the regions by modifying the high-frequency Discrete Cosine Transform (DCT) coefficients; this modification creates a required energy difference between the two subregions. We modify these high-frequency components so that the energy difference vanishes and, in turn, extraction of the watermark becomes impossible, making the cryptanalysis successful. Further, we improve the DEW scheme by inserting the watermark information in low-frequency components instead of high-frequency components and propose an oblivious, robust watermarking strategy that enables forensic tracking. We present statistical justification for the robustness of our modified scheme.
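
A toy version of the attack described above (the real DEW scheme operates on high-frequency DCT coefficients of image blocks): the watermark bit is encoded as the sign of the energy difference between two subregions, so rescaling both subregions to a common energy destroys the embedded bit.

```python
def energy(coeffs):
    return sum(c * c for c in coeffs)

def erase_energy_gap(sub_a, sub_b):
    """Rescale both coefficient sets to the average of their energies, so the
    energy difference (which carries the watermark bit) vanishes."""
    target = (energy(sub_a) + energy(sub_b)) / 2
    scale = lambda cs: [c * (target / energy(cs)) ** 0.5 for c in cs] if energy(cs) else cs
    return scale(sub_a), scale(sub_b)

a, b = [3.0, 1.0], [1.0, 1.0]      # energies 10 and 2: bit = sign(+8)
a2, b2 = erase_energy_gap(a, b)
print(energy(a2) - energy(b2))      # ~0: the bit can no longer be extracted
```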

  • Online signature verification using temporal shift estimated by the phase of Gabor filter

    Publication Year: 2005, Page(s): 776 - 783
    Cited by: Papers (3)

    A new online signature-verification method using temporal shift estimation is presented. Local temporal shifts existing in signatures are estimated from the differences of the phase outputs of a Gabor filter applied to signature signals. In the proposed signature-verification algorithm, an input signature signal undergoes preprocessing procedures including smoothing, size normalization, and skew correction, and then its feature profile is extracted from the signature signal. A Gabor filter with the predetermined center frequency ω is applied to a feature profile, and a phase profile is computed from the phase output. The feature profile and the phase profile are length-normalized and quantized so that a signature code of fixed size is generated. The temporal shifts between two signatures are computed from the differences between their phase profiles, and this shift information is used as offsets for comparing the two feature profiles. Two kinds of dissimilarity are therefore proposed. Temporal dissimilarity is a measure reflecting the amount of total temporal disturbance between the two signatures. The difference between the two signature profiles is computed at each corresponding point pair and is accumulated into a temporally arranged feature-profile dissimilarity. The decision boundary is represented as a straight line in the dissimilarity space whose two axes are the two dissimilarity measures. The slope and the position of the decision boundary are computed using the distribution of the dissimilarities among the sample signatures involved in the enrollment procedure. The experimental results show that, through the compact and fixed-size signature data representation and relatively simple comparison methods, the proposed method can compare signatures 400 times faster than the conventional DP-matching-based signature-verification method.
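
A simplified sketch of the shift-from-phase idea (a real Gabor filter is windowed; here a single global complex exponential stands in for it): a temporal shift of d samples changes the filter's output phase by -ωd, so the phase difference between two profiles estimates their relative shift.

```python
import cmath, math

def phase_at(xs, omega):
    """Phase of the signal's response to the complex exponential e^{-i*omega*n}."""
    resp = sum(x * cmath.exp(-1j * omega * n) for n, x in enumerate(xs))
    return cmath.phase(resp)

def estimate_shift(ref, probe, omega):
    dphi = phase_at(probe, omega) - phase_at(ref, omega)
    dphi = (dphi + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi)
    return -dphi / omega                                 # phase lag -> samples

omega, d = 0.3, 3.0
ref   = [math.cos(omega * n) for n in range(200)]
probe = [math.cos(omega * (n - d)) for n in range(200)]
print(estimate_shift(ref, probe, omega))  # close to 3
```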

  • An asymmetric subspace watermarking method for copyright protection

    Publication Year: 2005, Page(s): 784 - 792
    Cited by: Papers (6) | Patents (1)

    We present an asymmetric watermarking method for copyright protection that uses different matrix operations to embed and extract a watermark. It allows for the public release of all information, except the secret key. We investigate the conditions for a high detection probability, a low false positive probability, and the possibility of unauthorized users successfully hacking into our system. The robustness of our method is demonstrated by the simulation of various attacks.

  • Embeddable ADC-based true random number generator for cryptographic applications exploiting nonlinear signal processing and chaos

    Publication Year: 2005, Page(s): 793 - 805
    Cited by: Papers (7) | Patents (2)

    We present a true random number generator which, contrary to other implementations, is not based on the explicit observation of complex microcosmic processes but on standard signal processing primitives, freeing the designer from the need for dedicated hardware. The system can be implemented from now-ubiquitous analog-to-digital converter building blocks and is therefore well suited to embedding. On current technologies, the design permits data rates on the order of a few tens of megabits per second. Furthermore, the absence of predictable, repeatable behaviors increases the system's security for cryptographic applications. The design relies on a simple inner model based on chaotic dynamics which, under ideal conditions, can be formally proven to generate perfectly uncorrelated binary sequences. Here, we detail the design and validate the quality of its output against two test suites standardized by the U.S. National Institute of Standards and Technology, both in the ideal case and assuming implementation errors.
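
An idealized sketch of chaos-based bit generation (the paper's generator is built from ADC pipeline stages implementing a specific map; here the logistic map stands in for the chaotic dynamics, and floating point only approximates the ideal behavior):

```python
def chaotic_bits(seed, nbits):
    """Generate bits by iterating a chaotic map and thresholding the state."""
    x, bits = seed, []
    for _ in range(nbits):
        x = 4.0 * x * (1.0 - x)            # logistic map, chaotic on (0, 1)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = chaotic_bits(0.1234567, 1000)
print(sum(bits) / len(bits))               # roughly balanced zeros and ones
```

The "formally proven uncorrelated" property in the abstract holds for the ideal dynamics, not for a finite-precision simulation like this one.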

  • Multiple watermarks for stereo audio signals using phase-modulation techniques

    Publication Year: 2005, Page(s): 806 - 815
    Cited by: Papers (3) | Patents (1)

    Audio watermarking techniques are used to embed extra information, typically relating to copyrights, into audio signals. In some applications, embedding of multiple watermarks in a single multimedia content is required. In practice, it may be necessary to embed up to three kinds of information as watermarks: "copyright management information," "copy control information," and "fingerprints." For this purpose, a new watermarking technique capable of embedding multiple watermarks based on phase modulation is proposed in this paper. The idea exploits the insensitivity of the human auditory system to phase changes with relatively long transition periods. In the proposed technique, the phase modulation of the original signal is realized by means of a time-varying all-pass filter. To accomplish the blind detection required for the copy control information, this watermark is assigned to the inter-channel phase difference of a stereo audio signal by using frequency shift keying. Meanwhile, the copyright management information and the fingerprint are embedded into both channels by using phase shift keying of different frequency components. Consequently, these three kinds of information are simultaneously embedded into a single time frame. Imperceptibility of the watermark was confirmed through a subjective listening test. Robustness against several kinds of signal processing was evaluated by computer simulations. The proposed method performed well in both the subjective and objective tests.
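
A sketch of the phase-modulation primitive (a fixed first-order all-pass filter; the paper uses a time-varying one and stereo phase differences): an all-pass filter changes only the phase of a signal, not its magnitude spectrum, which is why such embedding is hard to hear.

```python
import math

def allpass(xs, a):
    """First-order all-pass: y[n] = -a*x[n] + x[n-1] + a*y[n-1],
    with |H(e^jw)| = 1 at every frequency (|a| < 1 for stability)."""
    x_prev = y_prev = 0.0
    out = []
    for x in xs:
        y = -a * x + x_prev + a * y_prev
        out.append(y)
        x_prev, y_prev = x, y
    return out

sig = [math.sin(0.5 * n) for n in range(2000)]
out = allpass(sig, 0.4)
e_in = sum(x * x for x in sig)
e_out = sum(y * y for y in out)
print(e_out / e_in)   # close to 1: energy preserved, only phase changed
```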

  • Audio watermarking: a way to stationnarize audio signals

    Publication Year: 2005, Page(s): 816 - 823
    Cited by: Papers (3) | Patents (1)

    Audio watermarking is usually used as a multimedia copyright protection tool or as a system that embeds metadata in audio signals. In this paper, watermarking is viewed as a preprocessing step for further audio processing systems: the watermark signal conveys no information; rather, it is used to modify the statistical characteristics of an audio signal, in particular its nonstationarity. The embedded watermark is added in order to stationarize the host signal: because the watermark is piecewise stationary, it modifies the stationarity of the original audio signal. In some audio processing fields, this property can be used to improve the performance of algorithms that are sensitive to time-varying signal statistics. This work presents an analysis of the impact of perceptual watermarking on the stationarity of audio signals. The study is based on stationarity indices, which measure variations in the spectral characteristics of signals using time-frequency representations. Simulation results with two kinds of signals, artificial signals and audio signals (speech and music), are presented. A comparison of stationarity indices between watermarked and original audio signals shows a significant stationarity enhancement of the watermarked signal, especially for transient attacks.
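
A rough stand-in for the stationarity indices discussed above (the paper's indices are built on time-frequency representations; this hypothetical version just compares magnitude spectra of consecutive frames — larger average change means a less stationary signal):

```python
import cmath, math

def mag_spectrum(frame):
    """Magnitude DFT of one frame (naive O(n^2) transform, positive bins)."""
    n = len(frame)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(frame))) for k in range(n // 2)]

def stationarity_index(xs, flen):
    """Average normalized spectral change between consecutive frames."""
    specs = [mag_spectrum(xs[i:i + flen])
             for i in range(0, len(xs) - flen + 1, flen)]
    diffs = [sum(abs(a - b) for a, b in zip(s, t)) / (sum(s) + sum(t))
             for s, t in zip(specs, specs[1:])]
    return sum(diffs) / len(diffs)

steady = [math.sin(0.3 * n) for n in range(256)]
chirp  = [math.sin((0.1 + 0.004 * n) * n) for n in range(256)]
# a fixed-frequency tone scores lower (more stationary) than a chirp
print(stationarity_index(steady, 32), stationarity_index(chirp, 32))
```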

  • Informed watermarking by means of orthogonal and quasi-orthogonal dirty paper coding

    Publication Year: 2005, Page(s): 824 - 833
    Cited by: Papers (2)

    A new dirty paper coding technique that is robust against the gain attack is presented. This robustness is obtained by adopting a set of (orthogonal) equi-energetic codewords and a correlation-based decoder. Due to the simple structure of orthogonal codes, we developed a simple yet powerful technique to embed the hidden message within the host signal. The proposed technique is optimal in that the embedding distortion is minimized for a given robustness level, where robustness is measured through the maximum pairwise error probability in the presence of an additive Gaussian attack of given strength. The performance of the dirty paper coding algorithm is further improved by replacing orthogonal codes with quasi-orthogonal codes, namely Gold sequences, and by concatenating them with an outer turbo code. To this aim, the inner decoder is modified to produce a soft estimate of the embedded message. Performance analysis is carried out by means of extensive simulations proving the validity of the novel watermarking scheme.
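
A bare-bones sketch of the codeword scheme (the actual system adds informed embedding, Gold sequences, and an outer turbo code): messages map to orthogonal equi-energy codewords, and a correlation decoder picks the codeword with the largest correlation — a decision that a multiplicative gain attack cannot change.

```python
# four orthogonal, equi-energy codewords (rows of a 4x4 Hadamard matrix)
CODEWORDS = [[1, 1, 1, 1], [1, -1, 1, -1], [1, 1, -1, -1], [1, -1, -1, 1]]

def embed(host, msg, alpha):
    """Add the codeword for msg to the host signal with strength alpha."""
    return [h + alpha * c for h, c in zip(host, CODEWORDS[msg])]

def decode(signal):
    """Correlation decoder: invariant to any positive gain on the signal."""
    corr = [sum(s * c for s, c in zip(signal, cw)) for cw in CODEWORDS]
    return max(range(len(CODEWORDS)), key=lambda i: corr[i])

host = [0.3, -0.8, 0.5, 0.1]
marked = embed(host, 2, alpha=5.0)
attacked = [0.4 * s for s in marked]      # gain attack scales every sample
print(decode(attacked))                   # still decodes message 2
```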

  • Joint watermarking and compression using scalar quantization for maximizing robustness in the presence of additive Gaussian attacks

    Publication Year: 2005, Page(s): 834 - 844
    Cited by: Papers (4) | Patents (1)

    In joint watermarking and compression (JWC), a key process is quantization, which embeds watermarks into a host signal while digitizing it, subject to requirements on the embedding rate, compression rate, quantization distortion, and robustness. Using fixed-rate scalar quantization for watermarking and compression, in this paper, we mainly consider how to design binary JWC systems to maximize the robustness of the systems in the presence of additive Gaussian attacks under constraints on the compression rate and quantization distortion. We first investigate optimum decoding of a binary JWC system, and demonstrate by experiments that in the distortion-to-noise ratio (DNR) region of practical interest, the minimum distance (MD) decoder achieves performance comparable to that of the maximum likelihood decoder, in addition to having low computational complexity and being independent of the statistics of the host signal. We then present optimum binary JWC encoding schemes using fixed-rate scalar quantization and the MD decoder. Simulation results show that optimum binary JWC systems using nonuniform quantization are better than optimum binary JWC systems using uniform quantization. Furthermore, in comparison with separate watermarking and compression systems, optimum binary JWC systems using nonuniform quantization achieve significant DNR gains in the DNR region of practical interest. Finally, spread transform dither modulation is applied to improve the robustness of the JWC systems at low DNRs.
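
Not the paper's optimized nonuniform design — just the basic scalar quantization-embedding primitive (dither modulation) it builds on: each bit selects one of two interleaved uniform quantizer lattices, and decoding finds the nearer lattice.

```python
def qim_embed(x, bit, delta):
    """Quantize x onto the lattice selected by bit (step delta, offset delta/2)."""
    offset = 0.0 if bit == 0 else delta / 2
    return round((x - offset) / delta) * delta + offset

def qim_decode(y, delta):
    """Decide which lattice y is closer to."""
    d0 = abs(y - qim_embed(y, 0, delta))
    d1 = abs(y - qim_embed(y, 1, delta))
    return 0 if d0 <= d1 else 1

delta = 1.0
y = qim_embed(3.37, 1, delta)             # embed bit 1 near the host value
print(y, qim_decode(y + 0.2, delta))      # survives noise below delta/4
```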

  • How realistic is photorealistic?

    Publication Year: 2005, Page(s): 845 - 850
    Cited by: Papers (12) | Patents (2)

    Computer graphics rendering software is capable of generating highly photorealistic images that can be impossible to differentiate from photographic images. As a result, the unique stature of photographs as a definitive recording of events is being diminished (the ease with which digital images can be manipulated is, of course, also contributing to this demise). To this end, we describe a method for differentiating between photorealistic and photographic images. Specifically, we show that a statistical model based on first-order and higher order wavelet statistics reveals subtle but significant differences between photorealistic and photographic images.
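
A 1-D toy of the feature-extraction idea (the paper works on 2-D image wavelet decompositions and also uses prediction-error statistics): collect low-order statistics of wavelet detail coefficients at several scales and feed them to a classifier.

```python
import statistics

def haar_step(xs):
    """One level of the (unnormalized) Haar transform: averages and differences."""
    approx = [(a + b) / 2 for a, b in zip(xs[0::2], xs[1::2])]
    detail = [(a - b) / 2 for a, b in zip(xs[0::2], xs[1::2])]
    return approx, detail

def wavelet_features(xs, levels):
    """Mean and standard deviation of the detail coefficients at each scale."""
    feats, cur = [], xs
    for _ in range(levels):
        cur, detail = haar_step(cur)
        feats += [statistics.mean(detail), statistics.pstdev(detail)]
    return feats

row = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
print(wavelet_features(row, 2))   # [mean, std] per scale -> 4 features
```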

  • User authentication through typing biometrics features

    Publication Year: 2005, Page(s): 851 - 855
    Cited by: Papers (6) | Patents (2)

    This paper uses static keystroke dynamics for user authentication. The inputs are the key down and up times and the key ASCII codes captured while the user is typing a string. Four features (key code, two keystroke latencies, and key duration) were analyzed, and seven experiments were performed combining these features. The results of the experiments were evaluated with three types of user: legitimate, impostor, and observer impostor. The best results were achieved using all features, obtaining a false rejection rate of 1.45% and a false acceptance rate of 1.89%. This approach can be used to strengthen the usual login-password authentication when the password is no longer a secret. The novelty of this paper lies in using four features to authenticate users.
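
A schematic of the feature set described above (the timings in milliseconds are made up, and the paper's experiments and thresholds are more elaborate): key hold durations plus the latencies between consecutive keys form a template, and a probe is accepted if its average deviation from the template is small.

```python
def keystroke_features(events):
    """events: list of (key, down_ms, up_ms) for one typed string."""
    durations = [up - down for _, down, up in events]
    latencies = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return durations + latencies

def verify(sample, template, threshold):
    """Accept if the mean absolute feature deviation is within threshold."""
    dist = sum(abs(a - b) for a, b in zip(sample, template)) / len(template)
    return dist <= threshold

enrolled = keystroke_features([("p", 0, 80), ("a", 150, 240), ("s", 310, 395)])
probe    = keystroke_features([("p", 0, 85), ("a", 160, 245), ("s", 320, 400)])
print(verify(probe, enrolled, threshold=15.0))
```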

  • 2005 IEEE Signal Processing Society membership application

    Publication Year: 2005, Page(s): 856
  • IEEE Signal Processing Society Information

    Publication Year: 2005, Page(s): c3
  • Blank page [back cover]

    Publication Year: 2005, Page(s): c4

Aims & Scope

IEEE Transactions on Signal Processing covers novel theory, algorithms, performance analyses, and applications of techniques for the processing, understanding, learning, retrieval, mining, and extraction of information from signals.

Full Aims & Scope

Meet Our Editors

Editor-in-Chief
Sergios Theodoridis
University of Athens