
Proceedings of the International Conference on Information Technology: Coding and Computing (ITCC 2003) [Computers and Communications]

Date: 28-30 April 2003


Displaying Results 1 - 25 of 133
  • Web services for a biomedical imaging portal

    Publication Year: 2003 , Page(s): 432 - 436
    Cited by:  Papers (1)

    TACWeb (TAC images on the Web) is a Web-based Grid portal developed at the CACT/ISUFI laboratory of the University of Lecce for the management of biomedical images in a distributed environment. Built on top of the Globus Toolkit, TACWeb is an interactive environment that handles complex user requests for the acquisition of biomedical data and the processing and delivery of biomedical images, using the power and security of computational grids. Recently, Grid technologies have been integrated with Web services technologies to provide a framework for interoperable application-to-application interaction. In this paper we present an evolution of the TACWeb architecture that is compliant with the Web services approach, and describe its main functionalities. In such a system, the basic capabilities are encapsulated and exposed as Web services, allowing new health applications to be developed as compositions of such services.

  • Correlation-based asymmetric watermark detector

    Publication Year: 2003 , Page(s): 564 - 568
    Cited by:  Papers (1)

    Asymmetric watermarking is a second-generation watermarking scheme that uses different keys for watermark embedding and detection in public applications. Recently, many algorithms for rendering asymmetry have been suggested: for example, eigenvector watermarks, periodic watermarks, and neural network systems. A simple algorithm was proposed by Hartung and Girod (1997): they generated a detection key by substituting noise for a portion of a given embedding key and detected the watermark with the usual correlation test. In this paper, we generalize their idea through the concept of the angle between the two keys. We study the structure of a correlation detector for a transform-based asymmetric watermarking scheme and propose a secure transform to render asymmetry: random directional angle rotation after linear projection. We also address the merits and drawbacks of the proposed system.
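
    A minimal numerical sketch of the correlation test this scheme generalizes, in the spirit of Hartung and Girod's construction (the substitution fraction, embedding strength and threshold below are illustrative assumptions, and the paper's angle-rotation transform is not reproduced):

        import numpy as np

        rng = np.random.default_rng(0)
        N = 1 << 14
        w = rng.standard_normal(N)              # private embedding key

        # Hartung-Girod style public detection key: substitute fresh noise
        # for a portion of the embedding key.
        d = w.copy()
        replaced = rng.random(N) < 0.5          # assumed 50% substitution
        d[replaced] = rng.standard_normal(replaced.sum())

        host = rng.standard_normal(N)           # stand-in for host-signal coefficients
        alpha = 0.1                             # assumed embedding strength
        marked = host + alpha * w

        # Usual correlation test: marked correlates with d because d still
        # shares roughly half of w.
        corr = marked @ d / N
        threshold = alpha * (1 - replaced.mean()) / 2
        print(f"correlation {corr:.4f} vs threshold {threshold:.4f} -> detected: {corr > threshold}")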

  • Evaluation of zero tree wavelet coders

    Publication Year: 2003 , Page(s): 507 - 511

    Image compression techniques play a vital role in satellite and multimedia applications. Wavelet transforms have received significant attention recently because of their established suitability for a wide range of signal and image compression applications. This is mainly due to the lapped nature of the transform and its computational simplicity, which comes in the form of filter bank implementations. In this paper, the performance of zero tree wavelet coders, namely the Embedded Zero Tree Wavelet (EZW) coder, the Set Partitioning In Hierarchical Trees (SPIHT) coder and the Packetizable Zero tree Wavelet (PZW) coder, is evaluated for different levels of decomposition, compression ratios and percentages of packet loss. PSNR is used as the criterion for measuring reconstructed image quality.
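
    The stated quality criterion is the standard peak signal-to-noise ratio; a minimal generic implementation (not the authors' code):

        import numpy as np

        def psnr(original, reconstructed, peak=255.0):
            """PSNR in dB; peak = 255 for 8-bit images."""
            err = original.astype(np.float64) - reconstructed.astype(np.float64)
            mse = np.mean(err ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)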

  • Development and validation of a computer science self-efficacy scale for CS0 courses and the group analysis of CS0 student self-efficacy

    Publication Year: 2003 , Page(s): 60 - 64
    Cited by:  Papers (1)

    A 21-item self-efficacy scale was developed to measure perceptions of capability associated with success among CS0 students. Data from 377 subjects enrolled in breadth-first CS0 courses over three years were used to assess the reliability and construct validity of the instrument. A principal factor analysis with Varimax rotation produced a six-factor solution that closely aligned with the constructs of Bandura's (1986) self-efficacy model. The self-efficacy group analysis used participant factor scores, calculated by summing the Likert-type responses to the items associated with each of the six derived factors. Subjects received a success or no-success designation based on their final course grade. Significant differences were found between the factor scores of the success and no-success groups; of males and females; of males and females within the success group; and of males in the success and no-success groups.
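
    The scoring step is plain arithmetic; a toy illustration of summing Likert-type responses per factor (the items and factor names below are invented, not the instrument's):

        responses = {"q1": 4, "q2": 5, "q3": 2, "q4": 3}   # 1-5 Likert answers
        factors = {"factor_A": ["q1", "q2"], "factor_B": ["q3", "q4"]}
        scores = {f: sum(responses[q] for q in items) for f, items in factors.items()}
        print(scores)   # {'factor_A': 9, 'factor_B': 5}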

  • A multi-agent system for personalized and private service in PDR

    Publication Year: 2003 , Page(s): 635 - 639
    Cited by:  Papers (1)  |  Patents (2)

    The overabundance of DTV (digital television) programs creates a need for new tools that help people find programs and personalize the TV viewing experience. In this paper, we present a multi-agent system that provides personalized and private service in PDRs (personal digital recorders), and we detail the architecture and components of the system. The multi-agent system observes users' viewing behavior in the background, updates their profiles continuously, and offers different programs to different users according to their respective profiles. It protects users' privacy by applying XML encryption to the reported information and by verifying the user's name and password before reporting his or her profile to the service provider. The paper also describes the communication mechanism within the multi-agent system.
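
    A hedged stand-in for the profile-protection step: encrypting the reported profile with a shared symmetric key (the paper uses XML Encryption; Fernet from the cryptography package is only an illustrative substitute):

        from cryptography.fernet import Fernet

        key = Fernet.generate_key()      # would be shared with the service provider
        f = Fernet(key)
        profile = b"<profile><user>alice</user><genre>news</genre></profile>"
        token = f.encrypt(profile)       # report only the ciphertext
        assert f.decrypt(token) == profile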

  • Development of group's signature for evaluation of skin cancer in mice caused by ultraviolet radiation

    Publication Year: 2003 , Page(s): 617 - 620
    Cited by:  Patents (9)

    In this research effort, the effect of UVC (260 nm) on the skin of one-month-old Balb/c mice exposed for a total of 100 hours is studied. The goal is to identify those independent variables in the experimental group whose measurements change significantly compared to the measurements of their counterparts in the control group. To meet this goal, we create signatures for both the experimental and control groups using the Kohonen self-organizing map (SOM). Comparing the signatures with each other reveals the significant changes in the independent variables between the two groups. The findings are compared with another set of findings obtained using analysis of variance. The results reveal that the signature approach based on the SOM methodology is a viable tool for this type of analysis.
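
    A compact generic Kohonen SOM of the kind such a group "signature" could be built on (a sketch under assumed grid size and decay schedules, not the authors' implementation):

        import numpy as np

        def train_som(data, grid=(8, 8), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
            """Train a minimal SOM; returns the weight grid (the 'signature')."""
            rng = np.random.default_rng(seed)
            h, w = grid
            weights = rng.random((h, w, data.shape[1]))
            coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
            for t in range(iters):
                x = data[rng.integers(len(data))]
                # best-matching unit, then pull its neighborhood toward x
                bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (h, w))
                lr = lr0 * np.exp(-t / iters)            # decaying learning rate
                sigma = sigma0 * np.exp(-t / iters)      # shrinking neighborhood
                dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
                weights += lr * np.exp(-dist2 / (2 * sigma**2))[..., None] * (x - weights)
            return weights

        # usage: compare train_som(experimental_rows) against train_som(control_rows)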

  • Concurrent edge detection with Spiral Architecture on Linux

    Publication Year: 2003 , Page(s): 524 - 528

    Edge detection is an essential tool in image analysis. It is a process of identifying significant discontinuities in light intensity in order to form an outline of the object of interest. Spiral Architecture, as a method of image data representation, offers substantial computational power. Gaussian multi-scale theory was employed as the mathematical model for edge detection, and the parallel processing algorithm was implemented in the form of a master-slave model. Spiral Architecture enables an image to be uniformly partitioned and distributed to slave processors after spiral multiplication. Concurrent processing is facilitated by the inter-process communication and socket programming facilities of the Linux operating system.

  • Adaptive wavelet filter design for optimized image source encoding

    Publication Year: 2003 , Page(s): 478 - 482
    Cited by:  Papers (2)

    Despite intensive research on adaptive filter design in general, adaptive filter design in the discrete wavelet transform (DWT) domain under specific constraints is still an active research area. We have developed a method for designing a 2-channel perfect-reconstruction adaptive wavelet filter that is optimized under minimum-energy constraints in specific bands. The optimal 2-channel conjugate quadrature filter (CQF) bank is designed using sequential quadratic programming, a technique for general nonlinear, nonconvex functions, although finding a global minimum is not guaranteed. Such a filter can be used effectively with a wavelet-based image encoder for high-fidelity transmission of large image data sets at low bit rates.
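
    For context, the textbook conjugate quadrature filter relations behind such designs (generic; the paper's band-energy constraints are not reproduced here). Given a length-$N$ low-pass prototype $h[n]$, the high-pass filter is

        $g[n] = (-1)^n \, h[N-1-n], \qquad n = 0, \dots, N-1,$

    and perfect reconstruction requires the power-complementary condition

        $|H(e^{j\omega})|^2 + |H(e^{j(\omega+\pi)})|^2 = 2.$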

  • Dictionary-based fast transform for text compression

    Publication Year: 2003 , Page(s): 176 - 182
    Cited by:  Papers (2)

    In this paper we present StarNT, a dictionary-based fast lossless text transform algorithm. With a static generic dictionary, StarNT achieves a compression ratio superior to almost all other recent efforts based on BWT and PPM. The algorithm uses a ternary search tree to expedite the transform encoding. Experimental results show that the average compression time has improved by orders of magnitude compared with our previous algorithm LIPT, and that the additional time overhead introduced to the backend compressor is negligible. Based on StarNT, we propose StarZip, a domain-specific lossless text compression utility. Using domain-specific static dictionaries embedded in the system, StarZip achieves an average improvement in compression performance (in terms of BPC) of 13% over bzip2-9, 19% over gzip-9, and 10% over PPMD.
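
    A minimal ternary search tree of the kind used to speed up the dictionary lookup (a generic sketch; StarNT's dictionary contents and code assignment are not reproduced):

        class TSTNode:
            __slots__ = ("ch", "lo", "eq", "hi", "code")
            def __init__(self, ch):
                self.ch, self.lo, self.eq, self.hi, self.code = ch, None, None, None, None

        def tst_insert(root, word, code, i=0):
            """Insert a dictionary word and its transform code."""
            if root is None:
                root = TSTNode(word[i])
            if word[i] < root.ch:
                root.lo = tst_insert(root.lo, word, code, i)
            elif word[i] > root.ch:
                root.hi = tst_insert(root.hi, word, code, i)
            elif i + 1 < len(word):
                root.eq = tst_insert(root.eq, word, code, i + 1)
            else:
                root.code = code
            return root

        def tst_lookup(root, word, i=0):
            """Return the code for word, or None if it is not in the dictionary."""
            while root is not None:
                if word[i] < root.ch:
                    root = root.lo
                elif word[i] > root.ch:
                    root = root.hi
                elif i + 1 == len(word):
                    return root.code
                else:
                    root, i = root.eq, i + 1
            return None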

  • Pragmatic teaching of advanced optical networks: connecting physics, optical technology and networks

    Publication Year: 2003 , Page(s): 65 - 68

    Advancements in communications technology and services have triggered a global appetite for bandwidth. This appetite contributes to an increasing bandwidth demand that can only be met with a new optical technology known as dense wavelength division multiplexing (DWDM). The technology's appeal to graduate students is manifested by an enrollment increase in telecommunications. What makes DWDM possible is photonic technology based on principles of solid state physics, optics, and photonics. Although such courses are taught in a physics curriculum, they are currently not considered core courses in electrical and computer engineering. For the first time, physics, photonic technology and communications networks need to be aligned and sequenced into a comprehensive optical communications curriculum. In this paper, we describe the teaching philosophy of optical networking in the TCOM graduate program of the University of Oklahoma at Tulsa.

  • Data acquisition system interfacing with virtual cards

    Publication Year: 2003 , Page(s): 724 - 728

    Power transformer failures are very costly for electric companies, which must devote resources both to recovering from them and to performing periodic maintenance. To address this problem in four working 40 MVA transformers, the authors have implemented the measurement system of a failure prediction tool, which is the basis of a predictive maintenance infrastructure. The prediction models obtain their inputs from sensors whose values must first be conditioned, sampled and filtered, since the forecasting algorithms need clean data to work properly. Applying data warehouse (DW) techniques, the models have been provided with an abstraction of the sensors that the authors call a virtual card (VC). By means of these virtual devices, models have access to clean data, both fresh and historic, from the set of sensors they need. In addition, several characteristics of the data flow coming from the VC, such as the sample rate or the set of sensors itself, can be reconfigured dynamically. A replication scheme was implemented to allow the distribution of demanding processing tasks and the remote management of the prediction applications.

  • An agent-based curve-digitizing system for well-logging data management

    Publication Year: 2003 , Page(s): 656 - 660
    Cited by:  Papers (1)  |  Patents (2)

    Permanently preserving, and efficiently using, the data embodied in the curves of well-logging parameter graphs is becoming very important in the petroleum industry. In this paper, we present a system based on multi-agent technology that digitizes well-logging curves and stores the data in an Oracle database by means of an agent-based database broker. We employed ideas from Gaia and the Open Agent Architecture (OAA) to analyze and design the system. We applied the line adjacency graph (LAG) data structure to develop the image-compressing agent, which compresses image files with great efficiency. In accordance with the characteristics of the well-logging parameter graph, digitizing algorithms based on a centimeter grid of coordinates, which rectify the distortion of the parameter graph, are proposed for implementing the digitizing agent. We applied middleware agents to operate the databases through a uniform Database-based Agent Communication Language (DBACL). Experimental results show that all agents in the system work cooperatively.

  • A performance comparison of communication APIs on Solaris and Windows operating systems

    Publication Year: 2003 , Page(s): 336 - 340

    Communication application programming interfaces (APIs) constitute an important component of many network-based applications and play a central role in the end-to-end performance ultimately delivered by networked applications. Most network architectures exploit the underlying networking API in their designs. In this paper, we conduct an empirical performance evaluation on the PC platform of some of the most popular networking APIs: Winsock/BSD sockets, Java sockets, and RMI. To explore the impact of the underlying operating system and Java virtual machine (JVM) architecture, we conducted performance tests on two operating systems, Windows NT 4.0 and Solaris 8. We found that on both platforms, Winsock and BSD sockets yield about 1.8 times the throughput of Java sockets, and Java sockets in turn yield twice the throughput obtained with remote method invocation (RMI). Latency overheads were about 1.3 times higher with Java sockets than with Winsock or BSD sockets, and likewise about 1.3 times higher with RMI than with Java sockets, on both operating systems. We hope that our results will help application designers and developers better optimize end-to-end application performance.
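
    A minimal throughput microbenchmark over a local TCP socket (an assumed methodology for illustration, not the authors' harness; payload size and round count are arbitrary, and timing is send-side only):

        import socket, threading, time

        PAYLOAD, ROUNDS = b"x" * 65536, 2000

        def sink(server):
            conn, _ = server.accept()
            while conn.recv(65536):          # drain until the sender closes
                pass

        server = socket.create_server(("127.0.0.1", 0))
        threading.Thread(target=sink, args=(server,), daemon=True).start()
        client = socket.create_connection(server.getsockname())
        start = time.perf_counter()
        for _ in range(ROUNDS):
            client.sendall(PAYLOAD)
        client.close()
        elapsed = time.perf_counter() - start
        print(f"{ROUNDS * len(PAYLOAD) / elapsed / 1e6:.1f} MB/s")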

  • Integrating adaptive Web techniques with Web services

    Publication Year: 2003 , Page(s): 415 - 419

    Enterprise software infrastructure traditionally comprises a number of independent systems supporting customer needs and business processes, which complicates productivity, performance and maintenance. Hypermedia facilities based on widespread standards can instead provide richer functionality and features within a user-friendly, well-known framework. In this paper, we present the design, implementation and evaluation of the combined use of Web services and adaptive Web-based techniques. We describe a large-scale hypermedia framework that improves the efficiency and performance of conventional enterprise activities such as customer services (registration, activation, billing, support) and document management. The presented approach aims to cover these fundamental needs for a network of customers, collaborators and intranet employees.

  • RDF/XML-based automatic generation of adaptable hypermedia presentations

    Publication Year: 2003 , Page(s): 410 - 414
    Cited by:  Papers (1)

    Hera is a design methodology that supports the development of Web information systems. It is a model-driven method that distinguishes four steps: data retrieval, application model generation, application model instance generation, and presentation data generation. Data retrieval populates the conceptual model with data. In application model generation, the navigational aspects of the application are specified in the application model; the application model also needs to be adapted for different user/channel profiles. In the third step of the Hera method, the application model is populated with the retrieved data. The last step considers the physical aspects of the presentation: the retrieved data, wrapped in application logic, is translated to different implementation platforms (e.g. HTML, WML, SMIL). With the advantage of Web application interoperability in mind, we chose to implement an experimental prototype of the Hera method using RDF(S), the foundation of the semantic Web.

  • The mechanism of ASN.1 encoding and decoding implementation in network protocols

    Publication Year: 2003 , Page(s): 622 - 626

    Abstract Syntax Notation One (ASN.1) plays an important role in many network protocol stacks. Some institutes use a compiler-based approach to develop their ASN.1 modules. However, this approach has disadvantages, such as low efficiency and redundant code, and the module may become a performance bottleneck in high-speed network communication. This paper discusses the requirements of ASN.1 modules in network protocols, analyses the compiler-based approach, and then proposes a novel alternative. Instead of numerous encoding and decoding routines, it uses a tree-like structure called a "database definition" to store the necessary ASN.1 syntax information, together with uniform functions that perform the encoding and decoding procedures. With this method, a substantial reduction in code size was demonstrated (up to 90% of the code taken up by marshalling routines), and testing also showed improved efficiency.
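
    An illustrative sketch of the table-driven idea (not the paper's code): a nested "database definition" describes the ASN.1 structure, and a single uniform routine walks it to emit BER-style tag-length-value encodings:

        def encode(defn, value):
            """Uniform TLV encoder driven by a definition tree (short-form lengths only)."""
            tag, kind = defn["tag"], defn["kind"]
            if kind == "INTEGER":
                body = value.to_bytes((value.bit_length() + 8) // 8, "big", signed=True)
            elif kind == "OCTET STRING":
                body = value
            elif kind == "SEQUENCE":
                body = b"".join(encode(f, value[f["name"]]) for f in defn["fields"])
            else:
                raise ValueError(kind)
            return bytes([tag, len(body)]) + body

        msg_def = {"tag": 0x30, "kind": "SEQUENCE", "fields": [
            {"name": "id",   "tag": 0x02, "kind": "INTEGER"},
            {"name": "data", "tag": 0x04, "kind": "OCTET STRING"},
        ]}
        print(encode(msg_def, {"id": 5, "data": b"hi"}).hex())   # 300702010504026869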

  • Simplified Authoring for Instructors using Distance Education (SAIDE)

    Publication Year: 2003 , Page(s): 43 - 47

    This paper presents an alternative to current authoring methods for use in distance education. Several commercially available distance-education authoring tools are briefly discussed, and an alternative, SAIDE (Simplified Authoring for Instructors using Distance Education), is presented. The discussion focuses on how SAIDE's easy-to-use feature set serves authors and trainers in a non-traditional learning/training environment. The technical aspects discussed include details of the user interface and the ways key features are linked from the interface to the server. The design includes feature interaction; clickable textures, buttons, and boxes; and a user-interface paradigm shift from current authoring methods to a simpler, more intuitive structure. The thrust is a fully developed and interactive distance education experience via a tool that is simple enough for a novice software user yet powerful enough for a skilled practitioner of distance education.

  • Direct Huffman coding and decoding using the table of code-lengths

    Publication Year: 2003 , Page(s): 237 - 241
    Cited by:  Papers (3)  |  Patents (3)

    A new Huffman coding and decoding technique is presented. There is no need to construct a full-size Huffman table in this technique; instead, the symbols are encoded directly from the table of code lengths. For decoding purposes, a new Condensed Huffman Table (CHT) is also introduced. It is shown that with this technique both encoding and decoding operations become significantly faster, and memory consumption becomes much smaller, compared to normal Huffman coding/decoding.
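
    The idea of coding straight from code lengths is shared with canonical Huffman codes; a minimal sketch of that assignment (generic, and distinct from the paper's Condensed Huffman Table, whose details are not reproduced):

        def canonical_codes(lengths):
            """Map symbol -> (code, length), given each symbol's code length."""
            code, prev_len, codes = 0, 0, {}
            for sym, length in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
                code <<= length - prev_len       # first code at a longer length
                codes[sym] = (code, length)
                code, prev_len = code + 1, length
            return codes

        print(canonical_codes({"a": 1, "b": 2, "c": 3, "d": 3}))
        # {'a': (0, 1), 'b': (2, 2), 'c': (6, 3), 'd': (7, 3)}  i.e. 0, 10, 110, 111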

  • Probabilistic analysis of connectivity on mesh networks

    Publication Year: 2003 , Page(s): 362 - 366
    Cited by:  Papers (1)

    Mesh networks are among the most important network topologies in massively parallel multiprocessor systems. This paper proposes a new method for calculating the connectivity probability of mesh networks, using rigorous mathematical methods to derive the relationship between the node failure probability and the network connectivity probability. The results show that practical mesh networks remain connected with very high probability. For example, it is formally proved that when the node failure probability is bounded by 0.12%, mesh networks with up to forty thousand nodes remain connected with probability greater than 99%. The method is also useful for deriving lower bounds on the connectivity probability of other computer networks.
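
    The headline claim is easy to sanity-check by simulation; a Monte Carlo sketch (an assumed cross-check, not the paper's analytical method) estimating the probability that the surviving nodes of an n x n mesh remain connected:

        import random
        from collections import deque

        def connected_after_failures(n, p, rng):
            alive = [[rng.random() >= p for _ in range(n)] for _ in range(n)]
            cells = [(i, j) for i in range(n) for j in range(n) if alive[i][j]]
            if not cells:
                return True
            seen, queue = {cells[0]}, deque([cells[0]])
            while queue:                         # BFS over surviving nodes
                i, j = queue.popleft()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    a, b = i + di, j + dj
                    if 0 <= a < n and 0 <= b < n and alive[a][b] and (a, b) not in seen:
                        seen.add((a, b))
                        queue.append((a, b))
            return len(seen) == len(cells)

        rng = random.Random(0)
        trials = 100                             # takes a few seconds in pure Python
        hits = sum(connected_after_failures(200, 0.0012, rng) for _ in range(trials))
        print(f"estimated connectivity probability: {hits / trials:.2f}")   # ~1.00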

  • Fast color image processing using quantized color instruction set

    Publication Year: 2003 , Page(s): 529 - 535

    The paper describes the Quantized Color Pack eXtension (QCPX) instruction set, which can accelerate color-image processing applications. QCPX employs heterogeneous-subword-parallel instructions that utilize the microprocessor's 16-bit-wide datapaths to process a packed, quantized 16-bit color data type in YCbCr (Y: luminance, Cb and Cr: chrominance) format in parallel. Unlike typical multimedia instruction set extensions (e.g. MMX, SSE, AltiVec), QCPX obtains performance and code-density improvements through implicit support for color pixel processing rather than depending solely on generic subword parallelism. Five time-critical color image processing algorithms are coded with and without QCPX to measure its full impact on programming techniques. Simulation results indicate that applications using QCPX achieve significant speedups in execution time over non-QCPX versions. In addition, QCPX yields higher system utilization (in excess of 94% in all cases) due to a significant reduction in conditional instructions.
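
    A software analogue of the packed 16-bit pixel idea (the 6:5:5 Y:Cb:Cr bit layout below is an assumption for illustration; QCPX's actual format and instructions are hardware features not reproduced here):

        def pack(y, cb, cr):                     # y in 0..63, cb/cr in 0..31
            return (y << 10) | (cb << 5) | cr

        def unpack(p):
            return (p >> 10) & 0x3F, (p >> 5) & 0x1F, p & 0x1F

        def saturating_add(p, q):
            """Channel-wise add of two packed pixels with saturation."""
            y  = min(((p >> 10) & 0x3F) + ((q >> 10) & 0x3F), 0x3F)
            cb = min(((p >> 5) & 0x1F) + ((q >> 5) & 0x1F), 0x1F)
            cr = min((p & 0x1F) + (q & 0x1F), 0x1F)
            return (y << 10) | (cb << 5) | cr

        a, b = pack(30, 10, 10), pack(40, 20, 20)
        print(unpack(saturating_add(a, b)))      # (63, 30, 30): Y saturates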

  • Evaluation and comparison performance of various MPI implementations on an OSCAR Linux cluster

    Publication Year: 2003 , Page(s): 310 - 314
    Cited by:  Papers (1)

    PC clusters are becoming increasingly popular and important in the parallel computing community as a cost-effective alternative to expensive, specially designed parallel computers. The most popular communication interface used in PC clusters, as well as in other cluster-type parallel computers, is MPI (Message Passing Interface). Several open-source MPI implementations are freely available, most of them based on the popular MPICH. In general, these implementations have similar basic functions and performance, but each package targets a different environment with different communication libraries. In this paper we present our installation and performance measurements of several available open-source MPI packages on a Linux cluster built with the OSCAR software package. We then compare their performance and discuss the factors that affect the performance of these MPI implementations. Our goal is to build a PC cluster for our research and development in the areas of SAN communication and parallel digital simulation.
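
    A minimal MPI ping-pong sketch with mpi4py (an assumed Python stand-in for the usual C-level benchmark; run with e.g. mpirun -np 2 python pingpong.py):

        import time
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        buf = np.zeros(1 << 16, dtype=np.uint8)      # 64 KB message
        reps = 1000

        comm.Barrier()
        start = time.perf_counter()
        for _ in range(reps):
            if rank == 0:
                comm.Send(buf, dest=1)
                comm.Recv(buf, source=1)
            elif rank == 1:
                comm.Recv(buf, source=0)
                comm.Send(buf, dest=0)
        comm.Barrier()
        if rank == 0:
            rtt = (time.perf_counter() - start) / reps
            print(f"round trip {rtt * 1e6:.1f} us, {2 * buf.nbytes / rtt / 1e6:.1f} MB/s")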

  • Image analysis and interpretation for semantics categorization in baseball video

    Publication Year: 2003 , Page(s): 379 - 383

    The semantic information in videos is useful for content-based video retrieval and summarization. Traditional image/video understanding is formulated in terms of low-level features describing the structure and intensity of the input image/video; generating high-level knowledge such as common sense and human perceptual knowledge remains one of the most difficult problems. This paper attempts to bridge this gap through the integration of image analysis algorithms and a multi-level semantic network (SN) to interpret the semantic meaning of baseball video.

  • Stochastic image compression using fractals

    Publication Year: 2003 , Page(s): 574 - 579

    Fractal objects like the Sierpinski triangle and the fern have very high visual complexity and low storage-information content. Iterated function systems (IFS) (Barnsley; Jacquin, 1992) are used for generating computer graphics images of such objects and for compressing them. The main problem in fractal encoding using IFS is the large amount of time taken to compress the fractal object. Our endeavor in the present paper is to use a stochastic algorithm to improve upon both the compression time and the compression ratio obtained by Jacquin, while maintaining image quality. Our results show that, compared with the non-stochastic algorithm, we are able to reduce the time taken for compression by 55%-80% and the compressed size by 60%-80%.
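
    The classic stochastic way to render an IFS is the "chaos game"; a minimal sketch for the Sierpinski triangle (illustrative of stochastic IFS methods generally, not the paper's compression algorithm):

        import random

        def chaos_game(points=50000, seed=0):
            rng = random.Random(seed)
            vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
            x, y = rng.random(), rng.random()
            out = []
            for _ in range(points):
                vx, vy = rng.choice(vertices)        # pick one of three maps
                x, y = (x + vx) / 2, (y + vy) / 2    # contraction toward vertex, ratio 1/2
                out.append((x, y))
            return out

        pts = chaos_game()          # plot pts to see the Sierpinski triangle emerge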

  • SDG - a system for synthetic data generation

    Publication Year: 2003 , Page(s): 69 - 75
    Cited by:  Papers (2)  |  Patents (1)

    The synthetic generation of data is appropriate in many cases, and the generated data can be of different kinds: program texts, tables, XML documents, etc. The application area of such data comprises not only scientific research and analysis of real program system features but also teaching. The generation of texts of different problems belonging to the same type represents such a case: the problems can be used for self-preparation in after-class training, for solving exercises in class, or for assessing students' knowledge in a given discipline. This paper discusses the philosophy and development of a system that generates sets of synthetic program texts of the same type. The examples considered are for teaching purposes.
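
    A toy illustration of same-type problem generation (the template below is invented, not the SDG system's):

        import random

        TEMPLATE = ("Write a function that returns the {stat} of a list of {n} "
                    "integers drawn from [{lo}, {hi}].")

        def generate(seed):
            rng = random.Random(seed)
            return TEMPLATE.format(stat=rng.choice(["sum", "mean", "maximum"]),
                                   n=rng.randint(5, 20), lo=0, hi=rng.choice([9, 99]))

        print(generate(1))
        print(generate(2))   # same problem type, different parameters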

  • A novel feature-based robust audio watermarking for copyright protection

    Publication Year: 2003 , Page(s): 554 - 558
    Cited by:  Papers (3)

    A novel public audio watermarking scheme is proposed, based on statistical feature manipulation in the wavelet domain combined with an error correction coding technique. The algorithm is shown to be rather robust against a variety of common signal processing manipulations and synchronization attacks, including the very challenging random cropping and time-scale modifications. The watermarked audio has very high perceptual quality and is indistinguishable from the original signal. A blind watermark detection technique, which does not require the original signal, is developed to identify the embedded watermark under various types of attacks. Security is guaranteed by applying a chaotic sequence during embedding and detection. Several suggestions on how to use error correction coding techniques to improve robustness are given at the end.
