
Information Assurance and Security Workshop, 2007. IAW '07. IEEE SMC

Date: 20-22 June 2007


Displaying Results 1 - 25 of 56
  • Contributor listings

    Publication Year: 2007 , Page(s): 1 - 2
    PDF (58 KB)
    Freely Available from IEEE
  • Contributor listings

    Publication Year: 2007 , Page(s): 1
    PDF (23 KB)
    Freely Available from IEEE
  • Sessions at a glance

    Publication Year: 2007 , Page(s): 1
    PDF (23 KB)
    Freely Available from IEEE
  • Table of contents

    Publication Year: 2007 , Page(s): 1 - 4
    PDF (3068 KB)
    Freely Available from IEEE
  • Author index

    Publication Year: 2007 , Page(s): 1 - 5
    PDF (2761 KB)
    Freely Available from IEEE
  • [Breaker page]

    Publication Year: 2007 , Page(s): 1
    PDF (557 KB)
    Freely Available from IEEE
  • A Global Look at Authentication

    Publication Year: 2007 , Page(s): 1 - 8
    PDF (11335 KB) | HTML

    In today's world of increased connectivity, authentication issues are becoming increasingly important. Many user accounts, from banking to water bill accounts, are available online. How do we ensure that identity theft occurrences are reduced and people can enjoy the benefits of managing multiple accounts online? Numerous schemes and systems exist today, but many are difficult to implement, especially for end users. When analyzing a system from a technical standpoint, one might be able to set the security policies and password encryption strength to the appropriate level to protect the information it contains. However, when multiple users utilize this system, and they in turn use other systems with identical passwords, the initial system can become less secure. In addition, phishing attacks are becoming more popular; users may enter a reused password on a hostile site, which could result in compromising multiple systems. This paper examines various authentication methods and mechanisms available today and determines which is appropriate for various uses. In general, users and administrators are responsible for security, and they should treat their systems with regard to the sensitivity of what they want to protect. Users should always consider the importance of each system they use and choose an authentication method or password scheme to match the system they intend to operate. In addition, administrators should always consider the consequences of implementing different security measures, including the usability of their system.

    Full text access may be available.
  • Recovering from Database Recovery: Case Studies and the Lessons They Teach

    Publication Year: 2007 , Page(s): 9 - 13
    PDF (7076 KB) | HTML

    The attacks of September 11, 2001 and Hurricane Katrina forced database professionals to truly reconsider what it means to recover a database. Recovering the data stored on the disks is just one part of recovery. Reassessing disaster recovery plans and preparing to recover again is another. Perhaps the two most important changes include 1) addressing the needs of people in the recovery plan and 2) viewing database recovery from an enterprise-wide perspective, rather than from a technology slant. This paper discusses two major events that force us to think about recovering from database recovery.

    Full text access may be available.
  • Do Word Clues Suffice in Detecting Spam and Phishing?

    Publication Year: 2007 , Page(s): 14 - 21
    Cited by:  Papers (2)
    PDF (8756 KB) | HTML

    Some commercial antispam and anti-phishing products prohibit email from "blacklisted" sites that they claim send spam and phishing email, while allowing email claiming to be from "whitelisted" sites they claim are known not to send it. This approach tends to discriminate unfairly against smaller and less-known sites, and would seem to be anti-competitive. An open question is whether other clues to spam and phishing would suffice to identify it. We report on experiments we have conducted to compare different clues for automated detection tools. Results show that word clues were by far the best clues for spam and phishing, although slightly better performance could be obtained by supplementing word clues with a few others, such as the time of day the email was sent and inconsistency in headers. We also compared different approaches to combining clues to spam, such as Bayesian reasoning, case-based reasoning, and neural networks; Bayesian reasoning performed the best. Our conclusion is that Bayesian reasoning on word clues is sufficient for antispam software and that blacklists and whitelists are unnecessary.

    Full text access may be available.
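    The winning approach the abstract describes, Bayesian reasoning over word clues, can be sketched as a minimal naive Bayes log-likelihood scorer. This is a hypothetical illustration, not the authors' implementation; the smoothing choice and all training examples are invented.

    ```python
    import math
    from collections import Counter

    def train(spam_docs, ham_docs):
        """Count word occurrences in known spam and known ham."""
        spam = Counter(w for d in spam_docs for w in d.lower().split())
        ham = Counter(w for d in ham_docs for w in d.lower().split())
        vocab = len(set(spam) | set(ham))
        return spam, ham, sum(spam.values()), sum(ham.values()), vocab

    def spam_score(message, model):
        """Sum of log-likelihood ratios over word clues; > 0 suggests spam."""
        spam, ham, n_spam, n_ham, vocab = model
        llr = 0.0
        for w in message.lower().split():
            p_spam = (spam[w] + 1) / (n_spam + vocab)  # Laplace smoothing
            p_ham = (ham[w] + 1) / (n_ham + vocab)
            llr += math.log(p_spam / p_ham)
        return llr
    ```

    A real filter would compare the score against a threshold tuned on held-out mail; the paper's point is that scores built purely from word clues already separate spam from ham well.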
  • Experiences and lessons learned in the design and implementation of an Information Assurance curriculum

    Publication Year: 2007 , Page(s): 22 - 29
    PDF (3945 KB) | HTML

    In 2004, Dakota State University proposed a model for information assurance and computer security program development. That model provided a framework for developing undergraduate and graduate programs at DSU. This paper provides insight into experiences and lessons learned in further implementing that model. The paper details modifications to both the undergraduate and graduate information assurance programs as a result of specific issues and challenges. Further, the paper highlights the introduction of a new terminal degree that includes an information assurance specialization. As a national center of excellence in information assurance education, we are confident that this paper will be helpful to universities around the world in either developing new or improving existing IA programs.

    Full text access may be available.
  • Mapping information security curricula to professional accreditation standards

    Publication Year: 2007 , Page(s): 30 - 35
    Cited by:  Papers (2)
    PDF (595 KB) | HTML

    The alignment of information security education curricula with national and international accreditation standards is a fast growing area of activity. This paper discusses several prominent professional accreditation standards in the area of information assurance in the USA and information security worldwide. Body of knowledge and student learning outcome recommendations by professional computing organizations for the discipline of information security form the basis of much of the education curricula design at present. Key topic areas in standards used for accreditation and certification in information security are also included in curricula design. This paper presents the findings from an exercise to identify areas of specific interest within information security reflected in a group of US national and international standards and discusses their relevance to education curricula design and evaluation.

    Full text access may be available.
  • Evaluating an IA virtual network education program

    Publication Year: 2007 , Page(s): 36 - 42
    Cited by:  Papers (1)
    PDF (3076 KB) | HTML

    This paper presents a case study on the application of an evaluation framework for education programs (NIMSAD 1) to an information assurance education program utilizing a virtual network environment (VIAN). The discussion considers elements relating to the education process, the education practice, and the educators as project teams. Consideration of technical, educational, and human elements is also included.

    Full text access may be available.
  • Protocol of Secure Mutual Authentication

    Publication Year: 2007 , Page(s): 43 - 48
    PDF (7510 KB) | HTML

    The paper discusses the peculiarities of modern information systems, problems with traditional authentication protocols, and introduces a new authentication protocol that allows transparent secure authentication in dynamic distributed information systems. Approaches to reasoning about authentication protocols are described and the proposed protocol is analyzed according to them.

    Full text access may be available.
  • Building Security into an IEEE FIPA Compliant Multiagent System

    Publication Year: 2007 , Page(s): 49 - 55
    Cited by:  Papers (1)
    PDF (8166 KB) | HTML

    Software security engineering is generally an afterthought, and software engineering approaches often do not provide convenient methods for securing the software without considerable reengineering. In this paper we analyze an IEEE FIPA compliant multi-agent system. We examine the security vulnerabilities of such a system and describe how the features of agent-oriented computing make it more amenable to incorporating security into a multi-agent system, even when security engineering is an afterthought. We provide examples of securing a multi-agent system in the context of managing a global telecommunications network.

    Full text access may be available.
  • A Framework for Redacting Digital Information from Electronic Devices

    Publication Year: 2007 , Page(s): 56 - 60
    Cited by:  Patents (1)
    PDF (6572 KB) | HTML

    A reliable method for the removal of selected information from digital devices remains an open problem. A solution is particularly necessary for the legal profession, where it is required to produce information to opposing counsel during the discovery portion of court proceedings. The method outlined in this paper provides an efficient and effective system for redacting digital information beyond recovery by conventional forensic techniques. This paper also describes the major obstacles to achieving practical and comprehensive redaction of digital information from electronic devices. Of particular issue is the lack of a rational process for systematically handling encoded, encrypted, or otherwise complex data objects. Applications for this method extend well beyond the courtroom: it can be used in government and business to remove classified and proprietary information from documents and records.

    Full text access may be available.
  • The Observability Calibration Test Development Framework

    Publication Year: 2007 , Page(s): 61 - 66
    PDF (7592 KB) | HTML

    Formal standards, precedents, and best practices for verifying and validating the behavior of low layer network devices used for digital evidence-collection on networks are badly needed - initially so that these can be employed directly by device owners and data users to document the behaviors of these devices for courtroom presentation, and ultimately so that calibration testing and calibration regimes are established and standardized as common practice for both vendors and their customers [Endicott-Popovsky, B.E., Chee, B. and Frincke, D. "Role of calibration as part of establishing foundation for expert testimony," in Proceedings 3rd Annual IFIP WG 11.9 Conference, January 29-31, Orlando, FL.]. The ultimate intent is to achieve a state of confidence in device calibration that allows the network data gathered by them to be relied upon by all parties in a court of law. This paper describes a methodology for calibrating forensic-ready low layer network devices based on the Flaw Hypothesis Methodology [Weissman, C. (1973). "System Security Analysis: Certification, methodology and results." Tech Report No. SP-3728, System Development Corporation; Weissman, C. (1995). "Penetration testing." In M. Abrams, S. Jajodia, and H. Podell, (Eds.), Information Security: An Integrated Collection of Essays, pp. 269-296. Los Alamitos, CA: IEEE Computer Society Press.].

    Full text access may be available.
  • Volleystore: A Parasitic Storage Framework

    Publication Year: 2007 , Page(s): 67 - 75
    PDF (9344 KB) | HTML

    We present Volleystore, a filesystem that stores data on network equipment and servers without any authorization, yet without compromising the systems that are used. This is achieved by exploiting the echo functionality present in most standard Internet protocols. Various issues concerning the design of a parasitic storage system are addressed, and a practical system based on Internet Control Message Protocol (ICMP) echo messages is demonstrated. We present an analysis of storage capacity and latency limits for various configurations of the system. We also describe a proof-of-concept implementation of the system and show that one can indeed store data using Volleystore for reasonable lengths of time. Finally, we suggest defenses against parasitic storage, both abstractly in economic terms and concretely in technical terms.

    Full text access may be available.
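    The core trick the abstract describes, keeping data "stored" only in flight by immediately re-sending whatever an echo service returns, can be illustrated with a toy in-memory simulation. The `EchoHost` class and the chunking scheme here are invented stand-ins for real ICMP echo responders; the actual system must also contend with packet loss, latency, and rate limits.

    ```python
    import collections

    class EchoHost:
        """Stand-in for a remote echo responder: it returns its input."""
        def receive(self, payload):
            return payload

    def juggle(data, hosts, rounds, chunk=4):
        """Keep `data` alive by bouncing fixed-size chunks off echo hosts."""
        chunks = collections.deque(
            data[i:i + chunk] for i in range(0, len(data), chunk))
        for r in range(rounds):
            # One full pass: every chunk is sent out and its echo re-queued,
            # so the data never rests on the sender for long.
            for _ in range(len(chunks)):
                chunks.append(hosts[r % len(hosts)].receive(chunks.popleft()))
        return b"".join(chunks)  # reassemble on retrieval
    ```

    Because every chunk is popped from the front and its echo appended to the back, chunk order is preserved and the original data can be reassembled after any number of rounds, which is the sense in which the echo path acts as storage.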
  • A Family of Efficient Key Predistribution Schemes for Pairwise Authentication

    Publication Year: 2007 , Page(s): 76 - 83
    PDF (9404 KB) | HTML

    The growing need for ad hoc establishment of security associations (SA) in networks that may include resource constrained nodes has sparked renewed interest in key predistribution schemes (KPS). KPSs which employ only inexpensive symmetric cryptographic primitives also support ID-based distribution of secrets to eliminate the need for dissemination of certificates or other public values. KPSs are, however, susceptible to collusions. The complexity associated with deployment and use of any KPS that can restrict collusions of a very large number of nodes is influenced by several factors, such as bandwidth overheads, computational complexity, storage requirements, and overheads for distributing secrets. We introduce a class of simple scalable KPSs with several compelling advantages over other KPSs in the literature.

    Full text access may be available.
  • Efficient Distribution of Trust Authority Functions in Tactical Networks

    Publication Year: 2007 , Page(s): 84 - 91
    Cited by:  Papers (2)
    PDF (10965 KB) | HTML

    In this paper we describe an algorithm for the distribution of trust authority functions such as key generation and distribution in tactical mobile ad hoc networks. Such networks cannot rely on existing infrastructures and must operate under severe resource constraints. Moreover, network partitioning and node failures, including Byzantine failures, must be compensated for in tactical networks. We propose the combination of metrics on both network state and beliefs or trust in other nodes to form a composite metric for use in a clustering algorithm. The effectiveness and other characteristics of this improved clustering algorithm are then evaluated and analyzed in a simulation environment, demonstrating a significant improvement over the baseline clustering algorithm.

    Full text access may be available.
  • Detection of Virtual Environments and Low Interaction Honeypots

    Publication Year: 2007 , Page(s): 92 - 98
    Cited by:  Papers (3)
    PDF (7127 KB) | HTML

    This paper focuses on the detection of virtual environments and low interaction honeypots by using a feature set that is built using traditional system and network level fingerprinting mechanisms. Earlier work in the area has been mostly based on system level detection. The results aim at bringing out the limitations in the current honeypot technology. This paper also describes the results concerning the robustness and generalization capabilities of kernel methods in detecting honeypots using system and network fingerprinting data. We use traditional support vector machines (SVM), biased support vector machines (BSVM), and leave-one-out model selection for support vector machines (looms) for model selection. We also evaluate the impact of kernel type and parameter values on the accuracy of a support vector machine (SVM) performing honeypot classification. Through a variety of comparative experiments, it is found that SVM performs the best for data sent on the same network; BSVM performs the best for data sent from a remote network.

    Full text access may be available.
  • Improving Honeynet Data Analysis

    Publication Year: 2007 , Page(s): 99 - 106
    Cited by:  Papers (1)
    PDF (8942 KB) | HTML

    The honeywall's hflow and walleye interface, first introduced in [1], vastly improved honeynet data analysis by integrating different data sources and thus reducing the time required for analyzing honeynet data. However, there are some open architectural questions. This paper answers some of these questions by introducing a packet processing language that allows a modular architecture. This architecture not only solves the immediate problems but is also applicable to a wide range of problems. We present data regarding the problems of the old architecture and present our solution. We also present some of the performance envelopes of both architectures.

    Full text access may be available.
  • Deception in Honeynets: A Game-Theoretic Analysis

    Publication Year: 2007 , Page(s): 107 - 113
    Cited by:  Papers (2)
    PDF (8767 KB) | HTML

    Recently, honeynets have become one of the main tools for understanding the characteristics of malicious attacks and the behavior of the attackers. However, the attackers may identify the honeypots and avoid attacking them. Thus, the honeynet administrators must be able to deceive the attackers and induce them to attack the honeypots. In this paper we propose a game theoretic framework for modeling deception in honeynets. The framework is based on extensive games of imperfect information. We study the equilibrium solutions of these games and show how they are used to determine the strategies of the attacker and the honeynet system.

    Full text access may be available.
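    The flavor of this equilibrium analysis can be conveyed with a much simpler toy than the paper's model (which uses extensive-form games of imperfect information): a 2x2 normal-form deception game solved by best-response enumeration. All payoff values and action labels below are invented for illustration.

    ```python
    def pure_nash(A, D):
        """Pure-strategy Nash equilibria of a 2x2 game.

        A[i][j] / D[i][j]: attacker / defender payoff when the attacker
        plays row i and the defender plays column j.
        """
        eqs = []
        for i in range(2):
            for j in range(2):
                attacker_best = A[i][j] >= max(A[k][j] for k in range(2))
                defender_best = D[i][j] >= max(D[i][k] for k in range(2))
                if attacker_best and defender_best:
                    eqs.append((i, j))
        return eqs

    # Rows: attacker {0: attack, 1: withdraw}.
    # Columns: defender {0: disguise honeypot, 1: reveal it}.
    A = [[-2, 1], [0, 0]]   # attacking a disguised honeypot hurts the attacker
    D = [[3, -1], [0, 0]]   # deception pays off only if the attacker engages
    ```

    With these illustrative payoffs the unique pure equilibrium is (withdraw, disguise): the mere credibility of deception deters the attack, which is the kind of strategic effect the paper formalizes.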
  • TimeKeeper: A Metadata Archiving Method for Honeypot Forensics

    Publication Year: 2007 , Page(s): 114 - 118
    Cited by:  Papers (2)
    PDF (6721 KB) | HTML

    Internet attacks are becoming more advanced as the economy for cybercrime grows and the tools for evading detection become ubiquitous. To counter this threat, new detection and forensics tools are needed to capture these new techniques. In this paper, we propose a method to extract and analyze a richer set of forensic information from the file system journal of honeypots in spite of anti-forensic tool use. We show initial results from TimeKeeper, our prototype that monitors file system journal activity, and argue that by detecting these events we are able to capture previously unavailable forensic information. This forensic information can then be used for system recovery, research on attack techniques, insight into attacker motives, and criminal investigations.

    Full text access may be available.
  • Stego Scrubbing: A New Direction for Image Steganography

    Publication Year: 2007 , Page(s): 119 - 126
    Cited by:  Papers (1)
    PDF (10256 KB) | HTML

    We propose a new method for dealing with steganography. Instead of simply attempting to find the steganography, we propose an architecture that will remove it; we call this stego scrubbing. This paper describes our philosophy and lays the groundwork for the actual development of a stego scrubber, which can be inserted in a manner similar to a guard or firewall.

    Full text access may be available.
  • Scalable, Cluster-based Anti-replay Protection for Wireless Sensor Networks

    Publication Year: 2007 , Page(s): 127 - 134
    Cited by:  Papers (1)
    PDF (11465 KB) | HTML

    Large-scale wireless sensor network (WSN) deployments show great promise for military, homeland security, and many other applications. This promise, however, is offset by important security concerns. The resource constraints that typify wireless sensor devices make traditional security solutions impractical. One threat to secure sensor networks is the replay attack, in which packets are captured and replayed into the network. This type of attack can be perpetrated to confuse observers or to mount a denial-of-service or denial-of-sleep attack. Traditional techniques for anti-replay protection are too resource intensive for large-scale WSN deployments. While techniques for reducing the data transmission overhead of WSN-specific anti-replay mechanisms have been explored, the important problem of minimizing per-node replay table storage requirements has not been addressed. This paper introduces Clustered Anti-Replay Protection (CARP), which leverages sensor network clustering to place a limit on the amount of memory required to store anti-replay information. We show that clustering keeps the memory required for anti-replay tables manageable, reducing the size from 30% of a Mica2's memory to 4.4% for a 200-node network. While the advantages of this technique are clear, the difficulty lies in securely updating network-wide anti-replay tables when the network reclusters, an event that must happen routinely to distribute energy consumption across the nodes in the network. Our mechanism distributes the necessary anti-replay information in a secure, low-overhead, and completely distributed manner. We further show the energy-consumption overhead of adding anti-replay counters to network traffic across several WSN medium access control (MAC) protocols and two representative WSN platforms. On the Mica2 platform, overheads range from a 0% to 1.32% decrease in network lifetime, depending on the MAC protocol. On the Tmote Sky, overheads range from 0% to 4.64%. Providing anti-replay support in a secure, scalable, and distributed way is necessary to the overall security of future WSN deployments if they are to meet current expectations.

    Full text access may be available.
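    The replay-table idea the abstract refers to can be sketched as a per-sender monotonic counter check; the memory cost is one entry per tracked sender, which is why limiting the tracked set to cluster heads (as CARP does) bounds the table size. This is a hypothetical minimal sketch, not the paper's protocol, and it omits the secure table-update problem the paper addresses.

    ```python
    class AntiReplayTable:
        """Per-node anti-replay state: highest counter accepted per sender.

        Memory grows with the number of tracked senders, so tracking only
        cluster heads instead of every node keeps the table small.
        """
        def __init__(self):
            self.highest = {}  # sender id -> highest counter accepted

        def accept(self, sender, counter):
            # Reject any packet whose counter does not strictly increase:
            # a replayed or stale packet repeats an old counter value.
            if counter <= self.highest.get(sender, -1):
                return False
            self.highest[sender] = counter
            return True
    ```

    A real WSN implementation would carry the counter in each packet's authenticated header so a forged counter cannot be substituted, and would handle counter rollover; both are outside this sketch.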