Future Computer and Communication (ICFCC), 2010 2nd International Conference on

Date 21-24 May 2010


Displaying Results 1 - 25 of 190
  • Discovering collaborative users based on query context for Web information seeking

    Publication Year: 2010 , Page(s): V3-1 - V3-5
    Cited by:  Papers (3)

    Facing a large amount of diverse Web information, collaborative filtering is an effective way to improve the efficiency of a user's information seeking: it provides useful information for a specific user by drawing on information that has interested collaborative users with similar interests. How to discover collaborative users is a key issue in collaborative filtering. This paper presents a novel method that discovers collaborative users through computation based on query context and user context. First, query context is proposed as the semantic background of a user's information-seeking behavior. It is built not only on concepts but also on the relations between concepts, which provides a more accurate description of the user's search background. Second, the collaboration degree comprises not only the similarity between user contexts but also the relatedness of users' search backgrounds, which improves the accuracy of finding collaborative users. Third, information entropy is introduced to compute the relatedness between relations from query contexts, which provides an accurate value for the relatedness between users' search backgrounds. Experimental results demonstrate the validity of the proposed method, which shows promise for applications in personalized Web information seeking.

  • Majority decision based weighted symbol-flipping decoding for nonbinary LDPC codes

    Publication Year: 2010 , Page(s): V3-6 - V3-10

    A new low-complexity symbol-flipping algorithm based on majority decision for decoding nonbinary low-density parity-check (LDPC) codes is proposed. The decoding procedure iteratively updates the hard-decision received symbol vector in search of a valid codeword in the symbol vector space. Only one symbol is changed in each iteration; the symbol-flipping function combines the number of failed checks with the reliability of the received bits and calculated symbols. The flipped symbol position is determined by majority decision, and the flipped value is calculated from the channel output. An optional mechanism to avoid infinite loops in searches over high-order Galois fields is also proposed. Our studies show that the algorithm achieves an appealing tradeoff between performance and complexity over relatively low-order Galois fields for short to medium code lengths.

  • Improved winning probability model in Go based on strong group quantization and multi-level species compete-die out algorithms

    Publication Year: 2010 , Page(s): V3-11 - V3-14

    Winning probability is important for professional players and Go programs to calculate during a game of Go. However, it is difficult to determine the value of strong groups when calculating winning probability. This paper presents an approach to quantifying the influence of strong groups, on which the winning probability model Winnable is defined; the model parameters are further optimized by a multi-level species compete-die out algorithm. Test results show that, compared with the previous model, Winnable improves accuracy by 27% and operation speed by 18%. The model has practical utility in research on the middle game of computer Go.

  • Blind Fractionally Spaced Equalization and timing synchronization in wireless fading channels

    Publication Year: 2010 , Page(s): V3-15 - V3-19
    Cited by:  Papers (1)

    The development of low-complexity blind techniques for equalization and timing synchronization is of enormous importance in the design of wireless communication systems. In this paper, we propose a practical solution for blind equalization and timing recovery in fast-fading, time- and frequency-selective wireless communication channels. We develop a general framework for Constant Modulus Algorithm (CMA) based joint Fractionally Spaced Equalization (FSE) and timing recovery. We use differential modulation to deal with arbitrary carrier offset, and propose a data reuse strategy to improve short-burst wireless communication in CMA-based equalization systems. Our results show that FSE outperforms T-Spaced Equalization (TSE), with approximately 2 times faster Mean Square Error (MSE) convergence and approximately 2 dB gain in Bit Error Rate (BER) performance in wireless fading channels. In addition, we demonstrate that the BER performance of the proposed FSE receiver meets the theoretical bounds with only a few dB of loss in Stanford University Interim (SUI) channels, which are relevant to the IEEE 802.16.3c standard for Wireless Metropolitan Area Networks.
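    For readers unfamiliar with CMA, the tap update at the core of such blind equalizers can be sketched as follows. This is a generic textbook CMA with T/2-spaced taps, not the authors' joint timing-recovery design; the tap count, step size, and center-spike initialization are illustrative assumptions.

```python
import numpy as np

def cma_fse(x, num_taps=11, mu=1e-3, r2=1.0):
    """One pass of CMA adaptation over a T/2-spaced received sequence x.

    Advances two samples per output symbol (fractionally spaced) and
    returns the final tap vector and the equalizer outputs.
    """
    w = np.zeros(num_taps, dtype=complex)
    w[num_taps // 2] = 1.0  # center-spike initialization
    out = []
    n = len(x) - num_taps
    for k in range(0, n, 2):
        seg = x[k:k + num_taps]
        y = np.dot(w, seg)                # equalizer output
        e = y * (np.abs(y) ** 2 - r2)     # constant-modulus error
        w -= mu * e * np.conj(seg)        # stochastic-gradient tap update
        out.append(y)
    return w, np.array(out)
```

    Because the cost penalizes only deviations of |y| from the target modulus, the update needs no training symbols, which is what makes the scheme blind.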

  • Notice of Retraction
    Pragmatic analysis of progress reports: Through a case study of communications between local and head offices of a UK university

    Publication Year: 2010 , Page(s): V3-20 - V3-25

    Notice of Retraction

    After careful and considered review of the content of this paper by a duly constituted expert committee, this paper has been found to be in violation of IEEE's Publication Principles.

    We hereby retract the content of this paper. Reasonable effort should be made to remove all past references to this paper.

    The presenting author of this paper has the option to appeal this decision by contacting TPII@ieee.org.

    Progress reports are usually an important means of communication between the China office of a UK university and its head office, but they sometimes do not efficiently serve their purposes. This paper applied ethnographic research and pragmatic analysis to examine the deficiencies of progress reports collected as primary data in the case study. The causes of the gaps between the expected and actual perlocutionary effects were discussed from the angle of cultural differences in pragmatic information, under the assumption that all four validity claims raised in the speech act were true and truthful. In addition, the elements of pragmatic information that need to be considered for successfully fulfilling a certain type of illocutionary act were abstracted from the findings using grounded-theory research. The analysis of each issue was organized using a self-defined template. Although it provides only a simplified view of analyzing communication issues, the approach was verified to be effective in practice. Communication efficiency is also affected by other factors, such as the communication context, differing notions of illocutionary acts, and the acceptance of validity claims. Therefore, future studies will analyze the role of pragmatic information in forming notions of illocutionary acts, as well as the influence of communication context and validity claims on both illocutionary acts and perlocutionary effects, in order to develop relatively comprehensive solutions to the inefficiency of progress reports.

  • Identifying heavy-hitter flows fast and accurately

    Publication Year: 2010 , Page(s): V3-26 - V3-30
    Cited by:  Patents (2)

    In many applications, such as network congestion monitoring, accounting, and network anomaly detection, identifying heavy-hitter flows is important and imperative, and extracting heavy-hitter flows in high-speed networks has recently attracted considerable research attention. However, given the speed of modern networks and the ever-growing number of flows, heavy hitters must be identified with low memory and computational overhead while maintaining high measurement accuracy, and tracking per-flow statistics is infeasible because of both computational and memory limits. In this paper, a novel scheme named CHHFR (Caching Heavy-hitter Flows with Replacement) is proposed. CHHFR is based on the LRU (Least Recently Used) replacement mechanism but differs from it: CHHFR does not consider only the update time of traffic flows when it selects a flow to replace. Another parameter, named Ctr, is added to the LRU replacement mechanism to avoid its shortcomings. Through both theoretical analysis and experimental validation on different Internet traces, the CHHFR algorithm is shown to achieve good measurement performance, with higher accuracy and faster speed than existing methods.
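    The abstract leaves the exact replacement rule unspecified; the sketch below is one hypothetical reading of an LRU cache augmented with a per-flow counter (Ctr) in this spirit. The eviction policy, threshold, and counter-decay rule here are illustrative guesses, not the published CHHFR algorithm.

```python
from collections import OrderedDict

class FlowCache:
    """LRU flow cache augmented with a per-flow packet counter (Ctr).

    On a miss with a full cache, evict the least-recently-used flow whose
    counter is still small; if every entry looks heavy, decay all counters
    so genuine heavy hitters survive recency churn while stale ones fade.
    """
    def __init__(self, capacity, promote_threshold=4):
        self.capacity = capacity
        self.threshold = promote_threshold
        self.flows = OrderedDict()  # flow_id -> packet count, in LRU order

    def update(self, flow_id):
        if flow_id in self.flows:
            self.flows[flow_id] += 1
            self.flows.move_to_end(flow_id)   # refresh recency
            return
        if len(self.flows) >= self.capacity:
            # scan from the LRU end for a flow whose counter is below threshold
            for victim, ctr in self.flows.items():
                if ctr < self.threshold:
                    del self.flows[victim]
                    break
            else:
                # all entries look heavy: decay counters, evict the LRU flow
                for f in self.flows:
                    self.flows[f] //= 2
                self.flows.popitem(last=False)
        self.flows[flow_id] = 1

    def heavy_hitters(self, min_count):
        return {f: c for f, c in self.flows.items() if c >= min_count}
```

    The counter is what distinguishes this from plain LRU: a burst of one-packet "mice" flows can no longer evict a long-lived heavy flow purely on recency.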

  • Automated verification of loops with assignments only by recurrence solving and optimization problems

    Publication Year: 2010 , Page(s): V3-31 - V3-36

    Based on techniques for solving recurrence equations and optimization problems, we present a practical approach for verifying programs containing loops with assignments only. We implemented this approach on the Mathematica platform. The experimental results demonstrate the power of our approach.

  • The design and realization of DSRC logic analyzer based on USB

    Publication Year: 2010 , Page(s): V3-37 - V3-40

    The USB interface has the advantages of plug and play, convenient installation, and high-speed transmission. This design realizes an information monitoring device placed between the On Board Unit (OBU) and the Road Side Unit (RSU) used in Dedicated Short Range Communication (DSRC) for Electronic Toll Collection (ETC). The article focuses on the design of the hardware circuit and the firmware for the USB interface chip; the realization of the computer application based on the USB interface is also introduced briefly.

  • Application of the virtual reality technologies in power systems

    Publication Year: 2010 , Page(s): V3-41 - V3-44
    Cited by:  Papers (1)

    This article establishes a new development mode that is component-based, with a virtual reality simulation engine as its kernel. At the same time, it gives the frame architecture of the virtual reality simulation engine and illustrates the functions of the components within that architecture. The aim is to improve the development efficiency of virtual reality substation simulation. The results show that the virtual reality simulation engine can greatly shorten the development cycle of a virtual reality substation simulation and reduce its cost, which justifies building substation simulations on such an engine.

  • The analysis and realization of the P2P network security

    Publication Year: 2010 , Page(s): V3-45 - V3-48

    P2P is a network model that has developed rapidly in recent years. Compared with the traditional client/server model, P2P has many advantages, such as higher utilization of network resources and the elimination of the bottleneck caused by central servers, so it holds great potential value for commerce and technology. This paper first analyzes network encryption and combines the characteristics of the P2P network model with the Web Services transmission structure. A P2P file-sharing system is then designed and realized based on the .NET Framework. The feasibility of the proposed P2P secure transmission model is verified, and experimental results and analyses are given.

  • Online image fusion of remotely sensed satellite imagery using web services chain

    Publication Year: 2010 , Page(s): V3-49 - V3-52

    In this paper, we propose a framework of geoprocessing web services that implements independent, multi-step geospatial processing services to handle complex tasks. An online image fusion web service is developed as an example. This web service allows different images to be fused online; the input images can be obtained from distributed data-supplier web services. The proposed image fusion web service then performs the fusion based on the input parameters and returns the fused image to the client application. The prototype system integrates workflow technology, Web services, and the OGC WPS. Because industry-standard technologies such as XML, SOAP, WSDL, and UDDI are used to build the geoprocessing web services, geoprocessing capabilities can be distributed over the Internet in a platform- and language-independent manner, with faster client application development, easier service deployment, and efficient online discovery.

  • Modulation classification of MQAM signals based on density spectrum of the constellations

    Publication Year: 2010 , Page(s): V3-57 - V3-61

    Blind modulation classification (MC) is an intermediate step between signal detection and demodulation, and plays a key role in various civilian and military applications. In this paper, a novel algorithm based on the density spectrum of digital signal constellations is presented. First, we propose a new carrier estimation method (an iterative approximation algorithm for carrier estimation) that acquires an accurate approximation of the carrier frequency. Then we perform wavelet denoising based on the wavelet transform modulus maxima. Finally, the discriminating feature is derived from the density spectra to identify the differently modulated signals. The density-spectrum-based classifier is much simpler and has lower computational complexity than the traditional classifier based on constellation reconstruction. The simulation results show the relatively high efficiency and recognition accuracy of the method.

  • Optimization of Viterbi decoder parameters for WRAN system

    Publication Year: 2010 , Page(s): V3-53 - V3-56

    Extracting the fixed-point model for digital hardware implementation is an important step in the realization of digital signal processing systems. In such a model, limited word lengths for processing blocks should be selected with no harmful effects on system performance. The Viterbi decoder is one of the commonly used blocks in digital communication systems; optimizing its word length yields a considerable reduction in chip area and decoding delay. In this paper the Viterbi decoder parameters, i.e., the number of soft-decision bits and the trace-back depth, are optimized for a new wireless standard, IEEE 802.22. The optimization keeps the BER degradation of the fixed-point model below 0.5 dB compared to the floating-point model. The 64-QAM modulation with code rate 5/6 is selected as a case study, since it is the mode in the standard most sensitive to channel noise. Simulation results show that a Viterbi decoder with 7-bit soft decisions and a trace-back depth of 70 meets the system requirements.

  • Modeling and performance evaluation of IEEE 802.22 physical layer

    Publication Year: 2010 , Page(s): V3-62 - V3-66
    Cited by:  Papers (1)

    IEEE 802.22, also called Wireless Regional Area Network (WRAN), is the newest wireless standard being developed for remote and rural areas. In this paper an overview of the standard, and more specifically its PHY layer, is introduced. To evaluate the performance of the system, we model the PHY layer in MATLAB/Simulink and extract the Bit Error Rate (BER) of the system for different code rates and modulation schemes over a noisy channel.

  • Weight-based fairness allocation algorithms for Mesh networks

    Publication Year: 2010 , Page(s): V3-67 - V3-70

    Building on fixed fairness algorithms, this paper considers the maximization of throughput in networks. The problem of bandwidth allocation in centralized scheduling is transformed into an optimization problem of fairly allocating rates, which is then simplified into a linear programming problem, and an uplink fairness algorithm based on centralized scheduling is proposed. We model and simulate the algorithm on the NS2 platform, and the results show that the new algorithm reaches the stated fairness goal and maximizes the utilization ratio of the system.

  • A novel subspace clustering algorithm with dimensional density

    Publication Year: 2010 , Page(s): V3-71 - V3-75

    When clustering high-dimensional data, most existing algorithms cannot meet expectations due to the curse of dimensionality. In high-dimensional space, clusters are often hidden in subspaces of the attributes; the distribution of a cluster is dense in its subspace and along each attribute of that subspace, so objects belonging to the same subspace have similar density on each attribute. Based on this idea, a novel subspace clustering algorithm, SC2D, is proposed. By introducing the definition of dimensional density, SC2D puts objects with similar dimensional density into the same cluster; clusters are then separated from each other if more than one cluster occupies the same subspace. Experiments on both artificial and real-world data demonstrate that the SC2D algorithm achieves the desired results.
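    The notion of per-attribute density that SC2D builds on can be pictured with a small sketch. The eps window and the density formula below are illustrative assumptions, not the paper's definitions; objects whose density vectors come out similar would then be grouped into one subspace cluster.

```python
import numpy as np

def dimensional_density(X, eps=0.1):
    """Per-attribute density of each object: the fraction of all objects
    whose value on that single attribute lies within eps of it.
    (A hypothetical reading of SC2D's 'dimensional density'.)"""
    n, d = X.shape
    dens = np.empty((n, d))
    for j in range(d):
        col = X[:, j]
        # pairwise one-dimensional distances on attribute j
        dens[:, j] = (np.abs(col[:, None] - col[None, :]) <= eps).mean(axis=1)
    return dens
```

    Points forming a tight group along an attribute score a high density on it, while an attribute on which the data are spread uniformly scores low for everyone, which is what lets the relevant subspace be recognized.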

  • Notice of Retraction
    Cloning and homologous analysis of a partial lipase gene in Staphylococcus caprae TCCC 11546: Partial lipase gene sequencing of S. caprae

    Publication Year: 2010 , Page(s): V3-76 - V3-80

    Notice of Retraction

    After careful and considered review of the content of this paper by a duly constituted expert committee, this paper has been found to be in violation of IEEE's Publication Principles.

    We hereby retract the content of this paper. Reasonable effort should be made to remove all past references to this paper.

    The presenting author of this paper has the option to appeal this decision by contacting TPII@ieee.org.

    Lipase activity is widespread in staphylococci. We describe here a 781-bp consensus lipase gene sequence of Staphylococcus caprae TCCC 11546, encoding a deduced polypeptide of 260 amino acid residues, which was cloned using degenerate primers. Amino acid sequence alignment and the phylogenetic tree revealed that this 260-amino-acid partial lipase of S. caprae shares high homology, from 55.0% to 98.8%, with 11 conserved lipase stretches from 11 other Staphylococcus species. Moreover, two amino acid residues, Ser39 and Asp230, were identified in the putative lipase catalytic triad, and the consensus `P-loop' motif (-[AG]-x4-G-K-[ST]-) was also found in this partial lipase stretch.

  • Handover algorithm based on movement state for cellular relaying networks

    Publication Year: 2010 , Page(s): V3-81 - V3-85
    Cited by:  Papers (1)

    This paper first introduces the architecture of cellular relaying networks and describes the additional handover scenarios that arise in them. A handover model for cellular relaying networks is then built, and the traditional handover algorithm based on an absolute threshold value and hysteresis level is introduced. On this basis, we propose a novel handover algorithm based on movement state for cellular relaying networks, which combines movement direction and velocity to choose a more suitable access station as the handover target. Finally, the proposed handover algorithm is compared with the traditional algorithm. Simulation results show that the novel algorithm improves handover efficiency, reduces signaling cost, and optimizes network performance.

  • Formal analysis of improved EAP-AKA based on Protocol Composition Logic

    Publication Year: 2010 , Page(s): V3-86 - V3-90
    Cited by:  Papers (1)

    A formal analysis based on PCL (Protocol Composition Logic) points out a vulnerability in the composition of EAP-AKA, and an improved protocol, EAP-AKA', is proposed. Based on the DH protocol, the new protocol provides session key secrecy while avoiding vulnerability to redirection and replay attacks. A security analysis of EAP-AKA' is then carried out in PCL; the analysis indicates that the sub-protocols have SSA and key secrecy. According to the sequential rule, the precondition of a sub-protocol is preserved by the one following it in the chain, and each sub-protocol respects the invariant of the other, so EAP-AKA' is secure in the PCL model.

  • Key frame extraction based on information entropy and edge matching rate

    Publication Year: 2010 , Page(s): V3-91 - V3-94
    Cited by:  Papers (2)

    This paper presents a new approach for key frame extraction based on image information entropy and edge matching rate. First, the information entropy of every frame is calculated; then the edges of the candidate key frames are extracted with the Prewitt operator. Finally, the edges of adjacent candidate frames are matched: if the edge matching rate reaches 50%, the current frame is deemed a redundant key frame and discarded. The experimental results show that the proposed method is accurate and effective for key frame extraction, and the extracted key frames are a good representation of the main content of the given video.
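    The three building blocks above (frame entropy, Prewitt edges, edge matching rate) can be sketched as follows. The edge-matching-rate formula and the gradient threshold here are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def frame_entropy(gray):
    """Shannon entropy (bits) of an 8-bit grayscale frame's intensity histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def prewitt_edges(gray, thresh=48):
    """Binary edge map from Prewitt horizontal/vertical gradients."""
    g = gray.astype(float)
    kx = np.array([[1, 0, -1]] * 3)  # horizontal Prewitt kernel
    ky = kx.T                        # vertical Prewitt kernel
    # valid-region correlation, skipping the 1-pixel border
    gx = sum(kx[i, j] * g[i:g.shape[0] - 2 + i, j:g.shape[1] - 2 + j]
             for i in range(3) for j in range(3))
    gy = sum(ky[i, j] * g[i:g.shape[0] - 2 + i, j:g.shape[1] - 2 + j]
             for i in range(3) for j in range(3))
    return np.hypot(gx, gy) > thresh

def edge_match_rate(e1, e2):
    """Fraction of edge pixels common to both maps (Dice-style overlap)."""
    total = e1.sum() + e2.sum()
    return 0.0 if total == 0 else 2.0 * (e1 & e2).sum() / total
```

    With these pieces, the pipeline keeps frames whose entropy marks them as candidates and drops a candidate whenever its edge map overlaps its neighbour's beyond the 50% threshold.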

  • An improved method of differential fault analysis on SMS4 key schedule

    Publication Year: 2010 , Page(s): V3-95 - V3-99

    SMS4 is a 128-bit block cipher published by China in 2006 as the symmetric-key encryption standard for Wireless Local Area Networks (WLAN). By inducing faults into the key schedule, we propose an improved differential fault attack on the key schedule of the SMS4 cipher. The result shows that our attack can recover the secret key by introducing 4 faulty ciphertexts.

  • A routing scheme for dynamic event region tracking in wireless sensor networks

    Publication Year: 2010 , Page(s): V3-100 - V3-103

    Dynamic event region detection is an important application of wireless sensor networks, in which the position of an event region should be transmitted to the base station promptly. A routing scheme based on a gradient field and clustering is proposed to track dynamic event regions. The gradient field is set up after the sensors are deployed. Only the sensors detecting the event are organized into clusters, and the cluster heads send the event information to the base station through their respective gradient paths. When no event occurs, no data are reported; when an event occurs, data are reported periodically. The scheme has low traffic, low computational complexity, and low delay. Computer simulations verified its effectiveness.
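    The gradient-field construction described above amounts to flooding outward from the base station at deployment time; a minimal sketch, assuming hop count as the gradient metric (the paper may well use a different metric, such as link quality or energy):

```python
from collections import deque

def build_gradient_field(neighbors, sink):
    """Hop-count gradient to the sink via breadth-first flooding.

    neighbors: dict mapping node -> list of radio neighbours.
    Returns dict node -> hop distance (the 'gradient' each sensor stores).
    """
    grad = {sink: 0}
    q = deque([sink])
    while q:
        u = q.popleft()
        for v in neighbors[u]:
            if v not in grad:
                grad[v] = grad[u] + 1
                q.append(v)
    return grad

def gradient_path(neighbors, grad, src):
    """Greedy descent: each hop forwards to the neighbour with smallest gradient."""
    path = [src]
    while grad[path[-1]] > 0:
        nxt = min(neighbors[path[-1]], key=lambda v: grad.get(v, float("inf")))
        path.append(nxt)
    return path
```

    A cluster head reporting an event simply descends the gradient, so no per-destination routing table is needed at the sensors.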

  • Effective XML data storage and distributed query retrieval system

    Publication Year: 2010 , Page(s): V3-104 - V3-108

    As XML gains popularity for data exchange over the Web, storing and querying XML data has become an important issue, especially in terms of support for interoperability and extensibility among various application domains. Supporting this requires a dynamic, context-driven data exchange architecture. In this paper, we propose an efficient distributed query processing method that partitions large native XML repositories into fragments. A query is evaluated in the Query Evaluation Center to determine which Query Processor nodes should process it. The uniqueness of our Query Processor lies in its use of a suitable labeling scheme and an efficient indexing scheme that can identify parent-child (P-C), ancestor-descendant (A-D), and sibling relationships, and consequently retrieve the query result efficiently. Experimental results indicate that our proposed join algorithm, TwigINLAB2, performs about 23% better than TwigStack and 10% better than TwigINLAB1 for all types of queries.
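    A common concrete instance of such a labeling scheme for deciding P-C and A-D relationships is region encoding, where each element receives a (start, end, level) label from a depth-first traversal. The sketch below assumes that scheme for illustration; the paper's actual labeling may differ.

```python
def label_tree(root, children):
    """Assign region labels (start, end, level) via DFS over an element tree.

    children: dict mapping a node to its ordered child nodes.
    """
    labels = {}
    counter = [0]
    def dfs(node, level):
        start = counter[0]; counter[0] += 1
        for c in children.get(node, []):
            dfs(c, level + 1)
        labels[node] = (start, counter[0], level)  # end = position after subtree
        counter[0] += 1
    dfs(root, 0)
    return labels

def is_ancestor(labels, a, d):
    """Ancestor-descendant (A-D): a's region strictly contains d's."""
    sa, ea, _ = labels[a]; sd, ed, _ = labels[d]
    return sa < sd and ed < ea

def is_parent(labels, a, d):
    """Parent-child (P-C): containment plus adjacent levels."""
    return is_ancestor(labels, a, d) and labels[d][2] == labels[a][2] + 1
```

    The appeal for twig joins such as TwigStack-style algorithms is that both relationship tests reduce to constant-time comparisons on the labels, with no tree traversal at query time.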

  • Ad Hoc grid task scheduling algorithm considering trust-demand

    Publication Year: 2010 , Page(s): V3-109 - V3-113
    Cited by:  Papers (1)

    At present, most Ad Hoc grid task scheduling algorithms ignore the impact of trust mechanisms, resulting in potential danger in real scheduling. In this paper, a trust model is defined that satisfies tasks' trust demands, so that a suitable resource node can be chosen for execution during scheduling. Moreover, a task scheduling algorithm extended from the Min-Min algorithm (T-Min-Min) is proposed. The proposed algorithm not only satisfies the trust demand of the user but also considers the resource nodes' energy consumption and load balancing. Simulation suggests that it can effectively shorten the task completion time and reduce the number of failed tasks in the Ad Hoc grid.
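    The trust-constrained Min-Min idea can be sketched as follows: standard Min-Min, but with each task's candidate set restricted to nodes meeting its trust demand. The energy and load-balancing terms of T-Min-Min are omitted, and the ETC matrix, trust values, and demands are illustrative assumptions.

```python
def t_min_min(tasks, nodes, etc, trust, demand):
    """Min-Min scheduling restricted to nodes meeting each task's trust demand.

    etc[t][n]: estimated execution time of task t on node n (assumed known)
    trust[n]: node trust value; demand[t]: minimum trust task t requires
    Returns (task -> node assignment, per-node finish times).
    """
    ready = dict.fromkeys(nodes, 0.0)   # current finish time of each node
    unscheduled = set(tasks)
    assignment = {}
    while unscheduled:
        best = None  # (completion_time, task, node)
        for t in unscheduled:
            eligible = [n for n in nodes if trust[n] >= demand[t]]
            if not eligible:
                raise ValueError(f"no trusted node for task {t}")
            # earliest completion time of t over its eligible nodes
            n = min(eligible, key=lambda m: ready[m] + etc[t][m])
            c = ready[n] + etc[t][n]
            if best is None or c < best[0]:
                best = (c, t, n)
        c, t, n = best
        assignment[t] = n    # schedule the task with minimum earliest completion
        ready[n] = c
        unscheduled.remove(t)
    return assignment, ready
```

    Filtering before the Min-Min selection is what prevents a fast but untrusted node from attracting a sensitive task.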

  • Key-peers based topology control for unstructured P2P networks

    Publication Year: 2010 , Page(s): V3-114 - V3-118

    One of the essential problems in unstructured peer-to-peer (P2P) networks is improving the efficiency of resource retrieval: previous approaches either respond slowly or generate too much network overhead. To reduce the traffic load and improve the availability of shared resources in unstructured P2P networks, a key-peer-based topology control mechanism is presented in this paper. Due to the inherent heterogeneity of peers in a P2P network, a few peers (called key peers) directly affect the connectivity of the P2P overlay topology, so it is particularly important to discover these peers. Here, we regard the P2P overlay topology as an undirected graph and analyze the similarities and differences between cut nodes in graph theory and key peers in P2P networks. We then use the principles of cut nodes and the reachability relationships of nodes to detect the key peers of the P2P network, and adjust the P2P overlay topology based on them. Finally, experimental results show that, compared to the original P2P overlay topology, the modified topology obtained with our approach performs much better in success rate and response speed, especially for rare resources.
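    The cut-node concept the paper borrows from graph theory is the classic articulation-point computation; a standard low-link DFS sketch is shown below (how the authors map reachability onto the P2P overlay and adjust it afterwards is not reproduced here).

```python
def articulation_points(graph):
    """Cut vertices of an undirected graph via the low-link DFS (Tarjan).

    graph: dict mapping node -> iterable of neighbours.
    """
    disc, low, parent, cuts = {}, {}, {}, set()
    timer = [0]

    def dfs(u):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in graph[u]:
            if v not in disc:
                parent[v] = u
                children += 1
                dfs(v)
                low[u] = min(low[u], low[v])
                # u is a cut vertex if no back edge from v's subtree climbs above u
                if parent.get(u) is not None and low[v] >= disc[u]:
                    cuts.add(u)
            elif v != parent.get(u):
                low[u] = min(low[u], disc[v])  # back edge
        # a DFS root is a cut vertex iff it has at least two DFS children
        if parent.get(u) is None and children > 1:
            cuts.add(u)

    for node in graph:
        if node not in disc:
            parent[node] = None
            dfs(node)
    return cuts
```

    Removing any returned peer disconnects some part of the overlay, which is exactly why such peers deserve extra links or replication in a topology-control scheme.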
