Algorithms for estimating information distance with application to bioinformatics and linguistics

1 Author(s)
Kaitchenko, A.; Dept. of Phys. & Comput., Wilfrid Laurier Univ., Waterloo, Ont., Canada

We review unnormalized and normalized information distances based on incomputable notions of Kolmogorov complexity and discuss how Kolmogorov complexity can be approximated by data compression algorithms. We argue that optimal algorithms for data compression with side information can be successfully used to approximate the normalized distance. Next, we discuss an alternative information distance, which is based on relative entropy rate (also known as Kullback-Leibler divergence), and compression-based algorithms for its estimation. We conjecture that in bioinformatics and computational linguistics this alternative distance is more relevant and important than the ones based on Kolmogorov complexity.
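The compression-based approximation described in the abstract is commonly realized as the normalized compression distance (NCD), which replaces the incomputable Kolmogorov complexity with the output length of a real compressor. The sketch below, an illustration rather than the authors' own algorithm, uses Python's `zlib` as the stand-in compressor; any better compressor would give a tighter approximation.

```python
# Sketch of the normalized compression distance (NCD): a computable
# proxy for the normalized information distance, using compressed
# length as an approximation of Kolmogorov complexity.
import zlib


def c(data: bytes) -> int:
    """Approximate the Kolmogorov complexity of `data` by its
    zlib-compressed length (level 9 = maximum compression)."""
    return len(zlib.compress(data, 9))


def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    Values near 0 indicate similar strings; values near 1 indicate
    little shared structure."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)


if __name__ == "__main__":
    a = b"the quick brown fox jumps over the lazy dog " * 50
    b = bytes(range(256)) * 10  # very different statistics
    print(ncd(a, a))  # near 0: a string shares all structure with itself
    print(ncd(a, b))  # closer to 1: little shared structure
```

In practice (e.g. for DNA sequences or natural-language texts, the applications the abstract targets), the quality of the distance estimate depends directly on how well the chosen compressor models the data.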

Published in:

Canadian Conference on Electrical and Computer Engineering, 2004 (Volume 4)

Date of Conference:

2-5 May 2004