Title: Modeling Biological Signals using Information-Entropy with Kullback-Leibler-Divergence
Author: Anjali Mohapatra, P. M. Mishra, S. Padhy
Citation: Vol. 9, No. 1, pp. 147-154
Abstract: Biological signals are short, conserved regions in DNA, RNA, or protein sequences that correspond to structural and/or functional features of the biomolecules. Finding such signals has important applications in locating regulatory sites and identifying drug targets. Identifying biological signals such as motifs is challenging because they can occur in different sequences in mutated forms. Despite extensive study over the last few years, the problem is far from being satisfactorily solved. Most existing methods formulate signal finding as an intractable optimization problem and rely either on expectation maximization (EM) or on local heuristics. Another challenge is the choice of model: simpler models such as position weight matrices (PWM) impose biologically unrealistic assumptions, whereas richer models are difficult to parameterize. In this paper, a conceptually simple and biologically relevant model based on Kullback-Leibler divergence within an information-entropy framework is proposed to measure the divergence of biological signals. Both synthetic and real data are used to test the applicability of the proposed model for finding motifs in DNA sequences. The proposed model performs better than models based on PWM or Shannon entropy.
Keywords: Information entropy, Biological signal, Motif, Kullback-Leibler divergence
URL: http://paper.ijcsns.org/07_book/200901/20090121.pdf
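
The abstract contrasts Kullback-Leibler (relative-entropy) scoring of motifs with PWM/Shannon-entropy scoring. The sketch below illustrates that contrast on toy data only; it is not the authors' implementation, and the function names, pseudocount, background frequencies, and example motif occurrences are all assumptions made for illustration.

```python
# Minimal illustrative sketch (assumed, not the paper's method): score a candidate
# DNA motif by Kullback-Leibler divergence between its per-column base frequencies
# and the background base distribution, and compare with the plain Shannon-entropy
# information content that underlies PWM-style scoring.

import math

BASES = "ACGT"

def column_frequencies(occurrences, pseudocount=0.5):
    """Per-column base frequencies of aligned motif occurrences; the small
    pseudocount (assumed value) avoids zero probabilities for unseen bases."""
    length = len(occurrences[0])
    freqs = []
    for j in range(length):
        counts = {b: pseudocount for b in BASES}
        for seq in occurrences:
            counts[seq[j]] += 1
        total = sum(counts.values())
        freqs.append({b: counts[b] / total for b in BASES})
    return freqs

def kl_divergence_score(freqs, background):
    """Sum over columns of D_KL(p_j || q): how far the motif columns deviate
    from the background distribution q, in bits."""
    return sum(
        p[b] * math.log2(p[b] / background[b])
        for p in freqs
        for b in BASES
        if p[b] > 0
    )

def shannon_information(freqs):
    """Sum over columns of (2 - H(p_j)): information content relative to a
    uniform background, the quantity behind PWM sequence logos."""
    return sum(
        2 + sum(p[b] * math.log2(p[b]) for b in BASES if p[b] > 0)
        for p in freqs
    )

if __name__ == "__main__":
    # Toy aligned occurrences of a length-6 motif and an AT-rich background
    # (both hypothetical data, chosen only to make the two scores comparable).
    occurrences = ["TATAAT", "TATACT", "TAAAAT", "TATGAT"]
    background = {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}

    freqs = column_frequencies(occurrences)
    print("KL-divergence score (bits):", round(kl_divergence_score(freqs, background), 3))
    print("Shannon information (bits):", round(shannon_information(freqs), 3))
```

Note the design difference the abstract points to: the Shannon/PWM quantity measures deviation from a uniform background, while the KL score measures deviation from the actual (e.g. AT-rich) background, so the two rank candidate motifs differently when base composition is skewed.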