Average Mutual Information

The Experts below are selected from a list of 7470 Experts worldwide, ranked by the ideXlab platform

Tolga M Duman - One of the best experts on this subject based on the ideXlab platform.

  • Implementing the Han-Kobayashi scheme using low-density parity-check codes over Gaussian interference channels
    IEEE Transactions on Communications, 2015
    Co-Authors: Shahrouz Sharifi, Korhan A Tanc, Tolga M Duman
    Abstract:

    We focus on Gaussian interference channels (GICs) and study the Han-Kobayashi coding strategy for the two-user case with the objective of designing implementable (explicit) channel codes. Specifically, low-density parity-check codes are adopted for use over the channel, their benefits are studied, and suitable codes are designed. Iterative joint decoding is used at the receivers, where independent and identically distributed channel adapters are used to prove that log-likelihood-ratios exchanged among the nodes of the Tanner graph enjoy symmetry when BPSK or QPSK with Gray coding is employed. This property is exploited in the proposed code optimization algorithm adopting a random perturbation technique. Code optimization and convergence threshold computations are carried out for different GICs employing finite constellations by tracking the Average Mutual Information. Furthermore, stability conditions for the admissible degree distributions under strong and weak interference levels are determined. Via examples, it is observed that the optimized codes using BPSK or QPSK with Gray coding operate close to the capacity boundary for strong interference. For the case of weak interference, it is shown that nontrivial rate pairs are achievable via the newly designed codes, which are not possible by single user codes with time sharing. Performance of the designed codes is also studied for finite block lengths through simulations of specific codes picked with the optimized degree distributions with random constructions, where, for one instance, the results are compared with those of some structured designs.
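
The convergence-threshold computations described above track the Average Mutual Information of the messages for finite constellations. As a minimal, self-contained sketch of the underlying quantity (not the authors' code-optimization algorithm), the constrained AMI of BPSK over a real AWGN channel can be estimated by Monte Carlo from the channel log-likelihood ratios; the function name and parameter choices here are illustrative:

```python
import numpy as np

def bpsk_ami_awgn(snr_db, n=200_000, seed=0):
    """Monte Carlo estimate of the average mutual information (bits/symbol)
    of equiprobable BPSK over a real AWGN channel, via
    I(X;Y) = 1 - E[log2(1 + exp(-L))], where L is the channel LLR
    referred to the transmitted bit."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    sigma2 = 1.0 / (2.0 * snr)              # noise variance for unit-energy BPSK
    x = rng.choice([-1.0, 1.0], size=n)     # transmitted symbols
    y = x + rng.normal(0.0, np.sqrt(sigma2), size=n)
    llr = 2.0 * y * x / sigma2              # LLR of the transmitted bit
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-llr)))
```

For QPSK with Gray mapping over a complex AWGN channel, the same per-bit computation applies to each of the two Gray-mapped bit channels.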

  • Low-density parity-check codes over wireless relay channels
    IEEE Transactions on Wireless Communications, 2007
    Co-Authors: Tolga M Duman
    Abstract:

    We exploit the capacity-approaching capability of low-density parity-check (LDPC) codes to design coding schemes for relay channels. We consider the classical relay channel model, and the use of both full-duplex and half-duplex relays. In addition to the design of practical coding schemes and the development of the appropriate receiver structures, we also use Average Mutual Information to characterize the convergence behavior of the proposed systems. Using the convergence predictions and the simulation results, we demonstrate that the proposed LDPC coded relay systems, in particular those with irregular LDPC codes, have the capability to approach the ergodic/outage Information rates very closely. This is true both for ergodic fading channels, where the Shannon-type (constrained, i.e., modulation-specific) capacity is considered, and for non-ergodic fading channels, where the outage capacity provides the appropriate limits of reliable communication. For the (time-division) half-duplex relay schemes, we also discuss the optimization of the time-division parameters and the bit allocation strategies to improve the system performance further.

Khalid Sayood - One of the best experts on this subject based on the ideXlab platform.

  • Use of Average Mutual Information signatures to construct phylogenetic trees for fungi
    2017 IEEE International Conference on Electro Information Technology (EIT), 2017
    Co-Authors: Garin Newcomb, Audrey L. Atkin, Khalid Sayood
    Abstract:

    Average Mutual Information (AMI) has been applied to many fields, including various aspects of bioinformatics. In this paper, we evaluate its performance as a measure of evolutionary distance between sequences. We use the internal transcribed spacer (ITS) regions for 16 fungal sequences as representative sequences used for species comparison. We generate profiles based on the AMI for each species' ITS sequence. We then populate a distance matrix for the set of species using either a Euclidean or correlation distance between AMI profiles. We generate phylogenetic trees using the distance matrices as input. While these trees do not exactly match the accepted fungal phylogeny, there are sufficient commonalities to merit further investigation of AMI as a distance metric and tool for inferring relationships. We also simulate the evolution of an ITS sequence in order to observe how point mutations affect the distance between AMI profiles, concluding that a correlation distance performs slightly better than a Euclidean distance.
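
The pipeline described above (an AMI profile per sequence, then a Euclidean or correlation distance between profiles) can be sketched as follows, assuming a simple plug-in estimator of the mutual information between bases separated by k; the papers' exact estimator, normalization, and range of k may differ:

```python
import numpy as np
from itertools import product

def ami_profile(seq, kmax=20):
    """AMI (bits) between bases separated by k, for k = 1..kmax."""
    seq = seq.upper()
    bases = "ACGT"
    p = np.array([seq.count(b) for b in bases], dtype=float)
    p /= p.sum()                                  # marginal base frequencies
    prof = []
    for k in range(1, kmax + 1):
        joint = np.zeros((4, 4))
        for a, b in zip(seq[:-k], seq[k:]):       # pairs at separation k
            if a in bases and b in bases:
                joint[bases.index(a), bases.index(b)] += 1
        joint /= joint.sum()
        ami = 0.0
        for i, j in product(range(4), range(4)):
            if joint[i, j] > 0:
                ami += joint[i, j] * np.log2(joint[i, j] / (p[i] * p[j]))
        prof.append(ami)
    return np.array(prof)

def euclidean_distance(p1, p2):
    return float(np.linalg.norm(p1 - p2))

def correlation_distance(p1, p2):
    """1 - Pearson correlation between two AMI profiles."""
    return 1.0 - float(np.corrcoef(p1, p2)[0, 1])
```

A distance matrix over all pairs of species' profiles can then be fed to any standard tree-building method (e.g. neighbor joining).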

  • Use of Average Mutual Information for studying changes in HIV populations
    2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2009
    Co-Authors: Khalid Sayood, Federico Hoffman, Charles Wood
    Abstract:

    Average Mutual Information (AMI) has been used in a number of applications in bioinformatics. In this paper we present its use to study genetic changes in populations, in particular populations of HIV viruses. Disease progression of HIV-1 infection in infants can be rapid, resulting in death within the first year, or slow, allowing the infant to survive beyond the first year. We study the development of rapid and slow progressing HIV populations using AMI charts based on the Average Mutual Information among amino acids in the env gene, from a population of 1142 clones derived from seven infants with slow progressing HIV-1 infection and four infants with rapidly progressing HIV-1 infection. The AMI charts indicate the relative homogeneity of the rapid progressor populations and the much greater heterogeneity of the slow progressor populations, especially in later samples. The charts also show the distinct regions of covariation between residues without the need for aligning the sequences. By examining the changes in AMI between populations we can distinguish between clones obtained from rapid progressors and slow progressors. A measure of this change can be used to enhance prediction of disease progression.
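
The AMI charts described above are built from pairwise mutual information between residue positions across a population of sequences. A minimal position-based sketch is given below; note it assumes equal-length (effectively pre-aligned) sequences, which is a simplification the paper's method avoids:

```python
from collections import Counter
from math import log2

def positional_mi(seqs, i, j):
    """Plug-in mutual information (bits) between the residues observed at
    positions i and j across a set of equal-length sequences; one such
    value per position pair fills one cell of an AMI chart."""
    n = len(seqs)
    pi = Counter(s[i] for s in seqs)            # marginal at position i
    pj = Counter(s[j] for s in seqs)            # marginal at position j
    pij = Counter((s[i], s[j]) for s in seqs)   # joint over the pair
    return sum((c / n) * log2((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())
```

Perfectly covarying positions give the full marginal entropy, while independently varying positions give values near zero, which is what makes regions of covariation stand out in the chart.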

  • The Average Mutual Information profile as a genomic signature
    BMC Bioinformatics, 2008
    Co-Authors: Mark Bauer, Sheldon M Schuster, Khalid Sayood
    Abstract:

    Occult organizational structures in DNA sequences may hold the key to understanding functional and evolutionary aspects of the DNA molecule. Such structures can also provide the means for identifying and discriminating organisms using genomic data. Species-specific genomic signatures are useful in a variety of contexts, such as evolutionary analysis, assembly and classification of genomic sequences from large uncultivated microbial communities, and rapid identification in health hazard situations. We have analyzed genomic sequences of eukaryotic and prokaryotic chromosomes, as well as various subtypes of viruses, using an Information theoretic framework. We confirm the existence of a species-specific Average Mutual Information (AMI) profile. We use these profiles to define a very simple, computationally efficient, alignment-free distance measure that reflects the evolutionary relationships between genomic sequences. We use this distance measure to classify chromosomes according to species of origin, to separate and cluster subtypes of the HIV-1 virus, and to classify DNA fragments to species of origin. AMI profiles of DNA sequences prove to be species-specific and easy to compute. The structure of AMI profiles is conserved, even in short subsequences of a species' genome, rendering a pervasive signature. This signature can be used to classify relatively short DNA fragments to species of origin.

  • The use of Average Mutual Information profile as a species signature
    Data Compression Conference, 2005
    Co-Authors: Mark Bauer, Sheldon M Schuster, Khalid Sayood
    Abstract:

    Two sets of figures are presented without discussion. The first set shows: (1a) the Average Mutual Information profile for the human chromosomes, plotted for values of k between 5 and 50; and (1b) the Average Mutual Information profile for the mouse chromosomes, plotted for values of k between 5 and 50. The second set of figures shows: (2a) the Average Mutual Information profile for the C. elegans chromosomes, plotted for values of k between 5 and 50; and (2b) the Average Mutual Information profile for the S. cerevisiae chromosomes, plotted for values of k between 5 and 50.

  • Introduction to Data Compression
    1996
    Co-Authors: Khalid Sayood
    Abstract:

    Preface
    1 Introduction 1.1 Compression Techniques 1.1.1 Lossless Compression 1.1.2 Lossy Compression 1.1.3 Measures of Performance 1.2 Modeling and Coding 1.3 Organization of This Book 1.4 Summary 1.5 Projects and Problems
    2 Mathematical Preliminaries 2.1 Overview 2.2 A Brief Introduction to Information Theory 2.3 Models 2.3.1 Physical Models 2.3.2 Probability Models 2.3.3 Markov Models 2.3.4 Summary 2.5 Projects and Problems
    3 Huffman Coding 3.1 Overview 3.2 "Good" Codes 3.3 The Huffman Coding Algorithm 3.3.1 Minimum Variance Huffman Codes 3.3.2 Length of Huffman Codes 3.3.3 Extended Huffman Codes 3.4 Nonbinary Huffman Codes 3.5 Adaptive Huffman Coding 3.5.1 Update Procedure 3.5.2 Encoding Procedure 3.5.3 Decoding Procedure 3.6 Applications of Huffman Coding 3.6.1 Lossless Image Compression 3.6.2 Text Compression 3.6.3 Audio Compression 3.7 Summary 3.8 Projects and Problems
    4 Arithmetic Coding 4.1 Overview 4.2 Introduction 4.3 Coding a Sequence 4.3.1 Generating a Tag 4.3.2 Deciphering the Tag 4.4 Generating a Binary Code 4.4.1 Uniqueness and Efficiency of the Arithmetic Code 4.4.2 Algorithm Implementation 4.4.3 Integer Implementation 4.5 Comparison of Huffman and Arithmetic Coding 4.6 Applications 4.6.1 Bi-Level Image Compression-The JBIG Standard 4.6.2 Image Compression 4.7 Summary 4.8 Projects and Problems
    5 Dictionary Techniques 5.1 Overview 5.2 Introduction 5.3 Static Dictionary 5.3.1 Digram Coding 5.4 Adaptive Dictionary 5.4.1 The LZ77 Approach 5.4.2 The LZ78 Approach 5.5 Applications 5.5.1 File Compression-UNIX COMPRESS 5.5.2 Image Compression-the Graphics Interchange Format (GIF) 5.5.3 Compression over Modems-V.42 bis 5.6 Summary 5.7 Projects and Problems
    6 Lossless Image Compression 6.1 Overview 6.2 Introduction 6.3 Facsimile Encoding 6.3.1 Run-Length Coding 6.3.2 CCITT Group 3 and 4-Recommendations T.4 and T.6 6.3.3 Comparison of MH, MR, MMR, and JBIG 6.4 Progressive Image Transmission 6.5 Other Image Compression Approaches 6.5.1 Linear Prediction Models 6.5.2 Context Models 6.5.3 Multiresolution Models 6.5.4 Modeling Prediction Errors 6.6 Summary 6.7 Projects and Problems
    7 Mathematical Preliminaries 7.1 Overview 7.2 Introduction 7.3 Distortion Criteria 7.3.1 The Human Visual System 7.3.2 Auditory Perception 7.4 Information Theory Revisited 7.4.1 Conditional Entropy 7.4.2 Average Mutual Information 7.4.3 Differential Entropy 7.5 Rate Distortion Theory 7.6 Models 7.6.1 Probability Models 7.6.2 Linear System Models 7.6.3 Physical Models 7.7 Summary 7.8 Projects and Problems
    8 Scalar Quantization 8.1 Overview 8.2 Introduction 8.3 The Quantization Problem 8.4 Uniform Quantizer 8.5 Adaptive Quantization 8.5.1 Forward Adaptive Quantization 8.5.2 Backward Adaptive Quantization 8.6 Nonuniform Quantization 8.6.1 pdf-Optimized Quantization 8.6.2 Companded Quantization 8.7 Entropy-Coded Quantization 8.7.1 Entropy Coding of Lloyd-Max Quantizer Outputs 8.7.2 Entropy-Constrained Quantization 8.7.3 High-Rate Optimum Quantization 8.8 Summary 8.9 Projects and Problems
    9 Vector Quantization 9.1 Overview 9.2 Introduction 9.3 Advantages of Vector Quantization over Scalar Quantization 9.4 The Linde-Buzo-Gray Algorithm 9.4.1 Initializing the LBG Algorithm 9.4.2 The Empty Cell Problem 9.4.3 Use of LBG for Image Compression 9.5 Tree-Structured Vector Quantizers 9.5.1 Design of Tree-Structured Vector Quantizers 9.6 Structured Vector Quantizers 9.6.1 Pyramid Vector Quantization 9.6.2 Polar and Spherical Vector Quantizers 9.6.3 Lattice Vector Quantizers 9.7 Variations on the Theme 9.7.1 Gain-Shape Vector Quantization 9.7.2 Mean-Removed Vector Quantization 9.7.3 Classified Vector Quantization 9.7.4 Multistage Vector Quantization 9.7.5 Adaptive Vector Quantization 9.8 Summary 9.9 Projects and Problems
    10 Differential Encoding 10.1 Overview 10.2 Introduction 10.3 The Basic Algorithm 10.4 Prediction in DPCM 10.5 Adaptive DPCM (ADPCM) 10.5.1 Adaptive Quantization in DPCM 10.5.2 Adaptive Prediction in DPCM 10.6 Delta Modulation 10.6.1 Constant Factor Adaptive Delta Modulation (CFDM) 10.6.2 Continuously Variable Slope Delta Modulation 10.7 Speech Coding 10.7.1 G.726 10.8 Summary 10.9 Projects and Problems
    11 Subband Coding 11.1 Overview 11.2 Introduction 11.3 The Frequency Domain and Filtering 11.3.1 Filters 11.4 The Basic Subband Coding Algorithm 11.4.1 Bit Allocation 11.5 Application to Speech Coding-G.722 11.6 Application to Audio Coding-MPEG Audio 11.7 Application to Image Compression 11.7.1 Decomposing an Image 11.7.2 Coding the Subbands 11.8 Wavelets 11.8.1 Families of Wavelets 11.8.2 Wavelets and Image Compression 11.9 Summary 11.10 Projects and Problems
    12 Transform Coding 12.1 Overview 12.2 Introduction 12.3 The Transform 12.4 Transforms of Interest 12.4.1 Karhunen-Loeve Transform 12.4.2 Discrete Cosine Transform 12.4.3 Discrete Sine Transform 12.4.4 Discrete Walsh-Hadamard Transform 12.5 Quantization and Coding of Transform Coefficients 12.6 Application to Image Compression-JPEG 12.6.1 The Transform 12.6.2 Quantization 12.6.3 Coding 12.7 Application to Audio Compression 12.8 Summary 12.9 Projects and Problems
    13 Analysis/Synthesis Schemes 13.1 Overview 13.2 Introduction 13.3 Speech Compression 13.3.1 The Channel Vocoder 13.3.2 The Linear Predictive Coder (Gov.Std.LPC-10) 13.3.3 Code Excited Linear Prediction (CELP) 13.3.4 Sinusoidal Coders 13.4 Image Compression 13.4.1 Fractal Compression 13.5 Summary 13.6 Projects and Problems
    14 Video Compression 14.1 Overview 14.2 Introduction 14.3 Motion Compensation 14.4 Video Signal Representation 14.5 Algorithms for Videoconferencing and Videophones 14.5.1 ITU-T Recommendation H.261 14.5.2 Model-Based Coding 14.6 Asymmetric Applications 14.6.1 The MPEG Video Standard 14.7 Packet Video 14.7.1 ATM Networks 14.7.2 Compression Issues in ATM Networks 14.7.3 Compression Algorithms for Packet Video 14.8 Summary 14.9 Projects and Problems
    A Probability and Random Processes A.1 Probability A.2 Random Variables A.3 Distribution Functions A.4 Expectation A.5 Types of Distribution A.6 Stochastic Process A.7 Projects and Problems
    B A Brief Review of Matrix Concepts B.1 A Matrix B.2 Matrix Operations
    C Codes for Facsimile Encoding
    D The Root Lattices
    Bibliography
    Index

Jamal Najim - One of the best experts on this subject based on the ideXlab platform.

  • On the Capacity Achieving Covariance Matrix for Rician MIMO Channels: An Asymptotic Approach
    IEEE Transactions on Information Theory, 2010
    Co-Authors: Julien Dumont, Philippe Loubaton, Samson Lasaulce, Walid Hachem, Jamal Najim
    Abstract:

    In this paper, the capacity-achieving input covariance matrices for coherent block-fading correlated multiple input multiple output (MIMO) Rician channels are determined. In contrast with the Rayleigh and uncorrelated Rician cases, no closed-form expressions for the eigenvectors of the optimum input covariance matrix are available. Classically, both the eigenvectors and eigenvalues are computed numerically and the corresponding optimization algorithms remain computationally very demanding. In the asymptotic regime where the number of transmit and receive antennas converge to infinity at the same rate, new results related to the accuracy of the approximation of the Average Mutual Information are provided. Based on the accuracy of this approximation, an attractive optimization algorithm is proposed and analyzed. This algorithm is shown to yield an effective way to compute the capacity achieving matrix for the Average Mutual Information and numerical simulation results show that, even for a moderate number of transmit and receive antennas, the new approach provides the same results as direct maximization approaches of the Average Mutual Information.
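
The object being approximated and optimized above is the ergodic (average) mutual information E[log2 det(I + (ρ/t) H Q Hᴴ)]. As a point of comparison for the large-system approach, a direct Monte Carlo evaluation can be sketched as follows, under simplified assumptions (uncorrelated Rician fading, an all-ones line-of-sight component, and a fixed input covariance Q; all names and defaults are illustrative):

```python
import numpy as np

def ergodic_mi(snr_db, t=4, r=4, k_factor=1.0, q=None, n_trials=2000, seed=0):
    """Monte Carlo estimate (bits/s/Hz) of E[log2 det(I + (snr/t) H Q H^H)]
    for an uncorrelated Rician channel with unit-energy entries."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    if q is None:
        q = np.eye(t)                       # isotropic input as a baseline
    los = np.ones((r, t)) * np.sqrt(k_factor / (k_factor + 1.0))
    scale = np.sqrt(1.0 / (2.0 * (k_factor + 1.0)))   # per-dimension scatter std
    total = 0.0
    for _ in range(n_trials):
        h = los + scale * (rng.standard_normal((r, t))
                           + 1j * rng.standard_normal((r, t)))
        m = np.eye(r) + (snr / t) * h @ q @ h.conj().T
        total += np.log2(np.real(np.linalg.det(m)))   # det is real positive
    return total / n_trials
```

The paper's contribution is precisely to avoid this kind of brute-force averaging inside the optimization loop, by maximizing an accurate large-system approximation of the same quantity instead.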

  • On the Capacity Achieving Covariance Matrix for Rician MIMO Channels: An Asymptotic Approach
    arXiv: Probability, 2007
    Co-Authors: Julien Dumont, Philippe Loubaton, Samson Lasaulce, Walid Hachem, Jamal Najim
    Abstract:

    The capacity-achieving input covariance matrices for coherent block-fading correlated MIMO Rician channels are determined. In this case, no closed-form expressions for the eigenvectors of the optimum input covariance matrix are available. An approximation of the Average Mutual Information is evaluated in this paper in the asymptotic regime where the number of transmit and receive antennas converge to $+\infty$. New results related to the accuracy of the corresponding large system approximation are provided. An attractive optimization algorithm of this approximation is proposed and we establish that it yields an effective way to compute the capacity achieving covariance matrix for the Average Mutual Information. Finally, numerical simulation results show that, even for a moderate number of transmit and receive antennas, the new approach provides the same results as direct maximization approaches of the Average Mutual Information, while being much more computationally attractive.

Miguel R. D. Rodrigues - One of the best experts on this subject based on the ideXlab platform.

  • Coherent Fading Channels Driven by Arbitrary Inputs: Asymptotic Characterization of the Constrained Capacity and Related Information- and Estimation-Theoretic Quantities
    2016
    Co-Authors: Alberto Gil C. P. Ramos, Miguel R. D. Rodrigues
    Abstract:

    We consider the characterization of the asymptotic behavior of the Average minimum mean-squared error (MMSE) and the Average Mutual Information in scalar and vector fading coherent channels, where the receiver knows the exact fading channel state but the transmitter knows only the fading channel distribution, driven by a range of inputs. We construct low-snr and – at the heart of the novelty of the contribution – high-snr asymptotic expansions for the Average MMSE and the Average Mutual Information for coherent channels subject to Rayleigh fading, Ricean fading or Nakagami fading and driven by discrete inputs (with finite support) or various continuous inputs. We reveal the role that the so-called canonical MMSE in a standard additive white Gaussian noise (AWGN) channel plays in the characterization of the asymptotic behavior of the Average MMSE and the Average Mutual Information in a fading coherent channel: in the regime of low-snr, the derivatives of the canonical MMSE define the expansions of the estimation- and Information-theoretic quantities; in contrast, in the regime of high-snr, the Mellin transform of the canonical MMSE defines the expansions of the quantities. We thus also provide, numerically and – whenever possible – analytically, the Mellin transform of the canonical MMSE for the most common input distributions. We also reveal connections to and generalizations of the MMSE dimension. The most relevant element that enables the construction of these non-trivial expansions is the realization that the integral representation of the estimation- and Information

  • Characterization and optimization of the constrained capacity of coherent fading channels driven by arbitrary inputs: a Mellin transform based asymptotic approach
    International Symposium on Information Theory, 2013
    Co-Authors: Alberto Gil C. P. Ramos, Miguel R. D. Rodrigues
    Abstract:

    We unveil asymptotic characterizations of the Average minimum mean-squared error (MMSE) and the Average Mutual Information in scalar fading coherent channels, where the receiver knows the exact fading channel state but the transmitter knows only the fading channel distribution, driven by a range of inputs both in the regimes of low-SNR - and at the heart of the novelty of the contribution - high-SNR. We also unveil connections to and generalizations of the MMSE dimension. By capitalizing on the characterizations, we conclude with applications of the results to the optimization of the constrained capacity of a bank of parallel independent coherent fading channels.
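
The bridge between the Average MMSE and the Average Mutual Information exploited in this line of work is the I-MMSE relation of Guo, Shamai and Verdú: for the real canonical channel Y = √snr·X + N, dI/dsnr = mmse(snr)/2 in nats. A numerical sketch for a BPSK input, with illustrative function names, a shared Monte Carlo noise sample, and simple trapezoidal integration, is:

```python
import numpy as np

def bpsk_mmse(s, z):
    """Canonical MMSE of BPSK over Y = sqrt(s)*X + N, N ~ N(0,1):
    mmse(s) = 1 - E[tanh^2(s + sqrt(s)*Z)], estimated over samples z."""
    return 1.0 - np.mean(np.tanh(s + np.sqrt(s) * z) ** 2)

def bpsk_mi_via_immse(snr, steps=400, n=50_000, seed=0):
    """Mutual information (bits) from the I-MMSE relation for the real
    channel: I(snr) = 1/(2 ln 2) * integral_0^snr mmse(g) dg."""
    z = np.random.default_rng(seed).standard_normal(n)
    g = np.linspace(0.0, snr, steps)
    m = np.array([bpsk_mmse(x, z) for x in g])
    # trapezoidal rule over the MMSE curve
    integral = np.sum(0.5 * (m[1:] + m[:-1]) * np.diff(g))
    return float(integral / (2.0 * np.log(2.0)))
```

In a fading coherent channel, the Average quantities follow by additionally averaging these expressions over the fading distribution, which is where the Mellin-transform expansions of the canonical MMSE enter.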

Mark Bauer - One of the best experts on this subject based on the ideXlab platform.

  • The Average Mutual Information profile as a genomic signature
    BMC Bioinformatics, 2008
    Co-Authors: Mark Bauer, Sheldon M Schuster, Khalid Sayood
    Abstract:

    Occult organizational structures in DNA sequences may hold the key to understanding functional and evolutionary aspects of the DNA molecule. Such structures can also provide the means for identifying and discriminating organisms using genomic data. Species-specific genomic signatures are useful in a variety of contexts, such as evolutionary analysis, assembly and classification of genomic sequences from large uncultivated microbial communities, and rapid identification in health hazard situations. We have analyzed genomic sequences of eukaryotic and prokaryotic chromosomes, as well as various subtypes of viruses, using an Information theoretic framework. We confirm the existence of a species-specific Average Mutual Information (AMI) profile. We use these profiles to define a very simple, computationally efficient, alignment-free distance measure that reflects the evolutionary relationships between genomic sequences. We use this distance measure to classify chromosomes according to species of origin, to separate and cluster subtypes of the HIV-1 virus, and to classify DNA fragments to species of origin. AMI profiles of DNA sequences prove to be species-specific and easy to compute. The structure of AMI profiles is conserved, even in short subsequences of a species' genome, rendering a pervasive signature. This signature can be used to classify relatively short DNA fragments to species of origin.

  • The use of Average Mutual Information profile as a species signature
    Data Compression Conference, 2005
    Co-Authors: Mark Bauer, Sheldon M Schuster, Khalid Sayood
    Abstract:

    Two sets of figures are presented without discussion. The first set shows: (1a) the Average Mutual Information profile for the human chromosomes, plotted for values of k between 5 and 50; and (1b) the Average Mutual Information profile for the mouse chromosomes, plotted for values of k between 5 and 50. The second set of figures shows: (2a) the Average Mutual Information profile for the C. elegans chromosomes, plotted for values of k between 5 and 50; and (2b) the Average Mutual Information profile for the S. cerevisiae chromosomes, plotted for values of k between 5 and 50.