Outage Event


The Experts below are selected from a list of 1,083 Experts worldwide, ranked by the ideXlab platform

Neri Merhav - One of the best experts on this subject based on the ideXlab platform.

  • Trading off Weak–Noise Estimation Performance and Outage Exponents in Nonlinear Modulation
    2019 IEEE International Symposium on Information Theory (ISIT), 2019
    Co-Authors: Neri Merhav
    Abstract:

    We consider the problem of modulating a parameter onto a power-limited signal, transmitted over a discrete-time Gaussian channel, and estimating this parameter at the receiver. Considering the well-known threshold effect in non-linear modulation systems, our approach is the following: instead of deriving upper and lower bounds on the total estimation error, which weigh both weak-noise errors and anomalous errors beyond the threshold, we separate the two kinds of errors. In particular, we derive upper and lower bounds on the best achievable trade-off between the exponential decay rate of the weak-noise expected error cost and the exponential decay rate of the probability of the anomalous error event, also referred to as the outage event. This outage event is left to be defined as part of the communication system design problem. Our achievability scheme, which is based on lattice codes, meets the lower bound in the high signal-to-noise ratio (SNR) limit and for a certain range of trade-offs between the weak-noise error cost and the outage exponent.


  • Tradeoffs Between Weak-Noise Estimation Performance and Outage Exponents in Nonlinear Modulation
    IEEE Transactions on Information Theory, 2019
    Co-Authors: Neri Merhav
    Abstract:

    We focus on the problem of modulating a parameter onto a power-limited signal transmitted over a discrete-time Gaussian channel and estimating this parameter at the receiver. Considering the well-known threshold effect in non-linear modulation systems, our approach is the following: instead of deriving upper and lower bounds on the total estimation error, which weigh both weak-noise errors and anomalous errors beyond the threshold, we separate the two kinds of errors. In particular, we derive upper and lower bounds on the best achievable tradeoff between the exponential decay rate of the weak-noise expected error cost and the exponential decay rate of the probability of the anomalous error event, also referred to as the outage event. This outage event is left to be defined as part of the communication system design problem. Our achievability scheme, which is based on lattice codes, meets the lower bound in the high signal-to-noise ratio (SNR) limit and for a certain range of tradeoffs between the weak-noise error cost and the outage exponent.
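    The threshold effect behind this separation of errors can be illustrated with a small Monte Carlo experiment. The sketch below is a toy illustration only, not the paper's lattice-code scheme: the frequency-estimation setup, SNR, grid, and outage threshold are all assumptions. A parameter is modulated as the frequency of a sinusoid, estimated by correlation, and each trial's error is tallied either as a weak-noise error (within the main lobe) or as an anomalous outage event.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, snr_db, trials = 64, 8.0, 400
    sigma = 10 ** (-snr_db / 20)              # noise std for a unit-amplitude signal
    t = np.arange(n)
    grid = np.linspace(0.05, 0.45, 2000)      # candidate normalized frequencies
    S = np.cos(2 * np.pi * np.outer(grid, t)) # dictionary of candidate signals

    u_true = 0.25
    s_true = np.cos(2 * np.pi * u_true * t)
    delta = 1.0 / n                           # outage threshold: main-lobe width

    weak_sq, outages = [], 0
    for _ in range(trials):
        y = s_true + sigma * rng.standard_normal(n)
        u_hat = grid[np.argmax(S @ y)]        # correlation (ML-style) estimate
        err = abs(u_hat - u_true)
        if err > delta:
            outages += 1                      # anomalous error: outage event
        else:
            weak_sq.append(err ** 2)          # weak-noise error

    print("outage probability:", outages / trials)
    print("weak-noise RMSE:", np.sqrt(np.mean(weak_sq)))
    ```

    Lowering the SNR in this sketch makes sidelobe locks (outage events) dominate the total error, which is exactly why the abstract argues for bounding the two error types separately.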

  • Trade-offs Between Weak-Noise Estimation Performance and Outage Exponents in Nonlinear Modulation
    arXiv: Information Theory, 2018
    Co-Authors: Neri Merhav
    Abstract:

    We focus on the problem of modulating a parameter onto a power-limited signal transmitted over a discrete-time Gaussian channel and estimating this parameter at the receiver. Considering the well-known threshold effect in non-linear modulation systems, our approach is the following: instead of deriving upper and lower bounds on the total estimation error, which weigh both weak-noise errors and anomalous errors beyond the threshold, we separate the two kinds of errors. In particular, we derive upper and lower bounds on the best achievable trade-off between the exponential decay rate of the weak-noise expected error cost and the exponential decay rate of the probability of the anomalous error event, also referred to as the outage event. This outage event is left to be defined as part of the communication system design problem. Our achievability scheme, which is based on lattice codes, meets the lower bound in the high signal-to-noise ratio (SNR) limit and for a certain range of trade-offs between the weak-noise error cost and the outage exponent.

Andreas F. Molisch - One of the best experts on this subject based on the ideXlab platform.

  • The Throughput-Outage Tradeoff of Wireless One-Hop Caching Networks
    IEEE Transactions on Information Theory, 2015
    Co-Authors: Mingyue Ji, Giuseppe Caire, Andreas F. Molisch
    Abstract:

    We consider a wireless device-to-device (D2D) network where the nodes have pre-cached information from a library of available files. Nodes request files at random. If the requested file is not in the on-board cache, then it is downloaded from some neighboring node via one-hop local communication. An outage event occurs when a requested file is not found in the neighborhood of the requesting node, or if the network admission control policy decides not to serve the request. We characterize the optimal throughput-outage tradeoff in terms of tight scaling laws for various regimes of the system parameters, when both the number of nodes and the number of files in the library grow to infinity. Our analysis is based on Gupta and Kumar's protocol model for the underlying D2D wireless network, widely used in the literature on capacity scaling laws of wireless networks without caching. Our results show that the combination of D2D spectrum reuse and caching at the user nodes yields a per-user throughput independent of the number of users, for any fixed outage probability in (0, 1). This implies that the D2D caching network is scalable: even though the number of users increases, each user achieves constant throughput. This behavior is very different from the classical Gupta and Kumar result on ad hoc wireless networks, for which the per-user throughput vanishes as the number of users increases. Furthermore, we show that the user throughput is directly proportional to the fraction of cached information over the whole file library size. Therefore, we can conclude that D2D caching networks can turn memory into bandwidth (i.e., doubling the on-board cache memory on the user devices yields a 100% increase of the user throughput).
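    The outage event defined above (a request that cannot be served from any one-hop cache) is easy to estimate by simulation. The sketch below is a simplified illustration, not the paper's scaling-law analysis: the Zipf popularity exponent, cluster size, cache size, and independent popularity-biased caching are all assumptions, and the throughput side of the tradeoff is ignored.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_nodes, library, cache_size = 200, 50, 2  # network/library sizes (assumed)
    cluster = 10                               # nodes reachable in one hop (assumed)
    gamma = 0.6                                # Zipf popularity exponent (assumed)
    p = np.arange(1, library + 1, dtype=float) ** -gamma
    p /= p.sum()

    # each node independently caches `cache_size` distinct files, popularity-biased
    caches = [set(rng.choice(library, size=cache_size, replace=False, p=p))
              for _ in range(n_nodes)]

    trials, outages = 2000, 0
    for _ in range(trials):
        want = rng.choice(library, p=p)                          # requested file
        hood = rng.choice(n_nodes, size=cluster, replace=False)  # one-hop cluster
        if not any(want in caches[v] for v in hood):
            outages += 1                       # outage event: no local copy found

    print("empirical outage probability:", outages / trials)
    ```

    Increasing `cache_size` or `cluster` in this toy model drives the outage probability down, mirroring the abstract's point that on-board memory substitutes for bandwidth.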

  • The Throughput-Outage Tradeoff of Wireless One-Hop Caching Networks
    arXiv: Information Theory, 2013
    Co-Authors: Mingyue Ji, Giuseppe Caire, Andreas F. Molisch
    Abstract:

    We consider a wireless device-to-device (D2D) network where the nodes have pre-cached information from a library of available files. Nodes request files at random. If the requested file is not in the on-board cache, then it is downloaded from some neighboring node via one-hop "local" communication. An outage event occurs when a requested file is not found in the neighborhood of the requesting node, or if the network admission control policy decides not to serve the request. We characterize the optimal throughput-outage tradeoff in terms of tight scaling laws for various regimes of the system parameters, when both the number of nodes and the number of files in the library grow to infinity. Our analysis is based on Gupta and Kumar's protocol model for the underlying D2D wireless network, widely used in the literature on capacity scaling laws of wireless networks without caching. Our results show that the combination of D2D spectrum reuse and caching at the user nodes yields a per-user throughput independent of the number of users, for any fixed outage probability in (0, 1). This implies that the D2D caching network is "scalable": even though the number of users increases, each user achieves constant throughput. This behavior is very different from the classical Gupta and Kumar result on ad-hoc wireless networks, for which the per-user throughput vanishes as the number of users increases. Furthermore, we show that the user throughput is directly proportional to the fraction of cached information over the whole file library size. Therefore, we can conclude that D2D caching networks can turn "memory" into "bandwidth" (i.e., doubling the on-board cache memory on the user devices yields a 100% increase of the user throughput).

Jung Ryul Yang - One of the best experts on this subject based on the ideXlab platform.

  • Optimal relaying strategy for UE relays
    The 17th Asia Pacific Conference on Communications, 2011
    Co-Authors: Jung Ryul Yang
    Abstract:

    In cooperative cellular wireless networks, a user equipment (UE) relay can be a good alternative for serving an end user, by forwarding the signal overheard from the source to the end user. However, owing to a practical limitation, a UE relay has no cell-specific reference signal; therefore, unlike a conventional fixed relay station, channel state information (CSI) is not available to the UE relay. Since the UE relay cannot estimate the relay-to-destination (R-D) channel capacity, an outage event may occur in the cooperating phase. In this paper, we introduce outage-based rate control for multiple-relay networks in which there are one source, multiple UE relays, and one end user. To achieve the maximal overall transmission rate, resource allocation for the cooperating phase is done at the source in advance, by considering both the expected transmission rate and the resulting outage probability. In particular, we consider three relaying strategies for such open-loop R-D links and observe that they complement each other depending on the R-D link geometry and channel conditions, leading to an optimal relaying strategy for UE relays.
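    The core tension in this abstract — choosing a rate in advance by weighing expected throughput against the outage probability it induces — can be sketched for a single open-loop R-D link. The sketch below is an assumption-laden toy, not the paper's multi-relay scheme: it assumes Rayleigh fading with a known mean SNR, for which the outage probability of rate R has the standard closed form 1 - exp(-(2^R - 1)/SNR).

    ```python
    import numpy as np

    snr = 10.0                               # mean R-D link SNR, linear (assumed)
    rates = np.linspace(0.1, 6.0, 600)       # candidate rates, bits/s/Hz (assumed)

    # Rayleigh fading: outage when log2(1 + snr * h) < R, with h ~ Exp(1),
    # so P_out(R) = 1 - exp(-(2^R - 1) / snr)
    p_out = 1.0 - np.exp(-(2.0 ** rates - 1.0) / snr)

    # expected delivered rate: attempted rate times success probability
    goodput = rates * (1.0 - p_out)
    best = int(np.argmax(goodput))

    print(f"rate {rates[best]:.2f} b/s/Hz -> outage {p_out[best]:.3f}, "
          f"goodput {goodput[best]:.2f} b/s/Hz")
    ```

    A rate that is too aggressive is nearly always in outage, while a timid one wastes the channel; the source picks the rate at the peak of this curve, which is the spirit of the outage-based rate control described above.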

Luigi Paura - One of the best experts on this subject based on the ideXlab platform.

  • Mobile Smart Grids: Exploiting the TV White Space in Urban Scenarios
    IEEE Access, 2016
    Co-Authors: Angela Sara Cacciapuoti, Marcello Caleffi, Francesco Marino, Luigi Paura
    Abstract:

    Due to its attractive characteristics, the TV white space (TVWS) spectrum is considered the ideal candidate to enable the deployment of smart grid networks (SGNs) via the cognitive radio paradigm. However, the intermittent availability of the TVWS spectrum, as well as its scarcity in urban scenarios, could compromise the tight smart grid requirements in terms of reliability, latency, and data rate. This degradation could be even more severe when mobile grid nodes, e.g., electric vehicles, are considered. Stemming from this, we first develop an analytical framework to account for mobility in SG scenarios. Then, we design a switching procedure based on the use of two different bands: the TVWS spectrum and the Industrial, Scientific, and Medical (ISM) spectrum. The switching procedure selects, among the available spectrum bands, the one maximizing the achievable throughput at an arbitrary SGN. The procedure accounts for the presence of interfering SGNs on the TVWS spectrum through both their traffic and mobility patterns. By wisely using both the ISM and the TVWS spectrum, the proposed switching procedure is able to 1) increase the achievable data rate and 2) reduce the outage event rate, improving the reliability and the latency of the smart grid communications. Moreover, we show that the performance of the proposed switching procedure depends largely on the time devoted to sensing; hence, the proper setting of this parameter is critical for the performance of any SGN. To this end, we derive an optimization criterion that maximizes the throughput under the constraint of bounding the outage rate. The theoretical analysis is validated through extensive numerical simulations.
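    The sensing-time tradeoff mentioned at the end can be sketched numerically: spending more of each frame sensing makes TVWS detection more reliable but leaves less time to transmit. The sketch below is a toy model, not the paper's framework: the frame length, per-band rates, idle probability, and the exponential sensing-reliability curve are all assumptions.

    ```python
    import numpy as np

    T = 100e-3                  # frame length in seconds (assumed)
    C_tvws, C_ism = 20.0, 6.0   # achievable rates per band, Mb/s (assumed)
    p_idle = 0.7                # probability the TVWS channel is free (assumed)
    k = 500.0                   # sensing-reliability constant (toy model)

    taus = np.linspace(1e-4, 30e-3, 300)   # candidate sensing times
    p_fa = np.exp(-k * taus)               # false-alarm prob. shrinks with sensing

    # TVWS throughput: transmit fraction x prob. the band is free and usable
    thr_tvws = (T - taus) / T * p_idle * (1.0 - p_fa) * C_tvws
    best = int(np.argmax(thr_tvws))
    band = "TVWS" if thr_tvws[best] > C_ism else "ISM"

    print(f"sensing time {taus[best]*1e3:.1f} ms, "
          f"TVWS throughput {thr_tvws[best]:.1f} Mb/s -> switch to {band}")
    ```

    The interior maximum in `thr_tvws` is why the sensing time must be tuned rather than simply minimized, and the final comparison against the ISM rate mirrors the band-switching decision described in the abstract.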
