Broadcast Television

The experts below are selected from a list of 360 experts worldwide, ranked by the ideXlab platform.

Jon M. Peha - One of the best experts on this subject based on the ideXlab platform.

  • On the Trade-off Between Spectrum Efficiency and Transmission Cost in Traditional and SFN-Based Broadcast Television
    IEEE International Symposium on Dynamic Spectrum Access Networks, 2015
    Co-Authors: Rolando Bettancourt, Jon M. Peha
    Abstract:

    This paper evaluates two alternatives to today's noise-limited single-transmitter broadcast television systems, either of which would improve spectrum efficiency. One alternative is to increase the transmit power of each broadcaster's only transmitter. The other is to replace that single transmitter with a multi-transmitter single frequency network (SFN). For both approaches, the paper calculates the relationship between the maximum achievable spectrum efficiency and the cost required to achieve it, as a function of the most important design parameters. Results suggest that increasing the power of traditional single-transmitter broadcasters could reduce the amount of spectrum needed for TV by roughly 30%, and would be cost-effective today through much of the U.S. A switch to SFNs could reduce the amount of spectrum needed for TV by roughly 60%, but at a higher cost. Results suggest that the SFN approach could be cost-effective in the most densely populated regions, where spectrum is most valuable.
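The rough percentages above translate into concrete spectrum figures with a toy calculation. This is only an illustrative sketch: the baseline spectrum figure is a hypothetical assumption, not a value from the paper.

```python
# Toy illustration of the spectrum savings quoted in the abstract.
# BASELINE_SPECTRUM_MHZ is hypothetical (roughly 49 channels of
# 6 MHz each); only the savings percentages come from the abstract.

BASELINE_SPECTRUM_MHZ = 294

scenarios = {
    "today (noise-limited single transmitter)": 0.00,
    "higher-power single transmitter":          0.30,  # ~30% savings
    "single frequency network (SFN)":           0.60,  # ~60% savings
}

for name, saved in scenarios.items():
    needed = BASELINE_SPECTRUM_MHZ * (1 - saved)
    print(f"{name}: ~{needed:.0f} MHz still needed for TV")
```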

  • Spectrum management policy options
    IEEE Communications Surveys, 1998
    Co-Authors: Jon M. Peha
    Abstract:

    As market-based reform sweeps telecommunications industries around the world, it is a good time to reevaluate the spectrum management policies that govern wireless industries ranging from broadcast television to satellite communications. Most countries have been using a central-planning approach to spectrum management, but there are many alternatives with varying degrees of flexibility and market-based incentives. This paper provides a survey of spectrum management approaches, addressing methods of determining how spectrum can be used, which commercial entities can use it, and how governments can manage their own spectrum. It identifies some of the crucial choices to be made and summarizes the advantages and disadvantages of each.

Michael J. Marcus - One of the best experts on this subject based on the ideXlab platform.

Rob Frieden - One of the best experts on this subject based on the ideXlab platform.

  • The Evolving 5G Case Study in Spectrum Management and Industrial Policy
    Telecommunications Policy, 2019
    Co-Authors: Rob Frieden
    Abstract:

    Even before the International Telecommunication Union (“ITU”) reaches consensus on spectrum allocations for fifth-generation wireless technologies (“5G”), commercial ventures scramble to offer new equipment and services. Expediting 5G wireless service can benefit consumers and businesses, at the risk of stressing the traditional process of spectrum planning that combines study, dialogue, and consensus building at ITU conferences. This paper explains why most nations refused to endorse key United States 5G spectrum allocation proposals at the ITU's 2015 World Radio Conference. U.S. representatives underestimated the time needed for consensus building, despite increasing demand for wireless video and the evolving Internet of Things. Other contributing factors include U.S. support for treating spectrum like property, the use of “incentive auctions” to clear broadcast television spectrum with unprecedented speed, and sufficient existing wireless spectrum allocations in most nations. The paper concludes that the U.S. cannot expect faster frequency reallocations, particularly when it and other nations pursue matters having little to do with spectrum optimization. The paper offers recommendations on best practices for improving the consensus-building process.

  • Ancillary to What? The FCC's Mixed Record in Expanding Its Regulatory Reach Without Explicit Statutory Authority
    2012
    Co-Authors: Rob Frieden
    Abstract:

    For Internet-based services, such as retail broadband, the Federal Communications Commission (“FCC”) has evidenced great ambivalence over the scope of its jurisdiction and the need for its regulatory intervention. On one hand, the Commission decided to apply the information service statutory classification, which triggers little if any regulatory authority. Whether by credible empirical evidence or by flawed assumptions and projections about marketplace competition, the Commission initially expressed confidence that self-regulation could remedy most anticompetitive practices. The FCC subsequently regretted its broad, sweeping deregulatory initiative in light of complaints about the discriminatory practices of some Internet Service Providers (“ISPs”) and newfound concerns about the viability of broadband access competition. Having exempted Internet access technologies from the requirements established in Title II of the Communications Act, the FCC subsequently attempted to invoke necessary statutory authority based on Title I of the Act. This strategy of establishing “ancillary jurisdiction” uses an indirect method whereby the FCC extrapolates statutory authority when explicit statutory authority does not exist. The Commission has generated a mixed record in convincing courts that such indirect authority exists.

    This paper will examine cases where the FCC has successfully convinced appellate courts that ancillary jurisdiction exists and cases where the Commission has failed. In the former, the FCC lawfully extended its jurisdiction to include cable television, even in the absence of enabling legislation, based on an analogy. The Commission argued successfully in United States v. Southwestern Cable Co., 392 U.S. 157 (1968), that because it had direct statutory authority to regulate broadcast television under Title III of the Act, and because cable television had the potential to impact the viability of “free” advertiser-supported broadcast television, the Commission had ancillary jurisdiction to establish rules and regulations to curb the market-fragmenting impact of cable television. Recently the FCC convinced an appellate court that it could and should apply duties to deal to information service providers of mobile wireless data services, so long as the requirements do not constitute common carriage. The FCC also has achieved success in applying ancillary jurisdiction to Voice over the Internet Protocol (“VoIP”) companies and now imposes many regulatory requirements previously applied solely to telecommunications service providers. Having refrained from specifying whether VoIP providers offer telecommunications services or information services, the Commission nevertheless invoked ancillary jurisdiction. Reviewing courts have affirmed the FCC's jurisdictional claims largely based on the sense that VoIP competes with, and constitutes a technological alternative to, dial-up telephone service.

    On the other hand, a reviewing court in Comcast Corp. v. FCC, 600 F.3d 642 (D.C. Cir. 2010), refused to accept the FCC's assertion that it could lawfully stretch ancillary jurisdiction to include ventures that it previously had classified as ISPs. In this instance the court did not accept that, because the FCC has general statutory authority over “wire and radio” in Title I of the Communications Act, the Commission can extend its regulatory reach to include information services simply because ISPs use wire and radio to provide service. The FCC similarly failed in American Library Association v. FCC, 406 F.3d 689 (D.C. Cir. 2005), to convince a reviewing court that broadcast television jurisdiction included authorization to require television set manufacturers to construct sets capable of processing copyright protection instructions.

    This paper will identify what circumstances favor and disfavor the FCC's attempts to invoke ancillary jurisdiction. The paper concludes that the Commission will have greater difficulty securing judicial approval of Title I ancillary jurisdiction in instances where it previously determined that it lacked direct statutory authority to act. Even though the perceived need to intervene shows that the FCC miscalculated the sufficiency of marketplace self-regulation, the Commission cannot easily convince courts that statutory definitions are so pliable that the FCC can toggle between two categories. However, consumers increasingly rely on Internet access as the primary or exclusive medium for access to both telecommunications and information services. Disputes over what ISPs must do to serve the public interest likely will increase, and the FCC will not abandon efforts to provide solutions. The paper will consider what direct or ancillary authority the Commission can muster in light of Verizon v. FCC, __ F.3d __, No. 11-1355 (D.C. Cir. 2014), which validated the lawfulness of some oversight derived from Section 706 of the Communications Act, which authorizes the FCC to promote broadband access.

  • Internet Packet Sniffing and Its Impact on the Network Neutrality Debate and the Balance of Power Between Intellectual Property Creators and Consumers
    Fordham Intellectual Property Media & Entertainment Law Journal, 2007
    Co-Authors: Rob Frieden
    Abstract:

    When Internet Service Providers (“ISPs”) serve as neutral conduits, they qualify for a safe-harbor exemption from liability for carrying copyright-infringing traffic, provided by Section 512 of the Digital Millennium Copyright Act. However, ISPs now want to operate non-neutral networks capable of offering “better than best efforts” routing and premium services for both content providers and consumers seeking higher quality of service and more reliable traffic delivery. The ability to inspect specific packet streams enables ISPs to identify traffic type and routing priority, and gives them a greater ability to determine copyright compliance. The debate about Internet neutrality has largely ignored whether ISPs risk losing safe harbors from copyright infringement when they actively manage their networks to offer tiered services. This paper will assess non-neutral network operation in terms of its impact on intellectual property rights, including consumers' fair use opportunities. The paper will assess whether and how ISPs might lose their safe harbor from copyright infringement liability based on new technological means of knowing about the content they carry. Additionally, the paper will consider whether ISPs have an affirmative duty to conduct packet inspection absent a legislative mandate. The paper also will examine litigation over mandatory processing of broadcast television “flags,” which specify consumer use options but require equipment processing on user premises. The paper concludes that ISPs' regulatory status as information service providers does not provide an absolute exemption from responsibilities to examine the content they carry and to provide reasonable safeguards for protecting copyrights. However, such affirmative efforts to operate a non-neutral network may impose greater burdens on ISPs to protect creators' intellectual property rights, with a likely reduction of consumers' fair use opportunities.

Alec C. Tefertiller - One of the best experts on this subject based on the ideXlab platform.

  • With or without you: Connected viewing and co-viewing Twitter activity for traditional appointment and asynchronous broadcast television models
    First Monday, 2015
    Co-Authors: Matthew Pittman, Alec C. Tefertiller
    Abstract:

    Social networking services like Twitter have changed the way people engage with traditional broadcast media. But how social is “second screen” activity? The purpose of this study is to determine whether patterns of connected viewing (augmenting television consumption with a second screen) and co-viewing (watching television together) differ for traditionally broadcast, “appointment” television shows versus streaming, asynchronous television releases. This study explores the phenomenon of “co-connected viewing” (a combination of connected viewing and co-viewing) on Twitter for four programs that were all released within seven days of each other: Parks and Recreation, Downton Abbey, House of Cards, and Unbreakable Kimmy Schmidt. Complete datasets (over 200,000 tweets) from 72 hours' worth of Twitter activity for the four television programs, two traditional and two streaming, were collected and analyzed. In terms of co-connected viewing, the study found that despite radically different broadcast models and correspondingly different shapes in Twitter activity, the ratios of social to non-social tweets were nearly identical. Additionally, the study found that the asynchronous, streaming Netflix shows saw more engagement from active Twitter users. Finally, implications are discussed for viewers, fans, advertisers, and the television industry, as well as directions for future research.
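The study's central measurement, the ratio of social to non-social tweets, can be sketched in a few lines. The keyword-based classifier and the sample tweets below are hypothetical stand-ins, not the study's actual coding scheme or data:

```python
# Minimal sketch of a social-vs-non-social tweet ratio, in the spirit
# of the co-connected-viewing measure described above. The heuristic
# classifier and sample tweets are illustrative assumptions only.

def is_social(tweet: dict) -> bool:
    """Hypothetical rule: a tweet is 'social' if it addresses or
    involves other people (mention, retweet, or watching 'with')."""
    text = tweet["text"]
    return text.startswith("@") or " with " in text.lower() or "RT " in text

def social_ratio(tweets) -> float:
    """Fraction of tweets classified as social."""
    return sum(is_social(t) for t in tweets) / len(tweets)

# Hypothetical samples for an appointment show and a streaming show.
appointment = [{"text": "@friend watching Downton Abbey!"},
               {"text": "Great episode tonight."}]
streaming   = [{"text": "Binge night with the roommates"},
               {"text": "Kimmy Schmidt is hilarious."}]

print(social_ratio(appointment), social_ratio(streaming))  # prints: 0.5 0.5
```

In this toy example the two ratios come out identical, which is exactly the kind of comparison the study ran at scale over 200,000 real tweets.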

Winston M. Caldwell - One of the best experts on this subject based on the ideXlab platform.

  • An Overview of the ATSC 3.0 Physical Layer Specification
    IEEE Transactions on Broadcasting, 2016
    Co-Authors: L. B. Michael, David Gomez-Barquero, Nejib Ammar, Winston M. Caldwell
    Abstract:

    This paper provides an overview of the physical layer specification of Advanced Television Systems Committee (ATSC) 3.0, the next-generation digital terrestrial broadcasting standard. ATSC 3.0 has no backwards-compatibility constraint with existing ATSC standards; it uses orthogonal frequency division multiplexing (OFDM)-based waveforms along with powerful low-density parity check (LDPC) forward error correction codes, similar to the existing state of the art. However, it introduces many new technological features, such as two-dimensional non-uniform constellations; improved and ultra-robust LDPC codes; power-based layered division multiplexing to efficiently provide mobile and fixed services in the same radio frequency (RF) channel; and a novel frequency pre-distortion multiple-input single-output antenna scheme. ATSC 3.0 also allows the bonding of two RF channels to increase the service peak data rate and exploit inter-RF-channel frequency diversity, and it supports a dual-polarized multiple-input multiple-output antenna system. Furthermore, ATSC 3.0 provides great flexibility in its configuration parameters (e.g., 12 coding rates, 6 modulation orders, 16 pilot patterns, 12 guard intervals, and 2 time interleavers), as well as a very flexible data multiplexing scheme using the time, frequency, and power dimensions. As a consequence, ATSC 3.0 not only improves spectral efficiency and robustness well beyond the first-generation ATSC broadcast television standard, but is also positioned to become the reference terrestrial broadcasting technology worldwide due to its unprecedented performance and flexibility. Another key aspect of ATSC 3.0 is its extensible signaling, which will allow new technologies to be included in the future without disrupting ATSC 3.0 services.

    The paper covers the ATSC A/321 standard, which describes the so-called bootstrap, the universal entry point to an ATSC 3.0 signal, and the ATSC A/322 standard, which describes the physical layer downlink signals after the bootstrap. A summary comparison between ATSC 3.0 and DVB-T2 is also provided.
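The parameter counts listed in the abstract imply a large nominal configuration space, which a short sketch makes explicit. The dictionary keys are our own names, not identifiers from the ATSC A/322 standard, and real configurations are constrained combinations, not a free product:

```python
# Nominal ATSC 3.0 physical-layer configuration space, using the
# parameter counts quoted in the abstract. Names are illustrative;
# the standard's actual parameter tables (ATSC A/322) are richer and
# not all combinations are valid together.

PARAM_CHOICES = {
    "ldpc_code_rates":   12,
    "modulation_orders":  6,
    "pilot_patterns":    16,
    "guard_intervals":   12,
    "time_interleavers":  2,
}

combinations = 1
for count in PARAM_CHOICES.values():
    combinations *= count

print(f"{combinations:,} nominal parameter combinations")  # 27,648
```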

  • Broadcast Television Spectrum Incentive Auctions in the U.S.: Trends, Challenges and Opportunities
    IEEE Communications Magazine, 2015
    Co-Authors: David Gomez-Barquero, Winston M. Caldwell
    Abstract:

    This article presents an overview of the upcoming television broadcast spectrum incentive auction in the U.S., the first ever attempted worldwide, and discusses the main business, regulatory, and technical challenges of a successful incentive auction. The process combines two separate but linked auctions: a reverse auction, which will identify the prices at which broadcasters are willing to relinquish their spectrum; and a forward auction, which will determine the price mobile network operators are willing to pay to acquire the new frequencies. Together, the two auctions will determine the buyers and sellers as well as the amount of spectrum to be cleared in the 600 MHz band after the television stations that remain on air are reorganized. This process, known as repacking, will create contiguous blocks of cleared spectrum at the high-frequency side of the UHF band for mobile use. The article also reviews the potential band plans for the 600 MHz band and discusses the opportunities that the new digital terrestrial television standard known as “ATSC 3.0” could bring about.
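The reverse/forward auction linkage described above can be sketched with a toy clearing rule. All numbers below are illustrative, and the FCC's actual mechanism (a descending-clock reverse auction with repacking feasibility checks and staged clearing targets) is far more complex:

```python
# Toy sketch of the two-sided incentive-auction logic: spectrum
# clears only where forward-auction revenue covers the reverse-
# auction cost of relinquishment. All dollar figures are invented.

# Broadcasters' asking prices ($M) to relinquish one 6 MHz channel,
# cheapest sellers first.
reverse_bids = sorted([120, 80, 200, 60, 150])

# Carriers' willingness to pay ($M) per cleared channel, highest first.
forward_bids = sorted([210, 180, 140, 90, 50], reverse=True)

cleared = 0
for ask, bid in zip(reverse_bids, forward_bids):
    if bid >= ask:      # forward revenue covers this channel's cost
        cleared += 1
    else:               # marginal channel no longer pays for itself
        break

print(f"channels cleared: {cleared}")  # channels cleared: 3
```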