In 2010, the Obama administration mandated that the federal government consolidate wireless communications for government usage. Over the next 10 years, the US National Telecommunications and Information Administration (NTIA) and the Federal Communications Commission (FCC) must work to free up 500MHz of spectrum to be allocated for public wireless data consumption. This is welcome news: wireless spectrum is finite, and we should strive to use it as efficiently as possible.
Two years after the mandate, the efforts of the NTIA and FCC are now coming to fruition as a recent proposal was put forth to free up 95MHz in the highly valuable 1755-1850MHz Advanced Wireless Services (AWS) band. These frequencies could be put to use for next generation wireless technologies. Many wireless providers are interested in acquiring the freed frequencies when they eventually go up for auction. But the question is, will they be used efficiently or simply as a chess piece in the battle for wireless telecommunications dominance?
The big carriers such as AT&T and Verizon regularly talk about the lack of available spectrum, which contributes to nationwide rollout delays for next-generation wireless. What is rarely discussed is that the carriers with the deepest pockets will gobble up all the available frequencies at auction while having no immediate way to utilize them. Instead, the spectrum will be purchased simply so competitors can't use it.
This past February, T-Mobile filed a petition with the FCC to block Verizon
from purchasing another block of spectrum from a smaller company. Owning wireless spectrum is power, and T-Mobile seems to have a good point here, as the other major carrier, AT&T, has been accused of hoarding spectrum in the past. While the FCC has "buildout rules" in place to prevent hoarding, many believe they are not restrictive enough and are nearly impossible to enforce.
So perhaps wireless spectrum auctions that essentially go to the highest bidder aren't the best idea if we want to get the most out of our wireless spectrum. The current system is skewed toward larger companies. While frequencies will likely get used at some point by the big providers, it's the smaller carriers that are likely to use them first. After all, smaller companies often leverage first-mover advantage to carve out a niche market. A perfect example is Clearwire, which built out a 4G network far in advance of the larger carriers.
If Americans want a truly free-market system in terms of wireless carriers, it's important that smaller companies are protected from being squeezed out. Wireless spectrum is an incredibly rare commodity and we have a responsibility to allocate frequencies to companies that truly have a use for it. Let's not forget that wireless spectrum is all around us and never owned by any one corporation. It's up to our elected officials to develop the best method for renting spectrum to private carriers. If we don't like how things are being handled, it's our obligation to let the government know.
@Andrew- That plan might be good for consumers in the short run, as carriers would likely encourage more usage to keep their spectrum. But isn't there an actual shortage that we'll bump into in the long run? Aren't we just setting up expectations that we won't be able to meet as demand grows?
@Dave, What about the idea that a company can't flip the spectrum to another company? If they don't use it, the government keeps the money and the frequencies go back up for sale. It won't solve the whole problem, but it might help.
A slightly different take on this subject: carriers may be creating artificial scarcity of capacity, which would be another reason they're dragging their feet on using the spectrum they've already been allocated.
Seems like what we need is a federal agency to monitor how these frequencies are being used and dole them out according to need. Oh wait, we have one. It's just mired in politics and ruined by corporate lobbying.
Can anyone point to a system working well at this?