The bandwidth bottleneck that is throttling the Internet (nature.com)
60 points by okket on Aug 10, 2016 | 53 comments


This seems like as good a place as any to complain that auctioning off radio frequencies to the highest bidder is unlikely to result in optimal (or nearly optimal) utilization of scarce resources, and is therefore working against the public interest.


In Sweden the 3G frequencies were allocated for free using a "beauty contest" method: potential operators submitted buildout and investment plans, and the regulator picked the ones with the most aggressive yet realistic plans. They also had to commit to minimum population-coverage requirements. The idea was that the money was better spent building the networks than sitting in government coffers.


Is there a land-value tax [1] equivalent for spectrum?

[1] https://en.wikipedia.org/wiki/Land_value_tax


The spectrum is paid for by the auction winners. A new auction happens yearly. Spectrum isn't traded in the market like land. It's leased by the government on a yearly contract. So no tax is needed.


Is every band of spectrum re-auctioned every year? How do companies make multi-year investments?


If you had a new auction every year nobody would put money into infrastructure development. Mobile providers spend billions every year maintaining and upgrading networks - there's no way they would do that if it was possible to lose the spectrum they were using.


That sounds like an implementation of a 100% LV tax on radio frequencies.


It's public ownership + periodically auctioned leases, so it's hard to calculate an LV-rate equivalent. The closest arrangement I can think of with respect to land would be Chinese real estate, where "title" is legally a 70-year lease from the government.


Optimal for whom? What is an optimal use of spectrum for me will probably not be optimal for most people.


"and is therefore working against the public interest."


Beyond allocations for government and emergency services, there's no such thing as a public interest. Different people have different interests.


I agree. I wonder if it would be sufficient to just allow companies to sue each other in cases of intentional interference, and otherwise allow free play. I suspect there is so much value in everyone cooperating that we would more or less sort it out, without needing to restrict who can operate at which frequencies by handing out frequency-band monopolies to the three highest-paying bidders.


That's basically how unlicensed wireless (e.g. 802.11 wireless ethernet) works, and it seems to be working okay. I think increased unlicensed spectrum would be great.

Another option would be something like a hybrid of the unlicensed and HAM bands, with more liberal output-power limits than unlicensed spectrum: you'd have to pass a test and get a proper (inexpensive) license to use the band, but you could do more or less whatever you want within it, as long as you identify your station, stay within the output-power limits, and don't intentionally interfere with anyone.


It does not work OK at all. Try being in any dense area, such as an urban apartment building: I can see 30 networks across all 11 wifi channels. It's totally unusable.


Is that an issue with unlicensed spectrum, or an issue with trying to provide 30 separate encrypted comms channels over a narrow bandwidth in a dense space?

Surely, increasing unlicensed spectrum would alleviate this by providing for more channels...


That was roughly how things were developing until sometime in the 1920s, according to an old paper in Law & Economics. By my quarter-century-old memory it claimed that kind of approach worked all right and got replaced after a scare campaign and power grab by Herbert Hoover and friends.


I wonder if it would still work in the current situation, though. People did not have 10+ radio transmitters at home or on them in the 1920s. Would we just replicate the music loudness war in radio waves?


We have it working now. What would change to make it not work if licensing was abandoned? A loudness war would leave people without connectivity. That's exactly the opposite of what everyone pays for, and what everyone wants. I don't see why we'd want to sabotage ourselves, just because it becomes easier to do.


Because without limits/licenses, company A uses the range 1-5Hz while company B uses 3-8Hz. Now A gets reports that B's equipment is interfering, and fixes that quickly by bumping up its power. A gets connectivity, B loses. Until B gets the reports...

It's not like everything will suddenly stop working. We'd just have a constant war of A vs B vs C vs ... For a similar situation, see the internet's buffer bloat: everyone goes "how do we make the connection faster? bigger buffers!" and now we all have crazy buffer sizes on every node in the path, and we all get crappy connections because of it. We do sabotage ourselves all the time in technology.


That's not how markets work. The war-mongering companies would quickly disappear, as they wouldn't be able to deliver service to their paying customers. The providers willing to work together, and actually make it work, would win over the clients of these other aggressive companies.

Companies that spend time sabotaging each other, rather than focusing on delivering service to their clients, deserve to die. And, in any sane economy, they will.


> That's not how markets work. [...] The providers willing to work together, and actually make it work, would win over the clients of these other aggressive companies.

Can you bring up some good examples of that? We've got the mentioned loudness war, patent trolls, insurance companies in a race to the bottom, people destroying the earth right now who will keep ignoring global warming wherever it's profitable, antibiotic resistance via over-prescription, internet buffer bloat, software size bloat, etc.

I love the idealistic view of the sane economy, but no, that's just not how markets work at the moment. Especially in this case: how exactly would any non-war-mongering company stay in business if the name of the game is "how wide/loud can you transmit"? It's not that companies are sabotaging each other. It's that if companies can make a better product on an unregulated RF market, they will. This will most likely get in someone else's way, and those in turn will have to upgrade as well. It's constant escalation on both sides.

If you're able to find one, you can compare the output spectrum of an RF toy that's FCC certified with one that's not. Guess which one bleeds noise across a wider spectrum around its intended frequency.


Yes, but the current economic system is not evolving in an idealistic world. You will have deception, bribery, and other coercion methods to push other players out of the market.


People collectively sabotage themselves all the time. It's called the tragedy of the commons.


Pardon my ignorance, but is there a reason we couldn't theoretically just build out a state-owned network and then lease access back to the operators? I would think it would cut down significantly on the infrastructure required, and create better competition.


Then the state would have to own the network. That's inconvenient and a bit prone to corruption.


The Free Market Knows Best.


What mechanism do you propose to more optimally utilize scarce resources? How do you measure degrees of need/want and fulfillment of those?


Do you have a better alternative?

Generally bureaucracies don't have a very good efficiency track record either.


I suspect that much of the demand for more bandwidth is really a demand for lower latency. It's not that there's a real need for 1Gb/s to the user. It's that the latency is too long under load. Some delay is being inserted by middle boxes near the end user with dumb FIFO queuing. This is sometimes called "bufferbloat", but it's not about how much RAM you have in the router, it's about big FIFO queues.

Just prioritizing empty ACKs in the upstream direction can help a lot. If you can't watch video while uploading a file, that's the problem.
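
To put rough numbers on it, here is a minimal Python sketch, with an assumed buffer size and uplink rate (not figures from the article), of how much latency one oversized FIFO adds:

    buffer_bytes = 256 * 1024  # assumed: a 256 KB FIFO in a home modem/router
    uplink_bps = 1_000_000     # assumed: a 1 Mb/s upstream link

    # Worst case, a newly arrived packet waits for the entire buffer
    # to drain at line rate before it leaves.
    delay_s = buffer_bytes * 8 / uplink_bps
    print(f"latency behind a full buffer: {delay_s:.1f} s")  # ~2.1 s

    # A 40-byte ACK at the back of that queue waits just as long, which
    # is why pulling ACKs to the front (or keeping queues short with
    # fq_codel-style queue management) restores interactivity.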


Good point. The utility of incremental bandwidth diminishes quickly. https://www.igvita.com/2012/07/19/latency-the-new-web-perfor...


Most bandwidth usage is ads and video. And bloated apps and web pages. What else needs much bandwidth?

Even HDTV only needs about 20Mb/s, and can be compressed down to 8Mb/s or so without much visible loss.
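
Back-of-the-envelope support for those figures, as a Python sketch assuming 1080p at 30 frames/s and 24 bits per pixel (my assumptions, not numbers from the thread):

    width, height, fps, bits_per_pixel = 1920, 1080, 30, 24  # assumed format

    raw_bps = width * height * fps * bits_per_pixel
    print(f"uncompressed: {raw_bps / 1e6:.0f} Mb/s")  # ~1493 Mb/s

    # Broadcast-style ~20 Mb/s and modern-codec ~8 Mb/s targets imply
    # roughly 75:1 and 187:1 compression of the raw stream.
    for target_bps in (20e6, 8e6):
        print(f"{target_bps / 1e6:.0f} Mb/s -> {raw_bps / target_bps:.0f}:1")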


Sending streaming lightfields for viewing in VR (for live sports etc.) is going to take every bit of bandwidth we can throw at it.


Yep, video/audio aside, most of the textual content and data we deal with is manageable on 56k or less.


Holographic video is about 1000000 times as bandwidth intensive as flat images. If we had a million times the bandwidth we would use it!


3d


For a fun look at current fundamental limits on bandwidth I found this wikipedia page [0]. Really puts things in perspective.

0. https://en.wikipedia.org/wiki/List_of_device_bit_rates#Bandw...


Google, FB, and Microsoft own cables, so I can see that it's data-intensive companies buying them. What does owning the cable do for them? Do they get to rent it out to other companies?


It's a lot cheaper, and you can do fancy things at little to no extra cost.

For example, in the UK, if you order a leased line at, say, 100 megs, you'll get some generic fibre blown into the building. That fibre, and the equipment to run it, are capable of at least 1 gig, if not more. However, your ISP will charge you a lot more for the full line speed. Why? Because they can; that, and upstream bandwidth costs more.

If you are interested in point-to-point, then getting semi-decent dark fibre allows you to do DWDM and get 40 gig per strand (most fibre cables are more than one strand).

A 32-fibre cable (most of the cost is the physical laying, not the fibre itself) would give you 1.28Tbps theoretical. All for the cost of installing one cable.

Also, owning your own cables means you can peer directly with other networks without having to go through a third party (in the EU and the rest of the world, at least; the US has some horrid monopoly system). This means that the cost of buying bandwidth drops dramatically.


Note, I haven't been in this world for a few years, so this info may be out of date

Actually, a heck of a lot more than that: WDM (wavelength-division multiplexing) can give you 40Gbps (or more, though cost per bit per second starts to increase substantially) _per wavelength_, with 40 waves per strand (again, more is possible, but the cost-effectiveness isn't as good).

So, 1.6Tbps per strand fairly cost-effectively, with more physically possible. That 32-strand cable could carry just over 51Tbps with a config like this.
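
The arithmetic, as a small Python sketch using the cost-effective figures above:

    gbps_per_wave = 40     # per-wavelength rate quoted above
    waves_per_strand = 40  # DWDM channel count quoted above
    strands = 32

    per_strand_tbps = gbps_per_wave * waves_per_strand / 1000
    per_cable_tbps = per_strand_tbps * strands
    print(f"{per_strand_tbps:.1f} Tb/s per strand, "
          f"{per_cable_tbps:.1f} Tb/s per cable")  # 1.6 and 51.2 Tb/s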

Upstream bandwidth isn't really expensive if you're selling to eyeballs (i.e. those who primarily consume services rather than provide them). However, as you said the fixed cost of laying fiber is non-trivial, as is the endpoint equipment to terminate all these waves and do something (route, etc.) with the transported data.


DWDM on commercial equipment is up to >24Tbps per fiber; long haul (trans-oceanic) is about 8-12Tbps. I haven't looked into the details yet, but earlier this year Nokia announced a system that can do 70Tbps per fiber.

New sub-sea builds are usually 6 fibers; some large-scale terrestrial builds run up to almost 8000 fibers (euNetworks did this across Europe).


Why not more fibers sub-sea? I would have guessed that when you're going to the expense of laying fiber across an ocean, including a fair amount of dark fiber would look like a good investment.


Repeaters and equalizers. They go up to 8 fiber pairs, and are probably the most expensive single component for long cables. You need separate amplifiers/equalizers for each fiber, so the cost depends strongly on the number of strands.

Since these repeaters are typically placed every 50-80km, there's a significant number of them in a long system, and powering them becomes a challenge. These cables usually operate around 10kV DC at 1-2 amps, and I'm not sure how realistic it would be to go higher. The resistive voltage drop along the wire (V = IR) probably already eats more than 70% of the voltage budget, and dissipation grows as I²R, so you can't really use more current. Higher voltages cause other issues such as dielectric breakdown: deep-ocean cables are actually pretty thin, about as thick as a garden hose, and I doubt you could easily push them past 15kV. You also have to account for the ground-potential shift between the continents, which may be as high as 1kV and depends on the weather.
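
A hedged sketch of that power budget in Python; the conductor resistance and span length are illustrative assumptions, not measured figures:

    length_km = 6000      # assumed: a trans-oceanic span
    ohms_per_km = 1.0     # assumed conductor resistance per km
    current_a = 1.0       # feed current, per the ~1-2 A above
    feed_v = 10_000       # ~10 kV DC from the shore ends

    resistance = ohms_per_km * length_km
    drop_v = current_a * resistance       # V = I * R
    loss_w = current_a ** 2 * resistance  # P = I^2 * R

    print(f"resistive drop: {drop_v / 1000:.0f} kV "
          f"({100 * drop_v / feed_v:.0f}% of the feed voltage), "
          f"dissipation: {loss_w / 1000:.0f} kW")

    # Doubling the current doubles the drop and quadruples the loss,
    # which is why adding fiber pairs (and their amplifiers) is hard.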


There is a limit to the amount of power that can be pushed through the cable on each end, and each strand of fiber needs a number of Erbium-doped fiber amplifiers, which consume a lot of power. This isn't a problem on land, as you can add power at each regen site.


Google, FB, and MSFT all essentially migrate huge parts of the internet between continents (Google Search Index, Gmail Inboxes, YouTube Videos, Bing Search Index, Outlook Inbox, Facebook Profiles, Photos, Videos, etc). This is a ton of data, and they want to keep cost low and bandwidth high. Getting in on the construction keeps the long-term costs down.


Given the headline, I thought it was going to say "Comcast."


The size of your pipe is not the speed of your pipe, something I think few people understand.

Better compression and less chatty protocols would certainly help with this issue, but the overall solution is using superconductors instead of plain wires.

I raise my glass in salute to the day we all have hydrogen superconductors as our pop.


> the Internet is still a global patchwork built on top of a century-old telephone system.

No, it's not.


Australia's newest, most high-end infrastructure for internet access, the NBN, is based on a mix of 100-year-old Telstra copper for the last mile (in reality the last 4 or 5 miles), 40-year-old HFC cable if you're lucky, and, if you happened to be really lucky and in an opposition-party area when it was originally planned, a little bit of fibre in some neighbourhoods. It's an abysmal abuse of the phrase 'cost-savings', as it's crippling any attempt to introduce new, more advanced services that need more than about 20mbit continuously...


It is in Germany.


And Denmark.

Of course, the core internet nodes are not connected by old phone lines, but most of the edge nodes (consumers) are.


Most edge nodes in Denmark are mobile phones (6.6 million); 2.4 million are fixed lines.

Worldwide, DSL is less than half of all fixed lines. Denmark has decent fibre networks, and cable TV, but I haven't found any numbers for usage.

Statistics from http://www.oecd.org/sti/broadband/oecdbroadbandportal.htm


What about the cable networks?

Unity Media advertises fiber connections.


First thing I did was open the page and try to find the word "Comcast".



