- Does this guarantee that you get the speed you pay for? If not, should data transferred during heavy congestion cost less per GB? I’ve had the bad luck of living in areas where the internet is unusable for anything other than email and light browsing during peak hours. Bonus: do you know why this happens? If not, I encourage you to research it.
- How will this affect the advertising model of the entire internet economy? If you, like me, have lived your whole life on a mobile data plan priced by data usage, you’ll realize just how much bandwidth is consumed by advertisements (the worst are the video/audio types that autoplay).
- How will this affect “future” technology such as smart homes? Most devices phone home (Amazon Echo devices, for instance), and the growing ecosystem of security cameras that upload recordings to the cloud would suddenly cost users a lot more.
- Speaking of the cloud, what will happen to the gaming industry, which is heavily cloud based? Many games don’t even ship in a completed state anymore; instead, part of the install requires downloading a ton of additional updates. Not to mention online play, DLC, etc.
There are many, many more everyday examples like this that would be heavily affected by per-GB pricing. Your question sounds like it has a simple solution, but like many things in life, that is rarely the case.
I think the answer to most of these questions is: let the market decide. E.g. consider this: congestion during peak hours is bad now, but there is currently no penalty on an ISP for being slow. However, if billing were based on bytes moved, underserving during times of demand would mean lost revenue for the ISP. Thus ISPs would be incentivized to provide the best service.
With respect to the US, this answer is incomplete. The theory of letting the market decide only works when there is a free market. Most places I’ve lived have had only a single choice of ISP, effectively making them monopolies.
> With respect to the US, this answer is incomplete. The theory of letting the market decide only works when there is a free market. Most places I’ve lived have had only a single choice of ISP, effectively making them monopolies.
Furthermore, even in places where you do have some sort of choice for internet service, it is almost certainly limited to choosing between a cable monopoly and a phone monopoly as your ISP.
Because customers lose their minds with that model, largely as a result of conditioning that "internet" is an unlimited resource (which of course it is when instantaneous demand <= supply, but during high traffic periods that isn't the case).
Personally I understand the economics around it, but still don't like the idea of paying per GB. I would end up skipping some Netflix and would get mad at kids for playing Netflix to an empty room (much like I currently get mad when they leave the lights on in an empty room). It's nice on a personal level to avoid that.
I really don't like the idea of limited bandwidth or charging per gigabyte because I know ISPs will rip people off and still not upgrade their networks to handle more traffic.
But I think it would considerably change the Internet landscape. No more listening to the exact same song multiple times on YouTube or Spotify or whatever. No more downloading and then deleting the same stuff over and over again. I think about it often; what a massive waste of bandwidth, and I don't even know why it bothers me. It's a lot of electricity used, I guess?
Even though I've got unlimited fiber, I'm looking for some sort of local "Internet cache" solution that would store everything so it would be re-downloaded from my home instead of across the ocean. Would be great for outages, too.
Yeah, I don't know. I just think about the bandwidth we all use sometimes and it seems extremely wasteful and it bothers me for some reason. I use around ~200-300 GB/month, which I thought was a lot until I saw how much other people use :D
That's because you benefit from it. You're not currently paying the cost of that 1080p stream; Netflix (and the ISP/peering networks) are. Netflix recoups the cost from your subscription, and the peering arrangement is mostly cost-neutral to the ISP (save for a small amount, I presume).
I think what they're getting at is how bloated software has become. Some websites download several megabytes of data to display a kilobyte or less of actual content.
That would probably increase the amount of storage space people would end up buying which would greatly increase the electricity demands on the end user.
I would bet this is already the most efficient way for it to work.
You can set up a proxy service to cache internet requests too, but they're becoming less useful due to the greater use of HTTPS.
To use the power analogy: when I was young the price of power was relatively high, but recently it has dropped to 0.6 NOK/kWh, so even charging the car from 10-80% (~61 kWh) costs 40 NOK, or $4. I don't get mad at the kids for leaving the light on like my mother did at me.
I (or rather my job) currently pay about 1000 NOK/month for unlimited 500 Mbit symmetric fiber internet. I transfer maybe 500 GB/month, so for metered pricing to be cheaper for me it would have to be under 2 NOK/GB, or $0.20, which sounds really high, but that is what I effectively pay at my usage.
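That break-even point is simple arithmetic; a quick sketch using the figures above (the exchange rate is a rough assumption):

```python
# Break-even per-GB price for switching away from a flat-rate plan.
# Figures from the comment above: 1000 NOK/month flat, ~500 GB/month used.
flat_rate_nok = 1000          # current monthly cost, NOK
usage_gb = 500                # typical monthly transfer, GB
nok_per_usd = 10              # rough exchange rate (assumption)

break_even_nok_per_gb = flat_rate_nok / usage_gb
print(break_even_nok_per_gb)                  # 2.0 NOK/GB
print(break_even_nok_per_gb / nok_per_usd)    # 0.2 USD/GB
```

Anything above that per-GB rate and the flat plan wins at this usage level.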
You pay for peace of mind that little Johnny doesn't download some crazy amount by accident. Now you don't have to monitor family usage so that makes things easier as well.
I had per-GB pricing in New Zealand. The key feature that addressed your issue was that I could set an upper limit, say, start throttling the connection once I reach $50. That's the same thing other ISPs would do anyway once I reached a certain amount of usage, but with this ISP I decided the limit and could change it at any time.
The price was competitive too. Other ISPs charged $70 per month and had a limit of 50 GB (that was 15 years ago).
My ISP charged $20 as a base fee and $1 per GB of usage. If I used less than 50 GB, I saved money; if I used more, then I could...
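That plan structure is easy to sketch. A minimal model, assuming the $50 cap applies to the metered portion only (the plan description doesn't say whether the base fee counted toward it):

```python
def monthly_bill(usage_gb: float, base_fee: float = 20.0,
                 per_gb: float = 1.0, spend_cap: float = 50.0) -> float:
    """Bill for a metered plan with a user-set spending cap.

    Once the metered charge reaches the cap, the ISP throttles instead of
    billing further, so the total never exceeds base_fee + spend_cap.
    (Whether the base fee counted toward the cap is an assumption here.)
    """
    metered = min(usage_gb * per_gb, spend_cap)
    return base_fee + metered

# Light month: cheaper than the $70 flat-rate competitor
print(monthly_bill(25))   # 45.0
# Heavy month: the charge stops growing at the cap; the line is throttled
print(monthly_bill(120))  # 70.0
```

The nice property is that the worst-case bill is bounded and chosen by the customer, not the ISP.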
It's unfortunate that cellular providers don't offer this. I get around it by setting my data connection to prefer 3G. It helps with a 500 MB limit (data is expensive on cheap plans), but it'd be nice to be able to throttle apps individually (no, I do not want video ads downloaded, but internet connectivity is required for the app to work online; an HN client with a webview, for example).
Years ago I came across per-GB pricing, but the cost was insane. It was essentially: prepay your usage, or we charge you 10 times what the prepay would have cost.
I thought internet in Norway was cheaper than in Denmark. Interesting. For comparison, I pay 449 DKK/month for 1000/1000 unlimited fiber with a guaranteed minimum bandwidth of 950 (they are upgrading the network at the moment, hence no guarantee on the full 1000 yet, as some customers' equipment is too old).
Is it a normal connection or a business connection perhaps?
Bandwidth to where? Do they really have 1 Gbit per customer of peering with, say, Level 3? And with Cogent, and Telia? And a fully non-blocking internal network? And enough packet buffers to ensure that microbursts don’t saturate any link?
Depending on the SLA, or lack thereof, they could have enough peering for a sustained +(x = 3?)σ demand spike without having a full dedicated peering for each customer, and still reasonably guarantee throughput availability.
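As a sketch of that sizing argument: if per-customer demand is roughly independent, the aggregate standard deviation grows as sqrt(N) rather than N, so provisioning for a +3σ spike is far cheaper than dedicating full peering per customer. All figures below are illustrative assumptions, not real ISP numbers:

```python
import math

def peering_capacity_gbps(n_customers: int, mean_gbps: float,
                          sigma_gbps: float, k_sigma: float = 3.0) -> float:
    """Capacity covering mean aggregate demand plus k standard deviations.

    Assumes per-customer demand is independent, so the aggregate standard
    deviation scales as sqrt(N), not N. Illustrative figures only.
    """
    return n_customers * mean_gbps + k_sigma * math.sqrt(n_customers) * sigma_gbps

# 10,000 customers on 1 Gbit/s plans, assumed to average 50 Mbit/s
# at peak with a 100 Mbit/s standard deviation
needed = peering_capacity_gbps(10_000, 0.05, 0.1)
full_dedication = 10_000 * 1.0
print(needed)           # 530.0 Gbit/s
print(full_dedication)  # 10000.0 Gbit/s, ~19x the +3 sigma sizing
```

Real traffic is burstier and correlated (evening peaks), so the independence assumption understates the tail, but the gap between statistical sizing and full dedication stays enormous.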
Even ignoring peering, I’m still trying to picture the non-blocking network. An Arista 7368X4 isn’t cheap, but will cope with ~12,000 customers with appropriate switches downstream (128x100G ports, with each port breaking out to 96 customers). Not sure how you’d scale beyond that without blocking.
Your peering will only scale to your (combined) interface speed regardless of your SLA. Our 2x40G peering with Level 3 serves far more than 40 1G devices (I personally have 600 in one building alone, and that sets aside the rest of the users), but it’s rarely more than 30% utilised.
Clearly that’s not going to be a domestic ISP architecture, so a reasonable question is what “uncontended” actually means.
I very much doubt an ISP with 10,000 customers has 10 Tbit/s of peering physically available. LINX public peering is less than half that for the entire UK, and while private peering likely increases that, the suggestion is that it’s well under tenfold. So the 50-million-plus internet users in the UK only use about 1 Mbit/s each at peak times, and that ignores all the non-domestic use.
Even a 100:1 contention ratio seems enough at the core level, so any ISP spending money on improving on 10:1 ratios seems to be just burning cash.
Clearly, as you go toward the edge, the contention ratio needs to drop, but a 40G, or maybe even a 20G, uplink for 48x1G users would seem reasonable to me.
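The ratios in this thread reduce to simple arithmetic (the helper below is a hypothetical illustration: sold access bandwidth divided by uplink capacity):

```python
def contention_ratio(customers: int, access_gbps: float, uplink_gbps: float) -> float:
    """Total sold access bandwidth divided by uplink capacity (an X:1 ratio)."""
    return (customers * access_gbps) / uplink_gbps

# Edge example from above: 48 customers at 1G behind a 40G uplink
print(contention_ratio(48, 1, 40))       # 1.2  (i.e. 1.2:1)
# ...or behind a 20G uplink
print(contention_ratio(48, 1, 20))       # 2.4
# Core example: 10,000 x 1G customers on 100 Gbit/s of peering
print(contention_ratio(10_000, 1, 100))  # 100.0 (the 100:1 ratio mentioned)
```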
It's a normal connection (Altibox).
I'm allowed to run services and whatnot on it, though I mostly use it for data analytics and downloading large datasets.
Because their costs aren’t based on usage, but on installing and maintaining the infrastructure and then collecting rent on it. Actual usage follows an exponential curve, and if you bill on it, you have a few angry heavy users paying for everybody else’s infrastructure.
Why won't ISPs stop advertising 'unlimited' service that is actually limited to N gigabytes a month, where N is substantially lower than what's possible given the speeds they provide?
Because it is insanely expensive to actually provide that. The nature of internet traffic is short bursts, not 100% utilization. In a commercial setting you can purchase fixed pipes that are entirely yours and they’re tens or hundreds of times more expensive.
Why wouldn’t you want to pool bandwidth with your neighbors so you could all get faster speeds when you were using it instead of rate limiting everyone?
The question was not "why won't they provide it." The question was why won't they stop advertising it?
I can't sell you a pony made out of diamonds because that's impossible. Consequently, there is no legitimate reason for me to be advertising the sale of a diamond pony!
>Because it is insanely expensive to actually provide that.
So, the defense of the misnomer used to name the service is that what's warranted is actually impossible to supply at those margins?
Call me a fool, but that still seems like a company getting away with lying to the vast majority of people who don't bother reading the asterisk (like T-Mobile-style "Unlimited" plans that give X amount unthrottled and some arbitrarily low rate after).
Criminal issue? Of course not, that's why the companies present such things this way.
Plenty of other countries have actually unlimited high speed internet with little issue and reasonable pricing. It's obviously not so "insanely expensive" that it can't be done.
The issue is that there's no way to know what this limit is. I understand the cost, but then tell us exactly how many GB I can use at full speed and when does it start to throttle. Instead I have to rely on internet anecdotes.
They do. I'm pretty aware of what my data transfer limits are on both my home and wireless connections. Granted my new ISP has no data caps and symmetric gigabit speeds so I'm a bit spoiled.
Well, in New Zealand we used to have charge-per-byte, but as things got cheaper this has largely gone away. Some of the cheaper home fibre accounts have something like a 100 GB/month limit, but I guess it isn't worth the trouble to charge even here, where bandwidth is much more expensive than in the US.
Flashback: I was on Actrix which charged $5 per megabyte. Downloading Netscape Navigator 2 nearly bankrupted me.
I used to browse the internet with images disabled, which ironically gave me an appreciation for accessibility issues which served me well later in life.
When Xtra came out at $2.50 per hour, it was a complete game changer.
I remember when Xtra brought out their $27.95/month unlimited dialup plan.
It changed our internet browsing habits so much. With hourly charges, you would try to plan out what you would do before connecting and instead of reading webpages, you would save them to disk for reading later.
With unlimited, you could just sit there and browse. Or leave it on overnight to download files.
At some point we got a second phoneline and I was downloading torrents on dialup all day and night for years before we finally moved somewhere with ADSL in 2006.
It's also how Google Fi works. Every time I loaded something, in the back of my mind I'd think "this page is costing me two cents". It makes everything you load feel like an individual transaction, making it super uncomfortable to use.
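That "two cents a page" intuition checks out roughly; Fi billed data at about $10/GB at the time, and the page weight below is an assumption:

```python
# Rough per-page cost on a metered plan (Google Fi billed ~$10/GB at the time).
usd_per_gb = 10.0
page_mb = 2.0                       # typical page weight (assumption)

cost = usd_per_gb * page_mb / 1024  # MB -> GB conversion in the denominator
print(round(cost, 4))               # 0.0195 USD, i.e. about two cents
```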
I imagine the data whales would not cover the lost revenue from people who just use online banking yet currently pay the same amount.
I've used 15 TB in the last 5 weeks with my torrent client alone (and considering my Backblaze backup size, that's not nearly all of my traffic), but how many people like me are there in the general public?