On one hand, as other sibling posts are pointing out, you're comparing true network capacity with subscribed ("sold") capacity; those differ by orders of magnitude (depending on how shitty your ISP is.)
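To make that concrete, here's a toy oversubscription calculation. All the numbers are made up for illustration, not real ISP figures:

```python
# Toy oversubscription math (illustrative numbers, not real ISP figures).
uplink_gbps = 100          # true capacity of an aggregation uplink
subscribers = 10_000       # customers behind that uplink
sold_per_sub_gbps = 0.5    # each sold a "500 Mbit/s" plan

sold_total_gbps = subscribers * sold_per_sub_gbps  # 5000 Gbit/s "sold"
ratio = sold_total_gbps / uplink_gbps              # oversubscription ratio
print(f"oversubscription ratio: {ratio:.0f}:1")
```

With those assumed numbers the ISP has "sold" 50x more capacity than it actually has, which works only because subscribers rarely max out their links simultaneously.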
On the other hand, it's far worse than "20G per satellite." Bandwidth is limited by available RF spectrum, and that's not magically gonna increase with satellite count. The best thing they can do is limit the coverage area of each radio as much as possible, but there are limits to that, and then you get an onslaught of handovers since the satellites are moving quite fast.
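A rough estimate of how fast those handovers would come, assuming a typical LEO altitude and a hypothetical spot-beam footprint (both numbers are assumptions, not Starlink specs):

```python
import math

# Rough handover-interval estimate for a LEO constellation.
# Altitude and beam footprint are assumptions for illustration.
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # Earth radius, m
altitude = 550e3     # m, a typical LEO shell

r = R_EARTH + altitude
v_orbit = math.sqrt(GM / r)          # orbital speed, ~7.6 km/s
v_ground = v_orbit * R_EARTH / r     # ground-track speed, ~7.0 km/s

cell_diameter = 50e3                 # m, hypothetical spot-beam footprint
dwell_s = cell_diameter / v_ground   # time a fixed user spends in one beam
print(f"~{dwell_s:.0f} s between handovers")
```

Under those assumptions a stationary user crosses a beam footprint in well under ten seconds, so the smaller you make the cells, the more relentless the handovers get.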
I honestly am too lazy to do the math, but my gut says this may work great for larger areas with lower population density (let's say the Amazon, U.S. Midwest, Australia, Siberia, etc.) but it's not gonna replace high-density broadband access.
Also, just for comparison, current optical network technology (in the market, not research) puts 50G on a single wavelength on an optical fiber. There's 80 wavelengths in a medium-density DWDM setup, so that's 4T ... on a single strand. A low-density cable has 12 cores (that's what you deploy on the outskirts of your network), so, uh, 48 terabit/s with current technology. Infrastructure cables are commonly 144 cores, that's 576 terabit/s...
(These are obviously "maximum" numbers, you rarely see all 80 wavelengths or all 144 strands in use, and there's gonna be slower links like 10G occupying some wavelengths too.)
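The fiber arithmetic above, spelled out:

```python
# Back-of-envelope fiber capacity using the figures from the comment above.
per_wavelength_gbps = 50   # current in-market coherent optics
wavelengths = 80           # medium-density DWDM setup

per_strand_tbps = per_wavelength_gbps * wavelengths / 1000  # 4 Tbit/s

for cores, label in [(12, "low-density cable"), (144, "infrastructure cable")]:
    print(f"{label}: {cores * per_strand_tbps:.0f} Tbit/s")
```

Which gives the 48 Tbit/s and 576 Tbit/s maximums quoted, again assuming every wavelength and every core is actually lit.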
> "On the other hand, it's far worse than "20G per satellite." Bandwidth is limited by available RF spectrum, and that's not magically gonna increase with satellite count."
That's not true. More satellites allow you to put them in lower orbits, increasing spatial multiplexing. Yes, even for the same spectrum.
This is not like mobile, where the phones have fairly omnidirectional antennas (improving to a handful of elements under 5G). The user terminals will have dozens if not hundreds or thousands of phased array elements, allowing connections to multiple satellites simultaneously and very narrow beamwidths. The satellites also have phased arrays with lots of elements.
I agree that dense cities and optical fiber will still be able to beat out Starlink (and similar), but there is PLENTY of room for using the same amount of RF spectrum for vastly greater bandwidth using spatial multiplexing and phased arrays on both ends and multiple, cooperating satellites in view simultaneously.
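The spatial-multiplexing argument is just frequency reuse: the same spectrum can be reused in every spatially separated beam. A sketch with illustrative numbers (the spectrum allotment and spectral efficiency are assumptions, not actual Starlink figures):

```python
# Why spatial multiplexing scales capacity without more spectrum:
# each narrow, non-overlapping beam reuses the full spectrum allotment.
# Numbers are illustrative assumptions, not real constellation figures.
spectrum_mhz = 2000   # assumed total downlink spectrum
bits_per_hz = 3.0     # assumed average spectral efficiency

def capacity_gbps(beams: int) -> float:
    """Aggregate capacity when each beam independently reuses the spectrum."""
    return beams * spectrum_mhz * 1e6 * bits_per_hz / 1e9

print(capacity_gbps(1))    # one wide beam covering the region
print(capacity_gbps(100))  # 100 non-overlapping spot beams, same spectrum
```

Same spectrum, 100x the aggregate capacity, which is exactly the knob that lower orbits and tighter phased-array beams turn.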