I'm not sure what his point is. Despite BitTorrent having 135 million installs and being 55% of all internet traffic, P2P in general 'is a product that tests great. In application however, it has a ton of challenges'. Maybe he's talking trash because he invested $1.7 million in a 'BitTorrent-like' company. He's been transparent about such motivations before. That said, he does have some claim to punditry in the bandwidth space because his $5 billion sale of broadcast.com for yahoo stock set the precedent for valuing bandwidth supply companies based on how quickly they flush money down the toilet. (Amusingly, if you go to broadcast.com today it simply redirects to yahoo.com.)
Given that I don't know what the point of Cuban's post is, instead of rebutting it I'll play Devil's advocate and argue his side. But first I'll argue against some other things.
- Digital computers are extremely complicated devices, requiring a large number of gates to do even the most elementary operations, while analog computers can do amazing things with only a few simple pieces of circuitry
- Millions or billions of calculations are required by digital computers to do even very modest simulations. For example, that stupid space alien searching screen saver could have long since finished using analog circuitry at far less cost, and digital radios require a high-end processor to replicate what a basic radio can do.
- Even a single error in billions of operations will frequently render the entire output of a digital computer garbage.
- To program even rudimentary tasks on a digital computer requires tricky programming on the part of highly skilled specialists.
Clearly, digital computers test great. In application however, they have a ton of challenges.
- Packet switching is based on some handwavy observations rather than a rigorous model and practical implementations contain a ton of voodoo magic numbers.
- Packet switched networks make no guarantees of service for particular connections whatsoever, and applications have to deal with the possibility of not getting necessary bandwidth.
- Even simplistic models of packet switched networks are extremely difficult to analyze, and the results of those analyses have dubious application to the real world.
- The end nodes of packet networks have to have extremely complicated code to handle packet loss, a phenomenon which they outright rely on to function properly, and all end nodes need to have reasonably harmonious behavior for the whole network to work.
- Packet based networks hardly provide any useful information to end nodes at all. The potential capacity of each connection and whether it's at the limit must be guessed at using very cumbersome techniques and unreliable information.
Clearly, the internet tests great. In application however, it has a ton of challenges.
- BitTorrent trackers have ludicrously little responsibility, having no control over peers whatsoever. This architecture trivially reduces central overhead to almost nothing, but creates a ton of problems in exchange for going to the extreme on that one criterion.
- There are no guarantees of service from downloading from BitTorrent peers whatsoever, and it relies on an extremely baroque and poorly studied variant on tit for tat to have any enforcement of behavior at all.
- Even an elementary implementation of a BitTorrent peer is very complicated and tricky to write, with no clear benchmarks and lots of hazy requirements.
Clearly, BitTorrent tests great. In application however, it has a ton of challenges.
Humor aside, Cuban was confused by some basic arithmetic at the end of his post, and having a mathematical bent I'd like to help. If BitTorrent traffic is currently 55% of all internet traffic, and it doubles, would it then be 100%? No, it wouldn't. Let's say that BitTorrent traffic is currently 55 zillobits, and everything else is 45 zillobits, for a total of 100 zillobits. BitTorrent would then be 55/100 = 55% of total traffic. If BitTorrent traffic were then to double, it would be 110 zillobits, out of a total of 110 + 45 = 155 zillobits of traffic, and as a percentage would be 110/155 ≈ 71% of all internet traffic.
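For anyone who'd rather let a machine do the arithmetic, here's the same calculation in a few lines of Python (the "zillobit" figures are just the made-up units from above):

```python
# Doubling a 55% share of traffic does not give 100%, because the
# total grows too. Units are the arbitrary "zillobits" from the text.
bittorrent = 55.0   # zillobits of BitTorrent traffic
other = 45.0        # zillobits of everything else

share_now = bittorrent / (bittorrent + other)
share_doubled = (2 * bittorrent) / (2 * bittorrent + other)

print(f"now:     {share_now:.0%}")      # now:     55%
print(f"doubled: {share_doubled:.0%}")  # doubled: 71%
```

The only way doubling BitTorrent traffic could make it 100% of the total is if all other traffic dropped to zero at the same time.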