Well, the 2.8gb is just what it takes to download all the confirmed blocks, say a copy of the blocks added 10 days ago.
But the actual bandwidth exceeds that significantly. Both transactions that never make it into actual blocks (like bad transactions or double spends) and blocks that are orphaned are part of the bandwidth too.
More importantly, even if you were only sending and receiving the transactions that end up in the final longest blockchain, you're hopefully not communicating with just 1 other node, as that would mostly defeat the purpose, but rather with tens of nodes, meaning you yourself may be seeding the new blockchain data to tens of other nodes.
In fact, if you look at nodes today, the average daily bandwidth is pretty much the 2.8gb you mentioned for blocks 20x the current size. e.g. http://213.165.91.169/
So if we went 20x, you're talking about 50gb a day, or 1500gb a month. That just doesn't fly with most consumer ISPs. It's something we could certainly move towards in 2020 or 2025, though.
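To make the back-of-envelope arithmetic explicit, here's a rough sketch in Python; the 144-blocks-per-day figure and the assumption that a node's total traffic scales linearly with block size are simplifications on my part, not measurements:

    # Rough sketch of the bandwidth arithmetic above (assumptions, not measured values).

    BLOCKS_PER_DAY = 24 * 60 // 10                 # one block every ~10 minutes, ~144/day

    def raw_block_data_gb_per_day(block_size_mb):
        """Confirmed block data alone, ignoring relays, orphans and mempool traffic."""
        return BLOCKS_PER_DAY * block_size_mb / 1024.0

    observed_total_gb_per_day = 2.8                # roughly what a typical node sees today at ~1mb blocks

    print(raw_block_data_gb_per_day(1))            # ~0.14 gb/day of raw block data today
    print(raw_block_data_gb_per_day(20))           # ~2.8 gb/day of raw block data at 20mb blocks

    # If a node's total traffic scales roughly linearly with block size:
    scaled_daily = observed_total_gb_per_day * 20  # ~56 gb/day
    print(scaled_daily, scaled_daily * 30)         # on the order of the 50gb/day, 1500gb/month ballpark

So the 2.8gb figure covers the raw block data at 20x, but a node's actual traffic today is already about that much at 1x, which is where the 20x multiplier bites.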
Now you can limit your node of course, but that limits its utility too: too few connections means you're just receiving data and not passing it on, which actually hurts the health of the network. This node transmits only about 20% more than it receives, so it's not some kind of super node that delivers data to lots of others. There are nodes like that which already rack up 800gb of monthly data today; this one is closer to 100gb per month, which is already a lot just 'on the side' in countries with shitty bandwidth caps (like the US). Plus, a lot of the traffic is burst traffic, so it's not like we're talking about 3gb daily = 35kb/s (which sounds fine) spread evenly 24/7. For example, he's had days reaching 30gb of traffic, and again this isn't some industrial node. You have to be able to deal with that as well. If you cut your node off when the going gets tough on days like these, then it defeats the point and means the blocks are too big to handle.
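To see why the burstiness matters, a quick sketch that just spreads the daily totals evenly over 24 hours (which real traffic doesn't do):

    # Average transfer rate if a day's traffic were spread evenly, which it isn't.

    def avg_rate_kb_per_s(gb_per_day):
        return gb_per_day * 1e6 / 86400            # gb -> kb, divided by seconds per day

    print(avg_rate_kb_per_s(3))                    # ~35 kb/s, which sounds harmless
    print(avg_rate_kb_per_s(30))                   # ~350 kb/s sustained on a 30gb burst day

The 35kb/s average is easy; the burst days are what a home connection (and a bandwidth cap) actually has to absorb.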
So I think we can go a lot bigger with nodes, but we shouldn't downplay the traffic increases too much; 20mb blocks, for example, aren't trivial and can't just be compared to some 1080p video (although I make similar comparisons from time to time). At some point you have to give up autonomy if you increase blocks too much. We already see with 1mb blocks today that lots of peeps run nodes on a VPS, so they can offload the hassle of a home setup. But this means lots of nodes are already being run by commercial providers, which are susceptible to government influence. Now I'm in no way anti-government, but the whole point of bitcoin is decentralisation: gateless, disintermediated, resistant to power, etc. Remove that and you're left with a database that can be manipulated by those with power, just like the ordinary financial system we already have.
Tricky debate this!
edit: just to add for others who are interested... If the blockchain has value, people will use it, and keeping blocks small just forces transactions off-chain, and those off-chain systems can themselves be controlled by various gatekeepers and intermediaries. So on both sides you're losing out. A balance is necessary; everyone agrees on that. Where the balance lies, they don't. And it's made difficult by the need to 'get it right' right away, as changing the protocol in 5 years becomes far harder: every day that goes by, more actors move in, making change a political process. So simply making blocks 2mb and saying 'let's get back to it in two years when blocks are getting full again' could actually be worse than a short delay now that forces a good permanent solution. At the same time, every single day bitcoin is treading uncharted waters; none of this has been done before in this particular context, so the foresight needed for a long-term solution is incredibly hard and already incredibly political.
Thanks for the link; it seems like the post I was going by underestimated the traffic quite a bit (though to be fair, they allude to it, and it was probably my reading that was at fault).
That said, I stand by my point that the traffic ("computational") requirements hardly limit running a node to large organisations: depending on where you live, a domestic connection might not do it -- though it very well might! -- but a cheap server hosted somewhere on the "right" side of the last mile will.
> a cheap server hosted somewhere on the "right" side of the last mile will.
I think the idea is that if a large majority of the network is on Linode, then shutting down Linode would leave the majority of whoever remains able to dictate what's canon?
Why is 2MB the perfect size? Will we run into the same problems in three years? Or thirty?