You are viewing a single comment's thread from:

RE: Introducing: Steem Pressure #1

in #steem-pressure • 6 years ago (edited)

I really appreciate your taking the time to create this sort of series. Periodically there are heated arguments about Steemit's scalability and the technical requirements for running a node, and I find myself pretty uninformed as a reader.

This fills in some of the gaps and I imagine future posts in the series will fill out more.

But I am interested in your opinion about scalability. There was a ton of growth in 2017 - and presumably that growth will continue, as you anticipate. But does there come a point where the growth is no longer scalable for decentralized volunteer witnesses to handle? I imagine the amount of information being transacted through the blockchain in 2020 would still be nothing compared to, say, a Google server farm. But will it be manageable by witnesses at the current rate of growth?

(I'm really just curious - I'm not coming at this question with any presuppositions or hypotheses - at the end of the day, ima just keep plugging away at my fungal hobbies regardless - but this scalability question is a point of interest nonetheless, and as it comes up periodically in high-level comment threads I'd be interested to hear your opinion.)

Thanks for all you do!


Well, yes, 2017 saw amazing growth, and it will likely be even greater in 2018.
And yes, I can see a point in time when many random witnesses with home-grown infrastructure will no longer be able to keep up with the growth. And that's OK, because we are aiming at something bigger than just a proof of concept that a bunch of geeks can run in their garage.
Scalability is and will remain a challenge and a constant battle. The key is to keep an eye on your enemy, never ever underestimate it, and plan ahead of time to avoid being ambushed.
If we can see a problem on the horizon, that's great, because then we have time to prepare ourselves and react accordingly.
I took part in many discussions about scalability last week, and I'm sure we can handle what is coming for the next few months.
And then?
By that time we will be ready for things that we are not ready for now.
And so on, and so on...

This might be a silly newbie question, but why would one need to store the entire blockchain, apart from those hosting a big DApp like Steemit that needs fast searches?
Can't there be some kind of work-sharing (with some redundancy, of course), where each node stores a chunk of the chain in a deterministically computable way, so that users know whom to ask for specific information?
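To make that idea concrete, here is a rough sketch of such a deterministic assignment. Everything in it - the range size, the replication factor, the node names - is made up for illustration; this is not how Steem actually distributes data:

```python
# Hypothetical sketch: assign block ranges to storage nodes
# deterministically, so any client can compute who holds a given block.
# RANGE_SIZE and REPLICATION are invented parameters, not Steem's.

RANGE_SIZE = 1_000_000   # blocks per chunk (assumed)
REPLICATION = 2          # each chunk is stored on this many nodes (redundancy)

def nodes_for_block(block_num: int, node_ids: list[str]) -> list[str]:
    """Return the nodes responsible for the chunk containing block_num."""
    chunk = block_num // RANGE_SIZE
    start = chunk % len(node_ids)
    # pick REPLICATION consecutive nodes in the ring for redundancy
    return [node_ids[(start + i) % len(node_ids)] for i in range(REPLICATION)]

nodes = ["node-a", "node-b", "node-c", "node-d"]
print(nodes_for_block(19_000_000, nodes))  # -> ['node-d', 'node-a']
```

Because the mapping is a pure function of the block number and the node list, any client can compute it locally without asking a directory service.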

Splitting the blockchain (by block ranges) wouldn't make much sense, because it would be very hard to ask for useful information. However, we are moving towards fabrics and microservices.

> Splitting the blockchain (by block ranges) wouldn't make much sense, because it would be very hard to ask for useful information

Unless the client who asks for the data is aware of the distribution scheme.

Doesn't matter. If only blocks are distributed, then it's really inefficient to grab data such as "who follows user x". Knowing who can give you which block ranges is irrelevant information.
Reindexing the whole blockchain with the tags plugin turned on makes that information available for fast access at run time (see the sketch below).
It's in the network's best interest to have seeders, not leechers.
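For illustration, here is roughly what such reindexing amounts to: replay every block once and build an in-memory index that answers the query instantly afterwards. The block and operation structure below is simplified and hypothetical, not the real steemd format:

```python
# Simplified sketch of why "who follows user x" needs the whole chain:
# follow relationships are scattered across operations in arbitrary
# blocks, so a node replays every block once and builds an index that
# is then queried at run time.

from collections import defaultdict

def build_follow_index(blocks):
    """Replay all blocks once, collecting followed-account -> followers."""
    followers = defaultdict(set)
    for block in blocks:
        for op in block["operations"]:
            if op["type"] == "follow":
                followers[op["following"]].add(op["follower"])
    return followers

blocks = [
    {"operations": [{"type": "follow", "follower": "alice", "following": "gtg"}]},
    {"operations": [{"type": "transfer"}]},
    {"operations": [{"type": "follow", "follower": "bob", "following": "gtg"}]},
]
print(build_follow_index(blocks)["gtg"])  # -> {'alice', 'bob'}
```

With only a block range, a node could index just its own slice, but the edges for any one account would still be scattered across every slice in the network.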

True. A more clever breakup might be by transaction type, as sketched below.
Ultimately it might even make sense to store monetary transactions in the main chain, text data in one side chain, and big data (such as videos) in another.

(I lack background on this; it's pure speculation.)
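For illustration, the split-by-type idea could look something like this. It is a purely speculative sketch matching the comment above; the chain names and operation types are invented, not Steem's:

```python
# Speculative sketch: route each operation to a dedicated chain by its
# type. Chain names and operation types are made up for illustration.

CHAIN_FOR_TYPE = {
    "transfer": "main",      # monetary transactions stay on the main chain
    "comment":  "text",      # posts/comments go to a text side chain
    "video":    "big-data",  # large payloads go to another side chain
}

def route_operation(op: dict) -> str:
    """Return which chain should store this operation."""
    return CHAIN_FOR_TYPE.get(op["type"], "main")

print(route_operation({"type": "comment", "body": "hello"}))  # -> "text"
```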

> It's in the network's best interest to have seeders, not leechers

Having a smaller dataset may make specialized seeders feasible, meaning more seeders. Think of a seeder/cache node specialized in content written in one language.
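As a toy illustration of such a specialized seeder (entirely hypothetical; real language detection and caching would be far more involved):

```python
# Toy sketch: a cache node that only keeps posts written in one
# language, so it serves a smaller dataset. The "lang" field is a
# stand-in for real language detection.

class LanguageSeeder:
    def __init__(self, language: str):
        self.language = language
        self.cache: dict[str, dict] = {}

    def maybe_store(self, post_id: str, post: dict) -> bool:
        """Cache the post only if it matches this seeder's language."""
        if post.get("lang") == self.language:
            self.cache[post_id] = post
            return True
        return False

pl_seeder = LanguageSeeder("pl")
pl_seeder.maybe_store("post-1", {"lang": "pl", "body": "Cześć!"})  # stored
pl_seeder.maybe_store("post-2", {"lang": "en", "body": "Hi!"})     # skipped
```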
