Andrew_nenakhov 4 days ago
Server-centric computing is just more efficient, and it usually becomes less popular only when the means of communication aren't up to the task at the moment.

1. In the beginning, there were mainframes and terminals. You saved resources by running apps on a server and connecting to them with cheap terminals.

2. Then PCs happened. You could run reasonably complex programs on them, but communication capabilities were very limited: dialup modem connections or worse.

3. Then the internet happened, and remote web apps overtook local apps in many areas (most of the local apps that survived required heavy use of graphics, like games, which is difficult even over a modern internet connection).

4. Then smartphones happened. When they appeared, network coverage wasn't ubiquitous, so many of the first apps for these platforms were local. This is eroding too, as coverage improves.

So if you look at this, it is clear that the main share of computing has oscillated back and forth between server and local, moving to local only when communication capabilities do not permit remote running; once comms catch up, the task of running apps moves back to servers.