Do you agree with linus regarding parallel computing?

Re: Do you agree with linus regarding parallel computing?

Brendan wrote:That 1 gigabit uplink that 20+ computers have been sharing for years without any problem at all, is now a major bottleneck because you've got 20+ computers using "the cloud" for no sane reason whatsoever. Then one guy decides to download a copy of LibreOffice and 19+ people start getting 4 seconds of lag for their word processor. Yay.

I doubt that very much. Video streaming is such a huge fraction of bandwidth that any effect from typical "cloud" usage is pretty much invisible, when it works (I don't like this part of the cloud either, but not for bogus reasons like this).

Re: Do you agree with linus regarding parallel computing?
Hi,
Rusky wrote:I doubt that very much. Video streaming is such a huge fraction of bandwidth that any effect from typical "cloud" usage is pretty much invisible, when it works (I don't like this part of the cloud either, but not for bogus reasons like this).

It really does depend on the type/s of application/s, and how they're implemented. For something like a word-processor, if almost everything is done locally (e.g. editing, layout, rendering, spell checking, etc) and the cloud is only used for storage it wouldn't be too bad; and if everything is done remotely (e.g. it's using a protocol like VNC and all the processing is done remotely) then it would be extremely bad (e.g. > 200 ms of lag between pressing a key and seeing it on screen).
In a much more general way: for everything involving 2 or more processors, the cost of communication between the processors limits scalability (and for TCP/IP over the Internet that "cost of communication" is orders of magnitude higher than it is for processors in the same computer, or for processors in different computers on the same LAN).
Mostly, you can break it down into 3 categories (there's a rough cost sketch after the list):
- Applications where communication overhead outweighs the benefits of doing processing remotely; where the cloud is a silly joke
- Applications where communication overhead is small and the benefits of doing processing remotely are large; where the cloud could be useful in theory, but in practice no desktop applications fall into this category
- Applications where the communication overhead can't be avoided (e.g. social media, file sharing, etc.)
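As a rough illustration of the first two categories, here's a toy break-even model in Python. Every number in it is invented for the sake of the example, not measured:

```python
# Toy model of the trade-off above: offloading work only wins when the
# compute time saved exceeds the communication cost paid.
# Every number here is invented for illustration.

def remote_wins(local_ms, remote_ms, round_trip_ms):
    """True if doing the work remotely beats doing it locally."""
    return remote_ms + round_trip_ms < local_ms

# Category 1: a word-processor keystroke. Trivial local work, an
# Internet-scale round trip. Offloading loses badly (the "silly joke" case).
print(remote_wins(local_ms=0.1, remote_ms=0.05, round_trip_ms=200))      # False

# Category 2: a long batch job on a much faster remote machine.
# Could win in theory.
print(remote_wins(local_ms=60_000, remote_ms=6_000, round_trip_ms=200))  # True

# Category 3 (social media, file sharing) isn't modelled: the
# communication itself is the point, so there is nothing to offload.
```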
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Re: Do you agree with linus regarding parallel computing?
Brendan wrote:It really does depend on the type/s of application/s, and how they're implemented. For something like a word-processor, if almost everything is done locally (e.g. editing, layout, rendering, spell checking, etc) and the cloud is only used for storage it wouldn't be too bad, and if everything is done remotely (e.g. it's using a protocol like VNC and all the processing is done remotely) then it would be extremely bad (e.g. > 200 ms of lag between pressing a key and seeing it on screen).

You are expecting that everything will be shifted to the cloud, and that is your mistake. The economy of scale applies when more than one user uses the same infrastructure without redundancy; but if users use their PCs as mere terminals and shift all work to the cloud, then we get great redundancy right where the user is: the user's PC is underused while the cloud's servers are overused. Such a skewed picture is, of course, not very efficient.
But if we remember the origins of the "cloud talk", it is about tablets replacing PCs. Maybe the word "tablet" is the cause of your mistake: you think that if an enterprise uses tablets, then it must shift all possible processing to the cloud. But that is the wrong expectation, because the processing power of a tablet should be used just as the processing power of a PC is used today. A tablet's power is more than enough for applications like text editing, so there is no need for the cloud for such tasks.

But there is a need for the cloud for an enterprise that wants to capture the economy of scale. And the economy is achieved not by offloading word-processor tasks to the cloud, but by sharing the same infrastructure among many enterprises and eliminating redundancy. For example, if there are 10 enterprises that each use 10 servers and each hire 10 administrators, we can imagine a centralized service provider with 60 servers and 20 administrators delivering the same throughput those 10 enterprises require. So here we have the economy of scale in the form of 40 servers and 80 administrators saved. The service provider was never expected to do the job of a word processor or whatever desktop application you have in mind; it just does the same job that the combined 100 servers and 100 administrators did before.
And next, if an enterprise recognizes the need for speech recognition, then it sees its benefits and can calculate the dollar value of applying the technology to the enterprise's particular conditions. Now the enterprise has a choice: spend money on dedicated recognition server(s), or pay a service fee for the recognition service of a cloud provider. Because the economy of scale works as expected, the cost of managing an additional dedicated server will be greater than the cost of managing servers for 10 enterprises divided by 10. So, from the cost perspective, it is obviously more efficient to use the cloud-provided service instead of buying dedicated server(s).
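For what it's worth, that arithmetic can be checked with a few lines of Python. The server and administrator counts are the post's own assumptions, and the salary figure is invented purely for illustration:

```python
# Checking the consolidation arithmetic above. Counts are the post's
# assumptions; the salary figure is invented purely for illustration.

enterprises = 10
servers_each = 10
admins_each = 10

standalone_servers = enterprises * servers_each   # 100
standalone_admins = enterprises * admins_each     # 100

provider_servers = 60   # centralized provider, same total throughput
provider_admins = 20

print("servers saved:", standalone_servers - provider_servers)   # 40
print("admins saved:", standalone_admins - provider_admins)      # 80

# Per-enterprise administration cost, in-house vs. shared:
salary = 50_000                                   # invented yearly cost
in_house = admins_each * salary                   # 500000 per enterprise
shared = provider_admins * salary / enterprises   # 100000 per enterprise
print("in-house:", in_house, "shared:", shared)
```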
Re: Do you agree with linus regarding parallel computing?
Honestly the one real application of the "cloud" that actually makes sense and took off is servers. Services like Amazon's AWS or Rackspace or Digital Ocean, where you can provision servers at will and easily scale up or down how much you're using/paying for are quite convenient, and there's not really anything going on that could be easily done locally.
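As a concrete illustration of that provision-at-will model, here's a minimal sketch using AWS's boto3 SDK. The AMI ID below is a placeholder, and credentials/region setup is assumed; this is an example of the pattern, not a recommendation of any particular instance type:

```python
# Minimal sketch of elastic provisioning against AWS EC2 via boto3.
# The AMI ID is a placeholder -- substitute a real image ID and
# configure credentials/region before running.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Scale up: launch an instance on demand...
resp = ec2.run_instances(
    ImageId="ami-00000000000000000",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]

# ...and scale back down (and stop paying) the moment it's unneeded.
ec2.terminate_instances(InstanceIds=[instance_id])
```

The point is that capacity becomes two API calls: you pay only for the time between them.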
Re: Do you agree with linus regarding parallel computing?
embryo wrote:But if we remember the origins of the "cloud talk", it is about tablets replacing PCs.

The big problem with your argument, seductive though it may seem, is that current sales figures show that tablets aren't replacing PCs. (Hardly surprising to anyone who has worked in an enterprise environment.) Rather, tablet sales are slowing whilst PC sales are showing a resurgence.
Argue the theory all you like, the facts are different.
Re: Do you agree with linus regarding parallel computing?
Rusky wrote:Honestly the one real application of the "cloud" that actually makes sense and took off is servers. Services like Amazon's AWS or Rackspace or Digital Ocean, where you can provision servers at will and easily scale up or down how much you're using/paying for are quite convenient

It's just what I have described above: build a server farm and serve users' needs for server-side processing while exploiting the economy of scale. And from the user's point of view, the server farm looks like a cloud, without any visible internal details.
Rusky wrote:and there's not really anything going on that could be easily done locally.

What do you mean when you use the word "locally"? What should be local?
AndrewAPrice
Re: Do you agree with linus regarding parallel computing?
iansjack wrote:The big problem with your argument, seductive though it may seem, is that current sales figures show that tablets aren't replacing PCs. (Hardly surprising to anyone who has worked in an enterprise environment.) Rather, tablet sales are slowing whilst PC sales are showing a resurgence. Argue the theory all you like, the facts are different.

The last few years we've seen mass adoption of tablets; now they're cheap and affordable, and most people who want a tablet already have one.
My OS is Perception.
Re: Do you agree with linus regarding parallel computing?
I'd have to say no, I don't agree with Linus. Consider that most modern x86 computers are actually RISC architectures emulating CISC ISAs. This is something you can't simply throw at the GPU, and parallelism could be used to increase the speed of recompilation. I'm not aware if PCs actually do this, though.
I think that a bigger problem than supporting parallelism in programming languages is supporting concurrency. Without concurrency, Linus is right. You gain nothing.
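To make that distinction concrete, here's a minimal Python sketch. The sleep stands in for any I/O-bound operation, and the timings are illustrative, not benchmarks: the concurrent version gains even without any parallel CPU execution, simply by overlapping the waits.

```python
# Concurrency vs. no concurrency for I/O-bound work.
# The sleep() stands in for any operation that waits (disk, network);
# timings are illustrative, not benchmarks.
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(_):
    time.sleep(0.01)  # waiting, not computing

N = 100

start = time.perf_counter()
for i in range(N):                 # strictly sequential: ~1 s of waiting
    fake_request(i)
print(f"sequential: {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=N) as pool:
    list(pool.map(fake_request, range(N)))   # overlapped waits: ~0.01 s
print(f"concurrent: {time.perf_counter() - start:.2f}s")
```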