Concurrency
October 29, 2009. Posted by ddouthitt in Erlang, Haskell, Scala.
Tags: concurrency, parallelism
One of the most interesting things about some of these languages (such as Scala) is their innate support of concurrency.
Concurrency is a huge benefit when making systems work together, or when building systems that can scale (that is, grow and adapt to heavier usage).
Ted Leung wrote an article on his blog about concurrency called The Cambrian Period of Concurrency (and a follow-up). He also gave a talk at OSCON 2009 titled A Survey of Concurrency Constructs, which was well received.
Tim Bray, also from Sun Microsystems, wrote a series of articles on the same topic entitled Concur.next. He discusses Haskell, Erlang, and Clojure. Erlang and Haskell are fascinating, but to me Clojure seems to have taken Lisp, obfuscated it, and made it harder to learn and use – and become slower besides.
After reading Tim’s articles, one of the Glasgow Haskell Compiler (GHC) developers wrote an article comparing parallelism and concurrency. (By the way, not everyone thinks of Haskell when one talks of GHC…)
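The distinction is easy to see in GHC itself. As a minimal sketch of my own (not taken from that article), Haskell expresses concurrency with lightweight threads spawned by forkIO, coordinated here through an MVar:

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Concurrency: independently progressing threads of control.
-- forkIO spawns a lightweight GHC thread; the MVar lets the main
-- thread wait for (and receive) the worker's result.
main :: IO ()
main = do
  done <- newEmptyMVar
  _ <- forkIO $ do
    let total = sum [1 .. 100 :: Int]   -- the worker's independent task
    putMVar done total                  -- hand the result back
  result <- takeMVar done               -- block until the worker finishes
  putStrLn ("worker computed: " ++ show result)
```

Parallelism, by contrast, is about using multiple cores to finish one computation faster (in GHC, via the par combinator or parallel strategies); this sketch does none of that – its two threads are concurrent whether or not they ever run simultaneously.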
What intrigues me most about concurrent systems is the ability of multiple systems to work together – not as a cluster, nor in a failover configuration – but as a coherent set of multiple systems doing what each does best. A compute-farm with dynamically responding discrete and unique systems would be one example. Another example (which intrigues me to no end) is a network of servers that responds as a whole to a threat on any one of the servers individually.
Concurrent programming over the network, used to its fullest, means that a single server no longer needs to be the largest unit of response: the network as a whole can respond, not just one server, and information can be shared among a group of servers rather than held by only one. Concurrency opens up a vast number of fascinating ideas.