Discussion: Parallelism and Concurrency

Casey Hawthorne
2009-11-16 19:08:27 UTC
Parallelism and Concurrency: I would separate the two concepts:
Parallelism - one task shared across multiple cores
Concurrency - different tasks shared among multiple cores

For both, the overhead of communication between the tasks, compared to
the useful work done on the tasks themselves, is the biggest challenge.

For parallelism (and concurrency), any immutable data can be stored
locally, reducing memory latency; and since memory prices are falling
all the time, more and more local data can be accommodated.

So, for problems with many readers and only one writer, things are not
so bad.

For many writers and some readers, one wants as much immutable data as
possible, and also to localize the effects of mutable data. That would
seem to suit functional languages like Haskell, which are designed to
"cleanly" separate pure code from side-effecting code (code whose
effects are globally visible).

--
Regards,
Casey
Jon Harrop
2009-11-17 02:26:52 UTC
Post by Casey Hawthorne
Parallelism - one task shared across multiple cores
Concurrency - different tasks shared among multiple cores
No, you have defined data parallelism and task parallelism, respectively.
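To make the distinction concrete, here is a minimal Python sketch (illustrative only; the thread itself names no code) contrasting data parallelism, where one operation is mapped over partitioned data, with task parallelism, where unrelated operations run at the same time:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

# Data parallelism: one operation applied across a partitioned data set.
with ThreadPoolExecutor(max_workers=4) as pool:
    squares = list(pool.map(square, range(8)))

# Task parallelism: different operations running concurrently.
with ThreadPoolExecutor(max_workers=2) as pool:
    total = pool.submit(sum, range(100))
    biggest = pool.submit(max, range(100))
    results = (total.result(), biggest.result())
```

(Python threads are used only to show the structure of the two styles; a real data-parallel workload would use processes or a runtime without a global interpreter lock.)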

Definitions of concurrency differ but the general idea is that the control
flow of a concurrent program is non-deterministic.
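A small Python sketch of that non-determinism (my illustration, not from the thread): two threads interleave unpredictably from run to run, even though each thread's own steps stay ordered, which is exactly what makes testing by replaying one observed schedule unreliable.

```python
import threading

events = []
lock = threading.Lock()

def worker(name):
    # Each thread records its own steps in order, but the global
    # interleaving of the two threads differs from run to run.
    for i in range(3):
        with lock:
            events.append((name, i))

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```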
Post by Casey Hawthorne
For both, the overhead of communication between the tasks, compared to
the useful work done on the tasks themselves, is the biggest challenge.
I'd say the biggest challenge with concurrent programming is
correctness, because non-deterministic control flow renders testing much
less useful, and the biggest challenge with parallel programming is
granularity.
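The granularity problem can be sketched in Python (an editorial illustration; the chunk size below is the knob): chunks that are too small make per-task overhead swamp the useful work, while chunks that are too large leave cores idle.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for real per-element work.
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, chunk_size):
    # chunk_size controls granularity: tiny chunks mean scheduling
    # overhead dominates; huge chunks mean poor load balancing.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        return sum(pool.map(process_chunk, chunks))

result = parallel_sum_squares(list(range(1000)), chunk_size=100)
```

The answer is the same for any chunk size; only the overhead-to-work ratio changes.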
Post by Casey Hawthorne
For parallelism (and concurrency), any immutable data can be stored
locally, reducing memory latency; and since memory prices are falling
all the time, more and more local data can be accommodated.
Yes.
Post by Casey Hawthorne
So, for problems with many readers and only one writer, things are not
so bad.
The readers must still get updates from the writer somehow.
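One minimal way to provide that update channel, sketched in Python (an assumption of this illustration, not something the thread specifies): even with a single writer, readers still need some synchronization to observe a consistent current value.

```python
import threading

class Published:
    """One writer publishes values; many readers take consistent reads."""
    def __init__(self, value):
        self._value = value
        self._lock = threading.Lock()

    def write(self, value):
        # Called by the single writer.
        with self._lock:
            self._value = value

    def read(self):
        # Called by any number of readers.
        with self._lock:
            return self._value

box = Published(0)
seen = []

def writer():
    for v in range(1, 101):
        box.write(v)

def reader():
    for _ in range(50):
        seen.append(box.read())

threads = [threading.Thread(target=writer)] + \
          [threading.Thread(target=reader) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

(The list `seen` is appended to without its own lock only because CPython's list append is atomic; a reader-writer lock or an atomic reference would be the usual choice elsewhere.)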
Post by Casey Hawthorne
For many writers and some readers, one wants as much immutable data as
possible, and also to localize the effects of mutable data. That would
seem to suit functional languages like Haskell, which are designed to
"cleanly" separate pure code from side-effecting code (code whose
effects are globally visible).
There is a flip side: functional languages generate heaps with far more
pointers, which bogs down the GC.

Moreover, the only reason to parallelize code is performance and Haskell is
a disaster in the context of performance.
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Casey Hawthorne
2009-11-17 05:37:16 UTC
Post by Jon Harrop
Moreover, the only reason to parallelize code is performance and Haskell is
a disaster in the context of performance.
I take it you mean compared to OCaml.

Compared to other functional languages?
--
Regards,
Casey
Jon Harrop
2009-11-18 00:58:23 UTC
Post by Casey Hawthorne
Post by Jon Harrop
Moreover, the only reason to parallelize code is performance and Haskell
is a disaster in the context of performance.
I take it you mean compared to OCaml.
Compared to C, C++, Java, C#, F#, OCaml...
Post by Casey Hawthorne
Compared to other functional languages?
Do you mean purely functional?
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Casey Hawthorne
2009-11-20 22:57:36 UTC
I just realized that the separation into data parallelism and task
parallelism reminds me of the debates in AI decades ago over whether
"intelligence" is in the program or in the data.

We now know that the "intelligence" exhibited by a program is in both.

I wonder if such a unification is possible between data parallelism and
task parallelism; it might lead to a better theory of parallel
architectures.
Post by Jon Harrop
Post by Casey Hawthorne
Parallelism - one task shared across multiple cores
Concurrency - different tasks shared among multiple cores
No, you have defined data parallelism and task parallelism, respectively.
Definitions of concurrency differ but the general idea is that the control
flow of a concurrent program is non-deterministic.
--
Regards,
Casey
Jon Harrop
2009-11-21 05:07:10 UTC
Post by Casey Hawthorne
I wonder if such a unification is possible in the data parallelism and
task parallelism worlds, which would lead to a better theory of
parallel architectures.
Well, functional programming unifies them because tasks are data. :-)
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
Jon Harrop
2009-11-21 05:08:08 UTC
Post by Jon Harrop
Post by Casey Hawthorne
I wonder if such a unification is possible in the data parallelism and
task parallelism worlds, which would lead to a better theory of
parallel architectures.
Well, functional programming unifies them because tasks are data. :-)
OK, not quite: tasks spawn more tasks, but data doesn't (it can only be
nested). In fact, that is a good justification for my preferring task
parallelism to nested data parallelism. :-)
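That dynamic spawning can be sketched in Python (an editorial illustration; the thread gives no code): a task splits its input and forks two child tasks up to a depth cutoff, something a static nested data layout cannot do on its own.

```python
import threading

def parallel_sum(xs, results, idx, depth=2):
    # A task that, above the cutoff, spawns two child tasks and
    # joins them -- tasks spawning more tasks.
    if depth == 0 or len(xs) <= 1:
        results[idx] = sum(xs)
        return
    mid = len(xs) // 2
    sub = [0, 0]
    left = threading.Thread(target=parallel_sum,
                            args=(xs[:mid], sub, 0, depth - 1))
    right = threading.Thread(target=parallel_sum,
                             args=(xs[mid:], sub, 1, depth - 1))
    left.start()
    right.start()
    left.join()
    right.join()
    results[idx] = sub[0] + sub[1]

out = [0]
parallel_sum(list(range(1000)), out, 0)
```

(Plain threads are used rather than a fixed-size pool because recursively submitting to a bounded pool and blocking on the children can deadlock.)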
--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u