Monday, August 10, 2015

Concurrency Again

Scalable Planning
:: by Dr. David DiNucci ::

Hot languages such as Clojure and Python are building concurrency structures into their basic grammar.

Sure, you might want to talk to the operating system about threads, but maybe your language uses different concepts, such as "start this now and get back to me" or "do this later".

Let the translator talk to the OS, while you the coder stay blissfully in your native language sphere.
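In Python, for instance, the "start this now and get back to me" / "do this later" idiom maps onto the async/await syntax that arrived with asyncio.  A minimal sketch (the job name and the sleep are just placeholders for real work):

```python
import asyncio

async def slow_job():
    # "do this later" -- a stand-in for real work
    await asyncio.sleep(2)
    return "slow job finished"

async def main():
    # "start this now and get back to me"
    task = asyncio.create_task(slow_job())
    print("carrying on in the meantime...")
    # only here do we actually wait for the answer
    print(await task)

asyncio.run(main())
```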

As I was planning with Glenn, we need to converge computer science with theater a lot more if folks are to understand, at a deeper level, what operations research and general systems theory are all about.  Business executives need these concepts as much as coders do, if they wish to avail themselves of economies of scale.

It's not just that a website is like a backstage, with JavaScript puppets keeping a user amused; it's that when you direct films, or plays, with casts of thousands, the extras cannot all wait on each other for cues.

They have their instructions, some of which may involve waiting for other processes to finish.  Once you get the ball, run with it.  Many relay races, many Olympic events, are all going at the same time, perhaps with some kind of scheduler (the OS itself?) with a sense of priorities (changing?).

The mirror of a multi-process or multi-threaded back end is an event driven front end.

If a process dispatches a whole lot of worker bee processes to tackle some task, with a "report back when done" instruction, how will the program know when to check back?  Waiting for the teapot to boil is just another form of blocking.
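One answer to "when to check back" is: don't check, get called back.  A rough sketch with Python's concurrent.futures (the worker and callback names are invented for illustration): each worker bee gets a "report back when done" callback, and the dispatcher never blocks on any one teapot.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def worker_bee(task_id):
    time.sleep(task_id)                      # stand-in for real work
    return f"worker {task_id} reporting: done"

def report_back(future):
    # called automatically when a worker finishes -- no polling, no blocking
    print(future.result())

with ThreadPoolExecutor(max_workers=4) as pool:
    for task_id in range(1, 5):
        pool.submit(worker_bee, task_id).add_done_callback(report_back)
    # the dispatcher is free to do its own work in the meantime
    print("all worker bees dispatched; carrying on...")
```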

How is work accomplished in the meantime?

That question is often more intuitively answerable when we use a control panel indicator, on some dashboard or in a cockpit, to show "percent complete" and leave it to the human controller, the driver, the pilot, to initiate some next action.
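A toy version of that dashboard idea in Python (the shared progress dict and the timings are placeholders): a background thread does the work and only reports percent complete; nothing in the code forces the next step, which stays with the operator.

```python
import threading, time

progress = {"done": 0, "total": 100}         # hypothetical dashboard state

def long_job():
    for _ in range(progress["total"]):
        time.sleep(0.05)                     # stand-in for one unit of work
        progress["done"] += 1

threading.Thread(target=long_job, daemon=True).start()

# the cockpit: display percent complete, leave the next move to the human
while progress["done"] < progress["total"]:
    percent = 100 * progress["done"] // progress["total"]
    print(f"{percent}% complete")
    time.sleep(1)
print("100% complete -- over to the pilot for the next action")
```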

The human controller is likewise a multi-tasker, as is that human's own anatomical infrastructure.  The human body is about as parallel (concurrent) as it gets.

"This work is now done, so you have the option to do this other thing".  Just an option.

Just because a gun is loaded, doesn't mean you have to fire it.

Yes, military planners confront these same concepts.  A lot of these concepts were initially hammered out in some war-fighting context.

Dr. David DiNucci has done extensive research into concurrency as a topic and has come up with what amounts to a graphical language even choreographers or theater directors might use, to organize dances or plays with lots of non-blocking calls, lots of not waiting amidst waiting.

Many patterns recur in concurrency diagrams.  A cast of threaded workers, awaiting tasks, is a well-known pattern.  Workers pick tasks off a queue and go off on their own, reporting back when complete.  These workers are also known as "listeners" or "subscribers" or "agents" in the design pattern literature.
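That worker-pool pattern is easy to sketch with Python's queue and threading modules (the stage-crew task names are just flavor): a small cast of threads awaits tasks on a queue and reports back on another.

```python
import queue, threading

tasks = queue.Queue()
results = queue.Queue()

def worker():
    # a threaded worker awaiting tasks
    while True:
        job = tasks.get()                    # blocks until a task arrives
        if job is None:                      # sentinel: the show is over
            break
        results.put(f"{job}: complete")      # report back when complete
        tasks.task_done()

cast = [threading.Thread(target=worker) for _ in range(3)]
for t in cast:
    t.start()

for job in ["paint the set", "cue the lights", "raise the curtain"]:
    tasks.put(job)

for _ in cast:                               # one sentinel per worker
    tasks.put(None)
for t in cast:
    t.join()

while not results.empty():
    print(results.get())
```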

From the description of Dr. DiNucci's Scalable Planning @ Amazon:
A new graphical representation called ScalPL (Scalable Planning Language) is then introduced for building even complex concurrent activities of all kinds from those elemental activities, one mind-sized bite at a time. For programmers, structured and object-oriented programming are extended into the concurrent realm, and performance techniques are explored. For the more serious student, axiomatic semantics and proof techniques are covered.
In today's world of Containers and Micro-services in the Cloud, the emphasis is on freeing up components to get work done regardless of the various critical paths through the network.

A given job may get hung up somewhere, waiting for Y to finish, but X and Z have already moved on, free to take on other work.  A well-designed ecosystem does not freeze up or get paralyzed when a particular process seizes.  Just move the frozen process to a back burner and move on.
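A minimal sketch of that idea, again with Python's asyncio (the service names X, Y, Z and the delays are invented): wait with a timeout, harvest whatever finished, and set the straggler aside rather than letting it paralyze the whole show.

```python
import asyncio

async def service(name, delay):
    await asyncio.sleep(delay)               # stand-in for a network call
    return f"{name} finished"

async def main():
    jobs = [
        asyncio.create_task(service("X", 1)),
        asyncio.create_task(service("Y", 999)),   # Y is hung up somewhere
        asyncio.create_task(service("Z", 2)),
    ]
    done, pending = await asyncio.wait(jobs, timeout=5)
    for task in done:
        print(task.result())                 # X and Z have already moved on
    for task in pending:
        task.cancel()                        # set the frozen job aside (here: cancel it) and move on

asyncio.run(main())
```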