Synchronous programming language


A synchronous programming language is a computer programming language optimized for programming reactive systems. Computer systems can be sorted into three main classes: transformational systems, which take some inputs, process them, deliver their outputs, and terminate their execution (a typical example is a compiler); interactive systems, which interact continuously with their environment, at their own speed (a typical example is the web); and reactive systems, which interact continuously with their environment, at a speed imposed by the environment (a typical example is the automatic flight control system of modern airplanes). Reactive systems must therefore react to stimuli from the environment within strict time bounds. For this reason they are often also called real-time systems, and are often found in embedded systems.
Synchronous programming, also called synchronous reactive programming (SRP), is a computer programming paradigm supported by synchronous programming languages. The principle of SRP is to apply to programming languages the same abstraction that underlies synchronous digital circuits. Synchronous circuits are indeed designed at a high level of abstraction where the timing characteristics of the electronic transistors are neglected. Each gate of the circuit is therefore assumed to compute its result instantaneously, and each wire is assumed to transmit its signal instantaneously. A synchronous circuit is clocked: at each tick of its clock, it instantaneously computes its output values and the new values of its memory cells from its input values and the current values of its memory cells. In other words, the circuit behaves as if the electrons were flowing infinitely fast. The first synchronous programming languages were invented in France in the 1980s: Esterel, Lustre, and Signal. Since then, many other synchronous languages have emerged.
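As an illustration, the following minimal Python sketch (the tick function, the "enable" input and the "overflow" output are made up for this example and are not part of any synchronous language or toolchain) mimics the clocked abstraction: at every tick, the outputs and the next memory state are computed as a pure function of the inputs and the current memory state, as if gates and wires had zero delay.

    # Sketch of the synchronous (clocked) abstraction: at each tick, the
    # outputs and the next memory state are computed "instantaneously"
    # from the inputs and the current memory state.
    def tick(state, inputs):
        # Illustrative transition/output function: a modulo-4 counter that
        # raises an "overflow" output on the tick where it wraps around.
        count = state["count"]
        if inputs["enable"]:
            count = (count + 1) % 4
        outputs = {"overflow": inputs["enable"] and state["count"] == 3}
        return {"count": count}, outputs

    state = {"count": 0}
    for n, enable in enumerate([True, True, True, True, False, True]):
        state, out = tick(state, {"enable": enable})
        print(f"tick {n}: state={state} outputs={out}")

Because tick is a pure function of inputs and memory, every run with the same input sequence produces exactly the same sequence of outputs, which is the determinism the synchronous abstraction is after.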
The synchronous abstraction makes reasoning about time in a synchronous program much easier, thanks to the notion of logical ticks: a synchronous program reacts to its environment in a sequence of ticks, and computations within a tick are assumed to be instantaneous, i.e., as if the processor executing them were infinitely fast. The statement "a||b" is therefore abstracted as the package "ab", where "a" and "b" are simultaneous. To take a concrete example, the Esterel statement "every 60 second emit minute" specifies that the signal "minute" is exactly synchronous with the 60th occurrence of the signal "second". At a more fundamental level, the synchronous abstraction eliminates the non-determinism resulting from the interleaving of concurrent behaviors. This allows deterministic semantics, therefore making synchronous programs amenable to formal analysis, verification and certified code generation, and usable as formal specification formalisms.
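The behaviour of that statement can be roughly mimicked by the following hand-written Python simulation (the reaction function and its signal bookkeeping are illustrative assumptions, not generated Esterel code): "minute" is emitted in the very same logical tick as every 60th "second", so the two signals are logically simultaneous.

    # Sketch of "every 60 second emit minute": "minute" is emitted in the
    # same logical tick as every 60th "second", so the two signals are
    # simultaneous in the synchronous abstraction.
    def reaction(count, second_present):
        emitted = set()
        if second_present:
            count += 1
            if count == 60:
                emitted.add("minute")
                count = 0
        return count, emitted

    count = 0
    for t in range(1, 121):
        count, signals = reaction(count, second_present=True)
        if "minute" in signals:
            print(f"tick {t}: 'second' and 'minute' occur in the same tick")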
In contrast, in the asynchronous model of computation, on a sequential processor, the statement "a||b" can be implemented either as "a;b" or as "b;a". This is known as interleaving-based non-determinism. The drawback of an asynchronous model is that it intrinsically forbids deterministic semantics, which makes formal reasoning such as analysis and verification more complex. Nonetheless, asynchronous formalisms are very useful for modeling, designing and verifying distributed systems, because such systems are intrinsically asynchronous.
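This interleaving sensitivity can be seen in a small hand-written Python sketch (the functions a and b and the shared variable are illustrative only): when the two branches share state, executing "a;b" or "b;a" produces different results, which is precisely the non-determinism the synchronous abstraction removes.

    # Sketch of interleaving-based non-determinism: running a || b as
    # "a;b" or as "b;a" on shared state yields different final values.
    def a(env):
        env["x"] = env["x"] + 1   # increment the shared variable

    def b(env):
        env["x"] = env["x"] * 2   # double the shared variable

    for order in ([a, b], [b, a]):
        env = {"x": 1}
        for step in order:
            step(env)
        print([f.__name__ for f in order], "->", env["x"])
    # a;b gives 4, b;a gives 3: the outcome depends on the interleaving chosen.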
Also in contrast are systems whose processes interact synchronously with one another, such as systems built on the Communicating sequential processes (CSP) model, which also allows nondeterministic choice.

Synchronous languages