I saw a pair of talks recently that I thought were interesting, and am posting pointers to both. The first, by Rob Pike, was presented at a Google Free and Open Source Software meeting in Sydney and concerns building lexical analyzers in Go. The second, by Rich Hickey, was presented to the Boston Lisp meeting and introduces his Clojure dialect of Lisp to an audience of Lisp programmers.
In Rob Pike's talk, Lexical Scanning In Go, he presents a clever application of channels and goroutines to build a lexical scanner. The result feels reminiscent of the Continuation-Passing Style, in the sense of capturing state in functions that take a lexer environment as an argument.
But something that struck me was an almost-parenthetical comment where Pike contrasts parallelism and concurrency. In the Communicating Sequential Processes-inspired model implemented in Go, the two can be separated, though they are often conflated. In particular, concurrency in Go refers to components of a program that execute independently, though not necessarily simultaneously (that is, in parallel). This is a subtle but important distinction: Pike's lexer is essentially single-threaded even though it has two concurrent components; communication via channels acts as a serialization mechanism, and the two components run sequentially instead of in parallel.
Pike's lexer is a simple, idiomatic piece of Go code; it illustrates the principles of the language and the type of programming that it encourages, and provides an extremely elegant solution to a common problem.
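The shape of the design can be sketched in a few lines of Go. This is a rough, much-simplified approximation in the spirit of the talk, not Pike's actual code: the token types and state functions here are invented for illustration. The key ideas are the stateFn type (each state is a function that returns the next state) and the channel that serializes tokens between the lexing goroutine and its consumer.

```go
package main

import (
	"fmt"
	"unicode"
)

// item is a token emitted by the lexer.
type item struct {
	typ string
	val string
}

// stateFn represents a lexer state as a function returning the
// next state -- the CPS-like trick described in the talk.
type stateFn func(*lexer) stateFn

type lexer struct {
	input string
	start int       // start of the current token
	pos   int       // current scan position
	items chan item // channel on which tokens are delivered
}

func (l *lexer) emit(typ string) {
	l.items <- item{typ, l.input[l.start:l.pos]}
	l.start = l.pos
}

// run drives the state machine until a state returns nil.
func (l *lexer) run() {
	for state := lexText; state != nil; {
		state = state(l)
	}
	close(l.items)
}

// lexText dispatches to a state based on the next character.
func lexText(l *lexer) stateFn {
	if l.pos >= len(l.input) {
		return nil
	}
	switch r := rune(l.input[l.pos]); {
	case unicode.IsDigit(r):
		return lexNumber
	case unicode.IsSpace(r):
		l.pos++
		l.start = l.pos // skip whitespace
		return lexText
	default:
		return lexWord
	}
}

// lexNumber consumes a run of digits and emits a "number" token.
func lexNumber(l *lexer) stateFn {
	for l.pos < len(l.input) && unicode.IsDigit(rune(l.input[l.pos])) {
		l.pos++
	}
	l.emit("number")
	return lexText
}

// lexWord consumes a run of non-space, non-digit characters.
func lexWord(l *lexer) stateFn {
	for l.pos < len(l.input) &&
		!unicode.IsSpace(rune(l.input[l.pos])) &&
		!unicode.IsDigit(rune(l.input[l.pos])) {
		l.pos++
	}
	l.emit("word")
	return lexText
}

// lex runs the state machine in its own goroutine and collects
// tokens. The lexer and consumer are concurrent components, yet
// the channel serializes them: nothing runs in parallel.
func lex(input string) []item {
	l := &lexer{input: input, items: make(chan item)}
	go l.run()
	var out []item
	for it := range l.items {
		out = append(out, it)
	}
	return out
}

func main() {
	for _, it := range lex("abc 123 def") {
		fmt.Printf("%s(%s)\n", it.typ, it.val)
	}
}
```

Note how adding a new token type needs no changes to the driver loop: a new state function simply returns itself into the chain.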
Rich Hickey's talk, Clojure for Lisp Programmers, part 1 and part 2, is an extensive (the whole talk is roughly three hours) introduction to Clojure for experienced Lispers. What is interesting here is the way that Hickey describes the Clojure design philosophy in contrast to traditional Lisp approaches: Clojure builds on abstractions instead of concrete implementations. For example, the two-item cons cells used to implement the lists of traditional Lisp are considered too concrete; their details become ingrained in functions that manipulate lists, and make those functions difficult to extend elegantly to different structures. So Clojure creates an abstraction called a Seq for things that are sequential in nature; Seqs have a first element and a rest (another Seq), and anything that implements the abstract Seq interface can take advantage of functions that operate on Seqs. The details of how a Seq is implemented are opaque: a list may be implemented in terms of something like cons cells, while a vector may be implemented as an array. Things may be fully constructed, or lazily constructed as elements are required; the details don't matter. Functions like map, reduce and filter work over any Seq independent of the details.
While this feels conceptually like an Object-Oriented approach, pretty much everything in Clojure is immutable, and there is no encapsulated state. The approach seems closer to the type classes of Haskell et al.
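Hickey's examples are, naturally, in Clojure; to keep a single language in this post, here is a rough Go sketch of the same idea. The Seq interface, the two implementations, and the sum function below are all invented for illustration and are not Clojure's actual interfaces, but they show the point: a function written against first/rest works unchanged over a cons-style list and an array-backed sequence.

```go
package main

import "fmt"

// Seq is a rough analogue of Clojure's sequence abstraction:
// anything with a first element and a rest.
type Seq interface {
	First() int
	Rest() Seq // nil when there are no more elements
}

// listSeq is a cons-cell-style linked list.
type listSeq struct {
	head int
	tail Seq
}

func (l *listSeq) First() int { return l.head }
func (l *listSeq) Rest() Seq  { return l.tail }

// sliceSeq presents the same abstraction over an array-backed slice.
type sliceSeq struct {
	data []int
	i    int
}

func (s *sliceSeq) First() int { return s.data[s.i] }
func (s *sliceSeq) Rest() Seq {
	if s.i+1 >= len(s.data) {
		return nil
	}
	return &sliceSeq{s.data, s.i + 1}
}

// sum (a simple reduce) works on any Seq, independent of its
// implementation -- the point of programming to the abstraction.
func sum(s Seq) int {
	total := 0
	for ; s != nil; s = s.Rest() {
		total += s.First()
	}
	return total
}

func main() {
	list := &listSeq{1, &listSeq{2, &listSeq{3, nil}}}
	slice := &sliceSeq{data: []int{1, 2, 3}}
	fmt.Println(sum(list), sum(slice)) // the same answer from both
}
```

Unlike a typical object-oriented version, nothing here mutates: Rest returns a new view rather than advancing internal state, loosely echoing Clojure's immutability.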
But what is really important about Hickey's talk, and indeed about Clojure, is that it shows that someone has really taken the time to think about the underlying principles of Lisp and had the courage to cleanly break with the past in a way that neither Common Lisp nor Scheme could have when they were defined. Clojure preserves the essence of Lisp while recasting it for the 21st century, and Hickey's talk is a nice presentation of that.
Clojure and Go are both neat, elegant, refined implementations of proven techniques and concepts. In these talks their creators show off interesting elements of their designs and provide fascinating insights into their creations. Readers are highly encouraged to watch both talks if they have time: they are interesting, relevant and well presented.