The Model Driven Software Network


A mind map of Martin Fowler's DSL book.

Today, I finished reading the thick black book, "Domain-Specific Languages" by Martin Fowler.

 

Here's my personal mind map of the book (click the image to magnify).

Recently, my colleagues have been rapidly moving from Java to Ruby, initially because they found that Java web applications with XML configuration were much more cumbersome than Ruby on Rails. They naturally became big fans of Cucumber tests, Rakefiles, and other Ruby DSLs (internal and external).

 

I'm finding that Ruby's meta-programming capability, flexible syntax, blocks (closures), and literal expressions give it great power for designing internal DSLs. Twenty years ago, I was a Unix hand and a heavy user of yacc/lex to incorporate "my languages" into my applications, so this movement makes me nostalgic, and I wanted to bring my knowledge up to date.
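To illustrate what those Ruby features buy you, here is a minimal sketch (my own toy example, loosely in the spirit of Rake, not code from the book) of an internal DSL built with blocks and instance_eval. The TaskList and task names are purely illustrative.

```ruby
# A toy internal DSL: each DSL call just records data, and blocks defer
# the actual work until the task is run.
class TaskList
  def initialize
    @tasks = {}
  end

  # The "keyword" of our little language: associates a name with a block.
  def task(name, &action)
    @tasks[name] = action
  end

  def run(name)
    @tasks.fetch(name).call
  end

  # Entry point: evaluate the DSL block in this object's context,
  # so bare `task` calls resolve to the method above.
  def self.define(&block)
    list = new
    list.instance_eval(&block)
    list
  end
end

tasks = TaskList.define do
  task(:greet) { "hello" }
  task(:count) { 1 + 2 }
end

puts tasks.run(:greet)  # prints "hello"
```

The whole trick is that instance_eval changes what `self` means inside the block, so the bare word `task` reads like a keyword of a new language while remaining an ordinary method call.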

 

That's why I started reading the book. To me, Part I is the part I needed to read; the other parts are for DSL designers (useful patterns, which reminded me of the "Dragon Book").

 

If I had to choose one quote from the book, it would be this one about the Semantic Model and DSLs:

All the DSL does is provide a readable way of populating that model - that is the difference from the command-query API I started with.

From the DSL's point of view, I refer to this model as the "Semantic Model" (159)...

I advocate a Semantic Model because it provides a clear separation of concerns between parsing a language and the resulting semantics...

I put the "DSL populates the Semantic Model" message and its diagram at the center of the mind map.
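Here is a toy sketch of that central idea in Ruby (my own illustration, not an example from the book; the StateMachine and MachineBuilder names are hypothetical). The Semantic Model is plain objects that know nothing about parsing; the DSL layer is just a readable way to populate them.

```ruby
# --- Semantic Model: plain objects, no knowledge of any DSL syntax ---
class StateMachine
  def initialize(start)
    @state = start
    @transitions = {}   # { [state, event] => next_state }
  end

  def add_transition(from, event, to)
    @transitions[[from, event]] = to
  end

  def fire(event)
    @state = @transitions.fetch([@state, event], @state)
  end

  def current
    @state
  end
end

# --- DSL layer: its only job is to populate the model ---
class MachineBuilder
  def initialize(machine)
    @machine = machine
  end

  def on(event, from:, goto:)
    @machine.add_transition(from, event, goto)
  end

  def self.build(start, &dsl)
    machine = StateMachine.new(start)
    new(machine).instance_eval(&dsl)
    machine
  end
end

door = MachineBuilder.build(:closed) do
  on :open,  from: :closed, goto: :open
  on :close, from: :open,   goto: :closed
end

door.fire(:open)
puts door.current  # prints "open"
```

You could swap the builder for an external parser or a command-query API without touching StateMachine at all, which is exactly the separation of concerns Fowler is advocating.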

 

Among the four "why"s of DSLs, as an Agilista I especially liked the "Communication with Domain Experts" aspect. One of my colleagues, @moro, has been developing "moro-miso", which works with Cucumber and lets you use Japanese terms in the testing language.

(EDIT 5/6: "moro-miso" is now incorporated into "cucumber-rails", thanks @moro)

 

Coincidentally, I recently attended the DSM (Domain-Specific Modeling) workshop at SPLASH 2010, hosted by Steven Kelly of MetaCase, where I learned the concept of a "language workbench". So I read up a bit on other workbenches and added the information at the bottom right.

 

Also, InfoQ has some good articles about MD*, which I added at the bottom left.

 

I'm a developer of the modeling tool "Astah" (http://astah.change-vision.com), and I'm exploring ways to incorporate DSL concepts or to collaborate with DSL tools and workbenches.

 

A PDF version of the mind map is also available for printing on A3 or larger paper: DomainSpecificLanguagesByMartinFowler.pdf


Comment by Angelo Hulshout on May 23, 2011 at 20:06
I understand Japan is not around the corner, but you can still check out the solutions. Even better: a first one has been put online by the people from the Whole Platform, so you can actually play with it. I'll post the link after the workshop.
Comment by Kenji Hiranabe on May 23, 2011 at 1:30

Thanks, Angelo, for the LWC info. I wish I could come over from Japan, but I can't make it this time. Say hello to Steven Kelly.

Comment by Angelo Hulshout on May 22, 2011 at 22:28

Nice mind map. Concerning the tools list, why not consider all 12 tools that participate in the LWC (http://www.languageworkbenches.net), and are presented in the workshop this Tuesday?

 

Comment by Kenji Hiranabe on May 17, 2011 at 5:01

Is it still alive? I was once especially interested in the "M" language. Why did it die, by the way?

Here are my thoughts on Oslo (from back in 2008):

http://blogs.itmedia.co.jp/hiranabe/2008/10/oslo-m-language.html

Comment by Meinte Boersma on May 17, 2011 at 4:39
I guess Oslo should be in there for completeness' sake, but otherwise it's pretty dead...
Comment by Kenji Hiranabe on May 17, 2011 at 1:38
I should add "www.eclipse.org/Xtext" and Microsoft's "Oslo". Sorry.
Comment by Meinte Boersma on May 16, 2011 at 22:59
I'm missing a mention of Xtext under the Tools sub tree :(
Comment by Kenji Hiranabe on May 7, 2011 at 15:19

Clifford,

Thanks for your insight. "A specialized semantic modeling language, the Constellation Query Language", what an awesome approach!

  • noun - Ubiquitous Language
  • verb - fact types
  • rules - constraints

Yes, I've never seen those three together in one host language, expressed in a domain-specific way. I think UML is struggling with its complexity, as Ivar Jacobson himself says, because it aimed from the first place for universality and extensibility, although we learned a lot from it. But now we need a simple way of capturing a semantic model from the viewpoint of the people within the domain, not from the viewpoint of the solution or the technology.

 

The latter part of your discussion ... may be great, but I'm not able to follow it without further knowledge of parsers and grammars. Anyway, thanks a lot for your comment. I'm quite new to this community (this blog space), so I'm glad to get such detailed feedback.

Comment by Clifford Heath on May 6, 2011 at 6:17

I'm glad you recognize the importance of the Semantic Model. I have been a user of the semantic modeling approach for more than twenty years now, and have even created a specialized semantic modeling language, the Constellation Query Language (tm). It enables you to capture the semantic model from the domain experts in much greater detail and clarity than any previous approach, including particularly DDD, because it models not just the nouns (which make up DDD's Ubiquitous Language), but also the verb expressions (fact types) and rules (constraints) in the natural English language used by those experts. It does this without loss of formality; in fact the meta-model of CQL maps (via ORM2) directly to Common Logic, so it has a better formal basis than UML does. Yet the semantic model can be easily validated and populated with examples by the domain experts directly. For more information, please check http://dataconstellation.com/ActiveFacts/.

 

As a reviewer of Martin's book (https://twitter.com/#!/martinfowler/status/12948552199), I helped him to get a handle on the benefits of constructing external DSLs using Parsing Expression Grammars (PEGs). These are not only very powerful, but much easier to use and debug than previous generations of parser generators, because they're basically just recursive descent (LL) grammars. LR parser generators can be terribly difficult to debug, and the authors of the tools seem to be incapable of producing helpful error messages. Normally LL grammars are either not very powerful, because they have limited lookahead, or very inefficient, because backtracking produces exponential behavior; but PEGs are implemented using packrat techniques that allow unlimited backtracking with O(N) (linear!) cost in both time and memory. Unlike ANTLR-generated parsers, which are also LL and very efficient, PEGs are composable, which means that you can extend and combine grammars at runtime, which is not possible with any other parsing technology. It also makes possible the idea of a "Language Workbench" for plain-text languages, in the form of a language that can dynamically extend its own grammar. As far as I know, however, this idea has not yet been implemented.
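[Editor's note: the packrat idea described above can be sketched in a few lines of Ruby. This is an illustrative toy, not Treetop's actual implementation; the PackratParser name and the tiny arithmetic grammar are invented for the example.]

```ruby
# Packrat parsing in miniature: recursive descent plus a memo table
# keyed on (rule, position). The memo is what bounds backtracking to
# linear time -- each rule is evaluated at most once per input position.
class PackratParser
  def initialize(input)
    @input = input
    @memo = {}   # [rule, pos] => [value, new_pos] or nil on failure
  end

  # Memoized rule application: repeated backtracking into the same
  # rule at the same position costs O(1) after the first attempt.
  def apply(rule, pos)
    @memo.fetch([rule, pos]) { @memo[[rule, pos]] = send(rule, pos) }
  end

  # sum <- num ('+' num)*
  def sum(pos)
    value, pos = apply(:num, pos) || (return nil)
    while @input[pos] == '+'
      rhs, next_pos = apply(:num, pos + 1) || break
      value += rhs
      pos = next_pos
    end
    [value, pos]
  end

  # num <- [0-9]+
  def num(pos)
    m = /\A\d+/.match(@input[pos..]) or return nil
    [m[0].to_i, pos + m[0].length]
  end

  def parse
    result = apply(:sum, 0)
    result && result[1] == @input.length ? result[0] : nil
  end
end

puts PackratParser.new("1+2+30").parse  # prints 33
```

A real PEG tool like Treetop generates this kind of memoized matcher from a grammar file instead of hand-written methods, but the underlying mechanism is the same.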

 

I'm the maintainer of Treetop, which is the predominant PEG parser generator used in the Ruby community; it powers Cucumber and the new Rails Mailer gem.

Comment by Kenji Hiranabe on May 6, 2011 at 1:22

Thanks, Marco!

I read the link and found good practices for designing visual DSLs, whereas most of Martin's practices are for textual DSLs.

© 2017   Created by Mark Dalgarno.