Model Driven Development
Model Driven Development seems to be enjoying a bit of an upsurge lately. Gartner has recently been hyping (is it fair to accuse the inventors of the hype curve of hyping something, or is that redundant?) what they call “low code / no code” environments.
Perhaps they are picking up on and reporting a trend, or perhaps they are creating one.
Model Driven Development has been around for a long time. To backfill what this is and where it came from, I’m going to recount my own experience with Model Driven, as I think it provides a first-person narrative of most of what was happening in the field at the time.
I first encountered what would later be called Model Driven in the early 80’s when CAD (Computer Aided Design—of buildings and manufactured parts) was making software developers jealous. Why didn’t we have workbenches where we could generate systems from designs? Early experiments coalesced into CASE (Computer Aided Software Engineering). I was running a custom ERP development project in the early 80’s (on an ICL Mainframe!) and we ended up building our own CASE platform. The interesting thing about that platform was that we built the designs on recently acquired 8-bit microcomputers, which we then pushed to a compatible framework on the mainframe. We were able to iterate our designs on the PCs, work out the logistical issues, and get a working prototype UI to review with the users before we committed to the build.
The framework built a scaffold of code based on the prototype and indicated where the custom code needed to go. This forever changed my perspective on how systems could and should be built.
What we built was also being built at the same time by commercial vendors (we did this project in Papua New Guinea and were pretty out of the loop as to what was happening in mainstream circles). When we came up for air, we discovered what we had built was being called “I-CASE” (Integrated Computer Aided Software Engineering), which referred to the integration of design with development (it seemed like that was the idea all along). I assume Gartner would call this approach “low code,” as there was still application code to be written for the non-boilerplate functionality.
Next stop on my journey through model driven was another ERP custom build. By the late 80’s a few new trends had emerged. One was that CAD was being invaded by parametric modeling. Parametric modeling recognizes that many designs of physical products do not need to be redesigned by a human every time a small change is made to the input factors. A motor mount could be designed in such a way that a change to the weight, position, or torque would drive a new design optimized for those new factors. The trusses for a basketball court could be automatically redesigned if the span, weight, or snow load changed, and the design of big-box retail outlets could be derived from, among other things, wind shear, maximum rainfall, and seismic potential.
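The parametric idea can be sketched in a few lines of code. The sizing rules below are toy placeholders (not real structural engineering, and not from any system described here); the point is that the design is a pure function of its input factors, so changing the span or the snow load regenerates the design automatically.

```python
from dataclasses import dataclass

@dataclass
class TrussInputs:
    span_m: float         # clear span of the court
    dead_load_kpa: float  # fixed weight (roof, fixtures)
    snow_load_kpa: float  # site-specific snow load

def design_truss(inputs: TrussInputs) -> dict:
    """Derive a truss layout from its input factors.

    The rules here are illustrative placeholders; a real parametric
    CAD system would encode an engineer's actual design logic.
    """
    total_load = inputs.dead_load_kpa + inputs.snow_load_kpa
    panel_count = max(4, round(inputs.span_m / 3))   # ~3 m panels
    depth_m = inputs.span_m / 12                     # toy span/depth ratio
    # simple beam analogy: M = wL^2/8, chord force = M / depth
    chord_force = total_load * inputs.span_m ** 2 / (8 * depth_m)
    return {
        "panel_count": panel_count,
        "depth_m": round(depth_m, 2),
        "chord_force_kn_per_m": round(chord_force, 1),
    }

# Change one input and the whole design is re-derived, no human redraw:
print(design_truss(TrussInputs(span_m=30, dead_load_kpa=1.0, snow_load_kpa=1.5)))
```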
The other trend was AI (remember AI? Of course you remember AI, even if, like most of us, you forgot about it from the early 90’s until Watson and Google’s renaissance of the field).
Being privy to these two trends, we decided to build a parametric model of applications and have the code generation be driven by AI. Our goal was to be able to design a use case on a Post-it note. We didn’t quite achieve it; most of our designs ran up to a page long. But this was a big improvement over what was on offer at the time. We managed to generate 97% of the code in this very sophisticated ERP system. While it was not a very big company, I have yet to see more complex requirements in any system since: lot-based inventory, multi-modal outbound logistics, a full ISO 9000-compliant laboratory information management system, in-line QA, and complex real-time product disposition based on the physical and chemical characteristics of each lot.
In the mid 90’s we were working on systems for ambulatory health care, building semantic models for our domain. Instead of parametric modeling, we defined all application behavior in a scripting language called Tcl. One day we drew on a whiteboard where all the Tcl scripts fit in the architecture (they defined the UI, the constraint logic, the schema, etc.). It occurred to us that with the right architecture, the Tcl code, and therefore the behavior of the application, could be reduced to data. The architecture would interpret the data and create the equivalent of application behavior.
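That insight, application behavior reduced to data that a generic engine interprets, can be sketched minimally. The model below is plain data (it could just as well live in a database), and the engine knows nothing about any particular application; the entity, field names, and rules are hypothetical examples, not taken from the original system.

```python
# The "application" is nothing but data. Nothing below is hard-coded
# to patients: swap in a different model and you get a different app.
patient_form_model = {
    "entity": "Patient",
    "fields": [
        {"name": "name", "type": "str", "required": True, "label": "Full name"},
        {"name": "age", "type": "int", "required": True, "min": 0, "max": 130},
        {"name": "phone", "type": "str", "required": False, "label": "Phone"},
    ],
}

def validate(model: dict, record: dict) -> list:
    """Generic constraint engine: checks any record against any model."""
    errors = []
    for f in model["fields"]:
        value = record.get(f["name"])
        if value is None:
            if f["required"]:
                errors.append(f"{f['name']} is required")
            continue
        if f["type"] == "int":
            if not isinstance(value, int):
                errors.append(f"{f['name']} must be an integer")
            elif not (f.get("min", value) <= value <= f.get("max", value)):
                errors.append(f"{f['name']} out of range")
    return errors

print(validate(patient_form_model, {"name": "Ada", "age": 200}))
# flags the age as out of range; the constraint came from data, not code
```

The same engine could walk the model to render a UI from the `label` fields or emit a schema from the `type` fields; that is the whole trick of a fully model-driven architecture.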
We received what I believe to be the original patents on fully model driven application development (patent number 6,324,682). We eventually built an architecture that would interpret the data models and build user interfaces, constraints, transactions, and even schemas. We built several healthcare applications in this architecture and were rolling out many more when our need for capital and the collapse of the .com bubble ended this company.
I offer this up as a personal history of the “low code / no code” movement. It is not only real; as far as we are concerned, its value is underrepresented even amid the hype.
More recently we have become attracted to the opportunity that lies in helping companies become data-centric. This data-centric focus has mostly come from our work with semantics and enterprise ontology development.
What we discovered is that when an enterprise embraces the elegant core model that drives its business, all its problems become tractable. Integration becomes a matter of conforming to the core. New system development becomes building to a much, much simpler core model.
Most of these benefits come even without embracing Model Driven. There is amazing economy in reducing the size of your enterprise data model by two orders of magnitude.