EXPLORING BUSINESS RULES

Preface to the Business Rule Book, Second Edition

Ronald G. Ross
Co-Founder & Principal, Business Rule Solutions, LLC; Executive Editor, Business Rules Journal; Co-Chair, Building Business Capability (BBC)

As readers of the Newsletter and attendees of my seminars know, I have been on the track of business rules for many years. Here is a relevant quote from my book, Entity Modeling: Techniques and Applications, page 102, published in 1987. Incidentally, I would not change a word of it today.

"... specific integrity rules [of an enterprise], even though 'shared' and universal ... (just like its data should be), traditionally have not been captured in the context of its [data] model(s). Instead, they usually have been stated vaguely (if at all) in largely uncoordinated analytical and design documents, and then buried deep in the logic of application programs. Since application programs are notoriously unreliable in the consistent and correct application of such rules, this has been the source of considerable frustration and error. It sometimes also has led, unjustly, to distrust of the data model itself."

Besides the crucial problem of database integrity (correctness), current interest in business rules arises for other compelling reasons, including the following.

  • Accelerating rate of change. Businesses currently are challenged by a variety of factors to be more adaptable than ever before. Unfortunately, they are finding that the current architecture of their bread-and-butter information systems does not permit this. The underlying problem is lack of unification in their various practices, policies and guidelines (a.k.a. business rules). Solving this is key to re-engineering.

  • Opportunities afforded by new hardware/software platforms. Client/server architectures feature specialization of the data management function and, through triggers and stored procedures, a means to focus directly on integrity. The emergence of rule engines and rule servers provides equivalent capabilities. Active database systems and new forms of middleware promise to do even more.

  • Impetus of object orientation. OO's sharp focus on classes or types has led to a new wave of thinking about the nature of "functional" requirements.

  • Higher-order automation schemes. Experience with CASE has demonstrated that traditional methods are woefully inadequate for the code generation needs of the future - and that radically new approaches for the capture and automation of user requirements are needed.

  • Re-engineering work. Front-end technologies (e.g., GUIs) have rapidly transformed basic forms of user-machine interaction - but have also raised concerns about consistency and coordination. On a broader scale, businesses are seeking fresh ideas for streamlining and organizing workflow. Many IT professionals consequently have begun to seek some bold new approach that seamlessly addresses both user interactions and workflow on the front-end, as well as the database on the back-end.

"Business rules represent one such radically new approach. Indeed, I believe they represent a revolution in the making."

Business Rules

Business rules represent one such radically new approach. Indeed, I believe they represent a revolution in the making. They take non-procedurality to a level never seen before. Compared with traditional techniques for expressing user "requirements," they demand a completely new mindset.

To appreciate this, consider the following additional quote from my 1987 book (also page 102).

"The enterprise naturally has dozens or hundreds [or thousands] of specific rules (or business policies) that govern its own behavior and distinguish it from others. Since these rules govern changes in the state of the enterprise, they translate directly into updating rules for its database(s)."

This quote expresses two fundamental ideas, both central to business rules, as follows.

  • At the operational level, an enterprise is a collection of rules. I believe that businesses must embrace this view to achieve the adaptability they seek. A business is not merely a collection of processes or procedures to execute. This revised view will require new designs for information systems - ones that seek to make turning rules on or off as easy as flipping light switches. I call such designs rule-based.

  • Rules should be data-based. Large-scale enterprises are characterized at the operational level by multitudes of "users" with diverse responsibilities, each having specific objectives that not infrequently conflict. Concurrency and substantial query requirements are the givens. In such an environment, attempting to enforce rules within processes or procedures ultimately is futile. What all these "users" have in common is their databases, and the need to record and share persistent results (of processes and procedures) in standard form. This strongly suggests that the expression of business rules should be based on "data." (Implementation, of course, is a separate question.) This, in turn, suggests that their specification should be declarative, rather than procedural.
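To make the contrast concrete, here is a minimal sketch in Python (hypothetical names such as rule_credit_limit; nothing here is Ross Method notation) of the difference between burying a rule inside each updating procedure and stating it once, declaratively, over the persistent data it governs.

    from dataclasses import dataclass

    @dataclass
    class Customer:
        name: str
        credit_limit: float
        open_order_total: float = 0.0

    # Procedural style: the rule is re-coded (and easily missed) inside every
    # procedure that happens to update the data.
    def place_order_procedural(customer: Customer, amount: float) -> None:
        if customer.open_order_total + amount > customer.credit_limit:
            raise ValueError("credit limit exceeded")  # rule buried in the process
        customer.open_order_total += amount

    # Data-based, declarative style: the rule is stated once as a predicate over
    # persistent data; every process that changes the data is held to the same test.
    def rule_credit_limit(customer: Customer) -> bool:
        return customer.open_order_total <= customer.credit_limit

    def apply_update(customer: Customer, amount: float) -> None:
        customer.open_order_total += amount
        if not rule_credit_limit(customer):      # uniform enforcement
            customer.open_order_total -= amount  # reject the offending update
            raise ValueError("business rule violated: credit limit")

The point of the sketch is only the shape of the solution: the rule lives with the data, not with any one procedure.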

Put simply, what I am saying is that business information systems have unique characteristics. They are not the same as real-time systems, or systems software, or process control, or other such computing problems. Because business information systems are different, they require their own solution. I believe strongly that the business rule approach is the right one.

What are Rules?

A rule may be defined as a constraint or a test exercised for the purpose of maintaining the integrity (i.e., correctness) of data. The purpose of a rule generally is to control the updating of persistent (i.e., stored) data - in other words, the results that the execution of actions (processes) is permitted to leave behind.

Such control reflects desired patterns for business behavior. A rule thus embodies a formal, implementable expression of some "user requirement," usually stated in textual form using a natural language (e.g., English).

This textual form is called a Business Rule Statement. Each Business Rule Statement indicates a discrete, operational practice or policy in running the business, without reference to any particular implementation strategy or specific technology. (Users, of course, generally should not be concerned with how rules actually are enforced.)

The textual expression of each business rule - the Business Rule Statement - is extremely important. Unfortunately, Business Rule Statements often are ambiguous, and cannot be translated directly into an actual implementation (i.e., into running code). The task of the Rule Analyst thus is to translate Business Rule Statements into formal, precise (and implementable) rules.
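As a small, hypothetical illustration of that translation step (this is not Ross Method's own notation), consider the Business Rule Statement "An order must not be shipped before it has been paid," restated as a precise, testable expression over the data involved.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class Order:
        paid_on: Optional[date] = None
        shipped_on: Optional[date] = None

    # Business Rule Statement (natural language, potentially ambiguous):
    #   "An order must not be shipped before it has been paid."
    # Formal rule (precise and implementable): shipment data may exist only if
    # payment data exists, and payment occurred no later than shipment.
    def rule_ship_after_payment(order: Order) -> bool:
        if order.shipped_on is None:
            return True  # nothing shipped yet, so the rule is satisfied
        return order.paid_on is not None and order.paid_on <= order.shipped_on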

Traditionally, rules have been viewed as "editing checks" or "validation criteria." This technical view fails to appreciate the potential of rule-based design either as a specification approach or, potentially, as an implementation strategy.

Rules traditionally have been implemented in procedural logic buried deep in application programs - in a form that is virtually unrecognizable, and far removed from original business intent. Such strategies not only produce highly inconsistent enforcement of the rules, but make rapid change in them virtually impossible.

Ross Method

The question, of course, is exactly how data-based, nonprocedural (i.e., declarative) expression of rules can be accomplished.

  • Traditional techniques (e.g., data flow diagrams, action diagrams, pseudo-code, and other forms of procedural models) offer little assistance.

  • Expert systems provide rules for inference, but these generally are not in a form directly suitable for enforcement of business policies.

  • Data models (and certain object models) provide a fertile starting point for expressing rules (they define data types, a basic ingredient), but offer little beyond that.

This book introduces a new approach called Ross Method. I believe the book offers several contributions to the new field of business rules, including the following.

Classification of Rule Types

This first contribution proves that good things often come in small packages. The basic classification scheme is summarized opposite. (It also appears on the cover of the book.) A second classification scheme is offered for derivative rule types. These classification schemes are the heart of Ross Method. More on that in a moment.

"The central idea is this: rule types compute."

Modeling Techniques for Rules

Ross Method offers a specific syntax for exploiting the rule types of the classification schemes to express the actual rules of an enterprise. This syntax happens to be graphic. It does not have to be - the classification schemes are the heart, and they could be used for non-graphic forms of syntax as well (e.g., for textual forms of a conceptual query/constraint language). I'm sure they will be used for that. But the graphic syntax provides proof-of-concept that a comprehensive declarative language is possible - and useful - for expressing rules.

Representative Set of Examples

The book contains over 500 real-life rule models, resulting in its rather substantial size. Collecting and modeling the 500+ examples was no small chore! The set of 500+ rules was selected carefully in order to represent the scope of the rule "problem." Consider this to be a minimum set - there are many more areas of requirements that rules can address. I strongly believe that any specification technique for rules, or any rule-based software product, should address at least this set. Consider the set to represent baseline test cases.

About the Classification Schemes for Rule Types

Another question I have been asked frequently is what makes Ross Method unique. The obvious (but least important) answer is that it is graphic - not graphic just by accident or as an afterthought, but designed that way from the start. After all, we live in a point-and-click world!

A second answer is that it was designed specifically as a rule language - not as an extension to a database query language. This makes it unlike the extended forms of SQL used in some business rule approaches and tools. (This is not to say those approaches and tools are not good ones as far as they go.)

The third answer, however, is by far the most important. I claim that Ross Method is a higher-level rule language. To understand why, I must explain some things about the classification schemes for the rule types. As I have said, these are literally the heart of the matter. The central idea is this: rule types compute.

Take a moment to inspect the Chart of Atomic Rule Types. The Chart is organized into seven columns, representing seven families. (The rows in the Chart have no meaning.)

Each of the seven families is based on some distinct type of computation that the rule types in the family "know" how to do. In the case of the mathematical evaluators, the computation is obvious; for the other rule families, it is less so. The families produce atomic rule types because no other rule types "know" how to do their particular type of computation.

Usually, the result of the computation for a particular rule is hidden. However, it can be materialized for use by other rules in what I call the rule's Yield Value. Yield Value types and rule types are inseparable - twin reflections of the same idea. (Abbreviations for Yield Value types are given in the lower right-hand corner of each rule type box in the Chart of Atomic Rule Types.)
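Roughly, a Yield Value can be pictured as the materialized result of one rule's computation, made available as input to another rule. The sketch below is only an illustration under that reading; the account example and function names are hypothetical, not taken from the Chart.

    from dataclasses import dataclass

    @dataclass
    class Account:
        balance: float
        withdrawals_today: float

    # A rule whose computation produces a value. That result plays the role of a
    # Yield Value: ordinarily hidden, but it can be materialized for other rules.
    def yield_remaining_allowance(account: Account, daily_cap: float) -> float:
        return daily_cap - account.withdrawals_today

    # A second rule consumes the first rule's Yield Value instead of repeating
    # the computation itself.
    def rule_withdrawal_allowed(account: Account, amount: float,
                                daily_cap: float = 500.0) -> bool:
        return amount <= yield_remaining_allowance(account, daily_cap)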

There are several important consequences of this computational base for rules, as follows.

  • First, I believe the classification scheme for atomic rule types is correct in a fundamental sense. In fact, I believe this is the classification scheme for atomic rule types.

  • Second, I believe Ross Method is therefore an example of what I call a rules calculus - a higher-order scheme for rule specification. In mathematics, the invention of calculus provided a new form of expression for problems that would be difficult and cumbersome in lower forms. Calculus addresses certain important problems extremely well. I believe that Ross Method crosses a similar threshold for rule specification.

  • Third, the computational base provides a means to distinguish rule types that are atomic from those that are not. If a rule type's Yield Value is not atomic, then the rule type is a derivative - that is, it can be computed or derived from other rule types. Derivatives, incidentally, represent an important means by which the syntax of Ross Method may be extended.

The Periodic Table of Chemical Elements is a good analogy for the classification of atomic rule types. Just as there are millions of different chemical compounds, all made from the same small number (100+) of atomic elements, so too can businesses be viewed as having many thousands of "compound" rules all made from the same set of atomic rule types.

My new vision for information systems thus is this - clever compounds of rules (known only to a few), all based on a relatively small set of atomic rule types (known to many). Just as chemical engineers produce complex chemical compounds from the same basic elements, rule engineers will produce complex information systems from the same atomic rule types.
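The compounds-from-atoms idea can be sketched very simply: treat each atomic rule type as a small, reusable predicate, and assemble business-specific compound rules by combining them. The building blocks below are illustrative stand-ins only; they are not the rule types of the Chart.

    from typing import Callable

    Rule = Callable[[dict], bool]  # a rule: a predicate over an anchor's data

    # "Atomic" building blocks: each knows how to perform one kind of test.
    def mandatory(attr: str) -> Rule:
        return lambda row: row.get(attr) is not None

    def limited_to(attr: str, maximum: float) -> Rule:
        return lambda row: row.get(attr) is not None and row[attr] <= maximum

    # A "compound" rule assembled from the atoms, much as a chemical compound
    # is assembled from elements.
    def all_of(*rules: Rule) -> Rule:
        return lambda row: all(rule(row) for rule in rules)

    valid_order_line = all_of(mandatory("customer_id"),
                              mandatory("quantity"),
                              limited_to("quantity", 100))

    print(valid_order_line({"customer_id": 7, "quantity": 20}))   # True
    print(valid_order_line({"customer_id": 7, "quantity": 250}))  # False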

Evaluating Techniques for Modeling Business Rules

The syntax of Ross Method approaches the problem of specifying more complex rules in building-block fashion. Since I believe all rules ultimately should be based on atomic rule types, such support for building compound rules is a must.

Beyond that, crucial criteria for evaluating any syntax for expressing rules include the following.

  • Extensible. Some compound rules re-appear frequently. For convenience, the syntax should permit any such rule type to be pre-defined, based on a template of the atomic rule types from which it is derived. The resulting derivative rule type, once named, subsequently should be usable in the same fashion as any atomic rule type. By this means, the syntax becomes extensible. Such extensibility is vital since no pre-established syntax can hope to cover directly all the many types of rules that practitioners will seek to express.

  • Expansive. For a specification approach to rules to be viable, it must be comprehensive. The syntax therefore must embody a pervasive way of thinking about information problems that applies to most situations rule analysts are likely to encounter. Otherwise, it will be viewed merely as a special-case technique, which eventually may prove easy to dismiss. A corollary is this - it is more important that the syntax provide a consistent solution for the widest possible variety of cases than that it always be the most convenient or obvious for each individual case.

  • Executable. An ultimate goal for the syntax is that it produces specifications that are computable. This means that the syntax must be rigorous and unambiguous so that a rote procedure (e.g., a compiler, interpreter, etc.) can "process" it with predictable, consistent results. This, of course, precludes native use of natural language (e.g., English) for such syntax. The benefit of such precision, incidentally, is not limited just to code generation. As practitioners have discovered in building data (and other types of) models, statements produced by "users" as requirements inevitably are riddled with hidden ambiguities and inconsistencies. A language or modeling technique that addresses this problem successfully has significant value to designers in and of itself.

  • Expressive. The syntax must be usable for capture and communication of rules at a high level. In this context, "high level" means independently of any specific implementation strategy (e.g., hardware/software platform). This includes database technology (e.g., rows, columns, referential integrity, and other concepts of the relational model; triggers and stored procedures of server DBMS; etc.). The reasons are simple. First, "users" do not care about technology per se. Second, a rule that is dependent on any physical database component is literally not a "business" rule.

Undoubtedly, other criteria for rules are important as well. The four above (I call them the four E's), however, are foremost. Incidentally, satisfying only one or several of the E's is not sufficient.

Rethinking Methodologies

How have methodologies addressed such rules in the past? The answer is, "poorly or not at all." Traditional methodologies generally address rules (if at all) only relatively late in the system development life cycle. This, of course, is when it becomes most expensive to do so.

This latter point deserves additional comment. I have found that every rule can be reduced or transformed into two or more individual validation checks (I call these system or update events) when actual enforcement of the rule must take place. Incidentally, I believe that a rote procedure (i.e., compiler, interpreter, etc.) always could discover these update events automatically.

The point regarding methodologies, however, is the following. Although crucial to the implementation of a rule, specifying these individual update events is not essential in understanding the business intent of the rule.
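To illustrate (with a hypothetical example, not one from the book): a single declarative rule such as "every order line must refer to an existing product" quietly covers several distinct update events, and those events could be enumerated mechanically from the data the rule touches.

    # One declarative rule:
    #   "Every order line must refer to an existing product."
    # Enforcement decomposes into several distinct update events, each of which
    # a rote procedure could discover from the rule's own data references.
    def events_for_referential_rule(child: str, fk: str,
                                    parent: str, pk: str) -> list:
        return [
            ("insert", child),       # a new child row must name a parent
            ("update", child, fk),   # re-pointing a child row
            ("delete", parent),      # removing a row that may be referenced
            ("update", parent, pk),  # re-keying a referenced row
        ]

    for event in events_for_referential_rule("order_line", "product_id",
                                             "product", "id"):
        print(event)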

"How have methodologies addressed such rules in the past? The answer is, 'poorly or not at all'."

Yet that is exactly the "end" of the problem on which traditional (and current) system development methodologies have concentrated. The result usually is very procedural, implementation-specific deliverables that dance separate dances around the central fire of the rule. Lost in that dance is what ties everything together - the original rule itself. Sooner or later, this loss will become painfully apparent - because sooner or later the business almost surely will want to change the rule. Then what?

This dis-integrated approach to specifying rules is perhaps the single most significant problem in analysis today. It is one that Ross Method corrects.

Is It Complete?

Rules clearly address semantics (meaning). A philosopher might ask whether any approach to semantics ever can be complete. Frankly, I do not know.

Early in my research into rules, I decided to explore how much could be expressed with no reference whatsoever to process or procedure. Why?

  • For one reason, I wanted to see how much I could express without them. What I found continues to amaze me. The 500+ examples in the book provide ample demonstration that the potential scope of declarative expression of rules is far greater than most (including myself) ever expected. Rules simply "do" much of what processes traditionally have done. (I must confess that while working on hard problems, I sometimes was tempted in moments of frustration to seek procedural solutions. But I resisted, and always eventually found a declarative alternative. And it was always better.)

  • A second reason is that I believe a full-fledged approach for rule-based design first required development of a consistent and comprehensive framework for rules in general. Then the framework could be targeted to actions (processes) in particular. The resulting unified "logic" ultimately is bound to prove far more powerful and consistent (and simple) than the results of piecemeal alternatives.

  • The third reason is that the data-based approach to specifying rules actually is a simplifying one. The approach taken by Ross Method is that rules never can constrain processes directly, but rather only the results of those processes. Therefore, rules need to be expressed only in terms of "data." There is no need for any special attention to process in this regard - and indeed, many reasons to avoid it.

The limited coverage of actions in this book does not imply in any way that I am suggesting that "process" is unimportant. Clearly, a workable approach to information systems design cannot be complete without it. Nor am I saying that processes (actions) cannot be the target of rules (e.g., to control their enabling or execution). Obviously, they can be, and should be.

"My new vision for information systems thus is this-clever compounds of rules (known only to a few), all based on a relatively small set of atomic rule types (known to many). Just as chemical engineers produce complex chemical compounds from the same basic elements, rule engineers will produce complex information systems from the same atomic rule types."

A complete approach for information systems design includes at least the following four components: data types, states, rules, and actions (processes). This book covers rules and data types (the latter more or less as a given). Surprisingly, it also involves an approach to expressing states. (The examples in the discussion of the sequence-controller rule family illustrate this.)

In covering these three components, the book established a direct and promising framework for the fourth (actions), and thus for a complete technique for information systems design.

I am pleased to report that this expectation has come to fruition in the new technique, Use Cases + Rules (UC+R), recently introduced by Gladys S.W. Lam, my partner in Business Rule Solutions, LLC, and myself. UC+R is one of many practical techniques in our Business Rule Methodology, BRSolutions. And there is much more to come!

Rules and Normalization

Rules address integrity (i.e., correctness). This is fundamentally what they are about.

The greatest body of theoretical work to date on integrity in data modeling is Normalization from relational theory. Normalization provides sound prescriptions for evaluating whether a data model is a good one with respect to optimizing the integrity of persistent (stored) data. These prescriptions are represented by the normal forms (first normal form, second normal form, etc.).

Is there a connection between rules and Normalization? In a word, yes - a very fundamental one! This is among the most significant insights from my work on rules. It is a very exciting idea.

The connection is based on the following observations about rules. 

  • First, rules are, or at least have, data. This "data" includes the rule's truth value and its Yield Value.

  • Second, this "data" is persistent. It must last longer than individual frames of processing or transactions so that the rule can be tested or enforced properly across applications.

So rules involve data, and that data persists. This is precisely where Normalization takes up for regular business data - the very same prescriptions can be applied to rules. In other words, rules normalize!

Actually, that is not exactly the right way to say it. Relational experts say that tables (relations) normalize. Unfortunately, saying "rules have values that can be considered along with the other values of a table when normalized" does not have quite the same ring to it.

How does this important idea relate to the graphic rule modeling approach of Ross Method? Rules always have connections to other types, often data types. One of these connections, the most important, is to an anchor type. Choosing the anchor for a rule is a key step in getting the rule right. It is to its anchor that a rule "normalizes."
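A rough way to picture a rule "normalizing" to its anchor (again a sketch, not Ross Method notation): the rule's own data - its truth value and its Yield Value - is determined by the anchor's key, so it is evaluated and stored with the anchor just as any attribute that normalizes there would be. The department example below is hypothetical.

    from dataclasses import dataclass

    # The anchor type. The rule "normalizes" to it: the rule's truth value and
    # Yield Value are facts about one department, determined by its key.
    @dataclass
    class Department:
        dept_id: int
        headcount: int
        headcount_cap: int

    def rule_headcount_yield(dept: Department) -> int:
        """Yield Value of the rule: remaining headcount for this department."""
        return dept.headcount_cap - dept.headcount

    def rule_headcount_truth(dept: Department) -> bool:
        """Truth value of the rule, likewise a fact about the anchor."""
        return rule_headcount_yield(dept) >= 0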

The connection between rules and Normalization represents a fundamental guarantee for your information system designs. It means that there is a formal theoretical basis (i.e., one that is not arbitrary) for ensuring designs are good ones (i.e., provable and repeatable). When a design is good, we know why.

Acknowledgments

I wish to take a brief moment for thanks and credits.

The idea of comparing the Chart of Atomic Rule Types to the Periodic Table of Chemical Elements originally was suggested by John Zachman. In the fall of 1990, I showed him working examples, and this analogy was among his observations. It has provided an important focus for the emerging classification schemes of Ross Method.

I should thank Arnold Barnett and his widow, Leah Barnett, who sponsored my seminar on data modeling for many years. I should also thank Digital Consulting, LLC, which has sponsored my seminar on business rule concepts. The many attendees at these seminars have offered suggestions and criticisms, and these have been invaluable. It has been within these continuing forums that many of the ideas presented in the book have evolved.

Many examples in this text actually originated from those attendees. Most of the rest originated from my own consulting work or imagination. A few, however, originally were suggested by leading industry experts (including Robert G. Brown, Chris Date, John Hall, James Odell, Michael Stonebraker, Barbara von Halle, and others). The give-and-take on these examples continues to prove extremely valuable.

I would like to thank Barbara von Halle for her long-standing interest, active give-and-take, and professional and personal support. Special thanks are due to all members of the GUIDE Business Rules Project for stimulating ideas and constructive criticisms. I also wish to thank Gladys S.W. Lam, my partner in Business Rule Solutions, LLC for her innovative thoughts on business rule methodology, and for her continuing encouragement.

I also wish to thank the readers of the DataToKnowledge Newsletter (formerly Data Base Newsletter) for their interest in the debate on business rules. Their continuing support of the Newsletter is appreciated greatly.

The second edition of this book was reviewed by a number of industry experts who provided very helpful feedback and suggestions. Among these experts were Barbara von Halle, David C. Hay, Keri Anderson Healy, Ted Farmer, Colleen McClintock, and Paul Winsberg. Keri Anderson Healy's intense scrutiny of Part 1 has been invaluable. All of these professionals share my love of the problem; I am fortunate as well to count each as my friend.

I have been fortunate that so many people have taken an active interest in this work. The interaction has been extremely valuable. There is much left to learn. Let me openly invite all comments and responses from readers.

Finally, my graphics and layout person, Mark Wilson, has demonstrated excellence (and patience) on both editions of the book well beyond the normal call of duty. I owe him a special debt of gratitude.

In all honesty, I have worked on this subject matter longer and harder by far, over more years, than any other single effort in my professional career. It seems fitting therefore that I dedicate this book to my children, who are the most important thing, by far, ever to have come into my personal life.

Ronald G. Ross
First Edition
West University Place, Texas
December 30, 1993

Second Edition
Galveston, Texas
December 30, 1996

Website revisions
April 30, 1999
