Wednesday, December 14, 2011

Pega Rocks!

Recently, as part of my CSSA training, I gave a short presentation about how Pega rule resolution enables the delivery of enterprise software. Trying to grasp the background behind the often technical descriptions of the what and how of the rule resolution process, this is what I’ve come up with...


Pega is really focused on delivering enterprise-wide solutions. An enterprise typically has a complex organization structure, sells more than one product or product type, has offices or branches in more than one location, and sells products through various channels to several customer groups. One could say that the only stable factor in an enterprise is change itself :-)


Pega handles complexity really well, internally following the structure of the organization, the products it delivers and the markets it serves. A software system built with Pega to support business processes typically stays close to the organization structure of the enterprise, knowing that management at organizational, divisional and unit level all “need” their own influence on the software support of the business processes they own: certain regulations, like human resource management, may be maintained at the organization level, while much functionality is likely to differ at divisional and unit level.

To allow this to work, Pega breaks down functionality to the level of what it calls “a rule”. Several rule types exist, like flow rules to create process flows, user interface rules to create user interfaces, or decision rules to store business rules. Rules can be defined at one layer and specialized at another, thus allowing differences between division and unit level while still ensuring compliance with enterprise rules defined at the organizational level.

Developers use the so-called “Enterprise Class Structure” (ECS) together with “Custom Built Frameworks” (CBF) and standard Pega frameworks to obtain optimal reuse and speed of development; see the example below:


A typical application uses rules that are defined at the unit level and that inherit rules from the division and enterprise layers below. Rules from framework layers can be injected using multiple inheritance paths. As such, developers can create an application handling car insurance claims using the Pega CRM framework, augmented with car insurance specifics defined in the custom-built framework, while at the same time complying with the reporting requirements by using the business rules defined at divisional and enterprise level.

Development and deployment are supported by rulesets

To allow for easy deployment, rules can be grouped in rulesets. An application typically comprises several rulesets. Rules that make up a Custom Built Framework, for instance, could be stored in several rulesets, allowing for easy deployment. Rulesets follow the structure of the ECS, and although the rules for a unit could reside in one ruleset, often different aspects of the rules in the set need to be released independently of each other. Rulesets are versioned, and versions can reside next to each other. Specific rules in a ruleset may be withdrawn or blocked to allow patching functionality and creating quick fixes. Rules can be set to final to make sure that the rule cannot be specialized any further.
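To make the versioning idea concrete, here is a small sketch of how “highest available version wins” selection could work. The Major-Minor-Patch version format mirrors Pega’s ruleset version notation, but the data, field names and selection logic are simplified illustrations of my own, not the actual Pega implementation.

```python
# Hypothetical sketch of ruleset version selection: several versions of a rule
# live side by side, blocked or withdrawn ones are skipped, and the highest
# remaining version wins. Data and names are illustrative only.

def version_key(version):
    """Turn a 'MM-mm-pp' string like '02-01-03' into (2, 1, 3) for comparison."""
    return tuple(int(part) for part in version.split("-"))

candidates = [
    {"version": "01-02-05", "status": "available"},
    {"version": "02-01-00", "status": "available"},
    {"version": "02-01-03", "status": "blocked"},   # blocked pending a quick fix
]

available = [c for c in candidates if c["status"] == "available"]
best = max(available, key=lambda c: version_key(c["version"]))
print(best["version"])  # → 02-01-00
```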

Rule Resolution makes it work

In order for this system to work and to allow for optimal reuse, Pega comes with an advanced rule resolution scheme. All rules are stored in a database under a rule name, referring to a class and a rule type. At run-time the system determines the most appropriate rule to run, considering the operator, the operator’s authorizations and access to frameworks, place in the organization and so on. Besides that, the system determines the most appropriate rule on more technical aspects like the class name, circumstance values (parameters provided) and the validity of the rule. And finally, rules can be blocked and withdrawn to cater for deployment, release schedules and quick fixes.
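As a thought experiment, the core of such a resolution scheme can be sketched in a few lines. This is a much simplified model of my own making: the class names, the candidate list and the ranking are invented for illustration, and the real algorithm weighs many more factors (rulesets, security, validity dates and so on).

```python
# Simplified rule resolution sketch: filter out unavailable rules and rules
# whose circumstance doesn't match the current case, then prefer the rule
# defined closest to the requesting class; within one class, a matching
# circumstanced rule beats the base rule. All names are illustrative.

# Inheritance path, from most specific (unit) to most generic (enterprise).
CLASS_PATH = ["Org-Div-Unit-Claim", "Org-Div-Claim", "Org-Claim"]

candidates = [
    {"name": "CalculatePremium", "applies_to": "Org-Claim",
     "circumstance": None, "available": True},
    {"name": "CalculatePremium", "applies_to": "Org-Div-Claim",
     "circumstance": None, "available": True},
    {"name": "CalculatePremium", "applies_to": "Org-Div-Claim",
     "circumstance": ("Country", "NL"), "available": True},
]

def resolve(rule_name, case):
    usable = []
    for rule in candidates:
        if rule["name"] != rule_name or not rule["available"]:
            continue
        if rule["applies_to"] not in CLASS_PATH:
            continue
        circ = rule["circumstance"]
        if circ is not None and case.get(circ[0]) != circ[1]:
            continue  # circumstanced rule that doesn't apply to this case
        usable.append(rule)
    # Closest class wins; a matching circumstanced rule beats the base rule.
    return min(usable, key=lambda r: (CLASS_PATH.index(r["applies_to"]),
                                      r["circumstance"] is None))

print(resolve("CalculatePremium", {"Country": "NL"})["circumstance"])  # → ('Country', 'NL')
print(resolve("CalculatePremium", {"Country": "DE"})["applies_to"])    # → Org-Div-Claim
```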

Pega offers an unparalleled system for reuse. It truly keeps its promise to deliver a system for enterprise-wide software delivery... in my opinion Pega Rocks!

Saturday, December 10, 2011

Keep structure when going Agile with BPMS, refactor continuously!

BPM suite implementations benefit highly from an agile implementation approach. They are built to let you embrace change, add business value quickly and learn as you go. This is realized by offering an integrated requirements approach, by providing decision tables that can be changed by the business itself, by generating code rather than coding it, and by making it possible to deploy new software at the press of a key.


Agile approaches need constant attention to architecture

When using an agile approach you don’t design everything up-front. Remember that one of the promises of an agile approach is that it allows us to embrace change; we only want to build the piece of software that has the highest business value at this moment. As Thomas Carlyle stated in the 19th century: “Our main purpose is not to see what lies dimly at a distance, but to do what lies clearly at hand.” We in the 21st century discover that this holds true in software development, so we had better not take decisions on things we can only guess about, but rather focus on problems we really have to solve at this moment. The same holds true for the software architecture. Rather than guessing at what a good architecture for the envisioned solution could be, we now develop the architecture as we go. This means that when we implement a new piece of functionality that touches an existing part, we have to pay special attention to structure (the architecture). If not enough attention is paid to this aspect, the structure of the solution will surely deteriorate, causing severe problems: newly added models may conflict with existing ones, and models may become duplicated, leading to unexpected results and to a structure which is hard to see and to maintain.


How to obtain a solid architecture in agile approaches? Refactor!

So how do we continuously develop an architecture in the ongoing process of implementing pieces of software, without having an overview of the final solution? An interesting technique, introduced by Martin Fowler, is called refactoring. This technique is often used in traditional software development; currently most IDEs support basic refactorings by selecting a code fragment and choosing the refactoring to apply to it. The technique is useful not only for traditional software development: BPM suite implementations can benefit from it as well. Let me elaborate a bit on the technique itself and give you some examples.


How does refactoring work?

Refactoring is based on making small, controlled changes to the software and using automated tests to validate that a change doesn’t break functionality. The refactorings described by Martin Fowler each focus on improving one aspect of the code, and for each refactoring the possible side effects are described.

I will give an example of refactoring in traditional source code. Suppose we have the following source code:

if (customer.rating > 25) {
    total = (price * 0.95) * (1.0 + VAT_percentage / 100.0);
    Send();
}
else {
    total = (price * 0.98) * (1.0 + VAT_percentage / 100.0);
    Send();
}

Since it is quite a small fragment, we can easily understand what is going on. But suppose the calculation of the total price requires adding up order line amounts with different VAT percentages; the code will quickly grow, creating a real need for more insight and structure. For the example, however, I will use this small fragment to explain the approach.

The first thing one might notice in the fragment is that the calculation is duplicated. Such duplication makes the code harder to maintain and makes it more likely that errors are introduced when the calculation is changed in one part of the code but not updated in the other. This “bad smell” can be fixed using the “Extract Method” refactoring. In our case we extract the method “CalculateTotalPrice”, leading to the code fragment below:

CalculateTotalPrice(price, discountFactor) {
    return (price * discountFactor) * (1.0 + VAT_percentage / 100.0);
}

if (customer.rating > 25) {
    total = CalculateTotalPrice(price, 0.95);
    Send();
}
else {
    total = CalculateTotalPrice(price, 0.98);
    Send();
}


After running the unit tests we should find that we didn’t break anything. And had we broken a test, we changed only one little thing, so the failing result can easily be pinpointed to that change.
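For readers who want to try this, here is the same example rendered in Python together with the unit tests that guard the refactoring. The VAT percentage value is an assumption for illustration; the point is that the tests pin the behavior down before and after the change.

```python
# Python rendering of the pricing fragment after "Extract Method", plus the
# guarding assertions. VAT_PERCENTAGE = 21.0 is an assumed value.

VAT_PERCENTAGE = 21.0

def calculate_total_price(price, discount_factor):
    # The extracted method: one place to maintain the calculation.
    return (price * discount_factor) * (1.0 + VAT_PERCENTAGE / 100.0)

def total_for(price, rating):
    factor = 0.95 if rating > 25 else 0.98
    return calculate_total_price(price, factor)

# The unit tests: the same inputs must yield the same totals as the original
# inline (duplicated) calculation did.
assert total_for(100.0, rating=30) == (100.0 * 0.95) * (1.0 + VAT_PERCENTAGE / 100.0)
assert total_for(100.0, rating=10) == (100.0 * 0.98) * (1.0 + VAT_PERCENTAGE / 100.0)
```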

With this one refactoring we’ve improved the structure slightly, but some other “bad smells” remain. In the code fragment above you can see that the “Send” method is called from both branches of the condition. This too is a well-known bad smell, for which the refactoring “Consolidate Duplicate Conditional Fragments” exists. After applying that refactoring we get the following code fragment:


if (customer.rating > 25)
    total = CalculateTotalPrice(price, 0.95);
else
    total = CalculateTotalPrice(price, 0.98);

Send();


I hope you see that the code resulting from the refactorings leads to better maintainability: fewer side effects if changes are made. Note that there are more refactorings applicable to the code fragment above, but for this explanation I expect this suffices.


How to apply refactoring to BPM suite implementations?

Now the question is how we can apply refactoring to the models in our BPM suite. I’ll try to demonstrate how the same refactorings can be applied to a micro-flow in a BPM suite. To understand the model, I’ll first describe the visual elements used (originating from Aquima’s open source BPM suite).

A flow is a graph-like structure that can consist of forms, services, sub-flows and conditional nodes. You can use flows to create event-driven applications. This means the occurrence of an event can determine the path the application is taking within the flow. The path that is taken, in its turn, determines which events may occur. (source: Aquima)

A flow can be used as a starter flow that kicks off when the user has to perform a task in the process flow, or as a subflow in another flow, for reuse and to improve maintainability.

A service is used in two situations: to connect your application with external applications or data sources, and to transform your application data, for instance in a complex calculation. Several service types come out of the box, but they can also be custom made using Java or .NET. (source: Aquima)

Forms are used to interact with end-users and contain logical blocks of questions, information, etc.; these pages are placed in flows. They publish events that can be caught in a process flow.

Now that we know what a micro-flow is made of, we can look at a micro-flow found in an implementation, shown below. The subsidy request form shows all kinds of information needed for a subsidy request to be made. One of the options clients can choose from is to look for a subsidy advisor. Advisors typically work for an organization, and this organization may or may not have been chosen in a previous process step.

Micro-flow showing a form, conditions, services and subflows

One good practice is to model a flow showing sequential steps either top-to-bottom or left-to-right, but not mixed. Doing so will already reveal issues more easily...

Micro-flow remodeled top-down without changing any connection

A first “bad smell” in the model is the duplicated check for the subsidy type on both sides of the “is organization known” condition. The refactoring “Consolidate Duplicate Conditional Fragments” is used to remove the duplication; applying it results in the following flow.

Micro-flow after “Consolidate Duplicate Conditional Fragments” refactoring

The service set_Control_Search, however, configures the behavior of the subflow by setting a control parameter. To keep the subflow reusable, this control remains outside the subflow.

Micro-flow after changing the order of activities

Three activities in the flow set an element of the search condition. This search condition is further detailed and used in the subflow “addAdvisorToSubsidyRequest” (not shown here) to make a selection of all available advisors. Apparently this micro-flow grew over time, and some elements that actually belong closer to the addAdvisorToSubsidyRequest subflow were placed outside of it. The intent of the flow becomes less clear, because one can only grasp it by analyzing the services. We use the “Extract Method” refactoring to move the services into the already existing addAdvisorToSubsidyRequest subflow. The remaining flow is shown below:

Micro-flow after “Extract Method” refactoring

I hope you agree with me that the readability of the flow has improved enormously. Using a complementary test-driven approach makes it possible to test every single step in the process and validate that the model still produces the right results. The structure has improved, leading to better readability and better reusability of assets. Other analysts can more easily grasp the intent of the flow, which leads to better understanding, quicker bug-fixing and better maintainability.


Some difficulties with refactoring micro-flows

Although several of the refactorings presented by Martin Fowler can easily be translated to their BPM suite model counterparts, there are also some difficulties with refactoring micro-flows, depending on the possibilities the specific BPM suite offers:
  • As it isn’t always possible to parametrize a micro-flow or flow element, it becomes harder to reuse the flow or element, and some of the refactorings proposed by Martin Fowler become impossible to apply.
  • It may or may not be easy to copy flow elements between different micro-flows, making it easier or harder to move functionality from one place to another, which is necessary to apply certain refactorings swiftly. Refactoring flows can be a quite tedious job if there is no good support for moving items around.
  • Creating unit tests may lead to some challenges, depending on the BPM suite used. If the suite doesn’t support automated testing, it may require you to use a test robot to set certain values in a form and press buttons to kick off a micro-flow.

and finally...

I hope you will come to appreciate Martin Fowler’s work on refactoring as I do. In the description above I’ve shown you some refactorings in action for the micro-flow model in a BPM suite implementation. Different types of models in BPM suites benefit from different refactorings. The question that keeps me busy these days is which specific BPM suite related refactorings can help us structure our work further... I am open to suggestions:-)
http://sourcemaking.com/refactoring

Wednesday, December 7, 2011

Live by the principles of the agile manifesto and raise the bar by using your craftsmanship!

The agile manifesto states: “We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.”

When applying agility in a project and starting with the items on the left of this scale, the question is how little process and tooling, documentation, contract negotiation and planning are needed to successfully deliver. Project approach influencers like the technical complexity, the number of stakeholders involved or the functional complexity of the project may force teams to use some level of process, contract negotiation, documentation or planning. Different agile methods focus on different influencers, or try to avoid risks differently.
Several agile methods exist… Hey! Isn’t this a contradictio in terminis? The agile manifesto states “Individuals and interactions over processes and tools”, yet apparently some amount of process is required… And if one looks at the several agile methods that have seen the light, then, as with any process, each tries to solve a specific problem or is good at handling a specific risk: the Scrum method, for instance, focuses on project management and requirements management, while XP focuses on continuous integration and coding practices. As with most things in the world, once successful they have the tendency to grow, unfortunately sometimes in the wrong directions… Some methods, however, stay on their home ground, concentrate on a specific risk, and can therefore easily be combined to complement each other. An often used combination, for instance, is Scrum for project management and requirements management, and XP for the more technical aspects, using pair programming, test-driven development, refactoring and continuous integration. I personally favour the combination of Scrum and XP, augmented with Smart Use Cases for requirements definition and estimation.

The strange thing is: while there is quite some process involved, we still call these approaches agile. How come? The answer to this question can be found in the principles underlying the agile approaches:
  • Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  • Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
  • Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  • Business people and developers must work together daily throughout the project.
  • Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  • The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  • Working software is the primary measure of progress.
  • Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  • Continuous attention to technical excellence and good design enhances agility.
  • Simplicity--the art of maximizing the amount of work not done--is essential.
  • The best architectures, requirements, and designs emerge from self-organizing teams.
  • At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
Following these principles is what makes your team agile; it is not in applying a specific method or technique. Writing Scrum storyboards or creating XP unit tests doesn’t make your approach agile, and while some techniques don’t work well with an agile approach, many can be adjusted to do so: writing a full functional design won’t contribute to the agility of your approach, but using functional decomposition to break down functionality into more manageable blocks may. Likewise, detailing a 12-page use case to describe functionality will not contribute, but describing functionality in the form of a dialog between the user and the system makes a good format to investigate the user’s interaction with the system.
So my advice: live by the principles of the agile manifesto, but raise the bar by applying your craftsmanship!
http://manifesto.softwarecraftsmanship.org/

Thursday, December 1, 2011

Change your requirements habits: get Lean using your BPMS

Many BPM suite implementation projects still use a traditional requirements approach, based either on functional designs or on use cases. I’ve found that although these approaches lead to results, they don’t exploit the full potential of current BPM suites, making design, development and maintenance harder than necessary and ultimately not fulfilling the promise of agility in business process management...

Traditionally trained requirements personnel are likely to put much effort into creating a paper-based specification of the software. This specification comprises documents describing use cases, process models, non-functional specifications and so on. Input documentation, like stakeholder requirements and descriptions of business rules and regulations, is referred to using text-based links. Specifications are often verified and signed off by business personnel to become “the truth” for all developers, testers and the implementation team. The value of these specifications is likely to deteriorate over the course of the project. The reason for this is that maintaining the elaborate set of documentation takes a big effort and adds only little value, since team members gain more and more knowledge about the domain themselves. Bug fixes and change requests augment the documentation set, and more information is described in more places. Over the course of a project, the actual database model, the application flow, end-user documentation, web forms, generated reports and source code become the truth: that is what everyone can depend on to reflect the actual status. The paper documentation, although often containing information not present elsewhere, deteriorates.

Modern BPM suites offer very interesting options to make the requirements process more effective and its results more durable, taking away some of the problems described above. This is based on the fact that the suites allow business people and IT personnel to share their workspaces and work together on developing software from their own perspectives. The actual software is generated from the models rather than being coded by IT personnel, and documentation is generated from the suite, so it always gives an up-to-date view of the software.


Decision table modeled in the suite.

Let me give some examples of how the requirements process can be improved using some great options of modern BPM suites:

  • Store business requirements and regulation documents in the suite. Create links from the relevant text fragments to the artifacts where they are implemented. Having traceability from the actual documents used by the business gives good support for impact analysis and helps to audit compliance.
  • Maintain specifications like use cases in the suite, as close to the implementation as possible. Don’t create specifications if you can create a self-explanatory implementation artifact directly: rather than specifying a decision table in a paper document, create the table in the suite directly. The specification, when linked to implementation artifacts, glues functionality together and forms a means for controlling the project.
  • Create mock-ups of forms using the functionality of the suite; they form a great means for requirements elicitation. End-users especially can contribute effectively when they see what they are actually going to get. For developers these mock-ups form a good starting point, as they are already in the right place and don’t have to be translated from a paper representation. In many cases the forms are linked to a use case and to the process flow, and refer to the domain classes.
  • Develop the process models and task flows directly in the suite, to be able to walk through the created process, debugging and demonstrating the process flow to end-users. This can be done by manually deciding where the flow continues and by responding to the mock-up forms created and attached to the right steps in the process.
  • Maintain business rules directly in the system using decision tables, decision trees and expressions. Allow business analysts to change these rules themselves and give them the opportunity to take IT support matters into their own hands as much as possible.
  • Maintain domain entities in the suite itself. Define and maintain relationships between the domain entities, define attributes, specify default values or value lists, and let the generators take care of database-specific implementations.
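To make the decision-table point concrete, here is a hypothetical sketch of a business rule kept as data rather than code: the kind of table a business analyst could edit directly in a suite. The subsidy fee domain, the rows and the first-match semantics are all invented for illustration.

```python
# Hypothetical decision table for a subsidy advice fee. Changing the business
# rule means editing rows, not rewriting code. The first matching row wins.

DECISION_TABLE = [
    # (subsidy_type, max_amount, fee_percentage); None means "no upper bound"
    ("innovation", 10_000, 5.0),
    ("innovation", None,   3.5),
    ("education",  None,   2.0),
]
DEFAULT_FEE = 1.0  # the "otherwise" row

def fee_percentage(subsidy_type, amount):
    for row_type, max_amount, fee in DECISION_TABLE:
        if row_type != subsidy_type:
            continue
        if max_amount is not None and amount > max_amount:
            continue
        return fee
    return DEFAULT_FEE

print(fee_percentage("innovation", 5_000))   # → 5.0
print(fee_percentage("innovation", 50_000))  # → 3.5
print(fee_percentage("housing", 1_000))      # → 1.0
```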

These are some ideas as to how you can model in a BPM suite directly and prevent duplicate documentation. As I said at the beginning of this item, and as I hope you have come to appreciate, a traditional approach will work, but using the possibilities of the BPM suite will make your work more fun, produce better results in a shorter time, and ultimately lead to a more adaptable solution, happier customers and happier end-users:-)

Thursday, November 10, 2011

You want to be successful in BPMS? Teach yourself some good modeling practices!


Currently I am developing software using a BPM suite. I've found that several practices used in traditional software development remain valuable when using modern BPM suites. Although these suites promise to give business people a tool to support their own business processes, reality shows that business people find it hard to cope with the growing complexity of the solutions they create. The reason for the complexity to grow is obvious: the real-world situations the BPM software supports are complex, and since the models created in the BPM software are executable, they are likely to become complex too. Although most suites offer support to assess the impact of a change, structure is needed for this to really work. To cope with growing complexity, one has to create overviews and keep a clear structure. Neglecting structure will result in erroneous process support and in badly maintainable assets, ultimately leaving business people with badly supported processes.

Traceability options of a BPM suite showing a navigable diagram with upward and downward traces

When we focus on the created software elements, we see that in the end the complexity of delivering a project is not in understanding the tool or programming language itself, but in understanding the structure of what makes up the application: the class structure, the libraries used, the services structure and so on. Delivering a BPM software solution is no different. The artifacts are a bit different, though; we now have process flows, user interfaces, task flows, domain classes, etc. It is quite easy to make a mess of these artifacts. To name some good ways to do just that:

  • Bad naming of any object makes it hard to be found or reused.
  • Unreadable process flows, due to large flows or flows mixing different levels of detail, make it hard to get an overview of what's going on.
  • Domain classes that don't resemble objects in the real world are likely to change often whenever a user makes a new request for functionality.
  • Micro flows that don't show the intent are hard to maintain.


In the past I have used quite a few tools and programming languages to develop software. What always kept me "alive" was following good software engineering practices: keeping focus on a ubiquitous language, maintaining high cohesion of assets and low coupling between assets, continuously refactoring, domain-driven design and so on.

I've found that similar practices are needed to successfully deliver software with a BPM suite. In line with good old programming, “good modeling practices” are required to maintain a clear structure of the software assets and obtain a maintainable solution that can keep its business value for years to come.

IT personnel are trained in maintaining structure; it is what they like, it is in their genes. For business people, structure is not their first focus, and it shouldn't be, as they should focus on keeping the business running smoothly. So I suppose business people and IT personnel will have to walk hand in hand for a little while longer...