Neuma White Paper:

CM: THE NEXT GENERATION of ALM Transparency

When it comes to IT governance, a key issue is transparency of process and data, all the way up the chain.

Senior executives must be able to say: "We understand the process, and the data confirms that we're following it."  The problem in the CM world has been that the tools have been oriented to the design team, not to the executives.  Why is this, and how can it change?

The why is simple.  When the first movies came out, there was a theme: the world of entertainment - movies about Broadway, about movies, about stars.  The entertainment industry focused on what it knew best - the entertainment industry.  This persisted for many years, although it was complemented by many movies that were "outside the box".  Well, it's the same with SCM tools.  Our first focus was on managing software and on helping developers.  It was not on the business side of things.

In the emerging 3rd generation of CM tools, we see a definite shift.  Perhaps this is brought on by the understanding that to sell these tools, upper management must buy in.  Perhaps it's just maturity.  Perhaps it's because the scope of CM just continues to widen - CM is, after all, a backbone technology.  In reality, it's all of these, and in recent years the IT governance issues of the business world have shifted the focus in each of these areas.  Executives want to give more direction in tool selection; vendors realize that the business world is not disjoint from development; and CM has clearly grown in scope to ALM, and in third and fourth generation products will cover a much wider portion of the business model.

The Basics
So what does it take to provide transparency of process and data up the chain of command?  It takes a good combination of vision and technology.  From a CM perspective, the processes must be integrated with the CM function without imposing undue overhead on the development team.  The best way to accomplish this is to provide processes and tools which actually improve productivity while improving process.  And this has to be accomplished while providing the flexibility to model the required processes precisely.

The problems arise when we attempt to integrate tools together.  Second generation CM tools performed tool integration with plenty of glue.  The problem with glue is that it is inflexible, not to mention the time it takes to glue things together in the first place.  A secondary and related problem is the lack of a single repository to hold all of the CM data.  Even when clean data-sourcing front ends can be applied to all of the underlying repositories, there is still a multiplication of effort and technology to address things such as multiple site operation, consistency of backups, security and access permissions, and integrated process work flow.  Most integrated toolsets do not attempt to deal with these issues across the board.  The CM niche deals with file revisions and often advertises a multiple site solution based on this single slice of the pie.  The result is a tool administration nightmare.

Third generation tools address these problems by ensuring:

  • <Data> A common repository across the ALM spectrum and beyond
  • <Process> A common process workflow engine
  • <User Interface> A common user interface across the ALM spectrum

These are augmented with some level of flexibility that permits processes to be adjusted to meet requirements.  A couple of Canadian vendors (MKS and Neuma) have extensive capabilities in this area, allowing not only the extension of existing process models, but also an expansion of the application set to address areas of the process that are more specific to a vertical market.  So additional apps may be added to the toolset to address lab hardware management, customer and site tracking, and business case management.  In order to minimize the amount of tool integration and glue, toolset expansion is a very desirable feature.  A common Data, Process and User Interface architecture allows growth without incurring additional learning curve or administration, in most cases, apart from understanding the business processes.
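
To make this concrete, here is a rough sketch, in Python, of what toolset expansion on a common repository might look like.  All names here are invented for illustration - this is not any particular product's API.  The point is that a vertical-market application becomes just another record type, so the common process engine and user interface apply to it with no new glue:

    class Schema:
        """A common repository schema shared by every application in the toolset."""
        def __init__(self):
            self.record_types = {}

        def add_record_type(self, name, fields, states):
            # Every record type gets the same treatment: typed fields plus a
            # state machine, so the common process, UI and reporting machinery
            # apply to it automatically.
            self.record_types[name] = {"fields": fields, "states": states}

    schema = Schema()

    # Core ALM record types...
    schema.add_record_type("problem", ["title", "product", "severity"],
                           ["open", "fixed", "verified", "closed"])

    # ...and a vertical-market extension, added without additional glue:
    schema.add_record_type("lab_machine", ["hostname", "location", "owner"],
                           ["available", "reserved", "in_repair", "retired"])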

Third generation systems will be well-positioned to address executive use of the ALM tools.  But addressing the above problems is not a sufficient condition.

Process Model
Once the basic components are in place, we need to address the process thoroughly.  There are a number of process tools out there, but they really must be well integrated into the CM/ALM toolset to provide adequate control.  Some components that need to be managed include:

  • Object State Flow
  • Inter-object Work Flow
  • Access Permissions

Essentially, it must be possible to address a set of requirements that identify how data moves through the system, how it interacts with other parts of the system, and who has permissions to trigger/execute various parts of the process and change the related data.

Object State Flow is a key component of a process model that can be easily expanded.  The central data object is identified, and then the states of that object and the transitions between those states.  This determines how a customer request is handled, or how a build is tracked through to production and delivery.
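
As a minimal sketch of the idea, in Python, with a hypothetical change-request record - the states, actions and roles here are illustrative, not any specific product's model:

    TRANSITIONS = {
        # state       -> {action: (next state, role allowed to trigger it)}
        "submitted": {"accept":  ("analyzed", "product_manager"),
                      "reject":  ("closed",   "product_manager")},
        "analyzed":  {"assign":  ("in_work",  "team_lead")},
        "in_work":   {"resolve": ("resolved", "developer")},
        "resolved":  {"verify":  ("closed",   "verifier"),
                      "reopen":  ("in_work",  "verifier")},
    }

    def advance(record, action, role):
        """Apply a state transition, enforcing both the flow and the role."""
        allowed = TRANSITIONS.get(record["state"], {})
        if action not in allowed:
            raise ValueError(f"{action!r} not valid in state {record['state']!r}")
        next_state, required_role = allowed[action]
        if role != required_role:
            raise PermissionError(f"{action!r} requires role {required_role!r}")
        record["state"] = next_state
        return record

    request = {"id": "CR-1042", "state": "submitted"}
    advance(request, "accept", "product_manager")   # -> state "analyzed"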

Inter-object work flow deals with the rules and triggers that act on related objects based on the state of one object.  This is usually managed using specifically designed rules and triggers against the transitions of an object state flow.  It is also dealt with through user interface actions which trigger operations across multiple objects.  Implicit in inter-object work flow is the data that provides full traceability between objects.  It is not sufficient to have actions interwork.  Nor is it sufficient to provide an audit trail of what actions were performed.  The resulting data links which provide traceability are crucial to the reporting function, and this will be the key to executive buy-in to using the tool.
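
A rough sketch of the mechanism, again in Python with invented names: a trigger fires on one object's transition and acts on related objects, while the links themselves are kept as data for later traceability reporting:

    links = []   # (from_id, relation, to_id) - the traceability data itself

    def link(from_id, relation, to_id):
        links.append((from_id, relation, to_id))

    def related(obj_id, relation):
        return [t for (f, r, t) in links if f == obj_id and r == relation]

    def on_transition(records, obj_id, new_state):
        records[obj_id]["state"] = new_state
        # Trigger: when every change implementing a problem is integrated,
        # the problem itself is promoted to "fixed".
        if records[obj_id]["type"] == "change" and new_state == "integrated":
            for problem_id in related(obj_id, "implements"):
                changes = related(problem_id, "implemented_by")
                if all(records[c]["state"] == "integrated" for c in changes):
                    records[problem_id]["state"] = "fixed"

    records = {
        "PR-7":  {"type": "problem", "state": "open"},
        "CH-21": {"type": "change",  "state": "in_work"},
    }
    link("CH-21", "implements", "PR-7")
    link("PR-7", "implemented_by", "CH-21")
    on_transition(records, "CH-21", "integrated")   # PR-7 becomes "fixed"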

Access Permissions are handled in many ways.  The simplest and most common way is through providing role-based user interfaces.  The users can only do what the tool allows them to do.  Even if the user interfaces aren't foolproof (e.g. a Command Line Interface can be used to circumvent the restrictions), they provide a key component of process work flow by guiding users of each role through their tasks.  To tighten access further, a lower layer of data access permissions must provide assurance that only those specifically authorized have the appropriate access to the data.  This includes capabilities to meet ITAR regulations, which require data to be physically and/or logically separated into access partitions so that only those authorized to access the data can see and/or change it.  CM vendors are stepping up to meet ITAR requirements, if for no other reason than that they want to sell to the U.S. government.
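
A simplified sketch of the layering, with roles and partition names invented for illustration: a role-based check on the action, backed by a lower data-partition check of the kind ITAR-style separation requires:

    ROLE_ACTIONS = {
        "developer": {"read", "edit"},
        "verifier":  {"read", "verify"},
        "auditor":   {"read"},
    }

    def can_access(user, action, record):
        # Layer 1: the user's role must permit the action at all.
        if action not in ROLE_ACTIONS.get(user["role"], set()):
            return False
        # Layer 2: the record's partition must be one the user is cleared
        # for, so unauthorized users cannot even see the data.
        return record["partition"] in user["partitions"]

    alice = {"role": "developer", "partitions": {"commercial"}}
    record = {"id": "REQ-88", "partition": "itar_controlled"}
    assert not can_access(alice, "read", record)   # the partition blocks access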

When these capabilities are present, it's possible to tune the solution to the corporate requirements.  This gives executives control because they can mandate, in a more cost-effective and timely manner, what their own requirements are.

Executive Buy-in: Reporting

All of these capabilities are great for control and for use by the development team, but they will only go so far toward giving senior management useful tools of their own.  They need the appropriate level of reporting to go with them.  The reporting is not only for senior management, but for the whole team, so that it can function more productively and coherently.  Reporting must be able to cover these areas:

  • Process - Process reporting must answer the questions:  What is the process? What "to-do lists" do I have?
  • Data - From dashboards, to drill-down summaries, to overview and detailed levels, data must be presented in a useful manner.
  • Tools - Do I know what tools are being used in my business and is it possible to track the revisions of these?
  • Metrics - Metrics identify key measures of how well the process is working and how well the team is performing.

From a process perspective, reporting must clearly indicate how things are working.  It's not sufficient any more to have printed documentation (typically out-of-date) showing the process.  Instead, at a minimum, live state flow diagrams should be available and should reflect the actual data driving the process.  These need to be augmented with "to-do lists" provided for each role.  The executive wants to know that the process is well oiled.  He wants to be able to observe the process from an overview perspective, and make sure that everyone has their data-driven marching orders.  Form reviews, for data entry, will likely be delegated down to the project or process office.  The executive should be able to assume proper forms exist based on the data and capabilities he has through the reporting function.
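
As a small sketch of what "live" means here: the diagram is generated from the same transition table that drives the process, so it can never drift out of date.  The Python example below (illustrative data only) emits Graphviz DOT text:

    transitions = [          # the same table the process engine runs on
        ("submitted", "accept",  "analyzed"),
        ("analyzed",  "assign",  "in_work"),
        ("in_work",   "resolve", "resolved"),
        ("resolved",  "verify",  "closed"),
        ("resolved",  "reopen",  "in_work"),
    ]

    def to_dot(transitions):
        lines = ["digraph state_flow {"]
        for src, action, dst in transitions:
            lines.append(f'  "{src}" -> "{dst}" [label="{action}"];')
        lines.append("}")
        return "\n".join(lines)

    print(to_dot(transitions))   # render with: dot -Tpng flow.dot -o flow.png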

"To-Do Lists", sometimes referred to as "Inboxes", need to be visible and easily specified on a per-role, and sometimes on a per-user, basis.  The CM/ALM function should drive the team based on the data within it.  When a piece of data is in a given state, it requires a certain role to advance it through to the next state.  That means it needs to be on somebody's to-do list.  The user interface must make it easy to identify what is on a user's to-do list so that the user is aware of his or her priorities without the need for manual intervention.

From a data perspective, the executive is largely driven by a combination of charts and risk reports.  It must be possible to provide these across the CM/ALM function.  Charts may cover things such as problem/defect arrival and fix rates, customer-specific response rates, project schedules, product and development status, requirements coverage, and verification results.  At one level, a basic set of functionality must be provided out of the box.  But at another level, the executive must be able to ask for specific reports and have them readily available.  Better yet, the tools must support easily specified, interactive queries for facilitating meetings.  If reports don't have to be cut and pasted into a presentation, all the better.  If what-if scenarios can be easily dealt with, better still.

Data reporting includes the ability to design custom dashboards for each management role.  In some cases this will take a few minutes, in others a few days.  When it goes beyond that, it tends to be less useful.  Dashboards need to be in drill-down format so that the top-level status information can be expanded into summaries and drilled-down to the details in specific cases - such as for emergency problems or for high-risk items.  
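
A rough sketch of the drill-down idea, with field names and records invented for illustration: the top level is an aggregate, and any cell of it can be expanded back into the underlying records:

    from collections import Counter

    problems = [
        {"id": "PR-1", "product": "A", "severity": "critical", "state": "open"},
        {"id": "PR-2", "product": "A", "severity": "minor",    "state": "open"},
        {"id": "PR-3", "product": "B", "severity": "critical", "state": "fixed"},
    ]

    def summary(records, field):
        """A top-level dashboard cell: counts grouped by one field."""
        return Counter(r[field] for r in records)

    def drill_down(records, **criteria):
        """Expand one cell of the summary into its detailed records."""
        return [r for r in records
                if all(r.get(k) == v for k, v in criteria.items())]

    print(summary(problems, "severity"))              # the overview
    print(drill_down(problems, severity="critical",
                     state="open"))                   # the emergencies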

Summary charts should be interactive so that any line of a graph or bar of a Gantt chart can be zoomed into for a greater level of detail.  In particular, the ability to traverse traceability links must be present.  One-record-at-a-time traceability is better than nothing, but the ability to instantly traverse a whole set of records is much better suited to the manager.  Some managers want to know which requirements have failed in a test run.  Others want to know what features and problems are being delivered in a new release.  Others want to know what resources are required to complete a proposed set of functionality.  These all involve a level of traceability.  Advanced database capabilities are a prerequisite to such capabilities.
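
For instance, "which requirements failed in this test run" is a set-at-a-time traversal across the traceability links, not a record-by-record walk.  A sketch, with an invented link table:

    verifies = {          # test case -> the requirements it verifies
        "TC-1": ["REQ-10", "REQ-11"],
        "TC-2": ["REQ-11"],
        "TC-3": ["REQ-12"],
    }

    test_run = {"TC-1": "pass", "TC-2": "fail", "TC-3": "fail"}

    def failed_requirements(run, link_table):
        """Traverse the whole failing set across the traceability links."""
        failed = set()
        for test, result in run.items():
            if result == "fail":
                failed.update(link_table.get(test, []))
        return sorted(failed)

    print(failed_requirements(test_run, verifies))   # ['REQ-11', 'REQ-12']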

From a tools perspective, it's important to be able to identify which tools were in use at which time.  Did a new accounting system cause the sudden change in corporate performance, or was it a real change?  Are the CM tools adequate to ensure that what we say we're delivering is what we're delivering?  Are changes to the toolset properly verified and authorized?  This is a level of functionality that bears directly on Sarbanes-Oxley requirements.  And the tools tracked should not be restricted to the traditional set of CM tools.  It's not necessary to store the tools in the repository, though this would be nice.  But it is necessary to clearly identify exactly which revisions of which tools were used at which times.  And this includes in-house tools as well.
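
A minimal sketch of what such tracking buys, with invented build records: each build carries the exact tool revisions that produced it, in-house tools included, so a change in results can be checked against a change in the toolset:

    builds = [
        {"build": "R4.1_B12", "date": "2007-03-01",
         "tools": {"gcc": "4.1.2", "make": "3.81", "gen_reports": "1.4"}},
        {"build": "R4.1_B13", "date": "2007-03-08",
         "tools": {"gcc": "4.1.2", "make": "3.81", "gen_reports": "1.5"}},
    ]

    def tool_changes(older, newer):
        """Which tool revisions changed between two builds?"""
        return {t: (older["tools"].get(t), rev)
                for t, rev in newer["tools"].items()
                if older["tools"].get(t) != rev}

    print(tool_changes(builds[0], builds[1]))
    # {'gen_reports': ('1.4', '1.5')} - the in-house report generator changed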

Finally, metrics are crucial to understanding how well your process is working and to discovering bottlenecks or other areas of potential improvement.  Watching the march of a metric over time is an interesting activity in itself.  Well-defined metrics will respond to process tuning so that the effect of the tuning is readily visible.  Patterns over time allow us to better predict the future: for example, what do verification failure rates tell us about release readiness?  Even more telling may be the rate of customer request arrivals.
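
One concrete example, with sample numbers invented for illustration: the weekly verification failure rate, whose trend is what actually answers the release-readiness question:

    weekly_runs = [          # (week, tests run, tests failed)
        ("W10", 200, 40),
        ("W11", 220, 28),
        ("W12", 240, 15),
        ("W13", 250,  9),
    ]

    def failure_rates(runs):
        return [(week, failed / run_count) for week, run_count, failed in runs]

    for week, rate in failure_rates(weekly_runs):
        print(f"{week}: {rate:.1%}")
    # A steadily falling rate suggests the release is converging; a flat or
    # rising rate is an early warning that the readiness date is at risk.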

The CM Way

The transparency required by the business world will be attained.  The question is: at what cost, and with what accuracy?  The CM/ALM world is familiar with these sorts of requirements and, as a backbone technology, is well-positioned to address them.  It will mean expanding scope once again (as is already happening).  But it will also open the way to significant growth in the industry if we successfully take on the challenge.


Joe Farah is the President and CEO of Neuma Technology.  Prior to co-founding Neuma in 1990 and directing the development of CM+, Joe was Director of Software Architecture and Technology at Mitel, and in the 1970s a Development Manager at Nortel (Bell-Northern Research), where he developed the Program Library System (PLS), still heavily in use by Nortel's largest projects.  A software developer since the late 1960s, Joe holds a B.A.Sc. degree in Engineering Science from the University of Toronto.  You can contact Joe by email at farah@neuma.com.