Neuma White Paper:
CM: THE NEXT GENERATION - Where Are We ... Going?
CM is picking up the pace. After a long run as a Version Control solution, CM is now commonly viewed as a Change Management solution. The difference is not primarily one of functionality; it's the difference between a designer utility and a product backbone. As such, where Change Management has been embraced, CM is now center stage. And that means it will be a catalyst for full ALM solutions and, eventually, enterprise business management. What key advances have happened recently, and what further advances are needed to embrace a wider solution space?
A transition is happening from one extreme end of the CM solution space to the other. It may take a while to complete. With some products you may need to cut the definitions a bit of slack (e.g. what is "end-to-end"), but with others, it's the market demand that needs to catch up to the offering. Here are the two extremes:
A: Complex, Rigid, Big-IT "solutions"
- Lots of computing resources, with a wealth of available trained admin staff
- A fixed CM/ALM process that does what it says it will do
- High cost product, incrementally increasing as components/functions are added
- Well documented procedures and recommendations for creating consistent backups
- Partitioning, synchronization and constant monitoring of Multiple Site solutions
- A complete set of 3rd party tools glued together in-house to provide end-to-end functionality
- Comprehensive labelling and merging tools to support the worst-case scenarios
- A strong RDBMS capability underlying each tool to ensure extensive reporting capability
- Extensive and comprehensive courses to ensure expert use of all component technologies
B: Lightweight, Flexible Solutions
- Small footprint, scalable, zero administration
- Configurable, continually improving CM/ALM process
- Competitively priced, full-functionality, end-to-end solution, customer expandable
- Automated backups with on-line disaster recovery modes
- Partition-free synchronous Multiple Site requiring no administration
- A single seamlessly-integrated end-to-end tool out-of-the-box
- No labelling and minimal merging optimized to the typical scenario, with less optimal support for special cases
- Well engineered pre-defined and configurable reporting and interactive browsing based on next generation repository technology
- Easy-to-use, single role-based user interfaces to minimize training requirements
Why do I think this transition is gathering steam? Well, the primary reason is that the industry, that is, the user community, is no longer tolerating the status quo. Even though we have quad-processor machines, gigabytes of memory and high-bandwidth networks, it turns out that the small-footprint solutions are easier to deploy, generally more scalable, and tend to incorporate newer, more advanced technology. The fear of not having enough if it's not a Big-IT solution has turned into a fear of not having enough if it is one! And rightly so, in many cases.
Big-IT means lots of resource administration, complex admin processes and loads of training. When I look at a CM tool, I want to install it, load in a representative chunk of my project and get going with it right away, within hours of first consideration. I don't want to have to buy additional hardware, bring in consultants and go through a bunch of training just to do an evaluation. It's fine to take time to put it through all of its paces and check out its scalability, functionality and reliability - but not so fine if I have to put in an equivalent amount of effort up front, and even less so if I find that, after doing so, it doesn't measure up.
And as more capabilities come on line, Big-IT, glue-integrated tool solutions don't make the grade. Consider Multiple Site operation. Not only do you have to have a CM plan for this, but one for each of the other tools in the solution. Then the synchronization operations must be coordinated. Another question that comes up is "How do you do consistent backups?", especially when shops are operating 24/7 around the globe for your enterprise. What about process improvement? If I have to learn four process tools to implement process across my lifecycle, forget it.
So as "small-IT" solutions gain credibility, more and more organizations are looking at them. If they fail to cover the end-to-end spectrum, they too will fail to make the grade. But there are "small-IT" solutions that are starting to measure up to the full ALM problem, some slowly, others further along. And that's the feedback we see. Sure, we still get the occasional "Technically your solution wins hands down, but you're not big and blue...". And that's fine, because it's the trend that's important. The RFI input we see increasingly includes a need for:
- Low administration
- Easy-to-use change packages
- Out-of-the-box configuration
- Flexibility to customize the processes, schema and user interface
- Easy [ITAR] data segregation
- Better workspace management
The industry is starting to realize that it can ask for its cake and eat it too. And so the transition continues.
Where are we?
So where are we now? Well, most CM tools are still 2G with a few 3G features. But the tools are advancing in technology more quickly, especially the newer tools and those built on a single ALM architecture. Ease-of-use is important today, and this rules out a few of the older solutions. We're in a race: new tools are trying to expand their functionality while older tools are trying to become easier to use and to address a wider ALM solution. As a start, almost everyone is demanding:
- Easy-to-use change packages
- Good workspace management
- High reliability and easy data recovery
- Seamlessly integrated suites
And with easier-to-use CM, it's only natural that we try to do more advanced CM. In our own shop, we like to stay ahead of the curve where possible, and we've started using more and more features over the past few years:
- Warm-standby disaster recovery
- Low-admin Multiple Site operation
- Requirements tracking and customer management
- Fully automated deliverable packaging
- Data segregation
- Product hierarchy/dependency management
Where are we... going?
But the story is just starting to get good. There are a number of innovations coming forward in the industry over the next year or two.
The Virtual File System monopoly, held by a single vendor product (i.e. ClearCase) for the past 15 or so years, will give way to one or two other vendors, with a focus on less resource-intensive operation. What this means is that we'll see more tools offering access to both static and dynamic context views of their configurations, directly from the operating system. If the traditional server bottlenecks for this type of technology are eliminated, along with the associated administration, we could see a wider market beginning to adopt CM functionality (e.g. documentation groups, accountants, etc.).
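To illustrate the idea (a minimal sketch of my own, not any vendor's actual file system API - the repository layout and names below are made up): a static context view resolves each path against a frozen baseline, while a dynamic view resolves against whatever is currently latest on a branch.

    # Hypothetical sketch of static vs. dynamic context views.
    # The repository data and names here are illustrative only.

    repository = {
        "src/main.c": {1: "rev 1 contents", 2: "rev 2 contents", 3: "rev 3 contents"},
    }

    baseline = {"src/main.c": 2}  # a frozen configuration (static view)

    def read_static(path):
        """Always resolve against the baseline; the result never changes."""
        return repository[path][baseline[path]]

    def read_dynamic(path, branch_latest):
        """Resolve against whatever is currently latest on the branch."""
        return repository[path][branch_latest[path]]

    branch_latest = {"src/main.c": 3}
    print(read_static("src/main.c"))                   # rev 2 contents
    print(read_dynamic("src/main.c", branch_latest))   # rev 3 contents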
Multiple Site operation will approach zero-administration operation as automated synchronization and monitoring capabilities enter the picture. Data segregation will still be important, but it will become a logical specification, supported by a simpler operating framework that does not require data-partitioning and synchronization-management skills.
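As a rough illustration of what a logical segregation specification might reduce to - the rule format, labels and site names below are assumptions for illustration, not any product's actual syntax:

    # Hypothetical sketch: data segregation expressed as logical rules rather
    # than physical repository partitions. Field names are illustrative only.

    segregation_rules = [
        {"label": "ITAR",   "allowed_sites": {"US-East", "US-West"}},
        {"label": "PUBLIC", "allowed_sites": {"US-East", "US-West", "EU", "APAC"}},
    ]

    def may_replicate(record, site):
        """Decide whether a record may be synchronized to a given site."""
        for rule in segregation_rules:
            if record["label"] == rule["label"]:
                return site in rule["allowed_sites"]
        return False  # default deny for unlabelled data

    record = {"id": "doc-101", "label": "ITAR"}
    print(may_replicate(record, "EU"))       # False - withheld from the EU replica
    print(may_replicate(record, "US-East"))  # True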
At least
one CM vendor will be introducing a solution which caters to the CM II process
(see ICM). Expect to see others that
cater to the CMMI process. Ideally,
these will be optional and configurable starting points for a solution rather
than rigid configurations.
Customizable dashboards for product, project and verification status will begin to take center stage as the tools become more and more management-oriented. The value of CM tools as a backbone for product management, and indeed for business management, will be exposed, and the rush will be on to include management more easily in the default, and even driving, set of users. Dashboards and central data repositories will make CM tools central communication hubs - so much so that users will prefer pulling information from the CM tool rather than having a myriad of trigger-generated emails sent to them.
Broader and easier customization of the user interface, and of the CM process, will take over from the more rigid fixed-process configurations. There will always be room for tool personnel who can work wonders with the product, but more and more of the basic customization capabilities will be simple menu-driven operations: adding a new data field or a new traceability link, specifying and evolving state-transition diagrams, establishing better security, and so on.
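To make that concrete, here is a minimal sketch of what such declarative customization might boil down to behind the menus - the schema layout, field names and states are illustrative assumptions, not a real product's definition language:

    # Hypothetical sketch of declarative customization: a new data field,
    # a traceability link, and a state-transition diagram, all held as data.

    schema = {
        "Problem": {
            "fields": {"priority": ["low", "medium", "high"]},  # new data field
            "links":  {"implemented_by": "ChangePackage"},      # traceability link
            "states": {                                          # state transitions
                "raised":      ["analyzed"],
                "analyzed":    ["in_progress", "rejected"],
                "in_progress": ["resolved"],
                "resolved":    ["closed"],
            },
        }
    }

    def can_transition(record_type, current, target):
        """Check a proposed state change against the configured diagram."""
        return target in schema[record_type]["states"].get(current, [])

    print(can_transition("Problem", "analyzed", "in_progress"))  # True
    print(can_transition("Problem", "raised", "closed"))         # False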
Better out-of-the-box, end-to-end solutions will appear, with rapid traceability navigation. The days of glue departments will begin to disappear. As well, end-to-end solutions will be seamlessly integrated, through a single user interface running on top of a single repository, as opposed to one per application area. Operations such as backups, multiple site operation, etc. will work across all applications, not just the file-management areas.
Application suites will grow ever wider, embracing things like customer management, time-sheet tracking and project roll-up, project management, and full traceability between Requirements and Test Cases - not only from a coverage perspective, but also from active test-run data against specific builds and configurations.
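Here is a minimal sketch of what that traceability might look like as data - the record names and fields are illustrative assumptions: each requirement links to test cases, and each test run records its result against a specific build, so both coverage and per-build verification status can be derived.

    # Hypothetical sketch: requirement-to-test traceability plus per-build results.
    # Record names and fields are illustrative assumptions.

    requirements = {"REQ-1": ["TC-10", "TC-11"], "REQ-2": []}

    test_runs = [
        {"test": "TC-10", "build": "build-42", "result": "pass"},
        {"test": "TC-11", "build": "build-42", "result": "fail"},
    ]

    def coverage(req):
        """Static coverage: does the requirement have any test cases at all?"""
        return bool(requirements[req])

    def verified_in(req, build):
        """Active verification: did every linked test pass against this build?"""
        results = [r["result"] for r in test_runs
                   if r["test"] in requirements[req] and r["build"] == build]
        return bool(results) and all(res == "pass" for res in results)

    print(coverage("REQ-2"))                  # False - no test cases linked
    print(verified_in("REQ-1", "build-42"))   # False - TC-11 failed on this build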
And CM vendors will begin to offer free on-line or webcast courses to push their solutions through to the short-lists of prospective customers. When you see this happening, the market will have turned a significant corner. Ease-of-use for CM will have matured. CM training and consulting will focus on process, not on tool usage. And the tool vendor will, in most cases, no longer be the main provider of CM training.
When will these things take place? Well, they've already begun, and they will continue this year and into the next. And as each vendor contributes to the advances, the others will be forced to play rapid catch-up or drop out of the race. Market expectations will move from merely tolerating CM tool behaviour to demanding a backbone team-support role. The pace will continue to quicken and a few vendors will emerge from the myriad now out there, as they mature in their 3rd and 4th generation solutions.
Open Source
Will the open source community eventually take over? This is a hard call. Compilers, GUIs, operating systems, etc. have fairly well-defined requirements, and if you can use one OS, you can likely use another. CM is complex, very complex - not just in its nature, but also in its diversity. Go from one site to another and you have a new set of requirements: hundreds of little products, legacy languages, thousands of team members, different IDEs, specific contracts vs. general market, end-user product vs. system component, etc.
The
customer for a CM system varies a lot more than for the other Open Source
products. So while open source will
continue to produce better point tools, process and customization requirements
are sufficiently diverse and complex that Open Source tools will either not be
able to embrace the full set sufficiently, or will do so in too complex a
manner. Look at how difficult it's been
to get a good desktop configuration of Linux for example - one that a
significant market share could embrace.
I'm afraid the task is more difficult for the CM world where the user
community is much more widely dispersed.
But then there's always the wild card - will one of the vendors turn over its assets to the open source community, especially if it finds the going difficult in the upcoming race? Well, not this year anyway. Open source will focus more on knitting together point solutions to provide a basic ALM solution. But the long-term viability of this approach will be uncertain, at best, because of the added complexity that comes from combining such solutions.
Joe Farah is the President and CEO of Neuma Technology. Prior to co-founding Neuma in 1990 and directing the development of CM+, Joe was Director of Software Architecture and Technology at Mitel, and in the 1970s a Development Manager at Nortel (Bell-Northern Research) where he developed the Program Library System (PLS) still heavily in use by Nortel's largest projects. A software developer since the late 1960s, Joe holds a B.A.Sc. degree in Engineering Science from the University of Toronto. You can contact Joe by email at farah@neuma.com