CONSTRUCTIVELY MEASURING VALUE PROGRAMS' EFFECTIVENESS

J. Samuel Martin, PE, CVS

Published and presented in the copyrighted 1998 SAVE International Conference Proceedings
(the full proceedings are available for purchase from SAVE International)

 

ABSTRACT

Value programs must be periodically reviewed and analyzed to determine whether they provide the intended benefits. Practical, quantifiable measurements are needed to maintain and improve any value program. Many standard methodology practice guides do not demonstrate whether their envisioned objectives are achieved; they only show that activities were performed. For the private-industry executive, this does not demonstrate to management that a program has produced a good return. For the Federal manager, useful measurement of program effectiveness is the basis for demonstrating to oversight groups that the requirements of Public Law (PL) 104-106 have been met. This law requires that agencies "establish and maintain an effective" program and demonstrate that they are meeting the regulations relating to that goal. Some real-life examples are given.

INTRODUCTION

Value program managers should use, and if necessary develop, systematic approaches to measuring a program's effectiveness. Verifying results that meet the organization's objectives, and demonstrating plans to improve program results, are the crux of building support within the organization and among its executives.

Appropriately selected and monitored indicators of program effectiveness provide:

  • Demonstrable evidence to value program staff and the overall organization that the Value Method (Value Engineering) saves money, improves customer satisfaction, and provides better solutions,
  • Methods to quickly pinpoint problem areas,
  • Evidence of possible causes, so problems can be addressed more effectively,
  • A means for less experienced value program staff to examine and interpret their local program's effectiveness and recommend ways to resolve possible problems.

DATA NEEDS

General Data Needs. To be useful, analysis data needs to be easily collectable and verifiable. The basic data for operating and selling a program are:

  • The initial baseline used entering the value study, including cost and the assumptions and criteria that made up the initial presumed value-base,
  • At least one written report, preferably two (presentation and final) that defines the value-base for the initial baseline and value study results,
  • The number of value study proposals and their projected value, including a comparison to the baseline in a benefit, disadvantage, risk and monetary form,
  • The number of ideas generated during the value study activities and a short description and discussion of their disposition through the value study process,
  • Feedback of the number of proposals that were accepted (along with the actual value attained by its acceptance), rejected (along with a short discussion of why it was rejected), and not utilized due to the acceptance of another proposal (along with a notation of which proposal was accepted),
  • Feedback of the eventual enacted (implemented) cost of the value studied project.
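For programs that track this data systematically, the six general data items above could be captured in a record along the following lines. This is only an illustrative sketch; every field and method name is invented for the example and is not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ValueStudyRecord:
    """Illustrative container for the six general data items of one value study."""
    baseline_cost: float                    # initial baseline cost entering the study
    baseline_assumptions: list              # assumptions/criteria forming the presumed value-base
    reports: list                           # written reports (e.g., presentation and final)
    proposals: list                         # each entry: benefits, disadvantages, risk, monetary value
    ideas_generated: int                    # count of ideas produced during the study
    dispositions: dict = field(default_factory=dict)  # proposal id -> accepted/rejected/superseded
    implemented_cost: Optional[float] = None          # eventual enacted cost (feedback item)

    def acceptance_rate(self) -> float:
        """Fraction of proposals accepted, computed from the feedback data item."""
        if not self.dispositions:
            return 0.0
        accepted = sum(1 for d in self.dispositions.values() if d == "accepted")
        return accepted / len(self.dispositions)
```

A record of this kind makes the program-level ratios discussed later (acceptance rate, savings realized versus proposed, cost feedback) straightforward to compute.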

Program Specific. Additional data helps sell a program, and must be obtained if the program has mandated features (such as a Federal program meeting PL 104-106):

  • The number of projects meeting the threshold target (mandated) requirement and the number of those that were value studied or obtained a waiver (a management decision to not perform a value study),
  • A method plan for performing value studies on the projects involved (in-house cadre, contractor, etc.),
  • An estimate of the number of managers and executives involved in projects affected by a mandated requirement, and the number of those that have obtained at least minimal Value Method training covering the benefits of the program for themselves, the organization, and their staff, as well as their responsibilities within the organization's value program.

MANAGERIAL USE OF THE DATA

Managers and executives must have at least two kinds of data. First, they need data showing that the value study results are suitable for use in making decisions. Second, they need support to sell implementation of those decisions. The first three points described in the general data needs serve both purposes.

For a manager to consider any value study valid, the three points (baseline, written report, and comparative proposals) must be present. These give the manager a simple, understandable basis to examine the value study and accept it for use.

This is usually achieved if a cost model, function analysis, FAST diagram, and discussion of the criteria involved are present in a written report. The manager can examine these, see whether the value-base given corresponds to their understanding, and clarify meanings as needed. Once assured that an appropriate value-base was used for the baseline, and observing that the proposals are comparative to that baseline, they can be comfortable that the proposals represent a usable product. To have confidence in the proposals, managers need only examine the report to see that these representative elements are present. For this, they do not need much Value Method expertise.

The presence and use of these three simple measurement points also gives the decision-maker a documented basis for their decisions. It shows the starting values used in the analysis and the added-value results that would be generated by implementing the value study proposals. When the comparative display of each proposal shows its benefits, disadvantages, risks, and monetary features, it will help the decision-maker support the final selected implementation decision long after everyone else has forgotten the decision basis.

The manager uses the remaining data only indirectly. It is too detailed for them to expend much effort on; they usually prefer only the bottom line of whether the program is returning its investment properly, and the resources needed to make sure that it does. Governmental managers' concerns often center on whether they are meeting the mandated requirements and what is needed to meet them. It is the technical Value Method professional's job (value program staff or other support) to create this synopsis data and give managers the bottom line.

PROGRAM DATA

General. The program person's data requirements support the management data needs: they supply the bottom line to the manager and hold the supporting data needed to back it up. To do this, program staff need more technical data than the manager, and must process it so they can present it and support program activities. This data falls into three general categories: value study acceptance, value study result use, and meeting of overall program goals.

Value Study Acceptance. The program person needs to review each value study (the Federal sector often refers to this as an internal program audit) to determine whether it meets the basic program requirements for a minimally acceptable value study. Examining the first three points of the general data needs and ensuring they are present is the basis for this acceptance. These three features make up the successful structure of the Value Method. Data has shown that studies lacking these features are much more prone to being flawed. Accordingly, their absence usually represents a lack of care during value study performance, and the resulting proposals are more susceptible to errors in judgment.

Value Study Result Use. Even the best value study, when unused, is a study that does not produce an appropriate return. Technical Value Method perfection is not the goal of a value study; producing good decision-making products is. A quality value study result that goes unused is like a fine wine that sits in the heat until it turns to vinegar.

A value study that goes unused, or whose proposals are totally rejected, indicates a problem. Well-chosen measurement data can show where that problem resides. General data items four and five are the added features needed for this basic data. The data is easily collected and can be used in a variety of forms to demonstrate status on a multitude of issues and indicate potential problems. The table below, generated using a moderately sized value program as a guide, is a sample of the potential. (Only a portion of the useful information that can be learned from this data is shown; the original table is many pages long.)

The table's four columns are: measurement from statistical reviews; flag indicating a potential program problem; potential causes; and suggested possible actions.

Measurement: Number of staff trained as a percentage of overall staff
Flag: Less than 5 percent formally trained, or experienced by way of participation in at least five value studies, in each major discipline that may be required for value studies (higher percentages are best in technical areas often requested in value studies)
Potential causes and suggested actions:
  • Management support may be low: brief management on training benefits.
  • Staff may not be aware of the value program or its training courses: examine techniques to keep staff informed.
  • Staff may not be interested in training: offer short training sessions focusing on what the program can do for individuals.
  • Training may not be held at convenient locations or times: have a survey of staff performed and develop a plan of action.
  • Funds may not be available for training: investigate techniques to share costs with other activities; increase selection of team members with some value study experience to participate in CVS-facilitated (or equivalent) studies to generate more expertise.

Measurement: Percentage of activities that management (and, if government, the oversight authority) selects for value study, relative to the number of activities available and their estimated cost
Flags: Less than 5 percent of the overall budget; less than 25 percent of new budget items
Potential causes and suggested actions:
  • Management and/or oversight authority support may be low: brief management about the program's potential (how it can help them).
  • Communication of benefits may be insufficient: communicate the benefits; include performance parameters in job descriptions tying salary bonuses to overall activity success; if Federal, include performance parameters in job descriptions pursuant to Office of Management and Budget Circular A-131; if the program is mandated, brief managers on their responsibilities.

Measurement: Percent of activities that violate value program standards, as specified by oversight groups or, if appropriate, regulatory/legal standards
Flag: Allowed deviation should be set by the local oversight authority (the recommended standard is no more than 10 percent)
Potential causes and suggested actions:
  • Inappropriate standard (e.g., too restrictive): examine performance standards and revise accordingly; train management about program scope and expectations.
  • Insufficient local program support: increase allotted time and funds for value program staff activities.

Measurement: Estimated savings divided by total original projected expenditures (percent)
Flag: Lower than industry-average percent savings (e.g., 4 to 8 percent for government, 8 to 12 percent for private industry)
Potential causes and suggested actions:
  • Activities selected for value study too early or too late in the project cycle, or an insufficient number of value studies over the project life cycle: train selection personnel (management, activity team, value program staff) in selection procedures and potential applications for the process.

Measurement: Actual cost of value studies
Flags: Value study proposal results are less than the cost to operate the value study; actual value study costs exceed projected costs
Potential causes and suggested actions:
  • Improper team expertise selection, or incorrect project/activity selection: train selection personnel (management, activity team, value program staff) in selection procedures.
  • Incorrect delineation of scope in selection, or value study scope inappropriately narrow: expand training in the application of the Value Method for the selected value study team leader.
  • Lack of understanding of estimating, or incorrect estimate of the scope of the study (time, complexity, expertise, expectations): train the value program staff or the activity team in the art of estimating.
  • Inappropriate value study team leader selection: if technical people selected the members, supply more training in the selection process.
  • Inappropriate value study team member selection (not enough experience or training): if management selected the team members, inform management of expertise expectations.
  • Incorrect estimate of the expertise needed by the team leader and value study team: monitor value studies in progress and incorporate consultants or other expertise as necessary in future studies.
  • Lack of value study team cohesiveness: emphasize facilitation and dynamic team composition.

Measurement: Nonproductive value studies
Flags: A statement in the value study report that "this project is not improvable"; no proposals or additional ideas for further development; idea generation results missing or low in volume; the study was abandoned before the development phase occurred or could occur; no report, or the report lacks evidence of decision-process use
Potential causes and suggested actions:
  • Prior involvement in the activity by a value study team member: examine and correct selection procedures; brief the supervisory chain on value study team selection criteria.
  • Inappropriate value study team leader selection (indicated by lack of training or experience): examine and correct procedures for team leader selection and performance monitoring.
  • Value study team leader untrained or severely lacking experience: train the team leader; enroll the team leader in a mentor-based program.
  • If unresolved issues affect "effective" program implementation or violate regulations/laws, report them to the designated "senior responsible official."

Measurement: Number of proposals
Flag: Less than three proposals per value study (implies poor value study performance)
Potential causes and suggested actions:
  • The scope of the study is constantly changing or is inappropriate (verify by comparing value study results to the results generated by a selection-phase analysis from a CVS): train management and the activity team in appropriate scopes of study; train the value study team leader in estimating the scope of a study.
  • Inappropriate timing during the project activity cycle: train activity selection staff in selection procedures and ideal application timing.
  • Allocated duration for the value study too short: encourage program staff to join mentor-based Value Method activities (SAVE International, value program improvement classes, value study based meetings, or other); delegate duration selection to an experienced person, e.g., a CVS.
  • Inappropriate value study team member selection (too little expertise, lack of independence from a previously selected solution, etc.): train value study team selection staff in selection procedures and application timing; obtain selection assistance expertise from others.
  • Inappropriate value study team leader selection (poor leader performance is indicated by FAST validity, number of ideas evaluated versus complexity of study, team member evaluation form comments, etc.): train the value study team leader; examine selection procedures; obtain more training in selection procedures; obtain selection assistance expertise from others.

Measurement: Number of proposals implemented divided by number proposed (percent)
Flags: Below the average range (government, private industry) by more than 20 percent; above the average range (usually with a low return on investment, or small cost savings and avoidances)
Potential causes and suggested actions (below average):
  • Lack of value study team leader expertise (often indicated by lack of written report or presentation clarity): enroll the team leader in a mentor-based program augmented by formal facilitation and team leader training.
  • Inappropriate value study team leader selection (indicated by matching areas of expertise against the activity's requirements): examine and correct typical selection procedures.
  • Improper team member selection (expertise/training): obtain selection assistance expertise from others.
  • Inappropriate timing of value studies (phase): train staff in selection procedures and application timing.
  • Support for the studied activity may be low, or previously involved team participants may be resistant to change: communicate specific value study benefits for each participant.
  • Lack of value study team emphasis on the implementation phase: train the team leader on procedures, responsibilities, and the need to highlight the importance of the implementation and verification phases to the value study team.
  • Poor individual value study team member interaction and relations (check with the team): offer constructive assistance to the team; avoid selections that have caused the identified difficulties; discuss with appropriate personnel and, if necessary, avoid future selection of the staff involved.
Potential causes and suggested actions (above average):
  • Prior involvement by the value study team in the activity: train staff in selection procedures and application timing; encourage program staff to join mentor-based Value Method activities (SAVE, etc.); if Federal, report unresolved issues affecting "effective" program implementation and/or violation of regulations/laws to the "senior responsible official."

Measurement: Savings implemented divided by savings proposed
Flag: Below the average range (government, private industry)
Potential causes and suggested actions:
  • Lack of team leader expertise (usually indicated by lack of written report or presentation clarity): enroll the value study team leader in a mentor-based program augmented by formal facilitation and team training.
  • Inappropriate value study team leader selection: examine selection procedures.
  • Inappropriate value study team selection (indicated by matching areas of expertise against the activity's needs): obtain selection assistance expertise from others; train staff in selection procedures and application timing.

Measurement: Percent difference between the estimated values of accepted proposals in the Final Report and the Feedback Report estimates
Flags: Difference greater than 30 percent (plus or minus); difference greater than 100 percent (plus or minus)
Potential causes and suggested actions:
  • Inappropriate estimating expertise: train the estimators.
  • Value study team members lack field experience: examine selection procedures.
  • Inappropriate team member expertise (for differences over 100 percent): examine selection procedures.
  • Lack of value study team support by the prior-involved activity team during the study: encourage program staff to join mentor-based Value Method activities (SAVE International, value program improvement classes, value study based meetings, or other); discuss with appropriate personnel and plan corrective actions as necessary.
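Several of the flags above are simple numeric thresholds, so they lend themselves to an automated screen. The sketch below is a minimal illustration rather than a standard tool; the dictionary keys are invented, and it checks only three of the table's flags (training coverage below 5 percent of staff, estimated savings below the 4-percent government floor, and fewer than three proposals per value study).

```python
def screen_program(stats: dict) -> list:
    """Check program summary statistics against three threshold flags from the table."""
    flags = []
    # Flag: less than 5 percent of staff formally trained
    if stats["trained_staff"] / stats["total_staff"] < 0.05:
        flags.append("training coverage below 5 percent of staff")
    # Flag: estimated savings below the 4-percent government floor
    if stats["estimated_savings"] / stats["projected_expenditures"] < 0.04:
        flags.append("savings below industry-average range")
    # Flag: fewer than three proposals per value study
    if stats["proposals"] / stats["studies"] < 3:
        flags.append("fewer than three proposals per value study")
    return flags
```

Each returned flag maps back to a table row, whose cause and action columns then guide the follow-up.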

Meeting Overall Program Goals. Whether in the private or public sector, managers implement the Value Method to meet goals, and a value program exists to further the attainment of those goals. The sixth item in the general data needs makes sure reality is in the analysis. It helps the Value Method professional improve program products and the methods used to attain them, and maintain (or obtain) managerial support.

To be of use, the results must produce beneficial products that people accept and understand. Ideally, the parameters used will relate to the industry, the organization, the overall goals, and the specific value study products desired. For example, if cost reduction was one of the stated benefits of a proposal, the amount estimated by the value study and the actual amount produced can be used to show that an understandable and acceptable benefit was both predicted and delivered by use of the Value Method. This can, in turn, be used to generate support for obtaining value study benefits in future activities.

In turn, these results can be used to improve program implementation and identify features needing improvement before they become a problem. For example, suppose the value study products indicate good proposal acceptance with an estimated high dollar return, but the ratio of the actual amount attained to the estimated amount expected is below 60 percent. This measurement parameter would indicate that the estimating competence within the value studies may be insufficient, and that more attention should be paid to obtaining that expertise. Should this parameter be ignored and conditions remain unchanged, managerial support will usually decline; the overall program results often become regarded as "overstated" and "unreliable." In another case, the ratio of ideas generated to proposals presented can indicate whether the expertise selected for the multi-discipline team was appropriate, and how the team composition might be changed to improve the final results.
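The 60-percent example above reduces to a single ratio check. A minimal sketch follows; the function name and the default threshold are illustrative only.

```python
def estimate_reliability(attained: float, estimated: float,
                         floor: float = 0.60) -> tuple:
    """Return (ratio, flagged): the ratio of savings attained to savings
    estimated, with flagged True when the ratio falls below the floor
    (60 percent in the example above)."""
    ratio = attained / estimated
    return ratio, ratio < floor
```

For instance, $500,000 attained against $1,000,000 estimated gives a ratio of 0.5 and raises the flag, pointing at estimating competence rather than proposal quality.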

For the Federal or other governmental manager, meeting goals and performance parameters is even more important. The stated purpose of government is not to make money, but to provide specified services at the least cost. Businesses provide products and services to make a profit; government provides products and services to meet societal or other goals. This difference in purpose produces significant changes in the way work is performed and in the way products are analyzed to prove success. While private-enterprise goals are at the foundation of most present governmental laws and directives, they become disconnected from their original intent during implementation. A recognition of this disconnect is present in most laws; that is why many of them specify goals and requirements. The legislative or executive body often uses industry standards to create these goal-oriented laws and directives.

Thus, for the governmental manager with a law or directive to meet, additional program-specific parameters come into play. These show the management support level (projects required for study versus the number not studied); the presence of a structure to perform value studies and advise the organization on its proactive practices; and an organizational understanding of why the program is required, what the organization can get from it, and what each member's responsibilities are within the value program.

SPECIALTY DATA

Quality Parameters. Star diagrams, quality models, and space models are examples of components that assist high-quality value studies. Using these types of models generates more opportunities for a quality value study result. However, they are study-type specific and are not crucial to ensuring that every value study meets the ultimate purpose for which it is performed. Accordingly, in determining the appropriateness of a program activity, limited weight should be accorded to the presence of these types of models.

Inappropriate Parameters. An often quoted measurement of value study performance is technical discipline versus number of ideas, or a similar measurable parameter. Such measures have little value in determining the success of a program's effort and, if a team knows they may be used, can generate counterproductive competitive stress in the team, reducing overall performance. They ignore the fact that non-technical ideas feed on technical ones and vice versa. These types of parameters should be avoided unless an appropriate added value for them is proved.

PRESENTING IT

Many highly successful programs have failed to retain management support. This is most often due to a lack of visibility for the program and its results. The other most common reasons are a lack of understanding of what the program offers, and a belief that cited results, if any are cited, are not reliable or provable.

Managers and executives have many problems and cannot spend enormous amounts of time learning to understand everything in their organization. They often figure, rightfully so, that this is why technical expertise has been hired. It falls to the value program manager, or the consultant for Value Method activities, to produce provable, simple, easily reviewed overall program results that are compiled and presented to the organization, line managers, and executives. Each of the multiple presentations given each year needs to be tuned to its audience. Further, obtaining upper-executive time for a short 15- to 30-minute briefing and feedback session on the measurement results and next year's plans should be a key part of each value program manager's annual agenda.

Requesting and receiving feedback about the presented program results, and about how well the analysis is understood, must be a part of each meeting. This improves the data presentation and produces improved overall support. Schedule one to five minutes in each meeting for this objective.

ACTUAL APPLICATION EXAMPLES

In a moderately sized program, the program staff cited that acceptance within their program had reached an all-time high. An examination, as described in the previous partial table of measurement parameters, ensued. Indeed, the number of proposals presented versus the number accepted showed that more than 90 percent of all proposals were being accepted. A review of the percent improvement (in this case, cost) showed that the average reduction in overall costs was 60 percent less than that normally expected in the value program. This combination often indicates that people significantly involved in the project prior to the value study were being selected as full-time team members. The program staff person was contacted, and this was found to be the case. While this practice produced momentary apparent success, in reality the overall program was suffering. It produced a belief by management that they didn't need "outside help." Further, the lack of fresh viewpoints created less innovation and allowed more mistakes to occur. This practice inevitably produces less-than-optimum decision products. Had that program attained the average result for independently formed teams, several million dollars in additional cost reductions would have resulted.
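The pattern in this first example, very high acceptance combined with improvement well below the program norm, can be expressed as a simple heuristic check. This sketch is an illustration of the reasoning only, not a published rule; the 90-percent and 60-percent cutoffs are taken from the example above, and all names are invented.

```python
def prior_involvement_suspected(acceptance_rate: float,
                                avg_improvement: float,
                                program_norm: float) -> bool:
    """Flag likely prior involvement by team members: proposal acceptance above
    90 percent while average improvement is 60 percent or more below the
    program's normal improvement rate."""
    return acceptance_rate > 0.90 and avg_improvement <= 0.40 * program_norm
```

Either signal alone can be benign; it is the combination that, per the example, warrants contacting the program staff to check team selection.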

After a major reorganization, the percentage of Value Method trained cadre available for applying Value Method processes, versus those who might be called upon, began to fall off dramatically. The pool of available qualified cadre shrank by about 10 percent or more per year. While additional training was offered, little effort to attend was apparent. This combination usually indicates a lack of management support for the overall program. Discussions with staff who had previously expressed an interest in taking classes appeared to confirm this. Managers indicated that they were supportive of the program, but could not spare staff time because fewer staff were available. While managers were made aware of other options, they made no move to fill the gap with anything other than in-house personnel. Over a three-year period, the number of value studies performed dropped by 80 percent; upon implementation, the average project that was not value studied commonly exceeded its allocated budget, while the value studied projects tended to be implemented at or below their budgeted amounts. This result was apparently due to the loss of the "champion" for the value program.

MEASUREMENT IMPLEMENTATION

Appropriate value program measurements help people run productive programs. A table of measurement parameters, with the meanings of the results and common remedies, should be part of the tools every professional uses. Since many professionals practice as consultants, they have difficulty obtaining programmatic data; however, clients should be asked for feedback, with the understanding that such data is a strong added value for the clients. (Specific data can be generalized so that proprietary concerns are not an issue.)

Many value programs rely on a person who manages the program aided by an expert. The knowledge the expert uses is often not fully understood and, as a result, may not be productively used. The few organizations and professionals that have developed systematic measurements, and meanings for those measurements, on the scale described have made books and other resources available to others. These, when combined with the knowledge within the involved organization, can produce results that improve both the productiveness of the value study results obtained and the process that produces them.

Use of these approaches can help people generate and meet the requirements in the Government Performance and Results Act and other government initiatives that stress measuring performance. They also provide auditable results the Federal manager and staff can use to demonstrate that they have effective value programs as defined in PL 104-106.

REFERENCES

Federal Construction Council Technical Report No. 92, Elements of an Effective Value Engineering Program, Federal Construction Council Consulting Committee on VE, National Academy Press, Washington, D.C., 1990.

Module II Program Coursebook, Fred Clarke, J. Samuel Martin, and Michael LaBorne, Bureau of Reclamation, Denver, CO, 1993.

Operating and Maintaining a Value Program, J. Samuel Martin, SAMI VE LLC, 1998.

Value Engineering Officer's Operational Guide, Appendix A, US Army Corps of Engineers, Washington, D.C., 1987.

Various Office of Management and Budget Reports, Office of Management and Budget, 1993-1997.


