Summaries of Articles on Management Accounting

All the article summaries included here were prepared by MBA students in Management 413, Cost Accounting. Each article summary includes the name of the student who prepared the summary.

Management Accounting in General

Adler, Paul S. "Time-and-Motion Regained," Harvard Business Review (January-February, 1993), pp. 97-108.

Alonso, Ramon L. and Cline W. Frasier, "JIT Hits Home: A Case Study in Reducing Management Delays," Sloan Management Review (Summer, 1991), pp. 59-67.

Assmus, Gert and Carsten Wiese, "How to Address the Gray Market Threat using Price Coordination," Sloan Management Review (Spring, 1995), pp. 31-41.

Awasthi, Vidya N., "ABC's of Activity-Based Accounting," Industrial Management (July-August, 1994), pp. 8-11.

Beaty, Carol A. "Implementing Advanced Manufacturing Technologies: Rules of the Road," Sloan Management Review (Summer, 1992), pp. 49-60.

Böer, Germain, "Making Accounting a Value Added Activity," Management Accounting (August, 1991), pp. 36-41.

______ , "Five Modern Management Accounting Myths," Management Accounting (January, 1994), pp. 22-27.

Brimson, James A., "Feature Costing: Beyond ABC," Journal of Cost Management (January/February, 1998), pp. 6-12.

Briner, Russell F., Michael D. Akers, James W. Truitt, and James D. Wilson, "Coping with Change at Martin Industries," Management Accounting (July, 1989), pp. 45-48.

Burt, David N., "Managing Suppliers Up to Speed," Harvard Business Review (July-August, 1989), pp. 127-135.

Chalos, Peter, "Costing, Control, and Strategic Analysis in Outsourcing," Journal of Cost Management (Winter, 1995), pp. 31-37.

Clinton, B. Douglas and Ko-Cheng Hsu, "JIT and the Balanced Scorecard: Linking Manufacturing Control to Management Control," Management Accounting (September, 1997), pp. 18-24.

Drury, Colin and Mike Tayles, "Cost System Design for Enhancing Profitability," Management Accounting (January, 1998), pp. 40-42.

"Cost Accounting Standards: Myths & Misconceptions", Management Accounting, January 1994, pp. 42-43.

Cooper, Robin and Regine Slagmulder, "Strategic Cost Management," Management Accounting (February, 1998), pp. 16-18.

Cooper, Robin and Regine Slagmulder, "Cost Management Beyond the Boundaries of the Firm," Management Accounting, March 1998, pp.18-19.

Curtis, Corey C. and Lynn W. Ellis, "Balanced Scorecards for New Product Development," Journal of Cost Management (May/June, 1997), pp. 12-18.

Ellram, Lisa M. and Ed Feitzinger, "Using Total Profit Analysis to Model Supply Chain Decisions," Journal of Cost Management (July/August, 1997), pp. 12-21.

Enzweiler, Albert, "Improving the Financial Reporting Process," Management Accounting (February, 1995), pp. 40-43.

Ferrara, William L., "Cost/Management Accounting--the 21st Century Paradigm," Management Accounting (December, 1995), pp. 30-36.

Fry, Timothy D. and James F. Cox, "Manufacturing Performance: Local Versus Global Measures," Production and Inventory Management Journal (Second Quarter, 1989), pp. 52-56.

Fuller, Joseph B., James O'Connor, and Richard Rawlinson "Tailored Logistics: The Next Advantage," Harvard Business Review (May-June, 1993), pp. 87-98.

Gallimore, Devin F. and Richard J. Penlesky, "A Framework for Developing Maintenance Strategies," Production and Inventory Management Journal (First Quarter, 1988), pp. 16-21.

Grant, Robert M. "The Resource-Based Theory of Competitive Advantage: Implications for Strategy Formulation," California Management Review (Spring, 1991), pp. 114-135.

Harris, Whitman and Harry Franklin Porter, "Deciding Whether to Buy or Make," Journal of Manufacturing and Operations Management (Summer, 1990), pp. 78-84.

Hopson, James F. "New Rules for Package Design Costs," Management Accounting (February, 1993), pp. 46-48.

Howell, Robert and Steve Soucy have a series of articles in Management Accounting in July, August, October, November, 1987 and January, 1988 that deal with the impact of the new manufacturing environment on management accounting.

Hubbell, William Jr., "Combining Economic Value Added and Activity-Based Management," Journal of Cost Management (Spring, 1997), pp. 18-29.

Johnson, H. Thomas, "Activity-Based Information: A Blueprint for World-Class Management," Management Accounting (June, 1988), pp. 23-30.

Jung, Jong-Yun and Rasphpal S. Ahluwalia "FOECOD: A Coding and Classification System for Formed Parts," Journal of Manufacturing Systems (March, 1991), pp. 223-232.

Keegan, Daniel P., Robert G. Eiler, and Charles R. Jones, "Are Your Performance Measures Obsolete?" Management Accounting (June, 1989).

Krupnicki, Michael and Thomas Tyson, "Using ABC to Determine the Cost of Servicing Customers," Management Accounting (December, 1997), pp. 40-46.

Lacity, Mary C., Leslie P. Willcocks, and David F. Feeny, "IT Outsourcing: Maximize Flexibility and Control," Harvard Business Review (May-June, 1995), pp. 85-93.

Lieber, Ronald B., "Here Comes SAP," Fortune (October 2, 1995), pp. 122-124.

Lucas, Mike, "Standard Costing and Its Role in Today's Manufacturing Environment," Management Accounting (April, 1997), pp. 32-34.

McKinnon, Sharon M. and William J. Bruns, Jr. "What Production Managers Really Want to Know," Management Accounting (January, 1993) pp. 29-37.

Maisel, Lawrence S. "Performance Measurement: The Balanced Scorecard Approach," Journal of Cost Management (Summer, 1992), pp. 47-52.

Magretta, Joan, "The Power of Virtual Integration: An Interview with Dell Computer's Michael Dell," Harvard Business Review (March-April, 1998), pp. 73-84.

Marple, Raymond, "Management Accounting Is Coming of Age," Management Accounting (July, 1967), pp. 3-16.

McFarland, Walter, Concepts for Management Accounting. New York: National Association of Accountants, 1966.

McIlhattan, Robert D. "How Cost Management Systems Can Support the JIT Philosophy," Management Accounting (September, 1987), pp. 20-26.

Metters, Richard, "Quantifying the bullwhip effect in supply chains," Journal of Operations Management, (May, 1997), pp.89-100.

Monden, Yasuhiro and John Lee, "How A Japanese Auto Maker Reduces Costs," Management Accounting (August, 1993), pp. 22-26.

Nanni, Alfred J., Jeffrey Miller and Thomas E. Vollman, "What Should We Account For?" Management Accounting (January, 1988), pp. 42-48.

National Association of Accountants--This organization publishes numerous studies related to management accounting. Their studies on Cost-Volume-Profit Analysis and Direct Costing are classics and well worth reading.

Rehnberg, Stephen M., "Budgeting: Keep Your Head Out of the Cockpit," Management Accounting (July, 1995), pp. 34-37.

Robinson, Michael A. and John E. Timmerman, "How Vendor Analysis Supports JIT Manufacturing," Management Accounting (December, 1987), pp. 20-24.

Sakurai, Michiharu "Target Costing and How to Use It," Journal of Cost Management (Summer, 1989), pp. 39-50.

Schmittenner III, John W., "Metrics," Management Accounting (May, 1993), pp. 27-30.

Schoemaker, Paul J. H., "Scenario Planning: A Tool for Strategic Thinking," Sloan Management Review (Winter, 1995), pp. 5-40.

Sellers, Patricia "The Dumbest Marketing Ploy," Fortune (October 5, 1992), pp. 88-94.

Shank, John K. and Vijay Govindarajan, "Making Strategy Explicit In Cost Analysis: A Case Study," Sloan Management Review (Spring, 1988), pp. 19-29.

Simons, Robert, "Control in an Age of Empowerment," Harvard Business Review (March-April, 1995), pp. 80-88.

Singhvi, Virendra, "Reengineer The Payables Process," Management Accounting (March, 1995), pp. 46-49.

Susman, Gerald I., "Product Life Cycle Management," Journal of Cost Management (Summer, 1989), pp. 8-22.

Tatikonda, Lakshmi U. and Rao J. Tatikonda "Overhead Cost Control--Through Allocation or Elimination?" Production and Inventory Management Journal (First Quarter, 1991), pp. 37-41.

Venkatesan, Ravi "Strategic Sourcing: To Make Or Not To Make," Harvard Business Review (November-December, 1992), pp. 98-107.

Vineyard, M.L. and J.R. Meredith "Effect of Maintenance Policies on FMS Failures," International Journal of Production Research (November, 1992), pp. 2647-2657.

Walker, Denton B. and Terry Zinsli "The Coors Shenandoah Experience," Management Accounting (March, 1993) pp. 37-41.

West, Timothy D. and David A. West, "Applying ABC to Healthcare," Management Accounting (February, 1997).

Wright, Michael A. and John W. Jonez "Material Burdening: Management Accounting Can Support Competitive Strategy," Management Accounting (August, 1987), pp. 27-31.

West, Robert N. and Amy M. Snyder, "How to Set Up a Budgeting and Planning System," Management Accounting (January, 1997), pp. 20-26.

Cost Behavior Analysis

Benston, George, "Multiple Regression Analysis of Cost Behavior," The Accounting Review (October, 1966), pp. 657-72.

"Separating and Using Costs as Fixed and Variable," NAA Bulletin (June, 1960).

Dean, Joel, Statistical Cost Estimation. Bloomington: Indiana University Press, 1976.

Environmental Costs

Hammer, Burt and Christopher H. Stinson, "Managerial Accounting and Environmental Compliance Costs," Journal of Cost Management (Summer, 1995), pp. 4-10.

Kite, Devaun, "Capital Budgeting: Integrating Environmental Impact," Journal of Cost Management (Summer, 1995), pp. 11-14.

Quality

Carr, Lawrence, "How Xerox Sustains the Cost of Quality," Management Accounting (August, 1995), pp. 26-32.

Deming, W. Edwards, Out of the Crisis. Cambridge, MA: Massachusetts Institute of Technology Center for Advanced Engineering Study, 1982.

Ernst & Young, The International Quality Study: Best Practices Report. Ernst & Young, 1992.

Diallo, Alahassane, Zafar Khan, and Curtis F. Vail, "Cost of Quality in the New Manufacturing Environment," Management Accounting (August, 1995), pp. 20-25.

Flanagan, Theresa A., and Joan O. Fredericks "Improving Company Performance through Customer-Satisfaction Measurement and Management," National Productivity Review (Spring, 1993), pp. 239-258.

Garvin, David A. "What Does 'Product Quality' Really Mean?" Sloan Management Review (Fall, 1984), pp. 25-43.

Gilbert, James D. "TQM Flops--A Chance To Learn from the Mistakes of Others," National Productivity Review (Autumn, 1992), pp. 491-499.

Grahn, Dennis P., "The Five Drivers of Total Quality," Quality Progress (January, 1995), pp. 65-20.

Gray, Janet, "Quality Costs: A Report Card on Business," Quality Progress (April, 1995), pp. 51-54.

Hare, Lynne B., Roger W. Hoerl, John D. Hromi, and Ronald D. Snee "The Role of Statistical Thinking in Management," Quality Progress (February, 1995), pp. 53-60.

Hart, Marilyn K. "Quality Tools for Decreasing Variation And Defining Process Capability," Production and Inventory Management Journal (Second Quarter, 1992), pp. 6-11.

Hauser, John R. and Don Clausing, "The House of Quality," Harvard Business Review (May-June, 1988), pp. 63-73.

Jaehn, Alfred H. "The Zone Control Chart," Quality Progress (July, 1991), pp. 65-68.

Juran, J. M. Juran on Planning for Quality. New York: The Free Press, 1988.

Lederer, Phillip J., and Seung-Kyu Rhee, "Economics of Total Quality Management," Journal of Operations Management (June, 1995), pp. 353-368.

Rust, Kathleen, "Measuring the Costs of Quality," Management Accounting (August, 1995), pp. 33-37.

Scherkenbach, William W., "Performance Appraisal and Quality: Ford's New Philosophy," Quality Progress (April, 1985), pp. 40-46.

Stowell, Daniel M. "Quality in the Marketing Process," Quality Progress (October, 1989), pp. 57-62.

Tierno, Anthony "Lessons From Alcoa: Finding the Right Path to Quality Has Its Twists and Turns," Management Accounting (August, 1991), pp. 27-30.

Tyson, Thomas N., "Quality & Profitability: Have Controllers Made the Connection?" Management Accounting (November, 1987), pp. 38-42.

Welch, James F. "Service Quality Measurement at American Express Traveler's Cheque Group," National Productivity Review (Autumn, 1992), pp. 463-471.

Wettach, Robert, "Function or Focus?--The Old and the New Views of Quality," Quality Progress (November, 1985), pp. 65-68.

Zangwill, Willard I., "Ten Mistakes CEO's Make About Quality," Quality Progress (June, 1994), pp. 43-48.

Transfer Pricing

Benke, Ralph L. and James Don Edwards, Transfer Pricing: Techniques and Uses. New York: National Association of Accountants, 1980.

Burt, David N., "Managing Product Quality through Strategic Purchasing," Sloan Management Review (Spring, 1989), pp. 39-48.

Cassel, Herbert C. and Vincent F. McCormack "The Transfer Pricing Dilemma--And a Dual Pricing Solution," Journal of Accountancy (September, 1987), pp. 166-174.

Hirshleifer, Jack, "On the Economics of Transfer Pricing," Journal of Business (April, 1957), pp. 96-108.

Lesser, Frederic E., "Does Your Transfer Price Make Cents?" Management Accounting (December, 1987), pp. 43-46.

Pricing

Baxter, W. T., and A. R. Oxenfeldt, "Costing and Pricing: The Cost Accountant Versus The Economist," Business Horizons (Winter, 1961), pp. 77-90.

Boiteux, M., "Peak-Load Pricing," The Journal of Business (April, 1960), pp. 157-179.

Campbell, Robert J., "Pricing Strategy in the Automotive Glass Industry," Management Accounting (July, 1989), pp. 26-34.

Frank, Gary B., Steven A. Fisher, and Allen R. Wilkie, "Linking Cost to Price and Profit," Management Accounting (June, 1989), pp. 22-26.

Reid, David A. and Richard E. Plank, "A Guide to Pricing in Business Markets," Journal of Pricing Management (Fall, 1990), pp. 19-25.

Simon, Hermann, "Pricing Opportunities--And How to Exploit Them," Sloan Management Review (Winter, 1992), pp. 55-65.

Wright, Michael A. and John W. Jonez, "Material Burdening: Management Accounting Can Support Competitive Strategy," Management Accounting (August, 1987), pp. 27-31., 11/6/95

K. Jeff Livingston, Class of 1997

The Portables Group is one of the most competitive segments of the test and measurement instrument market served by Tektronix. As an industry leader, the Portables Group has been under intense pressure from Japanese and European competitors to reduce the costs of its products. In response, many cost reduction programs have been implemented. Programs such as JIT (just-in-time) and MRP II (manufacturing resource planning) have been so successful that labor cost has been reduced to as little as 3% of total cost in some areas. These advances are substantial, yet according to reports based on existing internal accounting standards they are not having the overall impact expected. The problem turned out to be that the removal of labor was having less and less effect on overall costs. Because the cost accounting system correlated overhead with labor, managers came to the erroneous but frequently drawn conclusion that the way to reduce overhead was to reduce labor. In fact, there was little relationship between direct labor and overhead. A new method was needed.

The objective of the new accounting system was to design a method useful in identifying cost reduction opportunities. After segmenting the various operations, it was concluded that the current allocation of overhead costs was misleading managers. A decision was made to classify total overhead into one of two pools. These pools created the appropriate cost correlation. As a result, the new accounting system would encourage certain behavior on the part of the design and cost engineers, while penalizing actions that increased costs. The new accounting system was not to be used for external reporting. The "two sets of costs" approach made this new information available solely as a tool for management decision making.

In conclusion, accurate cost information can be a powerful management tool if presented properly and if there is a common understanding of its meaning and origin. The new system offered management a method of quantifying and communicating to design engineers the value of specific design decisions.


Baxter, W. T., and A. R. Oxenfeldt, "Costing and Pricing: The Cost Accountant Versus The Economist," Business Horizons (Winter, 1961), pp. 77-90., 11/8/95

Shannon Wilcoxon, Class of 1997

This article attempts to explain and reconcile the differences between a cost accountant's and economist's view towards setting prices. The price setting system described is the cost plus (or costing margin) model which is a formula based approach used by cost accountants. This formula consists of three components: direct costs, indirect costs, and profit.
Direct costs are the costs associated with the individual job under consideration. The indirect costs component is not as objective as the direct cost category. Indirect costs are based on overhead allocated by some quantitative measure (machine hours, direct labor hours, etc.). The profit component is usually expressed as a percentage of either the direct costs, indirect costs or a combination of the two.
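The three-component formula described above can be made concrete with a short Python sketch. The function name and every figure below are hypothetical illustrations, not numbers from the article.

```python
# A minimal cost-plus pricing sketch: price = direct costs
# + allocated indirect costs + a profit markup on full cost.
# All names and figures here are invented for illustration.

def cost_plus_price(direct_costs, overhead_rate, allocation_base, profit_pct):
    """Return a cost-plus price for a single job."""
    indirect_costs = overhead_rate * allocation_base  # e.g. $/machine-hour * hours
    full_cost = direct_costs + indirect_costs
    profit = full_cost * profit_pct                   # markup expressed on full cost
    return full_cost + profit

# A job with $4,000 of direct costs, 50 machine hours at a $30/hour
# overhead rate, and a 25% markup:
price = cost_plus_price(4000, 30, 50, 0.25)
print(price)  # 6875.0
```

Note how the indirect component depends entirely on the chosen allocation base (machine hours here); swapping in direct labor hours would change the price, which is exactly the subjectivity the authors flag.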
The authors acknowledge problems associated with this pricing strategy. They acknowledge that a cost figure can only be accurate at one point in time along the complete project life. In addition, some resources are not job-specific but are used in unequal amounts by various jobs. It can also be argued that this pricing formula disregards the opportunity cost of alternatives. Opportunity costs are a significant component in an economist's view of price setting. Demand research is another component that is frequently not considered by a cost accountant.
The main difference between an economist's and a cost accountant's perspective on price setting is the stage at which each makes decisions. A cost accountant is usually concerned with costs after plans have been made, whereas an economist concentrates on profit maximization before the company has decided on a course of action. The authors suggest that the seemingly vast differences in price setting between these two professions are not as divergent as they initially appear.
When deciding whether a cost plus system is appropriate for your business, you must consider the advantages and disadvantages of this system. This system is based on a formula, and no one formula can accurately represent appropriate costs for each project. However, a good formula can produce a solid estimate which can be revised based on the parameters of the job under consideration. Parameters such as large volume orders, continuing customer relations, and profit percentage for large versus small jobs must be considered. The cost plus system is most effective at setting short-run prices in an expedient and uncomplicated manner.


Campbell, Robert J., "Pricing Strategy in the Automotive Glass Industry," Management Accounting (July, 1989), pp. 26-34., 11/8/95

Mary Williamson, Class of 1997

Customers in the automotive window glass industry are changing their purchasing practices. The glass manufacturers now must evaluate their cost accounting systems and answer two questions: 1) What is the proper product mix of windows given production/distribution constraints? 2) What prices should be set on individual windows to enhance profitability?
Product costs are determined using traditional cost allocation practices. These cost systems focus on solving the needs of inventory valuation and financial reporting. However, true product cost should reflect “the value of production and distribution resources consumed in creating, packaging, and transporting the product to the customer.” Direct relationships between even small changes in product volume and consumption should be reflected in the cost assignment. Overhead is all other costs not directly related to the product. Traditionally assigned based on labor hours or square footage, these measures are no longer appropriate due to the changes in manufacturing technology. The challenge is deciding where accountants should assign overhead now.
Overhead in a glass plant
There are three major categories of overhead in a glass plant: 1) Production department overhead: traditionally, allocation was a two-stage process. First, group by department those costs directly traceable to individual production cost centers. This is reasonable if the costs assigned are truly identifiable. Second, these groups are allocated to products based on the number of labor hours or machine hours in each type of product. This is reasonable if the cost center resources consumed by each product are reflected in the allocation bases. 2) Material movement overhead and 3) general plant overhead are traditionally allocated directly to products based on square footage.
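As a rough illustration of the two-stage process just described, the sketch below traces overhead to cost centers and then allocates each center's pool to products on machine hours. The cost centers, products, and all dollar and hour figures are assumptions for illustration, not data from the article.

```python
# Stage 1: overhead already traced to production cost centers (hypothetical).
cost_center_pools = {"cutting": 120000, "tempering": 80000}  # dollars

# Machine hours each product consumes in each cost center (hypothetical).
machine_hours = {
    "windshield": {"cutting": 300, "tempering": 500},
    "side_window": {"cutting": 700, "tempering": 300},
}

def allocate(pools, hours):
    """Stage 2: allocate each center's pool to products by machine-hour share."""
    total_hours = {c: sum(h[c] for h in hours.values()) for c in pools}
    product_overhead = {}
    for product, h in hours.items():
        product_overhead[product] = sum(
            pools[c] * h[c] / total_hours[c] for c in pools
        )
    return product_overhead

print(allocate(cost_center_pools, machine_hours))
```

The allocation is only as good as the bases: if "windshield" consumed cutting resources out of proportion to its machine hours, this assignment would misstate its cost, which is the article's criticism of traditional systems.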
A cost accounting system should measure cost center resource consumption taking into consideration certain productive or distributive resources that are finite. The traditional cost accounting system does not allow for this.
Throughput in a new cost system
Product costs should be built around the consumption of the most critical resources. Bottlenecking can limit total plant throughput. “Throughput is a measure of net revenue of the product less costs directly traceable to and caused by the existence of that product in the mix.” In the glass industry, direct traceable costs consist of raw materials and material related costs, product order-related costs, and product bundle related costs. These cost relationships are potential cost pools that make up the activity based cost system.
Throughput only relates to product volume that meets current customer demand. Inventory has no throughput value. Inventory just means an increase in storage costs and cash outflow and risk of parts stockouts.
Throughput is used to develop a product mix strategy and determine the benefit of adjusting product prices. Spotting bottlenecks can result in production improvements that increase plant profitability.
Computation of throughput
“Direct material costs and those related overhead costs directly affected by material quantities consumed are deducted from estimated market-based selling prices as are product bundle movement, packaging scheduling, and setup requirements. These costs represent product consumption of resources that would be increased readily within the budget period if product priority or volume were changed.”
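The computation quoted above can be sketched numerically. The article supplies the definition; the prices, costs, and bottleneck rate below are assumed for illustration.

```python
# Throughput value per unit: market-based selling price less the costs
# directly traceable to the product (materials, order-related,
# bundle-related). All figures are hypothetical.

def throughput(selling_price, materials, order_costs, bundle_costs):
    return selling_price - (materials + order_costs + bundle_costs)

per_unit = throughput(45.00, 18.00, 4.00, 3.00)  # 20.0 per unit
print(per_unit)

# For product-mix decisions, what matters is throughput per hour of the
# bottleneck resource, since that hour is what the products compete for.
units_per_bottleneck_hour = 12
print(per_unit * units_per_bottleneck_hour)  # 240.0 per constraining hour
```

Ranking products by throughput per constraining hour, rather than by full cost, is what lets the mix decision respect the finite bottleneck capacity discussed above.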
Product costs for pricing decisions
The pricing mechanism must recover all fixed and variable costs, so throughput value alone will not support product pricing. Production related fixed-costs fall into three categories 1) traceable to plant processes 2) machine depreciation 3) labor rates. “Full costing of products should display the same relative profit spread as depicted by throughput value so that product mix decisions using throughput value are consistent with product pricing strategy based on full costing.” Square footage measures have no relation to value of time and fixed resources consumed in each machine center. Analysis based on labor rates doesn’t reflect the consumed resource’s value.
A general overhead plus depreciation cost per constraining hour figure is helpful, because you can see the freed up time in a resource from dropping one product. These saved costs can be used toward another product.
Bottlenecking may occur in different departments as investment and operational decisions affect production activity. These changes will change allocation. Remember: product profitability should not be changed from that determined by throughput value by the allocation of indirect fixed costs.
Through cost accounting and identification of bottlenecks, the firm can coordinate what it is able to sell with what it produces. Cost allocation determines adequate selling prices. Throughput value is an important strategy for determining product profitability and will allow the best use of scarce resources.


Wettach, Robert, "Function or Focus?--The Old and the New Views of Quality," Quality Progress (November, 1985), pp. 65-68., 11/8/95

Mindy J. Sauers, Class of 1997

This article deals with the new views that organizations have about achieving quality. What I found interesting is that these views are not radically different; rather, they recast the old views of quality in a different context. Quality assurance has evolved from being a production afterthought to a process which is integrated along the production line. No longer is one group given authority to “police” people; rather, it is now a process shared by teams whose end goal is to produce quality products.

The former way of assuring quality was to give the “quality” division total responsibility and total control. This quality division was responsible for stopping production if the product did not meet quality standards. This, in turn, created deceptive employees who did whatever they had to do to ensure that the product they had made would pass inspection. If a product was found to be defective once it passed the quality division, it was the quality manager who would be held accountable, notwithstanding the fact that manufacturing employees would often “sneak” products by them.

Today, however, it is the actual production unit that is responsible for shutting down a line. The production unit is responsible for building the quality products and dealing with the costs associated with assuring quality.

There have been ten “new changes” in quality emphasis versus the “old way” (p. 65):

1) Old-inspectors catch defects/New-operators prevent defects
2) Old-quality toll gates/New-manufacturing process control
3) Old-product quality vs. cost/New-high quality equals low cost
4) Old-productivity & quality are separated/New-productivity and quality are synonymous
5) Old-errors are reduced to “acceptable minimums”/New-errors reduced to zero
6) Old-automation reduces labor costs/New-automation improves quality
7) Old-quality is driven internally by manufacturing loss/New-quality improvements are driven externally by customer requirements
8) Old-quality is manufacturing focused/New-quality is focused on all aspects of business
9) Old-quality manager has authority to stop production/New-shop production manager has authority and responsibility to stop
10) Old-quality is a function/New-quality is a focus

Originally, product quality was controlled by inspectors who would be responsible for verifying that the product met specifications only at the end of its production. However, the new thought process is to have operators inspect products through each stage of production thereby being responsible for the quality of production in their division. The key to making this type of quality assurance successful is for managers to motivate and instruct operators as to what is required and how to achieve it.

Formerly, quality toll gates were used; however, this was only supposed to be a temporary measure, and it turned into a “security blanket” for many manufacturing units. The new thrust is on quality planning during the manufacturing process.

The new view that high product quality can mean low cost was proven by Jaguar in 1983. The company had major losses in the 1980’s, and then Jaguar realized that quality improvements were at the heart of its success. They decided that product quality does not have to be a trade-off for cost.

Productivity and quality were once viewed as separate; however, General Electric proved that by establishing planning and station controls the number of defects decreased. This led to the belief that taking time to make it right the first time pays off in the end.

Another change in quality emphasis is that defects not only can but should be reduced to zero. The mission of a manufacturing division is to strive for perfection. This change really came into effect after David Garvin conducted a quality study of Japanese air conditioners versus American air conditioners. The results indicated that the defect rate of Japanese air conditioners was 70% lower than American air conditioners. (p. 66)

Automation was once used to reduce labor costs, and there were many examples that proved this assertion to be true. However, the new belief is that automation will not only reduce labor costs but also improve quality. Such was the case with General Electric’s flexible machining system project, which showed a 240% improvement in employee productivity and a reduction in manufacturing loss. (p. 67)

The customer is the final judge of the quality of a product, so any product quality improvement program must have an external focus. This type of program must also reduce manufacturing loss while enhancing the product. If a quality improvement program does not achieve these two goals, then it is missing its mark.

Another new view of quality is that quality should no longer be focused just on manufacturing; rather, it should be focused on all business functions. The internal view of quality must be supplemented by an external view that addresses:

• the accuracy of the perception of customer expectations regarding product performance
• whether product designers are connected with customer expectations

Structured product/process analysis is the method used to address these issues. (p. 67)

Cross-functional work teams are being used more often to make sure that the proper customer connection is made.

The result of all of these new views on quality is summed up by the companies that emphasize that quality is a focus, not just a function. This new quality emphasis is being integrated throughout the entire organization. The quality focus must become part of company philosophy and a routine part of the institution in all its business decisions and actions.

In order for the quality organization to be successful, it needs to be externally driven and externally focused--quality becomes everyone’s job internally. The new quality-minded manager must focus his resources on several things:

• training and involvement
• operator station control
• data collection and analysis
• corrective action teams
• integration
• organizational implementation

Focusing on these areas has proven to help companies improve product quality and also reduce costs. The key to quality success is to integrate “quality” decisions and actions into the business organization. Employees throughout the new “quality-minded” organization need to be involved in quality assurance of both product and customer service.


Gallimore, Devin F. and Richard J. Penlesky, "A Framework for Developing Maintenance Strategies," Production and Inventory Management Journal (First Quarter, 1988), pp. 16-21., 11/9/95

David S. Ellis, Class of 1997

There are three defenses against breakdowns: 1) provide buffer zones in the process so other areas can continue work when one area breaks down; 2) maintain the facilities so breakdowns do not occur as frequently; and 3) design facilities so they are easy to maintain. With the emphasis on low inventories and just-in-time production, the last two defenses are becoming more important.
Maintenance strategies have five possible elements: reactive maintenance, preventative maintenance, inspection, backup equipment, and equipment upgrades. These five form a maintenance mix that varies from plant to plant, depending on the needs, goals and resources of each plant.
Reactive maintenance requires workers to move fast to minimize the costs of a breakdown. The emphasis here is on loss control. Manpower is the primary cost incurred in reactive maintenance. The actual costs depend on the desired level and time of response. Dispersal of manpower throughout the various facilities often cuts down on the response time, and thus the cost, of reaction.
Regularly scheduled preventative maintenance is designed to reduce the probability of a breakdown occurring. Many managers erroneously assume that preventative maintenance is a high-cost option. Actually, the labor and lost sales costs of down time may far outweigh the cost of preventing the breakdown in the first place. Two inherent characteristics of preventive maintenance help reduce the total cost of maintenance: 1) breakdowns due to worn-out parts occur less frequently and 2) companies often schedule the maintenance during nonusage periods to minimize interruption to work schedules.
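The cost trade-off in this paragraph can be sketched as a simple expected-cost comparison. The breakdown frequencies and dollar amounts below are my assumptions, not figures from the article.

```python
# Expected annual maintenance cost: breakdown losses plus any program cost.
# All inputs are hypothetical illustrations.

def expected_annual_cost(breakdowns_per_year, cost_per_breakdown, program_cost=0):
    return breakdowns_per_year * cost_per_breakdown + program_cost

# Reactive only: frequent failures, no preventive spending.
reactive = expected_annual_cost(24, 9000)
# Preventive program: fewer failures, but the program itself costs money.
preventive = expected_annual_cost(6, 9000, 60000)
print(reactive, preventive)  # 216000 114000
```

Under these assumed numbers the "high-cost" preventive option is the cheaper one overall, which is the point managers erroneously miss.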
Inspection can help a manager find out how well a preventative maintenance program is working. It may also determine whether maintenance has occurred on a piece of equipment.
Companies often use backup equipment when the cost of a breakdown is very high or time constraints do not allow them to carry out proper preventative maintenance. Backup requires the purchase of “extra” equipment. This cost may seem high until one compares it to the cost of a breakdown. Companies may also use backup equipment to supply extra capacity in certain situations. Standardization of backup equipment makes this use much easier. It also lowers backup equipment cost. If a company has only one or a few types of machines, it only needs one or a few different types of backup equipment.
Companies often use equipment upgrades as maintenance rather than purchase new equipment. Upgrades increase reliability, facilitate repair, and increase the throughput of an organization. Accounting for the cost of upgrades depends on the type of upgrade and the expected lifetime of the equipment.
A Sample Case
In 1984 a domestic automobile manufacturer was operating two shifts per day producing cars. The firm could seldom produce over 70% of its target output due to breakdowns. At the time the firm used only a reactive maintenance program. Downtime averaged 70 minutes a day and cost the firm $16,500 a day in lost labor productivity and $00,000 a day in lost production.
The company decided that it should increase its maintenance costs to reduce its breakdown costs. The company dispersed its maintenance crews to cut down on the time it takes to travel to the site of the breakdown. (Travel to the breakdown site accounted for about 15% of the downtime.) This required a greater investment in tools, spare parts, and communication equipment. It also meant the company would devote more manpower and floor space to the maintenance department. The company reduced travel time to 3% of downtime.
The company also upgraded its equipment. It bought overhead and slack detectors that cut diagnosis time on a breakdown from 25 minutes per day to 4 minutes per day. The company also added a backup conveyor line to guard against downtime should one of the conveyors fail; the extra conveyor also served to increase capacity during peak times. The company also began inspecting the conveyors and keeping track of the maintenance. The addition of backup equipment and the upgrades reduced downtime from 70 minutes per day to 60 minutes per week.
The company dramatically reduced downtime thanks to the faster response and the improved equipment. It was even able to staff a preventive maintenance team from one of the reactive maintenance teams that was no longer needed. This preventive maintenance team eliminated motor breakdowns almost completely and reduced average downtime to 5 minutes per week. The benefits of maintenance clearly outweighed the costs: the company saved several million dollars a year because of improved maintenance.
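The labor-cost arithmetic of the case can be roughly checked with the figures given above. A sketch in which the 250-workday year and the five-day week are assumptions, not from the article:

```python
# Rough check of the labor-cost arithmetic in the case above, using figures
# from the summary. The 250-workday year and 5-day week are assumptions.

lost_labor_per_day = 16_500   # dollars lost at 70 minutes of downtime per day
old_downtime = 70             # minutes of downtime per day, reactive-only policy
new_downtime = 5 / 5          # 5 minutes per week ~ 1 minute per workday

cost_per_minute = lost_labor_per_day / old_downtime
daily_saving = (old_downtime - new_downtime) * cost_per_minute
annual_saving = daily_saving * 250

# Roughly $4.1 million a year, consistent with the "several million dollars
# a year" the company is said to have saved through improved maintenance.
print(f"annual labor saving: ${annual_saving:,.0f}")
```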

Go back to list of articles.

Lesser, Frederic E., “Does Your Transfer Price Make Cents?” Management Accounting (December 1987), pp. 43-46., 11/12/95

Len White, Class of 1997

The main point of this article is that transfer prices should be established to accomplish the goals of the organization. Lesser describes a fictitious corporation in which a division has recently implemented a new line to produce widgets for sale to a sister division. The price of the widgets was set below standard cost, at variable cost plus 10%, which benefited the sister division and the corporation as a whole but did not provide enough profit to cover the producing division's fixed costs.

One employee came to the divisional president with a proposal to establish an assembly line to produce the widget. The proposed line would eliminate four operators, saving the company $100,000 annually, but would require a one-time charge of $125,000 for additional equipment. The $100,000 savings amounted to a $1.00 per unit savings on the widgets. On the surface the proposal sounded ingenious, but at the end of the year a new standard cost would be developed, resulting in a lower contribution margin per unit and thus lower profits for the division. Once again, the corporation as a whole would benefit greatly, but the division would be worse off. Because the division manager's bonus is based on division profits, the proposal was rejected. The corporation's goal (to maximize corporate profits) was opposed to the division's goal (to maximize division profits); the current policy gave employees a disincentive to reduce costs.
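The incentive conflict can be worked through with the summary's figures. A minimal sketch, in which the base variable cost of $10.00 per unit is an assumed number for illustration:

```python
# Working through the incentive conflict with the summary's figures.
# The base variable cost of $10.00 per unit is an assumption for illustration.

units = 100_000          # a $100,000 annual saving at $1.00/unit implies 100,000 units
base_vc = 10.00          # assumed variable cost per unit before the change
markup = 1.10            # transfer price = variable cost plus 10%

def division_margin(vc):
    """Annual contribution the producing division earns on internal sales."""
    return units * (vc * markup - vc)

before = division_margin(base_vc)
after = division_margin(base_vc - 1.00)   # new, lower standard cost after the saving

# The corporation keeps the full $100,000 annual saving (for a one-time
# $125,000 outlay), but the division's margin falls once the transfer price
# is reset: the price drops $1.10/unit while its cost drops only $1.00/unit.
print(f"division margin change: {after - before:+,.0f} per year")
```

The division loses margin every year the new standard cost is in force, even though the corporation gains, which is exactly why the proposal was rejected.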

Products that have an external market are properly priced at variable cost plus 10% for internal company sales; this provides profits that can cover development and other fixed costs. However, products that have no existing external market should not be priced at variable cost plus 10%, because doing so creates a disincentive to reduce costs.

To solve these problems, the corporation rejected the proposal to establish an assembly line. It would continue selling the widgets to the sister division at variable cost plus 10%, and it would raise the price charged to the sister division only if the standard cost increased, and only if the market would allow it.

Go back to list of articles.

Harris, Ford Whitman and Harry Franklin Porter, “Deciding Whether to Buy or to Make,” Journal of Manufacturing and Operations Management Review, (Spring, 1990), pp. 78-84. , 11/12/95

Jeff Fackler, Class of 97

Should a company make or buy the components used in its products? The question should not be answered hastily; there are many factors to consider. Price is obviously one of the major factors, yet even when the quoted price is lower than the internal manufacturing cost, a decision to make the component can still prevail. Once a make-or-buy decision is reached, market conditions should still be monitored. There are, however, situations in which the only alternative is to buy: mainly, when the company lacks both the facilities and the knowledge required to produce the part at an acceptable quality level.

The authors present several examples of the make-or-buy decision. Investment opportunities can play a major role. A company purchasing a component in high volumes may think that investing in equipment to make it would be an excellent alternative, since doing so would reduce both the component cost and the lead time. However, further analysis could indicate that investing the money elsewhere would generate a higher return, in which case buying the part remains the best alternative. Similarly, purchase volumes may be high enough that manufacturing the part initially looks attractive, yet analysis reveals that the market is very competitive, profit margins are low, and the part is difficult to manufacture. Because the market is so competitive, prices should stay low, and the component should be bought.

Generally it is unwise to buy from a competitor; a company must remember that it is merely adding to its competitor's profits. Buying a part from a non-competitor may also end up building business for a competitor. For example, a company may decide to start buying considerable amounts of a component outside. The increased business could allow the new supplier to reduce its prices, and the supplier could then turn around and sell the component to a direct competitor at the reduced price while the original buyer's price stays the same.

There are situations where it pays to make and use an unbranded component rather than be controlled by a patent monopoly. A company should also investigate why a supplier can produce a component at a reduced price. The authors give an example where a company decided to start purchasing a component at a reduced price and dismantled its own equipment after a year of purchases. Shortly afterward, the supplier abruptly raised the price. Further investigation revealed that the initial price had been calculated while the supplier was using a by-product from another part it made; when the accumulated by-product was depleted, the price went up.

A make-or-buy decision should not focus only on the component in question. There may be other components associated with it, and what looks advantageous may turn out to be disastrous; the net effect must be examined. These examples provide a good illustration of what must be considered before making a make-or-buy decision.

Go back to list of articles.

Zangwill, Willard I., “Ten Mistakes CEO’s Make About Quality,” Quality Progress (June, 1994), pp. 43-48., 11/12/95

Brad Smith, Owen Class of 1997

A professor of management at the University of Chicago interviewed executives from several firms noted for having excellent quality programs, many of them Baldrige Award winners. He discovered ten common mistakes many CEOs make that may keep their companies from developing excellent quality programs.

Failing to Lead: Many managers have a misguided notion of what leading actually is. They have a “Hollywood style” of leadership which is management by exhortation and inspiration. This may cause initial results but then most people go right back to their old ways. Good leadership is about planning, organizing, and training your people. Good leadership produces results.

Thinking that planning devolves from financial or marketing goals. In many organizations, planning starts with management setting goals for financial and market growth, and customer satisfaction is often ignored in the process. Planning should not work from the top down, but rather from the customer in.

Believing that being close to the customer and planning for customer satisfaction is sufficient. Many CEO’s mistakenly believe they know what their customers think. A systematic approach to customer satisfaction is needed to overcome this misconception. All aspects of a business should have goals and incentives tied to enhancing customer satisfaction. The most effective way to do that is to record data regarding all interactions between the firm and the customer. That data must be used in turn to improve the company’s systems.

Believing that quality means inspection. Getting rid of the root causes of defects is much more effective than reducing the number of defects that go undetected. Inspection only reveals a percentage of the defects.

Believing that quality improvement is too expensive. Doing the job right the first time actually cuts costs; tasks being redone and materials being scrapped increase them. In most cases, with quality, all costs tend to go down much sooner than expected, because quality improvements in one area can cut costs in other areas: the outputs of one process become the inputs of another.

Managing by intuition and not by fact. Intuition and judgment are not as sound as we believe them to be. The brain subconsciously distorts recalled predictions to be closer to the actual outcome, giving managers an inflated sense of their own judgment. Managing by fact helps to overcome this. Most Baldrige Award winners collect and use a great deal of information.

Using misguided incentives and developing a distorted culture. Incentives will have little impact if they are wrong for the culture. Managers will sometimes take actions that cut costs but adversely affect the company in the long term. Managers who deal best with crises are often promoted, but no one questions why the manager had a crisis in the first place. It is important to reward only those actions that are truly beneficial to the company.

Changing targets each year. Managers should not change goals every year but should rather consistently focus on fundamental factors that are vital to the firm’s success. Goals that are changed often only confuse management and employees.

Failing to follow the best practices. Companies should determine and follow the best practices; benchmarking is one way to learn from the best. The most difficult aspect of benchmarking is getting people to do it, because it means acknowledging that they are not the best and must improve.

Believing Baldrige Award examiners are stupid. Many firms make the mistake of submitting what are actually public relations pieces to Baldrige examiners. What firms need are closely monitored and documented quality systems and processes, not PR packages. Examiners are smart enough to recognize the difference.

Go back to list of articles.

"What Should We Account For?", Management Accounting, January 1988, pp. 42-48, by Alfred J. Nanni, Jeffrey G. Miller, and Thomas E. Vollman, 11/12/95

R. McPhail Hunt, Class of 1997

In "What Should We Account For?" the authors present a means of integrating cost accounting with management goals and strategies. This theme of integration rests on the simple fact that traditional cost accounting methods are outdated. The authors begin their argument with three case studies of management making flawed decisions because those decisions were based on standard cost accounting and control systems. They then illustrate these shortcomings along with a remedy for how cost accounting should perform. Their remedy is called CAGS, or Cost Accounting by Goals and Strategies, a method in which a matrix is used to integrate management goals and strategies with operating units.

I found the article's alternative approach to cost accounting refreshing. The authors do a tremendous job of setting the stage with the three case studies, which serve as the basis for the remainder of the article. They then describe in detail the flaws in current cost accounting systems, pointing out that management must take a "macro" view of operations rather than a "micro" one. This is particularly important because costs now span several parts or units of an operation rather than attaching to one individual unit. The authors add that managers simply examine costs without asking how they are collected, and that managers treat overhead incorrectly in their organizations. In short, a narrow mindset is prevalent among managers interpreting costs.

Naturally, the authors then offer a solution by introducing CAGS. They emphasize that the approach is a custom one that evolves and grows with the organization. This is the strong point of the article, since the authors understand that cost accounting should change with the ever-changing goals and strategies of the organization. The authors then concisely present the CAGS system at work, highlighting its details, intricacies, and benefits through easy-to-understand charts and text. Finally, they point out the flaws of their approach, such as its being time-consuming and difficult to implement; such objectivity and self-criticism adds much value to the article.

In conclusion, I greatly enjoyed this article. It was very well-written and organized with more than sufficient examples. Any student who has had introductory accounting could grasp the authors' message and apply their methods in the real world. The article was essentially simple and easy to understand with the CAGS being a realistic, practical method that should be used in any organization. I especially liked the fact that cost accounting should no longer be a static discipline but rather a dynamic one that changes with the corporation and the real world.

Go back to list of articles.

Alonso, Ramon L., Frasier, Cline W., JIT Hits Home: A Case Study in Reducing Management Delays, Sloan Management Review, Summer 1991, pg. 59-67., 11/13/95

Carlos E. Arguello, Class of 1997
The article suggests that it is important for management to plan into the future while considering a product's life cycle. A company's profitability is usually hurt by high inventories during the decline phase of the product's life cycle, when sales are low. A company that holds high inventory of a product in decline will see its cash position shrink, primarily because of high inventory costs. The need for a company to plan its production is evident.
To plan effectively, management must be able to forecast accurately. An accurate forecast reaches far enough into the future to account for the time spent on planning, the time it takes to receive raw materials, and the time it takes to manufacture the product. The forecast considers products currently in the pipeline: products being built under the previous plan limit the amount to be built under the new plan. An effective plan also establishes inventory goals (e.g., zero inventory) and finished goods inventory. Plans must be made for each specific product, not for the company's product line as a whole, because different products have different demands. Lastly, the company must plan frequently (on a monthly basis).
Just-In-Time (JIT) manufacturing resolves the inventory problem and allows management to react faster to the changing conditions of a product's life cycle, enabling the company to forecast more accurately. Because JIT manufacturing brings improvements in quality, reductions in waste, and lower work-in-process inventory carrying costs, it shortens the company's planning horizon.
The authors applied the concept of Just-In-Time manufacturing in a consulting project, performing simulations under several demand schedules. They concluded that the more uncertain the demand for the product, the more significant the benefits of JIT. Their results also showed that JIT added to their client's bottom line and enhanced forecast quality.

Here is another review of this article.

Alonso, Ramon L and Cline W. Frasier "JIT Hits Home: A Case Study in Reducing Management Delays" Sloan Management Review (Summer 1991), pp 59-67.

Laurent Chardonnet
Carrying inventory will lower the company's profit. JIT manufacturing is advocated to improve quality, reduce waste, lower WIP inventory and shorten planning period.

A shorter planning cycle quickens reaction to changes caused by the product life cycle and unpredictable demand. JIT is therefore not restricted to the manufacturing process; it can also apply to upstream processes such as planning. Shortening the planning horizon and lowering inventory goals are part of JIT management.

In the case studied by the authors, net cash follows the product life cycle with a delay. When sales start to decrease, the net cash flow flattens. If at this precise moment inventory remains from unsold goods, it will have a negative effect on the company's cash position. A shorter planning cycle will recognize the sales decay sooner, so its effect on net cash will be less severe.

Similarly, planning is an ongoing sequence of lead times, each depending on the preceding one (raw material supplies come before manufacturing, which comes before inventory build-up). By planning for a zero inventory level, JIT removes the uncertainty created by the lead time of each sequence. Management is another source of delay: it takes time to gather sales data, build forecasts, develop an SOP, and issue buy and build plans.

The authors developed a simulation model that takes all of these elements into account. At the end of the product life cycle, the simulation carries the raw materials, WIP, and finished goods inventories and evaluates their cost.

Based on six different demand curves, they observed that JIT manufacturing benefits net cash, but especially that JIT management improves net cash to an even greater extent than JIT manufacturing, dramatically increasing profit at the end of the product's life.

Based on these results, the company shortened its planning cycle from quarterly to monthly, with a six-month horizon. More frequent forecasting also helps the company adjust for past errors faster, allowing better inventory management at the end of the product's life.
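The planning-lag effect described here can be illustrated with a toy simulation. This is not the authors' model; all numbers below are made up, and unmet demand is simply treated as lost:

```python
# Toy simulation of the planning-lag effect: each month's build is set from
# sales data that is `lag` months old, so a slow planner keeps building after
# demand collapses. Not the authors' model; all numbers are illustrative.

def leftover_inventory(demand, lag):
    """Units left unsold at the end of the product's life."""
    inventory = 0
    for month, d in enumerate(demand):
        build = demand[month - lag] if month >= lag else d
        inventory = max(inventory + build - d, 0)  # unmet demand is lost, not backlogged
    return inventory

# Demand ramps up, then drops off abruptly at end of life.
demand = [100, 200, 300, 300, 300, 50, 0, 0]

print(leftover_inventory(demand, lag=1))  # monthly planning   -> 300 units left
print(leftover_inventory(demand, lag=3))  # quarterly planning -> 850 units left
```

The quarterly planner ends the product's life with nearly three times the unsold inventory, which mirrors the review's point that the more abrupt the end of life, the greater the benefit of planning more frequently.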

Frequency of planning, age of information used in forecasting and inventory goals act as amplifiers. The more abrupt the product’s end of life and the more unpredictable the demand, the greater the benefit derived from JIT management.

Go back to list of articles.

Thomas N. Tyson; "Quality and Profitability: Have Controllers Made the Connection?"; Management Accounting; November 1987; pp. 38 - 42., 11/13/95

Karissa Cliff Thomas, Class of 1997

“Quality and Profitability: Have Controllers Made the Connection?” appears in the November 1987 issue of Management Accounting. Thomas N. Tyson writes just prior to the widespread popularity of quantifying quality cost measures.

First the author identifies three categories of “costs underlying quality-related activities” (p. 38). He segregates them as:
1) costs of preventing product defects
2) costs of ensuring that products conform to specifications
3) costs of failure, whether they are discovered internally prior to shipment or externally by the customer.

He recognizes that corporations often measure their progress toward quality management as the reduction in overall quality costs. After this brief overview he launches into explanations of actual research he conducted.

Tyson randomly chose 125 of the Fortune 500 and contacted their controllers. Of these, 94 responded, and 31% of the respondents specifically measured quality costs on a regular basis. He found evidence that quality cost measurement was most consistent among manufacturing industries forced to confront strong foreign competition (p. 39).

The author polled the respondents on five factors that might link controllership to measurement. A regression analysis showed that a combination of available resources, participation in team projects, and communication with quality-function personnel was strongly related to quality measurement being in place.

Tyson also noted that frequency of reporting was directly related to the recipient’s closeness to the source of cost incurrence (p. 41). He found that the quality cost measures were used frequently for performance evaluations and simply to identify quality costs. He reported that among companies who strongly recommended the use of quality cost measures, a clear correlation between quality and profitability had been made.

Go back to list of articles.

Sakurai, Michiharu “Target Costing and How to Use It,” Journal of Cost Management (Summer, 1989), pp. 39-50., 11/13/95

Michael W. Johnson, Class of 1997

Target costing, also called cost planning or cost projection, is a cost management tool used to reduce the overall cost of a product over its life cycle. In Japan, target costing is used on a large scale in automobile manufacturing, electronics, machine tooling, and precision machine manufacturing. Target costing can reduce costs in the production, planning, and design stages of product development.
Target costing has four major characteristics: it is used in the planning and design stages of product development; it is cost planning, not cost control; it is used mainly in, but not limited to, assembly-oriented industries; and it is used to control design specifications and production techniques.
Companies use target costing in different ways, and therefore implement it at different stages of product development. Most companies, however, implement target costing during the product design stage and carry it through engineering and finally into production. Estimates for target costs are developed by looking at similar products and the costs they incurred during each phase of production.
Three methods are generally used in setting target costs: a "top down" approach that does not involve lower-level management; a "bottom up" approach in which engineers have the deciding voice in setting the cost; and a combination of the two, which is widely believed to be the best practice.
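The top-down approach typically derives the target cost from the market rather than from current costs. A minimal sketch of that standard target-costing arithmetic, with hypothetical figures:

```python
# Standard target-costing arithmetic: the allowable cost is derived from the
# market price and required profit, not from current costs. Figures hypothetical.

target_price = 250.00      # price the market will bear per unit
required_margin = 0.20     # profit the company must earn on each sale

target_cost = target_price * (1 - required_margin)   # allowable cost per unit
estimated_cost = 230.00    # cost of the product as currently designed

# The gap is the cost-reduction task that design and engineering must close.
cost_reduction_task = estimated_cost - target_cost
print(f"target cost {target_cost:.2f}, reduction task {cost_reduction_task:.2f}")
```

Closing that gap through design and engineering effort, rather than relaxing the target, is what the article means by the target cost being attainable only through considerable effort.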
The target cost is made up of production costs, R&D costs, distribution costs, user costs, and miscellaneous other costs. A target costing system must be installed in conjunction with cost engineering tools such as JIT, value engineering (VE), or TQC. The agreed-upon target cost should be attainable only through considerable effort on the part of all parties concerned.
In addition to the industries mentioned earlier, target costing is being used increasingly in formerly labor-intensive Japanese industries that have recently become highly automated, and it tends to be least effective in process industries.

Go back to list of articles.

Cassel, Herbert C. And Vincent F. McCormack "The Transfer Pricing Dilemma--And a Dual Pricing Solution" Journal of Accountancy, September 1987, pp. 166-174, 11/13/95

Michael S. Tudor, Class of 1996

Managers are faced with the transfer pricing problem when their departments are established as decentralized profit centers. The issue becomes complex because managers are evaluated based on their profitability. The authors attempt to introduce a way of benefiting from decentralization and yet eliminating the pitfalls of transfer pricing. They suggest using a dual pricing method.

The transfer pricing policies used by decentralized organizations are market price, negotiated price, and pricing at out-of-pocket cost. Because each department's goal is usually to make a profit, conflicts arise when exchanges happen within the organization.

The authors use an auto dealership as their example. Dealerships usually have areas for new car sales, used car sales, parts and accessories, mechanical repairs, and body and paint. The prices charged on internal transactions between these areas are transfer prices, and costs differ as work moves from area to area. The reported profitability of outside sales depends on the internal transfer prices.

Problems with the three traditional transfer pricing policies:
Market price. If a department can get a better price from an outside supplier than from an internal source, overall profitability is harmed. Market pricing also obscures opportunity costs, which hinders decision making. And because the associated higher costs are now carried in inventory, the financial statements are affected.
Negotiated price. Negotiated prices have the same effects as market prices, because the negotiated price is usually higher than cost. Conflicts can also arise, and the process is time consuming.
Out-of-pocket cost. This policy eliminates erroneous decision making, and capacity is flexible because it is a function of manpower. Its main disadvantage is that departments cannot be run as profit centers, so there is no way to motivate managers as with the other two policies.

Use Dual Pricing:
Under dual pricing, the price charged to the buying department is the out-of-pocket cost incurred by the selling department, while the selling department is credited at the market price. The difference between the two figures is charged to "internal sales in excess of assigned cost." This policy ensures that the dealership's revenue reflects only the prices charged to outside customers, and all managers are motivated to cut costs and maximize their profits.
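The bookkeeping can be sketched for a single internal job. The parts/repair example and all figures below are hypothetical:

```python
# Dual pricing on a single internal job, e.g. the repair shop buying a part
# from the parts department. All figures are hypothetical.

out_of_pocket = 400.0    # selling department's out-of-pocket cost for the part
market_price = 700.0     # price an outside customer would pay

buyer_charged = out_of_pocket     # buying department pays only the assigned cost
seller_credited = market_price    # selling department books market-price revenue

# The difference is charged to "internal sales in excess of assigned cost" so
# consolidated revenue reflects only sales to outside customers.
elimination = seller_credited - buyer_charged
print(f"charged to internal sales in excess of assigned cost: {elimination:.2f}")
```

The buyer sees a low cost, the seller sees full market revenue, and the elimination account keeps the dealership's consolidated figures honest, which is how dual pricing motivates both managers at once.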

The cost of dual pricing is the extra compensation paid to managers. Although it is an additional expense, it is very controllable and, most importantly, measurable. Costs can be tracked more easily, and the profit of the company as a whole will increase.

Go back to list of articles.

Venkatesan, Ravi "Strategic Sourcing: To Make or Not To Make," Harvard Business Review (November-December, 1992), pp. 98-107., 11/13/95

Elliott F. Leschen, Class of 1997

Ravi Venkatesan introduces a new structure for the outsourcing decision. He contends that most manufacturers approach this decision from the wrong position. Instead of addressing the strategic significance of each part, most managers look at the "volumes and hassles" of individual parts. Additionally, traditional cost accounting systems have actually hindered efforts by hiding the real opportunities for cost savings. Without a comprehensive approach to the commodities as a whole, managers will continue to make poorly informed choices regarding insourcing versus outsourcing.

Venkatesan suggests that the key to proper sourcing decisions is first to determine whether parts are strategic or non-strategic. The firm must establish which parts are key to "product differentiation" and the company's "competitive position" over time. To make this determination, the product must be broken into its subsystems, which are then evaluated for strategic importance. Several considerations apply: is this subsystem significant to our customers, and what specialized assets and technologies does it require? Once these questions are answered, the company must decide whether it is cheaper to "catch up" with the best supplier or whether a capable supplier already exists.

Once the strategic/non-strategic decision has been made, the systems should be grouped into "families" of similar manufacturing facilities, technologies, and equipment. A component is then judged strategic or non-strategic based on the economies of its market. "Strategic" components will be manufactured in-house if the technology and equipment required to become a leader are affordable and available; otherwise they will be outsourced with a close link to the supplier. "Commodity" components will be outsourced because of the economies achieved by suppliers in the industry.

In making these decisions, several key factors are always considered. Is in-house manufacturing competitive in quality and cost? Is in-house manufacturing cost-effective based on the investment necessary to bring the equipment and technology to par or better?

In the case of an outsourced system, Venkatesan recognizes that employees with a thorough knowledge of the outsourced systems must be retained and involved in the design and implementation. Typically, if a significant system is outsourced, a company should work closely with the supplier in its engineering. By doing so, the manufacturer can guarantee quality and keep knowledgeable personnel working with the system. Otherwise, the company will become too dependent on the supplier and lack the "architectural knowledge" required to maintain quality.

This framework improves the "make or buy" decision process. Simple "commodity" parts are outsourced to suppliers whose economies are greater. If a part is deemed "strategic," then in-house efficiency is evaluated: if it is possible and cost-effective to be a leader in producing that system, it remains in-house; otherwise it is outsourced. This decision gives lower-level employees an opportunity to improve their division's efficiency and avoid being outsourced. Venkatesan's comprehensive approach requires a company to identify its most significant systems and to achieve internal or external efficiencies based on that determination.

Go back to list of articles.

Boer,Germain, "Making Accounting a Value Added Activity," Management Accounting (August 1991), pp.36-41, 11/13/95

Edwin H. Caldwell, Class of 1997

To survive in business, a company must eliminate activities that do not add value. Accounting activities must also be examined to see if they add value to the company. Without a value added accounting system, a company may perform well but will not be able to see where it is going.

One area where accounting can be non-value adding is when budget compliance becomes more important than profit. Managers will cut prices too deeply or overload distributors (trade loading) just to make budget. This adds no value to the company and especially disrupts the production process and product quality. This type of sales upswing at the end of the period is referred to as the banana sales curve. This non-value adding activity must be removed.

Another area that companies get very hung up on is labor cost reporting. A company should first determine if labor is a significant cost to the company. If labor is only 10% of product cost while material is 70%, a value added accounting system will focus management on significant costs and not allow them to waste time trying to reduce the insignificant costs.

There are several other things that companies do that are detrimental to the overall operation. One of these is the implementation of complex accounting-based incentive plans. If these plans are implemented without organizational changes to support them, they will actually increase costs. Another thing companies do that devalues the company is to concentrate on purchase price variances. This focuses the purchasing department on finding the cheapest parts and supplies without consideration for the lifecycle costs of the new purchase. Although a part may cost less initially, it may actually add more costs to the overall process if new tools and procedures must be developed.

If these problems exist in a company, the accountants need to implement a value added accounting system. To quote the author, Professor Böer, “A value added accounting system is simple, encourages teamwork, and continuous improvement, focuses on important strategic issues, measures people development, and compares company costs to outside companies.” First of all, the accounting system must be simple. Simple systems reveal information rather than covering it up, and they also cost little to run.

In addition, we want the accounting system to encourage cooperation between functional areas. Accountants could develop performance measures that evaluate the performance of all the managers in a related area. For example, marketing and manufacturing could be linked so that the performance of each area depends on how the other one is doing. Also, the system should encourage continuous improvement. The idea that you are good enough is not acceptable. You are NEVER good enough! Large cost reductions come from many small reductions. Don’t go for the home run; just try to get runners on base. Sooner or later you will score, i.e., achieve significant reductions.

A manager must also consider relevant issues. Rather than looking at the purchase price variance, a manager should look at the lifecycle costs of the part. This takes into account the product quality and the supplier’s delivery performance among other things. This method does not give 100% accurate costs, but it does give a nice frame of reference when selecting vendors.

One of the most important things to remember as a manager is that skilled workers can do more to reduce your costs than almost anything else. Properly trained employees will take matters into their own hands by solving problems and elevating their work standards. The accounting system must report the value that the company receives from this training. This information can be prominently displayed in the work area with each employee’s name and corresponding skills. This shows what the company gets (the value added) in return for its training.

Finally, “ideas, not technology are the keys to continuous success”. By studying your best competitor you can learn if your operations are as good as they should be. What good does it do to reduce costs by 10% a year when the competition is reducing its costs by 20%? You must know your competitors’ costs or you will be a victim of your own ignorance. “If you want to compete with the best, study the best.”

In conclusion, “Focus on the simple elements that make the accounting information useful to managers, prune any accounting operations that do not make a positive profit contribution, and expand the accounting horizons to include unconventional data. Do these things, and you will have a value-added accounting system.”


Johnson, H. Thomas, "Activity-Based Information: A Blueprint for World-Class Management Accounting," Management Accounting (June, 1988), pp. 23-29, 11/13/95

Michael J. Ferriera, Class of 1997

In “Activity-Based Information: A Blueprint for World-Class Management Accounting,” Johnson expresses the need for a new accounting system, one not based on cost information, for making decisions about company profitability. Profitability does not result simply from controlling costs; it is also determined by quality and flexibility. Therefore many companies cannot rely upon cost information to make management decisions regarding profitability and competitive value. As Johnson says, a company “must look beyond transaction-based cost information to know if decisions will deliver profit. It must develop new information to achieve this objective.”

Enter “activity-based information.” The new management accounting system must be based on activities and their value rather than on costs and how to reduce them. Employees perform activities, which in turn cause costs. Management needs to manage the activities which cause cost, and therefore it needs information based on these activities -- not information based on the costs incurred. Activity-based information is concerned with the factors that drive the costs and profits.

There are two types of activity-based information which should form the foundation of this new management accounting system. One is non-financial information. This information expresses how each activity delivers value to the customer. The second type is strategy cost information which indicates cost effectiveness of the activities compared to competitors, and whether “the mix of products management has chosen to sell uses activities in the most profitable way.”

To be profitable and competitive, companies must reduce the activities which do not add customer value. To accomplish this they need to “eliminate causes of delay, excess, and unevenness in all activities.” Information based on cost may indicate that problems exist or that improvements could be achieved, but it is unable to identify exactly where the problems lie and what improvements should be made. The new management accounting system, with the activity-based information it provides, allows management to react to the activities and see which activities add, or do not add, value. It does this by requiring management to analyze each activity to find its contribution to customer value and to find any causes of wasted efforts, or non-value adding activities.

By approaching the management decisions with information based on activities rather than information based on costs, companies will be able to perform more pro-actively. By consistently analyzing the activities for value adding and non-value adding characteristics, companies will be able to see future problems before they become big problems that are difficult to fix. It would be possible to identify problems before they make a big impact on cost -- problems which would have been caught much later if the company was using a management accounting system based on cost information rather than activity information.


Burt, David N., "Managing Suppliers Up to Speed," Harvard Business Review (July-August, 1989), pp. 127-135, 11/13/95

Kimberly Collora, Class of 1997

The article, “Managing Suppliers Up to Speed” by David N. Burt examines the elements inherent in effective partnerships between suppliers and manufacturers. Such a partnership should foster interdependence and respect. Such mutually beneficial partnerships have become increasingly important because today’s ever changing marketplace requires manufacturers to be flexible. There are several elements involved in motivating the creation of a strategic, non-competitive partnership between managers and suppliers.
The first point to keep in mind is that the cheapest component may turn out to be the most costly. By the time one considers “the cost of poor quality factors such as downtime on the line, rework, scrap, warranty work, and legal fees, the cheapest may well be the most costly.” The cost that matters most is the one that takes all of these factors into account, not only the initial unit cost.

Managers in the past have tended to award two or more contracts for critical materials, when in fact “manufacturers are better off with single source suppliers.” This is because these suppliers may feel as if they are family, allow themselves to be subject to more examination processes, and feel more committed to the quality of the product.

There are five critical areas involved in creating a successful relationship between a manufacturer and a supplier. The first area is the way in which the supplier is selected. It has been demonstrated that the best way to select a critical supplier is through a team effort. This team should be composed of members from different departments, and should be able to keep the views of the company, and of the possible future supplier, in mind when making decisions.

The manufacturer should also allow the supplier to be active in the product design area from the beginning. This interaction allows a greater sharing of ideas and allows for “lower unit prices and less likelihood of future quality problems.”

One way to motivate suppliers is to have them develop quality plans while preparing proposals on component designs and to designate what they find necessary for testing equipment and procedures. Another way to motivate suppliers to meet desired quality levels is for manufacturers to implement certification programs and provide timely feedback.

These new manufacturer-supplier partnerships “intensify the dependence of suppliers on major manufacturing corporations, and the latter would be well-advised not to press this advantage too hard.” This advantage is the one manufacturers gain when a supplier enters into a long-term relationship with them as its only customer.

Manufacturers can also survey their suppliers to find out about their needs. The information from these surveys can be used to improve the relationship between manufacturers and suppliers. It can also be circulated by manufacturers within the industry to let others know of technology that they would like to see developed.

A partnership is based on trust, interdependence, and respect. While the supplier needs a stable customer for its products and/or services, a manufacturer needs suppliers to share their “schedules, brainpower and financial information.” Such a partnership may be difficult to achieve, but an amicable partnership will be more beneficial in the long run.

Here is another review of this same article

Burt, David N., "Managing Product Quality through Strategic Purchasing," Sloan Management Review (Spring 1989), pp. 39-48.

Debbie Glenn, Class of 1997

Managers must now design and purchase quality into the product if they want to make quality products at reasonable prices, contends the author. In order to do this effectively, many departments must give the Purchasing Staff input before the Purchasing Department can “assume responsibility for ensuring that quality, time, service and cost all receive proper attention.”

In order for the Purchasing Staff to be effective, the requirements for products must be explicit. Establishing the correct input requirements requires a cross-functional effort. Marketing must be involved to identify the actual products to be manufactured and develop demand forecasts to facilitate the ordering of the correct quantities of raw materials and components. By cooperating with Purchasing during the development and design of new products, engineering excellence can be improved. The Purchasing Department can help Engineering locate low cost, high quality suppliers of components and work with the suppliers to get technical assistance/cooperation from them. Communication with the Quality Assurance Department ensures that Purchasing knows the specifications and can procure components and raw materials at reasonable costs from suppliers that can comply with the purchasing firm’s quality standards. Purchasing and Product Planning/Inventory Control must work closely to ensure that long-term production plans can be fulfilled without increasing holding and ordering costs. And, finally, Purchasing must work with Operations to assist with make/buy decisions. By working with representatives from all involved departments, Purchasing can determine exact quality, price and quantity needs and function more effectively. The author stresses this point by quoting Philip Crosby: "...half the quality problems in purchased materials result from unclearly stated requirements."

Once these requirements have been determined, Purchasing is responsible for source selection and price negotiations. To begin, Purchasing may prequalify suppliers by evaluating their technical and physical capabilities and their managerial and financial soundness. Prequalifying ensures that the purchasing company will receive products that meet their specifications and demands, as well as builds a relationship that fosters cooperation between supplier and purchaser. Burt notes that “good suppliers are valuable resources” and contends that the collaborative relationships lead to higher quality and lower costs for both parties. Once suppliers have been identified and relationships begin to form, Purchasing is able to better negotiate with suppliers for the best possible price.

Purchasing is also responsible for maintaining supplier relationships and managing supply contracts. Through the efforts of the Purchasing Department, key players from both the buying and selling firms are able to meet and possibly exchange plant visits. This helps both firms understand each others’ needs and the foundations of supply contracts. It allows both firms to offer technical assistance to each other and to quickly and efficiently solve quality issues. Additionally, Purchasing serves to motivate suppliers--extraordinary effort can be positively reinforced, while negative reinforcement can deter undesirable behavior--monitor quality by examining materials and process control data, request value analysis to improve design characteristics, and assist suppliers should they run into problems that temporarily compromise the integrity of their products.

The Purchasing function becomes more important as firms move towards JIT methods. Cultivating and maintaining the close supplier relationships necessary to make JIT work requires a strong, well-managed, disciplined Purchasing Department. And, with the added emphasis on quality, the Purchasing function becomes an even more important link in the production chain.


Adler, Paul S., "Time-and-Motion Regained," Harvard Business Review (January-February, 1993), pp. 97-108, 11/13/95

Rochelle S. Andrews, Class of 1997

In 1984, General Motors and Toyota set up a joint venture called New United Motor Manufacturing Inc. (NUMMI). The concept of NUMMI is to take the time-and-motion regimentation of Frederick Taylor’s work and add a new twist to it. This creates not only excellent quality in the products but also increases worker motivation and satisfaction. NUMMI uses just-in-time production methods that make quality assurance the responsibility of each work station. Every job performed within the factory is constantly screened to ensure top quality. The application of kaizen, or continuous improvement, allows for input from all parts of the factory as products are produced to keep quality high. NUMMI’s approach to producing cars has two distinct features: a commitment to the social context of work, and a focus on standardization.

NUMMI builds an atmosphere of trust and common purpose. There is a chance for everyone to offer input into how the factory is run. The basic structure within NUMMI is the production team; there are about 350 teams in the factory. It is believed that teams encourage participative decision making. Each team has a leader, and four teams make a group. Each group leader then serves with the other group leaders as the first layer of management in the factory. Toyota leadership stresses that the company is not the property of management but of all workers together. It illustrates this point best through its no-layoff policy: NUMMI has made an agreement with the union that it will do whatever it takes not to lay off workers. All workers get the same pay except for team leaders, who get sixty cents more, and there are no seniority-, performance-, or merit-based bonuses. This eliminates any sense of competition among the workers.

The factory itself is designed so that the workers are constantly learning and upgrading their skills; they are not treated like trained animals. The system also encourages worker suggestions, and in 1991 more than eighty percent of the suggestions were put into action. NUMMI has broken many of the stereotypes suggesting that Taylorism works only because people are forced into it, not because they are intelligent enough to accept or like it. The NUMMI system taps into a worker’s (1) desire for excellence, (2) mature sense of realism, and (3) positive response to respect and trust. This is illustrated by the fact that the GM plant where NUMMI is located, previously the worst performer, achieved the highest productivity of all GM plants. Absenteeism also dropped from approximately twenty-five percent to between three and four percent. As one UAW official put it, “The key to NUMMI’s success is that management gave up some of its power, some of its prerogatives.... If managers want workers to trust them, we need to be 50-50 in making the decision. Don’t just make the decision and say, ‘Trust me.’” According to the numbers at NUMMI, this is exactly what they did. And it worked.


Flanagan, Theresa A., and Joan O. Fredericks, "Improving Company Performance through Customer Satisfaction Measurement and Management," National Productivity Review (Spring, 1993), pp. 239-258, 11/13/95

Tommy Marshall, Class of 1997

Corporations in the service industries covet one thing more than anything else: intimate knowledge of what makes their customers happy and why. In "Improving Company Performance through Customer Satisfaction Measurement and Management," Flanagan and Fredericks give service corporations a method for gaining that knowledge.

Unless you just returned from a long vacation in Siberia you know that the Total Quality Management revolution is happening in America. An essential element of TQM involves knowing what your customers want and how to give it to them. A commitment to TQM coupled with successful implementation can give a company greater market penetration, sales performance, customer retention, and future market prospects. Flanagan and Fredericks remind the TQM crazed manager that these outcomes are connected to four fundamental issues.

1. Who are your customers and what do they really need and want?
2. Which product/service attributes are expected and which are perceived as value-added?
3. In comparison with the competition, how well is your organization catering to customer requirements?
4. Where should you focus your improvement efforts?

Getting an answer to these questions comes with a carefully crafted customer satisfaction measurement. This measurement is something very different from complaint measurements that simply let angry customers vent to management in a reactionary style. The customer satisfaction measurement is a proactive outreach to the total market of customers. This measurement provides genuine interaction with the customer to discover what she thinks of the company today and what she needs from the company tomorrow.

Flanagan and Fredericks help define a method of customer discovery by outlining six phases that form an effective customer satisfaction and management process.

• Objective setting
• Discovery
• Critical needs assessment
• Action planning
• Product, service, and organizational improvement
• Ongoing measurement and monitoring

If a service corporation successfully implements these six phases, it will have taken large strides in making Total Quality Management techniques work to improve the company. The first phase is objective setting: the corporation asks itself how it will use the information gathered from its customers, both strategically and tactically. The objective of this process must be something that the company can attain and act on.

Discovery is the process of acquiring information from customers. This process must be well planned and executed. Many companies forget that they speak a different language from their customers. Company jargon and culture can keep the customer from understanding the company’s request. Useful information about customers also resides in the company itself. Employees who deal with customers understand customer habits and desires better than anyone else in the company. These tactics, coupled with surveys and interviews, should help the company make useful discoveries.

The critical needs assessment will help the company decide what to do with all the information gathered. The company must keep in mind what it really wants to know. Look at all the information and try to unveil those factors that affect customer satisfaction. Once the key factors are recognized, the company can get to the action phase. Create an action plan that will give the customer exactly what she has told you she wants today and tomorrow.

The exceptional companies will pay close attention to the final two phases of product, service, and organizational improvement coupled with ongoing measurement. The company must commit itself to working as a team across all levels if the action plan is expected to take effect. Flanagan and Fredericks call this the climate of shared commitment. The climate committed to action and continuous monitoring of quality will move the company forward to meet the changing needs and expectations of the customer. This effort will satisfy the customer and the resulting positive impact on the bottom line will satisfy the company.


Fuller, Joseph B., James O'Connor, and Richard Rawlinson, "Tailored Logistics: The Next Advantage," Harvard Business Review (May-June, 1993), pp. 87-98, 11/13/95

GEORGE DOUPSAS, Class of 1997

In their article titled Tailored Logistics: The Next Advantage, Joseph Fuller, James O'Connor, and Richard Rawlinson of Monitor Company, a Boston-based strategic consulting firm, attempt to address the question that efficient logistics management raises. They contend that logistics, if tailored correctly to fit the needs of the company and its customers, could become the next "governing element of strategy as an inventive way of creating value for customers, an immediate source of savings, an important discipline on marketing, and a critical extension of production flexibility."

Companies need to realize that different customers have different needs. Once companies understand this basic principle, they will be able to tailor their logistics systems to serve their customers' needs better. As result, the companies will become more profitable. Central to the argument for better management of logistics is the creation of value, or simply profit. The authors estimate, for instance, that Coca-Cola bottlers could save an average of $80 million to $90 million a year for the next ten years in costs by tailoring their service delivery channels to the needs of distinct groups of customers. Logistics has clearly become a critical issue in product strategy today.

The most difficult challenge, however, in strategically managing logistics is the development of "target segments of customers that can be served profitably by distinct, rationalized pipelines." A company does not simply create value for its customers and sustainable advantage for itself by just offering a variety of products, but also by attaching valuable services to those products. Thus, real opportunities exist for added value and greater profit as logistics are generally undermanaged.

The authors further contend that the best logistics strategy is to build distinct approaches to different groups of customers. They note that "the goal of logistics strategy, then, is to organize companies to compete across the span of their markets without having to overcharge some customers or underserve others." This goal can be achieved by applying some of the same management principles which have made operations more efficient: interfunctional planning, a just-in-time approach to inventory management, separation of work flow, electronic tracking, data interchange, and so forth. In addition, general management leadership must be exerted if the efforts for logistics improvement are to be successful.

Companies could use "logistically distinct business methods" (or "LDB methods") to achieve the desired goals. The most original LDB method is the posing of several basic questions (usually eight) about any product that, put together, make up a logistics decision menu. The authors also note that LDB methods are useful not only for their analytical components, but also for their cross-functional approach.

Tailored logistics can become the next powerful competitive advantage to those companies that realize the value they can add to their organizations just by simply using different logistics processes and pipelines to satisfy different customers' needs. But most importantly, at the same time companies are adding value to their customers, they are also benefiting their own bottom lines as they streamline operations and adopt a long-term, integrative, strategic vision to manage their operations.


Marple, Raymond. "Management Accounting is Coming of Age." Management Accounting (July 1967), pp. 3-16, 11/14/95

Jennifer Hall, Class of 1996

Management accounting has evolved over time, influenced by the continually changing environment in which it has developed. This type of accounting is focused on internal use by management rather than external reporting. Management accounting now appears to have matured, and its current principles provide the basis for the management accounting of the future. All types of accounting share a common objective: to measure entity capital and its changes through use over time. The principal end products of management accounting are the forecast balance sheet and the forecast profit plan. However, management must also consider historical reporting and must use the same concepts for planning and reporting to provide comparable results.

First, management needs segment information. The contribution of each segment to the overall company must be determined. Each segment contributes positively or negatively to the whole; only the firm as a whole earns a profit or loss.

Next, consider contribution versus net profit. By attributing profit and loss to segments, allocations are made on top of allocations which reduces the value of the resulting information for managerial planning and appraisal. The contribution approach provides "a simple objective statement of what has happened, requiring no assumptions and no cost allocations."

A segment is a part of the company that is recognized separately for planning or control purposes. Accounting for segments is done in two steps: assign revenues and costs to the planning and control segments responsible for them, and prepare reports that measure contributions by relating planning segment costs to revenues and that measure variances by relating control numbers to planned numbers. Planning reports represent an alternative to allocation.

Assignment is based on responsibility. This sort of assignment shows the cause-and-effect relationship between decisions and actions and the cost and revenue results of those decisions. This method allows for conclusive, direct comparisons between actual and projected results. Direct costing is used as opposed to absorption costing. Direct costing ensures that only those costs for which the product is responsible, the variable costs, are assigned to it. Contribution reporting allows responsibility to be assigned for fixed costs, too.

Marketing management benefits greatly from contribution reporting. Contribution reporting allows marketing to focus on the marginal income rather than sales volume or gross profit. Marketing managers can look at marketing costs and revenues without worrying about allocation problems.

In the future, standards will be even more important. The standards will be for both revenues and costs. Such standards will allow for the integration of the various accounting purposes. These standards are important for budgetary planning, variance calculations, and determination of periodic costs. Essentially, management accounting provides for the assignment of revenues and costs on a responsibility basis.


Briner, Russell F., Michael D. Akers, James W. Truitt, and James D. Wilson, "Coping with Change at Martin Industries," Management Accounting (July, 1989), pp. 45-48, 11/14/95

Zollie Collins, Class of 1997

The article “Coping with Change at Martin Industries,” in Management Accounting’s July 1989 issue, illustrates the way Martin Industries changed its accounting system to remain competitive. Martin’s accounting system in the 1970s was a limited trial balance and bill of materials system. The company had to change it to a variable costing and managerial system.

The company recognized the need for change and the method that would assure the new system’s success. It distributed an in-house training book, “Marginal Income Planning for Profit,” and supported the educational process of changing to the new system.

The key contribution of the new variable costing system is the internal reporting statements that are valuable for internal management and cost control at Martin Industries. The company’s need to lower costs to stay competitive in a changing industry could not have been met without the comparative income statements and performance analysis reports produced by the new system. The new system also introduced responsibility accounting into Martin’s accounting reports, which played a significant part in Martin’s successful change to remain competitive.

The company now has an effective accounting system and uses variable costing to lower costs and change with its industry’s changing environment. Without a doubt, the new system has been a major reason for Martin Industries’ success over the past 20 years.


Beatty, Carol A., “Implementing Advanced Manufacturing Technologies: Rules of the Road,” Sloan Management Review (Summer, 1992), pp. 49-60, 11/14/95

Mark W. Babcock, Class of 1996

In this article Carol Beatty presents the results of a multi-year study of ten companies attempting to use advanced manufacturing technologies (AMT). AMT consists of integrated manufacturing systems that combine computer-aided design (CAD), computer-aided manufacturing (CAM) and numerically controlled machinery. Beatty’s study followed the implementation of these systems from inception through completion and analyzed the reasons for success and failure at each company. She identified three “rules of the road,” requirements which must be met in order for AMT to have a chance. These are: (1) develop an effective champion, (2) plan for a high level of systems integration, and (3) use organizational integration techniques. A company that meets these requirements is not guaranteed success. However, Beatty’s research showed that no firm was able to achieve all of its stated goals without taking these steps.

Beatty also analyzed why firms were not able to follow these “rules.” She calls these barriers “potholes.” According to her research, the main impediment to creating an effective champion is finding an individual with both the technical abilities and the interpersonal and motivational skills to guide the project to completion. Systems integration is even more difficult to achieve. Because of the multiple hardware and software systems in place at most companies, the effort required to integrate them is a substantial drain on resources. The rapid rate at which technology becomes obsolete amplifies these problems. Finally, organizational integration is also necessary. AMT is a cross-functional solution which requires the cooperation of many departments. The different goals and cultures of a company’s R&D, MIS and manufacturing groups can create enough conflict to prevent a firm from achieving its AMT goals.

As Beatty’s study demonstrates, even companies that know the potential problems and necessary steps to effective AMT implementation are sometimes unsuccessful. Knowing what to do does not ensure that a firm will be able to do it, but it can provide some guidance to managers considering utilizing AMT in their businesses.

Go back to list of articles.

Burt, David N., "Managing Product Quality through Strategic Purchasing," Sloan Management Review (Spring, 1989), pp. 39-48., 11/14/95

Terry E. Trudgian, Class of 1997

Quality has become a critical factor of success in manufacturing. However, quality cannot be "inspected into a product" at the end of the production line; it begins with product design and the purchasing of materials. In this article Burt outlines three roles for the Purchasing department in a company's quest for quality: assisting in product development, selecting appropriate suppliers, and managing contracts and supplier relationships.

The Purchasing department's involvement in quality begins in the early stages of product design. Purchasing should be familiar with the potential demand for a product and the factors that may shift the level of demand, and should use this information to develop contingency plans to meet shifts in demand. Purchasing can add value to "make or buy" decisions by contributing cost and quality information. It can also play a role in product development by involving potential suppliers in a product's design; qualified suppliers can assist a product design team in making decisions. Finally, the shift to JIT production requires Purchasing to understand the needs of production and planning.

Purchasing's role in quality continues with the selection of suppliers, where it has three critical tasks. 1) Purchasing must ensure clear and accurate specifications for components; Burt highlights the experiences of companies where Purchasing procured material based on inaccurate specifications. 2) Purchasing should be involved in pre-qualifying suppliers: verifying that a supplier is financially stable, has established quality programs, and has the capacity to meet demand can prevent future production problems. 3) In negotiating contracts with suppliers, Purchasing should ensure that suppliers understand component specifications and the company's commitment to quality. Purchasing's desire for a low price should be adjusted to asking "what is your price for the quality we must have?"

Purchasing's role in quality concludes with the management of supplier contracts. Purchasing must continually ensure that suppliers understand the buyer's commitment to quality; Burt lists a number of companies that provide quality training to key suppliers. Finally, Purchasing must monitor the quality of suppliers and have mechanisms to motivate conformance.

Traditionally, Purchasing was seen as a “clerical function." However, Burt has outlined how a commitment to quality beginning with Purchasing results in a higher level of quality in the final product.

Go back to list of articles.

Schmitthenner III, John W. "Metrics" Management Accounting (May, 1993), pp. 27-30., 11/14/95

Samuel Jones, Class of 1997

Over the years, there has been little change in the way accountants prepare financial statements for both internal managers and external users. While these documents provide a consistent look at the business for outsiders, they are of little use in helping manufacturing management improve manufacturing performance. It is an increasingly important part of the controller's job at the plant and division level to take a proactive role in making sure that managers have the tools needed to make decisions that lead to continuous improvement.

The Soladyne Division of Rogers Corporation has developed Metrics, graphic illustrations that show what direction manufacturing is heading. These metrics fall into three categories: Customer Satisfaction, Manufacturing Volume, and Manufacturing Performance.

The biggest impact manufacturing employees can have on customer satisfaction is to ensure that the customer has his order when he wants it. Soladyne measures customer satisfaction with three metrics: On-Time Sales to Customers, calculated by dividing the value of the shipments that went out as promised by total sales for the week; Orders Past Due, the dollar value of orders that were scheduled for shipment but not shipped; and the Buyers Misery Index, a chart that shows how many different customers have late parts as well as the number of part numbers that are past due.
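
The three customer-satisfaction metrics are simple enough to express directly. A minimal Python sketch (the function names are mine, not Soladyne's, and the data shapes are assumptions for illustration):

```python
def on_time_sales(on_time_value, total_weekly_sales):
    """On-Time Sales to Customers: value shipped as promised / total weekly sales."""
    return on_time_value / total_weekly_sales

def orders_past_due(scheduled_orders):
    """Dollar value of orders scheduled for shipment but not shipped.
    scheduled_orders: list of (dollar_value, shipped?) pairs for the period."""
    return sum(value for value, shipped in scheduled_orders if not shipped)

def buyers_misery_index(late_items):
    """late_items: (customer, part_number) pairs currently past due.
    Returns (distinct customers with late parts, distinct part numbers late)."""
    return (len({c for c, _ in late_items}), len({p for _, p in late_items}))
```

For example, 90 dollars of on-time shipments against 120 dollars of weekly sales gives an on-time ratio of 0.75.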

Manufacturing Volume is important because volume through the factory generally translates into sales. Soladyne measures manufacturing volume with three metrics: Weekly Shipments, which shows sales volume; Year-To-Date Shipments vs. Plan, which compares sales to the most recent forecast and the budget; and Flats/Day Completed, which measures the number of units that manufacturing finishes per working day.

Manufacturing Effectiveness shows trends and progress toward continual and rapid improvement in quality, cost, lead time, and customer service. Soladyne measures manufacturing effectiveness with three metrics: Lost Sales Due to Damage, the potential sales dollars lost to scrap costs; Cycle Time, the time elapsed between when a lot enters the process and the completion of the manufacturing operations; and Percentage of Time on Parts, which tracks the amount of time spent working on parts compared with time on other activities.
John W. Schmitthenner III concludes that developing useful metrics is an ongoing process and that, to make these measurement tools useful, the controller should:
1. Use the right language.
2. Make sure that manufacturing can control the metric.
3. Remember that the goal is to make money.
4. Make the metric visible and graphic.
5. Make the information timely.
6. Use available data.
7. Don't forget to ask manufacturing what they need to know.
8. Take a macro look at the business.

Go back to list of articles.

Gilbert, James D. "TQM Flops--A Chance To Learn from the Mistakes of Others," National Productivity Review (Autumn, 1992), pp. 491-499., 11/14/95

Firms often seek external advice from consultants when they are financially struggling and facing possible shutdown. The strategists observe the company for a while, talking with as many people as they can, then organize a Total Quality Management (TQM) workshop. For TQM to succeed, all levels of management and personnel must participate and, most importantly, communicate.
Too many times, TQM has failed for lack of top management involvement. The cases illustrate how production people were able to rid themselves of their bad attitudes by meeting with one another to discuss their concerns. Initially such a meeting can be difficult, but once everyone agrees to talk one at a time and to pay attention, progress can be made. Some problems seemed virtually impossible to overcome because people were against each other and had little respect for the company. By communicating in an amicable fashion, these opposing forces began to form an alliance, set goals, and work together to achieve them. Unfortunately, these efforts may not be enough: too often the ultimate cause of demise is that top management failed to support its employees.
Other firms, however, can benefit from these failures by scrutinizing the underlying issues that led to the demise and then taking appropriate steps to avoid similar mistakes. When properly supported by all levels of management, TQM can turn seemingly impossible situations into winning outcomes. The following list offers valuable advice for managers who sincerely wish to see their TQM efforts succeed:

• Total involvement. Join the employees and act as a team to achieve common corporate goals.
• Keep communication channels open. Workers probably know the most about operational hazards and problems. Furthermore, these workers usually can offer the best advice on how to increase efficiency and reduce waste.
• Get training. Be the first to attend a training seminar in order to fully understand what the employees will be learning.
• Screen consultants. Many outside firms offer advice for a substantial fee. Make sure the consultants have solid experience with issues similar to the ones currently faced.
• Be patient. Face each problem and take one step at a time. Set small, achievable goals.
• Remove fear. Productivity will increase.
• Institute an education program for continual training.
• Continually monitor training effectiveness and make changes as needed.
• Continually motivate the employees to strive to achieve their goals, recognize their accomplishments, and reward them accordingly.

Go back to list of articles.

Frank, Gary B., Steven A. Fisher, and Allen R. Wilke, “Linking Cost to Price and Profit,” Management Accounting (June 1989), pp. 22-26. , 11/14/95

Elizabeth Lee Wilson, Class of 1996

The role of cost accounting is twofold: first, to facilitate internal managerial decision making; second, to provide a framework for external reporting. While the parameters for external reporting are stringent, the opportunities to develop and implement tailored, effective internal cost accounting systems are limitless.

Creativity and organizational fit are two key factors that influence a firm's cost accounting structure. In the manufacturing industry, technological change and increasing economic pressures demand that today's cost accounting system focus on total cost management, not merely cost control. Reducing costs remains a central goal; however, the boundaries of an accounting system must also encompass profit maximization information. The system must respond directly to a firm's manufacturing processes so that it can identify fixed and variable costs on both a product and a plant basis.

Devising a cost accounting system that adheres to the belief that "the only way to change costs is to change activity" requires that the framework apply to all of a firm's activities. In essence, the accounting data and reports must represent critical activities from the receiving dock to the marketing department. Functional integration is central to devising a system that identifies optimal costs, pricing policies, and product trade-offs. This information allows managerial decision makers to focus on performance improvements throughout the company, helping them decide where to focus their time, effort, and energy.

One need only observe the innovative cost accounting system at GenCorp Polymer Products (GPP) to see that, if you are willing to devote the resources to shaping up your accounting system, the rewards may prove enormous in terms of productivity and profits. GPP is the largest producer of styrene butadiene latex in the world. The industry is highly competitive, as evidenced by the reliance on price as an order-winner, and GPP uses the same feedstock and production technology to produce 60 distinct products.

In order to differentiate itself, GPP focuses on achieving superior product quality (“precise tolerance”) and fostering continuous product innovations. Product quality is achieved by employing just-in-time relationships with suppliers and using computer-aided-manufacturing (CAM) processes. Innovation is inspired via a commitment to research and development as well as using a computerized control reaction process which allows the company to develop products that meet exact customer specifications.

Devotion to these two strategies provided GPP with a healthy bottom line until the plant began to approach its capacity limits in 1985. The firm had to decide which products to ration and/or drop from its product line. The existing cost accounting system, it was discovered, had no link to pricing or true production costs.

The firm relied on external reporting information and found itself handicapped in making effective managerial decisions. For example, the firm was treating direct labor as a variable cost when, in fact, because of technological changes (i.e., CAM), only one person's efforts (out of 130 laborers) could be associated with a specific product. The firm also considered overhead a fixed cost when in actuality it varied with the sizes of its 18 reactor vessels: in essence, a 3,500 gallon reactor received the same "cost" as a 7,500 gallon reactor!

To cure its cost accounting paralysis, GPP developed a system that gives it a true measure of product costs as they relate to pricing and product-mix decisions. The flagship measure in the new system is "product profit velocity" (PVV), a product's contribution margin per standardized reactor hour. The figure is generated as follows: revenue per reactor run (price) less materials and freight-out costs (inputs) equals a contribution margin, which is then divided by standard product processing time (technical and manufacturing). This measure clearly and cleanly links the purchasing, manufacturing, and marketing functions.
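
The product profit velocity measure as described reduces to a one-line calculation. A sketch with a purely hypothetical figure set (the function name and all numbers are mine, not GPP's):

```python
def product_profit_velocity(revenue_per_run, materials_cost, freight_out_cost,
                            std_reactor_hours):
    """Contribution margin per standardized reactor hour.

    Revenue per reactor run, less materials and freight-out costs, gives the
    contribution margin for one run; dividing by standard processing time
    states that margin per hour of the scarce resource (the reactor)."""
    contribution_margin = revenue_per_run - materials_cost - freight_out_cost
    return contribution_margin / std_reactor_hours
```

A hypothetical run of $12,000 revenue, $5,000 materials, $1,000 freight-out, and 4 standard reactor hours would score $1,500 of contribution margin per reactor hour, letting products of very different batch sizes be ranked on a common basis.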

The PVV measure, along with related decisions such as determining that conversion costs were indeed fixed, was developed by a cross-functional ad hoc committee. Committee members spent an extraordinary amount of time researching the key cost drivers and related allocation concerns. Total cost analysis, cross-functional ties, managerial buy-in, and constant communication with all divisions ensure that GPP's cost accounting system provides managers with crucial decision-making information.

Go back to list of articles.

Fry, Timothy D. and James F. Cox "Manufacturing Performance: Local Versus Global Measures", Production and Inventory Journal (Second Quarter, 1989), pp. 52-56., 11/14/95

Moritz Dechow, exchange student, Karlsruhe, Germany

The article "Manufacturing Performance: Local Versus Global Measures" by Timothy D. Fry and James F. Cox compares local performance criteria with global performance criteria. Its conclusion is that isolated use of local measures will lead to wrong decisions. The authors illustrate their findings with several examples.
In a numerical example, a four-product company with limited machine time is looking for the optimal product mix. Applying three different approaches leads to three different product mixes, and all of them are suboptimal. The accounting approach aims at maximizing the ROI on each product, so only the product with the highest margin is produced; this is suboptimal because it ignores the fact that another product mix would use the scarce resource more efficiently. The marketing approach maximizes sales, regardless of the profit margin of each product. The production approach tries to maintain each worker's efficiency, again ignoring differing profit margins. The authors point out that only a "joint effort" among several areas of the company will result in a global optimum.
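
The accounting-approach trap can be reproduced with a toy calculation. The figures below are invented, not from the article; they only show that ranking products by unit margin and ranking them by margin per hour of the scarce resource yield different mixes:

```python
def greedy_mix(products, capacity_hours, rank):
    """Fill limited machine time with products in descending order of rank(p)."""
    plan = {}
    for name, p in sorted(products.items(), key=lambda kv: rank(kv[1]), reverse=True):
        units = min(p["demand"], capacity_hours // p["hours"])
        if units:
            plan[name] = units
            capacity_hours -= units * p["hours"]
    return plan

# Hypothetical two-product firm with 200 machine hours available.
products = {
    "A": {"margin": 60, "hours": 4, "demand": 50},  # highest unit margin
    "B": {"margin": 45, "hours": 2, "demand": 50},  # highest margin per hour
}
capacity = 200

by_margin = greedy_mix(products, capacity, rank=lambda p: p["margin"])
by_rate = greedy_mix(products, capacity, rank=lambda p: p["margin"] / p["hours"])
# by_margin produces only A (total margin 3,000); by_rate fills with B first
# and uses leftover hours on A (total margin 3,750).
```

The "accounting approach" ranking fills all capacity with the high-margin product, while ranking by contribution per scarce hour earns 25% more from the same 200 hours, which is the article's point about isolated local measures.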
Another example demonstrates the negative impact of local performance measures: interviews in a manufacturing plant show that employees at different levels of the company's hierarchy are judged by different performance measures. The plant manager is mainly concerned with the plant's ROI, product managers with due-date performance, and floor supervisors and machine operators are evaluated on standard efficiency measures.
Workers are caught in the "performance trap." If a particular operation is running smoothly, more units than necessary are produced, delaying production of "priority" units. The extra units are "hidden away" and reported at a later date when the item is ordered again. Jobs that more easily beat the standard are chosen over less favorable jobs, and producing out of sequence delays downstream production and assembly. In addition, workers often steal parts assigned to other orders in order to finish a particular order, delaying production of the "pirated" products. Through such activities, shop floor supervisors and machine operators are able to report high levels of efficiency to their product managers.
But these actions have a negative impact on due-date performance, so product managers are forced to schedule overtime, especially toward the end of the month when their performance is evaluated and production is running behind the standards. As a result, the majority of orders are shipped at the end of the month.
The excessive use of overtime increases labor costs. In order to get orders finished and to reduce the large work-in-process inventories at the end of the month, the plant manager has to purchase additional capacity. All these actions affect ROI negatively.
The authors conclude that local measures, whether in production or functional departments, should not be established in isolation. Measures may be distorted by "time fences" such as the end-of-the-month syndrome, the measured work day, and individual incentive systems. Striving for perfection on one measure often has a detrimental effect on other measures, as shown above. Local measures should only be used with extreme caution.
The authors point out that developing a sound and useful performance measurement system is a challenging and difficult task, especially determining the dimensions of measures and the relationships among local measures. Both require further research.

Go back to list of articles.

Grant, Robert M. "The Resource-Based Theory of Competitive Advantage: Implications for Strategy Formulation," California Management Review (Spring 1991), pp. 114-135., 11/14/95

Taylor Erickson, Class of 1997

Strategy has been defined as "the match an organization makes between its internal resources and skills...and the opportunities and risks created by its external environment." Throughout the 1980s, developments in strategy analysis focused on the relationship between strategy and the external environment; Michael Porter's analysis of industry structure and an overall increased awareness of competitive positioning are examples of this predominant trend. However, a resurgence of interest in the firm's internal resources as the foundation for firm strategy provides a new framework for strategic analysis.

Advances in what has been termed "the resource-based view of the firm" have occurred on several fronts and involve two levels of business strategy. At the corporate strategy level, focus is placed on the role of corporate resources in determining the industrial and geographic boundaries of the firm's activities. At the business strategy level, the focus is on the relationships among resources, competition, and profitability, and on how a firm can sustain competitive advantage. The purpose of this article is to integrate a number of key themes arising from the resource-based theory and provide a five-stage procedural framework for resource-based strategy formulation.

Stage 1: Resources and Capabilities as the Foundation for Strategy
The key distinction between resources and capabilities is that resources are inputs into the production process, while a capability is the capacity for a team of resources to perform some task or activity. The primary task of a resource-based approach to strategy formulation is to maximize rents over time. To this end, the article examines the strategic implications of two questions: What opportunities exist for economizing on the use of resources? What are the possibilities for using existing assets more intensely and in more profitable employment?

Stage 2: Identifying and Appraising Capabilities
A firm’s capabilities can be identified using a “standard functional classification” of the firm’s activities. The “collective learning” in the organization or “core competencies” help examine how firms coordinate diverse production skills and integrate multiple streams of technology. When appraising capabilities, it is important to maintain objectivity and assess capabilities relative to those of your competitors. Creating capabilities involves complex patterns of coordination between people and resources and requires learning through repetition.

Stage 3: Evaluating the Rent-Earning Potential: Sustainability
A firm's return on resources and capabilities depends upon the sustainability of its competitive advantage and the ability of the firm to appropriate the rents earned from resources and capabilities. Over the long term, the returns associated with competitive advantage are eroded by the depreciation of resources and capabilities and by imitation by rivals.
Durability--In the absence of competition, the durability of a firm's resources varies considerably. There is greater potential for a firm's capabilities to be more durable than the resources upon which they are based.
Transparency--A firm's ability to sustain competitive advantage depends upon the speed with which other firms can imitate its strategy. Successful imitation requires overcoming two obstacles. The first is the information problem: what is the competitive advantage of the successful rival, and how is it being achieved? The second is the strategy duplication problem: how can the imitator amass the resources and capabilities required to duplicate the strategy?
Transferability--The ability of a firm to acquire the resources and capabilities required for imitating the competitive advantage of a successful rival.
Replicability--Imperfect transferability limits the ability of a firm to imitate success. Some resources and capabilities can be easily imitated through replication, but if the firm's capabilities are based on highly complex organizational routines they are much less easily replicated.

Stage 4: Evaluating Rent-Earning Potential: Appropriability
The returns to a firm from its resources and capabilities also depend on the firm's ability to appropriate those returns. This stage is concerned with discerning ownership rights and evaluating the role of employee skills. In the case of employee skills, two major problems arise: first, the lack of a clear distinction between the technology of the firm and the human capital of the individual; second, the limited control the firm has over the individual. Employee mobility defines the risk a firm takes when its strategy depends on the specific skills of a few key employees.

Stage 5: Identifying Resource Gaps and Developing the Resource Base.
A resource-based strategy is concerned with developing the firm's resource base. This includes not only replenishing the firm's stock of resources, but also augmenting those resources to extend positions of competitive advantage.

The resources and capabilities of a firm are central considerations in formulating its strategy and the firm’s profitability. The relationships between resources, capabilities, competitive advantage, profitability, and sustainability are key to a resource-based approach to strategy formulation. The resource-based approach “requires the design of strategies which exploit to the maximum effect each of the firm’s unique characteristics.”

Go back to list of articles.

Jaehn, Alfred H. "The Zone Control Chart," Quality Progress (July, 1991), pp. 65-68., 11/14/95

Steve Pillsbury, Class of 1997

In his article, Mr. Jaehn discusses the advantages of using a Zone Control Chart (ZCC), as opposed to a traditional Statistical Process Control (SPC) chart such as the Shewhart control chart, to measure the variable quality of a process. He points out that the ZCC is simpler to use and gives better results in most instances. To summarize how a ZCC works:

Results within 1 sigma of the target (+ or -) receive a value of 0. Results between 1 and 2 sigma receive a value of 2. Results between 2 and 3 sigma receive a value of 4. Results beyond 3 sigma receive a value of 8. Track the cumulative score. When the cumulative total reaches 8 or above, the process is out of control. The cumulative zone score, however, returns to 0 each time a new observation falls on the opposite side of the centerline (i.e., goes from positive to negative sigma).
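
The scoring scheme above is mechanical enough to sketch in code. A minimal Python sketch (function names are mine; whether the running total also restarts after a signal fires is an assumption the summary does not settle, and this sketch restarts it):

```python
def zone_score(z):
    """Zone weight for an observation z sigmas from the target (0-2-4-8 scheme)."""
    a = abs(z)
    if a < 1:
        return 0
    if a < 2:
        return 2
    if a < 3:
        return 4
    return 8

def zone_control(observations):
    """Return indices at which the cumulative zone score signals out-of-control.

    observations: standardized deviations from the target, in sigmas.
    The running total resets to 0 whenever an observation crosses to the
    opposite side of the centerline; a signal fires when the total reaches 8.
    """
    signals, total, prev_side = [], 0, 0
    for i, z in enumerate(observations):
        side = 1 if z > 0 else -1 if z < 0 else prev_side
        if prev_side != 0 and side != prev_side:
            total = 0  # crossed the centerline: restart the count
        total += zone_score(z)
        if total >= 8:
            signals.append(i)
            total = 0  # assumption: restart after signaling
        prev_side = side
    return signals
```

Two consecutive points in the 2-to-3-sigma zone (4 + 4) trigger a signal, while points that keep hopping across the centerline never accumulate, which is exactly the simplification that lets operators track zones instead of plotting exact measurements.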

Mr. Jaehn highlights three specific benefits of using the ZCC.
(1) Workers do not need to spend time plotting the exact measurement of each observation; they can simply identify the range, or zone, in which the observation falls. (2) The ZCC has been proven to be more efficient than other process variability measures: with few exceptions, it identifies out-of-control processes in less time (fewer observations) than comparable methods. (3) The ZCC is generally more reliable with regard to "false alarms"; that is, the number of observations before a false alarm is larger for the ZCC than for similar out-of-control tools.

The first part of the article (summarized thus far) compared the ZCC with other control tests in terms of the process average (known as x-bar). The ZCC has also been determined to be a viable test for "detecting changes in variation" among results (known as r-bar measurements). The use of the ZCC for ranges is essentially the same as for control averages, except that only the positive zones are represented.

Finally, Mr. Jaehn describes options in using the zone system. This discussion is also useful in explaining why the zone points are allocated in the 0-2-4-8 manner. For example, if any shift in the mean away from the target "results in a costly rejection," the zone values can be changed to address the issue. In the article he describes a process where the point schemes are in the 1-2-4-8 pattern.

The key point of the article for managers concerned with process quality measurements, however, is that the Zone Control Chart is a simple and efficient method, especially compared to existing SPC tests.

Go back to list of articles.


Shank, John and Vijay Govindarajan, "Making Strategy Explicit in Cost Analysis: A Case Study," Sloan Management Review (Spring, 1988), 12/7/95

Robert E. Lawless, Class of 1997

The article titled "Making Strategy Explicit in Cost Analysis: A Case Study" in the Spring 1988 issue of the Sloan Management Review stresses the need for managers to begin actively considering cost analysis in an extensive strategic context. Strategic cost analysis uses cost data to develop tactics used to gain sustainable competitive advantages.

The authors use the Baldwin Bicycle Company Case to expand the traditional cost analysis approach to include questions pertaining to Baldwin's strategic position in a saturated market. Baldwin has an opportunity to expand sales by developing a private-label bicycle for a discount department store chain. However, the agreement specifies that Baldwin sell the bike at a 15% discount.

Using the traditional cost analysis approach, the proposed arrangement seems very attractive. The authors use a relatively straightforward method for computing the incremental cost of producing the new bike and the cost of carrying the incremental investment needed to support the incremental sales. Also considered is a rough approximation of the cost to Baldwin of reduced sales of its existing product lines caused by the lower prices offered by the discount department store. The final results suggest that the deal is very attractive from a short-run, incremental, financial analysis standpoint.
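The short-run incremental analysis described above can be sketched in a few lines. All figures below are hypothetical placeholders, not the Baldwin case's actual numbers:

```python
# Hypothetical sketch of the short-run incremental analysis described above.
# None of these figures come from the Baldwin case itself.

units = 25_000                    # incremental private-label units per year
unit_contribution = 10.0          # price less variable cost per unit
cannibalized_profit = 60_000      # profit lost on Baldwin's regular lines
incremental_investment = 400_000  # added inventory and receivables
carrying_rate = 0.15              # annual cost of carrying that investment

incremental_profit = (units * unit_contribution
                      - cannibalized_profit
                      - incremental_investment * carrying_rate)
print(incremental_profit)  # 130000.0 -> attractive from the short-run view
```

This is exactly the arithmetic the strategic analysis later calls into question: every input is framed as incremental, so the longer-run effects on dealers and sales mix never appear in the calculation.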

The authors state that most managers will not pursue any further dimensions of this case before making a decision. The agreement seems very favorable from the traditional accounting perspective. However, by using a strategic analysis the authors illustrate how hazardous this limited viewpoint can be.

A strategic analysis starts by examining the possible market penetration achieved by the new product and which market segment will most likely be affected. Since the bikes themselves will differ only in cosmetic ways, the new "discount bike" will compete directly with Baldwin's existing lines while generating less profit for the company.

By selling the bicycles to the discount department store chain, Baldwin is not only creating a direct competitor to its regular customers, but also giving that competitor a better price than its regular customers. This behavior can be considered unethical, as well as harmful to Baldwin's existing client base.

Strategic analysis can also be very useful in looking at the possible long-term results of Baldwin entering the discount segment of the market. In response to Baldwin's supplying the discount department store, Baldwin's dealers could drop its bikes and adopt a competitor's line. This would drive Baldwin further into the discount segment of the market. Further analysis of the company suggests that Baldwin is ill-equipped to handle such a change in sales mix. In fact, the case states that under these circumstances Baldwin would need to cut fixed costs by over 40% just to earn an average return on equity.

In summary, what looks like a profitable investment from the traditional short-run relevant cost perspective can be detrimental to the firm from a strategic viewpoint. It is imperative that the strategic aspects of a business problem be considered when performing cost analysis.

Go back to list of articles.

McKinnon, Sharon M. and William J. Bruns, Jr., "What Production Managers Really Want to Know," Management Accounting (January, 1993), pp. 29-37, 12/8/95

Guardiola, Jose, Class of 1996

This study attempted to determine the usefulness of accounting information for production managers. Its findings demonstrate that the information actually used for daily operating control does not come from the accounting area. Instead, it comes from the manufacturing area itself.

A manager's need for information is affected by three factors: 1) production factors in terms of cost, quality and availability; 2) the time frame; and 3) the channels of communication through which the information flows.

One of the most valuable pieces of information that production managers use is the level of finished goods inventory. They try to minimize the amount of inventory they hold for several reasons: 1) to avoid tying up too much of their working capital; 2) to lower expenses for storage facilities; and 3) to reduce product obsolescence and deterioration.

In companies with high inventories, managers look for critical success factors (CSFs) to help them monitor on a daily basis the factors that drive the annual production plan. The CSFs for these companies revolve around assuring continuous output rather than financial data. Thus, they focus their attention on: 1) units of inputs; 2) number of employees; 3) levels of production; 4) machine downtime; and 5) scrap.

On the other hand, companies with low inventories due to the nature of the process (i.e., more specialization, tailored products and quick distribution) place less emphasis on CSFs related to volume. Instead, they focus more on CSFs like customer satisfaction, quality and product innovation.

An additional factor that affects production control is daily purchasing activity. Among purchasing units, the CSFs are inventory levels and prices. Other factors, such as production plans, bills of materials and supplier lead times, are considered stable over time.

The study showed that the relative cost of raw materials is a key factor in whether production managers maintain high inventories. When materials are costly, purchasing focuses on the price of the commodity and its fluctuations. When the materials are inexpensive, the emphasis is on assuring a continuous supply.

Production managers must also consider distribution costs and customer service. In companies that produce discrete products, neither of these factors was considered to be as important as quality. In any scenario, distribution costs were not followed by managers on a daily basis; instead, decisions were made based upon the manager's experience.

So, in turn, what information is most useful for managers? According to the study, production managers use more physical unit data than dollar data. However, they do look to financial data to ensure that they are moving toward the desired goal.

Accounting systems have failed to provide production managers with the timely, accurate and relevant information they require. This failure occurs because: 1) accounting information reaches production managers too late; 2) the format in which accounting reports information is not useful to them; and 3) the information continues to be reported in dollars.

As a result, production managers have developed their own information systems to gather required information through: 1) short memos among their units; 2) personally designed spreadsheets; 3) personal observation; and 4) external collection of information.

The study proposes three solutions to this information problem. First, management accountants should aid in providing the necessary physical unit data to managers. Second, accountants should play a major role in improving communication and information gathering among all the parties involved (i.e., sales and production). Third, they should attempt to build friendly, real-time databases that give production managers the information they need in a timely manner.

Go back to list of articles.

Scherkenbach, William W., "Performance Appraisal and Quality: Ford's New Philosophy," Quality Progress (April, 1985), 12/9/95

Maria Popova, Class of 96

Ford management considers people its most important resource. It wants to put in place a performance appraisal system that supports people rather than blocking them.

Problems with traditional performance appraisal systems

· American businesses are functionally oriented, and each discipline, such as finance or marketing, is evaluated on objectives that are also functionally oriented. As the system works, it produces mutually exclusive goals, which destroys teamwork.

· Performance appraisal systems reduce initiative or risk taking, because they evaluate people on the basis of their making or not making their objectives. Fear of failure to make an objective keeps people from stretching themselves to meet ambitious goals.

· They increase the variability of people's performance. Employees who were rated below average try to change to become above average. Even if they do not, they may be devastated by the stigma of being below average, and until they recover, their productivity will suffer.

· The systems confound people with other resources. The main assumption in rewarding or punishing people is that they are solely responsible for the results of the process. Yet the outcomes are the results of blending all the inputs.

· They focus on the short term, because senior management is often evaluated on short-term results.

Important principles of the new performance appraisal system

· The main purposes of the new system are to nurture and sustain individual employee contributions to the continuous improvement of the organization as a team and to provide an assessment or evaluation of performance for the employee and management.

· The system must be based on a deep regard for people and must recognize that employees are the organization's most important resource. It requires continuous counseling and coaching, and honest, open communication between the employee and the supervisor, supported by opportunities for enhancement of professional, managerial, and interpersonal skills.

· The new rating system would reflect the fact that there are only three possible positions: outside the system on the low side, in the system, and outside the system on the high side. As a result the number of appraisal categories will be reduced. The only people eligible for merit raises are those outstanding on the high side.

· The new system will put greater emphasis on teamwork. In order to achieve that the appraisal will need to focus on the fact that every employee has to meet customers' needs.

Go back to list of articles.

Keegan, D.P., R.G. Eiler, and C.R. Jones, "Are Your Performance Measures Obsolete?," Management Accounting (June, 1989), pp. 45-50, 12/9/95

Isabel Figueiredo, MBA 1996

In this article the authors address the problem companies have of using obsolete performance measures that have not kept pace with organizational change. To reflect today's business issues, a company should rethink its performance measures by considering several points: the company's strategy, the relationships among the company's functions, the company's multidimensional environment, and a deep understanding of cost relationships and behavior. Finally, the authors suggest an approach for analyzing these points and implementing new performance measures.

Company strategy: Performance measures should derive from strategy, report conformance to specific strategic policies, trigger information concerning deviations from company policies, and provide linkages between business actions and strategic plans.

Relationships among functions: Performance measures should report congruence throughout the organization. They should become increasingly specific and cover shorter-term planning as they extend downward to the lowest levels. In addition to hierarchical issues, performance measures should consider cross-functional relationships, or how these interrelationships can affect each function's performance.

Company's environment: Performance measures should reflect both external perspectives (e.g., competitive cost position, relative labor costs) and internal ones (e.g., historical costs of material). In addition, they should address non-cost issues (e.g., number of customer complaints) as well as cost items (e.g., material cost).

Cost relationships and behavior: Performance measurement must be modified to focus on cost relationships and cost behavior and to stress the importance of the new cost drivers of operations. Ongoing changes in processes and operations have changed cost drivers, and old performance measures have hidden them from management analysis. Therefore, a company should determine new performance measures by considering the overall flow of costs to the customer.

The authors suggest the following approach to changing the performance measurement system:

Start with the strategy, answering the following questions: What are the strategic objectives of the company? How do these objectives translate into divisional goals and individual actions? How do you select the main performance measures that will cause the strategy to be successfully implemented? How do you translate the overall performance measures into those at the lowest level of the company?

Look to budgeting as a process of injecting performance measures into management thinking: Budgets should incorporate concepts of continual improvement and consider performance measures. Moreover, what was budgeted should be reported, creating a closed-loop performance measure reporting structure.

Determine and decompose cost drivers (KMA): Determine your cost drivers and select the best measures to control costs and improve performance. Decompose the departmental budget into business functions. When the process is completed, the company will have the overall cost of each business function and can address whatever seems out of order.

In conclusion, companies should "exorcise the ghost-of-management-past" and reflect today's business issues in their performance measures.

Go back to list of articles.

Lieber, Ronald B., "Here Comes SAP," Fortune (October 2, 1995), pp. 122-124.

Mike Shapaker, Class of 1997

In "Here Comes SAP," Ronald Lieber claims that organizational change (reengineering) can sometimes be sabotaged by technology. Many of today's computer systems don't allow departments to talk to each other in the same language, resulting in work that is time-consuming and inefficient. As a result, these computer systems don't provide employees with the information necessary to perform in a newly reengineered environment.

SAP, a software package made by SAP (a German-based company), remedies these problems by allowing a company or division to standardize its information systems and give employees the data they need when they need it. This software, which runs on mainframes and client-server systems, is expensive and takes time to install. Complete SAP systems can run into the tens of millions of dollars and take years to implement. However, as evidenced by its fast growth ($532 million in revenues in 1992; $1.5 billion in 1995) SAP has gained wide acceptance in the business marketplace. According to the Aberdeen Group, the SAP-related market was expected to hit $9.3 billion in 1995.

R/3, SAP's primary software product, is billed as an enterprise-wide software solution. It can handle a range of tasks from keeping track of manufacturing levels to balancing the books in accounting, and tie it all together too, streamlining the data flow between different parts of a business. The goal of SAP is to allow a company to enter information into a computer only once. After that, the software ensures that everyone stays informed. For example, a sales rep can book an order into SAP. When the factory begins assembling the order, shipping can calculate the expected transport date by checking its progress on-line. Meanwhile, the warehouse can use SAP to check inventory and replenish parts used by the factory. Once the order gets shipped, sales information goes directly to the sales report for management.

One advantage of SAP is that it can act as a template for reengineering. To implement the software, a company must define what information it needs, who needs it, and when. This often forces people to think about their business in radically different ways. SAP can also make people's work easier by eliminating unnecessary paperwork.

Another strength of SAP is that it can force everyone who uses it to focus on the most important part of the business -- the customer. Borden found that it helped provide its customers with more timely information. Before SAP, a customer like Sam’s Club had to order grocery items through Borden’s separate computer systems. This led to multiple purchase orders, invoices, and shipments to the customer, and delays as long as one day to provide information to the customer. With SAP, Borden employees can now track an order’s status on-line while the customers are on the phone.

SAP can also help integrate international operations. For example, Analog Devices used to have each foreign subsidiary develop its own computerized order management system. In this setup, computers in each of Analog's warehouses that kept track of inventory were not connected. This resulted in expensive inventory lying around. With SAP, Analog was able to consolidate its warehouses and create a worldwide order-processing system. Analog employees in the United States can now share on-line information with overseas subsidiaries and thus can ship product more efficiently and cheaply.

Despite SAP's many success stories, there are drawbacks. SAP can take a long time to implement (up to several years) and can be very expensive. As a result, it can take years before the payback is evident. In addition, a large consulting presence is generally required for implementation and training. This can potentially cause disruption in the organization. One example is a $25 million computer network software company in Texas that recently implemented a piece of SAP. The piece of SAP took only two months to implement (an all-time speed record according to the author) but cost $450,000--twice the cost of the next best option. Despite the high cost, CEO Dennis McGinn said, "If we're going to be in the big leagues, we're going to need SAP to compete, and world-class solutions don't come cheap."

Go back to list of articles.

Simon, Hermann, "Pricing Opportunities--And How to Exploit Them," Sloan Management Review (Winter, 1992), pp. 55-65.

Laurent Chardonnet

Price is a key element in a product strategy. It is part of the marketing mix and reflects product positioning, company strategy and customer perception. Price is a quantitative reflection of a product's perceived value. However, prices are still often set by empirical methods such as intuition or rules of thumb.

Price can make or destroy a company's profit. It is the result of a complex balance among different market components such as competitors, market segments, customer needs, price sensitivity, and volume.

Changing a price might disrupt the entire market structure. When changing a price, one should take into account its effect on market share, competition, cost and revenue. In order to reach the optimal price, one can get information by:

Asking managers for realistic profit and revenue scenarios under different pricing strategies.

Asking customers direct price questions in conjunction with evaluating their reactions to alternative products. This method tries to quantify the perceived value of a product and its brand.

Observing customer behavior in response to a price change, using experimental settings or historical data.

A combination of those methods will give the most accurate result.
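As an illustration of the third method, here is a minimal sketch under hypothetical numbers: fit a linear demand curve through two historical (price, volume) observations and solve for the profit-maximizing price. The observations and the unit cost are assumptions for the example, not data from the article.

```python
# Minimal sketch of using historical data (the third method above) to
# estimate a linear demand curve and locate a profit-maximizing price.
# The two observations and the unit cost are hypothetical.

p1, q1 = 100.0, 1_000.0   # (price, units sold) before a past price change
p2, q2 = 90.0, 1_300.0    # (price, units sold) after the change
unit_cost = 50.0

# Fit linear demand q = a - b*p through the two points:
b = (q1 - q2) / (p2 - p1)      # units gained per $1 of price cut -> 30.0
a = q1 + b * p1                # intercept -> 4000.0

# Profit (p - c) * (a - b*p) is maximized where its derivative is zero,
# i.e. at p* = (a/b + c) / 2:
optimal_price = (a / b + unit_cost) / 2
print(round(optimal_price, 2))  # 91.67
```

A real study would use many observations and control for competitor moves and seasonality; two points only pin down the straight line the sketch assumes.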

The shortening of product life cycles has changed the traditional approach to skimming and penetration strategies. Products with a real customer benefit can be priced at a high premium, but their price must rapidly decrease after a short period of time. Decreases can be made before newcomers enter the market (a proactive price cut) or slightly after (a reactive price cut). On the other hand, a firm can choose to keep its high pricing policy at the expense of market share (a harvesting strategy). This last strategy is only possible if the firm wishes to see its product die and be replaced by a more innovative one. Based on historical data, the proactive price cut strategy seems to be the most efficient.

In contrast, me-too products can benefit from penetration pricing. However, prices might have to rise in the long run.

Bad pricing can result in missed opportunities, but today we see the emergence of "price engineering," which uses conjoint measurement methods to fine-tune pricing strategies. Many companies are trying to exploit these price opportunities. Nonlinear pricing, price bundling and price differentiation by market segment offer new ways to exploit missed opportunities.

Go back to list of articles.

"Cost Accounting Standards: Myths & Misconceptions", Management Accounting, January 1994, pp. 42-43.

Todd L Christiansen, Class of 1996

The purpose of this article is to debunk eight common myths and misconceptions about cost accounting standards. The author of the article feels this is important because the misunderstandings represent "a threat to making progress in government contract cost accounting."

The eight myths listed are:

1) The sole purpose of the cost accounting standards is to save the government money
2) CAS were written to achieve the least cost
3) CAS are black and white
4) CAS are governed by GAAP and the tax code
5) CAS take precedence over the cost principles
6) CAS must be followed by all contractors
7) The Defense Contract Audit Agency (DCAA) establishes procurement policy
8) DCAA's interpretations and applications are not fair and balanced

The myths are then debunked with the following arguments:

1) The primary purpose was to achieve uniformity among government contractors and consistency over time, promoting fairness and equity within the system. While savings were a positive by-product of the effort, disputes and cases have been a negative.
2) CAS were designed to be a shield against inequitable treatment of costs
3) CAS are open to judgment and interpretation, much like GAAP
4) While GAAP and the tax code have an impact on CAS, they in no way control the standards; if they did, there would be no need for CAS
5) CAS deal with allocability of costs, and are limited in their scope to the assignment, measurement and allocation of costs. CAS only take precedence when there is a conflict regarding allocability.
6) While CAS cover many government contractors, sufficient exemptions exist to say they are not universally applicable. CAS are applied on a contract-by-contract basis, not a contractor basis.
7) DCAA only develops guidance to assist auditors in determining compliance with procurement laws, rules, regulations, etc.
8) While this may be true, there is the belief that the public is better served by this.

This article does a good job of pointing out some basic misconceptions about CAS and their applicability to government contract work. This information is important to understand for anyone interested in doing business with the government. Money may be saved by knowing when CAS must be applied. Therefore, anyone interested in better understanding this aspect of government contract work should read this article.

Go back to list of articles.

Gray, Janet, "Quality Costs: A Report Card on Business," Quality Progress (April, 1995), pp. 51-54.

Stephen A. Maynard, Class of 1997

In "Quality Costs: A Report Card on Business," Janet Gray suggests that quality improvements in a company should be measured by results, not activities. The best way to measure the return on quality is to track how quality costs decrease over time. By eliminating failure costs, the managers of an organization can directly affect the bottom line.

Gray also describes how quality cost systems are not just for manufacturing concerns. It is sometimes argued that quality costs can't be measured for a service firm because the offering is intangible and consumed at the point of delivery. While the service costs may not be as tangible as a scrap pile on the manufacturing floor, the costs are there and are significant. It is estimated that 30% to 50% of a service firm's annual expenses are due to quality costs, with the majority caused by failure costs.

By identifying quality costs, management's attention is directed to improvement efforts aimed at lowering these costs. Employees can then scrutinize where value is being added and where it is not. Routine costs should be questioned. For example, are the current disability expenses necessary? Do the costs have to be as high as they are?

In addition to the obvious costs in an organization, the hidden costs are just as critical to identify. For example, lost customers and employee turnover should be factored into the quality cost equation. Lost sales and excessive training costs have a direct influence on profitability. It is estimated that hidden costs amount to four times the more visible costs of prevention, detection and correction.

Gray suggests one way to track quality costs in a process: labor/resource claiming, which assigns costs by calculating the amount of labor and resources consumed at each step. The costs are then categorized according to whether they prevent errors, look for errors or correct failures. By following this procedure, the value-adding costs can be highlighted while the detection and failure costs are minimized.
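The labor/resource-claiming procedure can be sketched as a simple tagging-and-totaling exercise. The process steps and dollar amounts below are hypothetical:

```python
# Hypothetical sketch of labor/resource claiming: tag each process step's
# consumed cost as prevention, detection, or failure, then total by category.

steps = [
    ("train staff on procedures",  "prevention", 4_000),
    ("inspect incoming orders",    "detection",  2_500),
    ("rework rejected claims",     "failure",    7_200),
    ("handle customer complaints", "failure",    3_300),
]

totals = {}
for _, category, cost in steps:
    totals[category] = totals.get(category, 0) + cost

print(totals)  # {'prevention': 4000, 'detection': 2500, 'failure': 10500}
```

Even a rough tabulation like this makes the failure bucket visible, which is the point: improvement effort should shift spending out of failure and into prevention over time.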

When starting a quality cost program, Gray stresses that precise numbers are not necessary. The figures that managers derive should be used as a benchmark for improvement. Internal cost consistencies should be the goal. It is also important to grant managers a general amnesty for initial cost estimates. This policy will encourage honest calculations. Finally, the accounting department must be involved since it contains the data to assign costs to jobs, accounts and activities.

In conclusion, Janet Gray explains that quality costs are a management tool. They provide direct insight into a company's performance and help explain how company procedures affect the bottom line. A quality improvement program should be an integral part of an organization's overall business strategy.

Go back to list of articles.

Rehnberg, Stephen M., "Keep Your Head Out of the Cockpit," Management Accounting (July, 1995), pp. 34-37.

Karen Doolittle, Class of 1996

In this article, Mr. Rehnberg compares the budgeting process to learning to fly an airplane. He constantly reiterates that when drawing up a budget, one should keep one's head out of the cockpit. He explains:

Although the instrument panel provided a variety of information, I needed to focus my attention on what was important: flying the plane. Concentrating on the instruments meant I was not watching for other aircraft or where I was heading.

In other words, when beginning to prepare a budget, forego previous budgets, current budget results in dollar amounts, and reams of cost data (control panel information) until the "budgeter" has a clear understanding of the organization's goals and mission, or where it is headed.

Similar to learning to fly a plane, there are five steps to be followed in learning to produce an efficient and effective budget.

Flying Lesson #1: Understand the basics. Know the budget's purpose. Develop a clear and concise mission statement that contains measurable goals within a set time frame. Secondly, identify the resources needed to accomplish the organization's mission and also identify the minimum quantities of those resources.

Flying Lesson #2: Head Out of the Cockpit Approach. Begin the budget process with a preliminary budget development worksheet that answers questions focused around the organization's mission and what actions, resources, and alternatives are needed to accomplish the mission. Cost data should not be provided until after this is completed.

Flying Lesson #3: Flying the Budget Worksheet. Develop department mission statements to support the organization's mission. Develop plans and alternatives to accomplish the mission.

Flying Lesson #4: Looking at the Flight Instruments. Scan financial information on a regular basis, making adjustments and corrections as needed while maintaining primary focus on flying the "plane."

Flying Lesson #5: Earning Your Wings. A department manager has succeeded in the budget process if s/he has: 1) Developed a preliminary budget worksheet identifying and quantifying necessary resources, 2) Assigned monetary units to identified resources, and 3) Adjusted the plan based upon an understanding of the values assigned to those resources.
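The sequence the lessons prescribe (identify resources and minimum quantities first, assign monetary units only afterward) can be sketched as follows; the resources and unit costs are entirely hypothetical:

```python
# Hypothetical sketch of Lessons #1-#5: quantify the resources the mission
# requires first, then assign monetary units to them.

resources = {"drivers": 3, "trucks": 2, "fuel_gallons": 5_000}   # Lessons 1-3
unit_cost = {"drivers": 40_000, "trucks": 25_000, "fuel_gallons": 1.20}

# Lesson 5, step 2: assign monetary units to the identified resources.
budget = {name: qty * unit_cost[name] for name, qty in resources.items()}
total = sum(budget.values())

print(budget)  # {'drivers': 120000, 'trucks': 50000, 'fuel_gallons': 6000.0}
print(total)   # 176000.0
```

Keeping the quantities and the unit costs separate mirrors the head-out-of-the-cockpit idea: the resource plan can be debated against the mission before any dollar figures anchor the discussion.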

Go back to list of articles.

Awasthi, Vidya N., "ABC's of Activity-Based Accounting," Industrial Management (July-August, 1994), pp. 8-11, January 29, 1996

Vikrant Aggarwal, Class of 1996

In this article Vidya Awasthi contrasts traditional costing systems with activity-based costing (ABC) and identifies the potential benefits of an activity-based costing system. Traditional costing systems provide precise, objective, verifiable information that is good for external reporting purposes, but they do not provide relevant information for management decision making. This shortcoming of traditional systems is overcome by implementing an activity-based costing system.

Product cost allocation is a two-stage process. The first stage entails allocating general overhead costs to the respective production departments. In the second stage, the production costs of each department are allocated to the products produced in that department. Second-stage costs are allocated using metrics such as direct labor hours (DLH), direct labor cost or machine hours; the metric most commonly used is DLH.

Awasthi questions this metric given the changes in manufacturing technologies and the decreasing reliance on labor in today's manufacturing environment. Furthermore, he questions the basic assumption of traditional costing systems, i.e., that overhead costs vary with the volume of output. This underlying assumption is no longer valid in all situations; more often than not, product cost is a function of other factors such as product complexity, product-specific equipment or transaction costs. Consequently, the costs of certain products (high-volume and simple) are overstated and those of others are understated.

In contrast to traditional costing systems, ABC allocates costs to a product by determining the activities (and the amounts of those activities) required to produce the product and the costs incurred as a result of those activities. In other words, ABC does not pool all the overhead costs together; instead, it allocates costs to products according to the cost driver activities consumed by the products, and thus makes decisions easier and more accurate.
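The contrast can be made concrete with a hypothetical two-product example (none of these figures come from the article): a high-volume simple product A and a low-volume complex product B, where B consumes a disproportionate share of a non-volume-driven overhead pool.

```python
# Hypothetical two-product sketch of the contrast described above: product A
# is high-volume and simple, product B low-volume and complex.

overhead = {"setups": 40_000, "machining": 60_000}

dlh     = {"A": 1_800, "B": 200}   # direct labor hours
setups  = {"A": 10,    "B": 40}    # setup runs (a non-volume cost driver)
mach_hr = {"A": 1_000, "B": 500}   # machine hours

# Traditional: all overhead spread on direct labor hours.
total_oh, total_dlh = sum(overhead.values()), sum(dlh.values())
traditional = {p: total_oh * dlh[p] / total_dlh for p in dlh}

# ABC: each overhead pool allocated on its own cost driver.
abc = {p: overhead["setups"] * setups[p] / sum(setups.values())
        + overhead["machining"] * mach_hr[p] / sum(mach_hr.values())
       for p in dlh}

print(traditional)  # {'A': 90000.0, 'B': 10000.0} -- A absorbs almost all
print(abc)          # {'A': 48000.0, 'B': 52000.0} -- B drives the overhead
```

The DLH basis loads 90% of the overhead onto the high-volume product, while ABC reveals that the low-volume, setup-intensive product actually causes more than half of it, which is exactly the distortion Awasthi describes.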

The author has identified the following potential benefits of activity-based costing:

Product Profitability: Since activity-based costing provides more accurate costs of products, it decreases product cost distortions and hence gives better information on product profitability.

Customer Profitability: Since activity-based costing provides information on costs of providing the same product to different customers (based on order quantities, service levels), it helps managers evaluate profitability of their customers.

Cost Management: Activity-based costing allocates costs accurately to the various cost drivers; this helps managers focus their attention on those activities that can either be eliminated or be carried out more efficiently.

Project Accounting: Since projects are a set of activities, activity-based costing can be used to develop cost information of the various project activities and thus allows managers to focus on and manage project activities effectively.

Finally, Awasthi states that the decision to implement a new cost system is a trade-off between the cost of measurement (of cost-related data) and the cost of errors (in decision making). An optimal costing system is one for which the total cost is minimized. However, the decision to use a new costing system should be made on a case-by-case basis.

Go back to list of articles.

West, Robert N. and Snyder, Amy M. "How to Set Up a Budgeting and Planning System." Management Accounting Jan. 1997: 20-26.

Stacy Cobbins, Class of 1998

Budgeting and planning are important functions of any company. Although not all companies perform these tasks, this article informs such companies (e.g., small businesses) how to set up these systems and conveys their importance. To illustrate the point, the authors focus their analysis on Penn Fuel Gas, Inc. (PFG). Ms. Amy M. Snyder, the co-author of this article, was an outside hire brought in to manage the budgeting function that PFG had engaged consultants to establish.

Background Information: PFG is a public utility holding company. The company sells natural gas, offers natural gas storage and transportation services, renders mechanical services, and owns a propane business. Because various stakeholders (bankers, the board of directors, and managers) want statements that revolve around budgeting, the company realized that a budgeting function needed to be institutionalized.

As an external hire, Ms. Snyder had to learn the ins and outs of the business. She quickly found that the company’s operations were clear, but that the rules and regulations covering the public utility industry were not. To address this, Ms. Snyder was sent to a week-long technical program on industry regulations and was invited to attend the company’s operations meetings. One issue she could not control, however, was Mother Nature; PFG therefore established flexible budgets that allowed reprojections at least quarterly.

The co-authors thus pointed out steps necessary for people initiating a budgeting system by detailing the steps Ms. Snyder took to implement her system. First, Ms. Snyder had to understand what information the stakeholders wanted and how to provide that information to them. For example, reports to outside users must comply with GAAP; whereas, internal reports can be broken down into various formats that detail a product, region, customer, etc.

Next, decisions needed to be made. In this article, PFG’s Northern division used a different accounting system from the Southern division. Some segments of the company were regulated and others were not. Account structures varied across segments. In any company, these issues would pose significant challenges when integrating a budgeting and planning system. In this case, corporate’s preference prevailed instead of individual business segments.

To make the above decisions, information needed to be gathered. Some information was easy to obtain, while other information had not been developed or maintained. In addition, expenses needed to be reviewed and classified. As a result of gathering data and reviewing expenses, Ms. Snyder made two recommendations: eliminate miscellaneous expense accounts with large balances, and change the expense classification system. As the authors note, these suggestions posed challenges, ranging from insufficient data to massive numbers of accounts needing reclassification. Although such challenges can be overwhelming, the authors say they are best addressed early, because it is hard to change a system after it has been developed.

After those issues were addressed, PFG’s top officials wanted better and faster information. A new billing system was installed to automate and speed up some of the necessary tasks, but Ms. Snyder remained responsible for satisfying other needs, such as composing reports. Management requested one-year business plans, and directors wanted three-year plans. To meet those needs, it was necessary to establish monthly financial packets for month-end and year-to-date evaluations and for projections (although projections are very hard in this industry).

Adding a special touch, the authors conclude the article with tips and warnings. The tips: use graphs in reports for users who are not familiar with the budgeting process, and consider the direct method for cash flow reports. The warnings: no two years of budgets are ever the same, and be alert to the potential to lowball revenues or pad expenses.

Go back to list of articles.

Metters, Richard, "Quantifying the bullwhip effect in supply chains," Journal of Operations Management, (May, 1997), pp.89-100.

Kristin Pillsbury

Class of 1998

In looking at today’s supply chain, it is important to note how distortions in demand information can grow the farther a link in the chain is from the end consumer. The distortion is compounded by perceived demand seasonality and forecast error, which are amplified farther up the supply chain. This phenomenon has been termed the "bullwhip" effect, and it can have serious consequences for companies upstream in the chain: suppliers perceive a change in demand and begin producing at capacity, which leaves them with substantial excess inventory when demand falls. This article examines the causes of the bullwhip effect and possible solutions.

Two factors have proved significant in measuring the effect of the bullwhip: increases in parameter values, such as demand variance and seasonality, and the effects on overall business profitability. On profitability, it is unclear whether there are practical implications, and any real dollar costs from this kind of demand behavior may or may not be significant. The article illustrates with two situations: "one has a demand of 0 one week followed by demand of 1000 the next, compared to known demand of 500 both weeks. This means as long as capacity is greater than 1000 per week and demand is deterministic, cost are identical in either case." Even the holding and shortage costs appear to be the same.

Looking further at the impact of the bullwhip on business, there is cause for concern when "(1) the manufacturers are typically capacitated; and (2) missing a customer deadline for a seasonal product often has ramifications in addition to the loss of revenue." The second point means the business forgoes sales by not having enough product available at the right time; the customer does not want the product placed on back order, it is now or never. The author finds that the bullwhip has real costs when seasonality is combined with capacitated systems.

The article discusses in depth how to determine some of the costs caused by the bullwhip effect and how to examine them through problem modeling, whose purpose is to determine "the cost of optimal production policies." One practical policy action that can reduce the bullwhip effect is to reduce time delays between supply chain links; shortening these delays decreases costs by decreasing the amount of inventory held.
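Metters quantifies the effect with formal production models; the toy simulation below (all parameters invented, not taken from the article) only illustrates the qualitative mechanism: when each link forecasts from the orders it receives and chases changes in its forecast, order variance grows at every upstream stage.

```python
import random
import statistics

random.seed(1)
# End-consumer demand: roughly level with random noise (assumed, illustrative)
consumer_demand = [100 + random.randint(-10, 10) for _ in range(52)]  # weekly

def upstream_orders(incoming):
    """Orders placed by a supply-chain link that forecasts with exponential
    smoothing and chases changes in its forecast (a crude order-up-to policy).
    The chasing term is what amplifies variance as orders move upstream."""
    orders, forecast = [], float(incoming[0])
    for d in incoming:
        prev = forecast
        forecast = 0.7 * forecast + 0.3 * d        # exponential smoothing
        orders.append(max(0.0, d + 2 * (forecast - prev)))
    return orders

retailer = upstream_orders(consumer_demand)   # what the retailer orders
wholesaler = upstream_orders(retailer)        # what the wholesaler orders

print(statistics.variance(consumer_demand),
      statistics.variance(retailer),
      statistics.variance(wholesaler))
```

The variance of orders increases at each link even though end-consumer demand is essentially flat, which is the distortion the shortened supply-chain delays are meant to dampen.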

Finally, so much research has been done on the bullwhip effect that one would expect it to be easy to identify and correct, but this is not the case. The key to eliminating the bullwhip is to change overall business practices throughout the supply chain, and these can be very difficult to change. According to the research, however, once the bullwhip is eliminated, profitability can improve dramatically. The key to success is knowing your customers inside and out.

Go back to list of articles.

West, Timothy D. and David A. West, "Applying ABC to healthcare," Management Accounting (February, 1997).

By: Nick Checota, class of 1999

In the early 1980s the healthcare industry began experiencing fundamental change. As a result of government regulation and the rise of managed care, healthcare providers faced increasing pressure to bring costs under control. The changing environment has pushed healthcare providers to identify and implement accurate and effective accounting methods. In a February 1997 Management Accounting article, "Applying ABC to healthcare," David and Timothy West argue the benefits of adapting activity-based accounting to the healthcare industry.

The authors begin making the case for adopting an ABC system by presenting the changes that have occurred in the healthcare industry. Prior to 1983 most healthcare providers operated on a "retrospective payment basis", meaning that facilities could dictate price in order to increase profits. This system was curtailed when Medicare introduced Diagnostic Related Groups (DRG). DRGs classified illness and reimbursed accordingly, forcing providers to contain costs in order to obtain profits. Cost pressure further increased as managed care became more widely accepted. The concept of capitation implemented by managed care has made cost containment and accountability essential for success.

Tim and David West propose that the best way to meet the increased need for accurate cost accounting is to implement an activity-based accounting system. They believe that an ABC system would better account for the relationship between cost information and treatment provided. Their study of two divisions within a non-profit clinic shows that by using cost drivers, the ABC method helps providers more accurately identify the costs of services provided.

The article goes on to outline the elements required for successful implementation of an ABC system. First, accountants must participate in the issues surrounding cost questions and determine what costs are associated with the services provided. Second, management and accountants must work closely together to determine what cost drivers should be used. Lastly, cost system decisions must be a factor when determining a provider's organization design.

As a system that has primarily been used in manufacturing, ABC accounting seems to be gaining popularity in the healthcare sector. While many experts are cautious about the benefits of ABC accounting, the study by Tim and David West yielded positive results for its use in a healthcare setting. The authors feel that this method of accounting could be a substantial asset in providers' efforts to cope with managed care and government regulation.

Go back to list of articles.

Brimson, James A., Feature Costing: Beyond ABC, Journal of Cost Management, January/February 1998, pp. 6-12.

Mona Reinhardt  Class of 1998

Process Management is one of the most popular management improvement techniques today. Instead of organizing along traditional functions, many companies are trying to define their business in term of the processes they undertake. The process participants then map the elements of those processes. Subsequently, management and employees examine how to improve performance by reengineering the basic processes.

In Feature Costing: Beyond ABC, James Brimson extends process management into the realm of cost accounting. He begins his analysis by explaining that normal activity-based costing (ABC) systems have become so complex that they are difficult to maintain; moreover, they are not meaningful to operations personnel. ABC systems are complex because they require a unique bill of activity for each product. As an alternative, Brimson proposes grouping costs by processes and by the product features that use those processes. Feature costing is less complicated because it groups activities into product features, and products within a line tend to have a limited number of features, which remain fairly stable.

Under a feature-based system, one first identifies the features of each product, then sub-divides those features to further refine their functionality. As an example, Brimson explains that a pair of jeans could have a fly, which could be subdivided into either zip or button style. The next step is to identify the activity associated with the specific feature. The author gives a watch pocket on the jeans as an example. The associated activities are cutting, hemming and attaching the watch pocket. After identifying these activities, one must then determine the cost of the various factors of production, including labor and materials. An activity analysis is performed to determine the average activity cost associated with the product feature. Using an average ensures that "chance variations" are built into the process cost. The cost of the product is the sum of the feature costs.
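The roll-up described above can be sketched directly, reusing Brimson's jeans example with invented dollar figures: each feature maps to activities, each activity carries an average cost, and the product cost is the sum of its feature costs.

```python
# Average cost per activity (assumed, illustrative figures)
activity_cost = {"cut": 0.30, "hem": 0.20, "attach": 0.50, "sew_zip": 0.80}

# Which activities each product feature consumes (from the jeans example;
# the mapping for "zip_fly" is a simplifying assumption)
feature_activities = {
    "watch_pocket": ["cut", "hem", "attach"],
    "zip_fly":      ["cut", "sew_zip"],
}

def feature_cost(feature):
    """Sum the average activity costs associated with one feature."""
    return sum(activity_cost[a] for a in feature_activities[feature])

def product_cost(features, base_cost=0.0):
    """Product cost = base materials/labor + the sum of its feature costs."""
    return base_cost + sum(feature_cost(f) for f in features)

jeans = ["watch_pocket", "zip_fly"]
print(round(product_cost(jeans, base_cost=5.00), 2))
```

Because the cost is built feature by feature, a cost variance can be traced either to a feature (a design issue) or to the execution of an activity (a process issue), which is the diagnostic advantage discussed next.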

Once a product has been disaggregated into features and corresponding processes, one can better examine the source of any cost variations. If a cost variation is caused by a product feature, then it can be managed in the product design and development phase. If the cost variation is from process execution, then it is a process management problem. The process execution problem can be either from basic process design, or non-compliance with processes.

The advantage of the feature costing approach is its consistency with the popular Process Management approach to firm organization. Feature costing provides both a method and a metric for examining products and determining whether production can be improved through process or product redesign.

Go back to list of articles.

Krupnicki, M. and T. Tyson, "Using ABC to Determine the Cost of Servicing," Management Accounting (December, 1997), pp. 40-46.

Kevin Peak MBA 1998

In this article Krupnicki and Tyson discuss some of the issues involved in using activity-based costing to determine the cost of servicing customers in the service industry. They use a small welding supply distribution company with seven employees and a diverse group of customers to show how ABC can be useful. The end result of this process, they say, is better profitability analysis and the identification of cost reduction opportunities. The steps in their process are: analyzing the data; determining the rates; and using the data to make responsible managerial decisions.

Analyzing the Data

This step involves performing time studies of employee duties, which the authors say is difficult because employees rarely have only one responsibility. It is further complicated by the difficulty of accurately capturing the time spent on any one function in a services company. At the welding company there are fifteen types of activities, including sales, purchasing, auditing, order pulling, advertising, and billing.

After determining the types of activities, a services firm must decide what base it will use to allocate costs. For sales the authors use time, but for auditing they use number of invoices and for advertising they use sales dollars.

Determining the Rates

Once the activities and cost drivers are determined, the services firm can look at how much it costs to service a particular customer. At the welding supply company, for example, each employee’s total expenses per activity were divided by the cost driver to determine a cost for a walk-in customer, local delivery, country delivery, and courier shipment. Each of these delivery methods can then be examined for profitability analyses and cost reduction opportunities.

The authors look at the rates to determine that it costs $22.38 to handle one walk-in customer invoice, but only $18.98 for a city truck invoice. Once this information is known they can look for efficiencies and other savings measures. They say, however, that rates alone cannot be used to make decisions about servicing.
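The rate computation the authors describe reduces to dividing each activity's traced expense by its cost-driver volume. The underlying dollar and invoice figures below are invented; only the resulting rates ($22.38 per walk-in invoice, $18.98 per city truck invoice) come from the article.

```python
# Assumed annual expense traced to each delivery channel's activities
activity_expense = {"walk_in": 44_760.0, "city_truck": 37_960.0}
# Assumed cost-driver volumes (number of invoices handled per year)
invoices = {"walk_in": 2_000, "city_truck": 2_000}

# Rate = traced expense / driver volume, per channel
rates = {k: activity_expense[k] / invoices[k] for k in activity_expense}
print(rates)  # walk-in: 22.38 per invoice, city truck: 18.98 per invoice
```

As the authors caution, these rates identify where to look for efficiencies, but rates alone should not drive servicing decisions.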

Managerial Decisions

Using ABC is quite useful to look at servicing costs, but there are limitations. For example, the country delivery customer base operates at a net loss, but the authors argue that management must consider the length of time this route has existed (over fifty years) and the impact of canceling it. They argue that the first step is to look at ways to make this customer base more economical, either through more direct selling, imposing a delivery charge or demanding a minimum sales charge. In any case, management can now see the contribution margin for each activity in the servicing process, and can then make better decisions in the future.

Michael Krupnicki and Thomas Tyson, "Using ABC to Determine the Cost of Servicing Customers," Management Accounting (December, 1997), pp. 40-46.

Yudesh Sohan, Class of 1999

Although most case studies of activity-based costing (ABC) applications have been in manufacturing, ABC can be used in a small service business to isolate costs for decision making. An ABC cost study was conducted at Mahany Welding Supply, a small family-run welding supply business. The purposes of the study were to determine the cost of servicing customers and to identify appropriate cost reduction opportunities.

Under current ownership, Mahany Welding Supply has enjoyed three decades of steady growth. The owner, who operates the business in a very traditional management style, has, like many small, family-run business owners, financial goals limited to staying in the black from year to year. Nevertheless, there is a growing concern that certain parts of the company have been subsidizing other parts and that profits from some customers have been subsidizing losses from others. ABC principles were applied to historical financial data to address these concerns so the business owner could actually see the problems and assess ideas on how to correct them.

The ABC analysis indicated that 15 different activities ranging from billing to advertising caused costs to occur in the company. After classifying the activities, performing time studies, finding a causal link between activities and costs, computing allocation rates, and putting data into a contribution margin format, the desired cost numbers were realized. Even if this analysis were not refined and tried again, it still identified the costs of servicing different customers, which is the main reason the project was implemented.

This ABC study provided information that will help Mahany’s managers make better decisions. The mechanics of an ABC system are straightforward, but a company intending to conduct an ABC study must be prepared to devote sufficient resources to it. People involved in the project must spend a great deal of time looking at what really drives costs in their business by observing activities, interviewing employees, and applying quantitative methods such as regression analysis. A company that doesn’t allocate the necessary resources is destined to be displeased with the results.

Go back to list of articles.

Corey C. Curtis and Lynn W. Ellis "Balanced Scorecards for New Product Development"  Journal of Cost Management; May/June 1997, Volume II, Number 3, pp. 12-18 

Cindy R. Smith  Class of 1998

The balanced scorecard concept was introduced in the early 1990s and has been applied almost exclusively to operations. Historically, finance and new product managers have struggled to establish appropriate new product development measures, due in large part to the lack of empirical data available. Knowing that "What you measure is what you get" has prompted managers to seek better, more useful tools for the innovation process. This article presents the findings from four years of survey research on innovation practices and new product development process inputs and their relationships to speed-to-market, financial performance, and customer satisfaction outcomes. The research represents the input of 600 to 900 companies in a broad range of industries.

This research is important in that it statistically models survey results to demonstrate the relationship between internal strategies and processes and their alignment with ultimate outcomes. The studies found that stage-gate tracking, focusing on customer satisfaction with responsiveness, QFD, shifting R&D emphasis downstream, adopting a longer technology strategy planning horizon, and extending patent lives are all practices that positively influence more than one outcome. Using internal rate of return (IRR) rather than net present value (NPV) analysis is also beneficial for new product development purposes: unlike NPV, IRR ranks projects on relative return, which allows smaller projects with higher expected profits to be funded, projects that would have been rejected under an NPV ranking. The study also addresses erroneous innovation notions, such as the belief that increased R&D spending will produce better financial results. The research shows that increased R&D spending related only to higher new product sales percentages, a benefit not experienced until later years. More importantly, the researchers found that overspending on R&D can lead to diseconomies of scale and disastrous financial results.
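The IRR-versus-NPV point can be shown with two invented projects: an absolute-NPV ranking favors the large project, while an IRR ranking favors the small, high-return one. The cash flows and the bisection IRR solver below are illustrative, not from the article.

```python
def npv(rate, flows):
    """Net present value of cash flows, flows[0] at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-7):
    """Bisection on NPV(rate) = 0; assumes conventional flows
    (one sign change), so NPV is decreasing in the rate."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

big_project   = [-1000, 400, 400, 400, 400]   # large outlay, modest return
small_project = [-100, 60, 60, 60]            # small outlay, high return

r = 0.10  # assumed hurdle rate
print(npv(r, big_project) > npv(r, small_project))   # NPV ranking favors big
print(irr(small_project) > irr(big_project))          # IRR ranking favors small
```

Under a fixed development budget the IRR ranking lets several small, high-return projects be funded in place of one large project with a bigger absolute NPV, which is the behavior the study found beneficial for new product development.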

The research also revealed practices to avoid because of their negative impact on financial performance: target costing, the Taguchi method, and using stock options to reward R&D professionals. Management should also consider the trade-offs associated with new product development decisions and avoid certain "acceleration traps." Some practices do not add value, and in those cases resources should be redirected to more rewarding projects. Because decisions relating to new product development are often unique to this area, managers must seek new and relevant management accounting, budgeting, and control tools unlike those used in capital budgeting.

Go back to list of articles.

Robin Cooper and Regine Slagmulder, "Cost Management Beyond the Boundaries of the Firm," Management Accounting, March 1998, pp.18-19,

Written by: Tamara L. Mathis, Owen Class of 1999

New findings within the management accounting community suggest that coordinated efforts by firms in developing cost-reduction activities are much more effective than independent efforts. Traditional methods, which limited these efforts to within the firm, did not achieve their full potential because they ignored the "cost reduction synergies that exist across the supply chain". Collective efforts have therefore been found to produce lower-cost solutions for all organizations involved.

These collective efforts reduce costs by 1) helping to identify ways to make the interface between firms more efficient and 2) helping a firm and its buyers and suppliers find additional ways to reduce the manufacturing costs of products. As costs decrease, external effects occur in the form of increased customer satisfaction along each link of the supply chain. This is important because customers are a driving force for the firms along the supply chain. Each participant (the firm, its buyers, and its suppliers) must become more efficient for the collective process to work. Success hinges on the end result for all parties involved: changes, such as efficiency changes, will only be approved if all firms are net better off at the end of the improvement process.

To reduce manufacturing costs along a supply chain, the following are two cost management approaches to consider: 1) Target costing and 2) Kaizen costing.

Target costing is applied to reduce costs during the product design stage. Its objective is to initiate closer relations between the design teams of the firm and its suppliers while striving for lower-cost solutions that would not be possible if the design teams functioned independently of suppliers. Through target costing, the firm can negotiate prices with suppliers; the prices should reflect the cost pressures the firm faces in the marketplace. This process is not limited to buyers and sellers: all players along a given supply chain can be incorporated in the effort.

Kaizen costing involves reducing costs during the manufacturing stage. Cost reduction objectives are set for a company’s suppliers and should reflect the competitive pressure that the firm faces in the marketplace. There are two approaches to the Kaizen method: 1) across-the-board cost reduction objectives for all outsourced items and 2) specific cost-reduction goals for individual items.

Both target and kaizen costing fulfill a common objective: they achieve cost savings that the organizations along a given supply chain could not achieve independently. All parties must be willing to cooperate and concentrate on the collective good (net gains) as opposed to individual savings levels. One organization's savings can greatly benefit another as cost savings are passed down the supply chain, allowing the overall supply chain to be more cost effective and competitive.

Go back to list of articles.

Robin Cooper and Regine Slagmulder, "Strategic Cost Management," Management Accounting, Series 79, February 1998, pp. 16-18.

Inga L. James, Class of 1999

Strategic cost management reduces costs and strengthens the strategic position of the firm. Traditional management accounting and strategic cost management are quite different in that traditional management accounting concentrates primarily on determining the cost of products. Strategic cost management does not. In the case of traditional cost systems, nonmanufacturing costs are not managed effectively because their occurrence is masked by the way they are treated by the firm’s cost system.

Cost management extends beyond the physical boundaries of the firm's manufacturing facility to costs related to suppliers and customers as well as products. To enable these costs to be managed strategically, they must be allocated causally. One technique for assigning nonmanufacturing costs is activity-based cost management: activity-based costing assigns costs in a causal manner, so each cost can be traced to its exact cause.

Procurement costs must be managed to ensure purchasing manager incentives are aligned with the strategic position of the firm. In other words, strategic cost management would argue that purchasing managers should not be rewarded solely on obtaining the best purchase prices, but on the total cost. Total cost would include quality, reliability and delivery performance. The resulting buying behavior leads to a strengthening of the firm’s strategic position because suppliers are chosen on the basis of their ability to help the firm produce high-quality products timed to customer demand. Assigning supplier costs to products generates a more accurate view of product profitability and provides better insight into the design of new products.

In traditional cost systems, SG&A expenses are treated as period costs and are expensed to the income statement. Essentially, this spreads SG&A costs evenly over products based on their sales dollars, which often distorts the view of the cost of serving customers. If SG&A costs are significant, treating them as period costs can strategically weaken the firm because individual customer profitability cannot be accurately measured. Strategic cost management assigns customer-related costs to the customers that cause them, using activity-based principles, generating a more accurate view of customer cost and profitability. On this basis, the firm's strategic position can be strengthened by attracting and retaining high-profitability customers, even at the risk of losing low-profitability ones. There are several ways to identify highly profitable customers once strategic cost management is in place; the overall goal is to increase the ratio of profitable to unprofitable customers.

In short, strategic cost management assigns costs causally such that changes in these costs can be better managed.

Go back to list of articles.

Magretta, Joan. The Power of Virtual Integration: An Interview with Dell Computer’s Michael Dell. Harvard Business Review. March-April, 1998: pp. 73-84.

Reviewed by Elizabeth J. Mueller

Owen Class of 1998

This article provides an opportunity to hear how Michael Dell has changed the personal computing industry through his company's relationships with customers, suppliers, and manufacturers using virtual integration. Virtual integration, as described by Dell, is a process that uses technology and information to blur the traditional boundaries in the value chain among suppliers, manufacturers, and end users. This way of doing business means technology enables coordination across company boundaries to achieve new levels of efficiency, productivity, and returns to investors. The model blends two traditional, yet very different, business models: the tightly coordinated supply chain that has traditionally come through vertical integration, and the focused, specialized strategy that drives the virtual corporation.

Dell Computer has emerged distinctly separated from its closest competitors: IBM, HP, and Compaq. Unlike these three, it hasn’t built its success on engineering but on harnessing all its resources through partnering-style relationships. Dell’s value comes from its ability to leverage its relationships with both suppliers and customers. With its suppliers, Dell has established deals in which the supplier agrees to meet 25% of Dell's volume requirements based on a long-term commitment that will be consistent regardless of yearly fluctuations in supply and demand. The supplier assigns engineers specifically to Dell, and Dell begins to treat them as if they were part of the company. These collaborations change the focus from how much inventory there is to how fast it is moving, and that creates real value. When Intel introduces a new 450-megahertz chip, Dell, with 11 days of inventory, can go to market with the new product 69 days faster than a competitor holding 80 days' worth of inventory. This leads to another important issue directly related to inventory: accounts receivable. For Dell, accounts receivable is about 70% corporate customers, who tend to pay their bills on time and keep Dell growing. In the computer industry, however, inventory carries massive risk because the cost of materials goes down 50% a year; if you have two to three months of inventory versus 11 days, you’ve got a big cost disadvantage and can end up with high levels of obsolete inventory.
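The interview's inventory arithmetic can be sketched with a simple model. Assuming component prices decay smoothly at 50% per year (the figure cited in the interview; the continuous-decay form is an assumption), the purchase-cost penalty of carrying a given number of days of inventory is:

```python
DECAY = 0.5  # component prices halve each year (from the interview)

def relative_cost_penalty(days_of_inventory):
    """Extra amount paid per dollar of current component price, given the
    components in today's machines were bought `days_of_inventory` ago."""
    return (1 / DECAY) ** (days_of_inventory / 365) - 1

for days in (11, 80):
    print(days, "days:", round(relative_cost_penalty(days) * 100, 1), "% penalty")
```

Under this model 11 days of inventory costs roughly 2% extra in component prices while 80 days costs roughly 16%, which is the "big cost disadvantage" Dell describes, before even counting obsolescence risk.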

Inventory velocity is an important performance measure that Dell watches closely; it focuses Dell on working with suppliers to keep reducing inventory and increasing speed. For example, with a supplier like Sony, which makes high-quality, reliable monitors, there is no need to hold any inventory at all. Dell can comfortably put its name on the monitors without even testing for defects, because Sony's defect rate is under 1,000 per million. So Dell asked Sony whether it could just pick up the monitors it needs every day directly from Sony, and now relies on UPS or Airborne Express to collect the number of monitors corresponding to the computers Dell has just built and deliver them directly to the customer. This depends on a sophisticated data system with real-time demand information; it allows Dell to keep only a few days' worth of inventory on hand because it communicates inventory and replenishment needs with its vendors almost hourly.

This article continues to discuss Dell’s segmentation strategy in servicing customers directly either through corporations or individual users. Dell has found inventive ways to incorporate its customers into the value chain to determine the most effective strategy for staying on top of customer needs and desires in a very close way. This has also allowed Dell to dramatically extend the value it delivers to customers. This article is illustrative of the way costs are managed within a firm that is partnering closely with suppliers and customers to create new levels of integration of time and resources more effectively than any business model has ever achieved.


"The Power of Virtual Integration: An Interview with Dell Computer’s Michael Dell", Harvard Business Review, March-April 1998. pp. 73-84.

Lance M. McInerney, Owen ‘98

Michael Dell tells how virtual integration has made his company one of the premier PC vendors in the world. His basic strategy has been the direct business model: eliminate resellers and their markups by selling directly to the customer. Making this model work requires an unusual level of integration and information sharing with the customer. The results have been fabulous: new levels of efficiency, the industry's highest inventory turns, and spectacular shareholder returns.

Dell saw that the old model in the PC industry -- vertical integration -- was doomed to failure. With technology changing rapidly, a given technology could become obsolete within a few months. Dell did not think it wise to invest capital in a business that could become obsolete so quickly, and he believed there was a way to capitalize on shifting technological trends instead. This led to the development of Dell Computer as an "assembler" of PCs.

One of the key factors in the direct model is closeness with customers. Customers' orders carry a lead time of typically 5-6 days in the Dell model, compared with a 90-day lag for traditional PC manufacturers. Because Dell does not have to stock reseller channels with inventory or guess the final demand for its products, it simply works closely with its customers and delivers within 5-6 days. This short lead time permits Dell to carry significantly less inventory than its competition, improving its ROE and ROA.

On the flip side, closeness with suppliers is crucial as well. Dell cited his relationship with Sony (which supplies monitors) as a good example. Dell never sees the monitors that ship with its computers: it tells UPS to pick up an order of computers from one of its three factories and a corresponding number of monitors from Sony's factory in Mexico. UPS then matches the monitors to the correct orders and ships them to the customer. This dynamic works well for both parties: Dell carries no monitor inventory, and Sony gets real-time demand information, which prevents inventory buildup and reduces production variability.

Dell views inventory management as the single most important success factor in the PC business. With rapidly changing technologies, a company with large stockpiles of old microchips will be slower to introduce the latest, more powerful chip into its computers. Also, inventory risk is manifest in rapidly declining technology costs. For example, if DRAM prices fall dramatically during the year and one firm has three months of inventory while Dell has only 11 days, that translates into huge cost savings for Dell.
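The scale of that inventory risk can be sketched with a quick back-of-the-envelope calculation. This is a hypothetical illustration only, assuming a smooth 50%-per-year price decline (real component prices fall in steps):

```python
# Hypothetical sketch: cost penalty of aged inventory when component
# prices fall ~50% per year (assumes a smooth, continuous decline).

def price_multiplier(days_held, annual_decline=0.50):
    """Price paid for a part bought `days_held` days ago, as a
    multiple of today's price, under a smooth annual decline."""
    return (1 - annual_decline) ** (-days_held / 365)

dell = price_multiplier(11)    # ~11 days of inventory
rival = price_multiplier(90)   # ~3 months of inventory
print(f"Dell's materials cost:  ~{dell:.3f}x today's price")
print(f"Rival's materials cost: ~{rival:.3f}x today's price")
print(f"Rival's cost disadvantage: ~{rival / dell - 1:.1%}")
```

Under these assumptions the rival's bill of materials runs roughly 16% higher than Dell's, a decisive gap in a low-margin business.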

But Dell admits that the direct model doesn't guarantee success; it can't be put on autopilot and be expected to work. With such a tightly coordinated value chain, he claims, execution is the critical element of success. Dell Computer's future success will depend on its people recognizing value shifts in the industry and implementing and executing them through the direct model.


Mike Lucas, Standard costing and its role in today's manufacturing environment, Management Accounting, April 1997, pp. 32-34.

David Metzner  Expected graduation: May 1999

Mike Lucas, in his article "Standard costing and its role in today's manufacturing environment," recognizes and refutes arguments that standard costing variance analysis should not be used to control costs and evaluate performance in today's manufacturing environments. Opponents of standard costing argue that variance analysis tends to induce behaviors incongruous with the manufacturing strategies and objectives companies must pursue to sustain competitive advantage and succeed in the global market.

The Case against Standard Costing

Today’s manufacturing companies are likely to have strategic objectives such as reducing manufacturing lead times, inventories, and unit costs while increasing quality and customer service. In order to achieve these objectives, many companies have implemented strategies such as JIT, advanced manufacturing technology, and continuous improvement programs. Opponents believe that standard costing encourages dysfunctional behavior because it focuses on cost reductions that are inconsistent with these new strategies.

For example, producing in smaller batches increases unfavorable labor efficiency variances, because more labor time is spent on set-ups and the standard hours allowed for output therefore fall short of the actual hours worked. Reducing batch sizes also produces an unfavorable fixed overhead volume variance, because the fixed overhead is spread over fewer units. Inventory reduction means ordering smaller quantities of raw materials, which may increase the materials price variance if purchasing managers traditionally buy in bulk to capture volume discounts. Managers trying to minimize unfavorable variances can thus be led into poor strategic decisions.
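With hypothetical figures (not from the article), the mechanics look like this: the same strategically sound move to smaller batches shows up as unfavorable variances.

```python
# Hypothetical figures: a JIT move to smaller batches generates
# unfavorable variances under a standard costing system.

fixed_overhead_budget = 50_000
denominator_volume = 10_000                 # units used to set the rate
fixed_oh_rate = fixed_overhead_budget / denominator_volume  # 5.00/unit

# Smaller batches mean less output absorbs the fixed overhead.
actual_output = 8_000
volume_variance = (denominator_volume - actual_output) * fixed_oh_rate
print(f"Fixed overhead volume variance: {volume_variance:,.0f} U")

# Frequent changeovers add set-up hours on top of standard run time.
std_hours_per_unit = 0.5
labor_rate = 20.0
extra_setup_hours = 400
actual_hours = actual_output * std_hours_per_unit + extra_setup_hours
labor_efficiency_variance = (
    actual_hours - actual_output * std_hours_per_unit) * labor_rate
print(f"Labor efficiency variance: {labor_efficiency_variance:,.0f} U")
```

Both variances come out "unfavorable" even though nothing has gone wrong operationally, which is exactly the dysfunctional signal the opponents describe.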

Standard costing emphasizes cost control, and consequently, in a TQM environment, managers may be pushed to pursue cost reduction at the expense of quality. In a continuous improvement environment, standard costing encourages adherence to static standards rather than continuous improvement of the cost structure that enables a firm to compete effectively in today's intensely competitive climate.

The Case for Standard Costing

Lucas suggests several modifications that enable standard costing to play a major role in cost management and performance evaluation. He proposes that companies tailor their standard costing systems to their competitive situations in order to manage costs effectively.

Developing counter-balances that discourage long production runs, or excluding set-up times from labor efficiency variances, would let companies manage costs while staying focused on their JIT systems. One solution to the problems associated with fixed overhead volume variances is a system based on marginal costing rather than standard absorption costing. To discourage dysfunctional purchasing behavior, Lucas recommends tying materials price variances to other indicators, including inventory levels, to create a balanced performance measurement.

Opponents to standard costing argue that focusing on cost control will likely cause reductions in quality. However, a quality focus does not justify the abandonment of cost management. In a quality-focused environment, managers should be measured on a range of indicators including cost, quality, inventory levels, and lead times.


Instead of abandoning standard costing, companies are keeping the cost control methodology inherent in standard costing and adapting their systems and performance measurements to incorporate the changing production environment in today’s global economy.


Hubbell, William Jr., Combining Economic Value Added and Activity-Based Management, Journal of Cost Management (Spring 1997), pp. 18-29

Sopeerat Boonchayaanant, Class of 1999

Creating economic value for shareholders is the basic purpose of any for-profit company. One of the best measures of shareholder economic value is economic value added (EVA), defined as revenues less all costs associated with producing those revenues, including the cost of the capital employed. The article also discusses the relationship between EVA and activity-based costing (ABC), one way to measure, track, and improve the management of costs.

Measuring shareholder value

A company’s EVA is simply its operating profits after tax, less a charge for the capital used in creating the profits.

Capital charge.

The capital charge is a construct of financial economists, not accountants. It is calculated by multiplying the cost of capital by the net assets employed (defined as net working capital plus net fixed capital). Note that the cost of capital includes the cost of both debt and equity. The cost of equity is best described as the rate of return investors require to compensate them for the risk inherent in the investment; it is thus an opportunity cost rather than a true cash cost.
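A minimal numeric sketch of the capital charge, using the textbook weighted-average cost of capital and hypothetical figures (the article gives no numbers for this step):

```python
# Hypothetical sketch: capital charge = cost of capital x net assets,
# where the cost of capital blends debt and equity (textbook WACC).

def wacc(equity, debt, cost_equity, cost_debt, tax_rate):
    """Weighted-average cost of capital; debt interest is tax-deductible."""
    total = equity + debt
    return (equity / total) * cost_equity \
        + (debt / total) * cost_debt * (1 - tax_rate)

c = wacc(equity=600, debt=400, cost_equity=0.14, cost_debt=0.08, tax_rate=0.35)
net_assets = 1_000   # net working capital + net fixed capital
capital_charge = c * net_assets
print(f"Cost of capital: {c:.2%}; capital charge: {capital_charge:.1f}")
```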

Exhibit: Equivalency of Net Present Value and Economic Value Added

Investment = 1,000; cost of capital (c) = 10%

                                                 Year 1   Year 2   Year 3   Year 4
Discounted Cash Flow
  Operating profit after tax (OPAT)                 250      250      250      250
  Present value of OPAT                           2,500
  Less: Investment                                1,000
  Net present value (NPV)                         1,500

Discounted Economic Value Added
  Operating profit after tax                        250      250      250      250
  Capital                                         1,000    1,000    1,000    1,000
  Cost of capital                                   10%      10%      10%      10%
  Capital charge = Capital x cost of capital        100      100      100      100
  Economic value added = OPAT - capital charge      150      150      150      150
  Present value of EVA                            1,500

(The present values treat OPAT as a perpetuity: 250 / 0.10 = 2,500 and 150 / 0.10 = 1,500; years 1-4 are shown for illustration.)

The value of any asset is the present value of all future cash flows inherent to that asset. NPV is calculated by discounting the project’s future cash flows at the cost of capital, then subtracting the initial investment. It is easy to demonstrate that discounting EVA yields the same answer as discounting cash flows, or the NPV approach (see exhibit).
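The exhibit's numbers can be reproduced in a few lines, treating the 250-per-year OPAT as a perpetuity (which is how a present value of 2,500 at a 10% cost of capital arises):

```python
# Reproduce the exhibit: NPV and discounted EVA give the same answer
# when OPAT of 250 is treated as a perpetuity at a 10% cost of capital.

investment = 1_000
opat = 250
c = 0.10

# NPV approach: discount the cash flows, then subtract the investment.
pv_opat = opat / c                 # 2,500
npv = pv_opat - investment         # 1,500

# EVA approach: discount EVA directly (no separate investment step,
# because the capital charge already prices the capital employed).
capital_charge = c * investment    # 100 per year
eva = opat - capital_charge        # 150 per year
pv_eva = eva / c                   # 1,500

assert abs(npv - pv_eva) < 1e-9
print(f"NPV = {npv:,.0f}; PV of EVA = {pv_eva:,.0f}")
```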

Benefit of EVA

Because changes in market value (i.e., changes in stock price) are most closely related to changes in EVA, EVA is a stronger predictor of stock prices than such popular measures as return on equity. EVA also provides managers with the most rigorous measure of earnings performance. It can therefore be used to align performance planning, measurement, and reward programs with the interests of shareowners. Managers are encouraged to maximize the long-term return on investment, improve the productivity of assets, and minimize the weighted-average cost of capital. EVA makes managers act like shareholders.

Linking cost accounting with shareholder value

Activity-based costing (ABC) and activity-based management (ABM) are tools for tracking activity costs, improving the accuracy of product costing, and reporting critical financial and non-financial performance measures. Traditional ABC and ABM systems, however, fail to account for the full cost of capital.

Objectives. The major objectives of combining EVA with ABM are to identify where EVA is being created (e.g., in which product categories, geographic areas, or customer segments), to identify which processes and activities improve or increase EVA, and to link operational plans and budgets to strategies for increasing EVA.

Revising ABC

ABC can be amended to include capital charges by adding them to the operating expenses drawn from the general ledger. The capital charges are calculated from the balance sheet and the cost of capital. In addition to cost drivers, capital drivers must be used. Finally, all costs, both operating expenses and capital charges, are traced to cost objects. As a result, it becomes apparent which products and customers contribute to EVA, not just to operating profit.
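The mechanics might look like the following sketch (the products, drivers, and amounts are hypothetical, not the article's):

```python
# Hypothetical sketch: once capital employed is traced to products
# alongside operating costs, EVA by product becomes visible.

cost_of_capital = 0.10

# Operating costs traced via cost drivers; capital via capital drivers.
products = {
    "A": {"revenue": 900, "operating_cost": 600, "capital_employed": 2_000},
    "B": {"revenue": 500, "operating_cost": 350, "capital_employed": 400},
}

for name, p in products.items():
    operating_profit = p["revenue"] - p["operating_cost"]
    capital_charge = cost_of_capital * p["capital_employed"]
    eva = operating_profit - capital_charge
    print(f"Product {name}: operating profit {operating_profit:.0f}, "
          f"EVA {eva:.0f}")
```

Product A earns the larger operating profit (300 vs. 150) but ties up so much capital that product B actually creates more EVA (110 vs. 100), exactly the kind of insight a profit-only ABC view hides.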


Colin Drury and Mike Tayles, Cost System Design for Enhancing Profitability, Management Accounting, January 1998, pp. 40-42

Reviewed by Sheila Sullivan, Class of 1998

The accuracy of reported product costs is essential for managers who use cost system information to make pricing decisions and to decide whether a particular product line should be continued or dropped. Activity-based costing (ABC) emerged to help with decisions where the number of products is large and decisions are therefore not independent. Proponents of ABC argue that traditional product-costing systems are obsolete and should be replaced. The authors contend that there is no right or wrong system to use; the system should be chosen to fit the circumstances.

For stock valuation in manufacturing companies, a simplistic product-costing system may be adequate. For product pricing decisions, however, managers may need long-run product costs. Overall the argument leans toward sophisticated systems, since undercosting can lead to the acceptance of unprofitable business. In addition, companies with high overhead should opt for sophisticated systems to avoid distorted product costs.

Where product introduction and abandonment decisions are infrequent (a small number of products in the line), a cost-accumulation system that assigns only direct costs to products is most useful, because product decisions can more plausibly be treated as independent.

Cost system design is an important component of an organization's management accounting system. Through a broader, questionnaire-based examination of the content and use of cost systems in supporting management decisions, the authors hope to determine whether the need to establish stock values influences cost system design in manufacturing organizations. It should then also be possible to gain insight into what cost information companies use to assess the profitability of their product lines.



Cooper, Robin and Regine Slagmulder, "Strategic Cost Management," Management Accounting, Vol. 89, Issue 8 (New York: February 1998), pp. 16-18.

by Derrick Mickle, '98


The objective of strategic cost management is to reduce costs while simultaneously strengthening the strategic position of the firm. Traditional cost systems (Figure 1) determine the cost of products only. Other potential cost objects (like suppliers and customers) are treated either as general overhead and arbitrarily allocated to products, or as period costs and assigned directly to the income statement. These nonmanufacturing costs become difficult to manage effectively because the underlying reasons for their occurrence are masked by the way the firm's cost system treats them.

Strategic cost management assigns costs to suppliers and customers as well as to products, so that a firm can begin to manage these costs strategically. Costs are allocated causally using activity-based costing methods, chosen principally for their ability to assign costs in a causal manner to a broad range of cost objects. Costs are allocated in three areas: supplier costs, product costs, and customer costs.


Without the proper assignment of procurement costs, purchasing managers typically select suppliers based on product price, which can lead to suboptimal buying behaviors that can weaken a firm's strategic position. Strategic cost management resolves the conflict in two ways: first, by taking a broader view of component costs and, second, by assigning procurement costs to products causally.

In the next step of the process, supplier costs are assigned causally to products using activity-based principles to get product costs. Now products are assigned their specific procurement costs, not the average for all products. Consequently, a more accurate view of product profitability is generated and better insights into the design of new products are provided.

As for customer costs, traditional cost systems treat SG&A expenses as period costs that are expensed to the income statement, producing a distorted view of the cost of serving customers: customers appear to cost nothing to serve, or they all appear to cost the same percentage of their sales revenue. If SG&A expenses are significant, this can weaken the firm strategically. Strategic cost management provides a more balanced view of customer profitability by assigning customer-related costs, using activity-based principles, to the customers that cause them. For example, customers who order in small, unpredictable quantities and require considerable post-sales support will be seen to be more costly than customers who order in large, predictable quantities and require little or no support. A more accurate view of customer cost (and profitability) is generated, giving the firm a means to increase the ratio of profitable to unprofitable customers.
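A sketch of the idea with hypothetical activity rates and customers (not the article's data):

```python
# Hypothetical sketch: assign customer-related costs causally via
# activity rates instead of spreading SG&A evenly over revenue.

activity_rates = {"order": 50.0, "support_call": 80.0}  # cost per event

customers = {
    "steady":  {"revenue": 10_000, "orders": 4,  "support_calls": 2},
    "erratic": {"revenue": 10_000, "orders": 40, "support_calls": 25},
}

for name, cust in customers.items():
    cost_to_serve = (cust["orders"] * activity_rates["order"]
                     + cust["support_calls"] * activity_rates["support_call"])
    print(f"{name}: cost to serve {cost_to_serve:,.0f} "
          f"({cost_to_serve / cust['revenue']:.1%} of revenue)")
```

Both customers generate identical revenue, but the erratic one consumes roughly ten times the cost to serve, a difference that an even SG&A spread would hide entirely.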


Ellram, Lisa M. and Ed Feitzinger, "Using total profit analysis to model supply chain decisions," Journal of Cost Management (July/August 1997), pp. 12-21.


The concepts of supply chain management are meant to deliver the highest revenues at the lowest costs. In practice, however, companies focus mainly on the cost side of the profit equation. Most companies have used Transaction Cost Analysis (TCA) to decide whether or not to outsource, and in most analyses under this method customer service is treated as a constraint rather than a variable, because the impact of service on revenue is poorly understood.

Total Profit Analysis (TPA) can be a viable practical alternative to TCA. TPA is a procedure that considers the cost and revenue implications of organizational decisions simultaneously; it inherently recognizes that decisions made to reduce total cost may also affect revenues. A systems approach helps in understanding the underlying concepts of TPA: the world, and the subsystems within it, are seen as mutually dependent units that interact so that the activities and outcomes of one affect, directly or indirectly, the activities and outcomes of all others in the system. Applied to supply chain management, this means that any change in the activities or performance of one supply chain member, or any change in supply chain membership, will have an effect throughout the entire supply chain.

Porter applies these concepts to an organization at three levels. The first is the internal relationship among activities (e.g., reusing an existing component in a new product development project reduces development cycle time). The second is also internal: the way separate divisions within the same company share value-adding or strategic activities (e.g., one distribution department serving three product lines). The third is the way the organization interfaces with the market, i.e., the supply chains of its customers and suppliers.

Although TPA seems an ideal way to help managers answer supply chain management questions, several factors limit its use: internal and external data access, the number of factors to measure, the static nature of the analysis, the ability to compare alternatives, and customer service and revenue issues. To reduce these limitations, managers should use TPA techniques only for questions that are critical to the company. They should also apply Pareto's 80/20 rule to cut the time spent collecting information (e.g., the supply chain analysis team should not collect or analyze data it believes would not change, or would have minimal impact on, the outcomes of the different alternatives).


B. Douglas Clinton, CPA, and Ko-Cheng Hsu, CPA, "JIT and the Balanced Scorecard: Linking Manufacturing Control to Management Control," Management Accounting, September 1997, pp. 18-24.

Allison Miazga, Owen Class of 1998

The ideas below are those of the authors cited above.

This article proposes linking Just-in-Time (JIT) philosophy, used to control manufacturing, with the Balanced Scorecard concept, used to help management control overall operations. JIT is a set of manufacturing techniques and concepts or a philosophy of doing business that minimizes inventory levels. The Balanced Scorecard concept measures corporate performance in the four categories of financial, customer, internal business processes, and innovation and learning. The article states that a company that changes the manufacturing process to JIT without changing the management control system can create an incongruent state that results in inconsistent performance evaluation and dysfunctional behavior. The Balanced Scorecard is recommended as a useful tool in systematizing the management control system to accommodate radical changes in activities that are brought on by implementation of a JIT manufacturing system.

When comparing JIT manufacturing methods with traditional manufacturing methods, the authors identify typical differences in three areas. First, JIT focuses more on the manufacturing processes; flexibility is considered more important than automation and efficiency. Second, JIT promotes a stable workforce trained to perform various production duties rather than a single specialized task. Third, the supplier base used with JIT is much smaller than with traditional systems and often consists of certified vendors. These three areas highlight differences in manufacturing that require different treatment in the management control system to be effective. The manufacturing control factors are presented in three areas: 1) process, 2) workforce, and 3) suppliers.

The article stresses that each company must develop its own metrics for an Integrated Balanced Scorecard based on its individual strategy. The metrics should be linked to strategic objectives and linked together so that they reinforce one another. The time horizon attached to each metric reflects the importance of the time dimension in assessing how each metric meets strategic objectives and how the metrics link together. Time horizons are short-term specific (providing a clear cause-and-effect relationship that guides management action), intermediate indicators (which can be linked back to specific short-term drivers), or long-term validators (which reflect long-term expectations, are broader in scope and more fundamental to overall company goals, and are often financial measures). Studying the linkages between the metrics reveals how each relates to the big-picture strategy and provides a clear map for evaluating particular management actions. Additionally, a company can conduct a value-chain analysis to find the areas where management actions are most likely to achieve results.

According to the article, the point of using the Integrated Balanced Scorecard as a management tool is not to adopt a specific set of metrics by cloning them from a particular list. Rather, it is to analyze each of these components and consider how they link to strategy and support a meaningful continuous improvement and assessment effort. Properly matching the attributes of manufacturing control with management control is necessary to avoid the dysfunctional results brought on by sweeping changes. The Balanced Scorecard provides a context for conducting activity and measurement analysis, linking activities to the value chain, time-phasing each metric for proper interpretation, and linking the elements together in an integrated and useful manner.
