
All of business is a trade-off of one kind or another. For example, carrying enough inventory to always be able to respond quickly to unexpected demand surges carries a cost that, in today’s economy, may simply be unacceptable. The trick, therefore, is to find the optimal balance between multiple objectives, which requires identifying trade-offs. According to news from IBM, understanding those trade-offs is about to get much easier.

 

An article that ran in Supply Chain Digest reports that IBM recently announced what it calls a substantial breakthrough in supply chain optimization technology: so-called “multi-objective” optimization capabilities that will enable companies to better understand trade-offs between different objectives, such as cost and service. According to Dr. Michael Watson of IBM, the multi-objective capability enables users to optimize across two or more objectives at the same time.

 

Executives know that when they make decisions in the supply chain, the issue isn’t cost, service, or capital investment as isolated elements; it’s the combination of those factors that drives their decisions, Watson says. What’s now possible, using new mathematical optimization techniques, is the ability to analyze these different objectives at the same time.

 

These capabilities were announced as part of a number of new features in the latest release of IBM’s LogicNet Plus XE, a network optimization tool. While the capability is first being rolled out in that tool, the Supply Chain Digest article reports that it is expected to also appear in IBM’s inventory optimization and factory scheduling software, as well as in its optimization tool kit, CPLEX. CPLEX is embedded in supply chain planning solutions offered by other software suppliers and is also used by individual companies to solve specialized supply chain optimization problems.

 

The problem in the past, Watson says, was that companies had to run a single-objective optimization and then manually run scenarios with a few data points on different service levels. Besides being a manual, time-consuming process, it produces a very incomplete curve, one that risks missing important step changes or other insights between the few data points selected.

 

However, using the new technology, optimization software can now build a trade-off curve that shows how different objectives play against each other, Watson says. For example, these trade-off curves can allow companies to see areas where there might be a big step jump between one part of the curve and an adjacent part, or easily identify where there might be an exponential type rise, such as in an objective like cost as it approaches a 100 percent service level, he said.
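
To make the mechanics concrete, here is a minimal, hypothetical sketch of how such a trade-off curve can be assembled: generate candidate network designs (through whatever optimization runs or scenarios are available), then keep only the Pareto-optimal ones, meaning those for which no alternative is both cheaper and better-serving. The design names and figures below are invented for illustration; this is a generic Python sketch, not IBM’s LogicNet Plus XE or CPLEX functionality.

# Hypothetical illustration of a cost-vs-service trade-off (Pareto) curve.
# The candidate designs and their numbers are made up for this example.
from dataclasses import dataclass

@dataclass
class Design:
    name: str
    cost: float      # annual operating cost, in $ millions (illustrative)
    service: float   # fraction of demand served within the target lead time

candidates = [
    Design("3 DCs", 42.0, 0.910),
    Design("4 DCs", 45.5, 0.950),
    Design("5 DCs", 51.0, 0.970),
    Design("6 DCs", 63.0, 0.985),
    Design("7 DCs", 88.0, 0.995),               # steep cost rise near 100% service
    Design("4 DCs, slower mode", 44.0, 0.900),  # dominated: costs more, serves less than "3 DCs"
]

def pareto_frontier(designs):
    """Keep only designs that no other design beats on both cost and service."""
    frontier = []
    for d in designs:
        dominated = any(
            o.cost <= d.cost and o.service >= d.service
            and (o.cost < d.cost or o.service > d.service)
            for o in designs
        )
        if not dominated:
            frontier.append(d)
    return sorted(frontier, key=lambda d: d.cost)

for d in pareto_frontier(candidates):
    print(f"{d.name:>18}  cost ${d.cost:5.1f}M  service {d.service:.1%}")

Printing the frontier makes it easy to spot the step jumps and the steep cost rise near a 100 percent service level that Watson describes, instead of inferring them from a handful of manual scenarios.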

 

While there are many potential uses, one example is to compare the total cost of a supply chain network versus the capital investment required to get to that point, Watson says. A network that is slightly less than optimal in terms of operating costs may, in some cases, require substantially less capital investment, which would make it the best total decision. Or, Watson says, a user could look at what the optimal network might be given different levels of capital investment ($10 million versus $20 million versus $30 million, and so on).

 

In the end, I’m intrigued by the possibilities as well as the potential impact for users. And while improved optimization engine capabilities are one thing, adding the use of cloud computing to the equation could really make things interesting. I’m looking forward to hearing more on the story as the technology develops. Are you?

Did you know October is National Cybersecurity Awareness Month? I’ll also ask a follow-up question: Are you confident about cybersecurity in your supply chain?

 

I ask because I’ve been thinking about some recent articles and news regarding the growing threat to supply chain cybersecurity. In fact, the U.S. Department of Homeland Security reports that the growing number of attacks on U.S. cyber networks has become, in President Obama’s words, “one of the most serious economic and national security threats our nation faces.” Now, some of you may wonder whether the situation really is so severe. The answer, says Dennis Omanoff, senior vice president and chief supply chain officer at computer security solution provider McAfee, is yes.

 

In a recent article that ran in Supply Chain Management Review, Omanoff said that more than natural disasters, financial instability, or political upheavals, what keeps him awake at night is the fear that “bad guys are injecting bad stuff into products” that can disrupt, bring down, or steal confidential information from networks.

 

That’s partly because concerns continue to rise about the “injection of viruses” into high-tech hardware products during their journey from manufacturing sources to customer delivery, especially to government agencies, he added. But it’s also because the threat itself keeps growing. Omanoff explains that McAfee reviews about 100,000 potential malware samples per day; identifies more than 55,000 new, unique pieces of malware per day; and identifies about 2,000,000 new malicious web sites per month.

 

Omanoff’s comments remind me of a Marketwire story that ran last summer, which reported that according to a survey of U.S. IT and IT security professionals, the threat from cyber attacks today is nearing statistical certainty and businesses of every type and size are vulnerable to attacks. While the survey, conducted independently by Ponemon Institute and sponsored by Juniper Networks, found that 90 percent of businesses suffered cyber security breaches at least once during the past 12 months, what’s more alarming is that more than half of the respondents report that their companies have experienced multiple breaches during the past 12 months.

 

The financial consequences have, of course, been significant. Overall, respondents indicated that security breaches have cost their companies at least half a million dollars to address in terms of cash outlays, business disruption, revenue losses, internal labor, overhead, and other expenses. Furthermore, most respondents (59 percent) report that the most severe consequence of any breach was the theft of information assets, followed closely by business disruption.

 

One of the more interesting aspects of all this is just how the attacks take place. I was interested to see in Juniper’s study that, according to survey participants, security breaches most often occur at off-site locations, though the origin is often not known. Mobile devices and outsourcing to third parties or business partners seem to be putting organizations at the most risk for a security breach. In fact, 28 percent of the respondents say the breaches occurred remotely, and 27 percent say they occurred at a third-party or business-partner location.

 

So, what can be done? Omanoff from McAfee says that to counter the threat, supply chain professionals charged with manufacturing and delivery processes must look beyond traditional supply chain threats such as tsunamis, demand volatility, or financial degradation, and take extra precautions to ensure that technology products in particular are safeguarded from viral attacks. He goes on to say that supply chain managers must be vigilant when it comes to resisting cyber crime and cyber terrorism, and remember that it takes a “preemptive” strategy to ensure against future violations.

 

Considering the proliferation of smartphones and laptops in the workplace, and companies allowing employees to use their personal devices for work, it certainly seems cybersecurity threats will likewise continue to grow. What do you think? Has your company—and its business partners and suppliers—addressed the threat?

Earlier this week, I saw an article in Pharmaceutical Commerce noting that the shift of companies’ primary marketing target from physicians to centralized healthcare management, whether conducted by healthcare providers themselves or by governments, is well established. However, the consequences of this transformation are growing, according to the results of a survey conducted by Cegedim Relationship Management. Indeed, according to its research, “changing commercial models” is the top concern for U.S. pharmaceutical executives, which means it is now a larger concern than pipeline development for new products.

 

According to news from Cegedim, a provider of solutions for life sciences companies, the results from its study show that 87 percent of U.S. bio-pharmaceutical stakeholders list the changing business model as the biggest issue “keeping them awake at night.” According to the report, 2011 Pharma Insights, after the changing commercial business model, the next two pain points cited by respondents are the impact of impending regulatory reform (cited by 82 percent) and pipeline growth (cited by 76 percent).

 

In response, survey participants say they are making changes to their commercial business model. For example, 73 percent of respondents plan to increase focus on market access strategies, 63 percent plan to increase focus on managed markets, and 59 percent will realign their primary sales force.

 

What’s interesting is that, also earlier this week, medical products giant Abbott Laboratories announced it will spin off its branded drug business from its more diversified medical products company. So on the one hand, there will be the diversified medical products company, which has approximately $22 billion in annual revenue. That company, which will retain the Abbott name, will consist of Abbott’s existing diversified medical products portfolio, including its branded generic pharmaceutical, devices, diagnostics, and nutritional businesses. It will seek geographic expansion, particularly in high-growth emerging markets.

 

The second company, a research-based pharmaceutical company to be named later, has nearly $18 billion in annual revenue. It will include Abbott’s current portfolio of proprietary pharmaceuticals and biologics, among them the company’s blockbuster anti-inflammatory drug Humira as well as other well-known brands such as Lupron, Synagis, Kaletra, Creon, and Synthroid. Its pipeline covers specialty therapeutic areas such as hepatitis C, immunology, chronic kidney disease, women’s health, oncology, and neuroscience. This company is expected to generate the majority of its revenue from developed markets.

 

Speaking about the move in an IndustryWeek article that ran mid-week, Venkat Rajan, medical devices industry manager at advisory firm Frost & Sullivan, said the medical device industry seems to be undergoing a significant shift. A common challenge for large healthcare conglomerates that develop both pharmaceutical and medical products is figuring out how to draw attention to successful high-growth practices that might be marginalized when looking at the company as a whole, Rajan says. So even though they sell into similar care settings, target similar end-users, and address similar disease states, the business models for medical products and pharmaceuticals are so different that the split should be a boon for both companies, he says.

 

In many respects, Abbott is doing what other large companies in other industries are also doing. Kraft Foods, Motorola, and Sara Lee, for example, have all recently spun off part of their businesses to further develop distinct business models and concentrate on specific markets. Is your company doing something similar?

To cope with globalization and rising customer demands, competition, and development costs, manufacturers have forged tighter alliances with distributors, resellers, and, perhaps most importantly, suppliers. The result is that suppliers’ performance now plays a critical role in an organization’s performance and, correspondingly, its profitability. The problem, however, is that while it’s increasingly important to manage suppliers, information about supplier performance often resides in disparate systems or, worse, in employees’ heads.

 

With all that in mind, I was interested to see a recent Supply & Demand Chain Executive article about supplier lifecycle management. In that article, the author, Martin Berr-Sorokin, senior vice president and general manager of Supplier Lifecycle Management at Emptoris, explained that typically, at least half of every revenue dollar is spent on goods and services purchased from external suppliers. As companies focus on their core competencies and outsource non-core operations, this percentage has steadily increased. In fact, in industries such as technology and automotive, purchases from external suppliers may account for as much as 80 percent of the total cost of new products, Berr-Sorokin explains.

 

This growing reliance on third parties significantly raises an organization’s supply and pricing risk, and it also increases its exposure to adverse scenarios such as safety issues and lack of regulatory compliance. To continually improve operational performance, manage costs, and reduce regulatory risks, it’s now necessary for a company to first select appropriate suppliers, and then monitor and manage their performance over time, Berr-Sorokin says. The problem, I believe, is that managing suppliers and monitoring their performance is one of those “easier said than done” tasks. In other words: It’s difficult. That’s why so many companies struggle with it—if they even seriously attempt supplier lifecycle management in the first place.

 

But, having said that, I was interested to see some best practices explained in the article that enable companies to consistently measure supplier performance. The first, as the cornerstone of the practice, is to identify metrics, thresholds, and targets; capturing key performance metrics within supplier contracts ensures that all key terms and measures are contractually agreed and visible. Second, companies must gather input from key relationship managers to understand their supplier performance objectives and use that information to establish metrics that are aligned with overall strategy. As Berr-Sorokin notes, these metrics and targets should be shared with suppliers and mutually agreed to, so that both the company and its suppliers can create a meaningful performance management program.

 

Another key capability is to use scorecards, trend reports, and alerts to identify disparities between suppliers’ targets and their actual performance. The purchasing organization should use this information to review the impact of supplier performance gaps on business, prioritize them, and then communicate with the supplier regarding the issues, Berr-Sorokin says. Use of collaborative supplier portals that deliver this information to suppliers will help ensure both parties are up to date on performance.
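
As a simple illustration of the scorecard-and-alert idea, the sketch below compares each supplier’s actual performance against agreed targets and flags the gaps. The metric names, targets, and figures are hypothetical, not drawn from the article or from any particular vendor’s software.

# Hypothetical supplier scorecard: flag gaps between targets and actual performance.
targets = {"on_time_delivery": 0.95, "defect_rate": 0.02, "response_days": 2}
lower_is_better = {"defect_rate", "response_days"}   # for these, exceeding the target is a miss

actuals = {
    "Acme Components": {"on_time_delivery": 0.97, "defect_rate": 0.01, "response_days": 3},
    "Globex Plastics": {"on_time_delivery": 0.88, "defect_rate": 0.04, "response_days": 2},
}

def performance_gaps(actual):
    """Return each missed metric with the size of the miss (actual minus target)."""
    gaps = {}
    for metric, target in targets.items():
        value = actual[metric]
        missed = value > target if metric in lower_is_better else value < target
        if missed:
            gaps[metric] = round(value - target, 3)
    return gaps

for supplier, actual in actuals.items():
    gaps = performance_gaps(actual)
    print(f"{supplier}: " + ("on target" if not gaps else f"ALERT {gaps}"))

This is the kind of view a scorecard, trend report, or supplier portal would surface, so that both parties see the same gaps and can prioritize them together.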

 

As with any significant program, perhaps the most important element is to implement continuous tracking and optimization. Rather than taking a “one-and-done” approach, companies should recognize that supplier performance must be tracked on an ongoing basis, both to ensure that previously identified gaps have been remediated and to keep driving improvement. That way, an organization can continue to make sound supplier decisions, ranging from phasing out a supplier to giving another more business.

 

So while a supplier lifecycle management initiative enables a company to improve operational performance, reduce supplier risk, lower component costs, and improve supply chain efficiency, it does require considerable effort. I believe that work is more than worthwhile. Do you?

Anybody who has ever been to Disney World or Disneyland is familiar with the song “It’s a small world.” But recent news about the worst flooding in 50 years in Thailand reminds me that it truly is a small world—especially when it comes to supply chains—and that disruptions may be felt quickly around the world.

 

The New York Times reports that deforestation, overbuilding, damming and diverting natural waterways, urban sprawl, and the filling-in of canals, combined with poor urban planning and an unusually heavy monsoon season, have created a terrible situation in Thailand. Indeed, the monsoon season this year has brought disaster to Cambodia, the Philippines, and Vietnam as well as Thailand, where 283 people are reported to have died. And as the article further reports, Thai officials are warning that, in the next few days, Bangkok, in the hardest-hit region, could be inundated by a combination of heavy floodwaters from the north, mudslides, unusually high tides, and monsoon rains.

 

The supply chain connection is that Thailand has become a major production and export hub for global auto makers, including Toyota Motor Co., Honda Motor Co., and others that have been forced to close their plants after weeks of worsening flooding overwhelmed a cluster of component plants north of Bangkok. Honda, it appears, has been the hardest hit of the automakers. In fact, as reported by Reuters, Honda’s Thailand plant closure will reduce the company’s global output by 4.7 percent.

 

How bad is it? Cars at the facility in Ayutthaya appear to be floating, Honda spokesman Tomohiro Okada said in the Reuters article. Because no one is allowed into the area, Honda is still unable to assess the damage to production machinery or give any estimate of when output, halted since Oct. 4, can restart, he added. Finally, Honda’s Thai plant supplies parts to other factories in the region, so the damage to that particular plant may affect its supply chain and slow, if not halt, output in other locations unless Honda can redirect parts from elsewhere.

 

Other automakers are struggling as well. While Toyota reports that its three Thai assembly plants have not been damaged by the flooding, it has still closed those plants because they face component shortages. Additionally, Nissan has announced that it too may experience some disruption because the company is uncertain whether its plant can continue to receive components.

 

What’s problematic is that Japanese automakers are still struggling in the aftermath of the earthquake and tsunami that caused catastrophic damage in Japan last spring. Normally, automakers would simply ramp up production in Japan to compensate for the shutdowns in Thailand. But while supply chains have generally been repaired following the Japanese earthquake, Japanese firms have relied on plants in other countries to make up for lost production volume because their Japanese plants are at full capacity to meet post-earthquake recovery demand. With manufacturing now stopped in Thailand, they will have little chance, at least for the time being, to make up for the shortfall.

 

It isn’t just automotive companies that face supply chain disruptions either. As reported in the Reuters story, camera manufacturers are closing plants too. Sony announced it has closed its sole camera factory that produces bodies for its interchangeable lens cameras, and Nikon announced it closed its sole factory that makes digital single-lens reflex cameras. Electronics manufacturers now face a setback as well, the story also notes. For example, Canon was forced to close a printer plant, and Pioneer announced that two of its production sites had been partially flooded, forcing it to halt production of vehicle navigation systems. Furthermore, Pioneer spokespeople don’t know when the company will be able to reopen the plants.

 

Considering the global nature of today’s market, it seems only a matter of time until the ramifications of these plant closings begin to be felt elsewhere in the world.

 

More importantly, our hearts go out to those in Thailand and other countries suffering loss.

A recent Chicago Tribune article offered a mixed bag of news regarding how mid-size manufacturers view the economy and, correspondingly, the future. The article reports on the findings of a new survey by accounting and consulting firm Deloitte, which found that mid-size companies in the U.S. plan to maintain or increase their long-term investments despite executives’ uncertainty about the economy and reservations about hiring.

 

Deloitte hired a market researcher that polled 696 executives of mid-size U.S. companies. The participating companies had annual revenues between $50 million and $1 billion, and two-thirds of them were privately held. According to survey findings, three out of four companies plan to maintain or increase their levels of long-term investment. Furthermore, executives at the companies are pursuing these plans even though 64 percent of the respondents noted that factors such as taxes, regulations, and credit availability are more uncertain than normal.

 

What I found interesting was the survey respondents’ perspective on hiring. On a positive note, 44 percent of the respondents said they expected to increase their U.S. workforce during the next 12 months, and nearly 60 percent said they would conduct “strategic hiring in critical areas” if they can increase productivity. On the other hand, while unemployment continues to hover just over nine percent, it apparently isn’t as easy to find skilled workers as it may seem. Indeed, nearly half of the survey’s respondents noted that they have difficulty finding “employees with the skills and education to become productive immediately.”

 

This reminds me of the talk surrounding the Skills for America’s Future program. If you recall, last summer President Obama endorsed the creation of a national manufacturing skills certification system, which, ultimately, should help manufacturers across all industries. The system is a plan by the National Association of Manufacturers’ Manufacturing Institute to train and certify some 500,000 community college students in skills considered critical to manufacturing operations. It was explained then that the Manufacturing Institute will work with the president’s Skills for America’s Future program to implement the system, which will provide certification through competency-based education and training.

 

More recently, I saw that, at a meeting of the President’s Economic Recovery Advisory Board (PERAB), President Obama announced the launch of the Skills for America’s Future program. President Obama said that the plan calls for community colleges and employers to jointly create programs that match curricula in the classroom with “the needs of the boardroom.” Skills for America’s Future will, the President also said, help connect more employers, schools, and other job training providers, and help them share knowledge about which practices work best. So essentially, the goal is to ensure there are strong partnerships between growing industries and community college or training programs in every state.

 

There were also announcements of, I believe, some important new partners. PG&E, for example, will commit over the next three years to expanding its energy job training program, with an emphasis on four areas: clean-tech vehicles, energy efficiency and renewables, smart grid, and skilled crafts. United Technologies will collaborate with other employers to replicate its employee scholar and apprenticeship programs in advanced manufacturing, which have resulted in over 30,000 degrees earned over the past 15 years. Finally, Accenture announced it will work with other employers and community colleges to expand the reach of its pathways programs, which prepare students with skills for their first job across industries.

 

All of this won’t do much for those companies in Deloitte’s survey that are searching for skilled employees now—or even in the next six months. But I do believe that ultimately, the program will pay dividends for manufacturers across all industries.

 

What do you think?

Former undisputed boxing champion Mike Tyson once famously observed, “Everyone has a plan until they get punched in the mouth.” I’m reminded of that quote from time to time when talking to manufacturers because, while most have a plan for supply chain disruptions, the reality for many is that their plan goes out the window quickly when things go awry.

 

It doesn’t need to be that way, however. In fact, in a recent SupplyChainBrain article, Mark David, supply chain solution principal with SAP AG, says that companies today strive to do more with less—especially when it comes to inventory—but they also are aware of critical issues such as globalization, outsourcing, and intensified customer demands, and are determined to be flexible enough to react to unexpected changes in buying patterns.

 

In the past, says David, processes applied to response management have generally been supply-centric. Today, however, response management has become an integral part of risk management, and a complete program must look at inventory throughout the network. For example, suppliers need to be able to take an order and re-prioritize it using existing pools of inventory, wherever that inventory may be. And by reallocating inventory on the fly, David says, supply chain managers can actually help to boost revenues.
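
A minimal sketch of the “reallocate inventory on the fly” idea might look like the following: fill a surge order from whatever pools of stock exist across the network, drawing from the cheapest-to-serve locations first. The locations, costs, and quantities are invented, and this is a generic greedy illustration rather than anything specific to SAP’s software.

# Hypothetical example: source a surge order from existing inventory pools.
available = {"Plant A": 400, "Regional DC": 250, "3PL hub": 600}     # units on hand
ship_cost = {"Plant A": 1.20, "Regional DC": 0.40, "3PL hub": 0.75}  # $ per unit to the customer

def allocate(order_qty):
    """Greedily fill the order from the lowest-cost locations that still have stock."""
    plan, remaining = {}, order_qty
    for loc in sorted(available, key=ship_cost.get):
        if remaining == 0:
            break
        take = min(available[loc], remaining)
        if take:
            plan[loc] = take
            remaining -= take
    return plan, remaining   # a positive remainder means the surge can't be fully covered

plan, shortfall = allocate(900)
print(plan, "shortfall:", shortfall)
# {'Regional DC': 250, '3PL hub': 600, 'Plant A': 50} shortfall: 0

In practice the sourcing rules would weigh service commitments and margins as well as cost, but the point is the same: a network-wide view of inventory lets an order be re-prioritized and filled from wherever stock actually sits.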

 

Indeed, David has heard of companies achieving revenue growth of between 1 and 2 percent through the application of better response management. He cites, for example, the case of a consumer electronics company that suddenly experienced a surge in demand from a big retailer. The company was able to work with its supply chain partners to respond quickly to the development.

 

Essentially, then, the key, as speakers pointed out at the recent Supply Chain Council Executive Summit, is to be well prepared. In an article that ran in Supply Chain Management Review, Gary Kilponen, SCC’s event chair, said that in today’s globally connected economy, organizations must be able to respond quickly to market changes while mitigating exposure and risk. That means supply chains have had to become more responsive to environmental and political disruptions, and, increasingly, economic fluctuations as well, he says.

 

That’s just what Cisco Systems has done, and as a result those efforts left the company well prepared when the earthquake and tsunami struck Japan last March. The scale, scope, and rapid evolution of the Japan crisis were unprecedented and were a key test of the preparedness and resiliency of operations across many industries, James B. Steele, program director of Value Chain Risk Management, Global Business Operations, at Cisco Systems, said in the article.

 

Steele says that company leadership had realized that its supply chain could be easily fragmented if the company didn’t take a holistic view of operations. Consequently, Cisco had developed a simulation program that prepared it for the catastrophe.

 

What I found most interesting was Steele’s observation that even the best analytic engine can’t anticipate everything, and that a predictive model can—at best—deliver what he calls, “agnostic resiliency.” That means it’s critical to build resiliency in both design and engineering, he says.

 

I’d like to hear what you think about risk mitigation. Has your company, like Cisco, prepared for a catastrophic event? Furthermore, are you prepared to accommodate a sudden demand surge?

Everyone understands the importance of innovation in today’s business environment. Likewise, they understand just how critical collaboration has become. But what about collaborative innovation? While there certainly are challenging obstacles, do the potential returns and future revenue make it all worthwhile?

 

I ask because I’m intrigued by a recent Inside Supply Management article that was referenced on the SupplyChainBrain website. In the article, Chris Thoen, senior vice president and global head of science and technology at Givaudan Flavors Corp., says the future of competition isn’t going to rely on how well organizations create innovation on their own. In other words, successful companies will be those that are good at discovering innovation, whether it’s happening internally or externally. Companies must be good at creating a comfort zone so external partners will want to work with them on an ongoing basis, says Thoen. The key, he says, is to establish sustainable relationships to repeatedly create new innovations with those partners.

 

That also is what the findings of a 2010 Capgemini study, Collaborating for Innovation, point out. The firm surveyed a variety of manufacturing industries to gauge levels of collaborative innovation with customers and suppliers. The research found that the primary focus for most innovation initiatives remains cost reduction rather than value creation through collaborative partnering. However, to cope with technological and competitive challenges, manufacturing companies must increase the active involvement of their suppliers in the value creation part of the innovation process instead of just relying on them for delivery of defined components, according to the report.

 

In the ISM article, I was interested to see a method for determining which suppliers an organization should pursue for innovation: divide suppliers into groups based on strategic relationships and capabilities, such as Gold, Silver, and Bronze, with each level having specific measures and rewards. Henry Chesbrough, Ph.D., adjunct professor and executive director of the Center for Open Innovation at the University of California, Berkeley, says each group of suppliers receives a scorecard, with Gold suppliers having the broadest and most strategic measures. The scorecard for Silver suppliers would include some, but not all, of the measures and rewards of the Gold suppliers. Finally, the Bronze suppliers’ scorecards would maintain their current relationship with supply management and continue focusing on cost, availability, and incoming quality.

 

In addition to price, on-time delivery, and the quality of what is shipped, a company could also add measures about innovation for suppliers, says Chesbrough. Adding criteria based on the number of new ideas suppliers come up with, how useful those ideas are, and how much value they create or money they save will broaden the scorecard for Gold and Silver suppliers.
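
As a rough illustration of how such a tiered scorecard might be broadened, the sketch below scores a supplier against Bronze, Silver, and Gold measure sets, where the higher tiers add innovation criteria. The weights, metric names, and scores are hypothetical, not Chesbrough’s.

# Hypothetical tiered scorecards: Gold carries the broadest innovation measures.
BRONZE = {"cost": 0.5, "on_time_delivery": 0.3, "incoming_quality": 0.2}
SILVER = {**BRONZE, "new_ideas_submitted": 0.10}
GOLD   = {**BRONZE, "new_ideas_submitted": 0.15, "value_of_ideas_adopted": 0.15}

supplier_metrics = {   # all measures normalized to a 0-1 scale for this example
    "cost": 0.80, "on_time_delivery": 0.95, "incoming_quality": 0.90,
    "new_ideas_submitted": 0.60, "value_of_ideas_adopted": 0.40,
}

def score(metrics, weights):
    """Weighted average over only the measures that apply to the supplier's tier."""
    return sum(metrics.get(m, 0.0) * w for m, w in weights.items()) / sum(weights.values())

for tier, weights in [("Gold", GOLD), ("Silver", SILVER), ("Bronze", BRONZE)]:
    print(f"{tier}: {score(supplier_metrics, weights):.3f}")

The point of the tiers is simply that the innovation measures only enter the calculation for the suppliers whose relationship justifies them, while Bronze suppliers continue to be judged on the traditional basics.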

 

Organizations can then take collaborative segmentation to the next level by conducting periodic reviews with Gold suppliers to determine their future path and any synchronicities that may exist, Chesbrough says. So, for instance, a company could ask: What technology investments, new product development, and capacity outlook do suppliers expect next year? The same openness must come from the customer side as well. By sharing the same investment forecasts and innovation needs, Chesbrough says, all partners can work together to develop solutions that fill those gaps and needs.

 

What do you think? Is this already taking place to a certain degree? If not, what prevents this type of collaboration?