
In the midst of what is essentially a public relations nightmare, and facing increasing scrutiny from U.S. lawmakers, drugmaker Mylan announced it will now sell a generic version of its life-saving allergy treatment EpiPen for half the list price of the brand-name product. The announcement does not seem—at least for now—to have been well received.

 

Mylan has aggressively defended multiple price increases over the years that have resulted in the company listing a two-pack of EpiPens at about $600—up from $100 in 2007. This is important to note because the pens expire, which means people with allergies must purchase a two-pack of EpiPens annually. If they do use a pen, they must then buy a replacement pack immediately.

 

EpiPen has a 94 percent share of the market for epinephrine auto-injectors. Mylan’s opportunity to raise prices came when its sole market competitor, Sanofi, voluntarily recalled its own auto-injector following miscalibration issues. Without that competition, Mylan raised prices several times. Meanwhile, an anticipated generic competitor from Teva Pharmaceutical failed to win U.S. Food and Drug Administration approval, leaving the market mostly to Mylan.

 

The dramatic price increase has drawn considerable notice as well as growing criticism. In an attempt to stem the tide, Mylan announced it would reduce the out-of-pocket costs of its allergy injection for some patients. The list price of the drug will remain the same, but the company said it would raise the maximum benefit of its copay assistance program to $300 from $100 for patients who pay for the EpiPen two-pack in cash or who are covered by a commercial health insurer.

 

That move was met with criticism from 20 U.S. lawmakers, who in a letter to Mylan CEO Heather Bresch, daughter of Senator Joe Manchin of West Virginia, asked the company to spell out its programs for providing some patients with lower-cost EpiPens. Such discount programs are often an “industry tactic to keep costs high through a complex shell game,” the letter said, Reuters reports. “Insurance companies, the government and employers still bear the burden of these excessive prices. In turn, those costs are eventually passed on to consumers in the form of higher premiums,” the senators wrote.

 

Mylan has announced it will also market the generic treatment.

 

In explaining the situation in an interview on CNBC, Bresch didn’t seem to help the company’s case. First she blamed the “broken” health-care industry, then added that “no one’s more frustrated than me.”

 

Many consumers do indeed seem extremely frustrated, as do both Republican and Democratic lawmakers. Leaders of the U.S. House Committee on Oversight and Government Reform have launched an investigation into EpiPen price increases, an article in USA Today reports. Committee chairman Jason Chaffetz, R-Utah, and ranking member Elijah Cummings, D-Md., requested numerous documents from Mylan, including details of EpiPen profits and sales, lobbying data, internal cost figures and federal health reimbursement numbers.

 

Mylan has “a virtual monopoly over the epinephrine auto-injector market,” the lawmakers wrote in a letter to Bresch. “While families and schools are struggling to keep up with your company’s unreasonable price increases, Mylan has profited richly from its pricing strategy.”

 

The committee's letter, which requested a briefing by Sept. 6 and documents by Sept. 12, paves the way for a potential congressional hearing on the matter.

 

In the meantime, the question remains whether consumers will actually pay less for the generic version. If patients don’t see cost savings and are still paying high amounts for EpiPens, Mylan can say, “We lowered the price and this is on insurers and pharmacy benefit managers,” Walid Gellad, who heads the Center for Pharmaceutical Policy and Prescribing at the University of Pittsburgh, says in a Bloomberg article.

 

What are your thoughts on this situation? Are the EpiPen price increases justified? Does the fault indeed lie with what Bresch calls the “broken” health-care industry that, she claims, “incentivizes” higher prices?

The labor dispute that caused massive bottlenecks at the ports of Los Angeles and Long Beach last year not only disrupted the economy but also hurt efforts to clean the air.

 

The good news is that Long Beach’s annual report card on pollution, released earlier this month, shows the air is significantly cleaner than it was in 2005, a Long Beach Press Telegram article reports. However, the port backslid slightly between 2014 and 2015. Port officials say one big reason for the dirtier air is the 2.6 million hours big rigs and massive ships spent idling their diesel engines as they waited in long lines to unload during the work slowdown.

 

“This highlights for us the continuing challenge to make sure we focus on these strategies on how we can make progress and move forward,” said Heather Tomley, director of environmental planning at the Port of Long Beach, in the Press Telegram article.

 

A report prepared by Seacrest Consulting Group highlighted an overall improvement in cutting emissions since 2005, when the port set goals to reduce harmful pollution. Emissions of particulate matter have fallen 84 percent since 2005, and emissions of particle-forming sulfur oxides, another potentially harmful pollutant, have decreased 97 percent over 11 years, according to the report.

 

Port authorities attribute the overall reduction in emissions to a series of federal, state and port-centered initiatives, including incentives for vessel owners to slow down when pulling into the port and shore power that allows ships to plug in instead of idling and sending pollutants into the air.

 

Despite the improvements, the ports of Los Angeles and Long Beach together remain the single largest fixed source of air pollution in Southern California, and the region continually fails to meet federal ozone standards.

 

“The ports are not moving fast enough,” Nidia Erceg, deputy policy director at Coalition for Clean Air, says in the Press Telegram article. She notes that a recent study found more than 2,000 Southern Californians die each year from polluted air.

 

“Those are the numbers that shock me,” Erceg says. “There are opportunities to make change here.”

 

In nearby Los Angeles, Mayor Eric Garcetti has appointed an advisory panel tasked with reducing air pollution from the Port of Los Angeles by expanding the use of zero-emissions technology. The 10-member Sustainable Freight Advisory Board, made up of representatives from industry, environmental groups, labor and air quality agencies, will advise the city-owned port on how to work with manufacturers to develop and deploy cleaner trucks, trains, ships and cargo-handling equipment, the Los Angeles Times reports.

 

The air pollution is “an unacceptable price to pay for a bustling port, but we don’t have to choose between one or the other,” Garcetti said at a news conference at a port terminal. “We can have healthy communities and we can have a healthy port. We can have economic growth and clean air.”

 

Tomley at the Port of Long Beach says the ports have done a lot of work, including getting cleaner-running trucks at the ports. Next year, a state rule will require 70 percent of all container and cruise ships pulling up to the dock to plug in, which should give clean-air efforts a further boost. That said, she notes that the easy work has been done, and technological advances are needed to make further strides.

 

What’s interesting is that as ports along the U.S. Gulf Coast and Eastern Seaboard continue to expand to accommodate traffic expected to move through the expanded Panama Canal, one would expect concerns about air quality to grow as well. Granted, the ports of Los Angeles and Long Beach see more traffic, but continued construction and increasing traffic are bound to have some impact.

 

Whether you are near the ports of Los Angeles or Long Beach, or another port, what are your thoughts on air quality and increasing port traffic?

U.S. intelligence officials plan to provide companies with information, including classified supply chain threat reports, about the risks of hacking and other crimes tied to the supplies and services they buy. The Office of the Director of National Intelligence’s National Counterintelligence and Security Center (NCSC) announced the effort last week and released a video highlighting the threats foreign entities pose to the private sector’s supply chain and to the public sector organizations that use goods and services from the private sector.

 

The video raises awareness of increased risk to supply chains that stems from what NCSC calls a “growing dependence” on globally sourced commercial information and technologies for mission-critical systems and services. The risks are passed to end users through products and services that may contain defective, counterfeit or otherwise tainted components—such as compromised telecommunications equipment. Those threats may come from China, Russia and other governments, as well as criminals, hackers and disgruntled employees who want to steal sensitive information or disrupt operations, NCSC says.

 

“You’d be shocked to find out how many people really don’t know where their stuff comes from,” Bill Evanina, director of NCSC, says in a Bloomberg article. “The supply chain threat is one that’s the least talked about but is the easiest to manipulate for all aspects of our daily lives.”

 

The new threat reports, which may start going out in about two months, will provide intelligence and context behind hacking attacks and other activity, such as whether another country is responsible and the likely motivation, according to NCSC. It’s particularly worth noting, the agency explains, that the Chinese government has previously stolen secrets from U.S. agencies and companies to gain a competitive advantage, while the Russian government seeks to introduce defective parts into U.S. supply chains to disrupt military capabilities.

 

“Often, we get lost in putting the fire out,” Evanina says. “At the end of the day, to stop the fire we have to find out who’s lighting it.”

 

Companies can take many steps to help secure their supply chains, such as doing simple online research into businesses they plan to buy from, working with the FBI and the Department of Homeland Security and adding security requirements to contracts, NCSC explains. The agency’s video also recommends that acquisition and procurement personnel be a full part of a company’s security efforts.

 

“Know where your stuff is coming from,” Evanina says in the Bloomberg article. “You might have the best software and cybersecurity programs, but if you don’t have the same due diligence and understanding of the threat for the people who buy the systems that run your buildings and facilities, you’re running the risk of potential compromise.”

 

The U.S. government has previously accused China and Russia of cyber attacks; however, interest in critical infrastructure security has certainly surged since Ukrainian authorities blamed a power outage on a cyber attack from Russia. What’s more, the Department of Homeland Security (DHS) and the FBI have previously announced that they have seen increasing exploitation of business networks and servers by disgruntled and/or former employees. Some of these cases have resulted in significant FBI investigations in which individuals used their access to destroy data, steal proprietary software, obtain customer information, purchase unauthorized goods and services using customer accounts, and gain a competitive edge at a new company. The cost to businesses of these attacks by disgruntled or former employees ranges from $5,000 to $3 million per attack, according to the FBI.

 

What are your thoughts about supply chain vulnerabilities? Is your organization concerned about counterfeit or defective parts being purposefully introduced into the supply chain? And does your organization have a plan to prevent former employees from exploiting business networks and servers?

This has been an interesting week for makers of autonomous cars and their supply chains, as well as for the ride-sharing market. Not only is the development of self-driving cars accelerating, the technology also seems poised to alter the ride-sharing business model.

 

For starters, Ford announced this week that it intends to have a fully autonomous vehicle for ride sharing in 2021. The company explains that—building on more than a decade of autonomous vehicle research and development—its first fully autonomous vehicle will be a Society of Automotive Engineers-rated level 4-capable vehicle without a steering wheel or gas and brake pedals. The vehicle is being specifically designed for commercial mobility services, such as ride sharing and ride hailing, and will be available in high volumes.

 

Key steps in the initiative are Ford’s investments in and collaborations with several high-tech companies that produce light detection and ranging (LiDAR) sensors and technology for creating high-resolution 3D maps of autonomous vehicle environments. Ford also is aggressively expanding its Silicon Valley operations, creating a dedicated campus in Palo Alto by building near its current Research and Innovation Center.

 

Sweden-based Volvo Cars also announced this week it’s teaming up with ride-sharing service Uber in a $300 million (265 million euro) joint venture to develop driverless automobiles.

 

“Both Uber and Volvo will use the same base vehicle for the next stage of their own autonomous car strategies,” Volvo, owned by China’s Geely, said in a statement. “This will involve Uber adding its own self-developed autonomous driving systems to the Volvo base vehicle.”

 

This, frankly, is where I think the testing of autonomous cars gets interesting. Uber announced this week that, starting later this month, it will allow customers in downtown Pittsburgh to opt into a test program and summon autonomous Ford Fusions and Volvo XC90 SUVs. Because the technology is still being tested, the cars will come with human backup drivers to handle any unexpected situations. As an enticement, the autonomous rides will be free, Uber announced.

 

The use of human backup drivers essentially means Uber is testing the technology and taking people along for the ride, Bryant Walker Smith, a University of South Carolina professor who studies self-driving technology, says in an Associated Press story.

 

“Part of this is marketing in the sense that they’re going to be doing continued research and development of these systems,” Walker Smith says.

 

Timothy Carone, a Notre Dame professor and author of “Future Automation: Changes to Lives and to Businesses,” says in the article that Uber is trying to gain an advantage by putting its cars on the road before competitors. But unlike Tesla, Uber is mitigating the risk with its own drivers.

 

“This is a way to get autonomous cars out there and accepted and increase the adoption rate,” Carone says.

 

Uber rival Lyft has not been overlooked either. Earlier this year, Lyft announced that General Motors had invested $500 million in the company. There are now reports that GM recently made an offer to buy Lyft but was rebuffed.

 

The shift to autonomous vehicles marks a sea change for Uber and Lyft. Until now, they have had no substantial capital investment beyond the servers that run their ride-hailing software. Uber in particular has been adamant in multiple lawsuits that drivers aren’t employees but independent contractors, and that all Uber does is provide the platform connecting drivers and passengers. The vehicles used until now (except for autonomous test vehicles) have been owned by drivers, who are responsible for their purchase, maintenance, insurance and fuel.

 

I am interested to see how the Uber program develops. What are your thoughts? Do you think the initiative will be received well? Would you ride in an autonomous Uber vehicle with a human backup driver?

Although Lean and Six Sigma practices are widely followed, they ultimately are insufficient to address the complexities of modern industrial manufacturing, and companies are instead turning to “smart operations,” according to a new white paper from UPS. These operations combine pervasive data collection, advanced analytics, technology investments and deeper collaboration with partners to prepare their value streams for the next industrial revolution.

 

Indeed, over the next three years, a growing number of successful manufacturers will enhance their manufacturing processes with smart operations, a broader supply chain strategy that extends beyond the factory walls, according to the paper, “The Rise of Smart Operations: Reaching New Levels of Operational Excellence.”

 

While Lean and Six Sigma methods remain the standard for manufacturers, continuous improvement has a downside: overly optimized processes can become inflexible, leaving the business unable to adjust rapidly to supply chain disruptions and changing customer demand. Manufacturers that use smart operations are better positioned than others to compete and thrive in today’s fluctuating markets, the paper notes, because increased visibility into inventory location and transportation allows companies to better analyze and quickly manage changes to their supply chain both upstream and downstream of the factory. Consequently, the use of smart operations “separates manufacturers who thrive from those that merely survive,” according to the paper.

 

“Smart operations are crucial to the long-term success of manufacturing companies,” says Derrick Johnson, vice president of marketing at UPS. “The strategy enables manufacturers with limited resources to serve their increasingly demanding customers more flexibly.”

 

UPS and market research firm IDC surveyed more than 100 manufacturing operations executives and hosted focus group discussions to assess how far along companies are in implementing smart operations. Interestingly, 53 percent of the responding executives said their companies were at a relatively low level of overall maturity. Even so, 47 percent of the respondents said their company’s progress toward smart operations exceeded that of their peers.

 

There are five areas essential to smart operations, the paper explains:

 

  • Connected products: Increasingly, industrial manufacturers sell products that are connected. This connectivity allows companies to offer better maintenance service, which sometimes even generates new revenue streams.
  • Connected assets: Manufacturers with connected assets are better able to monitor their operations to anticipate and even correct problems before they occur.
  • Supply chain decision making: The data and analytic tools used in smart operations help manufacturers resolve issues in the supply chain faster.
  • Buy-side value chain: Smart operations allow manufacturers to automate purchasing with their vendors and manage the inbound transportation of those supplies.
  • Sell-side value chain: Smart operations allow manufacturers to change transportation modes and speeds as well as destinations based on shifting customer demand.

 

At the heart of this business strategy is digital transformation enabled by investments in technology for data collection, advanced analytics and connectivity for products, assets and partners throughout the value chain, UPS says. One executive at a top-tier automotive supplier who responded to the survey said, “We are no longer an automotive company, but a technology company in the automotive business.”

 

The UPS paper also reports that manufacturers increasingly rely on external service providers, freeing themselves to focus on their own key competencies. Companies that have made less progress toward smart operations can take advantage of the technology and process investments their partners have already made.

 

What are your thoughts on either the white paper or smart operations?

Research at the University of Michigan (U-M) focused on artificial intelligence, robotics and autonomous driving will get a major boost thanks to a $22 million commitment from the Toyota Research Institute (TRI). The announcement was made earlier this week by TRI’s CEO Gill Pratt in an address to the U-M faculty.

 

Under the agreement, TRI will provide an initial $22 million over four years for research collaborations with U-M faculty. The effort will be directed by robotics professors Ryan Eustice and Ed Olson, who will retain their part-time faculty positions. The research will focus on enhanced driving safety, partner robotics and indoor mobility, autonomous driving, and student learning and diversity; however, the university will seek proposals from faculty across departments.

 

“Toyota has long enjoyed an excellent working relationship with the University of Michigan, and we are excited to expand our collective efforts to address complex mobility challenges through artificial intelligence,” Pratt said. “We look forward to collaborating with U-M’s research faculty and students to develop new intelligent technologies that will help drivers travel more safely, securely and efficiently. We will also focus on expanding the benefit of mobility technology to in-home support of older persons and those with special needs.”

 

This is the latest step in the emerging public-private effort to establish southeast Michigan and Ann Arbor as a major hub for the development of new modes of mobility and in-home robotics. Last April, TRI announced the establishment of its Ann Arbor research facility (TRI-ANN) and the hiring of U-M robotics professors Eustice and Olson to support autonomous vehicle research. TRI-ANN is the third TRI facility, joining TRI offices in Palo Alto, near Stanford, and in Cambridge, near MIT.

 

Toyota, along with General Motors, Ford, Nissan and Honda, is also a founding partner of U-M’s Mobility Transformation Center (MTC), an interdisciplinary public-private research and development initiative that is developing, as Toyota says, “the foundation for a commercially viable ecosystem of connected and automated vehicles.” MTC also oversees Mcity, a 32-acre “mini-city” on U-M’s North Campus where researchers can test emerging vehicle technologies—such as autonomous vehicles—rapidly and rigorously in a safe, controlled environment.

 

“Our labs at U-M push the envelope of what robots can sense and understand about the world, and TRI provides an opportunity to apply these discoveries to real-world products,” says Professor Eustice. “The challenges that TRI faces with autonomous cars will leverage our labs’ research into complex behaviors, like merging and understanding the intention of other vehicles from their actions.”

 

News of the commitment to U-M isn’t really surprising given that Toyota executives have explained that the company believes artificial intelligence has significant potential to support future industrial technologies and even create entirely new industries. TRI’s CEO Pratt has previously said that the organization’s goals are to improve safety by continuously decreasing the likelihood that a car will be involved in an accident; make driving accessible to everyone, regardless of ability; and apply Toyota technology used for outdoor mobility to indoor environments, particularly so robots can help care for the elderly.

 

It will be interesting to watch developments at the University of Michigan and at TRI. In the meantime, what are your thoughts on autonomous vehicles and the potential use of partner robots to care for the elderly or people with special needs?

China Cosco Shipping was the first company to have a vessel transit the expanded Panama Canal on June 26, but it also has had a vessel collide with a canal wall and suffer damage. The 8,500 TEU container ship Xin Fei Zhou hit a wall while transiting the locks at Agua Clara on the Atlantic side of the canal last month. The ship’s hull was damaged, but traffic was not affected, and the wall suffered only minor damage.

 

That hasn’t been the only accident since the expanded canal opened. The Lycaste Peace, the first LPG tanker to pass through the new section of the canal, ripped off a canal wall fender during a collision in late June, causing some minor damage to the railing of the ship.

 

The Panama Canal Authority has confirmed that the Cosco Shipping Panama, the container ship that made the inaugural journey through the canal, also made contact with the canal’s fenders. A spokesman for the Canal Authority said such contact is normal and claimed that the incidents involving the Lycaste Peace and the Cosco Shipping Panama were reported incorrectly.

 

“In both of these cases, neither is considered an incident or accident,” according to the Authority, Maritime Executive reports. “In fact, contacting fenders during approaches to the locks or inside the chambers of the locks is normal, expected and is the reason for installing fenders on areas where contact is expected.”

 

It is interesting to note, however, that other vessels have not only made contact but have sheared off or badly damaged up to 100 buffering fenders that are supposed to protect the lock walls and ship hulls should they come into contact, according to canal workers interviewed for a New York Times article. The workers expressed concern about whether the plastic fenders on the lock walls would be adequate and whether tugboat captains had received proper training in how to guide giant ships through the chambers—a procedure that differs from the one used in the original canal.

 

The original canal, which remains in operation, uses locomotives that run alongside the lock walls to pull the ships. The expanded canal operates differently: Tugboats push and guide the ships. According to the Panama Canal Authority, tugs not only are cheaper, but are the only practical way to move the newer giant neo-Panamax ships.

 

That may be the case. However, in a 2014 report written well before the expanded canal opened, global insurance company Allianz wrote that the change to using tugboats brings with it “a greater potential for damage,” the New York Times reports. Allianz further wrote that the tugs will be sufficient, but that training is the key to mitigating the risks.

 

It is a challenge made more difficult because so many tugboat captains have not been fully trained in the new system, Iván de la Guardia, who heads the tugboat captains’ union, says in the NYT article. After three weeks of operation, de la Guardia said, 60 percent of his members had yet to be trained.

 

None of this seems to have affected shippers’ plans to use the expanded Panama Canal, which can shorten the one-way journey by sea from Asia to the U.S. East Coast by roughly five days while also eliminating the need for a trip around Cape Horn to reach the Atlantic. Indeed, 69 neo-Panamax vessels have transited the expanded canal since its inauguration on June 26: 40 container ships, 24 LPG carriers, three vehicle carriers and two LNG carriers. In addition, the Panama Canal Authority notes that it has received 250 reservations—and counting—for ships to transit the expanded canal, including seven cruise ship reservations.

 

It will be interesting to see whether the number of collisions with canal lock fenders decreases as tugboat captains are trained and become more familiar with the expanded canal and the larger neo-Panamax vessels.

The distinction between pharmaceuticals and technology continues to blur as companies partner to create high-tech devices combining biology, software and hardware to treat chronic diseases. Bioelectronics—miniaturized, implantable devices—can modify the electrical signals that pass along nerves in the body, and may eventually be used to treat conditions such as arthritis, diabetes and asthma.

 

Earlier this week, GlaxoSmithKline (GSK) announced an agreement with Verily Life Sciences LLC (formerly Google Life Sciences), an Alphabet company, to form Galvani Bioelectronics to research, develop and commercialize bioelectronic medicines. GSK will hold a 55 percent equity interest in the new jointly owned company and Verily will hold 45 percent. The company will be headquartered in the UK.

 

The new company will leverage GSK’s drug discovery and development expertise and its deep understanding of disease biology along with Verily’s technical expertise in the miniaturization of low power electronics, device development, data analytics and software development for clinical applications. Initial work will focus on establishing clinical proofs of principle in inflammatory, metabolic and endocrine disorders, including type 2 diabetes, where substantial evidence already exists in animal models; and developing associated miniaturized, precision devices, according to GSK.

 

“Many of the processes of the human body are controlled by electrical signals firing between the nervous system and the body’s organs, which may become distorted in many chronic diseases,” says Moncef Slaoui, GSK’s Chairman of Global Vaccines, and chair of Galvani Bioelectronics’ board. “Bioelectronic medicine’s vision is to employ the latest advances in biology and technology to interpret this electrical conversation and to correct the irregular patterns found in disease states, using miniaturized devices attached to individual nerves. If successful, this approach offers the potential for a new therapeutic modality alongside traditional medicines and vaccines.”

 

Kris Famm, GSK’s vice president of Bioelectronics R&D, and now president of Galvani Bioelectronics, has pioneered work in both large and small molecule drug discovery and worked for a decade developing and delivering R&D strategy with a recurring focus on emerging technologies. He believes one of the benefits of bioelectronics is the ability to gain real-time feedback on how patients are doing.

 

“It will really help us hone the intervention,” Famm says in a Reuters article. “This is almost the epicenter of convergence because the technology is not only helping you to monitor a disease but it is also actually the therapy.”

 

Other companies are also working in the field. For example, biotech firms Setpoint Medical and EnteroMedics are developing bioelectronics as a means to tackle inflammatory diseases like rheumatoid arthritis and suppress appetite in the obese. Swiss-based Novartis also is working with Alphabet on a smart contact lens with an embedded glucose sensor to help monitor diabetes.

 

Novartis Chief Executive Joe Jimenez has said that the combination of pharmaceuticals and technology will eventually be “front and center” in disease management.

 

Nonetheless, there are numerous challenges for these companies, including, as the Reuters article notes, the need for multi-year clinical trials to prove that the new technologies are safe, effective and can deliver the type of benefits to overall clinical outcomes that proponents expect. Regulators such as the U.S. Food and Drug Administration (FDA) also need to be convinced of the case for radically new ways of treating and monitoring patients.

 

“The challenge will be to make sure that regulators are on board, although the FDA is much more innovative now than it was 10 years ago about accepting different endpoints for treating disease,” Hilary Thomas, chief medical adviser at KPMG, says in the article.

 

It will be interesting to see how supply chains are created for bioelectronics.