Opportunities and Challenges in Biosimilar Development


A biosimilar biotherapeutic product is similar (but not identical) in terms of quality, safety, and efficacy to an already licensed reference product. Unlike generic small molecules, it is difficult to standardize such inherently complex products based on complicated manufacturing processes. Table 1 describes the main differences between biosimilar and generic drug molecules.
Table 1: Major difference between biosimilars and generic drugs

The global biosimilar market is growing rapidly as patents on blockbuster biologic drugs expire (Table 2) and healthcare systems focus on reducing costs. Biologics are among the highest-cost treatments on the global market today, which implies the need for low-cost alternatives. In emerging markets, biosimilars already offer more affordable prices, which are not only attractive but indispensable to economies where expensive treatments are not financially feasible (1). Interchangeability of biosimilars could have a big impact on drug budgets around the world. However, concerns remain about the effect interchangeability could have on patients in terms of safety and efficacy.
Table 2: Patent status of some innovator biologics (3)

Developing and manufacturing biosimilars is challenging, so well-established biopharmaceutical companies are investing in these important medicines (Table 3). Continue Reading Article

The FDA’s Groundbreaking Strides Toward a Quality-Centric Regulatory System


Two famous pharmaceutical industry quotes encapsulate industry thinking and the push for progress in recent years. Upon launching Pharmaceutical cGMPs for the 21st Century – a Risk-Based Approach, Janet Woodcock, director of the FDA’s Center for Drug Evaluation and Research (CDER), shared the following vision: “A maximally efficient, agile, flexible pharmaceutical manufacturing sector that reliably produces high-quality drugs without extensive regulatory oversight.”

Gerry Migliaccio, former head of global quality at Pfizer, famously stated, “We produce six sigma products with three sigma processes.” He was referring to the fact that the pharmaceutical industry’s over-reliance on compliance-based inspection to achieve quality is an ineffective way to do business, and that most other industries, even those in heavily regulated environments such as nuclear power and aviation, moved away from this way of working many years ago.

Before exploring FDA action to achieve the outlined vision and address the great challenge shared by Mr. Migliaccio, it is important to recognize some of the primary pharmaceutical quality issues the industry and regulatory community have been wrestling with in recent years:

  • Treatment of all products equally, resulting in a disproportionate amount of regulatory attention devoted to low-risk products and diverting resources needed for high-risk products.
  • Unacceptably high levels of product recalls and defect reports, too frequently rooted in flaws in product and process design.
  • Shortages of critical drugs in recent years, often the result of ineffective product and process design and the lack of effective quality management systems.
  • Inspections that are not well connected to knowledge gained from product reviews and applications.
  • Post-approval supplement submissions at levels that burden both the FDA and industry, because current FDA practice “locks in” a process before it is fully optimized.
  • Product reviews conducted largely on pre-marketing data from exhibit or clinical batches, which may differ significantly from the conditions of commercial production.

So, within an environment of increased drug product complexity, the need for greater numbers of different specialty drugs to treat smaller patient populations, and more complex drug product supply chains, the industry and the regulatory community still:

  • Do not understand their products and processes to the degree necessary
  • Are hampered by a compliance-based regulatory framework that does not allow easy adjustment of processes based on the ongoing accumulation of product and process understanding
  • Operate within the framework of all products, regardless of risk, essentially being treated equally

To move toward a system that ultimately regulates quality rather than compliance, the FDA and other regulatory bodies around the world are adopting a strategic view of drug product risk. As other heavily regulated industries have already recognized, all products and situations should not be treated equally. Rather, regulatory resources need to be focused on the greatest areas of risk.

To achieve a risk-based construct, CDER’s new Office of Pharmaceutical Quality (OPQ) formulated the New Inspection Protocol Project (NIPP) incorporating five priorities: regulatory science, globalization, safety and quality, smart regulation, and stewardship.

NIPP departs from regulatory and inspection practices that have been in place for decades by using informatics, internal analysis, and other digital tools to prepare for and steer inspections. Using data and predictive analytics, NIPP will support the development of algorithms that reveal product quality challenges and facilitate inspections concentrated on high-risk, inconsistent processes and products. A key aim of the NIPP is to focus attention and resources on high-risk processes and products so that, ultimately, quality improvements can be identified and made more quickly.

Organizationally, the NIPP consists of three subgroups: the Pre-Approval Inspection Protocol(s) subgroup, the Surveillance Inspection Protocol(s) subgroup, and the For-Cause Inspection Protocol(s) subgroup. These subgroups will mine regulatory data to improve the FDA’s understanding of the processes and approaches that produce quality products.

The FDA is using informatics as the cornerstone of NIPP. A database will evaluate the history of a given pharmaceutical manufacturing site for the following eight facility risk factors:

  1. Process
  2. Research
  3. Analytical Methods
  4. Sterility/Microbiology
  5. Inspections
  6. APIs and Excipients
  7. Chemistry Manufacturing and Controls (CMC), and
  8. Policy/Enforcement Actions

An algorithm will combine findings from this historical data with product and facility risk factors to calculate a site ranking. Ultimately, higher-risk pharmaceutical manufacturing sites will be identified and given priority inspection resources.
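To make the ranking idea concrete, here is a minimal sketch of the kind of weighted risk scoring the article describes. The eight factor names come from the list above; the weights, the 0–10 scoring scale, and the function names are illustrative assumptions, not the FDA's actual algorithm, which is not public.

```python
# Hypothetical site-ranking sketch: weighted sum of per-factor findings.
# Factor names follow the article; weights and scale are invented for illustration.

RISK_FACTORS = [
    "process", "research", "analytical_methods", "sterility_microbiology",
    "inspections", "apis_and_excipients", "cmc", "policy_enforcement",
]

# Illustrative weights (sum to 1.0) -- the real weighting is not public.
WEIGHTS = dict(zip(RISK_FACTORS, [0.20, 0.05, 0.10, 0.20, 0.15, 0.10, 0.10, 0.10]))

def site_risk_score(findings: dict) -> float:
    """Combine per-factor findings (0 = no issues, 10 = severe) into one score."""
    return sum(WEIGHTS[f] * findings.get(f, 0.0) for f in RISK_FACTORS)

def rank_sites(sites: dict) -> list:
    """Return site names ordered highest risk first, so inspection
    resources can be prioritized accordingly."""
    return sorted(sites, key=lambda s: site_risk_score(sites[s]), reverse=True)
```

A site with severe process and sterility findings would then rank above a site with only minor research findings, which is the prioritization behavior the NIPP is aiming for.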

To increase regulatory efficiency, the FDA can use the knowledge garnered from NIPP inspections to make more informed decisions on the frequency and prioritization of inspections and to plan pre-inspection work more effectively, so that time on site is used as efficiently as possible and inspection hours are reduced.

In the end, the goal of NIPP and OPQ is inspections that assess quality and scrutinize risk-intensive products and operations more closely, rather than inspections of adherence to SOPs and other regulatory criteria that may or may not lead to higher-quality products and reduced patient risk.

Interested in Process Validation in the Era of Expedited Approval Drugs? Download the White Paper from BioTechLogic





Why Pharmaceutical Data Integrity Is More Important Than Ever


With radical pharmaceutical industry changes in the air, the importance of data integrity and the steps the pharmaceutical industry must take are clear.

By Ashley Ruth, Senior Consultant, Analytical Services, BioTechLogic, Inc.

The pharmaceutical and biopharmaceutical industries must give immediate and strategic consideration to data integrity practices for two critical reasons:

  1. increased attention to data integrity shortcomings by global regulatory agencies and
  2. the possibility of a less stringent regulatory environment.

To understand the pressing ramifications of the data integrity issue, we must remind ourselves that data is both the backbone of CGMP compliance and the fuel of the digital economy. In the pursuit of increased efficiency, digital approaches are being used for risk reduction and greater innovation in pharmaceutical lifecycle processes—from discovery to commercial manufacturing—and must continue to be further leveraged. Additionally, data must become less siloed and flow much more seamlessly throughout pharmaceutical organizations at various stages of the product lifecycle. However, just as contaminated gasoline will damage or seize a car engine, contaminated “digital fuel” will do the same to an organization. Correct and uncorrupted data must flow through a pharmaceutical organization so that correct and reliable decisions can be made.

Even without sophisticated digital data management considerations, the integrity of even the most basic data systems must be assured to maintain compliance.


For several years, the FDA and other global regulatory bodies have emphasized the importance of accurate and reliable data in assuring drug safety and quality. However, in tandem with increased digital sophistication and the role of global manufacturing partners, data integrity violations have been on the rise.

As a reflection of the importance of this issue, in April 2016 the FDA released the draft guidance “Data Integrity and Compliance With CGMP Guidance for Industry.” Within the guidance itself, the FDA notes the trend of increasing data integrity violations.

The guidance states, “In recent years, FDA has increasingly observed CGMP violations involving data integrity during CGMP inspections. This is troubling because ensuring data integrity is an important component of industry’s responsibility to ensure the safety, efficacy, and quality of drugs and of FDA’s ability to protect the public health. These data integrity-related CGMP violations have led to numerous regulatory actions, including warning letters, import alerts, and consent decrees.”

The phrase “data integrity” often conjures the image of intentional, dishonest manipulation of data to achieve some benefit or avoid negative consequences. While purposeful data adulteration does occur, many data integrity violations are not deliberate; they result from improper training, ineffective SOPs, corrupt systems, or a lack of clarity within the regulations themselves. The CGMP framework recognizes that technologies and approaches evolve as innovation advances, so regulations and guidance are written with built-in flexibility to accommodate change. Often, however, these accommodations result in a lack of clarity.
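One technical safeguard often discussed in this context (not drawn from the FDA guidance itself, and sketched here purely for illustration) is a hash-chained audit trail: each record commits to the hash of the record before it, so a silent edit anywhere in the trail breaks every subsequent link and is detectable. The function names and record fields below are assumptions.

```python
# Illustrative hash-chained audit trail: tampering with any stored entry
# invalidates the chain from that point forward.
import hashlib
import json

def append_record(trail: list, entry: dict) -> None:
    """Append an entry whose hash covers both the entry and the prior hash."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    payload = json.dumps(entry, sort_keys=True)
    record_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    trail.append({"entry": entry, "prev": prev_hash, "hash": record_hash})

def verify_trail(trail: list) -> bool:
    """Recompute every link; return False on the first mismatch."""
    prev_hash = "0" * 64
    for rec in trail:
        payload = json.dumps(rec["entry"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if rec["prev"] != prev_hash or rec["hash"] != expected:
            return False
        prev_hash = rec["hash"]
    return True
```

Real laboratory and manufacturing systems use vendor-specific audit-trail mechanisms; the point of the sketch is only that integrity can be made verifiable rather than assumed.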


President Trump has vowed to overhaul the FDA and sharply reduce the regulatory burdens of the current drug approval system. In fact, arguments have been made, by people associated with the administration, to completely eliminate pre-market approval clinical trials that are currently required to demonstrate drug efficacy and safety. Continue reading article


Live from the PDA Annual Meeting: Risk Management in Combination Products and Co-Packaged Kits


From post-aging performance testing to container closure integrity, robust design controls extend far beyond “constituent part” requirements.

With the increase in complexity of some combination products and co-packaged kits, the need remains for ensuring that the patient gets the right drug at the right dose at the right time. While it is acceptable to make improvements to products as more information becomes available, it’s important to recognize that a change to a co-packaged kit can result in a warning letter if design controls are not properly implemented.

In her presentation at the 2017 PDA Annual Meeting in Anaheim, CA, Tracy TreDenick, Head of Regulatory and Quality Assurance and Founding Partner at BioTechLogic, explained that design controls for co-packaged kits include requirements for individual constituent parts, but also include inter-component dimensional and functional specifications, system integration verification testing and shipping of aged components. “FDA is asking for dimensional and functional specs for the final finished form,” she said.

This is not just about the closure of a prefilled syringe, but about the biocompatibility of all parts combined. She noted, “The challenge is understanding the constituent parts and the system integration requirements.” Continue Reading Article

4 Things You Need to Know About Combination Drug Compliance

Combination products are a fascinating area of the pharmaceutical industry and present great future promise. The segment is projected to reach $115 billion in global sales by the end of 2019. It has grown solidly at a 7.9% CAGR since 2013 and is projected to continue at that rate through 2019 (1).

Some of the key factors driving this growth include higher levels of patient compliance, demand for minimally invasive surgeries, opportunities for precise pain relief, quicker healing, and the embrace of combination drugs by governments and non-governmental organizations (NGOs) for their ease of administration.

Combination products, defined in 21 CFR 3.2(e) (2), are therapeutic and diagnostic products composed of any combination of a drug, device, or biological product, with the intention of creating safer, more effective, precisely targeted, and easier-to-administer therapies.

While the technologies and innovations driving the combination product market deliver a great deal of value to patients and to the medical community, the novelty of these products is often challenging for drug developers and regulatory agencies. The marriage of two different disciplines – drug and medical device – creates a complex regulatory process that must be well managed. In addition, evolving regulations as the combination product segment matures can present challenges for older, legacy combination products. Continue Reading




SEE ALSO: The Formation of the Combination Products Policy Council to Improve Regulatory Efficiency

Pharmaceutical Data Integrity Library of Resources


The issue of pharmaceutical data integrity is more important than it has ever been. First, data is both the backbone of CGMP compliance and the fuel of the digital economy. Second, global regulatory bodies have become increasingly focused on data integrity and have issued many enforcement actions of varying severity.

The following is a collection of the leading guidances and resources on pharmaceutical data integrity.

FDA Draft Guidance: Data Integrity and Compliance With CGMP Guidance for Industry

The purpose of this guidance is to clarify the role of data integrity in current good manufacturing practice (CGMP) for drugs, as required in 21 CFR parts 210, 211, and 212. Part 210 covers Current Good Manufacturing Practice in Manufacturing, Processing, Packing, or Holding of Drugs; General; part 211 covers Current Good Manufacturing Practice for Finished Pharmaceuticals; and part 212 covers Current Good Manufacturing Practice for Positron Emission Tomography Drugs. This guidance provides the Agency’s current thinking on the creation and handling of data in accordance with CGMP requirements. Access Guidance

MHRA GxP Data Integrity Definitions and Guidance for Industry

The United Kingdom’s Medicines and Healthcare products Regulatory Agency (MHRA) guidance on data integrity is one of the globe’s leading resources on the topic. Access Guidance

Elements of a Code of Conduct for Data Integrity in the Pharmaceutical Industry

The Parenteral Drug Association’s (PDA) Elements of a Code of Conduct for Data Integrity in the Pharmaceutical Industry outlines key elements necessary to help ensure the reliability and integrity of information and data throughout all aspects of a product’s lifecycle. It is intended to be used in whole or in part to guide a company’s internal practices, create or modify an existing data integrity code of conduct, or in developing agreements with outsourcing partners or other suppliers. The elements identified throughout this document are intended to reinforce a culture of quality and trust within the pharmaceutical industry. Access Data Integrity Code of Conduct

Guidance: Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments

Pharmaceutical Inspection Convention/Pharmaceutical Inspection Co-Operation Scheme (PIC/S) – Good data management practices influence the integrity of all data generated and recorded by a manufacturer, and these practices should ensure that data is accurate, complete, and reliable. While the main focus of this document is data integrity expectations, the principles herein should also be considered in the wider context of good data management. Access PIC/S Guidance

Library of Data Quality, Records Management & FDA Recordkeeping Laws

Be compliant with US regulations and statutes on data integrity and recordkeeping in order to avoid stiff penalties and preserve corporate value. This robust collection of resources will prove to be quite helpful. Access Library


Evolution of Biopharmaceutical Control Strategy Through Continued Process Verification


As defined in the ICH Q10 guideline, a control strategy is “a planned set of controls, derived from current product and process understanding, that assures process performance and product quality” (1). Every biopharmaceutical manufacturing process has an associated control strategy.

FDA’s 2011 guidance for process validation (2) describes process validation activities in three stages (Figure 1). A primary goal of stage 1 is to establish a strategy for process control that ensures a commercial process consistently produces products of acceptable quality. Biopharmaceutical development culminates in the commercial control strategy, a comprehensive package including analytical and process controls and procedures. Stage 2, process performance qualification (PPQ), establishes scientific evidence that a process is reproducible and will consistently deliver high-quality products. Stage 3, continued process verification (CPV), provides an opportunity to improve process control throughout the lifecycle of a product. Continue reading
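A minimal sketch of one common stage 3 (CPV) activity is deriving statistical control limits from PPQ or early commercial batches and flagging later batches that fall outside them. The 3-sigma limits shown here are one conventional choice, not a requirement of the guidance; the attribute values and batch IDs are invented for illustration.

```python
# CPV-style trending sketch: 3-sigma limits from baseline batches,
# then flag subsequent batches outside those limits for investigation.
import statistics

def control_limits(baseline: list) -> tuple:
    """Lower/upper limits at mean +/- 3 sample standard deviations."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - 3 * sd, mean + 3 * sd

def flag_excursions(batches: dict, lcl: float, ucl: float) -> list:
    """Return batch IDs whose result falls outside the control limits."""
    return [b for b, x in batches.items() if x < lcl or x > ucl]
```

In practice, CPV programs layer additional trending rules (shifts, drifts, runs) on top of simple limit checks, but even this basic form turns routine batch data into an ongoing verification of the control strategy.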

SEE ALSO: 4 Steps for Managing Biopharmaceutical Manufacturing Projects

Response to the Publication of USP <1207>: Package Integrity Evaluation: Sterile Products


The BioPhorum Operation Group’s (BPOG’s) Container Closure Integrity Testing (CCIT) workstream would like to congratulate the United States Pharmacopeia’s committee for its latest revision to USP chapter <1207> Package Integrity Evaluation: Sterile Products. Generally, we believe it provides a comprehensive overview of the available methods for container–closure testing and outlines many important elements for consideration in establishing a successful CCIT strategy. We first responded to the USP <1207> draft when it was released for comment in 2014. And from our perspective, some of the changes that were proposed and concerns that were raised in 2014 clearly have been addressed. We are grateful to see that.

The purpose of this letter, however, is to highlight specific areas that cause us concern as a cross-industry group. We are aware that USP’s “informational” chapters are not compulsory. As end-users of such guidances, though — and as representatives of companies that receive the scrutiny of regulators — we recognize that informational chapters often evolve in practice to establish expectations. So any lack of clarity or any bias introduced toward specific methodologies is of concern. It is in this context that we would like USP to consider our further comments here.

These concerns center on the description, perception, and treatment of probabilistic and deterministic analytical methods, specifically dye and microbial ingress methods. We would like to see our concerns considered at the earliest possible opportunity, ideally precipitating an update to USP <1207>. Read article
SEE ALSO: 4 Steps for Managing Biopharmaceutical Projects

Therapeutic Gene Editing: An American Society of Gene & Cell Therapy White Paper


Interested in learning more about gene editing and its therapeutic applications? Download a new white paper from the American Society of Gene & Cell Therapy (ASGCT). The document, titled “Therapeutic Gene Editing: An ASGCT White Paper,” is intended as background information for policymakers, patients, and the general public to help them better understand gene therapy concepts and related therapies.

The paper is designed to help people better understand the ramifications of a soon-to-be-released report on human gene editing by the National Academy of Sciences and the National Academy of Medicine. This report, planned for release in early 2017, is expected to address the ethical, legal, and social implications of gene editing and to suggest potentially needed policy.

Download White Paper

4 Steps for Managing the Criticality and Challenges of Biopharmaceutical Projects

Biopharmaceutical project management is a complex process. This SlideShare walks you through the process:

  • Challenges of biopharma project management
  • Scope of biopharmaceutical project management
  • Characteristics of great biopharmaceutical project managers – are we asking for unicorns?
  • 4 Steps for managing biopharmaceutical projects
    • Step #1 – Clearly Establish Project Definition and Impacting Constraints
    • Step #2 – Project Execution Planning
    • Step #3 – Project Execution
    • Step #4 – Project Completion