Data Integrity Principle
Data integrity is fundamental to a pharmaceutical quality system, which ensures that medicines are of the required quality. Data integrity requirements apply equally to manual (paper) and electronic data.
Manufacturers and analytical laboratories should be aware that reverting from automated/computerized to manual/paper-based systems will not in itself remove the need for data integrity controls.
Pharmaceutical companies collect and analyze data throughout the production process to ensure end products meet standards for both quality and safety.
For these companies to make informed business decisions and remain in compliance with regulations, the data they collect must be complete, accurate, and continuously protected from accidental and malicious alteration.
Defined as the process of maintaining the accuracy and availability of data throughout its life cycle, data integrity plays a critical role in any pharmaceutical manufacturing operation. If data becomes lost or inaccurate, your company could face fines, delays in product approval, or even criminal charges.
It is important to protect your manufacturing data at every stage, from initial recording through validation and archiving.
Common causes of data integrity challenges
The United States Food and Drug Administration (FDA) finds many data integrity lapses during inspections every year. Although most pharmaceutical manufacturers want to keep accurate records, they still struggle to manage risk and achieve compliance. Some of the common data integrity challenges that companies face include:
- Record policies that deviate from requirements: The FDA and other governing bodies have strict expectations for pharmaceutical data integrity compliance, requiring companies to keep records that are attributable, legible, contemporaneous, original, and accurate (ALCOA), as well as complete, consistent, enduring, and available (ALCOA+). To avoid gaps in data integrity, manufacturers usually need to design their record policies and systems specifically to meet these requirements.
- Insufficient data security controls: An important part of pharmaceutical data integrity involves keeping backups of data for security purposes. To meet FDA requirements, a backup must be a true copy of the original that is maintained securely for the required retention period. Without a proper backup, data can be accidentally deleted or rendered unusable due to improper alteration (see the verification sketch after this list).
- Lack of employee technical knowledge: Human error accounts for many data integrity concerns, and errors become more likely when employees lack the training and technical knowledge to maintain records properly.
- Outdated procedures: Paper records no longer offer the best method for data maintenance. They can become illegible over time and are easier to tamper with than digital records. As a result, updating procedures can go a long way toward achieving data integrity compliance.
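The "true copy" requirement for backups can be checked mechanically. Below is a minimal sketch in Python (the file names and helper functions are hypothetical assumptions, not a prescribed tool) that compares cryptographic hashes to confirm a backup is bit-for-bit identical to the original:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_true_copy(original: Path, backup: Path) -> bool:
    # A backup qualifies as a 'true copy' only if it is
    # bit-for-bit identical to the original record.
    return sha256_of(original) == sha256_of(backup)

# Hypothetical usage: verify the backup before relying on it.
# is_true_copy(Path("hplc_run_0421.raw"), Path("/backup/hplc_run_0421.raw"))
```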
Data integrity challenges in drug development and manufacturing
Drug development and manufacturing generate large amounts of data, which has further increased in volume and complexity with the rise of biologics. This data must be transparent, accurate, and traceable as it will be the basis for making GMP decisions regarding the pharmaceutical product.
The intricacy of this data makes maintaining audit trails all the more challenging. Audit trails are hard to create given the large and complex array of data, but they are an important regulatory requirement and a proven, effective means of detecting data integrity issues.
Additionally, regulatory bodies require periodic reviews of audit trails. If systems do not capture enough information regarding the action performed or have insufficient sorting, filtering and grouping capabilities, reviewers may need to resort to manual steps to conduct a comprehensive and meaningful review, which becomes time-consuming and inefficient.
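As a concrete illustration, the sketch below models a hypothetical audit trail (the entry fields and values are invented, not any particular system's format) and shows the kind of sorting and filtering that makes a review comprehensive without manual line-by-line reading:

```python
from datetime import datetime

# Each entry captures who, what, when, and why -- the minimum
# needed for a meaningful, filterable audit trail review.
audit_trail = [
    {"user": "analyst_1", "action": "result_modified",
     "timestamp": datetime(2019, 6, 3, 14, 22), "reason": "integration corrected"},
    {"user": "analyst_2", "action": "run_created",
     "timestamp": datetime(2019, 6, 3, 15, 5), "reason": ""},
    {"user": "analyst_1", "action": "result_deleted",
     "timestamp": datetime(2019, 6, 4, 9, 40), "reason": "duplicate injection"},
]

# A reviewer can group and filter instead of reading every line,
# e.g. pull all modifications and deletions made by one user.
flagged = [e for e in audit_trail
           if e["user"] == "analyst_1"
           and e["action"] in ("result_modified", "result_deleted")]
for entry in sorted(flagged, key=lambda e: e["timestamp"]):
    print(entry["timestamp"], entry["action"], entry["reason"])
```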
The World Health Organization, for instance, is drafting new data integrity guidelines expected to take effect in 2020, outlining recommendations for complying with data integrity and ALCOA+ principles. These principles, illustrated in the sketch after the list below, require data to be:
- Attributable
- Legible
- Contemporaneous
- Original
- Accurate
- Complete
- Consistent
- Enduring
- Available
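One way to see how these principles map onto a record design is the illustrative structure below; the class and field names are assumptions made for the sketch, not a prescribed schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)          # frozen: the record is enduring and cannot be altered in place
class Record:
    value: str                   # accurate: the observed result itself
    unit: str                    # complete and consistent: nothing needed for meaning is omitted
    recorded_by: str             # attributable: who generated the data
    recorded_at: datetime        # contemporaneous: captured at the time of the activity
    source_system: str           # original: identifies the originating system or record

reading = Record(value="3.5", unit="mg",
                 recorded_by="analyst_1",
                 recorded_at=datetime.now(timezone.utc),
                 source_system="balance_07")
```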
Designing systems to assure data quality and integrity
Systems should be designed in a way that encourages compliance with the principles of data integrity.
Examples include:
• Access to clocks for recording timed events
• Accessibility of batch records at locations where activities take place, so that ad hoc data recording and later transcription to official records is not necessary
• Control over blank paper templates for data recording
• User access rights which prevent (or audit trail) data amendments (see the sketch after this list)
• Automated data capture or printers attached to equipment such as balances
• Proximity of printers to relevant activities
• Access to sampling points (e.g. for water systems)
• Access to raw data for staff performing data-checking activities.
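As a sketch of the access-rights bullet above (the roles, function names, and in-memory audit trail are all hypothetical), amendments are either blocked outright or forced through an audit-trailed path:

```python
# Hypothetical role model: operators may record data but not amend it;
# amendments by authorised users are always written to the audit trail.
PERMISSIONS = {
    "operator":   {"record"},
    "supervisor": {"record", "amend"},
}

audit_trail = []

def amend_value(role: str, user: str, record: dict, new_value: str, reason: str):
    if "amend" not in PERMISSIONS.get(role, set()):
        raise PermissionError(f"{role} may not amend data")
    # The original value is preserved in the trail, never silently overwritten.
    audit_trail.append({"user": user, "old": record["value"],
                        "new": new_value, "reason": reason})
    record["value"] = new_value

rec = {"value": "3.5"}
amend_value("supervisor", "qa_lead", rec, "3.6", "transcription error corrected")
# amend_value("operator", "op_1", rec, "9.9", "")  # would raise PermissionError
```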
The use of supervisory staff (scribes) to record activity on behalf of another operator should be considered ‘exceptional’, and should only take place where:
• The act of recording places the product or activity at risk, e.g. documenting line interventions by sterile operators.
• The recording accommodates cultural or staff literacy/language limitations, for instance where an activity is performed by an operator but witnessed and recorded by a supervisor or officer.
In both situations, the supervisory recording must be contemporaneous with the task being performed and must identify both the person performing the observed task and the person completing the record.
The person performing the observed task should countersign the record wherever possible, although it is accepted that this countersigning step will be retrospective. The process for supervisory (scribe) documentation completion should be described in an approved procedure, which should also specify the activities to which the process applies.
Data
Information derived or obtained from raw data (e.g. a reported analytical result). Data must be:
A – attributable to the person generating the data
L – legible and permanent
C – contemporaneous
O – original record (or ‘true copy’)
A – accurate
Raw data
Original records and documentation, retained in the format in which they were originally generated (i.e. paper or electronic), or as a ‘true copy’. Raw data must be contemporaneously and accurately recorded by permanent means. In the case of basic electronic equipment which does not store electronic data, or provides only a printed data output (e.g. a balance or pH meter), the printout constitutes the raw data.
Raw data must:
• Be legible and accessible throughout the data lifecycle.
• Permit the full reconstruction of the activities resulting in the generation of the data.
Metadata
Metadata is data that describes the attributes of other data and provides context and meaning. Typically, these are data that describe the structure, data elements, interrelationships, and other characteristics of data. It also permits data to be attributable to an individual.
Metadata forms an integral part of the original record. Without metadata, the data has no meaning.
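As a simple illustration (the values are invented), a bare reading of 3.5 carries no meaning on its own; the metadata supplies the structure and context that make it an attributable, interpretable record:

```python
# Raw value alone: no meaning.
data = "3.5"

# The same value with its metadata becomes a usable record:
record = {
    "value": "3.5",
    "unit": "mg",                      # structure and meaning
    "material": "sodium chloride",     # what was measured
    "instrument": "balance_07",        # where the data came from
    "user": "analyst_1",               # attributable to an individual
    "timestamp": "2019-06-03T14:22Z",  # when it was captured
}
```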
Data Integrity
The extent to which all data are complete, consistent, and accurate throughout the data lifecycle. Data integrity arrangements must ensure that the accuracy, completeness, content, and meaning of data are retained throughout the data lifecycle.
Data governance
The sum total of arrangements to ensure that data, irrespective of the format in which it is generated, is recorded, processed, retained, and used to ensure a complete, consistent, and accurate record throughout the data lifecycle.
Data governance should address data ownership throughout the lifecycle, and consider the design, operation, and monitoring of processes/systems in order to comply with the principles of data integrity including control over intentional and unintentional changes to information.
Data Governance systems should include staff training in the importance of data integrity principles and the creation of a working environment that encourages an open reporting culture for errors, omissions, and aberrant results.
Senior management is responsible for the implementation of systems and procedures to minimize the potential risk to data integrity, and for identifying the residual risk, using the principles of ICH Q9. Contract Givers should perform a similar review as part of their vendor assurance programme.
Data Lifecycle
All phases in the life of the data (including raw data) from initial generation and recording through processing (including transformation or migration), use, data retention, archive/retrieval, and destruction.
The procedures for the destruction of data should consider data criticality and legislative retention requirements. Archival arrangements should be in place for long-term retention (in some cases, periods up to 30 years) for records such as batch documents, marketing authorization application data, and traceability data for human-derived starting materials (not an exhaustive list). Additionally, at least 2 years of data must be retrievable in a timely manner for the purposes of regulatory inspection.
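Such a destruction procedure reduces to a simple rule: a record may be destroyed only after its applicable retention period has elapsed. The sketch below is a hypothetical illustration; the retention schedule shown is invented apart from the 30-year examples mentioned above:

```python
from datetime import date

# Example retention periods in years (illustrative, not exhaustive;
# actual periods depend on record type and local legislation).
RETENTION_YEARS = {
    "batch_document": 30,
    "traceability_human_derived": 30,
    "routine_analytical_result": 5,   # hypothetical value for the sketch
}

def destruction_permitted(record_type: str, created: date, today: date) -> bool:
    years = RETENTION_YEARS[record_type]
    expiry = created.replace(year=created.year + years)
    return today >= expiry

# A batch document from 1995 must still be retained in 2019 (30-year period).
print(destruction_permitted("batch_document", date(1995, 1, 1), date(2019, 6, 1)))  # False
```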
Primary Record
The record which takes primacy in cases where data that are collected and retained concurrently by more than one method fail to concur.
In situations where the same information is recorded concurrently by more than one system, the data owner should define which system generates and retains the primary record, in case of discrepancy. The ‘primary record’ attribute should be defined in the quality system, and should not be changed on a case-by-case basis.
Risk management principles should be used to ensure that the assigned ‘primary record’ provides the greatest accuracy, completeness, content, and meaning.
For instance, it is not appropriate for low-resolution or static (printed/manual) data to be designated as a primary record in preference to high-resolution or dynamic (electronic) data. All data should be considered when performing a risk-based investigation into data anomalies (e.g. out of specification results).
Original record / True Copy
Original record: Data as the file or format in which it was originally generated, preserving the integrity (accuracy, completeness, content, and meaning) of the record, e.g. original paper record of manual observation, or electronic raw data file from a computerized system.
True Copy: An exact verified copy of an original record.
Original records and true copies must preserve the integrity (accuracy, completeness, content, and meaning) of the record. Exact (true) copies of original records may be retained in place of the original record (e.g. a scan of a paper record), provided that a documented system is in place to verify and record the integrity of the copy.
Data may be static (e.g. a ‘fixed’ record such as paper or pdf) or dynamic (e.g. an electronic record which the user/reviewer can interact with).
Example 1: a group of still images (photographs – the static ‘paper copy’ example) may not provide the full content and meaning of the same event as a recorded moving image (video – the dynamic ‘electronic record’ example).
Example 2: once printed or converted to static .pdfs, chromatography records lose the capability of being reprocessed and do not enable more detailed viewing of baselines or any hidden fields. By comparison, the same dynamic electronic records in database format provide the ability to track, trend, and query data, allowing the reviewer (with proper access permissions) to reprocess, view hidden fields, and expand the baseline to view the integration more clearly.
It is conceivable for raw data generated by electronic means to be retained in an acceptable paper or pdf format, where it can be justified that a static record maintains the integrity of the original data.
However, the data retention process must be shown to include verified copies of all raw data, metadata, relevant audit trail and result files, software/system configuration settings specific to each analytical run*, and all data processing runs (including methods and audit trails) necessary for the reconstruction of a given raw data set.
It would also require a documented means to verify that the printed records were an accurate representation. This approach is likely to be onerous in its administration to enable a GMP-compliant record.
Many electronic records are important to retain in their dynamic (electronic) format, to enable interaction with the data. Data must be retained in a dynamic form where this is critical to its integrity or later verification. This should be justified based on risk.
* Computerized system configuration settings should be defined, tested, ‘locked’, and protected from unauthorized access as part of computer system validation. Only those variable settings which relate to an analytical run would be considered electronic raw data.
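The split between validated, locked configuration and the per-run variable settings that count as raw data can be sketched as follows (the setting names are hypothetical):

```python
from types import MappingProxyType

# Settings fixed and 'locked' during computer system validation;
# MappingProxyType exposes a read-only view, so they cannot be
# altered at runtime without going through change control.
VALIDATED_CONFIG = MappingProxyType({
    "detector_wavelength_nm": 254,
    "audit_trail_enabled": True,
})

# Only the variable settings chosen for a specific analytical run
# form part of the electronic raw data for that run.
run_settings = {
    "injection_volume_ul": 10,
    "integration_method": "method_assay_v3",
}

# VALIDATED_CONFIG["audit_trail_enabled"] = False  # raises TypeError: read-only
```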
Computer system transactions
A computer system transaction is a single operation or the sequence of operations performed as a single logical ‘unit of work’. The operation(s) that make up a transaction may not be saved as a permanent record on durable storage until the user commits the transaction through a deliberate act (e.g. pressing a save button), or until the system forces the saving of data. The metadata (i.e. user name, date, and time) is not captured in the system audit trail until the user commits the transaction.
In Manufacturing Execution Systems (MES), an electronic signature is often required by the system in order for the record to be saved and become permanent.
Computer systems should be designed to ensure that the execution of critical operations is recorded contemporaneously by the user and is not combined into a single computer system transaction with other operations.
A critical processing step is one in which a parameter must be within an appropriate limit, range, or distribution to ensure the desired product quality. These steps should be reflected in the process control strategy.
Examples of ‘units of work’:
• Weighing of individual materials
• Entry of process critical manufacturing / analytical parameters
• Verification of the identity of each component or material that will be used in a batch
• Verification of the addition of each individual raw material to a batch (e.g. when the sequence of addition is considered critical to process control)
• Addition of multiple pre-weighed raw materials to the bulk vessel when required as a single manufacturing step (e.g. when the sequence of addition is not considered critical to process control)
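To tie the above together, here is a minimal hypothetical sketch of the transaction principle: each critical operation is committed, and audit-trailed, as its own unit of work rather than being batched with unrelated operations:

```python
from datetime import datetime, timezone

audit_trail = []

def commit(user: str, operation: str, payload: dict):
    """Commit one critical operation as a single transaction.
    The audit trail entry (user, date, time) is written only at
    the moment of commit, mirroring how such systems behave."""
    audit_trail.append({
        "user": user,
        "operation": operation,
        "payload": payload,
        "committed_at": datetime.now(timezone.utc),
    })

# Each weighing is committed separately -- not bundled with the
# next operation into one combined transaction.
commit("operator_1", "weigh_material", {"material": "API-X", "mass_g": 12.50})
commit("operator_1", "weigh_material", {"material": "excipient_Y", "mass_g": 48.02})
```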