Developing A Risk-Based Model For Computer System Validation
By Richie Siconolfi, Richard M. Siconolfi, LLC

With a solid understanding of regulatory guidance and the foundation it laid for a risk-based approach to computer system validation (CSV) — see “3 FDA Guidance Documents That Shaped Today’s Computer System Validation” — responsible parties can develop their own path to CSV. This article focuses on the risk factors that can be used to build that risk-based approach.
The Evolution Of Risk-Based Approach To Validation
The three FDA guidance documents discussed in the first article provided key principles for developing a risk-based approach to validating regulated computerized systems, and the FDA Part 11 Scope and Application1 guidance document directed industry toward this risk-based approach. In turn, many regulated industries realized that a risk-based approach to validating a computerized system would require establishing, and documenting the rationale for, measured risks. These risks included, but were not limited to, electronic record criticality, computerized system type, computer system functionality, the distribution of computer systems in commerce, and the vulnerability of the computerized system if it became unavailable to its users.
Each of these attributes can be used to develop a reliable and documented risk-based approach to validate a regulated computerized system, and below are recommendations for how to do it.
Risk 1: Electronic Records
The Society of Quality Assurance recently published three articles on computer software assurance (CSA) based on the FDA Draft Guidance Document on Computer Software Assurance for Production and Quality System Software2. The first of these articles discusses record criticality and the quality assurance professional’s role in supporting CSA. Identifying record criticality is the first step in establishing a risk-based approach to validating a regulated computerized system. The key regulations impacting the regulated industry are Good Clinical Practice, Good Laboratory Practice, Good Manufacturing Practice, and Good Pharmacovigilance Practice guidance3. Business owners, information technology (IT) owners, subject matter experts (SMEs), and quality assurance professionals (QAPs) are usually the key individuals to assess record criticality. They do so by first identifying all the regulated electronic records a computerized system will create, modify, maintain, archive, retrieve, or transmit, and then assigning each record a risk value, such as high, medium, or low. Once each record’s criticality has been determined and documented, they can proceed to the software categories.
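As a simple illustration of this first step, the sketch below tags each regulated record a system handles with a criticality value and reports the highest one found. The record names and values are hypothetical examples, not items from any guidance document.

```python
# Hypothetical sketch: assigning a criticality value to the regulated electronic
# records a computerized system creates, modifies, maintains, archives,
# retrieves, or transmits. Record names and values are illustrative only.

RECORD_CRITICALITY = {
    "adverse_event_report": "high",     # direct patient health impact
    "batch_release_record": "high",     # direct product safety impact
    "stability_test_result": "medium",
    "training_attendance_log": "low",
}

RANK = {"low": 1, "medium": 2, "high": 3}

def overall_record_criticality(records: dict[str, str]) -> str:
    """Return the highest criticality assigned to any regulated record."""
    return max(records.values(), key=RANK.__getitem__)

print(overall_record_criticality(RECORD_CRITICALITY))  # -> high
```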
Risk 2: Software Categorization
GAMP 5 Second Edition (GAMP 5 2.0)4 can help identify the categories of software to be assessed. GAMP 5 2.0 outlines four categories currently used by the regulated industry:
- Category 1 – Infrastructure Software, Tools, and IT Services. These are established or publicly available layered software or infrastructure software tools. (Category 2 has since been removed.)
- Category 3 – Standard System Components. These include off-the-shelf software used for business purposes and are usually non-configurable or allow limited customization.
- Category 4 – Configured Components. These are software systems configured to business and regulated objectives to meet their intended use.
- Category 5 – Custom Applications and Components. These are software systems developed to meet specific requirements.
To use these categories effectively, develop a risk value for each category and include that value in the risk-based approach to computerized system validation. This could be as simple as denoting a category as not applicable, low, medium, or high, or as structured as assigning a numerical value that feeds into an algorithm for calculating the overall risk of the regulated computerized system, as sketched below.
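As one illustration, and purely as an assumption about how an organization might assign numbers (GAMP 5 itself does not prescribe these values), a category-to-risk mapping could look like this:

```python
# Hypothetical mapping of GAMP 5 software categories to numeric risk values.
# The numbers are illustrative; each organization defines and justifies its own scale.
GAMP_CATEGORY_RISK = {
    1: 0,  # Category 1: infrastructure software, tools, and IT services -> not applicable/low
    3: 1,  # Category 3: standard system components -> low
    4: 2,  # Category 4: configured components -> medium
    5: 3,  # Category 5: custom applications and components -> high
}

def category_risk(category: int) -> int:
    """Return the numeric risk value assigned to a GAMP software category."""
    try:
        return GAMP_CATEGORY_RISK[category]
    except KeyError:
        raise ValueError(f"Unknown or retired GAMP category: {category}") from None

print(category_risk(4))  # -> 2
```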
Risk 3: Functionality
Today, many in industry do not consider functionality a risk. The article “RAMP (Risk Assessment and Management Process): An Approach to Risk-Based Computer System Validation and Part 11 Compliance”5 provides another perspective, defining and rating a computerized system’s risk according to eight main system functions.
Modification of RAMP Table 4: Computerized System Functionality
What Function Does This Computerized System Perform?

| Main System Function | Description | Score (1-3, low to high risk) |
| --- | --- | --- |
| Electronic data capture (EDC) | Systems that facilitate the electronic capture of data. | 3 |
| Data entry/data modification | Systems used to enter, modify, or delete electronic records. | 3 |
| Data calculations, transformations, or derivations | Systems used to create new stored data by changing the data format (e.g., changing an alpha value to a numeric value) or by deriving it from other stored data. Note: If there is a significant impact on patient health or product safety based on the type of data being transformed or derived, increase the value to 3. | 2 |
| Data analysis/reporting | Systems used to analyze data. The output may be in the form of data sets or listings, graphs, or reports. | 2 |
| Submission creation | Systems used to collate and publish a regulatory submission or report. | 2 |
| Data transport | Systems used to move electronic records from one platform to another. | 1 |
| Data browsing | Systems used to interrogate data for accuracy, quality, and reliability, or other pre-analysis purposes. Output would not be used for any regulated activity. | 1 |
| Data/document storage and distribution | Systems used to facilitate the storage of data/documents required to be retained. | 1 |
The above scores were part of an overall risk assessment process developed shortly after the FDA Part 11 Scope and Application guidance document was finalized. The RAMP article details how to develop this approach when revising an existing risk-based approach to validating a regulated computerized system.
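A minimal sketch of how the functionality scores above could be encoded, including the table's note about raising the score when transformed or derived data affects patient health or product safety, follows; the dictionary keys are hypothetical labels, not terms from the RAMP article.

```python
# Hypothetical encoding of the functionality scores adapted from RAMP Table 4.
FUNCTIONALITY_SCORES = {
    "electronic_data_capture": 3,
    "data_entry_or_modification": 3,
    "calculations_transformations_derivations": 2,
    "data_analysis_reporting": 2,
    "submission_creation": 2,
    "data_transport": 1,
    "data_browsing": 1,
    "data_document_storage_and_distribution": 1,
}

def functionality_score(function: str, impacts_patient_or_product: bool = False) -> int:
    """Look up the functionality score and apply the table's note: raise
    calculation/transformation/derivation systems to 3 when the data being
    transformed or derived significantly impacts patient health or product safety."""
    score = FUNCTIONALITY_SCORES[function]
    if function == "calculations_transformations_derivations" and impacts_patient_or_product:
        score = 3
    return score

print(functionality_score("calculations_transformations_derivations", True))  # -> 3
```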
Risk 4: Distribution
Distribution of software solutions played a key role in the Scope and Application guidance document because the software industry was moving away from mainframe computers and client-server architectures toward web-based applications and software as a service (SaaS) computerized systems. The table below should be part of any risk-based approach to evaluating computerized systems.
Modification of RAMP Table 5: Computerized System Distribution
How Is This Computerized System Distributed?

| System Distribution | Description | Score |
| --- | --- | --- |
| Custom-designed, highly configured, or contains open-source software | The system was designed, developed, or highly configured for or by a user organization, or the application is or contains open-source software. | 5 |
| Multi-industry, limited use | The system was designed and developed for general purposes across many industries, academia, and government, but is not widely used. | 4 |
| Regulated industry, limited use | The system was designed and developed for regulatory purposes across many industries, academia, and government, but is not widely used. | 3 |
| Regulated industry, widely used | The system was designed and developed for regulatory purposes across many industries, academia, and government, and is widely used. | 2 |
| Multi-industry, widely used | The system was designed and developed for general purposes across many industries, academia, and government, and is widely used. | 1 |
This unique approach can help estimate the distribution of a software application in commerce. Many software applications today may be SaaS, off-the-shelf, or custom code. For example, a SaaS application that crosses many industries but has limited overall use would be classified as higher risk than a regulated SaaS application that is used broadly. The reason for this classification is simple: the widely used regulated SaaS application has far more users than the multi-industry SaaS application with limited use. The more an application is used across industry, academia, and government, the more opportunities there are to find the bugs and program errors that were not caught during the application's coding and testing, which in turn improves the software application.
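Encoded the same way, and again only as an illustrative assumption, the distribution scores above become a simple lookup:

```python
# Hypothetical encoding of the distribution scores adapted from RAMP Table 5.
DISTRIBUTION_SCORES = {
    "custom_or_open_source": 5,
    "multi_industry_limited_use": 4,
    "regulated_industry_limited_use": 3,
    "regulated_industry_widely_used": 2,
    "multi_industry_widely_used": 1,
}

# A widely used, regulated SaaS application carries less distribution risk than
# a multi-industry SaaS application with limited use, as discussed above.
print(DISTRIBUTION_SCORES["regulated_industry_widely_used"])  # -> 2
print(DISTRIBUTION_SCORES["multi_industry_limited_use"])      # -> 4
```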
Risk 5: Vulnerability
This risk level is usually documented and discussed within a facility’s backup and restore SOP and business continuity plan. However, the vulnerability of a computerized system that supports patient health, product safety, or business objectives is still a risk; it should be evaluated and discussed, and a disaster recovery plan should be in place. RAMP Table 8 on vulnerabilities is an example of how this risk can be incorporated into a documented risk-based approach.
Modification of RAMP Table 8: Vulnerability
Choose One Condition That Best Describes The Business Tolerance Of Downtime

| Condition | Definition | Score |
| --- | --- | --- |
| Low tolerance of downtime | If the system goes down for a short period of time, then there will be a negative impact on patient health, product safety, or business objectives. | 5 |
| Low tolerance and a contingency plan | Although there is a low tolerance for downtime, a tested contingency plan is in place. | 4 |
| Moderate tolerance of downtime | If the system goes down for a moderate period of time, then there will be a negative impact on patient health, product safety, or business objectives. | 3 |
| Moderate tolerance and a contingency plan | Although there is a moderate tolerance of downtime, a tested contingency plan is in place. | 2 |
| High tolerance of downtime (i.e., 5+ days) | If the system goes down for a long period of time, there will be little impact on patient health, product safety, or business objectives. | 1 |
| High tolerance and a contingency plan | There is a high tolerance of downtime, and a tested contingency plan is in place. | 0 |
This table can be modified to fit the particular regulatory and business objectives needed to protect patient health and product safety. The key to any disaster recovery plan is to periodically test the plan, critique the results, and revise the plan as needed. Each time the test plan is executed, something will be learned.
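To show how these factors might come together in a documented risk model, here is a minimal roll-up sketch; the equal weighting and thresholds are assumptions for illustration, not values from the RAMP article or any guidance document.

```python
# Hypothetical roll-up of the five risk factors into one overall risk level.
# Thresholds and equal weighting are illustrative; an organization's SOP
# defines and justifies its own scoring model.

def overall_system_risk(record: int, category: int, functionality: int,
                        distribution: int, vulnerability: int) -> str:
    """Sum the five factor scores and map the total to a risk band."""
    total = record + category + functionality + distribution + vulnerability
    if total >= 12:
        return "high"    # e.g., full validation with extensive testing
    if total >= 7:
        return "medium"  # e.g., focused verification of critical functions
    return "low"         # e.g., vendor assurance plus limited testing

# Example: critical records (3), configured system (2), EDC functionality (3),
# regulated and widely used SaaS (2), low downtime tolerance with a tested plan (4)
print(overall_system_risk(3, 2, 3, 2, 4))  # -> high
```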
AI And CSV
Today, AI appears to be everywhere. The regulated industry needs new concepts and procedures to validate and use AI. The existing guidance documents will provide the flexibility to develop new risk models and revise existing ones, and the industry will have to decide how best to assess AI against the various regulations, guidance documents, and directives. Additionally, some may consider using AI to create and validate a risk-based approach to CSV.
Summary
Validating regulated computerized systems will continue to evolve, sometimes as fast as technology is changing. The business and IT owners, as well as the SMEs and QAPs, must stay current with these changes to ensure regulated computerized systems are validated and compliant with regulations, guidance documents, and directives.
References:
1. FDA, Part 11, Electronic Records; Electronic Signatures – Scope and Application, 2003.
2. FDA, Draft Guidance: Computer Software Assurance for Production and Quality System Software, 2022.
3. FDA, Good Pharmacovigilance Practices and Pharmacoepidemiologic Assessment, 2005.
4. ISPE, GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems, Second Edition, 2022.
5. Richard M. Siconolfi and Suzanne Bishop, “RAMP (Risk Assessment and Management Process): An Approach to Risk-Based Computer System Validation and Part 11 Compliance,” Drug Information Journal, Vol. 41, pp. 69–79, 2007.
About The Author:
Richie Siconolfi earned a BS in biology (Bethany College, Bethany, WV) and an MS in toxicology (University of Cincinnati College of Medicine, Cincinnati). He has worked for The Standard Oil Co., Gulf Oil Co., Sherex Chemical Co., and the Procter & Gamble Co. Currently, Richie is a consultant in computer system validation, Part 11 compliance, data integrity, and software vendor audits (“The Validation Specialist”, Richard M. Siconolfi, LLC). Richie is a co-founder of the Society of Quality Assurance and was elected president in 1990. He is a member of the Beyond Compliance Specialty Section, Computer Validation IT Compliance Specialty Section, and Program Committee. Richie also is a member of Research Quality Assurance’s IT Committee and the Drug Information Association’s GCP/QA community. The Research Quality Assurance professional society named Richie a fellow in 2014.