ASQ FD&C GMP Quality Conference
End User Testing for Performance
Qualification of GXP/ Part 11
Systems – Off The Shelf (OTS)
Dr. Teri Stokes, GXP International,
Concord, MA – USA
www.GXPInternational.com
Email: GXPINTL@RCN.com
Context for Computer Validation Testing
- Operating environment (Quality Control Lab)
- Computerized system (HPLC LIMS System): computer system (software + hardware) + people + equipment + SOPs + work process
- Two test scopes: Infrastructure Platform (IT/IQ) and Software Application (User/PQ)
Slide 2
User Acceptance at System Go-Live
HPLC LIMS Application Life Cycle (SDLC):

1. SYSTEM IDEA – Needs Analysis, RFP & Contract
2. SYSTEM PLAN – What? URS
3. DESIGN – How? FRS & SDD
4. BUILD – Program or Configure
5. TEST – Verify to SDD & Release (Test Fit to Design)
6. COMMISSION – Accept & Validate to URS (Test Fit to Work Process)
7. OPERATE – Use & Monitor
8. MAINTAIN – Fix & Modify
9. RETIRE – Decommission & Replace

Qualification mapping:
- IQ (Installation Qualification) – the User's IT Dept. qualifies the HPLC LIMS Platform: 1. Configure, 2. Install, 3. Test (Test Fit to Install Specs.)
- OQ (Operational Qualification) – covered by the HPLC LIMS Supplier's SDLC (Software Development Life Cycle); the user audits the supplier's OQ (Audit OQ)
- PQ (Performance Qualification) – the HPLC LIMS User Group tests fit to the work process
Slide 3
Change Control and Ongoing Testing
The same nine-phase life cycle governs change control after go-live:

1. SYSTEM IDEA – Needs Analysis, RFP & Contract
2. SYSTEM PLAN – What? URS
3. DESIGN – How? SDD
4. BUILD – Program or Configure
5. TEST – Verify to SDD & Release (Test Fit to Design)
6. COMMISSION – Accept & Validate to URS (Test Fit to Work Process)
7. OPERATE – Use & Monitor
8. MAINTAIN – Fix & Modify
9. RETIRE – Decommission & Replace

Changes (marked by a change symbol in the original diagram) enter on both sides:
- Software Supplier: SDLC – Software Development Life Cycle
- System User: Application Life Cycle
- Platform Systems change loop: 1. Change, 2. Install, 3. Test (Test Fit to Design & Work)
Slide 4
Application User’s PQ* Package

The standard user's CSV package, prepared and maintained by the User Department(s) Team. Its contents span System Control, Human Control, Testing Control, QMS, and Trace documents:

- Needs Analysis, RFP, Contract, URS, SLAs
- Validation Plan
- Test Plan(s)
- Test Cases, Scripts, Data & Result Logs
- Test Summary Report(s)
- Application Admin. SOPs & Application Config. Mgt. Logs
- Startup & Ongoing Change Control Log, QA Audit Log, Supplier Reports & BDG Minutes
- User Manuals, CVs, Training Records, Dept. SOPs, WIs, Help Logs
- Users’ PQ Package Summary Report

*PQ = Performance Qualification: Application meets URS workflows

Slide 5
CSV Package Team Model

A team to develop and maintain a CSV package:

- Package Sponsor – funds and approves validation work
- Team Leader (Project Mgr.) – leads the CSV Package Team
- Test Coordinator – Test Plan, Scripts…
- Package Manager – QC of package documents
- Ad hoc Members – Tester, Witness…
- Quality Assurance – trains team and audits the CSV package

Slide 6
Requirements are required for testing.
Slide 7
All formal testing requires a metric or
benchmark (e.g. requirement) to test
against. If you don’t know your “expected
results” then testing is meaningless and
validation is not possible.
What do you need in a system?
Slide 8
Identify infrastructure needs
Identify work process needs
Do You Have a Requirement or a Wish?
Slide 9
Step #1 Develop User Requirements
Slide 10
What Does the User’s Process Need? (URS)

Req. ID | System Role – User Req. | User Accept. Test | User’s Work Step
UR01 | Control user access and log off intruders | Use pos. and neg. IDs; log off after 3 failed tries | Secure data entry
UR02 | Audit data changes – Part 11 | Change items; check screen; run report | Track data edits
Slide 11
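The UR01 acceptance test above (positive and negative IDs; lockout after 3 failed tries) can be sketched as a toy model of the rule the tester challenges. This is an illustration only; the class, names, and threshold constant are hypothetical, not the LIMS vendor's API:

```python
class AccessGuard:
    """Illustrative model of the UR01 rule: lock the account after
    3 consecutive failed logon attempts (Part 11 access control)."""

    MAX_FAILED_TRIES = 3

    def __init__(self, valid_ids):
        self.valid_ids = set(valid_ids)
        self.failed = {}      # user id -> consecutive failures
        self.locked = set()   # locked-out user ids

    def attempt_login(self, user_id, password_ok):
        # Negative challenge: a locked or unknown ID must never get in.
        if user_id in self.locked or user_id not in self.valid_ids:
            return False
        if password_ok:
            self.failed[user_id] = 0   # positive challenge resets the counter
            return True
        self.failed[user_id] = self.failed.get(user_id, 0) + 1
        if self.failed[user_id] >= self.MAX_FAILED_TRIES:
            self.locked.add(user_id)   # "log off after 3 failed tries"
        return False

guard = AccessGuard(valid_ids=["analyst1"])
assert guard.attempt_login("analyst1", password_ok=True)      # positive ID
assert not guard.attempt_login("intruder", password_ok=True)  # negative ID
for _ in range(3):                                            # three failed tries
    guard.attempt_login("analyst1", password_ok=False)
assert not guard.attempt_login("analyst1", password_ok=True)  # now locked out
```

The assertions mirror the positive and negative challenges a PQ test script would record against the real system.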
What Does the User’s Process Need? (URS)

UR ID | Work Process Steps | User PQ Tests | Challenges
UR016 | Production Planner: picking list to production orders | Check special dosage; check authorized status; check rest or shelf life; check principles for picking a material; check principles for picking a batch; check signals for creating a picking list | What happens if material is physically non-existent?
Slide 12
What Does the User’s Process Need? (URS)

UR ID | Work Process Steps | User PQ Tests | Challenges
UR017 | Material is picked and taken to working station for weighing of components | (Finance Checker: check product costing); check special dosage; check reg. of loss (waste); check feedback of rest material to warehouse; check printing of labels (if weighing station) | Handling of wreckage; check correct storage bin
UR018 | Production Operator-1: feedback per workstation – time and materials | Check function in system; (Finance Checker: check signals to cost price); (Finance Checker: check product costing) | Creation of extra batches; check rules for yield
Data Capture – Lab URS View for PQ
Laboratory Workflow(s), governed by SOPs & Work Instructions, connect:

- Lab Methods
- Instruments & Devices
- Capture & Analysis Systems (PCs)
- Analysts & Supervisors
- Sample Handling
- Reagents
- Defined Raw Data
- Calculations & Results
- Verification & Analytical Report

Slide 13
OECD GLP – Acceptance Testing
• “…there should be evidence that the
system was adequately tested for
conformance with acceptance criteria …
prior to being put into routine use.”
• “Formal acceptance testing requires the
conduct of tests following a pre-defined
plan and retention of all testing
procedures, test data, test results, a
formal summary of testing, and a record
of formal acceptance.”
Slide 14
Step #2 – Map System to Work Process
1. Team agrees on a common workflow for use of the system in the work process (URS).
2. Workflow steps interacting with the
system are identified.
3. Critical workflow steps are described
in an SOP or Work Instruction.
Slide 15
Step #2 – Map System to Work Process

4. Each workflow step has positive and negative challenges identified in testing step procedures.
5. Test scripts are organized by logical groups of workflow activities into Test Cases.
6. Test Scripts and related documents are managed with large envelopes.

[Figure: Workflow Needs (URS) → critical work steps → test step procedures → test scripts → test envelopes with printouts]

Slide 16
Step #3 – Make Logical Groups of Test Scripts
Don’t get tangled up in lots of Test Scripts! Organize them into
Test Cases based on work process flow and requirements.
Slide 17
Step #3 – Make Logical Groups – Test Cases
Outline adapted from - IEEE Std. 829-1983
1. Test Case Identifier – unique & related to Test Plan ID
2. Test items - scope of features being tested & URS items
addressed with table of Test Scripts to be used
3. Input requirements - user roles, system privileges, input data
types ...
4. Output requirements - reports, listings, screen printouts
5. Environmental needs - user materials, training, physical setup
6. Special procedure requirements - anomaly handling
7. Inter-case dependencies - cases to be run prior to this
Slide 18
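As a sketch, the seven outline items above can also be captured as a structured record so every Test Case carries the same header fields. The field names below are paraphrases of the IEEE items, not an official schema, and all IDs are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One IEEE 829-style Test Case header (items 1-7 above, paraphrased)."""
    identifier: str                  # 1. unique, related to Test Plan ID
    test_items: list                 # 2. URS items/features, with Test Script IDs
    input_requirements: list         # 3. user roles, privileges, input data types
    output_requirements: list        # 4. reports, listings, screen printouts
    environmental_needs: list = field(default_factory=list)   # 5. materials, setup
    special_procedures: list = field(default_factory=list)    # 6. anomaly handling
    dependencies: list = field(default_factory=list)          # 7. cases run prior

tc03 = TestCase(
    identifier="TP01-TC03",
    test_items=["UR016", "UR017"],
    input_requirements=["Production Planner role", "released batch data"],
    output_requirements=["picking list report", "screen printout"],
    dependencies=["TP01-TC02"],      # System Setup/Admin must run first
)
assert tc03.identifier.startswith("TP01")
```

Keeping the header uniform makes inter-case dependencies (item 7) explicit and machine-checkable.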
User Test Case Descriptions (PQ)
To include normal, problem, and stress conditions in
user’s work process environment.
Slide 19
TC01. Work Area Preparedness – SOP, WI, Manual
TC02. System Setup/Admin – User Profiles, DB schema
TC03. Work Process – Activity A (vanilla run)
TC04. Work Process – Activity B (chocolate issues)
TC05. Work Process – Activity C (strawberry stresses)
TC06. Special Challenges – Multi-user, Problem work
Server Test Case Descriptions (IQ)
To include normal, problem, and stress conditions in
IT/IS environment.
Slide 20
TC01. Hardware Operations
TC02. Operating System (O/S) & Software Tools
TC03. Database Engine (DB) & Query Tools
TC04. Network Operations – LAN/WAN
TC05. Platform Routine Backup & Recovery
TC06. Platform Disaster Recovery – System & Data
Step #4 – Develop a Test Plan

[Figure: within the standard user's CSV package, prepared and maintained by the User Department(s) Team, the Test Plan(s) trace the URS to the Test Cases, Scripts, Data & Result Logs and to the Test Summary Report.]

Slide 21
Step #4a – Develop a Trace Matrix
No testing project is too big or too small for a trace matrix.
Always trace Test Scripts to Requirements to ensure that all
requirements have been sufficiently tested.
Slide 22
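The coverage check a trace matrix enables is mechanical. A minimal sketch, assuming requirements and Test Scripts are already listed by ID (all IDs below are hypothetical):

```python
def untested_requirements(requirements, trace_matrix):
    """Return URS IDs with no Test Script tracing to them.

    trace_matrix maps Test Script ID -> list of URS IDs it covers.
    """
    covered = {ur for urs_ids in trace_matrix.values() for ur in urs_ids}
    return sorted(set(requirements) - covered)

urs = ["UR01", "UR02", "UR016", "UR017", "UR018"]
trace = {
    "TS-001": ["UR01", "UR02"],
    "TS-014": ["UR016"],
    "TS-015": ["UR017"],
}
print(untested_requirements(urs, trace))   # → ['UR018']
```

Any ID the function returns is a requirement with no test coverage, and therefore a gap in the PQ package that the Test Coordinator must close before testing is declared complete.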
IEEE Format for a Software Test Plan
Software Test Plan Outline - IEEE Std. 829-1983
1. Test plan identifier
2. Introduction
3. Test items
4. Features to be tested
5. Features not to be tested
6. Approach
7. Item pass/fail criteria
8. Suspension criteria & resumption requirements
Slide 23
IEEE Format for a Software Test Plan
Software Test Plan Outline - IEEE Std. 829-1983
9. Test deliverables
10. Testing tasks
11. Environmental needs
12. Responsibilities
13. Staffing and training needs
14. Schedule
15. Risks and contingencies
16. Approvals (IEEE Tel: 800-678-4333)
Slide 24
Who Is Who in Testing?
Test Coordinator – writes test plan, test cases, and
prepares testing materials. Trains testers & witnesses,
manages testing process, and tracks results and
anomalies. Writes conclusion per case. Writes test
summary report. Organizes test records in a test records
binder.
Slide 25
Roles &
Responsibilities
Who Is Who in Testing?
Script author - can never be tester or approver of same script, but
can witness testing of same script
Script approver – independent reviewer of script for content and
strategy
Tester – performs test, records system response, and signs script for
accurate recording of system response
Witness - verifies that tester is prepared and ready to perform test,
checks presence of final test records, and signs script for
compliance to testing practices SOP
Slide 26
Roles &
Responsibilities
Test Script Author Guidelines – High 5
1. Keep Tester Alert – author scripts that can be executed
within 30-50 minutes.
2. Rule of Three – test key functions 3 times using a
variety of challenges: normal, problem A, and issue B.
3. Trace to Requirements – identify requirements being
addressed by each script.
4. System Output – have at least one screen print or
report per test script. Identify label info needed for
output.
5. More than Pass/Fail – have tester write out an
observed system response.
Slide 27
Part 11 Guidance*: GXP Testing Principles
5.1 System Requirements Specifications: Established,
documented end user requirements are essential. (GXP,
Part 11, work process – scanning, scalability, operating
environment requirements)
5.2 Documentation of Validation Activity:
Management approved plan(s), procedures and
report(s).
5.3 Equipment Installation: Qualify installation prior to
testing.
Slide 28 * Retracted Draft Guidance
Part 11 Guidance*: GXP Testing Principles
5.4 Dynamic Testing: 5.4.1 - Test conditions, simulation
tests, and live, user-site tests. 5.4.2 - Software testing to
include structural, functional, and program build
testing. 5.4.3 – Quantifiable tests recorded in
quantified, not pass/fail terms.
5.5 Static Verification Techniques: Document and code
inspections, walk-throughs, and technical reviews.
5.6 Extent of Validation: System risk to product safety,
efficacy, and quality. System risk to data integrity,
authenticity, and confidentiality. System complexity.
Slide 29 * Retracted Draft Guidance
Part 11 Guidance*: GXP Testing Principles
5.7 Independence of Review: CSV to be performed by
persons other than system developers.
5.8 Change Control (Configuration Management):
Systems to control changes and evaluate extent of
revalidation needed. Regression testing is a key tool.
5.6 Extent of Validation: Based on
– System risk to product safety, efficacy, and quality
– System risk to data integrity, authenticity, and
confidentiality
– System complexity.
Slide 30 * Retracted Draft Guidance
Step #5 - Prepare Test Script Materials

Standard Test Script Envelope (a large mailing envelope), prepared by the Test Coordinator & Package Team, contains:

- Test Script Step Procedures
- Test Data Form, SOP, Work Instructions
- Result Logs & System Issues
- Incident Form (Red Note) & WI
- Witness Checklist
- URS Trace & Script Pass/Fail, T/W Signatures
- Support Materials, Result Printouts...

Slide 31
Step #6 - To Do’s in Formal Testing
1. Check script, logs, data, & testing area.
2. Record all system responses as they occur.
3. Use only indelible ink to record results.
4. Correct with single line through record & new
result next to it. Initial, date & explain.
5. Draw single line through unused log spaces.
6. Label, sign & date all printouts or CD/diskettes.
Slide 32
Step #6 - The Don’ts in Formal Testing
1. Don’t use pencil or other erasable media.
2. Don’t correct by using white-out or scribble over to obliterate
prior result.
3. Don’t use check marks, dittos, Y/N or other abbreviations.
Write results & comments in full.
4. Don’t leave large blank spaces in result logs. Line through.
5. Don’t forget to sign & date all output documents & logs.
Slide 33
Witness Participation in Testing
1. Checks contents of T.S. Envelope with
Tester for completeness and understanding.
2. Checks System Setup with Tester for logon
access. Watches start of first testing step.
3. After testing, T.S. Envelope is re-examined.
Checks Test Script and printouts for
completeness, signatures, dates and labels.
Slide 34
Step #7 – Prepare a Test Summary Report
Testing is finished at last!!
Slide 35
Reporting on Test Results
[Figure: within the standard user's CSV package, prepared and maintained by the Site User Team, the Test Summary Report closes the trace from the URS through the Test Plan(s), alongside the System Control and Human Control documents.]

Slide 36
Test Plan Summary Report
1. Test Summary Report Identifier - Unique ID traceable to
associated Test Plan.
2. Summary - Describes items tested (application version), test
environment (platform system), and test documentation used (Test Cases,
Test Scripts, T.S. Envelope contents).
3. Variances - States any deviations from Test Plan or Test Scripts and
reason why.
4. Comprehensive Assessment - Discusses assumptions and limits
to scope of testing. Were scope of testing and results obtained sufficient to
assess system reliability? Discuss reasons for limits chosen.
IEEE Std. 829-1983 Adapted
Slide 37
Test Plan Summary Report (continued)

5. Summary of Results - Gives table of testing results per Test Case.
Table of anomalies and their resolutions. List of outstanding issues and
risks (unresolved anomalies).
6. Evaluation - Pass/Fail conclusion based on test results and criteria
in the Test Plan.
7. Summary of Activities - Tester/Witness staffing, task list from
Test Plan with updated status.
8. Approvals - Names, titles, signatures, dates and meaning of
signatures.
9. Appendix - Table of Contents list for test documentation.
IEEE Std. 829-1983 Adapted
Slide 38
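Item 5's per-case results table can be tallied from the individual script results. A sketch under the assumption that each result records its Test Case, a pass/fail outcome, and any anomaly with a resolved flag (all IDs are illustrative):

```python
from collections import defaultdict

def summarize_results(script_results):
    """Tally pass/fail per Test Case and collect unresolved anomalies.

    script_results: list of dicts with keys 'case', 'script',
    'passed' (bool), and 'anomaly' (None, or dict with 'id' and 'resolved').
    """
    per_case = defaultdict(lambda: {"pass": 0, "fail": 0})
    open_anomalies = []
    for r in script_results:
        per_case[r["case"]]["pass" if r["passed"] else "fail"] += 1
        a = r.get("anomaly")
        if a and not a["resolved"]:
            open_anomalies.append(a["id"])   # outstanding issue/risk (item 5)
    return dict(per_case), open_anomalies

results = [
    {"case": "TC03", "script": "TS-014", "passed": True, "anomaly": None},
    {"case": "TC04", "script": "TS-015", "passed": False,
     "anomaly": {"id": "AN-02", "resolved": False}},
]
cases, open_issues = summarize_results(results)
assert cases["TC04"]["fail"] == 1 and open_issues == ["AN-02"]
```

The unresolved-anomaly list feeds directly into the risks section of item 5 and the Pass/Fail evaluation of item 6.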
Step #7 - Store Test Results for Audit

Site Test Binder (prepared by the Test Coordinator at the site; one per site, e.g. Site A) contains:

- SOP for Testing
- T/W Training Records
- T. Case 1 - SOP & Document Review: test scripts, test data & Result Logs; T. Case 1 Summary Report
- T. Case 2 - System Admin. Workflow (step by step): test scripts, test data & Result Logs; T. Case 2 Summary Report
- T. Case 3 - Work Process Workflow(s) (step by step): test scripts, test data & Result Logs; T. Case 3 Summary Report
- Site Test Summary Report

Slide 39
Step #8 – Write an Ongoing Test Plan (OTP)

Over time systems change. An Ongoing Test Plan sets up the
process for maintaining a validated status as change occurs.
It identifies major, moderate, and minor types of system
change, the corresponding degree of formal testing to be
performed, and any validation tasks to be updated.

Slide 40
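The OTP's decision table can be kept as simple lookup data. The three change categories come from the slide; the retest scopes shown are illustrative assumptions, not a regulatory rule:

```python
# Illustrative OTP decision table: change category -> degree of retest.
# The scopes are examples of what an OTP might prescribe, not requirements.
OTP_RETEST = {
    "major":    "re-execute full PQ test cases; update URS, Test Plan & Trace",
    "moderate": "targeted regression of affected Test Cases; update Trace",
    "minor":    "document change; smoke-test affected function only",
}

def retest_scope(change_category):
    """Look up the formal testing owed for a classified system change."""
    try:
        return OTP_RETEST[change_category.lower()]
    except KeyError:
        raise ValueError(f"unclassified change: {change_category!r}")

print(retest_scope("Moderate").split(";")[0])
# → targeted regression of affected Test Cases
```

Forcing every change through the classification (and failing loudly on an unclassified one) is what keeps the validated status defensible under change control.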
Audit Points for Test Documentation
• Test Plan was approved with or after Validation Plan approval
date and before Test Script approval dates.
• Test Plan describes system to be tested, gives specific strategy
for testing, and identifies tasks and roles responsible.
• All issues arising are recorded, tracked, and resolved.
• Repeat testing is performed using a new copy of a script and
the run number is identified.
• Test Script author is not tester or approver for same script.
Slide 41
Audit Points for Test Documentation
• All printouts are labelled with tester ID, date, and script ID.
• All log entries are made in indelible blue or black ink.
• No abbreviations (P/F, Y/N), ditto or check marks were used.
• Signature page identifies names, initials, signatures, and
Tester/Witness roles.
• Test Script identifies Requirements being tested, Testing Site,
Test Run, Author, Approver, Tester, Witness and overall
conclusion.
Slide 42
Audit Points for Test Documentation
• Test Summary Report clearly describes what happened, how
problems were handled, who was responsible, and how the Test
Plan was followed or what deviations were made and why.
• Test Summary Report should show that test execution was
consistent with Test Plan strategy.
• PQ testing uses GXP work process SOPs and work instructions
to test their suitability for working with the system.
• IQ testing uses IT Dept. SOPs and work instructions to test their
suitability.
Slide 43
Formal Testing Major Themes

MANAGEMENT CONTROL
• Formal Testing Practices SOP
• Approved Test Plans, Test Scripts, & Test Summary Reports
• Anomaly Tracking Process
• User & Support Documents

SYSTEM RELIABILITY
• Requirements & Specifications
• Trace Matrix – Tests to Requirements
• Defined Acceptance Criteria in Test Procedure
• Limits, Logic & Problem Testing

DATA INTEGRITY & PRIVACY
• Tester & Witness Signatures
• Result Logs in Indelible Ink
• IQ for Automated Testing Tools
• Security & Disaster Recovery Tests
• Audit Trail Tests

AUDITABLE QUALITY
• Approved Plans & Reports
• Traceable test coverage for all Requirements & Specifications
• Documented Anomaly Resolution
• Independence of Testing Process

Slide 44
Common Sense Computer Validation
ASQ FD&C GMP Quality Conference
Thank You!
Merci
Tak, Tack, Takk
Gracias
Obrigado
Spasibo
Nandri
Cobjai

It's the little things that can bite you
in an audit of test records!
How would you answer these questions?
The software supplier has tested this software a lot and
other people have bought it and are using it, so why do
we have to validate it with our own testing?
Why test off the shelf systems that everyone else is using
in their businesses?
Why must end users be involved in the PQ testing?
Can’t IT or outside testing folks do the job for us?
Our job is our work process; how should we know what
to do for testing a computer system?
Any Questions or Comments?
??