
QMS ISO 13485

ISO 13485 Quality Management System implementation and maintenance.

$skillfish add ~skills/quality-manager-qms-iso13485

Real Conversations

See how the QMS ISO 13485 skill responds to actual requests. These are representative examples of what your AI agent will produce.

You

We're developing a new Class IIa medical device — a connected glucose monitor with a companion app. I need to set up design controls per ISO 13485 §7.3. What does the design control process look like and what records do I need?

Agent

Design Control System — ISO 13485 §7.3 + MDR Compliance

Design controls are the most audited clause in ISO 13485. Every stage must be documented and the records form your Design History File (DHF).

Design control phases and required records:

Phase 1: Design and Development Planning (§7.3.2)

  • Design plan: stages, responsibilities, review/verification/validation activities, group interfaces (HW, SW, clinical)
  • Must be updated as development progresses

Phase 2: Design Inputs (§7.3.3)

  • User needs document
  • Intended use statement
  • Functional, performance, and safety requirements
  • Applicable standards: IEC 62304 (SW), IEC 60601-1 (safety), IEC 62366 (usability), ISO 14971 (risk)
  • Regulatory requirements: MDR Annex I (General Safety and Performance Requirements)

Phase 3: Design Outputs (§7.3.4)

  • Hardware specifications
  • Software architecture and detailed design
  • Labeling (IFU, packaging)
  • Each output must be traceable to a design input

Phase 4: Design Review (§7.3.5)

  • Formal review meetings at key milestones
  • Required records: attendees list (including independent reviewer), issues and resolutions, action items, formal sign-off

Phase 5: Design Verification (§7.3.6) "Did we build it right?" — does the output meet the input?

  • Verification test protocols and reports
  • Software unit/integration testing (per IEC 62304)
  • Each requirement traced to a verification test result

Phase 6: Design Validation (§7.3.7) "Did we build the right thing?" — does it meet user needs in real conditions?

  • Clinical evaluation plan and report (MDR Art. 61)
  • Usability testing — summative evaluation per IEC 62366
  • For connected device: cybersecurity validation

Phase 7: Design Transfer (§7.3.8)

  • Manufacturing procedures verified and approved
  • Training records for production staff
  • First article inspection / pilot build records

Phase 8: Design Changes (§7.3.9)

  • Any change must be: documented, reviewed, verified/validated as appropriate, approved before implementation

Traceability matrix (DHF backbone):

User Need → Design Input → Design Output → Verification → Validation
UN-001: Patient needs glucose readings accurate to ±10 mg/dL
  → DI-003: Accuracy spec ±10 mg/dL at 40–400 mg/dL range
    → DO-005: Sensor calibration algorithm v2.1
      → VER-012: Bench test report — 500 samples, all within spec
        → VAL-007: Clinical study — 98.2% within ±10 mg/dL vs. YSI reference
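
As a rough illustration, the chain above can be checked programmatically. This is a minimal Python sketch; `TraceItem`, `find_gaps`, and the record IDs are hypothetical and mirror the example, not any standard DHF tooling.

```python
from dataclasses import dataclass, field

# Hypothetical record IDs (UN-001, DI-003, ...) mirror the example chain above.
@dataclass
class TraceItem:
    item_id: str                                  # e.g. "UN-001"
    description: str
    links_to: list = field(default_factory=list)  # downstream record IDs

def find_gaps(items):
    """Return IDs of items with no downstream link.

    In a DHF traceability matrix, a user need with no design input,
    or a design output with no verification test, is an audit finding.
    """
    terminal_ok = {"VAL"}  # validation results end the chain
    gaps = []
    for it in items.values():
        prefix = it.item_id.split("-")[0]
        if not it.links_to and prefix not in terminal_ok:
            gaps.append(it.item_id)
    return gaps

chain = {
    "UN-001": TraceItem("UN-001", "Glucose readings accurate to ±10 mg/dL", ["DI-003"]),
    "DI-003": TraceItem("DI-003", "Accuracy spec ±10 mg/dL at 40–400 mg/dL", ["DO-005"]),
    "DO-005": TraceItem("DO-005", "Sensor calibration algorithm v2.1", []),  # missing VER link
    "VAL-007": TraceItem("VAL-007", "Clinical study vs. YSI reference", []),
}

print(find_gaps(chain))  # → ['DO-005']
```

The same check scales to a real matrix exported from a requirements tool: any non-terminal record with an empty downstream list is a gap to close before audit.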

Want me to generate the design plan template, traceability matrix, and design review agenda template?

Get Started

How to install and use this skill in your preferred environment.

Skills are designed for AI coding agents (Claude Code, Cursor, Windsurf) and IDE-based workflows where the agent can read files, run scripts, and act on your codebase. Web-based AI can use the knowledge and frameworks, but won't have tool access.

Models & Context

Which AI models and context windows work best with this skill.

Recommended Models

Best
Claude Opus 4, Claude Sonnet 4, GPT-4.1, Gemini 2.5 Pro, Grok 3, Kimi K2
Good
Claude Haiku 4.5, GPT-4.1 mini, Gemini 2.5 Flash, Grok 3 mini

Larger models produce more detailed, production-ready outputs.

Context Window

This skill's SKILL.md is typically 3–10 KB — fits in any modern context window.

  • 8K: skill only
  • 32K+: skill + conversation
  • 100K+: skill + references + codebase

All current frontier models (Claude, GPT, Gemini) support 100K+ context. Use the full window for complex work spanning multiple procedures and documents.

Pro tips for best results

1

Be specific

Include specifics (device class, intended markets, product count) so the skill can scope the QMS correctly.

2

Share constraints

Compliance needs, team size, and existing stack all improve the output.

3

Iterate

Start with a high-level QMS structure, then ask follow-ups for procedure templates, audit checklists, or a gap analysis.

4

Combine skills

Pair with companion skills below for end-to-end coverage.

Good to Know

Advanced guide and reference material for QMS ISO 13485. Background, edge cases, and patterns worth understanding.


ISO 13485 Clause Map

What auditors focus on, by clause:

  • Clause 4, Quality Management System: QMS scope and exclusion justification, documented information control, record retention periods, regulatory file structure
  • Clause 5, Management Responsibility: quality policy signed and dated by top management, management review meeting minutes with required inputs/outputs, QMR appointment
  • Clause 6, Resource Management: competence records for all staff in quality-affecting roles, infrastructure maintenance logs, work environment controls for manufacturing
  • Clause 7, Product Realization: design controls (§7.3), purchasing controls and supplier qualification, production and service provision, traceability, customer property
  • Clause 7.3, Design and Development: the most-cited nonconformance clause — see below
  • Clause 7.4, Purchasing: approved supplier list, supplier evaluation records, incoming inspection criteria
  • Clause 8, Measurement, Analysis, Improvement: internal audit programme and reports, CAPA system records, complaint handling, nonconforming product disposition

Clause 8 is evidence-heavy: Auditors expect to pull individual CAPA records and trace from complaint or nonconformance through root cause analysis, corrective action, effectiveness check, and closure. Gaps in any step are findings.
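
As a sketch of the evidence trail auditors pull, the following models a CAPA record with the steps listed above. `CapaRecord`, `open_findings`, and the field names are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative model of the CAPA audit trail: each record must show every
# step from trigger through closure. Field names are assumptions.
@dataclass
class CapaRecord:
    capa_id: str
    trigger: str                              # complaint or nonconformance reference
    root_cause: Optional[str] = None
    corrective_action: Optional[str] = None
    effectiveness_check: Optional[str] = None
    closed: bool = False

REQUIRED_STEPS = ("root_cause", "corrective_action", "effectiveness_check")

def open_findings(capa: CapaRecord) -> list:
    """List the missing steps an auditor would flag as gaps."""
    missing = [s for s in REQUIRED_STEPS if getattr(capa, s) is None]
    if capa.closed and missing:
        missing.append("closed without completed steps")
    return missing

capa = CapaRecord("CAPA-2024-014", trigger="Complaint C-0231",
                  root_cause="Sensor adhesive lot variability",
                  corrective_action="Tightened incoming inspection criteria",
                  closed=True)
print(open_findings(capa))  # → ['effectiveness_check', 'closed without completed steps']
```

Closing a CAPA before the effectiveness check is recorded is exactly the kind of gap the paragraph above describes: every step missing from the trail is a finding.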

Design Controls (Clause 7.3)

Design controls are the most cited nonconformance area in ISO 13485 audits. The clause defines a sequential, linked loop that must produce documented evidence at every stage:

The design control loop:

Design Inputs → Design Outputs → Verification → Validation → Transfer
       ↑                                                           ↓
       └──────────────── Design Changes (§7.3.9) ←───────────────┘

Design reviews (§7.3.5) must occur at planned stages throughout the loop and involve at least one participant independent of the design function.

Verification vs. validation — the precise distinction:

Design Verification (§7.3.6):

  • Question answered: Did we build it right?
  • Compares: output against input specifications
  • Performed under: defined test conditions
  • Typical evidence: test protocols and reports, software unit testing, dimensional inspection

Design Validation (§7.3.7):

  • Question answered: Did we build the right thing?
  • Compares: device against user needs and intended use
  • Performed under: conditions similar to actual use (simulated or real)
  • Typical evidence: clinical data, usability summative evaluation, field study, risk-benefit analysis

The common failure mode: Treating performance testing (bench test against specifications) as validation. It is not — that is verification. Validation must demonstrate suitability for actual use by intended users in the intended environment.

Traceability Requirements

Traceability in ISO 13485 operates at two levels: design traceability (DHF) and device traceability (DHR/DMR).

Design file traceability chain:

User Need → Design Input → Design Output → Verification result → Validation result

Each link must be documented in a traceability matrix. A user need with no design input, or a design output with no verification test, is an automatic finding.

Device history record (DHR) traceability — what must trace to what:

  • Finished device unit → manufacturing records, inspection results, release status (§7.5.9)
  • Component lot → supplier, incoming inspection record, devices built with that lot (§7.5.9)
  • Nonconforming product → disposition record, CAPA if systematic (§8.3)
  • Risk management file → design inputs, verification/validation activities, post-market data (ISO 14971 integration)

Why traceability gaps are high-severity findings: A missing link in the DHF-to-DHR chain means you cannot demonstrate that the device reaching a patient was built to the approved design. Notified bodies and FDA treat this as a potential patient safety issue, not a paperwork deficiency. Severity classification during audit is typically major or even critical depending on the product class.

Nonconformance Severity Classification

Classification drives CAPA requirements, audit outcomes, and regulatory notification obligations.

Critical:

  • Definition: actual or potential patient harm; regulatory violation (field safety, MDR Art. 87 reportable event)
  • CAPA requirement: mandatory CAPA with executive oversight and defined timeline
  • Audit outcome: may result in certification suspension; regulatory authority notification likely required

Major:

  • Definition: systemic failure of a QMS requirement; absence of a required element; multiple related minor findings
  • CAPA requirement: CAPA required; root cause analysis mandatory; effectiveness verification before closure
  • Audit outcome: may delay or condition certification; noted in audit report as a requirement for resolution before the next audit

Minor:

  • Definition: isolated, low-risk deviation; procedural gap with no systemic pattern
  • CAPA requirement: correction required; CAPA at the organization's discretion
  • Audit outcome: noted in audit report; evidence of correction verified at next surveillance audit
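
The criteria above can be condensed into a toy decision rule. `classify_finding` and its inputs are illustrative only; real classification is an auditor's judgment, not a function.

```python
# Simplified decision rule for the severity criteria above. Encodes only the
# stated factors: harm risk, systemic scope, and repetition across audits.
def classify_finding(patient_harm_risk: bool, systemic: bool, repeat: bool) -> str:
    if patient_harm_risk:
        return "Critical"       # actual or potential patient harm
    if systemic or repeat:
        return "Major"          # repeat minors are often reclassified as major
    return "Minor"

print(classify_finding(patient_harm_risk=False, systemic=False, repeat=True))  # → Major
```

The `repeat` branch reflects surveillance-audit behavior: a recurring minor in the same process area is treated as evidence of a systemic gap.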

How classification affects ongoing operations: A critical finding during a surveillance audit typically triggers an unscheduled follow-up audit before the next certification cycle. A pattern of recurring minor findings in the same process area is often reclassified as a major at the next audit — auditors document repeat findings explicitly.

Self-classification during internal audits: Train internal auditors to classify conservatively. An internal finding classified as minor that an external auditor reclassifies as major signals that your internal audit function lacks rigor — itself an audit finding.

Audit Timing Strategy

Stage 1 — Readiness audit (document review): Focus: Does your documented QMS cover all required processes? Do you have the right documents?

Stage 1 preparation priorities:

  • Quality manual complete, scope defined and justified
  • All mandatory procedures present (document control, record control, internal audit, CAPA, nonconforming product, complaint handling)
  • At least one complete internal audit cycle with records
  • At least one management review with minutes showing required inputs were addressed
  • Design control procedures in place even if no products are in active development

Stage 2 — Implementation audit (on-site verification): Focus: Are you actually doing what the documents say? Do your records prove it?

Stage 2 preparation priorities:

  • Pull and review your own CAPA records for completeness before the audit
  • Verify training records are current and linked to the current document versions
  • Confirm all approved suppliers on your Approved Supplier List have evaluation records on file
  • Check that DHRs exist for any devices built during the QMS period
  • Brief all staff who may be interviewed — auditors routinely speak to operators, not just the QMR

The gap between stages: Typically 4–12 weeks. Use it to close Stage 1 findings, not to implement processes that should have been in place. Implementing a CAPA system for the first time between Stage 1 and Stage 2 signals that Stage 1 readiness was overstated.

Ready to try QMS ISO 13485?

Install the skill and start getting expert-level guidance in your workflow — any agent, any IDE.

$skillfish add ~skills/quality-manager-qms-iso13485