Context
To ensure your first XM Cloud website performs reliably and securely and meets business and user requirements, a testing strategy needs to be in place before, during, and after go-live.
Execution
Start scoping the test plan early, during the Discovery and Project Setup phases - preferably as requirements are being collected, the delivery plan is being outlined, and roles and environments are being mapped.
By this point, clear testing goals must be defined: what specifically must be validated - functionality, user experience, performance, security? Consider the potential business risks of insufficient testing, and whether any regulatory or accessibility standards must be met. Addressing these questions up front provides a sound foundation for a successful, risk-minimized deployment.
Roles & Ownership
Just as important is defining ownership - clearly assign who is responsible for planning, executing, and validating each stage of testing to avoid gaps and ensure accountability.
Role | Key Testing Responsibilities |
---|---|
Product Owner | Leads UAT; validates delivery against business requirements; approves go-live |
Project Manager | Coordinates testing cycles, timelines, and resources across phases |
Business Analyst | Defines acceptance criteria; provides input to system testing; reviews accessibility standards |
Technical Lead | Reviews unit test coverage; oversees performance and security testing |
Architect | Oversees performance testing; supports security testing |
CMS Developer | Writes unit tests; implements and tests integrations |
Web Application Developer | Writes unit tests; implements and tests integrations; fixes accessibility and UI issues |
QA / Tester | Plans and executes tests across all layers; owns test automation; validates end-to-end behaviour |
End User / Business Stakeholder | Participates in UAT; provides acceptance feedback and sign-off |
Testing Plan
A testing plan should be set up based on the implementation requirements, outlining the key environments, testing layers, tools, and approaches required to confidently deliver and support the implementation. This not only aligns all stakeholders involved but also helps keep the project on track.
The plan should support both manual and automated testing practices and needs to include:
Area | Detail |
---|---|
Environment | Testing flows through clearly defined environments: development, QA, UAT/staging, and production, with each release validated before promotion to the next |
Testing Layers | Unit, integration, system, regression, performance, security, accessibility, and user acceptance testing |
Tools | Tools used across the project, including unit test frameworks, browser automation for end-to-end checks, accessibility scanners, and CI/CD pipeline integration |
Automation Focus | Unit, smoke, regression, and accessibility checks triggered automatically within the deployment pipeline, with defined pass/fail thresholds |
Manual Testing | Exploratory testing, UAT, and realistic authoring scenarios such as multilingual, personalized, and component-based layouts |
Testing Types & Timelines
The table below outlines the key types of testing involved in an XM Cloud implementation, along with typical timelines, purposes, and responsible roles. Each type serves a distinct purpose, from validating individual components to verifying business acceptance and accessibility compliance.
Testing Type | Timeline | Purpose | Responsible Roles |
---|---|---|---|
Unit Testing | Development (ongoing) | Validate individual logic, components, and schemas | CMS Developer, Web Application Developer – write unit tests; Technical Lead, QA/Tester – review coverage |
Integration Testing | Mid → Pre‑UAT | Confirm interaction between components, APIs, and middleware | Web Application Developer, CMS Developer – implement & test integrations; QA/Tester – validate end‑to‑end |
Performance Testing | Pre‑UAT → Pre‑Production | Measure responsiveness, scalability, and headless delivery performance | QA/Tester, overseen by Technical Lead & Architect |
Security Testing | Late Dev → Pre‑Go‑Live | Identify vulnerabilities, verify authentication, secure APIs | QA/Tester, supported by Architect & Technical Lead |
System Testing | Late Dev → Start of UAT | Validate full-site workflows and data flows | QA/Tester; input from Business Analyst, Technical Lead |
User Acceptance Testing | UAT Phase | Business-side validation against requirements and UX | Product Owner, Business Analyst; QA/Tester – facilitation; End Users / Business Stakeholders |
Regression Testing | Sprint end / Pre‑UAT | Ensure existing features work after changes | QA/Tester – own automation; Developers – maintain tests; Project Manager – coordinate cycles |
Accessibility Testing | Design sign-off → Pre‑Go-Live | Verify WCAG compliance, assistive tech usability | QA/Tester – run tools and manual checks; Web App Developer – fix UI issues; Business Analyst – review acceptable standards |
Insights
Maintain testing hygiene to preserve content stability and quality during testing: utilize dedicated environments to stage and verify content, and don't push placeholder or test content to production.
Authoring scenarios must mimic real-life use cases, such as multilingual, personalized, and component-based layouts, to validate appropriately.
Insert automated build verification tests into each deployment to identify significant problems early and retain confidence in the experience shipped.
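To make the build-verification idea concrete, here is a sketch of the evaluation step of a post-deployment smoke check. The page list and the all-pages-must-return-200 rule are assumptions; a real pipeline would collect the results from HTTP requests against the freshly deployed build.

```typescript
interface SmokeResult {
  path: string;   // page checked
  status: number; // HTTP status returned by the new build
}

// The build is verified only if every critical path responded with HTTP 200.
function buildVerified(results: SmokeResult[]): boolean {
  return results.length > 0 && results.every((r) => r.status === 200);
}

// Example results, as a deployment pipeline might collect them
// (in practice these would come from fetch() calls against the new build).
const results: SmokeResult[] = [
  { path: "/", status: 200 },
  { path: "/products", status: 200 },
  { path: "/contact", status: 500 }, // a failing page blocks the release
];

console.log(buildVerified(results) ? "build verified" : "block deployment");
```

Wiring a check like this into the deployment pipeline turns "retain confidence in the experience shipped" into an automatic gate rather than a manual step.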
Lastly, introduce a QA gate before deploying to production. This verifies that content and presentation have been checked, minimizing the possibility of regressions, broken layouts, or incorrectly configured experiences making it to end users.
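One part of such a QA gate can be sketched as a scan for placeholder content before promotion to production. The marker list and the content-item shape below are assumptions for illustration; the point is that the gate fails automatically when test content would otherwise reach end users.

```typescript
// Hypothetical content item as exported from an authoring environment.
interface ContentItem {
  path: string;
  text: string;
}

// Markers that indicate placeholder or test content (assumed list).
const PLACEHOLDER_MARKERS = ["lorem ipsum", "todo", "test page"];

// Returns the paths of items that should block promotion to production.
function findPlaceholderContent(items: ContentItem[]): string[] {
  return items
    .filter((item) =>
      PLACEHOLDER_MARKERS.some((m) => item.text.toLowerCase().includes(m))
    )
    .map((item) => item.path);
}

const items: ContentItem[] = [
  { path: "/home", text: "Welcome to our site" },
  { path: "/draft", text: "Lorem ipsum dolor sit amet" },
];

const blockers = findPlaceholderContent(items);
if (blockers.length > 0) {
  console.log(`QA gate failed, placeholder content at: ${blockers.join(", ")}`);
}
```

The same pattern extends to layout checks, such as failing the gate when a required rendering is missing from a page.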
Testing Checklist
As with any other process, documentation is key - a testing checklist should be maintained, along with the essential artefacts, to ensure confidence throughout.
Phase | Key Tests |
---|---|
Build | Unit testing, integration testing, automated build verification |
QA / UAT | System testing, regression testing, user acceptance testing |
Pre Go-Live | Performance, security, and accessibility testing; final regression pass; QA gate sign-off |
Post Go-Live | Smoke testing of the live site, content verification, regression testing on subsequent changes |
Testing artefacts provide the documentation, structure, and traceability needed to manage quality, and allow you to obtain sign-off in a consistent and accountable way.
Test Strategy Document
- Outlines your overall test objectives, approach, scope, environments, and responsibilities across teams.
- This is the foundational reference for all testing activities.

UAT Test Plan
- Defines which scenarios will be validated by business stakeholders.
- Includes acceptance criteria, test scripts, expected outcomes, and sign-off conditions.

Regression Test Checklist
- A reusable list of critical paths, UI elements, and component behaviours.
- Must be verified across every release or update.

Automated Test Inventory
- Maintain a documented list of tests triggered automatically within your deployment pipeline.
- Includes unit, smoke, accessibility, and regression checks with pass/fail thresholds.

Sign-off Logs
- Capture formal approval from QA and business owners for each key testing phase (QA complete, UAT complete, Go-live approved).
- These logs ensure traceability and accountability.