What a first-pass report can tell you
A first-pass automated scan checks the rendered HTML of one page for common machine-detectable patterns. It can find:
- Missing or generic page titles
- Missing page language declarations
- Missing or multiple H1 headings
- Skipped heading levels
- Empty headings
- Missing, empty, filename-like, or generic image alt text
- Unlabeled form fields
- Placeholder-only form labels
- Empty or generic link text
- Buttons without accessible names
- Duplicate IDs
- Iframes without titles
- Empty aria-label attributes
- Invalid ARIA roles
- Focusable content inside aria-hidden containers
- Some inline colour contrast problems
- Viewport zoom restrictions
- Missing skip links and accessibility statement links
These are real issues that affect real users. Fixing them is a practical step toward a more accessible site.
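To make the idea of "machine-detectable patterns" concrete, here is a minimal sketch of a first-pass scanner using only Python's standard-library HTML parser. It checks three of the patterns listed above (missing language declaration, images without alt attributes, duplicate IDs); it is illustrative only, not the SiteCheck Canada engine, and the finding messages are invented for the example.

```python
from html.parser import HTMLParser

class QuickScan(HTMLParser):
    """Illustrative first-pass scanner for a few machine-detectable
    patterns. Not the actual SiteCheck Canada implementation."""

    def __init__(self):
        super().__init__()
        self.findings = []
        self.seen_ids = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Missing (or empty) page language declaration
        if tag == "html" and not attrs.get("lang"):
            self.findings.append("missing page language declaration")
        # Image with no alt attribute at all
        if tag == "img" and "alt" not in attrs:
            self.findings.append("image missing alt attribute")
        # Duplicate IDs break label/input and ARIA references
        el_id = attrs.get("id")
        if el_id:
            if el_id in self.seen_ids:
                self.findings.append(f"duplicate ID: {el_id}")
            self.seen_ids.add(el_id)

scanner = QuickScan()
scanner.feed('<html><body><img src="a.png">'
             '<div id="x"></div><div id="x"></div></body></html>')
print(scanner.findings)
# → ['missing page language declaration',
#    'image missing alt attribute', 'duplicate ID: x']
```

Note what even this toy version cannot do: it can see that an alt attribute is absent, but not whether present alt text is useful, which is exactly the limit described in the next section.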
What it cannot prove
Automated scans have hard limits. A clean report does not mean a page is accessible. A report with findings does not mean a page is unusable. Specifically, automation cannot:
- Prove WCAG conformance or legal compliance
- Judge whether alt text is actually useful in context
- Test keyboard navigation, focus order, or focus visibility
- Evaluate screen reader flow or announcement quality
- Check PDFs, documents, video captions, or audio transcripts
- Test dynamic widgets, modals, carousels, or third-party embeds
- Assess colour contrast in gradients, images behind text, or dynamic states
- Determine whether content is understandable or well-written
Treat the report as triage, not a verdict. Use it to fix obvious problems before investing in qualified human review.
How to prioritize findings
The report groups findings by severity. Here is a practical way to think about each level:
Critical
These are likely to block or confuse users. Fix them first. Examples: unlabeled form fields, buttons without accessible names, focusable content hidden from assistive technology.
Major
These create real barriers for many users. Fix them soon after critical issues. Examples: missing page title, missing language, empty headings, linked images with empty alt, missing iframe titles.
Moderate
These are real issues that may depend on context. Review and fix where practical. Examples: skipped heading levels, duplicate IDs, generic alt text, possible low contrast.
Notices
These are signals that need human judgement. They may be fine or may need attention. Examples: empty alt text on images (may be decorative), repeated alt text, missing skip link or statement link.
Within each severity level, fix issues that affect the most users or block key tasks first. A checkout form with unlabeled fields is more urgent than a decorative image with generic alt text.
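The ordering rule above (severity first, then key-task blockers, then reach) can be sketched as a sort key. The finding records and fields here (`blocks_key_task`, `pages_affected`) are assumptions for the example, not fields of the actual report format.

```python
# Rank order used by the report's severity levels
SEVERITY_RANK = {"critical": 0, "major": 1, "moderate": 2, "notice": 3}

# Hypothetical finding records for illustration
findings = [
    {"issue": "generic alt text", "severity": "moderate",
     "blocks_key_task": False, "pages_affected": 3},
    {"issue": "unlabeled checkout field", "severity": "critical",
     "blocks_key_task": True, "pages_affected": 1},
    {"issue": "missing page title", "severity": "major",
     "blocks_key_task": False, "pages_affected": 12},
]

def priority(f):
    # Severity first; within a level, key-task blockers and
    # widely affected issues come before isolated ones.
    return (SEVERITY_RANK[f["severity"]],
            not f["blocks_key_task"],
            -f["pages_affected"])

for f in sorted(findings, key=priority):
    print(f["severity"], "-", f["issue"])
# → critical - unlabeled checkout field
#   major - missing page title
#   moderate - generic alt text
```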
How to turn findings into developer tickets
The report includes a Developer tickets view that turns each critical and major finding into a ticket-style card with:
- Ticket title and severity
- Category and suggested owner
- Effort estimate (quick fix, template/code fix, needs review, manual test required)
- CSS selector and HTML snippet when available
- What to fix and why it matters
- Acceptance criteria to verify the fix
- Link to the issue help page for more detail
You can copy individual tickets, copy all tickets at once, or download the full set as Markdown. Paste tickets into your issue tracker, project board, or team chat.
The acceptance criteria are a starting point. They do not prove WCAG conformance. Adjust them to match your project standards and QA process.
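If you want to generate similar ticket-style cards yourself, the shape is straightforward. This sketch renders one finding as a Markdown card with the fields listed above; the field names and template are illustrative, so adapt them to your tracker.

```python
def ticket_markdown(finding):
    """Render one finding as a ticket-style Markdown card.
    Field names are illustrative, not the report's actual schema."""
    return "\n".join([
        f"## [{finding['severity'].upper()}] {finding['title']}",
        f"**Category:** {finding['category']} | **Effort:** {finding['effort']}",
        f"**Selector:** `{finding['selector']}`",
        "",
        f"**What to fix:** {finding['fix']}",
        "",
        "**Acceptance criteria**",
        *[f"- [ ] {c}" for c in finding["criteria"]],
    ])

print(ticket_markdown({
    "severity": "critical",
    "title": "Label the email field on the checkout form",
    "category": "Forms",
    "effort": "quick fix",
    "selector": "#checkout input[name=email]",
    "fix": "Associate a visible <label> with the input using for/id.",
    "criteria": ["Field has a programmatic label",
                 "Label is announced by a screen reader"],
}))
```

Rendering acceptance criteria as `- [ ]` checkboxes means most issue trackers will display them as tickable items out of the box.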
How to explain results to a client, boss, or board
The report includes a Client/boss summary view that presents findings in plain English without technical jargon. It covers:
- What was checked
- What was found (with counts by severity)
- What to fix first
- What still needs manual review
- A clear disclaimer that this is not a formal audit
Use this summary to communicate the scope of work, justify budget for fixes, or explain why a clean automated report is not the same as an accessible site. The summary avoids selectors, code snippets, and WCAG references unless they add clarity.
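The core of a plain-English summary is just counts by severity plus the standing caveat. A minimal sketch, with invented wording that you should adjust to your audience:

```python
from collections import Counter

def plain_summary(findings):
    """Summarize findings by severity in plain language.
    Wording is illustrative, not the report's exact copy."""
    counts = Counter(f["severity"] for f in findings)
    lines = [f"We checked one page and found {len(findings)} issues:"]
    for level in ("critical", "major", "moderate", "notice"):
        if counts[level]:
            lines.append(f"- {counts[level]} {level}")
    lines.append("This automated check is not a formal accessibility audit.")
    return "\n".join(lines)

findings = [
    {"severity": "critical"}, {"severity": "major"},
    {"severity": "major"}, {"severity": "notice"},
]
print(plain_summary(findings))
# → We checked one page and found 4 issues:
#   - 1 critical
#   - 2 major
#   - 1 notice
#   This automated check is not a formal accessibility audit.
```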
How to plan manual review
The report includes a Manual review checklist view with interactive checkboxes and a downloadable checklist. Work through these areas:
- Keyboard and focus: Tab through every page. Is focus visible? Does it follow a logical order? Can you reach every control?
- Screen reader: Listen to the page. Are headings, landmarks, forms, and dynamic content announced correctly?
- Content and design: Is alt text actually useful? Is information conveyed only by colour? Do links make sense out of context?
- Documents and media: Are PDFs tagged? Do videos have captions? Is there a transcript for audio?
- Dynamic and third-party content: Test widgets, carousels, modals, maps, booking tools, and payment forms.
- Formal obligations: Identify which standards apply (WCAG, AODA, ACA). Get qualified review for critical user journeys.
Manual review is where most real accessibility improvements happen. Automation finds the obvious patterns; humans find the meaningful ones.
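If you maintain your own version of the downloadable checklist, it can be generated from a simple mapping of areas to items. The areas and items below paraphrase the list above; the structure is an assumption, not the report's export format.

```python
def checklist_markdown(sections):
    """Render manual-review areas as a Markdown checklist
    with - [ ] checkboxes, grouped under area headings."""
    lines = []
    for area, items in sections.items():
        lines.append(f"### {area}")
        lines.extend(f"- [ ] {item}" for item in items)
        lines.append("")
    return "\n".join(lines).rstrip()

# Paraphrased from the checklist areas above
REVIEW_AREAS = {
    "Keyboard and focus": [
        "Focus is visible on every interactive control",
        "Tab order follows a logical sequence",
    ],
    "Documents and media": [
        "PDFs are tagged",
        "Videos have captions",
    ],
}

print(checklist_markdown(REVIEW_AREAS))
```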
When to bring in a qualified accessibility professional
Automated scans and team review are useful, but they are not a substitute for qualified expertise. Bring in a professional when you need:
- A formal WCAG conformance audit
- Legal or procurement compliance documentation
- Testing of complex applications, authenticated flows, or custom widgets
- User testing with people who have disabilities
- An accessibility statement that accurately reflects current status
- Guidance on organizational accessibility policy and planning
SiteCheck Canada is a triage tool. It helps you find obvious issues and prepare for qualified review. It is not a compliance platform, a formal audit tool, or a certification of WCAG conformance.