Version: v0.1 (ux) · Status: published · Updated: 2026-05-14


Smart Treatment Proposals – UX Specification

Related Technical Authority: Smart Treatment Proposals – Technical Specification

1. Purpose

This UX specification governs the end-to-end experience of creating, delivering, and accepting treatment proposals within Primoro's secure ecosystem. It covers the interfaces used by clinical staff, treatment co-ordinators, finance and admin teams, and patients across all delivery surfaces — tablet, patient mobile app, and secure web link. The primary roles served are: clinicians and treatment co-ordinators (proposal creation and monitoring), patients (proposal review, Q&A, and acceptance), and finance/admin staff (engagement tracking and conversion oversight).

2. Core UX Principles (Non-Negotiable)

These principles take precedence over visual preferences. If a design choice conflicts with a principle below, the principle wins.

  • Action-first — users see the action they need next, not abstract status displays
  • Governance always visible — when AI is involved, users always know what AI did and what they're confirming
  • No dead toggles — every UI control either does something or doesn't appear
  • Calm by default — the interface gets out of the way; alerts are reserved for things that genuinely need attention
  • Progressive disclosure — advanced detail is one click away, not always-on
  • Separation of selection and commitment — choosing a payment option or applying for finance never implies acceptance of treatment; the UI must make this separation unmistakably clear at every step. Inferred from the technical spec's §6.7 and §7 rules governing acceptance/charging separation.
  • State integrity made tangible — the current proposal state (Draft, Presented, Explained & Acknowledged, Accepted, Declined, Expired) is always visible and unambiguous, because every downstream action (booking, pipeline update, PMS sync) depends on it. Inferred from technical spec §3.2 and §13.
  • Patient language, not clinical language — all patient-facing surfaces use plain-English descriptions; clinical codes and internal nomenclature are hidden from patients. Inferred from technical spec §4.1.

3. Design Philosophy

Smart Treatment Proposals is built around the mental model of a governed conversation: the proposal is a narrative artefact that a clinician prepares, hands to a patient to read and question at their own pace, and that the patient formally acknowledges and accepts in two distinct steps. The interface reflects this by treating the proposal document itself as read-only content — a governed story — while all actions (accept, acknowledge, ask a question, request a call-back, choose a payment route) live outside the document in persistent UI controls. This is inferred from technical spec §4.4 CTA rule and §7.

Empty states: A staff user who has no proposals yet sees a purposeful prompt to create a first proposal or to review any AI Quality Monitor draft that may be waiting — not a blank screen. Inferred from technical spec §9.4 (AI Quality Monitor draft notification) and §5.6 (dashboard).

Error states: Errors in proposal creation (missing mandatory sections, invalid payment profile configuration) surface inline at the offending section, not as a full-page interruption, so that the rest of a partially-built proposal is preserved. Inferred from technical spec §4.4 mandatory sections and §6.5 payment profile validation.

AI suggestions: AI-contributed content (Aiden-suggested FAQ blocks, AI Quality Monitor draft proposals, AI Concierge call summaries, Aiden-drafted staff replies) is visually distinguished from human-authored content with a consistent AI-provenance indicator. Suggested content is never surfaced as final; it always requires explicit human approval before it enters the governed artefact or is sent to a patient. Inferred from technical spec §4.5, §8.2, §9.4.

Multi-step flows: The two-step acknowledgement/acceptance model is presented as a linear progression, not as a single form. Each step is its own confirmable moment with its own summary of what the patient is agreeing to. Step 1 (Explained & Acknowledged) is clearly labelled as not a commitment to proceed. Step 2 (Acceptance) is clearly labelled as a binding instruction. Inferred from technical spec §7.1 and §7.2.
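The gating rule for the two-step model can be sketched as a minimal state-advance function. This is illustrative only — the type and function names (`AcceptanceStep`, `nextStep`) are assumptions, not the module's real data model:

```typescript
// Illustrative sketch of the two-step acknowledgement/acceptance gating.
// Names are hypothetical; the real model lives in the technical spec.
type AcceptanceStep = "none" | "acknowledged" | "accepted";

// Step 2 is only ever offered after Step 1; the steps are never
// combined into a single confirmation.
const next: Record<AcceptanceStep, AcceptanceStep | null> = {
  none: "acknowledged",    // Step 1: explicitly not a commitment to proceed
  acknowledged: "accepted", // Step 2: a binding instruction
  accepted: null,           // terminal: no further steps
};

function nextStep(current: AcceptanceStep): AcceptanceStep | null {
  return next[current];
}
```

Each step being its own confirmable moment maps to each transition here requiring its own deliberate patient action.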

Version control: When a proposal has been accepted and a modification is needed, the UI surfaces a clear "create new version" workflow rather than allowing silent edits. The currently active version is always marked. Previous versions are accessible to authorised staff but visually demoted. Inferred from technical spec §7.3.

Read-only vs editable: Patient-facing surfaces (mobile app and secure web link) are always read-only with respect to proposal content. Staff-facing creation and editing surfaces are clearly distinguished from the preview of what the patient will see. Inferred from technical spec §5.4 and §4.4.

4. Primary Surfaces

4.1 Web Portal

Who uses it: Treatment co-ordinators, practice managers, finance/admin staff, and clinicians reviewing proposals at a desk. Inferred from technical spec §5.6 (staff dashboard) and §2.1.

Key tasks performed here:

  • Monitor the proposal dashboard/feed: filter by state, review engagement signals (last opened, questions submitted, call-back requests), and assign or re-route proposals to the correct owner. Inferred from technical spec §5.6.
  • Create and edit proposals using the sectioned base template, adding treatment options, multimedia, payment profiles, and content library blocks. Inferred from technical spec §4 and §5.
  • Review AI Quality Monitor draft proposals and promote them to Presented after editing. Inferred from technical spec §9.4.
  • Review and respond to patient Q&A, including reviewing Aiden-suggested replies before sending. Inferred from technical spec §8.1 and §8.2.
  • View full proposal audit trail and version history for compliance and dispute purposes. Inferred from technical spec §10 and §7.3.
  • Configure Payment Profiles and proposal templates per treatment type. Inferred from technical spec §6.1 and §4.3.

Layout pattern: split-pane on the proposal dashboard (proposal list left, detail/engagement right); form-wizard for proposal creation; list-detail for audit trail and version history. Inferred from the multi-status, multi-field nature of the proposal object in technical spec §3 and §5.6.

4.2 Tablet App

Who uses it: Clinicians and treatment co-ordinators presenting proposals in-chair to patients during or immediately after a consultation. Inferred from technical spec §5.1.

Key tasks performed here:

  • Present the proposal to the patient on the tablet screen for face-to-face walkthrough. Inferred from technical spec §5.1.
  • Capture Step 1 acknowledgement (Explained & Acknowledged) via patient signature or tap-to-confirm on the tablet. Inferred from technical spec §7.1.
  • Optionally initiate Step 2 acceptance in-chair where the patient wishes to proceed immediately. Inferred from technical spec §7.2.
  • Allow the patient to submit a text question or request a call-back while the clinician is present. Inferred from technical spec §8.1 and §8.3.

Touch ergonomics: All primary interactive controls (acknowledge, accept, ask a question, request call-back) must have touch targets ≥48 px to accommodate use on a shared tablet in a clinical setting. The in-chair presentation mode should minimise accidental navigation away from the proposal. Inferred from the in-chair presentation context described in technical spec §5.1 and §7.1.

4.3 Mobile App (Patient)

Who uses it: Patients reviewing their treatment proposal remotely at a time of their choosing, via the Primoro Patient Mobile App. Inferred from technical spec §5.2.

Key tasks performed here:

  • Read and navigate the full proposal, including treatment options, multimedia, costs, and payment information. Inferred from technical spec §4 and §5.2.
  • Complete Step 1 acknowledgement and Step 2 acceptance at their own pace, with clear labelling of what each step means. Inferred from technical spec §7.1 and §7.2.
  • Submit text or voice questions and receive staff responses within the proposal context. Inferred from technical spec §8.1.
  • Request a call-back and see confirmation that the request has been logged. Inferred from technical spec §8.3.
  • Browse available payment options and initiate a finance application where eligible, with clear signposting that this action does not constitute acceptance. Inferred from technical spec §6.6 and §6.7.
  • Where Step 2 has been completed and first-step booking is permitted, view available appointment slots and book the first appointment. Inferred from technical spec §7.2.

4.4 Secure Web Link

Who uses it: Patients who are not registered on the Primoro Patient Mobile App, accessing their proposal via an emailed secure link. Inferred from technical spec §5.3 and §5.4.

Key tasks performed here:

  • All tasks available on the Patient Mobile App (read, acknowledge, accept, Q&A, call-back, payment option selection, first-step booking where enabled). The experience must be functionally equivalent to the app. Inferred from technical spec §5.4.
  • Identity verification step at entry (tokenised link plus any configured secondary verification). Inferred from technical spec §5.3.

5. Interaction Model

5.1 Primary Flows

Flow 1 — Staff creates and delivers a proposal

Inferred from technical spec §4, §5, and §7.

  1. Staff member opens proposal creation from the patient record or linked Treatment Pipeline opportunity.
  2. System loads the sectioned base template. Mandatory sections are pre-populated and locked where applicable; optional sections are listed as available to add.
  3. Staff completes treatment outline, adds/selects options, attaches multimedia, selects a payment profile, and adds content blocks from the library.
  4. Aiden MAY suggest FAQ blocks; staff reviews and approves or dismisses each suggestion before inclusion.
  5. Staff previews the proposal as the patient will see it.
  6. Staff selects delivery surface: Tablet App (in-chair) or remote delivery (Patient Mobile App / secure email link fallback).
  7. Proposal moves from Draft → Presented. Communication Hub delivers the notification or opens the tablet presentation.
  8. Engagement signals begin recording.

Document Hub governance alignment: Proposal documents are stored and shared in accordance with Document Hub's secure-by-reference principles. Staff annotations, internal review notes, and draft comments are never visible to patients at any point in the delivery flow — only the content that has been explicitly approved for patient viewing is included in the delivered artefact. The proposal preview (step 5 above) renders exclusively what the patient will see, with all staff-only context hidden. Inferred from Document Hub's patient-data-never-exposed-carelessly principle and its annotation-visibility governance boundary.

Pre-proposal form dependency: Where a Digital Form (for example, a medical history update or a consent-to-examine form) is configured as a prerequisite for a given proposal type, the proposal creation surface surfaces a dependency indicator showing whether the required form has been completed by the patient. If the prerequisite form has not been completed, the staff member is informed before delivery; the proposal MAY still be created in Draft state but the delivery action is blocked until the form dependency is resolved. The authoritative consent artefact for treatment consent purposes remains the proposal acceptance record; Digital Forms capture separate pre-treatment information and are not a substitute for Step 2 acceptance. Inferred from the consent-sequencing boundary between Digital Forms and Smart Treatment Proposals.
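The delivery-blocking rule above can be sketched as a simple guard. The field names (`requiresForm`, `formCompleted`) are assumptions about the data model, used purely to make the rule concrete:

```typescript
// Hypothetical shape for a draft awaiting delivery; field names are
// illustrative, not from the technical spec.
interface DraftProposal {
  state: "Draft" | "Presented";
  requiresForm: boolean;  // a Digital Form is configured as a prerequisite
  formCompleted: boolean; // the patient has completed that form
}

// A proposal may always exist in Draft, but the delivery action is
// blocked until any prerequisite Digital Form is resolved.
function canDeliver(p: DraftProposal): boolean {
  return p.state === "Draft" && (!p.requiresForm || p.formCompleted);
}
```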

Flow 2 — Patient reviews and completes the two-step model (remote)

Inferred from technical spec §5.2, §5.3, §7.1, §7.2.

  1. Patient receives notification (via Patient App push or email from Communication Hub).
  2. Patient opens proposal. All sections are visible; payment options are present but no acceptance CTA is shown yet.
  3. Patient reads, watches any media, and may submit text or voice questions.
  4. Patient completes Step 1: taps/clicks an acknowledgement control. UI confirms "You've confirmed this plan was explained to you" (needs UX writer input — exact microcopy). Proposal state → Explained & Acknowledged.
  5. Staff sees state change in the dashboard.
  6. Patient returns (same session or later) and completes Step 2: reviews acceptance summary, optionally selects a payment route, and confirms acceptance. UI clearly states this authorises treatment to proceed. (needs UX writer input — exact acceptance modal copy and confirmation pattern). Proposal state → Accepted.
  7. If first-step booking is enabled and slots are available, patient is offered appointment selection immediately post-acceptance (via Appointment Manager).
  8. If booking is not available, system creates a staff follow-up task automatically.
  9. Treatment Pipeline opportunity is updated to Accepted where linked.
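The booking-or-task branch in steps 7–8 is a single pure decision, sketched below. The outcome shape and names are illustrative assumptions:

```typescript
// Post-acceptance branch: offer booking, or fall back to a staff task.
// Names are hypothetical, not from the module's real API.
type PostAcceptance =
  | { kind: "offerBooking" }        // Appointment Manager slots shown
  | { kind: "createFollowUpTask" }; // automatic staff task otherwise

function afterAcceptance(
  bookingEnabled: boolean,
  slotsAvailable: boolean
): PostAcceptance {
  // An empty booking screen is never shown: no bookable slot means a
  // follow-up task is created instead.
  return bookingEnabled && slotsAvailable
    ? { kind: "offerBooking" }
    : { kind: "createFollowUpTask" };
}
```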

Hygiene subscription boundary: Where a patient's payment options panel includes a Direct Debit mechanism, the UI MUST NOT present or imply enrolment in a Hygiene Subscription or any other recurring plan as part of this flow. Treatment proposals are one-time governed artefacts; recurring subscription enrolment requires a separate, explicit consent journey with its own Direct Debit mandate. No proposal acceptance action — including payment option selection — automatically triggers or pre-populates a subscription enrolment. If a subscription product is clinically relevant, it is surfaced as a distinct next-step recommendation after acceptance is complete, never as a component of the proposal itself. Inferred from the consent-boundary distinction between one-time treatment proposals and recurring Hygiene Subscription enrolment.

Flow 3 — AI Quality Monitor draft review

Inferred from technical spec §9.4.

  1. AI Quality Monitor detects a proposal opportunity during an in-surgery discussion.
  2. A non-blocking notification appears in the staff interface: (needs UX writer input — notification label and CTA).
  3. Staff member opens the draft proposal. All AI-authored content is clearly marked with an AI-provenance indicator.
  4. Explainability and review state: Before the staff member can approve or promote any section, the UI surfaces an explainability note for the AI Quality Monitor's draft — a brief, plain-language statement of what the AI detected and why it generated the draft (for example, the clinical signal or discussion context that triggered it). This note is always visible during the review session and cannot be collapsed before first review. The proposal review state is shown as "AI Draft — Awaiting Clinical Review" until the staff member has explicitly confirmed each AI-authored section; partially reviewed drafts retain this state so that the staff member can resume without losing track of which sections have been assessed. The explainability note and the per-section review state are both rendered distinctly from the proposal content itself and do not appear in the patient-facing view. Inferred from AI Quality Monitor's requirement that Draft Output Artefacts must never be auto-finalised and must surface governance-visibility context before clinician acceptance.
  5. Staff reviews, edits any sections, removes or approves suggested blocks, and confirms the proposal is ready to present.
  6. Staff explicitly promotes the draft — it does not auto-advance.
  7. Flow continues from Flow 1, step 6.

Flow 4 — Staff responds to a patient question

Inferred from technical spec §8.1 and §8.2.

  1. Patient submits a text or voice question from within the proposal.
  2. Staff dashboard surfaces a pending action against the proposal.
  3. For voice questions, a transcription is displayed alongside the audio.
  4. Aiden MAY surface a suggested reply; it is visually marked as a suggestion and is not sent without staff approval.
  5. Staff edits or dismisses the suggestion and sends a response.
  6. Response is attached to the proposal record and delivered via Communication Hub.

5.2 State Machines

The following maps the authoritative proposal state machine from technical spec §3.2 to UX treatments.

Each state below lists its entry condition, confirmation pattern, and visual treatment.

  • Draft — Entry condition: proposal creation started; no patient has seen it. Confirmation pattern: none required — auto-state on creation. Visual treatment: muted badge; clearly labelled "Draft — not yet sent".
  • Presented — Entry condition: staff has delivered or opened the proposal for in-chair presentation. Confirmation pattern: staff confirms the delivery action (needs UX writer input — confirmation label). Visual treatment: active badge; timestamp of presentation shown.
  • Explained & Acknowledged — Entry condition: patient has completed Step 1 acknowledgement. Confirmation pattern: patient-side explicit acknowledgement tap with summary shown; no further staff action required for the state change. Visual treatment: positive/progress badge; Step 1 checkmark visible.
  • Accepted — Entry condition: patient has completed Step 2 acceptance. Confirmation pattern: patient-side confirmation modal summarising what is being agreed to, requiring a deliberate confirm action (needs UX writer input — modal copy); staff-side is read-only and the state change is automatic. Visual treatment: affirming/complete badge; acceptance timestamp and method shown; version locked.
  • Declined — Entry condition: patient has explicitly declined. Confirmation pattern: patient-side decline confirmation with optional reason capture (needs UX writer input); irreversible without staff action. Visual treatment: neutral/closed badge; no further patient CTAs shown.
  • Expired — Entry condition: proposal has passed its configured expiry without acceptance. Confirmation pattern: system-triggered; staff notified via a dashboard pending action. Visual treatment: subdued/expired badge; expiry date shown; re-send or new-proposal options offered to staff.

State badges appear consistently on the proposal list, proposal detail header, and the Treatment Pipeline card where linked. Inferred from technical spec §3.2 and §7.3.

Finance events — a finance application being initiated or a payment option being selected — are NOT reflected as state changes. These events are visible in the audit trail and engagement panel only, so that staff and patients are not misled into thinking a financial action equals acceptance. Inferred from technical spec §6.7.
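The state machine can be made explicit as a transition table — a sketch under the assumption that Declined and Expired re-opening is a staff workflow outside this model, not the authoritative implementation:

```typescript
// Sketch of the authoritative proposal state machine (technical spec §3.2).
type ProposalState =
  | "Draft" | "Presented" | "ExplainedAcknowledged"
  | "Accepted" | "Declined" | "Expired";

const transitions: Record<ProposalState, ProposalState[]> = {
  Draft: ["Presented"],
  Presented: ["ExplainedAcknowledged", "Declined", "Expired"],
  ExplainedAcknowledged: ["Accepted", "Declined", "Expired"],
  Accepted: [],  // terminal: modifications require a new version (§7.3)
  Declined: [],  // re-opening via staff action is not modelled here
  Expired: [],   // re-send/new proposal is a staff workflow, not a transition
};

// Finance events deliberately do not appear above: selecting a payment
// option or starting a finance application never changes ProposalState
// — it is recorded in the audit trail only (§6.7).
function canTransition(from: ProposalState, to: ProposalState): boolean {
  return transitions[from].includes(to);
}
```

Note that `canTransition("Presented", "Accepted")` is false: Step 2 acceptance is unreachable without Step 1 acknowledgement, which is exactly the two-step guarantee the UI must preserve.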

5.3 Empty / Loading / Error / Offline States

Proposal dashboard (staff)

  • Empty state: No proposals exist yet. Surface a prompt to create the first proposal or check whether any AI Quality Monitor drafts are waiting. Inferred from technical spec §5.6 and §9.4.
  • Loading state: Skeleton rows representing proposal list items, with state badge placeholders. Inferred from the list-based dashboard structure in technical spec §5.6.
  • Error state: If dashboard data fails to load, surface a non-blocking error with a retry action. Partially loaded data is shown where available. Inferred from dashboard requirements in technical spec §5.6.
  • Offline state: Last-loaded proposal list is shown read-only from local cache where available. Engagement signals and state changes cannot be submitted; a persistent offline indicator is shown. Inferred from the engagement signal and state-transition architecture in technical spec §5.5.

Proposal creation (staff)

  • Empty state (new proposal): Sectioned base template loads with mandatory sections visible and clearly marked; optional sections shown as available to add. Inferred from technical spec §4.4.
  • Loading state: Skeleton template structure while content library blocks load. Inferred from technical spec §4.5.
  • Error state: Inline validation at section level for missing mandatory content; payment profile validation errors surface adjacent to the relevant field before the proposal can be presented. Inferred from technical spec §4.4 and §6.5.
  • Offline state: Draft auto-saves locally; staff is notified that the proposal cannot be presented until connectivity is restored. Inferred from the Draft state and delivery requirement in technical spec §3.2 and §5.
Patient proposal view (mobile app / secure web link)

  • Empty/not-yet-loaded state: Branded loading screen with proposal context (practice name, patient name) while content fetches. Inferred from the identity-verification-at-entry requirement in technical spec §5.3.
  • Loading state: Progressive section reveal — proposal header and cover section load first; media embeds load progressively. Inferred from the sectioned template structure in technical spec §4.4.
  • Error state: If the proposal cannot be loaded, surface a message with a call-back request option and a direct contact prompt. Inferred from technical spec §8.3.
  • Offline state: If the patient loses connectivity mid-review, any unsaved question text is preserved locally. Acknowledgement and acceptance actions require connectivity and surface a clear message if attempted offline. Inferred from the immutable audit-trail requirement for acknowledgement events in technical spec §10.
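The local preservation of unsaved question text can be sketched against a minimal key-value store. In the browser this would be backed by `localStorage`; the storage interface is injected here so the logic is self-contained, and the key format is an assumption:

```typescript
// Minimal store interface matching the Web Storage API's get/set shape.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Hypothetical key format — one draft per proposal.
const draftKey = (proposalId: string) => `proposal:${proposalId}:question-draft`;

function saveQuestionDraft(store: KeyValueStore, proposalId: string, text: string): void {
  store.setItem(draftKey(proposalId), text);
}

function restoreQuestionDraft(store: KeyValueStore, proposalId: string): string {
  // An absent draft restores to empty text rather than null.
  return store.getItem(draftKey(proposalId)) ?? "";
}
```

Acknowledgement and acceptance actions are deliberately excluded from this offline path: they must reach the server so the immutable audit trail records them.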

6. Component Inventory

New components introduced or extended by this module:

  • Proposal card — summary card showing ProposalID, patient name, state badge, last-engagement timestamp, and pending-action indicator; appears in the staff dashboard list. Inferred from technical spec §5.6.
  • Proposal state badge — colour-coded, labelled badge reflecting the current ProposalState; appears on proposal cards, proposal detail headers, and Treatment Pipeline cards. Inferred from technical spec §3.2.
  • Sectioned proposal template — editable multi-section document frame with mandatory/optional section distinction, block-insertion affordance, and patient-preview toggle. Inferred from technical spec §4.4 and §4.5.
  • Content block picker — searchable, category-filtered panel for inserting library blocks into permitted sections; shows mandatory-block lock states. Inferred from technical spec §4.5.
  • AI-provenance indicator — a persistent visual mark (badge or border) on any content or reply suggestion that was AI-generated, with a human-approval affordance to promote or dismiss. Inferred from technical spec §4.5, §8.2, and §9.4.
  • Two-step acceptance rail — a persistent UI element in the patient-facing view showing Step 1 and Step 2 status, with the appropriate CTA for the current step; lives outside the proposal document. Inferred from technical spec §4.4 CTA rule and §7.
  • Payment options panel — side-by-side presentation of available payment mechanisms with representative examples, regulatory wording where required, and a clear "selecting this does not accept treatment" label. Lives outside the proposal document. Inferred from technical spec §6.6 and §6.7.
  • Engagement timeline panel — staff-facing panel on a proposal record showing all recorded engagement signals (opened, re-opened, sections viewed, questions submitted, call-back requested) in chronological order. Inferred from technical spec §5.5 and §10.
  • Voice question player — in-staff-UI audio playback control with transcription display alongside, attached to a specific Q&A thread on the proposal. Inferred from technical spec §8.1.
  • Version history drawer — accessible to authorised staff from the proposal detail; shows all versions with change provenance, active version marker, and revert option. Inferred from technical spec §7.3.
  • Booking selector (post-acceptance) — first-step appointment slot picker surfaced immediately after Step 2 acceptance where booking is permitted, drawing on Appointment Manager availability. Inferred from technical spec §7.2.
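Because the state badge must read identically on the proposal list, the detail header, and the Treatment Pipeline card, a single source of truth for its label and tone is a natural shape for the component. The tone names below are assumed design-system tokens, not confirmed ones:

```typescript
// Illustrative single source of truth for the proposal state badge.
// Tone names are hypothetical design-system tokens.
type BadgeTone = "muted" | "active" | "progress" | "affirming" | "closed" | "subdued";

const stateBadge: Record<string, { label: string; tone: BadgeTone }> = {
  Draft: { label: "Draft — not yet sent", tone: "muted" },
  Presented: { label: "Presented", tone: "active" },
  ExplainedAcknowledged: { label: "Explained & Acknowledged", tone: "progress" },
  Accepted: { label: "Accepted", tone: "affirming" },
  Declined: { label: "Declined", tone: "closed" },
  Expired: { label: "Expired", tone: "subdued" },
};
```

Every surface renders from this one map, so the badge can never drift between the list, the header, and the pipeline card.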

Reused from the design system:

  • Modal / confirmation dialog
  • Toast notification
  • In-app banner
  • Form field with inline validation
  • Skeleton loader
  • Filter/sort toolbar
  • Signature capture control

7. Visual Design Notes

  • Typography: (needs UX writer input — heading scale, body scale, monospace usage to be defined by design system application)
  • Colour — semantic usage: proposal state badges use the semantic colour palette (e.g. success/affirming for Accepted, warning/amber for Explained & Acknowledged awaiting Step 2, neutral/muted for Draft and Expired, negative/closed for Declined). AI-provenance indicators use a distinct, consistent colour token that is never reused for other semantic purposes. Payment selection controls must not use the same colour treatment as acceptance/commitment controls. Inferred from the state machine in technical spec §3.2 and the acceptance/payment separation rules in §6.7.
  • Iconography: icons are never used alone without a text label on interactive controls.
  • Motion: transitions between proposal states (e.g. on acknowledgement or acceptance) MAY use a brief confirming animation to reinforce the significance of the action. Media embeds load progressively. All motion must respect prefers-reduced-motion. Inferred from the significance of state transitions in technical spec §7 and the media embedding in §4.2.

8. Accessibility & Inclusivity

The module MUST meet WCAG 2.2 AA. Specifically:

  • Text contrast ≥4.5:1 (normal) / ≥3:1 (large)
  • All interactive controls reachable via keyboard
  • Focus states visible
  • Form fields have programmatic labels
  • ARIA used only where native semantics are insufficient
  • Touch targets ≥44×44 px on mobile/tablet
  • Motion can be reduced via prefers-reduced-motion
  • Screen reader tested on NVDA / VoiceOver / TalkBack
  • Voice question submission must have an equivalent text-input path so that patients who cannot use voice are not excluded from the Q&A function. Inferred from technical spec §8.1 which lists voice as a patient option alongside text.
  • The two-step acceptance model must be fully operable by keyboard and screen reader, with each step's purpose and consequences announced to assistive technology before the user commits. Inferred from the binding nature of Step 2 acceptance described in technical spec §7.2.
  • Patient-facing content (plain-English treatment descriptions) supports patients with lower health literacy; complex financial terms must be accompanied by plain-language explanations where they appear. Inferred from technical spec §4.1 and §6.6.

9. Internationalisation

  • Locale-aware date/time/number formatting
  • All user-facing strings externalised
  • Layouts tolerant of 30% string-length growth (German, French)
  • Currency formatting must respect locale conventions; representative finance examples (monthly amounts, totals) use locale-aware number formatting with explicit currency symbols. Inferred from technical spec §6.6 which requires representative financial examples to be shown.
  • Regulatory wording blocks (required where credit is offered) must support locale-specific variants in the content library, as regulatory text differs by jurisdiction. Inferred from technical spec §6.6.
  • RTL support: (needs UX writer input — not explicitly addressed in technical spec; to be determined by target market scope)
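Locale-aware currency formatting for representative finance examples is available via the standard `Intl.NumberFormat` API; the amounts and locales below are illustrative only:

```typescript
// Format a representative monthly amount using the standard Intl API.
// Amounts/locales here are illustrative, not from the spec.
function formatMonthlyAmount(amount: number, locale: string, currency: string): string {
  return new Intl.NumberFormat(locale, { style: "currency", currency }).format(amount);
}

// e.g. formatMonthlyAmount(89.5, "en-GB", "GBP") → "£89.50"
// while "de-DE" with "EUR" places the symbol after the number.
```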

10. Cross-Module UX Touchpoints

All cross-module touchpoints below are inferred from the integration contracts named in technical spec §9 and the explicit non-goals in §12.

  • Treatment Pipeline — When a proposal reaches the Accepted state, the linked opportunity card in Treatment Pipeline updates its status automatically. Staff navigating from a Treatment Pipeline opportunity to its associated proposal transition via a persistent link on the opportunity card; returning takes them back to the opportunity context. Inferred from technical spec §9.1.
  • Appointment Manager — Post-acceptance, where first-step booking is permitted, the patient is handed off to an Appointment Manager booking selector embedded in the acceptance flow. Appointment availability is read-only from Smart Treatment Proposals' perspective; all booking logic remains within Appointment Manager. Where no suitable slot exists, the system creates a follow-up task rather than surfacing an empty booking screen. Inferred from technical spec §7.2 and §9.2.
  • Communication Hub — All outbound notifications (proposal delivery, SMS engagement prompts, staff reply to patient Q&A, call-back confirmations) are dispatched via Communication Hub. Smart Treatment Proposals surfaces a delivery-status indicator (sent / delivered / opened) sourced from Communication Hub signals, but does not own the delivery mechanism. AI Concierge call outcomes are logged back to the proposal via Communication Hub. Inferred from technical spec §5.3, §8, §8.4, and §9's explicit statement that delivery flows through Communication Hub.
  • AI Quality Monitor — A non-blocking notification in the staff workspace (web portal or tablet) surfaces when AI Quality Monitor has prepared a draft proposal. The notification links directly to the draft in the proposal creation/editing surface. The AI Quality Monitor surface itself does not show proposal editing UI. Inferred from technical spec §9.4.
  • AI Concierge — Where AI Concierge is enabled and a patient calls via a dedicated proposal-query number, the call outcome, summary, and any tasks created are surfaced in the proposal's engagement timeline panel. Staff see these alongside other engagement signals. AI Concierge does not surface proposal acceptance controls. Inferred from technical spec §8.4.
  • Access Manager — Role-based access permissions govern which staff roles can view, create, edit, present, respond to questions on, and view the audit trail of proposals. The active user's role is visible in the application header. Read-only states (e.g. audit trail, version history for lower-privilege roles) are visually distinct from editable states. Inferred from technical spec §11.
  • PMS Integration (e.g. Dentally) — A PMS traceability link is surfaced on the proposal detail for authorised staff, showing the corresponding PMS record status and providing a link back to the PMS. This is an outbound reference only; clinical charting remains in the PMS and is not reproduced inside Smart Treatment Proposals. Inferred from technical spec §9.3 and §12.
  • Aftercare Manager — Proposal acceptance is a triggering event that may initiate an aftercare journey in Aftercare Manager, depending on practice configuration. Smart Treatment Proposals' responsibility ends at the acceptance record; it does not own or render aftercare instructions. Where Aftercare Manager is configured to activate on proposal acceptance, the post-acceptance screen in Smart Treatment Proposals MAY surface a brief signpost ("Your aftercare plan will be sent to you separately") but does not duplicate aftercare content inline. The authoritative aftercare surface remains Aftercare Manager. Staff viewing a proposal record see a read-only reference link to any associated aftercare journey, but manage that journey from within Aftercare Manager. This boundary prevents duplication of clinical content across both surfaces. Inferred from Aftercare Manager's care-continuation model and the proposal acceptance event as a key appointment outcome.
  • Campaign Manager — Campaign Manager may drive the delivery of treatment proposals as part of a configured multi-step patient journey. Where a proposal is associated with an active campaign, Smart Treatment Proposals MUST respect any suppression or pause states held by Campaign Manager: if a contact has been suppressed (opted out, flagged as do-not-contact, or paused within a campaign), Communication Hub will prevent outbound engagement notifications, and Smart Treatment Proposals MUST NOT present a re-send or re-engage action that would circumvent this. The staff dashboard engagement panel surfaces a "Suppression active" indicator against the patient record where applicable, so that staff are not misled about the reason a proposal has not been opened. Smart Treatment Proposals does not manage suppression state directly; all suppression logic remains within Campaign Manager. Inferred from Campaign Manager §10.3's app-first delivery model and its suppression-state governance.
  • Document Hub — Treatment proposal documents are stored within the governed document lifecycle managed by Document Hub. Smart Treatment Proposals writes proposal artefacts (accepted versions, signed records) to Document Hub via the secure-by-reference integration; it does not maintain its own independent document store for long-term retention. Retrieval of historical proposal documents by authorised staff (for compliance, audit, or dispute purposes) follows Document Hub's access control model. Staff annotations and internal review notes attached to a proposal during creation and review are stored as staff-only metadata and are never included in the patient-facing document reference. This enforces Document Hub's principle that patient-visible content is explicitly scoped and careless exposure of internal notes is prevented by architecture, not by convention. Inferred from Document Hub's secure-by-reference and annotation-visibility governance boundary.
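The suppression boundary described under Campaign Manager can be sketched as a simple derivation: the staff dashboard never decides suppression, it only reads the state and hides any action that would circumvent it. This is an illustrative sketch only; the type names and fields below are assumptions, not the real Campaign Manager API.

```typescript
// Hypothetical shapes — illustrative, not the actual Campaign Manager schema.
type SuppressionState = "none" | "opted_out" | "do_not_contact" | "paused";

interface ContactEngagementContext {
  suppression: SuppressionState; // owned by Campaign Manager; read-only here
}

interface EngagementActions {
  canResend: boolean;
  suppressionBanner?: string; // "Suppression active" indicator text, if any
}

// Smart Treatment Proposals never mutates suppression state; it only
// derives which staff actions may be surfaced for this contact.
function deriveEngagementActions(ctx: ContactEngagementContext): EngagementActions {
  if (ctx.suppression !== "none") {
    return {
      canResend: false,
      suppressionBanner: "Suppression active — managed in Campaign Manager",
    };
  }
  return { canResend: true };
}
```

The key design point is that the re-send affordance is derived from, never layered on top of, the suppression state, so no UI path can bypass Campaign Manager's governance.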

UX consistency rules:

  • The proposal state badge uses the same token and visual treatment wherever it appears — proposal list, proposal detail header, Treatment Pipeline opportunity card — so staff always read the same signal in the same way. Inferred from technical spec §7.3's requirement that the active version is reflected consistently across proposal record, Treatment Pipeline, and Appointment Manager booking context.
  • Action controls (acknowledge, accept, ask a question, request call-back, apply for finance) are always surfaced outside the proposal document body, in a persistent UI region below or alongside the document, on all surfaces. Inferred from technical spec §4.4 CTA rule.
  • Action controls are anchored bottom-right on tablet surfaces and top-right on web portal surfaces.
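The single-badge rule above implies one shared mapping from proposal state to badge token, consumed by every surface. A minimal sketch follows; the state names echo the lifecycle states mentioned in this spec (Accepted, Declined, Expired, Superseded), while "Draft" and "Sent", the label text, and the design-token names are illustrative assumptions.

```typescript
// Illustrative state set — Draft/Sent and the token names are assumptions.
type ProposalState = "Draft" | "Sent" | "Accepted" | "Declined" | "Expired" | "Superseded";

interface BadgeToken {
  label: string;
  colourToken: string; // design-system token, never a raw hex value
}

const PROPOSAL_STATE_BADGES: Record<ProposalState, BadgeToken> = {
  Draft:      { label: "Draft",      colourToken: "badge.neutral" },
  Sent:       { label: "Sent",       colourToken: "badge.info" },
  Accepted:   { label: "Accepted",   colourToken: "badge.success" },
  Declined:   { label: "Declined",   colourToken: "badge.danger" },
  Expired:    { label: "Expired",    colourToken: "badge.warning" },
  Superseded: { label: "Superseded", colourToken: "badge.muted" },
};

// Proposal list, detail header, and Treatment Pipeline card all resolve
// badges through this one function; no surface defines its own mapping.
function proposalBadge(state: ProposalState): BadgeToken {
  return PROPOSAL_STATE_BADGES[state];
}
```

Centralising the mapping is what makes the consistency rule enforceable by architecture rather than by review.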

11. Governance & Auditability

  • AI-generated content (Aiden-suggested blocks, Aiden-drafted replies, AI Quality Monitor draft proposals) is always visually distinct from human-authored content using the AI-provenance indicator component. Staff cannot accidentally send AI-generated content without a deliberate approval action. Inferred from technical spec §4.5, §8.2, and §9.4.
  • Every audit-significant action (acknowledgement, acceptance, decline, payment option selection, finance application initiation, version creation) triggers a confirmation step before it is recorded. The confirmation step summarises what the action means and (for patient-facing steps) uses plain language. Inferred from technical spec §7 and §10.
  • The engagement timeline panel and audit trail are accessible to authorised staff from the proposal detail view. The audit trail is read-only; no editing UI is presented alongside it. Inferred from technical spec §10 and §11.
  • The active version of a proposal is explicitly marked at the top of the proposal detail. Where a proposal has been versioned, a "Version history" affordance is visible to authorised staff. Previous versions are accessible but visually demoted and marked as superseded. Inferred from technical spec §7.3.
  • Accepted proposals and their signatures are presented as read-only artefacts; no editing affordance is shown to any role. A "create new version" action is available to authorised roles and is clearly labelled as creating a new governed document, not amending the accepted one. Inferred from technical spec §7.3 and §13.
  • Finance application initiation and payment option selection are recorded in the audit trail and visible in the engagement panel but are never surfaced as proposal state changes in the UI. This prevents staff or patients from misreading a financial action as treatment acceptance. Inferred from technical spec §6.7 and §10.
  • Audit log actor-type distinction: The audit trail MUST visually distinguish between events initiated by human actors (staff or patient) and events generated by AI components (AI Quality Monitor draft creation, Aiden content suggestion, AI Concierge call summary). Each audit event is rendered with an actor-type indicator — for example, a staff member's name and role for human actions, or a clearly labelled AI component name for automated actions — so that the provenance of every recorded event is unambiguous. AI-generated audit events are never displayed in a way that could be mistaken for a human-authorised action. This applies both to the engagement timeline panel and the full audit trail view. Inferred from the Security and Privacy UX spec §3 requirement that AI-generated audit events are visually distinguished from user events, and from the AI Quality Monitor draft-review flow in §5.1 Flow 3.
  • The current user's role and active permissions are visible in the application header.
  • Read-only states are visually distinct from editable states.
  • Rewards and financial incentives: Rewards credits, points balances, or any recognition-layer value from Rewards Manager are never presented within the payment options panel, the acceptance flow, or anywhere adjacent to clinical treatment recommendations in a proposal. The payment options panel is restricted to legitimate payment mechanisms (direct payment, finance plans, Direct Debit schedules). Surfacing rewards information alongside treatment recommendations or financing options would risk implying that a clinical decision is incentivised by a reward, which is prohibited. Where a patient has a rewards balance, it is accessible in a separate rewards context outside the proposal flow entirely. Inferred from Rewards Manager's clinical-separation principle prohibiting rewards information adjacent to clinical recommendations.
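The actor-type distinction required for the audit trail can be modelled as a discriminated union, so an AI-generated event can never fall through to a human-style label. This is a hedged sketch under assumed event shapes; the component names come from this spec, but the field names and label formats are illustrative.

```typescript
// Assumed actor shapes — not the real audit schema.
type AuditActor =
  | { kind: "staff"; name: string; role: string }
  | { kind: "patient"; name: string }
  | { kind: "ai"; component: "AI Quality Monitor" | "Aiden" | "AI Concierge" };

// Renders the provenance label for a timeline or audit-trail row.
// AI events always carry an explicit "AI" prefix and "(automated)" suffix
// so they can never be mistaken for a human-authorised action.
function actorLabel(actor: AuditActor): string {
  switch (actor.kind) {
    case "staff":
      return `${actor.name} (${actor.role})`;
    case "patient":
      return `${actor.name} (patient)`;
    case "ai":
      return `AI: ${actor.component} (automated)`;
  }
}
```

Because the union is exhaustive, adding a new actor kind without defining its rendering is a compile-time error rather than a silent mislabel.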

12. Notification & Communication Patterns

All outbound notifications are dispatched via Communication Hub. Smart Treatment Proposals does not send messages directly. The following patterns are inferred from technical spec §5.3, §5.5, §8, §8.4, and §9.4.

  • In-app banner — Used to surface the AI Quality Monitor draft-ready notification on the staff workspace (non-blocking, dismissible). Also used to show persistent offline status when the device loses connectivity. Inferred from technical spec §9.4 ("non-blocking notification/popup") and offline state requirements.
  • Toast — Used to confirm successful staff actions: proposal sent, reply sent, payment profile saved, content block added. Also used to confirm that a call-back request was received (patient-facing). Inferred from the confirmation requirements across technical spec §7 and §8.3.
  • Push notification (via Communication Hub) — Delivered to the patient's mobile app when a new proposal is available for review; when a staff reply to a question is available; and when a call-back has been scheduled (if configured). Engagement prompt notifications MAY also be sent where a proposal has not been opened within a configurable period (where SMS/push prompting is enabled). Inferred from technical spec §5.2, §5.3, §8.2, and §5.5.
  • Email (via Communication Hub) — Primary fallback delivery mechanism for patients not on the Patient Mobile App; contains a tokenised secure link to the web proposal experience. The optional engagement prompt can additionally be delivered via the Communication Hub SMS channel where SMS prompting is enabled. Inferred from technical spec §5.3.
  • Staff task / alert — Created automatically when: a patient requests a call-back; no suitable appointment slot is available post-acceptance (booking follow-up task); a proposal has not been opened within a configurable engagement window (where enabled). These tasks surface in the staff dashboard as pending actions. Inferred from technical spec §7.2, §8.3, and §5.6.

13. Open Questions

UX decisions to resolve before this spec is promoted from draft to published.

  • (needs UX writer input) What is the exact microcopy for Step 1 acknowledgement and Step 2 acceptance confirmation modals, including the plain-language summary of what each step means legally and clinically?
  • (needs UX writer input) What is the visual design treatment for the AI-provenance indicator — badge shape, label text, colour token — that is distinct enough to be unmistakeable but not so alarming that it undermines clinician trust in AI-assisted drafting?
  • (needs UX writer input) How should the decline flow work for patients — is a reason required, optional, or not collected? What is shown to the patient after declining? What is the staff notification pattern?
  • (needs UX writer input) What is the exact empty-state copy and CTA for the proposal dashboard when a staff member has no proposals and no AI Quality Monitor drafts waiting?
  • What is the configured expiry period for proposals, and how far in advance of expiry is the patient notified (if at all)? The technical spec defines an Expired state but does not specify the expiry window or pre-expiry notification behaviour. Inferred gap from technical spec §3.2.
  • When a patient accesses the proposal via secure web link and secondary verification is configured, what is the exact verification UX (OTP, knowledge question, etc.)? The technical spec mentions "secondary verification where configured" without specifying the UX. Inferred gap from technical spec §5.3.
  • For hybrid payment models (e.g. 50% finance + 50% Direct Debit), how are the two components presented side-by-side in the payment options panel without creating confusion about total payable? The technical spec requires clear separation but does not prescribe the layout. Inferred gap from technical spec §6.3.
  • What is the UX for re-acknowledgement or re-acceptance after a new proposal version is created post-acceptance? The technical spec notes that practices may require re-acknowledgement based on policy but does not specify the patient-facing flow for being notified that a revised proposal requires their attention. Inferred gap from technical spec §7.3.
  • (needs UX writer input) Should the in-chair tablet presentation mode suppress the staff dashboard chrome entirely (full-screen patient view) or maintain staff navigation? This has implications for the tablet layout pattern and inadvertent navigation risk.
  • AI Concierge direct-dial is described as configurable, but its availability within this module's UX depends on platform policy. The UX for configuring dedicated numbers, and the staff view of which numbers are active, needs to be defined or confirmed as owned by another module's settings surface. Inferred gap from technical spec §8.4.
  • What level of detail is surfaced in the AI Quality Monitor explainability note during Flow 3 draft review — is it a structured rationale (detected signal, confidence, clinical context) or a plain-language summary? Who authors or templates these notes, and are they editable by staff before being dismissed? Inferred gap from the AI Quality Monitor governance-visibility requirement introduced in §5.1 Flow 3.
  • Where a Digital Form is configured as a prerequisite for a proposal type, who configures that dependency — is it a per-proposal-template setting managed within Smart Treatment Proposals, or a cross-module rule managed in a settings surface shared with Digital Forms? Inferred gap from the form-dependency model introduced in §5.1 Flow 1.