# Performance Dashboards – Technical Specification

## 1. Module Purpose & Scope (Authoritative)

Performance Dashboards provide live, role-appropriate visibility into what is happening right now across a practice or group. The module aggregates performance signals from across the Primoro platform, presents them in role-scoped dashboard views with real-time or near-real-time updates, and surfaces exceptions that require attention — with direct drill-through to the source workflows that own them. Dashboards inform and highlight; they do not judge, enforce, or take action.

It governs:

- Aggregating live performance signals from Primoro modules and connected systems.
- Presenting role-based dashboard views with real-time, near-real-time, or batch-refreshed metrics.
- Surfacing attention cues and exceptions as non-blocking visual indicators, with drill-through to underlying data and next-step workflows.
- Supporting group and multi-site comparisons where permitted by RBAC.

It explicitly does not:

- Create, modify, or enforce tasks, bookings, payments, or policies (owned by Task Manager, Appointment Manager, and Integrated Payments respectively).
- Act as a system of record for any domain.
- Generate financial statements or forecasts (owned by Financial Insights).
- Monitor individual staff behaviour for disciplinary purposes.
- Take autonomous decisions or initiate automated actions on any source system.

## 2. Ownership & Responsibilities

### 2.1 Performance Dashboards IS Responsible For

- Aggregating and presenting read-only performance signals derived from source modules.
- Rendering role-appropriate dashboard views, filtered at source by RBAC.
- Issuing attention indicators (badges, colour cues, icons) that link to drill-downs or tasks but never trigger actions themselves.
- Providing exportable summaries (PDF / CSV) where the user's role permits.
- Emitting access logs, export logs, and metric-definition audit trails to the Audit & Compliance module.
- Enforcing scope boundaries so individuals see only their permitted data.

### 2.2 Performance Dashboards IS NOT Responsible For

- Source data creation or mutation — owned by each originating module (Appointment Manager, Integrated Payments, Task Manager, AI Concierge, Financial Insights, Lab Manager, Rewards Manager, HR & People Manager, Loyalty Insights).
- Detailed financial statements, revenue forecasting, or P&L analysis — owned by Financial Insights.
- Task creation or follow-up enforcement — owned by Task Manager.
- Patient record management or clinical decision support — owned by respective clinical modules.
- Authentication and role assignment — owned by Access Manager.

## 3. Core Objects (Normative)

### 3.1 Performance Metric (Canonical Artefact)

A Performance Metric is a read-only, aggregated indicator derived from one or more source modules within Primoro. It is not a system-of-record object; it carries no mutable state and never writes back to any source module.

Minimum required fields:

- MetricID
- Name
- Domain (Operational / Clinical / Financial / Engagement / Compliance)
- CalculationSource (reference to originating module(s))
- UpdateModel (Real-time / Near-real-time / Batch)
- FreshnessLabel (human-readable staleness indicator)
- VisibilityRules (roles and scopes permitted to view)
- AuditTrail (immutable — records definition changes)

### 3.2 Dashboard View (Canonical Artefact)

A Dashboard View is a role-scoped collection of Performance Metrics, charts, and attention indicators rendered dynamically for an authenticated user.

Minimum required fields:

- ViewID
- RoleBinding (role(s) that receive this view)
- EnabledModules (modules whose metrics are included)
- PermissionScope (site / group / individual panel)
- LayoutVersion
- LastRenderedAt

Layout is optionally configurable by the user where the practice has enabled that capability; the core metric set for each role is non-configurable.
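As an illustration of the §3.1 field set, a concrete metric definition might look like the sketch below. This is a hedged, illustrative example only: the metric name, identifier, role names, and values are invented for illustration and are not normative defaults from this specification.

```python
# Illustrative only: a candidate Performance Metric definition record
# matching the minimum field set in §3.1. All concrete values here are
# hypothetical examples, not normative defaults.
metric_definition = {
    "MetricID": "m-7f3a",                        # hypothetical identifier
    "Name": "Diary utilisation",                 # example Operational metric
    "Domain": "Operational",
    "CalculationSource": ["Appointment Manager"],
    "UpdateModel": "Real-time",
    "FreshnessLabel": "Updated moments ago",
    "VisibilityRules": {
        "roles": ["Manager / Owner", "Front-of-House"],  # example roles
        "scope": "site",
    },
    "AuditTrail": [],  # immutable log of definition changes, empty at creation
}

# A minimal validity check mirroring §3.1's required fields and domains:
REQUIRED = {"MetricID", "Name", "Domain", "CalculationSource",
            "UpdateModel", "FreshnessLabel", "VisibilityRules", "AuditTrail"}
assert REQUIRED <= metric_definition.keys()
assert metric_definition["Domain"] in {"Operational", "Clinical", "Financial",
                                       "Engagement", "Compliance"}
```

Note that the record is read-only by design: nothing in it references a writable source-module object, consistent with §3.1.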
### 3.3 Attention Indicator

An Attention Indicator is a non-blocking visual cue (badge, colour, icon) attached to a metric or metric group. An Attention Indicator MUST declare its trigger condition (e.g. drift from target, unusual change, overdue or at-risk state) so the condition is auditable and explainable. Indicators MAY link to a drill-down or to a Task Manager workflow. They MUST NOT trigger any automated action.

### 3.4 Dashboard View Interaction State Machine (Authoritative)

States:

- Viewed — the Dashboard View has been loaded by an authenticated user.
- Filtered — the user has applied a date, site, or provider filter.
- Drilled-Into — the user has navigated from a metric to underlying detail.
- Exported — the user has requested a PDF or CSV export (role-controlled).

Rules:

- All state transitions are auditable and time-stamped with actor identity.
- A View returns to Viewed state when the user clears filters or navigates back to the top level.
- Export is only reachable from Viewed, Filtered, or Drilled-Into states, and only where the user's role permits export.
- No transition from any Dashboard View state may mutate data in any source module.

## 4. Metric Update Models

### 4.1 Update Model Types (Authoritative)

Each Performance Metric MUST declare exactly one update model:

- **Real-time** — event-driven refresh triggered by source module events (e.g. appointment booked or cancelled, call ended in AI Concierge).
- **Near-real-time** — frequent scheduled refresh, appropriate for metrics where instantaneous accuracy is not critical.
- **Batch** — scheduled aggregation, appropriate for daily or period-level summaries (e.g. Financial Insights aggregates).

The update model and freshness label MUST be visible to the end user on the dashboard surface, so staff can interpret metric currency correctly.

### 4.2 Metric Domains

The module MUST support metrics across the following domains:

- **Operational** — utilisation, cancellations, diary gaps, backfill rate.
- **Clinical** — recall compliance, open treatment plans, patient flow.
- **Financial** — revenue recognised, amounts collected, outstanding balances, deposits held, subscription income (consumed from Financial Insights; rendered as headline aggregates only).
- **Engagement** — referral conversions, reward utilisation, patient cohort engagement, recall compliance trends, churn risk alerts (consumed from Rewards Manager and Loyalty Insights where enabled).
- **Compliance** — lab turnaround rates, onboarding duration, workflow throughput (consumed from Lab Manager and HR & People Manager).

The module MUST NOT render financial data beyond what is supplied by Financial Insights, and MUST NOT generate or modify financial records.

### 4.3 Candidate Metric Domain: Meeting & Governance Signals (AI Meeting Notes)

Where AI Meeting Notes is enabled for a practice, meeting-lifecycle events — such as meeting frequency, outstanding action completion rate, and sign-off time-to-completion — are candidates for exposure as **Operational** or **Compliance** domain metrics. These signals would give practice owners and managers visibility into team-governance health alongside operational and clinical performance.

Inclusion of AI Meeting Notes as an inbound feed is **not confirmed for this build**. The decision depends on whether the AI Meeting Notes module exposes a suitable aggregated event feed and whether practice roles with access to governance metrics have been defined. This is flagged as an open question (see §15, item 8). If confirmed, the integration contract would follow the near-real-time pattern (aggregated action and meeting lifecycle events), and the inbound feed entry in §6.1 and the Integration Summary in §10 would be updated accordingly. No AI processing would be applied by Performance Dashboards to these signals; they would be rendered as read-only aggregated metrics in the same manner as all other inbound feeds.

## 5. Delivery Surfaces & Access (Authoritative)

### 5.1 Web Portal

Performance Dashboards are the primary staff-facing surface in the web portal. Each authenticated user is presented the Dashboard View bound to their role on login. Filter controls (date, site, provider) and drill-through navigation are available in the portal. Export (PDF / CSV) is available where the user's role permits.

### 5.2 Tablet App

Dashboard views rendered on the in-practice tablet MUST surface the same role-bound metric set as the web portal, adapted for touch interaction and a smaller viewport. Tablet access is subject to the same RBAC and audit requirements.

### 5.3 Patient Mobile App

Performance Dashboards do not surface any content in the patient mobile app. This module is staff-only.

### 5.4 Engagement Signals

The module emits access and export events to the Audit & Compliance module. These signals are available to practice owners and group managers to understand dashboard adoption and identify unused views, supporting future configuration decisions.

## 6. Integration Contracts

### 6.1 Inbound (this module consumes from)

| From Module | What | Contract |
|---|---|---|
| Appointment Manager | Booking, cancellation, DNA, and utilisation events | Real-time event feed |
| Integrated Payments | Payment and outstanding balance signals | Near-real-time |
| Task Manager | Task completion and overdue signals | Near-real-time |
| AI Concierge | Call handling statistics | Real-time event feed |
| Financial Insights | Revenue, collections, outstanding balances, deposits held, subscription income aggregates | Batch |
| Lab Manager | Lab turnaround time, on-time delivery rate, overdue case count | Near-real-time |
| Rewards Manager | Points issued and redeemed, active participant counts, referral conversions, reward utilisation rates, charity impact summaries | Near-real-time |
| HR & People Manager | Staffing coverage, absence concentration signals, overtime indicators, onboarding duration, workflow throughput | Batch |
| Loyalty Insights *(where module is enabled)* | Recall compliance trends, churn risk alerts, patient cohort engagement data | Near-real-time |
| Access Manager | Role bindings and permission scopes for RBAC enforcement | Sync (on session) |
| AI Meeting Notes *(candidate — not confirmed for this build; see §4.3 and §15)* | Meeting frequency, outstanding action completion rate, sign-off time-to-completion aggregates | Near-real-time |

### 6.2 Outbound (this module emits to)

| To Module | What | Contract |
|---|---|---|
| Audit & Compliance | Dashboard access events, filter events, drill-through events, export events | Async event |
| Task Manager | Drill-through links that open a relevant task workflow (user-initiated only; no automatic task creation) | Deep link / navigation |

### 6.3 PMS Boundary

Performance Dashboards do not read from or write to the PMS directly. All PMS-originated data (where applicable) is mediated through the relevant Primoro module (e.g. Appointment Manager) before being consumed by this module as an aggregated signal.

## 7. AI Boundaries (Non-Negotiable)

The module does not embed AI surfaces directly. Metric data originating from AI Concierge (call handling statistics) is consumed as a pre-aggregated signal; Performance Dashboards renders it as a read-only metric and applies no AI processing to it.

If AI-generated explanations or summaries of dashboard metrics are introduced in a future version, those surfaces MUST comply with the platform-wide AI Boundaries policy: AI MAY summarise data for human review; AI MAY NOT take action, bypass RBAC, or present outputs without a clear human-in-the-loop approval step.

## 8. Audit & Compliance

The system MUST log:

- Dashboard access events: who accessed which Dashboard View, at what time, and with what permission scope.
- Filter events: which date, site, or provider filters were applied, and by whom.
- Drill-through events: which drill-down was navigated to, from which metric, and by whom.
- Export events: who requested an export, what scope was exported, and in what format.
- Cross-site view access: any access to a group or multi-site comparison view, with the sites in scope.
- Metric definition changes: any change to a Performance Metric's calculation source, visibility rules, or update model, with actor and timestamp.

Audit logs MUST be immutable and exportable for inspection. No dashboard access — including read-only views — is permitted without a corresponding audit log entry.

## 9. Access Control

Access is governed by Access Manager role bindings. The following capabilities apply per role:

- **Read (view dashboard):** All authenticated staff roles; scope is bounded to the role's permitted sites and data domains.
- **Filter / Drill-Through:** All authenticated staff roles within their permitted scope.
- **Export (PDF / CSV):** Role-controlled; not all roles are permitted to export.
  The exact role-to-export permission mapping is governed by Access Manager configuration.
- **Cross-Site / Group View:** Explicitly granted by Access Manager; access is logged on every use.
- **Metric Definition Management:** *(no content captured in original — needs definition)*
- **Dashboard Layout Configuration:** Where enabled, individual users may configure their own layout; practice-level defaults are set by the Admin Control Plane.

MFA requirements for dashboard access follow the platform-wide Access Manager policy. Given that dashboards surface aggregated financial and operational data, practices MAY configure MFA as a condition of export access.

## 10. Integration Summary

- **Appointment Manager** — inbound real-time events for utilisation, cancellation, and DNA metrics.
- **Integrated Payments** — inbound near-real-time signals for payment and outstanding balance metrics.
- **Task Manager** — inbound near-real-time task signals; outbound deep-link navigation for user-initiated drill-through.
- **AI Concierge** — inbound real-time call handling statistics.
- **Financial Insights** — inbound batch aggregates for financial headline metrics.
- **Lab Manager** — inbound near-real-time operational lab metrics.
- **Rewards Manager** — inbound near-real-time reward, referral, and charity metrics.
- **HR & People Manager** — inbound batch workforce operational metrics.
- **Loyalty Insights** — inbound near-real-time retention and engagement signals (where module is enabled).
- **Access Manager** — RBAC enforcement for role binding, permission scope, and export control.
- **Audit & Compliance** — immutable outbound event log for all access, filter, drill-through, and export events.
- **AI Meeting Notes** *(candidate — not confirmed for this build)* — inbound near-real-time meeting and action lifecycle aggregates for governance and team-health metrics, subject to the decision captured in §15, item 8.

## 11. Explicit Non-Goals

- **Financial statements and forecasting** — detailed P&L, revenue forecasting, and financial reporting remain the responsibility of Financial Insights.
- **Automated actions or workflow triggers** — dashboards surface exceptions and provide drill-through; they do not initiate tasks, send communications, or modify bookings. Those capabilities belong to Task Manager, Communication Hub, and Appointment Manager respectively.
- **Disciplinary monitoring of individuals** — comparative views are contextual and performance-oriented, not punitive. HR-oriented disciplinary workflows, if required, would be owned by HR & People Manager.
- **Patient-facing data presentation** — no dashboard content is exposed in the patient mobile app.
- **Clinical decision support** — dashboards surface operational and compliance signals; clinical decision support is out of scope and would be owned by the relevant clinical module.

## 12. Versioning & Governance

This specification is owned by: *(no content captured in original — needs definition)*

Changes to this spec require:

- Review by the Post-MVP module owner.
- Impact analysis across all declared related modules (see /propose), particularly any module supplying inbound metric feeds.
- Version bump: patch for clarifications, minor for new metric sources or role additions, major for changes to core objects or audit contract.

## 13. Build Contract (Engineering & QA)

### 13.1 Canonical Data Model

Performance Dashboards is a read-only aggregation module. It does not maintain a primary system-of-record table.
The canonical artefacts it manages at build time are:

```
performance_metric_definition (
  metric_id          UUID PRIMARY KEY,
  name               TEXT NOT NULL,
  domain             TEXT NOT NULL,        -- Operational|Clinical|Financial|Engagement|Compliance
  calculation_source TEXT NOT NULL,        -- originating module identifier(s)
  update_model       TEXT NOT NULL,        -- real-time|near-real-time|batch
  freshness_label    TEXT NOT NULL,
  visibility_rules   JSONB NOT NULL,       -- roles and permission scopes
  created_at         TIMESTAMPTZ NOT NULL,
  updated_at         TIMESTAMPTZ NOT NULL,
  audit_trail        JSONB NOT NULL        -- immutable log of definition changes
)

dashboard_view_definition (
  view_id          UUID PRIMARY KEY,
  role_binding     TEXT[] NOT NULL,
  enabled_modules  TEXT[] NOT NULL,
  permission_scope TEXT NOT NULL,          -- site|group|individual-panel
  layout_version   INTEGER NOT NULL,
  created_at       TIMESTAMPTZ NOT NULL,
  updated_at       TIMESTAMPTZ NOT NULL
)

dashboard_audit_event (
  event_id      UUID PRIMARY KEY,
  event_type    TEXT NOT NULL,             -- accessed|filtered|drilled-into|exported|cross-site-accessed
  actor_id      UUID NOT NULL,
  view_id       UUID REFERENCES dashboard_view_definition,
  metric_id     UUID REFERENCES performance_metric_definition,
  scope_context JSONB,                     -- site(s), provider(s), date range in effect
  occurred_at   TIMESTAMPTZ NOT NULL
)
```

The `dashboard_audit_event` table is append-only. No UPDATE or DELETE is permitted by application code.

### 13.2 Core Behaviour Rules

1. A Dashboard View MUST only render metrics for which the authenticated user's role and permission scope grant visibility, as evaluated at render time against Access Manager.
2. No dashboard interaction — view, filter, drill-through, or export — may mutate data in any source module.
3. Every metric MUST display its freshness label so the user can assess data currency.
4. Attention Indicators MUST declare their trigger condition; the condition MUST be traceable to the source metric and its calculation source.
5. Export capability MUST be suppressed for roles that do not hold the export permission; the export control for a given export request MUST be verified at the time of export, not only at render time.
6. Cross-site and group views MUST be explicitly granted by Access Manager; attempting to access such a view without the grant MUST return a permission error and MUST still be logged to the audit trail.
7. If a source module feed is unavailable, the metric MUST degrade gracefully — displaying a staleness indicator and the last known value — rather than breaking the dashboard UI.
8. All metrics derived from Financial Insights MUST be rendered as headline aggregates only; no financial records may be created or modified via this module.
9. Drill-through navigation to Task Manager workflows is user-initiated only; the module MUST NOT automatically create tasks.
10. All audit events MUST be written to the `dashboard_audit_event` table before the corresponding user-facing response is returned.

### 13.3 Configuration Surfaces

- **Practice-level (Admin Control Plane):** Enable or disable export capability per role; configure which modules' metrics are included in each Dashboard View; set default layout for each role.
- **Per-user (Access Manager):** Role binding determines Dashboard View assignment and permission scope; individual layout preferences where the practice permits.
- **Per-metric (Performance Metric Definition):** Update model, freshness label, and visibility rules are set at metric definition level and version-controlled via the audit trail.
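The export behaviour in §13.2 (rule 5: re-verify the permission at export time, not only at render time; rule 10: write the audit event before returning the response) can be sketched as follows. This is a minimal illustration under stated assumptions, not the implementation: the function name, role names, and in-memory structures are hypothetical, a real system would consult Access Manager and the `dashboard_audit_event` table, and logging of *denied* export attempts is an assumption made here by analogy with the cross-site rule in §13.2 item 6.

```python
# Hedged sketch of §13.2 rules 5 and 10. All names are hypothetical;
# audit_log stands in for the append-only dashboard_audit_event table,
# and EXPORT_ROLES stands in for the Access Manager role mapping.
audit_log = []
EXPORT_ROLES = {"manager"}  # example grant; real mapping comes from Access Manager

def request_export(actor_id: str, role: str, view_id: str, fmt: str) -> dict:
    if fmt not in {"PDF", "CSV"}:
        raise ValueError("unsupported export format")
    # Rule 5: verify the export permission at the time of the request,
    # even if the export control was already shown or hidden at render time.
    if role not in EXPORT_ROLES:
        audit_log.append({"event_type": "export-denied",
                          "actor_id": actor_id, "view_id": view_id})
        raise PermissionError("role is not permitted to export")
    # Rule 10: record the audit event *before* returning the response.
    audit_log.append({"event_type": "exported", "actor_id": actor_id,
                      "view_id": view_id, "format": fmt})
    return {"status": "ok", "format": fmt}
```

Suppressing the export button at render time remains a UI courtesy only; the authoritative check is the one performed inside the export request path.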
### 13.4 Filtering & Views

The UI MUST support the following standard filters across all Dashboard Views where the underlying metric supports the dimension:

- Date range (today, this week, custom range)
- Site (single site or multi-site, subject to RBAC)
- Provider (individual practitioner, subject to RBAC)

The module MUST support the following standard views:

- Practitioner Dashboard (§6.1 metric set)
- Front-of-House / Diary Dashboard (§6.2 metric set)
- Treatment Coordinator Dashboard (§6.3 metric set)
- Manager / Owner Dashboard (§6.4 metric set)
- Group / Multi-Site View (§6.5, RBAC-gated)

Saved filter configurations are a candidate for a future minor version; they are out of scope for this build contract.

### 13.5 Module Extension Map

- New metric sources are added by extending the `performance_metric_definition` table and wiring a new inbound feed; no schema change to `dashboard_view_definition` is required unless a new role binding is needed.
- New Dashboard View types (e.g. a new staff role) are added by inserting a new `dashboard_view_definition` row and configuring its metric set; no breaking change to existing views.
- Export format additions (e.g. XLSX) are additive and do not affect the core audit or RBAC contracts.
- Any change to the `dashboard_audit_event` schema is a major version change to this specification, as it affects the Audit & Compliance contract.

### 13.6 Acceptance Criteria

The build of Performance Dashboards is complete when:

- [ ] All authenticated staff roles receive the Dashboard View bound to their role, with metrics filtered to their permission scope.
- [ ] Metrics update according to their declared update model (real-time, near-real-time, batch) and display the correct freshness label.
- [ ] Drill-through navigation maps correctly to source records in the originating module and does not mutate any source data.
- [ ] No dashboard interaction triggers an automatic action in any source module.
- [ ] Export is available only to roles with explicit export permission, verified at export time.
- [ ] Cross-site and group views are inaccessible without explicit Access Manager grant; denied attempts are logged.
- [ ] Source module feed unavailability results in graceful degradation (staleness indicator + last known value), not a broken UI.
- [ ] All access, filter, drill-through, export, and cross-site events are written to the audit log before the corresponding response is returned.
- [ ] Audit log entries are immutable; no UPDATE or DELETE path exists in application code.
- [ ] All non-functional requirements in §14 are met.

## 14. Non-Functional Requirements

- **Performance:** Dashboard Views MUST load within an acceptable latency target using cached aggregates for batch and near-real-time metrics; real-time metrics MUST reflect source events without perceptible lag. Specific latency targets (e.g. p95 load time) are to be defined by engineering during sprint planning, informed by the update model of the slowest metric in each view.
- **Reliability:** If one or more source module feeds are unavailable, the dashboard MUST degrade gracefully — displaying the last known value with a staleness indicator — rather than failing to render. Target availability for the dashboard rendering service is to be aligned with the platform-wide SLA defined in the infrastructure runbook.
- **Scalability:** The module MUST support multi-site and multi-tenant deployments. Cross-site views MUST remain performant as the number of sites in a group grows. Aggregation queries MUST be designed to avoid full-table scans on source module data at render time; pre-computation or caching strategies are required for batch-model metrics.
- **Security:** All data in transit MUST be encrypted. All data at rest (including cached aggregates and audit events) MUST be encrypted. RBAC is enforced at the metric visibility level, not only at the view level.
  No metric payload may be cached in a shared cache layer without isolation by tenant and permission scope, to prevent data leakage between practices or roles.
- **Privacy:** Dashboard metrics are aggregated and do not expose raw patient records directly. Drill-through that surfaces patient-identifiable data is subject to the same data access controls as the originating module. GDPR data subject rights (access, erasure) for audit log entries are governed by the Audit & Compliance module's retention policy; Performance Dashboards defers to that policy.
- **Observability:** The module MUST export metrics covering dashboard render latency, metric feed availability per source module, audit event write success rate, and export request volume. Distributed traces MUST be emitted for cross-module feed calls so that feed latency can be attributed to the correct source module during incident diagnosis.
- **Accessibility:** Dashboard views MUST meet WCAG 2.1 AA standards. Attention indicators MUST NOT rely on colour alone to convey meaning; they MUST include an accompanying icon or text label.

## 15. Open Questions

> Outstanding decisions before this spec can be promoted from `draft` to `published`.

1. **Export role permissions:** The original spec states that export is role-controlled but does not define which roles are permitted to export. Which roles receive export access, and is this configurable per practice or fixed at the platform level?
2. **Tablet surface scope:** The original spec does not specify which Dashboard Views are available on the in-practice tablet versus the web portal only. Are all five role-bound views available on tablet, or is a subset defined for tablet?
3. **Dashboard layout configurability:** The original spec says layout is "optionally configurable where allowed." Who decides whether a practice enables per-user layout configuration — is this a practice-level Admin Control Plane setting, and which role(s) can change it?
4. **Metric definition ownership:** The original spec does not name an owner for metric definitions (who can add, modify, or retire a Performance Metric Definition). Is this owned by the platform product team, or can practice admins create custom metrics?
5. **Saved filter configurations:** Are saved/named filter configurations in scope for this build, or deferred to a future version?
6. **Loyalty Insights availability signal:** The original spec notes that Loyalty Insights feeds are conditional on the module being enabled. How does the dashboard handle a practice where Loyalty Insights is subsequently disabled mid-session — does the view re-render automatically or require a page reload?
7. **Specification ownership role:** No named owner role is identified for this specification. Who holds governance authority for this module's spec (e.g. Product Lead, Performance Suite owner)?
8. **AI Meeting Notes integration:** Should meeting-lifecycle signals from AI Meeting Notes (meeting frequency, outstanding action completion rate, sign-off time-to-completion) be exposed as Operational or Compliance domain metrics in Performance Dashboards? If so, which staff roles should have visibility of these metrics, and does AI Meeting Notes expose a suitable aggregated event feed for near-real-time consumption? Confirmation of this decision will determine whether the candidate feed in §6.1 and §10 is promoted to a confirmed integration contract.