T12.1: Manages Reporting

Knowledge Review - HealthShare Unified Care Record Technical Specialist

1. Management Report Interface

Key Points

  • Report Access: Management reports are accessed through the UCR Management Portal
  • Navigation: Reports are organized by category in the reporting menu
  • Report Parameters: Most reports accept date ranges, facility filters, and other criteria
  • Output Formats: Reports can be viewed on screen, exported to PDF, or saved as XML
  • Access Control: Report access is controlled by user roles and permissions

Detailed Notes

Overview

The UCR Management Portal provides a reporting interface that allows administrators and authorized users to view operational reports about system usage, data volumes, user activity, and security events. These management reports provide visibility into how the UCR federation is being used and help identify issues, track adoption, and support compliance requirements.

Understanding how to navigate the reporting interface, select appropriate reports, and interpret their output is essential for UCR administrators.

Accessing Management Reports

1. Log in to the UCR Management Portal
2. Navigate to the Management Reports section
3. Select the desired report category
4. Choose the specific report to run
5. Set report parameters (date range, facility, user filters)
6. Execute the report and view results

Report Interface Features

  • Parameter Selection: Date pickers, drop-down filters, and text fields for narrowing results
  • Results Display: Tabular display with sortable columns
  • Pagination: Navigate through large result sets with page controls
  • Export Options: Print to PDF, save as XML
  • Drill-Down: Some reports support clicking on summary rows to see detail records
  • Refresh: Re-run the report with current data

---


2. Standard Management Reports

Key Points

  • Audit Log: Tracks system and user actions for compliance
  • Clinical Data Access Report: Shows which users accessed which patient records
  • Data Volume: Displays data ingestion volumes by facility and time period
  • User Activity: Tracks user login and usage patterns
  • Security Reports: Login Failures and the Emergency Access Log support security monitoring
  • FHIR Reports: FHIR Shared Resource Access supports FHIR API monitoring

Detailed Notes

Overview

UCR includes a comprehensive set of standard management reports that cover operational monitoring, security auditing, data volume tracking, and user activity analysis. Each report provides specific insights into different aspects of the UCR federation's operation. Administrators should be familiar with all available reports and understand when each is appropriate.

Available Reports

  • Audit Log: Records system events, configuration changes, and administrative actions. Essential for compliance auditing and change tracking.
  • Clinical Data Access Report: Shows which users accessed which patient records, when, and from which application. Supports HIPAA compliance and patient privacy monitoring.
  • Clinical Data Access Report (All Patients): Broader version of the Clinical Data Access Report showing access across all patients, useful for identifying unusual access patterns.
  • Data Volume: Displays the volume of data ingested by the Edge Gateway, broken down by facility, message type, and time period. Helps monitor data feed health and growth.
  • Emergency Access Log: Records instances where users invoked emergency (break-the-glass) access to patient records outside their normal authorization.
  • FHIR Shared Resource Access: Tracks access to FHIR resources through the UCR FHIR endpoints. Monitors API usage and consumer activity.
  • Login Failures: Lists failed login attempts with user, timestamp, and source. Critical for security monitoring and detecting unauthorized access attempts.
  • Patient Events: Tracks significant patient events such as merges, unmerges, consent changes, and record modifications.
  • Record Request Counts: Shows the number of record requests made to the federation, broken down by source and time period.
  • Summary Documents Counts: Displays counts of summary documents generated and stored in the federation.
  • Summary Documents Sent: Tracks summary documents sent to external systems or other federation components.
  • User Activity: Shows user login frequency, session duration, and feature usage. Helps track adoption and identify training needs.
  • User Event Counts: Aggregated counts of user events by event type and time period.
  • Inbound Message Type Counts: Shows counts of inbound messages by message type (HL7, FHIR, CDA, X12) and time period. Useful for monitoring data feed health.

Report Selection Guide

  • Compliance Auditing: Use Audit Log, Clinical Data Access Report, Emergency Access Log
  • Security Monitoring: Use Login Failures, Emergency Access Log, User Activity
  • Data Feed Monitoring: Use Data Volume, Inbound Message Type Counts, Summary Documents Counts
  • Usage Analysis: Use User Activity, User Event Counts, Record Request Counts
  • FHIR Monitoring: Use FHIR Shared Resource Access

---


3. Usage Dashboards

Key Points

  • Dashboard Namespace: A dedicated namespace that hosts the usage dashboard data and its production
  • Dashboard Production: An interoperability production that collects and processes usage metrics
  • Visual Dashboards: Graphical displays showing system usage trends and metrics
  • Monitoring: Real-time and historical views of federation usage patterns
  • Configuration: Set up the namespace, production, and dashboard components

Detailed Notes

Overview

Usage dashboards provide graphical, real-time and historical views of how the UCR federation is being used. Unlike management reports that show tabular data on demand, dashboards provide visual indicators (charts, graphs, gauges) that make it easy to monitor system health and usage patterns at a glance. Setting up usage dashboards requires creating a dedicated namespace and production that collects, aggregates, and stores usage metrics.

Creating the Dashboard Namespace

1. Create a dedicated namespace for usage dashboard data (separate from the main UCR namespace)
2. Configure the namespace with appropriate database storage
3. Install the dashboard classes and configuration into the namespace
4. Set up the interoperability production for data collection

Dashboard Production

  • The dashboard production runs in the dedicated namespace
  • Business services collect usage metrics from UCR components (Hub, Edge Gateways, Access Gateways)
  • Business processes aggregate and store metrics
  • Configure collection intervals (how frequently metrics are gathered)
  • Monitor production status to ensure metrics are being collected
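The collection-interval idea behind the dashboard production can be sketched as a simple poller. This is a hypothetical stand-in, not a UCR API: `source` represents whatever callable gathers metrics from a component, and `store` is any list-like sink.

```python
import time
from datetime import datetime, timezone

def collect_metrics(source, store, interval_s, cycles):
    """Poll a metric source on a fixed interval and append timestamped samples.

    A stand-in for the dashboard production's business service: `source` is a
    zero-argument callable returning a dict of metric values, and `store` is
    any list-like sink (both illustrative, not actual UCR components).
    """
    for _ in range(cycles):
        sample = {"ts": datetime.now(timezone.utc).isoformat(), **source()}
        store.append(sample)
        time.sleep(interval_s)  # the configured collection interval
    return store
```

A real production runs continuously and writes into the dashboard database; `cycles` merely bounds the sketch so it terminates.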

Available Dashboards

  • System Health: Overall federation health indicators (component status, error rates)
  • Data Volume Trends: Charts showing data ingestion volumes over time
  • User Activity Trends: Graphs showing login frequency and usage patterns
  • Response Times: Performance metrics for data retrieval and display
  • Federation Overview: Summary view of all components and their status

---


4. Data Loader and Aggregation

Key Points

  • Data Loader: Task that loads raw usage data into the dashboard database
  • Nightly Aggregation: Scheduled task that aggregates daily data into summary statistics
  • Incremental Processing: Only processes new data since the last run
  • Task Scheduling: Configure data loader and aggregation schedules via Task Manager
  • Troubleshooting: Monitor task execution and handle failures

Detailed Notes

Overview

Usage dashboards depend on data loader tasks and nightly aggregation processes to populate their databases with the metrics they display. The data loader collects raw usage data from UCR components and loads it into the dashboard database. The nightly aggregation task then processes this raw data into summary statistics suitable for dashboard display. Both tasks are configured to run on schedules and process data incrementally.

Data Loader Tasks

  • Data loader tasks collect raw metrics from various UCR components
  • Tasks are configured in the Task Manager of the dashboard namespace
  • Each task is responsible for a specific type of metric (user activity, data volume, etc.)
  • Tasks run on a schedule (typically every few hours or on demand)
  • Configure the data source connections and collection parameters

Nightly Incremental Aggregation

  • Aggregation tasks run nightly (or on a configured schedule)
  • Process raw data loaded since the last aggregation run
  • Compute summary statistics: counts, averages, minimums, maximums, trends
  • Store aggregated data in summary tables for fast dashboard retrieval
  • Incremental processing ensures only new data is processed, reducing load
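The incremental pattern above can be sketched in Python using a watermark: record the highest raw-row id already processed, and each run aggregates only rows beyond it. The table and column names (`raw_usage`, `daily_summary`, `agg_watermark`) are illustrative, not UCR's actual schema.

```python
import sqlite3
from collections import defaultdict

def aggregate_incrementally(conn):
    """Roll up raw usage rows loaded since the last run into daily summaries.

    A watermark table records the highest raw-row id already processed, so
    each nightly run touches only new data. The schema is illustrative,
    not UCR's actual one.
    """
    cur = conn.cursor()
    cur.execute("SELECT COALESCE(MAX(last_id), 0) FROM agg_watermark")
    last_id = cur.fetchone()[0]

    # Group the new raw rows by calendar day and metric name.
    totals = defaultdict(float)
    max_id = last_id
    for row_id, ts, metric, value in cur.execute(
            "SELECT id, ts, metric, value FROM raw_usage WHERE id > ?",
            (last_id,)).fetchall():
        totals[(ts[:10], metric)] += value
        max_id = max(max_id, row_id)

    # Fold the deltas into the summary table (update if present, else insert).
    for (day, metric), total in totals.items():
        cur.execute("UPDATE daily_summary SET total = total + ? "
                    "WHERE day = ? AND metric = ?", (total, day, metric))
        if cur.rowcount == 0:
            cur.execute("INSERT INTO daily_summary VALUES (?, ?, ?)",
                        (day, metric, total))

    # Advance the watermark so the next run skips everything seen so far.
    cur.execute("INSERT INTO agg_watermark (last_id) VALUES (?)", (max_id,))
    conn.commit()
```

Because the watermark advances atomically with the summary update, re-running the task when no new data has arrived leaves the summaries unchanged, which is the property that keeps nightly runs cheap.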

Running Data Loader Tasks

1. Navigate to the Task Manager in the dashboard namespace
2. Identify the data loader tasks
3. Run tasks manually (for testing) or verify scheduled execution
4. Monitor task completion and check for errors
5. Verify that data appears in the dashboard after task completion

Troubleshooting

  • No Dashboard Data: Verify data loader tasks are running and completing successfully
  • Stale Data: Check nightly aggregation task schedule and last run time
  • Missing Metrics: Verify the data source connections from the dashboard production
  • Task Failures: Review task logs for error messages, check connectivity to UCR components
  • Performance: Monitor task execution times and optimize collection intervals if needed

---


5. Custom Report Creation

Key Points

  • Report UI Class: Defines the user interface for the custom report (parameters, display)
  • Report Definition Class: Contains the query logic and data retrieval for the report
  • Class Structure: Both classes follow defined patterns for integration with the reporting framework
  • SQL Queries: Report definitions typically use SQL to query UCR data stores
  • Integration: Custom reports appear alongside standard reports in the Management Portal

Detailed Notes

Overview

When the standard management reports do not meet an organization's reporting needs, custom reports can be created. Custom report creation involves two primary classes: the Report UI class (which defines the user interface for parameter input and result display) and the Report Definition class (which contains the data query logic). Together, these classes integrate with the UCR reporting framework to provide custom reports that appear alongside standard reports in the Management Portal.

Report UI Class

  • Defines the user interface elements for the report
  • Specifies input parameters (date ranges, filters, search criteria)
  • Configures the results display (columns, formatting, sorting)
  • Extends the base report UI class provided by the UCR framework
  • Controls layout of the report page in the Management Portal

Report Definition Class

  • Contains the data retrieval logic (typically SQL queries)
  • Defines the result set structure (column names and types)
  • Implements filtering based on user-supplied parameters
  • Can query multiple data sources (ECR, Hub registries, audit tables)
  • Extends the base report definition class provided by the UCR framework

Structure of the Definition Class

  • Parameters: Properties that receive input from the UI class
  • Execute Method: Main method that runs the query and returns results
  • SQL Query: The query that retrieves data from UCR data stores
  • Result Processing: Optional post-processing of query results before display
  • Column Definitions: Define the output columns, types, and display formatting
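Real UCR definition classes are ObjectScript classes extending the reporting framework's base class, but the structure above (parameters, column definitions, an execute method running a parameterized query) can be mirrored in a hedged Python sketch. The class name, schema, and query here are all hypothetical.

```python
import sqlite3

class LoginFailureReportDef:
    """Illustrative report-definition pattern: parameters, column metadata,
    and an execute() method running a parameterized query.

    Hypothetical schema and class; real UCR definitions are ObjectScript
    classes extending the framework's base report definition class.
    """

    COLUMNS = [("username", "TEXT"), ("failures", "INTEGER")]

    def __init__(self, start_date, end_date):
        # Parameters supplied by the Report UI layer.
        self.start_date = start_date
        self.end_date = end_date

    def execute(self, conn):
        # Bind user-supplied parameters rather than concatenating strings:
        # this avoids SQL injection and makes empty-result bugs easier to trace.
        return conn.execute(
            """SELECT username, COUNT(*) AS failures
               FROM login_failures
               WHERE ts BETWEEN ? AND ?
               GROUP BY username
               ORDER BY failures DESC""",
            (self.start_date, self.end_date)).fetchall()
```

The separation matters for debugging: the query can be tested on its own with known parameter values before the UI class is wired in.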

Creating a Custom Report

1. Identify the reporting requirement (what data, what filters, what output)
2. Design the SQL query to retrieve the required data
3. Create the Report Definition class with the query and result structure
4. Create the Report UI class with the parameter inputs and display configuration
5. Compile both classes
6. Install the report into the UCR reporting framework
7. Test the report with representative data and parameters

---


6. Report Installation

Key Points

  • Install Method: Custom reports are installed using the Install class method
  • Registration: Installation registers the report with the UCR reporting framework
  • Menu Placement: Installed reports appear in the Management Portal reporting menu
  • Namespace: Reports must be installed in the correct namespace
  • Verification: Verify report appears and functions correctly after installation

Detailed Notes

Overview

After creating the Report UI and Report Definition classes, custom reports must be installed into the UCR reporting framework to make them available to users. The installation process registers the report with the system, adds it to the reporting menu, and makes it accessible through the standard report interface. Installation is performed using the Install method provided by the reporting framework.

Installation Process

1. Ensure both the Report UI and Report Definition classes are compiled in the correct namespace
2. Call the Install method of the report class (or use the installation utility)
3. Provide any required installation parameters (report name, category, description)
4. The Install method registers the report with the reporting framework
5. The report appears in the Management Portal reporting menu

Install Method Details

  • The Install method is a class method on the Report UI class or a dedicated installer class
  • It accepts parameters such as report name, category, and display order
  • It creates the necessary entries in the reporting registry
  • It may also set default parameter values and access permissions
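Conceptually, the Install method writes a registry entry that the portal menu reads. A minimal sketch of that registration pattern, where `registry` and both helper functions stand in for the framework's actual reporting registry and Install method (names and parameters are assumptions, not UCR's API):

```python
def install_report(registry, name, category, ui_class, display_order=0):
    """Register a report so it appears in the portal's reporting menu.

    `registry` stands in for the framework's reporting registry; the real
    Install method and its parameter list are defined by the UCR framework.
    Re-running the install updates the entry instead of duplicating it.
    """
    registry[name] = {"category": category, "ui_class": ui_class,
                      "order": display_order}

def menu_for_category(registry, category):
    """Names of installed reports in one menu category, in display order."""
    names = [n for n, e in registry.items() if e["category"] == category]
    return sorted(names, key=lambda n: (registry[n]["order"], n))
```

Making installation idempotent (re-install updates rather than duplicates) is what lets a report be safely re-deployed after a fix.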

Post-Installation Verification

1. Navigate to the Management Reports section in the Management Portal
2. Verify the custom report appears in the expected category
3. Open the report and confirm the parameter input interface displays correctly
4. Run the report with test parameters
5. Verify the results display correctly
6. Test export functionality (PDF, XML)
7. Verify access control (only authorized users can see the report)

---


7. Report Debugging

Key Points

  • Common Issues: Query errors, missing data, display formatting problems, parameter handling bugs
  • SQL Debugging: Test queries directly in the SQL shell before embedding in report classes
  • Class Debugging: Use the debugger or logging to trace report execution
  • Data Verification: Confirm that expected data exists in the data stores being queried
  • Framework Integration: Verify correct extension of base classes and method signatures

Detailed Notes

Overview

Debugging custom management reports requires systematically identifying and resolving issues in the query logic, parameter handling, display formatting, or framework integration. Common problems include SQL errors, missing or unexpected data, incorrect parameter binding, and display formatting issues. A structured debugging approach helps quickly isolate and resolve these issues.

Common Issues and Solutions

  • Report Not Appearing in Menu: Verify installation completed successfully, check the reporting registry for the report entry
  • Parameter Not Passed to Query: Verify parameter names match between UI class and definition class, check parameter binding in the Execute method
  • Empty Results: Test the SQL query directly with known parameter values, verify data exists in the queried tables
  • SQL Errors: Run the query in the SQL shell to identify syntax or logic errors, check table and column names
  • Display Formatting: Verify column definitions match the query result structure, check data type formatting
  • Access Denied: Verify the user has permissions to run the report, check access control configuration

Debugging Approach

1. Isolate the Layer: Determine if the issue is in the UI, query, or framework integration
2. Test the Query: Run the SQL query directly in the Management Portal SQL interface with known parameter values
3. Check Parameters: Add logging to verify parameter values are passed correctly from UI to definition
4. Review Class Structure: Verify the classes correctly extend the base framework classes
5. Check Compilation: Ensure both classes compile without errors
6. Review Logs: Check system logs for errors during report execution
7. Test Incrementally: Build and test the report in stages (query first, then UI, then integration)

Logging and Tracing

  • Add temporary logging statements to the Execute method to trace execution
  • Log parameter values received by the definition class
  • Log query execution status and row counts
  • Use the system log viewer in the Management Portal to review log entries
  • Remove or disable verbose logging after debugging is complete
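The tracing idea above can be sketched as a thin wrapper around a query function. `traced_execute` and `run_query` are hypothetical names, not UCR framework methods; the point is logging the two facts that resolve most empty-result bugs: the parameter values actually received, and the row count returned.

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("report.trace")

def traced_execute(run_query, **params):
    """Run a report query with temporary trace logging around it.

    `run_query` stands in for a definition class's Execute method. Logs the
    parameter values the definition actually received and the row count
    returned; warns when the result set is empty.
    """
    log.debug("report parameters: %r", params)
    rows = run_query(**params)
    log.debug("query returned %d row(s)", len(rows))
    if not rows:
        log.warning("empty result set: check parameter values and source data")
    return rows
```

As the notes say, this kind of verbose logging should be removed or disabled once the report is working.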

---


8. Report Export

Key Points

  • PDF Export: Print reports as PDF documents for sharing and archiving
  • XML Export: Save report data as XML for further processing or import into other systems
  • Export Interface: Export options available in the report viewer toolbar
  • Formatting: PDF exports preserve the on-screen formatting; XML exports provide structured data
  • Scheduling: Some reports can be scheduled for automatic export and distribution

Detailed Notes

Overview

Management reports can be exported from the UCR reporting interface in PDF and XML formats. PDF export produces a formatted document suitable for sharing, printing, and archiving. XML export provides structured data suitable for import into other systems, spreadsheets, or analysis tools. Both export options are available from the report viewer toolbar after running a report.

Printing PDF Reports

1. Run the desired report with appropriate parameters
2. Click the Print/PDF option in the report viewer toolbar
3. The system generates a PDF version of the report
4. The PDF includes the report header, parameters used, and tabular results
5. Save or print the PDF document
6. PDF formatting preserves column widths, headers, and data formatting from the on-screen view

Saving XML Reports

1. Run the desired report with appropriate parameters
2. Click the XML/Export option in the report viewer toolbar
3. The system generates an XML file containing the report data
4. The XML includes report metadata, parameters, and result data in structured format
5. Save the XML file to the local system
6. XML data can be imported into spreadsheets, databases, or analysis tools
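Because the XML export is structured, it can be parsed programmatically. The layout assumed below (a root element containing `<row>` elements whose child element names are column names) is an illustration only; inspect an actual UCR export for the real schema before relying on it.

```python
import xml.etree.ElementTree as ET

def parse_report_xml(xml_text):
    """Turn an exported report into a list of dicts keyed by column name.

    Assumes rows appear as <row> elements whose children are columns;
    this layout is hypothetical, not UCR's documented export schema.
    """
    root = ET.fromstring(xml_text)
    return [{col.tag: col.text for col in row} for row in root.iter("row")]
```

From there the rows can be loaded into a spreadsheet, database, or analysis tool, as the steps above suggest.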

Export Considerations

  • Large Reports: Very large result sets may take longer to generate as PDF or XML
  • Data Sensitivity: Exported reports may contain sensitive patient or security data; handle according to organizational policies
  • Archival: PDF exports are suitable for long-term archival as formatted documents
  • Analysis: XML exports are better for data analysis, as they preserve data structure and can be parsed programmatically
  • Audit Trail: Report exports may be logged in the audit system for compliance tracking

---


Exam Preparation Summary

Critical Concepts to Master:

  1. Standard Reports: Know the purpose and use case for each of the 14+ standard management reports
  2. Report Categories: Understand which reports serve compliance, security, monitoring, and usage analysis purposes
  3. Usage Dashboards: Know the setup requirements (namespace, production, data loader, aggregation)
  4. Custom Report Classes: Understand the two-class pattern (Report UI + Report Definition) for custom reports
  5. Report Installation: Know how to use the Install method to register custom reports
  6. Data Loader and Aggregation: Understand the role of scheduled tasks in populating dashboard data

Common Exam Scenarios:

  • Selecting the appropriate standard report for a given monitoring or compliance requirement
  • Setting up usage dashboards from scratch (namespace creation, production configuration, task scheduling)
  • Creating a custom management report with SQL-based data retrieval
  • Installing and verifying a custom report in the Management Portal
  • Troubleshooting a custom report that returns empty results or incorrect data
  • Configuring data loader tasks and nightly aggregation for dashboards
  • Exporting report data for compliance auditing purposes

Hands-On Practice Recommendations:

  • Run each of the standard management reports and understand their output
  • Set up a usage dashboard namespace and configure the data collection production
  • Create a simple custom report with a Report UI class and Report Definition class
  • Install the custom report and verify it appears in the Management Portal
  • Practice debugging a report by intentionally introducing common errors
  • Export reports to PDF and XML and examine the output
  • Configure data loader tasks and verify dashboard data population
  • Practice identifying which report to use for different administrative scenarios
