Case Submission Dashboard

Clinical registry databases improve healthcare processes by providing timely, actionable feedback to participating clinicians, enabling them to understand their performance relative to other organizations. To participate in a registry, case files must be processed and submitted to the registry for compliance review.

  • As the final step in the abstraction process, prepared case files are submitted in batches electronically to the clinical registry.

    The registry reviews the batch and generates a report. If a file contains errors, the batch is rejected. Cases containing errors may be modified and the batch resubmitted if the changes can be made in a timely manner.

  • Opportunities:

    Prior to this feature, abstractors sourced files from multiple systems, learned registry-specific terminology and acceptance criteria, navigated ineffective reporting systems, and manually pulled case files to modify and resubmit.

    Because Atlas’ data is centralized, users can access reports and modify case files directly without the need to export data from different systems and reconcile it manually.

    A registry-agnostic interface minimizes onboarding for new registries.

    Multiple users can work on the same client files simultaneously, with improved insight into the client’s case statuses and individual assignments.

    Constraints:

    Development Resources: Limited front-end development resources.

    Content Limitations: Data returned from the registry's reporting system varies in content, making specific error messaging challenging.

    Feature Parity: Users expect feature parity across the reporting systems of multiple governing bodies.

    Training Limitations: Leverage existing workflows and UI standards to minimize user onboarding to the system.

    Variable Content: Visual layout of the data must be simplified, sortable, and clearly identify outstanding user tasks to best serve multiple registries, users, and clients.

    Scalability: The number of individual files in a batch can range from several to hundreds, and each file has reportable data associated with it. The information must be structured to be digestible, allowing users to focus on their specific workflows and reducing the time needed to identify priorities.


    In summary, a single interface for viewing submission reports and changing records directly offers efficiency, accuracy, and usability advantages. It simplifies data management, reduces the risk of errors, and enhances the overall productivity of users, making it a preferred choice for Atlas’ abstraction product.

  • Multiple approaches to data visualization were explored for this project before choosing a final design direction. An existing table component for the case list, paired with a batch summary, was chosen to minimize development time and to leverage existing internal and external product layouts, reducing user training.

    By using our own design system along with registry terminology, we unified the pass/fail language of batch status across several systems into batch color indicators. To make the visual system more accessible, hues were tested for distinction by color-blind users, and direct or hover text was included to further indicate status.

    To fit all columns with minimal horizontal scrolling, error descriptions were summarized to show severity and quantity at a glance, with a link to open a more detailed description. This also kept the table consistent across registries, since not all registries return the same error report content.

    Pagination for the table remains an ongoing design challenge; the current approach was chosen because of loading times. Rows are sorted by default with errors at the top of the table, since errors are the primary information required in this workflow. Filtering and search capabilities are slated for future versions.
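    The default ordering described above, errors first, could be sketched with a simple comparator. This is an illustrative sketch only; the row shape and field names (CaseRow, errorCount, submittedAt) are hypothetical, not the production code:

```typescript
// Hypothetical case-row shape; field names are illustrative only.
interface CaseRow {
  caseId: string;
  errorCount: number;
  submittedAt: string; // ISO date string
}

// Sort rows with errors to the top of the table; within each group,
// show the most recent submissions first.
function defaultSort(rows: CaseRow[]): CaseRow[] {
  return [...rows].sort((a, b) => {
    const aGroup = a.errorCount > 0 ? 0 : 1; // 0 = has errors
    const bGroup = b.errorCount > 0 ? 0 : 1;
    if (aGroup !== bGroup) return aGroup - bGroup;
    return b.submittedAt.localeCompare(a.submittedAt); // newest first
  });
}
```

    A comparator like this keeps the error-triage workflow at the top of the page without requiring the user to apply any manual sort.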


  • Team Structure
    This feature was developed collaboratively with product, design, and engineering. User feedback, observational research, and competitive analysis were facilitated by our internal abstraction team members.

    As the sole design team member, I was given a high-level directive of creating a submission record page, and then gathered background information and worked with the product team to define requirements. I then organized the requirements into workflows to review with front-end development, and finally provided wireframes and assets for the front-end team.

    Retrospective
    The batch history layout met the expectations and needs of the abstraction team. Having this resource within the software eliminated their need to track submissions manually, giving the full internal team a “point of truth.” It also allowed them to quickly identify problem files and make modifications.
    As a relatively new feature, regular feedback will continue to be collected to assess the needs of future versions.

    Independently, I have begun to prototype additional data visualizations, as I believe there are alternative solutions that would serve the primary user while providing additional benefits for clients. This would require additional development time, but would overlap with other data visualization projects. By exploring possibilities early, those projects will have a stronger base for development as they are integrated into the roadmap.