VNB helped a large not-for-profit addiction treatment healthcare organization consolidate its intake, clinical, and financial data and gain better business insights.
The customer is a leading, nationally and internationally recognized not-for-profit facility dedicated to life-saving addiction and behavioral healthcare treatment, research, prevention, and medical education. It strives to provide best-in-class, gender-specific drug and alcohol addiction treatment for adolescents, young adults, adults, and older adults suffering from drug and alcohol dependency and addiction. The customer's goal is to help people and leave a legacy, both for themselves and for the people helped through the program, by providing accessible care, delivering evidence-based treatment, and emphasizing ongoing support.
The customer was using some out-of-the-box reports and a large number of custom reports connected to multiple sources, built on traditional technology principles. As the business demanded more reports with deeper insights, supporting these requests became an inefficient and time-consuming process for the analytics team. To connect data from multiple systems, the team faced challenges such as duplicated processes, duplicate and inconsistent data, and constant reconciliation of data across systems, in addition to performance issues. The customer therefore wanted to create an enterprise-grade, standardized repository for powerful analytics and insights. This repository would span disparate systems such as an EMR, a financial system (used for journal entries), an HR system (used for departmental hours and wages), Cisco call data logs, online customer chat files, ad hoc data captures, signature documents, Active Directory, and so on. They wanted a robust system that allowed a consistent flow of data and a clean interface between the connected systems, enabling them to get better and faster insights and act on important business decisions in a timely manner. The customer needed a central data repository that would remove inconsistencies and duplication of data, improving data quality, bringing uniformity, and ensuring validity throughout the flow of data elements from source to end-user reports. After a thorough vendor selection process, the customer selected the VNB team to help them with the data warehouse and reporting setup for their business.
The VNB BI team kicked off the project with multiple rounds of interviews with business teams and users (admissions/intake, clinical, financial, and HR groups) to capture requirements and perform a detailed review of the current siloed BI landscape. After a thorough review of the requirements, the current landscape, and the future vision, the team proposed building a data warehouse using a star schema, with Azure Synapse as the destination data warehouse: a distributed system designed to perform analytics on large data, with support for massively parallel processing (MPP) that makes it suitable for running high-performance analytics. VNB recommended Azure Data Factory (ADF), a managed service that orchestrates and automates data movement and data transformation, to coordinate the various stages of the ELT process, and Azure Analysis Services (AAS), a fully managed service that provides data modeling capabilities for creating tabular models with perspectives, measures, and calculated columns. Power BI, a powerful suite of business analytics tools, was recommended for querying the semantic model in AAS and analyzing the data to create reports and dashboards, helping the customer get better insights into their daily functions. Azure AD was recommended to authenticate users connecting to the Analysis Services server through Power BI. VNB provided appropriate deliverables, such as architecture and design documents for the gateway server and BI architectures, a bus matrix, source-to-target mapping documents by module, a logical model of the data warehouse, ETL/ELT process designs, UAT test cases, and test results, at the appropriate stages of the project.
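The star-schema ELT pattern described above can be sketched in miniature as follows. This is an illustrative example only, not the customer's actual implementation: it uses SQLite as a stand-in for Azure Synapse, and all table and column names (an admissions staging table, patient and facility dimensions, an admissions fact table) are hypothetical.

```python
import sqlite3

# Stand-in for the warehouse; in the actual solution this would be Azure Synapse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1. Extract/Load: land raw source rows into a staging table unchanged
#    (the "EL" of ELT -- transformation happens later, inside the warehouse).
cur.execute("CREATE TABLE stg_admissions (patient_name TEXT, facility TEXT, admit_date TEXT)")
cur.executemany(
    "INSERT INTO stg_admissions VALUES (?, ?, ?)",
    [("Ann Lee", "North Campus", "2021-03-01"),
     ("Bob Roy", "North Campus", "2021-03-01"),
     ("Ann Lee", "South Campus", "2021-03-02")],
)

# 2. Transform in place: build dimension tables and a fact table (star schema).
cur.execute("CREATE TABLE dim_patient (patient_key INTEGER PRIMARY KEY, patient_name TEXT UNIQUE)")
cur.execute("CREATE TABLE dim_facility (facility_key INTEGER PRIMARY KEY, facility TEXT UNIQUE)")
cur.execute("CREATE TABLE fact_admission (patient_key INTEGER, facility_key INTEGER, admit_date TEXT)")

cur.execute("INSERT INTO dim_patient (patient_name) SELECT DISTINCT patient_name FROM stg_admissions")
cur.execute("INSERT INTO dim_facility (facility) SELECT DISTINCT facility FROM stg_admissions")
cur.execute("""
    INSERT INTO fact_admission
    SELECT p.patient_key, f.facility_key, s.admit_date
    FROM stg_admissions s
    JOIN dim_patient p ON p.patient_name = s.patient_name
    JOIN dim_facility f ON f.facility = s.facility
""")

# 3. A typical analytic query over the star schema: admissions per facility.
rows = cur.execute("""
    SELECT f.facility, COUNT(*) AS admissions
    FROM fact_admission a
    JOIN dim_facility f ON f.facility_key = a.facility_key
    GROUP BY f.facility
    ORDER BY f.facility
""").fetchall()
print(rows)  # [('North Campus', 2), ('South Campus', 1)]
```

In the deployed architecture, the load and transform steps would be ADF pipeline activities, and the analytic query would typically be served through the AAS semantic model rather than issued directly against the warehouse.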
- Built a centralized data repository for EMR, Financials and various other data sources
- Automated the flow of data between systems, thereby making it more consistent and fluid
- Cleansed data to remove inconsistencies, inefficiencies and duplication, thereby improving data quality and ensuring data validity
- Treated and prevented incongruent data by creating missing links between datasets wherever applicable
- Delivered user-friendly reports that cater to various audience groups and aid their business decision-making
- Achieved quick turnaround on creating new reports and dashboards