Description:
A leading European insurance provider recognized the need to modernize and enhance its contact centre operations to meet evolving customer expectations and drive operational efficiency. The client embarked on a Contact Center Replacement project to replace its legacy contact centre platform with NICE CXone, a modern cloud-based solution running on AWS. The project aimed to enhance customer experiences, improve operational efficiency, and leverage advanced data solutions to achieve these objectives.
Client Requirements:
Seamless Transition: The client required a smooth transition from their legacy contact centre system to a modernized platform without disrupting ongoing customer service operations.
Data Integration: The client needed a data solution capable of seamlessly integrating customer data from multiple sources, including NICE CXone APIs and IEX data extract files, into a unified view.
Real-time Analytics: The client required real-time analytics dashboards to monitor call centre performance, agent productivity, and customer satisfaction, allowing for immediate adjustments and improvements.
Scalability and Flexibility: Scalability and flexibility were vital due to fluctuations in call volume based on seasonal variations and marketing campaigns.
Cost Efficiency: The client sought to optimize costs while maintaining or improving service quality through data-driven insights and analytics.
Our Solution:
Our data solution for the Contact Center Replacement project included the following components, with NICE CXone as the contact centre application running on AWS, Azure as the cloud platform for the data extraction and integration services, and Snowflake as the final data store:
Unified Data Platform: We implemented a unified data platform that integrated customer data from various sources, including Guidewire, NICE CXone RESTful APIs, and IEX files, into a single view.
Near Real-time Analytics: Our solution featured near real-time analytics dashboards for monitoring and optimizing call centre performance, agent productivity, and customer satisfaction.
Metadata-Driven Orchestration Framework: We built a metadata-driven orchestration framework to extract data and load it into Snowflake, reducing development and maintenance effort by 50%.
Parameterized Data Ingestion Using ADF: Generic ADF pipelines were built to capture data from 50+ REST APIs. The pipelines were configurable through parameters, reducing development effort by 80% (a sketch of this metadata-driven approach follows this list).
Automated Alerts & Notifications: We enabled an alert system at every step of the pipeline to notify data stakeholders of any errors or data failures.
Automated Data Quality Checks & Remediation: Standard data quality checks were applied to deliver high-quality, error-free data for business reporting and decision-making. The system was smart enough to auto-remediate known data issues.
Scalable Cloud Infrastructure: Leveraging Azure services and Snowflake, we implemented a scalable cloud infrastructure to handle varying call volumes efficiently and cost-effectively.
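To illustrate the parameterized, metadata-driven ingestion pattern described above, here is a minimal Python sketch. The endpoint paths, metadata fields, and landing-file layout are illustrative assumptions, not the actual project configuration; in the real solution the metadata lived in a configuration store and the loop ran inside generic ADF pipelines.

    # Metadata-driven ingestion sketch (illustrative only): endpoint paths,
    # metadata fields, and the landing-file layout are assumptions.
    import json
    from datetime import datetime, timezone
    import requests

    # One metadata row per API; in practice this would come from a config table.
    API_METADATA = [
        {"name": "agents", "path": "/agents"},
        {"name": "completed_contacts", "path": "/contacts/completed"},
    ]

    BASE_URL = "https://api.example-cxone.invalid"            # placeholder, not a real endpoint
    ACCESS_TOKEN = "<token-retrieved-from-azure-key-vault>"   # assumed secret handling

    def extract(entity: dict) -> str:
        """Call one REST endpoint and write the raw payload to a landing file."""
        response = requests.get(
            BASE_URL + entity["path"],
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            timeout=60,
        )
        response.raise_for_status()
        stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
        out_path = f"{entity['name']}_{stamp}.json"
        with open(out_path, "w") as handle:
            json.dump(response.json(), handle)
        return out_path

    if __name__ == "__main__":
        for entity in API_METADATA:
            print("Staged", extract(entity))

Because every endpoint is described by data rather than by a dedicated pipeline, adding a new API becomes a metadata change rather than new development, which is where the quoted reduction in effort comes from.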
Role: Cloud Data Architect
Tools & Technologies Used: Azure Data Factory, Azure Storage, Azure Key Vault, Logic Apps, Snowflake, Azure DevOps, GitHub
Reference Link: NICE CXone API
Description:
"Insurance Data Hub" simplifies data extraction in insurance, consolidating fragmented processes into a unified solution. Powered by dbt and Snowflake, it efficiently transforms data, delivering a versatile extract tailored to various business needs. By centralizing extraction efforts, it reduces engineering time and costs, offering teams a cohesive, business-ready dataset. Say farewell to fragmented extracts with the Insurance Data Hub, your go-to for streamlined analytics in the insurance sector.
Client Requirements:
Unified Platform: Provide a centralized solution for data extraction and transformation in the insurance sector.
Integration: Integrate with diverse data sources commonly used in the insurance industry.
DataOps Implementation: Implement DataOps practices to streamline data operations and enhance efficiency.
Customization: Generate versatile extracts tailored to meet the specific needs of different business teams.
Documentation: Provide comprehensive documentation to guide users in configuring, using, and maintaining the extraction process.
Our Solution:
The Insurance Data Hub streamlines data operations in the insurance sector, offering a unified platform for extraction, transformation, and governance. Leveraging dbt, Snowflake, and Azure DevOps, it ensures efficient data handling, optimal performance, and seamless delivery of high-quality insights to stakeholders.
Our data solution included the following components:
Data Transformation, Automated Testing, and Governance: Utilize dbt for efficient data transformation, documentation, and governance, with automated tests ensuring data quality and reliability throughout the process.
Data Storage and Computing: Leverage Snowflake as the data store and compute platform, providing scalability, performance, and security for handling insurance data.
Orchestration and CI/CD: Implement Azure DevOps for orchestration, continuous integration, and continuous delivery (CI/CD) pipelines, automating the deployment and management of data extraction workflows (a sketch follows below).
Version Control: Utilize GitHub for version control, enabling collaboration, tracking changes, and maintaining a history of modifications to the extraction processes and codebase.
Documentation: Leverage dbt's powerful documentation functionality to provide a one-stop solution for data lineage, code repo & business data dictionary.
By integrating these technologies, Insurance Data Hub offers a robust and streamlined solution for data extraction, transformation, and governance in the insurance sector, empowering organizations to make data-driven decisions with confidence and efficiency.
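As a rough illustration of how the dbt, Snowflake, and Azure DevOps pieces fit together, the Python sketch below shows a CI step that installs dbt package dependencies, builds and tests the models, and regenerates documentation. The project directory, target name, and environment variables are assumptions; the actual pipeline definition lived in Azure DevOps YAML.

    # CI helper sketch: invoke the dbt CLI from an Azure DevOps step.
    # DBT_PROJECT_DIR and DBT_TARGET are assumed environment variables.
    import os
    import subprocess
    import sys

    DBT_PROJECT_DIR = os.environ.get("DBT_PROJECT_DIR", ".")
    DBT_TARGET = os.environ.get("DBT_TARGET", "ci")

    def run_dbt(*args: str, needs_target: bool = True) -> None:
        """Run one dbt command and fail the pipeline on a non-zero exit code."""
        cmd = ["dbt", *args, "--project-dir", DBT_PROJECT_DIR]
        if needs_target:
            cmd += ["--target", DBT_TARGET]
        print("Running:", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            sys.exit(1)

    if __name__ == "__main__":
        run_dbt("deps", needs_target=False)  # install dbt package dependencies
        run_dbt("build")                     # run models, seeds, snapshots, and tests
        run_dbt("docs", "generate")          # refresh lineage and documentation artifacts

Running the tests and documentation generation on every change is what keeps the lineage, code repository, and business data dictionary in step with the deployed models.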
Role: Data Engineer & DevOps Engineer
Tools & Technologies Used: Data Build Tool (dbt), Snowflake, Azure DevOps, GitHub, Azure Key Vault
Reference Link: Unleashing Analytics Engineering with dbt, Snowflake & Azure DevOps
Description:
A leading insurance company faced the imperative of migrating their existing data integrations, developed in SQL Server Integration Services (SSIS), to Azure Data Factory (ADF). The goal was to modernize data integration, establish robust disaster recovery capabilities, and ensure a seamless transition with minimal disruption to live applications, all while keeping the migration cost- and time-efficient.
Client Requirements:
The client's high-level requirements were as follows:
Transitioning from SSIS (IaaS) to Azure Data Factory (PaaS).
Adopting a re-engineering approach to convert SSIS packages into native ADF components.
Gathering cloud infrastructure setup inputs.
Addressing solution challenges while considering the customer's private network policies.
Our Solution:
To meet the insurance company's requirements, we evaluated two options:
Migrate SSIS to IaaS: In an IaaS approach, you spin up a virtual machine on Azure infrastructure sized to your resource requirements. On this virtual machine, you install SQL Server Integration Services and then deploy your packages to the SSISDB catalog or the file system.
Migrate SSIS to PaaS: This is the lift-and-shift approach for migrating SSIS packages to Azure. Azure Data Factory provides an SSIS Integration Runtime (Azure-SSIS IR) to run Integration Services packages on Azure.
We decided to go with the second option, the lift-and-shift approach (a provisioning sketch follows this list), which offers the following advantages:
Reduced human and conversion errors during migration.
Reduced operational costs and less of the infrastructure-management burden that comes with running SSIS on Azure virtual machines.
Increased availability, through the ability to specify multiple nodes per cluster and the high-availability features of Azure and Azure SQL Database.
Increased scalability, through the ability to specify multiple cores per node (scale up) and multiple nodes per cluster (scale out).
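For reference, the sketch below shows one way the Azure-SSIS Integration Runtime underpinning this lift-and-shift could be provisioned with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory name, sizing, and region are illustrative assumptions, and model names can vary between SDK versions, so treat this as a sketch rather than the exact provisioning code used.

    # Sketch: provision an Azure-SSIS Integration Runtime in Azure Data Factory.
    # Subscription, resource group, factory name, and sizing are assumptions.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        IntegrationRuntimeComputeProperties,
        IntegrationRuntimeResource,
        IntegrationRuntimeSsisProperties,
        ManagedIntegrationRuntime,
    )

    SUBSCRIPTION_ID = "<subscription-id>"    # placeholder
    RESOURCE_GROUP = "rg-data-platform"      # assumed resource group name
    FACTORY_NAME = "adf-ssis-migration"      # assumed data factory name

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    ssis_ir = IntegrationRuntimeResource(
        properties=ManagedIntegrationRuntime(
            compute_properties=IntegrationRuntimeComputeProperties(
                location="WestEurope",
                node_size="Standard_D4_v3",
                number_of_nodes=2,                   # scale out: nodes per cluster
                max_parallel_executions_per_node=4,  # scale up: parallel packages per node
            ),
            ssis_properties=IntegrationRuntimeSsisProperties(edition="Standard"),
        )
    )

    client.integration_runtimes.create_or_update(
        RESOURCE_GROUP, FACTORY_NAME, "ssis-integration-runtime", ssis_ir
    )

Existing SSIS packages can then be deployed to the SSISDB catalog hosted on Azure SQL Database and executed by this runtime without rewriting them as native ADF activities.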
Role: Solution Architect
Tools & Technologies Used: Azure Data Factory, Azure Storage, Azure Key Vault, Snowflake, Azure DevOps, GitHub
Description:
The HealthMAP solution is helping a Western state drive transformative change in healthcare by promoting better Medicaid program management and patient-centric care for its citizens. This comprehensive platform includes a data lake and a reporting solution built to meet the challenges of the modern Medicaid environment. It leverages data and predictive analytics to identify care management requirements, improve patient outcomes, streamline operations, enhance decision-making, and empower the state to meet federal reporting requirements.
Client Requirement:
Common challenges addressed included:
Siloed data sources
Lack of business intelligence tools
Disconnected data models that prevented accurate, holistic reporting across Medicaid and HHS programs.
Lack of a single source of truth
Our Solution:
We equip state health agencies with the tools, knowledge, and strategies to optimize healthcare delivery:
Future-Proof Framework: A cloud-native information architecture that is scalable, elastic, and future-ready.
Unified Data Platform: A single source of truth for health data collected from multiple sources, removing data silos.
Business Intelligence: Modern business intelligence tools for advanced reporting and dashboards.
Data Governance: A modern architecture to safely store PII/PHI data.
Predictive Analytics: Predictive models that identify care management requirements and help improve patient outcomes.
Role: Senior Consultant - Lead Data Engineer
Tools & Technologies Used: IBM DataStage & QualityStage, IBM DB2, Shell Scripts, Cognos, SQL, AWS EC2, Snowflake
Reference Link:
Description:
U.S. state agencies running the Medicaid program, responsible for managing healthcare programs for eligible individuals and ensuring compliance with federal regulations, faced a daunting challenge. They were tasked with reporting extensive data on eligibility, enrollment, program utilization, and expenditure to the Transformed Medicaid Statistical Information System (T-MSIS), overseen by the Centers for Medicare & Medicaid Services (CMS). Compliance with 3000+ data quality (DQ) rules was mandatory, making the process complex and error-prone. The state agency needed a robust solution to streamline data quality checks, reduce false positives, and improve overall data accuracy for federal reporting and research purposes.
Client Requirement:
Automated Data Extracts: The client required the extract files to be created from their data warehouse each month on a scheduled date.
Automated Data Quality Checks: The extracted data had to comply with the 3000+ quality checks provided by CMS to avoid submitting false or incorrect data for research.
Alert System: The system had to raise an alert whenever data quality check failures exceeded a threshold percentage.
Our Solution:
We engineered a Smart Solution tailored to the specific needs of the Medicaid state agency, transforming the process of data submission for T-MSIS reporting:
Advanced Data Health Checks: Our solution utilized cutting-edge algorithms to perform comprehensive data health checks on the state agency's Medicaid and CHIP data. These checks meticulously adhered to the 3000+ DQ rules mandated by CMS.
Intelligent Outlier Handling: The Smart Solution identified and excluded outliers from immediate error notifications, so that minor data anomalies or fluctuations did not trigger unnecessary data quality alerts, reducing false positives (see the sketch after this list).
Real-time Monitoring and Reporting: Our system featured real-time monitoring and reporting capabilities, allowing the state agency to track the status of data quality checks in real time. Immediate notifications were sent for critical issues, ensuring swift action when necessary.
Customization: We tailored the Smart Solution to align precisely with the state agency's unique data requirements and T-MSIS reporting obligations, ensuring seamless integration into their existing data infrastructure.
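The Python sketch below gives a simplified picture of the threshold-based checking and outlier exclusion described above. The rule name, threshold, and alert hook are illustrative assumptions standing in for the actual CMS rule set and notification mechanism.

    # Data-quality check sketch (illustrative): a rule is applied to extracted
    # records, known outliers are excluded, and an alert fires only when the
    # failure rate exceeds the rule's threshold.
    from dataclasses import dataclass
    from typing import Callable, Iterable

    @dataclass
    class DQRule:
        rule_id: str
        description: str
        check: Callable[[dict], bool]   # returns True when a record passes
        failure_threshold_pct: float    # alert only above this failure rate

    def alert(rule: DQRule, failure_pct: float) -> None:
        # Placeholder for the real notification channel (e-mail, ticket, etc.).
        print(f"ALERT {rule.rule_id}: {failure_pct:.2f}% failed ({rule.description})")

    def run_rule(rule: DQRule, records: Iterable[dict], known_outliers: set) -> float:
        records = list(records)
        failures = [
            r for r in records
            if r["record_id"] not in known_outliers and not rule.check(r)
        ]
        failure_pct = 100.0 * len(failures) / max(len(records), 1)
        if failure_pct > rule.failure_threshold_pct:
            alert(rule, failure_pct)
        return failure_pct

    # Hypothetical example rule: enrollment end date must not precede start date.
    rule = DQRule(
        rule_id="ELG-001",
        description="enrollment end date on or after start date",
        check=lambda r: r["end_date"] >= r["start_date"],
        failure_threshold_pct=2.0,
    )
    sample = [{"record_id": 1, "start_date": "2023-01-01", "end_date": "2023-06-30"}]
    run_rule(rule, sample, known_outliers=set())

Excluding known outliers before computing the failure rate is what keeps one-off anomalies from tripping alerts, while genuine systemic issues still exceed the threshold and notify the agency immediately.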
Role: ETL Designer & Developer
Tools & Technologies Used: IBM DataStage & QualityStage, IBM DB2, Shell Scripts, Cognos, SQL, AWS EC2
Reference Link:
Description:
A leading telecom provider with a vast customer base and diverse operational data sources sought to streamline data management, enhance reporting capabilities, and optimize decision-making processes. The primary goal was to establish an Enterprise Data Warehouse (EDW) to centralize and integrate data from various operational sources for improved reporting, analysis, and decision support.
Client Requirements:
Unified Data Source: The client required a single, comprehensive source of integrated and historical data to eliminate data silos and inconsistencies across the organization.
Enhanced Reporting: The client wanted to improve reporting capabilities, enabling more accurate and timely insights into customer behaviour, network performance, and market trends.
Data Analysis: To support strategic decision-making, the client needed advanced data analysis capabilities to identify opportunities for service improvements, cost optimization, and revenue growth.
Scalability: The solution had to accommodate the telecom provider's rapidly expanding data volumes, ensuring scalability and performance as the customer base continued to grow.
Our Solution:
To meet the client's objectives, our team proposed and implemented a comprehensive solution:
Data Integration Framework: We designed a robust data integration framework that could extract, transform, and load (ETL) data from various operational sources, including customer billing systems, network logs, customer service databases, and more.
Enterprise Data Warehouse (EDW): We developed a centralized EDW that acted as the single source of truth for all data. This EDW incorporated historical data, enabling trend analysis and long-term insights.
Data Quality Assurance: Data quality was a top priority. We implemented data cleansing and validation routines to ensure that only accurate and reliable data entered the EDW.
Scalability: To accommodate the growing data volumes, we designed the system to be highly scalable, leveraging cloud-based technologies and auto-scaling capabilities to handle increased data loads.
Reporting and Analytics: We implemented powerful reporting and analytics tools that allowed the client to generate real-time dashboards, conduct ad-hoc analysis, and produce detailed reports to support decision-making at all levels of the organization.
Role: ETL Consultant
Tools & Technologies Used: IBM DataStage, IBM DB2, Shell Scripts, Cognos, IBM CDC
Description:
The project aimed to enhance Customer Management, Customer Contact Management (Service Requests), and ERP using Oracle E-Business Suite (EBS).
Role: ERP Developer
Responsibilities:
Customizations, extensions, modifications, localizations, and integrations in the standard ERP (Oracle EBS).
Creation of database objects such as tables, views, sequences, synonyms, stored procedures, functions, packages, cursors, ref cursors, and triggers as per business requirements.
Creation of concurrent programs as per business requirements.
Creation of reports using Oracle Reports and corresponding concurrent programs.
Development and modification of existing packages, procedures, functions, and triggers according to new business needs.
Analyzing tables and indexes for performance tuning.