
Data Warehouse API Management

Location:
Woodbridge, VA
Posted:
May 23, 2024


Resume:

PROFESSIONAL SUMMARY:

●Around *+ years of work experience with strong analytical, problem-solving, and multi-tasking skills, a solid understanding of evaluating data sources, and a strong understanding of data warehousing, SQL, Tableau, Teradata, BI, Python, Unix, AWS, Redshift, and OLAP functions.

●Proficient in performing detailed business analysis and gathering technical requirements. Expertise in working cross-functionally with technical and business teams within an organization.

●Experienced in AWS, cloud, DevOps, Linux, and engineering work, with a strong technical background using AWS API Gateway, API management, EC2, EBS, S3, Apigee, VPC, RDS, SES, ELB, EMR, Auto Scaling, CloudFront, CloudFormation, CloudWatch, SNS, Step Functions, and AWS Import/Export.

●Managed vendor files using the XMLSpy tool, ensured the STTM mapping was aligned for all JSON files, and then deployed DDL as part of the L2 deployment process.

●Worked with business stakeholders to deploy table structures after gathering actual requirements from the business. Imported and deployed text files from mainframe to desktop or UNIX servers using SAS.

●Worked as a liaison between product business owners, the tech team, and production support/BAU groups; provided end-user training decks/manuals and delivered training for users.

●Created complex SQL queries using joins and filters and ensured data consistency and validity. Deployed SQL scripts using Unix Korn shell scripting (KSH).

●Structured various HDFS files using Python. Worked on Tableau integration using Python and visualized the dashboards.

●Worked on migration scripts from Teradata to Snowflake. Validated and created Python scripts, logging identified bugs in Excel. Converted Excel macros to integrate with Snowflake instead of Teradata.

●Transformed a traditional environment into a virtualized environment with AWS EC2, IAM, S3, EBS, ELB, EMR, Elasticsearch, Logstash, and Kibana. Worked with object-oriented programming and scripting languages such as Java, Python, NodeJS, and Bash shell scripting.

●Extensively used relational database management systems (RDBMS) and rapid application development methodology, and performed data cleansing and data scrubbing using Python.

●Hands-on experience with machine learning (ML) algorithms and with integrating chatbots and Slack using various AI approaches.

●Developed complex SQL queries for data mining using Snowflake, Teradata, Oracle, MS Access, and SQL Server, used for reporting and analytical needs.

●Fine-tuned SQL queries as part of performance optimization, tuning SQL using EXPLAIN statements. Created various MultiLoad/FastLoad scripts to load staging/target tables and identified indexing.

●Expertise in software analysis, code analysis, requirements analysis, software testing, and quality assurance.

●Good exposure in object-oriented design and analysis, front-end graphical user interface design, and performance tuning.

●Developed pipelines for deploying microservices and Lambda functions using the Jenkins CI server, and configured SNS topics for those Lambda functions.

●Worked on ad-hoc data requests using pivot tables and charts for graphs and reporting. Worked with GitHub version control.

●Built various Tableau and Power BI reports with drill-down and drop-down menu options, parameterized in Tableau. Created side-by-side bars, scatter plots, stacked bars, heat maps, filled maps, and symbol maps according to deliverable specifications.

●Experience working with Microsoft Excel, SQL, Tableau, Power BI, Google Analytics, and Adobe Analytics for data analysis.

●Working experience in R and SQL to analyze large datasets, identify trends, and derive actionable insights.

●Knowledge of creating CloudWatch alerts with PagerDuty, integrating with Splunk to monitor workflows/jobs, and developing Spark/Scala with Python for regular-expression projects.

●Created various Jira stories per requirements, ensured backlog refinement and story enhancement, and conducted retrospectives. Ensured bug tracking using JIRA, HP QC, and IBM ClearQuest.

●Excellent communication, interpersonal, analytical, and leadership skills, with the ability to work efficiently both independently and in teams.

●Conducted various JAD sessions, data modeling using the Erwin data modeler, business process testing, GAP analysis, User Acceptance Testing, and SWOT analysis.

●Experience in Internal Audit, Operational Risk Management (ORM), and Risk Oversight. Subject matter expert in BSA/AML and OFAC sanctions regulatory requirements.

●Worked on data governance tools like Collibra and maintained a well-documented Confluence page for data security policies and procedures.

●Experience in Oracle ERP Financials, Procurement, and Project Management, instrumental in driving business transformation and optimizing processes.

●Developed various business flow diagrams using Visio and ensured business processes and business process models reflected the given business requirements documents.

Technical Skills

Data Management

Database design & Management, Data Analysis, Data Quality Assessment, Salesforce CRM Administrator, Data Modeling, Star Schema Modelling, Snowflake Schema, MS Visio.

RDBMS

SQL Server 2000/05/08, Oracle 10g/9i, Teradata 15/14/12, Snowflake

Computer Science

System Administration, MS Office, G-Suite, Slack

Tools

SQL Server Integration Services (SSIS), Teradata FastLoad, Teradata BTEQ, Informatica Cloud, AWS, Azure, S3, EC2, AMI, EBS, EMR, Lambda, SNS, SQS, GitHub, Splunk, Databricks, Kibana, Jira.

Languages

Structured Query Language (SQL), R, Python

Operating System

Windows, Unix, Linux

Project Management Tools

MS Office tools (WBS, PERT, Gantt charts), Visio, Access, Excel, etc.

SDLC/Design Methodology

Agile and Waterfall.

Requirements/Document Tools

Rational RequisitePro, Rational ClearQuest, and MS SharePoint.

Modeling and Design Tools

Rational Rose, Microsoft Visio.

Testing Tools

Mercury WinRunner, LoadRunner, Quality Center, QTP.

Databases

SQL Server, MySQL, PL/SQL, Oracle 9i/10g, Access.

Operating Systems

Windows 95/98/NT/2000/XP/Vista/7/10 and macOS

PROFESSIONAL EXPERIENCE:

PNC Bank, Baltimore, MD Feb 2022 – Present

Role: Sr. Business Analyst/ Data Analyst

Responsibilities:

Performed JAD sessions and gathered requirements in coordination with product and business users following Agile methodology, translating requirements into functional and technical specifications.

Used SharePoint and ensured team members collected all documents in a central location. Created use case diagrams, UML diagrams, and sequence diagrams using MS Visio.

Familiar with JSON, XML, and Amazon Web Services (AWS); analyzed the source, modeled the data, and applied machine learning algorithms in Python.

Created STTM documents for business use to transform and validate the data, and analyzed existing system process gaps in the context of new business changes.

Worked on data export and data migration using tools like Informatica. Automated and scheduled Informatica jobs using UNIX shell scripting configured with cron jobs.

Created data flows and ETL processes on AWS, ingesting with Python on Lambda and in Glue, and querying data in Athena using standard SQL.
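A minimal sketch of the kind of Lambda handler and Athena call involved (the database, table, and result-bucket names are illustrative assumptions, not the actual project objects):

import boto3

athena = boto3.client("athena")

def handler(event, context):
    # Kick off a standard-SQL query over Glue-cataloged data sitting in S3.
    resp = athena.start_query_execution(
        QueryString="SELECT txn_date, COUNT(*) AS txn_count "
                    "FROM raw_db.transactions GROUP BY txn_date",
        QueryExecutionContext={"Database": "raw_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    return {"queryExecutionId": resp["QueryExecutionId"]}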

Worked with the Product Owner and business team to ensure logical and physical data models were developed per data governance policy.

Spearheaded model development and the implementation of advanced machine learning models for fraud detection and anti-money laundering (AML) initiatives, enhancing Capital One's risk management strategies.

Extensively worked on data profiling of the source database, optimizing inputs to the Source-to-Target Mapping, and performed reverse engineering on the source database using Erwin.

Worked on ETL testing using Informatica PowerCenter/PowerMart 8.6.1/8.1, working with the Designer, Workflow Manager, Workflow Monitor, and Server Manager.

Developed a Python framework generating workflow JSON for the data transformation process from AWS S3 to the data lake and Snowflake.
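A minimal sketch of generating such a workflow JSON definition (the schema and names shown are illustrative assumptions, not the framework's actual format):

import json

def build_workflow(source_prefix: str, target_table: str) -> str:
    # Emit a workflow definition that a downstream S3-to-Snowflake loader could read.
    workflow = {
        "source": {"type": "s3", "prefix": source_prefix, "format": "parquet"},
        "target": {"type": "snowflake", "table": target_table, "write_mode": "append"},
        "steps": ["extract", "validate", "load"],
    }
    return json.dumps(workflow, indent=2)

print(build_workflow("s3://example-lake/raw/accounts/", "ANALYTICS.ACCOUNTS"))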

Registered DSETs in Collibra per data governance policy and ensured data was ingested onto various banking platforms including Redshift, S3 (Glue), Delta Lake, and Snowflake.

Identified primary indexes and skew ratios for populating data and ensured data extraction from sources like Oracle, SQL Server, and flat files per requirements.

Monitored the data migration application with real-time monitoring tools like PagerDuty and registered AWS SNS notification services.

Worked with technical teams to design and implement business solutions using Adobe Experience Cloud platforms, ensuring alignment with business requirements.

Defined key identifiers for mapping interfaces and managed data repositories using GitHub and CloudFormation on EC2.

Used MultiLoad and BTEQ to update large tables with large numbers of rows, and performed multiple DML operations on small tables with few rows.

Worked on data architecture based on AS-IS and TO-BE states to identify gaps, covering pre-trade and post-trade risk management with regulatory compliance for capital markets.

Worked on internal audits focusing on BSA/AML and OFAC Sanctions compliance, identifying key risk areas and recommending control improvements.

Developed and implemented risk assessment frameworks to evaluate the effectiveness of regulatory controls.

Conducted thorough analysis and documentation of complex business processes, identifying inefficiencies, and proposing solutions.

Managed multiple high-priority projects simultaneously, ensuring on-time delivery within tight deadlines.

Collaborated with senior management and stakeholders to communicate audit findings and actionable improvements.

Configured Oracle ERP modules such as Financials, Procurement, and Project Management to meet the business requirements.

Worked on ServiceNow, creating and resolving Change Management and Incident Management tickets, with AYS ticket integration using Slack messaging and monitoring tools like PagerDuty.

Worked with a continuous data streaming platform (SDP), monitoring loads daily and fixing missing loads. Played a key role working onshore/offshore to ingest data from data sources into the AWS cloud and Snowflake.

Created data models using the Erwin data modeler tool and deployed Snowflake tables.

Worked with data systems like Kibana and Elasticsearch. Worked on the ELK stack (Elasticsearch, Logstash, and Kibana), creating indexes and dashboards and checking error logs via Logstash in Kibana in real time.

Hands-on experience with ServiceNow for creating change orders and incident reports. Prepared data sources for various reporting and analytical needs. Familiar with fine-tuning SQL queries to optimize performance.

Strong analytical, problem-solving, and multi-tasking skills. Experience using the JIRA ticketing tool. Expertise in data analysis, business analysis, data extraction, and data marts.

Built Tableau dashboards and scorecards using bar graphs, scatter plots, and Gantt charts along with stacked bar functionality.

Created various KPI dashboards using trend lines and log axes based on groups and sets to create detailed summary reports with granular-level details.

Created JIRA tickets and ensured stories were created based on priority. Also played a vital role in backlog refinement and conducted retrospectives.

Worked with the tech team to deploy template language with AEM components, including JCR (Java Content Repository), Apache Sling, and CQ5 templates.

Conducted statistical analysis and data analysis, and built regression models using R.

Worked on data quality checks, creating Python scripts to automate data validation and cleansing, which helped identify duplicate data and check consistency using pandas and Apache Airflow.
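A minimal sketch of such a pandas validation step, the kind of function that could be wrapped in an Airflow PythonOperator (the file path and column names are illustrative assumptions):

import pandas as pd

def validate_extract(path: str) -> pd.DataFrame:
    # Flag duplicates and missing keys, then return a de-duplicated frame.
    df = pd.read_csv(path)
    dupes = df[df.duplicated(subset=["account_id", "txn_date"], keep=False)]
    missing = df["account_id"].isna().sum()
    if not dupes.empty or missing:
        raise ValueError(f"{len(dupes)} duplicate rows, {missing} rows missing account_id")
    return df.drop_duplicates(subset=["account_id", "txn_date"])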

Identified HDFS files and big data, and created ETL pipeline processes using Databricks, consuming classification models using Spark SQL in Spark.

Customized Oracle ERP workflows, reports, and dashboards to optimize processes and improve decision-making. Developed test plans and test cases to validate Oracle ERP configurations and customizations.

Applied machine learning models such as Random Forest, implemented in Spark with Python for predictive analysis, and tracked metrics to measure the impact of new features alongside the given recommendations.
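A minimal PySpark sketch of a Random Forest fit of this kind (the table and feature names are illustrative assumptions):

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import RandomForestClassifier

spark = SparkSession.builder.appName("rf-sketch").getOrCreate()

# Assemble illustrative feature columns and split for a quick train/test check.
df = VectorAssembler(
    inputCols=["amount", "merchant_risk", "txn_hour"], outputCol="features"
).transform(spark.table("analytics.scored_transactions"))
train, test = df.randomSplit([0.8, 0.2], seed=42)

model = RandomForestClassifier(
    labelCol="is_fraud", featuresCol="features", numTrees=100
).fit(train)
model.transform(test).select("is_fraud", "prediction").show(5)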

Created cursors and database programs using Oracle SQL/PL/SQL, Informatica, and Unix shell scripting.

Environment: Snowflake 3.27.2, Teradata, Teradata SQL Assistant, R, Python, Databricks, SQL Workbench, JSON, MS Excel Macros, Kibana, AWS, AWS EC2, AWS S3, SQL, Tableau, Adobe AEM, Unix, Oracle ERP, GitHub, BTEQ, DE vex, G-Suite, Jira, ServiceNow, Nebula, Cerebra, Splunk, One Lake control plane.

Anthem, Chicago, IL Oct 2018 – Dec 2021

Role: Test Data Analyst

Responsibilities:

Created policy briefs, reports, and periodic management reports, ensuring data accuracy and database integrity.

Worked on the MDM hub, ensured path components with application integrations, and applied ETL methodology to support data extractions from a complex EDW using Informatica.

Good knowledge working as an SME for Facets, including the Facets data model, and extracted data as needed for modules such as billing, provider, claims, and membership.

Worked on various healthcare programs including Medicare, Medicaid, Medigap, HIPAA standards, EDI transactions, EHR/EMR implementation, and HIX.

Worked on SharePoint and MOSS, Team Foundation Server (TFS), and Visual Studio Team System (VSTS).

Processed EDI 837P, 837I, 834, and 837D transactions; verified 837 transactions were converted correctly to XML format and verified the claims data loaded into Facets for further processing.

Involved in API testing using Postman to determine reliability, functionality, performance, and security for calculation, transaction, and communication between systems in a highly efficient way.
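The testing itself was done in Postman; the sketch below only illustrates the same kind of reliability/functionality check expressed in Python (the endpoint URL and response fields are illustrative assumptions):

import requests

BASE_URL = "https://api.example.com"  # illustrative endpoint, not the actual system under test

def check_claim_lookup(claim_id: str) -> None:
    # Expect a successful response carrying the fields downstream consumers rely on.
    resp = requests.get(f"{BASE_URL}/claims/{claim_id}", timeout=10)
    assert resp.status_code == 200, f"unexpected status {resp.status_code}"
    body = resp.json()
    assert {"claimId", "memberId", "status"} <= body.keys(), "missing required fields"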

Tested the web application at various stages to make sure it was user-friendly and met standards. Validated reports and files according to HIPAA X12 enforced standards.

Worked with all stakeholders to obtain all requirements (functional/non-functional/security). Involved in HIPAA EDI transactions, focused mainly on PA and eligibility transactions.

Performed functional testing of SOAP and RESTful web services using the SoapUI tool. Worked with designers, UX/UI, strategy, SEO, and analytics teams to gather initial requirements and ensure user stories reflected desired features and functionality.

Utilized technical writing skills to create effective documentation and training materials. Used Facets support systems to enable inbound/outbound HIPAA EDI transactions in support of HIPAA 834, 835, 837, and 270/271 transactions.

Developed documents such as the BRD, FRD, data specification document, technical specification documents, file transfer document, and data mapping document.

Engaged with the HL7 working group and implemented MITA Medicaid eligibility architecture based on HL7. Created profiles using Informatica Data Explorer (IDE) and masked data as part of NPI/PCI de-identification.

Responsible for conversion of data in Data Mapping and writing transformation rules. Designed interactive dashboards in Tableau using drill down, prompts, filters, and variables. Worked in importing and cleansing data from various sources like Teradata, flat files, SQL Server.

Created tests for forms on target servers, integrating AEM Forms.

Integrated AWS with AEM for backend services to manage the cloud-based CMS within the CRM system.

Used advanced Excel functions to generate spreadsheets and pivot tables. Organized quarterly projection data gathered by senior analysts in an orderly way for executive management.

Gathered healthcare data and responsible for collecting, cleaning, and analyzing healthcare data using SQL queries and R scripts.

Developed SQL and stored procedures on MySQL. Reported bi-monthly data analysis findings to upper management for use in organizational decisions.

Involved in extensive data validation by writing several complex SQL queries. Responsible for different data mapping activities from source systems to Teradata.

Created relational and dimensional models for clinical EMR records and ensured the MDM data model integrated with WebSphere Process Server.

Worked with the Agile team to enhance maturity assessments of data governance and Collibra. Ensured HL7 and FHIR data migration.

Worked on Facets output generation and Facets data modeling, ensuring functional and technical requirements were met within Facets.

Worked on the EDI all-payers process and ensured data integrity and payer contacts. Ensured data governance and stewardship, ETL, and ODS with analytical, segmentation, and predictive modeling.

Developed SSIS packages for transformations (lookup, conditional split, data flow task, and execute package task) to generate the underlying data.

Worked with Informatica Data Quality (IDQ) tools and with the Facets team to ensure the HIPAA claims validation and verification process.

Environment: SQL Server, Teradata, SQL Server Integration Service (SSIS), SQL Server reporting Services (SSRS), SQL Server Management Studio, Excel, R, Python.

Client: TD Bank, Cherry Hill, NJ Jan 2015 – Sept 2018

Role: Business Data Analyst

Responsibilities:

Followed Agile methodology to analyze, define, and document the application supporting functional and business requirements, and coordinated these efforts with functional architects.

Actively involved in analysis, detailed design, development, bug fixing, and enhancement under Agile methodology.

Supported internal audit and risk management functions by gathering and analyzing data related to BSA/AML compliance. Assisted in the development of compliance policies and procedures in line with BSA/AML and OFAC regulations.

Migrated gathered business requirements from the portal to Adobe AEM and its functionality, which includes multi-site and multi-channel management and digital asset management.

Utilized AEM to update website content and implement enhancements requested by project stakeholders.

Experienced in working with Spark/Python to improve performance and optimize existing algorithms in Hadoop using SparkContext, Spark SQL, Spark MLlib, DataFrames, Pair RDDs, and Spark on YARN.

Developed Scala and Python scripts and UDFs using both DataFrames/SQL and RDDs in Spark for data aggregation and queries, writing results back into an S3 bucket.
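A minimal PySpark sketch of this pattern, a small UDF applied before aggregating and writing back to S3 (the bucket paths and column names are illustrative assumptions):

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("agg-sketch").getOrCreate()

# Mask the account number with a UDF, then aggregate and write back to S3.
mask_account = F.udf(lambda acct: "XXXX" + acct[-4:] if acct else None, StringType())

daily = (
    spark.read.parquet("s3://example-raw-bucket/transactions/")
    .withColumn("account_masked", mask_account(F.col("account_id")))
    .groupBy("txn_date", "account_masked")
    .agg(F.sum("amount").alias("daily_amount"))
)
daily.write.mode("overwrite").parquet("s3://example-curated-bucket/daily_totals/")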

Used AWS services with a focus on big data architecture, analytics, enterprise data warehousing, and business intelligence solutions to ensure optimal architecture, scalability, flexibility, availability, and performance, and to provide meaningful and valuable information for better decision-making.

Worked as an SME for risk management covering both market and credit risk, emerging/equity markets, fixed income, and money markets.

Managed Amazon Web Services such as EC2, S3 buckets, ELB, Auto Scaling, SNS, SQS, AMI, IAM, DynamoDB, Elasticsearch, and Virtual Private Cloud (VPC) through the AWS Console and API integration.

Produced comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.

Worked in data mining, data cleansing, data profiling, data modelling, data quality and data warehousing.

Involved in writing Data Mapping Documents for the system and involved in documenting the ETL process and writing SQL Queries for the retrieval of the data.

Worked on Big Data analytics platform for processing customer viewing preferences using Java, Hadoop, and Hive.

Worked across the investment banking industry, project finance, the credit card industry, mortgage loan servicing, ACH, wire transfers, commercial banking, retail banking, capital management, and credit risk fundamentals.

Utilized Loan IQ's data capabilities to extract, analyze, and report on key loan-related metrics, providing valuable insights for decision-making and regulatory compliance.

Collaborated with cross-functional teams to understand business needs, translating them into data-driven solutions within the Loan IQ system to support strategic initiatives and improve overall loan management processes.

Worked with project manager to document and manage change requests and worked with SMEs to define multifamily mortgage business rules.

Used Databricks extensively for creating PySpark auto-schedulers and SMTP triggering.

Created Tableau and Grafana dashboards to produce different views of data visualizations, and presented dashboards on web and desktop platforms to end users to help them make effective business decisions.

Used Hadoop (Pig and Hive) extensively for basic analysis and extraction of data in the infrastructure to provide data summarization.

Developed SQL and Python scripts to pull data for different metrics defined in TCRD.

Worked with relational database management systems (RDBMS) and rapid application development methodology.

Created data pipelines for different events, ingesting, aggregating, and loading data from AWS S3 buckets into Hive external tables in HDFS locations to serve as feeds for Tableau/Grafana dashboards.
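A minimal sketch of registering such an S3-backed Hive external table for the dashboard feed (the database, columns, and location are illustrative assumptions):

from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Point an external table at the curated S3 data so BI tools can query it through Hive.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS analytics.event_feed (
        event_id STRING,
        event_type STRING,
        amount DOUBLE
    )
    PARTITIONED BY (event_dt STRING)
    STORED AS PARQUET
    LOCATION 's3a://example-curated-bucket/event_feed/'
""")
spark.sql("MSCK REPAIR TABLE analytics.event_feed")  # register partitions already present in S3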

Designed roles and groups for users and resources using AWS Identity and Access Management (IAM) and configured/created new IAM roles based on requirements.

Supported applications in production, noting interruptions or bugs and performing problem-solving exercises to determine the problem and ensure continued use of the application.

Troubleshot production incidents requiring detailed analysis of issues on web and desktop applications, Autosys batch jobs, and databases.

Extracted and analyzed data from Oracle ERP systems to generate insights and support decision-making.

Developed and maintained Oracle ERP reports and dashboards to monitor KPIs and track business performance.

Developed Spark SQL logic based on the Teradata ETL logic and pointed the output Delta back to newly created Hive tables as well as the existing Teradata dimension, fact, and aggregate tables.
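A minimal sketch of porting one such Teradata aggregation into Spark SQL and persisting the result as a Delta-backed table (assumes a Delta-enabled runtime such as Databricks; the table names are illustrative):

from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Re-express the Teradata aggregate logic in Spark SQL, then persist as Delta.
monthly = spark.sql("""
    SELECT account_id, txn_month, SUM(amount) AS monthly_amount
    FROM staging.transactions
    GROUP BY account_id, txn_month
""")
monthly.write.format("delta").mode("overwrite").saveAsTable("curated.monthly_account_totals")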

Assisted with planning and testing of application, configuration, and database changes, installation of upgrades and patches, and updated production support documentation.

Worked on microservices with a fully automated continuous integration system using Git, Jenkins, MySQL, and custom tools developed in Python and Bash. Excellent knowledge of J2EE architecture, design patterns, and object modeling using various J2EE technologies and frameworks, with comprehensive experience in web-based applications using J2EE frameworks like Spring, Hibernate, Struts, and JMS.

Environment: Agile, Java/J2EE, AWS EC2, EMR, Lambda, Security Groups, S3, EBS, Direct Connect, VPC, CloudWatch, IAM, RDS, CPM, NAS, Chef, Jenkins, Microservices, Oracle, SQL, One Lake, Snowflake, Hadoop, Big Data, JMS, REST Web Services, Servlets, JUnit, Splunk, ANT, Git, and Win.


