Resume


Data Developer

Location:
Columbia, MO
Salary:
90000
Posted:
March 19, 2019


SHIVAPRASAD R

Mobile# 661-***-****

ac8t3x@r.postjobfree.com

SUMMARY

Over 9 years of professional experience in IT and services across the Healthcare, Insurance, Banking, and Retail business domains.

Experienced in Data Warehousing, Data Analysis, Data Modeling, ETL tools, and Business Intelligence reporting; involved in moving IT infrastructure to Azure and Amazon cloud services to better utilize resources and provide better service to production support and application development.

Hands-on technical experience with Informatica PowerCenter, PowerExchange, Data Quality (IDQ), and Master Data Management (MDM) products, using on-premise, Azure, and Amazon Web Services environments, Big Data connectivity, and the Proactive Monitoring tool.

Experienced working in Agile and Waterfall methodologies.

Extensive experience in developing complex mappings in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup, Expression, and Update Strategy.

Worked on generating daily production data quality reports using Informatica IDQ to validate and compare source and target data for daily and periodic ETL loads.

Expertise in Database development skills using SQL, PL/SQL, stored procedures, functions, packages, and triggers.

Proficient in using IBM Data Studio, TOAD, and SQL Developer for system testing of the reports.

Expertise in deriving Analytical, Logical, Physical, Dimensional and Statistical Data Models.

Good knowledge of Star Schema and Snowflake Schema used in relational, dimensional and multidimensional modeling using Erwin tool.

Experienced in developing UNIX shell scripts and scheduling them using Control-M and Autosys.

Involved in Unit testing, User acceptance testing, Integration testing, Performance testing and Production support.

Deft in preparing change tickets and Standard Operating Procedure (SOP) documents for production support activities covering Informatica, AWS, and SQL Server, and in resolving trouble tickets/change requests logged by business users.

Experienced with Informatica performance tuning, troubleshooting, backup & recovery, code deployment, and proactive health checks.

Experience migrating different tools and databases from on-premise to Azure and from on-premise to AWS.

Worked on system design (server estimation, hardware requirements, cost estimation, backup plans, and blueprint documents), installation, administration, and version upgrades of Informatica on Windows and UNIX platforms in cloud systems (Azure/AWS).

Performed database/infrastructure physical design, database upgrades and migrations of Azure/ AWS environments & applied software patches to databases as per application/client requirements.

Experience in Configuring and Deploying instances on Amazon Web services.

Experience with Amazon AWS Cloud Services, (EC2, Cloud Watch, Elastic IP, IAM, VPC, Cloud Formation, Route53) and managing security groups on AWS.

Excellent verbal and written communication skills and strong time management skills.

Experienced with the on/offshore delivery model; able to manage time efficiently and effectively, prioritizing efforts to handle multiple activities in parallel.

Highly proficient in troubleshooting and resolving production issues and application users' issues.

EDUCATION

Bachelor of Computer Science Engineering from Jawaharlal Nehru Technological University, India.

CERTIFICATIONS

Power Center Data Integration 9.x Developer Specialist certified by Informatica Corporation.

Solutions Architect Associate certified by Amazon Web Services.

Oracle Database SQL Certified Associate (1Z0-071), certified by Oracle Corporation.

TECHNICAL SKILLS

Operating Systems

Windows 10/8/Vista/XP/2000/98, Windows Server 2003/2008 R2/2012, AIX 7.1/5.3, Red Hat Linux 6

RDBMS

IBM DB2 9.7, Oracle 11g/10g/9i/8i, SQL Server 2014/2012/2008, MySQL, SQL Azure

Languages

UNIX Shell Scripting (Bourne, Korn), Perl, T-SQL, SQL, PL/SQL, HTML, C++, C#, VB.NET, ASP.NET

Tools

WinSCP, PuTTY, MS Office & Visio, SharePoint, TOAD 12.10/9.0/8.5, Erwin, Autosys, Visual Studio 2013/08/05, Oracle GoldenGate 11g/10g, HP Quality Center, Rational ClearQuest, CA Harvest SCM, Tivoli, and IBM Workload Manager

ETL Tools

Informatica 10.2/10.1/9.6/9.1/8.6, Informatica Data Quality, MS-SSIS 2012/08r2

Reporting Tools

Tableau, SAP-Business Objects XI 3.1, Cognos, OBIEE 11.1/10.x, MS-SSRS 2012/08r2

Cloud Applications

Microsoft Azure; Amazon Web Services (AWS) – EC2, S3, Redshift, RDS, EBS, AMIs, IAM, and CloudWatch

PROFESSIONAL EXPERIENCE

Blue Cross Blue Shield of Michigan, Detroit, MI, USA Jan 2018 to Present Senior Data Engineer

Blue Cross Blue Shield of Michigan and Blue Care Network are nonprofit corporations and independent licensees of the Blue Cross and Blue Shield Association. Each is governed by its own board of directors and is solely responsible for its own debts and other obligations. Neither the association nor any other organization using the Blue Cross or Blue Shield brands acts as a guarantor of Blue Cross Blue Shield of Michigan's obligations. NASCO daily claims data is pulled from the NASCO Purge File of the mainframe system and loaded into EDW claims tables. IM standard data feeds are generated from the claims tables and sent to the various vendors using EDDI. Below are the responsibilities taken on during this project.

Responsibilities:

Requirements gathering for analytical reporting needs of provider, product and claims areas.

Data profiling on the source systems such as Oracle, Mainframe and Flat files for refining the data quality needs and requirements

Impact analysis on the existing ETL objects and preparation of the object list.

Developing Informatica mappings for incremental load of the data warehouse tables along with applying complex business rules involving all transformations.

Building ETL process for loading dimensions (SCD Type I & II) and facts in the given data marts.

Developing wrapper scripts to perform pre- & post load activities on Unix and database.

Creating the oracle stored procedures and packages to perform database load activities.

Performance tuning of ETL code and database queries to meet the given SLA requirements.

Implemented the ABC (audit, balance & control) process to enable auditing, balancing, and control for the ETL code.

Job scheduling through IBM Workload Manager and Tivoli, and source version control through CA-SCM.

Developed a UNIX script to bring source files down from a legacy mainframe source to the respective UNIX source directory, trigger the necessary Informatica workflows, archive the files, and update the log files on the operations performed.

Developed ETL objects for data warehousing purpose in local database and migrated into QA environment for testing purposes.

Prepared unit test cases, test scripts & test data. UAT coordination with business and defect tracking using Rational Clear Quest tool. Defect fixes, support and follow up on necessary approvals for production deployment.

Pre-production walkthrough and preparation of project documents: Low Level Design, High Level Design, Unit Test Results, production transition plan, and restart and recovery strategy.

Supported UAT, production migration, and transition to the production support teams, including warranty and post-warranty handover to the respective teams.
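The mainframe file-staging wrapper described above can be sketched as a small shell script. This is an illustrative sketch only: the directory layout, workflow naming, and log format are assumptions, and the actual pmcmd invocation is replaced by a printed placeholder so the script can be dry-run anywhere.

```shell
#!/bin/sh
# Hypothetical sketch of the file-staging wrapper: move files that landed
# from the mainframe into the Informatica source directory, trigger the
# load (placeholder echo here), archive the input, and log the action.

stage_and_archive() {
    # $1=landing dir  $2=ETL source dir  $3=archive dir  $4=log file
    landing=$1; srcdir=$2; archive=$3; log=$4
    ts=$(date +%Y%m%d%H%M%S)

    for f in "$landing"/*; do
        [ -f "$f" ] || continue
        base=$(basename "$f")
        cp "$f" "$srcdir/$base"                               # expose file to the ETL session
        echo "would run: pmcmd startworkflow wf_load_$base"   # placeholder for the real trigger
        mv "$f" "$archive/$base.$ts"                          # archive with a timestamp suffix
        printf '%s staged %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$base" >> "$log"
    done
}
```

In the real job the echo line would be the actual workflow trigger and the script would run under the enterprise scheduler.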

Environment: Informatica PowerCenter & Power Exchange 10.2/9.6, Toad for Oracle, Unix, CA-SCM Workbench, Rational Clear Quest, IBM Workload scheduler and Tivoli.

Commonwealth of Pennsylvania, Harrisburg, PA, USA AUG 2017 to Dec 2017

DW/ETL Informatica Developer

The state and federal governments fund a subsidized child care program which helps low-income families pay their child care fees; the program is managed by the Child Care Information Service (CCIS) office located in each county. Pennsylvania's Enterprise to Link Information for Children Across Networks (PELICAN) is an online system used by the Office of Child Development and Early Learning (OCDEL). As part of a multi-year phased approach, OCDEL is incrementally creating an extensive data set containing data from ELN, including PA Pre-K Counts (PKC), Early Intervention, Head Start, Keystone STARS providers, and Child Care Works (CCW). In alignment with OCDEL priorities, additional data is required to more fully evaluate program performance and program quality and to assist in ongoing research activities.

Responsibilities:

Gathering the requirements by coordinating with the users, effort estimation, preparation of project timelines & design for ETL & reporting modules.

Identification of source systems to data warehouse by coordinating with respective source teams.

Analyzed the existing operational sources and involved in development of the conceptual, logical and physical data models for the newly built data warehouse.

Identification of dimensions and measures per business needs and prepare the design of dimensions and facts in star schema model using ERWIN tool.

Design walkthrough with team and made changes where required and got user sign-off approvals.

Development of Informatica mappings to load DW tables and dimension and fact tables.

Developed ETL objects for data warehousing purpose in local database and migrated into QA environment for testing purposes.

Used workflows to schedule the ETL processes to run at periodic intervals for various period close operations.

Developing UNIX shell scripts to perform pre- and post-load activities and to execute parent and child ETL process cycles using the pmcmd utility.

Enhanced workflow performance by creating indexes and views and altering cache sizes, maintained Informatica logs where required, and performed tuning on both the SQL and Informatica sides.

Performed unit, system, and end-to-end testing. Documented processing times for each module, developed test cases, and used them to run through each process. Implemented and provided day-to-day support for the BI environment, and mentored end users interacting with the data.

Understood the reporting requirements of users and designed the data source and reporting layers.

Assisted the Cognos reporting team in generating reports for business users' needs.

Guided reporting team on building, publishing interactive reports and dashboards, report scheduling using Tableau server.
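The pmcmd-driven parent/child wrapper pattern described above can be sketched in shell. The service, domain, and folder names below are placeholders, not the project's real values; PMCMD defaults to `echo` so the sketch dry-runs by printing the commands instead of needing an Informatica domain (a real environment would set PMCMD=pmcmd).

```shell
#!/bin/sh
# Minimal sketch: chain a parent staging workflow and its child dimension
# and fact loads through pmcmd, stopping the chain if any step fails.

PMCMD=${PMCMD:-echo}                      # echo = dry run; pmcmd in production
INFA_SERVICE=${INFA_SERVICE:-IS_DEV}      # assumed integration service name
INFA_DOMAIN=${INFA_DOMAIN:-Domain_DEV}    # assumed domain name
INFA_FOLDER=${INFA_FOLDER:-DW_LOADS}      # assumed repository folder

run_workflow() {
    # -wait blocks until the workflow finishes, so the exit status reflects
    # the workflow result and child loads only start after the parent succeeds.
    "$PMCMD" startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
        -f "$INFA_FOLDER" -wait "$1"
}

run_workflow wf_parent_stage &&
run_workflow wf_child_dims &&
run_workflow wf_child_facts
```

Chaining with `&&` gives the "parent and child ETL process cycles" behavior: a failed parent load short-circuits the children, which the scheduler can then report on.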

Environment: Informatica 10.1.1, Data Quality, Oracle 11gR2, Erwin, OBIEE 11.1g and Cognos.

Farmers Insurance Group, Grand Rapids, MI / CSC, USA Aug 2015 to July 2017

Senior Lead Informatica Developer

Farmers Insurance is an American insurance and financial services company. It provides home, auto, commercial, and life insurance and other financial services throughout the United States. The Home Insurance Property and Casualty project modules were under this client. This project is intended to provide the developers with the design used to build and test the required Informatica ETL components, including Informatica mapping design, data transformation, field-level data mapping, and derivation rules.

Responsibilities:

Gap Analysis on the existing legacy system and other relational source systems.

Did data profiling on the source system to confine and streamline the business requirements for ETL and reporting components.

Involved in creation of data marts for design of Star schemas for Data warehouse.

Coordinated with project manager to document the project planning and also bi-weekly updates.

Daily status meetings with teams and provided guidance and mentoring in technical areas to the team of Developer/Analysts and then document and submit status reports to leadership.

Coordinated with Business Analyst and prepared Table and column level Business/Transformation rules documents (STTM).

Developed several ETL processes, creating Informatica mappings, workflows, and sessions, and tuned them for better performance.

Enforcing coding best practices like naming conventions and Error Handling.

Performance tuning of ETL code through session partitioning, incremental aggregation and push down optimization techniques.

Database query cost analysis to identify performance bottlenecks and fine-tune the SQL to achieve better performance.

Developing reusable code modules (ETL, Unix and database) for performing Audit, Balance and Control for any data that loads into EDBI warehouse & data marts.

Developed a UNIX script to bring source files down from a legacy mainframe source to the respective UNIX source directory, trigger the necessary Informatica workflows, archive the files, and update the log files on the operations performed.

Developed ETL objects for data warehousing purpose in local database and migrated into QA environment for testing purposes.

Prepared test cases, test scripts & test data. UAT coordination with business and defect tracking using Quality Center tool. Defect fixes, support and follow up on necessary approvals for production deployment.

Pre-production walkthrough and preparation of project artifacts such as the Requirements Traceability Matrix, Low Level Design, High Level Design, and Unit Test Results; involved in all phases of the SDLC.

Guiding team members, performing code reviews, mentoring new members and bringing them up to speed per the project needs.

Environment: DB2, Informatica 9.6.1/9.5, UNIX, TOAD for DB2, Flat file and Tableau.

Farmers Insurance Group, Los Angeles, CA / CSC, USA Aug 2012 to July 2015

Lead Informatica Developer

Farmers Insurance is an American insurance and financial services company. It provides home, auto, commercial, and life insurance and other financial services throughout the United States. Below are a few of the project modules under this client. This project is intended to provide the developers with the design used to build and test the required Informatica ETL components, including Informatica mapping design, data transformation, field-level data mapping, and derivation rules.

Responsibilities:

Involved in the Design, Development, Testing phases of ETL process for Auto and home policy data marts.

Design of end user reports using SAP Business Objects which will be replacing the legacy based system reports

Generating the reports from data warehouse and data marts and making them balanced against the reports from source systems.

Close co-ordination with business users to analyze and elicit the requirements for building reports.

Data profiling to confine and streamline the business requirements for generating the end reports.

Following the PDLC process to move code across environments through proper approvals and source control environments.

Developed the audit, balance and control process to maintain a controlled history of ETL job statistics.

Analysis on the existing legacy system to correlate it in Informatica based solution.

Requirements gathering from the business team and data profiling per the same.

Effort estimation & schedule planning for the given requirements.

Design the ETL process.

Development and unit testing.

System testing and migration support

Code defect fixes, support and follow up on system testing approval and client approvals on UAT.

Preparing and reviewing deployment documents and supporting production migration and testing.

Environment: IBM DB2, Informatica 9.5/9.1, SAP Business Objects, UNIX and Flat files.

Plantronics, Santa Cruz, CA / CSC, India Jan 2012 to Aug 2012

ETL / Informatica Developer

Plantronics is a world leader in producing audio communications equipment for business and consumers. It is a publicly held company (NYSE: PLT) headquartered in Santa Cruz, California, with offices in 20 countries, including major facilities in China, England, Mexico, and the Netherlands. Its products are sold and supported through a worldwide network of Plantronics partners, including resellers, systems integrators, retailers, and mobile carriers. This project is intended to provide the developers with the design used to build and test the required Informatica ETL components, including Informatica mapping design, data transformation, field-level data mapping, and derivation rules.

Responsibilities:

Analyzed the existing operational data sources and involved in development of the logical and physical data models for the newly built data warehouse.

Developed initial ETL (Informatica) objects in local repository folders, while using the shared objects for sources, targets and transformations.

Involved in creation of data marts for design of Star schemas for Data warehouse application.

Created several database objects like views, indexes, procedures etc., to resolve the performance issues.

Developed several ETL processes, creating Informatica mappings, workflows, and sessions, and tuned them for better performance.

Used stored procedures, functions, cursors etc. to load data from different sources i.e., ‘Infinity controlled’ to target area called ‘Finance Area - controlled’ databases.

Developed ETL objects for data warehousing purpose in local database and migrated into QA environment for testing purposes.

Assisted with reporting team to generate the reports for business users need.

Used workflows to schedule the ETL processes to run at periodic intervals for various period close operations.

Developed a UNIX script to bring source files down from a staging area (Infinity) to the respective UNIX source directory, trigger the necessary Informatica workflows, archive the files, and update the log files on the operations performed.

Prepared test cases, test scripts & test data.

Did code walkthrough of all the designs with team and made changes where required.

Coordinated with project manager to document the project planning, scheduling and also bi-weekly updates.

Enhanced workflow performance by creating indexes, views, altering the cache sizes and maintained Informatica logs where required.

Used Power Analyzer to generate data profiling reports & to monitor data quality during production run.

Performed unit, system & end-to-end testing. Documented processing times for each module, developed test cases and used them to run through each process. Implemented and supported the BI environment and User Interface.

Participated in and made presentations to the higher management on the developments made.

Environment: Oracle 10g/9i, Informatica 8.6, Business Objects, PL/SQL, Erwin, TOAD and UNIX

JMP Securities, San Francisco, CA / CSC, India Oct 2010 – Dec 2011

ETL Developer

WWW Bookings (Enterprise Bookings Cubes) was designed for sales finance analysts. JMP required multidimensional reporting of orders, forecasts, opportunity/commit management data, sales channel information, conditional sales, and backlog tracking on an ad-hoc basis by various dimensions such as Sales Hierarchy, Technology Market Segments, and Customer Market Segments.

Responsibilities:

Interacted with the investment brokers, advisors, analysts and account executives and understood the business requirements by analyzing and defining key issues related to the earlier work system.

Involved in the design, data modeling, development and testing phases of various projects and enhancements.

Analysis, requirements gathering, functional/technical specification, development, deploying and testing.

Logical & Physical Database Layout Design. Created Informatica mappings for initial load and daily updates.


Developed mapplets and worklets.

Generated flat files using Informatica PowerCenter and extracted data from XML and flat file sources. Designed reusable transformations and shortcuts to share across different mappings.

Implemented Slowly Changing dimension Type 1, Type 2, and Type 3 methodology for accessing the full history of accounts and transaction information.

Developed and tested procedures, functions, and packages in PL/SQL for data ETL.

Generated Reports using Metadata Reporter

Day-to-Day Production support.

Involved in fixing and testing stored procedures and functions, and in unit and integration testing of stored procedures and the target data.

Developing UNIX shell scripts and scheduling them using scheduling tools.

Worked with end users to develop and collect feedbacks in the User Acceptance Testing (UAT)

Involved in Production Deployment and Support.

Environment: SQL Server, Informatica 8.1, PowerExchange, PL/SQL, Erwin, TOAD, XML and flat file sources, UNIX, AIX

Incent Technologies, Hyderabad, India Aug 2009 – Oct 2010

Unix Consultant

Incent Technologies is a private consulting company that sets up server and LAN infrastructure at enterprises and organizations. Its business mainly deals with schools and colleges that are expanding their existing computer infrastructure. The setup includes installing servers to host web content and to serve as a programming platform for users.

Responsibilities:

Developing Perl scripts to automate the scheduled backup process for all programs and web pages saved on the server.

Creating shell scripts to automate user-account setup for the list of programmers in the organization.

Scripts for automated monitoring and alert setup for disk overload and CPU overload on the server.

Installation of apache web server on Ubuntu GNU/Linux server.

Hardware and network setups for the Unix servers with the local area network.

Setup of required programming platforms for users on the GNU/Linux servers.

Understanding the requirements as per the business and technical need

Developing the code and testing the same for meeting the requirement

Handling maintenance and support for the production environment

Guiding new trainees on maintaining the environment.
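The disk-overload alert scripting described above can be sketched in a few lines of portable shell. The 90% threshold and the alert action (a printed line here; mail or paging in production) are assumptions for illustration.

```shell
#!/bin/sh
# Illustrative sketch of a disk-usage alert check: compare an observed
# metric against a limit and emit an ALERT or OK line.

DISK_LIMIT=${DISK_LIMIT:-90}   # assumed: percent of filesystem used before alerting

check_threshold() {
    # $1=metric name  $2=observed integer value  $3=limit
    if [ "$2" -gt "$3" ]; then
        echo "ALERT: $1 at $2 (limit $3)"
    else
        echo "OK: $1 at $2"
    fi
}

disk_used_pct() {
    # Percent used of the filesystem holding $1, parsed from POSIX `df -P`
    # output (column 5 is Capacity, e.g. "42%").
    df -P "$1" | awk 'NR==2 { gsub("%", "", $5); print $5 }'
}

check_threshold disk "$(disk_used_pct /)" "$DISK_LIMIT"
```

A cron entry would run this periodically and pipe any ALERT lines to the notification mechanism of choice; the same `check_threshold` helper can be fed a CPU load figure for the CPU-overload case.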

Environment: Perl, GNU/Linux, UNIX, Shell Scripting (Bourne, Korn)
