Jason Nagulapalli
**************@*****.***
https://www.linkedin.com/in/jason-nagulapalli-83b076b0/
Mobile Phone: 510-***-****
** **** ***** ** ********** in designing and implementing Data Warehouse applications using ETL tools, primarily Informatica Power Center 10.4, 10.2, 9.6.1, and 8.6 (Designer/Workflow Manager), to transfer data from legacy systems (mainframes) to various target systems such as Oracle and SQL Server.
Designed and developed data warehouses and data marts conceptualized on Bill Inmon and Ralph Kimball methodologies.
Analyzed source and target systems and prepared technical specifications for the development of Informatica ETL mappings to load data into various target systems.
Experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Router, Filter, Expression, Aggregator, and Joiner.
Proficient in using Informatica Workflow Manager, Workflow Monitor, and pmcmd (Informatica command-line utility) to schedule and control workflows, tasks, and sessions.
Extensively worked on database applications using DB2, Oracle, SQL Server, Netezza, PL/SQL, Yellowbrick, and PostgreSQL.
Experience in integrating data from various sources (Oracle, DB2, SQL Server, Netezza, Yellowbrick) using Informatica Power Center.
Extensively worked with databases such as Oracle 12c, SQL Server, and PostgreSQL.
Hands-on experience working with AWS Redshift, S3, and Databricks.
Hands-on working experience with Informatica Intelligent Cloud Services (IICS) 10.2.
Working experience with the IBM DataStage ETL tool for conversion jobs and maintenance of DataStage jobs.
Working knowledge of the Red Point ETL application.
Experience executing and scheduling Informatica ETL jobs through ActiveBatch.
Created and used parameters and variables for various needs as part of ActiveBatch jobs.
Used SQL tools such as TOAD and Query Analyzer to run SQL queries and validate the data in the warehouse and marts.
Experience in creating entity-relational and dimensional data models with the Kimball methodology (star schema and snowflake schema architectures, fact/dimension tables).
Performed DDL and DML operations on various DB objects.
Extensively worked on SQL queries and T-SQL for data analysis, data validation, etc.
Experience working with stored procedures (SPs) and tuning/optimizing SQL queries per business needs.
Experience making ad hoc changes to existing SQL scripts/SPs for testing and new development purposes.
Knowledge of working with big data Hadoop stack tools such as HDFS, Hive, Pig, and Sqoop.
Knowledge of related open-source products, notably Spark, Hadoop, Talend.
Experience in performance tuning HiveQL and Pig scripts and basic knowledge of writing HiveQL queries.
Documented design procedures, operating instructions, test procedures, and troubleshooting procedures.
Good hands-on experience in writing UNIX shell scripts to process Data Warehouse jobs.
Followed Agile methodology; interacted with Data Integration and Operations teams on a daily basis.
Constantly interacted with business analysts to understand reporting requirements.
Solid time management and multitasking skills, which help in conducting project meetings, reviews, walkthroughs, and customer interviews according to the varied needs of the people involved.
Excellent written and oral communication skills and a results-oriented attitude.
EDUCATION:
Bachelor’s in Computer Science and Engineering, JNTU, India, 2004 – 2007.
Master’s in Computer Science and Engineering, Northwestern Polytechnic University, 2008 – 2010.
TECHNICAL SKILLS:
ETL Tools
Informatica (Power Center 10.4, 10.2, 9.6, 9.0.1, 8.6), IDQ 9.1/9.5.1, Informatica Cloud 10.2, DataStage 9.1.2
Database
Oracle 12c/11g, SQL Server, DB2, MS Access 97/2000, Teradata, Netezza, Yellowbrick, PostgreSQL, T-SQL, Amazon Redshift, Databricks
Data Modeling
Erwin 3.5/4.0
Operating System
Windows 98/NT/2000/XP/10, UNIX (Linux, HP-Unix, Solaris)
Other Software
SQL Workbench, WinSQL, TOAD, MS Office, MS Visio, Autosys, ESP, Amazon S3
Scripting Languages
Python, Korn shell scripting (ksh), UNIX shell scripting
PROFESSIONAL EXPERIENCE:
BCBSM
Detroit, MI Aug’ 23 – Current
ETL Developer
Roles & Responsibilities:
Worked mainly on the Membership and Claims data residing in various systems (PostgreSQL, Databricks, Redshift).
Worked with Dev/Test teams to maintain and validate data flows through various systems.
Validated data flow and data from PostgreSQL to ADH (Redshift) to Databricks to maintain data integrity as part of SIT.
Validated, tested, and documented data loads between different environments.
Develop, test, and maintain ETL code using Informatica Power Center 10.4.
Raise CQ defects for any issues with data loads or data discrepancies.
Designed and enhanced existing SQL scripts to maintain correct data for downstream reports for business users.
Create documentation as needed for production support and SIT teams.
Worked on generating outbound reports for business requirements, e.g., NASCO BMI files to identify member eligibility.
Environment: Informatica 10.4, Amazon Redshift, Amazon S3, Databricks, T-SQL, SQL Server, JIRA, CQ.
Penn State Health
Hershey, PA Jun’20 – July’23
Data Analyst/Sr. Data Integration Developer
Roles & Responsibilities:
Designed and developed Informatica processes to extract data from the DB and external files daily.
Developed, tested, and maintained ETL code using Informatica Power Center 10.4.
Informatica upgrade from 10.2 to 10.4: involved in the migration of all Informatica objects and the unit testing needed for a successful migration.
DataStage conversion project: designed improved code to replace older DataStage code and converted the new jobs to Informatica 10.4.
Designed and enhanced existing stored procedures to add/exclude entities and perform other activities.
Experience working with SQL Server, T-SQL, complex stored procedures, etc.
Analyzed large sets of raw data, resulting in clean data integration.
Analyzed raw data from various types of data files from different vendors for proper integration into the Enterprise Data Warehouse.
Designed, developed, and supported maintenance for various inter-business modules within Penn State Health.
Created documentation as needed for production support teams.
Designed and developed ETL code per business needs; worked with internal and external third-party vendors when required, mostly for file transfers (inbound and outbound).
Enhanced and supported existing IICS jobs for data integration from Salesforce to SQL Server for business analysis.
Interacted with the business, gathered requirements, and translated business requirements into functional ETL code.
Scheduled and drove design review meetings with vendors and project management teams to walk through the proposed design.
Environment: Informatica 10.2/10.4, IICS 10.2, DataStage, T-SQL, SQL Server, Oracle, ActiveBatch.
Alliance Data Services
Columbus, OH Aug’19 – May’20
Data Analyst/Data Integration Developer
Roles & Responsibilities:
Designed and developed Informatica processes to extract data from the DB, Netezza, and external files daily.
Developed, tested, and maintained ETL code using Informatica Power Center 10.2.
Worked on the migration project from Netezza to Yellowbrick DB.
Analyzed data sets for various vendors resulting in efficient data integration between systems.
Provided support to senior data analysts, resulting in a positive impact on the team’s overall performance.
Assisted in the analysis of large sets of data, resulting in positive impact on the company’s overall performance.
Reduced projected data analysis time by a couple of weeks by leveraging reusable ETLs for data ingestion.
Leveraged existing Python scripts to create CDC and non-CDC mapping layouts for the migration.
Generated and executed test case documents using Python scripts (test case automation).
Executed and enhanced data-compare and data-validation Python scripts for testing, to maintain data integrity, and to store only relevant data per business requirements.
Designed and physicalized the data model for the Enterprise Data Warehouse.
Estimated database space requirements for the DBAs based on vendor input and existing workers’ comp data.
Worked closely with the Business Analyst to create the source-to-target document mapping the Account Information data to the existing data structures.
Created documentation as needed for production support teams.
Environment: Informatica 10.2, SQL Server, Yellowbrick, DB2, Netezza, Python, UNIX.
Nationwide Insurance
Columbus, OH July’18 – Aug’19
Data Integration Developer/Sr. ETL Developer
Roles & Responsibilities:
Developed financial models for disbursement and adjustment of financial transactions and incorporated changes in the existing functionalities based on new features and requirements
Analyzed data from various source data systems for the Promise 2020 project.
Understood and translated complex SQL queries into ETL code (Informatica).
Performed DDL and DML actions mainly on PMTS (Payments and Transactions) DB tables.
Converted complex SQL into ETL code using Informatica Power Center to load the PMTS data mart.
Performed in-depth analysis on PMTS Adjustments and Disbursements SQL files for data cleansing and analysis to extract data per business needs for end users.
Worked in a TDD approach, using Ruby to write test cases.
Used HP ALM Quality Center for tracking, performing other necessary actions on any defects opened.
Used Git for version control and to deploy and migrate code across different branches.
Attended Scrum meetings and provided the latest updates to the team and business users.
Used Task Board and RTC to track Story cards.
Environment: Informatica 9.6.1/10.2, Oracle, Ruby, Perl scripting, ESP, Windows Server.
T. Rowe Price, Owings Mills, MD Jun’16 – July’18
Data Analyst / ETL Developer
Roles & Responsibilities:
Involved in the Informatica upgrade projects from 9.1 to 9.6 and from 9.6 to 10.2.
Migrated Informatica objects from the old server to the new server as part of the upgrade.
Updated JIL scripts to update AutoSys jobs with the new server information.
Created Informatica Mappings to acquire assets and flows information on competitor products, both US and International.
Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.
Involved in the migration of mappings from IDQ to Power Center.
Applied rules and profiled the source and target tables’ data using IDQ.
Loaded monthly file-based Strategic Insight tables into CIP Exploration Oracle database.
Provided support to senior data analysts, resulting in a positive impact on the team’s overall performance.
Assisted in the analysis of large sets of data, resulting in positive impact on the company’s overall performance.
Provided basic exception handling for potential data issues identified during data analysis.
Created, tested, and monitored DRT jobs on Informatica Cloud.
Used the REST API to access metadata information in Informatica Cloud.
Created DSS tasks in Informatica Cloud.
Environment: Informatica 9.6.1/10.2, IICS, SQL Server Management Studio 2008 and 2012, Oracle 11g, UNIX.
Federal Home Loan Bank, Pittsburgh, PA June’15 – June’16
ETL Informatica Developer
Roles & Responsibilities:
Involved in a Project called POLYPATHS.
Involved in optimization of code and tuning queries for performance.
Involved in testing and data validation of the reports.
Created mappings, workflows to load data from FHLB-PGH’s existing DWH into their new POLYPATHS system.
Worked with FHLB-PGH’s trade data to analyze bond pre-trade data.
Helped load Market values correctly into the new POLY system for trade analysis.
Worked with flat files (OF-Generic Curve, Bond Pre-Trade) and loaded them into the POLY system.
Worked on both Outbound and Inbound interfaces (POLYPATHS).
Redesigned and implemented FHLB-PGH’s FAS-133 system to load INCEPTION and ONGOING outbound flat files.
Created deployment groups to migrate Informatica code from Dev to UAT.
Followed Agile methodology, interacted with business users daily.
Environment: Informatica 9.5.1, SQL Server Management Studio 2008 and 2012, Autosys, PL/SQL, Windows XP, UNIX, Secure FX.
Lincoln Financial Group, Philadelphia, PA Mar’14 – June’15
ETL Informatica Developer
Roles & Responsibilities:
Created Informatica mappings and workflows to migrate data from one environment to another and to perform data enhancements.
Involved in optimization of code and tuning queries for performance.
Involved in testing and data validation of the reports.
Provided postproduction support to end-users.
Worked on the ESR Support and EM Support projects, which are two processes at Lincoln Financial Group.
Created AutoSys jobs and was involved in monitoring AutoSys jobs on a daily basis.
Supported end users daily with reports and any issues concerning the balance sheet.
Environment: Informatica 9.5.1, SQL Server Management Studio 2008 and 2012, Autosys, PL/SQL, Windows XP.
TMG Health, Jessup, PA Sep’13 – Feb ‘14
ETL Informatica Developer
Roles & Responsibilities:
Prepared design documents and developed processes for loading data into the data warehouse.
Designed, developed, and deployed Informatica mappings and workflows from Dev to Test and Production environments.
Created new staging tables in the staging DB to store data from client files (CMS, NY Medicaid files, etc.).
Wrote PostgreSQL queries to work with the data being loaded from external client files into the Data Warehouse.
Worked with existing Python scripts and made additions to them to load data from CMS files to the staging database and to the ODS.
Worked extensively on SQL Server 2008 and PostgreSQL.
Worked extensively on Medicaid, Medicare, and CMS files and their corresponding file layouts.
Prepared design, technical and functional documents from the business requirements gathered.
Created mappings to move data from Oracle and SQL Server to the new Data Warehouse in Greenplum.
Used different transformations for loading the data into targets, such as Source Qualifier, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Expression, Router, Filter, and Aggregator.
Created and configured Workflows, Worklets, and Sessions to transport the data to target tables using Informatica Workflow Manager.
Involved in data validation to ensure that the highest levels of data quality and data integrity are maintained.
Performed unit testing and involved in tuning the Session and Workflows for better performance.
Environment: Informatica 9.1, SQL Server Management Studio 2012, IDLE (Python GUI) 2.5.4, pgAdmin 3 (Postgres), PL/SQL, Windows XP, UNIX.
Capital Blue Cross (BCBS), Harrisburg, PA Jan’12 – Jan’14
ETL Informatica Developer
Roles & Responsibilities:
Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.0.1.
Interfaced with various members of the technical and business team to translate the business reporting and data maintenance requirements into functional ETL code.
Created Mappings to move ETL processes from various systems like DB2, Oracle, SQL Server to CBC’s new Data Warehouse in Netezza.
Created and configured Workflows, Worklets, and Sessions to transport the data to target tables using Informatica Workflow Manager.
Worked with the DBA to modify SQL from DB2 to Netezza version for already existing code in DB2.
Parsed high-level design specs into simple ETL coding and mapping standards.
Extensively involved in data validation to ensure that the highest levels of data quality and data integrity are maintained.
Created deployment groups to migrate code for one environment to another.
Complete understanding of Pushdown Optimization Utility in Informatica.
Used Informatica Partitioning to improve session performance.
Used Aginity and WinSQL to run Queries, for testing and validation of data.
Created mappings to write Infusion data, CPT_HCPC codes, etc., to flat files for CareCentrix for reporting.
Created reusable transformations and Mapplets to use in multiple mappings.
Used Workflow Manager for creating, validating, testing, and running the sequential and concurrent batches and sessions, and scheduled them to run at specified times.
Created Workflow, Worklet, Assignment, Decision, Event Wait, Event Raise, and Email tasks, and scheduled tasks and workflows based on client requirements.
Migrated ETL code from Dev to QA and then from QA to Prod for new release cycles.
Developed testing strategies and performed in-depth testing to ensure data quality.
Environment: Informatica Power Center 9.0.1, Oracle 11g/9i, DB2, PL/SQL, UNIX, Windows XP, WinSQL, Tidal.
Eye-Med (Luxottica), Mason, Ohio May’11 – Jan’12
ETL Developer
Roles & Responsibilities:
Read data from AS400 systems, then wrote ETL functional design documents and ETL technical design documents for program development, logic, and coding.
Reviewed BPDs (business process documents) and designed functional and technical design documents.
Used EDI to exchange processable data to core FACETS tables.
Interacted with the business, gathered requirements, and designed the business process flow documents.
Installed Pervasive Data Integrator for the ETL processing.
Worked partially on Oracle E-Business Suite.
Mapped broker data and client data of Eye-Med from SFDC to FACETS.
Mapped data from various systems to FACETS core tables.
Helped in developing the Eye-Med vision claims module.
Mapped data from core FACETS to Hyper data warehouse.
Environment: Pervasive 5.0 data integrator, Workflow Manager, Workflow Monitor, Oracle 10g, Facets.