Manager Data

Location:
Middletown, DE
Posted:
March 12, 2020

PRADEEP REDDY GUNTUKA

Email: adb9sg@r.postjobfree.com

Mobile: 302-***-****

SUMMARY:

14+ years of lead experience in the development, maintenance, design, and implementation of data warehouse applications using ETL tools such as Informatica PowerCenter, with experience in databases including Oracle, Teradata, SQL Server, DB2 UDB, and Hadoop.

Experience in modeling Transactional Databases and Data Warehouses using tools like Erwin and ER/Studio.

Experience in developing logical and physical models and implementing them in Oracle.

Experience in integrating data from various sources: Oracle, DB2 UDB, Sybase, and Teradata.

Experience in creating entity-relationship and dimensional data models following the Kimball methodology (star schema and snowflake schema architectures, fact/dimension tables).

Worked with data cleansing and data mining.

Experience with Big Data: Hadoop 2.x, Apache Spark, and Big Data analytics.

Good knowledge of scripting languages such as Python, R, and Scala.

Hands-on experience working with Hadoop ecosystem tools such as Hive, Pig, Sqoop, and MapReduce.

Good knowledge of Hadoop cluster architecture and cluster monitoring.

TECHNICAL SKILLS:

ETL Tools: Informatica (PowerCenter 9.6.1/9.5.0/9.1.0/8.6.1/8.5/8.1.2/8.1.1/7.1.1/6.2/5.1 and PowerMart 6.2/6.0/5.1), PowerExchange 8.5/8.6.1 (Striva 5.0)

Databases: Oracle 10g/9i/8i/8.0/7.0, MS SQL Server 2000, DB2 UDB, MS Access 97/2000, Sybase, Teradata V2R3.

BI Tools: BusinessObjects 5.1/5.0, Cognos.

Data Modeling: Erwin 3.5/4.0

Operating Systems: Windows 98/NT/2000/XP, UNIX (Linux, HP-UX, Solaris)

Other Software: TOAD 11.0.0.116, MS Office, MS Visio, Autosys.

Analytical: Big Data Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Apache Spark; Base SAS, SAS EG, PROC SQL, SAS Macro; R, RStudio, AWS

EDUCATION:

KARNATAK UNIVERSITY, India (2000)

Bachelor of Engineering in Electronics and Communication

EXPERIENCE:

06/2019 to Present: JPMorgan Chase, Wilmington, DE

Software Engineer

Working closely with the business, the data modeling team, and DBAs to create requirements documentation.

Developing the ETL code for the requirements.

Registering entities for the conformed and semantic layers in HDFS.

Creating ITSM tickets for code deployments.

Creating external tables and views on top of HDFS in Hive.
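
Below is a minimal, illustrative sketch of this kind of Hive DDL issued through a PySpark session; the database, columns, and HDFS path are hypothetical placeholders, not the project's actual objects.

```python
from pyspark.sql import SparkSession

# Hive support is required so spark.sql() can run Hive DDL.
spark = (SparkSession.builder
         .appName("hive-external-ddl")
         .enableHiveSupport()
         .getOrCreate())

# External table: Hive tracks only the metadata; the data stays in HDFS.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS semantic.accounts (
        account_id BIGINT,
        open_dt    STRING,
        balance    DECIMAL(18, 2)
    )
    STORED AS PARQUET
    LOCATION 'hdfs:///data/semantic/accounts'
""")

# View layered on the external table for downstream consumers.
spark.sql("""
    CREATE VIEW IF NOT EXISTS semantic.v_open_accounts AS
    SELECT account_id, balance
    FROM semantic.accounts
    WHERE balance > 0
""")
```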

Registering entities in Unified Data Services and promoting them to higher environments.

Environment: Ab Initio 3.3.5, Oracle 11g, Teradata, Hadoop, HDFS, Hortonworks, Hive, UNIX, JIRA 5.

02/2019 to 06/2019: SBD2 (CMS Project), Baltimore, MD

Sr. ETL/Informatica Developer (Contract)

Working closely with the data modeling team and developing the MDM model for PECOS 2.0.

Used Informatica PowerCenter for ETL (extraction, transformation, and loading) of data from the CMS PECOS 1.0 Oracle database to the PECOS 2.0 system, which runs on a PostgreSQL database.

Fine-tuned mappings, sessions, and workflows for performance.

Kept track of the development life cycle using JIRA.

Performed unit testing with the Informatica Data Validation tool.

Environment: Informatica 10.2.0, Oracle 11g, UNIX, Informatica Data Validation tool, JIRA 5, AWS cloud.

07/2018 to 02/2019: GDIT/CSRA Inc. (CMS Project), Baltimore, MD

Sr. Hadoop/ETL Developer (Contract)

Working on the CMS Transformed Medicaid Statistical Information System (T-MSIS) project and loading data into the eMDM project.

Extensively used, and currently using, the Syncsort DMExpress ETL tool to load Hive tables into Hive targets, flat files, and DB2 target tables.

Designed, architected, developed, executed, and tested various ETL components built with DMExpress tasks and jobs to perform ETL on the T-MSIS Hive source, stage, and control tables.

Developed ETL routines using the JOIN, COPY, AGGREGATE, MERGE, SORT, and FILTER tasks in various ETL frameworks and reusable components.

Developed reusable control and driver tables using DMExpress tasks and loaded them for all states received from T-MSIS.

Wrote and executed various SQL queries and joins in the Hive and Impala SQL editors, formatted the queries, troubleshot errors, and exported datasets for validation.
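
For illustration, a join and export of the kind described here, run through PySpark's SQL interface rather than the Hive/Impala editors; the table and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Hive-style join: count claims per state against a control table.
validation = spark.sql("""
    SELECT s.state_cd,
           COUNT(*) AS claim_cnt
    FROM   stage.claims   s
    JOIN   control.states c
           ON s.state_cd = c.state_cd
    GROUP  BY s.state_cd
""")

# Export a small dataset for offline validation/comparison.
(validation.coalesce(1)
           .write.mode("overwrite")
           .csv("hdfs:///tmp/validation/claims_by_state", header=True))
```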

Executed the DMExpress ETL routines on my local server, on the edge node, and on single-node and multi-node clusters via the HDFS ecosystem to fine-tune performance and exploit the distributed architecture.

Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.

Optimized existing Hadoop algorithms using Spark Context, Spark SQL, DataFrames, and pair RDDs.

Analyzed the SQL scripts and designed the solution implemented in PySpark.

Developed Python scripts and UDFs using DataFrames/SQL/Datasets and RDD/MapReduce in Spark for data aggregation and queries, writing the results back into DB2.
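
A sketch of that Spark pattern under assumed names: a Python UDF plus a DataFrame aggregation, with the result written back to DB2 over JDBC (the connection URL, credentials, and tables are placeholders, and the DB2 JDBC driver jar must be on the Spark classpath).

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Simple Python UDF that normalizes a code column.
normalize = F.udf(lambda v: v.strip().upper() if v else None, StringType())

# Aggregate a Hive table after cleansing the join key.
df = (spark.table("stage.enrollments")
      .withColumn("state_cd", normalize("state_cd"))
      .groupBy("state_cd")
      .agg(F.count("*").alias("row_cnt")))

# Write the aggregate back to DB2 via JDBC.
(df.write.format("jdbc")
   .option("url", "jdbc:db2://db2host:50000/EMDM")
   .option("dbtable", "ETL.ENROLLMENT_COUNTS")
   .option("user", "etl_user")
   .option("password", "***")
   .mode("overwrite")
   .save())
```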

Environment: Syncsort DMExpress, Hadoop, HDFS, MapReduce, Hive, Impala, PySpark, DB2, UNIX, Tortoise SVN, JIRA 5.

12/2016 to 07/2018: CSRA Inc. (CMS Project), Baltimore, MD

Sr. ETL/Informatica Developer (Contract)

Working closely with the architecture team and the business to redesign the existing EDSC model.

Used Informatica PowerCenter for ETL (extraction, transformation, and loading) of data from the CMS PECOS and NPPES systems, which run on DB2, to heterogeneous target systems.

Fine-tuned mappings, sessions, and workflows for performance.

Scheduled workflows using the Tivoli scheduling tool.

Kept track of the development life cycle using JIRA.

Environment: Informatica 9.6.1, Oracle 11.0, DB2, Tivoli, UNIX, Tortoise SVN, JIRA 5.

07/2016 to 08/2016: QVC, West Chester, PA

Sr. Informatica Developer (Contract)

Worked with the Business Analyst team on requirements gathering and on preparing functional and technical specifications.

Used Informatica PowerCenter for ETL (extraction, transformation, and loading) of data from heterogeneous source systems.

Used pushdown optimization to load data to and from Teradata.

Environment: Informatica PowerCenter 9.6.1, UNIX, Oracle 11g, WinCVS, Teradata, TOAD.

09/2013 to 05/2016: JPMORGAN CHASE, Wilmington, DE

Application Developer (FT)

Responsible for L2 support and for the design and development of ETL processes using UNIX shell scripts, Teradata utilities, and Oracle SQL*Loader jobs.

Worked on JIRAs and P2/P3 Peregrine tickets, and created ITSM/retroactive ITSM records for new implementations and break fixes.

Acknowledged, analyzed, and resolved all critical and high-priority defects, with full responsibility for the fixes and for meeting SLAs.

Worked with the Teradata utilities FastExport, FastLoad, MultiLoad, and BTEQ through UNIX shell scripts.
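
As an illustration of driving BTEQ from a script, here is a minimal Python sketch; the logon string, database, and output path are hypothetical, and the real jobs used UNIX shell scripts in the same spirit.

```python
import subprocess
import tempfile

# BTEQ script: log on, export a report-style extract, log off.
BTEQ_SCRIPT = """
.LOGON tdprod/etl_user,***;
.EXPORT REPORT FILE = /tmp/acct_extract.txt;
SELECT account_id, balance
FROM   edw.accounts
WHERE  status_cd = 'A';
.EXPORT RESET;
.LOGOFF;
.QUIT;
"""

with tempfile.NamedTemporaryFile("w", suffix=".bteq", delete=False) as f:
    f.write(BTEQ_SCRIPT)
    script_path = f.name

# bteq reads its commands from stdin, like `bteq < script.bteq`.
with open(script_path) as fh:
    result = subprocess.run(["bteq"], stdin=fh, capture_output=True, text=True)
print(result.stdout)
```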

Understanding business requirements and handling development and testing of new processes and of enhancements to existing ones.

Supported system testing and production migration by providing bug fixes and migration checklists.

Gathered and translated business requirements into detailed, production-level technical specifications.

Served as an SME and supported the implementation of controls projects for some of the processes.

Environment: Oracle 10g, Informatica, Teradata, UNIX, Tortoise SVN, JIRA 5, ITSM, Peregrine, TOAD, Autosys, Erwin 9.5.

11/2012 to 09/2013: BARCLAYCARD USA, Wilmington, DE

ETL Analyst/Oracle Developer (Contract)

Responsible for requirements analysis, coding, testing, and implementation of various modules.

Gathered and translated business requirements into detailed, production-level technical specifications for new features and for enhancements to existing technical business functionality.

Prepared technical designs for the custom business areas and technical design documents for new inbound and outbound interfaces.

Developed PL/SQL procedures, functions, and packages to migrate data between Oracle databases; troubleshot jobs using the debugging tool.
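
A minimal sketch of that migration pattern, assuming hypothetical schema, database-link, and connection names, with the PL/SQL executed from Python via cx_Oracle:

```python
import cx_Oracle

# Connection details are placeholders.
conn = cx_Oracle.connect("etl_user", "***", "srcdb:1521/ORCL")
cur = conn.cursor()

# Anonymous block in the same shape as the packaged procedures:
# copy today's rows across a database link, then commit.
cur.execute("""
    BEGIN
        INSERT INTO cards.txn_hist
        SELECT * FROM cards.txn_hist@legacy_link
        WHERE  load_dt = TRUNC(SYSDATE);
        COMMIT;
    END;
""")
conn.close()
```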

Automated data loading, extraction, and report generation using the Control-M scheduling tool.

Documented all packages, procedures, and functions involved in the ETL process.

Responsible for performance-tuning activities such as optimizing SQL queries, reviewing explain plans, and creating indexes.
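
A short sketch of that tuning workflow (query, connection, and object names are hypothetical): run EXPLAIN PLAN, then read the formatted plan back through DBMS_XPLAN.

```python
import cx_Oracle

conn = cx_Oracle.connect("etl_user", "***", "dwhdb:1521/DWH")
cur = conn.cursor()

# Populate PLAN_TABLE with the optimizer's plan for the query.
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT c.cust_id, SUM(t.amount)
    FROM   cards.customers c
    JOIN   cards.txns t ON t.cust_id = c.cust_id
    GROUP  BY c.cust_id
""")

# DBMS_XPLAN.DISPLAY returns the formatted plan, one line per row.
cur.execute("SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY)")
for (line,) in cur:
    print(line)
conn.close()
```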

Environment: Oracle 10g, WinCVS and CVS, SDLC tool, IBM DataStage, HP Quality Center, UNIX, shell scripts, PuTTY, TOAD, Control-M, Erwin 9.5.

10/2011 to 09/2012: PROVIDENCE HEALTH & SERVICES, Portland, OR

ETL Specialist II/Informatica Administrator (Contract)

Responsible for administration, development, maintenance, and on-call support for Informatica production in a 24/7 on-call rotation.

Working on service requests from users through Remedy and TFS, and on trouble tickets through ITSM.

Worked with BAs on requirements gathering and on preparing functional and technical specifications.

Designing, developing, deploying, and testing the data warehouse project.

Worked on automation and scheduling using Informatica Scheduler.

Proficient in using Informatica Workflow Manager, Workflow Monitor, and pmcmd (the Informatica command-line utility) to create, schedule, and control workflows, tasks, and sessions.
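
For example, a workflow can be started and monitored from a script with pmcmd; a minimal Python sketch follows, with the service, domain, folder, and workflow names as hypothetical placeholders.

```python
import subprocess

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC",        # Integration Service name
    "-d", "Domain_PROD",     # Informatica domain
    "-u", "etl_user", "-p", "***",
    "-f", "DW_LOADS",        # repository folder
    "-wait",                 # block until the workflow completes
    "wf_daily_load",
]
rc = subprocess.run(cmd).returncode
print("workflow return code:", rc)   # 0 indicates success
```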

Environment: Informatica PowerCenter 9.1.0, Informatica Data Transformation Studio, UNIX AIX, Oracle 11g, Erwin 4.0/3.5.2, TOAD 11.0.0.116.

05/2010 to 10/2011: MULTIPLAN, New York, NY

Data Warehouse Analyst/Informatica Admin (Contract)

Responsible for administration, development, support, and maintenance of the ETL processes built with Informatica PowerCenter.

Responsible for creating business requirements and technical specifications, and for support and administration of Informatica PowerCenter 8.6.1 and PowerExchange 8.6.1.

Installed and created repository services, users, user groups, folders, and folder access through the Administrator Console; managed deployment groups, labels, and version control, and moved code across environments from Repository Manager.

Responsible for administration, development, support, and maintenance of PowerExchange for Oracle Change Data Capture.

Writing design documents, preparing mapping specifications, and developing Informatica code using PowerCenter and PowerExchange on Oracle and the Linux operating system.

Assisting in reviewing and revising policies, procedures, and standards for data warehouse projects and the Informatica ETL process.

Environment: Informatica PowerCenter 9.1.0, PowerExchange 9.1.0, Oracle CDC, PowerExchange Navigator, Informatica Data Explorer (IDE), Informatica Data Quality (IDQ), Oracle 10g, Red Hat Linux 64-bit, TOAD 8.6.1.0.

08/2009 to 04/2010: HAWAII MEDICAL SERVICES ASSOCIATION, Honolulu, HI

Sr. Informatica Developer (Contract)

Worked extensively on design, coding, testing, and documentation; also worked with the Business Analyst team on requirements gathering and on preparing functional and technical specifications.

Used Informatica PowerCenter for ETL (extraction, transformation, and loading) of data from heterogeneous source systems.

Designed and developed Oracle PL/SQL and shell scripts for data import/export, data conversions, and data cleansing.

Worked on debugging, troubleshooting, and documentation of the application layer; tuned SQL queries in SQL overrides for better performance.

Responsible for mapping migration, metadata maintenance, and security using Repository Manager.

Environment: Informatica PowerCenter 8.6.0, PowerExchange 8.6, PowerExchange Navigator, UNIX AIX, Oracle 10g, SQL Server 2005, Erwin 4.0/3.5.2, TOAD 8.6.1.0.

01/2009 to 07/2009: BLUE SHIELD CALIFORNIA, San Francisco, CA

Sr. Data Warehouse Developer (Contract)

Interacting with Facets and Rosetta SMEs to understand the business model.

Understanding the existing R2 and R3 Rosetta data marts, extracting Facets application data, and loading it into the existing R2 and R3 marts for reporting purposes.

Read data from the FACETS and Rosetta IA data models, then wrote mapping documents and ETL technical specification documents covering program development, logic, coding, changes, and corrections.

Designed and developed Informatica mappings, sessions, and workflows, along with mapping documents, ETL technical specification documents, and business rules, to load data from FACETS into the Rosetta IA data model.

Responsible for writing unit test cases and performing unit tests; involved in tuning sessions and workflows for better performance.

Environment: Informatica PowerCenter 8.6, UNIX AIX, Oracle 10g, FACETS 4.5, Erwin 4.0/3.5.2, TOAD 8.6.1.0.

07/2008 to 12/2008: EXCELLUS BCBS, Buffalo, NY

Sr. Programmer Analyst

Read data from the FACETS and HpXr data models, then wrote ETL functional design documents and ETL technical specification documents covering program development, logic, coding, changes, and corrections.

Worked with FACETS, TriZetto's extended-enterprise solution that assists payer organizations in staying ahead of trends in the employer benefits market.

HpXr uses the accelerator to develop a customized base for operational reporting, feeding downstream databases, or integrating into an existing reporting environment.

Environment: Informatica PowerCenter 8.6, UNIX/Windows 2000, SQL Server 2005, Oracle 10g, FACETS 4.5, HpXr.

01/2007 to 05/2008: INVESTORS BANK AND TRUST, Boston, MA

Informatica Developer

Responsible for developing, supporting, and maintaining ETL processes using Informatica PowerCenter.

Translated high-level design specs into ETL code following mapping standards.

Performed reverse engineering of the legacy application from DDL scripts in Erwin and developed logical and physical data models for central model consolidation.

Parameterized all variables and connections at all levels in UNIX.

Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.

Parameterized all variables and connections at all levels in Windows NT.

Environment: Informatica PowerCenter 8.5, Workflow Manager, Workflow Monitor, SQL Server 2005, PowerExchange CDC, UNIX AIX, Erwin 4.0/3.5.2, TOAD 8.6.1.0, PL/SQL, flat files, XML, Oracle 10g/9i, Teradata V2R3.

12/2005 to 12/2006: GREAT WEST HEALTHCARE, Denver, CO

Programmer Analyst

Designed and developed Informatica mappings and sessions based on user requirements and business rules to load data from source flat files and RDBMS tables into target tables.

Developed and scheduled workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results in Workflow Monitor.

Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.

Helped fine-tune transformations and mappings for better performance by tuning at the source, transformation, target, and workflow levels.

Performed unit testing and was involved in tuning sessions and workflows for better performance.

Worked with Harvest version control; all work was fully HIPAA-compliant.

Environment: Informatica PowerCenter 7.1.4, Workflow Manager, Workflow Monitor, FACETS 4.5, Erwin 4.0/3.5.2, TOAD 8.6.1.0, PL/SQL, flat files, Oracle 9i.

05/2005 to 11/2005: HEALTH MANAGEMENT SYSTEMS, Dallas, TX

ETL Developer

Worked extensively in Informatica Designer to design mappings extracting data from DB2 instances, flat files, and DSS sources, based on analysis of the sources, the requirements, and the existing OLTP system, and on identification of the required dimensions and facts from the database.

Responsible for developing and maintaining ETL processes using Informatica PowerCenter; used PowerExchange to connect to mainframes and create VSAM files.

Worked with various Informatica PowerCenter tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations) while creating reusable transformations, mapplets, and mappings.

Created shortcuts, reusable transformations, and mapplets for use in multiple mappings.

Worked with COBOL programs to retrieve data from mainframes.

Environment: Informatica PowerCenter 7.1.1, Workflow Manager, Workflow Monitor, flat files, COBOL/400, DB2 UDB, PowerExchange (Striva 5.0 PowerConnect).

REFERENCES: Available upon request.


