Eswari Sudha *****.******@*****.***
Phone: +1-913*******
Professional Experience
Previously associated with Cognizant Technology Solutions India Pvt. Ltd. as a Programmer Analyst, working in the role of ETL Developer.
●More than 2 years of overall experience in Development and Enhancement of business applications
●Strong technical skills in Unix shell scripting, Informatica PowerCenter, and Greenplum and Oracle databases
●Experience in workload automation tools such as BMC Control-M 8.0 and Control-M Enterprise Manager 6.4
●Used the Control-M scheduler to create jobs and schedule their run timings.
●Responsible for on-call production support and issue resolution using session logs and workflow logs.
●Responsible for enforcing best practices such as naming conventions, performance tuning, and error handling.
●Experience in leading a team, taking ownership of responsibilities, interacting with clients and senior management, and mentoring new team members
●Interacted with business users as well as other IT members outside the team; involved in preparing specifications and in developing and testing scripts.
●Designed and developed Operational Data Stores, Slowly Changing Dimensions, and staging areas.
●Designed and developed complex aggregate, join, lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica Power Center 9.x tool.
●Developed complex mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
●Proficient in using Informatica Workflow Manager, Workflow monitor to create, schedule and control Workflows, Tasks, and Sessions.
●Extensively worked with SQL queries.
●Loaded data into Greenplum database instances using external tables, SQL COPY and INSERT commands, and parallel load utilities.
●Implemented table partitioning for handling very large tables.
●Extensively worked on ELT performance tuning using multiple gpfdist instances
●Extensively used gpfdist and gpload utilities for data load and unload.
●Distributed and stored data in Greenplum using distribution keys and partitioning
●Worked on optimizing indexes for better performance and maintainability
●Improved query performance by following performance-enhancement tips and database best practices.
●Knowledgeable in backup and restoration of Greenplum databases using the gpcrondump, gpbackup, and gprestore utilities
●Interacted with Informatica admins, Data Architects, and DBAs to troubleshoot problems reported in high-volume testing (150-200 million records).
●Able to analyze and diagnose job failures, perform impact analysis of a data issue, trace it back to the source system, and resolve data quality issues
●Experienced in preparing outage plans to support production jobs during DB server upgrades, outages, and Informatica upgrades
●Experienced in L2 and L3 24x7 production support environments, using Remedy as the ticketing tool
Technical Skills
Hardware / Platforms
Windows XP, Windows 7, Windows 8, Windows 10, Ubuntu
Database
Oracle, Greenplum, MySQL
Programming Languages
SQL, PL/SQL, PostgreSQL, UNIX Shell Scripting
ETL Tools
Informatica Power Center 9.1.0, 9.5.0 & 9.6.0
Other Tools
SQL Developer 3.2.4, pgAdmin, PuTTY, Control-M
Other Platforms
Citrix Receiver and Citrix XenApp
Educational Qualifications
●December 2014: Master of Technology in Computer Science and Engineering, GMR Institute of Technology, JNTU Kakinada (CGPA: 8.29)
●June 2011: Bachelor of Technology in Computer Science and Engineering, Sri Sunflower College of Engineering & Technology, JNTU Kakinada (74%)
Professional Experience
Company: Cognizant Technology Solutions India Pvt. Ltd.
Designation: Programmer Analyst
Address: CNC Naveda Campus, Kochi, India
Tenure: August 2015 to September 2017
Project 1: FPL AMI Support
Role: ETL Developer
Team Size: 10
Technology: Informatica 9.1.0, 9.5.0 & 9.6.0, Unix Shell Scripting, Greenplum DB, Oracle DB, Control-M
Project Objective
Cognizant has been engaged with Florida Power and Light to provide 24x7 services for AMI applications. Advanced Metering Infrastructure (AMI) systems measure, collect, and analyze energy usage and communicate with metering devices such as electricity meters, either on request or on a schedule. These systems include hardware, software, communications, consumer energy displays and controllers, customer-associated systems, Meter Data Management software, and supplier business systems. The AMI ETL Support team supports the Data Warehouse production environment, monitoring and detecting production issues with the aim of minimizing customer impact. The scope of the support work includes supporting the AMI DW (RCSDW, RRD, CES, and AMI Analytics) jobs on a 24x7 basis.
Responsibilities
●Created and enhanced the Informatica mappings, sessions and workflows according to the requirement.
●Tested new and changed system components with proper documentation.
●Handled source files such as XML, flat files, and CSV
●Prepared migration documents for moving the application across environments.
●Wrote SQL overrides in Source Qualifier transformations according to business requirements.
●Strong in writing and tuning SQL queries
●Created Sessions and Workflow for designed mappings.
●Used Workflow Manager for creating, validating, testing, running, and scheduling sessions.
●Extracted, transformed, and loaded (ETL) source data into the respective target tables to build the required data marts.
●Responsible for providing 24/7 production support for daily job and ETL loads.
●Created an automated ETL summary report containing summary information for all loading activities completed each day.
●Worked closely with DBA team to regularly monitor system for bottlenecks and implement appropriate solutions.
●Created UNIX scripts for file transfers and automated reports.
●Involved in preparing outage plans to support production jobs during DB server upgrades, outages, and Informatica upgrades
●Loaded data into Greenplum database instances using external tables, SQL COPY and INSERT commands, and parallel load utilities.
●Implemented table partitioning for handling very large tables.
●Extensively worked on ELT performance tuning using multiple gpfdist instances
●Extensively used gpfdist and gpload utilities for data load and unload.
●Distributed and stored data in Greenplum using distribution keys and partitioning
●Worked on optimizing indexes for better performance and maintainability
●Improved query performance by following performance-enhancement tips and database best practices.
●Knowledgeable in backup and restoration of Greenplum databases using the gpcrondump, gpbackup, and gprestore utilities
Project 2: Data Warehouse Sapphire Retail Mart - Building ETL Processes
Role: Developer
Team Size: 6
Technology: Informatica 9.1, Oracle DB
Project Objective
Sapphire Retail Mart, a manufacturing company, requires a centralized data repository and a data warehouse to enhance decision making, to sustain, promote, and increase its sales, and to keep its records updated on a monthly basis. The project builds a centralized Data Warehouse to generate monthly reports and forecast product sales across the US states of Texas, Pennsylvania, and Virginia.
Responsibilities
●Developed Informatica objects and Oracle tables
●Involved in ETL designing process
●Implemented ETL processes using Informatica to load data from Flat Files to target Oracle Data Warehouse database.
●Extract Transform and Load (ETL) source data into respective target tables to build the required data marts.
●Designed and developed Operational Data Stores, Slowly Changing Dimensions, and staging areas.
Rewards and Recognition
●Received the Chanakya award for project delivery excellence, Quarterly Awards, Q4 2016