
Resume of CHINTHAM SATYA

SUMMARY OF QUALIFICATIONS

Over eleven years of IT experience with extensive Data Warehousing implementations across various industries.

Strong experience in the analysis, design, development, and testing of Data Warehouse applications.

Strong data warehousing ETL experience using Informatica Data Quality (IDQ) and Informatica PowerCenter 10.x/9.x/8.x/7.x client tools: Mapping Designer, Workflow Manager/Monitor, and Repository Manager.

Extensively used DAX (Data Analysis Expressions) functions to create calculations and measures in Tabular Models.

Expert in ODS and OLAP implementations, including project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, testing, deployment, and production support.

Participated in sprint planning, daily scrum standup, and sprint retrospective meetings.

Participated in the development and grooming of project backlog stories (requirements) with business analysts and product owners.

Entered and updated issues (stories and tasks) in the JIRA agile tool.

Strong knowledge of Software Development Lifecycle and Methodologies like Waterfall and Agile.

Expert in relational databases such as Oracle, Teradata, SQL Server and DB2.

Expert in several file formats, including CSV, fixed-width, XML, and JSON files.

Worked with the IDQ Analyst tool for data profiling, scorecards, and data domain discovery.

Worked with the Informatica Data Quality (IDQ) Developer tool for data analysis, data cleansing, match algorithms, standardization, and address standardization.

Worked on AWS cloud implementations using Amazon S3 buckets, Amazon EC2, ELB, Database Migration Service (DMS) to move data from on-premises systems to the cloud, Glue for ETL activities, and Lambda.

Experience working with business analysts during the requirements analysis phase to study and understand requirements and translate them into ETL code, including interacting with business users in client meetings.

Expert in performance tuning: identifying and resolving performance bottlenecks in DWH applications.

Involved in System Integration Testing, User Acceptance Testing and Functional Testing.

Experienced in production support: monitoring the daily batch cycle and providing immediate fixes in case of failures.

Experience in using Automation Scheduling tools like Control-M, Autosys and UC4 to organize and schedule jobs.

Experience in using UDeploy to migrate the code automatically to higher environments.

Experienced in UNIX work environment, file transfers, job scheduling and error handling.

Excellent interpersonal and communication skills, with experience working with senior-level managers, business users, and developers across multiple disciplines.

Contribute to ETL code reviews with the goal of ensuring standardization, best practices, consistency, and quality.

Proactive team player, capable of handling responsibilities, who has also played an individual contributor role.

Flexible and versatile in adapting to new environments, with a strong desire to keep pace with the latest technologies.

Experience with onshore and offshore development/support models.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, Informatica Data Quality (IDQ), PowerExchange, and Talend.

BI Tools: Tableau.

Data Modeling: Erwin.

Methodologies: Waterfall, Agile.

Database: Oracle, Teradata, DB2, SQL Server, MongoDB, DynamoDB.

File Formats: XML, flat files, JSON, WSDL, and Excel.

Cloud Services: AWS cloud suite.

Operating System: Windows, UNIX.

Programming Languages: SQL, PL/SQL, Shell scripts, C++, Java, Python, BTEQ.

Scheduling Tools: Control-M, Autosys, UC4 & Informatica Scheduler.

Others: TOAD, SQL Developer, SQL Assistant, ServiceNow, JIRA, AWS S3, TFS, UDeploy, WinSCP and Remedy.

PROFESSIONAL EXPERIENCE

AEGON, Cedar Rapids, IA Dec 14 - Present

Sr. ETL Developer

AEGON Asset Management System:

Aegon Asset Management (AAM) is an asset management firm that manages the assets of various business clients. Aegon invests its clients' funds in various markets and provides them with the growth and performance details of their investments. To serve this purpose, the Investment Data Warehouse (IDW) was built to extract and hold financial information received from multiple investment source systems. Data is cleansed, transformed, and validated for accuracy before being stored in the IDW's central repository. The system gives the business units a consolidated view of data for ad-hoc querying and reporting.

Responsibilities:

Work with Business Analysts in translating business requirements (defined in Sprint planning) into user stories.

Interact with business/product owners on tasks and requirement gathering for new enhancements.

Participate in backlog refinement meetings to produce detailed estimates for new development/enhancements after analyzing the requirements.

Worked closely with Architects and Developers to understand the business process and functional requirements.

Proactively involved in daily standup meetings, planning sessions, backlog grooming sessions and retrospectives.

Created mapping documents to outline data flow from sources to targets.

Develop mappings/sessions/workflows to extract data from source systems, cleanse it, and load it into the warehouse by applying the necessary business logic.

Modified existing mappings for enhancements of new business requirements.

Tuned the Informatica mappings for optimal load performance.

Involved in unit testing and system testing to verify that the data extracted from different source systems loads into the targets accurately, according to the user requirements.

Identify and solve issues on a daily basis with offshore resources and make sure they follow the coding/design standards.

Used DAX (Data Analysis Expressions) functions in creating Power BI dashboards, reports and Tabular Models.

Review the code developed by the team and suggest improvements, if any.

Create reusable components, user defined functions and mapplets for better reusability of code.

Adhere to the quality of deliverables by performing various test cycles, such as unit testing, system integration testing (SIT), and user acceptance testing (UAT), to ensure that defect-free code is promoted to production.

Schedule Informatica workflows in Control-M.

Created custom templates for use within Confluence.

Write shell scripts to trigger the jobs and transfer/drop the files to different locations.

Promote the code to QA/PROD environments using an automated deployment tool (UDeploy) and maintain the required documentation for every release.

Wrote Python scripts to parse XML documents and load the data into the database (see the XML-parsing sketch after this list).

Worked with the Python unittest library to test many Python programs and other code (see the unittest sketch after this list).

Created SQL Procedures, Functions and Shell Scripts for process automation.

Developed automation scripts to test storage appliances in Python.

Worked on the AWS suite for data migration from on-premises systems to the AWS cloud.

Worked on AWS services, including DMS, Glue, EC2, Athena, CFTs, and S3 (see the boto3 sketch after this list).

Provide inputs & vision to ETL development standards and guidelines.

Conduct trainings and Knowledge transfer sessions, as and when needed.

Monitor the daily batch cycle and fix any failures in QA/Production.

Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.
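
The XML-parsing work referenced above could be sketched as follows: a minimal Python example using the standard xml.etree.ElementTree and sqlite3 modules. The element names (record, id, name, amount), table schema, and file names are hypothetical placeholders, not the actual project structures.

    import sqlite3
    import xml.etree.ElementTree as ET

    def load_xml_to_db(xml_path, db_path):
        """Parse an XML document and load its records into a database table."""
        root = ET.parse(xml_path).getroot()
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS records (id TEXT, name TEXT, amount REAL)"
        )
        # <record> and its child tags are hypothetical placeholders.
        rows = [
            (rec.findtext("id"), rec.findtext("name"), float(rec.findtext("amount", "0")))
            for rec in root.iter("record")
        ]
        conn.executemany("INSERT INTO records VALUES (?, ?, ?)", rows)
        conn.commit()
        conn.close()

    if __name__ == "__main__":
        load_xml_to_db("records.xml", "warehouse.db")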
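
Similarly, a minimal sketch of the unittest-based checks mentioned above; the parse_amount helper and its expected values are hypothetical illustrations, not project code.

    import unittest

    def parse_amount(text):
        """Hypothetical helper: convert a currency string like '1,234.50' to a float."""
        return float(text.replace(",", ""))

    class ParseAmountTest(unittest.TestCase):
        def test_plain_number(self):
            self.assertEqual(parse_amount("100"), 100.0)

        def test_thousands_separator(self):
            self.assertEqual(parse_amount("1,234.50"), 1234.50)

    if __name__ == "__main__":
        unittest.main()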
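
Finally, a hedged boto3 sketch of what the S3/Athena portion of such a migration might look like; the bucket, database, and file names are hypothetical, boto3 must be installed, and AWS credentials must be configured. It illustrates the general approach, not the project's actual code.

    import boto3

    BUCKET = "idw-landing-zone"   # hypothetical bucket name
    DATABASE = "idw_staging"      # hypothetical Glue/Athena database

    def stage_file_to_s3(local_path, key):
        """Upload an extracted file to the S3 landing bucket."""
        boto3.client("s3").upload_file(local_path, BUCKET, key)

    def run_athena_query(sql):
        """Start an Athena query against the staged data; returns the execution id."""
        athena = boto3.client("athena")
        resp = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": DATABASE},
            ResultConfiguration={"OutputLocation": f"s3://{BUCKET}/athena-results/"},
        )
        return resp["QueryExecutionId"]

    if __name__ == "__main__":
        stage_file_to_s3("positions.csv", "incoming/positions.csv")
        print(run_athena_query("SELECT COUNT(*) FROM positions"))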

Environment: Informatica PowerCenter 10.1/9.x, UNIX shell programming, Python, Control-M, Oracle, SQL Server, SharePoint, JIRA, BizTalk, TFS, UDeploy, AWS, and IDQ.

KRONOS, MA, India May 11 - Dec 14

ETL Developer

Enterprise Data Warehouse (EDW)

Kronos provides its services through different business applications. The integration platform receives data from these business applications and populates it into the EDW. Informatica is the ETL tool used to populate the EDW (Enterprise Data Warehouse) applications, such as General Ledger, Sub-ledger, Accounts Payable, Accounts Receivable, and invoice data.

Responsibilities:

Work with Business Analyst in translating business requirements into Functional Requirements Document and to Detailed Design Documents.

Supervise and delegate tasks to team members, and provide them with the necessary documentation.

Develop mappings/sessions/workflows to extract data from source systems, cleanse it, and load it into the warehouse by applying the necessary business logic.

Convert the existing PL/SQL blocks to ETL code, to ensure better readability and maintainability.

Develop/Modify the existing stored procedures & materialized views as per the requirements.

Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Created reusable objects such as mapplets, worklets, reusable transformations, and user-defined functions for use across multiple mappings.

Modified existing Informatica mappings to support new business requirements.

Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.

Implemented profiling (column level/table level) and scorecards to reveal and understand the structure of the data.

Cleansed the data from source systems using IDQ-specific transformations such as Labeler, Parser, Match, Merge, Consolidation, and Standardizer.

Exported IDQ mappings into Informatica Designer, created tasks in Workflow Manager, and scheduled the mappings to run with scheduling tools.

Perform various test cycles to ensure the quality of deliverables and provide support.

Support Production/QA batch cycle and ensure that the SLA is maintained.

Analyze and fix the code in PROD, in case of any defects reported.

Tune the performance of jobs that might impact the SLAs of the production cycle.

Provide knowledge transfer (KT) sessions to the team, as and when required.

Involved in creation of reports, dashboards in ServiceNow.

Conduct trainings and Knowledge transfer sessions as and when needed.

Create transition documents and facilitate knowledge transfer sessions to the application support team.

Worked as module lead in the offshore team to solve coding/testing issues.

Environment: Informatica PowerCenter 9.6/8.6, Informatica IDQ, ServiceNow, UNIX shell programming, UC4 scheduling, Oracle 10g, SQL Server 2008, Visio, VSS, Oracle Apps, and SharePoint.

First American Corporation, CA, India Aug 10 - May 11

Associate Projects

Core Logic ADEL

First American Financial Corporation is a United States financial services company and a leading provider of title insurance and settlement services to the real estate and mortgage industries. The objective of this project is to build a warehouse that serves the analytic needs of various analysis/dashboard reports for financial reporting and also provides data to the Property Locator System (PLS).

Responsibilities:

Analyze the existing system, the data process flow, and the loading process into the PLS and the databases.

Develop jobs to process data from disparate source systems, such as flat files, databases, and XML files.

Develop mappings using various transformations, such as Lookup, Sorter, Expression, Union, SQL, Aggregator, Joiner, and Update Strategy.

Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.

Created the mapping design documents for business requirements.

Prepared test plans and test requirements for data/business analysts to verify and validate the code.

Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mappings.

Develop Mapplets and Reusable Transformations for code reusability.

Perform various test cycles by promoting the code to different environments to ensure the quality of code.

Proactively involved in performance tuning at the mapping, session, source, and target levels for mappings that were taking a long time to complete.

Created deployment groups, queries, and labels for Informatica components to migrate the code from development to higher environments.

Worked in a Waterfall methodology.

Create and maintain the necessary documentation that would help in a smoother transition of code to client/other teams.

Environment: Informatica PowerCenter 8.6, mainframe file system, UNIX shell programming, PL/SQL Developer.

Credit-Suisse, Zurich, Switzerland, India Nov 08 - Jul 10

ETL developer

FIT Gateway

Finance IT has a large volume of data coming from a variety of front- and back-office systems, in multiple formats, with mixed data quality and sometimes overlapping datasets. With increased pressure to meet regulatory and compliance requirements, several Finance processes have come under review due to the increased risk of control failures over the past few years.

Responsibilities:

Analyzed the existing system, the data process flow, and the loading process into the FIT Gateway and the databases.

Used the ETL tool Informatica to develop different mappings using transformations such as Aggregator, Lookup, Expression, Update Strategy, Joiner, and Router to load the data into staging tables and then to the target.

Extensively used Informatica client tools: Source Analyzer, Mapping Designer, Mapplet Designer, Repository Manager, and Workflow Manager.

Maintained the ETL shared folder for source and target definitions, reusable transformations, and mapplets.

As part of the ETL process, extracted data from flat files, Oracle, heterogeneous database tables, and Excel spreadsheets into the target DB2 database, and applied business logic in the transformation mappings to insert and update records during loading.

Involved in generating parameter files for all the staging mappings (a minimal sketch follows).
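
PowerCenter parameter files follow a simple [Folder.WF:workflow] section format, so their generation can be scripted; below is a minimal Python sketch. The folder, workflow, and parameter names are hypothetical placeholders, not the project's actual objects.

    def write_param_file(path, folder, workflow, params):
        """Write a PowerCenter-style parameter file for one workflow."""
        with open(path, "w") as f:
            f.write(f"[{folder}.WF:{workflow}]\n")  # one section per workflow
            for name, value in params.items():
                f.write(f"{name}={value}\n")        # one parameter per line

    # Hypothetical folder/workflow/parameter names.
    write_param_file(
        "wf_stg_fit.par",
        "FIT_STAGING",
        "wf_stg_fit_load",
        {"$$RUN_DATE": "2010-07-01", "$$SRC_DIR": "/data/fit/incoming"},
    )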

Environment: Informatica PowerCenter 8.x, DB2, Oracle, and TOAD.

Credit-Suisse, Zurich, Switzerland, India Aug 07 - Oct 08

ETL developer

Algo Collateral Migration

Algo Collateral has been upgraded from V4 to V5. The Algo Collateral application is a collateral management tool that performs the call calculation mechanism for OTC and FX data. The new version, Algo 5, is populated using Informatica for the ETL process, replacing the existing legacy systems that used SQL*Plus and loader utilities.

Responsibilities:

Analyzed the existing system, the data process flow, and the loading process into Algo and the databases.

Converted the SQL*Plus and SQL*Loader jobs into various Informatica mappings.

Developed mappings to connect to various source systems: databases, flat files, and XML files.

Developed mappings using various transformations, such as Lookup, XML Generator, XML Parser, SQL, and Expression.

Converted all the stored procedures to Informatica mappings for processing in the DW.

Developed well-tuned mappings using Informatica Designer after identifying and rectifying bottlenecks in mappings at the source, target, and transformation levels and in sessions.

Tuned Informatica mapping and session performance using various techniques, such as pushdown optimization.

Generated parameter files for all the staging mappings.

Implemented various integrity constraints for data integrity, such as referential integrity using PK and FK relationships.

Used the Task Developer in Workflow Manager to define sessions.

Performed testing and knowledge transfer, and provided technical support.

Environment: Informatica PowerCenter 7.x, SQL, DB2, UNIX shell scripting.

EDUCATION

Master's in Computer Applications, Osmania University, India, 2007


