
Data Developer

Location:
Columbus, OH
Posted:
April 02, 2020


Suresh Chinta

Email: adck5d@r.postjobfree.com | Mobile: +1-646-***-****

Professional Experience:

Over 9.8 years of total IT experience across all phases of the project development life cycle, including analysis, design, development, implementation, modeling, testing, reconciliation, and support for enterprise data warehousing, data mart, data conversion, data integration, and data management projects.

Extensive experience with Informatica PowerCenter (10.x/9.x/8.x) tools and Informatica Cloud.

SUMMARY:

Around 9.8 years of IT experience in requirement analysis, data analysis, application design, application development, implementation, and testing of data warehouse applications using ETL tools such as Informatica PowerCenter 10.1/9.5/9.1/8.6 and PowerExchange 8.6.

Expertise in Dimensional Data Modeling techniques such as Star and Snowflake schemas.

Expertise in design and implementation of Slowly Changing Dimensions (SCD Types 1, 2, and 3) and Change Data Capture (CDC).
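The SCD Type 2 pattern mentioned above can be sketched as follows. This is an illustrative sketch only, not code from any specific project; column names such as `is_current`, `effective_date`, and `end_date` are assumptions.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, today=None):
    """SCD Type 2: expire the current row for the incoming business key,
    then insert the new version as the active row."""
    today = today or date.today()
    for row in dimension:
        if row[key] == incoming[key] and row["is_current"]:
            row["is_current"] = False   # close out the old version
            row["end_date"] = today
    # Insert the new version as the current row.
    new_row = dict(incoming, is_current=True,
                   effective_date=today, end_date=None)
    dimension.append(new_row)
    return dimension
```

In PowerCenter the same logic is typically built with a Lookup on the dimension, an Expression to compare attributes, and an Update Strategy to route the expire/insert rows.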

ETL best practices/methodologies (transformation usage, naming conventions, mapping standards, slowly changing dimensions, deployment groups, versioning, etc.).

Extensive experience with Data Extraction, Transformation, and Loading (ETL) from multiple data sources such as Oracle, DB2, SQL Server, Flat files, CSV files, VSAM files, COBOL files and XML files into a Reporting and Analytical Data Model.

Very good exposure to Change Data Capture using PowerExchange for Oracle.
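In PowerExchange CDC, changes are captured from Oracle redo logs and delivered as records carrying an operation flag plus the row image. As a hedged illustration of the downstream apply step only (the I/U/D tuple shape here is an assumption, not PowerExchange's actual record format):

```python
def apply_cdc(target, changes, key):
    """Apply CDC records to a target table image:
    'I' inserts, 'U' updates (upsert), 'D' deletes."""
    rows = {row[key]: row for row in target}
    for op, record in changes:
        if op == "D":
            rows.pop(record[key], None)   # delete by key if present
        else:
            rows[record[key]] = record    # I and U both take the latest image
    return list(rows.values())
```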

Involved in performance tuning of Informatica mappings, sessions, sources, and targets.

Experienced in writing UNIX shell scripts to automate ETL processes and in using schedulers such as Autosys and Control-M.
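A scheduler-invoked wrapper of this kind typically starts the ETL command, checks the exit code, and fails loudly so the scheduler can alert. A minimal Python analogue of such a wrapper (the actual scripts were shell; any `pmcmd` arguments in real use depend entirely on the environment):

```python
import subprocess
import sys

def run_etl_job(command):
    """Run an ETL command and report success; schedulers key off a
    non-zero exit code, so surface stderr and return False on failure."""
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        return False
    return True
```

In practice the command would be something like `pmcmd startworkflow` with service, folder, and workflow arguments (hypothetical here).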

Excellent working knowledge of multiple platforms, including HP-UX, Windows XP/200x/98, and UNIX-AIX.

Dedicated, self-motivated achiever committed to success and adept at handling multiple tasks in a high-pressure environment.

Excellent communication skills and the ability to work in groups as well as independently.

Expertise using Debugger, Mapping wizards, Workflow Manager and Workflow Monitor.

Experience implementing complex business rules by creating reusable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookup, Router, Rank, Joiner, Update Strategy) and developing complex mapplets and mappings.

Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.
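In Oracle this workflow is `EXPLAIN PLAN FOR <query>` followed by `SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY)`. As a runnable stand-in without an Oracle instance, SQLite's analogous `EXPLAIN QUERY PLAN` illustrates the same idea of inspecting the optimizer's plan before tuning (table and index names below are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust_id INTEGER)")
conn.execute("CREATE INDEX ix_orders_cust ON orders (cust_id)")

# Ask the optimizer how it would run the query; with the index in place
# the plan detail should name ix_orders_cust instead of a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE cust_id = 42"
).fetchall()
for row in plan:
    print(row[-1])
```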

Experience in production support, resolving critical issues and collaborating with teams to ensure successful resolution of reported incidents.

Experience working as a technical coordinator in an onshore/offshore model.

Developed Informatica mappings to populate data from an Oracle OLTP database into Oracle and Teradata data warehouses.

Hands-on experience creating complex mappings.

Application deployments, migrations, and admin activities.

Experience in the design and development of business intelligence solutions using OBIEE.

Created drill-through reports, drill-down reports, sub-reports, cascading parameters, linked reports, matrix reports, etc.

Expertise in working with RDBMS concepts, including E/R diagrams, normalization and de-normalization, constraints, and data import/export.

Worked on projects using Agile and Waterfall model methodologies.

Professional Experience:

Working at G2o, Columbus, as a Sr. ETL Developer from Jan 2020 to present.

Worked at INFOSYS, New Jersey, as a Technology Lead from Jan 2012 to Dec 2019.

Worked for IBM as an Informatica Developer from Aug 2010 to Dec 2011.

EDUCATION:

Bachelor’s degree, JNTU, India, 2004 – 2007.

PROFESSIONAL SKILLS:

RDBMS

Oracle 11g/10g/9i, SQL Server 2012/2010, Teradata 12/13, MySQL.

ETL Tools

Informatica PowerCenter (10.1/9.5/9.1/8.6), Informatica Cloud, PowerExchange (CDC), BDM, IBM CDC

BI/OLAP Tools

OBIEE, Tableau

Operating Systems

Windows 2008/XP/Vista/7.

Data Modeling

Erwin, IBM Rational Rose, Microsoft Visio 2004/2007/2010.

Programming Languages

SQL, PL/SQL, Shell Scripting

Process/Methodologies

Data Profiling, Data Quality, Change Data Capture (CDC), Pushdown Optimization (PDO).

Schedulers

Control-M, Autosys, DAC, Mainframe

Other Tools

Toad, Oracle SQL Developer, Putty.

PROFESSIONAL EXPERIENCE:

Alliance Data Services, Columbus, OH Jan’20 – Current

Data Integration Developer/Sr. ETL Developer

Roles & Responsibilities:

Design and develop EDW data marts for ADS as part of a team.

Communicated with business team and gathered requirements.

Incorporated identified factors into Informatica mappings to build Data Mart.

Designed complex Informatica mappings and workflows.

Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Mart.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, Sessions, Workflows and database.

Created a parameter file for each application and incorporated database connections.
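PowerCenter parameter files are plain text with a section header per workflow followed by name=value assignments. A small generator sketch (folder, workflow, and connection names below are hypothetical, not from the actual project):

```python
def build_param_file(folder, workflow, params):
    """Render a PowerCenter-style parameter file section: a
    [Folder.WF:Workflow] header followed by name=value lines."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"
```

Keeping one file per application lets the same workflow run against different database connections per environment.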

Created workflows/worklets for each application and refreshed the list of tables.

Design, develop, and maintain Data Vault objects (landing area) using PowerCenter.

Develop, test, and maintain ETL code using Informatica PowerCenter 10.2.

Made DDL changes on database tables as per Business requirements.

Worked with Shell Scripts (UNIX).

Worked with existing Python scripts and extended them for data validation and comparison.

Used shell scripts to run ETL jobs.

Used Python scripts to create CDC and non-CDC mapping layouts for migration.

Used Python for data comparison and validation as part of testing.
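Source-to-target validation of this kind reduces to diffing two row sets on the business key. A hedged sketch of the idea (column and key names are made up, not from the actual scripts):

```python
def compare_datasets(source, target, key):
    """Validate a target against its source: report keys missing from
    the target, extra keys in the target, and mismatched rows."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    return {
        "missing": sorted(src.keys() - tgt.keys()),
        "extra": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }
```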

Generate test case docs using python scripts.

Worked extensively with SQL Server and PostgreSQL (Yellowbrick DB).

Used JIRA to track and maintain story cards.

Environment: Informatica 10.2, SQL Server, Yellowbrick, DB2, Netezza, Python, UNIX.

TIFFANY & CO, New Jersey Jul’18 – Dec’19

Data Integration Developer/Technology Lead

Roles & Responsibilities:

Involved working closely with the client on the Tiffany Online Data Feed application.

The activities mainly include development and production support: working on user incidents and user requests, monitoring, batch processing, status tracking, status reporting, and coordinating with the client.

Enhancement activities involve analysis, design, coding, testing, implementation, deployment, and transition to support.

Documentation for each type of activity includes support documentation, requirement specification documents, technical design documents, impact analysis documents, business continuity documents, etc.

Work on system enhancements as required by the business.

Analyze and continuously improve systems, processes, and the nature of service provided to ensure customer satisfaction.

Use Infosys automation platforms and Infosys NIA capabilities to develop new automation.

Automation of standard recurring issues to provide permanent solutions.

Optimized operations through automation, elimination, shift-left, and continuous improvement, reducing resolution time, simplifying support activities, and lowering cost.

Environment: Informatica PowerCenter 10.1, IBM CDC, Informatica PowerExchange for Oracle, Teradata, WinSQL, UNIX shell scripts

Neiman Marcus, Texas Jan’18 – Jun’18

Data Integration Developer/Technology Lead

Roles & Responsibilities:

Worked as an ETL Architect and participated in requirement analysis, code design, mapping development, testing, and data loading from the Dev to Prod environments.

Successfully managed a number of projects simultaneously.

Designed data models and helped the dev team to convert functional requirements into technical requirements.

Involved in performance tuning and optimization of Informatica mappings, using features like pipeline and partition parallelism and data/index caches to manage very large volumes of data.

Maintained issue logs and tracker updates for all assigned apps, including defect tracking and test case management. Applied Change Data Capture (CDC), reusable mapplets, and transformations in Informatica; implemented Type 2 logic and imported sources from legacy systems using Informatica PowerExchange (PWX).

Implemented data conversion rules in mapping designs based on the mapping specification document.

Performed performance tuning to handle large customer volumes for all three states.

Interacted with stakeholders to discuss issues in the files they provided so that there were no delays in deliverables.

Interacted and communicated with business analysts, project managers, developers, internal/external clients, and senior management to manage issues throughout the software testing life cycle using Agile/Scrum methodologies.

Toyota, Torrance, CA Apr’17 – Dec’17

Data Integration Developer/Technology Lead

Roles & Responsibilities:

Involved in Informatica upgrade project from 9.6 to 10.2.

Migrated Informatica objects from the old server to the new server as part of the upgrade.

Created, tested, and monitored DRT jobs in Informatica.

Used REST API to access metadata information in Informatica PowerCenter.

Created DSS tasks in Informatica Cloud.

Integrated data from Salesforce into our DWH.

Environment: Informatica PowerCenter 10.2, Informatica 9.6, SQL Server Management Studio 2008 and 2012, Windows Server.

Cisco, San Jose, CA Jan’16 – Mar’17

Data Integration Developer/Technology Lead

Roles & Responsibilities:

Involved in performance tuning and optimization of Informatica mappings, using features like pipeline and partition parallelism and data/index caches to manage very large volumes of data.

Maintained Issue logs, Tracker updates for all the assigned apps, defect tracking and test case management.

Applied Change Data Capture (CDC), reusable mapplets, and transformations in Informatica; implemented Type 2 logic and imported sources from legacy systems using Informatica PowerExchange (PWX).

Implemented data conversion rules in mapping designs based on the mapping specification document.

Performed performance tuning to handle large customer volumes for all three states.

Cisco, Hyderabad, India Jan’12 – Dec’15

Data Integration Developer/Technology Analyst

Roles & Responsibilities:

Extracted source definitions from various databases, such as Oracle and SQL Server, into the PowerCenter repository.

Communicated with business team and gathered requirements.

Incorporated identified factors into Informatica mappings to build Data Mart.

Designed complex Informatica mappings and workflows to be restartable after a failure.

Used Informatica Designer to create complex mappings with transformations like Filter, Router, Connected & Unconnected Lookup, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.

Oncore Electric, Pune, India Aug’10 – Dec’11

Data Integration Developer/Technology Analyst

Roles & Responsibilities:

Handled all Informatica admin activities, such as creating folders, creating users, authenticating users, and creating groups for all environments (DEV, TST, UAT, PROD) as an Informatica Administrator.

Handled tickets, including Sev1 and Sev2 priorities with 2-hour and 8-hour SLAs respectively.

Have been working on CRs (change requests) as well, including activities such as:


