
Senior ETL Developer

Location:
Plano, TX
Posted:
November 09, 2017


ADITYA DEVARASETTY

Sr. ETL Developer

Email: ac272h@r.postjobfree.com | Phone: 210-***-****

PROFESSIONAL SUMMARY:

●Skilled IT professional with 7 years of experience in Data Integration & Reporting technologies; currently working as a Sr. ETL Technical Lead

●5+ years of experience with Informatica PowerCenter and 3+ years of experience with DataStage

●Good experience in System Analysis, Design, Development, Testing, Implementation, and Maintenance of Data Warehouses/Data Marts

●Experience in Insurance, Financial Services and Marketing & Sales domains

●Experience with emerging technologies such as Big Data; good knowledge of the Hadoop platform and application framework, HDFS, Hive, and Python & R programming

●Experience in reporting tools such as SAP BusinessObjects, Design Studio, and Lumira

●Experience in Dimensional Data Modeling (Star Schema and Snowflake Schema) and Logical and Physical Design

●Experience in all aspects of the SDLC process - requirement analysis, functional specification, design, development, testing, packaging, and support/maintenance

●Good experience in Extraction, Transformation and Loading (ETL) processes

●Expertise in Informatica performance tuning (sources, mappings/jobs or sequences, targets, and sessions); well acquainted with the Informatica Designer, Workflow Manager, and Workflow Monitor components

●Experience with Data Warehousing concepts such as Data Cleansing, Slowly Changing Dimensions (SCD), and surrogate key assignment

●Experience with RDBMSs and strong SQL knowledge in DB2, Netezza, and SQL Server, including Triggers, Views, and Stored Procedures

●Designed technical processes as a Team Lead by using internal modeling and working with analytical teams to gather requirements and create specifications

●Extensively implemented Error Handling concepts; strong testing, debugging, and performance-tuning skills for targets, sources, and transformation logic

●Experienced in UNIX shell scripting (Bash) for file manipulation, scheduling, and text processing

●Excellent experience in scheduling & monitoring the production jobs using CONTROL-M

●Strong experience in understanding business applications, business data flows and data relations

●Team player with good communication, writing, and technical documentation skills; a self-motivated individual with strong analytical and problem-solving skills

●Coordination skills related to managing offshore resources
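As an illustration of the SCD Type 2 and surrogate-key concepts listed above, here is a minimal Python sketch; the table shape and column names (`sk`, `is_current`, `start_date`) are hypothetical examples, not drawn from any specific project in this resume:

```python
from datetime import date

# Slowly Changing Dimension Type 2 sketch: when an incoming record
# differs from the current dimension row for the same business key,
# the current row is end-dated and a new row gets a fresh surrogate key.
def apply_scd2(dimension, incoming, business_key, today=None):
    """Merge incoming records into a list-of-dicts dimension table."""
    today = today or date.today()
    next_sk = max((r["sk"] for r in dimension), default=0) + 1
    current = {r[business_key]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        old = current.get(rec[business_key])
        if old and all(old.get(k) == v for k, v in rec.items()):
            continue  # no attribute change: keep the current row as-is
        if old:
            old["is_current"] = False  # end-date the previous version
            old["end_date"] = today
        dimension.append({**rec, "sk": next_sk, "is_current": True,
                          "start_date": today, "end_date": None})
        next_sk += 1
    return dimension
```

In a real ETL tool this logic would live in a lookup plus update-strategy transformation rather than Python, but the row-versioning idea is the same.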

TECHNICAL SKILLS:

●ETL Tools : Informatica Power Center, IBM InfoSphere DataStage

●Databases : Oracle 10g, SQL Server 2005, DB2, MS Access, Netezza

●Operating Systems : UNIX (Linux, AIX, HP-UX, Solaris), MS DOS and Windows NT/XP

●Scheduling tool : BMC Control-M

●Languages : PL/SQL, T-SQL (DDL, DML), XML, JavaScript.

●Database Tools : SQL Developer, Squirrel, Aginity

●BI Tools : SAP Business Objects, Lumira and Design Studio

●Modeling Tool : MS Visio

●Version & Source Control : Borland StarTeam

●Defect/ Task Tracker : IBM Rational Team Concert (RTC)

ETL Frameworks:

●DataStage Error Handling Process – error-handling process built using a pluggable infrastructure model

●ETL Framework – in-house framework designed for ETL jobs to handle processing tasks: Data Quality, job maintenance, and server maintenance
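The "pluggable" error-handling model above can be sketched in Python as handlers that register for an error category, with the job wrapper dispatching failed records to whichever handler is plugged in; all names here (`handler`, `write_reject`, the validation rule) are illustrative assumptions, not the actual framework's API:

```python
# Registry of plug-in handlers keyed by error category.
HANDLERS = {}

def handler(category):
    """Decorator registering a plug-in handler for an error category."""
    def register(fn):
        HANDLERS[category] = fn
        return fn
    return register

@handler("reject")
def write_reject(record, reason):
    # A real framework would write this to a reject file or error table.
    return f"REJECTED {record}: {reason}"

def process(records, validate):
    """Run validation; route failures through the pluggable handler."""
    loaded, errors = [], []
    for rec in records:
        ok, reason = validate(rec)
        if ok:
            loaded.append(rec)
        else:
            errors.append(HANDLERS["reject"](rec, reason))
    return loaded, errors
```

The point of the pluggable design is that swapping the reject behavior (file, table, alert) only means registering a different handler; the job code itself does not change.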

CAREER PROFILE

TCS - United Services Automobile Association (USAA)

Timeline & Location: Feb ’15 – present; Plano, Texas

Responsibilities:

●Involved in all phases of building operational systems: analyzing business requirements, ETL process design, data modeling, code development, and testing

●Led the team in architecting, developing and implementing frameworks for Integration & reporting process across the enterprise.

●Developed mappings, workflows/jobs, and activities for the daily process of loading heterogeneous data into the data warehouse; sources include tables, delimited flat files, XML files, and Salesforce data

●Developed Informatica Mappings, Mapplets, reusable Transformations, tasks, sessions, and Workflows for the daily process of loading heterogeneous data into the data warehouse; sources include delimited flat files and DB2 tables

●Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update strategy, Union, SQL and Sequence generator.

●Improved and enhanced various jobs across different cycles using a reusable/common-jobs technique; extensively used run-time parameters to pass in job-related values

●Worked extensively with DB teams to partition tables and tune SQL queries for better performance

●Analyzed and identified the workflows that could be run concurrently for better performance

●Added appropriate dependencies at the job and batch-process level, and configured quantitative resources for each database/resource pool in the scheduling tool to avoid deadlocks, timeouts, and connection issues

●Integrated data from various sources (such as member data and products from different systems); used DataStage comprehensively to load historical data from various tables for different Lines of Business

●Maintained the integrity of UNIX components by checking code in and out of StarTeam/RTC, code-versioning tools

●Prepared Unix Shell Scripts for scheduling the jobs using Control-M.

●Created Wisdom solution, ETL overview, and code walkthrough documents

●Migrated Informatica code from Dev through Test to Prod via the release process, ensuring unit testing, integration testing, and job & environment parameter testing along the way

●Actively participated in discussions and implemented Error Handling process

●Performed end-to-end operations and maintenance: iteration planning, story writing with the scrum master, defining problem/project scope, writing measurable goals and acceptance criteria, design, development, testing, retrospective meetings, code-move procedures, implementation, warranty support, and maintenance

●All projects have been delivered using Agile methodology
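The dependency analysis described above (identifying workflows that can run concurrently, and adding job-level dependencies in the scheduler) can be sketched as grouping jobs into "waves," where every job in a wave has all its predecessors finished; the job names and dependency map below are hypothetical, not from any actual batch cycle:

```python
# Group jobs into waves of concurrently runnable work, given a
# dependency map like a scheduler (e.g. Control-M) would hold.
def concurrency_waves(deps):
    """deps: {job: set of jobs it depends on}. Returns a list of waves."""
    remaining = {j: set(d) for j, d in deps.items()}
    waves = []
    while remaining:
        # Jobs with no unmet dependencies can all start together.
        ready = sorted(j for j, d in remaining.items() if not d)
        if not ready:
            raise ValueError("circular dependency detected")
        waves.append(ready)
        for j in ready:
            del remaining[j]
        for d in remaining.values():
            d.difference_update(ready)
    return waves
```

This is just a topological sort by levels; a scheduler additionally enforces resource pools so that "can run concurrently" does not exhaust database connections.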

Domain: Financial Advice & Solutions Group

Technologies & Tools: Informatica 9.6, Oracle, Netezza, Unix, Business Objects, Design Studio, Control-M, SQL Developer, MS Visio, RTC

Timeline & Location: Feb ’15 – present; Plano, Texas

Projects:

Origination & Destination – origination of a product, from premium payment through to destination payout

Cash Flow – Money Movement – to determine cash in, cash out, and money flow within the organization

Member Cohorting – to provide member behavioral analytics, member dropout views, and member channel hopping

Carrier Switching – to determine how many members switched between carriers in a given time frame

Life & Annuity Sales Funnel – a funnel representing the flow of a product from store session to product acquisition, and the ratios between stages

Duration Time and Product Bundling – the time elapsed between each phase of acquiring a product, from first store session to product acquisition

Conversion of source system from ABC to XYZ – redesigning warehouse/mart tables to include new source system changes

Migration of ETL jobs from Informatica to DataStage – conversion of legacy processes to the latest DataStage version

Informatica migration from 9.1 to 9.6 – Informatica tool upgrade; end-to-end testing in the new environment and code sync-up activities

Production Support & Maintenance – provided 24/7 support for production batch cycles

Domain: Marketing & Member Experience Team

Technologies & Tools: Informatica, HDFS, Hive, Oracle, Netezza, Unix, Business Objects, Design Studio, Control-M, SQL Developer, MS Visio, RTC

Project: EIA Retirement (Bank Pre-Approvals & Mortgage Feeds)

Timeline & Location: May’14 – Sep’ 14; Chennai, India

Informatica to DataStage migration project. The Enterprise Integration Area (EIA) application is being retired as part of modernization and to improve the member experience. EIA is the main data source for major applications such as marketing, bank pre-approvals, mortgage, and P&C member data. Marketing applications moved off EIA as part of the modernization of the marketing & sales process, which influenced other applications to move out of EIA as well, to cut down the large infrastructure costs. This phase of the project redesigns and redevelops the bank pre-approval and mortgage processes using DataStage.

Project: DataStage upgrade 8.1 to 9.1 & Framework upgrade – Marketing/Sales Data Marts.

Timeline & Location: Dec’13 – Apr’14; Chennai, India

Marketing/Sales batch jobs were running on DataStage 8.1. The environment became unstable as the number of applications developed and deployed increased. To resolve the environment issues, the infrastructure team proposed upgrading DataStage from 8.1 to 9.1 and moving the infrastructure to a grid. The Marketing/Sales batch process is one of the largest and most critical batch processes in the enterprise, with around 2,500+ ETL jobs. These jobs needed to be migrated to 9.1 with thorough testing to minimize business impact. Infrastructure components also needed to be upgraded to the E3 framework, which has key enhancements in the data quality space. The project was executed in phases covering daily, weekly, monthly, semiannual, and ad hoc jobs. The major challenge was coordinating with other modules that depend on the Marketing cycles, and vice versa.

Project: Integrated Marketing & Sales Management (IMSM) - Data refreshment & Product Cycle Redesign

Timeline & Location: Sep’12 – Dec’13; Chennai, India

Campaigns were executing only during Central Standard Time business hours, and the customer planned to execute them 24x5. The existing infrastructure could not support 24x5 campaign execution because feeds from the various source systems load at different intervals. To resolve this, a new process was designed and developed to load data into the marketing marts without disturbing campaign execution.

The IMSM system started receiving large volumes of data after changes to the member eligibility criteria. Due to the high data volume and infrastructure issues, the IMSM systems began experiencing data latency. To overcome this, we redesigned the long-running jobs in the product batch process and reduced the total execution time by 25%.

Project: Offer Management System (OMS) – Product Effectiveness & HelpUSave

Timeline & Location: Jan’11 – Aug’12; Chennai, India

The bank business teams were unable to measure the effectiveness of bank offers such as credit cards and deposits. This project was executed to help the business teams measure campaign effectiveness and ROI. OMS operational data was loaded into the Enterprise Data Warehouse (EDW) to serve as a trusted source for other systems. Once in the EDW, the data related to bank offers is processed according to reporting needs and loaded into the Bank Reporting Data Mart. As part of this project, we implemented the purge process for credit cards and deposits.

The objective of this project is to load offer details and related member details into the OMS operational data store using batch cycles. The business defines the offers and the business rules for selecting people for a particular offer. The offer details are loaded into a marketing tool (Epiphany/UNICA), which creates the audience file. The created audience files are then loaded into the OMS system and displayed on the .com site based on the member profile.

EDUCATION:

●Bachelor of Technology (B.Tech) in Electronics & Communications Engineering from Jawaharlal Nehru Technological University, Anantapur, Andhra Pradesh, India.

CERTIFICATIONS:

●SAFe Scrum Master Certified
