
Data Engineer

Location:
United States
Posted:
August 06, 2019



VENKATA VARA PRASAD REDDY YANNAM

Visa Status: H1B (I-140)

Mobile: 331-***-**** Email: ******.*********@*****.***

PROFESSIONAL SUMMARY:

10+ years’ experience in the data warehousing space using Informatica 7.x/8.x/9.x/10.x and ETL concepts.

Acquired and developed skills in Requirement Analysis, Design, Coding and Testing of various software applications in Informatica.

Conducted project-specific training for new joiners on the project.

Experience with different clients and teams has helped me handle critical situations effectively.

Ability to learn new technologies with ease, implement and deliver projects efficiently.

Quick learner and creative problem-solver.

Solid ETL, database programming, and database reporting experience in an Enterprise Data Warehouse environment.

Significant hands-on experience with Informatica and Oracle, with strong knowledge of both.

Experience with all phases of the SDLC, including implementing staging areas, operational data stores, and dimensional data models.

Experience researching and staying informed of industry technical/business information security requirements and translating those requirements to the Bank's information environment.

Made recommendations and maintained working relationships with vendors, ensuring access to support, product maintenance, and software updates.

Documented all areas of the systems, including which clients access which applications, databases, and system security design.

Worked with business continuity staff to create and update system disaster avoidance and recovery plans.

Strengths include meeting deadlines and deliverables while maintaining excellence and quality of work.

Used Workflow Manager to create and configure workflows and session tasks to load data; used Workflow Monitor to monitor workflows and handle process failures; performed testing as per unit test cases (UTC).

Implemented optimization techniques for performance tuning and wrote pre- and post-session shell scripts (a hedged sketch of such a check appears after this list).

Configured sessions in Workflow Manager with multiple partitions on source data to improve performance.

Involved in extensive performance tuning by determining bottlenecks in sources, mappings and sessions

Strong problem-solving skills developed through a variety of projects; expertise in creating reports using SSRS.

Experienced in interacting with customers and providing support as needed.

Experience supporting very large databases (VLDB) and troubleshooting problems.

24 x 7 Production Database on Call Support.
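
To illustrate the kind of pre-session check mentioned above: the originals were UNIX shell scripts invoked as Informatica pre-session commands, so the following is only a minimal Python sketch of the same idea, with hypothetical file paths.

```python
#!/usr/bin/env python3
# Minimal pre-session check sketch: verify the source file exists and is non-empty
# before the session runs, and archive a copy for reruns/audit.
# Paths are hypothetical; the production scripts were written in UNIX shell.
import shutil
import sys
from pathlib import Path

SRC = Path("/data/inbound/daily_extract.dat")     # hypothetical source file
ARCHIVE = Path("/data/archive")

def main() -> int:
    if not SRC.exists() or SRC.stat().st_size == 0:
        print(f"Pre-session check failed: {SRC} missing or empty", file=sys.stderr)
        return 1                                  # non-zero exit fails the session task
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    shutil.copy2(SRC, ARCHIVE / SRC.name)         # keep a copy before the load
    print(f"Pre-session check passed: {SRC} ({SRC.stat().st_size} bytes)")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```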

Technical Skills:

Methodologies: Agile, Waterfall, SDLC

RDBMS: Oracle, SQL Server 2000/2005/2008/2008R2/2012, Microsoft Access, SOQL

ETL Tools: Informatica 8.x/9.x/10.x, SQL Server Integration Services (SSIS) 2005/2008, Metal, Informatica MDM/IDQ, Pantheon

Reporting Tools: SQL Server Reporting Services (SSRS) 2005/2008, Tableau

Database Tools: SQL Server Management Studio (SSMS), Oracle, Query Analyzer, Workbench, NoSQL, BigQuery (BQ)

Process Modeling: ERWIN, MS Visio

Business Analysis: Business Requirements, Process Mapping, Root Cause Analysis, Toad

Operating Systems: Windows 98, Windows 2000 Advanced Server/Professional, Windows Server 2003/2008/2008R2, Windows NT, Windows XP Professional/Standard, Windows Vista Business/Ultimate, Windows 7 Ultimate, UNIX

EDUCATION QUALIFICATIONS:

Master of Business Administration (MBA), Vinayaka Mission's University, Chennai.

Bachelor of Technology in Electronics & Communications Engineering, Jawaharlal Nehru Technological University.

WORK EXPERIENCE:

Working as Data Engineer at Tribolatech (USA), 2017 to present.

Worked as Technical Lead at HCL (USA/UK/India), 2012 to 2017.

Worked as Software Engineer at Mphasis (an HP company), Bangalore, 2010 to 2012.

Worked as Associate Software Engineer at CSS Corp, Chennai, 2008 to 2010.

DETAILED EXPERIENCE:

Client: Google (Tribola) (April 2019 – Present)

Role: MDS Business Analyst /Data Engineer

Location: San Jose, USA

Description: The HR-API application contains corporate reference data from multiple domains (HR, Finance, Facility, and Supply Chain). It is served as public reference data in NoSQL / BigQuery and is used by multiple Cerp end users and reporting projects. Pantheon ETL pipelines, developed and supported by the MDS team, load reference data from Workday to different downstream Google applications.

Responsibilities:

Perform analysis of existing code for different source files (XML, flat files) using ETL tools such as Pantheon.

Understand the mapping logic and business requirements, integrating external or new datasets into existing data pipelines.

Apply system analysis techniques, testing, debugging, design, and storage concepts, and use application development methodologies to support different application modules across projects.

Analyze the current state of Corp Master Data pipelines and create visualizations and dashboards to help the company interpret the data and make decisions.

Track the pipelines transferring from MDS to the domain owners, continuously monitoring and testing the system to ensure optimized performance.

Present the results of technical analysis to business clients and internal teams, and optimize pipelines by eliminating deprecated logic and sources.

Analyze performance metrics to evaluate user-community adoption of tools and processes.

Develop and deploy corrective action plans and monitor performance.

Consistently communicate progress, obstacles, and growth opportunities to stakeholders (development team, product team) and management.

Handle escalations: analyze database, stored procedure, BigQuery, and NoSQL queries using the PLX tool for optimization and recommend enhancements (a hedged query sketch follows this list).
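
To illustrate the kind of BigQuery analysis described above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical and the query is not taken from the actual HR-API pipelines.

```python
# Minimal sketch: profile row counts per source system in a reference table,
# the kind of quick check used when analyzing pipeline data volumes.
# Assumes Application Default Credentials; project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT source_system, COUNT(*) AS row_count
    FROM `my-project.reference_data.hr_api_employees`
    GROUP BY source_system
    ORDER BY row_count DESC
"""

for row in client.query(query).result():
    print(f"{row.source_system}: {row.row_count}")
```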

Environment: Pantheon, NoSQL, BigQuery, Tableau, Clarinet, UNIX, Data Studio, PLX, Qualtrics, Google Cloud applications

Client: Salesforce (Tribola) (Feb 2019 – April 2019)

Role: Systems Specialist II (Informatica)

Location: SFO, USA

Description: OPSDB/CMF

Responsibilities:

Perform analysis of existing code for different source files using ETL tools such as Informatica to understand the mapping logic and business requirements.

Apply system analysis techniques, testing, debugging, design, and storage concepts, and use application development methodologies to support different application modules across projects.

Debug sessions using session logs; conduct run-list reviews, code deployments, and post-deployment validations following the standard software development lifecycle (SDLC).

Track tickets using the Support Force tracking tool with GUS.

Involved in identifying bugs in SOQL queries using the Workbench tool to find the root cause (RCA) of Salesforce data issues (a hedged SOQL sketch follows this list).

Analyzed databases, stored procedures, and SQL queries for optimization and recommended enhancements.
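
The SOQL checks above were run through the Workbench UI; as a rough illustration only, here is a minimal Python sketch using the simple_salesforce client, with hypothetical credentials, object, and filter.

```python
# Minimal sketch: run a SOQL query to spot records with missing data,
# the kind of check used during root-cause analysis of Salesforce data issues.
# Credentials, object, and filter are placeholders, not the actual OPSDB/CMF logic.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # hypothetical credentials
    password="password",
    security_token="token",
)

result = sf.query(
    "SELECT Id, Name, LastModifiedDate FROM Account WHERE BillingCountry = null LIMIT 10"
)

for record in result["records"]:
    print(record["Id"], record["Name"], record["LastModifiedDate"])
```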

Environment: Informatica 10.x, Oracle, SQL Developer, Workbench, Support Force, Tidal, UNIX, WinSCP, BO, Wave Analytics

Client: Bank of the West (Tribola) (Nov 2017 – Feb 2019)

Role: Informatica consultant

Location: San Ramon, USA

Description:

Bank of the West has emerged as one of the top large banks in the US following its recent acquisitions. With federal regulation and compliance adherence as one of its key priorities this year, Bank of the West initiated the build of an enterprise data warehouse to aid business decision making across the bank's various LoBs. The enterprise data warehouse also serves as a source of information for several federal / SEC / in-house reporting needs. Data is sourced from various in-house-developed and packaged source applications. The Bank decided to leverage Erwin data modeling and Informatica technology for its data integration needs; the current project requires assimilating information from identified source systems and transforming/processing it into the target data warehouse, both one-time and on an ongoing basis.

As part of this engagement, provided IT services to Bank of the West in different models across multiple data warehouse, ETL, and reporting technologies. Ongoing projects include the EDW implementation and a CCAR BI Legal & Compliance reporting solution for the federal government.

Responsibilities:

Perform analysis of existing code in mainframes and in ETL tools such as Informatica to understand the mapping logic.

Develop mapping documents including System of Record field names, mapping rules, and W target field names.

Perform analysis of the existing data to understand the mapping rules in the current environment

Discuss the mapping document with architects and the mapping team to appropriately map the elements to the Banking Data Warehouse (BDW) model.

Provide continuous support through conference calls with the mapping and development teams to ensure accurate data in the new environment.

Review the BDW mapping document to produce the same data in the new environment as in the current W.

Transmission of the System of Record files from UNIX to new environment.

Coding in new environment as per the mapping documents.

Preparation of test cases to validate the output data generated using the mapping documents.

Validation of the test cases using the Bank's Quality Center tool.

Defect tracking through Quality Center and SharePoint portal.

Environment: Informatica 9.x, Oracle, SQL Developer, Jira

Client: Allianz Life Insurance (HCL America-US)

Role: Technical Lead

Location: Minneapolis, USA (Oct 2016 – Nov2017)

Location: Chennai, India (Dec 2015 - Sep 2016)

Description: -

Allianz Life Insurance Company provides fixed and variable annuities, life insurance policies, and long-term care insurance products. The eCommerce capability is responsible for electronic data transmissions into and out of Allianz Life, providing data transmission solutions to the company. The primary transmission partner is the DTCC (Depository Trust and Clearing Corporation). It supports business processes such as applications and subsequent premiums, money settlement, commissions, financial activity reports, annuity asset pricing, and licenses and appointments.

Responsibilities:

Involved in preparing Technical Requirements Document, High Level Design document and Technical Design document.

Complete ownership of the Project Accounting Data Warehouse application.

Participated in Deployment by adding objects into deployment group and moving the code from test to production.

Deployed sources, targets, mappings, sessions, and workflows to build efficient ETL components that meet the technical requirements.

Ensured smooth running of the application and facilitated improvements as and when required.

Monitored changes carried out in the application as part of maintenance, coordinating with the offshore team and resolving issues.

Responsible for handling tickets raised by business users, providing root cause analysis, and fixing issues within the agreed timeline.

Attended weekly status meetings and provided detailed status reports to the client.

Perform unit testing at various levels of the ETL.

Debug the sessions by utilizing the logs of the sessions.

Involved in writing Stored Procedures and Functions to handle database automation tasks

Analyzed databases, stored procedures, and T-SQL queries for optimization and recommended enhancements.

Created procedures, functions, sequences, constraints and triggers to maintain data integrity. Applied indexes and partitions to improve performance. Created views for monitoring and reporting.

Designed a source-to-target mapping strategy and performed ETL to populate databases from existing disparate data sources (text, Excel, Access, etc.); a hedged loading sketch follows this list.

Worked with business stakeholders, application developers, and production teams and across functional units to identify business needs and discuss solution options.
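
To illustrate loading such disparate sources, here is a minimal Python sketch using pandas and SQLAlchemy; the file names, table names, and connection string are hypothetical, and the production loads were built as Informatica mappings rather than scripts.

```python
# Minimal sketch: land disparate sources (pipe-delimited text and an Excel workbook)
# in staging tables before downstream ETL mappings run.
# File names, table names, and connection string are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; the actual targets were Oracle/SQL Server databases.
engine = create_engine("sqlite:///staging.db")

sources = {
    "stg_premiums": pd.read_csv("premiums.txt", sep="|"),   # pipe-delimited text extract
    "stg_commissions": pd.read_excel("commissions.xlsx"),   # Excel workbook
}

for table_name, frame in sources.items():
    frame["load_ts"] = pd.Timestamp.now()                   # simple audit column
    frame.to_sql(table_name, engine, if_exists="append", index=False)
```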

Environment: Informatica 9.x, Oracle, SQL Server 2008R2/2012, CA Automation tool, ClearQuest.

Client: Admin Re\Swiss Re, Telford, UK (HCL Britain-UK)

Role: Technical Lead

Location: Telford, UK (Mar 2015 – Nov 2015)

Description:

The Admin Re business strategy is to achieve significant growth through acquisitions of closed Life and Pensions business in the North American and European markets, in partnership with external investors where appropriate. In order to satisfy the requirements of external investors, Admin Re needs to become operationally independent of the Swiss Re Group, have sufficiently robust operational and financial processes, and demonstrate control of its costs.

Responsibilities:

Reporting the consolidated daily/weekly status to the PM

Participated in the development of new systems and upgrades and updates to existing systems, throughout the entire application lifecycle: requirements analysis & definition, system design, implementation, testing, deployment, and sustainment

Generated reports weekly or monthly as per client requirements.

Completed tickets within the SLA period without any disputes.

Understanding the application data & business requirement.

Responsible for developing ETL mappings to load data using Informatica Power Center 9.x

Develop and perform tests to validate all data flows, prepare ETL processes according to business requirements, and incorporate those requirements into design specifications.

Define and capture metadata and rules associated with ETL processes

Adapt ETL processes to accommodate changes in source systems and new business user requirements

Provide support and maintenance on ETL processes and documentation

Involved in UNIX system production support, troubleshooting customers' urgent requests.

Create and execute validation and production support scripts for custom components; document, review, and execute performance tuning scripts (a hedged validation sketch follows this list).

Understanding data requirements for test and development

Successfully improved application performance through index analysis, removing unused indexes, and tuning stored procedures, functions, and SQL scripts; one stored procedure tuning effort reduced run time from about 2 minutes to a few milliseconds.

Maintained database environment security for development, test and pre-production environments

Involved in Peer review activities for Code & Testing.

Finding the Root causes of the bugs raised

Optimized performance by tuning the Informatica ETL code.

Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.

Used debugger to test the data flow and fix the code.

Provided on-call support for quarter-end and production release management activities.

Responded to troubleshooting tickets assigned to me and fixed any database issues promptly.
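
To illustrate the validation scripts mentioned above, here is a minimal Python sketch that compares source and target row counts after a load; the connection URLs and table names are hypothetical, and the actual scripts were UNIX shell scripts run around the Informatica workflows.

```python
# Minimal sketch: post-load validation comparing source and target row counts.
# Connection URLs and table names are placeholders, not the actual Admin Re objects.
from sqlalchemy import create_engine, text

SOURCE_URL = "oracle+cx_oracle://user:pass@source-host/SRC"   # hypothetical
TARGET_URL = "oracle+cx_oracle://user:pass@target-host/EDW"   # hypothetical

def row_count(url: str, table: str) -> int:
    # Table names here are trusted constants, so plain string formatting is acceptable.
    engine = create_engine(url)
    with engine.connect() as conn:
        return conn.execute(text(f"SELECT COUNT(*) FROM {table}")).scalar()

src = row_count(SOURCE_URL, "policy_transactions")
tgt = row_count(TARGET_URL, "fact_policy_transactions")

if src != tgt:
    raise SystemExit(f"Validation FAILED: source={src}, target={tgt}")
print(f"Validation passed: {src} rows in source and target")
```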

Environment: Informatica, SAP BO, Oracle, UNIX, Tableau, PuTTY, WinSCP

Client: Microsoft, Hyderabad (HCL Technologies, India)

Role: Technical Lead

Location: Hyderabad, India (Mar 2013 – Mar 2015)

Description:

Microsoft is one of the top IT companies, unique and rich in its software products; its customers use them to find creative solutions to business problems, develop breakthrough ideas, and stay connected to what is most important to them. The key divisions are: Windows Live Division, Server and Tools, Online Services Division, Microsoft Business Division, and Entertainment & Devices Division.

This is one of the core applications for Microsoft supporting its Premier products. Microsoft Services Premier Support helps customers maximize the availability and efficiency of their IT infrastructure, reduce risks, and improve the productivity of their IT staff. Its architecture is built using SQL Server 2012, METAL, Informatica PowerCenter, and SSRS.

Responsibilities:

Designing, coding, benchmark testing, debugging, and documenting application features and changes

Reviewing and assessing existing enterprise applications, to support new features, performance improvements, upgrades, and ongoing sustainment

Proficiency with modern development toolsets including Visual Studio, SQL Server Management Studio, unit testing frameworks, source control, and task tracking such as Team Foundation Server

Experience designing, implementing, supporting, and analyzing enterprise scale applications that perform extract/transform/load (ETL) operations with external data sources and SQL Server

Developed and optimized SQL to implement ETL job steps using ETL tools such as Informatica, SSIS, or Metal

Monitor, diagnose, and repair all data loading and processing packages to ensure the integrity of the analysis data; design and implement ETL requirements.

Involved in Peer review activities for Code & Testing, Deployment

Reported bugs in the bug-tracking tool (VSTF), followed up with the business, and resolved issues by conducting meetings.

Analyzed performance; tuned, optimized, and supported database objects for migration to production; executed stored procedures and performed index sizing.

Configured the necessary table structures in the database to collect and normalize business data

Resolved known issues and bugs in the corporate platform interface and back end database

Programmed user-defined functions, views, and T-SQL scripts for complex business logic.

Finding the Root causes of the bugs raised

Reporting the consolidated daily/weekly status to the PM

Expertise in generating drill-down, drill-through, parameterized, sub-report, and various chart reports in SSRS and Tableau dashboards.

Generated reports weekly or monthly as per client requirements.

Rendered reports in PDF, Excel, and CSV formats.

Environment: Informatica, SSRS, SSIS, Metal, Oracle, T-SQL, SQL Server 2008R2/2012

Client: Deutsche Bank (HCL Technologies )

Role: Senior ETL Developer (Mar 2012 – Mar 2013)

Location: Bangalore, India

Description:

Deutsche Bank is a leading global investment bank with a strong and profitable private clients franchise. A leader in the European and Asian regions, the Bank is continuously growing in North America, Asia, and key emerging markets. Deutsche Bank has around 4,000 intermediate reconciliation applications for its business; DB uses these manual reconciliation sheets to detect fraud and other issues in the bank and wanted to maintain all of them in one place.

We built a reconciliation tool called IRT (Intermediate Reconciliation Tool) into which all of their recon applications can be plugged, making maintenance very easy; the tool can be fitted to any industry. More than 15 DB businesses use this tool, and we customized (developed and implemented) it according to their requirements.
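
To convey the core idea behind such a reconciliation, the following is a minimal Python sketch that matches records from two extracts on a key and reports breaks; it is not IRT's actual logic, and the file and column names are hypothetical.

```python
# Minimal sketch of a reconciliation step: match two extracts on a key column
# and report missing or mismatched amounts ("breaks"). Not IRT's actual logic;
# file names and column names are hypothetical.
import pandas as pd

ledger = pd.read_csv("ledger_extract.csv")        # columns: trade_id, amount
statement = pd.read_csv("statement_extract.csv")  # columns: trade_id, amount

merged = ledger.merge(
    statement, on="trade_id", how="outer",
    suffixes=("_ledger", "_statement"), indicator=True,
)

missing = merged[merged["_merge"] != "both"]
mismatched = merged[
    (merged["_merge"] == "both")
    & (merged["amount_ledger"] != merged["amount_statement"])
]

print(f"{len(missing)} records present on only one side")
print(f"{len(mismatched)} records with amount mismatches")
```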

Responsibilities:

Involved in various activities like project planning and documentation.

Guided team members on ETL development and design.

Coordinated with various businesses in DB to understand their requirements.

Involved in generalized Architecture design of IRT tool.

Involved in Functional and Technical level design.

Involved in BRD document preparation.

Design & implement ETL requirements

Create and execute validation and production support scripts for custom components. Documents, review, and execute performance tuning scripts.

Perform constant file manipulation using VI Editor

Generalized component development using various transformations

Expertise in generating drill-down, drill-through, parameterized, sub-report, and various chart reports in QlikView dashboards.

Involved in Peer review activities for Code & Testing, Deployment

Involved in Implementation

Environment: Informatica, QlikView, Oracle, UNIX

Client: Blue Shield of California (EDW - BSC) (Mphasis, an HP company) (Jun 2010 – Mar 2012)

Role: Senior ETL Developer

Location: Bangalore, India

Description: The EDW architecture provides the mechanism to achieve enterprise integration to support BSC business. It provides an organizing framework that will improve data sharing between various business units; most importantly, this architecture is an evolutionary process. The first enterprise warehouse projects will be based on the EDW architecture, and increments of additional projects will cause the architecture to evolve. As technology changes and improves, that too will most likely require adjustments to the EDW architecture. This incremental development of both the architecture and the warehouse offers an opportunity to adapt to changes in the business.

Responsibilities:

Responsible for developing ETL mappings to load data using Informatica Power Center 8.1.1

Extensively used Informatica Client Tools Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer to develop mappings.

Used debugger to test the data flow and fix the code.

Optimized performance by tuning the Informatica ETL code.

Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.

Perform constant file manipulation using the vi editor and UltraEdit to ensure data integrity.

Prepare UNIX applications and identify UNIX issues.

Extensive experience with Slowly Changing Dimensions (a hedged SCD Type 2 sketch follows this list).

Good knowledge of unit test cases (UTC), metadata, and QA documents.

Wrote UNIX shell scripts.
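
To illustrate the Slowly Changing Dimension handling mentioned above, here is a minimal Python sketch of a Type 2 update (expire the changed row, insert a new current version); the column names are hypothetical and the production implementation was built with Informatica mappings, not Python.

```python
# Minimal SCD Type 2 sketch: expire changed dimension rows and insert new current
# versions. Column names are hypothetical; brand-new members are omitted for brevity.
import pandas as pd

TODAY = pd.Timestamp("2019-08-06")
FAR_FUTURE = pd.Timestamp("9999-12-31")

def apply_scd2(dim: pd.DataFrame, incoming: pd.DataFrame) -> pd.DataFrame:
    """dim columns: member_id, plan_code, eff_date, end_date, current_flag.
       incoming columns: member_id, plan_code (latest snapshot)."""
    dim = dim.copy()

    # Find current members whose tracked attribute changed in the new snapshot.
    current = dim[dim["current_flag"] == "Y"]
    merged = current.merge(incoming, on="member_id", suffixes=("", "_new"))
    changed_ids = merged.loc[merged["plan_code"] != merged["plan_code_new"], "member_id"]

    # Expire the current version of each changed member.
    expire_mask = dim["member_id"].isin(changed_ids) & (dim["current_flag"] == "Y")
    dim.loc[expire_mask, "end_date"] = TODAY
    dim.loc[expire_mask, "current_flag"] = "N"

    # Insert a new current version carrying the changed attribute.
    new_rows = incoming[incoming["member_id"].isin(changed_ids)].copy()
    new_rows["eff_date"] = TODAY
    new_rows["end_date"] = FAR_FUTURE
    new_rows["current_flag"] = "Y"
    return pd.concat([dim, new_rows], ignore_index=True)
```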

Environment: Informatica, Oracle, UNIX, WinSCP, PuTTY

Client: Scope International (Standard Chartered Bank - CSS Corp)

Role: Software Engineer

Location: Chennai, India (May 2008 – May 2009)

Responsibilities:

Developed Informatica workflows and sessions associated with the mappings using Workflow Manager

Designed business rules based on client requirements and performed testing to validate successful implementation

Provided on-call support for quarter-end and production release management activities.

Worked in the UNIX environment, debugging issues.

Environment: Informatica, Oracle, UNIX, WinSCP, PuTTY


