
Data SQL Server

Location:
Bentonville, AR
Salary:
65/hr
Posted:
April 25, 2018


Resume:

Venkat Rao

Informatica Certified Level * Developer

ac48xx@r.postjobfree.com +1-908-***-****

Summary:

9.6 years of IT experience focusing on Data warehousing/Data mart implementations, ODS, OLAP, Data integration, Data Migration, Data Cleansing, Data Governance, Data Lineage, Data Profiling, Data analysis, ETL processes and Business Intelligence.

7 years of strong experience in Extraction, Transformation and Loading (ETL) processes from various sources into Data Warehouses/Data Marts using IBM InfoSphere DataStage versions 9.1/8.5.

Knowledge of DataStage administration activities.

Expert in designing parallel jobs in IBM InfoSphere DataStage using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Lookup File Set, Complex Flat File, Modify, Aggregator and XML.

Experience in working with Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor and Repository Manager).

Expertise in MS-BI tools – SSIS, SSRS and SSAS

Created jobs for moving data from the Operational Data Store to the Analytic Warehouse.

Strong knowledge of Data modeling techniques and Data Warehouse principles.

Hands-on experience with Oracle PL/SQL (packages, triggers and stored procedures) and SQL Server T-SQL.

Expertise in working with relational databases such as Oracle 11g, DB2, MySQL, PostgreSQL, Netezza and Teradata 13.10.

Strong knowledge of Teradata utilities – MLOAD, FLOAD, TPT and BTEQ.

Monitored Teradata jobs in Viewpoint and performed tuning of complex Teradata queries.

Strong knowledge in Point to Point interface integration.

Hands-on experience migrating data from Teradata to Hive using Sqoop import and TDCH, and from Hive to Teradata using Sqoop export.
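
For illustration only, a minimal sketch of the Sqoop route (TDCH uses its own connector and is not shown); the host, database, table and credential values below are placeholders, not values from any project listed here:

# Hypothetical Teradata-to-Hive import via Sqoop's generic JDBC path.
sqoop import \
  --connect jdbc:teradata://td-host/DATABASE=sales_db \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user --password-file /user/etl/.td_pass \
  --table DAILY_SALES \
  --hive-import --hive-table analytics.daily_sales \
  --num-mappers 8

# Hypothetical reverse direction: export a Hive-backed HDFS directory to Teradata.
sqoop export \
  --connect jdbc:teradata://td-host/DATABASE=sales_db \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user --password-file /user/etl/.td_pass \
  --table DAILY_SALES_HIST \
  --export-dir /user/hive/warehouse/analytics.db/daily_sales \
  --input-fields-terminated-by '\001'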

Fair knowledge of data modeling tools such as Erwin and ER Studio.

Over 5 years of professional experience in the insurance domain and over 4 years in the banking and retail domains.

Experience working in an onsite-offshore delivery model.

Ability to analyze, visualize and document complex dependencies.

Strong experience in designing of Parallel jobs, Job Sequences, Routines and Batch jobs

Extensive experience in the analysis, design, development, implementation and testing of Data warehouse applications in Insurance, Retail and Banking Domains

Strong understanding of logical and physical data modeling (relational, dimensional, star and snowflake schemas), data analysis, and implementation of data warehouses on Windows and UNIX.

Trained & worked on reporting tools like Cognos, MicroStrategy.

Conversion of Legacy reports to SSRS reports

Created process flow documents using Visio.

Extensive PL/SQL experience creating the reporting layer for the Oracle data mart.

Experience in UNIX shell scripting to accomplish tasks such as FTP transfers, count matching, DataStage job triggering, database connections and large data loads.
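
As a rough sketch of that kind of script (host, file, project and job names are invented for illustration; DataStage environment setup and authentication are assumed to be in place):

#!/bin/sh
# Hypothetical wrapper: fetch a daily file over SFTP, verify its record count
# against a control file, then trigger a DataStage job via the dsjob CLI.
set -e
HOST=src-host.example.com
IN_DIR=/data/landing

sftp etl_user@"$HOST" <<EOF
get /outbound/daily_sales.dat $IN_DIR/
get /outbound/daily_sales.ctl $IN_DIR/
EOF

# Count matching: the .ctl file is assumed to hold the expected record count.
actual=$(wc -l < "$IN_DIR/daily_sales.dat" | tr -d ' ')
expected=$(tr -d ' \n' < "$IN_DIR/daily_sales.ctl")
if [ "$actual" -ne "$expected" ]; then
    echo "Record count mismatch: got $actual, expected $expected" >&2
    exit 1
fi

# Trigger the DataStage job and wait for its completion status.
dsjob -run -jobstatus -param InputFile="$IN_DIR/daily_sales.dat" \
      EDW_PROJECT j_load_daily_sales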

Experience in the Performance Tuning and Optimization of the Data Stage Parallel jobs

Experience in integrating various data sources such as SQL Server, mainframe databases and UDB DB2 using DataStage parallel jobs.

Experience using query tools: SQL Assistant, TOAD and SQL*Plus.

Experience in using File Transfer Utilities like FTP, SFTP and MFT

Experience using job scheduling and automation tools such as ESP Scheduler, UC4, Control-M and Autosys.

Ability to interact with various levels of management to understand the requests and validate job requirements

Possess strong ability to quickly adapt to new applications, platforms and languages

Superior communication skills, strong decision making and organizational skills along with outstanding analytical and problem-solving skills to undertake challenging jobs.

Able to work well independently and in a team by helping to troubleshoot technology and business-related problems.

Highly driven and self-motivated with sound business judgment and strong work ethics

Technical Skills:

Programming Languages: UNIX shell scripting, PL/SQL, T-SQL

Business Intelligence Tools: Informatica 8.6/9.6.1, DataStage 8.5, Cognos

Version Control Tools: Team Foundation Server, Visual SourceSafe 6.0

Databases: Teradata, DB2, Oracle 12 and SQL Server 2012

Operating Systems: Microsoft Windows 8/7/Vista/XP/2000/98, Linux

Professional Experience:

HBC – EDW MAINTENANCE June 2016 – Present

Role: ETL Lead

City: Jackson-MS, USA

Description: The Enterprise Data Warehouse (EDW) is a central repository of business data. Data is sourced from operational systems and non-operational systems and stored in the Teradata database. Data warehousing processes include extracting data from source systems and/or files, processing and applying business rules to the data and loading the data into the EDW to allow querying and reporting by the business and data extracts for downstream systems.

EDW is crucial for strategic decision making. The system's purpose is to report information, including historical data, at any level of the merchandising and organizational hierarchy. EDW (BIS) is the only reporting application the business has that provides information below the department level. All decision making in assortment planning, exit strategies and fashion purchases is based on the information available in BIS, so it is crucial that this information be correct and up to date.

Responsibilities:

Actively interacted with Subject Matter Experts and Business Analysts to finalize the requirements and design phases.

Partnered with business users to understand the requirements and converted them into project-level functional specifications.

Handled Ad-hoc data and report requests from business users

Worked closely with DataStage admin team while designing complex sequences.

Data movement from the Operational Data Store to the Analytic Warehouse.

Worked with Architects to Analyze, visualize and document complex dependencies

Driving project end to end including ETL Jobs, Data Profiling, and Building Views for front end reports.

Worked on creation of complex DB Objects like Custom Functions and Triggers for Data Warehouse.

Provided point to point integration to third party applications and interfaces

Source to Target Mapping documents and Low-Level Design Document preparation

Extensive knowledge of TPT load utilities and of using Viewpoint to monitor queries.

Participated in data modeling sessions to understand the data model and provided inputs based on existing data to ensure the designed model was technically feasible.

Coordinated with the QA team in various testing phases, resolving defects and ensuring smooth execution of the test plans.

Development of Teradata BTEQ scripts and performance tuning of Teradata queries
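
As an illustration of the pattern (the tdpid, schema, table names and the password variable are placeholders; this is not the project's actual script):

#!/bin/sh
# Hypothetical BTEQ step: load an aggregate table from staging and fail the
# batch step if the SQL errors out. TD_PASSWORD is assumed to be supplied by
# the scheduler environment.
bteq <<EOF
.LOGON tdprod/etl_user,$TD_PASSWORD;

INSERT INTO edw.sales_daily_agg (store_id, sale_dt, sales_amt)
SELECT store_id, sale_dt, SUM(sales_amt)
FROM   stg.sales_txn
GROUP BY store_id, sale_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8;

COLLECT STATISTICS ON edw.sales_daily_agg COLUMN (store_id, sale_dt);

.LOGOFF;
.QUIT 0;
EOF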

Creation of PL/SQL procedures and Triggers.

Helped the QA team run the test batch cycles scheduled in UC4.

Involved in scheduling the batch jobs using UC4 and ESP.

Big Data: migrated data from Teradata to Hive and from Hive back to Teradata.

Involved in the setup of ETL components, UNIX scripts, DB objects, Glossary Information from Dev to Production environment.

Involved in capturing all the test cases and facilitating their documentation.

Involved in Report creation using Teradata Queries for manual extracts for the Data Governance Team

Involved in PoC for Data Lineage, Technical Rule Creation, Glossary Terms/Categories Creation

Environment: IBM InfoSphere DataStage 9.1, DB2, Oracle, TOAD, Teradata, SQL Assistant, ESP Scheduler, Mainframes, LINUX

Genworth – IFA DATA AGILITY July-2015 to May-2016

Role: ETL Build Lead

City: Richmond-VA, United States

Description: Genworth Financial Inc. is one of the largest insurance holding companies in the U.S. It is a global company with operations in 25 countries and $100 billion in assets. Genworth Financial offers life insurance and mortgage insurance in the U.S., Canada, Europe, New Zealand and Australia.

The Data Agility - Annuities Automation project involves the automation of valuation processes for the Annuity SPDA product.

Data is extracted from the CLOAS mainframe systems, replicated onto the Oracle database and then brought into the Greenplum staging area. Data from the staging area is loaded into the final Data Hub by performing the required transformations and is then used by the business users for their analytics.

Responsibilities:

Partnered with business users in identifying, prioritizing and resolving numerous data issues; created ETL project plans and designed and developed the tables.

Partnered with business users to understand the requirements and converted them into project-level functional specifications.

Prepared Source to Target Mapping documents and Low-level design document

Worked with business analysts to identify the appropriate sources and data elements.

Worked on reconciliation and audit logic for the data mart.
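
A hedged sketch of that reconciliation idea, shown here against PostgreSQL since it appears in this project's environment; the audit, staging and mart table names are invented for illustration:

#!/bin/sh
# Hypothetical post-load check: compare staging vs. mart row counts per batch
# and record the outcome in an audit table. A full version would also flag
# batches missing entirely from the target.
psql -d datamart -v ON_ERROR_STOP=1 <<'EOF'
INSERT INTO audit.load_reconciliation
       (batch_id, table_name, src_count, tgt_count, status, checked_at)
SELECT s.batch_id,
       'policy_fact',
       s.cnt,
       t.cnt,
       CASE WHEN s.cnt = t.cnt THEN 'MATCH' ELSE 'MISMATCH' END,
       now()
FROM  (SELECT batch_id, COUNT(*) AS cnt FROM staging.policy_stg GROUP BY batch_id) s
JOIN  (SELECT batch_id, COUNT(*) AS cnt FROM mart.policy_fact   GROUP BY batch_id) t
  ON  s.batch_id = t.batch_id;
EOF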

Participated in data modeling sessions to understand the data model and provided inputs based on existing data to ensure the designed model was technically feasible.

Handled Ad-hoc data and report requests from business users

Extracted data from the UDB database (common dimensions), Oracle database, SQL Server tables and flat files.

Worked with the Data Model team in designing the data model

Worked with Architects to Analyze, visualize and document complex dependencies

Data movement from the Operational Data Store to the Subject Marts.

Designed the DataStage parallel jobs & Unix scripts to handle Key validation, Business code validation, Data cleansing, Surrogate key generation, Landing, Staging, Fact load, Merging the data and preparing the aggregates tables.

Created the DataStage jobs, sequences for extracting and loading the data using various stages like Transformer, lookup, sort, Aggregator, and Joiner etc.

Worked with the data modeling tools Erwin and ER Studio.

Reviewed and fixed all the test cases and updated them in the Quality Center tool.

Experience using TPT load utilities and Viewpoint for monitoring queries.

Designed the archival logic, Error handling & Reprocessing of the data

Worked on creation of complex DB Objects like Custom Functions and Triggers for Data Warehouse.

Coordinated with the QA team in various testing phases, resolving defects and ensuring smooth execution of the test plans.

Performed performance tuning at the source, target and DataStage job levels using indexes, hints and partitioning.

Investigated and fixed bugs that occurred in the production environment and provided on-call support.

Environment: IBM InfoSphere DataStage 9.1, Teradata, DB2, SQL Server, PostgreSQL, LINUX Server, UNIX Scripting, Appworx, Visio

USAA-PnC Claims Sep-2014 to June-2015

Role: ETL Lead

City: SAN ANTONIO-TX, United States

Description: The Analytical Data Store (ADS) is a data mart built to store data related to auto and property insurance and loss claims. The ADS is the strategic centralized repository of information built to support key analytic functions across the firm. Directors can view reports to check the productivity of the members reporting to them. It is designed to be flexible, easy to use, dependable and scalable while providing the governance structure and data management expertise to ensure it is an accurate, understandable and trusted source for customer analytics.

The data is sourced from legacy mainframe tables, which are FTP'd to the landing area. These files are extracted into the Staging Data Source (SDS) and then loaded into the data mart after applying the required transformations and calculations.

Responsibilities:

Designed the application and participated in client meetings to discuss requirements.

Actively interacted with Subject Matter Experts and Business Analysts to finalize the requirements and design phases.

Developed PL/SQL Stored Procedures
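
Purely as an illustration of the kind of procedure meant here (the schema, table, column names and connection string are placeholders, not actual project objects):

#!/bin/sh
# Hypothetical example: create a small housekeeping procedure through SQL*Plus.
sqlplus -s etl_user/"$ORA_PASSWORD"@ADSDB <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
CREATE OR REPLACE PROCEDURE purge_closed_claims (p_days IN NUMBER) AS
BEGIN
  -- Remove claims closed more than p_days ago from the reporting layer.
  DELETE FROM ads.claim_fact
  WHERE  claim_status = 'CLOSED'
  AND    closed_dt < TRUNC(SYSDATE) - p_days;
  COMMIT;
END purge_closed_claims;
/
EXIT
EOF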

Received minimal comments and got appreciation from manager.

Extracted the data from Oracle, SQL Server, DB2 databases and loaded into Teradata Data Store.

Designed the DataStage sequences & Unix scripts to handle Business rules validation, Data cleansing, Surrogate key generation, Landing, Staging, Fact load, Merging the data and preparing the aggregates tables.

Schedule the jobs using Control-M scheduling tool.

Created Teradata BTEQ scripts to load the aggregate tables.

Reviewed and fixed all the test cases and updated them in the Quality Center tool.

Designed Health tables to maintain the batch ids, statistics, and parameters.

Coordinated with the QA team in various testing phases, resolving defects and ensuring smooth execution of the test plans.

Performed performance tuning at the source, target and Informatica levels using indexes, hints and partitioning in Informatica.

Investigated and fixed bugs that occurred in the production environment and provided on-call support.

Environment: IBM InfoSphere DataStage 9.1, DB2, LINUX, MySQL, Oracle, TOAD, Teradata, SQL Assistant, Control-M, Visio, ServiceNow

Cardinal - Analytic Data Warehouse Mar-2013 to Aug-2014

Role: ETL Lead

City: Dublin-OH, United States

Description: Cardinal Integration business areas are divided into Medical/Pharma and CORP.

Worked in Medical and Pharma ETL applications to load the Analytical Data Warehouse maintained on Teradata servers, which supports the different reporting needs of business users. Business Objects reports use the ADW as their source system and are used at the executive level to make decisions.

The Medical ETL applications' data is sourced from SAP ECC systems into Open Hubs, which are loaded by BW transports; the ETL extracts the data from the BW tables and loads the Teradata tables after applying all the necessary transformation logic.

An MFT process is used to transfer the files from the ETL landing area to user shared folders.

The Pharma ETL applications' data is sourced from a wide range of applications such as LDAP, MDM and DB2 systems, which is processed by the ETL and loaded into Teradata tables. There are real-time jobs sourced from Message Broker that feed the Teradata data warehouse.

Responsibilities:

Extracted the data from heterogeneous sources like SAP using open hub connection, Mainframe tables, Flat files, IBM DB2, SQL server and Lawson Oracle tables.

Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.

Identified source systems, their connectivity, related tables and fields and ensured data suitability for mapping

Involved in the analysis of the user requirements and identifying the sources.

Used file transfer utilities SFTP and MFT for transferring of files to business users.

Created technical specification documents based on the requirements.

Data Reconciliation and Audit logic jobs for data mart

Closely worked with DataStage admin and support team for designing complex jobs.

Involved in the preparation of High level design documents and Low-level design documents.

Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, foundation and Mart.

Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.

Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.

Worked with the SQL*Loader tool to bulk load data into the database.
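
A minimal sketch of such a load (the control file, table and connection details are invented for illustration, not project artifacts):

#!/bin/sh
# Hypothetical bulk load: write a SQL*Loader control file, then run sqlldr
# in direct-path mode.
cat > item_load.ctl <<'EOF'
LOAD DATA
INFILE '/data/in/item_master.csv'
APPEND
INTO TABLE stg.item_master
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(item_id, item_desc, category_cd, unit_cost)
EOF

sqlldr userid=etl_user/"$ORA_PASSWORD"@DWHDB \
       control=item_load.ctl log=item_load.log bad=item_load.bad \
       errors=100 direct=true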

Prepared UNIX shell scripts that were scheduled in Control-M for automatic execution at specific times.

Created batch jobs using UNIX scripts and triggered DataStage jobs from UNIX.

Prepared validation scripts in the landing, staging and warehouse layers to maintain data accuracy and integrity.

Worked with Data Modelers and helped in designing Data Model

Defect tracking and reporting were done in Quality Center.

Followed enterprise architecture standards to ease support and maintenance work.

Designed the application to support restartability and recovery to ease application maintenance.

Purge Process for Data Mart Table.

Designed the archival logic, Error handling & Reprocessing of the data

Environment: DataStage 8.7/8.5, IBM DB2, Teradata, SQL Assistant, PostgreSQL, Unix Scripting, UC4, Visio, Service Now

Cardinal - Analytic Data Warehouse Oct-12 to Feb-13

Role: ETL Lead

City: Pune, INDIA

Description: Cardinal Integration business areas are divided into Medical/Pharma and CORP.

Worked in Medical and Pharma ETL applications to load the Analytical Data Warehouse maintained on Teradata servers, which supports the different reporting needs of business users. Business Objects reports use the ADW as their source system and are used at the executive level to make decisions.

The Medical ETL applications' data is sourced from SAP ECC systems into Open Hubs, which are loaded by BW transports; the ETL extracts the data from the BW tables and loads the Teradata tables after applying all the necessary transformation logic.

An MFT process is used to transfer the files from the ETL landing area to user shared folders. The Pharma ETL applications' data is sourced from a wide range of applications such as LDAP, MDM and DB2 systems, which is processed by the ETL and loaded into Teradata tables. There are real-time jobs sourced from Message Broker that feed the Teradata data warehouse.

Responsibilities:

Extracted the data from heterogeneous sources like Mainframe tables, Flat files, IBM DB2, SQL server and Oracle tables.

Identified/documented data sources and transformation rules required to populate and maintain data warehouse content.

Identified source systems, their connectivity, related tables and fields and ensured data suitability for mapping

Created technical specification documents based on the requirements.

Handled Ad-hoc data and report requests from business users.

Involved in the preparation of High level design documents and Low-level design documents.

Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, foundation and Mart.

Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.

Administered the repository by creating folders and logins for the group members and assigning necessary privileges.

Collect and link metadata from diverse sources, including relational databases and flat files.

Worked with source team and created temp tables for DataStage extract using open hub connection.

Designed a highly parallel load of 50 million historical records on a monthly basis to perform the aggregation load.

Involved in monitoring the workflows and in optimizing the load times.

Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
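
The resume does not state which CDC mechanism was used (DataStage's Change Capture stage and log-based database CDC are both common), so as a generic, hypothetical illustration of the delta logic such a process replaces, here is a Teradata SQL sketch with invented table and column names:

#!/bin/sh
# Generic delta sketch: insert keys that are new to the warehouse, then update
# rows whose attributes changed. Not the project's actual CDC implementation.
bteq <<EOF
.LOGON tdprod/etl_user,$TD_PASSWORD;

/* New rows: keys present in staging but not yet in the dimension. */
INSERT INTO edw.customer_dim (cust_id, cust_name, segment_cd, load_ts)
SELECT s.cust_id, s.cust_name, s.segment_cd, CURRENT_TIMESTAMP
FROM stg.customer s
LEFT JOIN edw.customer_dim d ON d.cust_id = s.cust_id
WHERE d.cust_id IS NULL;

/* Changed rows: same key, different attribute values. */
UPDATE d
FROM edw.customer_dim AS d, stg.customer AS s
SET cust_name  = s.cust_name,
    segment_cd = s.segment_cd,
    load_ts    = CURRENT_TIMESTAMP
WHERE d.cust_id = s.cust_id
AND  (d.cust_name <> s.cust_name OR d.segment_cd <> s.segment_cd);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF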

Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.

Prepared UNIX shell scripts that were scheduled in the UC4 scheduler for batch execution.

Prepared validation scripts in the landing, staging and warehouse layers to maintain data accuracy and integrity.

Prepared test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.

Defect tracking and reporting were done in Quality Center.

Designed the application to support restartability and recovery to ease application maintenance.

Built the purge process for the data mart tables. Designed the archival logic, error handling and reprocessing of the data.

Environment: DataStage 8.5, IBM DB2, Teradata, Unix Scripting, UC4, Visio, Service Now

ACE Person Recognition Oct-11 to Sep-12

Role: ETL Lead

City: Hyderabad, INDIA

Description: The goal of this project is to use the best possible member data to better recognize a member within the Analytics Environment, ACE Reporting and ACE Services touch points. While mastering prospect records is not within scope, this project also had the goal of including prospect identifiers in the existing identifier “cross-walks” within the Humana system. The work included development of SSIS packages, stored procedures and Informatica workflows to feed the data into the OCH and AE environments, which are used by the different touch points based on OCH.

Responsibilities:

Involved in break fixes and code fixes for multiple data warehouse areas: Credit, Guest, Store Team Member and Messaging.

Monitoring batch jobs and recovering the issues based on priority.

Monitoring and resolving Data warehouse ETL Production issues.

Performed complex break fixes and code fixes in SSIS, Informatica, T-SQL and UNIX scripts.

Migration of legacy reports to SSRS reports.

Created Logical and Physical Data models and worked on stage and final table design.

Performed break fixes in MicroStrategy SQL and supported standard MicroStrategy reports such as dashboards.

Received client incidents and problems via HPSD and worked within SLA.

Received technical and business enhancements as Service Requests (SRs) from clients.

Performing code changes or procedure changes in the production environment.

Reviewing changes/analysis done by team members before implementation

Automated repeated processes and prepared validation sheets for business client verification of results.

Creating an RFC (Request for change) in Remedy, HPSD for bringing in the changes in the production environment. This also involves coordinating with other groups like DBA to make the changes successfully.

Monitoring the Critical jobs & coordinating for Server Outages. Also monitoring the performance of the system under varying conditions.

Transitioning the applications to the new team members. This includes both Technical and application knowledge of the system.

Made ETL changes to achieve the best performance.

Estimated the application testing effort for the migration and provided consultation.

Identified tuning opportunities at the application level and implemented them to reduce ETL server resource consumption.

Business enhancements in the form of System Change Requests (SCR) initiated by business analysts.

Involved in the Data warehousing, ETL tool best practices for new implementation.

Consultation for the application changes, migration activities, outages handling.

Managed and mentored a group of associates on the technology and the application.

Analyzed issues, traced the problem areas, suggested solutions and discussed them with the business.

Implement the solution with extensive documentation.

Initiated extensive documentation for production support process improvements.

Created shell scripts in UNIX to automate batch jobs.

Displayed efficient team coordination and ensured customer satisfaction by providing extensive support.

Environment: DataStage 8.5, SSIS, SQL Server, TFS

SUN TRUST BANKS - Wealth & Investment Management Jan-11 to Sep-11

Role: ETL Developer

City: Mysore, INDIA

Description: SunTrust Banks, with total assets of $172.2 billion, is one of the nation's largest financial services holding companies. Infosys was engaged to provide a complete end-to-end data warehousing solution to the client to help them maintain Wealth and Investment Management by maintaining a data mart for all their advisors and their related data. The data is sourced from several ODS systems, fed to the staging area using Ab Initio ETL processes, and then loaded into the various data marts.

Responsibilities:

Worked with the business users to understand the requirements and converted them into project-level functional specifications.

Worked with business analysts to identify the appropriate sources and data elements.

Participated in data modeling sessions to understand the data model and provided inputs based on existing data to ensure the designed model was technically feasible.

Designed the Framework manager models based on the business requirements.

Reviewed and fixed all the test cases and updated them in the Quality Center tool.

Designed Health tables to maintain the batch ids, statistics, and parameters.

Designed the archival logic, Error handling & Reprocessing of the data

Coordinated with the QA team in various testing phases, resolving defects and ensuring smooth execution of the test plans.

Converted legacy reports to Cognos Reports.

Performed performance tuning at the source, target and DataStage job levels using indexes, hints and partitioning in DB2 and DataStage.

Investigated and fixed bugs that occurred in the production environment and provided on-call support.

Environment: Cognos Framework Manager, Report Studio, Metric Studio, DB2, SQL Server, Visio

SUN TRUST BANKS – Distributor Warehouse Management Oct-09 to Jan-11

Role: ETL Developer

City: Mysore, INDIA

Description: SunTrust Banks, with total assets of $172.2 billion, is one of the nation's largest financial services holding companies. Infosys was engaged to provide a complete end-to-end data warehousing solution to the client to help them maintain Wealth and Investment Management by maintaining a data mart for all their advisors and their related data. The data is sourced from several ODS systems, fed to the staging area using Ab Initio ETL processes, and then loaded into the various data marts.

Responsibilities:

Partnered with business users in identifying, prioritizing and resolving numerous data issues; created ETL project plans and designed and developed the tables.

Partnered with business users to understand the requirements and converted them into project-level functional specifications.

Worked with business analysts to identify the appropriate sources and data elements.

Participated in data modeling sessions to understand the data model and provided inputs based on existing data to ensure the designed model was technically feasible.

Extracted data from the SQL Server and Oracle databases (common dimensions), the Oracle database and flat files.

Developed stored procedures to load the events table which will be used by the MicroStrategy team to generate reports to the users.

Reviewed and fixed all the test cases and updated them in the Quality Center tool.

Designed Health tables to maintain the batch ids, statistics, and parameters.

Designed the archival logic, Error handling & Reprocessing of the data

Coordinated with the QA team in various testing phases, resolving defects and ensuring smooth execution of the test plans.

Performed performance tuning at the source, target and DataStage job levels using indexes, hints and partitioning in DB2 and DataStage.

Investigated and fixed bugs that occurred in the production environment and provided on-call support.

Environment: Informatica 9.6, SSIS, SQL Server, Oracle, Unix Scripting

SUN TRUST BANKS- Utah SkyMiles Card Datawarehouse May-09 to Sep-09

Role: ETL Developer

City: Mysore, INDIA

Description: SunTrust Banks, with total assets of $172.2 billion, is one of the nation's largest financial services holding companies. Infosys was engaged to provide a complete end-to-end data warehousing solution to the client to help them launch a new credit card by maintaining a data mart for all their card acquisitions, new customers and point scoring. SunTrust Banks launched the SkyMiles rewards program along with Delta Air Lines.

Under this program, customers are rewarded with points whenever they use the eligible card for signature or PIN transactions. These reward points can be used for booking flights on Delta Air Lines.

Responsibilities:

Partnered with business users to understand the requirements and converted them into project-level functional specifications.

Worked with business analysts to identify the appropriate sources and tables for the reports.

Participated in data modeling sessions to understand the data model and provided inputs based on existing data to ensure the designed model was technically feasible.

Developed Cognos reports using Report Studio and defined KPI metrics using metric studio.

Performance tuning of the reports to meet the report generation SLA which is 1 minute for normal reports and 2 minutes for dashboard reports.

Reviewed and fixed all the test cases and updated them in the Quality Center tool.

Designed and developed views to be used by reports.

Designed the archival logic, Error handling & Reprocessing of the data

Coordinated with the QA team in various testing phases, resolving defects and ensuring smooth execution of the test plans.

Designed and developed Aggregate tables which include all the metric calculations so that the report can fetch the measures directly from aggregated columns without many calculations in report side.

Investigated and fixed bugs that occurred in the production environment and provided on-call support.

Environment: Cognos Report Studio, Query Studio, DB2, SQL Queries

PETSMART Data Management Sep-08 to May-09

Role: ETL Developer

City: Bhubaneswar, INDIA

Description: The client is a leading retailer in the US. The project deals with sourcing customer data from different databases, cleansing and de-duplicating it to get a single customer view, and then loading the data into another database. The client would also propose new ideas that would bring additional revenue; these ideas were implemented as various projects, some of them, such as ETL folder restructuring and ETL tuning, aimed at improving performance. The support work includes resolving issues raised by the client in User Acceptance Testing and fixing the bugs assigned to our team.

Responsibilities:

Development of Informatica sessions and workflows for loading the Netezza Database from Oracle data sources
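
For context, workflows like these are typically triggered from a scheduler with pmcmd; the domain, service, folder, workflow and credential names below are placeholders, not actual project values:

#!/bin/sh
# Hypothetical scheduler step: start an Informatica workflow and wait for it
# to complete before the calling batch continues.
pmcmd startworkflow \
  -sv INT_SVC_PROD -d DOM_PROD \
  -u "$INFA_USER" -p "$INFA_PASS" \
  -f PETSMART_DM -wait wf_load_netezza_customer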

Partnered with the business users to understand the requirements and converted them into project-level functional specifications.


