
Data Sql Server

Location:
Spring, TX, 77373
Posted:
December 06, 2017


Resume:

VINAY KANDULA

ETL DEVELOPER

*****.********@*****.***

+1-682-***-****

Summary:

Proficient ETL Developer with over 8 years of experience in Data Warehousing and ETL development and maintenance.

Good exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation and production support.

Good hands-on experience with Unix shell scripting.

Good knowledge of DataStage client components: DataStage Director, DataStage Manager and DataStage Designer.

Designed a star-schema-based Vendor Liability Data Mart with conformed dimensions. Built the end-to-end Extraction, Transformation and Loading (ETL) process with built-in exception handling.

Led the Serial Number Data Mart implementation, including review of business requirements, functional design, star schema design and ETL architecture.

Extensively worked on Informatica Designer components such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.

Optimized Informatica mappings and sessions to improve performance.

Experience developing mappings with the required transformations in Informatica.

Experience designing and deploying ETL solutions for large-scale OLAP and OLTP instances using Talend.

Strong experience with Informatica workflow tools: Task Developer, Workflow Designer and Worklet Designer.

Developed efficient mappings for data extraction, transformation and loading (ETL) from different sources to a target data warehouse.

Experience in Big Data technologies such as Hadoop/MapReduce, Pig, Hive and Sqoop.

Working experience with Oracle SQL.

Experience integrating various data sources such as Teradata, SQL Server, Oracle, DB2, Netezza and flat files, including delimited, Excel, positional and CSV files.

Good knowledge of data warehousing concepts such as star schemas, dimensions and fact tables.

Hands-on experience with many of the palette components used to design jobs, and used context variables to parameterize Talend jobs.

Experience working with the parallel extender to split bulk data into subsets and distribute it across all available processors for best job performance.

Expert in using different components in Talend: processing components (tFilterRow, tAggregateRow, tMap, tJoin), custom code components (tJava, tJavaRow), log and error components (tDie, tLogCatcher, tLogRow), database components (tOracleInput, tOracleOutput, tOracleCommit, tOracleConnection, tOracleSP, tOracleRollback), file management components (tFileList, tFileCopy), orchestration components (tWaitForFile), file components (tFileInputDelimited, tFileOutputDelimited) and misc. components (tRowGenerator).

Hands-on experience creating Talend routines (user-defined functions).

Experience performing various column- and table-level analyses on data and generating reports.

Good communication and interpersonal skills, team coordination, and well-versed in software development processes.

Proficient in planning, estimation, implementation planning, offshore coordination and time management.

SKILL SET

ETL/Middleware Tools

Talend 5.5/5.6/6.2, Informatica Power Center 9.5.1/9.1.1/8.6.1/7.1.1

Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.

RDBMS

Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.

Programming Skills

SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, DHTML, XML, Pig, Hive, Java and Netezza.

Modeling Tool

Erwin 4.1/5.0, MS Visio.

Tools

TOAD, SQL*Plus, SQL*Loader, Quality Center, SoapUI, FishEye, Subversion and Teradata SQL Assistant.

Operating Systems

Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.

Work Experience:

Sr. Talend / ETL Developer

UnitedHealth Group, Minneapolis, MN

July 2016 to Present

Responsibilities:

Developed complex ETL mappings for stage, dimension, fact and data mart loads. Worked on data migration using export/import.

Involved in reviewing the project scope, requirements, architecture diagrams, proof-of-concept (POC) design and development guidelines for Talend.

Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, including fact loads and Slowly Changing Dimensions (SCD Type 1 and Type 2) to capture changes.

Excellent experience working on tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.

Designed data warehouse schemas (star and snowflake).

Deep understanding of OLTP, OLAP and data warehousing environments, and of tuning both kinds of systems for peak performance.

Performance tuning: used tMap cache properties, multi-threading and tParallelize components for better performance with large source data volumes, and tuned source SQL queries to keep unwanted data out of the ETL process.

Extensively used the tMap component for lookup and joiner functions, along with tJava, tOracle, tXML, delimited-file and tLogRow components in many jobs. Created and worked on over 100 components for use in jobs.

Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.

Designed, developed and improved complex ETL structures to extract, transform and load data from multiple data sources into the data warehouse and other databases based on business requirements.

Managed the AOD SaaS implementation; led star schema design and ETL and BI development and implementation. Managed development teams in-house and offshore. Interacted with business users for product improvement and new releases, and migrated reports from a relational design to a star schema design.

Used many other Talend components, among them tJava, tOracle, tXML, tMap, delimited-file and tLogRow components, in many job designs.

Created Talend jobs to populate the data into dimensions and fact tables.

Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.

Troubleshooting, debugging & fixing Talend specific issues, while maintaining the health and performance of the ETL environment.

Mainly Involved in Performance Tuning of Talend Jobs.

Performed migration of mappings and workflows from Development to Test and to Production Servers.

Worked with Parallel connectors for Parallel Processing to improve job performance while working with bulk data sources in Talend.

Supported team using Talend as ETL tool to transform and load the data from different databases.

Experienced in writing SQL Queries and used Joins to access data from Oracle, and MySQL.

Scheduled ETL jobs on a daily, weekly, monthly and yearly basis.

Experience in Agile methodology.

Environment: Talend Enterprise for Big Data (v6.0.1, 5.6.2/5.6.1), UNIX, SQL, Hadoop, Hive, Pig, Oracle, Unix Shell Scripting, Microsoft SQL Server Management Studio.
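The SCD Type 2 loads described in this role follow a standard pattern: expire the current dimension row when a tracked attribute changes, then insert a new current row. Talend implements this with components (tMap lookups plus database outputs); the sketch below only illustrates the underlying logic in Python with sqlite3, and the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical customer dimension, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        valid_from  TEXT,
        valid_to    TEXT,
        is_current  INTEGER
    )
""")

def scd2_upsert(customer_id, city, load_date):
    """Close the current row if the tracked attribute changed,
    then insert a new current row (SCD Type 2)."""
    row = cur.execute(
        "SELECT city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row is not None and row[0] == city:
        return  # no change, nothing to do
    if row is not None:
        # Expire the previous version of the record.
        cur.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
    # Insert the new current version with an open-ended end date.
    cur.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date),
    )

scd2_upsert(1, "Dallas", "2017-01-01")
scd2_upsert(1, "Austin", "2017-06-01")   # city changed: old row expired
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
print(rows)  # [('Dallas', 0), ('Austin', 1)]
```

An SCD Type 1 load would instead overwrite the changed attribute in place, losing the history; Type 2 keeps every version with validity dates.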

Sr. Talend / ETL Developer

Unisys, Harrisburg, PA

July 2015 to June 2016

Responsibilities:

Involved in understanding the ETL mapping document and Source to Target mappings.

Designed and developed ETL processes using Talend Platform for Big Data, including work on the latest Enterprise versions.

Developed ETL mappings for staging, dimension, fact and data mart loads. Involved in data extraction from various databases and files using Talend Open Studio and the Big Data Edition.

Worked on Talend with Java as the backend language.

Designed and implemented ETL to load data from source to target databases, including fact loads and Slowly Changing Dimension (SCD) Type 1 and Type 2 loads to capture changes.

Prepared the ETL functional documents and test case documents.

Studied the data sources to identify methods of data extraction.

Experienced in processing data files through Talend Open Studio.

Worked with multiple sources such as Relational databases, Flat files for Extraction using tMap and tJoin.

Worked with Talend components such as tMap, tOracleInput/tOracleOutput, tFileInput/tFileOutput, tAggregateRow, tSortRow, tFilterRow, tFilterColumn, tSplitRow, tNormalize and tJavaRow.

Implemented exception handling in Talend using tDie and tLogCatcher.

Working knowledge of reusable constructs in Talend such as contexts and global variables.

Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.

Prepared the test plans and test cases for different functionalities involved in the application.

Worked with Microsoft SQL Server Management Studio/TOAD while implementing unit testing.

Experience using Repository Manager for migration of source code from lower to higher environments.

Used Quality Center as the test management tool to maintain test cases and track defects.

Created Talend Mappings to populate the data into dimensions and fact tables.

Developed complex Talend ETL jobs to migrate the data from flat files to database.

Implemented custom error handling in Talend jobs and also worked on different methods of logging.

Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.

Environment: Talend Platform for Big Data 5.6.2, Enterprise Platform for Data Integration and MDM (v6.1.1, 5.5.1, 5.6.1), UNIX, Hadoop, Hive, Pig, Oracle 11g, SQL Server 2012, Microsoft SQL Server Management Studio, Windows XP.

Talend / ETL Developer

Prudential, Newark, NJ

October 2013 to June 2015

Responsibilities:

Worked closely with Business Analysts to review the business specifications of the project and also to gather the ETL requirements.

Developed jobs, components and Joblets in Talend.

Designed ETL Jobs/Packages using Talend Integration Suite (TIS).

Created complex mappings in Talend using tHashInput/tHashOutput, tDenormalize, tMap, tUniqRow and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.

Used tStatsCatcher, tDie and tLogRow to create a generic joblet that stores processing statistics in a database table to record job history.

Created Talend Mappings to populate the data into dimensions and fact tables.

Frequently used Talend Administrative Console (TAC).

Set up new users, projects and tasks within multiple TAC environments (Dev, Test, Prod, DR).

Developed complex Talend ETL jobs to migrate the data from flat files to database.

Implemented custom error handling in Talend jobs and also worked on different methods of logging.

Created ETL/Talend jobs both design and code to process data to target databases.

Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.

Successfully Loaded Data into different targets from various source systems like Oracle Database, DB2, Flat files, XML files etc. into the Staging table and then to the target database.

Troubleshot long-running jobs and fixed the underlying issues.

Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.

Performed Unit testing and System testing to validate data loads in the target.

Developed data quality test plans and manually executed ETL and BI test cases.

Understood the business functionality in detail.

Tested the entire ETL flow from source to target.

Developed FTP processes for use by the testing team during their testing.

Understood the various process plans, business processes, and functionality in detail.

Initiated knowledge sharing sessions to help circulate the existing knowledge better among the teams.

Environment: Talend 5.1, Oracle 11g, DB2, Sybase, MS Excel, MS Access, TOAD, SQL, UNIX.
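The generic error-handling joblet described above (tStatsCatcher/tLogCatcher feeding a stats table, tDie stopping the job) is a common Talend pattern. As a rough illustration of the logic outside Talend, here is a minimal Python/sqlite3 sketch; the table, job names and wrapper function are all hypothetical:

```python
import sqlite3
import time

# Hypothetical stats table mirroring what a tStatsCatcher/tLogCatcher
# joblet would write; Talend does this with components, not Python.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE job_stats (
        job_name TEXT, status TEXT, message TEXT, run_at REAL
    )
""")

def run_with_stats(job_name, job_fn):
    """Run an ETL step and record its outcome, mimicking a generic
    logging joblet shared by every job."""
    try:
        job_fn()
        conn.execute("INSERT INTO job_stats VALUES (?, 'OK', '', ?)",
                     (job_name, time.time()))
    except Exception as exc:
        # Equivalent of tDie + tLogCatcher: record the failure, then stop.
        conn.execute("INSERT INTO job_stats VALUES (?, 'FAILED', ?, ?)",
                     (job_name, str(exc), time.time()))
        raise

def failing_job():
    raise ValueError("bad row")

run_with_stats("load_dim_customer", lambda: None)  # succeeds
try:
    run_with_stats("load_fact_orders", failing_job)
except ValueError:
    pass  # the failure is already recorded in job_stats

statuses = conn.execute(
    "SELECT job_name, status FROM job_stats ORDER BY rowid"
).fetchall()
print(statuses)  # [('load_dim_customer', 'OK'), ('load_fact_orders', 'FAILED')]
```

The value of centralizing this in one joblet (or wrapper) is that every job records history the same way, so operational reporting and reruns can query a single table.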

Informatica/ ETL Developer

Divine India Limited, Hyderabad.

October 2011 to August 2013

Responsibilities:

Involved in ETL implementation, bug fixing, enhancements and support to BIRST’s customers.

Worked with business analysts to analyze problem statements and propose technical approaches to implementing solutions using Talend and Informatica PowerCenter.

Worked on HVR; created various channels for CDC (change data capture).

Analyzed existing dimension models, ETL mappings, dashboards and reports that were producing errors or incorrect result sets, and modified them to produce correct results.

Monitored Job and query executions and collected performance data to maximize the performance.

Wrote test cases for ETL to compare source and target database systems and check all transformation rules.

Performed Verification, Validation, and Transformations on the Input data.

Tested the messages published by Informatica and the data loaded into various databases.

Involved in performance tuning of the ETL and Reports.

Led the team as module lead for ETL projects.

Designed and developed medium-to-complex Informatica PowerCenter mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Stored Procedure and Update Strategy.

Used Normalization up to 3NF and De-normalization for effective performance.

Involved in implementation of the Test cases and Test Scripts.

Tested the data and data integrity among various sources and targets.

Verified that all data were synchronized after troubleshooting, and used SQL to verify and validate test cases.

Extensively worked on designing the database structure to suit business needs.

Involved in Unit testing.

Involved in preparation of High level and low-level documents.

Extensively worked in the performance tuning which includes removing ETL bottlenecks.

Performed various EQA & IQA activities at account level.

Performed various Project Management activities at account level.

Environment: Erwin r7.3, SQL/MS SQL Server, MS Analysis Services, Windows NT, MS Visio, XML, Informatica.

SQL Developer

Absolute Infotech Pvt Ltd

Bangalore, India

August 2009 to September 2011

Responsibilities:

Handled security issues related to logins, database users, application roles and linked servers.

Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Advisor.

Administered all SQL Server database objects, logins, users and permissions on each registered server.

Resolved deadlock issues with databases/servers on a real-time basis.

Wrote scripts for generating daily backup reports, verifying completion of all routine backups, monitoring log space utilization, etc.

Worked on DTS Package, DTS Import/Export for transferring data from various heterogeneous sources to SQL server.

Created tables, relationships, triggers and indexes for enforcing business rules.

Used SQL Profiler to identify slow-running queries and for performance tuning.

Wrote complex SQL queries, including inner joins, outer joins and update queries.

Developed reports for payment and BI Count to show organizational and seasonal comparison.

Performed incremental and full database recovery, with experience in complex recovery scenarios.


Environment: SQL server 2000 Enterprise Edition, Windows 2000/NT, UNIX, Excel, SQL Profiler, Replication, DTS, MS Access, T-SQL, Crystal Reports.
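The inner vs. outer join distinction mentioned in this role can be shown with a toy schema (table, column and customer names are hypothetical; sqlite3 stands in for SQL Server purely for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (customer_id INTEGER, name TEXT);
    CREATE TABLE orders    (order_id INTEGER, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1);  -- Globex has no orders
""")

# Inner join: only customers that have at least one matching order.
inner = cur.execute("""
    SELECT c.name, o.order_id
    FROM customers c
    JOIN orders o ON o.customer_id = c.customer_id
""").fetchall()

# Left outer join: every customer, with NULL where no order exists.
outer = cur.execute("""
    SELECT c.name, o.order_id
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    ORDER BY c.customer_id
""").fetchall()

print(inner)  # [('Acme', 10)]
print(outer)  # [('Acme', 10), ('Globex', None)]
```

The outer join is what makes "show all customers, including those with no activity" reports possible; an inner join would silently drop the unmatched rows.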

Education Details: Bachelor's degree in Computer Science from Acharya Nagarjuna University (2009)


