Chandu K
**********@*****.***
Contact No: 419-***-****
PROFESSIONAL SUMMARY:
Around 6.5 years of experience with the ETL tool Informatica PowerCenter 10.2/9.6/8.6/8.5/8.1.1, designing and developing complex Mappings, Mapplets, Transformations, Workflows, and Worklets, and scheduling Workflows and Sessions.
Worked in ETL and data integration, developing ETL mappings and scripts, guiding the team on transformations, and covering all aspects of the SDLC, including requirements gathering, analysis, design, and development.
Strong PL/SQL programming for data population and table alterations, with Oracle performance tuning, SQL tuning, and query optimization skills. Proficient in Oracle 10g/9i, SQL, PL/SQL, SQL*Plus, SQL Developer, and TOAD.
Experienced in integrating various data sources such as Oracle 11g/10g/9i/8i, SQL Server 2005/2008, MS Access, XML source and target files, and flat file sources into the staging area and different target databases.
Experience in various stages of the System Development Life Cycle (SDLC) and approaches such as Agile methodology. Well experienced in error handling and troubleshooting using various log files.
Experience in creating ETL transformations and jobs using Pentaho Kettle Spoon designer and Pentaho Data Integration Designer and scheduling them on Pentaho BI Server.
Experience in writing shell scripts for various ETL needs.
Experience in creating complex mappings using various transformations and developing strategies for Extraction, Transformation, and Loading (ETL) mechanisms using Pentaho.
Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions. Expert in entity-relationship modeling and dimensional modeling for data warehouses and data marts.
Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
Developed Extraction, Transformation and Loading (ETL) processes to acquire and load data from internal and external sources. Experience in working with UNIX Shell Scripts for automatically running sessions and creating parameter files to run various batch jobs.
Data Integration experience, including ETL, Data Warehouse. Expert in using the Informatica Debugger to understand the errors in mappings and used the expression editor to evaluate complex expressions and look at the transformed data to solve mapping issues.
Expertise in enhancements/bug fixes, troubleshooting, impact analysis, unit testing, integration testing, UAT, and research. Excellent understanding of data warehousing concepts and best practices, with involvement in the full development life cycle of data warehousing.
Expertise in enhancements/bug fixes, performance tuning, and troubleshooting. Excellent presentation and interpersonal skills, with strong written and verbal communication and problem-solving skills.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 10.2/9.6.1/9.5.1/8.6.1/8.1.1
Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio
RDBMS: Oracle 12c/11g/10g, Teradata 15/14/13/12, DB2, SQL Server 2000/2005/2008/2012, MySQL, Sybase
QA Tools: WinRunner, QuickTest Pro, TestDirector, Quality Center
Reporting Tools: Cognos, Business Objects, Dashboard Reporting
Languages: Java, XML, UNIX Shell Scripting, SQL, PL/SQL
Operating Systems: Windows, UNIX, Linux
EDUCATION:
Bachelor's in Information Systems
Master’s in Information Assurance
PROFESSIONAL EXPERIENCE:
Role: ETL Developer Feb 2018 – Present
AT&T - Atlanta, GA
Responsibilities:
Created ETL transformations and jobs using the Pentaho Kettle Spoon designer and Pentaho Data Integration Designer and scheduled them on the Pentaho BI Server.
Used Pentaho data integration to replace previously existing stored procedures and packages with Pentaho jobs, thus decreasing their daily runtimes.
Created transformations that involve configuring the following steps: Table input, Table output, Text file output, CSV file input, Insert/Update, add constants, Filter, Value Mapper, Stream lookup, join rows, Merge join, Sort rows, Database Lookup, Set Environment Variables.
Prepared design documents for ETL pipeline mappings, sessions, and workflows. Tested Pentaho ETL jobs, reports, and database stored procedures called by the reports.
Deployed ETL files on the Linux servers and wrote shell scripts to run ETL jobs on scheduled timelines.
Imported bulk data from Oracle tables and populated it through various input and output steps.
Used Pentaho Data Integration Designer to extract data from various sources, including fixed-format flat files, Excel, XML, and CSV files.
Used session logs, workflow logs and debugger to debug the session and analyze the problem associated with the mappings and generic scripts.
Created Complex ETL Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, and Connected and unconnected lookups, Filters, Sequence, Router and Update Strategy.
Analyzed user requirements and created and reviewed the ETL specifications.
Designed Incremental strategy, Created Reusable Transformations, Mapplets, Mappings/Sessions/Workflows etc.
Used session parameters and mapping variables/parameters and created parameter files to enable flexible workflow runs based on changing variable values.
Developed various reports for business analysis using Pentaho, Actuate, and SSRS; scheduled the reports using schedulers and set up email notifications to users when report execution completed.
Involved in identifying the reports required by business users, migrating them from Actuate to Pentaho, and resolving issues and comments raised by users as part of the migration.
Developed jobs to run SQL*Loader with the help of Perl and shell scripting, loading data from various sources (Excel, CSV, or flat files) into the target database tables.
Worked on SQL and PL/SQL to extract, process, and load data for ETL jobs developed with a combination of Perl and shell scripting; handled production issues, ETL job failures while loading data to the data warehouse and ODS, and data issues related to reports.
Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
Created PL/SQL stored procedures, functions, and packages for moving data from the staging area to the data mart. Created pre- and post-session UNIX scripts, functions, triggers, and stored procedures to drop and re-create indexes and to handle complex calculations on the data (a brief PL/SQL sketch appears after this list).
Developed normalized Logical and Physical database models to design OLTP system for insurance applications. Created workflows, worklets and used all the other tasks like email, command, decision, event wait, event raise and assignment tasks in the Workflow Manager.
Created new fact and dimension tables along with various transformations and jobs used for data staging and data extraction.
Set up sessions to schedule jobs using Notification Services and generated reports using Pentaho Report Designer.
Extensively used ETL methodology for supporting Data Extraction, transformations and Loading process.
Extensively used SQL Server 2016 to load data from flat files into database tables in Oracle. Modified existing mappings to accommodate new business requirements.
Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs. Participated in system analysis and data modeling, which included creating tables, views, indexes, synonyms, triggers, functions, procedures, cursors and packages.
Used the Debugger to test the mappings and fixed bugs. Also worked with the QlikView tool to create dashboards for business users per their requirements.
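The staging-to-data-mart procedures above were project specific, but the following minimal PL/SQL sketch, using invented table and index names (stg_customer, dim_customer, idx_dim_customer_nk), illustrates the general pattern: a merge-style load wrapped with the pre/post-session index drop and re-create mentioned above.

```sql
-- Hypothetical illustration only: all object names are invented.
CREATE OR REPLACE PROCEDURE load_dim_customer AS
BEGIN
  -- Drop the index before the bulk load (re-created afterwards), mirroring
  -- the pre/post-session index handling described above.
  EXECUTE IMMEDIATE 'DROP INDEX idx_dim_customer_nk';

  -- Merge new/changed staging rows into the data-mart dimension table.
  MERGE INTO dim_customer d
  USING stg_customer s
     ON (d.customer_nk = s.customer_id)
  WHEN MATCHED THEN
    UPDATE SET d.customer_name = s.customer_name,
               d.region        = s.region,
               d.updated_dt    = SYSDATE
  WHEN NOT MATCHED THEN
    INSERT (customer_nk, customer_name, region, created_dt)
    VALUES (s.customer_id, s.customer_name, s.region, SYSDATE);

  EXECUTE IMMEDIATE
    'CREATE INDEX idx_dim_customer_nk ON dim_customer (customer_nk)';
  COMMIT;
END load_dim_customer;
/
```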
Role: ETL/SQL/SSIS Developer June 2016 – Jan 2018
Pizza Hut, Dallas TX
Responsibilities:
Created new tables, sequences, views, procedures, cursors, and triggers for database development.
Created indexes on tables for faster retrieval of data to enhance database performance.
Used all kinds of SQL Server constraints (primary keys, foreign keys, defaults, check, unique, etc.).
Extensively used advanced features like object types, collections, and dynamic SQL.
Analyzed, designed, developed, implemented, and tuned SSIS 2014 packages to perform the ETL process.
Responsible for understanding complex SSIS packages and making them understandable to the team.
Strong understanding of data-driven business decisions; used the mapping document to implement them in the SSIS ETL process.
Modified scripts to handle automated loading, extraction, and transformation (ETL) of data using SSIS.
Used Tableau to build interactive reports and dashboards with customized parameters.
Involved in designing, developing, and testing the ETL (Extract, Transform, and Load) strategy to populate data from various sources.
Designed ETL packages dealing with different data sources (SQL Server, flat files) and loaded the data into the target data warehouse by performing different kinds of transformations using SQL Server Integration Services (SSIS).
Used SSIS tester automated tool to test packages, tasks and precedence constraints.
Created SSRS reports such as tabular, matrix, chart, dashboard, and sub-reports using SSRS 2008 R2.
Created stored procedures, triggers, user-defined functions, and views for both online and batch requests, handling business logic and functionality of various modules (a brief T-SQL sketch appears after this list).
Participated in daily SCRUM meetings and gave the daily status of testing.
Developed data visualizations and dashboards using Power BI and Tableau.
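As a representative illustration of the constraint types and stored procedure work listed above, the following T-SQL sketch uses invented object names (dbo.Store, dbo.OrderHeader, dbo.usp_GetDailyOrderTotals); it is not the actual project code.

```sql
-- Hypothetical illustration only: all object names are invented.
CREATE TABLE dbo.Store (
    StoreId   INT          NOT NULL CONSTRAINT PK_Store PRIMARY KEY,
    StoreName VARCHAR(100) NOT NULL
);
GO

-- Table using the constraint types listed above: primary key, foreign key,
-- unique, check, and default.
CREATE TABLE dbo.OrderHeader (
    OrderId     INT           NOT NULL CONSTRAINT PK_OrderHeader PRIMARY KEY,
    StoreId     INT           NOT NULL CONSTRAINT FK_OrderHeader_Store
                                  REFERENCES dbo.Store (StoreId),
    OrderNumber VARCHAR(20)   NOT NULL CONSTRAINT UQ_OrderHeader_Number UNIQUE,
    OrderTotal  DECIMAL(10,2) NOT NULL CONSTRAINT CK_OrderHeader_Total
                                  CHECK (OrderTotal >= 0),
    CreatedDate DATETIME      NOT NULL CONSTRAINT DF_OrderHeader_Created
                                  DEFAULT (GETDATE())
);
GO

-- Simple stored procedure of the kind created for online/batch requests.
CREATE PROCEDURE dbo.usp_GetDailyOrderTotals
    @OrderDate DATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Aggregate order counts and totals per store for the requested day.
    SELECT StoreId,
           COUNT(*)        AS OrderCount,
           SUM(OrderTotal) AS DailyTotal
    FROM dbo.OrderHeader
    WHERE CAST(CreatedDate AS DATE) = @OrderDate
    GROUP BY StoreId;
END;
GO
```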
Role: Data Analyst July 2014 – May 2016
Ford Motors, Michigan
Responsibilities:
Worked with users to identify the most appropriate source of record required to define the asset data for financing.
Performed data profiling in the target DWH.
Experience in using OLAP functions such as COUNT, SUM, and CSUM.
Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
Developed normalized Logical and Physical database models for designing an OLTP application.
Developed new scripts for gathering network and storage inventory data and ingesting it into Splunk.
Developed many queries using HiveQL and extracted the required information.
Exported the required data to an RDBMS using Sqoop, making it available to the claims processing team to assist in processing claims based on the data.
Designed and deployed rich graphic visualizations with drill-down and drop-down menu options, parameterized using Tableau.
Extracted data from the database using SAS/ACCESS and SAS SQL procedures and created SAS data sets.
Created Teradata SQL scripts using OLAP functions such as RANK to improve query performance while pulling data from large tables (see the query sketches after this list).
Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, schema design, etc.
Performed Data analysis using Python Pandas.
Good experience with Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics and Excel data extracts.
Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
Hands-on experience with pivot tables and graphs in MS Excel.
Used advanced Excel features such as pivot tables and charts for generating graphs.
Designed and developed weekly and monthly reports using MS Excel techniques (charts, graphs, pivot tables) and PowerPoint presentations.
Strong Excel skills, including pivot tables, VLOOKUP, conditional formatting, and handling large record sets, as well as data manipulation and cleaning.
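Two representative query sketches for the Teradata OLAP and Hive work noted above; all table and column names (vehicle_sales, daily_sales_stage, edw_dealer_history) are invented for illustration, not taken from the actual projects.

```sql
-- Teradata-style OLAP sketch: rank dealers by monthly sales volume and keep
-- the top 10 per month (hypothetical table and columns).
SELECT dealer_id,
       sale_month,
       SUM(sale_amount) AS month_sales,
       RANK() OVER (PARTITION BY sale_month
                    ORDER BY SUM(sale_amount) DESC) AS sales_rank
FROM vehicle_sales
GROUP BY dealer_id, sale_month
QUALIFY RANK() OVER (PARTITION BY sale_month
                     ORDER BY SUM(sale_amount) DESC) <= 10;
```

```sql
-- HiveQL sketch: compare freshly landed daily sales against an EDW reference
-- table to flag dealers trending well above their historical daily average
-- (hypothetical tables; load_dt assumed to be a string partition column).
SELECT f.dealer_id,
       SUM(f.sale_amount)     AS current_day_sales,
       MAX(h.avg_daily_sales) AS historical_avg_sales
FROM daily_sales_stage f
JOIN edw_dealer_history h
  ON f.dealer_id = h.dealer_id
WHERE f.load_dt = '2016-03-01'
GROUP BY f.dealer_id
HAVING SUM(f.sale_amount) > 1.2 * MAX(h.avg_daily_sales);
```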