
ETL Developer

Location:
Fremont, CA
Salary:
$150,000
Posted:
January 25, 2020



Lakshmanan Kuttiyannagounder

BigData/ETL Developer Cell: 616-***-**** Email: adbfo1@r.postjobfree.com

https://www.linkedin.com/in/lakshmanan-kuttiyannagounder

PROFESSIONAL SUMMARY:

An engineering professional with over 12 years of IT experience designing and executing solutions for Business Intelligence/Data Warehouse and Big Data applications. Known for using the right tools when and where they make sense and for creating an intuitive testing approach that helps organizations effectively analyze and process terabytes of structured and unstructured data. Able to integrate state-of-the-art Big Data/ETL technologies for end-to-end (E2E) implementations.

TECHNICAL SKILLS:

Skill Set: Hadoop Developer, ETL Developer, ETL Testing, Big Data Testing

Streaming Technologies: Apache Kafka 2.0.0, Confluent Kafka 5.0.0, Confluent KSQL, Apache Spark Streaming (PySpark)

Big Data Ecosystem: Hadoop 2.6, Hadoop 2.2, Hive, Sqoop, HDFS

Operating Systems: Linux, UNIX, Windows 10/XP/2000

Programming Languages: Python 3.6.6, Unix shell scripting

Databases: Teradata 16.x/15.x, Oracle 12c/11g, SQL Server 2014/2012, DB2, MySQL, Sybase

ETL Tools: DataStage 11.7/11.5/11.3/8.x/7.x, Ab Initio

Other Utilities: TOAD, Teradata SQL Assistant, Oracle SQL Developer, CA ESP, AutoSys, Postman, MongoDB Compass

Project/Test Management: TFS, HP Quality Center, BMC Remedy, PVCS, JIRA, Agile

Collaboration Tools: SharePoint 2010, Wiki, Confluence, GitHub

Reporting Tools: Tableau, Cognos, MicroStrategy, Grafana

NoSQL Databases: MongoDB, Cassandra, HBase

EXPERIENCE & PROJECT DETAILS:

Employer: Tekvalley Corporation, USA Jan 2017 – Present

Client: Wells Fargo Bank, San Leandro, CA DataStage/Big Data Developer

Responsibilities:

·Working closely with business analysts to gather requirements, review business rules, and identify data sources.

·Involved in the design and development of the Security Control, ATM Profit & Loss (ATM PNL), Tablet Usage Data (TUD), Remedy (RMD), Customer Behavior Data Warehouse (CBD), Branch ATM Data Analytics (BADA), and ATM Hardware (AHA) projects.

·Extensively used DataStage Designer, QualityStage, Administrator, and Director to create and implement jobs that load the Data Marts/Data Warehouse.

·Extensively used the Hierarchical Data stage to parse JSON data and the Unstructured Data stage to read and write Microsoft Excel files.

·Extensive working experience with the Teradata FastExport and MultiLoad/FastLoad utilities.

·Created logical and physical data models, business rules, and data mappings for the Enterprise Data Warehouse system.

·Coordinated with data scientists/business analysts; responsible for writing complex Hive queries to extract insights that convert the potential value of big data into real, tangible business value.

·Created Hive tables, loaded structured/semi-structured data into them, and wrote Hive queries to analyze logs and identify issues and behavioral patterns.

·Responsible for designing and developing end-to-end streaming data pipelines using GoldenGate, Kafka, and Spark Streaming.

·Designed Kafka topics, partitions, and producer/consumer groups, and set up highly available clusters (with ZooKeeper).

·Well versed in best practices for software design and development with Kafka and streaming frameworks (PySpark Streaming).

·Used Confluent Schema Registry serialization and deserialization to parse the contents of data streamed through Kafka.

·Actively involved in code review and bug fixing to improve the performance of PySpark streaming programs.

·Implemented ETL processing using Confluent KSQL and created materialized views as Kafka topics in the Kafka cluster.

·Implemented business transformations by joining, filtering, and windowing Kafka messages using Confluent KSQL.

·Generated test data and validated messages from Kafka topics using Python scripts.

·Used JIRA to track and monitor bugs/issues related to the projects.

Environment: IBM DataStage 11.7/11.5, Teradata 16.x/15.x, Oracle 12c, SQL Server 2014, Hadoop 2.6, HDP 3.0, Hive, Apache Kafka 2.0.0, Confluent Kafka 5.0.3, Confluent KSQL, Apache Spark 2.4.4, PySpark, Python, JIRA, Tableau, Cognos, Grafana.

Employer: HTC Global Services Inc, USA Aug 2015 – Dec 2016

Client: Meijer, Grand Rapids, MI DataStage Consultant

Responsibilities:

·Worked closely with business analysts to gather requirements, review business rules, and identify data sources.

·Involved in the design and development of the Supply Chain, Logistics, Space Management, Predictix, mPerks, Market Basket, ICAP, HR-PeopleSoft, Finance, Pricing & Promo, Digital Account, Digital Transformation, and Marketing & Analytics projects.

·Extensively used DataStage Designer, QualityStage, Administrator, and Director to create and implement jobs that load the Data Marts/Data Warehouse.

·Worked with the data modeling team to build the dimensional model for the proposed system.

·Extensive working experience with the Teradata FastExport and MultiLoad/FastLoad utilities.

·Created ESP jobs to schedule the ETL jobs.

·Created implementation plans and Change Management Requests.

·Coordinated with the development team, SMEs, customers, and data modelers to deliver the project on time.

Environment: DataStage 11.3, Teradata 13.0, DB2, Sybase, Oracle, SQL Server, UNIX, CA ESP.

Employer: HTC Global Services, India Jan 2012 – Jul 2015

Client: Meijer, USA DataStage Team Lead

Responsibilities:

·Involved in the design, development, testing, and implementation of the Supply Chain, Logistics, Space Management, mPerks, Market Basket, ICAP, HR-PeopleSoft, Finance, Digital Transformation, and Marketing & Analytics projects.

·Extensively used DataStage Designer, QualityStage, Administrator, and Director to create and implement jobs that load data into the Data Marts/Data Warehouse.

·Created ESP jobs to schedule the ETL jobs.

·Extensively used the Teradata Connector stage to export and import data into the Data Warehouse.

·Helped prepare the mapping document and design the data model.

·Managed team members and provided technical solutions to them.

Environment: DataStage 11.3/8.5, Teradata, DB2, Sybase, Oracle, SQL Server, UNIX, CA ESP.

Employer: HCL Technologies, India Jun 2011 – Sep 2011

Client: Royal Bank of Scotland (UK) Sr. DataStage Developer

Responsibilities:

·Involved in developing ETL jobs using DataStage to load data into the DWH/DM.

·Helped prepare HLD and LLD documents.

·Involved in designing jobs using stages such as Sequential File/Data Set, Filter, Lookup, and Join.

·Prepared Unit Test Cases and System Test Cases.

·Coordinated team members and assisted them in delivering on time.

Environment: DataStage 8.1, DB2, Oracle, SQL Server, Unix, Control-M.

Employer: HCL Technologies, Malaysia Dec 2010 – May 2011

Client: AmBank, Malaysia Sr. DataStage Developer

Responsibilities:

·Involved in developing ETL jobs using DataStage to load data into the DWH/DM.

·Helped prepare HLD and LLD documents.

·Involved in designing jobs using stages such as Sequential File/Data Set, Filter, Lookup, and Join.

·Prepared and executed Unit Test Cases and System Test Cases.

·Coordinated team members and assisted them in delivering on time.

Environment: DataStage 8.x, DB2, SQL Server, Oracle 9i, TOAD, Zena, and Windows.

Employer: HCL Technologies, India Feb 2010 – Nov 2010

Clients: LBG (UK) & USAA (USA) Sr. DataStage Developer

Responsibilities:

·Analyzed the business requirement documents (BRDs) and mapping documents.

·Involved in developing ETL jobs to extract, transform, and load data into the DWH/DM.

·Involved in designing jobs using stages such as Sequential File/Data Set, Filter, Lookup, Join, and Aggregator.

·Monitored and fixed production job failures.

·Helped execute unit and system test cases.

·Coordinated team members and assisted them in delivering on time.

Environment: DataStage 8.x/7.x, Teradata, DB2, Oracle, SQL Server, UNIX, Tivoli.

Employer: Mahindra Satyam, India Sep 2007 – Jan 2010

Client: Ford, USA & Hutchison 3G, UK DataStage Developer

Responsibilities:

·Involved in developing ETL jobs to extract, transform, and load data into the DWH.

·Documented HLD and LLD documents for the applications' Clone & Go process.

·Involved in designing parallel jobs using stages such as Teradata Connector, Data Set, Sequential File, Lookup, Modify, and Copy.

·Involved in creating, validating, and executing test cases using HP QC.

·Involved in the migration, testing, and validation of Business Objects reports.

·Monitored and fixed failed and long-running DataStage jobs.

·Deployed work orders per change requests.

·Refreshed and published monthly, weekly, and daily reports to end users.

Environment: DataStage 7.5, DB2, Teradata, SQL Server, Oracle 9i, AutoSys, Linux, Business Objects.

EDUCATION:

·Bachelor's Degree in Electronics and Communication Engineering from Madras University, Tamil Nadu, India.


