Developer Data

Location:
Vienna, VA
Posted:
September 15, 2020


Mr. Srinivasan Selvaraj

Job Title

Tech Lead / Senior ETL Data Warehouse and Data Science Consultant

Educational History

Master of Engineering in Computer Science, Sathyabama University, India, 2012

Bachelor of Engineering (2008), Computer Science Engineering, Anna University, India

Diploma in Information Technology (2005), DOTE, India

Certifications/Awards

Salesforce Certified Administrator (SU16), Salesforce.com, 2016.

Salesforce Certified Platform Developer I (SU16), Salesforce.com, 2016.

Hadoop training and certification, Tata Consultancy Services, 2016.

IBM Certified Solution Developer - InfoSphere DataStage v9.1, IBM, 2015.

Scrum Master Accredited Certification, Scrum Institute, 2016.

Six Sigma Green Belt, Tata Consultancy Services, 2012.

Oracle Certified Associate (OCA), Oracle, 2011.

Financial Marketing, NCFM, 2011.

Skill set

Software Engineering, Tech Lead, IT/Data specialist

InfoSphere DataStage, Informatica PowerCenter, Hadoop Big Data, Salesforce, SAP BO

Unix, Linux, Windows

Perl, Python, UNIX shell (sh, ksh, bash), Spark, SQL*Loader (sqlldr)

Oracle PL/SQL, Netezza, SQL Server, DB2, Teradata, Hive, flat files

Data Science Workstation (AWS)

BWise, Hyperion Essbase, CFO, DERA

SQL Developer, SQuirreL, DBeaver, WinSCP, PuTTY, Control-M, crontab, AutoSys, MS Visio, ER/Studio, SVN, JIRA, Git, ServiceNow

FTP, SFTP, SSH, Web Services, Web Scraping

Citizenship Status

H-1B and I-140 extension

Clearance Level (if applicable)

Public Trust

Experience Summary

Mr. Srinivasan has around 14 years of experience in enterprise application development, with in-depth knowledge of data warehousing and business intelligence concepts and an emphasis on ETL and life-cycle development, including requirements analysis, design, development, testing, and implementation in both SDLC and Agile frameworks.

IBM Certified Solution Developer - InfoSphere DataStage v9.1.

Focused experience in building data warehouses and performing data migration using IBM DataStage 7.5/8.1/9.1/11.x.

Expertise includes data analysis, data cleansing, transformation, integration, and data import/export using DataStage and other ETL tools, with sources and targets including Oracle, SQL Server 2000, DB2, Netezza, Salesforce, mainframe, and flat and complex flat files.

Extensive experience in developing job sequences for both parallel and server jobs, and in debugging, troubleshooting, monitoring, and performance tuning using DataStage Designer and DataStage Director.

Experience in developing and maintaining Informatica PowerCenter mappings.

Proven track record in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancement.

Hadoop-trained developer who has worked in the Big Data framework; experience with Hadoop HDFS, Hive managed and external tables, Sqoop, Spark, and Pig scripting.

Experience with Linux-based operating systems and in writing UNIX shell (sh, ksh, bash), Perl, and Python scripts.

Proficient in SQL and PL/SQL, along with bulk-load utilities such as SQL*Loader. Strong working experience in Oracle, specifically PL/SQL, views, stored procedures, performance tuning, and database normalization.

Exposure to dimensional modeling (star/snowflake schemas, slowly changing dimensions) and E/R (3NF) modeling at both logical and physical levels.

Proven ability to quickly earn the confidence of clients and stakeholders by seamlessly adapting to their strategy and technology.

Trained to deliver readily consumable results to the appropriate level of business groups.

Demonstrated success working in a distributed-team environment; works effectively under pressure and within short time constraints.

Scrum/Agile-based daily and weekly task reporting.

Professional Experience

Technical Team Lead & Senior ETL Developer

DERA ODS Production Support, US Securities and Exchange Commission (SEC)

07/2018 – current

Led a technical team of 6 using the Scrum (Agile) framework, creating stories and task assignments for data science projects in JIRA. Provided architectural guidance on new designs and enhancements.

Involved in the development and enhancement of various datasets on the Oracle, Netezza, and Hadoop platforms, across the full development cycle of planning, analysis, design, development, testing, and implementation in an Agile framework.

Coordinated with customers, SMEs, and vendors to gather requirements and develop the requirement specification, source-to-target (S2T) mapping, and design documents.

Coordinated with the SMEs to provide them with the necessary stored procedures and packages and the necessary insight into the data.

Supported various critical datasets/projects in Division of Economics and Risk Analysis - Office of Data Science division.

Created and modified SQL*Plus, PL/SQL, and SQL*Loader scripts for various ETL processes.

Extracted data from XML files and other flat files and loaded it into the database using SQL*Loader.
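As an illustration only (the element and field names below are invented, not taken from the actual SEC datasets), this kind of XML-to-delimited extraction ahead of a SQL*Loader load can be sketched in Python:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical sample input; real feeds would arrive as files on disk.
SAMPLE_XML = """<filings>
  <filing><cik>0000320193</cik><form>10-K</form><date>2020-01-15</date></filing>
  <filing><cik>0000789019</cik><form>10-Q</form><date>2020-02-10</date></filing>
</filings>"""

def xml_to_delimited(xml_text, fields):
    """Flatten one row per <filing> element into pipe-delimited lines,
    suitable for a SQL*Loader control file with FIELDS TERMINATED BY '|'."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out, delimiter="|", lineterminator="\n")
    for rec in root.iter("filing"):
        writer.writerow([rec.findtext(f, default="") for f in fields])
    return out.getvalue()

rows = xml_to_delimited(SAMPLE_XML, ["cik", "form", "date"])
print(rows)
```

The delimited output would then be referenced by a SQL*Loader control file and loaded with `sqlldr`.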

Developed and maintained DataStage jobs that read XML or flat files, run the ETL process, and load databases or generate flat-file feeds for downstream systems.

Developed Perl, Python, and Bash scripts that read XML or flat files, run the ETL process, and load databases or generate flat-file feeds for downstream systems.

Created and modified several UNIX shell scripts and DataStage jobs according to the changing needs of the project and client requirements.

Developed the core ETL process using UNIX shell scripts; wrote scripts to process files on a daily basis, e.g. renaming files, extracting the date from the file name, unzipping files, and removing junk characters before loading them into the stage tables.
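A minimal sketch of that daily preprocessing step, here in Python rather than shell; the file-naming pattern and junk-character rule are assumptions for illustration:

```python
import re
import string
import tempfile
from pathlib import Path

PRINTABLE = set(string.printable)

def preprocess(path: Path) -> Path:
    """Extract the date embedded in the file name, strip junk
    (non-printable) characters, and rename the file for the stage load."""
    m = re.search(r"(\d{8})", path.name)   # assumed pattern: feed_YYYYMMDD.txt
    file_date = m.group(1) if m else "unknown"
    clean = "".join(ch for ch in path.read_text(errors="replace")
                    if ch in PRINTABLE)
    staged = path.with_name(f"stage_{file_date}.txt")
    staged.write_text(clean)
    return staged

with tempfile.TemporaryDirectory() as d:
    raw = Path(d) / "feed_20200131.txt"
    raw.write_text("ID|NAME\n1|acme\x00corp\n")   # embedded NUL = "junk"
    staged = preprocess(raw)
    staged_name, staged_text = staged.name, staged.read_text()

print(staged_name)
```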

Involved in the continuous enhancements and fixing of production problems.

Created reusable PL/SQL stored procedures and packages for moving data from the staging area to the data mart.

Created scripts for new tables, views, and queries for enhancements to the application using SQL Developer.

Created indexes on tables for faster data retrieval and improved database performance.

Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools including SQL Developer, SQuirreL, DBeaver, and SQL*Plus.

Handled errors extensively using exception handling in Python, Perl, and DataStage for ease of debugging and for displaying error messages in the application.
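One common shape for that kind of exception handling in an ETL loader, sketched in Python (the row format and logger name are illustrative assumptions): bad rows are logged with context and skipped instead of aborting the whole feed.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("etl")

def load_row(row):
    """Parse one pipe-delimited row; raises ValueError on malformed input."""
    rec_id, amount = row.split("|")
    return int(rec_id), float(amount)

def load_feed(rows):
    """Load rows one by one; log and collect rejects rather than failing fast."""
    loaded, rejected = [], []
    for lineno, row in enumerate(rows, start=1):
        try:
            loaded.append(load_row(row))
        except ValueError as exc:
            # Surface the line number and the offending text for debugging.
            log.error("line %d rejected (%s): %r", lineno, exc, row)
            rejected.append(row)
    return loaded, rejected

loaded, rejected = load_feed(["1|10.5", "oops", "2|3.25"])
print(len(loaded), len(rejected))
```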

Wrote Shell Scripts for Data loading and DDL Scripts.

Improved the performance of the application by rewriting the SQL queries.

Involved in developing and supporting ETL processes that source EDGAR data residing in Hive on the Hadoop HDFS file system.

Developed and maintained external tables in the Hadoop environment; loaded structured and unstructured files into the HDFS file system.
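The HDFS staging step typically comes down to the `hdfs dfs -put` shell command; a small sketch that builds such a command (the paths are invented for illustration, and on a real cluster node the list would be handed to `subprocess.run`):

```python
def hdfs_put_cmd(local_path, hdfs_dir):
    """Build the 'hdfs dfs -put' command used to stage a local file into
    an HDFS directory; -f overwrites any existing copy of the file."""
    return ["hdfs", "dfs", "-put", "-f", local_path, f"{hdfs_dir}/"]

cmd = hdfs_put_cmd("/data/in/edgar_20200131.txt", "/warehouse/edgar/raw")
print(" ".join(cmd))
# On a cluster node: subprocess.run(cmd, check=True)
```

A Hive external table defined over `/warehouse/edgar/raw` would then pick up the staged file without a separate load step.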

Supported DataStage ETL jobs in production.

Technical Lead & Senior ETL Developer

DERA ORDS Production Support, US Securities and Exchange Commission (SEC)

01/2017 – 06/2018

Involved in the development and enhancement of various datasets on the Oracle, Netezza, and Hadoop platforms, across the full development cycle of planning, analysis, design, development, testing, and implementation.

Coordinated with customers, SMEs, and vendors to gather requirements and develop the requirement specification, source-to-target (S2T) mapping, and design documents.

Coordinated with the SMEs to provide them with the necessary stored procedures and packages and the necessary insight into the data.

Supported various critical datasets in Division of Economics and Risk Analysis- Office of Research and Data Science division.

Created and modified SQL*Plus, PL/SQL, and SQL*Loader scripts for various ETL processes.

Extracted data from XML files and other flat files and loaded it into the database using SQL*Loader.

Developed and maintained DataStage jobs that read XML or flat files, run the ETL process, and load databases or generate flat-file feeds for downstream systems.

Developed Perl, Python, and Bash scripts that read XML or flat files, run the ETL process, and load databases or generate flat-file feeds for downstream systems.

Created and modified several UNIX shell scripts and DataStage jobs according to the changing needs of the project and client requirements.

Developed the core ETL process using UNIX shell scripts; wrote scripts to process files on a daily basis, e.g. renaming files, extracting the date from the file name, unzipping files, and removing junk characters before loading them into the stage tables.

Involved in the continuous enhancements and fixing of production problems.

Created and reused PL/SQL stored procedures and packages for moving data from the staging area to the data mart.

Created scripts for new tables, views, and queries for enhancements to the application using SQL Developer.

Created indexes on tables for faster data retrieval and improved database performance.

Used bulk collections for better performance and easier data retrieval by reducing context switching between the SQL and PL/SQL engines.
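The batching idea behind PL/SQL's BULK COLLECT/FORALL can be illustrated outside Oracle as well. This is an analogy only, using Python's stdlib sqlite3: `executemany` carries all rows across the API boundary in one call, much as bulk binds cut per-row SQL-to-PL/SQL engine context switches.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stage (id INTEGER, amount REAL)")
rows = [(i, i * 1.5) for i in range(1000)]

# Batched bind: one call inserts all rows, instead of a row-at-a-time
# loop issuing 1000 separate INSERT statements across the API boundary.
conn.executemany("INSERT INTO stage VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM stage").fetchone()[0]
print(count)
```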

Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools such as PL/SQL Developer and SQL*Plus.

Handled errors extensively using exception handling for ease of debugging and for displaying error messages in the application.

Wrote ETL code in PL/SQL to meet requirements for extracting, transforming, cleansing, and loading data from source to target data structures.

Wrote Shell Scripts for Data loading and DDL Scripts.

Improved the performance of the application by rewriting the SQL queries.

Involved in developing and supporting ETL processes that source EDGAR data residing in Hive on the Hadoop HDFS file system.

Supported DataStage ETL jobs in production.

DataStage ETL Lead

United Services Automobile Association (USAA)

11/2014 -12/2016

Built a reusable data-load framework:

Preprocessed files from source systems to make them consumable by the load framework

Extracted data from source files, Oracle DB, and the EDW

Loaded target tables into Oracle, Netezza DB, and the EDW

SAP BO reporting

Developed Bash scripts to invoke Perl and MaxL scripts to retrieve data from Essbase cubes

Wrote Oracle queries to build rules in Essbase

Responsible for client interactions and application demonstrations; performed requirements analysis, suggested DataStage designs, and helped prepare technical specifications and other documentation.

Led efforts as tech lead in both waterfall and Agile SDLC methodologies.

Coordinated with business SMEs to gather requirements and acted as liaison to the development team.

Designed and developed ETL modules using DataStage parallel jobs; developed jobs using stage types such as Sequential File, Transformer, Lookup, Funnel, Copy, Aggregator, and Oracle Enterprise.

Designed and developed ETL processes using DataStage designer to load data from Oracle, MS SQL, Flat Files (Fixed Width) and XML files to staging database and from staging to the target Data Warehouse database.

Responsible for ETL performance monitoring and tuning; analyzed ETL workflows to identify performance bottlenecks and resolved them by appropriately using DataStage job stages and partition methods.

Developed and maintained programs for scheduling data loading and transformations using DataStage, Oracle, and BMC Control-M tools.

Designed and Developed DataStage mappings for Data cleansing and Conversion and loaded data into target Oracle database.

Developed Jobs to Extract and load Dimension tables and Fact tables in Enterprise Data Warehouse.

Developed Unix shell scripts to automate the Data Load processes to the target Data warehouse

Coordinated in building an Oracle Hyperion Essbase cube using Essbase, Perl, and MaxL scripting, and wrote cube load rules for the marketing cube.

Responsible for cube building and for writing Bash, Perl, MDX, and MaxL scripts.

Created Oracle base tables and staging tables and performed administrative activities on the Oracle database.

Involved in extracting data from Oracle, flat files, and SQL Server and loading it into the Oracle DB environment.

Used the Lookup stage with reference Oracle tables for the insert/update strategy and for updating slowly changing dimensions.
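In outline, that lookup-based insert/update strategy works like this (a minimal Python sketch with hypothetical column names, not the actual DataStage job logic): each incoming row is looked up by its business key; missing keys become inserts, and changed tracked columns become updates.

```python
def apply_upsert(dimension, incoming, key, tracked):
    """Type-1-style insert/update: look up each incoming row by its business
    key; insert when absent, overwrite tracked columns when they changed."""
    inserts = updates = 0
    for row in incoming:
        existing = dimension.get(row[key])
        if existing is None:
            dimension[row[key]] = dict(row)          # new key -> insert
            inserts += 1
        elif any(existing[c] != row[c] for c in tracked):
            existing.update({c: row[c] for c in tracked})  # changed -> update
            updates += 1
    return inserts, updates

dim = {"C1": {"cust_id": "C1", "state": "VA"}}
incoming = [
    {"cust_id": "C1", "state": "TX"},   # existing key, changed column
    {"cust_id": "C2", "state": "CA"},   # new key
]
ins, upd = apply_upsert(dim, incoming, key="cust_id", tracked=["state"])
print(ins, upd)
```

A type-2 slowly changing dimension would instead close out the old row (effective-date columns) and insert a new version rather than overwriting in place.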

Built and maintained recon jobs in the Informatica framework

Built BO reports on top of Oracle DB

Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries.

Sr. DataStage Developer

United Services Automobile Association (USAA)

09/2013 -10/2014

Led the effort as tech lead of a 7-member team in Agile SDLC methodologies.

Responsible for requirements analysis and DataStage design suggestions.

Worked with Business Analysts to define Acceptance Criteria for each assigned task to satisfy the business objectives.

Involved in Client interactions and Application demonstrations.

Involved in analyzing Data sources and identify daily/weekly/monthly/yearly operational transactions.

Developed new ETL jobs to load data for Operational use (Oracle) and Analytical use (Netezza).

Responsible for the design, development, unit testing, and production deployment of ETL jobs in DataStage 9.1.

Developed job sequencer with proper job dependencies, job control and triggers using stages like User Activity Variables, Job Activity, Execute Command, Loop Activity and Terminate.

Created parameter sets and value sets to run jobs in different environments and for different sources with common business requirements.

Scheduled ETL jobs by setting up the Control-M scheduler via the BMC automation controller for different load frequencies.

Modified existing ETL jobs to enhance performance and add new functionality to support business needs.

Worked with data modeler to create tables as per business requirement.

Built a Hyperion Essbase cube with Perl and MaxL scripting, writing cube load rules.

Extracted data from relational databases including DB2 and Oracle 11g using ODBC and Oracle Connector Stage with customized SQL queries.

Wrote and performed SQL queries in Oracle to test job functionality; broke down complex jobs when necessary and analyzed job logs from DataStage Director to debug DataStage jobs.

Migrated the Oracle database, with new implementation scripts, to the user acceptance testing environment.

Modified existing shell and UNIX scripts to suit the present functional requirements.

Prepared design and other support documents as per the changes due to defects and change requests.

Sr. DataStage Developer

United Services Automobile Association (USAA)

07/2012 – 08/2013

Coordinated with business SMEs to gather requirements and acted as liaison to the development team.

Responsible for all activities related to DataStage development, implementation, administration and support of ETL processes for large-scale databases

Research new features and provide analysis and recommendations for implementation

Designed DataStage jobs for extracting, cleansing, transforming, integrating, and loading data into the data mart/warehouse database.

Scheduled DataStage ETL batch jobs on a daily, weekly, and monthly basis through Control-M.

Worked on the development of jobs for both DB2 and Netezza up to the data-mart layer.

Developed Unix shell scripts to automate the Data Load processes to the target Data warehouse

Developed Jobs to load Oracle, DB2 tables and integrate with BWISE application using windows services.

Extracted data from relational databases including DB2 and Oracle 11g using ODBC and Oracle Connector Stage with customized SQL queries

Developed SQL queries to perform DDL, DML, and DCL against Oracle databases.

Wrote and performed SQL queries in Oracle to test job functionality; broke down complex jobs when necessary and analyzed job logs from DataStage Director to debug DataStage jobs. Worked on testing Netezza code and fixed issues during the conversion. Developed test data, conducted performance testing on the developed modules, and prepared unit test plans and documents. Performed unit and system testing.

Designed Technical documents on delivered functionality / code as required.

Sr. DataStage Developer

United Services Automobile Association (USAA)

10/2010 – 07/2012

Implement the following Protegent Modules:

(a) Trade Review Module

(b) Ad-Hoc Reporting Module

Customization of SunGard Protegent core application to adhere to USAA requirements:

(a) integration of member number

(b) householding

Implement data interfaces in support of baseline rules.

Configuration of the baseline rules.

Implement role-based hierarchy, entitlements and workflow processes.

Generate new and updated policies and procedures for trade surveillance.

Provide a 17A-4 compliant data retention solution.

Implement customized 22C-2 rule for retail mutual funds.

Analyzed business scenarios and related the requirements using UML (use case diagrams). Responsible for writing ETL components. Involved in the study of the existing system and impact analysis.

Understanding the Protegent tool to provide proper data.

Analysis of the data sources

Preparation of Source to Target mapping (S2T) and Analysis & Design documents

Adhering to clean up standards using Data Clean

Quality Process Management using IQMS

Knowledge transition about the existing systems to the new joiners

Unit Testing, Integration Testing and System Testing

DataStage Developer

MetLife India Insurance Company Ltd

06/2009 – 09/2010

Analyzed and understood the requirements and prepared test cases covering all possible scenarios

Analysis of the data sources

Wrote ETL components

Coordinated on ETL components

Involved in study of the existing system and impact analysis

Preparation of Source to Target mapping (S2T) and Analysis & Design documents

Adhering to clean up standards using Data Clean

Unit Testing, Integration Testing.

DataStage Developer

Pfizer

06/2008 – 05/2009

Analyzed and understood the requirements and prepared test cases covering all possible scenarios

Analysis of the data sources

Wrote ETL components

Involved in study of the existing system and impact analysis

Preparation of Source to Target mapping (S2T) and Analysis & Design documents

Adhering to clean up standards using Data Clean

Unit Testing, Integration Testing

Intern Software Developer, FOCUSTECH MEDIA

12/2006 – 05/2008

Studied the existing systems and performed impact analysis. Carried out analysis, design, development, and presentations for project screening every 6 months.

Analysis of the data sources that support reliability metrics.

Built a front-end web page in Java on top of an Oracle database using Swing and servlets.

Worked with the ETL group to understand DataStage graphs for dimensions and facts.

Extensively worked on Dimensional modeling, Data cleansing and Data Staging of operational sources using ETL processes.

Submitted a paper on the developed project to the university.


