
Chinnathambi Sivasubramanian

Cloud Data Architect with experience in Vertica, Oracle, Redshift, DynamoDB, Python, PySpark, Hadoop, AWS, and Agile Scrum

https://www.linkedin.com/in/chinnathambi-sivasubramanian
***************@*****.***

Mobile: 651-***-****

Saint Paul, Minnesota

https://www.youracclaim.com/users/chinnathambi-sivasubramanian/badges

Professional Summary

15 years of experience in requirements gathering, design, development, administration, implementation and testing of data warehousing and business intelligence applications.

Member of the data architecture practice, delivering big data and cloud technology projects and providing leadership to the community.

Experience successfully implementing data-centric applications such as data warehouses, operational data stores, and data integration projects.

Conduct assessments to determine the best approach for implementing cost-effective, scalable, and repeatable data and analytics solutions on AWS and Oracle.

Served as the big data competence lead responsible for a $2M business, staff hiring, growth, and go-to-market strategy.

Expertise in logical and physical database design using data modeling tools such as CA Erwin and Oracle Data Modeler. Good understanding of star schemas, snowflake schemas, facts, and dimensions.

Experience in big data fundamentals and in analyzing large data sets using technologies such as Hadoop, HDFS, Hive, Sqoop, Spark, and PySpark.

Experience in Python programming and UNIX shell scripting for processing large volumes of data from varied sources and loading them into distributed MPP databases such as Teradata, Vertica, and Redshift.

Defined AWS architecture for a completely cloud-based big data solution using EMR, S3, Lambda, and Redshift.

Outlined end-to-end strategy and roadmaps for data platforms as well as for modernizing data and infrastructure.

Experience architecting highly available systems on Amazon Web Services; achieved AWS infrastructure cost savings of about $50,000 for clients.

Good knowledge of and hands-on experience with databases such as Redshift, PostgreSQL, DynamoDB, Snowflake, and SQL Server.

Experience installing, upgrading, and managing Vertica database clusters on Amazon EC2 instances, including creating and managing database users, schemas, roles, tables, views, and projections.

Experience with query optimization, projection design, and Vertica performance tuning using configuration parameters and resource pool settings (a brief sketch follows this summary).

Expertise in creating Oracle SQL and PL/SQL stored procedures, functions, views, indexes, and materialized views.

Extensive experience designing and developing data integration solutions for large volumes of data using ETL tools such as Informatica PowerCenter and Talend.

Experience in Agile methodologies and Atlassian tools like JIRA; expertise in version control tools such as Subversion and Git.

Good understanding of machine learning fundamentals such as feature extraction, dimensionality reduction, and classification and regression techniques.
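As a concrete illustration of the Vertica resource pool and bulk-loading work described above, the following is a minimal sketch using the open-source vertica_python client; the host, credentials, pool, user, table, and file names are hypothetical placeholders, not values from any real system.

import vertica_python

# Connection details are hypothetical placeholders.
conn_info = {
    "host": "vertica-node1.example.com",
    "port": 5433,
    "user": "dbadmin",
    "password": "secret",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()

    # Dedicated pool so heavy ETL sessions cannot starve ad-hoc queries
    # running in the default GENERAL pool.
    cur.execute("""
        CREATE RESOURCE POOL etl_pool
        MEMORYSIZE '4G'
        PLANNEDCONCURRENCY 4
        MAXCONCURRENCY 8
    """)
    cur.execute("ALTER USER etl_user RESOURCE POOL etl_pool")

    # Bulk-load a delimited file; DIRECT writes straight to ROS storage,
    # the usual choice for large batches.
    with open("/data/readings.csv", "rb") as fh:
        cur.copy("COPY staging.sensor_readings FROM STDIN DELIMITER ',' DIRECT", fh)
    conn.commit()

Projection design itself is plain DDL (CREATE PROJECTION with ORDER BY and SEGMENTED BY clauses), typically validated with Vertica's Database Designer before deployment.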

Certifications and training

AWS Certified Solutions Architect

Certified Scrum Master

Oracle Certified Associate

Completed courses in Azure fundamentals

Trained in Snowflake Database

Completed courses in AWS Data and Analytics

Attended Vertica boot camp and performance tuning training.

Professional Core Competencies

Vertica, Amazon Web Services (AWS), Atlassian JIRA, Cloud Computing, Oracle 11g, Unix shell scripting, Git, Sqoop, Database Design, Data Architecture, Informatica, Spark, Hadoop, RDS, DynamoDB, Redshift, Hive, SQL Server, Teradata, Machine Learning, Data Modeling, ETL, Analytics, Business Intelligence, Python, Time Series, Agile Methodologies, Data Warehousing, VSQL, Big Data, Data Analysis, Alteryx, OWB, Azure SQL, Tableau, PowerBI, MicroStrategy, OBIEE, Spotfire

Education:

Bachelor of Computer Science, Manonmaniam Sundaranar University, Tamil Nadu, India (2004)

Professional Experience:

Cognizant Tech Solutions Corp, Cloud Data Architect / Delivery Lead
White Bear Lake, Minnesota, USA, July 2007 – Present

3i Infotech, Software Engineer
Chennai, Tamil Nadu, India, August 2006 – July 2007

Valgen Business Solutions, Software Engineer
Chennai, Tamil Nadu, India, May 2006 – August 2006

Cherrytec Solutions Ltd, Programmer
Chennai, Tamil Nadu, India, June 2004 – January 2006

(A) Trane/Ingersoll Rand through Cognizant

Trane Intelligent Services, Data Architect & Delivery Lead
White Bear Lake, Minnesota, USA: August 2013 – March 2016 & January 2020 – Present
La Crosse, Wisconsin, USA: February 2009 – November 2011
Chennai, Tamil Nadu, India: July 2007 – January 2009, December 2011 – August 2013, April 2016 – December 2019

Project Description:

Trane Intelligent Services (TIS) is a technology-driven platform that provides system-wide data collection, monitoring, and analysis of defined performance parameters on Heating, Ventilation and Air Conditioning (HVAC) units. TIS is enabled by connectivity to a customer's building control unit, which provides the ability to continuously collect data and monitor a site. TIS is a critical IT component of the Trane ecosystem, supporting better and faster maintenance decisions and increasing installations at client locations. As part of the building performance offering, analytics are run against time-series operational data collected through the installed base of building automation systems, with the goal of identifying suboptimal equipment-level and system-level operations.

Responsibilities:

Played multiple roles and worked on multiple product offerings within the Trane Intelligent Services program.

• Established architecture and technical design for data warehouse, ETL, and data integration projects, with technical oversight of resources in all project phases.

• Migrated the legacy data warehouse from on-premises Oracle to the AWS cloud.

• Served as the big data competence lead responsible for a $2M business, staff hiring, growth, and go-to-market strategy.

• Designed and developed the enterprise data warehouse using OWB, Oracle CDC, APEX, and SQL scripting.

• Designed and developed the ODS and DW databases, with data transformation in Oracle PL/SQL.

• Extracted collected data from the operational data store (ODS), applied the business logic, and loaded the results into the data warehouse (DW); a PySpark illustration of this pattern follows this list.

• Administered a three-node Vertica cluster on AWS EC2 instances.

• Created rule-based building performance analytics on data collected from buildings across the globe.

• Gathered requirements and architected, designed, developed, documented, and implemented ETL processes to migrate data from on-premises Oracle to Vertica on AWS.

• Collaborated with the business to improve the overall product, emphasizing quality, usability, reusability, functional improvements, and innovation throughout the development process.

• Developed dynamic PL/SQL packages to extract data from different HVAC equipment types.

• Worked with the data science group to build machine learning models predicting building energy demand and chiller load.

• Acted as the single point of contact for database and ETL work on Trane Intelligent Services over a period of 10 years.

• Developed Talend DI processes to ingest data from third-party APIs, including weather and theater data (a rough Python analogue follows this list).

• Performance-tuned ETL processes and databases.

• Led a team of up to 10 members with strong ETL and database knowledge, mentoring them on best practices and teaching new tools and technologies.
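The ODS-to-warehouse transformations above were implemented in Oracle PL/SQL and OWB; purely as an illustration of the same extract-transform-load pattern, here is a minimal PySpark sketch. The JDBC URLs, table names, columns, and threshold values are hypothetical placeholders.

# Illustrative PySpark sketch of an ODS -> DW load: read collected HVAC
# readings from an ODS table, apply a simple business rule, and append
# the result to a warehouse fact table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ods_to_dw").getOrCreate()

jdbc_url = "jdbc:oracle:thin:@//ods-host.example.com:1521/ODSDB"  # placeholder
props = {"user": "etl_user", "password": "secret",
         "driver": "oracle.jdbc.OracleDriver"}

readings = spark.read.jdbc(jdbc_url, "ODS.EQUIPMENT_READINGS", properties=props)

# Business rule (illustrative only): keep good-quality readings and flag
# out-of-range discharge-air temperatures for downstream analytics.
fact = (
    readings
    .where(F.col("QUALITY_CODE") == "GOOD")
    .withColumn("TEMP_OUT_OF_RANGE",
                (F.col("DISCHARGE_AIR_TEMP") < 45) |
                (F.col("DISCHARGE_AIR_TEMP") > 65))
    .select("EQUIPMENT_ID", "READING_TS", "DISCHARGE_AIR_TEMP",
            "TEMP_OUT_OF_RANGE")
)

fact.write.jdbc(jdbc_url.replace("ODSDB", "DWDB"),  # placeholder DW service
                "DW.FACT_EQUIPMENT_READING",
                mode="append", properties=props)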
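The actual third-party ingestion jobs were built in Talend; as a rough Python analogue of the pattern (pull JSON from an external API and land it as flat files for downstream ETL), the sketch below uses a hypothetical weather endpoint, parameters, and field names.

# Illustrative only: the endpoint, query parameters, and JSON fields are
# invented placeholders, not a real weather provider's API.
import csv
import requests

resp = requests.get(
    "https://api.example-weather.com/v1/observations",  # placeholder endpoint
    params={"station": "KMSP", "date": "2015-06-01", "apikey": "..."},
    timeout=30,
)
resp.raise_for_status()

# Land the payload as a CSV file in the ETL staging area.
with open("/landing/weather_20150601.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["station", "observed_at", "temp_f", "humidity"])
    for obs in resp.json().get("observations", []):
        writer.writerow([obs.get("station"), obs.get("observed_at"),
                         obs.get("temp_f"), obs.get("humidity")])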

Key Projects Delivered:

• Enterprise Data Warehouse Database

• Rooftop Equipment Data Collection and Analytics

• Chilled Water Systems / Chiller Tower Optimization and Sequencing

• Heating Systems

• Building Performance Analytics

• Weather & Meter Data Processor

• Fuel Efficiency Analytics on Telematics Data for Thermo King Intelligent Services

• Machine Learning: Power and Chiller Load Prediction

• GE Lighting Data Ingestion

• Fleets Management Data Integration

• Energy Star Data Integration

• AMC Theater Data Integration

• Data Migration: TIS Gen1 to Gen3, TIS Gen3 to Gen4

• Database & Data Warehouse Server Migration

Environment:

Windows, AIX, UNIX, Amazon Web Services, Informatica, Talend, Oracle, SQL Server, HP Vertica, OWB, Redis, DynamoDB, Postgres, Apex.

(B) Actuate to Oracle BI Migration through Cognizant

Trane Actuate to Oracle BI Migration, Big Data Lead
Chennai, Tamil Nadu, India, January 2018 – December 2018

Project Description:

The customer had multiple reporting tools and needed to consolidate onto a single platform, reducing license and maintenance costs. In Phase 1 we migrated the EDW to a Cloudera Hadoop environment and moved 600+ Actuate reports to Oracle BI Publisher (transactional reports) and OBIEE (analytic reports). The BI consolidation project covered report platform migration from Actuate to OBIEE/BIP using an iterative execution approach, leveraging internal knowledge to minimize migration challenges and aligning the migration strategy with Ingersoll Rand's long-term objectives. The main objectives were to reduce cost, mitigate risk, and accelerate existing processes. The technologies used were Oracle, big data, Hive, and Impala to extract, transform, and load data into specific data zones.

Responsibilities:

• Worked as technical lead and data architect for product teams, focusing on guiding the teams to improve the way they work.

• Guided the ETL team in understanding the business logic of existing Actuate reports and converting them into data-flow processes on Impala, Hive, and Hadoop to control data movement (see the Impala sketch after this list).

• Coached team members on Agile principles and provided general guidance on the methodology.

• Facilitated getting the work done without coercing, assigning, or dictating the work.

• Maintained daily stand-ups, sync-up meetings, and Scrum cadence per the Scrum process and methodology.

• Facilitated sprint planning, retrospectives, and sprint demos.

• Helped the team solve problems rather than providing solutions.

• Assisted with prioritization and resolution of software defects

• Detailed the application integration required to support the process, and identified data elements, data integration, and EDW enhancements for data analytics and KPI measurement.
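As an illustration of the kind of data-flow step the migrated reports were rebuilt around, this minimal sketch runs an aggregate query against Impala using the open-source impyla client; the host, database, table, and column names are hypothetical placeholders.

# Illustrative only: connect to an Impala coordinator and run a report-style
# aggregate over a hypothetical fact table.
from impala.dbapi import connect

conn = connect(host="impala-coordinator.example.com", port=21050)  # placeholder
cur = conn.cursor()

cur.execute("USE reporting")  # hypothetical database
cur.execute("""
    SELECT order_region, COUNT(*) AS order_cnt
    FROM   orders_fact
    WHERE  order_date >= '2018-01-01'
    GROUP  BY order_region
""")
for region, cnt in cur.fetchall():
    print(region, cnt)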

Environment:

Cloudera Hadoop, Hive, Sqoop, Impala, Oracle, Actuate, OBIEE and Oracle BI.

(C) BHP Billiton through Cognizant

Accommodation Optimization & Truck Analytics, Cloud Solution & Data Architect
Chennai, Tamil Nadu, India, April 2016 – June 2017

Client Description:

BHP Billiton (NYSE: BHP): BHP is a dual-listed company; the Australian BHP Billiton Limited and the British BHP Billiton plc are separately listed with separate shareholder bodies, while conducting business as one operation. It is one of the largest coal, copper, iron ore, silver, and petroleum refining and mining operations in Australia. BHP also has production units for natural gas, nickel, and uranium in Australia and other countries.

Project Description:

Accommodation Optimization lowers the cost of housing the employees and contractors working at the different mining units over the next five years. Based on the working patterns of the workers at each mine, they are assigned to the appropriate village using a shortest-path algorithm. Optimization dashboards were developed with TIBCO Spotfire, replacing the customer's manual reports. Successful implementation reduced overall cost by 30% and increased room utilization by 60% through pattern matching, and the model lets the customer extend the optimization end to end across air and road services based on working patterns.

Truck Analytics: The largest costs at a mining company come from machinery and truck failures. Unexpected truck failures inflate the transportation budget and reduce material production, a major productivity impact at mining sites. To overcome this, we built a simple predictive model in R based on truck running period, service IoT data, parts availability in the warehouse (inventory management), and driver experience. The model identifies trucks due for regular maintenance and alerts when a part is likely to fail, which has so far reduced inventory management expenses by 20%.

Responsibilities:

• Understood the business problem and proposed architecture solutions for the customer.

• Designed and documented the B2T for the business requirements.

• Achieved cost savings of $75,000 for the customer over the year.

• Designed the data workflow and AWS setup for Accommodation Optimization and Truck Analytics.

• Modeled data for Accommodation Optimization covering the daily mine roster and village availability.

• Identified the business roster patterns of the workers at the mine, which helped build the predictive model.

• Set up the Redshift database and AWS Data Pipeline for the daily data load (see the load sketch after this list).

• Provided enhanced KPIs in the Spotfire dashboard, helping the customer understand the impacted areas of the mines and villages.

• Replicated data and workflows from SAP (75M rows) to AWS using SQL, automated data extraction via Data Pipeline, and shell scripts for data reconciliation.

• Moved projects from on-premises and AWS into the Hadoop environment with no data loss, overcoming physical limitations; Parquet and compression techniques further increased efficiency.

• Detailed the application integration required to support the process, and identified data elements, data integration, and EDW enhancements for data analytics and KPI measurement.
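As an illustration of the daily Redshift load mentioned above, this minimal sketch stages a file in S3 with boto3 and issues a Redshift COPY through psycopg2; the bucket, cluster endpoint, table, file, and IAM role names are hypothetical placeholders.

# Illustrative only: stage a day's extract in S3, then COPY it into a
# Redshift staging table. COPY pulls from S3 in parallel across slices.
import boto3
import psycopg2

s3 = boto3.client("s3")
s3.upload_file("/data/roster_20170601.csv.gz",   # placeholder local file
               "bhp-analytics-staging",          # placeholder bucket
               "roster/roster_20170601.csv.gz")

conn = psycopg2.connect(
    host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",  # placeholder
    port=5439, dbname="analytics", user="etl_user", password="secret")

with conn, conn.cursor() as cur:
    cur.execute("""
        COPY staging.daily_roster
        FROM 's3://bhp-analytics-staging/roster/roster_20170601.csv.gz'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        CSV GZIP TIMEFORMAT 'auto'
    """)
conn.close()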

Environment:

Windows, UNIX, AWS, Redshift, SAP, Spotfire, Selenium Automation Tools, R Analytics, AWS Data Pipeline, Hive, Spark, Python, SQL, Shell, Impala, Sqoop.

(D) ERP Product Development (ORION) through 3i Infotech

ERP Product Development/Enhancement, Oracle Forms & Reports Developer
Chennai, Tamil Nadu, India, August 2006 – July 2007

Project Description:

The objective of ORION was to support PSCM's growth in inventory, sales, purchasing, and general ledger. ORION is an ERP product customized to client requirements; the major customization is in the inventory module. In ORION, all processes are interlinked through a setup process that can be changed to match client requirements. The Partnership for Supply Chain Management (PSCM) is a non-profit organization formed to serve the President's Emergency Plan for AIDS Relief, providing drugs and other services to the countries identified under the relief plan. Funding is provided via USAID; a letter of credit is opened under Task Order 1 to draw down funds as and when required. PSCM regularly reports spending at various levels (by project, activity/technical area, and subcontractor) as well as statistics on obligations versus actual spending.

Responsibilities:

• Developed the complete process and gateway packages for all manufacturing operations of the work-in-progress system and some sales operations of the ERP module.

• Performed unit testing, prepared test plans and test scripts, supported integration testing, and produced documentation adhering to organizational standards.

• Implemented future enhancements requested by the customer on time.

• Performance-tuned Oracle queries consuming excessive memory or CPU.

Environment:

Oracle 9i, Oracle D2K (Forms 6i & Reports 6i).

(E) Ashok Leyland ERP through Valgen

Ashok Leyland ERP, Oracle Forms & Reports Developer
Chennai, Tamil Nadu, India, May 2006 – August 2006

Project Description:

The objective of ALMAP was to help maintain vehicle manufacturing and to support Ashok Leyland's sales growth.

ALMAP is part of an in-house ERP product used to track everyday production in the company, detailing the vehicles manufactured in every shift; development was done based on client requirements. The major development in ALMAP is SFC (Shop Floor Control), the heart of production, which details finished goods and work in progress (WIP). The project also includes an inwarding module covering material coming into the company. In the inwarding flow, the vendor supplies material to the company; at entry, staff verify that the delivery challan is valid and check its stated quantity against the physical quantity. A Goods Receipt Note (GRN) is then generated, which is the key number throughout the inwarding process. After the GRN document is generated, it is sent to inspection, where the material is sample-tested against the standard. If it meets the standard it is sent to store posting; otherwise the whole GRN is rejected and returned to the vendor through a vendor return document.

Responsibilities:

• Developed PL/SQL procedures, functions, and packages, and developed change requests in Forms 6i according to the business requirements.

• Performed unit testing, prepared test plans and test scripts, supported integration testing, produced documentation adhering to organizational standards, and provided client support for maintenance and enhancement of the product.

• Worked as module lead for Shop Floor Control (SFC), maintaining the work in progress (WIP) of everyday vehicle manufacturing.

• Performance-tuned Oracle queries consuming excessive memory or CPU.

Environment:

Oracle 9i, Oracle D2K (Forms 6i & Reports 6i).


