
Software Development SQL Server

Location: Katy, TX
Posted: January 31, 2024


Sakthivel Mahalingam

ad29jv@r.postjobfree.com

Phone: 1-713-***-****

SUMMARY OF SKILLS:

18+ years of experience and expertise in all phases of the software development life cycle (SDLC), including analysis, requirement engineering, architecture design, database design, development, enhancement, and maintenance of standalone, multi-tier, object-oriented enterprise applications.

Experience with Linux, Docker containers, Git, and Codecloud.

Experience with Big Data technologies such as Apache Spark, Flink, HDFS, Hive, and Kafka.

Experience in data warehouse tabular and multidimensional data modeling and Business Intelligence using SSIS, Informatica, SSAS, star schemas, DAX, Power BI, and Tableau.

Exposure to Azure Data Factory, Databricks, Data Lake Storage, and Azure Synapse Analytics.

Experience with SQL Server, Teradata, Oracle, Redis, Cassandra, and MongoDB.

Development and maintenance experience in programming languages such as Java, Scala, and .NET.

Possess strong project planning and execution skills.

Experience in analyzing processes and implementing automation wherever applicable.

A fast learner with good communication and presentation skills and strong problem-solving and troubleshooting capabilities.

Capable of picking up new technologies in a very short time.

Analytical in approach, hardworking, committed, and a good team leader.

Technical skill set:

Big Data
  Data Processing: Hadoop (MapReduce, Hive, Pig), Spark (Batch, Streaming, Spark SQL), Apache Flink
  Data Ingestion: Kafka, Sqoop
  NoSQL Databases: Redis, Cassandra, MongoDB, HBase
  Hadoop Distributions: Apache, Cloudera

Data Warehouse
  Data Modeling: Star Schema, Snowflake, Tabular, Multidimensional
  Databases: SQL Server, Oracle, MS Access
  ETL/Reporting Tools: SSIS, SSAS, SSRS, Crystal Reports XI, Power BI, Tableau
  Azure: Data Factory, Databricks, Data Lake, Azure Synapse Analytics

Programming Languages: Java, Scala, Python, .NET, C, C++

Environments: Windows, Linux, Docker

Project experience summary:

1. Offer Decision (OD) Dec 2022 – Present

Client: AT&T Inc, Atlanta, GA

Technologies: Scala, Flink, Kafka, Redis, Cassandra, MongoDB, IBM Streams, Azure

A framework enabling personalization, recommendations, and decisioning for all customer types, across all channels, for all customer interactions.

Roles and responsibilities:

Coordinated with business customers to gather business requirements.

Interacted with other technical peers to derive technical requirements.

Worked extensively with Kafka topics, communicating between jobs through Kafka.

Involved in core development using Scala, migrating from the existing IBM Streams application.

Worked extensively with Redis, Cassandra, and MongoDB.

Implemented data processing using Apache Flink APIs.

Designed a data quality framework to perform schema validation and data profiling on Flink (a sketch of this pattern follows the list).

Worked on application performance tuning.

Involved in Azure and on-premises deployment and testing.
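As a minimal Scala sketch of the Kafka/Flink validation pattern described above, assuming hypothetical topic names, broker address, and payload fields (none of which come from the actual AT&T application):

    import java.util.Properties
    import org.apache.flink.api.common.serialization.SimpleStringSchema
    import org.apache.flink.streaming.api.scala._
    import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, FlinkKafkaProducer}

    object OfferEventValidation {
      def main(args: Array[String]): Unit = {
        val env = StreamExecutionEnvironment.getExecutionEnvironment

        val props = new Properties()
        props.setProperty("bootstrap.servers", "localhost:9092") // assumed broker
        props.setProperty("group.id", "offer-decision")          // assumed consumer group

        // Consume raw events from the input topic (topic name is hypothetical).
        val raw: DataStream[String] = env.addSource(
          new FlinkKafkaConsumer[String]("offer-events", new SimpleStringSchema(), props))

        // Crude schema check for illustration only; a real data quality framework
        // would parse the payload and validate it against a registered schema.
        def isValid(e: String): Boolean =
          e.contains("\"offerId\"") && e.contains("\"channel\"")

        // Valid events flow on to the next job; the rest are dead-lettered.
        raw.filter(e => isValid(e)).addSink(
          new FlinkKafkaProducer[String]("offer-events-valid", new SimpleStringSchema(), props))
        raw.filter(e => !isValid(e)).addSink(
          new FlinkKafkaProducer[String]("offer-events-dlq", new SimpleStringSchema(), props))

        env.execute("offer-event-validation")
      }
    }

Splitting valid and invalid records onto separate topics keeps each downstream job's input clean while preserving bad records for profiling.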

2. Offline Ingest Dec 2022 – Present

Client: AT&T Inc, Atlanta, GA

Technologies: Scala, Flink, Kafka, Redis, Azure

Offline Ingest is an application job that ingests data files into Redis and publishes them to the online application. It is designed for a real-time scenario in which the configuration attributes of offers, rules, actions, and other inventory parameters can change constantly.

Roles and responsibilities:

Coordinated with business customers to gather business requirements.

Interacted with other technical peers to derive technical requirements.

Worked extensively with Kafka topics, communicating between jobs through Kafka.

Involved in core development using Scala.

Worked extensively with Redis, storing the ingested files in Redis.

Implemented data processing using Apache Flink APIs.

Built the file-reading system for DAT and XML formats.

Worked on the XML parser, dynamically converting the XML mapping for storage in Redis (see the sketch after this list).

Involved in Azure and on-premises deployment and testing.
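To illustrate the file-to-Redis path described above, here is a hedged Scala sketch that parses an inventory XML file and stores each entry as a Redis hash via Jedis; the file layout, element names, key format, and pub/sub channel are all assumptions made for the example:

    import scala.jdk.CollectionConverters._
    import scala.xml.XML
    import redis.clients.jedis.Jedis

    object OfflineIngestSketch {
      def main(args: Array[String]): Unit = {
        val jedis = new Jedis("localhost", 6379) // assumed Redis endpoint

        // Assumed layout: <offers><offer id="1"><name>...</name>...</offer></offers>
        val doc = XML.loadFile("offers.xml")

        (doc \ "offer").foreach { offer =>
          val id = (offer \ "@id").text

          // Store each child element as a hash field so individual config
          // attributes can be updated without rewriting the whole offer.
          val fields = offer.child.collect {
            case e: scala.xml.Elem => e.label -> e.text
          }.toMap

          jedis.hset(s"offer:$id", fields.asJava)

          // Notify the online application that this offer changed (channel assumed).
          jedis.publish("offer-updates", id)
        }
        jedis.close()
      }
    }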

3. Plant To DART Jan 2020 – Dec 2022

Client: Kinder Morgan, TX, USA

Technologies: Java, Scala, Hive, HDFS, Flink, Kafka, Cassandra, MongoDB, SQL Server

This process is designed specifically to interface meter data into the DART system from a massive database maintained by various CO2 field employees. Meter data in this interface includes meters not currently residing in SCADA and represents third-party meter operators. The data is then interfaced to DART, where it is processed for allocations and passed on to the Lawson system.

Roles and responsibilities:

Coordinated with business customers to gather business requirements.

Interacted with other technical peers to derive technical requirements.

Implemented MapReduce programs to handle semi-structured and unstructured data such as XML, JSON, and Avro data files, and sequence files for log files.

Developed ETL jobs to import and store massive volumes of data in HDFS (an illustrative sketch follows this list).

Designed and developed Hive scripts to work against unstructured data from various data points and created a baseline.

Worked extensively with Cassandra, SQL, HDFS, and MongoDB.

Implemented data processing using Apache Flink APIs.

Extensive programming using Java and Scala.

Designed a data quality framework to perform schema validation and data profiling on Flink.
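The ETL bullets above can be pictured with a short sketch. Note that this one uses Spark SQL (listed in the skill set) rather than the hand-written MapReduce jobs the bullets mention, and the HDFS path, column names, and Hive table are hypothetical:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object MeterIngestSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("plant-to-dart-ingest")
          .enableHiveSupport() // write results as Hive tables
          .getOrCreate()

        // Land the semi-structured meter files from HDFS; Spark infers the schema.
        val meters = spark.read.json("hdfs:///landing/co2/meters/*.json")

        // Baseline validation before the data reaches the allocation process.
        val clean = meters
          .filter(col("meter_id").isNotNull)
          .filter(col("reading_volume") >= 0)

        // Append to a Hive table consumed by the downstream DART interface.
        clean.write.mode("append").saveAsTable("dart.meter_readings")
        spark.stop()
      }
    }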

4. CO2 Financial Dashboard March 2018 – Dec 2019

Client: Kinder Morgan, TX, USA

Technologies: Java, Power BI, SSRS, TFS, Azure Data Factory, Azure Synapse Analytics, SQL Server

The objective of the project is to leverage CO2 financial data for financial research analysis and reporting on key performance indicators. The project involves interactions with multiple web services in an SOA architecture, communicating via XDIME/XML to support OLTP transactions. The frontend interface accepts input from various stakeholders and performs the analytics to generate reports.

Roles and responsibilities:

Involved in complete SDLC (System Development Life Cycle).

Developed dashboards using Java and SSRS for internal executives and board members that measured and reported on key performance indicators.

Utilized Power BI to gather, compile and analyze data from pivot tables and created graphs/charts.

Designed and implemented effective database solutions (Azure blob storage) to store and retrieve data.

Deployed Data Factory to create data pipelines that orchestrate data into the SQL database.

Created charts and graphs for reporting findings, performed all the analysis required to submit the study to the stakeholders.

Designed and developed all 18 corporate dashboards used by the executives and board members.

5. EHS Data Warehouse May 2017 – Feb 2018

Client: Kinder Morgan, TX, USA

Technologies: MS Visual Studio 2012, Teradata, Informatica, SSAS, Star Schema, Analysis Services Tabular Model, DAX, Tableau, Power BI, SSRS, TFS

A data warehouse for EHS applications that stores multiple subject areas and highly detailed information for various financial needs. The end reports are generated using Power BI and SSRS to forecast financial risks and assessments. The data warehouse uses Azure technologies: data arrives in the landing zone or staging area from different sources through Azure Data Factory, and once ready, it is made available to customers in the form of dimension and fact tables.

Roles and responsibilities:

Involved in complete SDLC (System Development Life Cycle).

Developed performance utilization charts, optimized and tuned SQL, and designed physical databases. Assisted developers with Teradata load utilities and SQL.

Converted batch jobs from the BULKLOAD utility to the TPump utility.

Researched Sources and identified necessary Business Components for Analysis.

Gathered the required information from the users.

Interacted with different system groups for analysis of systems.

Created tables and views in Teradata according to the requirements.

Created proper primary indexes (PIs), taking into consideration both planned data access and even distribution of data across all available AMPs.

Implemented slowly changing dimensions methodology to keep track of historical data.

6. Impact Data Migration and Reports Sep 2016 – Apr 2017

Client: Kinder Morgan, TX, USA

Technologies: MS Visual Studio 2012, Informatica, SSAS, Star Schema, Analysis Services Tabular Model, DAX, Teradata, Power BI, SSRS, TFS

Data migration for the Impact EH&S application, moving data from the old vendor application, Stars, to the EH&S Impact application, and recreating the Crystal Reports in Power BI.

Roles and responsibilities:

Understood the BRD, data requirements document, and mapping documents.

Created databases and users.

Maintained space and users on the development machine.

Designed the ETLs and conducted review meetings.

Developed performance utilization charts, optimized and tuned SQL, and designed physical databases. Assisted developers with Teradata load utilities and SQL.

Created and enhanced Teradata stored procedures to generate automated testing SQL.

Created understanding and release-testing documentation for the tickets handled, and documented the issues found in a central repository.

Involved in data mining using Teradata Miner.

7. Emission Inventory Tracking Jan 2014 – Aug 2016

Client: Kinder Morgan, TX, USA

Technologies: MS Visual Studio 2012, SSIS, SSAS, Star Schema, Analysis Services Tabular Model, DAX, SQL Server 2012, Power BI, SSRS, TFS

Emission Inventory Tracking (EIT) is used within the Terminals Business Unit to estimate emissions associated with storage tanks, loading racks, combustion sources, and other miscellaneous emission sources at the liquid terminals. The program is also used to manage tank and chemical databases and to compare emissions against permit limits to determine permit compliance.

Roles and responsibilities:

Analysis, database design, and implementation.

Created SSIS packages and wrote stored procedures for ETL.

Analyzed the data and created the star schema structure (facts and dimensions) and the SSAS Tabular model cube.

Designed the denormalized star schema cube.

Wrote DAX queries to calculate measure fields.

Wrote batch files to trigger the process and schedule jobs.

Coded PL/SQL procedures and functions.

Created reports using Power BI and SSRS.

Involved in gathering requirements and discussions with the finance team; also worked on estimates and confirmed them with the client.

Provided support to the production team.

Performed code reviews for team members.

8. iCOST Jan 2013 – Dec 2013

Client: Intel, OR, USA

Technologies: MS Visual Studio 2012, Vertica, SSAS, Star Schema, Analysis Services Tabular Model, Teradata, Tableau

Intel’s new cost and inventory solution, the iCost project, transformed Intel’s critical, tier-1 inventory valuation and cost-of-sales solution from the ICE legacy solution built in the mid-90s. This complex and well-executed project transformed all facets of the solution: policy, process, people’s roles, and the platform. iCOST introduces significant policy, process, and system changes for Intel Finance. It is a critical application that allows Intel to better understand product costs and inventory.

Roles and responsibilities:

Analysis, database design, and implementation.

Created and enhanced Teradata stored procedures to generate automated testing SQL.

Worked on query performance tuning.

Wrote batch files to trigger the process and schedule jobs.

Created reports using SSRS and Tableau.

Developed ETL using Vertica.

Involved in gathering requirements and discussions with the client for phase 2 activities; also worked on estimates and confirmed them with the client.

9. Mortgage Active Directory Oct 2011 – Jan 2013

Client: Beech Enterprises, UT, USA

Technologies: ASP.NET, XML, C#, WCF, SQL Server 2012, JavaScript, Ajax, SOA

Mortgage Active Directory is a set of web services used to communicate with the domain servers in order to manage Active Directory users. Each user’s login is tied to Active Directory users and groups. The login project is developed in ASP.NET; the login pages interact with the Active Directory web services, and based on the user’s rights, the login determines the user’s navigation.

Roles and responsibilities:

Analysis, database design, and implementation.

Coded PL/SQL procedures and functions.

Provided support to the production team.

Performed code reviews for team members.

Developed secured web services using .NET, C#, VB.NET, and WCF.

Implemented the customer flow model for the current application as part of the implementation.

Prepared the user guide and conducted training sessions.

10. DIRECTWARE Admin Panel July 2009 – Sep 2011

Client: Direct Mortgage Corporation, UT, USA

Technologies: ASP.NET, C#, VB.NET, SQL Server 2005, VB, XML, XSL, JavaScript, AJAX

DIRECTWARE provides a platform for loan officers to choose from a list of home loans with different monthly payments. It caters to the following broad processes in mortgage:

1. Loan Originator

2. Loan Servicing.

Roles and responsibilities:

Led an offshore team of .NET and SQL developers.

Analysis, database design, and implementation.

Involved in designing new web forms using ASP.NET 3.5.

Coded PL/SQL procedures and functions.

Provided support to the production team.

Education details

M.Sc. (Mathematics) from the University of Madras (D.G. Vaishnav College).

B.Sc. (Mathematics) from the University of Madras (D.G. Vaishnav College).

Professional Training

Diploma course at SSI Limited, July 2000.

E-commerce course at Zap InfoTech, April 2001.


