
Data Developer

Location:
Madison, WI
Posted:
March 27, 2020


Resume:

Professional Summary:

Over ** years of total IT experience in application development, data analysis, DBA work, data warehousing, big data, Microsoft SQL Server 2000/2005/2008/2014/2016, SSIS/SSRS/SSAS, and OLTP and OLAP environments.

A versatile and resourceful leader and team builder, able to motivate and guide teams with diverse skill sets across requirements analysis, development, testing, and support.

Experience with the Netezza platform, an industry-leading, scalable, massively parallel processing platform, as the basis for solutions. Netezza Certified Specialist, PureData System for Analytics v7.0.

Expertise in Power BI and Power BI Pro.

Expert in designing and developing visually rich Power BI dashboards.

Expertise in HDFS architecture, the Hortonworks Hadoop distribution, and cluster concepts.

Demonstrated knowledge of HDFS, MapReduce, YARN, Tez, Hive, file formats, Oozie, Sqoop, Impala, Flume, Zookeeper, and Spark.

Worked on a cluster of 60 nodes.

Good knowledge of Spark SQL architecture and core Spark concepts: RDDs, DataFrames, and Spark Streaming.

Manipulated RDDs in Scala.

Loaded data into Spark RDDs and performed in-memory computation to generate output.

Advanced Spark programming with accumulators and broadcast variables.

Tuned and debugged Spark applications, including Spark SQL performance tuning.

Worked with Spark Dataset interfaces and functions: aggregate, statistical, and scientific functions; data wrangling with Datasets; and SQL access to simple data tables.
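
The RDD-style operations listed above (map, filter, reduceByKey, reduce) can be sketched with pure-Python analogues. This is an illustrative simplification, not Spark itself, and the department/salary records are invented for the example:

```python
from functools import reduce

# A tiny in-memory "RDD": a list of (department, salary) records.
records = [("eng", 100), ("eng", 120), ("ops", 80), ("ops", 90), ("hr", 70)]

# map + filter, analogous to rdd.map(...).filter(...)
high_paid = [dept for dept, salary in records if salary >= 90]

# reduceByKey analogue: aggregate salaries per department.
totals = {}
for dept, salary in records:
    totals[dept] = totals.get(dept, 0) + salary

# A reduce over all values, analogous to rdd.reduce(lambda a, b: a + b).
grand_total = reduce(lambda a, b: a + b, (s for _, s in records))

print(high_paid)    # ['eng', 'eng', 'ops']
print(totals)       # {'eng': 220, 'ops': 170, 'hr': 70}
print(grand_total)  # 460
```

In Spark the same shape of computation runs distributed across partitions; the per-key aggregation here is what reduceByKey does shuffle-side.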

Experience in Python, NumPy, pandas, TensorFlow, and machine learning.

Experience in shell scripting; work experience in Linux and Windows environments.

Good experience using various Pentaho PDI/Kettle steps to cleanse and load data per business needs.

Ran Kettle transformations from jobs; used Kettle variables and parameters.

Experience in SQL Server DTS and SSIS (Integration Services) package design and construction, the SSIS engine, and optimizing package processing and deployment.

Experience developing applications using T-SQL programming (DDL, DML, and DCL).
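
The DDL/DML distinction above can be illustrated with a small sketch; SQLite (via Python's standard library) is used here purely as a stand-in for SQL Server, the table and column names are invented, and DCL statements such as GRANT are omitted because SQLite has no user model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define the schema.
cur.execute("""
    CREATE TABLE claims (
        claim_id   INTEGER PRIMARY KEY,
        member_id  INTEGER NOT NULL,
        amount     REAL NOT NULL
    )
""")

# DML: insert, update, and query rows.
cur.executemany(
    "INSERT INTO claims (claim_id, member_id, amount) VALUES (?, ?, ?)",
    [(1, 101, 250.0), (2, 102, 75.5), (3, 101, 40.0)],
)
cur.execute("UPDATE claims SET amount = amount * 1.1 WHERE member_id = 101")

cur.execute("SELECT COUNT(*), ROUND(SUM(amount), 2) FROM claims")
count, total = cur.fetchone()
print(count, total)  # 3 394.5
```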

Developed reports with SSAS and SSRS on SQL Server (2000/2005/2008). Sound experience and understanding of SSAS, OLAP cubes, and their architecture.

Good experience with MDX and DAX queries.

Domain knowledge in the health care and banking industries.

TECHNICAL SKILLS

Databases

MS SQL Server 2012/2008/2005/2000/7.0, Oracle, Netezza, MS Access

Database Tools

MS SQL Server 2008/2005/2000, MS Access, SQL Server Integration Services, Data Transformation Services, SQL Server Analysis Services, OLAP (Online Analytical Processing), multi-dimensional cubes, and Business Intelligence Development Studio

Languages

T-SQL, PL/SQL, NZSQL, VB.NET, VBScript, HTML, XML, ASP.NET, JavaScript, Python, C, C#, and MDX.

ETL Tools

DTS, SSIS (SQL Server Integration Services), Informatica, Apache Pig, Sqoop, Hive, Oozie, Pentaho.

Reporting Packages

SQL Server Reporting Services, Report Builder, Business Objects 6.5/XI, Crystal reports 9/10/11, MS Excel.

Tools/Methodologies

Erwin, MS Visio, MS Project, MS Office 2007

Educational Qualification:

Bachelor’s Degree: Electronics and Communications Engineering, JNTU, Telangana, INDIA.

Master’s Degree: Software Engineering (GPA 4.0).

Professional Experience

Madison College, Madison, WI Aug 2019 – Current

Project: Campus Solution

Role: Data Analyst/ Business Intelligence Lead Developer

Responsibilities:

Data model design and implementation: designed and created conceptual, logical, and physical relational models and dimensional star/snowflake schemas; created the Business Intelligence reference architecture and implemented the enterprise data warehouse. Created standards and design patterns for data movement, data access, data integration, and other areas of the data warehouse.

Mapped data from source systems to the dimensional model. ETL design, scheduling, deployment, and data loading with SQL and stored procedure development.

Converted Cognos Data Manager ETL to SSIS ETL.

Re-engineered some of the existing ETL processes to streamline data acquisition and integration using SSIS.

Implemented SQL Server table partitioning to decrease data load time, reducing the nightly load from 8 hours to 5 hours.

Query tuning and optimization; implemented SQL Server indexes.

Created SQL Server stored procedures.

Created Power BI reports, visualizations, and dashboards.

Scheduled automatic data refresh in the Power BI service.

Wrote calculated columns and measure queries in Power BI Desktop to support sound data analysis.

Worked with business partners to define analysis requirements, guided them through the BI delivery process, and laid out the business and application architecture road map.

Analyzed business needs to provide technology solutions. Coached and mentored within technical team environments.

Environment: SQL Server 2008/2012/2016, T-SQL, Windows XP/2003, Visual Studio 2008/2010, SSMS, Jira, ALM, agile.

BMO, Milwaukee WI Jan 2016 – July 2019

Project: Netreveal Canada

Role: Business Intelligence Lead Developer/Architect

Responsibilities:

Created a combined view of deposits data by collecting historical data from 10 disparate data sources and systems.

Conducted ETL development in the Netezza environment using standard design methodologies.

Used NZSQL, external tables, nz_migrate, and NZLOAD for day-to-day loading and migration activities.

Extensively used CTAS, clustered base tables (CBT), GROOM, zone maps, materialized views, UDX, EXPLAIN, distribution keys, and distributed joins.
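
CTAS (CREATE TABLE AS SELECT), one of the Netezza techniques listed above, also exists in SQLite, which makes for a portable sketch via Python's standard library; the table and column names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE deposits (acct INTEGER, amount REAL)")
cur.executemany("INSERT INTO deposits VALUES (?, ?)",
                [(1, 500.0), (1, 250.0), (2, 1000.0)])

# CTAS: materialize an aggregated copy of the data in one statement.
cur.execute("""
    CREATE TABLE deposits_by_acct AS
    SELECT acct, SUM(amount) AS total
    FROM deposits
    GROUP BY acct
""")

cur.execute("SELECT acct, total FROM deposits_by_acct ORDER BY acct")
rows = cur.fetchall()
print(rows)  # [(1, 750.0), (2, 1000.0)]
```

On Netezza, CTAS additionally lets you pick a new distribution key for the result table, which is often the point of the exercise.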

Performance tuning of NZSQL queries.

Wrote complex scripts in Netezza to retrieve data from source systems.

Azure Data Factory (ETL integration tool).

Azure Data Lake Store experience.

Used Azure Analysis Services to serve data to a reporting solution and make data available for querying.

Developed various modules on the Hadoop big data platform, processing data using MapReduce, Tez, Hive, and Sqoop.

Netezza-to-Hadoop migration and data validation in the Hadoop environment.

Loaded load-ready files from mainframes into Hadoop; the files were converted to ASCII format.

Extensively loaded SCD tables in Hadoop; implemented ETL table updates on non-ACID Hive tables using insert-only operations.
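
The insert-only SCD pattern mentioned above (maintaining type-2 dimension history on a Hive table without UPDATE support) can be sketched in plain Python. The customer/address columns are hypothetical, and closing the old row is modeled by emitting a corrected copy rather than updating in place:

```python
from datetime import date

# Existing SCD2 dimension rows: (cust_id, address, valid_from, valid_to).
# A valid_to of None marks the current row.
dim = [
    (1, "12 Oak St", date(2018, 1, 1), None),
    (2, "9 Elm Ave", date(2018, 1, 1), None),
]

def apply_change(dim, cust_id, new_address, as_of):
    """Insert-only SCD2: rewrite the output with the old current row
    closed (valid_to set) and append the new current row -- no UPDATE."""
    out = []
    for cid, addr, start, end in dim:
        if cid == cust_id and end is None:
            out.append((cid, addr, start, as_of))  # re-emitted, now closed
        else:
            out.append((cid, addr, start, end))
    out.append((cust_id, new_address, as_of, None))  # new current row
    return out

dim = apply_change(dim, 1, "77 Pine Rd", date(2020, 3, 1))
current = [r for r in dim if r[3] is None]
print(len(dim), len(current))  # 3 2
```

In Hive this typically takes the form of INSERT OVERWRITE from a query that unions the closed history with the incoming changes.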

Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.

Loaded data into Hive tables from Hadoop Distributed File System (HDFS) to provide SQL-like access on Hadoop data. Worked with multiple file formats.

Handled small files with compaction. Good knowledge of the ORC format for improving performance.

Wrote high-speed queries using Hive partitioning and bucketing. Worked with Beeline.
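
Bucketing, mentioned above, assigns each row to a fixed bucket by hashing the bucketing column, so equal keys always land in the same file and bucketed joins can skip the shuffle. A simplified Python analogue (Hive's actual hash function differs, and the key values are made up):

```python
def bucket_for(key: str, num_buckets: int) -> int:
    """Simplified bucketing: hash the key and take it modulo the bucket
    count, so the same key always maps to the same bucket/file."""
    h = 0
    for ch in key:
        h = (h * 31 + ord(ch)) & 0x7FFFFFFF  # Java-style string hash, kept non-negative
    return h % num_buckets

keys = ["cust_001", "cust_002", "cust_001"]
buckets = [bucket_for(k, 4) for k in keys]
print(buckets[0] == buckets[2])  # True: identical keys share a bucket
```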

Implemented Spark using Scala and Spark SQL for faster data processing.

Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs in Scala.

Used the Pentaho PDI graphical designer (Spoon) to design transformations, task flows, and jobs, performing operations against the Netezza database and Hadoop.

Used the Pentaho Kitchen and Pan utilities.

Built and trained CNNs, RNNs, LSTMs, and other deep neural networks.

Environment: SQL Server 2008/2012, T-SQL, Windows XP/2003, Visual Studio 2008/2010, SQL Profiler, Pentaho PDI 8.1, Netezza, Aginity, Hadoop, Python, TensorFlow, Zeppelin, Jira, agile, ALM, SSMS, GitLab, Beeline, Bitbucket, Bamboo.

Microsoft, Bellevue, WA Jan 2015 – December 2015

Project: Demand Planning.

Role: SQL server SSIS SSAS Lead Developer

Responsibilities:

Developed and maintained ETL solutions: SSIS packages and SSAS Tabular models.

Created SSIS packages for full and delta fact/dimension loads.

Implemented error logging for SSIS packages.

Implemented partitions and row-level security for SSAS Tabular models.

Created KPI, calculated columns, calculated expressions using DAX.

Requirements gathering, Source to Target mappings, Data analysis, project Documentation.

Created T-SQL stored procedures, functions, materialized (indexed) views, cursors, and triggers.

Provided oversight to junior- and mid-level developers, including direction, feedback, and revisions.

Evaluated technology, architecture, and processes of existing solutions.

Investigated and evaluated off-the-shelf technology solutions.

Extensively used TFS for Team Projects.

Ericsson, Factoria, WA Aug 2013 – Jan 2015

Project: Data Analytics Project.

Role: SQL server SSIS Netezza PDW Linux Developer

Responsibilities:

Monitored progress and compliance with ETL best practices standards.

Extensively used nz_migrate to copy data from Mustang to TwinFin environment.

Extensively used NZSQL, NZPLSQL, UDX, NZLOAD, Netezza Query Plan and Query tuning.

Managed Netezza queries by performance tuning and techniques such as CBT and Collocation.

Hands on experience in Transactional OLTP databases and Netezza PDW.

Tuned SQL statements (e.g.: - Using Indexes) and Stored Procedures.

Designed dynamic SSIS packages in Visual Studio 2005 and 2008 to transfer data across different platforms, validate data during transfer, and archive data files.

Participated in data modeling and database design.

Performed database development tasks in Netezza and SQL Server.

Modified Unix shell scripts for batch jobs.

BMO Harris Bank N.A., Milwaukee, WI May 2012 – Aug 2013

Project: Comprehensive Capital Analysis and Review (CCAR)

The purpose of the Comprehensive Capital Analysis and Review (CCAR) project is to review the capital position of BMO/Harris/M&I and provide critical mandated Stress Test and FR Y-14 Schedule reports to the Federal Government.

Apollo (Risk Analysis and Reporting)

The purpose of this project is to develop a consistent and quick method to pull in 14 data files from five different sources in order to provide cleaner data and timelier Risk Analysis and Reporting.

Role: SQL server SSIS Netezza Developer

Responsibilities:

Requirement analysis in order to generate complex system integration SSIS packages.

Created Data Model Design for CCAR project.

Scheduled jobs on SQL Server.

Created many complex stored procedures and functions and used them to generate reports on the fly.

Performance tuning of OLTP systems.

Used VB.NET and C# in Script Tasks, primarily to control the flow of SSIS packages. Created SSIS packages using Script Tasks and with error handling.

Designed effective ETL packages to transform data in Netezza.

Environment: SQL Server 2008/2005, T-SQL, Windows XP/2003, Visual Studio 2008/2005, SQL Profiler, PL/SQL, Toad, Netezza, Aginity, Erwin, Business Objects, SQL Server Analysis Services, SourceSafe.

WELLPOINT, Columbia, MD June 2011 – April 2012

Project: Clinical Rules Integration (CRI)

Role: SQL server SSIS SSRS SSAS Developer

The CRI system provides personal health care guidance that can improve the safety, quality, and coordination of health care for WellPoint members and help lower their costs. CRI continually analyzes each individual's health data to create a personal health picture and runs a Personal Health Scan to identify actions to improve health and reduce costs. Personalized, confidential messages are then sent to help members, their doctors, and care managers take action.

Responsibilities:

Worked in database design and data modeling process for CRI system using Erwin.

Good experience loading data into very large SQL Server databases (VLDB, 20 TB).

Extensively used Table and Index Partitioning to improve new data load and query performance.

Developed dynamic SSIS packages to load Medical and Rx claims, Provider, Eligibility, and other extracts for Clinical Rules Integration processing. The extracts arrived in several layouts and had to be validated, re-formatted to a standard layout, and loaded into SQL Server.
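
Validating variable-layout extracts before loading, as described above, can be sketched with the standard csv module; the column names and validation rules below are invented for illustration, and bad rows are routed aside the way an SSIS error output would:

```python
import csv
import io

EXPECTED = ["member_id", "claim_date", "amount"]

def validate_extract(text):
    """Check the header row and basic field rules; return (good, bad) rows."""
    reader = csv.DictReader(io.StringIO(text))
    if reader.fieldnames != EXPECTED:
        raise ValueError(f"unexpected layout: {reader.fieldnames}")
    good, bad = [], []
    for row in reader:
        try:
            row["amount"] = float(row["amount"])  # type check the amount field
            good.append(row)
        except ValueError:
            bad.append(row)  # route to an error table instead of failing the load
    return good, bad

sample = "member_id,claim_date,amount\n101,2020-01-15,250.00\n102,2020-01-16,oops\n"
good, bad = validate_extract(sample)
print(len(good), len(bad))  # 1 1
```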

Implemented Precedence Constraints, Logging, Transactions, Package Configurations, Error and Event Handling in SSIS packages.

Deployed SSIS Packages to File system and SQL server.

Created data warehouse cubes in SQL Server Analysis Services (SSAS).

Understood OLAP processing for changing and maintaining the warehouse: optimizing dimensions and hierarchies and adding aggregations to the cube.

Experienced in analyzing multidimensional cubes in SQL Server 2005 and 2008 Analysis Services (SSAS) using techniques such as slicing and dicing, drill-down, drill-up, and drill-through.

Designed/Implemented the Ad-Hoc Reports

Extensively used joins and sub-queries to simplify complex queries involving multiple tables

Extensively used Dynamic SQL and MDX

Created Database objects including Tables, Triggers, Views and T-SQL Procedures

Worked with various functional experts to implement their functional knowledge into working Procedures

Environment: SQL Server 2008/2005, T-SQL, Windows XP/2003, Visual Studio 2008/2005, Report Builder 2.0, SQL Profiler, Erwin, SQL Server Reporting services, SQL Server Analysis Services, Visio, SourceSafe.

Magellan Medicaid Administration, Richmond, VA Aug 2008 – June 2011

Project: First Rebate, First PDL

Role: SQL server SSIS Developer

Magellan Medicaid Administration (MMA) is directed specifically toward State Medicaid Agencies. MMA generates and forwards rebate invoices, conducts dispute resolution, and updates and maintains the labeler accounts receivable file. The First Rebate and PDL applications improve the effectiveness of the rebate service processes and the accuracy and timeliness of rebate invoicing, receipts, and reporting to external customers.

Responsibilities:

Migrated DTS 2000 Packages to SSIS Packages.

Extracted data from heterogeneous data sources such as SQL Server 2005/2008, Oracle, CSV, Excel, and text files from client servers and via FTP, and loaded the data into SQL Server.

Used various Transformations such as Multicast, Merge Join, Lookup, Fuzzy Lookup, Fuzzy Grouping, Slowly Changing Dimension, Aggregate, Conditional Split, and Derived Column Transformations in the SSIS Packages.
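
Fuzzy Grouping, listed above, clusters near-duplicate strings by similarity so they can be de-duplicated before loading. A rough stand-in using the standard library's difflib (the real SSIS transformation uses its own token-based algorithm, and the company names are made up):

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Case-insensitive similarity test on the 0..1 ratio scale."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

names = ["Acme Corp", "ACME Corp.", "Acme Corporation", "Zenith LLC"]

# Greedy grouping: each name joins the first group whose leader it resembles.
groups = []
for name in names:
    for group in groups:
        if similar(group[0], name):
            group.append(name)
            break
    else:
        groups.append([name])

print([len(g) for g in groups])  # [2, 1, 1]
```

Note the threshold matters: "Acme Corporation" scores below 0.8 against "Acme Corp" here and so starts its own group, which is exactly the kind of tuning Fuzzy Grouping's similarity threshold controls.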

Optimized packages by minimizing the use of blocking transformations, adjusting buffer sizes, executing packages in parallel in the control flow, and using proper event handling, logging, and checkpoints.

Developed code in SQL Server to calculate rates for the Preferred Drug List (PDL), based on requirements gathered from business SME.

Mostly used XML package configurations, making packages easy to move into the production environment.

Coordinated with source system owners, day-to-day ETL progress monitoring and maintenance.

Wrote stored procedures for report generation in SQL Server 2005.

Implemented new T-SQL features added in SQL Server 2005: data partitioning, error handling through TRY-CATCH, and Common Table Expressions (CTEs).
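
CTEs work the same way in SQLite, which makes a convenient runnable stand-in for the SQL Server 2005 feature above; TRY-CATCH is approximated here with Python's try/except around the statement, and the orders table is invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "east", 100.0), (2, "east", 50.0), (3, "west", 75.0)])

# CTE: name an intermediate result and query it, as with WITH ... in T-SQL.
cur.execute("""
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total FROM orders GROUP BY region
    )
    SELECT region, total FROM region_totals WHERE total > 60 ORDER BY region
""")
rows = cur.fetchall()
print(rows)  # [('east', 150.0), ('west', 75.0)]

# TRY-CATCH analogue: trap a SQL error instead of failing the whole batch.
try:
    cur.execute("SELECT no_such_column FROM orders")
    caught = None
except sqlite3.OperationalError as exc:
    caught = exc
print(caught is not None)  # True
```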

Created clustered and non-clustered indexes in SQL Server 2000 and 2005.

Developed physical data models and created DDL scripts to create database schema and database objects.

Environment: SQL Server 2008/2005, T-SQL, Windows XP/2003, Visual Studio 2008/2005, Report Builder 2.0, SQL Profiler, Erwin, SQL Server Reporting services, SQL Server Analysis Services, Visio, SourceSafe.

LORSHI SYSTEMS Jun ’07 – Jun’08

Role: SQL server SSIS SSRS SSAS Developer

The primary role of this project was to assist a team in creating a new reporting system. After detailed and careful reporting analysis, SQL Server Reporting Services 2005 fulfilled the company's needs. A new reporting system was designed to allow internal departments to access data via an internal website using SSRS 2005.

Nightly and minute-level ETL processes into a custom-designed data mart allowed ad-hoc reporting on minutes-old data.

Responsibilities:

Created functional requirement specifications and supporting documents for business systems.

Converted OLTP data into fact and slowly changing dimension tables.

Experience creating SSIS packages using ActiveX scripts and with error handling.

Deployed SSIS packages and reports to production servers.

Cleaned data using Fuzzy Lookup and Fuzzy Grouping.

Experience in Data Warehouse Modeling with SQL Server Analysis Service (SSAS)

Created parameterized Crystal Reports and customized existing reports using cross-tab reports, sub-reports, running totals, sorting, alerting, and Visual Basic syntax for report formulas.

Migrated Crystal Reports to SQL Server 2005 Reporting Services.

Built data mart solutions to improve the performance of SQL Server 2005 Reporting Services (SSRS).

Designed/Implemented the Ad-Hoc Reports

Created Linked Reports and Snapshots

Designed/implemented SQL Server Reporting Services management security.

Developed stored procedures, functions, indexes, views and triggers.


