
Data SQL Server

Location:
Bloomington, IL
Posted:
August 15, 2016

Contact this candidate

Resume:

Bhanu Akaveeti

** ******** **, *** ***, Bloomington, IL 61704

Cell: 309-***-**** *****.********@*****.***

Professional Summary

12+ years of experience executing EDW/BI projects through the full SDLC, using Agile and Waterfall methodologies, for Fortune 500 companies

Built new and modified existing ETL/ELT solutions using Informatica and Teradata to meet business and systems requirements through software component design, coding, testing, and quality review

Built Data Warehouse/Data Mart/reporting environments on DB2, Oracle, Teradata, SQL Server and Postgres databases, on Unix/Linux and mainframe platforms

Proficiently used complex SQL, PL/SQL and the reporting tools Cognos and QlikView against large databases for data mining, extraction, normalization, filtering, aggregation, querying, interpretation and visualization

Strong expertise in dimensional modeling using Star and Snowflake schemas and in identifying facts and dimensions; proficient in the data modeling tools MS Visio and Erwin

Proven expertise in handling high-volume data from relational, structured, semi-structured and unstructured sources using Informatica

Collaborated with Product Managers, Product Owners and SMEs to successfully deliver Minimum Viable Product (MVP) without causing spikes

Created and implemented architectural patterns to expedite delivery of new products and services to market while meeting the client's future-state objectives

Extensive experience with Waterfall and Agile project execution; experienced with the tools Microsoft Project, Trac and Rally and with the practices Test-Driven Development (TDD), Behavior-Driven Development (BDD), pair programming and continuous integration

Demonstrated strong technology-lead experience: assigning and prioritizing tasks, producing estimates and timelines, and guiding the team on performance tuning, debugging, root-cause analysis, break fixes and permanent fixes

Improved the quality and efficiency of work by revising design, process, coding and communication standards

Working knowledge of the Big Data technologies HDFS, HBase, Flume, Sqoop, Pig, Hive and Impala

Extensive experience executing projects in onsite/offshore and multi-vendor models

Critical thinking, along with effective communication and conflict resolution skills

Technical Proficiency

ETL Tools : Informatica PowerCenter, Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump)

Database : Oracle, DB2, Teradata, SQL Server, Postgres

Data Modeling : MS Visio, Erwin

Reporting Tools : Cognos, QlikView

Languages : SQL, PL/SQL, Python, Unix scripting, XML, JCL, COBOL

Operating Systems : Unix, Mainframes, Windows

Change Control : ENDEVOR, PANVALET, ELIPSE, REMEDY

Scheduling Tools : CA7, Control-M

Querying Tools : SPUFI, SQL ASSIST, TOAD, SQL Server Management Studio

FTP Tools : WinSCP, UltraEdit, FileZilla

Experience Summary

Natsoft Corporation Technology Lead - ETL May 2012 – Present

Clients

State Farm Insurance, Bloomington, IL

Responsibilities and Accomplishments

Created Architectural Decision Document (ADD) by working with architects to facilitate decisions on Data placement, Data Ingestion, Data Warehousing, and Reporting Solutions

Created dimensional models for the Integrated Customer Platform built on Postgres and the Federated Data Warehouse built on Oracle and DB2

Converted business requirements into technical requirements by creating source-to-target mappings, and created Technical Specification Documents and test plans

Worked with the Product Owner and Scrum Master in formulating the product backlog, sprint backlogs, tasks, estimates, timelines, execution, feedback and refactoring

Designed and coded ETL/ELT patterns using Informatica, meeting the requirements of continuous availability, failover capability and continuous integration

Ingested Data Science analytics from Hadoop cluster onto the Integrated Customer Platform using Informatica PowerCenter, for consumption by Enterprise Agent Search service to increase Digital ROI by rewarding performing agents

Designed and developed complex ETL patterns using Informatica PowerCenter for high-volume scenarios, making efficient use of loader utilities (EDB Loader, SQL*Loader, BMC Load/Unload), partitioning, pushdown optimization and Source Qualifier (SQ) transformations

Provided data analysis and design on Oracle, DB2 and SQL Server using complex SQL for ad-hoc and on-demand reporting requirements built with Cognos and QlikView

Created and used DB2 and Oracle stored procedures, views, triggers, indexes and functions

Identified and reported Data Quality issues and worked closely with Data Quality team and source system SMEs to correct any data anomalies

Designed and implemented solution for processing large and complex license and appointment file from vendor (NIPR) using Informatica XML SQ transformation and helped eliminate dependency on IBUS

Developed an automated data reconciliation process for capturing and verifying data-load statistics
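
A reconciliation pass of this kind can be sketched as follows. This is a minimal illustration only: the table and column names are hypothetical, and SQLite stands in for the actual warehouse databases.

```python
import sqlite3

# Minimal sketch of load-statistics reconciliation: compare the row count
# and a cheap column checksum between a staging table and its target.
# Table/column names are hypothetical; SQLite stands in for the warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_policy (policy_id INTEGER, premium REAL)")
cur.execute("CREATE TABLE dw_policy  (policy_id INTEGER, premium REAL)")
rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
cur.executemany("INSERT INTO stg_policy VALUES (?, ?)", rows)
cur.executemany("INSERT INTO dw_policy  VALUES (?, ?)", rows)

def load_stats(table):
    # Row count plus a premium total acts as the reconciliation signature.
    cur.execute(f"SELECT COUNT(*), ROUND(SUM(premium), 2) FROM {table}")
    return cur.fetchone()

src, tgt = load_stats("stg_policy"), load_stats("dw_policy")
status = "MATCH" if src == tgt else "MISMATCH"
print(status, src, tgt)
```

In practice the captured statistics would be written to an audit table after each load rather than printed.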

Implemented a business-rules approach in Informatica using mapping parameters, mapping variables and parameter files

Participated in system testing and implementation activities by coordinating with Quality Analysts, Implementation Coordinator and other stakeholders

Environment: Informatica PowerCenter 9.6.1, DB2 V11, Oracle, Postgres, SQL Server, Erwin, Unix/Linux, MS Visio, SQL*Loader, EDB Loader, Cognos, QlikView, SQL, PL/SQL, Trac, Rally, MS Project, Python, COBOL, JCL, XML, Mainframes, HPSM, Control-M, SQL Server Management Studio, TOAD

Infosys Technologies Ltd Technology Lead - ETL May 2002 - April 2012

Clients

Walmart Stores, Bentonville, AR

Kraft Foods, Glenview, IL

ConAgra Foods, Omaha, NE

Responsibilities and Accomplishments

Participated in system analysis and data modeling, which included creating tables, views, indexes, triggers, functions, and procedures

Prepared test strategy, test cases, and technical specification documents

Designed, coded and tested the hourly and nightly feeds from DB2 tables to Oracle and Teradata tables using Informatica PowerCenter and the Teradata utilities BTEQ, MultiLoad and FastLoad

Created complex flow-down logic for History Corrections (propagating a correction to subsequent effective-dated records) using Teradata SQL and BTEQ
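
The flow-down idea can be sketched as below. This is an illustrative sketch only: the table and column names are hypothetical, and SQLite stands in for Teradata (the real logic ran as Teradata SQL under BTEQ).

```python
import sqlite3

# Sketch of "flow-down" history correction: a corrected attribute value is
# propagated to the corrected row and every later effective-dated row for
# the same key. Hypothetical schema; SQLite stands in for Teradata.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE emp_hist (emp_id INTEGER, eff_date TEXT, dept TEXT)")
cur.executemany("INSERT INTO emp_hist VALUES (?, ?, ?)", [
    (7, "2011-01-01", "A10"),
    (7, "2011-06-01", "A10"),   # should have been B20 from here onward
    (7, "2011-09-01", "A10"),
])

# Correction: dept becomes B20 effective 2011-06-01; flow it down to that
# row and all subsequent effective-dated rows for the employee.
cur.execute(
    "UPDATE emp_hist SET dept = ? WHERE emp_id = ? AND eff_date >= ?",
    ("B20", 7, "2011-06-01"))

cur.execute("SELECT dept FROM emp_hist WHERE emp_id = 7 ORDER BY eff_date")
depts = [r[0] for r in cur.fetchall()]
print(depts)
```

The real logic additionally had to stop the flow-down at the next explicit correction; that boundary is omitted here for brevity.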

Conducted design and code reviews across project teams to identify opportunities for improvement; tuned several complex queries, improving performance by over 90%

Extensively used Aggregator, Lookup, Update Strategy, Router and Joiner transformations

Created a complex data archival process using PL/SQL on JDA tables that involved deleting and archiving data from several relational tables with FK constraints; JDA dropped its own solution and adopted the one created by our team for its better performance and modularity
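
The archive-then-delete pattern over FK-linked tables can be sketched as follows. The schema is hypothetical (not the JDA tables), and SQLite stands in for the Oracle/PL-SQL implementation; the point is the child-before-parent ordering.

```python
import sqlite3

# Sketch of archival over tables linked by FK constraints: rows are copied
# to archive tables, then deleted child-first so the foreign key is never
# violated. Table names are hypothetical, not the actual JDA schema.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY)")
cur.execute("""CREATE TABLE order_lines (
    line_id INTEGER PRIMARY KEY,
    order_id INTEGER REFERENCES orders(order_id))""")
cur.execute("CREATE TABLE orders_arch (order_id INTEGER)")
cur.execute("CREATE TABLE order_lines_arch (line_id INTEGER, order_id INTEGER)")
cur.execute("INSERT INTO orders VALUES (1)")
cur.executemany("INSERT INTO order_lines VALUES (?, ?)", [(10, 1), (11, 1)])

def archive_order(order_id):
    # Copy first, then delete children before the parent row.
    cur.execute("INSERT INTO order_lines_arch "
                "SELECT * FROM order_lines WHERE order_id = ?", (order_id,))
    cur.execute("INSERT INTO orders_arch "
                "SELECT * FROM orders WHERE order_id = ?", (order_id,))
    cur.execute("DELETE FROM order_lines WHERE order_id = ?", (order_id,))
    cur.execute("DELETE FROM orders WHERE order_id = ?", (order_id,))

archive_order(1)
live = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
archived = cur.execute("SELECT COUNT(*) FROM order_lines_arch").fetchone()[0]
print(live, archived)
```

With deeper FK chains the same rule generalizes: delete in reverse topological order of the dependency graph.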

Worked with Cognos Report development team in designing and generating the daily sales reports

Worked with QA Engineers, Performance Analysts, Business users, Implementation Coordinator during various lifecycles of the project

Mentored a team of developers in analysis, design, coding and troubleshooting

Environment: Informatica PowerCenter, DB2, Oracle, Teradata, FastLoad, BTEQ, MultiLoad, TPump, Erwin, Unix/Linux, MS Visio, SQL*Loader, Cognos, SQL, PL/SQL, COBOL, JCL, XML, z/OS, HPSM, CA-7, SQL Assist, TOAD

Education

Bachelor of Technology in Information Technology from Nagarjuna University, India

