Data Manager

Location: East Greenbush, NY

Posted: May 20, 2017

TARAKUMAR GAJULA Email: ac0eq4@r.postjobfree.com

Phone: 864-***-****

SUMMARY

14+ years of total IT experience in:

Big Data: Experience with leading distributions such as Cloudera (CDH) and Hortonworks (HDP), their ecosystems, and related technologies (Hadoop, HDFS, MapReduce, Hive, Pig, Impala, Splunk).

Business Intelligence: Requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, semantic layers, and BI reporting tools (QlikView, Tableau, OBIEE).

Data Warehousing: Full life-cycle project leadership, business-driven requirements gathering, feasibility analysis, capacity planning, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-to-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Kimball methodology, star/snowflake design.

Architecture: TOGAF/Zachman Framework enterprise strategy and architecture, business requirements definition, guidelines and standards, process/data flow, conceptual/logical/physical design, performance, partitioning, optimization, scalability/throughput, data quality, exception handling, metadata, master data design and implementation, auditing, capacity planning, use cases, traceability matrices, etc.

Worked on MDM (Master Data Management) / Reference Data Management (RDM) integrations.

Worked on pre-sales activities and technical discussions/architecture for clients such as Internet Solutions (South Africa), Arqiva Smart Water Works (Europe), Bright House Networks (USA), and CenturyLink (USA).

Logical and physical database design (tables, constraints, indexes, etc.) using Erwin, DB Artisan, ER Studio, TOAD Modeler, and SQL Modeler.

Experience leading teams of application, ETL, and BI developers as well as testing teams.

Well versed in all phases of the SDLC and in quality processes and procedures such as business requirements definition, enterprise strategy and architecture, and guidelines and standards.

Significant experience in data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouses (star schema, snowflake schema) and MDM.

Responsible for detailed architectural design and source-to-target mapping.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, and Workflow Manager.

Actively involved in data wrangling and data profiling to ensure the quality of vendor data.

Excellent PL/SQL programming skills (triggers, stored procedures, functions, packages, etc.) in developing applications.

Monitored and optimized database performance by tuning applications, memory, and disk usage.

Experience in capacity planning, space management, and storage allocation.

Worked on BTEQ scripts and Teradata utilities (FastLoad, MultiLoad, TPump).

Experience importing and exporting data between HDFS and relational database systems using Sqoop.

Tuned SQL statements in Oracle, SQL Server, and Teradata using explain plans, autotrace, statistics gathering, table and index analysis, pipelined functions, parallel query, inline views, analytic functions, query rewrite, hints, etc.

Performed database administration tasks such as user management, implementing security using roles, privileges, and grants, tablespace management, and capacity planning.

Experience in table partitioning and in rebuilding and validating indexes in Oracle.

Experience in transforming business requirements into technical specifications.

Experience with BI/DW solutions (ETL, OLAP, data marts) such as OWB and Informatica, and BI reporting tools such as Tableau and QlikView.

Hands-on experience with integration and conversion projects, and excellent at handling 24x7 support. Solid people-management skills; demonstrated proficiency in leading and mentoring individuals to maximize productivity while forming a cohesive team environment.

ACADEMICS

M.Tech CSE, Bharath University, Chennai, India.

CERTIFICATION

PRINCE2 Practitioner (Project Management), APMG - P2R/IN082254 - 06/AUG/2018

Introduction to Big Data with Apache Spark, UC Berkeley (July 2015)

Big Data, Big Data University

ITIL V3 Foundation, APMG

ITSM, EXIN

Object Oriented Programming with C++

Operating System Concepts from CSI

AWARDS

Exceptional Contribution award from Windstream.

Best Performance @ work – Govt of India, Ordnance Factory.

TECHNICAL SUMMARY

Job Functions

Project Management, Incident and Configuration Management, Application Support, DBA Tasks, Database Handling, Analysis, Design, Coding, Testing, Documentation, Maintenance, Team Management.

Big Data Technologies

Hadoop, MapReduce, Pig, Hive, Sqoop, Oozie, Flume, Splunk, HBase

Cloud

AWS/EMR/EC2/S3

Databases

Oracle (9i, 10g, 11g), Informix, DB2, MS SQL Server, Teradata

BI Reporting Tools

Tableau, QlikView, Qlik Sense, Business Objects

Data warehouse Tech Tools

Informatica PowerCenter, Informatica PowerExchange, IBM DataStage 7.5, Erwin, ER Studio, TOAD Modeler, SQL Modeler

Tools

ServiceNow, BMC Remedy, SVN, VSS, ClearCase, TOAD, PL/SQL Developer, MS Office, MS Project, Clarity

Languages

PL/SQL, Python, Java, C, C++, CORBA, COBOL, Informix 4GL, Visual Basic 6.0

COTS

Mycom (OSI) NetExpert, Metasolv, Sasktel Martens & Open Switch Gate (OSG)

WORK EXPERIENCE

State of New York - Department of Health (Albany, NY) Aug 2016 to date

Project : WIC (Women, Infants and Children) DW Implementation

Role : Sr. BI Data Architect/Technical Manager

Environment: Oracle, Sybase, ER Studio, TOAD, SQL Developer, MS Visio, OBIEE and QlikView

Responsible for detailed architectural design.

Data Modelling, Dimension Modelling, Database Designing using ER Studio.

Reverse engineering of databases for performance and business needs.

Profile data and identify data quality rules; perform data mapping and gap analysis (a profiling sketch follows this list).

Sprint planning, daily scrums, guiding the team
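
Below is a minimal, hypothetical sketch of the kind of profiling used to surface data quality rules; pandas, the file name, and the thresholds are illustrative assumptions rather than the project's actual tooling.

```python
# Hypothetical profiling pass with pandas: per-column null rates and
# cardinality, then simple data-quality rule candidates. The source file
# and thresholds are placeholders, not the project's real data.
import pandas as pd

df = pd.read_csv("vendor_extract.csv")  # placeholder vendor feed

profile = pd.DataFrame({
    "null_rate": df.isnull().mean(),   # share of missing values per column
    "distinct": df.nunique(),          # cardinality per column
    "dtype": df.dtypes.astype(str),
})
print(profile)

# Flag columns that are mostly empty or constant as rule candidates.
for col, row in profile.iterrows():
    if row["null_rate"] > 0.5:
        print(f"RULE CANDIDATE: {col} is >50% null - require or drop it")
    if row["distinct"] <= 1:
        print(f"RULE CANDIDATE: {col} is constant - verify source mapping")
```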

IMS Health (Plymouth Meeting, PA) Oct 2015 to Aug 2016

Project : AbbVie: Inventory Visibility

Role : Sr. Consultant (BI/Bigdata)

Environment: Oracle, Teradata, Hadoop, Sqoop, Hive, Pig, Python, AWS, Hue, TOAD, SQL Developer, MS Visio, QlikView

Responsibilities:

Responsible for detailed architectural design.

Gathered requirements from the business and implemented new projects.

Data Modelling, Dimension Modelling, Database Designing using Erwin.

Worked on BTEQ scripts and Teradata utilities (FastLoad, MultiLoad, TPump).

Worked on source-to-target mapping for Informatica.

Worked on Hive tables, loading data, and writing Hive queries.

Worked on key performance indicators for reporting purposes.

Reverse engineering of databases for performance and business needs.

Profiled data and identified data quality rules; performed data mapping and gap analysis.

Performed predictive analytics in Hadoop/Hive/Hue on AWS.

Implemented a Python-based distributed random forest via Hive/Python streaming (a sketch follows at the end of this list).

Worked with the Hadoop team on ingestion of data from various sources.

Actively involved in data profiling to ensure the quality of vendor data.

Monitored and optimized database performance by tuning applications, memory, and disk usage.

Designed Qlik marts for actual on-hand inventory, the Annual Financial Plan and Update, and their adjustments.

Involved in integration of data sourced from IMS with client and third-party information sources, creating a single, trusted view that in turn can be accessed by the Nexxus commercial applications (IMS One Cloud on AWS).

Ensured optimum dashboard performance through star-schema design and adherence to QlikView best practices.

Design, build, test and debug QlikView solutions based upon specified requirements

Developed optimized load strategies, including binary, partial, and incremental loads.

Worked on big data tools Pig and Hive.
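
As a rough illustration of the Hive/Python streaming approach mentioned above, the sketch below shows a reducer-style script that Hive could invoke via TRANSFORM ... USING; the tab-separated column layout and the use of scikit-learn are assumptions for illustration, not the project's actual implementation.

```python
#!/usr/bin/env python
# Hypothetical Hive streaming script (invoked from HiveQL with
# TRANSFORM ... USING 'rf_shard.py'): each instance trains a random
# forest on its shard of rows and emits predictions. The column layout
# (id, label, features...) and scikit-learn are illustrative assumptions.
import sys
from sklearn.ensemble import RandomForestClassifier

ids, labels, rows = [], [], []
for line in sys.stdin:
    parts = line.rstrip("\n").split("\t")  # Hive streams tab-separated text
    ids.append(parts[0])
    labels.append(int(parts[1]))
    rows.append([float(x) for x in parts[2:]])

# Train a small forest on this shard; DISTRIBUTE BY on the Hive side
# decides which rows each script instance sees.
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(rows, labels)

# Emit id<TAB>prediction back to Hive as the TRANSFORM output.
for row_id, pred in zip(ids, model.predict(rows)):
    sys.stdout.write("%s\t%d\n" % (row_id, pred))
```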

Project : Endo: Specialty Data Integration

Role : Sr. Consultant (Implementation Architect/Lead)

Environment: Oracle, MS SQL Server, TOAD, Informatica 9.1, SQL Developer, Erwin, PVCS, QlikView

Worked on designing, building, and maintaining the logical and physical data models to support the business requirements using Erwin.

Was responsible for managing tasks and deadlines for both the onsite and offshore ETL teams.

Served as the ETL team's point person for other teams such as Reporting, Testing, and QA, and for updates on project status and issues.

Broke the work into task lists and estimated effort using a simple/medium/complex methodology.

High-level and low-level ETL flow design.

Design and development of SCDs (Slowly Changing Dimensions) and identification of the golden record (a sketch follows at the end of this list).

Worked on master data management (MDM) integration from the DW system.

Extracted data from MS SQL Server by understanding the business rules in procedures.

Worked on PL/SQL: stored procedures, functions, triggers, etc.

Profiled data and identified data quality rules; performed data mapping and gap analysis.

Shell scripting and automation of the system.

Tuned SQL statements in Oracle, SQL Server, and Teradata using explain plans, autotrace, statistics gathering, table and index analysis, pipelined functions, parallel query, inline views, analytic functions, PPI, query rewrite, hints, etc.
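
A minimal sketch of the Type 2 SCD and golden-record logic described above, assuming dict-shaped rows held in memory; the key, attribute, and source names are placeholders, not this project's actual model.

```python
# Hypothetical Type 2 SCD maintenance: expire the current row and append
# a new version when tracked attributes change. Field names are placeholders.
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    today = today or date.today()
    current = {r["key"]: r for r in dimension if r["end_date"] is None}
    for rec in incoming:
        cur = current.get(rec["key"])
        if cur is not None and cur["attributes"] == rec["attributes"]:
            continue                      # no change; keep current version
        if cur is not None:
            cur["end_date"] = today       # expire the old version
        new_row = {"key": rec["key"], "attributes": rec["attributes"],
                   "start_date": today, "end_date": None}
        dimension.append(new_row)
        current[rec["key"]] = new_row     # handle repeats within the batch
    return dimension

def golden_record(candidates, source_rank=("MDM", "CRM", "ERP")):
    # Pick the candidate from the most trusted source as the golden record;
    # the source ranking here is purely illustrative.
    return min(candidates, key=lambda r: source_rank.index(r["source"]))
```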

Client: CenturyLink (Vancouver, WA) April 2014 to Sep 2015

Project : Customer Churn Analysis & Prediction

Role : Architect/ Lead

Environment: Oracle, Hadoop, Sqoop, Hive, Pig, Hue, TOAD, SQL Developer, MS Visio, IBM DB2, Unix, Perl, QlikView, Shell Scripting

Responsibilities:

Worked as portfolio/data warehouse architect and lead.

Worked on Data modelling, Dimension Modelling.

Conceptual, logical, and physical database design.

Created reusable transformations to load data from operational data source to Data Warehouse and involved in capacity planning and storage of data.

Maintained metadata content (structure, level of detail, and amount of history).

Created optimal tables and stored procedures for report design

Physical and logical data model designs using TOAD Data Modeler and MS Visio.

Created stored procedures and functions to process data by applying business rules from staging tables.

Implemented Slowly Changing Dimensions.

Integrated with Master Data Management (MDM) systems to master records; implemented merges.

Developed complex procedures and functions using TOAD and SQL Developer

Worked on PL/SQL: stored procedures, functions, triggers, packages, etc.

Worked on BTEQ scripts and Teradata utilities (FastLoad, MultiLoad, TPump).

Monitored and optimized database performance by tuning applications, memory, and disk usage.

Worked on reporting tools such as Business Objects; created universes, etc.

Shell scripting in Linux

Worked on Data Guard and backup and recovery.

Worked on big data tools Pig and Hive to implement the business rules (a Hive query sketch follows at the end of this list).

Migrated code across development, testing, and production environments.

Design, build, test and debug QlikView solutions based upon specified requirements

Developed optimized load strategies, including binary, partial, and incremental loads.

Integrated QlikView with SAS and R to predict expected churn.
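
For the Hive work in this project, a sketch like the one below shows how such business-rule queries might be issued from Python; the PyHive client, host, and table/column names are assumptions for illustration, not the project's actual setup.

```python
# Hypothetical HiveQL query issued through the PyHive client against a
# HiveServer2 endpoint; host, table, and columns are placeholders.
from pyhive import hive

conn = hive.Connection(host="hive-gateway", port=10000, username="analyst")
cursor = conn.cursor()
cursor.execute(
    "SELECT account_id, COUNT(*) AS dropped_calls "
    "FROM call_events WHERE status = 'DROPPED' "
    "GROUP BY account_id HAVING COUNT(*) > 5"
)
for account_id, dropped in cursor.fetchall():
    print(account_id, dropped)  # candidate churn-risk accounts
```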

Project : Network Analytics

Role : Architect/ Lead

Environment: Oracle, Hadoop, Sqoop, Hive, Pig, Hue, TOAD, SQL Developer, MS Visio, IBM DB2, Unix, Perl, QlikView, Shell Scripting

Responsibilities:

Led a team of application, ETL, and BI developers.

Data modelling, dimension modelling, and database design using SQL Modeler.

Worked with the Application Development team to implement data strategies, build data flows and develop conceptual data models.

Used Sqoop for importing data into Hadoop from Oracle (a wrapper sketch follows at the end of this list).

Used Logstash to store data into Elasticsearch for analysis with the Kibana tool.

Worked with Agile methodologies and used Scrum in the process.

Worked on defining job flows and managing jobs using the Fair Scheduler.

Worked on cluster coordination services through ZooKeeper.

Worked on loading log data directly into HDFS using Flume.

Imported unstructured data into HDFS using Flume.

Hands-on design and development of an application using Hive UDFs.

Responsible for writing Hive Queries for analyzing data in Hive warehouse using Hive Query Language (HQL).

Provided support to data analysts in running Pig and Hive queries.

Worked on key performance indicators for reporting purposes.

Worked on PL/SQL, stored procedure, function, trigger, packages etc.

Worked on data profiling and data wrangling.

Worked on SQL in an Oracle (OLTP) and DB2 Server environment. Proficient with complex SQL (multi-joins, sub-queries, unions) and creation of database objects (tables, views, materialized views, etc.). Able to utilize performance tuning utilities (explain plans, etc.) to optimize queries.

Tuned SQL statements using explain plans, autotrace, statistics gathering, table and index analysis, pipelined functions, parallel query, inline views, analytic functions, query rewrite, hints, etc.

General database administration such as tablespace management.

Migrated databases from MS Access, Teradata, and SQL Server to Oracle.

Shell scripting and automation of the system using cron.

User management and implementing security using roles, privileges, and grants.
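
As referenced in the Sqoop bullet above, the following is a hedged sketch of driving the Sqoop CLI from Python for scheduled Oracle-to-HDFS imports; the connection string, credentials handling, and table/directory names are placeholders, and invoking sqoop via subprocess is an assumption rather than this project's actual mechanism.

```python
# Hypothetical wrapper that shells out to the Sqoop CLI for an
# Oracle-to-HDFS import; every value below is a placeholder.
import subprocess

def sqoop_import(table, target_dir, mappers=4):
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL",
        "--username", "etl_user",
        "--password-file", "/user/etl/.sqoop.pwd",  # keep secrets off argv
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),
    ]
    subprocess.run(cmd, check=True)  # raise if the Sqoop job fails

if __name__ == "__main__":
    sqoop_import("NETWORK_EVENTS", "/data/raw/network_events")
```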

Client: Windstream Communications, USA Jun 2006 to April 2014

Project : WFM – Work Force Management

Role : Technical Manager/Lead/Architect

Environment: Informix 4GL, Informix, HP-UX, Shell Scripting, ESQL/C, Perl, Oracle, VC++, Windows, Handheld Devices, MS Visio, Erwin, BO, SQL Developer

Responsibilities:

Responsible for program and portfolio management activities for small to medium projects.

Data modelling, dimension modelling, and database design using Erwin

Worked with Business Intelligence team for reporting

General database administration for Oracle 11g.

Worked on PL/SQL; created and maintained stored procedures, functions, triggers, packages, etc.

Created summary tables and materialized views for query rewrite purposes.

Created partitioned tables and indexes according to the needs of the application.

Led the development/maintenance of SAP Business Objects reporting solutions with Crystal Reports.

Led the maintenance of SAP Business Objects Performance Management solutions.

Developed and/or updated existing best practices for SAP Business Objects reporting solutions.

Data loading using ETL tools, faster batch processing, SQL*Loader, external tables, and the UTL_FILE package.

Dashboards, KPIs, and daily, weekly, monthly, and custom reports using BO to control dispatch issues (2011 onwards).

Met SLA requirements and provided application support on a 24x7 basis.

Guided the team in developing 4GL programs per Help Desk requirements and prepared various documents per Windstream requirements.

Project: Conversion Projects from acquired Telcos' WFM (D&E, Lexicom, IOWA, Nuvox, KDL Norlight, PAETEC conversions)

Role : ETL Lead/Architect

Environment: Informatica v8.x/9.x, Oracle 11g/10g, Informix, SQL Server, PL/SQL, CDC, XML, Flat Files.

Roles & Responsibilities:

Responsible for developing, supporting, and maintaining ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.

Parsed high-level design specifications into simple ETL coding and mapping standards; designed the mapping document that serves as a guideline for ETL coding.

Analyzed and created fact and dimension tables; modeled the data warehouse data marts in a star join schema.

Developed a framework to migrate data from different RDBMS sources using ETL tools, packages, procedures, and scripts.

Experienced in SQL in an Oracle and Informix environment with complex SQL (multi joins, sub-queries, unions) and creation of database objects (tables, views, materialized views, etc.). Ability to utilize performance tuning utilities (Explain Plans, etc.) to optimize queries.

Experience in integration of various data sources like Oracle, SQL Server and Flat Files using Active/Passive Stages

Experience using third-party schedulers such as Autosys to schedule jobs.

Strong knowledge of studying data dependencies using Informatica metadata and of preparing job sequences for existing jobs to facilitate scheduling of multiple jobs.

Coordination between business/IT teams on both sides.

Involved infrastructure teams as and when required to meet technical needs.

Involved in business discussions to identify functional gaps between the source and destination applications.

Monitored traffic and estimated capabilities during system and UAT testing.

Performed analysis of current WFM load versus anticipated load after conversion.

Escalated issues as overall project risks; this was not limited to WFM, CTS, and TAP.

Addressed throughput issues and performed cross-application stress testing.

Provided functional and technical designs for the conversion programs and the data scrubs.

Created summary tables and materialized views for query rewrite purposes.

Created partitioned tables and indexes according to the needs of the application.

Logical backups (import/export), hot/cold backups.

Project : CTS – Consolidated Testing Solutions

Role : Technical Manager/Lead/Architect

Environment: Mycom (OSI) NetExpert, Oracle 11g, Unix, Perl, Shell Scripting, Java, Spring

Responsibilities:

Requirement gathering, Design of Fault Network Management Systems using the NetExpert OSI development suite

Handled projects and portfolio report management, and owned complete delivery responsibility for a few small to medium projects.

Handled client calls and project risks.

Developed architecture and integration diagrams using MS Visio.

Design of Dashboards, user driven reports, comparison reports for Fault Management

Designed and guided the team in developing the procedures and triggers in various phases to integrate with other systems.

Wrote Perl/shell scripts to automate the manual process.

Closely monitored SLAs and provided application support.

Ability to provide high level architecture/design of NetExpert

Demonstrated skills to provide the user with tools to diagnose the different operational and provisioning changes in the network

Liaised with other groups in setting up the testing environment.

Integration of UNE/POTS, ADTRAN, Remedy ITSM with NetExpert

Client: Ministry of Defense (Govt. of India) April 2003 to May 2006

Project # 1: PPC – Production Planning and Control

Project # 2: Inventory System

Project # 3: Electronic Attendance Recording System (EARS)

Project # 4: e-Admin

Environment: Informix 4GL, Informix, HP-UX, Shell Scripting, ESQL/C, Oracle, Visual Basic 6.0, Stored Procedures, Scripts

Roles & Responsibilities: Senior Software Engineer

Requirement gathering, estimation & planning, monitoring and issue resolution.

Worked with Business Intelligence team for reporting

General database administration for Informix/Oracle 9i.

Worked on PL/SQL; created and maintained stored procedures, functions, triggers, packages, etc.

Created summary tables and materialized views for query rewrite purposes.

Created partitioned tables and indexes according to the needs of the application.

Interacted with the department for system study and designed the system.

Developed code and documentation.

Testing and code deployment

Presales Assignments:

1. Irdeto, Netherlands

a. BI Environment Replacement

i. Proposed a Pentaho open-source solution in place of TimeXtender/TargIT

ii. Addressed various integration and performance issues

iii. Provided estimations

2. BrightHouse Networks, USA

a. Mobile Application Rationalization

i. App Store Optimization (ASO)

1. SensorTower, SearchMan, Appnique, Google AdWords

2. Integration of Splunk MINT with Mobile Apps

ii. Campaign Management

iii. Operational Improvement with Splunk Enterprise, Mobile Analytics

3. Internet Solutions, South Africa

a. Product Profit Analysis using QlikView and MicroStrategy

b. Tools Evaluation

4. Arqiva Smart Water Networks, Netherlands

a. Smart Water Meters (SWM)

i. Integration with OBIEE/QlikView

ii. Smart Metering Reports

iii. Billing and various reports

iv. Smart Water solutions with Raspberry Pi, Google Cloud Messaging, and mobile app development to register and monitor devices over an Internet of Things network.

5. PNA, USA

a. QlikView and Tableau lab setup (installation, administration, client access, etc.)

b. QlikView/Qlik Sense integration with Salesforce and R

c. Presales demos on Product Profit Analysis, Customer Churn Prediction, and Network Analytics

d. Worked on several cloud- and mobile-related BI applications

e. Presales demo on IoT (Internet of Things) for smart appliances and utilities


