K.ANIL KUMAR
Email: ******.**********@*****.***
Phone: +91-988*******
Professional Summary
. 8+ years of experience in the IT industry
. 8 years of experience in Business Intelligence tools (SSIS, SSAS, SSRS),
Microsoft PerformancePoint Services 2010 (PPS), and Big Data/Hadoop
. Analyzed large data sets by running Hive queries and Pig scripts
. Good knowledge of creating Hive tables and of loading and analyzing data
using Hive queries
o Involved in running Hadoop jobs for processing millions of records of
text data
o Developed simple to complex MapReduce jobs using Hive and Pig
o Developed multiple MapReduce jobs in Java for data cleaning and
preprocessing
o Involved in loading data from the Linux file system to HDFS
o Responsible for managing data from multiple sources
o Extracted files from CouchDB through Sqoop, placed them in HDFS, and
processed them
o Knowledge of running Hadoop streaming jobs to process terabytes of
XML-format data
o Loaded and transformed large sets of structured, semi-structured, and
unstructured data
o Knowledge of exporting analyzed data to relational databases using Sqoop
o Good knowledge of MongoDB
o Good knowledge of Revolution Analytics and Tableau
. Expert-level skills and concepts in Microsoft SQL Server 2005/2008/2012
and the following Business Intelligence tools:
o Microsoft SQL Server 2005/2008/2012 Integration Services
o Microsoft SQL Server 2005/2008/2012 Analysis Services and the Tabular
Data Model
o Microsoft SQL Server 2005 & 2008 Reporting Services
o Microsoft PerformancePoint Services 2010
. Experience in data Extraction, Transformation, and Loading (ETL) between
homogeneous and heterogeneous systems using SQL tools (SSIS, DTS, Bulk
Insert, and BCP)
. Good exposure to Control Flow items, Data Flow transformations, and
Event Handlers
. Worked with SharePoint List source adapters
. Involved in database design and in resolving performance issues at the
database level
. Excellent experience in package configuration, logging, and deployment
across various environments
. Good exposure to setting security for packages
. Good exposure to creating cubes in Analysis Services
. Good knowledge of creating customized dimensions and cubes
. Applied KPIs, Actions, Translations, and Partitions on cubes
. Good knowledge of writing Named Queries and Named Calculations
. Good exposure to MDX
. Knowledge of providing security for a cube
. Good knowledge of dashboard reports, scorecards, KPIs, filters, and Time
Intelligence using Microsoft PerformancePoint Services 2010
. Integrated Microsoft PerformancePoint reports into SharePoint 2010 using
Web Parts
. Installed and configured SQL Server 2012/2008/2005/2000 on local and
remote systems
. Experience in planning and implementing backup and recovery strategies
. Good exposure to user authorization, database creation, and creating
roles, tables, and indexes
. Knowledge of performance monitoring and tuning
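As an illustration of the Hadoop streaming experience listed above, below is a minimal word-count sketch in Python (hypothetical, for illustration only; a real job would pass these stages to the `hadoop-streaming` jar, which pipes input lines through the mapper and a key-sorted reducer over stdin/stdout):

```python
import sys
from itertools import groupby

def mapper(lines):
    """Emit one tab-separated (word, 1) pair per word, as streaming expects."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(pairs):
    """Sum counts per word; input must already be sorted by key,
    which Hadoop's shuffle/sort phase guarantees."""
    split = (p.split("\t") for p in pairs)
    for word, group in groupby(split, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

if __name__ == "__main__":
    # Run as `script.py map` or `script.py reduce`, reading stdin,
    # the way Hadoop streaming invokes mapper/reducer scripts.
    stage = sys.argv[1] if len(sys.argv) > 1 else "map"
    stream = (line.rstrip("\n") for line in sys.stdin)
    for record in mapper(stream) if stage == "map" else reducer(stream):
        print(record)
```

The same two-stage shape carries over to the XML-processing jobs mentioned above; only the mapper's parsing logic changes.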
Technical Exposure
Environment : Hadoop, HDFS, Pig, Hive, MapReduce, Sqoop, Linux, and Big Data
NoSQL DBs : MongoDB, HBase, CouchDB
RDBMS : SQL Server 2005/2008/2012
Languages : T-SQL, C, C++, Java
Operating Systems : Windows 2000 Server, Windows 2003 Server, XP, Linux
Professional Experience
Company Name : MAHINDRA SATYAM
Designation : TEAM LEAD
Location : HYDERABAD
Period of Work : JAN 2012 TO DATE
Company Name : MAGNA INFOTECH
Designation : MSBI DEVELOPER (SSE)
Location : BANGALORE
Period of Work : SEP 2010 TO DEC 2010 AND JUL 2011 TO JAN 2012
Company Name : INTELLIGROUP ASIA Pvt Ltd.
Designation : MSBI DEVELOPER
Location : HYDERABAD
Period of Work : JAN 2011 - JUL 2011
Company Name : MARVEL ERP SOLUTIONS INDIA Pvt Ltd.
Designation : MSBI DEVELOPER
Location : HYDERABAD
Period of Work : JAN 2006 - AUG 2010
Certifications
. CCD-410: Apache Hadoop Developer Certified
. C100DEV: MongoDB Certified Developer Associate
. 70-467: Designing Business Intelligence Solutions with Microsoft SQL
Server 2012
. 70-466: Implementing Data Models and Reports with Microsoft SQL Server 2012
Projects Experience
Project#7
Project Title : AT&T SA Automation
Client : AT&T
Environment : SSAS, SSIS, SSRS, SQL Server 2008, PPS
Description:
The Service Assurance organization has to ensure that services offered over
the networks meet a pre-defined service-quality level for an optimal
subscriber experience. The practice of service assurance enables the
organization to identify faults in the network and resolve them in a timely
manner so as to minimize service downtime. The practice also includes
policies and processes to proactively pinpoint, diagnose, and resolve
service-quality degradation or device malfunctions before subscribers
are impacted.
The key goal of the SA Reporting Automation project is to deliver the right
business-reporting functionality to all key AT&T stakeholders in an
automated manner, with no or minimal manual work needed to generate the
reports, enabling business users to consume them in a self-service manner.
The automation project primarily comprises automating the existing and new
functionality from a metrics/reports perspective. All key users will be
able to access reports and dashboards suited to their roles and perform
additional ad hoc analysis on larger data sets.
Responsibilities:
. Developed and maintained OLAP cubes.
. Created calculated measures and unit tested the measure groups.
. Created actions at the OLAP level.
. Created perspectives.
. Applied aggregations during fact loads.
. Improved the performance of dimension and fact loads.
. Applied the proper relationships between dimensions and measure groups.
. Applied the calculated measures in PPS.
. Used Control Flow tasks such as Sequence Container, Foreach Loop
Container, Execute SQL Task, Send Mail Task, and Data Flow Task.
. Created mappings using transformations such as Derived Column, Union All,
Conditional Split, Merge Join, and Row Count.
. Implemented package configuration and message logging.
. Scheduled the ETL packages using SQL Server 2008 Management Studio.
. Responsible for unit, peer, and integration testing of data warehouse jobs.
. Prepared unit test cases as per the business logic.
Project#6
Project Title : ICIS
Client : ANZ
Environment : SSAS, SSIS, SSRS, SQL Server 2008
Description:
The ICIS application was specifically engaged to build a Business
Intelligence reporting system around data being captured in a central
logging server. The ETL process helps the organization pull data from
heterogeneous sources such as Excel, databases, and flat files and load it
into the central logging server after the necessary transformation of the
data. The organization can then analyze business data for long-term
decision-making.
Responsibilities:
. Developed and maintained OLAP cubes.
. Created calculated measures and unit tested the measure groups.
. Created actions at the OLAP level.
. Created perspectives.
. Applied aggregations during fact loads.
. Improved the performance of dimension and fact loads.
. Applied the proper relationships between dimensions and measure groups.
. Applied the calculated measures in PPS.
. Used Control Flow tasks such as Sequence Container, Foreach Loop
Container, Execute SQL Task, Send Mail Task, and Data Flow Task.
. Created mappings using transformations such as Derived Column, Union All,
Conditional Split, Merge Join, and Row Count.
. Implemented package configuration and message logging.
. Scheduled the ETL packages using SQL Server 2008 Management Studio.
. Responsible for unit, peer, and integration testing of data warehouse jobs.
. Prepared unit test cases as per the business logic.
. Developed and maintained technical documentation of the source-to-target
load process and the ETL development process.
Project #5
Project : Palm BI Dashboard
Team Size : 8
Technology : SSAS 2008, SSIS 2008
Role : Program Analyst
Description:
Palm BI Dashboard is intended to provide a quick understanding of the
day-to-day applications and devices business of Palm Inc. It also covers
the revenue generated for developers. Top-level management executives are
expected to use these dashboards to analyze the data for decision-making.
Responsibilities:
. Developed and maintained OLAP cubes.
. Created calculated measures and unit tested the measure groups.
. Created actions at the OLAP level.
. Created perspectives.
. Applied aggregations during fact loads.
. Improved the performance of dimension and fact loads.
. Applied the proper relationships between dimensions and measure groups.
. Applied the calculated measures in PPS.
. Used Control Flow tasks such as Sequence Container, Foreach Loop
Container, Execute SQL Task, Send Mail Task, and Data Flow Task.
. Created mappings using transformations such as Derived Column, Union All,
Conditional Split, Merge Join, and Row Count.
. Implemented package configuration and message logging.
. Scheduled the ETL packages using SQL Server 2008 Management Studio.
. Responsible for unit, peer, and integration testing of data warehouse jobs.
. Prepared unit test cases as per the business logic.
. Created packages in SSIS Designer using Control Flow tasks and Data Flow
transformations to implement business rules.
. Identified various source systems for data feeds and created mappings to
load data extracted from flat files, Oracle databases, and XML files
into relational tables, redirecting the required data to the staging
area using SQL Server SSIS.
. Used Control Flow elements such as Containers, Execute SQL Task,
Execute Package Task, File System Task, and Send Mail Task.
. Created stored procedures for loading data into the staging area.
. Involved in developing an exception-handling process for each SSIS
package.
. Scheduled and monitored the ETL packages for daily loads in SQL
Server Agent.
. As per requirements, developed reports in matrix, table, and chart form
using SQL Server 2005 Reporting Services.
. Generated reports on a weekly and monthly basis as per client
requirements.
Project #4
Project : Videocon D2H
Team Size : 7
Environment : SSIS, SSAS.
Role : Program Analyst
Description:
Videocon is a retail company for consumer electronic goods in India. The
data warehouse has many functional and technical groups; ETL is one of
them, performing the development activities for the data warehouse. The
main objective of this project is to extract the information related to
inbound and outbound calls from various sources such as Excel and flat
files and bring it together in the warehouse, which is the one and only
source of consolidated data. Analyzed and interpreted the business
requirements to design and develop reports for business users.
Responsibilities:
. Involved in preparing design documents for developing packages.
. Used Control Flow elements such as File System Task, Sequence
Container, Foreach Loop Container, Execute SQL Task, Execute
Process Task, and Send Mail Task.
. Created packages using the required Data Flow transformations in SSIS,
such as Derived Column, Conditional Split, Row Count, Lookup,
Data Conversion, Aggregate, and Merge Join.
. Implemented package configurations to avoid credential problems
while deploying SSIS packages.
. Deployed SSIS packages to production systems using deployment
manifest files.
. As per client requirements, created tabular, drill-down, and
drill-through reports.
. Rendered reports to clients in PDF, Excel, and CSV formats.
. Implemented subscriptions, caching, and security for reports.
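To make the Conditional Split usage above concrete, here is a rough Python analogue (not the SSIS implementation itself; the output names and predicate are hypothetical) of how each row is routed to the first output whose condition matches, or to the default output otherwise:

```python
def conditional_split(rows, cases, default="default"):
    """Route each row to the first matching output, like SSIS Conditional Split.

    rows: iterable of dicts (one dict per data-flow row).
    cases: ordered list of (output_name, predicate) pairs, evaluated in order.
    Returns a dict mapping output name -> list of rows.
    """
    outputs = {name: [] for name, _ in cases}
    outputs[default] = []
    for row in rows:
        for name, predicate in cases:
            if predicate(row):
                outputs[name].append(row)
                break  # first match wins, as in SSIS
        else:
            outputs[default].append(row)  # no case matched
    return outputs
```

In the real package each output feeds a separate downstream path (e.g. a different destination or derived-column step); the ordering of cases matters because only the first matching condition fires.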
Project #3
Project : HP (INBOUND/OUTBOUND)
Client : HP.
Role : BI Developer
Team Size : 6
Environment : SQL Server 2005, Windows Server 2003, SSIS, SSAS.
Description:
In the HP project, every day we generate bulk leads from the Tata portal
and finally update the bulk leads and load TGI sales information into the
Tata portal. Every week we extract the data from the DNC (Do Not Call)
list and load it into the DWH.
Responsibilities:
. Analyzed the source mapping documents and created packages to move the
data from source to destination.
. Created packages implementing ETL using various Control Flow items
(Foreach Loop, For Loop) and Data Flow transformations (Execute SQL,
Conditional Split, Multicast, Derived Column, Lookup, Sort, Union All,
etc.), with error handling while moving the data.
. Implemented performance tuning at various levels.
. Implemented checkpoint configuration and package configuration in
packages for better reusability.
. Analyzed the specification documents and, based on them, created cubes
and measure groups from the data sources.
. Implemented KPIs and actions, created calculated members for
intermediate operations on the cube, created partitions for parallel
loading of data, and used MDX for queries.
. Processed the fact and dimension loads and implemented aggregations on a
granularity basis.
. Designed and developed database scripts and procedures.
. Developed the packages for the ETL process (first run and incremental).
. Involved in creating packages and responsible for moving packages from
the development server to the production server.
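The first-run vs. incremental load pattern mentioned above can be sketched as follows (a simplified Python illustration, not the actual SSIS package; the column names are hypothetical). The first run copies everything; subsequent runs load only the rows modified after a persisted high-water mark:

```python
from datetime import datetime

def load(source_rows, target, watermark=None):
    """Upsert rows newer than the watermark into the target.

    source_rows: iterable of dicts with 'id' and 'modified' keys.
    target: dict keyed by id, standing in for the warehouse table.
    watermark: None on the first run (full load), else the datetime
               persisted after the previous run.
    Returns the new watermark to persist for the next incremental run.
    """
    new_mark = watermark or datetime.min
    for row in source_rows:
        if watermark is None or row["modified"] > watermark:
            target[row["id"]] = row  # upsert into the "warehouse"
            new_mark = max(new_mark, row["modified"])
    return new_mark
```

Persisting the returned watermark between runs (in SSIS, typically via a control table and package configuration) is what keeps the incremental run from reprocessing already-loaded rows.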
Project #2
Project : CAPITAL IQ
Role : BI Developer
Team Size : 5
Environment : SQL Server 2005, SSIS, Windows Server 2003.
Description:
The Capital IQ system collects various transaction data from various
sources. The transaction information covers household articles and food
and beverage products. The data warehouse plays a major role in enabling
the various stores to view data at the lowest level of detail and helps
them make decisions that bring in more revenue. Data was extracted,
validated, and loaded into the warehouse database using SSIS. This helps
top-level management make well-informed decisions to improve the business.
Responsibilities:
. Extensively worked on data extraction, transformation, and loading
from source to target systems using SQL Server 2005 Integration
Services.
. Analyzed the existing system and understood the business requirements.
. Imported source/target tables from the respective databases using the
Execute Package Task and Control Flow tasks in SQL Server 2005
Integration Services.
. Created transformations and mappings using Data Flow Tasks.
. Created event handlers for the packages using the Event Handlers tab.
. Extensively used Derived Column, Row Count, Data Conversion,
Conditional Split, and Aggregate transformations.
. Prepared unit test specification requirements.
Project #1
Project : MANIPAL UNIVERSITY
Role : BI Developer
Team Size : 8
Environment : SQL Server 2005, Windows Server 2003.
Description:
Manipal University is one of the major education partners across India.
Data is collected monthly and loaded into the data warehouse; the packages
run at the end of the day, and the reports are then sent to the client.
Responsibilities:
. Involved in creating reports for the business analysis process and in
creating cubes.
. As per requirements, developed reports in matrix, table, and chart form
using SQL Server 2005 Reporting Services.
. Generated reports on a weekly and monthly basis as per client
requirements.
. Involved in the ETL phase of the project.
. Created packages on the SQL Server Integration Services platform.
. Developed mappings according to the ETL specifications for the
staging-area data load and warehouse data load using SQL Server
Integration Services and SQL Server 2005.
. Generated reports using the Report Wizard.
. Built and generated reports with the help of Report Builder in
Report Manager.
Educational Summary
. B.Sc. from A.N.U University
. 10+2 from the Board of Intermediate Education
. S.S.C. from the Board of Secondary Education