RAJA VALLABHANI
**** ****** ***** ****, ***-*** *******.**********@*****.***
Charlotte, NC – 28262. +1-980-***-****
SUMMARY:
* ***** ** **** ********** as a Data Analyst in the Banking and Telecom sectors.
Experience building Data Integration, Workflow, and Extract, Transform & Load (ETL) solutions for data warehousing using SQL Server Integration Services (SSIS) and Informatica.
Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling.
Performed data extraction, cleansing, analysis, and reporting using SAS, SQL, and Advanced Excel. Developed analytical reports for monthly risk-management presentations to senior management.
Performed operations such as data cleansing, data scrubbing, and data profiling, and maintained data governance.
Experience working with multiple databases, including MS SQL Server, Oracle (via SQL Developer), and MS Access.
Experience with T-SQL scripting for creating database objects such as tables, views, stored procedures, triggers, functions, and CTEs to structure stored data and maintain the database efficiently.
Experience designing and creating reports on transactional data from the data warehouse using SQL Server Reporting Services (SSRS).
Experienced in UNIX work environments, file transfers, job scheduling, and error handling.
Experience creating pivot tables, graphs, charts, functions, and macros in Excel.
Used Excel to provide data cleansing support and validation of data for use in presentations, metrics, and reporting of information.
Experience in all stages of the SDLC, including unit testing and system integration.
Worked on analytics solutions using R, Python, machine learning, and visualization and analytics tools such as Tableau in academic coursework.
PROFESSIONAL CERTIFICATION:
SAS Certified Base Programmer for SAS 9.0
TECHNICAL SKILLS:
Database: MS SQL Server 2014/2012/2008 R2/2005, Oracle 9i/10g, MS Access 2007
Tools/Utilities: Toad, Tableau, SQL Developer, Putty, SQL Profiler, Query Analyzer
SAS Skills: SAS/BASE, SAS/MACRO, SAS/SQL, SAS/ACCESS, SAS/ODS, SAS/STAT, SAS/GRAPH
ETL: Informatica Power Center, Visual Studio 2010, SQL Server Integration Services (SSIS)
Others: MS Visual Studio, MS Office, MS Project, MS Visio, MS PowerPoint, MS Excel, MS Lync, SQL*Loader, QuickTest Professional, Quality Center (QC), JIRA
PROFESSIONAL EXPERIENCE:
Role: ETL Analyst
Client: Indian Eagle Feb 2017 – May 2017
Created new and altered existing database objects to support new business requirements and data processes, building a unified data warehouse and generating operational and analytical reports.
Created database objects such as tables and views, and built ETL workflows to load data from raw data files into these objects.
Used complex joins and sub-queries involving multiple tables in workflows to transform and enrich transactional data per user requirements.
Developed complex T-SQL programs, writing stored procedures and triggers for database integrity, and tuned queries for optimal execution plans.
Designed and developed ETL programs using Informatica to implement the business requirements.
Developed mapping parameters and variables to support SQL overrides and to load data into staging tables and then into dimension and fact tables.
Prepared SQL Queries to validate the data in both source and target databases.
Designed and implemented application database objects, such as stored procedures, views, and joins, to implement business logic.
Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
Developed Dashboards using Tableau Desktop for data visualization and reporting.
Performed extensive data cleansing and analysis using pivot tables, formulas (VLOOKUP and others), data validation, conditional formatting, and graph and chart manipulation.
Environment: Informatica Power Center 9.6.1, Teradata, Toad, MS Excel, Erwin, MS-SQL Server, Autosys, Tableau.
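The staging-to-dimension load pattern mentioned in this role can be sketched roughly as follows. This is a minimal illustration with hypothetical column names (`customer_id`, `sk`); the actual mappings were built in Informatica, not Python.

```python
# Illustrative sketch of a staging-to-dimension load with surrogate keys
# (hypothetical columns; the real mappings were built in Informatica).

def load_dimension(staging_rows, dimension, key_col="customer_id"):
    """Upsert staging rows into a dimension keyed by a surrogate key."""
    by_natural_key = {row[key_col]: row for row in dimension}
    next_sk = max((r["sk"] for r in dimension), default=0) + 1
    for row in staging_rows:
        existing = by_natural_key.get(row[key_col])
        if existing is None:
            # Unseen natural key: insert with the next surrogate key.
            new_row = {"sk": next_sk, **row}
            dimension.append(new_row)
            by_natural_key[row[key_col]] = new_row
            next_sk += 1
        else:
            # Known key: overwrite attributes in place (Type 1 behavior).
            existing.update(row)
    return dimension

dim = []
load_dimension([{"customer_id": 1, "name": "A"}], dim)
load_dimension([{"customer_id": 1, "name": "B"},
                {"customer_id": 2, "name": "C"}], dim)
```

A Type 2 variant would instead close the existing row (effective-date columns) and insert a new surrogate-keyed row on change.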
Role: Data Analyst
Accenture – Chennai Dec 2013 – Jul 2015
Client: RBS - Banking Client
Description: The scope of this project related to Credit Risk Management (CRM), working on customer data, including origination and performance data, to develop effective recommendations. My role involved working with customers' demographic data along with information on various products such as debts, credits, collateral, equity, shares, and bonds.
Developed and maintained data preparation and validation routines to support complex data mining.
Worked on historic customer data by comparing current-month data against the previous month, or against the same month up to three years back, to reveal similarities and variances. These determinations were typically based on differences in missing-value counts, mean, standard deviation, and the minimum and maximum values of the data range.
Developed programs using BASE SAS, SAS/SQL, and SAS MACROS to generate new reports.
Responsible for importing data into and extracting data out of the SAS data warehouse to external files such as Excel using the SAS IMPORT/EXPORT wizard.
Experience extracting, transforming, and loading (ETL) data from Excel and flat files to MS SQL Server using DTS and SSIS.
Experienced in creating and populating data warehouses from OLTP data with SSIS ETL packages using transformations, Sequence Containers, Foreach Loop Containers, variables, configurations, and schedules; debugging experience with loggers and event handlers.
Tuned performance by creating indexes, providing hints, modifying tables, and analyzing tables and estimating statistics using the explain plan utility.
Experience with Microsoft Visual C# in the Script Component of SSIS.
Acted as primary liaison between business users and the IT project team for the business case and business-requirements documentation, and performed quality assurance testing.
Used Excel pivot tables to analyze large volumes of data; the position involved extensive routine operational reporting, ad hoc reporting, and data manipulation to produce metrics and dashboards for management.
Generated high-quality reports in listing, HTML, RTF, and PDF formats using SAS ODS.
Environment: SAS 9.3, SAS Base, SAS Macros, MS SQL Server, SQL Server Integration services (SSIS), Windows 7, MS Office including Excel, SAS Enterprise Guide, SAS Stored Processes, Unix, SQL.
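The month-over-month profiling checks described in this role can be sketched as follows. This is a minimal illustration with hypothetical data; the actual routines were implemented in SAS and SQL.

```python
import statistics

# Minimal sketch of the month-over-month profiling checks described above
# (hypothetical data; the real routines were written in SAS/SQL).

def profile(values):
    """Summary statistics used to compare two months of the same field."""
    present = [v for v in values if v is not None]
    n = len(present)
    return {
        "missing": len(values) - n,
        "mean": sum(present) / n if n else None,
        "std": statistics.pstdev(present) if n > 1 else None,
        "min": min(present) if present else None,
        "max": max(present) if present else None,
    }

def compare_months(current, previous):
    """Return only the statistics whose values changed between two months."""
    cur, prev = profile(current), profile(previous)
    return {k: (cur[k], prev[k]) for k in cur if cur[k] != prev[k]}

balances_jan = [100.0, 250.0, None, 300.0]   # one missing value
balances_feb = [110.0, 250.0, 280.0, 300.0]
diff = compare_months(balances_feb, balances_jan)
```

Statistics that are unchanged between the two months (here, the maximum) drop out of the comparison, so only the variances surface for review.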
Role: Data Analyst
Accenture – Chennai Aug 2012 – Nov 2013
Client: Vodafone – Telecom Client
Description: This project involved building a data warehouse for a telecom client to store retailer and distributor sales transactions and their hierarchies, which are used to calculate Key Performance Indicators (KPIs) for retailer and distributor commissions and incentives. My role involved extracting raw data from flat files and loading it into the data warehouse by applying source-to-target mappings and transformation rules to consolidate retailer and distributor transactions, and designing operational and analytical reports and dashboards.
Performed data migration activities, directly responsible for the extraction, transformation, and loading of data from multiple sources, such as Oracle and flat files, into the data warehouse using Informatica.
Created Business Requirement Documents (BRDs), such as the SRS and FRS, and integrated the requirements with the underlying platform functionality.
Translated Business Requirements into working Logical and Physical Data Models.
Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse as per customer requirement.
Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
Experience writing complex SQL queries using joins, subqueries, and correlated subqueries to retrieve data from the database.
Improved database performance using CTEs, views, stored procedures, indexes, etc., and optimized T-SQL statements using SQL Profiler and execution plans.
Created reports from Data Warehouse using SQL Server Reporting Services (SSRS).
Created complex SQL queries with multiple joins to build ad hoc reports in Report Builder based on requirements.
Expertise in creating reports in different formats for internal and external reporting, including drill-down and drill-through reports at varying levels of granularity, using SSRS.
Generated periodic reports based on statistical analysis of transactional data in the data warehouse across various time frames.
Designed different kinds of reports, such as sub-reports, charts, matrix reports, and linked reports.
Environment: Informatica 8.6.1, MS SQL Server, Windows 7, MS Office, HP Quality Center, Unix shell scripting.
Role: Intern
Accenture – Bangalore March 2012 – July 2012
Project: SFR – Telecom Client
Updated the Functional Requirement Specifications (FRS) based on the User Requirement Specifications (URS).
Responsible for creating and maintaining Requirement Traceability Matrix to track the development and QA Process.
Gained hands-on expertise with tools such as PuTTY, MySQL, SoapUI, and Vox-pilot for UAT testing of web services.
Learned processes such as basic configuration, deployment, documentation, and installing and supporting new and existing applications on various server platforms.
EDUCATION:
University of North Carolina – Charlotte, NC. Dec 2016
Master of Science in Information Technology GPA: 3.5/4.0
SRM University – INDIA. May 2012
Bachelor of Technology in Electronics & Communication GPA: 3.4/4.0
ACADEMIC PROJECTS:
Identification of Patterns from Data: Implemented algorithms such as Random Forest, Naïve Bayes, and K-Means to find patterns and predict results from given data. Implemented clustering analysis for grouping data.
Tools/technology Used: R, Machine Learning.
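Of the algorithms listed, K-Means is the simplest to sketch; below is a minimal 1-D Python illustration of the assignment/update loop. This is illustrative only and not the coursework code, which was written in R.

```python
# Minimal 1-D K-Means sketch (illustrative; the coursework used R).
def kmeans(points, k, iters=20):
    centers = points[:k]  # naive initialization: first k points
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[i].append(p)
        # Update step: move each center to its cluster's mean.
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans([1.0, 1.2, 0.8, 10.0, 10.4, 9.6], k=2)
```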
Network Anomaly Detection: Implemented a parallel K-Means clustering algorithm based on MapReduce using Apache Spark to detect network event anomalies.
Tools/technology Used: Python, MapReduce, Spark
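The parallel structure behind that project can be sketched without a cluster: the assignment step is a map over points, and centroid recomputation is a reduce by cluster key. Below is a pure-Python stand-in with hypothetical 1-D data; the project itself used Spark RDD operations rather than plain lists.

```python
# MapReduce-style sketch of one parallel K-Means iteration (pure-Python
# stand-in for the Spark RDD version used in the project).
from collections import defaultdict

def nearest(point, centers):
    """Index of the center closest to a 1-D point."""
    return min(range(len(centers)), key=lambda i: abs(point - centers[i]))

def kmeans_step(points, centers):
    # Map: emit (cluster_id, (point, 1)) for each point.
    mapped = [(nearest(p, centers), (p, 1)) for p in points]
    # Reduce by key: sum the points and counts per cluster.
    sums = defaultdict(lambda: (0.0, 0))
    for cid, (p, n) in mapped:
        s, c = sums[cid]
        sums[cid] = (s + p, c + n)
    # New centroid = partial sum / count per cluster key.
    return [sums[i][0] / sums[i][1] if i in sums else centers[i]
            for i in range(len(centers))]

new_centers = kmeans_step([1.0, 1.1, 9.0, 9.3], centers=[0.0, 10.0])
```

Because both steps operate on independent (key, value) pairs, each can be distributed across workers, which is what made the Spark version parallel.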
Vacation Planning: Developed an online portal that enables users to log in and plan vacations, i.e., choosing destinations and booking travel and accommodation.
Tools/technology Used: Java, HTML, CSS