Anvesh
PROFESSIONAL SUMMARY:
Around *+ years of professional experience in data analysis, data modeling, design, development, and implementation of OLTP, Data Warehouse, and Business Intelligence systems.
Hands on experience in all phases of Data Modeling - Conceptual, Logical and Physical modeling.
Good experience with a wide range of databases, including SQL Server, DB2, Oracle 11g/10g, Oracle Clinical, MS Access, Teradata, and MongoDB.
Hands-on experience with data architecture, data warehousing, data mining, large-scale data modeling, Business Intelligence, data-sharing techniques across enterprise systems, systems analysis, and business requirements gathering and analysis, along with experience implementing enterprise data management procedures.
Excellent experience with Extract, Transform, and Load (ETL) processes using tools such as DataStage, Informatica, Data Integrator, and SQL Server Integration Services for data migration and data warehousing projects.
Excellent working knowledge of HBase and data pre-processing using Flume.
Hands-on experience installing and configuring Cloudera's Apache Hadoop ecosystem components, including Flume, HBase, ZooKeeper, Oozie, Hive, Sqoop, Pig, and Hue, on CDH3 and CDH4 clusters.
Experience in Big Data analysis using Pig and Hive, and a working understanding of Sqoop.
Experience in developing customized UDFs in Java to extend Hive and Pig Latin functionality.
Good understanding of HDFS design, daemons, federation, and HDFS high availability (HA).
Expertise in analyzing and reporting on all phases (Phase I-III) of clinical trials using tools such as Base SAS, SAS/STAT, SAS/GRAPH, SAS/SQL, SAS/MACRO, SAS/ACCESS, and SAS/ODS in UNIX and Windows environments.
Good experience writing UNIX shell scripts for data validation, data cleansing, etc.
Experience in debugging and testing SAS programs to check and process data and to generate graphs, tables, and listings in the analysis system.
Capable of independently developing new SAS programs and enhancing existing SAS programs from protocols and SAPs.
Experienced in constructing SAS queries.
Experience in modifying existing SAS programs and creating new programs using SAS macros to improve efficiency and consistency of results (see the macro sketch following this summary).
Extensive experience working with Tableau Desktop, Tableau Server, and Tableau Reader across Tableau versions 8.2, 8.1, 8.0, 7, and 6.
Experience spanning analysis, modeling, design, and development of Tableau reports and dashboards for analytics and reporting applications.
Expertise in developing parameterized, chart, graph, linked, dashboard, and scorecard reports on SSAS cubes using MDX, as well as drill-down, drill-through, and cascading reports using SSRS.
Experience in creating reports using ProClarity views and dashboards.
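To illustrate the kind of reusable SAS macro referenced above, here is a minimal sketch; the dataset and variable names (claims, region) are hypothetical and not drawn from any of the projects below.

    /* Wrap a repetitive frequency-count step in a macro so one call */
    /* replaces several near-identical, copy-edited PROC FREQ blocks. */
    %macro freq_report(ds=, var=, out=);
        proc freq data=&ds noprint;
            tables &var / out=&out;
        run;
    %mend freq_report;

    /* One call per variable of interest */
    %freq_report(ds=claims, var=region, out=region_counts);

A macro like this improves consistency because every report runs through the same tested code path rather than through hand-copied steps.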
TECHNICAL SKILLS:
Databases
Oracle 11g, 10g, 9i, SQL Server 2012/2008R2, DB2 and MySQL.
Languages
C++, C#, Java, SQL, PL/SQL, T-SQL, SQL*Plus, HTML, XML, ASP, VB, UNIX shell scripting, JavaScript, PHP, VBA.
Analytics & Reporting
SSAS, Statistica, Google Analytics, Tableau Public, SSRS, QlikView.
ETL
Informatica, DataStage, SQL Server Data Transformation Services, Visual Studio 2012/2010.
Design Tools
Erwin Data Modeler.
Big data Technologies
MapReduce, Apache Hadoop (Pig, Hive, HBase, Sqoop, HDFS).
Schedulers
WLM, Informatica scheduler, Autosys.
Operating System
Windows 8/7/XP, UNIX, Linux.
PROFESSIONAL EXPERIENCE:
Client: Acxiom Jan 2015 to Present
Role: Data Analyst
Description: Acxiom is engaged in marketing technology and services that enable marketers to manage audiences, personalize consumer experiences, and create customer relationships. Acxiom's consultative approach combines consumer data and analytics, databases, data integration, and consulting solutions for personalized, multichannel marketing strategies, serving domestic and international customers.
Roles & Responsibilities:
Performed data manipulation, data cleansing, and data profiling in the source systems required for the Dual Credit Report data mart.
Collected data for statistical analysis using SQL Server, Oracle, DB2, Teradata, Python, and Excel.
Used Data Lever (RedPoint) software to apply cleansing logic to data, along with creating and customizing automated batch/script processes to shorten customer wait times.
Created and managed FTP and SFTP accounts for data transfer for customers after working with compliance department for approval resulting in faster file submission and retrieval.
Updated automated processes to comply with the Driver's Privacy Protection Act (DPPA) and the Gramm-Leach-Bliley Act (GLBA), including handling of Personally Identifiable Information (PII), to ensure sensitive data is not returned to unauthorized clients or customers.
Responsible for creating and delivering recurring as well as ad hoc marketing campaign programming within strict timelines under constantly changing requirements, using SAS, SAS SQL, and Teradata SQL in a UNIX environment with Oracle, Informix, and Teradata relational databases.
Analyzed raw data, drew conclusions, and developed recommendations; wrote SQL scripts to manipulate data for data loads and extracts.
Involved in creating the data flow from import through landing the data in HDFS.
Created Hive tables to populate the data in HDFS.
Provided descriptive and predictive analysis by applying analytics techniques to answer critical business questions.
Performed data processing using Hive.
Executed ETL operations using Base SAS and SAS Enterprise Guide.
Worked extensively on building analytic applications using SAS-related products and scheduling them with Control-M.
Designed and built SAS programs to analyze and generate files, tables, listings, graphs and documentation.
Worked extensively on presenting SAS-generated data in macro-enabled Excel pivot tables to meet daily, weekly, and monthly reporting needs.
Developed SAS processes for business users to extract summarized data and distribute business summary reports through automated email (see the sketch at the end of this section).
Applied Master Data Management standards in data conversion, mapping and migration.
Involved in creating Star and Snowflake schemas, fact and dimension tables, and physical and logical data models using Embarcadero.
Analyzed data from different sources such as Mainframe VSAM, GDG, UNIX flat files, SQL Server, etc.
Devised transformations across different platforms, including Mainframe, UNIX, and Informatica.
Wrote extensive UNIX shell scripts (ksh) in an AIX environment and resolved issues that arose during scripting.
Performed daily system checks, data entry, and data auditing; created data reports and monitored all data for accuracy; designed, developed, and implemented new functionality.
Created users, groups, projects, data connections, and settings as a Tableau administrator.
Mastered the ability to design and deploy rich graphic visualizations using Tableau.
Used Excel sheets, flat files, and CSV files to generate Tableau ad hoc reports.
Generated weekly and monthly asset inventory reports.
Created dashboards for ad-hoc reporting and analysis.
Coordinated with business users to design new reports in an appropriate, effective, and efficient manner, building on existing functionality.
Environment: SQL Server 2012, Oracle 11g/10g, Hadoop (Pig, Hive, Sqoop, HBase), Teradata, RedPoint, Python, DB2, SAS/ACCESS, SAS Macro, Flex, Visual Studio 2010, SSRS, Tableau, Java, ClearQuest, Cognos, UNIX, AIX, Windows 8/7/XP.
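As a rough illustration of the automated email distribution of SAS summary reports mentioned above, the sketch below uses the SAS EMAIL access method; the recipient address, subject, and attachment path are hypothetical, and the SAS session is assumed to be configured for SMTP.

    /* Route a FILE statement to email instead of disk */
    filename report email
        to="business.users@example.com"
        subject="Daily Business Summary"
        attach="/reports/daily_summary.xlsx";

    data _null_;
        file report;
        put "Attached is the daily business summary report.";
    run;

Wrapped in a scheduled job, a step like this can deliver daily, weekly, and monthly summaries without manual distribution.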
Northern Trust - Chicago, IL Nov 2013 to Dec 2014
Data Analyst
Description: Northern Trust is a leading provider of asset management, fiduciary, banking, asset servicing, and fund administration solutions for individuals, families, corporations, and institutions worldwide. The scope of the project was to provide text files and XML files to the Federal Government for Retail and Wholesale in the required format.
Roles & Responsibilities:
Researched and reviewed gap issues related to Retail and Wholesale instruments.
Worked with different types of databases, including Oracle 11g/10g/9i, SQL Server 2012/2008 R2, and DB2, along with Informatica.
Developed SQL and BTEQ (Teradata) queries to extract data from the production database and built data structures and reports.
Loaded data from different sources such as CSV, Excel, SQL Server 2008, and IBM DB2 into CRM Pivotal TouchPoint using SSIS 2008.
Tuned SQL statements using indexes, SQL stored Procedures.
Combined several simple, unrelated database accesses to improve performance, using SQL queries in Query Analyzer.
Successfully created Fact Tables, Dimensions and joined them to extract meaningful data.
Performed data profiling and gap analysis in the source systems required for the HELOC and HELOAN data mart before starting ETL development.
Standardized and formatted the data according to the requirements using PL/SQL.
Performed data validation according to the validation rules provided by the technical architect.
Analyzed massive and highly complex data sets, performing ad-hoc analysis and data manipulation using MS Excel.
Generated pivot tables and lookups for claims and new house loans by NT accounts, and created presentations using PowerPoint.
Involved in running Hadoop jobs to process millions of records of text data.
Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
Transformed project data requirements into project data models for OLAP and OLTP systems using Erwin.
Prepared prototype models to support development efforts and assisted the business analyst in mapping data from source to target.
Generated reports using SAS/MACRO, SAS/ODS, PROC REPORT, PROC PRINT, PROC SUMMARY, PROC FREQ, PROC MEANS, PROC TABULATE, and PROC SQL (see the sketch at the end of this section).
Wrote SAS macros to improve the efficiency of reports and reduce the effort required to generate outputs.
Involved in loading data from the Linux file system into HDFS.
Worked with various Linux/UNIX distributions.
Documented a whole process of working with Tableau Desktop, installing Tableau Server, evaluating Business Requirements and proposing System Requirements.
Researched and developed the first NT Audit Report to evaluate source data against target data.
Worked extensively with Date, String and Logical Functions.
Environment: Oracle 12c, SQL Server 2012, Teradata 14, SSRS, MS Access, Tableau, QlikView, DataStage, Hadoop (Hive), MS Excel, Business Objects XI, SSIS, Informatica 9.5, Windows 7/XP.
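As a sketch of the kind of summary reporting described above, the example below uses PROC TABULATE; the dataset and variables (instruments, instrument_type, notional) are hypothetical placeholders for the instrument data referenced in this section.

    /* Tabulate counts, totals, and means of notional by instrument type */
    proc tabulate data=instruments;
        class instrument_type;
        var notional;
        table instrument_type, notional*(n sum mean);
    run;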
The Rawlings Group - Louisville, KY Oct 2011 to Nov 2013
Data Analyst/ SAS programmer
Description: The Rawlings Group is the largest and most established insurance claims recovery company in the industry and serves health plan sponsors and self-funded plans throughout the nation with a comprehensive line of paid-claims recovery services. Worked on membership and claims data and was involved in the complete software development life cycle of the project. The first phase was migration of data from client data warehouse servers to the application environment; the second phase was transforming the required tables and flat files into SAS datasets; the third and final phase was developing SAS analysis datasets and reports.
Roles & Responsibilities
Designed and developed Extract, Transform, Load (ETL) processes using Base SAS and Data Integration Studio to populate Data Warehouses and Reporting Data Marts.
Gained strong hands-on experience with various databases, including Oracle, MongoDB, MySQL, SQL Server 2012/2008 R2, and Teradata.
Designed and optimized various T-SQL database objects, including tables, views, stored procedures, user-defined functions, indexes, and triggers.
Worked on pre-existing macros for data validation by checking data distribution and comparison to standard data.
Worked on importing and exporting data between Oracle and HDFS/Hive using Sqoop.
Developed Information Maps and OLAP cubes based on project requirements.
Wrote Pig UDFs and used various UDFs from Piggybank and other sources.
Extracted data from different sources like claims data mart and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
Performed data cleaning by analyzing and eliminating duplicate and inaccurate data using PROC FREQ, PROC UNIVARIATE, PROC RANK, PROC TABULATE, and macros in SAS.
Used the SAS PROC SQL pass-through facility to connect to database tables and created SAS datasets using various SQL joins such as left join, right join, inner join, and full join (see the sketch at the end of this section).
Generated reports on providers, such as total amount billed and per-patient average billing amounts, for auditors and investigators using PROC UNIVARIATE, PROC MEANS, and PROC REPORT.
Built regression and predictive models in SAS and SPSS according to specifications as required.
Developed strategies for data analysis and data validation with the help of PL/SQL, SQL, and UNIX.
Assisted business analysts by writing complex ad hoc queries in SQL.
Analyzed source systems and target data requirements, identified gaps in the data, documented issues, and sought resolutions.
Reviewed mapping documents provided by the business team and implemented the business logic in the UNIX and PL/SQL scripts that load data structures in the database.
Worked on UNIX workstation operation, particularly Solaris and LINUX.
Environment: Teradata V2R6/V2R5, SQL Assistant, SAS, SPSS, PROC UNIVARIATE, PROC MEANS, PROC REPORT, Hadoop (Hive, Pig, HDFS), Business Objects, UNIX, Windows 7/XP.
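A minimal sketch of the PROC SQL pass-through usage noted above; the Oracle connection options (user, password, path) and the member/claim tables are hypothetical placeholders, not actual project details.

    proc sql;
        /* Push the join to the database, then land the result as a SAS dataset */
        connect to oracle (user=xxxx password=xxxx path="claimsdb");
        create table work.claims_joined as
            select * from connection to oracle
                (select m.member_id, c.claim_amt
                   from members m
                   inner join claims c
                     on m.member_id = c.member_id);
        disconnect from oracle;
    quit;

Pushing the join into the database this way avoids pulling both full tables into SAS before joining.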
AstraZeneca - Bangalore, Karnataka Feb 2010 to Sept 2011
Programmer/Analyst
Description: AstraZeneca is a global, innovation-driven biopharmaceutical business that focuses on the discovery, development, and commercialization of small-molecule and biologic prescription medicines, pioneering innovative research and exploring novel pathways across key therapeutic areas, including respiratory, inflammation and autoimmunity; cardiovascular and metabolic disease; oncology; neuroscience; and infection and vaccines.
Roles & Responsibilities:
Converted MS-Word documents and EXCEL tables into SAS data sets.
Hands-on experience with different types of databases, such as SQL Server, MongoDB, Oracle, and MySQL.
Converted Oracle data tables into SAS data files using the SAS SQL pass-through facility and uploaded SAS data files into Oracle tables using the SAS DBLOAD procedure.
Created web reports in HTML format using SAS/ODS.
Used SAS/STAT to perform analysis of variance for balanced and unbalanced designs, multivariate analysis of variance, and repeated-measures analysis of variance.
Created electronic datasets (i.e., SAS transport files) for electronic submission.
Created survival graphs in Excel by exporting SAS data sets into Excel spreadsheets.
Created Kaplan-Meier survival curves using PROC LIFETEST and PROC GPLOT in SAS (see the sketch at the end of this section).
Ensured quality of dataset at the protocol level while working to deliver competitive advantage and cost efficiency to the company as a whole.
Conducted, documented, and reported computer validation inspections in compliance with Title 21 of the Code of Federal Regulations (21 CFR) and other regulatory requirements.
Involved in preparing analysis plans, performing data analysis, writing statistical reports and presentations for the FDA, and programming patient profiles for statistical reports.
Created customized SAS reports using the DATA _NULL_ technique for FDA evaluations.
Generated graphs using SAS/GRAPH and the SAS Graphics Editor.
Developed SAS macros for data cleaning and reporting and to support routine processing.
Used PROC REPORT to generate reports.
Created different types of reports, including drill-down, drill-through, and more.
Created and executed stored procedures and complex queries for reports.
Deployed and scheduled reports in Report Manager.
Responsible for generating different types of daily and weekly reports based on requirements using SQL Server Business Intelligence tools.
Environment: SAS (V8.2), SAS/ODS, SAS/MACROS, SAS/BASE, SAS/STAT, SAS/GRAPH, SAS/SQL, SQL Server, Oracle 8, SAS/ACCESS, Tableau, SSRS, PROC REPORT, Windows 7/XP.
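A brief sketch of the Kaplan-Meier analysis mentioned above; the analysis dataset and variables (adtte, aval, cnsr, trt) are hypothetical names following a common time-to-event layout, not actual trial data.

    /* Kaplan-Meier estimates with a survival plot, stratified by arm */
    proc lifetest data=adtte plots=(s);
        time aval * cnsr(1);   /* cnsr = 1 marks censored observations */
        strata trt;            /* compare treatment arms via log-rank test */
    run;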
ICICI Bank - Bangalore, Karnataka - Jul 2008 to Jan 2010
Database Developer
Description: ICICI Bank is one of the largest banks in India. ICICI Bank offers a wide range of banking products and financial services to corporate and retail customers through a variety of delivery channels and through its specialized subsidiaries and affiliates in the areas of investment banking, life and non-life insurance, venture capital and asset management. ICICI Bank is considered to be the most valuable bank in India in terms of market capitalization.
Roles & Responsibilities:
Involved in the design of the overall database using Entity Relationship diagrams.
Strong knowledge of major databases, including MySQL, Oracle 10g/9i, SQL Server 2005, and DB2.
Wrote triggers, menus and stored procedures in PL/SQL.
Created Teradata BTEQ, FastLoad, and FastExport scripts and UNIX shell scripts to perform fulfillments; created tables and views to meet business needs, supported production systems with day-to-day activities, and addressed performance issues.
Involved in Data loading and Extracting functions using SQL*Loader.
Designed and developed all the tables, views for the system in Oracle.
Involved in developing interactive forms and customization of screens using Forms 4.5.
Developed SAS programs to generate reports, creating RTF and HTML listings and reports using SAS ODS for ad-hoc, weekly, and monthly report generation (see the sketch at the end of this section).
Developed UNIX and PL/SQL scripts for pre- and post-session processes to automate daily loads.
Worked with Modelers and Architects for table issues.
Involved in building, debugging and running forms.
Designed and developed form validation procedures for querying and updating data.
Developed SSRS reports such as drill-through, drill-down, linked, and parameterized reports.
Used the scheduler to automate report scheduling and generation; scheduled reports to run at set intervals, such as weekly or monthly.
Environment: Oracle 10g/9i, SQL*Plus, SQL*Loader, Teradata BTEQ, MySQL, DB2, Visual Studio 2005, TOAD, PL/SQL, Forms 4.5, Reports 2.5, UNIX, Windows 7/XP.
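To illustrate the ODS-based RTF/HTML reporting described above, here is a minimal sketch; the output paths and the accounts dataset (with branch and balance variables) are hypothetical.

    /* Send the same report to RTF and HTML destinations at once */
    ods rtf file="/reports/weekly_summary.rtf";
    ods html file="/reports/weekly_summary.html";

    proc report data=accounts nowd;
        column branch balance;
        define branch  / group "Branch";
        define balance / analysis sum "Total Balance";
    run;

    ods rtf close;
    ods html close;

Opening both destinations around one PROC step keeps the ad-hoc, weekly, and monthly outputs identical across formats.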
Education: Bachelor of Technology in Computer Science Engineering.