Anand Kumar Vanam
Email: ***************@*****.***
Phone: 347-***-****
Current Location: Nashville, TN
Experience summary:
Over 10 years of IT experience as a software developer and programmer analyst.
Strong background and skills in database design, logical/physical data modeling, and data analysis to drive superior data quality, security, privacy, integrity, and performance in highly distributed/cloud-based environments.
Strong design and modeling skills across BI architectural and pipeline components, including dimensional modeling, data warehouse design, the BI life cycle, cubes, ETL, and source-data profiling.
Experience in Information Technologies and Manufacturing industries. Functional experience working on quality, order management, planning & scheduling, shipping or operations management information systems projects.
Develop web services and APIs using REST and SOAP.
Implement user interfaces using JavaScript, ASP.Net, C#, and Kendo UI.
Experience in report visualization tools such as Power BI, Power View and Excel.
Provide tier 1 to 3 support to existing clients. Analyze support requests for trends and training or system enhancement opportunities.
Strong conceptual knowledge in Software Design and Software Development Life Cycle (SDLC).
Set up continuous integration using Microsoft Visual Studio Team Foundation Server to trigger automated build and deployment of various applications.
Technical Skills:
DBMS
SQL Azure, MS SQL Server 2016/2014/2012, Oracle 10g/9i, TableStore, Blobstore
Languages
T-SQL, PL/SQL, ASP.Net, C, C#, PowerShell, Python, R, JavaScript.
Tools
SQL Management Studio, SQL Profiler, SSIS, Azure Data Factory, JDA WMS, SSRS, Erwin, Visual Studio 2019/2015/2013, Code flow 2010, Power BI, R-Studio, Fiddler, Data Lake, JDA, JDE, Kendo-UI.
Operating Systems
Windows Server 2016/2012 R2/2012/2008, Linux, macOS
Education:
Master of Science in Software Engineering, Stratford University, VA in May 2009.
Bachelor of Technology in Electrical and Electronics, JNTU, India in May 2007.
Professional Experience:
Client: Altria (USSTC), Nashville, TN
Project: Manufacturing Systems Framework 2.2 01/2017 – Present
Role: Programmer Analyst
Worked as a Programmer Analyst on various projects, including MSF 2.2, Leaf 2.0, and Finishing Framework 1.0.
Involved in database design for Manufacturing Systems Framework for Eagle One, Nashville and Hopkinsville.
Analyzed requirements and participated in technical sessions and meetings to define the approach and design for various enterprise applications.
Assessed technical architecture, design, and frameworks to develop technical approaches, documenting the details to help teams collaborate and share knowledge efficiently.
Gathered and understood business requirements to determine user expectations for various enterprise applications by reviewing the Epics and Features captured in JIRA.
Participated in brainstorming and planning sessions to prioritize Epics and Features.
Developed software diagrams and database models using Visio to depict the functional, technical, and data designs.
Developed database code for JDA WMS system.
Developed web services and APIs using REST and SOAP.
Implemented the user interface using JavaScript, ASP.Net, C#, and Kendo UI.
Performed QA and production deployments using the Redgate Change Automation deployment tool and Microsoft Visual Studio code deployment, comparing and analyzing changes to ensure clean, timely deployments.
Environment/Tools: SQL Server 2016/2014, Visual Studio 2019/2015, Kendo UI, SSIS, T-SQL, JDA WMS, Power BI, ASP.Net, C#, JavaScript, Excel, Windows Server 2016/2012, Fiddler, Postman.
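To illustrate the kind of REST endpoint the framework services exposed, here is a minimal sketch. This is a hypothetical Python stand-in (Python is among the listed skills), not the production ASP.Net/C# code; the route, the work-order fields, and the handler name are all invented for illustration.

```python
import json

# Invented sample data standing in for a manufacturing work-order store.
WORK_ORDERS = {101: {"id": 101, "status": "Open", "line": "Nashville-1"}}

def handle_request(method: str, path: str):
    """Dispatch a request like 'GET /workorders/101'; return (status, JSON body)."""
    parts = path.strip("/").split("/")
    if method == "GET" and parts[0] == "workorders" and len(parts) == 2:
        order = WORK_ORDERS.get(int(parts[1]))
        if order is None:
            return 404, json.dumps({"error": "not found"})
        return 200, json.dumps(order)
    return 405, json.dumps({"error": "unsupported"})

status, body = handle_request("GET", "/workorders/101")
```

In the real stack this dispatch-and-serialize step would be handled by the ASP.Net routing and serialization layers; the sketch only shows the request/response contract.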
Client: Microsoft, Redmond, WA
Project: LeX (Learning Experience) BI Group 07/2015 – 01/2017
Role: Sr. SQL Server BI Developer
As the LeX BI team's senior developer, delivered new BI reports and Power BI dashboards for the LeX BI group, implementing BI reporting solutions covering data acquisition, data warehouse design, and Power BI reporting.
Created a metadata-driven design pattern tool that generates stored procedures and SSIS packages to move data from source systems such as flat files and SQL databases into the data warehouse.
Developed the MVA (https://mva.microsoft.com/) dimensional and fact model, including data acquisition from Azure DB, Tablestore, and Blobstore, and syncing data between the cube and the reporting solution.
Developed a data mart for edX courses hosted by Microsoft on www.edx.org, acquiring data from AWS S3, building dimensions, fact tables, and a cube, and developing Power BI reports to support business requirements.
Published metrics for the DX Radar Scorecard by building an Analysis Services Tabular model, gathering data from different source-system teams, and preparing monthly scorecard numbers for the various metrics.
Environment/Tools: SQL Azure, SQL Server 2016/2014, Microsoft Azure Table Store, Blob store, Azure Data Factory, Amazon AWS S3, Visual Studio 2015/2013, SSAS, SSIS, T-SQL, Power BI, C#, JSON, R, Python, Excel.
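The metadata-driven design pattern tool mentioned above can be sketched as a simple code generator: given a table's metadata, emit the T-SQL for a load procedure. This Python sketch is illustrative only; the procedure template, schema names, and sample table are assumptions, not the original tool's output.

```python
def generate_load_proc(table: str, columns: list, source_schema: str = "stg",
                       target_schema: str = "dw") -> str:
    """Build a CREATE PROCEDURE statement that truncates and reloads a target table."""
    col_list = ", ".join(columns)
    return (
        f"CREATE PROCEDURE {target_schema}.usp_Load_{table} AS\n"
        f"BEGIN\n"
        f"    TRUNCATE TABLE {target_schema}.{table};\n"
        f"    INSERT INTO {target_schema}.{table} ({col_list})\n"
        f"    SELECT {col_list} FROM {source_schema}.{table};\n"
        f"END"
    )

# Hypothetical dimension table; the real tool read such metadata from config tables.
sql = generate_load_proc("DimCourse", ["CourseId", "Title", "PublishedDate"])
```

The value of the pattern is that adding a new source table means adding a metadata row, not hand-writing another procedure or SSIS package.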
Client: Microsoft, Redmond, WA
Project: Microsoft LeX Engineering Group 04/2013 – 07/2015
Role: Sr. SQL Server BI Developer
The project developed the Microsoft Learning platform (MLX and MVA).
Involved in the design and implementation of the MLX learning platform: a solid, clean, highly distributed, high-performance system. Work included logical and physical data modeling, database architecture, and Azure data storage architecture design and implementation with Azure database, Azure table store, and blob store.
Worked with BI Engineering team and business groups to improve data feeds to ensure load performance and data quality and formulated the overall BI architecture that provides an infrastructure and process to meet the changing business needs.
Involved in the complex SkillSoft (Element K) and other LeX migration projects, covering multiple business-channel applications: IT Academy, Software Assurance, Dynamics eLearning, Exam Registration, MCP (MSCert), and MVA.
Studied and analyzed complex Oracle data sources, did extensive data profiling and analysis work. Designed and implemented 15+ ETLs with quality data cleansing and transformation to ensure data integrity and quality.
Worked with several teams to resolve migration issues: minimizing target-system performance degradation, achieving near-zero downtime during migration, handling special migration cases, and cleaning up dirty data. Verified that all source data was moved and used correctly by the new system's features and that business owners were satisfied.
Performed MVA (https://mva.microsoft.com/) reporting performance tuning, profiling problematic queries and rewriting long-running ones. Created reports that display completed course certificates and transcripts for the learner portal.
Environment/Tools: SQL Azure, SQL Server 2012, Microsoft Azure Table Store, Blob store, Visual Studio 2012, SSIS, T-SQL, Erwin, C#, JSON.
Client: Microsoft, Issaquah, WA
Project: Argentum 11/2011 – 03/2013
Role: SQL Developer/ ETL Developer
Microsoft compensates partners for their assistance to customers who sign up for MS Cloud Services or provide their IP (apps) to the marketplaces. The Argentum Partner Incentive Calculation project built an end-to-end partner fee payment platform that provides a common fee-payment-processing experience to partners across multiple Microsoft lines of business.
This project dealt with the creation and maintenance of the Data Mediation Layer, a SQL Server-based integration layer that extracts data from multiple source systems, transforms it, and sends it downstream to the SAP CI and CC modules per the business requirements.
Worked with the System Analyst and Technical Architect in design sessions to develop high level ETL architecture, Technical Specification Documentation for the project by studying the Business Requirements and Functional Specs.
Created the Data Mediation Layer, with a proper database design, to support the business requirements.
Created SSIS packages to extract data from different source systems, including SQL databases and mappings from Excel sheets.
Created stored procedures that provide partner master data to the SAP CI (invoicing) system and calculate partner incentives based on business requirements.
Loaded data into the SAP CI (invoicing) module via a C# web service call.
Configured Database Mail on all project specific SQL Servers.
Created the deployment utility for the project using MS Build. Created the deployment document and handed over to the Operations team.
Worked closely with the business team, as part of Production support team.
Actively participated in User Acceptance Testing and Debugging of the system.
Environment/Tools: SQL Server 2008R2, Management Studio, SQL Server Integration Services, C#, T-SQL, Visual Studio 2010/2008, Erwin.
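The incentive-calculation logic in the stored procedures above can be illustrated with a small sketch: apply a program rate to each partner's eligible revenue and sum per partner. The program names, rates, and records below are invented for illustration; the actual rules lived in T-SQL, not Python.

```python
# Hypothetical incentive rates per program (invented values).
RATES = {"cloud_signup": 0.10, "marketplace_ip": 0.15}

def calculate_incentive(records):
    """Sum rate * revenue per partner across that partner's transactions."""
    totals = {}
    for rec in records:
        fee = rec["revenue"] * RATES[rec["program"]]
        totals[rec["partner"]] = totals.get(rec["partner"], 0.0) + fee
    return totals

payouts = calculate_incentive([
    {"partner": "P1", "program": "cloud_signup", "revenue": 1000.0},
    {"partner": "P1", "program": "marketplace_ip", "revenue": 200.0},
    {"partner": "P2", "program": "cloud_signup", "revenue": 500.0},
])
```

In the real system this aggregation fed the SAP CI invoicing module downstream.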
Client: Microsoft, Issaquah, WA
Project: PDMI Data Migration Phase II 05/2011 – 11/2011
Role: SQL Developer/ ETL Developer
Product Data Management Initiative (PDMI) focuses on addressing new product attribution, new product relationships, improved product bundling, automation, and enhanced reporting. A new product data model was required to accommodate new and classic products and channels and to eliminate quality, agility and scalability constraints.
Involved in database design, normalization, data modeling, and E-R diagrams using the Erwin tool.
Built and maintained complex SQL queries using T-SQL for data analysis, data mining, and data manipulation. Organized analytical data, ensured data quality, and aggregated data for strategic reporting and performance optimization of database solutions.
Created a generic platform for all data migration projects, called DME, which migrates data from one system to another in a simplified manner. First used on this project, the platform is now used by seven projects in this group.
Worked closely with data analysts and business users to gather requirements and produced an optimized ETL solution for the project.
Used SSIS packages extensively to extract data from Home Depot (SQL db) and FeedStore.
Created a SCOM/MOM monitoring utility for the project in which SSIS packages log errors to a database; those errors are forwarded to the Windows Event Log, via extended stored procedures with SQL CLR and C#, for better visibility to the operations team.
Environment/Tools: SQL Server 2008R2, Management Studio, SQL Server Reporting Services, SQL Server Integration Services, SQL CLR, C#, VB Script, T-SQL, Visual Studio 2010/2008, Erwin.
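The error-forwarding idea behind the monitoring utility can be sketched as a watermark-based poll: read error-log rows past the last forwarded ID and push each to an event sink. The real utility used SQL CLR extended stored procedures writing to the Windows Event Log; this Python mock, with invented row shapes, only shows the pattern.

```python
def forward_errors(log_rows, event_log, last_forwarded_id=0):
    """Forward rows with id greater than the watermark; return the new watermark."""
    for row in log_rows:
        if row["id"] > last_forwarded_id:
            # Stand-in for a Windows Event Log write.
            event_log.append(f"SSIS error {row['id']}: {row['message']}")
            last_forwarded_id = row["id"]
    return last_forwarded_id

events = []
rows = [{"id": 1, "message": "Lookup failed"}, {"id": 2, "message": "Truncation"}]
watermark = forward_errors(rows, events)
```

Keeping a watermark means repeated polls forward each error exactly once, which is what made the log usable for the operations team.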
Client: Microsoft, Issaquah, WA
Project: E-Recruit 06/2010 – 05/2011
Role: SQL Server Developer
The New Generation Staffing project maintains a separate landscape for e-recruiting in ECC 6.0, tasked with pulling HR-related data from an altogether new R/3 system, NGS (EREC01). As part of this project, HR-related data is staged and published to the Feed Store using the BizTalk connector.
Worked with design architect and helped in writing the technical specifications for project.
Created a centralized logging mechanism (aka Sawmill) for the project to track operation status and log errors using SQL Service Broker. More than 10 projects used this mechanism to track SSIS task execution and log errors to SQL tables.
Did extensive research on the BI and BizTalk connectors for SAP and built a proof of concept comparing the performance of the two connectors for pulling data from SAP.
Created an ETL process that moves HR data from SAP systems to the Feedstore (SQL database) using the BizTalk connector.
Improved SLA performance by designing a parallel processing system in which SSIS packages execute in parallel, driven by metadata.
Worked closely with the SAP-ABAP team to discuss communication methods from SAP to SQL Server for the design.
Created SQL Agent jobs to trigger the SSIS Packages and created daily schedule for them.
Created technical documentation and Operational guides as required.
Environment/Tools: SQL Server 2008R2/2008, SQL Server Integration Services, T-SQL, Team Foundation Server, Visual Studio 2008.
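The metadata-driven parallel execution scheme above can be sketched as sequential "waves" of packages, where packages within a wave run concurrently. This Python sketch uses a thread pool as a stand-in for launching SSIS packages (e.g. via dtexec or SQL Agent); the package names and wave layout are invented.

```python
from concurrent.futures import ThreadPoolExecutor

def run_package(name: str) -> str:
    # Stand-in for launching one SSIS package and waiting for its result.
    return f"{name}: succeeded"

def run_waves(waves):
    """Run each wave's packages concurrently; waves themselves run in order."""
    results = []
    for wave in waves:
        with ThreadPoolExecutor(max_workers=len(wave)) as pool:
            results.extend(pool.map(run_package, wave))
    return results

# Hypothetical metadata: independent extracts first, then the dependent load.
metadata = [["ExtractHR", "ExtractOrg"], ["LoadFeedStore"]]
results = run_waves(metadata)
```

Encoding the waves as metadata means the degree of parallelism can be tuned without editing the packages themselves, which is how the SLA improvement was achieved.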