Resume

Data Sales

Location:
Charlotte, NC
Posted:
March 21, 2018

Srinivas Vegesna

ac4v5k@r.postjobfree.com

Ph No: 704-***-****

EXPERIENCE SUMMARY

Highly skilled in handling end-to-end data warehousing projects and implementations in various capacities.

Excellent organizational, client-relationship, and people-management skills; strong in project planning and estimation.

Ability to guide and manage medium-to-large teams.

18 years of experience in data warehouse and client-server (2-tier, 3-tier architecture) solutions in various roles as a Data Architect (Kimball and Inmon approaches), Lead, Systems Analyst, and Technical Mentor with TIAA, Bank of America, Ally Bank, Wells Fargo, Johnson and Johnson, Sony, XEROX, and D&B.

Dimensional (OLAP) and OLTP data modeling experience using star schema, snowflake schema, and 3NF modeling (logical and physical) with Erwin 4.x and Visio.

Good knowledge of Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.

Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice versa.

Hands-on experience with various data integration platforms such as Informatica PowerCenter 9.x/7.x/6.x/5.x, DataStage 11.3, and SQL Server Integration Services.

Involved in L1 and L2 application support: issue analysis, bug identification and workarounds, responses to functional queries, and analysis of user requirements for minor enhancements.

Experience with business intelligence reporting systems: Business Objects 6.0, Cognos 8.x, and SQL Server Reporting Services 2000/2005.

Excellent working knowledge of Oracle SQL and PL/SQL, DB2 UDB, MS SQL Server, and Teradata SQL (DDL & DML), including stored procedures, triggers, performance-related transactions, and DTS services.

Worked in the banking, manufacturing, telecom, and pharma domains.

Executed all phases of the Software Development Life Cycle (SDLC).

Superior communication, decision-making, and organizational skills; customer-service oriented, with excellent analytical and problem-solving skills.

SKILLS/TOOLS

Data Modeling

Erwin 8.x, Oracle Designer, MS Visio 2000/03/05

ETL

Informatica Power Center 9.x/7.x/6.x/5.x, Informatica Data Quality, DataStage 11.3 and Business Objects Data Integrator.

Business Intelligence

Business Objects Desktop and Web Intelligence, Cognos 8.2, OBIEE, SAP BW 3.1/4.7/7.0, SQL Server Reporting Services 2000/2005, Crystal Reports.

Database

Oracle 10g/9i/8i, DB2, SQL Server 2005/2000/7.x, Teradata 13.0, MS Access.

Big Data technologies

MapReduce, MongoDB, HDFS, Hive 2.4, Sqoop 1.4, Flume, Scala and Spark.

NoSQL databases

HBase.

Tools & Utilities/Others

Control-M, Visual Studio 6.0/2003/2005, SQL*Plus, MS Office 2000/2003, Visual SourceSafe, TOAD 7.5, AutoSys.

Languages

SQL, PL/SQL, Visual Basic 6.0, ASP 3.0, Unix/Linux shell scripting for scheduling Informatica jobs

Operating Systems

HPUX 11.11i, MS-DOS, Windows 95/98/2000/NT/XP and Sun Solaris UNIX.

Exposure/Trained in

C, C++, SQL Server Integration Services

EDUCATION

Bachelor of Engineering from Andhra University, India.

Significant Projects / Assignment Handled

KPI Flows: Apr 2014 to present

Client: TIAA

Charlotte NC

The scope of the project is to accurately, timely, and reliably measure and report Gross Inflows & Outflows and Net Flows; Assets by Product, Investment Vehicle, and in Total; and select key volume data the firm requires to evaluate the success of its growth strategies.

This project replaced an Excel-worksheet-based application used to solicit, transform, store, and report enterprise Gross and Net Flows, period-end Assets, and select transaction volume data with a scalable, flexible relational database offering automated data import, standard report production, and database query capability. The reports can retrieve Month-to-Date, Quarter-to-Date, and Year-to-Date information and also support ad-hoc analysis of the data.

Roles & Responsibility: ETL Architect / Data Modeler

Analyzed and reviewed the business specification documents and interacted with business users to collect requirements.

Data analysis and validation.

Designed ETL processes and mapping specification documents for ETL requirements.

Developed custom technical artifacts such as coding standards, technical specifications, and test specifications.

Designed and developed ETL mappings and workflows.

Involved in database development activities such as procedures, functions, and triggers.

Performed ETL administrator duties such as maintaining folder/user-level privileges and stopping and restarting services.

Resolved ongoing maintenance issues and bug fixes; monitored Informatica sessions and performance-tuned mappings and sessions.

Developed solutions to process and load data into HDFS.

Analyzed the data using MapReduce and Hive and delivered summary results from Hadoop to downstream systems.

Worked extensively with Hive DDLs and Hive Query Language (HQL).
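As context for the Hive DDL/HQL work above, a minimal sketch of the kind of table definition and summary query involved (the table name, columns, and HDFS path are illustrative placeholders, not actual project values):

```shell
# Write a sample Hive DDL + summary query to a script file. On a real cluster
# this would be executed with `hive -f summary.hql`; here the script is only
# generated and inspected, since no Hive installation is assumed.
cat > summary.hql <<'EOF'
-- External table over raw delimited files landed in HDFS (e.g. by Sqoop)
CREATE EXTERNAL TABLE IF NOT EXISTS txn_raw (
  txn_id     BIGINT,
  account_id BIGINT,
  amount     DECIMAL(18,2),
  txn_date   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/txn';

-- Daily summary fed to downstream systems
SELECT txn_date, COUNT(*) AS txn_count, SUM(amount) AS total_amount
FROM txn_raw
GROUP BY txn_date;
EOF
echo "wrote summary.hql"
```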

Developed UDF, UDAF, and UDTF functions and used them in Hive queries.

Implemented Sqoop for large dataset transfers between Hadoop and RDBMSs.

Used Sqoop extensively to import data from various systems/sources (such as MySQL) into HDFS.
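A representative Sqoop import of the kind described above might look like the following (a sketch only; the JDBC URL, username, table, and target directory are hypothetical placeholders, and Sqoop 1.4 is assumed to be on the PATH):

```shell
# Compose a Sqoop import command that pulls a MySQL table into HDFS.
# The command is built into a variable and echoed rather than executed,
# since no Sqoop/MySQL environment is assumed here; `-P` would prompt
# for the password interactively on a real run.
SQOOP_CMD="sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4 \
  --fields-terminated-by ','"
echo "$SQOOP_CMD"
# On a real edge node: eval "$SQOOP_CMD"
```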

Created components such as Hive UDFs to supply functionality missing from Hive for analytics.

Developed scripts and batch jobs to schedule bundles (groups of coordinators) of workflow jobs.

Exported the analyzed data to relational databases using Sqoop for visualization and report generation by the BI team.

Involved in ETL, Data Integration and Migration.

Used different file formats such as text files, Sequence Files, and Avro.

Assisted in creating and maintaining technical documentation for launching Hadoop clusters and executing Hive queries.

Assisted in cluster maintenance, cluster monitoring, adding and removing cluster nodes, and troubleshooting.

Environment: Erwin 8.6, Informatica PowerCenter 9.1, DataStage 11.3, Oracle 10g, Hadoop, MapReduce, HDFS, Sqoop 1.4, Linux, Hive 2.4, HBase, Hadoop cluster, Java, Tableau 9.3

WIMS: Oct 2013 to Mar 2014

Client: Bank of America.

Charlotte NC

WIMS (Wire Image Management System) provides support to wire database units through a single web-based imaging application that can be shared across the corporation. Image-enabled workflow offers a streamlined alternative to paper-based processing and facilitates a common approach to back-office processing.

Roles & Responsibility: ETL Architect/ Data Modeler

Involved in requirement collection, analysis and design.

Created and documented the Logical Data Model (LDM) and Physical Data Model (PDM) using Erwin.

Data analysis and validation.

Involved in all database development activities such as procedures, functions, and triggers.

Database tuning.

Handling change requests, code deployments.

Data mapping and migration of existing data.

Experience in using automation scheduling tools such as AutoSys.

Created and configured workflows, worklets, and sessions to transport data to targets using Informatica Workflow Manager, and scheduled the jobs using AutoSys.
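A wrapper script of the kind an AutoSys job would invoke to start an Informatica workflow via the `pmcmd` CLI might look like this (a sketch under assumptions: the service, domain, folder, workflow, and credential names are hypothetical placeholders, and `pmcmd` availability is checked rather than assumed):

```shell
# AutoSys-callable wrapper: start an Informatica workflow with pmcmd and
# propagate its exit status back to the scheduler. All names are placeholders.
INFA_SVC="int_svc_prod"        # integration service
INFA_DOMAIN="dom_prod"         # Informatica domain
INFA_FOLDER="WIMS"             # repository folder
WORKFLOW="wf_load_wire_images" # workflow to run

run_workflow() {
  if ! command -v pmcmd >/dev/null 2>&1; then
    # No Informatica client here: report what would run instead of failing.
    echo "dry run: would start workflow $WORKFLOW in folder $INFA_FOLDER"
    return 0
  fi
  pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f "$INFA_FOLDER" -wait "$WORKFLOW"
}

run_workflow
```

An AutoSys job would point its command attribute at this script, so the workflow's success or failure is reflected in the job status.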

Researched and developed new technologies and procedures to further streamline and automate processes within both the group and the department.

Coached, mentored, and led personnel within a technical team environment.

Validated environments after regular OS and DB patches.

Environment: Erwin 8.6, Informatica 9.1, Oracle 10g, UNIX, AutoSys, Core FTP & Windows XP.

Advisor Best Practices: Mar 2013 – Sep 2013

Client: Wells Fargo

Charlotte NC

This project provides consolidated lists of targeted business opportunities and information to Financial Advisors in support of Best Practice adoption, firm initiatives and increased productivity.

It improves client experience through a more disciplined sales process and uses the data more effectively to serve our clients' total financial needs by delivering more consistent and higher quality advice to clients. The project targets an incremental Revenue generation of $10MM.

This effort involved creating the data structures in the Brokerage Data Warehouse to support the mining of Advisor Best Practice Opportunity Lists. In addition, an OLTP database was created for this To Do List application in the Operational Data Store, Sales domain. The database is designed to support the administration of Opportunities and the functioning of the field-facing end-user application. Web Services provide data interactions between the UI and the database, and BPMS components support the business rules and state management of the Opportunity life cycle.

Roles & Responsibility: Data Modeler /ETL Architect

Analyzed and reviewed the business specification documents and interacted with business users to collect requirements.

Created and documented the Logical Data Model (LDM) and Physical Data Model (PDM) using Erwin.

Involved in writing data mining queries and in data analysis and validation.

Designed ETL processes and mapping specification documents for ETL requirements.

Developed custom technical artifacts such as coding standards, technical specifications, and test specifications.

Designed and developed complex Informatica mappings and mapplets.

Involved in ETL design, process flows, workflows, and executing batch jobs.

Environment: Erwin 8.6, Informatica PowerCenter 8.6.1, Oracle 10g, DB2, UNIX, Core FTP & Windows XP.

BASEL: Feb 2012 – Feb 2013

Client: Ally Bank

Charlotte NC

The Basel II and Basel I capital calculations project was implemented using a COTS product, Oracle Reveleus, to support Basel II regulatory capital calculations. This is a data consolidation project built on an ETL framework to extract and consume data from multiple source systems using DataStage. After consolidation, the Oracle Reveleus product is used for Basel II and Basel I capital calculations, including OBIEE reporting.

Roles & Responsibility: ETL Architect

Involved in requirement collection, analysis and design.

Designed the conceptual data model.

Created and documented the Logical Data Model (LDM) and Physical Data Model (PDM) using Erwin.

Data analysis and validation.

ETL process design, specifications, and development of mappings.

Managed day-to-day activities and data loads.

Designed ETL processes and created technical specification documents for ETL requirements.

Coordinated with the DBA to modify existing database objects.

Environment: Erwin 7.3, DataStage 9.1, Oracle 10g, UNIX, Core FTP & Windows XP.

FIU DATAMART: Mar 2011 to Feb 2012

Client: Bank of America

Charlotte NC

FIU Data Mart is a centralized data store fulfilling all global Anti-Money Laundering (AML) reporting, analytics, and record-retention needs. It contains end-to-end, comprehensive views of all enterprise global AML activities. The scope of the data mart is to meet the data storage needs of the enterprise's global AML activities; it is reasonably standardized for all current and potential future variations in data sources, tailored processes, and tools.

Roles & Responsibility: Data Modeler

Analyzed and reviewed the business specification documents and interacted with business users to collect requirements.

Designed the conceptual data model.

Created and documented the Logical Data Model (LDM) and Physical Data Model (PDM) using Erwin.

Data analysis and validation.

Designed ETL processes and created technical specification documents for ETL requirements.

Developed custom technical artifacts such as coding standards, technical specifications, and test specifications.

Involved in ETL design and development activities.

Designed and developed complex Informatica mappings and workflows.

Coordinated with the offshore team.

Environment: Erwin 7.3, Informatica PowerCenter 8.6.1, Oracle 10g, Teradata, UNIX, Core FTP & Windows XP.

Crossroads (Medical Devices & Diagnostics): Jul 2009 to Mar 2011

Client: Johnson and Johnson

Piscataway NJ

Crossroads will standardize processes, data, and systems globally on single, integrated instances of SAP, making the MD&D supply chain simpler, faster, and more efficient. Crossroads is a migration project consolidating 50 different applications into SAP NetWeaver 7.0. It is a key component of the MD&D strategy to create synergies in production and inventory management across the MD&D operating companies and a foundation for supply chain innovation across MD&D, creating a shared platform that will be leveraged by many subsequent initiatives, including Product Life Cycle Management, Quality Management, and Advanced Demand and Supply Planning.

This project involved extracting data from multiple source systems using Informatica PowerCenter 8.6.1, performing conversions, and loading the data into SAP. It also involved developing IDE reports, IDQ dashboard reports, and data validations.

Roles & Responsibility: ETL Lead Developer

Analyzed and reviewed the business specification documents and interacted with business users to collect requirements.

Worked on a significant number of conversions from legacy systems to meet global and business requirements.

Prepared the flow documentation for conversions.

ETL process design, specifications, and development of test cases.

Managed day-to-day activities and data loads.

Worked extensively on the performance tuning of programs and ETL procedures and processes.

Coordinated with the offshore team.

Environment: Informatica PowerCenter 8.6.1, PowerExchange 8.6.0, SAP (BAPI/RFCs, IDocs), SAP Logon, IDE 8.6.0, IDQ 8.6.0, Oracle 10g, Erwin 7.3, Mainframe, DB2, SQL Server 2005, UNIX, Core FTP & Windows XP.

Corporate Data Warehouse: Nov 2008 to Jun 2009

Client: SONY ELECTRONICS

San Diego CA

Corporate Data Warehouse (CDW) is the main data warehouse system for Sony Electronics and provides consolidated data for various reporting applications. The main functionality of CDW is to collect input from systems such as SAP, STN, Oracle OMS, and ICOM, consolidate and format the source data, and make it available to the reporting applications. Input systems for CDW are mainframe- and client-server-based: STN is mainframe-based, while SAP is client-server. The report-enabled data is sent to output systems such as Oracle, from which reports are generated and submitted to the management line.

Many subsystems come under the data warehouse, such as Daily Sales, SSIS, Inventory, and Asset Management.

Roles & Responsibility: ETL Lead Developer

Involved in gap analysis and requirements collection.

Analyzed and Reviewed the Business Specification Documents.

Designed data model and ETL processes for daily and weekly batch processes.

Created and documented the Logical Data Model (LDM) and Physical Data Model (PDM) using Erwin.

Created technical specification documents for ETL requirements.

Developed custom technical artifacts such as coding standards, technical specifications, and test specifications.

Facilitated and supported functional unit testing and integrated test cycles 1, 2, and 3.

Successfully implemented configuration management while moving development objects into quality and subsequently into production systems.

Involved in ETL design and development activities.

Designed and developed complex Informatica mappings and workflows.

Coordinated with the offshore team.

Data analysis and validation.

Environment: Informatica 8.5, Toad 8.0, Oracle 10g, PL/SQL, UNIX, Cognos 8.2, Business Objects 6.5, Windows XP

Customer Information: Feb 2007 to Oct 2008

Client: XEROX CORPORATION USA

Rochester NY

Xerox Corporation is the world's leading document management technology and services enterprise, providing the document industry's broadest portfolio of offerings. Digital systems include color and black-and-white printing and publishing systems, digital presses and "book factories," multifunction devices, laser and solid ink network printers, copiers, and fax machines. Xerox's services expertise is unmatched and includes helping businesses develop online document archives, analyzing how employees can most efficiently share documents and knowledge in the office, operating in-house print shops or mailrooms, and building Web-based processes for personalizing direct mail, invoices, brochures, and more. Xerox also offers associated software, support, and supplies such as toner, paper, and ink.

The company's operations are guided by customer-focused and employee centered core values such as social responsibility, diversity and quality augmented by a passion for innovation, speed and adaptability.

The customer information program is an initiative that provides Xerox with the capability to obtain and consolidate global, establishment-level customer data. Data collected is made available to Xerox users via the Xerox Global intranet (XGI). This data supports the management reports produced by the Cognos reporting system.

Roles & Responsibility: ETL Sr Developer

Gathered requirements from business users and performed analysis based on the requirements.

Created technical specification documents for USCR and CI.

Analyzed and Reviewed the Business Specification Documents.

Used the SQL Developer tool to issue SQL commands matching the business requirements for loading, creating, and analyzing tables.

Cognos User administration.

Day to day operational support activities.

Implemented an onsite-offshore model for 24x7 production support.

Coordinated with the offshore team.

Data analysis and validation.

Handled change requests.

Environment: Informatica 7.1, Toad 8.0, Oracle 10g, PL/SQL, UNIX, Cognos 8.2, ReportNet MR2, Trillium, VSS, Framework Manager, and Windows XP.

Compliance Dashboard Reporting Project: Sep 2005 to Dec 2006

Client: GlaxoSmithKline Inc USA

Bangalore, India

GlaxoSmithKline Inc. - one of the world's leading research-based pharmaceutical and healthcare companies - is committed to improving the quality of human life by enabling people to do more, feel better, and live longer. The Compliance Dashboard Reporting project was undertaken for the GSK Corporate IT group to generate compliance-related reports. The infractions and exceptions created by employees are gathered from various groups within GSK. A single Compliance Dashboard data mart is built to collect and store data from various source systems. The ETL system extracts data from the different sources, transforms it, and loads it into the dashboard data mart, from which infraction, exception, and various analytical reports are created. Dashboard users can log on to the reporting system to view and generate compliance reports based on policies and business units.

Environment: Informatica 6.2, Oracle 9i, Erwin, Cognos 7.3, VSS, HP-UX

Roles & Responsibility: ETL Developer

Primarily responsible for ETL process design, designing ETL specifications, and developing test cases.

Designed and developed complex Informatica mappings using expressions, aggregators, filters, lookups, and stored procedures to ensure data movement between applications.

Designed and built data marts using star schemas.

Designed the dimensional model of the data warehouse; confirmed source data layouts and needs.

Identified source systems, connectivity, tables, and fields to ensure data suitability.

Developed and documented data flows, transformations, and sessions as per the business requirements.

Created sessions and executed them to load data from the source systems using Workflow Manager.

Performed error checking and testing of the ETL procedures and programs using Informatica session logs.

Extensively worked in the performance tuning of programs, ETL procedures & processes.

Pipeline Leadership: Jun 2004 to Aug 2005

Client: D&B USA

Bangalore, India

The scope of this project is to provide agreement (contract) information for customers in a new star within the analytics data mart, to facilitate better sales pipeline reporting for executives and sales leaders. The source is the Siebel Sales application. This release deals with agreements/contracts as part of the sales pipeline.

Roles & Responsibility: ETL Developer

Primarily responsible for developing mappings and reusable transformations using Informatica PowerCenter and testing the objects; also modified Informatica mappings and sessions to improve the performance, accuracy, and maintainability of existing ETL functionality.

Environment: Informatica 6.2, Erwin, SQL Server 2000, Oracle 9i, VSS

Revenue Generation and PAI: Aug 2000 to May 2004

Client : ICFAI India

This project maintains the history of students, finance information, students' geographic locations, and student divisions. Responsibilities included understanding the OLTP data and requirements; extracting, transforming, and loading data from Oracle, SQL Server, and DBF files into a staging database using the Informatica ETL tool; loading all data from the staging database into the warehouse database (Oracle 8i); and generating reports using Business Objects.

Roles & Responsibility: Senior programmer

Involved in creating universes using the Business Objects Designer tool.

Modified existing universes to generate reports according to client requests, using the Designer module in Business Objects.

Generated various reports as per client requests.

Created and inserted tables into universes as per client requirements.

Environment: Windows 2000, Oracle 8i, Business Objects 5.1, and Informatica 5.0.


