

Vijayalakshmi Sambandam

Email: adczia@r.postjobfree.com

Mobile: +1-804-***-****

PROFESSIONAL SUMMARY

Over 13 years of versatile expertise in IT design, development and implementation of Data Warehouse and Business Intelligence applications in the Finance, Retail, Manufacturing, Telecommunications and Healthcare domains

Proven experience in building large, complex datasets and data warehouses (OLAP), translating business requirements into analytical and reporting needs

Responsible for data engineering functions including data extraction, transformation, loading and integration in support of enterprise data infrastructures – data warehouse and operational data stores

Good understanding of data warehousing techniques: Star/Snowflake schemas, ETL, fact and dimension tables, physical and logical data modeling for OLAP, and report delivery methodologies

Advanced skills in data-intensive application development, data integration, and data pipeline design patterns on distributed platforms using SMP/MPP databases (Oracle, Teradata, MS SQL Server, Redshift)

Working knowledge of implementing data warehouse solutions in the AWS cloud using Amazon Redshift

Proficient in designing and developing strategies to Extract, Transform, Load data to the data warehouse using Informatica PowerCenter

Strong hands-on experience using Teradata utilities (Macros, BTEQ, FastLoad, MultiLoad, FastExport, TPump) and the Teradata Unity suite with Data Mover

Expertise in developing SQL and PL/SQL code, including procedures, functions, triggers and packages, to implement business logic

Proficient in writing UNIX shell scripts for various business functionalities

Worked with DBAs to diagnose and resolve query performance problems

Experience in automating, scheduling and dependency management using Control-M and Workload Manager for distributed systems

Understanding of Cloud Computing and DevOps concepts including CI/CD pipelines using Git & Jenkins

Basic understanding of NoSQL and Big Data technologies such as Hadoop, Hive

Proficient in Agile, scrum and waterfall methodologies

Worked closely with Product Owners/Project Managers

Provided technical advice and training and mentored associates in a lead capacity

Highly organized and detail-oriented professional with strong problem analysis skills

TECHNICAL SKILLS

RDBMS : Oracle 8i/9i/10g/11g, DB2, MS SQL Server

MPP : Teradata 12/13/14/15, Amazon Redshift

NoSQL : DynamoDB

ETL/ELT Tools : Informatica PowerCenter 6.x, 7.x, 8.x, 9.x, 10.x, ODI 11g

OLAP Tools : Business Objects 5.1/6, OBIEE (Negril), Tableau

Big Data : Hadoop, MapReduce, HDFS, Hive, Sqoop

Scripting : UNIX, Python

Others : TOAD, SQL Developer, Jenkins, SQL Assistant, Control-M

CERTIFICATIONS AND TRAINING

Oracle Certified Associate (OCA)

OBIEE, ODI Corporate Training

PROFESSIONAL EXPERIENCE

RELIABLE SOFTWARE RESOURCES

QVC – West Chester, PA

Senior ETL Engineer

Oct’18- Till Date

QVC (standing for "Quality Value Convenience") is an American free-to-air television network and flagship shopping channel specializing in televised home shopping, owned by Qurate Retail Group. Founded in 1986 by Joseph Segel in West Chester, Pennsylvania, United States, QVC broadcasts to more than 350 million households in seven countries, including channels in the UK, Germany, Japan, and Italy, along with a joint venture in China with China National Radio called CNR Mall.

PPG Data Migration, Clarabridge Data Integration, CWMS Data Integration

The Product, Productivity, and Growth (PPG) initiative drives profitability and productivity through the introduction of new tools and data to better assess QxH vendor engagement. With improved visibility, buying teams can build more strategic relationships with vendors, evaluate their performance, and identify opportunities to better source products.

Clarabridge is a third-party text analytics tool that processes customer feedback and review information. The objective is to integrate this data into the enterprise data warehouse to centralize all reporting data needs.

CWMS (Common Warehouse Management System) holds QVC's warehouse systems data; this requirement is part of integrating data from Zulily, a QVC acquisition, into the enterprise data warehouse.

Roles & Responsibilities:

Worked with product owners/business stakeholders to understand the requirements and design data models and data integration strategies

Managed all development and support efforts for the Data Integration/Data Warehouse team

Wrote, tested and implemented Teradata FastLoad and TPT scripts to import data from different sources into the Teradata staging environment
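
A minimal sketch of one such staging load as a FastLoad wrapper is shown below; the logon, database, table, column and file names are illustrative assumptions, not the actual project objects.

#!/bin/ksh
# Illustrative FastLoad wrapper for a pipe-delimited staging load (names and paths assumed)
fastload <<'EOF'
LOGON tdprod/etl_user,etl_pwd;
SET RECORD VARTEXT "|";
DEFINE order_id  (VARCHAR(20)),
       order_dt  (VARCHAR(10)),
       order_amt (VARCHAR(18))
FILE = /data/landing/orders.dat;
BEGIN LOADING stg_db.stg_orders
   ERRORFILES stg_db.stg_orders_err1, stg_db.stg_orders_err2;
INSERT INTO stg_db.stg_orders (order_id, order_dt, order_amt)
VALUES (:order_id, :order_dt, :order_amt);
END LOADING;
LOGOFF;
EOF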

Used Pushdown Optimization (Full PDO) exclusively to import data from the Teradata staging area, apply the required Informatica transformations as per the business reporting requirements, and load to the EDW core tables

Used Informatica PowerCenter 10 to extract, transform and load data into the Teradata EDW from various sources such as Oracle, SQL Server, DB2, JSON files, GCP cloud storage and flat files

Worked extensively with shell scripts for downloading files from the cloud, moving them to the ETL file share, and creating parameter files

Created reusable UNIX shell scripts and pre-/post-session commands, saving development effort
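
As a hedged illustration, a reusable pre-session script that builds an Informatica parameter file might look like the sketch below; the folder layout, parameter names and file paths are assumptions, not the project's actual conventions.

#!/bin/ksh
# Illustrative pre-session script: builds a parameter file for the calling workflow
# (paths and parameter names are assumed for illustration)
WF_NAME=$1                              # workflow name passed by the pre-session command
RUN_DT=$(date +%Y-%m-%d)
PARAM_FILE=/infa/params/${WF_NAME}.parm

{
  echo "[Global]"
  echo "\$\$RUN_DATE=${RUN_DT}"
  echo "\$\$SRC_FILE=/data/inbound/${WF_NAME}_${RUN_DT}.csv"
} > ${PARAM_FILE}

echo "Parameter file created: ${PARAM_FILE}"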

Ensured the design approach, source-to-target mapping documents and code were peer reviewed and approved by ETL and Data Architects

Worked closely with the BI Tableau associate to map the presentation layer metrics to the EDW columns

Designed and developed reusable components that support the ETL design

Worked with Data Architects to create the data models as per the standards and requirements

Engaged in production support tasks and worked on incident tickets as required

Created change tickets in HP Service Manager for productionizing the code, coordinated with the CAB review team to get approvals from Program Managers, and worked with the Release Management team for deployment

Prioritized work from the backlog on the JIRA board, estimated the complexity of each story and assigned story points accordingly in sprint planning, assigned sub-tasks to each story as part of sprint grooming, and discussed the progress of the stories in daily Scrum stand-ups

Mentored associate team members on technical issues and roadblocks

Ability to learn and experiment with top tier technologies and patterns

Worked with the Stonebranch scheduling tool for job scheduling and monitoring

Environment : GCP, AWS, Azure, Informatica 10.1, Oracle 11g, Microsoft SQL Server, DB2, UNIX, Flat Files, Teradata 15, Stonebranch

CVS Health - Smithfield, RI

Senior Data/Data Warehouse Analyst

Jan’15- May ‘18

CVS Health is an American retailer and health care company. CVS Health operates over 7,700 CVS Pharmacy and Longs Drugs stores; a pharmacy benefit manager; mail order and specialty pharmacies; a retail-based health clinic subsidiary, MinuteClinic; and an online pharmacy, CVS.com. CVS Health is chartered in Delaware and is headquartered in Woonsocket, Rhode Island, where its four business units are also headquartered.

Omnicare Control-m Migration

The objective of this project is to bring all scheduling of third-party workloads running in native tools into Control-M, an enterprise scheduling tool. Omnicare is one of the CVS acquisitions, and this project is part of integrating Omnicare into the CVS infrastructure.

RxDW Future State Architecture and Data Retention Strategy

This project defines the future-state architecture of the CVS retail data warehouse, evolving and enhancing it with new technologies and tools. It includes metadata management, an archival strategy, and Disaster Recovery and High Availability.

Responsibilities :

Participated in requirements gathering, design of ETL lifecycle and creating design specifications, ETL design documents

Actively involved in the data modeling and design of the data warehouse with the Data Architect; designed and implemented star schema models and identified and built fact, dimension and aggregate tables to support the reporting needs

Designed, Developed and tested Informatica Mappings, Transformations, Mapplets, Sessions, Tasks, Workflows, worklets, SQL queries to implement complex business rules.

Designed reusable objects like mapplets & re-usable transformations in Informatica PowerCenter 9.6x

Extensively used Source Qualifier, Aggregator, Lookup, Router, Filter, Sequence Generator, Expression, Joiner and Update Strategy transformations

Used Informatica Address Doctor for standardizing the address fields of the various sources.

Designed and developed TPT and FastLoad scripts through Informatica, and developed BTEQ scripts to process data from file to stage and then from stage to the core tables.
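
The stage-to-core step can be sketched as a BTEQ script along the following lines; the logon, database, table and column names are placeholders assumed for illustration, not the actual EDW objects.

#!/bin/ksh
# Illustrative BTEQ stage-to-core load (logon and object names are assumptions)
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

INSERT INTO edw_db.claim_fact (claim_id, service_dt, claim_amt, load_ts)
SELECT stg.claim_id,
       CAST(stg.service_dt AS DATE FORMAT 'YYYY-MM-DD'),
       CAST(stg.claim_amt AS DECIMAL(18,2)),
       CURRENT_TIMESTAMP
FROM   stg_db.claim_stg stg
WHERE  NOT EXISTS (SELECT 1
                   FROM   edw_db.claim_fact tgt
                   WHERE  tgt.claim_id = stg.claim_id);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF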

Analyzed and modified indexes through the explain plan and rewrote queries with derived or temporary tables to improve performance; also utilized Teradata Viewpoint.

Created MULTISET, temporary, derived and volatile tables in the Teradata database.
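
For example, a volatile work table used within a single BTEQ session could be created roughly as below; all object names are assumptions for illustration.

#!/bin/ksh
# Illustrative volatile work table inside a BTEQ session (object names assumed)
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

CREATE VOLATILE TABLE vt_daily_sales AS
(
  SELECT store_id, sales_dt, SUM(sales_amt) AS tot_sales
  FROM   edw_db.sales_fact
  WHERE  sales_dt = CURRENT_DATE - 1
  GROUP  BY store_id, sales_dt
) WITH DATA
PRIMARY INDEX (store_id)
ON COMMIT PRESERVE ROWS;

SELECT * FROM vt_daily_sales ORDER BY tot_sales DESC;

.LOGOFF;
EOF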

Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS

Pushed data to the retail data lakes using MapReduce and Hive scripts
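
A simplified sketch of pushing a warehouse extract into a Hive staging table is shown below; the HDFS paths, database and table names are assumptions rather than the actual lake layout.

#!/bin/ksh
# Illustrative push of an extract file into a Hive staging table (paths and names assumed)
EXTRACT_FILE=/data/outbound/rx_claims_extract.dat

hdfs dfs -put -f ${EXTRACT_FILE} /landing/rx_claims/

hive -e "
LOAD DATA INPATH '/landing/rx_claims/rx_claims_extract.dat'
INTO TABLE retail_lake.rx_claims_stg;
"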

Involved in POCs, developing and testing new functionality and tools across CVS

Involved in the migration activities for the Informatica upgrade from version 9.0.1 to version 9.6

Performed detailed data analysis to identify the root cause of production issues and provide fixes in the permanent schedule.

Involved in building or redesigning the reject processing of source feeds and reprocessing rejects into the warehouse.

Involved in enhancements and new standards for existing processes as per the CVS ETL architecture.

Coordinated with all the business users, SMEs and process owners to complete the activities of the SDLC phases.

Environment : Informatica 9.6, Oracle 11g, Microsoft SQL Server 2008, UNIX, Flat Files, Teradata 15, Teradata Unity, Control-M, Data Migrator, DataStage, Hadoop

UST Global Limited

System Analyst

Jan’10- Dec ‘14

WellPoint - Richmond, VA

Senior ETL Developer

WellPoint, Inc. is the largest managed health care, for-profit company in the Blue Cross and Blue Shield Association. It was formed when WellPoint Health Networks, Inc. merged into Anthem, Inc., with the surviving Anthem adopting the name WellPoint, Inc. and beginning to trade its common stock under the WLP symbol on December 1, 2004. WellPoint is one of the nation's largest health benefits companies, with more than 36 million members in its affiliated health plans and nearly 67 million individuals served through its subsidiaries.

AGP PIS UM Data Integration Project

The scope of the project is to direct-source AGP UM clinical data into the enterprise data warehouse to support PIS (Payroll Information Systems), one of the downstream applications. This integration will give a unified view and provide a single source of truth for all data and reporting needs across WellPoint.

IM Decommissioning Project – ISGDW

The IM Decommissioning Project's objective is to sunset a few legacy data warehouses built on Microsoft SQL Server and Oracle and to make EDWARD (Enterprise Data Warehouse and Research Depot), the enterprise warehouse, the single source of information for all reporting needs. ISGDW is one of the legacy data warehouses, dealing with the Individual and Small Group customers of the CA, CO and NV regions. The key business users of this system are actuarial users.

SSB Encounters Lights on Project

SSB – State Sponsored Business is one of WellPoint's IM (Information Management) divisions, which submits claims details to the state governments for reimbursement. The SSB database is a consolidated data mart that stores all SSB data. It holds data from various WP systems such as D950, EPDS, NMS, EPDS2 and WGS2.0, and provides one structure and a single source of truth to meet the different SSB reporting and analysis needs of the Medical Management, Actuary, Finance and Compliance business areas and other departments within WP.

Responsibilities :

Played the tech lead role and owned accountability for all technical deliverables.

Participated in the due diligence phase of requirements gathering, scope determination and translating the business requirements into technical requirements.

Performed detailed Data Analysis to identify data sources, data gaps, redundancies & anomalies.

Involved in detailed analysis of the downstream/upstream dependencies of the legacy warehouse.

Involved in effort estimation for the requirements for the project.

Coordinated with all the business users, SMEs and process owners to complete the activities of the SDLC phases.

Worked effectively with Business Analysts and Project managers for assigned tasks and deliverable timelines.

Created spreadsheets as a user interface for the business users to confirm the data required for migration.

Documented the data analysis in an Excel pivot summary of data availability for the business SMEs.

Maintained data validations in Excel charts and tables.

Used various Excel features such as VLOOKUP, sorting and filtering during the analysis phase.

Designed simple and complex mappings using Informatica PowerCenter to load data from various heterogeneous sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Normalizer, SQL, Rank and Router.

Created sessions, event tasks, decision tasks and workflows using PowerCenter

Used Teradata SQL Assistant, Teradata Administrator and PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport and TPump in UNIX/Windows environments, and ran the batch processes for Teradata.

Created UNIX shell scripts to FTP flat files from different systems to the ETL server
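
A minimal sketch of such a file-transfer script is shown below; the host, credentials and directory paths are placeholders assumed for illustration.

#!/bin/ksh
# Illustrative FTP pull of source flat files to the ETL server (host, credentials and paths assumed)
HOST=srcapp01.example.com
FTP_USER=etl_ftp
FTP_PASS=changeit                 # in practice read from a secured configuration file

ftp -inv ${HOST} <<EOF
user ${FTP_USER} ${FTP_PASS}
lcd /infa/srcfiles/claims
cd /outbound/claims
binary
mget claims_*.dat
bye
EOF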

Responsible for QA migration and Production Deployment.

Scheduled the Informatica jobs through WLM

Used the debugger to debug some critical mappings by setting breakpoints, and troubleshot issues by checking session and workflow logs. Created and maintained various project-related documents such as high-level design documents.

Processed XML files using Informatica web services.

Compared the data from two data sources for finding the gaps.

Created reusable transformations and mapplets and used them in various mappings

Participated in integration testing and unit testing along with the development team; performed performance tuning of sources, targets, mappings and SQL queries in the transformations

Involved in enhancement and maintenance activities of the data warehouse, including performance tuning and rewriting of stored procedures for code enhancements.

Ensured 24x7 availability of the production environment.

Effectively monitored weekly and monthly production jobs and resolved the issues when they aborted.

Responsible for the Indiana and SC state encounter submission processes.

Monitored and fixed issues related to the submission-process tasks.

Automated the manual steps involved in production jobs.

Reviewed the deliverables of peer group

Environment : Informatica 9.1, Microsoft SQL Server 2008, UNIX, Flat Files, Teradata 13, Workload Manager (WLM), Control-M, Tivoli

Dell - India

Lead-Senior Software Developer

Jan ‘10- Feb’12

Dell Inc. is a multinational information technology corporation based in Round Rock, Texas, United States, that develops, sells and supports computers and related products and services. Bearing the name of its founder, Michael Dell, the company is one of the largest technology corporations in the world, employing more than 96,000 people worldwide and holding the third-largest market share for PCs.

APEX – IV Data Validation

APEX-IV Data Validation is phase II of the ISPR3 supplemental data conversion project. The objective is to fix the bugs and validate the supplemental data against BO reports and extracts. There are a few telecom and non-telecom datasets to be migrated to the warehouse as part of this project.

SA_OPE_Omniture Integration_DDW

Dell implemented Omniture SiteCatalyst to utilize the latest, state-of-the-art online data capture technology and provide global access to online data via the Omniture tools: SiteCatalyst, Data Warehouse, Genesis, Excel plug-in, ClickMap, and Discover.

The “SA_OPE_Omniture Integration_DDW” project was initiated to provide Dell with www.dell.com traffic data from Omniture into the Dell Data Warehouse, updated on a daily basis, with a preliminary load of 15 months of data initially, and maintaining 15 months of data going forward

ISP R3 Supplemental Data Conversion

ISPR3 evolved as part of Dell's initiative to centralize all reporting needs to improve agents' performance. There are 23 source applications, such as email, call, chat and escalation management. The objective of this project is to bring two years of data from the audit database (Oracle) and the daily incremental data from the actual sources (SQL Server and flat files) to the Teradata base layer.

Responsibilities :

Participated in sessions with business SMEs to understand the source applications from both business and technical aspects. The data included both internal application systems and data files procured from external vendors.

Worked closely with Data Modelers to understand and verify required tables, fields and data in different schemas and databases.

Created the list of activities and objects for development in Excel.

Captured the business transformation logic, applying Excel functions for validation.

Maintained the work breakdown tracker using Excel subtotals and array functions.

Captured the runtime of one load against the number of runs required for the history loads using Excel formulas.

Used the staged mapping concept to perform asynchronous web service requests and responses as part of Informatica mappings

Designed the ETL process to extract data from heterogeneous source systems and transform and load it into the Teradata warehouse

Extracted data from various source systems like Oracle and flat files as per the requirements and loaded it to Teradata using FASTLOAD, TPUMP and MLOAD

Extensively used key transformations such as Source Qualifier, Aggregator, Filter, Joiner, Update Strategy, unconnected and connected Lookup, Router, Sequence Generator, Normalizer and SQL transformations.

Used the Informatica mapping debugger to debug some critical mappings by setting breakpoints, and troubleshot issues by checking session and workflow logs.

Created reusable transformations, sessions and worklets in the workflows.

Developed web services workflows using Informatica

Created UNIX shell scripts and called them as pre-session and post-session commands

Validated the OLAP data warehouse data with BOBJ reports

Performed bulk data loads from multiple data sources (Oracle 9i, SQL Server and flat files) to the Teradata RDBMS using FastLoad, MultiLoad and TPump.

Created macros in Teradata SQL Assistant for populating the stage tables.
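
One such stage-population macro might look roughly like the sketch below (shown here wrapped in BTEQ purely for illustration); the database, table and parameter names are assumptions.

#!/bin/ksh
# Illustrative Teradata macro that populates a stage table (object names assumed)
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

REPLACE MACRO stg_db.load_contact_stg (p_load_dt DATE) AS
(
  INSERT INTO stg_db.contact_stg (case_id, status_cd, status_ts, load_dt)
  SELECT case_id, status_cd, status_ts, :p_load_dt
  FROM   src_db.contact_events
  WHERE  CAST(status_ts AS DATE) = :p_load_dt;
);

EXEC stg_db.load_contact_stg (DATE '2011-06-30');

.LOGOFF;
EOF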

Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.

Wrote complex macros in Teradata to capture key metrics from Dell website traffic data.

Assigned tasks to and monitored the work progress of a 6-member team

Prepared and Executed the Test Cases during Unit testing and provided extensive support to QA in System, Regression and UAT (user acceptance testing) and deployment of mappings.

Worked with the EST lead in activities related to code migration from development to test environment.

Involved in Dell's deployment process for code migration, following Change Management Control processes and the CAB & CRRB checklist for deployment

Developed UNIX shell scripts to send out an e-mail on success of the process, indicating the destination folder where the files are available.
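
A minimal sketch of that notification step is shown below; the workflow name, recipients and destination folder are assumptions for illustration.

#!/bin/ksh
# Illustrative success notification (recipients, paths and workflow name assumed)
WF_NAME=$1
DEST_DIR=/data/outbound/omniture
RECIPIENTS="dw-support@example.com"

FILE_COUNT=$(ls -1 ${DEST_DIR}/*.dat 2>/dev/null | wc -l)

mailx -s "${WF_NAME} completed successfully" ${RECIPIENTS} <<EOF
The ${WF_NAME} process finished successfully.
${FILE_COUNT} extract file(s) are available in ${DEST_DIR}.
EOF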

Prepared training materials and presented to both the operational and Business teams.

Reviewed documents like STM (Source to Target mappings) with Data Analyst, Test Plans with EST, and Production Support documents with the Maintenance Team.

Configured sessions in a job and scheduled them through the Control-M job scheduler

Worked in a time-boxed environment; the team completed all deliverables, including code and documentation, on or before the timeline in all three phases of this project.

Environment : Informatica 8.6.1/9.5, UNIX, Oracle 11g, Microsoft SQL Server 2008, Flat Files, Teradata 13/12, Control-M

Mind Tree Limited

Associate Consultant

Oct’06- Dec’09

Oracle Corporation, India

Senior Applications Developer

Oracle Fusion BI

Oracle Corporation is a multinational computer technology corporation that specializes in developing and marketing enterprise software products — particularly database management systems. Headquartered in Redwood City, California, United States, Oracle employs more than 115,000 people worldwide as of 2009. It has enlarged its share of the software market through organic growth and through a number of high-profile acquisitions.

Oracle Business Intelligence Applications support over a dozen different functional areas with complete, pre-built, best-practice analytics, and a rich set of KPIs, metrics and analytic workflows. By accessing data from across the enterprise and delivering deep insight directly to business users, Oracle Business Intelligence Applications enable faster and more informed decisions that help the organization optimize resources, reduce costs, and improve the effectiveness of front- and back-office activities ranging from sales to human resources (HR) to procurement. Oracle BI Fusion is Oracle's future release of business intelligence applications, which will use Oracle's high-end tech stack of tools such as the Setup and Configuration Manager, ODI 11g for ETL and OBIEE 11g (Negril) for reporting.

Responsibilities :

Owned the responsibility of creating the ETL for the OBIA Supply Chain Management (SCM) procurement data model.

Analyzed Oracle E-Business Suite (eBiz) applications for the purchasing module and converted functional requirements to technical requirements.

Captured data analysis and validations in Excel.

Documented detailed pivot summary tables of purchasing attributes.

Analyzed new functional/reporting requirements and worked closely with the various application teams and business teams to understand and design the ETL procedures for the new release.

Participated in meetings and functional/technical trainings required for the OBIA Fusion release; acquired knowledge of the supply chain management business domain and Oracle's EBS applications.

Worked on bug fixing of the earlier 7.9.6.x releases using Oracle Data Integrator (ODI).

Analyzed logical and physical design requirements for the purchasing facts and dimensions.

Implemented physical and logical (BMM Layer) rpd design for purchasing facts, dimensions and new reporting metrics using Oracle Business Intelligence Enterprise Edition (OBIEE).

Created SDE (Source Dependent Extract) mappings using Informatica PowerCenter transformations to populate stage tables.

Developed SILO (Source Independent Load) mappings using the MAPGEN utility to load the purchasing fact and dimension tables as per requirements for the Negril version.

Ensured the reporting layer design is tested whenever there is a new build for OBIEE.

Prepared test cases based on functional scenarios and generated sample test case reports.

Documented extensively all the design considerations as per Oracle corporate standards.

Used Oracle DAC for configuring the schedule of the execution plan.

Reviewed code of team members.

Verified functional specifications and reviewed deliverables.

Environment : Informatica 8.6.1, MAPGEN, Oracle 11g, Oracle Data Integrator (ODI) 11g, Oracle Business Intelligence Enterprise Edition (OBIEE) Negril version, Oracle Business Intelligence Applications (OBIA), Data Warehouse Administration Console (DAC), Windows XP.

Procter & Gamble - Impact PGP

Senior Software Developer

The objective of this project is to analyze the performance of P&G products against competitors' data. The competitors' data are sourced from different third-party vendors in the form of text files. These files are consolidated country-wise, transformed as per business requirements, and loaded to the warehouse.

Responsibilities :

Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design

Contributed in the design activities and assisted in analyzing the tool feasibility (Informatica) for business requirements.

Created dynamic Excel reports with links where users key in the respective geography and the reports are automatically refreshed based on the current data.

Used Excel's built-in functionality to connect to the database from Excel.

Worked on Informatica PowerCenter and created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Extensively used Source Qualifier, Aggregator, Lookup, Filter and Sequence Generator transformations.

Created sessions and batches to move data at specific intervals & on demand using Workflow Manager.

Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.

Developed calculated measures and assigned aggregation levels based on dimension hierarchies.

Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table

Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.

Followed Informatica recommendations, methodologies and best practices.

Involved in Unit and System Testing of ETL Code (Mappings and Workflows).

Worked closely in setting up the environment for various file transfer activities between the systems using SFTP as the file transfer protocol.

Developed PL/SQL procedures for the business requirements.

Documented the entire ETL process and testing.

Environment : Informatica 8.6.1, Oracle 10g, UNIX.

Brocade Communication Systems

RUGBY Data Warehouse

Senior Software Developer

Brocade is an industry leader in delivering innovative, high-performance, and reliable networking solutions. Brocade is one of only two companies worldwide that offer complete end-to-end networking solutions—with Brocade solutions now used in over 90 percent of Global 1000 data centers.

RUGBY Data Mart is an initiative to source the data pertaining to Brocade Customer Support into a data mart and generate reports out of it, which will give BROCADE managers insight into the performance of the TSEs and the support group as a whole. The current REMEDY system, out of which a lot of reporting is done, will be decommissioned in the near future, and the new source for the Customer Support data will be the Oracle Teleservices system.

Responsibilities:

Understanding the source system and project architecture.

Analyze the business requirements, existing systems & participate in internal team meetings to determine actions for the respective assignment.

Designed, Developed and tested Informatica Mappings, Transformations, Mapplets, Sessions, Tasks, Workflows, worklets, SQL queries to implement complex business rules.

Designed reusable objects like mapplets & re-usable transformations in Informatica PowerCenter 8.5.1/7.1.

Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Router, Filter, Sequence Generator, Expression, Joiner and Update Strategy.

Created Parameter files and validation scripts

Created complex SQL queries for data retrieval.

Used SQL analytic functions such as LAG and LEAD for calculating the time between the different statuses in the source applications.
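
As a rough illustration of that calculation, the query below uses LEAD to pair each status with the next status timestamp for the same case; the connection string, schema, table and column names are assumptions standing in for the actual support-case model.

#!/bin/ksh
# Illustrative analytic query run through SQL*Plus (connection and object names assumed)
sqlplus -s etl_user/etl_pwd@rugbydb <<'EOF'
SELECT case_id,
       status,
       status_ts,
       LEAD(status_ts) OVER (PARTITION BY case_id ORDER BY status_ts) - status_ts
         AS days_in_status
FROM   support_case_status
ORDER BY case_id, status_ts;
EXIT;
EOF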

Involved in the system testing and documentation of the ETL development design and testing.

Environment : Informatica 8.1.1, Oracle 10g, SQL, PL/SQL, TOAD, Windows XP

Burger King Corporation FL

Global Data Warehouse

Team Lead

The Global Data Warehouse is an extension to the existing data warehouse. This project originated as a request from the Performance Analysis team to provide daily/weekly/monthly sales information by topline/daypart/service mode/menu item for the EMEA region, as currently available for the US. The existing Data Warehouse of BKC does not have sales data at the lowest level of granularity (ticket-level data). This limits the business users' ability to do sales and profitability analysis. The objective is to design and build a system to collect summary and ticket-level data from BK restaurants on a standard POS system to fulfill current and anticipated reporting needs.

Responsibilities :

Understanding the source system and architecture.

Design and development of ETL specifications and code.

Development of PL/SQL procedure for KRAs as per the business requirements.

Coordinating with the offshore team during the project tenure

Creation of test plan, test strategy and test approach.

Created Test scripts, Traceability Matrix and mapped the requirements to the test cases

Used HP Quality Center for testing

Involved in the system integration testing.

Conducted peer design and code reviews and extensive documentation of standards, best practices, and ETL procedures

Used the Oracle explain plan feature to identify the query execution cost and tuned the queries based on the Oracle optimizer plan for good query performance
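
For illustration, that cost check can be sketched with EXPLAIN PLAN and DBMS_XPLAN as below; the connection details and the sample query are assumptions, not the actual GDW objects.

#!/bin/ksh
# Illustrative explain-plan check through SQL*Plus (connection and table names assumed)
sqlplus -s etl_user/etl_pwd@gdwprod <<'EOF'
EXPLAIN PLAN FOR
SELECT r.restaurant_id, SUM(t.net_sales) AS total_sales
FROM   ticket_fact t
JOIN   restaurant_dim r ON r.restaurant_key = t.restaurant_key
GROUP BY r.restaurant_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT;
EOF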

Involved in identifying performance bottlenecks in sources, targets, mappings and sessions, and successfully tuned them, as well as Oracle views, to increase session performance.

Interacted with various other teams for setting up servers.


