
Amit Sathvara

Data Architect

Employer: ******@********.*** // 609-***-****

SUMMARY:

13+ years of experience designing, developing, and maintaining large business applications such as data migration, integration, conversion, data warehousing, and testing, including 2+ years of experience with Hadoop, the Big Data ecosystem, and AWS-related technologies.

Thorough domain knowledge of business financial systems, banking, animal healthcare information technology, insurance & reinsurance, pharmacy claims systems, and the telecom industry.

Professional understanding of the Software Development Life Cycle (SDLC).

Expertise in data warehousing, ETL architecture, and data profiling using Informatica PowerCenter 9.6/9.5/9.1/8.6/8.5/8.1/7.1 – client and server tools.

Used Informatica Metadata Manager and Metadata Exchange extensively to maintain and document metadata.

Thorough knowledge of relational & dimensional models (Star & Snowflake), fact and dimension tables, and Slowly Changing Dimensions (SCD).

Hands-on experience implementing Slowly Changing Dimension types (I, II & III); a minimal SQL sketch of the Type II pattern appears at the end of this summary.

Experience integrating various data sources such as SQL Server, Oracle, Teradata, Vertica, Netezza, flat files, NoSQL stores, and DB2 mainframes into the staging area.

Business requirements review and assessment, gap identification, business process definition, and delivery of project roadmaps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.

Expert in installing and configuring Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.

Experience in complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, DTS, and SSIS.

Expert in troubleshooting/debugging and improving performance at the database, workflow, and mapping levels. Experience writing UNIX shell and Perl scripts.

Hands-on experience in application development using RDBMS and Linux shell scripting.

Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, Resource Manager, Node Manager, and the MapReduce programming paradigm.

Experience importing and exporting data between HDFS and relational database systems using Sqoop. Proven experience leading teams.

Enthusiastic and goal-oriented team player with excellent communication, interpersonal, and leadership skills and a high level of adaptability.
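Referenced in the Slowly Changing Dimension bullet above: a minimal sketch of the Type II pattern in generic SQL. The table and column names (customer_src, customer_dim, address, status) are hypothetical placeholders, not drawn from any specific engagement.

-- Step 1: expire the current dimension row when source attributes changed.
UPDATE customer_dim
SET    current_flag = 'N',
       effective_end_date = CURRENT_DATE
WHERE  current_flag = 'Y'
  AND EXISTS (
        SELECT 1
        FROM   customer_src s
        WHERE  s.customer_id = customer_dim.customer_id
          AND (s.address <> customer_dim.address
               OR s.status <> customer_dim.status));

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO customer_dim
       (customer_id, address, status,
        effective_start_date, effective_end_date, current_flag)
SELECT s.customer_id, s.address, s.status,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   customer_src s
LEFT JOIN customer_dim d
       ON  d.customer_id = s.customer_id
       AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;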

EDUCATION:

B.S. in Computer Science

MS in Technology Management

TECHNICAL SKILLS:

Data warehousing

Informatica PowerCenter 9.6/9.5/9.1/8.6/8.5/8.1/7.1, Informatica PowerConnect for Siebel/SAP/PeopleSoft, Metadata Reporter, Data Profiling, Data Cleansing, OLAP, OLTP, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage 7.x, Erwin 4.0, ER Studio, Dial, SSIS, AWS EMR, AWS Glue

BI Tools

Business Objects XIR2/6.0/5.1/5.0, Cognos, QlikView, Yotta, Tableau

Databases

SQL Server, Oracle 11g/10g/9i/8i/8/7.3, Sybase, Teradata 6, MySQL, MS Access, DB2 8.0/7.0, SeaQuest, Vertica, Netezza, Hive (NoSQL)

Languages

XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Operating System

HP-UX 11/10/9, IBM AIX 4.0/3.1, Sun Solaris 9/8/7/2.6/2.5, SCO-UNIX, Linux, Windows XP Professional/2000/NT/98/95

Other Tools

Autosys, Control-M, Remedy, Mercury Quality Center, StarTeam, Lotus Notes, Tidal, CloudWatch, Lambda

DB Programming & Tools

RDBMS, Joins, Indexes, Views, Functions, Triggers, Clusters, Procedures, SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, Explain plan, SQL Trace, DB Visualizer, Athena

PROFESSIONAL EXPERIENCE:

Cox Communications, GA Sep’18-Present

Data Architect

Description: Cox provides high-speed Internet, streaming TV - both live and on-demand - home telephone, and smart home security solutions for its residential customers. The primary objective of the project was to perform data analysis and create reports for newly launched products.

Responsibilities:

Worked on Amazon Web Services including EC2, EMR, AWS Glue, S3, IAM, Athena, Lambda, CloudSearch, CloudWatch, Auto Scaling, Elastic Load Balancing, AMIs, and Amazon RDS database services (a minimal Athena sketch appears at the end of this list).

Performed log analysis and maintained documentation of production server error-log reports.

Automated multiple ETL jobs using AWS Step Functions and AWS Lambda.

Tested, cleaned, and standardized data to meet business standards using Execute SQL Task, Conditional Split, Data Conversion, and Derived Column components in different environments.

Established a process for preparing a daily report covering Cox product customer acquisition, customer transactions such as support calls and truck rolls, and machine-learning insights on system performance.

Automated the above report using Tableau dashboards.

Assisted business owners of the Video product, Panoramic Wi-Fi, and Device Management in identifying operational issues in system performance and discrepancies in order entry and work orders.

Performed ad-hoc data analysis on processes that impacted customer acquisition, activation and provisioning of accounts and devices, discrepancies with account setup, and customer support issues.

Performed analysis of variances in actuals vs. forecast for customers receiving the Video product, Panoramic Wi-Fi, and Contour Streaming Player (set-top box).

Generated Tableau dashboards with quick/context/global filters, parameters, and calculated fields on Tableau reports using flat files, relational databases, Hive, and Splunk data.

Created ad-hoc reports using Microsoft Office (Excel, PowerPoint).

Analyzed real-time data for specific programs and provided ad-hoc audience-measurement reports.
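Referenced in the AWS bullet above: a minimal sketch of an Athena external table over S3 and an ad-hoc query of the kind used for daily reporting. The bucket path, table, and column names are hypothetical placeholders.

-- External table over a (hypothetical) S3 landing area of support-call records.
CREATE EXTERNAL TABLE IF NOT EXISTS support_calls (
  account_id  string,
  call_ts     timestamp,
  product     string,
  disposition string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3://example-bucket/support-calls/';

-- Daily support-call counts per product, e.g. to feed a Tableau extract.
SELECT product,
       date_trunc('day', call_ts) AS call_day,
       count(*)                   AS calls
FROM   support_calls
GROUP BY product, date_trunc('day', call_ts)
ORDER BY call_day, product;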

Environment: AWS EMR, CloudWatch, AWS Glue, Athena, RDS, IAM, EC2, Lambda, Oracle 11g, Splunk, Hadoop, Hive, Tableau, Microsoft Office

Beacon Health Options, GA Jan’18-Aug’18

ETL Lead

Description: Beacon provides administrative services in support of Georgia’s Department of Behavioral Health and Developmental Disabilities (DBHDD). The primary objective of the project was to migrate future state and legacy data to the Beacon System.

Responsibilities:

Managed end-to-end complex data migration, conversion, and data modeling.

Participated in quality management reviews as outlined in the Verification and Validation Overview.

Ensured security in reporting and data warehousing.

Delivered innovative BI & data warehouse solutions.

Worked on data profiling, data analysis, and mismatch analysis.

Supported deployment of the solution, cut-over planning, and execution activities.

Applied expert-level SQL and PL/SQL skills, including PL/SQL tuning.

Defined the migration strategy and plan.

Environment: SQL Server 2008 R2, MySQL, IDSS (Infosys in-house ETL tool)

Merial, GA Aug’16-Dec’17

Sr. Informatica Developer

Description: Merial is the Animal Health subsidiary of Boehringer Ingelheim. The primary objective of the project was to construct an aggregated data warehouse (DWH) for analytics and reports used by business, risk management, and asset management groups.

Responsibilities:

Worked with business and data analysts to gather requirements and translate business requirements into technical specifications.

Created a number of complex mappings, mapplets, reusable transformations, workflows, worklets, and sessions using Informatica PowerCenter 9.0.1 to implement the business logic and load the data incrementally.

Extracted data from flat files, CSV files, and Oracle databases into the staging area and populated the data warehouse.

Used Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process. Used the Debugger to test the mappings and fix bugs.

Worked on performance tuning at the source, target, mapping, session, and system levels.

Migrated ETL objects from DEV to UAT to PRD environments using Repository Manager.

Worked on UNIX scripts for running the workflows and performing threshold checks on incoming files.

Prepared ETL design documents and implementation plan documents.

Involved in unit testing: created various test cases and reviewed and fixed issues in the ETL code.

Supported UAT and resolved issues; also worked on change requests (CRs) raised by the business.

Prepared support turnover documents for production support of several interfaces.

Environment: Informatica PowerCenter 9.5, Oracle 11g, SQL Server, Sun Solaris, UNIX shell scripts, Tidal, and Remedy.

AT&T, GA Jan’16-Jun’16

Sr. ETL Developer/ Data Modeler

Description: AT&T is an American multinational telecommunications corporation providing both mobile and fixed telephony in the United States, as well as broadband subscription television services. The Southeast Construction and Engineering (SECE) PMO Reports project aims to keep projects on schedule, maintain effective communication across project teams, manage quality assurance, and deliver on budget in the Southeast region. It improves client satisfaction through timely and actionable field reports; transmits field reports from the project site without the need to re-type notes; reduces inspector overtime expenses; improves the quality of information captured in field reports and punch lists; reduces the time needed to compile daily, weekly, and monthly reports; and manages change-order requests proactively.

Responsibilities:

Created prototype reporting models, specifications, diagrams and charts to provide direction to system programmers.

Involved in Architecture and design of Data extraction, transformation, cleaning and loading.

Involved in Requirement gathering and source data analysis for the Data warehousing projects.

Involved in the Logical and Physical design and creation of the ODS and data marts.

Worked with the Analysis & Marketing team to support business decisions.

Prepared Technical documents for all the modules developed by our team.

Interacted extensively with end users on requirements gathering, analysis, and documentation.

Documented methodology, data reports and model results and communicated with the Project Team / Manager to share the knowledge

Imported data from relational databases into SAS files per detailed specifications.

Carried out data extraction and manipulation using PROC SQL, PROC SORT, and PROC REPORT to create preferred-customer lists per business requirements.

Developed T-SQL queries, triggers, functions, cursors, and stored procedures.

Created new database objects such as tables, procedures, functions, indexes, and views using T-SQL in development and production environments for SQL Server 2008 R2.

Tuned slow-running queries using Profiler and STATISTICS IO, evaluating joins and indexes, updating statistics, and modifying code.

Created views to restrict access to data in a table for security (a minimal T-SQL sketch appears at the end of this list).

Created datasets and stored procedures in T-SQL for SAS reports.

Experience in performance tuning and optimization of queries and stored procedures.

Responsible for monitoring and making recommendations for performance improvement in hosted databases. This involved index creation, index removal, index modification, file group modifications, and adding scheduled jobs to re-index and update statistics in databases.
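Referenced in the security-view bullet above: a minimal T-SQL sketch of restricting access through a view, plus a STATISTICS IO check of the kind used when tuning slow queries. All object names (dbo.Customer, vw_CustomerPublic, ReportingRole) are hypothetical placeholders.

-- View exposing only non-sensitive columns of a hypothetical base table.
CREATE VIEW dbo.vw_CustomerPublic
AS
SELECT CustomerID,
       Region,
       PlanType          -- sensitive columns intentionally excluded
FROM   dbo.Customer
WHERE  IsActive = 1;
GO

-- Reporting users query the view, never the base table.
GRANT SELECT ON dbo.vw_CustomerPublic TO ReportingRole;
GO

-- Measuring I/O while tuning a slow query, as described above.
SET STATISTICS IO ON;
SELECT Region, COUNT(*) AS ActiveCustomers
FROM   dbo.vw_CustomerPublic
GROUP BY Region;
SET STATISTICS IO OFF;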

Environment: SQL Server 2008 R2, T-SQL, SSIS, SSRS, Windows Server 2008 R2, SAS BI Dashboard, Visio

Hewlett-Packard, GA Mar’13-Nov’15

Sr. ETL Developer / Lead

Description: The client is implementing enhancements to the existing Enterprise Data Warehouse.

New cubes that are added will enable the client to identify loyal customers and serve them better. It will also help them to analyze and implement strategies to expand their customer base and provide better deals to the existing ones. The data will also be funneled to various calculation engines and data-marts needed for internal reporting.

Responsibilities:

Experience in Data Warehouse/Data Mart design, System analysis, Database design, ETL design and development, SQL, PL/SQL programming.

Created mapping between the databases and identified the possibilities of incorporating the new business rules with the existing design.

Created prototype reporting models, specifications, diagrams and charts to provide direction to system programmers.

Prepared High-Level Design (HLD) and Low-Level Design (LLD) documents, the Project Functional Specification, and the Business Requirements Documentation (BRD).

Participated in Design team brainstorm and user requirement gathering meetings.

Played the lead role in Designing and implementing Pricing Analytics, Best Customer and Finance Master Data Alignment Services Data marts.

Experience in managing the delivery of data extracts from the data sources to the warehouse and downstream applications.

Experience in requirement gathering and database design and implementation of star-schema, snowflake schema/dimensional data warehouse using Erwin.

Played the role of a Data warehouse architect in the Business Intelligence Enterprise Architect group for providing data architectural strategies.

Experience in Architecture and design of Data extraction, transformation, cleaning and loading.

Involved in Requirement gathering and source data analysis for the Data warehousing projects.

Involved in the Logical and Physical design and creation of the ODS and data marts.

Converted the Business rules into Technical Specifications for ETL process.

Involved in all phases of SDLC from requirement, design, development, testing, pilot, training and rollout to the field users and support for production environment.

Implemented mapping techniques for Type 1, Type 2 and Type 3 slowly changing dimensions.

Developed ETLs for Data Extraction, Data Mapping and Data Conversion using Informatica Power Center.

Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies, and batch design.

Worked on Informatica PowerCenter 9.5/9.6 tools – Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.

Wrote scripts for collection of statistics, reorganization of tables and indexes, and creation of indexes for enhancing performance for data access.

The design document was finalized through design walkthroughs and independent review by the user. Prepared technical specifications for all the utilities, per the company's standards.

Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica mappings.

Designed and developed a Hadoop system to analyze SIEM (Security Information and Event Management) data using MapReduce, VSQL, and Sqoop (a minimal VSQL sketch appears at the end of this list).

Migrated data from SQL Server and SeaQuest (HP internal database) to HBase using Sqoop.

Configured various big data workflows to run on top of Hadoop, comprising heterogeneous jobs such as VSQL, Sqoop, and MapReduce.

Ability to adapt to evolving technology, strong sense of responsibility and accomplishment.

Prepared Technical documents for all the modules developed by our team.
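Referenced in the SIEM bullet above: a minimal sketch in Vertica SQL (vsql) of loading a delimited extract and aggregating high-severity events. The file path, table, and column names are hypothetical placeholders.

-- Staging table for SIEM events landed by the Hadoop workflow.
CREATE TABLE IF NOT EXISTS siem_events (
    event_ts   TIMESTAMP,
    source_ip  VARCHAR(45),
    event_type VARCHAR(64),
    severity   INTEGER
);

-- Bulk-load a delimited extract produced upstream (hypothetical path).
COPY siem_events
FROM '/data/siem/events.csv'
DELIMITER ',' DIRECT;

-- Daily counts of high-severity events per type, for reporting.
SELECT event_type,
       DATE_TRUNC('day', event_ts) AS event_day,
       COUNT(*)                    AS high_sev_events
FROM   siem_events
WHERE  severity >= 4
GROUP BY event_type, DATE_TRUNC('day', event_ts)
ORDER BY event_day, event_type;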

Environment: Erwin, Informatica Power Center, Metadata Manager, SQL server 2008, SeaQuest Database (HP Internal DB), Dial (HP Internal ETL Tool), Yotta (HP Internal Reporting tool), Hadoop, Vertica, VSQL, Sqoop, MapReduce

Time Warner Inc., Manhattan, NY Oct’12-Mar’13

Sr. Informatica Developer

Description: Time Warner Inc., a global leader in media and entertainment with businesses in television networks, film and TV entertainment and publishing, uses its industry-leading operating scale and brands to create, package and deliver high-quality content worldwide through multiple distribution outlets. The primary objective of the project was to read the data from Netezza and generate and send the XML file to Kenexa 2xBrassRing by invoking web services.

Responsibilities:

Worked with business and data analysts to gather requirements and translate business requirements into technical specifications.

Understood and discussed the ETL requirements with the business unit and prepared the detailed design documentation according to the standards.

Integrated all jobs using complex mappings, including mapplets and workflows, built with Informatica PowerCenter Designer and Workflow Manager v9.1.

Used transformations such as Expression, Lookup, Source Qualifier, Transaction Control, XML Generator, Web Service, XML Parser, and Joiner.

Extracted high-volume datasets from XML sources and Netezza relational tables and produced XML targets.

Used Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process.

Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets to provide reusability across mappings.

Performed performance tuning to increase throughput at both the mapping and session levels, along with SQL query optimization.

Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.

Performed data validation for various source-to-target mappings.

Migrated ETL objects from DEV to UAT to PRD environments using Repository Manager.

Provided support and quality validation through test cases for all stages of unit and integration testing.

Prepared ETL design documents and implementation plan documents.

Involved in unit testing: created various test cases and reviewed and fixed issues in the ETL code.

Environment: Informatica PowerCenter 9.1, SQL Server, Oracle 9i, Netezza, Remedy.

Credit Suisse, Raleigh, NC Feb’11-Aug’12

Informatica Developer

Description: Credit Suisse's Asset Management division is one of the largest publicly traded global asset management firms in the world. The primary objective of the project was to construct an aggregated data warehouse (DWH) for analytics and reports used by business, risk management, and asset management groups.

Responsibilities:

Worked with business and data analysts to gather requirements and translate business requirements into technical specifications.

Created a number of complex mappings, mapplets, reusable transformations, workflows, worklets, and sessions using Informatica PowerCenter 9/8.6.1 to implement the business logic and load the data incrementally.

Extracted data from flat files, CSV files, and Oracle databases into the staging area and populated the data warehouse.

Used Workflow Manager for session management, database connection management, and scheduling of jobs to be run in the batch process. Used the Debugger to test the mappings and fix bugs.

Migrated ETL objects from DEV to UAT to PRD environments using Repository Manager.

Worked on UNIX scripts for running the workflows and performing threshold checks on incoming files.

Prepared ETL design documents and implementation plan documents.

Involved in unit testing: created various test cases and reviewed and fixed issues in the ETL code.

Supported UAT and resolved issues; also worked on change requests (CRs) raised by the business.

Prepared support turnover documents for production support of several interfaces.

Environment: Informatica PowerCenter 8.6/7.1, Oracle 9i, SQL Server, Sun Solaris, UNIX shell scripts, Erwin, Autosys, and Remedy.

MetLife INC, Jersey City, NJ Aug’10-Jan‘11

Sr. Informatica Developer

Description: MetLife, Inc. is a leading provider of insurance and other financial services throughout the world. The primary objective of this project was to develop the Tampa data mart for investments such as securities and bonds using an existing DWH called IDEAS. The data mart was developed so that adjustments could be applied directly by users through a GUI.

Responsibilities:

Involved in data analysis and development to understand attribute definitions for migration.

Prepared mapping specification documents and unit testing documents for the developed Informatica mappings.

Defined a Target Load Order Plan for loading data into different target tables.

Worked on Informatica PowerCenter 8.6/8.5 tools – Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, Worklets, and Reusable Transformations.

Extensively used Router, Lookup, Aggregator, Expression, and Update Strategy transformations.

Involved in the design and development of mappings from the legacy system to the target database.

Involved in performance tuning of the mappings, making the necessary modifications to reduce load time.

Used SQL tools such as TOAD to run SQL queries and validate the data.

Defined test cases and prepared a test plan for testing ODS jobs.

Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations, and fixing the bugs so that they conform to the business needs.

Worked closely with the business analysts' team to resolve problem tickets and service requests. Helped the 24/7 production support team.

Environment: Informatica PowerCenter 8.6/8.5, Informatica PowerExchange, Cognos 8.3, SQL Server 2005, Windows Vista

Crown Bank, Brick, NJ Feb’09-Jul’10

ETL Developer

Description: Crown Bank's data warehouse was built to load customer, loan-plan, and transaction data from different sources through the Informatica server and to generate quarterly reports that help business analysts.

Responsibilities:

Worked across the full project life cycle, from analysis to production implementation, with emphasis on identifying sources and validating source data, developing the required logic and transformations, creating mappings, and loading the data into different targets.

Extracted data from the ODS and loaded it into the EDW using Informatica.

Performed repository object migrations from development to testing and from testing to production environments.

Worked with pre- and post-sessions, and extracted data from the transaction system into the staging area.

Performed Unit Tests and tested Mapping logic by passing sample messages.

Responsible for Self and Peer Review of Informatica mappings under Development Phase.

Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.

Scheduled the tasks to be run using the Workflow Manager.

Implemented various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.

Created and monitored sessions and various other tasks, such as Event-Raise, Event-Wait, Decision, and Command tasks, using Informatica Workflow Manager.

Worked on the UNIX scripts for running the workflows and threshold check for the incoming files.

Environment: Informatica PowerCenter 7.1, Oracle 9i, PL/SQL, SQL Server 2000, flat files, SQL*Loader, UNIX shell scripts, MicroStrategy 8, Erwin.

Hartford Insurance, Hartford, CT Sep’08-Jan’09

ETL Developer

Description: The client is one of the leading insurance companies in the USA. The company had an enormous amount of inconsistent and redundant data generated among the various functional groups of the organization. The application was developed to integrate these isolated islands of data to provide users with better information and faster access, so that they can identify opportunities more quickly and respond more appropriately.

Responsibilities:

Extensively worked on Business Analysis and Data Analysis.

Interacted with the users to convert business logic into ETL specifications.

Extensively analyzed and applied the Ralph Kimball approach to building the data warehouse.

Worked with the PowerCenter Designer tool to develop mappings and mapplets to extract and load data from flat files and Oracle sources into Oracle targets.

Created transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Sequence Generator for loading data into targets.

Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.

Environment: Informatica PowerCenter 7.1, Oracle 8i, MS SQL Server 7.0/2000, MS Excel 97, flat files, PL/SQL, SQL, Windows 2000, UNIX

ICICI Bank, India Oct’03-Mar’08

Programmer Analyst

Description: ICICI Bank is India's largest private-sector bank by market capitalization and second largest overall in terms of assets. It offers a wide range of banking products and financial services to corporate and retail customers through a variety of delivery channels and through its specialized subsidiaries and affiliates in the areas of investment banking, life and non-life insurance, venture capital, and asset management. This project involved using Oracle SQL and PL/SQL to extract data from various sources into a data store for reporting.

Responsibilities:

Participated in discussions with clients to better understand business requirements

Created forms for new policy entry details.

Responsible for maintaining policies per customer requirements.

Wrote PL/SQL stored procedures for data extraction (a minimal sketch appears after this list).

Created reports that allow users to retrieve complete financial-status information as required, such as monthly reports and day-to-day transactions.
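Referenced in the stored-procedure bullet above: a minimal PL/SQL sketch of a data-extraction procedure. The procedure, table, and column names are hypothetical placeholders.

-- Copies one day's transactions into a reporting staging table.
CREATE OR REPLACE PROCEDURE extract_daily_txns (p_run_date IN DATE)
AS
BEGIN
    INSERT INTO rpt_txn_stage (txn_id, account_no, txn_amount, txn_date)
    SELECT t.txn_id, t.account_no, t.txn_amount, t.txn_date
    FROM   transactions t
    WHERE  t.txn_date >= TRUNC(p_run_date)
    AND    t.txn_date <  TRUNC(p_run_date) + 1;

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END extract_daily_txns;
/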


