SANDEEP PATTAR
adbnhc@r.postjobfree.com
Phone 407-***-****
Professional Summary:
●Over 14 years of diversified experience in Data Warehousing, Data Modeling, and ETL using Informatica, Pentaho, and Hadoop; databases including Oracle, Sybase, DB2, and HBase; and servers running Unix, Linux, and Windows across various industries
●Domain experience in Banking (Investment Banking), Health Care, Taxation and Government Benefit systems.
●Held various roles, including Scrum Master, Tech Lead, and Developer.
●Highly proficient in extracting, transforming, and loading data using Informatica PowerCenter (Admin Console, Informatica Services, Repository Manager, Designer, Workflow Manager, and Workflow Monitor)
●Strong experience in Oracle SQL, including stored procedures, triggers, and query tuning. Extensive experience in data analysis using SQL and a good understanding of different source systems. Strong analytical and communication skills
●Good knowledge of data modeling using data modeling tools, star/snowflake schemas, fact and dimension tables, and physical and logical data modeling
●Good knowledge and understanding of private and public cloud environments, including AWS.
●Experience in requirement gathering, architecture design, and development of ETL technical specification documents
●Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses, Data Vaults, and data marts using Informatica (Repository Manager, Designer, Workflow Manager, Workflow Monitor) and PowerExchange.
●Experience in Big Data tools like Sqoop (extracting data from and loading data into the HDFS file system).
●Experience in Informatica Data Quality (IDQ) and Data masking.
●Experience in Big Data tools like HBase and Hive (storing data in a relational table structure).
●Managed the Jira board, created stories, and conducted all Scrum ceremonies; in-depth knowledge of Agile methodology.
●Experience in Performance tuning of targets, sources, mappings and sessions
●Worked extensively on stored procedures, Functions, Triggers and SQL Queries
●Experience with Unix shell programs and with integrating Unix shells with ETL jobs such as Informatica batches, sessions, command tasks, and workflows
●Developed Unix shell scripts to automate processes.
●Analyzed the functional specs provided by the business analyst and created technical documents for the mappings
●Working knowledge of cloud services (IaaS, PaaS, and SaaS) and of emerging trends in the cloud space
●Serving as a hands-on subject matter expert for DevOps and automation in a cloud infrastructure environment
●Expertise in all phases of the Software Development Life Cycle (SDLC), including analysis, code development, testing, validation, implementation, and maintenance
●Supported code/design analysis, strategy development, and project planning
●In addition to excellent technical abilities, possesses strong oral and written communication skills, as well as analytical and interpersonal skills
SKILLS
ETL Tools: Informatica PowerCenter 9.6 and 10.2, PowerExchange, IDQ, Pentaho
Big Data/Hadoop: Sqoop, Hive, HBase, HDFS, and Spark
Programming: Perl, Unix Shell Script, Java
Database: Oracle, Sybase, DB2, Teradata
DB Tools: SQL Developer, TOAD, SQL Loader
Operating System: UNIX/Linux, Solaris, Windows, DOS
Scheduling Tools: Control-M, DAC, Crontab and Tivoli Maestro
Version Control: SVN, Git, PVCS, SCCS
BI Tools: Business Objects, OBIEE
CERTIFICATION
●ITIL V3 Certification
●Oracle SQL (OCA)
EDUCATION
Master of Science (M.S) in Software Systems from Birla Institute of Technology & Science (BITS), Pilani, India
B.E. in Computer Science and Information from Visvesvaraya Technological University, Belgaum, India
PROFESSIONAL EXPERIENCE
BCBSA (Blue Cross Blue Shield Association) June 2019 – Present
Project: PDR (Provider Data Repository)
Role: Sr. ETL Developer
Location: Chicago, IL, USA
Description: PDR (Provider Data Repository) is a centralized database for Licensee and third-party-sourced provider information. PDR maintains demographic and quality information about the providers that support National Programs products. The PDR system receives, validates, and extracts provider information.
Responsibilities:
Playing a Senior ETL Developer role in an Agile methodology
Involved in planning, estimation, design and requirement gathering for each sprint.
Developed complex Informatica mappings and workflows.
Involved in performance tuning and performance improvement activities.
Developed Unix shell scripts that are invoked by the Control-M job scheduler.
Involved in testing (unit, integration, system, regression, and UAT).
Developed stored procedures in Oracle.
Environment (Development/Production): Informatica PowerCenter 10.2, DQM, Oracle, Unix shell scripting, and Control-M (scheduling tool)
JP Morgan Chase June 2015 – March 2019
Project: PODS (Positional Operation Data Source)
Role: Sr. ETL Developer / Scrum Master
Location: Bangalore, India (JPMorgan India)
Description: PODS (Positional Operation Data Source) is the authoritative data source for any group that requires position, order, trade, and transaction information throughout Asset Management. PODS provides Investment Management data and serves as the data source for groups such as Client Reporting, Finance, Risk, Portfolio Management, Trading, Back Office, and Operations in Investment Management and Wealth Management.
Responsibilities:
Playing a Scrum Master and Developer role in an Agile methodology
Involved in planning, estimation, design and requirement gathering for each sprint.
Developed complex Informatica mappings and workflows.
Involved in performance tuning and performance improvement activities.
Developed and implemented the data warehouse using Hadoop (Sqoop, Hive, HDFS, HBase, and Spark).
Led the Informatica upgrade project (from version 8.6 to 9.5).
Part of the Informatica and data warehouse trainer team.
Involved in testing (unit, integration, system, regression, and UAT).
Involved in migrating the application to the cloud.
Part of the design and initiation team for DevOps adoption across the LOB.
Environment (Development/Production): Informatica PowerCenter 9.6 / 10.2, IDQ, Oracle, Hadoop (Sqoop, Hive, HDFS, HBase, Pentaho, and Spark), Unix shell scripting, and Control-M (scheduling tool)
Client: Child Maintenance and Enforcement Commission (CMEC), UK Dec 2013 – June 2015
Project: Corporate Data Warehouse for Healthcare
Role: ETL & OLAP Developer
Location: Leeds, UK (Capgemini)
Description: The Corporate Data Warehouse allows CMEC to monitor the use of internal systems by its own staff, and portal applications being used by citizens & organizations. The monitoring is done by the Internal Team Office using regular Data mart reports and ad hoc warehouse access.
Environment (Development): Informatica Power Center, OBIEE, Oracle, UNIX shell scripting, PL/SQL DAC (Oracle BI Data warehouse Administration Console 10.1.3).
Responsibilities:
Performed requirements-gathering activities in order to design useful technical solutions.
Assisted the analysis team in the design, development, testing, and implementation stages
Created the LLD for the data warehouse design and was involved in LLD reviews
Participated in project planning sessions with project managers, business analysts and team members to analyze business requirements and outline the proposed solution.
Participated in deriving physical requirements from logical requirements and in preparing the high-level design document for ETL jobs.
Performed admin activities such as creating and deleting folders, taking folder backups, and comparing folders and repositories
Provided testing support and coordinated with SIT and UAT team members to rectify technical and functional issues.
Client: Warner Brothers (WB) Apr 2011 – Nov 2013
Role: Team Lead and ETL Developer/Admin
Location : Bangalore, India
Description:
Warner Bros. Home Entertainment Inc. is an American film and television entertainment company that sells DVDs, Blu-rays, and game products. Vanguard is a middleware architecture between the LOBs and the SAP system, with an inbound flow from the LOBs and an outbound flow from the SAP system. It is a very complex, very high-volume application. The data is of two types: transactional and master data. The master data is routed from the SAP system through the MB (Message Broker) software, and the transactional data is read directly from the SAP system.
Responsibilities:
Participated in project planning sessions with project managers, business analysts, Business User and team members to analyze business requirements and outline the proposed solution.
Worked on SAP system interface mappings.
Extensively worked on data analysis using source systems (SAP R/3, Siebel, RIM)
Worked on Performance tuning at Mapping and workflow level.
Involved in live support activities such as resolving defects, interacting with clients, and maintaining data warehouse systems
Involved in Informatica admin activities such as creating and deleting folders, taking folder backups, and comparing folders and repositories.
As Team Lead, responsible for the deliverables of assignments given to the offshore team and for resolving ETL and Unix issues.
Guided and supported offshore and onshore team members in different aspects.
Environment (Development/Production): Informatica PowerCenter, PowerExchange, Oracle, Unix shell scripting, Toad 9.0.1, SAP
Client: Inland Revenue (HMRC), UK Aug 2009 – Feb 2011
Project: Management Information System
Role: ETL & OLAP Developer
Location: Telford, UK (Capgemini)
Description: The Management Information System provides various services to HMRC. The term Warehouse refers to two data stores held within MIS, namely the Data Warehouse and the Data Mart. MIS data is first received by the Data Warehouse in the form of flat files, streamed Oracle data, or XML (data feeds) from various other feeding systems in the local office network. The data is then passed to the Data Mart, where it is converted into Oracle SQL tables and transformed to support the eventual output of user reports using Business Objects.
Responsibilities:
Created the LLD for the data warehouse design and was involved in LLD reviews
Created the BO universe for the Work Management mart and developed BO reports for the MIS and Work Management marts using Business Objects.
Involved in data migration and software migration activities.
Developed shell scripts that invoke the DAC execution plans.
Provided testing support and coordinated with SIT and UAT team members to rectify technical and functional issues.
Involved in PGLS (post-go-live support) activities.
Environment (Development): Informatica PowerCenter, Business Objects, Oracle 10g, Unix shell scripting, Toad 9.0.1, Tivoli Maestro scheduler, PVCS Dimensions, Oracle Developer
Client: Alcatel-Lucent Oct 2006 – Aug 2009
Application: OASIS
Role: ETL and UNIX Developer
Location : Bangalore, India
Description: OASIS (Order Activity Status and Information System) combines all the System Integration Centers (SICs) into a global data warehouse to offer a common query and reporting solution using the Business Objects reporting tool.
There are two EDI organizations within Alcatel-Lucent: Customer EDI and Supplier EDI. Supplier EDI mostly deals with Alcatel-Lucent purchasing goods and services from vendors and with financial transactions (primarily payments).
Responsibilities:
Developed shell scripts to automate processes.
Resolved issues (tickets) raised by users across the globe.
Guided and supported offshore and onshore team members in different aspects.
Involved in optimizing various business queries and scripts.
Performed root cause analysis (RCA) of issues and solved problems when jobs failed.
Involved in writing and executing unit test conditions
Involved in automating cron job implementation
Application: Supplier EDI
Role: EDI and UNIX Developer
Environment (Live support): EDI Unix tool, Informatica PowerCenter, Oracle 9i, Unix shell scripting, Toad 8.6, PVCS