
Data Developer

Location:
Hoffman Estates, IL
Posted:
March 28, 2020


Samir Brahmbhatt

Email: adchsq@r.postjobfree.com

Profile:

Over 15 years of experience in IT as follows:

9+ years of experience in ETL using Informatica 10.x/9.x/8.x/7.x (PowerCenter Designer, Repository Manager, Workflow Manager, Workflow Monitor, Informatica Cloud Services).

Knowledge of Kimball's dimensional modeling (Star/Snowflake schemas, Type 1 and Type 2 SCDs).

Strong UNIX shell scripting skills using various UNIX utilities such as grep, awk, and sed.
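
Typical use of these utilities in an ETL context can be sketched as below; the file name and field layout are hypothetical:

```shell
#!/bin/sh
# Minimal sketch: filter, aggregate, and re-delimit a flat feed file.
# File name and columns (id|status|amount) are made up for illustration.

# Sample pipe-delimited source feed
printf 'id|status|amount\n101|ACTIVE|250.00\n102|CLOSED|0.00\n103|ACTIVE|75.50\n' > feed.dat

# grep: keep only ACTIVE records (the header does not match the pattern)
grep '|ACTIVE|' feed.dat > active.dat

# awk: sum the amount column across the filtered records
awk -F'|' '{ total += $3 } END { printf "total=%.2f\n", total }' active.dat

# sed: convert the delimiter to commas for a downstream loader
sed 's/|/,/g' active.dat > active.csv
cat active.csv
```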

Comfortable developing Windows batch scripts to achieve various business objectives.

9+ years of experience working as a software developer/analyst developing several client-server business applications.

Extensive database programming experience using Exadata, Oracle 11g/10g/9i, MS SQL Server 2012/2000/7.0/6.5, T-SQL, PL/SQL, stored procedures, views, packages, and triggers.

Excellent skills in understanding business needs and converting them into technical solutions.

Result-oriented team player with excellent communication and interpersonal skills.

Technical Skills:

ETL Tools : Informatica PowerCenter 10.x/9.x/8.x, Informatica CDC, SQL Server DTS

RDBMS : Exadata, Oracle 11g/10g/9i, MS SQL Server 2016/2012/2000/7.0/6.5 and DB2

DB Tools and Technologies : Toad 8.x, SQuirrel, DB Artisan, SQL, PL/ SQL, ADO and ODBC

Programming Languages : VB 6.0, ASP 2.0, UNIX Shell Scripting (KSH), DOS Batch Scripting

Reporting Tools : Crystal Reports, MicroStrategy 10

Source Control Tools : VSS

Professional Experience:

Rewards Network – Chicago, IL April 2015 – Present

Sr. Data Integration Developer

Responsibilities:

Core responsibilities include ETL design, development, unit testing, and migration of Informatica code to QA.

Coordinate with various business teams (Email Marketing, Core Marketing, Finance, Sales, Customer Management, and Partner Management) to understand and analyze project requirements.

Currently engaged in training and knowledge transfer for the data team's initiative to migrate our database from SQL Server 2017 to the AWS cloud platform (Amazon S3 + Redshift with Spectrum).

Also currently involved in discussions with Matillion to evaluate whether it is a viable cloud ELT/ETL platform to replace Informatica IICS.

Designed and developed the ETL/data solution for a highly visible, major-milestone project from the company's perspective: building a restaurant database sourced from various internal and external systems.

Designed and developed ETL orchestration processes to integrate data from a third-party credit rating agency (LexisNexis) via REST/SOAP web service APIs, using the Informatica Cloud Services (IICS) Cloud Application Integration (CAI) platform. The goal of this project was to enable the Due Diligence team to approve real-time cash advances to RN clients.

Developed ETL solutions using Informatica Cloud Services (ICS) to read/write customer data from/to the Salesforce CRM system using ICS's Replication and Synchronization features.

Designed and developed ETL solutions for a new project sourcing data from external vendors such as Argus. This project enabled the company to receive data from Argus for better analytics, equipping the field sales team with more accurate market insights when pitching prospective customers.

Designed and Developed efficient and robust File Management solution to process and load the source files from various external vendors into the EDW.
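
A file-management process of this kind can be sketched in shell; the directory layout and file pattern below are hypothetical:

```shell
#!/bin/sh
# Sketch of a file-management wrapper: stage incoming vendor files,
# reject empty ones, and archive processed files with a timestamp.
# Directory names and the file pattern are made up for illustration.

INBOX=./inbox; ARCHIVE=./archive; REJECT=./reject
mkdir -p "$INBOX" "$ARCHIVE" "$REJECT"

# Simulate two vendor drops: one good file, one empty file
printf 'row1\nrow2\n' > "$INBOX/vendor_feed_1.dat"
: > "$INBOX/vendor_feed_2.dat"

STAMP=$(date +%Y%m%d%H%M%S)
for f in "$INBOX"/vendor_feed_*.dat; do
    [ -e "$f" ] || continue            # nothing matched the pattern
    if [ ! -s "$f" ]; then
        echo "REJECT: $(basename "$f") is empty"
        mv "$f" "$REJECT/"
        continue
    fi
    # ... the ETL load of "$f" would run here ...
    mv "$f" "$ARCHIVE/$(basename "$f").$STAMP"
    echo "ARCHIVED: $(basename "$f")"
done
```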

Implemented Informatica parameter files to leverage workflow/mapping parameters and variables.
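
A parameter file of the kind referenced typically looks like the fragment below; the folder, workflow, session, and parameter names are illustrative only:

```
[MyFolder.WF:wf_daily_load.ST:s_m_load_customers]
$$LOAD_DATE=2020-03-28
$$SOURCE_SYSTEM=CRM
$DBConnection_Target=EDW_ORACLE

[MyFolder.WF:wf_daily_load]
$$BATCH_ID=1001
```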

Identified bottlenecks behind slow ETL load throughput and improved load performance using a combination of available approaches: ETL mapping optimization, revisiting database table indexing, ETL and database partitioning, and choosing between full and incremental loads.

Adopted development and coding best practices, such as proper naming conventions and reusable components: mapplets, worklets, UDFs, database stored procedures, functions, views, etc.

Actively involved in migrating the Informatica platform from 9.5.1 to 10.1.1 (PowerCenter/PowerExchange).

Environment: Informatica 9.5.1/10.1.1, Informatica PowerExchange CDC, Informatica Cloud Services (ICS/IICS), SQL Server 2012/2016, MicroStrategy 10, AS400 (source), Windows batch scripting, Salesforce (CRM), REST/SOAP web services (WSDL/Swagger)

Portfolio Recovery Associates – Chicago, IL Oct 2012 – Mar 2015

ETL Developer

Responsibilities:

Successfully migrated existing workflows reading source data from Oracle Streams CDC to Informatica CDC to read the change capture set from the source system.

Successfully migrated the existing workflows from the Oracle 11g database to the Exadata database.

Designed and developed workflow to load aggregated fact table.

Developed various mappings to load data from various sources into the target, using transformations such as SQ, Expression, Sorter, Lookup (connected/unconnected), Aggregator, Update Strategy, Joiner, Filter, Stored Procedure, and Router.

Used pre-/post-session, lookup, and target update SQL overrides in the mappings.

Used Informatica pass-through partitions to improve the throughput of snapshot workflows.

Developed Oracle packages and several stored procedures for use in Informatica mappings.

Developed mapplets, UDFs, and reusable transformations to leverage reusability in the Informatica Mapping Designer.

Created workflows/worklets using various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.

Developed Oracle tables with range/list/hash partitioning for better data management and improved query performance.

Implemented efficient and effective performance tuning measures to improve the session throughput.

Used the parameter files and defined mapping parameters and connection information in the file.

Coordinated with the QA team to resolve any defects during UAT.

Provided support to the devops team during the stabilization period.

Environment: Informatica 9.1, Oracle Exadata, Oracle Streams, Informatica CDC, SQL Server, Windows batch scripting

Nike Inc. – Beaverton, OR Mar 2012 – Sep 2012

ETL Developer (Contract – Infosys Technologies)

Responsibilities:

Developed data warehouse solutions based on a star schema with the help of STMs and functional specs provided by the business unit.

Used Source Staging Star methodology to simplify complex business requirements.

Developed various mappings to load data from various sources into the target, using transformations such as SQ, Expression, Sorter, Lookup (connected/unconnected), Aggregator, Update Strategy, Joiner, Filter, and Router.

Developed various Type 1 and Type 2 SCDs.

Implemented a common ETL load-control table framework to track load status and better manage restartability in case of load failures.
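
The restartability idea behind such a framework can be sketched with a simple status file; in practice the status would live in a database control table, so this file-based version is illustrative only:

```shell
#!/bin/sh
# Illustrative sketch of load-control restartability: skip any step
# already marked COMPLETE, so a restarted run resumes where it failed.
# A real framework would use a control table, not a flat file.

CTRL=load_control.txt
: > "$CTRL"

run_step() {
    step=$1
    if grep -q "^$step=COMPLETE" "$CTRL"; then
        echo "SKIP $step (already complete)"
        return 0
    fi
    echo "RUN  $step"
    # ... the actual ETL step would execute here ...
    echo "$step=COMPLETE" >> "$CTRL"
}

run_step stage_customers
run_step load_dim_customer
# Simulate a restart: the completed step is skipped, not re-run
run_step stage_customers
```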

Extensively used the Debugger to debug mappings and determine whether data discrepancies originated in the source or arose from incorrectly applied business rules.

Used session log files to find the reason for workflow failures or incorrect data reported by end users.
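
Session-log triage of this kind is often scripted; a minimal sketch follows, where the log excerpt is fabricated for illustration:

```shell
#!/bin/sh
# Sketch of session-log triage: surface error lines and the completion
# status. The log file name and content below are fabricated.

cat > s_m_load_fact.log <<'EOF'
READER_1_1_1> Reading data from source EMPLOYEES
WRITER_1_*_1> ERROR: ORA-01400 cannot insert NULL into FACT_SALES.SALE_ID
MANAGER> Session run completed with failure [1 error(s)]
EOF

# Pull error lines and the completion status for quick root-cause triage
grep -n 'ERROR' s_m_load_fact.log
grep -n 'completed' s_m_load_fact.log
```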

Implemented efficient and effective performance tuning measures to improve the session throughput.

Used the parameter files and defined mapping parameters and connection information in the file.

Identified the bottlenecks at source level and used the sql override to improve the query performance.

Coordinated with the QA team to resolve any defects logged in Mercury Quality Center.

Environment: Informatica 8.6.1, Oracle 10g, UNIX, Autosys

Bank of Nova Scotia Mar 2010 – Feb 2012

Toronto, Canada (Contract)

Informatica ETL Analyst (Enterprise Data Warehouse)

Responsibilities:

Developed several UNIX shell scripts to enhance and/or implement the business logic.

Developed various mappings/ mapplets to load the data from various sources into the target, using different transformations like SQ, Expression, Sorter, Lookup (Connected/ Unconnected), Aggregator, Update Strategy, Joiner, Filter and Router Transformations.

Developed mappings to implement daily and monthly delta/ incremental loads using mapping parameters in parameter file.

Coordinated with the BA and BU teams to gather and clarify business requirements and logic.

Tested, promoted, and implemented Informatica objects through the Development, UAT, and Production environments.

Implemented efficient and effective performance tuning measures: SQL overrides in the Source Qualifier, sorted input in the Aggregator, unconnected lookups to minimize the number of lookups, and static/persistent lookup caches.

Used the parameter files and defined mapping parameters and connection information in the file.

Extended support to the production team for any production related issues.

Frequently performed root cause analysis of production failures.

Environment: Informatica 8.6.1, DB2, AIX, Tivoli Work Scheduler (Maestro)

Morgan Stanley Jun 2009 – Feb 2010

Markham, Canada (Contract)

Data Security Administrator (IBM Mainframe)

Responsibilities:

Managed user- and application-level access for all financial applications within the MSSB environment.

Used mainframe TSO sessions for application and user level access management.

Used QMF to write queries for DDL and DML operations to support the data access and admin activities.

Morgan Stanley Mar 2007 – Mar 2008

Markham, Canada (Contract)

ETL/Application Support Analyst

Responsibilities:

Provided tier-1 support for the production environment.

Coordinated with developers and end users on issues related to the production environment.

Maintained and developed various mappings/mapplets to load data from various sources into the target, using transformations like SQ, Expression, Sorter, Lookup (connected/unconnected), Aggregator, Update Strategy, Joiner, Filter, and Router.

Extensively used the Debugger to debug mappings and find out whether data discrepancies originated in the source or arose from incorrectly applied business rules.

Used session log files and the technical specification docs to find the reason for workflow failures or incorrect data reported by end users.

Identified the bottlenecks at source level and used the SQL override to improve the query performance.

Used session logs to debug sessions.

Frequently performed root cause analysis of production failures.

Developed various mappings/mapplets for enhancements and code fixes to production jobs, loading data from various sources into the target using transformations like SQ, Expression, Sorter, Lookup (connected/unconnected), Aggregator, Update Strategy, Joiner, Filter, and Router.

Developed several UNIX shell scripts to resolve production failures.

Environment: Informatica 7.1, Oracle 10g, Sun Solaris, Erwin 4.1, DB Artisan, Autosys

Client Logic Corporation Feb 2006 – Feb 2007

Toronto, Canada (Full Time)

Data Analyst

Provided data analysis and reporting services to internal IT clients, who would also request customized reports to analyze various business parameters for decision making.

Responsibilities:

Responded to customer tickets in a timely and efficient manner.

Provided efficient data analysis services to the clients by properly understanding the business process.

Provided customized reporting to the clients, which enabled them to formulate successful improvement measures in terms of client satisfaction and revenue.

Provided effective reporting solutions that enabled call-center team managers to improve team performance and thus overall client satisfaction survey scores.

Environment: VB, VBA, ASP, Oracle9i, Crystal Reports 8.5

Best Buy Ltd. Nov 2005 – Feb 2006

Brampton (Toronto)

Responsibilities:

Worked in the computer department as a Customer Support Representative.

Supported customers by providing technical and functional help for desktops, laptops, and computer peripherals, courteously and eagerly suggesting the right products for their lifestyles while keeping everyday sales goals in mind.

Originally hired as a seasonal employee, I was converted to a full-time employee based on my performance.

Gateway Technolabs Ltd. Apr 2002 – Nov 2004

India (Full Time)

Sr. Analyst Programmer

Completed two major projects at PlanetHealth while with Gateway Technolabs.

Project: Supply Chain Management System

Description: A Supply Chain Management project based on a franchisee model, consisting of the following modules: a) System Admin, b) Purchase Order, c) Order Receive, d) Inventory and Stock Management, e) Sales, f) Data Transfer.

Responsibilities:

Involved in development of the system and worked on Purchase, Receive, Sales and Data Transfer modules.

Extensively used crystal reports in visual basic code to provide customized reports as requested by the client.

Developed more than 30 simple and complex MIS reports as requested by the client.

Effectively lowered inventory costs through accurate reporting built on complex queries.

Extensively used ActiveX Data Objects (ADO) for back-end connectivity, calling stored procedures through the Command object in Visual Basic code to enhance system performance. Used MS SQL Server objects such as triggers, views, stored procedures, and functions to build a robust system. Also used DTS to transfer data between MS SQL Server and various formats such as text files, Excel sheets, and MS Access.

Environment: WinNT, VB, MS SQL-SERVER, Crystal Reports

Project: Customer Loyalty Program

Description: A Customer Loyalty Program (CLP) module integrated with the software above. The CLP tracks customer loyalty through points allotted against purchases, which the customer can redeem per policy.

Responsibilities:

Developed the application per the system requirement specifications, ensuring the integrity of the data stored in the database and enforcing business logic and rules using triggers, stored procedures, views, and functions.

Provided Periodic demonstration of the modules to the clients to gain client confidence.

The project was executed by a two-person team: a DBA who handled the back-end, while I developed the front-end.

Environment: WinNT, VB, MS SQL-SERVER, Crystal Reports

JDS Consultants Apr 2000 – Jan 2002

India (Full Time)

Web Developer

Project: Astrology/Numerology/Palmistry based web portal.

Responsibilities:

Developed the web-based astrology portal www.webastrologer.com, where I worked on the Horoscope module. It includes predictions based on the planetary positions at the time of an individual's birth.

Simplified the development process by using third-party ActiveX components, such as Dimac File-Upload for uploading photographs in the matchmaking module and JMail components for email.

Environment: WinNT, ASP, MS SQL-SERVER, VB, VB-Script

General Motors Ltd. Nov 1999 – Jan 2000

India (Contract)

IT Consultant

Worked as an IT consultant for this automobile giant and successfully designed, developed, tested, and implemented the SISTRAC (Shop Floor Information System for Tracking, Recording and Control) project, which tracks the status of each car on the production plant's shop floor during the course of its manufacture.

Project: SISTRAC (Shop floor Information for Tracking, Recording and Control)

Role: System Analysis and Design, Developing, Testing and Implementing

Responsibilities:

Established the communication sequence in which the Allen-Bradley PLC automation circuit captures data from the plant and sends it to Visual Basic through an ActiveX component (RSView32). This data is stored in the MS SQL Server database for MIS reporting.

Environment: WinNT 4.0, VB 5.0, MS SQL-SERVER 7.0 and Crystal Reports 4.6

RAM Infosys Technologies Mar 1998 – Oct 1999

India (Intern)

Worked as a Trainee Programmer Analyst and developed several customized desktop applications using Visual Basic 5.0 and MS Access.

Education and Certifications

Certification : Issued By

MCSD (Microsoft Certified Solution Developer) : Microsoft

MS SQL Server 7.0 Programming : Brainbench

ASP 2.0 Programming : Brainbench

VBScript : Brainbench

Academic Qualifications:

Diploma in Electronics Engineering (BTE, India)

Bachelor in Computer Applications (MCI, India)


