Md Rajaul K Khan (Reza)
**********@*****.***
Cell: 469-***-****
PROFESSIONAL SUMMARY:
18+ years of progressive experience in Data Engineering, Data Warehousing and Software Development as a Sr. Data Engineer, Data Integration/Migration Lead, ETL Team Lead and Developer.
Solid understanding of and development experience in Data Warehousing, Business Intelligence, Data Migration and Data Integration using various RDBMS (Oracle, MS SQL Server, DB2, MySQL and Redshift) as well as SAP, Java, web-based and other applications.
Expertise in AWS technologies such as S3, Redshift and EMR.
Expertise in Conceptual, Logical & Physical and Dimensional Data Modeling, such as Star & Snowflake schemas and Fact & Dimension tables, as well as Data Analysis, Data Cleansing, Data Transformation, Integration, Migration, Import/Export and Conversion using various ETL tools.
Strong ETL experience using Informatica Power Center 10.x/9.x/8.x/7.x, Power Exchange for SAP/R3 or BCI (Repository Manager, Designer, Workflow Manager and Workflow Monitor) and Informatica Cloud to develop, test, debug and tune mappings, sessions and workflows for ETL processes.
Experience in UNIX shell scripts for ETL job automation, file transfers and file manipulation, and pre-session and post-session commands.
Experience in performance optimization/tuning of databases and ETL processes.
Entry-level experience with Hadoop, Spark, Scala, Hive, Pig and MongoDB.
Proven problem-solving skills with excellent analytical, communication and interpersonal skills.
Excellent ability to lead a group or work in groups as well as independently with minimal supervision, and initiative to learn new technologies and tools quickly.
SKILLS:
Requirement Gathering and Analysis, Enterprise Implementations, KPI dashboards & Scorecards, data mining and analytics, Project Management.
ETL Tools: Informatica Power Center 10.x/9.x/8.x/7.x, Power Exchange 9.x/8.x (Designer, Workflow Manager, Workflow Monitor and Repository Manager).
Database: Redshift, Oracle, MS SQL Server, DB2, MySQL
Programming Languages: J2SE, Python, PL/SQL, C++, C#.
Scripting Language: UNIX Shell Scripting (Bash and Korn Shell), Perl.
Other Tools: Tableau, Toad, SQL Developer, Erwin, Eclipse, Rational Rose, MS Visual Studio, Tidal, AutoSys, Control-M.
PROFESSIONAL EXPERIENCE:
FARMERS INSURANCE BELLEVUE, WA MAR 20 TO CURRENT
LEAD DATA ENGINEER
Zurich-based Farmers Life Insurance is one of the largest insurance companies in the world. Farmers is building its one-stop data hub, called the Life Data Foundation (LDF), for its day-to-day activity. Agile methodology is used to build the data hub. Data is consumed from multiple admin systems, such as mainframe and FAST, and then standardized by transforming it using business and ACORD rules. Stakeholders use this data for reporting, billing, payments, etc.
Responsibilities:
Designed and developed end-to-end data pipelines to consume data from multiple mainframe admin systems, standardize it and load it into the LDF (MS SQL Server) using the ETL tools Informatica Power Center and Power Exchange.
Designed and developed end-to-end data pipelines to ingest data from multiple applications in the form of XML and TXT files.
Designed and developed data pipelines to generate extracts for multiple vendors to support financial and accounting activities such as GL, payments and commissions.
Built a BI database for the reporting team with SCD2 dimensional data modeling (see the sketch after this list).
Write various scripts as needed to simplify the pipeline.
Use Azure DevOps and Jira to plan, track and manage sprints for Agile delivery.
Use GitHub for version control and as the code repository, and CI/CD pipelines for deployment activities.
Develop AutoSys jobs to schedule the data pipelines with event-based and time-based dependencies.
Resolve production failure tickets to support the daily cycle and meet the SLA.
Mentor new-hire Data Engineers and BI developers on the system and tools.
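For illustration, a minimal T-SQL sketch of the SCD2 pattern behind the BI database load, assuming a staging table feeding a Type 2 dimension; dim_policy, stg_policy and all column names are hypothetical, not the actual LDF schema:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE d
SET    d.eff_end_dt = GETDATE(),
       d.is_current = 0
FROM   dim_policy d
JOIN   stg_policy s
  ON   s.policy_id = d.policy_id
WHERE  d.is_current = 1
  AND  (s.status <> d.status OR s.plan_code <> d.plan_code);

-- Step 2: insert a new current version for new or just-expired business keys.
INSERT INTO dim_policy (policy_id, status, plan_code, eff_start_dt, eff_end_dt, is_current)
SELECT s.policy_id, s.status, s.plan_code, GETDATE(), '9999-12-31', 1
FROM   stg_policy s
LEFT JOIN dim_policy d
  ON   d.policy_id = s.policy_id AND d.is_current = 1
WHERE  d.policy_id IS NULL;
```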
AMAZON SEATTLE, WA DEC 16 TO DEC 19
SR. DATA ENGINEER
Amazon is the largest online retailer; it also delivers shipments to its customers. To deliver most cost-effectively while keeping customers satisfied, the Transportation team built several complex systems to monitor and measure the performance of different shipment KPIs. I was engaged in building several of those systems, especially for global shipments. These systems provide the data needed to gain insight, take appropriate measures to reduce cost and make strategic decisions by leading cross-functional teams in the development, documentation and delivery of process innovations driving the attainment of business goals.
Responsibilities:
Built end-to-end data solutions (data pipelines) for AGL's (Amazon Global Logistics) Finance and Ops teams in Oracle using Datanet; the pipelines are also used by other teams to generate reports.
Ingested data from multiple systems to build the data pipelines.
Took total ownership of the whole system, enhancing, maintaining and supporting it to ensure the SLA was met.
Built a Redshift cluster and migrated all objects to Redshift as part of the RollingStone project in AWS (a load/maintenance sketch follows this list).
Subscribed to tables through Hoot and built other data pipelines to bring the required data into the system.
Built WBR and OBR for the AGL Finance and Ops teams using metrics jobs for key KPIs.
Built OBR (visible to the SVP) for WW AMZL in reporting tools such as Tableau, Quick View and XLSX.
Automated Controllership reporting for EU shipments.
Automated the manifest of EU imports for all-means carriers.
Wrote various scripts for maintenance of the Redshift cluster.
Mentored new-hire Data Engineers and BI developers on the systems and tools.
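For illustration, a minimal sketch of the Redshift load and maintenance pattern behind the migration and the maintenance scripts; the bucket, IAM role and table names are placeholders, not Amazon's actual objects:

```sql
-- Bulk-load a migrated table from staged S3 files (placeholder paths and role).
COPY agl_shipments
FROM 's3://example-bucket/agl/shipments/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
FORMAT AS CSV
GZIP
TIMEFORMAT 'auto';

-- Routine maintenance: reclaim space, re-sort rows and refresh planner statistics.
VACUUM FULL agl_shipments;
ANALYZE agl_shipments;
```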
COSTCO WHOLESALE, ISSAQUAH, WA JUL 14 TO NOV 16
DATA INTEGRATION/MIGRATION ETL LEAD
Costco Wholesale is one of the largest retail clubs in the USA, and its business is rapidly expanding worldwide. The COSMOS project replaces the existing legacy (AS/400 iSeries) system with SAP for CRM, HR and Finance, modernizing the business for better strategic decision-making, customer relationship management and financial management, which allows a cost-effective, efficient solution for the company. It also uses the finance data from SAP to enhance the existing data warehouse. Data was extracted from AS/400, DB2, SQL Server, third-party feeds (XML format) and a Java-based application and loaded into SAP using IDoc/BAPI/RFC/LSMW by applying business logic. The project required inbound/outbound interfaces and data conversion.
Responsibilities:
Led the Conversion/Interface team (onsite/offshore), covering areas such as iSeries, ETL (Informatica, BODS) and SAP, and oversaw the whole conversion process as Data Migration/Integration lead.
Masterminded the inbound and outbound interface development process by working closely with functional analysts, developers and testers.
Conducted and attended regular and ad hoc meetings with project teams to update daily status or escalate issues and concerns; conducted team meetings and conference calls (with offshore) to assign work and get updates on assigned tasks.
Engaged with Business Analysts in requirements gathering, analysis, testing and project coordination.
Prepared standard design documents (FD), technical specifications (TD), ETL specifications and migration documents.
Designed and reviewed the Interface/Conversion ETL code before moving to QA to ensure design standards are met.
Developed Workflows, Sessions, Mappings and Mapplets to load data from heterogeneous sources into SAP using BAPIs, RFCs and IDocs with Power Center (9.6.1) & Power Exchange (9.6.1).
Used different transformations, such as Connected & Unconnected Lookup, Router, Aggregator, Normalizer, Source Qualifier, Joiner, Expression, Update Strategy, Sequence Generator and Transaction Control, for data conversion before loading to the target.
Created mappings to extract data from SAP (ECC) tables using SAP/R3 and from SAP data sources using BCI for outbound interfaces.
Designed and implemented a scheduler to run the interfaces at different times using shell scripting in UNIX.
Solved various performance and complex logic-related issues with effective and efficient solutions in the ETL process and identified various bottlenecks and critical paths of data.
WARNER BROS., BURBANK FEB 12 TO JUNE 14
SR. INFORMATICA CONSULTANT (LEAD)
The project COSMOS (ITD) at Warner Bros. was to replace the aging legacy system with SAP for strategic decision-making, customer relations and financial management, which allows a cost-effective, efficient solution for the company. SAP's new solution IPM (Intellectual Property Management) was implemented with CRM & ECC. Data was extracted from Oracle, DB2, SQL Server, mainframe and Java-based applications and loaded into SAP using IDocs/BAPI/RFC.
Responsibilities:
Designed the whole EAI (middleware) architecture for the project COSMOS based on the given requirements, in compliance with IMCC (Information Management Competency Center) standards, with a handshaking mechanism between Legacy, Middleware (EAI) and SAP.
Managed a team of 4-6 resources (onsite/offshore). Conducted and attended regular and ad hoc meetings/conference calls with the project team (onsite/offshore) to update daily status or escalate issues and concerns.
Involved with Business Analysts in requirements gathering, analysis, testing and project coordination.
Designed and developed the Interfaces by implementing given business logic within the given time frame.
Designed and implemented a scheduler to run the interfaces as required using shell scripting in UNIX.
Solved various performance and complex logic-related issues with effective and efficient solutions in the ETL and database processes, and identified various bottlenecks and critical paths of data transformation in sessions for different interfaces.
Responsible for Data Cleansing and Data Quality checks using Informatica Data Quality (IDQ).
Developed mappings using transformations such as Connected & Unconnected Lookup, Router, Aggregator, Normalizer, Source Qualifier, Joiner, Expression, Update Strategy, Sequence Generator and Transaction Control to load heterogeneous data into SAP using BAPIs, RFCs and IDocs with Power Exchange. Also used Java, XML and Web Service transformations for data conversion.
Created Mapping to extract data from SAP (CRM, ECC) tables directly using SAP/R3.
Created reusable Mapplets and Workflows/Sessions using Informatica Server Manager/Workflow Monitor to load data into the target.
Performed various performance tuning activities at the database level (source, target, lookup) and the Informatica level (mapping and session) using T-SQL, SQL Developer and Toad.
Performed production support during the stabilization period, resolving production issues (ITSM), retesting and deploying to production.
TAKEDA PHARMACEUTICAL, CHICAGO NOV 11 TO FEB 12
INFORMATICA CONSULTANT (TEAM LEAD)
Takeda is one of the leading multinational pharmaceutical companies, producing life-saving drugs. The project enhanced the existing data warehouse for its production, marketing and staff controlling data, enabling the company to easily analyze, forecast and report for strategic decisions. Users from all branches around the world enter day-to-day data into the system, which is in turn analyzed and used for reporting purposes.
Responsibilities:
Interacted with business clients to gather and analyze the business requirements and translate them into technical requirements.
Designed different cubes and fact and dimension tables according to the business requirements for the enhancement.
Implemented Slowly Changing Dimensions (SCD2), using Type 2 dimensions to keep a history of changes.
Designed and built Simple to Complex mappings to populate Fact and Dimension tables using various transformations like Source Qualifier, Filter, Expression, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner, Router, SQL and Transaction control Transformation.
Developed Mapping, Session, Workflow and other component in Informatica using Source Analyzer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
Wrote pre-session and post-session stored procedures/SQL to drop the indexes, re-create the indexes and perform joins in the database (illustrated in the sketch after this list).
Involved in debugging, troubleshooting, identifying performance issues and bugs, and fixing them in existing mappings by analyzing the data flow and evaluating transformations to eliminate inconsistency, inaccuracy and latency.
Translated many stored procedures and functions to Informatica code.
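For illustration, a minimal sketch of the pre-/post-session index handling (SQL Server-style syntax shown); the index and table names are hypothetical:

```sql
-- Pre-session: drop the index so the bulk load is not slowed by index upkeep.
DROP INDEX idx_fact_sales_dt ON fact_sales;

-- Post-session: re-create the index once the load completes.
CREATE INDEX idx_fact_sales_dt ON fact_sales (sale_dt, product_key);
```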
WARNER BROS., BURBANK NOV 09 TO NOV 11
SR. ETL CONSULTANT (TEAM LEAD)
The project COSMOS (DTD) at Warner Bros. was to integrate a few existing systems (STARS, UpDocs, ITS) with SAP for strategic decision-making, customer relations and financial management for Domestic Television Distribution (DTD). Data was extracted from DB2, Oracle, SQL Server and mainframe-based applications and converted to IDocs to load into SAP.
Responsibilities:
Architected the whole EAI (middleware) for the project COSMOS with a bidirectional handshaking process between the source and target applications.
Involved in requirements gathering, analysis and development, and prepared standard design documents, technical specifications, ETL specifications and migration documents.
Responsible for Data cleansing and Data quality checks using Informatica Data Explorer and Informatica Data Quality 8.6 (IDQ).
Developed mappings and reusable Mapplets using transformations such as Connected & Unconnected Lookup, Connected & Unconnected Stored Procedure, Router, Normalizer, Source Qualifier, Joiner, Expression, Update Strategy, Aggregator, Sequence Generator and Transaction Control for data conversion before loading to the target. Also used the Java transformation to implement complex logic.
Developed Sessions, Workflows and Worklets to load data from heterogeneous sources into SAP using BAPIs, RFCs and IDocs via Power Exchange.
Solved various performance and complex logic-related issues with effective and efficient solutions in the ETL process.
BNSF RAILWAYS, FORT WORTH, TX MAY 08 TO OCT 09
SR. INFORMATICA DEVELOPER
BNSF Railway is one of the largest transportation (rail) companies in the United States. Its existing financial and HR payroll systems (Millennium and Tesseract, respectively) were aging, becoming more complex and costly to maintain, and were subject to a degree of operational risk. The purpose of the project Envision was to migrate from Millennium to the SAP financial package for applicable business processes, providing external and internal users with reliable, timely and accurate data that can be easily analyzed.
Responsibilities:
Involved in requirements gathering, business analysis, test planning and translating business requirements into technical requirements.
Designed interfaces/conversions and developed inbound/outbound mappings using Informatica transformations as an Informatica SME.
Developed Mapplets, Sessions, Worklets and Workflows to load data into SAP using BAPIs, RFCs and IDocs, and to extract data from SAP and load it into the legacy system using SAP/R3 and BAPI.
Extracted data from various sources such as flat files, DB2 UDB, SAP and Oracle, and used Informatica Power Exchange to extract data from the mainframe based on data sets.
Designed and implemented an error handling system (EEMS) and an auto-recovery system.
Provided 24/7 production support post go-live and technical support to junior team members.
SPRINT PCS, DALLAS, TX SEPT 07 TO APR 08
SR. BUSINESS ANALYST
Sprint is one of the largest wireless companies in the United States, with more than 30 million subscribers using one of the nation's largest digital voice and data networks. This project dealt with building multiple KPI dashboards to measure the performance of various business units using the Rational Unified Process, UML, MS Visio and Excel.
Responsibilities:
Provided executives with analytics and decision-support for strategies with KPIs.
Engaged with developers to automate manual processes, saving time and money while reducing errors. Credited as a primary driving force behind a 5% increase in margins that fiscal year.
Collaborated with stakeholder groups across the organization to ensure business and technology alignment. Proposed solutions meeting defined specifications and needs.
Performed quality assurance, system integration and user acceptance testing, facilitating on-time, on-budget and acclaimed go-lives of enterprise implementations.
EDUCATION:
MS in Computer Science (University of Texas at Dallas)