Data Manager


Name: Nikhil Nitinrao Deshpande Email: adkeg9@r.postjobfree.com

Location: Plano, TX, USA Cell: +1-972-***-****

Education

Master’s Degree (MS) in Computer Science from Texas A&M University – Commerce, TX, USA [2010 – 2012].

Bachelor’s Degree (BE) in Computer Science & Engineering from SRTM University, India [2005 – 2009].

Technical Skills

Hardware: Assembling and disassembling computers, cabling, troubleshooting common hardware issues, servers, routers, and switches.

Operating Systems: Windows 95/98/NT/2000/ME/XP/Vista/7/10, Linux, UNIX.

Programming Languages: Java, C#, VB.net, C, C++, HTML, CSS, JavaScript, Angular, Python, SQL, SAP ABAP.

Tools: Informatica PowerCenter 10/9.6 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Administrator Console), Informatica Data Quality (IDQ 10/9.x), IICS (Informatica Intelligent Cloud Services), SAP Business Objects Data Services 4.2 SP9P1 (Designer, Repository Manager, Server Manager, Management Console, Central Management Console, Central Configuration Manager), Business Objects XI R2/6.x/5.x, Information Steward 4.2, SAP BW/BI 7.4, SSIS (SQL Server Integration Services), Pentaho Data Integration, SQL Server Management Studio, Azure DevOps, Tableau, Apache Tomcat, SOAP UI, Ztabledownload.

MDM (Master Data Management): Teradata MDM, Informatica MDM, SAP MDM/SAP MDG.

Databases: Teradata, SAP S/4HANA, Oracle 12c/11.2/10g/9i/8i, DB2, Informix, MS Access, MS SQL Server.

Other Utilities: MS Word, MS Excel, MS PowerPoint, MS Access, MS Visio.

ERP Packages: SAP ECC 6.0, R/3 4.7, and 4.6C.


Summary

9 years of total work experience as a software professional in the design, development, administration, and support of solutions using Informatica PowerCenter 10/9.x, Business Objects Data Services 4.2, XI 4.x (R3.1)/XI R2/R1/6.x, Pentaho Data Integration, IBM DataStage, Teradata MDM, Informatica MDM, and MS SQL Server, with strong architectural and administrative knowledge.

8 years of experience delivering data warehousing implementations, data migration, and ETL processes to integrate data across multiple sources using Informatica PowerCenter and Informatica Cloud Services.

8 years of strong experience in MDM (Master Data Management), delivering Business Objects Data Services and Informatica development for MDM Location, Customer, Product, Vendor, GL, and Hierarchy projects.

Strong experience installing and working with SAP Information Steward 4.2 for data profiling, rule creation, rule binding, and creating Cleansing Packages with the Cleansing Package Builder.

Strong experience in ETL design and implementation involving extraction of data from ECC and a variety of legacy systems; integration and transformation; data/address cleansing using Data Quality transforms; matching/de-duplication using Match transforms; and loading into the data warehouse, including enabling geocoding.

4 years of experience in HANA development, including HANA SQL scripts and S/4HANA installation.

Experience with SAP HANA 2.0 configuration, architecture, reporting, predictive analytics and data modeling.

3 years of experience working on Python, Spark, and Scala.

Functionally strong in MDM and MDG foundational work related to SAP Data Services and Data Quality development across various domains.

Experience creating UNIX shell scripts for BODS and Informatica jobs and scheduling them in Control-M and crontab.

Experience using SAP ABAP, LSMW, BDC, BAPI, and AIO jobs within BODS jobs.

Experienced with the SDLC (Software Development Life Cycle) methodology.

Involved in Teradata cloud migration.

Experience with Tableau and Cognos.

Employment History

Cognizant Technology Solutions US Corp., USA, from April 2014 to present.

Resurge Solutions, LLC., USA from June 2012 to April 2014.

ipTouch Software Solutions, India from June 2009 to June 2010.

Projects

PepsiCo, Plano, TX – From Feb 2013 to present.

Technologies Used:

-Informatica PowerCenter, Informatica Cloud Services, SAP BODS, SAP HANA/HANA Studio, Pentaho Data Integration, Snowflake, MS SQL Server, Tableau, SAP ABAP, SAP Information Steward, Oracle, UNIX, SAP S/4HANA, SAP DQM, Teradata, and Teradata MDM.

Responsibilities:

-Develop and test Informatica jobs for Material, Customer, Vendor, and Location data; analyze data and design data mappings for the data models across environments to deliver the MDM projects.

-Work with business users to understand the functional requirements and business objectives, and create technical designs for SAP BODS, Informatica, and Pentaho jobs for MDM (Master Data Management) and GDQ (Global Data Quality) projects.

-Use Informatica Data Quality (IDQ - Informatica Developer 9.6.1/9.1) for cleansing and formatting Material, Location, Vendor, and Customer master data.

-Used the Informatica Data Quality tool (Informatica Developer) to scrub, standardize, and match Location, Vendor, and Customer addresses against the USPS database.

-Performed the role of senior ETL Informatica and Data Quality (IDQ) developer on a data warehouse initiative; responsible for requirements gathering, preparing mapping documents, architecting the end-to-end ETL flow, building complex ETL procedures, developing the strategy to move existing data feeds into the Data Warehouse (DW), and performing data cleansing activities using various IDQ transformations.

-Extensively used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate scorecards, create and validate rules, and provide data to business analysts for rule creation.

-Used Informatica Data Quality transformations to parse “Financial Advisor” and “Financial Institution” information from the Salesforce and Touchpoint systems and perform standardization, labeling, parsing, address validation, address suggestion, matching, and consolidation to identify redundant and duplicate information and build the master record.

-Extensively used Standardizer, Labeler, Parser, Address Validator, Match, Merge, Consolidation transformations.

-Extensively worked on performance tuning of Informatica and IDQ mappings.

-Created Informatica workflows and IDQ mappings for both batch and real-time processing.

-Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.

-Created Informatica mappings with Informatica MDM requirements in mind.

-Integrated data with SFDC and Microsoft Dynamics CRM using Informatica Cloud.

-Extracted the raw data from Microsoft Dynamics CRM to staging tables using Informatica Cloud.

-Developed Cloud mappings to extract the data for different regions.

-Developed the audit activity for all the cloud mappings.

-Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.

-Created Filewatcher jobs to setup the dependency between Cloud and PowerCenter jobs.

-Helped IT reduce the cost of maintaining the on-premises Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.

-Designed several Processes on Informatica Cloud and exposed them as RESTful API services to publish data to external systems.

-Gained experience with IBM DataStage through an IBM DataStage-to-Informatica conversion project.

-Develop SAP BODS jobs for data/address cleansing and matching/de-duplication, and load the data into Teradata or SAP HANA databases using BODS Data Quality transforms.

-Read data from ECC using ABAP dataflows and load it into multiple databases.

-Use various Data Integrator transforms such as Data Transfer, Hierarchy Flattening, History Preserving, Key Generation and CDC Operations in SAP BODS jobs.

-Create SAP BODS Real-Time jobs for Location, Customer and Vendor projects.

-Support Assembly, Regression and System Testing and work with testers on resolving any defects and on Performance Tuning.

-Work on SAP BODS Change Data Capture, use Data Services adapters for Hadoop to integrate data from Hadoop, Hive, and HDFS, and connect to web services and streaming queues through Informatica.

-Develop Informatica PowerCenter workflows and sessions and set up PowerExchange connections to databases and mainframe files.

-Implement complex business rules in Informatica Power Center by creating re-usable transformations, and robust Mapplets.

-Work on Informatica Data Quality transformations such as Parser, Standardizer, Address Validator, and Match/Merge.

-Work on Informatica MDM Hub Tools such as Merge Manager, Data Manager and Hierarchy Manager.

-Work on SAP HANA Smart Data Integration (SDI) and SAP Smart Data Access (SDA) for SAP HANA and Hadoop integration.

-Create SAP BODS jobs to load data from AWS, using SAP HANA Cloud Integration for Data Services, into a HANA schema on the SAP HANA Cloud Platform.

-Create SAP BODS jobs to load files into AWS S3.

-Create Analytic Views, Attribute Views, Calculation Views, and restricted and calculated columns, and work with SAP HANA hierarchy functions and currency conversion.

-Create decision tables in HANA Studio to formulate business rules.

-Work on SLT configuration for data replication from S/4HANA, along with virtual data modeling tools such as HANA Live and the Lumira visualization tool.

-Work on data migration projects, such as Oracle to Teradata and ECC to Teradata, using SAP BODS.

-Write UNIX shell wrapper scripts for the Pentaho and SAP BODS jobs and schedule them in the Control-M and crontab job schedulers.

-Create UNIX shell scripts to run the Informatica workflows and control the ETL flow (a minimal sketch follows).
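
For illustration only, a minimal Python sketch of this kind of workflow wrapper (the resume describes UNIX shell scripts; Python is used here to keep all examples in one language). The domain, integration service, folder, and workflow names are placeholders, and credential handling is omitted:

    import subprocess
    import sys

    # Placeholder connection details; real values would come from the scheduler's
    # environment or a configuration file (pmcmd credentials omitted here).
    DOMAIN = "Domain_Dev"
    INT_SERVICE = "IS_Dev"
    FOLDER = "MDM_CUSTOMER"
    WORKFLOW = "wf_load_customer_master"

    def run_workflow():
        """Start an Informatica workflow with pmcmd and wait for it to finish."""
        cmd = [
            "pmcmd", "startworkflow",
            "-sv", INT_SERVICE,
            "-d", DOMAIN,
            "-f", FOLDER,
            "-wait",  # block until the workflow completes
            WORKFLOW,
        ]
        return subprocess.run(cmd).returncode

    if __name__ == "__main__":
        # A non-zero exit code lets Control-M or cron flag the run as failed.
        sys.exit(run_workflow())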

-Work on code migrations, SAP BODS address directory quarterly updates, monthly OS patching of Linux servers, etc.

-Use Teradata bulk-loader options and data pipelines such as Snowpipe to bulk-load data into target tables.

-Build Pentaho ETL jobs to read from Teradata, Oracle, and SFTP servers and load into SQL Server.

-Work on MS SQL Servers to create the Audit Sync reports and Data Quality scorecards.

-Work on Tableau to publish the dashboards.

-Work on VB and C# scripting in Visual Studio.

-Work in SAP Information Steward to build Cleansing Packages with the Cleansing Package Builder, create and bind rules, and perform data profiling.

-Work on the migration from StarTeam to Azure DevOps.

-Develop a test framework using Python.

-Develop a Python wrapper to run the application along with other applications.

-Perform backend scripting and parsing using Perl and Python.

-Develop Python APIs to dump array structures in the processor at the point of failure for debugging.

-Write Python parsers to extract useful data from the design database (a minimal sketch follows).
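
A minimal sketch of such a parser; the export layout, field names, and file paths are assumptions for illustration, not the actual design-database format:

    import csv
    import re
    from pathlib import Path

    # Assumed line layout, e.g. "OBJECT=Z_CUSTOMER_VIEW STATUS=ACTIVE".
    LINE_PATTERN = re.compile(r"OBJECT=(?P<name>\S+)\s+STATUS=(?P<status>\S+)")

    def parse_export(path):
        """Yield (name, status) pairs for every line matching the expected layout."""
        for line in Path(path).read_text().splitlines():
            match = LINE_PATTERN.search(line)
            if match:
                yield match.group("name"), match.group("status")

    def write_report(records, out_path):
        """Dump the extracted records to CSV for downstream review."""
        with open(out_path, "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow(["object_name", "status"])
            writer.writerows(records)

    if __name__ == "__main__":
        write_report(parse_export("design_export.txt"), "design_report.csv")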

-Configure Spark Streaming to consume ongoing data from Kafka and store the stream to HDFS.

-Implement Spark jobs using Scala and Spark SQL for faster testing and processing of data.

-Use Spark and Spark SQL to read the Parquet data and create the tables in Informatica using the Scala API.

-Use various Spark transformations and actions to cleanse the input data.
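
A compact PySpark sketch of the two Spark patterns described above (the resume mentions the Scala API; Python is used here only to keep all examples in one language). The broker, topic, and HDFS paths are placeholders, and the Kafka source assumes the spark-sql-kafka package is available:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka_to_hdfs_demo").getOrCreate()

    # Streaming side: consume an ongoing Kafka topic and land it on HDFS as Parquet.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
              .option("subscribe", "events_topic")                 # placeholder topic
              .load()
              .selectExpr("CAST(key AS STRING) AS key",
                          "CAST(value AS STRING) AS value"))

    query = (events.writeStream
             .format("parquet")
             .option("path", "hdfs:///data/landing/events")
             .option("checkpointLocation", "hdfs:///data/checkpoints/events")
             .outputMode("append")
             .start())

    # Batch side: read the Parquet data back and expose it to Spark SQL.
    df = spark.read.parquet("hdfs:///data/landing/events")
    df.createOrReplaceTempView("events")
    spark.sql("SELECT key, COUNT(*) AS cnt FROM events GROUP BY key").show()

    query.awaitTermination()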

-Participate in migrating objects from Teradata to Snowflake.

-Develop a data warehouse model in Snowflake for over 100 datasets using WhereScape.

-Participate in testing Snowflake to determine the best way to use the cloud resources.

-Develop ETL workflows using NiFi to load data into Hive and Teradata.

-Migrate jobs from the NiFi development cluster to the pre-production and production clusters.

-Schedule different Snowflake jobs using NiFi.

Marathon Oil Corporation, Houston, TX – From Jan 2013 to Feb 2013.

Technologies Used:

-Informatica PowerCenter, Informatica MDM, SQL Server, Teradata, UNIX.

Responsibilities:

-Designed, developed, and implemented the conversion project to convert code from IBM DataStage to Informatica.

-Worked on loading the data from Oracle to SQL Server.

-Worked on coding for the business rules and other requirements to filter out/validate the data.

-Created sessions, batches for incremental load into staging tables and scheduled them to run daily.

-Wrote SQL queries and created UNIX shell scripts.

-Implemented Informatica recommendations, methodologies, and best practices.

Sigma-Aldrich, Milwaukee, WI – From Nov 2012 to Dec 2012.

Technologies Used:

-SAP BODS, Informatica PowerCenter, Pentaho, SQL Server Management Studio, Teradata, Teradata MDM, BAPI, LSMW, and UNIX.

Responsibilities:

-Developed SAP BODS batch jobs to extract data from multiple legacy systems and load it into Teradata and SAP S/4HANA.

-Performed multiple transformations in the BODS jobs according to the functional requirements.

-Provided training on SAP BODS and Informatica to Sigma-Aldrich employees and resolved their technical questions on developing BODS and Informatica jobs and on BODS data migration.

-Created SAP BODS jobs to extract data from multi-format flat files, Excel, and XML files into the UL database and DB2 billing systems.

-Created BODS jobs to export heterogeneous data from OLE DB sources (Oracle) and Excel spreadsheets to SQL Server 2005/2008.

-Involved in building the Informatica ETL architecture and Source to Target mapping to load data into Data warehouse.

-Created mapping documents to outline data flow from sources to targets in Informatica ETL projects.

-Created Informatica jobs to extract data from staging tables to partitioned tables with incremental load.

-Designed, Developed and Tested ETL Mappings, Mapplets, Workflows, Worklets using Informatica PowerCenter.

-Worked extensively with transformations such as Source Qualifier, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup, Stored Procedure, Sequence Generator, Joiner, and XML in Informatica.

-Created SSIS packages using SSIS Designer to export heterogeneous data from OLE DB sources (Oracle) and Excel spreadsheets to SQL Server 2005/2008.

-Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with incremental load.

Unilever, Englewood Cliffs, NJ – From June 2012 to Oct 2012.

Technologies Used:

-SAP BODS, SAP Information Steward, SAP ABAP, BAPI, IDocs, LSMW, SAP MDG, SQL Server, Oracle, Teradata, Teradata MDM, and UNIX.

Responsibilities:

-Worked on data migration from legacy systems to SAP ECC/CRM; data provisioning into SAP ECC/CRM was done using SAP BODS through IDocs.

-Implemented RFC Enabled BAPIs & FMs in BODS and developed LSMW for Data Migration in SAP ECC.

-Extracted data from SAP into a HANA-based Enterprise Data Warehouse (EDW) that hosts data across all units of the Farm Service Agency, using ABAP data flows and implementing a delta load mechanism.

-Designed complex jobs using History preservation, Key generation and Table comparison transforms for tables that needed before and after images.

-Followed USPS address regulations, updated address directories, and used the SAP built-in AIO jobs for the customer master.

-Created local repositories and profiler repositories on MS SQL Server 2008 and configured the Informatica local repository with the Job Server and Management Console.

-Used SAP Information Steward for rule creation, rule binding, data profiling, and Cleansing Package creation.

iPTouch Software India Pvt. Ltd., Hyderabad, India – From May 2009 to May 2010.

Developed various websites using HTML, PHP, JavaScript, C, and C++.

References

References are available on request.


