
Data Web Services

Location:
Sanford, FL
Posted:
July 31, 2017

Resume:

SHAER AHMED

Little Rock, AR ***** / 501-***-****

ac1k8k@r.postjobfree.com

Senior Teradata Developer

Dynamic member of software development teams delivering multimillion-dollar, mission-critical projects.

Skilled in all phases of the software development lifecycle; expert at translating business requirements into technical solutions; and fanatical about quality, usability, security and scalability.

Expertise

Data Warehousing

Teradata

Informatica PowerCenter

Informatica Data Quality (IDQ)

Informatica Data Explorer (IDE)

Oracle, DB2, SQL Server, Netezza

Salesforce

Unix, Web Services

Business Objects

SCM, Insurance, Financial, Banking & Healthcare domain knowledge

Software Development Lifecycle, Agile, Scrum

Technology Summary

10 years of IT experience in analysis, design, development, testing and implementation of business application systems for the Insurance, Financial, Supply Chain Management and Healthcare sectors.

Strong Data Warehousing ETL experience using Teradata, Informatica PowerCenter, Informatica IDQ/IDE and various other databases to extract, transform, cleanse and load data from sources such as flat files, relational tables and XML into various targets, in batch and real time.

Worked extensively with Teradata utilities (TPT, FastLoad, MultiLoad, TPump, FastExport) to load huge amounts of data from flat files and Teradata sources into Teradata targets.

Designed and recommended various Teradata temporary/volatile tables, choosing the primary index (PI) with both planned data access and even distribution of data across all available AMPs in mind.
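
For illustration, a minimal sketch of such a volatile table, with the PI chosen for even AMP distribution (all table and column names here are hypothetical):

    CREATE VOLATILE TABLE stg_orders_vt AS (
        SELECT order_id, customer_id, order_amt
        FROM   edw.orders
        WHERE  order_dt = CURRENT_DATE
    )
    WITH DATA
    PRIMARY INDEX (order_id)    -- high-cardinality PI spreads rows evenly across AMPs
    ON COMMIT PRESERVE ROWS;    -- keep the rows available for the rest of the session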

Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins and hash indexes in the Teradata database.

Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.

Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, hints and SQL Trace in both Teradata and Oracle.
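
A minimal sketch of that tuning loop on the Teradata side (hypothetical names):

    -- Give the optimizer fresh demographics on the join/filter columns
    COLLECT STATISTICS COLUMN (customer_id) ON edw.orders;

    -- Review row estimates, join strategy and any AMP redistribution steps
    EXPLAIN
    SELECT c.customer_nm, SUM(o.order_amt)
    FROM   edw.customers c
    JOIN   edw.orders o
      ON   o.customer_id = c.customer_id
    GROUP BY c.customer_nm;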

Strong experience with PL/SQL and dynamic SQL for handling different inputs.

Excellent experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics.

Created an automated process through shell scripting and Teradata to read source data and apply SCD-1 and SCD-2 changes to target tables through a reusable script.
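
As a sketch of the SCD-2 logic such a reusable script applies (the classic expire-then-insert pattern; all names hypothetical):

    -- Step 1: expire the current version of any row whose tracked attribute changed
    UPDATE tgt
    FROM edw.customer_dim AS tgt, stg.customer_stg AS src
    SET eff_end_dt = CURRENT_DATE - 1,
        curr_flag  = 'N'
    WHERE tgt.customer_id   = src.customer_id
      AND tgt.curr_flag     = 'Y'
      AND tgt.customer_addr <> src.customer_addr;

    -- Step 2: insert a fresh current version for new keys and the rows expired above
    INSERT INTO edw.customer_dim
        (customer_id, customer_addr, eff_start_dt, eff_end_dt, curr_flag)
    SELECT src.customer_id, src.customer_addr,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg.customer_stg AS src
    LEFT JOIN edw.customer_dim AS tgt
      ON  tgt.customer_id = src.customer_id
      AND tgt.curr_flag   = 'Y'
    WHERE tgt.customer_id IS NULL;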

Wrote Teradata macros, used various Teradata analytic functions, and have a good understanding of generic functions such as NVP and TD_UNPIVOT.
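
For example, a Teradata macro of the kind described might look like this (names hypothetical):

    -- Parameterized, reusable lookup executed as a single multi-statement request
    CREATE MACRO edw.get_open_orders (in_customer_id INTEGER) AS (
        SELECT order_id, order_dt, order_amt
        FROM   edw.orders
        WHERE  customer_id = :in_customer_id
          AND  order_status = 'OPEN';
    );

    EXEC edw.get_open_orders (1042);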

Complete understanding of Stage to Core to Semantic ETL & ELT development.

Created several volatile tables and work tables as part of incremental process development.

Good knowledge of Teradata Manager, TDWM, PMON and DBQL.

Extensively used SQL Analyser and wrote complex SQL queries using joins, subqueries and correlated subqueries.

Used various Informatica PowerCenter and Data Quality transformations (source qualifier, aggregator, update strategy, expression, joiner, lookup, router, sorter, filter, Web Services Consumer, XML Parser, labeler, parser, address validator (Address Doctor engine), match, comparison, consolidation, standardizer, merge) to perform various data loading and cleansing activities.

Complete understanding of regular matching, fuzzy logic and dedupe limitations in the IDQ suite.

Strong experience in creating profiles and scorecards from existing sources using Informatica Data Explorer (IDE) and IDQ, and shared those profiles with business analysts to support business decisions such as assigning matching scores to different criteria.

Strong experience in configuring MQ Series as the source and parsing the XML inside the MQ messages using the XML Parser transformation in batch and real time.

Good understanding of Informatica MDM (Siperian MDM), including landing tables, staging tables, base objects, and the match, merge and unmerge processes.

Used various performance techniques in Informatica such as partitioning, tuning at the source/target/transformation level, use of persistent cache, and replacing cache-based transformations wherever possible.

Designed and enabled Informatica workflows as web services using Web Service sources and targets, and exposed them for real-time processing by various third-party clients, including Java clients.

Extensive knowledge of WSDL, XML, SOAP messages and web services.

Strong experience in invoking REST web services through HTTP transformation.

Used the HTTP transformation to invoke SOAP and REST web services through URLs and to post files into various third-party applications.

Extensive knowledge of database external loaders: SQL*Loader (Oracle), LOAD (DB2), TPT, FastLoad, MultiLoad and TPump (Teradata), and Bulk Writer (Netezza).

Extensively used data modelling tools such as ERWIN to design Teradata tables and the relationships among them.

Worked extensively with Teradata utilities (FastLoad, MultiLoad, TPump and Teradata Parallel Transporter (TPT)) to load huge amounts of data from flat files into the Teradata database and FastExport to export data out of Teradata tables; created BTEQ scripts to invoke the load utilities, transform data and query the Teradata database.
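
A minimal BTEQ sketch of such a script's transform step (the logon string, names and return codes are placeholders):

    .LOGON tdprod/etl_user,etl_password

    DELETE FROM edw.orders WHERE order_dt = CURRENT_DATE;

    INSERT INTO edw.orders     /* move today's staged feed into the core table */
    SELECT order_id, customer_id, CAST(order_amt AS DECIMAL(12,2)), order_dt
    FROM   stg.orders_stg;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0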

Extensive experience using Netezza's Bulk Writer external loader to load data from flat files as well as through Informatica.

Created complex SQL queries using common table expressions (CTEs) and analytic functions such as LEAD, LAG, FIRST_VALUE and LAST_VALUE.
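
A small sketch of that style of query (hypothetical names), combining a CTE with ordered analytic functions:

    -- First and latest balance per account in a single pass
    WITH bal AS (
        SELECT account_id, balance_dt, balance_amt,
               FIRST_VALUE(balance_amt) OVER (
                   PARTITION BY account_id ORDER BY balance_dt
               ) AS opening_bal,
               LAST_VALUE(balance_amt) OVER (
                   PARTITION BY account_id ORDER BY balance_dt
                   ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
               ) AS latest_bal
        FROM edw.account_balances
    )
    SELECT DISTINCT account_id, opening_bal, latest_bal
    FROM bal;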

Extensive knowledge of the different types of dimension tables: type 1, type 2, junk, conformed, degenerate, role-playing and static dimensions.

Created complex PL/SQL programs, stored procedures, functions and triggers.

Declared cursors in stored procedures to move data between tables and to stage data temporarily while performing various DML operations.
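
A hedged sketch of that cursor pattern in a Teradata stored procedure (tables and logic are hypothetical):

    CREATE PROCEDURE edw.copy_flagged_orders ()
    BEGIN
        DECLARE v_order_id INTEGER;
        DECLARE v_done     INTEGER DEFAULT 0;
        DECLARE order_cur CURSOR FOR
            SELECT order_id FROM stg.orders_stg WHERE review_flag = 'Y';
        DECLARE CONTINUE HANDLER FOR NOT FOUND SET v_done = 1;

        OPEN order_cur;
        fetch_loop:
        LOOP
            FETCH order_cur INTO v_order_id;
            IF v_done = 1 THEN LEAVE fetch_loop; END IF;
            -- Row-at-a-time DML driven by the cursor
            INSERT INTO edw.orders_review (order_id, review_dt)
            VALUES (v_order_id, CURRENT_DATE);
        END LOOP fetch_loop;
        CLOSE order_cur;
    END;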

Extensive experience writing UNIX Korn shell scripts to invoke Informatica workflows via the pmcmd command, perform count-based data validations between source and target, and cleanse and purge source staging data using commands such as sed and awk.

Extensive knowledge of Business Objects: updated an existing universe with new data objects and classes using Universe Designer, built joins between new tables, and used contexts to avoid loops and Cartesian products.

Extensively used Jenkins for packaging and deployment of various components.

Used Rally for creating stories and tracking task status for Agile.

Extensive knowledge of scheduling tools: Control-M, UC4, Autosys (JIL scripts), Tivoli (TWS) and CRON.

Professional Experience

1) Averitt Express, Little Rock, AR

Role – Senior Teradata Developer

Duration – Apr 2015 – Current

Projects – CS90 Migration

USPS Real time Integration

Firebolt Salesforce Integration

Senior ETL/Teradata developer on a data warehouse initiative, responsible for gathering requirements, preparing the mapping document, architecting the end-to-end ETL flow, building complex ETL procedures, developing the strategy to move existing data feeds into the Data Warehouse (DW) using Teradata utilities, and performing data cleansing activities using various IDQ transformations.

Experience in converting business requirements into technical specifications.

Involved in technical design reviews.

Used Teradata load utilities (TPT, FastLoad, MultiLoad, FastExport) to load and export huge volumes of data into and out of Teradata tables.

Created complex UNIX shell scripts to invoke Teradata utilities and implement complex parsing logic.

Designed and recommended various Teradata temporary/volatile tables, choosing the primary index (PI) with both planned data access and even distribution of data across all available AMPs in mind.

Created an ELT script in Teradata to handle Type-1 and Type-2 SCD changes: a reusable script reads data from the respective source, builds a dynamic query from a template and executes it, reducing the development effort to a minimum.
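
The reusable SCD template itself is the author's; shown here, as a hedged sketch with hypothetical names, is only the general dynamic-execution mechanism Teradata provides (DBC.SysExecSQL) for running a query built from such a template:

    CREATE PROCEDURE etl.run_load_template
        (IN src_tbl VARCHAR(128), IN tgt_tbl VARCHAR(128))
    BEGIN
        DECLARE sql_txt VARCHAR(4000);
        -- Expand the reusable template with the table names passed in
        SET sql_txt =
            'INSERT INTO ' || tgt_tbl ||
            ' SELECT * FROM ' || src_tbl ||
            ' WHERE load_dt = CURRENT_DATE;';
        CALL DBC.SysExecSQL(sql_txt);
    END;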

Strong experience with PL/SQL and dynamic SQL for handling different inputs and framing DML statements.

Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, hints and SQL Trace in both Teradata and Oracle.

Excellent experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and Collect Statistics; wrote Teradata macros and used various Teradata analytic and generic functions such as NVP and TD_UNPIVOT.

Knowledge of Teradata Manager, TDWM, PMON and DBQL; extensively used SQL Analyser and wrote complex SQL queries using joins, subqueries and correlated subqueries.

Used Informatica transformations (source qualifier, expression, joiner, filter, router, update strategy, union, sorter, aggregator, normalizer, standardizer, labeler, parser, address validator (Address Doctor), match, merge, consolidation) to extract, transform, cleanse and load data from different sources into DB2, Oracle, Teradata, Netezza and SQL Server targets.

Extensively used the profiling capabilities of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) to profile various sources, generate scorecards, and create and validate rules, and provided the results to business analysts for rule definition.

Used Informatica PowerExchange to read and load data into Salesforce objects.

Created Informatica workflows and IDQ mappings for both batch and real-time processing.

Integrated Informatica with USPS using the Address Validator (Address Doctor) transformation to validate incoming address requests, and converted and published Informatica workflows as web services using Informatica's Web Service source and target and its web service provider capabilities.

Used the Web Services Consumer transformation to access various third-party web services, and the HTTP transformation to access REST GET and POST methods to download and upload attachments to different applications.

Parsed incoming MQ Series messages containing customer information and log details.

Experienced in using the Informatica data masking transformation to mask sensitive data (SSN, date of birth) when copying data from the Production environment to lower environments.

Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.

Worked on a POC using Informatica PowerExchange for Cloud to access the Workday cloud connector and read and load data into the Workday application.

Worked closely with MDM team to identify the data requirements for their landing tables and designed IDQ process accordingly.

Extensively used XML and XSD/schema files as source files; parsed incoming SOAP messages using the XML Parser transformation and created XML files using the XML Generator transformation.

Worked on performance tuning of Informatica and IDQ mappings.

Worked extensively with Oracle's external loader, SQL*Loader, to move data from flat files into Oracle tables.

Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant and Queryman.

Generated explain plans to identify bottlenecks, the query path, query cost, broadcasting in a partitioned database, and which indexes were picked.

Extensively used OLAP functions (LEAD, LAG, FIRST_VALUE, LAST_VALUE) to analyze and tag rows for type-2 processing.
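
A small sketch of the tagging step (hypothetical names; LAG is native in newer Teradata releases and can be emulated with a one-row window on older ones):

    -- Flag rows whose address differs from the prior snapshot (type-2 candidates);
    -- the first snapshot per key compares against NULL and falls through to 'N'
    SELECT customer_id,
           snapshot_dt,
           customer_addr,
           CASE WHEN customer_addr <> LAG(customer_addr) OVER (
                    PARTITION BY customer_id ORDER BY snapshot_dt)
                THEN 'Y' ELSE 'N'
           END AS changed_flag
    FROM stg.customer_snapshots;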

Extensively used PowerExchange for Mainframe to read data from mainframe VSAM/COBOL files and load it into Oracle tables.

Extensively used PowerExchange for Salesforce to read data from relational sources (Oracle) and load into Salesforce objects.

Used Netezza Bulk writer to load huge amounts of data into Netezza database.

Extensive experience in querying Salesforce objects using Workbench.

Extensively used JIRA & ServiceNow for creating requests for access, production migrations, component migrations & production related service requests.

Used Jenkins to automate packaging and deployment of various ETL, Unix components.

Scheduled jobs using Control-M manager, Autosys.

Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purging, deleting and loading data, and for the ELT process.

Created complex Business Objects reports from scratch using InfoView.

Created procedure documents describing the implementation steps for every release for all the UNIX and Informatica objects, including any catch-up process that needed to be run.

Provided on-call support for newly implemented components and the existing production environments, and made sure SLAs were met.

Environment: Teradata 15, Informatica PowerCenter 10/9.6, Informatica Data Quality (IDQ) 9.6, Informatica Data Explorer (IDE) 9.6, Informatica MDM 10.1, Data Masking, Salesforce, PowerExchange for Salesforce, Oracle 11g, DB2 10.1, SQL Server 2012, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), Teradata SQL Assistant, BTEQ, SQL Developer, SQL*Loader, Netezza, HTTP, REST web services, Bulk Writer, MQ Series, SQL Plus, Ingest, T-SQL, PL/SQL, RMS, Linux, AIX, ERWIN, Teradata modelling, Toad, WinSQL, PuTTY, UltraEdit, PowerExchange for Mainframes, XML, Rally, UC4, JIRA, Jenkins, ServiceNow, Control-M, Enterprise Manager, Autosys, TWS, JIL Scripts, Lotus Notes, Unix shell scripting, Microsoft Visio, XML Spy, Business Objects XI R3.

2) UBS Financial Services, Weehawken, NJ

Role – Senior Teradata developer

Duration – Apr 2013 – Feb 2015

Projects – Securities Backed Lending Datamart

A data warehousing initiative to fetch data related to loan accounts and the various securities submitted for loans, analyze various inputs and trigger points, and create a dedicated data mart holding “Securities Backed Account” information. The process included reaching out to various sources including the EDW, retrieving data from third-party web services (bills and payments), and standardizing, matching, consolidating and loading the data into the SBL data mart using Teradata, Informatica PowerCenter and IDQ.

Environment: Teradata 14, Informatica PowerCenter 9.6.1, Informatica Data Quality (IDQ) 9.6.1, Informatica Data Explorer (IDE) 9.6, Informatica PowerExchange for Mainframes, Oracle 11g, FastLoad, HTTP, REST web services, MultiLoad, TPump, Teradata Parallel Transporter (TPT), SQL Plus, Netezza, Bulk Writer, AIX-UNIX, T-SQL, Shell script, Autosys, JIL Scripts, SQL*Loader, RMS, ServiceNow.

3) State Farm Insurance, Bloomington, IL

Role – Senior Teradata Developer

Duration – Feb 2012 – Mar 2013

Projects – Enterprise Claims Systems DataMart

One of the largest implementations in the insurance industry, involving multiple teams: migrating 20 years of legacy data, converting the existing claims application into a modernized solution, redirecting the entire claims data flow into the new solution and integrating all the existing applications with the new application. Used Teradata and Informatica as ETL tools to perform data integration, cleansing, standardization, matching and consolidation of data.

Environment: Teradata, Informatica PowerCenter 9.5, Informatica Data Quality (IDQ) 9.5, DB2 10.1, Oracle 11g, SQL Server 2012, SQL*Loader, FastLoad, MultiLoad, TPump, FastExport, Teradata Parallel Transporter (TPT), Teradata SQL Assistant, BTEQ, WinSQL, PuTTY, SQL Plus, UltraEdit, Data Masking, PowerExchange for Mainframes, MQ Series, PowerExchange for Salesforce, XML, UC4, Rally, Autosys, JIL Scripts, Jenkins, JIRA, Unix shell scripting, XML Spy.

4) Hertz, Fort Myers, FL

Role – Teradata Developer

Duration – Jul 2007 – Nov 2009

Projects – Infor Legacy Integration

A data warehousing initiative to migrate and integrate the various sources involved in providing quotes, locations, location rentals, member rewards, member classification, rentals and discounts for the parent firm and the three firms it had newly acquired. Used Informatica PowerCenter to integrate all of the above sources and moved the data into Salesforce.

Environment: Teradata, Informatica PowerCenter 8.6, Informatica PowerExchange for Mainframes, PowerExchange for Salesforce, Oracle 10g, DB2 8.6, AIX-UNIX, SQL Plus, Shell script, SQL*Loader, LOAD, PL/SQL, PVCS, Visio.

5) St. Barnabas Healthcare, Bronx, NY

Role – Teradata Developer

Duration – Feb 2005 – May 2007

Projects – EDW

Environment: Teradata, Oracle 10g, DB2 8.6, AIX-UNIX, Korn shell, SQL Plus, Shell script, DB2 Ingest, Toad, SQL*Loader, LOAD, PL/SQL, PVCS, Visio


