
Data Sales

Location:
Manteca, CA
Posted:
July 29, 2016


Resume:

SREEKANTH NAMPALLI

Senior ETL Informatica Developer & IDQ Developer

Phone: 760-***-****

Email: acvxm5@r.postjobfree.com

EXPERIENCE SUMMARY

9+ years of IT experience in the analysis, design, development, testing and implementation of Informatica workflows for data warehouse/data mart design, ETL, and OLAP client/server applications.

ETL and data integration experience in developing ETL mappings and scripts using Informatica Power Center 9.x/8.x.

Solid understanding of ETL design principles and good practical knowledge of performing ETL design processes through Informatica.

Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C and data structures in C, UNIX scripting and Python scripting.

Strong ETL experience using Informatica PowerCenter versions 9.x/8.x/7.x/6.x: client tools (Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor) and server tools (Informatica Server, Repository Server Manager), as well as PowerMart.

Knowledge in Full Life Cycle development of Data Warehousing.

Knowledge of Advanced Programming for data transformation (JAVA, C).

Ability to write complex SQL, stored procedures and UNIX shell scripts for ETL jobs and data analysis.

Used Informatica data quality (IDQ) in cleaning and formatting customer master data.

4+ years of experience with Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.

Good exposure on Informatica Cloud Services.

Good understanding of and experience with Informatica Cloud integrations with Oracle and Salesforce.

Experience in change data capture (CDC) and a strong understanding of OLAP and OLTP concepts.

Experience across the software development life cycle (SDLC), including development, testing and migration.

Proficient with databases and data sources such as Oracle, SQL Server, IBM DB2, Teradata, Netezza, XML, Excel sheets and flat files.

Able to understand business rules fully from high-level specification documents and implement the corresponding data transformation methodologies.

Excellent working knowledge of UNIX shell scripts and scheduling of jobs by using tools like Control M, Autosys.

Extensively worked on Data migration, Data Cleansing and Data Staging of operational sources using ETL process and providing data mining features for Data warehouses.

Strong Experience in creating Transformations such as Aggregation, Expression, Update Strategy, Lookup, Joiner, Rank, Router and Source Qualifier Transformations in the Informatica Designer.

Comprehensive knowledge of Dimensional Data Modelling like Star Schema, Snowflake Schemas, Facts and Dimension Tables, Physical and Logical Data Models and knowledge in designing tools like MS Visio.

Experience in the development and implementation of database, data warehousing, client/server and legacy applications using data extraction, data transformation, data loading and data analysis.

Good understanding of and experience with source/target/field mapping and scheduling in Informatica Cloud Services, including creating connections for Oracle and Salesforce in Informatica Cloud.

Experience in maintaining Data Concurrency, Replication of data.

Familiarity with Master Data Management (MDM) concepts.

Strong knowledge of relational databases on different platforms such as Windows, UNIX and Linux, using GUI tools like SQL Developer, SQL*Plus and Microsoft Visual

Analytical and Technical aptitude with the ability to solve complex problems.

Good knowledge of DevOps and version control.

Knowledge on Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Cassandra, Oozie, Flume, Chukwa, Pentaho Kettle.

Good knowledge of the Hadoop Distributed File System (HDFS), Pig, Hive and Sqoop.

Knowledge of how to develop MapReduce programs to parse raw data, populate staging tables and store the refined data in partitioned tables in the EDW.

Good knowledge of Paxata.

Expertise in Master Data Management concepts, Methodologies and ability to apply this knowledge in building MDM solutions.

Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.

Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.

Systematic and disciplined, with an analytical and logical approach to problem solving; able to work to tight schedules and manage time efficiently.

Excellent written, communication and analytical skills, with the ability to perform independently as well as in a team; a quick learner able to meet deadlines.

Solid experience in various types of testing, such as smoke, functional, data quality, regression, performance and system integration testing.

Hands on experience in developing Type 1 and 2 dimensions, Fact Tables, Star Schema design, Operational Data Store (ODS), levelling and other Data Warehouse concepts.

Experience in installing Informatica in UNIX environment.

Experience in Informatica Administration and hands-on experience in installing, configuring, upgrading the Informatica Power Center and Data Explorer.

Experienced in handling various Informatica Power Center code migration methods (XML Export/Import, Deployment Group and Object Copy)

Experience in handling Informatica repository back-ups, restore and fail-over of services from primary to back-up nodes and vice-versa and Good understanding on file transport protocols like FTP, SFTP.

EDUCATIONAL QUALIFICATION

B.Tech in Electrical and Electronics Engineering, JNTU, India.

TECHNICAL QUALIFICATION

ETL Tools: Informatica PowerCenter 9.x/8.x, Informatica PowerExchange 9.1, B2B DX/DT v8.0, Informatica Cloud, WinSCP

Programming Languages: SQL, PL/SQL, UNIX Shell Scripting, Python.

Operating Systems: WINDOWS 7/Vista, XP/2003/2009/NT/98/95, MS-DOS, Unix/Linux

Office Applications: Microsoft Word, Excel, Outlook, Access, Project, PowerPoint

Databases: SQL Server 2008/2005, Oracle 11g/10g/9i, MS Access 2003/2007/2010, XML, XSD, Teradata, DB2, Netezza.

Professional Experience

Thermo Fisher Scientific Sep ’14- Till Date

Role: ETL Informatica Data quality Consultant

Location: Carlsbad, CA

It is an American multinational biotechnology product development company. Its customers include pharmaceutical and biotechnology companies; colleges, universities and secondary education institutions; medical research institutions; hospitals and reference labs; and quality control, process control and research and development laboratories. The company offers more than 600,000 products and services to over 350,000 customers located in approximately 150 countries.

Anatomical Pathology Division (APD): Sep ’14- Nov ‘15

The aim of the project is to build a central data warehouse that pulls data automatically from all ERP systems on a daily basis, enabling user-friendly reporting that can be used by the whole business, predominantly the Commercial, Marketing and Finance departments, through an automated suite of reports that can be standardized across divisions.

Sales Rep Tool: Nov’15- Till Date

The aim of this project is to provide sales rep information based on product availability for a given zip code. For this project, territory information from all over the world is needed for all divisions, as per the assigned rules.

Responsibilities:

Interacting with business users and business analysts to gather, understand, analyse and document the requirements for reporting.

Convert the Business requirements into technical requirement documentation (LSD and ETL Design Documentation).

Design ETL for Framework, Logging Activity, Control Tables and Error Handling by Interacting with the ETL Integration Consultants and Architects.

Developed Informatica technical mapping document to pull data from different Source systems and integrate.

Analysed the pharma data for all the products in different source system.

Experienced in Informatica Data Quality (IDQ) and PowerCenter: data cleansing, data profiling, data quality measurement, data validation processing, match, merge, weightage scoring and the deduplication process.

Workflow Recovery with mapping build for recovery, Tuning and Error Handling.

Debugging and Performance Tuning of sources, targets, mappings and sessions.

Developed several reusable transformations and mapplets, several workflow tasks.

Design, develop, configure, deploy and implement software applications, servers, packages and components customized to meet specific needs and requirements.

Extensively worked with Salesforce sources to read data and write data back into Salesforce.

Involved in Unit, Performance and Informatica DVO Testing for data comparisons and data validations.

Experienced in IDE data management and used all features of this tool.

Building profile and architect jobs in DataFlux.

Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks.

Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
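
A minimal sketch of the kind of materialized view used for a remote instance (the table, column and database link names here are hypothetical, assuming an Oracle database link to the remote source):

    -- Read-only snapshot of a remote orders table, refreshed on demand
    CREATE MATERIALIZED VIEW mv_remote_orders
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT order_id, customer_id, order_date, order_amount
      FROM   orders@remote_src;   -- database link to the remote instance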

Developed PL/SQL triggers and master tables for automatic creation of primary keys.
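
A minimal sketch of this pattern, assuming an Oracle sequence feeds the surrogate key (object names are hypothetical):

    CREATE SEQUENCE customer_seq START WITH 1 INCREMENT BY 1;

    -- Assigns the primary key automatically when none is supplied
    CREATE OR REPLACE TRIGGER trg_customer_pk
      BEFORE INSERT ON customer_master
      FOR EACH ROW
      WHEN (NEW.customer_id IS NULL)
    BEGIN
      SELECT customer_seq.NEXTVAL INTO :new.customer_id FROM dual;
    END;
    /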

Created PL/SQL stored procedures, functions and packages for moving data from the staging area to the data mart.
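
A minimal sketch of a staging-to-mart load procedure of the kind described above (table and column names are hypothetical):

    CREATE OR REPLACE PROCEDURE load_dm_customer AS
    BEGIN
      -- Insert only customers not already present in the data mart
      INSERT INTO dm_customer (customer_id, customer_name, region, load_date)
      SELECT s.customer_id, s.customer_name, s.region, SYSDATE
      FROM   stg_customer s
      WHERE  NOT EXISTS (SELECT 1
                         FROM   dm_customer d
                         WHERE  d.customer_id = s.customer_id);
      COMMIT;
    END load_dm_customer;
    /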

Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.

Created indexes on the tables for faster retrieval of the data to enhance database performance.

Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.

Performed SQL and PL/SQL tuning and Application tuning using various tools like EXPLAIN PLAN, SQL*TRACE, TKPROF and AUTOTRACE.
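
A minimal sketch of the EXPLAIN PLAN workflow mentioned above (the fact and dimension tables are hypothetical):

    EXPLAIN PLAN FOR
      SELECT d.customer_name, SUM(f.sales_amount) AS total_sales
      FROM   fact_sales f
      JOIN   dim_customer d ON d.customer_key = f.customer_key
      WHERE  f.sale_date >= DATE '2015-01-01'
      GROUP  BY d.customer_name;

    -- Review the optimizer's plan before and after adding indexes or hints
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);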

Created and maintained specifications and process documentation to produce the required data deliverables (data profiling, source-to-client maps, flows).

Involved in data profiling activities for column assessment, data inconsistency analysis and natural key study.

Extensively worked on Teradata utilities.

Worked on Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.

Identify bottlenecks and perform tuning using appropriate transformations in building mappings.

Developed various dimensions, including SCD Type 1 and Type 2, as well as facts, and reconciled the facts.
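
A minimal SQL sketch of the SCD Type 2 pattern referenced above (table and column names are hypothetical; the same logic is typically built with Lookup and Update Strategy transformations in PowerCenter):

    -- 1. Expire the current dimension row when a tracked attribute changes
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.effective_end_date = SYSDATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);

    -- 2. Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer (customer_key, customer_id, address,
                              effective_start_date, effective_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');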

Implemented CDC techniques to read latest data from source systems incrementally.
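
A minimal sketch of the incremental-extract filter behind this kind of CDC (the control table and source names are hypothetical; in PowerCenter the equivalent filter usually lives in a Source Qualifier override driven by a mapping variable or parameter file):

    -- Pull only rows changed since the last successful extract
    SELECT o.order_id,
           o.customer_id,
           o.order_amount,
           o.last_update_date
    FROM   src_orders o
    WHERE  o.last_update_date > (SELECT c.last_extract_date
                                 FROM   etl_load_control c
                                 WHERE  c.source_name = 'SRC_ORDERS');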

Created static and dynamic parameter files.

Implemented ETL solutions to bring data from the Salesforce application into the data warehouse.

Built and implemented scripts for the Oracle GoldenGate Extract, Pump and Replicat processes.

Extensively used Informatica Power Center tools and transformations such as Lookups, Aggregator, Joiner, Ranking, Update Strategy, XML Parser, JAVA Transformation, Mapplet, connected and unconnected stored procedures / functions / SQL overrides usage in Lookups / Source filter usage in Source qualifiers.

Created Pre/Post Session and SQL commands in sessions and mappings on the target instance.

Develop Informatica Mappings, Sessions and Workflows.

Trained on MDM concepts.

Worked on the SAP BODS tool to perform manipulations and transformations of huge, complex data volumes very efficiently.

Created Informatica workflow processes for daily, weekly, monthly, yearly and ad hoc loads.

Fine-tuned the data load process for performance improvement; created/replicated the SQL Server jobs that run the ETL process.

Day-to-day monitoring of newly developed and existing ETL processes.

Created Informatica Cloud Services connections.

Worked on Informatica Cloud with Oracle and Salesforce.

Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).

Used RDBMS Concepts.

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Extensively worked on database triggers, stored procedures, functions and database constraints; wrote complex stored procedures and triggers optimized for maximum performance.

Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.

Restricted data for particular users using Row level security and User filters.

Developed Tableau visualizations and dashboards using Tableau Desktop.

Developed Tableau workbooks from multiple data sources using Data Blending.

Optimized the mappings using various optimization techniques and also debugged some existing mappings using the Debugger to test and fix the mappings.

Involved in Performance Tuning.

Responsible for monitoring source code and scheduling definition deployments to various environments, such as system integration testing, user acceptance testing, and production regions.

Environment: Informatica PowerCenter 9.1 (Hotfix 6), Informatica Data Quality (IDQ), Oracle 11g, Teradata, MS SQL Server, Informatica B2B DX/DT/DTA, shell programming, MS Visio, WinSCP, Siebel, Salesforce, SAP, SAP BODS.

Client: Ryder System Inc., FL Dec '12 – Aug '14

Role: Informatica Data Quality Developer

The objective of the project was to design and develop a sales data warehouse to provide data analysis for sales, claims adjudication and product information. The ETL process involved extracting and migrating data from DB2, SQL Server, Salesforce and flat files, implementing the business logic and populating the data into the target data mart (Netezza). The project consists of three stages: staging, dimensional and fact. It is entirely a data warehouse project.

Responsibilities:

Used Informatica Power Center 9.6.1 for Extraction, Transformation and Loading data from heterogeneous source systems into the target data base.

Interacted with subject matter experts and data management team to get information about the business rules for data cleansing.

Designed and developed Technical and Business Data Quality rules in IDQ (Informatica Developer) and created the Score Card to present it to the Business users for a trending analysis (Informatica Analyst).

Analyzed the data based on requirements, wrote techno-functional documents and developed complex mappings using Informatica Data Quality (IDQ) 9.6.1 to remove data noise using Parser, Labeler, Standardizer, Merge, Match, Case Converter, Consolidation, Lookup and other transformations, and performed unit testing of data accuracy.

Used Informatica data quality (IDQ) in cleaning and formatting customer master data.

Built logical data objects (LDO) and developed various mappings, Mapplet/rules using Informatica data quality (IDQ) based on requirements to profile, validate and cleanse the data.

Worked on performance improvement areas; debugged issues and came up with proper solutions that reduce process times.

Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.

Developed advanced PL/SQL packages, procedures, triggers, functions, indexes and collections to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
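
A minimal sketch of the collection-based (BULK COLLECT / FORALL) loading pattern mentioned above (table and column names are hypothetical):

    DECLARE
      TYPE t_id_tab   IS TABLE OF stg_customer.customer_id%TYPE;
      TYPE t_name_tab IS TABLE OF stg_customer.customer_name%TYPE;
      l_ids   t_id_tab;
      l_names t_name_tab;
    BEGIN
      -- Fetch the staging rows into PL/SQL collections in one round trip
      SELECT customer_id, customer_name
      BULK COLLECT INTO l_ids, l_names
      FROM   stg_customer;

      -- Bulk-bind the inserts instead of looping row by row
      FORALL i IN 1 .. l_ids.COUNT
        INSERT INTO dm_customer (customer_id, customer_name)
        VALUES (l_ids(i), l_names(i));

      COMMIT;
    END;
    /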

Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings and mapplets.

Designed various mappings and mapplets using different transformations such as Key Generator, Match, Labeler, Case Converter, Standardizer, Address Validator, Parser and Lookup.

Configured Address doctor content on both PC and IDQ servers and helped users in building the scenarios.

Used XML PARSER Transformation to read XML data.

Used OXYGEN TOOL to create DTD files from XML data.

Responsible for creating VIEWS in SharePoint.

Expertise in UNIX shell scripting, FTP, SFTP and file management in various UNIX environments.

Effectively communicates with other technology and product team members.

Extensively used Informatica Power Center tools and transformations such as Lookups, Aggregator, Joiner, Ranking, Update Strategy, XML Parser, JAVA Transformation, Mapplet, connected and unconnected stored procedures / functions / SQL overrides usage in Lookups / Source filter usage in Source qualifiers.

Created Pre/Post Session and SQL commands in sessions and mappings on the target instance.

Develop Informatica Mappings, Sessions and Workflows.

Worked closely with all Application/Development Teams that used Control-M Scheduling.

Worked directly with Application/Development on-call to fix issues of failed jobs in Control-M.

Coordinated installation of Applications/Development jobs into Control-M.

Coordinated procedures for requests for scheduling in Control-M for all Application/Development Teams.

Created numerous reports using Control-M Report Facility Tool.

Responsible for Performance tuning at various levels during the development.

Used PMCMD command to call/run the workflows from UNIX script.

Performed performance tuning on Informatica Workflows.

Involved in data validation, data quality monitoring and data mining.

Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.

Extracted/loaded data from/into diverse source/target systems like SQL server, XML and Flat Files.

Worked using Parameter Files, Mapping Variables, and Mapping Parameters for Incremental loading.

Managed post production issues and delivered all assignments/projects within specified time lines.

Extensive use of Persistent cache to reduce session processing time.

Environment: Informatica PowerCenter 9.6.1, Informatica Data Quality (IDQ), Oracle 11g, SharePoint 2010, Oxygen 15.2, SoapUI 5.1.3, Autosys (job scheduler), SQL Server 2008 and UNIX shell scripting.

Client: Miami Children’s Hospital, Miami,FL Nov’11– Dec’12

Role: Sr. ETL Informatica Developer

Environment: Informatica Power Center 9.1, Informatica Data Quality (IDQ), Toad, Oracle 10g, PL/SQL, RDBMS, UNIX Shell Programming, Python, Control-M.

Description:

This is a foundation project to gather all patient-bill-related transactions and link these transactions appropriately to build a tree structure that helps in identifying the initiating/root transaction accurately. It involves gathering data from three different data sources and making the data available for royalty reporting. This helps in predicting future patient behaviour based on the data and also helps build reporting to support OEM contract terms.

Responsibilities:

Understanding the functional specifications and the documents related to the Architecture.

Understanding legacy system data and designing and building the target DB schema.

Identified suitable dimensions and facts for schema.

Helped users in building the scenarios.

Worked on Extraction, Transformation and Loading of data using Informatica.

Experienced in Informatica data quality (IDQ), power center, data cleansing, data profiling, data quality measurement and data validation processing.

Developed several complex mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets and parameter files in Mapping Designer, using both Informatica PowerCenter and IDQ.

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Performed profiling, matching, cleansing, parsing and redacting data using Informatica IDQ and implementing standards and guidelines.

Creating the Profiles, scorecards, custom rules and reference tables using IDQ.

Created mappings using different IDQ transformations like Address Validator, Match, Labeler, Parser, and Standardizer.

Built application interface and web scraping scripts using OO design, UML modeling and dynamic data structures. Implemented discretization and binning, and data wrangling: cleaning, transforming, merging and reshaping data frames.

Determined optimal business logic implementations, applying best design patterns.

PROJECT: Performance Metrics Central 2.0 Jul'10 – Oct’11

Client: CISCO, Milpitas, CA

Role: ETL (Informatica) Consultant

Environment: Informatica Power Center 8.6, Informatica Power Connect, RDBMS, Oracle 10g/11g, PL/SQL, Toad, UNIX scripting, OBIEE 10.1.3.4.1

Description:

Performance Metrics Central (PMC) is a “one-stop-shop” location for Cisco and Partners to review and manage partners' support operation performance on key Cisco Channel and CA partner programs for maximum profitability.

Informatica (ETL) Responsibilities

Participated in all phases including Client Interaction, Design, Coding, Testing, Release, Support and Documentation.

Interacted with Management to identify key dimensions and Measures for business performance.

Involved in defining the mapping rules and identifying required data sources and fields.

Created ER (Entity Relationship) diagrams

Extensively used RDBMS and Oracle Concepts.

Dimensional modeling using STAR schemas (Facts and Dimensions).

Generated weekly and monthly report Status for the number of incidents handled by the support team.

Worked on data conversions and data loads using PL\SQL and created measure objects, aggregations and stored in MOLAP mode.

Involved in Performance Tuning.

Worked on slowly changing dimension table to keep full history which was used across the board.

Used aggregate, expression, lookup, update strategy, router, and rank transformation.

Worked for some time in Support Activities (24*7 Production Support), Monitoring of Jobs and worked on enhancements and change requests.

Familiarity with Data Analysis and building Reports and Dashboards with OBIEE.

PROJECT Farmers Data Warehouse System (FDWS) Dec’09 – June’10

Client: Farmers Insurance

Role: ETL Developer (Offshore)

Environment: Informatica 9.1, 7.1, Erwin 4.5, RDBMS, Oracle 8i, PL/SQL, UNIX, Toad.

Description:

Multi-Line Customer Data Mart (MLCDM) is a portion of the Farmers Data Warehouse System (FDWS) that gathers new and changed source data from Farmers transaction systems (APPS/FPPS, SCV, and ECMS) into staging tables, applies data cleansing, business and data validation rules while loading the data to holding tables, and finally loads it to the Farmers Data Warehouse System. FDWS is implemented to facilitate business intelligence reporting and analysis based on an integrated view of Farmers operational systems.

Responsibilities:

Creating dimensions and facts in the physical data model.

Involved in designing the Data Mart model with Erwin using Star Schema methodology.

Used aggregate, expression, lookup, update strategy, router, and rank transformation.

Used Lookup Transformation to access data from tables, which are not the source for mapping and also used Unconnected Lookup to improve performance.

Created ftp connections, database connections for the sources and targets.

Loading Data to the Interface tables from multiple data sources such as MS Access, SQL Server, Text files and Excel Spreadsheets using SQL Loader, Informatica and ODBC connection.

Wrote a stored procedure to check source data against warehouse data; records not present in the warehouse were written to a spool table, which was then used as a lookup in the transformation.
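
A minimal sketch of that spool-table check (table and column names are hypothetical):

    CREATE OR REPLACE PROCEDURE spool_missing_policies AS
    BEGIN
      -- Source records with no matching warehouse row go to the spool table,
      -- which the mapping then reads back as a lookup source
      INSERT INTO spool_missing_policy (policy_id, customer_id, load_date)
      SELECT s.policy_id, s.customer_id, SYSDATE
      FROM   stg_policy s
      WHERE  NOT EXISTS (SELECT 1
                         FROM   dw_policy w
                         WHERE  w.policy_id = s.policy_id);
      COMMIT;
    END spool_missing_policies;
    /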

Implemented Variables and Parameters in Transformations to calculate billing data in billing Domain.

Modified the existing batch process, shell script, and Pl/Sql Procedures for Effective logging of Error messages into the Log table.

INFOSYS Hyderabad, India Jun 2007- Nov 2009

KAISER PERMANENTE – Informatica Developer

Kaiser is a non-profit health care organization with over 10 million customers in the US. It provides several health insurance plans and medical services. CDW, one of the main warehouses relating to claims, is part of the business requirement.

Roles/Responsibilities:

Convert the business requirements into technical requirement documentation (LSD and ETL Design Documentation).

Perform analysis and validation on the DB2 data received from the California track and the QCare-related Oracle data from Colorado, and integrate the various source systems.

Design ETL for Framework, Logging Activity, Control Tables and Error Handling by Interacting with the ETL Integration Consultants and Architects.

Developed Informatica technical mapping document to pull data from different Source systems and integrate.

Workflow Recovery with mapping build for recovery, Tuning and Error Handling.

Debugging and Performance Tuning of sources, targets, mappings and sessions.

Developed Slowly Changing Dimension mappings of Type II.

Created complex mappings using various transformations and tasks, and fine-tuned them.

Assisted with exception handling, records, arrays, partitioning of tables and bulk collects.
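
A minimal sketch of the range-partitioning piece of this (table and partition names are hypothetical):

    CREATE TABLE fact_claims (
      claim_id     NUMBER,
      member_id    NUMBER,
      claim_date   DATE      NOT NULL,
      claim_amount NUMBER
    )
    PARTITION BY RANGE (claim_date) (
      PARTITION p_2008 VALUES LESS THAN (DATE '2009-01-01'),
      PARTITION p_2009 VALUES LESS THAN (DATE '2010-01-01'),
      PARTITION p_max  VALUES LESS THAN (MAXVALUE)
    );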

Create RFC and SRs for any enhancements for deployment and get approvals.

Do health checks by ensuring the source data and target data are accurate and valid.

Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.

Environment: Informatica 8.6.1, Oracle RDBMS 10g, PL/SQL, DB2, WinSQL, MS SQL Server, shell programming, MS Visio, WinSCP, VSS, TOAD 9, UNIX AIX, Cygwin, Tivoli, Erwin, Remedy User.


