
Data Developer

Location:
Dublin, CA
Posted:
October 15, 2018


Professional Summary:

Over **+ years of IT experience in the development of Enterprise Data Warehouse applications using Informatica, Oracle, and Teradata.

Experience in all phases of data warehouse development, from requirements and analysis through design, development, testing, and post-production support.

Strong in-depth knowledge of data analysis, data quality, and source system analysis.

Independent, enthusiastic self-starter and team player with strong adaptability to new technologies.

Experience in Big Data technologies using Hadoop, Sqoop, Pig, and Hive.

Experience in writing Hive and Unix shell scripts.

Excellent track record in delivering quality software on time to meet business priorities.

Developed Data Warehouse/Data Mart systems using various RDBMS platforms (Oracle, MS SQL Server, Teradata, and DB2) as well as mainframe sources.

Highly proficient in Informatica Power Center and Power Exchange, with working exposure to Informatica Data Services.

Computer Expertise:

ETL Tools: Informatica, DataStage, SSIS

Databases: Teradata 12/13/14, Oracle 9i/10g/11g/12c, MySQL, SQL Server 2000/2005, MS Access, DB2, Hadoop (HDFS)

GUI: .NET custom development, Business Objects, MicroStrategy, R

Operating Systems: Windows, Unix, Linux

Languages: C#, VBScript, HTML, DHTML, JavaScript, SQL, PL/SQL, Unix Shell, Python, Hive, Pig

Web Related: ASP.NET, VBScript, HTML, DHTML, Java, JavaScript

Tools & Utilities: Teradata Parallel Transporter, Aprimo 6.1/8.x, BTEQ, SQL Assistant, Toad, SQL Navigator, SQL*Loader, $U, HP Quality Center, PVCS, DataFlux, UC4, Control-M

Domain Knowledge: Banking, Finance, Insurance, Health Care, Energy, Retail

Education

B.Tech. in Computer Science, JNTU University, Hyderabad, India

Professional Experience

Data Engineer Jan 2018 – Present

Macy's, San Francisco, CA

Project Description:

Marketing Systems: Customer Relationship Management (CRM) software is used to support, manage, and analyze customer interactions, storing information about current and prospective customers. The interface helps improve the services provided directly to customers and allows the information in the system to be used for targeted marketing and sales, creating better customer satisfaction.

Involved in requirements gathering, design, and development.

Transformed data from staging (STG) tables into final tables using Hive scripts (see the sketch after this section).

Wrote Python scripts to gather usage statistics for all edge nodes.

Used Hadoop to refine and analyze clickstream data.

Developed Sqoop jobs to integrate data from Oracle and SQL Server for application migration.

Provided operational support for current and ongoing efforts.

Wrote a program to send alerts for jobs consuming excessive resources.

Involved in the Hadoop upgrade.

Maintained real-time usage data at detail and aggregate levels. Worked with business analysts to develop productive models and advanced SQL statements.

Delivered new and complex high-quality solutions to clients in response to varying business requirements; created and managed user accounts.

Responsible for analyzing report requirements and developing reports by writing Teradata SQL queries and using MS Excel, PowerPoint, and UNIX.

Environment: Hive, Teradata, UNIX, Python, and Tableau.
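
A minimal HiveQL sketch of the staging-to-final transform described above; the databases, tables, and columns (crm_stg, crm_final, customer_clicks) are illustrative assumptions, not the actual Macy's schema.

    -- HiveQL: move refined clickstream rows from a staging table into a
    -- partitioned final table (all object names are hypothetical)
    SET hive.exec.dynamic.partition.mode=nonstrict;

    INSERT OVERWRITE TABLE crm_final.customer_clicks PARTITION (load_dt)
    SELECT
        c.customer_id,
        c.page_url,
        CAST(c.event_ts AS TIMESTAMP) AS event_ts,
        c.channel,
        c.load_dt
    FROM crm_stg.customer_clicks_stg c
    WHERE c.customer_id IS NOT NULL;   -- basic completeness filter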

Teradata Developer/Analyst May 2017 – Dec 2017

Safeway, Pleasanton, CA

Project Description:

Albertsons Merger

United Markets is an independent organization owned by Albertsons Companies. The Albertsons marketing team needed visibility into sales, promotional, and marketing data to be merged with that of the rest of the organization. EDW data from United Markets was extracted, mapped, and integrated with Albertsons data.

Worked with Safeway and United business users to capture data requirements and transformation rules between the source systems and the Albertsons EDW.

Developed mapping and design documents for sales, promotional, and marketing data.

Performed data profiling and source system analysis to understand data and quality issues.

Developed BTEQ scripts to transform data from stage to 3rd normal form (3NF) and then to aggregates (see the sketch after this section).

Tuned complex Teradata queries to meet performance-level agreements using statistics, indexes, and partitioning techniques.

Used MultiLoad, FastLoad, and BTEQ; created and modified databases and performed capacity planning.

Generated flat files from Teradata 3NF tables using the Teradata FastExport utility, then FTPed them via shell script to a different UNIX server for the application team's consumption.

Completed bulk loading and manual data entry.

Designed, recorded, and executed macros to automate data entry inputs.

Formatted spreadsheets and workbooks for print, document reproduction, and presentations.

Worked with various business users to gather reporting requirements and understand the intent of the reports, and attended meetings to provide status updates on projects.

Responsible for analyzing report requirements and developing reports by writing Teradata SQL queries and using MS Excel, PowerPoint, and UNIX.

Created multiset, temporary, derived, and volatile tables in the Teradata database.

Implemented indexes, collected statistics, and applied constraints while creating tables.

Used ODBC connectivity from MS Excel to retrieve data automatically from the Teradata database.

Designed and developed various ad hoc reports for different business teams (Teradata and Oracle SQL, MS Access, MS Excel).

Used Unix shell scripting to automate log handling, back up user-created tables, and check daily logs.

Created pivot tables in Excel by getting data from Teradata and Oracle.

Helped users by extracting mainframe flat files (fixed-width or CSV) onto a UNIX server and then loading them into Teradata tables using base R programs.

Performed in-depth analysis on data pulled for ad hoc requests and prepared graphs using MS Excel and MS PowerPoint.

Worked with the process manager to review existing business processes and helped enhance and improve them.

Environment: Teradata, Teradata Viewpoint, Teradata Studio, Excel, Unix, Hive, Pig, Python.
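
A minimal Teradata SQL sketch of the table-creation and stage-to-3NF pattern described above (multiset table, primary index, row partitioning, statistics); all object names and columns are illustrative assumptions, not the actual Safeway/Albertsons model.

    -- Teradata SQL: illustrative 3NF target with a primary index and daily
    -- row partitioning, followed by a stage-to-3NF insert and a stats refresh
    CREATE MULTISET TABLE edw.sales_txn
    (
        txn_id    BIGINT        NOT NULL,
        store_id  INTEGER       NOT NULL,
        item_id   INTEGER       NOT NULL,
        txn_dt    DATE          NOT NULL,
        sales_amt DECIMAL(18,2)
    )
    PRIMARY INDEX (txn_id)
    PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2016-01-01'
                          AND DATE '2017-12-31' EACH INTERVAL '1' DAY);

    INSERT INTO edw.sales_txn
    SELECT CAST(s.txn_id AS BIGINT),
           s.store_id,
           s.item_id,
           s.txn_dt,
           s.sales_amt
    FROM   stg.sales_txn_stg s
    WHERE  s.txn_id IS NOT NULL;

    COLLECT STATISTICS COLUMN (txn_dt), COLUMN (store_id) ON edw.sales_txn;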

Teradata Performance Engineer/ETL Developer

Bank of the West, San Ramon, CA Sep 2014 – Apr 2017

Project Description:

Customer Relationship Management (CRM), implemented with Teradata Aprimo, handles the bank's contact with its customers. CRM software is used to support these processes, storing information about current and prospective customers. The interface helps improve the services provided directly to customers and uses the information in the system for targeted marketing and sales. The CRM system compiles customer data across campaigns and across the different channels, or points of contact, with the customer.

Responsibilities:

Performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables needed stats; increased performance by 50-75% in some situations (see the sketch after this section).

Developed complex SQL queries to identify performance bottlenecks in processing.

Profound understanding of banking campaign management.

Worked with business analysts to develop productive models and advanced SQL statements.

Used MultiLoad and BTEQ; created and modified databases and performed capacity planning.

Delivered new and complex high-quality solutions to clients in response to varying business requirements; created and managed user accounts.

Refreshed data using the FastExport and FastLoad utilities.

Developed Informatica mappings for source to target loading from BODI to TP.

Worked on Aprimo Integration/Customization and configuration.

Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.

Manage spreadsheets and maintain data currency to ensure accurate data availability for managers and decision-makers.

Created pivot tables and charts using worksheet data and external resources, modified pivot tables, sorted items and group data, and refreshed and formatted pivot tables.

Designed, recorded, and executed macros to automate data entry inputs.

Formatted spreadsheets and workbooks for print, document reproduction, and presentations.

Worked with various business users to gather reporting requirements and understand the intent of the reports, and attended meetings to provide status updates on projects.

Responsible for analyzing report requirements and developing reports by writing Teradata SQL queries and using MS Excel, PowerPoint, and UNIX.

Created multiset, temporary, derived, and volatile tables in the Teradata database.

Implemented indexes, collected statistics, and applied constraints while creating tables.

Environment: Teradata, Teradata Viewpoint, Aprimo, Informatica Power Center, Excel, Oracle, PL/SQL, Windows, HP Quality Center, Unix.
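
A minimal Teradata SQL sketch of the tuning workflow described above: ask the optimizer which statistics it would use, inspect the plan, then collect them. The query and table names (crm.campaign_contact, crm.customer) are hypothetical.

    -- Teradata SQL: have the optimizer suggest missing statistics, review the
    -- EXPLAIN output, then collect the recommended stats
    DIAGNOSTIC HELPSTATS ON FOR SESSION;

    EXPLAIN
    SELECT c.campaign_id, COUNT(*) AS contact_cnt
    FROM   crm.campaign_contact c
    JOIN   crm.customer cu ON cu.customer_id = c.customer_id
    GROUP  BY c.campaign_id;

    -- statistics recommended in the plan are then collected, for example:
    COLLECT STATISTICS COLUMN (customer_id) ON crm.campaign_contact;
    COLLECT STATISTICS COLUMN (campaign_id) ON crm.campaign_contact;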

ETL and Teradata Developer

PayPal, CA Feb 2014 – Aug 2014

Project Description:

GCE (Global Credit Expansion) expands Bill Me Later services globally. The project aimed at consolidating multiple source systems into a single source of truth for BI reporting and decision support systems.

Responsibilities:

Analysis, Design, Development, Testing and Deployment of Informatica workflows, BTEQ scripts, Python and shell scripts.

Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.

Designed, developed, and tested the various mappings, mapplets, worklets, and workflows involved in the ETL process.

Developed and integrated data quality measures into the ETL framework using Informatica Data Quality (IDQ).

Performed data profiling using IDQ as input into ETL design and data modeling.

Extensively used ETL to transfer data from different source systems and load it into the target database.

Developed Informatica mappings with the collection of all sources, targets, and transformations using Informatica Designer.

Extracted data from various sources across the organization (Oracle, MySQL, SQL Server, and flat files) and loaded it into the staging area (see the sketch after this section).

Environment: Teradata, Oracle, PL/SQL, MySQL, Informatica Power Center, Power Exchange, IDQ, OCL Tool, UC4, Control-M, ER Viewer, Business Intelligence, Micro Strategy, Windows, HP Quality Center, Unix, Linux.
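
A minimal SQL sketch of the kind of staging-area data-quality check that complements the IDQ profiling described above; the staging table and its columns (stg.credit_account_stg, account_id) are illustrative assumptions.

    -- SQL: simple completeness and duplicate checks on a hypothetical staging
    -- table before the data is promoted to the integration layer
    SELECT 'null_account_id' AS dq_rule,
           COUNT(*)          AS violation_cnt
    FROM   stg.credit_account_stg
    WHERE  account_id IS NULL
    UNION ALL
    SELECT 'duplicate_account_id',
           COUNT(*)
    FROM  (SELECT account_id
           FROM   stg.credit_account_stg
           GROUP  BY account_id
           HAVING COUNT(*) > 1) d;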

ETL Developer

Maryland State, Annapolis, MD June 2010 – Jan 2014

Project Description:

The Modernized Integrated Tax System (MITS) enables the Maryland State Department of Audit Control to run analytics on tax filings. MITS extracts tax filing data for individuals and organizations, filed both electronically and manually, from multiple sources. The data is integrated in the EDW and fed to downstream applications in other state departments.

Responsibilities:

Developed Low level mappings for Tables and columns from source to target systems.

Wrote and optimized initial data-load scripts using Informatica and database utilities.

Used partitions to extract data from the source and load it into Teradata using TPT load, with proper load balancing on the Teradata server.

Wrote complex BTEQ scripts to incorporate business functionality in transforming the data from staging into 3rd normal form (see the sketch after this section).

Participated in the Teradata upgrade project from TD12 to TD13.10 and conducted regression testing.

Environment: Teradata, Oracle, PL/SQL, MySQL, Informatica Power Center, SSIS, SSRS, ER Viewer, Windows, HP Quality center, UNIX.
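
A minimal sketch of the kind of stage-to-3NF upsert that a BTEQ step might run, assuming filing_id is the target table's primary index; all table and column names are hypothetical, not the actual MITS model.

    -- Teradata SQL (run from a BTEQ step): illustrative upsert from a staging
    -- table into a 3NF target keyed on filing_id
    MERGE INTO edw.tax_filing AS tgt
    USING stg.tax_filing_stg AS src
       ON tgt.filing_id = src.filing_id
    WHEN MATCHED THEN UPDATE SET
         filing_status = src.filing_status,
         filed_dt      = src.filed_dt
    WHEN NOT MATCHED THEN INSERT
         (filing_id, taxpayer_id, filing_status, filed_dt)
    VALUES (src.filing_id, src.taxpayer_id, src.filing_status, src.filed_dt);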

CareFirst (Blue Cross Blue Shield)

Owings Mills, MD Dec 2008 – Jun 2010

Role: Senior ETL Developer

Project Description:

The project scope was to build a departmental data mart for CareFirst Human Resources and Administration. This data mart consolidates data from PeopleSoft and external vendors. Employee health plan information is integrated from PeopleSoft for all employees on the CareFirst plan as well as other offered plans. External data is integrated via file extracts on a daily basis.

Responsibilities:

Created Uprocs, Sessions, Management Unit to schedule jobs using $U.

Conducted source system analysis and developed ETL design documents to meet business requirements.

Tuned Teradata SQL queries and resolved performance issues caused by data skew and spool space problems (see the sketch after this section).

Generated flat files from Teradata using FastExport and BTEQ to disseminate data to downstream dependent systems.

Environment: Teradata, Oracle, PL/SQL, Informatica Power Center, $U, Business Objects, SSIS, Windows XP, UNIX Shell scripting.
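
A minimal Teradata SQL sketch of the kind of skew check used when chasing the data skew and spool issues mentioned above; the 'EDW' database name is an assumption.

    -- Teradata SQL: compare per-AMP permanent space for each table against the
    -- average to spot badly skewed tables
    SELECT DatabaseName,
           TableName,
           MAX(CurrentPerm) AS max_amp_perm,
           AVG(CurrentPerm) AS avg_amp_perm,
           MAX(CurrentPerm) / NULLIFZERO(AVG(CurrentPerm)) AS skew_ratio
    FROM   DBC.TableSizeV
    WHERE  DatabaseName = 'EDW'
    GROUP  BY DatabaseName, TableName
    ORDER  BY skew_ratio DESC;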

Scott & White Hospital, Temple, TX Jan 2008 – Nov 2008

Role: ETL Developer

Project Description

This project was executed to develop an enterprise knowledge data warehouse intended to ultimately deliver the right information to the right people in the underwriting organization. The system maintains claims, payments, and financial information.

Responsibilities:

Documented functional specifications and other aspects used for the development of ETL mappings.

Designed, developed, and tested the various mappings, mapplets, worklets, and workflows involved in the ETL process.

Optimized performance of existing Informatica workflows.

Involved in fixing invalid mappings; tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches, and the target data.

Environment: Oracle, SQL Server, DB2, Informatica Power Center, Erwin, Cognos, XML, Windows, Unix

XCEL Energy, Minnesota, MN Oct 2006 – Dec 2007

Role: ETL Developer

Project Description

This project was designed to integrate various data marts targeting specific business processes, including Marketing, Generation, Transmission, and Distribution. The data warehouse was designed in Erwin using a star schema methodology. Cognos was used to analyze business decisions and to build long-term strategic plans.

Responsibilities:

Developed various mappings with the collection of all sources, targets, and transformations using Informatica Designer.

Extracted data from various sources across the organization (Oracle, SQL Server, and flat files) and loaded it into the staging area (see the sketch after this section).

Created and scheduled sessions and batch processes to run on demand, on schedule, or only once using Informatica Workflow Manager, and monitored the data loads using the Workflow Monitor.

Environment: Oracle, SQL Server, PL/SQL, Informatica Power Center, Erwin, Cognos, Windows, UNIX
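
A minimal SQL sketch of the star-schema fact load implied by the mappings above: staging rows joined to dimension tables to resolve surrogate keys. All table and column names are illustrative assumptions.

    -- SQL: load a hypothetical fact table by resolving dimension surrogate keys
    INSERT INTO dw.fact_energy_usage
        (customer_key, meter_key, usage_dt, kwh_used)
    SELECT dc.customer_key,
           dm.meter_key,
           s.usage_dt,
           s.kwh_used
    FROM   stg.energy_usage_stg s
    JOIN   dw.dim_customer dc ON dc.customer_id = s.customer_id
    JOIN   dw.dim_meter    dm ON dm.meter_id    = s.meter_id;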


