
Teradata Developer with expertise in hive

Atlanta, GA
January 15, 2020





Phone: +1-470-***-****

Status: L2 EAD – authorized to work for any US employer until January 2, 2022.


•8+ years of professional IT experience in analysis, design, administration, development, deployment and maintenance of critical software applications.

•Proficient in designing, developing and supporting applications using Teradata (4 years in ETL).

•Strong skills in coding and debugging Teradata utilities such as FastLoad, FastExport and MultiLoad, and in writing technical specifications to design/redesign solutions.

•Experience migrating data from RDBMS and mainframe systems into HDFS using Sqoop.

•Expert in data-modelling concepts and design.

•Experience working with the Hortonworks distribution of Hadoop to create a data lake in Hive/HDFS.

•Good exposure to working on agile/Scrum projects.

•Good team player, experienced in leading teams and mentoring team members.

•Skilled in SQL programming and development.

•Adept in data preparation and data analytics applications using Alteryx.

•Ability to adapt to evolving technology; strong sense of responsibility and accomplishment.

•Expert in mainframe technologies such as COBOL, JCL and DB2.


Primary: SQL, Teradata, HQL (Hive), COBOL, JCL, DB2, Alteryx – Data Preparation

Secondary: UNIX, SQOOP, IMS DB, IMS DC, Informatica

Tools: Alteryx – Data Analytics tool, Automic Scheduler, Teradata SQL Assistant, QMF, Xpeditor, Debug Tool, Control-M, WinSCP, Informatica PowerCenter, Ambari.


Project - Walmart – International Data, Duration – 6 months

Data lake creation for Walmart Canada covering shipment data and associated personal information. The work involved migrating data from legacy mainframe systems and RDBMS systems such as Oracle and Teradata. The project dealt with big data: billions of historical records plus 20 to 30 million records daily.

Technology: Teradata, Hive, HQL, Sqoop, Automic Scheduler, Shell Script.

Role and Responsibilities:

•Performed the role of a data engineer.

•Discussed the requirements in each user story with the onsite technical lead and BA.

•Documented the low-level design based on the high-level flow.

•Wrote scripts to pull data from source systems (mainframe DB2, Oracle and Teradata) to the staging location.

•Wrote Sqoop import scripts to move data from the staging area into the HDFS data lake.

•Hive table creation and schema design.

•Wrote HQL-based transformations on data in the raw tables to create a curated layer.

•Created schedules in Automic Scheduler to coordinate the job workflows.

•Reviewed Teradata and mainframe loading scripts for developers.

•Gave demos of completed user stories.

•Participated in agile/Scrum ceremonies such as daily stand-ups, sprint planning, sprint reviews and sprint retrospectives.
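
A minimal sketch of the ingest steps described above, assuming hypothetical connection details, table names and paths (none of these are the project's actual values). By default the script only prints the commands it would run:

```shell
#!/bin/sh
# Dry-run by default: RUN=echo prints each command instead of
# executing it.  Set RUN="" on a host with Sqoop and Hive installed.
RUN=${RUN:-echo}

SRC_TABLE=SHIPMENTS                  # hypothetical DB2 source table
STAGE_DIR=/data/stage/shipments      # hypothetical HDFS staging path

# 1. Pull the source rows into the HDFS staging area with Sqoop.
$RUN sqoop import \
  --connect jdbc:db2://db2host:50000/SALES \
  --username loader --password-file /user/loader/.pw \
  --table "$SRC_TABLE" \
  --target-dir "$STAGE_DIR" \
  --fields-terminated-by '|' \
  --num-mappers 4

# 2. Load the staged files into the raw Hive table, then build the
#    curated layer with an HQL transformation.
$RUN hive -e "
  LOAD DATA INPATH '$STAGE_DIR' INTO TABLE raw.shipments;
  INSERT OVERWRITE TABLE curated.shipments
  SELECT ship_id, TRIM(dest_city), CAST(ship_dt AS DATE)
  FROM raw.shipments
  WHERE ship_id IS NOT NULL;"
```

In production the two steps would be wired into the Automic workflow rather than run by hand.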

Project - Walmart – Sustainability Index Reports, Duration – 12 months

For Walmart Stores Inc., the world's largest brick-and-mortar retailer, business continuity is as critical as its commitment to minimizing the environmental impact of its operations. The US government also incentivizes companies that follow sustainable business practices. Annually, Walmart sends survey requests to all its suppliers to ensure their business practices conform to Walmart's sustainability ideology. The survey results are then consolidated, cleansed and reported to the authorities as well as to the business managers. The project delivered a reporting dashboard for internal use, along with reports to be exported to external agencies.

Technology: Teradata, DB2, Alteryx Data preparation tool

Role and Responsibilities:

•Performed the role of a senior Teradata developer.

•Analysed the reports and created data to be loaded into the tables.

•Wrote custom Teradata SQL queries in Alteryx to pull data from DB2 and Teradata tables.

•Used FastExport to fetch data from Teradata tables.

•Loaded data into Teradata tables using MultiLoad and FastLoad.

•Created Alteryx workflows to load data into the DB2 tables.

•Analysed the existing report and decided on the DB2 tables required, the columns to be created and their data formats.

•Grouped the data based on the questionnaire and the answers given in the existing report.

•Created development and production DB2 and Teradata tables based on the columns in the survey results reports for both Walmart and Sam's Club.

•Created TDE (Tableau Data Extract) files for use in Tableau reports.
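
The FastLoad step can be sketched as a shell wrapper that generates a control file; the logon, table and file names below are illustrative placeholders, not the project's actual values:

```shell
#!/bin/sh
# Dry-run by default; set RUN="" on a host with the Teradata
# Tools and Utilities (TTU) installed.
RUN=${RUN:-echo}
CTL=$(mktemp)   # generated FastLoad control file

cat > "$CTL" <<'EOF'
LOGON tdpid/etl_user,password;
BEGIN LOADING stage.survey_results
  ERRORFILES stage.sr_err1, stage.sr_err2;
SET RECORD VARTEXT "|";
DEFINE supplier_id (VARCHAR(10)),
       question_id (VARCHAR(10)),
       answer      (VARCHAR(200))
  FILE = /data/in/survey_results.txt;
INSERT INTO stage.survey_results
VALUES (:supplier_id, :question_id, :answer);
END LOADING;
LOGOFF;
EOF

$RUN fastload < "$CTL"
```

FastLoad requires an empty target table, which is why MultiLoad handled the incremental loads.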

Project - Walmart – HR Analytics Staff Aug, Duration – 6 months

The Walmart HR Analytics Staff Aug project is part of the Global People Analytics (GPA) division.

•The GPA division has five pillars: Research; Social Media & Vendor Management; Data Mining & Analytics; Data Visualization & Prototyping; and Test & Learn.

•The Data Mining and Analytics team specializes in providing global customers with accurate data and insights on request, supporting important HR-related decision-making and projects.

•HR data is collected from Lawson, Smart and Time & Attendance and sent down to DB2 daily; the data in DB2 is always current.

•This data is then archived in Teradata allowing historical data to be available for analysis.

Technology: Teradata, Alteryx Data Preparation Tool

Role and Responsibilities:

•Performed the role of Teradata Developer, working directly with customer IT team.

•Analysed requests for new table creation or data extraction.

•Provided data to the analytics team by collecting information from different tables.

•Created queries with complex joins to fetch data from DB2 tables.

•Created Teradata tables according to the given specifications.

•Prepared queries to load data into Teradata tables using the conditions given in each request; loaded and extracted data using Teradata utilities such as MultiLoad, FastExport and FastLoad.

•Built a history table of all Walmart employees by combining data from multiple tables.

•Used the Alteryx data analytics tool to create workflows and to bulk-load data into Teradata tables using complex queries.
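
A sketch of the kind of multi-table join used to build the employee history table, wrapped in a BTEQ invocation; every table and column name here is invented for illustration:

```shell
#!/bin/sh
# Dry-run by default: with RUN=echo the BTEQ invocation is only
# printed, not executed against a real Teradata system.
RUN=${RUN:-echo}

$RUN bteq <<'EOF'
.LOGON tdpid/etl_user,password;
INSERT INTO hr.employee_history
SELECT e.emp_id,
       e.hire_dt,
       j.job_code,
       d.dept_name,
       p.pay_rate
FROM   hr.employee   e
JOIN   hr.job_assign j ON j.emp_id  = e.emp_id
JOIN   hr.department d ON d.dept_id = j.dept_id
LEFT JOIN (SELECT emp_id, pay_rate,
                  ROW_NUMBER() OVER (PARTITION BY emp_id
                                     ORDER BY eff_dt DESC) AS rn
           FROM hr.payroll) p
       ON p.emp_id = e.emp_id AND p.rn = 1;
.LOGOFF;
EOF
```

The derived table keeps only each employee's latest payroll row, so the history table gets at most one pay rate per employee.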

Project - Anthem – Medicare Revenue DataMart, Duration – 18 months

The purpose of the MRDM project is to create a separate DataMart for different sections of the insurance field. The project ingests mainframe source files from different regions and different vendors and loads the data into the target tables. Files arrive at different frequencies and are routed to the target tables based on the requirements.

Technology: Teradata, Informatica, Shell Script

Role and Responsibilities:

•As a senior developer on this agile project, actively participated in all the agile ceremonies (user story grooming, sprint planning, sprint retrospectives).

•Understood the stories and delivered the approach and technical documents within a week.

•Prepared technical documentation.

•Gave technical design walkthroughs directly to the customers.

•Interacted with the client tech leads to gain an understanding of technical and business scenarios.

•Provide implementation & warranty support as appropriate.

•Coordinated within the team to formulate solutions to technical challenges.

•Resolved production issues within limited timeframes.

•Requirement Analysis, Design, Coding, Unit Testing, and Implementation.

•Table and schema design

•Creating BTEQ scripts to perform import/export and transformations

•Create Informatica mappings and workflows to load from flat files.

•Create shell scripts for data and file validation.

•Deployed work products in the test environment and performed system integration testing, including fixing bugs and technical errors that came up during this phase.

•Supported the system components during system testing, integration testing and promotion of these components to the production environment.

•Provide support for User Acceptance testing.

•Performance tuning and optimizing complex queries.

•Unit test case preparation and execution.

•Application Support and Maintenance

•Experienced with the JIRA and Confluence applications.
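
The data and file validation shell scripts can be sketched as below; the feed layout (detail rows prefixed "D|", a trailer "T|<count>") is an assumed format for illustration, not the project's actual one. A tiny sample feed is built so the sketch runs stand-alone:

```shell
#!/bin/sh
# Validate that the trailer record count matches the number of
# detail rows in a delimited feed file (assumed layout).
FILE=${1:-/tmp/mrdm_feed.txt}

# Sample feed so the sketch is self-contained.
cat > "$FILE" <<'EOF'
D|1001|2018-01-05|OH
D|1002|2018-01-05|NY
T|2
EOF

detail=$(grep -c '^D|' "$FILE")            # count detail rows
trailer=$(tail -1 "$FILE" | cut -d'|' -f2)  # count claimed by trailer

if [ "$detail" -eq "$trailer" ]; then
  echo "VALID: $detail records"
else
  echo "INVALID: trailer says $trailer, file has $detail" >&2
  exit 1
fi
```

Running it on the sample feed prints `VALID: 2 records`; a mismatch makes the job fail so the load does not proceed.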

Project - Anthem WGS Claims – CS90 Migrations, Duration – 6 months

The objective of the CS90 migration project is to migrate New York claims for the following functions:

•Migrate New York claims to WGS system from CS90 system.

•Enable detail line level processing of CS90 New York claims.

•Ensure the proper processing of CS90 claims in the same way they were processed in CS90 system.

•The first group migrating is the ASO case City of Amsterdam, Case #720598. The effective date of the migration is 7/1/2016, so any claims with a date of service on or after 7/1 will be processed on WGS.

•We will be receiving claims only from NY members whose claims were originally processed in the CS90 system.

In order to achieve the above objectives, the CS90 project team enhances the WGS 2.0 Claims system by implementing certain configuration changes and thereafter testing the changes to ensure that the processes work as expected.

Technology: Mainframe

Role and Responsibilities:

•Performed the role of a senior developer.

•Analysing, designing and coding programs.

•Preparing test plans, conducting testing and delivering test results.

•Debugging COBOL and IMS DB applications.

•Development phase activities for respective tracks assigned

•Conducting post implementation reviews and support during implementations

•Help and guide the new team members in the team.

Project - EHB (Essential Health Benefits)/ODS IT, Duration – 18 months

Beginning in 2015, Large Group non-grandfathered plans are mandated to combine/commingle cost shares for deductible, co-pay and coinsurance for covered Essential Health Benefits among separate providers (Medical, Rx, Paediatric Dental, Paediatric Vision and Behavioural Health) into a single out-of-pocket maximum. Exceptions are services deemed excepted benefits (voluntary for a member to opt in or out, such as some vision and/or dental plans).

Deductible, co-pay and coinsurance cost shares for Large Group non-grandfathered plans covering Essential Health Benefits (EHBs), non-EHB covered services in-network, and out-of-network emergency services must accumulate to the in-network out-of-pocket maximum, following the product strategy.

Technology: Mainframe

Role and Responsibilities:

•Analysing, designing and coding programs.

•Preparing test plans, conducting testing and delivering test results.

•Development phase activities for respective tracks assigned

•Conducting post implementation reviews and support during implementations

•Help and guide the new team members in the team.

TMSAMS2 - Vehicle Accounting – Manufacturing Domain

TMSAMS2 is a project for Toyota, based out of Torrance, California. TMS is the North American sales and marketing arm of Toyota Motor Corporation. TMS business is broken up into various verticals: Business Support Services, Automotive, Toyota Customer Services, etc. I worked for the Automotive vertical, which involved major processing such as distribution, dealer allocation, dealer pipeline management and vehicle ordering.

I worked on enhancements related to Vehicle Distribution and Corporate Sales Reporting for TMS. I was the offshore owner of the Vehicle Accounting application, which handles all of TMS's financial transactions with its dealers and customers and has high visibility up to the CFO level of TMS.

Technology: Mainframe

Role and Responsibilities:

•Performed the role of a developer, analysing, designing and coding COBOL programs.

•Preparing Test plans, conducting testing, delivering test results.

•Delivered enhancements related to generating mainframe reports.

•Proactively identified parameters to tune application performance and resource utilization, and worked on application simplification.

•Conducting post implementation reviews and support during implementations

•Help and guide the new team members in the team.


•Bachelor of Technology (2006-2010) in Applied Electronics and Instrumentation Engineering from Adi Shankara Institute of Engineering and Technology, Kalady (Mahatma Gandhi University, Kottayam), Kerala, India.

•Completed domain certifications in manufacturing, healthcare and retail, as well as technical certifications.

Trained on:

•Teradata tools (SQL Assistant) and Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport)

•Big Data Concepts and Hive



APR 2013 – AUG 2019

Senior Software Developer


OCT 2010 – APR 2013

Systems Engineer
