Data Manager

Location:
Woodside, MA, 01532

Navaneeth Hari

Email: adk7bh@r.postjobfree.com

Phone: 508-***-****

Summary:

●6+ years of total IT experience and technical proficiency in data warehousing, covering business requirements analysis, application design, data modeling, development, testing and documentation.

●Extensive experience with Teradata 12/13/14/15.x for developing ETL and ELT architectures.

●Over 2 years of strong data warehousing experience using Informatica PowerMart 6.1/5.1/4.7, PowerCenter 9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, PowerExchange and PowerConnect as ETL tools.

●Good knowledge of dimensional data modeling: star schema, snowflake schema, fact and dimension tables.

●Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, joins and hash indexes in the Teradata database.

●Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files (a minimal BTEQ sketch follows this summary).

●Expertise in UNIX shell scripting, Autosys job scheduler, FTP, SFTP and file management in various UNIX environments.

●Involved in the full lifecycle of various projects, including requirement gathering, system design, application development, enhancement, deployment, maintenance and support.

●Strong analytical, interpersonal, leadership and decision-making skills.

●Familiar with Set, Multiset, Derived, Volatile and Global Temporary tables in Teradata for larger ad hoc SQL requests.

●Streamed big data in real time with PySpark and integrated it with Kafka and Cassandra as part of the KeyStone project.

●Used Filos GDT deployment tool for production deployment.

●Used Oracle SQL Developer to resolve replication, S2F, F2S and S2C jobs, as all ETL metadata is maintained in Oracle.

●Performed data validation, data integrity and data quality checks in Oracle and Teradata before delivering data to operations, business and financial analysts.

●Experience creating design documentation related to system specifications including user interfaces, security and control, performance requirements and data conversion.

●Experience working with both 3NF and dimensional models for data warehouses, with a good understanding of OLAP/OLTP systems.

●Involved in development of Informatica mappings with required transformations like Aggregator, Filter, Lookup, Sorter, Normalizer, Update strategy etc.

●Strong knowledge of PowerCenter Designer, Workflow Manager and Workflow Monitor.

●Created sessions and workflows using Informatica Workflow Manager.

●Proficient in preparing high/low level documents like design and functional specifications.

●Experience in working with multiple vendors and geographically distributed teams.

●Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN plans, Collect Statistics, hints and SQL Trace in both Teradata and Oracle.

●Excellent experience with ETL tools such as Informatica and Daemon MD UI, and with implementing Slowly Changing Dimensions.

●Excellent communication and interpersonal skills; proactive, dedicated and eager to learn new technologies and tools.

●Strong commitment to quality, with experience in ensuring compliance with coding standards and the review process.
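
For illustration only, a minimal BTEQ sketch of the kind of load-and-transform script referenced in the bullets above. The logon placeholder and the database, table and column names (stage_db.daily_sales_stg, edw.daily_sales) are hypothetical, not taken from any actual project.

    -- Placeholder logon; tdpid, user and password are hypothetical
    .LOGON tdpid/etl_user,etl_password

    -- Ad hoc working set in a volatile table (dropped automatically at logoff)
    CREATE VOLATILE TABLE vt_sales AS (
      SELECT store_id, sale_dt, SUM(sale_amt) AS sale_amt
      FROM   stage_db.daily_sales_stg
      GROUP  BY 1, 2
    ) WITH DATA
    PRIMARY INDEX (store_id)
    ON COMMIT PRESERVE ROWS;

    .IF ERRORCODE <> 0 THEN .QUIT 8

    -- Publish the aggregated rows to the hypothetical target table
    INSERT INTO edw.daily_sales (store_id, sale_dt, sale_amt)
    SELECT store_id, sale_dt, sale_amt
    FROM   vt_sales;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .QUIT 0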

Skill Matrix:

Tools & Utilities

TPT, BTEQ, FastLoad, Multiload, Tpump, Fast Export, SQL Assistant, Teradata Viewpoint, TSET, Teradata mapping manager, Data Pipeline, PIECompute, ETL OS.

ETL Tools

Informatica 10.2.0/9.6.1/9.5.0/9.1.0/8.6.1/8.1/7.1.3/7.0/6.2/5.1, PowerCenter, DataStage 8.7

Databases

Teradata V2R5/V2R6, V12, V14, V15; Oracle 8i/9i/10g/11g; SQL Server 2000/2005/2008/2012/2014/2016; Cassandra; DB2.

Data warehouse

Teradata, Oracle, SQL Server, Cassandra

Data Modelling

Visio, Autosys, ETL MD UI

Applications

MS Word, Excel, Outlook, FrontPage, PowerPoint, MS Visio

Reporting Tools

MicroStrategy 9.2.1/9.0/8i/7i.

Support Tools

Tivoli, Impact, Autosys, Control-M, UC4, HPQC

Other Languages / Technologies / Platforms

SQL, Java, VB, C#, UNIX, C, C++, Shell scripting (K-shell, C-shell), Scala.

Education Qualification:

●Master’s in Computer and Information Science, Southern Arkansas University, 2017.

PROFESSIONAL EXPERIENCE:

Apple, Sunnyvale, CA Jul 2019 – Present

Role: Database Engineer

●Extensively worked in Data Extraction, Transformation and loading from source to target system using BTEQ, FastLoad, and MultiLoad.

●Used the Daemon MD UI tool to replicate data to the secondary servers.

●Created tables, views and stored procedures in Teradata according to the requirements for Repair_ESL.

●Involved end to end in the application lifecycle: development, testing, and go-live/moving code into production.

●Development: developed the code based on the design using Teradata procedures, Teradata utilities such as BTEQ and FastExport, Scala, Spark, UNIX shell scripting, etc.

●Testing: created unit test plans and integration tests, implemented the test cases, and captured the results in a document for review and sign-off by the client manager.

●Go-live/production support: created the implementation plan and backout plan for the CR, provided post-production support, and supported the application in production.

●Streamed big data in real time with PySpark and integrated it with Kafka and Cassandra as part of the KeyStone project.

●Extensively worked on validating the Repair_ESL data between KeyStone and the legacy system (ESL), which included logic changes in Scala, bug fixes, etc.

●Performance tuning of stored procedures and SQL queries.

●Identified performance issues in existing sources, targets and mappings by analyzing the data.

●Extensive experience in incident management and in analysis of issues in the production environment to triage and resolve the incidents.

●Loaded data from Teradata to Cassandra using TPT Export, PIE McQueen and PIE Spark.

●Provided support during the system test, Product Integration Testing and UAT.

●Involved in Teradata server migration and setting up all the Autosys, replication jobs for loading data into Teradata and replicating the same to secondary servers.

●Part of the production support team resolving replication issues and incidents.

●Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.

●Experience creating Change Requests (CRs) to implement changes in production after testing the code in UAT/BAP, providing the performance analysis, DDLs and DMLs.

●Used Filos GDT deployment tool for production deployment.

●Used Oracle SQL Developer to resolve replication, S2F, F2S and S2C jobs, as all ETL metadata is maintained in Oracle.

●Used SQL to query the databases and pushed as much processing as possible into Teradata, applying SQL query optimization (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance (see the tuning sketch at the end of this section).

●Expertise in UNIX shell scripting, Autosys job scheduler, FTP, SFTP and file management in various UNIX environments.

●Supported different application development teams with production support, query performance tuning, system monitoring, database needs and guidance.

Environment: Teradata 15, Cassandra, PySpark, Darwin, Fulcrum, Kafka, PIE Compute, Teradata Studio Express, Oracle SQL Developer, Tpump, BTEQ, MLOAD, FLOAD, FASTEXPORT, FILOS, Autosys, UNIX.
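
For illustration only, a minimal sketch of the Teradata tuning workflow referenced above (collect statistics, inspect the EXPLAIN plan, add a secondary index); the table, column and index names (edw.repair_events, serial_no, event_dt) are hypothetical.

    -- Refresh optimizer statistics on the hypothetical join/filter columns
    COLLECT STATISTICS COLUMN (serial_no), COLUMN (event_dt) ON edw.repair_events;

    -- Inspect the plan: look for full-table scans, product joins and skewed redistribution
    EXPLAIN
    SELECT r.serial_no, r.event_dt, r.repair_status
    FROM   edw.repair_events r
    WHERE  r.event_dt BETWEEN DATE '2020-01-01' AND DATE '2020-03-31';

    -- If the date filter is a frequent access path, add a NUSI on that column
    CREATE INDEX idx_repair_event_dt (event_dt) ON edw.repair_events;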

Caesars Entertainment, Las Vegas, NV Oct 2018 – Jun 2019

Role: Teradata Developer

●Involved in requirement gathering, business analysis, design, development, testing and implementation of business rules.

●Analyzed the requirements and the existing environment to help arrive at the right load and extract strategy for the data warehouse.

●Prepared the ETL specifications, Mapping document, ETL framework specification.

●Implemented Slowly Changing Dimension logic in the mappings to effectively handle change data capture, which is typical of data warehousing systems (see the SCD sketch at the end of this section).

●Performed tuning and optimization of complex SQL queries using Teradata Explain.

●Wrote numerous BTEQ scripts to run complex queries on the Teradata database.

●Created tables, views in Teradata, according to the requirements.

●Created proper Primary Indexes taking into consideration both planned access of data and even distribution of data across all available AMPs. Responsible for performance tuning at various levels during development.

●Fine-tuned existing Teradata procedures, macros and queries to increase the performance.

●Identified performance issues in existing sources, targets and mappings by analyzing the data.

●Gathered requirements from business users and created low-level technical design documents from the high-level design document.

●Extensive experience in incident management, and in analysis of issues in the production environment to triage and resolve the incidents.

●Actively involved in UAT testing during the migration from legacy and Oracle data sources to the Netezza database.

●Logged and resolved Jira tickets for UAT defects.

●Conformed to project standards for unit/UAT testing. Carried out end-to-end testing and supported the UAT effort with immediate requirement document changes/fixes/resolutions for all changes/defects.

●Experience working with business users, vendors and architects to gather requirements as part of change requests and to speed up the process of meeting milestones for expected data delivery.

●Responsible for Formulating the DW process to load from sources to Target tables.

●Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, MultiLoad and FastLoad.

●Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks such as Command, Event Wait, Event Raise and Email.

●Responsible for migrating the workflows from development to production environment.

●Used Teradata Parallel Transporter (TPT) to simultaneously load data from multiple, dissimilar data sources into, and extract data from, the Teradata database.

●Expertise in UNIX shell scripting, Autosys job scheduler, FTP, SFTP and file management in various UNIX environments.

●Extensively worked on job scheduling and execution using Tidal scheduler as well as Informatica scheduler.

●Triaged and resolved issues in NPE Environments.

●Fine-tuned existing Teradata procedures, macros and queries to increase the performance.

●Re-designed table structures to avoid single-AMP data skew and properly organized the primary index mechanism.

●Wrote stored procedures for complex calculations and faster processing of bulk data volumes.

●Engaged other teams across NPE support for assistance in issue resolution as needed.

●Developed UNIX scripts to automate different tasks involved as part of loading process.

●Upgraded the Autosys scheduler from 4.0 to 4.5 and upgraded BusinessObjects Data Integrator directly from 6.1 to 11.5.1.18, which dynamically generates loader scripts such as MultiLoad, FastLoad and TPump.

●Extensively used Informatica PowerCenter tools and transformations such as Lookups, Aggregator, Joiner, Ranking, Update Strategy, XML Parser, JAVA Transformation and Mapplet, with connected and unconnected stored procedures/functions, SQL override usage in Lookups, and source filter usage in Source Qualifiers.

●Knowledge of Pre/Post Session and SQL commands in sessions and mappings on the target instance.

●Responsible for Performance tuning at various levels during analysis of issue.

Environment: Teradata 15, Teradata SQL Assistant, Power center, Tpump, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Informatica 9.5, Tableau, Hadoop, Hive, POWER BI, UNIX, Korn Shell scripts.
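
For illustration only, a minimal Teradata SQL sketch of the Type 2 slowly changing dimension pattern mentioned above (expire the current row, then insert the new version); the database, table and column names (edw.customer_dim, stg.customer_delta, eff_end_dt, curr_ind) are hypothetical.

    -- Step 1: expire the current dimension row when a tracked attribute changes
    UPDATE d
    FROM  edw.customer_dim AS d,
          stg.customer_delta AS s
    SET   eff_end_dt = CURRENT_DATE - 1,
          curr_ind   = 'N'
    WHERE d.customer_id = s.customer_id
    AND   d.curr_ind    = 'Y'
    AND   (s.addr_line1 <> d.addr_line1 OR s.loyalty_tier <> d.loyalty_tier);

    -- Step 2: insert a new current row for each changed or brand-new customer
    -- (relies on Step 1 having already expired the changed rows)
    INSERT INTO edw.customer_dim
      (customer_id, addr_line1, loyalty_tier, eff_start_dt, eff_end_dt, curr_ind)
    SELECT s.customer_id, s.addr_line1, s.loyalty_tier,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg.customer_delta s
    LEFT JOIN edw.customer_dim d
           ON  d.customer_id = s.customer_id
           AND d.curr_ind    = 'Y'
    WHERE  d.customer_id IS NULL;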

CBS, New York, NY Jun 2017 – Oct 2018

Role: Teradata Developer

●Involved in the complete Software Development Lifecycle (SDLC), from business analysis to development, testing, deployment and documentation.

●Used Teradata utilities Fast Load, MultiLoad, TPT to load data.

●Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.

●Conducted Knowledge Transfer (KT) sessions for Business Users and Production Support Team.

●Involved in creating Report Design Documentation and Universe Design Documentation for Support Team.

●Created proper Teradata Primary Indexes (PIs) taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering the business requirements and other factors, created appropriate Teradata NUSIs for fast, easy access to the data (see the DDL sketch at the end of this section).

●Designed the mappings between sources (external files and databases) to Operational staging targets.

●Did the performance tuning for Teradata SQL statements using Teradata Explain command.

●Used various Teradata Index techniques to improve the query performance.

●Wrote views based on user and/or reporting requirements.

●Performance tuned and optimized various complex SQL queries.

●Made sure all applications moved into the production environment met the standards set by the team and had no performance issues.

●Interacted with DBAs about SQL tuning issues and implemented the changes in the scripts as per their recommendations.

●Coordinated with the business analysts and developers to discuss issues in interpreting the requirements.

Environment: Teradata 15, Teradata SQL Assistant, Power center, Tpump, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Informatica 9.1, Tableau, Hadoop, Hive, UNIX, Korn Shell scripts.
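
For illustration only, a minimal DDL sketch of the primary index and NUSI choices described above; the table and column names (edw.order_fact, order_id, order_dt, store_id) are hypothetical.

    -- Non-unique primary index on a high-cardinality key spreads rows evenly across AMPs
    -- (a unique PI on a partitioned table would have to include the partitioning column)
    CREATE MULTISET TABLE edw.order_fact (
        order_id   BIGINT        NOT NULL,
        order_dt   DATE          NOT NULL,
        store_id   INTEGER       NOT NULL,
        order_amt  DECIMAL(12,2)
    )
    PRIMARY INDEX (order_id)
    -- Row partitioning on the date supports range scans without a full-table scan
    PARTITION BY RANGE_N(order_dt BETWEEN DATE '2015-01-01' AND DATE '2030-12-31'
                         EACH INTERVAL '1' MONTH);

    -- NUSI on a frequent filter column that is not part of the primary index
    CREATE INDEX idx_order_store (store_id) ON edw.order_fact;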

Nike Inc., Portland, OR Aug 2016 – May 2017

Role: ETL/Teradata Developer

●Gathered business requirements by meeting the customers who requested the reports.

●Created mockup reports and prototypes to get a sign off from the customer.

●Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets Designer, Informatica Repository Manager and Informatica Workflow Manager.

●Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.

●Teradata was chosen as the backend database for this reporting project, taking future enhancements into consideration.

●Gathered information from different data warehouse systems and loaded it into the warehouse using BTEQ.

●Made a noticeable contribution to the sprint burndown chart.

●Executed SQL queries to extract data from DB2 tables for running test scripts.

●Designed various mappings to extract data from sources including flat files, Oracle and SQL Server, using the Teradata utilities BTEQ, MLOAD, FLOAD and TPump to load the staging area (see the load sketch at the end of this section).

●Worked on a POC using Java, Hadoop, Hive and NoSQL databases like Cassandra for data analysis. This was the initial implementation of the Hadoop-related project, where LM wanted to see the claims loss transactions on Hadoop.

●Worked on performance tuning to optimize session performance, utilizing partitioning, pushdown optimization, and pre- and post-session stored procedures to drop and rebuild constraints.

●Created UNIX scripts for ETL jobs, log/session cleanup and dynamic parameters.

●Created and scheduled sessions and jobs to run on demand, run on time, or run only once.

●Monitored Workflows and Sessions using Workflow Monitor and Scheduler alert editor.

●Involved in creating Metadata data for BASEL and BETA Solutions.

●Performed data profiling to make sure the sources that had been identified were appropriate.

●Involved in enhancing the SAS scripts for the CCM and Rose Tracker.

●Developed Tableau data visualizations using geographic maps, pie charts, bar charts and density charts.

●Worked on the Daily Dialer reports, Call Center Management (CCM) Reports and Abandon Rate Report.

●Involved in tuning SQL in order to improve execution performance.

●Maintained the Teradata UD tables used for Rose-Tracker, Debt Protection, CCM and FNMA reports.

●Worked on the Risk Weight Assessment project.

Environment: Teradata 13 and 14, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, Informatica 8.1, Cognos, Tableau, UNIX, Korn Shell scripts, SAP BW, ECC, Teradata Analytics.
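
For illustration only, a minimal BTEQ sketch of a flat-file load into a staging table of the kind described above; the file path, logon placeholder and object names (stage_db.sales_stg, /data/in/sales.txt) are hypothetical, and FastLoad or MultiLoad would normally be preferred for high-volume loads.

    -- Placeholder logon; tdpid, user and password are hypothetical
    .LOGON tdpid/etl_user,etl_password

    -- Read pipe-delimited records from the hypothetical flat file
    .IMPORT VARTEXT '|' FILE = /data/in/sales.txt

    -- Bind each input field to a parameter and insert it into the staging table
    .REPEAT *
    USING (in_store_id VARCHAR(10), in_sale_dt VARCHAR(10), in_sale_amt VARCHAR(20))
    INSERT INTO stage_db.sales_stg (store_id, sale_dt, sale_amt)
    VALUES (:in_store_id,
            CAST(:in_sale_dt  AS DATE FORMAT 'YYYY-MM-DD'),
            CAST(:in_sale_amt AS DECIMAL(12,2)));

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .QUIT 0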

Contrive Solutions, Hyderabad, India May 2013 – Dec 2015

Role: ETL Developer

●Extraction, transformation and loading of data into the database were performed using Informatica.

●Performed tuning of Teradata SQL statements using the Teradata EXPLAIN command.

●Data was extracted from Teradata, processed/transformed using ksh programs and loaded into the data mart.

●Used various Teradata Index techniques to improve the query performance.

●Tested the ETL processes using Informatica to load data from Oracle, Flat Files, and XML Files to target Oracle Data Warehouse database.

●Created Functional design documents, Technical design documents, Source to Target specification documents, and Test plans for ETL.

●Involved in technical writing.

●Involved in unit testing, end-to-end testing of Informatica mappings and mapplets.

●Used legacy systems, Oracle, and SQL Server sources to extract the data and to load the data.

●Involved in the complete Software Development Lifecycle (SDLC), from business analysis to development, testing, deployment and documentation.

●Used Teradata utilities FastLoad, MultiLoad, TPT to load data.

●Wrote BTEQ scripts to transform data (see the transform sketch at the end of this section).

●Wrote Fast Export scripts to export data.

●Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.

●Constructed Korn shell driver routines (wrote, tested and implemented UNIX scripts).

●Conducted Knowledge Transfer (KT) sessions for Business Users and Production Support Team.

●Involved in creating Report Design Documentation and Universe Design Documentation for Support Team.

●Created proper Teradata Primary Indexes (PIs) taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering the business requirements and other factors, created appropriate Teradata NUSIs for fast, easy access to the data.

Environment: Teradata 12, Oracle 9i/10g, Teradata Visual Explain, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, UNIX, MQ, NDM, FTP, UNIX Shell Script.
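
For illustration only, a minimal BTEQ sketch of the kind of transform script mentioned above (deriving reporting columns while moving rows from staging to a mart table); the logon placeholder and object names (stage_db.policy_stg, mart_db.policy_fact) are hypothetical.

    -- Placeholder logon; tdpid, user and password are hypothetical
    .LOGON tdpid/etl_user,etl_password

    -- Derive reporting columns while moving today's rows from staging to the mart
    INSERT INTO mart_db.policy_fact (policy_id, policy_year, premium_band, premium_amt)
    SELECT p.policy_id,
           EXTRACT(YEAR FROM p.effective_dt),
           CASE
             WHEN p.premium_amt < 1000  THEN 'LOW'
             WHEN p.premium_amt < 10000 THEN 'MEDIUM'
             ELSE 'HIGH'
           END,
           p.premium_amt
    FROM   stage_db.policy_stg p
    WHERE  p.load_dt = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .QUIT 0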


