
Data Manager

Location:
Raritan, NJ, 08869
Posted:
September 01, 2017


Resume:

Naveen Reddy Tamma

*+ years of results-oriented, broad-based technology and leadership experience with a background in software development, data management, product support, and IT operations, with major clients Johnson & Johnson, John Hancock, Philips, Mercedes-Benz, and IndianOil. Experienced leader in establishing standards, delivering business solutions, and building solid working relationships with clients, staff, and senior management. Innovative management over the entire product life cycle, from architecture and design to implementation and support. By providing leadership and direction, has a proven ability to move organizations forward, yielding significant benefits to the business while managing costs and delivering quality products and services.

7 years of IT experience in ETL architecture, development, enhancement, maintenance, production support, data modeling, data profiling, and reporting, including business and system requirements gathering.

Strong expertise in Informatica PowerCenter, Informatica Cloud Real Time, IDQ, Metadata Manager, Master Data Management, Informatica Big Data Edition, Informatica Cloud, and the REST API for Informatica Cloud.

Expertise in integrating the Salesforce platform and other major cloud-based platforms using Informatica Cloud.

Designed mappings to load data from Teradata to web services using a WSDL file as the target definition.

Designed data synchronization tasks in Informatica Cloud to integrate data in real time.

1+ years of experience using Talend Integration Suite (6.1/5.x) / Talend Open Studio (6.1/5.x).

Knowledge of Spark Streaming.

Work experience with reporting tools such as Cognos and Tableau.

Hands-on experience with data governance and MDM standards.

Experience working in the Agile Scrum methodology.

Good knowledge of modern data architecture.

Relevant experience in data modeling with ERwin.

Experience with the Big Data Hadoop ecosystem.

Expertise in data modeling using Star Schema, Snowflake Schema, and 3NF for data warehousing/OLTP systems.

Worked on Teradata architecture with Cloudera data lake integration.

Experience with Oracle SQL*Loader, Export/Import, and Data Pump, and with the Teradata loading tools FastExport, MultiLoad, TPump, and TPT, plus Viewpoint, TBAR, and performance tuning.

Experienced in Oracle SQL performance tuning with Oracle tools and in performance tuning on the Teradata database.

Strong expertise in the design and implementation of extraction and transformation (ETL) processes.

Led projects from the initialization phase, capturing business requirements, through the final phases of testing and production deployment.

Served as a mentor and reviewer to junior specialists in all areas of data warehousing and ETL. Experienced in end-user training, documentation, audit record maintenance, and application/product demos.

Experience in manual testing of data warehousing applications.

Experience in using HP ALM for bug tracking and defect reporting.

Ability to quickly adapt to changing priorities, requests, and environments.

Active team player and self-starter with an immense ability to grasp and apply new concepts.

Very strong expertise in Informatica PowerCenter, Mapping Designer, and Workflow Manager.

CERTIFICATIONS

Informatica Certified PowerCenter Developer

HDP Certified Spark Developer http://bcert.me/ssdjrkot

TECHNICAL KNOWLEDGE

Data Management:

Databases: Oracle 10g/11g, Teradata 13/14/15, MS SQL Server 2003/2008

ETL and BI Tools: DataStage, Informatica, QlikView, Business Objects, Cognos, Tableau

Software Tools:

Unix: Make, Awk, SCCS, Ksh, Oracle SQL*Loader, CA ERwin; Windows: CA ERwin 4.0, Office 2010

ETL Tools

Informatica PowerCenter 8.x/9.x, Informatica Cloud, Informatica Metadata Manager & Data Quality, Big Data Edition (BDE), Talend Open Studio (TOS) for Data Integration

Databases

Oracle, MS SQL Server 2007, Netezza, Teradata & Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, TPUMP)

Methodologies & Standards:

Software Development Lifecycle (SDLC) and Agile methodologies.

Programming Languages:

C, SQL, SQL*Loader, FastLoad, MultiLoad, TPump, TD Viewpoint, TD Performance Tools, Java (J2SE)

Operating Systems:

Windows XP, Windows 2000 Server, Windows CE, Palm OS, Unix System V Rel. 4, SunOS 4.1, HP-UX, Red Hat Enterprise Linux 5; knowledge of iOS

PROFESSIONAL EXPERIENCE

JNJ, New Brunswick, NJ Jun 2014 – Present

Sr. ETL Developer

Project Description:

Johnson & Johnson faces many challenges in integrating and tracking the data of its healthcare business.

Through innovative business collaborations and strategies, we help manage healthcare technologies, project-manage certain functional activities, and assume shared financial and regulatory risk. Our healthcare managed services can be scaled to deliver improved clinical quality, increase operational efficiency, and drive innovation in a single service line or across the care delivery system.

●Worked closely with business analysts and stakeholders, identified the problem statements, and proposed solutions and tools to be used.

●Worked with data owners to identify the pain points and challenges they faced and created solutions for them.

●Created the ETL architecture to maintain a centralized, modern data warehouse hub for business users across different groups to consume.

●Curated the data, reduced/removed redundant data from source systems, and loaded only the required data into data warehouse tables.

●Explored prebuilt ETL metadata, mappings, and DAC metadata, and developed and maintained SQL code as needed for the SQL Server database.

●Imported target definitions from a WSDL file and created mappings that load data from the Teradata source.

●Created web service workflows in Workflow Manager by setting up the required authentication methods for connecting to web methods.

●Created data synchronization tasks using Informatica Cloud.

●Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, and many more.

●Analyzed source data quality using Talend Data Quality.

●Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.

●Worked on migration projects to migrate data from data warehouses on Oracle/DB2 to Netezza.

●Used SQL queries and other data analysis methods, as well as the Talend Enterprise Data Quality Platform, for profiling and comparison of data, which informed decisions on how to measure business rules and data quality.

●Worked with the Talend ETL tool; developed jobs and scheduled them in Talend Integration Suite.

●Wrote Netezza SQL queries for joins and table modifications.

●Used Talend reusable components such as routines, context variables, and global map variables.

●Tuned ETL mappings, workflows, and the underlying data model to optimize load and query performance.

●Implementing fast and efficient data acquisition using Big Data processing techniques and tools.

●Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).

●Created technical and functional solutions for managing different types of metadata across the organization.

●Reviewed the client's/business's metadata, including technical, business, operational, and project metadata, and provided the project roadmap accordingly.

●Finalized the ETL architecture for the project according to the problem statement and, upon the business stakeholders' agreement, contacted the ETL vendor to install the application and give a demo for the client.

●As an ETL architect, prepared the pre-installation checklist and worked closely with the admin team and the Informatica PowerCenter vendor on the installation.

●Provided configuration details specific to our product needs to the Informatica PowerCenter admin team.

●Actively involved in post-configuration tests of the tool; reported application issues/bugs back to the Informatica vendor and tracked the corresponding patches/updates.

●Worked with the data modeling team to get the appropriate DW model for the project (e.g., snowflake or star schema).

●Worked closely with data owners, stakeholders, and end users and, with the help of the data modeler, developed the appropriate approach (bottom-up or top-down).

●After installation and configuration of the tool, guided the Informatica PowerCenter admin in granting appropriate access to users/developers across the DEV, QA, and PROD environments.

●With the help of the Informatica admin team, identified the backup strategies and frequency for the project.

●Presented different ETL architectures to solve the problem statement for the stakeholders; upon their approval, locked in a particular architecture and created the project estimate.

●Developed technical design documents and architecture for the ETL layer and, upon approval from stakeholders, started development work.

●Created mapping documents, shared them with ETL developers, and provided knowledge transfer (KT) on the business problems, solution, and ETL architecture.

●With the Informatica PowerCenter Designer tool, created mappings and mapplets and applied different business rules to perform Extract, Transform, Load (ETL) tasks.

●Worked with the offshore team, provided mapping documents, and reviewed the mappings developed by the team.

●During the development phase, created different test cases and scenarios for unit testing (UT).

●With Informatica PowerCenter Workflow Manager, created sessions, workflows, and worklets.

●With Informatica PowerCenter Repository Manager, created deployment groups to migrate code to higher environments.

●Worked with the testing team during UAT and SIT to fix bugs and redeploy the code.

●Reviewed the DDL and DML created by the offshore team and worked with DBAs to execute them in the Dev, QA, and Prod environments after getting sign-off from the QA team.

●Created checklists for all Informatica PowerCenter/database upgrades from one version to another, did the dry runs, and changed configurations accordingly.

●Worked closely with the Informatica admin team to restart the repository service whenever required.

●Involved in Agile product backlog grooming and sprint planning and maintained the JIRA board for tasks.

●Worked with the Tidal team to schedule the jobs shared by the offshore team.
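
The data-curation step described above (dropping redundant source rows before loading the warehouse) can be sketched as follows. This is a minimal illustration, not the project's actual logic; the field names (customer_id, updated_at, name) are hypothetical.

```python
# Minimal sketch of curating source rows before a warehouse load:
# keep only the latest record per business key and drop exact duplicates.
# The field names (customer_id, updated_at) are illustrative assumptions.

def curate(rows):
    """Return one row per business key, keeping the most recent version."""
    latest = {}
    for row in rows:
        key = row["customer_id"]
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = row
    # Sort by key so the load order is deterministic.
    return sorted(latest.values(), key=lambda r: r["customer_id"])

rows = [
    {"customer_id": 1, "updated_at": "2017-01-02", "name": "A"},
    {"customer_id": 1, "updated_at": "2017-03-01", "name": "A2"},  # newer version wins
    {"customer_id": 2, "updated_at": "2017-01-05", "name": "B"},
    {"customer_id": 2, "updated_at": "2017-01-05", "name": "B"},   # exact duplicate dropped
]
curated = curate(rows)
print(curated)
```

In the real pipeline this filtering was done in ETL mappings rather than application code, but the keep-latest-per-key rule is the same idea.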

Environment: Informatica PowerCenter 8.x/8.x/10.1, Informatica Cloud Real Time, REST API, Informatica Cloud Services, Teradata 13.x/14.x/15.10.04.02, Netezza, Oracle 12c, IBM DB2, TOAD, MLOAD, SQL Server 2012, XML, SQL, Hive, Pig, HP ALM, JIRA.

JNJ, Bangalore, IND Dec 2013 – May 2014

Sr. ETL Developer/Lead

Project Description:

Johnson & Johnson faces many challenges in integrating and tracking the data of its healthcare business.

The project efficiently collects all forms of streaming data and delivers it directly to both real-time and batch processing technologies so the organization can analyze and act on it while it is still fresh and relevant.

●Created technical and functional solutions for managing different types of metadata across the organization.

●Reviewed the client's/business's metadata, including technical, business, operational, and project metadata, and provided the project roadmap accordingly.

●Worked with the Informatica admin team to set up the cloud and install all the security certificates/firewall rules needed to interact with web/Salesforce/cloud databases.

●Data loaded via the ETL mappings was consumed by different users through reporting tools such as Tableau.

●Worked with the Cloudera data lake to pull data using Impala queries and load it into data warehouse tables.

●Provided configuration details specific to our product needs to the Informatica Cloud team.

●Worked with the Salesforce team on access and security certificate creation so that Salesforce could send HTTPS requests to Informatica Cloud.

●After installation and configuration of the tool, guided the Informatica PowerCenter admin in granting appropriate access to users/developers across the DEV, QA, and PROD environments.

●Developed technical design documents and architecture for the real-time integrations and, upon approval from stakeholders, started development work.

●With the Informatica Cloud Real Time integration tool, created data synchronization tasks, data loader tasks, batch tasks, etc.

●During the development phase, created different test cases and scenarios for unit testing (UT).

●Created Salesforce/Azure/local DB connectors for the real-time data integration, triggered by batch scripts.

●Worked with the Salesforce team to create Apex classes/triggers that call the Informatica Cloud REST API for real-time integration of data to downstream systems.

●Formulated the QA plan for black-box testing of the application, including functional, regression, integration, system, and user acceptance testing.

●Wrote complex queries in Teradata SQL Assistant to analyze the data.

●Modified and executed test scripts for the web-based environment using Quality Center.

●Extensively used LoadRunner to generate automated test scripts.
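
The REST-triggered integration described above amounts to building an authenticated HTTPS POST that starts a cloud task. The sketch below only constructs such a request; the endpoint path, task id, and session header name are hypothetical placeholders, not Informatica's actual API surface.

```python
# Sketch of the kind of HTTPS call a trigger might make to start a cloud
# integration task via a REST API. The endpoint URL, task id, and session
# header below are hypothetical placeholders, not a documented API.
import json
import urllib.request

def build_job_request(base_url, task_id, session_id):
    """Build (but do not send) a POST request that starts a task by id."""
    payload = json.dumps({"taskId": task_id, "taskType": "DSS"}).encode("utf-8")
    req = urllib.request.Request(
        url=f"{base_url}/api/v2/job",      # assumed endpoint path
        data=payload,
        headers={
            "Content-Type": "application/json",
            "icSessionId": session_id,     # assumed auth header name
        },
        method="POST",
    )
    return req

req = build_job_request("https://example.invalid/saas", "task-001", "sess-123")
print(req.get_method(), req.full_url)
```

In the project itself this call came from Salesforce Apex rather than Python; the sketch just shows the shape of the request a trigger would send.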

Environment: Informatica PowerCenter 8.x/8.x/10.1, Teradata 13.x/14.x/15.10.04.02, Oracle 10g (Release 2), Oracle 11g (Release 1), Unix Red Hat, GNU/Linux, SAP, MS SQL Server 2003/2008, Cognos 8 BI, Tableau 10.0/10.1/10.2, PuTTY, WinSCP, XML files, structured and unstructured files, DB2 9.x/10.x, Informatica BDE, Informatica Metadata Manager, Oracle Instantis, JIRA.

John Hancock, Bangalore, IND Sep 2011 – Oct 2013

Senior ETL Developer

Project Description: John Hancock has clients all over the USA in its large finance and insurance business. With data growing in huge volumes every day, the organization needed a strong data warehouse system to analyze results and make the appropriate decisions to improve the business.

●Worked as a senior ETL developer/lead across different John Hancock projects and successfully completed multiple projects during my tenure with the client.

●Worked closely with clients and the onsite lead to understand the requirements and problem statements.

●Created the mapping documents and technical design documents required for the development of the project.

●Developed and reviewed mappings built in Informatica PowerCenter Designer.

●Implemented different optimization techniques, such as pushdown optimization at the target level for Teradata and TPT functionality for loading the different source systems into the staging layer.

●Implemented reusable ETL code by developing PowerCenter Designer mappings and PowerCenter Workflow Manager worklets.

●Created parameter files and parameterized most of the SQL queries, connection names, etc.

●Developed and reviewed the DDL and DML required for the project.

●Designed and developed a batch control process to run the Informatica sequences in batches.

●Extracted data from various sources with different code pages using the Teradata TPT loader (FastLoad).

●Identified the bottlenecks in the project, found the root causes, and optimized the code accordingly for better performance.

●Created the Informatica parallel jobs and sequences for extracting and loading data using various stages such as Flat File, RDBMS, Transformer, Lookup, Sorter, Remove Duplicates, Change Capture, Joiner, etc.

●Designed and developed the parallel jobs for slowly changing dimensions (SCD Type 1) and fact table loading.

●Designed and developed different types of dimensions, such as SCD Type 1/2/3, according to the business needs.

●Implemented Teradata Full PDO for optimized performance and accurate data by writing complex queries in the SQ transformation.

●Created UNIX shell scripts to execute external programs through Informatica and to move source/target files to/from the FTP location. Performed performance tuning at the source, target, and Informatica job levels (PDO), analyzing the join columns and collecting stats on each Teradata table via an Informatica pre-SQL session command.

●Created the technical specification documents and conducted walkthrough meetings with systems analysts.

●Created the migration documents and deployment groups and migrated the code to the production environment.

●Analyzed the dependencies between jobs and scheduled them accordingly using the work scheduler.

●Developed a test matrix to give a better view of the testing effort.

●Created detailed periodic status reports for senior management to keep them posted on the progress of the implementation.

●Extracted test data by understanding the relational database design diagrams created.
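
The SCD Type 2 dimension loads mentioned above follow a standard pattern: when a tracked attribute changes, expire the current dimension row and insert a new current version. Below is a minimal, self-contained sketch of that pattern; the column names (cust_id, city, eff_date, end_date, current_flag) are illustrative assumptions, not the actual dimension layout.

```python
# Minimal SCD Type 2 sketch: when a tracked attribute changes, expire the
# current dimension row and insert a new current version. Column names
# are illustrative assumptions.

HIGH_DATE = "9999-12-31"

def apply_scd2(dim_rows, incoming, load_date):
    """Apply one batch of source rows to a Type 2 dimension (list of dicts)."""
    current = {r["cust_id"]: r for r in dim_rows if r["current_flag"] == "Y"}
    for src in incoming:
        cur = current.get(src["cust_id"])
        if cur is None:
            # brand-new key: insert the first version
            dim_rows.append({**src, "eff_date": load_date,
                             "end_date": HIGH_DATE, "current_flag": "Y"})
        elif cur["city"] != src["city"]:
            # tracked attribute changed: close the old row, open a new one
            cur["end_date"] = load_date
            cur["current_flag"] = "N"
            dim_rows.append({**src, "eff_date": load_date,
                             "end_date": HIGH_DATE, "current_flag": "Y"})
        # unchanged rows are left alone
    return dim_rows

dim = [{"cust_id": 1, "city": "Boston", "eff_date": "2011-01-01",
        "end_date": HIGH_DATE, "current_flag": "Y"}]
dim = apply_scd2(dim, [{"cust_id": 1, "city": "Toronto"}], "2012-06-01")
print(dim)
```

In PowerCenter this logic lives in the mapping (lookup, expression, update strategy transformations); the sketch only shows the row-versioning rule the mapping implements.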

Environment: Informatica PowerCenter 8.x/9.x/10.1, Informatica BDE, Informatica Metadata Manager, Teradata 15, Tableau, SQL, UNIX, FTP, Ibroker & webMethods

Philips One BI, Bangalore, India Dec 2010 – Aug 2011

ETL Developer

Project Description: To improve the lighting and insurance business in the APAC region, Philips invested in developing a data mart that would help it invest accordingly, so the business could be improved for the targeted areas/audiences.

●Developed simple one-to-one mappings in Informatica based on the mapping spec provided.

●Also performed ETL testing and prepared unit test cases (UTC) for the other, more complex mappings.

●Worked on ABAP code generation for SAP source systems in Informatica.

●Used the HP ALM tool to log defects and issues for the project.

●Involved in all development phases of the project, which spanned over a year, as a junior ETL developer.

●Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.

●Extensively used SQL*Loader to load data from flat files into Oracle database tables.

●Modified existing mappings for enhancements to new business requirements.

●Used the Debugger to test the mappings and fixed the bugs.

●Wrote UNIX shell scripts and pmcmd commands for FTP of files from the remote server and backup of the repository and folders.

●Involved in performance tuning at the source, target, mapping, session, and system levels.

●Prepared the migration document to move mappings from the development repository to testing and then to production.
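
The shell scripts above typically wrap pmcmd, PowerCenter's command-line client, to start workflows. The sketch below only assembles such a command line; the service, domain, folder, and workflow names are placeholders, and in practice the password comes from an environment variable rather than a literal.

```python
# Sketch of a wrapper that assembles a pmcmd "startworkflow" command line,
# the way a shell script would before invoking it. The service/domain/
# folder/workflow names are placeholder assumptions.
def pmcmd_startworkflow(service, domain, user, folder, workflow, wait=True):
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-pv", "PMPASS",   # -pv reads the password from an env var
           "-f", folder]
    if wait:
        cmd.append("-wait")               # block until the workflow finishes
    cmd.append(workflow)
    return cmd

cmd = pmcmd_startworkflow("INT_SVC", "Domain_Dev", "etl_user",
                          "PHILIPS_BI", "wf_load_datamart")
print(" ".join(cmd))
```

A real script would pass this list to the shell (or subprocess) and check the exit code before proceeding with FTP or backup steps.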

Environment: Informatica PowerCenter 9.6, Teradata 13.x, Cognos 8 BI, Unix Red Hat, PuTTY.

Master's in Project Management

VIT University Vellore, India

Bachelor of Technology

JNTU Hyderabad, India


