Venugopal Reddy Adla
Fallon, NV 89406
Email: ****.*********@*****.***
Mobile Phone: 919-***-****
STATUS: U.S. CITIZEN
OBJECTIVE: Seeking an opportunity as an Informatica PowerCenter 9.5 ETL Developer with Teradata 13, as a Hadoop Developer (Pig Latin, Hive, HBase, Impala, Oozie, YARN, Sqoop, NiFi), or as a SQL Server SSIS/SSAS/SSRS Developer.
PROFESSIONAL SUMMARY
Over 15 years of experience in software development. Experienced in full life-cycle development, including requirements gathering, analysis, design, development, testing, deployment to production, and post-production support, across Oracle ERP applications, Informatica 9.5 (ETL) data warehousing, and Hadoop development for big data solutions.
TECHNICAL SKILLS
ERP Applications: Oracle Financials and HCM R12; Oracle Financials 11i, 10.7 (AP, AR, GL, Fixed Assets, Order Entry). Front-End Tools: Oracle Developer 2000 (Oracle Reports 2.5/3.x/5.0/6.x), Discoverer (Oracle Browser 1.0, 4.0), Crystal Reports 6.5/8.x, Toad, SQL Navigator. Data Warehousing: Informatica 9.5 (ETL); Tableau dashboard knowledge.
Databases: Oracle 7.3, 8.x, 9i, 10g, MS Access, SQL Server 8/12, Teradata v13, Netezza, DB2, IBM Data Studio v4.1.3
Operating Systems: UNIX, Windows NT, Windows 9x, Windows 2000, Windows XP
SDLC Methods: Agile Scrum and Waterfall methodologies
Platforms: Windows, Linux
Languages and Tools: Informatica 9.5/10.2 (ETL), SQL*Plus, PL/SQL, SQL*Loader, SQL Server ETL tool (SSIS). Job Scheduler: crontab
Reporting Tools: Crystal Reports, Cognos, SQL Server Reporting Services, Tableau
Hadoop Tools: Pig, Hive, HBase, Sqoop, Oozie.
Bank of America, Charlotte, NC / Apex Systems May 2019 to August 2019 Application Architect V / Informatica Developer
• Involved in a data migration project from Netezza v7.0 to Exadata v12, moving source data into the data warehouse.
• Worked on Ariba procurement data migration for the Vendor Data Warehouse and Vendor Administrative Data Warehouse projects.
• Prepared data sources for PowerCenter v10.2.
Highmark Inc., Harrisburg, PA / Computer Aid Inc. August 2018 to September 2018 Informatica Analyst/Programmer
• Worked on the Providers Data Transformation Informatica project: brought in third-party source data, transformed it in Informatica, and loaded it into target data warehouse tables for a downstream MDM project. Used PowerCenter 10.1 to transform DB2 source data into the DB2 warehouse.
• Created Expression transformations to validate null-value and data-format columns, using the ERROR and ABORT functions to reject bad rows and roll back data.
• Created mapplets for reuse across different mappings. Dropped indexes for bulk loads and recreated them after the load using pre- and post-session SQL.
• Fixed non-fatal errors by correcting the transformations.
• Tuned performance by implementing pushdown optimization and session partitioning.
• Fixed bottleneck issues in Informatica workflow session executions.
• Good knowledge of handling session properties for sources and targets, and of error-handling issues.
• Used crontab to schedule Unix batch jobs that invoke pmcmd to execute workflows (see the sketch below).
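A minimal sketch of this crontab/pmcmd pattern, assuming hypothetical service, domain, folder, and workflow names:

    # crontab entry (placeholder schedule): launch the nightly load at 2:00 AM
    0 2 * * * /opt/informatica/scripts/run_wf_nightly.sh >> /var/log/wf_nightly.log 2>&1

    #!/bin/sh
    # run_wf_nightly.sh: start an Informatica workflow through pmcmd.
    # Service, domain, folder, and workflow names below are placeholders.
    pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f FOLDER_DW -wait wf_nightly_load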
CTRL-S, Hyderabad, India January 2010 to October 2017 Hadoop Developer / Teradata / Informatica Developer
• Completed a project with the Telangana State Health Department to study patients with seasonal diseases, using Informatica to build a data warehouse and Tableau for dashboard reports. Also did ETL development using SQL Server SSIS/SSAS and Tableau dashboard/scorecard reporting.
• Teradata was used both in the Hadoop project and as a standalone ETL project. Utilized the BTEQ utility to run SQL scripts in batch and interactive modes (a sketch follows below).
• Knowledgeable in all Teradata utilities: FastLoad, MultiLoad, TPump, TPT, and FastExport. Tuned Teradata using EXPLAIN and COLLECT STATISTICS on table columns.
• Good knowledge of Teradata architecture: PE, BYNET, AMP, VDISK.
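A minimal sketch of a batch BTEQ run of this kind, assuming placeholder host, credentials, and table names:

    #!/bin/sh
    # Run a Teradata SQL script in batch through BTEQ; logon values are placeholders.
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password
    SELECT COUNT(*) FROM patient_stg.visits;  /* hypothetical staging table */
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF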
• Evaluated business requirements and prepared detailed specifications, following project guidelines, for the programs to be developed.
• Responsible for building scalable distributed data solutions using Hadoop.
• Imported data from SQL Server into Hive and HDFS using Sqoop, for both one-time and daily loads (see the Sqoop sketch after this list).
• Worked on a big data Hadoop environment spanning multiple nodes.
• Developed MapReduce programs and Pig scripts to aggregate daily eligible and qualified transaction details and store them in HDFS and Hive.
• Implemented Hive tables and HQL queries for the reports.
• Performed data validation using Hive dynamic partitioning and bucketing.
• Involved in extracting data from Hive and loading it into an RDBMS using Sqoop.
• Performed extensive data validation using Hive.
• Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
• Worked on exporting the results to SQL Server, where they were used to generate business reports.
• Tuned HiveQL and Pig scripts to improve performance; good experience troubleshooting performance issues and tuning the Hadoop cluster.
• Loaded flat files from HDFS paths into local Informatica (ETL) file-system directories, then used them as source files in Informatica PowerCenter for transformation and loading into the final destination database for downstream decision making.
• Used Teradata 13 utilities: FastLoad, MultiLoad, TPump, FastExport, BTEQ, TPT.
• Used EXPLAIN and COLLECT STATISTICS for Teradata performance tuning.
• Used Informatica PDO (pushdown optimization) and session partitioning for better performance.
• Created procedures and macros in Teradata.
• Used BTEQ for SQL scripts and batch scripts, and created batch programs using shell scripts.
• Developed Apache Pig scripts to process HDFS data; created Hive tables to store the processed results in tabular format.
• Developed Sqoop scripts to move data between the Pig/HDFS layer and the MySQL database.
• Developed script files for processing data and loading it into HDFS; wrote HDFS CLI commands; developed Unix shell scripts to create reports from Hive data.
• Ran cron jobs to delete Hadoop logs, old local job files, and cluster temp files; set up Hive with MySQL as a remote metastore.
• Moved all log/text files generated by various products into an HDFS location, and created external Hive tables on top of the parsed data.
• Worked on different phases of the data warehouse development lifecycle, from mappings to extracting data from various sources into tables and flat files. Created reusable objects such as mapplets and reusable transformations for business logic.
• Worked on transformations such as Rank, Expression, Aggregator, and Sequence Generator.
• Experienced with complex mappings using Expression, Router, Lookup, Aggregator, and Filter transformations; worked with Update Strategy and Joiner transformations in Informatica, session partitioning, cache memory, and connected and unconnected lookups.
• Used Teradata 13 and Oracle databases with the Informatica DW tool to load source data.
• Created Teradata schemas with constraints and created macros in Teradata; loaded data using the FastLoad utility; created functions and procedures in Teradata.
• Experienced in writing SQL queries, PL/SQL programming, and query-level performance tuning.
• Developed and tested database sub-programs (packages, stored procedures, functions) according to business and technical requirements.
• Generated DDL scripts for creating new database objects such as tables, views, sequences, functions, synonyms, and indexes; developed packages and stored procedures; created roles and granted privileges.
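A minimal sketch of the Sqoop import described above, assuming hypothetical connection details, credentials, and table names:

    #!/bin/sh
    # Import a SQL Server table into Hive via Sqoop; all names are placeholders.
    sqoop import \
      --connect "jdbc:sqlserver://dbhost:1433;databaseName=claims" \
      --username etl_user --password-file /user/etl/.pw \
      --table daily_transactions \
      --hive-import --hive-table staging.daily_transactions \
      --num-mappers 4

The same tool handles the reverse direction (sqoop export) when loading Hive results back into the RDBMS.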
Freelancing, Hyderabad, India August 2007 to December 2009
• Oracle Apps Tech Trainer: taught SQL, PL/SQL, and Oracle ERP Applications.
• Received training in Informatica, Teradata, and Hadoop.
Global Knowledge, Cary, NC October 2003 to July 2007 Senior Software Engineer
• Responsible for analysis of production Oracle data extracts; worked closely with business and technical support staff to identify data anomalies, data trends, and other production data issues.
• Developed and tested database sub-programs (packages, stored procedures, functions) according to business and technical requirements.
• Formatted data extracts using Excel and presented the extracted data to client management for review. Used SQL*Loader for data transfer.
• Involved in providing data from Oracle procedures to Oracle Advanced Queues.
• System messages were sent to different retail stores throughout the country using Oracle Advanced Queuing technology.
• Used Oracle's UTL_FILE utility to extract data as flat files directly on the server. Involved in performance tuning, profiling, and monitoring of SQL via Explain Plan, TKPROF, and SQL Trace.
• Wrote shell scripts to automate the SQL*Loader process for loading data into tables (a sketch follows this list). Used SQL Profiler to monitor server performance and debug slow-running T-SQL queries.
• Processed change requests through the Remedy ticketing system, including application-wide data changes. Responsible for deploying changes to production and non-production environments.
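A minimal sketch of that SQL*Loader automation, assuming hypothetical paths, credentials, and control-file names:

    #!/bin/sh
    # Automate a SQL*Loader run; paths, credentials, and names are placeholders.
    DATAFILE=/data/in/extract_$(date +%Y%m%d).dat
    sqlldr userid=etl_user/etl_pass@ORCL \
      control=/etc/loader/customers.ctl \
      data="$DATAFILE" \
      log=/var/log/sqlldr/customers.log bad=/var/log/sqlldr/customers.bad
    # sqlldr returns a non-zero exit code on warnings or errors.
    [ $? -eq 0 ] || echo "sqlldr failed for $DATAFILE"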
Oracle Corporation July 1998 to November 2002 Full-Time Employee
Client: City of Chicago, Chicago November 2001 to November 2002
Responsibilities:
• Led the technical team in data conversion for the HRMS application (Vacancies, Requisitions, Employee, Applicant, Employee Applicant, Address, SIT, Assignment, Salary, Terminate, Element, Fed Tax, NACHA, State Tax) using SQL*Loader.
• Designed and developed custom reports using Oracle Discoverer; developed interfaces using PL/SQL.
• Used SQL*Loader for data conversion.
• Developed and tested database sub-programs (packages, stored procedures, functions) according to stated business and technical requirements.
• Used the cost-based approach to calculate the execution plan with the lowest cost; used syntax-based rules to determine precedence.
• Gathered statistics using the ANALYZE command.
• Used Trace and TKPROF as part of the tuning strategy (see the sketch below).
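A minimal sketch of that trace-and-TKPROF workflow, assuming placeholder credentials and a hypothetical trace-file name:

    #!/bin/sh
    # Turn on SQL trace for a session, run the statement under study,
    # then format the raw trace file with TKPROF.
    sqlplus -s apps/apps_pass@PROD <<EOF
    ALTER SESSION SET SQL_TRACE = TRUE;
    SELECT COUNT(*) FROM per_all_people_f;  -- example HRMS query
    ALTER SESSION SET SQL_TRACE = FALSE;
    EOF
    # Trace-file name is a placeholder; sort by parse/execute/fetch elapsed time.
    tkprof /u01/app/oracle/trace/prod_ora_12345.trc /tmp/report.txt sort=prsela,exeela,fchela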
Oracle Corporation / Chicago Public School System, Chicago August 2001 to November 2001
Responsibilities:
• Involved in data conversion for Oracle Applications using SQL*Loader.
• Created custom interfaces using SQL*Plus and PL/SQL.
• Designed and developed custom reports using Oracle Reports, SQL, and Discoverer; modified existing Oracle reports using Oracle Discoverer and Oracle Reports.
• Developed test cases and executed tests for database programs such as packages, stored procedures, and functions according to business requirements.
• Tuned PL/SQL code for faster performance.
• Created indexes on tables to optimize the PL/SQL scripts.
• Used rule-based and cost-based optimizer approaches on SQL statements for different scenarios and chose the best optimization approach for each query.
Oracle Corporation / Client: Michigan State (Welfare Department), East Lansing, MI December 2000 to July 2001
Responsibilities:
• Participated in developing custom interfaces for a custom application using PL/SQL.
• Created custom reports using Oracle Discoverer.
• Created procedures, functions, and packages in PL/SQL.
• Created exceptions to handle errors.
• Created documentation for interfaces and data conversions.
• Involved in performance tuning, profiling, and monitoring of SQL via Explain Plan, TKPROF, and SQL Trace.
• Developed temporary tables to load legacy data; validated data in the staging tables by writing triggers, stored procedures, functions, and packages.
• Used Toad and SQL Navigator for writing and executing queries.
Oracle Corporation / Client: American Trans Air, Indianapolis May 2000 to November 2000
Responsibilities:
• Involved in data conversion for the Oracle HRMS application using SQL*Loader. Created custom reports using Oracle Discoverer and Oracle Reports 6i.
• Developed PL/SQL interfaces based on functional requirements.
• Involved in performance tuning, profiling, and monitoring of SQL via Explain Plan, TKPROF, and SQL Trace.
• Converted data from legacy systems using SQL*Loader on the Unix operating system.
• Registered SQL*Loader scripts and interfaces in Oracle Applications so they could be executed from within the application.
• Used SQL Navigator for creating and executing SQL scripts.
• Developed ad hoc Oracle reports using SQL*Plus for client management, per requirements.
• Used Oracle's UTL_FILE utility to extract data as flat files residing on the server (see the sketch below).
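A minimal sketch of a UTL_FILE extract of this kind, assuming a hypothetical Oracle directory object, credentials, and table:

    #!/bin/sh
    # Write rows to a server-side flat file with UTL_FILE.
    # DATA_DIR must be an Oracle directory object; all names are placeholders.
    sqlplus -s hrms_user/hrms_pass@HRMS <<'EOF'
    DECLARE
      f UTL_FILE.FILE_TYPE;
    BEGIN
      f := UTL_FILE.FOPEN('DATA_DIR', 'employees.dat', 'w');
      FOR r IN (SELECT employee_number, full_name FROM per_all_people_f) LOOP
        UTL_FILE.PUT_LINE(f, r.employee_number || '|' || r.full_name);
      END LOOP;
      UTL_FILE.FCLOSE(f);
    END;
    /
    EOF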
Oracle Corporation / Client: Wake County School System, Raleigh, NC August 1999 to April 2000
Responsibilities:
• Developed PL/SQL interfaces for 401(k) and other third-party vendors based on functional requirements for the Oracle HRMS application.
• Created new reports and modified existing reports using Oracle Discoverer.
• Designed and developed PL/SQL procedures and packages for Oracle 11i HRMS.
• Developed and customized Oracle HR/Payroll reports using Oracle Reports 6i.
• Registered custom interfaces and packages in AOL.
• Involved in system development life cycle phases such as design, development, testing, and deployment to production.
Oracle Corporation / Client: Lucent Technologies, Reading & Allentown, PA July 1998 to July 1999
Responsibilities:
• Implemented Integrated Circuits Information Solutions for Lucent Technologies by integrating Oracle Manufacturing Applications (WIP, MRP, BOM) with legacy systems using SQL*Loader and Unix.
• Performed data mapping for WIP, MRP, and BOM; customized Work in Process, Material Requirements Planning, and Bills of Material conversions and interfaces. Used Intersolv's PVCS for software configuration management. Created report sets for the Manufacturing modules.
• Designed custom reports using Discoverer.
• Created custom interfaces with PL/SQL in Oracle Purchasing and customized Oracle reports to suit client needs.
• Used Oracle's UTL_FILE utility to extract data as flat files residing on the server.
• Involved in performance tuning, profiling, and monitoring of SQL via Explain Plan, TKPROF, and SQL Trace.
ASC, Inc. / FFM, Inc., Rocky Mount, NC February 1997 to June 1998 Consultant
Responsibilities:
• Gathered requirements, created functional design documents, and obtained approval from business partners before development.
• Implemented Accounts Receivable, GL, and Fixed Assets from legacy systems.
• Customized Fixed Assets and Accounts Receivable reports.
• Converted legacy systems into Oracle and built interfaces using PL/SQL in Oracle Financials (GL, AP, AR).
• Converted legacy systems into Accounts Payable and Purchase Order using SQL*Loader.
• Created SQR reports against the Oracle database.
• Created custom inbound and outbound interfaces with PL/SQL.
• Modified Oracle Seeded Reports to suit the client needs.
• Developed data mapping documents to bring data from external systems.
• Developed documents based on the systems development life cycle after consulting with the client.
Bellcan Services / Unifi, Inc., Greensboro, NC November 1995 to February 1997 Consultant
Responsibilities:
• Implemented AP, GL, FA, and OE from legacy systems using SQL*Plus, PL/SQL, and SQL*Loader.
• Customized FA, GL, and AP reports. Converted legacy systems into Oracle and built interfaces with Oracle Financials using PL/SQL.
• Developed interfaces using PL/SQL for Oracle Purchasing and customized purchasing reports to suit client requirements.
• Made extensive use of SQL*Loader to bring legacy data into Oracle Applications for a first-time client.
WTW, Inc., RTP, NC / Natlsco, Chicago, IL November 1994 to October 1995 Consultant
Responsibilities:
• Implemented a Laboratory Management System (HP-LIMS) using an Oracle database.
• Made extensive use of LIMS, a Laboratory Information Management System developed by HP.
• Implemented an Order Supply application using Oracle Forms 4.0 and Oracle Reports 2.0.
• Worked on Interactive Voice Response System and telephone-integration projects.
• Developed interfaces using PL/SQL and ad hoc reports with SQL*Plus.
CDSI / US EPA, RTP, NC September 1993 to October 1994 Consultant
Responsibilities:
• Implemented the Scientific Work Planning and Publication modules using Oracle 6.x and 7.x; developed basic forms and reports using PowerBuilder.
• Used PL/SQL for interfaces and SQL*Plus to create ad hoc reports.
Saint Augustine’s College, Raleigh, NC, August 1988 to July 1993 Assistant Professor
• Taught computer courses to undergraduate students.
• Taught COBOL, FORTRAN, MIS, and Systems Analysis & Design.
Saint Paul’s College, Lawrenceville, VA, August 1986 to July 1988 Assistant Professor
• Taught computer courses to undergraduate students.
• Taught COBOL, FORTRAN, MIS, and Systems Analysis & Design.
EDUCATION:
• M.S., Computer Science, Jackson State University, Jackson, Mississippi, 1986
• M.S., Food Science & Technology, Alabama A & M University, Huntsville, Alabama, 1982
• B.Sc., Agricultural Science, Andhra Pradesh Agricultural University, 1977