Resume

Data Developer

Location:
Irving, Texas, United States
Salary:
115000
Posted:
February 26, 2018

Resume:

LakshmanaRao Dukkola

ac4maw@r.postjobfree.com

720-***-****

Professional Summary

•Over 12 years of experience in the programming, design, and implementation of business applications using the Oracle relational database management system (RDBMS and ORDBMS).

•5+ years of experience as a data modeler.

•As a modeler, responsible for gathering and translating business requirements into detailed, production-level technical specifications, and for creating robust data models, data-analysis features, and enhancements for a major capital-markets client.

•On the current project, developed workflows using Informatica PowerCenter Designer 10.1, created deployment groups, and performed end-to-end migration work.

•5+ years of experience in Business Objects (BO), Tableau, and OBIEE.

•Worked for Charles Schwab; financial-sector experience.

•Oracle Certified Associate (OCA).

•Involved in all phases of the SDLC (Software Development Life Cycle): analysis, design, development, testing, implementation, and maintenance, with timely delivery against aggressive deadlines. Also experienced with Agile methodology.

•Experience with data flow diagrams, data dictionaries, database normalization techniques, and entity-relationship modeling and design.

•Expertise in client-server application development using Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, TOAD, SQL*Loader, export/import (exp, imp), and Data Pump (expdp/impdp).

•Good Knowledge of Oracle Architecture.

•Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytical functions, Materialized Views (Logs) and Transportable tablespaces.

•Strong experience in Data warehouse concepts, ETL Implementations.

•Good knowledge of logical and physical data modeling using normalization techniques.

•Created tables, views, constraints, and indexes (B-tree, bitmap, and function-based).

•Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.

•Developed materialized views for data replication in distributed environments.

•Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP.

•Partitioned large tables using range partitioning.

•Experience with Oracle Supplied Packages such as DBMS_SQL, DBMS_JOB and UTL_FILE.

•Created Packages and Procedures to automatically drop table indexes and create indexes for the tables.

•Expertise in Dynamic SQL, Collections and Exception handling.

•Experience in SQL performance tuning using Cost-Based Optimization (CBO).

•Good knowledge of key Oracle performance related features such as Query Optimizer, Execution Plans and Indexes.

•Experience with performance tuning for Oracle RDBMS using EXPLAIN PLAN and hints.

•Experience in ETL, analysis, and reporting, including hands-on work with tools such as Tableau, Informatica, and Ab Initio.

•Data migration: involved extensively in data migration activities, extracting data from different boundaries (vendor systems), transforming it with business rules, and loading it to staging and application instances; also synchronized test environments with production for real-time data.

•Big-data exposure: familiar with the architecture, configured Apache NiFi, and began processing data with it.
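The range-partitioning experience highlighted above can be sketched with a minimal Oracle DDL example; all table, column, and partition names here are hypothetical, not from the original projects:

```sql
-- Minimal sketch (hypothetical schema): range-partition a large
-- transaction table by month, so old partitions can be managed
-- (moved, compressed, or dropped) independently of live data.
CREATE TABLE txn_history (
  txn_id    NUMBER        NOT NULL,
  txn_date  DATE          NOT NULL,
  amount    NUMBER(12,2)
)
PARTITION BY RANGE (txn_date) (
  PARTITION p_2018_01 VALUES LESS THAN (TO_DATE('2018-02-01','YYYY-MM-DD')),
  PARTITION p_2018_02 VALUES LESS THAN (TO_DATE('2018-03-01','YYYY-MM-DD')),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);
```

Queries that filter on `txn_date` are then pruned to the matching partitions, which is where most of the performance benefit on large tables comes from.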

Skills

Experienced in Data Lake, Data Warehousing, and Business Intelligence (BI) applications

Experienced in all phases of the Data warehouse project life cycle including Requirements Analysis, Dimensional Modeling, ETL and Business Intelligence

Excellent experience in ETL tools – Informatica, Talend, ODI, and SAP DS

Proficient in Self-Serve Reporting tools - Tableau, Spotfire, QlikView and Microsoft Power BI

Proficient in End-to-End Architecture design.

Extensive experience in Data Modeling, Design, Development and Master Data Management (MDM)

Experienced in data quality and data consistency

UNIX scripting and batch processing

Strong knowledge in Microsoft Azure and Cloud technologies

Technical skills

Databases : Oracle 10g/11g, SQL Server 2008R2 /2012/2014, MySQL, DB2, Teradata

NOSQL : MongoDB

Modelling : Kimball, Inmon, Hybrid

Languages : Java, .NET

BI/Data Discovery Tools : Tableau, Spotfire, Power BI, OBIEE

ETL/Data Integration Tools : Informatica, ODI

Reporting Tools : OBIEE, SAP BO

Scripting : JavaScript, jQuery

Data Tools : Erwin, Microsoft Visio

Education:

Master of Science, Computer Science, 2004, Andhra University, Andhra Pradesh, India

Professional Experience

HP Inc (Enterprise Data Governance) - Palo Alto, CA Feb 2017 – Feb 2018

Data Architect / Data Modeler

Project Description:

As a Lead Developer Analyst, worked with the business to gather and understand requirements, then took them forward into design, development, and testing with the help of an offshore team.

Wrote extensive data migration scripts to move legacy-system data into the EDW, using traditional ETL and Informatica for the migration work. Implemented the business requirements in extensive PL/SQL code. Tested the code in QA regions with the business's help to ensure data correctness before promoting to production.

Worked with the business on requirements gathering and collaborated closely with the team.

•Coded processes in PL/SQL, including dynamic views driven by parameterized user environment variables.

•Designed and developed extensive Packages for modularized code for business.

•Mapping and tracing data from system to system in order to solve a given business or system problem.

•Verified current reporting data and troubleshot the reporting suite to ensure sound data-analysis capabilities.

•Perform statistical analysis of business data.

•Collaborate with data architects for data model management and version control.

•Conduct data model reviews with project team members.

•Capture technical metadata through data modeling tools.

•Create data objects (DDL).

•Enforce standards and best practices around data modeling efforts.

•Ensure data warehouse and data mart designs efficiently support BI and end-user needs.

•Collaborate with BI teams to create reporting data structures.

•Analyzing and mining business data to identify patterns and correlations among the various data points.

•Documenting the types and structure of the business data (logical modeling).

•Data cleansing and standardization, e.g., de-duplication and garbage-value sanitization.

•Prepared Process Flow and Data Flow diagrams

•Maintained Issue logs across the Domain Towers

•Worked on Enterprise Governance processes

•Used Hadoop file processing along with Hive and Pig

Environment: UNIX, Erwin, SQL Developer Data Modeler, Oracle/MongoDB/Hadoop, Informatica, Microsoft Visio, Tableau

GE Digital – Cincinnati, OH Jul 2016 – Feb 2017

Data Architect / Data Modeler

Responsibilities:

Worked as a lead database developer in an Agile model. Developed reporting data lakes as part of this project, using Oracle, Teradata, Informatica, and TIBCO Spotfire for daily work in an onshore/offshore model covering design and development.

•Database management

•Collaborate with data architects for data model management and version control.

•Conduct data model reviews with project team members.

•Capture technical metadata through data modeling tools.

•Create data objects (DDL).

•Enforce standards and best practices around data modeling efforts.

•Ensure data warehouse and data mart designs efficiently support BI and end-user needs.

•Collaborate with BI teams to create reporting data structures.

•Designing BI Reports using Spotfire

•Developed Informatica work flows for Enterprise Data processing

•Product maintenance and Releases.

•Introduced table partitioning to the existing data model

•Application Administration and implementation at various centers and maintenance/customization of the software.

•Worked on PL/SQL performance and improved coding standards by drawing on past experience.

•Analyzed indexes.

•Defined new approaches.

•Involved in Interaction with the client for Database requirements.

•Understood the requirements and analyzed the specifications provided by the clients.

Data migration: extracted data from different boundaries (vendor systems), transformed it with business rules, and loaded it to staging and application instances; also synchronized test environments with production for real-time data.

Environment: UNIX, Erwin, Spotfire, SQL Developer, Informatica, Visio, Oracle, Teradata and SFTP/FTP

NBCU – Los Angeles, CA Sep 2015 – Jul 2016

Lead Database Developer

Responsibilities:

NBCU Dex is a theatrical distribution system. As part of the team, worked in an onsite/offshore model on data migration, data analysis, and data modeling. Capgemini won the Dex product engagement; for NBCU, we configured the changes required to suit the NBCU model.

Contribution

Worked as a lead database developer in an Agile model. Developed reporting data lakes as part of this project, using Oracle and TIBCO Spotfire for daily work in an onshore/offshore model.

As a team, delivered business-wide journal reporting at the account and unit levels, achieving centralized dashboard reporting at the enterprise level.

Worked with the business on requirements gathering and collaborated closely with the team.

Interacted with the client on database requirements.

Understood the requirements and analyzed the specifications provided by the clients.

Design and Development

Database management

Testing - unit testing & integration testing

Responsible for overseeing the Quality procedures related to the project.

Product maintenance and Releases.

Application Administration and implementation at various centers and maintenance/customization of the software.

Worked on PL/SQL performance and improved coding standards by drawing on past experience.

Analyzed indexes.

Defined new approaches.

Developed software applications using PowerBuilder 12.5 for Windows, web, and mobile platforms.

Data migration: extracted data from different boundaries (vendor systems), transformed it with business rules, and loaded it to staging and application instances; also synchronized test environments with production for real-time data.

Charles Schwab - Denver, CO Apr 2010 – Sep 2015

Data Architect / Data Modeler

Responsibilities:

As part of the NextGen and Revolution projects, proposed data replication across databases and instances using database links, Data Pump, materialized views, and active synonym switching; this helped port batch processes and data into the application from various boundaries (utilities used: export, import, load utilities, DB2 lookup, SQL, and PL/SQL). Also worked on automating existing UNIX OS jobs into nightly jobs, developing shell-script wrappers and a few Perl scripts.

•Interacted with the client on database requirements.

•Understood the requirements and analyzed the specifications provided by the clients.

•Design and Development

•Database management

•Collaborate with data architects for data model management and version control.

•Conduct data model reviews with project team members.

•Capture technical metadata through data modeling tools.

•Create data objects (DDL).

•Enforce standards and best practices around data modeling efforts.

•Ensure data warehouse and data mart designs efficiently support BI and end-user needs.

•Collaborate with BI teams to create reporting data structures.

•Testing - unit testing & integration testing

•Responsible for overseeing the Quality procedures related to the project.

•Product maintenance and Releases.

•Application Administration and implementation at various centers and maintenance/customization of the software.

•Worked on PL/SQL performance and improved coding standards by drawing on past experience.

•Analyzed indexes.

•Export, Import, SQL*Loader, SQL hints, TKPROF, shell scripting, DB2, and various Oracle built-in packages.

•Defined new approaches.

Environment: Oracle 8.x/9.x/10g/11g, RAC, UNIX, Erwin, Tableau, Informatica, Visio
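The replication approach described in this section (database links, materialized views, and active synonym switching) might be sketched as follows; the link, user, and object names are hypothetical, chosen only for illustration:

```sql
-- Hypothetical sketch: pull a remote table across a database link into a
-- nightly-refreshed materialized view, then repoint consumers via a synonym.
CREATE DATABASE LINK src_link
  CONNECT TO app_user IDENTIFIED BY app_pass USING 'SRCDB';

CREATE MATERIALIZED VIEW mv_accounts
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 1    -- complete refresh once a day
AS
  SELECT * FROM accounts@src_link;

-- "Active synonym switching": application code reads ACCOUNTS_CUR, so the
-- underlying object can be swapped without touching the application.
CREATE OR REPLACE SYNONYM accounts_cur FOR mv_accounts;
```

The synonym indirection is what makes the switch "active": a freshly loaded object can be built side by side and then exposed atomically with a single `CREATE OR REPLACE SYNONYM`.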

British Petroleum (IBM) – Bangalore, India Aug 2009 – Apr 2010

Data Architect / Data Modeler

Responsibilities:

As a team member, was responsible for

•Understood the requirements and analyzed the specifications provided by the clients.

•Design and Development

•Database management

•Testing - unit testing & integration testing

•Responsible for overseeing the Quality procedures related to the project.

•Product Releases.

•Exports, Imports and working with Global temporary and External Tables.

Environment: Oracle 10g, RAC, UNIX, Erwin.

British Petroleum (Satyam Computers) - Hyderabad, India Jun 2006 – Aug 2009

Database developer /Data modeler

Project Description

Took over responsibility for enhancing the functionality of Tr@ction, adding new features as a PL/SQL programmer along with HTML and JavaScript work.

•Understood the requirements and analyzed the specifications provided by the clients.

•Design and Development

•Database management

•Query tuning, with implementation of usage hints and TKPROF

•Monitoring the database health with STATSPACK reports and TKPROF

•Analyzed indexes

•Used PL/SQL extensively for producing billing reports and statutory reports

•Used built-in packages DBMS_LOB, XML DB, and UTL_FILE, and OS commands

•Testing - unit testing & integration testing

Environment: DB2, SQL Server 2000, Informatica Power Center, Erwin, Microsoft Visio, Rational Requisite Pro, Rational Rose, Windows 2003 Server.
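The hint-and-TKPROF tuning workflow listed above typically looks something like this; the table, index, bind variable, and trace-file names are hypothetical:

```sql
-- Trace the session, run the statement under test, then format the raw
-- trace file with TKPROF from the OS shell.
ALTER SESSION SET sql_trace = TRUE;

-- Hypothetical query steered onto a specific index with a hint.
SELECT /*+ INDEX(b bill_cust_idx) */ b.bill_id, b.amount
FROM   billing b
WHERE  b.customer_id = :cust_id;

ALTER SESSION SET sql_trace = FALSE;

-- From the OS shell (the trace file name varies per session):
--   tkprof ora_12345.trc tuned.txt sys=no sort=exeela
```

Comparing the TKPROF output before and after adding the hint shows whether the forced index plan actually reduced elapsed time and buffer gets.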

Kenan Billing (Feature Focus Infotech / IBM) – Hyderabad, India May 2005 – May 2006

Database Developer/DBA

Contribution

•Application Administration and maintenance/customization of the software.

•Updated modules as required.

•Generated information according to requirements.

•Converted data formats as needed by the client.

•Performed validation checks on the data using tools.

•Analyzed indexes

•Used PL/SQL extensively for producing billing reports and statutory reports

•Coded database triggers and stored procedures; optimized SQL code

E-Care (Phoenix IT Solutions) – Hyderabad, India Nov 2003 – Apr 2005

Database Developer

Project Description: BMS (Ver 2.1) is a neutral, open-standard, browser-based, scalable, and time-tested billing software.

Contribution

Involved in preparation of the database design document

Converted BMS data into the data formats required by several parties

Loaded the data provided by these agencies into the BMS database

Performed validation checks on data from remote meter-reading instruments

Performance tuning

Query tuning, with implementation of usage hints and TKPROF

Monitoring the database health with STATSPACK reports and TKPROF

Analyzed indexes

Used PL/SQL extensively for producing billing reports and statutory reports

Coded database triggers and stored procedures; optimized SQL code


