
Data architect

Location:
Frisco, TX
Salary:
70/hr
Posted:
September 10, 2020


Sandeep Tellapati

*************@*****.*** ~ Cell: 614-***-****

Data Architect | Data Modeler | ETL Lead Developer | Data Warehousing | Business Intelligence

Accomplished Software Developer with 16+ years of success in all phases of the Software Development Life Cycle (SDLC), driving design, implementation, testing, integration, deployment, and maintenance of scalable, high-performance software applications and systems. Creative and detail-focused problem solver with outstanding organizational and analytical skills; experienced working effectively with on-site and offshore teams. Adept at defining business needs and translating them into cutting-edge designs and products that strengthen work processes and improve operational efficiency. Technical proficiencies include…

Languages

C, C++, Core Java, PL/SQL, XML, Pro*C, Assembly Language - 8051, 8085, 8086

GUI

Oracle Developer Suite - Oracle Forms

BI

Working knowledge of OBIEE, Oracle Reports, MSTR

Platforms

Windows NT, Windows 2000, MS DOS, Solaris, HP-UX, Linux, AIX

Databases

Oracle 7.x, 8, 8i, 9i, 10g, 11g, 12c, MS Access

ETL

Ab Initio GDE, Informatica PowerCenter 10.1, Informatica IDQ 10.1, Talend for Big Data and Data Integration

Servers

Oracle 10g Application Server

Utilities

CA Erwin Data Modeler (Erwin 9.64), Microsoft Visio

Testing Tools

RTRT (Rational Test Real Time)

Tools

Quest Technologies Toad (Tool for Oracle Developers), Oracle SQL*Loader, UNIX Shell Scripting, MS Project Plan, SQL Navigator

Version Control

CM Synergy, Visual Source Safe, Clear Case, CVS

Trainings

Currently attending the Big Data Architect program at EDUREKA

Professional Experience

Wipro Technologies (Charles Schwab) - Westlake, TX DEC 2019 – Present

ETL ARCHITECT

Global Data Technology

- Coordinated with the business on requirements and translated business needs into high-level technical designs.

- Worked on data integration projects that move data from the USAA platform to the Charles Schwab platform.

- Created data dictionaries and data mappings for consuming applications.

- Created data flow diagrams to show the flow of data between various systems.

- Mentored and managed offshore teammates.

- Modeled conceptual, logical, and physical models for transactional systems and dimensional models for data warehouses / data marts using Erwin for various projects.

- Developed new Informatica ETL processes to load data into the IDW warehouse and HDMs.

- Developed new user-maintained tables and supporting ETL to load data into type-2 SCD tables.
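
A type-2 SCD load of the kind described above can be sketched in two steps: close out the current dimension row when a tracked attribute changes, then insert the new version. All table and column names here are hypothetical, for illustration only.

```sql
-- Step 1: end-date the current row for customers whose address changed.
UPDATE dim_customer d
   SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);

-- Step 2: insert a new current row for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

Because step 1 flips the flag to 'N' first, step 2's NOT EXISTS picks up both changed customers and new ones in a single pass.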

- Performed code reviews and change request reviews for the team.

- Used the pushdown optimization feature to speed up ETL processes.

- Used various Teradata utilities such as BTEQ, MLOAD, and FLOAD.

- Helped analyze applications and resolved dependencies with the USAA data migration.

- Led the Wipro offshore team.

Environment: Teradata 15.10, PL/SQL, SQL, TFS, UNIX (Sun Solaris), Teradata SQL Assistant, Nexus, SFTP, Informatica 10.1, ESP / Control-M, Erwin 9.64, Jira

Ocwen Financial Services - Coppell, TX DEC 2015 – DEC 2019

DATA ARCHITECT – DATA SOLUTIONS ARCHITECT

SENIOR DATABASE ANALYST - Mar 2015 to Dec 2015 (Soft world Inc - Waltham, MA)

Data Management / Data Architecture and Solutions Delivery:

- Coordinated with the business on requirements and translated business needs into high-level technical designs.

- SME on the Black in the Black (BITB) loss mitigation application, which replaced Ocwen's Loan Resolution and Modification (LRM) system.

- Created data dictionaries and data mappings for consuming applications. Documented the current state and created future-state architectures as part of moving Ocwen's servicing platform from Real Servicing to Black Knight MSP.

- Created enterprise data architecture diagrams to show the flow of data between various systems.

- As part of the merger, prepared a roadmap for sunsetting redundant applications and migrating/integrating retained applications with the new servicing system.

- Mentored and managed offshore teammates on projects I led.

- Modeled conceptual, logical, and physical models for transactional systems and dimensional models for data warehouses / data marts using Erwin for various projects.

- Created Data Quality (DQ) rules on inbound and outbound data between Ocwen and third-party applications.

- Created IDQ Data Processor transformations to load data from JSON files into the ODS.

- Developed Informatica ETL processes to load data into the Ocwen data warehouse from various third parties and departments within Ocwen.

- Wrote PL/SQL procedures and packages and developed other database objects such as tables, synonyms, sequences, and triggers; used exception handling extensively for ease of debugging and to display error messages in the application.

- Developed PL/SQL packages, functions, and procedures to implement business logic using bulk collect, bind variables, ref cursors, and dynamic SQL.
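
The combination named in the bullet above (bulk collect, bind variables, ref cursors, dynamic SQL) can be sketched roughly as follows; the loans table, its columns, and the batch size are all hypothetical.

```sql
DECLARE
  TYPE t_ids IS TABLE OF loans.loan_id%TYPE;
  l_ids  t_ids;
  l_cur  SYS_REFCURSOR;
  l_sql  VARCHAR2(200) :=
         'SELECT loan_id FROM loans WHERE status = :st';
BEGIN
  OPEN l_cur FOR l_sql USING 'ACTIVE';        -- dynamic SQL with a bind variable
  LOOP
    FETCH l_cur BULK COLLECT INTO l_ids LIMIT 500;  -- batch fetch, not row by row
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT              -- bulk DML back to the SQL engine
      UPDATE loans SET reviewed = 'Y' WHERE loan_id = l_ids(i);
  END LOOP;
  CLOSE l_cur;
END;
/
```

The LIMIT clause keeps PGA memory bounded while still cutting context switches between the PL/SQL and SQL engines.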

- Created indexes on tables to improve performance by eliminating full table scans, and views to hide the underlying tables and reduce the complexity of large queries.

- Multithreaded daily and month-end Oracle processes to improve SLAs.

- Helped analyze applications and resolved dependencies with Altisource during the data center migration to Ocwen.

- Automated processes to reduce manual errors by developing Groovy scripts in various Windows environments.

Environment: Oracle 12c, PL/SQL, SQL, TFS, Groovy, UNIX (Sun Solaris), SQL*Loader, SQL Plus, SQL Developer, SFTP, Informatica 10.1, Informatica IDQ 10.1, Control-M, Axway, Erwin 9.64

INFOVISION21 INC – DUBLIN, OH

SENIOR PROGRAMMER - ANALYST, July 2006 to March 2015

Develop, maintain, and upgrade cutting-edge software for financial, telecom, and insurance leaders. Coordinate the design and implementation of applications, collaborating with project managers, engineering teams, and client representatives to ensure on-time completion of project deliverables. Provide technical leadership to junior engineers, developing the overall project architecture. Design back-end elements for web-based systems and applications. Selected projects include…

Harley Davidson Financial Services - Plano, TX JAN 2011 - MAR 2015

Loan Origination and Management System (Daybreak LMS):

An enterprise-wide solution that automates various aspects of financing for the consumer lending industry, ranging from origination to servicing, collection, and floor planning, and that processes installment loans, credit lines, and operating leases.

Responsibilities

- Coordinated with the business on requirements and translated business needs into technical designs.

- Developed PL/SQL packages, functions, and procedures to implement business logic using bulk collect, bind variables, ref cursors, and dynamic SQL.

- Developed packages to perform the following:

Load and process files from external institutions.

Auto-approve / decline financing for customers based on the credit report.

- Developed packages to create CSV reports:

Inventory log of sold and unsold bikes for a given time interval.

Flash report showing the summary of repossessed assets.

- Wrote PL/SQL procedures, packages, and materialized views, and developed other database objects such as tables, synonyms, sequences, and triggers.

- Developed new Oracle Forms and Oracle Reports as enhancements to the existing application.

- Modified an existing Pro*C application to adhere to FDIC compliance for credit bureau reporting.

- Handled errors using exception handling extensively for ease of debugging and to display error messages in the application; used SQLCODE and SQLERRM for exception error codes.
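
The SQLCODE/SQLERRM pattern mentioned above typically looks like the sketch below; process_payment and the error_log table are hypothetical names used only to illustrate the shape.

```sql
BEGIN
  process_payment(p_loan_id => 1001);   -- hypothetical business procedure
EXCEPTION
  WHEN OTHERS THEN
    -- capture the Oracle error code and message for debugging
    INSERT INTO error_log (log_ts, err_code, err_msg)
    VALUES (SYSTIMESTAMP, SQLCODE, SUBSTR(SQLERRM, 1, 4000));
    RAISE;                              -- re-raise so the caller still sees the failure
END;
/
```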

- Developed logical and physical data models using Erwin for the credit decision engine.

- Designed and developed scripts to upload XML credit decision data from the credit union into the database, performing business validations on the data with PL/SQL procedures using XQuery.

- Generated trace files and analyzed them using TKPROF with the EXPLAIN PLAN option.

- Created indexes on tables to improve performance by eliminating full table scans, and views to hide the underlying tables and reduce the complexity of large queries.

- Used PRAGMA AUTONOMOUS_TRANSACTION in procedures that need to commit outside the current transaction.
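
An autonomous-transaction procedure of the kind referenced above is usually a small logger; the audit_log table here is hypothetical.

```sql
CREATE OR REPLACE PROCEDURE log_audit (p_msg VARCHAR2) IS
  PRAGMA AUTONOMOUS_TRANSACTION;   -- runs in its own transaction
BEGIN
  INSERT INTO audit_log (log_ts, msg) VALUES (SYSTIMESTAMP, p_msg);
  COMMIT;                          -- commits only this procedure's work
END;
/
```

Because the commit is isolated, the audit row survives even if the calling transaction later rolls back.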

- Automated processes to reduce manual errors by developing shell scripts and using crontab in various UNIX environments.

- Performed unit testing of developed artifacts and interacted with the QA team to provide support during the testing phase.

Environment: Oracle 11g, PL/SQL, SQL, PRO*C, PVCS, UNIX (Sun Solaris), SQL*Loader, SQL Plus, Toad, SFTP, Control M.

Fannie Mae - Herndon, VA APR 2009 – DEC 2010

The Financial Services group under SIR is responsible for analyzing, designing, developing, and providing test and production support for business functionality that enables Fannie Mae to create securities from loans in its single-family portfolio. Financial Services functionality includes tracking loan principal and interest payments that must be passed through to investors, and creating and storing official analytical and reporting data for each financial asset at the end of each period.

Responsibilities

- Created objects such as stored procedures and functions to perform tasks like disabling/enabling constraints and marking indexes unusable before/after loading data into stage tables.

- Created partitioned tables (range/list/hash/composite) to speed up query response time.
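
A range-partitioned table of the sort described above might look like this; the table and partition names are illustrative.

```sql
CREATE TABLE loan_txn (
  txn_id  NUMBER,
  txn_dt  DATE,
  amount  NUMBER(12,2)
)
PARTITION BY RANGE (txn_dt) (
  PARTITION p_2009 VALUES LESS THAN (DATE '2010-01-01'),
  PARTITION p_2010 VALUES LESS THAN (DATE '2011-01-01'),
  PARTITION p_max  VALUES LESS THAN (MAXVALUE)   -- catch-all partition
);
```

Queries that filter on txn_dt can then be pruned to a single partition instead of scanning the whole table.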

- Created global temporary tables to hold transformed data before loading it into permanent tables.

- Developed high-performance queries and PL/SQL code using collections, bulk collects, materialized views, objects, nested tables, REF cursors, indexing, etc.

- Worked extensively with joins, analytic functions, aggregate functions, and sub-queries for reporting purposes.

- Developed a partition exchange procedure that swaps a corrupted partition of a table with the corrected partition.
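
The core of a partition-exchange repair like the one above is a single DDL statement: load the corrected rows into a staging table with the same structure, then swap it with the bad partition. Names below are hypothetical.

```sql
-- loan_txn_fixed holds the corrected rows and matches loan_txn's structure.
ALTER TABLE loan_txn
  EXCHANGE PARTITION p_2010
  WITH TABLE loan_txn_fixed
  INCLUDING INDEXES WITHOUT VALIDATION;
```

The exchange is a data-dictionary operation, so it completes in seconds regardless of partition size.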

- Prepared test cases for all business rules and verified all procedures by automating unit testing for PL/SQL packages (utPLSQL utility).

- Tuned several report queries by examining EXPLAIN PLAN output, TKPROF, and the execution steps taken by the optimizer.

- Developed a month-end accounting extract using Ab Initio, which reduced the extract time from 72 hours to 3 hours.

- Developed a generic extract script using Ab Initio to reduce the run time of extracts.

- Used PRAGMA EXCEPTION_INIT to name the system exceptions raised by the program.
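
EXCEPTION_INIT binds a named exception to an Oracle error number, so handlers can catch specific system errors instead of WHEN OTHERS. A small sketch (the customers table is hypothetical):

```sql
DECLARE
  e_child_exists EXCEPTION;
  PRAGMA EXCEPTION_INIT(e_child_exists, -2292);  -- ORA-02292: child record found
BEGIN
  DELETE FROM customers WHERE customer_id = 42;
EXCEPTION
  WHEN e_child_exists THEN
    DBMS_OUTPUT.PUT_LINE('Cannot delete: dependent rows exist.');
END;
/
```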

- Provided 24/7 production support on a monthly scheduled basis.

- Generated ad-hoc query reports for the application on demand for the Accounts group, monthly for the previous month.

- Provided support during the SIT/UAT/BOT phases of testing.

Environment: Oracle 10gR2, PL/SQL, SQL, Ab-Initio GDE, Clear Case, Clear Quest, UNIX (Sun Solaris), SQL*Loader, SQL Plus, Toad, SFTP, Autosys.

Guarantee Fee Accounting System (GFAS)

Development for the consolidation of Fannie Mae securities. The system processes the securities per GAAP specifications for write-offs, segment reporting, and fair-value re-establishment of cost bases, and has the functionality to generate SOX compliance reports.

Responsibilities

- Responsible for business analysis, data analysis, and understanding the requirements of the FAS 140 accounting system.

- Developed the data model for the GFAS stream using MS Visio as the data modeling tool.

- Created DB links to pull data from various source systems (ADW, FAS 140).

- Developed complex PL/SQL procedures to perform preliminary validation, conditioning, and translation of lookup values, and to flatten the data for downstream processes.

- Used collections and bulk binds to improve performance by minimizing the number of context switches between the PL/SQL and SQL engines.

- Developed anonymous PL/SQL blocks to generate SOX control reports to validate the data transformation.

- Used set operators in SQL such as UNION, UNION ALL, INTERSECT, and MINUS.

- Mocked up test data for development to test the business use cases.

- Developed scripts for the creation of database tables, views, and synonyms in various environments.

- Conducted code reviews and performed unit testing and integration testing.

- Created run books for the SIT/UAT/BOT testing teams to run the code in each environment.

Environment: Oracle 10gR2, PL/SQL, SQL, Ab-Initio GDE, Clear Case, Clear Quest, VI Editor, UNIX (Sun Solaris), SQL*Loader, SQL Plus, VISIO, Toad, SFTP, Autosys.

Verizon Business - Boston, MA JAN 2007 – MAR 2009

Verizon Enterprise Center (VEC):

Enterprise Customer Center is the IT organization responsible for supporting enterprise customers for every kind of service the telecom giant provides, including ordering, billing, repair, network traffic, and service monitoring. Enterprise Center's rapid development of new products and technologies, large-scale integration, and extensive service coverage won the J.D. Power and Associates Award for Customer Satisfaction in the telecommunications industry for two years.

The Enterprise Customer Profile (eCP) is the kernel of EC, integrating every enterprise account from every legacy system within the telecom giant's footprint.

- Data integration and modeling (e.g., IGI, subscriber, subscription).

- Data synchronization (data sync for IGI, IDW, NDM, AORTA billing address).

- Data management for applications such as online repair, ordering, and billing.

- Data mining and dashboards for various application statistics.

- Data management and NDM administration.

- Logical and physical data modeling using Erwin for the new schema to host the migrated data.

- Developed stored procedures, functions, and packages supporting the eCP portal using PL/SQL.

- Developed new, and enhanced existing, Oracle stored procedures to perform scheduled operations at regular intervals.

- Developed different algorithms for data synchronization, migration, cleaning, and insertion.

- Developed ETL (Extract, Transform, Load) scripts using PL/SQL for data migration from the acquired business to the telecom giant's business portal.

- Analyzed and implemented required schema changes to suit new business requirements, tuning performance using EXPLAIN PLAN and TKPROF; optimized SQL scripts for performance using hints.

- Developed triggers for auditing and error handling.

- Handled ad-hoc requests such as data cleanup activities to remove unwanted data.

- Hands-on experience migrating data from flat files, XML, and Excel to Oracle.

- Created materialized views, materialized view logs, and external tables for optimization.
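
A materialized view backed by a materialized view log, as mentioned above, can be sketched like this; the orders table and view names are hypothetical, and the exact log options required for fast refresh vary by Oracle version and query shape.

```sql
-- The log captures row changes so the view can refresh incrementally.
CREATE MATERIALIZED VIEW LOG ON orders
  WITH ROWID (customer_id, amount) INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW mv_cust_totals
  BUILD IMMEDIATE
  REFRESH FAST ON COMMIT
  AS SELECT customer_id,
            SUM(amount)   AS total_amt,
            COUNT(amount) AS amt_cnt,   -- COUNT columns are required
            COUNT(*)      AS row_cnt    -- for fast-refresh aggregates
       FROM orders
      GROUP BY customer_id;
```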

- Used Data Pump for data and metadata movement.

- Monitored index usage within the application and eliminated or enhanced unused indexes.

- Provided entitlements (access) to applications (GUI) for users through packages.

- Developed UNIX shell scripts that connect to the Oracle database through SQL*Plus and process the nightly data feeds (automated using cron jobs).

- Hands-on experience automating data loads and sending emails about load and processing status using UNIX Korn shell and Perl scripts.

- Performed unit and integration testing and documented the test results in the development environment.

- Supported the QA team with SIT, UAT, and production testing.

- Supported the DBA team with DB script deployment during production drops.

- Performed NDM (Connect:Direct) configuration and administration for file transfers between UNIX and mainframe (legacy) systems.

- Provided 24x7 dedicated support for the production server.

Environment: Oracle 9i /10g, PL/SQL, SQL, Win CVS, VI Editor, UNIX (Sun Solaris), XML, Java, SQL*Loader, SQL Plus, Toad, SFTP.

Illinois Casualty Company – Rock Island, IL SEP 2006 – DEC 2006

SENIOR SOFTWARE ENGINEER, INDIA MAR 2004 – JUN 2006

SAFRAN Aerospace India – Bangalore, India (Formerly Snecma Aerospace India)

Educational Background

Bachelor of Science in Computer Science and Engineering

Bangalore University, Bangalore, India


