HARINDER TALLAPALLI *******@*****.***
Los Angeles, CA. Cell: 510-***-****
CAREER SUMMARY:
Seasoned Lead Architect/Analyst/Dev Lead with 18+ years of experience handling a multitude of projects in a matrix organization with multiple applications and interdependent development environments. Led and successfully executed mission-critical projects spanning solution architecture, design, coding, testing, and production rollout, exceeding business expectations.
Led a team of 17+ database developers co-located across the US and Asia.
Represented the database development team in strategy, solution architecture, and solution design meetings (including data modeling and data dictionary reviews), as well as budget meetings.
Managed projects in a high-pressure environment, facing senior management on both the business and technology sides.
Proven track record of delivering mission-critical projects under strict deadlines. Strong experience with cloud platforms such as AWS, Azure, and GCP for setting up data pipelines.
Managed projects using both onsite and offshore delivery models within an Agile framework.
Team player and analyst with strong analytical, problem-solving, and decision-making skills. A key team member who establishes and maintains good relationships across the organization. Mentors and motivates the team, partners with business teams to deliver projects successfully, and oversees the entire SDLC process.
Achievements and Awards:
Received multiple awards from Bank of America leaders for successfully implementing multimillion-dollar projects.
Received an award for creating a custom, fully automated deployment tool.
Received a Diamond award for migrating 2+ trillion documents into the P8 content storage system.
Skills:
Integrations: webMethods, Workday, MuleSoft, Kafka, Hive, DevOps, CI/CD, Agile, Scrum
Technologies: Java, JavaScript, SQL, PL/SQL, Python, REST, SOAP, Visio, Unix, Git, TFS, ClearCase, AWS, GIS security standards, data encryption.
Databases: Oracle, SQL Server, Hadoop, MySQL, NoSQL, PostgreSQL, DB2.
Data Solutions: Oracle DBaaS, BI/Cognos, Tableau, ETL, P2P, Python, data warehousing, relational/dimensional data modeling.
Organizational Skills: self-managed, quick learner, innovator, accountable, dependable, passionate about change, analytical, well organized, strong written and verbal communication, team builder, decisive, results-oriented, mentor, multitasker, and a team-first collaborator who appreciates the team.
Education and Certifications
Bachelor’s in Computer Science Engineering (1998 to 2001)
Professional Experience:
Silicon Valley Bank, Santa Clara, CA. MAR-2022 to Current
Role: Senior Data Analyst
●Worked in the DOMINO pod as lead analyst on the Post Month-End Adjustments (PMEA) process project, built from scratch.
●Also involved in day-to-day design and architecture activities on the Past Due and Non-Performing Loans (PDNPL) Review and Reporting application.
●Created data models using ER/Studio and a data dictionary; gathered requirements from the business and articulated them into project documents along with source-to-target mappings.
●Designed, developed, and maintained data pipelines in AWS using Python and SQL, landing data in the data warehouse via S3 and Lambda.
●Wrote extensive Python and SQL code to extract and transform data as needed and store it in the data warehouse, so Tableau reports could be generated from it.
●Migrated data from multiple applications such as LAS, LOS, RMS, TCAR, nCino, Holdover, and Wire.
●Created SQL Server and Oracle databases and database objects, including tables, stored procedures, functions, triggers, and jobs.
●Created architecture diagrams and ETL/data warehouse modules and workflows using Informatica and the Data Interface ETL tool, mapping attributes end to end from source to target databases.
●Built complex ETL from data sources such as DB2, MySQL, NoSQL stores, Oracle, SQL Server, and flat files, loading them into multiple target data warehouses on Oracle and SQL Server.
●Created database objects (tables, procedures, etc.) in PostgreSQL.
●Participated in architecture decisions on which technology stack to leverage and implement for building data pipelines.
●Worked with application-team counterparts to integrate APIs from the Appian UI to the backend database.
●Worked on the end-to-end SDLC of the project as part of an Agile Scrum team.
●Gathered business requirements and converted them into prototypes and detailed documents as part of the design team.
●Set up a CI/CD pipeline using Git and Jenkins for automatically deploying and promoting code.
●Set up Kafka jobs and schedulers to perform data ETL.
●Worked in the CREDIT ADMIN pod on the Credit Lens project.
●Worked with Moody’s Analytics (an AWS-hosted system) on a day-to-day basis, troubleshooting and sharing requirements so the product was designed to spec for use in various reporting.
●Key member in designing, developing, and leading the entire SDLC process for the DOMINO and CREDIT ADMIN pods.
Bank of America, Agoura Hills, CA. JAN 2008 - Current
Role: Lead Data Analyst
●Worked with business analysts and systems analysts to gather requirements from the business.
●Worked with managers to allocate team members and resources, ensuring predictable delivery of multiple applications and projects within timelines and budget.
●Led the team in gathering business requirements and converting them into prototypes and detailed documents as part of the systems architecture team.
●Led the team in leveraging Oracle databases and DB architecture: creating data models in PowerDesigner, design diagrams, and data-transfer processes for multiple applications; developing APIs, stored procedures, triggers, and functions; and building complex analytical databases and reporting dashboards.
●Coordinated with the front-end design team to provide the necessary stored procedures and packages, along with the necessary insight into the data.
●Developed PL/SQL Stored Procedures/Packages, triggers and master tables for automatic creation of primary keys.
●Developed scripts to create new tables, views, and queries for application enhancements using TOAD.
●Used Python and SQL to collect data from various sources and maintain data pipelines across enterprise databases, consolidating data into a single ODS (Hadoop) database.
●Maintained and developed Tableau reports to meet reporting requirements.
●Worked with the ODS (data warehouse) team to land data in the Hadoop data lake and generate reports from it.
●Created and managed high-volume analytical databases for generating the complex reporting needed by the business.
●Worked as an MS SQL Server developer, creating objects such as tables, views, stored procedures, functions, and jobs.
●Migrated a high volume of SQL Server and DB2 databases into Oracle.
●Created indexes on tables for faster data retrieval and enhanced database performance.
●Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.
●Performed SQL, PL/SQL, and application tuning using tools such as EXPLAIN PLAN, SQL*Trace, TKPROF, and AUTOTRACE.
●Used optimizer hints extensively to direct the optimizer toward optimal query execution plans.
●Used bulk collections for better performance and easier data retrieval, reducing context switching between the SQL and PL/SQL engines.
●Created PL/SQL scripts to extract data from the operational database into flat text files using the UTL_FILE package.
●Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus.
●Partitioned the fact tables and materialized views to enhance the performance.
●Extensively used bulk collection in PL/SQL objects to improve performance.
●As part of DevOps, created CI/CD pipelines using Jenkins to promote code to higher environments.
●Created records, tables, and collections (nested tables and arrays) to improve query performance by reducing context switching.
●Extensively used advanced PL/SQL features such as records, tables, object types, and dynamic SQL.
●Used exception handling extensively to ease debugging and display error messages in the application.
●Hands-on experience as a data engineer/analyst across all databases, writing complex queries to generate high-level bank reports and information for CTO-level management, enabling faster data-driven decisions.
●Led 100+ online banking applications, including highly transactional databases holding petabytes of data each, while maintaining millisecond SLAs for online banking.
Some Highlights:
●Key member in the successful migration of Countrywide into Bank of America.
●Migrated 200+ databases to Oracle DBaaS (cloud).
●Created a Hadoop (SDP) data model for a high-visibility analytical database used for reporting metrics across Bank of America.
●Created large Java-based application components and databases with their own UI.
●Converted 1,500+ SQL Server databases into Oracle.
●Managed and led the migration of 2+ trillion images/documents into the P8 content storage system for fast retrieval of documents viewed by customers via online banking.
●Key member in designing, developing, and leading the entire SDLC process for the Admin Portal UI (a self-service UI), and obtained a patent for the same.
●Created a custom database deployment tool that lets the team auto-deploy database code into any environment with minimal inputs.
●Created custom tools to sync data from one environment to another for test-data creation.
Wells Fargo – Business Banking, Concord, CA. SEP-2006 to JAN-2008
(SLVR Software Solutions Inc)
Role: Senior PL/SQL Developer & Data Analyst
●Worked as a senior database developer and dev lead for onshore and offshore teams in the Business Banking group. Projects included, but were not limited to, developing marketing analytics solutions that enable bankers to manage their customers better, as well as the METIS and encryption projects. Created various database objects and analyzed data to produce a data model and data dictionary.
AT&T (SBC), San Ramon, CA. FEB-2005 to AUG-2006
(SLVR Software Solutions Inc)
Role: DataBase Developer and Analyst
●Worked at AT&T implementing “Project Light Speed” as a database developer, managing and writing 800+ new stored procedures and various database objects.
Prior to that, worked as a software engineer for various clients from 2003 onward.