
Tech Lead Project Delivery

Location:
Bridgeville, PA
Posted:
April 29, 2024


Resume:

Experience Summary:

Professional with **+ years of experience in Client/Server technologies and Data Warehousing using Informatica.

Experience in Data Warehouse/ETL technologies including Informatica PowerCenter, Teradata, Google Cloud Platform (GCP), IICS, Google BigQuery, and Oracle ORMB.

Experienced Tech Lead managing a team of developers across multiple project deliveries.

Responsible for interfacing and communicating with Product Managers, Analysts, Test Managers, Dev Managers, business teams, vendors, and IT Ops on a daily basis.

Provide technical leadership and manage all aspects of the SDLC process: requirement analysis, design, development, testing, and support/maintenance.

Resolve and address technical issues with minimal guidance.

Managing work backlog with tools such as JIRA.

Participating in application team meetings and communicating with all stakeholders.

Developed technical solutions (high/low-level technical designs, specifications, and technology evaluations); understood new requirements and designed and developed logical and physical models for Data Warehouse applications.

Responsible for managing up and providing accurate and timely reporting to senior IT management on the status and trends of the team's releases, including team velocity, sprint burndown rates, defect metrics, and lessons learned.

Responsible for managing resource capacity, managing the vendor relationship and keeping the vendor on task, providing leadership reports, and communicating at all levels of the organization.

Experience as a Business System Analyst (BSA): recommending changes to existing applications, identifying impacted interfaces, and working with the technical team to implement and test the changes.

As a BSA, working with stakeholders to understand their process and data requirements and proposing solutions to meet their objectives.

Involved in creating BSA documentation with source-to-target mappings and data models, and working with the technical team to deliver projects according to the design.

Performing data validation checks and UAT sign-offs along with end users.

As a BSA, responsible for the end-to-end implementation of technology solutions, including project management, budget design, communicating progress to managers, and incorporating feedback from stakeholders.

Worked on the Oracle Revenue Management and Billing (ORMB) application, a billing system used in banking and health care.

Experience upgrading the ORMB application, covering both database and application framework upgrades.

Troubleshooting and debugging server startups and failures on both WebSphere and ORMB application servers.

Experienced in creating complex mappings using various transformations and in developing strategies for the Extraction, Transformation, and Loading (ETL) mechanism using Informatica.

Experienced in ETL for data extraction, data mapping, and data conversion using Informatica PowerMart/PowerCenter.

Experience working with relational databases such as Oracle, SQL Server, DB2, MS Access, SAP HANA, and Teradata.

Experience using automation and scheduling tools such as ESP Scheduler, Control-M, Maestro, and AutoSys.

Worked with Teradata utilities such as FastLoad, MultiLoad, TPT, and TPump.

Designed ETL processes using Informatica to load data from different sources to targets through data transformations.

Developed mappings using the EDS ETL tool to load flat files from different source systems to vendor applications.

Migrated EDS code to a different environment and participated in production release validations.

Good experience with change management: reviewing and deploying code from DEV to TEST and UAT/PREPROD using deployment groups in Informatica Repository Manager, Terraform, Liquibase, etc.

Experience working on PowerShell scripts to process Data Warehouse jobs.

Worked on UNIX shell scripting for automation of ETL processes.

Knowledge of extracting data from different platforms and applications.

Experience with Agile/Jira/Scrum and project status tracking.

Coordinating with business users, the functional design team, and the testing team during the different phases of project development, and resolving issues.

Validating data files against their control files and performing technical data quality checks to certify source file usage.

Developing Informatica mappings and workflows to extract data and load it into the Teradata staging area using FastLoad/TPump utilities.

Experience preparing project-related documents such as ETL specifications, source-to-target mappings, test cases, design documents, and scheduling, deployment, and training documents.

Working closely with Business Users, Project stakeholders, Admins, Support Team, and Offshore team.

Employment History

Working as a Technical Lead with Wipro Limited from Dec 2020 to present.

Worked as a Senior Informatica Developer with United Soft Solutions from Aug 2017 to Dec 2020.

Worked as a Senior Informatica Developer with Strategic Resources International from Jun 2016 to Aug 2017.

Worked as an Application Designer with CSC India Private Limited from Nov 2014 to Feb 2016.

Worked as a Software Engineer at Accenture, India from Nov 2006 to Nov 2011.

Technical Skills:

Lead Roles: Technical Lead, Business Analyst, Scrum Master

ETL Tools: IICS, GCP, BigQuery, Informatica 10.2, Oracle ORMB, SSIS

Databases: SQL Server, Oracle, DB2, Oracle ORMB, Teradata, Salesforce, Snowflake, BigQuery, TD Vantage

Web Servers: WebSphere

Education:

Bachelor of Technology (B. Tech) in Computer Sciences from Hyderabad, India

Professional Experience:

Client : Charles Schwab, Westlake, TX (Wipro)

Role : Technical Lead (ETL Informatica), Business Analyst

Duration : Oct 2021 – Till Date

Tools/Environment: IICS, GCP, Informatica, Avro ingestion, BigQuery, Python, Power BI, Tableau, SPOS, Salesforce, Control-M Scheduler, ServiceNow, Remedy, Teradata, Nexus, SQL Server.

Responsibilities:

As a lead, responsible for all technical reviews.

Responsible for TDA conversion activities.

Responsible for solution design and development across multiple projects.

Responsible for code reviews, performance tuning, project deliverables, and UAT testing support.

Participate in Scrum meetings, sprint planning, story grooming, and retrospectives.

Responsible for project delivery and planning.

Responsible for resource allocation, sprint planning, and project planning.

Assist the team and the Product Owner in identifying stories, maintenance, defects, tech debt, and other work the team needs to accomplish during the sprint.

Understand and educate the team on methods beyond Scrum, such as Team Kanban.

Responsible for handling production issues and failures.

Working closely with customers, stakeholders, admins, and the support team.

Involved in requirements gathering and design by working with end users.

Consistently communicating with end users and project managers to ensure the API system developed met their expectations and goals.

Experience creating BSA documents with source-to-target mappings and working on the data model.

Involved in unit testing along with end users to provide UAT sign-offs.

Involved in Code migration from on-prem to CDW.

Involved in Customer and Accounts Conversion activities from TDA to Schwab as part of TDA Conversion.

Worked on generating data extracts for participant and plan data for Salesforce using IICS task flows.

Understood the business rules and sourced data from multiple source systems using IICS.

Extracted raw data from SharePoint, SQL Server, MySQL, flat files, and Avro files into staging and the target DB (BigQuery) using Informatica Cloud (IICS).

Worked on data replication and data synchronization tasks for Salesforce data using IICS.

Created data extracts to Salesforce to enable updates to plan and participant data in the Salesforce system.

Created IICS tasks to extract, insert, and update data in Salesforce based on the type of data that needed to be updated.

Created multiple task flows for loading data from different sources to Salesforce using the Salesforce Connector with Data Synchronization and Mapping tasks, using the Bulk API and Standard API as required.

Developed PowerShell scripts that keep files flowing smoothly through processes in Cloud Application Integration (CAI) and Cloud Data Integration (CDI).

Created numerous mapping tasks and task flows based on requirements in Cloud Data Integration (CDI).

Involved in review meetings, customer interactions, and deliverables.

Managing issues raised by QA and assigning work items to the team based on criticality.

Building high-level project designs and publishing them to Confluence.

Led defect review meetings with business team leads and the QA lead, prioritizing development tasks around project milestones.

Involved in the data merge process from TDA to Schwab, mock-testing the data with functional and non-functional test cases.

Provide technical leadership and manage all aspects of the SDLC process: requirement analysis, design, development, testing, and support/maintenance.

Client : BNY Mellon, Pittsburgh, PA (Wipro)

Role : Technical Lead

Duration : Dec 2019 – Oct 2021

Environment: EDS, WebSphere, Oracle ORMB, App Engine, Unix, Shell Scripting, ESP Scheduler, ServiceNow, Control-M, MS SQL Server, Java, GitLab, Nexus

Responsibilities:

Created batch jobs for loading data into Oracle tables for generating reports.

Responsible for upgrading the ORMB application across versions (2.8 to 3.0).

Responsible for troubleshooting and debugging server startups and failures.

Working with the application development team to deploy code releases in DEV, TEST, QA, and PROD.

Deploying release code to different servers, including DEV, TEST, QA, and PROD.

Developed maintenance objects and executed scripts.

Approved bills, closed pending bills, created rate schedules, and developed user groups.

Assigning user access and creating users in the application.

Migrating WebSphere application logins to LDAP.

Deploying code to WebSphere application servers.

Maintaining the WebSphere application servers and the ORMB database.

Deploying Java code via Jenkins.

Enabling App Viewer in ORMB application servers, along with deploying EAR, WAR, and iHelp files.

Coordinating with offshore team.

Worked in an Agile Methodology, actively participating in all Scrum meetings.

Worked extensively on EDS, a client-owned ETL tool, to extract data from different source systems.

Developed mappings using the EDS ETL tool to load flat files from different source systems to vendor applications.

Migrated EDS code to a different environment and participated in production release validations.

Extracting data from flat files.

Created workflows, tasks, database connections, and FTP connections using Workflow Manager.

Scheduled loads using ESP Scheduler.

Worked on batch jobs to perform initial, full, and delta loads.
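
The three load modes can be sketched as a watermark pattern; the `src`/`tgt` tables, their columns, and the sqlite backing here are illustrative assumptions, not the project's actual schema:

```python
import sqlite3

def run_load(conn, mode, watermark=None):
    """Load rows from a hypothetical src table into tgt.

    'initial' and 'full' truncate the target and reload everything;
    'delta' copies only rows changed since the last watermark.
    Returns the new watermark (max timestamp now in the target).
    """
    cur = conn.cursor()
    if mode in ("initial", "full"):
        cur.execute("DELETE FROM tgt")
        cur.execute("INSERT INTO tgt SELECT id, val, updated_at FROM src")
    elif mode == "delta":
        cur.execute(
            "INSERT OR REPLACE INTO tgt "
            "SELECT id, val, updated_at FROM src WHERE updated_at > ?",
            (watermark,),
        )
    conn.commit()
    return cur.execute("SELECT MAX(updated_at) FROM tgt").fetchone()[0]
```

The watermark returned by one run feeds the next delta run, so only newly changed source rows are touched between full refreshes.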

Created SQL Scripts to deploy data into different environments.

Developed shell scripts to update the file format in the target system.

Involved in production data validations.

Developing the ETL jobs and creating unit test scripts.

Preparing and using test data/cases to verify accuracy and completeness of ETL process.

Testing/Validating the Data with SQL queries against the developed ETL.
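
A typical source-to-target validation of this kind can be sketched in Python; sqlite stands in for the project database here, and the table and column names are hypothetical:

```python
import sqlite3

def reconcile(conn, src_table, tgt_table, key, measure):
    """Compare row counts and a summed measure between source and target.

    Table/column names come from trusted job configuration, so plain
    string formatting is acceptable in this sketch.
    """
    q = "SELECT COUNT({k}), COALESCE(SUM({m}), 0) FROM {t}"
    src = conn.execute(q.format(k=key, m=measure, t=src_table)).fetchone()
    tgt = conn.execute(q.format(k=key, m=measure, t=tgt_table)).fetchone()
    return {
        "rowcount_match": src[0] == tgt[0],
        "sum_match": src[1] == tgt[1],
    }
```

A mismatch in either figure flags the load for investigation before sign-off.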

Worked in Jira Ticketing System to update the status of tasks.

Reviewing the tasks done by other team members.

Company : DXC Technology

Client : BB&T, Dallas, Texas

Role : Informatica Developer

Duration : Dec 2014 – June 2019

Environment: Informatica 9.1.6, UNIX, SAP HANA, ESP Scheduler, Mainframes, DB2

Responsibilities:

Used Informatica PowerCenter to develop mappings and workflows for extracting, transforming, and loading data.

Coordinating with the offshore team.

Successfully loaded data from various source systems (Oracle Database, flat files, mainframes, SQL Server, etc.) into the staging table and then into the target HANA database.

Creating low-level and high-level design documents.

Took part in Agile Scrum meetings for project status updates and progress.

Worked extensively on transformations such as Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator, and Joiner.

Extracting data from flat files.

Created workflows, tasks, database connections, and FTP connections using Workflow Manager.

Scheduled loads using ESP Scheduler.

Creating technical specification documents.

Developing the ETL jobs and creating unit test scripts.

Working on UNIX shell Scripts to load data and automation.

Testing/Validating the Data with SQL queries against the developed ETL.

Reviewing the tasks done by other team members.

Company : Accenture Services, India

Project : Safeco

Client : Liberty Mutual (Indianapolis)

Duration : May 2011 – Nov 2011

Role : Development, Production Support, and Maintenance

Environment: Informatica 8.6.1, DB2, Teradata, UNIX, ESP Scheduler, Oracle PL/SQL

Responsibilities:

Used Informatica PowerCenter to develop mappings and workflows for extracting, transforming, and loading data into the warehouse database.

Requirements gathering, business analysis, design and development, testing, and implementation of business rules.

Used Informatica features to implement Type I, II, and III changes in slowly changing dimension (SCD) tables.
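
The Type 1 versus Type 2 distinction can be sketched outside Informatica; this Python sketch applies a change to a hypothetical customer dimension held as a list of dicts:

```python
def apply_scd(dim, key, new_attrs, scd_type, today):
    """Apply a change to a list-of-dicts dimension table.

    Type 1 overwrites in place (no history kept); Type 2 closes the
    current row and appends a new dated version. (Type 3 would instead
    keep the prior value in a dedicated 'previous_*' column.)
    """
    current = next(r for r in dim if r["key"] == key and r["is_current"])
    if scd_type == 1:
        current.update(new_attrs)          # overwrite, history lost
    elif scd_type == 2:
        current["is_current"] = False      # close out the old version
        current["end_date"] = today
        dim.append({"key": key, **new_attrs,
                    "start_date": today, "end_date": None,
                    "is_current": True})   # open the new version
    return dim
```

In the Informatica implementations this logic was realized with Lookup, Update Strategy, and Sequence Generator transformations rather than hand-written code.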

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Created complex mappings using various transformations and developed strategies for the Extraction, Transformation, and Loading (ETL) mechanism using Informatica.

Translated customer requirements into formal requirements and design documents.

Worked extensively on transformations such as Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator, and Joiner.

Wrote scripts for data cleansing, data validation, and data transformation for data coming from different source systems.

Reviewed detail design documents and technical specification docs for the end-to-end ETL process flow for each source system.

Involved in reviewing required documents before release as part of automation.

Involved in following Accenture process and policy with respect to deliverables.

Involved in unit testing and preparing test cases; also involved in peer reviews.

Company : Accenture Services

Project : Customer Insight (CI)

Client : Verizon Data Services, India

Duration : Nov 2006 – May 2011

Role : Development, Production Support, and Maintenance

Environment: Informatica 8.6.1, MS SQL Server, SSIS, SSRS, Teradata, Oracle PL/SQL

Responsibilities:

Used Informatica PowerCenter to develop mappings and workflows for extracting, transforming, and loading data into the warehouse database.

Requirements gathering, business analysis, design and development, testing, and implementation of business rules.

Worked extensively on transformations such as Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence Generator, and Joiner.

Translated customer requirements into formal requirements and design documents.

Worked on service change requests, delivered on time for release.

Developed Informatica mappings and workflows to extract data and load it into the Teradata staging area using FastLoad/TPump utilities.

Monitored and reported issues for the daily, weekly, and monthly processes; also worked on resolving issues on a priority basis and reported them to management.

Reviewed detail design documents and technical specification docs for the end-to-end ETL process flow for each source system.

Worked on trouble tickets.

Used the Teradata FastLoad/MultiLoad utilities to load data into tables.

Monitored and validated Loads.

Scheduled loads using Informatica Scheduler.

Involved in Unit Testing, Peer review, and preparing test cases.


