
Data Engineer Google Cloud

Location: Frisco, TX, 75034

Posted: December 16, 2023

Resume:

Venkata Rama Raja L

Phone: 214-***-****

Email: ad10sr@r.postjobfree.com

Professional Summary:

Senior Lead Data Engineer with 11+ years of experience in Information Technology, delivering projects in fast-paced environments.

Data Engineer with a comprehensive background in designing, developing, testing, and supporting production-grade data solutions on Google Cloud Platform (GCP).

Proficient in SQL and in leveraging GCP services to optimize data workflows, drive actionable insights, and ensure data integrity.

Skilled in managing complex data pipelines and providing production support, ensuring the reliability and availability of critical data systems on GCP.

Proven ability to collaborate with cross-functional teams and deliver robust and scalable data solutions that empower data-driven decision-making in a GCP environment.

Extract, transform, and load (ETL) data using Google Cloud Platform (GCP) native tools such as BigQuery, Compute Engine instances, and Cloud Storage.

Expertise in Google Cloud Platform (GCP), Google Cloud Storage (GCS), BigQuery, Dataflow, gsutil, and Hadoop.

Expertise in Google Bigtable, Cloud Composer/Airflow, Pub/Sub, Compute Engine, and IAM.

Represented the team in senior leadership reviews.

Expertise in the insurance domain and RDBMS (Oracle SQL).

Successfully led and managed a team of 5, organizing workloads, prioritizing tasks, and ensuring timely delivery of projects while tracking performance using key performance indicators (KPIs).

Familiar with the SDLC and Agile methodologies.

Resolved team conflicts by facilitating open dialogue and implementing conflict-resolution strategies, improving team collaboration and productivity by 30% through proactive intervention.

Monitored team performance and reported on metrics.

Expertise in ServiceNow, GitHub, Jira, Confluence, Global Service Desk (GSD), and Control-M.

Incident management and root cause analysis (RCA) within SLAs.

Set clear team goals and KPIs.

Expertise in the insurance and banking domains.

Proficient in defect tracking and classifying bugs based on severity.

Involved in daily, weekly, and monthly review meetings.

Technical Skill Set:

Cloud (GCP) Services: Cloud Storage, BigQuery, Cloud Composer, Cloud SQL, Dataflow, Dataproc, Pub/Sub, Cloud Functions, Compute Engine.

Hadoop Ecosystem: HDFS.

Programming: Python 3.

Databases: BigQuery, Oracle (SQL).

Operating Systems: Linux, Windows.

Developer Tools: Oracle SQL Developer, PyCharm, Visual Studio.

Bug Reporting Tools: Bug Zero, Bugzilla.

Scheduling Tools: Airflow (Cloud Composer), Cloud Scheduler, Control-M.

Other Utilities: PuTTY (UNIX), Confluence, GitHub, Global Service Desk, ServiceNow, JIRA, Citrix, SharePoint, WinSCP.

Professional Certificates:

Google Cloud Certified Professional Data Engineer (2023).

Google Cloud Certified Associate Cloud Engineer (2022).

Professional Experience:

Capgemini – Senior Data Engineer. Jun 2019 – May 2023

Client: HSBC Bank

Project Name: Group Liquidity Reporting System (GLRS)

Environment: Google Cloud Platform (GCP), Linux, Python, Hadoop, Control-M, SQL.

The purpose of this project is to analyze and develop the required reports using GCP BigQuery by transferring data from on-prem to the cloud.

Roles & Responsibilities:

Worked on Google Cloud Platform (GCP) with Google data products such as Compute Engine, Cloud Storage, BigQuery, Stackdriver Monitoring, and Cloud Deployment Manager.

Developed and maintained data pipelines on Google Cloud Platform (GCP), including BigQuery and report generation.

Identified and resolved data quality issues.

Successfully delivered projects on time and within budget.

Accountable for ensuring successful builds, deployments, and change requests (CRs).

Tuned SQL queries to improve performance and optimize cost.
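
A minimal sketch of this kind of cost-aware tuning check, assuming the google-cloud-bigquery client library; the project, dataset, table, and column names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Tuned query: select only the needed columns and filter on the partition
    # column so BigQuery prunes partitions instead of scanning the whole table.
    sql = """
        SELECT account_id, balance_gbp
        FROM `my-project.glrs_reporting.liquidity_positions`
        WHERE report_date = @report_date
    """
    params = [bigquery.ScalarQueryParameter("report_date", "DATE", "2023-03-31")]

    # Dry run first to estimate bytes scanned (and therefore cost) without executing.
    dry_run = client.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            dry_run=True, use_query_cache=False, query_parameters=params
        ),
    )
    print(f"Estimated bytes processed: {dry_run.total_bytes_processed}")

    # Execute for real once the estimate looks acceptable.
    rows = client.query(
        sql, job_config=bigquery.QueryJobConfig(query_parameters=params)
    ).result()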

Designed and tested scripts before deployment to the production environment.

Developed Airflow workflows to extract and load data from Cloud Storage into BigQuery.
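
For illustration, a minimal sketch of such an Airflow workflow, assuming the apache-airflow-providers-google package; the bucket, dataset, and table names are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bigquery_daily_load",
        start_date=datetime(2023, 1, 1),
        schedule_interval="0 6 * * *",  # daily at 06:00
        catchup=False,
    ) as dag:
        # Load the day's CSV drops from Cloud Storage into a BigQuery table.
        load_positions = GCSToBigQueryOperator(
            task_id="load_liquidity_positions",
            bucket="glrs-landing-bucket",
            source_objects=["positions/{{ ds }}/*.csv"],
            destination_project_dataset_table="my-project.glrs_reporting.liquidity_positions",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_APPEND",
        )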

Prepared RCA documents and maintained them in an internal database for future reference.

Strong experience working in strictly time-bound environments and resolving incidents within agreed SLAs.

Encrypted files using Python before loading them to Google Cloud Storage.
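
The resume does not state which encryption scheme was used; as an illustrative sketch only, symmetric encryption with the cryptography package followed by an upload via google-cloud-storage could look like this (bucket, object, and file names are hypothetical):

    from cryptography.fernet import Fernet
    from google.cloud import storage

    def encrypt_and_upload(local_path: str, bucket_name: str, blob_name: str, key: bytes) -> None:
        """Encrypt a local file with Fernet and upload the ciphertext to GCS."""
        with open(local_path, "rb") as src:
            ciphertext = Fernet(key).encrypt(src.read())
        encrypted_path = local_path + ".enc"
        with open(encrypted_path, "wb") as out:
            out.write(ciphertext)
        storage.Client().bucket(bucket_name).blob(blob_name).upload_from_filename(encrypted_path)

    # Hypothetical usage; in practice the key would come from a secret manager, not be generated inline.
    key = Fernet.generate_key()
    encrypt_and_upload("daily_positions.csv", "glrs-landing-bucket", "positions/daily_positions.csv.enc", key)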

Coordinated support activities, patch deployments, and configuration management with external vendors and numerous internal teams.

Used Hadoop to store data on-prem.

Reduced manual dependencies by implementing automated processes.

Extracted, transformed, and loaded (ETL) data using GCP-native tools, including BigQuery.
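
As a sketch of the load step via the BigQuery client library rather than an Airflow operator (the URI and table name are hypothetical):

    from google.cloud import bigquery

    client = bigquery.Client()
    uri = "gs://glrs-landing-bucket/positions/2023-03-31/positions.csv"
    table_id = "my-project.glrs_reporting.liquidity_positions"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # wait for the load job to complete
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")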

Performed audit tasks and provided the required evidence to auditors.

Performed monthly activities such as housekeeping of BigQuery tables, Linux server maintenance, and VM refreshes.

Mentored teammates, guiding them through technical issues and helping them improve their functional knowledge.

Built the development environment using tracking tools such as ServiceNow, Jira, and Confluence, and version control tools such as Git and GitHub.

Nucsoft Ltd. – Senior Software Engineer. Jun 2017 – Jun 2019

Client: SBI Life Insurance

Project Name: Channel Management System (CMS)

The purpose of this project is to analyze and develop the required reports using GCP BigQuery by transferring data to the cloud.

Roles & Responsibilities:

Used Airflow (Cloud Composer) to build and schedule data pipelines for data ingress and egress.
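
A minimal sketch of the egress side, assuming the Google provider's BigQuery-to-GCS transfer operator; the DAG, table, and bucket names are hypothetical, and the ingress side would mirror this with a GCS-to-BigQuery task:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.bigquery_to_gcs import (
        BigQueryToGCSOperator,
    )

    with DAG(
        dag_id="cms_daily_egress",
        start_date=datetime(2019, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Export a reporting table to Cloud Storage as CSV for downstream consumers.
        export_report = BigQueryToGCSOperator(
            task_id="export_channel_summary",
            source_project_dataset_table="my-project.cms_reporting.channel_summary",
            destination_cloud_storage_uris=["gs://cms-egress-bucket/channel_summary/{{ ds }}/*.csv"],
            export_format="CSV",
        )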

Performed peer reviews as a code owner.

Developed ad hoc scripts to run models and compare results between legacy and new platforms for audit purposes.

Deployed data processing jobs onto batch and online platforms using the corresponding GCP services, such as Dataflow.

Created and modified database objects such as tables and views as per requirements.

Collaborated with cross-functional teams, including data engineers and analysts, to understand business requirements and translate them into technical specifications.

Nucsoft Ltd. – Software Engineer. Dec 2013 – Jun 2017

Client: SBI Life Insurance

Project Name: Policy Management System (PMS)

The purpose of this project is to analyze and develop the required reports using Oracle SQL.

Roles & Responsibilities:

Analyzing and developing reports as per requests (CCP).

Developing RO & HO contest queries using SQL and PL/SQL.

Creating new indexes to improve the performance of SQL statements.

Creating database objects such as tables, views, triggers, procedures, and packages.

Developing queries that can extract valuable insights from large datasets by using multiple joins.
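As an illustrative sketch only (this work was done directly in SQL/PL-SQL; the connection details, tables, and columns here are hypothetical), such a multi-join reporting query run from Python with the oracledb driver could look like this:

    import datetime
    import os

    import oracledb

    conn = oracledb.connect(
        user="pms_report",
        password=os.environ["PMS_DB_PASSWORD"],  # never hard-code credentials
        dsn="dbhost:1521/PMSDB",
    )

    sql = """
        SELECT p.policy_no,
               c.customer_name,
               a.agent_name,
               SUM(pr.premium_amount) AS total_premium
        FROM   policies p
        JOIN   customers c  ON c.customer_id = p.customer_id
        JOIN   agents    a  ON a.agent_id    = p.agent_id
        JOIN   premiums  pr ON pr.policy_no  = p.policy_no
        WHERE  pr.paid_date >= :start_date
        GROUP  BY p.policy_no, c.customer_name, a.agent_name
    """

    with conn.cursor() as cur:
        cur.execute(sql, start_date=datetime.date(2016, 4, 1))
        for policy_no, customer, agent, total_premium in cur:
            print(policy_no, customer, agent, total_premium)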

Using unit tests to ensure that individual units of code work as expected before they are integrated.
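For example, a small pytest-style unit test of the kind described, written against a hypothetical premium-calculation helper:

    # test_premium.py -- illustrative only; the helper below is hypothetical.
    import pytest

    def calculate_premium(sum_assured: float, rate_per_thousand: float) -> float:
        """Premium computed as a rate per 1,000 of sum assured."""
        if sum_assured <= 0:
            raise ValueError("sum_assured must be positive")
        return round(sum_assured / 1000 * rate_per_thousand, 2)

    def test_calculate_premium_basic():
        assert calculate_premium(500_000, 4.5) == 2250.0

    def test_calculate_premium_rejects_non_positive():
        with pytest.raises(ValueError):
            calculate_premium(0, 4.5)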

Working with other engineers and stakeholders to understand and implement data requirements.

Providing clarity on complex technical topics through documentation.

Proven ability to work in fast-paced, deadline-driven environments and resolve incidents within agreed service level agreements (SLAs).

Keeping the team informed of progress and blockers during daily standup meetings.

GVR Infra Projects Ltd. – System Admin. Nov 2010 – Mar 2013

Roles & Responsibilities:

Installed operating systems and related software.

Monitored system performance.

Quickly arranged hardware repairs in the event of hardware failure.

Installed hardware and peripheral devices (printers, scanners, etc.).

Troubleshot hardware and software issues.

Academic Credentials:

Jawaharlal Nehru Technological University, India – Bachelor of Technology, Aug 2005 – Jul 2009.


