
Cloud Engineer Software

Location:
India
Salary:
15000
Posted:
August 13, 2024


Resume:

CHANDRA SEKHAR KOLLIPARA

Cloud Engineer

+91-913*******

************@*****.***

Career Objective

A passionate Cloud Engineer seeking a challenging role in a premier company that offers excellent scope for learning and growth, where I can ensure high quality in the end products delivered to customers and further enhance my skills and abilities as a Cloud Engineer.

Education

B.Tech, graduated in 2019

Priyadarshini Institute of Technology and Science

Percentage 77.04 %

JNTUK University

Andhra Pradesh

Diploma in 2016

Bapatla Engineering College Bapatla

Percentage 83 %

SBTET Board

Andhra Pradesh

SSC in 2013

ZPHS Emani

CGPA 8.5

Career Highlights

4.4 years of experience in the IT industry as a Software Engineer working on Google Cloud components.

A qualified Cloud Engineer with strong exposure to cloud data engineering using GCP, BigQuery, Dataproc, Composer, Databricks, GCS, Compute Engine, Pub/Sub, SQL, Python (scripting), big data, HDFS, PySpark, Spark SQL, Linux, and GitHub.

Experience with big data using Hadoop, HDFS, and Spark.

Experience working with BigQuery to load data into multiple zones (CZ, SZ, VZ) and to develop authorized views.

Experience working with big data using Dataproc, HDFS, GCS, and Spark.

Hands-on experience creating scripts from mapping documents as part of DAG development.

Hands-on experience creating DAGs and tasks to deploy and run PySpark / Spark SQL jobs in the Airflow scheduling tool (a short sketch follows at the end of this section).

Experience using different file formats such as Text, CSV, Parquet, JSON, Avro, and ORC.

Experience with Python scripting.

Experience with Oracle and SQL.

Strong exposure to Databricks.

Good exposure to ETL frameworks and fundamentals.

Good experience in all phases of Agile; participated in daily scrum meetings with cross-functional teams.

Strong experience working on weekly and monthly target-based tasks.

Experience on Agile methodology projects.

Used Git to manage and deploy code versions.

Knowledge of Linux and Windows environments.

Strong experience preparing daily and weekly status reports on tasks for sprint reviews.

Strong experience across the end-to-end Software Development Life Cycle.

Good knowledge of database and data warehousing concepts.

Experience working in an onsite-offshore team environment.

Capable of working under pressure and meeting tight deadlines.

Good documentation skills and excellent technical, communication, and interpersonal skills, with strong client-interfacing ability.

Ensuring successful, on-time delivery according to business requirements while maintaining an appropriate quality bar.

Excellent interpersonal and communication skills.
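For illustration, below is a minimal sketch of the kind of Airflow DAG described above: a single task that submits a PySpark job to a Dataproc cluster. The project ID, region, cluster name, bucket path, and script name are hypothetical placeholders, not values taken from the projects listed in this resume.

# Minimal Airflow DAG sketch: schedule a PySpark job on Dataproc.
# All project, cluster, and bucket names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PROJECT_ID = "my-gcp-project"          # placeholder
REGION = "asia-south1"                 # placeholder
CLUSTER_NAME = "etl-dataproc-cluster"  # placeholder

# PySpark job definition: the main script lives in a GCS bucket.
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {
        "main_python_file_uri": "gs://my-bucket/scripts/load_table.py",  # placeholder
        "args": ["--run-date", "{{ ds }}"],  # Airflow-templated run date
    },
}

with DAG(
    dag_id="pyspark_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Single task that submits the PySpark job to the Dataproc cluster.
    submit_pyspark = DataprocSubmitJobOperator(
        task_id="submit_pyspark_job",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )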

Technical Skills

Cloud Platforms – GCP: Compute Engine, GCS, Dataproc, Databricks, BigQuery, Pub/Sub, Airflow, Dataflow

Technologies: Big data, Hadoop (HDFS), PySpark, Spark SQL

Languages: Python, SQL

Other Tools & Methodologies: Agile (Sprint, Scrum), Jira, GitHub, SharePoint, Confluence

Data Formats: CSV, JSON, XML, ORC, Avro, Parquet

Operating Systems: Windows, Linux

Work Experience

Worked with Optimum Infosystems Pvt Ltd, Bangalore

[From April 2022 to Jan 2024]

Worked with IQURA Technologies Pvt Ltd, Bangalore

[From Oct 2019 to March 2022]

Project Experience: 1

Project Name: Workforce Analytics R3
Client: Standard Chartered Bank (SCB)
Role: Cloud Engineer
Organization: Optimum Infosystems Pvt Ltd.
Duration: April 2022 – Jan 2024
Tools / Skills used: Google Cloud Platform, Dataproc, GCS, Hadoop (HDFS), PySpark, Spark SQL, BigQuery, Airflow/Composer, Google Cloud SDK Shell, GitHub, Linux, Agile, JIRA, Excel, MS Word

Project Description:

Standard Chartered Bank is an international bank that provides loans and pays interest to customers on invested deposits and gold as per the bank's terms and conditions, and it confidentially manages the data of the employees it hires. The main aim of the bank is to build a good relationship between customers and the bank by providing secure transactions and keeping customer details confidential. Employee and bank user details can be tracked by country/region and legal entity, and by whether the employee is a direct SCB employee or a contract employee.

Roles and Responsibilities:

Attended BRD walkthroughs scheduled by BAs to gain a better understanding of the requirements.

Analyzed and reviewed the BRDs to understand the exact requirements.

Understood client requirements by going through the transformation rule documents provided by the Business Analyst.

Gathered requirements from the business team and designed and developed design documents.

Responsible for creating GCS buckets, datasets, and BigQuery tables in different layers of the BQ projects.

Worked on creating DAGs and tasks to deploy and run PySpark jobs in the Airflow scheduling tool.

Created authorized views in BigQuery to provide data to downstream consumers (ML/reporting/analytics/dashboards); a short sketch follows this list.

Responsible for creating the .sql / .py scripts from the mapping documents as part of building DAGs to handle big data.

Defined and scheduled job flows using Airflow.

Created scripts from the mapping documents and built DAGs as per the requirements.

Worked closely with BAs and the team lead to get the work done at the earliest and to resolve any clarifications needed from them.

Used GitHub for source code management (SCM).

Attended morning stand-up calls and evening catch-up calls to report work progress.

Modified scripts as needed based on review comments from the team lead.

Scheduled meetings with the BA or respective teams whenever clarifications or updates were required from them.

Sent daily and weekly status reports to my team lead on progress.

Accountable for the assigned tasks end to end, from scratch to delivery.
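For illustration, below is a minimal sketch of the authorized-view pattern mentioned above, using the google-cloud-bigquery Python client: a view is created over a source table and then added to the source dataset's access entries, so downstream consumers can query the view without direct access to the raw data. The project, dataset, table, and column names are hypothetical placeholders, not the actual project objects.

# Minimal sketch: create a view and authorize it against the source dataset,
# so downstream consumers can query the view without access to the raw table.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

source_dataset_id = "my-gcp-project.hr_raw"     # raw/source zone (placeholder)
shared_dataset_id = "my-gcp-project.hr_shared"  # dataset exposed downstream (placeholder)
view_id = f"{shared_dataset_id}.employee_headcount_v"

# 1) Create the view with only the columns downstream teams should see.
view = bigquery.Table(view_id)
view.view_query = """
    SELECT country, legal_entity, COUNT(*) AS headcount
    FROM `my-gcp-project.hr_raw.employees`
    GROUP BY country, legal_entity
"""
view = client.create_table(view, exists_ok=True)

# 2) Authorize the view on the source dataset so it can read the raw table.
source_dataset = client.get_dataset(source_dataset_id)
entries = list(source_dataset.access_entries)
entries.append(
    bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
)
source_dataset.access_entries = entries
client.update_dataset(source_dataset, ["access_entries"])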

Project Experience: 2

Project Name: Texas Farm Bureau Group
Client: Texas Farm Bureau Group, US
Role: Associate Cloud Engineer
Organization: IQURA Technologies Pvt Ltd.
Duration: Oct 2019 – March 2022
Tools / Skills used: Google Cloud Platform, Dataproc, GCS, PySpark, Spark SQL, BigQuery, Airflow/Composer, Google Cloud SDK Shell, GitHub, Linux, Agile, JIRA, Excel, MS Word

Project Description:

Texas Farm Bureau Group is an insurance company that provides life insurance. The main aim of the project is to report on the various premiums, claims paid to policyholders, commissions paid to agents, and premiums received from policyholders. Premiums can be tracked by channel, marketing type, and location, and the data loaded into the tables helps business users make the required decisions and move forward.

Roles and Responsibilities:

Analyzed and reviewed the requirements to understand them exactly.

Understood client requirements by going through the transformation rule documents provided by the Business Analyst.

Gathered requirements from the business team and designed and developed design documents.

Responsible for creating GCS buckets, datasets, and BigQuery tables in different layers of the BQ projects.

Worked on creating DAGs and tasks to deploy and run PySpark jobs in the Airflow scheduling tool.

Created authorized views in BigQuery to provide data to downstream consumers (ML/reporting/analytics/dashboards).

Responsible for creating the .sql / .py scripts from the mapping documents as part of building DAGs to handle big data.

Defined and scheduled job flows using Airflow.

Created scripts from the mapping documents and built DAGs as per the requirements.

Worked closely with BAs and the team lead to get the work done at the earliest and to resolve any clarifications needed from them.

Used GitHub for source code management (SCM).

Attended morning stand-up calls and evening catch-up calls to report work progress.

Modified scripts as needed based on review comments from the team lead.

Scheduled meetings with the BA or respective teams whenever clarifications or updates were required from them.

Sent daily and weekly status reports to my team lead on progress.

Accountable for the assigned tasks end to end, from scratch to delivery.

Languages

English

Telugu

Hindi

Personal Details

Marital Status : Unmarried

Mobile No. : 913-***-****

Nationality : Indian

Declaration:

I hereby declare that all the statements made herein are true to the best of my knowledge.

Place: Hyderabad

Date:

(Chandra Sekhar K)


