FAIZAN RAFIQ
***********.**@*****.***
https://www.linkedin.com/in/faizan-rafiq-07144740/
OBJECTIVE
To succeed in an environment of growth and excellence that also provides job satisfaction and personal development.
SKILLS
Kubernetes 1.19
Grafana
Greenplum administration 5.1
PostgreSQL administration
Oracle 10g, 11g, 12c
Talend
Big data, modern data warehouses, and distributed databases
Operating systems: Linux
AWARDS & CERTIFICATIONS
Certified Kubernetes Administrator (CKA). (2021) (Certificate No: LF-ozx34xs4a6)
Certified Kubernetes Application Developer (CKAD). (2020) (Certificate No: LF_76pm9scv2p)
Salesforce Developer Certification Dev-401 (2015).
PERSONAL DETAILS
Father’s Name: Mr. Rafiq Raaz
Gender: Male
Marital Status: Married.
Current Location: Srinagar
Languages Known: English, Kashmiri, Urdu, and Hindi.
EDUCATION
Bachelor of Engineering in Information Technology - 2006-2010
MIET, Jammu University
PROFESSIONAL SUMMARY
Over 8.10 years of hands-on technical experience, with extensive knowledge of Kubernetes, CI/CD pipelines, and databases and data warehouses such as SQL, the Greenplum MPP database, and PostgreSQL, along with Linux and shell scripting. Good understanding of MPP data warehouses.
EXPERIENCE
Working as a Technical Lead at L&T Technology Services, India, from Mar 2021 to present. My job responsibilities include the following:
Strong experience with and knowledge of all the components of the Kubernetes architecture.
Setting up HA Kubernetes clusters on-prem or on the AWS cloud.
Upgrading Kubernetes clusters using the kubeadm tool.
Monitoring logs using the EFK (Elasticsearch, Fluentd, Kibana) stack.
Good understanding of AWS and its services, such as IAM, VPC, EC2, ECS, EKS, and EKS Fargate.
Kubernetes Networking (Services, Endpoints, DNS, LoadBalancers).
Creating Secrets and ConfigMaps, using Secrets in pods, mounting ConfigMaps as volumes, handling large configuration data sets, etc.
Good understanding of HashiCorp Vault.
Experience in setting up monitoring and alerting for Kubernetes clusters using open-source tools such as Grafana and Prometheus.
Experience with and knowledge of custom image builds on the Kubernetes platform, along with application management aspects such as Kibana dashboards, resource scheduling, quotas, and quota enforcement.
Setting up ingress controllers and writing ingress rules through Ingress resources.
Setting up and managing Istio Service Mesh.
Deploying applications using helm charts.
Deploying applications using Jenkins automated pipeline.
Experience in leading small teams to deliver projects, using tools such as Git, Jira, and Confluence to coordinate.
Worked as a Greenplum database and Kubernetes Consultant at Genpact, India, from Apr 2018 to May 2020. My job responsibilities included the following:
Strong experience with CI/CD pipeline development on the Kubernetes platform, with familiarity in managing image changes, cascading pipelines, and customizing GitLab on Kubernetes.
Complete understanding of the Greenplum architecture.
Experience with storage provisioning and the creation of PVs and PVCs on the Kubernetes platform.
Thorough understanding of deployment strategies such as rolling update, recreate, blue-green, and canary.
Creating Secrets and ConfigMaps, using Secrets in pods, mounting ConfigMaps as volumes, handling large configuration data sets, etc.
Experience with and knowledge of custom image builds on the Kubernetes platform, along with application management aspects such as Kibana dashboards, resource scheduling, quotas, and quota enforcement.
Experience in setting up monitoring and alerting for Kubernetes clusters using open-source tools such as Grafana and Prometheus.
Kubernetes networking (Services, Endpoints, DNS, LoadBalancers).
Responsible for creating and modifying Postgres and Greenplum scripts and functions according to business requirements.
Loaded data into the database instance using external tables, SQL COPY and INSERT commands, and parallel load utilities.
Created new procedures, functions, triggers, packages, and SQL*Loader scripts, and modified existing code, tables, views, and SQL*Loader scripts.
Tested ETL interfaces after enhancement/development.
Worked as a Database Consultant at BQE, India, from Nov 2017 to Mar 2018. My job responsibilities included the following:
Played a role in data modeling of the data warehouse for reporting needs.
Responsible for creating and modifying Postgres scripts and functions according to business requirements.
Created new procedures, functions, triggers, packages, and SQL*Loader scripts, and modified existing code, tables, views, and SQL*Loader scripts.
Distributed and stored data in distributed databases using distribution keys and partitioning.
Worked as a Greenplum Consultant at ServisCo Pte Limited, Myanmar, from Jul 2017 to Oct 2017. My duties included the following:
Collaborated with development, architecture, and release teams, providing architectural design recommendations and driving standards for an effective transition of Greenplum and Postgres to production operations.
Managed backup and recovery functions for a Greenplum data warehouse environment.
Responsible for creating and modifying Postgres scripts and functions according to business requirements.
Loaded data into a Greenplum database instance using external tables, SQL COPY and INSERT commands, and parallel load utilities.
Created and updated infrastructure scripts to perform Greenplum software installation and upgrades, configuration, database creation, segment expansion/contraction, mirroring, and backups.
Distributed and stored data in Greenplum using distribution keys and partitioning.
Tuned SQL queries to improve response time.
Worked at Xavient Software Solutions India Pvt. Ltd. from Jan 2012 to Jun 2017 in various roles, including Greenplum and Oracle DBA. Job responsibilities included:
Performed operational tasks related to Postgres and Oracle databases, including startup, shutdown, problem determination, backup/recovery, and performance monitoring.
Knowledge of tools for loading and unloading data to/from Postgres and Oracle databases.
Supported application development teams in the design, development, and implementation of application schemas, database configuration, and data loading/unloading schemes for Postgres databases.