
Database Administrator Sql Server

Location:
Chattanooga, TN, 37419
Posted:
April 18, 2025

Contact this candidate

Resume:

Md Sultan Mahmud, New York City, NY, *****

Tel: +1-929-***-****

E-mail: **.******.***@*****.***

LinkedIn: linkedin.com/in/muhamed-mahmud-027514338

Professional Summary:

I am a highly skilled and knowledgeable SQL Database Administrator with 4 years of professional experience. I am adept at developing and managing SQL databases and at troubleshooting and resolving errors and issues. I have extensive experience in setting up replication, developing backup and recovery plans, and performance tuning and optimization. I am also well-versed in monitoring database security, establishing database policies and procedures, and providing user support. I am an enthusiastic team player and bring a wealth of knowledge and expertise to any team I join. I also have 5 years of IT experience in administration, implementation, analysis, and testing of enterprise-wide applications, data warehouses, and big data/Hadoop.

Core Skills:

Database Design & Development

Database Administration & Maintenance

Database Security

Backup & Recovery

Performance Tuning & Optimization

Database Replications

Database Log Shipping

Database Mirroring

AlwaysOn Availability Groups (working knowledge)

Troubleshooting & Problem-Solving

Professional Experience:

Role: MS SQL Server DBA

Client: Microsoft (Redmond, WA), Remote.

Duration: Feb 2021 – Present

Roles and responsibilities:

Developed and managed large-scale SQL databases and automated database maintenance tasks.

Developed backup and recovery plans, implemented replication, and performed performance tuning and optimization.

Monitored database security and implemented database policies and procedures.

Investigated and resolved database errors and issues.

Provided technical support and user assistance.

Collaborated with other teams to optimize database performance.

Prepared detailed technical documentation.
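The backup-and-recovery work above follows a common pattern: take an online copy of the database, then verify the copy before trusting it. As a minimal, self-contained sketch of that pattern, the example below uses Python's built-in `sqlite3` backup API in place of SQL Server's native `BACKUP DATABASE`/`RESTORE VERIFYONLY`; the `orders` table and file names are illustrative only.

```python
import sqlite3

def backup_and_verify(src_path: str, dst_path: str) -> int:
    """Copy the database to dst_path while it is live, then return the
    row count found in the backup as a basic integrity check."""
    src = sqlite3.connect(src_path)
    dst = sqlite3.connect(dst_path)
    src.backup(dst)  # online, page-by-page copy
    # Verify by querying the backup itself, not the source.
    count = dst.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    src.close()
    dst.close()
    return count

# Demo: create a small "live" database, back it up, and verify.
con = sqlite3.connect("live.db")
con.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY)")
con.execute("DELETE FROM orders")
con.executemany("INSERT INTO orders (id) VALUES (?)", [(i,) for i in range(5)])
con.commit()
con.close()

print(backup_and_verify("live.db", "backup.db"))  # 5
```

In production the verification step would typically also restore to a scratch instance and run consistency checks, but the copy-then-verify shape is the same.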

Role: Hadoop Admin

Client: WellCare Health Plans

Duration: May 2017 – Feb 2021

Responsibilities:

Built scalable, distributed Hadoop environments using Cloudera CDP 7.x for different environments.

Installed Hadoop-related services including Cloudera Manager, HDFS, YARN, Hive, Impala, Hue, Spark, ZooKeeper, Ranger, Solr, Kafka, HBase, and Oozie.

Managed the clusters and responded to service failures, connection issues, access denials, and similar incidents.

Documented installation steps, requirements, and troubleshooting approaches for future reference.

Set up clusters using Cloudera CDP 7.x, both on premises and in the cloud.

Built POC clusters to various test specifications on cloud VMs (AWS/Azure/GCP).

Configured external databases in MySQL, MSSQL, and Oracle for the metadata of Hadoop-related services (Cloudera Manager, Activity Monitor, Report Manager, Hive, Hue, Oozie, etc.).

Imported and exported data between databases such as MSSQL, MySQL, and Oracle and HDFS, Hive, and HBase using ETL processes and Talend.

Debugged issues reported by CSS engineers through ICM.

Analyzed reported issues and reviewed logs of HDInsight cluster components to find the root cause (RCA) and provide mitigation steps.

Performed Ambari cluster administration, performance tuning and optimization, troubleshooting and issue resolution, cluster capacity planning, and backup and recovery strategy design.

Provided 24/7 on-demand administration support.
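The import/export work described above (moving RDBMS tables into HDFS, Hive, and HBase) follows the Sqoop pattern: read a source table and land it as delimited part files split across mappers. The sketch below illustrates that split in plain Python; `sqlite3` and CSV files stand in for the RDBMS and HDFS so it runs anywhere, and the `claims` table and mapper count are illustrative assumptions.

```python
import csv
import pathlib
import sqlite3

def import_table(con, table: str, out_dir: str, mappers: int = 2) -> list:
    """Dump `table` into Sqoop-style part-m-NNNNN files, one per mapper."""
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    rows = con.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    files = []
    for m in range(mappers):
        part = rows[m::mappers]  # round-robin split, like mapper input splits
        path = out / f"part-m-{m:05d}"
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(part)
        files.append(str(path))
    return files

# Demo: a small source table split across two "mappers".
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE claims (id INTEGER, amount REAL)")
con.executemany("INSERT INTO claims VALUES (?, ?)", [(i, i * 10.0) for i in range(6)])
print(import_table(con, "claims", "claims_out"))
```

Real Sqoop splits on a key range per mapper rather than round-robin, but the output layout (one delimited file per mapper task) is the same shape.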

Role: Big Data Consultant

Client: M&T Bank

Duration: Sep 2015 – Apr 2017

Responsibilities:

Extensively involved in installation and configuration of Cloudera's Hadoop distribution and its components, including NameNode, Standby NameNode, ResourceManager, NodeManager, and DataNodes.

Responsible for performance testing and benchmarking of the cluster based on business logic, resource availability and use cases.

Worked extensively on importing metadata into Hive and migrated existing tables and applications to Hive and HBase.

Supported Data Analysts in running Hadoop related jobs.

Responsible for scheduling jobs in Hadoop using the FIFO, Fair, and Capacity schedulers.

Involved in running Hadoop jobs for processing millions of records of text data.

Expertise in Hadoop Cluster capacity planning, performance tuning, cluster Monitoring, Troubleshooting.

Worked closely with the Talend ETL tool (data processes with Talend) and the RDBMS data-mover tool Sqoop.

Installed and configured Hive in the Hadoop cluster and helped business users and application teams fine-tune their Hive HQL for optimal performance and efficient use of cluster resources.

Provided 24x7 system support/maintenance for Customer Experience Business Services.
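The scheduler work above turns on one distinction: FIFO dispatches jobs strictly in submission order, while the Fair scheduler rotates across users so no single user monopolizes the cluster. The toy sketch below shows only that dispatch-order difference; real YARN schedulers allocate containers by resource share, and the job and user names are made up for illustration.

```python
from collections import deque

def fifo_order(jobs):
    """FIFO: dispatch strictly in submission order."""
    return [name for name, _user in jobs]

def fair_order(jobs):
    """Fair: round-robin across users' queues so each user gets a turn."""
    queues = {}
    for name, user in jobs:
        queues.setdefault(user, deque()).append(name)
    order = []
    while any(queues.values()):
        for q in queues.values():
            if q:
                order.append(q.popleft())
    return order

# alice submits three jobs before bob submits one.
jobs = [("a1", "alice"), ("a2", "alice"), ("a3", "alice"), ("b1", "bob")]
print(fifo_order(jobs))  # ['a1', 'a2', 'a3', 'b1']
print(fair_order(jobs))  # ['a1', 'b1', 'a2', 'a3']
```

Under FIFO, bob's job waits behind all of alice's; under fair scheduling it runs second, which is the behavior that motivates switching schedulers on shared clusters.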

Technical Skills:

Hadoop / Big Data Ecosystems: Cloudera CDP 7.x, CDH 5.x, Hortonworks, HDFS, YARN, MapReduce, HBase, Hive, Sqoop, Impala, Hue, Oozie, ZooKeeper, Apache Tez, Spark, Kafka

Cluster Management: Cloudera Manager, Apache Ambari, Nagios

Database Management & Development: Microsoft SQL Server Management Studio (SSMS), Azure Data Studio, SQL Server Data Tools (SSDT)

Backup & Recovery: SQL Server Agent, Redgate SQL Backup

Security & Compliance: SQL Server Audit, PowerShell for SQL Server, dbForge SQL Security Manager

Data Integration & ETL: SQL Server Integration Services (SSIS), Azure Data Factory (ADF), Talend/open-source ETL tools

Cloud & Big Data Integration: Azure SQL Database, AWS RDS for SQL Server, BigQuery (Google), Synapse Analytics (Azure), Google Cloud SQL (SQL Server)

Operating Systems: Linux (Red Hat), Windows.

Databases: MySQL, Oracle, PostgreSQL (with the HeidiSQL client tool).

Programming Languages: Java, Python, SQL.

Cloud Computing Services: AWS

Security: Kerberos, Active Directory, OpenLDAP, Sentry, Apache Ranger, TLS/SSL.

Job Schedulers: Airflow, Oozie.

ETL & BI Tools: Talend, Power BI, Tableau, Looker.

Versioning tools: Git, Bitbucket

Documentation tools: Wiki, Confluence

Project management tools: Jira

Performance Monitoring & Optimization: Nagios, Splunk, SCOM monitoring tool, SQL Server Profiler, Extended Events (XEvents), Database Engine Tuning Advisor (DTA)

File Sharing tools: WinSCP, FileZilla

Client connectivity tools: PuTTY, SuperPuTTY, SecureCRT, MobaXterm

Kusto Query Language (KQL) – used to inspect backend logs and cluster behavior.

ASC – used to inspect cluster properties and configuration and to identify possible troubleshooting steps.

Jarvis – used to view cluster details, network connectivity, ARM components, etc.

Skills and Components:

Scripting & Automation: PowerShell for SQL Server, T-SQL (Transact-SQL)

References available upon request

