Susmitha Pamarthy
Email: *********@*****.***
Mobile: +1-302-***-****
LinkedIn: My Profile
EXPERIENCE SUMMARY
6.5 years of experience in application development and support of critical business applications.
3+ years of experience with the Hadoop stack and data analysis.
Strong programming knowledge of shell scripting, Spark, and Python.
Good understanding of NoSQL databases such as HBase.
Hands-on experience scheduling jobs with Autosys and Oozie.
Worked on a Cloudera cluster of 160 nodes.
Experience with the Cloudera Distribution of Hadoop and job/resource monitoring using Cloudera Manager.
Experienced in handling large datasets during the ingestion process itself, using partitioning, Spark's in-memory capabilities, broadcast variables, and effective, efficient joins and transformations.
Extensively worked as an Oracle WebLogic administrator on server configuration, domain creation, application deployment, and performance tuning, including troubleshooting and maintenance of WebLogic Server.
Created clusters and replication groups for high-availability environments; implemented fail-over, fail-back, and backup strategies for WebLogic environments and applied WebLogic best practices for deployments, system configuration, capacity planning, infrastructure architecture, failover, backup and recovery, scalability, and security.
Experience in finding memory leaks and tuning JVM heap sizes and GC parameters.
Expertise in Server setup and application deployment.
Strong academic background.
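As an illustration of the Oozie job scheduling mentioned above, a minimal workflow sketch follows; the workflow name, action, script, and properties are hypothetical placeholders, not taken from any actual project configuration:

```xml
<!-- Hypothetical Oozie workflow: all names and paths are illustrative only -->
<workflow-app name="daily-hive-load" xmlns="uri:oozie:workflow:0.5">
  <start to="load-events"/>
  <action name="load-events">
    <hive xmlns="uri:oozie:hive-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>load_events.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Load failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

In practice such a workflow is typically wrapped in an Oozie coordinator (or triggered by an Autosys job) to run on a schedule.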
ACADEMIC QUALIFICATION
Master's in Engineering Management, Point Park University, Pittsburgh, PA.
Post-Graduate Diploma in Information Technology (PGDIT) from Symbiosis Centre for Distance Learning (SCDL).
Bachelor of Technology (B.Tech), 2010, from GIET (affiliated to JNTU Kakinada).
MapR Certified Hadoop Developer (MCHD)
WORK EXPERIENCE
Worked as Senior Data Engineer at DXC Technology, formerly CSC (Computer Sciences Corporation).
PROFESSIONAL SKILLS
Technologies
Hadoop (Spark, Hive, Impala, Oozie, HDFS, HBase, Sqoop, Kafka), shell scripting, PySpark
Operating Systems
Red Hat Linux 5.0/6.0
Database
Hive, Impala, HBase, Oracle
Hadoop Distribution
Cloudera Distribution 5.8.5
Tools
Putty, WinSCP, Autosys
PROJECT DETAILS
1. Master’s in Engineering Management at Point Park University, Pittsburgh, PA
Duration: December 2018 – January 2020
Pursued a Master’s in Engineering Management, a multidisciplinary program that builds theoretical and practical knowledge across engineering fields and engineering mathematics, along with advanced skills in project management, decision making, data analysis, and personnel management.
2. Hadoop as a Service (HAAS)    Duration: December 2013 – February 2017
HAAS is an analytical repository that holds customer events such as online and mobile business events, customer alerts, and notifications. The repository provides Customer 360 data for customer-analytics use cases. The main aim of the project is to centralize the source data for audit/legal report generation in a historical data store; these reports were otherwise generated from multiple sources.
Role: Senior Hadoop Developer
Technology used: Hive, Spark, Python, Sqoop, Impala, Unix/Linux, Oozie
Responsibilities:
Contributed to designing the One Hadoop architecture with respect to storage, processing, and extraction.
Contributed to developing HQL (Hive query logic) per the high-level design (HLD).
Wrote applications that load data into Hive in Parquet format with Snappy compression, including table-to-table loads.
Optimized HiveQL queries for better scalability, reliability, and performance.
Monitored the performance of jobs.
Analyzed the performance of utilities.
Designed applications for data wrangling using Spark transformations and Spark SQL.
Implemented compression techniques based on the nature of the data.
Optimized existing algorithms in Hadoop using SparkContext, Spark SQL, DataFrames, and pair RDDs.
Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
3. Avaya – Fusion Middleware    Duration: August 2010 – November 2013
Avaya is a global leader in enterprise communications systems. The company provides unified communications, contact centers, data solutions, and related services directly and through its channel partners to leading businesses and organizations around the world. Enterprises of all sizes depend on Avaya for state-of-the-art communications that improve efficiency, collaboration, customer service, and competitiveness.
Technology used: SOA, WebLogic 10.5/12, Linux, Oracle 9i/10g/11g, JDK 1.6.0_10
Responsibilities:
Designed and implemented WebLogic application servers on Linux.
Installed WebLogic Server and configured WebLogic domains for new environments.
Provided production support for applications running on WebLogic 10.x/11 and troubleshot WebLogic Server issues.
Set up cluster environments for WebLogic Server integrated with multiple workflows.
Installed and configured the Apache HTTP Server plug-in so that requests are proxied from Apache HTTP Server to WebLogic Server.
Supported WebLogic Server clustering, load balancing, failover, and performance tuning with respect to heap, threads, and connection pools.
Configured and administered connection pools for JDBC connections.
Developed UNIX shell scripts and WLST scripts to start/stop WebLogic Admin and Managed Servers and to deploy applications such as .war and .ear files.
Created user IDs and assigned roles and policies in WebLogic per project SLA requirements.
Managed and installed patches, upgrades, and enhancements on middleware products.
Performed certificate deployments and migration from SHA-1 to SHA-2 algorithms in UAT and PROD.
Installed and configured Oracle SOA Suite 11gR1 components in Oracle WebLogic Server domain environments.
Monitored and managed SOA components using the Oracle Enterprise Manager Fusion Middleware Control console to perform administrative tasks.
Managed and monitored JVM performance via WebLogic heap sizing and garbage collection.
Set up adapters in WebLogic: DB, JMS, SAP, and File adapters.
Experience configuring enterprise service schedulers in SOA.
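The WLST start/deploy automation described above can be sketched as follows; this runs under WebLogic's `wlst.sh` (Jython), not plain Python, and every URL, credential, application name, and path below is a placeholder, not a real environment value:

```python
# WLST script (Jython): executed via $WL_HOME/common/bin/wlst.sh
# All names, URLs, and paths are hypothetical placeholders.
connect('weblogic', '<password>', 't3://adminhost:7001')

# Deploy an application archive (.ear/.war) to a managed-server target.
deploy('orders-app', '/apps/orders-app.ear', targets='ManagedServer1')

# Start a managed server through the Node Manager.
start('ManagedServer1', 'Server')

disconnect()
```

In practice a wrapper shell script would invoke `wlst.sh` with this file as an argument and pass the credentials from a secured properties file rather than hard-coding them.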