
AWS Data

Location:
Lawrenceville, GA
Posted:
October 17, 2020


**** *******’s Dr

Lawrenceville, GA, *****

(***) 824 - 8016

adg11v@r.postjobfree.com

KIRTAN PATEL

PROFESSIONAL SUMMARY

1+ years of professional experience in the IT industry spanning AWS Cloud, infrastructure, and the development and implementation of software applications. Extensive experience studying the existing infrastructure landscape, matching cloud products, designing cloud architectures and proofs of concept, recommending design improvements, and implementing AWS Cloud infrastructure, including recommending public versus private cloud for application migrations.

Extensive AWS experience including services such as EC2, VPC (NAT, VPC peering, and VPN), IAM (Identity and Access Management), Elastic Beanstalk, Lambda, S3, CloudFront, Glacier, SQS, SNS, RDS, DynamoDB, Route 53, CloudWatch, CloudTrail, CloudFormation, security groups, etc.

Extensive knowledge of serverless architecture using AWS Lambda. Experience with application and data migration from on-premises systems to the AWS Cloud. Experience working with build and deployment tools such as Jenkins and Bamboo. Hands-on experience with best practices for web service development and integration. Well versed in version control tools such as Git, SVN, and TFS, used as central repositories from which checked-in changes are picked up by a Continuous Integration (CI) tool. Good knowledge of and experience with Java, PHP, Swift, and Python. Experience in client-side design using JSP, HTML5, and CSS3, and validation using JavaScript. Extensive working experience on Linux- and Windows-based systems. Good understanding of creating CloudFormation templates for quick and repeatable deployment of infrastructure.
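As an illustration of the CloudFormation-driven deployments described above, here is a minimal sketch of launching a stack from Python with boto3. The template file, stack name, region, and parameter are placeholder assumptions, not artifacts from an actual project.

```python
import boto3

# Hypothetical example: create a CloudFormation stack and wait until it finishes.
cfn = boto3.client("cloudformation", region_name="us-east-1")  # assumed region

with open("network-stack.yaml") as f:  # placeholder template file
    template_body = f.read()

cfn.create_stack(
    StackName="demo-network-stack",             # placeholder stack name
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],      # required if the template creates IAM resources
    Parameters=[{"ParameterKey": "EnvironmentName", "ParameterValue": "dev"}],
)

# Block until the stack reaches CREATE_COMPLETE (raises if creation fails).
cfn.get_waiter("stack_create_complete").wait(StackName="demo-network-stack")
```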

Hadoop developer with 2 years of working experience designing and implementing complete end-to-end Hadoop infrastructure using MapReduce, Pig, Hive, Sqoop, Spark, and HBase. Experience installing, configuring, and testing Hadoop ecosystem components. Good knowledge of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts. Experience writing MapReduce programs using Hadoop to work with big data. Experience analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java. Experience importing and exporting data with Sqoop between relational database systems and HDFS. Collected and aggregated large amounts of log data using Apache Flume and stored it in HDFS for further analysis. Experience developing applications using Java, PHP, MySQL, HTML5, CSS3, JavaScript, jQuery, and JSON.

8+ years of experience designing web applications using WordPress, phpMyAdmin, e-commerce platforms, HTML, CSS, JavaScript, React, Node.js, Express.js, jQuery, Bootstrap, JSON, and MVC. Maintain a user guide to help clients understand site features, and manage the website to increase visitor satisfaction.

Ensure high performance and availability and manage all technical aspects of the CMS. Experienced in working with React.js and its workflow, including Redux, to create scalable single-page applications (SPAs). Proficient in object-oriented JavaScript, including newer ECMAScript specifications and DOM manipulation. Act as the company expert in creating PPC and Google AdWords campaigns to be integrated with new website builds.

Developed with HTML5, CSS3, JavaScript, jQuery, Node.js, Express.js, SQL Server, and MongoDB, implementing MVC. Worked with RESTful APIs and modern authorization mechanisms such as JSON Web Tokens. Experience with common front-end development tools such as Babel, webpack, and NPM. Experienced in Waterfall methodology and Agile (Scrum) environments. Worked with code versioning tools such as Git. Problem-solving and analytical skills to find creative solutions independently. Self-motivated, with excellent communication skills, and a good team player. Enthusiastic, passionate, quick learner, willing to work in a fast-paced environment.

EXPERIENCE

Client: SUNTRUST BANKS Jul 2019 to Current

Role: AWS Solutions Architect

Responsibilities:

Designed and implemented public- and private-facing websites on the AWS Cloud. Migrated from on-premises infrastructure to the AWS Cloud. Configured and managed various AWS services including EC2, RDS, VPC, S3, Glacier, CloudWatch, CloudFront, Route 53, SQS, SNS, etc.

Worked with the AWS discovery tooling and architecture assessments, providing critical data for migrating to AWS.

Designed, implemented, and maintained all AWS infrastructure and services within a managed-service environment.

Designed, deployed, and maintained enterprise-class security, network, and systems management applications within an AWS environment.

Created Lambda functions to automate processes in the cloud (a sketch of this kind of automation follows this role's responsibilities). Created design documents covering the AWS security best practices to be implemented within the team. Designed and implemented backup and disaster recovery plans. Created CloudFormation templates to automate infrastructure deployment across regions. Assigned roles and policies to users and security groups using Identity and Access Management (IAM). Created various performance metrics and configured notifications using CloudWatch and SNS. Configured Elastic Load Balancers, Auto Scaling groups, snapshots, and AMIs to create a highly fault-tolerant and resilient environment.

Identified, managed, and communicated issues and risks to senior management. Created a proof of concept using AWS CodeCommit, CodePipeline, and CodeDeploy for continuous integration on AWS infrastructure.
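The Lambda automation mentioned in the responsibilities above could look roughly like the following minimal sketch: a Python handler that snapshots EBS volumes carrying a backup tag. The tag convention and return value are illustrative assumptions rather than details from the engagement.

```python
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    # Hypothetical automation: snapshot every EBS volume tagged Backup=true.
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "tag:Backup", "Values": ["true"]}]  # assumed tag convention
    )["Volumes"]

    for vol in volumes:
        ec2.create_snapshot(
            VolumeId=vol["VolumeId"],
            Description="Automated backup of " + vol["VolumeId"],
        )

    # Small summary that ends up in CloudWatch Logs.
    return {"snapshots_created": len(volumes)}
```

In practice such a function would be triggered on a schedule (for example, an EventBridge rule) and given an IAM role allowing ec2:DescribeVolumes and ec2:CreateSnapshot.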

Kin Techno World Feb 2017 – Mar 2019

Hadoop Developer

Responsibilities:

Loaded data from different sources (Teradata and DB2) into HDFS using Sqoop and loaded it into partitioned Hive tables. Developed Pig scripts to transform the data into a structured format. Developed Hive queries for analysis across different banners. Developed Hive UDFs to bring all customer Email_Id values into a structured format. Developed Oozie workflows for daily incremental loads that pull data from Teradata and import it into Hive tables.

Developed Bash scripts to fetch log files from the FTP server and process them for loading into Hive tables; all Bash scripts are scheduled through the Resource Manager scheduler. Developed MapReduce programs to apply business rules to the data (see the sketch after this role's responsibilities). Developed and executed Hive queries to denormalize the data. Worked on analyzing data with Hive and Pig.

Experience implementing rack topology scripts on the Hadoop cluster. Very good experience with both MapReduce setups.
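As a minimal sketch of the business-rule MapReduce work referenced above, the mapper below could be run with Hadoop Streaming; it lower-cases and validates an email field in tab-delimited records and drops malformed rows. The three-column layout and the validity rule are illustrative assumptions.

```python
#!/usr/bin/env python3
# Hypothetical Hadoop Streaming mapper. Assumed input layout per line:
# customer_id <TAB> email <TAB> amount
# Emits rows with a normalized, valid email; silently drops the rest.
import re
import sys

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) < 3:
        continue  # malformed record
    customer_id, email, amount = fields[0], fields[1].strip().lower(), fields[2]
    if EMAIL_RE.match(email):
        sys.stdout.write("\t".join([customer_id, email, amount]) + "\n")
```

The script would be wired into a job through the hadoop-streaming jar, with an identity or aggregating reducer depending on the rule being applied.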

Kin Techno World Apr 2015 – Dec 2016

WordPress Developer

Responsibilities:

Responsible for the architecture, design, and object-oriented development of high-volume web service applications.

Technically led a team of more junior developers from design through release. Deadline-focused, detail-oriented, well organized, and self-motivated. Extensive knowledge of PHP (LAMP stack).

Prepared business and technical documentation.

Worked with the Demand Generation team to ensure the website functions properly for SEO and SEM, as well as the proper display of data.

Carried out PHP and HTML development outside of WordPress for landing pages, microsites, etc. Utilized JavaScript frameworks to implement website functionality. Managed technical integration with third-party services. Ensured cross-browser, cross-platform, and multi-device compatibility for all web solutions and performed any necessary QA on finished products.

Developed design and marketing materials to enhance brand presence (eBooks, flyers, booth design, web banners, PowerPoints, sales collateral, infographics, videos, etc.). Monitored uptime, hosting, and databases to ensure the site performs at maximum capacity. Development expertise with custom post types, custom fields, plugins, and themes for high-traffic WordPress sites.

Ample Infotech Jun 2012 – Apr 2015

Web Developer

Responsibilities:

Involved in the entire software development life cycle (SDLC), from requirements gathering and analysis through design, development, and testing.

Understood business requirements and translated them into technical requirements. Translated screen mock-ups into high-quality code.

Built reusable components using JavaScript (ES6+) and JSX, with props, state, lifecycle methods, Hooks, and PropTypes to keep the React code clean. Optimized components for maximum performance across diverse web-capable devices and browsers to build responsive websites.

Used Node.js and the React.js framework to develop the single-page application (SPA). Used Babel to transpile ES6 and JSX into plain JavaScript. Responsible for developing actions, stores, and reducers; implemented immutable data structures using the Immutable.js library; and managed component state through Redux. Worked with Express.js, adding middleware and handling different routes. Worked with REST APIs and used authorization mechanisms such as JSON Web Tokens. Handled server-side validation along with error handling. Used JavaScript Promises in place of callback arguments.

Wrote SQL and stored procedures with SQL Server.

Performed CRUD operations using a MongoDB database.

Tested React with test frameworks such as Jest and Enzyme. Provided version control using Git.

Effectively handled the tickets raised by the business.

TECHNICAL SKILLS

AWS SKILLS: EFS, EBS, AMI, CloudFront, CloudWatch, CloudTrail, Auto Scaling, VPC, NAT, Route 53, RDS, DynamoDB, Elastic Transcoder, Elastic Beanstalk, OpsWorks, SQS, SNS, SES, Glacier, Lambda.

BIG DATA ECOSYSTEMS: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop, Spark, Scala.

WEB TECHNOLOGIES: WordPress, HTML, CSS, JavaScript, Bootstrap, jQuery, JSON.

PROGRAMMING LANGUAGES: PHP, Swift, Java, JavaScript, jQuery, Python, C++, Angular, HTML, CSS, XML/XSLT.

DATABASES: Oracle, MySQL, MongoDB, Firebase, Amazon RDS, DynamoDB.

ACADEMIC QUALIFICATION

BACHELOR OF COMPUTER APPLICATION, MAY 2011

SARDAR PATEL UNIVERSITY, GUJARAT, INDIA


