
Node.js Web Services

Location:
Hopkinton, MA
Posted:
September 06, 2023


Name: Parva Shah, Sr. Python Developer

adzimc@r.postjobfree.com (857) 232 - 8551

PROFESSIONAL SUMMARY:

• 7 years of professional IT experience in analysis, design, development, testing, quality analysis, and audits of enterprise applications and database development.

• Extensive knowledge and strong coding skills in Python, Shell, SQL, Ruby, Node.js, and MATLAB.

• Good experience in software development with Python (libraries used: Beautiful Soup, NumPy, SciPy, Matplotlib, python-twitter, Pandas DataFrames, NetworkX, urllib2, MySQLdb for database connectivity) and IDEs such as Sublime Text, Spyder, PyCharm, and Emacs.

• Strong expertise in development of web-based applications using Python, Java, HTML, XML, KML, Angular, React.js, Node.js, CSS, DHTML, JavaScript, JSON, and jQuery.

• Experience in working with Lambda, AWS Connect, Amazon Lex, AWS CLI, AWS CDK with Python, Serverless and PaaS toolkit, EMR, Kinesis, S3, RDS and other core AWS services.

• Good exposure to UI design using Bootstrap, HTML, CSS, JavaScript.

• Strong programming skills in designing and implementing multi-tier applications using web-based technologies like Spring MVC and Spring Boot.

• In-depth knowledge of Data Structures and Algorithms and Design Patterns; proficient in UNIX shell scripting, Python scripting, Bash scripting, and SQL query building (joins, subqueries, correlated queries, and analytical queries).
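The query-building skills above can be illustrated with a small, self-contained SQLite sketch (the schema and data are hypothetical, chosen only to show a join, a subquery, and a correlated subquery):

```python
import sqlite3

# Hypothetical schema: employees and departments.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE emp  (id INTEGER PRIMARY KEY, name TEXT,
                       salary REAL, dept_id INTEGER REFERENCES dept(id));
    INSERT INTO dept VALUES (1, 'Eng'), (2, 'Sales');
    INSERT INTO emp VALUES (1, 'Ann', 95000, 1), (2, 'Bob', 60000, 2),
                           (3, 'Cal', 90000, 1);
""")

# Join + subquery: departments whose average salary beats the company average.
rows = conn.execute("""
    SELECT d.name, AVG(e.salary) AS avg_sal
    FROM emp e JOIN dept d ON e.dept_id = d.id
    GROUP BY d.name
    HAVING AVG(e.salary) > (SELECT AVG(salary) FROM emp)
""").fetchall()

# Correlated subquery: employees earning the top salary in their own department.
top = conn.execute("""
    SELECT e.name FROM emp e
    WHERE e.salary = (SELECT MAX(salary) FROM emp
                      WHERE dept_id = e.dept_id)
    ORDER BY e.name
""").fetchall()
```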

• Working experience in Database design, Data Modelling, Performance tuning and query optimization with large volume of multi-dimensional data.

• Good knowledge of NoSQL databases such as Cassandra, MongoDB, and DynamoDB, as well as PostgreSQL.

• Excellent T-SQL development skills: writing complex queries involving multiple tables, and developing and maintaining stored procedures, triggers, and user-defined functions.

• Built CI/CD pipeline as part of Automation setup using AWS resources.

• Extensively worked on Push and Pull architectures, Ansible, Docker and Kubernetes.

• Strong understanding and background in probability theory, random process, statistics, and optimization.

• Worked on a project built on AWS Connect, where we enabled communication between Lex, Lambda, and the customer.

• Worked extensively on DynamoDB, Polly, Lex, Comprehend, S3, CodeStar, CodePipeline, CodeBuild, CodeDeploy, CodeCommit, CloudFormation, CloudWatch, CloudFront, WebSocket, and AppSync.

• Implemented CI/CD pipelines using CircleCI to make every step containerized.

• Integrated and worked extensively with Datadog, New Relic, Sumo Logic, Splunk, Raygun, etc., and was able to choose among them depending on where each fit the use case.

• Extensive knowledge in RDBMS (MySQL, Oracle) & Big Data Databases.

• Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager.

• Web development using Python and Django-based web frameworks; comfortable choosing between Django and Flask.

• Implemented MVC/MVW architecture using Django, with RESTful and SOAP web services and SoapUI.

• Implemented REST services in Python, Java, and C++ with a Microservices architecture.

• Experienced in developing the data pipelines and managing the ETL end to end.

• Developed and supported applications on the LAMP stack (Linux, Apache, MySQL, and PHP).

• Strong knowledge of database concepts: CRUD operations and the aggregation framework.

• Experience in the design of MongoDB database - Indexing and Sharding.

• Extensive knowledge of SCM tools such as GitHub, CodeCommit, and Bitbucket.

• Worked with Waterfall and Agile methodologies; experience with lxml, XML, HTML, DHTML, AJAX, and Tomcat and Apache application servers across various platforms (UNIX, Linux, and Windows).

• Understanding of Big Data and algorithms using Hadoop, MapReduce, NoSQL, and distributed computing tools.

TECHNICAL SKILLS:

• Python Framework: Django, Pyramid, Flask, web2Py.

• Methodologies: Agile, Scrum, Waterfall

• Languages: Python 3.10/3.9/2.7/2.4, C++, Java, Shell Script, Perl, PHP, SQL

• Python Libraries: NumPy, Beautiful Soup, SciPy, Matplotlib, Pandas DataFrames, PySpark

• Frameworks: Bootstrap, Angular, Django, Node.js, Flask

• Databases: SQLite, MSSQL, MySQL, MongoDB, Cassandra, PostgreSQL, Oracle 10g/11g, Hive, Big Data stores

• IDEs: PyCharm, Eclipse, MS Visual Studio Code, PyStudio, Spyder

• Cloud Technologies: MS Azure, Amazon Web Services (EC2, S3, EBS, Lambda, API Gateway), GCP

• Web Technologies: AJAX, JavaScript, HTML, DHTML, XHTML, XML, jQuery, CSS

• RDBMS/SQL: MySQL, SQL Server, Oracle, Siebel, PL/SQL, Microsoft SQL, PostgreSQL, MongoDB

• Bug Tracking Tools: JIRA, Azure (ADO), Bugzilla.

• Version Control: GitHub, Git, SVN

• Messaging: RabbitMQ, AMQP, Kafka

• Other: Machine Learning, ETL, Hadoop, MapReduce, NoSQL

EDUCATION:

• Master’s Degree - Computer Science - University of Texas at Arlington, Arlington, Texas, USA

• Bachelor’s Degree - Computer Engineering - Gujarat Technological University, India

PROFESSIONAL EXPERIENCE:

Client: Federal National Mortgage Association (Fannie Mae), Washington, D.C. Aug 2021 - Present

Role: Sr. Python Developer

Responsibilities:

• Developed the architecture for parsing applications to fetch data from different services and transform it for storage in different formats.

• Experience in MVC architecture using Django for web-based applications with OOP concepts.

• Designed and developed components using Python with the Django framework. Implemented code in Python to retrieve and manipulate data. Used Python and Django for creating graphics, XML processing of documents, data exchange, and business logic implementation between servers.

• Implemented Algorithms for Data Analysis from Cluster of Web services.

• Worked on Python packages and modules like Requests, Scrapy, BeautifulSoup, Multithreading, Pandas, NumPy, SciPy, Matplotlib, wxPython, SQLAlchemy, JDBC/ODBC and Py-Virtual Environment.

• Contribute to development practices of Microservice architecture by developing reusable product extensions.

• Developed components and directives with Angular 7 using TypeScript.

• Integrated dynamic pages with Angular 7 and jQuery.

• Used the Angular 7 framework, where data from the backend is stored in the model and populated to the UI.

• Built the Web API on top of the Django framework to perform REST methods.

• Used Cassandra, MongoDB, and MySQL databases in Web API development. Developed database migrations using SQLAlchemy.

• Designed a python script to load transformed data into cloud services (AWS and GCP).

• Created apps using Node.js libraries, NPM, and gulp directories to generate views, and Flux to route the URLs properly.

• Used advanced features like pickle/unpickle in Python for sharing information across applications.
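A minimal sketch of the pickle/unpickle round trip described above (the session-state object is hypothetical; in practice the bytes would travel through a file or message queue, and pickle should only be used between trusted applications):

```python
import pickle

# Hypothetical session state shared between two applications.
state = {"user": "analyst", "filters": ["region=NE", "year=2021"], "page": 3}

# Serialize ("pickle") the object to bytes for the other application.
blob = pickle.dumps(state)

# The receiving side deserializes ("unpickles") an equal copy of the object.
restored = pickle.loads(blob)
```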

• Managed datasets using Pandas DataFrames and MySQL; queried the MySQL database from Python using the Python-MySQL connector and the MySQLdb package to retrieve information.

• Wrote Python scripts to parse XML documents and load the data into the database.
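The parse-and-load step can be sketched with the standard library alone; the XML payload, table name, and column names below are hypothetical stand-ins (a real script would read files and target MySQL rather than an in-memory SQLite database):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML document standing in for the parsed input.
xml_doc = """
<orders>
  <order id="101"><customer>Acme</customer><total>250.00</total></order>
  <order id="102"><customer>Globex</customer><total>99.50</total></order>
</orders>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")

# Parse the XML and collect one row tuple per <order> element.
root = ET.fromstring(xml_doc)
rows = [(int(o.get("id")), o.findtext("customer"), float(o.findtext("total")))
        for o in root.iter("order")]

# Bulk-load the parsed rows into the database.
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
conn.commit()

loaded = conn.execute("SELECT id, customer, total FROM orders ORDER BY id").fetchall()
```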

• Proficient in designing complex ETL processes using tools like Informatica PowerCenter, Informatica Cloud, and Talend.

• Developed dynamic web pages using Python Django Frameworks. Involved in creating initial website prototype from Django skeleton and building out Views, Templates using CSS for whole site following Django MVC architecture.

• Worked with lxml to dynamically generate SOAP requests based on the services.

• Used Wireshark, Live HTTP Headers, and the Fiddler2 debugging proxy to debug the Flash object and help develop a functional component. The PHP page for displaying the data uses AJAX to sort and display it, and also outputs data to .csv for viewing in Microsoft Excel.

• Used Python-based GUI components for front-end functionality such as selection criteria. Administered a dedicated collocated RHEL machine and configured complex Apache configuration files.

• Used JavaScript and XML to update a portion of a webpage and Node.js for server-side interaction.

• Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.

• Wrote Python scripts with CloudFormation templates to automate installation of Auto Scaling, EC2, and VPC.

• Worked on error and exception handling and debugging using the Eclipse and PyCharm IDEs.

• Composed software engineering workflows in Eclipse, such as automating the code deployment process using wizard plugins, the EGit plugin, and git-flow plugins.

• Developed REST Microservices, used as APIs for home automation, which also keep data synchronized between two database services.

• Output the parsed data as JSON and stored it in MongoDB.

• Worked in the API group running Jenkins in a Docker container with RDS and GCP slaves in Amazon AWS.

• Used Docker containers for development and deployment.

• Worked on MongoDB concepts such as locking, transactions, indexes, sharding, replication, and schema design.

• Engineered a data processing pipeline with GCP and OpenSlide APIs to pre-process large-scale data.

• As part of the POC, migrated the data from source systems to another environment using Spark and Spark SQL.

• Developed and implemented core API services using Python with Spark.

• Configured auto-scalable and highly available Microservices with monitoring and logging using AWS, Docker, Jenkins, and Splunk.

• Created DataFrames in a particular schema from raw data stored in Amazon S3 using PySpark.

• Involved in CI/CD process implementation using Jenkins along with Shell script.

• Involved in infrastructure as code, execution plans, resource graph and change automation using Terraform.

• Developed Merge jobs in Python to extract and load data into MySQL database.

• Created Terraform scripts for EC2 instances, Elastic Load balancers and S3 buckets.

• Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.

• Upgraded and maintained the JavaScript libraries and widgets so that data is managed consistently.

• Used PySpark SQL to load JSON data, create schema RDDs and DataFrames, and handle structured data with Spark SQL.

• Built numerous Lambda functions using Python and automated the process using the events created.

• Created an AWS Lambda architecture to monitor AWS S3 buckets, with triggers for processing source data.
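The S3-triggered processing above can be sketched as a Lambda handler; the handler name, bucket, and key below are hypothetical, and the fake event only mimics the documented S3 notification shape (a real handler would go on to fetch each object with boto3):

```python
import urllib.parse

def handler(event, context=None):
    """Hypothetical AWS Lambda entry point for S3 ObjectCreated triggers.

    Extracts (bucket, key) pairs from the S3 event payload.
    """
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        processed.append((bucket, key))
    return {"status": "ok", "objects": processed}

# Minimal fake event in the shape S3 sends to Lambda.
event = {"Records": [{"s3": {"bucket": {"name": "source-data"},
                             "object": {"key": "incoming/file+1.csv"}}}]}
result = handler(event)
```

Keeping the handler a pure function of the event dict, as here, makes it easy to unit-test without any AWS infrastructure.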

• Performed S3 bucket creation, applied IAM role-based policies and MFA, and customized the JSON templates.

• Worked on the MySQL database, writing simple queries and stored procedures for normalization.

• Deployed the project to Jenkins using the Git version control system.

• Learned to index and search/query large numbers of documents in Elasticsearch.

• Maintained and developed Docker images for a tech stack including Cassandra, Kafka, and Apache, running on Google Cloud Platform (GCP) on Kubernetes.

• The system is a full Microservices architecture written in Python, utilizing distributed message passing via Kafka with JSON as the data exchange format.

• Understanding of secure cloud configuration (CloudTrail), cloud security technologies (VPC, Security Groups, etc.), and cloud permission systems (IAM).

• Used version control systems like Git and SVN.

• Loaded the data into Spark RDDs and did in-memory computation to generate the output response.

• Created Maven POMs to automate the build process for projects and integrated third party tools like SonarQube.

• Performed operations such as CRUD and wrote complex queries against Oracle 10g/11g.

• Responsible for installing and administering the SonarQube for code quality check and Nexus repository and generating reports for different projects. Also, integrated them into Jenkins.

• Developed server-side application to interact with database using Spring Boot and Hibernate.

• Used Rest Controller in Spring framework to create RESTful Web services and JSON objects for communication.

• Worked extensively on writing UNIX shell scripts to schedule sessions for the ETL testing process.

• Consumed web services performing CRUD operations.

• Used the Python library Beautiful Soup 4 for web scraping to extract data for building graphs.

• Used Gearman API for client and admin client interfaces.

• Used Test driven approach (TDD) for developing services required for the application.

• Involved in unit testing; developed unit test cases and performed unit testing using unittest and pytest. Worked on the Selenium testing framework.
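A minimal unittest-style sketch of the kind of test case described above (the function under test and its expected values are hypothetical; pytest would discover the same `TestCase` automatically):

```python
import io
import unittest

def normalize_amount(raw):
    """Function under test: parse a currency string like '$1,250.50'."""
    return round(float(raw.replace("$", "").replace(",", "")), 2)

class TestNormalizeAmount(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(normalize_amount("42"), 42.0)

    def test_currency_string(self):
        self.assertEqual(normalize_amount("$1,250.50"), 1250.5)

# Run the suite programmatically, capturing the runner's output.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestNormalizeAmount)
result = unittest.TextTestRunner(stream=io.StringIO(), verbosity=0).run(suite)
```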

• Two-time winner of the Employee of the Month award for leading efforts to refactor application code for better stability and testability.

• Acknowledged for effectively leading cross-functional teams, resulting in successful project completion ahead of schedule.

Environment: Python 3.10/3.9/3.8, Django 3.0/4.0, Java, AWS, GCP, Angular 6/7, Node.js, Lambda, AJAX, PyCharm, Eclipse, MongoDB, Cassandra, API Gateway, Spring Boot, PyTest, Hibernate, REST API, Microservices, JavaScript, Spark, Spark API, Spark SQL, PySpark, Spring, S3, CloudWatch, OOP, MS-SQL Server, Kafka, Linux, GIT, Jira.

Client: Netspend, Austin, TX May 2019 – July 2021

Role: Sr. Python Developer

Responsibilities:

• Extensively used Python / Django Framework for developing backend applications.

• Strong expertise in working with server-side technologies, including databases, RESTful APIs, and MVC design patterns.

• Actively involved in Initial software development life cycle (SDLC) of requirement gathering and in suggesting system configuration specifications during client interaction.

• Created Python and Bash tools to increase the efficiency of the call center application system and operations: data conversion scripts, AMQP/RabbitMQ, REST, JSON, and CRUD scripts for API integration.

• Used Celery with RabbitMQ, MySQL, Elasticsearch, and Flask to create a distributed worker framework.

• Configured auto-scalable and highly available Microservices with monitoring and logging using AWS, Docker, Jenkins, and Splunk.

• Used Python packages like NumPy, Matplotlib, Beautiful Soup, Pickle, PySide, SciPy, PyTables, and urllib2.

• Analyzed data from sources using the Hadoop big-data stack, implementing Azure Data Factory, Hive, and Sqoop.

• Designed and developed components using Python with Django framework and implemented code in Python to retrieve and manipulate data

• Developed parsers for extracting data from different web services and transforming it for storage in various formats such as CSV, database files, and HDFS storage, then performed analysis.

• Developed multiple Spark batch jobs using Spark SQL, performed transformations using many APIs, and updated master data in Cassandra per the business requirements.

• Provisioned and managed a multi-data-center Cassandra cluster on Amazon Web Services. Familiar with writing MapReduce jobs for processing data over the Cassandra cluster and HBase.

• Involved in developing Java Microservices interconnected in the AWS cloud; also consumed and built both SOAP and RESTful web services.

• Developed Restful Microservices using Flask and Django and deployed on AWS servers using EBS and EC2.

• Develop Python Microservices with Django/Flask framework for Confidential internal Web Applications.

• Migrated servers, databases, and applications from on-premises to AWS and Azure.

• Used the Model-View-Controller (MVC) pattern to build modular and maintainable applications, building reusable code and libraries for future use.

• Developed Python based APIs (RESTful Web services) by using Flask, SQL, and PostgreSQL.

• Migrated the Django database from SQLite to MySQL to PostgreSQL with complete data integrity.

• Built RESTful Microservices and designed and developed the UI for customer sites using HTML, CSS, and jQuery.

• Contribute to the client's development standard practices of Microservice architecture by developing reusable product extensions and writing knowledge articles based on experience.

• Utilized the MEAN stack (MongoDB, Express, Angular 6, Node.js) and JSON for data transfer and server-side scripting.

• Worked extensively with Angular 4/6; generated components and services using the Angular CLI.

• Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources like S3 (ORC/Parquet/text files) into AWS Redshift.

• Imported tables from RDBMS to HDFS using Sqoop and used PySpark RDDs to stream data into HBase.

• Working with various Python Integrated Development Environments like PyCharm, Spyder and Sublime Text.

• Responsible for efficient design and development of responsive UIs using HTML5, CSS3, JavaScript, and the MEAN stack (MongoDB, Express, Angular 4/6, and Node.js).

• Build the Silent Circle Management System (SCMC) in Elasticsearch, Python, and Node.JS while integrating with infrastructure services.

• Developed applications with a RESTful architecture using Node.js and PHP as backend languages.

• Created a Python/Elasticsearch-based web application using Python scripting for data processing, MySQL for the database, and HTML5/CSS3/Ruby with Highcharts for data visualization of the served pages.

• Implemented networking operations like traceroute, an SMTP mail server, and a web server; socket programming in Python.

• Managed, developed, and designed a dashboard control panel for customers and administrators using Elasticsearch, HTML, CSS, Ruby, Bootstrap, and REST API calls.

• Automated RabbitMQ cluster installations and configuration using Python/Bash.

• Fetched Twitter feeds for certain important keywords using the python-twitter library.

• Built a real-time clickstream analytics platform using Spark, Kafka, and Elasticsearch, with dashboards built in Kibana and Grafana.

• Proficient in Angular routing, UI-Router, controllers, filters, and services.

• Developed using Spark APIs like Spark Core, Spark Streaming, Spark MLlib, and Spark SQL, and worked with different file formats such as Text, SequenceFiles, Avro, ORC, JSON, and Parquet.

• Used an ORM to automate the transfer of data stored in relational database tables into objects.

• Used design patterns such as Singleton and frameworks such as Django; able to handle the Django ORM (Object-Relational Mapper) and SQLAlchemy.

• Managed and reviewed Hadoop log files, analyzed SQL scripts, and designed the solution for the process using PySpark.

• Developed Python code to automate the ingestion of common formats such as JSON and CSV, using Logstash to ship from Elasticsearch to Kibana dashboards viewed by clients.

• Designed and deployed new ELK clusters (Elasticsearch, Logstash, Kibana, Graphite, Beats, Kafka, ZooKeeper, etc.).

• Developed API services in Python/Tornado, leveraging AMQP and RabbitMQ for distributed architectures.

• Worked with Kibana to inspect logs and other time-stamped data sets stored in Elasticsearch.

• Wrote and maintained automated Salt scripts for Elasticsearch, Logstash, Kibana, and Beats.

• Implemented and enhanced CRUD operations for the applications using the MVT (Model-View-Template) architecture of the Django framework and Python; conducted code reviews.

• Used a Microservice architecture, with Spring Boot-based services interacting via REST and Kafka.

• Developed Kafka producers and consumers, HBase clients, Spark and Shark streams, and Hadoop MapReduce jobs, along with components on HDFS and Hive.

• Used the AWS management console and Spinnaker to work with AWS. Wrote Python scripts using the boto3 library to manage EC2 instances and CloudFormation stacks.

• Created a real-time dashboard for executives, utilizing Logstash, Graphite, Elasticsearch, Kibana, and Redis.

• Used the Pandas API to put the data into time-series and tabular format for local-timestamp manipulation and retrieval, storing results into MongoDB.

• Used a Redis cache for storing commonly used info and propagated changes using RabbitMQ.

• Used Celery as task queue and RabbitMQ, Redis as messaging broker to execute asynchronous tasks.

• Optimized PySpark jobs to run on a Kubernetes cluster for faster data processing.

• Used Kubernetes to deploy, scale, and load-balance, and worked with Docker Engine, Docker Hub, Docker images, and Docker Compose to handle images for installations and domain configurations.

• Implemented Jenkins for CI/CD, automating all builds and deployments.

• Built Jenkins jobs to create AWS infrastructure from GitHub repos containing Terraform code; installed and administered Jenkins CI for Maven builds.

• Ingested large CSV, XML, and JSON data from computers around the world using Python with pandas, csv, xml, and NumPy. Formatted the raw data and built dynamic statistics pages for engineers.

• Developed an API using Hibernate to interact with the MySQL database; also created distributed MySQL Coherence domains, configuration, and system design based on Oracle.

• Automated various infrastructure activities such as continuous deployment, application server setup, and stack monitoring using Ansible playbooks; integrated Ansible with Rundeck and Jenkins.

• Wrote unit-testing code using unittest/pytest and integrated the test code with the build process.

• Managed a team of 10 staff members, ensuring everyone was knowledgeable about our products and customer expectations.

Environment: Python 3.8/3.7/3.5, Django 2.0/3.0, Java, AWS, GCP, Angular 4/6, Node.js, Lambda, AJAX, PyCharm, Eclipse, DynamoDB, MongoDB, PostgreSQL, API Gateway, Spring Boot, Hibernate, REST API, Microservices, JavaScript, Spark, Spark API, Spark SQL, PySpark, Spring, S3, CloudWatch, PyTest, OOP, MS-SQL Server, Kafka, GIT, Jira.

Client: Variance InfoTech, India July 2015 – May 2017

Role: Jr. Python Developer

Responsibilities:

• Involved in entire lifecycle of the projects including Design, Development, Testing and Implementation and support.

• Created Business Logic using Python 2.7.

• Designed and developed components using Python; implemented code in Python to retrieve and manipulate data.

• Analyzed the SQL scripts and redesigned them using PySpark SQL for faster performance.

• Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.

• Worked on developing internal testing tools written in Python.

• Used Django configuration to manage URLs and application parameters.

• Performed client-side validations using JavaScript and server-side validations using Django.

• Designed and developed the UI using HTML, XHTML, AJAX, CSS, JavaScript, and jQuery.

• Used JavaScript libraries like jQuery UI, DataGrid, jscolor, and Highcharts.

• Developed the presentation layer using HTML, JSP, CSS, and DHTML.

• Managed large datasets using Pandas data frames and MySQL.

• Rewrote existing modules in Python to deliver a certain format of data.

• Understood business requirements and translated them into technical solutions by creating end-to-end ETL mapping designs, including target-mart loading activities.

• Wrote and executed MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.

• Used PySpark DataFrames to create tables and perform analytics over them.

• Embedded AJAX in UI to update small portions of the web page avoiding the need to reload the entire page.

• Created the most important business rules relevant to the scope of the project and the needs of customers.

• Created database using SQLite, wrote several queries to extract data from database.

• Built SQL queries for performing various CRUD operations.
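The CRUD queries above, together with the SQLite work mentioned in this role, can be sketched end to end in a few lines (the `contacts` table and its rows are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

# Create: insert a row with parameterized values.
conn.execute("INSERT INTO contacts (name, email) VALUES (?, ?)",
             ("Ann", "ann@example.com"))

# Read: fetch the row back.
row = conn.execute("SELECT name, email FROM contacts WHERE id = 1").fetchone()

# Update: change the email and re-read it.
conn.execute("UPDATE contacts SET email = ? WHERE id = 1", ("ann@work.example",))
updated = conn.execute("SELECT email FROM contacts WHERE id = 1").fetchone()[0]

# Delete: remove the row and confirm the table is empty.
conn.execute("DELETE FROM contacts WHERE id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
```

Parameterized `?` placeholders, as used throughout, are the idiomatic way to avoid SQL injection in the `sqlite3` module.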

• Used Jenkins for automation of build process and coordinated deployments across different sites.

• Developed a GUI using webapp2 to dynamically display the test block documentation and other features of the Python code in a web browser.

• Built the development environment with JIRA and SVN.

Environment: Python 2.5/2.7, Django 1.4/1.5, HTML, CSS, XML, JavaScript, jQuery, AJAX, Eclipse, Linux, SVN, MySQL, Apache, RAD 7.0, C++, RESTful API, JSON, Pandas, Java, Shell Scripting, PL/SQL, Jenkins, Jira, UNIX

Awards and Recommendations:

• Two-time winner of the Employee of the Month award while working for Fannie Mae, for leading efforts in mentoring junior coders and stepping in to help whenever needed, even if it meant staying late or coming in on weekends.

• Recognized as Most Valuable Employee by the Vice President of Software Engineering for overhauling the build-and-test process for all teams by developing a new product for internal use.

• Acknowledged for promoting a culture of continuous learning by organizing workshops, training sessions, and knowledge-sharing initiatives related to Python and programming best practices.


