Saranya S
Contact No: 609-***-****
Email ID: ************@*****.***
Plainsboro, New Jersey
PROFESSIONAL SUMMARY:
8+ years of professional experience as a Python Developer in the design, development and implementation of Python, Django, Flask, Pyramid and client-server applications, RESTful services and SQL.
Experience in Python libraries like Beautiful Soup, JASY, NumPy, SciPy, Matplotlib, Pickle, PySide, pandas DataFrames, NetworkX, PyChart, Highcharts and urllib2, and IDEs like Sublime Text, Spyder, PyCharm and Emacs.
Expert knowledge in front-end development using Python, Django, AngularJS, Angular 8/9/10/11/12/13/14/15, Node.js, Express.js, JavaScript, HTML5, CSS/CSS3, Bootstrap, AJAX, JSON, jQuery and XML.
Working experience with AWS cloud: EC2, S3, EFS, EBS, VPC, Lambda, Glue jobs, Step Functions, Batch jobs, CI/CD pipelines, Neptune, ELB, ECS and CodeBuild.
Experienced in working on Application Servers like WebSphere, WebLogic, Tomcat, Web Servers like Apache server, NGINX and Integrated Development Environments like PyCharm, Eclipse, MyEclipse, JDeveloper and RAD.
Comprehensive background in the installation, upgrades, configuration, rollout and support of hardware, software, peripherals, and network devices.
Experience with deploying configuration management and CI/CD services such as Puppet, Ansible, PowerShell, Jenkins, Vagrant, Docker, JIRA, CloudFormation and Elastic Beanstalk.
Good experience in utilizing JavaScript MVC frameworks like AngularJS and Backbone.js, AJAX, Kubernetes, microservices, and UX tools like InVision.
Experience in using scikit-learn and statsmodels in Python for machine learning and data mining.
Developed REST microservices (APIs) for home automation that also keep data synchronized between two database services.
Strong hands-on development skills in SQL, UNIX shell scripting, Linux, Oracle, PL/SQL, Teradata, SQL Server, Perl and Python scripting. In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames and RDDs for PySpark, and the Pandas library.
Experienced with the Stretch Database feature of SQL Server 2016, which lets cold data residing in on-premises SQL Server databases be migrated transparently and securely to Microsoft Azure.
Experience in developing ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake SnowSQL, writing SQL queries against Snowflake.
Experience with Agile methodologies, Scrum stories and sprints in a Python-based environment, along with data analytics, data wrangling and Excel data extracts.
Good knowledge in various stages of SDLC (Software Development Life Cycle).
Configured auto-scalable and highly available microservices with monitoring and logging using AWS, Docker, Jenkins and Splunk.
Experience in Azure Web Apps, Logic Apps, Azure Blob Storage, Azure Functions, Azure SQL and other Azure services.
Experienced in developing web-based applications using Python, Django, C++, XML, CSS3, HTML5, DHTML, JavaScript, jQuery, MVC3, Bootstrap, RESTful services, Ruby and AJAX.
Experienced in MVW frameworks like Flask/Django, JavaScript, jQuery and Node.js.
Good knowledge of web services with protocols SOAP and REST.
Experience in writing Sub Queries, Stored Procedures, Triggers, Cursors, and Functions on MySQL and PostgreSQL database.
Strong analytical and problem-solving skills, always striving for new knowledge. Excellent communication and interpersonal skills, with the ability to work independently as well as part of an integrated team.
TECHNICAL SKILLS:
Languages
Python, C++, Java, J2EE, Shell Script, Perl, SQL
Frameworks
Django, Pyramid, Flask, Hibernate, Spring, Web2Py, Docker, Pandas
Python Libraries
Requests, Scrapy, wxPython, Pillow, SQLAlchemy, BeautifulSoup, Twisted, NumPy, SciPy, Matplotlib, Pygame, Pyglet, PyQt, PyGTK, Scapy, pywin32, NLTK, nose, SymPy, IPython
Web Technologies
HTML5, CSS3, SASS, JavaScript, XML, Servlets, JSP, JSON, AJAX, jQuery, AWS
IDEs/tools
PhpStorm, Notepad++, Sublime Text, NetBeans, Thonny, Komodo, PyCharm, PyDev, PyScripter, Pyshield, Spyder, PyStudio, Eclipse
Software Management
GIT, SVN, Maven, Gradle, CVS
Database
MySQL, PostgreSQL, MS SQL Server, MongoDB, SQLite, Oracle
Operating Systems
Windows, Mac, Linux
Education
Master's Degree in Software Engineering, International Technological University.
PROFESSIONAL EXPERIENCE:
Client: Somerville Bank, Richmond, IN May 2023 – PRESENT
Role: Python Developer
Responsibilities:
Involved in the analysis, specification, design, implementation and testing phases of the Software Development Life Cycle (SDLC) and used Agile methodology for developing the application.
Created Django dashboard with custom look and feel for end user, after a careful study of the Django admin site and dashboard.
Utilized Angular 15/14/13 concepts like interpolation, dependency injection, input variables, bootstrapping, NgFor, NgIf, router outlet, binding click events, the component decorator, etc.
Developed and tested many features for dashboard using Flask, CSS and JavaScript.
Developed backend of the application using the flask framework.
Developed a server-based web traffic statistical analysis tool using RESTful APIs, Flask and Pandas.
Developed the ETL jobs as per the requirements to update the data into the staging database (Postgres) from various data sources and REST API's.
Extensively worked on Creating Custom Directives, Services in Angular.
Developed and tested many features for a dashboard using Python, Java, Bootstrap, CSS, JavaScript, and jQuery.
Extensively worked on developing front-end code in Angular to retrieve data as JSON and display well-organized results in web pages by writing Angular components, directives, services and route providers.
Created new connections through applications for better access to the MySQL database and was involved in writing SQL and PL/SQL - stored procedures, functions, sequences, triggers, cursors, object types, etc.
Worked on virtual and physical Linux/UNIX hosts and was involved in day-to-day administrative activities such as maintaining user accounts and granting advanced file permissions to specific users. Built a research web application following its web design.
Leveraged AWS cloud services such as EC2, auto scaling and VPC to build secure, highly scalable and flexible systems that handled load on the servers.
Implemented TFS build archival to AWS Simple Storage Service (S3) and created lifecycle policies for managing the files in S3.
Implemented CloudWatch alarms for monitoring the EC2 instances.
Worked extensively with AWS, using PuTTY to launch and connect to instances; hands-on with Azure.
Developed and maintained mostly Python and ETL scripts to scrape data from external web sites and load the cleansed data into a MySQL database.
Experience in deployment automation and related tooling (Terraform, AWS CloudFormation or similar).
Worked on deployment on AWS EC2 instances with Postgres RDS and S3 file storage.
Worked with C++ application developers for the Credit Risk Add-on project; involved in the coding and testing phases.
Wrote modules in Python to connect to MongoDB with PyMongo and perform CRUD operations (see the sketch at the end of this section).
Scripted in PowerShell and Python; experienced with systems and IT operations, including monitoring. Used a ticketing service to manage tickets as well as building backend automation. Handled tools such as GitHub, UrbanCode Deploy, SVN, Jenkins, Maven and Docker.
Automating the tasks using Ansible playbooks, Shell scripting and Python. Provisioned and patched servers regularly using Ansible.
Architected and detailed the position, P&L and FICC data migration process from multiple sources (Sybase IQ, SQL Server and Oracle) into Snowflake and Vertica.
Experience writing playbooks using Ansible to provision several pre-production environments and proprietary middleware installations; created various modules and manifests in Ansible to automate various applications.
Implemented code to perform CRUD operations on MongoDB using the PyMongo module.
Used Python 3.x (NumPy, SciPy, Pandas, scikit-learn, seaborn) and Spark (PySpark, MLlib) to develop a variety of models and algorithms for analytic purposes.
Created Business Logic using Python to create Planning and Tracking functions and developed multi-threaded standalone applications using Python and PHP.
Worked on HTML, CSS, AJAX, JSON, Django and Test-Driven Development (TDD); designed and developed the user interface of the website.
Implemented data transformation using XPath, XSLT, DataWeave and custom Java classes.
Expert knowledge of and experience in various stages of the SDLC (Software Development Life Cycle), Software Testing Life Cycle (STLC) and QA methodologies, from project definition to post-deployment documentation.
Worked with version control systems like Git, GitHub, CVS, Tortoise and SVN to keep the versions and configurations of the code organized.
Configured NoSQL databases like Apache Cassandra and MongoDB for increased compatibility with Django and Bottle.
Worked with MVW frameworks like Django, Tornado, Angular.js, JavaScript and Node.js; installed and configured the build tooling for application builds and deployments, and developed and deployed SOAP-based web services on Tomcat Server.
Development of ERP satellite applications with Python 2, C++, Perl and PHP.
Integrated Jenkins with various DevOps tools such as Nexus, SonarQube, Puppet, etc.
Deployed (using Kubespray) and maintained Kubernetes clusters in Azure for soon-to-be-released software.
Worked with Azure, Ansible, Gitlab, Helm, Jenkins, Kibana, Kubernetes, Python, and Jira daily.
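A minimal sketch of the PyMongo CRUD pattern referenced above; the connection URI, database name and collection name are hypothetical placeholders, not values from the actual project.

from pymongo import MongoClient

# Placeholder connection string, database and collection
client = MongoClient("mongodb://localhost:27017/")
collection = client["app_db"]["accounts"]

# Create
inserted_id = collection.insert_one({"name": "alice", "balance": 100}).inserted_id

# Read
doc = collection.find_one({"_id": inserted_id})

# Update
collection.update_one({"_id": inserted_id}, {"$set": {"balance": 150}})

# Delete
collection.delete_one({"_id": inserted_id})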
Environment: Python, C++, Django, AWS, Angular 15/14/13, Kubernetes, PyQt, Snowflake, Matplotlib, SDK, Flask, PyQuery, DOM, Bootstrap, XML, HTML5, JavaScript, JSON, REST, Apache Web Server, GitHub, MySQL, Linux, Oracle, Windows.
Client: Bank of Ocean City, Ocean City, MD Apr 2021 – Apr 2023
Role: Python Developer
Responsibilities:
Responsible for gathering requirements, system analysis, design, development, testing and deployment.
Developed tools using Python, shell scripting and XML to automate some of the routine tasks. Interfaced with supervisors, artists, systems administrators and production to ensure production deadlines were met.
Used Python to write data into JSON files for testing Django Websites. Created scripts for data modelling and data import and export.
Developed the notification service by posting the JSON request to AWS API Gateway, validating the response in Lambda by getting the data from DynamoDB, and sending the notification through AWS SNS (see the sketch at the end of this section).
Provided guidance to the development team working on PySpark as the ETL platform.
Responsible for ETL and orchestration process using Airflow and NiFi tool.
Implemented Data Quality framework using AWS Athena, Snowflake, Airflow and Python.
Created a serverless RESTful API using AWS Lambda and used it as a trigger.
Designed the screens with PrimeNG and Angular 11 with TypeScript.
Used AWS cloud services EC2 and S3 for virtual servers on Linux.
Utilized PyUnit, the Python Unit test framework, for all Python applications and used Django Database APIs to access database objects.
Responsible for setting up Python REST API framework using Django.
Designed and created backend data access modules using PL/SQL stored procedures.
Used Django configuration to manage URLs and application parameters.
Developed views and templates with Python and Django view controller and templating language to create a user-friendly website interface.
Developed web application and its business logic using Python on Django Web Framework.
Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
Used PyQuery for selecting particular DOM elements when parsing HTML5.
Wrote scripts using python modules and its libraries to develop programs that improve processing of access requests.
Designed and developed the UI of the website using HTML5, AJAX, CSS3 and JavaScript.
Designed and developed a data management system using MySQL.
Worked in an agile development environment.
Used HTML5, CSS3, JQuery, JSON and JavaScript for front end applications.
Involved in development of Web Services using SOAP for sending and getting data from the external interface in the XML format.
Used jQuery and Ajax calls for transmitting JSON data objects between frontend and controllers.
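A minimal sketch of the notification flow mentioned above (API Gateway posts JSON to a Lambda handler, which looks the record up in DynamoDB and publishes through SNS); the table name, topic ARN and key names are hypothetical placeholders.

import json
import boto3

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

TABLE_NAME = "notifications"                              # placeholder
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:alerts"   # placeholder

def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    table = dynamodb.Table(TABLE_NAME)

    # Validate the request against the item stored in DynamoDB
    item = table.get_item(Key={"id": body.get("id")}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    # Send the notification through SNS
    sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(item))
    return {"statusCode": 200, "body": json.dumps({"status": "sent"})}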
Environment: Python, Django, AWS, Angular, PyUnit, PyQuery, HTML5, CSS3, XML, AJAX, JavaScript, jQuery, Bootstrap, PostgreSQL, Eclipse, Git, GitHub, Shell Scripting, MySQL, Oracle, Windows.
Client: Progyny, New York, NY Jan 2019 – Apr 2021
Role: Python Developer
Responsibilities:
Validated the developed lambda scripts and fixed the identified bugs.
Designed and developed integration methodologies between client web portals and existing software infrastructure using SOAP API's and vendor specific frameworks.
Developed REST based Application using Python, Django, Angular 9/8, Bootstrap, HTML, jQuery and Node.js by following W3C standards.
Experienced in developing web-based applications using SaaS, Python, Django, Kafka, RPC, CSS, HTML, JavaScript, jQuery and AngularJS, based on Ansible automation scripts.
Scheduled the Python scripts from Jenkins and Lambda with CloudWatch Events.
Used regular expressions for faster search results in combination with Angular built-in, custom pipes and ng2-charts for report generation.
Created Python ETL pipelines for teardown and “trickle” data migration from different backends to Snowflake, SQL Server, and Vertica.
Coded Snowflake data loaders using Python (see the sketch at the end of this section) and reorganized large volumes of data.
Developed Restful Microservices using Flask and Django and deployed on AWS servers using EBS and EC2.
Developed Docker containers for running a multi-module Python-based pipeline for data transformation and extraction, integrated them, and deployed them in AWS using Lambda, Step Functions, batch jobs and a CI/CD implementation.
Used Agile methodology and the SCRUM process for project development.
Wrote Python routines to log into the websites and fetch data for selected options.
Performed testing using Django's Test Module.
Involved in the CI/CD pipeline management for managing the weekly releases.
Designed and developed communication between client and server using Secured Web services.
Responsible for the development of entire frontend and backend modules using Python.
Worked on updating the existing clipboard to have the new features as per the client requirements.
Used Django Database API's to successfully create database objects.
Developed RESTful services using Django.
Responsible for Configuring Kafka Consumer and Producer metrics to visualize the Kafka System performance and monitoring.
Built a distributed system for triggering and executing daily data processing jobs. It contains a high-availability scheduler (built with Python), a message broker (RabbitMQ), a cluster of workers (built with Python), and UI (built with Python, Django and Bootstrap).
Used regular expressions extensively to match patterns against existing ones.
Designed Forms, Views, Models using Django's MVC software architecture pattern.
Used Python and Pandas library for data cleaning and aggregation.
Created RESTful API's using Django.
Developed user interface using BOOTSTRAP and JavaScript to simplify the complexities of the application.
Worked on the MySQL database writing simple queries and stored procedures for normalization and denormalization.
Used a test-driven development (TDD) approach for developing the services required for the application.
Developed and tested many features for dashboard using Python, Java, Bootstrap, CSS3.
Wrote python scripts to parse XML documents and load the data in database.
Worked on front end frameworks like CSS3 and Bootstrap for development of simple web applications.
Deployed the project into Heroku using Git version control system.
Special skills in developing user friendly, simple yet effective web-based applications.
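A minimal sketch of a Python Snowflake loader in the spirit of the one noted above, using the snowflake-connector-python package; the account, credentials, warehouse, table and file names are hypothetical placeholders, and the COPY options are one reasonable choice rather than the project's actual configuration.

import snowflake.connector

# All connection values below are placeholders
conn = snowflake.connector.connect(
    account="my_account",
    user="loader_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Stage the local file into the table stage, then load it
    cur.execute("PUT file:///tmp/positions.csv @%POSITIONS AUTO_COMPRESS=TRUE")
    cur.execute(
        "COPY INTO POSITIONS FROM @%POSITIONS "
        "FILE_FORMAT=(TYPE=CSV SKIP_HEADER=1)"
    )
finally:
    cur.close()
    conn.close()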
Environment: Python, Django, AWS, Angular, PyQuery, PyUnit, Git, Heroku, Matplotlib, Snowflake, Bootstrap, CSS3, XML, JavaScript, Shell Scripting, MySQL, Oracle, Linux, Windows.
Client: Infinera Corporation, Allentown, PA Nov 2016 – Dec 2018
Role: Python Developer
Responsibilities:
Developed data pipelines using Python for medical image pre-processing, training, and testing, leveraging libraries such as Pandas, NumPy, OpenCV, and TensorFlow to optimize workflows.
Participated in JAD sessions for design optimizations related to data structures as well as ETL processes.
Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
Extensively worked on Informatica tools like source analyzer, mapping designer, workflow manager, workflow monitor, Mapplets, Worklets and repository manager.
Used Windows Azure SQL Reporting Services to create reports with tables, charts and maps.
Extracted data using Sqoop Import query from multiple databases and ingest into Hive tables.
Developed SQL scripts for creating tables, Sequences, Triggers, views and materialized views.
Performed several ad-hoc data analyses on the Azure Databricks analysis platform, tracked on a Kanban board.
Used Azure reporting services to upload and download reports.
Loaded real time data from various data sources into HDFS using Kafka.
Developed MapReduce jobs for data cleanup in Python (see the sketch at the end of this section).
Defined extract-transform-load (ETL) and extract-load-transform (ELT) processes for the Data Lake.
Participated in integration of MDM (Master Data Management) Hub and data warehouses.
Wrote Pig scripts to load processed data from HDFS into MongoDB.
Developed simple to complex MapReduce jobs using Hive and Pig.
Extracted the needed data from the server into HDFS and Bulk Loaded the cleaned data into HBase.
Designed the Data Marts in dimensional data modeling using star and snowflake schemas.
Worked on reading multiple data formats on HDFS using python.
Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
Worked with the Oozie workflow scheduler to manage Hadoop jobs with control flows.
Prepared Tableau reports and dashboards with calculated fields, parameters, sets, groups or bins and published them on the server.
Implemented a CI/CD pipeline with Docker, Jenkins and GitHub by virtualizing the Dev and Test environment servers with Docker and configuring automation through containerization.
Translated business requirements into SAS code for use within internal systems and models.
Migrated ETL processes from RDBMS to Hive to enable easier data manipulation.
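A minimal sketch of a Hadoop Streaming style mapper for the Python data-cleanup jobs referenced above; the tab-delimited field layout and the validity rule are illustrative assumptions, and the script would typically be supplied through Hadoop Streaming's -mapper option.

import sys

def clean(line):
    """Drop malformed records and normalise a categorical column."""
    fields = [f.strip() for f in line.rstrip("\n").split("\t")]
    if len(fields) < 3 or not fields[0]:
        return None                    # skip malformed rows (assumed rule)
    fields[1] = fields[1].lower()      # assumed categorical column
    return "\t".join(fields)

if __name__ == "__main__":
    for raw in sys.stdin:
        cleaned = clean(raw)
        if cleaned is not None:
            print(cleaned)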
Environment: Python, Hadoop, PySpark, Azure Databricks, Azure Data Factory (ADF), PostgreSQL, Git, Azure SQL Server, Kafka, Hive, HBase, Airflow.
Certifications:
AWS Certified DevOps Engineer - Professional - Amazon Web Services (AWS)
Microsoft Certified: Azure Fundamentals - Microsoft