Sr. Python Full Stack Developer
Name: Hari Nandhini Appam
Mail Id: ****************@*****.*** Contact Number: +1-314-***-****
LinkedIn URL: linkedin.com/in/nandini-a-2b84112a5
Summary:
Over 10 years of experience analyzing, developing, managing, and implementing stand-alone and client-server enterprise applications using Python, Django, and Java, and mapping requirements to systems.
Well versed in Agile (Scrum), Waterfall, and Test-Driven Development (TDD) methodologies.
Experience in developing web applications using Python, Django, C++, HTML/HTML5, DHTML, CSS/CSS3, JavaScript, AngularJS, AJAX, XML, JSON, and jQuery.
Designed and developed Azure SQL databases and data marts for functional-area data consumers.
Good experience in developing web applications implementing Model-View-Controller (MVC) architecture using Django, Flask, Pyramid, and other Python web application frameworks.
Designed and implemented large-scale, business-critical systems using object-oriented design and programming concepts in Python and Django.
Experience implementing server-side technologies with RESTful APIs and MVC design patterns using Node.js and the Django framework.
Experience with ETL and ELT tools like SQL Server Integration Services (SSIS); also knowledgeable in IBI Data Migrator.
Experience in working with a number of public and private cloud platforms like Amazon Web Services (AWS), Microsoft Azure, Rackspace Cloud, and OpenStack.
Extensive experience in Amazon Web Services (Amazon EC2, Amazon S3, Amazon SimpleDB, Amazon RDS, Elastic Load Balancing, Elasticsearch, Amazon SQS, AWS Identity and Access Management, AWS CloudWatch, Amazon EBS, and Amazon CloudFront).
Worked on standard Python packages like boto and boto3 for AWS (a brief sketch appears after this summary).
Built data pipelines for batch and real-time streaming using Azure Synapse.
Proficient in working with SQLite, MySQL, and other SQL databases from Python.
Experienced in working with various Python IDEs, including PyCharm, PyScripter, Spyder, PyStudio, PyDev, IDLE, NetBeans, and Sublime Text.
Worked on several standard Python packages like NumPy, Matplotlib, pickle, PySide, SciPy, PyTables, etc.
Good experience in using WAMP (Windows, Apache, MYSQL, and Python/PHP) and LAMP (Linux, Apache, MySQL, and Python/PHP) Architectures.
Experience in automation of code deployment, support, and administrative tasks across multiple cloud providers such as Amazon Web Services, Microsoft Azure, and Google Cloud.
Experienced in using caching solutions for large-scale applications, such as Memcached and Redis.
Experienced in working with asynchronous frameworks like Node.js and Twisted, and in designing automation frameworks using Python and shell scripting.
Experience with Requests, pysftp, gnupg, ReportLab, NumPy, SciPy, PyTables, python-twitter, Matplotlib, httplib2, urllib2, Beautiful Soup, and Pandas (DataFrames) Python libraries during the development lifecycle.
Expertise in working with different databases like Oracle, MySQL, and PostgreSQL, and good knowledge of the NoSQL database MongoDB (2.4, 2.6).
Proficient in developing complex SQL queries, Stored Procedures, Functions, Packages along with performing DDL and DML operations on the database.
Hands-on experience in handling database issues and connections with SQL and NoSQL databases like MongoDB, Cassandra, Redis, CouchDB, DynamoDB by installing and configuring various packages in python.
Experienced in working on Application Servers like WebSphere, WebLogic, Tomcat and Web Servers like Apache server, NGINX.
Expertise in working with GUI frameworks such as Pyjamas, Jython, guidata, PyGUI, PyQt, and PyWebKitGtk.
Experienced in developing multi-threaded web services using CherryPy & BottlePy framework.
Good knowledge of writing different kinds of tests, such as unit tests with unittest/pytest, and running them in builds.
Familiarity with development best practices such as code reviews, unit testing, system integration testing (SIT) and user acceptance testing (UAT).
Experienced with version control systems like Git, GitHub, CVS, and SVN to keep the versions and configurations of the code organized.
Expertise in Build Automation and Continuous Integration tools such as Apache ANT, Maven, Make, Jenkins/Hudson, Anthill Pro, Bamboo, Cruise Control, and TeamCity.
Strong experience in developing SOAP and RESTful web services with Python.
Good familiarity with networking technologies such as TCP/IP, UDP, FTP, DNS, DHCP, SNMP, and TFS, and all networking layers (Layer 1 to Layer 7).
Experience in using Docker and Ansible to fully automate the deployment and execution of the benchmark suite on a cluster of machines.
Good experience in Linux Bash scripting and following PEP 8 guidelines in Python.
Excellent working knowledge in UNIX and Linux shell environments using command line utilities.
Experience in building applications in different operating systems like Linux (Ubuntu, CentOS, Debian), Mac OS.
Proficient in Automated Verification and Validation testing tools.
Strong working experience with testing tools such as Nose, Selenium (IDE, WebDriver, and RC), Cucumber, JUnit, QUnit, Karma, Jasmine, SoapUI, etc.
Involved in all phases of the Software Development Life Cycle (SDLC) using project management tools JIRA, Redmine, and Bugzilla.
Excellent Interpersonal and communication skills, efficient time management and organization skills, ability to handle multiple tasks and work well in a team environment.
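To illustrate the boto and boto3 usage mentioned above, here is a minimal sketch of uploading a file to S3 and listing a prefix; the bucket, prefix, and file names are hypothetical placeholders, not taken from any project described in this resume.

# Minimal boto3 sketch: upload a file to S3 and list objects under a prefix.
# Bucket, prefix, and file names are illustrative placeholders.
import boto3

def upload_and_list(path: str, bucket: str = "example-bucket", prefix: str = "reports/"):
    s3 = boto3.client("s3")                      # credentials come from the environment/IAM role
    key = prefix + path.split("/")[-1]
    s3.upload_file(path, bucket, key)            # multipart upload handled automatically
    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in response.get("Contents", [])]

if __name__ == "__main__":
    print(upload_and_list("daily_report.csv"))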
Technical Skills:
Operating Systems
Windows 98/2000/XP/7/8, macOS, Linux (CentOS, Debian, Ubuntu), UNIX, Solaris, VMware
Languages
Python 2.5/2.7/3.4/3.6, Core Java, C, C++, OpenGL, SQL, PL/SQL
Web Technologies
HTML/HTML5, CSS/CSS3, XML, DOM, AJAX, jQuery, JSON, Bootstrap, AngularJS, Angular 2, Angular 4, Node.js, and React
Python Libraries/Packages
NumPy, SciPy, pickle, PyQt, PySide, PyTables, Pandas (DataFrames), Matplotlib, SQLAlchemy, httplib2, urllib2, Beautiful Soup, PyQuery
Python Frameworks
Django, Flask, Pyramid, Pyjamas, web2py
IDE
Brackets, WebStorm, PyCharm, PyScripter, Spyder, PyStudio, PyDev, IDLE, NetBeans, Eclipse, Atom, Sublime Text
Analytic Tools
Google Analytics, Fiddler, Tableau, Power BI, JMP Pro, SAS, Azure
Cloud Computing
AWS, Azure, Rackspace, OpenStack, Azure SQL
AWS Services
Amazon EC2, Amazon S3, Amazon SimpleDB, Amazon RDS, Elastic Load Balancing, Elasticsearch, Amazon SQS, AWS Identity and Access Management, AWS CloudWatch, Amazon EBS, and Amazon CloudFront
Web servers
Apache, IIS
J2EE
JDBC, JNDI, JSP and servlets
Databases/Servers
MySQL, MS Access, SQL Server, Oracle, SQLite3, Cassandra, Redis, PostgreSQL, CouchDB, MongoDB, Apache Web Server 2.0, Nginx, Tomcat, JBoss, WebLogic
Testing/Bug Tracking Tools
Robot, Selenium, JUnit, Nose, Karma, Jasmine, Bugzilla, PyUnit, JIRA, Zope, PyTest, TestNG
Web Services/Network Protocols
SOAP, REST/RESTful, TCP/IP, UDP, FTP, HTTP/HTTPS, Subnets, SMTP, ICMP, DNS, DHCP, NFS, Cisco Routers, LAN
Build and CI tools
Ant, Maven, Gradle, Jenkins, Hudson, Bamboo
Work Experience:
Client: Blue Cross Blue Shield, Little Rock, AR (Remote). Jun 2023 – Till Date
Role: Sr. Python Full Stack Developer
Description: Developed tools using Python, shell scripting, and XML to automate repetitive tasks. Interfaced with supervisors, artists, systems administrators, and production to ensure production deadlines were met.
Responsibilities:
Developed security policies and processes. Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface (a minimal view/template sketch follows this list).
Experience in building pipelines that consume data from Teradata and Azure SQL server.
Designed and Developed SQL database structure with Django Framework using agile methodology. Developed project using Django, Oracle SQL, Angular, JavaScript, HTML5, CSS3 and Bootstrap.
Involved in the complete Software Development Life Cycle including gathering Requirements, Analysis, Design, Implementation, Testing and Maintenance.
Developed and integrated GenAI solutions into web applications, enhancing user experiences and automating content generation.
Successfully led multiple projects from inception to completion, including project planning, resource allocation, timeline management, and risk mitigation.
Set up and built AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) using CloudFormation JSON templates.
Implemented user interface guidelines and standards throughout the development and maintenance of the Website using the HTML, CSS, JavaScript, jQuery and AngularJS.
Developed python Scripts for deploying the Pipeline in Azure Data Factory (ADF) that process the data using the Cosmos Activity.
Mentored and coached junior developers, fostering a collaborative and knowledge-sharing environment to enhance team skills and productivity.
Performed job functions using Spark APIs in Scala for real-time analysis and fast querying; experienced with Agile methodology and the delivery tool VersionOne.
Used Python programming and Django for the backend development, Bootstrap and Angular for frontend connectivity and MongoDB for database.
Implemented a CI/CD pipeline with Docker, Jenkins, and GitHub by virtualizing the servers with Docker for the Dev and Test environments, achieving the required automation through containerization.
Experienced with the AWS cloud platform and its features, including EC2, S3, Route 53, VPC, EBS, AMI, SNS, RDS, and CloudWatch.
Used the AWS CLI to suspend an AWS Lambda function and to automate backups of ephemeral data stores (EBS) to S3 buckets.
Gathered semi-structured data from S3 and relational structured data from RDS, kept the data sets in a centralized metadata catalog using AWS Glue, then extracted the data sets and loaded them into Kinesis streams.
Worked as part of an Agile/Scrum based development team and exposed to TDD approach in developing applications.
Worked on designing and deploying a multitude of applications utilizing the main services of the AWS stack (EC2, S3, RDS, VPC, IAM, ELB, EMR, CloudWatch, Route 53, Lambda, and CloudFormation), focused on a high-availability, fault-tolerant environment.
Introduced new features and solved existing bugs by developing code for a cloud-based integration platform (iPaaS) and migrated customer data from the legacy iPaaS to AWS.
Resolved complex issues in Azure Databricks and HDInsight reported by Azure end customers.
Effective communicator with stakeholders, including clients, product managers, and executives, to gather requirements, provide updates, and ensure alignment with business goals.
Deployed and tested different modules in Docker containers and Git. Implemented automation using Jenkins and Ansible on Unix/Linux-based systems and Docker containers in the cloud.
Worked with AWS Kinesis Streams, Kinesis Data Analytics (streaming SQL), AWS Step Functions (serverless) pipelines, Google TensorFlow, and AWS EKS pipelines.
Worked as a developer and support engineer, using CA API Gateway to expose Home Depot's REST/SOAP API services to outside vendors.
Worked with different components of the iPaaS solution Azure provides (Service Bus, Functions, and Logic Apps) to use connectors and create workflows.
Installed MongoDB; configured and set up backup, recovery, upgrades, tuning, and data integrity. Responsible for managing the MongoDB environment from high-availability, performance, and scalability perspectives. Extensive experience in deploying, managing, and developing MongoDB clusters.
Extensive experience automating the build and deployment of scalable projects through GitLab CI/CD, Jenkins, etc., and worked with Docker and Ansible. Used JavaScript for data validations and designed validation modules.
Created methods (GET, POST, PUT, DELETE) to make requests to the API server and tested the RESTful API using Postman. Also loaded CloudWatch Logs to S3 and then into Kinesis Streams for data processing.
Created Terraform scripts for EC2 instances, Elastic Load balancers and S3 buckets. Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.
Experience in leading teams using Agile methodologies like Scrum and Kanban, facilitating daily stand-ups, sprint planning, and retrospectives to ensure efficient project delivery.
Implemented robust data pipelines using Python frameworks (Pandas, NumPy) for training and evaluating AI models, ensuring high-quality data management and preprocessing.
Implemented integration test cases and developed predictive analytics using Apache Spark Scala APIs; used REST clients and SoapUI for testing web services for server-side changes.
Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, tested playbooks on AWS instances using Python, and ran Ansible scripts to provision Dev servers.
Designed RESTful APIs using Flask/Django to expose GenAI functionalities, enabling seamless integration with front-end applications and third-party services.
Wrote Python scripts to parse XML documents and load the data in database. Developed and designed an API (RESTful Web Services). Responsible for user validations on client side as well as server side.
Developed Python APIs to dump the array structures in the processor at the failure point for debugging. Handled web application UI security, logging, and backend services.
Written functional API test cases for testing REST APIs with Postman and Integrated with Jenkins server to build scripts.
Represented the system in hierarchical form by defining components and subcomponents using Python, and developed a set of library functions over the system based on user needs.
Used Python and Django for creating graphics, XML processing, data exchange, and business logic implementation with Spiff workflow development.
Handled operations and maintenance support for AWS cloud resources which includes launching, maintaining and troubleshooting EC2 instances, and S3 buckets, Virtual Private Clouds (VPC), Elastic Load Balancers (ELB) and Relational Database Services (RDS).
Used a test-driven (TDD) approach for developing services required for the application, implemented integration test cases, and developed predictive analytics using Apache Spark Scala APIs.
Designed and developed new reports and maintained existing reports using Microsoft SQL Reporting Services (SSRS) and Microsoft Excel to support the firm’s strategy and management.
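As a hedged illustration of the Django view and template work referenced earlier in this list, the sketch below renders a simple listing page; the app, model, template, and URL names are hypothetical and not drawn from the actual project.

# views.py -- minimal Django view/template sketch (model and template names are illustrative)
from django.shortcuts import render
from .models import Claim   # hypothetical model with a created_at field

def claim_list(request):
    """Render a simple, user-friendly listing page for the 50 most recent claims."""
    claims = Claim.objects.order_by("-created_at")[:50]
    return render(request, "claims/claim_list.html", {"claims": claims})

# urls.py -- hook the view into the URL configuration
# from django.urls import path
# from .views import claim_list
# urlpatterns = [path("claims/", claim_list, name="claim-list")]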
Environment: Python 3.7, Django, Django REST, AWS, Selenium API, DevOps, Flask, React, CI/CD, C#, .NET, ETL, Hadoop, Vagrant, New Relic Server, Git, Ansible, HTML5/CSS3, PostgreSQL, Azure, ADF, Azure Storage, Azure Databricks, Lambda, AWS SDK, SNS, Scala, SQS, CloudWatch, Kubernetes, Docker, S3, EC2, RDS, EBS, Redux, Circle, PyCharm, Microsoft Visual Studio Code, Linux, Shell Scripting, JIRA, SQL, API.
Client: Humana – Louisville, KY. Aug 2021 – May 2023
Role: Sr. Python Full Stack Developer
Responsibilities:
Worked extensively with AWS services like S3, EC2, ELB, EBS, Lambda, Auto Scaling, Route 53, CloudFront, IAM, CloudWatch, RDS, etc.
Involved in infrastructure as code, execution plans, and resource graph and change automation using Terraform. Managed AWS infrastructure as code using Terraform and CloudFormation.
Developed Merge jobs in Python to extract and load data into MySQL database.
Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources like S3 (ORC/Parquet/text files) into AWS Redshift.
Developed JSON scripts for deploying the pipeline in Azure Data Factory (ADF) that processes the data using the Cosmos activity.
Created Terraform scripts for EC2 instances, Elastic Load balancers and S3 buckets.
Implemented server-side encryption to encrypt the objects in S3 bucket with Customer Managed Keys (CMK) stored in AWS Key Management Service.
Worked closely with data scientists and product teams to translate business requirements into technical specifications for GenAI features.
Managed service account credentials, database credentials, and passwords through AWS Secrets Manager.
Created a continuous deployment pipeline by creating component processes and applications and adding environments, handling deployments using AWS CodePipeline and AWS CodeDeploy for various scenarios such as Tomcat deployments and multiple-application deployments.
Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.
Worked on MVC architecture using PHP, Python as controller.
Built numerous Lambda functions using Python and automated processes using the configured event triggers.
Created an AWS Lambda architecture to monitor AWS S3 buckets and trigger processing of source data (a minimal handler sketch follows this list).
Developed a data science pipeline using AWS SageMaker and scheduled it successfully in production.
Coordinated with the DevOps team to deploy the application on AWS resources.
Worked on packages like socket, REST API, Django.
Developed intuitive user interfaces using modern front-end technologies (React, Angular) that incorporate GenAI capabilities for real-time user interaction.
Built and maintained Docker container clusters managed by Kubernetes on AWS, using Linux, Bash, Git, and Docker.
Utilized Kubernetes as the runtime environment for the CI/CD system to build, test, and deploy.
Resolved complex issues in Azure Databricks and HDInsight reported by Azure end customers.
Performed S3 bucket creation, IAM role-based policies, MFA, and customization of the JSON templates.
Automated various service and application deployments with Ansible on CentOS and RHEL in AWS.
Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, tested playbooks on AWS instances using Python, and ran Ansible scripts to provision Dev servers.
Worked on MySQL database on simple queries and writing Stored Procedures for normalization.
Deployed the project into Jenkins using the GIT version control system.
Worked on advanced Python packages and modules like Requests, Scrapy, Beautiful Soup, multithreading, Pandas, NumPy, SciPy, Matplotlib, wxPython, Qt, regular expressions, SQLAlchemy, SQL database connections over JDBC/ODBC, and Python virtual environments.
Understanding of secure cloud configuration, CloudTrail, cloud security technologies (VPC, Security Groups, etc.), and cloud permission systems (IAM).
Used version-controlling systems like GIT and SVN.
Loaded the data into Spark RDDs and performed in-memory computation to generate the output response.
Responsible for installing and administering the SonarQube for code quality check and Nexus repository and generating reports for different projects. Also, integrated them into Jenkins.
Consumed web services performing CRUD operations.
Used Python Library Beautiful Soup 4 for Web Scraping to extract data for building graphs.
Used AngularJS as the development framework to build a single-page application.
Involved in Unit Testing and developed the unit test cases using PyUnit framework.
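As a sketch of the S3-triggered Lambda pattern noted above (see the Lambda bullets in this list), the handler below reads object keys from the S3 event and returns basic metadata; the validation step and returned fields are illustrative assumptions, not the production logic.

# Minimal AWS Lambda handler sketch for an S3 ObjectCreated trigger (illustrative only)
import json
import urllib.parse
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    """Triggered by S3 put events; inspects each new object and returns its key and size."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        head = s3.head_object(Bucket=bucket, Key=key)   # e.g. validate size/content type before processing
        processed.append({"key": key, "size": head["ContentLength"]})
    return {"statusCode": 200, "body": json.dumps(processed)}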
Environment: Amazon Web Services (AWS), Cloud Environment, Lambda, AJAX, Angular, DynamoDB, Python 3.7, Django, API Gateway, REST API, PySpark (Spark SQL), Spark Streaming, Amazon S3, CloudWatch, AWS Glue, PyCharm, MS SQL Server, Git, Jira, AWS Secrets Manager, KMS, Terraform, CloudFormation.
Client: State of Missouri – St Louis, MO (Remote). March 2019 – Jul 2021
Role: Python Developer
Responsibilities:
Involved in integrating the GIT into the Puppet to ensure the integrity of applications by creating Production, Development, Test, and Release Branches.
Maintained the code using Git version control and Cygwin for Linux commands.
Stored the data in the form of JSON structure-based documents, stored in a collection using MongoDB.
Designed and implemented Cassandra NoSQL database read/write/search functions. Worked on migrating data to Amazon AWS. Used AWS products like EC2, IAM, RDS, Log Monitor, Lambda, REST API Gateway, etc.
Developed the front end using AngularJS, React, Node.js, Bootstrap, Backbone.js, and JavaScript, with a Java backend exposing REST web services.
Used Amazon Web Services (AWS) for improved efficiency of storage and fast access.
Actively participated in requirement gathering sessions and capacity planning for the multi-data-center Cassandra cluster.
Installed, configured, administered, and monitored multi-data-center Cassandra clusters.
Involved in upgrading the existing Oracle data model to a Cassandra data model.
Evaluated and tuned the data model by running endurance tests using JMeter, the Cassandra Stress Tool, and OpsCenter.
Wrote CloudFormation templates in YAML and JSON.
Developed scripts for CI/CD using virtual machines, Docker, Python, and YAML.
Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
Involved in front end and utilized Bootstrap and Angular.js for page design.
Organized and configured the web application through YAML by creating and configuring application configuration files.
Worked in DevOps group running Jenkins in a Docker container with EC2 slaves in Amazon AWS cloud configuration.
Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as uDeploy, Puppet, or custom-built tools. Designed cloud-hosted solutions with specific AWS product suite experience.
Excellent experience with Python development under Linux and UNIX (Debian/Red Hat/AIX).
Experience with Linux command and bash shell scripting.
Created pipelines in ADF using Linked Services/Datasets/Pipelines to extract, transform, and load data from different sources like Azure SQL, Blob Storage, and Azure SQL Data Warehouse, and to write data back.
Worked on Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Chef and Puppet.
Designed and developed various endpoints, defining models, serializers, and ViewSets and registering the corresponding URLs to the endpoints using DRF routers (a minimal sketch follows this list).
Improved search performance by replacing the existing database with MongoDB (NoSQL). Used GitHub for Python source code version control and Jenkins for automating builds of Docker containers and deploying them in Mesos.
Working extensively on REST APIs, JSON, Microservices, CI/CD, Docker containers.
Worked in a NoSQL database on simple queries and wrote stored procedures for normalization and denormalization.
Built SQL and NoSQL queries implementing functions, packages, views, triggers, and tables.
Followed the Agile development methodology to develop the application and used the Go programming language to refactor and redesign the legacy databases.
Created REST APIs using DRF to be consumed by the frontend UI built on AngularJS.
Created RESTful web services for catalog and pricing with Django MVT, Jersey, MySQL, and MongoDB.
Utilized Celery to automate various tasks such as API calls.
Developed automated process for builds and deployments by using Jenkins, Ant, Maven, and Shell Script.
Architected and developed Python and Django for the backend development and front-end application using React, Webpack, Redux, and ES6/7 and PostgreSQL for the database.
Developed on Windows and deployed to a Linux server.
Conducted numerous enhancements for the system and bug-fixing tasks (C/C++, SQL scripts, UNIX Shell).
Python scripting with focus on DevOps tools, CI/CD and AWS Cloud Architecture and hands-on Engineering.
Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
Successfully migrated the Django database from SQLite to MySQL to PostgreSQL with complete data integrity.
Performed coding in C++ on Linux Platform.
Executed asynchronous tasks with the help of Celery and RabbitMQ.
Versatile with version control systems including Git and SVN; implemented RESTful APIs returning data from PostgreSQL in JSON format.
Performed job scheduling, batch-job scheduling, and process control, forking and cloning jobs and checking job status using shell scripting.
Developed tools using Python, Shell scripting, XML to automate some of the menial tasks. Interfacing with supervisors, artists, systems administrators and production to ensure production deadlines are met.
Developed Business Logic using Python on Django Web Framework.
Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
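To illustrate the DRF endpoint pattern described above (models, serializers, ViewSets, and router registration), here is a minimal hypothetical sketch; the Item model, its fields, and the route name are placeholders rather than the actual project code.

# Minimal Django REST Framework sketch: serializer, ViewSet, and router (names illustrative)
from rest_framework import serializers, viewsets, routers
from .models import Item   # hypothetical model with name and price fields

class ItemSerializer(serializers.ModelSerializer):
    class Meta:
        model = Item
        fields = ["id", "name", "price"]

class ItemViewSet(viewsets.ModelViewSet):
    """Provides list/retrieve/create/update/delete endpoints for Item."""
    queryset = Item.objects.all()
    serializer_class = ItemSerializer

# urls.py -- register the ViewSet so the router generates the URL patterns
router = routers.DefaultRouter()
router.register(r"items", ItemViewSet, basename="item")
# urlpatterns = router.urls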
Environment: Python 2.6/3.4, Django, SQL, ZODB, MySQL, SQLite3, Git, DevOps, Azure, NoSQL, YAML, MongoDB, Golang, Bitbucket, pdb, AWS, Jira, Jenkins, Docker, PySpark, REST, Virtual Machine, AJAX, jQuery, JavaScript, Linux.
Client: Ahex Technologies – Hyderabad, India. Jun 2016 – Sep 2018
Role: Python Developer
Responsibilities:
Rewrote existing Python/Django modules to deliver data in specific formats.
Used Python and Django to create graphics, XML processing, data exchange, and business logic implementation.
Participated in the complete SDLC process.
Implemented Model View Control architecture using Django Framework to develop web applications.
Built Web apps using Flask frameworks and the MVC architecture.
Implemented responsive user interface and standards throughout the development and maintenance of the website using HTML, CSS, JavaScript, Bootstrap and jQuery.
Placed data into JSON files using Python to test Django websites. Used Python scripts to update the content in the database and manipulate files.
Developed tools using Python, shell scripting, and XML to automate some repetitive tasks (a small XML-parsing sketch follows this list).
Successfully migrated the Django database from SQLite to MySQL to PostgreSQL with complete data integrity.
Performed coding in C++ on Linux Platform.
Worked on various python libraries including OpenCV and Scikit-Learn.
Implemented RESTful Web-Services for sending and receiving data between multiple systems.
Worked on the development of SQL and stored procedures on MySQL.
Performed troubleshooting, fixed, and deployed many Python bug fixes of the two main applications that were a primary data source for customers and the internal customer service team.
Responsible for debugging the project monitored on JIRA (Agile).
Involved with the CI/CD pipeline management for weekly releases.
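As an illustration of the kind of XML-driven automation described above (see the XML tooling bullet in this list), the sketch below parses a simple XML file with ElementTree and loads rows into SQLite; the tag names, schema, and file paths are assumptions for illustration only.

# Sketch: parse an XML file and load records into SQLite (tags and schema are illustrative)
import sqlite3
import xml.etree.ElementTree as ET

def load_products(xml_path: str, db_path: str = "products.db") -> int:
    """Read <product> elements and insert (name, price) rows; returns the row count."""
    tree = ET.parse(xml_path)
    rows = [(p.findtext("name"), float(p.findtext("price", default="0")))
            for p in tree.getroot().iter("product")]
    with sqlite3.connect(db_path) as conn:       # commits automatically on success
        conn.execute("CREATE TABLE IF NOT EXISTS product (name TEXT, price REAL)")
        conn.executemany("INSERT INTO product (name, price) VALUES (?, ?)", rows)
    return len(rows)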
Environment: Python, Django, Flask, JavaScript, HTML, CSS, JSON, jQuery, XML, GIT, SQL, and Windows
Client: Protiviti – Hyderabad, India. Jul 2013 – May 2016
Role: Python Developer
Responsibilities:
• Interacted with clients and discussed their requirements during requirement gathering sessions.
• Created a user-friendly interactive website using Django’s Model-View-Controller pattern.
• Developed both dynamic and static web pages using Python’s Django framework.
• Worked on using Django’s APIs for accessing data from the database and supported Apache server on Linux platform.
• Used front end technologies such as JavaScript, HTML, CSS, Bootstrap, AngularJS and JSON to create an interactive front end for the webpage.
• Used PyTest and unittest modules to write test cases that handle multiple scenarios to avoid failures.
• Utilized Python Imaging Library to create specific images for user books.
• Used dimensionality reduction techniques such as PCA (Principal Component Analysis) to reduce the size of user images and store them efficiently (a short sketch follows this list).
• Worked on various python libraries including OpenCV and Scikit-Learn.
• Handled huge datasets and improved linear performance using Cassandra.
• Worked on data exchange, implemented business logic, and handled XML and HTML files using Python.
• Used SQL for querying on data for import/export and performed conversions.
• Created custom dashboards for end users using Django with an interactive design, connected to a backend PostgreSQL database.
• Actively worked in all stages of the project such as requirement analysis, design, implementation, testing and deployment.
• Implemented Object Oriented Programming to improve the reusability of code and help in reformatting later down the road.
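As a hedged illustration of the PCA-based image reduction mentioned above (see the dimensionality-reduction bullet), the sketch below compresses grayscale images to a small number of principal components and reconstructs them with scikit-learn; the image shape and component count are illustrative assumptions.

# Sketch: compress grayscale images with PCA and reconstruct them (parameters illustrative)
import numpy as np
from sklearn.decomposition import PCA

def compress_images(images: np.ndarray, n_components: int = 32):
    """images: array of shape (n_samples, height, width); returns (codes, reconstructions)."""
    n, h, w = images.shape
    flat = images.reshape(n, h * w).astype(np.float64)
    pca = PCA(n_components=n_components)
    codes = pca.fit_transform(flat)              # compact representation to store
    restored = pca.inverse_transform(codes)      # approximate reconstruction
    return codes, restored.reshape(n, h, w)

if __name__ == "__main__":
    demo = np.random.rand(100, 64, 64)           # stand-in for real user images
    codes, restored = compress_images(demo)
    print(codes.shape, restored.shape)           # (100, 32) (100, 64, 64)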
Environment: Python, Django, HTML5/CSS3, XML, AngularJS, MySQL, Cassandra, OpenCV, Scikit-learn, PostgreSQL, JavaScript, JSON, Bootstrap
EDUCATION:
Balaji Institute of Technology and Science, Warangal, India, Bachelor's in Computer Science (2013)