
Software Engineer – CI/CD

Location:
Cumming, GA
Posted:
October 24, 2023


Haritha Yenaga

Email: ad0llq@r.postjobfree.com

Phone: 614-***-****

Professional Summary:

Results-oriented and highly skilled Software Engineer with 9+ years of experience designing, developing, and delivering robust software solutions. Adept at collaborating with cross-functional teams to drive project success and enhance user experiences. Transitioned into a DevOps Engineer role for the past 2 years, leveraging that expertise to streamline development and deployment processes and improve efficiency and reliability.

Skills:

DevOps Tools: Jenkins, Ansible, Docker, Terraform, Kubernetes

Cloud Platforms: AWS (EC2, S3, Lambda, CloudFormation, IAM, Athena, VPC, Load Balancing)

Version Control: Git, GitHub

Database Systems: MySQL, PostgreSQL, RDS, AuroraDB, MongoDB

Programming Languages: UNIX shell scripting, VB Script, VB, Python, JavaScript, NodeJS

Web Technologies: HTML, CSS, JavaScript, NodeJS, RESTful APIs, JSON, XML

CI/CD: Continuous Integration, Continuous Deployment

Monitoring and Logging: CloudWatch, CloudTrail, Athena, Grafana, UptimeRobot, OpenSearch, Prometheus, Tivoli, CTRL-M, CA Workload Automation, AutoSys

Other Tools: TriZetto’s healthcare products such as Facets, CAE, QNXT, TMS

ETL Tools: Informatica PowerCenter, Informatica PowerMart, PowerExchange, PowerConnect for Siebel/DB2

Work Experience:

Brightree LLC, Peachtree Corners, GA Sep 2022 – Present

Associate DevOps Engineer

Description: With CitusHealth, home-based and post-acute care providers get a single digital solution that delivers robust care team collaboration and communication, increased patient and family caregiver engagement, and referral satisfaction.

Responsibilities:

Collaborated with cross-functional teams to design, implement, and manage AWS infrastructure using services like EC2, S3, Lambda, and CloudFormation.

Automated deployments with CI/CD pipelines built on AWS CodePipeline, reducing deployment times.

Implemented Infrastructure as Code (IaC) principles using CloudFormation templates and managed infrastructure updates and scaling efficiently.

Monitored and maintained application performance and availability using CloudWatch, CloudTrail, Grafana, Prometheus, and other monitoring tools.

Implemented security best practices by configuring IAM roles, security groups, and VPC settings to ensure a secure environment for applications.

Collaborated with development teams to enhance monitoring and logging strategies, ensuring quick identification and resolution of issues.

Managed Docker containers and orchestrated deployments using Amazon ECS and ECR, optimizing resource utilization.

Collaborated with development teams to define branching strategies and code-merging workflows using Git and GitHub.

Configured and monitored application APIs using the UptimeRobot service.

Migrated databases from EC2 instances to Amazon Relational Database Service (RDS).

Integrated Amazon SNS with CloudWatch alarms, enabling immediate notifications for operational issues and performance anomalies.

Configured and optimized ProxySQL instances to enhance database performance and scalability.

Created interactive dashboards using data from Amazon Athena, providing actionable insights to stakeholders.

Designed and implemented cron jobs to automate routine tasks, such as data backups and cleanup, resulting in increased operational efficiency.

Orchestrated the setup and management of Amazon OpenSearch clusters for log aggregation and analysis, enabling proactive issue detection and resolution.

Responsible for facilitating daily SCRUM stand-ups, sprint planning, backlog grooming, and retrospective meetings.

Environment: Jenkins, Ansible, Docker, Terraform, Kubernetes, AWS (EC2, S3, Lambda, CloudFormation, IAM, Athena, VPC, Load Balancing), Git, GitHub, MySQL, PostgreSQL, RDS, AuroraDB, MongoDB, UNIX shell scripting, JavaScript, NodeJS, Grafana, UptimeRobot, OpenSearch, Prometheus, CloudTrail, ForgeRock
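The cron-driven backup-and-cleanup automation described in the responsibilities above can be sketched in shell. This is a minimal illustration, not the actual Brightree script; real deployments would use fixed paths such as /var/app/data and /var/backups (hypothetical), while the sketch defaults to temporary directories so it runs anywhere.

```shell
#!/bin/sh
# Nightly backup sketch: archive a data directory into a timestamped
# tarball, then prune old copies. SRC/DEST defaults are temp dirs for
# this runnable sketch; production paths are hypothetical placeholders.
SRC="${SRC:-$(mktemp -d)}"
DEST="${DEST:-$(mktemp -d)}"
STAMP=$(date +%Y%m%d%H%M%S)

mkdir -p "$DEST"
# Archive the data directory into a timestamped tarball.
tar -czf "$DEST/data-$STAMP.tar.gz" -C "$(dirname "$SRC")" "$(basename "$SRC")"
# Prune backups older than 7 days (the cleanup half of the job).
find "$DEST" -name 'data-*.tar.gz' -mtime +7 -delete
echo "backup written: $DEST/data-$STAMP.tar.gz"
```

A crontab entry such as `0 2 * * * /opt/scripts/backup.sh` (path hypothetical) would run this nightly at 02:00.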

TriZetto, Englewood, Colorado Feb 2015 – Aug 2019

Infra Technology Specialist

Description: TriZetto, a Cognizant company and business unit within Cognizant’s healthcare practice, provides world-class information technology solutions to make better healthcare happen. TriZetto’s technology products, combined with Cognizant’s consulting, IT, and business process services at scale, dramatically simplify the deployment and adoption of technology and improve operations, helping to reengineer the business of healthcare today while reimagining it for tomorrow.

Responsibilities:

Worked in the Hosted Batch Operations team, which provides batch support to Cognizant’s hosted customers on TriZetto’s healthcare products such as Facets, QNXT, TMS, and CAE.

Worked with business and technical teams to gather requirements for batch job scheduling and to implement these batch jobs in schedulers such as CA Workload Automation AE and Tidal, with Splunk for log monitoring.

Extensively worked on batch failures, troubleshooting by debugging core and custom batch job logs and code. Handled ad-hoc requests on a daily basis in coordination with customers.

Built new jobs and schedules in multiple environments in CA Workload Automation and Tidal per user requests.

Identified and documented the root cause analysis (RCA) for failures, including the corrective action plan.

Optimized batch runs by analyzing the end-to-end batch flow to identify accurate batch dependencies for parallel job processing.

Conducted failure and performance audits, analyzing custom code changes in core job run files and multi-engine settings for better performance.

Established and maintained an error-tracking process, working closely with management to create corrective action plans where needed.

Provided L1.5 services on a 24/7 support basis.

Participated in EIMs and provided extended support in resolving them.

Created and owned tickets for ad-hoc requests, incidents, and CRs through the ticketing system.

Troubleshot Facets job failures as part of nightly batches.

Attended client meetings and daily Scrums to understand and analyze requirements.

Worked in an onsite-offshore model.

Ran jobs in Test, UAT, Production, and other custom environments.

Set up environments in the scheduler so that workflows were maintained as required and dependencies in Facets were met.

Environment: Windows 7, CA Workload Automation, Tidal Scheduler, Facets 5.01, SQL Server 2008, VB 6.0, VBScript, Splunk log monitoring, XML scripts.
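The dependency-driven scheduling described above (downstream jobs gated on upstream success, independent jobs free to run in parallel) can be illustrated with a JIL-style job definition of the kind used by CA Workload Automation AE / AutoSys. The job names, machine, and paths below are hypothetical placeholders, not actual TriZetto definitions:

```
/* Downstream Facets batch job: starts only after the prep job succeeds.
   Jobs with no condition between them are free to run in parallel. */
insert_job: facets_claims_post   job_type: CMD
command: /opt/facets/bin/run_batch.sh claims_post
machine: batchsrv01
owner: batchops
condition: s(facets_claims_prep)
std_out_file: /var/log/batch/facets_claims_post.out
std_err_file: /var/log/batch/facets_claims_post.err
alarm_if_fail: 1
```

Batch optimization of the kind described then amounts to auditing these `condition` clauses so that only true dependencies serialize the flow.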

Minvesta Infotech Ltd, Hyderabad, India May 2010 – Oct 2013

Informatica Developer

Description: MinVesta is an Independent Software Vendor (ISV) building innovative HR software products for enterprises. MinVesta's specialized HR IT services have helped organizations across continents manage their routine and complex Organizational processes with ease.

Responsibilities:

Analyzed business requirements by interacting with business users and application development teams, and acted as a Data Quality Analyst.

Designed and developed the ETL architecture to build and populate the data marts using Informatica.

Designed and developed Informatica mappings for data loads that included web services consumer, Source Qualifier, Aggregator, Joiner, Lookup, Filter, Router, Update Strategy, Expression and Sequence Generator transformations.

Used XML Parsers and Generators to handle inbound and outbound XMLs in Informatica.

Designed Parameter driven Informatica sessions and workflows for extracting, cleansing, transforming and loading the data from the heterogeneous sources into the data warehouse by using parameter files.

Worked with complex Cognos reports in Report Studio using master-detail relationship, drill through, drill up and drill down, burst options, and Prompts.

Implemented an SCD Type 2 strategy to update slowly changing dimensions and maintain history in the OLAP tables of the data mart star schema.

Developed Informatica mappings to cleanse source data and remove duplicate rows.

Converted VSAM source files to ASCII using Informatica PowerExchange.

Programmed Oracle SQL and T-SQL stored procedures, functions, triggers, and packages as back-end processes to create and update staging, log, and audit tables and to create primary keys.

Used Workflow Manager for session management, database connection management, and scheduling of jobs.

Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.

Performed Unit Testing and assisted QA team in Quality Assurance Testing, Load Testing and UAT, Performance estimation testing.

Ensured the execution of UAT test cases and documentation of test results.

Analyzed business and functional requirements to design SIT and UAT test cases.

Possess strong documentation and knowledge-sharing skills; conducted data modeling review sessions (Erwin) for different user groups and participated in requirement sessions to assess requirement feasibility.

Prepared functional and technical specification documents for ETL standards and strategy.

Developed views necessary for structured and ad-hoc reporting.

Managed security privileges for each subject area and dashboards according to user requirements.

Created groups in the repository and added users to the groups and granted privileges explicitly and through group inheritance.

Handled Full load and refresh load via staging tables in the ETL Layer.

Involved in ERD design and data modeling using a star schema.

Used DAC (Data Warehouse Administration Console) Client to manage, configure, customize, and monitor ETL process.

Environment: Informatica PowerCenter 8.1, PowerExchange 8.1.1, Oracle 9i, DB2, Oracle Warehouse Builder, XML, flat files, Erwin 3.2, Cognos, Windows 2003, Toad.

Clarivo Technologies, India Aug 2009 – April 2010

Informatica Developer/Intern

Description: Clarivo is a software development and consulting company that places its employees with clients, providing opportunities to work on many independent projects obtained from small firms.

Responsibilities:

Involved in designing and development of data warehouse environment.

Responsible for data modeling using Business Objects.

Designed and developed several ETL scripts using Informatica, UNIX shell scripts.

Extensively used all the features of Informatica 7.x including Designer, Workflow manager, Repository Manager and Workflow Monitor.

Developed and modified UNIX shell scripts to reset and run Informatica workflows using pmcmd in a UNIX environment. Conversant with Informatica API calls.

Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.

Worked with mappings using expressions, aggregators, filters, lookup, update strategy and stored procedures transformations.

Partitioned sources to improve session performance.

Extensively worked on UNIX shell scripting, using pmcmd to automate processes scheduled through AutoSys in a UNIX environment.

Created flexible mappings/sessions using parameters and variables and heavily using parameter files.

Improved session run times by partitioning sessions, and was heavily involved in database fine-tuning (creating indexes, stored procedures, etc.) and in partitioning Oracle databases.

Environment: Informatica PowerCenter 7.1, Oracle 9i/10g, PL/SQL, SQL*Plus, SQL Server, SQL*Loader, Business Objects.
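The pmcmd-driven workflow automation described above can be sketched as a small shell wrapper. The integration service, domain, folder, and workflow names below are hypothetical placeholders, and the assembled command is printed rather than executed, since pmcmd is only available on an Informatica node:

```shell
#!/bin/sh
# Sketch: assemble a pmcmd startworkflow invocation for a given
# folder/workflow pair. Service/domain names are hypothetical.
start_workflow() {
    folder="$1"
    workflow="$2"
    # -pv names an environment variable holding the password; -wait makes
    # the exit code reflect workflow success, which schedulers rely on.
    cmd="pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV -u ${PM_USER:-etl_user} -pv PM_PASSWD_VAR -f $folder -wait $workflow"
    # Printed, not executed, so the sketch runs without Informatica.
    echo "$cmd"
}

# Example: kick off the (hypothetical) sales load workflow.
start_workflow SRC_SALES wf_load_sales
```

In practice, a scheduler such as AutoSys invokes a wrapper like this, and the `-wait` exit code drives downstream job conditions.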

Education:

Master of Business Administration (MBA) – 2012

Bachelor of Technology (B.Tech) – 2009


