
Software Engineering SQL Server

Location: Foxborough, MA
Posted: May 01, 2024


Resume:

Ishika Singh

ad5epv@r.postjobfree.com | +1-508-***-**** | Boston, MA

EDUCATION

Amity University, Bachelor's in Science and Technology, Noida, India | Aug 2007 – May 2011

Relevant Courses: Programming Design Paradigms, Web Development, Algorithms, DBMS, Object Oriented Concepts, Data Structures & Algorithms, Software Engineering, Cloud Computing.

TECHNICAL SKILLS

●Programming & ETL: Informatica PowerCenter (ETL), Informatica IICS, PL/SQL, Python, Perl, C++, Unix shell scripting, Control-M, Greenplum.

●Databases & data platforms: Oracle, Salesforce, MS SQL Server, MongoDB, PostgreSQL, SAP BI/BW, Snowflake

●Cloud & DevOps: Amazon Redshift, Snowflake, AWS S3, Bitbucket

●Others: Windows, Unix, OBIEE, Power BI, GitHub, RStudio, Heroku, Netlify, Microsoft Office

PROFESSIONAL EXPERIENCE

Liberty Mutual Insurance | Database Engineer | Boston, MA | Jan 2021 – Nov 2023

●Developed ETL mappings and workflows in Informatica PowerCenter to populate fact and dimension tables within the star schema.

●Implemented complex transformations and business logic in Informatica to load data into star schema structures, ensuring data integrity and consistency.

●Utilized Perl scripting to extend the functionality and customization options within Informatica ETL (Extract, Transform, Load) workflows.

●Integrated Python scripting within Informatica Intelligent Cloud Services (IICS) to extend functionality and automate data integration processes.

●Developed custom Python scripts to augment ETL workflows in IICS, enabling advanced data manipulation, validation, and enrichment.

●Utilized Python libraries such as Pandas, NumPy, and Requests to perform complex data transformations and interact with external APIs within IICS (see the illustrative sketch at the end of this section).

●Developed mappings, transformations, and data pipelines using IICS Mapping Designer, leveraging its drag-and-drop interface for rapid development and deployment.

●Designed and implemented data integration workflows in IICS, including data ingestion, transformation, and loading processes, to support business needs.

●Utilized Amazon S3 as a key component of ETL workflows for data storage and management. Implemented data ingestion processes to extract raw data from various sources and store it securely in Amazon S3 buckets.

●Utilized SourceTree as a Git client to manage code repositories and branches, while leveraging Bitbucket as the central repository for version control and collaboration.

●Implemented Continuous Integration/Continuous Deployment (CI/CD) pipelines in Bitbucket to automate build, test, and deployment processes for Informatica ETL workflows.

●Developed the project using Agile Software Development methodology with daily scrum meetings and weekly sprints.
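
Illustrative sketch only, not the actual Liberty Mutual pipeline: the kind of Python step referenced above, using Pandas and Requests to clean a staged extract and enrich it from an external API. File names, fields, and the endpoint are placeholders.

# Hypothetical example: clean a staged CSV and enrich it via a placeholder API.
import pandas as pd
import requests

def enrich_policies(input_csv, output_csv):
    df = pd.read_csv(input_csv)                # raw extract staged by the ETL workflow
    df["premium"] = df["premium"].round(2)     # simple numeric cleanup
    # Look up a rate factor per state from a placeholder API endpoint.
    rates = requests.get("https://api.example.com/rates", timeout=30).json()
    df["rate_factor"] = df["state"].map(rates).fillna(1.0)
    df.to_csv(output_csv, index=False)         # hand the enriched file back to the workflow

if __name__ == "__main__":
    enrich_policies("policies_raw.csv", "policies_enriched.csv")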

Accenture US | Business Analyst and ETL Developer | Boston, MA | Nov 2017 – Dec 2020

●Proficient in transforming business requirements into functional specifications, focusing on workflow analysis and design, business process reengineering, user interface design, portal design, and process flow modeling.

●Applied strong analytical skills to understand business requirements, business rules, business processes, and the detailed design of the application.

●Demonstrated capability in driving and managing end-to-end delivery throughout the software development lifecycle (SDLC) within global multinational delivery organizations.

●Designed, customized, and implemented ETL workflows and processes using Informatica PowerCenter.

●Developed the production process to automate the ETL and created documentation to transition the process to the Production team.

●Collaborated with software developers and business analysts to understand application requirements and translate them into database solutions.

●Developed, maintained, and performance-tuned SQL queries, stored procedures, functions, and triggers to support business applications and reporting needs.

●Designed and implemented database schemas, tables, views, and indexes to efficiently store and retrieve data.

●Developed shell scripts to perform pre- and post-processing tasks, such as file manipulation, directory cleanup, and environment setup, to ensure smooth execution of ETL jobs (see the illustrative sketch at the end of this section).

●Utilized shell scripting to handle error handling, logging, and notification functionalities, improving visibility and monitoring of ETL activities.

●Provided proactive and reactive production support for ETL systems, resolving high-priority (P1) and medium-priority (P2) tickets to minimize downtime and maintain data integrity.

●Collaborated with cross-functional teams to prioritize and expedite resolution of critical incidents, ensuring SLA compliance and minimal impact on business operations.

●Documented support procedures and provided training to team members to streamline incident response and knowledge transfer.

●Created and maintained Control-M jobs to automate ETL processes, batch jobs, and other system workflows. Developed job schedules, dependencies, and notifications to optimize job execution and ensure timely completion.

●Developed mappings, sessions, and workflows in Informatica PowerCenter to extract, transform, and load data from various sources into Greenplum and PostgreSQL databases.

●Designed and implemented Informatica workflows to orchestrate data integration processes within Greenplum and PostgreSQL database environments. Leveraged merge functionality to efficiently synchronize and update data between source and target tables, optimizing performance and minimizing data redundancy.

●Worked with PL/SQL as well as MS SQL Server and PostgreSQL databases.

●Developed the project using Agile Software Development methodology with daily scrum meetings and weekly sprints.
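
Illustrative sketch only: the shell-based pre- and post-processing described above, rendered here in Python for readability (the actual work used Unix shell scripts; paths and retention values are placeholders).

# Hypothetical housekeeping around an ETL run: environment setup, log cleanup,
# and archiving of processed files, with basic logging for visibility.
import logging
import shutil
from datetime import datetime, timedelta
from pathlib import Path

STAGING = Path("/data/etl/staging")    # assumed landing area for incoming files
ARCHIVE = Path("/data/etl/archive")
LOG_DIR = Path("/data/etl/logs")
RETENTION_DAYS = 14                    # assumed retention window

logging.basicConfig(level=logging.INFO)

def pre_process():
    STAGING.mkdir(parents=True, exist_ok=True)          # environment setup
    cutoff = datetime.now() - timedelta(days=RETENTION_DAYS)
    for log_file in LOG_DIR.glob("*.log"):              # directory cleanup
        if datetime.fromtimestamp(log_file.stat().st_mtime) < cutoff:
            log_file.unlink()
            logging.info("Removed stale log %s", log_file.name)

def post_process():
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    for data_file in STAGING.glob("*.csv"):             # file manipulation
        shutil.move(str(data_file), str(ARCHIVE / data_file.name))
        logging.info("Archived %s", data_file.name)

if __name__ == "__main__":
    pre_process()
    # ... the ETL job itself would run here (e.g., triggered by Control-M) ...
    post_process()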

Accenture Services Pvt. Ltd | ETL Informatica Developer | Mumbai, Maharashtra | Jun 2011 – Sep 2016

●Communicated with business customers to discuss the issues and requirements.

●Created debug sessions ahead of the actual session runs to validate transformations, and used existing mappings extensively in debug mode for error identification, setting breakpoints and monitoring the output in the Debugger.

●Improved mapping performance by moving filter transformations earlier in the pipeline, pushing filtering down to the Source Qualifier for relational sources, and selecting the table with fewer rows as the master in Joiner transformations.

●Provided production support to resolve ongoing issues and troubleshoot problems, reducing ticket volume by almost 90% within two years through continuous application improvement, resolution of long-pending performance issues, and recommendations to vendors, who in turn supplied newer versions of the application binaries.

●Developed mappings and workflows to load data into Oracle tables.

●Developed various transformations such as Source Qualifier, Update Strategy, Lookup, Expression, and Sequence Generator for loading data into target tables.

●Created workflows, tasks, and database connections using Workflow Manager; developed complex Informatica mappings and tuned them for better performance.

●Created and scheduled sessions, and recovered failed sessions and batches.

●Extracted data from Oracle and flat files.

●Implemented performance tuning by identifying and resolving bottlenecks in sources, targets, transformations, mappings, and sessions.

●Prepared and executed test cases using the UAT tool; involved in system testing, regression testing, and security testing. Created the ETL run book and actively participated in all phases of testing.

●Responsible for identifying records missed at various stages from source to target and resolving the issues.

●Used Debugger to test the mappings and fix bugs.

●Used Informatica to Extract and Transform the data from various source systems by incorporating various business rules.

●Developed standards and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, Sorter, Union, and Filter.

PROJECTS

Informatica – Balance and Control Analysis

●Designed and developed a balance-and-control analysis within the project to ensure the accuracy and integrity of data in the organization's systems. The analysis compares different data sets or data sources to identify discrepancies, errors, or inconsistencies, which is crucial for ensuring that the data the organization processes and uses is reliable and trustworthy (a simplified sketch follows).
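
A simplified sketch of the balance-and-control idea (file and column names are assumptions, not the project's actual structures): compare row counts and a control total between a source extract and the corresponding target load, and flag any discrepancy.

# Hypothetical balance-and-control check between a source extract and a target load.
import pandas as pd

def balance_check(source_csv, target_csv, amount_col="amount"):
    src = pd.read_csv(source_csv)
    tgt = pd.read_csv(target_csv)
    result = {
        "source_rows": len(src),
        "target_rows": len(tgt),
        "source_total": round(src[amount_col].sum(), 2),
        "target_total": round(tgt[amount_col].sum(), 2),
    }
    result["row_diff"] = result["source_rows"] - result["target_rows"]
    result["amount_diff"] = round(result["source_total"] - result["target_total"], 2)
    result["balanced"] = result["row_diff"] == 0 and result["amount_diff"] == 0
    return result

if __name__ == "__main__":
    print(balance_check("source_extract.csv", "target_load.csv"))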

Data Science with Python – Price Prediction

●Created a model to determine the selling price of a used car: performed univariate analysis to understand feature distributions, multivariate analysis to determine correlations, and regression analysis to compare the performance of all models on the same hold-out/test data set (see the sketch below).
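
Minimal sketch of the modeling step described above (the dataset path, feature names, and the choice of a plain linear regression are assumptions; the actual project compared several models on the same hold-out set).

# Hypothetical baseline: fit a regression model and score it on a hold-out set.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("used_cars.csv")              # placeholder dataset
features = ["year", "mileage", "engine_size"]  # assumed numeric features
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["price"], test_size=0.2, random_state=42
)

model = LinearRegression().fit(X_train, y_train)
print("Hold-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))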


