
Sr Software Engineer Business Intelligence

Location: Seattle, WA
Salary: $50
Posted: July 16, 2025


Resume:

Harika Vinjamuri

*************@*****.*** | 615-***-**** | Seattle, WA | linkedin.com/in/harikavinjamuri

PROFESSIONAL SUMMARY

Business Intelligence professional with 7 years of experience delivering end-to-end BI solutions, from data extraction and pipeline development to advanced analytics and interactive dashboarding. Proven expertise in data analysis, ETL design, and transforming complex datasets into actionable insights that drive strategic business decisions. Skilled at collaborating with cross-functional teams, leading data initiatives, and refining reporting to support data-driven business growth.

EDUCATION

Master of Science - Data Science and Artificial Intelligence | University of Central Missouri, Missouri, USA | May 2024 – Jun 2025 | GPA: 3.9

Bachelor of Technology - Information Technology | Kakatiya University College of Engineering and Technology, Telangana, India | Oct 2012 – May 2016 | GPA: 3.96

WORK EXPERIENCE

Senior Software Engineer, Ivy Comptech | Aug 2021 – Jun 2023 | India

• Collaborated closely with cross-functional stakeholders to gather business and technical requirements, achieving a deep understanding of existing systems to design effective data integration solutions.
• Led the integration of the Cashier data flow for acquired platforms (Eurobet, Lads, and Enlabs), implementing data compression and partitioning techniques to significantly optimize storage utilization and query performance.

• Designed and developed scalable ETL workflows using Informatica PowerCenter, applying macros and stored procedures to ensure seamless compatibility with the organization’s legacy data infrastructure.
• Built and managed data ingestion pipelines using Apache Airflow (DAGs) to automate and orchestrate data movement into Snowflake, enabling timely access to high-quality data for downstream analytics and reporting teams; a minimal DAG sketch follows.
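For illustration only, a minimal sketch of the Airflow-to-Snowflake ingestion pattern described above. The DAG name, schedule, connection details, table, and stage are hypothetical placeholders rather than the actual Ivy Comptech pipeline, and the sketch assumes Airflow 2.x with the snowflake-connector-python package installed.

from datetime import datetime

import snowflake.connector  # assumes snowflake-connector-python is installed
from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_snowflake(**_context):
    # Hypothetical connection details; a real pipeline would read these from a secrets backend.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="CASHIER_DB",
        schema="STAGING",
    )
    try:
        # COPY INTO loads staged files into the target table (illustrative object names).
        conn.cursor().execute(
            "COPY INTO cashier_transactions FROM @cashier_stage FILE_FORMAT = (TYPE = CSV)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="cashier_ingestion",      # hypothetical DAG name
    start_date=datetime(2022, 1, 1),
    schedule_interval="@hourly",     # assumed cadence
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_cashier_data",
        python_callable=load_to_snowflake,
    )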

Software Developer – BI, Technovert | Oct 2019 – Jul 2021 | India

Client - Englert Leaf Guard

• Designed and implemented data pipelines in Azure Data Factory (ADF) to transfer data from shared storage to Azure SQL Database, with all data transformations handled via optimized stored procedures for performance and maintainability.

• Configured multiple ADF triggers to automate and orchestrate data workflows based on dynamic file arrival schedules, improving operational efficiency and reducing manual intervention.
• Developed a tabular model in Analysis Services to offload complex calculations from Power BI, centralizing reusable measures and columns to streamline report performance and maintainability.
• Enforced row-level and object-level security across Power BI reports to ensure controlled data access based on user roles and organizational hierarchy, supporting secure and scalable reporting practices.

Client - First National Bank of Pennsylvania

• Led data profiling and analysis efforts to design and build a scalable Enterprise Data Warehouse (EDW) for key banking modules, including credit cards, loans, teller, and brokerage, ensuring high data quality and maintainability.

• Optimized data transfer across staging, integration, and reporting layers by leveraging stored procedures with direct views, effectively minimizing table-level load and enhancing system performance.
• Developed robust SSIS data flows to orchestrate stored procedure execution using parallel, sequential, and conditional logic, reducing overall ETL processing time from 8 hours to under 5.5 hours.
• Managed deployments using SSIS DB mode and automated scheduling through SQL Server Agent, ensuring consistent and timely data pipeline execution.
• Maintained detailed documentation across all phases of analysis and development to support traceability, stakeholder communication, and ongoing system enhancements.

Client - First National Bank of Pennsylvania

• Developed analytics views in Azure DevOps to efficiently extract and transform application utilization data.
• Executed advanced data shaping and refinement in Power Query, including aggregations, conditional column creation, unpivoting, and merging/appending queries to prepare clean, consolidated datasets for analysis (a pandas analogy follows this role's bullets).
• Conducted comprehensive data validations to ensure report accuracy, maintaining high data quality standards across the organization.
• Collaborated closely with cross-functional stakeholders to understand requirements, offer technical expertise, and iteratively improve BI reports for enhanced business impact and user experience.
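As an analogy only (the actual shaping was done in Power Query, not Python), a short pandas sketch of the same kinds of steps: a conditional column, an unpivot, a group-by aggregation, and a merge. The sample data and column names are invented for illustration.

import pandas as pd

# Hypothetical extract of application-utilization records.
usage = pd.DataFrame({
    "team": ["Payments", "Payments", "Claims", "Claims"],
    "app": ["Portal", "API", "Portal", "API"],
    "jan_hits": [120, 340, 80, 95],
    "feb_hits": [150, 300, 90, 110],
})

# Conditional column, similar to "Add Conditional Column" in Power Query.
usage["volume_band"] = pd.cut(
    usage["jan_hits"] + usage["feb_hits"],
    bins=[0, 200, 500, float("inf")],
    labels=["low", "medium", "high"],
)

# Unpivot the month columns into rows (Power Query "Unpivot Columns").
long_form = usage.melt(
    id_vars=["team", "app", "volume_band"],
    value_vars=["jan_hits", "feb_hits"],
    var_name="month",
    value_name="hits",
)

# Aggregate (Power Query "Group By").
summary = long_form.groupby(["team", "month"], as_index=False)["hits"].sum()

# Merge with a hypothetical lookup table (Power Query "Merge Queries").
teams = pd.DataFrame({"team": ["Payments", "Claims"], "org": ["Ops", "Risk"]})
report = summary.merge(teams, on="team", how="left")
print(report)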

Software Engineer, ValueLabs LLP | Jun 2016 – Oct 2019 | India

Client - Mitchell International Inc.

• Led the extraction and loading of insurance data from Oracle to an MS SQL Data Warehouse using SSIS, building efficient ETL pipelines to support business analytics.
• Managed data processing across multiple layers with standardized stored procedures and functions, and ensured code quality by actively participating in code reviews.
• Designed and developed advanced Tableau reports with calculated fields, parameters, and visualizations like scatter plots and stacked bars; these high-impact reports were critical in identifying fraudulent claims through outlier analysis.

• Created and maintained detailed documentation for releases to support future development and knowledge sharing.

• Drove data-focused efforts to improve fraud detection, helping reduce financial risk and strengthen compliance.

Client - Info choice

• Contributed to the design of the data architecture for migrating from MySQL to MS SQL, ensuring a smooth transition and alignment with business requirements.
• Diagnosed and resolved compatibility challenges between MySQL and MS SQL, rewriting and optimizing SQL code for the new platform to maintain functionality and performance.
• Executed ad-hoc data migrations using Data Transformation Services (DTS) in the development environment to support iterative testing and validation.

• Developed and maintained SSIS data flows to synchronize data between MySQL and MS SQL environments during the migration phase, automating the process with scheduled jobs via SQL Server Agent to ensure data consistency until full stabilization of the target system.

Client - Integral

• Designed and implemented Slowly Changing Dimensions (SCD Type 1 & Type 2) to support accurate historical data tracking and streamline the ETL process for better maintainability and performance (a minimal SCD Type 2 sketch follows this role's bullets).
• Optimized SQL stored procedures, leading to a measurable 15% reduction in pipeline runtime, enhancing overall system efficiency.

• Developed dynamic Tableau dashboards and workbooks for business users to explore trends and key metrics, incorporating advanced features such as cascading filters, calculated fields, individual axis, blended axis, and dual axis visualizations for enhanced interactivity and insight.
• Evaluated reports thoroughly to ensure accurate data, reliable performance, and consistent user experience.
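For illustration, a minimal pandas sketch of the SCD Type 2 pattern referenced above: changed rows are expired and new versions are appended so history is preserved. The column names (is_current, start_date, end_date) are hypothetical; the actual implementation used SQL stored procedures and SSIS.

import pandas as pd


def apply_scd_type2(dim, incoming, key, attrs, load_date):
    """Expire changed dimension rows and append new versions (SCD Type 2)."""
    dim = dim.copy()
    current = dim[dim["is_current"]]

    # Compare current attribute values against the incoming feed.
    merged = current.merge(incoming, on=key, suffixes=("_old", "_new"))
    diff = (
        merged[[f"{a}_old" for a in attrs]].values
        != merged[[f"{a}_new" for a in attrs]].values
    ).any(axis=1)
    changed_keys = merged.loc[diff, key]

    # Close out superseded versions.
    expire = dim[key].isin(changed_keys) & dim["is_current"]
    dim.loc[expire, ["is_current", "end_date"]] = [False, load_date]

    # Brand-new keys and new versions of changed keys become the current rows.
    new_keys = set(incoming[key]) - set(current[key])
    inserts = incoming[incoming[key].isin(changed_keys) | incoming[key].isin(new_keys)].copy()
    inserts["is_current"] = True
    inserts["start_date"] = load_date
    inserts["end_date"] = None
    return pd.concat([dim, inserts], ignore_index=True)

A SQL implementation would typically express the same expire-then-insert steps as an UPDATE followed by an INSERT, or as a single MERGE statement.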

TECHNICAL SKILLS

Programming and Scripting Languages — JavaScript, Python, C, C++, OOPS, DAX
Frameworks and Software — Apache Spark, NodeJS, REST APIs, Tabular Editor
Databases — SQL, Teradata, Oracle, MongoDB
IDE — Visual Studio Code, IntelliJ, PyCharm, Jupyter Notebook, Anaconda
Version Control System — Git, Team Foundation Version Control (TFVC)
Packages/Libraries — Pandas, NumPy, Seaborn, SciPy, Matplotlib, Plotly, ggplot, Scikit-learn
Visualization Tools — Power BI, Tableau, Excel, SSRS
ETL Tools — Informatica PowerCenter, Azure Data Factory, SSIS
AI Concepts — Markov Decision Process, Reinforcement Learning
Predictive Modeling — Linear Regression, Logistic Regression, KNN, Naive Bayes, Dimensionality Reduction, Decision Tree, Random Forest, SVM, K-means

ACADEMIC PROJECTS

Predicting Forest Fires

• Built a machine learning model using Random Forest, SVM, and KNN algorithms to predict forest fires, achieving 93% accuracy with KNN.

• Processed the dataset, incorporating meteorological and topographical attributes, and conducted EDA using Python libraries.

• Proposed a generalizable prediction system for wildfire management using the data sample; a minimal classifier-comparison sketch follows.
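For illustration, a minimal scikit-learn sketch of the classifier comparison described above (Random Forest, SVM, KNN). The file name and the feature/label columns are hypothetical placeholders, and no particular accuracy figure is implied.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical dataset with meteorological/topographical features and a binary "fire" label.
df = pd.read_csv("forest_fires.csv")
X = df.drop(columns=["fire"])
y = df["fire"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}

# Fit each model and report held-out accuracy for comparison.
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")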

Vending Machine Software Implementation

• Developed a full-stack web app simulating a vending machine with real-time inventory tracking.

• Built RESTful APIs with Node.js and Express.js to manage product transactions and inventory data in MongoDB.

• Created a responsive frontend using HTML, CSS, and JavaScript for intuitive user interactions.

• Implemented CRUD operations and dynamic updates to ensure seamless user experience and data accuracy.

E-Commerce Platform Development

• Built a dynamic web platform allowing users to browse products, register/login, and manage shopping carts.

• Designed an admin panel to oversee user orders, add categories/products, and manage registered users.

• Implemented a secure and scalable system using MySQL and PHP.


