Post Job Free

Business Intelligence Developer & Analyst Expert

Location:
Maple Grove, MN
Posted:
January 08, 2026

Contact this candidate

Resume:

GRACELINE SAMROY ANAND

Business Intelligence Developer and Analyst

Mobile: +1-612-***-****

Email: *********.*******@*****.***

PROFESSIONAL SUMMARY

7+ years of IT experience as a Business Intelligence Developer and Analyst in the development of business applications, database design and development, and business intelligence in the Banking, Finance, Retail, Real Estate and Pharmaceutical domains.

Extensively worked on MS SQL Server, Netezza and Oracle databases.

Strong experience in T-SQL, query optimization and developing ETL strategies.

Designed and created databases, tables, views, stored procedures and triggers.

Experienced in writing complex T-SQL queries involving multiple tables using SQL joins and subqueries.

Experienced in writing complex SQL stored procedures and constructing tables, triggers, user-defined functions, views, indexes, relational data models and data integrity constraints.
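
As an illustration of the join and subquery patterns above, the sketch below uses Python's built-in sqlite3 module as a lightweight stand-in for SQL Server; the Customers/Orders schema and data are hypothetical, not from any client engagement.

```python
import sqlite3

# Hypothetical schema used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Customers (CustomerID INTEGER PRIMARY KEY, Name TEXT, Region TEXT);
CREATE TABLE Orders (OrderID INTEGER PRIMARY KEY, CustomerID INTEGER, Amount REAL);
INSERT INTO Customers VALUES (1, 'Acme', 'US'), (2, 'Globex', 'CA');
INSERT INTO Orders VALUES (10, 1, 250.0), (11, 1, 90.0), (12, 2, 40.0);
""")

# A join plus a nested subquery: customers whose order total exceeds
# the average customer total.
rows = conn.execute("""
SELECT c.Name, SUM(o.Amount) AS Total
FROM Customers c
JOIN Orders o ON o.CustomerID = c.CustomerID
GROUP BY c.CustomerID, c.Name
HAVING SUM(o.Amount) > (
    SELECT AVG(t.Total)
    FROM (SELECT SUM(Amount) AS Total FROM Orders GROUP BY CustomerID) t
)
""").fetchall()
print(rows)  # [('Acme', 340.0)]
```

The same shape carries over to T-SQL; only the connection layer differs.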

Used Power Query in Microsoft Power BI to extract data from external sources and reshape it into the format required in Excel, and created SSIS packages to load Excel sheets from a PC into the database.

Used Power Pivot in Power BI to develop a data analysis prototype, and used Power View and Power Map to visualize reports.

Used Power BI for developing reports, data marts, KPIs and charts.

Developed Power BI Reports and Dashboards from various sources.

Experience in assigning row-level security to datasets in Power BI Desktop.

Experience in creating dashboards from various sources with the Q&A feature in the Power BI service.

Experience in implementing incremental refresh for datasets in Power BI Premium.

Experience in designing various bookmarks and buttons in Power BI desktop.

Expertise in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS) with good knowledge on SQL Server Analysis Services (SSAS)

Experience in report writing using SQL Server Reporting Services (SSRS) and creating various types of reports like drill down, Parameterized, Cascading, Conditional, Table, Matrix, Chart and Sub Reports

Experience in developing, monitoring, extracting and transforming data using DTS/SSIS, Import Export Wizard, and Bulk Insert.

Developed SSRS reports on SQL Server (2008/2012/2016). Sound experience and understanding of SSAS, OLAP cubes and architecture.

Expertise in creating SSIS packages to integrate data over OLE DB connections from homogeneous and heterogeneous sources (Excel, CSV, flat files, Oracle), using transformations such as Data Conversion, Conditional Split, Slowly Changing Dimension, Bulk Insert and Merge Join.
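
The Slowly Changing Dimension step mentioned above can, in its simplest type-1 form, be sketched as an upsert from a staging table. This is an illustrative sketch using sqlite3 in place of SQL Server, with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE DimProduct (ProductKey INTEGER PRIMARY KEY, Name TEXT, ListPrice REAL);
CREATE TABLE Staging    (ProductKey INTEGER PRIMARY KEY, Name TEXT, ListPrice REAL);
INSERT INTO DimProduct VALUES (1, 'Widget', 9.99);
INSERT INTO Staging    VALUES (1, 'Widget', 12.49), (2, 'Gadget', 3.00);
""")

# Type-1 SCD: overwrite changed attributes, insert keys not yet seen
# (the "changing attribute" behavior of the SSIS SCD transformation).
conn.execute("""
INSERT INTO DimProduct (ProductKey, Name, ListPrice)
SELECT ProductKey, Name, ListPrice FROM Staging WHERE TRUE
ON CONFLICT(ProductKey) DO UPDATE SET
    Name = excluded.Name,
    ListPrice = excluded.ListPrice
""")

rows = conn.execute("SELECT * FROM DimProduct ORDER BY ProductKey").fetchall()
print(rows)  # [(1, 'Widget', 12.49), (2, 'Gadget', 3.0)]
```

In T-SQL the same step is usually written as a MERGE statement; type-2 history (new row per change, with effective dates) adds validity columns on top of this pattern.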

Strong experience in all phases of Software Development Lifecycle (SDLC) using Waterfall, Agile/Scrum.

Well versed in agile ceremonies and able to manage work with stakeholders, the Product Owner and Product Owner delegates to effectively scope user stories.

Focused troubleshooter and a team player with excellent interpersonal and communication skills. Ability to work cohesively with developers, other team members, and testers.

Analyzed and evaluated customer business requirements and developed solutions to meet them; experienced on agile projects as a Business Analyst.

Acted as the primary interface between business-unit end users and the systems development/applications and programming areas.

TECHNICAL SKILLS

Databases : MS SQL Server 2019/2016, Oracle 10g/11g/12c, MS Access, Azure SQL, Snowflake and Netezza.

Business Intelligence Tools : Microsoft Power BI, Informatica, SSRS, SSIS and SSAS

Programming Languages : C, C++, Python and UNIX system programming.

Tools : MS SQL Server Management Studio, SQL*Plus, MS SQL Profiler, Query Analyzer, Deep Checker, Code Checker, Make, GDB, Tera Term, PuTTY, Wireshark, SoapUI, Azure DevOps, Jira, REST APIs and Postman.

Operating Systems : UNIX, VxWorks (RTOS), Embedded Linux, Windows 10/11, Windows Server, Solaris and AIX.

Scripting Languages : Unix shell scripting and Perl scripting.

Middleware : IBM MQ

Version Control Systems : VSS, PVCS, TFS and Git.

IDEs : Microsoft Visual Studio and Visual Studio Code

EDUCATION

Bachelor of Engineering (B.E), Anna University, Chennai, India - 2007

PROFESSIONAL EXPERIENCE

Client: Hunter Roberts, New York, NY

Project: Contracts Management Portal

Role: Business Intelligence Developer and Analyst

Period: Apr 2024 – till date

The Contracts Management Portal is the window through which Hunter Roberts' construction management team manages contracts made with customers, vendors and partners. The portal includes, but is not limited to, the business processes, structure and resources that will be applied during the contract management phase.

Responsibilities

Involved in the analysis, design and development of a project to consume transactional data, consolidate it into a data warehouse, and create reports, dashboards, datasets and tabular models.

Gathered business requirements, defined and designed the data sources, data flows, data quality analysis, worked in conjunction with the data warehouse architect on the development of logical data models.

Met with the CIO and IT Infrastructure Architect on a weekly basis to update them on project progress.

Gave Power BI report and dashboard demos to the VPs of the Global, Sales and Operations accounts.

Translated the business needs into technical specifications.

Created reports using pie charts, column charts, scatter charts, slicers, KPIs, and table and matrix visuals.

Wrote complex DAX functions in Power BI and Power Pivot.

Worked extensively on advanced analysis, creating toggle buttons, bookmarks, drill-down, drill-through, calculations, parameters, background images and maps.

Worked in the Power BI environment to create daily, weekly and monthly dashboards using Power BI Desktop and published them to the server.

Created Power BI Workspaces and created AD groups for end users to access reports and dashboards.

Used Microsoft Power BI Power Query to extract data from external sources and modify data to certain format as required in Excel.

Published Power BI Desktop reports created in Report view to the Power BI Service.

Made changes to the existing Power BI Dashboard on a regular basis as per requests from Business.

Used Power Pivot in Power BI to develop a data analysis prototype, and used Power View and Power Map to visualize reports.

Wrote complex SQL statements using joins, subqueries and correlated subqueries.
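
A minimal sketch of the correlated-subquery pattern, again with sqlite3 standing in for SQL Server and an invented Invoices table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Invoices (InvoiceID INTEGER PRIMARY KEY, Vendor TEXT, Amount REAL);
INSERT INTO Invoices VALUES
 (1, 'NorthCo', 100.0), (2, 'NorthCo', 180.0),
 (3, 'SouthCo', 70.0),  (4, 'SouthCo', 60.0);
""")

# Correlated subquery: each vendor's single largest invoice.  The inner
# query is re-evaluated per outer row, correlated on Vendor.
rows = conn.execute("""
SELECT i.Vendor, i.Amount
FROM Invoices i
WHERE i.Amount = (SELECT MAX(i2.Amount)
                  FROM Invoices i2
                  WHERE i2.Vendor = i.Vendor)
ORDER BY i.Vendor
""").fetchall()
print(rows)  # [('NorthCo', 180.0), ('SouthCo', 70.0)]
```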

Used SQL Profiler for troubleshooting, monitoring and optimization of SQL Server, and for reviewing non-production database code and T-SQL code from developers and QA.

Converted all the existing Excel reports into Power BI Reports and created Tabular, Matrix Reports, Charts, Line Graphs, Stacked Bar Charts and Pie Charts and added default date parameters to refresh the data monthly in the reports.

Environment: Microsoft SQL server 2019, Power BI 2020, SQL Server Data Tools, Microsoft Office 2019, TimeXtender, Azure SQL.

Client: CDW, Chicago, IL

Project: E-Commerce

Role: Business Intelligence Developer and Analyst

Period: Jan 2022 - Mar 2024

A web-based e-commerce application that helps users purchase computer products, with order management, cart management, online tracking and other reporting features; it is used mainly by the US government and by US and Canadian corporate users.

Responsibilities

Implemented process improvement best practices and continuous process improvement initiatives.

Created Power BI visualization of Dashboards & Scorecards (KPI) for AMI Project.

Configured the on-premises data gateway for Oracle and SQL Server databases.

Created workspaces, apps and alerts, and used Microsoft Flow to create alert emails for Business and IT.

Analyzed business requirements and created business/process flow charts, database flow charts and a reporting-tool database for Research.

Created different reports and dashboards using bookmarks, heat maps, scatterplots, bar charts, slicers, drill down, drill through and various other parameters.

Managed departmental reporting systems, troubleshooting daily issues in Power BI.

Designed and implemented multiple dashboards using Power BI – Power Pivot & Power Query tools for in house metrics.

Integrated custom visuals based on business requirements using Power BI desktop.

Provided maintenance and bug fixes for existing and new Power BI reports.

Used Power BI custom visuals such as Power KPI, Chiclet Slicer and Gantt Chart Scroller to make visuals clearer and easier to understand.

Responsible for creating and changing the data visualizations in Power BI reports and Dashboards on client requests.

Deployed Dashboards, Reports and Data sources in the Power BI Service

Used Power Pivot in Power BI to develop a data analysis prototype, and used Power View and Power Map to visualize reports.

Involved in unit testing, quality assurance testing and user acceptance testing of Power BI reports, modifying the code accordingly to match users' business requirements.

Environment: Microsoft Visual studio 2019, Microsoft SQL server 2019, Power BI 2018, SQL Server Data Tools, Microsoft Office 2019

Client: C&W – Cushman & Wakefield, Chicago, IL

Project: LoopNet Listings Integration

Role: Business Intelligence Developer and Analyst

Period: Jun 2020 – Dec 2021

Global Site Solutions is a search-based property management system intended for two types of users, Researchers and Brokers, in the USA, EMEA and APAC regions, used to store and search property information (Office / Industrial). The Researcher module is an editable interface for accessing and editing information pertaining to Building Profile, Availability, Lease and Sale History. Other key features of the Researcher module are User Permission Maintenance, Audits, Survey, Statistical Reports, Absorption, Search pages (Building / Availability / Lease / Sale) and integrated Google Maps. The Broker module is a read-only view consisting of Search pages, Survey, reports and integrated Google Maps.

Responsibilities

Demonstrated expertise in designing, developing and deploying Business Intelligence solutions using SSIS, SSRS and SSAS.

Proven ability in utilizing ETL tools including SQL Server Integration Services (SSIS), Data Transformation Services (DTS) and ETL package design.

Created and Configured Data Source & Data Source Views, Dimensions, Cubes, Measures, Partitions, KPI’s & MDX Queries using SQL Server 2016 Analysis Services.

Created measures and calculated columns using DAX.

Created SSAS Models from various sources like Netezza and SQL Server.

Used Power BI for developing reports, data marts, KPI’s and charts.

Developed Power BI Reports and Dashboards from various sources.

Assigned row-level security to datasets in Power BI Desktop.

Created Bookmarks and linked those bookmarks with Buttons in Power BI Desktop.

Created Workspace in Power BI service and published reports in that workspace.

Used GitHub to sync, pull, push and commit project code and Power BI reports.

Environment: Microsoft Visual studio 2017, Microsoft SQL server 2016, Microsoft Power BI, SQL Server Data Tools, Microsoft Office 2016, Git and Netezza.

Client: CBRE, Austin, TX

Project: CAAPS

Role: Business Intelligence Developer and Analyst

Period: Jan 2019 – May 2020

CAAPS is a web-based corporate accounting invoices-payable solution used by processors and superusers in the US and Canada regions to scan invoices received from vendors and process them for approval through workflows, for further synchronization into the PeopleSoft application.

Responsibilities

Used Power Query in Microsoft Power BI to extract data from external sources and reshape it into the format required in Excel, and created SSIS packages to load Excel sheets from a PC into the database.

Used Power Pivot in Power BI to develop a data analysis prototype, and used Power View and Power Map to visualize reports.

Created SSIS packages to extract, transform and load (ETL) user data and other data from OLTP to OLAP systems.

Used SSIS Foreach Loop containers to implement data updates from various sources into the data warehouse.

Performed the ETL process to load data from various sources into the upgraded database using SSIS 2016.

Used Aggregate, Conditional Split, Data Conversion, Derived Column, Lookup, Merge, Merge Join, Multicast, Sort and Union All transformations in different SSIS packages based on requirements.

Created reports for the Sales team and provided input to upper management comparing current sales to the previous YTD and the current YTD budget.

Created Crystal Reports for the new ERP system.

Created weekly sales reports tracking current MTD sales against prior-year MTD and against the MTD budget.

Performed month-end postings and created reports for the Finance department.

Created scrap reports and helped the Scrap team understand the data and bring down the scrap percentage by comparing against previous months' scrap reports.

Assigned new users to the network; set up new accounts and linked them to Active Directory; and helped users with access to the Flexan network and with mapping drives requested by management.

Environment: Microsoft Visual studio 2017, Microsoft SQL server 2016, Microsoft Power BI, SQL Server Data Tools, Microsoft Office 2013, Crystal Reports

Client: International Literacy Association, Newark, DE

Project: E-Business and Back Office

Role: Business Intelligence Developer and Analyst

Period: Jun 2018 – Dec 2018

E-Business and Back Office are literacy management applications intended for teaching professionals, IRA staff, and regular members and non-members. E-Business is the online business site of the IRA, used for key literacy transactions such as membership registration and renewal, online product purchases, abstract management, the Career Center and customer service. Back Office is built on the .NET-based Aptify framework, which helps IRA staff manage applications such as Customers, Users, Orders, Accounting, Abstract, Messaging Systems, Campaigns, Product, Process Flows and Entities.

Responsibilities

Created databases in MS SQL Server 2017 and MS Access.

Created and managed schema objects such as tables, views, indexes, stored procedures and triggers, maintaining referential integrity.
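
The trigger-plus-referential-integrity pattern in this bullet can be sketched as follows; sqlite3 stands in for SQL Server 2017, and the Members/AuditLog schema is invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce the REFERENCES constraint below
conn.executescript("""
CREATE TABLE Members (MemberID INTEGER PRIMARY KEY, Status TEXT DEFAULT 'active');
CREATE TABLE AuditLog (LogID INTEGER PRIMARY KEY AUTOINCREMENT,
                       MemberID INTEGER REFERENCES Members(MemberID),
                       Note TEXT);

-- Trigger: record every status change, a common change-tracking pattern.
CREATE TRIGGER trg_member_status AFTER UPDATE OF Status ON Members
BEGIN
    INSERT INTO AuditLog (MemberID, Note)
    VALUES (NEW.MemberID, 'status ' || OLD.Status || ' -> ' || NEW.Status);
END;
""")

conn.execute("INSERT INTO Members (MemberID) VALUES (7)")
conn.execute("UPDATE Members SET Status = 'lapsed' WHERE MemberID = 7")
rows = conn.execute("SELECT MemberID, Note FROM AuditLog").fetchall()
print(rows)  # [(7, 'status active -> lapsed')]
```

T-SQL triggers differ in syntax (inserted/deleted pseudo-tables rather than NEW/OLD rows) but follow the same audit-on-update design.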

Installed, configured and updated SQL Server.

Created SSIS packages to populate data from various data sources.

Developed and deployed SSIS packages loading data from MS Access and Excel.

Created packages using SSIS for data extraction from Flat Files, Excel Files, and OLEDB to SQL Server

Created ad-hoc, drill-down and drill-through reports using SSRS 2016.

Used SQL Server Reporting Services (SSRS) to author, manage and deliver both paper-based and interactive web-based reports.

Created multiple reports using SSRS.

Environment: Microsoft Visual studio 2015, Microsoft SQL server 2016, SQL Server Data Tools and Microsoft Access.

Client: AstraZeneca, Cambridge, UK

Project: NGW Retrofit

Project Location: Chennai, India

Role: Developer

Period: Oct 2013 – Jun 2014

AstraZeneca is a global pharmaceutical and biologics company. Headquartered in Cambridge, England, AstraZeneca has a portfolio of products for major disease areas including cancer, cardiovascular, gastrointestinal, infection, neuroscience, respiratory and inflammation.

The Next generation data warehouse (NGW) project will provide the foundation for the new commercial data warehouse by building a highly scalable and flexible data warehouse. This project aims at implementing the RCE Datasets – Alignment, Targeting, Segmentation in Next generation warehouse (NGW) as part of NGW Phase 1 project. It also provides the foundation for the new commercial data warehouse by building a highly scalable and flexible data warehouse - Staging Data store (SDS). SDS will provide the necessary breadth and depth of information to support the current commercial model and the evolving needs of the AZ Business.

Marketing data from different vendors was collected and processed to generate summary reports in MSTR.

Responsibilities

Participated in requirement gathering and on-site coordination.

Created Data model using Erwin tool.

Prepared Mapping specification document.

Developed Informatica workflows according to mapping specification.

Developed various Unix scripts, including scripts to trigger the workflows.
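
Triggering a workflow from a script typically shells out to Informatica's pmcmd utility. The sketch below builds such an invocation in Python; the service, domain, folder and workflow names are placeholders, not values from the project:

```python
import shlex

# Placeholder connection details; real values come from the environment.
SERVICE, DOMAIN, FOLDER, WORKFLOW = "IS_DEV", "Domain_Dev", "NGW", "wf_load_sds"

def pmcmd_start(service, domain, folder, workflow):
    """Build the pmcmd invocation used to kick off a workflow.

    pmcmd startworkflow is the standard Informatica CLI entry point;
    -wait blocks until the workflow finishes so the calling script can
    branch on the exit code.
    """
    return ["pmcmd", "startworkflow", "-sv", service, "-d", domain,
            "-f", folder, "-wait", workflow]

cmd = pmcmd_start(SERVICE, DOMAIN, FOLDER, WORKFLOW)
print(shlex.join(cmd))
# pmcmd startworkflow -sv IS_DEV -d Domain_Dev -f NGW -wait wf_load_sds
```

In a real wrapper the command list would be passed to subprocess.run and the exit code checked before downstream steps proceed.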

Unit testing and System testing of the Unix scripts and Informatica workflows.

Environment: Informatica, Netezza, Oracle, Erwin Data Modeler, Unix Shell Scripting and AIX Unix.

Client: AstraZeneca, Cambridge, UK

Project: NGW Aspen

Project Location: Chennai, India

Role: Developer

Period: Feb 2013 – Sep 2013

AstraZeneca is a global pharmaceutical and biologics company headquartered in Cambridge, England, with a portfolio of products for major disease areas including cancer, cardiovascular, gastrointestinal, infection, neuroscience, respiratory and inflammation.


AstraZeneca expense details were collected from different applications and processed to generate auditing data for submission to the government. The NGW Aspen project involved porting Informatica workflows and Unix scripts from an Oracle database to a Netezza database.

Responsibilities

Participated in requirement gathering and on-site coordination.

Prepared Mapping specification document.

Updated Informatica workflows according to mapping specification.

Updated Unix scripts to incorporate database query changes.

Unit testing and system testing of the Unix scripts and Informatica workflows.

Environment: Informatica, Netezza, Oracle, Unix Shell Scripting and AIX Unix.

Client: AstraZeneca, Cambridge, UK

Project: Emerald Migration

Project Location: Chennai, India

Role: Developer

Period: Apr 2012 – Jan 2013

AstraZeneca is a global pharmaceutical and biologics company headquartered in Cambridge, England, with a portfolio of products for major disease areas including cancer, cardiovascular, gastrointestinal, infection, neuroscience, respiratory and inflammation. The Emerald Migration project covered the preparation and execution of operational test scripts before and after migration, verifying that results matched to ensure the migration was done correctly.

Responsibilities

Participated in requirement gathering and on-site coordination.

Created operational test scripts for Informatica and Unix server.

Executed operational test scripts in Unix servers pre and post migration.

Verification of test results against the migration.

Environment: Informatica, Netezza, Oracle, Unix Shell Scripting and AIX Unix.

Client: Blue Cross Blue Shield of Illinois, Chicago, IL.

Project: ICD-10

Project Location: Chennai, India.

Role: Developer

Period: Sep 2011 – Mar 2012

ICD-10 project involves analysis of presence of procedure and diagnosis codes in SAS code, modification of SAS code to change the length of procedure and diagnosis code fields from 5 bytes to 7 bytes in the SAS tables and testing the SAS code after code changes.

Responsibilities

Participated in requirement gathering and on-site coordination.

Prepared design document based on the requirement document.

Updated the source code according to the design document.

Unit testing and system testing of the modified code.

Environment: SAS, MS SQL Server

Client: Walmart, Bentonville, AR.

Project: DSS Migration

Project Location: Chennai, India

Role: Developer

Period: Jun 2011 – Aug 2011

The Decision Support System migration project involved migrating code from a Unix to a Linux environment.

Responsibilities

Developed wrapper code.

Unit testing.

Migrated source code from Unix to Linux environment.

Validation of migration process by debugging the code.

Environment: C, C++, Shell Scripting, Oracle, Unix and Linux.

Client: Mastercard, Purchase, NY.

Project: MC-ADMM-GTPS-PEAT

Project Location: Chennai, India

Role: Developer

Period: Dec 2010 – Jun 2011

The MIP is a front-end communications processor placed on-site at a customer's facility or at a hub site. It provides access to the Banknet telecommunications network, which in turn provides access to all Electronic Fund Transfer (EFT) products and to a wide variety of other EFT services via gateways. MIP software supports issuing and acquiring functions, including routing transactions to issuers, acquirers and Stand-In. It communicates online financial transactions in the ISO 8583 standard and can approve online transactions by or on behalf of an issuer according to defined operations regulations. It is a UNIX system, with applications developed in C, and it uses message queues to process transactions. The PEAT team performs performance testing of MIP and ASA.

Responsibilities

Participated in requirement gathering and on-site coordination.

Design and development of PEAT tools for the MIP and ASA modules.

Unit testing of PEAT tools for the MIP and ASA modules.

Performance testing of MIP and ASA modules.

Environment: C, Shell Scripting and Solaris.

Client: Mastercard, Purchase, NY.

Project: MC-ADMM-GTPS-AUTH

Project Location: Chennai, India

Role: Developer

Period: Sep 2009 – Jun 2010

The MIP is the same front-end communications processor described under the MC-ADMM-GTPS-PEAT project above.

Responsibilities

Participated in requirement gathering and on-site coordination.

Design and development of SCRs of MIP and ASA modules.

Unit testing and System testing of SCRs of MIP and ASA modules.

Documentation of code changes in SCRs.

Environment: C, Shell Scripting and Solaris.

Client: Toshiba Tec Corporation, Tokyo, Japan

Project: TOSH_NETWORK_EBX

Project Location: Bangalore, India

Role: Developer

Period: Nov 2008 – Aug 2009

The project includes a network components comparison study between the eB3 and eBX multi-functional controllers, porting from open-source Unix code to the embedded Linux platform, and new development and unit testing of protocols such as IPsec, SNMPv3, DDNS (secure mode), AutoIP (IPv4) and IPv6 on eBX controllers. Unit testing of 19 integrated protocols (HTTP Server, IPP Server, DPWS Server, FTP Server, ESMTP Server, MIB/TRAP, SMB Client, SMB Server, SMB Print Server, LPD, 9100 Printing, AppleTalk Printing, Bonjour, DPWS Print, DPWS Print with Secure, DPWS Scan, DPWS Discovery, LLTD and LLMNR) is also in the scope of the project, which covers networking protocols for the Toshiba MFP on the embedded Linux platform.

Responsibilities

Participated in requirement gathering and on-site coordination.

Ported AutoIP code from VxWorks to the Linux platform.

SCM Lead for the project.

Designed and implemented the autoip-dhcp architecture and integrated AutoIP with DDNS.

Fixed bugs in the DHCP, NDP and DHCPv6 protocols and integrated them with the UI.

Installed and maintained the Wipro Code Checker for the team.

Prepared the DSM for autoip-dhcp, and the RS and CDD for autoip-dhcp and IPv6.

Environment: C, C++, Shell Scripting, VxWorks Tools and Embedded Linux

Client: Toshiba Tec Corporation, Tokyo, Japan

Project: TOSH_NETWORK_EB and TOSH_AGNI_SAMBA

Project Location: Bangalore, India

Role: Developer

Period: Jan 2008 – Oct 2008

Networking protocols for Toshiba multi-functional peripherals: the project involved porting networking protocols from open-source Unix to the Wind River Embedded Linux platform.

Responsibilities

Participated in requirement gathering and on-site coordination.

Worked on the DHCPv6 protocol (fixed bugs and implemented RFC 4704 and the Client FQDN option), on stateless IPv6 (fixed bugs), and on other networking protocols.

Applied fixes in DDNS (UI Integration).

Completed IPv6 Ready Logo testing and IPv6 vulnerability testing.

Documented implementation of RFC 4704.

Received the Star Above and Beyond award for technical presentations.

Environment: C, VxWorks Tools and Embedded Linux.


