
Data Analyst

Location:
Schaumburg, IL
Salary:
120000
Posted:
February 22, 2021


Anuraag Sai

Data Engineer/Data Analyst

304-***-**** adkehx@r.postjobfree.com

https://www.linkedin.com/in/asp2904/

SUMMARY

Strong experience in the IT industry in SQL development, BI development, cloud data tools, and reporting tools.

Expert knowledge of all phases of the Software Development Life Cycle (SDLC), involving systems analysis, design, development, and implementation.

Good knowledge of SSIS, ADW, SSAS, AAS, SSRS, Power BI, and the Power Platform.

Expertise in normalization and denormalization, data design methodology, and building referential integrity.

Experienced in T-SQL (DDL, DML) and in documenting, coding, modifying, and implementing Business Intelligence solutions.

Experienced in designing high-level ETL architecture and in creating, configuring, and fine-tuning workflows designed in DTS for data transfer from OLTP to OLAP systems with the help of SSIS and ADW.

Adept in the use of SSIS Control Flow items (For Loop, Execute Package/SQL tasks, Script Task, Send Mail Task) and SSIS Data Flow transformations (Aggregate, Derived Column, Data Conversion, Conditional Split, Multicast, Fuzzy Lookup, Union All, Merge Join, Lookup, Slowly Changing Dimension, and Pivot), applying ETL concepts.

Proficient in error handling and debugging in SSIS, using breakpoints and checkpoints to surface error information when package execution fails; experienced in configuring and deploying SSIS packages to different servers.

Experience in SQL Server Analysis Services (SSAS) OLAP cubes and data mining; implemented snowflake and star schemas.

Expertise in Tabular Models; proficient in scripting DAX, MDX, and M queries.

Experience in importing/exporting data from various heterogeneous sources such as MS Access, Excel, Oracle, DB2, and flat files using SSIS and ADW.

Combined knowledge and experience in designing data marts and data warehouses using star and snowflake schemas for implementing data mining.

Solid knowledge of data mining models and algorithms, including decision trees, association rules, clustering, and Naive Bayes.

Experience in RDBMS concepts, database management systems, physical and logical database design, data mapping, table normalization, and data modeling, creating ER diagrams using tools such as MS Visio and Erwin.

Excellent Debugging, Mathematical, Analytical and Logical Problem-Solving Skills.

Proficient in T-SQL Programming using DDL, DML, DCL commands for various business applications.

Extensive SQL Server development experience building tables, views, indexes, triggers, stored procedures, CTEs, and joins.

Highly proficient in the use of T-SQL for developing complex stored procedures, triggers, tables, user-defined functions, relational DB models, data integrity, and SQL joins (a minimal sketch follows at the end of this summary).

Creation and maintenance of databases, creating roles, managing user permissions, and resolving deadlocks.

Understanding business logic and implementing it using DAX and dynamic DAX; implementing data security using RLS.

Expertise in building Power BI dashboards, visualizing data in innovative styles, and implementing RLS on reports.

Expert in developing drill-through/drill-down, crosstab, and cascading parameterized reports using MS SQL Server Reporting Services (SSRS) and deploying them to dashboards, SharePoint servers, web applications, and web servers.

Experienced in generating On-demand and Scheduled reports for Business Analysis or Management decisions using MS SQL Server Reporting Services 2005/2008/2012/2014.

Monitored and Tuned MS SQL Server databases with tools like Index Tuning Wizard, SQL Profiler, and Windows Performance Monitor for optimal Performance.

Expertise in creating Cell Level Security in cubes using SSAS.

Specialized in data loss prevention and remediation of SQL injection.

General knowledge of security technical implementation guidelines regarding security of databases.

Strong knowledge of dimensional modelling concepts (star schema, snowflake schema) and Kimball/Inmon methodologies.

Hands-on experience in production/staging/test/development database administration: installation, configuration, maintenance, monitoring, backup and disaster recovery procedures, and replication.
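As a minimal illustrative sketch of the T-SQL stored-procedure, CTE, and join work described above (the procedure, table, and column names here are hypothetical, not taken from any project on this resume):

CREATE PROCEDURE dbo.usp_GetTopCustomers
    @MinTotal MONEY = 1000            -- minimum spend threshold passed by the caller
AS
BEGIN
    SET NOCOUNT ON;

    -- CTE aggregates order totals per customer
    WITH CustomerTotals AS (
        SELECT o.CustomerID, SUM(o.OrderTotal) AS TotalSpent
        FROM dbo.Orders AS o
        GROUP BY o.CustomerID
    )
    SELECT c.CustomerID, c.CustomerName, ct.TotalSpent
    FROM CustomerTotals AS ct
    INNER JOIN dbo.Customers AS c
        ON c.CustomerID = ct.CustomerID
    WHERE ct.TotalSpent >= @MinTotal
    ORDER BY ct.TotalSpent DESC;
END;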

SKILL SET

Languages: Java, HTML, CSS, .NET, C, C++, RDBMS, HTML5, CSS3, JavaScript, Pig, Hive, and MapReduce.

Scripting Languages and Platforms: SQL, SSRS, SSIS, SSAS, AAS, ADW, IoT platforms (AWS, IBM Bluemix, and Azure IoT), cloud computing, VBScript, Python, OLTP, and OLAP.

Reporting Tools: Power BI, Tableau, and QlikView.

Tools: Eclipse, Oracle Databases, SQL Server, Visual Studio, Tableau, MSBI, Intel XDK, LDAP server Manager and Raspberry OS.

Disaster Management and Risk Planning.

EXPERIENCE

Business Intelligence Lead April 2020 – Present

Bourntec Solutions, Chicago, IL

Responsibilities:

Continuously improve the quality and usability of corporate information assets.

Core member of the lead team for various Business Intelligence projects.

Champion standardization, integration, and ubiquity of BI tools and processes.

Collaborate with management, project managers and IT to evaluate required data, reporting functionality, communicate trade-offs, alternative solutions, plan out, prioritize, and lead related development efforts.

Lead cross-functional project teams to incorporate new data sources into the data warehouse, expose related attributes and metrics in Power BI and utilize BI tools to develop reports, visualizations, dashboards, and other solutions that allow data to be analysed and acted on.

Adopt existing and establish new technical design patterns, workflows & tools to build sustainable analytics solutions that provide decision support for stakeholders.

Lead projects using Lean methodologies and regularly coach other project personnel to ensure analytical solutions are delivered in compliance with BI standards.

Mentor BI staff in the technical areas used by the analytics team, both formally in structured group settings and informally in one-on-one and small group sessions.

Demonstrate the Company’s core values and comply with all Company policies and procedures.

Compose metrics/measures in pursuit of institutional KPIs.

Financial services experience with knowledge in holdings, transactions, pricing, and ratings data.

As a core member of the regional BI/DW team, manage the full life cycle of BI projects independently and deliver solutions accordingly.

Provide solution proposals to the business with a deep understanding of business needs; translate business needs and technical requirements into detailed designs.

Participate in or lead advanced analytics projects as needed.

Developed business intelligence dashboards and assisted in their integration on multiple data platforms.

Analysed systems, implemented various improvements, and performed necessary test patches and system upgrades.

Provided technical support for the business intelligence platform; diagnosed and resolved problems with it.

Developed and executed various business intelligence training programs.

Troubleshot all business intelligence jobs across multiple sources.

Assisted in developing business intelligence strategies in coordination with the data architect.

Data Analyst Jun 2018 – April 2020

Schlumberger, Houston, TX

Responsibilities:

• Designed multiple Dashboard for entire organization and multiple Ad hoc Reports.

• Designed a Power BI app for the entire organization to track company growth, implemented with a value causation tree.

• Designed Scorecard for executives and CXO teams where they can analyse the performance of the company.

• Implemented RLS on Power BI Report and managed access.

• Proficient in managing and implementing bookmarks, filters, cross-filtering, drill-down, and drill-through reports.

• Developed Power BI model used for financial reporting of P & L and Headcount.

• Designed and documented the entire architecture of the Power BI PoC.

• Expertise in writing complex DAX functions in Power BI and Power Pivot.

• Automated Power Query refresh using a PowerShell script and Windows Task Scheduler.

• Installed and configured Enterprise gateway and Personal gateway in Power BI service.

• Created Workspace and content packs for business users to view the developed reports.

• Configured automatic and scheduled refresh in the Power BI service.

• Wrote calculated columns and measure queries in Power BI Desktop to demonstrate good data analysis techniques.

• Presented weekly to business users on the reports and the changes required.

• Worked on all kinds of reports, such as yearly, quarterly, monthly, and daily.

• Worked on all types of transformations that are available in Power BI query editor.

• Created stored procedures and SQL queries to pull data into the Power Pivot model.

• Worked on DAX and implemented Multiple Data Models.

• Built bridge tables to create a many-to-many relationship to the fact table, and factless fact tables (see the sketch at the end of this section).

• Implemented dynamic DAX to manage the date dimensions.

Environment: Power BI, SQL Server 2012, SSMS, SSIS, SSRS, SSAS, Pentaho, T-SQL, Tableau, Visio, VSTS, SQL Profiler, Azure Portal, AAS, ADW, Blob.
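A minimal sketch of the bridge-table pattern mentioned in this section (all table and column names are hypothetical, not the actual project schema):

-- Bridge table resolving a many-to-many relationship between accounts and customers
CREATE TABLE dbo.BridgeAccountCustomer (
    AccountKey  INT NOT NULL,
    CustomerKey INT NOT NULL,
    CONSTRAINT PK_BridgeAccountCustomer PRIMARY KEY (AccountKey, CustomerKey)
);

-- The fact table reaches customers through the bridge
SELECT c.CustomerName, SUM(f.BalanceAmount) AS TotalBalance
FROM dbo.FactAccountBalance AS f
INNER JOIN dbo.BridgeAccountCustomer AS b ON b.AccountKey = f.AccountKey
INNER JOIN dbo.DimCustomer AS c ON c.CustomerKey = b.CustomerKey
GROUP BY c.CustomerName;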

SQL Analyst Dec 2016 – May 2018

RM Technotree, Bellevue, WA

Responsibilities:

Responsible for creating Summary reports, Sub reports, Drill Down reports, Matrix reports.

Designed and developed various SCOPE scripts against Cosmos Eco Manager base streams to create structured streams for Power BI reports.

Scheduled and maintained Cosmos jobs in casjobscheduler to publish the streams to Geneva Data Studio.

Created a PoC (proof of concept) for business requirements.

Designed Integration Workflow for multiple aspects of client integration using REST and SOAP services.

Used inbound connectors to retrieve data from sources such as disk, mail, HTTP, FTP, SFTP, and databases.

Created a data pipeline for data migration from Cosmos to Power BI by developing Big Query (unified, scaled-out query engine) scripts.

Created views in Big Query for easy access to the Cosmos streams and for creating Power BI reports against them.

Created various Power BI reports by consuming the cosmos streams based on end user requirements.

Developed stored procedures for parameterized, drill-down, and drill-through reports in SSRS (a sketch follows at the end of this section).

Formatted reports using global variables, expressions, and functions.

Created various graphical reports using DAX queries for better presentation of data in Power BI.

Created DTS packages for the ETL process to vendors, in which records were extracted from flat file and Excel sources and loaded daily to the server.

Assessed detailed specifications against design requirements.

Took on production support developer responsibilities for applications on an as-needed basis.

Knowledge of financial instruments, such as making payments to accounts after splitting them across accounts based on commission rates and bucket levels.

Environment: SQL Server 2000/2005 Enterprise Edition, SQL Enterprise Manager, Cosmos, Dell Boomi, MS PowerPoint, MS Access 2000, Windows 2003/2000, DTS, SSIS, SSRS & Power BI.
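A hedged sketch of the kind of parameterized stored procedure referenced above for SSRS drill-through reports (procedure, parameter, and table names are hypothetical):

CREATE PROCEDURE dbo.usp_RegionSalesDetail
    @RegionID INT,     -- passed from the summary report on drill-through
    @Year     INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT s.OrderDate, s.ProductName, s.Quantity, s.SalesAmount
    FROM dbo.SalesDetail AS s
    WHERE s.RegionID = @RegionID
      AND YEAR(s.OrderDate) = @Year
    ORDER BY s.OrderDate;
END;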

MSBI/Power BI Analyst Aug 2015 – Nov 2016

Total Ebiz Solutions, NJ

Responsibilities:

Played an active role during the requirements gathering, analysis, and design of the system.

Created reports using SQL Server Reporting Services (SSRS) for customized and ad-hoc queries.

Converted Data Transformation Services (DTS) applications to SQL Server Integration Services (SSIS).

Created Packages in SSIS with Error handling.

Created SSIS Packages to pull data from SQL Server to Excel Spreadsheets and vice versa.

Loaded data from various sources such as OLE DB and flat files into SQL Server databases using SSIS packages, and created data mappings to load data from source to destination.

Worked with DAX function libraries in Power Pivot for Excel as well as Power BI Desktop.

Performed reverse engineering of complex data systems for further development of automated and reusable solutions.

Designed BI applications, solutions, and reports following IDT's standards of development lifecycle, testing and validation, and security.

Hosted knowledge transfer sessions on relational database design and BI architectures with best practices.

Used Power BI for data modelling and data exploration; designed cubes, deployed them to the cloud, and delivered them to the client.

Designed tools using HTML5, Angular, and CSS to represent the data.

Assisted in Production support to help with user access and verification.

Initiated the idea of introducing telemetry for providing support in the production environment; integrated the Data Lake service and exported the data using Data Lake Factory (an NLog-based implementation). Implemented a support portal that helps the service engineering team support end users; this initiative reduced the service engineering team's effort by 75% and helps identify problems and take preventive steps automatically.

Used R scripting for data visualization, data cleaning, and data munging.

Used R libraries such as ggplot2 for histograms, bar charts, quadruple matrices, and linear graphs for further data visualization.

Created Hive external tables and worked on them using HiveQL.

Created Hive views and provided them to the business users.

Wrote Hive queries for data analysis to meet the business requirements.

Used Hive partitioning technique for optimization.

Successfully migrated data between different heterogeneous sources such as flat files, Excel, and SQL Server 2008 using SSIS, BCP, and BULK INSERT (see the sketch at the end of this section).

Extensively used SSIS Import/Export Wizard for performing the ETL operations.

Used various ETL tools such as SSIS/DTS for data flows from sources such as XML files, tables, and views to other databases or files with proper mapping.

Enhanced and deployed SSIS packages from the development server to the production server.

Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL task, Script task and Send Mail task.

Designed, developed, and deployed reports in the MS SQL environment using SSRS 2008.

Created Parameterized Reports and Linked Reports with thorough knowledge of report serving architecture (Table, Chart and Matrix Report).

Involved in creating multiple parameterized stored procedures which were used by the reports to get the data.

Published Power BI reports from Power BI Desktop to the Power BI service.

Developed reports for business end users using Report Builder, with updated statuses.

Developed different kind of reports such as Sub Reports, Charts, Matrix Reports, Linked Reports.

Developed reports using complex formulas and queried the database to generate different types of ad-hoc reports using SSRS.

Involved in writing heavy T-SQL stored procedures and complex joins.

Environment: MS SQL Server 2008/2012/2014, SQL Server Integration Services (SSIS), Azure, Power BI, SQL Server Reporting Services (SSRS), MS Visual Studio, Hive, Windows 2008 Server, DTS, BIDS, MS, HTML5, CSS, AngularJS, RESTful Services, Excel.
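A hedged sketch of the BULK INSERT flat-file load referenced above (the file path and staging table are hypothetical):

-- Load a delimited flat file into a hypothetical staging table
BULK INSERT dbo.StagingCustomers
FROM 'C:\Data\customers.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the flat file
    ROWTERMINATOR   = '\n',  -- row delimiter
    FIRSTROW        = 2      -- skip the header row
);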

SQL Developer Aug 2014 – Feb 2015

Awicon Technologies, Hyderabad, India

The reporting platform has grown organically from an ad-hoc data querying environment hitting operational systems into a custom-built, mission-critical business intelligence and reporting system responsible for supporting a number of decision support and operational functions. As new data is integrated into the system and new ways of accessing it are delivered (via additional analytics tools, etc.), demand for data access has grown significantly. To address the increase in customer demand while allowing for custom interaction, the goal of Empowered BI is to develop a self-service platform that enables users to retrieve the specific data sets they want on demand. Empowered BI thus delivers an interactive analytics experience that provides not only raw data but also the tools and infrastructure that transform data into information used to make business decisions. Specifically, EBI provides the following capabilities:

The Experience data will be co-located on the same server enabling the sharing of data across Customer, Seller, and Partner Experiences.

Business managed Work Areas enabling the Business to push Business Critical, Non-IT data to work areas that are co-located with the Experience Marts.

The Business will be enabled to pull cross Experience data together with the Business Managed data into their reports without investing in their own infrastructure.

The EBI platform leverages the DSL site as the user entry point and also uses DSL's built-in workflow engine to support the user approval process.

Once the user request is approved, the EBI controller handles the request, using the dbEBIController database as the supporting metadata store.

Finally, the EBI worker server is where customers run their applications; the IT Mart is provisioned on the worker, and customer Work Areas are created there.

Since EBI 2.5, a new VM has been introduced as an ETL server for each worker server. Users can administer packages and jobs on the ETL server with the permissions they have.

Responsibilities:

Created Worker Areas for customers.

Prepared decommission scripts for Worker Areas and metadata.

Installation, configuration and upgrading of Microsoft SQL Server software and related products.

Attended business conference call with client team to understand the changes to the system.

Created jobs for taking automatic backups of databases (a sketch follows at the end of this section).

Worked on database restores, recovery models, and database shrinks.
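A minimal sketch of the backup step such a job might run (the database name and backup path are hypothetical):

-- Hypothetical full backup step scheduled through SQL Server Agent
BACKUP DATABASE SalesDW
TO DISK = 'D:\Backups\SalesDW_Full.bak'
WITH INIT,        -- overwrite the existing backup set in the file
     COMPRESSION, -- reduce backup size where the edition supports it
     CHECKSUM;    -- validate page checksums while writing the backup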

EDUCATION

Bachelor of Technology (Computer Science) - KMIT, 2010-2014

Master of Science (Computer Science) - Marshall University, WV, 2015-2016

ACHIEVEMENTS

Certified as Machine Learning with Python 2020

o Machine Learning with Python - Level 1 - ACCLAIM LINK

Certified as Data Science Foundations 2020

o Data Science Foundations - Level 1 - ACCLAIM LINK

Certified as Python for Data Science 2020

o Python for Data Science - ACCLAIM LINK

Certified as Data Engineer Associate from Microsoft 2020

Microsoft Certified: Azure Data Engineer Associate

o License No: H558-3977. ACCLAIM LINK

Certified as Data Analyst Associate from Microsoft 2020

Microsoft Certified: Data Analyst Associate

o License No: H568-4210. ACCLAIM LINK

Journal About Stack Overflow 2016

Factors for Resolved and Unresolved Questions in Stack Overflow.

Certification in Hadoop by Cloudera (CCDH) 2015

Certification issued by Cloudera as a Hadoop Developer on July 15th, 2015. License No: 100-013-601.


