Satya Prasad Kalyan Kandregula
******.***@*****.*** / 405-***-****
Professional Summary
Expertise includes data modeling, data cleansing, data validation, data mapping identification and documentation, data extraction and load processes from multiple data sources, data verification, data analysis, transformation, integration, data import/export, and the use of multiple ETL tools.
Experienced SQL Developer with proven expertise in multiple flavors of SQL, including Microsoft T-SQL, SQL*Plus, and DB2 SQL. Hands-on with advanced SQL concepts including joins, stored procedures, triggers, functions, cursors, subqueries, packages, views, and indexes.
Hands-on experience in SQL query profiling, SQL tuning, data cleansing, and database remodeling for performance. Have written and tuned back-end PL/SQL cursors, triggers, stored procedures, and SQL.
Performed database integrity and consistency testing, functional and non-functional data testing, data quality testing, and ETL testing.
Extensively worked in building Business Intelligence, Decision Support systems and Corporate Performance Management solutions including Dashboards, Score Cards, Query & Analysis reports using MS SQL Server Analysis Services, MS SQL Server Integration Services, MS SQL Server Reporting Services.
Used Azure Data Factory for ingesting and transforming structured and semi-structured data from multiple sources into Azure Data Lake.
Involved in various projects related to Data Modeling, System/Data Analysis, Design and Development for both OLTP and Data warehousing environments.
Practical understanding of dimensional and relational data modeling concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
Extensively used SSIS Import/Export Wizard for performing ETL operations.
Used Various ETL tools like SSIS/DTS for data flow from source files like XML, Tables and Views to other databases or files with proper mapping.
Experienced in writing SQL Queries, Views, Materialized Views, PL/SQL Procedures, Functions, Packages, Triggers, Cursors, Collections, Ref Cursor, Cursor variables, System Reference Cursor, Dynamic SQL, Exceptions. Experience in developing External Tables, Joins, Indexes and sequences.
Experience with Microsoft Azure data storage, Azure Data Factory, and Azure Data Lake.
Well versed with migrating Crystal Reports to SSRS reports using Crystal Report Migration Services.
High expertise in creating SSRS reports: ad-hoc, drill-down, drill-through, crosstab, sub-reports, pivot, tabular, parameterized, and cascading parameterized reports.
Experienced in query optimization, performance and tuning using SQL Trace, Execution Plan, Indexing, Hints, Bulk Binds, Bulk Collect, Creation of global temporary tables and table partitioning.
Used Unified Modeling Language (UML) Diagrams and Data Models including Use Case Diagrams, Activity Diagrams/State Chart Diagrams, Collaboration Diagrams and Deployment Diagrams, Data Flow Diagrams (DFD), Entity-Relationship diagrams using MS Visio.
Built Visual Analytics Reports such as Drill down, Drill through, drill across, Sub Reports, Dashboards, Parameterized reports, Tables, Matrixes, Gauges, Indicators, Bar Graphs & Line Graphs using MS SQL Server Reporting Services (SSRS) 2014/2012/2008R2/2008.
Knowledge of Functional and Geographic Flowcharts, check sheets, Histograms, run charts, Control Charts, Cause and Effect diagrams, Interrelationship diagrams, Pareto Charts, Scatter diagrams, Affinity diagrams.
Migration (design, implementation, deployment) of hybrid deployments to Exchange Online.
Familiar with Software Development Life Cycle (SDLC) including requirements and programming.
Experience in creating complex SSIS packages using proper control and data flow elements.
Experience working with version control systems such as Subversion and Git, and with source-code management tools GitHub, GitLab, and Bitbucket, including their command-line applications.
Rich experience building ad-hoc reports with Report Builder for business administrators.
Developed report schedules at predefined intervals, such as daily, on a given date, weekly, monthly, and quarterly.
Experience defining report specifications and building the SSRS environment for both development and test servers per client needs; wrote SQL queries using Query Designer to securely retrieve report data from filtered data sources.
Experience working with Team Foundation Server and Visual SourceSafe (TFS & VSS) and Azure DevOps.
Ability to work independently as well as with teams of varying backgrounds on complex issues; strong verbal and written communication skills.
Optimized data workflows in Azure Synapse Analytics, improving reporting performance by 40%.
Results-driven Quality Assurance professional with solid knowledge in manual testing and extensive experience in software development methodologies including both Agile (Scrum, Kanban, XP) and Waterfall models.
Experience with Windows Server 2003, 2008, and 2012 (hosted and Azure cloud).
Thorough knowledge of Ralph Kimball’s Data Modeling concepts including Dimensional Modeling, Star and Snowflake schema, Slowly Changing Dimensions (Type1, Type2, and Type3) and Surrogate keys.
Tools and Technology
OLAP & BI Tools
MS-SQL Server (2014/2012/2008R2/2008), MS SQL Server Reporting Services, MS SQL Server Analysis Services, SharePoint 2013/2010, MS Excel, PowerPivot, Power View, PerformancePoint.
ETL Tools
Microsoft SSIS, SQL scripting, Azure SQL Data Warehouse, Azure Data Factory (ADF), Informatica, Snowflake.
Reporting Tools
Microsoft SSRS (2012), Tableau, Crystal Reports, Power BI.
Data Process, Modeling, DB Testing tools
Erwin 4.0, JIRA, MS-Team Foundation Server.
Databases
MS SQL Server (2016/2012/2008R2/2008), MS Access 10/07, MySQL, DB2, Azure Data Lake, Azure Synapse, Oracle PL/SQL.
Languages
T-SQL, UDB SQL, SQL*Plus, PL/SQL, Visual Studio .NET, HTML, XML.
Development Tools
TOAD 8.5, SQL, SQL*Plus, SQL Analyzer, SQL*Loader, Query Analyzer, SSMS
Workflow Tools
MS-Project, MS-Visio, MS-Excel, MS-Word, MS-PowerPoint.
Operating Systems
Windows XP, Vista, 7, 8, 10, Red Hat Enterprise Linux
Methodologies
Ralph Kimball’s data modeling, Waterfall, Agile, Scrum, Iterative
Business Process and Case Tools
IBM Rational’s RequisitePro 2001, Borland’s CaliberRM 5.1, DB Visualizer.
EDUCATIONAL QUALIFICATION
Master’s in Computer Science, Oklahoma City University Jan 2015 – July 2016
Bachelor’s in Computer Science, Annamacharya Institute of Technology and Sciences June 2009 – May 2013
PROFESSIONAL EXPERIENCE
Lineage Logistics - Pittsburgh, PA Jan 2023 – Present
LinOS Connect, Data Engineer
Roles and Responsibilities:
Designed and maintained database schemas for facility management; generated database SQL scripts and deployed databases, including installation, configuration, and schema creation.
Developed optimized SQL queries for data retrieval, manipulation, and reporting.
Developed and optimized ETL workflows using Informatica PowerCenter to extract data from diverse sources and load into Oracle databases for reporting and analytics.
Created complex Informatica mappings, transformations, and sessions to process large volumes of transactional data from Oracle-based ERP systems.
Designed and developed scalable ETL/ELT pipelines using Snowflake for high-performance data transformation and warehousing.
Developed reusable error-handling frameworks within Informatica for robust WMS job monitoring and recovery.
Created parameterized workflows with custom logging to track job performance and failures in warehouse integration pipelines.
Created DAX-based KPIs to monitor dock-to-stock time, order cycle time, inventory turnover, and outbound SLA compliance across multiple facilities.
Built time intelligence DAX calculations (YTD, MTD, previous day/week comparisons) to track warehouse operations trends over time.
Developed custom DAX logic to calculate pallet utilization, pick rates, and shipment accuracy with dynamic filters for facility, client, and carrier.
Migrated on-premise WMS ETL jobs to Informatica IICS (Intelligent Cloud Services), enabling scalable cloud-based data processing.
Integrated cloud-based WMS (HighJump) with enterprise reporting systems using Informatica Cloud.
Integrated Snowflake with ETL orchestration tools such as Airflow, Azure Data Factory, and Fivetran for workflow management and automation.
Leveraged SnowSQL and Python scripts to automate deployments and data operations across environments (Dev/Test/Prod).
Worked on incremental load and incremental refresh from the OLTP system to the OLAP data source via SSIS for reporting purposes.
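A minimal sketch of the watermark-based incremental pattern behind such loads (all object names here are hypothetical, for illustration only):

DECLARE @LastLoad DATETIME2;
SELECT @LastLoad = LastLoadDate FROM etl.Watermark WHERE TableName = 'Orders';

-- Pull only rows changed since the previous run
INSERT INTO stg.Orders (OrderID, CustomerID, Amount, LastModified)
SELECT OrderID, CustomerID, Amount, LastModified
FROM oltp.Orders
WHERE LastModified > @LastLoad;

-- Advance the watermark for the next run
UPDATE etl.Watermark SET LastLoadDate = SYSUTCDATETIME() WHERE TableName = 'Orders';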
Experience with the Copy activity and custom Azure Data Factory pipeline activities for in-cloud ETL handling.
Strong data-analysis experience designing Crystal Reports; expert in writing complex summary-level and group-level formulas to implement business logic.
Integrated Azure Data Factory with Azure Synapse for orchestrated ETL processing, ensuring seamless data flow from source to warehouse.
Created business intelligence reports and dashboards using Microsoft BI tools and SSRS.
Developed and maintained complex stored procedures for warehouse management systems.
Created SSIS packages to extract data from OLTP to OLAP systems and scheduled jobs to call the packages.
Implemented Event Handlers and Error Handling in SSIS packages and notified process results to various user communities.
Developed DAX-powered drill-through reports to analyze errors in receiving, put away, picking, and shipping transactions by shift and user.
Used DAX Studio to troubleshoot and optimize slow-performing reports dealing with millions of warehouse transaction records.
Designed SSIS packages to import data from multiple sources and to control the upstream and downstream flow of data into the Azure SQL database.
Involved in migration efforts from Visual FoxPro to a modern SQL Server architecture, improving system efficiency.
Designed and implemented data migration strategies and error handling mechanisms.
Enhanced warehouse management system by implementing RF scanning for real-time inventory tracking.
Developed RESTful APIs for enterprise-level warehouse management system (WMS).
Built transaction-aware services with proper error handling and validation.
Automated data movement from on-prem SQL Server to Azure Synapse using Integration Runtimes.
Implemented RF (Radio Frequency) scanning functionality for real-time inventory tracking.
Worked on API interfaces for real-time data exchange between warehouse and transportation systems.
Implemented caching strategies and query optimizations, reducing response times significantly.
Troubleshot and resolved database-related issues and performance bottlenecks.
Created JIRA stories for each sprint and contributed to refining backlog stories, tasks for sprint planning.
Environment: SQL Server Integration Services (SSIS) 15.0, Informatica PowerCenter, Oracle, SQL Server Reporting Services (SSRS), Integration Services Catalog (ETL hosting), SQL Server Agent (ETL scheduling), Azure Data Factory, Crystal Reports XI, Visual Studio Code, Visual Studio 2019, Postman, Power BI, DAX, Azure SQL Database, Azure Synapse Analytics.
First National Bank - Pittsburgh, PA April 2019 – Dec 2022
Credit Card / Insurance Data Engineer
Roles and Responsibilities:
Responsibilities included creating and maintaining database objects such as complex stored procedures, triggers, indexes, functions, views, tables, and SQL joins.
ETL work included bulk loading of data from transactional tables to master tables of ODS.
Created workflows for daily, weekly, and monthly SSIS packages and scheduled the jobs per the requirements.
Strong understanding of Data Warehousing using Fact Tables, Dimension Tables, star schema and snowflake schema modelling, slowly changing dimensions, foreign key concepts and referential integrity.
Created SSIS package to load data from XML Files to SQL Server 2016 by using Lookup, Fuzzy Lookup, Derived Columns, Condition Split, Term Extraction, Aggregate, Pivot Transformation, and Slowly Changing Dimension.
Used Joins, correlated and non-correlated sub-queries for complex business queries involving multiple tables & calculations from different databases.
Created and used complex stored procedures to generate SSRS reports.
Developed and designed various types of reports such as drill-down, drill-through, parameterized, and cascading parameterized reports using SSRS.
Implemented Snowflake’s native features like Streams, Tasks, and Stored Procedures to automate incremental data loading and transformation workflows.
Utilized Snowpipe for continuous ingestion of semi-structured and structured data (e.g., JSON, CSV, Parquet) from cloud storage (e.g., Azure Blob, AWS S3).
Created and optimized materialized views, clustering keys, and partitioning strategies to improve query performance and cost-efficiency.
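A brief Snowflake sketch combining these features (warehouse, stage, table, and column names are hypothetical):

-- Stream captures changed rows on the raw table
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders;

-- Task periodically moves only the changed rows downstream
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS INSERT INTO dw.orders (order_id, order_date, amount)
   SELECT order_id, order_date, amount FROM raw_orders_stream;
ALTER TASK load_orders_task RESUME;

-- Snowpipe continuously ingests files landing in cloud storage
CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE
AS COPY INTO raw.orders FROM @raw.orders_stage FILE_FORMAT = (TYPE = 'JSON');

-- Clustering key to improve pruning on the large target table
ALTER TABLE dw.orders CLUSTER BY (order_date);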
Experience in Data Vault implementation as a data modeler.
Created a dimensional model for the reporting system by identifying required dimensions and facts using Erwin r7.1.
Maintained security by generating GUIDs in place of credit card numbers; responsible for the credit card system.
Used the SHA2_256 algorithm to hash credit card numbers along with Social Security Numbers (SSN).
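An illustrative one-way hashing statement for this step (table, column, and pepper are hypothetical; SHA2_256 via HASHBYTES is a hash, so the raw value cannot be recovered from it):

DECLARE @Pepper VARCHAR(64) = '<secret kept outside the database>';  -- hypothetical
UPDATE dbo.CardAccounts
SET CardNumberHash = HASHBYTES('SHA2_256', CONCAT(CardNumber, @Pepper)),
    CardNumber     = NULL   -- raw card number is not retained after hashing
WHERE CardNumberHash IS NULL;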
Used the Data Vault 2.0 pattern to load data into Hubs and Satellites along with Links.
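A minimal hub-load sketch in the Data Vault 2.0 style (schema and column names are hypothetical):

INSERT INTO dv.HubCustomer (CustomerHashKey, CustomerBK, LoadDate, RecordSource)
SELECT HASHBYTES('SHA2_256', UPPER(LTRIM(RTRIM(s.CustomerBK)))),  -- hash of the business key
       s.CustomerBK, SYSUTCDATETIME(), 'CRM'
FROM stg.Customer AS s
WHERE NOT EXISTS (SELECT 1 FROM dv.HubCustomer AS h
                  WHERE h.CustomerBK = s.CustomerBK);   -- insert unseen keys only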
Designed a custom tool to load data into the EDM engine.
Scheduled jobs, alerts, and subscriptions using SQL Server Agent in SQL Server Management Studio.
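The same scheduling can also be scripted against msdb; a hypothetical example (job, step, and procedure names are illustrative):

EXEC msdb.dbo.sp_add_job      @job_name = N'Nightly_ODS_Load';
EXEC msdb.dbo.sp_add_jobstep  @job_name = N'Nightly_ODS_Load',
     @step_name = N'Run ETL proc', @subsystem = N'TSQL',
     @database_name = N'ODS', @command = N'EXEC etl.usp_LoadODS;';
EXEC msdb.dbo.sp_add_schedule @schedule_name = N'Daily_2AM',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 020000;  -- daily at 02:00
EXEC msdb.dbo.sp_attach_schedule @job_name = N'Nightly_ODS_Load',
     @schedule_name = N'Daily_2AM';
EXEC msdb.dbo.sp_add_jobserver   @job_name = N'Nightly_ODS_Load';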
Identified, redesigned, and re-architected multi-tier distributed systems to produce reusable components.
Created many Data Vault objects, i.e., Hubs and Satellites, and loaded them through stored procedures.
Optimized the performance of Oracle SQL queries by modifying queries, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, and creating indexes.
Involved in sprint planning, task estimation for user stories, daily scrum meetings, end-of-sprint demos, and sprint retrospectives.
Helped define organizational security, applying national and international IT policy to enhance the level of security.
Used Erwin for reverse engineering to connect to the existing database and ODS, create a graphical representation in the form of entity-relationship diagrams, and elicit more information.
Worked in an Agile methodology with Microsoft Team Foundation Server (TFS) as the central server product where software team members connect and collaborate efficiently on a software project.
Establish and maintain contacts and networks for disaster recovery resources and support systems.
Leverage IT Technologies to manage risk and enforce security measures.
Managed and maintained application databases using T-SQL scripts and created various SSRS reports such as drill-down, drill-through, sub-reports, and dashboards.
Used SSIS extensively for loading data into different databases from multiple sources.
Used Git for source control and Azure DevOps (ADO) for code reviews and for adding new objects to the solution project before moving into production, to avoid conflicts with any other existing process.
Enhanced and optimized SQL query speed by creating indexes and stored procedures.
Created SSRS reports for the daily error logs on the servers.
Created a Power BI dashboard for the daily CPU usage and log usage of the server.
Environment: SQL Server Integration Services (SSIS) 15.0, SQL Server Reporting Services (SSRS), Integration Services Catalog (ETL hosting), SQL Server Agent (ETL scheduling), .NET Framework 4.0 (scripting component), OLE DB Provider, WinRAR 5.6, SQL Server 2016 (ENR), SQL Server 2016, Team Foundation Server 2015, DB Visualizer 10.0.
Beacon Health Options (K-FORCE) - Miami, FL June 2018 – Feb 2019
Enrollment, SQL Integration Developer
Roles and Responsibilities:
Ensure that new database code meets company standards for readability, reliability, and performance.
Design indexes for existing applications, choosing when to add or remove indexes.
Apply and follow Agile (Scrum) and Waterfall approaches to guarantee time efficiency and avoid issues.
Suggest and implement best practices for database design, SQL Server configuration, and scripting automation to support UAT, SIT, and development SQL environments.
Collaborate with the team and take the lead in different initiatives to ensure on-time, high-quality product delivery.
Collaborate with other Developers, Testers, Scrum Master, PM, PO, BSA/TSAs in design, development and testing.
Troubleshoot and problem-solve SQL Server development issues.
Develop SSIS packages, T-SQL, and stored procedures; design and develop application layers using design patterns and best practices.
Collaborate on bug fixes related to code developed by any team member.
Extracted data from core systems using an SSIS package that handles the load of historical data from the core systems to Edifecs.
Produced parameter-driven matrix sub-reports and integrated report hyperlink functionality to access external applications.
Created SSRS report server projects and published dashboard, tabular, and matrix reports with aggregate functions and expressions to the report server.
The Enrollment (ENR) process begins when Beacon receives an eligibility file from a TP or client through the SFTP server. After the file name has been assigned, files are enriched, validated, compared, and then converted to iENR. The iENR is routed to Edifecs ENR and delivered to Beacon through the Add, Change, Term (ACT) REST service.
Worked with configuring checkpoints, package logging, error logging and event handling to redirect error rows and fix the errors in SSIS.
Responsible for creating row-level security in Power BI and integrating with the Power BI service portal.
Loaded data into destination tables by both full load (truncate and reload) and incremental load, using joins on multiple columns within the stored procedures.
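The incremental path can be expressed as a multi-column MERGE; a hypothetical sketch (the full-load path would instead TRUNCATE the target and reinsert):

MERGE dbo.MemberTarget AS t
USING stg.Member AS s
   ON t.MemberID = s.MemberID AND t.PlanCode = s.PlanCode   -- join on multiple columns
WHEN MATCHED AND t.RowHash <> s.RowHash THEN
  UPDATE SET t.FirstName = s.FirstName, t.LastName = s.LastName, t.RowHash = s.RowHash
WHEN NOT MATCHED BY TARGET THEN
  INSERT (MemberID, PlanCode, FirstName, LastName, RowHash)
  VALUES (s.MemberID, s.PlanCode, s.FirstName, s.LastName, s.RowHash);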
Developed SSRS reports with features such as crosstab and parameterized reports per client requirements.
Designed and developed Power BI graphical and visualization solutions with business requirement documents and plans for creating interactive dashboards.
Published dashboard reports to online Power BI cloud service for the use of report consumers.
Trained users to build reports out of SSAS Multi-dimensional using Power BI desktop.
Strong skills in writing stored procedures, user-defined functions, views, triggers, and CTEs (Common Table Expressions).
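A small illustrative CTE (table and column names hypothetical): rank each member's claims and keep only the latest one.

WITH RankedClaims AS (
    SELECT ClaimID, MemberID, ClaimDate,
           ROW_NUMBER() OVER (PARTITION BY MemberID ORDER BY ClaimDate DESC) AS rn
    FROM dbo.Claims
)
SELECT ClaimID, MemberID, ClaimDate
FROM RankedClaims
WHERE rn = 1;   -- most recent claim per member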
Created jobs using SQL Server Agent and scheduled them to automate the ETL process and deliver automated reports on a daily, weekly, and monthly basis.
Used TFS (Team Foundation Server) for source control, code reviews, and for adding new objects to the solution project before moving into production, to avoid conflicts with any other existing process.
Monitored the jobs for the new process during the warranty period and resolved the issues causing job failures.
Environment: SQL Server Integration Services (SSIS) 14.0, Integration Services Catalog (ETL hosting), SQL Server Agent (ETL scheduling), .NET Framework 4.0 (scripting component), IBM DB2 for i IBMDASQL OLE DB Provider (CAS connectivity), WinRAR 5.6, SQL Server 2016 (ENR), Power BI, SQL Server 2014 (Flex Care), IBM DB2 for i V7R2 (CAS), Team Foundation Server 2015, DB Visualizer 10.0.
Blue Cross Blue Shield - Phoenix, AZ Jan 2018 – June 2018
Database Migration, BI Developer
Roles and Responsibilities:
Upgraded the SQL Server 2005 production, test, and dev environments to SQL Server 2016.
Worked on gathering requirements from the business and analyzing them with the offshore team. The application is a 3-tier architecture built using Microsoft Visual Studio 2015 with .NET Framework 4.0.
Deployed the applications to multiple environments, from development through quality assurance, staging, and production, and conducted unit and integration testing.
Developed stored procedures, triggers, and views, and added/changed tables for data load, transformation, and extraction.
Maintained high degree of competency across the Microsoft Application Platform focusing on .NET Framework, WCF, Windows Azure, and SQL Azure.
Worked with client/server tools such as SQL Server Enterprise Manager and Query Analyzer to administer SQL Server.
Extensively worked with SSIS tool suite, designed and created mappings using various SSIS transformations like OLEDB Command, Conditional Split, Lookup, Aggregator, Multicast and Derived Column.
Built a Data Sync job on Windows Azure to synchronize data from SQL Server 2016 databases to SQL Azure.
Worked on various Azure services such as Compute (Web Roles, Worker Roles), Azure Websites, Caching, SQL Azure, NoSQL, Storage, Network services, Azure Active Directory, API Management, Scheduling, Auto Scaling, and PowerShell automation.
Created business intelligence semantic models (tabular, multidimensional, and PowerPivot) interfacing with Excel datasheets.
Worked with calculated members, named sets, write-enabled dimensions, and write-back cubes using SSAS. Developed KPI-based Excel reports; created Pivot Tables and Pivot Charts, and published reports from PowerPivot.
Used Agile and Kanban frameworks to manage the product development workflow.
Installed Reporting Services on the production server and deployed reports from the development machine to production machines. Integrated report server and integration server solutions into the main server using Team Foundation Server (TFS).
Responsible for data load reviews, analysis, and verification of ETL logic design for the data warehouse and data marts in star-schema methodology with conformed dimensions and fact tables. Reviewed and verified that ETL packages were built in the desired way to refresh the data warehouse.
Extensively used various tasks such as For Each loop container, Bulk Insert Task, Execute SQL Task, File system Task, FTP Task, Send Mail Task, and Script Task.
Created SSIS packages for extracting data from sources such as SQL Server, MySQL, JSON, flat files, and Excel, and loaded the data into destination tables in SQL Server.
Used variables and parameters in packages for dynamically driven data extraction and loading; used complex SSIS expressions; created various SSIS configuration settings including environment variables and SQL Server and XML configuration files; used event handlers to handle errors.
Designed and Created ETL Packages with different Transformations for loading the data from Heterogeneous sources into target data by performing various kinds of tasks and transformations such as Execute Package Task, Execute SQL Task, Derived Column, Fuzzy Lookup, Conditional Split, Lookup, Multicast, Merge, Aggregate, Pivot, Sort.
Used SQL Server Integration Services, updated newest data into data warehouse by using Slowly Changing Dimensions such as Type1, Type2 and Type3 transformations.
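The Type 2 logic the SSIS transformation generates is equivalent to expiring the current row and inserting a fresh version; a hypothetical T-SQL sketch (table and column names are illustrative):

-- Expire current rows whose tracked attributes changed
UPDATE d SET d.IsCurrent = 0, d.EndDate = SYSUTCDATETIME()
FROM dim.Customer AS d
JOIN stg.Customer AS s ON s.CustomerBK = d.CustomerBK
WHERE d.IsCurrent = 1 AND (d.City <> s.City OR d.Segment <> s.Segment);

-- Insert a current version for new keys and for the rows just expired
INSERT INTO dim.Customer (CustomerBK, City, Segment, StartDate, EndDate, IsCurrent)
SELECT s.CustomerBK, s.City, s.Segment, SYSUTCDATETIME(), NULL, 1
FROM stg.Customer AS s
WHERE NOT EXISTS (SELECT 1 FROM dim.Customer AS d
                  WHERE d.CustomerBK = s.CustomerBK AND d.IsCurrent = 1);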
The amount of work in progress in any category or lane is limited and monitored using Kanban cards placed on a Kanban board, which serve as the visual control and signaling system.
Environment: MS-SQL Server 2016/2005, Azure SQL Data Warehouse, MS-SQL Server Reporting Services, MS-SQL Server Integration Services, MS-SQL Server Analysis Services, data visualization dashboards, PowerPivot, SharePoint 2010, T-SQL, MDX, XML, SQL*Plus, Windows 2008/2012 Server, MS Access, MS Excel 2008/2010, MS Azure SQL, Tableau 10.x
Techyon Systems - Dallas, TX Jul 2017 – Dec 2017
SQL/BI Developer
Responsibilities:
Helped identify the data required to calculate economic growth for different US states, selecting industrial, commercial, and transportation energy consumption data to fit the model and performing data analysis. Developed conceptual data models as the precursor to LDMs; developed physical data models (PDMs) to design the internal schema of a database, depicting the data tables, their columns, and the relationships between the tables.
Used features such as grouping, sorting, and report parameters to build sophisticated reports. Built ad-hoc reports and added drill-through functionality using Report Builder, and created tabular, matrix, and graphical reports from SQL/OLAP data sources using SQL Server 2005/2008 Reporting Services.
Transformed the required data from geographically separate databases to the central data warehouse using MS SQL Server 2008; transformation packages were built and configured to run on a daily basis.
Implemented run-time debugging, set breakpoints, watched variables, and monitored the whole process with multiple connected data viewers.
Mentored Informatica developers on the project in development, implementation, performance tuning of mappings, and code reviews.
Experience with Informatica B2B Data Exchange using unstructured and structured data sets.
Used unstructured data such as PDF files, spreadsheets, Word documents, legacy formats, and print streams to produce normalized data with Informatica B2B Data Exchange.
Created relationships, actions, data blending, filters, parameters, hierarchies, calculated fields, sorting, groupings, and live and in-memory connections in both Tableau and Excel.
Created customized reports using various chart types such as text tables, bar, pie, tree maps, heat maps, line charts, pivot tables, and combination charts in Excel and Tableau.
Worked on motion charts, bubble charts, and drill-down analysis using Tableau Desktop for Amazon.
Created ad-hoc and dashboard reports using Business Objects Web Intelligence/Rich Client/InfoView, Desktop Intelligence, Crystal Reports, and Dashboards.
Developed custom functions, manual running totals, and complex formulas for reports; used calculations, sorts, filters, and sections to enhance data presentation in reports.
Created many cascading parameters in reports using SSRS 2012/2008. Developed many tabular reports, matrix reports, drill-down reports, and charts using SQL Server Reporting Services (SSRS 2008).
Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.
Created an Emissions Forecast dashboard for multiple plants/products across various geographies. Generated a digital dashboard to help executives see performance data in real time and adjust to meet EPA goals. Applied filters to data source views to specify varied subsets of the selected data, and built packages and package templates.
Created a custom database role for the ETL package and enabled the role to access and run the packages; further secured the packages by using encryption to protect data within the package.
Cleaned data using Script component, Fuzzy grouping, Fuzzy Lookup in dataflow task during the ETL process.
Worked extensively with transformations such as aggregate, cache transformation, conditional split, merge join.
Calculated and reported implied volatilities of calls and puts according to Black-Scholes-Merton using VBA and Excel Solver.
Used Power BI to create drill-through and parameterized reports to identify the locations from which the majority of claims for a selected policy in a year were made, studying whether there is a pattern within the claims for fraud detection.
Gained insights into the employees most involved in signing new policyholders and approving claims, building reports over time and location.
Created complex reports with advanced formatting using Report Designer and simple ad-hoc reports using Report Builder, enabling others to create simple formatted reports.
Built highly formatted dashboards including crosstab reports, parameterized reports, drill-through reports, sub-reports, and bar, line, and scatter charts.
Optimized aggregations in cubes using the available storage models and created virtual dimensions using SQL Server Analysis Manager.
Used Workflow Manager for creating, validating, testing, and running sequential and concurrent worklets and sessions, and scheduled them to run at specified times before creating jobs.
During Implementation phase, tuned Informatica Mappings for optimum performance. Responsible for the daily loads and handling the reject data.
Performed extensive debugging and performance tuning of mappings, sessions and workflows including Partitioning, memory tuning and cache management.
Wrote PL/SQL subprograms, i.e., procedures and functions, for value calculations.
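A minimal PL/SQL function of the kind described (name and rate are hypothetical):

CREATE OR REPLACE FUNCTION fn_net_value (
    p_gross    IN NUMBER,
    p_tax_rate IN NUMBER DEFAULT 0.08   -- hypothetical default rate
) RETURN NUMBER IS
BEGIN
    -- net value after tax, rounded to cents
    RETURN ROUND(p_gross * (1 - p_tax_rate), 2);
END fn_net_value;
/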
Environment: T-SQL, ER/Studio, Microsoft SQL Server 2012/2008, SSIS, SSRS, SSAS, Dundas Charts, Informatica PowerCenter 8.6, MS Excel, MS Access, DB2, Tableau 9.x, Windows Vista, TOAD, Oracle 9i/8i, Microsoft Visio, Informatica, IBM InfoSphere DataStage.
IDEP SOLUTIONS – Plano, TX Sep 2016 – Jun 2017
SQL Developer
Responsibilities:
Designed and developed an operational data store (ODS) and developed an energy consumption data mart for quicker reporting. Integrated various data sources with multiple relational databases such as MS SQL Server, Oracle, MySQL, MS Access, and Sybase, and integrated data from flat files.
Created mappings for