
SQL Server / Databricks

Location:
Frisco, TX
Posted:
January 24, 2024


Varaprasad Moturi | Personal Email: ad22j3@r.postjobfree.com

T-Mobile/MetroPCS Cell: 214-***-****; US Citizen

www.linkedin.com/in/varaprasad-moturi-99b7757

Goals:

Grow in my profession to lead a team with greater responsibility and contribute to the company's success.

PROFESSIONAL SUMMARY

Over 18 years of IT experience working on business-critical, major projects with tight deadlines, completing them ahead of schedule. Contributed to operational excellence across organizations and managed and led projects of all sizes. Effective both at leading teams to deliver desired results and at working independently with speed and accuracy. Received several awards and recognitions for successful implementation of critical projects. A strong mentor and a quick learner.

| Skill | Experience | Last Used |
| --- | --- | --- |
| Total IT Experience in USA | 13+ Years | Present |
| Databricks, Azure Databricks, ADLS | 5 Years | Present |
| AWS cloud services (S3, EC2, EMR, DynamoDB, Dataflow, CloudWatch, IAM) | 1 Year | Present |
| Databases: SQL Server (2000, 2005, 2008 R2, 2012, 2014); Oracle (8i, 11i, 12c); Teradata, Netezza, Snowflake | 13 Years | Present |
| Reporting Tools: Power BI, Power BI Report Builder, Business Objects 4.2, Crystal Reports, SSRS (7 yrs), QlikView, Cognos, Tableau, Denodo | 15 Years | Present |
| ETL: MS SQL Server 2000 DTS, SSIS, Informatica 8.1, Teradata, Databricks, Azure Databricks, Data Lake, Snowflake, Hadoop | 10 Years | Present |
| Snowflake utilities: SnowSQL, Snowpark, Snowpipe | 5 Years | Present |
| Data Modeling: Visio Pro, Erwin, PowerDesigner | 8 Years | Present |
| SQL, SQL Server | 15 Years | Present |
| Python, Spark, Scala, PySpark, M | 1 Year | Present |
| Managed offshore teams across time zones (India, Australia, Japan, Canada) | 10 Years | Present |
| DevOps Certified | 5 Years | Present |
| Control-M, GitHub, GitLab | 10 Years | Present |

Expert in analyzing, designing, developing and deploying MS SQL Server suite of products with Business Intelligence in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).

Expertise in writing T-SQL Queries, sub-queries and complex joins, Stored Procedures, CTE, temp tables, User-defined Functions, Views, indexes.

Proficient in creating and configuring workflows designed in MS SQL Server Integration Services (SSIS).

Experience in importing/exporting data between different sources like Netezza/Teradata/Oracle/SQL Server/Flat file/Access/Excel etc. using SSIS/DTS utility.

Worked on SSIS performance tuning and database performance tuning.

Defined data warehouse schemas (star, snowflake), fact tables, dimensions, and measures using the Erwin tool.

Strong experience in creating ad hoc, parameterized, linked, drill-down, crosstab, conditional, summary, and sub-reports using SSRS from ETL loads and various heterogeneous data sources.

Ingested data into Snowflake on Azure and built out a data governance model on top of it.

Over 5 years of experience bringing data into Azure Data Factory, ADLS Gen2, Databricks, Azure Databricks, Data Lake, Azure Key Vault, and/or Azure Log Analytics.

Built end-to-end data transformations in a single pipeline instead of separate ETL tasks.

Used Python and PySpark to load data from source to target in Azure Databricks.

Led the design and development of innovative Power BI solutions (Power BI Desktop, Report Builder, and Power Automate) for complex business intelligence challenges, implemented for the T-Mobile for Business team.

Worked with data visualization specialists to facilitate the technical design of complex data sourcing, transformation, and aggregation logic, ensuring business analytics requirements were met.

Experience with Business Objects, Power BI, and QlikView front-end reporting tools, including the SAP Business Objects Suite (WEBI, Information Design Tool, Central Management Console).

Knowledge of and exposure to Azure SQL Data Warehouse, Databricks, and Azure Databricks, as well as other cloud or on-premises MPP data warehousing systems (SAP HANA, Teradata, Snowflake).

Over 2 years of experience bringing data into Azure Data Factory, ADLS Gen2, Databricks, Azure Databricks, Data Lake, Azure Key Vault, and/or Azure Log Analytics.

Expertise in designing and implementing IT solution delivery and support for diverse solutions and technical platforms, including 10+ years working as a Data Engineer.

Hands-on experience designing, developing, and maintaining highly scalable ETL applications on Hadoop.

Experience with code versioning tools such as GitHub, Jenkins, etc., and a command of configuration management concepts and tools.

Led a team of 3+ onshore and offshore developers, providing leadership and mentoring and ensuring the team delivers high-quality software.

Experience with the medallion architecture data design pattern, used to logically organize data in a lakehouse with the goal of incrementally and progressively improving the structure and quality of data as it flows through each layer of the architecture (from Bronze to Silver to Gold layer tables).

Over 3 years of experience migrating/developing data solutions in the AWS cloud.

Experience in Teradata ETL and Teradata utilities (BTEQ, MLOAD, FLOAD, FASTEXPORT).

Mounted DBFS to ingest data from a storage account and performed data enrichment and transformation. Bulk-loaded data into an OLAP database using the PolyBase approach.

Responsible for statistical analysis, identification of data issues, and validation of data integrity.

Developed Databricks notebooks to ingest streaming data from Event Hub, perform data transformations, and store processed data in ADLS.

Created several Databricks Spark jobs with PySpark to perform table-to-table operations. Performed data flow transformations using the data flow activity. Implemented Azure and self-hosted integration runtimes in ADF.

Designed and Developed event driven architectures using blob triggers and Azure Data Factory. Creating pipelines, data flows and complex data transformations and manipulations using ADF and PySpark with Databricks for ingesting data from different source systems like relational and non-relational to meet business functional requirements.

Worked on Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Azure Data Factory, and MDM to improve and speed up delivery of our data products and services.

Automated pipelines using Azure Data Factory, Azure Databricks, Azure SQL, and Python.

Expert in data manipulation: mastery of Power Query and the M language for sophisticated data transformations.

Designed and developed CDF pipelines extensively for ingesting data from relational and non-relational source systems to meet business functional requirements.

Experience with Git and Continuous Integration/Continuous Delivery (CI/CD) practices.

Understanding of data governance, data security, and data quality.

Good understanding of storing and reading data from cloud storage such as Azure Blob and Data Lake.

Good understanding of SQL concepts such as DML and DDL operations, functions, stored procedures, and joins.

Performed Spark job optimization using various optimization techniques.

Experience working on data processing tools and frameworks like Hadoop/Hive, Apache Spark, AWS Glue etc.

Enabled logging in Databricks using custom Python logging as well as Log Analytics.

Hands-on experience processing high volumes of data from different data sources and file formats such as Parquet, CSV, JSON, TXT, XML, Sequence File, cloud databases, MQ, and relational databases such as SQL Server, Oracle, Teradata, and Snowflake.

Designed, developed, and maintained the systems and infrastructure (i.e., data pipelines) necessary for the collection, storage, processing, and analysis of large volumes of data.

Experience with data governance processes (lineage, profiling, classification, and regulatory compliance).

Working knowledge of data ingestion ETL/ELT tool (Informatica) and AWS cloud services (S3, EC2, EMR).

Experience with Agile methodologies and tools such as Jira, qTest, and Confluence.

Over 15 years of experience in the analysis, design, development, implementation, and administration of the SAP Business Intelligence suite, which includes SAP Business Objects/BI, Crystal Reports, Xcelsius, Power BI, and Tableau.

Experience in developing data models, ingestion processes, and data extracts to support data consumption tools (MicroStrategy, Tableau, Business Objects, Power BI, and QlikView).

Expert at analyzing existing reporting solutions, mapping data sets back to their sources, and prototyping rich visuals to enhance them.

Experienced in reporting using Microsoft stack (Power BI, SSRS).

Experienced in backend database development and creation/modification of stored procedures.

Designed, developed, and implemented ETL/ELT processes using IICS (Informatica Cloud).

Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub, Deep.io) and ETL.

Hands-on expertise with databases (SQL Server, Oracle, Teradata, Netezza), Big Data, MDM (Master Data Management), loan servicing, loan origination, and telecom.

Extensive experience working with various data sources: Teradata, Snowflake, SQL, Oracle databases, flat files (CSV, delimited), Web API, XML, and JSON.

Good team player with excellent communication, analytical, written and presentation skills with strong aptitude towards learning new technologies.

Experience working with SAP integration tools, including BODS, during the SAP environment migration from MetroPCS to T-Mobile.

EDUCATION

M.S. Software systems BITS, Pilani (Rajasthan), India

CERTIFICATIONS

Business Objects Certified Professional, Business Objects Web Intelligence XI 3.0.

Business Objects Certified Professional, Business Objects Enterprise.

Informatica Certified: PowerCenter 8 Architecture and Administration.

Informatica Certified: PowerCenter 8 Mapping Design.

EXPERIENCE

T-Mobile/MetroPCS April 2011 - Current

Senior Data Engineer

Responsibilities:

Designed SSIS packages to transfer data from flat files, MS Access, and Netezza, extracting, cleaning, transforming, and loading the data into the SQL Server database.

Designed Business Objects universes and reports for MetroPCS and T-Mobile.

Expert in developing and testing reports generated through Business Objects (BOBJ) and Power BI in DEV, QA, and Production.

Loaded structured, semi-structured, and unstructured data into Snowflake using SQL commands, Snowpipe, and the web interface.

Experience in Agile Data Warehouse Automation with Qlik Compose and Qlik Replicate.

Developed, implemented, and maintained SQL database objects such as tables, views, stored procedures, and functions.

Extensively used SSIS transformations and tasks such as Conditional Split, Data Conversion, Multicast, Derived Column, Execute SQL Task, Script Task, Send Mail Task, Data Flow Task, Foreach Loop Container, and Sequence Container.

Created databases and schema objects including tables, indexes, stored procedures, and views; performed inserts and updates on tables.

Created stored procedures for cleaning, manipulating, and processing data between the databases.

Created jobs in SQL Server Agent and scheduled them daily, monthly, and quarterly.

Used SSRS to develop different kinds of reports, such as canned and ad hoc reports.

Developed various reports with SSRS features like parameterized reports, drill-down, drill-through, sub-reports, parameters, matrix, charts.

Developed data pipelines using Azure Data Factory and Azure Databricks, working with different sources and sinks, linked services, datasets, and data flows.

Maintained the Power BI workspace, developed reports, and set up report subscriptions for users to run the business successfully.

Created SSIS Packages to extract data from Flat files and different data sources Netezza, Oracle, Teradata, Snowflake and load into SQL Server and developed various reports with SSRS.


Client: AT&T, Richardson, TX Sep 2010 – Mar 2011

Employer: Pyramid Consulting

Team Lead (Business Objects)

Expert in full life-cycle design and development of Business Objects Enterprise applications. Created the high-level design and application interface design based on user requirements.

Capital One, Richmond, VA Sep 2009 – Sep 2010

BI Specialist

Responsible for the design, development and administration of Business Objects Universes and analytical data constructs/structures.

Cisco, San Jose, CA April 2009– August 2009

Employer: Enternet Business Systems. Report/Dashboard Developer/Analyst

Cisco enables people to make powerful connections, whether in business, education, or philanthropy.

Extensively Involved in System Study, Business Requirements Analysis and Documentation.

Created database tables in Oracle and loaded the data into tables using SQL loader.

PepsiCo, Plano, TX September 2007– March 2009

Employer: Enternet Business Systems. Report Developer/Analyst

Analyzed source data and gathered requirements from the business users.

Coordinated with source system owners, day-to-day ETL progress monitoring, and maintenance.

Designed the STET, PERIGRINE, TART, and Quality Center universes.

Countrywide Mortgage, Plano, TX Mar 2007 - Sep 2007

Employer: Enternet Business Systems. Report Developer/Analyst

Responsible for Business objects reports from Taleo universe and Work Force Analytics.

JP Morgan Chase, Houston, TX June 2006 – Mar 2007

Employer: Enternet Business Systems. Report Developer/Analyst

Responsible for the Global Net, Coast, and Dox universes. Global Net supports the collateral business, and the Coast and Dox applications load the daily transaction feeds.

Argent Mortgage, Irvine, CA Jan 2005 - May 2006

Employer: Enternet Business Systems. Report Developer/Analyst

Worked closely with management, systems operations staff, software development staff, support staff, and end users to ensure rapid resolution of IT issues.

Developed universes and reports for various departments such as Finance, Marketing, and Sales.
