Business Intelligence Data Warehouse

Location:
Irving, TX
Salary:
160000
Posted:
January 31, 2024

NITHIN NEEMKAR

Email: ad29ns@r.postjobfree.com Phone: 770-***-****

Certified Snowflake Developer

PROFESSIONAL SUMMARY

* ***** ** ********** ** building data-intensive applications, data analytics, business intelligence, and data integration/migration using SAS, Python, Oracle, Snowflake, ADF, Databricks, and Synapse Analytics.

Expertise in building and migrating data warehouses on the Snowflake cloud data platform.

Experience in creating Snowflake virtual warehouses and moving data from traditional databases to Snowflake.

Experience in building data pipelines in Azure Data Factory.

Implemented data skew mitigation patterns to remove skewness across partitions, as sketched below.
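
A minimal PySpark sketch of one such pattern, key salting, is shown below; the table and column names (sales_facts, customer_dim, customer_id) are hypothetical and only illustrate the idea of spreading a skewed join key across partitions.

    # PySpark key-salting sketch: spread a skewed join key across partitions.
    # Table and column names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("skew-salting-demo").getOrCreate()

    facts = spark.table("sales_facts")    # large table, skewed on customer_id
    dims = spark.table("customer_dim")    # smaller dimension table

    SALT_BUCKETS = 8

    # Add a random salt to the skewed (large) side ...
    salted_facts = facts.withColumn("salt", (F.rand() * SALT_BUCKETS).cast("int"))

    # ... and replicate the other side once per salt value so every row still matches.
    salted_dims = dims.crossJoin(
        spark.range(SALT_BUCKETS).withColumnRenamed("id", "salt")
    )

    # Joining on (customer_id, salt) spreads each hot key over SALT_BUCKETS partitions.
    joined = salted_facts.join(salted_dims, on=["customer_id", "salt"], how="inner").drop("salt")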

Implemented data transformations in Databricks notebooks and provided stage-specific configuration in the notebook files.

Good exposure to Snowflake cloud architecture and Snowpipe for continuous data ingestion; a Snowpipe sketch follows.
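
As a hedged illustration of how a Snowpipe can be defined for continuous ingestion, the sketch below issues the DDL through snowflake-connector-python; the account, stage, table, and pipe names are placeholders, and it assumes cloud event notifications are already configured on the external stage.

    # Snowpipe definition issued through snowflake-connector-python.
    # Account, stage, table, and pipe names are placeholders (assumptions).
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    cur = conn.cursor()
    try:
        # AUTO_INGEST depends on cloud event notifications (e.g. S3 -> SQS)
        # already being wired to the external stage.
        cur.execute("""
            CREATE PIPE IF NOT EXISTS raw.orders_pipe
              AUTO_INGEST = TRUE
            AS
              COPY INTO raw.orders
              FROM @raw.orders_stage
              FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        """)
    finally:
        cur.close()
        conn.close()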

Proficient in understanding business processes and requirements from user stories and translating them into technical requirements.

Design and implement Python scripts for efficient data extraction and loading operations into Snowflake.

Integrate PySpark seamlessly with Snowflake for optimized data processing and analytics.
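
A minimal sketch of that integration, assuming the Spark-Snowflake connector is available to the Spark session; the connection options and query are placeholders.

    # Reading a Snowflake table into PySpark via the Spark-Snowflake connector.
    # All connection options below are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("snowflake-read-demo").getOrCreate()

    sf_options = {
        "sfURL": "my_account.snowflakecomputing.com",
        "sfUser": "etl_user",
        "sfPassword": "***",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "TRANSFORM_WH",
    }

    # The query is pushed down to Snowflake; Spark receives only the result set.
    orders = (
        spark.read.format("net.snowflake.spark.snowflake")
        .options(**sf_options)
        .option("query", "SELECT order_id, amount FROM orders WHERE amount > 0")
        .load()
    )

    orders.groupBy().sum("amount").show()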

Experience in creating dedicated SQL pools and Spark notebooks in Synapse Analytics.

Good understanding of data storage in Synapse SQL pools.

Extensively worked on ETL processes consisting of data sourcing, transformation, mapping, and conversion, along with data modelling.

Advanced SQL skills including complex joins, Snowflake stored procedures, cloning, views, and materialized views.

Experience with the Snowflake data warehouse and a deep understanding of Snowflake architecture and query processing.

Created clone objects using Snowflake's zero-copy cloning.

Handled large and complex data sets such as JSON and CSV files from sources like ADF and AWS S3.

Experience in writing complex SQL scripts using statistical aggregate functions and analytic functions to support ETL in the Snowflake cloud data warehouse.

Used COPY/INSERT, PUT, GET commands for loading data into Snowflake tables from internal and external stages.
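
A hedged sketch of the internal-stage load path described above, using snowflake-connector-python; the file path, table name, and credentials are hypothetical.

    # Load a local CSV into a Snowflake table via its internal (table) stage.
    # File path, table name, and credentials are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
    )
    cur = conn.cursor()
    try:
        # PUT uploads (and by default compresses) the file into the table stage.
        cur.execute("PUT file:///tmp/customers.csv @%customers")

        # COPY INTO moves the staged file into the target table.
        cur.execute("""
            COPY INTO customers
            FROM @%customers
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
    finally:
        cur.close()
        conn.close()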

Experience in integrating dbt with Snowflake.

Created SQL models in dbt for data movement in Snowflake.
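
The SQL models mentioned above follow dbt's ref()-based dependency pattern; a hedged sketch of the same pattern as a dbt Python model on Snowflake (dbt-snowflake runs these through Snowpark) is shown below, with hypothetical model and column names.

    # models/staging/stg_orders.py -- a dbt Python model on Snowflake (Snowpark).
    # Model and column names are hypothetical.
    def model(dbt, session):
        dbt.config(materialized="table")

        # Upstream model/source resolved by dbt, returned as a Snowpark DataFrame.
        raw_orders = dbt.ref("raw_orders")

        # Keep only valid rows; dbt materializes the returned DataFrame as a table.
        return raw_orders.filter(raw_orders["AMOUNT"] > 0)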

Experience in creating Azure Event Hub, Azure Key Vault, Stream Analytics.

Experience developing under software development methodologies such as Agile and Waterfall.

Worked on bulk loading data into Snowflake tables.

Expertise in SAS/Base and SAS/Macros programming.

Excellent understanding of SAS ETL, SAS BI.

Rich hands-on experience in SAS CI Studio, SAS/Data Integration Studio, SAS BI Tools and SAS Enterprise Guide.

Expertise in SAS 9.4 and SAS 9.3 administration activities.

Ability to work independently and as part of a team, with a sense of responsibility, dedication, commitment, and an urge to learn new technologies.

Excellent client interaction skills and proven experience working independently as well as in a team.

Exposure to Power BI concepts and some experience creating dashboards.

EDUCATIONAL QUALIFICATIONS

Bachelor of Commerce, Osmania University, Hyderabad, Telangana, India – April 2013.

Master of Science in Business Analytics, Trine University, Detroit, Michigan, USA – Aug 2021 to Dec 2022.

CERTIFICATIONS

●Siebel 7.7 Essentials certification and process training covering installation, configuration, working with Siebel products, and the processes followed for logging defects from Siebel.

●Business Intelligence Applications Support Specialist

●Oracle Business Intelligence Foundation Support

●Business Intelligence Applications 7.9.6 Presales

●Oracle Business Intelligence Foundation 10.3.1 PreSales Specialist Assessment

●Business Intelligence Applications 7.9.6 Sales Specialist Certificate

TECHNICAL SKILLS

Cloud: AWS, Microsoft Azure, Informatica Cloud, Snowflake

Business Intelligence: OBIEE, Siebel Business Analytics 7.8, Power BI, Tableau

DAC: DAC 11g 11.1.1.6.4 / 11.1.1.6.3 / 7.9.6.1 / 7.9.6.2

ETL Tools: Oracle Data Integrator 11g, Informatica Cloud, Informatica PowerMart, Informatica PowerCenter 9.5.1 / 8.6.1 / 7.1.4, ODI

Data Modeler: ER/Studio, Sparx

CRM Applications: Oracle E-Business Suite, JDE 9.0, Salesforce

Siebel Applications: Siebel 7.x/8.0, Siebel Call Center

Operating System: Windows 95/98/2000/XP, UNIX, Red Hat Linux, AIX, Sun Solaris, MS-DOS

Languages: Python, C, C++, C#

Database: Oracle, Teradata, SQL Server, MongoDB

Internet Technologies: J2EE, JSP, XML, HTML, DHTML, XHTML, Perl, CGI, UML, JUnit, JProbe

Version Control Tools: ClearCase, Visual SourceSafe

PROFESSIONAL EXPERIENCE

Altice USA, Long Island, NY [Remote] May 1, 2023 – Present

Power BI Consultant (C2C – Signitives Technologies LLC)

Roles and Responsibilities:

●Designed, developed, tested, and maintained Tableau functional reports based on user requirements. Developed the audit activity for all the cloud mappings.

●Created roadmap to implement Power BI as new Enterprise reporting platform.

●Proficient in AWS services such as VPC, EC2, S3, ELB, Auto Scaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, and CloudTrail.

●Developed Tableau data visualization using Cross tabs, Heat maps, Box and Whisker charts, Scatter Plots, Geographic Map, Pie Charts and Bar Charts and Density Chart.

●Possess good knowledge in creating and launching EC2 instances using AMI’s of Linux, Ubuntu, RHEL, and Windows and wrote shell scripts to bootstrap instance.

●Installed and configured Enterprise gateway and Personal gateway in Power BI service.

●Consulted with the operations team on deploying, migrating data, monitoring, analyzing, and tuning MongoDB applications.

●Created mockup reports and dashboards in Power BI to show stakeholders the advantages of using it over the existing reports in OBIEE.

●Wrote calculated columns and measure queries in Power BI Desktop to demonstrate the features and ease of development over the existing tool.

●Provided Production support to Tableau users and Wrote Custom SQL to support business requirements.

●Design, Development, Testing and Implementation of ETL processes using Informatica Cloud

●Created snapshots to take backups of the volumes and also images to store launch configurations of the EC2 instances.

●Created new models in Power BI and built new reports.

●Configured cloud event triggers to trigger Snowpipe.

●Defined standards for the Snowflake development team to ensure good performance and future maintainability.

●Developed MongoDB embedded documents from Java code using Spring Data MongoDB.

●Extensive experience delivering Data warehousing implementations, Data migration and ETL processes to integrate data across multiple sources using Informatica PowerCenter and Informatica Cloud Services.

●Perform on-call duties for the production support issues and outages occurring across different environments.

●Increased/decreased virtual warehouse size on demand for optimal performance without unnecessarily increasing cost in Snowflake (see the warehouse-scaling sketch after this list).

●Allocated dedicated virtual warehouses for common query workloads to improve performance in Snowflake.

●Developed enhancements to MongoDB architecture to improve performance and scalability.

●Comfortable writing MapReduce programs to load data into the MongoDB environment.

●Troubleshot production support issues, identified root causes, and migrated fixes.

●Utilized Tableau server to publish and share the reports with the business users.

●Created virtual warehouses in Snowflake based on requirements.

●Sized Snowflake virtual warehouses based on the data load to be migrated from Teradata.

●Integrated Exadata monitoring into the existing IT monitoring framework; provided and implemented performance and tuning recommendations for all components in the Exadata machines.

●Assisted other ETL developers in solving complex scenarios and coordinated with source system owners on day-to-day ETL progress monitoring.

●Created multi-cluster warehouses to allow auto-scaling on demand for better performance.

●Created different roles in Snowflake and assigned privileges according to each member's role in the project.

●Configured Azure Blob Storage to bulk-load data into Snowflake.

●Assisted in building the ETL source to Target specification documents by understanding the business requirements.

●Develop reporting standards and best practices to ensure data standardization and consistency.

●Expertise in creating, debugging, scheduling, and monitoring jobs using Airflow and Oozie.

●Installed and configured Apache Airflow for workflow management and created workflows in Python (a DAG sketch appears after this list).

●Performed platform administration for the Development, Test, QA, and Production environments.

●Applied Oracle patches across BI environments.

●Worked with DBA on improving performance of reports querying from Exadata.

●Created new reports/ dashboards as per requirements provided by Business Users.

●Developed mappings that perform Extraction, Transformation and load of source data into Derived Masters schema using various power center transformations like Source Qualifier, Aggregator, Filter, Router, Sequence Generator, look up, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer and update strategy to meet business logic in the mappings.

●Retrieved data using time travel functionality.

●Worked with DBA in SQL tuning in Exadata Production and Performance Testing environments by using statistics, hints, SQL tuning set etc.

●Met client expectations for Oracle Data Visualization requests delivered in the reports.

●Recommend new Oracle Visualization styles to improve dashboard usability and extend customers’ ability to understand analyses.

●Use PySpark functionalities for statistical analysis and machine learning tasks as needed.

●Set up monitoring mechanisms for PySpark jobs to track performance metrics and identify bottlenecks.

●Leverage Python for data processing and analysis tasks within the Snowflake environment.

●Utilize Python libraries such as Pandas, NumPy, and others for efficient data manipulation and analysis.

●Modified OBIEE Metadata and ETL to ensure performance is better on Exadata.

●Extracted data from EBS system to Staging Area and loaded the data to the target database by ETL process using Informatica Power Center.

●Implemented OBIEE Cache Management to help in performance tuning.

●Distributed Tableau reports using techniques like Packaged Workbooks, PDF to different user community.

●Utilized Oracle Data Visualization tools to create clear and concise visual representations of data.

●Investigated and fixed bugs that occurred in the production environment and provided on-call support.

●Managed Informatica Development and enhancements based on requirements provided by users.

●Create new reports and dashboards to meet new business requirements.

●Customize Informatica mappings and DAC Tasks to meet new data model requirements.

●Implemented standards on data modelling, report development and gathering business requirements.

●Automated migration of Informatica objects between environments.

●Configured data sharing in Snowflake.
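
A hedged sketch tying together the Airflow and warehouse-scaling items above: an Airflow DAG (using the apache-airflow-providers-snowflake package) that scales a Snowflake virtual warehouse up before a load and back down afterwards. The connection id, warehouse name, SQL, and schedule are assumptions rather than project values.

    # Airflow DAG sketch: scale a Snowflake warehouse up, run a load, scale it down.
    # Connection id, warehouse name, and schedule are assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    with DAG(
        dag_id="nightly_snowflake_load",
        start_date=datetime(2023, 5, 1),
        schedule_interval="0 2 * * *",   # nightly at 02:00
        catchup=False,
    ) as dag:
        scale_up = SnowflakeOperator(
            task_id="scale_up_warehouse",
            snowflake_conn_id="snowflake_default",
            sql="ALTER WAREHOUSE LOAD_WH SET WAREHOUSE_SIZE = 'LARGE'",
        )

        load_orders = SnowflakeOperator(
            task_id="load_orders",
            snowflake_conn_id="snowflake_default",
            sql="COPY INTO raw.orders FROM @raw.orders_stage FILE_FORMAT = (TYPE = 'CSV')",
        )

        scale_down = SnowflakeOperator(
            task_id="scale_down_warehouse",
            snowflake_conn_id="snowflake_default",
            sql="ALTER WAREHOUSE LOAD_WH SET WAREHOUSE_SIZE = 'XSMALL'",
        )

        scale_up >> load_orders >> scale_down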

Environment: Oracle BIP, Oracle ADW, Snowflake, Tableau, Informatica Cloud, Mongo DB, Teradata, AWS Glue, OBIEE 12c, Informatica 9.1, Oracle Linux, Oracle 12c, Exadata, Oracle EBS, Windows Server 2008, Airflow and Oracle Database 12.1.0.2.0

Amazon Development Center, Hyderabad, India [Onsite] July 25, 2016 – August 1, 2021

Data Engineer

Roles and Responsibilities:

Responsible for all activities related to the development, implementation, administration, and support of ETL processes for on-premises and cloud data warehouses.

Migrated data from on-premises systems to the AWS-hosted cloud data warehouse.

Bulk loaded data into Snowflake from an external stage (AWS S3) using the COPY command.

Worked on Snowflake streams to process incremental records.

Loaded data into Snowflake tables from the internal stage and from the local machine.

Used COPY, LIST, PUT, and GET commands for validating internal and external stage files.

Used import and export from the internal (Snowflake) stage versus the external stage (S3 bucket); see the sketch below.

Wrote complex SnowSQL scripts in the AWS-hosted cloud data warehouse for business analysis and reporting.
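
A hedged sketch of that external-stage pattern: creating an S3-backed stage, validating it with LIST, and bulk loading with COPY INTO via snowflake-connector-python. The bucket, credentials, and object names are placeholders.

    # Create an S3-backed external stage, validate it, and bulk load with COPY INTO.
    # Bucket, credentials, and table names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    cur = conn.cursor()
    try:
        cur.execute("""
            CREATE STAGE IF NOT EXISTS raw.events_s3_stage
              URL = 's3://my-bucket/events/'
              CREDENTIALS = (AWS_KEY_ID = '***' AWS_SECRET_KEY = '***')
              FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        """)

        # LIST shows what files the stage can see -- a quick validation step.
        for row in cur.execute("LIST @raw.events_s3_stage"):
            print(row)

        # Bulk load every staged file into the target table.
        cur.execute("COPY INTO raw.events FROM @raw.events_s3_stage")
    finally:
        cur.close()
        conn.close()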

Perform troubleshooting analysis and resolution of critical issues.

Involved in data analysis and handling ad-hoc requests by interacting with business analysts, clients and resolving the issues as part of production support.

Used performance techniques such as clustering keys and auto-scaling for faster data loading and improved query execution.

Performed data validations for source and target using load history and copy history.

Developed Snowflake queries using various joins such as self, left, right, and full outer joins.

Created views, materialized views, and stored procedures in Snowflake.

Worked on assigning roles to various users, such as development and testing teams.

Designed Update and Insert strategy for merging changes into existing Dimensions and Fact tables.

Maintained SAS DI Jobs to create and populate tables in data warehouse for daily reporting across departments.

Created DI jobs for ETL process and reporting.

Created SAS FM templates and published them in SAS FM Studio.

Addressing data issues and providing permanent fixes.

Estimated requirements and committed to deadlines with the business.

Converted design documents into technical specifications.

Mentoring team to improve their technical skills.

Installed and configured SAS 9.4 Enterprise Business Intelligence Servers in Solaris environment.

Setting up the individual project repositories through SMC.

Troubleshot SAS server-related issues and coordinated with SAS vendors on support tracks for solutions.

Applied licenses on SAS Servers.

Created libraries for various data sources like Oracle, flat files, and SAS SPDS.

Monitoring server resources and performance along with mount points, CPU usage and logs monitoring.

Implement data preprocessing workflows using Python scripts to ensure data quality.
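
A minimal sketch of such a preprocessing step in pandas; the column names and quality rules are hypothetical.

    # Small pandas preprocessing step enforcing basic data-quality rules.
    # Column names and rules are hypothetical.
    import pandas as pd

    def preprocess_customers(path: str) -> pd.DataFrame:
        """Load a raw extract, apply quality rules, return clean rows."""
        df = pd.read_csv(path)

        # Standardize column names and trim whitespace in string columns.
        df.columns = [c.strip().lower() for c in df.columns]
        for col in df.select_dtypes(include="object").columns:
            df[col] = df[col].str.strip()

        # Quality rules: no duplicates, no rows missing the customer key.
        df = df.drop_duplicates().dropna(subset=["customer_id"])

        # Dates must parse; unparseable values become NaT and are dropped.
        df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
        df = df.dropna(subset=["signup_date"])

        return df

    if __name__ == "__main__":
        clean = preprocess_customers("/tmp/customers_raw.csv")
        print(f"{len(clean)} clean rows ready to load into Snowflake")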

Performing maintenance activity of SAS servers which includes taking backups and restarting SAS services.

Develop custom scripts to enhance cloud functionality or automate specific tasks using Python.

Environment: Snowflake, AWS S3, Oracle, SQL, MySQL, PGSQL, SAS DI, SAS SMC, SAS LSF, SAS FM, SAS OLAP

Knoah Solution India PVT LTD, Hyderabad, India [Onsite] 24 Dec 2014 – 01 May 2016

SAS Developer

Roles and Responsibilities:

Used PROC SQL and SQL pass-through to write ETL queries, join tables, and perform ad hoc analysis, testing, and validation of datasets.

Involved in writing SAS MACROS and generalized code for implementing incremental jobs for ad hoc and daily/monthly ETL processing.

Worked on PROC SPDO to manage data in the form of clusters (Dynamic Tables) on SPDS server for efficient operation of ETL process.

Created and managed metadata objects that define sources, targets, and jobs for various transformations, and consolidated user transformations into process flows via the Process Designer in DI Studio.

Deploy jobs, create dependencies in DI Studio, and schedule accordingly via LSF.

Administration of SAS Enterprise Business Intelligence applications on SAS production servers.

Responsible for SAS client tools installation and SAS server port opening.

Responsible for setting up the individual project repositories through SMC.

Applied licenses on SAS Servers, SAS DIS Server and SAS SPD Server.

Responsible for taking backups of Metadata.

Performing maintenance activity of SAS servers which includes taking backups and restarting SAS services.

Environment: Oracle, SQL, MySQL, PGSQL, SAS DI, SAS EG, SAS SMC, SAS LSF, SAS FM, SAS OLAP, Linux


