Data Developer

Location: Perth Amboy, NJ
Posted: February 23, 2021

Venkat Ramireddy

201-***-****

MSBI MSSQL and Cloud Developer

adkfi2@r.postjobfree.com

Professional Summary

An MS-BI (SSIS, SSRS, and SSAS), SQL Server 2008/2008R2/2012/2014, and AWS developer with 10 years of IT experience in application development.

Worked on multiple projects as a BI Developer, designing databases, ETL packages, and reports through the development and implementation stages of the project according to client requirements.

Strong experience in T-SQL, query optimization, and developing ETL strategies.

Strong analytical and problem-troubleshooting/solving skills; self-starter.

Able to convert and implement design decisions into business functions and service definitions.

Experience designing and developing ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift/RDS/S3.
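
A minimal sketch of such a Glue job (bucket, connection, and table names below are placeholders, not the actual project values):

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    # Glue passes --JOB_NAME and --TempDir to the job run
    args = getResolvedOptions(sys.argv, ["JOB_NAME", "TempDir"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read campaign data from S3 Parquet files (path is a placeholder)
    source = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-bucket/campaign/"]},
        format="parquet",
    )

    # Load into Redshift through a Glue catalog connection (names are placeholders)
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=source,
        catalog_connection="redshift-connection",
        connection_options={"dbtable": "public.campaign", "database": "analytics"},
        redshift_tmp_dir=args["TempDir"],
    )
    job.commit()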

Familiarity with Alation for developing data catalogs and integrating them with existing enterprise architecture.

Extensive knowledge of database development and utilities (TOAD).

Knowledge of writing Python scripts and Python OOP.

Identify incomplete data, improve quality of data, and integrate data from several data sources

Monitor logs to better understand the functioning of the system.

Experienced in migrating databases to the AWS cloud ecosystem using services such as VPC, EC2, S3, EMR, and RDS for compute, big data analytics, and storage.

Designed and implemented comprehensive backup plans and disaster recovery strategies; rebuilt indexes at regular intervals for better performance.

Experience with Snowflake Multi-Cluster Warehouses.
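
For illustration, a sketch of provisioning a multi-cluster warehouse with the snowflake-connector-python package (account, credentials, and warehouse name are placeholders):

    import snowflake.connector

    # Connection parameters are placeholders
    conn = snowflake.connector.connect(
        account="xy12345.us-east-1",
        user="ETL_USER",
        password="...",
        role="SYSADMIN",
    )
    cur = conn.cursor()

    # A multi-cluster warehouse scales out (here, 1 to 3 clusters) under
    # concurrent query load and suspends itself when idle
    cur.execute("""
        CREATE WAREHOUSE IF NOT EXISTS ETL_WH
          WAREHOUSE_SIZE = 'MEDIUM'
          MIN_CLUSTER_COUNT = 1
          MAX_CLUSTER_COUNT = 3
          SCALING_POLICY = 'STANDARD'
          AUTO_SUSPEND = 300
          AUTO_RESUME = TRUE
    """)
    cur.execute("USE WAREHOUSE ETL_WH")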

Experience working with databases such as MongoDB, MySQL, PostgreSQL, and MS-SQL.

Experience in implementing AWS data services such as Glue, RDS, DMS.

Extensive knowledge of designing reports with cascading parameters, parameterized reports, and report models for ad-hoc reports using SQL Server Reporting Services (SSRS), based on requirements.

Experienced in performance tuning using SQL Profiler: identified slow-running queries and optimized them so they would not affect server performance.
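
Profiler itself is interactive, but the same slow-query hunt can also be scripted against the plan-cache DMVs; a rough sketch using pyodbc (connection details are placeholders):

    import pyodbc

    # Connection string values are placeholders
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
        "DATABASE=mydb;Trusted_Connection=yes;"
    )

    # Top queries by average elapsed time, from the plan cache
    rows = conn.execute("""
        SELECT TOP 10
            qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
            qs.execution_count,
            SUBSTRING(st.text, 1, 120) AS query_text
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY avg_elapsed_us DESC;
    """).fetchall()

    for avg_us, count, text in rows:
        print(f"{avg_us:>12} us x {count:>6}  {text}")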

Experience with data pipeline and ETL tools Talend and SSIS

Analyzed source data quality using Talend Data Quality.

2+ years of experience with Cloud based Data Warehouse/Data lake platforms e.g. Snowflake and Dremio.

Expert in migration/rebuild from SQL Server to Postgres (Aurora Postgres preferred) in dev/production environments.

4+ years of experience in big data engineering roles, developing and maintaining ETL and ELT pipelines for data warehousing in on-premises and cloud data lake environments.

Applies creative problem-solving skills to data efforts in a way that satisfies business leaders' requests and lays the groundwork for future successes.

Proficient understanding of code versioning tools e.g. Git, TFS

Experience in working with Web services and API technologies (REST, XML/JSON, XSLT)
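
A minimal sketch of consuming a REST/JSON service from Python (the endpoint and fields are hypothetical):

    import requests

    # Endpoint and parameters are hypothetical
    resp = requests.get(
        "https://api.example.com/v1/policies",
        params={"state": "NJ", "page": 1},
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()

    # Walk the parsed JSON payload
    for policy in resp.json().get("results", []):
        print(policy["id"], policy.get("status"))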

Educational Qualification

Master of Computer Applications from Jawaharlal Nehru Technological University, India.

Certified Microsoft SQL Server 2012 Developer (MCSE Exam 70-463 Implementing a Data Warehouse with Microsoft SQL Server 2012).

AWS Certified Solutions Architect – Associate

Software Proficiency

Languages: C#, T-SQL, Java, XML, PowerShell, Python

Business Intelligence: SSIS, BIDS, SSRS, SSAS, Talend

Operating Systems: Windows 2008/10, XP, MS-DOS

Version Control: Team Foundation Server and Git

Framework Experience: Microsoft .NET

DBMS: SQL Server 2017/2014/2012/2008/2005; data modeling, database administration, database development

Applications & IDE: Visual Studio .NET, Business Intelligence Development Studio

Desktop Tools: MS Excel, MS Word, MS Outlook, MS Access

Cloud: AWS

Professional Experience

Client: Verisk, Jersey City, NJ

Nov 2015 – Present

Project: JDE/Nautilus

Role: Sr Data Engineer (MSBI/MSSQL/AWS)

JDE is an enterprise project at Verisk with the objective of creating an analytical sandbox for the enterprise: a data repository of key Verisk data assets that is linked and normalized for ease of analytics and research. JDE is used for internal R&D to investigate and bootstrap commercialization opportunities using interconnected data from one or more Verisk sources. JDE extracts and loads data from multiple business units, and we have implemented an integrated Verisk enterprise data model; data from the BUs is transformed and loaded into the integrated Verisk JDE data lake. JDE performs entity resolution across all data assets and links the data in terms of individual, business, asset, class, and geography.

Nautilus is a cloud version of JDE: a data lake on AWS S3, with data stored in Parquet files that are crawled by Glue and can be queried using AWS Athena or Dremio. The environment provides on-demand compute provisioning and dataset access for data scientists to build applications and proofs of concept (PoCs).
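
As an illustration of how such a data lake is queried, a sketch using boto3 to run an Athena query over the Glue-crawled tables (database, table, and output bucket names are placeholders):

    import time
    import boto3

    athena = boto3.client("athena", region_name="us-east-1")

    # Database, table, and output location are placeholders
    qid = athena.start_query_execution(
        QueryString="SELECT COUNT(*) FROM nautilus_db.policies",
        QueryExecutionContext={"Database": "nautilus_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )["QueryExecutionId"]

    # Poll until the query finishes
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
        print(rows)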

Major Projects:

1. BKFSProcess 2. ForeClosureAlert 3. FirstAmerica 4. Geocoding 5. RiskRating (ISO) 6. EnhancedSearch 7. VINDecoding/Match 8. ServiceNow 9. KBM 10. AVM 11. MLS 12. MarketStance 13. BuildFax 14. InfoGroup

Responsibilities:

Involved in the full life-cycle implementation of a large data warehouse.

Make suggestions to business partners in leveraging technology to improve or automate the rendering of information used by the business units.

Develop understanding of both long and short-term objectives of the business units as it relates to analytics and the ability to garner member insights that drive strategic decisions.

Assist data architects in development of algorithms used to perform data matching for risk assessment.

Developing algorithms to optimize the setting of every lever in a high throughput real-time decision system.

Worked on developing ETL pipelines on S3 parquet files on data lake using AWS Glue

Involved in Deployment and Administration of SSIS packages with Business Intelligence development studio.

Rewrote batch scripts identified for retention or update in PowerShell.

Responsible to provide general guidance and leadership to junior members of the analytics team and business-embedded analysts related to data usage and analytics.

Be the subject-matter expert on Dremio and on the data/analytics ecosystem in general to support end users

Serve as a key resource to the business units and the Data Analytics Team for all facets of delivering data and analytics solutions.

Managed SQL database user accounts, user logins, permissions, database and server roles

Developed and implemented database and coding standards, improving performance and maintainability of corporate databases.

Responsible for monitoring and making recommendations for performance improvement in hosted databases. This involved index creation, index removal, index modification, file group modifications, and adding scheduled jobs to re-index and update statistics in databases.
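
A rough sketch of that scheduled re-index step (connection details and the 30% fragmentation threshold are illustrative assumptions):

    import pyodbc

    # Connection values are placeholders; autocommit so DDL applies immediately
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
        "DATABASE=mydb;Trusted_Connection=yes;", autocommit=True)

    # Find indexes fragmented beyond 30% in the current database
    fragmented = conn.execute("""
        SELECT QUOTENAME(s.name) + '.' + QUOTENAME(o.name) AS table_name,
               QUOTENAME(i.name) AS index_name
        FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
        JOIN sys.indexes AS i
          ON ips.[object_id] = i.[object_id] AND ips.index_id = i.index_id
        JOIN sys.objects AS o ON o.[object_id] = i.[object_id]
        JOIN sys.schemas AS s ON s.schema_id = o.schema_id
        WHERE ips.avg_fragmentation_in_percent > 30 AND i.name IS NOT NULL;
    """).fetchall()

    # Rebuild each one, then refresh statistics on the table
    for table_name, index_name in fragmented:
        conn.execute(f"ALTER INDEX {index_name} ON {table_name} REBUILD;")
        conn.execute(f"UPDATE STATISTICS {table_name};")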

Developed and optimized database structures, stored procedures, Dynamic Management views, DDL triggers and user-defined functions.

Plan, manage, and deliver work for the assigned business unit based on business unit leaders' key priorities.

Configured and maintained SQL Server, Postgres, and MySQL instances in the AWS environment.
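
For example, provisioning a Postgres instance on RDS can be scripted with boto3; a sketch with placeholder identifiers and credentials:

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Identifier, sizing, and credentials are placeholders
    rds.create_db_instance(
        DBInstanceIdentifier="app-postgres-dev",
        Engine="postgres",
        DBInstanceClass="db.t3.medium",
        AllocatedStorage=100,
        MasterUsername="dbadmin",
        MasterUserPassword="change-me",
        MultiAZ=False,
    )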

Provide technical oversight for data solutions and, on request, create/build analytical algorithms using SQL/Dremio/Python technologies.

Worked within a lean agile environment with continuous delivery; did pair programming with co-developers and was regularly involved in estimation and planning.

Worked as a member of the AWS Build Team.

Assist the Manager of Strategic Analytics & Member Insights in planning and setting tactical direction for business adoption of analytics and the application of appropriate technology to support business needs.

Infrastructure development on AWS employing services such as EC2, RDS, CloudFront, CloudWatch, VPC, etc.

Set up and implemented a Snowflake data lake with development tools such as SQL Server and Snowflake services.

Played a key role in migrating SQL objects into the Snowflake DWH environment.

Environment: MSSQL 2017/2014, C#, Dremio, Snowflake, Talend, SSIS, SSAS, SSRS, Visual Studio 2010, and AWS

Client: AmeriHealth, Fort Washington, PA

Jun 2014 – Nov 2015

Project: LEAP

Role: SQL BI Developer

Description: LEAP is a migration project from a mainframe system to an EDW system. AmeriHealth contains multiple application systems (Health roles, mainframe systems, IRS, LDW, legacy systems, and CMS Reporting) whose data needs to be integrated with the EDW system.

Projects Worked on: CMS and IRS1099

Responsibilities:

Involved in meetings with business users and analysts to gather requirements.

Designed the SSIS framework per the business flow and shared it with offshore developers for implementation.

Involved in Monitoring and assigning the tasks to team members to meet the project deadlines.

Prepared deployment documents and a deployment checklist when moving code changes to QA/UAT or PROD environments.

Required to assist and coordinate with DBA team for smooth deployment.

Developed SSIS Templates which can be used to develop SSIS Packages in such a way that they can be dynamically deployed into Dev, Test and Production Environments.

Executing data validation stored procedures in SSIS packages in Business Intelligence Development Studio (BIDS).

Involved in Deployment and Administration of SSIS packages with Business Intelligence development studio.

Defined Check constraints, Business Rules, Indexes, and Views.

Used execution plans and SQL Server Profiler to trace slow-running queries and optimized SQL queries for improved performance and availability.

Created SQL scripts for tuning and scheduling. Performed data conversion from flat files into a normalized database structure. Created new SSIS packages to extract data from legacy systems to SQL Server objects using Business Intelligence Development Studio (BIDS) and Visual Studio 2012.
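
Outside of SSIS, the same flat-file-to-staging-table step can be sketched in Python with pyodbc (file, table, and column names are hypothetical):

    import csv
    import pyodbc

    # Connection string and table/column names are placeholders
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
        "DATABASE=staging;Trusted_Connection=yes;")
    cur = conn.cursor()
    cur.fast_executemany = True  # batch the parameterized inserts

    # Read the pipe-delimited legacy extract and insert into a staging table
    with open("legacy_extract.txt", newline="") as f:
        reader = csv.reader(f, delimiter="|")
        next(reader)  # skip the header row
        cur.executemany(
            "INSERT INTO dbo.StageMember (MemberId, FirstName, LastName) "
            "VALUES (?, ?, ?);",
            [tuple(row[:3]) for row in reader],
        )
    conn.commit()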

Environment: MSSQL 2012, SSIS, SSRS, SSAS, Visual Studio 2008/2010, SharePoint

Experience with Accenture

Client: SHELL, London

Nov 2013 – Apr 2014

Project: Einstein

Role: SQL/SSRS Developer

Description:

In the current Einstein tool there are pain points, such as the time the tool takes to record performance metrics, primarily due to how it is designed: it needs continuous manual data input from users at multiple stages. The number of man-hours currently spent does not represent the best value from the current staff. The reports generated by the tool (once a week and once a month) depend on workflow data being received; the longer that data takes to be entered, the longer the reports wait for that piece of information. Any change in the business process needs a change in the reports. The sole developer of the tool does not have enough time to maintain it going forward. There may also be data inaccuracies in the tool which need to be investigated.

The Einstein Replacement tool should eliminate the pain points that most FOD users face in the current Einstein tool. This could be achieved by developing a single-repository solution that replaces the current Einstein for all process areas, with integration of the source systems, and produces real-time metrics reports. The new tool will also produce a real-time leading indicator that shows the real-time metrics report for FOD users in all locations.

Responsibilities:

Developing SSRS Reports like Sub Reports, Linked Reports and Graphs

Designed stored procedures and functions.

Gathered user requirements, drafted specs, and developed SQL queries and SSRS reports.

Developed client reports (.rdl files) and integrated them into a WPF application using the ReportViewer control.

Configured reports in custom administrative portal.

Designed and developed Drill-down, Drill through, Summary, Transaction Print and Sub Reports using SSRS.

Scheduled reports to automatically generate bi-weekly report for Administrators of CMS Application.

Generated multi-parameterized cascading reports, providing the ability to narrow down the selection criteria before executing reports, making them user friendly.

Provided MS SSRS 2008R2 training to the business and maintenance teams to develop their own ad-hoc reports.

Deployment in SharePoint environment

Involved in writing scripts to create tables, functions, and views.

Involved in client interaction and reviews and actively participated in solving the issues.

Environment: MSSQL 2012, SSIS, SSRS, Visual Studio 2010

Experience with Genpact India

Client: Physician Health Partners, Colorado

Jan 2013 – Nov 2013

Project: PHP

Role: BI Developer

Description:

PHP (Physician Health Partners) is an integrated team of physicians and healthcare professionals committed to supporting effective patient care throughout the healthcare continuum. PHP offers a range of services to help its primary care providers deliver high quality, cost-effective care for their patients. PHP also provides services directly to patients/members to help enhance their overall healthcare experience and outcomes.

Responsibilities:

Managed resources in onshore and offshore environment. Worked with XML transformations.

Actively participated in interaction with users, team lead, DBA’s and technical manager to fully understand the requirements of the system. Worked on data modeling.

Applied knowledge of normalization and de-normalization techniques. Wrote queries to create stored procedures, user-defined functions, views, triggers, and CTEs (Common Table Expressions).

Using all kinds of SQL Server Constraints (Primary Keys, Foreign Keys, Defaults, Check, Unique etc).

Writing complex T-SQL queries, subqueries, correlated subqueries, and dynamic SQL queries.
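
As one worked illustration (table and column names are hypothetical), a CTE combined with a correlated subquery, executed through pyodbc:

    import pyodbc

    # Connection string values are placeholders
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
        "DATABASE=mydb;Trusted_Connection=yes;")

    # The CTE computes per-patient visit counts; the correlated subquery
    # then pulls each patient's most recent visit date
    rows = conn.execute("""
        WITH VisitCounts AS (
            SELECT PatientId, COUNT(*) AS VisitCount
            FROM dbo.Visits
            GROUP BY PatientId
        )
        SELECT p.PatientId,
               vc.VisitCount,
               (SELECT MAX(v.VisitDate)
                FROM dbo.Visits AS v
                WHERE v.PatientId = p.PatientId) AS LastVisit
        FROM dbo.Patients AS p
        JOIN VisitCounts AS vc ON vc.PatientId = p.PatientId;
    """).fetchall()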

Performance tuning of Stored Procedures and T-SQL Queries.

Created logins and users, configured permissions, and assigned roles to users.

Created indexes on selective columns to speed up queries and analyses in SQL Server Management Studio.

Used execution plans and SQL Server Profiler to trace slow-running queries and optimized SQL queries for improved performance and availability.

Created SQL scripts for tuning and scheduling. Performed data conversion from flat files into a normalized database structure. Created new SSIS packages to extract data from legacy systems to SQL Server objects using Business Intelligence Development Studio (BIDS) and Visual Studio 2008.

Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse databases and data mart databases.

Developed SSIS Templates which can be used to develop SSIS Packages in such a way that they can be dynamically deployed into Dev, Test and Production Environments.

Executing data validation stored procedures in SSIS packages in Business Intelligence Development Studio (BIDS).

Involved in Deployment and Administration of SSIS packages with Business Intelligence development studio.

Implemented BCP/ETL Scripts for data transformation

Resolve SQL Job failures

Designed and ensured the physical and logical models were in sync with every release.

Optimize Data and Log management throughput and availability

Defined Check constraints, Business Rules, Indexes, and Views.

Environment: MSSQL 2012, SSIS, SSAS, Visual Studio 2008.

Experience with Polaris Software Labs

Client: CITI Group, NJ

Oct 2010 – Dec 2012

Project: GFTS

Role: SQL Developer

Description: A Matter is a legal case in a court of law, and LM (Law Manager) is used to maintain a Matter and its related information, created by a legal user (a CITI lawyer or their staff). Matters are entered using different notebooks depending on the type of matter: the Matter Notebook for corporate and consumer users, the EMEA Matter Notebook for EMEA users, and the CGMI Matter Notebook for ICG matters; all matters are also available in the CGMI notebook. A Matter is assigned to entities: internal entities such as the Primary Manager, Secondary Managers, and Business Contact (stored on the Matter Assignment tab of the matter notebooks), and external entities such as the Firm Provider (the law firm working on the case) and the Opposing Firm (stored in the Counsel tab of the matter notebook). A Primary Manager is mandatory for Matter creation, and a Gatekeeper is also assigned to a Matter if required. Every matter has a cost center assigned to it, to which the expenses for that matter are allocated; a single matter can have multiple cost centers.

LM interacts with other apps in offline mode, except for AIMS, where it is real-time for invoicing, payments, entity information, etc. The status of a matter is updated manually by the end users.

Responsibilities:

Studied the current transactional database to understand the existing system and created the approach documents for the migration task.

Built integrations between Law Manager and other applications.

Performed investigations and made appropriate suggestions on various services to support the development efforts.

Responsible for designing, coding, and testing the applications and programs of Business Intelligence Specialist system.

Designing and developing the SSIS Packages.

Created and managed schema objects such as tables, views, procedures, and functions.

Extensive experience in writing functional specifications, translating business requirements to technical specifications.

Implemented BCP/ETL Scripts for data transformation.
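
A sketch of how such a BCP load might be scripted from Python (server, table, and file names are placeholders; -T uses a trusted connection, -c selects character mode, -t sets the field terminator):

    import subprocess

    # Bulk-load a pipe-delimited extract into a staging table with the
    # bcp command-line utility (all names below are placeholders)
    subprocess.run(
        [
            "bcp", "LegalDB.dbo.MatterStage", "in", "matter_extract.dat",
            "-S", "sqlprod01", "-T", "-c", "-t", "|",
        ],
        check=True,
    )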

Developed data extract, transform, and load (ETL) programs using Microsoft SQL Server, including T-SQL, DTS, and SSIS.

Proactively monitor the Production Database and optimize stored procedures for better performance. Resolve locking conflicts and resource utilization.

Ensured all database environments (Development, Staging, QA, and Production) were in sync. Defined a DB script versioning system to keep track of database versions across all environments.

Involved in Debugging and Deploying reports on the production server and analyzed reports and fixed bugs in stored procedures on the ongoing database operations, as needed, to resolve business problems.

Created SQL Agent Jobs for scheduling SSIS Packages, SSRS reports and alerts.

Configured and fine-tuned all the ETL (Extract, Transform and Load) workflows.

Identified slow running queries, optimized stored procedures, tested applications for performance and data integrity using SQL Profiler.

Understood the business specification document and source map document for creating packages and reports.

Involved in migrating the data from their existing ERP system using SSIS into our project

Actively provided online support to the implementation and support teams deployed at client sites.

Implemented maintenance plan tasks such as backup tasks, shrink database tasks, and index rebuild tasks.

Successfully installed MS SQL 2008R2 and configured SSRS reporting server.

Created various admin reports using SSRS that fetch the data from various data sources and display a summary report.

Enhanced existing SSRS reports in Meter History system.

Created a Database and normalized the database into small tables by applying rules of Normalization, defined Check Constraint and applied Business Rules.

Responsible for development of Procedures, Functions and database Triggers.

Designed data storage in MS SQL Server; created indexes and analyzed their role in optimization.

Supported migration and upgrade of databases. Designed the dimensional data mart and developed SSIS packages to validate data loads.

Database Performance monitoring and tuning.

Backup and recovery of Databases.

Environment: MSSQL 2008R2/2012, SSIS, SSRS, SSAS, Visual Studio 2010, SharePoint


