
Sr. SQL Developer, BI Systems

Location: Miami Beach, FL
Salary: $110,000
Posted: November 22, 2018

Resume:

Sunny Sharma

Email: ac7rp1@r.postjobfree.com

Phone: 305-***-****

LinkedIn: http://www.linkedin.com/in/sunny-sharma1217

Education:

Master of Science

California State University, Northridge, California Jan 2009 – Dec 2010

Bachelor of Engineering

Punjab Technical University, Punjab, India Aug 2003 – Aug 2007

Professional Summary:

Over 7 years of IT experience, including system analysis, SQL Server troubleshooting, design, development, T-SQL and PL/SQL coding, and support of MS SQL Server 2016/2014/2008/2008 R2 and Oracle 12c/11g in Production, QA, and Development environments.

Hands-on experience installing, configuring, managing, upgrading, monitoring, and troubleshooting SQL Server 2016/2014/2008 R2.

Hands-on experience with SSIS, Kalido DIW, and the Kalido 9.0 GA Master Data Management and Data Warehouse tools (MDM and DIW).

Experience in designing and modeling organization hierarchy and enterprise models.

Experience extracting, transforming, loading, and modifying data from Oracle into SQL Server using T-SQL procedures, views, and SSIS transformations; migrated the Oracle server data warehouse into SQL Server (see the sketch at the end of this summary).

Worked with various teams, including DBAs, QA, BAs, Production Support, and business teams, to create, modify, and deploy code and releases into production after the various test cycles.

Worked on troubleshooting Cognos and SSRS BI reporting solutions.

Experience with DDL and DML: implementing and developing stored procedures, nested queries, joins, views, indexes, and relational database models, and creating and updating tables.

Experience in Scheduling and troubleshooting Jobs and Alerts using SQL Server Agent.

Knowledge of System Development Life Cycle (SDLC).

Experience interacting with business users to analyze business processes, transform requirements into screens, perform ETL, document, and roll out the deliverables.

Experience in Creating, Modifying and testing SSIS Packages.

Worked on extracting, transforming, and loading data from Excel and flat files using SSIS.

Knowledge of creating OLAP cubes using SQL Server analysis services (SSAS).

Good understanding of normalization/de-normalization, normal forms, and database design methodology, using data modeling tools such as MS Visio and Erwin for logical and physical database design.

Worked with senior business analysts and users on database solutions.

Used JIRA, Confluence, and Quality Center for work-ticket tracking.

Basic knowledge of AWS; completed AWS training.
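
To illustrate the Oracle-to-SQL Server loading described above, here is a minimal T-SQL sketch of one common pattern: staging rows from a linked Oracle server with OPENQUERY and then loading them into a warehouse table. The linked server name (ORA_DW), table names, and columns are hypothetical placeholders, not the actual project objects.

-- Minimal sketch, assuming a linked Oracle server named ORA_DW and illustrative table names.
IF OBJECT_ID('dbo.RevenueStaging') IS NOT NULL
    DROP TABLE dbo.RevenueStaging;

-- Stage yesterday's rows from the (hypothetical) Oracle source table.
SELECT *
INTO   dbo.RevenueStaging
FROM   OPENQUERY(ORA_DW,
       'SELECT deal_id, network_cd, air_date, gross_amt
          FROM dw_owner.ad_sales_revenue
         WHERE air_date >= TRUNC(SYSDATE) - 1');

-- Simple transform/load into the warehouse fact table.
INSERT INTO dbo.FactAdSalesRevenue (DealID, NetworkCode, AirDate, GrossAmount)
SELECT deal_id, network_cd, air_date, gross_amt
FROM   dbo.RevenueStaging;

In practice much of this movement can also be done through SSIS data flow tasks; the T-SQL above just shows the equivalent set-based load.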

Skillset:

Strong T-SQL and PL/SQL knowledge and ability to write complex queries.

Databases: MS SQL Server 2016/2014/2008 R2/2008, Oracle 10g/12c, MySQL; languages: T-SQL, PL/SQL.

Strong Communication skills, experienced in user interaction, experienced in providing technical support to the various business teams.

Strong time management skills and ability to adapt to quickly changing priorities.

Strong knowledge of data warehousing.

Reporting: SSRS, IBM Cognos.

Tools: MS Access, MySQL, Integration Services (SSIS) 2008/2008 R2, Management Studio, Visual Studio, Kalido 9.0 GA DIW.

Operating systems: Windows 7 Enterprise.

Modeling and scripting: Erwin, ER Studio, MS Visio, Korn shell scripting.

Project Summary:

Sr. SW Developer, BI Systems / SQL Data Warehouse Developer Apr 2014 – Present

Viacom International (MTV)

1111 Lincoln Rd, Miami Beach, FL

Viacom is home to the world's premier entertainment brands that connect with audiences through compelling content across television, motion picture, online and mobile platforms in over 160 countries and territories.

Evolve/Navigo/Sales Datamart Project: a data warehouse project at Viacom International. The warehouse stores all Ad Sales linear and digital revenue, loaded through nightly runs from applications such as IPM, WideOrbit, Planit, and Gabriel. Revenue comes from ads aired on company networks such as MTV, Nickelodeon, Logo, VH1 Soul, Comedy Central, Spike TV, and many more, plus digital revenue from website banners and advertisements.

Environment:

Oracle Database 11g/12c, SQL Server 2016/2014/2008 R2, MySQL, Kalido DIW 8.5, Cognos, SSIS 2008 R2.

Responsibilities:

Understand data architectures and data models in SQL Server and Kalido DIW (Oracle Server), and interpret attributes and their relationships for reporting purposes and for the business.

Troubleshooting the production SQL Server Agent and Oracle jobs.

Working with SSIS packages to migrate data from Oracle Server to SQL Server.

Creating T-SQL views, procedures, functions, indexes, ETL workflows, etc. (a brief T-SQL sketch follows this list).

Responsible for data loading from end to end process for Evolve project.

Responsible for setting up the Oracle and SQL Server Database environments from backups for any new release testing.

Creating Dimensions and CBE’s in Kalido DIW interface.

Working with DBAs, BAs, QA, and business teams to gather requirements and implement them in the data warehouse.

Used the Kalido Staging loader and External loader methods in DIW to load data from Staging to Warehouse. Developed Kalido feeds to load Reference data and File Definitions to load Transaction Data sets.

Troubleshoot data warehouse errors and anomalies.

Perform coding and/or configuration to meet documented needs, using standard procedures and techniques.

Development activities:

o Provide input to architecture and design work sessions

o Data modeling

o Present status of development in meetings

o Prepare detailed technical and design documentation

o Coordinate work with other development team members

o Assist the project/application manager with release, configuration, and change management activities

Involved in the end-to-end ETL process to land data in the warehouse correctly and according to the business requirements, using SQL Server procedures, views, SSIS, Kalido, Oracle, MySQL, and flat files.

Handling the end-to-end data loading processes in the Development, QA, and Production environments.

Working closely with the Ad Sales Business Analysis group and Ad Sales users to help resolve Evolve warehouse issues.
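
As a small illustration of the T-SQL objects mentioned above (views, procedures, indexes), here is a hedged sketch; the table, view, procedure, and column names are hypothetical, not the actual Evolve objects.

-- Illustrative index, view, and procedure for a daily-revenue lookup.
CREATE NONCLUSTERED INDEX IX_FactAdSalesRevenue_AirDate
    ON dbo.FactAdSalesRevenue (AirDate)
    INCLUDE (NetworkCode, GrossAmount);
GO

CREATE VIEW dbo.vw_DailyNetworkRevenue
AS
SELECT NetworkCode,
       AirDate,
       SUM(GrossAmount) AS TotalGross
FROM   dbo.FactAdSalesRevenue
GROUP BY NetworkCode, AirDate;
GO

CREATE PROCEDURE dbo.usp_GetNetworkRevenue
    @NetworkCode VARCHAR(20),
    @FromDate    DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT AirDate, TotalGross
    FROM   dbo.vw_DailyNetworkRevenue
    WHERE  NetworkCode = @NetworkCode
      AND  AirDate >= @FromDate
    ORDER BY AirDate;
END;
GO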

BI Developer Sep 2012 – Mar 2014

Sunovion Pharmaceuticals Inc., Marlborough, MA

Sunovion is a leading pharmaceutical company dedicated to discovering, developing, and bringing to market therapeutic products that advance the science of medicine to improve the lives of patients. Sunovion specializes in treatments that help people challenged by disorders of the central nervous system and respiratory ailments.

Aggregate Spend project: a data mart project designed and implemented at Sunovion Pharmaceuticals to meet the compliance reporting mandated by the Physician Payments Sunshine Act, embodied in the Patient Protection and Affordable Care Act. This legislation requires drug, device, biological, and medical supply companies to report a wide range of gifts and payments to physicians and teaching hospitals in all states. Spend data is collected from various disparate sources, such as Salesforce Veeva and external vendors like Health Market Science (HMS), and stored in a Central Spend Hub repository designed using the Kalido Dynamic Information Warehouse. Master data is collected from HMS and stored in the Central Spend Hub repository. The project loaded the NPI data provided by CMS and built an extraction process that sent spend data to HMS for federal and state compliance reporting. An analytic approach was used so that Sunovion could have a holistic view across research and development, medical affairs, sales, marketing, and other functional groups to assess the value of dollars spent on certain HCP business activities.

Environment:

Oracle SQL Developer 11.2.0, HMS Complete Spend V3.0.0, Kalido DIW 9.0, Jasper Server Reporting Tool.

Responsibilities:

Understand data architectures and data models in Kalido DIW and BIM, and interpret attributes and their relationships for reporting purposes.

Creating Dimensions and CBE’s in Kalido DIW interface.

Used the Kalido Staging loader and External loader methods in DIW to load data from Staging to Warehouse. Developed Kalido feeds to load Reference data and File Definitions to load Transaction Data sets.

Performed Kalido migrations across multiple environments. Used KMXs for Sandbox-to-Dev migration and KMVs for QA and Prod migration.

Used the Kalido Business Information Modeler (BIM) to model business entities and their relationships in business-friendly terms. Published the business model to the Kalido Dynamic Information Warehouse (DIW).

Loaded the master data received from the external vendor into the Spend Hub repository.

Coordinate with source system owners, internally and externally, to resolve data integration issues and ensure the seamless flow of information into the HCP aggregate spend reporting process.

Provide support for solution reports and extracts.

Develop and configure business rules, build complex interfaces, and perform report writing and data cleansing.

Gather and document requirements for new development and minor enhancements.

Develop integration routines for HCP aggregate spend transactions per business rules and requirements.

Perform data investigation and issue resolution for all integration related items.

Develop and perform ad hoc queries using Oracle SQL to identify data inconsistencies (example queries after this list).

Understand the state and federal HCP aggregate spend reporting requirements.

Perform ad hoc data analysis and data provisioning for compliance and internal business units in support of their reporting needs.

Perform regression, integration and system testing, gap analysis and impact analysis.

Perform data load into the spend hub DW.

Responsible for creating test scripts, UAT scripts, design docs, and issue lists.
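
The queries below are a hedged example of the kind of ad hoc Oracle SQL used to identify data inconsistencies; the table and column names (spend_transactions, hcp_master) are hypothetical.

-- Spend rows whose HCP ID has no match in the master data (orphan check).
SELECT s.spend_id, s.hcp_id, s.spend_amount
FROM   spend_transactions s
LEFT JOIN hcp_master m
       ON m.hcp_id = s.hcp_id
WHERE  m.hcp_id IS NULL;

-- Duplicate spend rows for the same HCP, date, and amount.
SELECT hcp_id, spend_date, spend_amount, COUNT(*) AS dup_count
FROM   spend_transactions
GROUP BY hcp_id, spend_date, spend_amount
HAVING COUNT(*) > 1;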

BI Developer Aug 2011 – Sep 2012

Pacer International, Dublin, OH

Pacer International is a leading asset-light transportation and logistics services provider. Within North America, Pacer manages one of the most comprehensive double-stack intermodal networks, with more than 100,000 route miles of double-stack rail operations, integrated with a nationwide truck drayage network for door-to-door service delivery. Pacer accounts for about 20% of all domestic intermodal container moves in the U.S. and is the largest single provider of intermodal container services between the U.S. and Mexico.

Master Data Management (MDM) project: the objective of this project is to collect records, build a process to match records of the same kind to a single master record, and distribute that data throughout the organization to ensure management, consistency, and control in the ongoing maintenance and application use of this information.

Environment:

MS SQL Server 2008 and 2008 R2, Microsoft Visual Studio 2008 and 2008 R2, T-SQL, MS Excel, MS Access, Kalido MDM GA 9.0, Kalido DIW.

Responsibilities:

Design and Modeling of the Enterprise model across different systems of the company.

Gathering requirements and performing extraction, transformation, and loading (ETL).

Identified root cause of suspicious issues and suggested and implemented resolutions.

Met with management in modeling sessions to understand their requirements and implement them in the Master Data Management project.

Migrated MDM data from the Development environment to the QA environment.

Created table feeds for the incremental loads to run on a daily basis and scheduled the jobs in the VisualCron job scheduler.

Opened new KIRs and followed up on them with the Kalido support team to resolve issues.

Created tabular reports, matrix reports, list and chart reports, parameterized reports, subreports, ad hoc reports, drill-down reports, and interactive reports according to business requirements in a time-restricted environment.

Created SSRS Report Model Projects in BI studio and created, modified and managed various report models with multiple model objects, source fields and expressions.

Responsible for scheduling the subscription reports with the subscription report wizard.

Worked on performance tuning existing stored procedures and reports for faster report processing (see the tuning sketch after this list).
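
A brief sketch of the kind of tuning applied to slow report procedures; the table, columns, and index below are illustrative assumptions, not the actual Pacer objects.

-- Measure I/O and CPU for the slow statement before and after changes.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Rewrite a non-sargable date filter (e.g. a function wrapped around the column)
-- as a range predicate so the optimizer can seek on an index.
SELECT OrderID, CustomerID, ShipDate
FROM   dbo.IntermodalOrders
WHERE  ShipDate >= '2012-01-01'
  AND  ShipDate < '2012-02-01';

-- Covering index supporting the report's filter and output columns.
CREATE NONCLUSTERED INDEX IX_IntermodalOrders_ShipDate
    ON dbo.IntermodalOrders (ShipDate)
    INCLUDE (OrderID, CustomerID);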

References available on request.


