Resume

Data Analyst

Location:
St. Louis, MO
Posted:
March 30, 2020

Shiva Ram

adcjk9@r.postjobfree.com 657-***-****

Data Modeler/ Data Architect

PROFESSIONAL SUMMARY:

Around 8 years of total IT experience, with expertise in Data Modeling for Data Warehouse/Data Mart development, Data Analysis, SQL, and developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.

Experience working with data modeling tools like Erwin, Power Designer and ER Studio.

Experience in designing Star and Snowflake schemas for Data Warehouse and ODS architectures.

Experience in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.

Designed the Data Marts in dimensional data modeling using star and snowflake schemas.

Well versed in Normalization / Denormalization techniques for optimum performance in relational and dimensional database environments.

Very proficient with Data Analysis, mapping source and target systems for data migration efforts and resolving issues relating to data migration.

Experienced with integration of External Data with SAP in Business Warehouse.

Experience in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.

Knowledge in analyzing data using SQL, HiveQL and Pig Latin.

Experience in migration of AS400 DB2 to SQL Server.

Strong experience in designing and deploying distributed Big Data applications using open-source frameworks such as Apache Spark, Spring Boot, end-to-end ETL, and Kafka on the AWS Cloud.

Experienced in modeling OLTP and OLAP systems using tools like Erwin r9.6/9.1/r8/r7.1/7.2, Sybase PowerDesigner 12.1 and ER Studio.

Good in Normalization/Denormalization procedures for effective and optimum performance in OLTP and OLAP environments.

Possess good exposure to Normalization (1NF, 2NF and 3NF) and De-normalization processes for improved database performance in OLTP, OLAP, Data Warehouse and Data Mart environments. Strong expertise in data warehouse architecture and Star, Snowflake Database schemas, Fact and dimensional tables.

Skilled in using and maintaining metadata management tools for business and technical metadata.

Expert in writing SQL queries and optimizing the queries in Oracle 9i/10g/11g, SQL Server 2005/2008/2012/2014.

Extensive experience of development of Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.

Experience in building reports using SQL Server Reporting Services and Crystal Reports

Experience in creating and documenting metadata for OLTP and OLAP when designing systems.

Extensive experience in development of T-SQL, DTS, OLAP, PL/SQL, stored procedures, triggers, functions, packages, and performance tuning and optimization for business logic implementation.

Proficient in SQL across a number of dialects, including MySQL, PostgreSQL, Redshift, SQL Server, and Oracle.

Experience in unit testing/validating the Source, Stage and Target (End-to-End).

Expert in analyzing errors causing ETL load failures by reference to the log files and reporting them to the corresponding team for rectification.

Strong knowledge of working with Data Marts (Star Schema and Snowflake Schema)

Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features

Experienced with Data Quality Management, Metadata Management, Master Data Management, process Modeling, Data Dictionary, Data Stewardship, Data Profiling, Data Quality, and Data Model Standards.

Experienced in working with the Data Resource Management (DRM) and Enterprise Metadata teams.

Well versed with Data profiling, Data Migration and Data Conversions.
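The normalization/denormalization trade-off highlighted above can be sketched minimally with SQLite; all table and column names here are hypothetical illustrations, not drawn from any specific engagement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized (3NF) OLTP-style tables: each customer fact lives in one place.
cur.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id),
    amount      REAL
);
""")
cur.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                [(1, "Acme", "St. Louis"), (2, "Globex", "Chicago")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# Denormalized summary table for the reporting layer: pre-joined and
# pre-aggregated so dashboards avoid repeating the join at query time.
cur.execute("""
CREATE TABLE customer_order_summary AS
SELECT c.customer_id, c.name, c.city,
       COUNT(o.order_id) AS order_count,
       SUM(o.amount)     AS total_amount
FROM customer c JOIN orders o ON o.customer_id = c.customer_id
GROUP BY c.customer_id, c.name, c.city
""")

rows = cur.execute(
    "SELECT name, order_count, total_amount "
    "FROM customer_order_summary ORDER BY name").fetchall()
print(rows)  # [('Acme', 2, 350.0), ('Globex', 1, 75.0)]
```

The same pattern scales to warehouse fact/dimension schemas: the normalized side serves OLTP writes, while the denormalized summary serves OLAP reads.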

TECHNICAL SKILLS:

Frameworks and Methodologies

Data Modeling (ER & Dimensional), Database Design, Requirement Analysis, ETL Design, Object Oriented Design, Development, Testing, Data Mapping, Metadata Management, Master Data Management, Data Profiling, Deployment, Documentation, Project Management, Semantic Layer.

Modeling Tools

Erwin 8.x/9.x, ER Studio 9.x, PowerDesigner

Reporting Tools:

Cognos 8/10, Oracle Reports, Tableau, Business Objects, IDQ.

Other Tools:

SQL Navigator, TOAD, T-SQL, Informatica Power Center, Ab Initio, Teradata SQL Assistant

Databases

Oracle 7/8i/9i/10g/11g, SQL Server 2000/2005/2008/2012, Sybase, Teradata, DB2, PostgreSQL.

Languages

Python, VB, Java, Perl, PHP, PL/SQL and SQL.

Operating Systems

Windows NT, 2000 & XP, Vista, UNIX and Linux.

Education:

Bachelor's in Information Science from Visvesvaraya Technological University, Bangalore, 2013

Master's in Computer Science from University of Illinois, Springfield, 2016

PROFESSIONAL EXPERIENCE:

CASS Information Systems, St. Louis, MO Feb’19 – Present

Implementation: Cognizant

Sr. Data Modeler/Architect

Roles & Responsibilities:

Interacted with Business users and gathered business requirements from account managers and UI (User Interface) of the existing AS400 system.

Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.

Set up processes to acquire data in structured format and implemented tools/processes to consume the data received, as batch process or real-time.

Created the logical data model from the conceptual model, validated it against responses to the business analyst's questionnaire, and converted it into the physical database design.

Conducted and participated in Database design review meetings.

Performed analysis of resource intensive query and applied adjustments to existing back-end code (stored procedures, triggers) to speed up report execution

Extensively handled documentation of Data Model, Mapping, Transformations and Scheduling jobs.

Analyzed the AS400 system and migrated multilevel client schemas to a single structure.

Normalized standalone tables of the legacy system (DB2) into SQL Server and defined cardinality between all tables.

Prepared High Level Logical Data Models using Erwin, and later translated the model into physical model using the Forward Engineering technique.

Performed data profiling and data quality checks.

Created security processes and controls to limit access to the data.

Responsible for defining the key identifiers for each mapping/interface.

Worked with the client and offshore team to make sure that the reports and dashboards were delivered on time.

Worked with DBAs to create the physical model and tables. Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning and indexing schemes case by case for the facts and dimensions.

Created the dimensional logical model with approximately 10 facts, 30 dimensions with 500 attributes using ER Studio.

Worked with ETL to create source to target mappings.

Conducted final review with client and resolved issues from the participants, prepared and submitted Test Analysis Reports and participated in Product Readiness Review.
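The data-profiling step mentioned in this role can be sketched as a simple per-column scan; this is an illustrative pattern only, with hypothetical table and column names, using SQLite rather than the DB2/SQL Server stack of the actual project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stage_client (client_id INTEGER, name TEXT, region TEXT);
INSERT INTO stage_client VALUES (1, 'A', 'MW'), (2, NULL, 'MW'), (3, 'C', NULL);
""")

def profile(conn, table, columns):
    """Per-column row count, null count, and distinct count -- the kind of
    profile used to judge a source column before mapping it to a target."""
    stats = {}
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    for col in columns:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        distinct = conn.execute(
            f"SELECT COUNT(DISTINCT {col}) FROM {table}").fetchone()[0]
        stats[col] = {"rows": total, "nulls": nulls, "distinct": distinct}
    return stats

stats = profile(conn, "stage_client", ["client_id", "name", "region"])
print(stats["name"])  # {'rows': 3, 'nulls': 1, 'distinct': 2}
```

Profiles like this flag null-heavy or low-cardinality source columns before the source-to-target mapping is finalized.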

Environment: Erwin r9, DB2, AS400, Microsoft Visio, Jira, SSIS, SQL Server 2012.

National Institutes of Health, Bethesda, MD July’18 – Jan’19

Sr. Data Modeler/ Architect

Roles & Responsibilities:

Gathered requirements, analyzed and wrote the design documents.

Interacted with business users and studied available documents/application to learn the requirements.

Created conceptual and logical models and logical entities, and defined attributes and relationships between the various data objects.

Used Erwin r7.0 for creating tables using Forward Engineering

Conducted Design discussions and meetings to agree on the appropriate Data Model

Designed complex data model using database tool Erwin r7.0

Used ER Studio to make logical and physical data models for enterprise wide OLAP system.

Developed Star and Snowflake schema-based dimensional models to grow the data warehouse.

Identified and tracked slowly changing dimensions and determined the hierarchies.

Performed File system management and monitoring on Hadoop log files.

Responsible for managing the department's lifecycle, including compliance initiatives, group practice agreements, and healthcare bill-claim strategies.

Worked on converting data stored in flat files into Oracle tables.

Applied Normalization techniques and created Logical and Physical models based on the requirements.

Designed and implemented Incremental Imports into Hive tables and writing Hive queries to run on TEZ.

Used DataFrames and RDDs for data transformations.

Conducted and participated in Database design review meetings.

Prepared enterprise naming standard files, as well as project-specific naming standard files for exception cases.

Worked with the Enterprise Architect team in developing Enterprise Data Models, which are used by most of the applications.

Worked with the MDM team to change the database to accommodate requirements, and captured demographic information from different applications by providing source application data dictionaries, identifying the demographic information, and providing DDL.

Created, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.

Actively took part in data mapping activities for the Data Warehouse.

Created summary tables using denormalization techniques to improve complex join operations.

Generated comprehensive analytical reports by running SQL/PostgreSQL queries against current databases to conduct data analysis linked to loan products.

Participated in knowledge-migration tasks from the legacy database system to the new one.

Worked on Metadata transfer among various proprietary systems using XML.

Conducted Design reviews with business analysts, content developers and DBAs

Designed and implemented traditional data models to manage marketing strategies and dynamically satisfy reporting needs.

Organized User Acceptance Testing (UAT), conducted presentations, and provided support to help business users become familiar with the banking applications.

Handled performance requirements for databases in OLTP and OLAP models.

Performed Data Loading and Transformation using Data Junction 6.0
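The incremental-import pattern described above (load only rows past the warehouse's high-water mark) can be sketched outside Hive with SQLite; table and column names are illustrative assumptions, not from the actual system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source_events (event_id INTEGER PRIMARY KEY, payload TEXT);
CREATE TABLE dw_events     (event_id INTEGER PRIMARY KEY, payload TEXT);
INSERT INTO source_events VALUES (1, 'a'), (2, 'b'), (3, 'c');
INSERT INTO dw_events     VALUES (1, 'a');  -- already loaded in a prior run
""")

def incremental_load(conn):
    # High-water mark: the largest key already present in the warehouse table.
    hwm = conn.execute(
        "SELECT COALESCE(MAX(event_id), 0) FROM dw_events").fetchone()[0]
    # Pull only the delta past the mark, as an incremental Hive import would.
    conn.execute(
        "INSERT INTO dw_events SELECT * FROM source_events WHERE event_id > ?",
        (hwm,))
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM dw_events").fetchone()[0]

print(incremental_load(conn))  # 3: rows 2 and 3 were appended
```

Because the delta is computed from the target's own maximum key, re-running the load appends nothing, which is what makes the job safe to schedule repeatedly.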

Environment: ER Studio, Erwin r7.0, SQL Server 2000/2005, SSIS, Windows XP/NT/2000, Oracle 8i, MS-DTS, UML, Hadoop, Hive 2.3, PIG, Impala 2.1, SQL Loader, Data Junction, XML files, Rational Rose 2000, TERADATA, Oracle JD Edwards EnterpriseOne 8.12

Thermo Fisher Scientific, Pittsburgh, PA Jan’17 – June’18

Data Analyst / Data Modeler

Roles & Responsibilities:

Gathered requirements, analyzed and wrote the design documents.

Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.

Prepared High Level Logical Data Models using Erwin, and later translated the model into physical model using the Forward Engineering technique.

Responsible for using MDM to decide how to manage the master data's life cycle, cardinality and complexity, and for collecting and analyzing metadata about the master data.

Performed data profiling and data quality checks.

Responsible for defining the key identifiers for each mapping/interface.

Created Technical specifications documents based on the functional design document for the ETL coding to build the data mart.

Extensively involved in Data Extraction, Transformation and Loading (ETL process) from Source to target systems using Informatica Power Center 9.1

Documented, clarified, and communicated change requests with the requestor and coordinated with the development and testing teams.

Used SAP for integrating external data with BW.

Reverse engineered all the source databases using Erwin.

Worked with internal architects in the development of current and target state data architectures

Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin r9.

Worked on multiple implementations on corporate sponsored and variable universal life insurance section of their web based application.

Created DDL scripts for implementing data modeling changes. Created Erwin reports in HTML and RTF formats depending upon the requirement, published the data model in Model Mart, created naming convention files, and coordinated with DBAs to apply the data model changes.

Used Model Mart of Erwin r9 for effective model management of sharing, dividing and reusing model information and design for productivity improvement.

Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information

Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).

Verified the correct authoritative sources were being used and the extract, transform and load (ETL) routines would not compromise the integrity of the source data

Used data analysis techniques to validate business rules and identify low quality missing data in the existing data.

Environment: Oracle 10g/11g, Erwin 9.1, UDB, Hadoop, Hive 2.3, PIG, Impala 2.1, Informatica PowerCenter 9.1,SAP, MS Access 2007, MS Excel 2007, MS Word 2007, MS Outlook 2007, Teradata, Crystal Reports, PowerPoint 2007.

Neos LLC, Hartford, CT Dec’15-Dec’16

Data Analyst/ Data Modeler

Roles & Responsibilities:

Gathered business requirements through interviews, surveys, prototyping and observing from account managers and UI (User Interface) of the existing Broker Portal system.

Created the logical data model from the conceptual model, validated it against responses to the business analyst's questionnaire, and converted it into the physical database design.

Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.

Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.

Created dimensional model with fact and dimensions as product, customer, time and geographic dimension.

Created standard abbreviation document in line with business standard.

Used IBM UDM for consolidating and analyzing data to make informed decisions and turn insights into actions.

Led multiple project teams of technical professionals through all phases of the SDLC using technologies including Oracle, Erwin, DataStage, Data Warehousing, WebSphere and Cognos.

Created and maintained the Logical Data Model (LDM) and Physical Data Model for the system, including documentation of most entities, attributes, data relationships, primary and foreign key relationships, allowed values, codes, business rules, glossary terms, etc.

Responsible for designing and introducing new fact and dimension tables into the existing model.

Extensively handled documentation of Data Model, Mapping, Transformations and Scheduling jobs.

Responsible for data mapping and writing transformation rules

Worked with the DBA group to create a best-fit Physical Data Model from the Logical Data Model using forward engineering in Erwin.

Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW)

Generated the first cut physical design.

Developed data mapping documents between Legacy, Production, and User Interface Systems.

Involved in creating job schedules to automate the ETL process.

Conducted User Acceptance Testing on the application – resolved issues from the participants, prepared and submitted Test Analysis Reports and participated in Product Readiness Review

Environment: Windows NT 4.0, UNIX, Oracle 8i, SQL Server 2003, Erwin, PowerDesigner, Sequential Files.

ARIS Global, Bangalore, India Jan’13 – July’15

Data Analyst/ Data Modeler

Roles & Responsibilities:

Interacted with users and business analysts to accumulate requirements.

Understood the existing data model and documented expected design changes affecting system performance.

Initiated and conducted JAD sessions, inviting various teams to finalize the necessary data fields and their formats.

Developed Logical and Physical Data Models using Erwin r7.0/8.

Created the logical data model from the conceptual model and converted it into the physical database design.

Extensively used Star Schema and Snowflake Schema methodologies in building and designing the logical data model into dimensional models.

Applied second and third normal form (2NF, 3NF) to normalize the existing ER data model of the OLTP system.

Used denormalization techniques in the DSS to build summary tables and improve complex join operations.

Gathered statistics on large tables and redesigned Indexes.

Installed and configured SQL Server.

Designed and developed the databases

Wrote stored procedures and triggers extensively, working closely with developers, business analysts and customers to generate various audit reports and troubleshoot their query and connectivity problems.

Created integrity rules and defaults.

Performed analysis of resource intensive query and applied adjustments to existing back-end code (stored procedures, triggers) to speed up report execution

Created ftp connections, database connections for sources and targets.

Maintained security and data integrity of the database.

Developed several forms & reports using Crystal Reports.

Provided maintenance support to customized reports created in Crystal Reports/ASP.

Mapped the data between sources and targets. Generated reports from data models.

Reviewed data model together with the functional and technical team.

Assisted the ETL staff, developers and users to understand the data model.

Maintained an alteration log for each data model created.

Created SQL code from data models and interacted with DBAs to build the development, testing and production databases.

Implemented database objects (like indexes, partitions in oracle database) for performance.

Interacted with DBA to discuss database design and modeling, index creations and SQL tuning issues.
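The trigger-driven audit reporting described in this role can be sketched in SQLite (the actual work used Oracle and SQL Server stored procedures; the table and trigger names below are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE account (account_id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE account_audit (
    account_id  INTEGER,
    old_balance REAL,
    new_balance REAL,
    changed_at  TEXT DEFAULT (datetime('now'))
);
-- Trigger: every balance update leaves a row in the audit table,
-- which audit reports are then generated from.
CREATE TRIGGER trg_account_audit AFTER UPDATE OF balance ON account
BEGIN
    INSERT INTO account_audit (account_id, old_balance, new_balance)
    VALUES (OLD.account_id, OLD.balance, NEW.balance);
END;
INSERT INTO account VALUES (1, 100.0);
UPDATE account SET balance = 150.0 WHERE account_id = 1;
""")

audit = conn.execute(
    "SELECT account_id, old_balance, new_balance FROM account_audit").fetchall()
print(audit)  # [(1, 100.0, 150.0)]
```

Keeping the audit write inside the trigger guarantees a history row for every update path, whether the change came from an application, a batch job, or an ad-hoc statement.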

Environment: Erwin r7.0/r8, Oracle 8i, SQL Navigator, PL/SQL, Pro DB2, SQL Server 2008, MS SQL Server Analysis Manager, Sybase, Rational Requisite, Windows NT, Crystal Reports.


