Data Architect

Location: Eden Prairie, MN
Posted: May 26, 2017


Seema Gopinath

(ac0h8m@r.postjobfree.com)

SUMMARY

1. Thirteen plus (13+) years of total IT experience in analysis, design, development, implementation, testing, and maintenance of business application systems

2. Executed and maintained projects aligned with the project management plan: project schedule, quality management, communication management, resource allocation, risk identification and mitigation, and closing procedures

3. Two (2) years of experience in Big Data design with the Talend tool, using Agile methodology to iterate through sprints

4. Effective at interviewing business users, stakeholders and subject matter experts (SMEs), eliciting needs, and translating those needs into clear and concise functional and non-functional requirements

5. Proven ability to track, investigate and resolve issues reported by business users and application support teams

6. Excellent oral and written communication skills with a positive and flexible attitude

7. Eight plus (8+) years of data warehousing and ETL experience using Informatica PowerCenter 8.5.x/7.1.x/6.2/6.1, Warehouse Designer, Source Analyzer, Mapping Designer, Transformation Developer, Mapplet Designer, OLAP, OLTP

8. Successfully performed multiple roles: Data Warehouse Architect, Data Modeler, Project Lead, Technical Lead, Business Analyst, and Programmer/Application Analyst

9. Experience in Informatica PowerCenter 8.5.x administration, including server setup, configuration, client installations, deployments, backups, repository management, server health and maintenance, performance monitoring and improvement, patching, and setting up ODBC connectivity to other databases

10. Mapping development using transformations such as Filter, Joiner, Router, Source Qualifier, Expression, Union, Update Strategy, Unconnected/Connected Lookup, and Aggregator

11. Experience with XML technologies (DTD, XSD) and with XML sources and targets

12. Five plus (5+) years of Business Intelligence experience using Business Objects XI R3/6.5/5.0, Designer, BO, WebI, Crystal 8.0, Crystal XI

13. Received awards for Best Project Delivered at GE Fleet and recommendations for various other projects

QUALIFICATIONS

Degree and Date | Institute | Major and Specialization
Master's in Computer Management, 1998-2000 | University of Pune | Computers
Bachelor's in Science, 1995-1998 | University of Pune | Chemistry

PROFESSIONAL MEMBERSHIP / CERTIFICATION

Professional Society / Certification | Member Since / Date Certified
PMP | Dec 2007
OCA | March 2006
GB Certified | July 2004
IEEE | October 2003

SKILLS

Project Management: Project planning, monitoring and management; resource planning; onsite-offshore coordination; client liaison; quality assurance; risk identification and mitigation

System Design & Development: Requirements collection and analysis; source system data analysis; ETL design; reporting environment design; development and testing; UAT; implementation

Languages, Software and Tools: Informatica PowerCenter 8.5.1/8.1.1/7.1.x; Business Objects 6.5/XI/R2; Oracle Applications 9i; SQL Server 7.0/2000; SQL, PL/SQL; Crystal 8.0/XI; VB/ASP; Unix shell scripting; Toad; SQL Navigator; PuTTY; WS-FTP

RDBMS: Oracle 10g/9i/8i; SQL Server 7.0/2000; Sybase 12.5; MS Access

Operating Systems: Unix (Solaris); Windows 2000

PROFESSIONAL SUMMARY

OptumHealth, Minneapolis, MN Sept 16 – Ongoing

Data warehouse Architect/Big Data Architect

Optum is integrating a new Master Data Management capability, built on Big Data, into its systems. This will give all downstream applications a standard view of the entities. The business chose to implement it with the open-source tool Talend, using Agile methodology.

Responsibilities:

Integrate state-of-the-art Big Data technologies into the overall architecture.

Lead the development team through the construction, testing and implementation phases.

Optimize data transformation processes on the Hadoop and Big Data platform.

Design and implement fast, efficient data acquisition using Big Data processing techniques and tools.

Designed a Data Profiling Assistant Tool using Excel macros (representative profiling checks are sketched after this entry).

Environment: Erwin Data Modeler, Talend, Hive, Pig and Hadoop Streaming
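
For illustration, the kinds of checks such a profiling assistant automates can be sketched in SQL as follows (the src_member table and its columns are hypothetical, not from this project):

    -- Profile a source table: row count, null count, distinct count, value range.
    SELECT COUNT(*)                                           AS total_rows,
           SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END) AS member_id_nulls,
           COUNT(DISTINCT member_id)                          AS member_id_distinct,
           MIN(birth_dt)                                      AS birth_dt_min,
           MAX(birth_dt)                                      AS birth_dt_max
    FROM   src_member;

    -- Frequency distribution of a low-cardinality column, to spot invalid codes.
    SELECT gender_cd, COUNT(*) AS row_cnt
    FROM   src_member
    GROUP  BY gender_cd
    ORDER  BY row_cnt DESC;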

OptumHealth, Minneapolis, MN Jun 16 – Sept 16

Data warehouse Architect

The Fraud BI Data Mart & Reporting Portal provides a user-friendly reporting environment that supports generation of standard Cognos FWA dashboards and reports as well as ad-hoc reporting. The existing reporting suffered serious performance issues when querying and retrieving data, so the team conducted a proof of concept (POC) to evaluate options and settle on the best design.

Responsibilities:

Played a key role in designing the star schema approach (a minimal DDL sketch follows this entry)

Recommended performance improvement techniques such as partitioning, indexing and archival

Worked closely with the Development and QA teams to deliver the best design approach on time

Maintained a close working relationship with the Development team for on-time delivery

Achieved substantial improvement on the reporting side with this approach

Environment: Erwin Data Modeler, Informatica Power Center 8.5.1, Oracle 11g, Visio, TOAD
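
A minimal DDL sketch of the star schema and partitioning approach described above (the fwa_claim_fact table, its columns and the index names are illustrative assumptions, not project artifacts):

    -- Fact table range-partitioned by month so report queries prune partitions.
    CREATE TABLE fwa_claim_fact (
        claim_id     NUMBER       NOT NULL,
        provider_key NUMBER       NOT NULL,  -- FK to provider_dim
        member_key   NUMBER       NOT NULL,  -- FK to member_dim
        service_dt   DATE         NOT NULL,
        paid_amt     NUMBER(12,2)
    )
    PARTITION BY RANGE (service_dt)
    INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
    (PARTITION p_initial VALUES LESS THAN (DATE '2015-01-01'));

    -- Local bitmap indexes on dimension keys enable Oracle star transformation.
    CREATE BITMAP INDEX fwa_fact_prov_bix ON fwa_claim_fact (provider_key) LOCAL;
    CREATE BITMAP INDEX fwa_fact_mem_bix  ON fwa_claim_fact (member_key) LOCAL;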

OptumHealth, Minneapolis, MN Dec 15 – Jun 16

Data Warehouse Architect

As part of the overall HC3 Decommission Initiative, the Eligibility Program will improve and standardize the technology and processes used to manage Product, Client and Member data, in order to provide accurate and timely information on the products and services for which a member is eligible.

The HMS Eligibility Program will ultimately sunset HC3 Facets in favor of new capabilities built into CDB, EEMS, and BOSS, enabling real-time reporting and customer service updates.

Responsibilities:

Lead Data Architect on the Decommission Project.

Attend JAD sessions with business users and SMEs to gather requirements.

Drive integration among the various source applications.

Work in conjunction with the Data Analyst to understand the flow and integration among various sources.

Work with the Development team on Big Data concepts, implemented for the first time in this project.

Develop Erwin Data Models and Data Mappings.

Environment: Erwin Data Modeler, Big Data, Oracle 11g, Visio, TOAD

OptumHealth, Minneapolis, MN Mar 11 – Ongoing

Data warehouse Architect

OptumHealth is creating a Clinical Reporting Platform (CRP) to streamline the capture, integration and analysis of clinical data. The resulting enterprise asset will be the one source used company-wide for all reporting, operational and product management alike. The CRP will be used to collect, organize, enrich, and publish clinical service information and intelligence to support clinical operational and strategic decisions, and will serve as the source of truth for clinical information, promoting platform consolidation, improved data quality and consistency, increased data familiarity across the enterprise, and improved clinical effectiveness.

Responsibilities:

Develop/Update and manage Data Models

Analyze complex data requirements and develop database designs

Attend JAD sessions with business users and SMEs

Research project needs independently and correlate them to similar concepts

Profile the data for accuracy and reliability

Assist the application team in determination and clarification of business data requirements

Identify problems at the data modeling/design level and recommend solutions

Maintain and enforce data architecture/administration standards, as well as standardization of column name abbreviations, domains and attributes

Develop Data Mappings for the Development Team

Attend QA meetings

Environment: Erwin Data Modeler, Informatica Power Center 8.5.1, Oracle 11g, Visio, TOAD

ShopNBC, Minneapolis, MN Nov 10 – Mar 11

Data warehouse Architect & BO Developer

The ShopNBC Merchandising team asked the Data Warehouse team to provide operational reports that would help them understand the business better, in particular the sales done by the business across its various merchandise hierarchies. Meetings were held with key business users to discuss and agree on the suite of reports that would fulfill their needs.

Responsibilities:

Conducted weekly meetings with business users

Gathered and documented key requirements

Designed and built mock-up report templates

Involved in Universe design

Generated functional and technical documents for the reports

Created 35 BO reports

Wrote SQL scripts to validate the accuracy of BO report data (a representative validation query follows this entry)

Environment: Informatica Power Center 8.5.1, Business Object XIR3, Oracle 11g, Visio, TOAD
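
A representative example of the kind of validation script mentioned above (the sales_fact and bo_report_extract tables are hypothetical stand-ins for the actual report and warehouse objects):

    -- Reconcile daily BO report totals against the warehouse fact table.
    -- Any row returned flags a date whose report total drifts from the warehouse.
    SELECT f.sale_dt,
           SUM(f.sale_amt) AS warehouse_total,
           r.report_total
    FROM   sales_fact f
    JOIN   bo_report_extract r ON r.sale_dt = f.sale_dt
    GROUP  BY f.sale_dt, r.report_total
    HAVING SUM(f.sale_amt) <> r.report_total;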

ShopNBC, Minneapolis, MN Sept 08 – Sept 10

Data warehouse Architect

The ShopNBC IT Data Warehouse team provided first-of-its-kind analytical capabilities to Customer Service decision makers. A user-friendly Customer Service dashboard was designed to give users easy access to enterprise data, supporting proactive decision making. At a glance, executives and managers can understand and monitor business drivers through a range of insightful data visualizations such as key performance indicators and graphs. Overall, it allows Customer Service to measure and monitor progress through common goals and metrics.

Data Modeling:

Involved in the data modeling process for the data mart

Created Visio diagrams for entity relationships, process flows and business processes

Implemented a process for data profiling

Met with the ERP team to document data source elements and metadata

Involved in integration of data from various heterogeneous systems

Identified performance bottlenecks and tuned SQL queries

ETL

o Established and executed the ETL architecture: staging tables, archive tables, target tables, and data extraction practices.

o Used SCD Type 2 dimensions (a best practice, though complex); a minimal load sketch follows this entry.

o Incorporated early-arriving dimensions, audit tracking and error handling using Informatica features.

o Enabled new Informatica version features.

o Fundamentally changed the ETL approach from the way data had been loaded in the past, including:

- MD5 hashing (a unique hash used for comparing records); proof of concept done
- Bulk loading; proof of concept done
- Caching techniques
- Early-arriving dimensions
- Audit trails
- Error handling
- Choice of approach for extracting data from source systems, and table design
- "Who" (audit) columns in ETL
- Ability to trace loads from the target back to the source

o Kept performance in mind throughout the design.

o Implemented revolving 90-day archiving of data for research.

Environment: Informatica Power Center 8.5.1, Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Reporting Services (SSRS), Microsoft PerformancePoint Server, Oracle 11g, Visio, TOAD
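
A minimal sketch of the SCD Type 2 load with MD5 comparison described above (the customer_dim, stg_customer and etl_archive tables, their columns, and the sequence are hypothetical, not project artifacts; the MD5 hash is assumed to be computed by the ETL and stored in both tables):

    -- Step 1: close out the current dimension row when the incoming MD5 differs.
    UPDATE customer_dim d
    SET    d.current_flag = 'N',
           d.effective_end_dt = SYSDATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.md5_hash <> d.md5_hash);

    -- Step 2: insert a new current row for any customer without an open row;
    -- this covers brand-new customers and the just-closed changed ones.
    INSERT INTO customer_dim
           (customer_key, customer_id, customer_name, md5_hash,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.md5_hash,
           SYSDATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   customer_dim d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');

    -- Revolving 90-day archive: purge research copies older than the window.
    DELETE FROM etl_archive WHERE load_dt < SYSDATE - 90;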


Additional Experience in Computer Programming and Database Administration, 2001 to 2007


