Resume


Data Architect/Data Modeler

Location:
Collierville, Tennessee, United States
Posted:
January 23, 2017


Cover Page

Surekha Kanapuram

https://www.linkedin.com/in/surekha-kanapuram-23716397

917-***-****

acye4u@r.postjobfree.com

IT professional with a history of high-level Data Architecture, Data Modeling, Data Analysis, ETL, Business Intelligence and Software Development Life Cycle experience at leading IT organizations, seeking an opportunity to help organizations accomplish their strategic goals. Over the past 12 years, I have demonstrated exceptional architecture, design and modeling skills, leveraging a range of technologies to develop and support outstanding Business Intelligence solutions.

Recent accomplishments:

Reporting Automation for the Angel Vivity Project (Anthem Blue Cross and seven top Southern California hospital systems have partnered to offer an integrated health system called Anthem Blue Cross Vivity; report automation for this collaboration eliminates the manual reporting effort, resulting in cost savings). As part of this project, I designed atomic source tables and analytical dimension, fact and reference tables to support automated report creation.

Build-out of Enterprise Data Mart (EDM) 3.3 (EDM is intended to leverage the integration work accomplished by the Enterprise Data Warehouse Reporting DataStore (EDWARD) over the large number of Anthem source systems; EDM's objective is to stop the proliferation of raw data used by analytical applications, especially claim-line-level data, and to store only the highest return-on-investment (ROI) data). As part of this project, I worked on enhancements to EDM 3.2, updating and adding Processing, Reference, Dimension and Fact tables across five data models (AIMS/PAS/RDM/DIM/EDM) designed as part of a five-layer data flow architecture.

Enterprise Data Quality subject area build-out (an enterprise database design solution to create, store and maintain data quality rules and results). Involved in creating the conceptual data model and Semantic Layer views.

Developed and implemented the State Mandate All Payer Claims Database extraction project. Designed tables and Reference Data Management (RDM) views to support the automated extraction of data from the Enterprise Data Warehouse through the RDM interface. The extracts are designed, generated and sent per each state's mandate requirements.

Worked on a conceptual model for One View of Customer, capturing all AutoZone abstract Party entities, Party roles, associations and hierarchies under two main entities, Person and Organization. Registered Customer, Unregistered Customer, Prospect Customer, Employee, Vendor, etc., are ordered under their respective main entities.

Event Notification Services – when an event occurs, each interested/registered Party with appropriate access privileges and authorizations is notified through its registered Contact Methods.

Led the Enterprise HEDIS project to create a strategic solution (data, application and web) across the enterprise for the annual HEDIS NCQA/Medical Record Review accreditation processes. This data is also expected to be leveraged for other quality initiatives by providing a grain of data that allows analysts to drill down to a detailed level if needed, retiring legacy and regional solutions.

Designed the Finance Product Analytics Data Mart, which pulls cost-share and non-cost-share data from the Enterprise Benefit model, cleanses and transforms the data, and loads it into staging tables, then loads dimensions and facts by extracting data from the relational tables.

To conclude, I have the aptitude and zeal to continue demonstrating my Data Architecture, Data Modeling and Data Analysis skills, and I believe I can make significant contributions to your organizational goals. At the same time, I desire to grow intellectually and technically, and I treasure the new experiences that will come along the road. I look forward to having the opportunity to exhibit this passion and serve your company in a Data Architect/Data Modeler/Data Analyst role.

Resume

Surekha Kanapuram

https://www.linkedin.com/in/surekha-kanapuram-23716397

917-***-****

acye4u@r.postjobfree.com

Data Architect/Data Modeler

Data Architect/Data Modeler with proven proficiency across various roles, technologies and IT companies, supporting Enterprise Data Warehousing in the Insurance, Finance, Healthcare and Retail domains, working toward analytical business goals. Experience includes end-to-end Data Warehouse life cycle design and implementation, focusing mainly on the Data Architecture, Data Modeling and Data Analysis phases. Competency in defining business functions as entities with comprehensive metadata, enabling Data Warehouses to provide long-term stability, reuse and reliability. Dynamic, dedicated, goal-oriented and highly motivated, with strong time-management and problem-solving skills; a fast learner, able to work well at all levels.

SKILLS ABSTRACT

Database Architecture and Design

Data Modeling, Data Analysis, Data Profiling

ETL and Business Intelligence

Software Development Life Cycle Process

SUMMARY

Expertise in reviewing business requirements and technical design documents to develop effective database design solutions.

Develop architecture plan to improve the quality of the data and integrate diverse source data to achieve business objectives.

Collaborate with IT colleagues to identify user requirements, assess available technologies, and recommend new data fit and solution options, technical strategies and tactics.

Identify data architecture problems, resolutions, changes and upgrades, and communicate them across teams.

Experience in designing Enterprise Data Warehouses, Data Marts, Reporting data stores (RDS) and Operational data stores (ODS). Strong understanding of Data Warehousing and Data Modeling concepts.

Expertise in defining, creating and maintaining Conceptual (business scope), Logical (business process) and Physical (technical solution) data models using Entity-Relationship (ER) modeling (Normalization, 3NF), Dimensional modeling (Star Schema and Snowflake Schema) and Object-Oriented modeling.
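To make the dimensional-modeling terms above concrete, here is a minimal star schema sketch. It uses SQLite via Python purely for illustration; the table and column names are a hypothetical retail example, not drawn from any project described in this resume.

```python
import sqlite3

# Hypothetical star schema: one fact table at a fixed grain,
# surrounded by dimension tables joined on surrogate keys.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,      -- surrogate key
    calendar_date TEXT NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    sales_amount REAL NOT NULL         -- additive measure at the grain
);
""")
cur.execute("INSERT INTO dim_date VALUES (20170123, '2017-01-23')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20170123, 1, 9.99)")

# A typical star-join query: constrain a dimension, aggregate the fact.
cur.execute("""
SELECT p.product_name, SUM(f.sales_amount)
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY p.product_name
""")
rows = cur.fetchall()
print(rows)  # [('Widget', 9.99)]
```

A snowflake variant would further normalize the dimensions (e.g., splitting product category out of dim_product); the star form keeps each dimension denormalized for simpler joins.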

Hands on experience working and applying design considerations of 3rd Normal Form for OLTP Databases and Models.

Expertise in physical modeling: creating PKs, FKs, indexes, partitioning, views, landing-zone tables, conformed stage tables, ready-zone tables, work tables, one-time-fix tables, data-exchange tables, ELDM reports, join reports, etc.

Expertise in customizing physical models to various database platforms such as Teradata, DB2, Oracle and Greenplum.

Hands on experience using Erwin, Embarcadero ER/Studio, Power Designer, Oracle SQL Developer Data Modeler 4.0.3 and Rational Rose modeling tools.

Experience in design and maintenance of Enterprise data warehouse, data management, metadata management, master data management, data element modeling, naming standards, data extraction and data archival processes, across multiple database platforms.

Expertise in data requirement validation by performing business rules traceability analysis and gap analysis.

Expertise in analyzing client requirements and drafting business specifications using Use-case scenarios, Unified Modeling Language (UML) and Rational Rose modeling.

Able to recognize and identify patterns in data relationships from disparate systems.

Experience working in JAD sessions with business, IT and project teams, solution architects, DBAs and industry consultants to design, execute and implement BI data stores.

Experience in implementing direct insourcing projects, sourcing and data integration from multiple regional source systems to Enterprise Data Warehouse.

Experience in Agile/Scrum, iterative and waterfall SDLC methodologies.

Experience in drafting data modeling standards, guidelines, best practices and process improvements based on experience and new technologies. Contributed thoughts and shared knowledge through presentations and documents, and by facilitating Data Architect tech meetings. Mentored and assisted new joiners and junior Data Modelers.

Drive innovations by keeping up to date on emerging technologies and Data Trends which may fit with the company’s project needs.

Experience working with Business Intelligence tools such as Business Objects, SQL Server Reporting Services and Cognos.

Experience working on various Database platforms like Teradata, Oracle, SQL Server and DB2.

Experience working with the ETL tools Informatica and IBM InfoSphere DataStage: creating source-to-target mapping documents; creating mappings, sessions, workflows, reusable mapplets and worklets; scheduling; loading data from source to target; creating staging-area tables as needed; and writing and executing ETL test cases. Experience scheduling sequence and parallel jobs using the Informatica scheduler, DataStage Director, UNIX scripts and scheduling tools.
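A source-to-target mapping document, as mentioned above, can be thought of as data: each target column paired with its source column and transformation rule. The sketch below expresses that idea in plain Python with made-up field names; it is not an Informatica or DataStage artifact, just an illustration of what such a spec encodes.

```python
# Hypothetical source-to-target mapping: target column -> (source column, rule).
SRC_TO_TGT = {
    "mbr_first_nm": ("FIRST_NAME", str.strip),
    "mbr_last_nm":  ("LAST_NAME",  str.strip),
    "claim_amt":    ("CLM_AMT",    float),
}

def apply_mapping(source_row: dict) -> dict:
    """Produce one target row from one source row per the mapping spec."""
    return {tgt: rule(source_row[src]) for tgt, (src, rule) in SRC_TO_TGT.items()}

row = apply_mapping({"FIRST_NAME": " Ann ", "LAST_NAME": "Lee", "CLM_AMT": "12.50"})
print(row)  # {'mbr_first_nm': 'Ann', 'mbr_last_nm': 'Lee', 'claim_amt': 12.5}
```

In an ETL tool the same spec drives generated mappings; keeping it as a reviewable document is what lets developers and testers trace every target column back to its source.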

To conclude, I possess strong organizational and critical-thinking abilities to assess complex problems, analyze options, navigate diverse perspectives and objectives, and develop acceptable Data Management solutions.

TECHNICAL SKILLS

Data Architecture and Modeling

Entity Relationship (ER) Modeling, Kimball Dimensional Data Modeling using Star schema and Snow-Flake Modeling, Unified Modeling Language (UML) and XML Reverse Engineering, Master Data Management (MDM), Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM), Waterfall and Agile Software Development (Scrum), Zachman Framework

Data Modeling Tools

Power Designer 12/12.5/15/16.1, Erwin 4.0/3.0, Embarcadero ER/Studio, Oracle SQL Developer Data Modeler 4.0.3, Rational Rose/Requisite Pro, Visio for Enterprise Architects, InfoSphere Data Architect

Business Intelligence

Business Objects 6.0/5.1/5.0 (Web-Intelligence 2.5, Designer 5.0, and Developer Suite & Set Analyzer 2.0), Brio 8/7, SQL Server Reporting Service (SSRS), Cognos Series 7.0/6.0, Crystal Reports

Databases

Oracle 10g/9i/8i/8.x/7.x, MS SQL Server 2012/2008R2/2008/2005/2000/7.0/6.5, Sybase 12.5/12.x/11.x, MS Access 7.0/2000, SQL, PL/SQL, DB2, Greenplum (PostgreSQL), Teradata (V2R8/V2R7/V2R6/V2R5)

ETL Tools

Informatica PowerCenter 8.1.1/8.0/7.1.2/7.1.1/7.0/6.2/5.1, Informatica Power Mart 6.2/5.1/4.7, (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation developer, Mapplet Designer, Mapping Designer, Repository manager, Informatica server, Power Connect, Power Plug, Power Analyzer), Oracle SQL*Loader, IBM Infosphere DataStage 8.5, IBM Infosphere DataStage 8.1 (Parallel & Server), IBM Websphere DataStage 8.0.1 (Designer, Director, Administrator), SQL Server Integration Service (SSIS)

Environments

Win 3.x/95/98, Windows XP, Windows 7, Win NT 4.0, Unix, Sun Solaris 2.6/2.7, Linux, MS-DOS

Others

Unix, Unix Shell Scripting, SQL, PL/SQL, SQL*Plus 3.3/8.0, Visual Basic 6.0/5.0, ASP, HTML, DHTML, XML, WAP, C, C++, Java

EDUCATION

Master of Science in Computer Science (MS(CS)), University of Nebraska, Omaha, USA, 2007

Master of Science in Information Systems (MSc(IS)), Osmania University, India, 2003

Bachelor of Computer Applications, Osmania University, India, 2001

PROFESSIONAL EXPERIENCE

Data Architect Senior Advisor at Anthem Inc. (previously WellPoint Inc.) Dallas, TX September 2015 – Present

Create, update and maintain conceptual, logical, and physical models for identified solutions.

Work on data needs and issues to minimize business risk.

Identify the initial data access paths of the physical data models.

Work with Database Administrators to develop physical models.

Environments: Sybase Power Designer 16, Erwin 7, Teradata SQL Assistant 14.10, DB2, Greenplum (PostgreSQL), Oracle, SQL Server

Data Architect at AutoZone Inc, Memphis, Tennessee November 2014- September 2015

Working across business lines and IT domains to ensure that information is viewed as a corporate asset and treated as such. This includes its proper data definition, creation, usage, archival and governance.

Working with other architects to help design overall solutions in accordance with industry best practices and AutoZone principles and standards.

Created data models for transactional data, which would be later moved to Operational Data Stores (ODS) and eventually to Enterprise Data Warehouse.

Working with other team members the Data Architect strives to create and improve the quality of systems, provide more flexible solutions, and reduce time-to-market in the long run.

Worked on projects implemented using an Agile framework and delivered Data Architect artifacts within three-week sprints.

Experience working and coordinating with the Scrum Master, Product Owner (BSA), ETL lead and testing lead on sprint deliverables and backlogs.

Participated and contributed in sprint grooming, planning, daily scrum and sprint retrospective calls.

Presented data models in Data Architect forums to get model changes approvals.

Performed data analysis and data profiling of the source data to understand the data, data volumes, data anomalies, duplicate data, primary key candidate assignment, primary index assignment, data type assignment and null value determination.
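The profiling checks listed above (null counts, duplicates, candidate-key screening) can be sketched in a few lines. This is an illustration only, with made-up sample data; real profiling runs over database extracts rather than in-memory lists.

```python
# Hypothetical sample rows standing in for a source extract.
rows = [
    {"member_id": "M1", "plan": "PPO", "zip": "75001"},
    {"member_id": "M2", "plan": "HMO", "zip": None},
    {"member_id": "M2", "plan": "HMO", "zip": None},   # duplicate row
]

def profile(rows, column):
    """Basic column profile: null count, distinct count, key candidacy."""
    values = [r[column] for r in rows]
    nulls = sum(v is None for v in values)
    distinct = len(set(values))
    # A column qualifies as a primary-key candidate only if it is
    # fully populated and unique across all rows.
    key_candidate = nulls == 0 and distinct == len(values)
    return {"nulls": nulls, "distinct": distinct, "key_candidate": key_candidate}

mid_profile = profile(rows, "member_id")  # duplicate M2 disqualifies it
zip_profile = profile(rows, "zip")        # nulls disqualify it
print(mid_profile, zip_profile)
```

Results like these feed directly into the modeling decisions the bullet names: which columns can be keys, which need NOT NULL relaxed, and which data types fit the observed values.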

Created views, materialized views and join indexes as needed, based on customer extract and reporting needs that required pivoted data, different performance-metric data, etc., calling for customization logic in the data model or during the ETL phase.

Created data mapping document for ETL team reference and data loading.

Migrated legacy models from Erwin 7 to Erwin 9.

Created domains, templates and naming standards for different database platforms in Erwin 9.

Organized tables in the data models and table schemas in database as per data steward’s ownership list in the organization.

Environments: ERWIN 7, ERWIN 9, Oracle, PostgreSQL, DB2, Talend ETL tool

Data Modeler at USDA- Forest Service, Dallas, Texas May 2014- Oct 2014

This project aims to establish a framework for modernizing all NRM reporting, data warehousing, business intelligence and ad hoc query capabilities. Modernizing reporting databases, tools and processes will make information and data more available and accessible to each level of the Agency, and is a dependency for all application modernization efforts.

Created logical data model for Roads Decommissioning project.

Created Fact and dimension tables to capture project data flow requirements.

Created Source to Target mapping documents.

Created ETL Mappings, Sessions, Workflows, Worklets and Maplets in Informatica.

Created staging area to facilitate data movement between source to target tables.

Scheduled workflows in Informatica based on the load dependencies.

Wrote ETL test cases to test mappings, sessions and workflows.

Performed unit testing to verify that data loaded correctly from source to target tables.

Created install/deployment scripts to deploy tables in different Oracle database environments and passed them to DBAs.

Responsible for enhancing and maintaining the AutoZone (AZ) information strategy.

Ensuring alignment of programs and projects with the strategic AZ Information Roadmap and related strategies.

Performing gap analysis between current data structures and target data structures

Enhancing and maintaining the Enterprise Information Model

Working with service architects and application architects to assist with the creation of proper data access and utilization methods.

Understand and translate business needs into data models supporting long-term solutions.

Working with other data architects and team members in peer reviews throughout the project life cycle.

Serving as a technical data strategy expert and leading the creation of technical requirements and design deliverables

Maintaining knowledge of industry best practices, technologies, and architectures

Recommending and evaluating new tools and methodologies as needed

Environments: Oracle SQL Data Modeler 4.0.3, PL/SQL Developer 10, Oracle, Informatica 9.5

Data Architect Senior Advisor at Anthem Inc. (previously WellPoint Inc.) Manchester, NH Oct 2007–May 2014

Worked on diverse projects, from requirement gathering phase, analyzing business rules, conducting data modeling sessions and model walk-throughs with subject matter experts, steering final model for customer sign off, handing off data models to DBA team and supporting data models until testing and deployment to production.

Participated in defining strategies related to data integration, data quality, metadata management and master data management (MDM).

Worked on projects involving adding new Health Care Subject areas (mainly Claims) to Enterprise data warehouse and changing existing Subject area designs to meet changing business needs.

Implemented numerous direct insourcing projects, loading data from regional data warehouses or source systems to Enterprise data warehouse.

Analyzed data transformations, performed by source systems, which transform data based on business requirements, conform to Enterprise modeling standards and load to Enterprise data warehouse.

Supported various Health Care Information Exchange (HIX) projects involving extracting data from Enterprise Warehouse, transforming and sending it across to internal or external vendors.

Physical implementation of logical models on Teradata, DB2, Green Plum, SQL server and Oracle platforms.

Designed conceptual, logical and physical data models using Power Designer and ERWIN tools.

Performed reverse engineering of legacy application using DDL scripts in Power Designer and developed logical and physical data models for central model consolidation.

Provided mapping between Business Requirements and the Target Model build. Developed data mapping documents for integration into a central model and depicting data flow across systems.

Developed strategies for Metadata creation, updating and maintenance. Provided mapping between Business Requirements and the Target Model build.

Managed Codes Databases for various Subject Areas. Worked with Reference Data Management (RDM) team to assign RDM code sets to industry standard codes and also to source codes which need cross reference to EDWard standard codes.

Experience in Master Data Management (MDM) design and implementation. Involved in creating processes required to create and maintain consistent and accurate lists of master data. Developed merge rules to clean and consolidate data. Identified master data entities and described master data based on company business rules and health care industry. Involved in investigating transactional, metadata, hierarchical data from regional data warehouses and various source systems (which are being integrated to Enterprise data warehouse for enterprise reporting) to do discovery estimates for creating master data, maintaining, updating it and expanding it.

Facilitated sessions with business analysts, developers and other system architects to understand business in terms of stakeholder vision.

Provided data modeling support for landing zone tables, conformed stage area tables, ready zone tables and target tables.

Generated Use-case scenarios and providing mapping in the data model. As needed for some projects, analyzed client requirements using Unified Modeling Language (UML) diagrams.

Worked to improve and transform data management practices based on insights into the client’s business model and pain points.

Managed Codes Databases for various Subject Areas including Claims, Membership, Product and Benefits, Revenue, COA etc.

Worked in JAD sessions with DBAs, Teradata specialists, Industry Consultants to convert logical data model to physical data model.

Administered and maintained the model updates using model change log documents.

Created metadata reports, PDFs and Join reports for ETL developers to review data model structures. Created ELDM reports from data model to pass it to developers and DBA to analyze physical attributes of the tables. Compared one data model across another and generate compare reports.

Created Next release, Next release +1 and Next release +2 versions of tables in data model to differentiate between various project changes to same tables, going to Production on different release dates.

Frequently used Teradata SQL Assistant to view exact physical structures of the tables in productions and for other querying purposes.

Analyzed the Teradata utilities MultiLoad (MLoad), BTEQ, FastExport and FastLoad, used to design data flow paths for loading, transforming and maintaining the data warehouse.

Translated and delivered actionable, high-impact plans that improve data integration, data quality and data delivery in support of business initiatives and results.

Defined, designed, and involved in implementation of standards and best practices involved in modeling, describing data entities and relationships, as well as supporting Metadata.

Involved in defining compliance rules and technologies to support Information and Data Architecture objectives and governance.

Involved in designing information solutions, ensuring that they align with business and IT strategies, maximize reuse and integrate well with existing solutions.

Investigated data quality issues by data profiling on the source system data and regional data warehouse data.

Performed data analysis, data profiling and requirement gathering. Worked with subject matter experts (SMEs), to understand business requirements, rules and documenting technical documents.

Participated in importing, cleaning, transforming, validating and modeling data to understand it and draw conclusions for analytical purposes. Created mapping documents to show source-to-target mapping and data lineage.

Analyzed large datasets, drew inferences and presented them successfully to management using a reporting tool.

Involved in improving approaches to ensure successful implementation of data management solutions, Information flow patterns and exchange strategies and Data quality management.

Worked towards business development and ensuring high-levels of client satisfaction of deliverables during engagements.

Contributed thoughts and shared knowledge through the creation of executive presentations, architecture documents on repeatable processes and facilitated Data Architecture tech forum team meetings.

Mentored and provided assistance to junior Data Modelers and newly joined Data Architects.

Environments: Sybase Power Designer 16, Erwin 7, Teradata SQL Assistant 14.10, DB2, Greenplum (PostgreSQL), Oracle, SQL Server, Informatica PowerCenter 8.1.1/8.0

Data Modeler at Health Dialog, Manchester, NH Feb 2007- Sep 2007

Interacted with the Business Users to analyze the business process and gather the requirements, making necessary changes to schema objects to fulfill their reporting and business needs.

Created new data models based on user requirements and updated existing data models. Created new metadata and updated existing metadata.

Discussed with business clients on providing tactical solutions and designing logical data models accordingly. Worked with clients to generate use case scenarios based on user requirements.

Verified if the data model helps in retrieving the required data by creating data access paths in the data model.

Environments: Erwin 4.1, Business Objects XI, WEBI, Auditor, Crystal reports XI, Oracle 9i, PL/SQL, Info view, Informatica PowerCenter 5.1

Data Warehouse Developer/BI Developer at Prudential Finance, Omaha, NE April 2005 – Dec 2006

Designed, deployed and scheduled reports using Business Objects XI R2 WebI and the Broadcast Agent. Built universes and schemas for reporting purposes, and created canned reports for Business Objects ad hoc reporting. Designed universes to be used as component universes, including all their objects in another universe; used Designer to create the universe with classes, objects and condition objects. Created reports using the universes and stored procedures as the main data providers. Used Business Objects 5.7 for developing reports, and developed hierarchies to support drill-down reports. Worked on performance tuning of the programs, ETL procedures and processes.

Provided Business, Functional and Technical requirement documentation. Served as link between IT team and customer. Involved in internal and external meetings in a multi-discipline IT Environments. Analyzed, defined and documented business requirements for user report needs.

Conducted requirement gathering meetings with customers to document work processes, data requirements, data flows, data dependencies and changing business needs.

Created Data models for billing process, referential integrity and security.

Designed business requirements documents to document specific user reporting requirements and changing business needs.

Analyzed, developed, documented and communicated architectures/designs using UML for large-scale, full-life-cycle projects for an online reporting system. Performed conceptual modeling using use cases, UML design and E-R models.

Designed the architecture using a staging area and data mart structure. Applied business re-engineering principles and processes in developing the enterprise models.

Analyzed disparate data sources and determined how they fit into the enterprise architecture.

Validated source data, data cleanup activities and data mapping.

Defined, understood and refined data requirements.

Worked on maintaining compliance with data privacy laws and regulations.

Involved in designing other analysis artifacts: class diagrams and interaction diagrams.


Monitored SQL Server performance using Profiler to find deadlocks and performance issues.

Created and scheduled jobs for proper and timely execution, sending alerts using SQL Mail.

Implemented permissions, logins, roles and authentication modes as part of security policies for various categories of users.

Defined ETL mapping specification from business requirements provided by Business Analyst. Defined ETL process to source the data from sources and load it into Data Mart tables.

Designed the logical schema and physical schema creation for Data Mart and integrated the legacy system data into Data Mart.

Involved in Data Warehouse landing zone schema Design and provided Data granularity to accommodate time phased CDC requirements.

Created the ETL migration strategy document, integrated DataStage metadata into Informatica metadata, and created ETL mappings, sessions and workflows in Informatica.

Designed mappings using Source Qualifier, Expression, Filter, Lookup, Update Strategy, Sorter, Joiner, Normalizer and Router transformations. Developed advanced Linux shell scripts to validate source files, automate archival of log files, and create ETL event start/stop files.

Created various database objects such as databases, tables, views and indexes.

Combined frequently used SQL statements into stored procedures, thereby reducing execution time.

Actively involved in normalization (3NF) and de-normalization of databases.

Used Performance monitor and Profiler to optimize the queries and enhance the database performance.

Involved in de-normalization of tables for reporting purposes.

Set up job scheduling, batch processing, alerts and e-mail notifications.

Maintained jobs for data movement from the development server to the test server, generating daily reports for the financial and marketing teams.

Designed and implemented Parameterized and cascading parameterized reports using SSRS.

Responsible for troubleshooting the issues that have been raised by the clients from the Server side as well as from the reporting side.

Generated long-running-query alerts when a query ran for more than the threshold time.

Environments: MS SQL Server 2005, Java, Business Objects 5.1, Web Intelligence 2.5, Broadcast Agent, Win NT 4.0/2000, Informatica PowerCenter 5.1, Erwin 4.1, UML Tools, Business Objects XI, Business Objects 5.7, WEBI, Auditor, Crystal reports XI, Oracle 9i, PL/SQL, Info view

Database Developer/Graduate Research Assistant at UNO, Omaha, NE Aug 2004 - April 2005

At University of Nebraska, worked on Thesis Project funded by NSF (National Science Foundation) grant on testing Effective Utilization of Wireless Networks in Collaborative Applications using Oracle database performance tool.

Analyzed requirements through E-R diagrams using UML and Rational Rose suite. Programmed simulation scripts using C++. Collected simulation data from the Simulator and stored in Oracle Database.

Built a relational database to automate the entry of simulation data from the Simulator. Created database objects like tables, views, procedures, packages using Oracle tools like PL/SQL, SQL* Plus, SQL Loader to store simulation data.

Retrieved data from the database, analyzed it, and created reports for decision making. Worked on simulation tools such as OPNET and ns-2 (which run on Unix and Linux platforms).

Managed faculty, staff and student user account information on the network with MS Active Directory.

Managed user accounts and granted required privileges.

Monitored usage and performance of network devices.

Monitored and maintained Microsoft Mail Exchange.

Analyzed system logs and identified potential issues with computer systems.

Developed security guidelines for campus routine audits of systems and software.

Performed system performance optimization.

Analyzed and designed the data objects required for the databases.

Created Database Object Tables, Views, Constraints, Functions, Stored Procedures, Triggers, Rules, Defaults, and data types.

Identified Relationships between tables, enforced referential integrity using PK/FK constraints.

Created Clustered and Non-Clustered Indexes to improve data access performance.
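The effect of an index on data-access paths can be shown in miniature. The sketch below uses SQLite via Python as a stand-in (SQLite has no clustered/non-clustered distinction like SQL Server's; table and index names are hypothetical): it creates an index, then inspects the query plan to confirm the lookup uses it rather than a full table scan.

```python
import sqlite3

# Hypothetical table of simulation results; 1000 rows is enough to
# make index use visible in the query plan.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE results (run_id INTEGER, node TEXT, latency REAL)")
cur.executemany("INSERT INTO results VALUES (?, ?, ?)",
                [(i, f"n{i % 4}", i * 0.1) for i in range(1000)])
cur.execute("CREATE INDEX idx_results_run ON results(run_id)")

# EXPLAIN QUERY PLAN shows the access path the optimizer chose.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT latency FROM results WHERE run_id = 42"
).fetchall()
plan_text = " ".join(row[-1] for row in plan)
print(plan_text)  # mentions idx_results_run rather than a table scan
```

Dropping the index and rerunning EXPLAIN QUERY PLAN would show a scan instead, which is the before/after comparison that motivates index tuning.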

Tuned the SQL queries using SQL Profiler.

Monitored the database with tools for optimum performance.


