
Enterprise Data Solution Architect

Location:
Brookline, MA
Posted:
February 04, 2024


Sharmila S

Contact: 314-***-****

E-mail: ad3c2z@r.postjobfree.com

LinkedIn: Sharmila Sahu LinkedIn profile

Ashburn, VA (USA)

SUMMARY

More than 22 years of experience in enterprise data architecture, solution design, client-server technologies, business process analysis, conceptual design, maintenance, and support, utilizing versatile expertise. Progressive and diversified experience as a cloud services integration architect and Big Data solution architect using Cloudera, Hortonworks, AWS, and Azure services. More than 15 years of experience designing and building roadmaps, strategies, and migration plans using cloud services as enterprise solutions.

Sharmila Sahu is currently a Manager in the Technology Consulting practice of Ernst & Young LLP, aligned with AI and Data.

• 19+ years of experience in enterprise data and technology solution architecture, solution design, client-server technologies, analysis, maintenance, and support, utilizing my versatile expertise.

• I am pursuing my career with EY as a Manager and Enterprise Architect, with over one year of experience applying prescriptive, predictive, and descriptive analytical techniques to help clients solve complex business problems across several industries. My operational expertise involves building enterprise reference architecture, business data and analytics strategy, predictive modeling, performance reporting, system/data analysis, and validation. I have demonstrated expertise in dealing with all aspects of the data life cycle.

• Working as an Enterprise Data Solution Architect lead in the healthcare domain.

• I worked as an Enterprise Architect on a commercial transformation of the IT Strategy and Roadmap engagement to streamline ERP and CRM enhancements and envision new e-commerce, reservation, customer experience, marketing, and loyalty systems for the entertainment, real estate, and development of a mega-project for Rheem.

• As an Enterprise Senior Solution Architect, created a unified current-state enterprise architecture view with business capabilities mapped to 948 enterprise systems spanning ERP, R&D, supply chain and manufacturing, IT applications, data, and infrastructure.

• While working in a healthcare environment, I led the migration of the on-prem CDP data platform to AWS. As an enterprise architecture and cloud expert, I delivered the migration planning and the initial end-to-end enterprise solution for the cloud migration project. During this phase I prepared the lift-and-shift plan covering migration integration design, planning, and technical acquisition details.

• I utilized the TOGAF ADM framework for all my enterprise architecture deliverables.

• As an Enterprise Architect, I demonstrated expertise in using LeanIX to establish a transparent as-is IT landscape and plan business roadmaps.

• As a Data Solution Architect, I am involved in creating and building enterprise data solution architecture, data integration from one system to another, data migration and strategy plans, and leading and guiding the development team to implement, develop, and ultimately build production environments and provide services and reports to end customers.

• Design and build enterprise data platform and data strategy guidelines, which mainly include data availability, architecture patterns, data ingestion patterns, data integration, data storage, and retrieval processes.

• Worked for a publishing giant and performed Big Data strategy and project-level analysis to determine a path forward for predictive maintenance products and a general Data Analytics as a Service environment. Played a role in technology selection, metrics, and management recommendations.

• Involved in designing, building, and establishing an enterprise Big Data hub spanning five container database channels to preserve decades of historical and valuable data for the long term.

• I played a major role as an Architect and Data Analyst, serving as the single point of contact for Big Data/Hadoop technology. I also took responsibility for building the Big Data roadmap, the requirements-gathering process, requirement design documents, and project plan deliverables to business users and stakeholders.

• I was involved in delivering a software evaluation matrix for all competitive vendors in the Big Data Hadoop enterprise solution space.

• Delivered designs for data warehousing and business intelligence data needs, including dimensional modeling, effective capture of history, and consideration of techniques to load and access large amounts of data in read-intensive applications.

• I designed, planned, managed, and successfully implemented the Big Data Hadoop platform at the client location. I deal with customer complaints with a calm demeanor and am very good at handling difficult situations with customers. I understand how to listen to customers and extract the details that make a big difference when dealing with our clients. Successfully designed, delivered, and mentored employees who had no prior experience with the technology.

• Successfully led the team and provided the best roadmap solutions to the client. I have an imaginative personality and am very resourceful in times of need. When a major problem arises, I use creative problem solving to look at different sides of an issue and think outside the box when crafting solutions.

• Delivered roadmaps, functional/technical approaches, test scenarios, and use cases to the client; delivered the right product and met client expectations.

• Led design excellence that thrives in an environment of collaboration and demonstrates a high degree of creativity and entrepreneurial spirit. I constantly search for new ideas and ways to improve efficiency.

• Excel at keeping written records of my client assignments and projects, and keep supervisors updated through weekly meetings.

• Led architectural design on Big Data/Hadoop projects as an idea-driven, visionary designer with strong design skills for complex science and technology projects. Published white papers supporting practice development of Hadoop architecture solutions.

• Delivered business use cases and requirements based on enterprise-level Big Data Hadoop solutions. Drove high-priority Big Data Hadoop proofs of concept using Hortonworks and published the results.

• Provided a software evaluation matrix for all major competing vendors that play a significant role in the market for enterprise-level Big Data solutions.

• Led a research project analyzing the storage of decades of archived emails in Big Data Hadoop for future recovery. The design was approved by the client as a major Big Data solution.

• Delivered exceptional quality across engagement work and practice development initiatives. Actively sought monthly feedback on quality from managers and engagement leadership.

• Extensively involved in building cluster setups for both Hortonworks and Cloudera at the enterprise level.

• Involved in planning and executing POCs to determine how Big Data Hadoop can help solve pain points and performance issues in processing, storing, analyzing, and securing sensitive data for applications struggling with the high volume, variety, velocity, veracity, and value of data.

CERTIFICATION

• AWS Certified Solutions Architect - Professional, 2024-2027 https://www.credly.com/badges/3b4bb58a-eb71-43bf-b1a2-7e05f8c47e8b/linked_in_profile

• PMP certified by Project Management Institute 2022

• Credential:(drive.google.com/file/d/1eVf-KxD2V9-NI6aMyjvVi924KIMDtkvA/view?usp=share_link)

• SnowPro Core certified 2022

• Credential:https://drive.google.com/file/d/1eVf-KxD2V9-NI6aMyjvVi924KIMDtkvA/view

• Azure Administrator Associate https://learn.microsoft.com/api/credentials/share/en-us/Sharmila-1424/5AD23785D52CB3D0?sharingId

• SAFe Agile certified 2020

• AWS Solution Architect Associate Certification 2018

• Cloudera CCA175 Developer certification 2018

• Hortonworks certified Associate 2018

Functional Profile

• Enterprise Technology Solution Architect using Cloud services

• Enterprise Data Solution Architect

• Data Visualization and data catalog

• Data Transformation and strategies

• Technical and Technology Solution for EDI file transmission

Technical Profile

BigData and Cloud Ecosystem: Azure, Cloudera DP, HDP 2.4, HDCloud, AWS, Snowflake, CDP Public Cloud

EA Frameworks: TOGAF ADM, LeanIX Meta Model, ArchiMate

Programming Languages: SQL, PL/SQL, UNIX/Linux Shell Scripts, Python, Spark (basic)

Web Technologies: HTML, XML, JavaScript, JSON, REST API, Kubernetes, Docker

Databases & Tools: Oracle, DB2, MySQL, Teradata, PostgreSQL, HBase, Couchbase, Tableau, Alteryx, MicroStrategy, Ab Initio, S3, RDS, EC2

Operating Systems: Linux/Ubuntu, Windows, UNIX

Methodologies: Agile, SDLC, Scrum, UML, OOP, LeanIX

Protocols: TCP/IP, CDP, HTTP, SOAP, HTTPS

EDUCATION

Bachelor of Engineering in Electrical and Electronics, 2003, Utkal University.

PROFESSIONAL EXPERIENCE

Current Role: Enterprise Architect and Project Manager with EY

Clients worked with:

• Chick-fil-A (current): Traceability and program management

• North Carolina Department of Health & Human Services

• Rheem enterprise

• Baxter Healthcare

Role & Responsibility: Enterprise Architect for Data and Technology

Duration: August 2020 till date

Sharmila Sahu is a Manager in the Technology Consulting practice of Ernst & Young LLP, aligned with AI and Data.

• I am pursuing my career with EY as a Manager and Enterprise Architect, with years of experience applying prescriptive, predictive, and descriptive analytical techniques to help clients solve complex business problems across several industries. My operational expertise involves building enterprise reference architecture, business data and analytics strategy, predictive modeling, performance reporting, system/data analysis, and validation. I have demonstrated expertise in dealing with all aspects of the data life cycle.

• More than 5 years as an Enterprise Architect lead in the healthcare domain, manufacturing domain, and healthcare services area. During this time I utilized TOGAF and the LeanIX model for a commercial transformation of the IT Strategy and Roadmap engagement to streamline ERP and CRM enhancements and envision new e-commerce, reservation, customer experience, and marketing systems, and the development of a mega-project.

• As an Enterprise Data Solution Architect, I am involved in creating and building enterprise data solution architecture, data integration from one system to another, data migration and strategy plans, and leading and guiding the development team to implement, develop, and ultimately build production environments and provide services and reports to end customers.

• Design and build enterprise data platform and data strategy guidelines, which mainly include data availability, architecture patterns, data ingestion patterns, data integration, data storage, and retrieval processes.

Engagement experience

Chick-fil-A Partner Experience Management (Traceability)

Sharmila Sahu served as an Enterprise Architect and SME representing the client Chick-fil-A to more than 100 suppliers for seamless standard 856 X12 EDI file transmission. I am the lead point of contact for all business and technical solutions leveraging EDI tools and for how to use them for standardized X12 EDI file transmission. I built the CFA ASN data element spec, the primary requirement for building 856 EDI files for CFA suppliers. I took ownership of building technical guidelines, SOPs, and business process flows for multiple CFA distribution center partners, including CFA supply partner Armada and other distribution centers as their major partners. We built an AWS S3 bucket as the repository for all X12 files and built different layers using Databricks for advanced data analysis, data validation, data aggregation, and building traceability matrices, reports, and dashboards for all business stakeholders.

• Co-developed ASN specs and developed EDI integration standard operating procedures (SOPs)

• Involved in co-development of ASN 856 EDI file generation and the ASN test file validation and approval process

• Co-developed the ASN EDI file integration framework and strategies in the test and production environments

• Monitored successful EDI file transmissions in the test and production environments

• Validated ASN files received in the testing and production environments

• Provided technical solutions to individual suppliers for EDI development, integration, and transmission

• RAID log tracking and providing Resolutions

• Implemented industry standards and project execution processes for ASN file transmissions to CFA

• Built and leveraged project management support with a RACI matrix

• Designed multiple detailed process flows showing how our team can help CFA partners with EDI transmissions to the production environment, including the validation process and EDI file compliance checks.

Enterprise Data Solution Architect with NCDHHS:

• I supported the Equity workstream with high visibility and high impact for the client and EY. I analyzed current equity capabilities for COVID-19 prevention and response, identified gaps, and provided insights to develop strategic solutions and initiatives, given the engagement's high urgency and short turnaround time.

• Played an instrumental role in conducting key stakeholder interviews, capturing gaps and insights, and building structure for collaborative sessions. I worked with NCDHHS on COVID-19 prevention and response.

• Provided strong familiarity with equity data and helped various business units build better data equity reports and frameworks for the 2022 roadmap.

• Actively participated in coaching and developing the team and provided leadership in managing it. As a team, we used collaborative effort to identify future needs and opportunities for the EY internal team. Being open to new ideas and experiences, I participated in and facilitated interviews with clients and different business units.

• We performed gap analysis and captured future capabilities on the dashboard.

• Contributed to gathering and cataloging all the equity reporting artifacts received from the key stakeholders to analyze which equity stratifications were reported.

• Involved in mapping equity reports to the priority questions that critical stakeholders had established as questions they would like answered for vaccination, testing, and mAbs equity across the state.

• Built out a roadmap for expanding and improving equity reporting across the state and presented this roadmap to NCDHHS leadership.

• Provided extensive direction to create an SVI Tableau dashboard that displays how the four SVI themes affect vaccine administration, testing per capita, and deaths per capita in North Carolina counties, in order to focus on counties with worse COVID-19 metrics and apply practices from counties with better COVID-19 metrics across the state.

Enterprise Technology Architect with Rheem

Sharmila Sahu served as the enterprise solution architect representing the EY Enterprise Architecture team working with Rheem.

• I was involved in designing the conceptual design, reference model, and component process flows for different central streamlines, etc., for the proposal presented by my peers at the client conference. I followed high-end design standards and design patterns, applying my expertise in the cloud and different technology skill sets to all my designs.

• Worked closely with Rheem tech engineers to list the current tools and technologies in use at Rheem, recording the services they provide and any replacements planned for future endeavors.

• Designed different baseline component flows showing the various tools and technologies in use and the areas needing new enhancements and migration to cloud technology.

• Involved in building L1/L2/L3-level business mappings of tools and technologies for Rheem.

• Built multiple process flows showing current business processes and how they connect with each other.

Enterprise Technology Architect with Baxter Healthcare:

Sharmila Sahu joined as a critical resource after an EY team member left, taking over business capability mapping and the enterprise architecture work for technologies.

• Worked on the Baxter Amplify project as an enterprise architect to build the future tech blueprint for the core ERP system from the Baxter-Hillrom integration perspective.

• Extensively involved in producing application portfolio rationalization analysis reports covering more than 100 enterprise core ERP applications in technology and business assessments.

• Interviewed different business and application owners for L1-L2-L3 mapping of applications in use and planned for future integration, and built out the reference design architecture.

• Based on feedback, updated and modified the technology blueprint and technical design to build and enhance the future merger.

Company: Employee with Blue Cross Blue Shield (FEPOC)

Role & Responsibility: Enterprise Data Solution Architect on Big Data and cloud services

Duration: May 2018 - August 2020

• Served as the primary enterprise solution architect for migrating Cloudera to AWS cloud services. We leveraged lift-and-shift methodologies to migrate the on-prem CDP platform to the public cloud using AWS.

• Planned and strategized the migration plan, solutions, roadmap, and POCs for end-to-end implementation and migration phases.

• Initiated and facilitated research and development and the approval process for deciding on the public cloud migration.

• The initial design phase included cost planning, project and program manager approvals, and the list of applications required for the migration process. This process involved extensive design, planning, and budget approval to ensure the migration required minimal re-architecting.

• On-prem products such as the Cloudera Data Platform, SQL Server, PostgreSQL, and Db2 databases, Ab Initio ETL tools, metadata management, microservices, and MicroStrategy applications were to be migrated to the CDP Public Cloud platform.

• Built and designed the migration plan and cloud solution design leveraging AWS tools such as S3, EC2, RDS, EKS, CloudFormation, CloudWatch, AWS KMS, AWS SNS, EFS, etc.

• Conducted a technical acquisition plan, with the help of data engineers, for each AWS tool before utilizing it in the migration process.

• We planned the application and database migration phase by phase as PaaS and SaaS. With the help of the security team, we made sure the lift-and-shift migration process was properly secured and that PHI data was not compromised. We used Datadog (cloud-based SaaS monitoring) as a major SaaS tool for infrastructure security.

• Designed the end-to-end FEP Enrollment project modernization architecture, including data design, application utilization, and security encryption, and improved and redesigned the enrollment landscape system architecture for seamless and automated service to federal customers.

• Worked as an Enterprise Data Solution Architect lead in FEPOC for enrollment programs, including the digital mailroom, membership lifecycle management, HIPAA 834 EDI processing, and fulfillment and distribution.

• As a Data Solution Architect, I was involved in creating and building enterprise data solution architecture, data integration from one system to another, and data migration and strategy plans, leading and guiding the development team to implement, develop, and ultimately build production environments and provide services and reports to end customers.

• Designed and built enterprise data platform and data strategy guidelines covering data availability, architecture patterns, data ingestion patterns, data integration, and data storage and retrieval, and lastly generated MicroStrategy reports and provided member 360 data availability to members and plans.

• Used Big Data technology to handle all enrollment transactional data and data profiling; used Ab Initio and NiFi with Kafka integration for data migration from the legacy system to the Big Data platform and eventually to the cloud platform.

• Designed and implemented reference data management and built a Metadata Hub to create a central repository for all enrollment, claims, and pharmacy-related reference data in one central place, per data governance rules.

• Conducted gap analysis, performance and data management solution analysis, and cost analysis for migrating the DBMS platform to MongoDB and cloud services.

• Initiated the AWS cloud pilot, leveraging a framework to develop the initial workload migration and application migration remediation.

• Define and design initial cloud delivery and operational processes as needed including integration plans with product owners and director office approval

• Built strategy roadmaps and solution architecture for cloud migration.

• Assisted enterprise customers in accelerating adoption of the AWS cloud Big Data platform.

• Architected and implemented a data lake solution on the Amazon cloud.

• Provided consulting and implementation services and developed industry-wide best practices on AWS Big Data technologies.

• Worked with customers and partners to gather requirements around current legacy applications and suggest suitable cloud services.

• Delivered end-to-end cloud migrations, interacting with vendors and customers.

• Performed proofs of concept, built repeatable cloud solution assets and demos, and delivered hands-on cloud solution workshops, both customer-facing and internal.

Company: Avalon Consulting LLC May 2018 - May 2019

Client: Tata Communication (Canada)

Role &Responsibility: Big Data Solution Architect

• Worked with Tata Communications as a Big Data solution architect for a major implementation and technology migration.

• Delivered a proposal for Tata Communications, where the client sought solutions for migrating from the legacy MapReduce system to Spark, including migration strategy definition and baseline implementation.

• I worked with the client to provide expertise and knowledge-based solutions to overcome performance issues faced with the current data processing engine in the Big Data platform.

• The intended consumers of this strategy were the Tata Communications project manager, design authority, and project team members. I played a major role in defining business principles, guidelines, and methodology, and supported a prototype to maintain business operational processes.

• Delivered a prototype that functions in a separate repository to avoid additional overhead on the production environment. The Spark conversion framework was delivered to the client after being tested and approved by business owners for use in production. We did an apples-to-apples comparison between MapReduce and Spark, and the code is in production.

Client: AIR (American Institutes for Research) Nov 2016 - May 2018

Role & Responsibility: Big Data Solution Architect

• Worked as a Big Data solution architect and engineer at AIR to implement a Big Data platform as a shared service across the organization. American Institutes for Research is a nonprofit organization that works on many federal and government projects for schools, the homeless, and healthcare. AIR combines social science research and Big Data solutions to target interdisciplinary problems. Integration of data from a broad array of sources can illuminate essential relationships across outcomes.

• I brought expertise in emerging computational techniques for data integration, which are especially useful when primary identifiers are not available or shared across data sources.

• As a Big Data solution architect, I designed PatentsView to link patent inventors, their organizations, addresses, and activity, with extensive feedback gathered from multiple stakeholders. The PatentsView user community includes researchers, developers, policy analysts, federal agencies, the media, and the general public. Our data scientists use text analytic methods and algorithmic linking techniques to locate, integrate, and disambiguate these data.

• I delivered solutions leveraging Big Data skills using Hortonworks. I established and implemented a 15-node cluster on Hortonworks using the AWS cloud. AIR is accustomed to working with sensitive data; we understand the importance of data security, confidentiality, and privacy to our clients' work. Our Secure Analytics Workbench (SAW) is built for FISMA and FedRAMP compliance, allowing data to be quickly secured and made available.

• I led SAW (Secure Analytics Workbench) as a Big Data solution architect, enabling researchers and data scientists to perform sophisticated data analyses across multiple datasets containing Personally Identifiable Information (PII) and electronic Protected Health Information (ePHI) and to securely report the results of those investigations to authorized report viewers. It is a high-performance, cloud-based data analytics platform that securely receives, stores, and manages customer datasets. SAW can comfortably accommodate many rapid-cycle monitoring projects, comprehensive program evaluations, and extensive analyses of nationwide, multi-year databases.

• Designed to analyze huge datasets ("big data"), SAW can perform sophisticated analyses (general linear models, logistic regression, complex sorts) on datasets of hundreds of millions of records. SAW is designed to handle complex dataset linkage (using deterministic or probabilistic protocols), for instance across huge administrative service and cost datasets. SAW can also perform extensive and sophisticated data cleaning, record de-duplication, and record reconciliation and remediation.

• As a consultant, my role and responsibilities involved establishing the Big Data Hadoop platform as an extensive solution platform.

Company: Ernst & Young (EY) June 2015 - October 2016

Role: Big Data Solution Architect / Senior 3 Level

Clients through EY: Pepco Holdings Inc / Exelon (DC), Ingersoll Rand (NC)

• I played a major role as a Solution Architect and Data Analyst, serving as the single point of contact for Big Data/Hadoop technology. I also took responsibility for building the Big Data roadmap, the requirements-gathering process, requirement design documents, project plan deliverables to business users and stakeholders, and implementation.

• I was involved in delivering a software evaluation matrix for all competitive vendors in the Big Data Hadoop enterprise solution space.

• I designed, planned, and managed around the clock and successfully implemented the Big Data Hadoop platform at the client location. Successfully designed, delivered, and mentored employees who had no prior experience with the technology.

• Delivered roadmaps, functional/technical approaches, test scenarios, and use cases to the client; delivered the right product and met client expectations.

• Led design excellence that thrives in an environment of collaboration and demonstrates a high degree of creativity and entrepreneurial spirit. I constantly search for new ideas and ways to improve efficiency.

• Led architectural design on Big Data/Hadoop projects as an idea-driven, visionary designer with strong design skills for complex science and technology projects.

• Delivered enterprise-level business use cases and requirements based on the Big Data Hadoop solution. Drove high-priority Big Data Hadoop proofs of concept using Hortonworks and published the results.

• Provided a software evaluation matrix for all major competing vendors that play a significant role in the market for enterprise-level Big Data solutions.

• Conducted research and analysis on storing decades of archived emails in Big Data Hadoop for future recovery. The design was approved by the client as a major Big Data solution.

• Delivered exceptional quality across engagement work and practice development initiatives. Actively sought monthly feedback on quality from managers and engagement leadership.

• Took the lead in identifying issues on engagements and proposing workable solutions. Established strong working relationships with clients and consolidated my position as a key contact for clients.

• Identified the challenges and complexities of engagement work and proposed workable solutions after thorough analysis.

• Took a leadership role in challenging engagements, guiding and mentoring the team through complex analysis and delivering quality work products.

Project #5: August 2013 - June 2015

Client: Fannie Mae

Role: Bigdata Hadoop Architect

• Fannie Mae's current solution stack for our constituents is primarily focused on all segments as above, with a substantive gap in addressing latent demand in the research and exploratory modeling area. This research and data work is currently done with ad hoc data sets, SAS modeling, and other fragmented solutions. The inefficiencies and risks associated with the fragmentation of data are big challenges, which can be addressed by keeping all enterprise-level data in one centralized data hub and processing high volumes of data using Big Data Hadoop and the Hadoop ecosystem.

• Determined feasibility requirements, compatibility with the current system, and system capabilities to integrate new acquisitions and new business functionality.

• Extensive experience with Big Data analytics, data warehousing applications, business architecture methodologies, and process modeling in a large organization across multiple functional units.

• Demonstrated success in the full lifecycle of Big Data implementations. Strong business acumen with an understanding of financial business operation flows.

• Established system of record-based consumption patterns (RDW’s marts & formal reporting functions including financial, regulatory and other).

• POC: Hadoop cluster with ecosystem components and HBase installation
