Post Job Free

Big Data Architect

Location:
Torrance, CA
Salary:
150000
Posted:
October 05, 2017

Contact this candidate

Resume:

Srinivasa Kalyanachakravarthy Vasagiri

*********************@*****.***

SUMMARY:

Srinivasa Kalyanachakravarthy Vasagiri is a resourceful, high-energy technical solution consultant offering 13 years of expertise in architecture definition of large distributed systems, technical consulting, project management and technology implementation across Cloud, Big Data, Hadoop, Enterprise Information Management, Data Management, Product Management and Application Integration.

Proficient in Data Architecture, DW, Big Data, Hadoop, Data Integration, Master Data Management, Data Migration, Operational Data Store and BI Reporting projects, with a deep focus on the design, development and deployment of BI and data solutions using custom, open-source and off-the-shelf BI tools.

Provide technical thought leadership on Big Data strategy, adoption, architecture and design, as well as data engineering and modeling.

Hands-on experience with the architecture and implementation of Data Lakes and Big Data analytics platforms on the Big Data and Hadoop ecosystem.

Implemented an AWS Big Data analytical platform leveraging Data Lake architecture on the cloud with the AWS/Hadoop/Azure ecosystem.

Prioritize and scale architecture efforts in close coordination with business teams, Data Lake operational team, and other stakeholders.

Expert-level understanding of open-source technologies in the Hadoop ecosystem.

Managing the setup of enterprise Tableau reporting for analytical reporting.

Experience working with AppDynamics application monitoring software and configurations.

Highly competent in data modeling, relational database design and SQL.

Strong aptitude for learning business processes/products and the role of data within the business domain.

Strong understanding of NoSQL data sources, their implementation and management

Experience with Hadoop distributions (open source and commercial)

Strong experience in Java and Linux/Unix.

Hadoop administration, including cluster deployment, management and issue resolution.

Experience with Scala, Python, R, and Spark

Experience implementing Modern Data Warehouse / Data Lake solutions.

Hands-on experience with Hadoop distributions like AWS, Cloudera, Hortonworks, EMR, HDInsight and IBM BigInsights, and with the Hadoop architecture and technology stack (Hive, HBase, MapReduce, Sqoop, HDFS, Oozie, Ambari, ZooKeeper, Kafka, Spark, Storm, Kinesis, Hue, Phoenix, Cassandra, Flume, Apache Impala, Sentry and Lambda), plus Apache NiFi, Apache Drill and Athena.

Excellent Leadership skills.

Worked in Agile environment and sprints.

TECHNOLOGY STACK:

Data Warehousing

Relational

MySQL, Oracle, SQL Server, DB2 UDB, Netezza, IBM DB2, Hive, NoSQL

MPP

Teradata, Teradata Aster, Redshift, Vertica, Greenplum

Analytical/Columnar

HBase, SAP HANA

NoSQL

Cassandra, MongoDB, Neo4j, Elasticsearch, DynamoDB

Distros

Apache, Cloudera Distribution, Hortonworks Distribution, IBM Big data, Informatica BigData Edition

Big Data Frameworks

Hadoop

HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, ZooKeeper, Jaql, Solr, Lucene

Stacks

Hadoop, Spark

Data Analysis

Hive, Pig, R, Python

Data Transformation

ETL/ELT

Informatica, Talend, Pentaho Data Integration, Informatica Big Data/PC/DI Edition, DataStage, Ab Initio, SSIS, SSRS, Informatica Vibe, SAP BW, IBM Big Data Edition, Amazon Data Pipeline

Data Modelling

Erwin, Visio, MDM

Data Collection

APIs, web services, SQL, ODBC calls, Amazon S3

Machine Learning

Microsoft ML, RapidMiner, Spark MLlib, IBM Data Science

Cloud/OS

Amazon Web Services, Microsoft Azure, Google Cloud, IBM Cloud

Virtualization

VMWare, Virtual Box

BI

Tableau, Business Objects, Cognos 8, OBIEE, MicroStrategy, Spotfire, Hyperion Essbase 9

Scheduling

Autosys, Control M, Shell scripting, DAC

Languages

Java, Python, Shell Scripting, XML, SQL/PL-SQL

Load/Unload Utilities

Teradata (BTEQ, FastLoad, MultiLoad, TPump, FastExport), MySQL load utilities, Oracle load utilities, SQL Server utilities

Version Controlling

Git, PVCS, CVS, Subversion

OLTP

Siebel CRM, Salesforce, PeopleSoft, Mainframe systems, SAP, Oracle Apps (EBS/Finance/Supply Chain/Inventory), Microsoft Dynamics, Contact Center applications, Oracle

MDM

Informatica MDM, Informatica IDQ, Informatica ILM

Others

Oracle SQL Developer, TOAD, SQL Workbench, Aqua Data Studio, Toad for MySQL

Certifications:

Edureka Hadoop and BIG Data Certificate (License13071302003 February 2014)

Edureka Spark and Scala Developer Certificate (April 2017)

Edureka BIG DATA EXPERT Certificate (License 130******** February 2014)

Java Development with Apache Cassandra DataStax License 27498 February 2014

Hadoop Fundamentals I - Version 2 IBM Big Data University February 2014

EDUCATION:

Bachelor of Engineering, Periyar University, India

EXPERIENCE:

Consulting Jul 2017 – Present

Big Data Architect

Responsible for assessing application, Big Data, Hadoop, data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing the long-term strategy and technical architecture, as well as the short-term scope, for a multi-phased Big Data applications and data warehouse effort.

Hands on and management experience implementing the Big Data, BI and Data solutions.

Implemented a Data Lake analytics platform leveraging AWS cloud and Hadoop technologies to provide a centralized data repository and hub for Big Data analytics platforms.

Provide technical thought leadership on Big Data strategy, adoption, architecture and design, as well as data engineering and modeling.

Expertise in identifying the product roadmap and in product management activities.

Hands-on experience with the Cloudera and Hortonworks Hadoop platforms.

Architecting cloud solutions on AWS, Microsoft Azure and Google Cloud.

Hands-on experience with scalable solutions on Cloudera, Azure, Big Data, Hadoop, Spark, Scala, Spark SQL, PySpark, Cassandra, AWS Redshift, MySQL, Java, Python, Kafka, Oozie, HDFS, YARN, Tableau, HBase, Hive, the Hadoop ecosystem, Amazon S3, EC2, Microsoft Azure HDInsight, Hortonworks and Apache NiFi.

Implementation of Big Data solutions on the AWS and Azure cloud platforms.

Involved in projects implementing Data Lake Architecture, Big Data Analytics and Modern Data warehouse applications.

Hadoop administration, including cluster deployment, management and issue resolution.

Ciber Inc., CA, USA Oct 2016 – Present

Principal Architect / Big Data Architect

Solutions Architect responsible for Modern Data Architecture, Hadoop, Big data, data and BI requirements and defining the strategy, technical architecture, implementation plan, management and delivery of Big Data applications and solutions.

Implemented a Data Lake analytics platform leveraging AWS cloud and Hadoop technologies to provide a centralized data repository and hub for Big Data analytics platforms.

Provide technical thought leadership on Big Data strategy, adoption, architecture and design, as well as data engineering and modeling.

Expertise in identifying the product roadmap and in product management activities.

Provide expert-level guidance on Big Data best practices and standards.

Work with product owners, business SMEs, and data ingestion and reporting architects to identify requirements and consolidate an enterprise data model consistent with business processes.

Develop and maintain the technical roadmap for the Enterprise Big Data Platform across its different platform capabilities.

Ingesting a wide variety of data (structured, unstructured and semi-structured) into the Big Data ecosystem via batch processing, real-time streaming and SQL.

Architecting cloud solutions on AWS, Microsoft Azure and Google Cloud.

Hands-on experience with scalable solutions on Cloudera, Azure, Big Data, Hadoop, Spark, Scala, Spark SQL, PySpark, Cassandra, AWS Redshift, MySQL, Java, Python, Kafka, Oozie, HDFS, YARN, Tableau, HBase, Hive, the Hadoop ecosystem, Amazon S3, EC2, Microsoft Azure HDInsight and Hortonworks.

Implementation of Big Data solutions on the AWS and Azure cloud platforms.

Involved in projects implementing Data Lake Architecture, Big Data Analytics and Modern Data warehouse applications.

Data Architecture and strategy implementations, Cloud implementations and strategy.

Provide information architecture leadership within the consultant team as well as to the client or third-party resources.

Evaluate, select and integrate any big data tools and framework required to provide requested capabilities, as well as comply with regulatory policies and architecture standards as necessary.

Develop and implement Hadoop, Spark, Big Data analytics and integrations, Microsoft Azure cloud data solutions and AWS cloud Big Data solutions.

Enterprise Tableau Reporting Architecture and Implementations.

Provide architecture recommendations, strategy, data architecture and implement enterprise data solutions.

Evaluation of best-of-breed Big Data, Hadoop and cloud-based products for various types of data integration and processing implementations.

Successful delivery of projects from the conceptual phase to the final product.

Implementation of Hadoop, Big Data, real-time and application integration at the enterprise level for various clients.

AOFL Inc., CA, USA June 2016 – Sep 2016

Big Data Architect

Responsible for assessing application, Big Data, Hadoop, data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing the long-term strategy and technical architecture, as well as the short-term scope, for a multi-phased Big Data applications and data warehouse effort.

Implemented the Big Data Lake platform on the cloud, leveraging AWS and Hadoop ecosystem components for a centralized data repository, app integration and data consumption.

Architecting, managing and delivering technical projects/products for various business groups.

Implementations of AWS cloud solutions for the Big Data stack.

Building and setting up the Hadoop ecosystem and stack from the ground up.

Hands-on implementation of Big Data analytics and applications in cloud environment deployments.

Implementing data engineering projects using data pipelines, Java, Python, Unix, SQL, Scala, NoSQL and RDBMS.

Hands-on experience with scalable solutions on DW, Big Data, Hadoop and BI platforms using AWS Redshift, Oracle, MySQL, Talend DI, Java, Hadoop, Spark, Tableau, HBase, Hive, the Hadoop ecosystem, Amazon S3 and EC2.

Involved in ingesting different types of data streams into the Data Lake, such as clickstream and web log data, for near-real-time micro-batch processing.
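As a rough illustration of the web-log micro-batch ingestion pattern described above, the sketch below parses Apache-style combined log lines and groups them into small batches. The log format, field names and batch size are assumptions for illustration, not details of the actual pipeline.

```python
import re
from itertools import islice

# Apache combined-log-style pattern (an assumed format for this sketch)
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+)'
)

def parse_line(line):
    """Parse one web-log line into a dict, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

def micro_batches(lines, batch_size=2):
    """Yield lists of parsed records, batch_size at a time (micro-batching)."""
    records = (r for r in map(parse_line, lines) if r is not None)
    while True:
        batch = list(islice(records, batch_size))
        if not batch:
            return
        yield batch

sample = [
    '10.0.0.1 - - [05/Oct/2017:10:00:00 -0700] "GET /home HTTP/1.1" 200 512',
    '10.0.0.2 - - [05/Oct/2017:10:00:01 -0700] "GET /cart HTTP/1.1" 404 128',
    'not a log line',  # malformed records are dropped, not failed on
    '10.0.0.3 - - [05/Oct/2017:10:00:02 -0700] "POST /buy HTTP/1.1" 201 64',
]
batches = list(micro_batches(sample, batch_size=2))
```

In a production pipeline each yielded batch would be handed to a downstream writer (e.g. to the Data Lake's landing zone) rather than collected into a list.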

Implementation of Data Lake architecture to support different products and business analytics needs.

Managing the data architecture, file system storage and maintenance of Hadoop clusters.

Hands-on Hadoop administration, security and management of the clusters.

Responsible for delivering end to end data products, business analytics applications and product engagement apps.

Managing end to end technical delivery of data products and projects.

Beachbody LLC., CA, USA Jan 2013 – June 2016

Data Architect / Lead Data Engineer

Responsible for assessing application, DW, Big Data, Hadoop and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing the long-term strategy and technical architecture, as well as the short-term scope, for a multi-phased data warehouse effort.

Implemented the Big Data Lake platform on the cloud, leveraging AWS and Hadoop ecosystem components for a centralized data repository, app integration and data consumption.

Leading a team of developers, both onsite and offshore, for data integration and data reporting requirements.

Architecting, managing and delivering technical projects/products for various business groups.

Implementations of AWS cloud solutions for the Big Data stack.

Hands-on experience with scalable solutions on DW, Cloudera, Azure, Big Data, Hadoop and BI platforms using AWS Redshift, Oracle, MySQL, Talend DI, Python, Java, Hadoop, Kafka, Spark, Tableau, HBase, Presto, Hive, the Hadoop ecosystem, Cassandra, Amazon S3, EC2 and EMR.

Implementing Data Lake architecture and a data hub for a wide variety of data streams, such as Adobe Omniture data, web log data, transactional data, CRM data, email marketing campaign data and external vendor data, with micro-batch processing.

Implementing real-time applications using PySpark, Spark SQL and Spark Streaming.

Experience working with AppDynamics application monitoring software and configurations.

Working on POCs implementing Spark and Hadoop applications using Java, Python and Scala.

Implementing data engineering projects using Java, Python, SQL, Scala, NoSQL and RDBMS.

Implemented various technologies, such as Big Data POCs, data integration, data-consuming platforms, B2B applications, consumer-facing applications and reporting applications.

Define key business drivers for the data warehouse initiative

Deliver a project scope that directly supports the key business drivers

Implementation of Tableau reporting at the enterprise level.

Ingesting a wide variety of data (structured, unstructured and semi-structured) into the Big Data ecosystem via batch processing, real-time streaming and SQL.
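Ingesting structured, semi-structured and unstructured records into one ecosystem typically means classifying each record's format before routing it. A minimal sketch of that idea in plain Python follows; the three-column CSV schema and the field names are assumptions for illustration, and real ingestion would rely on schema registries and dedicated connectors rather than per-record sniffing.

```python
import csv
import io
import json

def classify_and_parse(raw):
    """Classify a raw record as semi-structured (JSON), structured (CSV with
    an assumed 3-column schema), or unstructured (free text), and parse it."""
    raw = raw.strip()
    # Semi-structured: try JSON first
    try:
        return ("semi_structured", json.loads(raw))
    except ValueError:
        pass
    # Structured: comma-delimited with an assumed, illustrative column layout
    row = next(csv.reader(io.StringIO(raw)))
    if len(row) == 3:
        return ("structured", {"id": row[0], "event": row[1], "value": row[2]})
    # Unstructured: keep the raw text for downstream search/NLP processing
    return ("unstructured", {"text": raw})

records = [
    '{"user": 42, "event": "click"}',
    "1001,signup,9.99",
    "free-form note about a customer call",
]
parsed = [classify_and_parse(r) for r in records]
```

Each branch would normally route to a different landing zone in the lake (e.g. columnar storage for structured data, raw object storage for unstructured text).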

Implementation of Data Lake architecture to support different products and business analytics needs.

Managing the data architecture, file system storage and maintenance of Hadoop clusters.

Hands on experience with Hadoop administration, security and management of the clusters.

Responsible for delivering end to end data products, business analytics applications and product engagement apps.

Managing end to end delivery of data products and projects.

Exilant Technologies Private Ltd., CA, USA Apr 2011 – Jan 2013

Senior Technical Lead

Apple Inc., CA, USA Apr 2011 – Jan 2013

Technical Architect / Technical Lead (Cloudera)

Responsible for assessing Big Data, data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing the long-term strategy and technical architecture, as well as the short-term scope, for a multi-phased data warehouse effort.

Leading a team of developers both onsite and offshore for data integration and data reporting requirements.

Architecting, managing and delivering technical projects/products for various business groups.

Implementation of big data pipelines, Hive jobs and Pig jobs on Hadoop clusters.

Managing and delivering various technical projects for different business groups such as call center, manufacturing, sales and CRM.

Leading team of developers both onsite and offshore for data integration and data reporting requirements.

Design and development of complete data warehouse and system data integration, platform development and BI reporting solutions.

Solution architecture for various Business intelligence systems.

Managing the system support and stability of the BI applications.

Drive new technological solutions to support data needs catering to different business requirements. Delivering complex data integration and reporting projects to support various business needs.

Hands-on design, architecture and development of various data warehouse system integration, platform development and BI reporting solutions.

Implemented various technologies like data integration, data consuming platforms, B2B applications, consumer facing applications and reporting applications.

Implemented Big Data initiatives using Hive, Pig and Hadoop on the Big Data technology stack.

Define key business drivers for the data warehouse initiative

Deliver a project scope that directly supports the key business drivers

Supporting the different business user groups for various BI reporting requirements in Agile and Traditional Data warehousing methodologies.

Define, design and build the overall data warehouse architecture using Informatica, Oracle and Teradata for ETL and Business Objects/Tableau for reporting.

Define technical requirements and the technical and data architectures for the data warehouse.

Managing production support related activities with the support team.

Define metadata standards for the data warehouse.

Direct the data warehouse metadata capture and access effort.

Define production release requirements and sustainment architecture.

Post-production warranty support for the systems that have been implemented.

Syntel Inc., AZ, USA Nov 2007 – Apr 2011

Project Lead / Technical Architect

American Express Technologies, AZ, USA Apr 2008 – Apr 2011

Technical Architect / Tech Lead

Responsible for assessing data and BI requirements; defining the strategy, technical architecture, implementation plan and delivery of the data warehouse; and establishing the long-term strategy and technical architecture, as well as the short-term scope, for a multi-phased data warehouse effort.

Architecting, managing and delivering the technical projects /products for various business groups.

Hands-on experience with scalable data and reporting solution platforms, including IBM Big Data Edition, MicroStrategy, DB2 and Unix.

Managed and led a technical team of around 10 people, including both onsite and offshore members.

Leading team of developers for technical delivery of projects.

Design, architect and develop various data integration and reporting projects that vary from medium to complex in nature.

Providing solution architecture for various Data warehouse and Business intelligence systems.

Maintain system support and stability of the BI applications.

Drive new technological solutions to support data needs catering to different business requirements.

Delivering complex data integration and reporting projects to support various business needs

Serving as a Senior DW ETL Architect/Lead in defining, developing and enhancing a custom DW/BI system.

Primary responsibilities include, but are not limited to, coordinating, planning and conducting interviews with various business users to identify and capture business requirements.

Defining and developing the project scope/plan, and creating and developing enterprise-wide data warehouse and multi-dimensional data mart data models.

Designed, developed and implemented complex data models for large enterprise data, such as the Merchant Credit Card system and Card Member data reporting and analytics solutions.

Mainly involved in architecting the project from the initial design phase, migrating the existing reporting system to the newly proposed systems, and building the ETL integration, staging layer and BI layer for reporting solutions.

Designed and developed the EDW model, which has 13 subject areas and data marts, supporting various needs of the business using DB2, DataStage, Control M and MicroStrategy.

Designing data profiling strategies to improve the quality of source data loaded into the EDW systems.

Supported the production support teams during the pre- and post-warranty phases of the BI reporting projects, resolving production failures and issues on time.

Proposed and maintained performance tuning improvements to the ETL processes and reporting solutions to attain optimal performance and meet the SLA requirements of the business.

Training business users in navigating the environment and in creating and accessing reports.

Target Technology Services Ltd, India (Target Inc., USA) Oct 2005 – Nov 2007

Sr. Software Engineer / Lead

Led a team on various data projects.

Implemented enterprise level data integration and data warehouse projects.

Design, architecture and development of various data integration and reporting projects that vary from medium to complex in nature.

Solution architecture for various Business intelligence systems.

Manage system support and stability of the BI applications.

Led and involved the technical development team in the construction of ETL, BI solutions and data warehousing implementations.

Interacting with the project stakeholders, technical teams and source systems involved in the project life cycle.

Designed data marts employing Kimball’s Dimensional Modeling methodologies

Designed, developed and implemented the REDW and data marts for enterprise-wide financial credit systems using IBM Information Server DataStage, Informatica, Siebel OBIEE Suite, Oracle and DAC.

Gathered the requirements, completed the proof of concept, designed, developed and tested Physical Layer, Business Layer and Presentation Layer of OBIEE.

Implemented the Guest CRM applications integration into the EDW to enable reporting solutions using DataStage, Control M, DB2, etc.

Heavily involved in Informatica mappings, workflows and jobs for complete ETL interfaces and BI reporting products.

Designed, created and tested reports and dashboards, and created ad-hoc reports according to client needs.

Designed, developed and tested data security and dashboard security in OBIEE.

Worked with the Data Governance board on developing roles, tasks, processes, standards, etc.

Involved in the development of large, enterprise-wide data warehouse implementations for Target Financial Services and Target Guest systems.

Implemented OBIEE for Development, QA and Production environments.

Tracked the project efficiently with constant updates on progress and hurdles, focusing on the overall deliverables.

Provided production support for implemented projects during pre- and post-production warranty.

Aptra Infotech, India May 2004 – Oct 2005

Software Engineer

Design and development of complete DW and data system integration, platform development and BI reporting solutions.

Implemented enterprise level data integration and data warehouse projects.

Interacting with the project stakeholders, technical teams and source systems involved in the project life cycle.

Designed data marts employing Kimball’s Dimensional Modeling methodologies

Gathered the requirements, completed the proof of concept, designed, developed and tested Physical Layer, Business Layer and Presentation Layer of OBIEE.

Designed, developed and implemented the REDW and data marts for systems using Informatica, Siebel and Oracle.

