Data Engineer

Location:
Allston, MA, 02134
Salary:
120000
Posted:
February 05, 2018

Email: ac4cn5@r.postjobfree.com

Mobile: 781-***-****

Professional Summary

14+ years of IT and consulting experience as an AWS Solution/Data Architect for Cloud and DW/BI platforms

Experienced with ETL tools such as Informatica PowerCenter/Cloud and Talend, and with a broad range of AWS services (EC2, S3, Glacier, EBS, EFS, Redshift, DynamoDB, RDS, PostgreSQL, EMR, Data Pipeline, SNS, SQS, SWF, VPC, DMS, Snowball, Route 53, ELB, CloudWatch, AWS Config, ACM, Athena, Elasticsearch, Kinesis, CloudFront, CloudTrail, NAT Gateway, Auto Scaling, CloudFormation templates, IAM, Billing & Cost Management, Elastic Beanstalk, OpsWorks, Storage Gateway, etc.)

Extensive experience in assessing existing infrastructure landscapes, matching workloads to cloud products, designing cloud architectures, building proofs of concept, proposing design improvements, estimating costs, implementing AWS cloud infrastructure, and recommending public vs. private cloud placement for application migrations

Extensive experience with Python and Node.js, and with JSON for configuration and data interchange

Working experience in Bash and Python scripting with a focus on DevOps CI/CD tooling, AWS cloud architecture, and hands-on engineering

Extensive experience with DevOps automation: orchestration/configuration management and CI/CD tools such as Chef, Puppet, Jenkins, and AWS OpsWorks

Working closely with customers to understand their business needs, technical goals, and challenges

Managing the balance between the organization's technical demands and the capacity provisioned from the AWS cloud

Migrating customers' on-premises workloads (applications) to the AWS cloud platform

Evaluating and improving existing AWS deployments

Defining and deploying monitoring, metrics, and logging systems on AWS (a CloudWatch alarm sketch appears at the end of this summary)

Performing lift-and-shift migrations of existing on-premises applications to AWS

Extensive experience with source code control concepts such as branching, merging, and maintaining stable master code using Git repositories (Bitbucket) and Subversion

Identifying appropriate use of AWS services, deployment types, and operational best practices

Identifying continuous service improvement and automation opportunities, and preparing scripts that automate manual tasks to improve service quality and reliability

Tracking best practices and market trends in cloud and the broader industry to provide thought leadership and mentoring the team to build the necessary competencies

Experience in architecting, deploying, and managing cost-effective, secure AWS environments across multiple Availability Zones and Regions, leveraging services such as EC2, S3, RDS, and VPC

SME for Informatica DW/BI technical architecture, with domain experience in healthcare, life sciences, pharmacy, clinical trials, telecom, and finance/investment, including process improvement and re-engineering

Expert in the Informatica ETL suite, including PowerCenter, PowerMart, PowerConnect, Designer, Workflow Manager, Workflow Monitor, Repository Manager, and the Repository Server Administration Console

Expertise in the analysis, design, and development of ETL programs per business specifications, and in troubleshooting complex performance issues

Hands-on experience with Informatica Cloud services such as data synchronization, data replication, mapping configuration, and object migration from Dev to SIT, UAT, and Prod environments

Hands-on experience creating a Global Delivery Framework (GDF) model

Performed customer requirements gathering, requirements analysis, design, development, testing, end-user acceptance presentations, implementation, and post-production support for BI/Cloud projects

Successfully managed, developed, and implemented various Cloud/DWBI projects covering end-to-end (HLD/LLD) development and maintenance, model optimization, workflow schedule automation, and technical solution delivery using Tableau and OBIEE analytical tools

Hands-on with advanced data model and data mart design, including dimensional models and fact tables, using Erwin and ER/Studio

Expertise in technical architecture, data modeling, and redesign of existing models for strategic solutions

Proven track record of solving performance issues in any kind of environment using innovative methods, strategic solutions, and ideas

Collaborating with product development teams and senior designers to develop architectural requirements and ensure client satisfaction with the product

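As referenced in the monitoring item above, the following is a minimal, hypothetical sketch of defining a CloudWatch alarm that notifies an SNS topic, in Python with boto3. The alarm name, instance ID, and topic ARN are illustrative assumptions, not artifacts from any engagement described below.

    # Hypothetical sketch: CloudWatch CPU alarm that notifies an SNS topic.
    # Alarm name, instance ID, and topic ARN are illustrative assumptions.
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    cloudwatch.put_metric_alarm(
        AlarmName="high-cpu-example",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,                # evaluate in 5-minute windows
        EvaluationPeriods=2,       # two consecutive breaches trigger the alarm
        Threshold=80.0,            # percent CPU
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # assumed topic
    )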

Professional Experience

Currently with Collaborative Consulting LLC, Burlington, MA (now part of CGI), from Sep 2014 to present.

Fresenius Medical Care, Lexington, MA (as a consultant) from Sep 2013 to Sep 2014.

Infosys Limited, Australia from Jun 2010 to Aug 2013.

Mahindra Satyam, Australia from Jun 2006 to Apr 2010

Hewlett-Packard (HP), Singapore, from Jan 2004 to May 2006.

Clients & Projects

Takeda Pharmaceuticals (Jun 2015 to present)

Project: Web App/Data Lake Cloud Migration

Role: AWS Solution Architect/Data Migration Architect through CGI

Takeda is the largest pharmaceutical company in Japan and the 15th largest in the world; it focuses its R&D efforts on oncology, gastroenterology, and central nervous system therapeutic areas, plus vaccines.

As an AWS Solution Architect/Data Migration Architect, I am responsible for business engagement, customer requirements, business communications, team management, and delivery. Developing a migration plan to move on-premises apps into AWS in terms of time, cost, security, and availability. Designing and implementing both the front-end and back-end systems that run on AWS, in line with organizational compliance and security policies. Designing and deploying scalable, highly available, and fault-tolerant systems on AWS. Performing application assessments, identifying the right candidates for migration, and preparing the migration plan document. Managing user access to AWS resources using Identity and Access Management (IAM): S3 bucket creation, bucket policy setup, lifecycle creation, encryption at the bucket level, and IAM role-based policies for S3 buckets. Creating and maintaining custom AMI templates for DR purposes. Estimating AWS costs and identifying cost-control mechanisms to trace where costs come from. Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web applications and database services across environments (a deployment sketch follows this description). Implemented detailed monitoring and notifications for the cloud environment using CloudWatch and Simple Notification Service. Configuring DNS (Route 53), ELB, Auto Scaling, VPCs, subnets, route tables, NACLs, security groups, and Direct Connect. Created workflows for AWS Data Pipeline jobs, defining activities, schedules, and parameters. Managing Amazon Redshift clusters, including launching clusters and specifying node types. Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed in Java. Implemented continuous integration using Jenkins and CloudFormation update stacks. Delivered multiple cloud awareness and technical trainings to customer teams.

Technical tools used: Windows Server 2008 R2/2003/2012, Windows 7/XP, Active Directory, VMware ESXi 5.0/5.1, Horizon View, Cisco enterprise network, Jira, Git, and multiple AWS services.
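
As noted above, a minimal sketch of driving one of these CloudFormation deployments from Python with boto3. The template file, stack name, parameter keys, and CIDR values are assumptions for illustration, not the actual project artifacts.

    # Hypothetical sketch: deploy a custom-sized VPC stack via CloudFormation.
    # Stack name, template file, and parameter values are illustrative assumptions.
    import boto3

    cfn = boto3.client("cloudformation", region_name="us-east-1")

    # Assumed template defining the VPC, subnets, and NAT gateway.
    with open("vpc_template.yaml") as f:
        template_body = f.read()

    # Validate the template before creating the stack.
    cfn.validate_template(TemplateBody=template_body)

    cfn.create_stack(
        StackName="webapp-vpc",  # hypothetical stack name
        TemplateBody=template_body,
        Parameters=[
            {"ParameterKey": "VpcCidr", "ParameterValue": "10.0.0.0/16"},
            {"ParameterKey": "PublicSubnetCidr", "ParameterValue": "10.0.1.0/24"},
        ],
        Tags=[{"Key": "Project", "Value": "DataLakeMigration"}],
    )

    # Block until the stack is fully created.
    cfn.get_waiter("stack_create_complete").wait(StackName="webapp-vpc")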

Arbella Insurance (Sep 2014 to May 2015)

Project: WebApp/DBMS Systems Migration to AWS

Role: AWS Solution Architect/DevOps Engineer through CGI

Arbella Insurance, headquartered in Quincy, Massachusetts, is a regional property and casualty insurance company providing business and personal insurance in Massachusetts and Connecticut, as well as business insurance in Rhode Island and New Hampshire.

As an AWS SA/DevOps Engineer, I am responsible for business engagement with the customer, the migration plan, the cost-effectiveness case for AWS services, understanding the existing on-premises servers and apps, creating the high-level solution design document for the migration, implementing the POC, and providing transparent communication to the client and all stakeholders. Reduced overhead and infrastructure cost by 40% by re-architecting, consolidating, and deploying on-premises/COTS applications to the AWS cloud. Developing a migration plan to move on-premises apps into AWS in terms of cost, security, and availability. Performing application assessments, identifying the right candidates for migration, and preparing the migration plan document for the AWS cloud. Implementing detailed monitoring and notifications for the cloud environment using CloudWatch and Simple Notification Service. Setting up a DR strategy using Storage Gateway and S3 for on-premises storage and DB backups. Managing user access to AWS resources using Identity and Access Management (IAM): S3 bucket creation, policy setup, lifecycle creation, encryption at the bucket level, and IAM role-based policies. Working on the automation and continuous integration process with Jenkins and Chef. Creating S3 buckets and managing their policies, and utilizing S3 and Glacier for storage and backup on AWS (a bucket-setup sketch follows this description). Bootstrapping EC2 instances using Chef and integrating with Auto Scaling lifecycle hooks. Set up SSO between on-premises and AWS using the Active Directory service. Set up and built AWS services: VPC, EC2, S3, IAM, EBS, EFS, security groups, subnets, NACLs, Auto Scaling, ELB, and CloudFormation templates. Creating Bash and Python scripts with a focus on DevOps CI/CD tools such as Jenkins and Chef. Set up custom CloudWatch metrics for application monitoring and high availability. Automating backups with shell scripts that transfer data to S3. Created CloudWatch alerts for EC2 and used them in Auto Scaling for scale-in and scale-out.

Technical tools/services used: RHEL, Windows, AWS CloudFormation, Python, shell scripts, CI/CD, Git, Docker, Jira, mail/application/domain servers, Oracle database servers, and file servers.
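
As referenced above, a minimal sketch of the S3 bucket-setup pattern (bucket creation, default encryption at the bucket level, and a lifecycle rule archiving to Glacier), assuming a hypothetical bucket name, prefix, and transition window:

    # Hypothetical sketch: create an S3 bucket with default encryption and a
    # lifecycle rule that transitions backups to Glacier. Names are illustrative.
    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")
    bucket = "db-backups-example"  # assumed bucket name

    s3.create_bucket(Bucket=bucket)

    # Enable default server-side encryption (SSE-S3) at the bucket level.
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
        },
    )

    # Lifecycle rule: move objects under backups/ to Glacier after 30 days.
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-backups",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            }]
        },
    )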

Fresenius Medical Care (FMC) Idec (Sep 2013 to Sep 2014)

Role: ETL Architect, Lead Developer through iSync Consulting

FMC is one of the leading dialysis companies in the USA. This project built a new DW on Netezza, using Data Vault modeling, from the Oracle EDW to support new analytical reporting capabilities for clinics and financials. It covered data ingestion from the eCC and eCF applications, staging, the DW, and data marts. Technologies used: Informatica PC 9.7.1, SAP BO 4.1, Oracle 11g, Netezza 7.1, the eCC and eCF apps, and SUSE Linux.

Infosys Technologies Limited, Australia (Jun 2010 to Aug 2013)

Clients: NAB MLC, Westpac Bank, Sydney, Australia

Role: System Architect

Infosys Limited is an Indian multinational corporation that provides business consulting, information technology, and outsourcing services to clients worldwide. As a Technology Architect at Infosys, I played different roles for different clients. My tenure covered designing complex ETL solutions; data analysis, profiling, and quality assurance of all architecture and design deliverables; reviewing all development deliverables and certifying them complete through the unit test, system test, and QA phases; evaluating all proposal requests and improving the structure of the data warehouse; managing business and IT stakeholder expectations, project escalations, resource allocation across projects, and the quality of milestone deliverables; delivering fixed-price and Time & Materials (T&M) projects within committed timelines and budgets; and providing technical direction to data modelers, data analysts, ETL developers, and database administrators. Set up the onsite/offshore model and led a team of 15 people (onsite and offshore). Technical tools used: Informatica PC 9.1.0, Oracle 11g, DataStage 8.1, SAP BO, UNIX, shell scripting, SQL, PL/SQL.

Mahindra Satyam Limited, Australia (Jun 2006 to Apr 2010)

Clients: ANZ Bank and Telstra Telecom, Melbourne, Australia; Credit Suisse Bank, Singapore

Role: Business Integrator

Mahindra Satyam is an Indian IT services company. As a Sr. Systems Analyst and Business Integrator, I played different roles for different clients. My tenure covered designing complex ETL solutions; data analysis, profiling, and quality assurance of all architecture and design deliverables; creating HLDs and LLDs for the ETL team; code migrations across environments; Informatica application upgrades and administration; ETL code development in Informatica and DataStage; Cognos administration; and attending CAB meetings to discuss impacts on the DW systems. Acted as source code controller in Subversion. As a BI Architect at ANZ Bank, performed dimensional modeling and BO universe and report creation for the business team. Managed a team of 15 people onsite and offshore. Technical tools used: Informatica PC 9.1.0, Oracle 11g, Teradata (TPump, BTEQ scripts, MLoad, FastLoad), DataStage 8.1, Cognos, SAP BO, UNIX, shell scripting, SQL, PL/SQL.

HP Singapore (Jan 2004 to May 2006)

Role: DW Consultant/BI Administrator through Optimum Solutions, Singapore

As a consultant at HP, this was primarily a technical role: ETL code development in Informatica PowerCenter 6 and administration; BusinessObjects universe and report creation for the profiling team across Asia Pacific; customizing the worldwide BO universe for Asia Pacific; and supporting the business channel team on all data, report, and access related issues. Technologies used: Informatica PC 7, BO 6.1a, Oracle 7, and HP-UX.

EDUCATION

Master of Science in Computer Science (M.Sc. Computers)

Certifications

Certified in AWS Solution Architect – Professional

Certified in AWS DevOps Engineer – Professional

Certified in AWS Solution Architect – Associate

Certified in AWS Developer – Associate

Certified in AWS SysOps Admin – Associate

Certified Scrum Master from SCRUMStudy

Certified Informatica Developer from Informatica Corporation

Certified from Kimball University on Dimension Modeling

Certified in SAP Business Warehouse powered by SAP HANA

Certified in Data Science Methodology from IBM (Big Data University)

Certified in Hadoop fundamentals from IBM (Big Data University)

Certified in Data Movement into Hadoop from IBM (Big Data University)


