
AWS Solution Architect, Data Architect

Location:
Ashburn, VA
Posted:
June 29, 2018


Resume:

Prabhakara Talla

***** ******* ******, *******, ** 20148 C: 703-***-**** ******.*****@*****.***

https://www.linkedin.com/in/prabhakar-reddy-talla

https://blogs.informatica.com/2016/02/08/fannie-mae-data-quality-enhances-self-service/

Professional Summary

Over 15 years of IT experience in application development, architecture, and management. Over 10 years of data management experience, working extensively on data warehouse application development, data governance, and metadata. Around 4 years of experience with the Informatica suite of tools for ETL, data quality, data corrections, data masking, etc. 3 years as a Solution Architect in data quality and DQ platform services. 3+ years as an Enterprise Architect supporting enterprise initiatives. Strong knowledge and implementation experience of SOA, ESB, and ECF. Helped organizations with digital transformation and Customer 360 initiatives. Experience in architecting event-driven and microservices solutions. Successfully led teams of various sizes. Strong analytical skills and excellent oral and written communication skills. Proven mentor and trainer with expertise in communicating across organizational levels and with cross-functional teams to drive a shared vision and foster a culture of excellence.

Skills

AWS Full Stack

Hive, Presto, Spark

Unix shell scripting, Python, Node.js, Java

Hopex (Mega)

Kafka, Kinesis

Kanban - Rally, Jira and Trello

API Gateway, Axway, Swagger

Ab Initio, Informatica PowerCenter, Informatica Developer/Analyst, AWS Glue

DOORS, Confluence, SharePoint, Business Objects, MicroStrategy, Tableau

Oracle, Redshift, TOAD, Rapid SQL

VSS, Rational ClearCase, Subversion, Bitbucket

MS Office, MS Visio, MS Project

Certifications

TOGAF 9 Certified Architect

AWS Certified Solution Architect

ITIL Intermediate

Work History

Enterprise Architect, 10/2016 to Current

Fannie Mae – Herndon, VA

Collaborated with clients from concept through final delivery of the product. Developed the strategy for third-party data ingestion into the Enterprise Data Lake for analytical and application needs.

Worked closely with the Security division on role rationalization for S3; defined S3 ingress/egress patterns.

Delivered an end-to-end solution for GRC, from S3 ingestion through RDS hydration, to meet reporting users' needs.

Proposed, and proved with a reference implementation, the use of Lambda with Python for data transformation and movement from S3 to RDS (PostgreSQL).
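A minimal illustrative sketch of this pattern, assuming a hypothetical bucket, staging table, and connection settings rather than the actual implementation: a Python Lambda handler triggered by an S3 put event that loads a CSV object into a PostgreSQL (RDS) table via psycopg2.

# Sketch only: bucket, table, and environment-variable names are placeholders.
import csv
import io
import os

import boto3
import psycopg2  # shipped with the deployment package or a Lambda layer


def handler(event, context):
    s3 = boto3.client("s3")
    # The S3 event notification carries the bucket and key of the new object
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.reader(io.StringIO(body)))

    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
    )
    # The connection context manager commits the transaction on success
    with conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO staging.grc_feed (col_a, col_b) VALUES (%s, %s)",
            rows,
        )
    conn.close()
    return {"loaded_rows": len(rows), "source": f"s3://{bucket}/{key}"}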

Built a POC to explore AWS Glue capabilities for data cataloging and data integration.
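One way such a POC can be exercised with boto3, assuming hypothetical crawler, role, database, and bucket names: create a Glue crawler over an S3 prefix, run it, and then list the tables it registers in the Data Catalog.

# Sketch only: all resource names and ARNs below are placeholders.
import boto3

glue = boto3.client("glue")

glue.create_crawler(
    Name="edl-poc-crawler",
    Role="arn:aws:iam::123456789012:role/GluePocRole",
    DatabaseName="edl_poc_catalog",
    Targets={"S3Targets": [{"Path": "s3://edl-poc-bucket/third-party/"}]},
)
glue.start_crawler(Name="edl-poc-crawler")

# After the crawler finishes, the discovered tables are visible in the catalog
for table in glue.get_tables(DatabaseName="edl_poc_catalog")["TableList"]:
    print(table["Name"], [c["Name"] for c in table["StorageDescriptor"]["Columns"]])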

CI/CD using Jenkins, AWS CodePipeline, and CodeCommit.

Demonstrated in-depth knowledge of the AWS stack; conducted internal trainings and brown-bag sessions.

Built Docker container infrastructure to encapsulate code and dependencies into a portable file system with abstraction and automation.

Designed REST APIs for microservices; built an API governance strategy spanning dataset creation/registration through lookup.
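A minimal sketch of such an endpoint, assuming API Gateway's Lambda proxy integration and hypothetical resource names and response shapes: one handler that registers a dataset on POST and looks it up on GET.

# Sketch only: persistence is omitted and field names are placeholders.
import json


def handler(event, context):
    method = event.get("httpMethod")
    if method == "POST":
        dataset = json.loads(event.get("body") or "{}")
        # A real service would persist the registration (e.g. in DynamoDB)
        return {
            "statusCode": 201,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"registered": dataset.get("name")}),
        }
    if method == "GET":
        dataset_id = (event.get("pathParameters") or {}).get("id")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"id": dataset_id, "status": "registered"}),
        }
    return {"statusCode": 405, "body": json.dumps({"error": "method not allowed"})}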

Solution Architect, 10/2015 to 10/2016

Fannie Mae – Herndon, VA

As part of the enterprise Cloud First initiative, conducted cloud assessments for all new project initiatives.

Defined ingestion, compute, and vending patterns using AWS-native tools.

Designed the S3 security architecture for on-prem-to-cloud and cloud-to-on-prem data transfer and storage.

In collaboration with the information security team, derived S3 ingestion and access patterns.

Assisted with data governance standards for the EDL (Enterprise Data Lake).

Built a POC to test CloudFormation templates and spin up EMR via AWS Service Catalog.
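A sketch of how such a provisioning call can look with boto3, assuming a hypothetical Service Catalog product backed by a CloudFormation template; the product/artifact IDs and parameter keys below are placeholders, not actual values.

# Sketch only: IDs and parameter names depend on the underlying template.
import boto3

sc = boto3.client("servicecatalog")

response = sc.provision_product(
    ProductId="prod-emrexample123",
    ProvisioningArtifactId="pa-exampleartifact1",
    ProvisionedProductName="analytics-emr-poc",
    ProvisioningParameters=[
        {"Key": "CoreInstanceCount", "Value": "3"},
        {"Key": "InstanceType", "Value": "m5.xlarge"},
    ],
)
# The record ID can be polled to track the CloudFormation stack launch
print(response["RecordDetail"]["RecordId"])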

Solution Architect, Lead Developer, 12/2011 to 10/2015

Fannie Mae – Herndon, VA

Architected and designed generic infrastructure to meet enterprise-level data quality needs and support DQ dashboard reporting.

Oversaw DQ platform services, including infrastructure, solution enhancements, and data model changes. Defined access control and change management procedures for DQ rule creation/editing in IDQ Analyst.

Ensured adherence to data management standards in building the target-state application: a common vocabulary in the logical model for enterprise use, decoupled architecture for data exchange, and avoidance of peer-to-peer interfaces.

Explored tools and documented patterns and use cases for big data quality.

Designed and delivered a metadata-driven EDQ interface to bring DQ metrics from LOB applications into the EDQ area in ECF via the ESB.

Lead Developer, 10/2010 to 12/2011

Fannie Mae – Herndon, VA

As part of the enterprise data quality initiative, evaluated the Informatica Data Quality tool by developing a prototype for the CPRM project.

Played a major role as architect in choosing the federated model for DQ rule development.

Worked closely with the vendor and raised several service requests for bugs/gaps found during prototype development, thereby helping improve the tool.

Guided and mentored the team as needed on new features of the tool.

Lead Developer, 06/2006 to 10/2010

Fannie Mae – Herndon, VA

Conducted architecture reviews and presentations.

Applied metadata management concepts in the IDB bulletproofing process.

Led impact analysis and setup during Oracle and Ab Initio upgrades; oversaw release management.

Defined ETL requirements and design for a metadata-driven process.

Ran a pilot project to assess cross-database challenges across Sybase, Oracle, etc.

Module Lead, 07/2005 to 03/2006

Wipro Technologies – Bengaluru, KA

Client: Capital One

Delivered design documents and source-to-target mappings.

Built generic Ab Initio graphs, wrapper scripts, and Control-M scheduler jobs to run the scripts. Used Ab Initio's built-in metadata management capabilities such as dynamic DML generation. Designed multifile systems to handle distributed data files.

Team Lead & Designer, 09/2004 to 07/2005

Accenture Services Pvt Ltd – Mumbai, MH

Client: DELL International Services

Created source-to-target mappings, design templates (using Ab Initio), and low-level design documents identifying the sources, targets, and detailed data flow to aid graph construction. Implemented data parallelism in graphs using Ab Initio partition components.

Team Lead & Designer, 06/2003 to 10/2004

Accenture Services Pvt Ltd – Mumbai, MH

Client: General Electricals

Extracted all vehicle data from the source database into flat files and loaded it to build a repository of Marcom details in Teradata. Migrated custom ETL to Ab Initio. Scheduled jobs using crontab and Control-M. Resolved production issue tickets.

Education

Master of Science: Computer Applications

Kakatiya University - Warangal, Telangana, India


