Tech Manager

Company:
Keylent Inc
Location:
Franklin, TN, 37068
Posted:
May 14, 2025

Description:

Tech Manager TECHM-JOB-25647

Role: Data Architect

Skill: AWS Analytics

Location: Franklin, TN

Full Time

Technical Skillset (Mandatory)

Big Data Tools: Hadoop, Cloudera (CDP), PySpark, Kafka, etc.

Data Pipeline & Integration: Airflow, Kafka, Informatica Cloud (IICS), PowerCenter

Data Governance: EDC, AXON

Stream-Processing Systems: Storm, Spark Streaming, AWS Kinesis

Databases: Relational, Amazon DynamoDB, MongoDB, Snowflake, PostgreSQL

AWS Cloud Services: EC2, S3, EMR, RDS, Lambda, DMS, Kinesis, Glue

Object-Oriented Languages: Python, R

Technical Skillset (Optional)

Visualization Tools: SAP BusinessObjects, Tableau

Role and Responsibilities

Responsible for the architecture and design of enterprise data warehouse processes in the cloud.

Accountable for all aspects of the data warehouse processes including data modeling, ETL, data validation, data quality assurance, and implementation.

Based on the customer's technology landscape, arrive at solutions that are agile, scalable, and cost-effective.

Provide proactive proposals to the customer on cloud data strategy.

Understand customer problem statements and leverage AI/ML capabilities for solutions.

Build reusable artifacts and accelerators in the cloud.

Designs and develops logical and physical data flow models for Extraction, Transformation and Loading processes to acquire and load data from internal and external sources.

Arrives at solutions and implements programs in various languages such as SQL stored procedures and Python.

Evaluates data process optimization opportunities and develops recommendations and improvements.

Translates data access, transformation, and movement requirements into functional requirements and mapping designs.

Creates and enhances data solutions enabling seamless delivery of data and is responsible for collecting, parsing, managing and analyzing large sets of data.

Collaborates with technical and business users to develop and maintain enterprise-wide solutions and standards to provide data required for metrics and analysis.

Partners with members of the Business Intelligence team to provide required access/structure to data.

Builds relationships with business intelligence partners to understand data needs and execute with excellence on documented user requirements.

Project-Specific Requirements (if any):

Data-oriented personality, good communication skills, and an excellent eye for detail.

Master's degree in a data-related program

Certifications in AWS Cloud and Machine Learning programs

Relevant Experience Required

12+ years' experience in data platforms and 5+ years in cloud, preferably AWS.

Proficient understanding of distributed computing principles.

Proficiency with ETL tools such as Informatica PowerCenter and Informatica Cloud, and cloud data platforms such as Snowflake.

Good knowledge of Big Data querying tools such as Pig, Hive, and Impala.

Strong understanding of Data Structures and Algorithms.

Experience integrating data from multiple data sources.

Experience with various messaging systems, such as Kafka.

Experience in scripting languages such as Python.

Proficient in designing solutions for real-time analytics.
