
Big Data Architect

Company:
Aa247
Location:
Nagpur, Maharashtra, India
Posted:
April 24, 2024

Description:

Big Data Architect – Cloud and Data Engineering Competency Center

Purpose of the Position: To provide client advisory, pre-sales, and practice development support for the InfoCepts Cloud and Data Engineering competency center. The position enables InfoCepts to meet its business goals by forging enduring, practice-initiated solutions, supporting the delivery team with technical advisory and consulting, curating persuasive proposals, and setting the future direction of the competency center through research.

InfoCepts provides end-to-end data and analytics solutions to commercial and public-sector customers in markets across the globe, with a concentration in the US, EMEA, and APAC regions. InfoCepts invests in four core competencies: Cloud and Data Engineering (CDE), Analytics and Data Management (ADM), Business Consulting (BC), and Service Management (SM). These enable continued access to global markets and unmet needs. The position is responsible for building out and rolling out solutions, leading advisory engagements, responding to proposals, developing and managing the center of excellence, leading technology roadmaps, building in-house capability, and conducting market research.

Reporting to the Director of the CDE Competency Center, the position works closely with the sales, pre-sales, marketing, delivery, and other competency center teams, and executes the assigned responsibilities together with the competency team.

Location: India

Travel: Up to 20%, subject to public health guidelines in effect

Type of Employment: Full time

Position Type: Exempt

Key Result Areas and Activities:

Thought Leadership – Build and share experience- and evidence-based thought leadership in the Big Data domain through channels such as Yammer blogs, white papers, webinars, events, and conferences. Provide input on creating and enhancing enduring solutions for the practice.

Solutions and Sales Support – Support pre-sales activities, including conducting discovery workshops, developing points of view/solutions/scoping, writing proposals, and supporting sales and marketing enablement for Big Data-related technologies. Lead solution development for the practice.

Consulting for New and Existing Clients – Provide advisory consulting to new and existing customers, support delivery hand-offs, address unmet needs, and manage client expectations for their Big Data technologies.

Share and Build Expertise – Develop and share expertise in Big Data technologies, and actively mine the experience and expertise within the organization for sharing across teams and clients. Lead and support the Big Data COE with an organized meeting cadence. Create and maintain a vibrant Cloud technology lab as a hotbed of technological innovation. Guide the COE as a capability hub for new technologies, with due consideration of their usability. Participate and contribute in other cloud and Big Data centers of excellence.

Nurture and Grow Talent – Provide support for recruitment, coaching, and mentoring, and build the firm's practice capacity in Big Data technologies.

Work and Technical Experience:

Experience in data and analytics domain

Knowledge of a variety of advanced architectures, tools, and concepts across all layers of the modern distributed technology stack (Hadoop, Spark, Kafka, Hive, HBase, Sqoop, Impala, Flume, Oozie, Airflow, Mahout, MapReduce, etc.)

Experience in designing solutions for data lakes or large data warehouses, with a good understanding of cluster architecture on MPP database systems and/or NoSQL platforms

Experience with programming languages such as Python, Scala, or Java, as well as scripting

Experience enabling DevOps automation with appropriate security and privacy considerations

Expertise in developing ETL workflows comprising complex transformations such as slowly changing dimensions (SCD), deduplication, and aggregation; experience with ETL tools such as Informatica and Talend is a plus

Experience with streaming technologies such as Flink, Storm, Kafka Streams, and Spark Streaming

Experience in configuring and provisioning network and infrastructure for Big Data applications

Knowledge of containerization with Docker and its orchestration through Kubernetes

Proven ability to work on multiple requirements, design, and development approaches and methodologies (agile, iterative, waterfall, etc.)

Experience with cloud platforms such as AWS, Azure, or GCP is an added advantage

Pre-sales experience is an added advantage

Qualities:

Self-motivated and focused on delivering outcomes for a fast-growing team and firm

Able to communicate and consult persuasively through speaking, writing, and client presentations

Able to work in a self-organized and cross-functional team

Able to iterate based on new information, peer reviews, and feedback

Able to work with teams and clients in different time zones

Research-focused mindset

Bachelor’s degree in computer science, engineering, or related field (Master’s degree is a plus)

Demonstrated continued learning through one or more technical certifications or related methods

At least 14 years of IT experience, including a minimum of 6 years on Big Data-related projects

Schedule: Full-time

Shift: Day Job
