Big Data/Python/PySpark/Kafka Consultant / Sr Consultant / Manager

Company: Deloitte
Location: Aerodrome Area, Odisha, 751020, India
Posted: April 26, 2024

Description:

What impact will you make?

Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you’ll find unrivaled opportunities to succeed and realize your full potential.

The Team

The world of business, economics, and finance is rapidly changing. Trends in the economy affect businesses, industries, and the financial markets that interact with one another in dynamic and often unpredictable ways. The Deloitte Industry team is focused on analysing economic and industry developments in India and their relevance to businesses. The team is responsible for conducting path-breaking and innovative research, analysing trends, and developing in-depth business and industry/sector thought leadership that provides useful insights to enable business teams/partners to make strategic decisions.

Work you’ll do

In our Consulting team, you’ll build and nurture positive working relationships with teams and clients, aiming to exceed client expectations. You’ll:

develop and deploy solutions using different tools, design principles and conventions.

assemble large, complex data sets that meet functional/non-functional business requirements.

identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources and ‘big data’ technologies.

improve data ingestion and processing performance by tuning configuration parameters, optimizing SQL queries, and implementing parallel processing techniques (see the PySpark sketch after this list).

monitor pipeline performance and resource utilization to identify bottlenecks and optimize system resource allocation.

set up monitoring and alerting systems to track pipeline health, detect anomalies, and notify stakeholders of potential issues or failures.

implement logging and metrics collection to facilitate troubleshooting and performance analysis.

review existing processes and facilitate change requirements as part of a structured change control process.

prepare proper documentation for the solutions, test procedures and scenarios during the UAT and Production phases.

design and implement solutions covering Pipeline Development, Data Quality and Governance, Monitoring and Alerting, Data Virtualization and Query Optimization, Data Source Integration, Data Catalog Management, Security and Access Control, Integration with Applications and Ecosystems, and Data Ingestion and Management.

collaborate with process owners and the business to understand the as-is process and design the automation process flow.

deliver high-quality code resulting from knowledge of the latest frameworks, code peer review, and automated unit test scripts.

collaborate with other developers and team members.

participate in Agile ceremonies such as sprint planning, daily stand-ups, and retrospective meetings; develop and document unit tests and execute all processes and procedures in assigned areas of responsibility.
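
As a rough illustration of the performance-tuning and monitoring duties above, here is a minimal PySpark sketch. It is not part of the role description: the paths, configuration values, column names, and the 600-second SLA threshold are all invented for the example.

import logging
import time

from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

# Hypothetical tuning values; real settings depend on cluster size and workload.
spark = (
    SparkSession.builder
    .appName("etl-sketch")
    .config("spark.sql.shuffle.partitions", "200")  # shuffle parallelism
    .config("spark.sql.adaptive.enabled", "true")   # let AQE coalesce small partitions
    .getOrCreate()
)

start = time.time()

# Assumed source path and schema; replace with the real data source.
events = spark.read.parquet("s3a://example-bucket/events/")

# Repartition on a key so downstream stages run in parallel, then clean.
clean = (
    events.repartition(64, "event_date")
          .dropDuplicates(["event_id"])
          .filter("event_type IS NOT NULL")
          .cache()
)

# Simple metrics collection to support monitoring and alerting.
rows = clean.count()

clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/events_clean/"
)

elapsed = time.time() - start
log.info("pipeline finished: rows=%d elapsed_s=%.1f", rows, elapsed)
if elapsed > 600:  # invented SLA threshold
    log.warning("pipeline exceeded SLA; an alert to stakeholders would fire here")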

Qualifications

3-12 years of relevant hands-on experience with any or all of the Airbyte/Dremio/MinIO tools.

3-12 years of relevant hands-on experience with the Big Data ecosystem, on-premise or cloud.

3-12 years of relevant experience in the consulting industry, or technology experience in big data.

Turn information into insight for improved business performance by consulting with architects, solution managers, and analysts to understand business and technical needs and deliver solutions.

Design, create, code, and support a variety of Big Data ETL solutions (potentially including, but not limited to: Python, Spark (Scala), Kafka, different data services, Informatica, Talend, or others).

Create and maintain development patterns, standards, processes, and norms. Analyze existing processes and user development requirements to ensure maximum efficiency.

Demonstrate the ability to build a work plan or parts of a work plan, as applicable for role.

Implement medium/large projects.

Work with the team to design, code, and test ETL or ELT solutions for the reference architecture.

Experience with object-oriented/functional scripting languages: Python/Scala, SQL, Spark-SQL (a short example follows this list).

Proficient in shell scripting.

Embrace a Continuous Improvement mentality to streamline and eliminate waste in all processes.

Demonstrate initiative and ownership by proactively resolving issues and taking on multiple work plan tasks.

Effective communications with both technical and business communities.

Maintain strong technical skills and share knowledge with team members.

Strong problem solving and troubleshooting skills.

Willingness to travel as required by the project.

Working knowledge of agile methodologies.
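
As a short, hypothetical example of the Python and Spark-SQL skills listed above, the snippet below registers a small invented DataFrame as a temporary view and queries it with Spark SQL; the table and column names are made up for illustration.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sparksql-sketch").getOrCreate()

# Invented sample data standing in for a real source table.
orders = spark.createDataFrame(
    [(1, "IN", 120.0), (2, "US", 75.5), (3, "IN", 42.0)],
    ["order_id", "country", "amount"],
)
orders.createOrReplaceTempView("orders")

# Spark SQL alongside the DataFrame API, as the role calls for.
totals = spark.sql("""
    SELECT country, COUNT(*) AS orders, ROUND(SUM(amount), 2) AS revenue
    FROM orders
    GROUP BY country
    ORDER BY revenue DESC
""")
totals.show()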

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose

Deloitte is led by a purpose: To make an impact that matters.

Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte’s impact on the world.
