Big Data Engineer

9168 · Hyderabad, India · Business Intelligence · Permanent

The role:

We are looking for a Big Data Engineer to join our busy and dynamic Business Intelligence team.

As part of our data platform team, you will drive the design and development of the core platform frameworks that underpin delivery for the Data Management, Data Discovery and Analytics group, using emerging big data technologies. You will apply your knowledge and expertise across every stage of the software development lifecycle, and partner daily with business stakeholders to stay focused on common goals.

Why we need you:

You will develop real-time and batch processes using a wide range of technologies including:

  • Hadoop
  • Spark
  • Flink
  • Storm
  • Hive
  • HBase
  • Kafka

You will efficiently translate architecture and low-level requirements into design and code for Big Data processing using SQL and Java. You will optimise Big Data jobs and investigate processing bottlenecks.

You will be responsible for the design, development and documentation of Hadoop applications, and will handle the installation, configuration and support of Hadoop cluster nodes.

You will develop and maintain backend MapReduce, Tez, Storm and Flink applications, and translate complex technical and functional requirements into detailed designs.

As a dedicated team member, you will propose best practices and standards, test software prototypes and hand them over to the operations team. You will maintain data security and privacy across the data warehouse, and be responsible for the management and deployment of HBase data structures.

Who we are looking for:

Our ideal candidate has:

  • Familiarity with the Hadoop ecosystem and its components
  • 3+ years' experience with streaming systems and processing large data volumes
  • The ability to write reliable, manageable, high-performance code
  • Expert knowledge of HDFS, YARN and HBase
  • Working experience with HQL
  • Experience writing Tez and MapReduce jobs and Kafka applications
  • Hands-on experience with programming languages, particularly Java and Python
  • Analytical and problem-solving skills, and the ability to apply them in the Big Data domain
  • An understanding of data loading tools such as Flume and Sqoop

What’s in it for you?

Our experience-based salaries are competitive. Plus, there’s a discretionary annual performance bonus.

Your package will include:

  • cash allowance for health and dental care
  • company pension scheme
  • a personal interest allowance to let you learn something new or pursue a hobby
  • a 34,000 INR congratulations payment if you have a baby whilst you work for us
  • in-house training and development to develop your skills, progressing your career
  • discounted gym membership
  • cash allowance for meals

What happens next?

If you’re what we’re looking for, next up will be a phone interview. And if that goes well, we’ll meet you for a face-to-face interview.

The Group

PokerStars is part of Flutter Entertainment Plc, a global sports betting, gaming and entertainment provider headquartered in Dublin and listed on the FTSE 100 index of the London Stock Exchange. Flutter brings together exceptional brands, products and businesses and a diverse global presence in a safe, responsible and ultimately sustainable way.

We are an equal opportunity employer that values diversity. We do not discriminate on any protected characteristic as defined by applicable law.

We will look to provide reasonable accommodation for applicants with disabilities to participate in the job application or interview process. If you need assistance, please contact: talent@starsgroup.com.

Please note we cannot accept general applications; this inbox is just for providing support to those who need it.

Our FAQs

We hope that we’ve answered as many of your questions about working at PokerStars as possible, but if you still have some questions, why not try visiting our FAQ page?

Find your answers here