Data Engineer

An established firm in the gaming industry is currently looking for a Data Engineer.

On Offer:

  • Agile and multicultural company with flat hierarchies
  • A team of self-organised, self-responsible and entrepreneurial colleagues
  • Competitive salary, health and dental insurance, performance bonuses, subsidized parking, sports incentives and childcare
  • Opportunities to develop and grow
  • Relocation Assistance

Main Responsibilities:

  • Creating and maintaining both batch (ETL) and real-time data pipelines and architecture
  • Assembling large and complex data sets that meet functional and non-functional business requirements
  • Identifying, designing and implementing internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Building analytics tools that utilize the data pipelines to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Collaborating with stakeholders, including data architects and Executive, Product, Data and IT team members, from the start of a project through to delivery


Requirements:

  • Minimum 3 years of relevant work experience
  • Strong analytical skills and the ability to innovate and think outside the box
  • Able to learn new and complex concepts quickly, and relentlessly resourceful
  • Collaborative, able to engage in interactive discussions with the rest of the team and able to communicate technical concepts clearly and concisely
  • Familiar with AWS cloud data services such as S3, Athena, EC2, Redshift, EMR and Lambda
  • Experience with large-scale production databases and SQL
  • Experience with time-series/analytics databases such as Elasticsearch
  • Experienced with ETL development (extractions, data load, aggregation, Talend, etc.)
  • Worked with, or familiar with, big data technologies such as Hadoop, NiFi, Kafka, Spark and Logstash
  • Familiar with containerization and orchestration technologies like Docker/Kubernetes