Data Engineer (Deep Learning)

We are looking for a versatile data engineer who can effectively contribute to deep learning model development. You are experienced in building modern data-driven solutions and are comfortable diving deep into real-life data. You will work as part of our R&D team, which draws on experience from projects such as developing Amazon Alexa and Apple Siri.

You will be a key technical expert in our experienced R&D team responsible for:

  • Processing and curating large amounts of machine learning training data
  • Analyzing, optimizing, and evaluating performance-critical model components
  • Developing and maintaining scalable machine learning training infrastructure
  • Implementing highly efficient state-of-the-art deep learning algorithms

You will be a great candidate for this position if you have:

  • Experience with large-scale data processing
  • Hands-on experience with state-of-the-art deep learning model training pipelines
  • Understanding of machine learning frameworks (PyTorch/TensorFlow)
  • Been involved in modern software development projects
  • An entrepreneurial mindset
  • Good teamwork skills

Tools and technologies we are using:

  • Programming languages: Python, C/C++, Unix scripting
  • Big data processing: streaming (e.g. Kafka), batch (e.g. Spark)
  • Machine learning: PyTorch, TensorFlow
  • Cloud: AWS, Google Cloud
  • DevOps: Docker, Kubernetes
  • GPU programming: CUDA

We can offer you a once-in-a-lifetime opportunity to build something amazing with us, and of course also:

  • Opportunity to work with state-of-the-art technologies
  • Competitive salary
  • Employee equity plan
  • Flexible remote work policy (though not fully remote)
  • Attractive perks and benefits
  • Awesome office space
  • Top-notch kit (latest Macs, etc.)
  • Opportunity to work with a world-class and fun team
  • Opportunity to contribute to shaping our culture

To apply

Please submit your resume along with a cover letter describing how you have developed data pipelines for modern big data systems and what you learned while doing so.