We are looking for an experienced deep learning data engineer. You have been building and optimizing modern deep learning solutions and are comfortable diving deep into implementation details. You will work closely with our R&D team, whose members bring experience from building products such as Amazon Alexa and Apple Siri.

You will be a key technical expert in our experienced R&D team responsible for:

  • Analyzing and optimizing performance-critical model computation
  • Developing and maintaining scalable machine learning training infrastructure
  • Implementing highly efficient state-of-the-art deep learning algorithms

You are a great candidate for this position if you have:

  • In-depth understanding of the computational bottlenecks of modern deep learning models, both in training and inference
  • Experience with large-scale data processing
  • Hands-on experience with state-of-the-art deep learning model training pipelines
  • Understanding of machine learning frameworks (PyTorch/TensorFlow)
  • Low-level code optimization skills for GPU-based computation (considered an advantage)
  • Good teamwork skills

Tools and technologies we are using:

  • Programming languages: Python, C/C++
  • Big data processing: streaming (e.g. Kafka), batch (e.g. Spark)
  • GPU programming: CUDA
  • Machine learning: PyTorch, TensorFlow or similar
  • Cloud: AWS, Google Cloud
  • DevOps: Docker, Kubernetes

We can offer you a once-in-a-lifetime opportunity to build something amazing with us, and of course also:

  • Opportunity to work with state-of-the-art technologies
  • Competitive salary
  • Employee equity plan
  • Liberal remote work policy (though not fully remote)
  • Attractive perks and benefits
  • Awesome office space
  • Top-notch equipment (latest Macs, etc.)
  • Opportunity to work with a world-class, fun team
  • Opportunity to contribute to shaping our culture

To apply

Please submit your resume and cover letter to careers@speechly.com. In your cover letter, describe how you have developed data pipelines for modern big data systems and what you have learned while doing so.