
DataOps Implementation Engineer

@ DataKitchen



Contract Type: Permanent
Experience Level: 2 years

About the DataOps Implementation Engineer Role

The DataOps Engineer will plan and execute implementation projects using our DataOps software to make our customers wildly successful. Some projects focus on SQL. Most projects are integration focused and use Python. All projects orchestrate data pipelines, reduce error rates, and deploy code into production (CI/CD). This position requires technical skills, communication skills, attention to detail, follow-up, and the ability to self-manage.


You must be located within GMT+2 (e.g. Italy) to GMT-8 (e.g. California). We will not consider candidates outside those time zones. We do not work with recruiters. Everyone else, if in doubt please reach out!

  • You will be responsible for all aspects of the success of a project. You will plan and implement the use of DataKitchen’s DataOps software in customer environments, starting with Proof of Concept projects through ongoing production operation.
  • Some projects will be SQL focused. You will gather requirements, work with raw data, design a schema, do data transformation, write automated tests, and manage deployment and operations.
  • Other projects will be more integration focused. You will orchestrate the customer’s existing tools and analytic assets via Docker, APIs, or CLIs. You will use cloud (e.g. AWS, GCP, Azure) facilities to spin up environments.
  • In both cases, you will become a master at using DataKitchen software to orchestrate, test, and deploy Recipes (data pipelines).
  • You will engage in consistent, proactive client communication to strengthen customer loyalty, and partner with internal teams to drive growth and resolve customer concerns efficiently and decisively.
  • Travel to customer sites as necessary (Covid-19 safety permitting).
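The day-to-day pattern in the responsibilities above — transform data, run automated tests, and only then promote to production — can be sketched in plain Python. This is an illustrative example only; none of the function names below come from DataKitchen's software:

```python
# Illustrative sketch of a pipeline step gated by automated data tests.
# All names here are hypothetical, not DataKitchen's API.

def transform(rows):
    """Normalize raw records: strip whitespace, cast amounts, drop empty names."""
    return [
        {"name": r["name"].strip(), "amount": float(r["amount"])}
        for r in rows
        if r["name"].strip()
    ]

def run_data_tests(rows):
    """Automated checks that must pass before the step is promoted."""
    assert len(rows) > 0, "pipeline produced no rows"
    assert all(row["amount"] >= 0 for row in rows), "negative amount found"
    return True

raw = [{"name": " Acme ", "amount": "19.99"}, {"name": "", "amount": "5"}]
result = transform(raw)
run_data_tests(result)  # deployment would be blocked if any assertion fails
print(result)
```

In a real engagement the transformation might be SQL rather than Python and the tests would run inside a CI/CD pipeline, but the shape — transform, test, deploy — is the same.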

Qualifications and Skills:

Technical (Required)

  • Experience with SQL and Python (or equivalent).
  • Experience delivering products in data management, analytics, data pipelines, or data science.
  • Experience with continuous integration frameworks and unit testing.
  • Experience with cloud platforms such as AWS, GCP, or Azure.
  • Experience with Docker.
  • Experience with shell scripting.
  • Experience on implementation projects.
  • Excellent written and verbal communication skills, including listening and the ability to adapt to audience needs.
  • Superior customer service skills – the ability to be empathetic, compassionate, responsive, resourceful, and solution-oriented.
  • Master's degree in Computer Science, Information Technology, or equivalent.
