# find-a-prefect-job
Hey guys, I'm hiring a Data Engineer to join my team at Simplebet! Details in thread.
At Simplebet we treat data as a product: we maintain high-quality datasets used by our machine learning engineers, product analysts, and adjacent software teams. We also believe in building a data platform: we build and deploy the applications that power our data stack. As a Data Engineer, you will design and develop data systems and reporting tools for our internal customers, both to improve our product and to inform critical business decisions. You'll have the opportunity to shape the future of data analytics at Simplebet and help us continue to become a data-driven organization by building out the tools and infrastructure that support enterprise business intelligence.

**Responsibilities**
◦ Build robust and reliable ETL pipelines with best practices and data governance in mind, using Prefect, Databricks, Kafka, and PostgreSQL (a minimal example flow is sketched after this posting)
◦ Organize the data in our data lake to help build a cohesive "gold" data model that powers product analytics and enterprise business intelligence
◦ Design and build our real-time streaming data infrastructure and products
◦ Design and build data pipelines using Delta Lake and Delta Live Tables on Databricks
◦ Write software to optimize machine learning research and production implementations of machine learning models

**What you will be doing**
◦ In the first month, you will get acquainted with Simplebet and how data flows through our systems, and begin contributing to our ETL pipelines and data management systems
◦ In the second month, you will work closely with our product analytics team to design a "gold" data model from our data lake that drives product analytics and insights, from both a machine learning observability and a customer-facing perspective
◦ From the third month onward, you will build out the pipelines and infrastructure needed to deliver high-quality, easy-to-use datasets from the disparate sources in our data lake

**Requirements**
◦ 3+ years of data-oriented software development, with experience building data backends using Python and Apache Spark
◦ Strong understanding of database and data lake design
◦ Experience managing and optimizing datasets for business analytics tools such as ReDash, Looker, or Tableau
◦ Bachelor's or Master's degree in computer science or a related field from an accredited university or college

**Bonus points for**
◦ Experience with Databricks and Delta Lake
◦ Experience with Kafka and data streaming
◦ Experience deploying applications on AWS using Kubernetes
◦ Experience working on Data Science / Machine Learning projects
◦ Experience using Delta Live Tables, DBT, or similar data pipeline tooling
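For anyone less familiar with the stack mentioned in the posting, here is a minimal sketch of what a Prefect-orchestrated ETL flow can look like, assuming Prefect 2.x. Every name in it (the `bet_events_etl` flow, the `raw.bet_events` table, the S3 path) is a hypothetical placeholder rather than anything from Simplebet's actual pipelines, and the extract/load bodies are stubs standing in for real PostgreSQL/Kafka reads and Delta Lake writes.

```python
from prefect import flow, task

# Hypothetical names, purely for illustration.
SOURCE_TABLE = "raw.bet_events"
GOLD_PATH = "s3://example-lake/gold/bet_events"


@task(retries=2, retry_delay_seconds=30)
def extract(table: str) -> list[dict]:
    # A real pipeline would query PostgreSQL or consume from Kafka here;
    # this stub returns a small batch so the flow runs end to end.
    return [{"bet_id": 1, "stake": 5.0}, {"bet_id": 2, "stake": 12.5}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Placeholder cleaning/enrichment step.
    return [{**r, "stake_cents": int(r["stake"] * 100)} for r in rows]


@task
def load(rows: list[dict], path: str) -> None:
    # A real implementation might write a Delta table on Databricks;
    # here we only report the destination and row count.
    print(f"Would write {len(rows)} rows to {path}")


@flow(log_prints=True)
def bet_events_etl():
    rows = extract(SOURCE_TABLE)
    clean = transform(rows)
    load(clean, GOLD_PATH)


if __name__ == "__main__":
    bet_events_etl()
```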