Hi everyone! We are hiring a Data Engineer to join our Data & Analytics team at EIGENSONNE
in Berlin, Germany.
Our mission at EIGENSONNE is to enable everyone to produce their own green energy by digitalising renting, selling and usage of PV systems, electricity storage and charging stations. Our goal is to equip every roof in Europe with an EIGENSONNE system. Backing us is a large German energy supplier who guarantees our long-term financial and strategic stability.
• We make sure you have a perfect start to your new job. From the first day on, we will provide you with a detailed onboarding plan, and also a buddy to take care of you.
• We offer a competitive target salary, matched to your experience.
• We maintain an open-minded and transparent philosophy. Regular feedback not only gives you the chance to express yourself, but also to receive input from your colleagues.
• Take the chance to grow with EIGENSONNE. We support your further development with a personal development budget and frequent coaching.
• Our teams are characterised by mutual support and trust. We stand together behind our goals, and regular events help us grow even closer as a team.
• With us you get permanent employment, flexible working hours, and remote working support for your work-life balance. Since your well-being is important to us, you are entitled to 30 days of vacation each year.
• You will enable a team of data analysts and data scientists to work more efficiently by building and managing EIGENSONNE's data infrastructure
• You will enhance our existing data pipeline by implementing new data sources and transformation processes, ensuring they run efficiently and are continuously monitored and improved for performance and data quality
• You will help our data scientists bring their ML models into production and automate model training and model serving for different applications
• You will manage and enhance the cloud data infrastructure and improve CI/CD workflows
• You will coach junior engineers and share your knowledge with them
• You have a proven track record of developing scalable data infrastructure in Python
• You have a strong grasp of database structures, database design, and query languages such as SQL
• You have experience working with stream-based data processing (e.g. Kinesis or Kafka)
• You have a good understanding of cloud infrastructure platforms such as AWS or Google Cloud
• You are familiar with setting up and working with CI/CD workflows
• You have strong working English skills
• Experience with workflow orchestration tools (e.g. Airflow or Prefect)
• Knowledge of different IaC solutions (e.g. Terraform)
• Familiarity with container technologies