Senior Data Engineer
Remotive
Remote
About
Are you a highly skilled Senior Data Engineer ready to lead and innovate in a dynamic, cloud-driven environment? We're looking for a pivotal team member to design, develop, and maintain cutting-edge data solutions that are scalable, reliable, and high-performing. If you have a strong background in data engineering and a proven track record of leading technical teams, and you thrive in an agile setting, we want to hear from you.
What You'll Be Doing:
- Building and maintaining efficient ETL/ELT pipelines using tools like Apache Airflow and PySpark.
- Developing robust database schemas, dimensional models (Kimball/Inmon), and supporting data normalisation for both relational and NoSQL databases.
- Contributing to the development and maintenance of our data warehouses, data lakes, and data lakehouses.
- Working with diverse database systems, including Azure SQL, PostgreSQL, Google BigQuery, MongoDB, and Google Firestore.
- Handling structured, semi-structured, and big data file formats such as Avro, CSV, Parquet, ORC, and Delta.
- Developing and maintaining APIs for seamless data integration and workflows, with a solid understanding of REST and microservices architectures.
- Overseeing codebase maintenance and optimisation, leveraging Git for version control.
- Implementing thorough integration testing and ensuring high-quality deliverables for all new data processing scenarios.
- Providing technical design and coding assistance to team members, ensuring successful project milestones.
- Assessing and integrating new data sources to meet evolving business needs.
What We're Looking For:
- Strong proficiency in Python and SQL (PostgreSQL or SQL Server preferred).
- Hands-on experience with Apache Airflow and PySpark.
- Familiarity with Databricks is essential.
- Working knowledge of cloud platforms such as Azure, GCP, or AWS.
- Experience with data warehousing concepts, dimensional modelling, and database normalisation.
- Understanding of big data file formats like Avro, Parquet, ORC, and Delta.
- Proficiency in working with APIs, REST, and microservices architectures.
Education & Experience:
- A Bachelor’s degree in Computer Science, Data Science, or a related field.
- 5+ years of progressive experience in data engineering, cloud computing, and technology implementation.
- Experience managing multi-shore projects and working within cloud ecosystems (SaaS/PaaS).
- Proven experience leading technical teams and mentoring team members.
What You'll Get:
- Opportunities to develop your skill set
- Competitive, industry-benchmarked compensation
- Flexible working hours and a remote office setting
- You'll be part of a rapidly growing business
- Work with the absolute masters in the industry and catch some of their energy, vibe, and passion for what we do
- Great coffee every day, and samoosa Fridays (in-office of course)
- Plenty of company-sponsored learning, certifications, and incentives
- Work Hard. Play Hard. Work-Life Balance
- No working on your birthday (free day off)
