Intermediate Data Engineer | GCP | Hybrid | 2-5 yrs
Randstad

Toronto, Ontario
About
The Intermediate Data Engineer in the Wealth Data Engineering team, part of a global engineering organization, will be a key player in designing and implementing critical data solutions to meet the operational data needs of a large Wealth Management business. This role requires hands-on expertise in Big Data technologies, specifically Hadoop (Cloudera) and modern cloud data services like Google Cloud Platform (GCP), working collaboratively with enterprise data teams and solution architects to deliver end-to-end data solutions.
If you're interested, please apply to this job posting with your updated resume, or send your updated resume directly to kyle.chan@randstaddigital.com. Thank you.
Advantages
Compensation & Benefits: Competitive rewards program, including a bonus, flexible vacation, personal days, sick days, and comprehensive benefits that start on day one.
Culture: A commitment to Diversity, Equity, Inclusion & Allyship, with a focus on creating an inclusive and accessible environment for all employees.
Professional Development: Opportunities for upskilling through online courses, cross-functional development, and tuition assistance. ...
Work Environment: A dynamic ecosystem with opportunities for team collaboration.
Responsibilities
Data Pipeline Development: Lead the development efforts for ingesting and transforming data from diverse sources. This includes hands-on coding, scripting, and specification writing to ensure end-to-end delivery of data into the Enterprise Data Lake environment.
Scalable Architecture: Design, build, and operationalize distributed, reliable, and scalable data pipelines to ingest and process data from multiple sources.
Cloud Platform Expertise (GCP): Utilize Google Cloud Platform (GCP) data services such as Dataproc, Dataflow, Cloud SQL, BigQuery, and Cloud Spanner, combined with technologies like Spark, Apache Beam, Cloud Composer, dbt, Confluent Kafka, and Cloud Functions (a minimal pipeline sketch follows this list).
Ingestion Patterns: Design and implement versatile data ingestion patterns that support batch, streaming, and API interfaces for both data ingress and egress.
Technical Leadership: Guide a team of data engineers, developing custom code and frameworks using best practices (Java, Python, Scala, BigQuery, DBT, SQL) to meet demanding performance requirements.
Workflow Management: Build and manage data pipelines with a deep understanding of workflow orchestration, task scheduling, and dependency management.
Technical Guidance: Provide end-to-end technical expertise on effectively using cloud infrastructure to build solutions, creatively applying platform services to solve business problems, and communicating these approaches to various stakeholders.
Operational Excellence: Provide guidance on implementing application logging, notification, job monitoring, and performance monitoring.
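As a rough illustration of the kind of pipeline work described above, the sketch below uses the Apache Beam Python SDK to read events from Pub/Sub and append them to BigQuery on the Dataflow runner. All project, topic, bucket, and table names are hypothetical placeholders, and the team's actual stack may instead use the Java SDK, dbt models, or Cloud Composer orchestration.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON payload received from Pub/Sub (hypothetical schema)."""
    return json.loads(message.decode("utf-8"))


def run():
    # Options for a streaming Dataflow job; all values are placeholders.
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",             # use "DirectRunner" for local testing
        project="my-gcp-project",            # hypothetical project id
        region="northamerica-northeast2",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/wealth-events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:wealth_raw.events",  # hypothetical dataset.table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

The same pattern extends to batch ingestion by swapping ReadFromPubSub for a bounded connector such as ReadFromText and dropping the streaming flag.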
Qualifications
Experience: 2+ years of experience in data engineering, including performance optimization for large OLTP applications.
Big Data: Strong knowledge of Hadoop concepts, including HDFS, Hive, Pig, Flume, and Sqoop. Working experience in HQL.
Cloud Data Services (GCP): Knowledge of primary managed data services within GCP, including Dataproc, Dataflow (Java/Python for streaming and batch jobs), BigQuery/dbt, Cloud Spanner, and Cloud Pub/Sub.
Databases & Streaming: Knowledge of Google Cloud Platform databases (Cloud SQL, Cloud Spanner, PostgreSQL), relational and NoSQL databases, and data streaming technologies such as Kafka and Spark Streaming (see the streaming sketch after this list).
Software Development: Knowledge of Java microservices, including developing and scaling Java REST services with frameworks such as Spring and Spring Boot.
Architecture & Operations: Strong architecture knowledge with experience in providing technical solutions for cloud infrastructure. Knowledge of Infrastructure as Code (IaC) practices and frameworks like Terraform.
Soft Skills: Good communication and problem-solving skills with the ability to effectively convey ideas to business and technical teams.
Nice-To-Have: Understanding of the Wealth business line and the data domains required for building end-to-end solutions.
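For the streaming qualifications above, a minimal Spark Structured Streaming sketch is shown below: it subscribes to a Kafka topic and lands the raw payloads as Parquet with checkpointing. The broker address, topic name, and storage paths are hypothetical, and an equivalent job could run on Dataproc or be written against the Kafka clients directly.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Spark session for a structured streaming job (e.g. submitted to a Dataproc cluster).
# Requires the spark-sql-kafka connector package on the classpath.
spark = SparkSession.builder.appName("kafka-landing-sketch").getOrCreate()

# Subscribe to a Kafka topic; records arrive as key/value byte arrays.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
    .option("subscribe", "wealth-events")                 # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

# Land the raw payloads as Parquet; the checkpoint enables recovery after failure.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "gs://my-bucket/landing/wealth_events")              # hypothetical path
    .option("checkpointLocation", "gs://my-bucket/checkpoints/wealth_events")
    .start()
)
query.awaitTermination()
```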
Summary
The Intermediate Data Engineer in the Wealth Data Engineering team, part of a global engineering organization, will be a key player in designing and implementing critical data solutions to meet the operational data needs of a large Wealth Management business. This role requires hands-on expertise in Big Data technologies, specifically Hadoop (Cloudera) and modern cloud data services like Google Cloud Platform (GCP), working collaboratively with enterprise data teams and solution architects to deliver end-to-end data solutions.
If you're interested, please apply to this job posting with your updated resume, or send your updated resume directly to kyle.chan@randstaddigital.com. Thank you.
Randstad Canada is committed to fostering a workforce reflective of all peoples of Canada. As a result, we are committed to developing and implementing strategies to increase the equity, diversity and inclusion within the workplace by examining our internal policies, practices, and systems throughout the entire lifecycle of our workforce, including its recruitment, retention and advancement for all employees. In addition to our deep commitment to respecting human rights, we are dedicated to positive actions to effect change to ensure everyone has full participation in the workforce, free from any barriers, systemic or otherwise, especially equity-seeking groups who are usually underrepresented in Canada's workforce, including those who identify as women or non-binary/gender non-conforming; Indigenous or Aboriginal Peoples; persons with disabilities (visible or invisible); and members of visible minorities, racialized groups, and the LGBTQ2+ community.
Randstad Canada is committed to creating and maintaining an inclusive and accessible workplace for all its candidates and employees by supporting their accessibility and accommodation needs throughout the employment lifecycle. We ask that all applicants identify any accommodation requirements by sending an email to accessibility@randstad.ca so that we can ensure their full participation in the interview process.