Our client represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates, and Society to Rise™.
They are a USD 6 billion company with 163,000+ professionals across 90 countries, serving 1,279 global customers, including Fortune 500 companies. They focus on leveraging next-generation technologies such as 5G, Blockchain, Metaverse, Quantum Computing, Cybersecurity, and Artificial Intelligence to enable end-to-end digital transformation for global customers.
Our client is one of the fastest-growing brands and among the top seven IT service providers globally. They have consistently emerged as a leader in sustainability and were recognized among the '2021 Global 100 Most Sustainable Corporations in the World' by Corporate Knights.
We are currently searching for a Data Engineer:
Responsibilities:
- Build and maintain data pipelines and ETL/ELT processes on Google Cloud Platform (GCP) to ensure reliable and efficient data flow (a brief pipeline sketch follows this list).
- Collaborate with Senior Data Engineers and cross-functional teams (Data Scientists, Product Managers) to gather requirements and implement solutions.
- Implement data models, schemas, and transformations to support analytics and reporting.
- Monitor, troubleshoot, and optimize pipelines to ensure data quality, integrity, and performance.
- Ensure compliance with data governance, security, and regulatory standards within the GCP environment.
- Document data workflows, tools, and best practices to support scalability and operational excellence.
- Stay up to date on GCP services and trends to continuously improve infrastructure capabilities.
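As a rough illustration of the pipeline work described above, below is a minimal sketch of a streaming Dataflow job in Python that reads events from Pub/Sub and writes them to BigQuery. All project, topic, bucket, and table names, and the event schema, are hypothetical placeholders, not details from this role.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# All resource names below are hypothetical placeholders.
options = PipelineOptions(
    runner="DataflowRunner",              # use "DirectRunner" to test locally
    project="example-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
    streaming=True,
)

with beam.Pipeline(options=options) as p:
    (
        p
        # Read raw messages from a Pub/Sub topic.
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")
        # Decode bytes and parse each message as a JSON event.
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Append parsed rows to a BigQuery table, creating it if absent.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

In practice, a job like this would typically be orchestrated and monitored via Cloud Composer, with data quality checks layered on top, per the responsibilities above.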
Requirements:
- Bachelor’s degree in Computer Science, IT, Data Engineering, or a related field.
- Minimum 3 years of experience in data engineering, including building pipelines on Google Cloud Platform (GCP).
- Proficiency with GCP tools such as BigQuery, Dataflow, Pub/Sub, or Cloud Composer.
- Strong skills in Python or Java, and advanced SQL for data processing (see the example after this list).
- Experience in data modeling, schema design, and data warehousing.
- Understanding of data governance and cloud security practices.
- Familiarity with Git and basic CI/CD practices is a plus.
- Strong problem-solving and communication skills for technical collaboration.
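For context on the "advanced SQL" expectation, here is a small sketch of the kind of BigQuery transformation a data engineer might run through the google-cloud-bigquery Python client. The project, dataset, table, and column names are invented for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Window function computing each customer's running order total:
# a typical "advanced SQL" transformation in BigQuery.
sql = """
SELECT
  customer_id,
  order_date,
  amount,
  SUM(amount) OVER (
    PARTITION BY customer_id
    ORDER BY order_date
  ) AS running_total
FROM `example-project.analytics.orders`
"""

job_config = bigquery.QueryJobConfig(
    destination="example-project.analytics.order_running_totals",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
client.query(sql, job_config=job_config).result()  # block until the job finishes
```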
Languages:
- Advanced Oral English.
- Native Spanish.
Note:
- Hybrid: three days on-site at Scotiabank locations in Mexico City (CDMX).
If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings on the Sequoia Careers page: https://www.sequoia-connect.com/careers/.