
Data Engineer (BigQuery & MongoDB)

BigQuery

MongoDB

Remote

Salary Range: R$ 12,000 to 15,000 BRL/month (PJ or Cooperative)

Experience Level: Senior

Minimum Experience: 3+ years in Data; 3+ years as a Data Engineer



Requirements

English: this position requires a minimum level of Intermediate.

Required Technologies (mandatory): BigQuery, MongoDB

Desirable Technologies (not mandatory): Google Cloud, Airflow


Required Qualifications:
1. Bachelor's degree in Computer Science, Information Systems, or a related field.
2. Proven experience as a Data Engineer or similar role, with a strong background in designing and maintaining data pipelines.
3. Proficiency in SQL and with cloud-based data warehouses (BigQuery experience preferred).
4. Experience with MongoDB for NoSQL database management.
5. Strong experience with Docker for containerization, along with orchestration tools such as Kubernetes (preferred).
6. Familiarity with workflow orchestration tools such as Apache Airflow or Prefect.
7. Experience with Google Cloud Platform (GCP) services (Cloud Functions, Cloud Run, BigQuery, Dataflow).
8. Strong programming skills in Python or other data-related programming languages.
9. Experience working with large datasets and integrating data from multiple sources.
10. Knowledge of data modeling, data governance, and data architecture best practices.
11. Familiarity with CI/CD pipelines for automated deployment of data infrastructure.
12. Fluent Spanish is a plus.

Preferred Qualifications:
Experience with data processing from IoT devices or sensors, particularly spectrometry or other scientific instrumentation.
Familiarity with data visualization tools like Tableau or Google Looker Studio.
Experience managing data in cloud environments like GCP, AWS, or Azure.
Previous experience in a startup environment, particularly in the agriculture or coffee industry, is a plus.


What We Offer:
Opportunity to work with a passionate team in a rapidly growing startup at the intersection of data science and agriculture.
Flexible work environment, with remote work options.
Competitive salary and benefits.
Opportunities for career growth and development.

Activities

Key Responsibilities:
Data Pipeline Development: Design, build, and maintain robust ETL pipelines to collect, process, and store data from various sources, including spectrometry data from our NIR scans, user feedback, and external data inputs.
Data Integration: Collaborate with cross-functional teams to integrate data from multiple sources, including MongoDB, BigQuery, APIs, and Excel files, ensuring unified access for the data science team.
Database Management: Set up and manage scalable data storage solutions using cloud-based data warehouses (BigQuery), as well as NoSQL databases (MongoDB) for real-time and unstructured data processing.
Containerization and Deployment: Use Docker to containerize data pipelines and services for smooth deployment across environments. Manage containers and orchestration for scalable and repeatable workflows.
Data Quality Assurance: Implement processes to clean, validate, and monitor data quality, ensuring consistency and accuracy across all datasets used by the data science and product teams.
Collaboration with Data Scientists: Work alongside data scientists to understand their data needs, enabling effective model development, testing, and deployment by providing clean, well-organized datasets.
Optimization and Scalability: Design data solutions that can scale with the business, ensuring fast, reliable data access for real-time analytics and model training.
Cloud Infrastructure Management: Work with Google Cloud Platform (GCP) to set up and manage cloud infrastructure for data storage and processing, optimizing the use of GCP services like Cloud Functions, Cloud Run, Google Dataflow and BigQuery.
Data Security and Compliance: Ensure data handling follows industry best practices for security, privacy, and compliance with applicable regulations.
Workflow Automation: Implement workflow automation and orchestration tools like Apache Airflow or Prefect to ensure smooth and consistent data flow across the entire pipeline.
