We are looking for a Big Data Engineer to collect, store, process, and analyze huge data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.
Technologies used:
Linux, GCP, Java, Python, Git, Bitbucket, Dataflow, Apache Beam, REST APIs, Kafka, BigQuery, serverless, Kubernetes, Ansible, Terraform, microservices, Jira, Confluence