Role Description:
Design, develop, and maintain solutions for data generation, collection, and processing.
Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have Skills: Python, Google BigQuery, Data Modeling Techniques and Methodologies, dbt (Data Build Tool), MySQL
- Role: GCP Development - Data Engineer
- As a GCP Data Engineer, your responsibilities include developing scripts (in Python or shell) to carry out the workloads defined in the DAGs in Argo Workflow.
Much of the processing we create involves loading data via dbt, cleaning and manipulating files with Python or shell scripts, and designing technical architectures for new data flows; a sketch of a typical load step follows below. The GCP Data Engineer must be confident articulating their ideas in team-wide forums and able to frame problems from an optimization and efficiency perspective.
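To give a flavor of this kind of workload, here is a minimal sketch of one such step: loading a cleaned CSV file into BigQuery from Python. It assumes the google-cloud-bigquery client library and default application credentials; the project, dataset, table, and file names are hypothetical placeholders, not part of this role's actual pipelines.

    from google.cloud import bigquery

    # Hypothetical destination table: "project.dataset.table".
    TABLE_ID = "my-project.analytics.raw_events"

    client = bigquery.Client()

    # Load a cleaned CSV file, letting BigQuery infer the schema.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer column types from the data
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # Hypothetical input file produced by an upstream cleaning step.
    with open("cleaned_events.csv", "rb") as source_file:
        load_job = client.load_table_from_file(
            source_file, TABLE_ID, job_config=job_config
        )

    load_job.result()  # block until the load job completes
    table = client.get_table(TABLE_ID)
    print(f"Loaded {table.num_rows} rows into {TABLE_ID}")

In practice a script like this would run as one step of an Argo Workflow DAG, with the cleaning and dbt transformation stages as separate upstream and downstream steps.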
- Skills Required / Good-to-Have: Good understanding of the following GCP services:
* BigQuery
* Data Catalog
* Cloud Storage
Additionally, proficiency in:
* Python
* Shell scripting
* Understanding data models
* dbt
A nice-to-have is knowledge of Argo Workflow.
Job Types: Full-time, Fixed term
Contract length: 12 months
Pay: Php56,000.00 - Php64,000.00 per month
Schedule:
* Day shift