Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have Skills : Python (Programming Language), Google BigQuery, Data Modeling Techniques and Methodologies, Data Building Tool DBT, MySQL
Good to Have Skills : Argo Workflow
Job Requirements :
Role: GCP Development - Data Engineer
As a GCP Data Engineer, responsibilities include developing scripts (in Python or shell script) to carry out the workloads described in the DAGs defined in Argo Workflow. Much of the processing involves loading data via dbt, cleaning and manipulating files with Python or shell script, and designing technical architectures for new data flows. The GCP Data Engineer must be confident articulating ideas in team-wide forums and able to frame problems from an optimization and efficiency perspective.
Skills Required:
* Good understanding of the following GCP services: BigQuery, Data Catalog, Cloud Storage
* Proficiency in Python, shell scripting, data modeling, and DBT
Nice to Have:
* Knowledge of Argo Workflow
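As a rough illustration of the file-cleaning work mentioned above, the sketch below shows a minimal Python routine that strips whitespace and drops empty rows from CSV data before loading. The function name, input format, and cleaning rules are all hypothetical assumptions for illustration; the actual pipelines, schemas, and rules are not specified in this posting.

```python
import csv
import io

def clean_rows(rows):
    """Strip whitespace from each field and drop rows that are entirely empty.

    Hypothetical example of a pre-load cleaning step; real rules,
    column names, and file formats would come from the actual pipeline.
    """
    cleaned = []
    for row in rows:
        stripped = [field.strip() for field in row]
        if any(stripped):  # keep rows with at least one non-empty field
            cleaned.append(stripped)
    return cleaned

# Example: messy in-memory CSV with padding and a blank row
raw = io.StringIO("id, name \n1,  Alice\n ,  \n2,Bob \n")
rows = clean_rows(csv.reader(raw))
```

In practice a step like this would run inside an Argo Workflow DAG task ahead of a dbt load, but that wiring is project-specific and not shown here.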
Job Type: Full-time
Pay: Php50,000.00 - Php55,000.00 per month
Schedule:
* Day shift