
NCS Group

Data Engineer


Job Description

Are you looking for value-adding and impactful work?

Do you want to make a difference with your expertise?

With us, you'll be able to make it happen.

NCS is a leading technology services firm, operating across Asia Pacific in over 20 countries, providing services and solutions in consulting, digital services, technology, and more.

We believe in utilizing the power of technology to make extraordinary things happen and to create lasting impact and value for our people, communities, and partners. Our diverse 12,000-strong workforce has delivered a wealth of large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region.

What we do:

We drive our passion for harnessing technology.

We bring people and technology together.

We advance communities and transform industries.

We're searching for a Data Engineer to be part of our diverse team of talent here at NCS!

What we seek to accomplish together:

Develop ETL jobs

Ensure efficient use of cluster resources by ETL jobs

Design and implement cost-effective, framework-level big data solutions

Provide support for big data issues encountered in the company's proprietary enterprise data systems and frameworks

Enhance ETL jobs by applying best practices and balancing Spark configurations

Participate in reviewing project and program logic requirements

Cascade framework- and system-level macro-changes to the team and to other teams working on or affected by the job

Convert and migrate jobs from one big data platform to another

Integrate jobs from their ingested sources, transforming data into various targets such as databases, extract files, and streaming platform use cases

Maintain and/or handle core source repositories

Document technical deliverable modules

Keep up with technology trends related to big data ETL as much as possible

Work on multiple ETL jobs implemented with Java Spark and Airflow, loading data into different data warehouses for reports such as financial reports, leads reports, and network subscriber reports, and for extracts such as cell-site extracts, telcoscore, gscore, etc.

Troubleshoot, test, and pre-implement ETL data mappings in PySpark

Work on multiple ETL jobs for the Datamart; the data mainly focuses on Globe's broadband, postpaid, and prepaid services, and is used by machine learning engineers to reimagine the customer experience

Convert ETL jobs at a macro level from running on-premises to running fully on cloud-based clusters and big data tools

Develop ETL jobs for Digital Growth and Transformation use cases, helping the business envision the financial summary report; the data is used to track customers' payment transactions, rewards, and subscriptions, and to gain a macro-view of regional leads

Use Databricks for migrating the other jobs, to gain control over managing compute usage.

Apply techniques to Spark reads and writes of sources and targets to reduce runtime

Work on cost-effective ETL job implementations together with data quality engineers and data architects.
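
For illustration only, here is a minimal PySpark sketch of the kind of ETL work described above (data mapping, Spark configuration balancing, and tuned reads and writes to reduce runtime). It is not NCS's or Globe's actual pipeline; all paths, table names, and column names are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("example-daily-etl")                       # hypothetical job name
        .config("spark.sql.shuffle.partitions", "200")      # balance shuffle parallelism to cluster size
        .config("spark.sql.adaptive.enabled", "true")       # let AQE coalesce small shuffle partitions
        .getOrCreate()
    )

    # Read only the columns and rows the job needs; column pruning and
    # predicate pushdown on Parquet reduce I/O and runtime.
    raw = (
        spark.read.parquet("/data/landing/subscribers/")    # hypothetical source path
        .select("subscriber_id", "plan_type", "event_date", "amount")
        .where(F.col("event_date") >= "2024-01-01")
    )

    # Data-mapping layer: normalize plan types and aggregate per day.
    mapped = (
        raw.withColumn(
            "plan_group",
            F.when(F.col("plan_type").isin("POSTPAID", "PREPAID"), F.col("plan_type"))
             .otherwise(F.lit("BROADBAND")))
        .groupBy("plan_group", "event_date")
        .agg(F.sum("amount").alias("total_amount"))
    )

    # Write partitioned output so downstream reports can prune by date;
    # repartition first to avoid many small files per partition.
    (
        mapped.repartition("event_date")
        .write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("/data/warehouse/daily_revenue/")          # hypothetical target path
    )

    spark.stop()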

Mandatory Requirements:

  • Java 11 or higher (for ingestion)
  • SQL
  • Python
  • dbt
  • Databricks
  • Kafka
  • Airflow
  • NiFi
  • Linux Administration
  • Bitbucket
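
For orientation on how these tools typically fit together, here is a minimal Airflow sketch (Airflow 2.4+ assumed) that orchestrates an extract-transform-load sequence. The DAG id, schedule, and callables are hypothetical placeholders, not the actual pipelines behind this role.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        print("pull data from the ingestion layer")      # placeholder step


    def transform():
        print("run the Spark/dbt transformation")        # placeholder step


    def load():
        print("load results into the data warehouse")    # placeholder step


    with DAG(
        dag_id="example_daily_etl",       # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # "schedule" argument requires Airflow 2.4+
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task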

More Info

Industry: Other

Function: Technology

Job Type: Permanent Job

Date Posted: 13/11/2024

Job ID: 100135887

