DKatalis is hiring a

Data Engineer


About DKatalis

DKatalis is a financial technology company with multiple offices in the APAC region. In our quest to build a better financial world, one of our key goals is to create an ecosystem-linked financial services business.

DKatalis is built and backed by experienced and successful entrepreneurs, bankers, and investors in Singapore and Indonesia, graduates of top-tier schools such as Stanford, Cambridge, London Business School, and JNU, with more than 30 years of experience building financial services and banking businesses at Bank BTPN, Danamon, Citibank, McKinsey & Co, Northstar, Farallon Capital, and HSBC.

 

About the role

We are seeking a data engineer to help us build the key data pipelines that ingest and surface both batch and streaming data on GCP, supporting our team of data analysts, data scientists, and business stakeholders in areas such as growth, customer engagement, fraud, risk, and compliance. The role demands a strong appreciation for data quality and data governance: we place real value on consistently producing high-quality metadata that supports discoverability and consistency of calculation and interpretation. A solid understanding of the retail banking domain is desirable, but not required.
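To illustrate the data-quality emphasis described above, here is a minimal sketch of the kind of validation a pipeline might run before publishing a batch downstream. The field names and rules are hypothetical examples, not requirements from this posting; in practice such checks are often expressed declaratively in tools like Great Expectations or dbt tests rather than hand-rolled:

```python
from datetime import datetime, timezone

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record.

    The fields (customer_id, amount, event_time) are hypothetical.
    """
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    ts = record.get("event_time")
    if not isinstance(ts, datetime) or ts > datetime.now(timezone.utc):
        errors.append("event_time missing or in the future")
    return errors

def partition_batch(batch: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split a batch into clean records and rejects paired with their violations."""
    clean, rejects = [], []
    for record in batch:
        errors = validate_record(record)
        if errors:
            rejects.append((record, errors))
        else:
            clean.append(record)
    return clean, rejects
```

Routing rejects to a quarantine table with their violation reasons, rather than silently dropping them, is one common way teams keep metrics consistent and auditable.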

 

Candidates should have experience across the following systems and languages:

  • Ideally GCP, but strong experience in another platform such as AWS or Azure will suffice
  • Cloud data warehouses such as BigQuery, Redshift or Snowflake
  • Data parallel processing frameworks like Spark or Flink
  • A workflow scheduler such as Apache Airflow
  • Experience programming in Scala and Python
  • Proficient in SQL
  • Comfortable writing detailed design documents

 

Additional desired experience includes:

  • A good understanding of relational databases as well as NoSQL databases such as MongoDB
  • Familiarity with the ELT paradigm and systems such as Fivetran, Stitchdata or Airbyte 
  • Knowledge of pub/sub systems such as Kafka
  • Experience working with data parallel processing frameworks like Spark, Cloud Dataflow, Flink, etc., ideally on both streaming and batch data

 

Please mention that you found this job listing on Fungsi.id; it helps us bring more quality job listings here. Thank you!
Location
Posting date
18 July 2022