Senior Data Engineer, Virtual Insurance

Job updated about 2 months ago
Employer active about 22 hours ago

Job Description

[Job Overview]
We are looking for an experienced Senior Data Engineer to join our engineering team and play a key role in building and scaling our enterprise data platform. You will design, develop, and maintain high-quality data warehouses and data-driven applications that power analytics, reconciliation, and business decision-making across the organization.
This role requires strong expertise in modern data architectures, pipeline engineering, and data quality management. The ideal candidate combines hands-on technical capability with a deep commitment to reliability, scalability, and governance in a regulated environment.

[Responsibilities]
· Data operations: own day-to-day operation of data platforms and pipelines (capacity, stability, upgrades, deployments, and recovery drills) to sustain high availability and low latency.
· Data collection: design/manage multi-source ingestion (exchanges, internal and external systems), protocol parsing, and robust retry mechanisms.
· Develop rule-based and statistical data quality checks (completeness, uniqueness, time alignment, anomaly detection, error handling).
· Implement automated remediation, reconciliation workflows, and historical backfilling.
· Establish monitoring and alerting frameworks to ensure trusted, production-grade datasets.
· End-to-End pipelines: plan and maintain scalable ETL/ELT including scheduling, caching, partitioning, modelling, schema evolution, and lineage to support both batch and real-time streaming.
· Enforce data access controls, encryption, auditing, and classification to comply with internal policies and external regulatory requirements (including PII management).
· Apply Infrastructure-as-Code, data versioning, data tests, and CI/CD to improve predictability and reduce manual risk.
· Contribute to embedded GenAI and LLM-powered data applications for enterprise analytics, reconciliation, and internal productivity use cases.
· Partner with analytics and product teams to operationalize AI-driven data solutions.
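As an illustration of the rule-based data quality checks mentioned above (completeness, uniqueness), a minimal sketch in Python; all field names, record shapes, and thresholds here are hypothetical and not taken from the posting:

```python
# Minimal sketch of rule-based data quality checks over a batch of records.
# Field names ("policy_id", "ts", "premium") are illustrative only.

def check_completeness(rows, required_fields):
    """Return (row_index, missing_fields) for rows missing required values."""
    failures = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            failures.append((i, missing))
    return failures

def check_uniqueness(rows, key_fields):
    """Return (first_index, duplicate_index, key) for duplicate composite keys."""
    seen = {}
    duplicates = []
    for i, row in enumerate(rows):
        key = tuple(row.get(f) for f in key_fields)
        if key in seen:
            duplicates.append((seen[key], i, key))
        else:
            seen[key] = i
    return duplicates

batch = [
    {"policy_id": "P1", "ts": 1, "premium": 100.0},
    {"policy_id": "P1", "ts": 1, "premium": 100.0},  # duplicate key
    {"policy_id": "P2", "ts": 2, "premium": None},   # missing value
]

print(check_completeness(batch, ["policy_id", "ts", "premium"]))  # [(2, ['premium'])]
print(check_uniqueness(batch, ["policy_id", "ts"]))               # [(0, 1, ('P1', 1))]
```

In production such rules would typically run inside an orchestrated pipeline (e.g. as an Airflow task) and route failures to the remediation and reconciliation workflows described above.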

Job Requirements

[Requirements]
· Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.
· 5+ years of experience in data engineering, data platform architecture, or AI/ML engineering.
· Strong experience with modern cloud data platforms (e.g., Snowflake, Databricks, BigQuery, Redshift).
· Hands-on experience building BI data foundations and supporting GenAI / LLM architectures.
· Proficiency in SQL and workflow orchestration tools (e.g., Airflow), streaming platforms (e.g., Kafka), and pipeline design best practices.
· Solid understanding of data warehouse development lifecycles and dimensional modeling concepts.
· Familiarity with GitLab and CI/CD pipelines.
· Strong debugging, performance tuning, and problem-solving skills.
· Working knowledge of data governance, lineage, privacy, and security frameworks.

1 opening
Requires 5+ years of work experience
Salary negotiable
Optional or flexible remote work
AIFT
Software
51–200 employees

About Us


Established in 2016, the group’s vision is to “Secure the Future”, a future that will increasingly be shaped by artificial intelligence. AIFT provides services across key markets in Asia and the Middle East. We are continuously expanding our global footprint and actively recruiting international talent to join our growing team.