AIFT · Taiwan, TW · 4 days ago
Senior Data Engineer, Virtual Insurance Taiwan

[Job Overview]
We are looking for an experienced Senior Data Engineer to join our engineering team and play a key role in building and scaling our enterprise data platform. You will design, develop, and maintain high-quality data warehouses and data-driven applications that power analytics, reconciliation, and business decision-making across the organization. This role requires strong expertise in modern data architectures, pipeline engineering, and data quality management. The ideal candidate combines hands-on technical capability with a deep commitment to reliability, scalability, and governance in a regulated environment.

[Responsibilities]
· Data operations: own day-to-day operations of data platforms and pipelines (capacity, stability, upgrades, deployments, and recovery drills) to sustain high availability and low latency.
· Data collection: design and manage multi-source ingestion (exchanges, internal and external systems), protocol parsing, and robust retry mechanisms.
· Develop rule-based and statistical data quality checks (completeness, uniqueness, time alignment, anomaly detection, error handling).
· Implement automated remediation, reconciliation workflows, and historical backfilling.
· Establish monitoring and alerting frameworks to ensure trusted, production-grade datasets.
· End-to-end pipelines: plan and maintain scalable ETL/ELT, including scheduling, caching, partitioning, modelling, schema evolution, and lineage, to support both batch and real-time streaming.
· Enforce data access controls, encryption, auditing, and classification to comply with internal policies and external regulatory requirements (including PII management).
· Apply Infrastructure as Code, data versioning, data tests, and CI/CD to improve predictability and reduce manual risk.
· Contribute to embedded GenAI- and LLM-powered data applications for enterprise analytics, reconciliation, and internal productivity use cases.
· Partner with analytics and product teams to operationalize AI-driven data solutions.

[Requirements]
· Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
· 5+ years of experience in data engineering, data platform architecture, or AI/ML engineering.
· Strong experience with modern cloud data platforms (e.g., Snowflake, Databricks, BigQuery, Redshift).
· Hands-on experience building BI data foundations and supporting GenAI/LLM architectures.
· Proficiency in SQL, workflow orchestration tools (e.g., Airflow), streaming platforms (e.g., Kafka), and pipeline design best practices.
· Solid understanding of data warehouse development lifecycles and dimensional modeling concepts.
· Familiarity with GitLab and CI/CD pipelines.
· Strong debugging, performance tuning, and problem-solving skills.
· Working knowledge of data governance, lineage, privacy, and security frameworks.
Headquarters: Taiwan
Work Location: On-site
Job Category: Cybersecurity
Application Deadline: Not specified
Job Type: Full-time
Experience Level: Senior-level
Application Method: Apply via JobSpring
Salary: Not specified