Senior Data Engineer – 14621
Publication date: 2025-12-03
Are you looking for a career in data/IT? You now have the chance to work as an IT project manager at Veritaz AB. Would you like to work full-time at Veritaz AB? They are currently looking for someone for a position lasting 6 months or longer.
The workplace is conveniently located if you live in the place of employment, which is unspecified. If you are interested in a career as an IT project manager, take a look at the open position Veritaz AB is offering to see whether it is a match. The last day to apply for the IT project manager position at Veritaz AB is Friday, 2 January 2026.
Job advertisement
Veritaz is a leading IT staffing solutions provider in Sweden, committed to advancing individual careers and aiding employers in securing the perfect talent fit. With a proven track record of successful partnerships with top companies, we have rapidly grown our presence in the USA, Europe, and Sweden as a dependable and trusted resource within the IT industry.
Assignment Description
We are looking for a Senior Data Engineer to join our dynamic team.
What you will work on
- Designing and implementing a modern, real-time data pipeline in Azure/Microsoft Fabric, including API ingestion, Eventstream processing, KQL-based storage and analysis, and Kafka event publishing
- Taking end-to-end responsibility for architecture, implementation, operability, security, testing and documentation
- Building resilient API integrations with strong focus on robustness, idempotence, error handling and performance
- Ensuring high data quality in streaming environments through ordering, deduplication and real-time validation
- Collaborating closely with product owners, architects, developers, security and domain experts to deliver scalable, observable and secure solutions
- Establishing common engineering practices and strengthening team capabilities through workshops, pair programming and structured knowledge transfer
- Contributing to CI/CD workflows and automation to ensure reliable and efficient delivery
- Supporting governance, monitoring and cost control across Azure and Fabric components
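To illustrate the kind of resilient API integration the role calls for (retry/backoff, idempotence, robust error handling), here is a minimal sketch in Python. All names and parameters are illustrative only and are not part of the assignment; a real implementation would target the actual source APIs and Kafka/Eventstream components.

```python
import random
import time

def with_retry(fn, max_attempts=5, base_delay=0.1, jitter=0.05):
    """Call fn(), retrying transient failures with exponential backoff plus jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the error to the caller
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, jitter)
            time.sleep(delay)

class IdempotentIngestor:
    """Drops events whose idempotency key has already been ingested,
    so that retried or duplicated deliveries do not create duplicate rows."""

    def __init__(self):
        self._seen = set()
        self.accepted = []

    def ingest(self, event):
        key = event["id"]  # idempotency key assumed to come from the source API
        if key in self._seen:
            return False   # duplicate delivery: safely ignored
        self._seen.add(key)
        self.accepted.append(event)
        return True
```

In practice the retry wrapper would distinguish retryable errors (timeouts, HTTP 429/5xx) from permanent ones, and the seen-key set would live in durable storage rather than process memory.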
What you bring
- Strong hands-on experience with Microsoft Fabric Real-Time Analytics: Eventstream, KQL database, OneLake and workspace governance
- Advanced KQL skills, including queries, ingestion mapping, update policies and materialized views
- Proficiency with Kafka (Confluent or Azure Event Hubs with Kafka protocol), including producer/consumer models, schema registry, security (TLS/SASL), topic design and operational handling
- Solid experience with resilient API integration: retry/backoff, idempotence, parallelization and robust error-handling patterns
- Strong understanding of streaming architecture principles: ordering, deduplication and real-time data quality
- Good knowledge of Azure platform components such as VNet, Managed Identity, Key Vault, Monitor/Log Analytics, RBAC and cost optimization
- Experience with CI/CD using GitHub Actions and/or Azure DevOps
- Ability to document solutions effectively and transfer knowledge through workshops, playbooks and code reviews
- Experience with Lumera or the insurance/pension domain (a plus)
- Experience with Azure Data Explorer (ADX) and Purview (a plus)
- Experience with observability or data-quality frameworks for streaming environments (a plus)
- Experience with performance testing for streaming pipelines (a plus)
- A pedagogical and inclusive mindset, with the ability to coach teams and establish common ways of working
- Strong communication skills with the ability to translate technical solutions into business value
- A proactive, quality-driven, security-minded approach with the ability to balance PoC speed with long-term stability
- Fluent in English and Swedish
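The streaming data-quality principles listed above (ordering, deduplication) can be sketched as a small Python example. This is a simplified illustration with made-up event tuples and a naive watermark heuristic, not the assignment's actual pipeline, where such logic would typically live in Eventstream processing or KQL update policies.

```python
from heapq import heappush, heappop

def order_and_dedupe(events, watermark_lag=2):
    """Reorder a slightly out-of-order stream by timestamp and drop duplicate ids.

    events: iterable of (timestamp, event_id) pairs.
    watermark_lag: how far the newest timestamp may run ahead before
    buffered events are released (a simple watermark heuristic).
    """
    heap, seen, out = [], set(), []
    max_ts = None
    for ts, eid in events:
        if eid in seen:
            continue  # deduplication: same event id delivered twice
        seen.add(eid)
        heappush(heap, (ts, eid))  # buffer until the watermark passes
        max_ts = ts if max_ts is None else max(max_ts, ts)
        # release everything older than the watermark in timestamp order
        while heap and heap[0][0] <= max_ts - watermark_lag:
            out.append(heappop(heap))
    while heap:  # end of stream: drain the remaining buffer
        out.append(heappop(heap))
    return out
```

Real systems bound the dedup set with a TTL and handle late events past the watermark explicitly (drop, dead-letter, or reprocess), but the core trade-off shown here, buffering depth versus latency, is the same.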