Posted at: 6 May
Senior Data Engineer - Real-Time Analytics
Company
Aircall is a global AI-powered B2B SaaS customer communications platform headquartered in Paris, specializing in unified voice and digital communication solutions for small to mid-sized businesses and enterprises.
Remote Hiring Policy:
Aircall supports remote work and has a diverse team across various regions, including France, Spain, India, and Portugal, promoting a collaborative and inclusive work culture.
Job Type
Full-time
Allowed Applicant Locations
France, India, Spain, Portugal
Job Description
We are looking for an engaged and passionate Senior Data Engineer to join our growing Engineering Team. In this role, you will be a key member of the team managing our analytics and event platform stack.
As part of the Analytics & Data Platform team, you will build features for over 22,000 customers across the globe. Your work will unlock the value of data, empowering our customers to make informed decisions through real-time insights, monitoring, and comprehensive analytical reports.
Your role at Aircall
- Take key responsibility for requirements analysis, the design and architecture of a scalable, low-latency streaming platform, and the end-to-end delivery of key modules, providing near real-time data solutions for our product
- Analyze and translate business needs into data models optimized for big data
- Identify slow queries and optimize them for better performance
- Write clean, scalable code in Scala, Python, and SQL; test and deploy applications and systems
- Solve our most challenging data problems in near real time, using optimal data architectures, frameworks, and query techniques, sourcing from structured and unstructured data
- Be part of an engineering organization delivering high-quality, secure, and scalable solutions to Aircall clients
- Get involved in product and platform performance optimization and live-site monitoring
- Mentor team members by giving and receiving actionable feedback
Our tech stack:
- AWS (Kinesis, Kafka, S3, DMS, Glue, EMR, EKS, Redshift, Spectrum), Apache Pinot, Apache Flink, and Airflow
- A continuous deployment process based on GitLab
A little more about you:
- A Bachelor's degree in a technical field (e.g. computer science or mathematics)
- 3+ years of experience with near real-time analytics
- 5+ years of experience with a modern programming language such as Scala, Python, Go, or TypeScript
- Experience designing complex data processing pipelines
- Experience with data modeling (star schemas, dimensional modeling, etc.) in AWS Redshift
- Experience with query optimization
- Experience with Kafka is a plus
- Experience shipping and maintaining code in production
- You like sharing your ideas, and you're open-minded