Dive deep into Digital! For 20 years Intellias has been developing top-tier digital solutions for the world’s leading companies, keeping them in line with the latest technology trends. Join us and help build the innovations of the future!
As a Data Engineer, you will play a crucial role in building and optimizing our data pipeline, which is central to our platform’s reliability and performance. You will be responsible for architecting and maintaining efficient data streams and integrating the latest technologies, with a particular focus on complex event processing (CEP), to support our strategic goals and build a robust data engineering ecosystem.
Responsibilities:
Monitoring, Observability & Optimization
• Utilize Apache Flink for real-time stream processing and monitoring to handle complex event processing and data transformations effectively.
• Use tools like Prometheus and Grafana for observability to ensure the health and performance of the data pipeline (a minimal metrics sketch follows this list).
• Continuously monitor, troubleshoot, and optimize data pipeline performance to handle billions of events per month.
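As a rough, non-authoritative illustration of the observability bullet above, the Kotlin sketch below shows one way a Flink operator can register a custom counter with Flink's metric system, which a Prometheus reporter can then scrape for Grafana dashboards. The class name, metric name, and generic event type are hypothetical, not taken from any existing codebase.

```kotlin
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration
import org.apache.flink.metrics.Counter

// Illustrative pass-through operator: counts every event it sees so that
// pipeline throughput can be charted in Grafana once Flink's Prometheus
// metrics reporter is enabled on the cluster.
class CountingMap<T> : RichMapFunction<T, T>() {

    @Transient
    private lateinit var eventsProcessed: Counter

    override fun open(parameters: Configuration) {
        // Register the counter with Flink's metric system.
        eventsProcessed = runtimeContext.metricGroup.counter("eventsProcessed")
    }

    override fun map(value: T): T {
        eventsProcessed.inc()
        return value
    }
}
```

Exposing the counter to Prometheus additionally requires enabling Flink's Prometheus metrics reporter (flink-metrics-prometheus) in the cluster configuration; the exact reporter settings depend on the Flink version in use.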
Collaboration & Integration
• Work closely with cross-functional teams, including front-end developers and data scientists, to build a robust data platform that meets current and future business needs.
• Participate in the design and architecture of scalable data systems, integrate new data sources, and optimize existing data processes.
• Write production-level code in Kotlin and Python to build data processing applications, automation scripts, and CEP logic (see the illustrative sketch after this list).
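To make the CEP responsibility above more concrete, here is a minimal, hedged Apache Flink CEP sketch in Kotlin (it needs the flink-streaming and flink-cep dependencies). The AuthEvent type, its fields, and the "three failed logins within one minute" pattern are purely illustrative assumptions; a production pipeline would consume from Kinesis or Kafka rather than from in-memory elements.

```kotlin
import org.apache.flink.api.java.functions.KeySelector
import org.apache.flink.cep.CEP
import org.apache.flink.cep.functions.PatternProcessFunction
import org.apache.flink.cep.pattern.Pattern
import org.apache.flink.cep.pattern.conditions.SimpleCondition
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.util.Collector

// Hypothetical event type; the fields are illustrative only.
data class AuthEvent(val userId: String, val action: String, val timestampMillis: Long)

fun main() {
    val env = StreamExecutionEnvironment.getExecutionEnvironment()

    // A real pipeline would read from Kinesis or Kafka; a fixed sample keeps the sketch self-contained.
    val events = env.fromElements(
        AuthEvent("u1", "login_failed", 1_000L),
        AuthEvent("u1", "login_failed", 2_000L),
        AuthEvent("u1", "login_failed", 3_000L),
    )

    // CEP pattern: three failed logins from the same user within one minute.
    val pattern = Pattern.begin<AuthEvent>("fail")
        .where(object : SimpleCondition<AuthEvent>() {
            override fun filter(value: AuthEvent) = value.action == "login_failed"
        })
        .times(3)
        .within(Time.minutes(1))

    // Key by user so the pattern is evaluated independently per user.
    val keyedEvents = events.keyBy(object : KeySelector<AuthEvent, String> {
        override fun getKey(value: AuthEvent): String = value.userId
    })

    CEP.pattern(keyedEvents, pattern)
        .process(object : PatternProcessFunction<AuthEvent, String>() {
            override fun processMatch(
                match: Map<String, List<AuthEvent>>,
                ctx: Context,
                out: Collector<String>,
            ) {
                val userId = match["fail"]?.firstOrNull()?.userId
                out.collect("Repeated failed logins detected for user $userId")
            }
        })
        .print()

    env.execute("cep-sketch")
}
```

Keying the stream before applying the pattern keeps match state partitioned per user, which is what lets patterns like this scale to billions of events per month.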
Qualifications:
• Bachelor’s degree in Computer Science or a related field, or equivalent experience.
• 3+ years of experience in data engineering or a similar role.
• Strong proficiency with AWS services such as Kinesis, Glue, Firehose, Lambda, and Redshift.
• Expertise in stream processing frameworks and platforms such as Apache Flink and Apache Kafka.
• Experience in designing and implementing Complex Event Processing (CEP) solutions for real-time data streams.
• Solid experience in programming with Kotlin; familiarity with Python or other programming languages is a plus.
• Demonstrated experience developing, optimizing, and maintaining complex, large-scale streaming data pipelines that handle billions of events per month.
• Strong analytical and problem-solving skills with the ability to manage multiple complex projects and deadlines.
• Good communication skills, the ability to work effectively within cross-functional teams, and experience with Agile project management methodologies.
Preferred Skills:
• Experience with cloud data infrastructure solutions in AWS.
• Expertise in distributed data processing frameworks such as Apache Spark or similar.
• Familiarity with OLAP databases such as Snowflake or Redshift.
• Knowledge of C#/.NET Core for developing and maintaining specific data processing applications.
• Familiarity with data integration tools such as Matillion and experience with large-scale data environments.
• Ability to work collaboratively and adapt to dynamic project needs.
–
At Intellias, we are committed to being an equal opportunity employer, fostering equity, diversity, and inclusion. We welcome and celebrate the differences of all qualified applicants. Join Intellias for a career where your unique perspectives are not only valued but crucial to our success.