А1 България ЕАД
Data Engineer (f/m/d) @ A1 Competence Delivery Center
Full-time position

Job Description

Strength. Care. Growth

A1 Competence Delivery Center is a vital component of A1’s telecommunications business. Acting as an expertise hub, CDC is dedicated to delivering a full range of high-quality IT, network, financial, and other services to support A1’s operations across all OpCos, regardless of location.

Using the power of being OneGroup and leveraging synergies, CDC provides transparency of resources, expansion of key skills and knowledge, and enhanced personal career growth opportunities, paired with job stability.

We are expanding the Data Team within the IT Services Department of the A1 Competence Delivery Center. Be a part of this exciting journey!

You will know we are the right place for you if you are driven by:

Opportunities to learn and build your career.

Meaningful work in a stable and fast-paced company.

Diversity of people, projects, and platforms.

A supportive, fun, and inspiring place to work.

Job Overview

We are looking for a Data Engineer to design, build, and operate data pipelines across both on-prem and cloud environments. You will focus primarily on building Airflow DAGs and Databricks Jobs in Python, delivering reliable batch and near real-time data solutions that power analytics and data science across the company.
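
To give a concrete flavour of this work, here is a minimal, hypothetical sketch of an Airflow DAG written in Python. The DAG id, schedule, and task callables are illustrative placeholders only, not A1's actual pipelines:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders():
        # Placeholder extract step; a real pipeline would pull from a source system.
        print("extracting orders")

    def load_orders():
        # Placeholder load step; a real pipeline would write to the Data Lake.
        print("loading orders")

    with DAG(
        dag_id="daily_orders_pipeline",   # hypothetical DAG id
        start_date=datetime(2025, 1, 1),
        schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_orders", python_callable=load_orders)

        extract >> load                   # simple linear dependency

In the role you would own the scheduling and monitoring of DAGs like this one, alongside the Databricks Jobs they orchestrate.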

Role Insights:

Design, develop, and test scalable Big Data solutions for A1 (batch and near real-time).
Build and own Airflow DAGs for orchestration, scheduling, and monitoring of data workflows.
Develop, schedule, and optimize Databricks Jobs (primarily in Python) for data processing (see the sketch after this list).
Contribute to the construction, architecture, and governance of the central Data Lake.
Optimize data flows and ensure end-to-end data quality, accuracy, and observability.
Collaborate closely with Data Scientists and business stakeholders to deliver data products.
Drive innovation by testing, comparing, and piloting new tools and technologies.
Document solutions and follow best practices (version control, testing, code reviews).
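
As a hedged illustration of the Databricks Jobs mentioned above, the sketch below shows the kind of PySpark transformation that could run as a job task. The table names (raw.events, analytics.daily_event_counts) and the event_ts column are hypothetical placeholders, not A1's actual Data Lake objects:

    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    # Hypothetical source table of raw events.
    events = spark.read.table("raw.events")

    # Aggregate events per day; assumes an event_ts timestamp column.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Hypothetical target table in the central Data Lake.
    daily_counts.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")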

What Makes You Unique:

You have 3+ years of experience with Linux scripting and SQL.
You bring strong Python skills and proven experience in designing and building data pipelines.
You have hands-on experience with Apache Airflow, including DAG authoring, scheduling, and monitoring.
You are experienced in creating and operating Databricks Jobs, including notebooks, clusters, and job orchestration.
You have a background in Big Data platforms (e.g., Cloudera/Hadoop) and/or data warehousing.
You are knowledgeable in batch and (near) real-time data processing patterns.
You demonstrate great written and spoken English skills.

Nice to have:

You have experience with Spark and SQL.
You have worked across both on-premises environments and at least one major cloud platform (Azure, AWS, or GCP).
You are familiar with streaming technologies (e.g., Kafka), lakehouse concepts, and CI/CD for data.

Our gratitude for the job done will be eternal, but we’ll also offer you:

Innovative technologies and platforms to work with.
Modern working environment for your comfort.
Friendly, ambitious, and motivated teammates to support each other.
Thousands of online and in-person learning opportunities for you to grow.
Challenging assignments and career development opportunities in a multinational environment.
Attractive compensation package.
Flexible working schedule and opportunity for home office.
Numerous additional benefits, including, but not limited to, free A1 services.

If you are interested in this challenging opportunity, please do not hesitate to submit your application by 11.10.2025.

Any questions? Contact Nadya Georgieva
