Hiring: Azure Data Engineer (Azure / MS Fabric / SAP BW/BFC), Remote
Skills: Microsoft Fabric (MS Fabric), SAP BW/BFC data integration, Apache Kafka (real-time data processing), Azure cloud services, Azure Functions, Docker containers, Python, data warehousing, data governance and security, data modeling, ETL/ELT processes, performance tuning, CI/CD, and GitHub
Job Type: Full-time / Remote
Shift: Mid Shift
Experience: 7+ years
Notice Period: Immediate to 15 days
Project context
We are seeking an experienced Data Engineer (7+ years) to lead the design and implementation of scalable data solutions on the Azure Data Platform.
The primary focus will be on integrating financial data from SAP BW and SAP BFC into Azure, establishing a reliable and automated data foundation that supports advanced analytics and AI-driven KPI prediction models.
This role requires deep technical expertise in Azure cloud services, MS Fabric, Azure Functions, Docker containers, and Kafka for real-time data processing.
The engineer will own the end-to-end data engineering lifecycle, from ingestion and transformation to performance optimization and documentation, ensuring the delivery of a robust, maintainable, and high-quality data ecosystem.
This project is central to transforming how we process and utilize financial data across the enterprise, enabling predictive insights and data-driven decision-making at scale.
Goals and deliverables
- Design and implement data pipelines that extract, transform, and load (ETL/ELT) financial data from SAP BW and SAP BFC into Azure using MS Fabric.
- Utilize Azure Functions and Kafka for streaming and event-driven architectures to support near-real-time data ingestion and analytics.
- Employ Docker containers for deployment and scalability of data processing components, ensuring flexibility across environments.
- Develop optimized data models to structure financial data for analytical and AI consumption.
- Implement data cleansing, transformation, and validation processes to ensure data quality and consistency.
- Establish monitoring, alerting, and optimization mechanisms for pipeline performance.
- Ensure data integrity, lineage, and compliance with enterprise data governance and security standards.
- Prepare high-quality, structured datasets to be consumed by AI models for financial KPI prediction.
- Collaborate with data science teams to ensure data readiness and relevance.
- Deliver comprehensive technical documentation, architecture diagrams, and maintenance guides.
- Conduct training and handover sessions to enable ongoing platform sustainability and support.
Ideal Candidate Profile
- Proven experience in Azure Data Engineering and cloud-native data architectures.
- Expertise with MS Fabric, Azure Functions, Kafka, and Docker Containers.
- Solid understanding of data modeling, ETL/ELT processes, and performance tuning.
- Familiarity with SAP BW and SAP BFC data structures for effective source system integration.
- Strong sense of ownership, with the ability to work independently and deliver end-to-end solutions.
Expected Skills
- Microsoft Fabric (MS Fabric)
- Python
- Data Warehousing
- Azure Functions
- Apache Kafka
- Docker Containers
- Data Modeling (Dimensional & Relational)
- SAP BW/BFC Data Integration
- Data Governance and Security
- CI/CD and GitHub
Share your resume at hr@marsdata.in