Job Summary:
WovV Technologies, a Global Business Productivity SaaS company, is looking for a Senior Data Consultant for its Abu Dhabi location.
WovVTech’s suite of products now empowers users across 3,000 locations in 50 countries to digitise their operations, get real-time decision-driven analytics, and improve productivity. It counts global Fortune 500 companies among its customers for SaaS products and technology services.
Location: Abu Dhabi (on-site)
Duration: 10–12 weeks (potential extension)
Seniority Level: Senior Consultant
Key Skills:
- Azure Data Engineering, Databricks, ADF, Matillion, DBT, AWS Glue, AWS Athena
- AWS CodePipeline, Google BigQuery, Microsoft Fabric, Azure Synapse
- SSIS, SSAS, SSRS, Power BI, BIDS, SSMS, DQS, MDM, MDS, SSDT, Git
- Microsoft SQL Server, MySQL, MongoDB, Azure SQL Database, Cosmos DB, Snowflake
- Python, R, PySpark, Scala, AWS, S3, Redshift, Data Catalogues, Glue, NumPy, Scikit-learn, Matplotlib, TensorFlow, PyTorch, Keras, Pandas
- C#, C, C++, T-SQL, PL/SQL, .NET, PowerShell, Delta Lake, Logic Apps, Event Hub
- Avro, Parquet, GitLab, Jenkins, TeamCity, Octopus, ADO
- Azure Synapse Analytics, Azure Function, Matillion, DBT, Snowflake
- Jupyter Notebooks, Lambda Functions, Apache Spark
- Agile, Scrum, TDD, BDD
Core Responsibilities:
- Leads data audits, governance assessments, and creation of reference data models to ensure data quality and consistency
- Bridges regulatory and operational data requirements, aligning technical solutions with compliance mandates and business objectives
- Ensures data architecture is AI-ready, enabling scalable integration with machine learning and advanced analytics workflows
- Designs and implements cloud-native data architectures with a focus on resilience, performance, and future scalability
- Collaborates with cross-functional teams to translate business data needs into robust engineering solutions
- Promotes data mesh and federated data ownership practices to support decentralized data delivery and governance
- Mentors teams on best practices in data modelling, pipeline development, and cloud-based transformation strategies
Required Qualifications:
- 17+ years of experience designing and implementing large-scale data solutions across Energy & Commodities, Insurance (including Lloyd’s Market), and Financial Services.
- Specializes in modern cloud-based data engineering, with deep expertise in Azure, Databricks, Microsoft Fabric, and data mesh architectures.
- Strong hands-on experience in conceptual and physical data modelling; high-volume, resilient data pipeline development; and data transformation and enrichment using Apache Spark and Python.
- Recent engagements include: rearchitecting and modernizing a large-scale commodities data platform (Azure cloud migration); delivering enterprise-grade machine learning APIs for Zurich Insurance using Databricks and Microsoft Fabric; leading cloud migration at IQUW (Lloyd’s Market) from AWS to Azure; and supporting carbon credit tracking and energy transition metrics for Shell Trading & Supply using Databricks, Azure Data Factory, and Scala.
- Proven ability to deliver secure, scalable, and high-performing data platforms for business-critical insight and operational support.
- Combines technical depth, domain expertise, and hands-on delivery experience across regulated industries.
Benefits and Perks:
- Team Outing
- Exciting career path with an exponentially growing company
- Fun activities
- International opportunities
- Competitive compensation