
Data Engineer Jobs In Dubai 2023 | Caliberly
The objectives of the role are:
- Data Acquisition
- The vendor should manage the existing data pipelines built for data ingestion.
- Create and manage new data pipelines for newly ingested data, following best practices.
- Continuously monitor data ingestion via Change Data Capture (CDC) for incremental loads.
- Analyse and fix any failed scheduled batch jobs so that no data is missed.
- Maintain and continuously update the technical documentation for ingested data, and maintain a centralized data dictionary with the necessary data classifications.
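The incremental-load objective above can be sketched as a watermark-based change-data-capture fetch: each run pulls only the rows modified since the last successful load. This is a minimal illustration in plain Python with SQLite; the table and column names (`source_table`, `updated_at`) are hypothetical, and a production pipeline would typically use Spark, Sqoop incremental imports, or a dedicated CDC tool instead.

```python
import sqlite3

# Hypothetical sketch: incrementally pull only rows changed since the last
# successful load, using an updated_at watermark (a simple CDC strategy).
def incremental_load(conn, last_watermark):
    """Fetch rows modified after last_watermark and return them together
    with the new watermark to persist for the next run."""
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM source_table "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

# Demo with an in-memory table standing in for the source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_table (id INTEGER, payload TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO source_table VALUES (?, ?, ?)",
    [(1, "a", "2023-01-01T00:00:00"),
     (2, "b", "2023-01-02T00:00:00"),
     (3, "c", "2023-01-03T00:00:00")],
)

# Only rows newer than the stored watermark are fetched on this run.
rows, wm = incremental_load(conn, "2023-01-01T00:00:00")
```

Persisting `wm` between runs is what makes the load incremental: a failed batch simply reruns from the last committed watermark without losing data.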
Requirements
- Expertise in Big Data querying tools such as Hive, HBase, and Impala.
- Expertise in SQL, including writing complex queries and views, partitioning, and bucketing.
- Strong experience with Spark using Python or Scala.
- Expertise in messaging systems such as Kafka or RabbitMQ.
- Hands-on experience managing a Hadoop cluster with all included services.
- Experience implementing ETL processes using Sqoop or Spark.
- Implementation experience, including loading from disparate data sets and pre-processing with Hive.
- Ability to design solutions independently based on high-level architecture.
- Ability to collaborate with other development teams.
- Expertise in building stream-processing systems using solutions such as Spark Streaming, Apache NiFi, or Kafka.
- Expertise with NoSQL databases such as HBase.
- Experience with Informatica Enterprise Data Catalog (EDC) implementation and administration.
- Strong knowledge of data management, data governance, and metadata management concepts.
- Proficiency in SQL and experience with various databases (e.g., Oracle, SQL Server, PostgreSQL) and data formats (e.g., XML, JSON, CSV).
- Experience with data integration, ETL/ELT processes, and Informatica Data Integration.
- Familiarity with data quality and data profiling tools, such as Informatica Data Quality (IDQ).
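As context for the partitioning/bucketing requirement above: bucketing assigns each row to one of a fixed number of buckets by hashing a key column, so equal keys always land together, which is what enables bucketed map-side joins in Hive and Spark. A minimal sketch in plain Python, with hypothetical data and a stand-in hash function (Hive uses its own deterministic hash):

```python
# Minimal sketch of hash bucketing: bucket = hash(key) % num_buckets,
# so rows with equal keys always share a bucket.
NUM_BUCKETS = 4

def bucket_for(key, num_buckets=NUM_BUCKETS):
    # Stable stand-in hash; Hive/Spark use their own deterministic hashes.
    return sum(ord(c) for c in str(key)) % num_buckets

rows = [("alice", 10), ("bob", 20), ("alice", 30), ("carol", 40)]
buckets = {}
for key, value in rows:
    buckets.setdefault(bucket_for(key), []).append((key, value))

# Both "alice" rows are guaranteed to be in the same bucket.
```

Because the bucket assignment depends only on the key, two tables bucketed the same way on the join key can be joined bucket-by-bucket without a full shuffle.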
To apply for this job, please visit www.careerjet.ae.