
We are looking for DevOps Engineers with strong experience in data architecture, containerization, and cloud-native deployments. The ideal candidate will have hands-on expertise in Kafka-based data pipelines, Kubernetes orchestration, CI/CD automation, and data integration within regulated environments, preferably banking or financial services. This role involves building scalable, secure, and high-performance data platforms across multiple environments.

Key Responsibilities:

  • Implement and configure Helm packages across multiple environments.
  • Design and manage data ingestion pipelines using Apache Kafka, ensuring high availability and low-latency data delivery.
  • Translate functional and technical requirements into scalable technical solutions and workflows.
  • Manage data modelling, schema registry configurations, and serialization strategies.
  • Monitor, troubleshoot, and resolve issues in data pipelines, schema alignment, and data transformation processes.
  • Ensure data integrity, compliance, and adherence to regulatory standards.
  • Provide technical training, documentation, and ongoing support to business users and developers.
  • Deploy and orchestrate containerized components using Docker and Kubernetes (including AKS and OpenShift).
  • Support CI/CD pipelines, automated deployments, and infrastructure-as-code practices.
  • Integrate data sources, analytics, and reporting modules to deliver actionable insights.
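Several of the responsibilities above concern schema management and serialization strategies. As a minimal sketch of the idea (the schema definitions and field names here are hypothetical, and a production pipeline would typically use Confluent Schema Registry with Avro or Protobuf rather than hand-rolled JSON), a versioned message envelope might look like:

```python
import json

# Hypothetical schema catalogue: each version lists the fields a record
# must carry. In practice this lives in a schema registry, not in code.
SCHEMAS = {
    1: {"fields": ["account_id", "amount"]},
    2: {"fields": ["account_id", "amount", "currency"]},
}

def serialize(record: dict, version: int) -> bytes:
    """Validate a record against the given schema version and encode it."""
    expected = set(SCHEMAS[version]["fields"])
    missing = expected - record.keys()
    if missing:
        raise ValueError(f"record missing fields for v{version}: {missing}")
    # Embed the schema version so consumers can handle schema evolution.
    envelope = {"schema_version": version, "payload": record}
    return json.dumps(envelope).encode("utf-8")

def deserialize(data: bytes) -> tuple:
    """Decode a message and return (schema_version, payload)."""
    envelope = json.loads(data.decode("utf-8"))
    return envelope["schema_version"], envelope["payload"]

msg = serialize({"account_id": "A-1", "amount": 250.0, "currency": "PKR"}, version=2)
version, payload = deserialize(msg)
```

Carrying the version inside every message lets downstream consumers deserialize old and new records side by side while a schema migration rolls out across environments.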

Requirements:

  • Strong understanding of data architecture, data flows, and distributed systems.
  • Proficiency in Apache Kafka (Schema Registry, topic management, event filtering).
  • Extensive experience with SQL Server or similar relational databases.
  • Hands-on expertise in data modelling, schema versioning, and data integration.
  • Strong command of Docker for building and managing containerized applications.
  • Experience deploying and managing applications on Kubernetes platforms, including AKS or OpenShift.
  • Experience troubleshooting serialization issues, ingestion failures, and performing log analysis.
  • Proficiency in Python, Bash, or similar scripting languages for automation.
  • Comfortable working in Linux-based environments with CLI log and file management.
  • Strong analytical and problem-solving skills, preferably with exposure to financial data contexts.
  • Experience with DevOps tools such as Jenkins, Helm, and Git for CI/CD automation.
  • Knowledge of cloud-native architecture and microservices within the banking ecosystem.
  • Familiarity with data security practices, encryption standards, and GDPR compliance.
  • Understanding of real-time analytics, fraud detection, or regulatory reporting use cases.
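The requirements above call for troubleshooting ingestion failures and performing CLI log analysis with Python or Bash. As a small illustration only (the log lines and format below are invented; real formats vary by component, and logs would normally be pulled with `kubectl logs` rather than hard-coded), a script might tally errors by exception class to spot recurring failures:

```python
import re
from collections import Counter

# Hypothetical sample log lines, standing in for a connector's output.
LOG_LINES = [
    "2024-05-01T10:00:01 INFO  ingested batch offset=1042",
    "2024-05-01T10:00:02 ERROR SerializationException: unknown magic byte",
    "2024-05-01T10:00:03 WARN  retrying delivery to topic payments",
    "2024-05-01T10:00:04 ERROR SerializationException: unknown magic byte",
    "2024-05-01T10:00:05 ERROR TimeoutException: broker not available",
]

# Matches "<timestamp> <LEVEL> <first word of message>".
LEVEL_RE = re.compile(r"^\S+\s+(INFO|WARN|ERROR)\s+(\w+)")

def summarize(lines):
    """Count ERROR lines by exception class to highlight recurring faults."""
    errors = Counter()
    for line in lines:
        m = LEVEL_RE.match(line)
        if m and m.group(1) == "ERROR":
            errors[m.group(2)] += 1
    return errors

summary = summarize(LOG_LINES)
```

A repeated `SerializationException` like the one above typically points back at a schema mismatch between producer and consumer, tying this task to the schema-versioning requirement.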

Salary

Market Competitive

Paid monthly

Location

Karachi Division, Pakistan

Job Overview
Job Posted:
6 days ago
Job Expire:
1 month from now
Job Type
Pvt Job
Job Role
DevOps Engineer
Education
Bachelor's Degree
Experience
2 Years
Total Vacancies
1
Age Requirement
18 to 35 years
