Python & SQL Data Engineer

  • Karachi, Pakistan
  • Full-Time
  • Hybrid

Job Description:

Ambidex Inc., a US-based technology consulting company, is hiring a Python & SQL Data Engineer for a hybrid role to design, build, and optimize data and ETL pipelines.

Details

  • Location: Karachi, Lahore, or Islamabad
  • Working Hours: Evening shift
  • Experience: 5+ years in Python & SQL (data-focused)
  • Minimum Education: Bachelor's in Computer Science or a related field
  • Compensation: PKR 200,000+ (based on experience)
  • Interview Process: 2 Zoom rounds

Summary

Design and build data pipelines that support reporting, analytics, and AI. This role requires strong Python and SQL skills, good communication, an understanding of OLTP and OLAP data sources, and the ability to translate business needs into clear technical solutions.

Responsibilities

  • Work with engineering, business, and US-based teams to clarify data requirements and technical needs. 
  • Analyze existing OLTP databases to understand table structures, relationships, and data flow. 
  • Design data models aligned with business needs, applying OLTP and OLAP schema techniques such as 3NF, Star, and Snowflake. 
  • Build data pipelines to ingest, transform, and integrate data from OLTP and external sources. 
  • Develop ETL/ELT workflows that support reporting, dashboards, and analytical models. 
  • Apply data validation rules, reconcile row counts, and verify transformations to ensure accuracy across sources. 
  • Optimize SQL queries to improve performance while reducing compute and memory usage.
  • Diagnose and resolve data issues such as schema drift, failed jobs, and pipeline errors.
  • Document data flows, transformation logic, and business rules.
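As a concrete illustration of the validation responsibility above (reconciling row counts and flagging load gaps between a source and a target table), here is a minimal sketch in the role's two core languages. It uses an in-memory SQLite database, and all table and column names are invented for the example:

```python
import sqlite3

# Hypothetical stand-ins for an OLTP source table and the staging
# table an ETL job loaded; names are invented for this sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE stg_orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5)])  # one row missing: a load gap

def reconcile(cursor, source, target):
    """Compare row counts and list IDs present in source but not target."""
    src_count = cursor.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = cursor.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    missing = cursor.execute(
        f"SELECT id FROM {source} EXCEPT SELECT id FROM {target}"
    ).fetchall()
    return src_count, tgt_count, [row[0] for row in missing]

src, tgt, missing_ids = reconcile(cur, "src_orders", "stg_orders")
print(src, tgt, missing_ids)  # 3 2 [3]
```

In practice the same pattern runs against the real source and warehouse connections, with any mismatch surfaced to alerting rather than printed.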

Required Skills

  • Strong proficiency in Python and SQL.
  • Understanding of OLTP concepts (normalization, relational design).
  • Experience with relational databases (PostgreSQL, SQL Server, MySQL, or Oracle). 
  • Experience with OLAP modeling (Star/Snowflake, fact/dimension design).
  • Familiarity with ETL/ELT patterns and data transformations (including CDC).
  • Familiarity with at least one cloud platform (AWS, Azure, or GCP).
  • Strong analytical, problem-solving, and communication skills.

Preferred Skills

  • Modern data platforms: Databricks, Spark, Snowflake, or Microsoft Synapse
  • Orchestration tools: ADF, Airflow, or Prefect
  • dbt or similar frameworks
  • Cloud storage systems: S3, Azure Blob, or GCS
  • Modern data concepts: Data Lake, Lakehouse, Medallion Architecture
  • File formats: Parquet, Avro, ORC, JSON
  • Automation: Git, CI/CD

Benefits

  • Excellent package & benefits.
  • Opportunity to work with international clients, innovative startups, and modern Cloud, Data and AI technologies.
  • Fast career growth based on performance and results.
  • Company paid certifications (AWS, Azure, GCP, Databricks, Snowflake).