Senior Data Engineer (AWS & Python)
The Role:
We are looking for a Senior Data Engineer specializing in modern, cloud-native data platforms, with a strong focus on Amazon Web Services (AWS) and Python. You will be responsible for designing, building, and optimizing highly scalable and reliable ETL/ELT pipelines and data warehouses that power analytics, machine learning, and business intelligence for our clients.
What we’re looking for:
• 5+ years of professional experience in data engineering.
• Expert proficiency in Python for data manipulation, scripting, and pipeline development (e.g., Pandas, PySpark).
• Deep hands-on experience with the AWS cloud platform, specifically the core services used for data ingestion, storage, and processing (S3, Glue, Lambda, EMR).
• Proven experience working with modern data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse).
• Solid expertise in SQL and complex query writing/optimization.
• Strong understanding of containerization and orchestration concepts (Docker, Kubernetes).
• Fluent English communication skills.
• Located in the CET time zone (+/- 3 hours). We are unable to consider applications from candidates in other time zones.
Nice-to-have:
• Experience with Infrastructure as Code (Terraform or CloudFormation).
• Proficiency with a modern orchestration tool like Apache Airflow.
• Familiarity with data streaming technologies (Kafka, Kinesis, or Flink).
• AWS certifications.
Responsibilities:
• Architect, implement, and maintain scalable data pipelines (ETL/ELT) using Python and native AWS services to ingest data from various sources (APIs, databases, streaming services) into data lakes and warehouses.
• Serve as the subject matter expert for the core AWS data services, including S3, Glue, EMR, Kinesis/MSK, and Redshift or Amazon Aurora.
• Design robust and efficient data models (e.g., star schema, snowflake, data vault) for analytical and reporting needs.
• Tune performance and optimize queries on large datasets within cloud data warehouses to ensure fast data delivery.
• Implement infrastructure as code (IaC) using Terraform or CloudFormation, and integrate data pipelines into modern CI/CD processes.
• Establish data quality monitoring, logging, alerting, and governance standards across the data platform.
What we offer:
Get paid, not played
No more unreliable clients. Enjoy on-time monthly payments with flexible withdrawal options.
Predictable project hours
Enjoy a harmonious work-life balance through consistent 8-hour working days with clients.
Flex days, so you can recharge
Enjoy up to 24 flex days off per year without losing pay, for full-time positions found through Proxify.
Career-accelerating positions at cutting-edge companies
Discover exclusive long-term remote positions at the world's most exciting companies.
Hand-picked opportunities just for you
Skip the typical recruitment roadblocks and biases with personally matched positions.
One seamless process, multiple opportunities
A one-time contracting process for endless opportunities, with no extra assessments.
Compensation
Enjoy the same pay, every month with positions landed through Proxify.
Apply to this job