Job Overview
Senior Data Engineer
We are seeking a highly skilled Data Engineer to focus on maintaining data streams and ETL pipelines within a cloud-based environment. The ideal candidate will have experience in building, monitoring, and optimizing data pipelines, ensuring data consistency, and proactively collaborating with upstream and downstream teams to enable seamless data flow across the organization. In this role, you will not only troubleshoot and resolve pipeline issues but also contribute to enhancing data architecture, implementing best practices in data governance and security, and ensuring the scalability and performance of data solutions.
You will play a critical role in understanding the business context of data, supporting analytics and decision-making by collaborating with data scientists, analysts, and other key stakeholders.
This position requires client presence between 25%-50% of the time per month at the client’s office, which is located in London.
Key Responsibilities:
Data Pipeline Development & Maintenance
- Build, maintain, and optimize scalable ETL/ELT pipelines using tools such as Dagster or similar.
- Ensure high data availability, reliability, and consistency through rigorous data validation and monitoring practices.
- Collaborate with cross-functional teams to align data pipeline requirements with business objectives and technical feasibility.
- Automate data workflows to improve operational efficiency and reduce manual intervention.
Data Integrity & Monitoring
- Perform regular data consistency checks, identifying and resolving anomalies or discrepancies.
- Implement robust monitoring frameworks to proactively detect and address pipeline failures or performance issues.
- Work closely with upstream teams to align data ingestion strategies and optimize data handoffs.
Collaboration & Stakeholder Management
- Partner with data scientists, analysts, and business teams to provide trusted, accurate, and well-structured data for analytics and reporting.
- Communicate complex data concepts in a clear and actionable manner to non-technical stakeholders.
- Develop and maintain documentation to ensure knowledge sharing and continuity.
Infrastructure & Security Management
- Maintain and support cloud-based data platforms such as AWS, ensuring cost-efficient and scalable solutions.
- Implement best practices in data governance, compliance, and security, adhering to industry standards.
- Continuously improve data processing frameworks for enhanced performance and resilience.
Continuous Improvement & Business Context Mastery
- Gain a deep understanding of the business meaning behind data to drive insights and strategic decisions.
- Identify opportunities to enhance data models and workflows, ensuring they align with evolving business needs.
- Stay up to date with emerging data technologies and advocate for their adoption when relevant.
Qualifications:
Education & Experience:
- Bachelor’s degree in Computer Science, Data Science, or a related field.
- Minimum of 4 years of experience in data engineering, data integration, or a related role.
Technical Skills:
- Proficiency with SQL (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB), with hands-on experience in query optimization and data modelling.
- Strong programming skills in Python (preferred), with a focus on building scalable data solutions.
- Experience with data pipeline orchestration tools such as Dagster or similar.
- Familiarity with cloud platforms (e.g., AWS) and their data services (e.g., S3, Redshift), as well as warehousing platforms such as Snowflake.
- Understanding of data warehousing concepts and experience with modern warehousing solutions.
- Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and version-controlled deployments.
Soft Skills:
- Strong problem-solving skills with keen attention to detail and a proactive mindset.
- Ability to work in a collaborative, fast-paced environment, handling multiple stakeholders effectively.
- Excellent communication skills with the ability to translate technical findings into business insights.
Nice-to-Have Qualifications:
- Experience with streaming technologies such as Kafka or similar.
- Familiarity with containerization and orchestration (e.g., Docker and ECS) for data workflows.
- Exposure to BI tools such as Tableau or Power BI for data visualization.
- Understanding of machine learning pipelines and how they integrate with data engineering processes.
- Certification in cloud data engineering (e.g., AWS Certified Data Analytics).
What We’ll Offer You In Return:
- The chance to join an organisation with triple-digit growth that is changing the paradigm of how digital solutions are built.
- The opportunity to be part of an amazing, multicultural community of tech experts.
- A highly competitive compensation package.
- A flexible and remote working environment.
- Medical insurance.
Come and join our #ParserCommunity.