Data Engineer

Astek
  • Post Date: October 6, 2024
Job Overview

We are a part of ASTEK Group, which has been gathering experience in the global consulting and engineering services market since 1988. ASTEK Group is an international engineering and technology consulting player, present on 5 continents. What do we learn from other Group entities in our daily work? First and foremost: inspiration, objectives, good practices, innovative activities, and values. In 2020, 2021, 2022, and 2023 we received the Great Place to Work certificate and found ourselves among the 15 Best Workplaces in Poland in the category of large companies.
How about joining ASTEK Polska’s community of Data Engineers?
Salary: up to 125 PLN net + VAT/h (B2B), depending on your professional experience
Work model: Remote
This position is available as both full-time and part-time (1/2) employment.
Project scope:
The goal of the project is to design end-to-end data solutions with a focus on semantic interoperability and harmonization of clinical information systems for both transactional and analytical use cases. This involves selecting appropriate data standards (e.g., HL7 v2/v3, SMART on FHIR), data models (e.g., OMOP, FHIR), and terminology services (e.g., RTS). The project also includes developing systems for managing terminologies, mappings, and validation/testing.

The aim is to standardize the healthcare data platform (EDC), and we are looking for a professional to support the assessment, selection, and implementation of data standards on the platform.
Responsibilities:
  • Support in assessing data standards for the healthcare data platform (EDC)
  • Selection and implementation of appropriate standards on the EDC platform
  • Design and development of systems for managing terminologies, mappings, and data validation
  • Work on solutions that ensure interoperability between clinical data systems and analytical platforms
Requirements:
  • 2+ years of experience with programming languages focused on data processing, such as Python or R
  • 1+ year of experience working with cloud platforms (AWS/Azure/Google) (optional)
  • 1+ year of experience in data pipeline maintenance
  • 1+ year of experience working with various storage types (filesystem, relational, MPP, NoSQL) and different data types (structured, unstructured, metrics, logs)
  • 1+ year of experience with data architecture concepts (data modelling, metadata management, ETL/ELT, real-time streaming, data quality, distributed systems)
  • 2+ years of experience with SQL
  • Exposure to open-source and proprietary cloud data pipeline tools such as Airflow, Glue, and Dataflow (optional)
  • Strong knowledge of relational databases (optional)
  • Proficiency in Git, Gitflow, and DevOps tools (e.g., Docker, Bamboo, Jenkins, Terraform)
  • Strong knowledge of Unix
  • Good knowledge of Java and/or Scala
  • Experience with pharmaceutical data formats (SDTM) is a plus
  • Proficiency in English (Upper-Intermediate, B2)
Added value for you:
  • Long-term cooperation
  • Possibility to choose your preferred type of cooperation (regular employment contract with full benefits or a flexible B2B contract)
  • Technical training, certificates, and upskilling
  • Competence Center mentoring: you will be a member of the CC community from your first day of work, with the chance to develop your skills, participate in various conferences, and share your knowledge and experience with people who face the same challenges in their daily work
  • Clear career path
  • Employee benefits package
  • Friendly work atmosphere, social events, and team-building meetings
Ref.: RNTSK0033096
