Sr Python/ETL Developer (R1084333) in Plymouth Meeting, PA at IQVIA™

Date posted: 8/23/2019



IQVIA™ is the leading human data science company focused on helping healthcare clients find unparalleled insights and better solutions for patients. Formed through the merger of IMS Health and Quintiles, IQVIA offers a broad range of solutions that harness the power of healthcare data, domain expertise, transformative technology, and advanced analytics to drive healthcare forward.

The Technology Solutions division of IQVIA develops and markets healthcare-focused enterprise software applications, including Master Data Management, CRM, multi-channel marketing, content management, social, and compliance solutions. The Digital Office leads the transformation of the Technology Solutions product portfolio and is developing "Ada", a cloud-based machine learning/artificial intelligence platform that brings a layer of intelligent services to Technology Solutions' software offering. Ada's services use machine learning, natural language processing, and deep learning techniques as appropriate, and rely on a big data layer.
Role Purpose:

The Ada Senior Python ETL Developer plays a pivotal role in the delivery of this AI/ML platform. They will design, develop, test, and maintain the data and process orchestration layer of Ada, integrating bulk data flows and API interactions between Ada and a growing number of applications in a large enterprise. They will focus primarily on the Apache Airflow platform, writing Airflow DAGs in Python, developing team practices, and working with remote teams to ensure compliance. They will author process, data mapping, and API components as needed in enterprise-scoped platforms such as MuleSoft Anypoint and Snowflake Data Warehouse. The ideal candidate can see the big-picture strategy while maintaining product roadmaps and customer deliverables to evolve new and exciting technology.
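As a rough illustration of the DAG-authoring work described above, a minimal Airflow pipeline definition might look like the following. This is a sketch only: the DAG name, task names, schedule, and the `extract`/`transform`/`load` placeholders are invented for illustration and are not part of any actual Ada pipeline. It assumes Apache Airflow 2.x is installed.

```python
# Minimal sketch of an Airflow DAG (hypothetical pipeline; all names are illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull records from a source system (e.g., an API or S3 bucket).
    return [{"id": 1, "value": "raw"}]


def transform(**context):
    # Placeholder: clean and reshape the records pulled by the extract task.
    records = context["ti"].xcom_pull(task_ids="extract")
    return [{**r, "value": r["value"].upper()} for r in records]


def load(**context):
    # Placeholder: write the transformed records to a sink (e.g., a warehouse table).
    records = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loaded {len(records)} records")


with DAG(
    dag_id="example_etl",  # hypothetical DAG name
    start_date=datetime(2019, 8, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare the task ordering: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```

Each task's return value is passed downstream via XCom, and the `>>` operator declares the dependency chain the scheduler will follow.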

The Ada team is looking for its Sr. Python ETL Developer.
Do you have a passion for building great products? Do you believe in customer-centricity? Do you have
strong analytical, interpretative and problem-solving skills? Do you want to work in a positive, can-do
environment where collaboration and growth mindset are valued?
Join us!

Our team’s values:

  • We focus on building software that adds value for our customers
  • We believe that the best idea or opinion takes precedence over the title of its author
  • We are respectful of everybody’s “think” time, optimizing meetings and limiting interruptions as much as possible
  • We value attitude over aptitude; no jerks allowed
  • We value ownership, accountability, openness in collaboration and feedback
  • We test our code before handing it off: unit tests, ML tests, (continuous) integration tests, etc.
  • We believe in reusing existing solutions over reinventing the wheel, and automating where possible
  • We seek continuous improvement, individually and as a team

Principal Accountabilities:

  • Design, implement, and productize data pipelines and integration components using Apache Airflow.
  • Design, implement, and productize ETL and integration components in Airflow, Snowflake Data Warehouse, Kafka, and MuleSoft Anypoint, using a variety of data sources including SQL, NoSQL, and object storage platforms.
  • Design and implement low-latency, high-availability, performant applications with security and data protection in mind.
  • Innovate new solutions while integrating them with existing platforms.
  • Work with the engineering and development teams to define plans for standardizing, scaling, and enhancing our products and services.
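The aggregation-and-serialization side of the ETL work listed above can be sketched in plain Python. The record shape and field names below are invented purely for illustration; a real pipeline would read from and write to the platforms named in the posting.

```python
import json
from collections import defaultdict

# Hypothetical raw records, as they might arrive from a source system.
raw_records = [
    {"account": "acme", "channel": "email", "sends": 120},
    {"account": "acme", "channel": "web", "sends": 80},
    {"account": "globex", "channel": "email", "sends": 45},
]


def aggregate_sends(records):
    """Aggregate send counts per account across all channels."""
    totals = defaultdict(int)
    for record in records:
        totals[record["account"]] += record["sends"]
    return dict(totals)


def serialize(totals):
    """Serialize the aggregate to JSON for a downstream load step."""
    return json.dumps(totals, sort_keys=True)


totals = aggregate_sends(raw_records)
print(serialize(totals))  # → {"acme": 200, "globex": 45}
```

In practice the extract and load ends of such a job would talk to the sources and sinks named above (RDBMS, S3, Snowflake), but the transform step is ordinary Python like this.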

Minimum Education, Experience, & Specialized Knowledge Required:

  • Bachelor's degree in Computer Science, Information Technology, Mathematics, Engineering, or a similar field
  • 2+ years of experience building data queries, aggregations, transformations, serializations, and other data-processing constructs across varied data sources and sinks (RDBMS, NoSQL, big data, S3, Salesforce, Snowflake, etc.)
  • 2+ years of experience with Python programming, with a focus on clean code and mature development practices
  • 1+ years of experience developing and deploying robust, efficient, enterprise-scale process and data integration solutions

  • Familiarity with Apache Airflow or similar distributed process orchestration engine.
  • Familiarity with Cloud Computing environments, ideally with AWS and Kubernetes
  • Familiarity with Docker-based development and Cloud Native Computing
  • Familiarity with MuleSoft Anypoint platform
  • Familiarity with DevOps practices, Git, and CI/CD
  • Familiarity with REST APIs and web services
  • Strong experience with test-driven development methodologies
  • Good communication (written and oral) and interpersonal skills
  • Experience in Agile Collaboration platforms (preferably Jira and Confluence)

To be successful in this role, you must also:

  • Work well in a collaborative, team-based environment (both autonomously and as part of a team)
  • Be a self-starter who enjoys collaborating
  • Be flexible and able to work in a fast-paced, dynamic environment
  • Be fast and efficient, and able to juggle multiple projects
  • Be an expert, proactive communicator with strong leadership skills

Join Us

Making a positive impact on human health takes insight, curiosity, and intellectual courage. It takes brave minds, pushing the boundaries to transform healthcare. Regardless of your role, you will have the opportunity to play an important part in helping our clients drive healthcare forward and ultimately improve outcomes for patients.

Forge a career with greater purpose, make an impact, and never stop learning.

IQVIA is an EEO Employer - Minorities/Females/Protected Veterans/Disabled

IQVIA, Inc. provides reasonable accommodations for applicants with disabilities. Applicants who require reasonable accommodation to submit an application for employment or otherwise participate in the application process should contact IQVIA's Talent Acquisition team to arrange for such an accommodation.

Job ID: R1084333