Saturday May 30, 2026 11:15am - 12:30pm CEST
Data journalism projects often rely on manually executed scripts, spreadsheet updates, or code running on private computers. As investigations become more complex, span longer timeframes, or require regular updates, these methods become inefficient and unsustainable. Automated data pipelines offer a solution to these challenges.

This workshop provides an introduction to Apache Airflow, an open-source platform for automating and managing workflows. The session demonstrates how Airflow can be used to automate data journalism processes efficiently, from scraping data to creating and updating visualizations. Participants should have basic programming skills.
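To make the idea of an automated pipeline concrete, here is a minimal sketch in plain Python (not Airflow itself; the function names and dummy data are illustrative) of the scrape → process → visualize flow that the workshop shows how to express as an Airflow DAG:

```python
# A stand-in for an Airflow DAG: three tasks with explicit dependencies,
# run in order. In Airflow, each function would become a task, and the
# dependencies would be declared between tasks instead of by direct calls.

def scrape():
    # In a real pipeline this would fetch data from a website or API;
    # here we return dummy records so the example is self-contained.
    return [{"city": "Berlin", "value": 3}, {"city": "Mainz", "value": 5}]

def process(records):
    # Clean and aggregate the scraped data.
    return {r["city"]: r["value"] for r in records}

def visualize(data):
    # In a real pipeline this might render or update a chart;
    # here we produce a text summary instead.
    return ", ".join(f"{city}: {value}" for city, value in sorted(data.items()))

# Run the pipeline end to end, passing each task's output downstream.
summary = visualize(process(scrape()))
print(summary)  # → "Berlin: 3, Mainz: 5"
```

The benefit Airflow adds on top of such a script is scheduling, retries, and monitoring: the same three steps run automatically at regular intervals instead of being triggered by hand.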

After this session, participants will know why and when to use automated pipelines and will understand the basics of Airflow.
Speakers

Natalie Widmann

Data Journalist, SWR Data Lab
I'm a Data Journalist supporting journalists and human rights activists with data, tools and automation.
I'm happy to talk about scraping data, extracting the most relevant information from it, understanding algorithms and using them for investigations.

Max Harlow

Bloomberg News
Max Harlow is a data reporter at Bloomberg News. He also runs Journocoders, a community group for journalists to develop technical skills for use in their reporting.
Z0.15

