2.2 Create an Airflow DAG to read and write files using the PythonOperator
First, initialize the Airflow metadata database:

```bash
airflow db init
```
Next, change into the Airflow DAGs folder, where DAG files are automatically discovered:

```bash
cd ~/airflow/dags
```
Create a DAG file there with two tasks: one that reads a CSV file and one that writes a CSV file, each wrapped in a PythonOperator:

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

# Define default args
default_args = {
    'owner': 'airflow',
    'retries': 1,
}

# Define DAG
with DAG('file_processing_dag',
         default_args=default_args,
         start_date=datetime(2024, 1, 1),  # required; adjust as needed
         schedule_interval='@daily',
         catchup=False) as dag:

    def read_file():
        # Read the input CSV and preview the first rows
        data = pd.read_csv('/path/to/input.csv')
        print(data.head())

    def write_file():
        # Build a DataFrame (contents omitted here) and write it out as CSV
        data = pd.DataFrame({
        })
        data.to_csv('/path/to/output.csv', index=False)

    read_task = PythonOperator(
        task_id='read_csv_file',
        python_callable=read_file,
    )

    write_task = PythonOperator(
        task_id='write_csv_file',
        python_callable=write_file,
    )

    # Run the read task before the write task
    read_task >> write_task
```
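Because the two callables are plain Python functions, the read/write round trip can be sanity-checked outside Airflow before deploying the DAG. A minimal sketch using a temporary directory (the column names and file name here are illustrative, not from the original):

```python
import tempfile
from pathlib import Path

import pandas as pd

with tempfile.TemporaryDirectory() as tmp:
    out_path = Path(tmp) / 'output.csv'

    # Mirrors write_file(): build a DataFrame and write it as CSV
    pd.DataFrame({'id': [1, 2], 'name': ['a', 'b']}).to_csv(out_path, index=False)

    # Mirrors read_file(): read the CSV back and preview it
    data = pd.read_csv(out_path)
    print(data.head())
```

If this prints the expected rows, the same logic will work inside the PythonOperator tasks once the hard-coded paths point at real files.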
💡 Tips: