How do I convert daily data to weekly average data with pandas?
Question:
Below is part of my data; I want to compare it to other data, but that data is in weekly form.
How can I summarize each 7 days to convert it into Week 1, Week 2, etc., so that I eventually have 52 rows (instead of 365 rows of days), with each week showing the summary for cases and deaths?
As the title states, I want to use pandas.
Part of the 365-day data; columns from left to right are: year, month, day, cases, deaths, country.

Sidenote: I eventually want to compare this dataset with another one, but that one already has the data summarized per week of the year.
I have googled for solutions but could only find people wanting the min/max of weekly data that was originally daily data.
I thought maybe to use the groupby function, but I don't see how to make it work.
I would also think that the .agg function is needed to summarize.
Answers:
I used the fake data below to explain. Here, we convert the data to weekly format and take the weekly average of the Points column.
# Import libraries
import pandas as pd
# Read data
data = pd.read_excel("fake_ds.xlsx", sheet_name=0)
# Concatenate the year, month and day columns into a single date string
data["Date"] = data["Year"].map(str) + "-" + data["Month"].map(str) + "-" + data["Day"].map(str)
# Convert the string column to datetime
data["Date"] = pd.to_datetime(data["Date"])
# Extract the ISO week number from the date
data["Week_Number"] = data["Date"].dt.isocalendar().week
# Move the last two columns (Date, Week_Number) to the front
cols = data.columns.tolist()
cols = cols[-2:] + cols[:-2]
data = data[cols]
# Group by week number and take the average of the Points column
data_weekly = data.groupby("Week_Number")["Points"].mean()
The output is:
Week_Number
48 418.500000
49 47.000000
50 169.857143
51 355.857143
52 195.666667
Name: Points, dtype: float64
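Since the question asks for weekly totals of cases and deaths rather than an average of one column, the same approach can be adapted. Below is a minimal sketch with made-up data; the column names (year, month, day, cases, deaths, country) are assumed from the question's description, and the values are invented for illustration:

```python
import pandas as pd

# Made-up daily data shaped like the question's dataset
df = pd.DataFrame({
    "year": [2020] * 6,
    "month": [1] * 6,
    "day": [6, 7, 8, 13, 14, 15],
    "cases": [10, 20, 30, 40, 50, 60],
    "deaths": [1, 2, 3, 4, 5, 6],
    "country": ["NL"] * 6,
})

# pd.to_datetime can build dates directly from year/month/day columns
df["date"] = pd.to_datetime(df[["year", "month", "day"]])

# ISO week number, then sum cases and deaths per week
df["week"] = df["date"].dt.isocalendar().week
weekly = df.groupby("week")[["cases", "deaths"]].sum().reset_index()
print(weekly)
```

If you want several summaries at once, .agg works on the same grouping, e.g. `df.groupby("week").agg({"cases": "sum", "deaths": "sum"})`.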