Build a Weekly Report Bot with Google Workspace CLI
Build a Python script that pulls your Gmail, Calendar, and Sheets data with one CLI tool, generates a summary with Gemini, and writes it to a Google Doc. No boilerplate, no client libraries.
Automating anything with Google Workspace APIs means writing 40 lines of OAuth boilerplate before you can list a single file. Create a GCP project, enable the right API, download credentials, install a client library, handle token refresh. Most people give up before they get to the actual logic.
Google just released an open-source CLI called gws that skips all of that. One install, one auth command, and every Workspace API is available from your terminal as structured JSON.
In this tutorial, you use gws and about 80 lines of Python to build a weekly report bot that:
- Pulls your important emails from the past week
- Pulls your calendar events
- Reads a row from a tracking spreadsheet
- Sends everything to Gemini to generate a summary
- Creates a Google Doc with the finished report
No Google client libraries. No OAuth boilerplate. Just gws and subprocess.
Prerequisites
Before you start:
- Python 3.9 or later
- Node.js 18 or later (for installing gws)
- A Google account with Gmail, Calendar, and at least one spreadsheet
- A Google AI Studio account for the Gemini API (free tier works)
Step 1: Install and authenticate gws
Install the CLI globally:
npm install -g @googleworkspace/cli

Run the guided setup:

gws auth setup

This creates a Google Cloud project, enables the required APIs, and walks you through OAuth consent. If you already have a GCP project, you can use gws auth login -s drive,gmail,calendar,sheets,docs instead and select only the services you need.
Verify it works:
gws drive files list --params '{"pageSize": 3}' --fields 'files(id,name)'

You should see a JSON object listing three of your Drive files. If you see an error about APIs not being enabled, follow the link in the error message to enable the missing API, wait a few seconds, and retry.
Step 2: Set up the project
Create a project folder and install one dependency:
mkdir weekly-report && cd weekly-report
python -m venv venv
source venv/bin/activate
pip install google-genai python-dotenv

Create a .env file with your Gemini API key:

echo "GEMINI_API_KEY=your_key_here" > .env
echo ".env" >> .gitignore

Replace your_key_here with a key from Google AI Studio.
Create a file called report.py. All the code goes here.
Step 3: Pull emails from Gmail
The first function fetches your important emails from the past seven days. gws calls the Gmail API and returns structured JSON. Your script calls gws with subprocess, parses the output, and extracts what matters.
import subprocess
import json
from datetime import datetime, timedelta

def get_emails(days=7):
    """Fetch important emails from the past N days."""
    after = (datetime.now() - timedelta(days=days)).strftime("%Y/%m/%d")
    query = f"is:important after:{after}"
    result = subprocess.run(
        [
            "gws", "gmail", "users", "messages", "list",
            "--params", json.dumps({"userId": "me", "q": query, "maxResults": 10}),
            "--fields", "messages(id)",
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)
    messages = data.get("messages", [])

    emails = []
    for msg in messages[:10]:
        detail = subprocess.run(
            [
                "gws", "gmail", "users", "messages", "get",
                "--params", json.dumps({
                    "userId": "me",
                    "id": msg["id"],
                    "format": "metadata",
                    "metadataHeaders": ["Subject", "From", "Date"],
                }),
            ],
            capture_output=True, text=True, check=True,
        )
        msg_data = json.loads(detail.stdout)
        headers = {h["name"]: h["value"] for h in msg_data["payload"]["headers"]}
        emails.append({
            "subject": headers.get("Subject", ""),
            "from": headers.get("From", ""),
            "date": headers.get("Date", ""),
            "snippet": msg_data.get("snippet", ""),
        })
    return emails

The Gmail API returns message IDs first; you then fetch each message's metadata separately. The format: metadata parameter keeps the response small: you get headers and a snippet, not the full email body. The check=True flag makes subprocess raise immediately if gws fails, instead of letting json.loads choke on empty output.
The is:important query filter uses Gmail's own priority classification. You can swap this for any Gmail search query: label:work, from:client@example.com, or subject:weekly.
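The date string that goes into the query is ordinary strftime output, so you can check the format in isolation before wiring it into gws. A minimal sketch (the helper name build_gmail_query is mine, not part of the script; the reference date is passed in so the function stays deterministic):

```python
from datetime import datetime, timedelta

def build_gmail_query(reference, days=7, base="is:important"):
    """Build a Gmail search query like 'is:important after:2024/05/03'."""
    after = (reference - timedelta(days=days)).strftime("%Y/%m/%d")
    return f"{base} after:{after}"

# Swap the base filter for any Gmail search operator:
ref = datetime(2024, 5, 10)
print(build_gmail_query(ref))                     # is:important after:2024/05/03
print(build_gmail_query(ref, base="label:work"))  # label:work after:2024/05/03
```

In the real script the reference date is simply datetime.now().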
Step 4: Pull calendar events
The next function fetches your calendar events for the past week:
def get_events(days=7):
    """Fetch calendar events from the past N days."""
    now = datetime.utcnow()
    start = (now - timedelta(days=days)).isoformat() + "Z"
    end = now.isoformat() + "Z"
    result = subprocess.run(
        [
            "gws", "calendar", "events", "list",
            "--params", json.dumps({
                "calendarId": "primary",
                "timeMin": start,
                "timeMax": end,
                "singleEvents": True,
                "orderBy": "startTime",
            }),
            "--fields", "items(summary,start,end,attendees)",
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)

    events = []
    for item in data.get("items", []):
        start_info = item.get("start", {})
        start_time = start_info.get("dateTime", start_info.get("date", ""))
        events.append({
            "title": item.get("summary", "No title"),
            "start": start_time,
            "attendees": len(item.get("attendees", [])),
        })
    return events

The singleEvents: true parameter expands recurring events into individual instances. Without it, a daily standup would show up once as a recurring event definition instead of seven separate occurrences. Note that timeMin and timeMax must be RFC 3339 timestamps; the trailing "Z" marks them as UTC, which is why the code uses utcnow() rather than local time.
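The reason the code checks two keys is that the Calendar API shapes the start field differently per event type: timed events carry start.dateTime, all-day events carry start.date. A quick sketch with fabricated sample payloads:

```python
def event_start(item):
    """Return a start string for a Calendar event item.

    Timed events carry start.dateTime; all-day events carry start.date.
    """
    start = item.get("start", {})
    return start.get("dateTime", start.get("date", ""))

timed = {"summary": "Standup", "start": {"dateTime": "2024-05-06T09:30:00Z"}}
all_day = {"summary": "Offsite", "start": {"date": "2024-05-08"}}

print(event_start(timed))    # 2024-05-06T09:30:00Z
print(event_start(all_day))  # 2024-05-08
```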
Step 5: Pull spreadsheet data
This function reads a row from a tracking spreadsheet — the kind where you log weekly metrics, project status, or KPIs:
def get_sheet_data(spreadsheet_id, range_name="Sheet1!A1:Z2"):
    """Read data from a Google Sheets spreadsheet."""
    result = subprocess.run(
        [
            "gws", "sheets", "spreadsheets", "values", "get",
            "--params", json.dumps({
                "spreadsheetId": spreadsheet_id,
                "range": range_name,
            }),
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)
    values = data.get("values", [])
    if len(values) < 2:
        return []  # need at least a header row and one data row
    headers = values[0]
    rows = []
    for row in values[1:]:
        rows.append(dict(zip(headers, row)))
    return rows

The function reads row 1 as headers and row 2 as data, then zips them into a dictionary. If your spreadsheet has columns like "Metric," "Target," and "Actual," you get back {"Metric": "Revenue", "Target": "10000", "Actual": "9500"}.
Change the range_name parameter to read more rows or a different tab.
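The zipping step is easy to verify without touching the API at all. A small sketch of the same transformation on fabricated data (rows_to_dicts is my name for the isolated logic):

```python
def rows_to_dicts(values):
    """Zip the header row against each data row, as get_sheet_data does."""
    if len(values) < 2:
        return []
    headers = values[0]
    return [dict(zip(headers, row)) for row in values[1:]]

values = [
    ["Metric", "Target", "Actual"],
    ["Revenue", "10000", "9500"],
    ["Signups", "200", "340"],
]
for row in rows_to_dicts(values):
    print(row)
# {'Metric': 'Revenue', 'Target': '10000', 'Actual': '9500'}
# {'Metric': 'Signups', 'Target': '200', 'Actual': '340'}
```

One caveat: zip stops at the shorter sequence, so a row with empty trailing cells simply omits those keys rather than raising an error.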
Step 6: Generate the report with Gemini
Now you have emails, events, and spreadsheet data. Feed it all to Gemini and ask for a summary:
import os
from dotenv import load_dotenv
from google import genai

def generate_report(emails, events, sheet_data):
    """Generate a weekly report summary using Gemini."""
    load_dotenv()
    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
    prompt = f"""Write a concise weekly report based on this data.
Use sections: Email Highlights, Meetings, and Key Metrics.
Keep it under 300 words. Use bullet points.

EMAILS:
{json.dumps(emails, indent=2)}

CALENDAR EVENTS:
{json.dumps(events, indent=2)}

SPREADSHEET DATA:
{json.dumps(sheet_data, indent=2)}
"""
    response = client.models.generate_content(
        model="gemini-2.5-flash",
        contents=prompt,
    )
    return response.text

The prompt gives Gemini a clear structure: three sections, a word limit, and bullet points. You get a clean summary instead of a rambling essay.
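One thing to watch: a busy week can make the JSON dumps very long, which slows the request and costs tokens. A hedged sketch of a cap you could apply before building the prompt (truncate_items is my own helper, not part of the tutorial's script):

```python
def truncate_items(items, limit=20):
    """Keep at most `limit` items, noting how many were dropped."""
    if len(items) <= limit:
        return items
    dropped = len(items) - limit
    return items[:limit] + [{"note": f"...and {dropped} more omitted"}]

emails = [{"subject": f"Msg {i}"} for i in range(25)]
capped = truncate_items(emails, limit=20)
print(len(capped))  # 21: 20 kept plus the omission note
```

You would call it as generate_report(truncate_items(emails), truncate_items(events), sheet_data).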
Step 7: Create a Google Doc with the result
The final function writes the report to a new Google Doc:
def create_doc(title, body_text):
    """Create a Google Doc with the given title and body text."""
    result = subprocess.run(
        [
            "gws", "docs", "documents", "create",
            "--json", json.dumps({"title": title}),
        ],
        capture_output=True, text=True, check=True,
    )
    doc = json.loads(result.stdout)
    doc_id = doc["documentId"]
    subprocess.run(
        [
            "gws", "docs", "documents", "batchUpdate",
            "--params", json.dumps({"documentId": doc_id}),
            "--json", json.dumps({
                "requests": [{
                    "insertText": {
                        "location": {"index": 1},
                        "text": body_text,
                    }
                }]
            }),
        ],
        capture_output=True, text=True, check=True,
    )
    return f"https://docs.google.com/document/d/{doc_id}/edit"

The Docs API uses a two-step process: create the document (which gives you back a documentId), then insert the text with a batchUpdate request. The index: 1 inserts at the start of the document body.
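The batchUpdate body is plain JSON, so it helps to build it in a small pure function you can inspect. A sketch (insert_text_payload is my name; the request shape matches what the script sends above):

```python
import json

def insert_text_payload(body_text, index=1):
    """Build the Docs batchUpdate body that inserts text at `index`.

    Index 1 is the first insertable position inside the document body;
    text cannot be inserted at index 0.
    """
    return {
        "requests": [
            {"insertText": {"location": {"index": index}, "text": body_text}}
        ]
    }

payload = insert_text_payload("Weekly Report\n\nAll green this week.")
print(json.dumps(payload, indent=2))
```

More requests (styling, tables, images) can be appended to the same "requests" list and applied in one batchUpdate call.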
Step 8: Wire it all together
def main():
    print("Pulling emails...")
    emails = get_emails(days=7)
    print(f"  Found {len(emails)} emails")

    print("Pulling calendar events...")
    events = get_events(days=7)
    print(f"  Found {len(events)} events")

    # Replace with your spreadsheet ID (from the URL)
    spreadsheet_id = "YOUR_SPREADSHEET_ID"
    print("Pulling spreadsheet data...")
    sheet_data = get_sheet_data(spreadsheet_id)
    print(f"  Found {len(sheet_data)} rows")

    print("Generating report...")
    today = datetime.now().strftime("%B %d, %Y")
    report = generate_report(emails, events, sheet_data)

    print("Creating Google Doc...")
    url = create_doc(f"Weekly Report — {today}", report)
    print(f"\nDone! Report created at:\n{url}")

if __name__ == "__main__":
    main()

Replace YOUR_SPREADSHEET_ID with the ID from your spreadsheet's URL. It's the long string between /d/ and /edit, something like 1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgVE2upms.
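If you'd rather paste the whole URL than hunt for the ID, the segment between /d/ and the next slash is easy to extract. A sketch (extract_spreadsheet_id is my own helper):

```python
import re

def extract_spreadsheet_id(url):
    """Pull the spreadsheet ID out of a full Google Sheets URL."""
    match = re.search(r"/d/([a-zA-Z0-9_-]+)", url)
    if not match:
        raise ValueError(f"No spreadsheet ID found in: {url}")
    return match.group(1)

url = "https://docs.google.com/spreadsheets/d/1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgVE2upms/edit#gid=0"
print(extract_spreadsheet_id(url))  # 1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgVE2upms
```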
Run it
python report.py

The first run takes a few seconds while gws fetches data from each service. You should see output like:
Pulling emails...
Found 8 emails
Pulling calendar events...
Found 12 events
Pulling spreadsheet data...
Found 3 rows
Generating report...
Creating Google Doc...
Done! Report created at:
https://docs.google.com/document/d/abc123/edit
Open the link. Your weekly report is in Google Docs, ready to share.
The full code
Here is the complete report.py:
import os
import json
import subprocess
from datetime import datetime, timedelta

from dotenv import load_dotenv
from google import genai

def get_emails(days=7):
    """Fetch important emails from the past N days."""
    after = (datetime.now() - timedelta(days=days)).strftime("%Y/%m/%d")
    query = f"is:important after:{after}"
    result = subprocess.run(
        [
            "gws", "gmail", "users", "messages", "list",
            "--params", json.dumps({"userId": "me", "q": query, "maxResults": 10}),
            "--fields", "messages(id)",
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)
    messages = data.get("messages", [])

    emails = []
    for msg in messages[:10]:
        detail = subprocess.run(
            [
                "gws", "gmail", "users", "messages", "get",
                "--params", json.dumps({
                    "userId": "me",
                    "id": msg["id"],
                    "format": "metadata",
                    "metadataHeaders": ["Subject", "From", "Date"],
                }),
            ],
            capture_output=True, text=True, check=True,
        )
        msg_data = json.loads(detail.stdout)
        headers = {h["name"]: h["value"] for h in msg_data["payload"]["headers"]}
        emails.append({
            "subject": headers.get("Subject", ""),
            "from": headers.get("From", ""),
            "date": headers.get("Date", ""),
            "snippet": msg_data.get("snippet", ""),
        })
    return emails

def get_events(days=7):
    """Fetch calendar events from the past N days."""
    now = datetime.utcnow()
    start = (now - timedelta(days=days)).isoformat() + "Z"
    end = now.isoformat() + "Z"
    result = subprocess.run(
        [
            "gws", "calendar", "events", "list",
            "--params", json.dumps({
                "calendarId": "primary",
                "timeMin": start,
                "timeMax": end,
                "singleEvents": True,
                "orderBy": "startTime",
            }),
            "--fields", "items(summary,start,end,attendees)",
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)

    events = []
    for item in data.get("items", []):
        start_info = item.get("start", {})
        start_time = start_info.get("dateTime", start_info.get("date", ""))
        events.append({
            "title": item.get("summary", "No title"),
            "start": start_time,
            "attendees": len(item.get("attendees", [])),
        })
    return events

def get_sheet_data(spreadsheet_id, range_name="Sheet1!A1:Z2"):
    """Read data from a Google Sheets spreadsheet."""
    result = subprocess.run(
        [
            "gws", "sheets", "spreadsheets", "values", "get",
            "--params", json.dumps({
                "spreadsheetId": spreadsheet_id,
                "range": range_name,
            }),
        ],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(result.stdout)
    values = data.get("values", [])
    if len(values) < 2:
        return []  # need at least a header row and one data row
    headers = values[0]
    rows = []
    for row in values[1:]:
        rows.append(dict(zip(headers, row)))
    return rows

def generate_report(emails, events, sheet_data):
    """Generate a weekly report summary using Gemini."""
    load_dotenv()
    client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
    prompt = f"""Write a concise weekly report based on this data.
Use sections: Email Highlights, Meetings, and Key Metrics.
Keep it under 300 words. Use bullet points.

EMAILS:
{json.dumps(emails, indent=2)}

CALENDAR EVENTS:
{json.dumps(events, indent=2)}

SPREADSHEET DATA:
{json.dumps(sheet_data, indent=2)}
"""
    response = client.models.generate_content(
        model="gemini-2.5-flash",
        contents=prompt,
    )
    return response.text

def create_doc(title, body_text):
    """Create a Google Doc with the given title and body text."""
    result = subprocess.run(
        [
            "gws", "docs", "documents", "create",
            "--json", json.dumps({"title": title}),
        ],
        capture_output=True, text=True, check=True,
    )
    doc = json.loads(result.stdout)
    doc_id = doc["documentId"]
    subprocess.run(
        [
            "gws", "docs", "documents", "batchUpdate",
            "--params", json.dumps({"documentId": doc_id}),
            "--json", json.dumps({
                "requests": [{
                    "insertText": {
                        "location": {"index": 1},
                        "text": body_text,
                    }
                }]
            }),
        ],
        capture_output=True, text=True, check=True,
    )
    return f"https://docs.google.com/document/d/{doc_id}/edit"

def main():
    print("Pulling emails...")
    emails = get_emails(days=7)
    print(f"  Found {len(emails)} emails")

    print("Pulling calendar events...")
    events = get_events(days=7)
    print(f"  Found {len(events)} events")

    # Replace with your spreadsheet ID (from the URL)
    spreadsheet_id = "YOUR_SPREADSHEET_ID"
    print("Pulling spreadsheet data...")
    sheet_data = get_sheet_data(spreadsheet_id)
    print(f"  Found {len(sheet_data)} rows")

    print("Generating report...")
    today = datetime.now().strftime("%B %d, %Y")
    report = generate_report(emails, events, sheet_data)

    print("Creating Google Doc...")
    url = create_doc(f"Weekly Report — {today}", report)
    print(f"\nDone! Report created at:\n{url}")

if __name__ == "__main__":
    main()

Make it yours
A few ideas to extend the bot:
- Schedule it: run report.py as a cron job every Monday at 9am. Your report is waiting in Docs before you open your laptop.
- Add Slack: post the summary to a Slack channel instead of (or in addition to) a Google Doc.
- Multiple spreadsheets: pull from several tracking sheets and consolidate metrics in one report.
- Custom email filters: swap is:important for label:client-updates or from:boss@company.com to focus on what matters.
A note about gws
The gws CLI is new. It crossed 15,000 GitHub stars in its first week, which tells you developers have been waiting for something like this. But it's pre-v1, labeled "not an officially supported Google product," and breaking changes are expected before 1.0.
For personal automation like this report bot, that's fine. The commands follow Google's own Discovery Service, so the underlying API surface is stable even if the CLI flags change. If a future update breaks something, it's a one-line fix in your subprocess call — not a rewrite.
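One way to keep that fix to a single line is to assemble every gws invocation in one place. A sketch under that assumption (build_gws_cmd and run_gws are my names; the flag spellings match the calls used throughout this tutorial):

```python
import json
import subprocess

def build_gws_cmd(*path, params=None, fields=None):
    """Assemble a gws argv list; if a flag is renamed, fix it here once."""
    cmd = ["gws", *path]
    if params is not None:
        cmd += ["--params", json.dumps(params)]
    if fields is not None:
        cmd += ["--fields", fields]
    return cmd

def run_gws(*path, params=None, fields=None):
    """Run a gws command and return its parsed JSON output."""
    result = subprocess.run(
        build_gws_cmd(*path, params=params, fields=fields),
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

print(build_gws_cmd("gmail", "users", "messages", "list",
                    params={"userId": "me"}, fields="messages(id)"))
```

With this wrapper, get_emails collapses to a couple of run_gws calls, and a future CLI rename touches only build_gws_cmd.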
The repo is at github.com/googleworkspace/cli.
Questions or feedback? Find me on Twitter.
- Andrei

Founder of DevGuild. I build tools for developers and write about Python, AI, and web development.
@RealDevGuild