(n.b.): this is a Claude.ai summarization of some work I did to integrate GeoDrops data into Home Assistant. It involves a Python script that uses Google BigQuery to pull the data in, then ingests it into Home Assistant via MQTT. It seems to work decently, but your mileage may vary and I’m very far from a professional programmer/scripter. I know enough to be dangerous conceptually and can use AI to stumble my way through the nitty gritty =)
Overview
This guide shows you how to pull GeoDrops soil moisture sensor data from BigQuery into Home Assistant, creating MQTT sensors that update every 15 minutes. Once integrated, you can use this data to drive irrigation automations (Rachio), mowing schedules (Mammotion), or any other smart home logic.
What you’ll get:
- Near-current soil moisture readings at 3 depths per sensor (updated every 3-6 hours by sensors)
- Soil temperature at surface + 3 depths
- Battery level monitoring
- Sensor health tracking (sync delay)
- 7-day sun exposure data
- Automatic sensor discovery in Home Assistant
- Data freshness validation
Cost: Free (well within Google Cloud’s free tier at ~0.04% usage with 4 sensors polling 4x/hour)
Prerequisites
Required:
- Home Assistant OS (tested on current versions)
- Mosquitto MQTT broker add-on installed and configured
- GeoDrops soil moisture sensors (obviously!)
- Google Cloud account (free tier is sufficient)
- Basic command line familiarity
Estimated setup time: 30-45 minutes
Architecture
```
GeoDrops Sensors → GeoDrops AI → BigQuery (public dataset)
                                      ↓
                       Python script (every 15 min)
                                      ↓
                        MQTT Broker (Mosquitto)
                                      ↓
                Home Assistant (auto-discovered sensors)
```
The script queries GeoDrops’ public BigQuery dataset, validates data freshness, and publishes to MQTT where Home Assistant auto-discovers the sensors.
Data Refresh Cadence
Important: This is NOT real-time data. Battery-powered sensors report on intervals to conserve power:
- Moisture & Temperature data: Updated every ~3 hours by sensors
- Battery & Status data: Updated every ~6 hours by sensors
- GeoDrops AI processing: Additional lag of minutes to hours (variable)
- Our polling: Every 15 minutes from BigQuery
Typical data age: When you see a reading in Home Assistant, the actual sensor measurement is typically 3-6 hours old, plus any GeoDrops processing lag.
What this means:
- Don’t use this for emergency cutoffs (e.g., “stop irrigating NOW if moisture spikes”)
- Perfect for daily/scheduled irrigation decisions (“should I water at 6am?”)
- Great for trend monitoring and threshold-based automations
- Battery monitoring is always delayed but adequate for alerts
The 15-minute polling ensures you get updates within 15 minutes of GeoDrops publishing them to BigQuery, but the sensor data itself is already hours old by that point.
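To make the ages concrete: the “sync delay” the script reports is just the gap between the sensor’s last reading timestamp and the time of the query. A minimal sketch of that calculation (illustrative, not the actual script’s code):

```python
from datetime import datetime, timezone

def sync_delay_hours(last_reading_utc, now=None):
    """Hours between a sensor's last reading (UTC) and now."""
    now = now or datetime.now(timezone.utc)
    return (now - last_reading_utc).total_seconds() / 3600.0

# A reading taken at 15:30 UTC, checked at 20:00 UTC, is 4.5 h old
reading = datetime(2025, 11, 8, 15, 30, tzinfo=timezone.utc)
checked = datetime(2025, 11, 8, 20, 0, tzinfo=timezone.utc)
print(round(sync_delay_hours(reading, checked), 1))  # 4.5
```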
Step-by-Step Setup
Part 1: Google Cloud Service Account
1. Create a Google Cloud Project:
   - Create a new project (name it something like “home-assistant-geodrops”)
   - Note your Project ID (you’ll need this)
2. Create Service Account:
   - Navigate to IAM & Admin → Service Accounts
   - Click Create Service Account
   - Name: `home-assistant-bigquery` (or your preference)
   - Click Create and Continue
3. Grant Permissions:
   - Add role: BigQuery Job User
   - Add role: BigQuery Data Viewer
   - Click Continue, then Done
4. Create & Download Key:
   - Find your new service account in the list
   - Click the three dots → Manage keys
   - Add Key → Create new key
   - Choose JSON format
   - Save the downloaded file as `geodrops-service-account.json`
5. Upload to Home Assistant:
   - Copy the JSON file to `/config/` in your Home Assistant installation
   - You can use the File Editor add-on or SSH/SFTP
Part 2: Get Your Device IDs
You need to map your GeoDrops serial numbers to their internal device IDs.
- Run this query in BigQuery Console (https://console.cloud.google.com/bigquery):

```sql
SELECT DISTINCT deviceId, mfgSn
FROM `geodrops-prod.db_public.p_sensor_unified`
WHERE mfgSn IN ("YOUR_SERIAL_1", "YOUR_SERIAL_2", "YOUR_SERIAL_3")
LIMIT 10
```
Replace the serial numbers with yours (found on your GeoDrops devices or in the GeoDrops app).
- Note the results - you’ll need both the `deviceId` and `mfgSn` for each sensor.
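If you’d rather skip the web console, the same lookup can be run from Python with the `google-cloud-bigquery` package installed in Part 3. The serial numbers and key path below are placeholders you’d swap for your own:

```python
SERIALS = ["YOUR_SERIAL_1", "YOUR_SERIAL_2"]  # replace with your mfgSn values
RUN_QUERY = False  # set True on your Home Assistant box with the key in place

def build_lookup_sql(serials):
    """Assemble the deviceId lookup query from a list of serial numbers."""
    quoted = ", ".join(f'"{s}"' for s in serials)
    return (
        "SELECT DISTINCT deviceId, mfgSn\n"
        "FROM `geodrops-prod.db_public.p_sensor_unified`\n"
        f"WHERE mfgSn IN ({quoted})\n"
        "LIMIT 10"
    )

if RUN_QUERY:
    from google.cloud import bigquery
    client = bigquery.Client.from_service_account_json(
        "/config/geodrops-service-account.json"
    )
    for row in client.query(build_lookup_sql(SERIALS)).result():
        print(row.deviceId, row.mfgSn)
```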
Part 3: Set Up Python Virtual Environment
Using a virtual environment keeps dependencies isolated and prevents them from being wiped during Home Assistant updates.
- SSH into your Home Assistant and enter the container:

```bash
docker exec -it homeassistant /bin/bash
```

- Create a virtual environment:

```bash
cd /config
python3 -m venv geodrops_venv
```

- Activate the virtual environment:

```bash
source /config/geodrops_venv/bin/activate
```

- Install required packages:

```bash
pip install google-cloud-bigquery paho-mqtt pyyaml
```

- Exit the container:

```bash
exit
```
The virtual environment is now set up at /config/geodrops_venv/ and will persist across Home Assistant updates.
Part 4: Configure MQTT Password
1. Find your MQTT password:
   - Go to Settings → Devices & Services → MQTT → Configure
   - Note the password (or see alternative methods in the troubleshooting section)
2. Add to secrets.yaml:
   Edit /config/secrets.yaml and add:

```yaml
mqtt_password: your_actual_mqtt_password_here
```
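For reference, this is roughly how a script can pull that value back out of secrets.yaml. The real script installs PyYAML and would more likely use `yaml.safe_load`; this stdlib-only version is just a sketch of the idea for simple flat entries:

```python
def read_secret(key, path="/config/secrets.yaml"):
    """Naive lookup of a top-level `key: value` entry in secrets.yaml.
    Handles only simple flat entries; yaml.safe_load is the robust option."""
    with open(path) as f:
        for line in f:
            stripped = line.strip()
            if stripped.startswith(f"{key}:"):
                return stripped.split(":", 1)[1].strip().strip("'\"")
    return None
```

Usage would be something like `read_secret("mqtt_password")` near the top of the script.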
Part 5: Install the Script
- Create the directory:

```bash
mkdir -p /config/python_scripts
```

- Create the script:
  Save the attached geodrops_sync.py script to /config/python_scripts/geodrops_sync.py
- Update the shebang line (first line):
  Change the first line of the script to point to your venv:

```python
#!/config/geodrops_venv/bin/python3
```

- Customize the script:
  Edit these sections in the script:
  - Line ~16: `PROJECT_ID` - your Google Cloud project ID
  - Lines ~23-28: `DEVICE_MAPPING` - your deviceId/serial number/friendly name mappings
- Make it executable:

```bash
chmod +x /config/python_scripts/geodrops_sync.py
```

- Test it manually:

```bash
/config/geodrops_venv/bin/python3 /config/python_scripts/geodrops_sync.py
```

You should see output like:

```
Executing query...
Query returned 4 results
Processing front_slope (334JTQ) - last reading: 2025-11-08 15:30:00 (sync delay: 1.2h)
...
Published 10 retained discovery configs and 10 retained states to MQTT
```
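For the curious, the retained discovery configs mentioned in that output follow Home Assistant’s MQTT discovery convention: a JSON config published to `homeassistant/sensor/<object_id>/config`. A simplified sketch of how such a payload might be assembled (topic layout and field names here are illustrative, not copied from the script):

```python
import json

def discovery_payload(friendly_name, device_id, field, unit, device_class=None):
    """Build an MQTT discovery topic + JSON config for one entity."""
    object_id = f"geodrops_{device_id}_{field}"
    config = {
        "name": f"{friendly_name} {field.replace('_', ' ').title()}",
        "state_topic": f"geodrops/{device_id}/{field}",
        "unique_id": object_id,
        "unit_of_measurement": unit,
        "expire_after": 2700,  # 45 min with no update -> entity "unavailable"
    }
    if device_class:
        config["device_class"] = device_class
    return f"homeassistant/sensor/{object_id}/config", json.dumps(config)

topic, payload = discovery_payload("Front Slope", "abc123", "moisture_depth_1", "%")
print(topic)  # homeassistant/sensor/geodrops_abc123_moisture_depth_1/config
```

Publishing both the config and the state with the MQTT retain flag is what lets Home Assistant pick the entities up again after a restart.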
Part 6: Add to Home Assistant Configuration
Edit /config/configuration.yaml:
```yaml
shell_command:
  sync_geodrops: "/config/geodrops_venv/bin/python3 /config/python_scripts/geodrops_sync.py"

automation:
  - alias: "Sync GeoDrops Sensors"
    description: "Query BigQuery for GeoDrops sensor data every 15 minutes"
    trigger:
      - platform: time_pattern
        minutes: "/15"
    action:
      - service: shell_command.sync_geodrops
```
Important: Note that the shell_command uses the full path to the venv’s python3 interpreter.
Restart Home Assistant.
Part 7: Verify Sensors Appear
1. After restart, the automation will run on the next 15-minute mark
2. Go to Settings → Devices & Services → MQTT
3. You should see your GeoDrops devices listed (e.g., “Front Slope Moisture Sensor”)
4. Each device will have 10 entities:
   - Moisture Depth 1, 2, 3
   - Temperature Surface, Depth 1, 2, 3
   - Battery
   - Sync Delay
   - 7-Day Sun Exposure
5. Check Developer Tools → States and search for your sensor names to see current values
Data Freshness & Staleness Detection
The script implements three layers of data validation:
1. Query window: Only pulls data from the last 12 hours (accounts for GeoDrops processing lag)
2. Staleness thresholds:
   - 0-6 hours: Normal operation
   - 6-12 hours: Warning logged but data published
   - >12 hours: Data skipped entirely
3. MQTT expiry: Entities automatically become “unavailable” if no updates are received for 45 minutes
This prevents automations from acting on stale data if sensors go offline.
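The three-tier policy boils down to a simple mapping from data age to action, along these lines (a sketch of the logic, not the script’s exact code):

```python
def staleness_action(age_hours):
    """Map a reading's age to the publish/warn/skip policy described above."""
    if age_hours <= 6:
        return "publish"               # normal operation
    if age_hours <= 12:
        return "publish_with_warning"  # warn in the log, but still publish
    return "skip"                      # too stale: don't publish at all

print(staleness_action(4))   # publish
print(staleness_action(13))  # skip
```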
Cost & Usage
Free tier limits: 1 TB of data scanned per month
Your usage with 4 sensors at 4x/hour:
- ~432 MB/month (~0.04% of free tier)
- Scaling to 40 sensors: still only ~4.3 GB/month
- You won’t pay anything unless you drastically change the setup
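The ~432 MB figure is easy to sanity-check yourself, assuming roughly 150 KB scanned per query (an estimate implied by the numbers above, not a measured value):

```python
QUERIES_PER_MONTH = 4 * 24 * 30      # 4 polls/hour, 24 h/day, ~30 days = 2880
BYTES_PER_QUERY = 150_000            # assumed ~150 KB scanned per query
FREE_TIER_BYTES = 10**12             # BigQuery free tier: 1 TB scanned/month

monthly_bytes = QUERIES_PER_MONTH * BYTES_PER_QUERY
print(f"{monthly_bytes / 10**6:.0f} MB/month, "
      f"{monthly_bytes / FREE_TIER_BYTES:.2%} of the free tier")
# 432 MB/month, 0.04% of the free tier
```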
Troubleshooting
“Not Authorized” MQTT error:
- Verify your MQTT password in `secrets.yaml` is correct
- Try finding the password in `/config/.storage/core.config_entries` (search for “mqtt”)
- Ensure the Mosquitto broker is running
Script succeeds but sensors don’t update:
- Check Mosquitto logs: `docker logs -f addon_core_mosquitto`
- Try changing `MQTT_BROKER = "core-mosquitto"` to `MQTT_BROKER = "127.0.0.1"` in the script
- Verify the shell_command is using the correct venv path
“No results found” error:
- Verify your device IDs are correct (re-run the BigQuery query from Part 2)
- Check that your sensors are actually reporting (log into the GeoDrops app)
- Expand the time window in the query if needed (change `INTERVAL 12 HOUR` to `INTERVAL 24 HOUR`)
Entities show “Unknown”:
- Run the script a second time - the first run creates entities, the second run populates states
- Wait for the next 15-minute automation cycle
“Module not found” errors after HA update:
- Your venv should persist, but verify: `/config/geodrops_venv/bin/python3 -c "import google.cloud.bigquery"`
- If missing, reactivate the venv and reinstall: `source /config/geodrops_venv/bin/activate && pip install google-cloud-bigquery paho-mqtt pyyaml`
Alternative: System packages (not recommended): If you don’t want to use a venv, you can install directly to the HA container, but packages may be wiped on major updates:

```bash
docker exec -it homeassistant pip install google-cloud-bigquery paho-mqtt pyyaml --break-system-packages
```

Then use `python3` instead of the venv path in your configuration.
Future Enhancements
Ideas for extending this integration:
- Pull 30-day average data for trend analysis
- Add precipitation forecasting integration
- Create VPD (Vapor Pressure Deficit) calculations using temp + humidity
- Integrate with weather station data for irrigation optimization
- Build dashboards showing moisture trends over time
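For the VPD idea specifically, the standard calculation needs only air temperature and relative humidity; a sketch using the common Tetens approximation for saturation vapor pressure:

```python
import math

def vpd_kpa(temp_c, rh_percent):
    """Vapor Pressure Deficit (kPa) via the Tetens approximation."""
    svp = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # saturation VP
    return svp * (1.0 - rh_percent / 100.0)

print(round(vpd_kpa(25.0, 60.0), 2))  # ~1.27 kPa at 25 °C / 60% RH
```

This would slot naturally into a Home Assistant template sensor fed by the soil/air temperature entities plus a humidity source.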
Credits & Support
Feel free to modify the script for your needs. If you run into issues or have improvements, share them in this thread!

