Google Cloud Platform Integration

Connect LINK485 devices to Google Cloud by bridging MQTT into Cloud Pub/Sub for scalable IoT data analytics

⏱️ Setup Time: 30-35 minutes | 📚 Prerequisites: GCP Account, LINK485 gateway configured | 🔒 Security: JWT authentication, TLS 1.2+

📌 Important Note: Google Cloud IoT Core was deprecated in August 2023. This guide shows the modern approach using Google Cloud Pub/Sub fed through an MQTT bridge, which provides better scalability and flexibility for IoT connectivity.

Overview

Google Cloud Platform provides powerful tools for IoT data ingestion and analytics. This guide demonstrates how to connect LINK485 gateways to GCP via MQTT into Cloud Pub/Sub, enabling integration with Cloud Functions, BigQuery, Dataflow, and AI/ML services for industrial IoT applications.

Why Google Cloud for IoT?

📊 BigQuery Integration

Direct streaming to BigQuery for SQL analytics on IoT time-series data

🤖 AI/ML Ready

Native integration with Vertex AI for predictive maintenance and anomaly detection

⚡ Real-time Processing

Cloud Functions and Dataflow for stream processing and transformations

📈 Looker Studio

Create interactive dashboards and reports with Google Looker Studio

Architecture

Modbus Devices → LINK485 Gateway → MQTT Bridge → Cloud Pub/Sub → GCP Services (BigQuery, Functions, AI)

LINK485 publishes data via MQTT to Cloud Pub/Sub topics. Pub/Sub then distributes messages to subscribers including Cloud Functions for processing, BigQuery for analytics, and Cloud Storage for archival.
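The bridge's core job can be sketched as a simple transform: take the JSON payload the gateway publishes over MQTT and wrap it into a Pub/Sub publish-request body (base64-encoded `data` plus optional attributes). A minimal illustration; the `device_id` attribute here is an illustrative choice, not a fixed LINK485 schema:

```python
import base64
import json

def mqtt_payload_to_pubsub_message(payload: bytes) -> dict:
    """Wrap an MQTT payload into a Pub/Sub publish request body.

    Pub/Sub's REST API expects message data as base64; carrying the
    device_id as an attribute lets subscribers filter without decoding
    the body.
    """
    doc = json.loads(payload)  # validate it is JSON before forwarding
    return {
        "data": base64.b64encode(payload).decode("ascii"),
        "attributes": {"device_id": doc.get("device_id", "unknown")},
    }

sample = json.dumps({"device_id": "link485-gateway-001",
                     "slaves": []}).encode()
msg = mqtt_payload_to_pubsub_message(sample)
print(msg["attributes"]["device_id"])  # link485-gateway-001
```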

Prerequisites

  • GCP Account: Active Google Cloud Platform account with billing enabled
  • GCP Project: Create a new project or use existing one
  • LINK485 Gateway: Firmware v2.0+ with MQTT support
  • gcloud CLI: Optional but recommended for automation
  • Network Access: Outbound HTTPS (443) and MQTTS (8883) allowed

Step 1: Enable Required APIs

Enable the necessary Google Cloud APIs:

Option A: Using GCP Console

  1. Go to GCP Console
  2. Select your project from the dropdown
  3. Navigate to APIs & Services → Library
  4. Search and enable:
    • Cloud Pub/Sub API
    • Cloud Functions API
    • BigQuery API
    • Cloud Storage API

Option B: Using gcloud CLI

# Set your project ID
gcloud config set project YOUR_PROJECT_ID

# Enable APIs
gcloud services enable pubsub.googleapis.com
gcloud services enable cloudfunctions.googleapis.com
gcloud services enable bigquery.googleapis.com
gcloud services enable storage.googleapis.com

Step 2: Create Pub/Sub Topic and Subscription

Using GCP Console

  1. Navigate to Pub/Sub → Topics
  2. Click Create Topic
  3. Enter Topic ID: link485-telemetry
  4. Click Create
  5. Click on your topic and go to Subscriptions tab
  6. Click Create Subscription
    • Subscription ID: link485-telemetry-sub
    • Delivery Type: Pull or Push (Pull for this guide)
    • Acknowledgement deadline: 10 seconds

Using gcloud CLI

# Create topic
gcloud pubsub topics create link485-telemetry

# Create subscription
gcloud pubsub subscriptions create link485-telemetry-sub \
  --topic=link485-telemetry \
  --ack-deadline=10

Step 3: Set Up Service Account & Authentication

Create a service account for LINK485 devices to authenticate:

  1. Create Service Account:
    • Go to IAM & Admin → Service Accounts
    • Click Create Service Account
    • Name: link485-gateway
    • Description: "LINK485 IoT Gateway Service Account"
    • Click Create and Continue
  2. Grant Permissions:
    • Role: Pub/Sub Publisher
    • Click Continue then Done
  3. Create Key:
    • Click on the service account you just created
    • Go to Keys tab
    • Click Add Key → Create new key
    • Select JSON format
    • Click Create - the key file will download

⚠️ Security Warning: Keep the JSON key file secure. Never commit it to version control. This file grants access to your GCP resources.
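Before loading the key onto a gateway, it is worth sanity-checking that the downloaded file really is a service-account key. A small sketch; the field list reflects the standard GCP key format, and the example dict below is fabricated for illustration:

```python
import json

REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_service_account_key(raw: str) -> str:
    """Return the service account email if the key JSON looks valid."""
    key = json.loads(raw)
    missing = REQUIRED_FIELDS - key.keys()
    if missing or key.get("type") != "service_account":
        raise ValueError(f"not a service-account key; missing: {missing}")
    return key["client_email"]

# Fabricated example key (real keys also carry private_key_id, token URIs, etc.)
example = json.dumps({
    "type": "service_account",
    "project_id": "my-iot-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "link485-gateway@my-iot-project.iam.gserviceaccount.com",
})
print(validate_service_account_key(example))
```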

Step 4: Configure MQTT Bridge (Third-Party)

Since GCP doesn't provide a native MQTT endpoint, use an MQTT bridge service or deploy your own:

Option A: Use EMQX with GCP Pub/Sub Plugin

Deploy EMQX with the Google Cloud Pub/Sub integration:

# Deploy EMQX with Docker
docker run -d --name emqx \
  -p 1883:1883 -p 8883:8883 -p 8083:8083 -p 18083:18083 \
  emqx/emqx:5.3

# Configure GCP Pub/Sub bridge in EMQX Dashboard
# Dashboard: http://localhost:18083 (admin/public)

In EMQX Dashboard:

  1. Go to Data Integration → Bridges
  2. Click Create → Google Cloud Pub/Sub
  3. Configure:
    • GCP Service Account JSON: Upload your key file
    • Topic: link485-telemetry
    • MQTT Topic Filter: link485/data/#
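The `link485/data/#` filter uses standard MQTT wildcards: `#` matches any number of remaining topic levels, and `+` matches exactly one. A minimal matcher showing which gateway topics the bridge will pick up (a sketch of the MQTT matching rules, not EMQX's internal implementation):

```python
def mqtt_filter_matches(topic_filter: str, topic: str) -> bool:
    """Match an MQTT topic against a filter with + and # wildcards."""
    f_parts = topic_filter.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":                      # multi-level wildcard: matches the rest
            return True
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:  # + matches exactly one level
            return False
    return len(f_parts) == len(t_parts)

print(mqtt_filter_matches("link485/data/#", "link485/data/gateway-001"))  # True
print(mqtt_filter_matches("link485/data/#", "link485/status"))            # False
```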

Option B: Use Cloud IoT MQTT Bridge (Recommended)

Deploy a lightweight MQTT-to-Pub/Sub bridge on Cloud Run:

💡 Pro Tip: We can provide a pre-built MQTT bridge container for easy deployment to Cloud Run. Contact support for details.

Step 5: Configure LINK485 Device via Mobile App

For Link485 Air (WiFi)

  1. Power on your Link485 Air device
  2. Download Link485 App: Get the Link485 app from your device's app store
  3. Open App and Add Device: Tap "Add New Device"
  4. Enter WiFi Credentials:
    • SSID: Your WiFi network name
    • Password: Your WiFi password
  5. Choose Integration Type: Select "Google Cloud" from dropdown
  6. Enter Google Cloud Details:
    • MQTT Bridge URL: Your MQTT bridge endpoint
    • Project ID: Your GCP project ID
    • Service Account Key: Upload or paste JSON key file content
  7. Tap "Connect" - Device will configure and connect automatically

For Link485 4G (Cellular)

  1. Power on your Link485 4G device with SIM card inserted
  2. Download Link485 App (same as above)
  3. Optional: Enter WiFi credentials if you want WiFi as backup
  4. Choose Integration Type: Select "Google Cloud"
  5. Enter Google Cloud Details (same as above)
  6. Tap "Connect" - Device uses 4G for primary connection

💡 Pro Tip: The Link485 app handles all the complexity of Google Cloud authentication and topic configuration automatically!

Step 6: Verify Data Flow

Test that data is flowing to Pub/Sub:

Using GCP Console

  1. Go to Pub/Sub → Subscriptions
  2. Click on link485-telemetry-sub
  3. Click Messages tab
  4. Click Pull to see incoming messages

Using gcloud CLI

# Pull messages from subscription
gcloud pubsub subscriptions pull link485-telemetry-sub \
  --limit=5 \
  --auto-ack

Expected Output:

┌──────────────────────────────────────────────────────┐
│ DATA │
├──────────────────────────────────────────────────────┤
│ {"device_id":"link485-gateway-001","timestamp":... │
└──────────────────────────────────────────────────────┘
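The gcloud CLI decodes message data for display, but the Pub/Sub REST API and client libraries hand you the data base64-encoded; subscribers decode it back into the telemetry dict. A quick sketch with a fabricated message body:

```python
import base64
import json

def decode_pubsub_data(b64_data: str) -> dict:
    """Decode a pulled Pub/Sub message body back into the telemetry dict."""
    return json.loads(base64.b64decode(b64_data))

# Fabricated pulled message, shaped like the expected output above
pulled = base64.b64encode(json.dumps({
    "device_id": "link485-gateway-001",
    "timestamp": "2025-10-29T10:45:23Z",
    "slaves": [],
}).encode()).decode("ascii")

telemetry = decode_pubsub_data(pulled)
print(telemetry["device_id"])  # link485-gateway-001
```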

Step 7: Stream to BigQuery

Create a BigQuery table and stream data using Dataflow:

1. Create BigQuery Dataset and Table

# Create dataset
bq mk --dataset link485_iot

# Create table
bq mk --table link485_iot.telemetry \
  device_id:STRING,timestamp:TIMESTAMP,slave_id:INTEGER,register_name:STRING,register_value:FLOAT
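This table stores one row per register reading, so each telemetry message fans out into several rows. The stock Pub/Sub-to-BigQuery template inserts the JSON as-is, so in practice this flattening would run as a template UDF or an upstream Cloud Function; this sketch just shows the shape of the transform against the schema above:

```python
def telemetry_to_rows(message: dict) -> list:
    """Flatten one telemetry message into rows matching the table schema."""
    rows = []
    for slave in message.get("slaves", []):
        for name, value in slave.get("registers", {}).items():
            rows.append({
                "device_id": message["device_id"],
                "timestamp": message["timestamp"],
                "slave_id": slave["slave_id"],
                "register_name": name,
                "register_value": float(value),
            })
    return rows

sample = {
    "device_id": "link485-gateway-001",
    "timestamp": "2025-10-29T10:45:23Z",
    "slaves": [{"slave_id": 1, "registers": {"voltage": 230.5, "current": 12.3}}],
}
rows = telemetry_to_rows(sample)
print(len(rows))  # 2
```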

2. Create Dataflow Job

  1. Go to Dataflow → Create job from template
  2. Select template: Pub/Sub to BigQuery
  3. Configure:
    • Input Pub/Sub topic: link485-telemetry
    • BigQuery output table: YOUR_PROJECT:link485_iot.telemetry
    • Temporary location: gs://YOUR_BUCKET/temp
  4. Click Run Job

Step 8: Process with Cloud Functions

Create a Cloud Function to process data in real-time:

Example: Alert on High Power Usage

import base64
import json

POWER_THRESHOLD_W = 5000  # alert threshold in watts

def process_telemetry(event, context):
    """Triggered by a Pub/Sub message (background function)."""
    # Pub/Sub delivers the payload base64-encoded in event['data']
    data = json.loads(base64.b64decode(event['data']))

    # Check every slave's power reading against the threshold
    for slave in data.get('slaves', []):
        power = slave.get('registers', {}).get('power', 0)
        if power > POWER_THRESHOLD_W:
            send_alert(data['device_id'], power)

    return 'OK'

def send_alert(device_id, power):
    # Placeholder: replace with your notification logic (email, SMS, Pub/Sub)
    print(f'ALERT: {device_id} power {power} W exceeds {POWER_THRESHOLD_W} W')

Deploy using gcloud:

gcloud functions deploy process_telemetry \
  --runtime python312 \
  --trigger-topic link485-telemetry \
  --entry-point process_telemetry
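The function can be exercised locally before deploying by constructing a synthetic Pub/Sub event. A standalone sketch; `process_telemetry` and a stub `send_alert` are redefined here so the snippet runs on its own:

```python
import base64
import json

alerts = []

def send_alert(device_id, power):
    alerts.append((device_id, power))  # stub: record instead of notifying

def process_telemetry(event, context):
    data = json.loads(base64.b64decode(event["data"]))
    for slave in data.get("slaves", []):
        power = slave.get("registers", {}).get("power", 0)
        if power > 5000:
            send_alert(data["device_id"], power)
    return "OK"

# Synthetic event shaped like what the Pub/Sub trigger delivers
payload = {"device_id": "link485-gateway-001",
           "slaves": [{"slave_id": 1, "registers": {"power": 6200}}]}
event = {"data": base64.b64encode(json.dumps(payload).encode())}
process_telemetry(event, None)
print(alerts)  # [('link485-gateway-001', 6200)]
```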

Data Format

LINK485 publishes JSON messages to GCP Pub/Sub:

{
  "device_id": "link485-gateway-001",
  "timestamp": "2025-10-29T10:45:23Z",
  "slaves": [
    {
      "slave_id": 1,
      "name": "Energy Meter 1",
      "registers": {
        "voltage": 230.5,
        "current": 12.3,
        "power": 2835.15,
        "energy": 1234.56
      }
    }
  ]
}

Troubleshooting

❌ No Messages in Pub/Sub

  • Verify MQTT bridge is running and connected to GCP
  • Check service account has Pub/Sub Publisher role
  • Ensure topic name matches in bridge configuration
  • Check LINK485 shows "Connected" status

❌ Authentication Errors

  • Verify service account JSON key is valid and not expired
  • Check service account has required IAM roles
  • Ensure API is enabled (Pub/Sub API)

❌ BigQuery Streaming Errors

  • Check table schema matches incoming data format
  • Verify Dataflow job is running without errors
  • Ensure sufficient BigQuery quotas

Next Steps & Use Cases

📊 Looker Studio Dashboards

Connect BigQuery to Looker Studio for interactive real-time dashboards

🤖 Vertex AI ML Models

Train predictive maintenance models on historical IoT data

📈 Time Series Insights

Use BigQuery ML for anomaly detection and forecasting

🔔 Cloud Monitoring Alerts

Set up alerts based on custom metrics and thresholds

Need Help with Google Cloud Integration?

Our team can help you design and implement custom GCP IoT solutions with BigQuery analytics and ML

